WorldWideScience

Sample records for modeling framework enabling

  1. A Smart Modeling Framework for Integrating BMI-enabled Models as Web Services

    Science.gov (United States)

    Jiang, P.; Elag, M.; Kumar, P.; Peckham, S. D.; Liu, R.; Marini, L.; Hsu, L.

    2015-12-01

    Service-oriented computing provides an opportunity to couple web service models using semantic web technology. Through this approach, models that are exposed as web services can be maintained in their own local environments, making it easy for modelers to maintain and update them. In integrated modeling, the service-oriented loose-coupling approach requires (1) a set of models exposed as web services, (2) model metadata describing the external features of a model (e.g., variable names, units, computational grid) and (3) a model integration framework. We present an architecture for coupling self-describing web service models using a smart modeling framework. We expose models that are encapsulated with CSDMS (Community Surface Dynamics Modeling System) Basic Model Interfaces (BMI) as web services. The BMI-enabled models are self-describing, exposing their metadata through BMI functions. After a BMI-enabled model is deployed as a service, a client can initialize, execute and retrieve the meta-information of the model by calling its BMI functions over the web. Furthermore, a revised version of EMELI (Peckham, 2015), an Experimental Modeling Environment for Linking and Interoperability, is chosen as the framework for coupling BMI-enabled web service models. EMELI allows users to combine a set of component models into a complex model by standardizing model interfaces using BMI, and provides a set of utilities that smooth the integration process (e.g., temporal interpolation). We modify the original EMELI so that the revised modeling framework is able to initialize, execute and find the dependencies of BMI-enabled web service models. Using the revised EMELI, we present an example integrating a set of TopoFlow model components that are BMI-enabled and exposed as web services. Reference: Peckham, S.D. 
(2014) EMELI 1.0: An experimental smart modeling framework for automatic coupling of self-describing models, Proceedings of HIC 2014.
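
    The BMI pattern described above (a framework that initializes, steps, and queries a self-describing model through a fixed interface) can be sketched as follows. The toy infiltration model, its variable name, and the driver are illustrative assumptions, not actual CSDMS or EMELI code:

```python
# Sketch of a BMI-style component and an EMELI-like driver. Method names
# follow the CSDMS BMI convention (initialize/update/finalize, get_value);
# the model itself and its "soil_water__storage" variable are hypothetical.

class ToyInfiltrationModel:
    """A stand-in for a BMI-enabled component exposed as a web service."""

    def initialize(self, config=None):
        self.time = 0.0
        self.dt = 1.0
        self.storage = 0.0

    def get_output_var_names(self):
        return ("soil_water__storage",)

    def get_var_units(self, name):
        return {"soil_water__storage": "m"}[name]

    def update(self):
        self.storage += 0.01 * self.dt   # constant infiltration rate
        self.time += self.dt

    def get_value(self, name):
        return self.storage

    def finalize(self):
        pass


def run_coupled(model, n_steps):
    """Framework driver: discover the model's metadata, then step it."""
    model.initialize()
    meta = {v: model.get_var_units(v) for v in model.get_output_var_names()}
    for _ in range(n_steps):
        model.update()
    result = {v: model.get_value(v) for v in meta}
    model.finalize()
    return meta, result


meta, result = run_coupled(ToyInfiltrationModel(), n_steps=10)
```

    Because the driver touches the model only through the interface, the same loop would work whether the calls are local or forwarded over the web.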

  2. Enabling the dynamic coupling between sensor web and Earth system models - The Self-Adaptive Earth Predictive Systems (SEPS) framework

    Science.gov (United States)

    di, L.; Yu, G.; Chen, N.

    2007-12-01

    The self-adaptation concept is the central piece of control theory, widely and successfully used in engineering and military systems. Such a system contains a predictor and a measurer. The predictor takes an initial condition and makes an initial prediction, and the measurer then measures the state of a real-world phenomenon. A built-in feedback mechanism automatically feeds the measurement back to the predictor. The predictor compares the measurement against the prediction to calculate the prediction error and adjusts its internal state based on that error. Thus, the predictor learns from the error and makes a more accurate prediction in the next step. By adopting the self-adaptation concept, we propose the Self-adaptive Earth Predictive System (SEPS) concept for enabling the dynamic coupling between the sensor web and Earth system models. The concept treats Earth System Models (ESM) and Earth Observations (EO) as integral components of the SEPS, coupled by the SEPS framework. EO measures the Earth system state while ESM predicts the evolution of the state. A feedback mechanism processes EO measurements and feeds them into ESM during model runs or as initial conditions. A feed-forward mechanism analyzes the ESM predictions against science goals to schedule optimized/targeted observations. The SEPS framework automates the feedback and feed-forward mechanisms (the FF-loop). Based on open consensus-based standards, a general SEPS framework can be developed to support the dynamic, interoperable coupling between ESMs and EO. Such a framework can support plug-and-play capability of both ESMs and diverse sensors and data systems as long as they support the standard interfaces. This presentation discusses the SEPS concept, the service-oriented architecture (SOA) of the SEPS framework, the choice of standards for the framework, and the implementation. The presentation also presents examples of SEPS to demonstrate dynamic, interoperable, and live coupling of
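
    The feedback loop described above can be sketched in a few lines. The scalar predictor and the gain value are illustrative assumptions, not part of the SEPS design:

```python
# Sketch of a self-adaptation (feedback) loop: a predictor adjusts its
# internal state from each measurement's prediction error. The linear
# update and gain=0.5 are illustrative choices.

def feedback_loop(truth_series, gain=0.5):
    state = 0.0
    errors = []
    for truth in truth_series:
        prediction = state             # predictor: current internal state
        error = truth - prediction     # measurer feeds the observation back
        state += gain * error          # predictor learns from the error
        errors.append(abs(error))
    return errors


# a constant "true" state of 1.0, observed eight times
errs = feedback_loop([1.0] * 8)
# successive errors shrink as the predictor adapts
```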

  3. A framework for structural modelling of an RFID-enabled intelligent distributed manufacturing control system

    Directory of Open Access Journals (Sweden)

    Barenji, Ali Vatankhah

    2014-08-01

    Full Text Available A modern manufacturing facility typically contains several distributed control systems, such as machining stations, assembly stations, and material handling and storage systems. Integrating Radio Frequency Identification (RFID) technology into these control systems provides a basis for monitoring and configuring their components in real-time. With the right structural modelling, it is then possible to evaluate designs and translate them into new operational applications almost immediately. This paper proposes an architecture for the structural modelling of an intelligent distributed control system for a manufacturing facility, by utilising RFID technology. Emphasis is placed on a requirements analysis of the manufacturing system, the design of RFID-enabled intelligent distributed control systems using Unified Modelling Language (UML) diagrams, and the use of efficient algorithms and tools for the implementation of these systems.

  4. A Multiscale, Nonlinear, Modeling Framework Enabling the Design and Analysis of Composite Materials and Structures

    Science.gov (United States)

    Bednarcyk, Brett A.; Arnold, Steven M.

    2012-01-01

    A framework for the multiscale design and analysis of composite materials and structures is presented. The ImMAC software suite, developed at NASA Glenn Research Center, embeds efficient, nonlinear micromechanics capabilities within higher scale structural analysis methods such as finite element analysis. The result is an integrated, multiscale tool that relates global loading to the constituent scale, captures nonlinearities at this scale, and homogenizes local nonlinearities to predict their effects at the structural scale. Example applications of the multiscale framework are presented for the stochastic progressive failure of a SiC/Ti composite tensile specimen and the effects of microstructural variations on the nonlinear response of woven polymer matrix composites.

  5. The OGC Sensor Web Enablement framework

    Science.gov (United States)

    Cox, S. J.; Botts, M.

    2006-12-01

    Sensor observations are at the core of the natural sciences. Improvements in data-sharing technologies offer the promise of much greater utilisation of observational data. A key to this is interoperable data standards. The Open Geospatial Consortium's (OGC) Sensor Web Enablement (SWE) initiative is developing open standards for web interfaces for the discovery, exchange and processing of sensor observations, and the tasking of sensor systems. The goal is to support the construction of complex sensor applications through real-time composition of service chains from standard components. The framework is based around a suite of standard interfaces, and standard encodings for the messages transferred between services. The SWE interfaces include: Sensor Observation Service (SOS)-parameterized observation requests (by observation time, feature of interest, property, sensor); Sensor Planning Service (SPS)-tasking a sensor system to undertake future observations; Sensor Alert Service (SAS)-subscription to an alert, usually triggered by a sensor result exceeding some value. The interface design generally follows the pattern established in the OGC Web Map Service (WMS) and Web Feature Service (WFS) interfaces, where the interaction between a client and service follows a standard sequence of requests and responses: the first obtains a general description of the service capabilities, the second the detail required to formulate a data request, and the last a data instance or stream. These may be implemented in a stateless "REST" idiom, or using conventional "web services" (SOAP) messaging. In a deployed system, the SWE interfaces are supplemented by Catalogue, data (WFS) and portrayal (WMS) services, as well as authentication and rights management. 
The standard SWE data formats are Observations and Measurements (O&M), which encodes observation metadata and results; Sensor Model Language (SensorML), which describes sensor systems; and Transducer Model Language (TML), which
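
    The capabilities-then-data request sequence described above can be sketched with a stub service. The offering and property names are invented for illustration, and a real SOS would exchange XML/O&M documents over HTTP rather than Python dicts:

```python
# Stub Sensor Observation Service illustrating the standard request sequence:
# first GetCapabilities, then a parameterized GetObservation. All names and
# values here are hypothetical.

class StubSOS:
    def get_capabilities(self):
        """Step 1: a general description of what the service offers."""
        return {"offerings": ["tide_gauge_network"],
                "observed_properties": ["sea_surface_height"]}

    def get_observation(self, offering, observed_property, time_range):
        """Step 2: a parameterized request for the data itself."""
        return {"offering": offering,
                "property": observed_property,
                "time_range": time_range,
                "values": [1.2, 1.3, 1.25]}


sos = StubSOS()
caps = sos.get_capabilities()
obs = sos.get_observation(caps["offerings"][0],
                          caps["observed_properties"][0],
                          ("2006-01-01", "2006-01-02"))
```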

  6. A New Approach to Predict Microbial Community Assembly and Function Using a Stochastic, Genome-Enabled Modeling Framework

    Science.gov (United States)

    King, E.; Brodie, E.; Anantharaman, K.; Karaoz, U.; Bouskill, N.; Banfield, J. F.; Steefel, C. I.; Molins, S.

    2016-12-01

    Characterizing and predicting the microbial and chemical compositions of subsurface aquatic systems necessitates an understanding of the metabolism and physiology of organisms that are often uncultured or studied under conditions not relevant for one's environment of interest. Cultivation-independent approaches are therefore important and have greatly enhanced our ability to characterize functional microbial diversity. The capability to reconstruct genomes representing thousands of populations from microbial communities using metagenomic techniques provides a foundation for development of predictive models for community structure and function. Here, we discuss a genome-informed stochastic trait-based model incorporated into a reactive transport framework to represent the activities of coupled guilds of hypothetical microorganisms. Metabolic pathways for each microbe within a functional guild are parameterized from metagenomic data with a unique combination of traits governing organism fitness under dynamic environmental conditions. We simulate the thermodynamics of coupled electron donor and acceptor reactions to predict the energy available for cellular maintenance, respiration, biomass development, and enzyme production. While `omics analyses can now characterize the metabolic potential of microbial communities, it is functionally redundant as well as computationally prohibitive to explicitly include the thousands of recovered organisms into biogeochemical models. However, one can derive potential metabolic pathways from genomes along with trait-linkages to build probability distributions of traits. These distributions are used to assemble groups of microbes that couple one or more of these pathways. From the initial ensemble of microbes, only a subset will persist based on the interaction of their physiological and metabolic traits with environmental conditions, competing organisms, etc. Here, we analyze the predicted niches of these hypothetical microbes and

  7. Complexity Science Framework for Big Data: Data-enabled Science

    Science.gov (United States)

    Surjalal Sharma, A.

    2016-07-01

    The ubiquity of Big Data has stimulated the development of analytic tools to harness its potential for timely and improved modeling and prediction. While much of the data is available in near-real time and can be compiled to specify the current state of the system, the capability to make predictions is lacking. The main reason is the basic nature of Big Data: traditional techniques are challenged in their ability to cope with its velocity, volume and variability to make optimum use of the available information. Another aspect is the absence of an effective description of the time evolution or dynamics of the specific system, derived from the data. Once such dynamical models are developed, predictions can be made readily. This approach of "letting the data speak for itself" is distinct from first-principles models based on an understanding of the fundamentals of the system. The predictive capability comes from the data-derived dynamical model, with no modeling assumptions, and can address many issues such as causality and correlation. This approach provides a framework for addressing the challenges in Big Data, especially in the case of spatio-temporal time series data. The reconstruction of dynamics from time series data is based on the recognition that in most systems the different variables or degrees of freedom are coupled nonlinearly, and in the presence of dissipation the state space contracts, effectively reducing the number of variables, thus enabling a description of the system's dynamical evolution and consequently prediction of future states. Predictability is analysed from the intrinsic characteristics of the distribution functions, such as Hurst exponents and Hill estimators. In most systems the distributions have heavy tails, which imply a higher likelihood of extreme events. The characterization of the probabilities of extreme events is critical in many cases, e.g., natural hazards, for proper assessment of risk and mitigation strategies. 
Big Data with
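
    The heavy-tail characterization mentioned above can be illustrated with the Hill estimator. The deterministic Pareto quantiles below stand in for an observed time series; a real analysis would of course use the data itself:

```python
# Sketch of the Hill estimator of the tail index 1/alpha from the k largest
# sample values. The sample here consists of exact Pareto(alpha=2) quantiles,
# a deterministic stand-in for measured heavy-tailed data.
import math

def hill_estimator(sample, k):
    """Average log-excess of the k largest values over the (k+1)-th largest."""
    xs = sorted(sample, reverse=True)
    threshold = xs[k]
    return sum(math.log(xs[i] / threshold) for i in range(k)) / k


alpha = 2.0
n = 1000
# descending Pareto quantiles: x_j = (n / j)**(1/alpha), j = 1..n
sample = [(n / j) ** (1.0 / alpha) for j in range(1, n + 1)]
est = hill_estimator(sample, k=100)   # should be close to 1/alpha = 0.5
```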

  8. Computer-Aided Modeling Framework

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    development and application. The proposed work is a part of the project for development of methods and tools that will allow systematic generation, analysis and solution of models for various objectives. It will use the computer-aided modeling framework that is based on a modeling methodology, which combines....... In this contribution, the concept of template-based modeling is presented and application is highlighted for the specific case of catalytic membrane fixed bed models. The modeling template is integrated in a generic computer-aided modeling framework. Furthermore, modeling templates enable the idea of model reuse...... are generated through the template in ICAS-MoT and translated into a model object. Once in ICAS-MoT, the model is numerical analyzed, solved and identified. A computer-aided modeling framework integrating systematic model derivation and development tools has been developed. It includes features for model...

  9. Computer-Aided Modeling Framework

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    with them. As the required models may be complex and require multiple time and/or length scales, their development and application for product-process design is not trivial. Therefore, a systematic modeling framework can contribute by significantly reducing the time and resources needed for model...... development and application. The proposed work is a part of the project for development of methods and tools that will allow systematic generation, analysis and solution of models for various objectives. It will use the computer-aided modeling framework that is based on a modeling methodology, which combines....... In this contribution, the concept of template-based modeling is presented and application is highlighted for the specific case of catalytic membrane fixed bed models. The modeling template is integrated in a generic computer-aided modeling framework. Furthermore, modeling templates enable the idea of model reuse...

  10. An Integrated Conceptual Framework for RFID Enabled Healthcare

    Directory of Open Access Journals (Sweden)

    Gaurav Gupta

    2015-12-01

    Full Text Available Radio frequency identification (RFID) technology is a wireless communication technology that facilitates automatic identification and data capture without human intervention. Since the 2000s, RFID applications in the health care industry have been increasing. RFID has brought many improvements in areas such as patient care, patient safety, equipment tracking, resource utilization, and processing time reduction. On the other hand, deployment of RFID is often questioned on issues such as high capital investment, technological complexity, and privacy concerns. Exploration of the existing literature indicates the presence of works on topics such as asset management, patient management, staff management, institutional advantages, and organizational issues. However, most of these works focus on a particular issue; to date, scholarly attempts to integrate all the facades of RFID-enabled healthcare are limited. In this paper, we propose a conceptual framework that represents the scope for implementation of this technology and the various dimensions of RFID-enabled healthcare, and demonstrate them in detail. We also discuss the critical issues that can prove to be potential barriers to its successful implementation, current approaches to resolving these, and some of the regulatory initiatives encouraging RFID adoption in the healthcare industry, and highlight future research opportunities in this domain.

  11. A Working Framework for Enabling International Science Data System Interoperability

    Science.gov (United States)

    Hughes, J. Steven; Hardman, Sean; Crichton, Daniel J.; Martinez, Santa; Law, Emily; Gordon, Mitchell K.

    2016-07-01

    For diverse scientific disciplines to interoperate they must be able to exchange information based on a shared understanding. To capture this shared understanding, we have developed a knowledge representation framework that leverages ISO level reference models for metadata registries and digital archives. This framework provides multi-level governance, evolves independent of the implementation technologies, and promotes agile development, namely adaptive planning, evolutionary development, early delivery, continuous improvement, and rapid and flexible response to change. The knowledge representation is captured in an ontology through a process of knowledge acquisition. Discipline experts in the role of stewards at the common, discipline, and project levels work to design and populate the ontology model. The result is a formal and consistent knowledge base that provides requirements for data representation, integrity, provenance, context, identification, and relationship. The contents of the knowledge base are translated and written to files in suitable formats to configure system software and services, provide user documentation, validate input, and support data analytics. This presentation will provide an overview of the framework, present a use case that has been adopted by an entire science discipline at the international level, and share some important lessons learned.

  12. Realising the Uncertainty Enabled Model Web

    Science.gov (United States)

    Cornford, D.; Bastin, L.; Pebesma, E. J.; Williams, M.; Stasch, C.; Jones, R.; Gerharz, L.

    2012-12-01

    conversion between uncertainty types, and between the spatial / temporal support of service inputs / outputs. Finally we describe the tools being generated within the UncertWeb project, considering three main aspects: i) Elicitation of uncertainties on model inputs. We are developing tools to enable domain experts to provide judgements about input uncertainties from UncertWeb model components (e.g. parameters in meteorological models) which allow panels of experts to engage in the process and reach a consensus view on the current knowledge / beliefs about that parameter or variable. We are developing systems for continuous and categorical variables as well as stationary spatial fields. ii) Visualisation of the resulting uncertain outputs from the end of the workflow, but also at intermediate steps. At this point we have prototype implementations driven by the requirements from the use cases that motivate UncertWeb. iii) Sensitivity and uncertainty analysis on model outputs. Here we show the design of the overall system we are developing, including the deployment of an emulator framework to allow computationally efficient approaches. We conclude with a summary of the open issues and remaining challenges we are facing in UncertWeb, and provide a brief overview of how we plan to tackle these.
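
    The elicitation-and-propagation workflow described above can be sketched as a Monte Carlo pass through a single model component. The quadratic "model" and the normal input distribution are illustrative assumptions, not UncertWeb components:

```python
# Sketch of propagating an elicited input uncertainty through one model
# component: sample the elicited distribution, run the model, summarize the
# output distribution. The model and the N(2.0, 0.1) prior are made up.
import random

def model(x):
    return x * x + 1.0            # stand-in for a chained model component

def propagate(mean, std, n=10000, seed=42):
    rng = random.Random(seed)
    outputs = [model(rng.gauss(mean, std)) for _ in range(n)]
    m = sum(outputs) / n
    var = sum((y - m) ** 2 for y in outputs) / (n - 1)
    return m, var ** 0.5


# expert-elicited input: mean 2.0, standard deviation 0.1
mu, sigma = propagate(mean=2.0, std=0.1)
# analytically, E[x^2 + 1] = 2.0**2 + 0.1**2 + 1 = 5.01,
# and the output spread is roughly |dM/dx| * std = 4 * 0.1 = 0.4
```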

  13. A framework for sustainable interorganizational business model

    OpenAIRE

    Neupane, Ganesh Prasad; Haugland, Sven A.

    2016-01-01

    Drawing on literature on business model innovations and sustainability, this paper develops a framework for sustainable interorganizational business models. The aim of the framework is to enhance the sustainability of firms’ business models by enabling firms to create future value by taking into account environmental, social and economic factors. The paper discusses two themes: (1) application of the term sustainability to business model innovation, and (2) implications of integrating sustain...

  14. SDN-Enabled Communication Network Framework for Energy Internet

    Directory of Open Access Journals (Sweden)

    Zhaoming Lu

    2017-01-01

    Full Text Available To support distributed energy generators and improve energy utilization, the energy Internet has attracted global research focus. In China, the energy Internet has been identified as an important issue by government and research institutes. However, managing a large number of distributed generators requires smart, low-latency, reliable, and safe networking infrastructure, which cannot be supported by traditional networks in power grids. In order to design and construct a smart and flexible energy Internet, we propose a software-defined network framework with both a microgrid-cluster level and a global-grid level, designed in a hierarchical manner, which brings flexibility, efficiency, and reliability to power grid networks. Finally, we evaluate and verify the performance of this framework in terms of latency, reliability, and security by both theoretical analysis and real-world experiments.

  15. An Enabling Framework for Reflexive Learning: Experiential Learning and Reflexivity in Contemporary Modernity

    Science.gov (United States)

    Dyke, Martin

    2009-01-01

    The paper presents an enabling framework for experiential learning that connects with reflexive modernity. This framework places an emphasis on learning with others and on the role of theory, practice and reflection. A sociological argument is constructed for an alternative framework for experiential learning that derives from social theory. It is…

  16. A KBE-enabled design framework for cost/weight optimization study of aircraft composite structures

    Science.gov (United States)

    Wang, H.; La Rocca, G.; van Tooren, M. J. L.

    2014-10-01

    Traditionally, minimum weight is the objective when optimizing airframe structures. This optimization, however, does not consider the manufacturing cost, which actually determines the profit of the airframe manufacturer. To this purpose, a design framework has been developed that is able to perform cost/weight multi-objective optimization of an aircraft component, including large topology variations of the structural configuration. The key element of the proposed framework is a dedicated knowledge based engineering (KBE) application, called the multi-model generator, which enables modelling very different product configurations and variants and extracting all the data required to feed the weight and cost estimation modules, in a fully automated fashion. The weight estimation method developed in this research work uses finite element analysis to calculate the internal stresses of the structural elements and an analytical composite plate sizing method to determine their minimum required thicknesses. The manufacturing cost estimation module was developed on the basis of a cost model available in the literature. The capability of the framework was successfully demonstrated by designing and optimizing the composite structure of a business jet rudder. The case study indicates that the design framework is able to find the Pareto optimal set for minimum structural weight and manufacturing cost very quickly. Based on the Pareto set, the rudder manufacturer is in a position to conduct internal trade-off studies between minimum weight and minimum cost solutions, as well as to offer the OEM a full set of optimized options to choose from, rather than one feasible design.
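
    Extracting the Pareto optimal set from candidate (weight, cost) designs, the core of the trade-off study above, can be sketched as follows. The design points are invented numbers, not data from the rudder study:

```python
# Sketch of Pareto-front extraction for two minimization objectives.
# Each design is a (weight, cost) pair; a design is kept if no other
# design is at least as good in both objectives.

def pareto_front(designs):
    front = []
    for d in designs:
        dominated = any(o[0] <= d[0] and o[1] <= d[1] and o != d
                        for o in designs)
        if not dominated:
            front.append(d)
    return sorted(front)


designs = [(10.0, 500.0), (12.0, 400.0), (11.0, 450.0),
           (13.0, 420.0), (15.0, 390.0)]
front = pareto_front(designs)
# (13.0, 420.0) is dominated by (12.0, 400.0): heavier and more expensive
```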

  17. A framework for WRF to WRF-IBM grid nesting to enable multiscale simulations

    Energy Technology Data Exchange (ETDEWEB)

    Wiersema, David John [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Univ. of California, Berkeley, CA (United States); Lundquist, Katherine A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Chow, Fotini Katapodes [Univ. of California, Berkeley, CA (United States)

    2016-09-29

    With advances in computational power, mesoscale models, such as the Weather Research and Forecasting (WRF) model, are often pushed to higher resolutions. As the model’s horizontal resolution is refined, the maximum resolved terrain slope will increase. Because WRF uses a terrain-following coordinate, this increase in resolved terrain slopes introduces additional grid skewness. At high resolutions and over complex terrain, this grid skewness can introduce large numerical errors that require methods, such as the immersed boundary method, to keep the model accurate and stable. Our implementation of the immersed boundary method in the WRF model, WRF-IBM, has proven effective at microscale simulations over complex terrain. WRF-IBM uses a non-conforming grid that extends beneath the model’s terrain. Boundary conditions at the immersed boundary, the terrain, are enforced by introducing a body force term to the governing equations at points directly beneath the immersed boundary. Nesting between a WRF parent grid and a WRF-IBM child grid requires a new framework for initialization and forcing of the child WRF-IBM grid. This framework will enable concurrent multi-scale simulations within the WRF model, improving the accuracy of high-resolution simulations and enabling simulations across a wide range of scales.
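
    The direct-forcing idea described above (a body force added at points beneath the immersed boundary) can be sketched in one dimension. The grid values, tendencies, and relaxation to a zero boundary value are illustrative assumptions, not the WRF-IBM implementation:

```python
# One-dimensional sketch of direct forcing at an immersed boundary: points
# flagged as beneath the terrain receive a body force that cancels their
# physical tendency and drives them to the boundary value in one step.

def direct_forcing(u, below_terrain, dt, tendency, target=0.0):
    u_new = []
    for ui, below, t in zip(u, below_terrain, tendency):
        if below:
            f = (target - ui) / dt - t   # body force term at masked points
        else:
            f = 0.0                      # governing equations unmodified
        u_new.append(ui + dt * (t + f))
    return u_new


u = [5.0, 4.0, 3.0, 2.0]
mask = [False, False, True, True]        # last two points lie beneath terrain
u_new = direct_forcing(u, mask, dt=0.1, tendency=[1.0] * 4)
```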

  18. A Framework for BIM-enabled Life-cycle Information Management of Construction Project

    OpenAIRE

    Xun Xu; Ling Ma; Lieyun Ding

    2014-01-01

    BIM has been widely used in project management, but on the whole the applications have been scattered and the BIM models have not been deployed throughout the whole project life-cycle. Each participant builds their own BIM, so there is a major problem in how to integrate these dynamic and fragmented data together. In order to solve this problem, this paper focuses on BIM- based life-cycle information management and builds a framework for BIM-enabled life-cycle information management. To organ...

  19. A Framework for BIM-enabled Life-cycle Information Management of Construction Project

    OpenAIRE

    Xu, Xun; Ma, Ling; Ding, Lieyun

    2014-01-01

    BIM has been widely used in project management, but on the whole the applications have been scattered and the BIM models have not been deployed throughout the whole project life-cycle. Each participant builds their own BIM, so there is a major problem in how to integrate these dynamic and fragmented data together. In order to solve this problem, this paper focuses on BIM- based life-cycle information management and builds a framework for BIM-enabled life-cycle information management. To organ...

  20. Towards Cache-Enabled, Order-Aware, Ontology-Based Stream Reasoning Framework

    Energy Technology Data Exchange (ETDEWEB)

    Yan, Rui; Praggastis, Brenda L.; Smith, William P.; McGuinness, Deborah L.

    2016-08-16

    While streaming data have become increasingly more popular in business and research communities, semantic models and processing software for streaming data have not kept pace. Traditional semantic solutions have not addressed transient data streams. Semantic web languages (e.g., RDF, OWL) have typically addressed static data settings and linked data approaches have predominantly addressed static or growing data repositories. Streaming data settings have some fundamental differences; in particular, data are consumed on the fly and data may expire. Stream reasoning, a combination of stream processing and semantic reasoning, has emerged with the vision of providing "smart" processing of streaming data. C-SPARQL is a prominent stream reasoning system that handles semantic (RDF) data streams. Many stream reasoning systems including C-SPARQL use a sliding window and use data arrival time to evict data. For data streams that include expiration times, a simple arrival time scheme is inadequate if the window size does not match the expiration period. In this paper, we propose a cache-enabled, order-aware, ontology-based stream reasoning framework. This framework consumes RDF streams with expiration timestamps assigned by the streaming source. Our framework utilizes both arrival and expiration timestamps in its cache eviction policies. In addition, we introduce the notion of "semantic importance" which aims to address the relevance of data to the expected reasoning, thus enabling the eviction algorithms to be more context- and reasoning-aware when choosing what data to maintain for question answering. We evaluate this framework by implementing three different prototypes and utilizing five metrics. The trade-offs of deploying the proposed framework are also discussed.
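
    The expiration-aware eviction policy described above can be sketched as follows. The (fact, arrival, expiration) triple format is an illustrative assumption, not C-SPARQL's actual representation:

```python
# Sketch of expiration-based cache eviction for a stream reasoning window.
# Each item is (fact, arrival_time, expiration_time); a plain sliding window
# would evict on arrival_time alone and could keep already-expired facts.

def evict_expired(window, now):
    """Keep only items whose source-assigned expiration is still in the future."""
    return [item for item in window if item[2] > now]


window = [("sensor1 temp 20C", 0, 5),
          ("sensor2 temp 21C", 1, 12),
          ("sensor3 temp 19C", 2, 8)]
alive = evict_expired(window, now=8)
# only sensor2's fact survives: the others expired at or before t=8
```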

  1. A Framework for BIM-Enabled Life-Cycle Information Management of Construction Project

    Directory of Open Access Journals (Sweden)

    Xun Xu

    2014-08-01

    Full Text Available BIM has been widely used in project management, but on the whole the applications have been scattered and the BIM models have not been deployed throughout the whole project life-cycle. Each participant builds their own BIM, so there is a major problem in how to integrate these dynamic and fragmented data together. In order to solve this problem, this paper focuses on BIM-based life-cycle information management and builds a framework for BIM-enabled life-cycle information management. To organize the life-cycle information well, the information components and information flow during the project life-cycle are defined. Then, the application of BIM in life-cycle information management is analysed. This framework will provide a unified platform for information management and ensure data integrity.

  2. Grid-Enabled Data Access in the ATLAS Athena Framework

    Institute of Scientific and Technical Information of China (English)

    Malon, D.; Resconi, S.; et al.

    2001-01-01

    Athena is the common framework used by the ATLAS experiment for simulation, reconstruction, and analysis. By design, Athena supports multiple persistence services, and insulates users from technology-specific persistence details. Athena users, and even most Athena package developers, should neither know nor care whether data come from the grid or from local filesystems, nor whether data reside in object databases, in ROOT or ZEBRA files, or in ASCII files. In this paper we describe how Athena applications may transparently take advantage of emerging services provided by grid software today: how data generated by Athena jobs are registered in grid replica catalogs and other collection management services, and the means by which input data are identified and located in a grid-aware collection management environment. We outline an evolutionary path toward the incorporation of grid-based virtual data services, whereby locating data may be replaced by locating a recipe according to which that data may be generated. Several implementation scenarios, ranging from low-level grid catalog services (e.g., from Globus) through higher-level services such as the Grid Data Management Pilot (under development as part of the European DataGrid project, in collaboration with the Particle Physics Data Grid) to more conventional database services, and a common architecture to support these various scenarios, are also described.

  3. Web-Enabled Framework for Real-Time Scheduler Simulator: A Teaching Tool

    Directory of Open Access Journals (Sweden)

    C. Yaashuwanth

    2010-01-01

    Full Text Available Problem statement: A Real-Time System (RTS) is one which controls an environment by receiving data, processing it, and returning the results quickly enough to affect the functioning of the environment at that time. The main objective of this research was to develop an architectural model for the simulation of real-time tasks to implement in a distributed environment through the web, and to make comparisons between various scheduling algorithms. The proposed model can be used for preprogrammed scheduling policies for uniprocessor systems. This model provided a user-friendly Graphical User Interface (GUI). Approach: Though a lot of scheduling algorithms have been developed, just a few of them are available to be implemented in real-time applications. In order to use, test and evaluate a scheduling policy it must be integrated into an operating system, which is a complex task. Simulation is another alternative to evaluate a scheduling policy. Unfortunately, just a few real-time scheduling simulators have been developed to date and most of them require the use of a specific simulation language. Results: Task ID, deadline, priority, period, computation time and phase are the input task attributes to the scheduler simulator; a chronograph imitating the real-time execution of the input task set and computational statistics of the schedule are the output. Conclusion: The Web-enabled framework proposed in this study enables the developer to evaluate the schedulability of a real-time application. Numerous benefits were quoted in support of the Web-based deployment. The proposed framework can be used as an invaluable teaching tool. Further, the GUI of the framework will allow easy comparison of existing scheduling policies and also simulate the behavior and verify the suitability of custom-defined schedulers for real-time applications.
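The simulator core the abstract describes (task attributes in, a chronograph of the schedule out) can be sketched with one preprogrammed policy. This minimal sketch assumes Earliest-Deadline-First scheduling on a uniprocessor; the `Task` fields mirror the attributes listed above, but all names and the example task set are illustrative.

```python
from dataclasses import dataclass

@dataclass
class Task:
    tid: str
    phase: int        # first release time
    period: int
    wcet: int         # computation time
    deadline: int     # relative deadline

def edf_schedule(tasks, horizon):
    """Return a chronograph: which task runs in each time slot [0, horizon)."""
    remaining = {}    # job -> remaining computation, job = (tid, release time)
    timeline = []
    for t in range(horizon):
        # Release a new job of every task whose release time is t.
        for task in tasks:
            if t >= task.phase and (t - task.phase) % task.period == 0:
                remaining[(task.tid, t)] = task.wcet
        # Run the ready job with the earliest absolute deadline (EDF rule).
        ready = {job: c for job, c in remaining.items() if c > 0}
        if ready:
            def abs_deadline(job):
                tid, release = job
                task = next(x for x in tasks if x.tid == tid)
                return release + task.deadline
            job = min(ready, key=abs_deadline)
            remaining[job] -= 1
            timeline.append(job[0])
        else:
            timeline.append(None)  # idle slot
    return timeline

tasks = [Task("T1", 0, 4, 1, 4), Task("T2", 0, 6, 2, 6)]
print(edf_schedule(tasks, 8))
# ['T1', 'T2', 'T2', None, 'T1', None, 'T2', 'T2']
```

Swapping the `min` key for a static priority turns the same loop into a Rate-Monotonic simulator, which is how such a tool can compare policies over one task model.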

  4. Rethinking Sustainability, Scaling Up, and Enabling Environment: A Framework for Their Implementation in Drinking Water Supply

    Directory of Open Access Journals (Sweden)

    Urooj Q. Amjad

    2015-04-01

    Full Text Available The terms sustainability, scaling up, and enabling environment are inconsistently used in implementing water supply projects. To clarify these terms we develop a framework based on Normalization Process Theory, and apply the framework to a hypothetical water supply project in schools. The resulting framework provides guidance on how these terms could be implemented and analyzed in water supply projects. We conclude that effective use of the terms sustainability, scaling up, and enabling environment would focus on purpose, process, and perspective. This is the first known attempt to analyze the implementation of the three terms together in the context of water supply services.

  5. Knowledge Encapsulation Framework for Collaborative Social Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Cowell, Andrew J.; Gregory, Michelle L.; Marshall, Eric J.; McGrath, Liam R.

    2009-03-24

    This paper describes the Knowledge Encapsulation Framework (KEF), a suite of tools to enable knowledge inputs (relevant, domain-specific facts) to modeling and simulation projects, as well as other domains that require effective collaborative workspaces for knowledge-based tasks. This framework can be used to capture evidence (e.g., trusted material such as journal articles and government reports), discover new evidence (covering both trusted and social media), enable discussions surrounding domain-specific topics and provide automatically generated semantic annotations for improved corpus investigation. The current KEF implementation is presented within a wiki environment, providing a simple but powerful collaborative space for team members to review, annotate, discuss and align evidence with their modeling frameworks. The novelty in this approach lies in the combination of automatically tagged and user-vetted resources, which increases user trust in the environment, leading to ease of adoption for the collaborative environment.

  6. An Android-enabled mobile framework for ensuring quality of life through patient-centric care.

    Science.gov (United States)

    Koufi, Vassiliki; Malamateniou, Flora; Vassilacopoulos, George

    2012-01-01

    The drive to achieve excellence in healthcare delivery while containing costs, underlies the need for a new generation of applications which facilitate the realization of a patient-centric care model. Under this emerging care model healthcare delivery can be integrated across the continuum of services, from prevention to follow up, and care can be coordinated across all settings. With care moving out into the community, health systems require real-time information to deliver coordinated care to patients. The integration of leading-edge technologies, such as mobile technology, with Personal Health Records (PHRs) can meet this requirement by making comprehensive and unified health information available to authorized users at any point of care or decision making through familiar environments such as Google's Android. This paper presents a framework that provides ubiquitous access to patients' PHRs via Android-enabled mobile devices. Where possible health information access and management is performed in a transparent way, thus enabling healthcare professionals to devote more time on practicing medicine and patients to manage their own health with the least possible intervention. This depends heavily on the context, which is collected by both Android-specific core system services and special purpose software agents with the latter being also responsible for preserving PHR data privacy.

  7. Enabling big geoscience data analytics with a cloud-based, MapReduce-enabled and service-oriented workflow framework.

    Directory of Open Access Journals (Sweden)

    Zhenlong Li

    Full Text Available Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing and data intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework techniques are proposed by leveraging cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data, and a service-oriented workflow architecture is built for supporting on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists.
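The MapReduce stage of such a framework can be illustrated in miniature. This toy example only mimics the programming model with Python's `multiprocessing`; the paper's actual framework runs on Hadoop/HBase, and the gridded-temperature records here are invented.

```python
from multiprocessing import Pool
from collections import defaultdict

def map_phase(record):
    # record: (cell_id, temperature) from a gridded geoscience dataset
    cell, temp = record
    return (cell, temp)

def reduce_phase(grouped):
    cell, temps = grouped
    return (cell, sum(temps) / len(temps))   # mean temperature per grid cell

def map_reduce(records, workers=2):
    # Map phase runs in parallel across worker processes.
    with Pool(workers) as pool:
        mapped = pool.map(map_phase, records)
    # Shuffle: group mapped values by key.
    groups = defaultdict(list)
    for key, value in mapped:
        groups[key].append(value)
    # Reduce phase, also parallel per key.
    with Pool(workers) as pool:
        return dict(pool.map(reduce_phase, list(groups.items())))

if __name__ == "__main__":
    data = [("A", 10.0), ("A", 14.0), ("B", 20.0)]
    print(map_reduce(data))   # {'A': 12.0, 'B': 20.0}
```

The framework-level point is that the scientist writes only the two small functions; partitioning, distribution and aggregation are the runtime's job.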

  8. Enabling big geoscience data analytics with a cloud-based, MapReduce-enabled and service-oriented workflow framework.

    Science.gov (United States)

    Li, Zhenlong; Yang, Chaowei; Jin, Baoxuan; Yu, Manzhu; Liu, Kai; Sun, Min; Zhan, Matthew

    2015-01-01

    Geoscience observations and model simulations are generating vast amounts of multi-dimensional data. Effectively analyzing these data is essential for geoscience studies. However, the tasks are challenging for geoscientists because processing the massive amount of data is both computing and data intensive, in that data analytics requires complex procedures and multiple tools. To tackle these challenges, a scientific workflow framework is proposed for big geoscience data analytics. In this framework techniques are proposed by leveraging cloud computing, MapReduce, and Service Oriented Architecture (SOA). Specifically, HBase is adopted for storing and managing big geoscience data across distributed computers. A MapReduce-based algorithm framework is developed to support parallel processing of geoscience data, and a service-oriented workflow architecture is built for supporting on-demand complex data analytics in the cloud environment. A proof-of-concept prototype tests the performance of the framework. Results show that this innovative framework significantly improves the efficiency of big geoscience data analytics by reducing the data processing time as well as simplifying data analytical procedures for geoscientists.

  9. Dicyanometallates as Model Extended Frameworks

    Science.gov (United States)

    2016-01-01

    We report the structures of eight new dicyanometallate frameworks containing molecular extra-framework cations. These systems include a number of hybrid inorganic–organic analogues of conventional ceramics, such as Ruddlesden–Popper phases and perovskites. The structure types adopted are rationalized in the broader context of all known dicyanometallate framework structures. We show that the structural diversity of this family can be understood in terms of (i) the charge and coordination preferences of the particular metal cation acting as framework node, and (ii) the size, shape, and extent of incorporation of extra-framework cations. In this way, we suggest that dicyanometallates form a particularly attractive model family of extended frameworks in which to explore the interplay between molecular degrees of freedom, framework topology, and supramolecular interactions. PMID:27057759

  10. Geologic Framework Model (GFM2000)

    Energy Technology Data Exchange (ETDEWEB)

    T. Vogt

    2004-08-26

    The purpose of this report is to document the geologic framework model, version GFM2000 with regard to input data, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, and the differences between GFM2000 and previous versions. The version number of this model reflects the year during which the model was constructed. This model supersedes the previous model version, documented in Geologic Framework Model (GFM 3.1) (CRWMS M&O 2000 [DIRS 138860]). The geologic framework model represents a three-dimensional interpretation of the geology surrounding the location of the monitored geologic repository for spent nuclear fuel and high-level radioactive waste at Yucca Mountain. The geologic framework model encompasses and is limited to an area of 65 square miles (168 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the geologic framework model (shown in Figure 1-1) were chosen to encompass the exploratory boreholes and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The upper surface of the model is made up of the surface topography and the depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The geologic framework model was constructed from geologic map and borehole data. Additional information from measured stratigraphic sections, gravity profiles, and seismic profiles was also considered. The intended use of the geologic framework model is to provide a geologic framework over the area of interest consistent with the level of detail needed for hydrologic flow and radionuclide transport modeling through the UZ and for repository design. The model is limited by the availability of data and relative amount of geologic complexity found in an area. The geologic framework model is inherently limited by scale and content. 
The grid spacing used in the

  11. RFID-enabled Supply Chain Business Model Research: A Theoretical Analysis Framework

    Institute of Scientific and Technical Information of China (English)

    Dai Yong

    2012-01-01

    This paper takes the RFID-enabled supply chain application business model as its object of study and establishes a four-dimensional business model analysis framework based on value proposition, value network, value creation, and value assessment, then constructs a business model recognition and evaluation model. Finally, three RFID-enabled supply chain business models are identified: Label, Platform, and Custom-Made, and the implementation strategies for each business model are analyzed.

  12. Environmental modeling framework invasiveness: analysis and implications

    Science.gov (United States)

    Environmental modeling frameworks support scientific model development by providing an Application Programming Interface (API) which model developers use to implement models. This paper presents results of an investigation on the framework invasiveness of environmental modeling frameworks. Invasiven...

  13. A Model for Rearchitecting Frameworks

    Directory of Open Access Journals (Sweden)

    Galal H. Galal-Edeen

    2009-07-01

    Full Text Available Software rearchitecting is the process of obtaining a documented architecture for an existing system. There are many software rearchitecting frameworks which are based upon different concepts and context-related issues for a specific application or programming language, such as Rigi, Ciao, SPOOL, and Symphony, and Software Rearchitecting Action Framework (SRAF. Most of the frameworks focus on the reverse engineering process of source code. They neglect the role of stakeholders in enhancing and developing their systems. This paper presents a systematic analysis and comparative study for rearchitecting frameworks using generic architecture characteristics or elements. Based on the major requirements that should be available in the rearchitecting frameworks, the comparative study proceeds. An efficient model is proposed based on the trends that resulted from the comparative analysis. It considers the evaluation criteria of the compared frameworks. Conclusions and remarks are highlighted.

  14. Semantics-enabled service discovery framework in the SIMDAT pharma grid.

    Science.gov (United States)

    Qu, Cangtao; Zimmermann, Falk; Kumpf, Kai; Kamuzinzi, Richard; Ledent, Valérie; Herzog, Robert

    2008-03-01

    We present the design and implementation of a semantics-enabled service discovery framework in the data Grids for process and product development using numerical simulation and knowledge discovery (SIMDAT) Pharma Grid, an industry-oriented Grid environment for integrating thousands of Grid-enabled biological data services and analysis services. The framework consists of three major components: the Web ontology language (OWL)-description logic (DL)-based biological domain ontology, OWL Web service ontology (OWL-S)-based service annotation, and semantic matchmaker based on the ontology reasoning. Built upon the framework, workflow technologies are extensively exploited in the SIMDAT to assist biologists in (semi)automatically performing in silico experiments. We present a typical usage scenario through the case study of a biological workflow: IXodus.
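Degree-of-match service discovery of the kind such matchmakers perform can be sketched over a toy concept hierarchy. This is a loose, illustrative rendering of OWL-S-style matchmaking (exact / plug-in / subsumes / fail); the hand-written dict stands in for a real DL reasoner, and the service concept names are invented.

```python
# Toy subsumption hierarchy: child concept -> parent concept.
SUBCLASS_OF = {
    "BlastSearch": "SequenceAnalysis",
    "SequenceAnalysis": "BioinformaticsService",
}

def ancestors(concept):
    """All concepts that subsume the given one, nearest first."""
    seen = []
    while concept in SUBCLASS_OF:
        concept = SUBCLASS_OF[concept]
        seen.append(concept)
    return seen

def degree_of_match(advertised, requested):
    if advertised == requested:
        return "exact"
    if advertised in ancestors(requested):
        return "plug-in"    # advertised is more general: it can serve the request
    if requested in ancestors(advertised):
        return "subsumes"   # requested is more general than what is advertised
    return "fail"

print(degree_of_match("SequenceAnalysis", "BlastSearch"))  # plug-in
print(degree_of_match("BlastSearch", "SequenceAnalysis"))  # subsumes
```

A matchmaker ranks candidate services by this degree, preferring exact over plug-in over subsumes, which is what lets a workflow engine pick an adequate analysis service semi-automatically.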

  15. Demand Response Advanced Controls Framework and Assessment of Enabling Technology Costs

    Energy Technology Data Exchange (ETDEWEB)

    Potter, Jennifer; Cappers, Peter

    2017-08-28

    The Demand Response Advanced Controls Framework and Assessment of Enabling Technology Costs research describes a variety of DR opportunities and the various bulk power system services they can provide. The bulk power system services are mapped to a generalized taxonomy of DR “service types”, which allows us to discuss DR opportunities and bulk power system services in fewer yet broader categories that share similar technological requirements, which mainly drive DR enablement costs. The research presents a framework for the costs to automate DR and provides descriptions of the various elements that drive enablement costs. The report introduces the various DR enabling technologies and end-uses, identifies the various services that each can provide to the grid and provides the cost assessment for each enabling technology. In addition to a report, this research includes a Demand Response Advanced Controls Database and User Manual. They are intended to provide users with the data that underlies this research and instructions for how to use that database more effectively and efficiently.

  16. CMAQ Model Evaluation Framework

    Science.gov (United States)

    CMAQ is tested to establish the modeling system’s credibility in predicting pollutants such as ozone and particulate matter. Evaluation of CMAQ has been designed to assess the model’s performance for specific time periods and for specific uses.

  17. Creating an enabling environment for adolescent sexual and reproductive health: a framework and promising approaches.

    Science.gov (United States)

    Svanemyr, Joar; Amin, Avni; Robles, Omar J; Greene, Margaret E

    2015-01-01

    This article provides a conceptual framework and points out the key elements for creating enabling environments for adolescent sexual and reproductive health (ASRH). An ecological framework is applied to organize the key elements of enabling environments for ASRH. At the individual level, strategies that are being implemented and seem promising are those that empower girls, build their individual assets, and create safe spaces. At the relationship level, strategies that are being implemented and seem promising include efforts to build parental support and communication as well as peer support networks. At the community level, strategies to engage men and boys and the wider community to transform gender and other social norms are being tested and may hold promise. Finally, at the broadest societal level, efforts to promote laws and policies that protect and promote human rights and address societal awareness about ASRH issues, including through mass media approaches, need to be considered.

  18. Distilling the Antecedents and Enabling Dynamics of Leader Moral Courage: A Framework to Guide Action.

    Science.gov (United States)

    Hutchinson, Marie; Jackson, Debra; Daly, John; Usher, Kim

    2015-05-01

    Intelligent, robust and courageous nursing leadership is essential in all areas of nursing, including mental health. However, in the nursing leadership literature, the theoretical discourse regarding how leaders recognise the need for action and make the choice to act with moral purpose is currently limited. Little has been written about the cognitions, capabilities and contextual factors that enable leader courage. In particular, the interplay between leader values and actions that are characterised as good or moral remains underexplored in the nursing leadership literature. In this article, through a discursive literature synthesis we seek to distill a more detailed understanding of leader moral courage; specifically, what factors contribute to leaders' ability to act with moral courage, what factors impede such action, and what factors do leaders need to foster within themselves and others to enable action that is driven by moral courage. From the analysis, we distilled a multi-level framework that identifies a range of individual characteristics and capabilities, and enabling contextual factors that underpin leader moral courage. The framework suggests leader moral courage is more complex than often posited in theories of leadership, as it comprises elements that shape moral thought and conduct. Given the complexity and challenges of nursing work, the framework for moral action derived from our analysis provides insight and suggestions for strengthening individual and group capacity to assist nurse leaders and mental health nurses to act with integrity and courage.

  19. Cytoview: Development of a cell modelling framework

    Indian Academy of Sciences (India)

    Prashant Khodade; Samta Malhotra; Nirmal Kumar; M Sriram Iyengar; N Balakrishnan; Nagasuma Chandra

    2007-08-01

    The biological cell, a natural self-contained unit of prime biological importance, is an enormously complex machine that can be understood at many levels. A higher-level perspective of the entire cell requires integration of various features into coherent, biologically meaningful descriptions. There are some efforts to model cells based on their genome, proteome or metabolome descriptions. However, there are no established methods as yet to describe cell morphologies, capture similarities and differences between different cells or between healthy and disease states. Here we report a framework to model various aspects of a cell and integrate knowledge encoded at different levels of abstraction, with cell morphologies at one end to atomic structures at the other. The different issues that have been addressed are ontologies, feature description and model building. The framework describes dotted representations and tree data structures to integrate diverse pieces of data and parametric models enabling size, shape and location descriptions. The framework serves as a first step in integrating different levels of data available for a biological cell and has the potential to lead to development of computational models in our pursuit to model cell structure and function, from which several applications can flow out.

  20. Aggregate driver model to enable predictable behaviour

    Science.gov (United States)

    Chowdhury, A.; Chakravarty, T.; Banerjee, T.; Balamuralidhar, P.

    2015-09-01

    The categorization of driving styles, particularly in terms of aggressiveness and skill, is an emerging area of interest under the broader theme of intelligent transportation. There are two possible discriminatory techniques that can be applied for such categorization: a micro-scale (event-based) model and a macro-scale (aggregate) model. It is believed that an aggregate model will reveal many interesting aspects of human-machine interaction; for example, we may be able to understand the propensities of individuals to carry out a given task over longer periods of time. A useful driver model may include the adaptive capability of the human driver, aggregated as the individual propensity to control speed/acceleration. Towards that objective, we carried out experiments by deploying a smartphone-based application to be used for data collection by a group of drivers. Data are primarily collected from GPS measurements, including position and speed on a second-by-second basis, for a number of trips over a two-month period. Analysing the data set, aggregate models for individual drivers were created and their natural aggressiveness was deduced. In this paper, we present the initial results for 12 drivers. It is shown that the higher-order moments of the acceleration profile are an important parameter and identifier of journey quality. It is also observed that the kurtosis of the acceleration profiles stores major information about the driving styles. Such an observation leads to two different ranking systems based on acceleration data. Such driving behaviour models can be integrated with vehicle and road models and used to generate behavioural models for real traffic scenarios.
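The kurtosis feature the abstract highlights is straightforward to compute from an acceleration profile. A stdlib-only sketch with invented speed samples; a real pipeline would derive acceleration from the second-by-second GPS speeds described above.

```python
import statistics

def excess_kurtosis(xs):
    """Fourth standardized moment minus 3 (0 for a Gaussian profile)."""
    n = len(xs)
    mean = statistics.fmean(xs)
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m4 / (m2 ** 2) - 3.0

# Speeds in m/s sampled once per second -> accelerations by differencing.
speeds = [10.0, 10.5, 11.5, 11.0, 9.0, 9.5, 12.0, 11.8]
accel = [b - a for a, b in zip(speeds, speeds[1:])]
print(round(excess_kurtosis(accel), 3))
```

Heavy-tailed (high-kurtosis) acceleration profiles correspond to drivers who mostly cruise but occasionally brake or accelerate hard, which is why the statistic separates driving styles that share the same mean and variance.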

  1. Green communication: The enabler to multiple business models

    DEFF Research Database (Denmark)

    Lindgren, Peter; Clemmensen, Suberia; Taran, Yariv

    2010-01-01

    Companies stand at the forefront of a new business model reality with new potentials that will change their basic understanding and practice of running their business models radically. One of the drivers of this change is green communication, its strong relation to green business models and its possibility to enable lower energy consumption. This paper shows how green communication enables innovation of green business models and multiple business models running simultaneously in different markets to different customers.

  2. A framework for benchmarking land models

    Science.gov (United States)

    Luo, Y. Q.; Randerson, J. T.; Abramowitz, G.; Bacour, C.; Blyth, E.; Carvalhais, N.; Ciais, P.; Dalmonech, D.; Fisher, J. B.; Fisher, R.; Friedlingstein, P.; Hibbard, K.; Hoffman, F.; Huntzinger, D.; Jones, C. D.; Koven, C.; Lawrence, D.; Li, D. J.; Mahecha, M.; Niu, S. L.; Norby, R.; Piao, S. L.; Qi, X.; Peylin, P.; Prentice, I. C.; Riley, W.; Reichstein, M.; Schwalm, C.; Wang, Y. P.; Xia, J. Y.; Zaehle, S.; Zhou, X. H.

    2012-10-01

    Land models, which have been developed by the modeling community in the past few decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land model performances and, meanwhile, highlights major challenges at this infant stage of benchmark analysis. The framework includes (1) targeted aspects of model performance to be evaluated, (2) a set of benchmarks as defined references to test model performance, (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies, and (4) model improvement. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and land surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics in response to climate change across broad temporal and spatial scales. Thus, one major challenge is to select and define a limited number of benchmarks to effectively evaluate land model performance. The second challenge is to develop metrics of measuring mismatches between models and benchmarks. The metrics may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data-model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance to guide future development, thus enabling improved predictions of future states of ecosystems and climate. 
The near-future research effort should be on development of a set of widely acceptable benchmarks that can be used to objectively, effectively, and reliably evaluate fundamental properties of land models
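The scoring system the framework calls for, combining data-model mismatches across processes, can be sketched as a weighted sum of per-process skills normalised by acceptability thresholds. All numbers (process names, thresholds, weights, error values) are invented for illustration.

```python
def score(mismatches, thresholds, weights):
    """mismatches, thresholds: per-process RMSE-like values (lower is better).
    Returns a weighted skill score in [0, 1], with 1 = perfect agreement."""
    total = 0.0
    for process, err in mismatches.items():
        # Skill is 1.0 at zero error, 0.0 at/beyond the acceptability threshold.
        skill = max(0.0, 1.0 - err / thresholds[process])
        total += weights[process] * skill
    return total / sum(weights.values())

mismatches = {"carbon_flux": 0.4, "energy_flux": 1.2, "runoff": 0.1}
thresholds = {"carbon_flux": 1.0, "energy_flux": 2.0, "runoff": 0.5}
weights    = {"carbon_flux": 2.0, "energy_flux": 1.0, "runoff": 1.0}
print(round(score(mismatches, thresholds, weights), 3))   # 0.6
```

The thresholds play the role of the "a priori thresholds of acceptable model performance" named in the abstract; the weights let benchmark designers emphasise some processes or scales over others when comparing land models.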

  3. A framework for benchmarking land models

    Directory of Open Access Journals (Sweden)

    Y. Q. Luo

    2012-10-01

    Full Text Available Land models, which have been developed by the modeling community in the past few decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land model performances and, meanwhile, highlights major challenges at this infant stage of benchmark analysis. The framework includes (1) targeted aspects of model performance to be evaluated, (2) a set of benchmarks as defined references to test model performance, (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies, and (4) model improvement. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and land surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics in response to climate change across broad temporal and spatial scales. Thus, one major challenge is to select and define a limited number of benchmarks to effectively evaluate land model performance. The second challenge is to develop metrics of measuring mismatches between models and benchmarks. The metrics may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data–model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance to guide future development, thus enabling improved predictions of future states of ecosystems and climate. The near-future research effort should be on development of a set of widely acceptable benchmarks that can be used to objectively, effectively, and reliably evaluate fundamental properties

  4. Modeling-Enabled Systems Nutritional Immunology

    Directory of Open Access Journals (Sweden)

    Meghna Verma

    2016-02-01

    Full Text Available This review highlights the fundamental role of nutrition in the maintenance of health, the immune response and disease prevention. Emerging global mechanistic insights in the field of nutritional immunology cannot be gained through reductionist methods alone or by analyzing a single nutrient at a time. We propose to investigate nutritional immunology as a massively interacting system of interconnected multistage and multiscale networks that encompass hidden mechanisms by which nutrition, microbiome, metabolism, genetic predisposition and the immune system interact to delineate health and disease. The review sets an unconventional path to applying complex science methodologies to nutritional immunology research, discovery and development through ‘use cases’ centered around the impact of nutrition on the gut microbiome and immune responses. Our systems nutritional immunology analyses, that include modeling and informatics methodologies in combination with pre-clinical and clinical studies, have the potential to discover emerging systems-wide properties at the interface of the immune system, nutrition, microbiome, and metabolism.

  5. Conceptual Framework to Enable Early Warning of Relevant Phenomena (Emerging Phenomena and Big Data)

    Energy Technology Data Exchange (ETDEWEB)

    Schlicher, Bob G [ORNL; Abercrombie, Robert K [ORNL; Hively, Lee M [ORNL

    2013-01-01

    Graphs are commonly used to represent natural and man-made dynamic systems such as food webs, economic and social networks, gene regulation, and the internet. We describe a conceptual framework to enable early warning of relevant phenomena that is based on an artificial time-based, evolving network graph that can give rise to one or more recognizable structures. We propose to quantify the dynamics using the method of delays through Takens Theorem to produce another graph we call the Phase Graph. The Phase Graph enables us to quantify changes of the system that form a topology in phase space. Our proposed method is unique because it is based on dynamic system analysis that incorporates Takens Theorem, Graph Theory, and Franzosi-Pettini (F-P) theorem about topology and phase transitions. The F-P Theorem states that the necessary condition for phase transition is a change in the topology. By detecting a change in the topology that we represent as a set of M-order Phase Graphs, we conclude a corresponding change in the phase of the system. The onset of this phase change enables early warning of emerging relevant phenomena.
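The method of delays the abstract invokes can be sketched directly: embed a scalar time series into m-dimensional phase-space vectors, from which the authors' Phase Graph would then be constructed. The parameters `m` and `tau` and the sample series are illustrative.

```python
def delay_embed(series, m, tau):
    """Return the delay vectors (x[t], x[t+tau], ..., x[t+(m-1)*tau])
    for every t where the full vector fits inside the series."""
    last = len(series) - (m - 1) * tau
    return [tuple(series[t + k * tau] for k in range(m)) for t in range(last)]

x = [0, 1, 4, 9, 16, 25, 36]
print(delay_embed(x, m=3, tau=2))
# [(0, 4, 16), (1, 9, 25), (4, 16, 36)]
```

Takens' theorem is what licenses the step: under mild conditions, the geometry of these reconstructed vectors preserves the topology of the underlying dynamical system, so a topology change in the embedded points (the Franzosi-Pettini condition for a phase transition) can be read off the observable alone.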

  6. Mobile agent-enabled framework for structuring and building distributed systems on the internet

    Institute of Scientific and Technical Information of China (English)

    CAO Jiannong; ZHOU Jingyang; ZHU Weiwei; LI Xuhui

    2006-01-01

    Mobile agents have shown promise as a powerful means to complement and enhance existing technology in various application areas. In particular, existing work has demonstrated that MA can simplify the development and improve the performance of certain classes of distributed applications, especially those running on a wide-area, heterogeneous, and dynamic networking environment like the Internet. In our previous work, we extended the application of MA to the design of distributed control functions, which require the maintenance of logical relationships among and/or coordination of processing entities in a distributed system. A novel framework is presented for structuring and building distributed systems, which uses cooperating mobile agents as an aid to carry out coordination and cooperation tasks in distributed systems. The framework has been used for designing various distributed control functions such as load balancing and mutual exclusion in our previous work. In this paper, we use the framework to propose a novel approach to detecting deadlocks in distributed systems by using mobile agents, which demonstrates the adaptiveness and flexibility of mobile agents. We first describe the MAEDD (Mobile Agent Enabled Deadlock Detection) scheme, in which mobile agents are dispatched to collect and analyze deadlock information distributed across the network sites and, based on the analysis, to detect and resolve deadlocks. Then the design of an adaptive hybrid algorithm derived from the framework is presented. The algorithm can dynamically adapt itself to the changes in system state by using different deadlock detection strategies. The performance of the proposed algorithm has been evaluated using simulations. The results show that the algorithm can outperform existing algorithms that use a fixed deadlock detection strategy.
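The detection step a MAEDD-style agent performs can be sketched once the agent has gathered wait-for edges from the sites it visited: a cycle in the global wait-for graph signals a deadlock. The process names and the single-edge dict representation are invented for illustration; the paper's actual algorithms are more elaborate (adaptive strategy switching, resolution).

```python
def find_deadlock(wait_for):
    """wait_for: dict mapping a process to the process it is blocked on
    (processes that are not blocked are simply absent).
    Returns one deadlocked cycle as a list of processes, or None."""
    for start in wait_for:
        seen = []
        node = start
        # Walk the wait-for chain until it ends or revisits the path.
        while node in wait_for and node not in seen:
            seen.append(node)
            node = wait_for[node]
        if node in seen:                 # walked back onto the path: a cycle
            return seen[seen.index(node):]
    return None

# Edges an agent might have collected from three sites:
edges = {"P1": "P2", "P2": "P3", "P3": "P1", "P4": "P1"}
print(find_deadlock(edges))   # ['P1', 'P2', 'P3']
```

The mobile-agent advantage the abstract claims sits around this kernel: the agent decides at run time which sites to visit and which detection strategy to apply before assembling `wait_for`, rather than following one fixed protocol.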

  7. A Unified Framework for Systematic Model Improvement

    DEFF Research Database (Denmark)

    Kristensen, Niels Rode; Madsen, Henrik; Jørgensen, Sten Bay

    2003-01-01

    A unified framework for improving the quality of continuous time models of dynamic systems based on experimental data is presented. The framework is based on an interplay between stochastic differential equation (SDE) modelling, statistical tests and multivariate nonparametric regression...

  8. Internet enabled modelling of extended manufacturing enterprises using the process based techniques

    OpenAIRE

    Cheng, K; Popov, Y

    2004-01-01

The paper presents the preliminary results of an ongoing research project on Internet enabled process-based modelling of extended manufacturing enterprises. It is proposed to apply the Open System Architecture for CIM (CIMOSA) modelling framework alongside object-oriented Petri Net models of enterprise processes and object-oriented techniques for extended enterprise modelling. The main features of the proposed approach are described and some components discussed. Elementary examples of ...

  9. Presenting a framework for knowledge management within a web-enabled Living Lab

    Directory of Open Access Journals (Sweden)

    Lizette de Jager

    2012-02-01

Background: Many communities, countries and continents are only now realising the importance of discovering innovative collaborative knowledge. Knowledge management (KM) enables organisations to retain tacit knowledge. It has many advantages, like competitiveness, retaining workers’ knowledge as corporate assets and assigning value to it. The value of knowledge can never depreciate; it can only grow and become more and more valuable, because new knowledge is added continuously to existing knowledge. Objective: The objective of this study was to present a framework for KM processes and the use of social media tools in a Living Lab (LL) environment. Methods: In order to find a way to help organisations to retain tacit knowledge, the researchers conducted in-depth research. They used case studies and Grounded Theory (GT) to explore KM, social media tools and technologies as well as the LL environment. They emailed an online questionnaire and followed it up telephonically. The study targeted academic, support and administrative staff in higher education institutions nationwide to establish their level of KM knowledge, understanding of concepts and levels of application. Results: The researchers concluded that the participants did not know the term KM and therefore were not using KM. They only used information hubs, or general university systems, like Integrated Technology Software (ITS), to capture and store information. The researchers suggested including social media and managing them as tools to help CoPs to meet their knowledge requirements. Therefore, the researchers presented a framework that uses semantic technologies and the social media to address the problem. Conclusion: The success of the LL approach in developing new web-enabled LLs allows organisations to amalgamate various networks. The social media help organisations to gather, classify and verify knowledge.

  10. Enabling Sustainability: Hierarchical Need-Based Framework for Promoting Sustainable Data Infrastructure in Developing Countries

    Directory of Open Access Journals (Sweden)

    David O. Yawson

    2009-11-01

The paper presents thoughts on Sustainable Data Infrastructure (SDI) development, and its user requirements bases. It brings Maslow's motivational theory to the fore, and proposes it as a rationalization mechanism for entities (mostly governmental) that aim at realizing SDI. Maslow's theory, though well-known, is somewhat new in geospatial circles; this is where the novelty of the paper resides. SDI has been shown to enable and aid development in diverse ways. However, stimulating developing countries to appreciate the utility of SDI, implement, and use SDI in achieving sustainable development has proven to be an imposing challenge. One of the key reasons for this could be the absence of a widely accepted psychological theory to drive needs assessment and intervention design for the purpose of SDI development. As a result, it is reasonable to explore Maslow’s theory of human motivation as a psychological theory for promoting SDI in developing countries. In this article, we review and adapt Maslow’s hierarchy of needs as a framework for the assessment of the needs of developing nations. The paper concludes with the implications of this framework for policy with the view to stimulating the implementation of SDI in developing nations.

  11. GeoSpark SQL: An Effective Framework Enabling Spatial Queries on Spark

    Directory of Open Access Journals (Sweden)

    Zhou Huang

    2017-09-01

In the era of big data, Internet-based geospatial information services such as various LBS apps are deployed everywhere, followed by an increasing number of queries against the massive spatial data. As a result, traditional relational spatial databases (e.g., PostgreSQL with PostGIS and Oracle Spatial) cannot adapt well to the needs of large-scale spatial query processing. Spark is an outstanding emerging distributed computing framework in the Hadoop ecosystem. This paper aims to address the increasingly large-scale spatial query-processing requirement in the era of big data, and proposes an effective framework, GeoSpark SQL, which enables spatial queries on Spark. On the one hand, GeoSpark SQL provides a convenient SQL interface; on the other hand, it achieves both efficient storage management and high-performance parallel computing through integrating Hive and Spark. In this study, the following key issues are discussed and addressed: (1) storage management methods under the GeoSpark SQL framework, (2) the spatial operator implementation approach in the Spark environment, and (3) spatial query optimization methods under Spark. Experimental evaluation is also performed and the results show that GeoSpark SQL is able to achieve real-time query processing. It should be noted that Spark is not a panacea: the traditional spatial database PostGIS/PostgreSQL is observed to perform better than GeoSpark SQL in some query scenarios, especially for spatial queries with high selectivity, such as the point query and the window query. In general, GeoSpark SQL performs better when dealing with compute-intensive spatial queries such as the kNN query and the spatial join query.
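The two query classes the evaluation contrasts can be made concrete with a toy in-memory version. The following plain-Python sketch is illustrative only: GeoSpark SQL executes these operations as SQL on Spark, and none of the function names below come from its API. It shows why a window query is "high selectivity" (a cheap per-point filter) while kNN is compute-intensive (a distance ranking over all candidates).

```python
# Illustrative only: naive in-memory versions of the two query classes
# compared in the paper (not the GeoSpark SQL API).
import heapq
import math

points = [(0.0, 0.0), (1.0, 1.0), (2.0, 2.0), (5.0, 5.0), (9.0, 9.0)]

def window_query(pts, xmin, ymin, xmax, ymax):
    """High-selectivity query: points inside an axis-aligned rectangle."""
    return [(x, y) for x, y in pts if xmin <= x <= xmax and ymin <= y <= ymax]

def knn_query(pts, qx, qy, k):
    """Compute-intensive query: the k points nearest to (qx, qy)."""
    return heapq.nsmallest(k, pts, key=lambda p: math.hypot(p[0] - qx, p[1] - qy))

inside = window_query(points, 0.5, 0.5, 3.0, 3.0)
nearest = knn_query(points, 1.2, 1.2, 2)
```

A spatial database answers the window query almost for free via an R-tree index, which is why PostGIS wins there, whereas the distance ranking in kNN (and the pairwise work in spatial joins) rewards Spark's parallelism.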

  12. A Simulation and Modeling Framework for Space Situational Awareness

    Energy Technology Data Exchange (ETDEWEB)

    Olivier, S S

    2008-09-15

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. The framework is based on a flexible, scalable architecture to enable efficient, physics-based simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. The details of the modeling and simulation framework are described, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical brightness calculations, generic radar system models, generic optical system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The use of this integrated simulation and modeling framework on a specific scenario involving space debris is demonstrated.
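Among the listed components, orbital propagation is the most self-contained to illustrate. The sketch below is a minimal two-body, circular-orbit toy (real SSA propagators add perturbations such as J2, drag, and third-body effects); the function names are illustrative, not from the described framework.

```python
# A minimal two-body sketch of the kind of orbital propagation an SSA
# simulation framework must perform (real propagators model perturbations).
import math

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2

def orbital_period(semi_major_axis_m):
    """Kepler's third law: T = 2*pi*sqrt(a^3 / mu)."""
    return 2.0 * math.pi * math.sqrt(semi_major_axis_m**3 / MU_EARTH)

def propagate_circular(a_m, t_s):
    """Angular position (radians) of a circular orbit after t seconds."""
    n = math.sqrt(MU_EARTH / a_m**3)   # mean motion, rad/s
    return (n * t_s) % (2.0 * math.pi)

period = orbital_period(6778e3)   # ~400 km LEO altitude, roughly 92 minutes
```

In the framework described, many thousands of such propagations (for the full debris catalog) run in parallel, which is why the massively parallel design matters.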

  13. Smart Cities as Organizational Fields: A Framework for Mapping Sustainability-Enabling Configurations

    Directory of Open Access Journals (Sweden)

    Paul Pierce

    2017-08-01

Despite the impressive growth of smart city initiatives worldwide, an organizational theory of smart city has yet to be developed, and we lack models addressing the unprecedented organizational and management challenges that emerge in smart city contexts. Traditional models are often of little use, because smart cities pursue different goals than traditional organizations, are based on networked, cross-boundary activity systems, rely on distributed innovation processes, and imply adaptive policy-making. Complex combinations of factors may lead to vicious or virtuous cycles in smart city initiatives, but we know very little about how these factors may be identified and mapped. Based on an inductive study of a set of primary and secondary sources, we develop a framework for the configurational analysis of smart cities viewed as place-specific organizational fields. This framework identifies five key dimensions in the configurations of smart city fields; these five dimensions are mapped through five sub-frameworks, which can be used both separately as well as for an integrated analysis. Our contribution is conceived to support longitudinal studies, natural experiments and comparative analyses on smart city fields, and to improve our understanding of how different combinations of factors affect the capability of smart innovations to translate into city resilience, sustainability and quality of life. In addition, our results suggest that new forms of place-based entrepreneurship constitute the engine that allows for the dynamic collaboration between government, citizens and research centers in successful smart city organizational fields.

  14. Developing a framework for understanding and enabling open source drug discovery.

    Science.gov (United States)

    Allarakhia, Minna

    2010-08-01

    Open source drug discovery is increasingly being sought as a solution for managing product development complexities. Three drivers encouraging the use of the open source strategy include: upstream knowledge-based complexities associated with complementary assets, technological complexities given the scale of research and interdependencies between disciplines and downstream commercialization complexities. While literature currently discusses the need for open source strategies and their outcomes, we have reached a critical stage for a framework to cohesively understand how the drivers affect the open source models chosen as well as the governance strategies to ensure a successful outcome both in terms of knowledge access and product development. In this paper, an initial framework is designed with a focus on the type of participant as impacting the motivation to participate in an open source initiative, the objective of any open source strategy as impacting the structural model adopted and the structure of knowledge produced as impacting its management. It is anticipated that this framework should then provide an opportunity to develop governance rules for open source drug discovery initiatives.

  15. A framework for list representation, enabling list stabilization through incorporation of gene exchangeabilities

    CERN Document Server

    Soneson, Charlotte

    2011-01-01

    Analysis of multivariate data sets from e.g. microarray studies frequently results in lists of genes which are associated with some response of interest. The biological interpretation is often complicated by the statistical instability of the obtained gene lists with respect to sampling variations, which may partly be due to the functional redundancy among genes, implying that multiple genes can play exchangeable roles in the cell. In this paper we use the concept of exchangeability of random variables to model this functional redundancy and thereby account for the instability attributable to sampling variations. We present a flexible framework to incorporate the exchangeability into the representation of lists. The proposed framework supports straightforward robust comparison between any two lists. It can also be used to generate new, more stable gene rankings incorporating more information from the experimental data. Using a microarray data set from lung cancer patients we show that the proposed method prov...
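The core idea, treating functionally redundant genes as interchangeable when comparing lists, can be shown with a toy sketch. This is not the paper's probabilistic representation; the class map and scoring rule below are simplified illustrations.

```python
# Toy sketch (not the paper's actual framework): genes in the same
# functional class are treated as exchangeable, so two lists that differ
# only within classes score as identical.

def class_profile(gene_list, gene_to_class):
    """Collapse a gene list to the multiset of functional classes it hits."""
    profile = {}
    for g in gene_list:
        c = gene_to_class.get(g, g)   # genes without a class stand alone
        profile[c] = profile.get(c, 0) + 1
    return profile

def exchangeable_overlap(list_a, list_b, gene_to_class):
    """Fraction of list_a matched by list_b, up to within-class exchanges."""
    pa = class_profile(list_a, gene_to_class)
    pb = class_profile(list_b, gene_to_class)
    shared = sum(min(n, pb.get(c, 0)) for c, n in pa.items())
    return shared / len(list_a)

classes = {"geneA1": "C1", "geneA2": "C1", "geneB": "C2"}
# The two lists disagree only on which member of class C1 they report:
s = exchangeable_overlap(["geneA1", "geneB"], ["geneA2", "geneB"], classes)
```

Under a plain set-overlap measure the two lists above would agree on only half their entries; treating `geneA1` and `geneA2` as exchangeable recognizes them as equivalent findings, which is the stabilization effect the abstract describes.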

  16. Crystallization Kinetics within a Generic Modelling Framework

    DEFF Research Database (Denmark)

    Meisler, Kresten Troelstrup; von Solms, Nicolas; Gernaey, Krist

    2013-01-01

An existing generic modelling framework has been expanded with tools for kinetic model analysis. The analysis of kinetics is carried out within the framework, where kinetic constitutive models are collected, analysed and utilized for the simulation of crystallization operations. A modelling procedure is proposed to obtain the kinetic model analysis for a crystallization operation and to utilize it for faster evaluation of crystallization operations.

  17. Geologic Framework Model Analysis Model Report

    Energy Technology Data Exchange (ETDEWEB)

    R. Clayton

    2000-12-19

The purpose of this report is to document the Geologic Framework Model (GFM), Version 3.1 (GFM3.1) with regard to data input, modeling methods, assumptions, uncertainties, limitations, and validation of the model results, qualification status of the model, and the differences between Version 3.1 and previous versions. The GFM represents a three-dimensional interpretation of the stratigraphy and structural features of the location of the potential Yucca Mountain radioactive waste repository. The GFM encompasses an area of 65 square miles (170 square kilometers) and a volume of 185 cubic miles (771 cubic kilometers). The boundaries of the GFM were chosen to encompass the most widely distributed set of exploratory boreholes (the Water Table or WT series) and to provide a geologic framework over the area of interest for hydrologic flow and radionuclide transport modeling through the unsaturated zone (UZ). The depth of the model is constrained by the inferred depth of the Tertiary-Paleozoic unconformity. The GFM was constructed from geologic map and borehole data. Additional information from measured stratigraphy sections, gravity profiles, and seismic profiles was also considered. This interim change notice (ICN) was prepared in accordance with the Technical Work Plan for the Integrated Site Model Process Model Report Revision 01 (CRWMS M&O 2000). The constraints, caveats, and limitations associated with this model are discussed in the appropriate text sections that follow. The GFM is one component of the Integrated Site Model (ISM) (Figure 1), which has been developed to provide a consistent volumetric portrayal of the rock layers, rock properties, and mineralogy of the Yucca Mountain site. The ISM consists of three components: (1) Geologic Framework Model (GFM); (2) Rock Properties Model (RPM); and (3) Mineralogic Model (MM).
The ISM merges the detailed project stratigraphy into model stratigraphic units that are most useful for the primary downstream models and the

  18. A novel machine learning-enabled framework for instantaneous heart rate monitoring from motion-artifact-corrupted electrocardiogram signals.

    Science.gov (United States)

    Zhang, Qingxue; Zhou, Dian; Zeng, Xuan

    2016-11-01

This paper proposes a novel machine learning-enabled framework to robustly monitor the instantaneous heart rate (IHR) from wrist-electrocardiography (ECG) signals continuously and heavily corrupted by random motion artifacts in wearable applications. The framework includes two stages, i.e. heartbeat identification and refinement, respectively. In the first stage, an adaptive threshold-based auto-segmentation approach is proposed to select heartbeat candidates, including the real heartbeats and large amounts of motion-artifact-induced interferential spikes. Then twenty-six features are extracted for each candidate in the time, spatial, frequency and statistical domains, and evaluated by a sparse support vector machine (SVM) to select ten critical features which can effectively reveal residual heartbeat information. Afterwards, an SVM model, created on the training data using the selected feature set, is applied to find high-confidence heartbeats from a large number of candidates in the testing data. In the second stage, the SVM classification results are further refined by two steps: (1) a rule-based classifier with two attributes named 'continuity check' and 'locality check' for outlier (false positives) removal, and (2) a heartbeat interpolation strategy for missing-heartbeat (false negatives) recovery. The framework is evaluated on a wrist-ECG dataset acquired by a semi-customized platform and also a public dataset. When the signal-to-noise ratio is as low as -7 dB, the mean absolute error of the estimated IHR is 1.4 beats per minute (BPM) and the root mean square error is 6.5 BPM. The proposed framework greatly outperforms well-established approaches, demonstrating that it can effectively identify the heartbeats from ECG signals continuously corrupted by intense motion artifacts and robustly estimate the IHR. This study is expected to contribute to robust long-term wearable IHR monitoring for pervasive heart health and fitness management.
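The second-stage refinement lends itself to a small sketch: reject a detected beat whose interval to its neighbour is physiologically implausible, then recover a missed beat by interpolation across an overlong gap. The thresholds and function names below are hypothetical illustrations, not the paper's tuned rules.

```python
# Illustrative sketch of the two refinement steps (thresholds hypothetical):
# a continuity-style check drops false positives, then interpolation
# recovers a missed beat inside an implausibly long R-R interval.

def continuity_filter(beat_times, min_rr=0.3):
    """Drop beats closer than min_rr seconds to the previous kept beat."""
    kept = []
    for t in beat_times:
        if not kept or t - kept[-1] >= min_rr:
            kept.append(t)
    return kept

def interpolate_missing(beat_times, max_rr=1.5):
    """Insert a midpoint beat wherever the R-R interval is too long."""
    out = [beat_times[0]]
    for t in beat_times[1:]:
        if t - out[-1] > max_rr:
            out.append((t + out[-1]) / 2.0)   # one recovered beat
        out.append(t)
    return out

# A false positive at 1.05 s and a missed beat between 2.0 s and 4.0 s:
raw = [0.0, 1.0, 1.05, 2.0, 4.0]
refined = interpolate_missing(continuity_filter(raw))
```

The instantaneous heart rate then follows directly from the refined R-R intervals (60 divided by the interval in seconds).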

  19. A comprehensive database and analysis framework to incorporate multiscale data types and enable integrated analysis of bioactive polyphenols.

    Science.gov (United States)

    Pasinetti, Giulio M; Ho, Lap; Cheng, Haoxiang; Wang, Jun; Simon, James E; Wu, Qing-Li; Zhao, Danyue; Carry, Eileen; Ferruzzi, Mario G; Faith, Jeremiah; Valcarcel, Breanna; Hao, Ke

    2017-06-30

The development of a given botanical preparation for eventual clinical application requires extensive, detailed characterizations of the chemical composition, as well as the biological availability, biological activity and safety profiles of the botanical. These issues are typically addressed using diverse experimental protocols and model systems. Based on this consideration, in this study we established a comprehensive database and analysis framework for the collection, collation and integrative analysis of diverse, multiscale data sets. Using this framework, we conducted an integrative analysis of heterogeneous data from in vivo and in vitro investigation of a complex bioactive dietary polyphenol-rich preparation (BDPP) and built an integrated network linking datasets generated from this multitude of diverse experimental paradigms. We established a comprehensive database and analysis framework as well as a systematic and logical means to catalogue and collate the diverse array of information gathered, which is securely stored and added to in a standardized manner to enable fast query. We demonstrated the utility of the database in: (1) a statistical ranking scheme to prioritize response to treatments and (2) in-depth reconstruction of functionality studies. By examination of these datasets, the system allows analytical querying of heterogeneous data and the access of information related to interactions, mechanisms of action, functions, etc., which ultimately provide a global overview of complex biological responses. Collectively, we present an integrative analysis framework that leads to novel insights on the biological activities of a complex botanical such as BDPP that is based on data-driven characterizations of interactions between BDPP-derived phenolic metabolites, their mechanisms of action, as well as synergism and/or potential cancellation of biological functions. Our integrative analytical approach provides novel means for a systematic integrative

  20. Deriving Framework Usages Based on Behavioral Models

    Science.gov (United States)

    Zenmyo, Teruyoshi; Kobayashi, Takashi; Saeki, Motoshi

One of the critical issues in framework-based software development is the huge introduction cost caused by the technical gap between developers and users of frameworks. This paper proposes a technique for deriving framework usages to implement a given requirements specification. By using the derived usages, users can use the frameworks without understanding the framework in detail. Requirements specifications that describe definite behavioral requirements cannot be related to frameworks as-is, since the frameworks do not have a definite control structure, so that users can customize them to suit given requirements specifications. To cope with this issue, a new technique based on satisfiability problems (SAT) is employed to derive the control structures of the framework model. In the proposed technique, requirements specifications and frameworks are modeled based on Labeled Transition Systems (LTSs) with branch conditions represented by predicates. Truth assignments of the branch conditions in the framework models are not given initially, representing the customizable control structure. The derivation of truth assignments of the branch conditions is regarded as a SAT problem by assuming relations between the termination states of the requirements specification model and those of the framework model. This derivation technique is incorporated into a technique we have previously proposed for relating actions of requirements specifications to those of frameworks. Furthermore, this paper discusses a case study of typical use cases in e-commerce systems.
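The derivation idea can be miniaturized: the framework's control structure leaves branch conditions undetermined, and a truth assignment is sought under which the framework model terminates in the state the specification requires. The toy below uses brute force over two Boolean branch conditions instead of a SAT solver, and the state names are invented for illustration.

```python
# Toy version of the derivation (much simplified from the paper's SAT
# encoding over LTSs): find a truth assignment of the framework's branch
# conditions that reaches the specification's required terminal state.
from itertools import product

def run_framework(b1, b2):
    """Tiny framework LTS: branch conditions b1, b2 pick the terminal state."""
    if not b1:
        return "skipped"
    return "confirmed" if b2 else "rejected"

def derive_assignment(required_terminal):
    """Brute-force search for (b1, b2) realizing the required terminal state."""
    for b1, b2 in product([True, False], repeat=2):
        if run_framework(b1, b2) == required_terminal:
            return {"b1": b1, "b2": b2}
    return None   # unsatisfiable: no customization realizes the spec

assignment = derive_assignment("confirmed")
```

In the paper's setting the search space is far larger, which is exactly why an off-the-shelf SAT solver replaces this exhaustive loop.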

  1. Surfactant-Free Shape Control of Gold Nanoparticles Enabled by Unified Theoretical Framework of Nanocrystal Synthesis.

    Science.gov (United States)

    Wall, Matthew A; Harmsen, Stefan; Pal, Soumik; Zhang, Lihua; Arianna, Gianluca; Lombardi, John R; Drain, Charles Michael; Kircher, Moritz F

    2017-06-01

    Gold nanoparticles have unique properties that are highly dependent on their shape and size. Synthetic methods that enable precise control over nanoparticle morphology currently require shape-directing agents such as surfactants or polymers that force growth in a particular direction by adsorbing to specific crystal facets. These auxiliary reagents passivate the nanoparticles' surface, and thus decrease their performance in applications like catalysis and surface-enhanced Raman scattering. Here, a surfactant- and polymer-free approach to achieving high-performance gold nanoparticles is reported. A theoretical framework to elucidate the growth mechanism of nanoparticles in surfactant-free media is developed and it is applied to identify strategies for shape-controlled syntheses. Using the results of the analyses, a simple, green-chemistry synthesis of the four most commonly used morphologies: nanostars, nanospheres, nanorods, and nanoplates is designed. The nanoparticles synthesized by this method outperform analogous particles with surfactant and polymer coatings in both catalysis and surface-enhanced Raman scattering. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Frameworks for understanding and describing business models

    DEFF Research Database (Denmark)

    Nielsen, Christian; Roslender, Robin

    2014-01-01

This chapter provides, in a chronological fashion, an introduction to six frameworks that one can apply to describing, understanding and also potentially innovating business models. These six frameworks have been chosen carefully as they represent six very different perspectives on business models and in this manner “complement” each other. There are a multitude of varying frameworks that could be chosen from and we urge the reader to search and trial these for themselves. The six chosen models (year of release in parenthesis) are: • Service-Profit Chain (1994) • Strategic Systems Auditing (1997) • Strategy Maps (2001) • Intellectual Capital Statements (2003) • Chesbrough’s framework for Open Business Models (2006) • Business Model Canvas (2008)

  3. A UML profile for framework modeling

    Institute of Scientific and Technical Information of China (English)

    XU Xiao-liang(徐小良); WANG Le-yu(汪乐宇); ZHOU Hong(周泓)

    2004-01-01

The current standard Unified Modeling Language (UML) could not model framework flexibility and extendibility adequately, due to a lack of appropriate constructs to distinguish framework hot-spots from kernel elements. A new UML profile that may customize UML for framework modeling was presented using the extension mechanisms of UML, providing a group of UML extensions to meet the needs of framework modeling. In this profile, extended class diagrams and sequence diagrams were defined to straightforwardly identify the hot-spots and describe their instantiation restrictions. A transformation model based on design patterns was also put forward, such that the profile-based framework design diagrams could be automatically mapped to the corresponding implementation diagrams. It was shown that the presented profile makes framework modeling more straightforward and therefore easier to understand and instantiate.

  4. A UML profile for framework modeling.

    Science.gov (United States)

    Xu, Xiao-liang; Wang, Le-yu; Zhou, Hong

    2004-01-01

The current standard Unified Modeling Language (UML) could not model framework flexibility and extendability adequately, due to a lack of appropriate constructs to distinguish framework hot-spots from kernel elements. A new UML profile that may customize UML for framework modeling was presented using the extension mechanisms of UML, providing a group of UML extensions to meet the needs of framework modeling. In this profile, extended class diagrams and sequence diagrams were defined to straightforwardly identify the hot-spots and describe their instantiation restrictions. A transformation model based on design patterns was also put forward, such that the profile-based framework design diagrams could be automatically mapped to the corresponding implementation diagrams. It was shown that the presented profile makes framework modeling more straightforward and therefore easier to understand and instantiate.

  5. Detailed Modeling and Response of Demand Response Enabled Appliances

    Energy Technology Data Exchange (ETDEWEB)

    Vyakaranam, Bharat; Fuller, Jason C.

    2014-04-14

Proper modeling of end-use loads is very important in order to predict their behavior and how they interact with the power system, including voltage and temperature dependencies, power system and load control functions, and the complex interactions that occur between devices in such an interconnected system. This paper develops multi-state, time-variant residential appliance models with demand response (DR) enabled capabilities in the GridLAB-D™ simulation environment. These models represent not only the baseline instantaneous power demand and energy consumption, but also the control systems developed by GE Appliances to enable response to demand response signals and the change in behavior of the appliance in response to the signal. These DR-enabled appliances are simulated to estimate their capability to reduce peak demand and energy consumption.
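The structure of such a multi-state appliance model can be sketched with a toy thermostatic load: a DR event lowers the setpoint, which switches the heater off for the event and defers its consumption. All constants and the control rule below are invented for illustration; they are not the GE Appliances control logic or a GridLAB-D model.

```python
# Toy multi-state thermostatic appliance (illustrative only, not the GE or
# GridLAB-D model): a DR signal lowers the setpoint, shedding load during
# the event; the deferred heating is recovered afterward.

def simulate(hours, dr_active, power_kw=4.5):
    """Return the hourly kW draw of a water heater under a DR signal."""
    temp, heating, profile = 55.0, False, []
    for h in range(hours):
        target = 55.0 - (5.0 if dr_active(h) else 0.0)   # DR setback, degC
        if temp < target - 5.0:       # below deadband: switch element on
            heating = True
        elif temp >= target:          # reached target: switch element off
            heating = False
        profile.append(power_kw if heating else 0.0)
        temp += (4.0 if heating else 0.0) - 2.0   # heat input minus losses
    return profile

baseline = simulate(24, dr_active=lambda h: False)
with_dr = simulate(24, dr_active=lambda h: 17 <= h <= 20)   # evening event
```

Comparing the two profiles shows the effect the paper quantifies: zero draw during the event window, with a post-event recovery that shifts rather than fully eliminates energy use.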

  6. Enabling pathways to health equity: developing a framework for implementing social capital in practice.

    Science.gov (United States)

    Putland, Christine; Baum, Fran; Ziersch, Anna; Arthurson, Kathy; Pomagalska, Dorota

    2013-05-29

    relationship requires long term vision, endorsement for cross-sectoral work, well-developed relationships and theoretical and practical knowledge. Attention to the practical application of social capital theory shows that community projects require structural support in their efforts to improve health and wellbeing and reduce health inequities. Sound community development techniques are essential but do not operate independently from frameworks and policies at the highest levels of government. Recognition of the interdependence of policy and practice will enable government to achieve these goals more effectively.

  7. Moose: An Open-Source Framework to Enable Rapid Development of Collaborative, Multi-Scale, Multi-Physics Simulation Tools

    Science.gov (United States)

    Slaughter, A. E.; Permann, C.; Peterson, J. W.; Gaston, D.; Andrs, D.; Miller, J.

    2014-12-01

    The Idaho National Laboratory (INL)-developed Multiphysics Object Oriented Simulation Environment (MOOSE; www.mooseframework.org), is an open-source, parallel computational framework for enabling the solution of complex, fully implicit multiphysics systems. MOOSE provides a set of computational tools that scientists and engineers can use to create sophisticated multiphysics simulations. Applications built using MOOSE have computed solutions for chemical reaction and transport equations, computational fluid dynamics, solid mechanics, heat conduction, mesoscale materials modeling, geomechanics, and others. To facilitate the coupling of diverse and highly-coupled physical systems, MOOSE employs the Jacobian-free Newton-Krylov (JFNK) method when solving the coupled nonlinear systems of equations arising in multiphysics applications. The MOOSE framework is written in C++, and leverages other high-quality, open-source scientific software packages such as LibMesh, Hypre, and PETSc. MOOSE uses a "hybrid parallel" model which combines both shared memory (thread-based) and distributed memory (MPI-based) parallelism to ensure efficient resource utilization on a wide range of computational hardware. MOOSE-based applications are inherently modular, which allows for simulation expansion (via coupling of additional physics modules) and the creation of multi-scale simulations. Any application developed with MOOSE supports running (in parallel) any other MOOSE-based application. Each application can be developed independently, yet easily communicate with other applications (e.g., conductivity in a slope-scale model could be a constant input, or a complete phase-field micro-structure simulation) without additional code being written. This method of development has proven effective at INL and expedites the development of sophisticated, sustainable, and collaborative simulation tools.
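The fully implicit coupled solve at MOOSE's core can be illustrated on a two-equation toy. MOOSE itself is Jacobian-free (JFNK, approximating Jacobian-vector products inside a Krylov solver); for clarity this sketch forms a finite-difference Jacobian explicitly and inverts the 2x2 system directly, so it shows the coupled Newton idea rather than the JFNK machinery.

```python
# Sketch of a fully implicit coupled Newton solve on a toy 2-equation
# system (MOOSE uses matrix-free JFNK with Krylov inner solves instead).

def residual(u):
    x, y = u
    return [x * x + y - 2.0,    # "physics 1", coupled to y
            x + y * y - 2.0]    # "physics 2", coupled to x

def newton_solve(u, tol=1e-10, max_iter=50, eps=1e-7):
    for _ in range(max_iter):
        f = residual(u)
        if max(abs(v) for v in f) < tol:
            return u
        # Finite-difference Jacobian, built column by column.
        J = [[0.0, 0.0], [0.0, 0.0]]
        for j in range(2):
            up = list(u)
            up[j] += eps
            fp = residual(up)
            for i in range(2):
                J[i][j] = (fp[i] - f[i]) / eps
        # Direct 2x2 solve of J * du = -f via Cramer's rule.
        det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
        du0 = (-f[0] * J[1][1] + f[1] * J[0][1]) / det
        du1 = (-f[1] * J[0][0] + f[0] * J[1][0]) / det
        u = [u[0] + du0, u[1] + du1]
    return u

solution = newton_solve([1.5, 0.5])   # converges to the coupled root (1, 1)
```

Solving both residuals in one Newton update, rather than alternating between them, is what "fully implicit" buys: tight coupling between the physics is resolved within each nonlinear iteration.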

  8. CUDAMPF: a multi-tiered parallel framework for accelerating protein sequence search in HMMER on CUDA-enabled GPU.

    Science.gov (United States)

    Jiang, Hanyu; Ganesan, Narayan

    2016-02-27

The HMMER software suite is widely used for analysis of homologous protein and nucleotide sequences with high sensitivity. The latest version of hmmsearch in HMMER 3.x utilizes a heuristic pipeline which consists of the MSV/SSV (Multiple/Single ungapped Segment Viterbi) stage, the P7Viterbi stage and the Forward scoring stage to accelerate homology detection. Since the latest version is highly optimized for performance on modern multi-core CPUs with SSE capabilities, only a few acceleration attempts report speedup. However, the most compute-intensive tasks within the pipeline (viz., the MSV/SSV and P7Viterbi stages) still stand to benefit from the computational capabilities of massively parallel processors. A Multi-Tiered Parallel Framework (CUDAMPF) implemented on CUDA-enabled GPUs is presented here, offering finer-grained parallelism for the MSV/SSV and Viterbi algorithms. We couple the SIMT (Single Instruction Multiple Threads) mechanism with SIMD (Single Instruction Multiple Data) video instructions and warp-synchronism to achieve high-throughput processing and eliminate thread idling. We also propose a hardware-aware optimal allocation scheme for scarce resources like on-chip memory and caches in order to boost the performance and scalability of CUDAMPF. In addition, runtime compilation via NVRTC, available with CUDA 7.0, is incorporated into the presented framework; it not only helps unroll the innermost loop to yield up to 2- to 3-fold speedup over static compilation, but also enables dynamic loading and switching of kernels depending on the query model size, in order to achieve optimal performance. CUDAMPF is designed as a hardware-aware parallel framework for accelerating computational hotspots within the hmmsearch pipeline as well as other sequence alignment applications. It achieves significant speedup by exploiting hierarchical parallelism on a single GPU and takes full advantage of limited resources based on their own performance features. In addition to exceeding performance of other

  9. A Simulation and Modeling Framework for Space Situational Awareness

    Science.gov (United States)

    Olivier, S.

    This paper describes the development and initial demonstration of a new, integrated modeling and simulation framework, encompassing the space situational awareness enterprise, for quantitatively assessing the benefit of specific sensor systems, technologies and data analysis techniques. This framework includes detailed models for threat scenarios, signatures, sensors, observables and knowledge extraction algorithms. The framework is based on a flexible, scalable architecture to enable efficient simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel computer systems available, for example, at Lawrence Livermore National Laboratory. We will describe the details of the modeling and simulation framework, including hydrodynamic models of satellite intercept and debris generation, orbital propagation algorithms, radar cross section calculations, optical and infra-red brightness calculations, generic radar system models, generic optical and infra-red system models, specific Space Surveillance Network models, object detection algorithms, orbit determination algorithms, and visualization tools. The specific modeling of the Space Surveillance Network is performed in collaboration with the Air Force Space Command Space Control Group. We will demonstrate the use of this integrated simulation and modeling framework on specific threat scenarios, including space debris and satellite maneuvers, and we will examine the results of case studies involving the addition of new sensor systems, used in conjunction with the Space Surveillance Network, for improving space situational awareness.
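
    As a flavor of one ingredient of such a framework, orbital propagation can be as simple as numerically integrating two-body motion. The planar RK4 sketch below is illustrative only; a production SSA code would use perturbed, three-dimensional propagators:

```python
import math

MU_EARTH = 398600.4418  # Earth's gravitational parameter, km^3/s^2

def deriv(state):
    """Time derivative of a planar two-body state (x, y, vx, vy)."""
    x, y, vx, vy = state
    r3 = math.hypot(x, y) ** 3
    return (vx, vy, -MU_EARTH * x / r3, -MU_EARTH * y / r3)

def rk4_step(state, dt):
    """One classical Runge-Kutta 4 step."""
    nudge = lambda s, k, h: tuple(si + h * ki for si, ki in zip(s, k))
    k1 = deriv(state)
    k2 = deriv(nudge(state, k1, dt / 2))
    k3 = deriv(nudge(state, k2, dt / 2))
    k4 = deriv(nudge(state, k3, dt))
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

def propagate(state, dt, steps):
    """Propagate a state vector forward by steps * dt seconds."""
    for _ in range(steps):
        state = rk4_step(state, dt)
    return state
```

    For a circular orbit the integrator should conserve the orbital radius closely, which makes a convenient sanity check.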

  10. Plasma Modeling Enabled Technology Development Empowered by Fundamental Scattering Data

    Science.gov (United States)

    Kushner, Mark J.

    2016-05-01

    Technology development increasingly relies on modeling to speed the innovation cycle. This is particularly true for systems using low temperature plasmas (LTPs) and their role in enabling energy efficient processes with minimal environmental impact. In the innovation cycle, LTP modeling supports investigation of fundamental processes that seed the cycle, optimization of newly developed technologies, and prediction of performance of unbuilt systems for new applications. Although proof-of-principle modeling may be performed for idealized systems in simple gases, technology development must address physically complex systems that use complex gas mixtures that now may be multi-phase (e.g., in contact with liquids). The variety of fundamental electron and ion scattering, and radiation transport data (FSRD) required for this modeling increases as the innovation cycle progresses, while the accuracy required of that data depends on the intended outcome. In all cases, the fidelity, depth and impact of the modeling depends on the availability of FSRD. Modeling and technology development are, in fact, empowered by the availability and robustness of FSRD. In this talk, examples of the impact of and requirements for FSRD in the innovation cycle enabled by plasma modeling will be discussed using results from multidimensional and global models. Examples of fundamental studies and technology optimization will focus on microelectronics fabrication and on optically pumped lasers. Modeling of systems as yet unbuilt will address the interaction of atmospheric pressure plasmas with liquids. Work supported by DOE Office of Fusion Energy Science and the National Science Foundation.

  11. A community-based framework for aquatic ecosystem models

    DEFF Research Database (Denmark)

    Trolle, Didde; Hamilton, D. P.; Hipsey, M. R.;

    2012-01-01

    Here, we communicate a point of departure in the development of aquatic ecosystem models, namely a new community-based framework, which supports an enhanced and transparent union between the collective expertise that exists in the communities of traditional ecologists and model developers. Through a literature survey, we document the growing importance of numerical aquatic ecosystem models while also noting the difficulties, up until now, of the aquatic scientific community in making significant advances in these models during the past two decades. Through a common forum for aquatic ecosystem modellers we aim to (i) advance collaboration within the aquatic ecosystem modelling community, (ii) enable increased use of models for research, policy and ecosystem-based management, (iii) facilitate a collective framework using common (standardised) code to ensure that model development is incremental, (iv) …

  12. Towards A Framework For ICT-Enabled Materials Management In Complex Projects

    Directory of Open Access Journals (Sweden)

    N. B. Kasim

    2011-10-01

    Full Text Available This paper describes a research project aimed at developing a system that integrates RFID-based materials management with resources modelling in project management to improve on-site materials tracking and inventory management processes. To develop the system, a comprehensive literature review and exploratory case studies were conducted to investigate current practices, problems, the implementation of ICT, and the potential use of emerging technologies (such as RFID and wireless technologies) in overcoming the logistical difficulties associated with materials management. An initial assessment revealed that there is potential to improve the tracking and management of materials using modern ICT, which will enhance the operational efficiency of the project delivery process. Moreover, sophisticated technologies such as wireless systems and tagging are not generally used to overcome human error in materials identification and the space constraints inherent in many projects. This paper concludes with findings from the case studies for developing a real-time materials tracking framework to support construction professionals in handling materials more effectively.
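
    The core of real-time materials tracking — RFID read events updating an item's last known zone — can be sketched in a few lines. The class and zone names below are illustrative, not taken from the project:

```python
from datetime import datetime, timezone

class MaterialsTracker:
    """Toy register of tagged materials: each RFID read event updates the
    item's last known zone, giving a real-time on-site inventory view."""

    def __init__(self):
        self.items = {}  # tag_id -> (zone, timestamp)

    def record_read(self, tag_id, reader_zone, when=None):
        """Store the latest read for a tag (reader location implies zone)."""
        self.items[tag_id] = (reader_zone, when or datetime.now(timezone.utc))

    def locate(self, tag_id):
        """Last known zone for a tag, or None if it was never read."""
        entry = self.items.get(tag_id)
        return entry[0] if entry else None

    def inventory(self, zone):
        """All tags whose last read placed them in the given zone."""
        return sorted(t for t, (z, _) in self.items.items() if z == zone)
```

    A real deployment would feed this from reader middleware and reconcile reads against delivery schedules, but the event-driven update pattern is the same.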

  13. A Generic Framework for Enabling the Flow of Sensor Observations to Archives: O2A

    Science.gov (United States)

    Gerchow, Peter; Koppe, Roland; Macario, Ana; Haas, Antonie; Shäfer-Neth, Christian

    2015-04-01

    Over the last two decades, the Alfred Wegener Institute (AWI) has been continuously committed to developing and sustaining an e-Infrastructure for coherent discovery, visualization, dissemination and archival of scientific information in polar and marine regions. Most of the data originates from research activities carried out on a wide range of AWI-operated research platforms: vessels, land-based stations, ocean-based stations and aircraft. Archival and publishing in the PANGAEA repository, along with DOI assignment to individual datasets, is a typical end-of-line step for most data owners. Within AWI, a workflow for data acquisition from vessel-mounted devices, along with ingestion procedures for the raw data into the institutional archives, has been well established for many years. However, the increasing number of ocean-based stations and respective sensors, along with heterogeneous project-driven requirements for satellite communication, sensor monitoring, QA/QC control and validation, processing algorithms, visualization and dissemination, has recently led us to build a more generic and cost-effective framework. This framework, hereafter named O2A, has as its main strength a seamless flow of sensor observations to archives, and it complies with internationally used OGC standards, thus ensuring interoperability in an international context (e.g., SOS/SWE, WPS, WMS, WFS, …). O2A is comprised of several extensible and exchangeable modules (e.g., controlled vocabularies and gazetteers, file type and structure validation, aggregation solutions, processing algorithms, etc.) as well as various interoperability services. At the first data tier level, not only is each sensor described following the SensorML data model standard, but the data is also fed to an SOS interface offering streaming solutions along with support for O&M encoding.
    Project administrators or data specialists are now able to monitor the individual sensors displayed on a map by simply clicking
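
    One tier of such a sensor-to-archive flow can be sketched as a small ingestion chain: a range-check QC step gates what reaches the archive. Sensor ids, limits, and the flagging scheme below are hypothetical, standing in for O2A's configurable validation modules:

```python
class ObservationPipeline:
    """Toy sensor-to-archive chain: range-check QC, then hand good values
    to an archive sink; out-of-range values are counted and dropped."""

    def __init__(self, sensor_id, valid_min, valid_max):
        self.sensor_id = sensor_id
        self.valid_min, self.valid_max = valid_min, valid_max
        self.archive = []   # accepted observations (the "archive" sink)
        self.rejected = 0   # count of observations failing QC

    def ingest(self, timestamp, value):
        """Apply the range check and archive the observation if it passes."""
        if self.valid_min <= value <= self.valid_max:
            self.archive.append((self.sensor_id, timestamp, value, "good"))
        else:
            self.rejected += 1
```

    In a real framework the QC step would be one of several chained, exchangeable modules (spike tests, stuck-value tests, cross-sensor validation) and the sink would be an SOS/archive service rather than a list.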

  14. Crystallization Kinetics within a Generic Modeling Framework

    DEFF Research Database (Denmark)

    Meisler, Kresten Troelstrup; von Solms, Nicolas; Gernaey, Krist V.

    2014-01-01

    A new and extended version of a generic modeling framework for analysis and design of crystallization operations is presented. The new features of this framework are described, with focus on development, implementation, identification, and analysis of crystallization kinetic models. Issues related to the modeling of various kinetic phenomena like nucleation, growth, agglomeration, and breakage are discussed in terms of model forms, model parameters, their availability and/or estimation, and their selection and application for specific crystallization operational scenarios under study. The advantages of employing a well-structured model library for storage, use/reuse, and analysis of the kinetic models are highlighted. Examples illustrating the application of the modeling framework for kinetic model discrimination related to simulation of specific crystallization scenarios and for kinetic model parameter …
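
    The idea of a model library of interchangeable kinetic forms can be sketched as follows. The power-law expressions in supersaturation S and the parameter values are generic textbook forms for illustration, not the framework's actual models:

```python
# A toy "model library": named kinetic model forms with default parameters.
# Each entry maps a phenomenon to a rate expression in supersaturation S.
KINETIC_MODELS = {
    # nucleation rate B = kb * S^b  (nuclei per volume per time)
    "nucleation": lambda S, kb=1e4, b=2.0: kb * max(S, 0.0) ** b,
    # growth rate G = kg * S^g  (length per time)
    "growth":     lambda S, kg=1e-7, g=1.5: kg * max(S, 0.0) ** g,
}

def evaluate(model_name, S, **params):
    """Evaluate a stored kinetic model at supersaturation S, optionally
    overriding its default parameters (e.g. after estimation)."""
    return KINETIC_MODELS[model_name](S, **params)
```

    Storing model forms by name with overridable parameters is what makes reuse and model discrimination convenient: alternative forms for the same phenomenon can be swapped in and compared against the same scenario.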

  15. Introducing the Leadership in Enabling Occupation (LEO) model.

    Science.gov (United States)

    Townsend, Elizabeth A; Polatajko, Helene J; Craik, Janet M; von Zweck, Claudia M

    2011-10-01

    Occupational therapy is a broad profession yet access to services remains restricted and uneven across Canada. Access to the potential breadth of occupational therapy is severely restrained by complex supply, retention, and funding challenges. To improve access to occupational therapy, widespread leadership is needed by all practitioners. This brief report introduces the Leadership in Enabling Occupation (LEO) Model, which displays the inter-relationship of four elements of everyday leadership as described in "Positioning Occupational Therapy for Leadership," Section IV, of Enabling Occupation II: Advancing a Vision of Health, Well-being and Justice through Occupation (Townsend & Polatajko, 2007). All occupational therapists have the power to develop leadership capacity within and beyond designated leadership positions. LEO is a leadership tool to extend all occupational therapists' strategic use of scholarship, new accountability approaches, existing and new funding, and workforce planning to improve access to occupational therapy.

  16. BIM-enabled Conceptual Modelling and Representation of Building Circulation

    Directory of Open Access Journals (Sweden)

    Jin Kook Lee

    2014-08-01

    Full Text Available This paper describes how a building information modelling (BIM)-based approach to building circulation enables us to change the process of building design in terms of its computational representation and processes, focusing on the conceptual modelling and representation of circulation within buildings. BIM has been designed for use by several BIM authoring tools, in particular with the widely known interoperable Industry Foundation Classes (IFCs), which follow an object-oriented data modelling methodology. Advances in BIM authoring tools, using space objects and their relations defined in the IFC schema, have made it possible to model, visualize and analyse circulation within buildings prior to their construction. Agent-based circulation has long been an interdisciplinary topic of research across several areas, including design computing, computer science, architectural morphology, human behaviour and environmental psychology. Such conventional approaches to building circulation are centred on navigational knowledge about built environments, and represent specific circulation paths and regulations. This paper, however, places emphasis on the use of ‘space objects’ in BIM-enabled design processes rather than on circulation agents, the latter of which are not defined in the IFC schemas. By introducing and reviewing some associated research and projects, this paper also surveys how such a circulation representation is applicable to the analysis of building circulation-related rules.
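
    The space-object emphasis lends itself to graph analysis: once space adjacencies are extracted from an IFC model, circulation between rooms becomes path finding. A minimal sketch, with a hand-built adjacency map standing in for parsed IFC space boundaries:

```python
from collections import deque

def circulation_path(adjacency, start, goal):
    """Breadth-first search over a space-object adjacency graph (space ids
    as nodes, door/opening relations as edges), the kind of connectivity
    derivable from IFC space boundaries. Returns a shortest room sequence
    from start to goal, or None if the spaces are not connected."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in adjacency.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None
```

    Circulation-rule checks (e.g., maximum travel distance to an exit) then reduce to queries over such paths.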

  17. Supply chain risk management enablers - A framework development through systematic review of the literature from 2000 to 2015

    Directory of Open Access Journals (Sweden)

    Kilubi, I.

    2015-08-01

    Full Text Available The present paper delivers a robust and systematic literature review (SLR) on supply chain risk management (SCRM) with the purpose to (a) review and analyse the literature concerning definitions and research methodologies applied, (b) develop a classificatory framework which clusters existing enablers of SCRM, and (c) examine the linkage between SCRM and performance. The findings reveal not only that SCRM is loosely defined, but also that there are various fragmented supply chain risk enablers and a strong need for a clear terminology for its building enablers. In addition, the review points to a lack of empirical confirmation concerning the connection between SCRM and performance. This paper contributes an overview of 80 peer-reviewed journal articles on SCRM from 2000 to the beginning of 2015. We offer an overarching definition of SCRM and synthesise and assemble the numerous enablers into preventive and responsive strategies by means of a conceptual framework. Moreover, indicating social network theory (SNT) as a potential theoretical foundation for SCRM, we further contribute to the supply chain management (SCM) literature by providing propositions that guide future research.

  18. A Volunteered Geographic Information Framework to Enable Bottom-Up Disaster Management Platforms

    Directory of Open Access Journals (Sweden)

    Mohammad Ebrahim Poorazizi

    2015-08-01

    Full Text Available Recent disasters, such as the 2010 Haiti earthquake, have drawn attention to the potential role of citizens as active information producers. By using location-aware devices such as smartphones to collect geographic information in the form of geo-tagged text, photos, or videos, and sharing this information through online social media, such as Twitter, citizens create Volunteered Geographic Information (VGI). To effectively use this information for disaster management, we developed a framework for the discovery of VGI. This framework consists of four components: (i) a VGI brokering module to provide a standard service interface to retrieve VGI from multiple resources based on spatial, temporal, and semantic parameters; (ii) a VGI quality control component, which employs semantic filtering and cross-referencing techniques to evaluate VGI; (iii) a VGI publisher module, which uses a service-based delivery mechanism to disseminate VGI; and (iv) a VGI discovery component to locate, browse, and query metadata about available VGI datasets. In a case study we employed a FOSS (Free and Open Source Software) strategy, open standards/specifications, and free/open data to show the utility of the framework. We demonstrate that the framework can facilitate data discovery for disaster management. The addition of quality metrics and a single aggregated source of relevant crisis VGI will allow users to make informed policy choices that could save lives, meet basic humanitarian needs earlier, and perhaps limit environmental and economic damage.
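
    The brokering/filtering idea behind components (i) and (ii) — spatial, temporal, and (here, naive keyword-based) semantic selection of reports — can be sketched as follows; the field names are hypothetical:

```python
def filter_vgi(reports, bbox, t_start, t_end, keywords):
    """Select volunteered reports by bounding box, time window, and a naive
    keyword match (a stand-in for real semantic filtering).

    reports: iterable of dicts with 'lat', 'lon', 'time' (epoch s), 'text'.
    bbox: (lat_min, lat_max, lon_min, lon_max).
    """
    lat_min, lat_max, lon_min, lon_max = bbox
    hits = []
    for r in reports:
        # spatial filter
        if not (lat_min <= r["lat"] <= lat_max and lon_min <= r["lon"] <= lon_max):
            continue
        # temporal filter
        if not (t_start <= r["time"] <= t_end):
            continue
        # naive semantic filter
        text = r["text"].lower()
        if any(k in text for k in keywords):
            hits.append(r)
    return hits
```

    A production broker would push these predicates down to each source's query interface and add cross-referencing between reports as a quality signal.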

  19. Advancing a framework to enable characterization and evaluation of data streams useful for biosurveillance.

    Directory of Open Access Journals (Sweden)

    Kristen J Margevicius

    Full Text Available In recent years, biosurveillance has become the buzzword under which a diverse set of ideas and activities regarding detecting and mitigating biological threats are incorporated depending on context and perspective. Increasingly, biosurveillance practice has become global and interdisciplinary, requiring information and resources across public health, One Health, and biothreat domains. Even within the scope of infectious disease surveillance, multiple systems, data sources, and tools are used with varying and often unknown effectiveness. Evaluating the impact and utility of state-of-the-art biosurveillance is, in part, confounded by the complexity of the systems and the information derived from them. We present a novel approach conceptualizing biosurveillance from the perspective of the fundamental data streams that have been or could be used for biosurveillance and to systematically structure a framework that can be universally applicable for use in evaluating and understanding a wide range of biosurveillance activities. Moreover, the Biosurveillance Data Stream Framework and associated definitions are proposed as a starting point to facilitate the development of a standardized lexicon for biosurveillance and characterization of currently used and newly emerging data streams. Criteria for building the data stream framework were developed from an examination of the literature, analysis of information on operational infectious disease biosurveillance systems, and consultation with experts in the area of biosurveillance. To demonstrate utility, the framework and definitions were used as the basis for a schema of a relational database for biosurveillance resources and in the development and use of a decision support tool for data stream evaluation.

  20. Advancing a framework to enable characterization and evaluation of data streams useful for biosurveillance.

    Science.gov (United States)

    Margevicius, Kristen J; Generous, Nicholas; Taylor-McCabe, Kirsten J; Brown, Mac; Daniel, W Brent; Castro, Lauren; Hengartner, Andrea; Deshpande, Alina

    2014-01-01

    In recent years, biosurveillance has become the buzzword under which a diverse set of ideas and activities regarding detecting and mitigating biological threats are incorporated depending on context and perspective. Increasingly, biosurveillance practice has become global and interdisciplinary, requiring information and resources across public health, One Health, and biothreat domains. Even within the scope of infectious disease surveillance, multiple systems, data sources, and tools are used with varying and often unknown effectiveness. Evaluating the impact and utility of state-of-the-art biosurveillance is, in part, confounded by the complexity of the systems and the information derived from them. We present a novel approach conceptualizing biosurveillance from the perspective of the fundamental data streams that have been or could be used for biosurveillance and to systematically structure a framework that can be universally applicable for use in evaluating and understanding a wide range of biosurveillance activities. Moreover, the Biosurveillance Data Stream Framework and associated definitions are proposed as a starting point to facilitate the development of a standardized lexicon for biosurveillance and characterization of currently used and newly emerging data streams. Criteria for building the data stream framework were developed from an examination of the literature, analysis of information on operational infectious disease biosurveillance systems, and consultation with experts in the area of biosurveillance. To demonstrate utility, the framework and definitions were used as the basis for a schema of a relational database for biosurveillance resources and in the development and use of a decision support tool for data stream evaluation.
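
    A decision-support tool for data stream evaluation of this kind typically reduces to scoring candidate streams against weighted criteria. A minimal sketch, with hypothetical criterion names and weights rather than the framework's actual definitions:

```python
def score_stream(ratings, weights):
    """Weighted-sum utility score for a candidate biosurveillance data
    stream. ratings: criterion -> value in [0, 1]; weights: criterion ->
    relative importance. Returns a normalized score in [0, 1]."""
    total_weight = sum(weights.values())
    return sum(weights[c] * ratings[c] for c in weights) / total_weight
```

    Ranking streams by such scores makes the trade-offs explicit (e.g., a timely but low-coverage stream versus a slow, comprehensive one), which is the point of a structured evaluation framework.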

  1. Cooperative cognitive radio networking system model, enabling techniques, and performance

    CERN Document Server

    Cao, Bin; Mark, Jon W

    2016-01-01

    This SpringerBrief examines active cooperation between users in Cooperative Cognitive Radio Networking (CCRN), exploring the system model, enabling techniques, and performance. The brief provides a systematic study of active cooperation between primary users and secondary users (i.e., CCRN), followed by discussions of research issues and challenges in designing spectrum-energy efficient CCRN. As an effort to shed light on the design of spectrum-energy efficient CCRN, the authors model the CCRN based on orthogonal modulation and an orthogonally dual-polarized antenna (ODPA). The resource allocation issues are detailed with respect to both models in terms of problem formulation, solution approach, and numerical results. Finally, the optimal communication strategies for both primary and secondary users to achieve spectrum-energy efficient CCRN are analyzed.

  2. Futures Business Models for an IoT Enabled Healthcare Sector: A Causal Layered Analysis Perspective

    Directory of Open Access Journals (Sweden)

    Julius Francis Gomes

    2016-12-01

    Full Text Available Purpose: To facilitate futures business research by proposing a novel way to combine business models as a conceptual tool with futures research techniques. Design: A futures perspective is adopted to foresight business models of the Internet of Things (IoT)-enabled healthcare sector by using business models as a futures business research tool. In doing so, business models are coupled with one of the most prominent foresight methodologies, Causal Layered Analysis (CLA). Qualitative analysis provides a deeper understanding of the phenomenon through the layers of CLA: litany, social causes, worldview and myth. Findings: It is difficult to predict the far future for a technology-oriented sector like healthcare. This paper presents three scenarios for the short-, medium- and long-term future. Based on these scenarios, we also present a set of business model elements for different future time frames. This paper shows a way to combine business models with CLA, a foresight methodology, in order to apply business models in futures business research. Besides offering early results for futures business research, this study proposes a conceptual space to work with individual business models for managerial stakeholders. Originality / Value: Much research on business models has offered conceptualization of the phenomenon, innovation through business models and transformation of business models. However, the existing literature does not offer much on using business models as a futures research tool. Enabled by futures thinking, we collected key business model elements and building blocks for the futures market and analyzed them through the CLA framework.

  3. MDM: A Mode Diagram Modeling Framework

    DEFF Research Database (Denmark)

    Wang, Zheng; Pu, Geguang; Li, Jianwen

    2012-01-01

    … Such systems are widely used in safety-critical embedded domains, yet there is a lack of domain-specific formal modelling languages for them in the relevant industry. To address this problem, we propose a formal visual modelling framework called mode diagram as a concise and precise way … A model checking technique can then be used to verify the mode diagram models against desired properties. To demonstrate the viability of our approach, we have applied our modelling framework to real-life case studies from industry, which helped detect two design defects in spacecraft control systems.

  4. Graphical Model Debugger Framework for Embedded Systems

    DEFF Research Database (Denmark)

    Zeng, Kebin

    2010-01-01

    Model Driven Software Development has offered a faster way to design and implement embedded real-time software by moving the design to a model level and by transforming models to code. However, the testing of embedded systems has remained at the code level. This paper presents a Graphical Model Debugger Framework, providing an auxiliary avenue of analysis of system models at runtime by executing generated code and updating models synchronously, which allows embedded developers to focus on the model level. With the model debugger, embedded developers can graphically test their design model and check the running status of the system, which offers a debugging capability at a higher level of abstraction. The framework intends to contribute a tool to the Eclipse community, especially suitable for model-driven development of embedded systems.

  5. Framework Architecture Enabling an Agent-Based Inter-Company Integration with XML

    Directory of Open Access Journals (Sweden)

    Klement Fellner

    2000-11-01

    Full Text Available More and more cooperating companies utilize the World Wide Web (WWW) to federate and further integrate their heterogeneous business application systems. At the same time, innovative business strategies, like virtual organizations, supply chain management or one-to-one marketing, as well as trendsetting competitive strategies, like mass customisation, become realisable. Both the necessary integration and the innovative concepts demand software supporting the automation of communication as well as coordination across system boundaries. In this paper, we describe a framework architecture for inter-company integration of business processes based on commonly accepted and (partially) standardized concepts and techniques. Furthermore, it is shown how the framework architecture helps to automate procurement processes and how cost-saving black-box re-use is achieved following a component-oriented implementation paradigm.

  6. Enabling Sustainability: Hierarchical Need-Based Framework for Promoting Sustainable Data Infrastructure in Developing Countries

    OpenAIRE

    David O. Yawson; Armah, Frederick A.; Alex N. M. Pappoe

    2009-01-01

    The paper presents thoughts on Sustainable Data Infrastructure (SDI) development and its user-requirements base. It brings Maslow's motivational theory to the fore and proposes it as a rationalization mechanism for entities (mostly governmental) that aim at realizing SDI. Maslow's theory, though well known, is somewhat new in geospatial circles; this is where the novelty of the paper resides. SDI has been shown to enable and aid development in diverse ways. However, stimulating developing …

  7. An RFID-enabled framework to support Ambient Home Care Services

    OpenAIRE

    Martín Rodríguez, Henar; Metola Moreno, Eduardo; Bergesio, Luca; Bernardos Barbolla, Ana M.; Iglesias Alvarez, Josué; Casar Corredera, Jose Ramon

    2010-01-01

    The growing number of elderly people in modern societies is encouraging advances in remote assistive solutions to enable sustainable and safe ‘ageing in place’. Among the many technologies which may serve to support Ambient Home Care Systems (AHCS), RFID offers a set of differential features which make it suitable for building new interaction schemes while supporting horizontal system features such as localization. This paper details the design of a passive RFID-based AHCS, composed of an infras…

  8. Space Partitioning for Privacy Enabled 3D City Models

    Science.gov (United States)

    Filippovska, Y.; Wichmann, A.; Kada, M.

    2016-10-01

    Due to recent technological progress, the capture and processing of highly detailed (3D) data have become extensive. Despite all the prospects of potential uses, data that includes personal living spaces and public buildings can also be considered a serious intrusion into people's privacy and a threat to security. This becomes especially critical if the data is visible to the general public. Thus, a compromise is needed between open access to data and privacy requirements, which can be very different for each application. As privacy is a complex and versatile topic, the focus of this work lies particularly on the visualization of 3D urban data sets. For the purpose of privacy-enabled visualizations of 3D city models, we propose to partition the (living) spaces into privacy regions, each featuring its own level of anonymity. Within each region, the depicted 2D and 3D geometry and imagery are anonymized with cartographic generalization techniques. The underlying spatial partitioning is realized as a 2D map generated as a straight skeleton of the open space between buildings. The resulting privacy cells are then merged according to the privacy requirements associated with each building to form larger regions, their borderlines are smoothed, and transition zones are established between privacy regions to achieve a harmonious visual appearance. It is demonstrated by example how the proposed method generates privacy-enabled 3D city models.
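
    The merge step — adjacent privacy cells sharing the same anonymity requirement are unioned into larger regions — amounts to finding connected components of equal-level cells. A minimal sketch with hypothetical cell ids, leaving out the geometric skeleton computation:

```python
def merge_privacy_cells(cell_neighbors, cell_level):
    """Group adjacent privacy cells of equal anonymity level into regions
    (connected components via depth-first traversal).

    cell_neighbors: cell id -> iterable of adjacent cell ids.
    cell_level: cell id -> anonymity level.
    Returns a list of regions, each a sorted list of cell ids.
    """
    seen, regions = set(), []
    for cell in cell_level:
        if cell in seen:
            continue
        stack, region = [cell], []
        seen.add(cell)
        while stack:
            c = stack.pop()
            region.append(c)
            for n in cell_neighbors.get(c, ()):
                # only grow the region across equal-level neighbors
                if n not in seen and cell_level[n] == cell_level[cell]:
                    seen.add(n)
                    stack.append(n)
        regions.append(sorted(region))
    return regions
```

    The resulting regions are then candidates for borderline smoothing and transition-zone generation as described above.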

  9. Perspectives on Modelling BIM-enabled Estimating Practices

    Directory of Open Access Journals (Sweden)

    Willy Sher

    2014-12-01

    Full Text Available BIM-enabled estimating processes do not replace or substitute for the traditional approaches used in the architecture, engineering and construction industries. This paper explores the impact of BIM on these traditional processes. It identifies differences between the approaches used with BIM and other conventional methods, and between the various construction professionals that prepare estimates. We interviewed 17 construction professionals from client organizations, contracting organizations, consulting practices and specialist-project firms. Our analyses highlight several logical relationships between estimating processes and BIM attributes. Estimators need to respond to the challenges BIM poses to traditional estimating practices. BIM-enabled estimating circumvents long-established conventions and traditional approaches, and focuses on data management. Consideration needs to be given to the model data required for estimating, to the means by which these data may be harnessed when exported, to the means by which the integrity of model data is protected, to the creation and management of tools that work effectively and efficiently in multi-disciplinary settings, and to approaches that narrow the gap between virtual reality and actual reality. Areas for future research are also identified in the paper.

  10. Graphical Model Debugger Framework for Embedded Systems

    DEFF Research Database (Denmark)

    Zeng, Kebin; Guo, Yu; Angelov, Christo K.

    2010-01-01

    Model Driven Software Development has offered a faster way to design and implement embedded real-time software by moving the design to a model level, and by transforming models to code. However, the testing of embedded systems has remained at the code level. This paper presents a Graphical Model Debugger Framework, providing an auxiliary avenue of analysis of system models at runtime by executing generated code and updating models synchronously, which allows embedded developers to focus on the model level. With the model debugger, embedded developers can graphically test their design model …

  11. Framework of Distributed Coupled Atmosphere-Ocean-Wave Modeling System

    Institute of Scientific and Technical Information of China (English)

    WEN Yuanqiao; HUANG Liwen; DENG Jian; ZHANG Jinfeng; WANG Sisi; WANG Lijun

    2006-01-01

    In order to research the interactions between the atmosphere and ocean, as well as their important role in the intense weather systems of coastal areas, and to improve the forecasting of hazardous coastal weather processes, a coupled atmosphere-ocean-wave modeling system has been developed. The agent-based environment framework for linking models allows flexible and dynamic information exchange between models. For flexibility, portability and scalability, the framework of the whole system takes a multi-layer architecture that includes a user interface layer, a computational layer and a service-enabling layer. The numerical experiment presented in this paper demonstrates the performance of the distributed coupled modeling system.
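
    The coupling idea — components exposing named fields that a coupler routes between them each interval — can be sketched minimally. The component names, field names, and toy relaxation dynamics below are hypothetical; the real system exchanges full 2D/3D fields through its agent-based framework:

```python
class ComponentModel:
    """Minimal stand-in for one coupled component (atmosphere, ocean, or
    wave model): it exports named state fields and consumes received ones."""

    def __init__(self, name, state):
        self.name, self.state, self.inbox = name, dict(state), {}

    def step(self):
        # toy dynamics: relax each state variable toward the received value
        for field, value in self.inbox.items():
            if field in self.state:
                self.state[field] += 0.5 * (value - self.state[field])

class Coupler:
    """Routes fields between components at each coupling interval."""

    def __init__(self, links):
        self.links = links  # (source_model, field, destination_model) triples

    def exchange(self, models):
        for src, field, dst in self.links:
            models[dst].inbox[field] = models[src].state[field]
```

    In the layered architecture described above, such a coupler would live in the computational layer, with the exchange schedule exposed through the service-enabling layer.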

  12. A Framework for Modelling Software Requirements

    Directory of Open Access Journals (Sweden)

    Dhirendra Pandey

    2011-05-01

    Requirement engineering plays an important role in producing quality software products. In recent years, several requirements frameworks have been designed to provide an end-to-end solution for the system development life cycle. Textual requirements specifications are difficult to learn, design, understand, review, and maintain, whereas pictorial modelling is widely recognized as an effective requirement analysis tool. In this paper, we present a requirement modelling framework together with an analysis of modern requirements modelling techniques. We also discuss various domains of requirement engineering with the help of modelling elements such as semantic maps of business concepts, lifecycles of business objects, business processes, business rules, system context diagrams, use cases and their scenarios, constraints, and user interface prototypes. The proposed framework is illustrated with the case study of an inventory management system.

  13. Submicrometric Magnetic Nanoporous Carbons Derived from Metal-Organic Frameworks Enabling Automated Electromagnet-Assisted Online Solid-Phase Extraction.

    Science.gov (United States)

    Frizzarin, Rejane M; Palomino Cabello, Carlos; Bauzà, Maria Del Mar; Portugal, Lindomar A; Maya, Fernando; Cerdà, Víctor; Estela, José M; Turnes Palomino, Gemma

    2016-07-19

    We present the first application of submicrometric magnetic nanoporous carbons (μMNPCs) as sorbents for automated solid-phase extraction (SPE). Small zeolitic imidazolate framework-67 crystals are obtained at room temperature and directly carbonized under an inert atmosphere to obtain submicrometric nanoporous carbons containing magnetic cobalt nanoparticles. The μMNPCs have a high contact area, high stability, and their preparation is simple and cost-effective. The prepared μMNPCs are exploited as sorbents in a microcolumn format in a sequential injection analysis (SIA) system with online spectrophotometric detection, which includes a specially designed three-dimensional (3D)-printed holder containing an automatically actuated electromagnet. The combined action of permanent magnets and an automatically actuated electromagnet enabled the movement of the solid bed of particles inside the microcolumn, preventing their aggregation, increasing the versatility of the system, and increasing the preconcentration efficiency. The method was optimized using a full factorial design and Doehlert Matrix. The developed system was applied to the determination of anionic surfactants, exploiting the retention of the ion-pairs formed with Methylene Blue on the μMNPC. Using sodium dodecyl sulfate as a model analyte, quantification was linear from 50 to 1000 μg L(-1), and the detection limit was equal to 17.5 μg L(-1), the coefficient of variation (n = 8; 100 μg L(-1)) was 2.7%, and the analysis throughput was 13 h(-1). The developed approach was applied to the determination of anionic surfactants in water samples (natural water, groundwater, and wastewater), yielding recoveries of 93% to 110% (95% confidence level).
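The quantification step above relies on a linear calibration curve; as a sketch of how such a line is fitted and then inverted to quantify a sample (the absorbance values below are hypothetical illustrations, not the paper's measurements):

```python
import numpy as np

# Hypothetical calibration data for SDS (concentrations in ug/L).
# The absorbances are synthetic and assume a perfectly linear response.
conc = np.array([50, 200, 400, 600, 800, 1000], dtype=float)
absorbance = 0.0008 * conc + 0.01

# Fit the calibration line: absorbance = slope * conc + intercept.
slope, intercept = np.polyfit(conc, absorbance, 1)

def quantify(a):
    """Invert the calibration line to estimate concentration (ug/L)."""
    return (a - intercept) / slope

# Quantify a sample with an (illustrative) measured absorbance of 0.33.
estimate = quantify(0.33)
```

In practice the detection limit and linear range reported in the abstract would bound where `quantify` may be trusted.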

  14. Enabling Cross-Discipline Collaboration Via a Functional Data Model

    Science.gov (United States)

    Lindholm, D. M.; Wilson, A.; Baltzer, T.

    2016-12-01

    Many research disciplines have very specialized data models that are used to express the detailed semantics that are meaningful to that community and easily utilized by their data analysis tools. While invaluable to members of that community, such expressive data structures and metadata are of little value to potential collaborators from other scientific disciplines. Many data interoperability efforts focus on the difficult task of computationally mapping concepts from one domain to another to facilitate discovery and use of data. Although these efforts are important and promising, we have found that a great deal of discovery and dataset understanding still happens at the level of less formal, personal communication. However, a significant barrier to inter-disciplinary data sharing that remains is one of data access. Scientists and data analysts continue to spend inordinate amounts of time simply trying to get data into their analysis tools. Providing data in a standard file format is often not sufficient since data can be structured in many ways. Adhering to more explicit community standards for data structure and metadata does little to help those in other communities. The Functional Data Model specializes the Relational Data Model (used by many database systems) by defining relations as functions between independent (domain) and dependent (codomain) variables. Given that arrays of data in many scientific data formats generally represent functionally related parameters (e.g. temperature as a function of space and time), the Functional Data Model is quite relevant for these datasets as well. The LaTiS software framework implements the Functional Data Model and provides a mechanism to expose an existing data source as a LaTiS dataset. LaTiS datasets can be manipulated using a Functional Algebra and output in any number of formats. LASP has successfully used the Functional Data Model and its implementation in the LaTiS software framework to bridge the gap between
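The core idea of treating a dataset as a function from independent (domain) to dependent (codomain) variables can be sketched as follows (a hypothetical illustration of the data model, not the actual LaTiS API):

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the Functional Data Model: a dataset is a
# function from domain samples (e.g. time) to codomain values
# (e.g. temperature), and operations act on that function.
@dataclass
class FunctionalDataset:
    domain: list                                  # independent variable names
    codomain: list                                # dependent variable names
    samples: dict = field(default_factory=dict)   # domain tuple -> value

    def __call__(self, *point):
        """Evaluate the dataset as a function at a domain point."""
        return self.samples[point]

    def map_codomain(self, f):
        """Functional-algebra style operation: transform dependent values."""
        out = FunctionalDataset(self.domain, self.codomain)
        out.samples = {k: f(v) for k, v in self.samples.items()}
        return out

# Temperature as a function of time, then converted from K to degrees C.
temp = FunctionalDataset(["time"], ["temperature_K"])
temp.samples = {(0,): 280.0, (1,): 281.5}
temp_c = temp.map_codomain(lambda kelvin: kelvin - 273.15)
```

Because every dataset exposes the same function-like interface, format-specific structure no longer leaks into the analysis code, which is the interoperability point the abstract makes.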

  15. A VGI data integration framework based on linked data model

    Science.gov (United States)

    Wan, Lin; Ren, Rongrong

    2015-12-01

    This paper aims at a geographic data integration and sharing method for multiple online VGI data sets. We propose a semantic-enabled framework for an online VGI cooperative application environment to solve a target class of geospatial problems. Based on linked data technologies, one of the core components of the semantic web, we can construct relationship links among geographic features distributed across diverse VGI platforms by using linked data modeling methods, deploy these semantic-enabled entities on the web, and eventually form an interconnected geographic data network that supports geospatial information cooperative application across multiple VGI data sources. The mapping and transformation from VGI sources to the RDF linked data model is presented to guarantee a unique data representation model among different online social geographic data sources. We propose a mixed strategy, combining spatial distance similarity and feature name attribute similarity as the measure, to compare and match different geographic features in various VGI data sets. Our work also focuses on how to apply Markov logic networks to achieve interlinks of the same linked data in different VGI-based linked data sets. In our method, the automatic generation of a co-reference object identification model from geographic linked data is discussed in more detail. The result is a huge geographic linked data network across loosely-coupled VGI web sites. The experiment built on our framework and the evaluation of our method show that the framework is reasonable and practicable.
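The mixed matching strategy, combining spatial distance similarity with name attribute similarity, might be sketched like this (the trigram string measure, the exponential distance decay, and the 50/50 weighting are illustrative assumptions, not the paper's exact choices):

```python
import math

# Hypothetical sketch of the mixed matching strategy: combine spatial
# distance similarity with name similarity to decide whether two VGI
# features refer to the same real-world object.
def name_similarity(a, b):
    """Jaccard similarity over character trigrams (an illustrative choice)."""
    grams = lambda s: {s[i:i + 3] for i in range(len(s) - 2)}
    ga, gb = grams(a.lower()), grams(b.lower())
    return len(ga & gb) / len(ga | gb) if ga | gb else 1.0

def spatial_similarity(p, q, scale_m=100.0):
    """Decay with Euclidean distance (coordinates assumed in metres)."""
    return math.exp(-math.dist(p, q) / scale_m)

def match_score(f1, f2, w_space=0.5):
    """Weighted blend of spatial and name similarity, in [0, 1]."""
    return (w_space * spatial_similarity(f1["xy"], f2["xy"])
            + (1 - w_space) * name_similarity(f1["name"], f2["name"]))

a = {"name": "Central Park", "xy": (0.0, 0.0)}
b = {"name": "central park", "xy": (10.0, 0.0)}
score = match_score(a, b)
```

A threshold on `score` would then feed candidate co-reference pairs into the Markov logic network stage described above.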

  16. Titan: An Enabling Framework for Activity-Aware "Pervasive Apps" in Opportunistic Personal Area Networks

    Directory of Open Access Journals (Sweden)

    Roggen Daniel

    2011-01-01

    Upcoming ambient intelligence environments will boast ever larger numbers of sensor nodes readily available on the body, in objects, and in the user's surroundings. We envision "Pervasive Apps", user-centric activity-aware pervasive computing applications. They use available sensors for activity recognition. They are downloadable from application repositories, much like current Apps for mobile phones. A key challenge is to provide Pervasive Apps in open-ended environments where resource availability cannot be predicted. We therefore introduce Titan, a service-oriented framework supporting the design, development, deployment, and execution of activity-aware Pervasive Apps. With Titan, mobile devices inquire of surrounding nodes about available services. Internet-based application repositories compose applications from available services as a service graph. The mobile device maps the service graph to Titan Nodes. The execution of the service graph is distributed and can be remapped at run time upon changing resource availability. The framework is geared to streaming data processing and machine learning, which is key for activity recognition. We demonstrate Titan in a pervasive gaming application involving smart dice and a sensorized wristband. We comparatively present the implementation cost and performance, and discuss how novel machine learning methodologies may enhance the flexibility of the mapping of service graphs to opportunistically available nodes.
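The mapping of a service graph onto opportunistically available nodes can be sketched as follows (a naive greedy assignment with hypothetical service and node names; Titan's actual mapper and remapping logic are more sophisticated):

```python
# Hypothetical sketch of service-graph mapping: assign each service in
# the graph to some surrounding node that advertises it, and remap
# (here: re-run the mapping) when node availability changes.
def map_service_graph(services, nodes):
    """services: list of service names; nodes: {node_id: set of services}."""
    mapping = {}
    for svc in services:
        candidates = [n for n, offered in nodes.items() if svc in offered]
        if not candidates:
            return None  # graph cannot run with current resources
        mapping[svc] = candidates[0]  # naive choice; a real mapper would optimize
    return mapping

nodes = {"wristband": {"accel", "gyro"}, "phone": {"classifier", "display"}}
plan = map_service_graph(["accel", "classifier"], nodes)

# If the wristband disappears, the remapping fails and the app must adapt.
plan_after_loss = map_service_graph(["accel", "classifier"],
                                    {"phone": {"classifier"}})
```

Returning `None` on failure mirrors the open-ended-environment point in the abstract: an application repository can only instantiate a Pervasive App when the surrounding nodes collectively cover its service graph.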

  17. Enabling Proactivity in Context-aware Middleware Systems by means of a Planning Framework based on HTN Planning

    Directory of Open Access Journals (Sweden)

    Preeti Bhargava

    2015-08-01

    Today's context-aware systems tend to be reactive, or 'pull'-based: the user requests or queries for some information and the system responds with the requested information. However, none of these systems anticipate the user's intent and behavior, or take into account his current events and activities to proactively 'push' relevant information to the user. Proactive context-aware systems, on the other hand, can predict and anticipate user intent and behavior, and act proactively on the user's behalf without explicit requests. Two fundamental capabilities of such systems are prediction and autonomy. In this paper, we address the second capability required by a context-aware system to act proactively, i.e., acting autonomously without an explicit user request. To address it, we present a new paradigm for enabling proactivity in context-aware middleware systems by means of a Planning Framework based on HTN planning. We present the design of the Planning Framework within the infrastructure of our intelligent context-aware middleware called Rover II. We implement this framework, evaluate its utility with several use cases, and highlight the benefits of using such a framework in dynamic ubiquitous systems.
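HTN planning, the basis of the framework above, decomposes compound tasks into primitive actions via methods; a minimal sketch (task and method names are hypothetical, not taken from Rover II):

```python
# Hypothetical sketch of HTN-style decomposition for proactive behaviour:
# compound tasks decompose into subtasks until only primitive actions remain.
methods = {
    "notify_user": [["detect_context", "compose_message", "push_message"]],
    "compose_message": [["select_template", "fill_template"]],
}
primitives = {"detect_context", "select_template", "fill_template",
              "push_message"}

def htn_plan(tasks):
    """Recursively decompose tasks into an ordered list of primitive actions."""
    plan = []
    for task in tasks:
        if task in primitives:
            plan.append(task)
        elif task in methods:
            plan.extend(htn_plan(methods[task][0]))  # first applicable method
        else:
            raise ValueError(f"no method for task {task!r}")
    return plan

plan = htn_plan(["notify_user"])
```

A full HTN planner would also check method preconditions against the current context before choosing a decomposition; the recursion structure is the part this sketch illustrates.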

  18. A legal framework to enable sharing of Clinical Decision Support knowledge and services across institutional boundaries.

    Science.gov (United States)

    Hongsermeier, Tonya; Maviglia, Saverio; Tsurikova, Lana; Bogaty, Dan; Rocha, Roberto A; Goldberg, Howard; Meltzer, Seth; Middleton, Blackford

    2011-01-01

    The goal of the CDS Consortium (CDSC) is to assess, define, demonstrate, and evaluate best practices for knowledge management and clinical decision support in healthcare information technology at scale - across multiple ambulatory care settings and Electronic Health Record technology platforms. In the course of the CDSC research effort, it became evident that a sound legal foundation was required for knowledge sharing and clinical decision support services in order to address data sharing, intellectual property, accountability, and liability concerns. This paper outlines the framework utilized for developing agreements in support of sharing, accessing, and publishing content via the CDSC Knowledge Management Portal as well as an agreement in support of deployment and consumption of CDSC developed web services in the context of a research project under IRB oversight.

  19. Enabling Advanced Modeling and Simulations for Fuel-Flexible Combustors

    Energy Technology Data Exchange (ETDEWEB)

    Heinz Pitsch

    2010-05-31

    The overall goal of the present project is to enable advanced modeling and simulations for the design and optimization of fuel-flexible turbine combustors. For this purpose we use a high-fidelity, extensively-tested large-eddy simulation (LES) code and state-of-the-art models for premixed/partially-premixed turbulent combustion developed in the PI's group. In the frame of the present project, these techniques are applied, assessed, and improved for hydrogen-enriched premixed and partially premixed gas-turbine combustion. Our innovative approaches include a completely consistent description of flame propagation, a coupled progress variable/level set method to resolve the detailed flame structure, and incorporation of thermal-diffusion (non-unity Lewis number) effects. In addition, we have developed a general flamelet-type transformation holding in the limits of both non-premixed and premixed burning. As a result, a model for partially premixed combustion has been derived. The coupled progress variable/level set method and the general flamelet transformation were validated by LES of a lean-premixed low-swirl burner that has been studied experimentally at Lawrence Berkeley National Laboratory. The model is extended to include the non-unity Lewis number effects, which play a critical role in fuel-flexible combustors with high hydrogen content fuel. More specifically, a two-scalar model for lean hydrogen and hydrogen-enriched combustion is developed and validated against experimental and direct numerical simulation (DNS) data. Results are presented to emphasize the importance of non-unity Lewis number effects in the lean-premixed low-swirl burner of interest in this project. The proposed model gives improved results, which shows that the inclusion of the non-unity Lewis number effects is essential for accurate prediction of the lean-premixed low-swirl flame.
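The coupled progress variable/level set approach can be summarized with the standard forms of its two governing equations (textbook formulations, not necessarily the exact equations used in this project): the G-equation propagates the flame front at the laminar burning speed $s_L$, while a reaction progress variable $C$ resolves the inner flame structure.

```latex
% Level-set (G-)equation for the flame front position
\frac{\partial G}{\partial t} + \mathbf{u} \cdot \nabla G
  = s_L \, \lvert \nabla G \rvert

% Transport of the progress variable C (0 = unburnt, 1 = burnt)
\frac{\partial (\rho C)}{\partial t} + \nabla \cdot (\rho \mathbf{u} C)
  = \nabla \cdot \left( \rho D \, \nabla C \right) + \dot{\omega}_C
```

Non-unity Lewis number effects enter through the diffusivity $D$, which for lean hydrogen flames differs markedly from the thermal diffusivity, which is why the abstract stresses their inclusion.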

  20. Enabling Advanced Modeling and Simulations for Fuel-Flexible Combustors

    Energy Technology Data Exchange (ETDEWEB)

    Pitsch, Heinz

    2010-05-31

    The overall goal of the present project is to enable advanced modeling and simulations for the design and optimization of fuel-flexible turbine combustors. For this purpose we use a high-fidelity, extensively-tested large-eddy simulation (LES) code and state-of-the-art models for premixed/partially-premixed turbulent combustion developed in the PI's group. In the frame of the present project, these techniques are applied, assessed, and improved for hydrogen-enriched premixed and partially premixed gas-turbine combustion. Our innovative approaches include a completely consistent description of flame propagation, a coupled progress variable/level set method to resolve the detailed flame structure, and incorporation of thermal-diffusion (non-unity Lewis number) effects. In addition, we have developed a general flamelet-type transformation holding in the limits of both non-premixed and premixed burning. As a result, a model for partially premixed combustion has been derived. The coupled progress variable/level set method and the general flamelet transformation were validated by LES of a lean-premixed low-swirl burner that has been studied experimentally at Lawrence Berkeley National Laboratory. The model is extended to include the non-unity Lewis number effects, which play a critical role in fuel-flexible combustors with high hydrogen content fuel. More specifically, a two-scalar model for lean hydrogen and hydrogen-enriched combustion is developed and validated against experimental and direct numerical simulation (DNS) data. Results are presented to emphasize the importance of non-unity Lewis number effects in the lean-premixed low-swirl burner of interest in this project. The proposed model gives improved results, which shows that the inclusion of the non-unity Lewis number effects is essential for accurate prediction of the lean-premixed low-swirl flame.

  1. A Model-Driven Framework to Develop Personalized Health Monitoring

    Directory of Open Access Journals (Sweden)

    Algimantas Venčkauskas

    2016-07-01

    Both distributed healthcare systems and the Internet of Things (IoT) are currently hot topics. The latter is a new computing paradigm to enable advanced capabilities in engineering various applications, including those for healthcare. For such systems, the core social requirement is the privacy/security of the patient information, along with technical requirements (e.g., energy consumption) and capabilities for adaptability and personalization. Typically, the functionality of the systems is predefined by the patient's data collected using sensor networks along with medical instrumentation; then, the data is transferred through the Internet for treatment and decision-making. Creating such systems is therefore challenging. In this paper, we propose a model-driven framework to develop an IoT-based prototype and its reference architecture for personalized health monitoring (PHM) applications. The framework contains a multi-layered structure with feature-based modeling and feature model transformations at the top and application software generation at the bottom. We have validated the framework using available tools and developed an experimental PHM to test some aspects of the functionality of the reference architecture in real time. The main contribution of the paper is the development of the model-driven computational framework with emphasis on the synergistic effect of security and energy issues.
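The top layer's feature-based modeling and transformation can be sketched as follows (the feature names and the toy selection-to-configuration transformation are hypothetical, not the framework's actual notation):

```python
# Hypothetical sketch of feature-based modelling: a feature model records
# mandatory and optional features, and a "transformation" turns a valid
# feature selection into lower-layer application configuration.
feature_model = {
    "PHM": {"mandatory": ["sensing", "transmission"],
            "optional": ["encryption"]},
}

def configure(selection):
    """Validate a feature selection, then transform it into config stubs."""
    required = set(feature_model["PHM"]["mandatory"])
    allowed = required | set(feature_model["PHM"]["optional"])
    if not required <= set(selection) or not set(selection) <= allowed:
        raise ValueError("invalid feature selection")
    # Toy transformation: map each selected feature to a generated stub.
    return {f: f"init_{f}()" for f in selection}

config = configure(["sensing", "transmission", "encryption"])
```

In this style, the security/energy trade-off the paper emphasizes becomes a choice made at the feature-selection level (e.g. including or omitting `encryption`) rather than in hand-written application code.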

  2. Integration of utilities infrastructures in a future internet enabled smart city framework.

    Science.gov (United States)

    Sánchez, Luis; Elicegui, Ignacio; Cuesta, Javier; Muñoz, Luis; Lanza, Jorge

    2013-10-25

    Improving efficiency of city services and facilitating a more sustainable development of cities are the main drivers of the smart city concept. Information and Communication Technologies (ICT) play a crucial role in making cities smarter, more accessible and more open. In this paper we present a novel architecture exploiting major concepts from the Future Internet (FI) paradigm addressing the challenges that need to be overcome when creating smarter cities. This architecture takes advantage of both the critical communications infrastructures already in place and owned by the utilities as well as of the infrastructure belonging to the city municipalities to accelerate efficient provision of existing and new city services. The paper highlights how FI technologies create the necessary glue and logic that allows the integration of current vertical and isolated city services into a holistic solution, which enables a huge forward leap for the efficiency and sustainability of our cities. Moreover, the paper describes a real-world prototype that instantiates the aforementioned architecture, deployed in one of the parks of the city of Santander providing an autonomous public street lighting adaptation service. This prototype is a showcase on how added-value services can be seamlessly created on top of the proposed architecture.

  3. Integration of Utilities Infrastructures in a Future Internet Enabled Smart City Framework

    Science.gov (United States)

    Sánchez, Luis; Elicegui, Ignacio; Cuesta, Javier; Muñoz, Luis; Lanza, Jorge

    2013-01-01

    Improving efficiency of city services and facilitating a more sustainable development of cities are the main drivers of the smart city concept. Information and Communication Technologies (ICT) play a crucial role in making cities smarter, more accessible and more open. In this paper we present a novel architecture exploiting major concepts from the Future Internet (FI) paradigm addressing the challenges that need to be overcome when creating smarter cities. This architecture takes advantage of both the critical communications infrastructures already in place and owned by the utilities as well as of the infrastructure belonging to the city municipalities to accelerate efficient provision of existing and new city services. The paper highlights how FI technologies create the necessary glue and logic that allows the integration of current vertical and isolated city services into a holistic solution, which enables a huge forward leap for the efficiency and sustainability of our cities. Moreover, the paper describes a real-world prototype that instantiates the aforementioned architecture, deployed in one of the parks of the city of Santander providing an autonomous public street lighting adaptation service. This prototype is a showcase on how added-value services can be seamlessly created on top of the proposed architecture. PMID:24233072

  4. Integration of Utilities Infrastructures in a Future Internet Enabled Smart City Framework

    Directory of Open Access Journals (Sweden)

    Luis Sánchez

    2013-10-01

    Improving efficiency of city services and facilitating a more sustainable development of cities are the main drivers of the smart city concept. Information and Communication Technologies (ICT) play a crucial role in making cities smarter, more accessible and more open. In this paper we present a novel architecture exploiting major concepts from the Future Internet (FI) paradigm addressing the challenges that need to be overcome when creating smarter cities. This architecture takes advantage of both the critical communications infrastructures already in place and owned by the utilities as well as of the infrastructure belonging to the city municipalities to accelerate efficient provision of existing and new city services. The paper highlights how FI technologies create the necessary glue and logic that allows the integration of current vertical and isolated city services into a holistic solution, which enables a huge forward leap for the efficiency and sustainability of our cities. Moreover, the paper describes a real-world prototype that instantiates the aforementioned architecture, deployed in one of the parks of the city of Santander providing an autonomous public street lighting adaptation service. This prototype is a showcase on how added-value services can be seamlessly created on top of the proposed architecture.

  5. A framework of benchmarking land models

    Science.gov (United States)

    Luo, Y. Q.; Randerson, J.; Abramowitz, G.; Bacour, C.; Blyth, E.; Carvalhais, N.; Ciais, P.; Dalmonech, D.; Fisher, J.; Fisher, R.; Friedlingstein, P.; Hibbard, K.; Hoffman, F.; Huntzinger, D.; Jones, C. D.; Koven, C.; Lawrence, D.; Li, D. J.; Mahecha, M.; Niu, S. L.; Norby, R.; Piao, S. L.; Qi, X.; Peylin, P.; Prentice, I. C.; Riley, W.; Reichstein, M.; Schwalm, C.; Wang, Y. P.; Xia, J. Y.; Zaehle, S.; Zhou, X. H.

    2012-02-01

    Land models, which have been developed by the modeling community in the past two decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure and evaluate performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land models. The framework includes (1) targeted aspects of model performance to be evaluated; (2) a set of benchmarks as defined references to test model performance; (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies; and (4) model improvement. Component 4 may or may not be involved in a benchmark analysis but is an ultimate goal of general modeling research. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and the land-surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics across timescales in response to both weather and climate change. Benchmarks that are used to evaluate models generally consist of direct observations, data-model products, and data-derived patterns and relationships. Metrics of measuring mismatches between models and benchmarks may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data-model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance for future improvement. Iterations between model evaluation and improvement via benchmarking shall demonstrate progress of land modeling and help establish confidence in land models for their predictions of future states of ecosystems and climate.
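The scoring idea in component (3), combining normalized data-model mismatches across processes, might look like this in outline (the RMSE-over-standard-deviation metric and equal default weights are illustrative assumptions, not the paper's prescription):

```python
import numpy as np

# Hypothetical sketch of a benchmark scoring system: normalize the
# model-benchmark mismatch per process, then combine the per-process
# scores into a single weighted number (lower = better).
def benchmark_score(model, benchmark, weights=None):
    """model/benchmark: {process_name: array of values on matching points}."""
    scores = {}
    for proc in benchmark:
        obs = np.asarray(benchmark[proc], dtype=float)
        sim = np.asarray(model[proc], dtype=float)
        rmse = np.sqrt(np.mean((sim - obs) ** 2))
        scores[proc] = rmse / (np.std(obs) + 1e-12)  # normalized mismatch
    w = weights or {p: 1.0 for p in scores}
    total = sum(w[p] * scores[p] for p in scores) / sum(w.values())
    return total, scores

obs = {"gpp": np.array([1.0, 2.0, 3.0])}
perfect, _ = benchmark_score(obs, obs)                     # zero mismatch
shifted, _ = benchmark_score({"gpp": obs["gpp"] + 1.0}, obs)
```

Per-process scores, kept alongside the combined total, serve the paper's goal of pointing at specific weak processes for future model improvement.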

  6. A framework of benchmarking land models

    Directory of Open Access Journals (Sweden)

    Y. Q. Luo

    2012-02-01

    Land models, which have been developed by the modeling community in the past two decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure and evaluate performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land models. The framework includes (1) targeted aspects of model performance to be evaluated; (2) a set of benchmarks as defined references to test model performance; (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies; and (4) model improvement. Component 4 may or may not be involved in a benchmark analysis but is an ultimate goal of general modeling research. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and the land-surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics across timescales in response to both weather and climate change. Benchmarks that are used to evaluate models generally consist of direct observations, data-model products, and data-derived patterns and relationships. Metrics of measuring mismatches between models and benchmarks may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data-model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance for future improvement. Iterations between model evaluation and improvement via benchmarking shall demonstrate progress of land modeling and help establish confidence in land models for their predictions of future states of ecosystems and climate.

  7. A Design for Computationally Enabled Analyses Supporting the Pre-Intervention Analytical Framework (PIAF)

    Science.gov (United States)

    2015-06-01

    feedback loops. It involves activities including information foraging, encoding, and reasoning. These two factors suggest important requirements...implications. First, sensemaking is an iterative process with numerous feedback loops. Pirolli and Card (2005) illustrate this well (see Figure 1). Figure 1...addressed when models are adequately transparent and when model outcomes are interpreted properly. In the case of the PIAF, whether they be argument

  8. Toward Holistic Scene Understanding: Feedback Enabled Cascaded Classification Models.

    Science.gov (United States)

    Li, Congcong; Kowdle, Adarsh; Saxena, Ashutosh; Chen, Tsuhan

    2012-07-01

    Scene understanding includes many related subtasks, such as scene categorization, depth estimation, object detection, etc. Each of these subtasks is often notoriously hard, and state-of-the-art classifiers already exist for many of them. These classifiers operate on the same raw image and provide correlated outputs. It is desirable to have an algorithm that can capture such correlation without requiring any changes to the inner workings of any classifier. We propose Feedback Enabled Cascaded Classification Models (FE-CCM), which jointly optimize all the subtasks while requiring only a "black box" interface to the original classifier for each subtask. We use a two-layer cascade of classifiers, which are repeated instantiations of the original ones, with the output of the first layer fed into the second layer as input. Our training method involves a feedback step that allows later classifiers to provide earlier classifiers information about which error modes to focus on. We show that our method significantly improves performance in all the subtasks in the domain of scene understanding, where we consider depth estimation, scene categorization, event categorization, object detection, geometric labeling, and saliency detection. Our method also improves performance in two robotic applications: an object-grasping robot and an object-finding robot.
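The two-layer cascade, with the first-layer outputs of all subtasks fed into the second layer, can be sketched as follows (a numpy-only illustration on synthetic data, using least-squares regressors as stand-ins for the black-box classifiers and omitting the feedback training step):

```python
import numpy as np

# Hypothetical sketch of a cascaded classification model: layer-2
# classifiers see the raw features plus the layer-1 outputs of *all*
# subtasks, letting correlated subtasks inform each other.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))                       # shared raw features
y_task1 = (X[:, 0] + 0.1 * rng.normal(size=200) > 0).astype(float)
y_task2 = (X[:, 0] + X[:, 1] > 0).astype(float)     # correlated subtask

def fit_linear(X, y):
    """Least-squares 'classifier' standing in for any black-box model."""
    Xb = np.c_[X, np.ones(len(X))]                  # append bias column
    return np.linalg.lstsq(Xb, y, rcond=None)[0]

def predict(w, X):
    return np.c_[X, np.ones(len(X))] @ w

# Layer 1: independent classifiers, one per subtask.
w1_t1, w1_t2 = fit_linear(X, y_task1), fit_linear(X, y_task2)
layer1_out = np.c_[predict(w1_t1, X), predict(w1_t2, X)]

# Layer 2: repeated instantiations, fed raw features + layer-1 outputs.
X2 = np.c_[X, layer1_out]
w2_t1 = fit_linear(X2, y_task1)
acc = np.mean((predict(w2_t1, X2) > 0.5) == y_task1)
```

The paper's feedback step would additionally reweight layer-1 training based on which errors matter downstream; the black-box point is that neither layer needs access to the classifiers' internals.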

  9. Authentication Model Based Bluetooth-enabled Mobile Phone

    Directory of Open Access Journals (Sweden)

    Rania Abdelhameed

    2005-01-01

    Authentication is a mechanism to establish proof of identity; the authentication process ensures that a user is who he claims to be. Current PC and laptop user-authentication systems either authenticate once and hold until authentication is explicitly revoked by the user, or ask the user to frequently re-establish his identity, which encourages him to disable authentication. Zero-Interaction Authentication (ZIA) provides a solution to this problem. In ZIA, a user wears a small authentication token that communicates with a laptop over a short-range wireless link, and authentication is combined with file encryption. Here we propose Laptop-user Authentication Based on Mobile phone (LABM). In our model of authentication, a user uses his Bluetooth-enabled mobile phone, which works as an authentication token that provides authentication for the laptop over a Bluetooth wireless link, following the concept of transient authentication but without combining it with an encrypted file system. The user authenticates to the mobile phone infrequently. In turn, the mobile phone continuously authenticates to the laptop by means of the short-range wireless link.
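The continuous token-to-laptop authentication could be realized with a periodic challenge-response over a shared secret; a minimal sketch (the paper does not specify this exact protocol, so the key provisioning and HMAC choice here are assumptions):

```python
import hashlib
import hmac
import os

# Hypothetical sketch of transient authentication: the laptop periodically
# challenges the phone over the wireless link; the phone proves it holds
# the shared secret, so authentication persists only while the token
# stays in range and keeps answering.
SECRET = os.urandom(32)  # provisioned once, shared by phone and laptop

def phone_respond(secret, challenge):
    """Token side: MAC the fresh challenge with the shared secret."""
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def laptop_verify(secret, challenge, response):
    """Laptop side: recompute the MAC and compare in constant time."""
    expected = hmac.new(secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)  # fresh nonce per round, prevents replay
ok = laptop_verify(SECRET, challenge, phone_respond(SECRET, challenge))
bad = laptop_verify(SECRET, challenge, phone_respond(os.urandom(32), challenge))
```

Run on a timer, a failed or missing response would lock the laptop, which is the "continuously authenticates" behaviour the abstract describes.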

  10. Object-Relational Mapping Framework to Enable Multi-Tenancy Attributes in SaaS Applications

    Directory of Open Access Journals (Sweden)

    Muhammad Naeem Khan

    2012-11-01

    During the last decade, there has been a major paradigm shift in the way software services are provided to the enterprise and corporate sector. Instead of using on-premises LOB (Line of Business) applications, corporations and enterprises are switching to off-premises hosted applications that are now being offered as a service by several software companies. This new concept of providing software as a service is generally known as SaaS (i.e., Software as a Service). However, the adoption of such a model necessitates that the applications to be provided as a service are generalized for users or groups of users. The users or user groups ordinarily correspond to a company or group of companies/businesses and are termed tenants. In this regard, the architecture of SaaS applications needs to be customized to support certain characteristics (e.g., configurability, maintainability and scalability) in order to serve a diverse number of users. This paper first analyzes new trends in the present-day business environment alongside the hardware and software industry that led to the development of the SaaS model, and then looks into the characteristics and features that a multi-tenant system needs to possess in order to put this concept into practice.
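A common way to realize multi-tenancy over a shared schema is a tenant_id column with a filter injected by the object-relational mapping layer; a minimal sketch (the pattern is generic and the names are hypothetical, not the paper's specific framework):

```python
# Hypothetical sketch of shared-schema multi-tenancy: every row carries a
# tenant_id, and the mapping layer injects a tenant filter into every
# query so tenants stay isolated without per-tenant application code.
ROWS = [
    {"tenant_id": "acme", "invoice": 1},
    {"tenant_id": "acme", "invoice": 2},
    {"tenant_id": "globex", "invoice": 7},
]

class TenantSession:
    """Stands in for an ORM session bound to one tenant."""

    def __init__(self, tenant_id):
        self.tenant_id = tenant_id

    def query(self, rows):
        # The tenant filter is applied implicitly, not by the caller.
        return [r for r in rows if r["tenant_id"] == self.tenant_id]

acme_invoices = TenantSession("acme").query(ROWS)
```

Because the filter lives in the mapping layer, configurability and maintainability (the characteristics named above) reduce to per-tenant metadata rather than forked code bases.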

  11. An evaluation framework for participatory modelling

    Science.gov (United States)

    Krueger, T.; Inman, A.; Chilvers, J.

    2012-04-01

    Strong arguments for participatory modelling in hydrology can be made on substantive, instrumental and normative grounds. These arguments have led to increasingly diverse groups of stakeholders (here anyone affecting or affected by an issue) getting involved in hydrological research and the management of water resources. In fact, participation has become a requirement of many research grants, programs, plans and policies. However, evidence of beneficial outcomes of participation as suggested by the arguments is difficult to generate and therefore rare. This is because outcomes are diverse, distributed, often tacit, and take time to emerge. In this paper we develop an evaluation framework for participatory modelling focussed on learning outcomes. Learning encompasses many of the potential benefits of participation, such as better models through diversity of knowledge and scrutiny, stakeholder empowerment, greater trust in models and ownership of subsequent decisions, individual moral development, reflexivity, relationships, social capital, institutional change, resilience and sustainability. Based on the theories of experiential, transformative and social learning, complemented by practitioner experience our framework examines if, when and how learning has occurred. Special emphasis is placed on the role of models as learning catalysts. We map the distribution of learning between stakeholders, scientists (as a subgroup of stakeholders) and models. And we analyse what type of learning has occurred: instrumental learning (broadly cognitive enhancement) and/or communicative learning (change in interpreting meanings, intentions and values associated with actions and activities; group dynamics). 
We demonstrate how our framework can be translated into a questionnaire-based survey conducted with stakeholders and scientists at key stages of the participatory process, and show preliminary insights from applying the framework within a rural pollution management situation in

  12. Talking Cure Models: A Framework of Analysis

    Science.gov (United States)

    Marx, Christopher; Benecke, Cord; Gumz, Antje

    2017-01-01

    Psychotherapy is commonly described as a “talking cure,” a treatment method that operates through linguistic action and interaction. The operative specifics of therapeutic language use, however, are insufficiently understood, mainly due to a multitude of disparate approaches that advance different notions of what “talking” means and what “cure” implies in the respective context. Accordingly, a clarification of the basic theoretical structure of “talking cure models,” i.e., models that describe therapeutic processes with a focus on language use, is a desideratum of language-oriented psychotherapy research. Against this background the present paper suggests a theoretical framework of analysis which distinguishes four basic components of “talking cure models”: (1) a foundational theory (which suggests how linguistic activity can affect and transform human experience), (2) an experiential problem state (which defines the problem or pathology of the patient), (3) a curative linguistic activity (which defines linguistic activities that are supposed to effectuate a curative transformation of the experiential problem state), and (4) a change mechanism (which defines the processes and effects involved in such transformations). The purpose of the framework is to establish a terminological foundation that allows for systematically reconstructing basic properties and operative mechanisms of “talking cure models.” To demonstrate the applicability and utility of the framework, five distinct “talking cure models” which spell out the details of curative “talking” processes in terms of (1) catharsis, (2) symbolization, (3) narrative, (4) metaphor, and (5) neurocognitive inhibition are introduced and discussed in terms of the framework components. In summary, we hope that our framework will prove useful for the objective of clarifying the theoretical underpinnings of language-oriented psychotherapy research and help to establish a more comprehensive

  13. Computer-aided modeling framework – a generic modeling template

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    This work focuses on the development of a computer-aided modeling framework. The framework is a knowledge-based system that is built on a generic modeling language and structured on workflows for different modeling tasks. The overall objective is to support model developers and users to generate...... and test models systematically, efficiently and reliably. In this way, development of products and processes can be made faster, cheaper and more efficient. In this contribution, as part of the framework, a generic modeling template for the systematic derivation of problem specific models is presented....... The application of the modeling template is highlighted with a case study related to the modeling of a catalytic membrane reactor coupling dehydrogenation of ethylbenzene with hydrogenation of nitrobenzene...

  14. `Dhara': An Open Framework for Critical Zone Modeling

    Science.gov (United States)

    Le, P. V.; Kumar, P.

    2016-12-01

    Processes in the Critical Zone, which sustain terrestrial life, are tightly coupled across hydrological, physical, biological, chemical, pedological, geomorphological and ecological domains over both short and long timescales. Observations and quantification of the Earth's surface across these domains using emerging high resolution measurement technologies such as light detection and ranging (lidar) and hyperspectral remote sensing are enabling us to characterize fine scale landscape attributes over large spatial areas. This presents a unique opportunity to develop novel approaches to model the Critical Zone that can capture fine scale intricate dependencies across the different processes in 3D. The development of interdisciplinary tools that transcend individual disciplines and capture new levels of complexity and emergent properties is at the core of Critical Zone science. Here we introduce `Dhara', an open high-performance computing framework for modeling complex processes in the Critical Zone. The framework is designed to be modular in structure with the aim of creating uniform and efficient tools to facilitate and leverage process modeling. It also provides the flexibility for the scientific community to maintain, collaborate on, and co-develop additional components. We show the essential framework that simulates ecohydrologic dynamics and surface-subsurface coupling in 3D using hybrid parallel CPU-GPU computing. We demonstrate that the open framework in Dhara is feasible for detailed, multi-process, large-scale modeling of the Critical Zone, which opens up exciting possibilities. We will also present outcomes from a Modeling Summer Institute led by the Intensively Managed Landscapes Critical Zone Observatory (IMLCZO) with representation from several CZOs and international representatives.

  15. An entropic framework for modeling economies

    Science.gov (United States)

    Caticha, Ariel; Golan, Amos

    2014-08-01

    We develop an information-theoretic framework for economic modeling. This framework is based on principles of entropic inference that are designed for reasoning on the basis of incomplete information. We take the point of view of an external observer who has access to limited information about broad macroscopic economic features. We view this framework as complementary to more traditional methods. The economy is modeled as a collection of agents about whom we make no assumptions of rationality (in the sense of maximizing utility or profit). States of statistical equilibrium are introduced as those macrostates that maximize entropy subject to the relevant information codified into constraints. The basic assumption is that this information refers to supply and demand and is expressed in the form of the expected values of certain quantities (such as inputs, resources, goods, production functions, utility functions and budgets). The notion of economic entropy is introduced. It provides a measure of the uniformity of the distribution of goods and resources. It captures both the welfare state of the economy as well as the characteristics of the market (say, monopolistic, concentrated or competitive). Prices, which turn out to be the Lagrange multipliers, are endogenously generated by the economy. Further studies include the equilibrium between two economies and the conditions for stability. As an example, the case of the nonlinear economy that arises from linear production and utility functions is treated in some detail.
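    The central computation of the framework – maximizing entropy subject to expected-value constraints, with the Lagrange multiplier playing the role of a price – can be sketched numerically. A minimal pure-Python illustration with a single constraint standing in for the paper's full set (the feature values and target mean are invented for the example):

```python
import math

def maxent_distribution(f, target_mean, lo=-50.0, hi=50.0, tol=1e-12):
    """Maximum-entropy distribution over states with feature values f,
    subject to the constraint E[f] = target_mean.
    Returns (probabilities, Lagrange multiplier)."""
    def mean_for(lam):
        # Gibbs form: p_i proportional to exp(-lam * f_i)
        w = [math.exp(-lam * fi) for fi in f]
        z = sum(w)
        return sum(fi * wi for fi, wi in zip(f, w)) / z

    # E[f] is monotone decreasing in lam, so bisection finds the multiplier.
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * fi) for fi in f]
    z = sum(w)
    return [wi / z for wi in w], lam

# Four states holding 0..3 units of a good, constrained to an average of 1.
p, lam = maxent_distribution([0.0, 1.0, 2.0, 3.0], target_mean=1.0)
```

In this toy setting the multiplier `lam` is the "price" conjugate to the constrained quantity, exactly as described in the abstract.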

  16. MDM: A Mode Diagram Modeling Framework

    Directory of Open Access Journals (Sweden)

    Zheng Wang

    2012-12-01

    Full Text Available Periodic control systems used in spacecrafts and automotives are usually period-driven and can be decomposed into different modes, with each mode representing a system state observed from outside. Such systems may also involve intensive computing in their modes. Despite the fact that such control systems are widely used in the above-mentioned safety-critical embedded domains, there is a lack of domain-specific formal modelling languages for such systems in the relevant industry. To address this problem, we propose a formal visual modeling framework called mode diagram as a concise and precise way to specify and analyze such systems. To capture the temporal properties of periodic control systems, we provide, along with mode diagram, a property specification language based on interval logic for the description of concrete temporal requirements the engineers are concerned with. The statistical model checking technique can then be used to verify the mode diagram models against desired properties. To demonstrate the viability of our approach, we have applied our modelling framework to real-life case studies from industry and helped detect two design defects in some spacecraft control systems.

  17. Semantic techniques for enabling knowledge reuse in conceptual modelling

    NARCIS (Netherlands)

    Gracia, J.; Liem, J.; Lozano, E.; Corcho, O.; Trna, M.; Gómez-Pérez, A.; Bredeweg, B.

    2010-01-01

    Conceptual modelling tools allow users to construct formal representations of their conceptualisations. These models are typically developed in isolation, unrelated to other user models, thus losing the opportunity of incorporating knowledge from other existing models or ontologies that might enrich

  18. A Framework of Memory Consistency Models

    Institute of Scientific and Technical Information of China (English)

    胡伟武; 施巍松; 等

    1998-01-01

    Previous descriptions of memory consistency models in shared-memory multiprocessor systems are mainly expressed as constraints on the memory access event ordering and hence are hardware-centric. This paper presents a framework of memory consistency models which describes the memory consistency model on the behavior level. Based on the understanding that the behavior of an execution is determined by the execution order of conflicting accesses, a memory consistency model is defined as an interprocessor synchronization mechanism which orders the execution of operations from different processors. The synchronization order of an execution under a certain consistency model is also defined. The synchronization order, together with the program order, determines the behavior of an execution. This paper also presents criteria for correct programs and correct implementations of consistency models. Regarding an implementation of a consistency model as a certain set of memory event ordering constraints, this paper provides a method to prove the correctness of consistency model implementations, and the correctness of the lock-based cache coherence protocol is proved with this method.
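    The minimal ordering requirement that any such framework builds on – a global execution must at least respect each processor's program order – can be sketched as a small checker. The event encoding below is an assumption for illustration, not the paper's notation:

```python
def respects_program_order(execution, programs):
    """Check that a global execution (a list of (processor, operation)
    events) preserves each processor's program order -- the baseline
    requirement shared by the consistency models in such a framework."""
    position = {event: i for i, event in enumerate(execution)}
    for proc, ops in programs.items():
        indices = [position[(proc, op)] for op in ops]
        if indices != sorted(indices):
            return False
    return True

# Two processors, each writing one location then reading the other's.
programs = {"P1": ["w_x", "r_y"], "P2": ["w_y", "r_x"]}
ok = respects_program_order(
    [("P1", "w_x"), ("P2", "w_y"), ("P1", "r_y"), ("P2", "r_x")], programs)
bad = respects_program_order(
    [("P1", "r_y"), ("P1", "w_x"), ("P2", "w_y"), ("P2", "r_x")], programs)
```

A real consistency-model checker would add the model-specific synchronization order on conflicting accesses on top of this program-order check.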

  19. A system-level multiprocessor system-on-chip modeling framework

    DEFF Research Database (Denmark)

    Virk, Kashif Munir; Madsen, Jan

    2004-01-01

    We present a system-level modeling framework to model system-on-chips (SoC) consisting of heterogeneous multiprocessors and network-on-chip communication structures in order to enable the developers of today's SoC designs to take advantage of the flexibility and scalability of network-on-chip...

  20. The Marine Virtual Laboratory: enabling efficient ocean model configuration

    Directory of Open Access Journals (Sweden)

    P. R. Oke

    2015-11-01

    Full Text Available The technical steps involved in configuring a regional ocean model are analogous for all community models. All require the generation of a model grid, and the preparation and interpolation of topography, initial conditions, and forcing fields. Each task in configuring a regional ocean model is straightforward – but the process of downloading and reformatting data can be time-consuming. For an experienced modeller, the configuration of a new model domain can take as little as a few hours – but for an inexperienced modeller, it can take much longer. In pursuit of technical efficiency, the Australian ocean modelling community has developed the Web-based MARine Virtual Laboratory (WebMARVL. WebMARVL allows a user to quickly and easily configure an ocean general circulation or wave model through a simple interface, reducing the time to configure a regional model to a few minutes. Through WebMARVL, a user is prompted to define the basic options needed for a model configuration, including the model, run duration, spatial extent, and input data. Once all aspects of the configuration are selected, a series of data extraction, reprocessing, and repackaging services are run, and a "take-away bundle" is prepared for download. Building on the capabilities developed under Australia's Integrated Marine Observing System, WebMARVL also extracts all of the available observations for the chosen time-space domain. The user is able to download the take-away bundle and use it to run the model of their choice. Models supported by WebMARVL include three community ocean general circulation models and two community wave models. The model configuration from the take-away bundle is intended to be a starting point for scientific research. The user may subsequently refine the details of the model set-up to improve the model performance for the given application. In this study, WebMARVL is described along with a series of results from test cases comparing Web

  1. Applying Reflective Middleware Techniques to Optimize a QoS-enabled CORBA Component Model Implementation

    Science.gov (United States)

    Wang, Nanbor; Parameswaran, Kirthika; Kircher, Michael; Schmidt, Douglas

    2003-01-01

    Although existing CORBA specifications, such as Real-time CORBA and CORBA Messaging, address many end-to-end quality-of-service (QoS) properties, they do not define strategies for configuring these properties into applications flexibly, transparently, and adaptively. Therefore, application developers must make these configuration decisions manually and explicitly, which is tedious, error-prone, and often sub-optimal. Although the recently adopted CORBA Component Model (CCM) does define a standard configuration framework for packaging and deploying software components, conventional CCM implementations focus on functionality rather than adaptive quality-of-service, which makes them unsuitable for next-generation applications with demanding QoS requirements. This paper presents three contributions to the study of middleware for QoS-enabled component-based applications. It outlines reflective middleware techniques designed to adaptively (1) select optimal communication mechanisms, (2) manage QoS properties of CORBA components in their containers, and (3) (re)configure selected component executors dynamically. Based on our ongoing research on CORBA and the CCM, we believe the application of reflective techniques to component middleware will provide a dynamically adaptive and (re)configurable framework for COTS software that is well-suited for the QoS demands of next-generation applications.

  2. A human brainstem glioma xenograft model enabled for bioluminescence imaging

    OpenAIRE

    Hashizume, Rintaro; Ozawa, Tomoko; Dinca, Eduard B.; Banerjee, Anuradha; Prados, Michael D.; James, Charles D.; Gupta, Nalin

    2009-01-01

    Despite the use of radiation and chemotherapy, the prognosis for children with diffuse brainstem gliomas is extremely poor. There is a need for relevant brainstem tumor models that can be used to test new therapeutic agents and delivery systems in pre-clinical studies. We report the development of a brainstem-tumor model in rats and the application of bioluminescence imaging (BLI) for monitoring tumor growth and response to therapy as part of this model. Luciferase-modified human glioblastoma...

  3. A Procurement Performance Model for Construction Frameworks

    Directory of Open Access Journals (Sweden)

    Terence Y M Lam

    2015-07-01

    Full Text Available Collaborative construction frameworks have been developed in the United Kingdom (UK) to create longer term relationships between clients and suppliers in order to improve project outcomes. Research undertaken into highways maintenance set within a major county council has confirmed that such collaborative procurement methods can improve the time, cost and quality of construction projects. Building upon this and examining the same single case, this research aims to develop a performance model through identification of performance drivers in the whole project delivery process, including pre- and post-contract phases. An a priori performance model based on operational and sociological constructs was proposed and then checked by a pilot study. Factor analysis and central tendency statistics from the questionnaires, as well as content analysis from the interview transcripts, were conducted. It was confirmed that long term relationships, financial and non-financial incentives and stronger communication are the sociological behaviour factors driving performance. The interviews also established that key performance indicators (KPIs can be used as an operational measure to improve performance. With the a posteriori performance model, client project managers can effectively and collaboratively manage contractor performance through procurement measures, including the use of longer-term relationships and KPIs for the contract, so that the expected project outcomes can be achieved. The findings also make a significant contribution to construction framework procurement theory by identifying the interrelated sociological and operational performance drivers. This study is set predominantly in the field of highways civil engineering. It is suggested that building based projects or other projects that share characteristics are grouped together and used for further research of the phenomena discovered.

  4. Conceptual Frameworks in the Doctoral Research Process: A Pedagogical Model

    Science.gov (United States)

    Berman, Jeanette; Smyth, Robyn

    2015-01-01

    This paper contributes to consideration of the role of conceptual frameworks in the doctoral research process. Through reflection on the two authors' own conceptual frameworks for their doctoral studies, a pedagogical model has been developed. The model posits the development of a conceptual framework as a core element of the doctoral…

  5. An Agent Memory Model Enabling Rational and Biased Reasoning

    NARCIS (Netherlands)

    Heuvelink, A.; Klein, M.C.A.; Treur, J.

    2008-01-01

    This paper presents an architecture for a memory model that facilitates versatile reasoning mechanisms over the beliefs stored in an agent's belief base. Based on an approach for belief aggregation, a model is introduced for controlling both the formation of abstract and complex beliefs and the

  6. Medicare Care Choices Model Enables Concurrent Palliative and Curative Care.

    Science.gov (United States)

    2015-01-01

    On July 20, 2015, the federal Centers for Medicare & Medicaid Services (CMS) announced the hospices that had been selected to participate in the Medicare Care Choices Model. Fewer than half of Medicare beneficiaries use the hospice care for which they are eligible. Current Medicare regulations preclude concurrent palliative and curative care. Under the Medicare Care Choices Model, dually eligible Medicare beneficiaries may elect to receive supportive care services typically provided by hospice while continuing to receive curative services. This report describes how CMS has expanded the model from an originally anticipated 30 Medicare-certified hospices to over 140 and extended the duration of the model from 3 to 5 years. The Medicare-certified hospice programs that will participate in the model are listed.

  7. A Conversation Model Enabling Intelligent Agents to Give Emotional Support

    OpenAIRE

    Van der Zwaan, J.M.; Dignum, V; Jonker, C.M.

    2012-01-01

    In everyday life, people frequently talk to others to help them deal with negative emotions. To some extent, everybody is capable of comforting other people, but so far conversational agents are unable to deal with this type of situation. To provide intelligent agents with the capability to give emotional support, we propose a domain-independent conversational model that is based on topics suggested by cognitive appraisal theories of emotion and the 5-phase model that is used to structure onl...

  8. Enabling linear model for the IMGC-02 absolute gravimeter

    CERN Document Server

    Nagornyi, V D; Svitlov, S

    2013-01-01

    The measurement procedures of most rise-and-fall absolute gravimeters have to resolve a singularity at the apex of the trajectory caused by the discrete fringe counting in Michelson-type interferometers. Traditionally the singularity is addressed by implementing non-linear models of the trajectory, but these introduce problems of their own, such as bias, non-uniqueness, and instability of the gravity estimates. Using the IMGC-02 gravimeter as an example, we show that the measurement procedure of rise-and-fall gravimeters can be based on linear models which successfully resolve the singularity and provide rigorous estimates of the gravity value. The linear models also facilitate further enhancements of the instrument, such as accounting for new types of disturbances and active compensation for vibrations.
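    The advantage of a linear-in-parameters trajectory model can be illustrated with an ordinary least-squares parabola fit, z(t) = z0 + v0·t + ½g·t², which has no singularity at the apex. This is a simplified sketch; the actual IMGC-02 procedure handles fringe timing and disturbances not modeled here:

```python
def fit_parabola(times, positions):
    """Least-squares fit of z(t) = z0 + v0*t + 0.5*g*t**2.
    The model is linear in the parameters (z0, v0, g), so a
    rise-and-fall apex poses no special difficulty."""
    # Design matrix columns: 1, t, t^2/2
    A = [[1.0, t, 0.5 * t * t] for t in times]
    # Normal equations: (A^T A) p = A^T z
    M = [[sum(A[k][i] * A[k][j] for k in range(len(A))) for j in range(3)]
         for i in range(3)]
    b = [sum(A[k][i] * positions[k] for k in range(len(A))) for i in range(3)]
    # Gaussian elimination with partial pivoting on the 3x3 system
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 3):
                M[r][c] -= f * M[col][c]
            b[r] -= f * b[col]
    p = [0.0, 0.0, 0.0]
    for r in (2, 1, 0):
        p[r] = (b[r] - sum(M[r][c] * p[c] for c in range(r + 1, 3))) / M[r][r]
    return p  # (z0, v0, g)

# Synthetic rise-and-fall trajectory: thrown up at 4.905 m/s, g = -9.81 m/s^2
ts = [i * 0.01 for i in range(101)]
zs = [4.905 * t + 0.5 * (-9.81) * t * t for t in ts]
z0, v0, g = fit_parabola(ts, zs)
```

The fitted `g` recovers the value used to generate the synthetic trajectory, apex included.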

  9. Computer Modeling of Carbon Metabolism Enables Biofuel Engineering (Fact Sheet)

    Energy Technology Data Exchange (ETDEWEB)

    2011-09-01

    In an effort to reduce the cost of biofuels, the National Renewable Energy Laboratory (NREL) has merged biochemistry with modern computing and mathematics. The result is a model of carbon metabolism that will help researchers understand and engineer the process of photosynthesis for optimal biofuel production.

  10. The Conceptual Integration Modeling Framework: Abstracting from the Multidimensional Model

    CERN Document Server

    Rizzolo, Flavio; Pottinger, Rachel; Wong, Kwok

    2010-01-01

    Data warehouses are overwhelmingly built through a bottom-up process, which starts with the identification of sources, continues with the extraction and transformation of data from these sources, and then loads the data into a set of data marts according to desired multidimensional relational schemas. End user business intelligence tools are added on top of the materialized multidimensional schemas to drive decision making in an organization. Unfortunately, this bottom-up approach is costly both in terms of the skilled users needed and the sheer size of the warehouses. This paper proposes a top-down framework in which data warehousing is driven by a conceptual model. The framework offers both design time and run time environments. At design time, a business user first uses the conceptual modeling language as a multidimensional object model to specify what business information is needed; then she maps the conceptual model to a pre-existing logical multidimensional representation. At run time, a system will tra...

  11. The framework for simulation of dynamics of mechanical aggregates

    OpenAIRE

    Ivankov, Petr R.; Ivankov, Nikolay P.

    2007-01-01

    A framework for simulation of the dynamics of mechanical aggregates has been developed. This framework enables us to build a model of an aggregate from models of its parts. The framework is part of a universal framework for science and engineering.

  12. Model based risk assessment - the CORAS framework

    Energy Technology Data Exchange (ETDEWEB)

    Gran, Bjoern Axel; Fredriksen, Rune; Thunem, Atoosa P-J.

    2004-04-15

    Traditional risk analysis and assessment is based on failure-oriented models of the system. In contrast to this, model-based risk assessment (MBRA) utilizes success-oriented models describing all intended system aspects, including functional, operational and organizational aspects of the target. The target models are then used as input sources for complementary risk analysis and assessment techniques, as well as a basis for the documentation of the assessment results. The EU-funded CORAS project developed a tool-supported methodology for the application of MBRA in security-critical systems. The methodology has been tested with successful outcome through a series of seven trials within the telemedicine and e-commerce areas. The CORAS project in general and the CORAS application of MBRA in particular have contributed positively to the visibility of model-based risk assessment and thus to the disclosure of several potentials for further exploitation of various aspects within this important research field. In that connection, the CORAS methodology's possibilities for further improvement towards utilization in more complex architectures and also in other application domains such as the nuclear field can be addressed. The latter calls for adapting the framework to address nuclear standards such as IEC 60880 and IEC 61513. For this development we recommend applying a trial-driven approach within the nuclear field. The tool-supported approach for combining risk analysis and system development also fits well with the HRP proposal for developing an Integrated Design Environment (IDE) providing efficient methods and tools to support control room systems design. (Author)

  13. Domain-specific modeling enabling full code generation

    CERN Document Server

    Kelly, Steven

    2007-01-01

    Domain-Specific Modeling (DSM) is the latest approach to software development, promising to greatly increase the speed and ease of software creation. Early adopters of DSM have been enjoying productivity increases of 500–1000% in production for over a decade. This book introduces DSM and offers examples from various fields to illustrate to experienced developers how DSM can improve software development in their teams. Two authorities in the field explain what DSM is, why it works, and how to successfully create and use a DSM solution to improve productivity and quality. Divided into four parts, the book covers: background and motivation; fundamentals; in-depth examples; and creating DSM solutions. There is an emphasis throughout the book on practical guidelines for implementing DSM, including how to identify the necessary language constructs, how to generate full code from models, and how to provide tool support for a new DSM language. The example cases described in the book are available on the book's Website, www.dsmbook....

  14. ARCHITECTURES AND ALGORITHMS FOR COGNITIVE NETWORKS ENABLED BY QUALITATIVE MODELS

    DEFF Research Database (Denmark)

    Balamuralidhar, P.

    2013-01-01

    the qualitative models in a cognitive engine. Further I use the methodology in multiple functional scenarios of cognitive networks including self- optimization and self- monitoring. In the case of self-optimization, I integrate principles from monotonicity analysis to evaluate and enhance qualitative models......Complexity of communication networks is ever increasing and getting complicated by their heterogeneity and dynamism. Traditional techniques are facing challenges in network performance management. Cognitive networking is an emerging paradigm to make networks more intelligent, thereby overcoming...... traditional limitations and potentially achieving better performance. The vision is that, networks should be able to monitor themselves, reason upon changes in self and environment, act towards the achievement of specific goals and learn from experience. The concept of a Cognitive Engine (CE) supporting...

  15. A Smallholder Socio-hydrological Modelling Framework

    Science.gov (United States)

    Pande, S.; Savenije, H.; Rathore, P.

    2014-12-01

    Smallholders are farmers who own less than 2 ha of farmland. They often have low productivity and thus remain at subsistence level. The fact that nearly 80% of Indian farmers are smallholders who own merely a third of total farmland and belong to the poorest quartile, yet produce nearly 40% of the country's foodgrains, underlines the importance of understanding the socio-hydrology of a smallholder. We present a framework to understand the socio-hydrological system dynamics of a smallholder. It couples the dynamics of the 6 main variables that are most relevant at the scale of a smallholder: local storage (soil moisture and other water storage), capital, knowledge, livestock production, soil fertility and grass biomass production. The model incorporates rule-based adaptation mechanisms (for example: adjusting expenditures on food and fertilizers, selling livestock, etc.) of smallholders when they face adverse socio-hydrological conditions, such as low annual rainfall, high intra-annual variability in rainfall or variability in agricultural prices. It allows us to study the sustainability of smallholder farming systems under various settings. We apply the framework to understand the socio-hydrology of smallholders in Aurangabad, Maharashtra, India. This district has witnessed the suicides of many sugarcane farmers who could not extricate themselves from the debt trap. These farmers lack irrigation and are susceptible to fluctuating sugar prices and intra-annual hydroclimatic variability. This presentation discusses two aspects in particular: whether government interventions to absolve the debt of farmers are enough, and the value of investing in local storage that can buffer intra-annual variability in rainfall and of strengthening safety nets, either by creating opportunities for alternative sources of income or by crop diversification.
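    The rule-based adaptation idea can be caricatured in a few lines: state variables updated each season, with livestock sold when capital runs out. All variables, units and thresholds below are invented for illustration; the paper's model couples six calibrated state variables:

```python
# Toy sketch of a rule-based smallholder system; NOT the paper's model.
def simulate_smallholder(rainfall, capital=100.0, storage=50.0, livestock=5):
    """Advance a toy farm state one season per rainfall value and
    record (capital, storage, livestock) after each season."""
    history = []
    for rain in rainfall:
        storage = min(storage + rain - 40.0, 200.0)   # crop water use, capped store
        income = 30.0 if storage > 0 else 10.0        # harvest vs. crop failure
        capital += income - 25.0                      # household expenses
        if capital < 0 and livestock > 0:             # adaptation rule:
            livestock -= 1                            # sell an animal to
            capital += 20.0                           # cover the deficit
        storage = max(storage, 0.0)
        history.append((capital, storage, livestock))
    return history

dry_years = simulate_smallholder([20.0] * 10)   # persistent drought
wet_years = simulate_smallholder([60.0] * 10)   # ample rainfall
```

Even this caricature reproduces the qualitative behaviour the abstract describes: under persistent drought the household sells down its livestock buffer, while under ample rainfall capital and livestock are preserved.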

  16. Systematic identification of crystallization kinetics within a generic modelling framework

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli Bin; Meisler, Kresten Troelstrup; Gernaey, Krist

    2012-01-01

    A systematic development of constitutive models within a generic modelling framework has been developed for use in design, analysis and simulation of crystallization operations. The framework contains a tool for model identification connected with a generic crystallizer modelling tool-box, a tool...

  17. Flexible modeling frameworks to replace small ensembles of hydrological models and move toward large ensembles?

    Science.gov (United States)

    Addor, Nans; Clark, Martyn P.; Mizukami, Naoki

    2017-04-01

    Climate change impacts on hydrological processes are typically assessed using small ensembles of hydrological models. That is, a handful of hydrological models are typically driven by a larger number of climate models. Such a setup has several limitations. Because the number of hydrological models is small, only a small proportion of the model space is sampled, likely leading to an underestimation of the uncertainties in the projections. Further, sampling is arbitrary: although hydrological models should be selected to provide a representative sample of existing models (in terms of complexity and governing hypotheses), they are instead usually selected for legacy reasons. Furthermore, running several hydrological models currently constitutes a practical challenge because each model must be set up and calibrated individually. Finally, and probably most importantly, the differences between the projected impacts cannot be directly related to differences between hydrological models, because the models differ in almost every possible aspect. We are hence in a situation in which different hydrological models deliver different projections, but for reasons that are mostly unclear, and in which the uncertainty in the projections is probably underestimated. To overcome these limitations, we are experimenting with the flexible modeling framework FUSE (Framework for Understanding Model Errors). FUSE enables conceptual models to be constructed piece by piece (in a "pick and mix" approach), so it can be used to generate a large number of models that mimic existing models and/or models that differ from other models in a single targeted respect (e.g. how baseflow is generated). FUSE hence allows for controlled modeling experiments, and for a more systematic and exhaustive sampling of the model space. 
Here we explore climate change impacts over the contiguous USA on a 12km grid using two groups of three models: the first group involves the commonly used models VIC, PRMS and HEC
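    The "pick and mix" idea can be sketched as interchangeable process components assembled into a model. The component names and one-bucket model below are hypothetical illustrations of the approach, not FUSE's actual API:

```python
# Two interchangeable baseflow laws -- the "pieces" to pick and mix.
def linear_baseflow(storage, k=0.05):
    return k * storage

def nonlinear_baseflow(storage, k=0.01, n=2.0):
    return k * storage ** n

def make_bucket_model(baseflow):
    """Assemble a one-bucket rainfall-runoff model around an
    interchangeable baseflow component."""
    def run(rain, storage=10.0):
        flows = []
        for p in rain:
            q = baseflow(storage)              # release from the bucket
            storage = max(storage + p - q, 0.0)  # water balance update
            flows.append(q)
        return flows
    return run

# Two models that differ in a single targeted respect: baseflow generation.
model_a = make_bucket_model(linear_baseflow)
model_b = make_bucket_model(nonlinear_baseflow)
rain = [5.0, 0.0, 2.0, 0.0]
qa, qb = model_a(rain), model_b(rain)
```

Because the two models share every component except the baseflow law, any difference between `qa` and `qb` is attributable to that one modelling decision, which is exactly the controlled-experiment property the abstract attributes to FUSE.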

  18. raaSAFT: A framework enabling coarse-grained molecular dynamics simulations based on the SAFT-γ Mie force field

    Science.gov (United States)

    Ervik, Åsmund; Serratos, Guadalupe Jiménez; Müller, Erich A.

    2017-03-01

    We describe here raaSAFT, a Python code that enables the setup and running of coarse-grained molecular dynamics simulations in a systematic and efficient manner. The code is built on top of the popular HOOMD-blue code, and as such harnesses the computational power of GPUs. The methodology makes use of the SAFT-γ Mie force field, so the resulting coarse-grained pair potentials are both closely linked to and consistent with the macroscopic thermodynamic properties of the simulated fluid. In raaSAFT both homonuclear and heteronuclear models are implemented for a wide range of compounds, spanning from linear alkanes to more complicated fluids such as water and alcohols, all the way up to nonionic surfactants and models of asphaltenes and resins. Adding new compounds as well as new features is made straightforward by the modularity of the code. To demonstrate the ease of use of raaSAFT, we give a detailed walkthrough of how to simulate liquid-liquid equilibrium of a hydrocarbon with water. We describe in detail how both homonuclear and heteronuclear compounds are implemented. To demonstrate the performance and versatility of raaSAFT, we simulate a large polymer-solvent mixture with 300 polystyrene molecules dissolved in 42 700 molecules of heptane, reproducing the experimentally observed temperature-dependent solubility of polystyrene. For this case we obtain a speedup of more than three orders of magnitude compared to atomistically detailed simulations.
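The Mie potential at the heart of the SAFT-γ Mie force field is a generalized Lennard-Jones form. A minimal sketch follows; the parameter values are illustrative placeholders, not raaSAFT's coarse-grained parameters or API:

```python
# Mie (generalized Lennard-Jones) pair potential between two beads.
# With lam_r=12 and lam_a=6 it reduces to the classic 12-6 Lennard-Jones form.

def mie_potential(r, epsilon, sigma, lam_r=12.0, lam_a=6.0):
    """U(r) = C * eps * [(sigma/r)^lam_r - (sigma/r)^lam_a]."""
    # Prefactor chosen so the well depth is exactly -epsilon at the minimum.
    c = (lam_r / (lam_r - lam_a)) * (lam_r / lam_a) ** (lam_a / (lam_r - lam_a))
    return c * epsilon * ((sigma / r) ** lam_r - (sigma / r) ** lam_a)
```

Varying the repulsive exponent `lam_r` is what lets the force field tune the softness of a coarse-grained bead to match macroscopic thermodynamic properties.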

  19. Enabling Analytical and Modeling Tools for Enhanced Disease Surveillance

    Energy Technology Data Exchange (ETDEWEB)

    Dawn K. Manley

    2003-04-01

    Early detection, identification, and warning are essential to minimize casualties from a biological attack. For covert attacks, sick people are likely to provide the first indication of an attack. An enhanced medical surveillance system that synthesizes distributed health indicator information and rapidly analyzes it can dramatically increase the number of lives saved. Current surveillance methods for detecting both biological attacks and natural outbreaks are hindered by factors such as distributed ownership of information, incompatible data storage and analysis programs, and patient privacy concerns. Moreover, because data are not widely shared, few data mining algorithms have been tested on and applied to diverse health indicator data. This project addressed both the integration of multiple data sources and the development and integration of analytical tools for rapid detection of disease outbreaks. As a first prototype, we developed an application to query and display distributed patient records. This application enforced need-to-know access control and incorporated data from standard commercial databases. We developed and tested two different algorithms for outbreak recognition. The first is a pattern recognition technique that searches for space-time data clusters that may signal a disease outbreak. The second is a genetic algorithm to design and train neural networks (GANN) that we applied toward disease forecasting. We tested these algorithms against influenza, respiratory illness, and Dengue Fever data. Through this LDRD in combination with other internal funding, we delivered a distributed simulation capability to synthesize disparate information and models for earlier recognition and improved decision-making in the event of a biological attack. The architecture incorporates user feedback and control so that a user's decision inputs can impact the scenario outcome, as well as integrated security and role-based access control for communicating
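A space-time cluster search of the kind mentioned above can be illustrated with a deliberately simplified sketch. The function, its parameters, and its z-score thresholding rule are assumptions for illustration only, not the project's actual algorithm:

```python
# Simplified space-time cluster detection: flag (location, day) windows whose
# case count is unusually high relative to all windows. Illustrative only.
from math import dist
from statistics import mean, stdev

def find_clusters(cases, radius, days, threshold=1.0):
    """cases: list of (x, y, day) tuples. Return the case tuples whose
    surrounding space-time window has a count more than `threshold`
    standard deviations above the mean window count."""
    counts = []
    for cx, cy, cd in cases:
        # Count cases within `radius` and within the preceding `days` days.
        n = sum(1 for x, y, d in cases
                if dist((x, y), (cx, cy)) <= radius and 0 <= cd - d <= days)
        counts.append(((cx, cy, cd), n))
    mu = mean(n for _, n in counts)
    sd = stdev(n for _, n in counts) or 1.0   # guard against zero spread
    return [c for c, n in counts if (n - mu) / sd > threshold]
```

A production system would instead use a scan statistic with a proper null model, but the sketch shows the core idea: compare local space-time counts against an expectation.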

  20. Help seeking in older Asian people with dementia in Melbourne: using the Cultural Exchange Model to explore barriers and enablers.

    Science.gov (United States)

    Haralambous, Betty; Dow, Briony; Tinney, Jean; Lin, Xiaoping; Blackberry, Irene; Rayner, Victoria; Lee, Sook-Meng; Vrantsidis, Freda; Lautenschlager, Nicola; Logiudice, Dina

    2014-03-01

    The prevalence of dementia is increasing in Australia. Limited research is available on access to Cognitive Dementia and Memory Services (CDAMS) for people with dementia from Culturally and Linguistically Diverse (CALD) communities. This study aimed to determine the barriers and enablers to accessing CDAMS for people with dementia and their families of Chinese and Vietnamese backgrounds. Consultations with community members, community workers and health professionals were conducted using the "Cultural Exchange Model" framework. For carers, barriers to accessing services included the complexity of the health system, lack of time, travel required to get to services, language barriers, interpreters and lack of knowledge of services. Similarly, community workers and health professionals identified language, interpreters, and community perceptions as key barriers to service access. Strategies to increase knowledge included providing information via radio, printed material and education in community group settings. The "Cultural Exchange Model" enabled engagement with and modification of the approaches to meet the needs of the targeted CALD communities.

  1. Making sense of implementation theories, models and frameworks

    National Research Council Canada - National Science Library

    Nilsen, Per

    2015-01-01

    .... The aim of this article is to propose a taxonomy that distinguishes between different categories of theories, models and frameworks in implementation science, to facilitate appropriate selection...

  2. Modeling Framework for Mining Lifecycle Management

    Directory of Open Access Journals (Sweden)

    Na Lu

    2014-03-01

    In the process of informatization of mining engineering, it is difficult to directly exchange and share information across different phases and different application systems, which causes information isolation and information gaps due to the lack of unified data exchange standards and an information integration mechanism. The purpose of this research is to build a modeling framework for mining lifecycle information management. The concept of mining lifecycle management (MLM) is proposed based on product lifecycle management (PLM) and Hall's three-dimensional structure. The frame system of mining lifecycle management has been established through the application of information integration technologies and information standards. A four-layer structure for the realization of an MLM system is put forward, which shapes the development method of MLM systems. The application indicates that the proposed theories and technologies have solved the problem of information isolation across phases and applications in mining engineering, and have laid a foundation for information exchange, sharing and integration across the mining lifecycle.

  3. An Exploratory Investigation on the Invasiveness of Environmental Modeling Frameworks

    Science.gov (United States)

    This paper provides initial results of an exploratory investigation on the invasiveness of environmental modeling frameworks. Invasiveness is defined as the coupling between application (i.e., model) and framework code used to implement the model. By comparing the implementation of an environmenta...

  4. A Software Service Framework Model Based on Agent

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    This paper presents an agent-based software service framework model called ASF, and defines the basic concepts and structure of the ASF model. It also describes the management and process mechanisms in the ASF model.

  5. Business model framework applications in health care: A systematic review.

    Science.gov (United States)

    Fredriksson, Jens Jacob; Mazzocato, Pamela; Muhammed, Rafiq; Savage, Carl

    2017-01-01

    It has proven to be a challenge for health care organizations to achieve the Triple Aim. In the business literature, business model frameworks have been used to understand how organizations are aligned to achieve their goals. We conducted a systematic literature review with an explanatory synthesis approach to understand how business model frameworks have been applied in health care. We found a large increase in applications of business model frameworks during the last decade. E-health was the most common context of application. We identified six applications of business model frameworks: business model description, financial assessment, classification based on pre-defined typologies, business model analysis, development, and evaluation. Our synthesis suggests that the choice of business model framework and constituent elements should be informed by the intent and context of application. We see a need for harmonization in the choice of elements in order to increase generalizability, simplify application, and help organizations realize the Triple Aim.

  6. A Data Driven Framework for Integrating Regional Climate Models

    Science.gov (United States)

    Lansing, C.; Kleese van Dam, K.; Liu, Y.; Elsethagen, T.; Guillen, Z.; Stephan, E.; Critchlow, T.; Gorton, I.

    2012-12-01

    There are increasing needs for research addressing complex climate sensitive issues of concern to decision-makers and policy planners at a regional level. Decisions about allocating scarce water across competing municipal, agricultural, and ecosystem demands is just one of the challenges ahead, along with decisions regarding competing land use priorities such as biofuels, food, and species habitat. Being able to predict the extent of future climate change in the context of introducing alternative energy production strategies requires a new generation of modeling capabilities. We will also need more complete representations of human systems at regional scales, incorporating the influences of population centers, land use, agriculture and existing and planned electrical demand and generation infrastructure. At PNNL we are working towards creating a first-of-a-kind capability known as the Integrated Regional Earth System Model (iRESM). The fundamental goal of the iRESM initiative is the critical analyses of the tradeoffs and consequences of decision and policy making for integrated human and environmental systems. This necessarily combines different scientific processes, bridging different temporal and geographic scales and resolving the semantic differences between them. To achieve this goal, iRESM is developing a modeling framework and supporting infrastructure that enable the scientific team to evaluate different scenarios in light of specific stakeholder questions such as "How do regional changes in mean climate states and climate extremes affect water storage and energy consumption and how do such decisions influence possible mitigation and carbon management schemes?" The resulting capability will give analysts a toolset to gain insights into how regional economies can respond to climate change mitigation policies and accelerated deployment of alternative energy technologies. The iRESM framework consists of a collection of coupled models working with high

  7. LAMMPS Framework for Dynamic Bonding and an Application Modeling DNA

    DEFF Research Database (Denmark)

    Svaneborg, Carsten

    2012-01-01

    and bond types. When breaking bonds, all angular and dihedral interactions involving broken bonds are removed. The framework allows chemical reactions to be modeled, and we use it to simulate a simplistic, coarse-grained DNA model. The resulting DNA dynamics illustrates the power of the present framework.

  8. A Framework of Multi Objectives Negotiation for Dynamic Supply Chain Model

    Science.gov (United States)

    Chai, Jia Yee; Sakaguchi, Tatsuhiko; Shirase, Keiichi

    Trends of globalization and advances in Information Technology (IT) have created opportunities for collaborative manufacturing across national borders. A dynamic supply chain utilizes these advances to enable more flexibility in business cooperation. This research proposes a concurrent decision-making framework for a three-echelon dynamic supply chain model. The dynamic supply chain is formed by autonomous negotiation among agents based on a multi-agent approach. Instead of generating negotiation aspects (such as amount, price and due date) arbitrarily, this framework proposes to utilize the information available at the operational level of an organization in order to generate realistic negotiation aspects. The effectiveness of the proposed model is demonstrated by various case studies.

  9. Software Infrastructure to Enable Modeling & Simulation as a Service (M&SaaS) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This SBIR Phase 2 project will produce a software service infrastructure that enables most modeling and simulation (M&S) activities from code development and...

  10. Compact Ocean Models Enable Onboard AUV Autonomy and Decentralized Adaptive Sampling

    Science.gov (United States)

    2014-09-30

    DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. Related publication: S. Frolov, R. Kudela, J. Bellingham, "…onboard autonomy of underwater vehicles", in Proc. AGU Ocean Science Meeting, Salt Lake City, UT. [published]

  11. IDEF method-based simulation model design and development framework

    Directory of Open Access Journals (Sweden)

    Ki-Young Jeong

    2009-09-01

    The purpose of this study is to provide an IDEF method-based integrated framework for business process simulation modeling that reduces model development time by increasing communication and knowledge reusability during a simulation project. In this framework, simulation requirements are collected by a function modeling method (IDEF0) and a process modeling method (IDEF3). Based on these requirements, a common data model is constructed using the IDEF1X method. From this reusable data model, multiple simulation models are automatically generated using a database-driven simulation model development approach. The framework is claimed to help both the requirement collection and experimentation phases of a simulation project by improving system knowledge, model reusability, and maintainability through the systematic use of the three descriptive IDEF methods and the features of relational database technologies. A complex semiconductor fabrication case study was used as a testbed to evaluate and illustrate the concepts and the framework. Two different simulation software products were used to develop and control the semiconductor model from the same knowledge base. The case study empirically showed that this framework could help improve simulation project processes by using IDEF-based descriptive models and relational database technology. The authors also concluded that this framework could easily be applied to the generation of other analytical models by separating the logic from the data.
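The database-driven generation idea can be sketched as follows: the model structure lives in a relational table (as an IDEF1X data model would prescribe), and the simulation logic is generated from rows rather than hard-coded. Table and column names here are illustrative, not those of the study:

```python
# Hedged sketch of database-driven simulation model generation: the process
# route is stored relationally, and a simulation function is built from it.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE process_step (seq INTEGER, name TEXT, minutes REAL)")
db.executemany("INSERT INTO process_step VALUES (?, ?, ?)",
               [(1, "deposition", 45.0), (2, "lithography", 30.0),
                (3, "etch", 25.0)])

def build_flow_time(db):
    """Generate a flow-time function from whatever steps the table holds."""
    steps = db.execute(
        "SELECT name, minutes FROM process_step ORDER BY seq").fetchall()
    def flow_time(lots):
        # Serial line, one lot at a time: total time = lots * sum of steps.
        return lots * sum(minutes for _, minutes in steps)
    return flow_time

flow_time = build_flow_time(db)
```

Because the route is data, adding or reordering steps means editing rows, not code, which is the reusability benefit the abstract attributes to the common IDEF1X data model.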

  12. Modelling Framework of a Neural Object Recognition

    Directory of Open Access Journals (Sweden)

    Aswathy K S

    2016-02-01

    In many industrial, medical and scientific image processing applications, various feature and pattern recognition techniques are used to match specific features in an image with a known template. Despite the capabilities of these techniques, some applications require simultaneous analysis of multiple, complex, and irregular features within an image, as in semiconductor wafer inspection. In wafer inspection, discovered defects are often complex and irregular and demand more human-like inspection techniques to recognize irregularities. By incorporating neural network techniques, such image processing systems can be trained on large numbers of images until the system eventually learns to recognize irregularities. The aim of this project is to develop a framework for a machine-learning system that can classify objects of different categories. The framework utilizes Matlab toolboxes such as the Computer Vision Toolbox and the Neural Network Toolbox.

  13. An object-oriented modelling framework for the arterial wall.

    Science.gov (United States)

    Balaguera, M I; Briceño, J C; Glazier, J A

    2010-02-01

    An object-oriented modelling framework for the arterial wall is presented. The novelty of the framework is the possibility to generate customizable artery models, taking advantage of imaging technology. To our knowledge, this is the first object-oriented modelling framework for the arterial wall. Existing models do not allow close structural mapping with the arterial microstructure as the object-oriented framework does. In the implemented model, passive behaviour of the arterial wall was considered and the tunica adventitia was the objective system. As verification, a model of an arterial segment was generated. In order to simulate its deformation, a matrix structural mechanics simulator was implemented. Two simulations were conducted, one for an axial loading test and the other for a pressure-volume test. Each simulation began with a sensitivity analysis in order to determine the best parameter combination and to compare the results with analogue controls. In both cases, the simulated results closely reproduced, qualitatively and quantitatively, the analogue control plots.

  14. An Ontology-Based Framework for Modeling User Behavior

    DEFF Research Database (Denmark)

    Razmerita, Liana

    2011-01-01

    This paper focuses on the role of user modeling and semantically enhanced representations for personalization. It presents a generic Ontology-based User Modeling framework (OntobUMf), its components, and its associated user modeling processes. This framework models the behavior of the users and classifies them according to their behavior. The user ontology is the backbone of OntobUMf and has been designed according to the Information Management System Learning Information Package (IMS LIP). The user ontology includes a Behavior concept that extends the IMS LIP specification and defines ... The results of this research may contribute to the development of other frameworks for modeling user behavior, other semantically enhanced user modeling frameworks, or other semantically enhanced information systems.

  15. A Mathematical Modeling Framework for Analysis of Functional Clothing

    Directory of Open Access Journals (Sweden)

    Xiaolin Man

    2007-11-01

    In the analysis and design of functional clothing systems, it is helpful to quantify the effects of a system on a wearer's physical performance capabilities. Toward this end, a clothing modeling framework for quantifying the mechanical interactions between a given clothing system design and a specific wearer performing defined physical tasks is proposed. The modeling framework consists of three interacting modules: (1) a macroscale fabric mechanics/dynamics model; (2) a collision detection and contact correction module; and (3) a human motion module. In the proposed framework, the macroscopic fabric model is based on a rigorous large-deformation continuum-degenerated shell theory representation. Material models that capture the stress-strain behavior of different clothing fabrics are used in the continuum shell framework. The collision and contact module enforces the impenetrability constraint between the fabric and human body and computes the associated contact forces between the two. The human body is represented in the current framework as an assemblage of overlapping ellipsoids that undergo rigid body motions consistent with human motions while performing actions such as walking, running, or jumping. The transient rigid body motions of each ellipsoidal body segment in time are determined using motion capture technology. The integrated modeling framework is then exercised to quantify the resistance that the clothing exerts on the wearer during the specific activities under consideration. Current results from the framework are presented and its intended applications are discussed along with some of the key challenges remaining in clothing system modeling.

  16. A Modeling Framework for Conventional and Heat Integrated Distillation Columns

    DEFF Research Database (Denmark)

    Bisgaard, Thomas; Huusom, Jakob Kjøbsted; Abildskov, Jens

    2013-01-01

    In this paper, a generic, modular model framework for describing fluid separation by distillation is presented. At present, the framework is able to describe a conventional distillation column and a heat-integrated distillation column, but due to a modular structure the database can be further ex...

  17. A framework for habitat monitoring and climate change modelling

    NARCIS (Netherlands)

    Villoslada, Miguel; Bunce, Robert G.H.; Sepp, Kalev; Jongman, Rob H.G.; Metzger, Marc J.; Kull, Tiiu; Raet, Janar; Kuusemets, Valdo; Kull, Ain; Leito, Aivar

    2017-01-01

    Environmental stratifications provide the framework for efficient surveillance and monitoring of biodiversity and ecological resources, as well as modelling exercises. An obstacle for agricultural landscape monitoring in Estonia has been the lack of a framework for the objective selection of

  18. A framework for modeling uncertainty in regional climate change

    Science.gov (United States)

    In this study, we present a new modeling framework and a large ensemble of climate projections to investigate the uncertainty in regional climate change over the United States associated with four dimensions of uncertainty. The sources of uncertainty considered in this framework ...

  19. A Framework for Dimensionality Assessment for Multidimensional Item Response Models

    Science.gov (United States)

    Svetina, Dubravka; Levy, Roy

    2014-01-01

    A framework is introduced for considering dimensionality assessment procedures for multidimensional item response models. The framework characterizes procedures in terms of their confirmatory or exploratory approach, parametric or nonparametric assumptions, and applicability to dichotomous, polytomous, and missing data. Popular and emerging…

  20. Generic Model Predictive Control Framework for Advanced Driver Assistance Systems

    NARCIS (Netherlands)

    Wang, M.

    2014-01-01

    This thesis deals with a model predictive control framework for control design of Advanced Driver Assistance Systems, where car-following tasks are under control. The framework is applied to design several autonomous and cooperative controllers and to examine the controller properties at the microsc

  2. Coastal Ecosystem Integrated Compartment Model (ICM): Modeling Framework

    Science.gov (United States)

    Meselhe, E. A.; White, E. D.; Reed, D.

    2015-12-01

    The Integrated Compartment Model (ICM) was developed as part of the 2017 Coastal Master Plan modeling effort. It is a comprehensive numerical hydrodynamic model coupled to various geophysical process models. Simplifying assumptions related to some of the flow dynamics are applied to increase the computational efficiency of the model. The model can be used to provide insights about coastal ecosystems and evaluate restoration strategies. It builds on existing tools where possible and incorporates newly developed tools where necessary. It can perform decadal simulations (~50 years) across the entire Louisiana coast. It includes several improvements over the approach used to support the 2012 Master Plan, such as additional processes in the hydrology, vegetation, and wetland and barrier island morphology subroutines, increased spatial resolution, and integration of previously disparate models into a single modeling framework. The ICM includes habitat suitability indices (HSIs) to predict broad spatial patterns of habitat change, and it provides an additional integration with a dynamic fish and shellfish community model which quantitatively predicts potential changes in important fishery resources. It can be used to estimate the individual and cumulative effects of restoration and protection projects on the landscape, including a general estimate of water levels associated with flooding. The ICM is also used to examine possible impacts of climate change and future environmental scenarios (e.g. precipitation, eustatic sea-level rise, subsidence, tropical storms, etc.) on the landscape and on the effectiveness of restoration projects. The ICM code is publicly accessible, and coastal restoration and protection groups interested in planning-level modeling are encouraged to explore its utility as a computationally efficient tool to examine ecosystem response to future physical or ecological changes, including the implementation of restoration and protection strategies.

  3. Instant e-Teaching Framework Model for Live Online Teaching

    Directory of Open Access Journals (Sweden)

    Suhailan Safei

    2011-03-01

    Instant e-Teaching is a new concept that supplements the e-Teaching and e-Learning environment in providing full and comprehensive modern education styles. e-Learning technology embodies the concept of enabling self-learning among students on a certain subject using online references and materials, while instant e-Teaching requires a 'face-to-face' characteristic between teacher and student to simultaneously execute actions and gain instant responses. The word "instant" extends e-Teaching with the concept of real-time teaching. The challenge of delivering online and instant teaching lies not merely in the technologies and system efficiency; the system must also be usable and friendly enough to replicate the traditional classroom environment during the delivery of the class. For this purpose, an instant e-Teaching framework has been developed that emulates a dedicated virtual classroom, primarily designed for synchronous and live sharing of current teaching notes. The model has been demonstrated using an Arabic recitation teaching prototype and evaluated from professional users' perspectives.

  4. Metal-Organic Framework Thin Films as Platforms for Atomic Layer Deposition of Cobalt Ions To Enable Electrocatalytic Water Oxidation.

    Science.gov (United States)

    Kung, Chung-Wei; Mondloch, Joseph E; Wang, Timothy C; Bury, Wojciech; Hoffeditz, William; Klahr, Benjamin M; Klet, Rachel C; Pellin, Michael J; Farha, Omar K; Hupp, Joseph T

    2015-12-30

    Thin films of the metal-organic framework (MOF) NU-1000 were grown on conducting glass substrates. The films uniformly cover the conducting glass substrates and are composed of free-standing sub-micrometer rods. Subsequently, atomic layer deposition (ALD) was utilized to deposit Co(2+) ions throughout the entire MOF film via self-limiting surface-mediated reaction chemistry. The Co ions bind at aqua and hydroxo sites lining the channels of NU-1000, resulting in three-dimensional arrays of separated Co ions in the MOF thin film. The Co-modified MOF thin films demonstrate promising electrocatalytic activity for water oxidation.

  5. Conceptualising Business Models: Definitions, Frameworks and Classifications

    Directory of Open Access Journals (Sweden)

    Erwin Fielt

    2013-12-01

    The business model concept is gaining traction in different disciplines but is still criticized for being fuzzy and vague and lacking consensus on its definition and compositional elements. In this paper we set out to advance our understanding of the business model concept by addressing three areas of foundational research: business model definitions, business model elements, and business model archetypes. We define a business model as a representation of the value logic of an organization in terms of how it creates and captures customer value. This abstract and generic definition is made more specific and operational by the compositional elements that need to address the customer, value proposition, organizational architecture (firm and network level) and economics dimensions. Business model archetypes complement the definition and elements by providing a more concrete and empirical understanding of the business model concept. The main contributions of this paper are (1) explicitly including the customer value concept in the business model definition and focussing on value creation, (2) presenting four core dimensions that business model elements need to cover, (3) arguing for flexibility by adapting and extending business model elements to cater for different purposes and contexts (e.g. technology, innovation, strategy), (4) stressing a more systematic approach to business model archetypes by using business model elements for their description, and (5) suggesting the use of business model archetype research for the empirical exploration and testing of business model elements and their relationships.

  6. Organizational Models for Non-Core Processes Management: A Classification Framework

    Directory of Open Access Journals (Sweden)

    Alberto F. De Toni

    2012-12-01

    The framework enables the identification and explanation of the main advantages and disadvantages of each strategy, and highlights how a company should coherently choose an organizational model on the basis of: (a) the specialization/complexity of the non-core processes, (b) the focus on core processes, (c) its inclination towards know-how outsourcing, and (d) the desired level of autonomy in the management of non-core processes.

  7. A conceptual framework for a mentoring model for nurse educators ...

    African Journals Online (AJOL)

    A conceptual framework for a mentoring model for nurse educators. ... recruiting and retaining nurse educators to meet the demands of teaching and learning ... approaches focusing on reasoning strategies, literature control and empirical data ...

  8. Bregman divergence as general framework to estimate unnormalized statistical models

    CERN Document Server

    Gutmann, Michael

    2012-01-01

    We show that the Bregman divergence provides a rich framework to estimate unnormalized statistical models for continuous or discrete random variables, that is, models which do not integrate or sum to one, respectively. We prove that recent estimation methods such as noise-contrastive estimation, ratio matching, and score matching belong to the proposed framework, and explain their interconnection based on supervised learning. Further, we discuss the role of boosting in unsupervised learning.
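For reference, the Bregman divergence generated by a differentiable convex function $F$ is defined (in its standard form) as

```latex
D_F(x \,\|\, y) \;=\; F(x) \;-\; F(y) \;-\; \langle \nabla F(y),\, x - y \rangle .
```

Choosing $F(x) = \lVert x \rVert^2$ recovers the squared Euclidean distance, while the negative entropy $F(x) = \sum_i x_i \log x_i$ recovers the (generalized) Kullback-Leibler divergence, which is why a single Bregman framework can subsume estimators as different as score matching and noise-contrastive estimation.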

  9. Implementation of a capsular bag model to enable sufficient lens stabilization within a mechanical eye model

    Science.gov (United States)

    Bayer, Natascha; Rank, Elisabet; Traxler, Lukas; Beckert, Erik; Drauschke, Andreas

    2015-03-01

    Cataract remains the leading cause of blindness, affecting 20 million people worldwide. To restore the patient's vision, the natural lens is removed and replaced by an intraocular lens (IOL). In modern cataract surgery the posterior capsular bag is maintained to prevent inflammation and to enable stabilization of the implant. Refractive changes following cataract surgery are attributable to lens misalignments occurring due to postoperative shifts and tilts of the artificial lens. Mechanical eye models allow a preoperative investigation of the impact of such misalignments and are crucial to improving the quality of the patients' sense of sight. Furthermore, the success of sophisticated IOLs that correct high-order aberrations depends on a critical evaluation of the lens position. A new type of IOL holder is designed and implemented into a preexisting mechanical eye model. A physiological representation of the capsular bag is realized with an integrated film element to guarantee lens stabilization and centering. The positioning sensitivity of the IOL is evaluated by performing shifts and tilts in reference to the optical axis. The modulation transfer function is used to measure the optical quality at each position. Lens stability tests within the holder itself are performed by determining the modulation transfer function before and after the measurement sequence. Mechanical stability and reproducible measurement results are guaranteed with the novel capsular bag model, which allows a precise interpretation of postoperative lens misalignments. The integrated film element offers additional stabilization during the measurement routine without damaging the haptics or deteriorating the optical performance.

  10. POSITIVE LEADERSHIP MODELS: THEORETICAL FRAMEWORK AND RESEARCH

    Directory of Open Access Journals (Sweden)

    Javier Blanch, Francisco Gil

    2016-09-01

    Full Text Available The objective of this article is twofold; firstly, we establish the theoretical boundaries of positive leadership and the reasons for its emergence. It is related to the new paradigm of positive psychology that has recently been shaping the scope of organizational knowledge. This conceptual framework has triggered the development of the various forms of positive leadership (i.e., transformational, servant, spiritual, authentic, and positive). Although the construct does not seem univocally defined, these different types of leadership overlap and share a significant affinity. Secondly, we review the empirical evidence that shows the impact of positive leadership in organizations and we highlight the positive relationship between these forms of leadership and key positive organizational variables. Lastly, we analyse future research areas in order to further develop this concept.

  11. An Extensible Model and Analysis Framework

    Science.gov (United States)

    2010-11-01

    for a total of 543 seconds. For comparison purposes, in interpreted mode, opening the model took 224 seconds and running the model took 217 seconds... contains 19683 entities. A comparison of the key model complexity metrics may be found in Table 3... Triquetrum/RCP supports assembling in arbitrary ways. (12/08 presentation) 2. Prototyped OSGi component architecture for use with Netbeans and

  12. A Framework for Federated Two-Factor Authentication Enabling Cost-Effective Secure Access to Distributed Cyberinfrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Ezell, Matthew A [ORNL; Rogers, Gary L [University of Tennessee, Knoxville (UTK); Peterson, Gregory D. [University of Tennessee, Knoxville (UTK)

    2012-01-01

    As cyber attacks become increasingly sophisticated, the security measures used to mitigate the risks must also increase in sophistication. One time password (OTP) systems provide strong authentication because security credentials are not reusable, thus thwarting credential replay attacks. The credential changes regularly, making brute-force attacks significantly more difficult. In high performance computing, end users may require access to resources housed at several different service provider locations. The ability to share a strong token between multiple computing resources reduces cost and complexity. The National Science Foundation (NSF) Extreme Science and Engineering Discovery Environment (XSEDE) provides access to digital resources, including supercomputers, data resources, and software tools. XSEDE will offer centralized strong authentication for services amongst service providers that leverage their own user databases and security profiles. This work implements a scalable framework built on standards to provide federated secure access to distributed cyberinfrastructure.
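    One-time passwords of the kind described are typically generated with the HOTP/TOTP standards (RFC 4226 and RFC 6238). The abstract does not specify which algorithm XSEDE uses, so the following is a generic sketch of standard time-based OTP generation, not the framework's actual implementation:

```python
import hmac
import hashlib
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    # RFC 4226: HMAC-SHA1 over the 8-byte big-endian counter, then dynamic truncation
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = (struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF) % 10 ** digits
    return str(code).zfill(digits)

def totp(secret: bytes, unix_time: float, step: int = 30, digits: int = 6) -> str:
    # RFC 6238: HOTP applied to the number of elapsed time steps
    return hotp(secret, int(unix_time // step), digits)
```

    Server and token share `secret`; a code is valid only for the time step in which it was generated, which is what makes replayed credentials useless.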

  13. The universal fuzzy logical framework of neural circuits and its application in modeling primary visual cortex

    Institute of Scientific and Technical Information of China (English)

    HU Hong; LI Su; WANG YunJiu; QI XiangLin; SHI ZhongZhi

    2008-01-01

    Analytical study of large-scale nonlinear neural circuits is a difficult task. Here we analyze the function of neural systems by probing the fuzzy logical framework of the neural cells' dynamical equations. Although there is a close relation between the theories of fuzzy logical systems and neural systems and many papers investigate this subject, most investigations focus on finding new functions of neural systems by hybridizing fuzzy logical and neural system. In this paper, the fuzzy logical framework of neural cells is used to understand the nonlinear dynamic attributes of a common neural system by abstracting the fuzzy logical framework of a neural cell. Our analysis enables the educated design of network models for classes of computation. As an example, a recurrent network model of the primary visual cortex has been built and tested using this approach.

  14. The universal fuzzy logical framework of neural circuits and its application in modeling primary visual cortex

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    Analytical study of large-scale nonlinear neural circuits is a difficult task. Here we analyze the function of neural systems by probing the fuzzy logical framework of the neural cells’ dynamical equations. Although there is a close relation between the theories of fuzzy logical systems and neural systems and many papers investigate this subject, most investigations focus on finding new functions of neural systems by hybridizing fuzzy logical and neural system. In this paper, the fuzzy logical framework of neural cells is used to understand the nonlinear dynamic attributes of a common neural system by abstracting the fuzzy logical framework of a neural cell. Our analysis enables the educated design of network models for classes of computation. As an example, a recurrent network model of the primary visual cortex has been built and tested using this approach.

  15. The universal fuzzy logical framework of neural circuits and its application in modeling primary visual cortex.

    Science.gov (United States)

    Hu, Hong; Li, Su; Wang, YunJiu; Qi, XiangLin; Shi, ZhongZhi

    2008-10-01

    Analytical study of large-scale nonlinear neural circuits is a difficult task. Here we analyze the function of neural systems by probing the fuzzy logical framework of the neural cells' dynamical equations. Although there is a close relation between the theories of fuzzy logical systems and neural systems and many papers investigate this subject, most investigations focus on finding new functions of neural systems by hybridizing fuzzy logical and neural system. In this paper, the fuzzy logical framework of neural cells is used to understand the nonlinear dynamic attributes of a common neural system by abstracting the fuzzy logical framework of a neural cell. Our analysis enables the educated design of network models for classes of computation. As an example, a recurrent network model of the primary visual cortex has been built and tested using this approach.

  16. An Ising model for metal-organic frameworks

    Science.gov (United States)

    Höft, Nicolas; Horbach, Jürgen; Martín-Mayor, Victor; Seoane, Beatriz

    2017-08-01

    We present a three-dimensional Ising model where lines of equal spins are frozen such that they form an ordered framework structure. The frame spins impose an external field on the rest of the spins (active spins). We demonstrate that this "porous Ising model" can be seen as a minimal model for condensation transitions of gas molecules in metal-organic frameworks. Using Monte Carlo simulation techniques, we compare the phase behavior of a porous Ising model with that of a particle-based model for the condensation of methane (CH4) in the isoreticular metal-organic framework IRMOF-16. For both models, we find a line of first-order phase transitions that end in a critical point. We show that the critical behavior in both cases belongs to the 3D Ising universality class, in contrast to other phase transitions in confinement such as capillary condensation.
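    A minimal sketch of such a "porous Ising model" is easy to write down: Metropolis Monte Carlo on a small cubic lattice in which lines of spins are frozen to +1 and act as the framework. The lattice size, temperature, and frame geometry below are illustrative, not those of the paper or of IRMOF-16:

```python
import numpy as np

rng = np.random.default_rng(1)
L, beta = 8, 0.35                      # lattice size and inverse temperature (assumed)
spins = rng.choice([-1, 1], size=(L, L, L))

# "Framework": freeze lines of +1 spins along z on a sparse x-y grid; the
# spacing here is purely illustrative.
frozen = np.zeros((L, L, L), dtype=bool)
frozen[::4, ::4, :] = True
spins[frozen] = 1

def sweep(spins):
    """One Metropolis sweep; frame spins are never flipped."""
    for _ in range(spins.size):
        i, j, k = rng.integers(0, L, size=3)
        if frozen[i, j, k]:
            continue
        nb = (spins[(i + 1) % L, j, k] + spins[(i - 1) % L, j, k] +
              spins[i, (j + 1) % L, k] + spins[i, (j - 1) % L, k] +
              spins[i, j, (k + 1) % L] + spins[i, j, (k - 1) % L])
        dE = 2 * spins[i, j, k] * nb          # energy change of flipping spin (i, j, k)
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j, k] *= -1

for _ in range(100):
    sweep(spins)
m = spins.mean()
```

    Below the ordering temperature, the frozen frame acts like a local external field and biases the active spins toward the frame orientation, so the magnetization m comes out positive.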

  17. Mediation Analysis in a Latent Growth Curve Modeling Framework

    Science.gov (United States)

    von Soest, Tilmann; Hagtvet, Knut A.

    2011-01-01

    This article presents several longitudinal mediation models in the framework of latent growth curve modeling and provides a detailed account of how such models can be constructed. Logical and statistical challenges that might arise when such analyses are conducted are also discussed. Specifically, we discuss how the initial status (intercept) and…

  18. A DSM-based framework for integrated function modelling

    DEFF Research Database (Denmark)

    Eisenbart, Boris; Gericke, Kilian; Blessing, Lucienne T. M.

    2017-01-01

    an integrated function modelling framework, which specifically aims at relating between the different function modelling perspectives prominently addressed in different disciplines. It uses interlinked matrices based on the concept of DSM and MDM in order to facilitate cross-disciplinary modelling and analysis...

  19. Threat model framework and methodology for personal networks (PNs)

    DEFF Research Database (Denmark)

    Prasad, Neeli R.

    2007-01-01

    is to give a structured, convenient approach for building threat models. A framework for the threat model is presented with a list of requirements for methodology. The methodology will be applied to build a threat model for Personal Networks. Practical tools like UML sequence diagrams and attack trees have...

  20. MODELS FOR NETWORK DYNAMICS - A MARKOVIAN FRAMEWORK

    NARCIS (Netherlands)

    LEENDERS, RTAJ

    1995-01-01

    A question not very often addressed in social network analysis relates to network dynamics and focuses on how networks arise and change. It alludes to the idea that ties do not arise or vanish randomly, but (partly) as a consequence of human behavior and preferences. Statistical models for modeling
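    The Markovian idea sketched above, ties forming and dissolving by a stochastic process rather than at random, can be illustrated by treating each dyad as an independent two-state Markov chain; the per-step formation and dissolution probabilities below are assumptions for illustration, not estimates from data:

```python
import numpy as np

rng = np.random.default_rng(2)
n, steps = 40, 500
p_form, p_dissolve = 0.02, 0.06   # per-step tie change probabilities (illustrative)

A = np.zeros((n, n), dtype=int)   # directed adjacency matrix, empty network at t = 0
off_diag = ~np.eye(n, dtype=bool)

for _ in range(steps):
    r = rng.random((n, n))
    form = (A == 0) & off_diag & (r < p_form)      # absent ties may form
    dissolve = (A == 1) & (r < p_dissolve)         # present ties may dissolve
    A = np.where(form, 1, np.where(dissolve, 0, A))

density = A[off_diag].mean()
# each dyad is a two-state Markov chain with stationary tie probability:
expected = p_form / (p_form + p_dissolve)          # = 0.25 here
```

    The long-run network density converges to p_form / (p_form + p_dissolve), the stationary distribution of each dyad's chain; richer models make these rates depend on actor attributes and the current network.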

  1. A Computational Framework for Realistic Retina Modeling.

    Science.gov (United States)

    Martínez-Cañada, Pablo; Morillas, Christian; Pino, Begoña; Ros, Eduardo; Pelayo, Francisco

    2016-11-01

    Computational simulations of the retina have led to valuable insights about the biophysics of its neuronal activity and processing principles. A great number of retina models have been proposed to reproduce the behavioral diversity of the different visual processing pathways. While many of these models share common computational stages, previous efforts have been more focused on fitting specific retina functions rather than generalizing them beyond a particular model. Here, we define a set of computational retinal microcircuits that can be used as basic building blocks for the modeling of different retina mechanisms. To validate the hypothesis that similar processing structures may be repeatedly found in different retina functions, we implemented a series of retina models simply by combining these computational retinal microcircuits. Accuracy of the retina models for capturing neural behavior was assessed by fitting published electrophysiological recordings that characterize some of the best-known phenomena observed in the retina: adaptation to the mean light intensity and temporal contrast, and differential motion sensitivity. The retinal microcircuits are part of a new software platform for efficient computational retina modeling from single-cell to large-scale levels. It includes an interface with spiking neural networks that allows simulation of the spiking response of ganglion cells and integration with models of higher visual areas.
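    One of the reusable microcircuits the abstract alludes to, adaptation to the mean light intensity, can be caricatured as a leaky integrator feeding divisive gain control. This is a toy sketch of the idea, not the published platform's API; the time constant is assumed:

```python
import numpy as np

def adaptation_block(signal, tau=20.0):
    """One reusable 'microcircuit': leaky integrator plus divisive gain control."""
    state, out = 0.0, []
    for s in signal:
        state += (s - state) / tau      # running estimate of the mean intensity
        out.append(s / (1.0 + state))   # response divisively normalized by that mean
    return np.array(out)

# A step of constant light: the response is large at onset and then adapts
resp = adaptation_block(np.full(3000, 9.0))
```

    The transient-then-adapted response profile is the qualitative signature of luminance adaptation; composing several such blocks is the kind of reuse the paper argues for.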

  2. GeoFramework: Coupling multiple models of mantle convection within a computational framework

    Science.gov (United States)

    Tan, E.; Choi, E.; Thoutireddy, P.; Gurnis, M.; Aivazis, M.

    2004-12-01

    Geological processes usually encompass a broad spectrum of length and time scales. Traditionally, a modeling code (solver) is developed for a problem of specific length and time scales, but the utility of the solver beyond the designated purpose is usually limited. As we have come to recognize that geological processes often result from the dynamic coupling of deformation across a wide range of time and spatial scales, more robust methods are needed. One means to address this need is through the integration of complementary modeling codes, while attempting to reuse existing software as much as possible. The GeoFramework project addresses this by developing a suite of reusable and combinable tools for the Earth science community. GeoFramework is based on and extends Pyre, a Python-based modeling framework, developed to link solid (Lagrangian) and fluid (Eulerian) solvers, as well as mesh generators, visualization packages, and databases, with one another for engineering applications. Under the framework, solvers are aware of each other's presence and can interact by exchanging information across adjacent mesh boundaries. We will show an example of linking two instances of the CitcomS finite element solver within GeoFramework. A high-resolution regional mantle convection model is linked with a global mantle convection model. The global solver has a resolution of ~180 km horizontally and 35-100 km (with mesh refinement) vertically. The fine mesh has a resolution of ~40 km horizontally and vertically. The fine mesh is centered on the Hawaii hotspot. A vertical plume is used as an initial condition. Time-varying plate velocity models are imposed from 80 Ma onward, and we have investigated how the plume conduit is deflected by the global circulation patterns as a function of mantle viscosity, plume flux, and plate motion.

  3. Multilevel Models: Conceptual Framework and Applicability

    Directory of Open Access Journals (Sweden)

    Roxana-Otilia-Sonia Hrițcu

    2015-10-01

    Full Text Available Individuals and the social or organizational groups they belong to can be viewed as a hierarchical system situated on different levels. Individuals are situated on the first level of the hierarchy and they are nested together on the higher levels. Individuals interact with the social groups they belong to and are influenced by these groups. Traditional methods that study the relationships between data, like simple regression, do not take into account the hierarchical structure of the data and the effects of a group membership and, hence, results may be invalidated. Unlike standard regression modelling, the multilevel approach takes into account the individuals as well as the groups to which they belong. To take advantage of the multilevel analysis it is important that we recognize the multilevel characteristics of the data. In this article we introduce the outlines of multilevel data and we describe the models that work with such data. We introduce the basic multilevel model, the two-level model: students can be nested into classes, individuals into countries and the general two-level model can be extended very easily to several levels. Multilevel analysis has begun to be extensively used in many research areas. We present the most frequent study areas where multilevel models are used, such as sociological studies, education, psychological research, health studies, demography, epidemiology, biology, environmental studies and entrepreneurship. We support the idea that since hierarchies exist everywhere, multilevel data should be recognized and analyzed properly by using multilevel modelling.
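    The basic two-level (random-intercept) model described above can be illustrated with simulated units-nested-in-groups data and a simple method-of-moments (one-way ANOVA) decomposition of variance into between-group and within-group components. The sample sizes and variance components below are assumed for the simulation:

```python
import numpy as np

rng = np.random.default_rng(3)
n_groups, n_per = 100, 20
sigma_u, sigma_e = 1.0, 2.0     # between- and within-group s.d. (simulated truth)

u = rng.normal(0.0, sigma_u, n_groups)                  # group random intercepts
g = np.repeat(np.arange(n_groups), n_per)               # group id of each unit
y = 5.0 + u[g] + rng.normal(0.0, sigma_e, n_groups * n_per)

# Method-of-moments (one-way ANOVA) estimates of the two variance components
group_means = np.array([y[g == j].mean() for j in range(n_groups)])
msb = n_per * np.var(group_means, ddof=1)               # between-group mean square
msw = sum(((y[g == j] - group_means[j]) ** 2).sum()
          for j in range(n_groups)) / (n_groups * (n_per - 1))
var_e = msw                                             # within-group variance
var_u = (msb - msw) / n_per                             # between-group variance
icc = var_u / (var_u + var_e)                           # intraclass correlation
```

    The intraclass correlation var_u / (var_u + var_e) quantifies exactly the group effect that single-level regression ignores; full multilevel software estimates the same components by (restricted) maximum likelihood and adds group-level predictors.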

  4. Traffic modelling framework for electric vehicles

    Science.gov (United States)

    Schlote, Arieh; Crisostomi, Emanuele; Kirkland, Stephen; Shorten, Robert

    2012-07-01

    This article reviews and improves a recently proposed model of road network dynamics. The model is also adapted and generalised to represent the patterns of battery consumption of electric vehicles travelling in the road network. Simulations from the mobility simulator SUMO are given to support and to illustrate the efficacy of the proposed approach. Applications relevant in the field of electric vehicles, such as optimal routing and traffic load control, are provided to illustrate how the proposed model can be used to address typical problems arising in contemporary road network planning and electric vehicle mobility.
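    A road network modeled as a Markov chain, in the spirit of the framework above, can be sketched by power-iterating a row-stochastic junction-to-junction transition matrix to its stationary distribution. The four-junction network below is entirely hypothetical:

```python
import numpy as np

# Hypothetical 4-junction network: P[i, j] is the probability that a vehicle
# at junction i moves next to junction j (a row-stochastic transition matrix).
P = np.array([
    [0.0, 0.5, 0.5, 0.0],
    [0.3, 0.0, 0.3, 0.4],
    [0.2, 0.4, 0.0, 0.4],
    [0.0, 0.5, 0.5, 0.0],
])

pi = np.full(4, 0.25)        # start from a uniform distribution of vehicles
for _ in range(500):         # power iteration converges to the stationary vector
    pi = pi @ P
# pi now approximates the long-run share of traffic at each junction
```

    The stationary vector identifies chronically loaded junctions; in an electric-vehicle setting the same chain can be weighted by per-edge energy costs to predict expected battery consumption along routes.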

  5. TRANSFORM - TRANsient Simulation Framework of Reconfigurable Models

    Energy Technology Data Exchange (ETDEWEB)

    2017-09-01

    Existing development tools for early stage design and scoping of energy systems are often time consuming to use, proprietary, and do not contain the necessary function to model complete systems (i.e., controls, primary, and secondary systems) in a common platform. The Modelica programming language based TRANSFORM tool (1) provides a standardized, common simulation environment for early design of energy systems (i.e., power plants), (2) provides a library of baseline component modules to be assembled into full plant models using available geometry, design, and thermal-hydraulic data, (3) defines modeling conventions for interconnecting component models, and (4) establishes user interfaces and support tools to facilitate simulation development (i.e., configuration and parameterization), execution, and results display and capture.

  6. Fisher Information Framework for Time Series Modeling

    CERN Document Server

    Venkatesan, R C

    2016-01-01

    A robust prediction model invoking the Takens embedding theorem, whose working hypothesis is obtained via an inference procedure based on the minimum Fisher information principle, is presented. The coefficients of the ansatz, central to the working hypothesis, satisfy a time independent Schrödinger-like equation in a vector setting. The inference of i) the probability density function of the coefficients of the working hypothesis and ii) the establishing of a constraint driven pseudo-inverse condition for the modeling phase of the prediction scheme is made, for the case of normal distributions, with the aid of the quantum mechanical virial theorem. The well-known reciprocity relations and the associated Legendre transform structure for the Fisher information measure (FIM, hereafter)-based model in a vector setting (with least square constraints) are self-consistently derived. These relations are demonstrated to yield an intriguing form of the FIM for the modeling phase, which defi...

  7. A Modeling Framework for Gossip-based Information Spread

    CERN Document Server

    Bakhshi, Rena; Fokkink, Wan; van Steen, Maarten

    2011-01-01

    We present an analytical framework for gossip protocols based on the pairwise information exchange between interacting nodes. This framework allows for studying the impact of protocol parameters on the performance of the protocol. Previously, gossip-based information dissemination protocols have been analyzed under the assumption of perfect, lossless communication channels. We extend our framework for the analysis of networks with lossy channels. We show how the presence of message loss, coupled with specific topology configurations, impacts the expected behavior of the protocol. We validate the obtained models against simulations for two protocols.
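    The pairwise-exchange setting with lossy channels can be mimicked with a small push-gossip simulation on a complete network. The protocol variant, network size, and loss rate are assumptions for illustration, not the paper's analytical models:

```python
import numpy as np

rng = np.random.default_rng(4)

def push_gossip(n=200, rounds=20, p_loss=0.0):
    """Push gossip on a complete network: each informed node sends to one
    uniformly random peer per round; the message is lost with probability p_loss."""
    informed = np.zeros(n, dtype=bool)
    informed[0] = True
    history = [1]
    for _ in range(rounds):
        senders = np.where(informed)[0]
        targets = rng.integers(0, n, senders.size)       # random peer per sender
        delivered = rng.random(senders.size) >= p_loss   # lossy channel
        informed[targets[delivered]] = True
        history.append(int(informed.sum()))
    return history

lossless = push_gossip(p_loss=0.0)
lossy = push_gossip(p_loss=0.4)
```

    Comparing the two histories shows how message loss slows, but does not necessarily prevent, full dissemination, which is the qualitative effect the analytical framework quantifies.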

  8. Multicriteria framework for selecting a process modelling language

    Science.gov (United States)

    Scanavachi Moreira Campos, Ana Carolina; Teixeira de Almeida, Adiel

    2016-01-01

    The choice of process modelling language can affect business process management (BPM) since each modelling language shows different features of a given process and may limit the ways in which a process can be described and analysed. However, choosing the appropriate modelling language for process modelling has become a difficult task because of the large number of modelling languages available and the lack of guidelines for evaluating and comparing languages so as to assist in selecting the most appropriate one. This paper proposes a framework for selecting a modelling language in accordance with the purposes of modelling. This framework is based on the semiotic quality framework (SEQUAL) for evaluating process modelling languages and a multicriteria decision aid (MCDA) approach in order to select the most appropriate language for BPM. This study does not attempt to set out new forms of assessment and evaluation criteria, but does attempt to demonstrate how two existing approaches can be combined so as to solve the problem of selection of modelling language. The framework is described in this paper and then demonstrated by means of an example. Finally, the advantages and disadvantages of using SEQUAL and MCDA in an integrated manner are discussed.
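    The aggregation step of such a framework can be illustrated with the simplest MCDA rule, an additive weighted sum over SEQUAL-style criteria. The paper's actual MCDA method may differ, and the languages, weights, and scores below are invented purely to show the mechanics:

```python
import numpy as np

# SEQUAL-style criteria (e.g. domain fit, comprehensibility, tool support) with
# decision-maker weights; all numbers here are hypothetical.
weights = np.array([0.5, 0.3, 0.2])
scores = {
    "BPMN":   np.array([8, 7, 9]),
    "EPC":    np.array([6, 8, 6]),
    "UML-AD": np.array([7, 6, 8]),
}
value = {lang: float(weights @ s) for lang, s in scores.items()}
ranking = sorted(value, key=value.get, reverse=True)   # best language first
```

    Real MCDA methods differ mainly in how they elicit weights and aggregate scores (e.g. outranking instead of an additive value function), but the input structure, alternatives scored against weighted criteria, is the same.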

  9. IT-enabled dynamic capability on performance: An empirical study of BSC model

    Directory of Open Access Journals (Sweden)

    Adilson Carlos Yoshikuni

    2017-05-01

    Full Text Available Few studies have investigated the influence of “information capital,” through IT-enabled dynamic capability, on corporate performance, particularly in economic turbulence. Our study investigates the causal relationship between performance perspectives of the balanced scorecard using partial least squares path modeling. Using data on 845 Brazilian companies, we conduct a quantitative empirical study of firms during an economic crisis and observe the following interesting results. Operational and analytical IT-enabled dynamic capability had positive effects on business process improvement and corporate performance. Results pertaining to mediation (endogenous variables) and moderation (control variables) clarify IT’s role in and benefits for corporate performance.

  10. Digital Moon: A three-dimensional framework for lunar modeling

    Science.gov (United States)

    Paige, D. A.; Elphic, R. C.; Foote, E. J.; Meeker, S. R.; Siegler, M. A.; Vasavada, A. R.

    2009-12-01

    The Moon has a complex three-dimensional shape with significant large-scale and small-scale topographic relief. The Moon’s topography largely controls the distribution of incident solar radiation, as well as the scattered solar and infrared radiation fields. Topography also affects the Moon’s interaction with the space environment, its magnetic field, and the propagation of seismic waves. As more extensive and detailed lunar datasets become available, there is an increasing need to interpret and compare them with the results of physical models in a fully three-dimensional context. We have developed a three-dimensional framework for lunar modeling we call the Digital Moon. The goal of this work is to enable high fidelity physical modeling and visualization of the Moon in a parallel computing environment. The surface of the Moon is described by a continuous triangular mesh of arbitrary shape and spatial scale. For regions of limited geographic extent, it is convenient to employ meshes on a rectilinear grid. However for global-scale modeling, we employ a continuous geodesic gridding scheme (Teanby, 2008). Each element in the mesh surface is allowed to have a unique set of physical properties. Photon and particle interactions between mesh elements are modeled using efficient ray tracing algorithms. Heat, mass, photon and particle transfer within each mesh element are modeled in one dimension. Each compute node is assigned a portion of the mesh and collective interactions between elements are handled through network interfaces. We have used the model to calculate lunar surface and subsurface temperatures that can be compared directly with radiometric temperatures measured by the Diviner Lunar Radiometer Experiment on the Lunar Reconnaissance Orbiter. The model includes realistic surface photometric functions based on goniometric measurements of lunar soil samples (Foote and Paige, 2009), and one-dimensional thermal models based on lunar remote sensing and Apollo

  11. Theoretical Tinnitus framework: A Neurofunctional Model

    Directory of Open Access Journals (Sweden)

    Iman Ghodratitoostani

    2016-08-01

    Full Text Available Subjective tinnitus is the conscious (attended) awareness perception of sound in the absence of an external source and can be classified as an auditory phantom perception. The current tinnitus development models depend on the role of external events congruently paired with the causal physical events that precipitate the phantom perception. We propose a novel Neurofunctional tinnitus model to indicate that the conscious perception of phantom sound is essential in activating the cognitive-emotional value. The cognitive-emotional value plays a crucial role in governing attention allocation as well as developing annoyance within tinnitus clinical distress. Structurally, the Neurofunctional tinnitus model includes the peripheral auditory system, the thalamus, the limbic system, brain stem, basal ganglia, striatum and the auditory along with prefrontal cortices. Functionally, we assume the model includes presence of continuous or intermittent abnormal signals at the peripheral auditory system or midbrain auditory paths. Depending on the availability of attentional resources, the signals may or may not be perceived. The cognitive valuation process strengthens the lateral-inhibition and noise canceling mechanisms in the mid-brain, which leads to the cessation of sound perception and renders the signal evaluation irrelevant. However, the sourceless sound is eventually perceived and can be cognitively interpreted as suspicious or an indication of a disease in which the cortical top-down processes weaken the noise canceling effects. This results in an increase in cognitive and emotional negative reactions such as depression and anxiety. The negative or positive cognitive-emotional feedbacks within the top-down approach may have no relation to the previous experience of the patients. They can also be associated with aversive stimuli similar to abnormal neural activity in generating the phantom sound. Cognitive and emotional reactions depend on general

  12. Theoretical Tinnitus Framework: A Neurofunctional Model.

    Science.gov (United States)

    Ghodratitoostani, Iman; Zana, Yossi; Delbem, Alexandre C B; Sani, Siamak S; Ekhtiari, Hamed; Sanchez, Tanit G

    2016-01-01

    Subjective tinnitus is the conscious (attended) awareness perception of sound in the absence of an external source and can be classified as an auditory phantom perception. Earlier literature establishes three distinct states of conscious perception as unattended, attended, and attended awareness conscious perception. The current tinnitus development models depend on the role of external events congruently paired with the causal physical events that precipitate the phantom perception. We propose a novel Neurofunctional Tinnitus Model to indicate that the conscious (attended) awareness perception of phantom sound is essential in activating the cognitive-emotional value. The cognitive-emotional value plays a crucial role in governing attention allocation as well as developing annoyance within tinnitus clinical distress. Structurally, the Neurofunctional Tinnitus Model includes the peripheral auditory system, the thalamus, the limbic system, brainstem, basal ganglia, striatum, and the auditory along with prefrontal cortices. Functionally, we assume the model includes presence of continuous or intermittent abnormal signals at the peripheral auditory system or midbrain auditory paths. Depending on the availability of attentional resources, the signals may or may not be perceived. The cognitive valuation process strengthens the lateral-inhibition and noise canceling mechanisms in the mid-brain, which leads to the cessation of sound perception and renders the signal evaluation irrelevant. However, the "sourceless" sound is eventually perceived and can be cognitively interpreted as suspicious or an indication of a disease in which the cortical top-down processes weaken the noise canceling effects. This results in an increase in cognitive and emotional negative reactions such as depression and anxiety. The negative or positive cognitive-emotional feedbacks within the top-down approach may have no relation to the previous experience of the patients. They can also be

  13. Modeling of Information Sharing Enablers for building Trust in Indian Manufacturing Industry: An Integrated ISM and Fuzzy MICMAC Approach

    Directory of Open Access Journals (Sweden)

    M K KHURANA

    2010-06-01

    Full Text Available Trust is regarded as one of the most critical and essential ingredients in most business activities for collaborative relationships among supply chain members. Maintaining and building trust among supply chain members depends mainly upon continued commitment to communication together with sharing information. Trust becomes critical when uncertainty and asymmetric information are present in the transactions of a supply chain. An information sharing system is of critical importance for the creation and maintenance of trust, which is concerned with both the receipt and the dissemination of information. The present research aims to provide a comprehensive framework for the various important factors of an information sharing system affecting the level of trust in supply chain management. ISM and Fuzzy MICMAC have been deployed to identify and classify the key information sharing enablers that influence trust, based on their direct and indirect relationships. In this paper, the role of the different factors of information sharing responsible for infusing trust has been analyzed. An integrated model of information sharing enablers has been developed which may help supply chain managers to identify and classify the important criteria for their needs and to reveal the direct and indirect effects of each criterion on the trust building process in supply chain management.
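    The ISM core of such an approach is mechanical: build the final reachability matrix by transitive closure of the direct-influence matrix, then peel off levels by comparing each element's reachability set with its antecedent set. A sketch with a hypothetical four-enabler influence matrix (not the paper's enablers):

```python
import numpy as np

# Hypothetical binary "direct influence" matrix for four enablers (chain-like):
# enabler 0 drives 1, 1 drives 2, 2 drives 3; diagonal is reflexive.
A = np.array([
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
    [0, 0, 0, 1],
], dtype=bool)

def reachability(A):
    """Final reachability matrix via Warshall's transitive-closure algorithm."""
    R = A.copy()
    for k in range(len(R)):
        R |= np.outer(R[:, k], R[k, :])
    return R

def ism_levels(R):
    """ISM level partition: an element whose reachability set (within the
    remaining elements) is contained in its antecedent set sits at the current
    top level; remove that level and repeat."""
    levels, remaining = [], set(range(len(R)))
    while remaining:
        top = set()
        for i in remaining:
            reach = {j for j in remaining if R[i, j]}
            ante = {j for j in remaining if R[j, i]}
            if reach <= ante:
                top.add(i)
        levels.append(sorted(top))
        remaining -= top
    return levels

R = reachability(A)
levels = ism_levels(R)   # dependent enablers first, driving enablers last
```

    The deepest level contains the driving enablers (here enabler 0), the ones MICMAC analysis would classify as high driving power, low dependence.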

  14. A framework for quantifying net benefits of alternative prognostic models

    DEFF Research Database (Denmark)

    Rapsomaniki, Eleni; White, Ian R; Wood, Angela M

    2012-01-01

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit...
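    A standard way to quantify net benefit at a risk threshold pt, in the decision-analytic sense the abstract points to, is NB = TP/n - (FP/n) * pt/(1 - pt). Whether this is exactly the authors' formulation is not stated, so the following is a generic sketch on simulated data:

```python
import numpy as np

def net_benefit(y, risk, pt):
    """Net benefit of treating patients whose predicted risk exceeds pt:
    true-positive rate minus false-positive rate weighted by the odds of pt."""
    treat = risk >= pt
    tp = np.sum(treat & (y == 1))
    fp = np.sum(treat & (y == 0))
    n = len(y)
    return tp / n - (fp / n) * pt / (1 - pt)

rng = np.random.default_rng(5)
n = 5000
x = rng.normal(size=n)
p_true = 1.0 / (1.0 + np.exp(-(x - 1.0)))   # a well-calibrated risk model
y = (rng.random(n) < p_true).astype(int)    # simulated outcomes

nb_model = net_benefit(y, p_true, pt=0.2)          # use the model at threshold 0.2
nb_all = net_benefit(y, np.ones(n), pt=0.2)        # "treat everyone" strategy
# "treat no one" has net benefit 0 by definition
```

    Comparing two prognostic models amounts to comparing their net-benefit curves across clinically relevant thresholds, rather than their discrimination statistics alone.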

  15. Industrial Sector Energy Efficiency Modeling (ISEEM) Framework Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Karali, Nihan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-12-12

    The goal of this study is to develop a new bottom-up industry-sector energy-modeling framework that addresses least-cost regional and global carbon reduction strategies and improves on the capabilities and limitations of existing models by allowing trading across regions and countries as an alternative.

  16. A framework for quantifying net benefits of alternative prognostic models

    NARCIS (Netherlands)

    Rapsomaniki, Eleni; White, Ian R.; Wood, Angela M.; Thompson, Simon G.

    2012-01-01

    New prognostic models are traditionally evaluated using measures of discrimination and risk reclassification, but these do not take full account of the clinical and health economic context. We propose a framework for comparing prognostic models by quantifying the public health impact (net benefit) o

  17. Conceptual Frameworks and Research Models on Resilience in Leadership

    OpenAIRE

    Janet Ledesma

    2014-01-01

    The purpose of this article was to discuss conceptual frameworks and research models on resilience theory. The constructs of resilience, the history of resilience theory, models of resilience, variables of resilience, career resilience, and organizational resilience will be examined and discussed as they relate to leadership development. The literature demonstrates that there is a direct relationship between the stress...

  18. A National Modeling Framework for Water Management Decisions

    Science.gov (United States)

    Bales, J. D.; Cline, D. W.; Pietrowsky, R.

    2013-12-01

    The National Weather Service (NWS), the U.S. Army Corps of Engineers (USACE), and the U.S. Geological Survey (USGS), all Federal agencies with complementary water-resources activities, entered into an Interagency Memorandum of Understanding (MOU) "Collaborative Science Services and Tools to Support Integrated and Adaptive Water Resources Management" to collaborate in activities that support their respective missions. One of the interagency activities is the development of a highly integrated national water modeling framework and information services framework. Together these frameworks establish a common operating picture, improve modeling and synthesis, support the sharing of data and products among agencies, and provide a platform for incorporation of new scientific understanding. Each of the agencies has existing operational systems to assist in carrying out their respective missions. The systems generally are designed, developed, tested, fielded, and supported by specialized teams. A broader, shared approach is envisioned and would include community modeling, wherein multiple independent investigators or teams develop and contribute new modeling capabilities based on science advances; modern technology in coupling model components and visualizing results; and a coupled atmospheric-hydrologic model construct such that the framework could be used in real-time water-resources decision making or for long-term management decisions. The framework also is being developed to account for organizational structures of the three partners such that, for example, national data sets can move down to the regional scale, and vice versa. We envision the national water modeling framework to be an important element of the North American Water Program, to contribute to the goals of the Program, and to be informed by the science and approaches developed as a part of the Program.

  19. Application of a conceptual framework for the modelling and execution of clinical guidelines as networks of concurrent processes

    NARCIS (Netherlands)

    Fung, Nick Lik San; Widya, Ing; Broens, Tom; Larburu, Nekane; Bults, Richard; Shalom, Erez; Jones, Val; Hermens, Hermie

    2014-01-01

    We present a conceptual framework for modelling clinical guidelines as networks of concurrent processes. This enables the guideline to be partitioned and distributed at run-time across a knowledge-based telemedicine system, which is distributed by definition but whose exact physical configuration ca

  20. The Guided System Development Framework: Modeling and Verifying Communication Systems

    DEFF Research Database (Denmark)

    Carvalho Quaresma, Jose Nuno; Probst, Christian W.; Nielson, Flemming

    2014-01-01

In a world that increasingly relies on the Internet to function, application developers rely on the implementations of protocols to guarantee the security of data transferred. Whether a chosen protocol gives the required guarantees, and whether the implementation does the same, is usually unclear. The Guided System Development framework contributes to more secure communication systems by aiding the development of such systems. The framework features a simple modelling language, step-wise refinement from models to implementation, interfaces to security verification tools, and code generation from...

  1. GEMFsim: A Stochastic Simulator for the Generalized Epidemic Modeling Framework

    CERN Document Server

    Sahneh, Faryad Darabi; Shakeri, Heman; Fan, Futing; Scoglio, Caterina

    2016-01-01

The recently proposed generalized epidemic modeling framework (GEMF) (Sahneh et al., 2013) lays the groundwork for systematically constructing a broad spectrum of stochastic spreading processes over complex networks. This article builds an algorithm for exact, continuous-time numerical simulation of GEMF-based processes. Moreover, the implementation of this algorithm, GEMFsim, is available in popular scientific programming platforms such as MATLAB, R, Python, and C; GEMFsim facilitates simulating stochastic spreading models that fit in the GEMF framework. Using these simulations, one can examine the accuracy of mean-field-type approximations that are commonly used for analytical study of spreading processes on complex networks.
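The exact continuous-time simulation such an algorithm performs is Gillespie-style event-driven sampling; a minimal sketch for a single-layer SIS process on a small graph follows (function names, rates, and the graph are illustrative, not GEMFsim's actual interface):

```python
import random

def simulate_sis(adj, beta, delta, infected, t_max, seed=0):
    """Event-driven (Gillespie) simulation of a continuous-time SIS process.

    adj      : dict node -> list of neighbours
    beta     : per-edge infection rate
    delta    : per-node recovery rate
    infected : initially infected nodes
    Returns the set of infected nodes at time t_max.
    """
    rng = random.Random(seed)
    infected = set(infected)
    t = 0.0
    while t < t_max and infected:
        # Enumerate every possible event with its rate: recovery of an
        # infected node, or infection of a susceptible neighbour.
        rates = []
        for u in infected:
            rates.append((delta, ('recover', u)))
            for v in adj[u]:
                if v not in infected:
                    rates.append((beta, ('infect', v)))
        total = sum(r for r, _ in rates)
        # Exponentially distributed waiting time to the next event.
        t += rng.expovariate(total)
        if t >= t_max:
            break
        # Pick the event with probability proportional to its rate.
        x = rng.uniform(0.0, total)
        for r, (kind, node) in rates:
            x -= r
            if x <= 0.0:
                break
        if kind == 'recover':
            infected.discard(node)
        else:
            infected.add(node)
    return infected

# Tiny example: a path graph 0-1-2 with node 0 initially infected.
adj = {0: [1], 1: [0, 2], 2: [1]}
final = simulate_sis(adj, beta=0.5, delta=1.0, infected={0}, t_max=5.0)
```

Recomputing the full rate list every event is O(edges) per step; GEMF-style implementations update rates incrementally, but the sampling logic is the same.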

  2. A software engineering perspective on environmental modeling framework design: The object modeling system

    Science.gov (United States)

    The environmental modeling community has historically been concerned with the proliferation of models and the effort associated with collective model development tasks (e.g., code generation, data provisioning and transformation, etc.). Environmental modeling frameworks (EMFs) have been developed to...

  3. A Bayesian framework for parameter estimation in dynamical models.

    Directory of Open Access Journals (Sweden)

    Flávio Codeço Coelho

    Full Text Available Mathematical models in biology are powerful tools for the study and exploration of complex dynamics. Nevertheless, bringing theoretical results to an agreement with experimental observations involves acknowledging a great deal of uncertainty intrinsic to our theoretical representation of a real system. Proper handling of such uncertainties is key to the successful usage of models to predict experimental or field observations. This problem has been addressed over the years by many tools for model calibration and parameter estimation. In this article we present a general framework for uncertainty analysis and parameter estimation that is designed to handle uncertainties associated with the modeling of dynamic biological systems while remaining agnostic as to the type of model used. We apply the framework to fit an SIR-like influenza transmission model to 7 years of incidence data in three European countries: Belgium, the Netherlands and Portugal.
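The kind of inference described above can be sketched, in heavily simplified form, as fitting an SIR-type model's transmission rate to incidence data; the grid-based posterior below stands in for the MCMC machinery a real framework would use, and all parameter values and the synthetic data are illustrative:

```python
import math
import random

def sir_incidence(beta, gamma, s0, i0, steps, dt=0.1):
    """Euler-integrate a simple SIR model; return new infections per step."""
    s, i = s0, i0
    out = []
    for _ in range(steps):
        new_inf = beta * s * i * dt
        s -= new_inf
        i += new_inf - gamma * i * dt
        out.append(new_inf)
    return out

# Synthetic "observations" generated from a known beta, plus noise.
rng = random.Random(0)
true_beta, gamma = 0.6, 0.2
obs = [y + rng.gauss(0, 0.001)
       for y in sir_incidence(true_beta, gamma, 0.99, 0.01, 50)]

# Grid approximation of the posterior over beta: flat prior on the grid,
# Gaussian likelihood matching the noise model above.
grid = [0.3 + 0.01 * k for k in range(61)]   # beta in [0.3, 0.9]

def log_like(beta):
    pred = sir_incidence(beta, gamma, 0.99, 0.01, 50)
    return -sum((p - o) ** 2 for p, o in zip(pred, obs)) / (2 * 0.001 ** 2)

logs = [log_like(b) for b in grid]
m = max(logs)                                 # stabilise the exponentials
weights = [math.exp(l - m) for l in logs]
post_mean = sum(b * w for b, w in zip(grid, weights)) / sum(weights)
```

The posterior mean recovers the transmission rate used to generate the data; swapping the grid for a sampler and the Gaussian likelihood for an observation model is where a general-purpose framework earns its keep.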

  4. ECoS, a framework for modelling hierarchical spatial systems.

    Science.gov (United States)

    Harris, John R W; Gorley, Ray N

    2003-10-01

    A general framework for modelling hierarchical spatial systems has been developed and implemented as the ECoS3 software package. The structure of this framework is described, and illustrated with representative examples. It allows the set-up and integration of sets of advection-diffusion equations representing multiple constituents interacting in a spatial context. Multiple spaces can be defined, with zero, one or two-dimensions and can be nested, and linked through constituent transfers. Model structure is generally object-oriented and hierarchical, reflecting the natural relations within its real-world analogue. Velocities, dispersions and inter-constituent transfers, together with additional functions, are defined as properties of constituents to which they apply. The resulting modular structure of ECoS models facilitates cut and paste model development, and template model components have been developed for the assembly of a range of estuarine water quality models. Published examples of applications to the geochemical dynamics of estuaries are listed.

  5. Collaborative Cloud Manufacturing: Design of Business Model Innovations Enabled by Cyberphysical Systems in Distributed Manufacturing Systems

    Directory of Open Access Journals (Sweden)

    Erwin Rauch

    2016-01-01

    Full Text Available Collaborative cloud manufacturing, as a concept of distributed manufacturing, allows different opportunities for changing the logic of generating and capturing value. Cyberphysical systems and the technologies behind them are the enablers for new business models which have the potential to be disruptive. This paper introduces the topics of distributed manufacturing as well as cyberphysical systems. Furthermore, the main business model clusters of distributed manufacturing systems are described, including collaborative cloud manufacturing. The paper aims to provide support for developing business model innovations based on collaborative cloud manufacturing. Therefore, three business model architecture types of a differentiated business logic are discussed, taking into consideration the parameters which have an influence and the design of the business model and its architecture. As a result, new business models can be developed systematically and new ideas can be generated to boost the concept of collaborative cloud manufacturing within all sustainable business models.

  6. A unified framework for Schelling's model of segregation

    CERN Document Server

    Rogers, Tim

    2011-01-01

    Schelling's model of segregation is one of the first and most influential models in the field of social simulation. There are many variations of the model which have been proposed and simulated over the last forty years, though the present state of the literature on the subject is somewhat fragmented and lacking comprehensive analytical treatments. In this article a unified mathematical framework for Schelling's model and its many variants is developed. This methodology is useful in two regards: firstly, it provides a tool with which to understand the differences observed between models; secondly, phenomena which appear in several model variations may be understood in more depth through analytic studies of simpler versions.
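The core dynamics shared by these many variants fits in a few lines; the toy version below runs on a one-dimensional ring (the rule, tolerance, and sizes are chosen for illustration and are not taken from any particular variant):

```python
import random

def schelling_step(grid, size, tolerance, rng):
    """One sweep of basic Schelling dynamics on a ring lattice.

    grid holds +1/-1 agents and 0 for empty sites. An agent is unhappy
    when the fraction of like-typed occupied neighbours (of its two ring
    neighbours) falls below `tolerance`; unhappy agents relocate to a
    random empty site.
    """
    empties = [i for i, c in enumerate(grid) if c == 0]
    for i in range(size):
        agent = grid[i]
        if agent == 0 or not empties:
            continue
        neighbours = [grid[(i - 1) % size], grid[(i + 1) % size]]
        like = sum(1 for n in neighbours if n == agent)
        occupied = sum(1 for n in neighbours if n != 0)
        if occupied and like / occupied < tolerance:
            j = rng.choice(empties)
            grid[j], grid[i] = agent, 0
            empties.remove(j)
            empties.append(i)
    return grid

rng = random.Random(1)
# 30-site ring: equal numbers of +1 and -1 agents plus a few vacancies.
grid = [1, -1] * 12 + [0] * 6
rng.shuffle(grid)
for _ in range(50):
    schelling_step(grid, len(grid), tolerance=0.5, rng=rng)
```

Relocation conserves the number of agents of each type, which is the invariant the analytical treatments exploit; variants differ mainly in the neighbourhood, the grid topology, and the movement rule.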

  7. A unified framework for Schelling's model of segregation

    Science.gov (United States)

    Rogers, Tim; McKane, Alan J.

    2011-07-01

    Schelling's model of segregation is one of the first and most influential models in the field of social simulation. There are many variations of the model which have been proposed and simulated over the last forty years, though the present state of the literature on the subject is somewhat fragmented and lacking comprehensive analytical treatments. In this paper a unified mathematical framework for Schelling's model and its many variants is developed. This methodology is useful in two regards: firstly, it provides a tool with which to understand the differences observed between models; secondly, phenomena which appear in several model variations may be understood in more depth through analytic studies of simpler versions.

  8. Investigating Experimental Effects within the Framework of Structural Equation Modeling: An Example with Effects on Both Error Scores and Reaction Times

    Science.gov (United States)

    Schweizer, Karl

    2008-01-01

    Structural equation modeling provides the framework for investigating experimental effects on the basis of variances and covariances in repeated measurements. A special type of confirmatory factor analysis as part of this framework enables the appropriate representation of the experimental effect and the separation of experimental and…

  9. 3D Building Model Fitting Using A New Kinetic Framework

    CERN Document Server

    Brédif, Mathieu; Pierrot-Deseilligny, Marc; Maître, Henri

    2008-01-01

We describe a new approach to fit the polyhedron describing a 3D building model to the point cloud of a Digital Elevation Model (DEM). We introduce a new kinetic framework that hides from its user the combinatorial complexity of determining or maintaining the polyhedron topology, allowing the design of a simple variational optimization. This new kinetic framework allows the manipulation of a bounded polyhedron with simple faces by specifying the target plane equations of each of its faces. It proceeds by evolving continuously from the polyhedron defined by its initial topology and its initial plane equations to a polyhedron that is as topologically close as possible to the initial polyhedron but with the new plane equations. This kinetic framework handles internally the necessary topological changes that may be required to keep the faces simple and the polyhedron bounded. For each intermediate configuration where the polyhedron loses the simplicity of its faces or its boundedness, the simplest topological mod...

  10. Adoption of information technology enabled innovations by primary care physicians: model and questionnaire development.

    OpenAIRE

    Dixon, D. R.; Dixon, B. J.

    1994-01-01

    A survey instrument was developed based on a model of the substantive factors influencing the adoption of Information Technology (IT) enabled innovations by physicians. The survey was given to all faculty and residents in a Primary Care teaching institution. Computerized literature searching was the IT innovation studied. The results support the role of the perceived ease of use and the perceived usefulness of an innovation as well as the intent to use an innovation as factors important for i...

  11. New framework for standardized notation in wastewater treatment modelling

    DEFF Research Database (Denmark)

    Corominas, L.; Rieger, L.; Takacs, I.

    2010-01-01

Many unit process models are available in the field of wastewater treatment. All of these models use their own notation, causing problems for documentation, implementation and connection of different models (using different sets of state variables). The main goal of this paper is to propose a new notational framework which allows unique and systematic naming of state variables and parameters of biokinetic models in the wastewater treatment field. The symbols are based on one main letter that gives a general description of the state variable or parameter and several subscript levels that provide greater specification. The overall result is a framework that can be used in whole plant modelling, which consists of different fields such as activated sludge, anaerobic digestion, sidestream treatment, membrane bioreactors, metabolic approaches, fate of micropollutants and biofilm processes. The main objective of this consensus building paper is to establish a consistent set of rules that can be applied to existing and, most importantly, future models.

  12. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    Science.gov (United States)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish workflow for and demonstrate a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  13. A Liver-Centric Multiscale Modeling Framework for Xenobiotics

    Science.gov (United States)

    Swat, Maciej; Cosmanescu, Alin; Clendenon, Sherry G.; Wambaugh, John F.; Glazier, James A.

    2016-01-01

We describe a multi-scale, liver-centric in silico modeling framework for acetaminophen pharmacology and metabolism. We focus on a computational model to characterize whole body uptake and clearance, liver transport, and phase I and phase II metabolism. We do this by incorporating sub-models that span three scales: Physiologically Based Pharmacokinetic (PBPK) modeling of acetaminophen uptake and distribution at the whole body level, cell and blood flow modeling at the tissue/organ level, and metabolism at the sub-cellular level. We have used standard modeling modalities at each of the three scales. In particular, we have used the Systems Biology Markup Language (SBML) to create both the whole-body and sub-cellular scales. Our modeling approach allows us to run the individual sub-models separately and to exchange models at a particular scale without the need to extensively rework the sub-models at other scales. In addition, the use of SBML greatly facilitates the inclusion of biological annotations directly in the model code. The model was calibrated using human in vivo data for acetaminophen and its sulfate and glucuronate metabolites. We then carried out extensive parameter sensitivity studies, including the pairwise interaction of parameters. We also simulated population variation of exposure and sensitivity to acetaminophen. Our modeling framework can be extended to the prediction of liver toxicity following acetaminophen overdose, or used as a general purpose pharmacokinetic model for xenobiotics. PMID:27636091

  14. New framework for standardized notation in wastewater treatment modelling.

    Science.gov (United States)

    Corominas, L L; Rieger, L; Takács, I; Ekama, G; Hauduc, H; Vanrolleghem, P A; Oehmen, A; Gernaey, K V; van Loosdrecht, M C M; Comeau, Y

    2010-01-01

    Many unit process models are available in the field of wastewater treatment. All of these models use their own notation, causing problems for documentation, implementation and connection of different models (using different sets of state variables). The main goal of this paper is to propose a new notational framework which allows unique and systematic naming of state variables and parameters of biokinetic models in the wastewater treatment field. The symbols are based on one main letter that gives a general description of the state variable or parameter and several subscript levels that provide greater specification. Only those levels that make the name unique within the model context are needed in creating the symbol. The paper describes specific problems encountered with the currently used notation, presents the proposed framework and provides additional practical examples. The overall result is a framework that can be used in whole plant modelling, which consists of different fields such as activated sludge, anaerobic digestion, sidestream treatment, membrane bioreactors, metabolic approaches, fate of micropollutants and biofilm processes. The main objective of this consensus building paper is to establish a consistent set of rules that can be applied to existing and most importantly, future models. Applying the proposed notation should make it easier for everyone active in the wastewater treatment field to read, write and review documents describing modelling projects.
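The construction rule described above (one main letter plus only as many subscript levels as are needed to make the name unique) can be illustrated with a small helper; the letters and subscripts below are placeholders, not the paper's official symbol tables:

```python
def make_symbol(main, *subscripts):
    """Build a state-variable symbol from one main letter plus ordered
    subscript levels, keeping only the levels actually supplied.
    Illustrative of the proposed notation, not the official tables."""
    if not subscripts:
        return main
    return main + '_' + ','.join(subscripts)

# Hypothetical usage: S for soluble and X for particulate material, with
# subscripts narrowing the description until the name is unique in context.
soluble_biodegradable = make_symbol('S', 'B')
particulate_org_nitrogen = make_symbol('X', 'B', 'N')
```

The point of the scheme is that a symbol only carries the subscript levels required for uniqueness within the model context, so the same material gets the same name across activated sludge, anaerobic digestion, and biofilm sub-models.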

  15. A computational framework for a database of terrestrial biosphere models

    Science.gov (United States)

    Metzler, Holger; Müller, Markus; Ceballos-Núñez, Verónika; Sierra, Carlos A.

    2016-04-01

Most terrestrial biosphere models consist of a set of coupled ordinary first order differential equations. Each equation represents a pool containing carbon with a certain turnover rate. Although such models share some basic mathematical structures, they can have very different properties such as number of pools, cycling rates, and internal fluxes. We present a computational framework that helps analyze the structure and behavior of terrestrial biosphere models, using as an example the process of soil organic matter decomposition. The same framework can also be used for other sub-processes such as carbon fixation or allocation. First, the models have to be fed into a database consisting of simple text files with a common structure. Then they are read in using Python and transformed into an internal 'Model Class' that can be used to automatically create an overview stating the model's structure, state variables, and internal and external fluxes. SymPy, a Python library for symbolic mathematics, also helps to calculate the Jacobian matrix at given steady states and the eigenvalues of this matrix. If complete parameter sets are available, the model can also be run using R to simulate its behavior under certain conditions and to support a deeper stability analysis. In this case, the framework is also able to provide phase-plane plots where appropriate. Furthermore, an overview of all the models in the database can be given to help identify their similarities and differences.
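The symbolic analysis step described above can be sketched with SymPy for a hypothetical two-pool linear carbon model (the pool structure, rate names, and input are illustrative, not drawn from any model in the database):

```python
import sympy as sp

# Two-pool carbon model: external input u feeds fast pool C1, a fraction
# a21 of C1's turnover is transferred to slow pool C2, the rest is respired.
C1, C2 = sp.symbols('C1 C2', positive=True)
u = sp.Symbol('u', positive=True)             # external carbon input
k1, k2 = sp.symbols('k1 k2', positive=True)   # turnover rates
a21 = sp.Symbol('a21', positive=True)         # transfer fraction C1 -> C2

dC1 = u - k1 * C1
dC2 = a21 * k1 * C1 - k2 * C2

# Jacobian of the right-hand side with respect to the state vector.
J = sp.Matrix([dC1, dC2]).jacobian([C1, C2])

# Symbolic steady state and the Jacobian eigenvalues evaluated there.
steady = sp.solve([dC1, dC2], [C1, C2], dict=True)[0]
eigs = list(J.subs(steady).eigenvals().keys())
```

For this linear model the Jacobian is constant and lower-triangular, so the eigenvalues are simply the negative turnover rates, confirming a stable steady state; for nonlinear models the substitution of the steady state into the Jacobian is where the symbolic machinery pays off.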

  16. 3D Geological Framework Models as a Teaching Aid for Geoscience

    Science.gov (United States)

    Kessler, H.; Ward, E.; Geological ModelsTeaching Project Team

    2010-12-01

3D geological models have great potential as a resource for universities when teaching foundation geological concepts as they allow the student to visualise and interrogate UK geology. They are especially useful when dealing with the conversion of 2D field, map and GIS outputs into three-dimensional geological units, which is a common problem for all students of geology. Today’s earth science students use a variety of skills and processes during their learning experience, including the application of schemas, spatial thinking, image construction, detecting patterns, memorising figures, mental manipulation and interpretation, making predictions and deducing the orientation of themselves and the rocks. 3D geological models can reinforce spatial thinking strategies and encourage students to think about processes and properties, in turn helping the student to recognise pre-learnt geological principles in the field and to convert what they see at the surface into a picture of what is going on at depth. Learning issues faced by students may also be encountered by experts, policy managers, and stakeholders when dealing with environmental problems. Therefore educational research of student learning in earth science may also improve environmental decision making. 3D geological framework models enhance the learning of geosciences because they: ● enable a student to observe, manipulate and interpret geology; in particular the models instantly convert two-dimensional geology (maps, boreholes and cross-sections) into three dimensions, which is a notoriously difficult geospatial skill to acquire. ● can be orientated to whatever the user finds comfortable and most aids recognition and interpretation. ● can be used either to teach geosciences to complete beginners or to add to an experienced student's body of knowledge (whatever point that may be at). Models could therefore be packaged as a complete educational journey or students and tutor can select certain areas of the model

  17. Enabling a Collaborative Problem-Solving Framework Through User Intent Modeling of the Analytic Process

    Science.gov (United States)

    2009-08-01

Nguyen, Eugene Santos Jr., Russell Jacob, and Nathan Smith. In Proceedings of 2008 IEEE/WIC/ACM International Conference on Web Intelligence. Sydney...of Social Cognition. • McDonald, D. W., and Ackerman, M. S. 2000. Expertise Recommender: A flexible recommendation system and architecture. In

  18. Service business model framework and the service innovation scope

    NARCIS (Netherlands)

    van der Aa, W.; van der Rhee, B.; Victorino, L.

    2011-01-01

    In this paper we present a framework for service business models. We build on three streams of research. The first stream is the service management and marketing literature that focuses on the specific challenges of managing a service business. The second stream consists of research on e-business

  19. Designing for Learning and Play - The Smiley Model as Framework

    DEFF Research Database (Denmark)

    Weitze, Charlotte Lærke

    2016-01-01

    This paper presents a framework for designing engaging learning experiences in games – the Smiley Model. In this Design-Based Research project, student-game-designers were learning inside a gamified learning design - while designing and implementing learning goals from curriculum into the small d...

  20. The BMW Model: A New Framework for Teaching Monetary Economics

    Science.gov (United States)

    Bofinger, Peter; Mayer, Eric; Wollmershauser, Timo

    2006-01-01

    Although the IS/LM-AS/AD model is still the central tool of macroeconomic teaching in most macroeconomic textbooks, it has been criticized by several economists. Colander (1995) demonstrated that the framework is logically inconsistent, Romer (2000) showed that it is unable to deal with a monetary policy that uses the interest rate as its…

  1. A compositional modelling framework for exploring MPSoC systems

    DEFF Research Database (Denmark)

    Tranberg-Hansen, Anders Sejer; Madsen, Jan

    2009-01-01

This paper presents a novel compositional framework for system level performance estimation and exploration of Multi-Processor System On Chip (MPSoC) based systems. The main contributions are the definition of a compositional model which allows quantitative performance estimation to be carried out...

  2. Public–private partnership conceptual framework and models for the ...

    African Journals Online (AJOL)

    This paper presents public–private partnership (PPP) framework models for funding and financing of water services ... capital markets to finance water infrastructure, particularly local bond markets ...... for the provision of water services infrastructure assets to be ... of water use charges and/or tariffs (pricing), regulatory impact.

  3. Generic modelling framework for economic analysis of battery systems

    DEFF Research Database (Denmark)

    You, Shi; Rasmussen, Claus Nygaard

    2011-01-01

    for battery cycle life estimation, since the cycle life plays a central role in the economic analysis of BS. To illustrate the modelling framework, a case study using a Sodium Sulfur Battery (NAS) system with 5-minute regulating service is performed. The economic performances of two dispatch scenarios, a so...

  4. A Liver-centric Multiscale Modeling Framework for Xenobiotics

    Science.gov (United States)

We describe a multi-scale framework for modeling acetaminophen-induced liver toxicity. Acetaminophen is a widely used analgesic. Overdose of acetaminophen can result in liver injury via its biotransformation into a toxic product, which further induces massive necrosis. Our study foc...

  7. Model-based safety architecture framework for complex systems

    NARCIS (Netherlands)

    Schuitemaker, K.; Rajabalinejad, M.; Braakhuis, J.G.; Podofilini, Luca; Sudret, Bruno; Stojadinovic, Bozidar; Zio, Enrico; Kröger, Wolfgang

    2015-01-01

    The shift to transparency and rising need of the general public for safety, together with the increasing complexity and interdisciplinarity of modern safety-critical Systems of Systems (SoS) have resulted in a Model-Based Safety Architecture Framework (MBSAF) for capturing and sharing architectural

  8. A Graph Based Framework to Model Virus Integration Sites

    Directory of Open Access Journals (Sweden)

    Raffaele Fronza

    2016-01-01

Here, we addressed the challenge to: (1) define the notion of CIS on graph models, (2) demonstrate that the structure of CIS falls into the category of scale-free networks and (3) show that our network approach analyzes CIS dynamically in an integrated systems biology framework, using the Retroviral Transposon Tagged Cancer Gene Database (RTCGD) as a testing dataset.

  9. Framework for Understanding Structural Errors (FUSE): a modular framework to diagnose differences between hydrological models

    Science.gov (United States)

    Clark, Martyn P.; Slater, Andrew G.; Rupp, David E.; Woods, Ross A.; Vrugt, Jasper A.; Gupta, Hoshin V.; Wagener, Thorsten; Hay, Lauren E.

    2008-01-01

    The problems of identifying the most appropriate model structure for a given problem and quantifying the uncertainty in model structure remain outstanding research challenges for the discipline of hydrology. Progress on these problems requires understanding of the nature of differences between models. This paper presents a methodology to diagnose differences in hydrological model structures: the Framework for Understanding Structural Errors (FUSE). FUSE was used to construct 79 unique model structures by combining components of 4 existing hydrological models. These new models were used to simulate streamflow in two of the basins used in the Model Parameter Estimation Experiment (MOPEX): the Guadalupe River (Texas) and the French Broad River (North Carolina). Results show that the new models produced simulations of streamflow that were at least as good as the simulations produced by the models that participated in the MOPEX experiment. Our initial application of the FUSE method for the Guadalupe River exposed relationships between model structure and model performance, suggesting that the choice of model structure is just as important as the choice of model parameters. However, further work is needed to evaluate model simulations using multiple criteria to diagnose the relative importance of model structural differences in various climate regimes and to assess the amount of independent information in each of the models. This work will be crucial to both identifying the most appropriate model structure for a given problem and quantifying the uncertainty in model structure. To facilitate research on these problems, the FORTRAN-90 source code for FUSE is available upon request from the lead author.
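The mix-and-match construction FUSE performs (79 structures from components of 4 parent models) can be illustrated with a toy version in which two interchangeable options for each of two components yield four model structures; the component rules below are invented placeholders, not FUSE's hydrological parameterisations:

```python
from itertools import product

# Hypothetical interchangeable components for a bucket model: each runoff
# option maps storage S to discharge, and each loss option drains storage.
runoff_options = {
    'linear':    lambda S: 0.1 * S,
    'quadratic': lambda S: 0.01 * S * S,
}
loss_options = {
    'none':  lambda S: 0.0,
    'decay': lambda S: 0.05 * S,
}

def run_structure(runoff, loss, rain, s0=0.0):
    """Simulate one 'model structure' assembled from the chosen components."""
    S, flows = s0, []
    for p in rain:
        S += p
        q = runoff(S)
        S -= q + loss(S)
        flows.append(q)
    return flows

rain = [5.0, 0.0, 3.0, 0.0]
# Cartesian product of component choices: 2 x 2 = 4 structures, each run
# on the same forcing so their outputs can be compared.
structures = {
    (rn, ln): run_structure(runoff_options[rn], loss_options[ln], rain)
    for rn, ln in product(runoff_options, loss_options)
}
```

Running every combination on the same forcing is exactly what makes structural differences diagnosable: parameter values are held fixed while only the component choices vary.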

  10. A unifying framework for systems modeling, control systems design, and system operation

    Science.gov (United States)

    Dvorak, Daniel L.; Indictor, Mark B.; Ingham, Michel D.; Rasmussen, Robert D.; Stringfellow, Margaret V.

    2005-01-01

Current engineering practice in the analysis and design of large-scale multi-disciplinary control systems is typified by some form of decomposition (whether functional, physical, or discipline-based) that enables multiple teams to work in parallel and in relative isolation. Too often, the resulting system after integration is an awkward marriage of different control and data mechanisms with poor end-to-end accountability. System of systems engineering, which faces this problem on a large scale, cries out for a unifying framework to guide analysis, design, and operation. This paper describes such a framework, based on a state-, model-, and goal-based architecture for semi-autonomous control systems, that guides analysis and modeling, shapes control system software design, and directly specifies operational intent. This paper illustrates the key concepts in the context of a large-scale, concurrent, globally distributed system of systems: NASA's proposed Array-based Deep Space Network.

  11. COINSTAC: A Privacy Enabled Model and Prototype for Leveraging and Processing Decentralized Brain Imaging Data

    Science.gov (United States)

    Plis, Sergey M.; Sarwate, Anand D.; Wood, Dylan; Dieringer, Christopher; Landis, Drew; Reed, Cory; Panta, Sandeep R.; Turner, Jessica A.; Shoemaker, Jody M.; Carter, Kim W.; Thompson, Paul; Hutchison, Kent; Calhoun, Vince D.

    2016-01-01

    The field of neuroimaging has embraced the need for sharing and collaboration. Data sharing mandates from public funding agencies and major journal publishers have spurred the development of data repositories and neuroinformatics consortia. However, efficient and effective data sharing still faces several hurdles. For example, open data sharing is on the rise but is not suitable for sensitive data that are not easily shared, such as genetics. Current approaches can be cumbersome (such as negotiating multiple data sharing agreements). There are also significant data transfer, organization and computational challenges. Centralized repositories only partially address the issues. We propose a dynamic, decentralized platform for large scale analyses called the Collaborative Informatics and Neuroimaging Suite Toolkit for Anonymous Computation (COINSTAC). The COINSTAC solution can include data missing from central repositories, allows pooling of both open and “closed” repositories by developing privacy-preserving versions of widely-used algorithms, and incorporates the tools within an easy-to-use platform enabling distributed computation. We present an initial prototype system which we demonstrate on two multi-site data sets, without aggregating the data. In addition, by iterating across sites, the COINSTAC model enables meta-analytic solutions to converge to “pooled-data” solutions (i.e., as if the entire data were in hand). More advanced approaches such as feature generation, matrix factorization models, and preprocessing can be incorporated into such a model. In sum, COINSTAC enables access to the many currently unavailable data sets, a user friendly privacy enabled interface for decentralized analysis, and a powerful solution that complements existing data sharing solutions. PMID:27594820
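The decentralized principle (sites exchange aggregate statistics rather than raw records, and the combination converges to the pooled-data answer) can be illustrated with the simplest possible case, a pooled mean; the function names are illustrative, not COINSTAC's API:

```python
def site_summary(data):
    """Each site shares only aggregate statistics, never raw records."""
    return sum(data), len(data)

def pooled_mean(site_datasets):
    """Combine per-site summaries into the same answer the pooled data gives."""
    totals = [site_summary(d) for d in site_datasets]
    grand_sum = sum(s for s, _ in totals)
    grand_n = sum(n for _, n in totals)
    return grand_sum / grand_n

# Three 'sites' with private data of different sizes.
sites = [[1.0, 2.0, 3.0], [10.0], [4.0, 6.0]]
decentralized = pooled_mean(sites)
# Reference: the answer we would get if all the data were in one place.
centralized = sum(sum(d) for d in sites) / sum(len(d) for d in sites)
```

A mean decomposes exactly, so one round of summaries suffices; statistics that do not decompose (regression, matrix factorization) are where the iterative, converge-to-pooled scheme described in the abstract comes in.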

  12. A Framework for Understanding Physics Students' Computational Modeling Practices

    Science.gov (United States)

    Lunk, Brandon Robert

    With the growing push to include computational modeling in the physics classroom, we are faced with the need to better understand students' computational modeling practices. While existing research on programming comprehension explores how novices and experts generate programming algorithms, little of this work discusses how domain content knowledge, and physics knowledge in particular, can influence students' programming practices. In an effort to better understand this issue, I have developed a framework for modeling these practices based on a resource stance towards student knowledge. A resource framework models knowledge as the activation of vast networks of elements called "resources." Much like neurons in the brain, resources that become active can trigger cascading events of activation throughout the broader network. This model emphasizes the connectivity between knowledge elements and provides a description of students' knowledge base. Together with resources, the concepts of "epistemic games" and "frames" provide a means for addressing the interaction between content knowledge and practices. Although this framework has generally been limited to describing conceptual and mathematical understanding, it also provides a means for addressing students' programming practices. In this dissertation, I will demonstrate this facet of a resource framework as well as fill in an important missing piece: a set of epistemic games that can describe students' computational modeling strategies. The development of this theoretical framework emerged from the analysis of video data of students generating computational models during the laboratory component of a Matter & Interactions: Modern Mechanics course. Student participants across two semesters were recorded as they worked in groups to fix pre-written computational models that were initially missing key lines of code. Analysis of this video data showed that the students' programming practices were highly influenced by

  13. Re-orienting a remote acute care model towards a primary health care approach: key enablers.

    Science.gov (United States)

    Carroll, Vicki; Reeve, Carole A; Humphreys, John S; Wakerman, John; Carter, Maureen

    2015-01-01

    The objective of this study was to identify the key enablers of change in re-orienting a remote acute care model to comprehensive primary healthcare delivery. The setting of the study was a 12-bed hospital in Fitzroy Crossing, Western Australia. Individual key informant, in-depth interviews were completed with five of six identified senior leaders involved in the development of the Fitzroy Valley Health Partnership. Interviews were recorded and transcripts were thematically analysed by two investigators for shared views about the enabling factors strengthening primary healthcare delivery in a remote region of Australia. Participants described the establishment of a culturally relevant primary healthcare service, using a community-driven, 'bottom up' approach characterised by extensive community participation. The formal partnership across the government and community controlled health services was essential, both to enable change to occur and to provide sustainability in the longer term. A hierarchy of major themes emerged. These included community participation, community readiness and desire for self-determination; linkages in the form of a government community controlled health service partnership; leadership; adequate infrastructure; enhanced workforce supply; supportive policy; and primary healthcare funding. The strong united leadership shown by the community and the health service enabled barriers to be overcome and it maximised the opportunities provided by government policy changes. The concurrent alignment around a common vision enabled implementation of change. The key principle learnt from this study is the importance of community and health service relationships and local leadership around a shared vision for the re-orientation of community health services.

  14. Generic Raman-based calibration models enabling real-time monitoring of cell culture bioreactors.

    Science.gov (United States)

    Mehdizadeh, Hamidreza; Lauri, David; Karry, Krizia M; Moshgbar, Mojgan; Procopio-Melino, Renee; Drapeau, Denis

    2015-01-01

    Raman-based multivariate calibration models have been developed for real-time in situ monitoring of multiple process parameters within cell culture bioreactors. Developed models are generic, in the sense that they are applicable to various products, media, and cell lines based on Chinese Hamster Ovary (CHO) host cells, and are scalable to large pilot and manufacturing scales. Several batches using different CHO-based cell lines and corresponding proprietary media and process conditions have been used to generate calibration datasets, and models have been validated using independent datasets from separate batch runs. All models have been validated to be generic and capable of predicting process parameters with acceptable accuracy. The developed models allow monitoring of multiple key bioprocess metabolic variables, and hence can be utilized as an important enabling tool for Quality by Design approaches, which are strongly supported by the U.S. Food and Drug Administration.
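
    As a toy illustration of a calibration model of this kind, a univariate least-squares calibration on invented intensity/concentration pairs might look like the sketch below; the published models are multivariate (e.g., partial least squares on full Raman spectra), which is not reproduced here.

```python
# Illustrative sketch only: a univariate least-squares calibration standing
# in for the (multivariate, e.g. PLS) Raman models described above. The
# intensity/concentration pairs are invented.

def fit_line(xs, ys):
    """Ordinary least squares for y = b0 + b1 * x (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b1 = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
          / sum((x - mx) ** 2 for x in xs))
    b0 = my - b1 * mx
    return b0, b1

# Calibration set: Raman band intensity -> metabolite concentration (g/L)
intensity = [0.10, 0.20, 0.30, 0.40]
conc = [1.0, 2.0, 3.0, 4.0]
b0, b1 = fit_line(intensity, conc)

def predict(x):
    """Real-time prediction for a new in situ measurement."""
    return b0 + b1 * x
```

    Validation on independent batches, as in the abstract, would amount to checking `predict` against concentrations measured offline for data the model never saw.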

  15. A Systematic Modelling Framework for Phase Transfer Catalyst Systems

    DEFF Research Database (Denmark)

    Anantpinijwatna, Amata; Sales-Cruz, Mauricio; Hyung Kim, Sun

    2016-01-01

    in an aqueous phase. These reacting systems are receiving increased attention as novel organic synthesis options due to their flexible operation, higher product yields, and ability to avoid hazardous or expensive solvents. Major considerations in the design and analysis of PTC systems are physical and chemical...... equilibria, as well as kinetic mechanisms and rates. This paper presents a modelling framework for design and analysis of PTC systems that requires a minimum amount of experimental data to develop and employ the necessary thermodynamic and reaction models and embeds them into a reactor model for simulation....... The application of the framework is made to two cases in order to highlight the performance and issues of activity coefficient models for predicting design and operation and the effects when different organic solvents are employed....

  16. Indeterminate direction relation model based on fuzzy description framework

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The indetermination of direction relation is a hot topic for fuzzy GIS researchers. The existing models only study the effects of indetermination of spatial objects, but ignore the uncertainty of the direction reference framework. In this paper, first a formalized representation model of indeterminate spatial objects is designed based on the quadruple (x, y, A, μ); then a fuzzy direction reference framework is constructed by revising the cone method, in which the partitions of direction tiles are smooth and continuous, and two neighboring sections are overlapped in the transitional zones with a fuzzy method. Grounded on these, a fuzzy description model for indeterminate direction relation is proposed in which the uncertainty of all three parts (source object, reference object and reference frame) is taken into account simultaneously. In the end, case studies are implemented to test the rationality and validity of the model.
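
    The overlapping transitional zones can be sketched with simple trapezoidal membership functions; the four-sector layout and the 30-degree overlap width below are assumptions for illustration, not the paper's exact construction.

```python
# Illustrative sketch only: trapezoidal memberships for four direction
# tiles with overlapping transitional zones; sector centres and the
# 30-degree overlap width are assumptions, not the paper's construction.

CENTRES = {"E": 0.0, "N": 90.0, "W": 180.0, "S": 270.0}
HALF_WIDTH = 45.0  # crisp quarter-sector half-width (degrees)
OVERLAP = 30.0     # width of the fuzzy transitional zone (degrees)

def membership(azimuth, tile):
    """Membership degree of an azimuth (degrees) in a direction tile."""
    # shortest angular distance from the tile centre
    d = abs((azimuth - CENTRES[tile] + 180.0) % 360.0 - 180.0)
    core = HALF_WIDTH - OVERLAP / 2.0
    if d <= core:                          # inside the crisp core
        return 1.0
    if d >= HALF_WIDTH + OVERLAP / 2.0:    # outside the tile entirely
        return 0.0
    return (HALF_WIDTH + OVERLAP / 2.0 - d) / OVERLAP  # smooth ramp
```

    In a transitional zone the memberships of two neighbouring tiles sum to one, so the partition of direction tiles stays smooth and continuous as the abstract requires.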

  17. Theoretical Models and Operational Frameworks in Public Health Ethics

    Science.gov (United States)

    Petrini, Carlo

    2010-01-01

    The article is divided into three sections: (i) an overview of the main ethical models in public health (theoretical foundations); (ii) a summary of several published frameworks for public health ethics (practical frameworks); and (iii) a few general remarks. Rather than maintaining the superiority of one position over the others, the main aim of the article is to summarize the basic approaches proposed thus far concerning the development of public health ethics by describing and comparing the various ideas in the literature. With this in mind, an extensive list of references is provided. PMID:20195441

  18. Theoretical Models and Operational Frameworks in Public Health Ethics

    Directory of Open Access Journals (Sweden)

    Carlo Petrini

    2010-01-01

    Full Text Available The article is divided into three sections: (i) an overview of the main ethical models in public health (theoretical foundations); (ii) a summary of several published frameworks for public health ethics (practical frameworks); and (iii) a few general remarks. Rather than maintaining the superiority of one position over the others, the main aim of the article is to summarize the basic approaches proposed thus far concerning the development of public health ethics by describing and comparing the various ideas in the literature. With this in mind, an extensive list of references is provided.

  19. Interpretive Structural Modeling Of Implementation Enablers For Just In Time In ICPI

    Directory of Open Access Journals (Sweden)

    Nitin Upadhye

    2014-12-01

    Full Text Available Indian Corrugated Packaging Industries (ICPI) have built up tough competition among the industries in terms of product cost, quality, product delivery, flexibility, and finally customer demand. As their customers, mostly OEMs, are asking for Just in Time deliveries, ICPI must implement JIT in their system. The term "JIT" denotes a system that utilizes less, in terms of all inputs, to create the same outputs as those created by a traditional mass production system, while contributing increased varieties for the end customer (Womack et al. 1990). JIT focuses on abolishing or reducing muda ("muda" is the Japanese word for waste) and on maximizing or fully utilizing activities that add value from the customer's perspective. There is a lack of awareness in identifying the right enablers of JIT implementation. Therefore, this study has tried to find out the enablers from the literature review and experts' opinions from corrugated packaging industries and developed the relationship matrix to see the driving power and dependence between them. In this study, modeling has been done in order to know the interrelationships between the enablers with the help of Interpretive Structural Modeling (ISM) and Cross Impact Matrix Multiplication Applied to Classification (MICMAC) analysis for the performance of Indian corrugated packaging industries.
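
    The MICMAC step can be sketched directly: given ISM's final reachability matrix, driving power is a row sum and dependence a column sum, and the two together classify each enabler. The four enablers and the matrix below are invented for illustration.

```python
# Illustrative sketch only: hypothetical enablers and reachability matrix
# (1 = enabler i reaches enabler j); a real study derives R from expert
# judgements plus transitivity checks.
enablers = ["management support", "training", "supplier reliability", "layout"]
R = [
    [1, 1, 1, 1],
    [0, 1, 0, 1],
    [0, 0, 1, 1],
    [0, 0, 0, 1],
]

driving = [sum(row) for row in R]                              # row sums
dependence = [sum(R[i][j] for i in range(len(R))) for j in range(len(R))]

def micmac_class(drv, dep, mid=2.5):
    """Quadrant of the MICMAC driving-power/dependence diagram."""
    if drv > mid and dep > mid:
        return "linkage"
    if drv > mid:
        return "driver"
    if dep > mid:
        return "dependent"
    return "autonomous"

classes = [micmac_class(d, p) for d, p in zip(driving, dependence)]
```

    High driving power with low dependence marks the enablers management should address first, which is the kind of result the study reports for ICPI.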

  20. A High Performance Bayesian Computing Framework for Spatiotemporal Uncertainty Modeling

    Science.gov (United States)

    Cao, G.

    2015-12-01

    All types of spatiotemporal measurements are subject to uncertainty. As spatiotemporal data become increasingly involved in scientific research and decision making, it is important to appropriately model the impact of uncertainty. Quantitatively modeling spatiotemporal uncertainty, however, is a challenging problem considering the complex dependence structures and data heterogeneities. State-space models provide a unifying and intuitive framework for dynamic systems modeling. In this paper, we aim to extend the conventional state-space models for uncertainty modeling in space-time contexts while accounting for spatiotemporal effects and data heterogeneities. Gaussian Markov Random Field (GMRF) models, also known as conditional autoregressive models, are arguably the most commonly used methods for modeling spatially dependent data. GMRF models basically assume that a geo-referenced variable primarily depends on its neighborhood (Markov property), and the spatial dependence structure is described via a precision matrix. Recent studies have shown that GMRFs are efficient approximations to the commonly used Gaussian fields (e.g., Kriging), and compared with Gaussian fields, GMRFs enjoy a series of appealing features, such as fast computation and easy accounting for heterogeneities in spatial data (e.g., point and areal). This paper represents each spatial dataset as a GMRF and integrates them into a state-space form to statistically model the temporal dynamics. Different types of spatial measurements (e.g., categorical, count or continuous) can be accounted for by appropriate link functions. A fast alternative to the MCMC framework, the so-called Integrated Nested Laplace Approximation (INLA), was adopted for model inference. Preliminary case studies will be conducted to showcase the advantages of the described framework. In the first case, we apply the proposed method for modeling the water table elevation of the Ogallala aquifer over the past decades. In the second case, we analyze the
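
    The Markov property and the precision-matrix encoding can be made concrete on a tiny example. The sketch below builds the precision matrix of a four-node chain GMRF (an assumed toy graph, not the paper's model) and shows that a node's conditional mean depends only on its neighbours.

```python
# Illustrative sketch only: a four-node chain GMRF (an assumed toy graph).
# The sparse precision matrix Q encodes the Markov property: a node's
# conditional mean depends only on its neighbours.

n, kappa = 4, 1.0
Q = [[0.0] * n for _ in range(n)]  # precision matrix (inverse covariance)
for i in range(n):
    for j in (i - 1, i + 1):
        if 0 <= j < n:
            Q[i][j] = -kappa       # off-diagonal: neighbour coupling
            Q[i][i] += kappa       # diagonal: number of neighbours

def conditional_mean(i, x):
    """E[x_i | x_-i] = -(1 / Q_ii) * sum_{j != i} Q_ij * x_j."""
    return -sum(Q[i][j] * x[j] for j in range(n) if j != i) / Q[i][i]

x = [1.0, 0.0, 2.0, 3.0]  # hypothetical field values
```

    For the interior node 1, the conditional mean is just the average of its two neighbours; the sparsity of Q is what makes GMRF computation fast relative to dense Gaussian fields.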

  1. Catchment-Scale Simulation of Nitrogen Dynamics Using a Modular Hydrological Modelling Framework

    Science.gov (United States)

    Basu, N. B.; Shafii, M.; Craig, J. R.; Schiff, S. L.; Van Cappellen, P.

    2016-12-01

    The hydrological modelling framework Raven is a modular and flexible framework for semi-distributed simulation of watershed hydrology. Raven enables the incorporation of different hydrologic processes, the evaluation of model choices, and hypothesis testing about model structure. Raven also supports the simulation of solute transport in catchments and in the surface water network. We developed a coupled hydrological-biogeochemical model within Raven to simulate catchment-scale nitrate loss in the Grand River Watershed (GRW), the largest basin in Southern Ontario feeding into Lake Erie. The GRW is a snow-dominated catchment and has severe nitrate contamination issues (due to intensive agriculture and a dense tile drainage system), especially during snowmelt events. We used several sets of hydrochemical data (including tile drainage data), combined with a unique flow partitioning approach, to constrain flow pathways in the hydrology model, which is critical to the accurate representation of the sources and sinks in the biogeochemical model. A biogeochemical model was then coupled to the hydrologic model in Raven to simulate nitrogen processes and identify nitrate loss at a variety of spatio-temporal scales in the GRW. The preliminary results obtained after applying the coupled model to a subbasin in the GRW are promising, and we are at the stage of upscaling the model to the entire watershed. Raven, as an open-source, object-oriented software package, is currently being used by watershed managers, and incorporating nutrient dynamics in the code makes it applicable to solving water quality problems at the catchment scale as well.

  2. Compendium of models from a gauge U(1) framework

    Science.gov (United States)

    Ma, Ernest

    2016-06-01

    A gauge U(1) framework was established in 2002 to extend the supersymmetric Standard Model. It has many possible realizations. Whereas all have the necessary and sufficient ingredients to explain the possible 750 GeV diphoton excess, observed recently by the ATLAS and CMS Collaborations at the Large Hadron Collider (LHC), they differ in other essential aspects. A compendium of such models is discussed.

  3. Advancing Integrated Systems Modelling Framework for Life Cycle Sustainability Assessment

    Directory of Open Access Journals (Sweden)

    Anthony Halog

    2011-02-01

    Full Text Available The need for an integrated methodological framework for sustainability assessment has been widely discussed and is urgent due to increasingly complex environmental system problems. These problems have impacts on ecosystems and human well-being which represent a threat to the economic performance of countries and corporations. Integrated assessment crosses issues; spans spatial and temporal scales; looks forward and backward; and incorporates multi-stakeholder inputs. This study aims to develop an integrated methodology by capitalizing on the complementary strengths of different methods used by industrial ecologists and biophysical economists. The computational methodology proposed here is a systems-perspective, integrative, and holistic approach for sustainability assessment which attempts to link basic science and technology to policy formulation. The framework adopts life cycle thinking methods—LCA, LCC, and SLCA; stakeholder analysis supported by multi-criteria decision analysis (MCDA); and dynamic system modelling. Following the Pareto principle, the critical sustainability criteria, indicators and metrics (i.e., hotspots) can be identified and further modelled using system dynamics or agent-based modelling and improved by data envelopment analysis (DEA) and sustainability network theory (SNT). The framework is being applied to the development of biofuel supply chain networks. The framework can provide new ways of integrating knowledge across the divides between social and natural sciences as well as between critical and problem-solving research.
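
    A minimal sketch of the MCDA step, assuming a simple weighted-sum aggregation over normalised criteria; the alternatives, scores and weights below are invented, not from the study.

```python
# Illustrative sketch only: weighted-sum MCDA ranking of hypothetical
# biofuel alternatives over normalised criteria (higher = better).

scores = {  # alternative -> [environmental, economic, social]
    "biofuel A": [0.9, 0.4, 0.6],
    "biofuel B": [0.5, 0.8, 0.7],
    "biofuel C": [0.3, 0.9, 0.5],
}
weights = [0.5, 0.3, 0.2]  # stakeholder-derived weights, summing to 1

def weighted_score(vals):
    return sum(w * v for w, v in zip(weights, vals))

# Rank alternatives from best to worst by aggregate score
ranking = sorted(scores, key=lambda a: weighted_score(scores[a]), reverse=True)
```

    In the framework above, such a ranking would feed the identification of hotspots that are then modelled dynamically and benchmarked with DEA.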

  4. Modelling Framework to Support Decision-Making in Manufacturing Enterprises

    Directory of Open Access Journals (Sweden)

    Tariq Masood

    2013-01-01

    Full Text Available Systematic model-driven decision-making is crucial to design, engineer, and transform manufacturing enterprises (MEs). Choosing and applying the best philosophies and techniques is challenging as most MEs deploy complex and unique configurations of process-resource systems and seek economies of scope and scale in respect of changing and distinctive product flows. This paper presents a novel systematic enhanced integrated modelling framework to facilitate transformation of MEs, which is centred on CIMOSA. Application of the new framework in an automotive industrial case study is also presented. The following new contributions to knowledge are made: (1) an innovative structured framework that can support various decisions in design, optimisation, and control to reconfigure MEs; (2) an enriched and generic process modelling approach with capability to represent both static and dynamic aspects of MEs; and (3) an automotive industrial case application showing benefits in terms of reduced lead time and cost with improved responsiveness of the process-resource system with a special focus on PPC. It is anticipated that the new framework is not limited to the automotive industry but has a wider scope of application. Therefore, it would be interesting to extend its testing with different configurations and decision-making levels.

  5. A new fit-for-purpose model testing framework: Decision Crash Tests

    Science.gov (United States)

    Tolson, Bryan; Craig, James

    2016-04-01

    decisions. In one case, we show the set of model building decisions has a low probability to correctly support the upgrade decision. In the other case, we show evidence suggesting another set of model building decisions has a high probability to correctly support the decision. The proposed DCT framework focuses on what model users typically care about: the management decision in question. The DCT framework will often be very strict and will produce easy-to-interpret results, enabling clear unsuitability determinations. In the past, hydrologic modelling progress has necessarily meant new models and model building methods. Continued progress in hydrologic modelling requires finding clear evidence to motivate researchers to disregard unproductive models and methods, and the DCT framework is built to produce this kind of evidence. References: Andréassian, V., C. Perrin, L. Berthet, N. Le Moine, J. Lerat, C. Loumagne, L. Oudin, T. Mathevet, M.-H. Ramos, and A. Valéry (2009), Crash tests for a standardized evaluation of hydrological models. Hydrology and Earth System Sciences, 13, 1757-1764. Klemeš, V. (1986), Operational testing of hydrological simulation models. Hydrological Sciences Journal, 31 (1), 13-24.

  6. Possibilities: A framework for modeling students' deductive reasoning in physics

    Science.gov (United States)

    Gaffney, Jonathan David Housley

    Students often make errors when trying to solve qualitative or conceptual physics problems, and while many successful instructional interventions have been generated to prevent such errors, the process of deduction that students use when solving physics problems has not been thoroughly studied. In an effort to better understand that reasoning process, I have developed a new framework, which is based on the mental models framework in psychology championed by P. N. Johnson-Laird. My new framework models how students search possibility space when thinking about conceptual physics problems and suggests that errors arise from failing to flesh out all possibilities. It further suggests that instructional interventions should focus on making apparent those possibilities, as well as all physical consequences those possibilities would incur. The possibilities framework emerged from the analysis of data from a unique research project specifically invented for the purpose of understanding how students use deductive reasoning. In the selection task, participants were given a physics problem along with three written possible solutions with the goal of identifying which one of the three possible solutions was correct. Each participant was also asked to identify the errors in the incorrect solutions. For the study presented in this dissertation, participants not only performed the selection task individually on four problems, but they were also placed into groups of two or three and asked to discuss with each other the reasoning they used in making their choices and attempt to reach a consensus about which solution was correct. Finally, those groups were asked to work together to perform the selection task on three new problems. The possibilities framework appropriately models the reasoning that students use, and it makes useful predictions about potentially helpful instructional interventions. The study reported in this dissertation emphasizes the useful insight the

  7. An enhanced BSIM modeling framework for self-heating aware circuit design

    Science.gov (United States)

    Schleyer, M.; Leuschner, S.; Baumgartner, P.; Mueller, J.-E.; Klar, H.

    2014-11-01

    This work proposes a modeling framework to enhance the industry-standard BSIM4 MOSFET models with capabilities for coupled electro-thermal simulations. An automated simulation environment extracts thermal information from model data as provided by the semiconductor foundry. The standard BSIM4 model is enhanced with a Verilog-A based wrapper module, adding thermal nodes which can be connected to a thermal-equivalent RC network. The proposed framework allows a fully automated extraction process based on the netlist of the top-level design and the model library. A numerical analysis tool is used to control the extraction flow and to obtain all required parameters. The framework is used to model self-heating effects on a fully integrated class A/AB power amplifier (PA) designed in a standard 65 nm CMOS process. The PA is driven with +30 dBm output power, leading to an average temperature rise of approximately 40 °C over ambient temperature.
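
    The steady-state effect of such a thermal-equivalent network can be sketched with numbers chosen to reproduce the ~40 °C rise quoted above; the ~1 W dissipation and the individual thermal resistances are assumptions, not values from the paper.

```python
# Illustrative sketch only: in steady state a series thermal RC ladder
# reduces to summed thermal resistances, so the temperature rise is
# P * sum(R_th). The ~1 W dissipation and the resistances are assumptions
# chosen to reproduce the ~40 degC rise quoted above.

P_watt = 1.0              # dissipated power (assumed; +30 dBm output ~ 1 W)
R_th = [25.0, 10.0, 5.0]  # junction -> die -> package -> ambient (K/W)

delta_T = P_watt * sum(R_th)  # steady-state rise over ambient (K)
```

    In the coupled electro-thermal simulation described above, this rise would feed back into the BSIM4 model's temperature-dependent parameters through the Verilog-A wrapper's thermal nodes.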

  8. Service and business model for technology enabled and home-based cardiac rehabilitation programs.

    Science.gov (United States)

    Sarela, Antti; Whittaker, Frank; Korhonen, Ilkka

    2009-01-01

    Cardiac rehabilitation programs are comprehensive lifestyle programs aimed at preventing the recurrence of a cardiac event. However, current programs globally have significantly low levels of uptake. A home-based model can be a viable alternative to hospital-based programs. We developed and analysed a service and business model for home-based cardiac rehabilitation based on personal mentoring using mobile phones and web services. We analysed the different organizational and economic aspects of setting up and running the home-based program and propose a potential business model for a sustainable and viable service. The model can be extended to the management of other chronic conditions to enable the transition from hospital- and care-centre-based treatments to sustainable home-based care.

  9. A MULTI-OBJECTIVE ROBUST OPERATION MODEL FOR ELECTRONIC MARKET ENABLED SUPPLY CHAIN WITH UNCERTAIN DEMANDS

    Institute of Scientific and Technical Information of China (English)

    Jiawang XU; Xiaoyuan HUANG; Nina YAN

    2007-01-01

    A multi-objective robust operation model is proposed in this paper for an electronic market enabled supply chain consisting of multiple suppliers and multiple customers with uncertain demands. Suppliers in this supply chain provide many kinds of products to different customers directly or through the electronic market. Uncertain demands are described as a scenario set with certain probabilities; the supply chain operation model is constructed by using the robust optimization method based on scenario analyses. The operation model we propose is a multi-objective programming problem satisfying several conflicting objectives, such as meeting the demands of all customers, minimizing the system cost, keeping the availabilities of suppliers' capacities above a certain level, and ensuring robustness of the decision to uncertain demands. The results of numerical examples show that the solution of the model is rather conservative; however, it can effectively ensure the robustness of the operation of the supply chain.
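
    The scenario-based robust idea can be sketched for a single ordering decision: demand is a small scenario set with probabilities, and a robust choice minimizes the worst-case cost rather than only the expected cost. All numbers below are invented.

```python
# Illustrative sketch only: scenario-based robust evaluation of a single
# ordering decision q. Demands, probabilities and unit costs are invented.

scenarios = [(80, 0.3), (100, 0.5), (120, 0.2)]  # (demand, probability)
HOLD, SHORT = 1.0, 4.0                           # over-/under-supply unit costs

def cost(q, demand):
    return HOLD * max(q - demand, 0) + SHORT * max(demand - q, 0)

def expected_cost(q):
    """Average performance over the scenario set."""
    return sum(p * cost(q, d) for d, p in scenarios)

def worst_case_cost(q):
    """Robust criterion: guard against the worst scenario."""
    return max(cost(q, d) for d, _ in scenarios)

# Robust decision: minimize the worst-case cost over candidate quantities
best_robust = min(range(80, 121), key=worst_case_cost)
```

    The robust choice hedges against the worst scenario and is therefore more conservative than minimizing expected cost alone, mirroring the trade-off reported in the abstract.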

  10. Land Surface Verification Toolkit (LVT) - A Generalized Framework for Land Surface Model Evaluation

    Science.gov (United States)

    Kumar, Sujay V.; Peters-Lidard, Christa D.; Santanello, Joseph; Harrison, Ken; Liu, Yuqiong; Shaw, Michael

    2011-01-01

    Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) is described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it also supports hydrological data products from other, non-LIS environments. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation are supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.
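
    As a tiny illustration of the traditional accuracy-based measures such a toolkit computes, bias and RMSE over a paired model/observation series (invented numbers) reduce to a few lines.

```python
# Illustrative sketch only: two of the traditional accuracy-based measures
# computed by verification toolkits like LVT; the paired series is invented.

model = [1.0, 2.0, 3.0, 4.0]  # simulated values
obs = [1.5, 1.5, 3.5, 4.5]    # matched observations

n = len(obs)
bias = sum(m - o for m, o in zip(model, obs)) / n
rmse = (sum((m - o) ** 2 for m, o in zip(model, obs)) / n) ** 0.5
```

    LVT's contribution is doing this systematically across scales and datasets in their native formats, alongside the uncertainty, information-theory and spatial-similarity diagnostics listed above.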

  11. Land surface Verification Toolkit (LVT – a generalized framework for land surface model evaluation

    Directory of Open Access Journals (Sweden)

    S. V. Kumar

    2012-02-01

    Full Text Available Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) is described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it supports hydrological data products from non-LIS environments as well. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation are supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  12. Designing the Business Models for Circular Economy—Towards the Conceptual Framework

    Directory of Open Access Journals (Sweden)

    Mateusz Lewandowski

    2016-01-01

    Full Text Available Switching from the current linear model of economy to a circular one has recently attracted increased attention from major global companies (e.g., Google, Unilever, Renault) and policymakers attending the World Economic Forum. The reasons for this are the huge financial, social and environmental benefits. However, the global shift from one model of economy to another also concerns smaller companies on a micro-level. Thus, comprehensive knowledge on designing circular business models is needed to stimulate and foster implementation of the circular economy. Existing business models for the circular economy have limited transferability and there is no comprehensive framework supporting every kind of company in designing a circular business model. This study employs a literature review to identify and classify the circular economy characteristics according to a business model structure. The investigation in the eight sub-domains of research on circular business models was used to redefine the components of the business model canvas in the context of the circular economy. Two new components—the take-back system and adoption factors—have been identified, thereby leading to the conceptualization of an extended framework for the circular business model canvas. Additionally, the triple fit challenge has been recognized as an enabler of the transition towards a circular business model. Some directions for further research have been outlined, as well.

  13. Land surface Verification Toolkit (LVT – a generalized framework for land surface model evaluation

    Directory of Open Access Journals (Sweden)

    S. V. Kumar

    2012-06-01

    Full Text Available Model evaluation and verification are key in improving the usage and applicability of simulation models for real-world applications. In this article, the development and capabilities of a formal system for land surface model evaluation called the Land surface Verification Toolkit (LVT) is described. LVT is designed to provide an integrated environment for systematic land model evaluation and facilitates a range of verification approaches and analysis capabilities. LVT operates across multiple temporal and spatial scales and employs a large suite of in-situ, remotely sensed and other model and reanalysis datasets in their native formats. In addition to the traditional accuracy-based measures, LVT also includes uncertainty and ensemble diagnostics, information theory measures, spatial similarity metrics and scale decomposition techniques that provide novel ways for performing diagnostic model evaluations. Though LVT was originally designed to support the land surface modeling and data assimilation framework known as the Land Information System (LIS), it supports hydrological data products from non-LIS environments as well. In addition, the analysis of diagnostics from various computational subsystems of LIS including data assimilation, optimization and uncertainty estimation are supported within LVT. Together, LIS and LVT provide a robust end-to-end environment for enabling the concepts of model data fusion for hydrological applications. The evolving capabilities of LVT framework are expected to facilitate rapid model evaluation efforts and aid the definition and refinement of formal evaluation procedures for the land surface modeling community.

  14. Adoption of mobile learning among 3G-enabled handheld users using an extended technology acceptance model

    Directory of Open Access Journals (Sweden)

    Fadare Oluwaseun Gbenga

    2013-12-01

    Full Text Available This paper examines various constructs of an extended Technology Acceptance Model (TAM) that theoretically influence the adoption and acceptability of mobile learning among 3G-enabled mobile users. The activity-based mobile learning tasks used for this study were drawn from the behaviourist and “learning and teaching support” educational paradigms. Online and manual survey instruments were used to gather data. Structural equation modelling techniques were then employed to explain the adoption processes of the hypothesized research model. A theoretical model, ETAM, was developed based on TAM. Our results show that the psychometric constructs of TAM can be extended, and that ETAM is well suited as a pedagogical tool for understanding mobile learning among users of 3G-enabled handheld devices in the southwestern part of Nigeria. Cognitive constructs, attitude toward m-learning, and self-efficacy play significant roles in influencing behavioural intention toward mobile learning, of which self-efficacy is the most important construct. Implications of the results and directions for future research are discussed.

  15. A modeling framework for system restoration from cascading failures.

    Science.gov (United States)

    Liu, Chaoran; Li, Daqing; Zio, Enrico; Kang, Rui

    2014-01-01

    System restoration from cascading failures is an integral part of the overall defense against catastrophic breakdown in networked critical infrastructures. From the outbreak of cascading failures to the complete breakdown of the system, actions can be taken to prevent failure propagation through the entire network. While most analysis efforts have focused on the period before or after cascading failures, restoration during cascading failures has rarely been studied. In this paper, we present a modeling framework to investigate the effects of in-process restoration, which depend strongly on the timing and strength of the restoration actions. Furthermore, the model also considers the additional disturbances to the system caused by the restoration actions themselves. We demonstrate that the effect of restoration is also influenced by the combination of system loading level and restoration disturbance. Our modeling framework will help to provide insights into practical restoration from cascading failures and to guide improvements in the reliability and resilience of actual network systems.
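Why restoration timing and restoration disturbance matter can be caricatured with a toy global load-sharing cascade. This sketch is invented for this entry, not the authors' network model; every parameter value and the load-shedding restoration rule are assumptions.

```python
def simulate(n=10, total_load=12.0, shed_time=None, shed_amount=3.0,
             disturbance=0.0):
    """Global load-sharing cascade with an optional in-process restoration.

    Nodes share the total load equally; a node fails when its share
    exceeds its capacity, which raises everyone else's share (the
    cascade).  At step `shed_time` a restoration action sheds load, but
    also injects a `disturbance` load of its own.  Returns the number of
    surviving nodes once the cascade settles.
    """
    capacity = [1.0 + 0.1 * i for i in range(n)]   # heterogeneous capacities
    alive = set(range(n))
    load, step = total_load, 0
    while alive:
        if shed_time is not None and step == shed_time:
            load += disturbance - shed_amount      # restoration, with side effect
        share = load / len(alive)
        failed = {i for i in alive if capacity[i] < share}
        if not failed:
            break
        alive -= failed
        step += 1
    return len(alive)
```

With these numbers, restoring at step 1 halts the cascade with 8 survivors, restoring one step later saves nothing, and a restoration disturbance of 1.0 turns the early restoration into a collapse — the timing/strength/disturbance interplay the abstract describes.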

  16. Viewpoints: a framework for object oriented database modelling and distribution

    Directory of Open Access Journals (Sweden)

    Fouzia Benchikha

    2006-01-01

    Full Text Available The viewpoint concept has received widespread attention recently. Its integration into a data model improves the flexibility of the conventional object-oriented data model and allows one to improve the modelling power of objects. The viewpoint paradigm can be used as a means of providing multiple descriptions of an object and as a means of mastering the complexity of current database systems enabling them to be developed in a distributed manner. The contribution of this paper is twofold: to define an object data model integrating viewpoints in databases and to present a federated database system integrating multiple sources following a local-as-extended-view approach.

  17. Enabling Real-time Water Decision Support Services Using Model as a Service

    Science.gov (United States)

    Zhao, T.; Minsker, B. S.; Lee, J. S.; Salas, F. R.; Maidment, D. R.; David, C. H.

    2014-12-01

    Through the application of computational methods and an integrated information system, data and river modeling services can help researchers and decision makers more rapidly understand river conditions under alternative scenarios. To enable this capability, workflows (i.e., analysis and model steps) are created and published as Web services delivered through an internet browser, including model inputs, a published workflow service, and visualized outputs. The RAPID model, a river routing model developed at the University of Texas at Austin for parallel computation of river discharge, has been implemented as a workflow and published as a Web application. This allows non-technical users to remotely execute the model and visualize results as a service through a simple Web interface. The model service and Web application have been prototyped in the San Antonio and Guadalupe River Basins in Texas, with input from university and agency partners. In the future, optimization model workflows will be developed to link with the RAPID model workflow to provide real-time water allocation decision support services.
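The "model as a service" pattern — wrap a model, expose it over HTTP, let a thin client post inputs and receive outputs — can be sketched with Python's standard library. Everything here is a hypothetical stand-in invented for illustration: the one-line "routing" function, the JSON schema, and the service class are unrelated to the actual RAPID service.

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

def toy_routing_model(inflows, k=0.5):
    """Stand-in for a river-routing step (NOT the RAPID equations):
    discharge out of each reach is a fixed fraction k of its inflow."""
    return [k * q for q in inflows]

class ModelService(BaseHTTPRequestHandler):
    """Expose the model as a web service: POST a JSON payload of inflows
    and receive the routed discharges as JSON."""
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        payload = json.loads(self.rfile.read(length))
        result = toy_routing_model(payload["inflows"])
        body = json.dumps({"discharge": result}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):   # keep the demo quiet
        pass

def serve_once(port=0):
    """Start the service on an OS-chosen port; returns (server, port)."""
    server = HTTPServer(("127.0.0.1", port), ModelService)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server, server.server_address[1]
```

The point of the pattern is that the client needs only HTTP and JSON — no model code, solvers, or input formats — which is what lets non-technical users run a model from a browser.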

  18. Model Transformation for Model Driven Development of Semantic Web Enabled Multi-Agent Systems

    NARCIS (Netherlands)

    Kardas, G.; Göknil, Arda; Dikenelli, O.; Topaloglu, N.Y.; Weyns, D.; Holvoet, T.

    2007-01-01

    Model Driven Development (MDD) provides an infrastructure that simplifies Multi-agent System (MAS) development by increasing the abstraction level. In addition to defining models, transformation process for those models is also crucial in MDD. On the other hand, MAS modeling should also take care of

  19. Model Transformation for Model Driven Development of Semantic Web Enabled Multi-Agent Systems

    NARCIS (Netherlands)

    Kardas, G.; Göknil, A.; Dikenelli, O.; Topaloglu, N.Y.

    2007-01-01

    Model Driven Development (MDD) provides an infrastructure that simplifies Multi-agent System (MAS) development by increasing the abstraction level. In addition to defining models, transformation process for those models is also crucial in MDD. On the other hand, MAS modeling should also take care of

  20. A new model for enabling innovation in appropriate technology for sustainable development

    Directory of Open Access Journals (Sweden)

    Joshua Pearce

    2012-08-01

    Full Text Available The task of providing for basic human necessities such as food, water, shelter, and employment is growing as the world’s population continues to expand amid climate destabilization. One of the greatest challenges to development and innovation is access to relevant knowledge for quick technological dissemination. However, with the rise and application of advanced information technologies there is a great opportunity for knowledge building, community interaction, innovation, and collaboration using various online platforms. This article examines the potential of a novel model to enable innovation for collaborative enterprise, learning, and appropriate technology development on a global scale.

  1. Mechanisms of Soil Aggregation: a biophysical modeling framework

    Science.gov (United States)

    Ghezzehei, T. A.; Or, D.

    2016-12-01

    Soil aggregation is one of the main crosscutting concepts in all sub-disciplines and applications of soil science from agriculture to climate regulation. The concept generally refers to adhesion of primary soil particles into distinct units that remain stable when subjected to disruptive forces. It is one of the most sensitive soil qualities that readily respond to disturbances such as cultivation, fire, drought, flooding, and changes in vegetation. These changes are commonly quantified and incorporated in soil models indirectly as alterations in carbon content and type, bulk density, aeration, permeability, as well as water retention characteristics. Soil aggregation that is primarily controlled by organic matter generally exhibits hierarchical organization of soil constituents into stable units that range in size from a few microns to centimeters. However, this conceptual model of soil aggregation as the key unifying mechanism remains poorly quantified and is rarely included in predictive soil models. Here we provide a biophysical framework for quantitative and predictive modeling of soil aggregation and its attendant soil characteristics. The framework treats aggregates as hotspots of biological, chemical and physical processes centered around roots and root residue. We keep track of the life cycle of an individual aggregate from it genesis in the rhizosphere, fueled by rhizodeposition and mediated by vigorous microbial activity, until its disappearance when the root-derived resources are depleted. The framework synthesizes current understanding of microbial life in porous media; water holding and soil binding capacity of biopolymers; and environmental controls on soil organic matter dynamics. The framework paves a way for integration of processes that are presently modeled as disparate or poorly coupled processes, including storage and protection of carbon, microbial activity, greenhouse gas fluxes, movement and storage of water, resistance of soils against

  2. Bayesian-based Project Monitoring: Framework Development and Model Testing

    Directory of Open Access Journals (Sweden)

    Budi Hartono

    2015-12-01

    Full Text Available During project implementation, risk becomes an integral part of project monitoring. Therefore, a tool that can dynamically include elements of risk in project progress monitoring is needed. The objective of this study is to develop a general framework that addresses this concern. The developed framework consists of three interrelated major building blocks, namely the Risk Register (RR), the Bayesian Network (BN), and the Project Time Network (PTN), for dynamic project monitoring. RR is used to list and categorize identified project risks. PTN is utilized for modeling the relationships between project activities. BN is used to reflect the interdependence among risk factors and to bridge RR and PTN. A residential development project is chosen as a working example, and the result shows that the proposed framework has been successfully applied. The specific model of the development project was also successfully developed and used to monitor the project progress. It is shown in this study that the proposed BN-based model provides superior performance in terms of forecast accuracy compared to the extant models.
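The Bayesian updating step at the heart of such monitoring can be shown on a two-node slice of a network (Risk → Activity delayed). This is an invented toy, far smaller than the framework's actual BN; all probabilities and the duration numbers are assumptions for illustration.

```python
def posterior_risk(prior, p_delay_if_risk, p_delay_if_no_risk, delayed):
    """Bayes' rule on a two-node network (Risk -> Activity delayed):
    update the belief that a risk factor is active after observing
    whether a monitored activity slipped."""
    if delayed:
        num = p_delay_if_risk * prior
        den = num + p_delay_if_no_risk * (1.0 - prior)
    else:
        num = (1.0 - p_delay_if_risk) * prior
        den = num + (1.0 - p_delay_if_no_risk) * (1.0 - prior)
    return num / den

def forecast_duration(base_days, extra_days, p_risk):
    """Expected remaining duration given the current belief about the risk."""
    return base_days + extra_days * p_risk
```

With a 0.3 prior and delay likelihoods of 0.8 versus 0.2, a single observed delay raises the risk belief to about 0.63 and lengthens the duration forecast accordingly — monitoring that reacts to evidence rather than to the plan alone.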

  3. Ames Culture Chamber System: Enabling Model Organism Research Aboard the International Space Station

    Science.gov (United States)

    Steele, Marianne

    2014-01-01

    Understanding the genetic, physiological, and behavioral effects of spaceflight on living organisms and elucidating the molecular mechanisms that underlie these effects are high priorities for NASA. Certain organisms, known as model organisms, are widely studied to help researchers better understand how all biological systems function. Small model organisms such as nematodes, slime mold, bacteria, green algae, yeast, and moss can be used to study the effects of micro- and reduced gravity at both the cellular and systems level over multiple generations. Many model organisms have sequenced genomes and published data sets on their transcriptomes and proteomes that enable scientific investigations of the molecular mechanisms underlying the adaptations of these organisms to space flight.

  4. A Building Model Framework for a Genetic Algorithm Multi-objective Model Predictive Control

    DEFF Research Database (Denmark)

    Arendt, Krzysztof; Ionesi, Ana; Jradi, Muhyiddine

    2016-01-01

    Mock-Up Interface, which is used to link the models with the MPC system. The framework was used to develop and run initial thermal and CO2 models. Their performance and the implementation procedure are discussed in the present paper. The framework is going to be implemented in the MPC system planned...

  5. A framework to model real-time databases

    CERN Document Server

    Idoudi, Nizar; Duvallet, Claude; Sadeg, Bruno; Bouaziz, Rafik; Gargouri, Faiez

    2010-01-01

    Real-time databases deal with time-constrained data and time-constrained transactions. The design of this kind of database requires the introduction of new concepts to support both the data structures and the dynamic behaviour of the database. In this paper, we give an overview of different aspects of real-time databases and clarify the requirements of their modelling. Then, we present a framework for real-time database design and describe its fundamental operations. A case study demonstrates the validity of the structural model and illustrates the SQL queries and Java code generated from the classes of the model.

  6. An Integrated Modeling Framework for Probable Maximum Precipitation and Flood

    Science.gov (United States)

    Gangrade, S.; Rastogi, D.; Kao, S. C.; Ashfaq, M.; Naz, B. S.; Kabela, E.; Anantharaj, V. G.; Singh, N.; Preston, B. L.; Mei, R.

    2015-12-01

    With the increasing frequency and magnitude of extreme precipitation and flood events projected in the future climate, there is a strong need to enhance our modeling capabilities to assess the potential risks to critical energy-water infrastructures such as major dams and nuclear power plants. In this study, an integrated modeling framework is developed through high performance computing to investigate the climate change effects on probable maximum precipitation (PMP) and probable maximum flood (PMF). Multiple historical storms from 1981-2012 over the Alabama-Coosa-Tallapoosa River Basin near the Atlanta metropolitan area are simulated by the Weather Research and Forecasting (WRF) model using the Climate Forecast System Reanalysis (CFSR) forcings. After further WRF model tuning, these storms are used to simulate PMP through moisture maximization at initial and lateral boundaries. A high resolution hydrological model, the Distributed Hydrology-Soil-Vegetation Model, implemented at 90 m resolution and calibrated against U.S. Geological Survey streamflow observations, is then used to simulate the corresponding PMF. In addition to the control simulation that is driven by CFSR, multiple storms from the Community Climate System Model version 4 under the Representative Concentration Pathway 8.5 emission scenario are used to simulate PMP and PMF in the projected future climate conditions. The multiple PMF scenarios developed through this integrated modeling framework may be utilized to evaluate the vulnerability of existing energy-water infrastructures with respect to various aspects of PMP and PMF.

  7. Automatic Model Generation Framework for Computational Simulation of Cochlear Implantation

    DEFF Research Database (Denmark)

    Mangado Lopez, Nerea; Ceresa, Mario; Duchateau, Nicolas

    2016-01-01

    Recent developments in computational modeling of cochlear implantation are promising for studying in silico the performance of the implant before surgery. However, creating a complete computational model of the patient's anatomy while including an external device geometry remains challenging. To address this challenge, we propose an automatic framework for the generation of patient-specific meshes for finite element modeling of the implanted cochlea. First, a statistical shape model is constructed from high-resolution anatomical μCT images. Then, by fitting the statistical model to a patient's CT image, an accurate model of the patient-specific cochlea anatomy is obtained. An algorithm based on the parallel transport frame is employed to perform the virtual insertion of the cochlear implant. Our automatic framework also incorporates the surrounding bone and nerve fibers and assigns...

  8. A Structural Model Decomposition Framework for Systems Health Management

    Science.gov (United States)

    Roychoudhury, Indranil; Daigle, Matthew J.; Bregon, Anibal; Pulido, Belamino

    2013-01-01

    Systems health management (SHM) is an important set of technologies aimed at increasing system safety and reliability by detecting, isolating, and identifying faults; and predicting when the system reaches end of life (EOL), so that appropriate fault mitigation and recovery actions can be taken. Model-based SHM approaches typically make use of global, monolithic system models for online analysis, which results in a loss of scalability and efficiency for large-scale systems. Improvement in scalability and efficiency can be achieved by decomposing the system model into smaller local submodels and operating on these submodels instead. In this paper, the global system model is analyzed offline and structurally decomposed into local submodels. We define a common model decomposition framework for extracting submodels from the global model. This framework is then used to develop algorithms for solving model decomposition problems for the design of three separate SHM technologies, namely, estimation (which is useful for fault detection and identification), fault isolation, and EOL prediction. We solve these model decomposition problems using a three-tank system as a case study.
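The core idea — replace one monolithic model with local submodels extracted offline — can be sketched as backward reachability over a variable dependency graph. This is an invented toy, far simpler than the paper's structural decomposition; the variable names below are hypothetical.

```python
def extract_submodel(equations, targets):
    """Return only the equations needed to compute `targets`.

    `equations` maps each computed variable to the set of variables its
    equation depends on; variables without an entry are measured inputs.
    Walking the dependencies backwards yields a local submodel that can
    be run (and monitored) independently of the rest of the system.
    """
    needed, stack = {}, list(targets)
    while stack:
        v = stack.pop()
        if v in needed or v not in equations:
            continue                      # already collected, or an input
        needed[v] = equations[v]
        stack.extend(equations[v])
    return needed
```

For two independently instrumented tanks, the submodel for sensor y1 contains only the equations for y1 and its tank level h1, so a fault detector for y1 never has to evaluate the rest of the global model — the scalability gain the abstract describes.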

  9. A Simulink simulation framework of a MagLev model

    Energy Technology Data Exchange (ETDEWEB)

    Boudall, H.; Williams, R.D.; Giras, T.C. [University of Virginia, Charlottesville (United States). School of Engineering and Applied Science

    2003-09-01

    This paper presents a three-degree-of-freedom model of a section of the magnetically levitated train MagLev. The MagLev system dealt with in this article utilizes electromagnetic levitation. Each MagLev vehicle section is viewed as two separate parts, namely a body and a chassis, coupled by a set of springs and dampers. The MagLev model includes the propulsion, the guidance and the levitation systems. The equations of motion are developed. A Simulink simulation framework is implemented in order to study the interaction between the different systems and the dynamics of a MagLev vehicle. The simulation framework will eventually serve as a tool to assist the design and development of the MagLev system in the United States of America. (author)
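The body-chassis coupling can be illustrated with a one-degree-of-freedom slice of such a model: two masses joined by a spring-damper pair, integrated with semi-implicit Euler. The masses, stiffness, and damping values below are invented for illustration, not the paper's parameters.

```python
def simulate_chassis_body(m_b=1.0, m_c=1.0, k=50.0, c=5.0,
                          x0=0.01, steps=2000, dt=0.001):
    """Semi-implicit Euler integration of a body coupled to a chassis by
    a spring (stiffness k) and damper (coefficient c) — a 1-DOF slice of
    the 3-DOF section model.  Starting from an initial relative offset
    x0, returns the final relative displacement body minus chassis."""
    xb, vb, xc, vc = x0, 0.0, 0.0, 0.0
    for _ in range(steps):
        f = k * (xc - xb) + c * (vc - vb)   # coupling force on the body
        ab, ac = f / m_b, -f / m_c          # equal and opposite on chassis
        vb += ab * dt
        vc += ac * dt
        xb += vb * dt
        xc += vc * dt
    return xb - xc
```

With these values the relative motion is underdamped (damping ratio 0.5 in the relative coordinate) and the initial 1 cm offset decays to a negligible level within the simulated two seconds.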

  10. Modelling Framework and Assistive Device for Peripheral Intravenous Injections

    Science.gov (United States)

    Kam, Kin F.; Robinson, Martin P.; Gilbert, Mathew A.; Pelah, Adar

    2016-02-01

    Intravenous access for blood sampling or drug administration that requires peripheral venepuncture is perhaps the most common invasive procedure practiced in hospitals, clinics and general practice surgeries. We describe an idealised mathematical framework for modelling the dynamics of the peripheral venepuncture process. Basic assumptions of the model are confirmed through motion analysis of needle trajectories during venepuncture, taken from video recordings of a skilled practitioner injecting into a practice kit. The framework is also applied to the design and construction of a proposed device for accurate needle guidance during venepuncture administration, which was assessed to be consistent and repeatable in application and not to lead to over-puncture. The study provides insights into the ubiquitous peripheral venepuncture process and may contribute to applications in training and in the design of new devices, including for use in robotic automation.

  11. Integrating knowledge seeking into knowledge management models and frameworks

    Directory of Open Access Journals (Sweden)

    Francois Lottering

    2012-02-01

    Full Text Available Background: A striking feature of the knowledge management (KM literature is that the standard list of KM processes either subsumes or overlooks the process of knowledge seeking. Knowledge seeking is manifestly under-theorised, making the need to address this gap in KM theory and practice clear and urgent.Objectives: This article investigates the theoretical status of the knowledge-seeking process in extant KM models and frameworks. It also statistically describes knowledge seeking and knowledge sharing practices in a sample of South African companies. Using this data, it proposes a KM model based on knowledge seeking.Method: Knowledge seeking is traced in a number of KM models and frameworks with a specific focus on Han Lai and Margaret Graham’s adapted KM cycle model, which separates knowledge seeking from knowledge sharing. This empirical investigation used a questionnaire to examine knowledge seeking and knowledge sharing practices in a sample of South African companies.Results: This article critiqued and elaborated on the adapted KM cycle model of Lai and Graham. It identified some of the key features of knowledge seeking practices in the workplace. It showed that knowledge seeking and sharing are human-centric actions and that seeking knowledge uses trust and loyalty as its basis. It also showed that one cannot separate knowledge seeking from knowledge sharing.Conclusion: The knowledge seeking-based KM model elaborates on Lai and Graham’s model. It provides insight into how and where people seek and share knowledge in the workplace. The article concludes that it is necessary to cement the place of knowledge seeking in KM models as well as frameworks and suggests that organisations should apply its findings to improving their knowledge management strategies. 

  12. Genomic-Enabled Prediction in Maize Using Kernel Models with Genotype × Environment Interaction

    Science.gov (United States)

    Bandeira e Sousa, Massaine; Cuevas, Jaime; de Oliveira Couto, Evellyn Giselly; Pérez-Rodríguez, Paulino; Jarquín, Diego; Fritsche-Neto, Roberto; Burgueño, Juan; Crossa, Jose

    2017-01-01

    Multi-environment trials are routinely conducted in plant breeding to select candidates for the next selection cycle. In this study, we compare the prediction accuracy of four developed genomic-enabled prediction models: (1) single-environment, main genotypic effect model (SM); (2) multi-environment, main genotypic effects model (MM); (3) multi-environment, single variance G×E deviation model (MDs); and (4) multi-environment, environment-specific variance G×E deviation model (MDe). Each of these four models was fitted using two kernel methods: a linear kernel, the Genomic Best Linear Unbiased Predictor (GBLUP, or GB), and a nonlinear kernel, the Gaussian kernel (GK). The eight model-method combinations were applied to two extensive Brazilian maize data sets (the HEL and USP data sets), having different numbers of maize hybrids evaluated in different environments for grain yield (GY), plant height (PH), and ear height (EH). Results show that the MDe and the MDs models fitted with the Gaussian kernel (MDe-GK and MDs-GK) had the highest prediction accuracy. For GY in the HEL data set, the increase in prediction accuracy of SM-GK over SM-GB ranged from 9 to 32%. For the MM, MDs, and MDe models, the increase in prediction accuracy of GK over GB ranged from 9 to 49%. For GY in the USP data set, the increase in prediction accuracy of SM-GK over SM-GB ranged from 0 to 7%. For the MM, MDs, and MDe models, the increase in prediction accuracy of GK over GB ranged from 34 to 70%. For traits PH and EH, gains in prediction accuracy of models with GK compared to models with GB were smaller than those achieved in GY. Also, these gains in prediction accuracy decreased when a more difficult prediction problem was studied. PMID:28455415
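The two kernels being compared can be written down compactly from a marker matrix. This is an illustrative sketch with a toy 3-marker example; in particular, dividing the squared distance by the marker count is a simplistic bandwidth choice, not the paper's exact Gaussian kernel specification.

```python
import math

def linear_kernel(X):
    """GBLUP-style linear kernel: G = X X' / p for p markers (coded -1/0/1)."""
    p = len(X[0])
    return [[sum(a * b for a, b in zip(xi, xj)) / p for xj in X] for xi in X]

def gaussian_kernel(X, h=1.0):
    """Gaussian kernel: K_ij = exp(-h * d_ij^2 / p), where d_ij is the
    Euclidean distance between marker profiles (p = marker count used as
    a crude bandwidth normalization here)."""
    p = len(X[0])
    K = []
    for xi in X:
        row = []
        for xj in X:
            d2 = sum((a - b) ** 2 for a, b in zip(xi, xj))
            row.append(math.exp(-h * d2 / p))
        K.append(row)
    return K
```

The linear kernel captures only additive marker effects, while the Gaussian kernel decays nonlinearly with genetic distance, which is the mechanism behind the GK models' accuracy gains reported above.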

  13. Certified reduced basis model validation: A frequentistic uncertainty framework

    OpenAIRE

    Patera, A. T.; Huynh, Dinh Bao Phuong; Knezevic, David; Patera, Anthony T.

    2011-01-01

    We introduce a frequentistic validation framework for assessment — acceptance or rejection — of the consistency of a proposed parametrized partial differential equation model with respect to (noisy) experimental data from a physical system. Our method builds upon the Hotelling T² statistical hypothesis test for bias first introduced by Balci and Sargent in 1984 and subsequently extended by McFarland and Mahadevan (2008). Our approach introduces two new elements: a spectral repre...

  14. Comparison between linear and non-parametric regression models for genome-enabled prediction in wheat.

    Science.gov (United States)

    Pérez-Rodríguez, Paulino; Gianola, Daniel; González-Camacho, Juan Manuel; Crossa, José; Manès, Yann; Dreisigacker, Susanne

    2012-12-01

    In genome-enabled prediction, parametric, semi-parametric, and non-parametric regression models have been used. This study assessed the predictive ability of linear and non-linear models using dense molecular markers. The linear models were linear on marker effects and included the Bayesian LASSO, Bayesian ridge regression, Bayes A, and Bayes B. The non-linear models (this refers to non-linearity on markers) were reproducing kernel Hilbert space (RKHS) regression, Bayesian regularized neural networks (BRNN), and radial basis function neural networks (RBFNN). These statistical models were compared using 306 elite wheat lines from CIMMYT genotyped with 1717 diversity array technology (DArT) markers and two traits, days to heading (DTH) and grain yield (GY), measured in each of 12 environments. It was found that the three non-linear models had better overall prediction accuracy than the linear regression specification. Results showed a consistent superiority of RKHS and RBFNN over the Bayesian LASSO, Bayesian ridge regression, Bayes A, and Bayes B models.

  15. Common and Innovative Visuals: A sparsity modeling framework for video.

    Science.gov (United States)

    Abdolhosseini Moghadam, Abdolreza; Kumar, Mrityunjay; Radha, Hayder

    2014-05-02

    Efficient video representation models are critical for many video analysis and processing tasks. In this paper, we present a framework based on the concept of finding the sparsest solution to model video frames. To model the spatio-temporal information, frames from one scene are decomposed into two components: (i) a common frame, which describes the visual information common to all the frames in the scene/segment, and (ii) a set of innovative frames, which depicts the dynamic behaviour of the scene. The proposed approach exploits and builds on recent results in the field of compressed sensing to jointly estimate the common frame and the innovative frames for each video segment. We refer to the proposed modeling framework by CIV (Common and Innovative Visuals). We show how the proposed model can be utilized to find scene change boundaries and extend CIV to videos from multiple scenes. Furthermore, the proposed model is robust to noise and can be used for various video processing applications without relying on motion estimation and detection or image segmentation. Results for object tracking, video editing (object removal, inpainting) and scene change detection are presented to demonstrate the efficiency and the performance of the proposed model.

  16. Dynamic modelling of household automobile transactions within a microsimulation framework

    Energy Technology Data Exchange (ETDEWEB)

    Mohammadian, A.

    2002-07-01

    This thesis presents a newly developed dynamic model of household automobile transactions within an integrated land-use, transportation and environment (ILUTE) modeling system framework. It is a market-based decision-making tool for individuals who must choose between adding a new vehicle to a fleet, disposing of a vehicle, trading one of the vehicles in a fleet, or doing nothing. Different approaches were used within the model, including an artificial neural network, hedonic price, regression, and vehicle class and vintage choice models. The model can also predict the complex behaviour of individuals deciding to become active in the market. An estimation approach was used to incorporate the vehicle type choice model into the main dynamic transaction choice model.

  17. A framework for the calibration of social simulation models

    CERN Document Server

    Ciampaglia, Giovanni Luca

    2013-01-01

    Simulation with agent-based models is increasingly used in the study of complex socio-technical systems and in social simulation in general. This paradigm offers a number of attractive features, namely the possibility of modeling emergent phenomena within large populations. As a consequence, often the quantity in need of calibration may be a distribution over the population whose relation with the parameters of the model is analytically intractable. Nevertheless, we can simulate. In this paper we present a simulation-based framework for the calibration of agent-based models with distributional output based on indirect inference. We illustrate our method step by step on a model of norm emergence in an online community of peer production, using data from three large Wikipedia communities. Model fit and diagnostics are discussed.
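The simulation-based calibration loop can be caricatured in a few lines: simulate the model at candidate parameters, summarize its distributional output, and keep the parameter whose summary best matches the observed one. This is a deliberately tiny stand-in for indirect inference; the "model", its parameter grid, and the single summary statistic are all invented for illustration.

```python
import random

def simulate_model(p, n=1000, seed=0):
    """Toy agent-based model: each of n agents adopts a norm with
    probability p; the distributional output is summarised here by the
    adoption fraction across the population."""
    rng = random.Random(seed)
    return sum(rng.random() < p for _ in range(n)) / n

def calibrate(observed_summary, grid, n=1000, seed=0):
    """Indirect-inference flavoured calibration: among candidate
    parameters, pick the one whose simulated summary statistic lies
    closest to the observed summary."""
    return min(grid, key=lambda p: abs(simulate_model(p, n, seed)
                                       - observed_summary))
```

Because the map from parameters to the population-level distribution is only available by simulation, the comparison runs on simulated versus observed summaries rather than on an analytical likelihood — the situation the abstract describes.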

  18. Population balance models: a useful complementary modelling framework for future WWTP modelling.

    Science.gov (United States)

    Nopens, Ingmar; Torfs, Elena; Ducoste, Joel; Vanrolleghem, Peter A; Gernaey, Krist V

    2015-01-01

    Population balance models (PBMs) represent a powerful modelling framework for describing the dynamics of properties that are characterised by distributions. The value of describing such distributions under transient conditions has been demonstrated in many chemical engineering applications. Modelling efforts for several current and future unit processes in wastewater treatment plants could potentially benefit from this framework, especially when distributed dynamics have a significant impact on the overall unit process performance. In these cases, current models that rely on average properties cannot sufficiently capture the true behaviour and may even lead to completely wrong conclusions. Examples of distributed properties are bubble size, floc size, crystal size and granule size. In these cases, PBMs can be used to develop new knowledge that can be embedded in our current models to improve their predictive capability. Hence, PBMs should be regarded as a complementary modelling framework to biokinetic models. This paper provides an overview of current applications, future potential and limitations of PBMs in the field of wastewater treatment modelling, thereby looking over the fence to other scientific disciplines.
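A minimal discretized example shows what a population balance tracks that an average-property model cannot. The size grid, breakage rule, and rate below are invented for illustration; real PBMs solve far richer breakage and coalescence kernels.

```python
def breakage_step(counts, sizes, rate=0.1, dt=1.0):
    """One explicit Euler step of a binary-breakage population balance.

    Size classes form a geometric grid (sizes[k] = 2 * sizes[k-1]), so a
    particle of class k breaks into two daughters of class k-1 at
    frequency `rate`.  Total mass, sum(counts[k] * sizes[k]), is
    conserved exactly by construction.
    """
    new = list(counts)
    for k in range(1, len(sizes)):        # class 0 cannot break further
        broken = rate * counts[k] * dt
        new[k] -= broken
        new[k - 1] += 2.0 * broken        # two half-size daughters
    return new
```

Iterating the step shifts the number distribution toward smaller classes while holding total mass fixed — exactly the kind of distributed, transient behaviour (e.g., evolving floc or granule size spectra) that motivates using PBMs alongside biokinetic models.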

  19. Cultural Resources as Sustainability Enablers: Towards a Community-Based Cultural Heritage Resources Management (COBACHREM) Model

    Directory of Open Access Journals (Sweden)

    Susan O. Keitumetse

    2013-12-01

    Full Text Available People inhabit and change environments using socio-cultural and psycho-social behaviors and processes. People use their socio-cultural understanding of phenomena to interact with the environment. People are carriers of cultural heritage. These characteristics make cultural values ubiquitous in all people-accessed and people-inhabited geographic spaces of the world, making people readily available assets through which environmental sustainability can be implemented. Yet, people’s conservation development is rarely planned using cultural resources. It is against this background that a Community-Based Cultural Heritage Resources Management (COBACHREM) model is initiated as a new approach that outlines the symbiosis between cultural heritage, the environment and various stakeholders, with a view to creating awareness of neglected conservation indicators inherent in cultural resources that are well placed to complement already existing natural resources conservation indicators. The model constitutes a two-phased process with four (04) levels of operation, namely level I (production), level II (reproduction) and level III (consumption), which distinguish specific components of cultural heritage resources to be monitored at level IV for sustainability using identified cultural conservation indicators. The monitored indicators, which are limitless, constitute the work in progress of the model and will be constantly reviewed, renewed and updated through time. One example of monitoring provided in this article is the development of a cultural competency-based training curriculum that will assist communities to transform cultural information into certifiable intellectual (educational) and culture-economic (tourism) assets. Another monitoring example is the mainstreaming of community cultural qualities into already existing environmental conservation frameworks such as eco-certification to infuse new layers of conservation indicators that enrich resource sustainability.

  20. Population Balance Models: A useful complementary modelling framework for future WWTP modelling

    DEFF Research Database (Denmark)

    Nopens, Ingmar; Torfs, Elena; Ducoste, Joel

    2014-01-01

    processes in WWTPs could potentially benefit from this framework, especially when distributed dynamics have a significant impact on the overall unit process performance. In these cases, current models that rely on average properties cannot sufficiently capture the true behaviour. Examples are bubble size...

  1. An Integrated Snow Radiance and Snow Physics Modeling Framework for Cold Land Surface Modeling

    Science.gov (United States)

    Kim, Edward J.; Tedesco, Marco

    2006-01-01

    Recent developments in forward radiative transfer modeling and physical land surface modeling are converging to allow the assembly of an integrated snow/cold lands modeling framework for land surface modeling and data assimilation applications. The key elements of this framework include: a forward radiative transfer model (FRTM) for snow, a snowpack physical model, a land surface water/energy cycle model, and a data assimilation scheme. Together these form a flexible framework for self-consistent remote sensing and water/energy cycle studies. In this paper we will describe the elements and the integration plan. Each element of this framework is modular so the choice of element can be tailored to match the emphasis of a particular study. For example, within our framework, four choices of an FRTM are available to simulate the brightness temperature of snow. Two models are available to model the physical evolution of the snowpack and underlying soil, and two models are available to handle the water/energy balance at the land surface. Since the framework is modular, other models, physical or statistical, can be accommodated, too. All modules will operate within the framework of the Land Information System (LIS), a land surface modeling framework with data assimilation capabilities running on a parallel-node computing cluster at the NASA Goddard Space Flight Center. The advantages of such an integrated modular framework built on the LIS will be described through examples, e.g., studies to analyze snow field experiment observations, and simulations of future satellite missions for snow and cold land processes.

  2. Enabling model checking for collaborative process analysis: from BPMN to `Network of Timed Automata'

    Science.gov (United States)

    Mallek, Sihem; Daclin, Nicolas; Chapurlat, Vincent; Vallespir, Bruno

    2015-04-01

    Interoperability is a prerequisite for partners involved in performing collaboration. As a consequence, the lack of interoperability is now considered a major obstacle. The research work presented in this paper aims to develop an approach that allows specifying and verifying a set of interoperability requirements to be satisfied by each partner in the collaborative process prior to process implementation. To enable the verification of these interoperability requirements, it is necessary first and foremost to generate a model of the targeted collaborative process; for this research effort, the standardised language BPMN 2.0 is used. Afterwards, a verification technique must be introduced, and model checking is the preferred option herein. This paper focuses on application of the model checker UPPAAL in order to verify interoperability requirements for the given collaborative process model. At first, this step entails translating the collaborative process model from BPMN into a UPPAAL modelling language called 'Network of Timed Automata'. Second, it becomes necessary to formalise interoperability requirements into properties with the dedicated UPPAAL language, i.e. the temporal logic TCTL.

  3. Enabling HCCI modeling: The RIOT/CMCS Web Service for Automatic Reaction Mechanism Reduction

    Energy Technology Data Exchange (ETDEWEB)

    Oluwole, O; Pitz, W J; Schuchardt, K; Rahn, L A; Green, Jr., W H; Leahy, D; Pancerella, C; Sjöberg, M; Dec, J

    2005-12-12

    New approaches are being developed to facilitate multidisciplinary collaborative research of Homogeneous Charge Compression Ignition (HCCI) combustion processes. In this paper, collaborative sharing of the Range Identification and Optimization Toolkit (RIOT) and related data and models is discussed. RIOT is a developmental approach to reduce the computational complexity of detailed chemical kinetic mechanisms, enabling their use in modeling kinetically-controlled combustion applications such as HCCI. These approaches are being developed and piloted as a part of the Collaboratory for Multiscale Chemical Sciences (CMCS) project. The capabilities of the RIOT code are shared through a portlet in the CMCS portal that allows easy specification and processing of RIOT inputs, remote execution of RIOT, tracking of data pedigree and translation of RIOT outputs (such as the reduced model) to a table view and to the commonly-used CHEMKIN mechanism format. The reduced model is thus immediately ready to be used for more efficient simulation of the chemically reacting system of interest. This effort is motivated by the need to improve computational efficiency in modeling HCCI systems. Preliminary use of the web service to obtain reduced models for this application has yielded computational speedup factors of up to 20 as presented in this paper.

  4. Spin models inferred from patient data faithfully describe HIV fitness landscapes and enable rational vaccine design

    CERN Document Server

    Shekhar, Karthik; Ferguson, Andrew L; Barton, John P; Kardar, Mehran; Chakraborty, Arup K

    2013-01-01

    Mutational escape from vaccine-induced immune responses has thwarted the development of a successful vaccine against AIDS, whose causative agent is HIV, a highly mutable virus. Knowing the virus' fitness as a function of its proteomic sequence can enable rational design of potent vaccines, as this information can focus vaccine-induced immune responses to target mutational vulnerabilities of the virus. Spin models have been proposed as a means to infer intrinsic fitness landscapes of HIV proteins from patient-derived viral protein sequences. These sequences are the product of non-equilibrium viral evolution driven by patient-specific immune responses, and are subject to phylogenetic constraints. How can such sequence data allow inference of intrinsic fitness landscapes? We combined computer simulations and variational theory à la Feynman to show that, in most circumstances, spin models inferred from patient-derived viral sequences reflect the correct rank order of the fitness of mutant viral strains. Our f...
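
    The inference target can be sketched numerically: an Ising-type spin model assigns each binary mutant sequence an energy from fields h and couplings J, with lower energy corresponding to higher inferred fitness. The fields and couplings below are random stand-ins, not values inferred from real HIV sequence data.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 10                                        # hypothetical protein length
h = rng.normal(0.0, 1.0, L)                   # fields (random stand-ins)
J = np.triu(rng.normal(0.0, 0.3, (L, L)), 1)  # pairwise couplings

def energy(s):
    """Ising-type energy of a binary sequence s (0 = wild type at a site,
    1 = mutant); lower energy corresponds to higher inferred fitness."""
    return -(h @ s) - s @ J @ s

wild_type = np.zeros(L)
single_mutants = np.eye(L)                    # all single-site mutants
fitness_rank = sorted(range(L), key=lambda i: energy(single_mutants[i]))
```

    The robust quantity, per the abstract, is the rank order of mutant energies rather than their absolute values.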

  5. The shared circuits model (SCM): how control, mirroring, and simulation can enable imitation, deliberation, and mindreading.

    Science.gov (United States)

    Hurley, Susan

    2008-02-01

    Imitation, deliberation, and mindreading are characteristically human sociocognitive skills. Research on imitation and its role in social cognition is flourishing across various disciplines. Imitation is surveyed in this target article under the headings of behavior, subpersonal mechanisms, and functions of imitation. A model is then advanced within which many of the developments surveyed can be located and explained. The shared circuits model (SCM) explains how imitation, deliberation, and mindreading can be enabled by subpersonal mechanisms of control, mirroring, and simulation. It is cast at a middle, functional level of description, that is, between the level of neural implementation and the level of conscious perceptions and intentional actions. The SCM connects shared informational dynamics for perception and action with shared informational dynamics for self and other, while also showing how the action/perception, self/other, and actual/possible distinctions can be overlaid on these shared informational dynamics. It avoids the common conception of perception and action as separate and peripheral to central cognition. Rather, it contributes to the situated cognition movement by showing how mechanisms for perceiving action can be built on those for active perception.

    The SCM is developed heuristically, in five layers that can be combined in various ways to frame specific ontogenetic or phylogenetic hypotheses. The starting point is dynamic online motor control, whereby an organism is closely attuned to its embedding environment through sensorimotor feedback. Onto this are layered functions of prediction and simulation of feedback, mirroring, simulation of mirroring, monitored inhibition of motor output, and monitored simulation of input. Finally, monitored simulation of input specifying possible actions plus inhibited mirroring of such possible actions can generate information about the possible as opposed to actual instrumental actions of others, and the

  6. Conceptual model and economic experiments to explain nonpersistence and enable mechanism designs fostering behavioral change.

    Science.gov (United States)

    Djawadi, Behnud Mir; Fahr, René; Turk, Florian

    2014-12-01

    Medical nonpersistence is a worldwide problem of striking magnitude. Although many fields of study, including epidemiology, sociology, and psychology, try to identify determinants of medical nonpersistence, comprehensive research explaining medical nonpersistence from an economics perspective is rather scarce. The aim of the study was to develop a conceptual framework that augments standard economic choice theory with psychological concepts of behavioral economics to understand how patients' preferences for discontinuing therapy arise over the course of the medical treatment. The availability of such a framework allows the targeted design of mechanisms for intervention strategies. Our conceptual framework models the patient as an active economic agent who evaluates the benefits and costs of continuing therapy. We argue that a combination of loss aversion and mental accounting operations explains why patients discontinue therapy at a specific point in time. We designed a randomized laboratory economic experiment with a student subject pool to investigate the behavioral predictions. Subjects continue with therapy as long as experienced utility losses have to be compensated. As soon as previous losses are evened out, subjects perceive the marginal benefit of persistence to be lower than at the beginning of the treatment. Consequently, subjects start to discontinue therapy. Our results highlight that concepts of behavioral economics capture the dynamic structure of medical nonpersistence better than standard economic choice theory does. We recommend that behavioral economics be a mandatory part of the development of possible intervention strategies aimed at improving patients' compliance and persistence behavior. Copyright © 2014 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
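
    The described mechanism, losses booked to a mental account and persistence until they are evened out, can be illustrated with a toy prospect-theory sketch. The value-function parameters are the standard Tversky-Kahneman estimates; the per-period cost and benefit streams are invented for illustration and are not from the experiment.

```python
def value(x, alpha=0.88, lam=2.25):
    """Prospect-theory value function; losses loom larger (lam > 1).
    Parameter values are the standard Tversky-Kahneman estimates."""
    return x ** alpha if x >= 0 else -lam * ((-x) ** alpha)

# Invented per-period side-effect costs and health benefits of a therapy.
costs = [3, 0, 0, 0, 0, 0]
benefits = [0, 3, 4, 4, 4, 4]

account = 0.0        # mental account of experienced utility
quit_period = None
for t, (c, b) in enumerate(zip(costs, benefits)):
    account += value(b) + value(-c)
    if account >= 0:         # earlier losses are evened out: the marginal
        quit_period = t      # benefit of persisting is now perceived as
        break                # lower, so discontinuation starts here
```

    With these illustrative streams the account turns non-negative in period 2, matching the narrative that discontinuation begins once earlier losses are recouped.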

  7. C-HiLasso: A Collaborative Hierarchical Sparse Modeling Framework

    CERN Document Server

    Sprechmann, Pablo; Sapiro, Guillermo; Eldar, Yonina

    2010-01-01

    Sparse modeling is a powerful framework for data analysis and processing. Traditionally, encoding in this framework is performed by solving an L1-regularized linear regression problem, commonly referred to as Lasso or Basis Pursuit. In this work we combine the sparsity-inducing property of the Lasso model at the individual feature level with the block-sparsity property of the Group Lasso model, where sparse groups of features are jointly encoded, obtaining a hierarchically structured sparsity pattern. This results in the Hierarchical Lasso (HiLasso), which shows important practical modeling advantages. We then extend this approach to the collaborative case, where a set of simultaneously coded signals share the same sparsity pattern at the higher (group) level, but not necessarily at the lower (inside the group) level, obtaining the collaborative HiLasso model (C-HiLasso). Such signals then share the same active groups, or classes, but not necessarily the same active set. This model is very well suited for ap...
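
    The two-level sparsity can be illustrated by the proximal operator of a HiLasso-type penalty, lam1*||b||_1 + lam2*sum_g ||b_g||_2: elementwise soft-thresholding followed by group shrinkage. This is a sketch of the penalty's effect, not the authors' solver.

```python
import numpy as np

def prox_hilasso(beta, groups, lam1, lam2):
    """Proximal operator of lam1*||b||_1 + lam2*sum_g ||b_g||_2:
    elementwise soft-thresholding (within-group sparsity) followed by
    group-wise shrinkage (group sparsity)."""
    b = np.sign(beta) * np.maximum(np.abs(beta) - lam1, 0.0)
    out = np.zeros_like(b)
    for g in groups:
        norm = np.linalg.norm(b[g])
        if norm > lam2:                      # group survives, shrunk toward 0
            out[g] = (1.0 - lam2 / norm) * b[g]
    return out

beta = np.array([3.0, -0.5, 0.2, 2.0])
shrunk = prox_hilasso(beta, groups=[[0, 1], [2, 3]], lam1=0.5, lam2=1.0)
```

    In this toy call, small coefficients are zeroed inside each surviving group, the hierarchically structured pattern the model induces.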

  8. Designing for Learning and Play - The Smiley Model as Framework

    DEFF Research Database (Denmark)

    Weitze, Charlotte Lærke

    2016-01-01

    This paper presents a framework for designing engaging learning experiences in games – the Smiley Model. In this Design-Based Research project, student-game-designers were learning inside a gamified learning design - while designing and implementing learning goals from curriculum into the small...... digital games. The Smiley Model inspired and provided a scaffold or a heuristic for the overall gamified learning design – as well as for the students’ learning game design processes when creating small games turning the learning situation into an engaging experience. The audience for the experiments

  9. A constitutive model for magnetostriction based on thermodynamic framework

    Science.gov (United States)

    Ho, Kwangsoo

    2016-08-01

    This work presents a general framework for the continuum-based formulation of dissipative materials with magneto-mechanical coupling in the viewpoint of irreversible thermodynamics. The thermodynamically consistent model developed for the magnetic hysteresis is extended to include the magnetostrictive effect. The dissipative and hysteretic response of magnetostrictive materials is captured through the introduction of internal state variables. The evolution rate of magnetostrictive strain as well as magnetization is derived from thermodynamic and dissipative potentials in accordance with the general principles of thermodynamics. It is then demonstrated that the constitutive model is competent to describe the magneto-mechanical behavior by comparing simulation results with the experimental data reported in the literature.

  10. Next generation framework for aquatic modeling of the Earth System

    Directory of Open Access Journals (Sweden)

    C. J. Vörösmarty

    2009-03-01

    Full Text Available Earth System model development is becoming an increasingly complex task. As scientists attempt to represent the physical and bio-geochemical processes and various feedback mechanisms in unprecedented detail, the models themselves are becoming increasingly complex. At the same time, the complexity of the surrounding IT infrastructure is growing as well. Earth System models must manage a vast amount of data in heterogeneous computing environments. Numerous development efforts are under way to ease that burden and offer model development platforms that reduce IT challenges and allow scientists to focus on their science. While these new modeling frameworks (e.g. FMS, ESMF, CCA, OpenMI) do provide solutions to many IT challenges (performing input/output, managing space and time, establishing model coupling, etc.), they are still considerably complex and often have steep learning curves.

    The Next generation Framework for Aquatic Modeling of the Earth System (NextFrAMES), a revised version of FrAMES, has numerous similarities to frameworks developed by other teams, but represents a novel model development paradigm. NextFrAMES is built around a modeling XML that lets modelers express the overall model structure and provides an API for dynamically linked plugins to represent the processes. The model XML is executed by the NextFrAMES run-time engine, which parses the model definition, loads the module plugins, performs the model I/O and executes the model calculations. NextFrAMES has a minimalistic view of spatial domains and treats every domain (regardless of its layout, such as grid, network tree, individual points, polygons, etc.) as a vector of objects. NextFrAMES performs computations on multiple domains, and interactions between different spatial domains are carried out through couplers. NextFrAMES allows processes to operate at different frequencies by providing rudimentary aggregation and disaggregation facilities.

    NextFrAMES was

  11. A Framework for Improving Project Performance of Standard Design Models in Saudi Arabia

    Directory of Open Access Journals (Sweden)

    Shabbab Al-Otaib

    2013-07-01

    Full Text Available Improving project performance in the construction industry poses several challenges for stakeholders. Recently, there have been frequent calls emphasising the importance of adopting standardisation to improve construction design as well as the process, with a focus on mapping learning from other industries. The Saudi Ministry of Interior (SMoI) has adopted a new Standard Design Model (SDM) approach for the development of its construction programme to effectively manage its complex project portfolio and improve project performance. A review of existing literature indicates that despite the adoption of repetitive SDM projects, which enable learning from past mistakes and improving the performance of future projects, there is a lack of learning instruments to capture, store and disseminate Lessons Learnt (LL). This research proposes a framework for improving the project performance of SDMs in the Saudi construction industry. Eight case studies related to a typical standard design project were performed, including interviews with 24 key stakeholders involved in the planning and implementation of SDM projects within the SMoI. The research identified 14 Critical Success Factors (CSFs) that have a direct impact on SDM project performance. These are classified into three main CSF-related clusters: adaptability to the context; contract management; and construction management. A framework comprising the identified 14 CSFs was developed, refined and validated through a workshop with 12 key stakeholders in the SMoI construction programme. Additionally, a framework implementation process map was developed. Web-based tools and KM were identified as core factors in the framework implementation strategy. Although many past CSF-related studies have been conducted to develop a range of construction project performance improvement frameworks, this paper provides the first initiative to develop a framework to improve the performance of

  12. Model Selection Framework for Graph-based data

    CERN Document Server

    Caceres, Rajmonda S; Schmidt, Matthew C; Miller, Benjamin A; Campbell, William M

    2016-01-01

    Graphs are powerful abstractions for capturing complex relationships in diverse application settings. An active area of research focuses on theoretical models that define the generative mechanism of a graph. Yet given the complexity and inherent noise in real datasets, it is still very challenging to identify the best model for a given observed graph. We discuss a framework for graph model selection that leverages a long list of graph topological properties and a random forest classifier to learn and classify different graph instances. We fully characterize the discriminative power of our approach as we sweep through the parameter space of two generative models, the Erdős-Rényi and the stochastic block model. We show that our approach gets very close to known theoretical bounds and we provide insight on which topological features play a critical discriminating role.
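
    A minimal end-to-end sketch of this model-selection idea (assuming NumPy and scikit-learn are available; graph sizes and model parameters are invented): generate Erdős-Rényi and stochastic-block-model graphs of similar density, extract a few topological features, and train a random forest to discriminate them.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

def er_adj(n, p):
    """Erdős-Rényi adjacency matrix."""
    A = np.triu((rng.random((n, n)) < p).astype(int), 1)
    return A + A.T

def sbm_adj(n, p_in, p_out):
    """Two-block stochastic block model adjacency matrix."""
    half = n // 2
    P = np.full((n, n), p_out)
    P[:half, :half] = p_in
    P[half:, half:] = p_in
    A = np.triu((rng.random((n, n)) < P).astype(int), 1)
    return A + A.T

def features(A):
    """A few topological properties: triangle count, degree spread, clustering."""
    deg = A.sum(axis=1)
    triangles = np.trace(A @ A @ A) / 6.0
    triples = (deg * (deg - 1) / 2.0).sum()
    clustering = 3.0 * triangles / triples if triples else 0.0
    return [triangles, deg.std(), clustering]

X, y = [], []
for _ in range(40):                     # 40 graphs per model, similar density
    X.append(features(er_adj(60, 0.222))); y.append(0)
    X.append(features(sbm_adj(60, 0.40, 0.05))); y.append(1)

Xtr, Xte, ytr, yte = train_test_split(np.array(X), np.array(y),
                                      test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
acc = clf.score(Xte, yte)
```

    With parameters this far apart the classifier separates the two models almost perfectly; the interesting regime studied in the paper is where the parameters approach the theoretical detectability bounds.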

  13. Social networks enabled coordination model for cost management of patient hospital admissions.

    Science.gov (United States)

    Uddin, Mohammed Shahadat; Hossain, Liaquat

    2011-09-01

    In this study, we introduce a social networks enabled coordination model for exploring the effect of the network position of "patient," "physician," and "hospital" actors in a patient-centered care network, which evolves during the patient hospitalization period, on the total cost of coordination. An actor is a node, which represents an entity such as an individual or organization in a social network. In our analysis of actor networks and coordination in the healthcare literature, we identified a significant gap: a number of promising hospital coordination models have been developed (e.g., the Guided Care Model, the Chronic Care Model) for the current healthcare system, focusing on quality of service and patient satisfaction. The health insurance dataset for total hip replacement (THR) from the Hospital Contribution Fund, a prominent Australian health insurance company, is analyzed to examine our proposed coordination model. We consider the network attributes of degree, connectedness, in-degree, out-degree, and tie strength to measure the network position of actors. To measure the cost of coordination for a particular hospital, the average of total hospitalization expenses over all THR hospital admissions is used. Results show that the network positions of "patient," "physician," and "hospital" actors, computed over all admissions to a particular hospital, affect that hospital's average total hospitalization expenses. These results can be used as guidelines to set up a cost-effective healthcare practice structure for patient hospitalization expenses.
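
    A toy sketch of some of the named network measures on a hypothetical patient-centered care network (actor names and weights are invented; edge weight stands in for tie strength):

```python
# Toy directed patient-centered care network; actor names and weights are
# invented. Edge weight stands in for tie strength (e.g. shared admissions).
edges = [
    ("patient_1", "physician_A", 3),
    ("patient_1", "hospital_X", 2),
    ("physician_A", "hospital_X", 5),
    ("patient_2", "physician_A", 1),
]

def out_degree(node):
    return sum(1 for s, t, w in edges if s == node)

def in_degree(node):
    return sum(1 for s, t, w in edges if t == node)

def degree(node):
    return in_degree(node) + out_degree(node)

def tie_strength(a, b):
    return sum(w for s, t, w in edges if {s, t} == {a, b})
```

    In the study, such per-hospital measures would then be related to that hospital's average THR hospitalization expenses, e.g., by regression.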

  14. A framework for similarity recognition of CAD models

    Directory of Open Access Journals (Sweden)

    Leila Zehtaban

    2016-07-01

    Full Text Available A designer is mainly supported by two essential factors in design decisions. These two factors are intelligence and experience, aiding the designer by predicting the interconnection between the required design parameters. Through classification of product data and similarity recognition between new and existing designs, it is partially possible to replace the required experience for an inexperienced designer. Given this context, the current paper addresses a framework for recognition and flexible retrieval of similar models in product design. The idea is to establish an infrastructure for transferring design as well as the required PLM (Product Lifecycle Management) know-how to the design phase of product development in order to reduce the design time. Furthermore, such a method can be applied as a brainstorming method for new and creative product development as well. The proposed framework has been tested and benchmarked, showing promising results.

  15. The ontology model of FrontCRM framework

    Science.gov (United States)

    Budiardjo, Eko K.; Perdana, Wira; Franshisca, Felicia

    2013-03-01

    Adoption and implementation of Customer Relationship Management (CRM) is not merely a technological installation; the emphasis is rather on the application of a customer-centric philosophy and culture as a whole. CRM must begin at the level of business strategy, the only level at which thorough organizational changes can be made. The change agenda can then be directed to departmental plans and supported by information technology. Work processes related to the CRM concept include marketing, sales, and services. FrontCRM is developed as a framework to guide the identification of business processes related to CRM, based on the strategic planning approach. This leads to the identification of processes and practices in every process area related to marketing, sales, and services. The ontology model presented in this paper serves as a tool to avoid misunderstanding of the framework, to define practices systematically within each process area and to find CRM software features related to those practices.

  16. Modeling phenotypic plasticity in growth trajectories: a statistical framework.

    Science.gov (United States)

    Wang, Zhong; Pang, Xiaoming; Wu, Weimiao; Wang, Jianxin; Wang, Zuoheng; Wu, Rongling

    2014-01-01

    Phenotypic plasticity, that is multiple phenotypes produced by a single genotype in response to environmental change, has been thought to play an important role in evolution and speciation. Historically, knowledge about phenotypic plasticity has resulted from the analysis of static traits measured at a single time point. New insight into the adaptive nature of plasticity can be gained by an understanding of how organisms alter their developmental processes in a range of environments. Recent advances in statistical modeling of functional data and developmental genetics allow us to construct a dynamic framework of plastic response in developmental form and pattern. Under this framework, development, genetics, and evolution can be synthesized through statistical bridges to better address how evolution results from phenotypic variation in the process of development via genetic alterations.

  17. Velo: A Knowledge Management Framework for Modeling and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Gorton, Ian; Sivaramakrishnan, Chandrika; Black, Gary D.; White, Signe K.; Purohit, Sumit; Lansing, Carina S.; Madison, Michael C.; Schuchardt, Karen L.; Liu, Yan

    2012-03-01

    Modern scientific enterprises are inherently knowledge-intensive. Scientific studies in domains such as geosciences, climate, and biology require the acquisition and manipulation of large amounts of experimental and field data to create inputs for large-scale computational simulations. The results of these simulations are then analyzed, leading to refinements of inputs and models and additional simulations. The results of this process must be managed and archived to provide justifications for regulatory decisions and publications that are based on the models. In this paper we introduce our Velo framework that is designed as a reusable, domain independent knowledge management infrastructure for modeling and simulation. Velo leverages, integrates and extends open source collaborative and content management technologies to create a scalable and flexible core platform that can be tailored to specific scientific domains. We describe the architecture of Velo for managing and associating the various types of data that are used and created in modeling and simulation projects, as well as the framework for integrating domain-specific tools. To demonstrate realizations of Velo, we describe examples from two deployed sites for carbon sequestration and climate modeling. These provide concrete examples of the inherent extensibility and utility of our approach.

  18. Optimization Framework for Stochastic Modeling of Annual Streamflows

    Science.gov (United States)

    Srivastav, R. K.; Srinivasan, K.; Sudheer, K.

    2008-12-01

    Synthetic streamflow data generation involves the synthesis of likely streamflow patterns that are statistically indistinguishable from the observed streamflow data. The various kinds of stochastic models adopted for streamflow generation in hydrology are: i) parametric models, which hypothesize the form of the dependence structure and the distributional form a priori (examples are AR, ARMA); ii) nonparametric models (examples are bootstrap/kernel based methods), which characterize the laws of chance describing the streamflow process without recourse to prior assumptions as to the form or structure of these laws; iii) hybrid models, which blend both parametric and non-parametric models advantageously to model the streamflows effectively. Despite the many developments that have taken place in the field of stochastic modeling of streamflows over the last four decades, accurate prediction of the storage and the critical drought (water use) characteristics has been posing a persistent challenge to the stochastic modeler. This may be because, usually, the stochastic streamflow model parameters are estimated by minimizing a statistically based objective function (such as maximum likelihood (MLE) or least squares estimation) and subsequently the efficacy of the models is validated based on the accuracy of prediction of the estimates of the water-use characteristics. In this study a framework is proposed to find the optimal hybrid model (blend of ARMA(1,1) and moving block bootstrap (MBB)) based on the explicit objective function of minimizing the relative bias in estimating the storage capacity of the reservoir. The optimal parameter set of the hybrid model is obtained based on the search over a multi-dimensional parameter space involving simultaneous exploration of the parametric (ARMA[1,1]) as well as the non-parametric (MBB) components. This is achieved using the efficient evolutionary search based optimization tool namely, non-dominated sorting genetic
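
    The non-parametric half of the proposed hybrid, the moving block bootstrap, and the storage statistic behind the objective function can be sketched as follows (synthetic gamma-distributed annual flows; the sequent-peak algorithm estimates the required storage, and the relative bias in storage is the quantity the framework minimizes):

```python
import numpy as np

rng = np.random.default_rng(1)

def moving_block_bootstrap(x, block_len, rng):
    """Resample a series by concatenating randomly chosen overlapping
    blocks, preserving short-range dependence (the MBB of the hybrid)."""
    n = len(x)
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n - block_len + 1, size=n_blocks)
    return np.concatenate([x[s:s + block_len] for s in starts])[:n]

def storage_capacity(flows, demand):
    """Sequent-peak estimate of the reservoir storage needed to meet a
    constant demand from a sequence of annual flows."""
    deficit, worst = 0.0, 0.0
    for q in flows:
        deficit = max(0.0, deficit + demand - q)
        worst = max(worst, deficit)
    return worst

flows = rng.gamma(shape=4.0, scale=25.0, size=60)   # synthetic annual flows
demand = flows.mean()
obs_storage = storage_capacity(flows, demand)
reps = [storage_capacity(moving_block_bootstrap(flows, 5, rng), demand)
        for _ in range(200)]
rel_bias = (np.mean(reps) - obs_storage) / obs_storage
```

    The framework described above would search jointly over the ARMA(1,1) parameters and the MBB block length to drive this relative bias toward zero; here the bias is only computed, not optimized.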

  19. Statistical analysis of road-vehicle-driver interaction as an enabler to designing behavioural models

    Science.gov (United States)

    Chakravarty, T.; Chowdhury, A.; Ghose, A.; Bhaumik, C.; Balamuralidhar, P.

    2014-03-01

    Telematics form an important technology enabler for intelligent transportation systems. By deploying on-board diagnostic devices, the signatures of vehicle vibration along with its location and time are recorded. Detailed analyses of the collected signatures offer deep insights into the state of the objects under study. Towards that objective, we carried out experiments by deploying a telematics device in one of the office buses that ferry employees to office and back. Data were collected from a 3-axis accelerometer, GPS, speed and time for all the journeys. In this paper, we present initial results of the above exercise by applying statistical methods to derive information through systematic analysis of the data collected over four months. It is demonstrated that the higher-order derivative of the measured Z-axis acceleration samples displays the properties of a Weibull distribution when the time axis is replaced by the amplitude of the processed acceleration data. Such an observation offers us a method to predict future behaviour, where deviations from prediction are classified as context-based aberrations or progressive degradation of the system. In addition, we capture the relationship between the speed of the vehicle and the median of the jerk energy samples using regression analysis. Such results offer an opportunity to develop a robust method to model road-vehicle interaction, thereby enabling prediction of driving behaviour, condition-based maintenance, etc.
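
    The two statistical steps described, fitting a Weibull distribution to processed acceleration amplitudes and regressing median jerk energy on speed, can be sketched with synthetic data (SciPy assumed available; the distribution parameters and the linear relation are invented stand-ins for the telematics data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Stand-in for processed acceleration amplitudes (higher-order derivative
# of the Z-axis signal, i.e. jerk-like samples); synthetic draws, since
# the paper's telematics data are not available.
jerk = stats.weibull_min.rvs(c=1.8, scale=2.0, size=2000, random_state=7)

# Fit a two-parameter Weibull (location fixed at zero).
shape, loc, scale = stats.weibull_min.fit(jerk, floc=0)

# Speed vs. median jerk energy: illustrative linear relation plus noise.
speed = rng.uniform(10, 80, size=50)                 # km/h, hypothetical
median_jerk_energy = 0.05 * speed + rng.normal(0, 0.3, size=50)
slope, intercept, r, p, se = stats.linregress(speed, median_jerk_energy)
```

    A fitted Weibull then supports the prediction scheme in the abstract: samples far out in the fitted tail can be flagged as context-based aberrations, while a drifting fitted shape or scale suggests progressive degradation.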

  20. A Modelling Framework to Assess the Effect of Pressures on River Abiotic Habitat Conditions and Biota.

    Directory of Open Access Journals (Sweden)

    Jochem Kail

    Full Text Available River biota are affected by global to reach-scale pressures, but most approaches for predicting biota of rivers focus on river reach or segment scale processes and habitats. Moreover, these approaches do not consider long-term morphological changes that affect habitat conditions. In this study, a modelling framework was further developed and tested to assess the effect of pressures at different spatial scales on reach-scale habitat conditions and biota. Ecohydrological and 1D hydrodynamic models were used to predict discharge and water quality at the catchment scale and the resulting water level at the downstream end of a study reach. Long-term reach morphology was modelled using empirical regime equations, meander migration and 2D morphodynamic models. The respective flow and substrate conditions in the study reach were predicted using a 2D hydrodynamic model, and the suitability of these habitats was assessed with novel habitat models. In addition, dispersal models for fish and macroinvertebrates were developed to assess the re-colonization potential and to finally compare habitat suitability and the availability/ability of species to colonize these habitats. Applicability was tested and model performance was assessed by comparing observed and predicted conditions in the lowland Treene River in northern Germany. Technically, it was possible to link the different models, but future applications would benefit from the development of open source software for all modelling steps to enable fully automated model runs. Future research needs concern the physical modelling of long-term morphodynamics, feedback of biota (e.g., macrophytes) on abiotic habitat conditions, species interactions, and empirical data on the hydraulic habitat suitability and dispersal abilities of macroinvertebrates.
The modelling framework is flexible and allows for including additional models and investigating different research and management questions, e.g., in climate impact

  1. Machine learning methods enable predictive modeling of antibody feature:function relationships in RV144 vaccinees.

    Directory of Open Access Journals (Sweden)

    Ickwon Choi

    2015-04-01

Full Text Available The adaptive immune response to vaccination or infection can lead to the production of specific antibodies to neutralize the pathogen or recruit innate immune effector cells for help. The non-neutralizing role of antibodies in stimulating effector cell responses may have been a key mechanism of the protection observed in the RV144 HIV vaccine trial. In an extensive investigation of a rich set of data collected from RV144 vaccine recipients, we here employ machine learning methods to identify and model associations between antibody features (IgG subclass and antigen specificity) and effector function activities (antibody-dependent cellular phagocytosis, cellular cytotoxicity, and cytokine release). We demonstrate via cross-validation that classification and regression approaches can effectively use the antibody features to robustly predict qualitative and quantitative functional outcomes. This integration of antibody feature and function data within a machine learning framework provides a new, objective approach to discovering and assessing multivariate immune correlates.
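The cross-validated feature-to-function prediction described above can be sketched in a few lines. Everything below is a synthetic stand-in (the study used actual RV144 measurements and richer learners); the sketch only illustrates k-fold cross-validated prediction of a functional readout from antibody features with ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the RV144 data: rows = vaccinees, columns =
# hypothetical antibody features (e.g. IgG subclass titers, antigen
# specificities).  The outcome y mimics an effector-function score.
n, p = 120, 6
X = rng.normal(size=(n, p))
true_w = np.array([1.5, -0.8, 0.0, 0.6, 0.0, 0.3])
y = X @ true_w + rng.normal(scale=0.5, size=n)

def cross_val_r2(X, y, k=5):
    """k-fold cross-validated R^2 of an ordinary least-squares model."""
    folds = np.array_split(np.arange(len(y)), k)
    ss_res = ss_tot = 0.0
    for test in folds:
        train = np.setdiff1d(np.arange(len(y)), test)
        w, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
        pred = X[test] @ w
        ss_res += np.sum((y[test] - pred) ** 2)
        ss_tot += np.sum((y[test] - y[train].mean()) ** 2)
    return 1.0 - ss_res / ss_tot

r2 = cross_val_r2(X, y)
print(f"cross-validated R^2: {r2:.2f}")
```

Because the test folds never inform the fitted weights, the reported R² is an out-of-sample estimate, which is the sense in which the paper's predictions are "robust".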

  2. Reduced ENSO variability at the LGM revealed by an isotope-enabled Earth system model

    Science.gov (United States)

    Zhu, Jiang; Liu, Zhengyu; Brady, Esther; Otto-Bliesner, Bette; Zhang, Jiaxu; Noone, David; Tomas, Robert; Nusbaumer, Jesse; Wong, Tony; Jahn, Alexandra; Tabor, Clay

    2017-07-01

Studying the El Niño-Southern Oscillation (ENSO) in the past can help us better understand its dynamics and improve its future projections. However, both paleoclimate reconstructions and model simulations of ENSO strength at the Last Glacial Maximum (LGM; 21 ka B.P.) have led to contradictory results. Here we perform model simulations using the recently developed water isotope-enabled Community Earth System Model (iCESM). For the first time, model-simulated oxygen isotopes are directly compared with those from ENSO reconstructions based on individual foraminifera analysis (IFA). We find that LGM ENSO was most likely weaker than in the preindustrial period. The iCESM suggests that the total variance of IFA records may reflect changes in the annual cycle rather than in ENSO variability, as previously assumed. Furthermore, the interpretation of subsurface IFA records can be substantially complicated by the habitat depth of thermocline-dwelling foraminifera and their vertical migration within a temporally varying thermocline.
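The central caveat above, that the total variance of an IFA-like record conflates the annual cycle with interannual (ENSO-like) variability, can be illustrated on a synthetic monthly series; the amplitudes below are arbitrary, not iCESM output:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic monthly SST-proxy series: a fixed annual cycle plus slow
# interannual (ENSO-like) anomalies plus measurement noise.
n_years = 200
months = np.arange(n_years * 12)
annual = 2.0 * np.sin(2 * np.pi * months / 12)             # seasonal cycle
enso = np.repeat(rng.normal(scale=0.8, size=n_years), 12)  # slow anomalies
series = annual + enso + rng.normal(scale=0.1, size=months.size)

# Decompose: the monthly climatology captures the annual cycle; the
# residual anomaly carries the interannual signal.  Total variance is the
# exact sum of the two parts, so a large total variance says nothing by
# itself about which part changed.
clim = series.reshape(n_years, 12).mean(axis=0)
anom = series - np.tile(clim, n_years)

var_total = series.var()
var_annual = np.tile(clim, n_years).var()
var_interannual = anom.var()
print(var_total, var_annual, var_interannual)
```

With these invented amplitudes, the annual-cycle variance dominates the total, which is exactly the situation in which a change in total IFA variance could be misread as a change in ENSO.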

  3. A Structural Model Decomposition Framework for Hybrid Systems Diagnosis

    Science.gov (United States)

    Daigle, Matthew; Bregon, Anibal; Roychoudhury, Indranil

    2015-01-01

    Nowadays, a large number of practical systems in aerospace and industrial environments are best represented as hybrid systems that consist of discrete modes of behavior, each defined by a set of continuous dynamics. These hybrid dynamics make the on-line fault diagnosis task very challenging. In this work, we present a new modeling and diagnosis framework for hybrid systems. Models are composed from sets of user-defined components using a compositional modeling approach. Submodels for residual generation are then generated for a given mode, and reconfigured efficiently when the mode changes. Efficient reconfiguration is established by exploiting causality information within the hybrid system models. The submodels can then be used for fault diagnosis based on residual generation and analysis. We demonstrate the efficient causality reassignment, submodel reconfiguration, and residual generation for fault diagnosis using an electrical circuit case study.
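A minimal sketch of mode-aware residual generation, using a toy first-order system rather than the paper's component-based electrical circuit models: the submodel is reconfigured at a known mode change, and a fault is flagged when the residual between observed and predicted outputs exceeds a threshold. All dynamics and numbers below are invented for illustration:

```python
# Toy hybrid system: a first-order plant with two discrete modes (time
# constants) and a known mode change at k = 30.  The mode-specific
# submodel tracks the plant; an unmodeled actuator fault injected at
# k = 40 makes the residual grow past the detection threshold.
A = {"fast": 0.80, "slow": 0.95}   # per-step pole in each mode
THRESH = 0.05                      # residual threshold for fault detection

def simulate(fault_at=None, n=60):
    x_true = x_model = 0.0
    residuals = []
    mode = "fast"
    for k in range(n):
        if k == 30:
            mode = "slow"          # known mode change: submodel reconfigures
        a = A[mode]
        u_true = 0.8 if (fault_at is not None and k >= fault_at) else 1.0
        x_true = a * x_true + (1 - a) * u_true   # plant (possibly faulty)
        x_model = a * x_model + (1 - a) * 1.0    # submodel assumes nominal input
        residuals.append(abs(x_true - x_model))
    return residuals

nominal, faulty = simulate(), simulate(fault_at=40)
print(max(nominal) > THRESH, max(faulty) > THRESH)
```

With these values the nominal run raises no alarm while the faulty run does; the paper's contribution is doing the submodel reconfiguration efficiently for large component-based models by exploiting causality information.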

  4. Generalized framework for context-specific metabolic model extraction methods

    Directory of Open Access Journals (Sweden)

Semidán Robaina Estévez

    2014-09-01

Full Text Available Genome-scale metabolic models are increasingly applied to investigate the physiology not only of simple prokaryotes, but also of eukaryotes, such as plants, characterized by compartmentalized cells of multiple types. While genome-scale models aim at including the entirety of known metabolic reactions, mounting evidence indicates that only a subset of these reactions is active in a given context, such as a developmental stage, cell type, or environment. As a result, several methods have been proposed to reconstruct context-specific models from existing genome-scale models by integrating various types of high-throughput data. Here we present a mathematical framework that puts all existing methods under one umbrella, provides the means to better understand their functioning, highlights similarities and differences, and helps users select the method most suitable for their application.
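As a toy illustration of the extraction idea (the methods unified by the framework solve optimization problems over the full stoichiometric model; the reactions and expression values below are invented), a context-specific subnetwork can be obtained by keeping reactions with sufficient expression evidence:

```python
# Toy network: reaction id -> (expression evidence, metabolites involved).
# Real extraction methods (e.g. GIMME- or iMAT-style approaches) also
# enforce flux consistency; this sketch shows only the data-integration
# step of selecting a context-specific reaction subset.
reactions = {
    "R1": {"expr": 9.2, "mets": {"A", "B"}},
    "R2": {"expr": 0.3, "mets": {"B", "C"}},
    "R3": {"expr": 5.1, "mets": {"A", "C"}},
    "R4": {"expr": 0.1, "mets": {"C", "D"}},
}
CORE_THRESHOLD = 1.0

def extract_context_model(reactions, threshold):
    """Keep reactions whose expression evidence meets the threshold."""
    return {rid: r for rid, r in reactions.items() if r["expr"] >= threshold}

context = extract_context_model(reactions, CORE_THRESHOLD)
print(sorted(context))  # context-specific reaction subset
```

The framework in the paper formalizes precisely such choices (which reactions form the core set, and how non-core reactions are penalized or pruned) so that different published methods become comparable.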

  5. A computational framework for modeling targets as complex adaptive systems

    Science.gov (United States)

    Santos, Eugene; Santos, Eunice E.; Korah, John; Murugappan, Vairavan; Subramanian, Suresh

    2017-05-01

Modeling large military targets is a challenge, as they can be complex systems encompassing myriad combinations of human, technological, and social elements that interact, leading to complex behaviors. Moreover, such targets have multiple components and structures, extending across multiple spatial and temporal scales, and are in a state of change, either in response to events in the environment or changes within the system. Complex adaptive system (CAS) theory can help in capturing the dynamism, interactions, and, more importantly, the various emergent behaviors displayed by such targets. However, a key stumbling block is incorporating information from various intelligence, surveillance and reconnaissance (ISR) sources while dealing with the inherent uncertainty, incompleteness and time criticality of real-world information. To overcome these challenges, we present a probabilistic reasoning network based framework called the complex adaptive Bayesian Knowledge Base (caBKB). caBKB is a rigorous, overarching and axiomatic framework that models two key processes, namely information aggregation and information composition. While information aggregation deals with the union, merger and concatenation of information and takes into account issues such as source reliability and information inconsistencies, information composition focuses on combining information components where such components may have well-defined operations. Since caBKBs can explicitly model the relationships between information pieces at various scales, they provide unique capabilities such as the ability to de-aggregate and de-compose information for detailed analysis. Using a scenario from the Network Centric Operations (NCO) domain, we describe how our framework can be used for modeling targets, with a focus on methodologies for quantifying NCO performance metrics.

  6. Vulnerability Assessment Models to Drought: Toward a Conceptual Framework

    Directory of Open Access Journals (Sweden)

    Kiumars Zarafshani

    2016-06-01

Full Text Available Drought is regarded as a slow-onset natural disaster that causes inevitable damage to water resources and to farm life. Currently, crisis management is the basis of drought mitigation plans; however, studies thus far indicate that effective drought management strategies are based on risk management. As a primary tool in mitigating the impact of drought, vulnerability assessment can be used as a benchmark in drought mitigation plans and to enhance farmers’ ability to cope with drought. Moreover, the literature pertaining to drought has focused extensively on its impact, awarding only limited attention to vulnerability assessment as a tool. Therefore, the main purpose of this paper is to develop a conceptual framework for designing a vulnerability model in order to assess farmers’ level of vulnerability before, during and after the onset of drought. Use of this drought vulnerability model would aid disaster relief workers by enhancing the adaptive capacity of farmers facing the impacts of drought. The paper starts with the definition of vulnerability and outlines the different frameworks on vulnerability developed thus far. It then identifies various approaches to vulnerability assessment and finally offers the most appropriate model. The paper concludes that the introduced model can guide drought mitigation programs in countries that are impacted the most by drought.

  7. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    Energy Technology Data Exchange (ETDEWEB)

    Diachin, L F; Garaizar, F X; Henson, V E; Pope, G

    2009-10-12

In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

  8. Computer-aided modeling framework – a generic modeling template for catalytic membrane fixed bed reactors

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    2013-01-01

This work focuses on the development of a computer-aided modeling framework. The framework is a knowledge-based system that is built on a generic modeling language and structured based on workflows for different general modeling tasks. The overall objective of this work is to support model developers and users in generating and testing models systematically, efficiently and reliably. In this way, development of products and processes can be faster, cheaper and more efficient. In this contribution, as part of the framework, a generic modeling template for the systematic derivation of problem-specific catalytic membrane fixed bed models is developed. The application of the modeling template is highlighted with a case study related to the modeling of a catalytic membrane reactor coupling dehydrogenation of ethylbenzene with hydrogenation of nitrobenzene.

  9. Modeling of RFID-Enabled Real-Time Manufacturing Execution System in Mixed-Model Assembly Lines

    Directory of Open Access Journals (Sweden)

    Zhixin Yang

    2015-01-01

Full Text Available To quickly respond to diverse product demands, mixed-model assembly lines are widely adopted in discrete manufacturing industries. Besides the complexity in material distribution, mixed-model assembly involves a variety of components, different process plans and fast production changes, which greatly increase the difficulty of agile production management. Aiming at breaking through the bottlenecks in existing production management, a novel RFID-enabled manufacturing execution system (MES), featuring real-time and wireless information interaction capability, is proposed to identify various manufacturing objects, including WIPs, tools, and operators, and to trace their movements throughout the production processes. However, being subject to constraints in terms of safety stock, machine assignment, setup, and scheduling requirements, the optimization of the RFID-enabled MES model for production planning and scheduling is an NP-hard problem. A new heuristic generalized Lagrangian decomposition approach is proposed for model optimization, which decomposes the model into three subproblems: computation of the optimal configuration of RFID sensor networks, optimization of production planning subject to machine setup costs and safety stock constraints, and optimization of scheduling for minimized overtime. RFID signal processing methods that can handle unreliable, redundant, and missing tag events are also described in detail. The model's validity is discussed through algorithm analysis and verified through numerical simulation. The proposed design scheme has important reference value for applications of RFID in multiple manufacturing fields, and lays a research foundation for advancing digital, networked manufacturing systems toward intelligence.
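The Lagrangian decomposition idea can be sketched on a deliberately tiny problem, not the paper's MES model: a coupling constraint x = y is relaxed with a multiplier, the two resulting subproblems are solved independently, and the multiplier is updated by a subgradient step:

```python
# Minimal Lagrangian-decomposition sketch: minimize f1(x) + f2(y) subject
# to the coupling constraint x == y, with f1(x) = (x-3)^2 and
# f2(y) = (y-1)^2.  Relaxing the constraint with multiplier lam lets each
# subproblem be solved on its own (here in closed form), mimicking how
# the paper splits planning, scheduling, and sensor-network subproblems.
def solve_subproblem_1(lam):        # argmin_x (x - 3)**2 + lam * x
    return 3 - lam / 2

def solve_subproblem_2(lam):        # argmin_y (y - 1)**2 - lam * y
    return 1 + lam / 2

lam = 0.0
for _ in range(50):
    x = solve_subproblem_1(lam)
    y = solve_subproblem_2(lam)
    lam += 0.5 * (x - y)            # subgradient step on the dual
print(round(x, 4), round(y, 4), round(lam, 4))
```

The multiplier converges to the value (here 2) at which the independently solved subproblems agree on the coupled variable, which is the coordination mechanism the decomposition relies on.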

  10. Tarmo: A Framework for Parallelized Bounded Model Checking

    CERN Document Server

    Wieringa, Siert; Heljanko, Keijo; 10.4204/EPTCS.14.5

    2009-01-01

    This paper investigates approaches to parallelizing Bounded Model Checking (BMC) for shared memory environments as well as for clusters of workstations. We present a generic framework for parallelized BMC named Tarmo. Our framework can be used with any incremental SAT encoding for BMC but for the results in this paper we use only the current state-of-the-art encoding for full PLTL. Using this encoding allows us to check both safety and liveness properties, contrary to an earlier work on distributing BMC that is limited to safety properties only. Despite our focus on BMC after it has been translated to SAT, existing distributed SAT solvers are not well suited for our application. This is because solving a BMC problem is not solving a set of independent SAT instances but rather involves solving multiple related SAT instances, encoded incrementally, where the satisfiability of each instance corresponds to the existence of a counterexample of a specific length. Our framework includes a generic architecture for a ...

  11. A Diaminopropane-Appended Metal-Organic Framework Enabling Efficient CO2 Capture from Coal Flue Gas via a Mixed Adsorption Mechanism.

    Science.gov (United States)

    Milner, Phillip J; Siegelman, Rebecca L; Forse, Alexander C; Gonzalez, Miguel I; Runčevski, Tomče; Martell, Jeffrey D; Reimer, Jeffrey A; Long, Jeffrey R

    2017-09-27

A new diamine-functionalized metal-organic framework consisting of 2,2-dimethyl-1,3-diaminopropane (dmpn) appended to the Mg(2+) sites lining the channels of Mg2(dobpdc) (dobpdc(4-) = 4,4'-dioxidobiphenyl-3,3'-dicarboxylate) is characterized for the removal of CO2 from the flue gas emissions of coal-fired power plants. Unique among members of this promising class of adsorbents, dmpn-Mg2(dobpdc) displays facile step-shaped adsorption of CO2 from coal flue gas at 40 °C and near-complete CO2 desorption upon heating to 100 °C, enabling a high CO2 working capacity (2.42 mmol/g, 9.1 wt %) with a modest 60 °C temperature swing. Evaluation of the thermodynamic parameters of adsorption for dmpn-Mg2(dobpdc) suggests that the narrow temperature swing of its CO2 adsorption steps is due to the high magnitude of its differential enthalpy of adsorption (Δhads = -73 ± 1 kJ/mol), with a larger than expected entropic penalty for CO2 adsorption (Δsads = -204 ± 4 J/mol·K) positioning the step in the optimal range for carbon capture from coal flue gas. In addition, thermogravimetric analysis and breakthrough experiments indicate that, in contrast to many adsorbents, dmpn-Mg2(dobpdc) captures CO2 effectively in the presence of water and can be subjected to 1000 humid adsorption/desorption cycles with minimal degradation. Solid-state (13)C NMR spectra and single-crystal X-ray diffraction structures of the Zn analogue reveal that this material adsorbs CO2 via formation of both ammonium carbamates and carbamic acid pairs, the latter of which are crystallographically verified for the first time in a porous material. Taken together, these properties render dmpn-Mg2(dobpdc) one of the most promising adsorbents for carbon capture applications.

  12. Tarmo: A Framework for Parallelized Bounded Model Checking

    Directory of Open Access Journals (Sweden)

    Siert Wieringa

    2009-12-01

Full Text Available This paper investigates approaches to parallelizing Bounded Model Checking (BMC) for shared memory environments as well as for clusters of workstations. We present a generic framework for parallelized BMC named Tarmo. Our framework can be used with any incremental SAT encoding for BMC but for the results in this paper we use only the current state-of-the-art encoding for full PLTL. Using this encoding allows us to check both safety and liveness properties, contrary to an earlier work on distributing BMC that is limited to safety properties only. Despite our focus on BMC after it has been translated to SAT, existing distributed SAT solvers are not well suited for our application. This is because solving a BMC problem is not solving a set of independent SAT instances but rather involves solving multiple related SAT instances, encoded incrementally, where the satisfiability of each instance corresponds to the existence of a counterexample of a specific length. Our framework includes a generic architecture for a shared clause database that allows easy clause sharing between SAT solver threads solving various such instances. We present extensive experimental results obtained with multiple variants of our Tarmo implementation. Our shared memory variants have a significantly better performance than conventional single threaded approaches, which is a result that many users can benefit from as multi-core and multi-processor technology is widely available. Furthermore we demonstrate that our framework can be deployed in a typical cluster of workstations, where several multi-core machines are connected by a network.
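The bound-increasing structure of BMC can be sketched as follows. Tarmo encodes each bound as an incrementally constructed SAT instance; the sketch below answers the same per-bound question ("does a counterexample of length k exist?") by explicit state exploration on an invented toy transition system, purely to show the loop:

```python
# Toy transition system: a counter modulo 8 that should never reach
# state 6.  BMC asks, for increasing bounds k, whether a path of length k
# from an initial state reaches a bad state; the smallest such k is the
# length of the shortest counterexample.  (Tarmo answers each per-bound
# question with an incrementally encoded SAT instance instead.)
INIT = {0}
BAD = {6}

def step(s):
    return {(s + 1) % 8, (s + 2) % 8}   # nondeterministic transitions

def counterexample_exists(k):
    frontier = set(INIT)                # states reachable in exactly i steps
    for _ in range(k):
        frontier = set().union(*(step(s) for s in frontier))
    return bool(frontier & BAD)

found_at = next(k for k in range(20) if counterexample_exists(k))
print("shortest counterexample length:", found_at)
```

Because consecutive bounds share almost all of their encoding, solving them as related, incrementally extended instances (with clause sharing between solver threads) is what makes the parallel framework effective.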

  13. Estimating and modelling cure in population-based cancer studies within the framework of flexible parametric survival models

    Directory of Open Access Journals (Sweden)

    Eloranta Sandra

    2011-06-01

Full Text Available Abstract Background When the mortality among a cancer patient group returns to the same level as in the general population, that is, the patients no longer experience excess mortality, the patients still alive are considered "statistically cured". Cure models can be used to estimate the cure proportion as well as the survival function of the "uncured". One limitation of parametric cure models is that the functional form of the survival of the "uncured" has to be specified. It can sometimes be hard to find a survival function flexible enough to fit the observed data, for example, when there is high excess hazard within a few months of diagnosis, which is common among older age groups. This has led to the exclusion of older age groups in population-based cancer studies using cure models. Methods Here we have extended the flexible parametric survival model to incorporate cure as a special case, to estimate the cure proportion and the survival of the "uncured". Flexible parametric survival models use splines to model the underlying hazard function, so no parametric distribution has to be specified. Results We have compared the fit from standard cure models to our flexible cure model, using data on colon cancer patients in Finland. The new method gives similar results to a standard cure model when the standard model is reliable, and a better fit when the standard cure model gives biased estimates. Conclusions Cure models within the framework of flexible parametric models enable cure modelling when standard models give biased estimates. These flexible cure models enable inclusion of older age groups and can give stage-specific estimates, which is not always possible with parametric cure models.
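The mixture cure structure can be sketched directly. In the sketch below the survival of the "uncured" is a Weibull with invented parameters; the paper's contribution is to replace this parametric choice with a spline-based (flexible parametric) hazard while keeping the same mixture form, in which relative survival plateaus at the cure proportion:

```python
import math

# Mixture cure model sketch: relative survival of the patient group is
# R(t) = pi + (1 - pi) * S_u(t), where pi is the cure proportion and
# S_u(t) the survival of the "uncured".  Both parameter values below are
# illustrative, not estimates from the Finnish colon cancer data.
pi = 0.55                 # cure proportion
shape, scale = 1.2, 2.0   # Weibull parameters for the uncured

def surv_uncured(t):
    return math.exp(-((t / scale) ** shape))

def relative_survival(t):
    return pi + (1 - pi) * surv_uncured(t)

print(round(relative_survival(0.0), 3),   # 1.0 at diagnosis
      round(relative_survival(20.0), 3))  # ~pi once the uncured have died
```

The plateau at pi is what identifies statistical cure: beyond that point the surviving patients experience no excess mortality relative to the general population.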

  14. Use of Transition Modeling to Enable the Computation of Losses for Variable-Speed Power Turbine

    Science.gov (United States)

    Ameri, Ali A.

    2012-01-01

To investigate the penalties associated with using a variable-speed power turbine (VSPT) in a rotorcraft capable of vertical takeoff and landing, various analysis tools are required. Such analysis tools must be able to model the flow accurately within the operating envelope of the VSPT. For power turbines, this envelope is characterized by low Reynolds numbers and a wide range of incidence angles, positive and negative, due to the variation in shaft speed at relatively fixed corrected flows. The flow in the turbine passage is expected to be transitional and separated at high incidence. The turbulence model of Walters and Leylek was implemented in the NASA Glenn-HT code to enable a more accurate analysis of such flows. Two-dimensional heat transfer predictions of flat-plate flow, and two- and three-dimensional heat transfer predictions on a turbine blade, were performed and are reported herein. Heat transfer computations were performed because heat transfer is a good marker for transition. The final goal is to be able to compute the aerodynamic losses. Armed with the new transition model, total pressure losses for three-dimensional flow of an Energy Efficient Engine (E3) tip section cascade were computed for a range of incidence angles in anticipation of the experimental data. The results obtained form a loss bucket for the chosen blade.

  15. Cyber Enabled Collaborative Environment for Data and Modeling Driven Curriculum Modules for Hydrology and Geoscience Education

    Science.gov (United States)

    Merwade, V.; Ruddell, B. L.; Manduca, C. A.; Fox, S.; Kirk, K. B.

    2012-12-01

With access to emerging datasets and computational tools, there is a need to bring these capabilities into hydrology and geoscience classrooms. However, developing curriculum modules that use data and models to augment classroom teaching is hindered by a steep technology learning curve, rapid technology turnover, and the lack of an organized community cyberinfrastructure (CI) for the dissemination, publication, and sharing of the latest tools and curriculum material for hydrology and geoscience education. The objective of this project is to overcome some of these limitations by developing a cyber-enabled collaborative environment for publishing, sharing and adopting data- and modeling-driven curriculum modules in hydrology and geoscience classrooms. The CI is based on Carleton College's Science Education Resource Center (SERC) Content Management System. Building on its existing community authoring capabilities, the system is being extended to allow assembly of new teaching activities by drawing on a collection of interchangeable building blocks, each of which represents a step in the modeling process. This poster presentation will describe the structure of the CI, the type and description of the modules under development, and the approach that will be used in assessing students' learning from using the modules.

  16. Strategic assessment of capacity consumption in railway networks: Framework and model

    DEFF Research Database (Denmark)

    Jensen, Lars Wittrup; Landex, Alex; Nielsen, Otto Anker

    2017-01-01

In this paper, we develop a new framework for strategic planning purposes to calculate railway infrastructure occupation and capacity consumption in networks, independent of a timetable. Furthermore, a model implementing the framework is presented. In this model, different train sequences are generated …
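One building block of timetable-independent capacity assessment is compressing a candidate train sequence to minimum headways and measuring the resulting infrastructure occupation (in the spirit of UIC 406 compression). The headway values and train types below are illustrative, not taken from the paper:

```python
# Timetable-independent occupation sketch: for a candidate train
# sequence, compress the trains to minimum headway separation and take
# the resulting occupation time of the line section.  Values are
# illustrative minutes between train types at the section entrance.
MIN_HEADWAY = {            # (leading type, following type) -> minutes
    ("IC", "IC"): 3, ("IC", "freight"): 4,
    ("freight", "IC"): 6, ("freight", "freight"): 5,
}

def occupation_time(sequence):
    """Occupation of one compressed cycle of the given train sequence."""
    return sum(MIN_HEADWAY[(a, b)] for a, b in zip(sequence, sequence[1:]))

seq = ["IC", "IC", "freight", "IC"]
print(occupation_time(seq), "minutes")
```

Because occupation depends on the order of heterogeneous trains, a strategic model can sample or enumerate many sequences for a given train mix and report the distribution of occupation times instead of relying on one fixed timetable.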

  17. Model-Driven Policy Framework for Data Centers

    DEFF Research Database (Denmark)

    Caba, Cosmin Marius; Kentis, Angelos Mimidis; Soler, José

    2016-01-01

Data Centers (DCs) continue to become increasingly complex, due to comprising multiple functional entities (e.g. routing, orchestration). Managing the multitude of interconnected components in the DC becomes difficult and error prone, leading to slow service provisioning, lack of QoS support, etc. Moreover, the lack of simple solutions for managing the configuration and behavior of the DC components makes the DC hard to configure and slow in adapting to changes in business needs. In this paper, we propose a model-driven framework for policy-based management for DCs, to simplify not only the service …

  18. Generic modelling framework for economic analysis of battery systems

    DEFF Research Database (Denmark)

    You, Shi; Rasmussen, Claus Nygaard

    2011-01-01

Deregulated electricity markets provide opportunities for Battery Systems (BS) to participate in energy arbitrage and ancillary services (regulation, operating reserves, contingency reserves, voltage regulation, power quality, etc.). To evaluate the economic viability of BS with different business … for battery cycle life estimation, since the cycle life plays a central role in the economic analysis of BS. To illustrate the modelling framework, a case study using a Sodium Sulfur Battery (NAS) system with 5-minute regulating service is performed. The economic performances of two dispatch scenarios, a so …
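A back-of-the-envelope version of the economic analysis, with entirely illustrative numbers (not NAS battery data), shows why cycle life is central: the wear cost of a cycle can exceed the arbitrage margin it earns:

```python
# Daily energy-arbitrage margin for a battery system, net of a per-cycle
# degradation cost derived from the battery's cycle life.  All numbers
# below are invented for illustration.
capacity_kwh = 1000
efficiency = 0.80                    # round-trip efficiency
price_low, price_high = 0.04, 0.12   # $/kWh buy / sell
capex = 300_000                      # $ installed cost
cycle_life = 4500                    # full cycles until end of life

revenue_per_cycle = capacity_kwh * efficiency * price_high \
                    - capacity_kwh * price_low
degradation_cost = capex / cycle_life      # wear cost of one full cycle
net_per_cycle = revenue_per_cycle - degradation_cost
print(round(revenue_per_cycle, 2), round(degradation_cost, 2),
      round(net_per_cycle, 2))
```

With these invented numbers the net is negative: arbitrage alone does not cover wear, which is why the framework couples the dispatch model to a cycle-life estimation model before judging a business case.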

  19. An Integrated Framework Advancing Membrane Protein Modeling and Design.

    Directory of Open Access Journals (Sweden)

    Rebecca F Alford

    2015-09-01

Full Text Available Membrane proteins are critical functional molecules in the human body, constituting more than 30% of open reading frames in the human genome. Unfortunately, a myriad of difficulties in overexpression and reconstitution into membrane mimetics severely limit our ability to determine their structures. Computational tools are therefore instrumental to membrane protein structure prediction, consequently increasing our understanding of membrane protein function and their role in disease. Here, we describe a general framework facilitating membrane protein modeling and design that combines the scientific principles for membrane protein modeling with the flexible software architecture of Rosetta3. This new framework, called RosettaMP, provides a general membrane representation that interfaces with scoring, conformational sampling, and mutation routines that can be easily combined to create new protocols. To demonstrate the capabilities of this implementation, we developed four proof-of-concept applications for (1) prediction of free energy changes upon mutation; (2) high-resolution structural refinement; (3) protein-protein docking; and (4) assembly of symmetric protein complexes, all in the membrane environment. Preliminary data show that these algorithms can produce meaningful scores and structures. The data also suggest needed improvements to both sampling routines and score functions. Importantly, the applications collectively demonstrate the potential of combining the flexible nature of RosettaMP with the power of Rosetta algorithms to facilitate membrane protein modeling and design.

  20. A hybrid parallel framework for the cellular Potts model simulations

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Yi [Los Alamos National Laboratory; He, Kejing [SOUTH CHINA UNIV; Dong, Shoubin [SOUTH CHINA UNIV

    2009-01-01

The Cellular Potts Model (CPM) has been widely used for biological simulations. However, most current implementations are either sequential or approximate, and cannot be used for large-scale, complex 3D simulations. In this paper we present a hybrid parallel framework for CPM simulations. The time-consuming PDE solving, cell division, and cell reaction operations are distributed to clusters using the Message Passing Interface (MPI). The Monte Carlo lattice update is parallelized on shared-memory SMP systems using OpenMP. Because the Monte Carlo lattice update is much faster than the PDE solving and SMP systems are more and more common, this hybrid approach achieves good performance and high accuracy at the same time. Based on the parallel Cellular Potts Model, we studied avascular tumor growth using a multiscale model. The application and performance analysis show that the hybrid parallel framework is quite efficient. The hybrid parallel CPM can be used for large-scale simulation (~10^8 sites) of the complex collective behavior of numerous cells (~10^6).
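The Monte Carlo lattice update at the core of the CPM can be sketched serially; the paper parallelizes this kernel with OpenMP alongside MPI-distributed PDE solving. The sketch below uses contact (boundary) energy only, no area constraint, and invented parameters:

```python
import math, random

# Minimal serial Cellular Potts kernel: Metropolis "copy attempts" on a
# small 2D periodic lattice of cell IDs, with contact energy J per
# mismatched neighbor bond.  Starting from a random speckle of two cell
# IDs, the update coarsens the pattern and lowers the boundary energy.
random.seed(42)
N, J, T = 12, 1.0, 0.1
lattice = [[random.choice([1, 2]) for _ in range(N)] for _ in range(N)]

def neighbors(i, j):
    return [((i + di) % N, (j + dj) % N)
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))]

def boundary_energy():
    e = 0.0
    for i in range(N):
        for j in range(N):
            e += sum(J for (a, b) in neighbors(i, j)
                     if lattice[a][b] != lattice[i][j])
    return e / 2          # each mismatched bond counted twice

def local_delta(i, j, new_id):
    """Energy change of setting site (i, j) to new_id (only local bonds change)."""
    old_id = lattice[i][j]
    return sum(J * ((new_id != lattice[a][b]) - (old_id != lattice[a][b]))
               for (a, b) in neighbors(i, j))

e_init = boundary_energy()
for _ in range(20000):
    i, j = random.randrange(N), random.randrange(N)
    a, b = random.choice(neighbors(i, j))
    new_id = lattice[a][b]          # attempt to copy a neighbor's ID
    if new_id == lattice[i][j]:
        continue
    dE = local_delta(i, j, new_id)
    if dE <= 0 or random.random() < math.exp(-dE / T):
        lattice[i][j] = new_id      # Metropolis acceptance
print(e_init, boundary_energy())
```

Each copy attempt only touches one site and its neighbors, which is why the update parallelizes well on shared memory once care is taken that concurrent attempts do not touch overlapping neighborhoods.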

  1. Testing, Modeling, and Monitoring to Enable Simpler, Cheaper, Longer-Lived Surface Caps

    Energy Technology Data Exchange (ETDEWEB)

    Piet, Steven James; Breckenridge, Robert Paul; Burns, Douglas Edward

    2003-02-01

    Society has and will continue to generate hazardous wastes whose risks must be managed. For exceptionally toxic, long-lived, and feared waste, the solution is deep burial, e.g., deep geological disposal at Yucca Mtn. For some waste, recycle or destruction/treatment is possible. The alternative for other wastes is storage at or near the ground level (in someone’s back yard); most of these storage sites include a surface barrier (cap) to prevent downward water migration. Some of the hazards will persist indefinitely. As society and regulators have demanded additional proof that caps are robust against more threats and for longer time periods, the caps have become increasingly complex and expensive. As in other industries, increased complexity will eventually increase the difficulty in estimating performance, in monitoring system/component performance, and in repairing or upgrading barriers as risks are managed. An approach leading to simpler, less expensive, longer-lived, more manageable caps is needed. Our project, which started in April 2002, aims to catalyze a Barrier Improvement Cycle (iterative learning and application) and thus enable Remediation System Performance Management (doing the right maintenance neither too early nor too late). The knowledge gained and the capabilities built will help verify the adequacy of past remedial decisions, improve barrier management, and enable improved solutions for future decisions. We believe it will be possible to develop simpler, longer-lived, less expensive caps that are easier to monitor, manage, and repair. The project is planned to: a) improve the knowledge of degradation mechanisms in times shorter than service life; b) improve modeling of barrier degradation dynamics; c) develop sensor systems to identify early degradation; and d) provide a better basis for developing and testing of new barrier systems. 
This project combines selected exploratory studies (benchtop and field scale), coupled effects accelerated aging

  2. Testing, Modeling, and Monitoring to Enable Simpler, Cheaper, Longer-lived Surface Caps

    Energy Technology Data Exchange (ETDEWEB)

    Piet, S. J.; Breckenridge, R. P.; Burns, D. E.

    2003-02-25

    Society has and will continue to generate hazardous wastes whose risks must be managed. For exceptionally toxic, long-lived, and feared waste, the solution is deep burial, e.g., deep geological disposal at Yucca Mtn. For some waste, recycle or destruction/treatment is possible. The alternative for other wastes is storage at or near the ground level (in someone's back yard); most of these storage sites include a surface barrier (cap) to prevent downward water migration. Some of the hazards will persist indefinitely. As society and regulators have demanded additional proof that caps are robust against more threats and for longer time periods, the caps have become increasingly complex and expensive. As in other industries, increased complexity will eventually increase the difficulty in estimating performance, in monitoring system/component performance, and in repairing or upgrading barriers as risks are managed. An approach leading to simpler, less expensive, longer-lived, more manageable caps is needed. Our project, which started in April 2002, aims to catalyze a Barrier Improvement Cycle (iterative learning and application) and thus enable Remediation System Performance Management (doing the right maintenance neither too early nor too late). The knowledge gained and the capabilities built will help verify the adequacy of past remedial decisions, improve barrier management, and enable improved solutions for future decisions. We believe it will be possible to develop simpler, longer-lived, less expensive caps that are easier to monitor, manage, and repair. The project is planned to: (a) improve the knowledge of degradation mechanisms in times shorter than service life; (b) improve modeling of barrier degradation dynamics; (c) develop sensor systems to identify early degradation; and (d) provide a better basis for developing and testing of new barrier systems. This project combines selected exploratory studies (benchtop and field scale), coupled effects

  3. CIMS: A FRAMEWORK FOR INFRASTRUCTURE INTERDEPENDENCY MODELING AND ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Donald D. Dudenhoeffer; May R. Permann; Milos Manic

    2006-12-01

    Today’s society relies greatly upon an array of complex national and international infrastructure networks, such as transportation, utilities, telecommunications, and even financial networks. While modeling and simulation tools have provided insight into the behavior of individual infrastructure networks, a far less understood area is that of the interrelationships among multiple infrastructure networks, including the potential cascading effects that may result from these interdependencies. This paper first describes infrastructure interdependencies and presents a formalization of interdependency types. Next, the paper describes a modeling and simulation framework called CIMS© and the work being conducted at the Idaho National Laboratory (INL) to model and simulate infrastructure interdependencies and the complex behaviors that can result.

  4. A framework for modeling emerging diseases to inform management

    Science.gov (United States)

    Russell, Robin E.; Katz, Rachel A.; Richgels, Katherine L.D.; Walsh, Daniel P.; Grant, Evan

    2017-01-01

    The rapid emergence and reemergence of zoonotic diseases requires the ability to rapidly evaluate and implement optimal management decisions. Actions to control or mitigate the effects of emerging pathogens are commonly delayed because of uncertainty in the estimates and the predicted outcomes of the control tactics. The development of models that describe the best-known information regarding the disease system at the early stages of disease emergence is an essential step for optimal decision-making. Models can predict the potential effects of the pathogen, provide guidance for assessing the likelihood of success of different proposed management actions, quantify the uncertainty surrounding the choice of the optimal decision, and highlight critical areas for immediate research. We demonstrate how to develop models that can be used as a part of a decision-making framework to determine the likelihood of success of different management actions given current knowledge.
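
    A decision framework of this kind ultimately reduces to comparing the expected outcomes of candidate actions across competing disease models. A minimal sketch, with made-up actions, predicted benefits, and model weights (not values from the paper):

```python
def best_action(outcomes, weights):
    """Pick the action with the highest model-averaged expected benefit.

    outcomes[action] is a list of predicted benefits, one per candidate
    model; weights are the current probabilities assigned to each model.
    """
    return max(outcomes,
               key=lambda a: sum(o * w for o, w in zip(outcomes[a], weights)))

# Hypothetical: two management actions evaluated under two disease models
# that are currently judged equally likely.
outcomes = {"cull": [0.2, 0.8], "vaccinate": [0.6, 0.5]}
weights = [0.5, 0.5]
```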

  5. A Building Model Framework for a Genetic Algorithm Multi-objective Model Predictive Control

    DEFF Research Database (Denmark)

    Arendt, Krzysztof; Ionesi, Ana; Jradi, Muhyiddine

    2016-01-01

    implemented in only a few buildings. The following difficulties hinder the widespread usage of MPC: (1) significant model development time, (2) limited portability of models, and (3) model computational demand. In the present study a new model development framework for an MPC system based on a Genetic Algorithm (GA...
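
    A genetic algorithm in this setting evolves candidate control inputs against a cost function. The sketch below uses a toy one-dimensional cost and invented GA parameters, not the paper's building-model objective:

```python
import random

def ga_minimise(cost, lo, hi, pop_size=20, gens=40, seed=1):
    """Minimal real-valued GA: elitist selection, averaging crossover,
    Gaussian mutation. A sketch of the technique, not a tuned optimiser."""
    rng = random.Random(seed)
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        elite = pop[:pop_size // 2]           # keep the fitter half
        pop = elite + [
            # child = average of two elite parents, plus mutation,
            # clamped to the allowed range
            min(hi, max(lo, (rng.choice(elite) + rng.choice(elite)) / 2
                        + rng.gauss(0, 0.1)))
            for _ in range(pop_size - len(elite))]
    return min(pop, key=cost)
```

For example, minimising a made-up comfort penalty `(x - 21)**2` over a 15-30 degree setpoint range converges near 21.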

  6. A mathematical framework for the registration and analysis of multi-fascicle models for population studies of the brain microstructure.

    Science.gov (United States)

    Taquet, Maxime; Scherrer, Benoit; Commowick, Olivier; Peters, Jurriaan M; Sahin, Mustafa; Macq, Benoit; Warfield, Simon K

    2014-02-01

    Diffusion tensor imaging (DTI) is unable to represent the diffusion signal arising from multiple crossing fascicles and freely diffusing water molecules. Generative models of the diffusion signal, such as multi-fascicle models, overcome this limitation by providing a parametric representation for the signal contribution of each population of water molecules. These models are of great interest in population studies to characterize and compare the brain microstructural properties. Central to population studies is the construction of an atlas and the registration of all subjects to it. However, the appropriate definition of registration and atlasing methods for multi-fascicle models has proven challenging. This paper proposes a mathematical framework to register and analyze multi-fascicle models. Specifically, we define novel operators to achieve interpolation, smoothing and averaging of multi-fascicle models. We also define a novel similarity metric to spatially align multi-fascicle models. Our framework enables simultaneous comparisons of different microstructural properties that are confounded in conventional DTI. The framework is validated on multi-fascicle models from 24 healthy subjects and 38 patients with tuberous sclerosis complex, 10 of whom have autism. We demonstrate the use of the multi-fascicle model registration and analysis framework in a population study of autism spectrum disorder.

  7. Stochastic Hybrid Systems Modeling and Middleware-enabled DDDAS for Next-generation US Air Force Systems

    Science.gov (United States)

    2017-03-30

    AFRL-AFOSR-VA-TR-2017-0075: Stochastic Hybrid Systems Modeling and Middleware-enabled DDDAS for Next-generation US Air Force Systems. Aniruddha... Air Force Research Laboratory, AF Office Of Scientific Research (AFOSR)/RTA2. Reporting period: Sep 2013 to 31 Dec 2016. Approved for public release.

  8. A MULTISCALE, CELL-BASED FRAMEWORK FOR MODELING CANCER DEVELOPMENT

    Energy Technology Data Exchange (ETDEWEB)

    JIANG, YI [Los Alamos National Laboratory

    2007-01-16

    Cancer remains one of the leading causes of death from disease. We use a systems approach that combines mathematical modeling, numerical simulation, and in vivo and in vitro experiments to develop a predictive model that medical researchers can use to study and treat cancerous tumors. The multiscale, cell-based model includes intracellular regulation, cellular-level dynamics and intercellular interactions, and extracellular-level chemical dynamics. The intracellular protein regulation and signaling pathways are described by Boolean networks. The cellular-level growth and division dynamics, cellular adhesion, and interaction with the extracellular matrix are described by a lattice Monte Carlo model (the Cellular Potts Model). The extracellular dynamics of the signaling molecules and metabolites are described by a system of reaction-diffusion equations. All three levels of the model are integrated through a hybrid parallel scheme into a high-performance simulation tool. The simulation results reproduce experimental data on both avascular tumors and tumor angiogenesis. By combining the model with experimental data to construct biologically accurate simulations of tumors and their vascular systems, this model will enable medical researchers to gain a deeper understanding of the cellular and molecular interactions associated with cancer progression and treatment.
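
    The intracellular layer described above can be sketched as a synchronous Boolean network. The three-node regulatory motif below is hypothetical, chosen only to show the update style, not the model's actual signaling network:

```python
def step(state):
    """One synchronous update of a toy 3-node Boolean regulatory motif."""
    growth, inhibitor, divide = state
    return (
        not inhibitor,              # growth signal is on unless inhibited
        growth and divide,          # inhibitor is induced by proliferation
        growth and not inhibitor,   # division needs uninhibited growth
    )

def run(state, n=10):
    """Iterate the network n steps from a given start state."""
    for _ in range(n):
        state = step(state)
    return state
```

With these (invented) rules the start state (True, False, False) lies on a period-5 cycle, so ten steps return it to itself.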

  9. Instant e-Teaching Framework Model for Live Online Teaching

    CERN Document Server

    Safei, Suhailan; Rose, Ahmad Nazari Mohd; Rahman, Mohd Nordin Abdul

    2011-01-01

    Instant e-Teaching is a new concept that supplements the e-Teaching and e-Learning environment in providing full and comprehensive modern education styles. E-Learning technology embodies the concept of enabling self-learning among students on a certain subject using online references and materials, while instant e-Teaching requires a 'face-to-face' characteristic between teacher and student to simultaneously execute actions and gain instant responses. The word 'instant' enhances e-Teaching with the concept of real-time teaching. The challenge of exercising online and instant teaching lies not merely in the technologies and system efficiency: the system must also be usable and friendly enough to replicate the traditional class environment during delivery of the class. For this purpose, an instant e-Teaching framework has been developed that emulates a dedicated virtual classroom and is primarily designed for synchronous, live sharing of current teaching notes. The m...

  10. Genome-Enabled Modeling of Biogeochemical Processes Predicts Metabolic Dependencies that Connect the Relative Fitness of Microbial Functional Guilds

    Science.gov (United States)

    Brodie, E.; King, E.; Molins, S.; Karaoz, U.; Steefel, C. I.; Banfield, J. F.; Beller, H. R.; Anantharaman, K.; Ligocki, T. J.; Trebotich, D.

    2015-12-01

    Pore-scale processes mediated by microorganisms underlie a range of critical ecosystem services, regulating carbon stability, nutrient flux, and the purification of water. Advances in cultivation-independent approaches now provide us with the ability to reconstruct thousands of genomes from microbial populations, from which functional roles may be assigned. With this capability to reveal microbial metabolic potential, the next step is to put these microbes back where they belong, to interact with their natural environment at the pore scale. At this scale, microorganisms communicate, cooperate and compete across their fitness landscapes, and the communities that emerge feed back on the physical and chemical properties of their environment, ultimately altering the fitness landscape and selecting for new microbial communities with new properties, and so on. We have developed a trait-based model of microbial activity that simulates coupled functional guilds parameterized with unique combinations of traits that govern fitness under dynamic conditions. Using a reactive transport framework, we simulate the thermodynamics of coupled electron donor-acceptor reactions to predict the energy available for cellular maintenance, respiration, biomass development, and enzyme production. From metagenomics, we directly estimate some trait values related to growth and identify the linkage of key traits associated with respiration and fermentation, macromolecule-depolymerizing enzymes, and other key functions such as nitrogen fixation. Our simulations were carried out to explore abiotic controls on community emergence, such as seasonally fluctuating water-table regimes across floodplain organic-matter hotspots. Simulations and metagenomic/metatranscriptomic observations highlighted the many dependencies connecting the relative fitness of functional guilds and the importance of chemolithoautotrophic lifestyles.
Using an X-Ray microCT-derived soil microaggregate physical model combined

  11. An integrated modelling framework for neural circuits with multiple neuromodulators

    Science.gov (United States)

    Vemana, Vinith

    2017-01-01

    Neuromodulators are endogenous neurochemicals that regulate biophysical and biochemical processes, which control brain function and behaviour, and are often the targets of neuropharmacological drugs. Neuromodulator effects are generally complex, partly owing to the involvement of broad innervation, co-release of neuromodulators, complex intra- and extrasynaptic mechanisms, the existence of multiple receptor subtypes and high interconnectivity within the brain. In this work, we propose an efficient yet sufficiently realistic computational neural modelling framework to study some of these complex behaviours. Specifically, we propose a novel dynamical neural circuit model that integrates the effective neuromodulator-induced currents based on various experimental data (e.g. electrophysiology, neuropharmacology and voltammetry). The model can incorporate multiple interacting brain regions, including neuromodulator sources, simulates efficiently, and is easily extendable to large-scale brain models, e.g. for neuroimaging purposes. As an example, we model a network of mutually interacting neural populations in the lateral hypothalamus, dorsal raphe nucleus and locus coeruleus, which are major sources of the neuromodulators orexin/hypocretin, serotonin and norepinephrine/noradrenaline, respectively, and which play significant roles in regulating many physiological functions. We demonstrate that such a model can provide predictions of systemic drug effects of the popular antidepressants (e.g. reuptake inhibitors), neuromodulator antagonists or their combinations. Finally, we developed user-friendly graphical user interface software for model simulation and visualization for both fundamental sciences and pharmacological studies. PMID:28100828
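
    The modelling style described, effective neuromodulator-induced currents acting on interacting neural populations, can be sketched as a small firing-rate model in which a single gain parameter stands in for neuromodulator level. All parameters below are invented; this is not the paper's fitted hypothalamus/raphe/coeruleus circuit:

```python
import math

def sigmoid(x):
    """Population input-output function (arbitrary slope and threshold)."""
    return 1.0 / (1.0 + math.exp(-4.0 * (x - 0.5)))

def simulate(t_end=1.0, dt=0.001, gain=1.2):
    """Euler-integrate two coupled firing-rate populations.

    Population 2 is driven by population 1 with a 'neuromodulator' gain;
    population 2 in turn inhibits population 1.
    """
    r1 = r2 = 0.0   # firing rates, normalised to [0, 1]
    tau = 0.05      # shared time constant (s)
    for _ in range(int(t_end / dt)):
        drive1 = 1.0 - 0.5 * r2     # external drive minus inhibition
        drive2 = gain * r1          # modulated excitation from pop 1
        r1 += dt / tau * (-r1 + sigmoid(drive1))
        r2 += dt / tau * (-r2 + sigmoid(drive2))
    return r1, r2
```

Raising the gain (e.g. mimicking a reuptake inhibitor increasing effective neuromodulator tone) raises the steady-state rate of the downstream population.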

  12. Assessing pharmacokinetics of different doses of fosfomycin in laboratory rats enables adequate exposure for pharmacodynamic models.

    Science.gov (United States)

    Poeppl, Wolfgang; Lingscheid, Tilman; Bernitzky, Dominik; Donath, Oliver; Reznicek, Gottfried; Zeitlinger, Markus; Burgmann, Heinz

    2014-01-01

    Fosfomycin has been the subject of numerous pharmacodynamic in vivo models in recent years. The present study set out to determine fosfomycin pharmacokinetics in laboratory rats to enable adequate dosing regimens in future rodent models. Fosfomycin was given intraperitoneally as single doses of 75, 200 and 500 mg/kg bodyweight to 4 Sprague-Dawley rats per dose group. Blood samples were collected over 8 h and fosfomycin concentrations were determined by HPLC-mass spectrometry. Fosfomycin showed a dose-proportional pharmacokinetic profile, indicated by a correlation of 0.99 for both maximum concentration and area under the concentration-time curve (AUC). The mean AUC0-8 values after intraperitoneal administration of 75, 200 or 500 mg/kg bodyweight fosfomycin were 109.4, 387.0 and 829.1 µg·h/ml, respectively. In conclusion, a dosing regimen of 200-500 mg/kg 3 times daily is appropriate to obtain serum concentrations in laboratory rats that closely mimic human serum concentrations over time.
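
    AUC values like those above are typically computed from sparse concentration samples by the trapezoidal rule, which can be sketched as follows (the times and concentrations in the example are illustrative, not the study's measured rat data):

```python
def auc(times, concs):
    """Trapezoidal AUC(0-t) from sampling times and concentrations."""
    pairs = list(zip(times, concs))
    return sum((t2 - t1) * (c1 + c2) / 2.0
               for (t1, c1), (t2, c2) in zip(pairs, pairs[1:]))

# Illustrative profile: 0 at dosing, peak of 10 at 1 h, back to 0 at 2 h.
example_auc = auc([0.0, 1.0, 2.0], [0.0, 10.0, 0.0])
```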

  13. Enabling Grid Computing resources within the KM3NeT computing model

    Directory of Open Access Journals (Sweden)

    Filippidis Christos

    2016-01-01

    KM3NeT is a future European deep-sea research infrastructure hosting a new generation of neutrino detectors that – located at the bottom of the Mediterranean Sea – will open a new window on the universe and answer fundamental questions in both particle physics and astrophysics. International collaborative scientific experiments like KM3NeT are generating datasets which are increasing exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. These experiments, in their majority, adopt computing models consisting of different Tiers, with several computing centres providing a specific set of services for the different steps of data processing, such as detector calibration, simulation and data filtering, reconstruction and analysis. The computing requirements are extremely demanding and usually span from serial to multi-parallel or GPU-optimized jobs. The collaborative nature of these experiments demands very frequent WAN data transfers and data sharing among individuals and groups. In order to support these demanding computing requirements, we enabled Grid Computing resources, operated by EGI, within the KM3NeT computing model. In this study we describe our first advances in this field and the method for KM3NeT users to utilize the EGI computing resources in a simulation-driven use-case.

  15. A rotamer library to enable modeling and design of peptoid foldamers.

    Science.gov (United States)

    Renfrew, P Douglas; Craven, Timothy W; Butterfoss, Glenn L; Kirshenbaum, Kent; Bonneau, Richard

    2014-06-18

    Peptoids are a family of synthetic oligomers composed of N-substituted glycine units. Along with other "foldamer" systems, peptoid oligomer sequences can be predictably designed to form a variety of stable secondary structures. It is not yet evident if foldamer design can be extended to reliably create tertiary structure features that mimic more complex biomolecular folds and functions. Computational modeling and prediction of peptoid conformations will likely play a critical role in enabling complex biomimetic designs. We introduce a computational approach to provide accurate conformational and energetic parameters for peptoid side chains needed for successful modeling and design. We find that peptoids can be described by a "rotamer" treatment, similar to that established for proteins, in which the peptoid side chains display rotational isomerism to populate discrete regions of the conformational landscape. Because of the insufficient number of solved peptoid structures, we have calculated the relative energies of side-chain conformational states to provide a backbone-dependent (BBD) rotamer library for a set of 54 different peptoid side chains. We evaluated two rotamer library development methods that employ quantum mechanics (QM) and/or molecular mechanics (MM) energy calculations to identify side-chain rotamers. We show by comparison to experimental peptoid structures that both methods provide an accurate prediction of peptoid side chain placements in folded peptoid oligomers and at protein interfaces. We have incorporated our peptoid rotamer libraries into ROSETTA, a molecular design package previously validated in the context of protein design and structure prediction.
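
    A backbone-dependent rotamer library is, at its core, a lookup from binned backbone dihedrals to side-chain conformers and their probabilities. A minimal sketch with invented angles, bin width, and weights (the actual library covers 54 peptoid side chains and is distributed with ROSETTA):

```python
# Hypothetical backbone-dependent entries: (phi, psi) bin -> list of
# (chi1 angle in degrees, probability) rotamers. Values are made up.
LIBRARY = {
    (-60, -40): [(-65.0, 0.7), (180.0, 0.2), (65.0, 0.1)],
}

def lookup(phi, psi, width=20):
    """Snap backbone dihedrals to the library grid and return rotamers."""
    key = (int(round(phi / width)) * width,
           int(round(psi / width)) * width)
    return LIBRARY.get(key, [])
```

A query near a stored bin returns its rotamer list, most probable first; a query far from any stored bin returns nothing.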

  16. A python framework for environmental model uncertainty analysis

    Science.gov (United States)

    White, Jeremy; Fienen, Michael; Doherty, John E.

    2016-01-01

    We have developed pyEMU, a python framework for Environmental Modeling Uncertainty analyses: an open-source tool that is non-intrusive, easy to use, computationally efficient, and scalable to highly parameterized inverse problems. The framework implements several types of linear (first-order, second-moment (FOSM)) and non-linear uncertainty analyses. The FOSM-based analyses can also be completed prior to parameter estimation to help inform important modeling decisions, such as parameterization and objective function formulation. Complete workflows for several types of FOSM-based and non-linear analyses are documented in example notebooks, implemented using Jupyter, that are available in the online pyEMU repository. Example workflows include basic parameter and forecast analyses, data worth analyses, and error-variance analyses, as well as usage of parameter ensemble generation and management capabilities. These workflows document the necessary steps and provide insights into the results, with the goal of educating users not only in how to apply pyEMU, but also in the underlying theory of applied uncertainty quantification.
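
    At the heart of a FOSM analysis is the linear propagation of parameter covariance through forecast sensitivities. The generic sketch below shows only that calculation; it does not use pyEMU's actual API, and the example covariance and sensitivities are invented:

```python
def fosm_variance(cov, sens):
    """First-order, second-moment forecast variance: sigma^2 = s^T C s.

    cov is the parameter covariance matrix (list of rows); sens is the
    forecast's sensitivity to each parameter.
    """
    n = len(sens)
    return sum(sens[i] * cov[i][j] * sens[j]
               for i in range(n) for j in range(n))

# With an identity covariance (independent, unit-variance parameters)
# the forecast variance is just the squared length of the sensitivity
# vector: 3^2 + 4^2 = 25.
var = fosm_variance([[1.0, 0.0], [0.0, 1.0]], [3.0, 4.0])
```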

  17. Receptor modeling application framework for particle source apportionment.

    Science.gov (United States)

    Watson, John G; Zhu, Tan; Chow, Judith C; Engelbrecht, Johann; Fujita, Eric M; Wilson, William E

    2002-12-01

    Receptor models infer contributions from particulate matter (PM) source types using multivariate measurements of particle chemical and physical properties. Receptor models complement source models that estimate concentrations from emissions inventories and transport meteorology. Enrichment factor, chemical mass balance, multiple linear regression, eigenvector, edge detection, neural network, aerosol evolution, and aerosol equilibrium models have all been used to solve particulate air quality problems, and more than 500 citations of their theory and application document these uses. While elements, ions, and carbons were often used to apportion TSP, PM10, and PM2.5 among many source types, many of these components have been so reduced in source emissions that more complex observables (carbon fractions, specific organic compounds, single-particle characteristics, and isotopic abundances) now need to be measured in source and receptor samples. Compliance monitoring networks are not usually designed to obtain data for the observables, locations, and time periods that allow receptor models to be applied. Measurements from existing networks can be used to form conceptual models that allow the needed monitoring network to be optimized. The framework for using receptor models to solve air quality problems consists of: (1) formulating a conceptual model; (2) identifying potential sources; (3) characterizing source emissions; (4) obtaining and analyzing ambient PM samples for major components and source markers; (5) confirming source types with multivariate receptor models; (6) quantifying source contributions with the chemical mass balance; (7) estimating profile changes and the limiting precursor gases for secondary aerosols; and (8) reconciling receptor modeling results with source models, emissions inventories, and receptor data analyses.
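
    Step (6), quantifying source contributions with the chemical mass balance, amounts to a least-squares fit of source profiles to receptor concentrations. A bare-bones sketch (effective-variance weighting, which real CMB applications use, is omitted; the profiles and concentrations below are invented):

```python
def cmb(profiles, concs):
    """Ordinary least-squares source apportionment.

    profiles: rows are chemical species, columns are sources (fraction of
    each source's mass that is that species); concs: measured receptor
    concentration of each species. Returns estimated source contributions
    by solving the normal equations (F^T F) s = F^T c.
    """
    m = len(profiles[0])
    ata = [[sum(row[i] * row[j] for row in profiles) for j in range(m)]
           for i in range(m)]
    atb = [sum(row[i] * c for row, c in zip(profiles, concs))
           for i in range(m)]
    # Gaussian elimination without pivoting (fine for this tiny demo)
    for k in range(m):
        for r in range(k + 1, m):
            f = ata[r][k] / ata[k][k]
            ata[r] = [a - f * b for a, b in zip(ata[r], ata[k])]
            atb[r] -= f * atb[k]
    s = [0.0] * m
    for k in range(m - 1, -1, -1):
        s[k] = (atb[k] - sum(ata[k][j] * s[j]
                             for j in range(k + 1, m))) / ata[k][k]
    return s
```

For three species and two hypothetical sources contributing 2 and 3 mass units, the fit recovers those contributions exactly.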

  18. Enabling immersive simulation.

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Josh (University of California Santa Cruz, Santa Cruz, CA); Mateas, Michael (University of California Santa Cruz, Santa Cruz, CA); Hart, Derek H.; Whetzel, Jonathan; Basilico, Justin Derrick; Glickman, Matthew R.; Abbott, Robert G.

    2009-02-01

    The object of the 'Enabling Immersive Simulation for Complex Systems Analysis and Training' LDRD has been to research, design, and engineer a capability to develop simulations which (1) provide a rich, immersive interface for participation by real humans (exploiting existing high-performance game-engine technology wherever possible), and (2) can leverage Sandia's substantial investment in high-fidelity physical and cognitive models implemented in the Umbra simulation framework. We report here on these efforts. First, we describe the integration of Sandia's Umbra modular simulation framework with the open-source Delta3D game engine. Next, we report on Umbra's integration with Sandia's Cognitive Foundry, specifically to provide for learning behaviors for 'virtual teammates' directly from observed human behavior. Finally, we describe the integration of Delta3D with the ABL behavior engine, and report on research into establishing the theoretical framework that will be required to make use of tools like ABL to scale up to increasingly rich and realistic virtual characters.

  19. A Model-driven Framework for Educational Game Design

    Directory of Open Access Journals (Sweden)

    Bill Roungas

    2016-09-01

    Educational games are a class of serious games whose main purpose is to teach some subject to their players. Despite the many existing design frameworks, these games are too often created in an ad-hoc manner, and typically without the use of a game design document (GDD). We argue that a reason for this phenomenon is that current ways to structure, create and update GDDs do not increase the value of the artifact in the design and development process. As a solution, we propose a model-driven, web-based knowledge management environment that supports game designers in the creation of a GDD that accounts for and relates educational and entertainment game elements. The foundation of our approach is the conceptual model we devised for educational games, which also defines the structure of the design environment. We present promising results from an evaluation of our environment with eight experts in serious games.

  20. A Bisimulation-based Hierarchical Framework for Software Development Models

    Directory of Open Access Journals (Sweden)

    Ping Liang

    2013-08-01

    Software development models have matured since the emergence of software engineering: the waterfall model, V-model, spiral model, etc. To ensure the successful implementation of those models, various metrics for software products and development processes have been developed alongside them, such as CMMI, software metrics, and process re-engineering. These keep the quality of software products and processes as consistent as possible, so that the abstract integrity of a software product can be achieved. In practice, however, the maintenance cost of software products remains high, and even grows as software evolves, because of inconsistencies introduced by changes and by inherent errors in the products. It is better to build a robust software product that can sustain as many changes as possible. Therefore, this paper proposes a process-algebra-based hierarchical framework that extracts an abstract equivalent of the deliverable at the end of each phase of a software product from its software development models. The process-algebra equivalent of the deliverable is developed hierarchically alongside the development of the software product, and bisimulation is applied to test-run the phase deliverables, guaranteeing the consistency and integrity of the software development and product in a simple, mathematical way. An algorithm is also given to carry out the assessment of the phase deliverable in process algebra.
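
    Bisimulation of two phase deliverables, viewed as labelled transition systems, can be checked by greatest-fixpoint refinement. A naive sketch with an invented transition relation; the example is the classic pair of processes that are trace-equivalent but not bisimilar:

```python
def bisimilar(trans, s, t):
    """Greatest-fixpoint bisimilarity over transitions (src, label, dst)."""
    states = {p for p, _, _ in trans} | {q for _, _, q in trans}
    rel = {(p, q) for p in states for q in states}

    def ok(p, q, rel):
        # Each move of p must be matched by q within rel, and vice versa.
        p_moves = [(a, p2) for src, a, p2 in trans if src == p]
        q_moves = [(a, q2) for src, a, q2 in trans if src == q]
        return (all(any(a == b and (p2, q2) in rel for b, q2 in q_moves)
                    for a, p2 in p_moves)
                and all(any(a == b and (p2, q2) in rel for a, p2 in p_moves)
                        for b, q2 in q_moves))

    while True:
        refined = {(p, q) for (p, q) in rel if ok(p, q, rel)}
        if refined == rel:
            return (s, t) in rel
        rel = refined

# p offers a then a choice of b or c; q chooses at the first step.
TRANS = [("p", "a", "p1"), ("p1", "b", "p2"), ("p1", "c", "p3"),
         ("q", "a", "q1"), ("q", "a", "q2"),
         ("q1", "b", "q3"), ("q2", "c", "q4")]
```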

  1. A Categorical Framework for Model Classification in the Geosciences

    Science.gov (United States)

    Hauhs, Michael; Trancón y Widemann, Baltasar; Lange, Holger

    2016-04-01

    Models have a mixed record of success in the geosciences. In meteorology, model development and implementation were among the first and most successful examples of triggering the use of computer technology in science. On the other hand, notorious problems such as the 'equifinality issue' in hydrology give models a rather mixed reputation in other areas. The most successful models in the geosciences are applications of dynamical systems theory to non-living systems or phenomena. We therefore start from the hypothesis that the success of a model application relates to the influence of life on the phenomenon under study, and we focus on the (formal) representation of life in models. The aim is to investigate whether disappointment in model performance is due to system properties such as the heterogeneity and historicity of ecosystems, or rather reflects an abstraction and formalisation problem at a fundamental level. As a formal framework for this investigation, we use category theory as applied in computer science to specify behaviour at an interface. Its methods have been developed for translating and comparing formal structures among different application areas, and they seem highly suited for a classification of the current "model zoo" in the geosciences. The approach is rather abstract, with a high degree of generality but a low level of expressibility. Here, category theory is employed to check the consistency of assumptions about life in different models. It will be shown that it is sufficient to distinguish just four logical cases to check the consistency of model content. All four cases can be formalised as variants of coalgebra-algebra homomorphisms. It can be demonstrated that transitions between the four variants affect the relevant observations (time series or spatial maps), the formalisms used (equations, decision trees) and the test criteria of success (prediction, classification) of the resulting model types. We will present examples from hydrology and ecology in

  2. The use of cloud enabled building information models – an expert analysis

    Directory of Open Access Journals (Sweden)

    Alan Redmond

    2012-12-01

    The dependency of today’s construction professionals on singular commercial applications for design creates the risk of being dictated to by the language-tools they use. This unwitting conformance to the constraints of a particular computer application’s style reduces one’s association with cutting-edge design, as no single computer application can support all of the tasks associated with building design and production. Interoperability addresses the need to pass data between applications, allowing multiple types of experts and applications to contribute to the work at hand. Cloud computing is a centralized heterogeneous platform that enables different applications to be connected to each other through remote data servers. However, the possibility of providing an interoperable process based on binding several construction applications through a single repository platform, 'cloud computing', required further analysis. The following Delphi questionnaires analysed the information-exchange opportunities of Building Information Modelling (BIM) as the possible solution for the integration of applications on a cloud platform. The survey structure is modelled to: (i) identify the most appropriate applications for advancing interoperability at the early design stage; (ii) detect the most severe barriers to BIM implementation from a business and legal viewpoint; (iii) examine the need for standards to address information exchange between design teams; and (iv) explore the use of the most common interfaces for exchanging information. The anticipated findings will assist in identifying a model that will enhance the standardized passing of information between systems at the feasibility design stage of a construction project.

  4. Design theoretic analysis of three system modeling frameworks.

    Energy Technology Data Exchange (ETDEWEB)

    McDonald, Michael James

    2007-05-01

    This paper analyzes three simulation architectures from the context of modeling scalability to address System of Systems (SoS) and Complex System problems. The paper first provides an overview of the SoS problem domain and reviews past work in analyzing model and general system complexity issues. It then identifies and explores the issues of vertical and horizontal integration as well as coupling and hierarchical decomposition as the system characteristics and metrics against which the tools are evaluated. In addition, it applies Nam Suh's Axiomatic Design theory as a construct for understanding coupling and its relationship to system feasibility. Next it describes the application of MATLAB, Swarm, and Umbra (three modeling and simulation approaches) to modeling swarms of Unmanned Aerial Vehicle (UAV) agents in relation to the chosen characteristics and metrics. Finally, it draws general conclusions for analyzing model architectures that go beyond those analyzed. In particular, it identifies decomposition along phenomena of interaction and modular system composition as enabling features for modeling large heterogeneous complex systems.

  5. An Extended Model Driven Framework for End-to-End Consistent Model Transformation

    Directory of Open Access Journals (Sweden)

    Mr. G. Ramesh

    2016-08-01

    Full Text Available Model Driven Development (MDD) results in quick transformation from models to corresponding systems. The forward-engineering features of modelling tools can help in generating source code from models. To build a robust system it is important to have consistency checking within the design models and between the design model and the transformed implementation. Our framework, the Extensible Real Time Software Design Inconsistency Checker (XRTSDIC), proposed in our previous papers, supports consistency checking in design models. This paper focuses on automatic model transformation. An algorithm and transformation rules for model transformation from UML class diagrams to ERD and SQL are proposed. Model transformation offers many advantages, such as reducing the cost of development, improving quality, enhancing productivity and improving customer satisfaction. The proposed framework has been enhanced to ensure that transformed implementations conform to their model counterparts, besides checking end-to-end consistency.
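
The transformation rules themselves are not reproduced in this record. As a generic illustration of the class-diagram-to-SQL direction (a hypothetical minimal rule, not the XRTSDIC algorithm), a UML class can be mapped to a CREATE TABLE statement roughly as follows:

```python
# Minimal sketch of one UML-class-to-SQL transformation rule
# (hypothetical illustration, not the XRTSDIC rule set).

TYPE_MAP = {"int": "INTEGER", "str": "VARCHAR(255)", "float": "REAL"}

def class_to_ddl(name, attributes, primary_key):
    """Map a UML class (name plus typed attributes) to a CREATE TABLE statement."""
    cols = []
    for attr, uml_type in attributes:
        col = f"{attr} {TYPE_MAP[uml_type]}"
        if attr == primary_key:
            col += " PRIMARY KEY"
        cols.append(col)
    return f"CREATE TABLE {name} ({', '.join(cols)});"

ddl = class_to_ddl("Customer", [("id", "int"), ("name", "str")], "id")
print(ddl)  # CREATE TABLE Customer (id INTEGER PRIMARY KEY, name VARCHAR(255));
```

A real transformation would additionally map associations to foreign keys and check the generated schema back against the model for the end-to-end consistency the abstract describes.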

  6. Usage Intention Framework Model: A Fuzzy Logic Interpretation of the Classical Utaut Model

    Science.gov (United States)

    Sandaire, Johnny

    2009-01-01

    A fuzzy conjoint analysis (FCA: Turksen, 1992) model for enhancing management decision in the technology adoption domain was implemented as an extension to the UTAUT model (Venkatesh, Morris, Davis, & Davis, 2003). Additionally, a UTAUT-based Usage Intention Framework Model (UIFM) introduced a closed-loop feedback system. The empirical evidence…

  7. Participatory Model Construction and Model Use in Natural Resource Management: a Framework for Reflection

    NARCIS (Netherlands)

    Bots, P.W.G.; Van Daalen, C.E.

    2008-01-01

    In this article we propose a framework which can assist analysts in their reflection on the requirements for a participatory modelling exercise in natural resource management. Firstly, we distinguish different types of formal models which may be developed, ranging from models that focus on (bio)phys

  9. An Integrated Hydro-Economic Modelling Framework to Evaluate Water Allocation Strategies I: Model Development.

    NARCIS (Netherlands)

    George, B.; Malano, H.; Davidson, B.; Hellegers, P.; Bharati, L.; Sylvain, M.

    2011-01-01

    In this paper an integrated modelling framework for water resources planning and management that can be used to carry out an analysis of alternative policy scenarios for water allocation and use is described. The modelling approach is based on integrating a network allocation model (REALM) and a soc

  10. How much cryosphere model complexity is just right? Exploration using the conceptual cryosphere hydrology framework

    Science.gov (United States)

    Mosier, Thomas M.; Hill, David F.; Sharp, Kendra V.

    2016-09-01

    Making meaningful projections of the impacts that possible future climates would have on water resources in mountain regions requires understanding how cryosphere hydrology model performance changes under altered climate conditions and when the model is applied to ungaged catchments. Further, if we are to develop better models, we must understand which specific process representations limit model performance. This article presents a modeling tool, named the Conceptual Cryosphere Hydrology Framework (CCHF), that enables implementing and evaluating a wide range of cryosphere modeling hypotheses. The CCHF represents cryosphere hydrology systems using a set of coupled process modules that allows easily interchanging individual module representations and includes analysis tools to evaluate model outputs. CCHF version 1 (Mosier, 2016) implements model formulations that require only precipitation and temperature as climate inputs - for example variations on simple degree-index (SDI) or enhanced temperature index (ETI) formulations - because these model structures are often applied in data-sparse mountain regions, and perform relatively well over short periods, but their calibration is known to change based on climate and geography. Using CCHF, we implement seven existing and novel models, including one existing SDI model, two existing ETI models, and four novel models that utilize a combination of existing and novel module representations. The novel module representations include a heat transfer formulation with net longwave radiation and a snowpack internal energy formulation that uses an approximation of the cold content. We assess the models for the Gulkana and Wolverine glaciated watersheds in Alaska, which have markedly different climates and contain long-term US Geological Survey benchmark glaciers. 
Overall we find that the best performing models are those that are more physically consistent and representative, but no single model performs best for all of our model
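
The degree-index formulations this abstract refers to are, in their textbook form, one-line energy proxies. The following sketch shows a generic simple degree-index (SDI) melt function and a simplified enhanced temperature-index (ETI) variant (Pellicciotti-style, ignoring albedo); parameter values are illustrative, and this is not CCHF module code:

```python
def sdi_melt(temp_c, ddf=4.0, t_base=0.0):
    """Simple degree-index melt (mm w.e./day): melt scales with degrees above
    a base temperature. The degree-day factor ddf of 4.0 mm/(deg C day) is
    an illustrative value, not a CCHF calibration."""
    return ddf * max(temp_c - t_base, 0.0)

def eti_melt(temp_c, swrad_wm2, tf=0.05, srf=0.0094, t_thresh=1.0):
    """Simplified enhanced temperature-index melt: adds a shortwave-radiation
    term to the temperature term (albedo omitted; coefficients illustrative)."""
    if temp_c <= t_thresh:
        return 0.0
    return tf * temp_c + srf * swrad_wm2

print(sdi_melt(5.0))  # 20.0 mm w.e./day
```

The calibration sensitivity noted in the abstract shows up directly here: `ddf`, `tf`, and `srf` have no fixed physical values and must be re-fitted as climate and geography change.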

  11. DATA CONTEXT MODEL IN THE PROCESS INTEGRATION FRAMEWORK

    Institute of Scientific and Technical Information of China (English)

    ZHAO Bo; YAN Yan; NING Ruxin; LI Shiyun

    2008-01-01

    Process integration is an important aspect of the product development process. Recent research focuses on project management, workflow management and process modeling. Based on an analysis of the process, the product development process is divided into three levels of granularity, from macroscopic to microscopic. Our research concentrates on the workflow and the fine-grained design process. To represent the data and the relationships among them for process integration, a context model is introduced and its characteristics are analyzed. The tree-like inheritance structure among the context model's classes is illustrated, and the reference relationships among them are explained. Extensible markup language (XML) files are then used to depict these classes. A four-tier framework for process integration has been established, in which the model-view-controller pattern is used to separate the context model from its various views. The integration of applications is achieved by encapsulating the enterprise's business logic as distributed services. A prototype system for the design of an air filter has been applied in an institute.

  12. Assessment of Solution Uncertainties in Single-Column Modeling Frameworks.

    Science.gov (United States)

    Hack, James J.; Pedretti, John A.

    2000-01-01

    Single-column models (SCMs) have been extensively promoted in recent years as an effective means to develop and test physical parameterizations targeted for more complex three-dimensional climate models. Although there are some clear advantages associated with single-column modeling, there are also some significant disadvantages, including the absence of large-scale feedbacks. Basic limitations of an SCM framework can make it difficult to interpret solutions, and at times contribute to rather striking failures to identify even first-order sensitivities as they would be observed in a global climate simulation. This manuscript will focus on one of the basic experimental approaches currently exploited by the single-column modeling community, with an emphasis on establishing the inherent uncertainties in the numerical solutions. The analysis will employ the standard physics package from the NCAR CCM3 and will illustrate the nature of solution uncertainties that arise from nonlinearities in parameterized physics. The results of this study suggest the need to make use of an ensemble methodology when conducting single-column modeling investigations.

  13. Structural uncertainty in watershed phosphorus modeling: Toward a stochastic framework

    Science.gov (United States)

    Chen, Lei; Gong, Yongwei; Shen, Zhenyao

    2016-06-01

    Structural uncertainty is an important source of model predictive errors, but few studies have been conducted on the error-transitivity from model structure to nonpoint source (NPS) prediction. In this study, we focused on the structural uncertainty caused by the algorithms and equations that are used to describe the phosphorus (P) cycle at the watershed scale. The sensitivity of simulated P to each algorithm/equation was quantified using the Soil and Water Assessment Tool (SWAT) in the Three Gorges Reservoir Area, China. The results indicated that the ratios of C:N and P:N for humic materials, as well as the algorithm of fertilization and P leaching contributed the largest output uncertainties. In comparison, the initiation of inorganic P in the soil layer and the transformation algorithm between P pools are less sensitive for the NPS-P predictions. In addition, the coefficient of variation values were quantified as 0.028-0.086, indicating that the structure-induced uncertainty is minor compared to NPS-P prediction uncertainty caused by the model input and parameters. Using the stochastic framework, the cumulative probability of simulated NPS-P data provided a trade-off between expenditure burden and desired risk. In this sense, this paper provides valuable information for the control of model structural uncertainty, and can be extrapolated to other model-based studies.
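
As context for the quoted 0.028-0.086 range, the coefficient of variation is simply the (population) standard deviation divided by the mean. With hypothetical NPS-P loads (illustrative numbers, not the paper's data):

```python
import statistics

def coefficient_of_variation(samples):
    """CV = population standard deviation / mean (dimensionless)."""
    return statistics.pstdev(samples) / statistics.fmean(samples)

# Hypothetical NPS-P loads (kg/ha) simulated under alternative model structures
loads = [1.02, 0.98, 1.05, 0.95, 1.00]
print(round(coefficient_of_variation(loads), 3))  # 0.034, within the reported range
```

A small CV across structural variants, as in the paper, indicates that swapping P-cycle algorithms shifts predictions far less than input and parameter uncertainty does.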

  14. Fullrmc, a rigid body Reverse Monte Carlo modeling package enabled with machine learning and artificial intelligence.

    Science.gov (United States)

    Aoun, Bachir

    2016-05-01

    A new Reverse Monte Carlo (RMC) package, "fullrmc", for atomic or rigid-body and molecular, amorphous, or crystalline materials is presented. fullrmc's main purpose is to provide a fully modular, fast and flexible software package that is thoroughly documented, handles complex molecules, is written in a modern programming language (Python, Cython, C and C++ where performance is needed) and complies with modern programming practices. fullrmc's approach to solving an atomic or molecular structure differs from existing RMC algorithms and software. In a nutshell, traditional RMC methods and software randomly adjust atom positions until the whole system has the greatest consistency with a set of experimental data. In contrast, fullrmc applies smart moves, endorsed with reinforcement machine learning, to groups of atoms. While fullrmc allows running traditional RMC modeling, the uniqueness of this approach resides in its ability to customize the grouping of atoms in any convenient way with no additional programming effort and to apply smarter, more physically meaningful moves to the defined groups of atoms. In addition, fullrmc provides a unique way, at almost no additional computational cost, to recur a group's selection, allowing the system to escape local minima by refining a group's position or by exploring through and beyond disallowed positions and energy barriers in the unrestricted three-dimensional space around a group.
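
For orientation, the classical RMC step that fullrmc generalizes can be sketched as follows (a generic textbook formulation, not fullrmc's API): a random move is kept if it lowers the misfit against the experimental data, and otherwise kept with a probability that decays with the misfit increase:

```python
import math
import random

def chi_squared(calc, expt, sigma=1.0):
    """Misfit between calculated and experimental data sets."""
    return sum((c - e) ** 2 for c, e in zip(calc, expt)) / sigma ** 2

def rmc_accept(chi2_old, chi2_new, rng=random.random):
    """Classical RMC (Metropolis-like) acceptance: improvements are always
    accepted; worse moves are accepted with probability
    exp(-(chi2_new - chi2_old) / 2)."""
    if chi2_new <= chi2_old:
        return True
    return rng() < math.exp(-(chi2_new - chi2_old) / 2.0)

assert rmc_accept(10.0, 5.0)  # a move that improves the fit is always kept
```

fullrmc's contribution, per the abstract, is replacing the blind single-atom move in this loop with learned, physically meaningful moves applied to user-defined groups.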

  15. The Cancer Cell Line Encyclopedia enables predictive modelling of anticancer drug sensitivity.

    Science.gov (United States)

    Barretina, Jordi; Caponigro, Giordano; Stransky, Nicolas; Venkatesan, Kavitha; Margolin, Adam A; Kim, Sungjoon; Wilson, Christopher J; Lehár, Joseph; Kryukov, Gregory V; Sonkin, Dmitriy; Reddy, Anupama; Liu, Manway; Murray, Lauren; Berger, Michael F; Monahan, John E; Morais, Paula; Meltzer, Jodi; Korejwa, Adam; Jané-Valbuena, Judit; Mapa, Felipa A; Thibault, Joseph; Bric-Furlong, Eva; Raman, Pichai; Shipway, Aaron; Engels, Ingo H; Cheng, Jill; Yu, Guoying K; Yu, Jianjun; Aspesi, Peter; de Silva, Melanie; Jagtap, Kalpana; Jones, Michael D; Wang, Li; Hatton, Charles; Palescandolo, Emanuele; Gupta, Supriya; Mahan, Scott; Sougnez, Carrie; Onofrio, Robert C; Liefeld, Ted; MacConaill, Laura; Winckler, Wendy; Reich, Michael; Li, Nanxin; Mesirov, Jill P; Gabriel, Stacey B; Getz, Gad; Ardlie, Kristin; Chan, Vivien; Myer, Vic E; Weber, Barbara L; Porter, Jeff; Warmuth, Markus; Finan, Peter; Harris, Jennifer L; Meyerson, Matthew; Golub, Todd R; Morrissey, Michael P; Sellers, William R; Schlegel, Robert; Garraway, Levi A

    2012-03-28

    The systematic translation of cancer genomic data into knowledge of tumour biology and therapeutic possibilities remains challenging. Such efforts should be greatly aided by robust preclinical model systems that reflect the genomic diversity of human cancers and for which detailed genetic and pharmacological annotation is available. Here we describe the Cancer Cell Line Encyclopedia (CCLE): a compilation of gene expression, chromosomal copy number and massively parallel sequencing data from 947 human cancer cell lines. When coupled with pharmacological profiles for 24 anticancer drugs across 479 of the cell lines, this collection allowed identification of genetic, lineage, and gene-expression-based predictors of drug sensitivity. In addition to known predictors, we found that plasma cell lineage correlated with sensitivity to IGF1 receptor inhibitors; AHR expression was associated with MEK inhibitor efficacy in NRAS-mutant lines; and SLFN11 expression predicted sensitivity to topoisomerase inhibitors. Together, our results indicate that large, annotated cell-line collections may help to enable preclinical stratification schemata for anticancer agents. The generation of genetic predictions of drug response in the preclinical setting and their incorporation into cancer clinical trial design could speed the emergence of 'personalized' therapeutic regimens.

  16. A modelling framework to simulate foliar fungal epidemics using functional-structural plant models.

    Science.gov (United States)

    Garin, Guillaume; Fournier, Christian; Andrieu, Bruno; Houlès, Vianney; Robert, Corinne; Pradal, Christophe

    2014-09-01

    Sustainable agriculture requires the identification of new, environmentally responsible strategies of crop protection. Modelling of pathosystems can allow a better understanding of the major interactions inside these dynamic systems and may lead to innovative protection strategies. In particular, functional-structural plant models (FSPMs) have been identified as a means to optimize the use of architecture-related traits. A current limitation lies in the inherent complexity of this type of modelling, and thus the purpose of this paper is to provide a framework to both extend and simplify the modelling of pathosystems using FSPMs. Different entities and interactions occurring in pathosystems were formalized in a conceptual model. A framework based on these concepts was then implemented within the open-source OpenAlea modelling platform, using the platform's general strategy of modelling plant-environment interactions and extending it to handle plant interactions with pathogens. New developments include a generic data structure for representing lesions and dispersal units, and a series of generic protocols to communicate with objects representing the canopy and its microenvironment in the OpenAlea platform. Another development is the addition of a library of elementary models involved in pathosystem modelling. Several plant and physical models are already available in OpenAlea and can be combined in models of pathosystems using this framework approach. Two contrasting pathosystems are implemented using the framework and illustrate its generic utility. Simulations demonstrate the framework's ability to simulate multiscaled interactions within pathosystems, and also show that models are modular components within the framework and can be extended. This is illustrated by testing the impact of canopy architectural traits on fungal dispersal. This study provides a framework for modelling a large number of pathosystems using FSPMs. This structure can accommodate both

  17. A Theoretical Framework for Physics Education Research: Modeling Student Thinking

    CERN Document Server

    Redish, E F

    2004-01-01

    Education is a goal-oriented field. But if we want to treat education scientifically so we can accumulate, evaluate, and refine what we learn, then we must develop a theoretical framework that is strongly rooted in objective observations and through which different theoretical models of student thinking can be compared. Much that is known in the behavioral sciences is robust and observationally based. In this paper, I draw from a variety of fields ranging from neuroscience to sociolinguistics to propose an over-arching theoretical framework that allows us to both make sense of what we see in the classroom and to compare a variety of specific theoretical approaches. My synthesis is organized around an analysis of the individual's cognition and how it interacts with the environment. This leads to a two level system, a knowledge-structure level where associational patterns dominate, and a control-structure level where one can describe expectations and epistemology. For each level, I sketch some plausible startin...

  18. Toward the quantification of a conceptual framework for movement ecology using circular statistical modeling.

    Directory of Open Access Journals (Sweden)

    Ichiro Ken Shimatani

    Full Text Available To analyze an animal's movement trajectory, a basic model is required that satisfies the following conditions: the model must have an ecological basis and its parameters must have ecological interpretations, a broad range of movement patterns should be explainable by the model, and the equations and probability distributions in the model should be mathematically tractable. Random walk models used in previous studies do not necessarily satisfy these requirements, partly because movement trajectories are often more oriented or tortuous than the models predict. By improving the modelling of turning angles, this study aims to propose a basic movement model. On the basis of the recently developed circular auto-regressive model, we introduced a new movement model and extended its applicability to capture the asymmetric effects of external factors such as wind. The model was applied to GPS trajectories of a seabird (Calonectris leucomelas) to demonstrate its applicability to various movement patterns and to explain how the model parameters are ecologically interpreted under a general conceptual framework for movement ecology. Although it is based on a simple extension of a generalized linear model to circular variables, the proposed model enables us to evaluate the effects of external factors on movement separately from the animal's internal state. For example, maximum likelihood estimates and model selection suggested that in one homing flight section, the seabird intended to fly toward the island but misjudged its navigation and was driven off-course by strong winds, while in the subsequent flight section, the seabird reset the focal direction, navigated the flight under strong wind conditions, and succeeded in approaching the island.
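
A minimal correlated random walk with von Mises turning angles, the kind of baseline the circular auto-regressive model extends, can be simulated as follows. Parameters are illustrative, and the `bias` term is a crude stand-in for an external factor such as wind, not the authors' fitted model:

```python
import math
import random

def simulate_track(n_steps, step_len=1.0, kappa=4.0, bias=0.0, seed=1):
    """Correlated random walk: each heading is the previous heading plus a
    von Mises-distributed turning angle. kappa controls directional
    persistence (larger = straighter); bias shifts the mean turn."""
    rng = random.Random(seed)
    x = y = 0.0
    heading = 0.0
    track = [(x, y)]
    for _ in range(n_steps):
        heading += rng.vonmisesvariate(bias, kappa)  # draw a turning angle
        x += step_len * math.cos(heading)
        y += step_len * math.sin(heading)
        track.append((x, y))
    return track

track = simulate_track(100)
print(len(track))  # 101 positions (start plus 100 steps)
```

The paper's model goes further by making the turning-angle distribution auto-regressive and by letting covariates such as wind enter asymmetrically, so internal and external drivers of movement can be estimated separately.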

  19. Long-Term Impact of an Electronic Health Record-Enabled, Team-Based, and Scalable Population Health Strategy Based on the Chronic Care Model

    Science.gov (United States)

    Kawamoto, Kensaku; Anstrom, Kevin J; Anderson, John B; Bosworth, Hayden B; Lobach, David F; McAdam-Marx, Carrie; Ferranti, Jeffrey M; Shang, Howard; Yarnall, Kimberly S H

    2016-01-01

    The Chronic Care Model (CCM) is a promising framework for improving population health, but little is known regarding the long-term impact of scalable, informatics-enabled interventions based on this model. To address this challenge, this study evaluated the long-term impact of implementing a scalable, electronic health record (EHR)-enabled, and CCM-based population health program to replace a labor-intensive legacy program in 18 primary care practices. Interventions included point-of-care decision support, quality reporting, team-based care, patient engagement, and provider education. Among 6,768 patients with diabetes receiving care over 4 years, hemoglobin A1c levels remained stable during the 2-year pre-intervention and post-intervention periods (0.03% and 0% increases, respectively), compared to a 0.42% increase expected based on A1c progression observed in the United Kingdom Prospective Diabetes Study long-term outcomes cohort. The results indicate that an EHR-enabled, team-based, and scalable population health strategy based on the CCM may be effective and efficient for managing population health.

  20. Enabling Parametric Optimal Ascent Trajectory Modeling During Early Phases of Design

    Science.gov (United States)

    Holt, James B.; Dees, Patrick D.; Diaz, Manuel J.

    2015-01-01

    -modal due to the interaction of various constraints. Additionally, when these obstacles are coupled with the Program to Optimize Simulated Trajectories (POST) [1], an industry-standard but difficult-to-use program for optimizing ascent trajectories, expert trajectory analysts are required to optimize a vehicle's ascent trajectory effectively. As has been pointed out, the paradigm of trajectory optimization remains a very manual one, because applying modern computational resources to POST is still challenging: the nuances and difficulties involved in correctly utilizing, and therefore automating, the program present a large problem. To address these issues, the authors discuss a methodology that is twofold: first, a set of heuristics is introduced that was captured while working with expert analysts to replicate the current state of the art; second, the power of modern computing is leveraged to evaluate multiple trajectories simultaneously and thereby enable exploration of the trajectory design space early in the pre-conceptual and conceptual phases of design. When this methodology is coupled with design of experiments to train surrogate models, the authors were able to visualize the trajectory design space, enabling parametric optimal ascent trajectory information to be introduced alongside other pre-conceptual and conceptual design tools. The potential impact of this methodology's success would be a fully automated POST evaluation suite for conceptual and preliminary design trade studies. This would enable engineers to characterize the ascent trajectory's sensitivity to design changes in an arbitrary number of dimensions and to find settings for trajectory-specific variables that result in optimal performance for a "dialed-in" launch vehicle design.
The effort described in this paper was developed for the Advanced Concepts Office [2] at NASA Marshall

  1. TP-model transformation-based-control design frameworks

    CERN Document Server

    Baranyi, Péter

    2016-01-01

    This book covers new aspects and frameworks of control, design, and optimization based on the TP model transformation and its various extensions. The author outlines the three main steps of polytopic and LMI based control design: 1) development of the qLPV state-space model, 2) generation of the polytopic model; and 3) application of LMI to derive controller and observer. He goes on to describe why literature has extensively studied LMI design, but has not focused much on the second step, in part because the generation and manipulation of the polytopic form was not tractable in many cases. The author then shows how the TP model transformation facilitates this second step and hence reveals new directions, leading to powerful design procedures and the formulation of new questions. The chapters of this book, and the complex dynamical control tasks which they cover, are organized so as to present and analyze the beneficial aspect of the family of approaches (control, design, and optimization). Additionally, the b...

  2. A Multiple Reaction Modelling Framework for Microbial Electrochemical Technologies

    Science.gov (United States)

    Oyetunde, Tolutola; Sarma, Priyangshu M.; Ahmad, Farrukh; Rodríguez, Jorge

    2017-01-01

    A mathematical model for the theoretical evaluation of microbial electrochemical technologies (METs) is presented that incorporates a detailed physico-chemical framework, includes multiple reactions (both at the electrodes and in the bulk phase) and involves a variety of microbial functional groups. The model is applied to two theoretical case studies: (i) A microbial electrolysis cell (MEC) for continuous anodic volatile fatty acids (VFA) oxidation and cathodic VFA reduction to alcohols, for which the theoretical system response to changes in applied voltage and VFA feed ratio (anode-to-cathode) as well as membrane type are investigated. This case involves multiple parallel electrode reactions in both anode and cathode compartments; (ii) A microbial fuel cell (MFC) for cathodic perchlorate reduction, in which the theoretical impact of feed flow rates and concentrations on the overall system performance are investigated. This case involves multiple electrode reactions in series in the cathode compartment. The model structure captures interactions between important system variables based on first principles and provides a platform for the dynamic description of METs involving electrode reactions both in parallel and in series and in both MFC and MEC configurations. Such a theoretical modelling approach, largely based on first principles, appears promising in the development and testing of MET control and optimization strategies. PMID:28054959

  3. Improving NASA's Multiscale Modeling Framework for Tropical Cyclone Climate Study

    Science.gov (United States)

    Shen, Bo-Wen; Nelson, Bron; Cheung, Samson; Tao, Wei-Kuo

    2013-01-01

    One of the current challenges in tropical cyclone (TC) research is how to improve our understanding of TC interannual variability and the impact of climate change on TCs. Recent advances in global modeling, visualization, and supercomputing technologies at NASA show potential for such studies. In this article, the authors discuss recent scalability improvement to the multiscale modeling framework (MMF) that makes it feasible to perform long-term TC-resolving simulations. The MMF consists of the finite-volume general circulation model (fvGCM), supplemented by a copy of the Goddard cumulus ensemble model (GCE) at each of the fvGCM grid points, giving 13,104 GCE copies. The original fvGCM implementation has a 1D data decomposition; the revised MMF implementation retains the 1D decomposition for most of the code, but uses a 2D decomposition for the massive copies of GCEs. Because the vast majority of computation time in the MMF is spent computing the GCEs, this approach can achieve excellent speedup without incurring the cost of modifying the entire code. Intelligent process mapping allows differing numbers of processes to be assigned to each domain for load balancing. The revised parallel implementation shows highly promising scalability, obtaining a nearly 80-fold speedup by increasing the number of cores from 30 to 3,335.

  4. Modular GIS Framework for National Scale Hydrologic and Hydraulic Modeling Support

    Science.gov (United States)

    Djokic, D.; Noman, N.; Kopp, S.

    2015-12-01

    Geographic information systems (GIS) have been extensively used for pre- and post-processing of hydrologic and hydraulic models at multiple scales. An extensible GIS-based framework was developed for characterization of drainage systems (stream networks, catchments, floodplain characteristics) and model integration. The framework is implemented as a set of free, open-source Python tools that build on core ArcGIS functionality and use geoprocessing capabilities to ensure extensibility. Utilization of COTS GIS core capabilities allows immediate use of model results in a variety of existing online applications and integration with other data sources and applications. The poster presents the use of this framework to downscale global hydrologic models to the local hydraulic scale, post-process the hydraulic modeling results, and generate floodplains at any local resolution. Flow forecasts from ECMWF or WRF-Hydro are downscaled and combined with other ancillary data for input into the RAPID flood routing model. RAPID model results (stream flow along each reach) are ingested into a GIS-based, scale-dependent stream network database for efficient flow utilization and visualization over space and time. Once the flows are known at localized reaches, the tools can be used to derive the floodplain depth and extent for each time step in the forecast at any available local resolution. If existing rating curves are available, they can be used to relate the flow to the depth of flooding, or synthetic rating curves can be derived using the tools in the toolkit and some ancillary data/assumptions. The results can be published as time-enabled spatial services to be consumed by web applications that use floodplain information as an input. Some of the existing online presentation templates can be easily combined with available online demographic and infrastructure data to present the impact of the potential floods on the local community through simple, end-user products. This framework

  5. Enabling School Structure, Collective Responsibility, and a Culture of Academic Optimism: Toward a Robust Model of School Performance in Taiwan

    Science.gov (United States)

    Wu, Jason H.; Hoy, Wayne K.; Tarter, C. John

    2013-01-01

    Purpose: The purpose of this research is twofold: to test a theory of academic optimism in Taiwan elementary schools and to expand the theory by adding new variables, collective responsibility and enabling school structure, to the model. Design/methodology/approach: Structural equation modeling was used to test, refine, and expand an…

  6. Energy Consumption Model and Measurement Results for Network Coding-enabled IEEE 802.11 Meshed Wireless Networks

    DEFF Research Database (Denmark)

    Paramanathan, Achuthan; Rasmussen, Ulrik Wilken; Hundebøll, Martin

    2012-01-01

    This paper presents an energy model and energy measurements for network coding enabled wireless meshed networks based on IEEE 802.11 technology. The energy model and the energy measurement testbed is limited to a simple Alice and Bob scenario. For this toy scenario we compare the energy usages...
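
The energy saving in such an Alice-and-Bob relay scenario rests on classic XOR network coding: the relay broadcasts one coded packet instead of forwarding two, and each endpoint recovers the other's packet by XORing the broadcast with its own. A toy demonstration (generic illustration; the paper's model additionally accounts for per-packet transmission and coding energy):

```python
def xor_bytes(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length packets byte by byte."""
    return bytes(x ^ y for x, y in zip(a, b))

# Alice and Bob each send a packet to the relay (2 transmissions); the relay
# broadcasts a single XOR-coded packet (1 transmission) instead of forwarding
# both separately (2 transmissions): 3 transmissions total instead of 4.
alice_pkt = b"hello bob"
bob_pkt   = b"hi, alice"  # equal length, as in the toy scenario
coded = xor_bytes(alice_pkt, bob_pkt)

# Each side decodes the broadcast by XORing with its own packet
assert xor_bytes(coded, alice_pkt) == bob_pkt
assert xor_bytes(coded, bob_pkt) == alice_pkt
```

Fewer transmissions at the relay is the mechanism by which network coding can reduce energy usage, which is what the paper's model and testbed measurements quantify for IEEE 802.11.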

  7. Coding conventions and principles for a National Land-Change Modeling Framework

    Science.gov (United States)

    Donato, David I.

    2017-07-14

    This report establishes specific rules for writing computer source code for use with the National Land-Change Modeling Framework (NLCMF). These rules consist of conventions and principles for writing code primarily in the C and C++ programming languages. Collectively, these coding conventions and principles create an NLCMF programming style. In addition to detailed naming conventions, this report provides general coding conventions and principles intended to facilitate the development of high-performance software implemented with code that is extensible, flexible, and interoperable. Conventions for developing modular code are explained in general terms and also enabled and demonstrated through the appended templates for C++ base source-code and header files. The NLCMF limited-extern approach to module structure, code inclusion, and cross-module access to data is explained in the text and illustrated through the module templates. Advice on the use of global variables is provided.

  8. Physical microscopic free-choice model in the framework of a Darwinian approach to quantum mechanics

    Energy Technology Data Exchange (ETDEWEB)

    Baladron, Carlos [Departamento de Fisica Teorica, Atomica y Optica, Universidad de Valladolid, E-47011, Valladolid (Spain)

    2017-06-15

    A compatibilistic model of free choice for a fundamental particle is built within a general framework that explores the possibility that quantum mechanics be the emergent result of generalised Darwinian evolution acting on the abstract landscape of possible physical theories. The central element in this approach is a probabilistic classical Turing machine -basically an information processor plus a randomiser- methodologically associated with every fundamental particle. In this scheme every system acts not under a general law, but as a consequence of the command of a particular, evolved algorithm. This evolved programme enables the particle to algorithmically anticipate possible future world configurations in information space, and as a consequence, without altering the natural forward causal order in physical space, to incorporate elements to the decision making procedure that are neither purely random nor strictly in the past, but in a possible future. (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  9. A FRAMEWORK FOR AN OPEN SOURCE GEOSPATIAL CERTIFICATION MODEL

    Directory of Open Access Journals (Sweden)

    T. U. R. Khan

    2016-06-01

    The geospatial industry is forecast to have enormous growth in the forthcoming years and an extended need for a well-educated workforce; hence ongoing education and training play an important role in professional life. In parallel, Open Source solutions, open data proliferation, and the use of open standards have increasing significance in the geospatial and IT arena as well as in political discussion and legislation. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea to develop a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations (e.g., the GIS Certification Institute, GeoAcademy, ASPRS) and software vendors (e.g., Esri, Oracle, and RedHat). They focus on different fields of expertise and have different levels and ways of examination, offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse body-of-knowledge concepts, e.g., the NCGIA Core Curriculum, the URISA Body of Knowledge, the USGIF Essential Body of Knowledge, the "Geographic Information: Need to Know" (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and essentially influenced the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was integrated twofold. An online survey about the relevance of Open Source was performed and

  10. A Framework for an Open Source Geospatial Certification Model

    Science.gov (United States)

    Khan, T. U. R.; Davis, P.; Behr, F.-J.

    2016-06-01

    The geospatial industry is forecast to have enormous growth in the forthcoming years and an extended need for a well-educated workforce; hence ongoing education and training play an important role in professional life. In parallel, Open Source solutions, open data proliferation, and the use of open standards have increasing significance in the geospatial and IT arena as well as in political discussion and legislation. Based on the Memorandum of Understanding between the International Cartographic Association, the OSGeo Foundation, and ISPRS, this development led to the implementation of the ICA-OSGeo-Lab initiative with its mission "Making geospatial education and opportunities accessible to all". Discussions in this initiative and the growth and maturity of geospatial Open Source software initiated the idea to develop a framework for a worldwide applicable Open Source certification approach. Generic and geospatial certification approaches are already offered by numerous organisations (e.g., the GIS Certification Institute, GeoAcademy, ASPRS) and software vendors (e.g., Esri, Oracle, and RedHat). They focus on different fields of expertise and have different levels and ways of examination, offered for a wide range of fees. The development of the certification framework presented here is based on the analysis of diverse body-of-knowledge concepts, e.g., the NCGIA Core Curriculum, the URISA Body of Knowledge, the USGIF Essential Body of Knowledge, the "Geographic Information: Need to Know" (currently under development), and the Geospatial Technology Competency Model (GTCM). The latter provides a US-oriented list of the knowledge, skills, and abilities required of workers in the geospatial technology industry and essentially influenced the certification framework. In addition to the theoretical analysis of existing resources, the geospatial community was integrated twofold. An online survey about the relevance of Open Source was performed and evaluated with 105

  11. A framework of modeling detector systems for computed tomography simulations

    Science.gov (United States)

    Youn, H.; Kim, D.; Kim, S. H.; Kam, S.; Jeon, H.; Nam, J.; Kim, H. K.

    2016-01-01

    The ultimate development in computed tomography (CT) technology may be a system that can provide images with excellent lesion conspicuity at a patient dose as low as possible. Imaging simulation tools have been used cost-effectively for these developments and will continue to be. For a more accurate and realistic imaging simulation, the signal and noise propagation through a CT detector system has been modeled in this study using cascaded linear-systems theory. The simulation results are validated by comparison with measured results from a laboratory flat-panel micro-CT system. Although the image noise obtained from the simulations at higher exposures is slightly smaller than that obtained from the measurements, the difference between them is reasonably acceptable. According to the simulation results for various exposure levels and additive electronic noise levels, x-ray quantum noise is more dominant than the additive electronic noise. The framework for modeling a CT detector system suggested in this study will be helpful for the development of an accurate and realistic projection simulation model.
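    The cascaded linear-systems propagation of signal and noise can be sketched stage by stage; the mean/variance relations below are the standard ones for stochastic gain stages, while the stage parameters are illustrative, not the paper's values:

```python
import math

def cascade_gain_stage(mean_q, var_q, gain, gain_var):
    """Propagate mean and variance of quanta through one stochastic
    amplification stage (standard cascaded linear-systems relations):
    mean_out = g * mean_in, var_out = g^2 * var_in + mean_in * var_g."""
    return gain * mean_q, gain**2 * var_q + mean_q * gain_var

# X-ray quanta incident on the detector (Poisson: variance = mean)
q0 = 1.0e4
# (gain, gain variance) per stage: absorption, conversion, coupling
stages = [(0.8, 0.16), (500.0, 400.0), (0.5, 0.25)]   # illustrative only

mean, var = q0, q0
for g, gv in stages:
    mean, var = cascade_gain_stage(mean, var, g, gv)

snr = mean / math.sqrt(var)
```

    Additive electronic noise would enter as an extra variance term after the last stage, which is how its relative importance against quantum noise can be compared.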

  12. A Production Model for Construction: A Theoretical Framework

    Directory of Open Access Journals (Sweden)

    Ricardo Antunes

    2015-03-01

    The building construction industry faces challenges, such as increasing project complexity and scope requirements, but shorter deadlines. Additionally, economic uncertainty and rising business competition with a subsequent decrease in profit margins for the industry demands the development of new approaches to construction management. However, the building construction sector relies on practices based on intuition and experience, overlooking the dynamics of its production system. Furthermore, researchers maintain that the construction industry has no history of the application of mathematical approaches to model and manage production. Much work has been carried out on how manufacturing practices apply to construction projects, mostly lean principles. Nevertheless, there has been little research to understand the fundamental mechanisms of production in construction. This study develops an in-depth literature review to examine the existing knowledge about production models and their characteristics in order to establish a foundation for dynamic production systems management in construction. As a result, a theoretical framework is proposed, which will be instrumental in the future development of mathematical production models aimed at predicting the performance and behaviour of dynamic project-based systems in construction.

  13. The International Lunar Decade — 2017-2029: Framework for Concurrent Development of Enabling Technologies, Infrastructures, Financings, and Policies for Lunar Development

    Science.gov (United States)

    Beldavs, V. Z.; Dunlop, D.; Crisafulli, J.; Foing, B.

    2015-10-01

    The International Lunar Decade (ILD) planned for launch in 2017 provides a framework for long-term international collaboration in the development of technologies, infrastructures, and financing mechanisms for lunar development.

  14. A unified framework for benchmark dose estimation applied to mixed models and model averaging

    DEFF Research Database (Denmark)

    Ritz, Christian; Gerhard, Daniel; Hothorn, Ludwig A.

    2013-01-01

    This article develops a framework for benchmark dose estimation that allows intrinsically nonlinear dose-response models to be used for continuous data in much the same way as is already possible for quantal data. This means that the same dose-response model equations may be applied to both...
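    For continuous data, the benchmark dose is the dose at which the fitted response shifts from background by the benchmark response (BMR); a minimal sketch with an assumed Emax-type curve (not one of the article's fitted models):

```python
def emax_response(dose, e0=1.0, emax=-0.4, ed50=5.0):
    """Illustrative nonlinear (Emax) dose-response curve for continuous data."""
    return e0 + emax * dose / (ed50 + dose)

def benchmark_dose(bmr=0.1, lo=0.0, hi=100.0, tol=1e-8):
    """Benchmark dose: the dose at which the response has dropped by `bmr`
    from the background response, found by bisection. This is the generic
    definition; the article's framework generalizes it across model types."""
    target = emax_response(0.0) - bmr
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if emax_response(mid) > target:   # response still above target
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

bmd = benchmark_dose()
```

    For this curve the BMD solves 0.4*d/(5+d) = 0.1, i.e. d = 5/3.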

  15. Mapping disability-adjusted life years: a Bayesian hierarchical model framework for burden of disease and injury assessment.

    Science.gov (United States)

    MacNab, Ying C

    2007-11-20

    This paper presents a Bayesian disability-adjusted life year (DALY) methodology for spatial and spatiotemporal analyses of disease and/or injury burden. A Bayesian disease mapping model framework, which blends together spatial modelling, shared-component modelling (SCM), temporal modelling, ecological modelling, and non-linear modelling, is developed for small-area DALY estimation and inference. In particular, we develop a model framework that enables SCM as well as multivariate CAR modelling of non-fatal and fatal disease or injury rates and facilitates spline smoothing for non-linear modelling of temporal rate and risk trends. Using British Columbia (Canada) hospital admission-separation data and vital statistics mortality data on non-fatal and fatal road traffic injuries to the male population aged 20-39 for the years 1991-2000, and for 84 local health areas and 16 health service delivery areas, spatial and spatiotemporal estimation and inference on years of life lost due to premature death, years lived with disability, and DALYs are presented. Fully Bayesian estimation and inference, with Markov chain Monte Carlo implementation, are illustrated. We present a methodological framework within which the DALY and Bayesian disease mapping methodologies interface and intersect. Its development brings the relative importance of premature mortality and disability into the assessment of community health and health needs in order to provide reliable information and evidence for community-based public health surveillance and evaluation, disease and injury prevention, and resource provision.
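    The quantity being mapped combines fatal and non-fatal burden; in its basic undiscounted form (the numbers below are illustrative, not estimates from the study):

```python
def dalys(deaths, std_life_expectancy, cases, disability_weight, mean_duration):
    """DALY = YLL + YLD (basic, undiscounted form).

    YLL = deaths * standard life expectancy at age of death
    YLD = incident cases * disability weight * mean duration of disability
    """
    yll = deaths * std_life_expectancy
    yld = cases * disability_weight * mean_duration
    return yll + yld, yll, yld

# Illustrative figures for one hypothetical local health area
total, yll, yld = dalys(deaths=12, std_life_expectancy=55.0,
                        cases=340, disability_weight=0.2, mean_duration=0.5)
```

    The Bayesian framework's contribution is to smooth and estimate the underlying fatal and non-fatal rates jointly across small areas before these quantities are computed.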

  16. Acid deposition: decision framework. Volume 1. Description of conceptual framework and decision-tree models. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Balson, W.E.; Boyd, D.W.; North, D.W.

    1982-08-01

    Acid precipitation and dry deposition of acid materials have emerged as an important environmental issue affecting the electric utility industry. This report presents a framework for the analysis of decisions on acid deposition. The decision framework is intended as a means of summarizing scientific information and uncertainties on the relation between emissions from electric utilities and other sources, acid deposition, and impacts on ecological systems. The methodology for implementing the framework is that of decision analysis, which provides a quantitative means of analyzing decisions under uncertainty. The decisions of interest include reductions in sulfur oxide and other emissions thought to be precursors of acid deposition, mitigation of acid deposition impacts through means such as liming of waterways and soils, and choice of strategies for research. The report first gives an overview of the decision framework and explains the decision analysis methods with a simplified caricature example. The state of scientific information and the modeling assumptions for the framework are then discussed for the three main modules of the framework: emissions and control technologies; long-range transport and chemical conversion in the atmosphere; and ecological impacts. The report then presents two versions of a decision tree model that implements the decision framework. The basic decision tree addresses decisions on emissions control and mitigation in the immediate future and a decade hence, and it includes uncertainties in the long-range transport and ecological impacts. The research emphasis decision tree addresses the effect of research funding on obtaining new information as the basis for future decisions. Illustrative data and calculations using the decision tree models are presented.

  17. Scrutiny of Appropriate Model Error Specification in Multivariate Assimilation Framework using mHM

    Science.gov (United States)

    Rakovec, O.; Noh, S. J.; Kumar, R.; Samaniego, L. E.

    2015-12-01

    Reliable and accurate prediction of regional-scale water fluxes and states is a great challenge for the scientific community. Several sectors of society (municipalities, agriculture, energy, etc.) may benefit from successful solutions that appropriately quantify uncertainties in hydro-meteorological prediction systems, with particular attention to extreme weather conditions. Increased availability and quality of near-real-time data enable better understanding of the predictive skill of forecasting frameworks. To address this issue, automatic model-observation integration is required for appropriate model initialization. In this study, the effects of noise specification on the quality of hydrological forecasts are scrutinized via a data assimilation system. This framework has been developed by incorporating the mesoscale hydrologic model (mHM, http://www.ufz.de/mhm) with a particle filtering (PF) approach used for model state updating. In comparison with previous works, a lag PF is considered to better account for the response times of internal hydrologic processes. The objective of this study is to assess the benefits of model state updating for prediction of water fluxes and states up to a 3-month-ahead forecast using particle filtering. The efficiency of this system is demonstrated in 10 large European basins. We evaluate the model skill for five assimilation scenarios using observed (1) discharge (Q); (2) MODIS evapotranspiration (ET); (3) GRACE terrestrial total water storage (TWS) anomaly; (4) ESA-CCI soil moisture (SM); and (5) the combination of Q, ET, TWS, and SM in a hindcast experiment (2004-2010). The effects of error perturbations on both the analysis and the forecasts are presented, and optimal trade-offs are discussed. While large perturbations are preferred for the analysis time step, steep deterioration is observed for longer lead times, for which more conservative error measures should be considered. From all the datasets, complementary GRACE TWS data together
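    A single particle-filter analysis step of the kind used for state updating can be sketched as a likelihood-reweight-and-resample operation on a scalar discharge ensemble; this is a generic textbook step, not mHM's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_update(states, weights, observation, obs_std):
    """One particle-filter analysis step: reweight particles by the Gaussian
    likelihood of the observation, then resample.

    `states` here is a 1-D array of modelled discharge per particle; a real
    application would carry full model state vectors per particle.
    """
    lik = np.exp(-0.5 * ((states - observation) / obs_std) ** 2)
    w = weights * lik
    w /= w.sum()
    idx = rng.choice(len(states), size=len(states), p=w)  # multinomial resampling
    return states[idx], np.full(len(states), 1.0 / len(states))

particles = rng.normal(100.0, 20.0, size=500)   # prior discharge ensemble (m^3/s)
weights = np.full(500, 1.0 / 500)
post, weights = pf_update(particles, weights, observation=120.0, obs_std=5.0)
```

    The `obs_std` parameter is exactly the kind of error specification whose size the study scrutinizes: too small and the filter degenerates, too large and the observation barely constrains the forecast.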

  18. SCaLeM: A Framework for Characterizing and Analyzing Execution Models

    Energy Technology Data Exchange (ETDEWEB)

    Chavarría-Miranda, Daniel; Manzano Franco, Joseph B.; Krishnamoorthy, Sriram; Vishnu, Abhinav; Barker, Kevin J.; Hoisie, Adolfy

    2014-10-13

    As scalable parallel systems evolve towards more complex nodes with many-core architectures and larger trans-petascale and upcoming exascale deployments, there is a need to understand, characterize and quantify the underlying execution models being used on such systems. Execution models are a conceptual layer between applications and algorithms on one side and the underlying parallel hardware and systems software on which those applications run on the other. This paper presents the SCaLeM (Synchronization, Concurrency, Locality, Memory) framework for characterizing and analyzing execution models. SCaLeM consists of three basic elements: attributes, compositions, and mappings of these compositions to abstract parallel systems. The fundamental Synchronization, Concurrency, Locality and Memory attributes are used to characterize each execution model, while combinations of those attributes in the form of compositions describe the primitive operations of the execution model. The mapping of an execution model's primitive operations, described by compositions, to an underlying abstract parallel system can be evaluated quantitatively to determine its effectiveness. Finally, SCaLeM also enables the representation and analysis of applications in terms of execution models, for the purpose of evaluating the effectiveness of such mappings.
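    The attribute/composition structure can be pictured as a small data model; the four attribute names come from the paper, while the field layout and the MPI-3 RMA example primitives are my own illustration:

```python
from dataclasses import dataclass, field

# The four fundamental SCaLeM attributes (names from the paper)
ATTRIBUTES = ("synchronization", "concurrency", "locality", "memory")

@dataclass(frozen=True)
class Composition:
    """A primitive operation of an execution model, described as a
    combination of SCaLeM attributes (structure is a sketch, not the
    paper's formalism)."""
    name: str
    attributes: frozenset

    def __post_init__(self):
        unknown = self.attributes - frozenset(ATTRIBUTES)
        if unknown:
            raise ValueError(f"unknown attributes: {unknown}")

@dataclass
class ExecutionModel:
    name: str
    compositions: list = field(default_factory=list)

# Hypothetical characterization of two one-sided-communication primitives
mpi_rma = ExecutionModel("MPI-3 RMA")
mpi_rma.compositions.append(Composition("put", frozenset({"memory", "locality"})))
mpi_rma.compositions.append(Composition("fence", frozenset({"synchronization", "concurrency"})))
```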

  19. AN INTEGRATED MODELING FRAMEWORK FOR CARBON MANAGEMENT TECHNOLOGIES

    Energy Technology Data Exchange (ETDEWEB)

    Anand B. Rao; Edward S. Rubin; Michael B. Berkenpas

    2004-03-01

    CO₂ capture and storage (CCS) is gaining widespread interest as a potential method to control greenhouse gas emissions from fossil fuel sources, especially electric power plants. Commercial applications of CO₂ separation and capture technologies are found in a number of industrial process operations worldwide. Many of these capture technologies also are applicable to fossil fuel power plants, although applications to large-scale power generation remain to be demonstrated. This report describes the development of a generalized modeling framework to assess alternative CO₂ capture and storage options in the context of multi-pollutant control requirements for fossil fuel power plants. The focus of the report is on post-combustion CO₂ capture using amine-based absorption systems at pulverized coal-fired plants, which are the most prevalent technology used for power generation today. The modeling framework builds on the previously developed Integrated Environmental Control Model (IECM). The expanded version with carbon sequestration is designated as IECM-cs. The expanded modeling capability also includes natural gas combined cycle (NGCC) power plants and integrated coal gasification combined cycle (IGCC) systems as well as pulverized coal (PC) plants. This report presents details of the performance and cost models developed for an amine-based CO₂ capture system, representing the baseline of current commercial technology. The key uncertainties and variability in process design, performance and cost parameters which influence the overall cost of carbon mitigation also are characterized. The new performance and cost models for CO₂ capture systems have been integrated into the IECM-cs, along with models to estimate CO₂ transport and storage costs. The CO₂ control system also interacts with other emission control technologies such as flue gas desulfurization (FGD) systems for SO₂ control. The integrated model is applied to
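    A core output of such cost models is the cost of CO2 avoided, computed from the change in cost of electricity and emission rate between a reference plant and a capture plant; the numbers below are illustrative, not IECM-cs results:

```python
def cost_of_co2_avoided(coe_ref, coe_ccs, co2_ref, co2_ccs):
    """Standard cost-of-CO2-avoided metric used in CCS assessments:
    (increase in cost of electricity, $/MWh) divided by
    (reduction in emission rate, t CO2/MWh)."""
    return (coe_ccs - coe_ref) / (co2_ref - co2_ccs)

# Illustrative plant-level figures (not from the report)
cost = cost_of_co2_avoided(coe_ref=50.0, coe_ccs=75.0,   # $/MWh
                           co2_ref=0.80, co2_ccs=0.11)   # t CO2/MWh
```

    Note the denominator uses net emission rates, so the energy penalty of running the capture system is automatically reflected in the metric.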

  20. Retrofitting Non-Cognitive-Diagnostic Reading Assessment under the Generalized DINA Model Framework

    Science.gov (United States)

    Chen, Huilin; Chen, Jinsong

    2016-01-01

    Cognitive diagnosis models (CDMs) are psychometric models developed mainly to assess examinees' specific strengths and weaknesses in a set of skills or attributes within a domain. By adopting the Generalized-DINA model framework, the recently developed general modeling framework, we attempted to retrofit the PISA reading assessments, a…

  1. Model-Based Reasoning in the Physics Laboratory: Framework and Initial Results

    Science.gov (United States)

    Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.

    2015-01-01

    We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable…

  2. The outcome competency framework for practitioners in infection prevention and control: use of the outcome logic model for evaluation

    Science.gov (United States)

    Curran, E; Loveday, HP; Kiernan, MA; Tannahill, M

    2013-01-01

    Healthcare is delivered in a dynamic environment with frequent changes in populations, methods, equipment and settings. Infection prevention and control practitioners (IPCPs) must ensure that they are competent in addressing the challenges they face and are equipped to develop infection prevention and control (IPC) services in line with a changing world of healthcare provision. A multifaceted Framework was developed to assist IPCPs to enhance competence at an individual, team and organisational level to enable quality performance and improved quality of care. However, if these aspirations are to be met, it is vital that competency frameworks are fit for purpose or they risk being ignored. The aim of this unique study was to evaluate short- and medium-term outcomes, as set out in the Outcome Logic Model, to assist with the evaluation of the impact and success of the Framework. This study found that while the Framework is being used effectively in some areas, it is not being used as much or in the ways that were anticipated. The findings will enable future work on revision, communication and dissemination, and will provide intelligence to those initiating education and training in the utilisation of the competences.

  3. The Goddard Snow Radiance Assimilation Project: An Integrated Snow Radiance and Snow Physics Modeling Framework for Snow/cold Land Surface Modeling

    Science.gov (United States)

    Kim, E.; Tedesco, M.; Reichle, R.; Choudhury, B.; Peters-Lidard, C.; Foster, J.; Hall, D.; Riggs, G.

    2006-01-01

    Microwave-based retrievals of snow parameters from satellite observations have a long heritage and have so far been generated primarily by regression-based empirical "inversion" methods based on snapshots in time. Direct assimilation of microwave radiance into physical land surface models can be used to avoid errors associated with such retrieval/inversion methods, instead utilizing more straightforward forward models and temporal information. This approach has been used for years for atmospheric parameters by the operational weather forecasting community with great success. Recent developments in forward radiative transfer modeling, physical land surface modeling, and land data assimilation are converging to allow the assembly of an integrated framework for snow/cold lands modeling and radiance assimilation. The objective of the Goddard snow radiance assimilation project is to develop such a framework and explore its capabilities. The key elements of this framework include: a forward radiative transfer model (FRTM) for snow, a snowpack physical model, a land surface water/energy cycle model, and a data assimilation scheme. In fact, multiple models are available for each element enabling optimization to match the needs of a particular study. Together these form a modular and flexible framework for self-consistent, physically-based remote sensing and water/energy cycle studies. In this paper we will describe the elements and the integration plan. All modules will operate within the framework of the Land Information System (LIS), a land surface modeling framework with data assimilation capabilities running on a parallel-node computing cluster. Capabilities for assimilation of snow retrieval products are already under development for LIS. We will describe plans to add radiance-based assimilation capabilities. Plans for validation activities using field measurements will also be discussed.

  4. A Global Modeling Framework for Plasma Kinetics: Development and Applications

    Science.gov (United States)

    Parsey, Guy Morland

    The modern study of plasmas, and applications thereof, has developed synchronously with computer capabilities since the mid-1950s. Complexities inherent to these charged-particle, many-body systems have resulted in the development of multiple simulation methods (particle-in-cell, fluid, global modeling, etc.) in order to both explain observed phenomena and predict outcomes of plasma applications. Recognizing that different algorithms are chosen to best address specific topics of interest, this thesis centers around the development of an open-source global model framework for the focused study of non-equilibrium plasma kinetics. After verification and validation of the framework, it was used to study two physical phenomena: plasma-assisted combustion and the recently proposed optically-pumped rare gas metastable laser. Global models permeate chemistry and plasma science, relying on spatial averaging to focus attention on the dynamics of reaction networks. Defined by a set of species continuity and energy conservation equations, the required data and constructed systems are conceptually similar across most applications, providing a light platform for exploratory and result-search parameter scanning. Unfortunately, it is common practice for custom code to be developed for each application, an enormous duplication of effort which negatively affects the quality of the software produced. Presented herein, the Python-based Kinetic Global Modeling framework (KGMf) was designed to support all modeling phases: collection and analysis of reaction data, construction of an exportable system of model ODEs, and a platform for interactive evaluation and post-processing analysis. A symbolic ODE system is constructed for interactive manipulation and generation of a Jacobian, both of which are compiled as operation-optimized C-code. Plasma-assisted combustion and ignition (PAC/PAI) embody the modernization of burning fuel by opening up new avenues of control and optimization
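    The symbolic-ODE-plus-Jacobian workflow the KGMf describes can be sketched with SymPy on a two-reaction toy network; the network and rate constants here are invented for illustration:

```python
import sympy as sp

# Toy reaction network: species n1 converts to n2 at rate k1 and back at k2.
# The KGMf assembles much larger systems from reaction data the same way.
n1, n2, k1, k2 = sp.symbols("n1 n2 k1 k2", positive=True)
rhs = sp.Matrix([-k1 * n1 + k2 * n2,
                  k1 * n1 - k2 * n2])

# Symbolic Jacobian, needed by stiff ODE integrators
jac = rhs.jacobian([n1, n2])

# Compile the symbolic system into fast numerical callables
f = sp.lambdify((n1, n2, k1, k2), rhs, "numpy")
J = sp.lambdify((n1, n2, k1, k2), jac, "numpy")
```

    SymPy's code printers can emit C from the same expressions, which is the spirit of the "operation-optimized C-code" step.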

  5. Young diabetics' compliance in the framework of the MIMIC model.

    Science.gov (United States)

    Kyngäs, H; Hentinen, M; Koivukangas, P; Ohinmaa, A

    1996-11-01

    The compliance of 346 young diabetics aged 13-17 years with health regimens is analysed in the framework of a MIMIC (multiple indicators, multiple causes) model. The data were compiled by means of a questionnaire on compliance, conditions for compliance, the meaning attached to treatment and the impact of the disease, and the model was constructed using the LISREL VII programme, treating compliance as an unobserved variable formulated in terms of observed causes (x-variables) and observed indicators (y-variables). The resulting solutions are entirely satisfactory: the goodness-of-fit index is 0.983, the root mean square residual 0.058 and the chi-squared statistic 43.35 (P …). The model shows compliance to be indicated by self-care behaviour, responsibility for treatment, intention to pursue the treatment and collaboration with the physician, and to be greatly determined by motivation, experience of the results of treatment and having the energy and will-power to pursue the treatment and, to a lesser extent, by a sense of normality and fear.

  6. Smart licensing and environmental flows: Modeling framework and sensitivity testing

    Science.gov (United States)

    Wilby, R. L.; Fenn, C. R.; Wood, P. J.; Timlett, R.; Lequesne, T.

    2011-12-01

    Adapting to climate change is just one among many challenges facing river managers. The response will involve balancing the long-term water demands of society with the changing needs of the environment in sustainable and cost-effective ways. This paper describes a modeling framework for evaluating the sensitivity of low river flows to different configurations of abstraction licensing under both historical climate variability and expected climate change. A rainfall-runoff model is used to quantify trade-offs among environmental flow (e-flow) requirements, potential surface and groundwater abstraction volumes, and the frequency of harmful low-flow conditions. Using the River Itchen in southern England as a case study, it is shown that the abstraction volume is more sensitive to uncertainty in the regional climate change projection than to the e-flow target. It is also found that "smarter" licensing arrangements (involving a mix of hands-off flows and "rising block" abstraction rules) could achieve e-flow targets more frequently than conventional seasonal abstraction limits, with only modest reductions in average annual yield, even under a hotter, drier climate change scenario.
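    A "rising block" rule of the kind described can be sketched as a lookup on the surplus above the hands-off flow; the thresholds and rates below are invented, not the River Itchen licence terms:

```python
def licensed_abstraction(river_flow, hands_off_flow, blocks):
    """'Rising block' abstraction rule: no abstraction below the hands-off
    flow; above it, each successive block of surplus flow unlocks a larger
    abstraction allowance.

    blocks: list of (surplus_threshold, allowed_rate) pairs, ascending.
    All quantities in m^3/s; values are illustrative only.
    """
    surplus = river_flow - hands_off_flow
    if surplus <= 0:
        return 0.0
    allowed = 0.0
    for threshold, rate in blocks:
        if surplus >= threshold:
            allowed = rate
    return allowed

blocks = [(0.0, 0.1), (0.5, 0.3), (1.5, 0.6)]
allowed = licensed_abstraction(river_flow=2.0, hands_off_flow=1.0, blocks=blocks)
```

    Evaluating this rule against a modelled daily flow series is what lets the framework compare "smart" licences with fixed seasonal limits on both yield and e-flow compliance.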

  7. Python framework for kinetic modeling of electronically excited reaction pathways

    Science.gov (United States)

    Verboncoeur, John; Parsey, Guy; Guclu, Yaman; Christlieb, Andrew

    2012-10-01

    The use of plasma energy to enhance and control the chemical reactions during combustion, a technology referred to as "plasma assisted combustion" (PAC), can result in a variety of beneficial effects: e.g. stable lean operation, pollution reduction, and a wider range of p-T operating conditions. While experimental evidence abounds, theoretical understanding of PAC is at best incomplete, and numerical tools still lack reliable predictive capabilities. In the context of a joint experimental-numerical effort at Michigan State University, we present here an open-source modular Python framework dedicated to the dynamic optimization of non-equilibrium PAC systems. Multiple sources of experimental reaction data, e.g. reaction rates, cross-sections and oscillator strengths, are used in order to quantify the effect of data uncertainty and limiting assumptions. A collisional-radiative model (CRM) is implemented to organize reactions by importance and as a potential means of measuring a non-Maxwellian electron energy distribution function (EEDF), when coupled to optical emission spectroscopy data. Finally, we explore scaling laws in PAC parameter space using a kinetic global model (KGM) accelerated with CRM-optimized reaction sequences and sparse stiff integrators.

  8. A modeling and simulation framework for electrokinetic nanoparticle treatment

    Science.gov (United States)

    Phillips, James

    2011-12-01

    The focus of this research is to model and provide a simulation framework for the packing of differently sized spheres within a hard boundary. The novel contributions of this dissertation are the cylinders of influence (COI) method and sectoring method implementations. The impetus for this research stems from modeling electrokinetic nanoparticle (EN) treatment, which packs concrete pores with differently sized nanoparticles. We show an improved speed of the simulation compared to previously published results of EN treatment simulation while obtaining similar porosity reduction results. We mainly focused on readily, commercially available particle sizes of 2 nm and 20 nm particles, but have the capability to model other sizes. Our simulation has graphical capabilities and can provide additional data unobtainable from physical experimentation. The data collected has a median of 0.5750 and a mean of 0.5504. The standard error is 0.0054 at alpha = 0.05 for a 95% confidence interval of 0.5504 +/- 0.0054. The simulation has produced maximum packing densities of 65% and minimum packing densities of 34%. Simulation data are analyzed using linear regression via the R statistical language to obtain two equations: one that describes porosity reduction based on all cylinder and particle characteristics, and another that focuses on describing porosity reduction based on cylinder diameter for 2 and 20 nm particles into pores of 100 nm height. Simulation results are similar to most physical results obtained from MIP and WLR. Some MIP results do not fall within the simulation limits; however, this is expected as MIP has been documented to be an inaccurate measure of pore distribution and porosity of concrete. Despite the disagreement between WLR and MIP, there is a trend that porosity reduction is higher two inches from the rebar as compared to the rebar-concrete interface. The simulation also detects a higher porosity reduction further from the rebar. 
This may be due to particles
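
The summary statistics quoted above (mean, standard error, and a 95% confidence interval) can be reproduced for any sample of packing densities with a short script. This is a generic sketch, not the dissertation's analysis; the sample values below are illustrative, and the normal approximation (z = 1.96) is assumed.

```python
import math
import statistics

def confidence_interval_95(samples):
    """Return (mean, standard error, margin) for a 95% confidence
    interval under the normal approximation (z = 1.96)."""
    mean = statistics.fmean(samples)
    se = statistics.stdev(samples) / math.sqrt(len(samples))
    return mean, se, 1.96 * se

# Illustrative packing densities (fractions), not the study's data.
densities = [0.34, 0.48, 0.52, 0.55, 0.58, 0.60, 0.62, 0.65]
mean, se, margin = confidence_interval_95(densities)
print(f"{mean:.4f} +/- {margin:.4f} (SE = {se:.4f})")
```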

  9. Refining the operating model concept to enable systematic growth in operating maturity

    CSIR Research Space (South Africa)

    De Vries, M

    2010-10-01

    Full Text Available , management could move their attention away from focusing on lower-value activities to innovative ways to increase profits and growth. The Business-IT Alignment Framework (BIAF) defines business-IT alignment in terms of a paradigm of alignment, three...

  10. Modelling supported driving as an optimal control cycle: Framework and model characteristics

    CERN Document Server

    Wang, Meng; Daamen, Winnie; Hoogendoorn, Serge P; van Arem, Bart

    2014-01-01

    Driver assistance systems support drivers in operating vehicles in a safe, comfortable and efficient way, and thus may induce changes in traffic flow characteristics. This paper puts forward a receding horizon control framework to model driver assistance and cooperative systems. The accelerations of automated vehicles are controlled to optimise a cost function, assuming other vehicles driving at stationary conditions over a prediction horizon. The flexibility of the framework is demonstrated with controller design of Adaptive Cruise Control (ACC) and Cooperative ACC (C-ACC) systems. The proposed ACC and C-ACC model characteristics are investigated analytically, with focus on equilibrium solutions and stability properties. The proposed ACC model produces plausible human car-following behaviour and is unconditionally locally stable. By careful tuning of parameters, the ACC model generates similar stability characteristics as human driver models. The proposed C-ACC model results in convective downstream and abso...
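
The receding-horizon idea can be sketched in a few lines: evaluate candidate accelerations over a prediction horizon, assuming the leader holds its speed (stationary conditions), and keep the one with the lowest cost. The quadratic cost and the constant-time-gap spacing policy below are assumptions for illustration, not the authors' controller formulation.

```python
def acc_receding_horizon(gap, v_ego, v_lead, horizon=10, dt=0.5,
                         t_gap=1.5, s0=2.0, candidates=None):
    """Pick the acceleration (held constant over the horizon) that
    minimises a quadratic cost on spacing error and control effort,
    assuming the leader keeps a constant speed."""
    if candidates is None:
        candidates = [a / 10.0 for a in range(-30, 21)]  # -3.0 .. 2.0 m/s^2
    best_a, best_cost = 0.0, float("inf")
    for a in candidates:
        g, v = gap, v_ego
        cost = 0.0
        for _ in range(horizon):
            v = max(0.0, v + a * dt)           # ego speed update
            g = g + (v_lead - v) * dt          # gap to the leader
            desired = s0 + t_gap * v           # constant time-gap policy
            cost += (g - desired) ** 2 + 0.5 * a ** 2
        if cost < best_cost:
            best_a, best_cost = a, cost
    return best_a

# Too close and closing fast: the controller should brake.
print(acc_receding_horizon(gap=10.0, v_ego=30.0, v_lead=20.0))
```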

  11. A new climate modeling framework for convection-resolving simulation at continental scale

    Science.gov (United States)

    Charpilloz, Christophe; di Girolamo, Salvatore; Arteaga, Andrea; Fuhrer, Oliver; Hoefler, Torsten; Schulthess, Thomas; Schär, Christoph

    2017-04-01

    Major uncertainties remain in our understanding of the processes that govern the water cycle in a changing climate and their representation in weather and climate models. Of particular concern are heavy precipitation events of convective origin (thunderstorms and rain showers). The aim of the crCLIM project [1] is to propose a new climate modeling framework that alleviates the I/O bottleneck in large-scale, convection-resolving climate simulations and thus to enable new analysis techniques for climate scientists. Due to the large computational costs, convection-resolving simulations are currently restricted to small computational domains or very short time scales, unless the largest available supercomputer systems, such as hybrid CPU-GPU architectures, are used [3]. Hence, the COSMO model has been adapted to run on these architectures for research and production purposes [2]. However, the amount of data generated also increases, and storing this data becomes infeasible, making the analysis of simulation results impractical. To circumvent this problem and enable high-resolution climate modelling, we propose a data-virtualization layer (DVL) that re-runs simulations on demand and transparently manages the data for the analysis; that is, we trade off computational effort (time) for storage (space). This approach also requires a bit-reproducible version of the COSMO model that produces identical results on different architectures (CPUs and GPUs) [4], which will be coupled with a performance model in order to enable optimal re-runs depending on the requirements of the re-run and the available resources. In this contribution, we discuss the strategy for developing the DVL, a first performance model, the challenge of bit-reproducibility, and the first results of the crCLIM project. [1] http://www.c2sm.ethz.ch/research/crCLIM.html [2] O. Fuhrer, C. Osuna, X. Lapillonne, T. Gysi, M. Bianco, and T. Schulthess. "Towards gpu-accelerated operational weather forecasting." In The GPU Technology
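
The compute-for-storage trade of a data-virtualization layer can be caricatured in a few lines: serve analysis requests from a bounded store and re-run the (bit-reproducible, hence deterministic) simulation on demand when the requested output is missing. The cache-keying and eviction scheme here is an assumption for illustration, not the crCLIM design.

```python
import hashlib
import json

class DataVirtualizationLayer:
    """Serve analysis requests from a bounded store; re-run the
    deterministic simulation on demand when the output is missing."""

    def __init__(self, simulate, max_entries=2):
        self.simulate = simulate          # deterministic model run
        self.store = {}                   # config-hash -> output
        self.max_entries = max_entries
        self.reruns = 0

    def _key(self, config):
        blob = json.dumps(config, sort_keys=True).encode()
        return hashlib.sha256(blob).hexdigest()

    def fetch(self, config):
        key = self._key(config)
        if key not in self.store:
            self.reruns += 1              # time traded for space
            if len(self.store) >= self.max_entries:
                self.store.pop(next(iter(self.store)))  # evict oldest
            self.store[key] = self.simulate(config)
        return self.store[key]

dvl = DataVirtualizationLayer(lambda cfg: cfg["dx"] * cfg["steps"])
print(dvl.fetch({"dx": 2.0, "steps": 10}), dvl.reruns)
print(dvl.fetch({"dx": 2.0, "steps": 10}), dvl.reruns)  # served from store
```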

  12. Testing the Youth Physical Activity Promotion Model: Fatness and Fitness as Enabling Factors

    Science.gov (United States)

    Chen, Senlin; Welk, Gregory J.; Joens-Matre, Roxane R.

    2014-01-01

    As the prevalence of childhood obesity increases, it is important to examine possible differences in psychosocial correlates of physical activity between normal weight and overweight children. The study examined fatness (weight status) and (aerobic) fitness as Enabling factors related to youth physical activity within the Youth Physical Activity…

  14. Modeling and simulation of complex systems a framework for efficient agent-based modeling and simulation

    CERN Document Server

    Siegfried, Robert

    2014-01-01

    Robert Siegfried presents a framework for efficient agent-based modeling and simulation of complex systems. He compares different approaches for describing structure and dynamics of agent-based models in detail. Based on this evaluation the author introduces the "General Reference Model for Agent-based Modeling and Simulation" (GRAMS). Furthermore he presents parallel and distributed simulation approaches for execution of agent-based models -from small scale to very large scale. The author shows how agent-based models may be executed by different simulation engines that utilize underlying hard

  15. A graphical model framework for decoding in the visual ERP-based BCI speller

    NARCIS (Netherlands)

    Martens, S.M.M.; Mooij, J.M.; Hill, N.J.; Farquhar, J.D.R.; Schölkopf, B.

    2011-01-01

    We present a graphical model framework for decoding in the visual ERP-based speller system. The proposed framework allows researchers to build generative models from which the decoding rules are obtained in a straightforward manner. We suggest two models for generating brain signals conditioned on

  16. Adaptive invasive species distribution models: A framework for modeling incipient invasions

    Science.gov (United States)

    Uden, Daniel R.; Allen, Craig R.; Angeler, David G.; Corral, Lucia; Fricke, Kent A.

    2015-01-01

    The utilization of species distribution model(s) (SDM) for approximating, explaining, and predicting changes in species’ geographic locations is increasingly promoted for proactive ecological management. Although frameworks for modeling non-invasive species distributions are relatively well developed, their counterparts for invasive species—which may not be at equilibrium within recipient environments and often exhibit rapid transformations—are lacking. Additionally, adaptive ecological management strategies address the causes and effects of biological invasions and other complex issues in social-ecological systems. We conducted a review of biological invasions, species distribution models, and adaptive practices in ecological management, and developed a framework for adaptive, niche-based, invasive species distribution model (iSDM) development and utilization. This iterative, 10-step framework promotes consistency and transparency in iSDM development, allows for changes in invasive drivers and filters, integrates mechanistic and correlative modeling techniques, balances the avoidance of type 1 and type 2 errors in predictions, encourages the linking of monitoring and management actions, and facilitates incremental improvements in models and management across space, time, and institutional boundaries. These improvements are useful for advancing coordinated invasive species modeling, management and monitoring from local scales to the regional, continental and global scales at which biological invasions occur and harm native ecosystems and economies, as well as for anticipating and responding to biological invasions under continuing global change.

  17. Modeling Framework and Results to Inform Charging Infrastructure Investments

    Energy Technology Data Exchange (ETDEWEB)

    Melaina, Marc W [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Wood, Eric W [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-01

    The plug-in electric vehicle (PEV) market is experiencing rapid growth, with dozens of battery electric (BEV) and plug-in hybrid electric (PHEV) models already available and billions of dollars being invested by automotive manufacturers in the PEV space. Electric range is increasing thanks to larger and more advanced batteries, and significant infrastructure investments are being made to enable higher-power fast charging. Costs are falling and PEVs are becoming more competitive with conventional vehicles. Moreover, new technologies such as connectivity and automation hold the promise of enhancing the value proposition of PEVs. This presentation outlines a suite of projects funded by the U.S. Department of Energy's Vehicle Technology Office to conduct assessments of the economic value and charging infrastructure requirements of the evolving PEV market. Individual assessments include national evaluations of PEV economic value (assuming 73M PEVs on the road in 2035), national analysis of charging infrastructure requirements (with community and corridor level resolution), and case studies of PEV ownership in Columbus, OH and Massachusetts.

  18. How to make more out of community data? A conceptual framework and its implementation as models and software.

    Science.gov (United States)

    Ovaskainen, Otso; Tikhonov, Gleb; Norberg, Anna; Guillaume Blanchet, F; Duan, Leo; Dunson, David; Roslin, Tomas; Abrego, Nerea

    2017-03-20

    Community ecology aims to understand what factors determine the assembly and dynamics of species assemblages at different spatiotemporal scales. To facilitate the integration between conceptual and statistical approaches in community ecology, we propose Hierarchical Modelling of Species Communities (HMSC) as a general, flexible framework for modern analysis of community data. While non-manipulative data allow for only correlative and not causal inference, this framework facilitates the formulation of data-driven hypotheses regarding the processes that structure communities. We model environmental filtering by variation and covariation in the responses of individual species to the characteristics of their environment, with potential contingencies on species traits and phylogenetic relationships. We capture biotic assembly rules by species-to-species association matrices, which may be estimated at multiple spatial or temporal scales. We operationalise the HMSC framework as a hierarchical Bayesian joint species distribution model, and implement it as R- and Matlab-packages which enable computationally efficient analyses of large data sets. Armed with this tool, community ecologists can make sense of many types of data, including spatially explicit data and time-series data. We illustrate the use of this framework through a series of diverse ecological examples.
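
HMSC estimates species-to-species association matrices with latent factors in its R and Matlab packages. As a much simpler stand-in for that ingredient, the sketch below computes a pairwise association matrix from a site-by-species presence/absence table using the phi coefficient; this is an illustration of the concept, not the HMSC estimator.

```python
import math

def phi_association(pa, i, j):
    """Phi coefficient between species i and j from a site-by-species
    presence/absence table (a list of 0/1 rows)."""
    a = sum(1 for row in pa if row[i] and row[j])          # co-presence
    b = sum(1 for row in pa if row[i] and not row[j])
    c = sum(1 for row in pa if not row[i] and row[j])
    d = sum(1 for row in pa if not row[i] and not row[j])  # co-absence
    denom = math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
    return (a * d - b * c) / denom if denom else 0.0

# Four sites, three species (toy data).
sites = [
    [1, 1, 0],
    [1, 1, 0],
    [0, 0, 1],
    [0, 1, 1],
]
n = len(sites[0])
assoc = [[phi_association(sites, i, j) for j in range(n)] for i in range(n)]
print(assoc[0][1])  # species 0 and 1 co-occur: positive association
```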

  19. Conceptual Frameworks and Research Models on Resilience in Leadership

    Directory of Open Access Journals (Sweden)

    Janet Ledesma

    2014-08-01

    Full Text Available The purpose of this article was to discuss conceptual frameworks and research models on resilience theory. The constructs of resilience, the history of resilience theory, models of resilience, variables of resilience, career resilience, and organizational resilience will be examined and discussed as they relate to leadership development. The literature demonstrates that there is a direct relationship between the stress of the leader’s job and his or her ability to maintain resilience in the face of prolonged contact with adversity. This article discusses resilience theory as it relates to leadership development. The concepts associated with resilience, which include thriving and hardiness, are explored with the belief that resilient leaders are invaluable to the sustainability of an organization. In addition, the constructs of resilience and the history of resilience studies in the fields of psychiatry, developmental psychopathology, human development, medicine, epidemiology, and the social sciences are examined. Survival, recovery, and thriving are concepts associated with resilience and describe the stage at which a person may be during or after facing adversity. The concept of “thriving” refers to a person’s ability to go beyond his or her original level of functioning and to grow and function despite repeated exposure to stressful experiences. The literature suggests a number of variables that characterize resilience and thriving. These variables include positive self-esteem, hardiness, strong coping skills, a sense of coherence, self-efficacy, optimism, strong social resources, adaptability, risk-taking, low fear of failure, determination, perseverance, and a high tolerance of uncertainty. These are reviewed in this article. The findings in this article suggest that those who develop leaders need to create safe environments to help emerging and existing leaders thrive as individuals and as organizational leaders in the area of resilience

  20. The Regional Hydrologic Extremes Assessment System: A software framework for hydrologic modeling and data assimilation.

    Science.gov (United States)

    Andreadis, Konstantinos M; Das, Narendra; Stampoulis, Dimitrios; Ines, Amor; Fisher, Joshua B; Granger, Stephanie; Kawata, Jessie; Han, Eunjin; Behrangi, Ali

    2017-01-01

    The Regional Hydrologic Extremes Assessment System (RHEAS) is a prototype software framework for hydrologic modeling and data assimilation that automates the deployment of water resources nowcasting and forecasting applications. A spatially-enabled database is a key component of the software that can ingest a suite of satellite and model datasets while facilitating the interfacing with Geographic Information System (GIS) applications. The datasets ingested are obtained from numerous space-borne sensors and represent multiple components of the water cycle. The object-oriented design of the software allows for modularity and extensibility, showcased here with the coupling of the core hydrologic model with a crop growth model. RHEAS can exploit multi-threading to scale with increasing number of processors, while the database allows delivery of data products and associated uncertainty through a variety of GIS platforms. A set of three example implementations of RHEAS in the United States and Kenya are described to demonstrate the different features of the system in real-world applications.

  1. Contagion effect of enabling or coercive use of costing model within the managerial couple in lean organizations

    DEFF Research Database (Denmark)

    Kristensen, Thomas; Israelsen, Poul

    In the lean strategy, enabling formalization behaviour is expected at the lower levels of management for the strategy to be successful. We study the contagion effect between the superior, the middle manager, and the lower-level manager. This effect is proposed to be a dominant contingency variable for the use of costing... models at the lower levels of management. Thus the use of costing models at the middle manager level is an important key to being successful with the lean package....

  2. Reversible CO binding enables tunable CO/H₂ and CO/N₂ separations in metal-organic frameworks with exposed divalent metal cations.

    Science.gov (United States)

    Bloch, Eric D; Hudson, Matthew R; Mason, Jarad A; Chavan, Sachin; Crocellà, Valentina; Howe, Joshua D; Lee, Kyuho; Dzubak, Allison L; Queen, Wendy L; Zadrozny, Joseph M; Geier, Stephen J; Lin, Li-Chiang; Gagliardi, Laura; Smit, Berend; Neaton, Jeffrey B; Bordiga, Silvia; Brown, Craig M; Long, Jeffrey R

    2014-07-30

    Six metal-organic frameworks of the M2(dobdc) (M = Mg, Mn, Fe, Co, Ni, Zn; dobdc(4-) = 2,5-dioxido-1,4-benzenedicarboxylate) structure type are demonstrated to bind carbon monoxide reversibly and at high capacity. Infrared spectra indicate that, upon coordination of CO to the divalent metal cations lining the pores within these frameworks, the C-O stretching frequency is blue-shifted, consistent with nonclassical metal-CO interactions. Structure determinations reveal M-CO distances ranging from 2.09(2) Å for M = Ni to 2.49(1) Å for M = Zn and M-C-O angles ranging from 161.2(7)° for M = Mg to 176.9(6)° for M = Fe. Electronic structure calculations employing density functional theory (DFT) resulted in good agreement with the trends apparent in the infrared spectra and crystal structures. These results represent the first crystallographically characterized magnesium and zinc carbonyl compounds and the first high-spin manganese(II), iron(II), cobalt(II), and nickel(II) carbonyl species. Adsorption isotherms indicate reversible adsorption, with capacities for the Fe, Co, and Ni frameworks approaching one CO per metal cation site at 1 bar, corresponding to loadings as high as 6.0 mmol/g and 157 cm(3)/cm(3). The six frameworks display (negative) isosteric heats of CO adsorption ranging from 52.7 to 27.2 kJ/mol along the series Ni > Co > Fe > Mg > Mn > Zn, following the Irving-Williams stability order. The reversible CO binding suggests that these frameworks may be of utility for the separation of CO from various industrial gas mixtures, including CO/H2 and CO/N2. Selectivities determined from gas adsorption isotherm data using ideal adsorbed solution theory (IAST) over a range of gas compositions at 1 bar and 298 K indicate that all six M2(dobdc) frameworks could potentially be used as solid adsorbents to replace current cryogenic distillation technologies, with the choice of M dictating adsorbent regeneration energy and the level of purity of the resulting gases.
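
The IAST selectivity calculation mentioned above can be sketched for the simplest case of two single-site Langmuir components: find the hypothetical pure-component pressures that equalise the spreading pressures, apply the composition closure, and take the selectivity ratio. The Langmuir parameters below are hypothetical, not fits to the M2(dobdc) isotherm data.

```python
import math

def iast_selectivity(P, y1, q1, b1, q2, b2):
    """Binary IAST selectivity S = (x1/y1)/(x2/y2) for two single-site
    Langmuir isotherms q_i(p) = q_i * b_i * p / (1 + b_i * p).
    Equal spreading pressures: q1*ln(1+b1*p1o) = q2*ln(1+b2*p2o);
    composition closure: y1*P/p1o + y2*P/p2o = 1."""
    y2 = 1.0 - y1

    def closure(p1o):
        # p2o chosen so the spreading pressures match p1o's.
        p2o = (math.exp(q1 / q2 * math.log(1.0 + b1 * p1o)) - 1.0) / b2
        return y1 * P / p1o + y2 * P / p2o - 1.0

    lo, hi = 1e-9, 1e9                        # closure() is decreasing
    for _ in range(200):                      # bisection on p1o
        mid = 0.5 * (lo + hi)
        if closure(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    p1o = 0.5 * (lo + hi)
    x1 = y1 * P / p1o                         # adsorbed-phase fraction
    return (x1 / y1) / ((1.0 - x1) / y2)

# Hypothetical parameters: component 1 binds ten times more strongly.
print(iast_selectivity(P=1.0, y1=0.5, q1=6.0, b1=5.0, q2=6.0, b2=0.5))
```

With equal saturation capacities, IAST reduces to the extended Langmuir result, so the selectivity equals b1/b2, a convenient sanity check on the solver.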

  3. A LIGHT-WEIGHT MVC (MODEL-VIEW-CONTROLLER) FRAMEWORK FOR SMART DEVICE APPLICATION

    Directory of Open Access Journals (Sweden)

    Budi Darma Laksana

    2006-01-01

    Full Text Available In this paper, a light-weight MVC framework for smart device applications is designed and implemented. The primary goal of the work is to provide an MVC framework for commercial smart device product development. To this end, the developed framework integrates a classic design pattern, MVC, with a state-of-the-art technology, XAML, by adapting the MVC framework of an open-source XAML effort, MyXaml, to the .NET Compact Framework. As the Compact Framework comprises only 12% of the .NET Framework library, some design and architectural changes to the existing framework were needed to achieve the same level of abstraction. The adapted framework makes it possible to reduce the complexity of smart device application development, to reuse each MVC component separately in different projects, and to provide more manageable source code, as the system architecture is more apparent from the source code itself, together with a common development pattern. A prototype of a simple database interface application was built to demonstrate these benefits.
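
The separation the framework targets can be illustrated with a minimal, framework-agnostic MVC sketch (the original work is in the .NET Compact Framework with XAML; the class names and observer wiring here are invented for illustration):

```python
class Model:
    """Holds application state and notifies observers on change."""
    def __init__(self):
        self._items, self._observers = [], []

    def subscribe(self, callback):
        self._observers.append(callback)

    def add(self, item):
        self._items.append(item)
        for notify in self._observers:
            notify(list(self._items))

class View:
    """Renders whatever the model publishes; knows nothing of storage."""
    def __init__(self):
        self.rendered = ""

    def render(self, items):
        self.rendered = ", ".join(items)

class Controller:
    """Translates user input into model updates."""
    def __init__(self, model):
        self.model = model

    def handle_input(self, text):
        if text.strip():                  # trivial validation
            self.model.add(text.strip())

model, view = Model(), View()
model.subscribe(view.render)
controller = Controller(model)
controller.handle_input("  save record ")
print(view.rendered)
```

Because each role is a separate object, any of the three can be swapped or reused independently, which is the reuse benefit the abstract describes.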

  4. Enabling Integrated Decision Making for Electronic-Commerce by Modelling an Enterprise's Sharable Knowledge.

    Science.gov (United States)

    Kim, Henry M.

    2000-01-01

    An enterprise model, a computational model of knowledge about an enterprise, is a useful tool for integrated decision-making by e-commerce suppliers and customers. Sharable knowledge, once represented in an enterprise model, can be integrated by the modeled enterprise's e-commerce partners. Presents background on enterprise modeling, followed by…

  6. Pilot project as enabler?

    DEFF Research Database (Denmark)

    Neisig, Margit; Glimø, Helle; Holm, Catrine Granzow;

    This article deals with a systemic perspective on transition. The field of study addressed is a pilot project as enabler of transition in a highly complex polycentric context. From a Luhmannian systemic approach, a framework is created to understand and address barriers of change occurred using p...

  7. Airline Sustainability Modeling: A New Framework with Application of Bayesian Structural Equation Modeling

    Directory of Open Access Journals (Sweden)

    Hashem Salarzadeh Jenatabadi

    2016-11-01

    Full Text Available There are many factors which could influence the sustainability of airlines. The main purpose of this study is to introduce a framework for a financial sustainability index and model it based on structural equation modeling (SEM) with maximum likelihood and Bayesian predictors. The introduced framework includes economic performance, operational performance, cost performance, and financial performance. Based on both Bayesian SEM (Bayesian-SEM) and Classical SEM (Classical-SEM), it was found that economic performance, together with both operational performance and cost performance, is significantly related to the financial performance index. Four mathematical indices are employed, root mean square error, coefficient of determination, mean absolute error, and mean absolute percentage error, to compare the efficiency of Bayesian-SEM and Classical-SEM in predicting airline financial performance. The outputs confirmed that the framework with Bayesian prediction delivered a good fit with the data, although the framework predicted with the Classical-SEM approach did not produce a well-fitting model. The reasons for this discrepancy between the Classical and Bayesian predictions, as well as the potential advantages and caveats of applying the Bayesian approach in airline sustainability studies, are discussed.
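
The four comparison indices named in the abstract (RMSE, coefficient of determination, MAE, MAPE) are standard and easy to compute; the sketch below shows their definitions on hypothetical paired values, not the study's data.

```python
import math

def regression_metrics(actual, predicted):
    """Return RMSE, R^2, MAE, and MAPE for paired observations."""
    n = len(actual)
    mean_a = sum(actual) / n
    sse = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    sst = sum((a - mean_a) ** 2 for a in actual)
    return {
        "rmse": math.sqrt(sse / n),
        "r2": 1.0 - sse / sst,
        "mae": sum(abs(a - p) for a, p in zip(actual, predicted)) / n,
        "mape": 100.0 / n * sum(abs((a - p) / a)
                                for a, p in zip(actual, predicted)),
    }

# Hypothetical financial-performance scores from one model fit.
actual    = [3.0, 4.0, 5.0, 6.0]
predicted = [2.8, 4.1, 5.2, 5.7]
print(regression_metrics(actual, predicted))
```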

  8. Legitimising neural network river forecasting models: a new data-driven mechanistic modelling framework

    Science.gov (United States)

    Mount, N. J.; Dawson, C. W.; Abrahart, R. J.

    2013-01-01

    In this paper we address the difficult problem of gaining an internal, mechanistic understanding of a neural network river forecasting (NNRF) model. Neural network models in hydrology have long been criticised for their black-box character, which prohibits adequate understanding of their modelling mechanisms and has limited their broad acceptance by hydrologists. In response, we here present a new, data-driven mechanistic modelling (DDMM) framework that incorporates an evaluation of the legitimacy of a neural network's internal modelling mechanism as a core element in the model development process. The framework is exemplified for two NNRF modelling scenarios, and uses a novel adaptation of first-order, partial derivative, relative sensitivity analysis methods as the means by which each model's mechanistic legitimacy is explored. The results demonstrate the limitations of standard, goodness-of-fit validation procedures applied by NNRF modellers, by highlighting how the internal mechanisms of complex models that produce the best fit scores can have much lower legitimacy than simpler counterparts whose scores are only slightly inferior. The study emphasises the urgent need for better mechanistic understanding of neural network-based hydrological models and the further development of methods for elucidating their mechanisms.
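
The underlying measure, first-order relative sensitivity, can be sketched generically: scale the partial derivative of the model output with respect to each input by the input-to-output ratio. The derivative is estimated here by central differences, and `toy_flow` is an invented stand-in for an NNRF model, not the paper's networks.

```python
def relative_sensitivity(f, x, i, eps=1e-6):
    """First-order relative sensitivity of f at x to input i:
    (df/dx_i) * (x_i / f(x)), estimated by central differences."""
    up, dn = list(x), list(x)
    up[i] += eps
    dn[i] -= eps
    grad = (f(up) - f(dn)) / (2.0 * eps)
    return grad * x[i] / f(x)

# Stand-in "model": rainfall drives flow quadratically, temperature weakly.
def toy_flow(x):
    rain, temp = x
    return 2.0 * rain ** 2 + 0.1 * temp

x = [3.0, 20.0]
print([relative_sensitivity(toy_flow, x, i) for i in range(2)])
```

Comparing these ratios across inputs is what lets one judge whether a model's internal mechanism is physically plausible, independently of its goodness-of-fit score.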

  9. Improvements in the Scalability of the NASA Goddard Multiscale Modeling Framework for Hurricane Climate Studies

    Science.gov (United States)

    Shen, Bo-Wen; Tao, Wei-Kuo; Chern, Jiun-Dar

    2007-01-01

    Improving our understanding of hurricane inter-annual variability and the impact of climate change (e.g., doubling CO2 and/or global warming) on hurricanes brings both scientific and computational challenges to researchers. As hurricane dynamics involves multiscale interactions among synoptic-scale flows, mesoscale vortices, and small-scale cloud motions, an ideal numerical model suitable for hurricane studies should demonstrate its capabilities in simulating these interactions. The newly-developed multiscale modeling framework (MMF, Tao et al., 2007) and the substantial computing power by the NASA Columbia supercomputer show promise in pursuing the related studies, as the MMF inherits the advantages of two NASA state-of-the-art modeling components: the GEOS4/fvGCM and 2D GCEs. This article focuses on the computational issues and proposes a revised methodology to improve the MMF's performance and scalability. It is shown that this prototype implementation enables 12-fold performance improvements with 364 CPUs, thereby making it more feasible to study hurricane climate.

  10. Identifying the barriers and enablers for a triage, treatment, and transfer clinical intervention to manage acute stroke patients in the emergency department: a systematic review using the theoretical domains framework (TDF).

    Science.gov (United States)

    Craig, Louise E; McInnes, Elizabeth; Taylor, Natalie; Grimley, Rohan; Cadilhac, Dominique A; Considine, Julie; Middleton, Sandy

    2016-11-28

    Clinical guidelines recommend that assessment and management of patients with stroke commences early including in emergency departments (ED). To inform the development of an implementation intervention targeted in ED, we conducted a systematic review of qualitative and quantitative studies to identify relevant barriers and enablers to six key clinical behaviours in acute stroke care: appropriate triage, thrombolysis administration, monitoring and management of temperature, blood glucose levels, and of swallowing difficulties and transfer of stroke patients in ED. Studies of any design, conducted in ED, where barriers or enablers based on primary data were identified for one or more of these six clinical behaviours. Major biomedical databases (CINAHL, OVID SP EMBASE, OVID SP MEDLINE) were searched using comprehensive search strategies. The barriers and enablers were categorised using the theoretical domains framework (TDF). The behaviour change technique (BCT) that best aligned to the strategy each enabler represented was selected for each of the reported enablers using a standard taxonomy. Five qualitative studies and four surveys out of the 44 studies identified met the selection criteria. The majority of barriers reported corresponded with the TDF domains of "environmental, context and resources" (such as stressful working conditions or lack of resources) and "knowledge" (such as lack of guideline awareness or familiarity). The majority of enablers corresponded with the domains of "knowledge" (such as education for physicians on the calculated risk of haemorrhage following intravenous thrombolysis [tPA]) and "skills" (such as providing opportunity to treat stroke cases of varying complexity). The total number of BCTs assigned was 18. The BCTs most frequently assigned to the reported enablers were "focus on past success" and "information about health consequences." 
Barriers and enablers for the delivery of key evidence-based protocols in an emergency setting have

  11. A Physics-Based Modeling Framework for Prognostic Studies

    Science.gov (United States)

    Kulkarni, Chetan S.

    2014-01-01

    Prognostics and Health Management (PHM) methodologies have emerged as one of the key enablers for achieving efficient system level maintenance as part of a busy operations schedule, and lowering overall life cycle costs. PHM is also emerging as a high-priority issue in critical applications, where the focus is on conducting fundamental research in the field of integrated systems health management. The term diagnostics relates to the ability to detect and isolate faults or failures in a system. Prognostics on the other hand is the process of predicting health condition and remaining useful life based on current state, previous conditions and future operating conditions. PHM methods combine sensing, data collection, interpretation of environmental, operational, and performance related parameters to indicate systems health under its actual application conditions. The development of prognostics methodologies for the electronics field has become more important as more electrical systems are being used to replace traditional systems in several applications in the aeronautics, maritime, and automotive fields. The development of prognostics methods for electronics presents several challenges due to the great variety of components used in a system, a continuous development of new electronics technologies, and a general lack of understanding of how electronics fail. Similarly with electric unmanned aerial vehicles, electrichybrid cars, and commercial passenger aircraft, we are witnessing a drastic increase in the usage of batteries to power vehicles. However, for battery-powered vehicles to operate at maximum efficiency and reliability, it becomes crucial to both monitor battery health and performance and to predict end of discharge (EOD) and end of useful life (EOL) events. We develop an electrochemistry-based model of Li-ion batteries that capture the significant electrochemical processes, are computationally efficient, capture the effects of aging, and are of suitable
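
The abstract describes an electrochemistry-based Li-ion model; the sketch below substitutes a much simpler equivalent-circuit cell (linear open-circuit voltage in state of charge, constant internal resistance) purely to illustrate how an end-of-discharge prediction is marched forward in time. All parameter values are illustrative assumptions.

```python
def predict_end_of_discharge(capacity_ah, current_a, r_internal=0.05,
                             v_ocv_full=4.2, v_ocv_empty=3.0,
                             v_cutoff=3.2, dt_s=1.0):
    """March a toy equivalent-circuit cell forward under constant
    current until the terminal voltage hits the cutoff; return the
    time-to-EOD in seconds. OCV is assumed linear in state of charge."""
    soc, t = 1.0, 0.0
    while soc > 0.0:
        ocv = v_ocv_empty + (v_ocv_full - v_ocv_empty) * soc
        v_terminal = ocv - current_a * r_internal   # ohmic drop
        if v_terminal <= v_cutoff:
            return t
        soc -= current_a * dt_s / (capacity_ah * 3600.0)
        t += dt_s
    return t

# A 2 Ah cell discharged at 2 A (1C), hypothetical parameters.
print(predict_end_of_discharge(capacity_ah=2.0, current_a=2.0))
```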

  12. Designing a framework to design a business model for the 'bottom of the pyramid' population

    NARCIS (Netherlands)

    Ver loren van Themaat, Tanye; Schutte, Cornelius S.L.; Lutters, Diederick

    2013-01-01

    This article presents a framework for developing and designing a business model to target the bottom of the pyramid (BoP) population. Using blue ocean strategy and business model literature, integrated with research on the BoP, the framework offers a systematic approach for organisations to analyse

  14. A Framework for Parameter Estimation and Model Selection from Experimental Data in Systems Biology Using Approximate Bayesian Computation

    Science.gov (United States)

    Liepe, Juliane; Kirk, Paul; Filippi, Sarah; Toni, Tina; Barnes, Chris P.; Stumpf, Michael P.H.

    2016-01-01

    As modeling becomes a more widespread practice in the life- and biomedical sciences, we require reliable tools to calibrate models against ever more complex and detailed data. Here we present an approximate Bayesian computation framework and software environment, ABC-SysBio, which enables parameter estimation and model selection in the Bayesian formalism using Sequential Monte-Carlo approaches. We outline the underlying rationale, discuss the computational and practical issues, and provide detailed guidance as to how the important tasks of parameter inference and model selection can be carried out in practice. Unlike other available packages, ABC-SysBio is highly suited for investigating in particular the challenging problem of fitting stochastic models to data. Although computationally expensive, the additional insights gained in the Bayesian formalism more than make up for this cost, especially in complex problems. PMID:24457334
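
ABC-SysBio implements the sequential Monte Carlo variant; the sketch below shows only the simpler rejection form of approximate Bayesian computation on a toy Gaussian problem, to make the core idea concrete: keep parameter draws whose simulated data fall within a tolerance of the observation. The prior, simulator, and tolerance are illustrative assumptions.

```python
import random
import statistics

def abc_rejection(observed, prior_sample, simulate, distance,
                  n_draws=2000, epsilon=0.5, seed=1):
    """Approximate Bayesian computation by rejection sampling."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)
        if distance(simulate(theta, rng), observed) < epsilon:
            accepted.append(theta)          # theta kept as posterior draw
    return accepted

# Toy inference: recover the mean of a Gaussian with known sd = 1.
observed_mean = 2.0
posterior = abc_rejection(
    observed=observed_mean,
    prior_sample=lambda rng: rng.uniform(-5.0, 5.0),
    simulate=lambda theta, rng: statistics.fmean(
        rng.gauss(theta, 1.0) for _ in range(30)),
    distance=lambda sim, obs: abs(sim - obs),
)
print(len(posterior), statistics.fmean(posterior))
```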

  15. A modeling framework for the evolution and spread of antibiotic resistance: literature review and model categorization.

    Science.gov (United States)

    Spicknall, Ian H; Foxman, Betsy; Marrs, Carl F; Eisenberg, Joseph N S

    2013-08-15

    Antibiotic-resistant infections complicate treatment and increase morbidity and mortality. Mathematical modeling has played an integral role in improving our understanding of antibiotic resistance. In these models, parameter sensitivity is often assessed, while model structure sensitivity is not. To examine the implications of this, we first reviewed the literature on antibiotic-resistance modeling published between 1993 and 2011. We then classified each article's model structure into one or more of 6 categories based on the assumptions made in those articles regarding within-host and population-level competition between antibiotic-sensitive and antibiotic-resistant strains. Each model category has different dynamic implications with respect to how antibiotic use affects resistance prevalence, and therefore each may produce different conclusions about optimal treatment protocols that minimize resistance. Thus, even if all parameter values are correctly estimated, inferences may be incorrect because of the incorrect selection of model structure. Our framework provides insight into model selection.

  16. The Marine Virtual Laboratory (version 2.1): enabling efficient ocean model configuration

    Science.gov (United States)

    Oke, Peter R.; Proctor, Roger; Rosebrock, Uwe; Brinkman, Richard; Cahill, Madeleine L.; Coghlan, Ian; Divakaran, Prasanth; Freeman, Justin; Pattiaratchi, Charitha; Roughan, Moninya; Sandery, Paul A.; Schaeffer, Amandine; Wijeratne, Sarath

    2016-09-01

    The technical steps involved in configuring a regional ocean model are analogous for all community models. All require the generation of a model grid, preparation and interpolation of topography, initial conditions, and forcing fields. Each task in configuring a regional ocean model is straightforward - but the process of downloading and reformatting data can be time-consuming. For an experienced modeller, the configuration of a new model domain can take as little as a few hours - but for an inexperienced modeller, it can take much longer. In pursuit of technical efficiency, the Australian ocean modelling community has developed the Web-based MARine Virtual Laboratory (WebMARVL). WebMARVL allows a user to quickly and easily configure an ocean general circulation or wave model through a simple interface, reducing the time to configure a regional model to a few minutes. Through WebMARVL, a user is prompted to define the basic options needed for a model configuration, including the model, run duration, spatial extent, and input data. Once all aspects of the configuration are selected, a series of data extraction, reprocessing, and repackaging services are run, and a "take-away bundle" is prepared for download. Building on the capabilities developed under Australia's Integrated Marine Observing System, WebMARVL also extracts all of the available observations for the chosen time-space domain. The user is able to download the take-away bundle and use it to run the model of his or her choice. Models supported by WebMARVL include three community ocean general circulation models and two community wave models. The model configuration from the take-away bundle is intended to be a starting point for scientific research. The user may subsequently refine the details of the model set-up to improve the model performance for the given application. 
In this study, WebMARVL is described along with a series of results from test cases comparing WebMARVL-configured models to observations

  17. A Framework Model for an Order Fulfillment System Based on Service Oriented Architecture

    Institute of Scientific and Technical Information of China (English)

    YANG Li-xi; LI Shi-qi

    2008-01-01

    To effectively implement order fulfillment, we present an integrated framework model focusing on the whole process of order fulfillment. Firstly, five aims of the OFS (order fulfillment system) are established. Then, after discussing the three major processes of order fulfillment, we summarize the functional and quality attributes of the OFS. Subsequently, we investigate SOA (Service-Oriented Architecture) and present an SOA meta-model to serve as an integrated framework and to fulfill the quality requirements. Moreover, based on the SOA meta-model, we construct a conceptual framework model that aims to conveniently integrate functions from different systems into the order fulfillment system. This model offers enterprises a new approach to implementing order fulfillment.

  18. NUMERICAL MODELS AS ENABLING TOOLS FOR TIDAL-STREAM ENERGY EXTRACTION AND ENVIRONMENTAL IMPACT ASSESSMENT

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Zhaoqing; Wang, Taiping

    2016-06-24

    This paper presents a modeling study conducted to evaluate tidal-stream energy extraction and its associated potential environmental impacts using a three-dimensional unstructured-grid coastal ocean model, which was coupled with a water-quality model and a tidal-turbine module.

  19. Model-Based Reasoning in the Upper-Division Physics Laboratory: Framework and Initial Results

    CERN Document Server

    Zwickl, Benjamin M; Finkelstein, Noah; Lewandowski, H J

    2014-01-01

    Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable process, within physics education, it has been preferentially applied to the iterative development of broadly applicable principles (e.g., Newton's laws of motion in introductory mechanics). We review and extend existing frameworks on modeling to develop a new framework that more naturally describes model-based reasoning in upper-division physics labs. A significant feature of the new framework is that measurement tools (in addition to the physical system being studied) are subjected to the process of modeling. Think-aloud interviews were used to document examples of model-based reasoning in the laboratory and refine the modeling framework. The interviews showed how students productively applied similar facets of modeling to the physical system and measurement tools: construction, prediction, interpretation of data, identification of mod...

  20. Environmental Model Interoperability Enabled by Open Geospatial Standards - Results of a Feasibility Study (Invited)

    Science.gov (United States)

    Benedict, K. K.; Yang, C.; Huang, Q.

    2010-12-01

    The availability of high-speed research networks such as the US National Lambda Rail and the GÉANT network, scalable on-demand commodity computing resources provided by public and private "cloud" computing systems, and increasing demand for rapid access to the products of environmental models for both research and public policy development contribute to a growing need for the evaluation and development of environmental modeling systems that distribute processing, storage, and data delivery capabilities between network connected systems. In an effort to address the feasibility of developing a standards-based distributed modeling system in which model execution systems are physically separate from data storage and delivery systems, the research project presented in this paper developed a distributed dust forecasting system in which two nested atmospheric dust models are executed at George Mason University (GMU, in Fairfax, VA) while data and model output processing services are hosted at the University of New Mexico (UNM, in Albuquerque, NM). Exchange of model initialization and boundary condition parameters between the servers at UNM and the model execution systems at GMU is accomplished through Open Geospatial Consortium (OGC) Web Coverage Services (WCS) and Web Feature Services (WFS) while model outputs are pushed from GMU systems back to UNM using a REST web service interface. In addition to OGC and non-OGC web services for exchange between UNM and GMU, the servers at UNM also provide access to the input meteorological model products, intermediate and final dust model outputs, and other products derived from model outputs through OGC WCS, WFS, and OGC Web Map Services (WMS). The performance of the nested versus non-nested models is assessed in this research, with the results of the performance analysis providing the core content of the produced feasibility study. System integration diagram illustrating the storage and service platforms hosted at the Earth Data
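The OGC exchange described above can be made concrete with a key-value-pair GetCoverage request of the kind a WCS client would issue to pull boundary-condition data. This is a sketch only: the endpoint URL, coverage name, and bounding box below are hypothetical placeholders, not the project's actual services.

```python
from urllib.parse import urlencode

def wcs_getcoverage_url(base_url, coverage_id, bbox, crs="EPSG:4326",
                        fmt="image/tiff", version="1.0.0"):
    """Build a KVP-style GetCoverage request for an OGC WCS 1.0.0 endpoint."""
    params = {
        "SERVICE": "WCS",
        "VERSION": version,
        "REQUEST": "GetCoverage",
        "COVERAGE": coverage_id,
        "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "FORMAT": fmt,
        "WIDTH": 512,
        "HEIGHT": 512,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical server and coverage name, for illustration only.
url = wcs_getcoverage_url("https://example.edu/wcs",
                          "dust_forecast",
                          bbox=(-109.1, 31.2, -103.0, 37.0))
```

A model-execution host would issue such a request over HTTP at initialization time and read the returned coverage as its input grid, which is what decouples the data server from the compute server in the architecture described above.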

  1. A High Fidelity Multiphysics Framework for Modeling CRUD Deposition on PWR Fuel Rods

    Science.gov (United States)

    Walter, Daniel John

    Corrosion products on the fuel cladding surfaces within pressurized water reactor fuel assemblies have had a significant impact on reactor operation. These types of deposits are referred to as CRUD and can lead to power shifts, as a consequence of the accumulation of solid boron phases on the fuel rod surfaces. Corrosion deposits can also lead to fuel failure resulting from localized corrosion, where the increased thermal resistance of the deposit leads to higher cladding temperatures. The prediction of these occurrences requires a comprehensive model of local thermal hydraulic and chemical processes occurring in close proximity to the cladding surface, as well as their driving factors. Such factors include the rod power distribution, coolant corrosion product concentration, as well as the feedbacks between heat transfer, fluid dynamics, chemistry, and neutronics. To correctly capture the coupled physics and corresponding feedbacks, a high fidelity framework is developed that predicts three-dimensional CRUD deposition on a rod-by-rod basis. Multiphysics boundary conditions resulting from the coupling of heat transfer, fluid dynamics, coolant chemistry, CRUD deposition, neutron transport, and nuclide transmutation inform the CRUD deposition solver. Through systematic parametric sensitivity studies of the CRUD property inputs, coupled boundary conditions, and multiphysics feedback mechanisms, the most important variables of multiphysics CRUD modeling are identified. Moreover, the modeling framework is challenged with a blind comparison of plant data to predictions by a simulation of a sub-assembly within the Seabrook nuclear plant that experienced CRUD induced fuel failures. The physics within the computational framework are loosely coupled via an operator-splitting technique. A control theory approach is adopted to determine the temporal discretization at which to execute a data transfer from one physics to another. The coupled stepsize selection is viewed as a

  2. Computer-aided modeling framework for efficient model development, analysis and identification

    DEFF Research Database (Denmark)

    Heitzig, Martina; Sin, Gürkan; Sales Cruz, Mauricio;

    2011-01-01

    Model-based computer-aided product-process engineering has attained increased importance in a number of industries, including pharmaceuticals, petrochemicals, fine chemicals, polymers, biotechnology, food, energy, and water. This trend is set to continue due to the substantial benefits computer-aided methods introduce. The key prerequisite of computer-aided product-process engineering is, however, the availability of models of different types, forms, and application modes. The development of the models required for the systems under investigation tends to be a challenging and time-consuming task. The methodology consists of two branches; the first branch deals with single-scale model development, while the second branch introduces features for multiscale model development. In this paper, the emphasis is on the single-scale model development and application part. The modeling framework and the supported stepwise...

  3. A prototype framework for models of socio-hydrology: identification of key feedback loops and parameterisation approach

    Science.gov (United States)

    Elshafei, Y.; Sivapalan, M.; Tonts, M.; Hipsey, M. R.

    2014-06-01

    It is increasingly acknowledged that, in order to sustainably manage global freshwater resources, it is critical that we better understand the nature of human-hydrology interactions at the broader catchment system scale. Yet to date, a generic conceptual framework for building models of catchment systems that include adequate representation of socioeconomic systems - and the dynamic feedbacks between human and natural systems - has remained elusive. In an attempt to work towards such a model, this paper outlines a generic framework for models of socio-hydrology applicable to agricultural catchments, made up of six key components that combine to form the coupled system dynamics: namely, catchment hydrology, population, economics, environment, socioeconomic sensitivity and collective response. The conceptual framework posits two novel constructs: (i) a composite socioeconomic driving variable, termed the Community Sensitivity state variable, which seeks to capture the perceived level of threat to a community's quality of life, and acts as a key link tying together one of the fundamental feedback loops of the coupled system, and (ii) a Behavioural Response variable as the observable feedback mechanism, which reflects land and water management decisions relevant to the hydrological context. The framework makes a further contribution through the introduction of three macro-scale parameters that enable it to normalise for differences in climate, socioeconomic and political gradients across study sites. In this way, the framework provides for both macro-scale contextual parameters, which allow for comparative studies to be undertaken, and catchment-specific conditions, by way of tailored "closure relationships", in order to ensure that site-specific and application-specific contexts of socio-hydrologic problems can be accommodated. 
To demonstrate how such a framework would be applied, two socio-hydrological case studies, taken from the Australian experience, are presented

  4. A unified framework for modeling landscape evolution by discrete flows

    Science.gov (United States)

    Shelef, Eitan; Hilley, George E.

    2016-05-01

    Topographic features such as branched valley networks and undissected convex-up hillslopes are observed in disparate physical environments. In some cases, these features are formed by sediment transport processes that occur discretely in space and time, while in others, by transport processes that are uniformly distributed across the landscape. This paper presents an analytical framework that reconciles the basic attributes of such sediment transport processes with the topographic features that they form and casts those in terms that are likely common to different physical environments. In this framework, temporal changes in surface elevation reflect the frequency with which the landscape is traversed by geophysical flows generated discretely in time and space. This frequency depends on the distance to which flows travel downslope, which depends on the dynamics of individual flows, the lithologic and topographic properties of the underlying substrate, and the coevolution of topography, erosion, and the routing of flows over the topographic surface. To explore this framework, we postulate simple formulations for sediment transport and flow runout distance and demonstrate that the conditions for hillslope and channel network formation can be cast in terms of fundamental parameters such as distance from drainage divide and a friction-like coefficient that describes a flow's resistance to motion. The framework we propose is intentionally general, but the postulated formulas can be substituted with those that aim to describe a specific process and to capture variations in the size distribution of such flow events.
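One way to make the runout idea tangible is a toy one-dimensional profile in which each discrete flow travels downslope only while the local slope exceeds a friction-like coefficient, eroding the cells it traverses. Everything here (the threshold rule, the uplift term, the numbers) is an illustrative assumption, not the paper's actual formulation.

```python
import random

def evolve_profile(z, mu, n_flows, dz=0.01, uplift=0.0005):
    """Toy 1D discrete-flow erosion: each flow starts at a random cell and
    runs downslope while the local slope exceeds a friction-like
    threshold mu, eroding each cell it traverses by dz."""
    n = len(z)
    for _ in range(n_flows):
        i = random.randrange(n - 1)
        while i < n - 1 and (z[i] - z[i + 1]) > mu:  # slope beats resistance
            z[i] -= dz          # erode the traversed cell
            i += 1              # flow continues downslope
        z = [h + uplift for h in z]  # uniform uplift maintains relief
    return z

random.seed(0)
profile = [1.0 - 0.01 * x for x in range(100)]  # initial uniform slope
final = evolve_profile(profile, mu=0.008, n_flows=2000)
```

Even in this caricature, the friction-like coefficient mu controls how far flows run before stopping, so varying mu (and the position relative to the divide) shifts the landscape between dissected and undissected states, which is the qualitative behavior the framework formalizes.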

  5. Framework for Modelling Multiple Input Complex Aggregations for Interactive Installations

    DEFF Research Database (Denmark)

    Padfield, Nicolas; Andreasen, Troels

    2012-01-01

    We describe a generalized framework as a method and design tool for creating interactive installations with a demand for exploratory meaning creation, not limited to the design stage, but extending into the stage where the installation meets participants and audience. The proposed solution is bas...

  6. Parametric design and analysis framework with integrated dynamic models

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer

    2014-01-01

    of building energy and indoor environment, are generally confined to late in the design process. Consequence based design is a framework intended for the early design stage. It involves interdisciplinary expertise that secures validity and quality assurance with a simulationist while sustaining autonomous...

  7. Holland's RIASEC Model as an Integrative Framework for Individual Differences

    Science.gov (United States)

    Armstrong, Patrick Ian; Day, Susan X.; McVay, Jason P.; Rounds, James

    2008-01-01

    Using data from published sources, the authors investigated J. L. Holland's (1959, 1997) theory of interest types as an integrative framework for organizing individual differences variables that are used in counseling psychology. Holland's interest types were used to specify 2- and 3-dimensional interest structures. In Study 1, measures of…

  9. The unified model of vegetarian identity: A conceptual framework for understanding plant-based food choices.

    Science.gov (United States)

    Rosenfeld, Daniel L; Burrow, Anthony L

    2017-05-01

    By departing from social norms regarding food behaviors, vegetarians acquire membership in a distinct social group and can develop a salient vegetarian identity. However, vegetarian identities are diverse, multidimensional, and unique to each individual. Much research has identified fundamental psychological aspects of vegetarianism, and an identity framework that unifies these findings into common constructs and conceptually defines variables is needed. Integrating psychological theories of identity with research on food choices and vegetarianism, this paper proposes a conceptual model for studying vegetarianism: The Unified Model of Vegetarian Identity (UMVI). The UMVI encompasses ten dimensions-organized into three levels (contextual, internalized, and externalized)-that capture the role of vegetarianism in an individual's self-concept. Contextual dimensions situate vegetarianism within contexts; internalized dimensions outline self-evaluations; and externalized dimensions describe enactments of identity through behavior. Together, these dimensions form a coherent vegetarian identity, characterizing one's thoughts, feelings, and behaviors regarding being vegetarian. By unifying dimensions that capture psychological constructs universally, the UMVI can prevent discrepancies in operationalization, capture the inherent diversity of vegetarian identities, and enable future research to generate greater insight into how people understand themselves and their food choices.

  10. Autogenerator-based modelling framework for development of strategic games simulations: rational pigs game extended.

    Science.gov (United States)

    Fabac, Robert; Radošević, Danijel; Magdalenić, Ivan

    2014-01-01

    When considering strategic games from the conceptual perspective that focuses on the questions of participants' decision-making rationality, the very issues of modelling and simulation are rarely discussed. The well-known Rational Pigs matrix game has been relatively intensively analyzed in terms of reassessment of the logic of two players involved in asymmetric situations as gluttons that differ significantly by their attributes. This paper presents a successful attempt at using an autogenerator to create the framework of the game, including the predefined scenarios and corresponding payoffs. The autogenerator offers flexibility concerning the specification of game parameters, which consist of variations in the number of simultaneous players and their features, game objects and their attributes, as well as some general game characteristics. In the proposed approach, the autogenerator model was upgraded to enable program specification updates. For the purpose of treating more complex strategic scenarios, we created the Rational Pigs Game Extended (RPGE), in which the introduction of a third glutton entails significant structural changes. In addition, due to the existence of particular attributes of the new player, "the tramp," one equilibrium point from the original game is destabilized, which influences the decision-making of rational players.

  11. Total Quality Management (TQM framework for e-learning based on EFQM and Kirkpatrick models

    Directory of Open Access Journals (Sweden)

    Jeanne Schreurs

    2006-07-01

    Full Text Available The EFQM excellence model is a famous quality management tool. We have translated it to be useful in e-learning quality management. EFQM will be used as a framework for self-evaluation. We developed the e-learning stakeholder model. We identified the main criteria and positioned them in the stakeholder model. We briefly present the Kirkpatrick evaluation model of e-learning. We developed a Kirkpatrick-EFQM self-assessment framework. We propose the limited learner-centric self-assessment framework. A preliminary set of quality criteria has been identified for self-assessment by the learners.

  12. Software Process Improvement Framework Based on CMMI Continuous Model Using QFD

    Directory of Open Access Journals (Sweden)

    Yonghui Cao

    2013-01-01

    Full Text Available In an era of rapid technological innovation and change, the key to a company's survival is the continuous improvement of its processes. In this paper, we introduce Software Process Improvement (SPI) and Quality Function Deployment (QFD) and, to combine the staged model and the continuous model in CMMI, present a Software Process Improvement framework with CMMI that has two parts: (1) the SPI framework with the CMMI staged model based on QFD, and (2) the SPI framework for CMMI based on the QFD continuous model. Finally, we draw conclusions.

  13. Open Knee: Open Source Modeling & Simulation to Enable Scientific Discovery and Clinical Care in Knee Biomechanics

    Science.gov (United States)

    Erdemir, Ahmet

    2016-01-01

    Virtual representations of the knee joint can provide clinicians, scientists, and engineers the tools to explore mechanical function of the knee and its tissue structures in health and disease. Modeling and simulation approaches such as finite element analysis also provide the possibility to understand the influence of surgical procedures and implants on joint stresses and tissue deformations. A large number of knee joint models are described in the biomechanics literature. However, freely accessible, customizable, and easy-to-use models are scarce. Availability of such models can accelerate clinical translation of simulations, where labor intensive reproduction of model development steps can be avoided. The interested parties can immediately utilize readily available models for scientific discovery and for clinical care. Motivated by this gap, this study aims to describe an open source and freely available finite element representation of the tibiofemoral joint, namely Open Knee, which includes detailed anatomical representation of the joint's major tissue structures, their nonlinear mechanical properties and interactions. Three use cases illustrate customization potential of the model, its predictive capacity, and its scientific and clinical utility: prediction of joint movements during passive flexion, examining the role of meniscectomy on contact mechanics and joint movements, and understanding anterior cruciate ligament mechanics. A summary of scientific and clinically directed studies conducted by other investigators are also provided. The utilization of this open source model by groups other than its developers emphasizes the premise of model sharing as an accelerator of simulation-based medicine. Finally, the imminent need to develop next generation knee models are noted. 
These are anticipated to incorporate individualized anatomy and tissue properties supported by specimen-specific joint mechanics data for evaluation, all acquired in vitro from varying age

  14. From Principles to Details: Integrated Framework for Architecture Modelling of Large Scale Software Systems

    Directory of Open Access Journals (Sweden)

    Andrzej Zalewski

    2013-06-01

    Full Text Available There exist numerous models of software architecture (box models, ADLs, UML, architectural decisions), architecture modelling frameworks (views, enterprise architecture frameworks), and even standards recommending practice for architectural description. We show in this paper that there is still a gap between these rather abstract frameworks/standards and existing architecture models. Frameworks and standards define what should be modelled rather than which models should be used and how these models are related to each other. We intend to prove that a less abstract modelling framework is needed for the effective modelling of large-scale software-intensive systems. It should provide more precise guidance on the kinds of models to be employed and how they should relate to each other. The paper defines principles that can serve as a base for an integrated model. Finally, a structure for such a model is proposed. It comprises three layers: the upper one – architectural policy – reflects corporate policy and strategies in architectural terms; the middle one – system organisation pattern – represents the core structural concepts and their rationale at a given level of scope; the lower one contains detailed architecture models. Architectural decisions play an important role here: they model the core architectural concepts explaining the detailed models, and they organise the entire integrated model and the relations between its submodels.

  15. System modeling with the DISC framework: evidence from safety-critical domains.

    Science.gov (United States)

    Reiman, Teemu; Pietikäinen, Elina; Oedewald, Pia; Gotcheva, Nadezhda

    2012-01-01

    The objective of this paper is to illustrate the development and application of the Design for Integrated Safety Culture (DISC) framework for system modeling by evaluating organizational potential for safety in nuclear and healthcare domains. The DISC framework includes criteria for good safety culture and a description of functions that the organization needs to implement in order to orient the organization toward the criteria. Three case studies will be used to illustrate the utilization of the DISC framework in practice.

  16. Experimental development based on mapping rule between requirements analysis model and web framework specific design model.

    Science.gov (United States)

    Okuda, Hirotaka; Ogata, Shinpei; Matsuura, Saeko

    2013-12-01

    Model Driven Development is a promising approach to developing high-quality software systems. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface prototype from the UML requirements analysis model, so that we can confirm the validity of input/output data for each page and of page transitions by directly operating the prototype. We propose a mapping rule in which design information independent of each web application framework implementation is defined based on the requirements analysis model, so as to improve traceability from the valid requirements analysis model to the final product. This paper discusses the result of applying our method to the development of a Group Work Support System that is currently running in our department.

  17. Parallelization and High-Performance Computing Enables Automated Statistical Inference of Multi-scale Models.

    Science.gov (United States)

    Jagiella, Nick; Rickert, Dennis; Theis, Fabian J; Hasenauer, Jan

    2017-02-22

    Mechanistic understanding of multi-scale biological processes, such as cell proliferation in a changing biological tissue, is readily facilitated by computational models. While tools exist to construct and simulate multi-scale models, the statistical inference of the unknown model parameters remains an open problem. Here, we present and benchmark a parallel approximate Bayesian computation sequential Monte Carlo (pABC SMC) algorithm, tailored for high-performance computing clusters. pABC SMC is fully automated and returns reliable parameter estimates and confidence intervals. By running the pABC SMC algorithm for ~10^6 hr, we parameterize multi-scale models that accurately describe quantitative growth curves and histological data obtained in vivo from individual tumor spheroid growth in media droplets. The models capture the hybrid deterministic-stochastic behaviors of 10^5-10^6 cells growing in a 3D dynamically changing nutrient environment. The pABC SMC algorithm reliably converges to a consistent set of parameters. Our study demonstrates a proof of principle for robust, data-driven modeling of multi-scale biological systems and the feasibility of multi-scale model parameterization through statistical inference.

  18. A Variational Bayes Genomic-Enabled Prediction Model with Genotype × Environment Interaction

    Directory of Open Access Journals (Sweden)

    Osval A. Montesinos-López

    2017-06-01

    Full Text Available There are Bayesian and non-Bayesian genomic models that take into account G×E interactions. However, the computational cost of implementing Bayesian models is high, and becomes almost impossible when the number of genotypes, environments, and traits is very large, while, in non-Bayesian models, there are often important and unsolved convergence problems. The variational Bayes method is popular in machine learning, and, by approximating the probability distributions through optimization, it tends to be faster than Markov Chain Monte Carlo methods. For this reason, in this paper, we propose a new genomic variational Bayes version of the Bayesian genomic model with G×E using half-t priors on each standard deviation (SD) term to guarantee highly noninformative priors and posterior inferences that are not sensitive to the choice of hyper-parameters. We show the complete theoretical derivation of the full conditional and the variational posterior distributions, and their implementations. We used eight experimental genomic maize and wheat data sets to illustrate the new proposed variational Bayes approximation, and compared its predictions and implementation time with a standard Bayesian genomic model with G×E. Results indicated that prediction accuracies are slightly higher in the standard Bayesian model with G×E than in its variational counterpart, but, in terms of computation time, the variational Bayes genomic model with G×E is, in general, 10 times faster than the conventional Bayesian genomic model with G×E. For this reason, the proposed model may be a useful tool for researchers who need to predict and select genotypes in several environments.

  19. Climate model emulation in an integrated assessment framework: a case study for mitigation policies in the electricity sector

    Science.gov (United States)

    Foley, A. M.; Holden, P. B.; Edwards, N. R.; Mercure, J.-F.; Salas, P.; Pollitt, H.; Chewpreecha, U.

    2016-02-01

    We present a carbon-cycle-climate modelling framework using model emulation, designed for integrated assessment modelling, which introduces a new emulator of the carbon cycle (GENIEem). We demonstrate that GENIEem successfully reproduces the CO2 concentrations of the Representative Concentration Pathways when forced with the corresponding CO2 emissions and non-CO2 forcing. To demonstrate its application as part of the integrated assessment framework, we use GENIEem along with an emulator of the climate (PLASIM-ENTSem) to evaluate global CO2 concentration levels and spatial temperature and precipitation response patterns resulting from CO2 emission scenarios. These scenarios are modelled using a macroeconometric model (E3MG) coupled to a model of technology substitution dynamics (FTT), and represent different emissions reduction policies applied solely in the electricity sector, without mitigation in the rest of the economy. The effect of cascading uncertainty is apparent, but despite uncertainties, it is clear that in all scenarios, global mean temperatures in excess of 2 °C above pre-industrial levels are projected by the end of the century. Our approach also highlights the regional temperature and precipitation patterns associated with the global mean temperature change occurring in these scenarios, enabling more robust impacts modelling and emphasizing the necessity of focusing on spatial patterns in addition to global mean temperature change.
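
    As a sketch of what "model emulation" means here: an emulator replaces an expensive simulator with a cheap surrogate fitted to a handful of runs. The example below uses plain Lagrange polynomial interpolation through three sampled runs of a stand-in "expensive" model; GENIEem and PLASIM-ENTSem are far more sophisticated (dimension-reduced, calibrated ensemble emulators), so every name and number here is an illustrative assumption.

```python
def lagrange_emulator(xs, ys):
    """Build a cheap surrogate: the unique polynomial through the sampled
    (design point, model output) pairs, evaluated in Lagrange form."""
    def emulate(x):
        total = 0.0
        for i, (xi, yi) in enumerate(zip(xs, ys)):
            w = 1.0
            for j, xj in enumerate(xs):
                if j != i:
                    w *= (x - xj) / (xi - xj)
            total += yi * w
        return total
    return emulate

def expensive_model(forcing):
    """Stand-in for a costly simulator (a quadratic response, so the
    three-point emulator reproduces it exactly)."""
    return 2.0 + 0.1 * forcing ** 2

design = [0.0, 5.0, 10.0]
runs = [expensive_model(x) for x in design]   # the only 'expensive' calls
emulator = lagrange_emulator(design, runs)    # cheap to evaluate anywhere
```

    Once trained, the emulator can be queried thousands of times inside an integrated assessment loop at negligible cost compared with re-running the full simulator.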

  20. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation.

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  1. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation (presentation)

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  2. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation

    Science.gov (United States)

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  3. A GIS-Enabled, Michigan-Specific, Hierarchical Groundwater Modeling and Visualization System

    Science.gov (United States)

    Liu, Q.; Li, S.; Mandle, R.; Simard, A.; Fisher, B.; Brown, E.; Ross, S.

    2005-12-01

    Efficient management of groundwater resources relies on a comprehensive database that represents the characteristics of the natural groundwater system as well as analysis and modeling tools to describe the impacts of decision alternatives. Many agencies in Michigan have spent several years compiling expensive and comprehensive surface water and groundwater inventories and other related spatial data that describe their respective areas of responsibility. However, most often this wealth of descriptive data has only been utilized for basic mapping purposes. The benefits from analyzing these data, using GIS analysis functions or externally developed analysis models or programs, have yet to be systematically realized. In this talk, we present a comprehensive software environment that allows Michigan groundwater resources managers and frontline professionals to make more effective use of the available data and improve their ability to manage and protect groundwater resources, address potential conflicts, design cleanup schemes, and prioritize investigation activities. In particular, we take advantage of the Interactive Ground Water (IGW) modeling system and convert it to a customized software environment specifically for analyzing, modeling, and visualizing the Michigan statewide groundwater database. The resulting Michigan IGW modeling system (IGW-M) is completely window-based, fully interactive, and seamlessly integrated with a GIS mapping engine. The system operates in real-time (on the fly), providing dynamic, hierarchical mapping, modeling, spatial analysis, and visualization. Specifically, IGW-M allows water resources and environmental professionals in Michigan to: * Access and utilize the extensive data from the statewide groundwater database, interactively manipulate GIS objects, and display and query the associated data and attributes; * Analyze and model the statewide groundwater database, interactively convert GIS objects into numerical model features

  4. Parametric Generation of Polygonal Tree Models for Rendering on Tessellation-Enabled Hardware

    OpenAIRE

    Nystad, Jørgen

    2010-01-01

    The main contribution of this thesis is a parametric method for generation of single-mesh polygonal tree models that follow natural rules as indicated by da Vinci in his notebooks. Following these rules allows for a relatively simple scheme of connecting branches to parent branches. Proper branch connection is a requirement for gaining the benefits of subdivision. Techniques for proper texture coordinate generation and subdivision are also explored. The result is a tree model generation scheme ...

  5. Enabling Energy-Awareness in the Semantic 3d City Model of Vienna

    Science.gov (United States)

    Agugiaro, G.

    2016-09-01

    This paper presents and discusses the first results regarding selection, analysis, preparation and eventual integration of a number of energy-related datasets, chosen in order to enrich a CityGML-based semantic 3D city model of Vienna. CityGML is an international standard conceived specifically as an information and data model for semantic city models at urban and territorial scale. The still-in-development Energy Application Domain Extension (ADE) is a CityGML extension conceived to specifically model, manage and store energy-related features and attributes for buildings. The work presented in this paper is embedded within the European Marie-Curie ITN project "CINERGY, Smart cities with sustainable energy systems", which aims, among other things, at developing urban decision-making and operational optimisation software tools to minimise non-renewable energy use in cities. Given the scope and scale of the project, it is therefore vital to set up a common, unique and spatio-semantically coherent urban data model to be used as an information hub for all applications being developed. This paper reports on the experience gained so far: it describes the test area in Vienna, Austria, and the available data sources; it shows and exemplifies the main data integration issues and the strategies developed to solve them in order to obtain the enriched 3D city model. The first results as well as some comments about their quality and limitations are presented, together with a discussion of the next steps and some planned improvements.

  6. The DSET Tool Library: A software approach to enable data exchange between climate system models

    Energy Technology Data Exchange (ETDEWEB)

    McCormick, J. [Lawrence Livermore National Lab., CA (United States)

    1994-12-01

    Climate modeling is a computationally intensive process. Until recently computers were not powerful enough to perform the complex calculations required to simulate the earth's climate. As a result standalone programs were created that represent components of the earth's climate (e.g., Atmospheric Circulation Model). However, recent advances in computing, including massively parallel computing, make it possible to couple the components forming a complete earth climate simulation. The ability to couple different climate model components will significantly improve our ability to predict climate accurately and reliably. Historically each major component of the coupled earth simulation is a standalone program designed independently with different coordinate systems and data representations. In order for two component models to be coupled, the data of one model must be mapped to the coordinate system of the second model. The focus of this project is to provide a general tool to facilitate the mapping of data between simulation components, with an emphasis on using object-oriented programming techniques to provide polynomial interpolation, line and area weighting, and aggregation services.
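
    The area-weighting service mentioned above can be illustrated with a conservative 1-D remap between two grids; the function name and grids below are illustrative, not the DSET library's actual API.

```python
def area_weighted_remap(src_edges, src_vals, dst_edges):
    """Conservatively remap cell-averaged values from one 1-D grid to another
    by weighting each source cell by its overlap length with the target cell."""
    out = []
    for d0, d1 in zip(dst_edges, dst_edges[1:]):
        total, width = 0.0, 0.0
        for (s0, s1), v in zip(zip(src_edges, src_edges[1:]), src_vals):
            overlap = min(d1, s1) - max(d0, s0)
            if overlap > 0:
                total += v * overlap
                width += overlap
        out.append(total / width if width else 0.0)
    return out

# Source grid: two cells [0,1) and [1,2) holding averages 10 and 30;
# target grid has cells [0,0.5), [0.5,1.5) and [1.5,2).
vals = area_weighted_remap([0.0, 1.0, 2.0], [10.0, 30.0], [0.0, 0.5, 1.5, 2.0])
```

    The middle target cell straddles both source cells and receives the overlap-weighted average of their values, so totals are conserved across the regrid.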

  7. Modelling plasticity of unsaturated soils in a thermodynamically consistent framework

    CERN Document Server

    Coussy, O

    2010-01-01

    Constitutive equations of unsaturated soils are often derived in a thermodynamically consistent framework through the use of a unique 'effective' interstitial pressure. This latter is naturally chosen as the space-averaged interstitial pressure. However, experimental observations have revealed that two stress state variables are needed to describe the stress-strain-strength behaviour of unsaturated soils. The thermodynamic analysis presented here shows that the most general approach to the behaviour of unsaturated soils actually requires three stress state variables: the suction, which is required to describe the retention properties of the soil, and two effective stresses, which are required to describe the soil deformation at constant water saturation. Actually, it is shown that a simple assumption related to internal deformation leads to the need of a unique effective stress to formulate the stress-strain constitutive equation describing the soil deformation. An elastoplastic framework is then presented ...

  8. Enabling high-quality observations of surface imperviousness for water runoff modelling from unmanned aerial vehicles

    Science.gov (United States)

    Tokarczyk, Piotr; Leitao, Joao Paulo; Rieckermann, Jörg; Schindler, Konrad; Blumensaat, Frank

    2015-04-01

    Modelling rainfall-runoff in urban areas is increasingly applied to support flood risk assessment, particularly against the background of a changing climate and increasing urbanization. These models typically rely on high-quality data for rainfall and the surface characteristics of the area. While recent research in urban drainage has been focusing on providing spatially detailed rainfall data, the technological advances in remote sensing that ease the acquisition of detailed land-use information are less prominently discussed within the community. The relevance of such methods increases because in many parts of the globe accurate land-use information is generally lacking, as detailed image data are unavailable. Modern unmanned aerial vehicles (UAVs) allow acquiring high-resolution images on a local level at comparatively low cost, performing on-demand repetitive measurements, and obtaining a degree of detail tailored to the purpose of the study. In this study, we investigate for the first time the possibility to derive high-resolution imperviousness maps for urban areas from UAV imagery and to use this information as input for urban drainage models. To do so, an automatic processing pipeline with a modern classification method is tested and applied in a state-of-the-art urban drainage modelling exercise. In a real-life case study in the area of Lucerne, Switzerland, we compare imperviousness maps generated from a consumer micro-UAV and standard large-format aerial images acquired by the Swiss national mapping agency (swisstopo). After assessing their correctness, we perform an end-to-end comparison, in which they are used as an input for an urban drainage model. Then, we evaluate the influence which different image data sources and their processing methods have on hydrological and hydraulic model performance. We analyze the surface runoff of the 307 individual sub-catchments regarding relevant attributes, such as peak runoff and volume. Finally, we evaluate the model

  9. A Framework for Modeling and Simulation of the Artificial

    Science.gov (United States)

    2012-01-01

    A musical performance may have a style of symphonic, folk, or jazz, and can therefore have an ensemble of orchestra, small group, or soloist. An example constraint from the framework: (constraint m3 :musical-performance (==> (equale (e@ style) jazz) (or (equale (e@ ensemble) small-group) (equale (e@ ensemble) orchestra))))

  10. A Modular Simulation Framework for Assessing Swarm Search Models

    Science.gov (United States)

    2014-09-01

    ... explored utilizing today's search decision support and analysis tools. This thesis develops a framework in MATLAB that allows the investigation of search

  11. CM-DataONE: A Framework for collaborative analysis of climate model output

    Science.gov (United States)

    Xu, Hao; Bai, Yuqi; Li, Sha; Dong, Wenhao; Huang, Wenyu; Xu, Shiming; Lin, Yanluan; Wang, Bin

    2015-04-01

    CM-DataONE is a distributed collaborative analysis framework for climate model data which aims to break through the data access barriers of increasing file size and to accelerate the research process. As the data size involved in projects such as the fifth Coupled Model Intercomparison Project (CMIP5) has reached petabytes, conventional methods for analysis and diagnosis of model outputs have become rather time-consuming and redundant. CM-DataONE is developed for data publishers and researchers from relevant areas. It can enable easy access to distributed data and provide extensible analysis functions based on tools such as the NCAR Command Language, NetCDF Operators (NCO) and Climate Data Operators (CDO). CM-DataONE can be easily installed, configured, and maintained. The main web application has two separate parts which communicate with each other through APIs based on the HTTP protocol. The analytic server is designed to be installed in each data node while a data portal can be configured anywhere and connect to the nearest node. Functions such as data query, analytic task submission, status monitoring, visualization and product downloading are provided to end users by the data portal. Data conforming to the CMIP5 Model Output Format in each peer node can be scanned by the server and mapped to a global information database. A scheduler included in the server is responsible for task decomposition, distribution and consolidation. Analysis functions are always executed where the data locate. The analysis function package included in the server provides commonly used functions such as EOF analysis, trend analysis and time-series analysis. Functions are coupled with data by XML descriptions and can be easily extended. Various types of results can be obtained by users for further studies. This framework has significantly decreased the amount of data to be transmitted and improved efficiency in model intercomparison jobs by supporting online analysis and multi-node collaboration. To end users, data query is

  12. Advances in Geoscience Modeling: Smart Modeling Frameworks, Self-Describing Models and the Role of Standardized Metadata

    Science.gov (United States)

    Peckham, Scott

    2016-04-01

    Over the last decade, model coupling frameworks like CSDMS (Community Surface Dynamics Modeling System) and ESMF (Earth System Modeling Framework) have developed mechanisms that make it much easier for modelers to connect heterogeneous sets of process models in a plug-and-play manner to create composite "system models". These mechanisms greatly simplify code reuse, but must simultaneously satisfy many different design criteria. They must be able to mediate or compensate for differences between the process models, such as their different programming languages, computational grids, time-stepping schemes, variable names and variable units. However, they must achieve this interoperability in a way that: (1) is noninvasive, requiring only relatively small and isolated changes to the original source code, (2) does not significantly reduce performance, (3) is not time-consuming or confusing for a model developer to implement, (4) can very easily be updated to accommodate new versions of a given process model and (5) does not shift the burden of providing model interoperability to the model developers. In tackling these design challenges, model framework developers have learned that the best solution is to provide each model with a simple, standardized interface, i.e. a set of standardized functions that make the model: (1) fully-controllable by a caller (e.g. a model framework) and (2) self-describing with standardized metadata. Model control functions are separate functions that allow a caller to initialize the model, advance the model's state variables in time and finalize the model. Model description functions allow a caller to retrieve detailed information on the model's input and output variables, its computational grid and its timestepping scheme. If the caller is a modeling framework, it can use the self description functions to learn about each process model in a collection to be coupled and then automatically call framework service components (e.g. regridders
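
    The control and self-description functions described above can be illustrated with a toy interface. The method names below only loosely imitate the CSDMS BMI specification (the real BMI defines a much richer set of getters, setters and grid functions), and both toy models share a time step, so no temporal interpolation service is needed; everything here is an illustrative sketch.

```python
class BMILikeModel:
    """A toy model exposing BMI-style control and self-description functions."""
    def __init__(self, name, var_name, var_units, dt):
        self._name, self._var, self._units, self._dt = name, var_name, var_units, dt
        self._t, self._value = 0.0, 0.0

    # --- control functions: the caller drives the model ---
    def initialize(self):
        self._t, self._value = 0.0, 0.0
    def update(self):
        self._t += self._dt
        self._value += self._dt            # trivial stand-in state evolution
    def finalize(self):
        pass

    # --- self-description functions: standardized metadata ---
    def get_output_var_names(self):
        return [self._var]
    def get_var_units(self, var):
        return self._units
    def get_current_time(self):
        return self._t
    def get_value(self, var):
        return self._value
    def set_value(self, var, value):
        self._value = value

def run_coupled(upstream, downstream, t_end):
    """A minimal framework loop: advance both models in lockstep and pass the
    upstream output variable to the downstream model each step."""
    upstream.initialize(); downstream.initialize()
    while upstream.get_current_time() < t_end:
        upstream.update()
        (var,) = upstream.get_output_var_names()
        downstream.set_value("inflow", upstream.get_value(var))
        downstream.update()
    upstream.finalize(); downstream.finalize()
    return downstream.get_current_time()

runoff = BMILikeModel("runoff", "discharge", "m3 s-1", dt=1.0)
river = BMILikeModel("river", "stage", "m", dt=1.0)
t = run_coupled(runoff, river, t_end=5.0)
```

    Because the framework only ever touches the standardized interface, either model could be swapped for one in another language or behind a web service without changing the coupling loop, which is exactly the design goal described above.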

  13. Collaborative Project. A Flexible Atmospheric Modeling Framework for the Community Earth System Model (CESM)

    Energy Technology Data Exchange (ETDEWEB)

    Gettelman, Andrew [University Corporation For Atmospheric Research (UCAR), Boulder, CO (United States)

    2015-10-01

    In this project we have been upgrading the Multiscale Modeling Framework (MMF) in the Community Atmosphere Model (CAM), also known as Super-Parameterized CAM (SP-CAM). This has included a major effort to update the coding standards and interface with CAM so that it can be placed on the main development trunk. It has also included development of a new software structure for CAM to be able to handle sub-grid column information. These efforts have formed the major thrust of the work.

  14. A PLM components monitoring framework for SMEs based on a PLM maturity model and FAHP methodology

    OpenAIRE

    Zhang, Haiqing; Sekhari, Aicha; Ouzrout, Yacine; Bouras, Abdelaziz

    2014-01-01

    Selecting and investing in the right PLM components increases business advantage. This paper develops a PLM components monitoring framework to assess and guide PLM implementation in small and medium-sized enterprises (SMEs). The framework builds upon PLM maturity models and decision-making methodology. A PLM maturity model has the capability to analyze PLM functionalities and evaluate PLM components. A proposed PLM components maturity assessment (PCMA) model can obtain general maturity levels of PLM compo...

  15. A learning-enabled neuron array IC based upon transistor channel models of biological phenomena.

    Science.gov (United States)

    Brink, S; Nease, S; Hasler, P; Ramakrishnan, S; Wunderlich, R; Basu, A; Degnan, B

    2013-02-01

    We present a single-chip array of 100 biologically based electronic neuron models interconnected to each other and the outside environment through 30,000 synapses. The chip was fabricated in a standard 350 nm CMOS IC process. Our approach used dense circuit models of synaptic behavior, including biological computation and learning, as well as transistor channel models. We use Address-Event Representation (AER) spike communication for inputs and outputs to this IC. We present the IC architecture and infrastructure, including the IC chip, configuration tools, and testing platform. We present measurements of a small network of neurons, of STDP neuron dynamics, and of a compiled spiking neuron winner-take-all (WTA) topology, all compiled into this IC.
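
    The Address-Event Representation idea can be sketched in a few lines: each spike is transmitted as the address of the neuron that fired, time-multiplexed onto a shared bus. The helper names below are illustrative, not the chip's actual protocol.

```python
import heapq

def aer_encode(spike_trains):
    """Merge per-neuron spike-time lists into one time-ordered stream of
    (timestamp, address) events, as an AER bus would transmit them."""
    streams = [[(t, addr) for t in times]
               for addr, times in enumerate(spike_trains)]
    return list(heapq.merge(*streams))

def aer_decode(events, n_neurons):
    """Recover per-neuron spike-time lists from the merged event stream."""
    trains = [[] for _ in range(n_neurons)]
    for t, addr in events:
        trains[addr].append(t)
    return trains

trains = [[0.1, 0.5], [0.2], [0.3, 0.4]]   # spike times for neurons 0..2
events = aer_encode(trains)
```

    Because spikes are sparse in time, sending (time, address) pairs over one shared channel scales far better than one wire per synapse, which is why AER is the standard I/O scheme for neuromorphic ICs.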

  16. ENABLING “ENERGY-AWARENESS” IN THE SEMANTIC 3D CITY MODEL OF VIENNA

    Directory of Open Access Journals (Sweden)

    G. Agugiaro

    2016-09-01

    Full Text Available This paper presents and discusses the first results regarding selection, analysis, preparation and eventual integration of a number of energy-related datasets, chosen in order to enrich a CityGML-based semantic 3D city model of Vienna. CityGML is an international standard conceived specifically as an information and data model for semantic city models at urban and territorial scale. The still-in-development Energy Application Domain Extension (ADE) is a CityGML extension conceived to specifically model, manage and store energy-related features and attributes for buildings. The work presented in this paper is embedded within the European Marie-Curie ITN project “CINERGY, Smart cities with sustainable energy systems”, which aims, among other things, at developing urban decision-making and operational optimisation software tools to minimise non-renewable energy use in cities. Given the scope and scale of the project, it is therefore vital to set up a common, unique and spatio-semantically coherent urban data model to be used as an information hub for all applications being developed. This paper reports on the experience gained so far: it describes the test area in Vienna, Austria, and the available data sources; it shows and exemplifies the main data integration issues and the strategies developed to solve them in order to obtain the enriched 3D city model. The first results as well as some comments about their quality and limitations are presented, together with a discussion of the next steps and some planned improvements.

  17. Reconstruction of a high-quality metabolic model enables the identification of gene overexpression targets for enhanced antibiotic production in Streptomyces coelicolor A3(2).

    Science.gov (United States)

    Kim, Minsuk; Sang Yi, Jeong; Kim, Joonwon; Kim, Ji-Nu; Kim, Min Woo; Kim, Byung-Gee

    2014-09-01

    Streptomycetes are industrially and pharmaceutically important bacteria that produce a variety of secondary metabolites including antibiotics. Streptomycetes have a complex metabolic network responsible for the production of secondary metabolites and the utilization of organic residues present in soil. In this study, we reconstructed a high-quality metabolic model for Streptomyces coelicolor A3(2), designated iMK1208, in order to understand and engineer the metabolism of this model species. In comparison to iIB711, the previous metabolic model for S. coelicolor, the predictive power of iMK1208 was enhanced by the recent insights that enabled the incorporation of an updated biomass equation, stoichiometric matrix, and energetic parameters. iMK1208 was validated by comparing predictions with the experimental data for growth capability in various growth media. Furthermore, we applied a strain-design algorithm, flux scanning based on enforced objective flux (FSEOF), to iMK1208 for actinorhodin overproduction. FSEOF results identified not only previously known gene overexpression targets such as actII-ORF4 and acetyl-CoA carboxylase, but also novel targets such as branched-chain α-keto acid dehydrogenase (BCDH). We constructed and evaluated the BCDH overexpression mutant, which showed a 52-fold increase in actinorhodin production, validating the prediction power of iMK1208. Hence iMK1208 was shown to be a useful and valuable framework for studying the biotechnologically important Streptomyces species using the principles of systems biology and metabolic engineering.
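
    FSEOF itself runs flux balance analysis on a genome-scale model; the sketch below reproduces only the scanning logic on a hypothetical three-branch toy network whose flux optima are written in closed form (no LP solver). The reaction names and stoichiometry are invented for illustration and bear no relation to iMK1208.

```python
def optimize_toy_network(enforced_product):
    """Stand-in for one FBA solve on a toy network: substrate uptake (<= 10)
    splits into biomass and product, with a cofactor reaction whose flux is
    stoichiometrically tied to product formation. Maximizing biomass subject
    to the enforced minimum product flux gives this closed-form optimum."""
    uptake = 10.0
    product = enforced_product
    biomass = uptake - product
    cofactor = 2.0 * product          # e.g. an invented NADPH demand
    return {"uptake": uptake, "biomass": biomass,
            "product": product, "cofactor": cofactor}

def fseof_targets(product_steps):
    """FSEOF scanning logic: step up the enforced product flux, re-optimize,
    and report reactions whose flux increases monotonically; those are the
    candidate overexpression targets."""
    scans = [optimize_toy_network(p) for p in product_steps]
    targets = []
    for rxn in scans[0]:
        series = [s[rxn] for s in scans]
        if all(b > a for a, b in zip(series, series[1:])):
            targets.append(rxn)
    return targets

targets = fseof_targets([0.0, 2.0, 4.0, 6.0])
```

    In this toy, the product-coupled cofactor reaction is flagged while biomass (decreasing) and uptake (constant) are not, mirroring how FSEOF surfaced the non-obvious BCDH target in the study above.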

  18. Assessing Students' Understandings of Biological Models and Their Use in Science to Evaluate a Theoretical Framework

    Science.gov (United States)

    Grünkorn, Juliane; Upmeier zu Belzen, Annette; Krüger, Dirk

    2014-01-01

    Research in the field of students' understandings of models and their use in science describes different frameworks concerning these understandings. Currently, there is no conjoint framework that combines these structures and so far, no investigation has focused on whether it reflects students' understandings sufficiently (empirical evaluation).…

  19. Applying the Nominal Response Model within a Longitudinal Framework to Construct the Positive Family Relationships Scale

    Science.gov (United States)

    Preston, Kathleen Suzanne Johnson; Parral, Skye N.; Gottfried, Allen W.; Oliver, Pamella H.; Gottfried, Adele Eskeles; Ibrahim, Sirena M.; Delany, Danielle

    2015-01-01

    A psychometric analysis was conducted using the nominal response model under the item response theory framework to construct the Positive Family Relationships scale. Using data from the Fullerton Longitudinal Study, this scale was constructed within a long-term longitudinal framework spanning middle childhood through adolescence. Items tapping…
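
    For reference, the nominal response model (Bock, 1972) assigns to person j with latent trait θ_j the probability of choosing response category k of an item with category slopes a_k and intercepts c_k:

```latex
P(X_{ij} = k \mid \theta_j)
  = \frac{\exp(a_k \theta_j + c_k)}{\sum_{m=1}^{K} \exp(a_m \theta_j + c_m)},
\qquad k = 1, \dots, K,
```

    with an identification constraint such as \sum_m a_m = \sum_m c_m = 0. Unlike graded-response models, it imposes no ordering on the categories, which suits rating items whose response options need not be ordinal.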

  20. Robustness Analysis of Road Networks: a Framework with Combined DTA Models

    NARCIS (Netherlands)

    Li, M.

    2008-01-01

    Network robustness is the ability of a road network to function properly in the face of unpredictable and exceptional incidents. A systematic framework with combined dynamic traffic assignment (DTA) models is designed for the analysis of road network robustness. With this framework, network performance co

  1. A model based safety architecture framework for Dutch high speed train lines

    NARCIS (Netherlands)

    Schuitemaker, K.; Braakhuis, J.G.; Rajabalinejad, M.

    2015-01-01

    This paper presents a model-based safety architecture framework (MBSAF) for capturing and sharing architectural knowledge of safety cases of safety-critical systems of systems (SoS). Whilst architecture frameworks in the systems engineering domain often consider safety as a dependent attribute, this st

  2. Towards a Framework for Modelling and Verification of Relay Interlocking Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2010-01-01

    This paper describes a framework currently under development for modelling, simulation, and verification of relay interlocking systems as used by the Danish railways. The framework is centred around a domain-specific language (DSL) for describing such systems, and provides (1) a graphical editor ...

  3. Integration of the DAYCENT Biogeochemical Model within a Multi-Model Framework

    Energy Technology Data Exchange (ETDEWEB)

    David Muth

    2012-07-01

    Agricultural residues are the largest near-term source of cellulosic biomass for bioenergy production, but removing agricultural residues sustainably requires considering the critical roles that residues play in the agronomic system. Determining sustainable removal rates for agricultural residues has received significant attention, and integrated modeling strategies have been built to evaluate sustainable removal rates considering soil erosion and organic matter constraints. However, the current integrated model does not quantitatively assess the soil carbon and long-term crop yield impacts of residue removal. Furthermore, the current integrated model does not evaluate the greenhouse gas impacts of residue removal, specifically N2O and CO2 gas fluxes from the soil surface. The DAYCENT model simulates several important processes for determining agroecosystem performance. These processes include daily nitrogen-gas flux, daily carbon dioxide flux from soil respiration, soil organic carbon and nitrogen, net primary productivity, and daily water and nitrate leaching. Each of these processes is an indicator of sustainability when evaluating emerging cellulosic biomass production systems for bioenergy. A potentially vulnerable cellulosic biomass resource is agricultural residues. This paper presents the integration of the DAYCENT model with the existing integration framework modeling tool to investigate additional environmental impacts of agricultural residue removal. The integrated model is extended to facilitate two-way coupling between DAYCENT and the existing framework. The extended integrated model is applied to investigate additional environmental impacts from a recent sustainable agricultural residue removal dataset. The integrated model with DAYCENT finds some differences in sustainable removal rates compared to previous results for a case study county in Iowa. The extended integrated model with

  4. Neonatal tolerance induction enables accurate evaluation of gene therapy for MPS I in a canine model.

    Science.gov (United States)

    Hinderer, Christian; Bell, Peter; Louboutin, Jean-Pierre; Katz, Nathan; Zhu, Yanqing; Lin, Gloria; Choa, Ruth; Bagel, Jessica; O'Donnell, Patricia; Fitzgerald, Caitlin A; Langan, Therese; Wang, Ping; Casal, Margret L; Haskins, Mark E; Wilson, James M

    2016-09-01

    High fidelity animal models of human disease are essential for preclinical evaluation of novel gene and protein therapeutics. However, these studies can be complicated by exaggerated immune responses against the human transgene. Here we demonstrate that dogs with a genetic deficiency of the enzyme α-l-iduronidase (IDUA), a model of the lysosomal storage disease mucopolysaccharidosis type I (MPS I), can be rendered immunologically tolerant to human IDUA through neonatal exposure to the enzyme. Using MPS I dogs tolerized to human IDUA as neonates, we evaluated intrathecal delivery of an adeno-associated virus serotype 9 vector expressing human IDUA as a therapy for the central nervous system manifestations of MPS I. These studies established the efficacy of the human vector in the canine model, and allowed for estimation of the minimum effective dose, providing key information for the design of first-in-human trials. This approach can facilitate evaluation of human therapeutics in relevant animal models, and may also have clinical applications for the prevention of immune responses to gene and protein replacement therapies. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Quality Concerns in Technical Education in India: A Quantifiable Quality Enabled Model

    Science.gov (United States)

    Gambhir, Victor; Wadhwa, N. C.; Grover, Sandeep

    2016-01-01

    Purpose: The paper aims to discuss current Technical Education scenarios in India. It proposes modelling the factors affecting quality in a technical institute and then applying a suitable technique for assessment, comparison and ranking. Design/methodology/approach: The paper chose a graph-theoretic approach for quantification of quality-enabled…

  6. Thermal modelling approaches to enable mitigation measures implementation for salmonid gravel stages in hydropeaking rivers

    Science.gov (United States)

    Casas-Mulet, R.; Alfredsen, K. T.

    2016-12-01

    The dewatering of salmon spawning redds caused by hydropeaking operations can lead to mortality in early life stages, with a higher impact on the alevin stage, which has a lower tolerance to dewatering than the eggs. Targeted flow-related mitigation measures can reduce such mortality, but it is essential to understand how hydropeaking changes thermal regimes in rivers and may impact embryo development; only then can optimal measures be implemented at the right development stage. We present a set of experimental approaches and modelling tools for the estimation of hatch and swim-up dates based on water temperature data in the river Lundesokna (Norway). We identified critical periods for gravel-stage survival and, through comparing hydropeaking vs. unregulated thermal and hydrological regimes, we established potential flow-release measures to minimise mortality. Modelling outcomes were then used to assess the cost-efficiency of each measure. The combination of modelling tools used in this study was overall satisfactory, and their application can be useful especially in systems where little field data is available. Targeted measures built on well-informed modelling approaches can be pre-tested for their efficiency in mitigating dewatering effects against the hydropower system's capacity to release or conserve water for power production. Overall, environmental flow releases targeting specific ecological objectives can provide more cost-effective options than conventional operational rules complying with general legislation.
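
    The abstract does not specify which development model is used to estimate hatch and swim-up dates; a common approach for salmonid embryos is degree-day accumulation, sketched below. The function name, base temperature and degree-day requirement are illustrative assumptions, not values from the study.

```python
def degree_day_date(daily_temps_c, required_degree_days, base_c=0.0):
    """Estimate a development milestone (e.g. hatch or swim-up) as the first
    day on which accumulated degree-days above a base temperature reach a
    species-specific requirement. Returns a 0-based day index, or None if
    the requirement is never met within the record."""
    accumulated = 0.0
    for day, temp in enumerate(daily_temps_c):
        accumulated += max(temp - base_c, 0.0)   # only warmth above base counts
        if accumulated >= required_degree_days:
            return day
    return None

# Illustrative: a river held at a constant 4 degC with an assumed requirement
# of 440 degree-days.
hatch_day = degree_day_date([4.0] * 200, required_degree_days=440.0)
```

    Run with measured hydropeaking vs. unregulated temperature series, the same function shows how altered thermal regimes shift predicted hatch and swim-up dates, and hence when flow-release measures are needed.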

  7. Developmental Impact Analysis of an ICT-Enabled Scalable Healthcare Model in BRICS Economies

    Directory of Open Access Journals (Sweden)

    Dhrubes Biswas

    2012-06-01

    This article highlights the need for initiating a healthcare business model in a grassroots, emerging-nation context. This article's backdrop is a history of chronic anomalies afflicting the healthcare sector in India and similarly placed BRICS nations. In these countries, a significant percentage of the population remains deprived of basic healthcare facilities and emergency services. Community (primary) care services are being offered by public and private stakeholders as a panacea to the problem. Yet there is an urgent need for specialized (tertiary) care services at all levels. As a response to this challenge, an all-inclusive health-exchange system (HES) model, which utilizes information communication technology (ICT) to provide solutions in rural India, has been developed. The uniqueness of the model lies in its innovative hub-and-spoke architecture and its emphasis on affordability, accessibility, and availability to the masses. This article describes a developmental impact analysis (DIA) that was used to assess the impact of this model. The article contributes to the knowledge base of readers by making them aware of the healthcare challenges emerging nations are facing and of ways to mitigate those challenges using entrepreneurial solutions.

  8. Plant parameters for plant functional groups of western rangelands to enable process-based simulation modeling

    Science.gov (United States)

    Regional environmental assessments with process-based models require realistic estimates of plant parameters for the primary plant functional groups in the region. “Functional group” in this context is an operational term, based on similarities in plant type and in plant parameter values. Likewise...

  9. EarthCube - Earth System Bridge: Spanning Scientific Communities with Interoperable Modeling Frameworks

    Science.gov (United States)

    Peckham, S. D.; DeLuca, C.; Gochis, D. J.; Arrigo, J.; Kelbert, A.; Choi, E.; Dunlap, R.

    2014-12-01

    In order to better understand and predict environmental hazards of weather/climate, ecology and deep earth processes, geoscientists develop and use physics-based computational models. These models are used widely both in academic and federal communities. Because of the large effort required to develop and test models, there is widespread interest in component-based modeling, which promotes model reuse and simplified coupling to tackle problems that often cross discipline boundaries. In component-based modeling, the goal is to make relatively small changes to models that make it easy to reuse them as "plug-and-play" components. Sophisticated modeling frameworks exist to rapidly couple these components to create new composite models. They allow component models to exchange variables while accommodating different programming languages, computational grids, time-stepping schemes, variable names and units. Modeling frameworks have arisen in many modeling communities. CSDMS (Community Surface Dynamics Modeling System) serves the academic earth surface process dynamics community, while ESMF (Earth System Modeling Framework) serves many federal Earth system modeling projects. Others exist in both the academic and federal domains and each satisfies design criteria that are determined by the community they serve. While they may use different interface standards or semantic mediation strategies, they share fundamental similarities. The purpose of the Earth System Bridge project is to develop mechanisms for interoperability between modeling frameworks, such as the ability to share a model or service component. This project has three main goals: (1) Develop a Framework Description Language (ES-FDL) that allows modeling frameworks to be described in a standard way so that their differences and similarities can be assessed. (2) Demonstrate that if a model is augmented with a framework-agnostic Basic Model Interface (BMI), then simple, universal adapters can go from BMI to a

  10. A User-Centric Knowledge Creation Model in a Web of Object-Enabled Internet of Things Environment.

    Science.gov (United States)

    Kibria, Muhammad Golam; Fattah, Sheik Mohammad Mostakim; Jeong, Kwanghyeon; Chong, Ilyoung; Jeong, Youn-Kwae

    2015-09-18

    User-centric service features in a Web of Object-enabled Internet of Things environment can be provided by using a semantic ontology that classifies and integrates objects on the World Wide Web as well as shares and merges context-aware information and accumulated knowledge. The semantic ontology is applied on a Web of Object platform to virtualize the real world physical devices and information to form virtual objects that represent the features and capabilities of devices in the virtual world. Detailed information and functionalities of multiple virtual objects are combined with service rules to form composite virtual objects that offer context-aware knowledge-based services, where context awareness plays an important role in enabling automatic modification of the system to reconfigure the services based on the context. Converting the raw data into meaningful information and connecting the information to form the knowledge and storing and reusing the objects in the knowledge base can both be expressed by semantic ontology. In this paper, a knowledge creation model that synchronizes a service logistic model and a virtual world knowledge model on a Web of Object platform has been proposed. To realize the context-aware knowledge-based service creation and execution, a conceptual semantic ontology model has been developed and a prototype has been implemented for a use case scenario of emergency service.
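
    The composition described above — virtual objects merged into composite virtual objects whose service rules trigger knowledge-based actions — can be sketched as follows. This is a hypothetical illustration; the class and method names are not the Web of Object platform's API.

```python
# Hypothetical sketch of virtual objects (VOs) and a composite VO whose
# service rule fires a context-aware action; not the WoO platform API.

class VirtualObject:
    """Virtualizes one physical device's features and capabilities."""
    def __init__(self, name, capabilities):
        self.name = name
        self.capabilities = capabilities  # e.g. {"temperature": 21.0}

    def update(self, key, value):
        self.capabilities[key] = value

class CompositeVirtualObject:
    """Combines several VOs with a service rule (a predicate over their
    merged context) and an action taken when the rule matches."""
    def __init__(self, members, rule, action):
        self.members, self.rule, self.action = members, rule, action

    def context(self):
        merged = {}
        for vo in self.members:
            merged.update(vo.capabilities)
        return merged

    def evaluate(self):
        ctx = self.context()
        return self.action(ctx) if self.rule(ctx) else None

# Toy emergency-service scenario, echoing the paper's use case.
smoke = VirtualObject("smoke-sensor", {"smoke": True})
door = VirtualObject("door-lock", {"locked": True})
emergency = CompositeVirtualObject(
    [smoke, door],
    rule=lambda ctx: ctx.get("smoke") and ctx.get("locked"),
    action=lambda ctx: "unlock-doors-and-alert",
)
print(emergency.evaluate())  # unlock-doors-and-alert
```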

  11. A User-Centric Knowledge Creation Model in a Web of Object-Enabled Internet of Things Environment

    Directory of Open Access Journals (Sweden)

    Muhammad Golam Kibria

    2015-09-01

    User-centric service features in a Web of Object-enabled Internet of Things environment can be provided by using a semantic ontology that classifies and integrates objects on the World Wide Web as well as shares and merges context-aware information and accumulated knowledge. The semantic ontology is applied on a Web of Object platform to virtualize the real world physical devices and information to form virtual objects that represent the features and capabilities of devices in the virtual world. Detailed information and functionalities of multiple virtual objects are combined with service rules to form composite virtual objects that offer context-aware knowledge-based services, where context awareness plays an important role in enabling automatic modification of the system to reconfigure the services based on the context. Converting the raw data into meaningful information and connecting the information to form the knowledge and storing and reusing the objects in the knowledge base can both be expressed by semantic ontology. In this paper, a knowledge creation model that synchronizes a service logistic model and a virtual world knowledge model on a Web of Object platform has been proposed. To realize the context-aware knowledge-based service creation and execution, a conceptual semantic ontology model has been developed and a prototype has been implemented for a use case scenario of emergency service.

  12. Temporo-spatial model construction using the MML and software framework.

    Science.gov (United States)

    Chang, David C; Dokos, Socrates; Lovell, Nigel H

    2011-12-01

    Development of complex temporo-spatial biological computational models can be a time-consuming and arduous task. These models may contain hundreds of differential equations as well as realistic geometries, and may require considerable investment of time to ensure that all model components are correctly implemented and error free. To tackle this problem, the Modeling Markup Languages (MML) and software framework is a modular XML/HDF5-based specification and set of toolkits that aim to simplify this process. The main goal of this framework is to encourage reusability, sharing and storage. To achieve this, the MML framework utilizes the CellML specification and repository, which comprises an extensive range of curated models available for use. The MML framework is an open-source project available at http://mml.gsbme.unsw.edu.au.

  13. A general mathematical framework for representing soil organic matter dynamics in biogeochemistry models

    Science.gov (United States)

    Sierra, C. A.; Mueller, M.

    2013-12-01

    Recent work has highlighted the importance of nonlinear interactions in representing the decomposition of soil organic matter (SOM). It is unclear, however, how to integrate these concepts into larger biogeochemical models or into a more general mathematical description of the decomposition process. Here we present a mathematical framework that generalizes both previous decomposition models and recent ideas about nonlinear microbial interactions. The framework is based on a set of four basic principles: 1) mass balance, 2) heterogeneity in the decomposability of SOM, 3) transformations in the decomposability of SOM over time, and 4) energy limitation of decomposers. This framework generalizes the large majority of SOM decomposition models proposed to date. We illustrate the application of this framework to the development of a continuous model that includes the ideas in the Dual Arrhenius Michaelis-Menten model (DAMM) for explicitly representing temperature-moisture limitations of enzyme activity in the decomposition of heterogeneous substrates.
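
    The two families of models the framework unifies can be contrasted with a minimal numerical sketch: a linear (donor-controlled) first-order pool versus a nonlinear Michaelis-Menten term with explicit microbial biomass. All parameter values below are illustrative assumptions, not values from the paper.

```python
# Toy forward-Euler comparison of two decomposition formulations that the
# framework generalizes: a linear (donor-controlled) pool and a nonlinear
# Michaelis-Menten term with explicit microbial biomass. All parameter
# values are illustrative only.

def simulate(steps=1000, dt=0.01):
    c_lin = 100.0              # carbon in the linear pool
    c_mm, b = 100.0, 2.0       # substrate carbon and microbial biomass
    k = 0.05                   # first-order decay rate (linear model)
    vmax, km = 0.5, 50.0       # Michaelis-Menten parameters
    eff, mort = 0.3, 0.02      # carbon-use efficiency, microbial turnover
    for _ in range(steps):
        c_lin += -k * c_lin * dt                 # dC/dt = -kC
        uptake = vmax * b * c_mm / (km + c_mm)   # energy-limited uptake
        c_mm += (-uptake + mort * b) * dt        # dead microbes recycle to SOM
        b += (eff * uptake - mort * b) * dt      # (1 - eff) respired as CO2
    return c_lin, c_mm, b

print(simulate())
```

    The linear pool decays independently of decomposer biomass, while the Michaelis-Menten pool's loss rate depends on both substrate and microbes, which is the nonlinear interaction the framework accommodates as a special case.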

  14. Modelling the dynamics of an experimental host-pathogen microcosm within a hierarchical Bayesian framework.

    Directory of Open Access Journals (Sweden)

    David Lunn

    The advantages of Bayesian statistical approaches, such as flexibility and the ability to acknowledge uncertainty in all parameters, have made them the prevailing method for analysing the spread of infectious diseases in human or animal populations. We introduce a Bayesian approach to experimental host-pathogen systems that shares these attractive features. Since uncertainty in all parameters is acknowledged, existing information can be accounted for through prior distributions, rather than through fixing some parameter values. The non-linear dynamics, multi-factorial design, multiple measurements of responses over time and sampling error that are typical features of experimental host-pathogen systems can also be naturally incorporated. We analyse the dynamics of the free-living protozoan Paramecium caudatum and its specialist bacterial parasite Holospora undulata. Our analysis provides strong evidence for a saturable infection function, and we were able to reproduce the two waves of infection apparent in the data by separating the initial inoculum from the parasites released after the first cycle of infection. In addition, the parameter estimates from the hierarchical model can be combined to infer variations in the parasite's basic reproductive ratio across experimental groups, enabling us to make predictions about the effect of resources and host genotype on the ability of the parasite to spread. Even though the high level of variability between replicates limited the resolution of the results, this Bayesian framework has strong potential to be used more widely in experimental ecology.
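
    The two-wave structure described above can be illustrated with a toy deterministic simulation: a saturable infection function acting on the initial inoculum, with a second cohort of parasites released after a latent cycle. This is not the authors' hierarchical Bayesian model; every parameter value is an illustrative assumption.

```python
# Toy deterministic sketch of two infection waves in a host-parasite
# microcosm: a saturable infection response beta*P/(h+P), with parasites
# released after a fixed latent period. Not the paper's Bayesian model;
# all parameters are illustrative.

def simulate(days=30, dt=0.1, latent_days=10):
    s, i = 100.0, 0.0        # susceptible and infected hosts
    p = 50.0                 # free parasites (initial inoculum)
    beta, h = 0.4, 20.0      # max infection rate, half-saturation constant
    burst, release_rate = 30.0, 0.2
    steps = int(days / dt)
    latent = int(latent_days / dt)
    new_infections = [0.0] * steps
    for t in range(steps):
        force = beta * p / (h + p)      # saturable infection function
        inf = force * s * dt
        new_infections[t] = inf
        s -= inf
        i += inf
        p -= force * dt                 # parasites consumed by uptake
        if t >= latent:                 # release from first-cycle infections
            p += burst * release_rate * new_infections[t - latent]
    return s, i, p

print(simulate())
```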

  15. An Agent-Based Modeling Framework and Application for the Generic Nuclear Fuel Cycle

    Science.gov (United States)

    Gidden, Matthew J.

    Key components of a novel methodology and implementation of an agent-based, dynamic nuclear fuel cycle simulator, Cyclus, are presented. The nuclear fuel cycle is a complex, physics-dependent supply chain. To date, existing dynamic simulators have not treated constrained fuel supply, time-dependent, isotopic-quality-based demand, or fuel fungibility particularly well. Utilizing an agent-based methodology that incorporates sophisticated graph theory and operations research techniques can overcome these deficiencies. This work describes a simulation kernel and agents that interact with it, highlighting the Dynamic Resource Exchange (DRE), the supply-demand framework at the heart of the kernel. The key agent-DRE interaction mechanisms are described, which enable complex entity interaction through the use of physics and socio-economic models. The translation of an exchange instance to a variant of the Multicommodity Transportation Problem, which can be solved feasibly or optimally, follows. An extensive investigation of solution performance and fidelity is then presented. Finally, recommendations for future users of Cyclus and the DRE are provided.
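
    The flavor of resolving a supply-demand exchange over preference-weighted arcs can be sketched with a toy greedy allocation. Cyclus's DRE formulates a proper (multicommodity) transportation problem and solves it feasibly or optimally; the greedy heuristic and the agent names below are illustrative assumptions only.

```python
# Toy greedy resolution of a fuel-supply exchange: requesters are matched
# to suppliers over cost-weighted arcs, cheapest arc first. Cyclus's DRE
# solves a proper transportation problem instead; this is only a sketch.

def resolve_exchange(supply, demand, cost):
    """supply/demand: name -> quantity; cost: (supplier, requester) -> unit
    cost. Returns flows as (supplier, requester) -> shipped quantity."""
    supply, demand = dict(supply), dict(demand)   # work on copies
    flows = {}
    for (src, dst), _ in sorted(cost.items(), key=lambda kv: kv[1]):
        qty = min(supply.get(src, 0.0), demand.get(dst, 0.0))
        if qty > 0:
            flows[(src, dst)] = qty
            supply[src] -= qty
            demand[dst] -= qty
    return flows

# Hypothetical facility names; quantities and costs are arbitrary.
supply = {"enrichment_a": 10.0, "enrichment_b": 5.0}
demand = {"reactor_1": 8.0, "reactor_2": 6.0}
cost = {("enrichment_a", "reactor_1"): 1.0,
        ("enrichment_a", "reactor_2"): 3.0,
        ("enrichment_b", "reactor_1"): 2.0,
        ("enrichment_b", "reactor_2"): 1.5}
print(resolve_exchange(supply, demand, cost))
```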

  16. A probabilistic generative model for quantification of DNA modifications enables analysis of demethylation pathways.

    Science.gov (United States)

    Äijö, Tarmo; Huang, Yun; Mannerström, Henrik; Chavez, Lukas; Tsagaratou, Ageliki; Rao, Anjana; Lähdesmäki, Harri

    2016-03-14

    We present a generative model, Lux, to quantify DNA methylation modifications from any combination of bisulfite sequencing approaches, including reduced, oxidative, TET-assisted, chemical-modification assisted, and methylase-assisted bisulfite sequencing data. Lux models all cytosine modifications (C, 5mC, 5hmC, 5fC, and 5caC) simultaneously together with experimental parameters, including bisulfite conversion and oxidation efficiencies, as well as various chemical labeling and protection steps. We show that Lux improves the quantification and comparison of cytosine modification levels and that Lux can process any oxidized methylcytosine sequencing data sets to quantify all cytosine modifications. Analysis of targeted data from Tet2-knockdown embryonic stem cells and T cells during development demonstrates DNA modification quantification at unprecedented detail, quantifies active demethylation pathways and reveals 5hmC localization in putative regulatory regions.
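
    The kind of observation model underlying a tool like Lux can be sketched for the simplest case: the probability that a read reports "C" at a cytosine in standard bisulfite sequencing, given the modification proportions and conversion efficiencies. The numeric efficiencies below are illustrative assumptions, not Lux defaults.

```python
# Toy observation model in the spirit of Lux: in standard bisulfite
# sequencing, unmethylated C converts to T with efficiency `bs_eff`,
# while 5mC/5hmC largely resist conversion. Numbers are illustrative.

def prob_read_c(p_c, p_5mc, p_5hmc, bs_eff=0.99, inacc=0.002):
    """Probability a read reports 'C' at a cytosine:
    unmethylated C survives only if conversion fails (1 - bs_eff);
    methylated forms survive unless inaccurately converted (inacc)."""
    return (p_c * (1.0 - bs_eff)
            + (p_5mc + p_5hmc) * (1.0 - inacc))

# Fully unmethylated site: reads show C only on conversion failures.
low = prob_read_c(1.0, 0.0, 0.0)
# Fully methylated (5mC) site: nearly all reads show C.
high = prob_read_c(0.0, 1.0, 0.0)
print(round(low, 4), round(high, 4))  # 0.01 0.998
```

    Note that in this single-assay model 5mC and 5hmC contribute identically, which is exactly why Lux combines multiple assay types (oxidative, TET-assisted, etc.) to make all modifications identifiable.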

  17. Robust Workflow Systems + Flexible Geoprocessing Services = Geo-enabled Model Web?

    OpenAIRE

    GRANELL CANUT CARLOS

    2013-01-01

    The chapter begins by briefly exploring the concept of modeling in geosciences, which notably benefits from advances in the integration of geoprocessing services and workflow systems. In Section 3, we provide a comprehensive background on the technology trends we treat in the chapter. On the one hand, we deal with workflow systems, normally categorized in the literature as scientific and business workflow systems (Barga and Gannon 2007). In particular, we introduce some prominent examples of scient...

  18. Spatiotemporal Stochastic Modeling of IoT Enabled Cellular Networks: Scalability and Stability Analysis

    KAUST Repository

    Gharbieh, Mohammad

    2017-05-02

    The Internet of Things (IoT) is large-scale by nature, which is manifested by the massive number of connected devices as well as their vast spatial extent. Cellular networks, which provide ubiquitous, reliable, and efficient wireless access, will play a fundamental role in delivering first-mile access for the data tsunami to be generated by the IoT. However, cellular networks may have scalability problems in providing uplink connectivity to massive numbers of connected things. To characterize the scalability of the cellular uplink in the context of IoT networks, this paper develops a traffic-aware spatiotemporal mathematical model for IoT devices supported by cellular uplink connectivity. The developed model is based on stochastic geometry and queueing theory to account for the traffic requirement per IoT device, the different transmission strategies, and the mutual interference between the IoT devices. To this end, the developed model is utilized to characterize the extent to which cellular networks can accommodate IoT traffic as well as to assess and compare three different transmission strategies that incorporate a combination of transmission persistency, backoff, and power ramping. The analysis and the results clearly illustrate the scalability problem imposed by the IoT on cellular networks and offer insights into effective scenarios for each transmission strategy.
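
    The qualitative contrast between persistent and backoff transmission can be illustrated with a toy slotted-uplink simulation. This is not the paper's stochastic-geometry/queueing model (it ignores spatial interference and power ramping entirely), and all parameters are illustrative assumptions.

```python
# Toy slotted-uplink simulation contrasting two of the strategy families
# discussed (persistent vs. random backoff). A slot succeeds for a device
# only if no other device transmits in it. Illustrative only; not the
# paper's stochastic-geometry/queueing model.
import random

def simulate(n_devices=50, slots=2000, p_transmit=1.0, seed=1):
    rng = random.Random(seed)
    backlog = [1] * n_devices            # each device has one packet
    delivered = 0
    for _ in range(slots):
        senders = [d for d in range(n_devices)
                   if backlog[d] and rng.random() < p_transmit]
        if len(senders) == 1:            # success only without collision
            backlog[senders[0]] = 0
            delivered += 1
    return delivered

persistent = simulate(p_transmit=1.0)    # everyone always transmits
backoff = simulate(p_transmit=0.05)      # transmit w.p. 0.05 per slot
print(persistent, backoff)
```

    With full persistency every slot collides and nothing is delivered, while random backoff thins the contention enough for packets to get through, which mirrors the scalability concern the abstract raises.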

  19. Model-based reasoning in the physics laboratory: Framework and initial results

    Science.gov (United States)

    Zwickl, Benjamin M.; Hu, Dehui; Finkelstein, Noah; Lewandowski, H. J.

    2015-12-01

    [This paper is part of the Focused Collection on Upper Division Physics Courses.] We review and extend existing frameworks on modeling to develop a new framework that describes model-based reasoning in introductory and upper-division physics laboratories. Constructing and using models are core scientific practices that have gained significant attention within K-12 and higher education. Although modeling is a broadly applicable process, within physics education, it has been preferentially applied to the iterative development of broadly applicable principles (e.g., Newton's laws of motion in introductory mechanics). A significant feature of the new framework is that measurement tools (in addition to the physical system being studied) are subjected to the process of modeling. Think-aloud interviews were used to refine the framework and demonstrate its utility by documenting examples of model-based reasoning in the laboratory. When applied to the think-aloud interviews, the framework captures and differentiates students' model-based reasoning and helps identify areas of future research. The interviews showed how students productively applied similar facets of modeling to the physical system and measurement tools: construction, prediction, interpretation of data, identification of model limitations, and revision. Finally, we document students' challenges in explicitly articulating assumptions when constructing models of experimental systems and further challenges in model construction due to students' insufficient prior conceptual understanding. A modeling perspective reframes many of the seemingly arbitrary technical details of measurement tools and apparatus as an opportunity for authentic and engaging scientific sense making.

  20. The Framework Dedicated to Three Phase Flows Wellbore Modelling

    Directory of Open Access Journals (Sweden)

    Bartlomiej Bielecki

    2015-01-01

    To predict physical properties in a wellbore during oil and gas production, scientists use empirical correlations or mechanistic algorithms. Typical research in this field concentrates on the analysis of a single property, such as heat transfer, pressure, or temperature. Here, the correlations most appropriate to the subject are presented, and the authors study how to join all correlations into a full framework that returns all production parameters at every depth in a wellbore. Additionally, the simulation results are studied. Based on the presented algorithms, a suitable tool has been built, and the results shown in this paper are taken from this application.

  1. Model-based visual tracking the OpenTL framework

    CERN Document Server

    Panin, Giorgio

    2011-01-01

    This book has two main goals: to provide a unified and structured overview of this growing field, as well as to propose a corresponding software framework, the OpenTL library, developed by the author and his working group at TUM-Informatik. The main objective of this work is to show how most real-world application scenarios can be naturally cast into a common description vocabulary, and therefore implemented and tested in a fully modular and scalable way, through the definition of a layered, object-oriented software architecture. The resulting architecture covers in a seamless way all processin

  2. A model independent S/W framework for search-based software testing.

    Science.gov (United States)

    Oh, Jungsup; Baik, Jongmoon; Lim, Sung-Hwa

    2014-01-01

    In the Model-Based Testing (MBT) area, Search-Based Software Testing (SBST) has been employed to generate test cases from the model of a system under test. However, many types of models are used in MBT. If the type of model changes from one to another, all functions of a search technique must be reimplemented, because the model types differ even when the same search technique is applied. Implementing the same algorithm over and over again requires too much time and effort. We propose a model-independent software framework for SBST that can reduce such redundant work. The framework provides a reusable common software platform to reduce time and effort. It not only presents design patterns for finding test cases for a target model but also reduces development time through common functions provided by the framework. We show the effectiveness and efficiency of the proposed framework with two case studies. The framework improves productivity by about 50% when the type of the model is changed.
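
    The key idea — writing the search technique once against a minimal model interface so it survives a change of model type — can be sketched as below. The class and method names are hypothetical, not the paper's API; a trivial OneMax problem stands in for a real test model.

```python
# Sketch of the framework's key idea: a search technique written against a
# minimal model interface can be reused unchanged when the model type
# changes. Names are hypothetical, not the paper's API.
import random

class SearchableModel:
    """Interface every model adapter implements for the search engine."""
    def random_candidate(self): raise NotImplementedError
    def neighbors(self, candidate): raise NotImplementedError
    def fitness(self, candidate): raise NotImplementedError  # higher = better

class OneMaxModel(SearchableModel):
    """Stand-in 'model under test': find an all-ones bit vector."""
    def __init__(self, n, seed=0):
        self.n, self.rng = n, random.Random(seed)
    def random_candidate(self):
        return [self.rng.randint(0, 1) for _ in range(self.n)]
    def neighbors(self, cand):
        return [cand[:i] + [1 - cand[i]] + cand[i+1:] for i in range(self.n)]
    def fitness(self, cand):
        return sum(cand)

def hill_climb(model, iterations=100):
    """Model-independent search: uses only the interface above, so a new
    model type needs a new adapter but no change to the search code."""
    best = model.random_candidate()
    for _ in range(iterations):
        improved = max(model.neighbors(best), key=model.fitness)
        if model.fitness(improved) <= model.fitness(best):
            break
        best = improved
    return best

print(hill_climb(OneMaxModel(8)))  # converges to [1, 1, 1, 1, 1, 1, 1, 1]
```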

  3. Hydrogeologic Framework Model for the Saturated Zone Site Scale flow and Transport Model

    Energy Technology Data Exchange (ETDEWEB)

    T. Miller

    2004-11-15

    The purpose of this report is to document the 19-unit, hydrogeologic framework model (19-layer version, output of this report) (HFM-19) with regard to input data, modeling methods, assumptions, uncertainties, limitations, and validation of the model results in accordance with AP-SIII.10Q, Models. The HFM-19 is developed as a conceptual model of the geometric extent of the hydrogeologic units at Yucca Mountain and is intended specifically for use in the development of the "Saturated Zone Site-Scale Flow Model" (BSC 2004 [DIRS 170037]). Primary inputs to this model report include the GFM 3.1 (DTN: MO9901MWDGFM31.000 [DIRS 103769]), borehole lithologic logs, geologic maps, geologic cross sections, water level data, topographic information, and geophysical data as discussed in Section 4.1. Figure 1-1 shows the information flow among all of the saturated zone (SZ) reports and the relationship of this conceptual model in that flow. The HFM-19 is a three-dimensional (3-D) representation of the hydrogeologic units surrounding the location of the Yucca Mountain geologic repository for spent nuclear fuel and high-level radioactive waste. The HFM-19 represents the hydrogeologic setting for the Yucca Mountain area that covers about 1,350 km² and includes a saturated thickness of about 2.75 km. The boundaries of the conceptual model were primarily chosen to be coincident with grid cells in the Death Valley regional groundwater flow model (DTN: GS960808312144.003 [DIRS 105121]) such that the base of the site-scale SZ flow model is consistent with the base of the regional model (2,750 meters below a smoothed version of the potentiometric surface), encompasses the exploratory boreholes, and provides a framework over the area of interest for groundwater flow and radionuclide transport modeling. In depth, the model domain extends from land surface to the base of the regional groundwater flow model (D'Agnese et al. 1997 [DIRS 100131], p 2). For the site

  4. Baltes' SOC model of successful ageing as a potential framework for stroke rehabilitation.

    Science.gov (United States)

    Donnellan, C; O'Neill, D

    2014-01-01

    The aim of this paper is to explore approaches used in some stroke rehabilitation interventions and to examine the potential use of one of the life-span theories, the Baltes' model of selective optimisation with compensation (SOC), as a potential framework. Some of the key considerations for a stroke rehabilitation intervention framework are highlighted, including accommodating life-management changes post stroke, addressing alterations in self-regulation, acknowledging losses, and focusing on a person-centred approach for the transition from acute rehabilitation to the home or community setting. The Baltes' SOC model is then described in terms of these considerations for a stroke rehabilitation intervention framework. The Baltes' SOC model may offer further insights, including ageing considerations, for stroke rehabilitation approaches and interventions. It has the potential to facilitate some of the necessary complexities of adjustment required in stroke rehabilitation. However, further development in terms of empirical support is required before using the model as a framework to structure stroke rehabilitation interventions. Implications for Rehabilitation: There is a scarcity of theoretical frameworks that can facilitate and be inclusive of all the necessary complexities of adjustment required in stroke rehabilitation. In addition to motor recovery post stroke, rehabilitation intervention frameworks should be goal orientated; address self-regulatory processes; be person-centred and use a common language for goal planning, setting and attainment. The Baltes' SOC model is one such framework that may address some of the considerations for stroke rehabilitation, including motor recovery and other life-management aspects.

  5. SciDAC-Data, A Project to Enabling Data Driven Modeling of Exascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mubarak, M.; Ding, P.; Aliaga, L.; Tsaris, A.; Norman, A.; Lyon, A.; Ross, R.

    2016-10-10

    The SciDAC-Data project is a DOE-funded initiative to analyze and exploit two decades of information and analytics that have been collected by the Fermilab Data Center on the organization, movement, and consumption of High Energy Physics data. The project will analyze the analysis patterns and data organization that have been used by the NOvA, MicroBooNE, MINERvA and other experiments to develop realistic models of HEP analysis workflows and data processing. The SciDAC-Data project aims to provide both realistic input vectors and corresponding output data that can be used to optimize and validate simulations of HEP analysis. These simulations are designed to address questions of data handling, cache optimization and workflow structures that are the prerequisites for modern HEP analysis chains to be mapped and optimized to run on the next generation of leadership-class exascale computing facilities. We will address the use of the SciDAC-Data distributions, acquired from the Fermilab Data Center's analysis workflows and corresponding to around 71,000 HEP jobs, as the input to detailed queuing simulations that model the expected data consumption and caching behaviors of the work running in HPC environments. In particular, we describe in detail how the Sequential Access via Metadata (SAM) data handling system, in combination with the dCache/Enstore-based data archive facilities, has been analyzed to develop radically different models of the analysis of HEP data. We present how the simulation may be used to analyze the impact of design choices in archive facilities.

  6. Remote patient management: technology-enabled innovation and evolving business models for chronic disease care.

    Science.gov (United States)

    Coye, Molly Joel; Haselkorn, Ateret; DeMello, Steven

    2009-01-01

    Remote patient management (RPM) is a transformative technology that improves chronic care management while reducing net spending for chronic disease. Broadly deployed within the Veterans Health Administration and in many small trials elsewhere, RPM has been shown to support patient self-management, shift responsibilities to non-clinical providers, and reduce the use of emergency department and hospital services. Because transformative technologies offer major opportunities to advance national goals of improved quality and efficiency in health care, it is important to understand their evolution, the experiences of early adopters, and the business models that may support their deployment.

  7. Understanding Systematics in ZZ Ceti Model Fitting to Enable Differential Seismology

    Science.gov (United States)

    Fuchs, J. T.; Dunlap, B. H.; Clemens, J. C.; Meza, J. A.; Dennihy, E.; Koester, D.

    2017-03-01

    We are conducting a large spectroscopic survey of over 130 Southern ZZ Cetis with the Goodman Spectrograph on the SOAR Telescope. Because it employs a single instrument with high UV throughput, this survey will both improve the signal-to-noise of the sample of SDSS ZZ Cetis and provide a uniform dataset for model comparison. We are paying special attention to systematics in the spectral fitting and quantify three of those systematics here. We show that relative positions in the log g-Teff plane are consistent for these three systematics.

  8. Understanding Systematics in ZZ Ceti Model Fitting to Enable Differential Seismology

    CERN Document Server

    Fuchs, J T; Clemens, J C; Meza, J A; Dennihy, E; Koester, D

    2016-01-01

    We are conducting a large spectroscopic survey of over 130 Southern ZZ Cetis with the Goodman Spectrograph on the SOAR Telescope. Because it employs a single instrument with high UV throughput, this survey will both improve the signal-to-noise of the sample of SDSS ZZ Cetis and provide a uniform dataset for model comparison. We are paying special attention to systematics in the spectral fitting and quantify three of those systematics here. We show that relative positions in the $\log g$-$T_{\rm eff}$ plane are consistent for these three systematics.

  9. A model-based framework for the analysis of team communication in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Yun Hyung [Knowledge and Information Management Department, Korea Institute of Nuclear Safety, 19 Guseong-Dong, Yuseong-Gu, Daejeon 335-338 (Korea, Republic of)], E-mail: yhchung@kins.re.kr; Yoon, Wan Chul [Intelligent Service Engineering, Korea Advanced Institute of Science and Technology, 373-1 Guseong-Dong, Yuseong-Gu, Daejeon 305-701 (Korea, Republic of); Min, Daihwan [Department of MIS, Korea University, 208 Seochang-Dong, Jochiwon-Eup, Yongi-Gun, Choongnam 339-700 (Korea, Republic of)

    2009-06-15

    Advanced human-machine interfaces are rapidly changing the interaction between humans and systems, with the level of abstraction of the presented information, the human task characteristics, and the modes of communication all affected. To accommodate the changes in the human/system co-working environment, an extended communication analysis framework is needed that can describe and relate the tasks, verbal exchanges, and information interface. This paper proposes an extended analytic framework, referred to as the H-H-S (human-human-system) communication analysis framework, which can model the changes in team communication that are emerging in these new working environments. The stage-specific decision-making model and analysis tool of the proposed framework make the analysis of team communication easier by providing visual clues. The usefulness of the proposed framework is demonstrated with an in-depth comparison of the characteristics of communication in the conventional and advanced main control rooms of nuclear power plants.

  10. A framework for modeling interregional population distribution and economic growth.

    Science.gov (United States)

    Ledent, J; Gordon, P

    1981-01-01

    "An integrated model is proposed to capture economic and demographic interactions in a system of regions. This model links the interregional economic model of Isard (1960) and the interregional demographic model of Rogers (1975) via functions describing consumption and migration patterns. Migration rates are determined jointly with labor force participation rates and unemployment rates."

  11. Drawing-to-learn: a framework for using drawings to promote model-based reasoning in biology.

    Science.gov (United States)

    Quillin, Kim; Thomas, Stephen

    2015-03-02

    The drawing of visual representations is important for learners and scientists alike, such as the drawing of models to enable visual model-based reasoning. Yet few biology instructors recognize drawing as a teachable science process skill, as reflected by its absence in the Vision and Change report's Modeling and Simulation core competency. Further, the diffuse research on drawing can be difficult to access, synthesize, and apply to classroom practice. We have created a framework of drawing-to-learn that defines drawing, categorizes the reasons for using drawing in the biology classroom, and outlines a number of interventions that can help instructors create an environment conducive to student drawing in general and visual model-based reasoning in particular. The suggested interventions are organized to address elements of affect, visual literacy, and visual model-based reasoning, with specific examples cited for each. Further, a Blooming tool for drawing exercises is provided, as are suggestions to help instructors address possible barriers to implementing and assessing drawing-to-learn in the classroom. Overall, the goal of the framework is to increase the visibility of drawing as a skill in biology and to promote the research and implementation of best practices.

  12. Framework of Pattern Recognition Model Based on the Cognitive Psychology

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

According to the fundamental theory of the visual cognition mechanism and cognitive psychology, the visual pattern recognition model is introduced briefly. Three pattern recognition models, i.e. the template-based matching model, the prototype-based matching model, and the feature-based matching model, are built and discussed separately. In addition, the influence of object background information and of the visual focus point on the result of pattern recognition is discussed using the example of recognizing fuzzy letters and figures.
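The template-based matching model named in this record can be contrasted with the other two via a toy sketch. Everything below (the 3x3 binary letter grids, the cell-agreement scoring rule) is an illustrative assumption, not code or data from the paper:

```python
# Illustrative sketch of template-based matching (assumed toy data): a
# binary input grid is scored against stored letter templates, and the
# template with the highest cell-wise agreement wins.

TEMPLATES = {
    "T": [(1, 1, 1),
          (0, 1, 0),
          (0, 1, 0)],
    "L": [(1, 0, 0),
          (1, 0, 0),
          (1, 1, 1)],
}

def match_score(grid, template):
    """Fraction of cells on which the input and the template agree."""
    cells = [g == t for grow, trow in zip(grid, template)
             for g, t in zip(grow, trow)]
    return sum(cells) / len(cells)

def recognize(grid):
    """Return the best-matching letter and its agreement score."""
    return max(((letter, match_score(grid, t))
                for letter, t in TEMPLATES.items()),
               key=lambda pair: pair[1])

# A "fuzzy" T, as in the paper's letter-recognition example: one
# corner cell is flipped on, yet the T template still scores highest.
fuzzy_t = [(1, 1, 1),
           (0, 1, 0),
           (1, 1, 0)]
letter, score = recognize(fuzzy_t)
```

Prototype- and feature-based matching would replace the stored exemplars with an averaged prototype or a list of diagnostic features, but the scoring-and-argmax structure stays the same.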

  13. Modeling Progressive Damage Using Local Displacement Discontinuities Within the FEAMAC Multiscale Modeling Framework

    Science.gov (United States)

    Ranatunga, Vipul; Bednarcyk, Brett A.; Arnold, Steven M.

    2010-01-01

    A method for performing progressive damage modeling in composite materials and structures based on continuum level interfacial displacement discontinuities is presented. The proposed method enables the exponential evolution of the interfacial compliance, resulting in unloading of the tractions at the interface after delamination or failure occurs. In this paper, the proposed continuum displacement discontinuity model has been used to simulate failure within both isotropic and orthotropic materials efficiently and to explore the possibility of predicting the crack path, therein. Simulation results obtained from Mode-I and Mode-II fracture compare the proposed approach with the cohesive element approach and Virtual Crack Closure Techniques (VCCT) available within the ABAQUS (ABAQUS, Inc.) finite element software. Furthermore, an eccentrically loaded 3-point bend test has been simulated with the displacement discontinuity model, and the resulting crack path prediction has been compared with a prediction based on the extended finite element model (XFEM) approach.
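The core mechanism this record describes, exponential growth of the interfacial compliance so that tractions unload smoothly after failure, can be sketched numerically. The specific functional form and all parameter values below are assumptions for illustration, not taken from the paper:

```python
import math

# Hypothetical illustration (parameters and units are assumed, not from
# the paper): once an interface point fails, its compliance C grows
# exponentially with the opening displacement, so the traction
# t = delta / C unloads toward zero instead of dropping abruptly.

C0 = 1e-6       # initial (intact) interfacial compliance
DELTA_F = 1e-5  # assumed opening displacement at failure initiation
K = 4e5         # assumed exponential growth rate of the compliance

def compliance(delta):
    """Constant while intact, exponentially increasing after failure."""
    if delta <= DELTA_F:
        return C0
    return C0 * math.exp(K * (delta - DELTA_F))

def traction(delta):
    """Interfacial traction for a given opening displacement."""
    return delta / compliance(delta)

# Tractions rise linearly up to failure, then unload monotonically.
openings = [0.5e-5, 1.0e-5, 2.0e-5, 4.0e-5]
tractions = [traction(d) for d in openings]
```

The exponential in `compliance` is what produces the gradual post-failure unloading that the record contrasts with cohesive-element and VCCT approaches.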

  14. Poly(ethylene glycol) (PEG) in a Polyethylene (PE) Framework: A Simple Model for Simulation Studies of a Soluble Polymer in an Open Framework.

    Science.gov (United States)

    Xie, Liangxu; Chan, Kwong-Yu; Quirke, Nick

    2017-08-16

Canonical molecular dynamics simulations are performed to investigate the behavior of single-chain and multiple-chain poly(ethylene glycol) (PEG) contained within a cubic framework spanned by polyethylene (PE) chains. This simple model is the first of its kind to study the chemical physics of polymer-threaded organic frameworks, which are materials with potential applications in catalysis and separation processes. For a single-chain 9-mer, 14-mer, and 18-mer in a small framework, the PEG will interact strongly with the framework and assume a more linear chain geometry with an increased radius of gyration Rg compared to that in a large framework. The interaction between PEG and the framework decreases with increasing mesh size in both vacuum and water. In the limit of a framework with an infinitely large cavity (infinitely long linkers), PEG behavior approaches simulation results without a framework. The solvation of PEG is simulated by adding explicit TIP3P water molecules to a 6-chain PEG 14-mer aggregate confined in a framework. The 14-mer chains are readily solvated and leach out of a large 2.6 nm mesh framework. There are fewer water-PEG interactions in a small 1.0 nm mesh framework, as indicated by a smaller number of hydrogen bonds. The PEG aggregate, however, still partially dissolves but is retained within the 1.0 nm framework. The preliminary results illustrate the effectiveness of the simple model in studying polymer-threaded framework materials and in optimizing polymer or framework parameters for high performance.
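The radius of gyration Rg that this record uses to distinguish stretched from coiled chains is the root-mean-square distance of the chain's sites from their center of mass. A minimal sketch, using made-up coordinates rather than simulation data and assuming equal site masses:

```python
import math

# Minimal sketch (illustrative coordinates, not simulation data): Rg is
# the root-mean-square distance of the chain's sites from their common
# center of mass, assuming all sites have equal mass.

def radius_of_gyration(coords):
    n = len(coords)
    com = [sum(c[i] for c in coords) / n for i in range(3)]
    sq = sum((c[i] - com[i]) ** 2 for c in coords for i in range(3))
    return math.sqrt(sq / n)

# A fully extended (linear) 5-bead chain along x ...
linear = [(float(i), 0.0, 0.0) for i in range(5)]
# ... versus the same beads collapsed around the origin.
compact = [(0.1, 0.0, 0.0), (-0.1, 0.0, 0.0), (0.0, 0.1, 0.0),
           (0.0, -0.1, 0.0), (0.0, 0.0, 0.0)]

rg_linear = radius_of_gyration(linear)    # larger: chain is stretched out
rg_compact = radius_of_gyration(compact)  # smaller: chain is coiled up
```

A small framework that forces PEG into a more linear geometry would show up as a larger Rg, exactly the trend the record reports.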

  15. Modular degradable dendrimers enable small RNAs to extend survival in an aggressive liver cancer model

    Science.gov (United States)

    Zhou, Kejin; Nguyen, Liem H.; Miller, Jason B.; Yan, Yunfeng; Kos, Petra; Xiong, Hu; Li, Lin; Hao, Jing; Minnig, Jonathan T.; Siegwart, Daniel J.

    2016-01-01

    RNA-based cancer therapies are hindered by the lack of delivery vehicles that avoid cancer-induced organ dysfunction, which exacerbates carrier toxicity. We address this issue by reporting modular degradable dendrimers that achieve the required combination of high potency to tumors and low hepatotoxicity to provide a pronounced survival benefit in an aggressive genetic cancer model. More than 1,500 dendrimers were synthesized using sequential, orthogonal reactions where ester degradability was systematically integrated with chemically diversified cores, peripheries, and generations. A lead dendrimer, 5A2-SC8, provided a broad therapeutic window: identified as potent [EC50 75 mg/kg dendrimer repeated dosing). Delivery of let-7g microRNA (miRNA) mimic inhibited tumor growth and dramatically extended survival. Efficacy stemmed from a combination of a small RNA with the dendrimer’s own negligible toxicity, therefore illuminating an underappreciated complication in treating cancer with RNA-based drugs. PMID:26729861

  16. Enabling Dark Energy Science with Deep Generative Models of Galaxy Images

    CERN Document Server

    Ravanbakhsh, Siamak; Mandelbaum, Rachel; Schneider, Jeff; Poczos, Barnabas

    2016-01-01

    Understanding the nature of dark energy, the mysterious force driving the accelerated expansion of the Universe, is a major challenge of modern cosmology. The next generation of cosmological surveys, specifically designed to address this issue, rely on accurate measurements of the apparent shapes of distant galaxies. However, shape measurement methods suffer from various unavoidable biases and therefore will rely on a precise calibration to meet the accuracy requirements of the science analysis. This calibration process remains an open challenge as it requires large sets of high quality galaxy images. To this end, we study the application of deep conditional generative models in generating realistic galaxy images. In particular we consider variations on conditional variational autoencoder and introduce a new adversarial objective for training of conditional generative networks. Our results suggest a reliable alternative to the acquisition of expensive high quality observations for generating the calibration d...

  17. A transgenic quail model that enables dynamic imaging of amniote embryogenesis.

    Science.gov (United States)

    Huss, David; Benazeraf, Bertrand; Wallingford, Allison; Filla, Michael; Yang, Jennifer; Fraser, Scott E; Lansford, Rusty

    2015-08-15

    Embryogenesis is the coordinated assembly of tissues during morphogenesis through changes in individual cell behaviors and collective cell movements. Dynamic imaging, combined with quantitative analysis, is ideal for investigating fundamental questions in developmental biology involving cellular differentiation, growth control and morphogenesis. However, a reliable amniote model system that is amenable to the rigors of extended, high-resolution imaging and cell tracking has been lacking. To address this shortcoming, we produced a novel transgenic quail that ubiquitously expresses nuclear localized monomer cherry fluorescent protein (chFP). We characterize the expression pattern of chFP and provide concrete examples of how Tg(PGK1:H2B-chFP) quail can be used to dynamically image and analyze key morphogenetic events during embryonic stages X to 11.

  18. Modeling the marketing strategy-performance relationship : towards an hierarchical marketing performance framework

    NARCIS (Netherlands)

    Huizingh, Eelko K.R.E.; Zengerink, Evelien

    2001-01-01

Accurate measurement of marketing performance is an important topic for both marketing academics and marketing managers. Many researchers have recognized that marketing performance measurement should go beyond financial measurement. In this paper we propose a conceptual framework that models...

  20. The Regional Hydrologic Extremes Assessment System: A software framework for hydrologic modeling and data assimilation

    National Research Council Canada - National Science Library

    Konstantinos M Andreadis; Narendra Das; Dimitrios Stampoulis; Amor Ines; Joshua B Fisher; Stephanie Granger; Jessie Kawata; Eunjin Han; Ali Behrangi

    2017-01-01

    The Regional Hydrologic Extremes Assessment System (RHEAS) is a prototype software framework for hydrologic modeling and data assimilation that automates the deployment of water resources nowcasting and forecasting applications...