WorldWideScience

Sample records for framework implementing distributed

  1. Framework for implementation of maintenance management in distribution network service providers

    International Nuclear Information System (INIS)

    Gomez Fernandez, Juan Francisco; Crespo Marquez, Adolfo

    2009-01-01

Distribution network service providers (DNSP) are companies dealing with network infrastructure, such as distribution of gas, water, electricity or telecommunications, and they require the development of special maintenance management (MM) capabilities in order to satisfy the needs of their customers. In this sector, maintenance management information systems are essential to ensure control, gain knowledge and improve decision making. The aim of this paper is the study of the specific characteristics of maintenance in these types of companies. We investigate existing standards and best management practices with the scope of defining a suitable ad-hoc framework for the implementation of maintenance management. The conclusion of the work supports the proposition of a framework consisting of a process framework based on a structure of systems, integrated for continuous improvement of maintenance activities. The paper offers a very practical approach to the problem, as a result of more than 10 years of professional experience within this sector, and is especially focused on network maintenance.

  2. Algorithm and Implementation of Distributed ESN Using Spark Framework and Parallel PSO

    Directory of Open Access Journals (Sweden)

    Kehe Wu

    2017-04-01

The echo state network (ESN) employs a huge reservoir with sparsely and randomly connected internal nodes and trains only the output weights, which avoids the suboptimality, exploding and vanishing gradients, high complexity and other disadvantages faced by traditional recurrent neural network (RNN) training. In light of its outstanding adaptation to nonlinear dynamical systems, the ESN has been applied to a wide range of applications. However, in the era of Big Data, with an enormous amount of data being generated continuously every day, the data are often distributed and stored in real applications, and thus the centralized ESN training process becomes technologically unsuitable. In order to meet the requirements of real-world Big Data applications, in this study we propose an algorithm and its implementation for distributed ESN training. The algorithm is based on the parallel particle swarm optimization (P-PSO) technique and the implementation uses Spark, a well-known large-scale data processing framework. Four extremely large-scale datasets, including artificial benchmarks, real-world data and image data, are adopted to verify our framework on a scalable platform. Experimental results indicate that the proposed work performs well in terms of speed, accuracy and generalization capability.
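The reservoir-plus-trained-readout idea described in this abstract can be sketched in a few lines. The sketch below illustrates centralized ESN readout training (the baseline that the paper distributes), not the paper's P-PSO/Spark implementation; the reservoir sizes and the sine-prediction task are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
n_res = 200
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # spectral radius 0.9 (echo state property)

def reservoir_states(u):
    """Drive the fixed random reservoir with a scalar input sequence u."""
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for t, ut in enumerate(u):
        x = np.tanh(W_in[:, 0] * ut + W @ x)
        states[t] = x
    return states

# one-step-ahead prediction of a sine wave; only W_out is trained
u = np.sin(np.linspace(0, 8 * np.pi, 801))
S, y = reservoir_states(u[:-1]), u[1:]
S, y = S[100:], y[100:]                    # discard initial transient (washout)
lam = 1e-6                                 # ridge regularization
W_out = np.linalg.solve(S.T @ S + lam * np.eye(n_res), S.T @ y)
mse = float(np.mean((S @ W_out - y) ** 2))
```

Because the reservoir weights stay fixed, training reduces to one ridge regression, which is what makes the approach attractive to parallelize over distributed data.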

  3. Managing Risks in Distributed Software Projects: An Integrative Framework

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Boeg, Jesper

    2009-01-01

Software projects are increasingly geographically distributed with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. While risk management has been adopted with success to address other challenges within software development, there are currently no frameworks available for managing risks related to geographical distribution. On this background, we systematically review the literature on geographically distributed software projects. Based on the review, we synthesize what we know about risks and risk resolution techniques into an integrative framework for managing risks in distributed contexts. Subsequent implementation of a Web-based tool helped us refine the framework based on empirical evaluation of its practical usefulness. We conclude by discussing implications for both research and practice.

  4. Developing frameworks for protocol implementation

    NARCIS (Netherlands)

de Barros Barbosa, C.; Ferreira Pires, Luis

    1999-01-01

This paper presents a method to develop frameworks for protocol implementation. Frameworks are software structures developed for a specific application domain, which can be reused in the implementation of various concrete systems in this domain. The use of frameworks supports a protocol ...

  5. Distributed inter process communication framework of BES III DAQ online software

    International Nuclear Information System (INIS)

    Li Fei; Liu Yingjie; Ren Zhenyu; Wang Liang; Chinese Academy of Sciences, Beijing; Chen Mali; Zhu Kejun; Zhao Jingwei

    2006-01-01

The DAQ (Data Acquisition) system is an important part of BES III, the large-scale high-energy physics detector at BEPC. The inter-process communication (IPC) of online software in distributed environments is pivotal for the design and implementation of the DAQ system. This article introduces a distributed inter-process communication framework that is based on CORBA and used in the BES III DAQ online software. The article mainly presents the design and implementation of the IPC framework and applications based on it. (authors)

  6. Arcade: A Web-Java Based Framework for Distributed Computing

    Science.gov (United States)

    Chen, Zhikai; Maly, Kurt; Mehrotra, Piyush; Zubair, Mohammad; Bushnell, Dennis M. (Technical Monitor)

    2000-01-01

    Distributed heterogeneous environments are being increasingly used to execute a variety of large size simulations and computational problems. We are developing Arcade, a web-based environment to design, execute, monitor, and control distributed applications. These targeted applications consist of independent heterogeneous modules which can be executed on a distributed heterogeneous environment. In this paper we describe the overall design of the system and discuss the prototype implementation of the core functionalities required to support such a framework.

  7. A Distributed Framework for Real Time Path Planning in Practical Multi-agent Systems

    KAUST Repository

    Abdelkader, Mohamed; Jaleel, Hassan; Shamma, Jeff S.

    2017-01-01

We present a framework for distributed, energy efficient, and real time implementable algorithms for path planning in multi-agent systems. The proposed framework is presented in the context of a motivating example of capture the flag, which ...

  8. Using a Commercial Framework to Implement and Enhance the IEEE 1451.1 Standard

    OpenAIRE

    Viegas, Vítor; Pereira, José Dias; Girão, P. Silva

    2005-01-01

In 1999, the 1451.1 Std was published, defining a common object model and interface specification to develop open, multi-vendor distributed measurement and control systems. However, despite the well-known advantages of the model, few initiatives have implemented it. In this paper we describe the implementation of an NCAP – Network Capable Application Processor – on a well-known and well-proven infrastructure: the Microsoft .NET Framework. The choice of a commercial framework was part o...

  9. Design and Implementation of Distributed Crawler System Based on Scrapy

    Science.gov (United States)

    Fan, Yuhao

    2018-01-01

At present, some large-scale search engines at home and abroad only provide users with non-customized search services, and a single-machine web crawler cannot handle such demanding tasks. In this paper, through study of the original Scrapy framework, it is improved by combining Scrapy and Redis: a distributed crawler system based on the Scrapy framework is designed and implemented, and the Bloom Filter algorithm is applied to the dupefilter module to reduce memory consumption. The movie information captured from Douban is stored in MongoDB, so that the data can be processed and analyzed. The results show that the distributed crawler system based on the Scrapy framework is more efficient and stable than a single-machine web crawler system.
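The Bloom Filter mentioned in the abstract is a standard memory-saving trick for URL deduplication: a bit array plus k hash positions gives membership tests with no false negatives and a small, tunable false-positive rate. The sketch below is a generic illustration, not Scrapy-Redis's actual dupefilter class:

```python
import hashlib

class BloomFilter:
    """Probabilistic set: no false negatives, small false-positive rate."""

    def __init__(self, size_bits=1 << 20, n_hashes=7):
        self.size = size_bits
        self.n_hashes = n_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, item):
        # double hashing: derive k bit indices from two halves of one digest
        h = hashlib.sha256(item.encode()).digest()
        a = int.from_bytes(h[:8], "big")
        b = int.from_bytes(h[8:16], "big")
        return [(a + i * b) % self.size for i in range(self.n_hashes)]

    def add(self, item):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, item):
        return all(self.bits[p // 8] & (1 << (p % 8))
                   for p in self._positions(item))

seen = BloomFilter()
url = "https://movie.douban.com/subject/1292052/"
already = url in seen      # False before the URL is recorded
seen.add(url)
duplicate = url in seen    # True afterwards
```

In a distributed crawler the bit array (or an equivalent structure) would live in shared storage such as Redis so all workers consult the same seen-set.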

  10. The LabVIEW RADE framework distributed architecture

    International Nuclear Information System (INIS)

    Andreassen, O.O.; Kudryavtsev, D.; Raimondo, A.; Rijllart, A.; Shaipov, V.; Sorokoletov, R.

    2012-01-01

For accelerator GUI (Graphical User Interface) applications there is a need for a rapid development environment (RADE) to create expert tools or to prototype operator applications. Typically a variety of tools are being used, such as Matlab or Excel, but their scope is limited, either because of their low flexibility or limited integration into the accelerator infrastructure. In addition, having several tools obliges users to deal with different programming techniques and data structures. We have addressed these limitations by using LabVIEW, extending it with interfaces to C++ and Java. In this way it fulfills requirements of ease of use, flexibility and connectivity, which makes up what we refer to as the RADE framework. Recent application requirements could only be met by implementing a distributed architecture with multiple servers running multiple services. This brought the additional advantage of implementing redundant services, increasing availability and making updates transparent. We will present two applications requiring high availability. We also report on issues encountered with such a distributed architecture and how we have addressed them. The latest extension of the framework is to industrial equipment, with program templates and drivers for PLCs (Siemens and Schneider) and PXI with LabVIEW-Real Time. (authors)

  11. A novel optimal distribution system planning framework implementing distributed generation in a deregulated electricity market

    International Nuclear Information System (INIS)

    Porkar, S.; Poure, P.; Abbaspour-Tehrani-fard, A.; Saadate, S.

    2010-01-01

This paper introduces a new framework, comprising a mathematical model and a new software package interfacing two powerful software tools (MATLAB and GAMS), for obtaining the optimal distributed generation (DG) capacity sizing and siting investments, with the capability to simulate large distribution system planning. The proposed optimization model minimizes total system planning costs for DG investment, DG operation and maintenance, purchase of power by the distribution companies (DISCOs) from transmission companies (TRANSCOs), and system power losses. The proposed model provides not only the DG size and site but also the new market price. Three different cases depending on system conditions and three different scenarios depending on different planning alternatives and electricity market structures have been considered. They validate the economic and electrical benefits of introducing DG by solving the distribution system planning problem and by improving the power quality of the distribution system. DG installation increases the feeders' lifetime by reducing their loading and adds the benefit of using the existing distribution system for further load growth without the need for feeder upgrades. Moreover, by investing in DG, the DISCO can minimize its total planning cost and reduce its customers' bills. (author)
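The core trade-off in the abstract, siting DG so that feeder loading and I²R losses drop, can be illustrated on a toy radial feeder. The sketch below is a deliberately simplified loss-only placement by enumeration, not the paper's MATLAB/GAMS planning model; all numbers are illustrative:

```python
# 4-bus radial feeder fed from a substation (bus 0); all numbers illustrative
r = [0.1, 0.1, 0.1, 0.1]          # resistance of segment s (bus s -> bus s+1)
load = [1.0, 1.0, 1.0, 1.0]       # current drawn at buses 1..4

def losses(dg_bus, dg_out):
    """Total I^2 R feeder losses with one DG unit at dg_bus injecting dg_out."""
    total = 0.0
    for seg in range(4):          # segment seg carries all downstream load
        flow = sum(load[seg:]) - (dg_out if dg_bus > seg else 0.0)
        total += flow ** 2 * r[seg]
    return total

base = losses(0, 0.0)                                  # no DG installed
best = min(range(1, 5), key=lambda b: losses(b, 2.0))  # enumerate candidate sites
```

Even this toy version shows the qualitative effect the paper reports: a well-sited DG unit offsets upstream flows, so every segment between the substation and the DG runs cooler.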

  12. A novel optimal distribution system planning framework implementing distributed generation in a deregulated electricity market

    Energy Technology Data Exchange (ETDEWEB)

    Porkar, S. [Department of Electrical Engineering, Sharif University of Technology, Tehran (Iran); Groupe de Recherches en Electrotechnique et Electronique de Nancy, GREEN-UHP, Universite Henri Poincare de Nancy I, BP 239, 54506 Vandoeuvre les Nancy Cedex (France); Poure, P. [Laboratoire d' Instrumentation Electronique de Nancy, LIEN, EA 3440, Universite Henri Poincare de Nancy I, BP 239, 54506 Vandoeuvre les Nancy Cedex (France); Abbaspour-Tehrani-fard, A. [Department of Electrical Engineering, Sharif University of Technology, Tehran (Iran); Saadate, S. [Groupe de Recherches en Electrotechnique et Electronique de Nancy, GREEN-UHP, Universite Henri Poincare de Nancy I, BP 239, 54506 Vandoeuvre les Nancy Cedex (France)

    2010-07-15

This paper introduces a new framework, comprising a mathematical model and a new software package interfacing two powerful software tools (MATLAB and GAMS), for obtaining the optimal distributed generation (DG) capacity sizing and siting investments, with the capability to simulate large distribution system planning. The proposed optimization model minimizes total system planning costs for DG investment, DG operation and maintenance, purchase of power by the distribution companies (DISCOs) from transmission companies (TRANSCOs), and system power losses. The proposed model provides not only the DG size and site but also the new market price. Three different cases depending on system conditions and three different scenarios depending on different planning alternatives and electricity market structures have been considered. They validate the economic and electrical benefits of introducing DG by solving the distribution system planning problem and by improving the power quality of the distribution system. DG installation increases the feeders' lifetime by reducing their loading and adds the benefit of using the existing distribution system for further load growth without the need for feeder upgrades. Moreover, by investing in DG, the DISCO can minimize its total planning cost and reduce its customers' bills. (author)

  13. A Distributed Framework for Real Time Path Planning in Practical Multi-agent Systems

    KAUST Repository

    Abdelkader, Mohamed

    2017-10-19

We present a framework for distributed, energy efficient, and real time implementable algorithms for path planning in multi-agent systems. The proposed framework is presented in the context of a motivating example of capture the flag, which is an adversarial game played between two teams of autonomous agents called defenders and attackers. We start with the centralized formulation of the problem as a linear program because of its computational efficiency. Then we present an approximation framework in which each agent solves a local version of the centralized linear program by communicating with its neighbors only. The premise in this work is that for practical multi-agent systems, real time implementability of distributed algorithms is more crucial than global optimality. Thus, instead of verifying the proposed framework by performing offline simulations in MATLAB, we run extensive simulations in the robotic simulator V-REP, which includes a detailed dynamic model of quadrotors. Moreover, to create a realistic scenario, we allow a human operator to control the attacker quadrotor through a joystick in a single-attacker setup. These simulations confirm that the proposed framework is real time implementable and results in a performance that is comparable with the globally optimal solution under the considered scenarios.

  14. Distributed Framework for Prototyping of Observability Concepts in Smart Grids

    DEFF Research Database (Denmark)

    Prostejovsky, Alexander; Gehrke, Oliver; Kosek, Anna Magdalena

    2015-01-01

Development and testing of distributed monitoring, visualisation, and decision support concepts for future power systems require appropriate modelling tools that represent both the electrical side of the grid and the communication and logical relations between the acting entities. This work presents an Observability Framework for distributed data acquisition and knowledge inference that aims to facilitate the development of these distributed concepts. They are realised as applications that run within the framework and are able to access information on the grid topology and states via an abstract information model. Data is acquired dynamically over low-level data interfaces that allow for easy integration within heterogeneous environments. A Multi-Agent System platform was chosen for implementation, where agents represent the different electrical and logical grid elements...

  15. Implementation of Grid-computing Framework for Simulation in Multi-scale Structural Analysis

    Directory of Open Access Journals (Sweden)

    Data Iranata

    2010-05-01

A new grid-computing framework for simulation in multi-scale structural analysis is presented. Two levels of parallel processing are involved in this framework: multiple local distributed computing environments connected by a local network to form a grid-based cluster-to-cluster distributed computing environment. To successfully perform the simulation, a large-scale structural system task is decomposed into the simulations of a simplified global model and several detailed component models using various scales. These correlated multi-scale structural system tasks are distributed among clusters, connected together in a multi-level hierarchy, and then coordinated over the internet. The software framework for supporting the multi-scale structural simulation approach is also presented. The program architecture design allows the integration of several multi-scale models as clients and servers under a single platform. To check its feasibility, a prototype software system has been designed and implemented to realize the proposed concept. The simulation results show that the software framework can improve the speedup performance of the structural analysis. Based on this result, the proposed grid-computing framework is suitable for performing the simulation of multi-scale structural analysis.

  16. Design Of Real-Time Implementable Distributed Suboptimal Control: An LQR Perspective

    KAUST Repository

    Jaleel, Hassan

    2017-09-29

    We propose a framework for multiagent systems in which the agents compute their control actions in real time, based on local information only. The novelty of the proposed framework is that the process of computing a suboptimal control action is divided into two phases: an offline phase and an online phase. In the offline phase, an approximate problem is formulated with a cost function that is close to the optimal cost in some sense and is distributed, i.e., the costs of non-neighboring nodes are not coupled. This phase is centralized and is completed before the deployment of the system. In the online phase, the approximate problem is solved in real time by implementing any efficient distributed optimization algorithm. To quantify the performance loss, we derive upper bounds for the maximum error between the optimal performance and the performance under the proposed framework. Finally, the proposed framework is applied to an example setup in which a team of mobile nodes is assigned the task of establishing a communication link between two base stations with minimum energy consumption. We show through simulations that the performance under the proposed framework is close to the optimal performance and the suboptimal policy can be efficiently implemented online.
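The online phase's "any efficient distributed optimization algorithm" can be pictured with distributed gradient descent on a cost that is separable over a graph, so each agent needs only its neighbors' values. This is a generic sketch of that idea, not the authors' algorithm; the graph and cost are invented for illustration:

```python
import numpy as np

# path graph of 4 agents; agent i communicates only with its neighbors
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
a = np.array([1.0, 2.0, 4.0, 8.0])        # each agent's local reference value

# distributed cost: sum_i (x_i - a_i)^2  +  sum_{edges (i,j)} (x_i - x_j)^2
x = np.zeros(4)
step = 0.1
for _ in range(500):
    grad = np.array([
        2 * (x[i] - a[i]) + 2 * sum(x[i] - x[j] for j in neighbors[i])
        for i in range(4)
    ])                                    # each entry uses local data only
    x = x - step * grad

# centralized solution of the same problem: (I + L) x* = a, L = graph Laplacian
L = np.array([[1., -1, 0, 0], [-1, 2, -1, 0], [0, -1, 2, -1], [0, 0, -1, 1]])
x_star = np.linalg.solve(np.eye(4) + L, a)
```

Because non-neighboring agents' costs are not coupled, each gradient entry is computable from local and neighbor information, which is exactly the structure the offline phase in the abstract is designed to produce.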

  17. Distributed security framework for modern workforce

    Energy Technology Data Exchange (ETDEWEB)

    Balatsky, G.; Scherer, C. P., E-mail: gbalatsky@lanl.gov, E-mail: scherer@lanl.gov [Los Alamos National Laboratory, Los Alamos, NM (United States)

    2014-07-01

    Safe and sustainable nuclear power production depends on strict adherence to nuclear security as a necessary prerequisite for nuclear power. This paper considers the current challenges for nuclear security, and proposes a conceptual framework to address those challenges. We identify several emerging factors that affect nuclear security: 1. Relatively high turnover rates in the nuclear workforce compared to the earlier years of the nuclear industry, when nuclear workers were more likely to have secure employment, a lifelong career at one company, and retirement on a pension plan. 2. Vulnerabilities stemming from the ubiquitous presence of modern electronics and their patterns of use by the younger workforce. 3. Modern management practices, including outsourcing and short-term contracting (which relates to number 1 above). In such a dynamic and complex environment, nuclear security personnel alone cannot effectively guarantee adequate security. We propose that one solution to this emerging situation is a distributed security model in which the components of nuclear security become the responsibility of each and every worker at a nuclear facility. To implement this model, there needs to be a refurbishment of current workforce training and mentoring practices. The paper will present an example of distributed security framework model, and how it may look in practice. (author)

  18. Distributed security framework for modern workforce

    International Nuclear Information System (INIS)

    Balatsky, G.; Scherer, C. P.

    2014-01-01

    Safe and sustainable nuclear power production depends on strict adherence to nuclear security as a necessary prerequisite for nuclear power. This paper considers the current challenges for nuclear security, and proposes a conceptual framework to address those challenges. We identify several emerging factors that affect nuclear security: 1. Relatively high turnover rates in the nuclear workforce compared to the earlier years of the nuclear industry, when nuclear workers were more likely to have secure employment, a lifelong career at one company, and retirement on a pension plan. 2. Vulnerabilities stemming from the ubiquitous presence of modern electronics and their patterns of use by the younger workforce. 3. Modern management practices, including outsourcing and short-term contracting (which relates to number 1 above). In such a dynamic and complex environment, nuclear security personnel alone cannot effectively guarantee adequate security. We propose that one solution to this emerging situation is a distributed security model in which the components of nuclear security become the responsibility of each and every worker at a nuclear facility. To implement this model, there needs to be a refurbishment of current workforce training and mentoring practices. The paper will present an example of distributed security framework model, and how it may look in practice. (author)

  19. Vertical Load Distribution for Cloud Computing via Multiple Implementation Options

    Science.gov (United States)

    Phan, Thomas; Li, Wen-Syan

Cloud computing looks to deliver software as a provisioned service to end users, but the underlying infrastructure must be sufficiently scalable and robust. In our work, we focus on large-scale enterprise cloud systems and examine how enterprises may use a service-oriented architecture (SOA) to provide a streamlined interface to their business processes. To scale up the business processes, each SOA tier usually deploys multiple servers for load distribution and fault tolerance, a scenario which we term horizontal load distribution. One limitation of this approach is that load cannot be distributed further when all servers in the same tier are loaded. In complex multi-tiered SOA systems, a single business process may actually be implemented by multiple different computation pathways among the tiers, each with different components, in order to provide resilience and scalability. Such multiple implementation options give opportunities for vertical load distribution across tiers. In this chapter, we look at a novel request routing framework for SOA-based enterprise computing with multiple implementation options that takes into account the options of both horizontal and vertical load distribution.
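The horizontal/vertical distinction can be made concrete with a toy router: horizontal distribution picks the least-loaded server within a tier, while vertical distribution switches to an alternative implementation pathway when a whole tier is saturated. A sketch with invented tier names and load numbers, not the chapter's actual framework:

```python
# server loads per tier, in [0, 1]; names and numbers are invented
tiers = {
    "appA":  {"a1": 0.9, "a2": 0.8},   # this tier is near saturation
    "db":    {"d1": 0.3},
    "appB":  {"b1": 0.2, "b2": 0.5},
    "cache": {"c1": 0.1},
}
# two computation pathways implementing the same business process
pathways = [["appA", "db"], ["appB", "cache"]]

def route(pathways, tiers):
    """Vertical choice: pick the pathway whose most-loaded tier is lightest.
    Horizontal choice: within each tier, pick the least-loaded server."""
    def tier_load(t):
        return min(tiers[t].values())          # best server a tier can offer
    best = min(pathways, key=lambda p: max(tier_load(t) for t in p))
    return [(t, min(tiers[t], key=tiers[t].get)) for t in best]

assignment = route(pathways, tiers)
```

Here the first pathway's bottleneck tier is "appA" at 0.8 load, so the router shifts the request vertically to the second pathway and then horizontally to the lightest servers in it.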

  20. THE LABVIEW RADE FRAMEWORK DISTRIBUTED ARCHITECTURE

    CERN Document Server

    Andreassen, O O; Raimondo, A; Rijllart, A; Shaipov, V; Sorokoletov, R

    2011-01-01

    For accelerator GUI applications there is a need for a rapid development environment to create expert tools or to prototype operator applications. Typically a variety of tools are being used, such as Matlab or Excel, but their scope is limited, either because of their low flexibility or limited integration into the accelerator infrastructure. In addition, having several tools obliges users to deal with different programming techniques and data structures. We have addressed these limitations by using LabVIEW, extending it with interfaces to C++ and Java. In this way it fulfils requirements of ease of use, flexibility and connectivity, which makes up what we refer to as the RADE framework. Recent application requirements could only be met by implementing a distributed architecture with multiple servers running multiple services. This brought us the additional advantage to implement redundant services, to increase the availability and to make transparent updates. We will present two applications requiring high a...

  1. A Review of Telehealth Service Implementation Frameworks

    Directory of Open Access Journals (Sweden)

    Liezl Van Dyk

    2014-01-01

Despite the potential of telehealth services to increase the quality and accessibility of healthcare, the success rate of such services has been disappointing. The purpose of this paper is to find and compare existing frameworks for the implementation of telehealth services that can contribute to the success rate of future endeavors. After a thorough discussion of these frameworks, this paper outlines the development methodologies in terms of theoretical background, methodology and validation. Finally, the common themes and formats are identified for consideration in future implementation. It was confirmed that a holistic implementation approach is needed, which includes technology, organizational structures, change management, economic feasibility, societal impacts, perceptions, user-friendliness, evaluation and evidence, legislation, policy and governance. Furthermore, there is some scope for scientifically rigorous framework development and validation approaches.

  2. Framework for Leading Next Generation Science Standards Implementation

    Science.gov (United States)

    Stiles, Katherine; Mundry, Susan; DiRanna, Kathy

    2017-01-01

    In response to the need to develop leaders to guide the implementation of the Next Generation Science Standards (NGSS), the Carnegie Corporation of New York provided funding to WestEd to develop a framework that defines the leadership knowledge and actions needed to effectively implement the NGSS. The development of the framework entailed…

  3. Assessing citation networks for dissemination and implementation research frameworks.

    Science.gov (United States)

    Skolarus, Ted A; Lehmann, Todd; Tabak, Rachel G; Harris, Jenine; Lecy, Jesse; Sales, Anne E

    2017-07-28

    A recent review of frameworks used in dissemination and implementation (D&I) science described 61 judged to be related either to dissemination, implementation, or both. The current use of these frameworks and their contributions to D&I science more broadly has yet to be reviewed. For these reasons, our objective was to determine the role of these frameworks in the development of D&I science. We used the Web of Science™ Core Collection and Google Scholar™ to conduct a citation network analysis for the key frameworks described in a recent systematic review of D&I frameworks (Am J Prev Med 43(3):337-350, 2012). From January to August 2016, we collected framework data including title, reference, publication year, and citations per year and conducted descriptive and main path network analyses to identify those most important in holding the current citation network for D&I frameworks together. The source article contained 119 cited references, with 50 published articles and 11 documents identified as a primary framework reference. The average citations per year for the 61 frameworks reviewed ranged from 0.7 to 103.3 among articles published from 1985 to 2012. Citation rates from all frameworks are reported with citation network analyses for the framework review article and ten highly cited framework seed articles. The main path for the D&I framework citation network is presented. We examined citation rates and the main paths through the citation network to delineate the current landscape of D&I framework research, and opportunities for advancing framework development and use. Dissemination and implementation researchers and practitioners may consider frequency of framework citation and our network findings when planning implementation efforts to build upon this foundation and promote systematic advances in D&I science.

  4. Maintenance Management in Network Utilities Framework and Practical Implementation

    CERN Document Server

    Gómez Fernández, Juan F

    2012-01-01

In order to satisfy the needs of their customers, network utilities require specially developed maintenance management capabilities. Maintenance management information systems are essential to ensure control, gain knowledge and improve decision making in companies dealing with network infrastructure, such as distribution of gas, water, electricity and telecommunications. Maintenance Management in Network Utilities studies the specific characteristics of maintenance management in this sector to offer a practical approach to defining and implementing the best management practices and suitable frameworks. Divided into three major sections, Maintenance Management in Network Utilities defines a series of stages which can be followed to manage maintenance frameworks properly. Different case studies provide detailed descriptions which illustrate the experience in real company situations. An introduction to the concepts is followed by main sections including: • A Literature Review: covering the basic concepts an...

  5. A Run-Time Verification Framework for Smart Grid Applications Implemented on Simulation Frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Ciraci, Selim; Sozer, Hasan; Tekinerdogan, Bedir

    2013-05-18

Smart grid applications are implemented and tested with simulation frameworks, as the developers usually do not have access to large sensor networks to be used as a test bed. The developers are forced to map the implementation onto these frameworks, which results in a deviation between the architecture and the code. In turn, this deviation makes it hard to verify behavioral constraints that are described at the architectural level. We have developed the ConArch toolset to support the automated verification of architecture-level behavioral constraints. A key feature of ConArch is programmable mapping from the architecture to the implementation. Here, developers implement queries to identify the points in the target program that correspond to architectural interactions. ConArch generates run-time observers that monitor the flow of execution between these points and verify whether this flow conforms to the behavioral constraints. We illustrate how the programmable mappings can be exploited for verifying behavioral constraints of a smart grid application that is implemented with two simulation frameworks.
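Run-time observers of the kind this abstract describes can be pictured as instrumentation that records control flow between mapped program points and checks it against allowed architecture-level interactions. The decorator below is a much-simplified generic sketch of that idea; the function names and the allowed set are invented, not ConArch's API:

```python
# allowed architecture-level flows between mapped program points (invented)
allowed = {("read_sensor", "estimate"), ("estimate", "actuate")}
trace = []

def observed(fn):
    """Run-time observer: record each call and check the transition."""
    def wrapper(*args, **kwargs):
        if trace and (trace[-1], fn.__name__) not in allowed:
            raise RuntimeError(f"illegal flow: {trace[-1]} -> {fn.__name__}")
        trace.append(fn.__name__)
        return fn(*args, **kwargs)
    return wrapper

@observed
def read_sensor():
    return 1.0

@observed
def estimate(x):
    return 2 * x

@observed
def actuate(u):
    return u

actuate(estimate(read_sensor()))   # conforms to the allowed flows
```

A real toolset would generate the instrumentation points from the developer-written queries rather than hand-decorating functions, but the monitoring principle is the same.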

  6. Design and implementation of a standard framework for KSTAR control system

    International Nuclear Information System (INIS)

    Lee, Woongryol; Park, Mikyung; Lee, Taegu; Lee, Sangil; Yun, Sangwon; Park, Jinseop; Park, Kaprai

    2014-01-01

Highlights: • We performed a standardization of the control system in KSTAR. • An EPICS-based software framework was developed for the realization of various control systems. • The applicability of the framework has been widened from a simple command dispatcher to real-time applications. • Our framework supports the implementation of embedded IOCs on FPGA boards. - Abstract: Standardization of the control system is an important issue in KSTAR, which is composed of various heterogeneous systems. Diverse control systems in KSTAR have been adopting new application software since 2010. Development of this software was launched for easy implementation of a data acquisition system, but it has been extended into a Standard Framework (SFW) for control systems in KSTAR. It is composed of a single library, database, templates, and descriptor files. SFW-based controllers share common features: a non-blocking control command method using threads; an internal sequence handler that allows synchronization with KSTAR experiments; and a ring buffer pool mechanism for handling streaming input data. Recently, there have been two important functional improvements in the framework. A processor-embedded FPGA was proposed as a standard hardware platform for specific applications, and these are likewise operated by SFW-based embedded applications. This approach gives a single-board system the ability to perform low-level distributed control under the EPICS environment. We also developed a real-time monitoring system as a real-time network inspection tool in the 2012 campaign using the SFW.
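The ring buffer mentioned for streaming input data is a classic structure: a fixed-capacity buffer where new samples overwrite the oldest once full, so a slow consumer never blocks the acquisition path. A generic sketch, not the KSTAR SFW code:

```python
class RingBuffer:
    """Fixed-capacity buffer; once full, new samples overwrite the oldest."""

    def __init__(self, capacity):
        self.buf = [None] * capacity
        self.capacity = capacity
        self.head = 0          # index of the oldest stored sample
        self.count = 0         # number of samples currently stored

    def push(self, item):
        self.buf[(self.head + self.count) % self.capacity] = item
        if self.count < self.capacity:
            self.count += 1
        else:                  # buffer full: advance past the overwritten sample
            self.head = (self.head + 1) % self.capacity

    def snapshot(self):
        """Stored samples in arrival order, oldest first."""
        return [self.buf[(self.head + i) % self.capacity]
                for i in range(self.count)]

rb = RingBuffer(3)
for sample in [1, 2, 3, 4, 5]:
    rb.push(sample)
result = rb.snapshot()    # oldest samples (1, 2) have been overwritten
```

In a control-system context a pool of such buffers, one per input stream, lets acquisition run at a fixed rate while consumers read consistent snapshots.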

  7. A framework for implementing a Distributed Intrusion Detection System (DIDS) with interoperability and information analysis

    OpenAIRE

    Davicino, Pablo; Echaiz, Javier; Ardenghi, Jorge Raúl

    2011-01-01

    Computer Intrusion Detection Systems (IDS) are primarily designed to protect the availability, confidentiality and integrity of critical information infrastructures. A Distributed IDS (DIDS) consists of several IDS spread over one or more large networks, all of which communicate with each other, with a central server or with a cluster of servers, to facilitate advanced network monitoring. In a distributed environment, a DIDS is implemented using cooperative intelligent sensors distributed across the network(s). ...

  8. A penalized framework for distributed lag non-linear models.

    Science.gov (United States)

    Gasparrini, Antonio; Scheipl, Fabian; Armstrong, Ben; Kenward, Michael G

    2017-09-01

    Distributed lag non-linear models (DLNMs) are a modelling tool for describing potentially non-linear and delayed dependencies. Here, we illustrate an extension of the DLNM framework through the use of penalized splines within generalized additive models (GAMs). This extension offers built-in model selection procedures and the possibility of accommodating assumptions on the shape of the lag structure through specific penalties. In addition, this framework includes, as special cases, simpler models previously proposed for linear relationships (DLMs). Alternative versions of penalized DLNMs are compared with each other and with the standard unpenalized version in a simulation study. Results show that this penalized extension to the DLNM class provides greater flexibility and improved inferential properties. The framework exploits recent theoretical developments of GAMs and is implemented using efficient routines within freely available software. Real-data applications are illustrated through two reproducible examples in time series and survival analysis. © 2017 The Authors. Biometrics published by Wiley Periodicals, Inc. on behalf of the International Biometric Society.
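    The unpenalized linear DLM that the framework includes as a special case can be sketched as follows (a hedged, dependency-free Python illustration; the authors' implementation uses penalized splines within GAMs, not this simplified form, and the series and lag weights below are invented): the core object is a matrix of lagged copies of the exposure series, and the predicted effect at time t is a weighted sum over current and past exposure.

```python
# Minimal linear distributed-lag sketch: build the lagged design matrix,
# then compute the cumulative effect of current and past exposure.

def lag_matrix(x, max_lag):
    """Row t holds [x[t], x[t-1], ..., x[t-max_lag]]; the first rows drop out."""
    rows = []
    for t in range(max_lag, len(x)):
        rows.append([x[t - l] for l in range(max_lag + 1)])
    return rows

def dlm_effect(x, weights):
    """Linear DLM prediction: weighted sum over the lag dimension."""
    max_lag = len(weights) - 1
    return [sum(w * v for w, v in zip(weights, row))
            for row in lag_matrix(x, max_lag)]

series = [1.0, 2.0, 0.0, 3.0, 1.0]
effects = dlm_effect(series, weights=[0.5, 0.25])   # lag-0 and lag-1 weights
```

The penalized DLNM generalizes this in two directions: the lag weights become a smooth function estimated under a penalty, and the dependence on exposure itself may be non-linear.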

  9. Adapting the Consolidated Framework for Implementation Research to Create Organizational Readiness and Implementation Tools for Project ECHO.

    Science.gov (United States)

    Serhal, Eva; Arena, Amanda; Sockalingam, Sanjeev; Mohri, Linda; Crawford, Allison

    2018-03-01

    The Project Extension for Community Healthcare Outcomes (ECHO) model expands primary care provider (PCP) capacity to manage complex diseases by sharing knowledge, disseminating best practices, and building a community of practice. The model has expanded rapidly, with over 140 ECHO projects currently established globally. We have used validated implementation frameworks, such as Damschroder's (2009) Consolidated Framework for Implementation Research (CFIR) and Proctor's (2011) taxonomy of implementation outcomes, combined with implementation experience, to (1) create a set of questions to assess organizational readiness and suitability of the ECHO model and (2) provide those who have determined ECHO is the correct model with a checklist to support successful implementation. A set of considerations was created, which adapted and consolidated CFIR constructs into ECHO-specific organizational readiness questions, as well as a process guide for implementation. Each consideration was mapped onto Proctor's (2011) implementation outcomes, and questions relating to the constructs were developed and reviewed for clarity. The Preimplementation list included 20 questions; most fall within Proctor's (2011) implementation outcome domains of "Appropriateness" and "Acceptability." The Process Checklist is a 26-item checklist to help launch an ECHO project; items map onto the constructs of Planning, Engaging, Executing, Reflecting, and Evaluating. Given that fidelity to the ECHO model is associated with robust outcomes, effective implementation is critical. These tools will enable programs to work through key considerations to implement a successful Project ECHO. Next steps will include validation with a diverse sample of ECHO projects. This is an open-access article distributed under the terms of the Creative Commons Attribution-Non Commercial-No Derivatives License 4.0 (CC BY-NC-ND), where it is permissible to download and share the work provided it is properly cited.

  10. Mobile Autonomous Sensing Unit (MASU): A Framework That Supports Distributed Pervasive Data Sensing

    Directory of Open Access Journals (Sweden)

    Esunly Medina

    2016-07-01

    Full Text Available Pervasive data sensing is a major issue that cuts across various research areas and application domains. It allows identifying people’s behaviour and patterns without overwhelming the monitored persons. Although there are many pervasive data sensing applications, they are typically focused on addressing specific problems in a single application domain, making them difficult to generalize or reuse. On the other hand, the platforms for supporting pervasive data sensing impose restrictions on the devices and operational environments that make them unsuitable for monitoring loosely coupled or fully distributed work. To help address this challenge, this paper presents a framework that supports distributed pervasive data sensing in a generic way. Developers can use this framework to facilitate the implementation of their applications, thus reducing the complexity and effort of such an activity. The framework was evaluated using simulations and also through an empirical test, and the obtained results indicate that it is useful for supporting this kind of sensing in loosely coupled or fully distributed work scenarios.

  11. The SOPHY Framework

    DEFF Research Database (Denmark)

    Laursen, Karl Kaas; Pedersen, Martin Fejrskov; Bendtsen, Jan Dimon

    The goal of the Sophy framework (Simulation, Observation and Planning in Hybrid Systems) is to implement a multi-level framework for description, simulation, observation, fault detection and recovery, diagnosis and autonomous planning in distributed embedded hybrid systems. A Java-based distributed...

  12. The SOPHY framework

    DEFF Research Database (Denmark)

    Laursen, Karl Kaas; Pedersen, M. F.; Bendtsen, Jan Dimon

    2005-01-01

    The goal of the Sophy framework (Simulation, Observation and Planning in Hybrid Systems) is to implement a multi-level framework for description, simulation, observation, fault detection and recovery, diagnosis and autonomous planning in distributed embedded hybrid systems. A Java-based distributed...

  13. Flexible investment under uncertainty in smart distribution networks with demand side response: Assessment framework and practical implementation

    International Nuclear Information System (INIS)

    Schachter, Jonathan A.; Mancarella, Pierluigi; Moriarty, John; Shaw, Rita

    2016-01-01

    Classical deterministic models applied to investment valuation in distribution networks may not be adequate for a range of real-world decision-making scenarios, as they effectively ignore the uncertainty in the most important variables driving network planning (e.g., load growth). As greater uncertainty is expected from growing distributed energy resources in distribution networks, there is an increasing risk of investing in too much or too little network capacity, and hence of stranding or inefficiently using network assets; these costs are then passed on to the end-user. An alternative emerging solution in the context of smart grid development is to release untapped network capacity through Demand-Side Response (DSR). However, to date there is no approach able to quantify the value of ‘smart’ DSR solutions against ‘conventional’ asset-heavy investments. On these premises, this paper presents a general real options framework and a novel probabilistic tool for the economic assessment of DSR for smart distribution network planning under uncertainty, which allows the modeling and comparison of multiple investment strategies, including DSR and capacity reinforcements, based on different cost and risk metrics. In particular, the model provides an explicit quantification of the economic value of DSR against alternative investment strategies. Through sensitivity analysis, it can indicate the maximum price payable for a DSR service such that DSR remains economically optimal against these alternatives. The proposed model thus provides regulators with clear insights for overseeing DSR contractual arrangements. Further, it highlights that differences exist between the economic perspectives of the regulated DNO business and of customers. Our proposed model is therefore capable of highlighting instances where a particular investment strategy is favorable to the DNO but not to its customers, or vice versa, and thus aspects of the regulatory framework which may
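    The core comparison such a tool performs can be caricatured in a few lines (a toy Monte Carlo sketch with entirely hypothetical numbers, not the paper's real options model): pay a reinforcement cost up front, or pay for DSR only in years when uncertain load growth pushes peak demand above capacity, and compare the expected totals.

```python
# Toy comparison of "reinforce now" vs "buy DSR when needed" under
# uncertain load growth. All costs, growth rates, and capacities are
# invented for illustration.
import random

def simulate_costs(years=10, capacity=100.0, growth_mu=0.03, growth_sd=0.02,
                   reinforce_cost=500.0, dsr_price_per_mw=8.0,
                   trials=2000, seed=42):
    rng = random.Random(seed)
    dsr_costs = []
    for _ in range(trials):
        load, cost = 95.0, 0.0
        for _ in range(years):
            load *= 1.0 + rng.gauss(growth_mu, growth_sd)   # uncertain growth
            cost += max(0.0, load - capacity) * dsr_price_per_mw
        dsr_costs.append(cost)
    expected_dsr = sum(dsr_costs) / trials
    return {"reinforce": reinforce_cost, "dsr": expected_dsr}

costs = simulate_costs()
best = min(costs, key=costs.get)   # strategy with the lower expected cost
```

A real options treatment adds what this sketch omits: the option value of deferring the reinforcement decision, discounting, and risk metrics beyond the expected cost.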

  14. Quality Implementation in Transition: A Framework for Specialists and Administrators.

    Science.gov (United States)

    Wald, Judy L.; Repetto, Jeanne B.

    1995-01-01

    Quality Implementation in Transition is a framework designed to guide transition specialists and administrators in the implementation of total quality management. The framework uses the tenets set forth by W. Edwards Deming and is intended to help professionals facilitate change within transition programs. (Author/JOW)

  15. Architectural notes: a framework for distributed systems development

    NARCIS (Netherlands)

    Pires, L.F.; Ferreira Pires, Luis

    1994-01-01

    This thesis develops a framework of methods and techniques for distributed systems development. This framework consists of two related domains in which design concepts for distributed systems are defined: the entity domain and the behaviour domain. In the entity domain we consider structures of

  16. The Consolidated Framework for Implementation Research (CFIR): a useful theoretical framework for guiding and evaluating a guideline implementation process in a hospital-based nursing practice.

    Science.gov (United States)

    Breimaier, Helga E; Heckemann, Birgit; Halfens, Ruud J G; Lohrmann, Christa

    2015-01-01

    Implementing clinical practice guidelines (CPGs) in healthcare settings is a complex intervention involving both independent and interdependent components. Although the Consolidated Framework for Implementation Research (CFIR) had never been evaluated in a practical context, it appeared to be a suitable theoretical framework to guide an implementation process. The aim of this study was to evaluate the comprehensiveness, applicability and usefulness of the CFIR in the implementation of a fall-prevention CPG in nursing practice to improve patient care in an Austrian university teaching hospital setting. The evaluation of the CFIR was based on (1) team-meeting minutes, (2) the main investigator's research diary, containing a record of a before-and-after, mixed-methods study design embedded in a participatory action research (PAR) approach for guideline implementation, and (3) an analysis of qualitative and quantitative data collected from graduate and assistant nurses in two Austrian university teaching hospital departments. The CFIR was used to organise data at and across time points and assess their influence on the implementation process, resulting in implementation and service outcomes. Overall, the CFIR proved to be a comprehensive framework for the implementation of a guideline into hospital-based nursing practice. However, the CFIR did not account for some crucial factors during the planning phase of an implementation process, such as consideration of stakeholder aims and wishes/needs when implementing an innovation, pre-established measures related to the intended innovation and pre-established strategies for implementing an innovation. For the CFIR constructs reflecting & evaluating and engaging, a more specific definition is recommended. The framework and its supplements could easily be used by researchers, and their scope was appropriate for the complexity of a prospective CPG-implementation project. The CFIR facilitated qualitative data

  17. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform model calibration, uncertainty propagation, Chi-square linearity tests, and sensitivity analysis for both linear and nonlinear problems. PAPIRUS was developed by implementing multiple packages of methodologies and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in PAPIRUS with multiple computing resources and proper communication between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description of PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis implemented in the toolkit, with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.
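    The parallel pattern such a toolkit relies on, evaluating one simulation run per computing resource and pooling the results to characterize the output distribution, can be sketched as follows (a schematic Python stand-in; `model` replaces a real engineering-code run, and all names and numbers are invented):

```python
# Schematic uncertainty propagation: evaluate a model over many parameter
# samples in parallel, then summarize the resulting output distribution.
from concurrent.futures import ThreadPoolExecutor

def model(params):
    """Placeholder for one engineering-code run."""
    a, b = params
    return a * a + b

def propagate(samples, workers=4):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        outputs = list(pool.map(model, samples))   # order-preserving
    mean = sum(outputs) / len(outputs)
    var = sum((y - mean) ** 2 for y in outputs) / len(outputs)
    return outputs, mean, var

samples = [(a, b) for a in (0.0, 1.0, 2.0) for b in (0.0, 1.0)]
outputs, mean, var = propagate(samples)
```

In a server/client deployment, the worker pool would be replaced by remote processes, but the structure (distribute samples, collect outputs, summarize) is the same.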

  18. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    Energy Technology Data Exchange (ETDEWEB)

    Heo, Jaeseok, E-mail: jheo@kaeri.re.kr; Kim, Kyung Doo, E-mail: kdkim@kaeri.re.kr

    2015-10-15

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform model calibration, uncertainty propagation, Chi-square linearity tests, and sensitivity analysis for both linear and nonlinear problems. PAPIRUS was developed by implementing multiple packages of methodologies and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in PAPIRUS with multiple computing resources and proper communication between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description of PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis implemented in the toolkit, with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.

  19. Advances in the spatially distributed ages-w model: parallel computation, java connection framework (JCF) integration, and streamflow/nitrogen dynamics assessment

    Science.gov (United States)

    AgroEcoSystem-Watershed (AgES-W) is a modular, Java-based spatially distributed model which implements hydrologic and water quality (H/WQ) simulation components under the Java Connection Framework (JCF) and the Object Modeling System (OMS) environmental modeling framework. AgES-W is implicitly scala...

  20. A distributed cloud-based cyberinfrastructure framework for integrated bridge monitoring

    Science.gov (United States)

    Jeong, Seongwoon; Hou, Rui; Lynch, Jerome P.; Sohn, Hoon; Law, Kincho H.

    2017-04-01

    This paper describes a cloud-based cyberinfrastructure framework for the management of the diverse data involved in bridge monitoring. Bridge monitoring involves various hardware systems, software tools and laborious activities, including, for example, structural health monitoring (SHM) sensor networks, engineering analysis programs and visual inspection. Very often, these monitoring systems, tools and activities are not coordinated, and the collected information is not shared. A well-designed integrated data management framework can support the effective use of the data and thereby enhance bridge management and maintenance operations. The cloud-based cyberinfrastructure framework presented herein is designed to manage not only sensor measurement data acquired from the SHM system, but also other relevant information, such as the bridge engineering model and traffic videos, in an integrated manner. For scalability and flexibility, cloud computing services and distributed database systems are employed. The stored information can be accessed through standard web interfaces. For demonstration, the cyberinfrastructure system has been implemented for the monitoring of bridges located along the I-275 corridor in the state of Michigan.

  1. Recommendations for institutional policy and network regulatory frameworks towards distributed generation in EU Member States

    International Nuclear Information System (INIS)

    Ten Donkelaar, M.; Van Oostvoorn, F.

    2005-01-01

    Recommendations regarding the development of regulatory frameworks and institutional policies towards an optimal integration of distributed generation (DG) into electricity networks are presented. These recommendations are based on findings from a benchmarking study conducted in the framework of the ENIRDG-net project. The aim of the benchmarking exercise was to identify examples of well-defined pro-DG policies, with clear targets and adequate implementation mechanisms. In this study an adequate pro-DG policy is defined on the basis of a level playing field, a situation where distributed and centralised generation receive equal incentives and have equal access to the liberalised electricity markets. The benchmark study includes the results of a similar study conducted in the framework of the SUSTELNET project. When comparing the results, a certain discrepancy can be noticed between the actual regulation and policy in a number of countries, the medium- to long-term targets, and the ideal situation described by the level-playing-field objective. To overcome this discrepancy, a number of recommendations have been drafted for future policy and regulation towards distributed generation.

  2. Modeling and Implementation of Cattle/Beef Supply Chain Traceability Using a Distributed RFID-Based Framework in China

    Science.gov (United States)

    Liang, Wanjie; Cao, Jing; Fan, Yan; Zhu, Kefeng; Dai, Qiwei

    2015-01-01

    In recent years, traceability systems have been developed as effective tools for improving the transparency of supply chains, thereby guaranteeing the quality and safety of food products. In this study, we proposed a cattle/beef supply chain traceability model and a traceability system based on radio frequency identification (RFID) technology and the EPCglobal network. First, the transformations of traceability units were defined and analyzed throughout the cattle/beef chain. Second, we described the internal and external traceability information acquisition, transformation, and transmission processes throughout the beef supply chain in detail, and explained a methodology for modeling traceability information using the electronic product code information service (EPCIS) framework. Then, the traceability system was implemented based on the Fosstrak and FreePastry software packages, with animal ear tag codes and electronic product codes (EPCs) employed to identify traceability units. Finally, a cattle/beef supply chain comprising a breeding business, a slaughter and processing business, a distribution business and a sales outlet was used as a case study to evaluate the beef supply chain traceability system. The results demonstrated that the major advantages of the traceability system are the effective sharing of information among businesses and the gapless traceability of the cattle/beef supply chain. PMID:26431340
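    The event-recording and trace-query pattern that EPCIS-style traceability relies on can be sketched as follows (a heavily simplified Python illustration; the real system uses the Fosstrak/FreePastry stack and the full EPCIS data model, and the EPC string below is invented): each business records events against an EPC, and a trace is the time-ordered event history for that code.

```python
# Minimal traceability sketch: record supply-chain events per EPC, then
# query the gapless, time-ordered history of one traceability unit.
events = []   # stand-in for a distributed event repository

def record_event(epc, step, business, time):
    events.append({"epc": epc, "step": step, "business": business, "time": time})

def trace(epc):
    """Time-ordered event history for one traceability unit."""
    return sorted((e for e in events if e["epc"] == epc), key=lambda e: e["time"])

epc = "urn:epc:id:sgtin:cattle.0001"          # invented example code
record_event(epc, "slaughter", "PlantA", time=2)
record_event(epc, "breeding", "FarmB", time=1)   # events may arrive out of order
record_event(epc, "retail", "OutletC", time=3)
steps = [e["step"] for e in trace(epc)]
```

The "gapless" property the abstract claims corresponds to every transformation of a traceability unit producing at least one such event, so that the query reconstructs the whole chain.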

  3. Modeling and Implementation of Cattle/Beef Supply Chain Traceability Using a Distributed RFID-Based Framework in China.

    Science.gov (United States)

    Liang, Wanjie; Cao, Jing; Fan, Yan; Zhu, Kefeng; Dai, Qiwei

    2015-01-01

    In recent years, traceability systems have been developed as effective tools for improving the transparency of supply chains, thereby guaranteeing the quality and safety of food products. In this study, we proposed a cattle/beef supply chain traceability model and a traceability system based on radio frequency identification (RFID) technology and the EPCglobal network. First, the transformations of traceability units were defined and analyzed throughout the cattle/beef chain. Second, we described the internal and external traceability information acquisition, transformation, and transmission processes throughout the beef supply chain in detail, and explained a methodology for modeling traceability information using the electronic product code information service (EPCIS) framework. Then, the traceability system was implemented based on the Fosstrak and FreePastry software packages, with animal ear tag codes and electronic product codes (EPCs) employed to identify traceability units. Finally, a cattle/beef supply chain comprising a breeding business, a slaughter and processing business, a distribution business and a sales outlet was used as a case study to evaluate the beef supply chain traceability system. The results demonstrated that the major advantages of the traceability system are the effective sharing of information among businesses and the gapless traceability of the cattle/beef supply chain.

  4. Promoting Action on Research Implementation in Health Services framework applied to TeamSTEPPS implementation in small rural hospitals.

    Science.gov (United States)

    Ward, Marcia M; Baloh, Jure; Zhu, Xi; Stewart, Greg L

    A particularly useful model for examining implementation of quality improvement interventions in health care settings is the PARIHS (Promoting Action on Research Implementation in Health Services) framework developed by Kitson and colleagues. The PARIHS framework proposes three elements (evidence, context, and facilitation) that are related to successful implementation. An evidence-based program focused on quality enhancement in health care, termed TeamSTEPPS (Team Strategies and Tools to Enhance Performance and Patient Safety), has been widely promoted by the Agency for Healthcare Research and Quality, but research is needed to better understand its implementation. We apply the PARIHS framework in studying TeamSTEPPS implementation to identify elements that are most closely related to successful implementation. Quarterly interviews were conducted over a 9-month period in 13 small rural hospitals that implemented TeamSTEPPS. Interview quotes that were related to each of the PARIHS elements were identified using directed content analysis. Transcripts were also scored quantitatively, and bivariate regression analysis was employed to explore relationships between PARIHS elements and successful implementation related to planning activities. The current findings provide support for the PARIHS framework and identify two of the three PARIHS elements (context and facilitation) as important contributors to successful implementation. This study applies the PARIHS framework to TeamSTEPPS, a widely used quality initiative focused on improving health care quality and patient safety. By focusing on small rural hospitals that undertook this quality improvement activity of their own accord, our findings represent effectiveness research in an understudied segment of the health care delivery system. 
By identifying context and facilitation as the most important contributors to successful implementation, these analyses provide a focus for efficient and effective sustainment of Team

  5. Design and Implement a MapReduce Framework for Executing Standalone Software Packages in Hadoop-based Distributed Environments

    Directory of Open Access Journals (Sweden)

    Chao-Chun Chen

    2013-12-01

    Full Text Available Hadoop MapReduce is a programming model for designing automatically scalable distributed computing applications. It provides developers an effective environment for attaining automatic parallelization. However, most existing manufacturing systems are arduous and restrictive to migrate to a MapReduce private cloud, due to platform incompatibility and the tremendous complexity of system reconstruction. To increase the efficiency of manufacturing systems with minimal modification of existing systems, we design a framework in this thesis, called MC-Framework: Multi-user-based Cloudizing-Application Framework. It provides a simple interface through which users can fairly execute requested tasks that work with traditional standalone software packages in MapReduce-based private cloud environments. Moreover, this thesis focuses on multiuser workloads, for which the default Hadoop scheduling scheme, i.e., FIFO, would increase delay. Hence, we also propose a new scheduling mechanism, called Job-Sharing Scheduling, to explore and fairly share jobs among machines in the MapReduce-based private cloud. We then prototype an experimental virtual-metrology module of a manufacturing system as a case study to verify and analyze the proposed MC-Framework. The results of our experiments indicate that the proposed framework greatly improves time performance compared with the original packages.
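    The difference between FIFO and a job-sharing policy can be illustrated with a toy scheduler (a hypothetical Python sketch, not the paper's Hadoop implementation; user and job names are invented): FIFO lets one user's long queue delay everyone else, while round-robin interleaving across users dispatches one job per user per round.

```python
# Toy contrast between FIFO ordering and fair, per-user round-robin
# ("job-sharing") ordering of a multiuser job queue.
from collections import OrderedDict, deque

def fifo_order(jobs):
    """Dispatch order under plain FIFO: submission order, users ignored."""
    return [name for _, name in jobs]

def job_sharing_order(jobs):
    """Dispatch order under round-robin interleaving across users."""
    queues = OrderedDict()
    for user, name in jobs:
        queues.setdefault(user, deque()).append(name)
    order = []
    while any(queues.values()):
        for q in queues.values():      # at most one job per user per round
            if q:
                order.append(q.popleft())
    return order

# alice submits three jobs before bob submits one
jobs = [("alice", "a1"), ("alice", "a2"), ("alice", "a3"), ("bob", "b1")]
```

Under FIFO, bob's single job waits behind all of alice's; under job sharing it is dispatched in the first round, which is exactly the multiuser delay the abstract targets.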

  6. A framework for effective implementation of lean production in Small and Medium-sized Enterprises

    Directory of Open Access Journals (Sweden)

    Amine Belhadi

    2016-09-01

    Full Text Available Purpose: The present paper aims at developing an effective framework including all the components necessary for implementing lean production properly in Small and Medium-sized Enterprises. Design/methodology/approach: The paper begins with a review of the main existing frameworks for lean implementation in order to highlight a shortcoming in the literature: the lack of a suitable framework for small companies. To overcome this gap, data on successful lean implementation initiatives were collected using a multiple case study approach. These initiatives were juxtaposed in order to develop a new, practical and effective framework that includes all the components (process, tools, success factors) necessary to implement lean in Small and Medium-sized Enterprises. Findings: The proposed framework makes several significant contributions. First, it overcomes the limitations of the existing frameworks by offering consultants, researchers and organizations an effective framework for lean implementation in SMEs, allowing SMEs to benefit from the competitive advantages gained through lean. Second, it brings together the most essential and critical elements of lean implementation commonly used by SMEs, derived from their practical experiences with lean implementation. Finally, it highlights the successful experiences of small companies in implementing lean programs and thereby shows that lean can deliver relevant results even for SMEs. Research limitations/implications: The proposed framework has a number of limitations and invites extension in further research: although it was derived from the practical experiences of SMEs, the proposed framework has not yet been validated through practical implementation. On the other hand, even though the elements in the proposed framework derive from the practical experiences of four SMEs, the identified elements need to be generalized and enriched by conducting

  7. A Software Rejuvenation Framework for Distributed Computing

    Science.gov (United States)

    Chau, Savio

    2009-01-01

    A performability-oriented conceptual framework for software rejuvenation has been constructed as a means of increasing levels of reliability and performance in distributed stateful computing. As used here, performability-oriented signifies that the construction of the framework is guided by the concept of analyzing the ability of a given computing system to deliver services with gracefully degradable performance. The framework is especially intended to support applications that involve stateful replicas of server computers.

  8. Heartbeat-based error diagnosis framework for distributed embedded systems

    Science.gov (United States)

    Mishra, Swagat; Khilar, Pabitra Mohan

    2012-01-01

    Distributed embedded systems have significant applications in the automobile industry as steer-by-wire, fly-by-wire and brake-by-wire systems. In this paper, we provide a general framework for fault detection in a distributed embedded real-time system. We use heartbeat monitoring, checkpointing and model-based redundancy to design a scalable framework that takes care of task scheduling, temperature control and diagnosis of faulty nodes in a distributed embedded system. This helps in diagnosing and shutting down faulty actuators before the system becomes unsafe. The framework is designed and tested using a new simulation model consisting of virtual nodes working on a message passing system.
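    The heartbeat-diagnosis rule is simple enough to sketch directly (a minimal Python illustration; the node names and timeout policy are invented, not the paper's design): each node reports a timestamped heartbeat, and a node is declared faulty once its last beat is older than the timeout.

```python
# Minimal heartbeat-monitoring sketch: a node is suspected faulty when
# its most recent heartbeat is older than the allowed timeout.

def diagnose(last_beats, now, timeout):
    """Return the set of node ids whose heartbeat has not arrived in time."""
    return {node for node, t in last_beats.items() if now - t > timeout}

last_beats = {"brake": 9.8, "steer": 9.9, "throttle": 7.1}   # seconds
faulty = diagnose(last_beats, now=10.0, timeout=1.0)
# "throttle" missed its deadline (10.0 - 7.1 > 1.0) and would be shut down
```

A real framework layers checkpointing and model-based redundancy on top of this rule so that a suspected actuator can be isolated before the system becomes unsafe.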

  9. A framework for risk assessment on lean production implementation

    Directory of Open Access Journals (Sweden)

    Giuliano Almeida Marodin

    2014-02-01

    Full Text Available The organizational and technical complexity of implementing lean principles and practices can make it an extensively time-consuming journey with few benefits. We argue that risk assessment can aid the understanding and management of the major difficulties in lean production implementation (LPI). Thus, this paper proposes a framework for risk assessment in the LPI process. The literature review made it possible to adapt the risk assessment steps to the characteristics of LPI and to develop data collection and analysis procedures for each step. Sociotechnical systems (STS) theory was brought in to improve the understanding of the context’s characteristics in the proposed framework, because context has a major influence on LPI. The framework has five steps: (a) defining the unit of analysis; (b) describing the context; (c) risk identification; (d) risk analysis; and (e) risk relationships modeling.

  10. Distributed Energy Implementation Options

    Energy Technology Data Exchange (ETDEWEB)

    Shah, Chandralata N [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-13

    This presentation covers the options for implementing distributed energy projects. It distinguishes between options available for distributed energy that is government owned versus privately owned, with a focus on the privately owned options including Energy Savings Performance Contract Energy Sales Agreements (ESPC ESAs). The presentation covers the new ESPC ESA Toolkit and other Federal Energy Management Program resources.

  11. Framework and implementation for improving physics essential skills via computer-based practice: Vector math

    Science.gov (United States)

    Mikula, Brendon D.; Heckler, Andrew F.

    2017-06-01

    We propose a framework for improving accuracy, fluency, and retention of basic skills essential for solving problems relevant to STEM introductory courses, and implement the framework for the case of basic vector math skills over several semesters in an introductory physics course. Using an iterative development process, the framework begins with a careful identification of target skills and the study of specific student difficulties with these skills. It then employs computer-based instruction, immediate feedback, mastery grading, and well-researched principles from cognitive psychology such as interleaved training sequences and distributed practice. We implemented this with more than 1500 students over 2 semesters. Students completed the mastery practice for an average of about 13 min/week, for a total of about 2-3 h for the whole semester. Results reveal large (>1 SD) pretest to post-test gains in accuracy in vector skills, even compared to a control group, and these gains were retained at least 2 months after practice. We also find evidence of improved fluency, student satisfaction, and that awarding regular course credit results in higher participation and higher learning gains than awarding extra credit. In all, we find that simple computer-based mastery practice is an effective and efficient way to improve a set of basic and essential skills for introductory physics.

  12. A distributed framework for inter-domain virtual network embedding

    Science.gov (United States)

    Wang, Zihua; Han, Yanni; Lin, Tao; Tang, Hui

    2013-03-01

    Network virtualization has been a promising technology for overcoming the Internet impasse. A main challenge in network virtualization is the efficient assignment of virtual resources. Existing work has focused on intra-domain solutions, whereas the inter-domain situation is more practical in realistic settings. In this paper, we present a distributed inter-domain framework for mapping virtual networks to physical networks which can improve the performance of virtual network embedding. The distributed framework is based on a multi-agent approach, and a set of messages for information exchange is defined. We design different operations and IPTV use scenarios to validate the advantages of our framework. The use cases show that our framework can solve the inter-domain embedding problem efficiently.

  13. A Framework Proposal For Choosing A New Business Implementation Model In Henkel

    OpenAIRE

    Li, Tsz Wan

    2015-01-01

    Henkel's New Business team is a corporate venturing unit that explores corporate entrepreneurial activities on behalf of Henkel Adhesives Technologies. The new business ideas are implemented through one of these models: incubator, venturing or innovation ecosystem. In current practice, there is no systematic framework in place to choose the implementation model. The goal of the thesis is to propose a framework for choosing the most appropriate model for implementation of a new business idea i...

  14. Hydra: a scalable proteomic search engine which utilizes the Hadoop distributed computing framework.

    Science.gov (United States)

    Lewis, Steven; Csordas, Attila; Killcoyne, Sarah; Hermjakob, Henning; Hoopmann, Michael R; Moritz, Robert L; Deutsch, Eric W; Boyle, John

    2012-12-05

    For shotgun mass spectrometry-based proteomics, the most computationally expensive step is matching the spectra against an increasingly large database of sequences and their post-translational modifications with known masses. Each mass spectrometer can generate data at an astonishingly high rate, and the scope of what is searched for is continually increasing; therefore, solutions for improving our ability to perform these searches are needed. We present a sequence database search engine that is specifically designed to run efficiently on the Hadoop MapReduce distributed computing framework. The search engine implements the K-score algorithm, generating comparable output for the same input files as the original implementation. The scalability of the system is shown, and the architecture required for the development of such distributed processing is discussed. The software is scalable in its ability to handle a large peptide database, numerous modifications and large numbers of spectra. Performance scales with the number of processors in the cluster, allowing throughput to expand with the available resources.
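
Hydra runs its spectrum-vs-database matching as MapReduce jobs on Hadoop. As a rough, stdlib-only illustration of that map/shuffle/reduce pattern (the shared-peak-count "score" below is a placeholder for clarity, not the K-score algorithm the real engine implements, and all names are invented):

```python
from collections import defaultdict

def map_phase(spectrum_id, peaks, peptide_db):
    # map: emit a (spectrum, candidate-score) pair for every peptide
    for peptide, ref_peaks in peptide_db.items():
        yield spectrum_id, (peptide, len(peaks & ref_peaks))

def shuffle(pairs):
    # shuffle: group candidate scores by spectrum id
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # reduce: keep the best-scoring peptide per spectrum
    return {sid: max(cands, key=lambda c: c[1])
            for sid, cands in grouped.items()}

peptide_db = {"PEPTIDE": {100, 200, 300}, "PROTEIN": {100, 400}}
spectra = {"s1": {100, 200, 350}, "s2": {100, 400, 500}}
pairs = [p for sid, peaks in spectra.items()
         for p in map_phase(sid, peaks, peptide_db)]
best = reduce_phase(shuffle(pairs))
print(best)  # → {'s1': ('PEPTIDE', 2), 's2': ('PROTEIN', 2)}
```

On Hadoop, the map and reduce functions run on different cluster nodes and the shuffle is performed by the framework, which is what lets throughput scale with the number of processors.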

  15. Implementing a Mobile Social Media Framework for Designing Creative Pedagogies

    Directory of Open Access Journals (Sweden)

    Thomas Cochrane

    2014-08-01

    Full Text Available The rise of mobile social media provides unique opportunities for new and creative pedagogies. Pedagogical change requires a catalyst, and we argue that mobile social media can be utilized as such a catalyst. However, the mobile learning literature is dominated by case studies that retrofit traditional pedagogical strategies and pre-existing course activities onto mobile devices and social media. From our experiences of designing and implementing a series of mobile social media projects, the authors have developed a mobile social media framework for creative pedagogies. We illustrate the implementation of our mobile social media framework within the development of a new media minor (an elective set of four courses that explicitly integrates the unique technical and pedagogical affordances of mobile social media, with a focus upon student-generated content and student-determined learning (heutagogy. We argue that our mobile social media framework is potentially transferable to a range of educational contexts, providing a simple design framework for new pedagogies.

  16. 2016-2020 Strategic Plan and Implementing Framework

    Energy Technology Data Exchange (ETDEWEB)

    None

    2015-11-01

    The 2016-2020 Strategic Plan and Implementing Framework from the Office of Energy Efficiency and Renewable Energy (EERE) is the blueprint for launching the nation’s leadership in the global clean energy economy. This document will guide the organization to build on decades of progress in powering our nation from clean, affordable and secure energy.

  17. Distributed software framework and continuous integration in hydroinformatics systems

    Science.gov (United States)

    Zhou, Jianzhong; Zhang, Wei; Xie, Mengfei; Lu, Chengwei; Chen, Xiao

    2017-08-01

    When encountering multiple and complicated models, multisource structured and unstructured data, and complex requirements analysis, the platform design and integration of hydroinformatics systems become a challenge. To properly solve these problems, we describe a distributed software framework and its continuous integration process in hydroinformatics systems. This distributed framework mainly consists of a server cluster for models, a distributed database, GIS (Geographic Information System) servers, a master node and clients. Based on it, a GIS-based decision support system for the joint regulation of water quantity and water quality of a group of lakes in Wuhan, China, is established.

  18. Efficient and Flexible Climate Analysis with Python in a Cloud-Based Distributed Computing Framework

    Science.gov (United States)

    Gannon, C.

    2017-12-01

    As climate models become progressively more advanced, and spatial resolution further improved through various downscaling projects, climate projections at a local level are increasingly insightful and valuable. However, the raw size of climate datasets presents numerous hurdles for analysts wishing to develop customized climate risk metrics or perform site-specific statistical analysis. Four Twenty Seven, a climate risk consultancy, has implemented a Python-based distributed framework to analyze large climate datasets in the cloud. With the freedom afforded by efficiently processing these datasets, we are able to customize and continually develop new climate risk metrics using the most up-to-date data. Here we outline our process for using Python packages such as XArray and Dask to evaluate netCDF files in a distributed framework, StarCluster to operate in a cluster-computing environment, cloud computing services to access publicly hosted datasets, and how this setup is particularly valuable for generating climate change indicators and performing localized statistical analysis.
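
The stack described above uses XArray and Dask over netCDF files on a cluster; as a stdlib-only sketch of the underlying pattern (split a large series into chunks, reduce each chunk in parallel, combine the partial results), with purely illustrative data and a made-up "hot day" metric:

```python
from concurrent.futures import ThreadPoolExecutor

def hot_days(chunk, threshold=30.0):
    # per-chunk partial reduction: count of days above the threshold
    return sum(1 for t in chunk if t > threshold)

temps = [25.0, 31.5, 29.9, 33.0] * 250          # 1000 "daily" values
chunks = [temps[i:i + 100] for i in range(0, len(temps), 100)]

with ThreadPoolExecutor(max_workers=4) as pool:
    partial_counts = list(pool.map(hot_days, chunks))

print(sum(partial_counts))  # → 500 (two hot days per 4-day block)
```

With Dask the chunking and scheduling are handled for you (e.g. `xarray.open_dataset(path, chunks={"time": 365})` yields lazily evaluated chunked arrays), but the compute pattern is the same.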

  19. California Curriculum Frameworks: A Handbook for Production, Implementation, and Evaluation Activities.

    Science.gov (United States)

    California State Dept. of Education, Sacramento.

    This booklet describes the characteristics and role of curriculum frameworks and describes how they can be used in developing educational programs. It is designed as a guide for writers of frameworks, for educators who are responsible for implementing frameworks, or for evaluators of educational programs. It provides a concise description of the…

  20. Quality Assurance Framework Implementation Guide for Isolated Community Power Systems

    Energy Technology Data Exchange (ETDEWEB)

    Esterly, Sean R. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Baring-Gould, Edward I. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Burman, Kari A. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Greacen, Chris [Independent Consultant (United States)

    2017-08-15

    This implementation guide is a companion document to the 'Quality Assurance Framework for Mini-Grids' technical report. This document is intended to be used by one of the many stakeholder groups that take part in the implementation of isolated power systems. Although the QAF could be applied to a single system, it was designed primarily to be used within the context of a larger national or regional rural electrification program in which many individual systems are being installed. This guide includes a detailed overview of the Quality Assurance Framework and provides guidance focused on the implementation of the Framework from the perspective of the different stakeholders that are commonly involved in expanding energy development within specific communities or regions. For the successful long-term implementation of a specific rural electrification program using mini-grid systems, six key stakeholders have been identified that are typically engaged, each with a different set of priorities: (1) regulatory agency; (2) governmental ministry; (3) system developers; (4) mini-utility; (5) investors; and (6) customers/consumers. This document is broken into two distinct sections. The first focuses on the administrative processes in the development and operation of community-based mini-grid programs, while the second focuses on the process around the installation of the mini-grid project itself.

  1. Developing a Framework and Implementing User-Driven Innovation in Supply and Value Network

    DEFF Research Database (Denmark)

    Jacobsen, Alexia; Lassen, Astrid Heidemann; Wandahl, Søren

    2011-01-01

    This paper serves to create a framework for, and subsequently implement, user-driven innovation in a construction material industry network. The research has its outset in Project InnoDoors, which consists of a Danish university and a construction material network.

  2. Designing the Distributed Model Integration Framework – DMIF

    NARCIS (Netherlands)

    Belete, Getachew F.; Voinov, Alexey; Morales, Javier

    2017-01-01

    We describe and discuss the design and prototype of the Distributed Model Integration Framework (DMIF) that links models deployed on different hardware and software platforms. We used distributed computing and service-oriented development approaches to address the different aspects of

  3. Competing Through Lean – Towards Sustainable Resource-Oriented Implementation Framework

    Directory of Open Access Journals (Sweden)

    Rymaszewska Anna

    2014-11-01

    Full Text Available This paper addresses the needs of SME manufacturing companies which, due to their limited resources, are often unable to introduce radical changes in their strategies. The main focus is on analyzing the principles of lean manufacturing and management with regard to their potential contribution to building a company's competitive advantage. The paper analyses lean from a strategic management viewpoint while linking its implementation to the achievement of competitive advantage. The ultimate result is a framework for lean implementation aimed at building a competitive advantage for companies. The proposed framework focuses on the idea of a closed loop with embedded sustainability.

  4. A Framework for Process Reengineering in Higher Education: A case study of distance learning exam scheduling and distribution

    Directory of Open Access Journals (Sweden)

    M'hammed Abdous

    2008-10-01

    Full Text Available In this paper, we propose a conceptual and operational framework for process reengineering (PR) in higher education (HE) institutions. Using a case study aimed at streamlining exam scheduling and distribution in a distance learning (DL) unit, we outline a sequential, non-linear four-step framework designed to reengineer processes. The first two steps of this framework – initiating and analyzing – are used to initiate, document, and flowchart the process targeted for reengineering, and the last two steps – reengineering/implementing and evaluating – are intended to prototype, implement, and evaluate the reengineered process. Our early involvement of all stakeholders, and our in-depth analysis and documentation of the existing process, allowed us to avoid the traditional pitfalls associated with business process reengineering (BPR). Consequently, the outcome of our case study indicates a streamlined and efficient process, higher faculty satisfaction, and substantial cost reduction.

  5. Combined use of the Consolidated Framework for Implementation Research (CFIR) and the Theoretical Domains Framework (TDF): a systematic review.

    Science.gov (United States)

    Birken, Sarah A; Powell, Byron J; Presseau, Justin; Kirk, M Alexis; Lorencatto, Fabiana; Gould, Natalie J; Shea, Christopher M; Weiner, Bryan J; Francis, Jill J; Yu, Yan; Haines, Emily; Damschroder, Laura J

    2017-01-05

    Over 60 implementation frameworks exist. Using multiple frameworks may help researchers to address multiple study purposes, levels, and degrees of theoretical heritage and operationalizability; however, using multiple frameworks may result in unnecessary complexity and redundancy if doing so does not address study needs. The Consolidated Framework for Implementation Research (CFIR) and the Theoretical Domains Framework (TDF) are both well-operationalized, multi-level implementation determinant frameworks derived from theory. As such, the rationale for using the frameworks in combination (i.e., CFIR + TDF) is unclear. The objective of this systematic review was to elucidate the rationale for using CFIR + TDF by (1) describing studies that have used CFIR + TDF, (2) describing how they used CFIR + TDF, and (3) describing their stated rationale for using CFIR + TDF. We undertook a systematic review to identify studies that mentioned both the CFIR and the TDF, were written in English, were peer-reviewed, and reported either a protocol or results of an empirical study in MEDLINE/PubMed, PsycInfo, Web of Science, or Google Scholar. We then abstracted data into a matrix and analyzed it qualitatively, identifying salient themes. We identified five protocols and seven completed studies that used CFIR + TDF. CFIR + TDF was applied to studies in several countries, to a range of healthcare interventions, and at multiple intervention phases; used many designs, methods, and units of analysis; and assessed a variety of outcomes. Three studies indicated that using CFIR + TDF addressed multiple study purposes. Six studies indicated that using CFIR + TDF addressed multiple conceptual levels. Four studies did not explicitly state their rationale for using CFIR + TDF. Differences in the purposes that authors of the CFIR (e.g., comprehensive set of implementation determinants) and the TDF (e.g., intervention development) propose help to justify the use of CFIR

  6. Identifying a practice-based implementation framework for sustainable interventions for improving the evolving working environment: Hitting the Moving Target Framework.

    Science.gov (United States)

    Højberg, Helene; Rasmussen, Charlotte Diana Nørregaard; Osborne, Richard H; Jørgensen, Marie Birk

    2018-02-01

    Our aim was to identify implementation components for sustainable working environment interventions in the nursing assistant sector to generate a framework to optimize the implementation of workplace improvement initiatives. The implementation framework was informed by: 1) an industry advisory group, 2) interviews with key stakeholder, 3) concept mapping workshops, and 4) an e-mail survey. Thirty five stakeholders were interviewed and contributed in the concept mapping workshops. Eleven implementation components were derived across four domains: 1) A supportive organizational platform, 2) An engaged workplace with mutual goals, 3) The intervention is sustainably fitted to the workplace, and 4) the intervention is an attractive choice. The highest rated component was "Engaged and Active Management" (mean 4.1) and the lowest rated was "Delivered in an Attractive Form" (mean 2.8). The framework provides new insights into implementation in an evolving working environment and is aiming to assist with addressing gaps in effectiveness of workplace interventions and implementation success. Copyright © 2017 The Authors. Published by Elsevier Ltd.. All rights reserved.

  7. Developing theory-informed behaviour change interventions to implement evidence into practice: a systematic approach using the Theoretical Domains Framework

    Directory of Open Access Journals (Sweden)

    French Simon D

    2012-04-01

    Full Text Available Abstract Background There is little systematic operational guidance about how best to develop complex interventions to reduce the gap between practice and evidence. This article is one in a Series of articles documenting the development and use of the Theoretical Domains Framework (TDF) to advance the science of implementation research. Methods The intervention was developed considering three main components: theory, evidence, and practical issues. We used a four-step approach, consisting of guiding questions, to direct the choice of the most appropriate components of an implementation intervention: Who needs to do what, differently? Using a theoretical framework, which barriers and enablers need to be addressed? Which intervention components (behaviour change techniques and mode(s) of delivery) could overcome the modifiable barriers and enhance the enablers? And how can behaviour change be measured and understood? Results A complex implementation intervention was designed that aimed to improve acute low back pain management in primary care. We used the TDF to identify the barriers and enablers to the uptake of evidence into practice and to guide the choice of intervention components. These components were then combined into a cohesive intervention. The intervention was delivered via two facilitated interactive small group workshops. We also produced a DVD to distribute to all participants in the intervention group. We chose outcome measures in order to assess the mediating mechanisms of behaviour change. Conclusions We have illustrated a four-step systematic method for developing an intervention designed to change clinical practice based on a theoretical framework. The method of development provides a systematic framework that could be used by others developing complex implementation interventions. While this framework should be iteratively adjusted and refined to suit other contexts and settings, we believe that the four-step process should be

  8. Developing theory-informed behaviour change interventions to implement evidence into practice: a systematic approach using the Theoretical Domains Framework.

    Science.gov (United States)

    French, Simon D; Green, Sally E; O'Connor, Denise A; McKenzie, Joanne E; Francis, Jill J; Michie, Susan; Buchbinder, Rachelle; Schattner, Peter; Spike, Neil; Grimshaw, Jeremy M

    2012-04-24

    There is little systematic operational guidance about how best to develop complex interventions to reduce the gap between practice and evidence. This article is one in a Series of articles documenting the development and use of the Theoretical Domains Framework (TDF) to advance the science of implementation research. The intervention was developed considering three main components: theory, evidence, and practical issues. We used a four-step approach, consisting of guiding questions, to direct the choice of the most appropriate components of an implementation intervention: Who needs to do what, differently? Using a theoretical framework, which barriers and enablers need to be addressed? Which intervention components (behaviour change techniques and mode(s) of delivery) could overcome the modifiable barriers and enhance the enablers? And how can behaviour change be measured and understood? A complex implementation intervention was designed that aimed to improve acute low back pain management in primary care. We used the TDF to identify the barriers and enablers to the uptake of evidence into practice and to guide the choice of intervention components. These components were then combined into a cohesive intervention. The intervention was delivered via two facilitated interactive small group workshops. We also produced a DVD to distribute to all participants in the intervention group. We chose outcome measures in order to assess the mediating mechanisms of behaviour change. We have illustrated a four-step systematic method for developing an intervention designed to change clinical practice based on a theoretical framework. The method of development provides a systematic framework that could be used by others developing complex implementation interventions. 
While this framework should be iteratively adjusted and refined to suit other contexts and settings, we believe that the four-step process should be maintained as the primary framework to guide researchers through a

  9. DiSC: A Simulation Framework for Distribution System Voltage Control

    DEFF Research Database (Denmark)

    Pedersen, Rasmus; Sloth, Christoffer Eg; Andresen, Gorm

    2015-01-01

    This paper presents the MATLAB simulation framework, DiSC, for verifying voltage control approaches in power distribution systems. It consists of real consumption data, stochastic models of renewable resources, flexible assets, the electrical grid, and models of the underlying communication channels. The simulation framework makes it possible to validate control approaches, and thus advance realistic and robust control algorithms for distribution system voltage control. Two examples demonstrate the potential voltage issues from penetration of renewables in the distribution grid, along with simple control...
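
One common family of "simple control" for distribution-grid voltage issues is a volt/var droop on the inverter. DiSC itself is a MATLAB framework; the Python sketch below is only a generic illustration of the droop idea, with made-up per-unit gains and limits, not code from the paper.

```python
def droop_q(v_pu, v_ref=1.0, gain=2.0, q_max=0.5):
    """Toy volt/var droop: absorb reactive power when the local voltage is
    above the reference, inject when it is below, saturated at the
    inverter limit. All quantities are per-unit and illustrative."""
    q = -gain * (v_pu - v_ref)
    return max(-q_max, min(q_max, q))

# overvoltage (e.g. high PV production) -> reactive power absorption
print(round(droop_q(1.05), 3))  # → -0.1
# deep undervoltage -> injection clipped at the inverter limit
print(droop_q(0.5))             # → 0.5
```

A simulation framework like DiSC would close the loop: the grid model maps the injected reactive power back to node voltages, which feed the controller at the next time step.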

  10. Global Framework for Climate Services (GFCS): status of implementation

    Science.gov (United States)

    Lucio, Filipe

    2014-05-01

    The GFCS is a global partnership of governments and UN and international agencies that produce and use climate information and services. WMO, which is leading the initiative in collaboration with UN ISDR, WHO, WFP, FAO, UNESCO, UNDP and other UN and international partners, is pooling expertise and resources in order to co-design and co-produce knowledge, information and services to support effective decision making in response to climate variability and change in four priority areas (agriculture and food security, water, health, and disaster risk reduction). To address the entire value chain for the effective production and application of climate services, the main components or pillars of the GFCS are being implemented, namely: • User Interface Platform — to provide ways for climate service users and providers to interact to identify needs and capacities and improve the effectiveness of the Framework and its climate services; • Climate Services Information System — to produce and distribute climate data, products and information according to the needs of users and to agreed standards; • Observations and Monitoring — to generate the necessary data for climate services according to agreed standards; • Research, Modelling and Prediction — to harness science capabilities and results and develop appropriate tools to meet the needs of climate services; • Capacity Building — to support the systematic development of the institutions, infrastructure and human resources needed for effective climate services. Activities are being implemented in various countries in Africa, the Caribbean and the South Pacific Islands. This paper provides details on the status of implementation of the GFCS worldwide.

  11. European Interoperability Assets Register and Quality Framework Implementation.

    Science.gov (United States)

    Moreno-Conde, Alberto; Thienpont, Geert; Lamote, Inge; Coorevits, Pascal; Parra, Carlos; Kalra, Dipak

    2016-01-01

    Interoperability assets is the term applied to any resource that can support the design, implementation and successful adoption of eHealth services that can exchange data meaningfully. Examples include functional requirements, specifications, standards, clinical models and term lists, guidance on how standards may be used concurrently, implementation guides, educational resources, and other resources. Unfortunately, these are largely accessible in ad hoc ways and result in scattered fragments of a solution space that urgently need to be brought together. At present, it is well known that new initiatives and projects will reinvent assets of which they were unaware, while assets that were potentially of great value are forgotten, not maintained and eventually fall into disuse. This research has defined a quality in use model and assessed the suitability of this quality framework based on the feedback and opinion of a representative sample of potential end users. This quality framework covers the following domains of asset development and adoption: (i) development process, (ii) maturity level, (iii) trustworthiness, (iv) support & skills, (v) sustainability, (vi) semantic interoperability, (vii) cost & effort of adoption, and (viii) maintenance. When participants were asked to evaluate the overall quality in use framework, 70% would recommend using the register to their colleagues, 70% felt that it could provide relevant benefits for discovering new assets, and 50% responded that it would support their decision making about the recommended asset to adopt or implement in their organisation. Several European projects have expressed interest in using the register, which will now be sustained and promoted by the European Institute for Innovation through Health Data.

  12. A lightweight messaging-based distributed processing and workflow execution framework for real-time and big data analysis

    Science.gov (United States)

    Laban, Shaban; El-Desouky, Aly

    2014-05-01

    To achieve rapid, simple and reliable parallel processing of different types of tasks and big data processing on any compute cluster, a lightweight messaging-based distributed application processing and workflow execution framework model is proposed. The framework is based on Apache ActiveMQ and the Simple (or Streaming) Text Oriented Message Protocol (STOMP). ActiveMQ, a popular and powerful open-source messaging and integration-patterns server with persistence and scheduler capabilities, acts as the message broker in the framework. STOMP provides an interoperable wire format that allows framework programs to talk and interact with each other and with ActiveMQ easily. In order to use the message broker efficiently, a unified message and topic naming pattern is utilized to achieve the required operation. Only three Python programs and a simple library, used to unify and simplify the use of ActiveMQ and the STOMP protocol, are needed to use the framework. A watchdog program is used to monitor, remove, add, start and stop any machine and/or its different tasks when necessary. For every machine, exactly one dedicated zookeeper program is used to start the different functions or tasks and the stompShell program needed for executing the user's required workflow. The stompShell instances execute workflow jobs based on received messages. A well-defined, simple and flexible message structure, based on JavaScript Object Notation (JSON), is used to build complex workflow systems; JSON is also used in configuration and in communication between machines and programs. The framework is platform independent. Although the framework is built using Python, the actual workflow programs or jobs can be implemented in any programming language. The generic framework can be used in small national data centres for processing seismological and radionuclide data received from the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear
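
The abstract above describes a unified topic-naming pattern and a JSON message structure but does not give the exact schema, so the field names and naming convention below are illustrative assumptions. A minimal stdlib-only sketch of building and decoding such a control message:

```python
import json

def make_task_message(machine, task, command, args):
    """Build a JSON control message for a task on a machine. The
    `/topic/<machine>.<task>` convention and the field names are
    hypothetical, standing in for the paper's unified naming pattern."""
    return json.dumps({
        "topic": f"/topic/{machine}.{task}",
        "command": command,
        "args": args,
    })

msg = make_task_message("node01", "seismic_filter", "start",
                        {"input": "day001.mseed"})
decoded = json.loads(msg)
print(decoded["topic"])  # → /topic/node01.seismic_filter
```

In the real framework, such a message would be sent to ActiveMQ over STOMP and picked up by the stompShell instance subscribed to that topic.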

  13. On effectiveness of network sensor-based defense framework

    Science.gov (United States)

    Zhang, Difan; Zhang, Hanlin; Ge, Linqiang; Yu, Wei; Lu, Chao; Chen, Genshe; Pham, Khanh

    2012-06-01

    Cyber attacks are increasing in frequency, impact, and complexity, and demonstrate extensive network vulnerabilities with the potential for serious damage. Defending against cyber attacks calls for distributed, collaborative monitoring, detection, and mitigation. To this end, we develop a network sensor-based defense framework, with the aim of handling network security awareness, mitigation, and prediction. We implement the prototypical system and show its effectiveness in detecting known attacks, such as port-scanning and distributed denial-of-service (DDoS). Based on this framework, we also implement statistical-based and sequential testing-based detection techniques and compare their respective detection performance. Future defensive algorithms can be provisioned in our proposed framework for combating cyber attacks.
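
As a toy sketch of the sequential testing-based detection mentioned above, here is a CUSUM-style change detector applied to a per-second packet-count stream: the statistic accumulates deviations above a reference mean and raises an alarm once it crosses a threshold. Parameters and data are illustrative, not taken from the paper.

```python
def cusum_alarm(counts, mean=10.0, slack=2.0, threshold=15.0):
    """Return the index at which the one-sided CUSUM statistic exceeds
    the threshold, or None if the stream stays in the normal regime."""
    s = 0.0
    for t, x in enumerate(counts):
        # accumulate only excess above mean + slack; reset floor at 0
        s = max(0.0, s + (x - mean - slack))
        if s > threshold:
            return t
    return None

normal = [9, 11, 10, 12, 8]
flood = [30, 35, 32]                 # DDoS-like burst
print(cusum_alarm(normal + flood))   # → 5 (first sample of the burst)
```

Unlike a fixed per-sample threshold, the sequential statistic also catches slow, sustained increases that never spike on any single sample.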

  14. Introducing the Canadian Thoracic Society Framework for Guideline Dissemination and Implementation, with Concurrent Evaluation

    Directory of Open Access Journals (Sweden)

    Samir Gupta

    2013-01-01

    Full Text Available The Canadian Thoracic Society (CTS) is leveraging its strengths in guideline production to enable respiratory guideline implementation in Canada. The authors describe the new CTS Framework for Guideline Dissemination and Implementation, with Concurrent Evaluation, which has three spheres of action: guideline production, implementation infrastructure and knowledge translation (KT) methodological support. The Canadian Institutes of Health Research 'Knowledge-to-Action' process was adopted as the model of choice for conceptualizing KT interventions. Within the framework, new evidence on formatting guideline recommendations to enhance the intrinsic implementability of future guidelines was applied. Clinical assemblies will consider implementability early in the guideline production cycle when selecting clinical questions, and new practice guidelines will include a section dedicated to KT. The framework describes the development of a web-based repository and communication forum to inventory existing KT resources and to facilitate collaboration and communication among implementation stakeholders through an online discussion board. A national forum for presentation and peer review of proposed KT projects is described. The framework outlines expert methodological support for KT planning, development and evaluation, including a practical guide for implementers, a novel 'Clinical Assembly – KT Action Team', and in-kind logistical support and assistance in securing peer-reviewed funding.

  15. Testing the Consolidated Framework for Implementation Research on health care innovations from South Yorkshire.

    Science.gov (United States)

    Ilott, Irene; Gerrish, Kate; Booth, Andrew; Field, Becky

    2013-10-01

    There is an international imperative to implement research into clinical practice to improve health care. Understanding the dynamics of change requires knowledge from theoretical and empirical studies. This paper presents a novel approach to testing a new meta-theoretical framework: the Consolidated Framework for Implementation Research. The utility of the Framework was evaluated using a post hoc, deductive analysis of 11 narrative accounts of innovation in health care services and practice from England, collected in 2010. A matrix comprising the five domains and 39 constructs of the Framework was developed to examine the coherence of the terminology, to compare results across contexts and to identify new theoretical developments. The Framework captured the complexity of implementation across 11 diverse examples, offering theoretically informed, comprehensive coverage. The Framework drew attention to relevant points in individual cases together with patterns across cases; for example, all were internally developed innovations that brought direct or indirect patient advantage. In 10 cases, the change was led by clinicians. Most initiatives had been maintained for several years and there was evidence of spread in six examples. Areas for further development within the Framework include sustainability and patient/public engagement in implementation. Our analysis suggests that this conceptual framework has the potential to offer useful insights, whether as part of a situational analysis or by developing context-specific propositions for hypothesis testing. Such studies are vital now that innovation is being promoted as core business for health care. © 2012 John Wiley & Sons Ltd.

  16. An implementation framework for additive manufacturing in supply chains

    Directory of Open Access Journals (Sweden)

    Raed Handal

    2017-12-01

    Full Text Available Additive manufacturing has become one of the most important technologies in the manufacturing field. Full implementation of additive manufacturing will change many well-known management practices in the production sector. However, theoretical development in the field of additive manufacturing with regard to its impact on supply chain management is rare. While additive manufacturing is believed to revolutionize and enhance traditional manufacturing, no comprehensive toolset has been developed in the manufacturing field to assess its impact and determine the production method that best suits the applied supply chain strategy. A significant portion of the existing supply chain methods and frameworks were adopted in this study to examine the implementation of additive manufacturing in supply chain management. The aim of this study is to develop a framework that explains when additive manufacturing can be implemented efficiently in supply chain management.

  17. HUMANITARIAN AID DISTRIBUTION FRAMEWORK FOR NATURAL DISASTER MANAGEMENT

    OpenAIRE

    Mohd, S.; Fathi, M. S.; Harun, A. N.

    2018-01-01

    Humanitarian aid distribution is associated with many activities, numerous disaster management stakeholders, enormous effort and different processes. For effective communication, humanitarian aid distribution activities require appropriate and up-to-date information to enhance collaboration, and improve integration. The purpose of this paper is to develop a humanitarian aid distribution framework for disaster management in Malaysia. The findings of this paper are based on a review of the huma...

  18. A general framework for updating belief distributions.

    Science.gov (United States)

    Bissiri, P G; Holmes, C C; Walker, S G

    2016-11-01

    We propose a framework for general Bayesian inference. We argue that a valid update of a prior belief distribution to a posterior can be made for parameters which are connected to observations through a loss function rather than the traditional likelihood function, which is recovered as a special case. Modern application areas make it increasingly challenging for Bayesians to attempt to model the true data-generating mechanism. For instance, when the object of interest is low dimensional, such as a mean or median, it is cumbersome to have to achieve this via a complete model for the whole data distribution. More importantly, there are settings where the parameter of interest does not directly index a family of density functions and thus the Bayesian approach to learning about such parameters is currently regarded as problematic. Our framework uses loss functions to connect information in the data to functionals of interest. The updating of beliefs then follows from a decision theoretic approach involving cumulative loss functions. Importantly, the procedure coincides with Bayesian updating when a true likelihood is known yet provides coherent subjective inference in much more general settings. Connections to other inference frameworks are highlighted.
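
The loss-based update described above can be sketched numerically on a parameter grid: the generalized posterior is proportional to prior × exp(−w · cumulative loss), and choosing the absolute-value loss targets the median without modeling the full data distribution. This is an illustrative sketch, not the authors' formulation; the grid, the loss-scale w, and the data are assumptions.

```python
import numpy as np

def general_bayes_update(theta_grid, prior, data, loss, w=1.0):
    """Grid-based general Bayesian update: posterior ∝ prior · exp(-w · Σ loss)."""
    total_loss = np.array([sum(loss(t, x) for x in data) for t in theta_grid])
    log_post = np.log(prior) - w * total_loss
    log_post -= log_post.max()            # numerical stability
    post = np.exp(log_post)
    return post / post.sum()

theta = np.linspace(-5, 5, 501)
prior = np.ones_like(theta) / len(theta)    # flat prior
data = [0.8, 1.1, 0.9, 5.0]                 # one outlier
abs_loss = lambda t, x: abs(x - t)          # absolute loss targets the median
post = general_bayes_update(theta, prior, data, abs_loss)
print(theta[np.argmax(post)])               # concentrates near the sample median
```

With the negative log-likelihood as the loss, this reduces to ordinary Bayesian updating, matching the special case noted in the abstract.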

  19. Designing and Implementing a Retrospective Earthquake Detection Framework at the U.S. Geological Survey National Earthquake Information Center

    Science.gov (United States)

    Patton, J.; Yeck, W.; Benz, H.

    2017-12-01

    The U.S. Geological Survey National Earthquake Information Center (USGS NEIC) is implementing and integrating new signal detection methods such as subspace correlation, continuous beamforming, multi-band picking and automatic phase identification into near-real-time monitoring operations. Leveraging the additional information from these techniques helps the NEIC utilize a large and varied network on local to global scales. The NEIC is developing an ordered, rapid, robust, and decentralized framework for distributing seismic detection data as well as a set of formalized formatting standards. These frameworks and standards enable the NEIC to implement a seismic event detection framework that supports basic tasks, including automatic arrival-time picking, social-media-based event detections, and automatic association of different seismic detection data into seismic earthquake events. In addition, this framework enables retrospective detection processing such as automated S-wave arrival-time picking given a detected event, discrimination and classification of detected events by type, back-azimuth and slowness calculations, and ensuring aftershock and induced sequence detection completeness. These processes and infrastructure improve the NEIC's capabilities, accuracy, and speed of response. In addition, this same infrastructure provides an improved and convenient structure to support access to automatic detection data for both research and algorithmic development.
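
Automatic arrival-time picking of the kind mentioned above is classically done with a short-term/long-term average (STA/LTA) energy-ratio trigger. The sketch below is a generic illustration, not the NEIC's implementation; the window lengths, threshold, and synthetic trace are assumptions.

```python
import numpy as np

def sta_lta_pick(trace, fs, sta_win=0.5, lta_win=5.0, threshold=4.0):
    """Return the first sample index where the short-term/long-term
    average ratio of signal energy crosses the trigger threshold."""
    n_sta, n_lta = int(sta_win * fs), int(lta_win * fs)
    energy = trace ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))   # prefix sums of energy
    for i in range(n_lta, len(trace) - n_sta):
        lta = (csum[i] - csum[i - n_lta]) / n_lta       # trailing long window
        sta = (csum[i + n_sta] - csum[i]) / n_sta       # leading short window
        if lta > 0 and sta / lta >= threshold:
            return i
    return None

# Synthetic trace: low-level noise with a 5 Hz "arrival" starting at sample 1500.
rng = np.random.default_rng(0)
fs = 100.0
noise = 0.1 * rng.standard_normal(3000)
noise[1500:] += np.sin(2 * np.pi * 5 * np.arange(1500) / fs)
pick = sta_lta_pick(noise, fs)
print(pick)  # trigger index near the onset at sample 1500
```

Subspace correlation and multi-band picking refine this idea, but the trigger logic above captures the basic detection task the framework distributes.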

  20. Dreams: a framework for distributed synchronous coordination

    NARCIS (Netherlands)

    Proença, J.; Clarke, D.; Vink, de E.P.; Arbab, F.

    2012-01-01

    Synchronous coordination systems, such as Reo, exchange data via indivisible actions, while distributed systems are typically asynchronous and assume that messages can be delayed or get lost. To combine these seemingly contradictory notions, we introduce the Dreams framework. Coordination patterns

  1. Architectural frameworks: defining the structures for implementing learning health systems.

    Science.gov (United States)

    Lessard, Lysanne; Michalowski, Wojtek; Fung-Kee-Fung, Michael; Jones, Lori; Grudniewicz, Agnes

    2017-06-23

    The vision of transforming health systems into learning health systems (LHSs) that rapidly and continuously transform knowledge into improved health outcomes at lower cost is generating increased interest in government agencies, health organizations, and health research communities. While existing initiatives demonstrate that different approaches can succeed in making the LHS vision a reality, they are too varied in their goals, focus, and scale to be reproduced without undue effort. Indeed, the structures necessary to effectively design and implement LHSs on a larger scale are lacking. In this paper, we propose the use of architectural frameworks to develop LHSs that adhere to a recognized vision while being adapted to their specific organizational context. Architectural frameworks are high-level descriptions of an organization as a system; they capture the structure of its main components at varied levels, the interrelationships among these components, and the principles that guide their evolution. Because these frameworks support the analysis of LHSs and allow their outcomes to be simulated, they act as pre-implementation decision-support tools that identify potential barriers and enablers of system development. They thus increase the chances of successful LHS deployment. We present an architectural framework for LHSs that incorporates five dimensions-goals, scientific, social, technical, and ethical-commonly found in the LHS literature. The proposed architectural framework is comprised of six decision layers that model these dimensions. The performance layer models goals, the scientific layer models the scientific dimension, the organizational layer models the social dimension, the data layer and information technology layer model the technical dimension, and the ethics and security layer models the ethical dimension. We describe the types of decisions that must be made within each layer and identify methods to support decision-making. 
In this paper, we outline

  2. Participation in the implementation of the Water Framework Directive in Denmark

    DEFF Research Database (Denmark)

    Wright, Stuart Anthony Lewis; Jacobsen, Brian Højland

    2011-01-01

    Public participation in the form of informing, consulting and actively involving all interested parties is required during the implementation of the Water Framework Directive (WFD). This paper discusses progress with implementation of the WFD in Denmark and the measures taken to conform to the requirements. The paper then presents the Danish AGWAPLAN project, which actively involved farmers in selecting measures to reduce diffuse nutrient pollution from agriculture. The second aim of the paper is to establish whether nationwide implementation of the AGWAPLAN concept is worthwhile. AGWAPLAN resulted in outcomes which could potentially increase the effectiveness of the WFD. Furthermore, the adoption of the project approach would also be one way to satisfy the requirement for active involvement in the Directive. However, some problems exist, relating to time, administrative costs, problems with control...

  3. Distributed team innovation - a framework for distributed product development

    OpenAIRE

    Larsson, Andreas; Törlind, Peter; Karlsson, Lennart; Mabogunje, Ade; Leifer, Larry; Larsson, Tobias; Elfström, Bengt-Olof

    2003-01-01

    In response to the need for increased effectiveness in global product development, the Polhem Laboratory at Luleå University of Technology, Sweden, and the Center for Design Research at Stanford University, USA, have created the concept of Distributed Team Innovation (DTI). The overall aim of the DTI framework is to decrease the negative impact of geographic distance on product development efforts and to further enhance the current advantages of worldwide, multidisciplinary collaboration. The DTI ...

  4. DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.

    Science.gov (United States)

    Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien

    2017-09-01

    Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially these computations become a bottleneck because the massive number of entities makes space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing a key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.
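
The key-value pair paradigm used for distributed storyline generation can be illustrated with a toy map/reduce pass that links entity pairs co-occurring in multiple documents. The function names and the two-document support threshold are hypothetical; this is not DISCRN's actual ConceptSearch.

```python
from collections import defaultdict
from itertools import combinations

def map_phase(doc_id, entities):
    """Map: emit a key-value pair (entity pair → document id) for each
    pair of entities observed together in one document."""
    return [(tuple(sorted(pair)), doc_id)
            for pair in combinations(set(entities), 2)]

def reduce_phase(pairs):
    """Reduce: group by entity pair; keep pairs supported by 2+ documents
    as candidate storyline edges."""
    grouped = defaultdict(set)
    for key, doc_id in pairs:
        grouped[key].add(doc_id)
    return {key: docs for key, docs in grouped.items() if len(docs) >= 2}

docs = {"t1": ["acme", "bob"], "t2": ["bob", "acme"], "t3": ["carol"]}
emitted = [kv for d, ents in docs.items() for kv in map_phase(d, ents)]
print(reduce_phase(emitted))  # links the acme–bob pair seen in t1 and t2
```

Because each map task touches only its own documents and the shuffle is keyed by entity pair, both phases parallelize across nodes, which is the point of the key-value formulation.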

  5. A planning and analysis framework for evaluating distributed generation and utility strategies

    International Nuclear Information System (INIS)

    Ault, Graham W.

    2000-01-01

    The numbers of smaller-scale distributed power generation units connected to the distribution networks of electricity utilities in the UK and elsewhere have grown significantly in recent years. Numerous economic and political drivers have stimulated this growth and continue to provide the environment for future growth in distributed generation. The simple fact that distributed generation is independent of the distribution utility complicates planning and operational tasks for the distribution network. The uncertainty relating to the number, location and type of distributed generating units to connect to the distribution network in the future makes distribution planning a particularly difficult activity. This thesis concerns the problem of distribution network and business planning in the era of distributed generation. A distributed generation strategic analysis framework is proposed to provide the required analytical capability and the planning and decision-making framework to enable distribution utilities to deal effectively with the challenges and opportunities presented to them by distributed generation. The distributed generation strategic analysis framework is based on the best features of modern planning and decision-making methodologies and facilitates scenario-based analysis across many utility strategic options and uncertainties. Case studies are presented and assessed to clearly illustrate the potential benefits of such an approach to distributed generation planning in the UK electricity supply industry. (author)

  6. A framework for plasticity implementation on the SpiNNaker neural architecture.

    Science.gov (United States)

    Galluppi, Francesco; Lagorce, Xavier; Stromatias, Evangelos; Pfeiffer, Michael; Plana, Luis A; Furber, Steve B; Benosman, Ryad B

    2014-01-01

    Many of the precise biological mechanisms of synaptic plasticity remain elusive, but simulations of neural networks have greatly enhanced our understanding of how specific global functions arise from the massively parallel computation of neurons and local Hebbian or spike-timing-dependent plasticity rules. For simulating large portions of neural tissue, this has created an increasingly strong need for large-scale simulations of plastic neural networks on special-purpose hardware platforms, because synaptic transmissions and updates are poorly matched to the computing style supported by current architectures. Because of the great diversity of biological plasticity phenomena and the corresponding diversity of models, there is a great need for testing various hypotheses about plasticity before committing to one hardware implementation. Here we present a novel framework for investigating different plasticity approaches on the SpiNNaker distributed digital neural simulation platform. The key innovation of the proposed architecture is to exploit the reconfigurability of the ARM processors inside SpiNNaker, dedicating a subset of them exclusively to processing synaptic plasticity updates, while the rest perform the usual neural and synaptic simulations. We demonstrate the flexibility of the proposed approach by showing the implementation of a variety of spike- and rate-based learning rules, including standard spike-timing-dependent plasticity (STDP), voltage-dependent STDP, and the rate-based BCM rule. We analyze their performance and validate them by running classical learning experiments in real time on a 4-chip SpiNNaker board. The result is an efficient, modular, flexible and scalable framework, which provides a valuable tool for the fast and easy exploration of learning models of very different kinds on the parallel and reconfigurable SpiNNaker system.
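
Of the learning rules listed above, standard pair-based STDP is the simplest to sketch: the weight change is a sum of exponential kernels of the pre/post spike-time differences, potentiating when the presynaptic spike precedes the postsynaptic one and depressing otherwise. The parameter values below are illustrative assumptions, not SpiNNaker defaults.

```python
import numpy as np

def stdp_weight_change(pre_spikes, post_spikes,
                       a_plus=0.01, a_minus=0.012, tau=20.0):
    """Sum pair-based STDP updates over all pre/post spike pairs:
    potentiate when pre precedes post, depress when post precedes pre
    (exponential kernels; spike times and tau in ms)."""
    dw = 0.0
    for t_pre in pre_spikes:
        for t_post in post_spikes:
            dt = t_post - t_pre
            if dt > 0:
                dw += a_plus * np.exp(-dt / tau)    # causal pair: LTP
            elif dt < 0:
                dw -= a_minus * np.exp(dt / tau)    # anti-causal pair: LTD
    return dw

# Pre fires 5 ms before post → net potentiation.
print(stdp_weight_change([10.0], [15.0]) > 0)  # → True
```

Offloading exactly this kind of pairwise update to dedicated cores, while other cores run the neuron dynamics, is the division of labor the abstract describes.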

  7. Communication Optimizations for a Wireless Distributed Prognostic Framework

    Science.gov (United States)

    Saha, Sankalita; Saha, Bhaskar; Goebel, Kai

    2009-01-01

    Distributed architecture for prognostics is an essential step in prognostic research in order to enable feasible real-time system health management. Communication overhead is an important design problem for such systems. In this paper we focus on communication issues faced in the distributed implementation of an important class of algorithms for prognostics: particle filters. In spite of being computation and memory intensive, particle filters lend themselves well to distributed implementation except for one significant step: resampling. We propose a new resampling scheme, called parameterized resampling, that attempts to reduce communication between collaborating nodes in a distributed wireless sensor network. Analysis and comparison with relevant resampling schemes are also presented. A battery health management system is used as a target application. Our proposed resampling scheme performs significantly better than existing schemes, reducing both the communication message length and the total number of communication messages exchanged while not compromising prediction accuracy and precision. Future work will explore the effects of the new resampling scheme on the overall computational performance of the whole system as well as full implementation of the new schemes on the Sun SPOT devices. Exploring different network architectures for efficient communication is an important future research direction as well.
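
Resampling, the step highlighted above as the communication bottleneck, is commonly implemented as systematic resampling, which needs only one shared uniform draw per resampling step — an attractive property for distributed nodes. The sketch below is a generic textbook version, not the paper's parameterized resampling scheme; the example weights are assumptions.

```python
import numpy as np

def systematic_resample(weights, rng=None):
    """Systematic resampling: one uniform draw positions all N selection
    points, so collaborating nodes only need to share a single random
    number to resample consistently. `weights` must sum to 1."""
    rng = rng or np.random.default_rng()
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n   # evenly spaced points
    cumulative = np.cumsum(weights)
    cumulative[-1] = 1.0                            # guard against round-off
    return np.searchsorted(cumulative, positions)   # selected particle indices

weights = np.array([0.1, 0.1, 0.7, 0.1])
idx = systematic_resample(weights, np.random.default_rng(42))
print(idx)  # the heavy particle (index 2) is selected multiple times
```

Because the selection points are deterministic given the one shared draw, each node can resample its local particles without exchanging per-particle random numbers, which is the communication saving at stake.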

  8. Implementation of Perioperative Music Using the Consolidated Framework for Implementation Research.

    Science.gov (United States)

    Carter, Jessica E; Pyati, Srinivas; Kanach, Frances A; Maxwell, Ann Miller W; Belden, Charles M; Shea, Christopher M; Van de Ven, Thomas; Thompson, Jillian; Hoenig, Helen; Raghunathan, Karthik

    2018-06-12

    Complementary integrative health therapies have a perioperative role in reducing pain, analgesic use and anxiety, and in increasing patient satisfaction. However, long implementation lags have been quantified. The Consolidated Framework for Implementation Research (CFIR) can help mitigate this translational problem. We reviewed evidence for several nonpharmacological treatments (CFIR domain: characteristics of interventions) and studied external context and organizational readiness for change by surveying providers at 11 Veterans Affairs (VA) hospitals (domains: outer and inner settings). We asked patients about their willingness to receive music and studied the association between this and known risk factors for opioid use (domain: characteristics of individuals). We implemented a protocol for the perioperative use of digital music players loaded with veteran-preferred playlists and evaluated its penetration in a subgroup of patients undergoing joint replacements over a 6-month period (domain: process of implementation). We then extracted data on postoperative recovery time and other outcomes, comparing them with historic and contemporary cohorts. Evidence varied from strong and direct for perioperative music and acupuncture, to modest or weak and indirect for mindfulness, yoga, and tai chi, respectively. Readiness-for-change surveys completed by 97 perioperative providers showed overall positive scores (mean >0 on a scale from -2 to +2, equivalent to >2.5 on the 5-point Likert scale). Readiness was higher at Durham (+0.47) than at most other VA hospitals (range +0.05 to +0.63). Of 3307 veterans asked about willingness to receive music, approximately 68% (n = 2252) answered "yes." In multivariable analyses, a positive response (acceptability) was independently predicted by younger age and higher mean preoperative pain scores (>4 out of 10 over 90 days before admission), factors associated with opioid overuse.
Penetration was modest in the targeted subset (39

  9. How can we improve guideline use? A conceptual framework of implementability

    Directory of Open Access Journals (Sweden)

    Lemieux-Charles Louise

    2011-03-01

    Full Text Available Abstract Background Guidelines continue to be underutilized, and a variety of strategies to improve their use have been suboptimal. Modifying guideline features represents an alternative, but untested way to promote their use. The purpose of this study was to identify and define features that facilitate guideline use, and examine whether and how they are included in current guidelines. Methods A guideline implementability framework was developed by reviewing the implementation science literature. We then examined whether guidelines included these, or additional implementability elements. Data were extracted from publicly available high quality guidelines reflecting primary and institutional care, reviewed independently by two individuals, who through discussion resolved conflicts, then by the research team. Results The final implementability framework included 22 elements organized in the domains of adaptability, usability, validity, applicability, communicability, accommodation, implementation, and evaluation. Data were extracted from 20 guidelines on the management of diabetes, hypertension, leg ulcer, and heart failure. Most contained a large volume of graded, narrative evidence, and tables featuring complementary clinical information. Few contained additional features that could improve guideline use. These included alternate versions for different users and purposes, summaries of evidence and recommendations, information to facilitate interaction with and involvement of patients, details of resource implications, and instructions on how to locally promote and monitor guideline use. There were no consistent trends by guideline topic. Conclusions Numerous opportunities were identified by which guidelines could be modified to support various types of decision making by different users. New governance structures may be required to accommodate development of guidelines with these features. Further research is needed to validate the proposed

  10. Creating a Framework for Applying OAIS to Distributed Digital Preservation

    DEFF Research Database (Denmark)

    Zierau, Eld; Schultz, Matt

    2013-01-01

    This paper describes work being done towards a Framework for Applying the Reference Model for an Open Archival Information System (OAIS) to Distributed Digital Preservation (DDP). Such a Framework will be helpful for future analyses and/or audits of repositories that are performing digital...

  11. Policy implementation in practice: the case of national service frameworks in general practice.

    Science.gov (United States)

    Checkland, Kath; Harrison, Stephen

    2004-10-01

    National Service Frameworks are an integral part of the government's drive to 'modernise' the NHS, intended to standardise both clinical care and the design of the services used to deliver that clinical care. This article uses evidence from qualitative case studies in three general practices to illustrate the difficulties associated with the implementation of such top-down guidelines and models of service. In these studies it was found that, while there had been little explicit activity directed at implementation overall, the National Service Framework for coronary heart disease had in general fared better than that for older people. Gunn's notion of 'perfect implementation' is used to make sense of the findings.

  12. Supporting Collective Inquiry: A Technology Framework for Distributed Learning

    Science.gov (United States)

    Tissenbaum, Michael

    This design-based study describes the implementation and evaluation of a technology framework to support smart classrooms and Distributed Technology Enhanced Learning (DTEL) called SAIL Smart Space (S3). S3 is an open-source technology framework designed to support students engaged in inquiry investigations as a knowledge community. To evaluate the effectiveness of S3 as a generalizable technology framework, a curriculum named PLACE (Physics Learning Across Contexts and Environments) was developed to support two grade-11 physics classes (n = 22; n = 23) engaged in a multi-context inquiry curriculum based on the Knowledge Community and Inquiry (KCI) pedagogical model. This dissertation outlines three initial design studies that established a set of design principles for DTEL curricula and related technology infrastructures. These principles guided the development of PLACE, a twelve-week inquiry curriculum in which students drew upon their community-generated knowledge base as a source of evidence for solving ill-structured physics problems based on the physics of Hollywood movies. During the culminating smart classroom activity, the S3 framework played a central role in orchestrating student activities, including managing the flow of materials and students using real-time data mining and intelligent agents that responded to emergent class patterns. S3 supported students' construction of knowledge through the use of individual, collective and collaborative scripts and technologies, including tablets and interactive large-format displays. Aggregate and real-time ambient visualizations helped the teacher act as a wondering facilitator, supporting students in their inquiry where needed. A teacher orchestration tablet gave the teacher some control over the flow of the scripted activities, and alerted him to critical moments for intervention.
Analysis focuses on S3's effectiveness in supporting students' inquiry across multiple learning contexts and scales of time, and in

  13. Identifying a practice-based implementation framework for sustainable interventions for improving the evolving working environment

    DEFF Research Database (Denmark)

    Højberg, Helene; Nørregaard Rasmussen, Charlotte Diana; Osborne, Richard H.

    2018-01-01

    Our aim was to identify implementation components for sustainable working environment interventions in the nursing assistant sector to generate a framework to optimize the implementation of workplace improvement initiatives. The implementation framework was informed by: 1) an industry advisory group, 2) interviews with key stakeholders, 3) concept mapping workshops, and 4) an e-mail survey. Thirty-five stakeholders were interviewed and contributed in the concept mapping workshops. Eleven implementation components were derived across four domains: 1) a supportive organizational platform, 2) an engaged workplace with mutual goals, 3) the intervention is sustainably fitted to the workplace, and 4) the intervention is an attractive choice. The highest rated component was “Engaged and Active Management” (mean 4.1) and the lowest rated was “Delivered in an Attractive Form” (mean 2.8). The framework

  14. A Generalized Cauchy Distribution Framework for Problems Requiring Robust Behavior

    Directory of Open Access Journals (Sweden)

    Carrillo, Rafael E.

    2010-01-01

    Full Text Available Statistical modeling is at the heart of many engineering problems. The importance of statistical modeling emanates not only from the desire to accurately characterize stochastic events, but also from the fact that distributions are the central models utilized to derive sample processing theories and methods. The generalized Cauchy distribution (GCD) family has a closed-form pdf expression across the whole family as well as algebraic tails, which makes it suitable for modeling many real-life impulsive processes. This paper develops a GCD theory-based approach that allows challenging problems to be formulated in a robust fashion. Notably, the proposed framework subsumes generalized Gaussian distribution (GGD) family-based developments, thereby guaranteeing performance improvements over traditional GCD-based problem formulation techniques. This robust framework can be adapted to a variety of applications in signal processing. As examples, we formulate four practical applications under this framework: (1) filtering for power line communications, (2) estimation in sensor networks with noisy channels, (3) reconstruction methods for compressed sensing, and (4) fuzzy clustering.
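
One member of the GCD family with the algebraic tails described above is the ordinary Cauchy distribution, and minimizing its negative log-likelihood in the location parameter yields the sample myriad, a classic robust estimator that ignores gross outliers. The grid search below is an illustrative sketch with assumed parameters, not the paper's framework.

```python
import numpy as np

def sample_myriad(x, gamma=1.0, grid=None):
    """Robust location estimate: minimize the Cauchy negative
    log-likelihood  Σ log(γ² + (x_i − θ)²)  over a grid of θ values."""
    x = np.asarray(x, dtype=float)
    if grid is None:
        grid = np.linspace(x.min(), x.max(), 2001)
    cost = np.array([np.sum(np.log(gamma**2 + (x - t)**2)) for t in grid])
    return grid[np.argmin(cost)]

data = [1.0, 1.2, 0.9, 1.1, 50.0]   # tight cluster plus one gross outlier
print(sample_myriad(data))          # stays near 1, unlike the mean (≈10.8)
```

Because the log cost grows only logarithmically in the residual, a single huge outlier barely moves the estimate — the robustness property that motivates GCD-based formulations.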

  15. Teachers implementing context-based teaching materials : a framework for case-analysis in chemistry

    NARCIS (Netherlands)

    Vos, M.A.J.; Taconis, R.; Jochems, W.M.G.; Pilot, A.

    2010-01-01

    We present a framework for analysing the interplay between context-based teaching material and teachers, and for evaluating the adequacy of the resulting implementation of context-based pedagogy in chemistry classroom practice. The development of the framework is described, including an account of

  16. A modified theoretical framework to assess implementation fidelity of adaptive public health interventions.

    Science.gov (United States)

    Pérez, Dennis; Van der Stuyft, Patrick; Zabala, María del Carmen; Castro, Marta; Lefèvre, Pierre

    2016-07-08

    One of the major debates in implementation research revolves around fidelity and adaptation. Fidelity is the degree to which an intervention is implemented as intended by its developers; it is meant to ensure that the intervention maintains its intended effects. Adaptation is the process of implementers or users bringing changes to the original design of an intervention. Depending on the nature of the modifications brought, adaptation could either be potentially positive or could carry the risk of threatening the theoretical basis of the intervention, resulting in a negative effect on expected outcomes. Adaptive interventions are those for which adaptation is allowed or even encouraged. Classical fidelity dimensions and conceptual frameworks do not address the issue of how to adapt an intervention while still maintaining its effectiveness. We support the idea that fidelity and adaptation co-exist and that adaptations can impact either positively or negatively on the intervention's effectiveness. For adaptive interventions, research should answer the question of how an adequate fidelity-adaptation balance can be reached. One way to address this issue is by looking systematically at the aspects of an intervention that are being adapted. We conducted fidelity research on the implementation of an empowerment strategy for dengue prevention in Cuba. In view of the adaptive nature of the strategy, we anticipated that the classical fidelity dimensions would be of limited use for assessing adaptations. The typology we used in the assessment (implemented, not implemented, modified, or added components of the strategy) also had limitations: it did not allow us to answer the question of which modifications introduced in the strategy contributed to or detracted from outcomes. We confronted our empirical research with existing literature on fidelity, and as a result, considered that the framework for implementation fidelity proposed by Carroll et al.
in 2007 could potentially meet

  17. Implementing change in primary care practices using electronic medical records: a conceptual framework.

    Science.gov (United States)

    Nemeth, Lynne S; Feifer, Chris; Stuart, Gail W; Ornstein, Steven M

    2008-01-16

    Implementing change in primary care is difficult, and little practical guidance is available to assist small primary care practices. Methods to structure care and develop new roles are often needed to implement an evidence-based practice that improves care. This study explored the process of change used to implement clinical guidelines for primary and secondary prevention of cardiovascular disease in primary care practices that used a common electronic medical record (EMR). Multiple conceptual frameworks informed the design of this study, which sought to explain the complex phenomena of implementing change in primary care practice. Qualitative methods were used to examine the processes of change that practice members used to implement the guidelines. Purposive sampling in eight primary care practices within the Practice Partner Research Network-Translating Research into Practice (PPRNet-TRIP II) clinical trial yielded 28 staff members and clinicians who were interviewed regarding how change in practice occurred while implementing clinical guidelines for primary and secondary prevention of cardiovascular disease and stroke. A conceptual framework for implementing clinical guidelines into primary care practice was developed through this research. Seven concepts and their relationships were modelled within this framework: leaders setting a vision with clear goals for staff to embrace; involving the team to enable the goals and vision for the practice to be achieved; enhancing communication systems to reinforce goals for patient care; developing the team to enable the staff to contribute toward practice improvement; taking small steps, encouraging practices' tests of small changes in practice; assimilating the electronic medical record to maximize clinical effectiveness, enhancing practices' use of the electronic tool they have invested in for patient care improvement; and providing feedback within a culture of improvement, leading to an iterative cycle of goal setting

  18. Analyse of The Legal Framework in Colombia for implementation of Bioprospecting Practices

    International Nuclear Information System (INIS)

    Duarte, Oscar; Velho Lea

    2008-01-01

    The practice of bioprospecting is inherently linked with the traditional knowledge and practices of local communities in the South as well as with the commercial activities of industries (e.g., the pharmaceutical sector, agriculture) in the North. A series of actors operate at this interface, such as Non-Governmental Organizations (NGOs), research centers, universities, science and technology sponsor institutions and the State. As these actors have divergent interests and powers of negotiation, an appropriate regulatory framework is necessary to regulate their interaction. This paper analyzes the existing legal framework in a mega-diverse country like Colombia for the implementation of bioprospecting practices. The research consisted of two key components: (i) a review of the state of the art of bioprospecting; and (ii) fieldwork in Colombia, which consisted of analysis of information and genetic resources related to bioprospecting, participation in the implementation of a legal framework for bioprospecting practices, and interviews with Colombian professionals in the field of biodiversity conservation. Our research determined that: (i) national authorities encounter a multitude of difficulties in implementing a legal framework in Colombia, especially the Andean regional normativity; and (ii) the execution of research projects related to bioprospecting in Colombia faces numerous challenges

  19. The Development of a Practical Framework for the Implementation of JIT Manufacturing

    OpenAIRE

    Hallihan, A.

    1996-01-01

    This research develops a framework to guide practitioners through the process of implementing Just In Time manufacturing in the commercial aircraft manufacturing industry. The scope of Just In Time manufacturing is determined through an analysis of its evolution and current use. Current approaches to its implementation are reviewed and shortcomings are identified. A requirement to allow practitioners to tailor the approach to the implementation of Just In Time manufacturing, ...

  20. A Modeling Framework for Schedulability Analysis of Distributed Avionics Systems

    DEFF Research Database (Denmark)

    Han, Pujie; Zhai, Zhengjun; Nielsen, Brian

    2018-01-01

    This paper presents a modeling framework for schedulability analysis of distributed integrated modular avionics (DIMA) systems that consist of spatially distributed ARINC-653 modules connected by a unified AFDX network. We model a DIMA system as a set of stopwatch automata (SWA) in UPPAAL...

  1. Towards a resilience management framework for complex enterprise systems upgrade implementation

    Science.gov (United States)

    Teoh, Say Yen; Yeoh, William; Zadeh, Hossein Seif

    2017-05-01

    The lack of knowledge of how resilience management supports enterprise system (ES) projects accounts for the failure of firms to leverage their investments in costly ES implementations. Using a structured-pragmatic-situational (SPS) case study research approach, this paper reports on an investigation into the resilience management of a large utility company as it implemented an ES upgrade. Drawing on the literature and on the case study findings, we developed a process-based resilience management framework that involves three strategies (developing situation awareness, demystifying threats, and executing restoration plans) and four organisational capabilities that transform resilience management concepts into practices. We identified the crucial phases of ES upgrade implementation and developed indicators for how different strategies and capabilities of resilience management can assist managers at different stages of an ES upgrade. This research advances the state of existing knowledge by providing specific and verifiable propositions for attaining a state of resilience, the knowledge being grounded in the empirical reality of a case study. Moreover, the framework offers ES practitioners a roadmap to better identify appropriate responses and levels of preparedness.

  2. A research framework for the development and implementation of interventions preventing work-related musculoskeletal disorders.

    Science.gov (United States)

    van der Beek, Allard J; Dennerlein, Jack T; Huysmans, Maaike A; Mathiassen, Svend Erik; Burdorf, Alex; van Mechelen, Willem; van Dieën, Jaap H; Frings-Dresen, Monique Hw; Holtermann, Andreas; Janwantanakul, Prawit; van der Molen, Henk F; Rempel, David; Straker, Leon; Walker-Bone, Karen; Coenen, Pieter

    2017-11-01

    Objectives Work-related musculoskeletal disorders (MSD) are highly prevalent and put a large burden on (working) society. Primary prevention of work-related MSD focuses often on physical risk factors (such as manual lifting and awkward postures) but has not been too successful in reducing the MSD burden. This may partly be caused by insufficient knowledge of etiological mechanisms and/or a lack of adequately feasible interventions (theory failure and program failure, respectively), possibly due to limited integration of research disciplines. A research framework could link research disciplines thereby strengthening the development and implementation of preventive interventions. Our objective was to define and describe such a framework for multi-disciplinary research on work-related MSD prevention. Methods We described a framework for MSD prevention research, partly based on frameworks from other research fields (ie, sports injury prevention and public health). Results The framework is composed of a repeated sequence of six steps comprising the assessment of (i) incidence and severity of MSD, (ii) risk factors for MSD, and (iii) underlying mechanisms; and the (iv) development, (v) evaluation, and (vi) implementation of preventive intervention(s). Conclusions In the present framework for optimal work-related MSD prevention, research disciplines are linked. This framework can thereby help to improve theories and strengthen the development and implementation of prevention strategies for work-related MSD.

  3. A Test Generation Framework for Distributed Fault-Tolerant Algorithms

    Science.gov (United States)

    Goodloe, Alwyn; Bushnell, David; Miner, Paul; Pasareanu, Corina S.

    2009-01-01

    Heavyweight formal methods such as theorem proving have been successfully applied to the analysis of safety critical fault-tolerant systems. Typically, the models and proofs performed during such analysis do not inform the testing process of actual implementations. We propose a framework for generating test vectors from specifications written in the Prototype Verification System (PVS). The methodology uses a translator to produce a Java prototype from a PVS specification. Symbolic (Java) PathFinder is then employed to generate a collection of test cases. A small example is employed to illustrate how the framework can be used in practice.
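The pipeline described above (PVS specification, generated Java prototype, then symbolic execution to produce test cases) can be illustrated with a toy stand-in. The sketch below is illustrative only: the three-way majority voter, its name, and the exhaustive input enumeration used here in place of Symbolic (Java) PathFinder are assumptions, not artifacts from the paper.

```python
from itertools import product

# Executable stand-in for a fault-tolerant specification: a trivial
# 3-way majority voter, a common building block in such systems.
def majority(a, b, c):
    """Voted output: agrees with at least two of the three inputs."""
    return (a and b) or (a and c) or (b and c)

# Generate a test vector (inputs, expected output) for every input
# combination; a symbolic tool would instead cover each program path once
# rather than enumerating the whole input space.
test_vectors = [((a, b, c), majority(a, b, c))
                for a, b, c in product([False, True], repeat=3)]

assert len(test_vectors) == 8
# Sanity property the vectors should witness: any single faulty input
# is out-voted by the two correct ones.
assert all(majority(good, good, bad) == good
           for good in (False, True) for bad in (False, True))
```

For realistic models the input space is far too large to enumerate, which is exactly why the paper turns to symbolic execution: one test per feasible program path instead of one per input combination.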

  4. Assessment of school wellness policies implementation by benchmarking against diffusion of innovation framework.

    Science.gov (United States)

    Harriger, Dinah; Lu, Wenhua; McKyer, E Lisako J; Pruitt, Buzz E; Goodson, Patricia

    2014-04-01

    The School Wellness Policy (SWP) mandate marks one of the first innovative and extensive efforts of the US government to address the child obesity epidemic and the influence of the school environment on child health. However, no systematic review has been conducted to examine the implementation of the mandate. This study examines the literature on SWP implementation using the Diffusion of Innovations Theory as a framework. Empirically based literature on SWP was systematically searched and analyzed. A theory-driven approach was used to categorize the articles by 4 diffusion stages: restructuring/redefining, clarifying, routinizing, and multiple stages. Twenty-one studies were identified, and 3 key characteristics of the reviewed literature were captured: (1) uniformity in methodology, (2) role of context in analyzing policy implementation, and (3) lack of information related to policy clarification. Over half of the studies were published by the same set of authors, and only 1 study employed a pure qualitative methodology. Only 2 articles include an explicit theoretical framework to study theory-driven constructs related to SWP implementation. Policy implementation research can inform the policy process. Therefore, it is essential that policy implementation is measured accurately. Failing to clearly define implementation constructs may result in misguided conclusions. © 2014, American School Health Association.

  5. A QDWH-Based SVD Software Framework on Distributed-Memory Manycore Systems

    KAUST Repository

    Sukkari, Dalal

    2017-01-01

    This paper presents a high performance software framework for computing a dense SVD on distributed-memory manycore systems. Originally introduced by Nakatsukasa et al. (Nakatsukasa et al. 2010; Nakatsukasa and Higham 2013), the SVD solver relies on the polar decomposition using the QR Dynamically-Weighted Halley algorithm (QDWH). Although the QDWH-based SVD algorithm performs a significant amount of extra floating-point operations compared to the traditional SVD with the one-stage bidiagonal reduction, the inherent high level of concurrency associated with Level 3 BLAS compute-bound kernels ultimately compensates for the arithmetic complexity overhead. Using the ScaLAPACK two-dimensional block cyclic data distribution with a rectangular processor topology, the resulting QDWH-SVD further reduces excessive communications during the panel factorization, while increasing the degree of parallelism during the update of the trailing submatrix, as opposed to relying on the default square processor grid. After detailing the algorithmic complexity and the memory footprint of the algorithm, we conduct a thorough performance analysis and study the impact of the grid topology on the performance by looking at the communication and computation profiling trade-offs. We report performance results against state-of-the-art existing QDWH software implementations (e.g., Elemental) and their SVD extensions on large-scale distributed-memory manycore systems based on commodity Intel x86 Haswell processors and the Knights Landing (KNL) architecture. The QDWH-SVD framework achieves up to 3-fold and 8-fold speedups on the Haswell- and KNL-based platforms, respectively, against ScaLAPACK PDGESVD and turns out to be a competitive alternative for well- and ill-conditioned matrices. Finally, we derive a performance model from these empirical results. Our QDWH-based polar decomposition and its SVD extension are freely available at https://github.com/ecrc/qdwh.git and https
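The polar-decomposition route to the SVD that QDWH-SVD follows can be sketched in a few lines of NumPy. The sketch below substitutes a plain Newton iteration for the dynamically weighted Halley (QDWH) iteration and ignores all of the distributed-memory machinery, so it is illustrative only, not the paper's implementation.

```python
import numpy as np

def polar_newton(A, tol=1e-12, max_iter=100):
    """Polar decomposition A = U @ H of a square nonsingular A via the
    Newton iteration X <- (X + X^{-T}) / 2 (a simple stand-in for QDWH)."""
    X = A.copy()
    for _ in range(max_iter):
        X_next = 0.5 * (X + np.linalg.inv(X).T)
        if np.linalg.norm(X_next - X, "fro") < tol * np.linalg.norm(X, "fro"):
            X = X_next
            break
        X = X_next
    U = X                   # orthogonal polar factor
    H = U.T @ A             # Hermitian positive-semidefinite factor
    H = 0.5 * (H + H.T)     # symmetrize against round-off
    return U, H

rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))

# Step 1: polar decomposition A = U_p @ H.
U_p, H = polar_newton(A)
# Step 2: eigendecomposition of H yields singular values / right vectors.
w, V = np.linalg.eigh(H)            # eigenvalues in ascending order
order = np.argsort(w)[::-1]
S, V = w[order], V[:, order]        # descending singular values
U = U_p @ V                         # left singular vectors

assert np.allclose(U @ np.diag(S) @ V.T, A)
assert np.allclose(np.sort(S)[::-1], np.linalg.svd(A, compute_uv=False))
```

The appeal of this route, as the abstract notes, is that the heavy work (matrix inverses or QR factorizations inside the polar iteration, plus a dense eigensolve) maps onto highly concurrent Level 3 BLAS kernels.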

  6. Validation of the theoretical domains framework for use in behaviour change and implementation research.

    Science.gov (United States)

    Cane, James; O'Connor, Denise; Michie, Susan

    2012-04-24

    An integrative theoretical framework, developed for cross-disciplinary implementation and other behaviour change research, has been applied across a wide range of clinical situations. This study tests the validity of this framework. Validity was investigated by behavioural experts sorting 112 unique theoretical constructs using closed and open sort tasks. The extent of replication was tested by Discriminant Content Validation and Fuzzy Cluster Analysis. There was good support for a refinement of the framework comprising 14 domains of theoretical constructs (average silhouette value 0.29): 'Knowledge', 'Skills', 'Social/Professional Role and Identity', 'Beliefs about Capabilities', 'Optimism', 'Beliefs about Consequences', 'Reinforcement', 'Intentions', 'Goals', 'Memory, Attention and Decision Processes', 'Environmental Context and Resources', 'Social Influences', 'Emotions', and 'Behavioural Regulation'. The refined Theoretical Domains Framework has a strengthened empirical base and provides a method for theoretically assessing implementation problems, as well as professional and other health-related behaviours as a basis for intervention development.

  7. Knowledge Framework Implementation with Multiple Architectures - 13090

    Energy Technology Data Exchange (ETDEWEB)

    Upadhyay, H.; Lagos, L.; Quintero, W.; Shoffner, P. [Applied Research Center, Florida International University, Miami, FL 33174 (United States); DeGregory, J. [Office of D and D and Facility Engineering, Environmental Management, Department of Energy (United States)

    2013-07-01

    Multiple kinds of knowledge management systems are operational in public and private enterprises, in large and small organizations, and under a variety of business models, which makes the design, implementation and operation of integrated knowledge systems very difficult. In recent years, there have been sweeping advancements in information technology, leading to the development of sophisticated frameworks and architectures. These platforms can be used to develop integrated knowledge management systems that provide a common platform for sharing knowledge across the enterprise, thereby reducing operational inefficiencies and delivering cost savings. This paper discusses a knowledge framework and architecture that can be used for system development, and its application to the real-life needs of the nuclear industry. A case study of deactivation and decommissioning (D and D) is discussed with the Knowledge Management Information Tool platform and framework. D and D work is a high-priority activity across the Department of Energy (DOE) complex. Subject matter specialists (SMS) associated with DOE sites, the Energy Facility Contractors Group (EFCOG) and the D and D community have gained extensive knowledge and experience over the years in the cleanup of the legacy waste from the Manhattan Project. To prevent this D and D knowledge and expertise from being lost over time to the evolving and aging workforce, DOE and the Applied Research Center (ARC) at Florida International University (FIU) proposed to capture and maintain this valuable information in a universally available and easily usable system. (authors)

  8. Knowledge Framework Implementation with Multiple Architectures - 13090

    International Nuclear Information System (INIS)

    Upadhyay, H.; Lagos, L.; Quintero, W.; Shoffner, P.; DeGregory, J.

    2013-01-01

    Multiple kinds of knowledge management systems are operational in public and private enterprises, in large and small organizations, and under a variety of business models, which makes the design, implementation and operation of integrated knowledge systems very difficult. In recent years, there have been sweeping advancements in information technology, leading to the development of sophisticated frameworks and architectures. These platforms can be used to develop integrated knowledge management systems that provide a common platform for sharing knowledge across the enterprise, thereby reducing operational inefficiencies and delivering cost savings. This paper discusses a knowledge framework and architecture that can be used for system development, and its application to the real-life needs of the nuclear industry. A case study of deactivation and decommissioning (D and D) is discussed with the Knowledge Management Information Tool platform and framework. D and D work is a high-priority activity across the Department of Energy (DOE) complex. Subject matter specialists (SMS) associated with DOE sites, the Energy Facility Contractors Group (EFCOG) and the D and D community have gained extensive knowledge and experience over the years in the cleanup of the legacy waste from the Manhattan Project. To prevent this D and D knowledge and expertise from being lost over time to the evolving and aging workforce, DOE and the Applied Research Center (ARC) at Florida International University (FIU) proposed to capture and maintain this valuable information in a universally available and easily usable system. (authors)

  9. Nash Bargaining Game-Theoretic Framework for Power Control in Distributed Multiple-Radar Architecture Underlying Wireless Communication System

    Directory of Open Access Journals (Sweden)

    Chenguang Shi

    2018-04-01

    This paper presents a novel Nash bargaining solution (NBS)-based cooperative game-theoretic framework for power control in a distributed multiple-radar architecture underlying a wireless communication system. Our primary objective is to minimize the total power consumption of the distributed multiple-radar system (DMRS) with the protection of the wireless communication user's transmission, while guaranteeing each radar's target detection requirement. A unified cooperative game-theoretic framework is proposed for the optimization problem, where interference power constraints (IPCs) are imposed to protect the communication user's transmission, and a minimum signal-to-interference-plus-noise ratio (SINR) requirement is employed to provide reliable target detection for each radar. The existence, uniqueness and fairness of the NBS to this cooperative game are proven. An iterative Nash bargaining power control algorithm with low computational complexity and fast convergence is developed and is shown to converge to a Pareto-optimal equilibrium for the cooperative game model. Numerical simulations and analyses are further presented to highlight the advantages and testify to the efficiency of our proposed cooperative game algorithm. It is demonstrated that the distributed algorithm is effective for power control and could protect the communication system with limited implementation overhead.
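The abstract does not reproduce the NBS algorithm itself. As a rough illustration of iterative, distributed power control with per-node SINR targets (the kind of fixed-point convergence such schemes rely on), the classical update below scales each transmitter's power by its target-to-achieved SINR ratio; the channel gains, targets, and noise power are made-up values, and this is not the paper's Nash bargaining algorithm.

```python
import numpy as np

# Hypothetical 3-node setup: G[i, j] is the channel gain from transmitter j
# to receiver i; gamma holds per-node SINR targets; sigma2 is noise power.
G = np.array([[1.00, 0.10, 0.20],
              [0.15, 1.00, 0.10],
              [0.20, 0.12, 1.00]])
gamma = np.array([3.0, 2.5, 2.0])
sigma2 = 0.01

def sinr(p, i):
    interference = sum(G[i, j] * p[j] for j in range(len(p)) if j != i)
    return G[i, i] * p[i] / (interference + sigma2)

# Classical distributed power control: each node only needs its own
# achieved SINR, yet the iteration converges (when the targets are
# feasible) to the minimal power vector meeting every target.
p = np.ones(3)
for _ in range(200):
    p = np.array([p[i] * gamma[i] / sinr(p, i) for i in range(3)])

assert all(abs(sinr(p, i) - gamma[i]) < 1e-6 for i in range(3))
```

A game-theoretic scheme like the paper's layers bargaining over such an update: the IPCs and fairness criteria shape *which* feasible power vector the nodes settle on.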

  10. The Framework for KM Implementation in Product and Service Oriented SMEs: Evidence from Field Studies in Taiwan

    Directory of Open Access Journals (Sweden)

    Yao Chin Lin

    2015-03-01

    Knowledge management (KM) is a core competency that determines the success of small and medium-sized enterprises (SMEs) in this knowledge-based economy. Instead of competing on the basis of physical and financial capital, the success of SMEs is influenced by the knowledge, experience and skills of the owners and their employees. Unfortunately, many SMEs are still struggling with KM implementation due to the lack of a comprehensive KM framework. This study aims to identify enablers for KM success and build a framework for KM implementation in service- and product-oriented SMEs. Using multiple research methods, this study collects data from SMEs in Taiwan to support the suggested enablers and the reference KM framework. The suggested framework can provide useful assistance and guidance for holistic KM solutions. The K-object concept, which adopts the XML standard, may become a significant managerial and technical element in KM practice. The enhanced KM framework mandates every employee's participation in knowledge activities, not just some elite knowledge workers. The findings provide useful implications for researchers and practitioners, offering useful templates for implementing KM initiatives in different industries and a more comprehensive framework for KM implementation in different types of SMEs.

  11. Developing a Framework for Traceability Implementation in the Textile Supply Chain

    Directory of Open Access Journals (Sweden)

    Vijay Kumar

    2017-04-01

    Traceability has recently gained considerable attention in the textile industry. Traceability stands for information sharing about a product, including the product's history, specification, or location. With the involvement of globally dispersed actors in the textile supply chain, ensuring appropriate product quality with timely supplies is crucial for surviving in this industry with ever-increasing competition. Hence, it is of paramount importance for a supply chain actor to track every product and trace its history in the supply chain. In this context, this paper presents a framework to implement traceability in the textile supply chain. A systems approach has been followed, where first the usage requirements of traceability are defined, and then a framework for implementing intra-actor or internal traceability and inter-actor or external traceability is discussed. This article further presents a sequential diagram to demonstrate the interaction and information exchange between the actors in the supply chain when traceability information is requested. An example is also illustrated for a textile weaver, with data storage using a relational database management system and information exchange using XML. Finally, the article discusses challenges and future studies required to implement traceability in the textile supply chain.
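The internal/external split described above can be sketched as a relational store (internal traceability) that is queried and serialized to XML when a partner requests trace data (external traceability). The schema, element names, and lot identifiers below are illustrative assumptions, not the article's actual design.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical internal traceability store for a weaver: each fabric lot
# is linked to the yarn lots it consumed.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE fabric_lot (lot_id TEXT PRIMARY KEY, article TEXT,
                             produced_on TEXT);
    CREATE TABLE yarn_usage (fabric_lot TEXT REFERENCES fabric_lot(lot_id),
                             yarn_lot TEXT, supplier TEXT);
""")
conn.execute("INSERT INTO fabric_lot VALUES ('F-1001', 'denim-12oz', '2017-03-02')")
conn.executemany("INSERT INTO yarn_usage VALUES (?, ?, ?)",
                 [("F-1001", "Y-77", "SpinnerA"), ("F-1001", "Y-81", "SpinnerB")])

def export_trace(lot_id):
    """External traceability: answer a partner's request as an XML document."""
    root = ET.Element("traceabilityResponse", lot=lot_id)
    article, produced = conn.execute(
        "SELECT article, produced_on FROM fabric_lot WHERE lot_id = ?",
        (lot_id,)).fetchone()
    ET.SubElement(root, "product", article=article, producedOn=produced)
    for yarn_lot, supplier in conn.execute(
            "SELECT yarn_lot, supplier FROM yarn_usage WHERE fabric_lot = ?",
            (lot_id,)):
        ET.SubElement(root, "input", yarnLot=yarn_lot, supplier=supplier)
    return ET.tostring(root, encoding="unicode")

xml_doc = export_trace("F-1001")
assert "Y-77" in xml_doc and "SpinnerB" in xml_doc
```

Each upstream actor would expose the same kind of request/response interface, so a trace query can hop actor-to-actor through the chain, exactly the interaction the article's sequential diagram depicts.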

  12. A framework for distributed mixed-language scientific applications

    International Nuclear Information System (INIS)

    Quarrie, D.R.

    1996-01-01

    The Object Management Group has defined an architecture (CORBA) for distributed object applications based on an Object Request Broker and an Interface Definition Language (IDL). This project builds upon this architecture to establish a framework for the creation of mixed-language scientific applications. A prototype compiler has been written that generates FORTRAN 90 or Eiffel stubs and skeletons, along with the required C++ glue code, from an input IDL file that specifies object interfaces. This generated code can be used directly for non-distributed mixed-language applications, or in conjunction with the C++ code generated by a commercial IDL compiler for distributed applications. A feasibility study is presently underway to see whether a fully integrated software development environment for distributed, mixed-language applications can be created by modifying the back-end code generator of a commercial CASE tool to emit IDL. (author)

  13. A framework of quality improvement interventions to implement evidence-based practices for pressure ulcer prevention.

    Science.gov (United States)

    Padula, William V; Mishra, Manish K; Makic, Mary Beth F; Valuck, Robert J

    2014-06-01

    To enhance the learner's competence with knowledge about a framework of quality improvement (QI) interventions to implement evidence-based practices for pressure ulcer (PrU) prevention. This continuing education activity is intended for physicians and nurses with an interest in skin and wound care. After participating in this educational activity, the participant should be better able to: (1) summarize the process of creating and initiating the best-practice framework of QI for PrU prevention; and (2) identify the domains and QI interventions for the best-practice framework of QI for PrU prevention. Pressure ulcer (PrU) prevention is a priority issue in US hospitals. The National Pressure Ulcer Advisory Panel endorses an evidence-based practice (EBP) protocol to help prevent PrUs. Effective implementation of EBPs requires systematic change of existing care units. Quality improvement interventions offer a mechanism of change to existing structures in order to effectively implement EBPs for PrU prevention. The best-practice framework developed by Nelson et al. is a useful model of quality improvement interventions that targets process improvement in 4 domains: leadership, staff, information and information technology, and performance and improvement. At 2 academic medical centers, the best-practice framework was shown to physicians, nurses, and health services researchers. Their insight was used to modify the best-practice framework as a reference tool for quality improvement interventions in PrU prevention. The revised framework includes 25 elements across 4 domains. Many of these elements support EBPs for PrU prevention, such as updates in PrU staging and risk assessment. The best-practice framework offers a reference point for initiating a bundle of quality improvement interventions in support of EBPs. Hospitals and clinicians tasked with quality improvement efforts can use this framework to problem-solve PrU prevention and other critical issues.

  14. A Multi-Functional Fully Distributed Control Framework for AC Microgrids

    DEFF Research Database (Denmark)

    Shafiee, Qobad; Nasirian, Vahidreza; Quintero, Juan Carlos Vasquez

    2018-01-01

    This paper proposes a fully distributed control methodology for secondary control of AC microgrids. The control framework includes three modules: voltage regulator, reactive power regulator, and active power/frequency regulator. The voltage regulator module maintains the average voltage of the microgrid distribution line at the rated value. The reactive power regulator compares the local normalized reactive power of an inverter with its neighbors’ powers on a communication graph and, accordingly, fine-tunes Q-V droop coefficients to mitigate any reactive power mismatch. Collectively, these two .../reactive power sharing. An AC microgrid is prototyped to experimentally validate the proposed control methodology against load change, plug-and-play operation, and communication constraints such as delay, packet loss, and limited bandwidth.

  15. Mobile agent-enabled framework for structuring and building distributed systems on the internet

    Institute of Scientific and Technical Information of China (English)

    CAO Jiannong; ZHOU Jingyang; ZHU Weiwei; LI Xuhui

    2006-01-01

    Mobile agents (MAs) have shown promise as a powerful means to complement and enhance existing technology in various application areas. In particular, existing work has demonstrated that MAs can simplify the development and improve the performance of certain classes of distributed applications, especially those running on wide-area, heterogeneous, and dynamic networking environments like the Internet. In our previous work, we extended the application of MAs to the design of distributed control functions, which require the maintenance of logical relationships among, and/or coordination of, processing entities in a distributed system. A novel framework is presented for structuring and building distributed systems, which uses cooperating mobile agents as an aid to carry out coordination and cooperation tasks in distributed systems. The framework has been used for designing various distributed control functions such as load balancing and mutual exclusion in our previous work. In this paper, we use the framework to propose a novel approach to detecting deadlocks in distributed systems by using mobile agents, which demonstrates the adaptability and flexibility of mobile agents. We first describe the MAEDD (Mobile Agent Enabled Deadlock Detection) scheme, in which mobile agents are dispatched to collect and analyze deadlock information distributed across the network sites and, based on the analysis, to detect and resolve deadlocks. Then the design of an adaptive hybrid algorithm derived from the framework is presented. The algorithm can dynamically adapt itself to changes in system state by using different deadlock detection strategies. The performance of the proposed algorithm has been evaluated using simulations. The results show that the algorithm can outperform existing algorithms that use a fixed deadlock detection strategy.
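Once an agent in a scheme like MAEDD has aggregated the distributed wait-for information, the detection step reduces to finding a cycle in the global wait-for graph. The sketch below shows only that final graph check; the agent migration and collection machinery is omitted, and the process names are hypothetical.

```python
def find_deadlock(wait_for):
    """Return one cycle in the wait-for graph as a list of processes,
    or None. wait_for maps a process to the processes it is blocked on."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {}
    stack = []

    def dfs(p):
        color[p] = GRAY
        stack.append(p)
        for q in wait_for.get(p, ()):
            if color.get(q, WHITE) == GRAY:      # back edge: q is on the stack
                return stack[stack.index(q):] + [q]
            if color.get(q, WHITE) == WHITE:
                cycle = dfs(q)
                if cycle:
                    return cycle
        stack.pop()
        color[p] = BLACK
        return None

    for p in list(wait_for):
        if color.get(p, WHITE) == WHITE:
            cycle = dfs(p)
            if cycle:
                return cycle
    return None

# P1 waits for P2, P2 for P3, P3 for P1: a deadlock spanning three sites.
edges = {"P1": {"P2"}, "P2": {"P3"}, "P3": {"P1"}, "P4": {"P1"}}
assert find_deadlock(edges) == ["P1", "P2", "P3", "P1"]
```

The adaptivity the abstract highlights lies elsewhere: agents can switch between detection strategies (e.g., where to aggregate, how often to probe) as the system state changes, while the core cycle check stays the same.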

  16. An Effective Framework for Distributed Geospatial Query Processing in Grids

    Directory of Open Access Journals (Sweden)

    CHEN, B.

    2010-08-01

    The emergence of the Internet has greatly revolutionized the way that geospatial information is collected, managed, processed and integrated. There are several important research issues to be addressed for distributed geospatial applications. First, the performance of geospatial applications needs to be considered in the Internet environment. In this regard, the Grid, as an effective distributed computing paradigm, is a good choice. The Grid uses a series of middleware to interconnect and merge various distributed resources into a super-computer with the capability of high-performance computation. Secondly, it is necessary to ensure the secure use of independent geospatial applications in the Internet environment. The Grid provides the utility of secure access to distributed geospatial resources. Additionally, it makes good sense to overcome the heterogeneity between individual geospatial information systems on the Internet. The Open Geospatial Consortium (OGC) proposes a number of generalized geospatial standards, e.g. OGC Web Services (OWS), to achieve interoperable access to geospatial applications. The OWS solution is feasible and widely adopted by both the academic community and the industry community. Therefore, we propose an integrated framework by incorporating OWS standards into Grids. With this framework, distributed geospatial queries can be performed in an interoperable, high-performance and secure Grid environment.
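The OWS interoperability layer such a framework wraps can be illustrated by the shape of a standard WFS GetFeature request, which is how a distributed geospatial query is expressed regardless of which Grid node serves it. The endpoint and feature type below are hypothetical, and no request is actually sent.

```python
from urllib.parse import urlencode

# Hypothetical WFS endpoint; in the framework, requests like this would be
# dispatched to Grid-hosted services behind a secure access layer.
endpoint = "https://example.org/geoserver/wfs"

params = {
    "service": "WFS",
    "version": "1.1.0",
    "request": "GetFeature",
    "typeName": "topp:rivers",               # assumed feature type
    "bbox": "102.0,0.5,103.0,1.5,EPSG:4326",  # spatial filter
    "maxFeatures": "50",
    "outputFormat": "GML2",
}
query_url = endpoint + "?" + urlencode(params)

assert "request=GetFeature" in query_url
assert query_url.startswith(endpoint)
```

Because every conforming service accepts the same key-value parameters, the Grid middleware can route, parallelize, or secure these queries without caring which vendor's geospatial system answers them.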

  17. Water and spatial development: the implementation of the water framework directive in the Netherlands

    NARCIS (Netherlands)

    Knaap, van der W.G.M.; Pijnappels, M.

    2010-01-01

    This paper discusses how water managers and spatial planners could co-operate at the local level in combination with the implementation of the Water Framework Directive and the Birds and Habitats Directives in the Netherlands. Recent evaluations by the European Commission show that implementation of

  18. Barriers to Implementing the Response to Intervention Framework in Secondary Schools: Interviews with Secondary Principals

    Science.gov (United States)

    Bartholomew, Mitch; De Jong, David

    2017-01-01

    Despite the successful implementation of the Response to Intervention (RtI) framework in many elementary schools, there is little evidence of successful implementation in high school settings. Several themes emerged from the interviews of nine secondary principals, including a lack of knowledge and training for successful implementation, the…

  19. A General Framework for Analyzing, Characterizing, and Implementing Spectrally Modulated, Spectrally Encoded Signals

    National Research Council Canada - National Science Library

    Roberts, Marcus L

    2006-01-01

    .... Research is rapidly progressing in SDR hardware and software venues, but current CR-based SDR research lacks the theoretical foundation and analytic framework to permit efficient implementation...

  20. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2013-01-01

    There has been a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape the future distribution grid operation in numerous ways. Thus, it is necessary to introduce a framework to measure to what extent the power system......, comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation...... results show that in general the DER deployment brings in the possibilities to reduce the power losses and voltage drops by compensating power from the local generation and optimizing the local load profiles....

  1. Leveraging the Zachman framework implementation using action - research methodology - a case study: aligning the enterprise architecture and the business goals

    Science.gov (United States)

    Nogueira, Juan Manuel; Romero, David; Espadas, Javier; Molina, Arturo

    2013-02-01

With the emergence of new enterprise models, such as technology-based enterprises, and the large quantity of information generated through technological advances, the Zachman framework continues to represent a modelling tool of great utility and value to construct an enterprise architecture (EA) that can integrate and align the IT infrastructure and business goals. Nevertheless, implementing an EA requires a considerable effort within an enterprise. Small technology-based enterprises and start-ups can take advantage of EAs and frameworks but, because these enterprises have limited resources to allocate for this task, an enterprise framework implementation is not feasible in most cases. This article proposes a new methodology based on action-research for the implementation of the business, system and technology models of the Zachman framework to assist and facilitate its implementation. Following the explanation of the cycles of the proposed methodology, a case study is presented to illustrate the results of implementing the Zachman framework in a technology-based enterprise, PyME CREATIVA, using an action-research approach.

  2. Organizational Health Literacy: Review of Theories, Frameworks, Guides, and Implementation Issues

    Science.gov (United States)

Farmanova, Elina; Bonneville, Luc; Bouchard, Louise

    2018-01-01

    Organizational health literacy is described as an organization-wide effort to transform organization and delivery of care and services to make it easier for people to navigate, understand, and use information and services to take care of their health. Several health literacy guides have been developed to assist healthcare organizations with this effort, but their content has not been systematically reviewed to understand the scope and practical implications of this transformation. The objective of this study was to review (1) theories and frameworks that inform the concept of organizational health literacy, (2) the attributes of organizational health literacy as described in the guides, (3) the evidence for the effectiveness of the guides, and (4) the barriers and facilitators to implementing organizational health literacy. Drawing on a metanarrative review method, 48 publications were reviewed, of which 15 dealt with the theories and operational frameworks, 20 presented health literacy guides, and 13 addressed guided implementation of organizational health literacy. Seven theories and 9 operational frameworks have been identified. Six health literacy dimensions and 9 quality-improvement characteristics were reviewed for each health literacy guide. Evidence about the effectiveness of health literacy guides is limited at this time, but experiences with the guides were positive. Thirteen key barriers (conceived also as facilitators) were identified. Further development of organizational health literacy requires a strong and a clear connection between its vision and operationalization as an implementation strategy to patient-centered care. For many organizations, becoming health literate will require multiple, simultaneous, and radical changes. Organizational health literacy has to make sense from clinical and financial perspectives in order for organizations to embark on such transformative journey. PMID:29569968

  4. Analysing the agricultural cost and non-market benefits of implementing the water framework directive

    NARCIS (Netherlands)

    Bateman, I.J.; Brouwer, R.; Davies, H.; Day, B.H.; Deflandre, A.; Di Falco, S.; Georgiou, S.; Hadley, D.; Hutchins, M.; Jones, A.P.; Kay, D.; Leeks, G.; Lewis, M.; Lovett, A.A.; Neal, C.; Posen, P.; Rigby, D.; Turner, R.K.

    2006-01-01

    Implementation of the Water Framework Directive (WFD) represents a fundamental change in the management of water in Europe with a requirement that member states ensure 'good ecological status' for all water bodies by 2015. Agriculture is expected to bear a major share of WFD implementation costs as

  5. A Linear Algebra Framework for Static High Performance Fortran Code Distribution

    Directory of Open Access Journals (Sweden)

    Corinne Ancourt

    1997-01-01

Full Text Available High Performance Fortran (HPF) was developed to support data-parallel programming for single-instruction multiple-data (SIMD) and multiple-instruction multiple-data (MIMD) machines with distributed memory. The programmer is provided with a familiar uniform logical address space and specifies the data distribution by directives. The compiler then exploits these directives to allocate arrays in the local memories, to assign computations to elementary processors, and to migrate data between processors when required. We show here that linear algebra is a powerful framework to encode HPF directives and to synthesize distributed code with space-efficient array allocation, tight loop bounds, and vectorized communications for INDEPENDENT loops. The generated code includes traditional optimizations such as guard elimination, message vectorization and aggregation, and overlap analysis. The systematic use of an affine framework makes it possible to prove the compilation scheme correct.
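For the simplest BLOCK directive, the affine owner computation reduces to integer arithmetic; a small sketch (in Python rather than the paper's linear-algebra formalism):

```python
from math import ceil

def block_owner(i, n, p):
    """Owner of element i for a BLOCK distribution of an array of
    length n over p processors (the standard HPF-style mapping)."""
    return i // ceil(n / p)

def local_index(i, n, p):
    """Index of element i within its owner's local segment."""
    return i % ceil(n / p)
```

A CYCLIC distribution would instead use `i % p`; the paper's framework encodes both, together with alignments and loop bounds, as systems of affine constraints.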

  6. Communication Channels as Implementation Determinants of Performance Management Framework in Kenya

    Science.gov (United States)

    Sang, Jane

    2016-01-01

The purpose of this study was to assess communication channels as implementation determinants of the performance management framework in Kenya at Moi Teaching and Referral Hospital (MTRH). Communication theory was used to inform the study. This study adopted an explanatory design. The study targeted 510 respondents sampled through simple random and stratified…

  7. Implementation of a Framework for Collaborative Social Networks in E-Learning

    Science.gov (United States)

    Maglajlic, Seid

    2016-01-01

This paper describes the implementation of a framework for the construction and utilization of social networks in E-Learning. These social networks aim to enhance collaboration between all E-Learning participants (i.e. both trainee-to-trainee and trainee-to-tutor communication are targeted). E-Learning systems that include a so-called "social…

  8. Dist-Orc: A Rewriting-based Distributed Implementation of Orc with Formal Analysis

    Directory of Open Access Journals (Sweden)

    José Meseguer

    2010-09-01

Full Text Available Orc is a theory of orchestration of services that allows structured programming of distributed and timed computations. Several formal semantics have been proposed for Orc, including a rewriting logic semantics developed by the authors. Orc also has a fully fledged implementation in Java with functional programming features. However, as with descriptions of most distributed languages, there exists a fairly substantial gap between Orc's formal semantics and its implementation, in that: (i) programs in Orc are not easily deployable in a distributed implementation just by using Orc's formal semantics, and (ii) they are not readily formally analyzable at the level of a distributed Orc implementation. In this work, we overcome problems (i) and (ii) for Orc. Specifically, we describe an implementation technique based on rewriting logic and Maude that narrows this gap considerably. The enabling feature of this technique is Maude's support for external objects through TCP sockets. We describe how sockets are used to implement Orc site calls and returns, and to provide real-time timing information to Orc expressions and sites. We then show how Orc programs in the resulting distributed implementation can be formally analyzed at a reasonable level of abstraction by defining an abstract model of time and the socket communication infrastructure, and discuss the assumptions under which the analysis can be deemed correct. Finally, the distributed implementation and the formal analysis methodology are illustrated with a case study.
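The site-call/return mechanism over TCP sockets can be mimicked in a few lines of Python (a hypothetical one-shot "add" site; Dist-Orc's real protocol and its Maude encoding are richer):

```python
import socket
import threading

def site_add(conn):
    """A minimal 'site': read one request, return the sum (toy protocol)."""
    a, b = (int(x) for x in conn.recv(1024).decode().split())
    conn.sendall(str(a + b).encode())
    conn.close()

def serve_once(server):
    conn, _ = server.accept()
    site_add(conn)

# The "site" listens on an ephemeral localhost port.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
t = threading.Thread(target=serve_once, args=(server,))
t.start()

# The orchestrator performs a site call and publishes the returned value.
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"2 3")
result = client.recv(1024).decode()
client.close()
t.join()
server.close()
```

In Dist-Orc the same socket plumbing is driven from Maude's external-object facility, so the rewriting engine itself plays the role of the orchestrator.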

  9. Implementing the Marine Strategy Framework Directive: A policy perspective on regulatory, institutional and stakeholder impediments to effective implementation

    NARCIS (Netherlands)

    Leeuwen, van J.; Raakjaer, J.; Hoof, van L.J.W.; Tatenhove, van J.P.M.; Long, R.; Ounanian, K.

    2014-01-01

    The implementation of the European Union (EU) Marine Strategy Framework Directive (MSFD) requires EU Member States to draft a program of measures to achieve Good Environmental Status (GES). Central argument of this paper, based on an analysis of the unique, holistic character of the MSFD, is that

  10. A Modular Framework for Modeling Hardware Elements in Distributed Engine Control Systems

    Science.gov (United States)

    Zinnecker, Alicia M.; Culley, Dennis E.; Aretskin-Hariton, Eliot D.

    2015-01-01

Progress toward the implementation of distributed engine control in an aerospace application may be accelerated through the development of a hardware-in-the-loop (HIL) system for testing new control architectures and hardware outside of a physical test cell environment. One component required in an HIL simulation system is a high-fidelity model of the control platform: sensors, actuators, and the control law. The control system developed for the Commercial Modular Aero-Propulsion System Simulation 40k (C-MAPSS40k) provides a verifiable baseline for development of a model for simulating a distributed control architecture. This distributed controller model will contain enhanced hardware models, capturing the dynamics of the transducer and the effects of data processing, and a model of the controller network. A multilevel framework is presented that establishes three sets of interfaces in the control platform: communication with the engine (through sensors and actuators), communication between hardware and controller (over a network), and the physical connections within individual pieces of hardware. This introduces modularity at each level of the model, encouraging collaboration in the development and testing of various control schemes or hardware designs. At the hardware level, this modularity is leveraged through the creation of a Simulink® library containing blocks for constructing smart transducer models complying with the IEEE 1451 specification. These hardware models were incorporated in a distributed version of the baseline C-MAPSS40k controller and simulations were run to compare the performance of the two models. The overall tracking ability differed only due to quantization effects in the feedback measurements in the distributed controller. Additionally, it was found that the added complexity of the smart transducer models did not prevent real-time operation of the distributed controller model, a requirement of an HIL system.
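The quantization effect singled out in the results can be reproduced with a generic uniform quantizer (an illustrative ADC stand-in, not a block from the C-MAPSS40k library):

```python
def quantize(x, lo, hi, bits):
    """Uniform quantizer: map x in [lo, hi] onto 2**bits levels,
    snapping to the centre of the containing cell. A rough stand-in
    for ADC effects in a smart-transducer feedback path."""
    levels = 2 ** bits
    step = (hi - lo) / levels
    x = min(max(x, lo), hi - step / 2)  # clamp to the representable range
    k = int((x - lo) / step)
    return lo + (k + 0.5) * step
```

The worst-case error is half a step, which is exactly the kind of small feedback discrepancy the abstract attributes the tracking difference to.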

  11. MAPI: a software framework for distributed biomedical applications

    Directory of Open Access Journals (Sweden)

    Karlsson Johan

    2013-01-01

Full Text Available Background: The amount of web-based resources (databases, tools, etc.) in biomedicine has increased, but the integrated usage of those resources is complex due to differences in access protocols and data formats. However, distributed data processing is becoming inevitable in several domains, in particular in biomedicine, where researchers face rapidly increasing data sizes. This big data is difficult to process locally because of the large processing, memory and storage capacity required. Results: This manuscript describes a framework, called MAPI, which provides a uniform representation of resources available over the Internet, in particular for Web Services. The framework enhances their interoperability and collaborative use by enabling uniform and remote access. The framework functionality is organized in modules that can be combined and configured in different ways to fulfil concrete development requirements. Conclusions: The framework has been tested in the biomedical application domain, where it has been a base for developing several clients that are able to integrate different web resources. The MAPI binaries and documentation are freely available at http://www.bitlab-es.com/mapi under the Creative Commons Attribution-No Derivative Works 2.5 Spain License. The MAPI source code is available by request (GPL v3 license).
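The uniform-representation idea is essentially the adapter pattern; a minimal sketch with hypothetical services (this is not MAPI's actual API):

```python
class Resource:
    """Uniform facade over heterogeneous web resources (hypothetical API)."""
    def __init__(self, name, caller):
        self.name = name
        self._caller = caller

    def invoke(self, **params):
        return self._caller(params)

# Two toy 'services' with different native conventions...
def rest_style(params):
    return {"status": "ok", "echo": params}

def soap_style(params):
    return {"Body": {"response": params}}

# ...wrapped so that clients use one calling convention for both.
blast = Resource("blast", rest_style)
lookup = Resource("lookup", lambda p: soap_style(p)["Body"]["response"])
```

Clients then call `blast.invoke(query=...)` and `lookup.invoke(id=...)` identically, regardless of each service's underlying protocol.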

  12. A Framework for ERP Post-Implementation Amendments: A Literature Analysis

    Directory of Open Access Journals (Sweden)

    Taiwo Oseni

    2017-06-01

Full Text Available Post-implementation amendments to ERP systems (ERP-PIA) are of importance for advancing ERP research, but more importantly are essential if ERP systems are to be used as a strategic and competitive business tool. For clarity, we have adopted the term “amendments” to encompass the main forms of post-implementation activities: maintenance, enhancements and upgrades. The term ‘amendments’ is used to counteract one of the major findings from this research: the inconsistency of terms used by many authors to describe post-implementation activities. This paper presents a review of the ERP post-implementation amendment literature in order to answer two specific questions: first, what is the current state of research in the field of ERP-PIA; and second, what are the future research directions that need to be explored in the field of ERP-PIA. From the review, we develop a framework to identify: (a) major themes concerning ERP post-implementation amendments, (b) inherent gaps in the post-implementation amendments literature, and (c) specific areas that require further research attention influencing the uptake of amendments. Suggestions on the empirical evaluation of research directions and their relevance to the extension of existing literature are presented.

  13. A Framework for Semi-Automated Implementation of Multidimensional Data Models

    Directory of Open Access Journals (Sweden)

    Ilona Mariana NAGY

    2012-08-01

Full Text Available Data warehousing solution development represents a challenging task which requires the employment of considerable resources on behalf of enterprises and sustained commitment from the stakeholders. Costs derive mostly from the amount of time invested in the design and physical implementation of these large projects, time that, we consider, may be decreased through the automation of several processes. Thus, we present a framework for semi-automated implementation of multidimensional data models and introduce an automation prototype intended to reduce the time of data structure generation in the warehousing environment. Our research is focused on the design of an automation component and the development of a corresponding prototype from technical metadata.
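The kind of generation step such a prototype automates can be sketched as template-driven DDL emission from technical metadata (the table and column names below are illustrative; real tools emit far richer DDL):

```python
def star_schema_ddl(fact, measures, dims):
    """Generate CREATE TABLE statements for a simple star schema
    from a metadata description: one table per dimension, then the
    fact table with foreign keys and measures."""
    stmts = []
    for dim, attrs in dims.items():
        cols = [f"{dim}_key INTEGER PRIMARY KEY"] + [f"{a} TEXT" for a in attrs]
        stmts.append(f"CREATE TABLE dim_{dim} ({', '.join(cols)});")
    fact_cols = [f"{d}_key INTEGER REFERENCES dim_{d}({d}_key)" for d in dims]
    fact_cols += [f"{m} REAL" for m in measures]
    stmts.append(f"CREATE TABLE fact_{fact} ({', '.join(fact_cols)});")
    return stmts

ddl = star_schema_ddl("sales", ["amount"], {"date": ["day"], "product": ["name"]})
```

Feeding richer metadata (hierarchies, data types, slowly changing dimensions) into the same template machinery is where the bulk of the time savings would come from.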

  14. A framework for implementation of user-centric identity management systems

    DEFF Research Database (Denmark)

    Adjei, Joseph K.; Olesen, Henning

    2010-01-01

Users increasingly become part of electronic transactions, which take place instantaneously without direct user involvement. This leads to the risk of data manipulation, identity theft and privacy violation, and it has become a major concern for individuals and businesses around the world. Governments in many countries are implementing identity management systems (IdMS) to curtail these incidences and to offer citizens the power to exercise informational self-determination. Using concepts from technology adoption and fit-viability theories as well as the laws of identity, this paper analyzes the criteria for successful implementation and defines a framework for a citizen-centric national IdMS. Results from a survey conducted in Ghana are also included.

  15. Supporting the Evaluation and Implementation of Musculoskeletal Models of Care: A Globally Informed Framework for Judging Readiness and Success.

    Science.gov (United States)

    Briggs, Andrew M; Jordan, Joanne E; Jennings, Matthew; Speerin, Robyn; Bragge, Peter; Chua, Jason; Woolf, Anthony D; Slater, Helen

    2017-04-01

    To develop a globally informed framework to evaluate readiness for implementation and success after implementation of musculoskeletal models of care (MOCs). Three phases were undertaken: 1) a qualitative study with 27 Australian subject matter experts (SMEs) to develop a draft framework; 2) an eDelphi study with an international panel of 93 SMEs across 30 nations to evaluate face validity, and refine and establish consensus on the framework components; and 3) translation of the framework into a user-focused resource and evaluation of its acceptability with the eDelphi panel. A comprehensive evaluation framework was developed for judging the readiness and success of musculoskeletal MOCs. The framework consists of 9 domains, with each domain containing a number of themes underpinned by detailed elements. In the first Delphi round, scores of "partly agree" or "completely agree" with the draft framework ranged 96.7%-100%. In the second round, "essential" scores ranged 58.6%-98.9%, resulting in 14 of 34 themes being classified as essential. SMEs strongly agreed or agreed that the final framework was useful (98.8%), usable (95.1%), credible (100%) and appealing (93.9%). Overall, 96.3% strongly supported or supported the final structure of the framework as it was presented, while 100%, 96.3%, and 100% strongly supported or supported the content within the readiness, initiating implementation, and success streams, respectively. An empirically derived framework to evaluate the readiness and success of musculoskeletal MOCs was strongly supported by an international panel of SMEs. The framework provides an important internationally applicable benchmark for the development, implementation, and evaluation of musculoskeletal MOCs. © 2016, American College of Rheumatology.

  16. Development of framework for sustainable Lean implementation: an ISM approach

    Science.gov (United States)

    Jadhav, Jagdish Rajaram; Mantha, S. S.; Rane, Santosh B.

    2014-07-01

The survival of any organization depends upon its competitive edge. Even though Lean is one of the most powerful quality improvement methodologies, nearly two-thirds of Lean implementations result in failure, and less than one-fifth of those implemented sustain their results. One of the most significant tasks of top management is to identify, understand and deploy the significant Lean practices like quality circles, Kanban, just-in-time purchasing, etc. The term ‘bundle’ is used to make groups of inter-related and internally consistent Lean practices. Eight significant Lean practice bundles have been identified based on the literature reviewed and the opinion of experts. The order of execution of Lean practice bundles is very important. Lean practitioners must be able to understand the interrelationship between these practice bundles. The objective of this paper is to develop a framework for sustainable Lean implementation using an interpretive structural modelling (ISM) approach.
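ISM derives the ordering of such bundles from the final reachability matrix, the transitive closure of the expert-elicited adjacency matrix; a sketch with a hypothetical three-bundle system:

```python
def reachability(adj):
    """Final reachability matrix for ISM: Warshall's transitive closure
    over the initial binary adjacency matrix, with self-reachability."""
    n = len(adj)
    r = [[adj[i][j] or i == j for j in range(n)] for i in range(n)]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                r[i][j] = r[i][j] or (r[i][k] and r[k][j])
    return [[int(v) for v in row] for row in r]

# Hypothetical example: bundle 0 drives bundle 1, which drives bundle 2.
m = reachability([[0, 1, 0], [0, 0, 1], [0, 0, 0]])
```

Level partitioning of this matrix (intersecting reachability and antecedent sets) then yields the execution order of the bundles.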

  17. MATLAB implementation of satellite positioning error overbounding by generalized Pareto distribution

    Science.gov (United States)

    Ahmad, Khairol Amali; Ahmad, Shahril; Hashim, Fakroul Ridzuan

    2018-02-01

In the satellite navigation community, error overbounding has been implemented in the process of integrity monitoring. In this work, MATLAB programming is used to implement the overbounding of the satellite positioning error CDF. Using a reference trajectory, the horizontal position errors (HPE) are computed and their non-parametric distribution function is given by the empirical cumulative distribution function (ECDF). According to the results, these errors have a heavy-tailed distribution. Since the ECDF of the HPE in an urban environment is not Gaussian distributed, the ECDF is overbounded with the CDF of the generalized Pareto distribution (GPD).
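The two building blocks, the ECDF and a GPD fit to the error tail, can be sketched outside MATLAB as well; the fit below uses simple method-of-moments estimates as a stand-in for whatever estimator the paper actually employs:

```python
from statistics import mean, pvariance

def ecdf(sample, x):
    """Empirical CDF of the sample evaluated at x."""
    return sum(1 for s in sample if s <= x) / len(sample)

def gpd_moment_fit(excesses):
    """Method-of-moments estimates (shape xi, scale sigma) for a
    generalized Pareto distribution fitted to threshold excesses."""
    m, v = mean(excesses), pvariance(excesses)
    ratio = m * m / v
    return 0.5 * (1.0 - ratio), 0.5 * m * (ratio + 1.0)
```

Overbounding then amounts to checking that the fitted GPD's CDF lies below the ECDF everywhere in the tail of interest, so that integrity risk is never underestimated.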

  18. The stapl Skeleton Framework

    KAUST Repository

    Zandifar, Mani

    2015-01-01

© Springer International Publishing Switzerland 2015. This paper describes the stapl Skeleton Framework, a high-level skeletal approach for parallel programming. This framework abstracts the underlying details of data distribution and parallelism from programmers and enables them to express parallel programs as a composition of existing elementary skeletons such as map, map-reduce, scan, zip, butterfly, allreduce, alltoall and user-defined custom skeletons. Skeletons in this framework are defined as parametric data flow graphs, and their compositions are defined in terms of data flow graph compositions. Defining the composition in this manner allows dependencies between skeletons to be defined in terms of point-to-point dependencies, avoiding unnecessary global synchronizations. To show the ease of composability and expressivity, we implemented the NAS Integer Sort (IS) and Embarrassingly Parallel (EP) benchmarks using skeletons and demonstrate comparable performance to the hand-optimized reference implementations. To demonstrate scalable performance, we show a transformation which enables applications written in terms of skeletons to run on more than 100,000 cores.
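The compositional style described here can be sketched sequentially in Python (stapl itself is C++ and executes these compositions as distributed data-flow graphs):

```python
from functools import reduce

def sk_map(f):
    """map skeleton: element-wise application."""
    return lambda xs: [f(x) for x in xs]

def sk_map_reduce(f, op, init):
    """map-reduce skeleton: map, then fold with op."""
    return lambda xs: reduce(op, (f(x) for x in xs), init)

def compose(*sks):
    """Sequential composition of skeletons; a real framework wires these
    as point-to-point data-flow edges, avoiding global barriers."""
    def run(x):
        for sk in sks:
            x = sk(x)
        return x
    return run

pipeline = compose(sk_map(lambda x: x * x),
                   sk_map_reduce(lambda x: x, lambda a, b: a + b, 0))
total = pipeline([1, 2, 3])
```

The sequential semantics fit in a few lines; the framework's contribution is running the same composition unchanged on hundreds of thousands of cores.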

  19. A survey on the progress with implementation of the radiography profession's career progression framework in UK radiotherapy centres

    International Nuclear Information System (INIS)

    James, Sarah; Beardmore, Charlotte; Dumbleton, Claire

    2012-01-01

Aim: The purpose of the survey was to benchmark the progress with implementing the radiography profession's career progression framework within radiotherapy centres across the United Kingdom (UK). Methods: A survey questionnaire was constructed using the Survey Monkey™ tool to assess implementation of the career progression framework of the Society and College of Radiographers. Once constructed, an online link to the survey questionnaire was emailed to all radiotherapy centre managers in the UK (N = 67), who were invited to provide one response per centre. The survey comprised twenty-nine questions which were grouped into nine sections. Key results: The workforce profile indicates that increases in assistant, advanced and consultant level practitioners are required to meet National Radiotherapy Advisory Group recommendations, with only a small number of centres having fully implemented the career progression framework. The overall vacancy level across the therapeutic radiography workforce was 4.6% at the time of the survey. Conclusions and Recommendations: The survey has highlighted some progress with implementation of the career progression framework across the UK since its launch in 2000. However, the current level of implementation demonstrated is disappointing considering it is a key recommendation within the NRAG Report 2007 with respect to England. It is recommended that all centres undertake a multi-professional workforce review to embed the career progression framework within their service in order to meet the workforce challenge associated with the anticipated large growth in radiotherapy capacity.

  20. Consolidating tactical planning and implementation frameworks for integrated vector management in Uganda.

    Science.gov (United States)

    Okia, Michael; Okui, Peter; Lugemwa, Myers; Govere, John M; Katamba, Vincent; Rwakimari, John B; Mpeka, Betty; Chanda, Emmanuel

    2016-04-14

    Integrated vector management (IVM) is the recommended approach for controlling some vector-borne diseases (VBD). In the face of current challenges to disease vector control, IVM is vital to achieve national targets set for VBD control. Though global efforts, especially for combating malaria, now focus on elimination and eradication, IVM remains useful for Uganda which is principally still in the control phase of the malaria continuum. This paper outlines the processes undertaken to consolidate tactical planning and implementation frameworks for IVM in Uganda. The Uganda National Malaria Control Programme with its efforts to implement an IVM approach to vector control was the 'case' for this study. Integrated management of malaria vectors in Uganda remained an underdeveloped component of malaria control policy. In 2012, knowledge and perceptions of malaria vector control policy and IVM were assessed, and recommendations for a specific IVM policy were made. In 2014, a thorough vector control needs assessment (VCNA) was conducted according to WHO recommendations. The findings of the VCNA informed the development of the national IVM strategic guidelines. Information sources for this study included all available data and accessible archived documentary records on VBD control in Uganda. The literature was reviewed and adapted to the local context and translated into the consolidated tactical framework. WHO recommends implementation of IVM as the main strategy to vector control and has encouraged member states to adopt the approach. However, many VBD-endemic countries lack IVM policy frameworks to guide implementation of the approach. In Uganda most VBD coexists and could be managed more effectively if done in tandem. In order to successfully control malaria and other VBD and move towards their elimination, the country needs to scale up proven and effective vector control interventions and also learn from the experience of other countries. 
The IVM strategy is important in

  1. Development of a distributed air pollutant dry deposition modeling framework

    International Nuclear Information System (INIS)

    Hirabayashi, Satoshi; Kroll, Charles N.; Nowak, David J.

    2012-01-01

A distributed air pollutant dry deposition modeling system was developed with a geographic information system (GIS) to enhance the functionality of i-Tree Eco (i-Tree, 2011). With the developed system, temperature, leaf area index (LAI) and air pollutant concentration in a spatially distributed form can be estimated, and based on these and other input variables, dry deposition of carbon monoxide (CO), nitrogen dioxide (NO2), sulfur dioxide (SO2), and particulate matter less than 10 microns (PM10) to trees can be spatially quantified. Employing nationally available road network, traffic volume, air pollutant emission/measurement and meteorological data, the developed system provides a framework for U.S. city managers to identify spatial patterns of urban forest and locate potential areas for future urban forest planting and protection to improve air quality. To exhibit the usability of the framework, a case study was performed for July and August of 2005 in Baltimore, MD. - Highlights: ► A distributed air pollutant dry deposition modeling system was developed. ► The developed system enhances the functionality of i-Tree Eco. ► The developed system employs nationally available input datasets. ► The developed system is transferable to any U.S. city. ► Future planting and protection spots were visually identified in a case study. - Employing nationally available datasets and a GIS, this study will provide urban forest managers in U.S. cities a framework to quantify and visualize urban forest structure and its air pollution removal effect.
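Dry deposition models of this kind rest on the standard big-leaf flux relation F = Vd · C; the removal arithmetic looks like this (the deposition velocity, concentration and leaf area below are illustrative values, not the Baltimore results):

```python
def dry_deposition_removal(conc_ugm3, vd_ms, leaf_area_m2, seconds):
    """Pollutant mass removed via dry deposition, using the flux form
    F = Vd * C (flux per unit leaf area), integrated over leaf area
    and time. Returns grams removed."""
    flux_ug_per_m2_s = vd_ms * conc_ugm3            # ug m^-2 s^-1
    total_ug = flux_ug_per_m2_s * leaf_area_m2 * seconds
    return total_ug / 1e6                           # ug -> g

# Illustrative: 10 ug/m3 concentration, Vd = 0.005 m/s,
# 1000 m2 of leaf area (from LAI x canopy area), over one hour.
removed_g = dry_deposition_removal(10.0, 0.005, 1000.0, 3600)
```

Spatializing the inputs (concentration, LAI, meteorology) per grid cell and summing these fluxes is essentially what the GIS-based system does city-wide.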

  2. A more ‘autonomous’ European social dialogue: the implementation of the Framework Agreement On Telework

    NARCIS (Netherlands)

    Visser, J.; Ramos Martín, N.

    2008-01-01

    This paper examines the implementation of the first ‘autonomous’ agreement signed by the European social partners. The European Framework Agreement on Telework of July 2002 was to be implemented ‘in accordance with the procedures and practices specific to management and labour and the Member

  3. The art framework

    International Nuclear Information System (INIS)

    Green, C; Kowalkowski, J; Paterno, M; Fischler, M; Garren, L; Lu, Q

    2012-01-01

    Future “Intensity Frontier” experiments at Fermilab are likely to be conducted by smaller collaborations, with fewer scientists, than is the case for recent “Energy Frontier” experiments. art is a C++ event-processing framework designed with the needs of such experiments in mind. An evolution from the framework of the CMS experiment, art was designed and implemented to be usable by multiple experiments without imposing undue maintenance effort requirements on either the art developers or the experiments using it. We describe the key requirements and features of art and the rationale behind evolutionary changes, additions and simplifications with respect to the CMS framework. In addition, our package distribution system and our collaborative model with respect to the multiple experiments using art help keep the maintenance burden low. We also describe in-progress and future enhancements to the framework, including strategies we are using to allow multi-threaded use of the art framework in today's multi- and many-core environments.

  4. Cost-effectiveness analysis for the implementation of the EU Water Framework Directive

    NARCIS (Netherlands)

    van Engelen, D.M.; Seidelin, Christian; van der Veeren, Rob; Barton, David N.; Queb, Kabir

    2008-01-01

    The EU Water Framework Directive (WFD) prescribes cost-effectiveness analysis (CEA) as an economic tool for the minimisation of costs when formulating programmes of measures to be implemented in the European river basins by the year 2009. The WFD does not specify, however, which approach to CEA has

  5. Flexible patient information search and retrieval framework: pilot implementation

    Science.gov (United States)

    Erdal, Selnur; Catalyurek, Umit V.; Saltz, Joel; Kamal, Jyoti; Gurcan, Metin N.

    2007-03-01

    Medical centers collect and store a significant amount of valuable data pertaining to patients' visits in the form of medical free text. In addition, standardized diagnosis codes (International Classification of Diseases, Ninth Revision, Clinical Modification: ICD9-CM) related to those dictated reports are usually available. In this work, we have created a framework where image searches can be initiated through a combination of free-text reports as well as ICD9 codes. This framework enables more comprehensive searches of existing large sets of patient data in a systematic way. The free-text search is enriched by computer-aided inclusion of additional search terms drawn from a thesaurus. This combination of enriched search allows users to access a larger set of relevant results from a patient-centric PACS in a simpler way. Such a framework is therefore of particular use in tasks such as gathering images for desired patient populations, building disease models, and so on. As the motivating application of our framework, we implemented a search engine. This search engine processed two years of patient data from the OSU Medical Center's Information Warehouse and identified lung nodule location information using a combination of UMLS Meta-Thesaurus-enhanced text report searches along with ICD9 code searches on patients who had been discharged. Five different queries with various ICD9 codes involving lung cancer were carried out on 172,552 cases. Each search was completed in under a minute on average per ICD9 code, and the inclusion of the UMLS thesaurus increased the number of relevant cases by 45% on average.
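    The combination of ICD9 filtering with thesaurus-expanded free-text matching described above can be sketched in a few lines. This is a toy illustration; the records, codes, and synonym table below are invented and unrelated to the OSU data:

```python
# Toy combined ICD9 + thesaurus-enriched free-text search (illustrative only).

SYNONYMS = {"nodule": ["nodule", "mass", "lesion"]}  # stand-in for UMLS expansion

def expand_terms(terms):
    """Expand each query term with its thesaurus synonyms."""
    expanded = set()
    for t in terms:
        expanded.update(SYNONYMS.get(t, [t]))
    return expanded

def search(records, icd9_codes, text_terms):
    """Return records matching any requested ICD9 code AND any expanded term."""
    terms = expand_terms(text_terms)
    return [r for r in records
            if r["icd9"] in icd9_codes
            and any(t in r["report"].lower() for t in terms)]

records = [
    {"icd9": "162.9", "report": "Spiculated mass in right upper lobe."},
    {"icd9": "162.9", "report": "No acute findings."},
    {"icd9": "486",   "report": "Small nodule noted."},
]
# Thesaurus expansion lets the query "nodule" also match the "mass" report.
hits = search(records, {"162.9"}, ["nodule"])
```

The first record is found only because the thesaurus expands "nodule" to include "mass", mirroring the 45% increase in relevant cases reported above.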

  6. A Framework for Low-Communication 1-D FFT

    Directory of Open Access Journals (Sweden)

    Ping Tak Peter Tang

    2013-01-01

    Full Text Available In high-performance computing on distributed-memory systems, communication often represents a significant part of the overall execution time. The relative cost of communication will certainly continue to rise as compute-density growth follows current technology and industry trends. The design of lower-communication alternatives to fundamental computational algorithms has become an important field of research. For distributed 1-D FFT, communication cost has hitherto remained high, as all industry-standard implementations perform three all-to-all internode data exchanges (also called global transposes). These communication steps indeed dominate execution time. In this paper, we present a mathematical framework from which many single-all-to-all and easy-to-implement 1-D FFT algorithms can be derived. For large-scale problems, our implementation can be twice as fast as leading FFT libraries on state-of-the-art computer clusters. Moreover, our framework allows a tradeoff between accuracy and performance, further boosting performance if reduced accuracy is acceptable.
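    The single-transpose structure underlying such algorithms can be illustrated with the classic four-step FFT decomposition, in which a transform of length N = N1·N2 becomes two batches of smaller FFTs separated by a twiddle-factor multiplication and a single transpose (the lone all-to-all in a distributed setting). This is a generic sketch of the mathematical structure, not the paper's specific framework:

```python
import numpy as np

def four_step_fft(x, n1, n2):
    """1-D FFT of length n1*n2 via the four-step (transpose) algorithm.
    Distributed across nodes, only the final transpose would require an
    all-to-all internode exchange."""
    a = x.reshape(n1, n2)                 # view input as an n1 x n2 matrix
    b = np.fft.fft(a, axis=0)             # n2 independent FFTs of length n1
    k1 = np.arange(n1)[:, None]
    m2 = np.arange(n2)[None, :]
    c = b * np.exp(-2j * np.pi * k1 * m2 / (n1 * n2))  # twiddle factors
    d = np.fft.fft(c, axis=1)             # n1 independent FFTs of length n2
    return d.T.ravel()                    # transpose = the single all-to-all

rng = np.random.default_rng(0)
x = rng.standard_normal(1024) + 1j * rng.standard_normal(1024)
X = four_step_fft(x, 32, 32)
```

The result agrees with a direct `np.fft.fft(x)`; in a cluster implementation each node would own a block of rows, so the two FFT stages are communication-free.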

  7. A Testing and Implementation Framework (TIF) for Climate Adaptation Innovations : Initial Version of the TIF - Deliverable 5.1

    NARCIS (Netherlands)

    Sebastian, A.G.; Lendering, K.T.; van Loon-Steensma, J.M.; Paprotny, D.; Bellamy, Rob; Willems, Patrick; van Loenhout, Joris; Colaço, Conceição; Dias, Susana; Nunes, Leónia; Rego, Francisco; Koundouri, Phoebe; Xepapadeas, Petros; Vassilopoulos, Achilleas; Wiktor, Paweł; Wysocka-Golec, Justyna

    2017-01-01

    Currently there is no internationally accepted framework for assessing the readiness of innovations that reduce disaster risk. To fill this gap, BRIGAID is developing a standard, comprehensive Testing and Implementation Framework (TIF). The TIF is designed to provide innovators with a framework for

  8. Validation of the theoretical domains framework for use in behaviour change and implementation research

    OpenAIRE

    Cane, James E.; O'Connor, Denise; Michie, Susan

    2012-01-01

    Abstract Background An integrative theoretical framework, developed for cross-disciplinary implementation and other behaviour change research, has been applied across a wide range of clinical situations. This study tests the validity of this framework. Methods Validity was investigated by behavioural experts sorting 112 unique theoretical constructs using closed and open sort tasks. The extent of replication was tested by Discriminant Content Validation and Fuzzy Cluster Analysis. Results The...

  9. Multiscale, multiphysics beam dynamics framework design and applications

    International Nuclear Information System (INIS)

    Amundson, J F; Spentzouris, P; Dechow, D; Stoltz, P; McInnes, L; Norris, B

    2008-01-01

    Modern beam dynamics simulations require nontrivial implementations of multiple physics models. We discuss how component framework design in combination with the Common Component Architecture's component model and implementation eases the process of incorporation of existing state-of-the-art models with newly-developed models. We discuss current developments in componentized beam dynamics software, emphasizing design issues and distribution issues

  10. Understanding the Attributes of Implementation Frameworks to Guide the Implementation of a Model of Community-based Integrated Health Care for Older Adults with Complex Chronic Conditions: A Metanarrative Review

    Directory of Open Access Journals (Sweden)

    Ann McKillop

    2017-06-01

    Full Text Available Introduction: Many studies have investigated the process of healthcare implementation to understand better how to bridge gaps between recommended practice, the needs and demands of healthcare consumers, and what they actually receive. However, in the implementation of community-based and integrated health care, it is still not well known which approaches work best. Methods: We conducted a systematic review and metanarrative synthesis of literature on implementation frameworks, theories and models in support of a research programme investigating CBPHC for older adults with chronic health problems. Results: Thirty-five reviews met our inclusion criteria and were appraised, summarised, and synthesised. Five metanarratives emerged: (1) theoretical constructs; (2) multiple influencing factors; (3) development of new frameworks; (4) application of existing frameworks; and (5) effectiveness of interventions within frameworks/models. Four themes were generated that exposed the contradictions and synergies among the metanarratives. Person-centred care is fundamental to integrated CBPHC at all levels of the health care delivery system, yet many implementation theories and frameworks neglect this cornerstone. Discussion: The research identified perspectives central to integrated CBPHC that were missing in the literature. Context played a key role in determining success and in how consumers and their families, providers, organisations and policy-makers stay connected to implementing the best care possible. Conclusions: All phases of implementation of a new model of CBPHC call for collaborative partnerships with all stakeholders, the most important being the person receiving care in terms of what matters most to them.

  11. Behavior and Convergence of Wasserstein Metric in the Framework of Stable Distributions

    Czech Academy of Sciences Publication Activity Database

    Omelchenko, Vadym

    2012-01-01

    Roč. 2012, č. 30 (2012), s. 124-138 ISSN 1212-074X R&D Projects: GA ČR GAP402/10/0956 Institutional research plan: CEZ:AV0Z10750506 Institutional support: RVO:67985556 Keywords : Wasserstein Metric * Stable Distributions * Empirical Distribution Function Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2013/E/omelchenko-behavior and convergence of wasserstein metric in the framework of stable distributions.pdf
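    For equal-size empirical samples, the 1-D Wasserstein (W1) distance between the empirical distribution functions referenced above reduces to the mean absolute difference of order statistics. A minimal sketch of this standard formula (not code from the paper):

```python
import numpy as np

def wasserstein_1d(u, v):
    """W1 distance between two equal-size 1-D empirical samples:
    for sorted samples it is the mean absolute difference of the
    order statistics (equivalently, the L1 distance between the
    empirical quantile functions)."""
    u, v = np.sort(u), np.sort(v)
    return float(np.mean(np.abs(u - v)))

a = np.array([0.0, 1.0, 2.0])
b = np.array([1.0, 2.0, 3.0])
d = wasserstein_1d(a, b)   # shifting a sample by 1 moves W1 by exactly 1
```

Convergence of this metric for heavy-tailed (e.g., stable) samples is exactly where care is needed, since extreme order statistics then dominate the sum.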

  12. cMsg - A general purpose, publish-subscribe, interprocess communication implementation and framework

    International Nuclear Information System (INIS)

    Timmer, C; Abbott, D; Gyurjyan, V; Heyes, G; Jastrzembski, E; Wolin, E

    2008-01-01

    cMsg is software used to send and receive messages in the Jefferson Lab online and run control systems. It was created to replace the several IPC software packages in use with a single API. cMsg is asynchronous in nature, running a callback for each message received. However, it also includes synchronous routines for convenience. On the framework level, cMsg is a thin API layer in Java, C, or C++ that can be used to wrap most message-based interprocess communication protocols. The top layer of cMsg uses this same API and multiplexes user calls to one of many such wrapped protocols (or domains) based on a URL-like string which we call a Uniform Domain Locator or UDL. One such domain is a complete implementation of a publish-subscribe messaging system using network communications and written in Java (with user APIs in C and C++ too). This domain is built in a way which allows it to be used as a proxy server to other domains (protocols). Performance is excellent, allowing the system not only to be used for messaging but also as a data distribution system.
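    The layering described above, a thin API that multiplexes over wrapped protocols selected by a UDL string, can be sketched as follows. The domain names and classes here are invented for illustration; this is not the cMsg API:

```python
# Toy sketch of a cMsg-style thin API multiplexing on a UDL string.
import re

class LocalDomain:
    """In-process publish-subscribe domain (stands in for a network protocol)."""
    def __init__(self):
        self.callbacks = {}
    def subscribe(self, subject, cb):
        self.callbacks.setdefault(subject, []).append(cb)
    def publish(self, subject, text):
        for cb in self.callbacks.get(subject, []):
            cb(subject, text)          # asynchronous in cMsg; direct call here

DOMAINS = {"local": LocalDomain}       # registry of wrapped protocols

def connect(udl):
    """Parse a UDL like 'local://host' and return the matching domain."""
    domain = re.match(r"(\w+)://", udl).group(1)
    return DOMAINS[domain]()

conn = connect("local://example")
received = []
conn.subscribe("run/status", lambda subj, msg: received.append(msg))
conn.publish("run/status", "running")
```

Adding a new protocol means registering one more domain class; user code keeps calling the same `connect`/`subscribe`/`publish` surface, which is the point of the UDL indirection.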

  13. dCache: implementing a high-end NFSv4.1 service using a Java NIO framework

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    dCache is a high-performance scalable storage system widely used by the HEP community. In addition to a set of home-grown protocols, we also provide industry-standard access mechanisms like WebDAV and NFSv4.1. This support places dCache in direct competition with commercial solutions. Nevertheless, conforming to a protocol is not enough; our implementations must perform comparably to or even better than commercial systems. To achieve this, dCache uses two high-end IO frameworks from well-known application servers: GlassFish and JBoss. This presentation describes how we implemented an rfc1831- and rfc2203-compliant ONC RPC (Sun RPC) service based on the Grizzly NIO framework, part of the GlassFish application server. This ONC RPC service is the key component of dCache's NFSv4.1 implementation, but is independent of dCache and available for other projects. We will also show some details of the dCache NFSv4.1 implementation, describe some of the Java NIO techniques used and, finally, present details of our performance e...

  14. Implementing Run-time Evaluation of Distributed Timing Constraints in a Micro Kernel

    DEFF Research Database (Denmark)

    Kristensen, C.H.; Drejer, N.; Nielsen, Jens Frederik Dalsgaard

    In the present paper we describe a solution to the problem of implementing time-optimal evaluation of timing constraints in distributed real-time systems.

  15. Implementing accountability for reasonableness framework at district level in Tanzania: a realist evaluation

    Directory of Open Access Journals (Sweden)

    Ndawi Benedict

    2011-02-01

    Full Text Available Abstract Background Despite the growing importance of the Accountability for Reasonableness (A4R) framework in priority setting worldwide, there is still an inadequate understanding of the processes and mechanisms underlying its influence on legitimacy and fairness, as conceived and reflected in service management processes and outcomes. As a result, the ability to draw scientifically sound lessons for the application of the framework to services and interventions is limited. This paper evaluates the experiences of implementing the A4R approach in Mbarali District, Tanzania, in order to find out how the innovation was shaped, enabled, and constrained by the interaction between contexts, mechanisms and outcomes. Methods This study draws on the principles of realist evaluation -- a largely qualitative approach, chiefly concerned with testing and refining programme theories by exploring the complex interactions of contexts, mechanisms, and outcomes. Mixed methods were used in data collection, including individual interviews, non-participant observation, and document reviews. A thematic framework approach was adopted for the data analysis. Results The study found that while the A4R approach to priority setting was helpful in strengthening transparency, accountability, stakeholder engagement, and fairness, the efforts at integrating it into the current district health system were challenging. Participatory structures under the decentralisation framework, central government's call for partnership in district-level planning and priority setting, perceived needs of stakeholders, as well as active engagement between researchers and decision makers all facilitated the adoption and implementation of the innovation. In contrast, however, limited local autonomy, low level of public awareness, unreliable and untimely funding, inadequate accountability mechanisms, and limited local resources were the major contextual factors that hampered the full

  16. Implementation of parallel processing in the basf2 framework for Belle II

    International Nuclear Information System (INIS)

    Itoh, Ryosuke; Lee, Soohyung; Katayama, N; Mineo, S; Moll, A; Kuhr, T; Heck, M

    2012-01-01

    Recent PC servers are equipped with multi-core CPUs, and it is desirable to utilize their full processing power for data analysis in large-scale HEP experiments. The software framework basf2 is being developed for use in the Belle II experiment, a new-generation B-factory experiment at KEK, and parallel event processing to utilize multi-core CPUs is part of its design for use in massive data production. The details of the implementation of parallel event processing in the basf2 framework are discussed, with a report of a preliminary performance study in realistic use on a 32-core PC server.
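    The event-parallel fan-out/fan-in pattern described above can be sketched as follows. Note that basf2 itself uses separate processes with its own event transfer machinery; a thread pool and an invented event structure are used here only to keep the sketch portable:

```python
# Illustrative event-level parallelism: events are independent, so a
# pool of workers can process them concurrently and results are
# collected back in input order.
from multiprocessing.pool import ThreadPool

def process_event(event):
    """Stand-in for running a reconstruction module chain over one event."""
    return sum(hit ** 2 for hit in event["hits"])

def run(events, nworkers=4):
    """Fan events out to workers; pool.map preserves input order."""
    with ThreadPool(nworkers) as pool:
        return pool.map(process_event, events)

results = run([{"hits": [1, 2]}, {"hits": [3, 4]}])
```

Because events carry no shared mutable state, the only coordination needed is distributing inputs and gathering outputs, which is what makes event-level parallelism attractive for production.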

  17. State-of-the-Art: Research Theoretical Framework of Information Systems Implementation Research in the Health Sector in Sub-Saharan Africa

    DEFF Research Database (Denmark)

    Tetteh, Godwin Kofi

    2014-01-01

    This study is about the state of the art of reference theories and theoretical frameworks of information systems implementation research in the health industry in the Sub-Saharan countries from a process perspective. A process – variance framework, Poole et al, (2000), Markus & Robey, (1988......) and Shaw & Jarvenpaa, (1997), is employed to examine reference theories employed in research conducted on information systems implementation in the health sector in the Sub-Saharan region and published between 2003 and 2013. Using a number of key words and searching on a number of databases, EBSCO, CSA...... the process theoretical framework to enhance our insight into successful information systems implementation in the region. We are optimistic that the process-based theoretical framework will be useful for information systems practitioners, organisational managers and researchers in the health sector...

  18. Implementation of a PETN failure model using ARIA's general chemistry framework

    Energy Technology Data Exchange (ETDEWEB)

    Hobbs, Michael L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-01-01

    We previously developed a PETN thermal decomposition model that accurately predicts thermal ignition and detonator failure [1]. This model was originally developed for CALORE [2] and required several complex user subroutines. Recently, a simplified version of the PETN decomposition model was implemented into ARIA [3] using a general chemistry framework without need for user subroutines. Detonator failure was also predicted with this new model using ENCORE. The model was simplified by 1) basing the model on moles rather than mass, 2) simplifying the thermal conductivity model, and 3) implementing ARIA’s new phase change model. This memo briefly describes the model, implementation, and validation.

  19. Foundational Report Series: Advanced Distribution Management Systems for Grid Modernization, Implementation Strategy for a Distribution Management System

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Ravindra [Argonne National Lab. (ANL), Argonne, IL (United States); Reilly, James T. [Reilly Associates, Pittston, PA (United States); Wang, Jianhui [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-03-01

    Electric distribution utilities encounter many challenges to successful deployment of Distribution Management Systems (DMSs). The key challenges are documented in this report, along with suggestions for overcoming them. This report offers a recommended list of activities for implementing a DMS. It takes a strategic approach to implementing a DMS from a project management perspective. The project management strategy covers DMS planning, procurement, design, building, testing, installation, commissioning, and system integration issues and solutions. It identifies the risks that are associated with implementation and suggests strategies for utilities to use to mitigate them or avoid them altogether. Attention is given to common barriers to successful DMS implementation. This report begins with an overview of the implementation strategy for a DMS and proceeds to put forward a basic approach for procuring hardware and software for a DMS; designing the interfaces with external corporate computing systems such as EMS, GIS, OMS, and AMI; and implementing a complete solution.

  20. Adjusting Estimates of the Expected Value of Information for Implementation: Theoretical Framework and Practical Application.

    Science.gov (United States)

    Andronis, Lazaros; Barton, Pelham M

    2016-04-01

    Value of information (VoI) calculations give the expected benefits of decision making under perfect information (EVPI) or sample information (EVSI), typically on the premise that any treatment recommendations made in light of this information will be implemented instantly and fully. This assumption is unlikely to hold in health care; evidence shows that obtaining further information typically leads to "improved" rather than "perfect" implementation. To present a method of calculating the expected value of further research that accounts for the reality of improved implementation. This work extends an existing conceptual framework by introducing additional states of the world regarding information (sample information, in addition to current and perfect information) and implementation (improved implementation, in addition to current and optimal implementation). The extension allows calculating the "implementation-adjusted" EVSI (IA-EVSI), a measure that accounts for different degrees of implementation. Calculations of implementation-adjusted estimates are illustrated under different scenarios through a stylized case study in non-small cell lung cancer. In the particular case study, the population values for EVSI and IA-EVSI were £25 million and £8 million, respectively; thus, a decision assuming perfect implementation would have overestimated the expected value of research by about £17 million. IA-EVSI was driven by the assumed time horizon and, importantly, the specified rate of change in implementation: the higher the rate, the greater the IA-EVSI and the lower the difference between IA-EVSI and EVSI. Traditionally calculated measures of population VoI rely on unrealistic assumptions about implementation. This article provides a simple framework that accounts for improved, rather than perfect, implementation and offers more realistic estimates of the expected value of research. © The Author(s) 2015.
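    The idea of scaling the value of research by an implementation uptake curve can be illustrated with a simplified calculation. All figures below are invented and the model is far cruder than the paper's framework:

```python
# Simplified implementation-adjusted population EVSI: per-person EVSI is
# scaled each year by the fraction of practice actually changed.

def population_evsi(per_person_evsi, incidence, years, uptake, discount=0.035):
    """Sum discounted per-year value; uptake(t) in [0, 1] is the proportion
    of the eligible population whose care reflects the new evidence in
    year t. uptake == 1 for all t recovers the standard (perfect) EVSI."""
    total = 0.0
    for t in range(years):
        total += per_person_evsi * incidence * uptake(t) / (1 + discount) ** t
    return total

full = population_evsi(50.0, 10_000, 10, lambda t: 1.0)               # perfect
improved = population_evsi(50.0, 10_000, 10,
                           lambda t: min(0.1 * (t + 1), 1.0))         # gradual
```

With gradual uptake the adjusted value is strictly below the perfect-implementation figure, which is the overestimation effect the abstract quantifies.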

  1. Distributed Programming via Safe Closure Passing

    Directory of Open Access Journals (Sweden)

    Philipp Haller

    2016-02-01

    Full Text Available Programming systems incorporating aspects of functional programming, e.g., higher-order functions, are becoming increasingly popular for large-scale distributed programming. New frameworks such as Apache Spark leverage functional techniques to provide high-level, declarative APIs for in-memory data analytics, often outperforming traditional "big data" frameworks like Hadoop MapReduce. However, widely-used programming models remain rather ad-hoc; aspects such as implementation trade-offs, static typing, and semantics are not yet well-understood. We present a new asynchronous programming model that has at its core several principles facilitating functional processing of distributed data. The emphasis of our model is on simplicity, performance, and expressiveness. The primary means of communication is by passing functions (closures to distributed, immutable data. To ensure safe and efficient distribution of closures, our model leverages both syntactic and type-based restrictions. We report on a prototype implementation in Scala. Finally, we present preliminary experimental results evaluating the performance impact of a static, type-based optimization of serialization.
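    The core idea, shipping only closures that can safely be serialized to distributed, immutable data, can be mimicked in a few lines. The paper enforces this statically with Scala's type system; a runtime pickling check is only an analogy:

```python
# Sketch of "safe closure passing": a wrapper around immutable data that
# rejects functions which cannot survive serialization for distribution.
import pickle

def spawn(data):
    """Freeze data; the returned apply_fn ships only serializable closures."""
    frozen = tuple(data)                    # immutable, as in the paper's model
    def apply_fn(fn):
        pickle.dumps(fn)                    # lambdas / closures over local
                                            # state fail here and are rejected
        return [fn(x) for x in frozen]
    return apply_fn

def double(x):                              # top-level functions pickle by
    return 2 * x                            # reference, so they pass the check

apply_fn = spawn([1, 2, 3])
doubled = apply_fn(double)
```

A lambda passed to `apply_fn` raises at the pickling check, crudely imitating the syntactic and type-based restrictions the paper uses to rule out unsafe captures before any data is moved.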

  2. Spatially-Distributed Cost-Effectiveness Analysis Framework to Control Phosphorus from Agricultural Diffuse Pollution.

    Directory of Open Access Journals (Sweden)

    Runzhe Geng

    Full Text Available Best management practices (BMPs) for agricultural diffuse pollution control are implemented at the field or small-watershed scale. However, quantifying the benefits of BMP implementation for receiving water quality at multiple spatial scales is an ongoing challenge. In this paper, we introduce an integrated approach that combines risk assessment (i.e., the phosphorus (P) index), model simulation techniques (Hydrological Simulation Program-FORTRAN), and a BMP placement tool at various scales to identify the optimal locations for implementing multiple BMPs and to estimate BMP effectiveness after implementation. A statistically significant decrease in nutrient discharge from watersheds is proposed to evaluate the effectiveness of BMPs strategically targeted within watersheds. Specifically, we estimate two types of cost-effectiveness curves (total pollution reduction and proportion of watersheds improved) for four allocation approaches. Selection of a "best approach" depends on the relative importance of the two types of effectiveness, which involves a value judgment based on the random/aggregated degree of BMP distribution among and within sub-watersheds. A statistical optimization framework is developed and evaluated in the Chaohe River Watershed, located in the northern mountain area of Beijing. Results show that BMP implementation significantly (p < 0.001) decreased P loss from the watershed. Remedial strategies in which BMPs were targeted to areas at high risk of P loss decreased P loads compared with strategies in which BMPs were randomly located across watersheds. Sensitivity analysis indicated that aggregated BMP placement in particular watersheds is the most cost-effective scenario to decrease P loss. The optimization approach outlined in this paper is a spatially hierarchical method for targeting nonpoint-source controls across a range of scales, from field to farm, to watersheds, to regions. Further, model estimates showed targeting at multiple scales is necessary to optimize program
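    The targeting idea, ranking candidate BMP sites by pollutant reduction per unit cost and accumulating a cost-effectiveness curve, can be sketched with a toy greedy allocation. The numbers are invented, and the paper's statistical optimization is far more elaborate:

```python
# Greedy cost-effectiveness targeting: fund the sites with the highest
# phosphorus reduction per unit cost until the budget is exhausted.

def ce_curve(candidates, budget):
    """candidates: (name, p_reduction_kg, cost) tuples.
    Returns the funded plan, total P reduction, and total spend."""
    ranked = sorted(candidates, key=lambda c: c[1] / c[2], reverse=True)
    spent, removed, plan = 0.0, 0.0, []
    for name, reduction, cost in ranked:
        if spent + cost <= budget:          # skip sites that break the budget
            spent += cost
            removed += reduction
            plan.append(name)
    return plan, removed, spent

fields = [("A", 12.0, 300.0), ("B", 8.0, 100.0), ("C", 5.0, 400.0)]
plan, removed, spent = ce_curve(fields, budget=450.0)
```

Field B is funded first despite removing less phosphorus in absolute terms, because its reduction-per-cost ratio is highest; this is the targeted-versus-random contrast the abstract reports.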

  3. FRIEND Engine Framework: a real time neurofeedback client-server system for neuroimaging studies

    Science.gov (United States)

    Basilio, Rodrigo; Garrido, Griselda J.; Sato, João R.; Hoefle, Sebastian; Melo, Bruno R. P.; Pamplona, Fabricio A.; Zahn, Roland; Moll, Jorge

    2015-01-01

    In this methods article, we present a new implementation of a recently reported FSL-integrated neurofeedback tool, the standalone version of “Functional Real-time Interactive Endogenous Neuromodulation and Decoding” (FRIEND). We will refer to this new implementation as the FRIEND Engine Framework. The framework comprises a client-server cross-platform solution for real time fMRI and fMRI/EEG neurofeedback studies, enabling flexible customization or integration of graphical interfaces, devices, and data processing. This implementation allows a fast setup of novel plug-ins and frontends, which can be shared with the user community at large. The FRIEND Engine Framework is freely distributed for non-commercial, research purposes. PMID:25688193

  4. FRIEND Engine Framework: A real time neurofeedback client-server system for neuroimaging studies

    Directory of Open Access Journals (Sweden)

    Rodrigo eBasilio

    2015-01-01

    Full Text Available In this methods article, we present a new implementation of a recently reported FSL-integrated neurofeedback tool, the standalone version of Functional Real-time Interactive Endogenous Modulation and Decoding (FRIEND. We will refer to this new implementation as the FRIEND Engine Framework. The framework comprises a client-server cross-platform solution for real time fMRI and fMRI/EEG neurofeedback studies, enabling flexible customization or integration of graphical interfaces, devices and data processing. This implementation allows a fast setup of novel plug-ins and frontends, which can be shared with the user community at large. The FRIEND Engine Framework is freely distributed for non-commercial, research purposes.

  5. Establishing a framework to implement 4D XCAT Phantom for 4D radiotherapy research

    Directory of Open Access Journals (Sweden)

    Raj K Panta

    2012-01-01

    Conclusions: An integrated computer program has been developed to generate, review, analyse, process, and export the 4D XCAT images. A framework has been established to implement the 4D XCAT phantom for 4D RT research.

  6. Understanding effects in reviews of implementation interventions using the Theoretical Domains Framework.

    Science.gov (United States)

    Little, Elizabeth A; Presseau, Justin; Eccles, Martin P

    2015-06-17

    Behavioural theory can be used to better understand the effects of behaviour change interventions targeting healthcare professional behaviour to improve quality of care. However, the explicit use of theory is rarely reported despite interventions inevitably involving at least an implicit idea of what factors to target to implement change. There is a quality of care gap in the post-fracture investigation (bone mineral density (BMD) scanning) and management (bisphosphonate prescription) of patients at risk of osteoporosis. We aimed to use the Theoretical Domains Framework (TDF) within a systematic review of interventions to improve quality of care in post-fracture investigation. Our objectives were to explore which theoretical factors the interventions in the review may have been targeting and how this might be related to the size of the effect on rates of BMD scanning and osteoporosis treatment with bisphosphonate medication. A behavioural scientist and a clinician independently coded TDF domains in intervention and control groups. Quantitative analyses explored the relationship between intervention effect size and the total number of domains targeted, as well as the number of different domains targeted. Nine randomised controlled trials (RCTs) (10 interventions) were analysed. The five theoretical domains most frequently coded as being targeted by the interventions in the review included "memory, attention and decision processes", "knowledge", "environmental context and resources", "social influences" and "beliefs about consequences". Each intervention targeted a combination of at least four of these five domains. Analyses identified an inverse relationship between both the number of times and the number of different domains coded and the effect size for BMD scanning, but not for bisphosphonate prescription, suggesting that the more domains the intervention targeted, the lower the observed effect size. When explicit use of theory to inform interventions is absent, it is possible to

  7. Cryptographically Secure Multiparty Computation and Distributed Auctions Using Homomorphic Encryption

    Directory of Open Access Journals (Sweden)

    Anunay Kulshrestha

    2017-12-01

    Full Text Available We introduce a robust framework that allows for cryptographically secure multiparty computations, such as distributed private-value auctions. The security is guaranteed by two-sided authentication of all network connections, homomorphically encrypted bids, and the publication of zero-knowledge proofs of every computation. This also allows a non-participant verifier to verify the result of any such computation using only the information broadcasted on the network by each individual bidder. Building on previous work on such systems, we design and implement an extensible framework that puts the described ideas to practice. Apart from the actual implementation of the framework, our biggest contribution is the level of protection we are able to guarantee from attacks described in previous work. In order to provide guidance to users of the library, we analyze the use of zero-knowledge proofs in ensuring the correct behavior of each node in a computation. We also describe the usage of the library to perform a private-value distributed auction, as well as the other challenges in implementing the protocol, such as auction registration and certificate distribution. Finally, we provide performance statistics on our implementation of the auction.
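    The additively homomorphic property that lets an auctioneer combine encrypted bids without seeing them can be demonstrated with a minimal Paillier sketch. The parameters below are toy-sized and deliberately insecure; a real deployment needs large keys plus the zero-knowledge proofs and authenticated channels described above:

```python
# Minimal Paillier cryptosystem with toy primes (insecure; illustration only).
import math
import random

p, q = 17, 19
n = p * q                       # public modulus
n2 = n * n
g = n + 1                       # standard generator choice g = n + 1
lam = math.lcm(p - 1, q - 1)    # private key lambda
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)   # private key mu

def enc(m):
    """Encrypt m with fresh randomness r coprime to n."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def dec(c):
    """Decrypt via L(c^lambda mod n^2) * mu mod n, with L(x) = (x-1)/n."""
    return ((pow(c, lam, n2) - 1) // n * mu) % n

# Homomorphic property: the product of ciphertexts decrypts to the sum
# of the plaintexts, so encrypted bids can be aggregated unopened.
total = dec((enc(5) * enc(7)) % n2)
```

Here `total` is 12 even though neither bid was decrypted individually, which is the primitive the auction protocol builds on.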

  8. PIRPOSAL Model of Integrative STEM Education: Conceptual and Pedagogical Framework for Classroom Implementation

    Science.gov (United States)

    Wells, John G.

    2016-01-01

    The PIRPOSAL model is both a conceptual and pedagogical framework intended for use as a pragmatic guide to classroom implementation of Integrative STEM Education. Designerly questioning prompted by a "need to know" serves as the basis for transitioning student designers within and among multiple phases while they progress toward an…

  9. A Framework for Federated Two-Factor Authentication Enabling Cost-Effective Secure Access to Distributed Cyberinfrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Ezell, Matthew A [ORNL; Rogers, Gary L [University of Tennessee, Knoxville (UTK); Peterson, Gregory D. [University of Tennessee, Knoxville (UTK)

    2012-01-01

    As cyber attacks become increasingly sophisticated, the security measures used to mitigate the risks must also increase in sophistication. One time password (OTP) systems provide strong authentication because security credentials are not reusable, thus thwarting credential replay attacks. The credential changes regularly, making brute-force attacks significantly more difficult. In high performance computing, end users may require access to resources housed at several different service provider locations. The ability to share a strong token between multiple computing resources reduces cost and complexity. The National Science Foundation (NSF) Extreme Science and Engineering Discovery Environment (XSEDE) provides access to digital resources, including supercomputers, data resources, and software tools. XSEDE will offer centralized strong authentication for services amongst service providers that leverage their own user databases and security profiles. This work implements a scalable framework built on standards to provide federated secure access to distributed cyberinfrastructure.
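
The record does not specify which OTP algorithm the federated service uses, but the standard construction for such tokens is HOTP/TOTP (RFC 4226 / RFC 6238). A short sketch makes the "credential changes regularly" property concrete:

```python
import hashlib, hmac, struct, time

def hotp(secret, counter, digits=6):
    # RFC 4226: HMAC-SHA1 over the big-endian 8-byte counter, then
    # "dynamic truncation" picks 31 bits and reduces them to `digits` digits.
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret, step=30, digits=6, now=None):
    # RFC 6238: the counter is the number of `step`-second windows since the epoch,
    # so the emitted code changes every 30 seconds and is never reusable.
    counter = int((time.time() if now is None else now) // step)
    return hotp(secret, counter, digits)

# RFC 4226 test vectors for the shared secret "12345678901234567890"
assert hotp(b"12345678901234567890", 0) == "755224"
assert hotp(b"12345678901234567890", 1) == "287082"
```

Federation then reduces to service providers validating codes against one shared token back end instead of issuing their own.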

  10. Implementing a framework for goal setting in community based stroke rehabilitation: a process evaluation.

    Science.gov (United States)

    Scobbie, Lesley; McLean, Donald; Dixon, Diane; Duncan, Edward; Wyke, Sally

    2013-05-24

Goal setting is considered 'best practice' in stroke rehabilitation; however, there is no consensus regarding the key components of goal setting interventions or how they should be optimally delivered in practice. We developed a theory-based goal setting and action planning framework (G-AP) to guide goal setting practice. G-AP has 4 stages: goal negotiation, goal setting, action planning & coping planning and appraisal & feedback. All stages are recorded in a patient-held record. In this study we examined the implementation, acceptability and perceived benefits of G-AP in one community rehabilitation team with people recovering from stroke. G-AP was implemented for 6 months with 23 stroke patients. In-depth interviews with 8 patients and 8 health professionals were analysed thematically to investigate views of its implementation, acceptability and perceived benefits. Case notes of interviewed patients were analysed descriptively to assess the fidelity of G-AP implementation. G-AP was mostly implemented according to protocol with deviations noted at the planning and appraisal and feedback stages. Each stage was felt to make a useful contribution to the overall process; however, in practice, goal negotiation and goal setting merged into one stage and the appraisal and feedback stage included an explicit decision making component. Only two issues were raised regarding G-AP's acceptability: (i) health professionals were concerned about the impact of goal non-attainment on patients' well-being (patients did not share their concerns), and (ii) some patients and health professionals found the patient-held record unhelpful. G-AP was felt to have a positive impact on patient goal attainment and professional goal setting practice. Collaborative partnerships between health professionals and patients were apparent throughout the process. G-AP has been perceived as both beneficial and broadly acceptable in one community rehabilitation team; however, implementation of novel

  11. Implementing a framework for goal setting in community based stroke rehabilitation: a process evaluation

    Science.gov (United States)

    2013-01-01

Background Goal setting is considered ‘best practice’ in stroke rehabilitation; however, there is no consensus regarding the key components of goal setting interventions or how they should be optimally delivered in practice. We developed a theory-based goal setting and action planning framework (G-AP) to guide goal setting practice. G-AP has 4 stages: goal negotiation, goal setting, action planning & coping planning and appraisal & feedback. All stages are recorded in a patient-held record. In this study we examined the implementation, acceptability and perceived benefits of G-AP in one community rehabilitation team with people recovering from stroke. Methods G-AP was implemented for 6 months with 23 stroke patients. In-depth interviews with 8 patients and 8 health professionals were analysed thematically to investigate views of its implementation, acceptability and perceived benefits. Case notes of interviewed patients were analysed descriptively to assess the fidelity of G-AP implementation. Results G-AP was mostly implemented according to protocol with deviations noted at the planning and appraisal and feedback stages. Each stage was felt to make a useful contribution to the overall process; however, in practice, goal negotiation and goal setting merged into one stage and the appraisal and feedback stage included an explicit decision making component. Only two issues were raised regarding G-AP’s acceptability: (i) health professionals were concerned about the impact of goal non-attainment on patients’ well-being (patients did not share their concerns), and (ii) some patients and health professionals found the patient-held record unhelpful. G-AP was felt to have a positive impact on patient goal attainment and professional goal setting practice. Collaborative partnerships between health professionals and patients were apparent throughout the process. Conclusions G-AP has been perceived as both beneficial and broadly acceptable in one community

  12. Implementing Distributed Operations: A Comparison of Two Deep Space Missions

    Science.gov (United States)

    Mishkin, Andrew; Larsen, Barbara

    2006-01-01

Two very different deep space exploration missions--Mars Exploration Rover and Cassini--have made use of distributed operations for their science teams. In the case of MER, the distributed operations capability was implemented only after the prime mission was completed, as the rovers continued to operate well in excess of their expected mission lifetimes; Cassini, designed for a mission of more than ten years, had planned for distributed operations from its inception. The rapid command turnaround timeline of MER, as well as many of the operations features implemented to support it, have proven to be conducive to distributed operations. These features include: a single science team leader during the tactical operations timeline, highly integrated science and engineering teams, processes and file structures designed to permit multiple team members to work in parallel to deliver sequencing products, web-based spacecraft status and planning reports for team-wide access, and near-elimination of paper products from the operations process. Additionally, MER has benefited from the initial co-location of its entire operations team, and from having a single Principal Investigator, while Cassini operations have had to reconcile multiple science teams distributed from before launch. Cassini has faced greater challenges in implementing effective distributed operations. Because extensive early planning is required to capture science opportunities on its tour and because sequence development takes significantly longer than sequence execution, multiple teams are contributing to multiple sequences concurrently. The complexity of integrating inputs from multiple teams is exacerbated by spacecraft operability issues and resource contention among the teams, each of which has their own Principal Investigator. Finally, much of the technology that MER has exploited to facilitate distributed operations was not available when the Cassini ground system was designed, although later adoption

  13. Surgical model-view-controller simulation software framework for local and collaborative applications.

    Science.gov (United States)

    Maciel, Anderson; Sankaranarayanan, Ganesh; Halic, Tansel; Arikatla, Venkata Sreekanth; Lu, Zhonghua; De, Suvranu

    2011-07-01

    Surgical simulations require haptic interactions and collaboration in a shared virtual environment. A software framework for decoupled surgical simulation based on a multi-controller and multi-viewer model-view-controller (MVC) pattern was developed and tested. A software framework for multimodal virtual environments was designed, supporting both visual interactions and haptic feedback while providing developers with an integration tool for heterogeneous architectures maintaining high performance, simplicity of implementation, and straightforward extension. The framework uses decoupled simulation with updates of over 1,000 Hz for haptics and accommodates networked simulation with delays of over 1,000 ms without performance penalty. The simulation software framework was implemented and was used to support the design of virtual reality-based surgery simulation systems. The framework supports the high level of complexity of such applications and the fast response required for interaction with haptics. The efficacy of the framework was tested by implementation of a minimally invasive surgery simulator. A decoupled simulation approach can be implemented as a framework to handle simultaneous processes of the system at the various frame rates each process requires. The framework was successfully used to develop collaborative virtual environments (VEs) involving geographically distributed users connected through a network, with the results comparable to VEs for local users.
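
The decoupled-simulation idea, letting the haptic loop run near 1,000 Hz while slower subsystems update at their own rates, can be sketched with a single master clock that steps each process at its own period. The class and rates below are illustrative, not the framework's API:

```python
class Process:
    """One decoupled subsystem stepped at its own rate (e.g. haptics vs. rendering)."""
    def __init__(self, name, hz):
        self.name = name
        self.period = 1.0 / hz
        self.next_t = 0.0
        self.ticks = 0

    def step(self):
        self.next_t += self.period
        self.ticks += 1

def run(processes, duration, dt):
    # One master clock; each process catches up to it at its own period,
    # so a 1000 Hz haptics loop and a 60 Hz graphics loop share one simulation.
    t = 0.0
    while t < duration:
        for p in processes:
            while t >= p.next_t:
                p.step()
        t += dt

haptics, graphics = Process("haptics", 1000), Process("graphics", 60)
run([haptics, graphics], duration=1.0, dt=0.0005)
assert haptics.ticks > graphics.ticks   # haptics stepped ~1000x, graphics ~60x
```

In a real system each process would run on its own thread with shared state behind a lock; the single-threaded form above only shows the rate decoupling.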

  14. Spiking Activity of a LIF Neuron in Distributed Delay Framework

    Directory of Open Access Journals (Sweden)

    Saket Kumar Choudhary

    2016-06-01

Full Text Available Evolution of membrane potential and spiking activity for a single leaky integrate-and-fire (LIF) neuron in a distributed delay framework (DDF) is investigated. DDF provides a mechanism to incorporate a memory element, in terms of a delay (kernel) function, into single-neuron models. This investigation covers the LIF neuron model with two different kinds of delay kernel functions, namely a gamma-distributed delay kernel and a hypo-exponentially distributed delay kernel. Evolution of membrane potential for the considered models is studied in terms of the stationary state probability distribution (SPD). The stationary state probability distributions of membrane potential (SPDV) for the considered neuron models are found to be asymptotically similar, and Gaussian distributed. In order to investigate the effect of membrane potential delay, a rate-code scheme for neuronal information processing is applied. Firing rate and Fano factor for the considered neuron models are calculated, and the standard LIF model is used for comparative study. It is noticed that distributed delay increases the spiking activity of a neuron. The increase in spiking activity in the DDF is larger for the hypo-exponential delay function than for the gamma delay function. Moreover, in the case of the hypo-exponential delay function, a LIF neuron generates spikes with a Fano factor less than 1.
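
When the delay kernel is a gamma density, the convolution in the membrane equation can be replaced by a short chain of auxiliary ODEs (the linear chain trick). The sketch below uses made-up parameters and a gamma kernel of order 2; it illustrates the mechanism only, not the paper's exact model or its stochastic analysis:

```python
def lif_gamma_delay(I=2.0, tau=10.0, v_th=1.0, a=0.5, dt=0.01, T=200.0):
    """LIF neuron whose input passes through a gamma(2, a) delay kernel.
    The convolution with the kernel is replaced by two auxiliary ODEs
    (the 'linear chain trick'): y1' = a(I - y1), y2' = a(y1 - y2)."""
    v = y1 = y2 = 0.0
    spikes, t = 0, 0.0
    while t < T:
        y1 += dt * a * (I - y1)
        y2 += dt * a * (y1 - y2)      # y2 is the delayed, smeared input
        v += dt * (-v / tau + y2)     # leaky integration of membrane potential
        if v >= v_th:                 # threshold crossing: emit spike, reset
            spikes += 1
            v = 0.0
        t += dt
    return spikes
```

With a stochastic input and many trials, the spike counts from such runs would feed the firing-rate and Fano-factor statistics discussed above.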

  15. A Framework for Distributed Problem Solving

    Science.gov (United States)

    Leone, Joseph; Shin, Don G.

    1989-03-01

This work explores a distributed problem solving (DPS) approach, namely the AM/AG model, to cooperative memory recall. The AM/AG model is a hierarchic social system metaphor for DPS based on Mintzberg's model of organizations. At the core of the model are information flow mechanisms, named amplification and aggregation. Amplification is a process of expounding a given task, called an agenda, into a set of subtasks with a magnified degree of specificity and distributing them to multiple processing units downward in the hierarchy. Aggregation is a process of combining the results reported from multiple processing units into a unified view, called a resolution, and promoting the conclusion upward in the hierarchy. The combination of amplification and aggregation can account for a memory recall process that relies primarily on the ability to make associations between vast amounts of related concepts, sort out the combined results, and promote the most plausible ones. The amplification process is discussed in detail, an implementation of it is presented, and the process is illustrated by an example.
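
The amplification/aggregation flow can be sketched as a fan-out of an agenda into more specific subtasks, followed by a majority-style promotion of leaf results. All names and the toy leaf "solver" below are hypothetical, not taken from the AM/AG implementation:

```python
from collections import Counter

def amplify(agenda, fanout=3, depth=2):
    # Expound the agenda into subtasks of greater specificity, level by level,
    # as if distributing them downward through the hierarchy.
    tasks = [agenda]
    for _ in range(depth):
        tasks = [f"{t}/{i}" for t in tasks for i in range(fanout)]
    return tasks

def solve(subtask):
    # Hypothetical leaf processing unit: associate the subtask with a concept.
    return "concept-A" if subtask.endswith("0") else "concept-B"

def aggregate(answers):
    # Combine leaf reports into a resolution and promote the most plausible one.
    return Counter(answers).most_common(1)[0][0]

subtasks = amplify("recall: who attended the meeting?")   # 9 leaf subtasks
resolution = aggregate(solve(s) for s in subtasks)
```

The real model's aggregation would weigh and sort associations rather than take a simple plurality, but the upward/downward flow has this shape.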

  16. Implementing Run-Time Evaluation of Distributed Timing Constraints in a Real-Time Environment

    DEFF Research Database (Denmark)

    Kristensen, C. H.; Drejer, N.

    1994-01-01

In this paper we describe a solution to the problem of implementing run-time evaluation of timing constraints in distributed real-time environments.
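
The abstract gives no implementation details, but a run-time evaluator for one common distributed timing constraint, a response deadline on a stimulus, might look like the following sketch. The event names and log format are invented for illustration:

```python
def check_deadline(events, stimulus, response, deadline):
    """Evaluate, at run time, the constraint: every `stimulus` event must be
    followed by a matching `response` event within `deadline` time units.
    `events` is an iterable of (timestamp, name) pairs."""
    pending = []        # timestamps of stimuli not yet answered
    violations = []
    for t, name in sorted(events):
        if name == stimulus:
            pending.append(t)
        elif name == response and pending:
            t0 = pending.pop(0)             # match oldest outstanding stimulus
            if t - t0 > deadline:
                violations.append((t0, t))  # answered, but too late
    violations.extend((t0, None) for t0 in pending)  # never answered at all
    return violations

log = [(0.0, "sense"), (0.8, "actuate"), (2.0, "sense"), (3.5, "actuate")]
assert check_deadline(log, "sense", "actuate", deadline=1.0) == [(2.0, 3.5)]
```

In a distributed setting the hard part, which this sketch ignores, is obtaining comparable timestamps across nodes.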

  17. DeepSpark: A Spark-Based Distributed Deep Learning Framework for Commodity Clusters

    OpenAIRE

    Kim, Hanjoo; Park, Jaehong; Jang, Jaehee; Yoon, Sungroh

    2016-01-01

    The increasing complexity of deep neural networks (DNNs) has made it challenging to exploit existing large-scale data processing pipelines for handling massive data and parameters involved in DNN training. Distributed computing platforms and GPGPU-based acceleration provide a mainstream solution to this computational challenge. In this paper, we propose DeepSpark, a distributed and parallel deep learning framework that exploits Apache Spark on commodity clusters. To support parallel operation...
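
DeepSpark's actual mechanism is asynchronous parameter exchange on top of Spark; the sketch below shows only the simpler synchronous parameter-averaging idea it builds on, simulated in-process on made-up data rather than on a cluster:

```python
import random

def local_sgd(theta, data, lr=0.05, epochs=20):
    # Each simulated worker refines its own copy of the parameter on its shard.
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (theta * x - y) * x      # gradient of (theta*x - y)^2
            theta -= lr * grad
    return theta

random.seed(0)
data = [(x, 2.0 * x) for x in (random.uniform(-1, 1) for _ in range(400))]
partitions = [data[i::4] for i in range(4)]      # 4 workers, disjoint shards

theta = 0.0
for _ in range(5):                               # synchronous exchange rounds
    replicas = [local_sgd(theta, shard) for shard in partitions]
    theta = sum(replicas) / len(replicas)        # driver averages the replicas

assert abs(theta - 2.0) < 1e-3                   # recovered y = 2x
```

On Spark, `partitions` would be an RDD and the per-shard training a `map` followed by a parameter `reduce` on the driver; making that exchange asynchronous to hide communication latency is DeepSpark's contribution.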

  18. A development framework for distributed artificial intelligence

    Science.gov (United States)

    Adler, Richard M.; Cottman, Bruce H.

    1989-01-01

    The authors describe distributed artificial intelligence (DAI) applications in which multiple organizations of agents solve multiple domain problems. They then describe work in progress on a DAI system development environment, called SOCIAL, which consists of three primary language-based components. The Knowledge Object Language defines models of knowledge representation and reasoning. The metaCourier language supplies the underlying functionality for interprocess communication and control access across heterogeneous computing environments. The metaAgents language defines models for agent organization coordination, control, and resource management. Application agents and agent organizations will be constructed by combining metaAgents and metaCourier building blocks with task-specific functionality such as diagnostic or planning reasoning. This architecture hides implementation details of communications, control, and integration in distributed processing environments, enabling application developers to concentrate on the design and functionality of the intelligent agents and agent networks themselves.

  19. Implementing Peer Learning in Clinical Education: A Framework to Address Challenges In the "Real World".

    Science.gov (United States)

    Tai, Joanna Hong Meng; Canny, Benedict J; Haines, Terry P; Molloy, Elizabeth K

    2017-01-01

    Phenomenon: Peer learning has many benefits and can assist students in gaining the educational skills required in future years when they become teachers themselves. Peer learning may be particularly useful in clinical learning environments, where students report feeling marginalized, overwhelmed, and unsupported. Educational interventions often fail in the workplace environment, as they are often conceived in the "ideal" rather than the complex, messy real world. This work sought to explore barriers and facilitators to implementing peer learning activities in a clinical curriculum. Previous peer learning research results and a matrix of empirically derived peer learning activities were presented to local clinical education experts to generate discussion around the realities of implementing such activities. Potential barriers and limitations of and strategies for implementing peer learning in clinical education were the focus of the individual interviews. Thematic analysis of the data identified three key considerations for real-world implementation of peer learning: culture, epistemic authority, and the primacy of patient-centered care. Strategies for peer learning implementation were also developed from themes within the data, focusing on developing a culture of safety in which peer learning could be undertaken, engaging both educators and students, and establishing expectations for the use of peer learning. Insights: This study identified considerations and strategies for the implementation of peer learning activities, which took into account both educator and student roles. Reported challenges were reflective of those identified within the literature. The resultant framework may aid others in anticipating implementation challenges. Further work is required to test the framework's application in other contexts and its effect on learner outcomes.

  20. Creating a Framework for Applying OAIS to Distributed Digital Preservation

    DEFF Research Database (Denmark)

    Zierau, Eld; Schultz, Matt; Skinner, Katherine

…apparatuses in order to achieve the reliable persistence of digital content. Although the use of distribution is common within the preservation field, there is not yet an accepted definition for “distributed digital preservation”. As the preservation field has matured, the term “distributed digital preservation” has been applied to myriad preservation approaches. In the white paper we define DDP as the use of replication, independence, and coordination to address the known threats to digital content through time to ensure their accessibility. The preservation field relies heavily upon an international…, delineating the various trends and practices that compel an elaboration upon OAIS, identifying the challenges ahead for advancing this endeavor, and putting forward a series of recommendations for making progress toward developing a formal framework for a DDP environment.

  1. Using the Consolidated Framework for Implementation Research to Identify Barriers and Facilitators for the Implementation of an Internet-Based Patient-Provider Communication Service in Five Settings: A Qualitative Study.

    Science.gov (United States)

    Varsi, Cecilie; Ekstedt, Mirjam; Gammon, Deede; Ruland, Cornelia M

    2015-11-18

    Although there is growing evidence of the positive effects of Internet-based patient-provider communication (IPPC) services for both patients and health care providers, their implementation into clinical practice continues to be a challenge. The 3 aims of this study were to (1) identify and compare barriers and facilitators influencing the implementation of an IPPC service in 5 hospital units using the Consolidated Framework for Implementation Research (CFIR), (2) assess the ability of the different constructs of CFIR to distinguish between high and low implementation success, and (3) compare our findings with those from other studies that used the CFIR to discriminate between high and low implementation success. This study was based on individual interviews with 10 nurses, 6 physicians, and 1 nutritionist who had used the IPPC to answer messages from patients. Of the 36 CFIR constructs, 28 were addressed in the interviews, of which 12 distinguished between high and low implementation units. Most of the distinguishing constructs were related to the inner setting domain of CFIR, indicating that institutional factors were particularly important for successful implementation. Health care providers' beliefs in the intervention as useful for themselves and their patients as well as the implementation process itself were also important. A comparison of constructs across ours and 2 other studies that also used the CFIR to discriminate between high and low implementation success showed that 24 CFIR constructs distinguished between high and low implementation units in at least 1 study; 11 constructs distinguished in 2 studies. However, only 2 constructs (patient need and resources and available resources) distinguished consistently between high and low implementation units in all 3 studies. The CFIR is a helpful framework for illuminating barriers and facilitators influencing IPPC implementation. However, CFIR's strength of being broad and comprehensive also limits its

  2. Concurrent and Distributed Applications with ActoDeS

    Directory of Open Access Journals (Sweden)

    Bergenti Federico

    2016-01-01

Full Text Available ActoDeS is a software framework for the development of large concurrent and distributed systems. The framework takes advantage of the actor model and of an implementation that simplifies the development of actor code by delegating the management of events (i.e., the reception of messages) to the execution environment. Moreover, it supports the development of scalable and efficient applications by allowing different implementations of the components that drive the execution of actors. In particular, the paper introduces the software framework and presents the results of its experimental evaluation.
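
The delegation ActoDeS describes, where user code supplies only the message handler while the runtime drives delivery, is the classic actor pattern. A minimal sketch (not the ActoDeS API, which is Java-based) in Python:

```python
import queue, threading

class Actor:
    """Minimal actor: the runtime, not user code, drives message delivery."""
    def __init__(self):
        self.mailbox = queue.Queue()
        threading.Thread(target=self._loop, daemon=True).start()

    def _loop(self):
        while True:
            msg = self.mailbox.get()
            if msg is None:          # poison pill shuts the actor down
                break
            self.receive(msg)        # runtime dispatches to user code

    def send(self, msg):
        self.mailbox.put(msg)        # asynchronous, non-blocking delivery

    def receive(self, msg):          # user code overrides only this
        raise NotImplementedError

class Accumulator(Actor):
    def __init__(self):
        self.total = 0
        self.done = threading.Event()
        super().__init__()

    def receive(self, msg):
        if msg == "report":
            self.done.set()
        else:
            self.total += msg

acc = Accumulator()
for n in (1, 2, 3):
    acc.send(n)
acc.send("report")
acc.done.wait(timeout=2)
assert acc.total == 6
```

Swapping the `queue`/`threading` machinery for other schedulers, without touching `receive`, is exactly the kind of pluggable execution component the abstract refers to.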

  3. Reducing Binge Drinking in Adolescents through Implementation of the Strategic Prevention Framework

    Science.gov (United States)

    Anderson-Carpenter, Kaston D.; Watson-Thompson, Jomella; Chaney, Lisa; Jones, Marvia

    2016-01-01

The Strategic Prevention Framework (SPF) is a conceptual model that supports coalition-driven efforts to address underage drinking and related consequences. Although the SPF has been promoted by the U.S. Substance Abuse and Mental Health Services Administration’s Center for Substance Abuse Prevention and implemented in multiple U.S. states and territories, there is limited research on the SPF’s effectiveness in improving targeted outcomes and associated influencing factors. The present quasi-experimental study examines the effects of SPF implementation on binge drinking and on enforcement of existing underage drinking laws as an influencing factor. The intervention group encompassed 11 school districts that were implementing the SPF with local prevention coalitions across eight Kansas communities. The comparison group consisted of 14 school districts that were matched based on demographic variables. The intervention districts collectively facilitated 137 community-level changes, including new or modified programs, policies, and practices. SPF implementation supported significant improvements in binge drinking and enforcement outcomes over time (p < .05). Overall, the findings provide a basis for guiding future research and community-based prevention practice in implementing and evaluating the SPF. PMID:27217310

  4. The SBIRT program matrix: a conceptual framework for program implementation and evaluation.

    Science.gov (United States)

    Del Boca, Frances K; McRee, Bonnie; Vendetti, Janice; Damon, Donna

    2017-02-01

    Screening, Brief Intervention and Referral to Treatment (SBIRT) is a comprehensive, integrated, public health approach to the delivery of services to those at risk for the adverse consequences of alcohol and other drug use, and for those with probable substance use disorders. Research on successful SBIRT implementation has lagged behind studies of efficacy and effectiveness. This paper (1) outlines a conceptual framework, the SBIRT Program Matrix, to guide implementation research and program evaluation and (2) specifies potential implementation outcomes. Overview and narrative description of the SBIRT Program Matrix. The SBIRT Program Matrix has five components, each of which includes multiple elements: SBIRT services; performance sites; provider attributes; patient/client populations; and management structure and activities. Implementation outcomes include program adoption, acceptability, appropriateness, feasibility, fidelity, costs, penetration, sustainability, service provision and grant compliance. The Screening, Brief Intervention and Referral to Treatment Program Matrix provides a template for identifying, classifying and organizing the naturally occurring commonalities and variations within and across SBIRT programs, and for investigating which variables are associated with implementation success and, ultimately, with treatment outcomes and other impacts. © 2017 Society for the Study of Addiction.

  5. A CONCEPTUAL FRAMEWORK OF DISTRIBUTIVE JUSTICE IN ISLAMIC ECONOMICS

    Directory of Open Access Journals (Sweden)

    Shafinah Begum Abdul Rahim

    2015-06-01

Full Text Available …political, behavioural and social sciences, both mainstream and Islamic. Given its increasing relevance to the global village we share, and the intensity of socio-economic problems invariably related to the distribution of resources amongst us, this work is aimed at adding value through a deeper understanding and appreciation of the justice placed by the Syariah in all domains of our economic lives. The existing works within this area appear to lean mostly towards the redistributive mechanisms available in the revealed knowledge. Hence a comprehensive analysis of the notion of distributive justice, from the theoretical level translated into practical terms, is expected to contribute significantly to policymakers committed to finding permanent solutions to economic problems, especially in the Muslim world. It is a modest yet serious attempt to bridge the gap between distributive justice in letter and in spirit, as clearly ordained in the Holy Quran. The entire analysis is based on critical reviews and appraisals of all the relevant literature on distributive justice in Islamic economics. The final product is a conceptual framework that can be used as a blueprint for establishing the notion of justice in the distribution of economic resources, i.e. income and wealth, as aspired to by the Syariah.

  6. Normalisation process theory: a framework for developing, evaluating and implementing complex interventions

    LENUS (Irish Health Repository)

    Murray, Elizabeth

    2010-10-20

    Abstract Background The past decade has seen considerable interest in the development and evaluation of complex interventions to improve health. Such interventions can only have a significant impact on health and health care if they are shown to be effective when tested, are capable of being widely implemented and can be normalised into routine practice. To date, there is still a problematic gap between research and implementation. The Normalisation Process Theory (NPT) addresses the factors needed for successful implementation and integration of interventions into routine work (normalisation). Discussion In this paper, we suggest that the NPT can act as a sensitising tool, enabling researchers to think through issues of implementation while designing a complex intervention and its evaluation. The need to ensure trial procedures that are feasible and compatible with clinical practice is not limited to trials of complex interventions, and NPT may improve trial design by highlighting potential problems with recruitment or data collection, as well as ensuring the intervention has good implementation potential. Summary The NPT is a new theory which offers trialists a consistent framework that can be used to describe, assess and enhance implementation potential. We encourage trialists to consider using it in their next trial.

  7. Normalisation process theory: a framework for developing, evaluating and implementing complex interventions

    Directory of Open Access Journals (Sweden)

    Ong Bie

    2010-10-01

Full Text Available Abstract Background The past decade has seen considerable interest in the development and evaluation of complex interventions to improve health. Such interventions can only have a significant impact on health and health care if they are shown to be effective when tested, are capable of being widely implemented and can be normalised into routine practice. To date, there is still a problematic gap between research and implementation. The Normalisation Process Theory (NPT) addresses the factors needed for successful implementation and integration of interventions into routine work (normalisation). Discussion In this paper, we suggest that the NPT can act as a sensitising tool, enabling researchers to think through issues of implementation while designing a complex intervention and its evaluation. The need to ensure trial procedures that are feasible and compatible with clinical practice is not limited to trials of complex interventions, and NPT may improve trial design by highlighting potential problems with recruitment or data collection, as well as ensuring the intervention has good implementation potential. Summary The NPT is a new theory which offers trialists a consistent framework that can be used to describe, assess and enhance implementation potential. We encourage trialists to consider using it in their next trial.

  8. A framework for evaluating distributed control systems in nuclear power plants

    International Nuclear Information System (INIS)

    O'Donell, C.; Jiang, J.

    2004-01-01

A framework for evaluating the use of distributed control systems (DCS) in nuclear power plants (NPP) is proposed in this paper. The framework consists of advanced communication, control, hardware and software technology. This paper presents the results of an experiment using the framework test-bench and elaborates on a variety of other research possibilities. Using a hardware-in-the-loop (HIL) setup, a DeltaV M3 controller from Emerson Process is connected to a desktop NPP simulator. The industry-standard communication protocol Modbus has been selected for this study. A simplified boiler pressure control (BPC) module is created on the NPP simulator. The test-bench provides an interface between the controller and the simulator. Through software monitoring, the performance of the DCS can be evaluated. Controller access and response times over the Modbus network are observed and compared with theoretical values. The controller accomplishes its task under the specifications set out for the BPC. This novel framework allows a performance metric to be applied against different industrial controllers. (author)
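
The paper does not detail its test-bench internals, but the Modbus side of such response-time measurements can be made concrete: a read-holding-registers request is a fixed 12-byte Modbus/TCP frame, and response time is one send/receive round trip. The host, port, and register addresses below are placeholders, and the frame layout follows the public Modbus Application Protocol specification rather than anything DeltaV-specific:

```python
import socket, struct, time

def read_holding_registers_request(tx_id, unit, addr, count):
    # Modbus/TCP ADU = MBAP header (transaction id, protocol id 0,
    # remaining length, unit id) + PDU (function 0x03, start address, count)
    pdu = struct.pack(">BHH", 0x03, addr, count)
    mbap = struct.pack(">HHHB", tx_id, 0x0000, len(pdu) + 1, unit)
    return mbap + pdu

def timed_poll(host, port=502, unit=1, addr=0, count=1):
    """Measure one request/response round trip to a Modbus/TCP server."""
    req = read_holding_registers_request(1, unit, addr, count)
    with socket.create_connection((host, port), timeout=2.0) as s:
        t0 = time.perf_counter()
        s.sendall(req)
        resp = s.recv(260)              # max Modbus/TCP ADU size
        return time.perf_counter() - t0, resp

frame = read_holding_registers_request(tx_id=1, unit=1, addr=0, count=2)
# 12-byte ADU: 7-byte MBAP header followed by the 5-byte read-request PDU
assert frame == bytes.fromhex("000100000006010300000002")
```

Repeating `timed_poll` and aggregating the timings would yield the observed response-time distribution compared against theoretical values in the paper.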

  9. The development of an implementation framework for service-learning during the undergraduate nursing programme in the Western Cape Province

    Directory of Open Access Journals (Sweden)

    Hester Julie

    2015-11-01

Full Text Available Background: Service-learning (SL) is a contested field of knowledge, and issues of sustainability and scholarship have been raised about it. The South African Higher Education Quality Committee (HEQC) has provided policy documents to guide higher education institutions (HEIs) in the facilitation of SL institutionalisation in their academic programmes. An implementation framework was therefore needed to institutionalise the necessary epistemological shifts advocated in the national SL policy guidelines. Objectives: This article is based on the findings of a doctoral thesis that aimed at developing an SL implementation framework for the School of Nursing (SoN) at the University of the Western Cape (UWC). Method: Mixed methods were used during the first four phases of the design and development intervention research model developed by Rothman and Thomas. Results: The SL implementation framework that was developed during Phase 3 specified the intervention elements to address the gaps that had been identified by the core findings of Phases 1 and 2. Four intervention elements were specified for the SL implementation framework. The first intervention element focused on the assessment of readiness for SL institutionalisation. The development of SL capacity and SL scholarship was regarded as the pivotal intervention element for the other three elements: the development of a contextual SL definition, an SL pedagogical model, and a monitoring and evaluation system for SL institutionalisation. Conclusion: The SL implementation framework satisfies the goals of SL institutionalisation, namely to develop a common language and a set of principles to guide practice, and to ensure the allocation of resources in order to facilitate the SL teaching methodology. The contextualised SL definition that was formulated for the SoN contributes to the SL operationalisation discourse at the HEI.

  10. A Strategic Approach to Curriculum Design for Information Literacy in Teacher Education--Implementing an Information Literacy Conceptual Framework

    Science.gov (United States)

    Klebansky, Anna; Fraser, Sharon P.

    2013-01-01

    This paper details a conceptual framework that situates curriculum design for information literacy and lifelong learning, through a cohesive developmental information literacy based model for learning, at the core of teacher education courses at UTAS. The implementation of the framework facilitates curriculum design that systematically,…

  11. Distributed tactical reasoning framework for intelligent vehicles

    Science.gov (United States)

    Sukthankar, Rahul; Pomerleau, Dean A.; Thorpe, Chuck E.

    1998-01-01

In independent vehicle concepts for the Automated Highway System (AHS), the ability to make competent tactical-level decisions in real time is crucial. Traditional approaches to tactical reasoning typically involve the implementation of large monolithic systems, such as decision trees or finite state machines. However, as the complexity of the environment grows, unforeseen interactions between components can make modifications to such systems very challenging. For example, changing an overtaking behavior may require several non-local changes to car-following, lane changing and gap acceptance rules. This paper presents a distributed solution to the problem. PolySAPIENT consists of a collection of autonomous modules, each specializing in a particular aspect of the driving task - classified by traffic entities rather than tactical behavior. Thus, the influence of the vehicle ahead on the available actions is managed by one reasoning object, while the implications of an approaching exit are managed by another. The independent recommendations from these reasoning objects are expressed in the form of votes and vetoes over a 'tactical action space', and are resolved by a voting arbiter. This local independence enables PolySAPIENT reasoning objects to be developed independently, using a heterogeneous implementation. PolySAPIENT vehicles are implemented in the SHIVA tactical highway simulator, whose vehicles are based on the Carnegie Mellon Navlab robots.
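The vote-and-veto arbitration described in this abstract can be sketched as follows. This is a hypothetical illustration, assuming a small discrete action space and two example reasoning objects; the module names, weights, and actions are not from the paper.

```python
# Hypothetical sketch of PolySAPIENT-style vote/veto arbitration.
# Action names, thresholds, and vote weights are illustrative assumptions.

ACTIONS = ["maintain", "accelerate", "brake", "change_left", "change_right"]

def car_follow_votes(gap_m):
    """Reasoning object for the vehicle ahead: veto acceleration when close."""
    votes = {a: 0.0 for a in ACTIONS}
    if gap_m < 10:
        votes["accelerate"] = None      # None encodes a veto
        votes["brake"] = 1.0
    else:
        votes["maintain"] = 0.5
    return votes

def exit_votes(dist_to_exit_m):
    """Reasoning object for an approaching exit: favour the right lane."""
    votes = {a: 0.0 for a in ACTIONS}
    if dist_to_exit_m < 500:
        votes["change_right"] = 1.0
        votes["change_left"] = None     # veto moving away from the exit
    return votes

def arbitrate(all_votes):
    """Voting arbiter: sum votes per action, discard any vetoed action."""
    totals = {a: 0.0 for a in ACTIONS}
    vetoed = set()
    for votes in all_votes:
        for action, v in votes.items():
            if v is None:
                vetoed.add(action)
            else:
                totals[action] += v
    candidates = {a: s for a, s in totals.items() if a not in vetoed}
    return max(candidates, key=candidates.get)

choice = arbitrate([car_follow_votes(gap_m=8), exit_votes(dist_to_exit_m=300)])
print(choice)  # brake
```

Because each reasoning object only emits votes over the shared action space, new objects can be added without modifying the others, which is the local independence the abstract emphasizes.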

  12. Distributed learning process: principles of design and implementation

    Directory of Open Access Journals (Sweden)

    G. N. Boychenko

    2016-01-01

Full Text Available At the present stage, broad information and communication technologies (ICT) usage in educational practices is one of the leading trends of global education system development. This trend has led to the transformation of instructional interaction models. Scientists have developed the theory of distributed cognition (Salomon, G.; Hutchins, E.) and of distributed education and training (Fiore, S. M.; Salas, E.; Oblinger, D. G.; Barone, C. A.; Hawkins, B. L.). The educational process is based on two sub-processes, separated in time and space, of learning and teaching, which are aimed at the organization of flexible interactions between learners, teachers and educational content located in different non-centralized places. The purpose of this design research is to find a solution to the problem of formalizing distributed learning process design and realization, which is significant in instructional design. The solution to this problem should take into account the specifics of distributed interactions between team members, who become the collective subject of distributed cognition in the distributed learning process. This makes it necessary to design the roles and functions of the individual team members performing distributed educational activities. Personal educational objectives should be determined by decomposition of team objectives into the functional roles of its members, considering the personal and learning needs and interests of students. Theoretical and empirical methods used in the study: theoretical analysis of philosophical, psychological, and pedagogical literature on the issue; analysis of international standards in the e-learning domain; exploration of practical usage of distributed learning in the academic and corporate sectors; generalization, abstraction, cognitive modelling, and ontology engineering methods. The result of the research is a methodology for the design and implementation of a distributed learning process based on the competency approach. Methodology proposed by

  13. Implementation of evidence into practice for cancer-related fatigue management of hospitalized adult patients using the PARIHS framework.

    Directory of Open Access Journals (Sweden)

    Li Tian

    Full Text Available This study aimed to explore an evidence-based nursing practice model of CRF management in hospitalized adult patients using the PARIHS evidence-implementation framework as the theoretical structure to provide guidance for similar nursing practices. The implementation of guideline evidence into clinical practice was conducted on the oncology and radiotherapy wards of a university-affiliated hospital. The process of integrating the guideline into the symptom management system of cancer patients was described. The impact of the evidence implementation was evaluated from three aspects: organizational innovations and outcome measures associated with nurses and with patients pre- and post-evidence implementation. During the implementation of evidence into practice on the wards, a nursing process, health education, a quality control sheet and CRF training courses were established. Through this implementation, compliance with evidence related to CRF increased significantly on the two wards, with that of ward B being higher than that of ward A. Regarding nursing outcomes, nursing knowledge, attitude and behavior scores with respect to CRF nursing care increased substantially after its application on the two wards, and the ward B nurses' scoring was higher than that of the ward A nurses. Qualitative analysis concerning the nurses suggested that leadership, patient concern about CRF management, and the need for professional development were the main motivators of the application, whereas the shortage and mobility of nursing human resources and insufficient communication between doctors and nurses were the main barriers. Additionally, most nurses felt more professional and confident about their work. Regarding patient outcomes, patient knowledge, attitude and behavior scores regarding CRF self-management increased significantly. Patients' post-implementation CRF was alleviated compared with the pre-implementation treatment cycle. The PARIHS framework may

  14. Parallel Molecular Distributed Detection With Brownian Motion.

    Science.gov (United States)

    Rogers, Uri; Koh, Min-Sung

    2016-12-01

This paper explores the in vivo distributed detection of an undesired biological agent's (BA's) biomarkers by a group of biologically sized nanomachines in an aqueous medium under drift. The term distributed indicates that the system information relative to the BA's presence is dispersed across the collection of nanomachines, where each nanomachine possesses limited communication, computation, and movement capabilities. Using Brownian motion with drift, a probabilistic detection and optimal data fusion framework, coined molecular distributed detection, will be introduced that combines theory from both molecular communication and distributed detection. Using the optimal data fusion framework as a guide, simulation indicates that a sub-optimal fusion method exists, allowing for a significant reduction in implementation complexity while retaining BA detection accuracy.
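The contrast between optimal and sub-optimal fusion of local binary decisions can be illustrated with a minimal sketch. The per-detector probabilities and population size below are assumptions, not values from the paper; the "optimal" rule shown is the standard Chair-Varshney log-likelihood-ratio fusion for i.i.d. detectors, used here as a stand-in for the paper's framework.

```python
import math
import random

random.seed(0)

# Illustrative parameters (assumptions, not from the paper).
P_D, P_FA = 0.8, 0.1   # per-nanomachine detection / false-alarm probabilities
N = 25                 # number of nanomachines

def local_decisions(agent_present):
    """Each nanomachine makes an independent binary decision (1 = BA present)."""
    p = P_D if agent_present else P_FA
    return [1 if random.random() < p else 0 for _ in range(N)]

def majority_fusion(bits):
    """Sub-optimal but simple fusion: majority vote over local decisions."""
    return int(sum(bits) > len(bits) / 2)

def optimal_fusion(bits):
    """Chair-Varshney log-likelihood-ratio fusion for i.i.d. detectors."""
    w1 = math.log(P_D / P_FA)               # weight for a '1' report
    w0 = math.log((1 - P_D) / (1 - P_FA))   # weight for a '0' report
    return int(sum(w1 if b else w0 for b in bits) > 0)

bits = local_decisions(agent_present=True)
print(majority_fusion(bits), optimal_fusion(bits))
```

The majority vote needs no knowledge of the detector statistics, which is the kind of implementation-complexity reduction a sub-optimal rule buys.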

  15. Distributed Database Management Systems A Practical Approach

    CERN Document Server

    Rahimi, Saeed K

    2010-01-01

This book addresses issues related to managing data across a distributed database system. It is unique because it covers traditional database theory and current research, explaining the difficulties in providing a unified user interface and global data dictionary. The book gives implementers guidance on hiding discrepancies across systems and creating the illusion of a single repository for users. It also includes three sample frameworks (implemented using J2SE with JMS, J2EE, and Microsoft .NET) that readers can use to learn how to implement a distributed database management system. IT and

  16. A Transparent Framework for Evaluating the Effects of DGPV on Distribution System Costs

    Energy Technology Data Exchange (ETDEWEB)

    Horowitz, Kelsey A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mather, Barry A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ding, Fei [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Denholm, Paul L [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Palmintier, Bryan S [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-04-02

    Assessing the costs and benefits of distributed photovoltaic generators (DGPV) to the power system and electricity consumers is key to determining appropriate policies, tariff designs, and power system upgrades for the modern grid. We advance understanding of this topic by providing a transparent framework, terminology, and data set for evaluating distribution system upgrade costs, line losses, and interconnection costs as a function of DGPV penetration level.

  17. Distributed Leadership and Organizational Change: Implementation of a Teaching Performance Measure

    Science.gov (United States)

    Sloan, Tine

    2013-01-01

    This article explores leadership practice and change as evidenced in multiple data sources gathered during a self-study implementation of a teaching performance assessment. It offers promising models of distributed leadership and organizational change that can inform future program implementers and the field in general. Our experiences suggest…

  18. A Framework for Enhancing the Value of Research for Dissemination and Implementation.

    Science.gov (United States)

    Neta, Gila; Glasgow, Russell E; Carpenter, Christopher R; Grimshaw, Jeremy M; Rabin, Borsika A; Fernandez, Maria E; Brownson, Ross C

    2015-01-01

    A comprehensive guide that identifies critical evaluation and reporting elements necessary to move research into practice is needed. We propose a framework that highlights the domains required to enhance the value of dissemination and implementation research for end users. We emphasize the importance of transparent reporting on the planning phase of research in addition to delivery, evaluation, and long-term outcomes. We highlight key topics for which well-established reporting and assessment tools are underused (e.g., cost of intervention, implementation strategy, adoption) and where such tools are inadequate or lacking (e.g., context, sustainability, evolution) within the context of existing reporting guidelines. Consistent evaluation of and reporting on these issues with standardized approaches would enhance the value of research for practitioners and decision-makers.

  19. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science

    Directory of Open Access Journals (Sweden)

    Alexander Jeffery A

    2009-08-01

Full Text Available Abstract Background Many interventions found to be effective in health services research studies fail to translate into meaningful patient care outcomes across multiple contexts. Health services researchers recognize the need to evaluate not only summative outcomes but also formative outcomes to assess the extent to which implementation is effective in a specific setting, prolongs sustainability, and promotes dissemination into other settings. Many implementation theories have been published to help promote effective implementation. However, they overlap considerably in the constructs included in individual theories, and a comparison of theories reveals that each is missing important constructs included in other theories. In addition, terminology and definitions are not consistent across theories. We describe the Consolidated Framework for Implementation Research (CFIR), which offers an overarching typology to promote implementation theory development and verification about what works where and why across multiple contexts. Methods We used a snowball sampling approach to identify published theories that were evaluated to identify constructs based on strength of conceptual or empirical support for influence on implementation, consistency in definitions, alignment with our own findings, and potential for measurement. We combined constructs across published theories that had different labels but were redundant or overlapping in definition, and we parsed apart constructs that conflated underlying concepts. Results The CFIR is composed of five major domains: intervention characteristics, outer setting, inner setting, characteristics of the individuals involved, and the process of implementation. 
Eight constructs were identified related to the intervention (e.g., evidence strength and quality), four constructs were identified related to the outer setting (e.g., patient needs and resources), and 12 constructs were identified related to the inner setting (e.g., culture

  20. A Distributed Python HPC Framework: ODIN, PyTrilinos, & Seamless

    Energy Technology Data Exchange (ETDEWEB)

    Grant, Robert [Enthought, Inc., Austin, TX (United States)

    2015-11-23

    Under this grant, three significant software packages were developed or improved, all with the goal of improving the ease-of-use of HPC libraries. The first component is a Python package, named DistArray (originally named Odin), that provides a high-level interface to distributed array computing. This interface is based on the popular and widely used NumPy package and is integrated with the IPython project for enhanced interactive parallel distributed computing. The second Python package is the Distributed Array Protocol (DAP) that enables separate distributed array libraries to share arrays efficiently without copying or sending messages. If a distributed array library supports the DAP, it is then automatically able to communicate with any other library that also supports the protocol. This protocol allows DistArray to communicate with the Trilinos library via PyTrilinos, which was also enhanced during this project. A third package, PyTrilinos, was extended to support distributed structured arrays (in addition to the unstructured arrays of its original design), allow more flexible distributed arrays (i.e., the restriction to double precision data was lifted), and implement the DAP. DAP support includes both exporting the protocol so that external packages can use distributed Trilinos data structures, and importing the protocol so that PyTrilinos can work with distributed data from external packages.
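The core idea of the Distributed Array Protocol, a per-process description of a local block that another library can consume without copying, can be sketched as below. The dict layout and key names here are illustrative only; the actual protocol is specified by the DistArray project and differs in detail.

```python
# Illustrative sketch (not the real DAP spec): each "process" exports a dict
# describing its local block of a 1-D array block-distributed over ranks.

def export_block(data, rank, nprocs):
    """Export a protocol-style dict for this rank's block (hypothetical layout)."""
    n = len(data) * nprocs          # global length, assuming equal-sized blocks
    start = rank * len(data)
    return {
        "version": "0.1-sketch",    # hypothetical version tag
        "buffer": data,             # the local block itself, shared without copying
        "dim_data": ({"dist_type": "b", "size": n,
                      "start": start, "stop": start + len(data)},),
    }

def import_global_index(blocks, i):
    """Consume exported dicts from all ranks and read global element i."""
    for blk in blocks:
        d = blk["dim_data"][0]
        if d["start"] <= i < d["stop"]:
            return blk["buffer"][i - d["start"]]
    raise IndexError(i)

# Two "processes", each owning half of a global array of length 6.
blocks = [export_block([10, 11, 12], rank=0, nprocs=2),
          export_block([13, 14, 15], rank=1, nprocs=2)]
print(import_global_index(blocks, 4))  # 14
```

The point of such a protocol is exactly what the abstract describes: any consumer that understands the exported description can address the distributed data without messages or copies.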

  1. Implementing the European Marine Strategy Framework Directive: Scientific challenges and opportunities

    Science.gov (United States)

    Newton, Alice; Borja, Angel; Solidoro, Cosimo; Grégoire, Marilaure

    2015-10-01

    The Marine Strategy Framework Directive (MSFD; EC, 2008) is an ambitious European policy instrument that aims to achieve Good Environmental Status (GES) in the 5,720,000 km2 of European seas by 2020, using an Ecosystem Approach. GES is to be assessed using 11 descriptors and up to 56 indicators (European Commission, 2010), and the goal is for clean, healthy and productive seas that are the basis for marine-based development, known as Blue-Growth. The MSFD is one of many policy instruments, such as the Water Framework Directive, the Common Fisheries Policy and the Habitats Directive that, together, should result in "Healthy Oceans and Productive Ecosystems - HOPE". Researchers working together with stakeholders such as the Member States environmental agencies, the European Environmental Agency, and the Regional Sea Conventions, are to provide the scientific knowledge basis for the implementation of the MSFD. This represents both a fascinating challenge and a stimulating opportunity.

  2. An organizational framework and strategic implementation for system-level change to enhance research-based practice: QUERI Series

    Directory of Open Access Journals (Sweden)

    Mittman Brian S

    2008-05-01

Full Text Available Abstract Background The continuing gap between available evidence and current practice in health care reinforces the need for more effective solutions, in particular related to organizational context. Considerable advances have been made within the U.S. Veterans Health Administration (VA) in systematically implementing evidence into practice. These advances have been achieved through a system-level program focused on collaboration and partnerships among policy makers, clinicians, and researchers. The Quality Enhancement Research Initiative (QUERI) was created to generate research-driven initiatives that directly enhance health care quality within the VA and, simultaneously, contribute to the field of implementation science. This paradigm-shifting effort provided a natural laboratory for exploring organizational change processes. This article describes the underlying change framework and implementation strategy used to operationalize QUERI. Strategic approach to organizational change QUERI used an evidence-based organizational framework focused on three contextual elements: 1) cultural norms and values, in this case related to the role of health services researchers in evidence-based quality improvement; 2) capacity, in this case among researchers and key partners to engage in implementation research; and 3) supportive infrastructures to reinforce expectations for change and to sustain new behaviors as part of the norm. As part of a QUERI Series in Implementation Science, this article describes the framework's application in an innovative integration of health services research, policy, and clinical care delivery. Conclusion QUERI's experience and success provide a case study in organizational change. It demonstrates that progress requires a strategic, systems-based effort. QUERI's evidence-based initiative involved a deliberate cultural shift, requiring ongoing commitment in multiple forms and at multiple levels. 
VA's commitment to QUERI came in the

  3. Development of a framework towards successful implementation of e-governance initiatives in health sector in India.

    Science.gov (United States)

    Ray, Subhasis; Mukherjee, Amitava

    2007-01-01

The purpose of this paper is to explore a route map for employing efficient e-governance so that at least the existing resources and infrastructure are better utilized and deficiencies are tracked for future planning. National health is one of the most important factors in a country's economic growth, and India seems to be a victim of the vicious cycle of a poor economy and poor health conditions. A detailed study was carried out to establish India's healthcare infrastructure and its standing in e-governance initiatives. After consolidating the fact that effective e-governance can enhance the quality of healthcare service even within limited resources, the authors explored the success and failure factors of many e-governance initiatives in India and abroad. Finally, an e-governance framework is suggested based on the above factors together with the authors' own experience of implementing e-governance projects in India and abroad. The suggested framework is based on a phased implementation approach. The first phase, "Information Dissemination", is geared towards breaking the "digital divide" across three dimensions: G2Business, G2Citizen, and G2Agent. The most advanced stage is aimed at joining up healthcare information across the above three dimensions and drawing meaningful analytics out of it. The recommendations also include management of Policies, Scope, Process Reform, Infrastructure, Technology, Finance, Partnership and People for efficient implementation of such e-governance initiatives. The paper provides measures for continuous evaluation of systems as one passes through the various stages of implementation. However, the framework can be tested in a real or simulated environment to prove its worthiness. This paper can be a potential frame of reference for nation-wide e-healthcare projects not only in India but also in other developing countries. The paper also describes challenges that are most likely to be faced during implementation. 
Since the paper is practical in

  4. TMD PDFs. A Monte Carlo implementation for the sea quark distribution

    International Nuclear Information System (INIS)

    Hautmann, F.

    2012-05-01

    This article gives an introduction to transverse momentum dependent (TMD) parton distribution functions and their use in shower Monte Carlo event generators for high-energy hadron collisions, and describes recent progress in the treatment of sea quark effects within a TMD parton-shower framework.

  5. AFECS. multi-agent framework for experiment control systems

    Energy Technology Data Exchange (ETDEWEB)

    Gyurjyan, V; Abbott, D; Heyes, G; Jastrzembski, E; Timmer, C; Wolin, E [Jefferson Lab, 12000 Jefferson Ave. MS-12B3, Newport News, VA 23606 (United States)], E-mail: gurjyan@jlab.org

    2008-07-01

AFECS is a pure Java based software framework for designing and implementing distributed control systems. AFECS creates a control system environment as a collection of software agents behaving as finite state machines. These agents can represent real entities, such as hardware devices, software tasks, or control subsystems. A special control oriented ontology language (COOL), based on RDFS (Resource Definition Framework Schema), is provided for control system description as well as for agent communication. AFECS agents can be distributed over a variety of platforms. Agents communicate with their associated physical components using a range of communication protocols, including tcl-DP, cMsg (publish-subscribe communication system developed at Jefferson Lab), SNMP (simple network management protocol), EPICS channel access protocol and JDBC.

  6. AFECS. Multi-Agent Framework for Experiment Control Systems

    Energy Technology Data Exchange (ETDEWEB)

    Vardan Gyurjyan; David Abbott; William Heyes; Edward Jastrzembski; Carl Timmer; Elliott Wolin

    2008-01-23

AFECS is a pure Java based software framework for designing and implementing distributed control systems. AFECS creates a control system environment as a collection of software agents behaving as finite state machines. These agents can represent real entities, such as hardware devices, software tasks, or control subsystems. A special control oriented ontology language (COOL), based on RDFS (Resource Definition Framework Schema), is provided for control system description as well as for agent communication. AFECS agents can be distributed over a variety of platforms. Agents communicate with their associated physical components using a range of communication protocols, including tcl-DP, cMsg (publish-subscribe communication system developed at Jefferson Lab), SNMP (simple network management protocol), EPICS channel access protocol and JDBC.

  7. AFECS. multi-agent framework for experiment control systems

    International Nuclear Information System (INIS)

    Gyurjyan, V; Abbott, D; Heyes, G; Jastrzembski, E; Timmer, C; Wolin, E

    2008-01-01

AFECS is a pure Java based software framework for designing and implementing distributed control systems. AFECS creates a control system environment as a collection of software agents behaving as finite state machines. These agents can represent real entities, such as hardware devices, software tasks, or control subsystems. A special control oriented ontology language (COOL), based on RDFS (Resource Definition Framework Schema), is provided for control system description as well as for agent communication. AFECS agents can be distributed over a variety of platforms. Agents communicate with their associated physical components using a range of communication protocols, including tcl-DP, cMsg (publish-subscribe communication system developed at Jefferson Lab), SNMP (simple network management protocol), EPICS channel access protocol and JDBC.
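The "agents as finite state machines" pattern described in the AFECS abstracts above can be sketched compactly. AFECS itself is a Java framework; this Python sketch is illustrative only, and the state and event names are assumptions, not the AFECS API.

```python
# Minimal sketch of a control-system agent as a finite state machine,
# in the spirit of AFECS (states, events, and names are hypothetical).

class AgentFSM:
    # (current_state, event) -> next_state
    TRANSITIONS = {
        ("booted", "configure"): "configured",
        ("configured", "start"): "active",
        ("active", "stop"): "configured",
        ("configured", "reset"): "booted",
    }

    def __init__(self, name):
        self.name = name
        self.state = "booted"

    def fire(self, event):
        """Apply an event; illegal transitions are rejected, not silently ignored."""
        nxt = self.TRANSITIONS.get((self.state, event))
        if nxt is None:
            raise ValueError(f"{self.name}: no transition for {event!r} in state {self.state!r}")
        self.state = nxt
        return nxt

hv = AgentFSM("hv-supply")   # an agent standing in for a hardware device
hv.fire("configure")
print(hv.fire("start"))  # active
```

Because every agent exposes the same state/event interface, a supervisor can coordinate heterogeneous entities (devices, tasks, subsystems) uniformly, which is the design point the abstract makes.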

  8. Development and Implementation of a Telecommuting Evaluation Framework, and Modeling the Executive Telecommuting Adoption Process

    Science.gov (United States)

    Vora, V. P.; Mahmassani, H. S.

    2002-02-01

    This work proposes and implements a comprehensive evaluation framework to document the telecommuter, organizational, and societal impacts of telecommuting through telecommuting programs. Evaluation processes and materials within the outlined framework are also proposed and implemented. As the first component of the evaluation process, the executive survey is administered within a public sector agency. The survey data is examined through exploratory analysis and is compared to a previous survey of private sector executives. The ordinal probit, dynamic probit, and dynamic generalized ordinal probit (DGOP) models of telecommuting adoption are calibrated to identify factors which significantly influence executive adoption preferences and to test the robustness of such factors. The public sector DGOP model of executive willingness to support telecommuting under different program scenarios is compared with an equivalent private sector DGOP model. Through the telecommuting program, a case study of telecommuting travel impacts is performed to further substantiate research.

  9. Protocol design and implementation using formal methods

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Ferreira Pires, Luis; Pires, L.F.; Vissers, C.A.

    1992-01-01

    This paper reports on a number of formal methods that support correct protocol design and implementation. These methods are placed in the framework of a design methodology for distributed systems that was studied and developed within the ESPRIT II Lotosphere project (2304). The paper focuses on

  10. National Qualifications Framework For Higher Education in Turkey, and Architectural Education: Problems and Challenges of Implementation

    Directory of Open Access Journals (Sweden)

    Emel AKÖZER

    2013-01-01

Full Text Available The Council of Higher Education (CoHE) adopted the National Qualifications Framework for Higher Education in Turkey (NQF-HETR) in May 2009, as part of the Bologna reforms. In January 2010, the CoHE decided on full implementation of the NQF-HETR at institutional and program levels, and in this decision it was foreseen that the process would be completed by the end of December 2012. The NQF-HETR has been aligned both to the overarching Framework for Qualifications in the European Higher Education Area (QF-EHEA, 2005) and to the European Qualifications Framework for lifelong learning (EQF-LLL, 2008). The latter was introduced to facilitate European cooperation in education and training, in line with the goals of the European Union's (EU) Lisbon Strategy. This paper focuses on some of the problems that have become apparent during the NQF-HETR's implementation at the levels of “narrow fields of education” and architecture programs, and the challenges ahead. Following a discussion of the significance of the two European frameworks in light of the goals of the EHEA, the Education and Training 2010 work programme (ET 2010) and the strategic framework for European cooperation in education and training (ET 2020), it covers two problem areas concerning qualifications in architecture: (i) terminological and classificatory problems entailed by the NQF-HETR; and (ii) the lack of alignment between the European qualifications frameworks and the EU Directive on the Recognition of Professional Qualifications (Directive EC/2005/36), which covers seven “sectoral professions” including architecture. The paper also reviews the latest developments in the modernization of the EU Directive, which aim to support progress toward an integrated European Higher Education Area.

  11. Running ATLAS workloads within massively parallel distributed applications using Athena Multi-Process framework (AthenaMP)

    CERN Document Server

    Calafiura, Paolo; The ATLAS collaboration; Seuster, Rolf; Tsulaia, Vakhtang; van Gemmeren, Peter

    2015-01-01

AthenaMP is a multi-process version of the ATLAS reconstruction and data analysis framework Athena. By leveraging Linux fork and copy-on-write, it allows the sharing of memory pages between event processors running on the same compute node with little to no change in the application code. Originally targeted at optimizing the memory footprint of reconstruction jobs, AthenaMP has demonstrated that it can reduce the memory usage of certain configurations of ATLAS production jobs by a factor of 2. AthenaMP has also evolved to become the parallel event-processing core of the recently developed ATLAS infrastructure for fine-grained event processing (Event Service), which allows running AthenaMP inside massively parallel distributed applications on hundreds of compute nodes simultaneously. We present the architecture of AthenaMP, various strategies implemented by AthenaMP for scheduling workload to worker processes (for example: Shared Event Queue and Shared Distributor of Event Tokens) and the usage of AthenaMP in the...

  12. Running ATLAS workloads within massively parallel distributed applications using Athena Multi-Process framework (AthenaMP)

    CERN Document Server

    Calafiura, Paolo; Seuster, Rolf; Tsulaia, Vakhtang; van Gemmeren, Peter

    2015-01-01

AthenaMP is a multi-process version of the ATLAS reconstruction, simulation and data analysis framework Athena. By leveraging Linux fork and copy-on-write, it allows for sharing of memory pages between event processors running on the same compute node with little to no change in the application code. Originally targeted at optimizing the memory footprint of reconstruction jobs, AthenaMP has demonstrated that it can reduce the memory usage of certain configurations of ATLAS production jobs by a factor of 2. AthenaMP has also evolved to become the parallel event-processing core of the recently developed ATLAS infrastructure for fine-grained event processing (Event Service), which allows running AthenaMP inside massively parallel distributed applications on hundreds of compute nodes simultaneously. We present the architecture of AthenaMP, various strategies implemented by AthenaMP for scheduling workload to worker processes (for example: Shared Event Queue and Shared Distributor of Event Tokens) and the usage of Ath...
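The Shared Event Queue scheduling strategy mentioned in these abstracts can be sketched with a forking worker pool. This is not AthenaMP code: the "event processing" below is a stand-in, and the queue/worker shape is a generic illustration of the idea using Python's multiprocessing.

```python
# Sketch of a shared-event-queue worker pool (illustrative, not AthenaMP).
import multiprocessing as mp

def worker(queue, results):
    """Pull event tokens from the shared queue until a sentinel arrives."""
    while True:
        event = queue.get()
        if event is None:          # sentinel: no more events for this worker
            break
        results.put((event, event * event))  # stand-in for reconstructing the event

def run(n_events, n_workers=4):
    queue, results = mp.Queue(), mp.Queue()
    procs = [mp.Process(target=worker, args=(queue, results))
             for _ in range(n_workers)]
    for p in procs:
        p.start()                  # forked children share pages copy-on-write
    for event in range(n_events):  # the shared event queue
        queue.put(event)
    for _ in procs:                # one sentinel per worker
        queue.put(None)
    for p in procs:
        p.join()
    return sorted(results.get() for _ in range(n_events))

if __name__ == "__main__":
    out = run(20)
    print(len(out), out[0], out[-1])  # 20 (0, 0) (19, 361)
```

Pulling from a single shared queue gives dynamic load balancing across workers, which is the advantage of that strategy over statically partitioning events.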

  13. Intercomparison of Streamflow Simulations between WRF-Hydro and Hydrology Laboratory-Research Distributed Hydrologic Model Frameworks

    Science.gov (United States)

    KIM, J.; Smith, M. B.; Koren, V.; Salas, F.; Cui, Z.; Johnson, D.

    2017-12-01

The National Oceanic and Atmospheric Administration (NOAA)-National Weather Service (NWS) developed the Hydrology Laboratory-Research Distributed Hydrologic Model (HL-RDHM) framework as an initial step towards spatially distributed modeling at River Forecast Centers (RFCs). Recently, the NOAA/NWS worked with the National Center for Atmospheric Research (NCAR) to implement the National Water Model (NWM) for nationally-consistent water resources prediction. The NWM is based on the WRF-Hydro framework and is run at a 1 km spatial resolution and 1-hour time step over the contiguous United States (CONUS) and contributing areas in Canada and Mexico. In this study, we compare streamflow simulations from HL-RDHM and WRF-Hydro to observations from 279 USGS stations. For streamflow simulations, HL-RDHM is run on 4 km grids with a temporal resolution of 1 hour for a 5-year period (Water Years 2008-2012), using a priori parameters provided by NOAA-NWS. The WRF-Hydro streamflow simulations for the same time period are extracted from NCAR's retrospective run of the NWM (version 1.0) over CONUS based on 1 km grids. We choose 279 USGS stations which are relatively less affected by dams or reservoirs, in the domains of six different RFCs. We use the daily average values of simulations and observations for the convenience of comparison. The main purpose of this research is to evaluate how HL-RDHM and WRF-Hydro perform at USGS gauge stations. We compare daily time-series of observations and both simulations, and calculate the error values using a variety of error functions. Using these plots and error values, we evaluate the performances of the HL-RDHM and WRF-Hydro models. Our results show a mix of model performance across geographic regions.
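The abstract's "variety of error functions" is not enumerated; a common pair for daily streamflow comparisons is root-mean-square error (RMSE) and Nash-Sutcliffe efficiency (NSE). The sketch below uses made-up flow values, not data from the study, and the choice of these two metrics is an assumption.

```python
# Illustrative streamflow error metrics (flows are invented, not study data).
import math

def rmse(obs, sim):
    """Root-mean-square error between observed and simulated flows."""
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs))

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is perfect; <= 0 is no better than the mean."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - num / den

obs = [12.0, 15.0, 30.0, 22.0, 18.0]   # observed daily flows (m^3/s)
sim = [10.0, 16.0, 27.0, 24.0, 17.0]   # simulated daily flows (m^3/s)
print(round(rmse(obs, sim), 3), round(nse(obs, sim), 3))  # 1.949 0.903
```

Reporting both is useful because RMSE is in flow units while NSE is dimensionless, so performance can be compared across basins with very different flow magnitudes.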

  14. Implementation and validation of the condensation model for containment hydrogen distribution studies

    International Nuclear Information System (INIS)

    Ravva, Srinivasa Rao; Iyer, Kannan N.; Gupta, S.K.; Gaikwad, Avinash J.

    2014-01-01

    Highlights: • A condensation model based on diffusion was implemented in FLUENT. • Validation of the condensation model for H2 distribution studies was performed. • Multi-component diffusion is used in the present work. • An appropriate grid and turbulence model were identified. - Abstract: This paper describes the implementation of a condensation model in the CFD code FLUENT and its validation, so that it can be used to perform containment hydrogen distribution studies. In such studies, computational fluid dynamics simulations are necessary for obtaining accurate predictions. While steam condensation plays an important role, commercial CFD codes such as FLUENT do not have an in-built condensation model. Therefore, a condensation model was developed and implemented in the FLUENT code through user-defined functions (UDFs) for the sink terms in the mass, momentum, energy and species balance equations, together with the associated turbulence quantities, viz. kinetic energy and dissipation rate. The implemented model was validated against the ISP-47 test of the TOSQAN facility using the standard wall function and enhanced wall treatment approaches. The most suitable grid size and turbulence model for low-density gas (He) distribution studies are brought out in this paper.
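
The condensation model itself is not reproduced in the abstract, but diffusion-based wall condensation is commonly written as a Fick's-law flux across the near-wall cell. The sketch below is an illustrative stand-in for such a sink term, not the paper's actual UDF; all names and values are assumptions.

```python
# Hypothetical diffusion-layer estimate of wall condensation mass flux,
# in the spirit of the model described (NOT the paper's actual UDF):
# m'' = rho * D / dy * (Y_bulk - Y_wall)

def condensation_mass_flux(rho, diff_coeff, dy, y_steam_bulk, y_steam_wall):
    """Mass flux (kg/m^2/s) of steam condensing on a cold wall.

    rho          -- mixture density in the near-wall cell (kg/m^3)
    diff_coeff   -- diffusion coefficient of steam in the mixture (m^2/s)
    dy           -- distance from the cell centre to the wall (m)
    y_steam_bulk -- steam mass fraction in the cell
    y_steam_wall -- saturation mass fraction at the wall temperature
    """
    return rho * diff_coeff / dy * (y_steam_bulk - y_steam_wall)

# Condensation occurs when the bulk mass fraction exceeds the wall value.
flux = condensation_mass_flux(rho=1.2, diff_coeff=2.5e-5, dy=1e-3,
                              y_steam_bulk=0.30, y_steam_wall=0.10)
```

In a CFD code this flux, multiplied by the wall-cell face area, would feed the sink terms of the mass, energy and species equations.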

  15. Using a framework to implement large-scale innovation in medical education with the intent of achieving sustainability.

    Science.gov (United States)

    Hudson, Judith N; Farmer, Elizabeth A; Weston, Kathryn M; Bushnell, John A

    2015-01-16

    Particularly when undertaken on a large scale, implementing innovation in higher education poses many challenges. Sustaining the innovation requires early adoption of a coherent implementation strategy. Using an example from clinical education, this article describes a process used to implement a large-scale innovation with the intent of achieving sustainability. The desire to improve the effectiveness of undergraduate medical education has led to growing support for a longitudinal integrated clerkship (LIC) model. This involves a move away from the traditional clerkship of 'block rotations' with frequent changes in discipline, towards clerkships of longer duration that give students the opportunity to build sustained relationships with supervisors, mentors, colleagues and patients. A growing number of medical schools have adopted the LIC model for a small percentage of their students. At a time when increasing medical school numbers and class sizes are leading to competition for clinical supervisors, however, providing a longitudinal clerkship for an entire medical school class is a daunting challenge. This challenge is presented to illustrate the strategy used to implement sustainable large-scale innovation. A strategy to implement and build a sustainable longitudinal integrated community-based clerkship experience for all students was derived from a framework arising from Roberto and Levesque's research in business. The framework's four core processes (chartering, learning, mobilising and realigning) provided guidance in preparing and rolling out the 'whole of class' innovation. Roberto and Levesque's framework proved useful for identifying the foundations of the implementation strategy, with special emphasis on the relationship building required to implement such an ambitious initiative. Although this was innovation in a new school, it required change within the school, the wider university and the health community. Challenges encountered included some resistance to

  16. Collaborative Windows – A User Interface Concept for Distributed Collaboration

    DEFF Research Database (Denmark)

    Esbensen, Morten

    2016-01-01

    where close collaboration and frequent meetings drive the work. One way to achieve this way of working is to implement the Scrum software development framework. Implementing Scrum in a globalized context, however, requires transforming the Scrum development methods to a distributed setup and extensive use of collaboration technologies. In this dissertation, I explore how novel collaboration technologies can support closely coupled distributed work such as that in distributed Scrum. This research is based on three different studies: an ethnographic field study of distributed Scrum between Danish and Indian software...

  17. Data Distribution Service-Based Interoperability Framework for Smart Grid Testbed Infrastructure

    Directory of Open Access Journals (Sweden)

    Tarek A. Youssef

    2016-03-01

    Full Text Available This paper presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurements and control network. The advantages of utilizing the data-centric over the message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and an automatic discovery feature for dynamically participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface for the testbed infrastructure are developed in order to facilitate interoperability and remote access to the testbed. This interface allows experiments to be controlled, monitored, and performed remotely. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS.
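
The data-centric idea can be illustrated apart from any real DDS implementation: participants publish and read named data objects on a shared bus instead of addressing each other directly, and late joiners can still discover the current state. A toy in-process sketch (not the DDS API; all names here are illustrative):

```python
from collections import defaultdict

class DataBus:
    """Toy data-centric bus: publishers update keyed topics, subscribers
    register callbacks, and late joiners read the last value on arrival.
    A sketch of the idea only -- real DDS adds QoS, discovery, transport."""
    def __init__(self):
        self._last = {}                      # last value per topic (shared state)
        self._subs = defaultdict(list)       # topic -> subscriber callbacks

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)
        if topic in self._last:              # late joiner sees current state
            callback(self._last[topic])

    def publish(self, topic, sample):
        self._last[topic] = sample
        for cb in self._subs[topic]:
            cb(sample)

bus = DataBus()
readings = []
bus.publish("feeder1/voltage", 0.98)        # published before anyone subscribed
bus.subscribe("feeder1/voltage", readings.append)  # late joiner still sees 0.98
bus.publish("feeder1/voltage", 1.02)
# readings == [0.98, 1.02]
```

Because communication is keyed by topic rather than by node address, there is no single point of failure in the application-level wiring, which is the property the paper exploits.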

  18. An Open Source Extensible Smart Energy Framework

    Energy Technology Data Exchange (ETDEWEB)

    Rankin, Linda [V-Squared, Portland, OR (United States)

    2017-03-23

    Aggregated distributed energy resources are the subject of much interest in the energy industry and are expected to play an important role in meeting our future energy needs by changing how we use, distribute and generate electricity. This energy future includes an increased amount of energy from renewable resources, load management techniques to improve resiliency and reliability, and distributed energy storage and generation capabilities that can be managed to meet the needs of the grid as well as individual customers. These energy assets are commonly referred to as Distributed Energy Resources (DER). DERs rely on a means to communicate information between an energy provider and multitudes of devices. Today, DER control systems are typically vendor-specific, using custom hardware and software solutions. As a result, customers are locked into communication transport protocols, applications, tools, and data formats. Today's systems are often difficult to extend to meet new application requirements, resulting in stranded assets when business requirements or energy management models evolve. By partnering with industry advisors and researchers, a DER research platform, called the Smart Energy Framework (SEF), was developed and implemented. The hypothesis of this research was that an open source Internet of Things (IoT) framework could play a role in creating a commodity-based ecosystem for DER assets that would reduce costs and provide interoperable products. SEF is based on the AllJoyn IoT open source framework. The demonstration system incorporated DER assets, specifically batteries and smart water heaters. To verify the behavior of the distributed system, models of water heaters and batteries were also developed. An IoT interface for communicating between the assets and a control server was defined. This interface supports a series of "events" and telemetry reporting, similar to those defined by current smart grid communication standards. The results of this

  19. Design and implementation of distributed spatial computing node based on WPS

    International Nuclear Information System (INIS)

    Liu, Liping; Li, Guoqing; Xie, Jibo

    2014-01-01

    Currently, research on SIG (Spatial Information Grid) technology mostly emphasizes spatial data sharing in grid environments, while the importance of spatial computing resources is ignored. In order to implement the sharing and cooperation of spatial computing resources in a grid environment, this paper systematically researches the key technologies for constructing a Spatial Computing Node based on the WPS (Web Processing Service) specification by the OGC (Open Geospatial Consortium). A framework for the Spatial Computing Node is designed according to the features of spatial computing resources. Finally, a prototype of the Spatial Computing Node is implemented and the relevant verification work in this environment is completed.

  20. A Scalable Distribution Network Risk Evaluation Framework via Symbolic Dynamics

    Science.gov (United States)

    Yuan, Kai; Liu, Jian; Liu, Kaipei; Tan, Tianyuan

    2015-01-01

    Background Evaluations of electric power distribution network risks must address the problems of incomplete information and changing dynamics. A risk evaluation framework should be adaptable to a specific situation and an evolving understanding of risk. Methods This study investigates the use of symbolic dynamics to abstract raw data. After introducing symbolic dynamics operators, Kolmogorov-Sinai entropy and Kullback-Leibler relative entropy are used to quantitatively evaluate relationships between risk sub-factors and main factors. For layered risk indicators, where the factors are categorized into four main factors – device, structure, load and special operation – a merging algorithm using operators to calculate the risk factors is discussed. Finally, an example from the Sanya Power Company is given to demonstrate the feasibility of the proposed method. Conclusion Distribution networks are exposed and can be affected by many things. The topology and the operating mode of a distribution network are dynamic, so the faults and their consequences are probabilistic. PMID:25789859
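
Of the two entropies the abstract names, the Kullback-Leibler relative entropy is straightforward to compute once risk data have been symbolised into discrete distributions. A minimal sketch with hypothetical symbol frequencies (the paper's actual operators and data are not reproduced here):

```python
from math import log

def kl_divergence(p, q):
    """Kullback-Leibler relative entropy D(p || q) in bits between two
    discrete symbol distributions (dicts: symbol -> probability).
    Assumes q[s] > 0 wherever p[s] > 0."""
    return sum(ps * log(ps / q[s], 2) for s, ps in p.items() if ps > 0)

# Hypothetical symbolised risk data: frequencies of symbols low/medium/high
# for a risk sub-factor and its main factor.
sub_factor  = {"L": 0.7, "M": 0.2, "H": 0.1}
main_factor = {"L": 0.5, "M": 0.3, "H": 0.2}
d = kl_divergence(sub_factor, main_factor)   # smaller -> more closely related
```

A small divergence suggests the sub-factor's symbolic dynamics track the main factor closely, which is how such a quantity can rank the relationships between indicators.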

  1. A portable implementation of ARPACK for distributed memory parallel architectures

    Energy Technology Data Exchange (ETDEWEB)

    Maschhoff, K.J.; Sorensen, D.C.

    1996-12-31

    ARPACK is a package of Fortran 77 subroutines which implement the Implicitly Restarted Arnoldi Method used for solving large sparse eigenvalue problems. A parallel implementation of ARPACK is presented which is portable across a wide range of distributed memory platforms and requires minimal changes to the serial code. The communication layers used for message passing are the Basic Linear Algebra Communication Subprograms (BLACS) developed for the ScaLAPACK project and the Message Passing Interface (MPI).
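
ARPACK's Implicitly Restarted Arnoldi Method is far more sophisticated than can be shown here, but its access pattern, in which the caller supplies only matrix-vector products through reverse communication, can be illustrated with the simplest Krylov-style iteration. A hedged sketch, not ARPACK itself:

```python
def power_iteration(matvec, n, iters=200):
    """Estimate the dominant eigenvalue of an n x n operator given only a
    matrix-vector product -- the same access pattern ARPACK's reverse
    communication interface requires. Far simpler than the Implicitly
    Restarted Arnoldi Method, but it illustrates the idea."""
    v = [1.0] * n
    lam = 0.0
    for _ in range(iters):
        w = matvec(v)
        lam = max(abs(x) for x in w)          # infinity-norm eigenvalue estimate
        v = [x / lam for x in w]              # renormalise the iterate
    return lam, v

# Example: diagonal operator diag(1, 2, 3); the dominant eigenvalue is 3.
diag = [1.0, 2.0, 3.0]
lam, _ = power_iteration(lambda v: [d * x for d, x in zip(diag, v)], 3)
```

Because the operator appears only as a callback, distributing the computation reduces to distributing the matrix-vector product, which is exactly why the parallel ARPACK needed minimal changes to the serial code.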

  2. Implementation and Evaluation of Technology Mentoring Program Developed for Teacher Educators: A 6M-Framework

    Directory of Open Access Journals (Sweden)

    Selim Gunuc

    2015-06-01

    Full Text Available The purpose of this basic research is to determine the problems experienced in the Technology Mentoring Program (TMP), and the study discusses how these problems affect the process in general. The implementation was carried out with teacher educators in the education faculty. Eight doctoral students (mentors) provided technology mentoring for one academic term to 9 teacher educators (mentees) employed in the Education Faculty. The data were collected via the mentee and the mentor interview forms, mentor reflections and organization meeting reflections. As a result, the problems based on the mentor, on the mentee and on the organization/institution were determined. In order to carry out TMP more effectively and successfully, a 6M-framework (Modifying, Meeting, Matching, Managing, Mentoring, Monitoring) was suggested within the scope of this study. It could be stated that fewer problems will be encountered and that the process will be carried out more effectively and successfully when the structure in this framework is taken into consideration.

  3. A Novel Implementation Strategy in Residential Care Settings to Promote EBP: Direct Care Provider Perceptions and Development of a Conceptual Framework.

    Science.gov (United States)

    Slaughter, Susan E; Bampton, Erin; Erin, Daniel F; Ickert, Carla; Jones, C Allyson; Estabrooks, Carole A

    2017-06-01

    Innovative approaches are required to facilitate the adoption and sustainability of evidence-based care practices. We propose a novel implementation strategy, a peer reminder role, which involves offering a brief formal reminder to peers during structured unit meetings. This study aims to (a) identify healthcare aide (HCA) perceptions of a peer reminder role for HCAs, and (b) develop a conceptual framework for the role based on these perceptions. In 2013, a qualitative focus group study was conducted in five purposively sampled residential care facilities in western Canada. A convenience sample of 24 HCAs agreed to participate in five focus groups. Concurrent with data collection, two researchers coded the transcripts and identified themes by consensus. They jointly determined when saturation was achieved and took steps to optimize the trustworthiness of the findings. Five HCAs from the original focus groups commented on the resulting conceptual framework. HCAs were cautious about accepting a role that might alienate them from their co-workers. They emphasized feeling comfortable with the peer reminder role and identified circumstances that would optimize their comfort including: effective implementation strategies, perceptions of the role, role credibility and a supportive context. These intersecting themes formed a peer reminder conceptual framework. We identified HCAs' perspectives of a new peer reminder role designed specifically for them. Based on their perceptions, a conceptual framework was developed to guide the implementation of a peer reminder role for HCAs. This role may be a strategic implementation strategy to optimize the sustainability of new practices in residential care settings, and the related framework could offer guidance on how to implement this role. © 2017 Sigma Theta Tau International.

  4. Capataz: a framework for distributing algorithms via the World Wide Web

    Directory of Open Access Journals (Sweden)

    Gonzalo J. Martínez

    2015-08-01

    Full Text Available In recent years, some scientists have embraced the distributed computing paradigm. As experiments and simulations demand ever more computing power, coordinating the efforts of many different processors is often the only reasonable resort. We developed an open-source distributed computing framework based on web technologies, and named it Capataz. Acting as an HTTP server, it lets web browsers running on many different devices connect and contribute to the execution of distributed algorithms written in JavaScript. Capataz takes advantage of architectures with many cores using web workers. This paper presents an improvement in Capataz's usability and why it was needed. In previous experiments, the total time of distributed algorithms proved to be susceptible to changes in the execution time of the jobs. The system now adapts by bundling jobs together if they are too simple. The computational experiment to test the solution is a brute force estimation of pi. The benchmark results show that by bundling jobs, the overall performance is greatly increased.
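
The bundling idea, merging jobs that are individually too cheap to justify their per-job dispatch overhead, can be sketched independently of Capataz's JavaScript implementation. The illustration below uses the same benchmark, a brute-force Monte Carlo estimation of pi; all function names are assumptions:

```python
import random

def make_jobs(total_samples, n_jobs):
    """Split a Monte Carlo pi estimation into equally sized jobs."""
    per_job = total_samples // n_jobs
    return [per_job] * n_jobs

def bundle(jobs, min_samples):
    """Merge jobs until each bundle carries at least min_samples of work,
    mimicking (in spirit) Capataz's bundling of too-simple jobs so that
    per-job overhead stops dominating total runtime."""
    bundles, current = [], 0
    for j in jobs:
        current += j
        if current >= min_samples:
            bundles.append(current)
            current = 0
    if current:
        bundles.append(current)
    return bundles

def run_job(samples, rng):
    """Count random points falling inside the unit quarter circle."""
    return sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
               for _ in range(samples))

rng = random.Random(42)
jobs = bundle(make_jobs(40000, 400), min_samples=1000)  # 400 tiny jobs -> 40 bundles
hits = sum(run_job(s, rng) for s in jobs)
pi_estimate = 4.0 * hits / 40000
```

With bundling, a browser fetches 40 substantial work units instead of 400 trivial ones, so the HTTP round-trip cost per unit of useful work drops by an order of magnitude.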

  5. Distributed Processing System for Restoration of Electric Power Distribution Network Using Two-Layered Contract Net Protocol

    Science.gov (United States)

    Kodama, Yu; Hamagami, Tomoki

    A distributed processing system for restoration of electric power distribution networks using a two-layered CNP is proposed. The goal of this study is to develop a restoration system suited to the future power network with distributed generators. The state of the art of this study is that the two-layered CNP is applied to a distributed computing environment in practical use. The two-layered CNP has two classes of agents in the network, named field agents and operating agents. In order to avoid conflicts between tasks, the operating agent controls the privilege for managers to send task announcement messages in CNP. This technique realizes coordination between agents which work asynchronously in parallel with others. Moreover, this study implements the distributed processing system using a de facto standard multi-agent framework, JADE (Java Agent DEvelopment framework). This study conducts simulation experiments of power distribution network restoration and compares the proposed system with the previous system. The results confirm the effectiveness of the proposed system.
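
A single round of the Contract Net Protocol (task announcement, bidding, award) can be sketched briefly. This is a one-layer illustration in Python rather than JADE; the paper's two-layered variant additionally has an operating agent grant announcement privileges to managers to avoid task conflicts. All names and the cost metric are assumptions:

```python
class FieldAgent:
    """Bids on restoration tasks; the cost here is a toy distance metric."""
    def __init__(self, name, position):
        self.name, self.position = name, position

    def bid(self, task_position):
        return abs(self.position - task_position)

def contract_net(task_position, agents):
    """One CNP round: the manager announces a task, collects bids,
    and awards the contract to the cheapest bidder."""
    bids = {a.name: a.bid(task_position) for a in agents}   # announcement + bids
    winner = min(bids, key=bids.get)                        # award
    return winner, bids

agents = [FieldAgent("feeder_A", 2), FieldAgent("feeder_B", 7),
          FieldAgent("feeder_C", 3)]
winner, bids = contract_net(task_position=6, agents=agents)
# winner == "feeder_B" (distance 1)
```

In a real restoration scenario the "task" would be re-energising an isolated section and the "bid" would reflect switching cost and spare capacity, but the announce-bid-award structure is the same.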

  6. A Semantics for Distributed Execution of Statemate

    DEFF Research Database (Denmark)

    Fränzle, Martin; Niehaus, Jürgen; Metzner, Alexander

    2003-01-01

    We present a semantics for the statechart variant implemented in the Statemate product of i-Logix. Our semantics enables distributed code generation for Statemate models in the context of rapid prototyping for embedded control applications. We argue that it seems impossible to efficiently generate distributed code using the original Statemate semantics. The new, distributed semantics has the advantages that, first, it enables the generation of efficient distributed code; second, it preserves many aspects of the original semantics for those parts of a model that are not distributed; and third, the changes made regarding the interaction of distributed model parts are similar to the interaction between the model and its environment in the original semantics, thus giving designers a familiar execution model. The semantics has been implemented in Grace, a framework for rapid prototyping code generation...

  7. Using the "customer service framework" to successfully implement patient- and family-centered care.

    Science.gov (United States)

    Rangachari, Pavani; Bhat, Anita; Seol, Yoon-Ho

    2011-01-01

    Despite the growing momentum toward patient- and family-centered care at the federal policy level, the organizational literature remains divided on its effectiveness, especially in regard to its key dimension of involving patients and families in treatment decisions and safety practices. Although some have argued for the universal adoption of patient involvement, others have questioned both the effectiveness and feasibility of patient involvement. In this article, we apply a well-established theoretical perspective, the Service Quality Model (SQM), also known as the "customer service framework," to the health care context to reconcile the debate related to patient involvement. The application helps support the case for universal adoption of patient involvement and also questions the arguments against it. A key contribution of the SQM lies in highlighting a set of fundamental service quality determinants emanating from basic consumer service needs. It also provides a simple framework for understanding how gaps between consumer expectations and management perceptions of those expectations can affect the gap between "expected" and "perceived" service quality from a consumer's perspective. Simultaneously, the SQM outlines "management requirements" for the successful implementation of a customer service strategy. Applying the SQM to the health care context therefore, in addition to reconciling the debate on patient involvement, helps identify specific steps health care managers could take to successfully implement patient- and family-centered care. Correspondingly, the application also provides insights into strategies for the successful implementation of policy recommendations related to patient- and family-centered care in health care organizations.

  8. ClimateSpark: An in-memory distributed computing framework for big climate data analytics

    Science.gov (United States)

    Hu, Fei; Yang, Chaowei; Schnase, John L.; Duffy, Daniel Q.; Xu, Mengchao; Bowen, Michael K.; Lee, Tsengdar; Song, Weiwei

    2018-06-01

    The unprecedented growth of climate data creates new opportunities for climate studies, and yet big climate data pose a grand challenge to climatologists who must efficiently manage and analyze them. The complexity of climate data content and analytical algorithms increases the difficulty of implementing algorithms on high performance computing systems. This paper proposes an in-memory, distributed computing framework, ClimateSpark, to facilitate complex big data analytics and time-consuming computational tasks. A chunked data structure improves parallel I/O efficiency, while a spatiotemporal index built over the chunks avoids unnecessary data reading and preprocessing. An integrated, multi-dimensional, array-based data model (ClimateRDD) and ETL operations are developed to address big climate data variety by integrating the processing components of the climate data lifecycle. ClimateSpark utilizes Spark SQL and Apache Zeppelin to develop a web portal to facilitate the interaction among climatologists, climate data, analytic operations and computing resources (e.g., via SQL queries and Scala/Python notebooks). Experimental results show that ClimateSpark conducts different spatiotemporal data queries/analytics with high efficiency and data locality. ClimateSpark is easily adaptable to other big multi-dimensional, array-based datasets in various geoscience domains.
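
The chunk-level spatiotemporal index can be illustrated in miniature: if each chunk of a climate array carries its bounding box and time span as metadata, a query can discard chunks before reading any bytes. A hypothetical sketch (names and layout are illustrative, not ClimateSpark's actual structures):

```python
from dataclasses import dataclass

@dataclass
class Chunk:
    lat: tuple      # (min_lat, max_lat)
    lon: tuple      # (min_lon, max_lon)
    time: tuple     # (start_day, end_day)
    path: str       # where the chunk's bytes live

def overlaps(a, b):
    """Do two closed intervals intersect?"""
    return a[0] <= b[1] and b[0] <= a[1]

def select_chunks(index, lat, lon, time):
    """Return only chunks whose metadata intersects the query box,
    so all other chunks are skipped without any I/O."""
    return [c for c in index
            if overlaps(c.lat, lat) and overlaps(c.lon, lon)
            and overlaps(c.time, time)]

index = [
    Chunk((0, 30), (0, 30), (0, 99), "c0"),
    Chunk((30, 60), (0, 30), (0, 99), "c1"),
    Chunk((0, 30), (0, 30), (100, 199), "c2"),
]
hits = select_chunks(index, lat=(10, 20), lon=(5, 25), time=(50, 120))
# only chunks "c0" and "c2" need to be read
```

In a Spark setting the same metadata also drives data locality: tasks can be scheduled on the nodes that already hold the surviving chunks.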

  9. Improving district level health planning and priority setting in Tanzania through implementing accountability for reasonableness framework

    DEFF Research Database (Denmark)

    Maluka, Stephen; Kamuzora, Peter; Sebastián, Miguel San

    2010-01-01

    In 2006, researchers and decision-makers launched a five-year project - Response to Accountable Priority Setting for Trust in Health Systems (REACT) - to improve planning and priority-setting through implementing the Accountability for Reasonableness framework in Mbarali District, Tanzania...

  10. Agent-Based Data Integration Framework

    Directory of Open Access Journals (Sweden)

    Łukasz Faber

    2014-01-01

    Full Text Available Combining data from diverse, heterogeneous sources while facilitating unified access to it is an important (albeit difficult) task. There are various possibilities for performing it. In this publication, we propose and describe an agent-based framework dedicated to acquiring and processing distributed, heterogeneous data collected from diverse sources (e.g., the Internet, external software, relational and document databases. Using this multi-agent-based approach in the general architecture (the organization and management of the framework, we create a proof-of-concept implementation. The approach is presented using a sample scenario in which the system is used to search for personal and professional profiles of scientists.

  11. Deconstructing public participation in the Water Framework Directive: implementation and compliance with the letter or with the spirit of the law

    NARCIS (Netherlands)

    Ker Rault, P.A.; Jeffrey, P.J.

    2008-01-01

    This article offers a fresh reading of the Water Framework Directive (WFD) and of the Common Implementation Strategy guidance document number 8 on public participation (PP) aimed at identifying the conditions required for successful implementation. We propose that a central barrier to implementing

  12. A benchmarking program to reduce red blood cell outdating: implementation, evaluation, and a conceptual framework.

    Science.gov (United States)

    Barty, Rebecca L; Gagliardi, Kathleen; Owens, Wendy; Lauzon, Deborah; Scheuermann, Sheena; Liu, Yang; Wang, Grace; Pai, Menaka; Heddle, Nancy M

    2015-07-01

    Benchmarking is a quality improvement tool that compares an organization's performance to that of its peers for selected indicators, to improve practice. Processes to develop evidence-based benchmarks for red blood cell (RBC) outdating in Ontario hospitals, based on RBC hospital disposition data from Canadian Blood Services, have been previously reported. These benchmarks were implemented in 160 hospitals provincewide with a multifaceted approach, which included hospital education, inventory management tools and resources, summaries of best practice recommendations, recognition of high-performing sites, and audit tools on the Transfusion Ontario website (http://transfusionontario.org). In this study we describe the implementation process and the impact of the benchmarking program on RBC outdating. A conceptual framework for continuous quality improvement of a benchmarking program was also developed. The RBC outdating rate for all hospitals trended downward continuously from April 2006 to February 2012, irrespective of hospitals' transfusion rates or their distance from the blood supplier. The highest annual outdating rate was 2.82%, at the beginning of the observation period. Each year brought further reductions, with a nadir outdating rate of 1.02% achieved in 2011. The key elements of the successful benchmarking strategy included dynamic targets, a comprehensive and evidence-based implementation strategy, ongoing information sharing, and a robust data system to track information. The Ontario benchmarking program for RBC outdating resulted in continuous and sustained quality improvement. Our conceptual iterative framework for benchmarking provides a guide for institutions implementing a benchmarking program. © 2015 AABB.
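
The indicator behind the benchmarks is an outdating rate; a plausible reading of the metric (not necessarily the paper's exact formula) is the percentage of disposed RBC units that expired before transfusion:

```python
def outdating_rate(outdated_units, total_disposed_units):
    """Percentage of red blood cell units that expired before use.
    Illustrative definition only; the study's exact indicator and
    hospital figures are not reproduced here."""
    return 100.0 * outdated_units / total_disposed_units

def meets_benchmark(rate, benchmark):
    """True when a hospital's rate is at or below its benchmark target."""
    return rate <= benchmark

# Hypothetical hospital: 50 outdates among 4,900 disposed units,
# compared against the 2.82% rate reported at the start of the program.
rate = outdating_rate(50, 4900)
ok = meets_benchmark(rate, benchmark=2.82)
```

Such a per-hospital rate is what a benchmarking program tracks over time and compares against dynamic, peer-derived targets.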

  13. Programming model for distributed intelligent systems

    Science.gov (United States)

    Sztipanovits, J.; Biegl, C.; Karsai, G.; Bogunovic, N.; Purves, B.; Williams, R.; Christiansen, T.

    1988-01-01

    A programming model and architecture which was developed for the design and implementation of complex, heterogeneous measurement and control systems is described. The Multigraph Architecture integrates artificial intelligence techniques with conventional software technologies, offers a unified framework for distributed and shared memory based parallel computational models and supports multiple programming paradigms. The system can be implemented on different hardware architectures and can be adapted to strongly different applications.

  14. Parton distributions with QED corrections

    NARCIS (Netherlands)

    Collaboration, The NNPDF; Ball, Richard D.; Bertone, Valerio; Carrazza, Stefano; Debbio, Luigi Del; Forte, Stefano; Guffanti, Alberto; Hartland, Nathan P.; Rojo, Juan

    2013-01-01

    We present a set of parton distribution functions (PDFs), based on the NNPDF2.3 set, which includes a photon PDF, and QED contributions to parton evolution. We describe the implementation of the combined QCD+QED evolution in the NNPDF framework. We then provide a first determination of the full set

  15. A Distributed Garbage Collector for NeXeme

    OpenAIRE

    Moreau, Luc; DeRoure, David

    1997-01-01

    The remote service request, a form of remote procedure call, and the global pointer, a global naming mechanism, are two features at the heart of Nexus, a library for building distributed systems. NeXeme is an extension of Scheme that fully integrates both concepts in a mostly-functional framework. This short paper describes the distributed garbage collector that we implemented in NeXeme.

  16. Using a knowledge translation framework to implement asthma clinical practice guidelines in primary care

    Science.gov (United States)

    Licskai, Christopher; Sands, Todd; Ong, Michael; Paolatto, Lisa; Nicoletti, Ivan

    2012-01-01

    Quality problem International guidelines establish evidence-based standards for asthma care; however, recommendations are often not implemented and many patients do not meet control targets. Initial assessment Regional pilot data demonstrated a knowledge-to-practice gap. Choice of solutions We engineered health system change in a multi-step approach described by the Canadian Institutes of Health Research knowledge translation framework. Implementation Knowledge translation occurred at multiple levels: patient, practice and local health system. A regional administrative infrastructure and inter-disciplinary care teams were developed. The key project deliverable was a guideline-based interdisciplinary asthma management program. Six community organizations, 33 primary care physicians and 519 patients participated. The program operating cost was $290/patient. Evaluation Six guideline-based care elements were implemented, including spirometry measurement, asthma controller therapy, a written self-management action plan and general asthma education covering the inhaler device technique, role of medications and environmental control strategies, in 93, 95, 86, 100, 97 and 87% of patients, respectively. Of the total patients 66% were adults, 61% were female, and the mean age was 35.7 (SD = ±24.2) years. At baseline 42% had two or more symptoms beyond acceptable limits vs. 17% at follow-up (P < 0.001); absenteeism (5.0 days/year) vs. 19% (3.0 days/year) (P < 0.001). The mean follow-up interval was 22 (SD = ±7) months. Lessons learned A knowledge-translation framework can guide multi-level organizational change, facilitate asthma guideline implementation, and improve health outcomes in community primary care practices. Program costs are similar to those of diabetes programs. Program savings offset costs in a ratio of 2.1:1. PMID:22893665

  17. A New Mode of European Regulation? The Implementation of the Autonomous Framework Agreement on Telework in Five Countries

    OpenAIRE

    Larsen , Trine P.; Andersen , Søren Kaj

    2007-01-01

    This article examines the implementation of the first autonomous framework agreement signed by the European social partners in a number of member states. Although the telework agreement states that it is to be implemented in accordance with national procedures and practices specific to management and labour, practice is often different. The approach adopted reflects the specific ...

  18. COMDES-II: A Component-Based Framework for Generative Development of Distributed Real-Time Control Systems

    DEFF Research Database (Denmark)

    Ke, Xu; Sierszecki, Krzysztof; Angelov, Christo K.

    2007-01-01

    The paper presents a generative development methodology and component models of COMDES-II, a component-based software framework for distributed embedded control systems with real-time constraints. The adopted methodology allows for rapid modeling and validation of control software at a higher level of abstraction. The paper presents the methodology for COMDES-II from a general perspective, describes the component models in detail and demonstrates their application through a DC-motor control system case study.

  19. Beyond Ambiguity: A Practical Framework for Developing and Implementing Open Government Reforms

    Directory of Open Access Journals (Sweden)

    Merlin Chatwin

    2017-12-01

    Full Text Available The broad idea of 'Open Government' is widely accepted as a facilitator for rebuilding trust in and validation of governments around the world. The Open Government Partnership is a significant driver of this movement, with over 75 member nations, 15 subnational government participants and many other local governments implementing reforms within their national frameworks. The central tenets of transparency, accountability, participation, and collaboration are well understood within scholarly works and practitioner publications. However, open government has yet to acquire a universally acknowledged definition. This leads to questions about the adaptability and salience of the concept of open government across diverse contexts. This paper addresses these questions by utilizing a human systems framework called the Dialogue Boxes. To develop an understanding of how open government is currently positioned within scholarly works and practitioner publications, an extensive literature search was conducted. The search utilized major search engines, often-cited references, direct journal searches and colleague-provided references. Using existing definitions and descriptions, this paper populates the framework with available information and allows context-specific content to be populated by future users. Ultimately, the aim of the paper is to support the development of open government action plans that maximize the direct positive impact on people's lives.

  20. A tale of two directories Implementing distributed shared objects in Java

    CERN Document Server

    Herlihy, M

    1999-01-01

    A directory service keeps track of the location and status of mobile objects in a distributed system. This paper describes our experience implementing two distributed directory protocols as part of the Aleph toolkit, a distributed shared object system implemented in Java. One protocol is a conventional home-based protocol, in which a fixed node keeps track of the object's location and status. The other is a novel arrow protocol, based on a simple path-reversal algorithm. We were surprised to discover that the arrow protocol outperformed the home protocol, sometimes substantially, across a range of system sizes. This paper describes a series of experiments testing whether the discrepancy is due to an artifact of the Java run-time system (such as differences in thread management or object serialization costs), or whether it is something inherent in the protocols themselves. In the end, we use insights gained from these experimental results to design a new directory protocol that usually outperforms both. (29 re...
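    The path-reversal idea behind the arrow protocol can be sketched in a few lines. The class below is an illustrative toy model (the names and structure are my own, not the Aleph toolkit's Java API): each node holds one link pointing toward the object's current owner, and a request flips every link it traverses back toward the requester.

```python
class ArrowDirectory:
    """Toy model of the arrow directory protocol (path reversal).

    Nodes form a tree; each node stores a link pointing toward the node
    currently believed to hold the object. A request travels along the
    links, and every node it visits flips its link back toward the
    requester, so the tree always ends up pointing at the newest owner.
    """

    def __init__(self, links, owner):
        # links: dict node -> next hop, oriented toward `owner`
        self.link = dict(links)
        self.link[owner] = None      # the owner's link terminates the path
        self.owner = owner

    def acquire(self, requester):
        """Route a request from `requester` to the current owner,
        reversing every arrow along the path (the core of the protocol).
        Returns the previous owner, i.e. where the object is fetched from."""
        path = [requester]
        node = requester
        while self.link[node] is not None:
            node = self.link[node]
            path.append(node)
        # reverse the arrows: each visited node now points at its predecessor
        for i in range(len(path) - 1, 0, -1):
            self.link[path[i]] = path[i - 1]
        self.link[requester] = None
        prev_owner, self.owner = self.owner, requester
        return prev_owner
```

Note there is no fixed "home" node: after each acquisition the tree is rooted at the new owner, which is why concurrent requests splice into the path locally instead of contending at one node.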

  1. Implementing and Investigating Distributed Leadership in a National University Network--SaMnet

    Science.gov (United States)

    Sharma, Manjula D.; Rifkin, Will; Tzioumis, Vicky; Hill, Matthew; Johnson, Elizabeth; Varsavsky, Cristina; Jones, Susan; Beames, Stephanie; Crampton, Andrea; Zadnik, Marjan; Pyke, Simon

    2017-01-01

    The literature suggests that collaborative approaches to leadership, such as distributed leadership, are essential for supporting educational innovators in leading change in teaching in universities. This paper briefly describes the array of activities, processes and resources to support distributed leadership in the implementation of a network,…

  2. Development of a Dynamically Configurable,Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation

    Science.gov (United States)

    Afjeh, Abdollah A.; Reed, John A.

    2003-01-01

    This research is aimed at developing a new and advanced simulation framework that will significantly improve the overall efficiency of aerospace systems design and development. This objective will be accomplished through an innovative integration of object-oriented and Web-based technologies with both new and proven simulation methodologies. The basic approach involves three major areas of research: aerospace system and component representation using a hierarchical object-oriented component model, which enables the use of multimodels and enforces component interoperability; a collaborative software environment that streamlines the process of developing, sharing and integrating aerospace design and analysis models; and development of a distributed infrastructure which enables Web-based exchange of models to simplify the collaborative design process, and to support computationally intensive aerospace design and analysis processes. Research for the first year dealt with the design of the basic architecture and supporting infrastructure, an initial implementation of that design, and a demonstration of its application to an example aircraft engine system simulation.

  3. An Architectural Based Framework for the Distributed Collection, Analysis and Query from Inhomogeneous Time Series Data Sets and Wearables for Biofeedback Applications

    Directory of Open Access Journals (Sweden)

    James Lee

    2017-02-01

    Full Text Available The increasing professionalism of sports persons and the desire of consumers to imitate this has led to an increased metrification of sport. This has been driven in no small part by the widespread availability of comparatively cheap assessment technologies and, more recently, wearable technologies. Historically, whilst these have produced large data sets, often only the most rudimentary analysis has taken place (Wisbey et al. in: “Quantifying movement demands of AFL football using GPS tracking”). This paucity of analysis is due in no small part to the challenges of analysing large sets of data, often from disparate sources, to glean useful key performance indicators, which has largely been a labour-intensive process. This paper presents a framework, which can be cloud-based, for the gathering, storing and algorithmic interpretation of large and inhomogeneous time-series data sets. The framework is architecture-based and technology-agnostic in the data sources it can gather, and presents a model for multi-set analysis across and within devices and individual subjects. A sample implementation demonstrates the utility of the framework for sports performance data collected from distributed inertial sensors in the sport of swimming.
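    A core step in analysing inhomogeneous time series from multiple sensors is aligning streams sampled at different, irregular rates onto one timeline. The sketch below is my own simplification, not the paper's architecture: it aligns a secondary stream onto a base stream by nearest-neighbour timestamp matching within a tolerance, leaving gaps where no sample is close enough.

```python
def align_nearest(base, other, tolerance):
    """Align `other` (time-sorted list of (t, value)) onto the timestamps
    of `base` by nearest-neighbour lookup within `tolerance` seconds.
    Returns (t, base_value, other_value_or_None) triples; a minimal
    sketch of multi-sensor fusion for inhomogeneous sample rates."""
    out = []
    j = 0
    for t, v in base:
        # advance j while the next candidate is at least as close to t
        while j + 1 < len(other) and abs(other[j + 1][0] - t) <= abs(other[j][0] - t):
            j += 1
        ot, ov = other[j]
        out.append((t, v, ov if abs(ot - t) <= tolerance else None))
    return out
```

Because both streams are time-sorted, the cursor `j` only moves forward, so a full alignment is a single linear pass over both streams.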

  4. Efficient implementation of multidimensional fast fourier transform on a distributed-memory parallel multi-node computer

    Science.gov (United States)

    Bhanot, Gyan V [Princeton, NJ; Chen, Dong [Croton-On-Hudson, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Philip [Cortlandt Manor, NY; Steinmacher-Burow, Burkhard D [Mount Kisco, NY; Vranas, Pavlos M [Bedford Hills, NY

    2012-01-10

    The present invention is directed to a method, system and program storage device for efficiently implementing a multidimensional Fast Fourier Transform (FFT) of a multidimensional array comprising a plurality of elements initially distributed in a multi-node computer system comprising a plurality of nodes in communication over a network, comprising: distributing the plurality of elements of the array in a first dimension across the plurality of nodes of the computer system over the network to facilitate a first one-dimensional FFT; performing the first one-dimensional FFT on the elements of the array distributed at each node in the first dimension; re-distributing the one-dimensional FFT-transformed elements at each node in a second dimension via "all-to-all" distribution in random order across other nodes of the computer system over the network; and performing a second one-dimensional FFT on elements of the array re-distributed at each node in the second dimension, wherein the random order facilitates efficient utilization of the network thereby efficiently implementing the multidimensional FFT. The "all-to-all" re-distribution of array elements is further efficiently implemented in applications other than the multidimensional FFT on the distributed-memory parallel supercomputer.
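    The decomposition the patent describes — 1-D FFTs along the locally held dimension, an all-to-all redistribution, then 1-D FFTs along the next dimension — can be checked single-node with NumPy, using a transpose to stand in for the network exchange (the randomized-order all-to-all itself is exactly the part this sketch omits):

```python
import numpy as np

def fft2_by_transpose(a):
    """2-D FFT decomposed the way the patent describes: 1-D FFTs along
    the locally held rows, a redistribution (modelled here by a
    transpose, which is what the all-to-all achieves across nodes),
    then 1-D FFTs along the second dimension."""
    step1 = np.fft.fft(a, axis=1)              # 1-D FFT on each local row
    redistributed = step1.T                    # stands in for the all-to-all
    step2 = np.fft.fft(redistributed, axis=1)  # 1-D FFT on the other dimension
    return step2.T                             # back to the original layout
```

Since the 1-D FFTs along the two axes commute, the result equals `np.fft.fft2`; on a real machine each `axis=1` pass touches only node-local memory, and all network traffic is concentrated in the transpose step.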

  5. Researcher readiness for participating in community-engaged dissemination and implementation research: a conceptual framework of core competencies.

    Science.gov (United States)

    Shea, Christopher M; Young, Tiffany L; Powell, Byron J; Rohweder, Catherine; Enga, Zoe K; Scott, Jennifer E; Carter-Edwards, Lori; Corbie-Smith, Giselle

    2017-09-01

    Participating in community-engaged dissemination and implementation (CEDI) research is challenging for a variety of reasons. Currently, there is not specific guidance or a tool available for researchers to assess their readiness to conduct CEDI research. We propose a conceptual framework that identifies detailed competencies for researchers participating in CEDI and maps these competencies to domains. The framework is a necessary step toward developing a CEDI research readiness survey that measures a researcher's attitudes, willingness, and self-reported ability for acquiring the knowledge and performing the behaviors necessary for effective community engagement. The conceptual framework for CEDI competencies was developed by a team of eight faculty and staff affiliated with a university's Clinical and Translational Science Award (CTSA). The authors developed CEDI competencies by identifying the attitudes, knowledge, and behaviors necessary for carrying out commonly accepted CE principles. After collectively developing an initial list of competencies, team members individually mapped each competency to a single domain that provided the best fit. Following the individual mapping, the group held two sessions in which the sorting preferences were shared and discrepancies were discussed until consensus was reached. During this discussion, modifications to wording of competencies and domains were made as needed. The team then engaged five community stakeholders to review and modify the competencies and domains. The CEDI framework consists of 40 competencies organized into nine domains: perceived value of CE in D&I research, introspection and openness, knowledge of community characteristics, appreciation for stakeholder's experience with and attitudes toward research, preparing the partnership for collaborative decision-making, collaborative planning for the research design and goals, communication effectiveness, equitable distribution of resources and credit, and

  6. Alert Messaging in the CMS Distributed Workflow System

    International Nuclear Information System (INIS)

    Maxa, Zdenek

    2012-01-01

    WMAgent is the core component of the CMS workload management system. One of the features of this job managing platform is a configurable messaging system aimed at generating, distributing and processing alerts: short messages describing alert-worthy information or a pathological condition. Apart from the framework's sub-components running within the WMAgent instances, there is a stand-alone application collecting alerts from all WMAgent instances running across the CMS distributed computing environment. The alert framework has a versatile design that allows for receiving alert messages also from other CMS production applications, such as the PhEDEx data transfer manager. We present implementation details of the system, including its Python implementation using ZeroMQ and CouchDB message storage, as well as operational experience and future plans. Inter-operation with monitoring platforms such as Dashboard or Lemon is described.
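    A minimal sketch of the alert-envelope idea, in Python like the real system, but with illustrative field names rather than the actual WMAgent schema (the real framework ships such payloads over ZeroMQ into CouchDB; both are omitted here to keep the sketch transport-agnostic):

```python
import time

def make_alert(source, level, message, host="wmagent01"):
    """Build an alert envelope. Field names and the example host name
    are assumptions for illustration, not the WMAgent wire format."""
    return {"Source": source, "Level": level, "Details": message,
            "HostName": host, "Timestamp": time.time()}

def filter_alerts(alerts, threshold):
    """Sink-side processing: keep only alerts at or above `threshold`,
    mirroring the configurable routing the abstract describes."""
    return [a for a in alerts if a["Level"] >= threshold]
```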

  7. Improving district level health planning and priority setting in Tanzania through implementing accountability for reasonableness framework: Perceptions of stakeholders.

    Science.gov (United States)

    Maluka, Stephen; Kamuzora, Peter; San Sebastián, Miguel; Byskov, Jens; Ndawi, Benedict; Hurtig, Anna-Karin

    2010-12-01

    In 2006, researchers and decision-makers launched a five-year project - Response to Accountable Priority Setting for Trust in Health Systems (REACT) - to improve planning and priority-setting through implementing the Accountability for Reasonableness framework in Mbarali District, Tanzania. The objective of this paper is to explore the acceptability of Accountability for Reasonableness from the perspectives of the Council Health Management Team, local government officials, health workforce and members of user boards and committees. Individual interviews were carried out with different categories of actors and stakeholders in the district. The interview guide consisted of a series of questions, asking respondents to describe their perceptions regarding each condition of the Accountability for Reasonableness framework in terms of priority setting. Interviews were analysed using thematic framework analysis. Documentary data were used to support, verify and highlight the key issues that emerged. Almost all stakeholders viewed Accountability for Reasonableness as an important and feasible approach for improving priority-setting and health service delivery in their context. However, a few aspects of Accountability for Reasonableness were seen as too difficult to implement given the socio-political conditions and traditions in Tanzania. Respondents mentioned: budget ceilings and guidelines, low level of public awareness, unreliable and untimely funding, as well as the limited capacity of the district to generate local resources as the major contextual factors that hampered the full implementation of the framework in their context. This study was one of the first assessments of the applicability of Accountability for Reasonableness in health care priority-setting in Tanzania. The analysis, overall, suggests that the Accountability for Reasonableness framework could be an important tool for improving priority-setting processes in the contexts of resource-poor settings

  8. Can the theoretical domains framework account for the implementation of clinical quality interventions?

    Science.gov (United States)

    Lipworth, Wendy; Taylor, Natalie; Braithwaite, Jeffrey

    2013-12-21

    The health care quality improvement movement is a complex enterprise. Implementing clinical quality initiatives requires attitude and behaviour change on the part of clinicians, but this has proven to be difficult. In an attempt to solve this kind of behavioural challenge, the theoretical domains framework (TDF) has been developed. The TDF consists of 14 domains from psychological and organisational theory said to influence behaviour change. No systematic research has been conducted into the ways in which clinical quality initiatives map on to the domains of the framework. We therefore conducted a qualitative mapping experiment to determine to what extent, and in what ways, the TDF is relevant to the implementation of clinical quality interventions. We conducted a thematic synthesis of the qualitative literature exploring clinicians' perceptions of various clinical quality interventions. We analysed and synthesised 50 studies in total, in five domains of clinical quality interventions: clinical quality interventions in general, structural interventions, audit-type interventions, interventions aimed at making practice more evidence-based, and risk management interventions. Data were analysed thematically, followed by synthesis of these themes into categories and concepts, which were then mapped to the domains of the TDF. Our results suggest that the TDF is highly relevant to the implementation of clinical quality interventions. It can be used to map most, if not all, of the attitudinal and behavioural barriers and facilitators of uptake of clinical quality interventions. Each of these 14 domains appeared to be relevant to many different types of clinical quality interventions. One possible additional domain might relate to perceived trustworthiness of those instituting clinical quality interventions. The TDF can be usefully applied to a wide range of clinical quality interventions. Because all 14 of the domains emerged as relevant, and we did not identify any

  9. Using a knowledge translation framework to implement asthma clinical practice guidelines in primary care.

    Science.gov (United States)

    Licskai, Christopher; Sands, Todd; Ong, Michael; Paolatto, Lisa; Nicoletti, Ivan

    2012-10-01

    Quality problem: International guidelines establish evidence-based standards for asthma care; however, recommendations are often not implemented and many patients do not meet control targets. Initial assessment: Regional pilot data demonstrated a knowledge-to-practice gap. Choice of solutions: We engineered health system change in a multi-step approach described by the Canadian Institutes of Health Research knowledge translation framework. Implementation: Knowledge translation occurred at multiple levels: patient, practice and local health system. A regional administrative infrastructure and inter-disciplinary care teams were developed. The key project deliverable was a guideline-based interdisciplinary asthma management program. Six community organizations, 33 primary care physicians and 519 patients participated. The program operating cost was $290/patient. Evaluation: Six guideline-based care elements were implemented, including spirometry measurement, asthma controller therapy, a written self-management action plan and general asthma education (including the inhaler device technique, role of medications and environmental control strategies), in 93, 95, 86, 100, 97 and 87% of patients, respectively. Of the total patients, 66% were adults, 61% were female, and the mean age was 35.7 (SD = ± 24.2) years. At baseline 42% had two or more symptoms beyond acceptable limits vs. 17% (P < 0.001) post-intervention; 71% reported urgent/emergent healthcare visits at baseline (2.94 visits/year) vs. 45% (1.45 visits/year) (P < 0.001); 39% reported absenteeism (5.0 days/year) vs. 19% (3.0 days/year) (P < 0.001). The mean follow-up interval was 22 (SD = ± 7) months. Lessons learned: A knowledge-translation framework can guide multi-level organizational change, facilitate asthma guideline implementation, and improve health outcomes in community primary care practices. Program costs are similar to those of diabetes programs. Program savings offset costs in a ratio of 2.1:1.

  10. Research on machine learning framework based on random forest algorithm

    Science.gov (United States)

    Ren, Qiong; Cheng, Hui; Han, Hai

    2017-03-01

    With the continuous development of machine learning, industry and academia have released many machine learning frameworks based on distributed computing platforms, and these have been widely used. However, existing machine learning frameworks are constrained by the limitations of the underlying algorithms themselves, such as sensitivity to parameter choices, interference from noise, and a high barrier to use. This paper introduces the research background of machine learning frameworks and, drawing on the random forest algorithm commonly used for classification, sets out the research objectives and content, proposes an improved adaptive random forest algorithm (referred to as ARF) and, on the basis of ARF, designs and implements the machine learning framework.
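    To make the random forest baseline concrete, here is a from-scratch toy version — bootstrap-sampled one-level trees (stumps) combined by majority vote. It is a stand-in illustration only: the paper's ARF and its distributed deployment are not described in enough detail in this abstract to reproduce.

```python
import random
from collections import Counter

def train_stump(sample):
    """Exhaustively pick the (feature, threshold) split with the fewest
    misclassifications; each side predicts its majority label."""
    best, best_err = None, None
    for f in range(len(sample[0][0])):
        for x, _ in sample:
            thr = x[f]
            left = [y for xi, y in sample if xi[f] <= thr]
            right = [y for xi, y in sample if xi[f] > thr]
            l_lab = Counter(left).most_common(1)[0][0]
            r_lab = Counter(right).most_common(1)[0][0] if right else l_lab
            err = sum(y != (l_lab if xi[f] <= thr else r_lab)
                      for xi, y in sample)
            if best_err is None or err < best_err:
                best_err, best = err, (f, thr, l_lab, r_lab)
    return best

def train_forest(data, n_trees=15, seed=0):
    """Random-forest-style bagging: each stump sees a bootstrap resample."""
    rng = random.Random(seed)
    return [train_stump([rng.choice(data) for _ in data])
            for _ in range(n_trees)]

def predict(forest, x):
    """Majority vote over the ensemble."""
    votes = Counter(l if x[f] <= thr else r for f, thr, l, r in forest)
    return votes.most_common(1)[0][0]
```

In a distributed setting of the kind the paper targets, the `train_stump` calls are the embarrassingly parallel part: each tree's bootstrap sample and fit can run on a different worker, with only the fitted stumps collected for voting.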

  11. A Web GIS Framework for Participatory Sensing Service: An Open Source-Based Implementation

    Directory of Open Access Journals (Sweden)

    Yu Nakayama

    2017-04-01

    Full Text Available Participatory sensing is the process in which individuals or communities collect and analyze systematic data using mobile phones and cloud services. To efficiently develop participatory sensing services, some server-side technologies have been proposed. Although they provide a good platform for participatory sensing, they are not optimized for spatial data management and processing. For the purpose of spatial data collection and management, many web GIS approaches have been studied. However, they have not focused on an optimal framework for participatory sensing services. This paper presents a web GIS framework for participatory sensing service (FPSS). The proposed FPSS enables an integrated deployment of spatial data capture, storage, and data management functions. In various types of participatory sensing experiments, users can collect and manage spatial data in a unified manner. This feature is realized by an optimized system architecture and use cases based on the general requirements for participatory sensing. We developed an open source GIS-based implementation of the proposed framework, which can overcome the financial difficulties that are among the major obstacles to deploying sensing experiments. We confirmed with the prototype that participatory sensing experiments can be performed efficiently with the proposed FPSS.

  12. Agent Communications using Distributed Metaobjects

    Energy Technology Data Exchange (ETDEWEB)

    Goldsmith, Steven Y.; Spires, Shannon V.

    1999-06-10

    There are currently two proposed standards for agent communication languages, namely KQML (Finin, Labrou, and Mayfield 1994) and the FIPA ACL. Neither standard has yet achieved primacy, and neither has been evaluated extensively in an open environment such as the Internet. It seems prudent therefore to design a general-purpose agent communications facility for new agent architectures that is flexible yet accepts many different specializations. In this paper we exhibit the salient features of an agent communications architecture based on distributed metaobjects. This architecture captures design commitments at a metaobject level, leaving the base-level design and implementation up to the agent developer. The scope of the metamodel is broad enough to accommodate many different communication protocols, interaction protocols, and knowledge sharing regimes through extensions to the metaobject framework. We conclude that with a powerful distributed object substrate that supports metaobject communications, a general framework can be developed that will effectively enable different approaches to agent communications in the same agent system. We have implemented a KQML-based communications protocol and have several special-purpose interaction protocols under development.
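    For flavour, a KQML message is an s-expression consisting of a performative plus keyword parameters. The serializer below is illustrative only; real KQML quoting of nested content, and the FIPA ACL syntax, are more involved than this.

```python
def kqml(performative, **params):
    """Compose a KQML-style message string. Parameter names follow the
    usual KQML conventions (:sender, :receiver, :content, ...); this is
    a simplified sketch, not either standard's reference grammar."""
    fields = " ".join(f":{key} {value}" for key, value in params.items())
    return f"({performative} {fields})"
```

A metaobject-based facility like the one described would sit above such a serializer, letting the same base-level agent swap in a FIPA ACL encoder without changing its application logic.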

  13. Implementation of High Speed Distributed Data Acquisition System

    Science.gov (United States)

    Raju, Anju P.; Sekhar, Ambika

    2012-09-01

    This paper introduces a high-speed distributed data acquisition system based on a field programmable gate array (FPGA). The aim is to develop a "distributed" data acquisition interface. The development of instruments such as personal computers and engineering workstations based on "standard" platforms is the motivation behind this effort. Using standard platforms as the controlling unit allows independence in hardware from a particular vendor and hardware platform. The distributed approach also has advantages from a functional point of view: acquisition resources become available to multiple instruments, and the acquisition front-end can be physically remote from the rest of the instrument. The high-speed data acquisition system transmits data to a remote computer system through an Ethernet interface. The data is acquired through 16 analog input channels; the inputs are multiplexed and digitized, and the data is stored in a 1K buffer for each input channel. The main control unit in this design is a 16-bit processor implemented in the FPGA. This processor is used to set up and initialize the data source and the Ethernet controller, as well as to control the flow of data from the memory element to the NIC; using it, the different configuration registers in the Ethernet controller can be initialized and controlled in an easy manner. The data packets are then sent to the remote PC through the Ethernet interface. The main advantages of using an FPGA as the standard platform are its flexibility, low power consumption, short design duration, fast time to market, programmability and high density. The main advantages of using the Ethernet controller AX88796 over others are its non-PCI interface, the presence of embedded SRAM where the transmit and reception buffers are located, and its high-performance SRAM-like interface. The paper introduces the implementation of the distributed data acquisition on an FPGA in VHDL.
    The main advantages of this system are high
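    The abstract does not give the wire format, so the framing sketch below invents one for illustration: a 4-byte header (channel id, sample count) followed by big-endian 16-bit samples, matching the system's 16-bit data path but otherwise an assumption.

```python
import struct

def frame_samples(channel, samples):
    """Pack one channel's 16-bit samples into a payload: a 4-byte header
    (channel id, count as big-endian uint16) followed by the samples.
    A hypothetical layout, not the AX88796 system's actual format."""
    header = struct.pack(">HH", channel, len(samples))
    return header + struct.pack(f">{len(samples)}H", *samples)

def parse_frame(frame):
    """Inverse of frame_samples: recover (channel, samples) on the PC side."""
    channel, count = struct.unpack(">HH", frame[:4])
    samples = list(struct.unpack(f">{count}H", frame[4:4 + 2 * count]))
    return channel, samples
```

In the described system a frame like this would be drained from one channel's 1K buffer and handed to the Ethernet controller; the header lets the remote PC demultiplex the 16 channels.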

  14. Implementing ATML in Distributed ATS for SG-III Prototype

    International Nuclear Information System (INIS)

    Chen Ming; Yang Cunbang; Lu Junfeng; Ding Yongkun; Yin Zejie; Zheng Zhijian

    2007-01-01

    With the forthcoming large-scale scientific experimental systems, we are looking for ways to construct an open, distributed architecture spanning the new and the existing automatic test systems. The new standard of Automatic Test Markup Language meets our demand for data exchange in this architecture by defining test routines and resultant data in XML format. This paper introduces the concept of ATML (Automatic Test Markup Language) and related standards, and the significance of these new standards for a distributed automatic test system. It also describes the implementation of ATML through the integration of this technology among the existing and new test systems.

  15. Implementation of a Real-Time Microgrid Simulation Platform Based on Centralized and Distributed Management

    Directory of Open Access Journals (Sweden)

    Omid Abrishambaf

    2017-06-01

    Full Text Available Demand response and distributed generation are key components of power systems. Several challenges are raised at both the technical and business model levels for the integration of those resources in smart grids and microgrids. The implementation of a distribution network as a test bed can be difficult and not cost-effective, while computational modeling alone is not sufficient for producing realistic results. Real-time simulation allows us to validate the business model's impact at the technical level. This paper presents a platform supporting the real-time simulation of a microgrid connected to a larger distribution network. The implemented platform supports both centralized and distributed energy resource management. Using an optimization model for the energy resource operation, a virtual power player manages all the available resources. The simulation platform then allows the actual implementation of the requested demand reduction to be validated technically in the scope of demand response programs. The case study has 33 buses, 220 consumers, and 68 distributed generators. It demonstrates the impact of demand response events, also performing resource management in the presence of an energy shortage.

  16. Implementing the water framework directive in Denmark - Lessons on agricultural measures from a legal and regulatory perspective

    DEFF Research Database (Denmark)

    Jacobsen, Brian H.; Anker, Helle Tegner; Baaner, Lasse

    2017-01-01

    One of the major challenges in the implementation of the Water Framework Directive (WFD) is how to address diffuse agricultural pollution of the aquatic environment. In Denmark the implementation of agricultural measures has been fraught with difficulty in the form of delays and legal proceedings...... and a policy failure. It is argued that the adoption of more flexible measures to be implemented at the local level could have resulted in fewer difficulties from an economic and legal point of view as measures could have been applied where there was a clear environmental benefit, and possibly also at a lower...

  17. Teacher Competencies for the Implementation of Collaborative Learning in the Classroom: A Framework and Research Review

    Science.gov (United States)

    Kaendler, Celia; Wiedmann, Michael; Rummel, Nikol; Spada, Hans

    2015-01-01

    This article describes teacher competencies for implementing collaborative learning in the classroom. Research has shown that the effectiveness of collaborative learning largely depends on the quality of student interaction. We therefore focus on what a "teacher" can do to foster student interaction. First, we present a framework that…

  18. MARIANE: MApReduce Implementation Adapted for HPC Environments

    Energy Technology Data Exchange (ETDEWEB)

    Fadika, Zacharia; Dede, Elif; Govindaraju, Madhusudhan; Ramakrishnan, Lavanya

    2011-07-06

    MapReduce is increasingly becoming a popular framework and a potent programming model. The most popular open source implementation of MapReduce, Hadoop, is based on the Hadoop Distributed File System (HDFS). However, as HDFS is not POSIX compliant, it cannot be fully leveraged by applications running on a majority of existing HPC environments such as Teragrid and NERSC. These HPC environments typically support globally shared file systems such as NFS and GPFS. On such resourceful HPC infrastructures, the use of Hadoop not only creates compatibility issues, but also affects overall performance due to the added overhead of HDFS. This paper not only presents a MapReduce implementation directly suitable for HPC environments, but also exposes the design choices for better performance gains in those settings. By leveraging the inherent functions of distributed file systems, and abstracting them away from its MapReduce framework, MARIANE (MApReduce Implementation Adapted for HPC Environments) not only allows the use of the model in an expanding number of HPC environments, but also allows for better performance in such settings. This paper shows the applicability and high performance of the MapReduce paradigm through MARIANE, an implementation designed for clustered and shared-disk file systems and as such not dedicated to a specific MapReduce solution. The paper identifies the components and trade-offs necessary for this model, and quantifies the performance gains exhibited by our approach over Apache Hadoop in a data-intensive setting on the Magellan testbed at the National Energy Research Scientific Computing Center (NERSC).
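    The dataflow MARIANE adapts is the standard map/shuffle/reduce pipeline. A single-process sketch (my own, independent of MARIANE's code) makes the three phases explicit; a shared-filesystem deployment would partition the inputs across nodes instead of looping locally:

```python
from collections import defaultdict
from itertools import chain

def map_reduce(inputs, mapper, reducer):
    """Single-process sketch of the MapReduce dataflow: apply `mapper`
    (record -> [(key, value), ...]) to every input, group the emitted
    pairs by key (the shuffle), then apply `reducer` (key, values ->
    result) to each group."""
    shuffle = defaultdict(list)
    for key, value in chain.from_iterable(mapper(x) for x in inputs):
        shuffle[key].append(value)          # shuffle: group values by key
    return {key: reducer(key, values) for key, values in shuffle.items()}
```

The canonical word-count example: the mapper emits `(word, 1)` pairs and the reducer sums them. On a GPFS- or NFS-backed cluster of the kind the paper targets, the shuffle would be realized through the shared file system rather than HDFS block transfers.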

  19. ICT, Policy, Politics, and Democracy: An Integrated Framework for G2G Implementation

    Directory of Open Access Journals (Sweden)

    Iliana Mizinova

    2006-12-01

    Full Text Available This research approaches the issue of G2G digitization using an integrated policy dynamics model. The essence of the contradictions in the G2G integration discourse is followed by a description of two policy paradigms that are then incorporated into an integrated or synthetic framework to evaluate the specifics of the G2G implementation in DHS and HUD. Speculations are made about the implications of this study for the democratic principles of government rule.

  20. Hybrid Multi-Agent Control in Microgrids: Framework, Models and Implementations Based on IEC 61850

    Directory of Open Access Journals (Sweden)

    Xiaobo Dou

    2014-12-01

    Full Text Available Operation control is a vital and complex issue for microgrids. The objective of this paper is to explore the practical means of applying decentralized control by using a multi-agent system in actual microgrids and devices. This paper presents a hierarchical control framework (HCF) consisting of a local reaction control (LRC) level, a local decision control (LDC) level, a horizontal cooperation control (HCC) level and a vertical cooperation control (VCC) level to meet the different control requirements of a microgrid. Then, a hybrid multi-agent control model (HAM) is proposed to implement the HCF, and the properties, functionalities and operating rules of HAM are described. Furthermore, the paper elaborates on the implementation of HAM based on the IEC 61850 standard, and proposes some new implementation methods, such as extending the IEC 61850 information models with agent communication language and a bidirectional interaction mechanism for generic object oriented substation event (GOOSE) communication. A hardware design and software system are proposed, and the results of simulation and laboratory tests verify the effectiveness of the proposed strategies, models and implementations.

  1. The Road to Basel III – Quantitative Impact Study, the Basel III Framework and Implementation in the EU

    OpenAIRE

    Anastasia Gromova-Schneider; Caroline Niziolek

    2011-01-01

    In response to the financial crisis, the Basel Committee on Banking Supervision (BCBS) in December 2009 published its first consultative proposals to review the Basel II regulatory framework. Following a consultation process and a quantitative impact study (QIS), on December 16, 2010, the BCBS published the final Basel III framework for tightening the globally applicable capital adequacy and liquidity rules. The implementation of the new provisions in the EU is currently under way. The Europe...

  2. Implementing the Gaia Astrometric Global Iterative Solution (AGIS) in Java

    Science.gov (United States)

    O'Mullane, William; Lammers, Uwe; Lindegren, Lennart; Hernandez, Jose; Hobbs, David

    2011-10-01

    This paper provides a description of the Java software framework which has been constructed to run the Astrometric Global Iterative Solution for the Gaia mission. This is the mathematical framework that provides the rigid reference frame for Gaia observations from the Gaia data itself; the process makes Gaia a self-calibrated, input-catalogue-independent mission. The framework is highly distributed, typically running on a cluster of machines with a database back end. All code is written in the Java language. We describe the overall architecture and some of the details of the implementation.
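    Solutions in the AGIS family update blocks of unknowns iteratively, each block solved with the others held fixed. The toy below is a drastic simplification of that idea (real AGIS solves attitude, calibration and source blocks for roughly a billion stars, in Java, on a cluster): observations are modelled as `source_i + calibration_j`, and the solver alternates least-squares updates of the two blocks.

```python
def iterate_solution(obs, n_src, n_cal, sweeps=5):
    """Toy block-iterative solver in the spirit of AGIS.
    obs: dict (source_index, calibration_index) -> observed value,
    modelled as source_i + calibration_j. Each sweep updates the source
    block, then the calibration block, holding the other fixed.
    (The solution is only determined up to a constant offset.)"""
    src = [0.0] * n_src
    cal = [0.0] * n_cal
    for _ in range(sweeps):
        for i in range(n_src):
            residuals = [v - cal[j] for (si, j), v in obs.items() if si == i]
            src[i] = sum(residuals) / len(residuals)
        for j in range(n_cal):
            residuals = [v - src[i] for (i, cj), v in obs.items() if cj == j]
            cal[j] = sum(residuals) / len(residuals)
    return src, cal
```

The block structure is what makes the scheme distributable: every source update touches only that source's observations, so the inner loops can be spread across a cluster with only the small calibration block shared between iterations.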

  3. Toward the sustainability of health interventions implemented in sub-Saharan Africa: a systematic review and conceptual framework.

    Science.gov (United States)

    Iwelunmor, Juliet; Blackstone, Sarah; Veira, Dorice; Nwaozuru, Ucheoma; Airhihenbuwa, Collins; Munodawafa, Davison; Kalipeni, Ezekiel; Jutal, Antar; Shelley, Donna; Ogedegebe, Gbenga

    2016-03-23

    Sub-Saharan Africa (SSA) is facing a double burden of disease with a rising prevalence of non-communicable diseases (NCDs) while the burden of communicable diseases (CDs) remains high. Despite these challenges, there remains a significant need to understand how or under what conditions health interventions implemented in sub-Saharan Africa are sustained. The purpose of this study was to conduct a systematic review of empirical literature to explore how health interventions implemented in SSA are sustained. We searched MEDLINE, Biological Abstracts, CINAHL, Embase, PsycInfo, SCIELO, Web of Science, and Google Scholar for available research investigating the sustainability of health interventions implemented in sub-Saharan Africa. We also used narrative synthesis to examine factors, whether positive or negative, that may influence the sustainability of health interventions in the region. The search identified 1819 citations, and following removal of duplicates and our inclusion/exclusion criteria, only 41 papers were eligible for inclusion in the review. Twenty-six countries were represented in this review, with Kenya and Nigeria having the most representation of available studies examining sustainability. Study dates ranged from 1996 to 2015. Of note, the majority of these studies (30 %) were published in 2014. The most common framework utilized was the sustainability framework, which was discussed in four of the studies. Nineteen out of 41 studies (46 %) reported sustainability outcomes focused on communicable diseases, with HIV and AIDS represented in the majority of the studies, followed by malaria. Only 21 out of 41 studies had clear definitions of sustainability. Community ownership and mobilization were recognized by many of the reviewed studies as crucial facilitators for intervention sustainability, both early on and after intervention implementation, while social and ecological conditions as well as societal upheavals were barriers that influenced the sustainment

  4. Kodiak: An Implementation Framework for Branch and Bound Algorithms

    Science.gov (United States)

    Smith, Andrew P.; Munoz, Cesar A.; Narkawicz, Anthony J.; Markevicius, Mantas

    2015-01-01

    Recursive branch and bound algorithms are often used to refine and isolate solutions to several classes of global optimization problems. A rigorous computation framework for the solution of systems of equations and inequalities involving nonlinear real arithmetic over hyper-rectangular variable and parameter domains is presented. It is derived from a generic branch and bound algorithm that has been formally verified, and utilizes self-validating enclosure methods, namely interval arithmetic and, for polynomials and rational functions, Bernstein expansion. Since bounds computed by these enclosure methods are sound, this approach may be used reliably in software verification tools. Advantage is taken of the partial derivatives of the constraint functions involved in the system, firstly to reduce the branching factor by the use of bisection heuristics and secondly to permit the computation of bifurcation sets for systems of ordinary differential equations. The associated software development, Kodiak, is presented, along with examples of three different branch and bound problem types it implements.
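    The prune-or-subdivide logic behind such interval branch and bound schemes can be illustrated with a minimal sketch. This is not Kodiak's implementation; it is plain Python with a hand-rolled interval extension for a hypothetical one-dimensional constraint f(x) = x² − 2 over the box [0, 2]:

    ```python
    import math
    from typing import List, Tuple

    def isquare(lo: float, hi: float) -> Tuple[float, float]:
        """Interval extension of x**2 (handles intervals straddling zero)."""
        candidates = (lo * lo, hi * hi)
        low = 0.0 if lo <= 0.0 <= hi else min(candidates)
        return low, max(candidates)

    def branch_and_bound(lo: float, hi: float, tol: float = 1e-6) -> List[Tuple[float, float]]:
        """Return small boxes that may contain roots of f(x) = x**2 - 2."""
        boxes, results = [(lo, hi)], []
        while boxes:
            a, b = boxes.pop()
            flo, fhi = isquare(a, b)
            flo, fhi = flo - 2.0, fhi - 2.0      # sound enclosure of f on [a, b]
            if flo > 0.0 or fhi < 0.0:           # enclosure excludes zero: prune the box
                continue
            if b - a < tol:                      # box small enough: keep as a root enclosure
                results.append((a, b))
            else:                                # otherwise bisect and recurse
                mid = 0.5 * (a + b)
                boxes += [(a, mid), (mid, b)]
        return results

    enclosures = branch_and_bound(0.0, 2.0)
    # every surviving box encloses the root sqrt(2)
    ```

    Because the interval enclosure is sound, pruning never discards a root; bisection heuristics like those mentioned above would replace the naive midpoint split.
    
    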

  5. Generic framework for vessel detection and tracking based on distributed marine radar image data

    Science.gov (United States)

    Siegert, Gregor; Hoth, Julian; Banyś, Paweł; Heymann, Frank

    2018-04-01

    Situation awareness is understood as a key requirement for safe and secure shipping at sea. The primary sensor for maritime situation assessment is still the radar, with the AIS being introduced as a supplemental service only. In this article, we present a framework to assess the current situation picture based on marine radar image processing. Essentially, the framework comprises a centralized IMM-JPDA multi-target tracker in combination with a fully automated scheme for track management, i.e., target acquisition and track depletion. This tracker is conditioned on measurements extracted from radar images. To gain a more robust and complete situation picture, we exploit the aspect angle diversity of multiple marine radars by fusing them prior to the tracking process. Due to the generic structure of the proposed framework, different techniques for radar image processing can be implemented and compared, namely the BLOB detector and SExtractor. The overall framework performance in terms of multi-target state estimation is compared for both methods based on a dedicated measurement campaign in the Baltic Sea with multiple static and mobile targets.

  6. Invertebrate distribution patterns and river typology for the implementation of the water framework directive in Martinique, French Lesser Antilles

    Directory of Open Access Journals (Sweden)

    Bernadet C.

    2013-03-01

    Full Text Available Over the past decade, Europe’s Water Framework Directive provided compelling reasons for developing tools for the biological assessment of freshwater ecosystem health in member States. Yet, the lack of published studies for Europe’s overseas regions reflects minimal knowledge of the distribution patterns of aquatic species in the Community’s outermost areas. Benthic invertebrates (84 taxa) and land-cover, physical habitat and water chemistry descriptors (26 variables) were recorded at fifty-one stations in Martinique, French Lesser Antilles. Canonical Correspondence Analysis and Ward’s algorithm were used to bring out patterns in community structure in relation to environmental conditions, and variation partitioning was used to specify the influence of geomorphology and anthropogenic disturbance on invertebrate communities. Species richness decreased from headwater to lowland streams, and species composition changed from northern to southern areas. The proportion of variation explained by geomorphological variables was globally higher than that explained by anthropogenic variables. Geomorphology and land cover played key roles in delineating ecological sub-regions for the freshwater biota. Despite this and the small surface area of Martinique (1080 km²), invertebrate communities showed a clear spatial turnover in composition and biological traits (e.g., insects, crustaceans and molluscs) in relation to natural conditions.

  7. Interdisciplinary Priorities for Dissemination, Implementation, and Improvement Science: Frameworks, Mechanics, and Measures.

    Science.gov (United States)

    Brunner, Julian W; Sankaré, Ibrahima C; Kahn, Katherine L

    2015-12-01

    Much of dissemination, implementation, and improvement (DII) science is conducted by social scientists, healthcare practitioners, and biomedical researchers. While each of these groups has its own venues for sharing methods and findings, forums that bring together the diverse DII science workforce provide important opportunities for cross-disciplinary collaboration and learning. In particular, such forums are uniquely positioned to foster the sharing of three important components of research. First: they allow the sharing of conceptual frameworks for DII science that focus on the use and spread of innovations. Second: they provide an opportunity to share strategies for initiating and governing DII research, including approaches for eliciting and incorporating the research priorities of patients, study participants, healthcare practitioners, and decision-makers. Third: they allow the sharing of outcome measures well-suited to the goals of DII science, thereby helping to validate these outcomes in diverse contexts, improving the comparability of findings across settings, and elevating the study of the implementation process itself. © 2015 Wiley Periodicals, Inc.

  8. A Framework for Implementing and Valuing Biodiversity Offsets in Colombia: A Landscape Scale Perspective

    Directory of Open Access Journals (Sweden)

    Shirley Saenz

    2013-11-01

    Full Text Available Biodiversity offsets provide a mechanism for maintaining or enhancing environmental values in situations where development is sought, despite negative environmental impacts. They seek to ensure that unavoidable deleterious environmental impacts of development are balanced by environmental gains. When onsite impacts warrant the use of offsets, little attention is often paid to ensuring that offset sites are located where they provide the greatest conservation benefit and are consistent with landscape-level conservation goals. In most offset frameworks it is difficult for developers to proactively know the offset requirements they will need to implement. Here we propose a framework to address these needs. We propose a series of rules for selecting offset sites that meet the conservation needs of potentially impacted biological targets. We then discuss an accounting approach that seeks to support offset ratio determinations based on a structured and transparent approach. To demonstrate the approach, we present a framework developed in partnership with the Colombian Ministry of Environment and Sustainable Development to reform existing mitigation regulatory processes.

  9. IMPLEMENTATION OF MULTIAGENT REINFORCEMENT LEARNING MECHANISM FOR OPTIMAL ISLANDING OPERATION OF DISTRIBUTION NETWORK

    DEFF Research Database (Denmark)

    Saleem, Arshad; Lind, Morten

    2008-01-01

    among electric power utilities to utilize modern information and communication technologies (ICT) in order to improve the automation of the distribution system. In this paper we present our work for the implementation of a dynamic multi-agent based distributed reinforcement learning mechanism...

  10. Designing and implementing test automation frameworks with QTP

    CERN Document Server

    Bhargava, Ashish

    2013-01-01

    A tutorial-based approach, showing basic coding and designing techniques to build test automation frameworks. If you are a beginner, an automation engineer, an aspiring test automation engineer, a manual tester, a test lead or a test architect who wants to learn, create, and maintain test automation frameworks, this book will accelerate your ability to develop and adapt the framework.

  11. Design and implementation of the reconstruction software for the photon multiplicity detector in object oriented programming framework

    International Nuclear Information System (INIS)

    Chattopadhayay, Subhasis; Ghosh, Premomoy; Gupta, R.; Mishra, D.; Phatak, S.C.; Sood, G.

    2002-01-01

    The high-granularity photon multiplicity detector (PMD) is scheduled to take data at the Relativistic Heavy Ion Collider (RHIC) this year. A detailed scheme has been designed and implemented in an object oriented programming framework using C++ for the monitoring and reconstruction of PMD data

  12. Framework for the impact analysis and implementation of Clinical Prediction Rules (CPRs)

    LENUS (Irish Health Repository)

    Wallace, Emma

    2011-10-14

    Abstract Clinical Prediction Rules (CPRs) are tools that quantify the contribution of symptoms, clinical signs and available diagnostic tests, and in doing so stratify patients according to the probability of having a target outcome or need for a specified treatment. Most CPR studies focus on the derivation stage, with only a minority progressing to validation and very few undergoing impact analysis. Impact analysis studies remain the most efficient way of assessing whether incorporating CPRs into a decision making process improves patient care. However there is a lack of clear methodology for the design of high quality impact analysis studies. We have developed a sequential four-phased framework based on the literature and the collective experience of our international working group to help researchers identify and overcome the specific challenges in designing and conducting an impact analysis of a CPR. There is a need to shift emphasis from deriving new CPRs to validating and implementing existing CPRs. The proposed framework provides a structured approach to this topical and complex area of research.

  13. A sustainable livelihood framework to implement CSR project in coal mining sector

    Directory of Open Access Journals (Sweden)

    Sapna A. Narula

    2017-01-01

    Full Text Available Corporate social responsibility (CSR) in mining areas has gained momentum, especially in countries like India where it has been made mandatory. The primary objective of this paper is to document actual social challenges of mining in field areas and find out how companies in the coal sector can work in a systematic manner to achieve uplift of affected communities. The first part of the paper draws evidence from three different bodies of literature, i.e. CSR and coal mining, capacity building and livelihood generation in mining areas. We try to converge the literature to propose a novel framework for livelihood generation work through capacity building with the help of CSR investments. The paper also documents a live case of the planning and implementation of capacity building activities in the Muriadih coal mines in the Jharkhand state of India and offers lessons to both business and policy makers. The proposed framework has only been experimented with in a local context, yet has the potential to be replicated in other mining areas.

  14. Surveillance indicators and their use in implementation of the Marine Strategy Framework Directive

    DEFF Research Database (Denmark)

    Shephard, Samuel; Greenstreet, Simon P. R.; Piet, GerJan J.

    2015-01-01

    The European Union Marine Strategy Framework Directive (MSFD) uses indicators to track ecosystem state in relation to Good Environmental Status (GES). These indicators were initially expected to be “operational”, i.e. to have well-understood relationships between state and specified anthropogenic pressure(s), and to have defined targets. Recent discussion on MSFD implementation has highlighted an additional class of “surveillance” indicators. Surveillance indicators monitor key aspects of the ecosystem for which there is: first, insufficient evidence to define targets and support formal state...

  15. Joint Implementation under the UN Framework Convention on Climate Change. Technical and institutional challenges

    International Nuclear Information System (INIS)

    Swisher, J.N.

    1997-01-01

    The UN Framework Convention on Climate Change (FCCC) allows for the Joint Implementation (JI) of measures to mitigate the emissions of greenhouse gases. The concept of JI refers to the implementation of such measures in one country with partial and/or technical support from another country, potentially fulfilling some of the supporting country's emission-reduction commitment under the FCCC. At present, all JI transactions are voluntary, and no country has claimed JI credit against existing FCCC commitments. Nevertheless, JI could have important implications for both the economic efficiency and the international equity of the implementation of the FCCC. The paper discusses some of the information needs of JI projects and seeks to clarify some of the common assumptions and arguments about JI. Issues are distinguished according to whether they are specific to JI or common to other types of regimes and transactions. The focus is on the position of developing countries and their potential risks and benefits regarding JI. 2 figs., 3 tabs., 35 refs

  16. Efficient implementation of a multidimensional fast fourier transform on a distributed-memory parallel multi-node computer

    Science.gov (United States)

    Bhanot, Gyan V [Princeton, NJ; Chen, Dong [Croton-On-Hudson, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Heidelberger, Philip [Cortlandt Manor, NY; Steinmacher-Burow, Burkhard D [Mount Kisco, NY; Vranas, Pavlos M [Bedford Hills, NY

    2008-01-01

    The present invention is directed to a method, system and program storage device for efficiently implementing a multidimensional Fast Fourier Transform (FFT) of a multidimensional array comprising a plurality of elements initially distributed in a multi-node computer system comprising a plurality of nodes in communication over a network, comprising: distributing the plurality of elements of the array in a first dimension across the plurality of nodes of the computer system over the network to facilitate a first one-dimensional FFT; performing the first one-dimensional FFT on the elements of the array distributed at each node in the first dimension; re-distributing the one-dimensional FFT-transformed elements at each node in a second dimension via "all-to-all" distribution in random order across other nodes of the computer system over the network; and performing a second one-dimensional FFT on elements of the array re-distributed at each node in the second dimension, wherein the random order facilitates efficient utilization of the network thereby efficiently implementing the multidimensional FFT. The "all-to-all" re-distribution of array elements is further efficiently implemented in applications other than the multidimensional FFT on the distributed-memory parallel supercomputer.
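    The row-FFT / all-to-all exchange / column-FFT decomposition described above can be sketched in a single process, with NumPy standing in for the per-node work (the block splits and the all-to-all transpose are simulated here, not actually distributed, and the random-order network optimization is omitted):

    ```python
    import numpy as np

    def distributed_fft2(a: np.ndarray, nodes: int) -> np.ndarray:
        """2-D FFT computed as two rounds of 1-D FFTs separated by a transpose,
        mimicking the row decomposition + all-to-all exchange of a multi-node run."""
        # Phase 1: each "node" holds a contiguous block of rows and FFTs them.
        row_blocks = np.array_split(a, nodes, axis=0)
        stage1 = np.vstack([np.fft.fft(blk, axis=1) for blk in row_blocks])
        # All-to-all exchange: redistribute so each node now owns a block of
        # the other dimension (realized here as a simple transpose).
        transposed = stage1.T
        col_blocks = np.array_split(transposed, nodes, axis=0)
        # Phase 2: 1-D FFT along the remaining dimension on each node.
        stage2 = np.vstack([np.fft.fft(blk, axis=1) for blk in col_blocks])
        return stage2.T

    a = np.random.rand(8, 8)
    assert np.allclose(distributed_fft2(a, nodes=4), np.fft.fft2(a))
    ```

    The same pattern extends to higher dimensions by repeating the transpose-and-FFT round once per axis; on a real machine the transpose is the all-to-all communication step the patent seeks to optimize.
    
    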

  17. Modeling framework for representing long-term effectiveness of best management practices in addressing hydrology and water quality problems: Framework development and demonstration using a Bayesian method

    Science.gov (United States)

    Liu, Yaoze; Engel, Bernard A.; Flanagan, Dennis C.; Gitau, Margaret W.; McMillan, Sara K.; Chaubey, Indrajeet; Singh, Shweta

    2018-05-01

    Best management practices (BMPs) are popular approaches used to improve hydrology and water quality. Uncertainties in BMP effectiveness over time may result in overestimating long-term efficiency in watershed planning strategies. To represent varying long-term BMP effectiveness in hydrologic/water quality models, a high-level, forward-looking modeling framework was developed. The components in the framework consist of establishment period efficiency, starting efficiency, efficiency for each storm event, efficiency between maintenance, and efficiency over the life cycle. Combined, they represent long-term efficiency for a specific type of practice and specific environmental concern (runoff/pollutant). An approach for possible implementation of the framework was discussed. The long-term impacts of grass buffer strips (agricultural BMP) and bioretention systems (urban BMP) in reducing total phosphorus were simulated to demonstrate the framework. Data gaps were captured in estimating the long-term performance of the BMPs. A Bayesian method was used to match the simulated distribution of long-term BMP efficiencies with the observed distribution, with the assumption that the observed data represented long-term BMP efficiencies. The simulated distribution matched the observed distribution well with only small total predictive uncertainties. With additional data, the same method can be used to further improve the simulation results. The modeling framework and results of this study, which can be adopted in hydrologic/water quality models to better represent long-term BMP effectiveness, can help improve decision support systems for creating long-term stormwater management strategies for watershed management projects.

  18. Implementing a Mentally Healthy Schools Framework Based on the Population Wide Act-Belong-Commit Mental Health Promotion Campaign: A Process Evaluation

    Science.gov (United States)

    Anwar-McHenry, Julia; Donovan, Robert John; Nicholas, Amberlee; Kerrigan, Simone; Francas, Stephanie; Phan, Tina

    2016-01-01

    Purpose: Mentally Healthy WA developed and implemented the Mentally Healthy Schools Framework in 2010 in response to demand from schools wanting to promote the community-based Act-Belong-Commit mental health promotion message within a school setting. Schools are an important setting for mental health promotion; therefore, the Framework encourages…

  19. Implementation of Web-based Information Systems in Distributed Organizations

    DEFF Research Database (Denmark)

    Bødker, Keld; Pors, Jens Kaaber; Simonsen, Jesper

    2004-01-01

    This article presents results elicited from studies conducted in relation to implementing a web-based information system throughout a large distributed organization. We demonstrate the kind of expectations and conditions for change that management face in relation to open-ended, configurable, and context-specific web-based information systems like Lotus QuickPlace. Our synthesis from the empirical findings is related to two recent models, the improvisational change management model suggested by Orlikowski and Hofman (1997), and Gallivan's (2001) model for organizational adoption and assimilation. In line with comparable approaches from the knowledge management area (Dixon 2000; Markus 2001), we relate to, refine, and operationalize the models from an overall organizational view by identifying and characterizing four different and general implementation contexts

  20. Design and Analysis of Electrical Distribution Networks and Balancing Markets in the UK: A New Framework with Applications

    Directory of Open Access Journals (Sweden)

    Vijayanarasimha Hindupur Pakka

    2016-02-01

    Full Text Available We present a framework for the design and simulation of electrical distribution systems and short term electricity markets specific to the UK. The modelling comprises packages relating to the technical and economic features of the electrical grid. The first package models the medium/low distribution networks with elements such as transformers, voltage regulators, distributed generators, composite loads, distribution lines and cables. This model forms the basis for elementary analysis such as load flow and short circuit calculations and also enables the investigation of effects of integrating distributed resources, voltage regulation, resource scheduling and the like. The second part of the modelling exercise relates to the UK short term electricity market with specific features such as balancing mechanism and bid-offer strategies. The framework is used for investigating methods of voltage regulation using multiple control technologies, to demonstrate the effects of high penetration of wind power on balancing prices and finally use these prices towards achieving demand response through aggregated prosumers.

  1. Optimal Electricity Distribution Framework for Public Space: Assessing Renewable Energy Proposals for Freshkills Park, New York City

    Directory of Open Access Journals (Sweden)

    Kaan Ozgun

    2015-03-01

    Full Text Available Integrating renewable energy into public space is becoming more common as a climate change solution. However, this approach is often guided by the environmental pillar of sustainability, with less focus on the economic and social pillars. The purpose of this paper is to examine this issue in the speculative renewable energy propositions for Freshkills Park in New York City submitted for the 2012 Land Art Generator Initiative (LAGI) competition. This paper first proposes an optimal electricity distribution (OED) framework in and around public spaces based on relevant ecology and energy theory (Odum’s fourth and fifth laws of thermodynamics). This framework addresses social engagement related to public interaction, and economic engagement related to the estimated quantity of electricity produced, in conjunction with environmental engagement related to the embodied energy required to construct the renewable energy infrastructure. Next, the study uses the OED framework to analyse the top twenty-five projects submitted for the LAGI 2012 competition. The findings reveal an electricity distribution imbalance and suggest a lack of in-depth understanding about sustainable electricity distribution within public space design. The paper concludes with suggestions for future research.

  2. Implementation of the ATLAS trigger within the multi-threaded software framework AthenaMT

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00225867; The ATLAS collaboration

    2017-01-01

    We present an implementation of the ATLAS High Level Trigger, HLT, that provides parallel execution of trigger algorithms within the ATLAS multithreaded software framework, AthenaMT. This development will enable the ATLAS HLT to meet future challenges due to the evolution of computing hardware and upgrades of the Large Hadron Collider, LHC, and ATLAS Detector. During the LHC data-taking period starting in 2021, luminosity will reach up to three times the original design value. Luminosity will increase further, to up to 7.5 times the design value, in 2026 following LHC and ATLAS upgrades. This includes an upgrade of the ATLAS trigger architecture that will result in an increase in the HLT input rate by a factor of 4 to 10 compared to the current maximum rate of 100 kHz. The current ATLAS multiprocess framework, AthenaMP, manages a number of processes that each execute algorithms sequentially for different events. AthenaMT will provide a fully multi-threaded environment that will additionally enable concurrent ...

  3. The Use of Ethical Frameworks for Implementing Science as a Human Endeavour in Year 10 Biology

    Science.gov (United States)

    Yap, Siew Fong; Dawson, Vaille

    2014-01-01

    This research focuses on the use of ethical frameworks as a pedagogical model for socio-scientific education in implementing the "Science as a Human Endeavour" (SHE) strand of the Australian Curriculum: Science in a Year 10 biology class in a Christian college in metropolitan Perth, Western Australia. Using a case study approach, a mixed…

  4. A general framework for implementing NLO calculations in shower Monte Carlo programs. The POWHEG BOX

    Energy Technology Data Exchange (ETDEWEB)

    Alioli, Simone [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Nason, Paolo [INFN, Milano-Bicocca (Italy); Oleari, Carlo [INFN, Milano-Bicocca (Italy); Milano-Bicocca Univ. (Italy); Re, Emanuele [Durham Univ. (United Kingdom). Inst. for Particle Physics Phenomenology

    2010-02-15

    In this work we illustrate the POWHEG BOX, a general computer code framework for implementing NLO calculations in shower Monte Carlo programs according to the POWHEG method. The aim of this work is to provide an illustration of the needed theoretical ingredients, a view of how the code is organized and a description of what a user should provide in order to use it. (orig.)

  5. A general framework for implementing NLO calculations in shower Monte Carlo programs. The POWHEG BOX

    International Nuclear Information System (INIS)

    Alioli, Simone; Nason, Paolo; Oleari, Carlo; Re, Emanuele

    2010-02-01

    In this work we illustrate the POWHEG BOX, a general computer code framework for implementing NLO calculations in shower Monte Carlo programs according to the POWHEG method. The aim of this work is to provide an illustration of the needed theoretical ingredients, a view of how the code is organized and a description of what a user should provide in order to use it. (orig.)

  6. Using an ontology as a model for the implementation of the National Cybersecurity Policy Framework for South Africa

    CSIR Research Space (South Africa)

    Jansen van Vuuren, JC

    2014-03-01

    Full Text Available National Cybersecurity Policy Framework that is easy to understand and implement. In this paper, the authors motivate that an ontology can assist in defining a model that describes the relationships between different stakeholders and cybersecurity...

  7. Implementation of continuous-variable quantum key distribution with composable and one-sided-device-independent security against coherent attacks.

    Science.gov (United States)

    Gehring, Tobias; Händchen, Vitus; Duhme, Jörg; Furrer, Fabian; Franz, Torsten; Pacher, Christoph; Werner, Reinhard F; Schnabel, Roman

    2015-10-30

    Secret communication over public channels is one of the central pillars of a modern information society. Using quantum key distribution this is achieved without relying on the hardness of mathematical problems, which might be compromised by improved algorithms or by future quantum computers. State-of-the-art quantum key distribution requires composable security against coherent attacks for a finite number of distributed quantum states as well as robustness against implementation side channels. Here we present an implementation of continuous-variable quantum key distribution satisfying these requirements. Our implementation is based on the distribution of continuous-variable Einstein-Podolsky-Rosen entangled light. It is one-sided device independent, which means the security of the generated key is independent of any memory-free attacks on the remote detector. Since continuous-variable encoding is compatible with conventional optical communication technology, our work is a step towards practical implementations of quantum key distribution with state-of-the-art security based solely on telecom components.

  8. Implementation of continuous-variable quantum key distribution with composable and one-sided-device-independent security against coherent attacks

    Science.gov (United States)

    Gehring, Tobias; Händchen, Vitus; Duhme, Jörg; Furrer, Fabian; Franz, Torsten; Pacher, Christoph; Werner, Reinhard F.; Schnabel, Roman

    2015-10-01

    Secret communication over public channels is one of the central pillars of a modern information society. Using quantum key distribution this is achieved without relying on the hardness of mathematical problems, which might be compromised by improved algorithms or by future quantum computers. State-of-the-art quantum key distribution requires composable security against coherent attacks for a finite number of distributed quantum states as well as robustness against implementation side channels. Here we present an implementation of continuous-variable quantum key distribution satisfying these requirements. Our implementation is based on the distribution of continuous-variable Einstein-Podolsky-Rosen entangled light. It is one-sided device independent, which means the security of the generated key is independent of any memory-free attacks on the remote detector. Since continuous-variable encoding is compatible with conventional optical communication technology, our work is a step towards practical implementations of quantum key distribution with state-of-the-art security based solely on telecom components.

  9. Thermodynamic framework for compact q-Gaussian distributions

    Science.gov (United States)

    Souza, Andre M. C.; Andrade, Roberto F. S.; Nobre, Fernando D.; Curado, Evaldo M. F.

    2018-02-01

    Recent works have associated systems of particles, characterized by short-range repulsive interactions and evolving under overdamped motion, to a nonlinear Fokker-Planck equation within the class of nonextensive statistical mechanics, with a nonlinear diffusion contribution whose exponent is given by ν = 2 − q. The particular case ν = 2 applies to interacting vortices in type-II superconductors, whereas ν > 2 covers systems of particles characterized by short-range power-law interactions, where correlations among particles are taken into account. In the former case, several studies presented a consistent thermodynamic framework based on the definition of an effective temperature θ (presenting experimental values much higher than typical room temperatures T, so that thermal noise could be neglected), conjugated to a generalized entropy sν (with ν = 2). Herein, the whole thermodynamic scheme is revisited and extended to systems of particles interacting repulsively, through short-ranged potentials, described by an entropy sν, with ν > 1, covering the ν = 2 (vortices in type-II superconductors) and ν > 2 (short-range power-law interactions) physical examples. One basic requirement concerns a cutoff in the equilibrium distribution Peq(x), approached due to a confining external harmonic potential, ϕ(x) = αx²/2 (α > 0). The main results achieved are: (a) the definition of an effective temperature θ conjugated to the entropy sν; (b) the construction of a Carnot cycle, whose efficiency is shown to be η = 1 − (θ₂/θ₁), where θ₁ and θ₂ are the effective temperatures associated with two isothermal transformations, with θ₁ > θ₂; (c) thermodynamic potentials, Maxwell relations, and response functions. The present thermodynamic framework, for a system of interacting particles under the above-mentioned conditions, and associated to an entropy sν, with ν > 1, certainly enlarges the possibility of experimental verifications.
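    Assuming the standard Tsallis form of the compact q-Gaussian (an assumption; the abstract does not spell out the explicit density), the cutoff in the equilibrium distribution for ν = 2 − q > 1, i.e. q < 1, can be checked numerically. For q = 0 (the vortex case ν = 2) the density is a truncated parabola with support |x| ≤ xc = 1/√((1−q)β):

    ```python
    import numpy as np

    def q_gaussian(x, q, beta):
        """Unnormalized compact q-Gaussian for q < 1 (i.e. nu = 2 - q > 1):
        the density vanishes beyond the cutoff xc = 1/sqrt((1-q)*beta)."""
        # clip enforces the cutoff: the q-exponential is zero where its
        # argument would be negative
        arg = np.clip(1.0 - (1.0 - q) * beta * np.asarray(x) ** 2, 0.0, None)
        return arg ** (1.0 / (1.0 - q))

    q, beta = 0.0, 1.0                    # q = 0 corresponds to nu = 2, the vortex case
    xc = 1.0 / np.sqrt((1.0 - q) * beta)  # support boundary of Peq(x)
    x = np.linspace(-2.0, 2.0, 4001)
    p = q_gaussian(x, q, beta)
    assert np.all(p[np.abs(x) > xc] == 0.0)  # compact support: zero beyond the cutoff
    ```

    For q → 1 (ν → 1) the expression recovers the ordinary Gaussian with unbounded support, which is why ν > 1 is the regime where the cutoff argument applies.
    
    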

  10. Integrating a Trust Framework with a Distributed Certificate Validation Scheme for MANETs

    Directory of Open Access Journals (Sweden)

    Marias Giannis F

    2006-01-01

Many trust establishment solutions in mobile ad hoc networks (MANETs) rely on public key certificates. Therefore, they should be accompanied by an efficient mechanism for certificate revocation and validation. Ad hoc distributed OCSP for trust (ADOPT) is a lightweight, distributed, on-demand scheme based on cached OCSP responses, which provides certificate status information to the nodes of a MANET. In this paper we discuss the ADOPT scheme and issues in its deployment over MANETs. We present some possible threats to ADOPT and suggest the use of a trust assessment and establishment framework, named ad hoc trust framework (ATF), to support ADOPT's robustness and efficiency. ADOPT is deployed as a trust-aware application that provides feedback to ATF, which calculates the trustworthiness of the peer nodes' functions and helps ADOPT to improve its performance by rapidly locating valid certificate status information. Moreover, we introduce the TrustSpan algorithm to reduce the overhead that ATF produces, and the TrustPath algorithm to identify and use trusted routes for propagating sensitive information, such as third parties' accusations. Simulation results show that ATF adds limited overhead compared to its efficiency in detecting and isolating malicious and selfish nodes. ADOPT's reliability is increased, since it can rapidly locate a legitimate response by using information provided by ATF.

  11. Framework and operational procedure for implementing Strategic Environmental Assessment in China

    International Nuclear Information System (INIS)

    Bao Cunkuan; Lu Yongsen; Shang Jincheng

    2004-01-01

Over the last 20 years, Environmental Impact Assessment (EIA) has been implemented and become an important instrument for decision-making in development projects in China. The Environmental Impact Assessment Law of the P.R. China was promulgated on 28 October 2002 and came into effect on 1 September 2003. The law provides that Strategic Environmental Assessment (SEA) is required for regional and sector plans and programs. This paper introduces the research achievements and practice of SEA in China, discusses the relationship between SEA and 'integrating of environment and development in decision-making (IEDD)', and the relevant political and legal basis of SEA. The framework and operational procedures of SEA administration and enforcement are presented. Nine cases are analyzed and some proposals are given.

  12. A multi-criteria vertical coordination framework for a reliable aid distribution

    Energy Technology Data Exchange (ETDEWEB)

    Regis-Hernández, Fabiola; Mora-Vargas, Jaime; Ruíz, Angel

    2017-07-01

    This study proposes a methodology that translates multiple humanitarian supply chain stakeholders’ preferences from qualitative to quantitative values, enabling these preferences to be integrated into optimization models to ensure their balanced and simultaneous implementation during the decision-making process. Design/methodology/approach: An extensive literature review is used to justify the importance of developing a strategy that minimizes the impact of a lack of coordination on humanitarian logistics decisions. A methodology for a multi-criteria framework is presented that allows humanitarian stakeholders’ interests to be integrated into the humanitarian decision-making process. Findings: The findings suggest that integrating stakeholders’ interests into the humanitarian decision-making process will improve its reliability. Research limitations/implications: To further validate the weights of each stakeholder’s interests obtained from the literature review requires interviews with the corresponding organizations. However, the literature review supports the statements in this paper. Practical implications: The cost of a lack of coordination between stakeholders in humanitarian logistics has been increasing during the last decade. These coordination costs can be minimized if humanitarian logistics’ decision-makers measure and simultaneously consider multiple stakeholders’ preferences. Social implications: When stakeholders’ goals are aligned, the humanitarian logistics response becomes more efficient, increasing the quality of delivered aid and providing timely assistance to the affected population in order to minimize their suffering. This study provides a methodology that translates humanitarian supply chain stakeholders’ interests into quantitative values, enabling them to be integrated into mathematical models to ensure relief distribution based on the stakeholders’ preferences.

  13. A multi-criteria vertical coordination framework for a reliable aid distribution

    International Nuclear Information System (INIS)

    Regis-Hernández, Fabiola; Mora-Vargas, Jaime; Ruíz, Angel

    2017-01-01

    This study proposes a methodology that translates multiple humanitarian supply chain stakeholders’ preferences from qualitative to quantitative values, enabling these preferences to be integrated into optimization models to ensure their balanced and simultaneous implementation during the decision-making process. Design/methodology/approach: An extensive literature review is used to justify the importance of developing a strategy that minimizes the impact of a lack of coordination on humanitarian logistics decisions. A methodology for a multi-criteria framework is presented that allows humanitarian stakeholders’ interests to be integrated into the humanitarian decision-making process. Findings: The findings suggest that integrating stakeholders’ interests into the humanitarian decision-making process will improve its reliability. Research limitations/implications: To further validate the weights of each stakeholder’s interests obtained from the literature review requires interviews with the corresponding organizations. However, the literature review supports the statements in this paper. Practical implications: The cost of a lack of coordination between stakeholders in humanitarian logistics has been increasing during the last decade. These coordination costs can be minimized if humanitarian logistics’ decision-makers measure and simultaneously consider multiple stakeholders’ preferences. Social implications: When stakeholders’ goals are aligned, the humanitarian logistics response becomes more efficient, increasing the quality of delivered aid and providing timely assistance to the affected population in order to minimize their suffering. This study provides a methodology that translates humanitarian supply chain stakeholders’ interests into quantitative values, enabling them to be integrated into mathematical models to ensure relief distribution based on the stakeholders’ preferences

  14. On distribution reduction and algorithm implementation in inconsistent ordered information systems.

    Science.gov (United States)

    Zhang, Yanqin

    2014-01-01

As one part of our work on ordered information systems, distribution reduction is studied in inconsistent ordered information systems (OISs). Some important properties of distribution reduction are studied and discussed. The dominance matrix is restated for reduction acquisition in dominance-relation-based information systems. A matrix algorithm for distribution reduction acquisition is presented step by step, and a program is implemented from the algorithm. The approach provides an effective tool for theoretical research on, and practical applications of, ordered information systems. For a more detailed and valid illustration, cases are employed to explain and verify the algorithm and the program, which shows the effectiveness of the algorithm in complicated information systems.
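The dominance matrix at the core of such algorithms can be sketched as follows. This is an illustrative reading, not the paper's exact construction: object i dominates object j when every attribute value of i is at least the corresponding value of j.

```python
# Illustrative sketch of a dominance matrix for an ordered information
# system (hypothetical data; not the paper's algorithm).

def dominance_matrix(table):
    """table: list of attribute-value tuples, one per object.
    Returns D with D[i][j] True iff object i dominates object j."""
    n = len(table)
    return [[all(a >= b for a, b in zip(table[i], table[j]))
             for j in range(n)]
            for i in range(n)]

objects = [(3, 2), (1, 1), (3, 3)]   # three objects, two ordered attributes
D = dominance_matrix(objects)
# object 0 dominates object 1, but not object 2 (its second attribute is smaller)
```

Reduction acquisition then operates on this boolean matrix rather than on the raw attribute table.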

  15. A numerical framework for bubble transport in a subcooled fluid flow

    Science.gov (United States)

    Jareteg, Klas; Sasic, Srdjan; Vinai, Paolo; Demazière, Christophe

    2017-09-01

    In this paper we present a framework for the simulation of dispersed bubbly two-phase flows, with the specific aim of describing vapor-liquid systems with condensation. We formulate and implement a framework that consists of a population balance equation (PBE) for the bubble size distribution and an Eulerian-Eulerian two-fluid solver. The PBE is discretized using the Direct Quadrature Method of Moments (DQMOM) in which we include the condensation of the bubbles as an internal phase space convection. We investigate the robustness of the DQMOM formulation and the numerical issues arising from the rapid shrinkage of the vapor bubbles. In contrast to a PBE method based on the multiple-size-group (MUSIG) method, the DQMOM formulation allows us to compute a distribution with dynamic bubble sizes. Such a property is advantageous to capture the wide range of bubble sizes associated with the condensation process. Furthermore, we compare the computational performance of the DQMOM-based framework with the MUSIG method. The results demonstrate that DQMOM is able to retrieve the bubble size distribution with a good numerical precision in only a small fraction of the computational time required by MUSIG. For the two-fluid solver, we examine the implementation of the mass, momentum and enthalpy conservation equations in relation to the coupling to the PBE. In particular, we propose a formulation of the pressure and liquid continuity equations, that was shown to correctly preserve mass when computing the vapor fraction with DQMOM. In addition, the conservation of enthalpy was also proven. Therefore a consistent overall framework that couples the PBE and two-fluid solvers is achieved.
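The quadrature representation underlying DQMOM can be illustrated with a minimal sketch. The values below are hypothetical, and the actual method also evolves the weights and abscissas through transport equations; here only the moment reconstruction is shown.

```python
# Minimal sketch of the quadrature idea behind DQMOM: the bubble size
# distribution is represented by N weighted abscissas (w_i, L_i), and any
# moment m_k of the distribution is recovered as a weighted power sum.

def moment(weights, abscissas, k):
    return sum(w * L**k for w, L in zip(weights, abscissas))

w = [0.6, 0.4]          # number densities of two "bubble classes" (hypothetical)
L = [1.0e-3, 2.0e-3]    # representative bubble diameters in metres (hypothetical)

m0 = moment(w, L, 0)    # total number density
m1 = moment(w, L, 1)    # first moment (weighted sum of diameters)
```

Condensation then appears as a drift of the abscissas L_i toward smaller sizes, which is the internal phase-space convection the abstract refers to.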

  16. Implementation of continuous-variable quantum key distribution with composable and one-sided-device-independent security against coherent attacks

    DEFF Research Database (Denmark)

    Gehring, Tobias; Haendchen, Vitus; Duhme, Joerg

    2015-01-01

    Secret communication over public channels is one of the central pillars of a modern information society. Using quantum key distribution this is achieved without relying on the hardness of mathematical problems, which might be compromised by improved algorithms or by future quantum computers. State......-of-the-art quantum key distribution requires composable security against coherent attacks for a finite number of distributed quantum states as well as robustness against implementation side channels. Here we present an implementation of continuous-variable quantum key distribution satisfying these requirements. Our...... with conventional optical communication technology, our work is a step towards practical implementations of quantum key distribution with state-of-the-art security based solely on telecom components....

  17. Implementation of the EU-policy framework WFD and GWD in Europe - Activities of CIS Working Group Groundwater

    Science.gov (United States)

    Grath, Johannes; Ward, Rob; Hall, Anna

    2013-04-01

At the European level, the basic elements for groundwater management and protection are laid down in the Water Framework Directive (WFD) (2000/60/EC) and the Groundwater Daughter Directive (2006/118/EC). EU Member States, Norway and the European Commission (EC) have jointly developed a common strategy for supporting the implementation of the WFD. The main aim of this Common Implementation Strategy (CIS) is to ensure the coherent and harmonious implementation of the directives through the clarification of a number of methodological questions, enabling a common understanding to be reached on the technical and scientific implications of the WFD (European Communities, 2008). Groundwater-specific issues are dealt with in Working Group C Groundwater. Members of the working group are experts nominated by Member States, Norway, Switzerland and Accession Countries (from administrative bodies, research institutes, …) and representatives from relevant stakeholders and NGOs. Working Group C Groundwater has produced numerous guidance documents and technical reports that have been endorsed by EU Water Directors to support and enable Member States to implement the directives. All the documents are published by the EC. Access is available via the following link: http://ec.europa.eu/environment/water/water-framework/groundwater/activities.htm Having addressed implementation issues during the 1st river basin planning cycle, WG C Groundwater is currently focussing on the following issues: groundwater-dependent ecosystems, and climate change and groundwater. In the future, the outcome and recommendations of the "Blueprint" to safeguard Europe's water resources, recently published by the EC, will be of utmost importance in setting the agenda for the group. Most likely this will include water pricing, water demand management and water abstraction. Complementary to the particular working groups, a Science Policy Interface (SPI) activity has been established. Its purpose is

  18. HERA-B framework for online calibration and alignment

    International Nuclear Information System (INIS)

    Hernandez, J.M.; Ressing, D.; Rybnikov, V.; Sanchez, F.; Medinnis, M.; Kreuzer, P.; Schwanke, U.; Amorim, A.

    2004-09-01

    This paper describes the architecture and implementation of the HERA-B framework for online calibration and alignment. At HERA-B the performance of all trigger levels, including the online reconstruction, strongly depends on using the appropriate calibration and alignment constants, which might change during data taking. A system to monitor, recompute and distribute those constants to online processes has been integrated in the data acquisition and trigger systems. (orig.)

  19. The implement of java based GUI for data acquisition system

    International Nuclear Information System (INIS)

    Yang Xiaoqing

    2003-01-01

Web-based techniques have been used to produce a Graphic User Interface framework for a small Data Acquisition System. A CORBA library is used for communication with the JRCS servers. The GUI was implemented in Java Swing. The integration between Java and CORBA provides a powerful, independent distributed environment. (authors)

  20. Distributed cooperative control of AC microgrids

    Science.gov (United States)

    Bidram, Ali

    In this dissertation, the comprehensive secondary control of electric power microgrids is of concern. Microgrid technical challenges are mainly realized through the hierarchical control structure, including primary, secondary, and tertiary control levels. Primary control level is locally implemented at each distributed generator (DG), while the secondary and tertiary control levels are conventionally implemented through a centralized control structure. The centralized structure requires a central controller which increases the reliability concerns by posing the single point of failure. In this dissertation, the distributed control structure using the distributed cooperative control of multi-agent systems is exploited to increase the secondary control reliability. The secondary control objectives are microgrid voltage and frequency, and distributed generators (DGs) active and reactive powers. Fully distributed control protocols are implemented through distributed communication networks. In the distributed control structure, each DG only requires its own information and the information of its neighbors on the communication network. The distributed structure obviates the requirements for a central controller and complex communication network which, in turn, improves the system reliability. Since the DG dynamics are nonlinear and non-identical, input-output feedback linearization is used to transform the nonlinear dynamics of DGs to linear dynamics. Proposed control frameworks cover the control of microgrids containing inverter-based DGs. Typical microgrid test systems are used to verify the effectiveness of the proposed control protocols.
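The distributed cooperative idea can be sketched with a minimal discrete consensus iteration. This is an illustrative toy, not the dissertation's exact protocol; the communication graph, gain and frequency values are hypothetical.

```python
# Toy consensus sketch of distributed secondary control: each DG nudges its
# frequency estimate toward those of its communication-graph neighbors, so
# all DGs agree without any central controller.

def consensus_step(freqs, neighbors, eps=0.2):
    return [f + eps * sum(freqs[j] - f for j in neighbors[i])
            for i, f in enumerate(freqs)]

freqs = [59.8, 60.1, 60.3]                 # Hz, one value per DG (hypothetical)
neighbors = {0: [1], 1: [0, 2], 2: [1]}    # undirected line communication graph

for _ in range(200):
    freqs = consensus_step(freqs, neighbors)
# all entries approach the network average; each DG only ever used its own
# state and its neighbors' states
```

The single-point-of-failure argument is visible here: removing the central node is possible because no update references global information.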

  1. PHP frameworks

    OpenAIRE

    Srša, Aljaž

    2016-01-01

The thesis presents the four most popular PHP web frameworks: Laravel, Symfony, CodeIgniter and CakePHP. These frameworks are compared with each other according to four criteria that can help with the selection of a framework. These criteria are the size of the community, the quality of official support, the comprehensibility of the framework's documentation, and the implementation of functionalities in the individual frameworks, namely automatic code generation, routing, object-relational mapping and...

  2. Creating an outcomes framework.

    Science.gov (United States)

    Doerge, J B

    2000-01-01

Four constructs used to build a framework for outcomes management for a large midwestern tertiary hospital are described in this article. A system framework outlining a model of clinical integration and population management based on Steven Shortell's work is discussed. This framework includes key definitions of high-risk patients, target groups, populations and community. Roles for each level of population management, and how they were implemented in the health care system, are described. A point-of-service framework centered on seven dimensions of care is the next construct, applied on each nursing unit. The third construct outlines the framework for role development. Three roles for nursing were created to implement strategies for target groups that are strategic disease categories; two of those roles are described in depth. The philosophy of nursing practice is centered on caring and existential advocacy. The final construct is the modification of the Dartmouth model as a common framework for outcomes. System applications of the scorecard and lessons learned in the 2-year process of implementation are shared.

  3. A framework for stochastic simulation of distribution practices for hotel reservations

    Energy Technology Data Exchange (ETDEWEB)

    Halkos, George E.; Tsilika, Kyriaki D. [Laboratory of Operations Research, Department of Economics, University of Thessaly, Korai 43, 38 333, Volos (Greece)

    2015-03-10

    The focus of this study is primarily on the Greek hotel industry. The objective is to design and develop a framework for stochastic simulation of reservation requests, reservation arrivals, cancellations and hotel occupancy with a planning horizon of a tourist season. In Greek hospitality industry there have been two competing policies for reservation planning process up to 2003: reservations coming directly from customers and a reservations management relying on tour operator(s). Recently the Internet along with other emerging technologies has offered the potential to disrupt enduring distribution arrangements. The focus of the study is on the choice of distribution intermediaries. We present an empirical model for the hotel reservation planning process that makes use of a symbolic simulation, Monte Carlo method, as, requests for reservations, cancellations, and arrival rates are all sources of uncertainty. We consider as a case study the problem of determining the optimal booking strategy for a medium size hotel in Skiathos Island, Greece. Probability distributions and parameters estimation result from the historical data available and by following suggestions made in the relevant literature. The results of this study may assist hotel managers define distribution strategies for hotel rooms and evaluate the performance of the reservations management system.
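The kind of stochastic simulation described can be sketched as a toy Monte Carlo model. All rates, probabilities and capacities below are hypothetical illustration values, not the paper's estimates.

```python
# Toy Monte Carlo sketch of a hotel reservation season: daily booking
# requests arrive as a Poisson stream, each existing booking may later be
# cancelled, and occupancy is capped by room capacity.
import random

def simulate_season(days=180, rooms=50, req_rate=8.0,
                    p_cancel=0.15, seed=42):
    rng = random.Random(seed)
    occupied = []
    booked = 0
    for _ in range(days):
        # count Poisson-distributed daily requests via exponential gaps
        requests = 0
        t = rng.expovariate(req_rate)
        while t < 1.0:
            requests += 1
            t += rng.expovariate(req_rate)
        booked = min(rooms, booked + requests)
        # each existing booking independently cancels with probability p_cancel
        booked = sum(1 for _ in range(booked) if rng.random() > p_cancel)
        occupied.append(booked)
    return sum(occupied) / days   # mean occupancy over the season

mean_occ = simulate_season()
```

Running the simulation under alternative request and cancellation rates (e.g. direct bookings versus tour-operator allotments) is how such a model would compare distribution strategies.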

  4. A framework for stochastic simulation of distribution practices for hotel reservations

    International Nuclear Information System (INIS)

    Halkos, George E.; Tsilika, Kyriaki D.

    2015-01-01

    The focus of this study is primarily on the Greek hotel industry. The objective is to design and develop a framework for stochastic simulation of reservation requests, reservation arrivals, cancellations and hotel occupancy with a planning horizon of a tourist season. In Greek hospitality industry there have been two competing policies for reservation planning process up to 2003: reservations coming directly from customers and a reservations management relying on tour operator(s). Recently the Internet along with other emerging technologies has offered the potential to disrupt enduring distribution arrangements. The focus of the study is on the choice of distribution intermediaries. We present an empirical model for the hotel reservation planning process that makes use of a symbolic simulation, Monte Carlo method, as, requests for reservations, cancellations, and arrival rates are all sources of uncertainty. We consider as a case study the problem of determining the optimal booking strategy for a medium size hotel in Skiathos Island, Greece. Probability distributions and parameters estimation result from the historical data available and by following suggestions made in the relevant literature. The results of this study may assist hotel managers define distribution strategies for hotel rooms and evaluate the performance of the reservations management system

  5. Use of the Theoretical Domains Framework to evaluate factors driving successful implementation of the Accelerated Chest pain Risk Evaluation (ACRE) project.

    Science.gov (United States)

    Skoien, Wade; Page, Katie; Parsonage, William; Ashover, Sarah; Milburn, Tanya; Cullen, Louise

    2016-10-12

    The translation of healthcare research into practice is typically challenging and limited in effectiveness. The Theoretical Domains Framework (TDF) identifies 12 domains of behaviour determinants which can be used to understand the principles of behavioural change, a key factor influencing implementation. The Accelerated Chest pain Risk Evaluation (ACRE) project has successfully translated research into practice, by implementing an intervention to improve the assessment of low to intermediate risk patients presenting to emergency departments (EDs) with chest pain. The aims of this paper are to describe use of the TDF to determine which factors successfully influenced implementation and to describe use of the TDF as a tool to evaluate implementation efforts and which domains are most relevant to successful implementation. A 30-item questionnaire targeting clinicians was developed using the TDF as a guide. Questions encompassed ten of the domains of the TDF: Knowledge; Skills; Social/professional role and identity; Beliefs about capabilities; Optimism; Beliefs about consequences; Intentions; Memory, attention and decision processes; Environmental context and resources; and Social influences. Sixty-three of 176 stakeholders (36 %) responded to the questionnaire. Responses for all scales showed that respondents were highly favourable to all aspects of the implementation. Scales with the highest mean responses were Intentions, Knowledge, and Optimism, suggesting that initial education and awareness strategies around the ACRE project were effective. Scales with the lowest mean responses were Environmental context and resources, and Social influences, perhaps highlighting that implementation planning could have benefitted from further consideration of the factors underlying these scales. The ACRE project was successful, and therefore, a perfect case study for understanding factors which drive implementation success. The overwhelmingly positive response suggests that it

  6. A theoretical framework for convergence and continuous dependence of estimates in inverse problems for distributed parameter systems

    Science.gov (United States)

    Banks, H. T.; Ito, K.

    1988-01-01

    Numerical techniques for parameter identification in distributed-parameter systems are developed analytically. A general convergence and stability framework (for continuous dependence on observations) is derived for first-order systems on the basis of (1) a weak formulation in terms of sesquilinear forms and (2) the resolvent convergence form of the Trotter-Kato approximation. The extension of this framework to second-order systems is considered.

  7. A unified framework for spiking and gap-junction interactions in distributed neuronal network simulations

    Directory of Open Access Journals (Sweden)

    Jan eHahne

    2015-09-01

Contemporary simulators for networks of point and few-compartment model neurons come with a plethora of ready-to-use neuron and synapse models and support complex network topologies. Recent technological advancements have broadened the spectrum of application further to the efficient simulation of brain-scale networks on supercomputers. In distributed network simulations the amount of spike data that accrues per millisecond and process is typically low, such that a common optimization strategy is to communicate spikes at relatively long intervals, where the upper limit is given by the shortest synaptic transmission delay in the network. This approach is well-suited for simulations that employ only chemical synapses but it has so far impeded the incorporation of gap-junction models, which require instantaneous neuronal interactions. Here, we present a numerical algorithm based on a waveform-relaxation technique which allows for network simulations with gap junctions in a way that is compatible with the delayed communication strategy. Using a reference implementation in the NEST simulator, we demonstrate that the algorithm and the required data structures can be smoothly integrated with existing code such that they complement the infrastructure for spiking connections. To show that the unified framework for gap-junction and spiking interactions achieves high performance and delivers high accuracy...
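A toy version of the waveform-relaxation idea can be sketched as follows. This is illustrative only, not NEST's implementation: two gap-junction-coupled leaky "neurons" are each solved over a whole interval using the other's waveform from the previous sweep, and sweeps repeat until the exchanged waveforms stop changing.

```python
# Waveform-relaxation toy: solve each coupled ODE du/dt = -u + g*(v - u)
# independently over the interval, exchanging full waveforms between sweeps
# instead of stepping both equations in lock-step.

def solve(u0, other_wave, g=0.5, dt=0.01, steps=100):
    u, wave = u0, [u0]
    for k in range(steps):
        u += dt * (-u + g * (other_wave[k] - u))   # explicit Euler step
        wave.append(u)
    return wave

w1 = [1.0] * 101   # initial guess for neuron 1's waveform
w2 = [0.0] * 101   # initial guess for neuron 2's waveform
for sweep in range(30):
    w1_new = solve(1.0, w2)
    w2_new = solve(0.0, w1)
    delta = max(abs(a - b) for a, b in zip(w1, w1_new))
    w1, w2 = w1_new, w2_new
# delta shrinks sweep by sweep; the coupled solution is reached while keeping
# communication at interval boundaries, matching the delayed-spike strategy
```

The compatibility claim in the abstract is exactly this property: instantaneous coupling is resolved by iteration, not by tighter communication.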

  8. How to Invest in Getting Cost-effective Technologies into Practice? A Framework for Value of Implementation Analysis Applied to Novel Oral Anticoagulants.

    Science.gov (United States)

    Faria, Rita; Walker, Simon; Whyte, Sophie; Dixon, Simon; Palmer, Stephen; Sculpher, Mark

    2017-02-01

    Cost-effective interventions are often implemented slowly and suboptimally in clinical practice. In such situations, a range of implementation activities may be considered to increase uptake. A framework is proposed to use cost-effectiveness analysis to inform decisions on how best to invest in implementation activities. This framework addresses 2 key issues: 1) how to account for changes in utilization in the future in the absence of implementation activities; and 2) how to prioritize implementation efforts between subgroups. A case study demonstrates the framework's application: novel oral anticoagulants (NOACs) for the prevention of stroke in the National Health Service in England and Wales. The results suggest that there is value in additional implementation activities to improve uptake of NOACs, particularly in targeting patients with average or poor warfarin control. At a cost-effectiveness threshold of £20,000 per quality-adjusted life-year (QALY) gained, additional investment in an educational activity that increases the utilization of NOACs by 5% in all patients currently taking warfarin generates an additional 254 QALYs, compared with 973 QALYs in the subgroup with average to poor warfarin control. However, greater value could be achieved with higher uptake of anticoagulation more generally: switching 5% of patients who are potentially eligible for anticoagulation but are currently receiving no treatment or are using aspirin would generate an additional 4990 QALYs. This work can help health services make decisions on investment at different points of the care pathway or across disease areas in a manner consistent with the value assessment of new interventions.
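The value-of-implementation arithmetic that the framework formalizes can be sketched in a few lines. The numbers below are hypothetical, not the paper's inputs.

```python
# Back-of-envelope sketch: health gained from an implementation activity is
# roughly (extra uptake) x (eligible patients) x (QALYs gained per switch),
# and the cost-effectiveness threshold converts that gain into a maximum
# worthwhile spend on the activity. All figures are hypothetical.

def qalys_from_implementation(uptake_increase, eligible, qaly_per_patient):
    return uptake_increase * eligible * qaly_per_patient

gain = qalys_from_implementation(0.05, 100_000, 0.1)   # ≈ 0.05 * 100,000 * 0.1
max_spend = gain * 20_000   # at a 20,000 GBP/QALY threshold
```

This is also why targeting matters in the case study: subgroups with a larger per-patient QALY gain justify more implementation spending for the same uptake increase.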

  9. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: An Earth Modeling System Software Framework Strawman Design that Integrates Cactus and UCLA/UCB Distributed Data Broker

    Science.gov (United States)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn

    2002-01-01

One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document proposes a strawman framework design for the climate community based on the integration of Cactus, from the relativistic physics community, and the UCLA/UCB Distributed Data Broker (DDB) from the climate community. This design is the result of an extensive survey of climate models and frameworks in the climate community as well as frameworks from many other scientific communities. The design addresses fundamental development and runtime needs using Cactus, a framework with interfaces for FORTRAN and C-based languages, and high-performance model communication needs using DDB. This document also specifically explores object-oriented design issues in the context of climate modeling as well as climate modeling issues in terms of object-oriented design.

  10. More performance results and implementation of an object oriented track reconstruction model in different OO frameworks

    International Nuclear Information System (INIS)

    Gaines, Irwin; Qian Sijin

    2001-01-01

This is an update of the report on an Object Oriented (OO) track reconstruction model, which was presented at the previous AIHENP'99 in Crete, Greece. The OO model for the Kalman filtering method has been designed for high energy physics experiments at high luminosity hadron colliders. It has been coded in the C++ programming language and successfully implemented in a few different OO computing environments of the CMS and ATLAS experiments at the future Large Hadron Collider at CERN. We shall report: (1) more performance results; (2) the implementation of the OO model in the new ATLAS software OO framework 'Athena', and some upgrades of the OO model itself.
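The Kalman filtering method at the heart of the model can be illustrated with a minimal scalar update step. This is illustrative only; real track fitting propagates multi-dimensional track states between detector layers.

```python
# Minimal scalar Kalman-filter update: combine a predicted track parameter
# (x, with variance P) with a new detector measurement (z, with variance R).

def kalman_update(x, P, z, R):
    K = P / (P + R)            # Kalman gain: how much to trust the measurement
    x_new = x + K * (z - x)    # updated state estimate
    P_new = (1 - K) * P        # updated (reduced) variance
    return x_new, P_new

x, P = 0.0, 4.0                        # predicted parameter and its variance
x, P = kalman_update(x, P, z=1.0, R=1.0)
# the estimate moves toward the measurement and the variance shrinks
```

In a track fit, this update is repeated layer by layer, with a propagation (prediction) step between detector surfaces.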

  11. Towards Scalable Distributed Framework for Urban Congestion Traffic Patterns Warehousing

    Directory of Open Access Journals (Sweden)

    A. Boulmakoul

    2015-01-01

We put forward the architecture of a framework for the integration of data from moving objects in an urban transportation network. Most of this research refers to GPS outdoor geolocation technology and uses a distributed cloud infrastructure with a big data NoSQL database. A network of intelligent mobile sensors, distributed over the urban network, produces congestion traffic patterns. Congestion predictions are based on an extended simulation model. This model provides traffic indicator calculations, which are fused with the GPS data to allow estimation of traffic states across the whole network. The discovery process for congestion patterns uses the semantic trajectories metamodel given in our previous works. The challenge of the proposed solution is to store traffic patterns in a way that supports surveillance and intelligent real-time control of the network, to reduce congestion and avoid its consequences. The fusion of real-time data from GPS-enabled smartphones with data provided by existing traffic systems improves knowledge of traffic congestion, generates new information for soft operational control and provides intelligent added value for the deployment of transportation systems.

  12. Effect of implementing the 5As of obesity management framework on provider-patient interactions in primary care.

    Science.gov (United States)

    Rueda-Clausen, C F; Benterud, E; Bond, T; Olszowka, R; Vallis, M T; Sharma, A M

    2014-02-01

    Obesity counselling in primary care is positively associated with self-reported behaviour change in patients with obesity. Obesity counselling is rare, however, and when it does occur it is often of low quality because of poor provider training and/or competency in obesity management, lack of time, economic disincentives, and negative attitudes towards obesity and obesity management. 5As frameworks are routinely used for behaviour-change counselling and addiction management (e.g. smoking cessation), but few studies have examined their efficacy for weight management. This study presents pilot data from the implementation and evaluation of an obesity management tool (the 5As of Obesity Management, developed by the Canadian Obesity Network) in a primary care setting. Results show that the tool facilitates weight management in primary care by promoting physician-patient communication, medical assessments for obesity and plans for follow-up care. Obesity remains poorly managed in primary care. The 5As of Obesity Management is a theory-driven, evidence-based minimal intervention designed to facilitate obesity counselling and management by primary care practitioners. This project tested the impact of implementing this tool in primary care clinics. Electronic self-administered surveys were completed by pre-screened obese subjects at the end of their appointments in four primary care clinics (over 25 healthcare providers [HCPs]). These measurements were performed before (baseline, n = 51) and 1 month after implementing the 5As of Obesity Management (post-intervention, n = 51). The intervention consisted of one online training session (90 min) and distribution of the 5As toolkit to HCPs of the participating clinics. Subjects completing the survey before and after the intervention were comparable in terms of age, sex, body mass index, comorbidities, satisfaction and self-reported health status (P > 0.2). Implementing the 5As of Obesity Management resulted in a twofold increase

  13. Toward enhancing the distributed video coder under a multiview video codec framework

    Science.gov (United States)

    Lee, Shih-Chieh; Chen, Jiann-Jone; Tsai, Yao-Hong; Chen, Chin-Hua

    2016-11-01

    The advance of video coding technology enables multiview video (MVV) or three-dimensional television (3-D TV) display for users with or without glasses. For mobile devices or wireless applications, a distributed video coder (DVC) can be utilized to shift the encoder complexity to the decoder under the MVV coding framework, denoted as multiview distributed video coding (MDVC). We proposed to exploit both inter- and intraview video correlations to enhance side information (SI) and improve the MDVC performance: (1) based on the multiview motion estimation (MVME) framework, a categorized block matching prediction with fidelity weights (COMPETE) was proposed to yield a high-quality SI frame for better DVC reconstructed images; (2) the block transform coefficient properties, i.e., DCs and ACs, were exploited to design the priority rate control for the turbo code, such that DVC decoding can be carried out with the fewest parity bits. In comparison, the proposed COMPETE method demonstrated lower time complexity while presenting better reconstructed video quality. Simulations show that the proposed COMPETE can reduce the time complexity of MVME by a factor of 1.29 to 2.56 compared to previous hybrid MVME methods, while the peak signal-to-noise ratios (PSNRs) of decoded video can be improved by 0.2 to 3.5 dB compared to H.264/AVC intracoding.

  14. Using the Consolidated Framework for Implementation Research (CFIR) to produce actionable findings: a rapid-cycle evaluation approach to improving implementation.

    Science.gov (United States)

    Keith, Rosalind E; Crosson, Jesse C; O'Malley, Ann S; Cromp, DeAnn; Taylor, Erin Fries

    2017-02-10

    Much research does not address the practical needs of stakeholders responsible for introducing health care delivery interventions into organizations working to achieve better outcomes. In this article, we present an approach to using the Consolidated Framework for Implementation Research (CFIR) to guide systematic research that supports rapid-cycle evaluation of the implementation of health care delivery interventions and produces actionable evaluation findings intended to improve implementation in a timely manner. To present our approach, we describe a formative cross-case qualitative investigation of 21 primary care practices participating in the Comprehensive Primary Care (CPC) initiative, a multi-payer supported primary care practice transformation intervention led by the Centers for Medicare and Medicaid Services. Qualitative data include observational field notes and semi-structured interviews with primary care practice leadership, clinicians, and administrative and medical support staff. We use intervention-specific codes and CFIR constructs to reduce and organize the data to support cross-case analysis of patterns of barriers and facilitators relating to different CPC components. Using the CFIR to guide data collection, coding, analysis, and reporting of findings supported a systematic, comprehensive, and timely understanding of barriers and facilitators to practice transformation. Our approach to using the CFIR produced actionable findings for improving implementation effectiveness during this initiative and for identifying improvements to implementation strategies for future practice transformation efforts. The CFIR is a useful tool for guiding rapid-cycle evaluation of the implementation of practice transformation initiatives. Using the approach described here, we systematically identified where adjustments and refinements to the intervention could be made in the second year of the 4-year intervention. We think the approach we describe has broad

  15. A framework using cluster-based hybrid network architecture for collaborative virtual surgery.

    Science.gov (United States)

    Qin, Jing; Choi, Kup-Sze; Poon, Wai-Sang; Heng, Pheng-Ann

    2009-12-01

    Research on collaborative virtual environments (CVEs) opens the opportunity for simulating cooperative work in surgical operations. It is, however, a challenging task to implement a high-performance collaborative surgical simulation system because of the difficulty of maintaining state consistency with minimal network latencies, especially when sophisticated deformable models and haptics are involved. In this paper, an integrated framework using cluster-based hybrid network architecture is proposed to support collaborative virtual surgery. Multicast transmission is employed to transmit updated information among participants in order to reduce network latencies, while system consistency is maintained by an administrative server. Reliable multicast is implemented using distributed message acknowledgment based on cluster cooperation and a sliding-window technique. The robustness of the framework is guaranteed by the failure detection chain, which enables smooth transitions when participants join and leave the collaboration, including normal and involuntary leaving. Communication overhead is further reduced by implementing a number of management approaches such as computational policies and collaborative mechanisms. The feasibility of the proposed framework is demonstrated by successfully extending an existing standalone orthopedic surgery trainer into a collaborative simulation system. A series of experiments has been conducted to evaluate system performance. The results demonstrate that the proposed framework is capable of supporting collaborative surgical simulation.
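    The reliable-multicast idea above, acknowledgments tracked per participant over a bounded sliding window, can be sketched in a few lines. This is a generic illustration under our own simplifying assumptions (single sender, no timeouts or retransmission timers), not the paper's cluster-based protocol:

```python
from collections import deque

class ReliableMulticastSender:
    """Minimal sliding-window sender: a message stays buffered until every
    participant in the session has acknowledged it (hypothetical sketch)."""
    def __init__(self, participants, window_size=4):
        self.participants = set(participants)
        self.window = {}           # seq -> (payload, participants still pending)
        self.window_size = window_size
        self.next_seq = 0
        self.queue = deque()       # messages waiting for a free window slot

    def send(self, payload):
        if len(self.window) >= self.window_size:
            self.queue.append(payload)   # window full: defer the message
            return None
        seq = self.next_seq
        self.next_seq += 1
        self.window[seq] = (payload, set(self.participants))
        return seq                       # a real sender would multicast here

    def ack(self, seq, participant):
        if seq in self.window:
            self.window[seq][1].discard(participant)
            if not self.window[seq][1]:  # all acks in: slide the window
                del self.window[seq]
                if self.queue:
                    self.send(self.queue.popleft())

    def unacked(self):
        return sorted(self.window)       # candidates for retransmission

sender = ReliableMulticastSender({"a", "b"}, window_size=4)
for m in ["m0", "m1", "m2", "m3", "m4"]:
    sender.send(m)                       # "m4" is deferred: window is full
sender.ack(0, "a"); sender.ack(0, "b")   # seq 0 fully acked, "m4" goes out
```

    Once every participant has acknowledged sequence number 0, the window slides and the deferred message is assigned the next sequence number.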

  16. A multistage framework for reliability-based distribution expansion planning considering distributed generations by a self-adaptive global-based harmony search algorithm

    International Nuclear Information System (INIS)

    Shivaie, Mojtaba; Ameli, Mohammad T.; Sepasian, Mohammad S.; Weinsier, Philip D.; Vahidinasab, Vahid

    2015-01-01

    In this paper, the authors present a new multistage framework for reliability-based Distribution Expansion Planning (DEP) in which the expansion options are the reinforcement and/or installation of substations, feeders, and Distributed Generations (DGs). The proposed framework takes into account not only the costs associated with investment, maintenance, and operation, but also the expected customer interruption cost, as four problem objectives. At the same time, operational restrictions, Kirchhoff's laws, the radial structure limitation, voltage limits, and the capital expenditure budget restriction are considered as problem constraints. The proposed model is a non-convex optimization problem of a non-linear, mixed-integer nature. Hence, a hybrid Self-adaptive Global-based Harmony Search Algorithm (SGHSA) and Optimal Power Flow (OPF) were used, followed by a fuzzy satisfying method, in order to obtain the final optimal solution. The SGHSA is a recently developed optimization algorithm which imitates the music improvisation process, in which harmonists improvise their instrument pitches, searching for the perfect state of harmony. The planning methodology was applied to the 27-node, 13.8-kV test system to demonstrate the feasibility and capability of the proposed model. Simulation results illustrated the adequacy and profitability of the newly developed framework when compared with other methods. - Highlights: • A new multistage framework is presented for the reliability-based DEP problem. • DGs are considered as an expansion option to increase the flexibility of the proposed model. • The effective factors of the DEP problem are incorporated into a multi-objective model. • Three algorithms, HSA, IHSA and SGHSA, are proposed. • Results obtained by the proposed SGHSA algorithm are better than those of the others
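    Harmony search, the metaheuristic underlying the SGHSA, keeps a memory of candidate solutions and improvises new ones from it. A minimal generic version is shown below (fixed parameters, simple box constraints, and a toy objective; the paper's self-adaptive parameter tuning and the OPF coupling are not reproduced here):

```python
import random

def harmony_search(cost, dim, bounds, hms=10, hmcr=0.9, par=0.3,
                   bw=0.05, iters=2000, seed=1):
    """Minimal harmony search: improvise new vectors from a memory of good
    ones; hmcr = memory-considering rate, par = pitch-adjustment rate."""
    rng = random.Random(seed)
    lo, hi = bounds
    memory = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    memory.sort(key=cost)
    for _ in range(iters):
        new = []
        for d in range(dim):
            if rng.random() < hmcr:                    # pick pitch from memory
                v = rng.choice(memory)[d]
                if rng.random() < par:                 # small pitch adjustment
                    v += rng.uniform(-bw, bw)
            else:                                      # random improvisation
                v = rng.uniform(lo, hi)
            new.append(min(hi, max(lo, v)))            # clip to the box
        if cost(new) < cost(memory[-1]):               # replace worst harmony
            memory[-1] = new
            memory.sort(key=cost)
    return memory[0]

# Usage: minimize the sphere function on [-5, 5]^3
best = harmony_search(lambda x: sum(v * v for v in x), 3, (-5.0, 5.0))
```

    After a couple of thousand improvisations the best harmony should sit close to the origin, the sphere function's minimum.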

  17. Evaluation Framework for Telemedicine Using the Logical Framework Approach and a Fishbone Diagram.

    Science.gov (United States)

    Chang, Hyejung

    2015-10-01

    Technological advances in telemedicine and telehealth are growing in healthcare fields, but the evaluation framework for them is inconsistent and limited. This paper suggests a comprehensive evaluation framework for telemedicine system implementation that will support related stakeholders' decision-making by promoting general understanding and resolving arguments and controversies. This study focused on developing a comprehensive evaluation framework by summarizing themes across the range of evaluation techniques and organizing foundational evaluation frameworks that are generally applicable, drawing on studies and cases of diverse telemedicine. Evaluation factors related to aspects of information technology, the satisfaction of service providers and consumers, cost, quality, and information security are organized using a fishbone diagram. Developing a monitoring and evaluation framework for telemedicine is not easy, since such frameworks are very complex, with many potential inputs, activities, outputs, outcomes, and stakeholders. A conceptual framework was developed that incorporates the key dimensions to be considered in the evaluation of telehealth implementation, providing a formal structured approach to the evaluation of a service. The suggested framework consists of six major dimensions and subsequent branches for each dimension. To implement telemedicine and telehealth services, stakeholders should make decisions based on sufficient evidence of quality and safety as measured by the comprehensive evaluation framework. Further work would be valuable in applying more comprehensive evaluations to verify and improve the framework across a variety of contexts with more factors and participant group dimensions.

  18. A framework for multi-object tracking over distributed wireless camera networks

    Science.gov (United States)

    Gau, Victor; Hwang, Jenq-Neng

    2010-07-01

    In this paper, we propose a unified framework targeting two important issues in a distributed wireless camera network, i.e., object tracking and network communication, to achieve reliable multi-object tracking over distributed wireless camera networks. In the object tracking part, we propose a fully automated approach for tracking multiple objects across multiple cameras with overlapping and non-overlapping fields of view, without initial training. To effectively exchange tracking information among the distributed cameras, we propose an idle-probability-based broadcasting method, iPro, which adaptively adjusts the broadcast probability to improve broadcast effectiveness in a dense, saturated camera network. Experimental results for multi-object tracking demonstrate the promising performance of our approach on real video sequences for cameras with overlapping and non-overlapping views. The modeling and ns-2 simulation results show that iPro almost approaches the theoretical performance upper bound if cameras are within each other's transmission range. In more general scenarios, e.g., in the case of hidden node problems, the simulation results show that iPro significantly outperforms standard IEEE 802.11, especially when the number of competing nodes increases.
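    The idea of adapting broadcast probability to how busy the channel looks can be illustrated with a toy slotted-channel simulation. Note that the adaptation rule below (raise the probability after an idle slot, back off after a collision) is our own simplified stand-in, not the authors' iPro algorithm; it merely shows the slotted-ALOHA-like regime such schemes settle into:

```python
import random

def simulate_adaptive_broadcast(n_nodes, slots=20000, seed=7):
    """Toy slotted broadcast: a common transmit probability p is raised
    after idle slots and lowered after collisions, steering the channel
    toward roughly one transmitter per slot."""
    rng = random.Random(seed)
    p = 1.0 / n_nodes
    successes = 0
    for _ in range(slots):
        tx = sum(1 for _ in range(n_nodes) if rng.random() < p)
        if tx == 0:
            p = min(1.0, p * 1.05)      # channel idle: broadcast more often
        elif tx == 1:
            successes += 1              # exactly one transmitter: delivery
        else:
            p = max(1e-4, p / 1.05)     # collision: back off
    return successes / slots

rate = simulate_adaptive_broadcast(n_nodes=20)
```

    In equilibrium the success rate hovers near the slotted-ALOHA optimum of about 1/e deliveries per slot.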

  19. Measuring implementation behaviour of menu guidelines in the childcare setting: confirmatory factor analysis of a theoretical domains framework questionnaire (TDFQ).

    Science.gov (United States)

    Seward, Kirsty; Wolfenden, Luke; Wiggers, John; Finch, Meghan; Wyse, Rebecca; Oldmeadow, Christopher; Presseau, Justin; Clinton-McHarg, Tara; Yoong, Sze Lin

    2017-04-04

    While there are a number of frameworks which focus on supporting the implementation of evidence-based approaches, few psychometrically valid measures exist to assess constructs within these frameworks. This study aimed to develop and psychometrically assess a scale measuring each domain of the Theoretical Domains Framework for use in assessing the implementation of dietary guidelines within a non-healthcare setting (childcare services). A 75-item, 14-domain Theoretical Domains Framework Questionnaire (TDFQ) was developed and administered via telephone interview to 202 centre-based childcare service cooks who had a role in planning the service menu. Confirmatory factor analysis (CFA) was undertaken to assess the reliability, discriminant validity and goodness of fit of the 14-domain measure. For the CFA, five iterative processes of adjustment were undertaken in which 14 items were removed, resulting in a final measure consisting of 14 domains and 61 items. For the final measure, the Chi-square goodness-of-fit statistic was 3447.19; the Standardized Root Mean Square Residual (SRMR) was 0.070; the Root Mean Square Error of Approximation (RMSEA) was 0.072; and the Comparative Fit Index (CFI) was 0.78. While only one of the three indices supports the goodness of fit of the measurement model tested, the 14-domain model with 61 items showed good discriminant validity and internally consistent items. Future research should aim to assess the psychometric properties of the developed TDFQ in other community-based settings.

  20. European union water policy--tasks for implementing "Water Framework Directive" in pre-accession countries.

    Science.gov (United States)

    Sözen, Seval; Avcioglu, Ebru; Ozabali, Asli; Görgun, Erdem; Orhon, Derin

    2003-08-01

    The Water Framework Directive, aiming to maintain and improve the aquatic environment in the EU, was launched by the European Parliament in 2000. According to this directive, control of quantity is an ancillary element in securing good water quality, and measures on quantity, serving the objective of ensuring good quality, should therefore also be established. Accordingly, it is a comprehensive and coordinated package that will ensure all European waters are protected according to a common standard. It therefore refers to all other directives related to water resources management, such as the Urban Wastewater Treatment Directive, the Nitrates Directive, the Drinking Water Directive, and Integrated Pollution Prevention and Control. Turkey, as a candidate state targeting full membership, should complete the necessary preparations for the implementation of the "Water Framework Directive" as soon as possible. In this study, the necessary legislative, political, institutional, and technical steps for pre-accession countries have been discussed and effective recommendations have been offered for future activities in Turkey.

  1. Evaluation of the implementation of a whole-workplace walking programme using the RE-AIM framework

    Directory of Open Access Journals (Sweden)

    Emma J. Adams

    2017-05-01

    Full Text Available Abstract Background Promoting walking for the journey to/from work and during the working day is one potential approach to increasing physical activity in adults. Walking Works was a practice-led, whole-workplace walking programme delivered by employees (walking champions). This study aimed to evaluate the implementation of Walking Works using the RE-AIM framework and provide recommendations for future delivery of whole-workplace walking programmes. Methods Two cross-sectional surveys were conducted; 1544 employees (28%) completed the baseline survey and 918 employees (21%) completed the follow-up survey. Effectiveness was assessed using baseline and follow-up data; reach, implementation and maintenance were assessed using follow-up data only. For categorical data, Chi-square tests were conducted to assess differences between surveys or groups. Continuous data were analysed to test for significant differences using a Mann-Whitney U test. Telephone interviews were conducted with the lead organisation co-ordinator, eight walking champions and three business representatives at follow-up. Interviews were transcribed verbatim and analysed to identify key themes related to adoption, implementation and maintenance. Results Adoption: Five workplaces participated in Walking Works. Reach: 480 employees (52.3%) were aware of activities and 221 (24.1%) participated. Implementation: A variety of walking activities were delivered. Some programme components were not delivered as planned, partly because of barriers to using walking champions to deliver activities. These included the walking champions' capacity, skills, support needs, ability to engage senior management, and the number and type of activities they could deliver. Other barriers included lack of management support, difficulties communicating information about activities and challenges embedding the programme into normal business activities. Effectiveness: No significant changes in walking to

  2. HCI^2 Framework: A software framework for multimodal human-computer interaction systems

    NARCIS (Netherlands)

    Shen, Jie; Pantic, Maja

    2013-01-01

    This paper presents a novel software framework for development and research in the area of multimodal human-computer interface (MHCI) systems. The proposed software framework, called the HCI^2 Framework, is built upon publish/subscribe (P/S) architecture. It implements a

  3. The Vehicular Information Space Framework

    Science.gov (United States)

    Prinz, Vivian; Schlichter, Johann; Schweiger, Benno

    Vehicular networks are distributed, self-organizing and highly mobile ad hoc networks. They allow drivers to be provided with up-to-the-minute information about their environment and are therefore expected to be a decisive future enabler for enhancing driving comfort and safety. This article introduces the Vehicular Information Space framework (VIS). Vehicles running the VIS form a kind of distributed database, enabling them to provide information such as existing hazards, parking spaces or traffic densities in a location-aware and fully distributed manner. In addition, vehicles can retrieve, modify and delete these information items. The underlying algorithm is based on features derived from existing structured Peer-to-Peer algorithms, extended to suit the specific characteristics of highly mobile ad hoc networks. We present, implement and simulate the VIS using a motorway and an urban traffic environment. Simulation studies of VIS message occurrence show that the VIS incurs reasonable traffic overhead. Moreover, overall VIS message traffic is independent of the number of information items provided.

  4. Climate Services Information System Activities in Support of The Global Framework for Climate Services Implementation

    Science.gov (United States)

    Timofeyeva-Livezey, M. M.; Horsfall, F. M. C.; Pulwarty, R. S.; Klein-Tank, A.; Kolli, R. K.; Hechler, P.; Dilley, M.; Ceron, J. P.; Goodess, C.

    2017-12-01

    The WMO Commission on Climatology (CCl) supports the implementation of the Global Framework for Climate Services (GFCS) with a particular focus on the Climate Services Information System (CSIS), which is the core operational component of GFCS at the global, regional, and national levels. CSIS is designed for producing, packaging and operationally delivering authoritative climate information data and products through appropriate operational systems, practices, data exchange, technical standards, authentication, communication, and product delivery. Its functions include climate analysis and monitoring, assessment and attribution, prediction (monthly, seasonal, decadal), and projection (centennial scale), as well as tailoring the associated products to suit user requirements. A central, enabling piece of the implementation of CSIS is the Climate Services Toolkit (CST). In its development phase, the CST exists as a prototype (www.wmo.int/cst), a compilation of tools for generating tailored data and products for decision-making, with a special focus on national requirements in developing countries. WMO provides a server to house the CST prototype as well as support for operations and maintenance. WMO members provide technical expertise and other in-kind support, including leadership of the CSIS development team. Several recent WMO events have helped with the deployment of the CST within the eight countries that have been recognized by GFCS as illustrative for developing their climate services at national levels. These countries are currently developing climate services projects focusing on service development and delivery for selected economic sectors, such as health, agriculture, energy, water resources, and hydrometeorological disaster risk reduction. They are working together with their respective WMO Regional Climate Centers (RCCs), which provide technical assistance with the implementation of climate services projects at the country level and facilitate development of

  5. A Survey of Software Infrastructures and Frameworks for Ubiquitous Computing

    Directory of Open Access Journals (Sweden)

    Christoph Endres

    2005-01-01

    Full Text Available In this survey, we discuss 29 software infrastructures and frameworks which support the construction of distributed interactive systems. They range from small projects with one implemented prototype to large-scale research efforts, and they come from the fields of Augmented Reality (AR), Intelligent Environments, and Distributed Mobile Systems. In their own way, they can all be used to implement various aspects of the ubiquitous computing vision as described by Mark Weiser [60]. This survey is meant as a starting point for new projects, in order to choose an existing infrastructure for reuse, or to get an overview before designing a new one. It tries to provide a systematic, relatively broad (and necessarily not very deep) overview, while pointing to relevant literature for in-depth study of the systems discussed.

  6. Integrating the SRB with the GIGGLE framework

    Energy Technology Data Exchange (ETDEWEB)

    Barrass, T A [Physics Department, University of Bristol, Bristol BS8 1TL (United Kingdom); Maroney, O J E [Physics Department, University of Bristol, Bristol BS8 1TL (United Kingdom); Metson, S [Physics Department, University of Bristol, Bristol BS8 1TL (United Kingdom); Newbold, D [Physics Department, University of Bristol, Bristol BS8 1TL (United Kingdom)

    2004-11-21

    Distributed data transfer is currently characterised by the use of widely disparate tools, meaning that significant human effort is required to maintain the distributed system. In order to realise the possibilities represented by Grid infrastructure, the reality of a heterogeneous computing environment must be tackled by providing means by which these disparate elements can communicate. Two such data distribution tools are the SRB and the EU DataGrid's Data Management fabric, both widely used by many large scientific projects. Both provide similar functionality--the replication and cataloguing of datasets in a globally distributed environment. Significant quantities of data are currently stored in both. Moving data from the SRB to the EUDG, however, requires significant intervention and is therefore not scalable. This paper presents a mechanism by which the SRB can automatically interact with the GIGGLE framework as implemented by the EUDG, allowing access to SRB data using Grid tools.

  7. Microplastics in seawater: Recommendations from the Marine Strategy Framework Directive implementation process

    Directory of Open Access Journals (Sweden)

    Jesus Gago

    2016-11-01

    Full Text Available Microplastic litter is a pervasive pollutant present in marine systems across the globe. The legacy of microplastics pollution in the marine environment today may remain for years to come due to the persistence of these materials. Microplastics are emerging contaminants of potential concern and as yet there are few recognised approaches for monitoring. In 2008, the EU Marine Strategy Framework Directive (MSFD, 2008/56/EC included microplastics as an aspect to be measured. Here we outline the approach as discussed by the European Union expert group on marine litter, the technical Subgroup on Marine litter (TSG-ML, with a focus on the implementation of monitoring microplastics in seawater in European seas. It is concluded that harmonization and coherence is needed to achieve reliable monitoring.

  8. An Erlang Implementation of Multiparty Session Actors

    Directory of Open Access Journals (Sweden)

    Simon Fowler

    2016-08-01

    Full Text Available By requiring co-ordination to take place using explicit message passing instead of relying on shared memory, actor-based programming languages have been shown to be effective tools for building reliable and fault-tolerant distributed systems. Although naturally communication-centric, communication patterns in actor-based applications remain informally specified, meaning that errors in communication are detected late, if at all. Multiparty session types are a formalism to describe, at a global level, the interactions between multiple communicating entities. This article describes the implementation of a prototype framework for monitoring Erlang/OTP gen_server applications against multiparty session types, showing how previous work on multiparty session actors can be adapted to a purely actor-based language, and how monitor violations and termination of session participants can be reported in line with the Erlang mantra of "let it fail". Finally, the framework is used to implement two case studies: an adaptation of a freely-available DNS server, and a chat server.
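    The monitoring approach described above checks each message a participant sends or receives against its local projection of the global session type, flagging violations at runtime. A language-agnostic sketch of such a monitor, here in Python rather than Erlang, and with a made-up toy protocol rather than the paper's case studies, could be:

```python
class SessionMonitor:
    """Minimal finite-state monitor in the spirit of multiparty session
    monitoring: a participant's local projection is a table
    state -> {message label: next state}, and any unexpected label is a
    violation (generic sketch, unrelated to the Erlang/OTP implementation)."""
    def __init__(self, transitions, start, final):
        self.transitions = transitions
        self.state = start
        self.final = final

    def check(self, label):
        nxt = self.transitions.get(self.state, {}).get(label)
        if nxt is None:
            raise ValueError(f"violation: {label!r} not allowed in state {self.state!r}")
        self.state = nxt

    def complete(self):
        return self.state in self.final

# Local projection for a toy chat client: join, then any number of
# messages, then leave.
monitor = SessionMonitor(
    transitions={"init": {"join": "joined"},
                 "joined": {"msg": "joined", "leave": "done"}},
    start="init", final={"done"})
for label in ["join", "msg", "msg", "leave"]:
    monitor.check(label)
```

    A conforming trace drives the monitor into a final state; any out-of-protocol label raises immediately, which is where an Erlang monitor would instead let the offending session participant fail.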

  9. Determining the predictors of innovation implementation in healthcare: a quantitative analysis of implementation effectiveness.

    Science.gov (United States)

    Jacobs, Sara R; Weiner, Bryan J; Reeve, Bryce B; Hofmann, David A; Christian, Michael; Weinberger, Morris

    2015-01-22

    The failure rates for implementing complex innovations in healthcare organizations are high. Estimates range from 30% to 90% depending on the scope of the organizational change involved, the definition of failure, and the criteria to judge it. The innovation implementation framework offers a promising approach to examine the organizational factors that determine effective implementation. To date, the utility of this framework in a healthcare setting has been limited to qualitative studies and/or group level analyses. Therefore, the goal of this study was to quantitatively examine this framework among individual participants in the National Cancer Institute's Community Clinical Oncology Program using structural equation modeling. We examined the innovation implementation framework using structural equation modeling (SEM) among 481 physician participants in the National Cancer Institute's Community Clinical Oncology Program (CCOP). The data sources included the CCOP Annual Progress Reports, surveys of CCOP physician participants and administrators, and the American Medical Association Physician Masterfile. Overall the final model fit well. Our results demonstrated that not only did perceptions of implementation climate have a statistically significant direct effect on implementation effectiveness, but physicians' perceptions of implementation climate also mediated the relationship between organizational implementation policies and practices (IPP) and enrollment (p innovation implementation framework between IPP, implementation climate, and implementation effectiveness among individual physicians. This finding is important, as although the model has been discussed within healthcare organizations before, the studies have been predominately qualitative in nature and/or at the organizational level. In addition, our findings have practical applications. 
Managers looking to increase implementation effectiveness of an innovation should focus on creating an environment that

  10. Optimizing Client Latency in a Distributed System by Using the “Remote Façade” Design Pattern

    Directory of Open Access Journals (Sweden)

    Cosmin RABLOU

    2010-01-01

    Full Text Available In this paper we investigate the role of the Remote Façade pattern in the optimization of distributed systems. The intent of this pattern is to wrap fine-grained remote objects in a coarse-grained interface and thus greatly reduce the total number of calls executed over the network. The performance gain achieved by implementing this pattern is measured through testing with a distributed application written in C# using the latest Microsoft framework for distributed systems (Windows Communication Foundation). Furthermore, we present the scenarios in which the implementation of the Remote Façade pattern brings a significant performance gain. Finally, we show further scenarios in which the performance impact of this pattern can be investigated.
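    The pattern's effect, collapsing several fine-grained remote calls into one coarse-grained call, can be sketched without any networking at all by counting round trips. The class names below are illustrative (the paper's measurements use a C# client over WCF):

```python
class Address:
    """Fine-grained domain object: if exposed remotely, each field
    assignment would cost one network round trip."""
    def __init__(self):
        self.street = self.city = self.zip_code = None

class AddressFacade:
    """Remote Façade: one coarse-grained call carries all the data, so a
    client makes a single round trip instead of three (illustrative sketch)."""
    def __init__(self, address):
        self._address = address
        self.calls = 0                      # stands in for network round trips

    def set_address(self, street, city, zip_code):
        self.calls += 1
        self._address.street = street
        self._address.city = city
        self._address.zip_code = zip_code

facade = AddressFacade(Address())
facade.set_address("10 Main St", "Springfield", "12345")
```

    The façade keeps the fine-grained object intact behind it; only the remote boundary sees the coarse-grained interface, which is exactly the trade-off the paper quantifies.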

  11. Nonlocal approach to the analysis of the stress distribution in granular systems. I. Theoretical framework

    Science.gov (United States)

    Kenkre, V. M.; Scott, J. E.; Pease, E. A.; Hurd, A. J.

    1998-05-01

    A theoretical framework for the analysis of the stress distribution in granular materials is presented. It makes use of a transformation of the vertical spatial coordinate into a formal time variable and the subsequent study of a generally non-Markoffian, i.e., memory-possessing (nonlocal) propagation equation. Previous treatments are obtained as particular cases corresponding to, respectively, wavelike and diffusive limits of the general evolution. Calculations are presented for stress propagation in bounded and unbounded media. They can be used to obtain desired features such as a prescribed stress distribution within the compact.
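    Schematically, the framework treats the depth coordinate z as a formal time and studies a memory-possessing propagation equation for the stress; the rendering below is our paraphrase of that structure, not the paper's exact notation:

```latex
\frac{\partial \sigma(\mathbf{r}_{\perp}, z)}{\partial z}
  = \int_{0}^{z} \phi(z - z')\, \nabla_{\perp}^{2}\, \sigma(\mathbf{r}_{\perp}, z')\, \mathrm{d}z'
```

    The memory kernel \(\phi\) interpolates between the two earlier treatments mentioned in the abstract: a delta-function kernel \(\phi(z) = D\,\delta(z)\) reduces the equation to the diffusive (parabolic) limit \(\partial_z \sigma = D \nabla_{\perp}^{2} \sigma\), while a constant kernel \(\phi(z) = c^{2}\) differentiates into the wavelike (hyperbolic) limit \(\partial_z^{2} \sigma = c^{2} \nabla_{\perp}^{2} \sigma\).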

  12. Efficient string similarity join in multi-core and distributed systems.

    Directory of Open Access Journals (Sweden)

    Cairong Yan

    Full Text Available In the big data area, a significant challenge for string similarity join is finding all similar pairs efficiently. In this paper, we propose a parallel processing framework for efficient string similarity join. First, the input is split into disjoint small subsets according to the joint frequency distribution and the interval distribution of the strings. Then the filter-verification strategy is adopted in the computation of string similarity for each subset, so that the number of candidate pairs is reduced before an effective pruning strategy is used to improve performance. Finally, the string join operation is executed in parallel. The Para-Join algorithm, based on multi-threading, is proposed to implement the framework on a multi-core system, while the Pada-Join algorithm, based on the Spark platform, is proposed to implement the framework on a cluster. We prove that Para-Join and Pada-Join not only avoid duplicate computation but also ensure the completeness of the result. Experimental results show that Para-Join achieves high efficiency and significantly outperforms state-of-the-art approaches, while Pada-Join can work on large datasets.
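
A minimal filter-verification join can be sketched as follows (a sequential Python illustration, not the paper's Para-Join/Pada-Join code): a cheap length filter prunes pairs whose lengths differ by more than the edit-distance threshold before the expensive verification step runs.

```python
def edit_distance(a, b):
    # classic dynamic-programming Levenshtein distance (the "verify" step)
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                 # deletion
                           cur[j - 1] + 1,              # insertion
                           prev[j - 1] + (ca != cb)))   # substitution
        prev = cur
    return prev[-1]

def similarity_join(strings, tau):
    pairs = []
    for i in range(len(strings)):
        for j in range(i + 1, len(strings)):
            a, b = strings[i], strings[j]
            if abs(len(a) - len(b)) > tau:   # length filter: prune early
                continue
            if edit_distance(a, b) <= tau:   # verification on survivors only
                pairs.append((a, b))
    return pairs

print(similarity_join(["spark", "spar", "sparkle", "hadoop", "hadoo"], tau=1))
```

Real systems add stronger filters (prefix, count, position) and, as in the paper, partition the input so the pair loop runs in parallel; the filter-then-verify shape stays the same.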

  13. Studying the implementation of the Water Framework Directive in Europe: a meta-analysis of 89 journal articles

    Directory of Open Access Journals (Sweden)

    Blandine Boeuf

    2016-06-01

    Full Text Available The Water Framework Directive (WFD) is arguably the most ambitious piece of European Union (EU) legislation in the field of water. The directive defines a general framework for integrated river basin management in Europe with a view to achieving "good water status" by 2015. Institutional novelties include, among others, water management at hydrological scales, the involvement of nonstate actors in water planning, and various economic principles, as well as a common strategy to support EU member states during the implementation of the directive. More than 15 years after the adoption of the WFD, and with the passing of an important milestone, 2015, we believe it is time for an interim assessment. This article provides a systematic review of existing scholarship on WFD implementation. We identify well-documented areas of research, describe largely uncharted territories, and suggest avenues for future studies. Methodologically, we relied on a meta-analysis. Based on a codebook of more than 35 items, we analyzed 89 journal articles reporting on the implementation of the directive in EU member states. Our review is organized around three major themes. The first is "who, when, and where": we explore publication patterns, thereby looking into authors, timelines, and target journals. The second is "what": we analyze the object of study in our source articles with a particular focus on case study countries, policy levels, the temporal stage of WFD implementation, and, if the directive was not studied in its entirety, the aspect of the WFD that received scholarly attention. The third is "how," i.e., theoretical and methodological choices made when studying the WFD.

  14. Implementation of an evolutionary algorithm in planning investment in a power distribution system

    Directory of Open Access Journals (Sweden)

    Carlos Andrés García Montoya

    2011-06-01

    Full Text Available The definition of an investment plan to implement in a distribution power system is a task that utilities constantly face. This work presents a methodology for determining the short-term investment plan for a distribution power system, using as the criterion for evaluating investment projects the associated costs and the benefit customers obtain from their implementation. Given the number of projects carried out annually on the system, the definition of an investment plan requires the use of computational tools to select, from a set of possibilities, the one that best suits the present needs of the system and yields the best results. For this reason, this work implements a multi-objective evolutionary algorithm, SPEA (Strength Pareto Evolutionary Algorithm), which, based on the principles of Pareto optimality, delivers to the planning expert the best solutions found in the optimization process. The performance of the algorithm is tested using a set of projects to determine the best among the possible plans. We also analyze the effect of the operators on the performance of the evolutionary algorithm and on the results.
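
The Pareto-optimality principle underlying SPEA can be illustrated with a small sketch (the investment-plan figures below are made up; SPEA itself adds archiving, strength-based fitness and variation operators on top of this dominance test):

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (all objectives minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# candidate investment plans as (investment cost, expected outage cost)
plans = [(100, 50), (120, 30), (90, 70), (130, 45), (110, 55)]

# the non-dominated set is what SPEA's archive converges towards
front = [p for p in plans
         if not any(dominates(q, p) for q in plans if q != p)]
print(front)   # plans that no other plan beats on both objectives
```

The planner is then shown the whole front rather than a single "optimal" plan, which is exactly the decision support the abstract describes.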

  15. Distributed power-line outage detection based on wide area measurement system.

    Science.gov (United States)

    Zhao, Liang; Song, Wen-Zhan

    2014-07-21

    In modern power grids, the fast and reliable detection of power-line outages is an important functionality, which prevents cascading failures and facilitates an accurate state estimation to monitor the real-time conditions of the grids. However, most of the existing approaches for outage detection suffer from two drawbacks, namely: (i) high computational complexity; and (ii) relying on a centralized means of implementation. The high computational complexity limits the practical usage of outage detection to the case of single-line or double-line outages. Meanwhile, the centralized means of implementation raises security and privacy issues. Considering these drawbacks, the present paper proposes a distributed framework, which carries out in-network information processing and only shares estimates on boundaries with the neighboring control areas. This novel framework relies on a convex-relaxed formulation of the line outage detection problem and leverages the alternating direction method of multipliers (ADMM) for its distributed solution. The proposed framework has a low computational complexity, requiring only linear and simple matrix-vector operations. We also extend this framework to incorporate the sparse property of the measurement matrix and employ the LSQR algorithm to enable a warm start, which further accelerates the algorithm. Analysis and simulation tests validate the correctness and effectiveness of the proposed approaches.
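
The distributed flavour of ADMM can be illustrated with a toy consensus problem (a generic sketch, not the paper's convex-relaxed outage-detection formulation): each control area keeps a private measurement, updates a local estimate, and shares only that estimate, which mimics exchanging boundary information instead of raw data.

```python
# Consensus ADMM for minimizing sum_i (x - a_i)^2 over a common x:
# the optimum is the average of the private values a_i.
rho = 1.0
a = [3.0, 7.0, 11.0]          # private local measurements, one per area
x = [0.0] * len(a)            # local estimates
u = [0.0] * len(a)            # scaled dual variables
z = 0.0                       # shared consensus variable

for _ in range(100):
    # local x-updates: argmin (x - a_i)^2 + (rho/2)(x - z + u_i)^2
    x = [(2 * ai + rho * (z - ui)) / (2 + rho) for ai, ui in zip(a, u)]
    # consensus z-update: average of the shared (x_i + u_i)
    z = sum(xi + ui for xi, ui in zip(x, u)) / len(x)
    # dual updates penalize disagreement with the consensus
    u = [ui + xi - z for ui, xi in zip(u, x)]

print(round(z, 4))   # converges to the average of a, i.e. 7.0
```

Each area's x-update uses only its own data; only the scalar estimates cross area boundaries, which is the property the paper exploits for privacy and scalability.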

  16. Deterministic Design Optimization of Structures in OpenMDAO Framework

    Science.gov (United States)

    Coroneos, Rula M.; Pai, Shantaram S.

    2012-01-01

    Nonlinear programming algorithms play an important role in structural design optimization. Several such algorithms have been implemented in the OpenMDAO framework developed at NASA Glenn Research Center (GRC). OpenMDAO is an open source engineering analysis framework, written in Python, for analyzing and solving Multi-Disciplinary Analysis and Optimization (MDAO) problems. It provides a number of solvers and optimizers, referred to as components and drivers, which users can leverage to build new tools and processes quickly and efficiently. Users may download, use, modify, and distribute the OpenMDAO software at no cost. This paper summarizes the process involved in analyzing and optimizing structural components by utilizing the framework's structural solvers and several gradient-based optimizers along with a multi-objective genetic algorithm. For comparison purposes, the same structural components were analyzed and optimized using CometBoards, a code developed at NASA GRC. The reliability and efficiency of the OpenMDAO framework were compared against CometBoards and are reported here.

  17. Implementation of internet-delivered cognitive behavior therapy within community mental health clinics: a process evaluation using the consolidated framework for implementation research.

    Science.gov (United States)

    Hadjistavropoulos, H D; Nugent, M M; Dirkse, D; Pugh, N

    2017-09-12

    Depression and anxiety are prevalent and undertreated conditions that create an enormous burden for the patient and the health system. Internet-delivered cognitive behavior therapy (ICBT) improves patient access to treatment by providing therapeutic information via the Internet, presented in sequential lessons, accompanied by brief weekly therapist support. While there is growing research supporting ICBT, use of ICBT within community mental health clinics is limited. In a recent trial, an external unit specializing in ICBT facilitated use of ICBT in community mental health clinics in one Canadian province (ISRCTN42729166; registered November 5, 2013). Patient outcomes were very promising and uptake was encouraging. This paper reports on a parallel process evaluation designed to understand the facilitators and barriers impacting the uptake and implementation of ICBT. Therapists (n = 22) and managers (n = 11) from seven community mental health clinics dispersed across one Canadian province who were involved in implementing ICBT over ~2 years completed an online survey (including open- and closed-ended questions) about their ICBT experiences. The questions were based on the Consolidated Framework for Implementation Research (CFIR), which outlines diverse constructs that have the potential to impact program implementation. Analyses suggested ICBT implementation was perceived to be most prominently facilitated by intervention characteristics (namely the relative advantages of ICBT compared to face-to-face therapy, the quality of the ICBT program that was delivered, and evidence supporting ICBT) and implementation processes (namely the use of an external facilitation unit that aided with engaging patients, therapists, and managers and with ICBT implementation). The inner setting was identified as the most significant barrier to implementation as a result of limited resources for ICBT combined with greater priority given to face-to-face care. The results contribute to understanding

  18. ToPS: a framework to manipulate probabilistic models of sequence data.

    Directory of Open Access Journals (Sweden)

    André Yoshiaki Kashiwabara

    Full Text Available Discrete Markovian models can be used to characterize patterns in sequences of values and have many applications in biological sequence analysis, including gene prediction, CpG island detection, alignment, and protein profiling. We present ToPS, a computational framework that can be used to implement different applications in bioinformatics analysis by combining eight kinds of models: (i) independent and identically distributed process; (ii) variable-length Markov chain; (iii) inhomogeneous Markov chain; (iv) hidden Markov model; (v) profile hidden Markov model; (vi) pair hidden Markov model; (vii) generalized hidden Markov model; and (viii) similarity based sequence weighting. The framework includes functionality for training, simulation and decoding of the models. Additionally, it provides two methods to help parameter setting: the Akaike and Bayesian information criteria (AIC and BIC). The models can be used stand-alone, combined in Bayesian classifiers, or included in more complex, multi-model, probabilistic architectures using GHMMs. In particular the framework provides a novel, flexible, implementation of decoding in GHMMs that detects when the architecture can be traversed efficiently.
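
The decoding functionality such a framework provides can be illustrated with a minimal Viterbi decoder for a two-state HMM (toy parameters chosen here to mimic CpG-island detection; this is not ToPS's actual API or model):

```python
import math

states = ["island", "background"]
start = {"island": 0.5, "background": 0.5}
trans = {"island": {"island": 0.9, "background": 0.1},
         "background": {"island": 0.1, "background": 0.9}}
emit = {"island": {"A": 0.1, "C": 0.4, "G": 0.4, "T": 0.1},
        "background": {"A": 0.3, "C": 0.2, "G": 0.2, "T": 0.3}}

def viterbi(seq):
    """Most probable state path for seq, computed in log space."""
    V = [{s: math.log(start[s]) + math.log(emit[s][seq[0]]) for s in states}]
    back = []
    for obs in seq[1:]:
        scores, ptr = {}, {}
        for s in states:
            prev, score = max(
                ((p, V[-1][p] + math.log(trans[p][s])) for p in states),
                key=lambda t: t[1])
            scores[s] = score + math.log(emit[s][obs])
            ptr[s] = prev
        V.append(scores)
        back.append(ptr)
    last = max(V[-1], key=V[-1].get)
    path = [last]
    for ptr in reversed(back):   # follow back-pointers to recover the path
        path.append(ptr[path[-1]])
    return list(reversed(path))

print(viterbi("GCGCGATATAT"))   # GC-rich prefix decodes as "island"
```

A GHMM generalizes this by allowing explicit state-duration distributions, which is where the efficient-traversal detection mentioned in the abstract matters.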

  19. A policy-based multi-objective optimisation framework for residential distributed energy system design★

    Directory of Open Access Journals (Sweden)

    Wouters Carmen

    2017-01-01

    Full Text Available Distributed energy systems (DES) are increasingly being introduced as solutions to alleviate conventional energy system challenges related to energy security, climate change and increasing demands. From a technological and economic perspective, distributed energy resources are already becoming viable. The question still remains as to how these technologies and practices can be “best” selected, sized and integrated within consumer areas. To aid decision-makers and enable widespread DES adoption, a strategic superstructure design framework is therefore still required that ensures balancing of multiple stakeholder interests and fits in with the liberalised energy system objectives of competition, security of supply and sustainability. Such a design framework is presented in this work. An optimisation-based approach for the design of neighbourhood-based DES is developed that enables meeting their yearly electricity, heating and cooling needs by appropriately selecting, sizing and locating technologies and energy interactions. A pool of poly-generation and storage technologies is considered, combined with local energy sharing between participating prosumers through thermal pipeline design and microgrid operation, and a bi-directional connection with the central distribution grid. A superstructure mixed-integer linear programming (MILP) approach is proposed to trade off three minimisation objectives in the design process: total annualised cost, annual CO2 emissions and electrical system unavailability, aligned with the three central energy system objectives. The developed model is applied to a small South Australian neighbourhood. The approach enables identifying “knee-point” neighbourhood energy system designs through Pareto trade-offs between the objectives and serves to inform decision-makers about the impact of policy objectives on DES development strategies.
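
The cost-versus-emissions trade-off the MILP resolves can be sketched by brute force on a tiny hypothetical portfolio (all technology names, capacities and costs below are illustrative, and the 100 £/tCO2-style scaling factor is an arbitrary choice to put the objectives on comparable scales):

```python
from itertools import combinations

# (name, capacity_kW, annualised_cost, annual_tCO2) -- made-up candidates
techs = [("CHP", 60, 9000, 40), ("PV", 30, 4000, 0),
         ("boiler", 80, 3000, 70), ("battery", 20, 5000, 0)]
demand = 90  # peak kW the portfolio must cover

# enumerate every portfolio that covers demand (a MILP does this implicitly)
feasible = []
for r in range(1, len(techs) + 1):
    for combo in combinations(techs, r):
        if sum(t[1] for t in combo) >= demand:
            cost = sum(t[2] for t in combo)
            co2 = sum(t[3] for t in combo)
            feasible.append((cost, co2, [t[0] for t in combo]))

# sweep the weight between objectives to trace the trade-off curve
for w in (0.0, 0.5, 1.0):   # w = weight on cost; (1 - w) weights emissions
    best = min(feasible, key=lambda f: w * f[0] + (1 - w) * 100 * f[1])
    print(w, best[2])
```

Sweeping the weight exposes how the "best" design shifts from the low-carbon mix to the cheapest mix, which is the information a Pareto front gives a policymaker.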

  20. A framework for assessing cost management system changes: the case of activity-based costing implementation at food industry

    Directory of Open Access Journals (Sweden)

    Tayebeh Faraji

    2015-04-01

    Full Text Available An opportunity to investigate the technical and organizational effects of management accounting system changes has appeared with companies' adoption of activity-based costing (ABC). This paper presents an empirical investigation of the effects of an ABC system for a case study from the food industry in Iran. From this case, the paper develops a framework for assessing ABC implementation and hypotheses about factors that influence implementation. The study identifies five cost centers and, for each cost center, determines different cost drivers. The results of our survey show that implementation of an ABC system not only supports precise allocation of overhead costs but also helps a company's internal management with better planning and control of production and with making better decisions for the company's profits.
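
The core ABC calculation is simple enough to sketch (cost pools, drivers and figures below are invented for illustration, not taken from the Iranian case study): each cost center's overhead is assigned to products in proportion to their consumption of that center's cost driver.

```python
overhead = {"machining": 50000.0, "quality": 24000.0}     # cost pools
driver_totals = {"machining": 1000.0, "quality": 400.0}   # machine-hours, inspections
usage = {   # cost-driver consumption per product
    "canned_soup":  {"machining": 600, "quality": 100},
    "frozen_pizza": {"machining": 400, "quality": 300},
}

def abc_allocate(product):
    # overhead share = pool cost * (product's driver usage / total driver usage)
    return sum(overhead[pool] * usage[product][pool] / driver_totals[pool]
               for pool in overhead)

for product in usage:
    print(product, abc_allocate(product))
```

A single volume-based rate would spread the 74,000 total in proportion to machine-hours alone; ABC instead charges the inspection-heavy product its fair share of the quality pool.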

  1. Bridging the gap between Hydrologic and Atmospheric communities through a standard based framework

    Science.gov (United States)

    Boldrini, E.; Salas, F.; Maidment, D. R.; Mazzetti, P.; Santoro, M.; Nativi, S.; Domenico, B.

    2012-04-01

    Data interoperability in the study of Earth sciences is essential to performing interdisciplinary multi-scale multi-dimensional analyses (e.g. hydrologic impacts of global warming, regional urbanization, global population growth etc.). This research aims to bridge the existing gap between the hydrologic and atmospheric communities at both the semantic and technological levels. Within the context of hydrology, scientists are usually concerned with data organized as time series: a time series can be seen as a variable measured at a particular point in space over a period of time (e.g. the stream flow values periodically measured by a buoy sensor in a river); atmospheric scientists instead usually organize their data as coverages: a coverage can be seen as a multidimensional data array (e.g. satellite images acquired through time). These differences make the setup of a common framework for data discovery and access non-trivial. A set of web service specifications and implementations is already in place in both scientific communities to allow data discovery and access in the different domains. The CUAHSI Hydrologic Information System (HIS) service stack lists different service types and implementations: a metacatalog (implemented as a CSW) used to discover metadata services by distributing the query to a set of catalogs; time series catalogs (implemented as CSW) used to discover datasets published by the feature services; feature services (implemented as WFS) containing features with data access links; and sensor observation services (implemented as SOS) enabling access to the stream of acquisitions. Within the Unidata framework there lies a similar service stack for atmospheric data: the broker service (implemented as a CSW) distributes a user query to a set of heterogeneous services (i.e. catalog services, but also inventory and access services); the catalog service (implemented as a CSW) is able to harvest the available metadata offered by THREDDS

  2. Transactive control: a framework for operating power systems characterized by high penetration of distributed energy resources

    DEFF Research Database (Denmark)

    Hu, Junjie; Yang, Guangya; Kok, Koen

    2016-01-01

    The increasing number of distributed energy resources connected to power systems raises operational challenges for the network operator, such as introducing grid congestion and voltage deviations in the distribution network level, as well as increasing balancing needs at the whole system level......, followed by a literature review and demonstration projects that apply to transactive control. Cases are then presented to illustrate the transactive control framework. At the end, discussions and research directions are presented, for applying transactive control to operating power systems, characterized...

  3. 76 FR 22944 - Pipeline Safety: Notice of Public Webinars on Implementation of Distribution Integrity Management...

    Science.gov (United States)

    2011-04-25

    ... oversight program and operating conditions as well as the evolutionary process that distribution system... 20590. Hand Delivery: Docket Management System, Room W12-140, on the ground floor of the West Building... PHMSA-2011-0084] Pipeline Safety: Notice of Public Webinars on Implementation of Distribution Integrity...

  4. Luiza: Analysis Framework for GLORIA

    Directory of Open Access Journals (Sweden)

    Aleksander Filip Żarnecki

    2013-01-01

    Full Text Available The Luiza analysis framework for GLORIA is based on the Marlin package, which was originally developed for data analysis in a new High Energy Physics (HEP) project, the International Linear Collider (ILC). HEP experiments have to deal with enormous amounts of data, and distributed data analysis is therefore essential. The Marlin framework concept seems to be well suited to the needs of GLORIA. The idea (and large parts of the code) taken from Marlin is that every computing task is implemented as a processor (module) that analyzes the data stored in an internal data structure, and the additional output is also added to that collection. The advantage of this modular approach is that it keeps things as simple as possible. Each step of the full analysis chain, e.g. from raw images to light curves, can be processed step-by-step, and the output of each step is still self-consistent and can be fed into the next step without any manipulation.
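
The processor-chain idea can be sketched in a few lines (a toy illustration of the Marlin/Luiza concept, not Luiza's actual API): each processor reads named collections from a shared event structure and appends its own output, so steps can be chained, reordered, or run one at a time.

```python
class Processor:
    """One step of the analysis chain; subclasses implement process()."""
    def process(self, event):          # event: dict of named collections
        raise NotImplementedError

class PedestalSubtraction(Processor):
    def process(self, event):
        # adds a new "calibrated" collection; "raw" stays untouched
        event["calibrated"] = [x - 100 for x in event["raw"]]

class LightCurve(Processor):
    def process(self, event):
        # consumes the previous step's output, appends its own
        event["flux"] = sum(event["calibrated"]) / len(event["calibrated"])

chain = [PedestalSubtraction(), LightCurve()]
event = {"raw": [103, 97, 106]}
for proc in chain:
    proc.process(event)

print(event["flux"])
```

Because every intermediate collection survives in the event, the output of each step remains self-consistent and inspectable, which is the property the abstract highlights.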

  5. MODELING AND IMPLEMENTATION OF A DISTRIBUTED SHOP FLOOR MANAGEMENT AND CONTROL SYSTEM

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Adopting a distributed control architecture is an important development direction for shop floor management and control systems, and is also a requirement for making them agile, intelligent and concurrent. Some key problems in achieving a distributed control architecture are researched. An activity model of the shop floor is presented as the requirement definition of the prototype system. The multi-agent based software architecture is constructed, and it is shown how the core part of a shop floor management and control system, production planning and scheduling, is achieved. The cooperation of the different agents is illustrated. Finally, the implementation of the prototype system is described.

  6. An Open Architecture Framework for Electronic Warfare Based Approach to HLA Federate Development

    Directory of Open Access Journals (Sweden)

    HyunSeo Kang

    2018-01-01

    Full Text Available A variety of electronic warfare models are developed in the Electronic Warfare Research Center. An Open Architecture Framework for Electronic Warfare (OAFEw has been developed for reusability of various object models participating in the electronic warfare simulation and for extensibility of the electronic warfare simulator. OAFEw is a kind of component-based software (SW) lifecycle management support framework. This OAFEw is defined by six components and ten rules. The purpose of this study is to construct a Distributed Simulation Interface Model according to the rules of OAFEw, and to create a Use Case Model of the OAFEw Reference Conceptual Model version 1.0. This is embodied in the OAFEw-FOM (Federate Object Model) for High-Level Architecture (HLA)-based distributed simulation. Therefore, we design and implement an EW real-time distributed simulation that can work with models in C++ and the MATLAB API (Application Programming Interface). In addition, the OAFEw-FOM, an electronic component model, and a scenario from the electronic warfare domain were designed through simple scenarios for verification, and real-time distributed simulation between C++ and MATLAB was performed through the OAFEw Distributed Simulation Interface.

  7. [Sustainable Implementation of Evidence-Based Programmes in Health Promotion: A Theoretical Framework and Concept of Interactive Knowledge to Action].

    Science.gov (United States)

    Rütten, A; Wolff, A; Streber, A

    2016-03-01

    This article discusses two current issues in the field of public health research: (i) the transfer of scientific knowledge into practice and (ii) the sustainable implementation of good practice projects. It also supports the integration of scientific and practice-based evidence production. Furthermore, it supports the utilisation of interactive models that transcend deductive approaches to the process of knowledge transfer. Existing theoretical approaches, pilot studies and thoughtful conceptual considerations are incorporated into a framework showing the interplay of science, politics and prevention practice, which fosters a more sustainable implementation of health promotion programmes. The framework depicts four key processes of interaction between science and prevention practice: interactive knowledge to action, capacity building, programme adaptation and adaptation of the implementation context. Ensuring the sustainability of health promotion programmes requires a concentrated process of integrating scientific and practice-based evidence production in the context of implementation. Central to the integration process is the approach of interactive knowledge to action, which especially benefits from capacity-building processes that facilitate participation and systematic interaction between relevant stakeholders. Intense cooperation also induces a dynamic interaction between multiple actors and components such as health promotion programmes, target groups, relevant organisations and social, cultural and political contexts. The reciprocal adaptation of programmes and key components of the implementation context can foster the effectiveness and sustainability of programmes. Sustainable implementation of evidence-based health promotion programmes requires alternatives to recent deductive models of knowledge transfer. Interactive approaches prove to be promising alternatives. Simultaneously, they change the responsibilities of science, policy and public health practice. Existing boundaries

  8. Intelligent and robust optimization frameworks for smart grids

    Science.gov (United States)

    Dhansri, Naren Reddy

    A smart grid implies a cyberspace real-time distributed power control system to optimally deliver electricity based on varying consumer characteristics. Although smart grids solve many contemporary problems, they give rise to new control and optimization problems with the growing role of renewable energy sources such as wind or solar energy. Given the highly dynamic nature of distributed power generation and the varying consumer demand and cost requirements, the total power output of the grid should be controlled such that the load demand is met while giving a higher priority to renewable energy sources. Hence, the power generated from renewable energy sources should be optimized while minimizing the generation from non-renewable energy sources. This research develops a demand-based automatic generation control and optimization framework for real-time smart grid operations by integrating conventional and renewable energy sources under varying consumer demand and cost requirements. Focusing on renewable energy sources, the intelligent and robust control frameworks optimize power generation by tracking consumer demand in a closed-loop control framework, yielding superior economic and ecological benefits, circumventing nonlinear model complexities and handling uncertainties for superior real-time operation. The proposed intelligent system framework optimizes smart grid power generation for maximum economic and ecological benefit under an uncertain renewable wind energy source. The numerical results demonstrate that the proposed framework is a viable approach to integrating various energy sources for real-time smart grid implementations. The robust optimization framework results demonstrate the effectiveness of the robust controllers under bounded power plant model uncertainties and exogenous wind input excitation while maximizing economic and ecological performance objectives. Therefore, the proposed framework offers a new worst-case deterministic

  9. A framework for implementation of organ effect models in TOPAS with benchmarks extended to proton therapy

    International Nuclear Information System (INIS)

    Ramos-Méndez, J; Faddegon, B; Perl, J; Schümann, J; Paganetti, H; Shin, J

    2015-01-01

    The aim of this work was to develop a framework for modeling organ effects within TOPAS (TOol for PArticle Simulation), a wrapper of the Geant4 Monte Carlo toolkit that facilitates particle therapy simulation. The DICOM interface for TOPAS was extended to permit contour input, used to assign voxels to organs. The following dose-response models were implemented: the Lyman–Kutcher–Burman model, the critical element model, the population-based critical volume model, the parallel-serial model, a sigmoid-based model of Niemierko for normal tissue complication probability and tumor control probability (TCP), and a Poisson-based model for TCP. The framework allows easy manipulation of the parameters of these models and the implementation of other models. As part of the verification, results for the parallel-serial and Poisson models for x-ray irradiation of a water phantom were compared to data from the AAPM Task Group 166. When using the task group dose-volume histograms (DVHs), results were found to be sensitive to the number of points in the DVH, with differences up to 2.4%, some of which are attributable to differences between the implemented models. New results are given with the point spacing specified. When using Monte Carlo calculations with TOPAS, despite the relatively good match to the published DVHs, differences up to 9% were found for the parallel-serial model (for a maximum DVH difference of 2%) and up to 0.5% for the Poisson model (for a maximum DVH difference of 0.5%). However, differences of 74.5% (in Rectangle1), 34.8% (in PTV) and 52.1% (in Triangle) were found for the critical element, critical volume and sigmoid-based models, respectively. We propose a new benchmark for verification of organ effect models in proton therapy. The benchmark consists of customized structures in the spread-out Bragg peak plateau, normal tissue, tumor, penumbra and the distal region. The DVHs, DVH point spacing, and results of the organ effect models are
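
One of the listed models, the Lyman–Kutcher–Burman NTCP model, can be sketched in a few lines (parameter values here are illustrative, not TOPAS defaults): the DVH is first reduced to a generalized equivalent uniform dose (gEUD), which is then mapped to a complication probability through a probit function.

```python
import math

def geud(dvh, n):
    """dvh: list of (dose_Gy, fractional_volume) bins; n: volume parameter."""
    return sum(v * d ** (1.0 / n) for d, v in dvh) ** n

def lkb_ntcp(dvh, n, m, td50):
    # probit mapping: NTCP = Phi((gEUD - TD50) / (m * TD50))
    t = (geud(dvh, n) - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

# Sanity check: a uniform dose equal to TD50 over the whole organ
# must give NTCP = 0.5 by construction.
print(lkb_ntcp([(50.0, 1.0)], n=0.7, m=0.15, td50=50.0))
```

The DVH point-spacing sensitivity reported in the abstract enters through the `dvh` binning: coarser bins change the gEUD sum and hence the computed NTCP.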

  10. Challenges to the Implementation of a New Framework for Safeguarding Financial Stability

    Directory of Open Access Journals (Sweden)

    Vlahović Ana

    2014-09-01

    Full Text Available There is probably no single economic concept that has attracted more attention and intrigued scientific and professional circles more than financial stability. For over a decade now there have been efforts to establish the starting point in explaining this condition or characteristic of the financial system, since some find that the key to defining financial stability lies in stability while others argue in favour of the opposite, instability. Unfortunately, no agreement has been reached on a universal definition that would be widely accepted at the international level. Consequently, this has given rise to open discussions on systemic risk, on creating a framework for preserving financial stability, and on the role of central banks in this process. This article analyses the results achieved in the development of a theoretical concept of financial stability and its practical implementation. A consensus has been reached on the necessity of removing rigid barriers between macro- and microprudential policies and on the necessity of their coordinated action. The primary objectives of monetary and fiscal stability have been shifted towards preserving financial stability. The isolated macroprudential principle has rightfully earned the epithet of an archaic approach. Coordinated micro- and macroprudential policies have definitely prevailed and become reality in many countries, including Montenegro. The institutional frameworks created for safeguarding financial stability at all levels - national, Pan-European and global - represent a challenge for further comparative studies.

  11. A guide to using the Theoretical Domains Framework of behaviour change to investigate implementation problems.

    Science.gov (United States)

    Atkins, Lou; Francis, Jill; Islam, Rafat; O'Connor, Denise; Patey, Andrea; Ivers, Noah; Foy, Robbie; Duncan, Eilidh M; Colquhoun, Heather; Grimshaw, Jeremy M; Lawton, Rebecca; Michie, Susan

    2017-06-21

    Implementing new practices requires changes in the behaviour of relevant actors, and this is facilitated by understanding of the determinants of current and desired behaviours. The Theoretical Domains Framework (TDF) was developed by a collaboration of behavioural scientists and implementation researchers who identified theories relevant to implementation and grouped constructs from these theories into domains. The collaboration aimed to provide a comprehensive, theory-informed approach to identify determinants of behaviour. The first version was published in 2005, and a subsequent version following a validation exercise was published in 2012. This guide offers practical guidance for those who wish to apply the TDF to assess implementation problems and support intervention design. It presents a brief rationale for using a theoretical approach to investigate and address implementation problems, summarises the TDF and its development, and describes how to apply the TDF to achieve implementation objectives. Examples from the implementation research literature are presented to illustrate relevant methods and practical considerations. Researchers from Canada, the UK and Australia attended a 3-day meeting in December 2012 to build an international collaboration among researchers and decision-makers interested in advancing the use of the TDF. The participants were experienced in using the TDF to assess implementation problems, design interventions, and/or understand change processes. This guide is an output of the meeting and also draws on the authors' collective experience. Examples from the implementation research literature judged by the authors to be representative of specific applications of the TDF are included in this guide. We explain and illustrate methods, with a focus on qualitative approaches, for selecting and specifying target behaviours key to implementation, selecting the study design, deciding the sampling strategy, developing study materials, collecting and

  12. An analysis of Cobit 5 as a framework for the implementation of it governance with reference to King III

    Directory of Open Access Journals (Sweden)

    Maseko, L.

    2016-02-01

    Owing to the complexity and general lack of understanding of information technology (“IT”), the management of IT is often treated as a separately managed value-providing asset. This has resulted in IT rarely receiving the necessary attention of the board, thus creating a disconnect between the board and IT. The King Code of Governance for South Africa 2009 (hereafter referred to as “King III”) provides principles and recommended practices for effective IT governance in order to create greater awareness at board level. King III, however, provides no detailed guidance with regard to the practical implementation of these principles and practices. It is worth noting that numerous international guidelines are recommended within King III that can be adopted as frameworks to assist in the effective implementation of IT governance. COBIT 5 provides, as part of its governance process practices, related guidance activities linking it to the seven IT governance principles of King III, thus making it a practical framework for the implementation of King III recommendations. This study sought to establish the extent to which the governance processes, practices and activities of COBIT 5 map to the recommended practices of IT governance highlighted in King III, in order to assess whether COBIT 5 can serve as the de facto framework for IT governance in terms of King III. The study found that though King III principles and practices may be interpreted as vague with regard to how to implement IT governance principles, COBIT 5 succeeds in bridging the gap between control requirements, technical issues, information systems and business risk, which consequently results in better facilitation of IT governance. The study also revealed that COBIT 5 contains additional activities to assist the board in more transparent reporting of IT performance and conformance management to stakeholders, as well as activities which enable the connection of resource management with human

  13. Distributed mobility management - framework & analysis

    NARCIS (Netherlands)

    Liebsch, M.; Seite, P.; Karagiannis, Georgios

    2013-01-01

    Mobile operators consider the distribution of mobility anchors to enable offloading some traffic from their core network. The Distributed Mobility Management (DMM) Working Group is investigating the impact of decentralized mobility management to existing protocol solutions, while taking into account

  14. Framework for sequential approximate optimization

    NARCIS (Netherlands)

    Jacobs, J.H.; Etman, L.F.P.; Keulen, van F.; Rooda, J.E.

    2004-01-01

    An object-oriented framework for Sequential Approximate Optimization (SAO) is proposed. The framework aims to provide an open environment for the specification and implementation of SAO strategies. The framework is based on the Python programming language and contains a toolbox of Python

  15. Evaluation of the implementation of a whole-workplace walking programme using the RE-AIM framework.

    Science.gov (United States)

    Adams, Emma J; Chalkley, Anna E; Esliger, Dale W; Sherar, Lauren B

    2017-05-18

    Promoting walking for the journey to/from work and during the working day is one potential approach to increase physical activity in adults. Walking Works was a practice-led, whole-workplace walking programme delivered by employees (walking champions). This study aimed to evaluate the implementation of Walking Works using the RE-AIM framework and provide recommendations for future delivery of whole-workplace walking programmes. Two cross sectional surveys were conducted; 1544 (28%) employees completed the baseline survey and 918 employees (21%) completed the follow-up survey. Effectiveness was assessed using baseline and follow-up data; reach, implementation and maintenance were assessed using follow-up data only. For categorical data, Chi square tests were conducted to assess differences between surveys or groups. Continuous data were analysed to test for significant differences using a Mann-Whitney U test. Telephone interviews were conducted with the lead organisation co-ordinator, eight walking champions and three business representatives at follow-up. Interviews were transcribed verbatim and analysed to identify key themes related to adoption, implementation and maintenance. Adoption: Five workplaces participated in Walking Works. Reach: 480 (52.3%) employees were aware of activities and 221 (24.1%) participated. A variety of walking activities were delivered. Some programme components were not delivered as planned which was partly due to barriers in using walking champions to deliver activities. These included the walking champions' capacity, skills, support needs, ability to engage senior management, and the number and type of activities they could deliver. Other barriers included lack of management support, difficulties communicating information about activities and challenges embedding the programme into normal business activities. Effectiveness: No significant changes in walking to/from work or walking during the working day were observed. Maintenance

  16. An Ambient Intelligence Framework for the Provision of Geographically Distributed Multimedia Content to Mobility Impaired Users

    Science.gov (United States)

    Kehagias, Dionysios D.; Giakoumis, Dimitris; Tzovaras, Dimitrios; Bekiaris, Evangelos; Wiethoff, Marion

    This chapter presents an ambient intelligence framework whose goal is to facilitate the information needs of mobility impaired users on the move. This framework couples users with geographically distributed services and the corresponding multimedia content, enabling access to context-sensitive information based on user geographic location and the use case under consideration. It provides a multi-modal facility that is realized through a set of mobile devices and user interfaces that address the needs of ten different types of user impairments. The overall ambient intelligence framework enables users who are equipped with mobile devices to access multimedia content in order to undertake activities relevant to one or more of the following domains: transportation, tourism and leisure, personal support services, work, business, education, social relations and community building. User experience is being explored against those activities through a specific usage scenario.

  17. Examination of the utility of the promoting action on research implementation in health services framework for implementation of evidence based practice in residential aged care settings.

    Science.gov (United States)

    Perry, Lin; Bellchambers, Helen; Howie, Andrew; Moxey, Annette; Parkinson, Lynne; Capra, Sandra; Byles, Julie

    2011-10-01

    This study examined the relevance and fit of the PARiHS framework (Promoting Action on Research Implementation in Health Services) as an explanatory model for practice change in residential aged care. Translation of research knowledge into routine practice is a complex matter in health and social care environments. Examination of the environment may identify factors likely to support and hinder practice change, inform strategy development, predict and explain successful uptake of new ways of working. Frameworks to enable this have been described but none has been tested in residential aged care. This paper reports preliminary qualitative analyses from the Encouraging Best Practice in Residential Aged Care Nutrition and Hydration project conducted in New South Wales in 2007-2009. We examined congruence with the PARiHS framework of factors staff described as influential for practice change during 29 digitally recorded and transcribed staff interviews and meetings at three facilities. Unique features of the setting were flagged, with facilities simultaneously filling the roles of residents' home, staff's workplace and businesses. Participants discussed many of the same characteristics identified by the PARiHS framework, but in addition temporal dimensions of practice change were flagged. Overall factors described by staff as important for practice change in aged care settings showed good fit with those of the PARiHS framework. This framework can be recommended for use in this setting. Widespread adoption will enable cross-project and international synthesis of findings, a major step towards building a cumulative science of knowledge translation and practice change. © 2011 The Authors. Journal of Advanced Nursing © 2011 Blackwell Publishing Ltd.

  18. A study on the establishment of national regulatory framework for effective implementation of exemption or clearance concept

    International Nuclear Information System (INIS)

    Cheong, J.H.; Park, S.H.; Suk, T.W.

    1998-01-01

    The concepts of exemption and clearance offer clear advantages for the effective use of limited resources and land and for the optimization of regulatory work. The exact scope and extent of implementation of these concepts, however, can vary widely depending upon each country's specific situation. In order to support political decision-making on practical implementation, a series of possible alternatives, a general methodology for decision-making, and factors to be considered were proposed. Five primary categories and nineteen subsequent secondary categories were suggested and discussed, and a four-step approach was introduced to provide general guidelines for establishing an appropriate national regulatory framework. Though the specific procedure for each country to reach practical implementation of the exemption and clearance concepts was not described, it is anticipated that the basic guidelines proposed in this paper can be used as a general reference. (author)

  19. Implementation of Enterprise Risk Management (ERM) Framework in Enhancing Business Performances in Oil and Gas Sector

    Directory of Open Access Journals (Sweden)

    Sanmugam Annamalah

    2018-01-01

    This study empirically investigated the ERM implementation model and proposed a framework to identify and manage risks in the oil and gas sector in Malaysia. The study examined the role of ERM framework implementation in improving business performance by utilizing Economic Value Added as a measurement tool. The study also provides insights to the oil and gas sector on gaining higher profit returns, reducing the cost of capital, and improving shareholder value. Moreover, it contributes significantly to the field of enterprise risk management in Malaysia. The identification and management of risk is significant to organizations in managing risks efficiently, and stakeholders hold high expectations of executives and boards of directors to manage risk effectively. Linear regression analysis was utilized to analyze the data collected for this paper. Purposive sampling was employed to select the firms operating in the Malaysian oil and gas sector. Primary data were collected with the help of structured questions and interview techniques involving semi-structured questions. The results of the regression analysis suggest a significant and positive relationship between enterprise risk management and operational risk; market risk; political risk; health, safety and environmental risk; and business performance.

  20. Assessing the risk of impact of farming intensification on calcareous grasslands in Europe: a quantitative implementation of the MIRABEL framework

    NARCIS (Netherlands)

    Petit, S.; Elbersen, B.S.

    2006-01-01

    Intensification of farming practices is still a major driver of biodiversity loss in Europe, despite the implementation of policies that aim to reverse this trend. A conceptual framework called MIRABEL was previously developed that enabled a qualitative and expert-based assessment of the impact of

  1. Regulation of electricity distribution: Issues for implementing a norm model

    International Nuclear Information System (INIS)

    Bjoerndal, Endre; Bjoerndal, Mette; Bjoernenak, Trond; Johnsen, Thore

    2005-01-01

    The Norwegian regulation of transmission and distribution of electricity is currently under revision, and several proposals, including price caps, various norm models and adjustments to the present revenue cap model, have been considered by the Norwegian regulator, NVE. Our starting point is that a successful and sustainable income regulation model for electricity distribution should be in accordance with the way of thinking, and the managerial tools, of modern businesses. In the regulation it is assumed that decisions regarding operations and investments are made by independent, business-oriented entities. The ambition of a dynamically efficient industry therefore requires that the regulatory model and its implementation support best-practice business performance. This will influence how the cost base is determined and the way investments are dealt with. We investigate a possible implementation of a regulatory model based on cost norms, distinguishing between customer-driven costs on the one hand and costs related to the network itself on the other. The network-related costs, which account for approximately 80% of the total cost of electricity distribution, include the costs of operating and maintaining the network, as well as capital costs. These are the "difficult" costs, as their levels depend on structural and climatic factors, as well as the number of customers and the load that is served. Additionally, the costs are not separable, since, for instance, maintenance and investments can be substitutable activities. The work concentrates on verifying the cost model and evaluating implications for the use of the present efficiency model (DEA) in the regulation. Moreover, we consider how network-related costs can be managed in a norm model. Finally, it is highlighted that an important part of a regulatory model based on cost norms is to devise quality measures and determine how to use them in the economic regulation. (Author)

  2. Demo - Talk2Me: A Framework for Device–to–Device Augmented Reality Social Network

    DEFF Research Database (Denmark)

    Shu, Jiayu; Kosta, Sokol; Zheng, Rui

    2018-01-01

    In this demo, we present Talk2Me, an augmented reality social network framework that enables users to disseminate information in a distributed way and view others' information instantly. Talk2Me advertises users' messages, together with their face-signatures, to every nearby device in a Device-to-Device fashion. When a user looks at nearby persons through her camera-enabled wearable devices (e.g., Google Glass), the framework automatically extracts the face-signature of the person of interest, compares it with the previously captured signatures, and presents the information shared by this person to the user. We design a lightweight and yet accurate face recognition algorithm, together with an efficient distributed dissemination protocol. We integrate their implementations in an Android prototype.
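
    The abstract does not specify Talk2Me's matching metric. A common approach for comparing embedding-style face-signatures is cosine similarity against a gallery of previously captured signatures, with a decision threshold; the sketch below assumes that approach (the function name, threshold and toy vectors are illustrative, not taken from the paper):

```python
import numpy as np

def best_match(query, known, threshold=0.8):
    """Return the index of the closest stored face-signature, or None.

    query: 1-D embedding of the observed face.
    known: 2-D array, one previously captured signature per row.
    Cosine similarity with a threshold is an assumed stand-in for the
    paper's (unspecified) lightweight face recognition algorithm.
    """
    q = query / np.linalg.norm(query)
    k = known / np.linalg.norm(known, axis=1, keepdims=True)
    sims = k @ q                      # cosine similarity to each signature
    best = int(np.argmax(sims))
    return best if sims[best] >= threshold else None
```

    A device would then display the message advertised alongside the matched signature; when every similarity falls below the threshold, no information is shown.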

  3. Progress towards and barriers to implementation of a risk framework for US federal wildland fire policy and decision making

    Science.gov (United States)

    David C. Calkin; Mark A. Finney; Alan A. Ager; Matthew P. Thompson; Krista M. Gebert

    2011-01-01

    In this paper we review progress towards the implementation of a risk management framework for US federal wildland fire policy and operations. We first describe new developments in wildfire simulation technology that catalyzed the development of risk-based decision support systems for strategic wildfire management. These systems include new analytical methods to measure...

  4. Time-reversal symmetric work distributions for closed quantum dynamics in the histories framework

    International Nuclear Information System (INIS)

    Miller, Harry J D; Anders, Janet

    2017-01-01

    A central topic in the emerging field of quantum thermodynamics is the definition of thermodynamic work in the quantum regime. One widely used solution is to define work for a closed system undergoing non-equilibrium dynamics according to the two-point energy measurement scheme. However, due to the invasive nature of measurement, the two-point quantum work probability distribution cannot describe the statistics of energy change from the perspective of the system alone. We here introduce the quantum histories framework as a method to characterise the thermodynamic properties of the unmeasured, closed dynamics. Constructing continuous power operator trajectories allows us to derive an alternative quantum work distribution for closed quantum dynamics that fulfils energy conservation and is time-reversal symmetric. This opens the possibility to compare the measured work with the unmeasured work, contrasting with the classical situation where measurement does not affect the work statistics. We find that the work distribution of the unmeasured dynamics leads to deviations from the classical Jarzynski equality and can have negative values, highlighting distinctly non-classical features of quantum work. (fast track communication)
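
    The two-point measurement scheme that the histories framework is contrasted with can be made concrete. For thermal preparation with respect to an initial Hamiltonian H0, unitary evolution U, and a projective energy measurement of the final Hamiltonian H1, the work value w = E1_m - E0_n occurs with probability p_n |<m|U|n>|^2. A numerical sketch for a qubit quench (the function name and example Hamiltonians are illustrative, not from the paper):

```python
import numpy as np

def tpm_work_distribution(H0, H1, U, beta):
    """Two-point-measurement work distribution for a closed process:
    measure the energy of a thermal state of H0, evolve with unitary U,
    measure the energy of H1. Work w = E1[m] - E0[n] occurs with
    probability p[n] * |<m|U|n>|^2."""
    E0, V0 = np.linalg.eigh(H0)
    E1, V1 = np.linalg.eigh(H1)
    p = np.exp(-beta * E0)
    p /= p.sum()                       # thermal populations of H0
    T = np.abs(V1.conj().T @ U @ V0) ** 2   # T[m, n] = |<m|U|n>|^2
    works, probs = [], []
    for n, pn in enumerate(p):
        for m in range(len(E1)):
            works.append(E1[m] - E0[n])
            probs.append(pn * T[m, n])
    return np.array(works), np.array(probs)
```

    For H0 = sigma_z, H1 = sigma_x and U = identity at inverse temperature beta = 1, the probabilities sum to one and the mean work equals tanh(1), consistent with the expectation Tr[rho (U†H1U - H0)] for the initial thermal state rho.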

  5. The Midwifery Services Framework: Lessons learned from the initial stages of implementation in six countries.

    Science.gov (United States)

    Garg, Shantanu; Moyo, Nester T; Nove, Andrea; Bokosi, Martha

    2018-07-01

    In 2015, the International Confederation of Midwives (ICM) launched the Midwifery Services Framework (MSF): an evidence-based tool to guide countries through the process of improving their sexual, reproductive, maternal and newborn health services through strengthening and developing the midwifery workforce. The MSF is aligned with key global architecture for sexual, reproductive, maternal and newborn health and human resources for health. This third in a series of three papers describes the experience of starting to implement the MSF in the first six countries that requested ICM support to adopt the tool, and the lessons learned during these early stages of implementation. The early adopting countries selected a variety of priority work areas, but nearly all highlighted the importance of improving the attractiveness of midwifery as a career so as to improve attraction and retention, and several saw the need for improvements to midwifery regulation, pre-service education, availability and/or accessibility of midwives. Key lessons from the early stages of implementation include the need to ensure a broad range of stakeholder involvement from the outset and the need for an in-country lead organisation to maintain the momentum of implementation even when there are changes in political leadership, security concerns or other barriers to progress. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  6. Incentive regulation of electricity distribution networks: Lessons of experience from Britain

    International Nuclear Information System (INIS)

    Jamasb, Tooraj; Pollitt, Michael

    2007-01-01

    This paper reviews the recent experience of the UK electricity distribution sector under incentive regulation. The UK has a significant and transparent history in implementing incentive regulation in the period since 1990. We demonstrate the successes of this period in reducing costs, prices, and energy losses while maintaining quality of service. We also draw out the lessons for other countries in implementing distribution sector reform. We conclude by discussing the place of incentive regulation of networks within the wider reform context, the required legislative framework, the need for appropriate unbundling, the importance of quality of service incentives, the regulatory information requirements, and the role of sector rationalisation. (author)

  7. Framework for managing mycotoxin risks in the food industry.

    Science.gov (United States)

    Baker, Robert C; Ford, Randall M; Helander, Mary E; Marecki, Janusz; Natarajan, Ramesh; Ray, Bonnie

    2014-12-01

    We propose a methodological framework for managing mycotoxin risks in the food processing industry. Mycotoxin contamination is a well-known threat to public health that has economic significance for the food processing industry; it is imperative to address mycotoxin risks holistically, at all points in the procurement, processing, and distribution pipeline, by tracking the relevant data, adopting best practices, and providing suitable adaptive controls. The proposed framework includes (i) an information and data repository, (ii) a collaborative infrastructure with analysis and simulation tools, (iii) standardized testing and acceptance sampling procedures, and (iv) processes that link the risk assessments and testing results to the sourcing, production, and product release steps. The implementation of suitable acceptance sampling protocols for mycotoxin testing is considered in some detail.

  8. SIED, a Data Privacy Engineering Framework

    OpenAIRE

    Mivule, Kato

    2013-01-01

    While a number of data privacy techniques have been proposed in recent years, few frameworks have been suggested for the implementation of the data privacy process. Most of the proposed approaches are tailored towards implementing a specific data privacy algorithm but not the overall data privacy engineering and design process. Therefore, as a contribution, this study proposes SIED (Specification, Implementation, Evaluation, and Dissemination), a conceptual framework that takes a holist...

  9. Distributed Framework for Dynamic Telescope and Instrument Control

    Science.gov (United States)

    Ames, Troy J.; Case, Lynne

    2002-01-01

    Traditionally, instrument command and control systems have been developed specifically for a single instrument. Such solutions are frequently expensive and are inflexible to support the next instrument development effort. NASA Goddard Space Flight Center is developing an extensible framework, known as Instrument Remote Control (IRC), that applies to any kind of instrument that can be controlled by a computer. IRC combines the platform-independent processing capabilities of Java with the power of the Extensible Markup Language (XML). A key aspect of the architecture is software that is driven by an instrument description, written using the Instrument Markup Language (IML). IML is an XML dialect used to describe graphical user interfaces to control and monitor the instrument, command sets and command formats, data streams, communication mechanisms, and data processing algorithms. The IRC framework provides the ability to communicate with components anywhere on a network using the JXTA protocol for dynamic discovery of distributed components. JXTA (see http://www.jxta.org) is a generalized protocol that allows any devices connected by a network to communicate in a peer-to-peer manner. IRC uses JXTA to advertise a device's IML and discover devices of interest on the network. Devices can join or leave the network and thus join or leave the instrument control environment of IRC. Currently, several astronomical instruments are working with the IRC development team to develop custom components for IRC to control their instruments. These instruments include: High resolution Airborne Wideband Camera (HAWC), a first light instrument for the Stratospheric Observatory for Infrared Astronomy (SOFIA); Submillimeter And Far Infrared Experiment (SAFIRE), a Principal Investigator instrument for SOFIA; and Fabry-Perot Interferometer Bolometer Research Experiment (FIBRE), a prototype of the SAFIRE instrument, used at the Caltech Submillimeter Observatory (CSO). Most recently, we have
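
    The central idea of IRC, generic software driven by an XML instrument description, can be sketched by parsing a description into a command table. The fragment below is a hypothetical IML-like document; the actual IML schema is not reproduced here, so the element and attribute names are invented for illustration:

```python
import xml.etree.ElementTree as ET

# Hypothetical IML-like description (the real IML schema differs).
IML = """<Instrument name="FIBRE">
  <CommandSet>
    <Command name="setBias" format="BIAS {value:d}"/>
    <Command name="readFrame" format="READ"/>
  </CommandSet>
</Instrument>"""

def load_commands(iml_text):
    """Build a name -> format-string table from the description, so the
    control software needs no instrument-specific code."""
    root = ET.fromstring(iml_text)
    return {c.get("name"): c.get("format") for c in root.iter("Command")}

commands = load_commands(IML)
# Format a concrete command string from the description alone.
print(commands["setBias"].format(value=42))   # BIAS 42
```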

  10. Post Implementation Review Framework and Procedures

    Data.gov (United States)

    Social Security Administration — This template outlines the Social Security Administration's (SSA) approach to initiating, conducting, and completing Post Implementation Reviews (PIRs). The template...

  11. Islanded operation of distributed networks

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    This report summarises the findings of a study to investigate the regulatory, commercial and technical risks and benefits associated with the operation of distributed generation to power an islanded section of distributed network. A review of published literature was carried out, and UK generators were identified who could operate as part of an island network under the existing technical, regulatory, and safety framework. Agreement on case studies for consideration with distributed network operators (DNOs) is discussed as well as the quantification of the risks, benefits and costs of islanding, and the production of a case implementation plan for each case study. Technical issues associated with operating sections of network in islanded mode are described, and impacts of islanding on trading and settlement, and technical and commercial modelling are explored.

  12. Islanded operation of distributed networks

    International Nuclear Information System (INIS)

    2005-01-01

    This report summarises the findings of a study to investigate the regulatory, commercial and technical risks and benefits associated with the operation of distributed generation to power an islanded section of distributed network. A review of published literature was carried out, and UK generators were identified who could operate as part of an island network under the existing technical, regulatory, and safety framework. Agreement on case studies for consideration with distributed network operators (DNOs) is discussed as well as the quantification of the risks, benefits and costs of islanding, and the production of a case implementation plan for each case study. Technical issues associated with operating sections of network in islanded mode are described, and impacts of islanding on trading and settlement, and technical and commercial modelling are explored

  13. Monitoring and evaluation of spatially managed areas: A generic framework for implementation of ecosystem based marine management and its application

    DEFF Research Database (Denmark)

    Stelzenmüller, Vanessa; Breen, Patricia; Stamford, Tammy

    2013-01-01

    This study introduces a framework for the monitoring and evaluation of spatially managed areas (SMAs), which is currently being tested by nine European case studies. The framework provides guidance on the selection, mapping, and assessment of ecosystem components and human pressures, the evaluation of management effectiveness and potential adaptations to management. Moreover, it provides a structured approach with advice on spatially explicit tools for practical tasks like the assessment of cumulative impacts of human pressures or pressure-state relationships. The case studies revealed emerging challenges […] on qualitative information are addressed. The lessons learned will provide a better insight into the full range of methods and approaches required to support the implementation of the ecosystem approach to marine spatial management in Europe and elsewhere.

  14. The impact of the implementation of the SysTrust framework upon the quality of financial reporting: structural equation modelling approach

    Directory of Open Access Journals (Sweden)

    Ahmed Al-Dmour

    2018-03-01

    The purpose of this research is to examine empirically, validate, and predict the reliability of the proposed relationship between the reliability of the AIS process, in the context of the SysTrust framework (principles and criteria), and the quality of financial reporting in shareholding companies in Jordan. For this purpose, primary data were collected through a self-structured questionnaire from 239 shareholding companies. The extent of adoption of the SysTrust framework (principles and criteria) and the quality of financial reporting were also measured. The data were analyzed using structural equation modelling. The results showed that the loading estimates were of substantial magnitude and statistically significant, indicating that all five main principles of the SysTrust framework are relevant in predicting the quality of financial reporting. Moreover, the reliability of AIS achieved through the implementation of these five SysTrust principles positively impacted the quality of financial reporting, as the structural coefficients for these paths are significant.

  15. eTOXlab, an open source modeling framework for implementing predictive models in production environments.

    Science.gov (United States)

    Carrió, Pau; López, Oriol; Sanz, Ferran; Pastor, Manuel

    2015-01-01

    Computational models based on Quantitative Structure-Activity Relationship (QSAR) methodologies are widely used tools for predicting the biological properties of new compounds. In many instances, such models are used routinely in industry (e.g. the food, cosmetic or pharmaceutical industries) for the early assessment of the biological properties of new compounds. However, most of the tools currently available for developing QSAR models are not well suited to supporting the whole QSAR model life cycle in production environments. We have developed eTOXlab, an open-source modeling framework designed to be used at the core of a self-contained virtual machine that can be easily deployed in production environments, providing predictions as web services. eTOXlab consists of a collection of object-oriented Python modules with methods mapping common tasks of standard modeling workflows. This framework allows building and validating QSAR models as well as predicting the properties of new compounds using either a command line interface or a graphical user interface (GUI). Simple models can be easily generated by setting a few parameters, while more complex models can be implemented by overriding pieces of the original source code. eTOXlab benefits from the object-oriented capabilities of Python to provide high flexibility: any model implemented using eTOXlab inherits the features implemented in the parent model, like common tools and services or the automatic exposure of the models as prediction web services. The particular eTOXlab architecture as a self-contained, portable prediction engine allows building models with confidential information within corporate facilities, which can be safely exported and used for prediction without disclosing the structures of the training series. The software presented here provides full support for the specific needs of users who want to develop, use and maintain predictive models in corporate environments. The technologies used by e
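
    The inheritance mechanism described above, where a new model overrides individual steps and inherits everything else from a parent, is ordinary object-oriented Python. The sketch below illustrates the pattern only; the class and method names are invented and are not the eTOXlab API:

```python
class BaseModel:
    """Parent model: shared workflow that child models inherit."""

    def descriptors(self, compound):
        # Stand-in for real molecular descriptor calculation.
        return [float(len(compound))]

    def score(self, x):
        return sum(x)                  # placeholder scoring step

    def predict(self, compound):
        # Common prediction pipeline shared by every model.
        return self.score(self.descriptors(compound))


class CustomModel(BaseModel):
    """Child model: overrides one step and inherits the rest of the
    pipeline (in eTOXlab, also common services such as web exposure)."""

    def score(self, x):
        return 2.0 * sum(x) + 1.0
```

    Here `CustomModel().predict("CCO")` reuses the inherited descriptor step but applies the overridden score, which is the behaviour the abstract describes for complex models.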

  16. A Bayesian Framework for Estimating the Concordance Correlation Coefficient Using Skew-elliptical Distributions.

    Science.gov (United States)

    Feng, Dai; Baumgartner, Richard; Svetnik, Vladimir

    2018-04-05

    The concordance correlation coefficient (CCC) is a widely used scaled index in the study of agreement. In this article, we propose estimating the CCC by a unified Bayesian framework that can (1) accommodate symmetric or asymmetric and light- or heavy-tailed data; (2) select model from several candidates; and (3) address other issues frequently encountered in practice such as confounding covariates and missing data. The performance of the proposal was studied and demonstrated using simulated as well as real-life biomarker data from a clinical study of an insomnia drug. The implementation of the proposal is accessible through a package in the Comprehensive R Archive Network.
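
    The index being estimated is Lin's concordance correlation coefficient; the Bayesian machinery generalises its classical sample version, which is simple to state and compute. A sketch of the non-Bayesian point estimate (our illustration, not the R package's interface):

```python
import numpy as np

def concordance_cc(x, y):
    """Sample CCC (Lin, 1989):
    ccc = 2*s_xy / (s_x^2 + s_y^2 + (mean_x - mean_y)^2),
    which penalises both lack of correlation and location/scale shifts."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    sxy = np.mean((x - x.mean()) * (y - y.mean()))
    return 2.0 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)
```

    Perfect agreement gives 1, while a constant offset lowers the CCC even though the Pearson correlation stays at 1; for example, `concordance_cc([1, 2, 3], [2, 3, 4])` equals 4/7.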

  17. Implementation of the Leaching Environmental Assessment Framework

    Science.gov (United States)

    New leaching tests are available in the U.S. for developing more accurate source terms for use in fate and transport models. For beneficial use or disposal, the use of the leaching environmental assessment framework (LEAF) will provide leaching results that reflect field condit...

  18. A Market Framework for Enabling Electric Vehicles Flexibility Procurement at the Distribution Level Considering Grid Constraints

    DEFF Research Database (Denmark)

    Gadea, Ana; Marinelli, Mattia; Zecchino, Antonio

    2018-01-01

    In a context of extensive electrification of the transport sector, the use of flexibility services from electric vehicles (EVs) is becoming of paramount importance. This paper defines a market framework for enabling EVs flexibility at the distribution level, considering grid constraints. The main...... the benefit for DSOs and society, proving a technical and economic feasible solution....

  19. Implementing an overdose education and naloxone distribution program in a health system.

    Science.gov (United States)

    Devries, Jennifer; Rafie, Sally; Polston, Gregory

    To design and implement a health system-wide program increasing provision of take-home naloxone in patients at risk for opioid overdose, with the downstream aim of reducing fatalities. The program includes health care professional education and guidelines, development and dissemination of patient education materials, electronic health record changes to promote naloxone prescriptions, and availability of naloxone in pharmacies. Academic health system, San Diego, California. University of California, San Diego Health (UCSDH), offers both inpatient and outpatient primary care and specialty services with 563 beds spanning 2 hospitals and 6 pharmacies. UCSDH is part of the University of California health system, and it serves as the county's safety net hospital. In January 2016, a multisite academic health system initiated a system-wide overdose education and naloxone distribution program to prevent opioid overdose and opioid overdose-related deaths. An interdisciplinary, interdepartmental team came together to develop and implement the program. To strengthen institutional support, naloxone prescribing guidelines were developed and approved for the health system. Education on naloxone for physicians, pharmacists, and nurses was provided through departmental trainings, bulletins, and e-mail notifications. Alerts in the electronic health record and preset naloxone orders facilitated co-prescribing of naloxone with opioid prescriptions. Electronic health record reports captured naloxone prescriptions ordered. Summary reports on the electronic health record measured naloxone reminder alerts and response rates. Since the start of the program, the health system has trained 252 physicians, pharmacists, and nurses in overdose education and take-home naloxone. There has been an increase in the number of prescriptions for naloxone from a baseline of 4.5 per month to an average of 46 per month during the 3 months following full implementation of the program including

  20. Large distributed control system using Ada in fusion research

    International Nuclear Information System (INIS)

    Van Arsdall, P J; Woodruff, J P.

    1998-01-01

    Construction of the National Ignition Facility laser at Lawrence Livermore National Laboratory features a distributed control system that uses object-oriented software engineering techniques. Control of 60,000 devices is effected using a network of some 500 computers. The software is being written in Ada and communicates through CORBA. Software controls are implemented in two layers: individual device controllers and a supervisory layer. The software architecture provides services in the form of frameworks that address issues common to event-driven control systems. Those services are allocated to levels that strictly prescribe their interdependency so the levels are separately reusable. The project has completed its final design review. The delivery of the first increment takes place in October 1998. Keywords: Distributed control system, object-oriented development, CORBA, application frameworks, levels of abstraction

  1. Development, implementation and critique of a bioethics framework for pharmaceutical sponsors of human biomedical research.

    Science.gov (United States)

    Van Campen, Luann E; Therasse, Donald G; Klopfenstein, Mitchell; Levine, Robert J

    2015-11-01

    Pharmaceutical human biomedical research is a multi-dimensional endeavor that requires collaboration among many parties, including those who sponsor, conduct, participate in, or stand to benefit from the research. Human subjects' protections have been promulgated to ensure that the benefits of such research are accomplished with respect for and minimal risk to individual research participants, and with an overall sense of fairness. Although these protections are foundational to clinical research, most ethics guidance primarily highlights the responsibilities of investigators and ethics review boards. Currently, there is no published resource that comprehensively addresses the bioethical responsibilities of industry sponsors, including their responsibilities to parties who are not research participants but are, nevertheless, key stakeholders in the endeavor. To fill this void, in 2010 Eli Lilly and Company instituted a Bioethics Framework for Human Biomedical Research. This paper describes how the framework was developed and implemented and provides a critique based on four years of experience. A companion article provides the actual document used by Eli Lilly and Company to guide ethical decisions regarding all phases of human clinical trials. While many of the concepts presented in this framework are not novel, compiling them in a manner that articulates the ethical responsibilities of a sponsor is novel. By utilizing this type of bioethics framework, we have been able to develop bioethics positions on various topics, provide research ethics consultations, and integrate bioethics into the daily operations of our human biomedical research. We hope that by sharing these companion papers we will stimulate discussion within and outside the biopharmaceutical industry for the benefit of the multiple parties involved in pharmaceutical human biomedical research.

  2. A general modeling framework for describing spatially structured population dynamics

    Science.gov (United States)

    Sample, Christine; Fryxell, John; Bieri, Joanna; Federico, Paula; Earl, Julia; Wiederholt, Ruscena; Mattsson, Brady; Flockhart, Tyler; Nicol, Sam; Diffendorfer, James E.; Thogmartin, Wayne E.; Erickson, Richard A.; Norris, D. Ryan

    2017-01-01

    Variation in movement across time and space fundamentally shapes the abundance and distribution of populations. Although a variety of approaches model structured population dynamics, they are limited to specific types of spatially structured populations and lack a unifying framework. Here, we propose a unified network-based framework flexible enough to capture a wide variety of spatiotemporal processes, including metapopulations and a range of migratory patterns. It can accommodate different kinds of age structures, forms of population growth, dispersal, nomadism and migration, and alternative life-history strategies. Our objective was to link three general elements common to all spatially structured populations (space, time and movement) under a single mathematical framework. To do this, we adopt a network modeling approach. The spatial structure of a population is represented by a weighted and directed network. Each node and each edge has a set of attributes which vary through time. The dynamics of our network-based population is modeled with discrete time steps. Using both theoretical and real-world examples, we show how common elements recur across species with disparate movement strategies and how they can be combined under a unified mathematical framework. We illustrate how metapopulations, various migratory patterns, and nomadism can be represented with this modeling approach. We also apply our network-based framework to four organisms spanning a wide range of life histories, movement patterns, and carrying capacities. General computer code to implement our framework is provided, which can be applied to almost any spatially structured population. This framework contributes to our theoretical understanding of population dynamics and has practical management applications, including understanding the impact of perturbations on population size, distribution, and movement patterns. By working within a common framework, there is less chance
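
    The network formulation can be sketched in a few lines: nodes carry abundances, directed edges carry movement proportions, and each discrete time step applies growth followed by redistribution. The two-node network, growth rates, and movement proportions below are invented for illustration, not taken from the article's case studies:

```python
# One discrete time step of a network-structured population model:
# per-node growth, then movement of individuals along weighted directed edges.
def step(abundance, moves, growth):
    # moves[i][j]: proportion of node i's population moving to node j
    # (each row sums to 1, so individuals are conserved by movement)
    grown = {i: n * growth[i] for i, n in abundance.items()}
    nxt = {i: 0.0 for i in abundance}
    for i, n in grown.items():
        for j, p in moves[i].items():
            nxt[j] += n * p
    return nxt

abundance = {"A": 100.0, "B": 50.0}
moves = {"A": {"A": 0.8, "B": 0.2}, "B": {"A": 0.5, "B": 0.5}}
growth = {"A": 1.0, "B": 1.0}   # no net growth: total stays at 150
for _ in range(3):
    abundance = step(abundance, moves, growth)
```

    With growth rates of 1 the total population is conserved across steps, a useful sanity check before adding age structure or density dependence on the node attributes.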

  3. Implementation of the ATLAS trigger within the ATLAS Multi-Threaded Software Framework AthenaMT

    CERN Document Server

    Wynne, Benjamin; The ATLAS collaboration

    2016-01-01

    We present an implementation of the ATLAS High Level Trigger that provides parallel execution of trigger algorithms within the ATLAS multi-threaded software framework, AthenaMT. This development will enable the ATLAS High Level Trigger to meet future challenges due to the evolution of computing hardware and upgrades of the Large Hadron Collider, LHC, and ATLAS Detector. During the LHC data-taking period starting in 2021, luminosity will reach up to three times the original design value. Luminosity will increase further, to up to 7.5 times the design value, in 2026 following LHC and ATLAS upgrades. This includes an upgrade of the ATLAS trigger architecture that will result in an increase in the High Level Trigger input rate by a factor of 4 to 10 compared to the current maximum rate of 100 kHz. The current ATLAS multiprocess framework, AthenaMP, manages a number of processes that process events independently, executing algorithms sequentially in each process. AthenaMT will provide a fully multi-threaded env...
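
    The shift from independent event-processing processes (AthenaMP) toward threaded scheduling can be illustrated, in a much-simplified form, by a worker pool dispatching events through a fixed chain of algorithms. This sketch shows only the inter-event parallelism idea; it is unrelated to the actual ATLAS/Gaudi scheduler:

```python
# Inter-event parallelism in miniature: a fixed chain of toy "trigger
# algorithms" is applied to many events, with events dispatched across
# worker threads. Results must match sequential execution exactly.
from concurrent.futures import ThreadPoolExecutor

ALGORITHMS = [lambda e: e * 2, lambda e: e + 1]   # toy algorithm chain

def process(event):
    for alg in ALGORITHMS:
        event = alg(event)
    return event

events = list(range(8))
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(process, events))

print(results)  # [1, 3, 5, 7, 9, 11, 13, 15] - same as sequential
```

    Intra-event parallelism, where independent algorithms within one event run concurrently, is the harder scheduling problem AthenaMT addresses and is not captured by this sketch.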

  4. Modelling altered revenue function based on varying power consumption distribution and electricity tariff charge using data analytics framework

    Science.gov (United States)

    Zainudin, W. N. R. A.; Ramli, N. A.

    2017-09-01

    In 2010, the Energy Commission (EC) introduced Incentive Based Regulation (IBR) to ensure a sustainable Malaysian Electricity Supply Industry (MESI), promote transparent and fair returns, encourage maximum efficiency and maintain policy-driven end-user tariffs. To support such a transformation, a sophisticated system for generating policy-driven electricity tariff structures is greatly needed. Hence, this study presents a data analytics framework that generates an altered revenue function based on varying power consumption distribution and tariff charge function. For the purpose of this study, the power consumption distribution is proxied by the proportion of household consumption and electricity consumed in kWh, and the tariff charge function is proxied by a three-tiered increasing block tariff (IBT). The altered revenue function is useful for indicating whether changes in the power consumption distribution and tariff charges will have a positive or negative impact on the economy. The methodology used for this framework begins by defining revenue to be a function of the power consumption distribution and the tariff charge function. Then, the proportion of household consumption and the tariff charge function are derived within certain intervals of electric power. Any changes in those proportions are conjectured to contribute towards changes in the revenue function. Thus, these changes can potentially indicate whether changes in the power consumption distribution and tariff charge function have a positive or negative impact on TNB revenue. Based on the findings of this study, major changes in the tariff charge function seem to affect the altered revenue function more than the power consumption distribution. However, the paper concludes that power consumption distribution and tariff charge function can influence TNB revenue to a great extent.
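
    A three-tiered increasing block tariff can be sketched directly: each successive block of consumption is billed at a higher rate, and revenue aggregates the bills over the consumption distribution. The block sizes and rates below are illustrative placeholders, not an actual tariff schedule:

```python
# Three-tiered increasing block tariff (IBT): the first block of kWh is billed
# at the lowest rate, the next block at a higher rate, and so on.
# Block sizes and rates are invented for illustration.
TIERS = [(200, 0.218), (100, 0.334), (float("inf"), 0.516)]  # (kWh, rate)

def bill(kwh):
    total, remaining = 0.0, kwh
    for block, rate in TIERS:
        used = min(remaining, block)
        total += used * rate
        remaining -= used
        if remaining <= 0:
            break
    return total

def revenue(consumption_distribution):
    # consumption_distribution: list of (households, kWh per household)
    return sum(n * bill(kwh) for n, kwh in consumption_distribution)

print(bill(350))  # 200*0.218 + 100*0.334 + 50*0.516 = 102.8
```

    Shifting households between blocks (changing the consumption distribution) or moving a tier boundary (changing the tariff charge function) then changes `revenue` in exactly the way the altered revenue function is meant to capture.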

  5. Generic-distributed framework for cloud services marketplace based on unified ontology

    Directory of Open Access Journals (Sweden)

    Samer Hasan

    2017-11-01

    Full Text Available Cloud computing is a pattern for delivering ubiquitous and on-demand computing resources based on a pay-as-you-use financial model. Typically, cloud providers advertise cloud service descriptions in various formats on the Internet. On the other hand, cloud consumers use available search engines (Google and Yahoo) to explore cloud service descriptions and find the adequate service. Unfortunately, general purpose search engines are not designed to provide a small and complete set of results, which makes the process a big challenge. This paper presents a generic-distributed framework for a cloud services marketplace to automate the cloud services discovery and selection process, and remove the barriers between service providers and consumers. Additionally, this work implements two instances of the generic framework by adopting two different matching algorithms: namely, a dominant and recessive attributes algorithm borrowed from gene science, and a semantic similarity algorithm based on a unified cloud service ontology. Finally, this paper presents a unified cloud services ontology and models real-life cloud services according to the proposed ontology. To the best of the authors’ knowledge, this is the first attempt to build a cloud services marketplace where cloud providers and cloud consumers can trade cloud services as utilities. In comparison with existing work, the semantic approach reduced the execution time by 20% and maintained the same values for all other parameters. On the other hand, the dominant and recessive attributes approach reduced the execution time by 57% but showed a lower value for recall.

  6. Generic-distributed framework for cloud services marketplace based on unified ontology.

    Science.gov (United States)

    Hasan, Samer; Valli Kumari, V

    2017-11-01

    Cloud computing is a pattern for delivering ubiquitous and on-demand computing resources based on a pay-as-you-use financial model. Typically, cloud providers advertise cloud service descriptions in various formats on the Internet. On the other hand, cloud consumers use available search engines (Google and Yahoo) to explore cloud service descriptions and find the adequate service. Unfortunately, general purpose search engines are not designed to provide a small and complete set of results, which makes the process a big challenge. This paper presents a generic-distributed framework for a cloud services marketplace to automate the cloud services discovery and selection process, and remove the barriers between service providers and consumers. Additionally, this work implements two instances of the generic framework by adopting two different matching algorithms: namely, a dominant and recessive attributes algorithm borrowed from gene science, and a semantic similarity algorithm based on a unified cloud service ontology. Finally, this paper presents a unified cloud services ontology and models real-life cloud services according to the proposed ontology. To the best of the authors' knowledge, this is the first attempt to build a cloud services marketplace where cloud providers and cloud consumers can trade cloud services as utilities. In comparison with existing work, the semantic approach reduced the execution time by 20% and maintained the same values for all other parameters. On the other hand, the dominant and recessive attributes approach reduced the execution time by 57% but showed a lower value for recall.
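
    The record does not spell out the matching algorithms, but the dominant/recessive idea can be sketched as weighted attribute matching, with must-have ("dominant") attributes weighted above nice-to-have ("recessive") ones. The attributes, weights, and service descriptions below are invented for illustration:

```python
# Rank candidate cloud services against a request by weighted attribute match.
# "Dominant" attributes count double; "recessive" attributes count once.
# All names and weights here are hypothetical.
def match_score(request, service, dominant, recessive):
    score = 0.0
    for attr in dominant:
        score += 2.0 if service.get(attr) == request.get(attr) else 0.0
    for attr in recessive:
        score += 1.0 if service.get(attr) == request.get(attr) else 0.0
    return score

request = {"type": "storage", "region": "eu", "backup": "daily"}
services = {
    "svcA": {"type": "storage", "region": "eu", "backup": "weekly"},
    "svcB": {"type": "storage", "region": "us", "backup": "daily"},
    "svcC": {"type": "compute", "region": "eu", "backup": "daily"},
}
ranked = sorted(services,
                key=lambda s: match_score(request, services[s],
                                          dominant=["type", "region"],
                                          recessive=["backup"]),
                reverse=True)
print(ranked[0])  # svcA - matches both dominant attributes
```

    A semantic similarity variant would replace the exact `==` comparison with an ontology-distance measure, which is where the unified cloud service ontology comes in.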

  7. Hydroclimatic regimes: a distributed water-balance framework for hydrologic assessment, classification, and management

    Science.gov (United States)

    Weiskel, Peter K.; Wolock, David M.; Zarriello, Phillip J.; Vogel, Richard M.; Levin, Sara B.; Lent, Robert M.

    2014-01-01

    Runoff-based indicators of terrestrial water availability are appropriate for humid regions, but have tended to limit our basic hydrologic understanding of drylands – the dry-subhumid, semiarid, and arid regions which presently cover nearly half of the global land surface. In response, we introduce an indicator framework that gives equal weight to humid and dryland regions, accounting fully for both vertical (precipitation + evapotranspiration) and horizontal (groundwater + surface-water) components of the hydrologic cycle in any given location – as well as fluxes into and out of landscape storage. We apply the framework to a diverse hydroclimatic region (the conterminous USA) using a distributed water-balance model consisting of 53 400 networked landscape hydrologic units. Our model simulations indicate that about 21% of the conterminous USA either generated no runoff or consumed runoff from upgradient sources on a mean-annual basis during the 20th century. Vertical fluxes exceeded horizontal fluxes across 76% of the conterminous area. Long-term-average total water availability (TWA) during the 20th century, defined here as the total influx to a landscape hydrologic unit from precipitation, groundwater, and surface water, varied spatially by about 400 000-fold, a range of variation ~100 times larger than that for mean-annual runoff across the same area. The framework includes but is not limited to classical, runoff-based approaches to water-resource assessment. It also incorporates and reinterprets the green- and blue-water perspective now gaining international acceptance. Implications of the new framework for several areas of contemporary hydrology are explored, and the data requirements of the approach are discussed in relation to the increasing availability of gridded global climate, land-surface, and hydrologic data sets.
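
    The accounting behind TWA can be sketched on a toy network of landscape units: each unit's availability is its precipitation plus groundwater and surface-water inflow, and water not consumed by evapotranspiration is routed downstream. All values and the two-unit network below are invented; the actual model uses 53 400 networked units:

```python
# Toy water balance over a flow network of landscape units.
# TWA(unit) = precipitation + groundwater inflow + surface-water inflow.
def route(units, downstream):
    twa, outflow = {}, {}
    for u in units:                     # units listed upstream-to-downstream
        inflow = sum(outflow.get(up, 0.0)
                     for up, down in downstream.items() if down == u)
        twa[u] = units[u]["precip"] + units[u]["gw_in"] + inflow
        # whatever is not evapotranspired leaves as runoff
        outflow[u] = max(twa[u] - units[u]["et"], 0.0)
    return twa, outflow

units = {
    "headwater": {"precip": 100.0, "gw_in": 5.0, "et": 60.0},
    "valley":    {"precip": 40.0,  "gw_in": 10.0, "et": 70.0},
}
downstream = {"headwater": "valley"}
twa, outflow = route(units, downstream)
print(twa["valley"])  # 95.0: vertical input 50 plus 45 routed from upstream
```

    A dryland unit in this scheme is one whose evapotranspiration exceeds its vertical input, so it consumes runoff from upgradient sources rather than generating any, which is exactly the case the framework is built to represent.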

  8. A development framework for semantically interoperable health information systems.

    Science.gov (United States)

    Lopez, Diego M; Blobel, Bernd G M E

    2009-02-01

    Semantic interoperability is a basic challenge to be met for new generations of distributed, communicating and co-operating health information systems (HIS) enabling shared care and e-Health. Analysis, design, implementation and maintenance of such systems and intrinsic architectures have to follow a unified development methodology. The Generic Component Model (GCM) is used as a framework for modeling any system to evaluate and harmonize state-of-the-art architecture development approaches and standards for health information systems as well as to derive a coherent architecture development framework for sustainable, semantically interoperable HIS and their components. The proposed methodology is based on the Rational Unified Process (RUP), taking advantage of its flexibility to be configured for integrating other architectural approaches such as Service-Oriented Architecture (SOA), Model-Driven Architecture (MDA), ISO 10746, and HL7 Development Framework (HDF). Existing architectural approaches have been analyzed, compared and finally harmonized towards an architecture development framework for advanced health information systems. Starting with the requirements for semantic interoperability derived from paradigm changes for health information systems, and supported by formal software process engineering methods, an appropriate development framework for semantically interoperable HIS has been provided. The usability of the framework has been exemplified in a public health scenario.

  9. Formalization, implementation, and modeling of institutional controllers for distributed robotic systems.

    Science.gov (United States)

    Pereira, José N; Silva, Porfírio; Lima, Pedro U; Martinoli, Alcherio

    2014-01-01

    The work described is part of a long-term program of introducing institutional robotics, a novel framework for the coordination of robot teams that stems from institutional economics concepts. Under the framework, institutions are cumulative sets of persistent artificial modifications made to the environment or to the internal mechanisms of a subset of agents, thought to be functional for the collective order. In this article we introduce a formal model of institutional controllers based on Petri nets. We define executable Petri nets, an extension of Petri nets that takes into account robot actions and sensing, to design, program, and execute institutional controllers. We use a generalized stochastic Petri net view of the robot team controlled by the institutional controllers to model and analyze the stochastic performance of the resulting distributed robotic system. The ability of our formalism to replicate results obtained using other approaches is assessed through realistic simulations of up to 40 e-puck robots. In particular, we model a robot swarm and its institutional controller with the goal of maintaining wireless connectivity, and successfully compare our model predictions and simulation results with previously reported results, obtained by using finite state automaton models and controllers.
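
    The basic Petri net machinery underlying such controllers fits in a few lines: a marking assigns tokens to places, and a transition fires when all of its input places are marked, consuming tokens from inputs and producing them in outputs. The toy net below is illustrative only and does not reproduce the article's executable Petri nets:

```python
# Minimal Petri net: a marking maps places to token counts; a transition is
# enabled when every input place holds a token; firing moves tokens from
# input places to output places.
def enabled(marking, transition):
    return all(marking.get(p, 0) >= 1 for p in transition["in"])

def fire(marking, transition):
    assert enabled(marking, transition)
    m = dict(marking)
    for p in transition["in"]:
        m[p] -= 1
    for p in transition["out"]:
        m[p] = m.get(p, 0) + 1
    return m

# A toy transition: a robot moves from "idle" to "connected".
t_connect = {"in": ["idle"], "out": ["connected"]}
marking = {"idle": 1, "connected": 0}
marking = fire(marking, t_connect)
print(marking)  # {'idle': 0, 'connected': 1}
```

    The executable extension in the article attaches robot actions and sensing conditions to transitions; the generalized stochastic view additionally assigns firing rates, which is what enables the performance analysis.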

  10. Exploring a clinically friendly web-based approach to clinical decision support linked to the electronic health record: design philosophy, prototype implementation, and framework for assessment.

    Science.gov (United States)

    Miller, Perry; Phipps, Michael; Chatterjee, Sharmila; Rajeevan, Nallakkandi; Levin, Forrest; Frawley, Sandra; Tokuno, Hajime

    2014-07-01

    Computer-based clinical decision support (CDS) is an important component of the electronic health record (EHR). As an increasing amount of CDS is implemented, it will be important that this be accomplished in a fashion that assists in clinical decision making without imposing unacceptable demands and burdens upon the provider's practice. The objective of our study was to explore an approach that allows CDS to be clinician-friendly from a variety of perspectives, to build a prototype implementation that illustrates features of the approach, and to gain experience with a pilot framework for assessment. The paper first discusses the project's design philosophy and goals. It then describes a prototype implementation (Neuropath/CDS) that explores the approach in the domain of neuropathic pain and in the context of the US Veterans Administration EHR. Finally, the paper discusses a framework for assessing the approach, illustrated by a pilot assessment of Neuropath/CDS. The paper describes the operation and technical design of Neuropath/CDS, as well as the results of the pilot assessment, which emphasize the four areas of focus, scope, content, and presentation. The work to date has allowed us to explore various design and implementation issues relating to the approach illustrated in Neuropath/CDS, as well as the development and pilot application of a framework for assessment.

  11. A conceptual framework to study the role of communication through social software for coordination in globally-distributed software teams

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Yvonne

    2015-01-01

    Background In Global Software Development (GSD) the lack of face-to-face communication is a major challenge and effective computer-mediated practices are necessary to mitigate the effect of physical distance. Communication through Social Software (SoSo) supports team coordination, helping to deal...... with geographical distance; however, in Software Engineering literature, there is a lack of suitable theoretical concepts to analyze and describe everyday practices of globally-distributed software development teams and to study the role of communication through SoSo. Objective The paper proposes a theoretical...... framework for analyzing how communicative and coordinative practices are constituted and maintained in globally-distributed teams. Method The framework is based on the concepts of communicative genres and coordination mechanisms; it is motivated and explicated through examples from two qualitative empirical...

  12. On the Design of Smart Homes: A Framework for Activity Recognition in Home Environment.

    Science.gov (United States)

    Cicirelli, Franco; Fortino, Giancarlo; Giordano, Andrea; Guerrieri, Antonio; Spezzano, Giandomenico; Vinci, Andrea

    2016-09-01

    A smart home is a home environment enriched with sensing, actuation, communication and computation capabilities, which permits adapting it to inhabitants' preferences and requirements. Establishing a proper strategy of actuation on the home environment can require complex computational tasks on the sensed data. This is the case of activity recognition, which consists in retrieving high-level knowledge about what occurs in the home environment and about the behaviour of the inhabitants. The inherent complexity of this application domain calls for tools able to properly support the design and implementation phases. This paper proposes a framework for the design and implementation of smart home applications focused on activity recognition in home environments. The framework mainly relies on the Cloud-assisted Agent-based Smart home Environment (CASE) architecture, which offers basic abstraction entities that allow smart home applications to be easily designed and implemented. CASE is a three-layered architecture which exploits the distributed multi-agent paradigm and cloud technology for offering analytics services. Details about how to implement activity recognition on the CASE architecture are supplied, focusing on the low-level technological issues as well as the algorithms and the methodologies useful for activity recognition. The effectiveness of the framework is shown through a case study consisting of daily activity recognition of a person in a home environment.

  13. Biomass energy projects for joint implementation of the UN FCCC [Framework Convention on Climate Change

    International Nuclear Information System (INIS)

    Swisher, Joel N.; Renner, Frederick P.

    1998-01-01

    The UN Framework Convention on Climate Change (FCCC) allows for the joint implementation (JI) of measures to mitigate the emissions of greenhouse gases. The concept of JI refers to the implementation of such measures in one country with partial or full financial and/or technical support from another country, potentially fulfilling some of the supporting country's emission-reduction commitment under the FCCC. This paper addresses some key issues related to JI under the FCCC as they relate to the development of biomass energy projects for carbon offsets in developing countries. Issues include the reference case or baseline, carbon accounting and net carbon storage, potential project implementation barriers and risks, monitoring and verification, local agreements and host-country approval. All of these issues are important in project design and evaluation. We discuss briefly several case studies, which consist of biomass-fueled co-generation projects under development at large sugar mills in the Philippines, India and Brazil, as potential JI projects. The case studies illustrate the benefits of bioenergy for reducing carbon emissions and some of the important barriers and difficulties in developing and crediting such projects. Results to date illustrate both the achievements and the difficulties of this type of project. (author)

  14. Psychological first aid following trauma: implementation and evaluation framework for high-risk organizations.

    Science.gov (United States)

    Forbes, David; Lewis, Virginia; Varker, Tracey; Phelps, Andrea; O'Donnell, Meaghan; Wade, Darryl J; Ruzek, Josef I; Watson, Patricia; Bryant, Richard A; Creamer, Mark

    2011-01-01

    International clinical practice guidelines for the management of psychological trauma recommend Psychological First Aid (PFA) as an early intervention for survivors of potentially traumatic events. These recommendations are consensus-based, and there is little published evidence assessing the effectiveness of PFA. This is not surprising given the nature of the intervention and the complicating factors involved in any evaluation of PFA. There is, nevertheless, an urgent need for stronger evidence evaluating its effectiveness. The current paper posits that the implementation and evaluation of PFA within high-risk organizational settings is an ideal place to start. The paper provides a framework for a phasic approach to implementing PFA within such settings and presents a model for evaluating its effectiveness using a logic- or theory-based approach which considers both pre-event and post-event factors. Phases 1 and 2 of the PFA model are pre-event actions, and phases 3 and 4 are post-event actions. It is hoped that by using the Phased PFA model and evaluation method proposed in this paper, future researchers will begin to undertake the important task of building the evidence about the most effective approach to providing PFA in high-risk organizational and community disaster settings.

  15. BioNet Digital Communications Framework

    Science.gov (United States)

    Gifford, Kevin; Kuzminsky, Sebastian; Williams, Shea

    2010-01-01

    BioNet v2 is a peer-to-peer middleware that enables digital communication devices to talk to each other. It provides a software development framework, a standardized application interface, network-transparent device integration services, a flexible messaging model, and network communications for distributed applications. BioNet is an implementation of the Constellation Program Command, Control, Communications and Information (C3I) Interoperability specification, given in CxP 70022-01. The system architecture provides the necessary infrastructure for the integration of heterogeneous wired and wireless sensing and control devices into a unified data system with a standardized application interface, providing plug-and-play operation for hardware and software systems. BioNet v2 features a naming schema for mobility and coarse-grained localization information, data normalization within a network-transparent device driver framework, enabling of network communications to non-IP devices, and fine-grained application control of data subscription bandwidth usage. BioNet directly integrates Disruption Tolerant Networking (DTN) as a communications technology, enabling networked communications with assets that are only intermittently connected, including orbiting relay satellites and planetary rover vehicles.

  16. Protocol Implementation Generator

    DEFF Research Database (Denmark)

    Carvalho Quaresma, Jose Nuno; Probst, Christian W.

    2010-01-01

    Users expect communication systems to guarantee, amongst others, privacy and integrity of their data. These can be ensured by using well-established protocols; the best protocol, however, is useless if not all parties involved in a communication have a correct implementation of the protocol and the necessary tools. In this paper, we present the Protocol Implementation Generator (PiG), a framework that can be used to add protocol generation to protocol negotiation, or to easily share and implement new protocols throughout a network. PiG enables the sharing, verification, and translation of protocols ... based on the LySatool and a translator from the LySa language into C or Java.

  17. A finite element framework for multiscale/multiphysics analysis of structures with complex microstructures

    Science.gov (United States)

    Varghese, Julian

    This research work has contributed in various ways to help develop a better understanding of textile composites and materials with complex microstructures in general. An instrumental part of this work was the development of an object-oriented framework that made it convenient to perform multiscale/multiphysics analyses of advanced materials with complex microstructures such as textile composites. In addition to the studies conducted in this work, this framework lays the groundwork for continued research of these materials. This framework enabled a detailed multiscale stress analysis of a woven DCB specimen that revealed the effect of the complex microstructure on the stress and strain energy release rate distribution along the crack front. In addition to implementing an oxidation model, the framework was also used to implement strategies that expedited the simulation of oxidation in textile composites so that it would take only a few hours. The simulation showed that the tow architecture played a significant role in the oxidation behavior in textile composites. Finally, a coupled diffusion/oxidation and damage progression analysis was implemented that was used to study the mechanical behavior of textile composites under mechanical loading as well as oxidation. A parametric study was performed to determine the effect of material properties and the number of plies in the laminate on its mechanical behavior. The analyses indicated a significant effect of the tow architecture and other parameters on the damage progression in the laminates.

  18. a Web-Based Framework for Visualizing Industrial Spatiotemporal Distribution Using Standard Deviational Ellipse and Shifting Routes of Gravity Centers

    Science.gov (United States)

    Song, Y.; Gui, Z.; Wu, H.; Wei, Y.

    2017-09-01

    Analysing the spatiotemporal distribution patterns of different industries and their dynamics can help us learn the macro-level developing trends of those industries, and in turn provides references for industrial spatial planning. However, the analysis process is a challenging task which requires an easy-to-understand information presentation mechanism and a powerful computational technology to support visual analytics of big data on the fly. For this reason, this research proposes a web-based framework to meet such a visual analytics requirement. The framework uses the standard deviational ellipse (SDE) and shifting routes of gravity centers to show the spatial distribution and yearly developing trends of different enterprise types according to their industry categories. The calculation of gravity centers and ellipses is parallelized using Apache Spark to accelerate the processing. In the experiments, we use the enterprise registration dataset of Mainland China from 1960 to 2015, which contains fine-grained location information (i.e., coordinates of each individual enterprise), to demonstrate the feasibility of this framework. The experiment results show that the developed visual analytics method is helpful for understanding the multi-level patterns and developing trends of different industries in China. Moreover, the proposed framework can be used to analyse any natural or social spatiotemporal point process with large data volume, such as crime and disease.
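The gravity-center and SDE computations described above can be sketched in NumPy as follows. This is a minimal single-machine illustration of the classic SDE formulas, not the paper's Spark implementation; the function names are illustrative.

```python
import numpy as np

def gravity_center(points):
    """Mean center ("gravity center") of an (n, 2) array of (x, y) points."""
    return points.mean(axis=0)

def standard_deviational_ellipse(points):
    """Standard deviational ellipse of an (n, 2) array of (x, y) points.

    Returns (center, s1, s2, theta): the mean center, the standard
    deviations along the two rotated axes, and the rotation angle theta."""
    center = gravity_center(points)
    dx, dy = (points - center).T
    n = len(points)
    sxx, syy, sxy = (dx ** 2).sum(), (dy ** 2).sum(), (dx * dy).sum()
    # Rotation angle of the ellipse axes (classic SDE formula):
    # tan(theta) = [(Sxx - Syy) + sqrt((Sxx - Syy)^2 + 4*Sxy^2)] / (2*Sxy)
    theta = np.arctan2(sxx - syy + np.hypot(sxx - syy, 2 * sxy), 2 * sxy)
    # Standard deviations along and across the rotated axis.
    s1 = np.sqrt(((dx * np.cos(theta) - dy * np.sin(theta)) ** 2).sum() / n)
    s2 = np.sqrt(((dx * np.sin(theta) + dy * np.cos(theta)) ** 2).sum() / n)
    return center, s1, s2, theta
```

Parallelizing this under Spark amounts to computing the per-partition sums (n, Σx, Σy, Σx², Σy², Σxy) with a reduce, then evaluating the same closed-form expressions on the driver.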

  19. A WEB-BASED FRAMEWORK FOR VISUALIZING INDUSTRIAL SPATIOTEMPORAL DISTRIBUTION USING STANDARD DEVIATIONAL ELLIPSE AND SHIFTING ROUTES OF GRAVITY CENTERS

    Directory of Open Access Journals (Sweden)

    Y. Song

    2017-09-01

    Full Text Available Analysing the spatiotemporal distribution patterns of different industries and their dynamics can help us learn the macro-level developing trends of those industries, and in turn provides references for industrial spatial planning. However, the analysis process is a challenging task which requires an easy-to-understand information presentation mechanism and a powerful computational technology to support visual analytics of big data on the fly. For this reason, this research proposes a web-based framework to meet such a visual analytics requirement. The framework uses the standard deviational ellipse (SDE) and shifting routes of gravity centers to show the spatial distribution and yearly developing trends of different enterprise types according to their industry categories. The calculation of gravity centers and ellipses is parallelized using Apache Spark to accelerate the processing. In the experiments, we use the enterprise registration dataset of Mainland China from 1960 to 2015, which contains fine-grained location information (i.e., coordinates of each individual enterprise), to demonstrate the feasibility of this framework. The experiment results show that the developed visual analytics method is helpful for understanding the multi-level patterns and developing trends of different industries in China. Moreover, the proposed framework can be used to analyse any natural or social spatiotemporal point process with large data volume, such as crime and disease.

  20. TU-H-CAMPUS-IeP1-05: A Framework for the Analytic Calculation of Patient-Specific Dose Distribution Due to CBCT Scan for IGRT

    Energy Technology Data Exchange (ETDEWEB)

    Youn, H; Jeon, H; Nam, J; Lee, J; Lee, J [Pusan National University Yangsan Hospital, Yangsan, Gyeongsangnam-do (Korea, Republic of); Kim, J; Kim, H [Pusan National University, Busan (Korea, Republic of); Cho, M; Yun, S [Samsung electronics Co., Suwon, Gyeonggi-do (Korea, Republic of); Park, D; Kim, W; Ki, Y; Kim, D [Pusan National University Hospital, Busan (Korea, Republic of)

    2016-06-15

    Purpose: To investigate the feasibility of an analytic framework to estimate patients’ absorbed dose distributions owing to daily cone-beam CT scans for image-guided radiation treatment. Methods: To compute the total absorbed dose distribution, we separated the framework into primary and scattered dose calculations. Using source parameters such as voltage, current, and bowtie filtration, for the primary dose calculation we simulated the forward projection from the source to each voxel of an imaging object including some inhomogeneous inserts. Then we calculated the primary absorbed dose at each voxel based on the absorption probability deduced from the HU values and Beer’s law. In sequence, all voxels constructing the phantom were regarded as secondary sources radiating scattered photons for the scattered dose calculation. Details of the forward projection were identical to those of the previous step. The secondary source intensities were given by using scatter-to-primary ratios provided by NIST. In addition, we compared the analytically calculated dose distribution with Monte Carlo simulation results. Results: The suggested framework for absorbed dose estimation successfully provided the primary and secondary dose distributions of the phantom. Moreover, our analytic dose calculations and the Monte Carlo calculations agreed well with each other, even near the inhomogeneous inserts. Conclusion: This work indicated that our framework can be an effective tool for monitoring a patient’s exposure owing to cone-beam CT scans for image-guided radiation treatment. Therefore, we expect that patient over-exposure during IGRT might be prevented by our framework.

  1. TU-H-CAMPUS-IeP1-05: A Framework for the Analytic Calculation of Patient-Specific Dose Distribution Due to CBCT Scan for IGRT

    International Nuclear Information System (INIS)

    Youn, H; Jeon, H; Nam, J; Lee, J; Lee, J; Kim, J; Kim, H; Cho, M; Yun, S; Park, D; Kim, W; Ki, Y; Kim, D

    2016-01-01

    Purpose: To investigate the feasibility of an analytic framework to estimate patients’ absorbed dose distributions owing to daily cone-beam CT scans for image-guided radiation treatment. Methods: To compute the total absorbed dose distribution, we separated the framework into primary and scattered dose calculations. Using source parameters such as voltage, current, and bowtie filtration, for the primary dose calculation we simulated the forward projection from the source to each voxel of an imaging object including some inhomogeneous inserts. Then we calculated the primary absorbed dose at each voxel based on the absorption probability deduced from the HU values and Beer’s law. In sequence, all voxels constructing the phantom were regarded as secondary sources radiating scattered photons for the scattered dose calculation. Details of the forward projection were identical to those of the previous step. The secondary source intensities were given by using scatter-to-primary ratios provided by NIST. In addition, we compared the analytically calculated dose distribution with Monte Carlo simulation results. Results: The suggested framework for absorbed dose estimation successfully provided the primary and secondary dose distributions of the phantom. Moreover, our analytic dose calculations and the Monte Carlo calculations agreed well with each other, even near the inhomogeneous inserts. Conclusion: This work indicated that our framework can be an effective tool for monitoring a patient’s exposure owing to cone-beam CT scans for image-guided radiation treatment. Therefore, we expect that patient over-exposure during IGRT might be prevented by our framework.

  2. Large Survey Database: A Distributed Framework for Storage and Analysis of Large Datasets

    Science.gov (United States)

    Juric, Mario

    2011-01-01

    The Large Survey Database (LSD) is a Python framework and DBMS for distributed storage, cross-matching and querying of large survey catalogs (>10^9 rows, >1 TB). The primary driver behind its development is the analysis of Pan-STARRS PS1 data. It is specifically optimized for fast queries and parallel sweeps of positionally and temporally indexed datasets. It transparently scales to more than 10^2 nodes, and can be made to function in "shared nothing" architectures. An LSD database consists of a set of vertically and horizontally partitioned tables, physically stored as compressed HDF5 files. Vertically, we partition the tables into groups of related columns ("column groups"), storing together logically related data (e.g., astrometry, photometry). Horizontally, the tables are partitioned into partially overlapping "cells" by position in space (lon, lat) and time (t). This organization allows for fast lookups based on spatial and temporal coordinates, as well as data and task distribution. The design was inspired by the success of Google BigTable (Chang et al., 2006). Our programming model is a pipelined extension of MapReduce (Dean and Ghemawat, 2004). An SQL-like query language is used to access data. For complex tasks, map-reduce "kernels" that operate on query results on a per-cell basis can be written, with the framework taking care of scheduling and execution. The combination leverages users' familiarity with SQL, while offering a fully distributed computing environment. LSD adds little overhead compared to direct Python file I/O. In tests, we swept through 1.1 Grows (billion rows) of PanSTARRS+SDSS data (220 GB) in less than 15 minutes on a dual CPU machine. In a cluster environment, we achieved bandwidths of 17 Gbit/s (I/O limited). Based on current experience, we believe LSD should scale to be useful for analysis and storage of LSST-scale datasets. It can be downloaded from http://mwscience.net/lsd.
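The per-cell map-reduce programming model can be sketched in pure Python. The cell partitioning and kernel signature below are illustrative stand-ins for LSD's actual API (and omit the temporal axis):

```python
from collections import defaultdict

def cell_of(row, cell_size=10.0):
    """Assign a row to a (lon, lat) cell, mimicking LSD's spatial partitioning."""
    return (int(row["lon"] // cell_size), int(row["lat"] // cell_size))

def run_kernel(rows, map_kernel, reduce_fn):
    """Group rows into cells, run map_kernel once per cell, reduce the results.
    In LSD the per-cell map calls run in parallel across nodes."""
    cells = defaultdict(list)
    for row in rows:
        cells[cell_of(row)].append(row)
    partials = [map_kernel(cell, data) for cell, data in sorted(cells.items())]
    return reduce_fn(partials)

# Toy catalog and kernel: count rows per cell, then sum the per-cell counts.
rows = [{"lon": 1.0, "lat": 2.0}, {"lon": 15.0, "lat": 2.0}, {"lon": 1.5, "lat": 2.5}]
total = run_kernel(rows, lambda cell, data: len(data), sum)
```

Because each kernel invocation sees only one cell's rows, the framework is free to schedule cells across workers, which is what makes the partially overlapping cell layout the unit of both storage and task distribution.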

  3. Efficient identification of opportunities for Distributed Generation based on Smart Grid Technology

    DEFF Research Database (Denmark)

    Mutule, Anna; Obushevs, Artjoms; Lvov, Aleksandr

    2013-01-01

    The paper presents the main goals and achievements of the Smart Grids ERA-NET project named “Efficient identification of opportunities for Distributed Generation based on Smart Grid Technology (SmartGen)” during the second stage of project implementation. A description of the Smart Grid Technology (SGT) models developed within the framework of the project is given. The performed study cases, in which the SGT models were implemented to analyze the impact on the electrical grid, are discussed.

  4. A Framework for Sentiment Analysis Implementation of Indonesian Language Tweet on Twitter

    Science.gov (United States)

    Asniar; Aditya, B. R.

    2017-01-01

    Sentiment analysis is the process of understanding, extracting, and processing textual data automatically to obtain information. Sentiment analysis can be used to see opinions on an issue and identify responses to something. Millions of digital records remain unused even though they could provide useful information, especially for government. Sentiment analysis in government is used to monitor government work programs, such as those of the Government of Bandung City, through social media data. The analysis can be used quickly as a tool to see the public response to the work programs, so that the next strategic steps can be taken. This paper adopts the Support Vector Machine as a supervised algorithm for sentiment analysis. It presents a framework for the sentiment analysis of Indonesian-language tweets on Twitter for the work programs of the Government of Bandung City. The results of this paper can be a reference for decision making in local government.
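As a rough illustration of the supervised approach, the sketch below trains a linear SVM with the Pegasos subgradient method on a toy bag-of-words dataset. The vocabulary and training examples are invented for illustration (English stand-ins, not the paper's Indonesian tweets), and a production pipeline would use a library SVM plus language-specific preprocessing instead.

```python
import random

VOCAB = ["good", "great", "bad", "slow"]  # hypothetical toy vocabulary

def featurize(text, vocab=VOCAB):
    """Bag-of-words count vector over a fixed vocabulary."""
    tokens = text.lower().split()
    return [float(tokens.count(w)) for w in vocab]

def train_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Linear SVM trained with the Pegasos subgradient method.
    Labels in y must be +1 (positive sentiment) or -1 (negative)."""
    rng = random.Random(seed)
    w = [0.0] * len(X[0])
    t = 0
    for _ in range(epochs):
        order = list(range(len(X)))
        rng.shuffle(order)
        for i in order:
            t += 1
            eta = 1.0 / (lam * t)
            margin = y[i] * sum(wj * xj for wj, xj in zip(w, X[i]))
            scale = 1.0 - eta * lam
            if margin < 1.0:  # hinge-loss subgradient step
                w = [scale * wj + eta * y[i] * xj for wj, xj in zip(w, X[i])]
            else:             # only the regularizer shrinks w
                w = [scale * wj for wj in w]
    return w

def predict(w, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) >= 0.0 else -1

train = [("program good", 1), ("service great", 1),
         ("road bad", -1), ("response slow bad", -1)]
X = [featurize(t) for t, _ in train]
y = [label for _, label in train]
w = train_svm(X, y)
```

After training, `predict(w, featurize(tweet))` classifies a new tweet as positive or negative sentiment.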

  5. A framework to describe, analyze and generate interactive motor behaviors.

    Directory of Open Access Journals (Sweden)

    Nathanaël Jarrassé

    Full Text Available While motor interaction between a robot and a human, or between humans, has important implications for society as well as promising applications, little research has been devoted to its investigation. In particular, it is important to understand the different ways two agents can interact and generate suitable interactive behaviors. Towards this end, this paper introduces a framework for the description and implementation of interactive behaviors of two agents performing a joint motor task. A taxonomy of interactive behaviors is introduced, which can classify tasks and cost functions that represent the way each agent interacts. The role of an agent interacting during a motor task can be directly explained from the cost function this agent is minimizing and the task constraints. The novel framework is used to interpret and classify previous works on human-robot motor interaction. Its implementation power is demonstrated by simulating representative interactions of two humans. It also enables us to interpret and explain the role distribution and switching between roles when performing joint motor tasks.

  6. COSO internal control integrated framework 2013

    CERN Document Server

    American Institute of Certified Public Accountants

    2013-01-01

    Issued by the Committee of Sponsoring Organizations of the Treadway Commission (COSO), the 2013 Internal Control – Integrated Framework (Framework) is expected to help organizations design and implement internal control in light of many changes in business and operating environments since the issuance of the original Framework in 1992. The new Framework retains the core definition of internal control and the five components of internal control, and it continues to emphasize the importance of management judgment in designing, implementing, and conducting a system of internal control, and in assessing its effectiveness. It broadens the application of internal control in addressing operations and reporting objectives, and clarifies the requirements for determining what constitutes effective internal control.

  7. EFFICIENT LIDAR POINT CLOUD DATA MANAGING AND PROCESSING IN A HADOOP-BASED DISTRIBUTED FRAMEWORK

    Directory of Open Access Journals (Sweden)

    C. Wang

    2017-10-01

    Full Text Available Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering and others. However, it is challenging to efficiently store, query and analyze the high-resolution 3D LiDAR data due to its volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, which takes advantage of Hadoop’s storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to conduct the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experiment results show that the proposed framework can efficiently manage and process big LiDAR data.

  8. Efficient LIDAR Point Cloud Data Managing and Processing in a Hadoop-Based Distributed Framework

    Science.gov (United States)

    Wang, C.; Hu, F.; Sha, D.; Han, X.

    2017-10-01

    Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering and others. However, it is challenging to efficiently store, query and analyze the high-resolution 3D LiDAR data due to its volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, which takes advantage of Hadoop's storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to conduct the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experiment results show that the proposed framework can efficiently manage and process big LiDAR data.
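A Hadoop-streaming-style map/reduce pass over LiDAR points might look like the following sketch, which bins points into tiles and keeps the minimum elevation per tile (a crude ground-surface estimate). The "x y z" record format, the tile size, and the choice of analysis are assumptions for illustration, not details from the paper:

```python
from collections import defaultdict

def mapper(line, tile_size=50.0):
    """Map step: parse an 'x y z' point record and emit ((tile_x, tile_y), z)."""
    x, y, z = map(float, line.split())
    return (int(x // tile_size), int(y // tile_size)), z

def reducer(pairs):
    """Reduce step: minimum elevation per tile across all emitted pairs."""
    tiles = defaultdict(list)
    for key, z in pairs:
        tiles[key].append(z)
    return {key: min(zs) for key, zs in tiles.items()}

# Toy point cloud: three points, two tiles.
lines = ["10 10 5.2", "20 30 4.8", "120 10 7.1"]
ground = reducer(mapper(l) for l in lines)
```

Under Hadoop, the mapper and reducer would run as separate processes reading from stdin, with the framework handling the shuffle of keys between them; PCL algorithms slot in where the per-tile reduce logic sits here.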

  9. Implementation of force distribution analysis for molecular dynamics simulations

    Directory of Open Access Journals (Sweden)

    Seifert Christian

    2011-04-01

    Full Text Available Background: The way mechanical stress is distributed inside and propagated by proteins and other biopolymers largely defines their function. Yet, determining the network of interactions propagating internal strain remains a challenge for both experiment and theory. Based on molecular dynamics simulations, we developed force distribution analysis (FDA), a method that allows visualizing strain propagation in macromolecules. Results: To be immediately applicable to a wide range of systems, FDA was implemented as an extension to Gromacs, a commonly used package for molecular simulations. The FDA code comes with an easy-to-use command line interface and can be applied directly to every system built using Gromacs. We provide an additional R package with functions for advanced statistical analysis and presentation of the FDA data. Conclusions: Using FDA, we were able to explain the origin of mechanical robustness in immunoglobulin domains and silk fibers. By elucidating propagation of internal strain upon ligand binding, we previously also successfully revealed the functionality of a stiff allosteric protein. FDA thus has the potential to be a valuable tool in the investigation and rational design of mechanical properties in proteins and nano-materials.

  10. Implementation of density-based solver for all speeds in the framework of OpenFOAM

    Science.gov (United States)

    Shen, Chun; Sun, Fengxian; Xia, Xinlin

    2014-10-01

    In the framework of the open source CFD code OpenFOAM, a density-based solver for all-speed flow fields is developed. In this solver the preconditioned all-speeds AUSM+(P) scheme is adopted and the dual time scheme is implemented to complete the unsteady process. Parallel computation can be employed to accelerate the solving process. Different interface reconstruction algorithms are implemented, and their accuracy with respect to convection is compared. Three benchmark tests of lid-driven cavity flow, flow crossing over a bump, and flow over a forward-facing step are presented to show the accuracy of the AUSM+(P) solver for low-speed incompressible flow, transonic flow, and supersonic/hypersonic flow. Firstly, for the lid-driven cavity flow, the computational results obtained by different interface reconstruction algorithms are compared. It is indicated that the one-dimensional reconstruction scheme adopted in this solver possesses high accuracy and that the solver developed in this paper can effectively capture the features of low-speed incompressible flow. Then, via the test cases of flow crossing over a bump and over a forward-facing step, the ability to capture the characteristics of transonic and supersonic/hypersonic flows is confirmed. The forward-facing step proves to be the most challenging for the preconditioned solvers with and without the dual time scheme. Nonetheless, the solvers described in this paper reproduce the main features of this flow, including the evolution of the initial transient.

  11. A Framework to Evaluate Ecological and Social Outcomes of Collaborative Management: Lessons from Implementation with a Northern Arizona Collaborative Group

    Science.gov (United States)

    Muñoz-Erickson, Tischa A.; Aguilar-González, Bernardo; Loeser, Matthew R. R.; Sisk, Thomas D.

    2010-01-01

    As collaborative groups gain popularity as an alternative means for addressing conflict over management of public lands, the need for methods to evaluate their effectiveness in achieving ecological and social goals increases. However, frameworks that examine both effectiveness of the collaborative process and its outcomes are poorly developed or altogether lacking. This paper presents and evaluates the utility of the holistic ecosystem health indicator (HEHI), a framework that integrates multiple ecological and socioeconomic criteria to evaluate management effectiveness of collaborative processes. Through the development and application of the HEHI to a collaborative in northern Arizona, the Diablo Trust, we present the opportunities and challenges in using this framework to evaluate the ecological and social outcomes of collaborative adaptive management. Baseline results from the first application of the HEHI are presented as an illustration of its potential as a co-adaptive management tool. We discuss lessons learned from the process of selecting indicators and potential issues to their long-term implementation. Finally, we provide recommendations for applying this framework to monitoring and adaptive management in the context of collaborative management.

  12. Global Framework for Climate Services (GFCS): status of implementation

    Science.gov (United States)

    Lucio, Filipe

    2015-04-01

    The World Climate Conference-3 (Geneva, 2009) unanimously decided to establish the Global Framework for Climate Services (GFCS), a UN-led initiative spearheaded by WMO to guide the development and application of science-based climate information and services in support of decision-making in climate-sensitive sectors. By promoting science-based decision-making, the GFCS is empowering governments, communities and companies to build climate resilience, reduce vulnerabilities and adapt to impacts. The initial priority areas of GFCS are Agriculture and Food Security; Disaster Risk Reduction; Health; and Water Resources. The implementation of GFCS is well underway, with a governance structure now fully established. The governance structure of GFCS includes the Partner Advisory Committee (PAC), which is GFCS's stakeholder engagement mechanism. The membership of the PAC allows for a broad participation of stakeholders. The European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT), the European Commission (EC), the Food and Agriculture Organization of the UN (FAO), the Global Water Partnership (GWP), the International Federation of Red Cross and Red Crescent Societies (IFRC), the International Union of Geodesy and Geophysics (IUGG), the United Nations Environment Programme (UNEP), the United Nations Institute for Training and Research (UNITAR), the World Business Council for Sustainable Development (WBCSD), the World Food Programme (WFP) and WMO have already joined the PAC. Activities are being implemented in various countries in Africa, the Caribbean, Asia and Pacific Small Island Developing States through flagship projects and activities in the four priority areas of GFCS to enable the development of a proof of concept. The focus at the national level is on strengthening the institutional capacities needed for the co-design and co-production of climate services and their application in support of decision-making in climate-sensitive sectors.

  13. Adaptivna digitalna sita v strukturi porazdeljene aritmetike: Adaptive digital filter implementation with distributed arithmetic structure:

    OpenAIRE

    Babič, Rudolf; Horvat, Bogomir; Osebik, Davorin

    2001-01-01

    Adaptive digital filters have a wide range of applications in the area of signal processing where only minimum a priori knowledge of signal characteristics is available. In this article an adaptive FIR digital filter implementation based on the distributed arithmetic technique is described. The major problem with conventional adaptive digital filters is the need for fast multipliers when using a hardware implementation. These multipliers take up a disproportionate amount of the overall cost...

  14. A comparison between the implementations of risk regulations in The Netherlands and France under the framework of the EC SEVESO II directive

    NARCIS (Netherlands)

    Ham, J.M.; Meulenbrugge, J.J.; Versloot, N.H.A.; Dechy, N.; Lecoze, J.-C.; Salvi, O.

    2006-01-01

    The SEVESO II directive has created a common framework for the European member states for the implementation of risk management strategies that require the introduction of various dimensions, ranging from technical to organisational ones. Local regulations in the countries have, however, diverse histories

  15. Reporting on the Strategies Needed to Implement Proven Interventions: An Example From a "Real-World" Cross-Setting Implementation Study.

    Science.gov (United States)

    Gold, Rachel; Bunce, Arwen E; Cohen, Deborah J; Hollombe, Celine; Nelson, Christine A; Proctor, Enola K; Pope, Jill A; DeVoe, Jennifer E

    2016-08-01

    The objective of this study was to empirically demonstrate the use of a new framework for describing the strategies used to implement quality improvement interventions and provide an example that others may follow. Implementation strategies are the specific approaches, methods, structures, and resources used to introduce and encourage uptake of a given intervention's components. Such strategies have not been regularly reported in descriptions of interventions' effectiveness, or in assessments of how proven interventions are implemented in new settings. This lack of reporting may hinder efforts to successfully translate effective interventions into "real-world" practice. A recently published framework was designed to standardize reporting on implementation strategies in the implementation science literature. We applied this framework to describe the strategies used to implement a single intervention in its original commercial care setting, and when implemented in community health centers from September 2010 through May 2015. Per this framework, the target (clinic staff) and outcome (prescribing rates) remained the same across settings; the actor, action, temporality, and dose were adapted to fit local context. The framework proved helpful in articulating which of the implementation strategies were kept constant and which were tailored to fit diverse settings, and simplified our reporting of their effects. Researchers should consider consistently reporting this information, which could be crucial to the success or failure of implementing proven interventions effectively across diverse care settings. clinicaltrials.gov Identifier: NCT02299791. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  16. Automated Energy Distribution and Reliability System: Validation Integration - Results of Future Architecture Implementation

    Energy Technology Data Exchange (ETDEWEB)

    Buche, D. L.

    2008-06-01

    This report describes Northern Indiana Public Service Co. project efforts to develop an automated energy distribution and reliability system. The purpose of this project was to implement a database-driven GIS solution that would manage all of the company's gas, electric, and landbase objects. This report is the second in a series of reports detailing this effort.

  17. A framework for establishing the technical efficiency of Electricity Distribution Counties (EDCs) using Data Envelopment Analysis

    International Nuclear Information System (INIS)

    Mullarkey, Shane; Caulfield, Brian; McCormack, Sarah; Basu, Biswajit

    2015-01-01

    Highlights: • Six models are employed to establish the technical efficiency of Electricity Distribution Counties. • A diagnostic parameter is incorporated to account for differences across Electricity Distribution Counties. • The amalgamation of Electricity Distribution Counties leads to improved efficiency in the production of energy. - Abstract: European energy market liberalization has entailed the restructuring of electricity power markets through the unbundling of electricity generation, transmission and distribution, and supply activities, and by introducing competition into electricity generation. Under these new electricity market regimes, it is important to have an evaluation tool capable of examining the impacts of these market changes. The adoption of Data Envelopment Analysis as a form of benchmarking for electricity distribution regulation is one method to conduct this analysis. This paper applies a Data Envelopment Analysis framework to the electricity distribution network in Ireland to explore the merits of this approach and to determine the technical efficiency and the potential scope for efficiency improvements through reorganizing and amalgamating the distribution network in Ireland. The results presented show that overall grid efficiency is improved through this restructuring. A diagnostic parameter is defined and pursued to account for aberrations across Electricity Distribution Counties, as opposed to the traditionally employed environmental variables. The adoption of this diagnostic parameter leads to a more intuitive understanding of Electricity Distribution Counties.

  18. Fast implementation of length-adaptive privacy amplification in quantum key distribution

    International Nuclear Information System (INIS)

    Zhang Chun-Mei; Li Mo; Huang Jing-Zheng; Li Hong-Wei; Li Fang-Yi; Wang Chuan; Yin Zhen-Qiang; Chen Wei; Han Zhen-Fu; Treeviriyanupab Patcharapong; Sripimanwat Keattisak

    2014-01-01

    Post-processing is indispensable in quantum key distribution (QKD), which is aimed at sharing secret keys between two distant parties. It mainly consists of key reconciliation and privacy amplification, which is used for sharing the same keys and for distilling unconditional secret keys. In this paper, we focus on speeding up the privacy amplification process by choosing a simple multiplicative universal class of hash functions. By constructing an optimal multiplication algorithm based on four basic multiplication algorithms, we give a fast software implementation of length-adaptive privacy amplification. “Length-adaptive” indicates that the implementation of privacy amplification automatically adapts to different lengths of input blocks. When the lengths of the input blocks are 1 Mbit and 10 Mbit, the speed of privacy amplification can be as fast as 14.86 Mbps and 10.88 Mbps, respectively. Thus, it is practical for GHz or even higher repetition frequency QKD systems. (general)
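A multiplicative universal hash for privacy amplification can be sketched as a multiply-shift compression of the reconciled key. The paper's optimized four-way multiplication algorithm is not reproduced here; the function below is only a schematic member of a multiplicative (multiply-shift) universal family, with illustrative names:

```python
import secrets

def privacy_amplification(key_bits, out_len, a=None):
    """Compress an n-bit reconciled key string to out_len bits with the
    multiply-shift hash h_a(x) = (a * x mod 2^n) >> (n - out_len).
    The odd n-bit multiplier 'a' must be shared by both parties; here it is
    drawn at random if not supplied."""
    n = len(key_bits)
    assert 0 < out_len <= n
    if a is None:
        a = secrets.randbits(n) | 1  # random odd multiplier
    x = int(key_bits, 2)
    h = (a * x) % (1 << n) >> (n - out_len)
    return format(h, "0{}b".format(out_len))
```

The single big-integer multiplication `a * x` is exactly the operation whose software implementation the paper accelerates; "length-adaptive" then just means n and out_len are taken from the actual input block rather than fixed.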

  19. Distributed analysis environment for HEP and interdisciplinary applications

    International Nuclear Information System (INIS)

    Moscicki, J.T.

    2003-01-01

    The huge data volumes of Large Hadron Collider experiments require parallel end-user analysis on clusters of hundreds of machines. While the focus of end-user High-Energy Physics analysis is on ntuples, the master-worker model of parallel processing may also be used in other contexts such as detector simulation. The aim of the DIANE R&D project (http://cern.ch/it-proj-diane), currently carried out by the CERN IT/API group, is to create a generic, component-based framework for distributed, parallel data processing in the master-worker model. Pre-compiled user analysis code is loaded dynamically at runtime from component libraries and called back when appropriate. Such an application-oriented framework must be flexible enough to integrate with the emerging GRID technologies as they become available in the time to come. Therefore, common services such as environment reconstruction, code distribution, load balancing and authentication are designed and implemented as pluggable modules. This approach makes it easy to replace them with modules implemented with newer technology as necessary. The paper gives an overview of the DIANE architecture and explains the main design choices. Selected examples of diverse applications from a variety of domains applicable to DIANE are presented, as well as preliminary benchmarking results.
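The master-worker model DIANE builds on can be sketched with a thread-based task queue: the master enqueues independent work units (e.g. ntuple chunks), workers pull and process them, and results are collected centrally. This is a generic illustration, not DIANE's actual component API:

```python
import queue
import threading

def run_master_worker(tasks, worker_fn, n_workers=4):
    """Minimal master-worker skeleton: the master queues tasks, workers pull
    and process them concurrently, results are collected under a lock."""
    task_q, results = queue.Queue(), []
    lock = threading.Lock()

    def worker():
        while True:
            try:
                item = task_q.get_nowait()
            except queue.Empty:
                return  # no work left; worker exits
            out = worker_fn(item)
            with lock:
                results.append(out)

    for t in tasks:          # master fills the queue up front
        task_q.put(t)
    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return results

# e.g. "analysis" of 10 independent chunks
outputs = run_master_worker(range(10), lambda x: x * x)
```

In a distributed setting the queue and result collection become network services, which is where pluggable modules for code distribution, load balancing and authentication attach.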

  20. The performances of R GPU implementations of the GMRES method

    Directory of Open Access Journals (Sweden)

    Bogdan Oancea

    2018-03-01

    Full Text Available Although the performance of commodity computers has improved drastically with the introduction of multicore processors and GPU computing, the standard R distribution is still based on a single-threaded model of computation, using only a small fraction of the computational power now available on most desktops and laptops. Modern statistical software packages rely on high-performance implementations of the linear algebra routines that are at the core of several important leading-edge statistical methods. In this paper we present a GPU implementation of the GMRES iterative method for solving linear systems. We compare the performance of this implementation with a pure single-threaded CPU version. We also investigate the performance of our implementation using different GPU packages now available for R, such as gmatrix, gputools or gpuR, which are based on the CUDA or OpenCL frameworks.
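    As a point of reference for what the solver does, a minimal CPU run of GMRES on a small sparse system can look like the sketch below (using SciPy rather than the R packages benchmarked in the paper; the test matrix is an arbitrary choice):

```python
# CPU reference run of GMRES on a diagonally dominant tridiagonal system
# (SciPy stand-in for the R/GPU packages benchmarked in the paper).
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import gmres

n = 100
A = diags([-1.0, 4.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

x, info = gmres(A, b)            # info == 0 signals convergence
assert info == 0
assert np.linalg.norm(A @ x - b) < 1e-3   # residual within default tolerance
```

    GPU versions of the method keep the same outer iteration and move the dominant kernels (sparse matrix-vector products and dense orthogonalization) onto the device.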

  1. Implementation of the framework convention on tobacco control in Africa: current status of legislation.

    Science.gov (United States)

    Tumwine, Jacqueline

    2011-11-01

    To describe, as of July 2011, the status of tobacco control legislation in Africa in three key areas of the Framework Convention on Tobacco Control (FCTC)-(1) Protection from exposure to tobacco smoke, (2) Packaging and labelling of tobacco products, and (3) Tobacco advertising, promotion and sponsorship. Review and analysis of tobacco control legislation in Africa, media reports, journal articles, tobacco industry documents and data published in the 2011 WHO Report on the Global Tobacco Epidemic. Modest progress in FCTC implementation in Africa with many countries having legislation or policies on the protection from exposure to tobacco smoke, however, only a handful of countries meet the standards of the FCTC Article 8 and its Guidelines particularly with regards to designated smoking areas. Little progress on packaging and labelling of tobacco products, with few countries having legislation meeting the minimum standards of the FCTC Article 11 and its Guidelines. Mauritius is the only African country with graphic or pictorial health warnings in place and has the largest warning labels in Africa. Slightly better progress in banning tobacco advertising, promotion and sponsorship has been shown by African countries, although the majority of legislation falls short of the standards of the FCTC Article 13 and its Guidelines. Despite their efforts, African countries' FCTC implementation at national level has not matched the strong regional commitment demonstrated during the FCTC treaty negotiations. This study highlights the need for Africa to step up efforts to adopt and implement effective tobacco control legislation that is fully compliant with the FCTC. In order to achieve this, countries should prioritise resources for capacity building for drafting strong FCTC compliant legislation, research to inform policy and boost political will, and countering the tobacco industry which is a major obstacle to FCTC implementation in Africa.

  2. Implementation of the Framework Convention on Tobacco Control in Africa: Current Status of Legislation

    Directory of Open Access Journals (Sweden)

    Jacqueline Tumwine

    2011-11-01

    Full Text Available Objective: To describe, as of July 2011, the status of tobacco control legislation in Africa in three key areas of the Framework Convention on Tobacco Control (FCTC—(1 Protection from exposure to tobacco smoke, (2 Packaging and labelling of tobacco products, and (3 Tobacco advertising, promotion and sponsorship. Methods: Review and analysis of tobacco control legislation in Africa, media reports, journal articles, tobacco industry documents and data published in the 2011 WHO Report on the Global Tobacco Epidemic. Results: Modest progress in FCTC implementation in Africa with many countries having legislation or policies on the protection from exposure to tobacco smoke, however, only a handful of countries meet the standards of the FCTC Article 8 and its Guidelines particularly with regards to designated smoking areas. Little progress on packaging and labelling of tobacco products, with few countries having legislation meeting the minimum standards of the FCTC Article 11 and its Guidelines. Mauritius is the only African country with graphic or pictorial health warnings in place and has the largest warning labels in Africa. Slightly better progress in banning tobacco advertising, promotion and sponsorship has been shown by African countries, although the majority of legislation falls short of the standards of the FCTC Article 13 and its Guidelines. Despite their efforts, African countries’ FCTC implementation at national level has not matched the strong regional commitment demonstrated during the FCTC treaty negotiations. Conclusion: This study highlights the need for Africa to step up efforts to adopt and implement effective tobacco control legislation that is fully compliant with the FCTC. In order to achieve this, countries should prioritise resources for capacity building for drafting strong FCTC compliant legislation, research to inform policy and boost political will, and countering the tobacco industry which is a major obstacle to FCTC implementation in Africa.

  3. Building an Ensemble Seismic Hazard Model for the Magnitude Distribution by Using Alternative Bayesian Implementations

    Science.gov (United States)

    Taroni, M.; Selva, J.

    2017-12-01

    In this work we show how we built an ensemble seismic hazard model for the magnitude distribution for the TSUMAPS-NEAM EU project (http://www.tsumaps-neam.eu/). The considered source area includes the whole NEAM region (North East Atlantic, Mediterranean and connected seas). We build our models by using the catalogs (EMEC and ISC), their completeness and the regionalization provided by the project. We developed four alternative implementations of a Bayesian model, considering tapered or truncated Gutenberg-Richter distributions, and fixed or variable b-value. The frequency-size distribution is based on the Weichert formulation. This allows for simultaneously assessing all the frequency-size distribution parameters (a-value, b-value, and corner magnitude), using multiple completeness periods for the different magnitudes. With respect to previous studies, we introduce the tapered Pareto distribution (in addition to the classical truncated Pareto), and we build a novel approach to quantify the prior distribution. For each alternative implementation, we set the prior distributions using the global seismic data grouped according to the different types of tectonic setting, and assigned them to the related regions. The estimation is based on the complete (not declustered) local catalog in each region. Using the complete catalog also allows us to consider foreshocks and aftershocks in the seismic rate computation: the Poissonicity of the tsunami events (and, similarly, of the exceedances of the PGA) is ensured by Le Cam's theorem. This Bayesian approach provides robust estimations even in zones where few events are available, while also allowing us to explore the uncertainty associated with the estimation of the magnitude distribution parameters (e.g. with the classical Metropolis-Hastings Monte Carlo method). Finally, we merge all the models with their uncertainty to create the ensemble model that represents our knowledge of the seismicity in the NEAM region.
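    The tapered Pareto mentioned above has a simple closed-form survival function for seismic moment, S(M) = (M_t/M)^beta * exp((M_t - M)/M_c) for M >= M_t; the sketch below evaluates it with illustrative parameter values, not the ones estimated in the study.

```python
# Tapered Pareto survival function for seismic moment (Kagan-style taper);
# parameter values below are illustrative, not the study's estimates.
import math

def tapered_pareto_survival(M: float, M_t: float, beta: float, M_c: float) -> float:
    """P(moment > M) for M >= threshold M_t, with corner moment M_c."""
    assert M >= M_t > 0
    return (M_t / M) ** beta * math.exp((M_t - M) / M_c)

# Survival is 1 at the threshold...
assert abs(tapered_pareto_survival(1e17, M_t=1e17, beta=0.66, M_c=1e21) - 1.0) < 1e-12
# ...and the exponential taper suppresses the tail relative to a pure Pareto.
pure = (1e17 / 1e21) ** 0.66
tapered = tapered_pareto_survival(1e21, M_t=1e17, beta=0.66, M_c=1e21)
assert tapered < pure
```

    Unlike the truncated Pareto, the taper removes the tail smoothly, which avoids a hard upper cutoff on the corner magnitude.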

  4. A performance measurement framework for the South African bulk export wine supply chain

    Directory of Open Access Journals (Sweden)

    Johan B. Smit

    2017-09-01

    Full Text Available Background: Many participants in the South African wine industry still exhibit low supply chain maturity in the management of their supply chains. This hampers export performance and ultimately client satisfaction. The development and tracking of appropriate metrics are key steps in improving supply chain performance. Objectives: The purpose of this study was to develop a performance measurement framework for the South African wine industry, focussing on the bulk export segment. Method: The framework was developed using an emergent multi-phased exploratory approach. The approach was implemented in two distinct phases, namely qualitative research followed by quantitative research in each of three iterations to develop and refine the framework. In each iteration, the qualitative research phase consisted of a literature survey, semi-structured and unstructured interviews and case studies, while the quantitative research phase consisted of the development, distribution, completion and analysis of the framework questionnaire, each iteration building on the framework outputs from the previous iteration. Results: The research highlighted that the wine supply chain performance of bulk exports is hindered by the lack of a measurement culture, hampering the identification and prioritisation of interventions. The creation of a performance measurement framework in conjunction with industry, and informed by the Supply Chain Operations Reference framework, creates a platform for the industry to address these challenges. Conclusion: The implementation of this framework will provide performance visibility for cellars in the wine industry. This would enable them to improve their logistics processes and increase their supply chain maturity, ultimately enabling benchmarking against competing supply chains both within South Africa and abroad, such as in Australia, Argentina and Chile.

  5. Simulation and event reconstruction inside the PandaRoot framework

    International Nuclear Information System (INIS)

    Spataro, S

    2008-01-01

    The PANDA detector will be located at the future GSI accelerator FAIR. Its primary objective is the investigation of the strong interaction with anti-proton beams, with incoming anti-proton momenta of up to 15 GeV/c. The PANDA offline simulation framework is called 'PandaRoot', as it is based upon the ROOT 5.14 package. It is characterized by high versatility: it allows one to perform simulation and analysis, and to run different event generators (EvtGen, Pluto, UrQmd) and different transport models (Geant3, Geant4, Fluka) with the same code, so that results can be compared simply by changing a few macro lines, without recompiling at all. Moreover, auto-configuration scripts allow the full framework to be installed easily on different Linux distributions and with different compilers (the framework was installed and tested on more than 10 Linux platforms) without further manipulation. The final data are in a tree format, easily accessible and readable through simple clicks in the ROOT browser. The presentation reports on the current status of the computing development inside the PandaRoot framework, in terms of detector implementation and event reconstruction.

  6. Anionic silicate organic frameworks constructed from hexacoordinate silicon centres

    Science.gov (United States)

    Roeser, Jérôme; Prill, Dragica; Bojdys, Michael J.; Fayon, Pierre; Trewin, Abbie; Fitch, Andrew N.; Schmidt, Martin U.; Thomas, Arne

    2017-10-01

    Crystalline frameworks composed of hexacoordinate silicon species have thus far only been observed in a few high pressure silicate phases. By implementing reversible Si-O chemistry for the crystallization of covalent organic frameworks, we demonstrate the simple one-pot synthesis of silicate organic frameworks based on octahedral dianionic SiO6 building units. Clear evidence of the hexacoordinate environment around the silicon atoms is given by 29Si nuclear magnetic resonance analysis. Characterization by high-resolution powder X-ray diffraction, density functional theory calculation and analysis of the pair-distribution function showed that those anionic frameworks—M2[Si(C16H10O4)1.5], where M = Li, Na, K and C16H10O4 is 9,10-dimethylanthracene-2,3,6,7-tetraolate—crystallize as two-dimensional hexagonal layers stabilized in a fully eclipsed stacking arrangement with pronounced disorder in the stacking direction. Permanent microporosity with high surface area (up to 1,276 m2 g-1) was evidenced by gas-sorption measurements. The negatively charged backbone balanced with extra-framework cations and the permanent microporosity are characteristics that are shared with zeolites.

  7. Designing Caregiver-Implemented Shared-Reading Interventions to Overcome Implementation Barriers

    Science.gov (United States)

    Logan, Jessica R.; Damschroder, Laura

    2015-01-01

    Purpose This study presents an application of the theoretical domains framework (TDF; Michie et al., 2005), an integrative framework drawing on behavior-change theories, to speech-language pathology. Methods A multistep procedure was used to identify barriers affecting caregivers' implementation of shared-reading interventions with their children with language impairment (LI). The authors examined caregiver-level data corresponding to implementation issues from two randomized controlled trials and mapped these to domains in the TDF as well as empirically validated behavior-change techniques. Results Four barriers to implementation were identified as potentially affecting caregivers' implementation: time pressures, reading difficulties, discomfort with reading, and lack of awareness of benefits. These were mapped to 3 TDF domains: intentions, beliefs about capabilities, and skills. In turn, 4 behavior-change techniques were identified as potential vehicles for affecting these domains: reward, feedback, model, and encourage. An ongoing study is described that is determining the effects of these techniques for improving caregivers' implementation of a shared-reading intervention. Conclusions A description of the steps to identifying barriers to implementation, in conjunction with an ongoing experiment that will explicitly determine whether behavior-change techniques affect these barriers, provides a model for how implementation science can be used to identify and overcome implementation barriers in the treatment of communication disorders. PMID:26262941

  8. Where-Fi: a dynamic energy-efficient multimedia distribution framework for MANETs

    Science.gov (United States)

    Mohapatra, Shivajit; Carbunar, Bogdan; Pearce, Michael; Chaudhri, Rohit; Vasudevan, Venu

    2008-01-01

    Next generation mobile ad-hoc applications will revolve around users' need for sharing content/presence information with co-located devices. However, keeping such information fresh requires frequent meta-data exchanges, which could result in significant energy overheads. To address this issue, we propose distributed algorithms for energy efficient dissemination of presence and content usage information between nodes in mobile ad-hoc networks. First, we introduce a content dissemination protocol (called CPMP) for effectively distributing frequent small meta-data updates between co-located devices using multicast. We then develop two distributed algorithms that use the CPMP protocol to achieve "phase locked" wake up cycles for all the participating nodes in the network. The first algorithm is designed for fully-connected networks and then extended in the second to handle hidden terminals. The "phase locked" schedules are then exploited to adaptively transition the network interface to a deep sleep state for energy savings. We have implemented a prototype system (called "Where-Fi") on several Motorola Linux-based cell phone models. Our experimental results show that for all network topologies our algorithms were able to achieve "phase locking" between nodes even in the presence of hidden terminals. Moreover, we achieved battery lifetime extensions of as much as 28% for fully connected networks and about 20% for partially connected networks.
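    The "phase locking" idea can be pictured as a simple consensus loop in which every node nudges its wake-up offset toward the average offset it hears in its neighbours' meta-data updates. The toy below assumes a fully connected network and a fixed gain; it is an illustration of the idea, not the actual Where-Fi/CPMP algorithm.

```python
# Toy phase-locking loop for a fully connected network: each node moves its
# wake-up offset part-way toward the mean offset heard from its neighbours.
# (Illustration of the idea only, not the actual Where-Fi/CPMP protocol.)
def phase_lock(offsets, rounds=50, gain=0.5):
    offsets = list(offsets)
    n = len(offsets)
    for _ in range(rounds):
        mean = sum(offsets) / n                   # heard via meta-data exchanges
        offsets = [o + gain * (mean - o) for o in offsets]
    return offsets

final = phase_lock([0.0, 10.0, 25.0, 40.0])      # initial wake-up offsets in ms
assert max(final) - min(final) < 1e-6            # nodes converge to one phase
```

    Once wake-up cycles are aligned, the radio can be put into a deep sleep state between the shared wake windows, which is where the reported battery lifetime extensions come from.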

  9. Distributed Technologies in a Data Pool

    Science.gov (United States)

    Keiser, K.; Conover, H.; Graves, S.; He, Y.; Regner, K.; Smith, M.

    2004-12-01

    A Data Pool is an on-line repository providing interactive and programmatic access to data products through a variety of services. The University of Alabama in Huntsville has developed and deployed such a Data Pool in conjunction with the DISCOVER project, a collaboration with NASA and Remote Sensing Systems. DISCOVER provides long-term ocean and climate data from a variety of passive microwave satellite instruments, including such products as sea-surface temperature and wind, air temperature, atmospheric water vapor, cloud water and rain rate. The Data Pool provides multiple methods to access and visualize these products, including conventional HTTP and FTP access, as well as data services that provide for enhanced usability and interoperability, such as GridFTP, OPeNDAP, OpenGIS-compliant web mapping and coverage services, and custom subsetting and packaging services. This paper will focus on the distributed service technologies used in the Data Pool, which spans heterogeneous machines at multiple locations. For example, in order to provide seamless access to data at multiple sites, the Data Pool provides catalog services for all data products at the various data server locations. Under development is an automated metadata generation tool that crawls the online data repositories regularly to dynamically update the Data Pool catalog with information about newly generated data files. For efficient handling of data orders across distributed repositories, the Data Pool also implements distributed data processing services on the file servers where the data resides. Ontologies are planned to support automated service chaining for custom user requests. The UAH Data Pool is based on a configurable technology framework that integrates distributed data services with a web interface and a set of centralized database services for catalogs and order tracking. 
While this instantiation of the Data Pool was implemented to meet the needs of the DISCOVER project, the framework was

  10. The Development Of A Theoretical Lean Culture Causal Framework To Support The Effective Implementation Of Lean In Automotive Component Manufacturers

    Directory of Open Access Journals (Sweden)

    Van der Merwe, Karl Robert

    2014-05-01

    Full Text Available Although it is generally accepted that lean manufacturing improves operational performance, many organisations are struggling to adapt to the lean philosophy. The purpose of this study is to contribute to a more effective strategy for implementing the lean manufacturing improvement philosophy. The study sets out both to integrate well-researched findings and theories related to generic organisational culture with more recent research and experience related to lean culture, and to examine the role that culture plays in the effective implementation of lean manufacturing principles and techniques. The ultimate aim of this exercise is to develop a theoretical lean culture causal framework.

  11. Integrated evaluation framework. Based on the logical framework approach for project cycle management

    International Nuclear Information System (INIS)

    1996-11-01

    This Integrated Evaluation Framework (IEF) was developed by TC Evaluation with the aim of presenting in a comprehensive manner the logic used when evaluating projects and programmes. Thus, in the first place, the intended audience for this report are evaluation officers, so that when applying the evaluation procedures and checklists, data can be organized following a systematic and logical scheme and conclusions can be derived "objectively". The value of such a framework for reporting on performance and in providing a quality reference for disbursements represents one of its major advantages. However, when developing and applying the IEF, it was realized that a Logical Framework Approach (LFA), like the one upon which the IEF is based, needs to be followed throughout the project life cycle, from the Country Programme Framework planning stage, through project design and implementation. The benefits then flow into project design quality and smooth implementation. It is only in such an environment that meaningful and consistent evaluation can take place. Therefore, the main audience for this report are Agency staff involved in planning, designing and implementing TC projects, as well as their counterparts in Member States. With this understanding, the IEF was subjected to review by a consultants meeting, which included both external consultants and Agency staff. This Consultants Review Meeting encouraged the Secretariat to further adopt the LFA into the TC management process

  12. Distributed Geant4 simulation in medical and space science applications using DIANE framework and the GRID

    CERN Document Server

    Moscicki, J T; Mantero, A; Pia, M G

    2003-01-01

    Distributed computing is one of the most important trends in IT which has recently gained significance for large-scale scientific applications. The Distributed Analysis Environment (DIANE) is an R&D study, focusing on semi-interactive parallel and remote data analysis and simulation, which has been conducted at CERN. DIANE provides the necessary software infrastructure for parallel scientific applications in the master-worker model. Advanced error recovery policies, automatic book-keeping of distributed jobs and on-line monitoring and control tools are provided. DIANE makes transparent use of a number of different middleware implementations such as load balancing services (LSF, PBS, GRID Resource Broker, Condor) and security services (GSI, Kerberos, openssh). A number of distributed Geant4 simulations have been deployed and tested, ranging from interactive radiotherapy treatment planning using dedicated clusters in hospitals, to globally-distributed simulations of astrophysics experiments using the European data g...

  13. JT Bachman Leadership Framework

    Science.gov (United States)

    2017-07-01

    DAHLGREN DIVISION, NAVAL SURFACE WARFARE CENTER, Dahlgren, Virginia 22448-5100. Report NSWCDD/MP-17/300, JT Bachman Leadership Framework. Report type: Miscellaneous Publication; dates covered: 27 Sept 2016 - 08 June 2017. Distribution is unlimited. Abstract: This document describes the leadership framework of a civil servant.

  14. Distributed analysis environment for HEP and interdisciplinary applications

    CERN Document Server

    Moscicki, J T

    2003-01-01

    Huge data volumes of Large Hadron Collider experiments require parallel end-user analysis on clusters of hundreds of machines. While the focus of end-user High-Energy Physics analysis is on ntuples, the master-worker model of parallel processing may be also used in other contexts such as detector simulation. The aim of DIANE R&D project (http://cern.ch/it-proj-diane) currently held by CERN IT/API group is to create a generic, component-based framework for distributed, parallel data processing in master-worker model. Pre-compiled user analysis code is loaded dynamically at runtime in component libraries and called back when appropriate. Such application-oriented framework must be flexible enough to integrate with the emerging GRID technologies as they become available in the time to come. Therefore, common services such as environment reconstruction, code distribution, load balancing and authentication are designed and implemented as pluggable modules. This approach allows to easily replace them with modul...

  15. Managing ethics in higher education : implementing a code or embedding virtue ?

    OpenAIRE

    Moore, G.

    2006-01-01

    This paper reviews a publication entitled 'Ethics Matters. Managing Ethical Issues in Higher Education', which was distributed to all UK universities and equivalent (HEIs) in October 2005. The publication proposed that HEIs should put in place an institution-wide ethical policy framework, well beyond the customary focus on research ethics, together with the mechanisms necessary to ensure its implementation. Having summarised the processes that led to the publication and the publication itself...

  16. A Lightweight Distributed Framework for Computational Offloading in Mobile Cloud Computing

    Science.gov (United States)

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time-consuming and resource-intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading with the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and the turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC. PMID:25127245
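    The offloading trade-off the paper targets can be captured by a textbook cost comparison: offload a component only when upload time plus cloud execution beats local execution. The model and numbers below are illustrative assumptions, not the paper's actual framework.

```python
# Illustrative offloading decision: offload a component when the estimated
# remote cost (upload + cloud execution) beats local execution time.
# (A textbook cost model, not the paper's actual framework.)
def should_offload(data_mb: float, local_mips: float, cloud_mips: float,
                   instructions_m: float, bandwidth_mbps: float) -> bool:
    local_time = instructions_m / local_mips
    remote_time = data_mb * 8 / bandwidth_mbps + instructions_m / cloud_mips
    return remote_time < local_time

# A compute-heavy, small-payload component favours the cloud...
assert should_offload(data_mb=1, local_mips=500, cloud_mips=10000,
                      instructions_m=5000, bandwidth_mbps=10)
# ...while a data-heavy, light component stays on the device.
assert not should_offload(data_mb=500, local_mips=500, cloud_mips=10000,
                          instructions_m=100, bandwidth_mbps=10)
```

    The paper's point is that making this decision by runtime application partitioning is itself expensive on an SMD, hence its push for a lighter-weight procedure.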

  17. A lightweight distributed framework for computational offloading in mobile cloud computing.

    Directory of Open Access Journals (Sweden)

    Muhammad Shiraz

    Full Text Available The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs. Therefore, Mobile Cloud Computing (MCC leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time-consuming and resource-intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading with the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and the turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC.

  18. Internet of Things Framework for Home Care Systems

    Directory of Open Access Journals (Sweden)

    Biljana Risteska Stojkoska

    2017-01-01

    Full Text Available The increasing average age of the population in most industrialized countries imposes a necessity for developing advanced and practical services using state-of-the-art technologies, dedicated to personal living spaces. In this paper, we introduce a hierarchical distributed approach for home care systems based on a new paradigm known as the Internet of Things (IoT. The proposed generic framework is supported by a three-level data management model composed of dew computing, fog computing, and cloud computing for efficient data flow in IoT-based home care systems. We examine the proposed model through a real case scenario of an early fire detection system using a distributed fuzzy logic approach. The obtained results prove that such an implementation of dew and fog computing provides high accuracy in fire detection IoT systems, while achieving minimum data latency.
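    A fuzzy-logic fire detector can be reduced, at a single node, to membership functions plus a rule. The membership shapes and thresholds below are hypothetical illustrations; the paper's distributed dew/fog implementation of this logic is not shown.

```python
# Minimal fuzzy-logic fire indicator (hypothetical membership functions and
# thresholds; the paper's distributed dew/fog implementation is not shown).
def ramp(x, lo, hi):
    """Linear membership rising from 0 at lo to 1 at hi."""
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    return (x - lo) / (hi - lo)

def fire_risk(temp_c: float, smoke_ppm: float) -> float:
    hot = ramp(temp_c, 40.0, 80.0)          # "temperature is high"
    smoky = ramp(smoke_ppm, 50.0, 200.0)    # "smoke level is high"
    # Rule: fire IF temperature is high AND smoke is high (min t-norm).
    return min(hot, smoky)

assert fire_risk(20.0, 10.0) == 0.0         # normal room: no alarm
assert fire_risk(90.0, 300.0) == 1.0        # clear fire signature
assert 0.0 < fire_risk(60.0, 120.0) < 1.0   # borderline case
```

    In a dew/fog split, a cheap rule like this can run on the sensor node itself, with the fog layer aggregating risk values from several rooms before anything reaches the cloud.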

  19. Conceptual framework and architecture for service mediating workflow management

    NARCIS (Netherlands)

    Hu, Jinmin; Grefen, P.W.P.J.

    2003-01-01

    This paper proposes a three-layer workflow concept framework to realize workflow enactment flexibility by dynamically binding activities to their implementations at run time. A service mediating layer is added to bridge business process definition and its implementation. Based on this framework, we

  20. A PROOF Analysis Framework

    International Nuclear Information System (INIS)

    González Caballero, I; Cuesta Noriega, A; Rodríguez Marrero, A; Fernández del Castillo, E

    2012-01-01

    The analysis of the complex LHC data usually follows a standard path that aims at minimizing not only the amount of data but also the number of observables used. After a number of steps of slimming and skimming the data, the remaining few terabytes of ROOT files hold a selection of the events and a flat structure for the variables needed, which can be more easily inspected and traversed in the final stages of the analysis. PROOF arises at this point as an efficient mechanism to distribute the analysis load, by taking advantage of all the cores in modern CPUs through PROOF Lite, or by using PROOF Cluster or PROOF on Demand tools to build dynamic PROOF clusters on computing facilities with spare CPUs. However, using PROOF at the level required for a serious analysis introduces some difficulties that may scare new adopters. We have developed the PROOF Analysis Framework (PAF) to facilitate the development of new analyses by uniformly exposing the PROOF-related configurations across technologies and by taking care of the routine tasks as much as possible. We describe the details of the PAF implementation as well as how we succeeded in engaging a group of CMS physicists to use PAF as their daily analysis framework.

  1. Overcoming the Challenges of Implementing a Multi-Mission Distributed Workflow System

    Science.gov (United States)

    Sayfi, Elias; Cheng, Cecilia; Lee, Hyun; Patel, Rajesh; Takagi, Atsuya; Yu, Dan

    2009-01-01

    A multi-mission approach to solving the same problems for various projects is enticing. However, the multi-mission approach leads to the need to develop a configurable, adaptable and distributed system to meet unique project requirements. That, in turn, leads to a set of challenges, varying from handling synchronization issues to coming up with a smart design that allows the "unknowns" to be decided later. This paper discusses the challenges that the Multi-mission Automated Task Invocation Subsystem (MATIS) team has come up against while designing the distributed workflow system, and elaborates on the solutions that were implemented. The first is to design an easily adaptable system that requires no code changes as a result of configuration changes. The number of formal deliveries is often limited because each delivery costs time and money. Changes such as reordering the sequence of programs being called, or changing a parameter value in a program being automated, should not result in code changes or redelivery.
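    The "no code changes from configuration changes" goal can be sketched as a task registry driven by a configuration file: reordering steps or tweaking a parameter is then a config edit only. The schema and task names below are hypothetical, not MATIS's actual format.

```python
# Config-driven task sequencing: reordering steps or changing a parameter
# is a configuration edit, not a code change or redelivery.
# (Hypothetical schema; the actual MATIS configuration format is not shown.)
import json

REGISTRY = {
    "calibrate": lambda p, x: x * p.get("gain", 1.0),
    "smooth":    lambda p, x: round(x, p.get("digits", 2)),
}

def run_workflow(config_json: str, value: float) -> float:
    """Run the step sequence named in the config, in order."""
    config = json.loads(config_json)
    for step in config["steps"]:
        value = REGISTRY[step["task"]](step.get("params", {}), value)
    return value

cfg = '{"steps": [{"task": "calibrate", "params": {"gain": 2.5}},' \
      ' {"task": "smooth", "params": {"digits": 1}}]}'
assert run_workflow(cfg, 3.333) == 8.3
```

    Swapping the two steps, or changing `gain`, only touches the JSON document; the delivered code stays frozen.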

  2. Organization of the STAR experiment software framework at JINR. Results and experience from the first two years of work

    International Nuclear Information System (INIS)

    Arkhipkin, D.A.; Zul'karneeva, Yu.R.

    2004-01-01

    The organization of the STAR experiment software framework at JINR is described. The approach, based on the distributed file system ASF, was implemented on the NEOSTAR minicluster at LPP, JINR. The operating principle of the cluster, a description of its operation and samples of the performed analysis are also given. The results of the NEOSTAR minicluster performance have demonstrated the broad capabilities of the distributed computing concept for experimental data analysis and high-energy physics modeling.

  3. Framework and implementation of a continuous network-wide health monitoring system for roadways

    Science.gov (United States)

    Wang, Ming; Birken, Ralf; Shahini Shamsabadi, Salar

    2014-03-01

    According to the 2013 ASCE report card America's infrastructure scores only a D+. There are more than four million miles of roads (grade D) in the U.S. requiring a broad range of maintenance activities. The nation faces a monumental problem of infrastructure management in the scheduling and implementation of maintenance and repair operations, and in the prioritization of expenditures within budgetary constraints. The efficient and effective performance of these operations, however, is crucial to ensuring roadway safety, preventing catastrophic failures, and promoting economic growth. There is a critical need for technology that can cost-effectively monitor the condition of a network-wide road system and provide accurate, up-to-date information for maintenance activity prioritization. The Versatile Onboard Traffic Embedded Roaming Sensors (VOTERS) project provides a framework and the sensing capability to complement periodic localized inspections with continuous network-wide health monitoring. Research focused on the development of a cost-effective, lightweight package of multi-modal sensor systems compatible with this framework. An innovative software infrastructure is created that collects, processes, and evaluates these large time-lapse multi-modal data streams. A GIS-based control center manages multiple inspection vehicles and the data for further analysis, visualization, and decision making. VOTERS' technology can monitor road conditions at both the surface and sub-surface levels while the vehicle is navigating through daily traffic going about its normal business, thereby allowing for network-wide frequent assessment of roadways. This deterioration process monitoring at unprecedented time and spatial scales provides unique experimental data that can be used to improve life-cycle cost analysis models.

  4. Paving the Road to Success: A Framework for Implementing the Success Tutoring Approach

    Directory of Open Access Journals (Sweden)

    Spark Linda

    2017-12-01

    Full Text Available The exponential growth of higher education enrolment in South Africa has resulted in increased diversity of the student body, leading to a proliferation of factors that affect student performance and success. Various initiatives have been adopted by tertiary institutions to mitigate the negative impact these factors may have on student success, and it is suggested that interventions that include aspects of social integration are the most successful. This paper outlines an approach called Success Tutoring (a non-academic tutorial approach used as part of a student success and support programme in the Faculty of Commerce, Law, and Management at the University of the Witwatersrand), which is underscored by empirical evidence drawn from evaluation data collected during Success Tutor symposia. The authors draw conclusions and make recommendations based on a thematic analysis of the dataset, and ultimately provide readers with a framework for implementing Success Tutoring at their tertiary institutions.

  5. Global Framework for Climate Services (GFCS)

    Science.gov (United States)

    Lúcio, F.

    2012-04-01

    a set of international arrangements that will coordinate the activities and build on existing efforts to provide climate services that are truly focused on meeting user needs. It will be implemented through the development of five main components: (1) the User Interface Platform, to provide ways for climate service users and providers to interact and improve the effectiveness of the Framework and its climate services; (2) the Climate Services Information System, to produce and distribute climate data and information according to the needs of users and to agreed standards; (3) Observations and Monitoring, to develop agreements and standards for collecting and generating necessary climate data; (4) Research, Modeling and Prediction, to harness science capabilities and results to meet the needs of climate services; and (5) Capacity Building, to support the systematic development of the institutions, infrastructure and human resources needed for effective production of climate services and their application. Putting the GFCS in place will require unprecedented collaboration among agencies and across political, functional and disciplinary boundaries, and a global mobilization of effort. This communication will provide information on the benefits and the process for the development of the GFCS, as well as potential entry points for stakeholders to participate. In addition, it will highlight some of the research, modelling and prediction opportunities that will require intra-disciplinary science action.

  6. A methodological approach and framework for sustainability assessment in NGO-implemented primary health care programs.

    Science.gov (United States)

    Sarriot, Eric G; Winch, Peter J; Ryan, Leo J; Bowie, Janice; Kouletio, Michelle; Swedberg, Eric; LeBan, Karen; Edison, Jay; Welch, Rikki; Pacqué, Michel C

    2004-01-01

    An estimated 10.8 million children under 5 continue to die each year in developing countries from causes that are easily treatable or preventable. Non-governmental organizations (NGOs) are frontline implementers of low-cost and effective child health interventions, but their progress toward sustainable child health gains is a challenge to evaluate. This paper presents the Child Survival Sustainability Assessment (CSSA) methodology--a framework and process--to map progress towards sustainable child health from the community level upward. The CSSA was developed with NGOs through a participatory process of research and dialogue. Commitment to sustainability requires a systematic and systemic consideration of human, social and organizational processes beyond a purely biomedical perspective. The CSSA is organized around three interrelated dimensions of evaluation: (1) health and health services; (2) capacity and viability of local organizations; (3) capacity of the community in its social-ecological context. The CSSA uses a participatory, action-planning process, engaging a 'local system' of stakeholders in the contextual definition of objectives and indicators. Improved conditions measured in the three dimensions correspond to progress toward a sustainable health situation for the population. This framework opens new opportunities for evaluation and research design and places sustainability at the center of primary health care programming.

  7. Individual performance review in hospital practice: the development of a framework and evaluation of doctors' attitudes to its value and implementation.

    Science.gov (United States)

    Trebble, T M; Cruickshank, L; Hockey, P M; Heyworth, N; Powell, T; Clarke, N

    2013-11-01

    Appraisal, or individual performance review (IPR), is used in human resources management in the commercial and public sectors to evaluate the performance of an employee against agreed local organisational expectations and objectives, and to identify their requirements for development and effective management. IPR for NHS consultants may provide essential information for job planning, contribute towards medical appraisal for revalidation, and facilitate productivity and quality improvement. Our aims were to develop a framework for IPR for consultants, and to determine attitudes to its value, process and content. Information from commercial, public and voluntary sector models and from published and other literature sources was used to develop an IPR framework. This was assessed through a three-cycle action research methodology involving qualitative interviews with 22 consultants (predominantly with medical management roles). The domains of the IPR framework included: (1) performance against objectives; (2) behaviour and leadership; (3) talent management; (4) agreed future objectives. A number of themes were identified from the consultant interviews, including: ineffective current appraisal systems, reflecting a lack of valid performance data and allotted time; a lack of empowerment of medical managers to address performance issues; IPR as a more explicit system, offering value in evaluating doctors' performance; and the dependence of successful implementation on the engagement of the Trust executive. IPR may have value for the performance evaluation of consultants, contributing towards job planning and complementing medical appraisal. Support from their employing organisation and engagement with medical managers in design and implementation are likely to be essential.

  8. Report on the OECD’s base erosion and profit shifting (BEPS): origin and implementation within the international and global framework

    Directory of Open Access Journals (Sweden)

    Fernando Serrano Antón

    2014-07-01

    Full Text Available This work analyzes the circumstances leading to the OECD’s report on base erosion and profit shifting (BEPS). The inconsistency of tax systems and unilateralism in the current framework of economic globalization may have led to asymmetric tax situations, mostly exploited by multinational companies. The means and tools used and proposed by several international institutions to implement legally binding actions through soft law, and their acceptance by different countries, as methods in the fight against tax avoidance and fraud are also discussed.

  9. A TBB-CUDA Implementation for Background Removal in a Video-Based Fire Detection System

    Directory of Open Access Journals (Sweden)

    Fan Wang

    2014-01-01

    Full Text Available This paper presents a parallel TBB-CUDA implementation for the acceleration of the single-Gaussian distribution model, which is effective for background removal in video-based fire detection systems. In this framework, TBB mainly deals with the initialization of the estimated Gaussian model running on the CPU, and CUDA performs background removal and adaptation of the model running on the GPU. This implementation exploits the combined computational power of TBB and CUDA, making it applicable in real-time environments. Over 220 video sequences are utilized in the experiments. The experimental results illustrate that TBB+CUDA can achieve a higher speedup than either TBB or CUDA alone. The proposed framework effectively overcomes the disadvantages of the limited memory bandwidth and few execution units of the CPU, and it reduces data transfer latency and memory latency between CPU and GPU.
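    As a rough illustration of the technique the abstract names (the general single-Gaussian background model, not the paper's actual TBB or CUDA kernels), a per-pixel running Gaussian with foreground thresholding can be sketched in NumPy:

```python
import numpy as np

# Each pixel keeps a running mean and variance; a pixel is foreground when it
# falls outside k standard deviations, and the model adapts only where the
# pixel still looks like background. alpha and k are assumed, typical values.
def update_background(frame, mean, var, alpha=0.05, k=2.5):
    diff = frame - mean
    foreground = np.abs(diff) > k * np.sqrt(var)
    bg = ~foreground
    mean = np.where(bg, mean + alpha * diff, mean)
    var = np.where(bg, (1.0 - alpha) * var + alpha * diff ** 2, var)
    return foreground, mean, var

# Toy example: a static background near 10.0 with one sudden bright pixel.
mean = np.full((4, 4), 10.0)
var = np.full((4, 4), 1.0)
frame = np.full((4, 4), 10.2)
frame[1, 2] = 200.0          # bright "fire-like" region
fg, mean, var = update_background(frame, mean, var)
```

    Per-pixel independence is what makes this model a good fit for GPU parallelization: every pixel's update is the same small arithmetic kernel.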

  10. Improved Diagnosis and Care for Rare Diseases through Implementation of Precision Public Health Framework.

    Science.gov (United States)

    Baynam, Gareth; Bowman, Faye; Lister, Karla; Walker, Caroline E; Pachter, Nicholas; Goldblatt, Jack; Boycott, Kym M; Gahl, William A; Kosaki, Kenjiro; Adachi, Takeya; Ishii, Ken; Mahede, Trinity; McKenzie, Fiona; Townshend, Sharron; Slee, Jennie; Kiraly-Borri, Cathy; Vasudevan, Anand; Hawkins, Anne; Broley, Stephanie; Schofield, Lyn; Verhoef, Hedwig; Groza, Tudor; Zankl, Andreas; Robinson, Peter N; Haendel, Melissa; Brudno, Michael; Mattick, John S; Dinger, Marcel E; Roscioli, Tony; Cowley, Mark J; Olry, Annie; Hanauer, Marc; Alkuraya, Fowzan S; Taruscio, Domenica; Posada de la Paz, Manuel; Lochmüller, Hanns; Bushby, Kate; Thompson, Rachel; Hedley, Victoria; Lasko, Paul; Mina, Kym; Beilby, John; Tifft, Cynthia; Davis, Mark; Laing, Nigel G; Julkowska, Daria; Le Cam, Yann; Terry, Sharon F; Kaufmann, Petra; Eerola, Iiro; Norstedt, Irene; Rath, Ana; Suematsu, Makoto; Groft, Stephen C; Austin, Christopher P; Draghia-Akli, Ruxandra; Weeramanthri, Tarun S; Molster, Caron; Dawkins, Hugh J S

    2017-01-01

    Public health relies on technologies to produce and analyse data, as well as effectively develop and implement policies and practices. An example is the public health practice of epidemiology, which relies on computational technology to monitor the health status of populations, identify disadvantaged or at risk population groups and thereby inform health policy and priority setting. Critical to achieving health improvements for the underserved population of people living with rare diseases is early diagnosis and best care. In the rare diseases field, the vast majority of diseases are caused by destructive but previously difficult to identify protein-coding gene mutations. The reduction in cost of genetic testing and advances in the clinical use of genome sequencing, data science and imaging are converging to provide more precise understandings of the 'person-time-place' triad. That is: who is affected (people); when the disease is occurring (time); and where the disease is occurring (place). Consequently we are witnessing a paradigm shift in public health policy and practice towards 'precision public health'. Patient and stakeholder engagement has informed the need for a national public health policy framework for rare diseases. The engagement approach in different countries has produced highly comparable outcomes and objectives. Knowledge and experience sharing across the international rare diseases networks and partnerships has informed the development of the Western Australian Rare Diseases Strategic Framework 2015-2018 (RD Framework) and Australian government health briefings on the need for a National plan. The RD Framework is guiding the translation of genomic and other technologies into the Western Australian health system, leading to greater precision in diagnostic pathways and care, and is an example of how a precision public health framework can improve health outcomes for the rare diseases population. Five vignettes are used to illustrate how policy

  11. Guidelines for Implementing Advanced Distribution Management Systems-Requirements for DMS Integration with DERMS and Microgrids

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Jianhui [Argonne National Lab. (ANL), Argonne, IL (United States); Chen, Chen [Argonne National Lab. (ANL), Argonne, IL (United States); Lu, Xiaonan [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-08-01

    This guideline focuses on the integration of DMS with DERMS and microgrids connected to the distribution grid by defining generic and fundamental design and implementation principles and strategies. It starts by addressing the current status, objectives, and core functionalities of each system, and then discusses the new challenges and the common principles of DMS design and implementation for integration with DERMS and microgrids to realize enhanced grid operation reliability and quality power delivery to consumers while also achieving the maximum energy economics from the DER and microgrid connections.

  12. Characteristics of the Audit Processes for Distributed Informatics Systems

    Directory of Open Access Journals (Sweden)

    Marius POPA

    2009-01-01

    Full Text Available The paper addresses: the main characteristics and examples of distributed informatics systems and the main categories of differences among them; concepts, principles, techniques and fields for auditing distributed informatics systems; and the concept and classes of standards, their characteristics, and examples of standards, guidelines, procedures and controls for auditing distributed informatics systems. Distributed informatics systems are characterized by the following issues: development process, resources, implemented functionalities, architectures, system classes and particularities. The audit framework has two sides: the audit process and the auditors. The audit process must be conducted in accordance with the standard specifications of the IT&C field. The auditors must meet the ethical principles and must have a high level of professional skills and competence in the IT&C field.

  13. Evaluation Protocol To Assess an Integrated Framework for the Implementation of the Childhood Obesity Research Demonstration Project at the California (CA-CORD) and Massachusetts (MA-CORD) Sites

    OpenAIRE

    Chuang, Emmeline; Ayala, Guadalupe X.; Schmied, Emily; Ganter, Claudia; Gittelsohn, Joel; Davison, Kirsten K.

    2015-01-01

    Background: The long-term success of child obesity prevention and control efforts depends not only on the efficacy of the approaches selected, but also on the strategies through which they are implemented and sustained. This study introduces the Multilevel Implementation Framework (MIF), a conceptual model of factors affecting the implementation of multilevel, multisector interventions, and describes its application to the evaluation of two of three state sites (CA and MA) participating in th...

  14. Development of the ATLAS simulation framework

    International Nuclear Information System (INIS)

    DellAcqua, A.; Stavrianakou, M.; Amako, K.; Kanzaki, J.; Morita, Y.; Murakami, K.; Sasaki; Kurashige, H.; Rimoldi, A.; Saeki, T.; Ueda, I.; Tanaka, S.; Yoshida, H.

    2001-01-01

    The object-oriented (OO) approach is the key technology for developing a software system in the LHC/ATLAS experiment. The authors developed an OO simulation framework based on the Geant4 general-purpose simulation toolkit. Because of the complexity of simulation in ATLAS, the authors paid most attention to scalability in the design. Although the first target of this framework is the implementation of the ATLAS full detector simulation program, there is no experiment-specific code in it; it can therefore be utilized for the development of any simulation package, not only for HEP experiments but also for various other research domains. The authors discuss their approach to the design and implementation of the framework.

  15. Operational Research: Evaluating Multimodel Implementations for 24/7 Runtime Environments

    Science.gov (United States)

    Burkhart, J. F.; Helset, S.; Abdella, Y. S.; Lappegard, G.

    2016-12-01

    We present a new open source framework for operational hydrologic rainfall-runoff modeling. The Statkraft Hydrologic Forecasting Toolbox (Shyft) is unique among existing frameworks in that two of its primary goals are to provide: (i) modern, professionally developed source code, and (ii) a platform that is robust and ready for operational deployment. Developed jointly by Statkraft AS and the University of Oslo, the framework is currently in operation in both private and academic environments. The hydrology presently available in the distribution is simple and proven. Shyft provides a platform for distributed hydrologic modeling in a highly efficient manner. In its current operational deployment at Statkraft, Shyft is used to provide daily 10-day forecasts for critical reservoirs. In a research setting, we have developed a novel implementation of the SNICAR model to assess the impact of aerosol deposition on snow packs. Several well-known rainfall-runoff algorithms are available for use, allowing for intercomparison of different approaches based on available data and the geographical environment. The well-known HBV model is a default option, and other routines with more localized methods handling snow and evapotranspiration, or simplifications of catchment-scale processes, are included. For the latter, we have implemented the Kirchner response routine. The framework being developed in Norway, a variety of snow-melt routines, including simplified degree-day models or more advanced energy-balance models, may be selected. Ensemble forecasts, multi-model implementations, and statistical post-processing routines enable a robust toolbox for investigating optimal model configurations in an operational setting. The Shyft core is written in modern templated C++ and has Python wrappers for easy access to module sub-routines. The code is developed such that the modules that make up a "method stack" are easy to modify and customize, allowing one to create new methods and test them rapidly. Due
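    The "method stack" idea, where interchangeable routines are composed into a model, can be sketched as follows; this is a hypothetical Python illustration, not Shyft's actual API, and the routine names, state variables and coefficients are invented:

```python
# Each stage of the stack is an interchangeable callable taking (state, forcing),
# so swapping e.g. a degree-day snow routine for an energy-balance one means
# changing a single entry in a list, not rewriting the driver.
def degree_day_snow(state, forcing):
    melt = max(0.0, forcing["temp_c"]) * 2.0   # mm per degree-day (assumed factor)
    melt = min(melt, state["swe"])
    state["swe"] -= melt
    state["water"] += melt + forcing["precip_mm"]
    return state

def linear_reservoir_response(state, forcing):
    # Crude linear-reservoir stand-in for a catchment response routine.
    discharge = 0.1 * state["water"]
    state["water"] -= discharge
    state["discharge"] = discharge
    return state

def run_stack(stack, state, forcing):
    for method in stack:
        state = method(state, forcing)
    return state

state = run_stack(
    [degree_day_snow, linear_reservoir_response],
    {"swe": 50.0, "water": 0.0},
    {"temp_c": 3.0, "precip_mm": 4.0},
)
```

    Intercomparison experiments then amount to running several stacks over the same forcing and comparing the resulting discharge series.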

  16. Application of the Consolidated Framework for Implementation Research to assess factors that may influence implementation of tobacco use treatment guidelines in the Viet Nam public health care delivery system.

    Science.gov (United States)

    VanDevanter, Nancy; Kumar, Pritika; Nguyen, Nam; Nguyen, Linh; Nguyen, Trang; Stillman, Frances; Weiner, Bryan; Shelley, Donna

    2017-02-28

    Services to treat tobacco dependence are not readily available to smokers in low- and middle-income countries (LMICs) where smoking prevalence remains high. We are conducting a cluster randomized controlled trial comparing the effectiveness of two strategies for implementing tobacco use treatment guidelines in 26 community health centers (CHCs) in Viet Nam. Guided by the Consolidated Framework for Implementation Research (CFIR), prior to implementing the trial we conducted formative research to (1) identify factors that may influence guideline implementation and (2) inform further modifications to the intervention that may be necessary to translate a model of care delivery from a high-income country (HIC) to the local context of an LMIC. We conducted semi-structured qualitative interviews with CHC medical directors, health care providers, and village health workers (VHWs) in eight CHCs (n = 40). Interviews were transcribed verbatim and translated into English. Two qualitative researchers used both deductive (CFIR theory-driven) and inductive (open coding) approaches to analysis to develop codes and themes relevant to the aims of this study. The interviews explored four of the five CFIR domains (i.e., intervention characteristics, outer setting, inner setting, and individual characteristics) that were relevant to the analysis. Potential facilitators of the intervention included the relative advantage of the intervention compared with current practice (intervention characteristics), awareness of the burden of tobacco use in the population (outer setting), tension for change due to a lack of training and need for skill building and leadership engagement (inner setting), and a strong sense of collective efficacy to provide tobacco cessation services (individual characteristics). Potential barriers included the perception that the intervention was more complex (intervention characteristics) and not necessarily compatible (inner setting) with current workflows and staffing

  17. Java Web Frameworks Which One to Choose?

    OpenAIRE

    Nassourou, Mohamadou

    2010-01-01

    This article discusses web frameworks that are available to a software developer in the Java language. It introduces the MVC paradigm and some frameworks that implement it. The article presents an overview of the Struts, Spring MVC and JSF frameworks, as well as guidelines for selecting one of them as a development environment.
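    The MVC paradigm the article introduces is language-agnostic; a minimal sketch (shown here in Python rather than Java, purely for brevity) separates state, rendering, and request handling:

```python
# Minimal MVC sketch: the controller mediates between a model holding state
# and a view that renders it; neither the model nor the view knows the other.
class Model:
    def __init__(self):
        self.items = []

    def add(self, item):
        self.items.append(item)

class View:
    @staticmethod
    def render(items):
        return "\n".join(f"- {item}" for item in items)

class Controller:
    def __init__(self, model, view):
        self.model, self.view = model, view

    def handle_add(self, item):
        self.model.add(item)                       # update state
        return self.view.render(self.model.items)  # re-render the view

page = Controller(Model(), View()).handle_add("hello")
```

    Frameworks such as Struts, Spring MVC and JSF provide this separation at web scale: requests are routed to controllers, models hold application data, and views are templates.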

  18. On dose distribution comparison

    International Nuclear Information System (INIS)

    Jiang, Steve B; Sharp, Greg C; Neicu, Toni; Berbeco, Ross I; Flampouri, Stella; Bortfeld, Thomas

    2006-01-01

    In radiotherapy practice, one often needs to compare two dose distributions. Especially with the wide clinical implementation of intensity-modulated radiation therapy, software tools for quantitative dose (or fluence) distribution comparison are required for patient-specific quality assurance. Dose distribution comparison is not a trivial task since it has to be performed in both dose and spatial domains in order to be clinically relevant. Each of the existing comparison methods has its own strengths and weaknesses and there is room for improvement. In this work, we developed a general framework for comparing dose distributions. Using a new concept called maximum allowed dose difference (MADD), the comparison in both dose and spatial domains can be performed entirely in the dose domain. Formulae for calculating MADD values for various comparison methods, such as composite analysis and gamma index, have been derived. For convenience in clinical practice, a new measure called normalized dose difference (NDD) has also been proposed, which is the dose difference at a point scaled by the ratio of MADD to the predetermined dose acceptance tolerance. Unlike the simple dose difference test, NDD works in both low and high dose gradient regions because it considers both dose and spatial acceptance tolerances through MADD. The new method has been applied to a test case and a clinical example. It was found that the new method combines the merits of the existing methods (accurate, simple, clinically intuitive and insensitive to dose grid size) and can easily be implemented into any dose/intensity comparison tool.
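    The NDD idea can be sketched for a 1-D profile. The paper derives a MADD formula per comparison method; the gamma-index-style form MADD = sqrt(delta^2 + (|grad D| * dta)^2) used below is an assumption of this sketch, with delta the dose tolerance and dta the distance-to-agreement tolerance:

```python
import numpy as np

# Hedged sketch: rescale the raw dose difference by delta/MADD so that the
# single test |NDD| <= delta applies in both flat and steep regions.
def normalized_dose_difference(ref, evaluated, spacing, delta=0.03, dta=3.0):
    grad = np.gradient(ref, spacing)             # local gradient of the reference
    madd = np.sqrt(delta ** 2 + (np.abs(grad) * dta) ** 2)
    dose_diff = evaluated - ref
    return dose_diff * (delta / madd)

# A steep (penumbra-like) profile tolerates a larger raw dose difference
# than a flat region, because MADD grows with the local gradient.
x = np.linspace(0.0, 10.0, 101)
ref = 1.0 / (1.0 + np.exp(-(x - 5.0) * 3.0))     # sigmoid, steep near x = 5
ndd = normalized_dose_difference(ref, ref + 0.05, spacing=0.1)
```

    With a uniform 0.05 shift, the flat ends of the profile fail the 3% tolerance while the steep center passes, which is the intended behavior of a combined dose/distance test.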

  19. Distributed models of radionuclide transport on watersheds: development and implementation for the Chernobyl and Fukushima catchments

    Energy Technology Data Exchange (ETDEWEB)

    Kivva, S.; Zheleznyak, M. [Institute of Environmental Radioactivity, Fukushima University (Japan)

    2014-07-01

    The distributed hydrological 'rainfall-runoff' models provide possibilities for physically based simulation of surface and subsurface flow on watersheds based on GIS-processed data. The success of such modeling approaches for predictions of runoff and soil erosion provides a basis for the implementation of distributed radionuclide transport watershed models. Two distributed watershed models of radionuclide transport, RUNTOX and DHSVM-R, have been used to simulate radionuclide transport in the basin of the Dnieper River, Ukraine, and in watersheds of Fukushima Prefecture. RUNTOX is used for the simulation of radionuclide wash-off from experimental plots and small watersheds, and DHSVM-R is used for medium and large watersheds. RUNTOX is a two-dimensional distributed hydrological model based on the finite-difference solution of the coupled equations of surface flow, subsurface flow and groundwater flow, and the advection-dispersion equations of sediment (eroded soil) and radionuclide transport in liquid and solid phases, parameterizing the radionuclide exchanges between the liquid and solid phases. This model has been applied to experimental plots in Ukraine after the Chernobyl accident and to experimental plots in Fukushima Prefecture. The experience of RUNTOX development and application has been used for the extension of the distributed hydrological model DHSVM by including a module for watershed radionuclide transport. The updated model was named DHSVM-R. The original DHSVM (Distributed Hydrology Soil Vegetation Model) was developed at the University of Washington and Pacific Northwest National Laboratory. DHSVM is a physical distributed hydrology-vegetation model for complex terrain based on the numerical solution of a network of one-dimensional equations. The model accounts explicitly for the spatial distribution of land-surface processes, and can be applied over a range of scales, from plot to large
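    The advection-dispersion transport such models solve can be illustrated with a minimal explicit finite-difference step in one dimension; the scheme and parameter values below are generic textbook choices, not taken from RUNTOX or DHSVM-R:

```python
import numpy as np

# Explicit step for 1-D advection-dispersion of a dissolved tracer on a
# periodic domain: upwind differencing for advection (flow in +x) and
# central differencing for dispersion. u, d, dx, dt chosen for stability
# (Courant number u*dt/dx = 0.25, diffusion number d*dt/dx^2 = 0.005).
def step(c, u=0.5, d=0.01, dx=1.0, dt=0.5):
    adv = -u * (c - np.roll(c, 1)) / dx
    disp = d * (np.roll(c, -1) - 2.0 * c + np.roll(c, 1)) / dx ** 2
    return c + dt * (adv + disp)

c = np.zeros(50)
c[10] = 1.0                  # initial pulse of activity at cell 10
for _ in range(40):          # pulse advects about u*dt*40 = 10 cells downstream
    c = step(c)
```

    The conservative form keeps total mass constant while the pulse translates and spreads, which is the qualitative behavior of a wash-off plume moving through a river reach.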

  20. Evolution of a multilevel framework for health program evaluation.

    Science.gov (United States)

    Masso, Malcolm; Quinsey, Karen; Fildes, Dave

    2017-07-01

    A well-conceived evaluation framework increases understanding of a program's goals and objectives, facilitates the identification of outcomes and can be used as a planning tool during program development. Herein we describe the origins and development of an evaluation framework that recognises that implementation is influenced by the setting in which it takes place, the individuals involved and the processes by which implementation is accomplished. The framework includes an evaluation hierarchy that focuses on outcomes for consumers, providers and the care delivery system, and is structured according to six domains: program delivery, impact, sustainability, capacity building, generalisability and dissemination. These components of the evaluation framework fit into a matrix structure, and cells within the matrix are supported by relevant evaluation tools. The development of the framework has been influenced by feedback from various stakeholders, existing knowledge of the evaluators and the literature on health promotion and implementation science. Over the years, the framework has matured and is generic enough to be useful in a wide variety of circumstances, yet specific enough to focus data collection, data analysis and the presentation of findings.

  1. MODELS AND SOLUTIONS FOR THE IMPLEMENTATION OF DISTRIBUTED SYSTEMS

    Directory of Open Access Journals (Sweden)

    Tarca Naiana

    2011-07-01

    Full Text Available Software applications may have different degrees of complexity depending on the problems they try to solve and can integrate very complex elements that bring together functionality that sometimes are competing or conflicting. We can take for example a mobile communications system. Functionalities of such a system are difficult to understand, and they add to the non-functional requirements such as the use in practice, performance, cost, durability and security. The transition from local computer networks to cover large networks that allow millions of machines around the world at speeds exceeding one gigabit per second allowed universal access to data and design of applications that require simultaneous use of computing power of several interconnected systems. The result of these technologies has enabled the evolution from centralized to distributed systems that connect a large number of computers. To enable the exploitation of the advantages of distributed systems one had developed software and communications tools that have enabled the implementation of distributed processing of complex solutions. The objective of this document is to present all the hardware, software and communication tools, closely related to the possibility of their application in integrated social and economic level as a result of globalization and the evolution of e-society. These objectives and national priorities are based on current needs and realities of Romanian society, while being consistent with the requirements of Romania's European orientation towards the knowledge society, strengthening the information society, the target goal representing the accomplishment of e-Romania, with its strategic e-government component. Achieving this objective repositions Romania and gives an advantage for sustainable growth, positive international image, rapid convergence in Europe, inclusion and strengthening areas of high competence, in line with Europe 2020, launched by the

  2. FireCalc: An XML-based framework for distributed data analysis

    International Nuclear Information System (INIS)

    Duarte, A.S.; Santos, J.H.; Fernandes, H.; Neto, A.; Pereira, T.; Varandas, C.A.F.

    2008-01-01

    Requirements and specifications for Control Data Access and Communication (CODAC) systems in fusion reactors point towards flexible and modular solutions, independent from operating system and computer architecture. These concepts can also be applied to calculation and data analysis systems, where highly standardized solutions must also apply in order to anticipate long time-scales and high technology evolution changes. FireCalc is an analysis tool based on standard Extensible Markup Language (XML) technologies. Actions are described in an XML file, which contains necessary data specifications and the code or references to scripts. This is used by the user to send the analysis code and data to a server, which can be running either locally or remotely. Communications between the user and the server are performed through XML-RPC, an XML based remote procedure call, thus enabling the client and server to be coded in different computer languages. Access to the database, security procedures and calls to the code interpreter are handled through independent modules, decoupling them from specific solutions. Currently there is an implementation of the FireCalc framework in Java, that uses the Shared Data Access System (SDAS) for accessing the ISTTOK database and the Scilab kernel for the numerical analysis.

  3. FireCalc: An XML-based framework for distributed data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Duarte, A.S. [Associacao Euratom/IST, Centro de Fusao Nuclear, Av. Rovisco Pais P-1049-001 Lisboa (Portugal)], E-mail: andre.duarte@cfn.ist.utl.pt; Santos, J.H.; Fernandes, H.; Neto, A.; Pereira, T.; Varandas, C.A.F. [Associacao Euratom/IST, Centro de Fusao Nuclear, Av. Rovisco Pais P-1049-001 Lisboa (Portugal)

    2008-04-15

    Requirements and specifications for Control Data Access and Communication (CODAC) systems in fusion reactors point towards flexible and modular solutions, independent of operating system and computer architecture. These concepts can also be applied to calculation and data analysis systems, where highly standardized solutions must also apply in order to anticipate long time-scales and rapid technological change. FireCalc is an analysis tool based on standard Extensible Markup Language (XML) technologies. Actions are described in an XML file, which contains the necessary data specifications and the code, or references to scripts. The user employs this file to send the analysis code and data to a server, which can be running either locally or remotely. Communications between the user and the server are performed through XML-RPC, an XML-based remote procedure call protocol, thus enabling the client and server to be coded in different computer languages. Access to the database, security procedures and calls to the code interpreter are handled through independent modules, which decouples them from specific solutions. Currently there is an implementation of the FireCalc framework in Java that uses the Shared Data Access System (SDAS) for accessing the ISTTOK database and the Scilab kernel for the numerical analysis.
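
    FireCalc's reference implementation is in Java and uses SDAS and Scilab; none of that is reproduced here. As a rough illustration of the language-independent client/server pattern the abstract describes, the following Python sketch uses the standard library's XML-RPC modules. The `run_analysis` endpoint, its arguments, and the use of an ephemeral port are invented for the example and are not part of FireCalc.

```python
import threading
from xmlrpc.client import ServerProxy
from xmlrpc.server import SimpleXMLRPCServer

# Hypothetical analysis endpoint: receives an action name and a data
# specification (as a FireCalc XML action file would describe) and
# returns a result. A real server would dispatch to a code interpreter.
def run_analysis(action, data):
    return {"action": action, "n_points": len(data), "mean": sum(data) / len(data)}

# Bind to an ephemeral port so the example never collides with a running service.
server = SimpleXMLRPCServer(("localhost", 0), logRequests=False)
port = server.server_address[1]
server.register_function(run_analysis, "run_analysis")
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client and server may be written in different languages: XML-RPC
# only fixes the wire format, not the implementation language.
client = ServerProxy("http://localhost:%d" % port)
result = client.run_analysis("mean_density", [1.0, 2.0, 3.0, 4.0])
print(result["mean"])  # 2.5
server.shutdown()
```

    Because the call and its arguments travel as XML over HTTP, the same request could equally be served by a Java process, which is the portability argument the abstract makes.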

  4. A Survey and Analysis of Frameworks and Framework Issues for Information Fusion Applications

    Science.gov (United States)

    Llinas, James

    This paper was stimulated by the proposed project for the Santander Bank-sponsored "Chairs of Excellence" program in Spain, of which the author is a recipient. That project involves research on characterizing a robust, problem-domain-agnostic framework in which Information Fusion (IF) processes of all descriptions, including artificial intelligence processes and techniques, could be developed. The paper describes the IF process and its requirements, a literature survey on IF frameworks, and a new proposed framework that will be implemented and evaluated at Universidad Carlos III de Madrid, Colmenarejo Campus.

  5. Enabling pathways to health equity: developing a framework for implementing social capital in practice.

    Science.gov (United States)

    Putland, Christine; Baum, Fran; Ziersch, Anna; Arthurson, Kathy; Pomagalska, Dorota

    2013-05-29

    Mounting evidence linking aspects of social capital to health and wellbeing outcomes, in particular to reducing health inequities, has led to intense interest in social capital theory within public health in recent decades. As a result, governments internationally are designing interventions to improve health and wellbeing by addressing levels of social capital in communities. The application of theory to practice is uneven, however, reflecting differing views on the pathways between social capital and health, and divergent theories about social capital itself. Unreliable implementation may restrict the potential to contribute to health equity by this means, yet to date there has been limited investigation of how the theory is interpreted at the level of policy and then translated into practice. The paper outlines a collaborative research project designed to address this knowledge deficit in order to inform more effective implementation. Undertaken in partnership with government departments, the study explored the application of social capital theory in programs designed to promote health and wellbeing in Adelaide, South Australia. It comprised three case studies of community-based practice, employing qualitative interviews and focus groups with community participants, practitioners, program managers and policy makers, to examine the ways in which the concept was interpreted and operationalized and identify the factors influencing success. These key lessons informed the development of practical resources comprising a guide for practitioners and briefing for policy makers. Overall the study showed that effective community projects can contribute to population health and wellbeing and reducing health inequities. Of specific relevance to this paper, however, is the finding that community projects rely for their effectiveness on a broader commitment expressed through policies and frameworks at the highest level of government decision making. In particular this

  6. Using Ada to implement the operations management system in a community of experts

    Science.gov (United States)

    Frank, M. S.

    1986-01-01

    An architecture is described for the Space Station Operations Management System (OMS), consisting of a distributed expert system framework implemented in Ada. The motivation for such a scheme is based on the desire to integrate the very diverse elements of the OMS while taking maximum advantage of knowledge based systems technology. Part of the foundation of an Ada based distributed expert system was accomplished in the form of a proof of concept prototype for the KNOMES project (Knowledge-based Maintenance Expert System). This prototype successfully used concurrently active experts to accomplish monitoring and diagnosis for the Remote Manipulator System. The basic concept of this software architecture is named ACTORS for Ada Cognitive Task ORganization Scheme. It is when one considers the overall problem of integrating all of the OMS elements into a cooperative system that the AI solution stands out. By utilizing a distributed knowledge based system as the framework for OMS, it is possible to integrate those components which need to share information in an intelligent manner.

  7. WebSelF: A Web Scraping Framework

    DEFF Research Database (Denmark)

    Thomsen, Jakob; Ernst, Erik; Brabrand, Claus

    2012-01-01

    We present, WebSelF, a framework for web scraping which models the process of web scraping and decomposes it into four conceptually independent, reusable, and composable constituents. We have validated our framework through a full parameterized implementation that is flexible enough to capture...... previous work on web scraping. We have experimentally evaluated our framework and implementation in an experiment that evaluated several qualitatively different web scraping constituents (including previous work and combinations hereof) on about 11,000 HTML pages on daily versions of 17 web sites over...... a period of more than one year. Our framework solves three concrete problems with current web scraping and our experimental results indicate that com- position of previous and our new techniques achieve a higher degree of accuracy, precision and specificity than existing techniques alone....

  8. Implementation of the Integrated Alarm System for KOMAC facility using EPICS framework and Eclipse

    International Nuclear Information System (INIS)

    Song, Young-Gi; Kim, Jae-Ha; Kim, Han-Sung; Kwon, Hyeok-Jung; Cho, Yong-Sub

    2017-01-01

    The alarm detecting layer is the component that monitors alarm signals, which are transported to the processing part through a message queue. The main purpose of the processing part is to transfer the alarm signals, connecting each alarm's identification and state, to the database system. The operation interface for system-level signal links has been developed with the EPICS framework, and EPICS tools have been used for monitoring device alarm status. The KOMAC alarm system was developed to offer a user-friendly, intuitive user interface. The alarm system is implemented with an EPICS IOC for the alarm server, the Eclipse Mars integrated development tool for the alarm viewer, and MariaDB for the alarm log. The new alarm system supports an intuitive user interface for alarm information and alarm history. Planned additions to the alarm viewer include a login function, user permissions for alarm acknowledgement and PV import, and search and report functions.

  9. Smart meter implementation plan : report of the Board to the Minister

    International Nuclear Information System (INIS)

    2005-01-01

    This report provides detailed information about Ontario's smart meter implementation plan. The smart metering system will measure how much electricity a customer uses on an hourly basis, with data being transferred daily to local electricity distributors. Energy prices will vary according to the time of day when energy was being consumed, a system that supports current methods of charging larger customers. The plan proposes that all new and existing customers in Ontario have some type of smart meter by 2010 as part of a two-phased plan. Customers will receive timely information on consumption, and distributors will offer variable pricing plans. It was advised that costs be included in the distribution rate immediately upon installation of smart meters. Detailed information on implementation, smart metering costs, minimum requirements, and non-commodity time of use rates were presented. Critical tasks for establishing a framework for implementation included: ministerial approval of the plan; identification of a program coordinator; the establishment of a correct regulatory framework; a vendor approval process requiring appropriate permissions for radio frequency licences; technology pilots on behalf of distributors to assure adequate adaptation and the development of procedures concerning procurement, internal schedules and deployment; coordination between government, regulatory bodies and distributors towards the establishment of communication strategies, implementation plans and distributor approaches. 5 tabs., 6 figs

  10. Development of an Analysis and Design Optimization Framework for Marine Propellers

    Science.gov (United States)

    Tamhane, Ashish C.

    In this thesis, a framework for the analysis and design optimization of ship propellers is developed. This framework can be utilized as an efficient synthesis tool to determine the main geometric characteristics of the propeller, and it also provides the designer with the capability to optimize the shape of the blade sections based on specific criteria. A hybrid lifting-line method with lifting-surface corrections to account for three-dimensional flow effects has been developed. The prediction of the correction factors is achieved using Artificial Neural Networks and Support Vector Regression. This approach results in increased approximation accuracy compared to existing methods and allows for extrapolation of the correction factor values. The effect of viscosity is implemented in the framework by coupling the lifting-line method with the open-source RANSE solver OpenFOAM for the calculation of lift, drag and pressure distribution on the blade sections, using a transition k-ω SST turbulence model. Case studies of benchmark high-speed propulsors are utilized to validate the proposed framework for propeller operation in open-water conditions as well as in a ship's wake.

  11. The Case Study of Implementing the Delivery Optimization System at a Fast-Moving Consumer Goods Distributer

    Directory of Open Access Journals (Sweden)

    Ante Galić

    2013-12-01

    Full Text Available Using new optimization methods and information-communications technology has become a key issue in the competition among distributors of fast-moving consumer goods. Introducing a delivery optimization system instead of manual routing enables significant cost savings. The prerequisites for optimization are a stable information system and efficient company management. The rich vehicle routing problem model is discussed and the effects of implementing the delivery optimization system are presented. Over four years of continuous use, the system has helped the distributor reduce overall distribution costs. It also made it possible to close down several depots and handle more customer requests without investing in the vehicle fleet. The developed optimization system enabled the distributor to adapt to the new distribution schedule and react very quickly to the first indicators of recession.

  12. Spark - a modern approach for distributed analytics

    CERN Multimedia

    CERN. Geneva; Kothuri, Prasanth

    2016-01-01

    The Hadoop ecosystem is the leading open-source platform for distributed storage and processing of big data. It is a very popular system for implementing data warehouses and data lakes. Spark has also emerged as one of the leading engines for data analytics. The Hadoop platform is available at CERN as a central service provided by the IT department. By attending the session, a participant will acquire knowledge of the essential concepts needed to benefit from the parallel data processing offered by the Spark framework. The session is structured around practical examples and tutorials. Main topics: architecture overview (work distribution, concepts of a worker and a driver); computing concepts of transformations and actions; data processing APIs (RDD, DataFrame, and SparkSQL).
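
    Spark itself runs on a JVM cluster and is driven from Scala or PySpark; since no Spark installation is assumed here, the following pure-Python sketch only imitates the session's central distinction between lazy transformations and eager actions. The `ToyRDD` class is invented for illustration and is not Spark's API.

```python
from functools import reduce as _reduce

class ToyRDD:
    """Single-process imitation of Spark's RDD evaluation model.

    Transformations (map, filter) are lazy: they only record the
    operation and return a new ToyRDD. Actions (collect, reduce)
    trigger the actual computation over the recorded pipeline.
    """
    def __init__(self, data, ops=None):
        self._data = data
        self._ops = ops or []

    def map(self, f):                  # transformation: nothing computed yet
        return ToyRDD(self._data, self._ops + [("map", f)])

    def filter(self, pred):            # transformation: nothing computed yet
        return ToyRDD(self._data, self._ops + [("filter", pred)])

    def collect(self):                 # action: forces evaluation
        out = list(self._data)
        for kind, f in self._ops:
            out = [f(x) for x in out] if kind == "map" else [x for x in out if f(x)]
        return out

    def reduce(self, f):               # action: evaluates, then folds
        return _reduce(f, self.collect())

rdd = ToyRDD(range(10))
squares_of_evens = rdd.filter(lambda x: x % 2 == 0).map(lambda x: x * x)
print(squares_of_evens.collect())                    # [0, 4, 16, 36, 64]
print(squares_of_evens.reduce(lambda a, b: a + b))   # 120
```

    In real Spark the same chain would be written against a distributed RDD, and the driver would ship the lambdas to workers; the laziness semantics, however, are the same.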

  13. The SCEC Unified Community Velocity Model (UCVM) Software Framework for Distributing and Querying Seismic Velocity Models

    Science.gov (United States)

    Maechling, P. J.; Taborda, R.; Callaghan, S.; Shaw, J. H.; Plesch, A.; Olsen, K. B.; Jordan, T. H.; Goulet, C. A.

    2017-12-01

    Crustal seismic velocity models and datasets play a key role in regional three-dimensional numerical earthquake ground-motion simulation, full waveform tomography, modern physics-based probabilistic earthquake hazard analysis, as well as in other related fields including geophysics, seismology, and earthquake engineering. The standard material properties provided by a seismic velocity model are P- and S-wave velocities and density for any arbitrary point within the geographic volume for which the model is defined. Many seismic velocity models and datasets are constructed by synthesizing information from multiple sources, and the resulting models are delivered to users in multiple file formats, such as text files, binary files, HDF-5 files, structured and unstructured grids, and through computer applications that allow for interactive querying of material properties. The Southern California Earthquake Center (SCEC) has developed the Unified Community Velocity Model (UCVM) software framework to facilitate the registration and distribution of existing and future seismic velocity models to the SCEC community. The UCVM software framework is designed to provide a standard query interface to multiple, alternative velocity models, even if the underlying velocity models are defined in different formats or use different geographic projections. The UCVM framework provides a comprehensive set of open-source tools for querying seismic velocity model properties, combining regional 3D models and 1D background models, visualizing 3D models, and generating computational models in the form of regular grids or unstructured meshes that can be used as inputs for ground-motion simulations. The UCVM framework helps researchers compare seismic velocity models and build equivalent simulation meshes from alternative velocity models. These capabilities enable researchers to evaluate the impact of alternative velocity models in ground-motion simulations and seismic hazard analysis applications.
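
    UCVM's actual tools have their own interfaces and registered models; the sketch below only illustrates, in Python with invented model classes and made-up material values, the idea of a standard query interface that tiles a regional 3D model over a 1D background model regardless of each model's native format.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class MaterialProperties:
    vp: float       # P-wave velocity (m/s)
    vs: float       # S-wave velocity (m/s)
    density: float  # kg/m^3

class VelocityModel(ABC):
    """Standard query interface: every registered model, whatever its
    native file format or projection, answers the same question."""
    @abstractmethod
    def query(self, lon, lat, depth_m) -> MaterialProperties: ...

class Hypothetical1DBackground(VelocityModel):
    # Invented depth-dependent 1D model, used outside 3D coverage.
    def query(self, lon, lat, depth_m):
        vs = 500.0 + 2.0 * depth_m
        return MaterialProperties(vp=1.8 * vs, vs=vs, density=2000.0)

class Hypothetical3DRegional(VelocityModel):
    # Invented regional model defined only inside a bounding box.
    def covers(self, lon, lat):
        return -121.0 <= lon <= -114.0 and 32.0 <= lat <= 36.0
    def query(self, lon, lat, depth_m):
        vs = 800.0 + 1.5 * depth_m
        return MaterialProperties(vp=1.7 * vs, vs=vs, density=2300.0)

def tiled_query(lon, lat, depth_m, regional, background):
    """Combine a regional 3D model with a 1D background model."""
    if regional.covers(lon, lat):
        return regional.query(lon, lat, depth_m)
    return background.query(lon, lat, depth_m)

props = tiled_query(-118.2, 34.0, 1000.0,
                    Hypothetical3DRegional(), Hypothetical1DBackground())
print(props.vs)  # 2300.0 (inside the regional box)
```

    A mesh generator built on such an interface can iterate over grid points and query material properties without knowing which model answers each point, which is the substitutability the abstract describes.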

  14. Situation awareness of active distribution network: roadmap, technologies, and bottlenecks

    DEFF Research Database (Denmark)

    Lin, Jin; Wan, Can; Song, Yonghua

    2016-01-01

    With the rapid development of local generation and demand response, the active distribution network (ADN), which aggregates and manages miscellaneous distributed resources, has moved from theory to practice. Secure and optimal operations now require an advanced situation awareness (SA) system so...... in the project of developing an SA system as the basic component of a practical active distribution management system (ADMS) deployed in Beijing, China, is presented. This paper reviews the ADN’s development roadmap by illustrating the changes that are made in elements, topology, structure, and control scheme....... Taking into consideration these hardware changes, a systematic framework is proposed for the main components and the functional hierarchy of an SA system for the ADN. The SA system’s implementation bottlenecks are also presented, including, but not limited to issues in big data platform, distribution...

  15. POLARIS: Agent-based modeling framework development and implementation for integrated travel demand and network and operations simulations

    Energy Technology Data Exchange (ETDEWEB)

    Auld, Joshua; Hope, Michael; Ley, Hubert; Sokolov, Vadim; Xu, Bo; Zhang, Kuilin

    2016-03-01

    This paper discusses the development of an agent-based modelling software development kit, and the implementation and validation of a model built with it that integrates dynamic simulation of travel demand, network supply and network operations. A description is given of the core utilities in the kit: a parallel discrete event engine, interprocess exchange engine, and memory allocator, as well as a number of ancillary utilities: a visualization library, database IO library, and scenario manager. The overall framework emphasizes the design goals of generality, code agility, and high performance. This framework allows several aspects of the transportation system that are typically handled by separate stand-alone software applications to be modeled in a high-performance and extensible manner. The integration of such models as dynamic traffic assignment and disaggregate demand models has been a long-standing issue for transportation modelers. The integrated approach shows a possible way to resolve this difficulty. The simulation model built from the POLARIS framework is a single, shared-memory process that handles all aspects of the integrated urban simulation. The resulting gains in computational efficiency and performance allow planning models to be extended to include previously separate aspects of the urban system, enhancing the utility of such models from the planning perspective. Initial tests with case studies involving traffic management center impacts on various network events, such as accidents, congestion and weather events, show the potential of the system.

  16. A dataflow meta-computing framework for event processing in the H1 experiment

    International Nuclear Information System (INIS)

    Campbell, A.; Gerhards, R.; Mkrtchyan, T.; Levonian, S.; Grab, C.; Martyniak, J.; Nowak, J.

    2001-01-01

    Linux-based networked PC clusters are replacing both the VME non-uniform direct memory access systems and the SMP shared-memory systems used previously for online event filtering and reconstruction. To allow optimal use of the distributed resources of PC clusters, an open software framework is presently being developed based on a dataflow paradigm for event processing. This framework allows for the distribution of the data of physics events and associated calibration data to multiple computers from multiple input sources for processing, and the subsequent collection of the processed events at multiple outputs. The basis of the system is the event repository, essentially a first-in first-out event store which may be read and written in a manner similar to sequential file access. Events are stored in and transferred between repositories as suitably large sequences to enable high throughput. Multiple readers can read simultaneously from a single repository to receive event sequences, and multiple writers can insert event sequences into a repository. Hence repositories are used for event distribution and collection. To support synchronisation of the event flow, the repository implements barriers. A barrier must be written by all the writers of a repository before any reader can read the barrier. A reader must read a barrier before it may receive data from behind it. Only after all readers have read the barrier is the barrier removed from the repository. A barrier may also have attached data; in this way calibration data can be distributed to all processing units. The repositories are implemented as multi-threaded CORBA objects in C++, and CORBA is used for all data transfers. Job setup scripts are written in Python, and interactive status and histogram display is provided by a Java program. Jobs run under the PBS batch system, providing shared use of resources for online triggering, offline mass reprocessing and user analysis jobs.
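
    The real H1 repositories are multi-threaded CORBA objects in C++; the single-threaded Python sketch below (class and method names invented) only illustrates the barrier invariants stated above: a barrier becomes visible to readers once every writer has written it, and it is removed only once every reader has read it, while ordinary events are consumed by exactly one reader.

```python
from collections import deque

class EventRepository:
    """Simplified sketch of a FIFO event repository with barriers."""
    def __init__(self, n_writers, n_readers):
        self.n_writers, self.n_readers = n_writers, n_readers
        self.fifo = deque()          # ("event", e) or ("barrier", (id, payload))
        self.barrier_writes = {}     # barrier id -> count of writers so far
        self.barrier_reads = {}      # barrier id -> set of readers so far

    def write_events(self, events):
        self.fifo.extend(("event", e) for e in events)

    def write_barrier(self, barrier_id, payload=None):
        n = self.barrier_writes.get(barrier_id, 0) + 1
        self.barrier_writes[barrier_id] = n
        if n == self.n_writers:      # last writer makes the barrier visible
            self.fifo.append(("barrier", (barrier_id, payload)))

    def read(self, reader_id):
        kind, item = self.fifo[0]
        if kind == "barrier":
            bid, payload = item
            seen = self.barrier_reads.setdefault(bid, set())
            seen.add(reader_id)
            if len(seen) == self.n_readers:   # last reader removes it
                self.fifo.popleft()
            return ("barrier", bid, payload)
        self.fifo.popleft()                   # events go to exactly one reader
        return ("event", item, None)

repo = EventRepository(n_writers=2, n_readers=2)
repo.write_events(["evt1", "evt2"])
repo.write_barrier("calib-7", payload={"gain": 1.25})  # writer A
repo.write_barrier("calib-7", payload={"gain": 1.25})  # writer B completes it
assert repo.read(0)[0] == "event"
assert repo.read(1)[0] == "event"
kind, bid, payload = repo.read(0)   # reader 0 sees the barrier; it stays queued
kind, bid, payload = repo.read(1)   # only after reader 1 reads it is it removed
print(payload["gain"])  # 1.25 -- attached calibration data reached all readers
```

    The sketch omits concurrency control and the rule that a reader may not re-read a barrier it has already consumed; in the real system those guarantees come from the multi-threaded CORBA implementation.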

  17. Using the ecological framework to identify barriers and enablers to implementing Namaste Care in Canada's long-term care system.

    Science.gov (United States)

    Hunter, Paulette V; Kaasalainen, Sharon; Froggatt, Katherine A; Ploeg, Jenny; Dolovich, Lisa; Simard, Joyce; Salsali, Mahvash

    2017-10-01

    Higher acuity of care at the time of admission to long-term care (LTC) is resulting in a shorter period to time of death, yet most LTC homes in Canada do not have formalized approaches to palliative care. Namaste Care is a palliative care approach specifically tailored to persons with advanced cognitive impairment who are living in LTC. The purpose of this study was to employ the ecological framework to identify barriers and enablers to an implementation of Namaste Care. Six group interviews were conducted with families, unlicensed staff, and licensed staff at two Canadian LTC homes that were planning to implement Namaste Care. None of the interviewees had prior experience implementing Namaste Care. The resulting qualitative data were analyzed using a template organizing approach. We found that the strongest implementation enablers were positive perceptions of need for the program, benefits of the program, and fit within a resident-centred or palliative approach to care. Barriers included a generally low resource base for LTC, the need to adjust highly developed routines to accommodate the program, and reliance on a casual work force. We conclude that within the Canadian LTC system, positive perceptions of Namaste Care are tempered by concerns about organizational capacity to support new programming.

  18. A Framework for Hardware-Accelerated Services Using Partially Reconfigurable SoCs

    Directory of Open Access Journals (Sweden)

    MACHIDON, O. M.

    2016-05-01

    Full Text Available The current trend towards "Everything as a Service" fosters a new approach to reconfigurable hardware resources. This innovative, service-oriented approach has the potential of bringing a series of benefits to both the reconfigurable and distributed computing fields by favoring hardware-based acceleration of web services and increasing service performance. This paper proposes a framework for accelerating web services by offloading the compute-intensive tasks to reconfigurable System-on-Chip (SoC) devices as integrated IP (Intellectual Property) cores. The framework provides scalable, dynamic management of the tasks and hardware processing cores, based on dynamic partial reconfiguration of the SoC. We have enhanced the security of the entire system by making use of the built-in detection features of the hardware device and also by implementing active counter-measures that protect the sensitive data.

  19. Links in a distributed database: Theory and implementation

    International Nuclear Information System (INIS)

    Karonis, N.T.; Kraimer, M.R.

    1991-12-01

    This document addresses the problem of extending database links across Input/Output Controller (IOC) boundaries. It lays a foundation by reviewing the current system and proposing an implementation specification designed to guide all work in this area. The document also describes an implementation that is less ambitious than our formally stated proposal, one that does not extend the reach of all database links across IOC boundaries. Specifically, it introduces an implementation of input and output links and comments on that overall implementation. We include a set of manual pages describing each of the new functions the implementation provides

  20. Barriers and Enablers to Implementation of Dietary Guidelines in Early Childhood Education Centers in Australia: Application of the Theoretical Domains Framework.

    Science.gov (United States)

    Grady, Alice; Seward, Kirsty; Finch, Meghan; Fielding, Alison; Stacey, Fiona; Jones, Jannah; Wolfenden, Luke; Yoong, Sze Lin

    2018-03-01

    To identify perceived barriers and enablers to implementation of dietary guidelines reported by early childhood education center cooks, and barriers and enablers associated with greater implementation based on assessment of center menu compliance. Cross-sectional telephone interview. Early childhood education centers, New South Wales, Australia. A total of 202 cooks responsible for menu planning; 70 centers provided a menu for review of compliance with dietary guidelines. Barriers and enablers to dietary guideline implementation were determined using a tool assessing constructs of the Theoretical Domains Framework (TDF). Higher scores (≥6) for each construct indicated enablers to guideline implementation; lower scores (<6) indicated barriers. Multivariable linear regression identified TDF constructs associated with greater guideline implementation. Scores were lowest for the reinforcement (mean, 5.85) and goals (mean, 5.89) domains, and highest for beliefs about consequences (mean, 6.51) and social/professional role and identity (mean, 6.50). The skills domain was positively associated with greater implementation of guidelines based on menu review (P = .01). Cooks perceived social/professional role and identity, and beliefs about consequences, to be enablers to dietary guideline implementation; however, only the skills domain was associated with greater implementation. There are opportunities to target the incongruence between the perceived and actual barriers and enablers to implementation. Future research could examine the utility of the TDF to identify barriers and enablers to implementation, to inform intervention development, and for evaluating interventions to examine intervention mechanisms. Copyright © 2017 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  1. Implementation of a Parameterization Framework for Cybersecurity Laboratories

    Science.gov (United States)

    2017-03-01

    ... that the student performed each step of an exercise to help instructors assess the level of learning by each student. The framework should automate ... automated assessment tools (AATs) created to help assess how students perform in programming courses. In the paper, “Are Automated Assessment Tools

  2. Infrastructure and distributed learning methodology for privacy-preserving multi-centric rapid learning health care: euroCAT

    Directory of Open Access Journals (Sweden)

    Timo M. Deist

    2017-06-01

    The euroCAT infrastructure has been successfully implemented in five radiation clinics across three countries. SVM models can be learned on data distributed over all five clinics. Furthermore, the infrastructure provides a general framework to execute learning algorithms on distributed data. The ongoing expansion of the euroCAT network will facilitate machine learning in radiation oncology. The resulting access to larger datasets with sufficient variation will pave the way for generalizable prediction models and personalized medicine.
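
    euroCAT learns SVM models over its own infrastructure, which is not reproduced here. The toy Python sketch below (data, model, and learning rate all invented) illustrates only the general principle of executing a learning algorithm on distributed data: each clinic computes a gradient on its local records, and only those gradients, never the patient records themselves, are aggregated by the coordinator.

```python
def local_gradient(w, site_data):
    # Gradient of squared error for the toy model y ~ w * x,
    # computed on this site's data only.
    gw = 0.0
    for x, y in site_data:
        gw += 2 * (w * x - y) * x
    return gw / len(site_data)

# Three hypothetical clinics; each (x, y) record stays on its own site.
clinics = [
    [(1.0, 2.1), (2.0, 3.9)],
    [(3.0, 6.2), (4.0, 8.1)],
    [(5.0, 9.8), (6.0, 12.3)],
]

w = 0.0
for step in range(200):
    grads = [local_gradient(w, site) for site in clinics]  # computed locally
    w -= 0.01 * sum(grads) / len(grads)                    # coordinator averages

print(round(w, 2))  # converges near 2, the slope the toy data follows
```

    The same pattern generalizes to richer models: as long as the per-site computation can be summarized (gradients, support vectors, sufficient statistics), the coordinator can learn from all sites without centralizing the data.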

  3. A trial of distributed portable data acquisition and processing system implementation: the qdpb - data processing with branchpoints

    International Nuclear Information System (INIS)

    Gritsaj, K.I.; Isupov, A.Yu.

    2001-01-01

    A trial of the distributed portable data acquisition and processing system qdpb is described. The experimental-setup-specific data- and hardware-dependent code is separated from the generic part of the qdpb system. The implementation of the generic part is described.

  4. Evaluation of Augmented Reality Frameworks for Android Development

    Directory of Open Access Journals (Sweden)

    Iulia Marneanu

    2014-10-01

    Full Text Available Augmented Reality (AR) is the evolution of the concept of Virtual Reality (VR). Its goal is to enhance a person's perception of the surrounding world. AR is a fast-growing, state-of-the-art technology, and a variety of implementation tools for it exist today. Due to the heterogeneity of available technologies, the choice of the appropriate framework for a mobile application is difficult to make. These frameworks implement different tracking techniques and have to support various constraints. This publication aims to show that the choice of the appropriate framework depends on the context of the app to be developed. As expected, no framework is entirely the best; rather, each exhibits strong and weak points. Our results demonstrate that, given a set of constraints, one framework can outperform others. We anticipate our research to be the starting point for the testing of other frameworks under various constraints. The frameworks evaluated here are open-source or have been purchased under an Academic License.

  5. Towards a Cloud Based Smart Traffic Management Framework

    Science.gov (United States)

    Rahimi, M. M.; Hakimpour, F.

    2017-09-01

    Traffic big data has brought many opportunities for traffic management applications. However, several challenges, such as the heterogeneity, storage, management, processing and analysis of traffic big data, may hinder its efficient and real-time application. All these challenges call for a well-adapted distributed framework for smart traffic management that can efficiently handle big traffic data integration, indexing, query processing, mining and analysis. In this paper, we present a novel, distributed, scalable and efficient framework for traffic management applications. The proposed cloud-computing-based framework can answer the technical challenges of efficient and real-time storage, management, processing and analysis of traffic big data. For evaluation of the framework, we have used OpenStreetMap (OSM) real trajectories and road network on a distributed environment. Our evaluation results indicate that the speed of data importing into this framework exceeds 8000 records per second when the size of the dataset is near 5 million records. We also evaluated the performance of data retrieval in our proposed framework; the data retrieval speed exceeds 15000 records per second at the same dataset size. We have also evaluated the scalability and performance of our proposed framework using parallelisation of a critical pre-analysis step in transportation applications. The results show that the proposed framework achieves considerable performance and efficiency in traffic management applications.

  6. Toward the Framework and Implementation for Clearance of Materials from Regulated Facilities

    International Nuclear Information System (INIS)

    Chen, Shih-Yew; Moeller, Dade W.; Dornsife, William P.; Meyer, H Robert; Lamastra, Anthony; Lubenau, Joel O.; Strom, Daniel J.; Yusko, James G.

    2005-01-01

    important disposition option for solid materials, establish the framework and basis of release, and discuss resolutions regarding the implementation of such a disposition option

  7. Toward the framework and implementation for clearance of materials from regulated facilities.

    Science.gov (United States)

    Chen, S Y; Moeller, D W; Dornsife, W P; Meyer, H R; Lamastra, A; Lubenau, J O; Strom, D J; Yusko, J G

    2005-08-01

    clearance as an important disposition option for solid materials, establish the framework and basis of release, and discuss resolutions regarding the implementation of such a disposition option.

  8. Software Engineering Support of the Third Round of Scientific Grand Challenge Investigations: Earth System Modeling Software Framework Survey

    Science.gov (United States)

    Talbot, Bryan; Zhou, Shu-Jia; Higgins, Glenn; Zukor, Dorothy (Technical Monitor)

    2002-01-01

    One of the most significant challenges in large-scale climate modeling, as well as in high-performance computing in other scientific fields, is that of effectively integrating many software models from multiple contributors. A software framework facilitates the integration task, both in the development and runtime stages of the simulation. Effective software frameworks reduce the programming burden for the investigators, freeing them to focus more on the science and less on the parallel communication implementation, while maintaining high performance across numerous supercomputer and workstation architectures. This document surveys numerous software frameworks for potential use in Earth science modeling. Several frameworks are evaluated in depth, including Parallel Object-Oriented Methods and Applications (POOMA), Cactus (from the relativistic physics community), Overture, Goddard Earth Modeling System (GEMS), the National Center for Atmospheric Research Flux Coupler, and UCLA/UCB Distributed Data Broker (DDB). Frameworks evaluated in less detail include ROOT, Parallel Application Workspace (PAWS), and Advanced Large-Scale Integrated Computational Environment (ALICE). A host of other frameworks and related tools are referenced in this context. The frameworks are evaluated individually and also compared with each other.

  9. SciSpark's SRDD : A Scientific Resilient Distributed Dataset for Multidimensional Data

    Science.gov (United States)

    Palamuttam, R. S.; Wilson, B. D.; Mogrovejo, R. M.; Whitehall, K. D.; Mattmann, C. A.; McGibbney, L. J.; Ramirez, P.

    2015-12-01

    Remote sensing data and climate model output are multi-dimensional arrays of massive sizes locked away in heterogeneous file formats (HDF5/4, NetCDF 3/4) and metadata models (HDF-EOS, CF), making it difficult to perform multi-stage, iterative science processing, since each stage requires writing and reading data to and from disk. We have developed SciSpark, a robust Big Data framework that extends Apache Spark for scaling scientific computations. Apache Spark improves on the map-reduce implementation in Apache Hadoop for parallel computing on a cluster by emphasizing in-memory computation, "spilling" to disk only as needed, and relying on lazy evaluation. Central to Spark is the Resilient Distributed Dataset (RDD), an in-memory distributed data structure that extends the functional paradigm provided by the Scala programming language. However, RDDs are ideal for tabular or unstructured data, not for highly dimensional data. The SciSpark project introduces the Scientific Resilient Distributed Dataset (sRDD), a distributed-computing array structure which supports iterative scientific algorithms for multidimensional data. SciSpark processes data stored in NetCDF and HDF files by partitioning them across time or space and distributing the partitions among a cluster of compute nodes. We show the usability and extensibility of SciSpark by implementing distributed algorithms for geospatial operations on large collections of multi-dimensional grids. In particular, we address the problem of scaling an automated method for finding Mesoscale Convective Complexes. SciSpark provides a tensor interface to support the pluggability of different matrix libraries; we evaluate the performance of various matrix libraries, such as Nd4j and Breeze, in distributed pipelines. We detail the architecture and design of SciSpark, our efforts to integrate climate science algorithms, and parallel ingest and partitioning (sharding) of A-Train satellite observations from model grids. These
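
    The sRDD idea — split a multidimensional grid along the time axis and process each partition independently — can be sketched without Spark. The toy below uses plain nested lists and stdlib code only (SciSpark itself runs on Apache Spark and reads NetCDF/HDF; the helper names here are invented for illustration).

    ```python
    # Stdlib-only sketch: partition a (time, lat, lon) grid along the time axis
    # and apply an operation to each partition independently, as a distributed
    # runtime would do across compute nodes.

    def partition_by_time(grid, n_parts):
        """Split a list of time slices into n_parts roughly equal chunks."""
        k, r = divmod(len(grid), n_parts)
        parts, i = [], 0
        for p in range(n_parts):
            size = k + (1 if p < r else 0)   # spread the remainder evenly
            parts.append(grid[i:i + size])
            i += size
        return parts

    def map_partitions(parts, fn):
        """Apply fn to every time slice in every partition (RDD.map stand-in)."""
        return [[fn(slice_) for slice_ in part] for part in parts]

    # 6 time steps of a 2x2 spatial grid, where every cell at time t holds t
    grid = [[[t, t], [t, t]] for t in range(6)]
    parts = partition_by_time(grid, 3)
    # per-slice spatial mean, computed partition-by-partition
    means = map_partitions(parts, lambda s: sum(sum(row) for row in s) / 4)
    ```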

  10. A novel framework of ERP implementation in Indian SMEs: Kernel principal component analysis and intuitionistic Fuzzy TOPSIS driven approach

    Directory of Open Access Journals (Sweden)

    Indranil Ghosh

    2016-04-01

    Over the years, organizations have witnessed a transformational change in the global market place. Integration of operations and partnership have become the key success factors for organizations. In order to achieve inclusive growth while operating in a dynamic, uncertain environment, organizations, irrespective of the scale of business, need to stay connected across the entire value chain. The purpose of this paper is to analyze the Enterprise Resource Planning (ERP) implementation process for Small and Medium Enterprises (SMEs) in India to identify the key enablers. An exhaustive survey of existing literature, as a part of secondary research work, has been conducted in order to identify the critical success factors and the usefulness of ERP implementation in different industrial sectors, and to examine the impact of those factors in Indian SMEs. Kernel Principal Component Analysis (KPCA) has been applied to the survey responses to recognize the key constructs related to Critical Success Factors (CSFs) and the tangible benefits of ERP implementation. An Intuitionistic Fuzzy set theory based Technique of Order Preference by Similarity to Ideal Solution (TOPSIS) method is then used to rank the respective CSFs by mapping their contribution to the benefits realized through implementing ERP. Overall, this work attempts to present a guideline for the ERP adoption process in the said sector, utilizing a framework built upon KPCA and Intuitionistic Fuzzy TOPSIS. The findings of this work can act as guidelines for monitoring the entire ERP implementation project.
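
    For readers unfamiliar with TOPSIS, the classical (crisp, non-fuzzy) version of the ranking step can be sketched in a few lines. The paper uses an intuitionistic-fuzzy variant, so treat this only as the underlying idea: normalize, weight, find the ideal and anti-ideal alternatives, and score each alternative by its relative closeness to the ideal. The matrix and weights below are invented.

    ```python
    import math

    # Classical TOPSIS over benefit criteria only (higher is better).
    # Rows = alternatives (e.g. candidate CSFs); columns = criteria;
    # weights sum to 1. The intuitionistic-fuzzy extension is not shown.

    def topsis(matrix, weights):
        n_alt, n_crit = len(matrix), len(weights)
        # vector-normalize each column, then apply the criterion weights
        norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(n_alt)))
                 for j in range(n_crit)]
        v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n_crit)]
             for i in range(n_alt)]
        ideal = [max(col) for col in zip(*v)]    # best value per criterion
        anti = [min(col) for col in zip(*v)]     # worst value per criterion
        scores = []
        for row in v:
            d_best = math.dist(row, ideal)
            d_worst = math.dist(row, anti)
            scores.append(d_worst / (d_best + d_worst))  # closeness in [0, 1]
        return scores

    # three hypothetical CSFs scored on two benefit criteria
    scores = topsis([[7, 9], [8, 6], [5, 5]], [0.6, 0.4])
    best = max(range(3), key=scores.__getitem__)
    ```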

  11. PCI bus content-addressable-memory (CAM) implementation on FPGA for pattern recognition/image retrieval in a distributed environment

    Science.gov (United States)

    Megherbi, Dalila B.; Yan, Yin; Tanmay, Parikh; Khoury, Jed; Woods, C. L.

    2004-11-01

    Recently, surveillance and Automatic Target Recognition (ATR) applications have been increasing as the cost of the computing power needed to process the massive amount of information continues to fall. This computing power has been made possible partly by the latest advances in FPGAs and SOPCs. In particular, to design and implement state-of-the-art electro-optical imaging systems that provide advanced surveillance capabilities, there is a need to integrate several technologies (e.g. telescopes, precise optics, cameras, and image/computer vision algorithms, which can be geographically distributed or share distributed resources) into programmable and DSP systems. Additionally, pattern recognition techniques and fast information retrieval are often important components of intelligent systems. The aim of this work is to use an embedded FPGA as a fast, configurable and synthesizable search engine for fast image pattern recognition/retrieval in a distributed hardware/software co-design environment. In particular, we propose and demonstrate a low-cost Content Addressable Memory (CAM)-based distributed embedded FPGA hardware architecture with real-time recognition capabilities for pattern look-up, pattern recognition, and image retrieval. We show how the distributed CAM-based architecture offers a performance advantage of an order of magnitude over a RAM (Random Access Memory)-based search for implementing high-speed pattern recognition for image retrieval. The methods of designing, implementing, and analyzing the proposed CAM-based embedded architecture are described here, and other SOPC solutions/design issues are covered. Finally, experimental results, hardware verification, and performance evaluations using both the Xilinx Virtex-II and the Altera Apex20k are provided to show the potential and power of the proposed method for low-cost, reconfigurable, fast image pattern recognition/retrieval at the hardware/software co-design level.
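
    The CAM-versus-RAM contrast the abstract draws can be shown with a software toy: a CAM matches a content word against all stored entries "at once" (modelled here with a hash index), while a RAM-based search must scan addresses sequentially. This is only an analogy in code — FPGA match lines, mismatches on bits, and hardware timing are of course not captured, and all names below are invented.

    ```python
    # Toy contrast: content-addressed lookup vs. address-by-address scan.

    def build_cam(patterns):
        """Index pattern -> address, standing in for parallel CAM match lines."""
        return {p: addr for addr, p in enumerate(patterns)}

    def cam_lookup(cam, word):
        """One-step content match: the matching address, or None on a miss."""
        return cam.get(word)

    def ram_search(patterns, word):
        """Sequential RAM-style scan over every address until a match."""
        for addr, p in enumerate(patterns):
            if p == word:
                return addr
        return None

    patterns = ["0110", "1010", "1111", "0001"]
    cam = build_cam(patterns)
    ```

    Both searches return the same address; the difference is that the scan's cost grows with the number of stored patterns, which is the order-of-magnitude gap the hardware CAM exploits.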

  12. System data structures for on-line distributed data base management system

    International Nuclear Information System (INIS)

    Wade, J.A.

    1981-01-01

    Described herein are the data structures used in implementing a distributed data base management system (DBMS) for the Mirror Fusion Test Facility (MFTF), a part of the Mirror Fusion Energy Program at the Lawrence Livermore National Laboratory. The hardware and software frameworks within which the DBMS have been developed are first described, followed by a brief look at the motivation and fundamental design goals of the system. The structures are given in detail.

  13. System data structures for on-line distributed data base management system

    Energy Technology Data Exchange (ETDEWEB)

    Wade, J.A.

    1981-01-28

    Described herein are the data structures used in implementing a distributed data base management system (DBMS) for the Mirror Fusion Test Facility (MFTF), a part of the Mirror Fusion Energy Program at the Lawrence Livermore National Laboratory. The hardware and software frameworks within which the DBMS have been developed are first described, followed by a brief look at the motivation and fundamental design goals of the system. The structures are given in detail.

  14. Decentralising Multicell Cooperative Processing: A Novel Robust Framework

    Directory of Open Access Journals (Sweden)

    Gesbert David

    2009-01-01

    Multicell cooperative processing (MCP) has the potential to boost spectral efficiency and improve fairness of cellular systems. However, the typical centralised conception of MCP incurs significant infrastructural overheads, which increase system costs and hinder the practical implementation of MCP. In Frequency Division Duplexing systems, each user feeds back its Channel State Information (CSI) only to one Base Station (BS). Therefore, collaborating BSs need to be interconnected via low-latency backhaul links, and a Control Unit is necessary in order to gather user CSI, perform scheduling, and coordinate transmission. In this paper a new framework is proposed that allows MCP on the downlink while circumventing the aforementioned costly modifications to the existing infrastructure of cellular systems. Each MS feeds back its CSI to all collaborating BSs, and the needed operations of user scheduling and signal processing are performed in a distributed fashion by the involved BSs. Furthermore, the proposed framework is shown to be robust against feedback errors when quantized CSI feedback and linear precoding are employed.
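
    The quantized CSI feedback mentioned at the end can be illustrated with a scalar toy: each user quantizes its channel gain to B bits before feeding it back, so all collaborating BSs work from the same coarse value. Real systems quantize vector channel coefficients with codebooks; the uniform quantizer and the numbers below are purely illustrative.

    ```python
    # Minimal, hypothetical B-bit uniform quantizer over [0, g_max].
    # Reconstruction error is bounded by half a quantization step.

    def quantize(gain, b_bits, g_max=1.0):
        levels = 2 ** b_bits
        step = g_max / levels
        idx = min(int(gain / step), levels - 1)  # clamp the top edge
        return (idx + 0.5) * step                # reconstruct at the bin centre

    g = 0.63                                     # "true" channel gain
    g_hat = quantize(g, 4)                       # 16 levels: error < 1/32
    ```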

  15. A framework for sensitivity analysis of decision trees.

    Science.gov (United States)

    Kamiński, Bogumił; Jakubczyk, Michał; Szufel, Przemysław

    2018-01-01

    In the paper, we consider sequential decision problems with uncertainty, represented as decision trees. Sensitivity analysis is always a crucial element of decision making and in decision trees it often focuses on probabilities. In the stochastic model considered, the user often has only limited information about the true values of probabilities. We develop a framework for performing sensitivity analysis of optimal strategies accounting for this distributional uncertainty. We design this robust optimization approach in an intuitive and not overly technical way, to make it simple to apply in daily managerial practice. The proposed framework allows for (1) analysis of the stability of the expected-value-maximizing strategy and (2) identification of strategies which are robust with respect to pessimistic/optimistic/mode-favoring perturbations of probabilities. We verify the properties of our approach in two cases: (a) probabilities in a tree are the primitives of the model and can be modified independently; (b) probabilities in a tree reflect some underlying, structural probabilities, and are interrelated. We provide a free software tool implementing the methods described.
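
    The stability check described in point (1) can be sketched concretely: evaluate each strategy's expected value while sweeping a chance-node probability over an uncertainty interval, and see whether the optimal choice flips. The one-node tree, payoffs, and intervals below are invented for illustration, not taken from the paper.

    ```python
    # A single chance node: a risky lottery (100 with prob. p, else 0)
    # versus a sure payoff of 50. Sweep p over an interval and test whether
    # the expected-value-maximizing strategy is stable on that interval.

    def expected_value(p_success, payoff_success, payoff_failure):
        return p_success * payoff_success + (1 - p_success) * payoff_failure

    def best_strategy(p):
        """Risky lottery vs. a sure payoff of 50, at success probability p."""
        risky = expected_value(p, 100, 0)
        return "risky" if risky > 50 else "safe"

    def is_stable(p_low, p_high, step=0.01):
        """True if the optimal strategy is the same over the whole interval."""
        choices = set()
        p = p_low
        while p <= p_high + 1e-9:
            choices.add(best_strategy(p))
            p += step
        return len(choices) == 1

    stable_high = is_stable(0.6, 0.8)   # risky dominates throughout
    stable_wide = is_stable(0.4, 0.7)   # the optimum flips inside the interval
    ```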

  16. Legal framework for implementation of m-government in Ethiopia ...

    African Journals Online (AJOL)

    Higher penetration of mobile services in many countries, including Ethiopia, makes m-Government an eminent technological option for delivering government services to public and businesses. Although the Ethiopian government has introduced e-government services to the public, the legal framework to support such ...

  17. Pattern-based framework for data acquisition systems

    International Nuclear Information System (INIS)

    Padmini, S.; Diwakar, M.P.; Nair, Preetha; Mathew, R.

    2004-01-01

    The data acquisition framework implements a reusable abstract architectural design for use in the development of data acquisition systems. The framework is being used to build the Flux Mapping System (FMS) for TAPS III-IV and the RRS Data Acquisition System for the Dhruva reactor.

  18. A survey and experimental comparison of distributed SPARQL engines for very large RDF data

    KAUST Repository

    Abdelaziz, Ibrahim; Harbi, Razen; Khayyat, Zuhair; Kalnis, Panos

    2017-01-01

    Distributed SPARQL engines promise to support very large RDF datasets by utilizing shared-nothing computer clusters. Some are based on distributed frameworks such as MapReduce; others implement proprietary distributed processing; and some rely on expensive preprocessing for data partitioning. These systems exhibit a variety of trade-offs that are not well-understood, due to the lack of any comprehensive quantitative and qualitative evaluation. In this paper, we present a survey of 22 state-of-the-art systems that cover the entire spectrum of distributed RDF data processing and categorize them by several characteristics. Then, we select 12 representative systems and perform extensive experimental evaluation with respect to preprocessing cost, query performance, scalability and workload adaptability, using a variety of synthetic and real large datasets with up to 4.3 billion triples. Our results provide valuable insights for practitioners to understand the trade-offs for their usage scenarios. Finally, we publish online our evaluation framework, including all datasets and workloads, for researchers to compare their novel systems against the existing ones.
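
    The core operation these engines distribute is matching SPARQL triple patterns against a set of (subject, predicate, object) triples. The stdlib sketch below (variables written `?name`, all data invented) shows that single-pattern step; real engines add join planning, data partitioning, and cluster execution on top of it.

    ```python
    # Match one SPARQL-style triple pattern against an in-memory triple set,
    # returning a variable binding per matching triple. Repeated variables
    # in a pattern must bind to the same value.

    def match_pattern(triples, pattern):
        results = []
        for triple in triples:
            binding = {}
            for term, value in zip(pattern, triple):
                if term.startswith("?"):
                    if binding.setdefault(term, value) != value:
                        break                  # variable bound inconsistently
                elif term != value:
                    break                      # constant term mismatch
            else:
                results.append(dict(binding))  # all three positions matched
        return results

    triples = [
        ("alice", "knows", "bob"),
        ("bob", "knows", "carol"),
        ("alice", "likes", "carol"),
    ]
    who_alice_knows = match_pattern(triples, ("alice", "knows", "?x"))
    ```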

  19. A survey and experimental comparison of distributed SPARQL engines for very large RDF data

    KAUST Repository

    Abdelaziz, Ibrahim

    2017-10-19

    Distributed SPARQL engines promise to support very large RDF datasets by utilizing shared-nothing computer clusters. Some are based on distributed frameworks such as MapReduce; others implement proprietary distributed processing; and some rely on expensive preprocessing for data partitioning. These systems exhibit a variety of trade-offs that are not well-understood, due to the lack of any comprehensive quantitative and qualitative evaluation. In this paper, we present a survey of 22 state-of-the-art systems that cover the entire spectrum of distributed RDF data processing and categorize them by several characteristics. Then, we select 12 representative systems and perform extensive experimental evaluation with respect to preprocessing cost, query performance, scalability and workload adaptability, using a variety of synthetic and real large datasets with up to 4.3 billion triples. Our results provide valuable insights for practitioners to understand the trade-offs for their usage scenarios. Finally, we publish online our evaluation framework, including all datasets and workloads, for researchers to compare their novel systems against the existing ones.

  20. The Symbiose project: an integrated framework for performing environmental radiological risk assessment

    International Nuclear Information System (INIS)

    Gonze, M.A.; Mourlon, C.; Garcia-Sanchez, L.; Beaugelin, K.; Chen, T.; Le Dizes, S.

    2004-01-01

    Human health and ecological risk assessments usually require the integration of a wide range of environmental data and modelling approaches, with a level of detail dependent on the management objectives, the complexity of the site, and the level of ignorance about the pollutant's behaviour/toxicity. Like many scientists and assessors in recent years, we recognized the need for developing comprehensive, integrated and flexible approaches to risk assessment. To meet these needs, IRSN launched the Symbiose project (2002-2006), which aims, first, at designing a framework for integrating and managing data, methods and knowledge of relevance to studies assessing radiological risk to humans and biota, and, second, at implementing this framework in an information management, modelling and calculation platform. Feasibility developments (currently completed) led to the specification of a fully integrated, object-oriented and hierarchical approach for describing the fate, transport and effects of radionuclides in spatially-distributed environmental systems. This innovative approach has then been implemented in a platform prototype, the main components of which are a user-friendly and modular simulation environment (e.g. using the GoldSim toolbox) and a hierarchical object-oriented biosphere database. Both conceptual and technical developments will be presented here. (author)