WorldWideScience

Sample records for framework implementing distributed

  1. Framework for implementation of maintenance management in distribution network service providers

    International Nuclear Information System (INIS)

    Gomez Fernandez, Juan Francisco; Crespo Marquez, Adolfo

    2009-01-01

    Distribution network service providers (DNSP) are companies dealing with network infrastructure, such as distribution of gas, water, electricity or telecommunications, and they require the development of special maintenance management (MM) capabilities in order to satisfy the needs of their customers. In this sector, maintenance management information systems are essential to ensure control, gain knowledge and improve decision making. The aim of this paper is to study the specific characteristics of maintenance in these types of companies. We investigate existing standards and best management practices with the aim of defining a suitable ad-hoc framework for the implementation of maintenance management. The conclusions of the work support the proposition of a framework consisting of a process framework based on a structure of systems, integrated for continuous improvement of maintenance activities. The paper offers a very practical approach to the problem, the result of more than 10 years of professional experience within this sector, and is especially focused on network maintenance.

  2. A novel optimal distribution system planning framework implementing distributed generation in a deregulated electricity market

    International Nuclear Information System (INIS)

    Porkar, S.; Poure, P.; Abbaspour-Tehrani-fard, A.; Saadate, S.

    2010-01-01

    This paper introduces a new framework, comprising a mathematical model and a new software package interfacing two powerful tools (MATLAB and GAMS), for obtaining optimal distributed generation (DG) capacity sizing and siting investments, with the capability to simulate large distribution system planning problems. The proposed optimization model minimizes total system planning costs: DG investment, DG operation and maintenance, purchase of power by the distribution companies (DISCOs) from transmission companies (TRANSCOs), and system power losses. The model provides not only the DG size and site but also the new market price. Three different cases depending on system conditions, and three different scenarios depending on planning alternatives and electricity market structures, have been considered. They validate the economic and electrical benefits of introducing DG by solving the distribution system planning problem and by improving the power quality of the distribution system. DG installation increases the feeders' lifetime by reducing their loading, and adds the benefit of using the existing distribution system for further load growth without the need for feeder upgrades. Moreover, by investing in DG, the DISCO can minimize its total planning cost and reduce its customers' bills. (author)
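
    The cost trade-off this abstract describes (DG capex versus purchased power and losses) can be illustrated with a tiny enumeration. The feeder data, cost figures and the loss approximation below are all invented for the sketch; the paper's actual model is a MATLAB/GAMS optimization, not this brute-force search.

```python
# Toy brute-force DG siting/sizing on a 4-bus radial feeder: all data and
# cost figures are invented, and the loss formula is a crude I^2*R
# approximation, not the paper's MATLAB/GAMS planning model.

LOADS = [1.0, 1.5, 2.0, 0.5]     # MW demand at buses 0..3 (illustrative)
R_LINE = [0.2, 0.3, 0.4, 0.5]    # ohms for the line section feeding each bus
V_KV = 12.66                     # nominal line-to-line voltage
LOSS_COST = 5000.0               # $ per MW of losses (annualized, made up)
DG_COST = 60.0                   # $ per MW of installed DG (annualized, made up)

def losses_mw(dg_bus, dg_mw):
    """Sum I^2*R losses over line sections; DG at `dg_bus` offsets the
    power flowing through every section upstream of it."""
    total = 0.0
    for sec in range(len(LOADS)):
        flow = sum(LOADS[sec:])                  # MW carried by this section
        if dg_bus is not None and dg_bus >= sec:
            flow -= dg_mw
        i_ka = abs(flow) / (3 ** 0.5 * V_KV)     # rough current estimate
        total += 3 * i_ka ** 2 * R_LINE[sec]
    return total

def plan_cost(dg_bus, dg_mw):
    return LOSS_COST * losses_mw(dg_bus, dg_mw) + DG_COST * dg_mw

base = plan_cost(None, 0.0)                      # no-DG planning cost
candidates = [(b, s) for b in range(4) for s in (0.5, 1.0, 1.5, 2.0)]
best = min(candidates, key=lambda bs: plan_cost(*bs))
print(f"no-DG: {base:.1f}  best: bus {best[0]}, {best[1]} MW -> {plan_cost(*best):.1f}")
```

    With these made-up numbers the search picks a mid-feeder DG unit whose loss savings outweigh its cost, mirroring the feeder-loading argument in the abstract.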

  3. A novel optimal distribution system planning framework implementing distributed generation in a deregulated electricity market

    Energy Technology Data Exchange (ETDEWEB)

    Porkar, S. [Department of Electrical Engineering, Sharif University of Technology, Tehran (Iran); Groupe de Recherches en Electrotechnique et Electronique de Nancy, GREEN-UHP, Universite Henri Poincare de Nancy I, BP 239, 54506 Vandoeuvre les Nancy Cedex (France); Poure, P. [Laboratoire d' Instrumentation Electronique de Nancy, LIEN, EA 3440, Universite Henri Poincare de Nancy I, BP 239, 54506 Vandoeuvre les Nancy Cedex (France); Abbaspour-Tehrani-fard, A. [Department of Electrical Engineering, Sharif University of Technology, Tehran (Iran); Saadate, S. [Groupe de Recherches en Electrotechnique et Electronique de Nancy, GREEN-UHP, Universite Henri Poincare de Nancy I, BP 239, 54506 Vandoeuvre les Nancy Cedex (France)

    2010-07-15

    This paper introduces a new framework, comprising a mathematical model and a new software package interfacing two powerful tools (MATLAB and GAMS), for obtaining optimal distributed generation (DG) capacity sizing and siting investments, with the capability to simulate large distribution system planning problems. The proposed optimization model minimizes total system planning costs: DG investment, DG operation and maintenance, purchase of power by the distribution companies (DISCOs) from transmission companies (TRANSCOs), and system power losses. The model provides not only the DG size and site but also the new market price. Three different cases depending on system conditions, and three different scenarios depending on planning alternatives and electricity market structures, have been considered. They validate the economic and electrical benefits of introducing DG by solving the distribution system planning problem and by improving the power quality of the distribution system. DG installation increases the feeders' lifetime by reducing their loading, and adds the benefit of using the existing distribution system for further load growth without the need for feeder upgrades. Moreover, by investing in DG, the DISCO can minimize its total planning cost and reduce its customers' bills. (author)

  4. Algorithm and Implementation of Distributed ESN Using Spark Framework and Parallel PSO

    Directory of Open Access Journals (Sweden)

    Kehe Wu

    2017-04-01

    The echo state network (ESN) employs a huge reservoir with sparsely and randomly connected internal nodes and trains only the output weights, which avoids the suboptimality, exploding and vanishing gradients, high complexity and other disadvantages faced by traditional recurrent neural network (RNN) training. In light of its outstanding adaptation to nonlinear dynamical systems, the ESN has been applied in a wide range of applications. However, in the era of Big Data, with an enormous amount of data being generated continuously every day, data are often distributed when stored in real applications, and thus the centralized ESN training process becomes technologically unsuitable. To meet the requirements of real-world Big Data applications, in this study we propose an algorithm, and its implementation, for distributed ESN training. The algorithm is based on the parallel particle swarm optimization (P-PSO) technique, and the implementation uses Spark, a well-known large-scale data processing framework. Four very large datasets, including artificial benchmarks, real-world data and image data, are adopted to verify our framework on a scalable platform. Experimental results indicate that the proposed work performs well with respect to speed, accuracy and generalization capability.
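
    The training scheme the abstract describes (fixed random reservoir, readout weights found by PSO rather than gradient descent) can be sketched serially on one machine. The paper's contribution is distributing the particle fitness evaluations over Spark, which this toy version does not attempt; all sizes and hyperparameters below are invented.

```python
# Serial toy of PSO-trained ESN readout weights on a sine prediction task.
# Invented sizes/hyperparameters; the paper evaluates particles in parallel
# on Spark, whereas this sketch evaluates them in a plain loop.
import numpy as np

rng = np.random.default_rng(0)

# Toy task: one-step-ahead prediction of a sine wave.
u = np.sin(np.linspace(0, 8 * np.pi, 400))
target = np.roll(u, -1)[:-1]

# Echo state network: fixed random reservoir, only the readout is trained.
N = 30
W_in = rng.uniform(-0.5, 0.5, N)
W = rng.uniform(-0.5, 0.5, (N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))       # spectral radius < 1

states = np.zeros((len(u) - 1, N))
x = np.zeros(N)
for t in range(len(u) - 1):
    x = np.tanh(W_in * u[t] + W @ x)            # reservoir update
    states[t] = x

def mse(w_out):
    return float(np.mean((states @ w_out - target) ** 2))

# Plain PSO over the readout weight vector.
P, iters = 20, 60
pos = rng.uniform(-1, 1, (P, N))
vel = np.zeros((P, N))
pbest, pbest_f = pos.copy(), np.array([mse(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()
init_err = pbest_f.min()

for _ in range(iters):
    r1, r2 = rng.random((P, N)), rng.random((P, N))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    f = np.array([mse(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print(f"readout MSE: {init_err:.4f} -> {mse(gbest):.4f}")
```

    In a distributed variant, the `mse` evaluations for the particle population would be the natural unit of work to farm out to executors, since the reservoir states are fixed once collected.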

  5. A framework for implementing a Distributed Intrusion Detection System (DIDS) with interoperability and information analysis

    OpenAIRE

    Davicino, Pablo; Echaiz, Javier; Ardenghi, Jorge Raúl

    2011-01-01

    Computer Intrusion Detection Systems (IDS) are primarily designed to protect availability, confidentiality and integrity of critical information infrastructures. A Distributed IDS (DIDS) consists of several IDS over a large network(s), all of which communicate with each other, with a central server or with a cluster of servers that facilitates advanced network monitoring. In a distributed environment, DIDS are implemented using cooperative intelligent sensors distributed across the network(s). ...

  6. Developing frameworks for protocol implementation

    NARCIS (Netherlands)

    de Barros Barbosa, C.; de barros Barbosa, C.; Ferreira Pires, Luis

    1999-01-01

    This paper presents a method to develop frameworks for protocol implementation. Frameworks are software structures developed for a specific application domain, which can be reused in the implementation of various concrete systems in this domain. The use of frameworks supports a protocol

  7. Modeling and Implementation of Cattle/Beef Supply Chain Traceability Using a Distributed RFID-Based Framework in China

    Science.gov (United States)

    Liang, Wanjie; Cao, Jing; Fan, Yan; Zhu, Kefeng; Dai, Qiwei

    2015-01-01

    In recent years, traceability systems have been developed as effective tools for improving the transparency of supply chains, thereby guaranteeing the quality and safety of food products. In this study, we propose a cattle/beef supply chain traceability model and a traceability system based on radio frequency identification (RFID) technology and the EPCglobal network. First, the transformations of traceability units are defined and analyzed throughout the cattle/beef chain. Second, we describe in detail the internal and external traceability information acquisition, transformation, and transmission processes throughout the beef supply chain, and explain a methodology for modeling traceability information using the electronic product code information service (EPCIS) framework. The traceability system is then implemented based on the Fosstrak and FreePastry software packages, with animal ear tag codes and electronic product codes (EPC) employed to identify traceability units. Finally, a cattle/beef supply chain comprising a breeding business, a slaughter and processing business, a distribution business and a sales outlet is used as a case study to evaluate the traceability system. The results demonstrate that the major advantages of the system are the effective sharing of information among businesses and the gapless traceability of the cattle/beef supply chain. PMID:26431340
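
    The "gapless traceability" idea, i.e. walking transformation events from a sales unit back to the animal, can be sketched with a plain event list. The codes, step names and record layout below are invented; a real deployment would query EPCIS transformation events served by Fosstrak, as in the paper.

```python
# Invented event list illustrating gapless trace-back across transformed
# traceability units (animal -> carcass -> cut -> case); a real system
# would query EPCIS transformation events via Fosstrak, with genuine EPCs.

EVENTS = [
    {"step": "breeding",     "output": "cattle:EARTAG-001", "inputs": []},
    {"step": "slaughter",    "output": "carcass:C-17",      "inputs": ["cattle:EARTAG-001"]},
    {"step": "processing",   "output": "cut:B-301",         "inputs": ["carcass:C-17"]},
    {"step": "distribution", "output": "case:D-88",         "inputs": ["cut:B-301"]},
]
BY_OUTPUT = {e["output"]: e for e in EVENTS}

def trace_back(unit):
    """Walk transformation events from a sales-side unit back to its origin."""
    chain, frontier = [], [unit]
    while frontier:
        uid = frontier.pop()
        ev = BY_OUTPUT.get(uid)
        if ev:
            chain.append((ev["step"], uid))
            frontier.extend(ev["inputs"])       # follow the input units upstream
    return chain

for step, uid in trace_back("case:D-88"):
    print(f"{step:12s} {uid}")
```

    The trace is "gapless" exactly when every transformation along the chain recorded its input units, which is what the paper's internal/external traceability modeling is meant to guarantee.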

  8. Modeling and Implementation of Cattle/Beef Supply Chain Traceability Using a Distributed RFID-Based Framework in China.

    Science.gov (United States)

    Liang, Wanjie; Cao, Jing; Fan, Yan; Zhu, Kefeng; Dai, Qiwei

    2015-01-01

    In recent years, traceability systems have been developed as effective tools for improving the transparency of supply chains, thereby guaranteeing the quality and safety of food products. In this study, we propose a cattle/beef supply chain traceability model and a traceability system based on radio frequency identification (RFID) technology and the EPCglobal network. First, the transformations of traceability units are defined and analyzed throughout the cattle/beef chain. Second, we describe in detail the internal and external traceability information acquisition, transformation, and transmission processes throughout the beef supply chain, and explain a methodology for modeling traceability information using the electronic product code information service (EPCIS) framework. The traceability system is then implemented based on the Fosstrak and FreePastry software packages, with animal ear tag codes and electronic product codes (EPC) employed to identify traceability units. Finally, a cattle/beef supply chain comprising a breeding business, a slaughter and processing business, a distribution business and a sales outlet is used as a case study to evaluate the traceability system. The results demonstrate that the major advantages of the system are the effective sharing of information among businesses and the gapless traceability of the cattle/beef supply chain.

  9. Design and Implement a MapReduce Framework for Executing Standalone Software Packages in Hadoop-based Distributed Environments

    Directory of Open Access Journals (Sweden)

    Chao-Chun Chen

    2013-12-01

    The Hadoop MapReduce programming model is used to design auto-scalable distributed computing applications and provides developers with an effective environment for attaining automatic parallelization. However, most existing manufacturing systems are arduous and restrictive to migrate to a MapReduce private cloud, due to platform incompatibility and the tremendous complexity of system reconstruction. To increase the efficiency of manufacturing systems with minimal modification of existing systems, we design a framework in this work, called MC-Framework: Multi-user-based Cloudizing-Application Framework. It provides a simple interface that lets users fairly execute tasks built on traditional standalone software packages in MapReduce-based private cloud environments. Moreover, this work focuses on multiuser workloads, for which the default Hadoop scheduling scheme, i.e., FIFO, would increase delay. Hence, we also propose a new scheduling mechanism, called Job-Sharing Scheduling, to explore and fairly share jobs among machines in the MapReduce-based private cloud. We then prototype an experimental virtual-metrology module of a manufacturing system as a case study to verify and analyze the proposed MC-Framework. The results of our experiments indicate that the proposed framework greatly improves time performance compared with the original package.
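
    A minimal simulation can illustrate why interleaving jobs from different users tends to beat FIFO on average start delay in a multiuser cluster. The workload, the two-machine model and the round-robin policy below are invented stand-ins, not the paper's actual Job-Sharing Scheduling algorithm.

```python
# Invented two-machine simulation contrasting FIFO with a user-interleaving
# ("job-sharing"-style) dispatch order; only the fairness intuition is real.
from collections import deque

JOBS = [("alice", 8), ("alice", 8), ("alice", 8), ("bob", 2), ("carol", 2)]

def avg_start_delay(order, machines=2):
    free_at = [0] * machines
    delays = []
    for _user, dur in order:
        m = free_at.index(min(free_at))   # dispatch to earliest-free machine
        delays.append(free_at[m])         # how long the job waited to start
        free_at[m] += dur
    return sum(delays) / len(delays)

def interleave_by_user(jobs):
    """Round-robin over per-user queues so short jobs from other users
    are not stuck behind one user's long batch."""
    queues = {}
    for user, dur in jobs:
        queues.setdefault(user, deque()).append((user, dur))
    order = []
    while any(queues.values()):
        for q in queues.values():
            if q:
                order.append(q.popleft())
    return order

fifo = avg_start_delay(JOBS)
shared = avg_start_delay(interleave_by_user(JOBS))
print(f"avg start delay  FIFO: {fifo:.1f}  interleaved: {shared:.1f}")
```

    On this toy workload the short jobs of bob and carol start almost immediately under interleaving instead of queuing behind alice's batch, which is the delay effect the abstract attributes to FIFO.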

  10. Flexible investment under uncertainty in smart distribution networks with demand side response: Assessment framework and practical implementation

    International Nuclear Information System (INIS)

    Schachter, Jonathan A.; Mancarella, Pierluigi; Moriarty, John; Shaw, Rita

    2016-01-01

    Classical deterministic models applied to investment valuation in distribution networks may not be adequate for a range of real-world decision-making scenarios, as they effectively ignore the uncertainty in the most important variables driving network planning (e.g., load growth). As greater uncertainty is expected from growing distributed energy resources in distribution networks, there is an increasing risk of investing in too much or too little network capacity, causing the stranding and inefficient use of network assets; these costs are then passed on to the end-user. An alternative emerging solution in the context of smart grid development is to release untapped network capacity through Demand-Side Response (DSR). However, to date there is no approach able to quantify the value of ‘smart’ DSR solutions against ‘conventional’ asset-heavy investments. On these premises, this paper presents a general real options framework and a novel probabilistic tool for the economic assessment of DSR for smart distribution network planning under uncertainty, which allows the modeling and comparison of multiple investment strategies, including DSR and capacity reinforcements, based on different cost and risk metrics. In particular, the model provides an explicit quantification of the economic value of DSR against alternative investment strategies. Through sensitivity analysis, it is able to indicate the maximum price payable for the DSR service such that DSR remains economically optimal against these alternatives. The proposed model thus provides regulators with clear insights for overseeing DSR contractual arrangements. Further, it highlights that differences exist between the economic perspectives of the regulated DNO business and of customers. Our proposed model is therefore capable of highlighting instances where a particular investment strategy is favorable to the DNO but not to its customers, or vice versa, and thus aspects of the regulatory framework which may
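
    The "maximum price payable for DSR" idea can be illustrated with a toy Monte Carlo: simulate uncertain load growth, estimate the expected overload energy, and compare the resulting DSR bill with the reinforcement capex. All figures below are invented, and the sketch ignores discounting and the cost/risk metrics of the paper's real options model.

```python
# Toy Monte Carlo for a break-even DSR price: all figures invented,
# no discounting, no risk metrics -- only the qualitative idea that DSR
# beats reinforcement while (price x expected overload) < capex.
import random

random.seed(1)
CAP = 10.0               # existing network capacity, MW
START_LOAD = 8.0         # peak load today, MW
YEARS, TRIALS = 10, 2000
REINFORCE_COST = 400.0   # one-off capex of the conventional reinforcement

def expected_overload():
    """Expected MW-years above capacity over the horizon, under
    uncertain (Gaussian) annual load growth."""
    total = 0.0
    for _ in range(TRIALS):
        load = START_LOAD
        for _ in range(YEARS):
            load *= 1 + random.gauss(0.03, 0.04)   # uncertain growth
            total += max(0.0, load - CAP)          # capacity shortfall this year
        # (per-trial shortfalls accumulate into the grand total)
    return total / TRIALS

ov = expected_overload()
max_dsr_price = REINFORCE_COST / ov   # above this price, reinforcement wins
print(f"expected overload {ov:.2f} MW-years; break-even DSR price {max_dsr_price:.0f}")
```

    The sensitivity-analysis role described in the abstract corresponds to sweeping the DSR price (or the growth distribution) and watching where the two strategies cross.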

  11. Distributed Energy Implementation Options

    Energy Technology Data Exchange (ETDEWEB)

    Shah, Chandralata N [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-13

    This presentation covers the options for implementing distributed energy projects. It distinguishes between options available for distributed energy that is government owned versus privately owned, with a focus on the privately owned options including Energy Savings Performance Contract Energy Sales Agreements (ESPC ESAs). The presentation covers the new ESPC ESA Toolkit and other Federal Energy Management Program resources.

  12. Invertebrate distribution patterns and river typology for the implementation of the water framework directive in Martinique, French Lesser Antilles

    Directory of Open Access Journals (Sweden)

    Bernadet C.

    2013-03-01

    Over the past decade, Europe’s Water Framework Directive has provided compelling reasons for developing tools for the biological assessment of freshwater ecosystem health in member states. Yet the lack of published studies for Europe’s overseas regions reflects minimal knowledge of the distribution patterns of aquatic species in the Community’s outermost areas. Benthic invertebrates (84 taxa) and land-cover, physical habitat and water chemistry descriptors (26 variables) were recorded at fifty-one stations in Martinique, French Lesser Antilles. Canonical Correspondence Analysis and Ward’s algorithm were used to bring out patterns in community structure in relation to environmental conditions, and variation partitioning was used to quantify the influence of geomorphology and anthropogenic disturbance on invertebrate communities. Species richness decreased from headwater to lowland streams, and species composition changed from northern to southern areas. The proportion of variation explained by geomorphological variables was globally higher than that explained by anthropogenic variables. Geomorphology and land cover played key roles in delineating ecological sub-regions for the freshwater biota. Despite this, and the small surface area of Martinique (1080 km2), invertebrate communities showed a clear spatial turnover in composition and biological traits (e.g., insects, crustaceans and molluscs) in relation to natural conditions.

  13. DIRAC distributed secure framework

    International Nuclear Information System (INIS)

    Casajus, A; Graciani, R

    2010-01-01

    DIRAC, the LHCb community Grid solution, provides access to a vast amount of computing and storage resources to a large number of users. In DIRAC users are organized in groups with different needs and permissions. In order to ensure that only allowed users can access the resources and to enforce that there are no abuses, security is mandatory. All DIRAC services and clients use secure connections that are authenticated using certificates and grid proxies. Once a client has been authenticated, authorization rules are applied to the requested action based on the presented credentials. These authorization rules and the list of users and groups are centrally managed in the DIRAC Configuration Service. Users submit jobs to DIRAC using their local credentials. From then on, DIRAC has to interact with different Grid services on behalf of this user. DIRAC has a proxy management service where users upload short-lived proxies to be used when DIRAC needs to act on behalf of them. Long duration proxies are uploaded by users to a MyProxy service, and DIRAC retrieves new short delegated proxies when necessary. This contribution discusses the details of the implementation of this security infrastructure in DIRAC.
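
    The authorization step described above, mapping an authenticated identity to a group and checking the group's permissions against the requested action, can be sketched as a pair of lookups. The DNs, group names, properties and rule layout below are invented; real DIRAC keeps users, groups and authorization rules in its Configuration Service and authenticates with certificates and grid proxies.

```python
# Invented sketch of DN -> group -> property authorization in the style
# the abstract describes; names and rules here are illustrative only.

REGISTRY = {
    "users": {"/DC=ch/CN=alice": "lhcb_prod", "/DC=ch/CN=bob": "lhcb_user"},
    "groups": {
        "lhcb_prod": {"NormalUser", "ProductionManagement"},
        "lhcb_user": {"NormalUser"},
    },
}
# Property each service action requires (made-up action names).
REQUIRED = {
    "JobManager.submit": "NormalUser",
    "Transformation.clean": "ProductionManagement",
}

def authorize(dn, action):
    """Authorization after authentication: map the certificate DN to its
    group, then check the group's properties against the action's rule."""
    group = REGISTRY["users"].get(dn)
    props = REGISTRY["groups"].get(group, set())
    return REQUIRED.get(action) in props

print("alice may clean:", authorize("/DC=ch/CN=alice", "Transformation.clean"))
print("bob may clean:  ", authorize("/DC=ch/CN=bob", "Transformation.clean"))
```

    Unknown DNs and unknown actions both fall through to a deny, which matches the deny-by-default posture a central authorization registry is meant to enforce.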

  14. Corporate compliance: framework and implementation.

    Science.gov (United States)

    Fowler, N

    1999-01-01

    The federal government has created numerous programs to combat fraud and abuse. The government now encourages healthcare facilities to have a corporate compliance program (CCP), a plan that reduces the chances that the facility will violate laws or regulations. A CCP is an organization-wide program comprised of a code of conduct and written policies, internal monitoring and auditing standards, employee training, feedback mechanisms and other features, all designed to prevent and detect violations of governmental laws, regulations and policies. It is a system or method ensuring that employees understand and will comply with laws that apply to what they do every day. Seven factors, based on federal sentencing guidelines, provide the framework for developing a CCP. First, a facility must establish rules that are reasonably capable of reducing criminal conduct. Second, high-level personnel must oversee the compliance effort. Third, a facility must use due care in delegating authority in the compliance initiative. Fourth, standards must be communicated effectively to employees, and fifth, a facility must take reasonable steps to achieve compliance. Sixth, standards must be enforced consistently across the organization and last, standards must be modified or changed for reported concerns, to ensure they are not repeated. PROMINA Health System, Inc. in Atlanta, Ga., designed a program to meet federal guidelines. It started with a self-assessment to define its areas or risk. Next, it created the internal structure and assigned organizational responsibility for running the CCP. PROMINA then developed standards of business and professional conduct, established vehicles of communication and trained employees on the standards. Finally, it continues to develop evidence of the program's effectiveness by monitoring and documenting its compliance activities.

  15. A Review of Telehealth Service Implementation Frameworks

    Directory of Open Access Journals (Sweden)

    Liezl Van Dyk

    2014-01-01

    Despite the potential of telehealth services to increase the quality and accessibility of healthcare, the success rate of such services has been disappointing. The purpose of this paper is to find and compare existing frameworks for the implementation of telehealth services that can contribute to the success rate of future endeavors. After a thorough discussion of these frameworks, the paper outlines their development methodologies in terms of theoretical background, methodology and validation. Finally, common themes and formats are identified for consideration in future implementations. It was confirmed that a holistic implementation approach is needed, one that includes technology, organizational structures, change management, economic feasibility, societal impacts, perceptions, user-friendliness, evaluation and evidence, legislation, policy and governance. Furthermore, there is scope for more scientifically rigorous framework development and validation approaches.

  16. THE LABVIEW RADE FRAMEWORK DISTRIBUTED ARCHITECTURE

    CERN Document Server

    Andreassen, O O; Raimondo, A; Rijllart, A; Shaipov, V; Sorokoletov, R

    2011-01-01

    For accelerator GUI applications there is a need for a rapid development environment to create expert tools or to prototype operator applications. Typically a variety of tools are used, such as Matlab or Excel, but their scope is limited, either because of their low flexibility or their limited integration into the accelerator infrastructure. In addition, having several tools obliges users to deal with different programming techniques and data structures. We have addressed these limitations by using LabVIEW, extending it with interfaces to C++ and Java. In this way it fulfils the requirements of ease of use, flexibility and connectivity that make up what we refer to as the RADE framework. Recent application requirements could only be met by implementing a distributed architecture with multiple servers running multiple services. This brought the additional advantages of redundant services, increased availability and transparent updates. We will present two applications requiring high a...

  17. Distributed security framework for modern workforce

    Energy Technology Data Exchange (ETDEWEB)

    Balatsky, G.; Scherer, C. P., E-mail: gbalatsky@lanl.gov, E-mail: scherer@lanl.gov [Los Alamos National Laboratory, Los Alamos, NM (United States)

    2014-07-01

    Safe and sustainable nuclear power production depends on strict adherence to nuclear security as a necessary prerequisite for nuclear power. This paper considers the current challenges for nuclear security, and proposes a conceptual framework to address those challenges. We identify several emerging factors that affect nuclear security: 1. Relatively high turnover rates in the nuclear workforce compared to the earlier years of the nuclear industry, when nuclear workers were more likely to have secure employment, a lifelong career at one company, and retirement on a pension plan. 2. Vulnerabilities stemming from the ubiquitous presence of modern electronics and their patterns of use by the younger workforce. 3. Modern management practices, including outsourcing and short-term contracting (which relates to number 1 above). In such a dynamic and complex environment, nuclear security personnel alone cannot effectively guarantee adequate security. We propose that one solution to this emerging situation is a distributed security model in which the components of nuclear security become the responsibility of each and every worker at a nuclear facility. To implement this model, there needs to be a refurbishment of current workforce training and mentoring practices. The paper will present an example of distributed security framework model, and how it may look in practice. (author)

  18. Distributed security framework for modern workforce

    International Nuclear Information System (INIS)

    Balatsky, G.; Scherer, C. P.

    2014-01-01

    Safe and sustainable nuclear power production depends on strict adherence to nuclear security as a necessary prerequisite for nuclear power. This paper considers the current challenges for nuclear security, and proposes a conceptual framework to address those challenges. We identify several emerging factors that affect nuclear security: 1. Relatively high turnover rates in the nuclear workforce compared to the earlier years of the nuclear industry, when nuclear workers were more likely to have secure employment, a lifelong career at one company, and retirement on a pension plan. 2. Vulnerabilities stemming from the ubiquitous presence of modern electronics and their patterns of use by the younger workforce. 3. Modern management practices, including outsourcing and short-term contracting (which relates to number 1 above). In such a dynamic and complex environment, nuclear security personnel alone cannot effectively guarantee adequate security. We propose that one solution to this emerging situation is a distributed security model in which the components of nuclear security become the responsibility of each and every worker at a nuclear facility. To implement this model, there needs to be a refurbishment of current workforce training and mentoring practices. The paper will present an example of distributed security framework model, and how it may look in practice. (author)

  19. Distributed mobility management - framework & analysis

    NARCIS (Netherlands)

    Liebsch, M.; Seite, P.; Karagiannis, Georgios

    2013-01-01

    Mobile operators consider the distribution of mobility anchors to enable offloading some traffic from their core network. The Distributed Mobility Management (DMM) Working Group is investigating the impact of decentralized mobility management to existing protocol solutions, while taking into account

  20. A Software Rejuvenation Framework for Distributed Computing

    Science.gov (United States)

    Chau, Savio

    2009-01-01

    A performability-oriented conceptual framework for software rejuvenation has been constructed as a means of increasing levels of reliability and performance in distributed stateful computing. As used here, performability-oriented signifies that the construction of the framework is guided by the concept of analyzing the ability of a given computing system to deliver services with gracefully degradable performance. The framework is especially intended to support applications that involve stateful replicas of server computers.

  1. Arcade: A Web-Java Based Framework for Distributed Computing

    Science.gov (United States)

    Chen, Zhikai; Maly, Kurt; Mehrotra, Piyush; Zubair, Mohammad; Bushnell, Dennis M. (Technical Monitor)

    2000-01-01

    Distributed heterogeneous environments are being increasingly used to execute a variety of large size simulations and computational problems. We are developing Arcade, a web-based environment to design, execute, monitor, and control distributed applications. These targeted applications consist of independent heterogeneous modules which can be executed on a distributed heterogeneous environment. In this paper we describe the overall design of the system and discuss the prototype implementation of the core functionalities required to support such a framework.

  2. Dreams: a framework for distributed synchronous coordination

    NARCIS (Netherlands)

    Proença, J.; Clarke, D.; Vink, de E.P.; Arbab, F.

    2012-01-01

    Synchronous coordination systems, such as Reo, exchange data via indivisible actions, while distributed systems are typically asynchronous and assume that messages can be delayed or get lost. To combine these seemingly contradictory notions, we introduce the Dreams framework. Coordination patterns

  3. Knowledge Framework Implementation with Multiple Architectures - 13090

    Energy Technology Data Exchange (ETDEWEB)

    Upadhyay, H.; Lagos, L.; Quintero, W.; Shoffner, P. [Applied Research Center, Florida International University, Miami, FL 33174 (United States); DeGregory, J. [Office of D and D and Facility Engineering, Environmental Management, Department of Energy (United States)

    2013-07-01

    Multiple kinds of knowledge management systems are operational in public and private enterprises and in large and small organizations, with a variety of business models that make the design, implementation and operation of integrated knowledge systems very difficult. In recent years, there have been sweeping advances in information technology, leading to the development of sophisticated frameworks and architectures. These platforms can be used to develop integrated knowledge management systems that provide a common platform for sharing knowledge across the enterprise, thereby reducing operational inefficiencies and delivering cost savings. This paper discusses a knowledge framework and architecture that can be used for system development, and its application to the real-life needs of the nuclear industry. A case study of deactivation and decommissioning (D and D) is discussed using the Knowledge Management Information Tool platform and framework. D and D work is a high-priority activity across the Department of Energy (DOE) complex. Subject matter specialists (SMS) associated with DOE sites, the Energy Facility Contractors Group (EFCOG) and the D and D community have gained extensive knowledge and experience over the years in the cleanup of legacy waste from the Manhattan Project. To prevent this D and D knowledge and expertise from being lost over time to the evolving and aging workforce, DOE and the Applied Research Center (ARC) at Florida International University (FIU) proposed to capture and maintain this valuable information in a universally available and easily usable system. (authors)

  4. Knowledge Framework Implementation with Multiple Architectures - 13090

    International Nuclear Information System (INIS)

    Upadhyay, H.; Lagos, L.; Quintero, W.; Shoffner, P.; DeGregory, J.

    2013-01-01

    Multiple kinds of knowledge management systems are operational in public and private enterprises and in large and small organizations, with a variety of business models that make the design, implementation and operation of integrated knowledge systems very difficult. In recent years, there have been sweeping advances in information technology, leading to the development of sophisticated frameworks and architectures. These platforms can be used to develop integrated knowledge management systems that provide a common platform for sharing knowledge across the enterprise, thereby reducing operational inefficiencies and delivering cost savings. This paper discusses a knowledge framework and architecture that can be used for system development, and its application to the real-life needs of the nuclear industry. A case study of deactivation and decommissioning (D and D) is discussed using the Knowledge Management Information Tool platform and framework. D and D work is a high-priority activity across the Department of Energy (DOE) complex. Subject matter specialists (SMS) associated with DOE sites, the Energy Facility Contractors Group (EFCOG) and the D and D community have gained extensive knowledge and experience over the years in the cleanup of legacy waste from the Manhattan Project. To prevent this D and D knowledge and expertise from being lost over time to the evolving and aging workforce, DOE and the Applied Research Center (ARC) at Florida International University (FIU) proposed to capture and maintain this valuable information in a universally available and easily usable system. (authors)

  5. Managing Risks in Distributed Software Projects: An Integrative Framework

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Boeg, Jesper

    2009-01-01

    Software projects are increasingly geographically distributed with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. While risk management has been adopted with success to address other challenges within software development, there are currently no frameworks available for managing risks related to geographical distribution. On this background, we systematically review the literature on geographically distributed software projects. Based on the review, we synthesize what we know about risks and risk resolution techniques into an integrative framework for managing risks in distributed contexts. Subsequent implementation of a Web-based tool helped us refine the framework based on empirical evaluation of its practical usefulness. We conclude by discussing implications for both research and practice.

  6. A Distributed Framework for Real Time Path Planning in Practical Multi-agent Systems

    KAUST Repository

    Abdelkader, Mohamed; Jaleel, Hassan; Shamma, Jeff S.

    2017-01-01

    We present a framework for distributed, energy efficient, and real time implementable algorithms for path planning in multi-agent systems. The proposed framework is presented in the context of a motivating example of capture the flag which

  7. Maintenance Management in Network Utilities Framework and Practical Implementation

    CERN Document Server

    Gómez Fernández, Juan F

    2012-01-01

    In order to satisfy the needs of their customers, network utilities require specially developed maintenance management capabilities. Maintenance management information systems are essential to ensure control, gain knowledge and improve decision making in companies dealing with network infrastructure, such as distribution of gas, water, electricity and telecommunications. Maintenance Management in Network Utilities studies the specific characteristics of maintenance management in this sector to offer a practical approach to defining and implementing the best management practices and suitable frameworks.   Divided into three major sections, Maintenance Management in Network Utilities defines a series of stages which can be followed to manage maintenance frameworks properly. Different case studies provide detailed descriptions which illustrate the experience in real company situations. An introduction to the concepts is followed by main sections including: • A Literature Review: covering the basic concepts an...

  8. Distributed Framework for Prototyping of Observability Concepts in Smart Grids

    DEFF Research Database (Denmark)

    Prostejovsky, Alexander; Gehrke, Oliver; Kosek, Anna Magdalena

    2015-01-01

    Development and testing of distributed monitoring, visualisation, and decision support concepts for future power systems require appropriate modelling tools that represent both the electrical side of the grid, as well as the communication and logical relations between the acting entities. This work presents an Observability Framework for distributed data acquisition and knowledge inference that aims to facilitate the development of these distributed concepts. They are realised as applications that run within the framework and are able to access the information on the grid topology and states via an abstract information model. Data is acquired dynamically over low-level data interfaces that allow for easy integration within heterogeneous environments. A Multi-Agent System platform was chosen for implementation, where agents represent the different electrical and logical grid elements...

  9. A general framework for updating belief distributions.

    Science.gov (United States)

    Bissiri, P G; Holmes, C C; Walker, S G

    2016-11-01

    We propose a framework for general Bayesian inference. We argue that a valid update of a prior belief distribution to a posterior can be made for parameters which are connected to observations through a loss function rather than the traditional likelihood function, which is recovered as a special case. Modern application areas make it increasingly challenging for Bayesians to attempt to model the true data-generating mechanism. For instance, when the object of interest is low dimensional, such as a mean or median, it is cumbersome to have to achieve this via a complete model for the whole data distribution. More importantly, there are settings where the parameter of interest does not directly index a family of density functions and thus the Bayesian approach to learning about such parameters is currently regarded as problematic. Our framework uses loss functions to connect information in the data to functionals of interest. The updating of beliefs then follows from a decision theoretic approach involving cumulative loss functions. Importantly, the procedure coincides with Bayesian updating when a true likelihood is known yet provides coherent subjective inference in much more general settings. Connections to other inference frameworks are highlighted.
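The loss-based update described in this record can be sketched numerically: the posterior is proportional to the prior times exp(-loss), with the traditional likelihood recovered when the loss is the negative log-likelihood. The following sketch is illustrative only; the function name, the discrete grid, and the temperature parameter are assumptions, not the paper's notation.

```python
import math

def gibbs_update(prior, grid, data, loss, temperature=1.0):
    """Loss-based belief update on a discrete parameter grid (illustrative).

    posterior(theta) is proportional to
    prior(theta) * exp(-temperature * sum_i loss(theta, x_i)).
    """
    weights = []
    for p, theta in zip(prior, grid):
        total_loss = sum(loss(theta, x) for x in data)
        weights.append(p * math.exp(-temperature * total_loss))
    z = sum(weights)  # normalise so the posterior sums to one
    return [w / z for w in weights]

# Target the median using absolute-error loss; no full data model is specified.
grid = [0.0, 1.0, 2.0, 3.0, 4.0]
prior = [0.2] * 5                      # flat prior over the grid
data = [1.9, 2.1, 2.0, 2.2]
posterior = gibbs_update(prior, grid, data, loss=lambda t, x: abs(t - x))
# Posterior mass concentrates near the sample median (theta = 2.0).
```

With the negative log-likelihood as the loss and temperature 1, this reduces to ordinary Bayesian updating on the grid.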

  10. Framework for Leading Next Generation Science Standards Implementation

    Science.gov (United States)

    Stiles, Katherine; Mundry, Susan; DiRanna, Kathy

    2017-01-01

    In response to the need to develop leaders to guide the implementation of the Next Generation Science Standards (NGSS), the Carnegie Corporation of New York provided funding to WestEd to develop a framework that defines the leadership knowledge and actions needed to effectively implement the NGSS. The development of the framework entailed…

  11. Quality Implementation in Transition: A Framework for Specialists and Administrators.

    Science.gov (United States)

    Wald, Judy L.; Repetto, Jeanne B.

    1995-01-01

    Quality Implementation in Transition is a framework designed to guide transition specialists and administrators in the implementation of total quality management. The framework uses the tenets set forth by W. Edwards Deming and is intended to help professionals facilitate change within transition programs. (Author/JOW)

  12. Distributed team innovation - a framework for distributed product development

    OpenAIRE

    Larsson, Andreas; Törlind, Peter; Karlsson, Lennart; Mabogunje, Ade; Leifer, Larry; Larsson, Tobias; Elfström, Bengt-Olof

    2003-01-01

    In response to the need for increased effectiveness in global product development, the Polhem Laboratory at Luleå University of Technology, Sweden, and the Center for Design Research at Stanford University, USA, have created the concept of Distributed Team Innovation (DTI). The overall aim of the DTI framework is to decrease the negative impact of geographic distance on product development efforts and to further enhance current advantages of worldwide, multidisciplinary collaboration. The DTI ...

  13. Post Implementation Review Framework and Procedures

    Data.gov (United States)

    Social Security Administration — This template outlines the Social Security Administration's (SSA) approach to initiating, conducting, and completing Post Implementation Reviews (PIRs). The template...

  14. Assessing citation networks for dissemination and implementation research frameworks.

    Science.gov (United States)

    Skolarus, Ted A; Lehmann, Todd; Tabak, Rachel G; Harris, Jenine; Lecy, Jesse; Sales, Anne E

    2017-07-28

    A recent review of frameworks used in dissemination and implementation (D&I) science described 61 judged to be related either to dissemination, implementation, or both. The current use of these frameworks and their contributions to D&I science more broadly has yet to be reviewed. For these reasons, our objective was to determine the role of these frameworks in the development of D&I science. We used the Web of Science™ Core Collection and Google Scholar™ to conduct a citation network analysis for the key frameworks described in a recent systematic review of D&I frameworks (Am J Prev Med 43(3):337-350, 2012). From January to August 2016, we collected framework data including title, reference, publication year, and citations per year and conducted descriptive and main path network analyses to identify those most important in holding the current citation network for D&I frameworks together. The source article contained 119 cited references, with 50 published articles and 11 documents identified as a primary framework reference. The average citations per year for the 61 frameworks reviewed ranged from 0.7 to 103.3 among articles published from 1985 to 2012. Citation rates from all frameworks are reported with citation network analyses for the framework review article and ten highly cited framework seed articles. The main path for the D&I framework citation network is presented. We examined citation rates and the main paths through the citation network to delineate the current landscape of D&I framework research, and opportunities for advancing framework development and use. Dissemination and implementation researchers and practitioners may consider frequency of framework citation and our network findings when planning implementation efforts to build upon this foundation and promote systematic advances in D&I science.

  15. Design and Implementation of Distributed Crawler System Based on Scrapy

    Science.gov (United States)

    Fan, Yuhao

    2018-01-01

    At present, some large-scale search engines at home and abroad only provide users with non-custom search services, and a single-machine web crawler cannot solve such a difficult task. In this paper, through study of the original Scrapy framework, the framework is improved by combining Scrapy and Redis; a distributed crawler system based on the Scrapy framework is designed and implemented, and the Bloom Filter algorithm is applied to the dupefilter module to reduce memory consumption. The movie information captured from douban is stored in MongoDB, so that the data can be processed and analyzed. The results show that the distributed crawler system based on the Scrapy framework is more efficient and stable than a single-machine web crawler system.
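The memory saving claimed for the dupefilter module comes from replacing an exact set of seen URLs with a Bloom filter, which trades a small false-positive rate for a fixed-size bit array. The sketch below is a minimal stand-alone Bloom filter; the class name, sizes, and hash scheme are illustrative assumptions, not the paper's or Scrapy's actual dupefilter code.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter for URL de-duplication (illustrative sketch)."""

    def __init__(self, size_bits=1 << 20, num_hashes=5):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, item):
        # Derive k bit positions from salted SHA-1 digests of the item.
        for i in range(self.num_hashes):
            digest = hashlib.sha1(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item):
        # May report false positives, never false negatives.
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

# A crawler's dupefilter would test membership before scheduling a request:
seen = BloomFilter()
seen.add("https://movie.douban.com/subject/1292052/")
```

In a Redis-backed deployment the bit array would live in a shared Redis bitmap so all crawler nodes consult the same filter.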

  16. Implementation of the Leaching Environmental Assessment Framework

    Science.gov (United States)

    New leaching tests are available in the U.S. for developing more accurate source terms for use in fate and transport models. For beneficial use or disposal, the use of the leaching environmental assessment framework (LEAF) will provide leaching results that reflect field condit...

  17. A development framework for distributed artificial intelligence

    Science.gov (United States)

    Adler, Richard M.; Cottman, Bruce H.

    1989-01-01

    The authors describe distributed artificial intelligence (DAI) applications in which multiple organizations of agents solve multiple domain problems. They then describe work in progress on a DAI system development environment, called SOCIAL, which consists of three primary language-based components. The Knowledge Object Language defines models of knowledge representation and reasoning. The metaCourier language supplies the underlying functionality for interprocess communication and control access across heterogeneous computing environments. The metaAgents language defines models for agent organization, coordination, control, and resource management. Application agents and agent organizations will be constructed by combining metaAgents and metaCourier building blocks with task-specific functionality such as diagnostic or planning reasoning. This architecture hides implementation details of communications, control, and integration in distributed processing environments, enabling application developers to concentrate on the design and functionality of the intelligent agents and agent networks themselves.

  18. Distributed tactical reasoning framework for intelligent vehicles

    Science.gov (United States)

    Sukthankar, Rahul; Pomerleau, Dean A.; Thorpe, Chuck E.

    1998-01-01

    In independent vehicle concepts for the Automated Highway System (AHS), the ability to make competent tactical-level decisions in real-time is crucial. Traditional approaches to tactical reasoning typically involve the implementation of large monolithic systems, such as decision trees or finite state machines. However, as the complexity of the environment grows, the unforeseen interactions between components can make modifications to such systems very challenging. For example, changing an overtaking behavior may require several non-local changes to car-following, lane changing and gap acceptance rules. This paper presents a distributed solution to the problem. PolySAPIENT consists of a collection of autonomous modules, each specializing in a particular aspect of the driving task, classified by traffic entities rather than tactical behavior. Thus, the influence of the vehicle ahead on the available actions is managed by one reasoning object, while the implications of an approaching exit are managed by another. The independent recommendations from these reasoning objects are expressed in the form of votes and vetos over a 'tactical action space', and are resolved by a voting arbiter. This local independence enables PolySAPIENT reasoning objects to be developed independently, using a heterogeneous implementation. PolySAPIENT vehicles are implemented in the SHIVA tactical highway simulator, whose vehicles are based on the Carnegie Mellon Navlab robots.
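The vote-and-veto arbitration described above can be sketched as follows. The aggregation rule (sum of votes, any veto eliminating an action) is one plausible reading of the abstract, not PolySAPIENT's exact scheme; all names and weights are illustrative.

```python
def arbitrate(action_space, recommendations):
    """Resolve votes and vetos from independent reasoning objects.

    Each recommendation maps actions to numeric votes, or to None for a veto.
    """
    totals = {}
    for action in action_space:
        vetoed = False
        score = 0.0
        for rec in recommendations:
            vote = rec.get(action, 0.0)
            if vote is None:          # any veto removes the action outright
                vetoed = True
                break
            score += vote
        if not vetoed:
            totals[action] = score
    # Pick the highest-scoring action that survived all vetos.
    return max(totals, key=totals.get)

actions = ["follow", "change_left", "change_right"]
car_ahead = {"follow": -0.5, "change_left": 0.8, "change_right": 0.2}
exit_mgr = {"change_left": None, "change_right": 0.9}   # veto the left lane
best = arbitrate(actions, [car_ahead, exit_mgr])        # -> "change_right"
```

Because each reasoning object only emits votes over the shared action space, new objects can be added without modifying the others, which is the local-independence property the record emphasizes.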

  19. A Framework for Distributed Problem Solving

    Science.gov (United States)

    Leone, Joseph; Shin, Don G.

    1989-03-01

    This work explores a distributed problem solving (DPS) approach, namely the AM/AG model, to cooperative memory recall. The AM/AG model is a hierarchic social system metaphor for DPS based on Mintzberg's model of organizations. At the core of the model are information flow mechanisms, named amplification and aggregation. Amplification is a process of expounding a given task, called an agenda, into a set of subtasks with magnified degree of specificity and distributing them to multiple processing units downward in the hierarchy. Aggregation is a process of combining the results reported from multiple processing units into a unified view, called a resolution, and promoting the conclusion upward in the hierarchy. The combination of amplification and aggregation can account for a memory recall process which primarily relies on the ability of making associations between vast amounts of related concepts, sorting out the combined results, and promoting the most plausible ones. The amplification process is discussed in detail. An implementation of the amplification process is presented. The process is illustrated by an example.
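One amplification/aggregation cycle of the AM/AG model can be sketched as a fan-out of an agenda into subtasks followed by a fan-in of their results. The callbacks, the toy memory, and the scoring are hypothetical illustrations, not the paper's implementation.

```python
def recall(agenda, expand, solve, combine):
    """One amplification/aggregation cycle (illustrative sketch).

    expand:  amplification - turn an agenda into more specific subtasks
    solve:   work done by the downstream processing units
    combine: aggregation - merge subtask results into one resolution
    """
    subtasks = expand(agenda)            # amplify downward in the hierarchy
    results = [solve(t) for t in subtasks]
    return combine(results)              # aggregate and promote upward

# Toy memory recall: expand a cue into associated cues, score each
# association, and promote the most plausible one.
memory = {"apple": 0.2, "fruit": 0.9, "tree": 0.6}
answer = recall(
    "apple",
    expand=lambda cue: [cue, "fruit", "tree"],
    solve=lambda cue: (cue, memory.get(cue, 0.0)),
    combine=lambda rs: max(rs, key=lambda r: r[1])[0],
)   # -> "fruit"
```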

  20. The LabVIEW RADE framework distributed architecture

    International Nuclear Information System (INIS)

    Andreassen, O.O.; Kudryavtsev, D.; Raimondo, A.; Rijllart, A.; Shaipov, V.; Sorokoletov, R.

    2012-01-01

    For accelerator GUI (Graphical User Interface) applications there is a need for a rapid development environment (RADE) to create expert tools or to prototype operator applications. Typically a variety of tools are being used, such as Matlab or Excel, but their scope is limited, either because of their low flexibility or limited integration into the accelerator infrastructure. In addition, having several tools obliges users to deal with different programming techniques and data structures. We have addressed these limitations by using LabVIEW, extending it with interfaces to C++ and Java. In this way it fulfills requirements of ease of use, flexibility and connectivity, which makes up what we refer to as the RADE framework. Recent application requirements could only be met by implementing a distributed architecture with multiple servers running multiple services. This brought us the additional advantage to implement redundant services, to increase the availability and to make transparent updates. We will present two applications requiring high availability. We also report on issues encountered with such a distributed architecture and how we have addressed them. The latest extension of the framework is to industrial equipment, with program templates and drivers for PLCs (Siemens and Schneider) and PXI with LabVIEW-Real Time. (authors)

  1. Global Framework for Climate Services (GFCS): status of implementation

    Science.gov (United States)

    Lucio, Filipe

    2014-05-01

    The GFCS is a global partnership of governments and UN and international agencies that produce and use climate information and services. WMO is leading the initiative in collaboration with UN ISDR, WHO, WFP, FAO, UNESCO, UNDP and other UN and international partners, pooling their expertise and resources in order to co-design and co-produce knowledge, information and services to support effective decision making in response to climate variability and change in four priority areas (agriculture and food security, water, health and disaster risk reduction). To address the entire value chain for the effective production and application of climate services, the GFCS main components or pillars are being implemented, namely: • User Interface Platform — to provide ways for climate service users and providers to interact to identify needs and capacities and improve the effectiveness of the Framework and its climate services; • Climate Services Information System — to produce and distribute climate data, products and information according to the needs of users and to agreed standards; • Observations and Monitoring — to generate the necessary data for climate services according to agreed standards; • Research, Modelling and Prediction — to harness science capabilities and results and develop appropriate tools to meet the needs of climate services; • Capacity Building — to support the systematic development of the institutions, infrastructure and human resources needed for effective climate services. Activities are being implemented in various countries in Africa, the Caribbean and South Pacific Islands. This paper will provide details on the status of implementation of the GFCS worldwide.

  2. A penalized framework for distributed lag non-linear models.

    Science.gov (United States)

    Gasparrini, Antonio; Scheipl, Fabian; Armstrong, Ben; Kenward, Michael G

    2017-09-01

    Distributed lag non-linear models (DLNMs) are a modelling tool for describing potentially non-linear and delayed dependencies. Here, we illustrate an extension of the DLNM framework through the use of penalized splines within generalized additive models (GAM). This extension offers built-in model selection procedures and the possibility of accommodating assumptions on the shape of the lag structure through specific penalties. In addition, this framework includes, as special cases, simpler models previously proposed for linear relationships (DLMs). Alternative versions of penalized DLNMs are compared with each other and with the standard unpenalized version in a simulation study. Results show that this penalized extension to the DLNM class provides greater flexibility and improved inferential properties. The framework exploits recent theoretical developments of GAMs and is implemented using efficient routines within freely available software. Real-data applications are illustrated through two reproducible examples in time series and survival analysis. © 2017 The Authors Biometrics published by Wiley Periodicals, Inc. on behalf of International Biometric Society.

  3. Vertical Load Distribution for Cloud Computing via Multiple Implementation Options

    Science.gov (United States)

    Phan, Thomas; Li, Wen-Syan

    Cloud computing looks to deliver software as a provisioned service to end users, but the underlying infrastructure must be sufficiently scalable and robust. In our work, we focus on large-scale enterprise cloud systems and examine how enterprises may use a service-oriented architecture (SOA) to provide a streamlined interface to their business processes. To scale up the business processes, each SOA tier usually deploys multiple servers for load distribution and fault tolerance, a scenario which we term horizontal load distribution. One limitation of this approach is that load cannot be distributed further when all servers in the same tier are loaded. In complex multi-tiered SOA systems, a single business process may actually be implemented by multiple different computation pathways among the tiers, each with different components, in order to provide resilience and scalability. Such multiple implementation options give opportunities for vertical load distribution across tiers. In this chapter, we look at a novel request routing framework for SOA-based enterprise computing with multiple implementation options that takes into account the options of both horizontal and vertical load distribution.

  4. Architectural notes: a framework for distributed systems development

    NARCIS (Netherlands)

    Pires, L.F.; Ferreira Pires, Luis

    1994-01-01

    This thesis develops a framework of methods and techniques for distributed systems development. This framework consists of two related domains in which design concepts for distributed systems are defined: the entity domain and the behaviour domain. In the entity domain we consider structures of

  5. 2016-2020 Strategic Plan and Implementing Framework

    Energy Technology Data Exchange (ETDEWEB)

    None

    2015-11-01

    The 2016-2020 Strategic Plan and Implementing Framework from the Office of Energy Efficiency and Renewable Energy (EERE) is the blueprint for launching the nation’s leadership in the global clean energy economy. This document will guide the organization to build on decades of progress in powering our nation from clean, affordable and secure energy.

  6. A framework for risk assessment on lean production implementation

    Directory of Open Access Journals (Sweden)

    Giuliano Almeida Marodin

    2014-02-01

    The organizational and technical complexity of implementing lean principles and practices can become an extensively time-consuming journey with few benefits. We argue that risk assessment can aid the understanding and management of the major difficulties in Lean production implementation (LPI). Thus, this paper proposes a framework for risk assessment in the LPI process. The literature review made it possible to adapt the risk assessment steps to the characteristics of the LPI and to develop data collection and analysis procedures for each step. Sociotechnical systems (STS) theory was brought in to improve the understanding of the context's characteristics in the proposed framework, because the context has a major influence on the LPI. The framework has five steps: (a) defining the unit of analysis; (b) describing the context; (c) risk identification; (d) risk analysis; and (e) risk relationships modeling.

  7. Using a Commercial Framework to Implement and Enhance the IEEE 1451.1 Standard

    OpenAIRE

    Viegas, Vítor; Pereira, José Dias; Girão, P. Silva

    2005-01-01

    In 1999, the 1451.1 Std was published, defining a common object model and interface specification to develop open, multi-vendor distributed measurement and control systems. However, despite the well-known advantages of the model, few initiatives have implemented it. In this paper we describe the implementation of an NCAP – Network Capable Application Processor – in a well-known and well-proven infrastructure: the Microsoft .NET Framework. The choice of a commercial framework was part o...

  8. Quality Assurance Framework Implementation Guide for Isolated Community Power Systems

    Energy Technology Data Exchange (ETDEWEB)

    Esterly, Sean R. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Baring-Gould, Edward I. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Burman, Kari A. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Greacen, Chris [Independent Consultant (United States)

    2017-08-15

    This implementation guide is a companion document to the 'Quality Assurance Framework for Mini-Grids' technical report. This document is intended to be used by one of the many stakeholder groups that take part in the implementation of isolated power systems. Although the QAF could be applied to a single system, it was designed primarily to be used within the context of a larger national or regional rural electrification program in which many individual systems are being installed. This guide includes a detailed overview of the Quality Assurance Framework and provides guidance focused on the implementation of the Framework from the perspective of the different stakeholders that are commonly involved in expanding energy development within specific communities or regions. For the successful long-term implementation of a specific rural electrification program using mini-grid systems, six key stakeholders have been identified that are typically engaged, each with a different set of priorities: (1) regulatory agency, (2) governmental ministry, (3) system developers, (4) mini-utility, (5) investors, and (6) customers/consumers. This document is broken into two distinct sections. The first focuses on the administrative processes in the development and operation of community-based mini-grid programs, while the second focuses on the process around the installation of the mini-grid project itself.

  9. Heartbeat-based error diagnosis framework for distributed embedded systems

    Science.gov (United States)

    Mishra, Swagat; Khilar, Pabitra Mohan

    2012-01-01

    Distributed embedded systems have significant applications in the automobile industry as steer-by-wire, fly-by-wire and brake-by-wire systems. In this paper, we provide a general framework for fault detection in a distributed embedded real time system. We use heartbeat monitoring, checkpointing and model-based redundancy to design a scalable framework that takes care of task scheduling, temperature control and diagnosis of faulty nodes in a distributed embedded system. This helps in diagnosing and shutting down faulty actuators before the system becomes unsafe. The framework is designed and tested using a new simulation model consisting of virtual nodes working on a message passing system.
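The heartbeat-monitoring component described above amounts to tracking the last heartbeat time per node and flagging any node whose silence exceeds a timeout. The sketch below is a minimal illustration; the class, node names, and timeout policy are assumptions, and the paper's framework additionally uses checkpointing and model-based redundancy.

```python
import time

class HeartbeatMonitor:
    """Sketch of heartbeat-based fault suspicion for distributed nodes."""

    def __init__(self, timeout):
        self.timeout = timeout        # seconds of silence before suspicion
        self.last_beat = {}

    def beat(self, node, now=None):
        # Record a heartbeat; `now` may be injected for deterministic tests.
        self.last_beat[node] = time.monotonic() if now is None else now

    def suspected_faulty(self, now=None):
        now = time.monotonic() if now is None else now
        return sorted(node for node, t in self.last_beat.items()
                      if now - t > self.timeout)

mon = HeartbeatMonitor(timeout=2.0)
mon.beat("actuator-1", now=0.0)
mon.beat("actuator-2", now=0.0)
mon.beat("actuator-1", now=5.0)        # actuator-2 stops sending heartbeats
faulty = mon.suspected_faulty(now=5.0)  # -> ["actuator-2"]
```

A supervisor would shut down or isolate any node returned by `suspected_faulty` before it can drive an unsafe actuation.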

  10. Distributed inter process communication framework of BES III DAQ online software

    International Nuclear Information System (INIS)

    Li Fei; Liu Yingjie; Ren Zhenyu; Wang Liang; Chinese Academy of Sciences, Beijing; Chen Mali; Zhu Kejun; Zhao Jingwei

    2006-01-01

    The DAQ (data acquisition) system is an important part of BES III, the large-scale high-energy physics detector on the BEPC. Inter process communication (IPC) of online software in distributed environments is pivotal for the design and implementation of the DAQ system. This article introduces a distributed inter process communication framework which is based on CORBA and used in the BES III DAQ online software. The article mainly presents the design and implementation of the IPC framework and applications based on IPC. (authors)

  11. An implementation framework for additive manufacturing in supply chains

    Directory of Open Access Journals (Sweden)

    Raed Handal

    2017-12-01

    Additive manufacturing has become one of the most important technologies in the manufacturing field. Full implementation of additive manufacturing will change many well-known management practices in the production sector. However, theoretical development in the field of additive manufacturing with regard to its impact on supply chain management is rare. While additive manufacturing is believed to revolutionize and enhance traditional manufacturing, no comprehensive toolset has been developed in the manufacturing field to assess the impact of additive manufacturing and determine the production method that best suits the applied supply chain strategy. A significant portion of the existing supply chain methods and frameworks were adopted in this study to examine the implementation of additive manufacturing in supply chain management. The aim of this study is to develop a framework to explain when additive manufacturing impacts supply chain management efficiently.

  12. Implementing a Mobile Social Media Framework for Designing Creative Pedagogies

    Directory of Open Access Journals (Sweden)

    Thomas Cochrane

    2014-08-01

    The rise of mobile social media provides unique opportunities for new and creative pedagogies. Pedagogical change requires a catalyst, and we argue that mobile social media can be utilized as such a catalyst. However, the mobile learning literature is dominated by case studies that retrofit traditional pedagogical strategies and pre-existing course activities onto mobile devices and social media. From our experiences of designing and implementing a series of mobile social media projects, the authors have developed a mobile social media framework for creative pedagogies. We illustrate the implementation of our mobile social media framework within the development of a new media minor (an elective set of four courses) that explicitly integrates the unique technical and pedagogical affordances of mobile social media, with a focus upon student-generated content and student-determined learning (heutagogy). We argue that our mobile social media framework is potentially transferable to a range of educational contexts, providing a simple design framework for new pedagogies.

  13. Creating a Framework for Applying OAIS to Distributed Digital Preservation

    DEFF Research Database (Denmark)

    Zierau, Eld; Schultz, Matt

    2013-01-01

    This paper describes work being done towards a Framework for Applying the Reference Model for an Open Archival Information System (OAIS) to Distributed Digital Preservation (DDP). Such a Framework will be helpful for future analyses and/or audits of repositories that are performing digital...

  14. Supporting Collective Inquiry: A Technology Framework for Distributed Learning

    Science.gov (United States)

    Tissenbaum, Michael

    This design-based study describes the implementation and evaluation of a technology framework to support smart classrooms and Distributed Technology Enhanced Learning (DTEL) called SAIL Smart Space (S3). S3 is an open-source technology framework designed to support students engaged in inquiry investigations as a knowledge community. To evaluate the effectiveness of S3 as a generalizable technology framework, a curriculum named PLACE (Physics Learning Across Contexts and Environments) was developed to support two grade-11 physics classes (n = 22; n = 23) engaged in a multi-context inquiry curriculum based on the Knowledge Community and Inquiry (KCI) pedagogical model. This dissertation outlines three initial design studies that established a set of design principles for DTEL curricula and related technology infrastructures. These principles guided the development of PLACE, a twelve-week inquiry curriculum in which students drew upon their community-generated knowledge base as a source of evidence for solving ill-structured physics problems based on the physics of Hollywood movies. During the culminating smart classroom activity, the S3 framework played a central role in orchestrating student activities, including managing the flow of materials and students using real-time data mining and intelligent agents that responded to emergent class patterns. S3 supported students' construction of knowledge through the use of individual, collective and collaborative scripts and technologies, including tablets and interactive large-format displays. Aggregate and real-time ambient visualizations helped the teacher act as a wondering facilitator, supporting students in their inquiry where needed. A teacher orchestration tablet gave the teacher some control over the flow of the scripted activities, and alerted him to critical moments for intervention. Analysis focuses on S3's effectiveness in supporting students' inquiry across multiple learning contexts and scales of time, and in

  15. A Run-Time Verification Framework for Smart Grid Applications Implemented on Simulation Frameworks

    Energy Technology Data Exchange (ETDEWEB)

    Ciraci, Selim; Sozer, Hasan; Tekinerdogan, Bedir

    2013-05-18

    Smart grid applications are implemented and tested with simulation frameworks, as developers usually do not have access to large sensor networks to use as a test bed. Developers are forced to map the implementation onto these frameworks, which results in a deviation between the architecture and the code. In turn, this deviation makes it hard to verify behavioral constraints that are described at the architectural level. We have developed the ConArch toolset to support the automated verification of architecture-level behavioral constraints. A key feature of ConArch is its programmable mapping from the architecture to the implementation. Here, developers implement queries to identify the points in the target program that correspond to architectural interactions. ConArch generates run-time observers that monitor the flow of execution between these points and verifies whether this flow conforms to the behavioral constraints. We illustrate how the programmable mappings can be exploited for verifying behavioral constraints of a smart grid application that is implemented with two simulation frameworks.
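The core idea of a run-time observer that checks the flow of execution between identified program points can be sketched as follows. This is an illustrative stand-in, not the actual ConArch API; the point names and allowed transitions are hypothetical.

```python
# Illustrative sketch (not ConArch itself): a run-time observer that records
# the flow of execution between architecture-level "points" and flags any
# transition not permitted by the behavioral constraints.

class FlowObserver:
    def __init__(self, allowed):
        self.allowed = allowed        # set of permitted (from_point, to_point) pairs
        self.last = None
        self.violations = []

    def hit(self, point):
        # Called whenever execution reaches an architectural interaction point.
        if self.last is not None and (self.last, point) not in self.allowed:
            self.violations.append((self.last, point))
        self.last = point

obs = FlowObserver(allowed={("sensor", "controller"), ("controller", "actuator")})
for p in ["sensor", "controller", "actuator", "sensor"]:
    obs.hit(p)

print(obs.violations)   # the actuator -> sensor transition is not allowed
```

In a real toolset the `hit` calls would be injected at the code locations selected by the developer-written mapping queries.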

  16. Architectural frameworks: defining the structures for implementing learning health systems.

    Science.gov (United States)

    Lessard, Lysanne; Michalowski, Wojtek; Fung-Kee-Fung, Michael; Jones, Lori; Grudniewicz, Agnes

    2017-06-23

    The vision of transforming health systems into learning health systems (LHSs) that rapidly and continuously transform knowledge into improved health outcomes at lower cost is generating increased interest in government agencies, health organizations, and health research communities. While existing initiatives demonstrate that different approaches can succeed in making the LHS vision a reality, they are too varied in their goals, focus, and scale to be reproduced without undue effort. Indeed, the structures necessary to effectively design and implement LHSs on a larger scale are lacking. In this paper, we propose the use of architectural frameworks to develop LHSs that adhere to a recognized vision while being adapted to their specific organizational context. Architectural frameworks are high-level descriptions of an organization as a system; they capture the structure of its main components at varied levels, the interrelationships among these components, and the principles that guide their evolution. Because these frameworks support the analysis of LHSs and allow their outcomes to be simulated, they act as pre-implementation decision-support tools that identify potential barriers and enablers of system development. They thus increase the chances of successful LHS deployment. We present an architectural framework for LHSs that incorporates five dimensions-goals, scientific, social, technical, and ethical-commonly found in the LHS literature. The proposed architectural framework is comprised of six decision layers that model these dimensions. The performance layer models goals, the scientific layer models the scientific dimension, the organizational layer models the social dimension, the data layer and information technology layer model the technical dimension, and the ethics and security layer models the ethical dimension. We describe the types of decisions that must be made within each layer and identify methods to support decision-making. 
In this paper, we outline

  17. Development of framework for sustainable Lean implementation: an ISM approach

    Science.gov (United States)

    Jadhav, Jagdish Rajaram; Mantha, S. S.; Rane, Santosh B.

    2014-07-01

    The survival of any organization depends upon its competitive edge. Even though Lean is one of the most powerful quality improvement methodologies, nearly two-thirds of Lean implementations result in failure, and fewer than one-fifth of those implemented sustain their results. One of the most significant tasks of top management is to identify, understand and deploy significant Lean practices such as quality circles, Kanban and just-in-time purchasing. The term `bundle' is used to group inter-related and internally consistent Lean practices. Eight significant Lean practice bundles have been identified based on the literature reviewed and the opinion of experts. The order of execution of Lean practice bundles is very important, and Lean practitioners must be able to understand the interrelationships between these practice bundles. The objective of this paper is to develop a framework for sustainable Lean implementation using the interpretive structural modelling (ISM) approach.
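The core ISM computation can be sketched in a few lines: build the reachability matrix by transitive closure of the direct-influence relations, then peel off levels of elements. The bundle names and influence relations below are hypothetical, chosen only to illustrate the mechanics; the paper's eight actual bundles and their relations come from its literature review and expert opinion.

```python
# A minimal sketch of interpretive structural modelling (ISM): Warshall
# transitive closure gives the reachability matrix, then elements are
# partitioned into levels (most dependent level first).

def ism_levels(elements, influences):
    n = len(elements)
    idx = {e: i for i, e in enumerate(elements)}
    reach = [[i == j for j in range(n)] for i in range(n)]   # reflexive
    for a, b in influences:
        reach[idx[a]][idx[b]] = True
    for k in range(n):                        # Warshall transitive closure
        for i in range(n):
            for j in range(n):
                reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])
    levels, remaining = [], set(range(n))
    while remaining:
        # An element is at the current level if everything it still reaches
        # also reaches it back (reachability set == intersection set).
        level = {i for i in remaining
                 if all(reach[j][i] for j in remaining if reach[i][j])}
        levels.append(sorted(elements[i] for i in level))
        remaining -= level
    return levels

levels = ism_levels(["5S", "Kanban", "JIT"],          # hypothetical bundles
                    [("5S", "Kanban"), ("Kanban", "JIT")])
print(levels)   # [['JIT'], ['Kanban'], ['5S']] -- most dependent bundle first
```

Reading the levels bottom-up gives a candidate order of execution: driver bundles first, dependent bundles last.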

  18. Adapting the Consolidated Framework for Implementation Research to Create Organizational Readiness and Implementation Tools for Project ECHO.

    Science.gov (United States)

    Serhal, Eva; Arena, Amanda; Sockalingam, Sanjeev; Mohri, Linda; Crawford, Allison

    2018-03-01

    The Project Extension for Community Healthcare Outcomes (ECHO) model expands primary care provider (PCP) capacity to manage complex diseases by sharing knowledge, disseminating best practices, and building a community of practice. The model has expanded rapidly, with over 140 ECHO projects currently established globally. We have used validated implementation frameworks, such as Damschroder's (2009) Consolidated Framework for Implementation Research (CFIR) and Proctor's (2011) taxonomy of implementation outcomes, combined with implementation experience to (1) create a set of questions to assess organizational readiness and suitability of the ECHO model and (2) provide those who have determined ECHO is the correct model with a checklist to support successful implementation. A set of considerations was created, which adapted and consolidated CFIR constructs to create ECHO-specific organizational readiness questions, as well as a process guide for implementation. Each consideration was mapped onto Proctor's (2011) implementation outcomes, and questions relating to the constructs were developed and reviewed for clarity. The Preimplementation list included 20 questions; most questions fall within Proctor's (2011) implementation outcome domains of "Appropriateness" and "Acceptability." The Process Checklist is a 26-item checklist to help launch an ECHO project; items map onto the constructs of Planning, Engaging, Executing, Reflecting, and Evaluating. Given that fidelity to the ECHO model is associated with robust outcomes, effective implementation is critical. These tools will enable programs to work through key considerations to implement a successful Project ECHO.
Next steps will include validation with a diverse sample of ECHO projects. This is an open-access article distributed under the terms of the Creative Commons Attribution-Non Commercial-No Derivatives License 4.0 (CC BY-NC-ND), where it is permissible to download and share the work provided it is properly cited.

  19. Designing the Distributed Model Integration Framework – DMIF

    NARCIS (Netherlands)

    Belete, Getachew F.; Voinov, Alexey; Morales, Javier

    2017-01-01

    We describe and discuss the design and prototype of the Distributed Model Integration Framework (DMIF) that links models deployed on different hardware and software platforms. We used distributed computing and service-oriented development approaches to address the different aspects of

  20. A Modeling Framework for Schedulability Analysis of Distributed Avionics Systems

    DEFF Research Database (Denmark)

    Han, Pujie; Zhai, Zhengjun; Nielsen, Brian

    2018-01-01

    This paper presents a modeling framework for schedulability analysis of distributed integrated modular avionics (DIMA) systems that consist of spatially distributed ARINC-653 modules connected by a unified AFDX network. We model a DIMA system as a set of stopwatch automata (SWA) in UPPAAL...

  1. Distributed software framework and continuous integration in hydroinformatics systems

    Science.gov (United States)

    Zhou, Jianzhong; Zhang, Wei; Xie, Mengfei; Lu, Chengwei; Chen, Xiao

    2017-08-01

    When hydroinformatics systems involve multiple complicated models, multisource structured and unstructured data, and complex requirements analysis, platform design and integration become a challenge. To address these problems, we describe a distributed software framework and its continuous integration process for hydroinformatics systems. The distributed framework mainly consists of a server cluster for models, a distributed database, GIS (Geographic Information System) servers, a master node and clients. Based on it, a GIS-based decision support system for jointly regulating the water quantity and water quality of a group of lakes in Wuhan, China, was established.

  2. European Interoperability Assets Register and Quality Framework Implementation.

    Science.gov (United States)

    Moreno-Conde, Alberto; Thienpont, Geert; Lamote, Inge; Coorevits, Pascal; Parra, Carlos; Kalra, Dipak

    2016-01-01

    Interoperability assets is the term applied to any resource that can support the design, implementation and successful adoption of eHealth services that can exchange data meaningfully. Examples include functional requirements, specifications, standards, clinical models and term lists, guidance on how standards may be used concurrently, implementation guides, educational resources, and other resources. Unfortunately, these are largely accessible in ad hoc ways, resulting in scattered fragments of a solution space that urgently need to be brought together. At present, it is well known that new initiatives and projects reinvent assets of which they were unaware, while assets that were potentially of great value are forgotten, not maintained, and eventually fall into disuse. This research has defined a quality in use model and assessed the suitability of this quality framework based on the feedback and opinion of a representative sample of potential end users. This quality framework covers the following domains of asset development and adoption: (i) Development process, (ii) Maturity level, (iii) Trustworthiness, (iv) Support & skills, (v) Sustainability, (vi) Semantic interoperability, (vii) Cost & effort of adoption, (viii) Maintenance. When participants were asked to evaluate the overall quality in use framework, 70% said they would recommend using the register to their colleagues, 70% felt that it could provide relevant benefits for discovering new assets, and 50% responded that it would support their decision making about which asset to adopt or implement in their organisation. Several European projects have expressed interest in using the register, which will now be sustained and promoted by the European Institute for Innovation through Health Data.

  3. HUMANITARIAN AID DISTRIBUTION FRAMEWORK FOR NATURAL DISASTER MANAGEMENT

    OpenAIRE

    Mohd, S.; Fathi, M. S.; Harun, A. N.

    2018-01-01

    Humanitarian aid distribution is associated with many activities, numerous disaster management stakeholders, enormous effort and different processes. For effective communication, humanitarian aid distribution activities require appropriate and up-to-date information to enhance collaboration, and improve integration. The purpose of this paper is to develop a humanitarian aid distribution framework for disaster management in Malaysia. The findings of this paper are based on a review of the huma...

  4. A distributed framework for inter-domain virtual network embedding

    Science.gov (United States)

    Wang, Zihua; Han, Yanni; Lin, Tao; Tang, Hui

    2013-03-01

    Network virtualization has been a promising technology for overcoming the Internet impasse. A main challenge in network virtualization is the efficient assignment of virtual resources. Existing work has focused on intra-domain solutions, whereas the inter-domain situation is more practical in realistic settings. In this paper, we present a distributed inter-domain framework for mapping virtual networks to physical networks that can improve the performance of virtual network embedding. The distributed framework is based on a multi-agent approach, and a set of messages for information exchange is defined. We design different operations and IPTV use scenarios to validate the advantages of our framework. The use cases show that our framework can solve the inter-domain problem efficiently.

  5. Implementation of Grid-computing Framework for Simulation in Multi-scale Structural Analysis

    Directory of Open Access Journals (Sweden)

    Data Iranata

    2010-05-01

    A new grid-computing framework for simulation in multi-scale structural analysis is presented. Two levels of parallel processing are involved in this framework: multiple local distributed computing environments connected by local networks to form a grid-based cluster-to-cluster distributed computing environment. To successfully perform the simulation, a large-scale structural system task is decomposed into the simulations of a simplified global model and several detailed component models at various scales. These correlated multi-scale structural system tasks are distributed among clusters, connected together in a multi-level hierarchy, and then coordinated over the Internet. The software framework supporting this multi-scale structural simulation approach is also presented; its architecture allows the integration of several multi-scale models as clients and servers under a single platform. To check its feasibility, a prototype software system has been designed and implemented to demonstrate the proposed concept. The simulation results show that the software framework can increase the speedup of the structural analysis. Based on this result, the proposed grid-computing framework is suitable for simulation of multi-scale structural analysis.

  6. Flexible patient information search and retrieval framework: pilot implementation

    Science.gov (United States)

    Erdal, Selnur; Catalyurek, Umit V.; Saltz, Joel; Kamal, Jyoti; Gurcan, Metin N.

    2007-03-01

    Medical centers collect and store a significant amount of valuable data pertaining to patients' visits in the form of medical free text. In addition, standardized diagnosis codes (International Classification of Diseases, Ninth Revision, Clinical Modification: ICD9-CM) related to those dictated reports are usually available. In this work, we have created a framework in which image searches can be initiated through a combination of free-text reports and ICD9 codes. This framework enables more comprehensive, systematic search of existing large sets of patient data. The free-text search is enriched by computer-aided inclusion of additional search terms from a thesaurus. This enriched search allows users to access a larger set of relevant results from a patient-centric PACS in a simpler way; such a framework is of particular use in tasks such as gathering images for desired patient populations, building disease models, and so on. As the motivating application of our framework, we implemented a search engine. This search engine processed two years of patient data from the OSU Medical Center's Information Warehouse and identified lung nodule location information using a combination of UMLS Metathesaurus-enhanced text report searches along with ICD9 code searches on patients who had been discharged. Five different queries with various ICD9 codes involving lung cancer were carried out on 172,552 cases. Each search completed in under a minute on average per ICD9 code, and the inclusion of the UMLS thesaurus increased the number of relevant cases by 45% on average.
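The combination of thesaurus-expanded free-text matching with an ICD9 code filter can be sketched as below. The record fields, thesaurus entries and matching logic are simplified stand-ins for illustration, not the OSU system or the UMLS API.

```python
# A simplified sketch of the combined search idea: expand the free-text term
# with synonyms (thesaurus stand-in), then keep only records that match both
# the expanded text query and the ICD9 code filter.

THESAURUS = {"lung nodule": ["pulmonary nodule", "coin lesion"]}  # assumed entries

def search(records, term, icd9_prefix):
    terms = [term] + THESAURUS.get(term, [])
    return [r["id"] for r in records
            if any(t in r["report"].lower() for t in terms)
            and any(c.startswith(icd9_prefix) for c in r["icd9"])]

records = [
    {"id": 1, "report": "A 6 mm pulmonary nodule in the right upper lobe.",
     "icd9": ["162.3"]},    # 162.x: malignant neoplasm of bronchus and lung
    {"id": 2, "report": "No focal lesion identified.", "icd9": ["162.3"]},
    {"id": 3, "report": "Stable lung nodule.", "icd9": ["486"]},   # pneumonia
]
print(search(records, "lung nodule", "162"))   # only record 1 matches both
```

Without the thesaurus expansion, record 1 would be missed even though it is clearly relevant, which is the effect the reported 45% gain quantifies.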

  7. Design and implementation of a standard framework for KSTAR control system

    International Nuclear Information System (INIS)

    Lee, Woongryol; Park, Mikyung; Lee, Taegu; Lee, Sangil; Yun, Sangwon; Park, Jinseop; Park, Kaprai

    2014-01-01

    Highlights: • We performed a standardization of the control system in KSTAR. • An EPICS-based software framework was developed for the realization of various control systems. • The applicability of the framework has been widened from a simple command dispatcher to real-time applications. • Our framework supports the implementation of an embedded IOC on an FPGA board. - Abstract: Standardization of the control system is an important issue in KSTAR, which is organized from various heterogeneous systems. Diverse control systems in KSTAR have been adopting new application software since 2010. Development of this software was launched for easy implementation of a data acquisition system, but it has been extended into a Standard Framework (SFW) for control systems in KSTAR. It is composed of a single library, database, templates, and descriptor files. SFW-based controllers have common features: a non-blocking control command method using a thread, an internal sequence handler that allows synchronization with KSTAR experiments, and a ring buffer pool mechanism for handling streaming input data. Recently, there have been two important functional improvements in the framework. A processor-embedded FPGA was proposed as a standard hardware platform for specific applications, and such boards are likewise operated by SFW-based embedded applications. This approach gives a single-board system the ability to perform low-level distributed control under the EPICS environment. We also developed a real-time monitoring system, used as a real-time network inspection tool in the 2012 campaign, using the SFW.

  8. Kodiak: An Implementation Framework for Branch and Bound Algorithms

    Science.gov (United States)

    Smith, Andrew P.; Munoz, Cesar A.; Narkawicz, Anthony J.; Markevicius, Mantas

    2015-01-01

    Recursive branch and bound algorithms are often used to refine and isolate solutions to several classes of global optimization problems. A rigorous computation framework for the solution of systems of equations and inequalities involving nonlinear real arithmetic over hyper-rectangular variable and parameter domains is presented. It is derived from a generic branch and bound algorithm that has been formally verified, and utilizes self-validating enclosure methods, namely interval arithmetic and, for polynomials and rational functions, Bernstein expansion. Since bounds computed by these enclosure methods are sound, this approach may be used reliably in software verification tools. Advantage is taken of the partial derivatives of the constraint functions involved in the system, firstly to reduce the branching factor by the use of bisection heuristics and secondly to permit the computation of bifurcation sets for systems of ordinary differential equations. The associated software development, Kodiak, is presented, along with examples of three different branch and bound problem types it implements.
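The bound-and-prune recursion described above can be illustrated with a toy interval-arithmetic example. This is a sketch only, not Kodiak's actual interface: it isolates the solution of a single equation by bisection, whereas Kodiak is formally verified, handles systems of constraints, and also uses Bernstein expansion and derivative-based bisection heuristics.

```python
# Toy interval branch and bound: isolate solutions of x**2 - 2 = 0 on [0, 2]
# by discarding boxes whose interval enclosure of f provably excludes zero.

def isqr(lo, hi):                 # sound interval enclosure of x*x
    cands = [lo * lo, lo * hi, hi * hi]
    out_lo = 0.0 if lo <= 0.0 <= hi else min(cands)
    return out_lo, max(cands)

def f_enclosure(lo, hi):          # enclosure of f(x) = x*x - 2
    s_lo, s_hi = isqr(lo, hi)
    return s_lo - 2.0, s_hi - 2.0

def branch_and_bound(lo, hi, tol=1e-6):
    f_lo, f_hi = f_enclosure(lo, hi)
    if f_lo > 0.0 or f_hi < 0.0:  # bound step: 0 not enclosed, prune the box
        return []
    if hi - lo < tol:             # box small enough: report it as an enclosure
        return [(lo, hi)]
    mid = 0.5 * (lo + hi)         # branch step: bisect and recurse
    return branch_and_bound(lo, mid, tol) + branch_and_bound(mid, hi, tol)

boxes = branch_and_bound(0.0, 2.0)
print(boxes)                      # tiny interval(s) enclosing sqrt(2)
```

Because the interval enclosure is sound, a pruned box is guaranteed to contain no solution, which is what makes this style of computation usable in verification tools.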

  9. A Test Generation Framework for Distributed Fault-Tolerant Algorithms

    Science.gov (United States)

    Goodloe, Alwyn; Bushnell, David; Miner, Paul; Pasareanu, Corina S.

    2009-01-01

    Heavyweight formal methods such as theorem proving have been successfully applied to the analysis of safety critical fault-tolerant systems. Typically, the models and proofs performed during such analysis do not inform the testing process of actual implementations. We propose a framework for generating test vectors from specifications written in the Prototype Verification System (PVS). The methodology uses a translator to produce a Java prototype from a PVS specification. Symbolic (Java) PathFinder is then employed to generate a collection of test cases. A small example is employed to illustrate how the framework can be used in practice.

  10. A Distributed Framework for Real Time Path Planning in Practical Multi-agent Systems

    KAUST Repository

    Abdelkader, Mohamed

    2017-10-19

    We present a framework for distributed, energy-efficient, and real-time-implementable algorithms for path planning in multi-agent systems. The proposed framework is presented in the context of a motivating example, capture the flag, an adversarial game played between two teams of autonomous agents called defenders and attackers. We start with the centralized formulation of the problem as a linear program because of its computational efficiency. We then present an approximation framework in which each agent solves a local version of the centralized linear program by communicating with its neighbors only. The premise of this work is that for practical multi-agent systems, real-time implementability of distributed algorithms is more crucial than global optimality. Thus, instead of verifying the proposed framework by performing offline simulations in MATLAB, we run extensive simulations in the robotic simulator V-REP, which includes a detailed dynamic model of quadrotors. Moreover, to create a realistic scenario, we allow a human operator to control the attacker quadrotor through a joystick in a single-attacker setup. These simulations confirm that the proposed framework is implementable in real time and yields performance comparable with the globally optimal solution under the considered scenarios.
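The flavor of the local approximation, each agent deciding from neighborhood information only, can be illustrated with a much-simplified stand-in. This is not the paper's LP-based algorithm: here each defender just greedily picks the nearest attacker it can "see" within a communication range, with all names and parameters hypothetical.

```python
# Illustrative stand-in for local decision-making in a defender/attacker game:
# each defender only uses information within comm_range (a proxy for
# neighbor-to-neighbor communication) and picks the nearest visible attacker.

import math

def local_assignment(defenders, attackers, comm_range):
    # defenders/attackers: {agent_id: (x, y)} positions
    assignment = {}
    for d, dp in sorted(defenders.items()):
        visible = {a: math.dist(dp, ap) for a, ap in attackers.items()
                   if math.dist(dp, ap) <= comm_range}
        if visible:   # ties broken by attacker id for determinism
            assignment[d] = min(visible, key=lambda a: (visible[a], a))
    return assignment

defenders = {"d1": (0.0, 0.0), "d2": (5.0, 0.0)}
attackers = {"a1": (1.0, 0.0), "a2": (6.0, 0.0)}
plan = local_assignment(defenders, attackers, comm_range=3.0)
print(plan)   # {'d1': 'a1', 'd2': 'a2'}
```

A centralized LP would optimize all pairings jointly; the trade-off the paper argues for is that a purely local rule like this is cheap enough to run in real time on each agent.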

  11. DISCRN: A Distributed Storytelling Framework for Intelligence Analysis.

    Science.gov (United States)

    Shukla, Manu; Dos Santos, Raimundo; Chen, Feng; Lu, Chang-Tien

    2017-09-01

    Storytelling connects entities (people, organizations) using their observed relationships to establish meaningful storylines. This can be extended to spatiotemporal storytelling that incorporates locations, time, and graph computations to enhance coherence and meaning. But when performed sequentially, these computations become a bottleneck, because the massive number of entities makes space and time complexity untenable. This article presents DISCRN, or distributed spatiotemporal ConceptSearch-based storytelling, a distributed framework for performing spatiotemporal storytelling. The framework extracts entities from microblogs and event data, and links these entities using a novel ConceptSearch to derive storylines in a distributed fashion utilizing the key-value pair paradigm. Performing these operations at scale allows deeper and broader analysis of storylines. The novel parallelization techniques speed up the generation and filtering of storylines on massive datasets. Experiments with microblog posts such as Twitter data and Global Database of Events, Language, and Tone events show the efficiency of the techniques in DISCRN.
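The key-value pair paradigm for linking entities can be sketched as a toy map/reduce job: map each post to (entity-pair, post-id) pairs, then reduce by key to find co-occurring entities that could be chained into storylines. The posts and entity names are hypothetical, and this sketch omits DISCRN's ConceptSearch and spatiotemporal components entirely.

```python
# Toy key-value (map/reduce style) entity linking: pairs of entities that
# co-occur in posts become candidate storyline edges.

from collections import defaultdict
from itertools import combinations

posts = {  # hypothetical microblog posts with pre-extracted entities
    "p1": ["ACME Corp", "J. Doe"],
    "p2": ["J. Doe", "Metro City"],
    "p3": ["ACME Corp", "J. Doe"],
}

def map_phase(posts):
    # Emit (entity-pair, post-id) key-value pairs; in a distributed run these
    # would be produced independently on many workers.
    for pid, entities in posts.items():
        for pair in combinations(sorted(entities), 2):
            yield pair, pid

def reduce_phase(pairs):
    # Group by key; each key's value list is the evidence for that link.
    links = defaultdict(list)
    for pair, pid in pairs:
        links[pair].append(pid)
    return dict(links)

links = reduce_phase(map_phase(posts))
print(links[("ACME Corp", "J. Doe")])   # posts supporting this storyline edge
```

Because the map phase is embarrassingly parallel and the reduce phase shards by key, this shape of computation scales out, which is the property DISCRN exploits.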

  12. PAPIRUS, a parallel computing framework for sensitivity analysis, uncertainty propagation, and estimation of parameter distribution

    International Nuclear Information System (INIS)

    Heo, Jaeseok; Kim, Kyung Doo

    2015-01-01

    Highlights: • We developed an interface between an engineering simulation code and statistical analysis software. • Multiple packages of sensitivity analysis, uncertainty quantification, and parameter estimation algorithms are implemented in the framework. • Parallel computing algorithms are also implemented in the framework to solve multiple computational problems simultaneously. - Abstract: This paper introduces a statistical data analysis toolkit, PAPIRUS, designed to perform model calibration, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis for both linear and nonlinear problems. PAPIRUS was developed by implementing multiple packages of methodologies and building an interface between an engineering simulation code and the statistical analysis algorithms. A parallel computing framework is implemented in PAPIRUS with multiple computing resources and proper communication between the server and the clients of each processor. It was shown that even though a large amount of data is considered for the engineering calculation, the distributions of the model parameters and the calculation results can be quantified accurately with significant reductions in computational effort. A general description of PAPIRUS with a graphical user interface is presented in Section 2. Sections 2.1–2.5 present the methodologies of data assimilation, uncertainty propagation, Chi-square linearity testing, and sensitivity analysis implemented in the toolkit, with some results obtained by each module of the software. Parallel computing algorithms adopted in the framework to solve multiple computational problems simultaneously are also summarized in the paper.
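The parallel uncertainty-propagation workflow described above can be sketched as follows. This is an illustrative stand-in, not the PAPIRUS interface: the toy `model` function and parameter distributions are assumptions, and a thread pool stands in for the server/client communication between processors.

```python
# Sketch of parallel uncertainty propagation: sample uncertain parameters,
# evaluate the simulation model for each sample on a pool of workers, then
# summarize the distribution of the output.

import random
import statistics
from concurrent.futures import ThreadPoolExecutor

def model(params):                 # stand-in for an engineering simulation run
    k, q = params
    return q / k                   # e.g. a simple steady-state response

random.seed(0)
samples = [(random.gauss(10.0, 0.5), random.gauss(100.0, 5.0))
           for _ in range(1000)]   # assumed parameter distributions

with ThreadPoolExecutor(max_workers=4) as pool:   # parallel evaluations
    outputs = list(pool.map(model, samples))

print(round(statistics.mean(outputs), 2), round(statistics.stdev(outputs), 2))
```

Because each model evaluation is independent, the wall-clock cost scales down with the number of workers, which is the reduction in computational effort the abstract refers to.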

  14. Communication Optimizations for a Wireless Distributed Prognostic Framework

    Science.gov (United States)

    Saha, Sankalita; Saha, Bhaskar; Goebel, Kai

    2009-01-01

    A distributed architecture for prognostics is an essential step toward feasible real-time system health management, and communication overhead is an important design problem for such systems. In this paper we focus on communication issues faced in the distributed implementation of an important class of algorithms for prognostics: particle filters. In spite of being computation- and memory-intensive, particle filters lend themselves well to distributed implementation except for one significant step, resampling. We propose a new resampling scheme called parameterized resampling that attempts to reduce communication between collaborating nodes in a distributed wireless sensor network. Analysis and comparison with relevant resampling schemes are also presented, with a battery health management system as the target application. Our proposed resampling scheme performs significantly better than existing schemes by reducing both the message length and the total number of messages exchanged, while not compromising prediction accuracy and precision. Future work will explore the effects of the new resampling scheme on the overall computational performance of the whole system, as well as full implementation of the new schemes on Sun SPOT devices. Exploring different network architectures for efficient communication is an important future research direction as well.
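For context, the resampling step the paper targets looks like this in its classical systematic form. This is the standard textbook scheme, not the proposed parameterized resampling; note how the output concentrates on high-weight particles, which is the information that must be communicated between nodes in a distributed filter.

```python
# Classical systematic resampling for a particle filter: draw one random
# offset, then sweep the cumulative weights with n evenly spaced pointers.

import random

def systematic_resample(particles, weights):
    n = len(particles)
    step = sum(weights) / n
    u = random.uniform(0.0, step)        # single random offset for all draws
    out, cum, i = [], weights[0], 0
    for _ in range(n):
        while cum < u:                   # advance to the particle covering u
            i += 1
            cum += weights[i]
        out.append(particles[i])
        u += step
    return out

random.seed(1)
parts = [0.0, 1.0, 2.0, 3.0]             # toy particle states
wts = [0.01, 0.01, 0.01, 0.97]           # one dominant particle
resampled = systematic_resample(parts, wts)
print(resampled)                         # mostly copies of the heavy particle
```

In a distributed implementation, exchanging these post-resampling particle sets between nodes is exactly the communication cost that a modified resampling scheme tries to shrink.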

  15. Distributed S-Net: design and implementation

    NARCIS (Netherlands)

    Grelck, C.; Julku, J.; Penczek, F.; Morazan, M.

    2009-01-01

    S-Net is a declarative coordination language and component technology aimed at modern multi-core/many-core architectures and systems-on-chip. It builds on the concept of stream processing to structure networks of communicating asynchronous components, which can be implemented using a conventional

  16. Spiking Activity of a LIF Neuron in Distributed Delay Framework

    Directory of Open Access Journals (Sweden)

    Saket Kumar Choudhary

    2016-06-01

    The evolution of membrane potential and spiking activity for a single leaky integrate-and-fire (LIF) neuron in the distributed delay framework (DDF) is investigated. DDF provides a mechanism to incorporate a memory element, in terms of a delay (kernel) function, into single-neuron models. This investigation covers the LIF neuron model with two different kinds of delay kernel functions, namely a gamma-distributed delay kernel and a hypo-exponentially distributed delay kernel. The evolution of membrane potential for the considered models is studied in terms of the stationary state probability distribution (SPD). The stationary state probability distributions of membrane potential (SPDV) for the considered neuron models are found to be asymptotically similar, namely Gaussian distributed. In order to investigate the effect of membrane potential delay, a rate code scheme for neuronal information processing is applied. The firing rate and Fano factor for the considered neuron models are calculated, with the standard LIF model used for comparison. It is noticed that distributed delay increases the spiking activity of a neuron, and the increase is larger for the hypo-exponentially distributed delay function than for the gamma-distributed delay function. Moreover, in the case of the hypo-exponential delay function, a LIF neuron generates spikes with a Fano factor less than 1.
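The basic construction, an LIF neuron whose input drive is passed through a delay kernel, can be sketched with a simple Euler integration. The parameters below are illustrative, not the paper's, and only the simplest exponential kernel (the order-1 gamma case) is shown, modeled by one auxiliary filtered-input variable rather than the paper's gamma and hypo-exponential kernels.

```python
# Euler-integrated leaky integrate-and-fire neuron, optionally with the input
# passed through an exponential delay kernel (auxiliary variable u).

def lif_spike_count(i_ext, delay_tau=None, tau_m=20.0, v_th=1.0,
                    dt=0.1, t_end=500.0):
    v, u, spikes = 0.0, 0.0, 0
    for _ in range(int(t_end / dt)):
        if delay_tau is None:
            drive = i_ext                       # standard LIF input
        else:
            u += dt * (i_ext - u) / delay_tau   # exponentially filtered input
            drive = u
        v += dt * (-v / tau_m + drive)          # leaky integration
        if v >= v_th:                           # threshold crossing: spike
            spikes += 1
            v = 0.0                             # reset
    return spikes

print(lif_spike_count(0.08), lif_spike_count(0.08, delay_tau=5.0))
```

With noisy input (as in the paper's stochastic analysis) the delayed and undelayed models develop the different firing statistics the abstract reports; this deterministic sketch only shows where the kernel enters the dynamics.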

  17. Implementing Distributed Algorithms using Remote Procedure Call

    NARCIS (Netherlands)

    Bal, H.E.; van Renesse, R.; Tanenbaum, A.S.

    1987-01-01

    Remote procedure call (RPC) is a simple yet powerful primitive for communication and synchronization between distributed processes. A problem with RPC is that it tends to decrease the amount of parallelism in an application due to its synchronous nature. This paper shows how light-weight processes
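The synchronous nature of RPC, the caller blocking until the reply arrives, can be modeled in-process with two queues. This toy sketch is not the paper's system; it only illustrates why a thread making an RPC can do no other work while the call is outstanding.

```python
# Toy in-process model of RPC: the client thread enqueues a request and then
# blocks on the reply queue until the server thread answers.

import threading
import queue

request_q, reply_q = queue.Queue(), queue.Queue()

def server():
    while True:
        name, arg = request_q.get()
        if name == "square":
            reply_q.put(arg * arg)
        elif name == "stop":
            reply_q.put(None)
            break

def rpc_call(name, arg=None):
    request_q.put((name, arg))
    return reply_q.get()           # caller blocks here: the synchronous step

threading.Thread(target=server, daemon=True).start()
result = rpc_call("square", 7)
rpc_call("stop")
print(result)                      # 49
```

Light-weight processes (threads) are one way to recover parallelism: while one thread blocks in `rpc_call`, others can keep running.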

  18. A framework for distributed mixed-language scientific applications

    International Nuclear Information System (INIS)

    Quarrie, D.R.

    1996-01-01

    The Object Management Group has defined an architecture (CORBA) for distributed object applications based on an Object Request Broker and Interface Definition Language. This project builds upon this architecture to establish a framework for the creation of mixed-language scientific applications. A prototype compiler has been written that generates FORTRAN 90 or Eiffel stubs and skeletons and the required C++ glue code from an input IDL file that specifies object interfaces. This generated code can be used directly for non-distributed mixed-language applications or in conjunction with the C++ code generated from a commercial IDL compiler for distributed applications. A feasibility study is presently under way to see whether a fully integrated software development environment for distributed, mixed-language applications can be created by modifying the back-end code generator of a commercial CASE tool to emit IDL. (author)

  19. An Effective Framework for Distributed Geospatial Query Processing in Grids

    Directory of Open Access Journals (Sweden)

    CHEN, B.

    2010-08-01

    Full Text Available The emergence of the Internet has greatly revolutionized the way that geospatial information is collected, managed, processed and integrated. There are several important research issues to be addressed for distributed geospatial applications. First, the performance of geospatial applications needs to be considered in the Internet environment. In this regard, the Grid as an effective distributed computing paradigm is a good choice. The Grid uses a series of middleware to interconnect and merge various distributed resources into a super-computer with capability of high performance computation. Secondly, it is necessary to ensure the secure use of independent geospatial applications in the Internet environment. The Grid provides exactly this utility of secure access to distributed geospatial resources. Additionally, it makes good sense to overcome the heterogeneity between individual geospatial information systems in the Internet. The Open Geospatial Consortium (OGC) proposes a number of generalized geospatial standards, e.g. OGC Web Services (OWS), to achieve interoperable access to geospatial applications. The OWS solution is feasible and widely adopted by both the academic community and the industry community. Therefore, we propose an integrated framework by incorporating OWS standards into Grids. Upon this framework, distributed geospatial queries can be performed in an interoperable, high-performance and secure Grid environment.

  20. A Generalized Cauchy Distribution Framework for Problems Requiring Robust Behavior

    Directory of Open Access Journals (Sweden)

    Carrillo, Rafael E.

    2010-01-01

    Full Text Available Statistical modeling is at the heart of many engineering problems. The importance of statistical modeling emanates not only from the desire to accurately characterize stochastic events, but also from the fact that distributions are the central models utilized to derive sample processing theories and methods. The generalized Cauchy distribution (GCD) family has a closed-form pdf expression across the whole family as well as algebraic tails, which makes it suitable for modeling many real-life impulsive processes. This paper develops a GCD theory-based approach that allows challenging problems to be formulated in a robust fashion. Notably, the proposed framework subsumes generalized Gaussian distribution (GGD) family-based developments, thereby guaranteeing performance improvements over traditional GCD-based problem formulation techniques. This robust framework can be adapted to a variety of applications in signal processing. As examples, we formulate four practical applications under this framework: (1) filtering for power line communications, (2) estimation in sensor networks with noisy channels, (3) reconstruction methods for compressed sensing, and (4) fuzzy clustering.
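As a rough illustration of why GCD-family modeling yields robust estimators, the sketch below implements the sample myriad, the maximum-likelihood location estimate under Cauchy noise (a GCD member), via a simple grid search. The data set and the linearity parameter `k` are invented for the example:

```python
import math

def sample_myriad(samples, k=1.0, grid_steps=2000):
    """Sample myriad: argmin over beta of sum(log(k^2 + (x - beta)^2)),
    the ML location estimate under Cauchy-distributed noise."""
    lo, hi = min(samples), max(samples)
    best_beta, best_cost = lo, float("inf")
    for i in range(grid_steps + 1):
        beta = lo + (hi - lo) * i / grid_steps
        cost = sum(math.log(k * k + (x - beta) ** 2) for x in samples)
        if cost < best_cost:
            best_beta, best_cost = beta, cost
    return best_beta

data = [0.9, 1.1, 1.0, 0.95, 1.05, 50.0]   # clean cluster near 1.0 plus one gross outlier
mean = sum(data) / len(data)               # the sample mean is pulled toward the outlier
myriad = sample_myriad(data, k=0.5)        # the myriad stays with the bulk of the data
```

The logarithmic (algebraic-tail) cost caps the influence any single sample can exert, which is the robustness property the abstract refers to.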

  1. Designing and implementing test automation frameworks with QTP

    CERN Document Server

    Bhargava, Ashish

    2013-01-01

    A tutorial-based approach, showing basic coding and designing techniques to build test automation frameworks. If you are a beginner, an automation engineer, an aspiring test automation engineer, a manual tester, a test lead or a test architect who wants to learn, create, and maintain test automation frameworks, this book will accelerate your ability to develop and adapt the framework.

  2. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science

    Directory of Open Access Journals (Sweden)

    Alexander Jeffery A

    2009-08-01

    Full Text Available Abstract Background Many interventions found to be effective in health services research studies fail to translate into meaningful patient care outcomes across multiple contexts. Health services researchers recognize the need to evaluate not only summative outcomes but also formative outcomes to assess the extent to which implementation is effective in a specific setting, prolongs sustainability, and promotes dissemination into other settings. Many implementation theories have been published to help promote effective implementation. However, they overlap considerably in the constructs included in individual theories, and a comparison of theories reveals that each is missing important constructs included in other theories. In addition, terminology and definitions are not consistent across theories. We describe the Consolidated Framework for Implementation Research (CFIR), which offers an overarching typology to promote implementation theory development and verification about what works where and why across multiple contexts. Methods We used a snowball sampling approach to identify published theories that were evaluated to identify constructs based on strength of conceptual or empirical support for influence on implementation, consistency in definitions, alignment with our own findings, and potential for measurement. We combined constructs across published theories that had different labels but were redundant or overlapping in definition, and we parsed apart constructs that conflated underlying concepts. Results The CFIR is composed of five major domains: intervention characteristics, outer setting, inner setting, characteristics of the individuals involved, and the process of implementation. Eight constructs were identified related to the intervention (e.g., evidence strength and quality), four constructs were identified related to the outer setting (e.g., patient needs and resources), and 12 constructs were identified related to the inner setting (e.g., culture

  3. A Distributed Python HPC Framework: ODIN, PyTrilinos, & Seamless

    Energy Technology Data Exchange (ETDEWEB)

    Grant, Robert [Enthought, Inc., Austin, TX (United States)

    2015-11-23

    Under this grant, three significant software packages were developed or improved, all with the goal of improving the ease-of-use of HPC libraries. The first component is a Python package, named DistArray (originally named Odin), that provides a high-level interface to distributed array computing. This interface is based on the popular and widely used NumPy package and is integrated with the IPython project for enhanced interactive parallel distributed computing. The second Python package is the Distributed Array Protocol (DAP) that enables separate distributed array libraries to share arrays efficiently without copying or sending messages. If a distributed array library supports the DAP, it is then automatically able to communicate with any other library that also supports the protocol. This protocol allows DistArray to communicate with the Trilinos library via PyTrilinos, which was also enhanced during this project. A third package, PyTrilinos, was extended to support distributed structured arrays (in addition to the unstructured arrays of its original design), allow more flexible distributed arrays (i.e., the restriction to double precision data was lifted), and implement the DAP. DAP support includes both exporting the protocol so that external packages can use distributed Trilinos data structures, and importing the protocol so that PyTrilinos can work with distributed data from external packages.
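A rough sketch of how a protocol-style export enables zero-copy array sharing between libraries, the idea behind the Distributed Array Protocol described above. The key names mimic the published DAP, but the exact schema shown here is illustrative rather than authoritative:

```python
from array import array

class LocalBlock:
    """Toy one-dimensional distributed-array block exposing a DAP-style export.
    The key names ('__version__', 'buffer', 'dim_data') follow the general shape
    of the published protocol; treat this schema as an illustrative assumption."""
    def __init__(self, data, global_size, start):
        self.data = array("d", data)
        self.global_size = global_size
        self.start = start

    def __distarray__(self):
        return {
            "__version__": "0.10.0",
            "buffer": memoryview(self.data),     # zero-copy view of the local data
            "dim_data": ({"dist_type": "b",      # 'b' = block distribution
                          "size": self.global_size,
                          "start": self.start,
                          "stop": self.start + len(self.data)},),
        }

def consume(block):
    """A consuming library reads the export without copying the buffer."""
    export = block.__distarray__()
    (dim,) = export["dim_data"]
    local = export["buffer"].tolist()
    return dim["start"], dim["stop"], sum(local)

# two processes would each hold one block of a global 8-element array
b0 = LocalBlock([1.0, 2.0, 3.0, 4.0], global_size=8, start=0)
```

Because both sides agree only on the export dictionary, any two libraries that speak the protocol can exchange blocks without copies or messages, as the abstract describes.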

  4. A General Framework for Analyzing, Characterizing, and Implementing Spectrally Modulated, Spectrally Encoded Signals

    National Research Council Canada - National Science Library

    Roberts, Marcus L

    2006-01-01

    .... Research is rapidly progressing in SDR hardware and software venues, but current CR-based SDR research lacks the theoretical foundation and analytic framework to permit efficient implementation...

  5. A Scalable Distribution Network Risk Evaluation Framework via Symbolic Dynamics

    Science.gov (United States)

    Yuan, Kai; Liu, Jian; Liu, Kaipei; Tan, Tianyuan

    2015-01-01

    Background Evaluations of electric power distribution network risks must address the problems of incomplete information and changing dynamics. A risk evaluation framework should be adaptable to a specific situation and an evolving understanding of risk. Methods This study investigates the use of symbolic dynamics to abstract raw data. After introducing symbolic dynamics operators, Kolmogorov-Sinai entropy and Kullback-Leibler relative entropy are used to quantitatively evaluate relationships between risk sub-factors and main factors. For layered risk indicators, where the factors are categorized into four main factors – device, structure, load and special operation – a merging algorithm using operators to calculate the risk factors is discussed. Finally, an example from the Sanya Power Company is given to demonstrate the feasibility of the proposed method. Conclusion Distribution networks are exposed and can be affected by many things. The topology and the operating mode of a distribution network are dynamic, so the faults and their consequences are probabilistic. PMID:25789859
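The entropy-based comparison of symbol statistics described above can be sketched as follows: raw measurements are coarse-grained into symbols, word histograms are formed, and Kullback-Leibler relative entropy quantifies how far an observed regime departs from a reference regime. The two-symbol partition, window length, and measurement series are invented for illustration:

```python
import math
from collections import Counter

def symbolize(series, threshold):
    """Coarse-grain a raw measurement series into a 0/1 symbol sequence."""
    return tuple(1 if x > threshold else 0 for x in series)

def words(symbols, length=2):
    """Word-level statistics: histogram of sliding windows over the symbols."""
    return Counter(symbols[i:i + length] for i in range(len(symbols) - length + 1))

def kl_divergence(p_counts, q_counts):
    """Kullback-Leibler relative entropy D(P || Q) between two symbol histograms."""
    total_p, total_q = sum(p_counts.values()), sum(q_counts.values())
    d = 0.0
    for sym, c in p_counts.items():
        p = c / total_p
        q = q_counts.get(sym, 0) / total_q
        if q == 0:
            return float("inf")       # Q assigns zero mass to a symbol P uses
        d += p * math.log(p / q)
    return d

normal = symbolize([0.2, 0.8, 0.9, 0.3, 0.8, 0.8, 0.1, 0.6], threshold=0.5)
faulty = symbolize([0.9, 0.9, 0.8, 0.9, 0.1, 0.9, 0.8, 0.9], threshold=0.5)
risk = kl_divergence(words(faulty), words(normal))   # larger = further from normal
```

A regime identical to the reference scores zero; the more the observed word statistics deviate, the larger the relative entropy, which is what makes it usable as a risk indicator.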

  6. Creating a Framework for Applying OAIS to Distributed Digital Preservation

    DEFF Research Database (Denmark)

    Zierau, Eld; Schultz, Matt; Skinner, Katherine

    … apparatuses in order to achieve the reliable persistence of digital content. Although the use of distribution is common within the preservation field, there is not yet an accepted definition for "distributed digital preservation". As the preservation field has matured, the term "distributed digital preservation" has been applied to myriad preservation approaches. In the white paper we define DDP as the use of replication, independence, and coordination to address the known threats to digital content through time to ensure their accessibility. The preservation field relies heavily upon an international …, delineating the various trends and practices that compel an elaboration upon OAIS, identifying the challenges ahead for advancing this endeavor, and putting forward a series of recommendations for making progress toward developing a formal framework for a DDP environment.

  7. Implementation of Perioperative Music Using the Consolidated Framework for Implementation Research.

    Science.gov (United States)

    Carter, Jessica E; Pyati, Srinivas; Kanach, Frances A; Maxwell, Ann Miller W; Belden, Charles M; Shea, Christopher M; Van de Ven, Thomas; Thompson, Jillian; Hoenig, Helen; Raghunathan, Karthik

    2018-06-12

    Complementary integrative health therapies have a perioperative role in the reduction of pain, analgesic use, and anxiety, and increasing patient satisfaction. However, long implementation lags have been quantified. The Consolidated Framework for Implementation Research (CFIR) can help mitigate this translational problem. We reviewed evidence for several nonpharmacological treatments (CFIR domain: characteristics of interventions) and studied external context and organizational readiness for change by surveying providers at 11 Veterans Affairs (VA) hospitals (domains: outer and inner settings). We asked patients about their willingness to receive music and studied the association between this and known risk factors for opioid use (domain: characteristics of individuals). We implemented a protocol for the perioperative use of digital music players loaded with veteran-preferred playlists and evaluated its penetration in a subgroup of patients undergoing joint replacements over a 6-month period (domain: process of implementation). We then extracted data on postoperative recovery time and other outcomes, comparing them with historic and contemporary cohorts. Evidence varied from strong and direct for perioperative music and acupuncture, to modest or weak and indirect for mindfulness, yoga, and tai chi, respectively. Readiness for change surveys completed by 97 perioperative providers showed overall positive scores (mean >0 on a scale from -2 to +2, equivalent to >2.5 on the 5-point Likert scale). Readiness was higher at Durham (+0.47) versus most other VA hospitals (range +0.05 to +0.63). Of 3307 veterans asked about willingness to receive music, approximately 68% (n = 2252) answered "yes." In multivariable analyses, a positive response (acceptability) was independently predicted by younger age and higher mean preoperative pain scores (>4 out of 10 over 90 days before admission), factors associated with opioid overuse. 
Penetration was modest in the targeted subset (39

  8. MAPI: a software framework for distributed biomedical applications

    Directory of Open Access Journals (Sweden)

    Karlsson Johan

    2013-01-01

    Full Text Available Abstract Background The amount of web-based resources (databases, tools, etc.) in biomedicine has increased, but the integrated usage of those resources is complex due to differences in access protocols and data formats. However, distributed data processing is becoming inevitable in several domains, in particular in biomedicine, where researchers face rapidly increasing data sizes. This big data is difficult to process locally because of the large processing, memory and storage capacity required. Results This manuscript describes a framework, called MAPI, which provides a uniform representation of resources available over the Internet, in particular for Web Services. The framework enhances their interoperability and collaborative use by enabling a uniform and remote access. The framework functionality is organized in modules that can be combined and configured in different ways to fulfil concrete development requirements. Conclusions The framework has been tested in the biomedical application domain where it has been a base for developing several clients that are able to integrate different web resources. The MAPI binaries and documentation are freely available at http://www.bitlab-es.com/mapi under the Creative Commons Attribution-No Derivative Works 2.5 Spain License. The MAPI source code is available by request (GPL v3 license).

  9. Framework and implementation for improving physics essential skills via computer-based practice: Vector math

    Science.gov (United States)

    Mikula, Brendon D.; Heckler, Andrew F.

    2017-06-01

    We propose a framework for improving accuracy, fluency, and retention of basic skills essential for solving problems relevant to STEM introductory courses, and implement the framework for the case of basic vector math skills over several semesters in an introductory physics course. Using an iterative development process, the framework begins with a careful identification of target skills and the study of specific student difficulties with these skills. It then employs computer-based instruction, immediate feedback, mastery grading, and well-researched principles from cognitive psychology such as interleaved training sequences and distributed practice. We implemented this with more than 1500 students over 2 semesters. Students completed the mastery practice for an average of about 13 min /week , for a total of about 2-3 h for the whole semester. Results reveal large (>1 SD ) pretest to post-test gains in accuracy in vector skills, even compared to a control group, and these gains were retained at least 2 months after practice. We also find evidence of improved fluency, student satisfaction, and that awarding regular course credit results in higher participation and higher learning gains than awarding extra credit. In all, we find that simple computer-based mastery practice is an effective and efficient way to improve a set of basic and essential skills for introductory physics.
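The interleaved and distributed practice principles mentioned above can be sketched in a few lines: cycle through skill categories rather than blocking them, then split the resulting sequence into spaced sessions. The skill names and session counts are illustrative, not the study's actual materials:

```python
import itertools

def interleaved_schedule(skills, reps):
    """Interleave practice items: cycle through skill categories (ABCABC...)
    rather than blocking all items of one skill together (AABBCC...)."""
    return list(itertools.islice(itertools.cycle(skills), len(skills) * reps))

def distributed_sessions(items, sessions):
    """Split the item list into spaced sessions (distributed, not massed, practice)."""
    per = len(items) // sessions
    return [items[i * per:(i + 1) * per] for i in range(sessions)]

skills = ["dot product", "cross product", "vector addition"]
sequence = interleaved_schedule(skills, reps=4)      # 12 items, no skill repeated back-to-back
weekly = distributed_sessions(sequence, sessions=4)  # 3 items per weekly session
```

Mastery grading would then gate progress on accuracy within each session, which is the part the computer-based system automates.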

  10. Global Framework for Climate Services (GFCS): status of implementation

    Science.gov (United States)

    Lucio, Filipe

    2015-04-01

    The World Climate Conference-3 (Geneva 2009) unanimously decided to establish the Global Framework for Climate Services (GFCS), a UN-led initiative spearheaded by WMO to guide the development and application of science-based climate information and services in support of decision-making in climate sensitive sectors. By promoting science-based decision-making, the GFCS is empowering governments, communities and companies to build climate resilience, reduce vulnerabilities and adapt to impacts. The initial priority areas of GFCS are Agriculture and Food Security; Disaster Risk Reduction; Health; and Water Resources. The implementation of GFCS is well underway with a governance structure now fully established. The governance structure of GFCS includes the Partner Advisory Committee (PAC), which is GFCS's stakeholder engagement mechanism. The membership of the PAC allows for a broad participation of stakeholders. The European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT), the European Commission (EC), the Food and Agriculture Organization of the UN (FAO), the Global Water Partnership (GWP), the International Federation of Red Cross and Red Crescent Societies (IFRC), the International Union of Geodesy and Geophysics (IUGG), United Nations Environment Programme (UNEP), the United Nations Institute for Training and Research (UNITAR), the World Business Council for Sustainable Development (WBCSD), the World Food Programme (WFP) and WMO have already joined the PAC. Activities are being implemented in various countries in Africa, the Caribbean, Asia and Pacific Small Islands Developing States through flagship projects and activities in the four priority areas of GFCS to enable the development of a Proof of Concept. 
The focus at national level is on strengthening institutional capacities needed for development of capacities for co-design and co-production of climate services and their application in support of decision-making in climate sensitive

  11. Development of a distributed air pollutant dry deposition modeling framework

    International Nuclear Information System (INIS)

    Hirabayashi, Satoshi; Kroll, Charles N.; Nowak, David J.

    2012-01-01

    A distributed air pollutant dry deposition modeling system was developed with a geographic information system (GIS) to enhance the functionality of i-Tree Eco (i-Tree, 2011). With the developed system, temperature, leaf area index (LAI) and air pollutant concentration in a spatially distributed form can be estimated, and based on these and other input variables, dry deposition of carbon monoxide (CO), nitrogen dioxide (NO2), sulfur dioxide (SO2), and particulate matter less than 10 microns (PM10) to trees can be spatially quantified. Employing nationally available road network, traffic volume, air pollutant emission/measurement and meteorological data, the developed system provides a framework for U.S. city managers to identify spatial patterns of urban forest and locate potential areas for future urban forest planting and protection to improve air quality. To exhibit the usability of the framework, a case study was performed for July and August of 2005 in Baltimore, MD. - Highlights: ► A distributed air pollutant dry deposition modeling system was developed. ► The developed system enhances the functionality of i-Tree Eco. ► The developed system employs nationally available input datasets. ► The developed system is transferable to any U.S. city. ► Future planting and protection spots were visually identified in a case study. - Employing nationally available datasets and a GIS, this study will provide urban forest managers in U.S. cities a framework to quantify and visualize urban forest structure and its air pollution removal effect.
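The core dry-deposition calculation behind such systems is the flux relation F = Vd · C (deposition velocity times concentration), scaled by canopy area and exposure time. The sketch below uses made-up inputs, not values from the Baltimore case study:

```python
def dry_deposition(conc_ug_m3, vd_cm_s, canopy_m2, hours):
    """Pollutant removal via the flux relation F = Vd * C, aggregated over
    canopy area and time. Concentration in ug/m^3, deposition velocity in
    cm/s; returns grams of pollutant removed."""
    vd_m_s = vd_cm_s / 100.0
    flux_ug_m2_s = vd_m_s * conc_ug_m3           # ug per m^2 of canopy per second
    total_ug = flux_ug_m2_s * canopy_m2 * hours * 3600.0
    return total_ug / 1e6                         # ug -> g

# illustrative numbers only: NO2 at 40 ug/m^3, Vd = 0.4 cm/s,
# 500 m^2 of leaf area, one day of exposure
no2_removed_g = dry_deposition(conc_ug_m3=40.0, vd_cm_s=0.4, canopy_m2=500.0, hours=24)
```

In a distributed system like the one described, this calculation runs per grid cell, with spatially varying concentration, LAI-derived canopy area, and meteorology-dependent deposition velocity.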

  12. A CONCEPTUAL FRAMEWORK OF DISTRIBUTIVE JUSTICE IN ISLAMIC ECONOMICS

    Directory of Open Access Journals (Sweden)

    Shafinah Begum Abdul Rahim

    2015-06-01

    Full Text Available …itical, behavioural and social sciences, both mainstream and Islamic. Given its increasing relevance to the global village we share and the intensity of socio-economic problems invariably related to the distribution of resources amongst us, this work is aimed at adding value through a deeper understanding and appreciation of the justice placed by the Syariah in all domains of our economic lives. The existing works within this area appear to lean mostly towards redistributive mechanisms available in the revealed knowledge. Hence a comprehensive analysis of the notion of distributive justice, from the theoretical level translated into practical terms, is expected to contribute significantly to policymakers committed to finding permanent solutions to economic problems, especially in the Muslim world. It is a modest yet serious attempt to bridge the gap between distributive justice in letter and in spirit as clearly ordained in the Holy Quran. The entire analysis is based on critical reviews and appraisals of all the relevant literature on distributive justice in Islamic Economics. The final product is a conceptual framework that can be used as a blueprint in establishing the notion of justice in the distribution of economic resources, i.e. income and wealth, as aspired to by the Syariah.

  13. California Curriculum Frameworks: A Handbook for Production, Implementation, and Evaluation Activities.

    Science.gov (United States)

    California State Dept. of Education, Sacramento.

    This booklet describes the characteristics and role of curriculum frameworks and describes how they can be used in developing educational programs. It is designed as a guide for writers of frameworks, for educators who are responsible for implementing frameworks, or for evaluators of educational programs. It provides a concise description of the…

  14. A Framework Proposal For Choosing A New Business Implementation Model In Henkel

    OpenAIRE

    Li, Tsz Wan

    2015-01-01

    Henkel's New Business team is a corporate venturing unit that explores corporate entrepreneurial activities on behalf of Henkel Adhesives Technologies. The new business ideas are implemented through one of these models: incubator, venturing or innovation ecosystem. In current practice, there is no systematic framework in place to choose the implementation model. The goal of the thesis is to propose a framework for choosing the most appropriate model for implementation of a new business idea i...

  15. A framework for effective implementation of lean production in Small and Medium-sized Enterprises

    Directory of Open Access Journals (Sweden)

    Amine Belhadi

    2016-09-01

    Full Text Available Purpose: The present paper aims at developing an effective framework including all the components necessary for implementing lean production properly in Small and Medium-sized Enterprises. Design/methodology/approach: The paper begins with a review of the main existing frameworks of lean implementation in order to highlight a shortcoming in the literature, namely the lack of a suitable framework for small companies. To overcome this gap, data on successful initiatives of lean implementation were collected based on a multiple case study approach. These initiatives have been juxtaposed in order to develop a new, practical and effective framework that includes all the components (process, tools, success factors) that are necessary to implement lean in Small and Medium-sized Enterprises. Findings: The proposed framework makes several significant contributions. First, it overcomes the limitations of the existing frameworks by offering consultants, researchers and organizations an effective framework for lean implementation in SMEs, one that allows SMEs to benefit from the competitive advantages gained by lean. Second, it brings together a set of the most essential and critical elements of lean implementation commonly used by SMEs and derived from their practical experiences in lean implementation. Finally, it highlights the successful experiences of small companies in implementing lean programs and thereby shows that lean can yield relevant results even for SMEs. Research limitations/implications: The proposed framework presents a number of limitations and invites extension through further research: Although it was derived from practical experiences of SMEs, the proposed framework is not supported by practical implementation. On the other hand, even though the elements in the proposed framework derive from the practical experiences of four SMEs, the identified elements need to be generalized and enriched by conducting

  16. Promoting Action on Research Implementation in Health Services framework applied to TeamSTEPPS implementation in small rural hospitals.

    Science.gov (United States)

    Ward, Marcia M; Baloh, Jure; Zhu, Xi; Stewart, Greg L

    A particularly useful model for examining implementation of quality improvement interventions in health care settings is the PARIHS (Promoting Action on Research Implementation in Health Services) framework developed by Kitson and colleagues. The PARIHS framework proposes three elements (evidence, context, and facilitation) that are related to successful implementation. An evidence-based program focused on quality enhancement in health care, termed TeamSTEPPS (Team Strategies and Tools to Enhance Performance and Patient Safety), has been widely promoted by the Agency for Healthcare Research and Quality, but research is needed to better understand its implementation. We apply the PARIHS framework in studying TeamSTEPPS implementation to identify elements that are most closely related to successful implementation. Quarterly interviews were conducted over a 9-month period in 13 small rural hospitals that implemented TeamSTEPPS. Interview quotes that were related to each of the PARIHS elements were identified using directed content analysis. Transcripts were also scored quantitatively, and bivariate regression analysis was employed to explore relationships between PARIHS elements and successful implementation related to planning activities. The current findings provide support for the PARIHS framework and identified two of the three PARIHS elements (context and facilitation) as important contributors to successful implementation. This study applies the PARIHS framework to TeamSTEPPS, a widely used quality initiative focused on improving health care quality and patient safety. By focusing on small rural hospitals that undertook this quality improvement activity of their own accord, our findings represent effectiveness research in an understudied segment of the health care delivery system. 
By identifying context and facilitation as the most important contributors to successful implementation, these analyses provide a focus for efficient and effective sustainment of Team

  17. The Consolidated Framework for Implementation Research (CFIR): a useful theoretical framework for guiding and evaluating a guideline implementation process in a hospital-based nursing practice.

    Science.gov (United States)

    Breimaier, Helga E; Heckemann, Birgit; Halfens, Ruud J G; Lohrmann, Christa

    2015-01-01

    Implementing clinical practice guidelines (CPGs) in healthcare settings is a complex intervention involving both independent and interdependent components. Although the Consolidated Framework for Implementation Research (CFIR) has never been evaluated in a practical context, it appeared to be a suitable theoretical framework to guide an implementation process. The aim of this study was to evaluate the comprehensiveness, applicability and usefulness of the CFIR in the implementation of a fall-prevention CPG in nursing practice to improve patient care in an Austrian university teaching hospital setting. The evaluation of the CFIR was based on (1) team-meeting minutes, (2) the main investigator's research diary, containing a record of a before-and-after, mixed-methods study design embedded in a participatory action research (PAR) approach for guideline implementation, and (3) an analysis of qualitative and quantitative data collected from graduate and assistant nurses in two Austrian university teaching hospital departments. The CFIR was used to organise data per and across time point(s) and assess their influence on the implementation process, resulting in implementation and service outcomes. Overall, the CFIR could be demonstrated to be a comprehensive framework for the implementation of a guideline into a hospital-based nursing practice. However, the CFIR did not account for some crucial factors during the planning phase of an implementation process, such as consideration of stakeholder aims and wishes/needs when implementing an innovation, pre-established measures related to the intended innovation and pre-established strategies for implementing an innovation. For the CFIR constructs reflecting & evaluating and engaging, a more specific definition is recommended. The framework and its supplements could easily be used by researchers, and their scope was appropriate for the complexity of a prospective CPG-implementation project. 
The CFIR facilitated qualitative data

  18. Implementing accountability for reasonableness framework at district level in Tanzania

    DEFF Research Database (Denmark)

    Maluka, Stephen; Kamuzora, Peter; SanSebastián, Miguel

    2011-01-01

    Despite the growing importance of the Accountability for Reasonableness (A4R) framework in priority setting worldwide, there is still an inadequate understanding of the processes and mechanisms underlying its influence on legitimacy and fairness, as conceived and reflected in service management...

  19. Legal framework for implementation of m-government in Ethiopia ...

    African Journals Online (AJOL)

    Higher penetration of mobile services in many countries, including Ethiopia, makes m-Government an eminent technological option for delivering government services to public and businesses. Although the Ethiopian government has introduced e-government services to the public, the legal framework to support such ...

  20. Implementation of disability policy framework in Namibia: A qualitative study

    Directory of Open Access Journals (Sweden)

    Tonderai W. Shumba

    2018-04-01

    Conclusions: The study revealed key issues that need to be addressed in reviewing the policy and legal framework so that it is responsive to the current needs of persons with disabilities. Further, the CBR programme needs an evaluation tool to assess its effectiveness and efficiency in meeting the needs of persons with disabilities and also to elicit their experiences and satisfaction.

  1. First Thoughts on Implementing the Framework for Information Literacy

    Science.gov (United States)

    Jacobson, Trudi E.; Gibson, Craig

    2015-01-01

    Following the action of the ACRL Board in February 2015 in accepting the "Framework for Information Literacy for Higher Education" as one of the "constellation of documents" that promote and guide information literacy instruction and program development, discussion in the library community continues about steps in implementing…

  2. Framework of ERP System Implementation for SMEs In Punjab

    OpenAIRE

    Sarvjit Singh

    2012-01-01

    Enterprise resource planning (ERP) systems, which can effectively reduce product costs, improve the customer service experience, and increase enterprise competitiveness, are among the most significant information systems for Punjab enterprises. However, the successful implementation rate of ERP systems is much lower than initially planned, and many enterprises did not achieve their intended goals. There are a lot of factors (e.g., high implementation costs, technical complexity, lack of well-trained employee...

  3. Establishing an intrapreneurial orientation as strategy: A framework for implementation

    Directory of Open Access Journals (Sweden)

    H. Jacobs

    2001-12-01

    Full Text Available This paper describes a study aimed at increasing an organisation's ability to implement a strategy for establishing an intrapreneurial orientation effectively. Establishing an intrapreneurial orientation will be treated from a strategic management point of view, with the emphasis on the implementation phase of strategic management. As such, this study seeks to integrate theory and practice from the fields of strategic management and entrepreneurship.

  4. Framework for measuring the sustainability performance of ecodesign implementation

    DEFF Research Database (Denmark)

    Rodrigues, Vinicius Picanco; Pigosso, Daniela Cristina Antelmi; McAloone, Tim C.

    Companies and academic studies are consistently reporting several potential business benefits gained from ecodesign implementation, such as increased innovation potential, development of new markets and business models, reduction in environmental liability, risks and costs, and improvement of organiza…, in alignment with corporate sustainability strategy and main drivers…

  5. Design Of Real-Time Implementable Distributed Suboptimal Control: An LQR Perspective

    KAUST Repository

    Jaleel, Hassan

    2017-09-29

    We propose a framework for multiagent systems in which the agents compute their control actions in real time, based on local information only. The novelty of the proposed framework is that the process of computing a suboptimal control action is divided into two phases: an offline phase and an online phase. In the offline phase, an approximate problem is formulated with a cost function that is close to the optimal cost in some sense and is distributed, i.e., the costs of non-neighboring nodes are not coupled. This phase is centralized and is completed before the deployment of the system. In the online phase, the approximate problem is solved in real time by implementing any efficient distributed optimization algorithm. To quantify the performance loss, we derive upper bounds for the maximum error between the optimal performance and the performance under the proposed framework. Finally, the proposed framework is applied to an example setup in which a team of mobile nodes is assigned the task of establishing a communication link between two base stations with minimum energy consumption. We show through simulations that the performance under the proposed framework is close to the optimal performance and the suboptimal policy can be efficiently implemented online.
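The offline/online split described above is benchmarked against a centralized optimal controller. As a point of reference, the centralized discrete-time LQR baseline that such suboptimal schemes are compared to can be sketched in a few lines; the system matrices below are illustrative choices, not taken from the paper:

```python
import numpy as np

# Centralized discrete-time LQR baseline (the "optimal performance" that a
# distributed suboptimal policy is measured against). A minimal sketch.
def dlqr_gain(A, B, Q, R, iters=500):
    """Solve the discrete algebraic Riccati equation by fixed-point
    iteration and return the optimal feedback gain K (u = -K x)."""
    P = Q.copy()
    for _ in range(iters):
        BtP = B.T @ P
        K = np.linalg.solve(R + BtP @ B, BtP @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

A = np.array([[1.0, 0.1], [0.0, 1.0]])   # double integrator, dt = 0.1
B = np.array([[0.0], [0.1]])
Q = np.eye(2)
R = np.array([[1.0]])
K = dlqr_gain(A, B, Q, R)

# The closed-loop matrix A - B K should be stable (spectral radius < 1).
print(max(abs(np.linalg.eigvals(A - B @ K))))
```

A distributed scheme of the kind the paper proposes would replace this centralized solve with an approximate, decoupled cost minimized online; the gap between the two is what the paper's error bounds quantify.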

  6. Barriers to Implementing the Response to Intervention Framework in Secondary Schools: Interviews with Secondary Principals

    Science.gov (United States)

    Bartholomew, Mitch; De Jong, David

    2017-01-01

    Despite the successful implementation of the Response to Intervention (RtI) framework in many elementary schools, there is little evidence of successful implementation in high school settings. Several themes emerged from the interviews of nine secondary principals, including a lack of knowledge and training for successful implementation, the…

  7. A portable implementation of ARPACK for distributed memory parallel architectures

    Energy Technology Data Exchange (ETDEWEB)

    Maschhoff, K.J.; Sorensen, D.C.

    1996-12-31

    ARPACK is a package of Fortran 77 subroutines which implement the Implicitly Restarted Arnoldi Method used for solving large sparse eigenvalue problems. A parallel implementation of ARPACK is presented which is portable across a wide range of distributed memory platforms and requires minimal changes to the serial code. The communication layers used for message passing are the Basic Linear Algebra Communication Subprograms (BLACS) developed for the ScaLAPACK project and the Message Passing Interface (MPI).
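For readers who want to experiment with the underlying method, SciPy exposes ARPACK's Implicitly Restarted Arnoldi Method through `scipy.sparse.linalg.eigs`. A minimal serial example (the test matrix is our own choice, not from the paper):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigs  # SciPy's interface to ARPACK

# Large sparse matrix: the 1-D discrete Laplacian (tridiagonal), a
# classic sparse-eigenvalue test problem.
n = 1000
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")

# Implicitly Restarted Arnoldi Method: compute the 4 largest-magnitude
# eigenvalues without ever forming a dense n x n matrix.
vals, vecs = eigs(A, k=4, which="LM")

# Eigenvalues of this Laplacian are 2 - 2*cos(pi*j/(n+1)); the largest
# approach 4 as n grows.
print(sorted(vals.real, reverse=True))
```

The parallel version described in the record distributes the matrix-vector products and the Arnoldi basis across processes via BLACS or MPI, while keeping the same user-facing reverse-communication interface.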

  8. Combined use of the Consolidated Framework for Implementation Research (CFIR) and the Theoretical Domains Framework (TDF): a systematic review.

    Science.gov (United States)

    Birken, Sarah A; Powell, Byron J; Presseau, Justin; Kirk, M Alexis; Lorencatto, Fabiana; Gould, Natalie J; Shea, Christopher M; Weiner, Bryan J; Francis, Jill J; Yu, Yan; Haines, Emily; Damschroder, Laura J

    2017-01-05

    Over 60 implementation frameworks exist. Using multiple frameworks may help researchers to address multiple study purposes, levels, and degrees of theoretical heritage and operationalizability; however, using multiple frameworks may result in unnecessary complexity and redundancy if doing so does not address study needs. The Consolidated Framework for Implementation Research (CFIR) and the Theoretical Domains Framework (TDF) are both well-operationalized, multi-level implementation determinant frameworks derived from theory. As such, the rationale for using the frameworks in combination (i.e., CFIR + TDF) is unclear. The objective of this systematic review was to elucidate the rationale for using CFIR + TDF by (1) describing studies that have used CFIR + TDF, (2) how they used CFIR + TDF, and (3) their stated rationale for using CFIR + TDF. We undertook a systematic review to identify studies that mentioned both the CFIR and the TDF, were written in English, were peer-reviewed, and reported either a protocol or results of an empirical study in MEDLINE/PubMed, PsycInfo, Web of Science, or Google Scholar. We then abstracted data into a matrix and analyzed it qualitatively, identifying salient themes. We identified five protocols and seven completed studies that used CFIR + TDF. CFIR + TDF was applied to studies in several countries, to a range of healthcare interventions, and at multiple intervention phases; used many designs, methods, and units of analysis; and assessed a variety of outcomes. Three studies indicated that using CFIR + TDF addressed multiple study purposes. Six studies indicated that using CFIR + TDF addressed multiple conceptual levels. Four studies did not explicitly state their rationale for using CFIR + TDF. Differences in the purposes that authors of the CFIR (e.g., comprehensive set of implementation determinants) and the TDF (e.g., intervention development) propose help to justify the use of CFIR…

  9. Developing a Framework and Implementing User-Driven Innovation in Supply and Value Network

    DEFF Research Database (Denmark)

    Jacobsen, Alexia; Lassen, Astrid Heidemann; Wandahl, Søren

    2011-01-01

    This paper serves to create a framework for, and subsequently implement, user-driven innovation in a construction material industry network. The research has its outset in Project InnoDoors, which consists of a Danish university and a construction material network. The framework and the implemen…

  10. Implementation of a Parameterization Framework for Cybersecurity Laboratories

    Science.gov (United States)

    2017-03-01

    …that the student performed each step of an exercise, to help instructors assess the level of learning by each student. The framework should automate… automated assessment tools (AATs) created to help assess how students perform in programming courses. In the paper, “Are Automated Assessment Tools…

  11. Framework for successfully implementing an inaugural GRI reporting process

    OpenAIRE

    Dudik, Anna

    2012-01-01

    Project submitted as partial requirement for the conferral of Master in International Management. This thesis is a corporate project analyzing the Global Reporting Initiative (GRI) reporting process. Its main objective is to propose a practical framework to guide organizations that plan to engage in first-time voluntary sustainability reporting using GRI’s Sustainability Reporting Guidelines. The thesis provides insight into the exact tasks involved in each stage of the GRI repo...

  12. Mobile Autonomous Sensing Unit (MASU): A Framework That Supports Distributed Pervasive Data Sensing

    Directory of Open Access Journals (Sweden)

    Esunly Medina

    2016-07-01

    Full Text Available Pervasive data sensing is a major issue that traverses various research areas and application domains. It allows identifying people’s behaviour and patterns without overwhelming the monitored persons. Although there are many pervasive data sensing applications, they are typically focused on addressing specific problems in a single application domain, making them difficult to generalize or reuse. On the other hand, the platforms for supporting pervasive data sensing impose restrictions on the devices and operational environments that make them unsuitable for monitoring loosely-coupled or fully distributed work. In order to help address this challenge, this paper presents a framework that supports distributed pervasive data sensing in a generic way. Developers can use this framework to facilitate the implementation of their applications, thus reducing the complexity and effort of such an activity. The framework was evaluated using simulations and also through an empirical test, and the obtained results indicate that it is useful for supporting such a sensing activity in loosely-coupled or fully distributed work scenarios.
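A publish/subscribe core is one common way to keep sensing producers and consumers loosely coupled, as the record describes. The sketch below is hypothetical; the class and method names are illustrative and are not MASU's actual API:

```python
from typing import Callable, Dict, List

# Hypothetical sketch of a generic pervasive-sensing core in the spirit
# of MASU: producers publish readings by sensor type, and the framework
# fans them out to decoupled consumers. All names here are invented.
class SensingCore:
    def __init__(self) -> None:
        self._consumers: Dict[str, List[Callable[[dict], None]]] = {}

    def subscribe(self, sensor_type: str, consumer: Callable[[dict], None]) -> None:
        """Attach a consumer to one sensor type; consumers never see each other."""
        self._consumers.setdefault(sensor_type, []).append(consumer)

    def publish(self, sensor_type: str, reading: dict) -> None:
        """Fan a reading out to every consumer registered for this type."""
        for consumer in self._consumers.get(sensor_type, []):
            consumer(reading)

core = SensingCore()
seen = []
core.subscribe("accelerometer", seen.append)
core.publish("accelerometer", {"x": 0.1, "y": 0.0, "z": 9.8})
core.publish("gps", {"lat": 41.4, "lon": 2.2})  # no subscriber: dropped
print(seen)  # [{'x': 0.1, 'y': 0.0, 'z': 9.8}]
```

The same pattern generalizes to distributed settings by replacing the in-process dictionary with a message broker, which is what makes such frameworks reusable across application domains.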

  13. Recommendations for institutional policy and network regulatory frameworks towards distributed generation in EU Member States

    International Nuclear Information System (INIS)

    Ten Donkelaar, M.; Van Oostvoorn, F.

    2005-01-01

    Recommendations regarding the development of regulatory frameworks and institutional policies towards an optimal integration of distributed generation (DG) into electricity networks are presented. These recommendations are based on findings from a benchmarking study conducted in the framework of the ENIRDG-net project. The aim of the benchmarking exercise was to identify examples of well-defined pro-DG policies, with clear targets and adequate implementation mechanisms. In this study an adequate pro-DG policy is defined on the basis of a level playing field, a situation where distributed and centralised generation receive equal incentives and have equal access to the liberalised markets for electricity. The benchmark study includes the results of a similar study conducted in the framework of the SUSTELNET project. When comparing the results a certain discrepancy can be noticed between the actual regulation and policy in a number of countries, the medium to long-term targets and the ideal situation described by the level playing field objective. To overcome this discrepancy, a number of recommendations have been drafted for future policy and regulation towards distributed generation

  14. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun

    2013-01-01

    There has been a large body of statements claiming that the large-scale deployment of Distributed Energy Resources (DERs) could eventually reshape the future distribution grid operation in numerous ways. Thus, it is necessary to introduce a framework to measure to what extent the power system…, comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation results show that in general the DER deployment brings in the possibilities to reduce the power losses and voltage drops by compensating power from the local generation and optimizing the local load profiles.

  15. Towards Scalable Distributed Framework for Urban Congestion Traffic Patterns Warehousing

    Directory of Open Access Journals (Sweden)

    A. Boulmakoul

    2015-01-01

    Full Text Available We put forward the architecture of a framework for integrating data from moving objects in an urban transportation network. Most of this research relies on GPS outdoor geolocation technology and uses a distributed cloud infrastructure with a big data NoSQL database. A network of intelligent mobile sensors, distributed over the urban network, produces congestion traffic patterns. Congestion predictions are based on an extended simulation model. This model provides traffic indicator calculations, which are fused with the GPS data to allow estimation of traffic states across the whole network. The discovery process for congestion patterns uses the semantic trajectories metamodel given in our previous works. The challenge of the proposed solution is to store traffic patterns, with the aim of ensuring surveillance and intelligent real-time network control to reduce congestion and avoid its consequences. The fusion of real-time data from GPS-enabled smartphones with data provided by existing traffic systems improves knowledge of traffic congestion, as well as generating new information for soft operational control and providing intelligent added value for transportation systems deployment.

  16. Implementing Culture Change in Nursing Homes: An Adaptive Leadership Framework.

    Science.gov (United States)

    Corazzini, Kirsten; Twersky, Jack; White, Heidi K; Buhr, Gwendolen T; McConnell, Eleanor S; Weiner, Madeline; Colón-Emeric, Cathleen S

    2015-08-01

    To describe key adaptive challenges and leadership behaviors to implement culture change for person-directed care. The study design was a qualitative, observational study of nursing home staff perceptions of the implementation of culture change in each of 3 nursing homes. We conducted 7 focus groups of licensed and unlicensed nursing staff, medical care providers, and administrators. Questions explored perceptions of facilitators and barriers to culture change. Using a template organizing style of analysis with immersion/crystallization, themes of barriers and facilitators were coded for adaptive challenges and leadership. Six key themes emerged, including relationships, standards and expectations, motivation and vision, workload, respect of personhood, and physical environment. Within each theme, participants identified barriers that were adaptive challenges and facilitators that were examples of adaptive leadership. Commonly identified challenges were how to provide person-directed care in the context of extant rules or policies or how to develop staff motivated to provide person-directed care. Implementing culture change requires the recognition of adaptive challenges for which there are no technical solutions, but which require reframing of norms and expectations, and the development of novel and flexible solutions. Managers and administrators seeking to implement person-directed care will need to consider the role of adaptive leadership to address these adaptive challenges. © The Author 2014. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  17. Framework for assessing the viability of implementing dual water ...

    African Journals Online (AJOL)

    In many settlements across the world (e.g. Pimpama Coomera and Mawson Lakes – Australia, Hong Kong – China, Majuro – Marshall Islands, Tarawa – Kiribati, and Windhoek – Namibia), dual water reticulation systems have been implemented in response to increasing water demands and decreasing freshwater ...

  18. A Systemic Approach to Implementing a Protective Factors Framework

    Science.gov (United States)

    Parsons, Beverly; Jessup, Patricia; Moore, Marah

    2014-01-01

    The leadership team of the national Quality Improvement Center on early Childhood ventured into the frontiers of deep change in social systems by funding four research projects. The purpose of the research projects was to learn about implementing a protective factors approach with the goal of reducing the likelihood of child abuse and neglect. In…

  19. Establishing a framework to implement 4D XCAT Phantom for 4D radiotherapy research

    Directory of Open Access Journals (Sweden)

    Raj K Panta

    2012-01-01

    Conclusions: An integrated computer program has been developed to generate, review, analyse, process, and export the 4D XCAT images. A framework has been established to implement the 4D XCAT phantom for 4D RT research.

  20. Distributed Framework for Dynamic Telescope and Instrument Control

    Science.gov (United States)

    Ames, Troy J.; Case, Lynne

    2002-01-01

    Traditionally, instrument command and control systems have been developed specifically for a single instrument. Such solutions are frequently expensive and are inflexible to support the next instrument development effort. NASA Goddard Space Flight Center is developing an extensible framework, known as Instrument Remote Control (IRC), that applies to any kind of instrument that can be controlled by a computer. IRC combines the platform independent processing capabilities of Java with the power of the Extensible Markup Language (XML). A key aspect of the architecture is software that is driven by an instrument description, written using the Instrument Markup Language (IML). IML is an XML dialect used to describe graphical user interfaces to control and monitor the instrument, command sets and command formats, data streams, communication mechanisms, and data processing algorithms. The IRC framework provides the ability to communicate to components anywhere on a network using the JXTA protocol for dynamic discovery of distributed components. JXTA (see http://www.jxta.org) is a generalized protocol that allows any devices connected by a network to communicate in a peer-to-peer manner. IRC uses JXTA to advertise a device's IML and discover devices of interest on the network. Devices can join or leave the network and thus join or leave the instrument control environment of IRC. Currently, several astronomical instruments are working with the IRC development team to develop custom components for IRC to control their instruments. These instruments include: High resolution Airborne Wideband Camera (HAWC), a first light instrument for the Stratospheric Observatory for Infrared Astronomy (SOFIA); Submillimeter And Far Infrared Experiment (SAFIRE), a Principal Investigator instrument for SOFIA; and Fabry-Perot Interferometer Bolometer Research Experiment (FIBRE), a prototype of the SAFIRE instrument, used at the Caltech Submillimeter Observatory (CSO). Most recently, we have…
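The description-driven approach can be illustrated with Python's standard XML tooling. The IML-like snippet below is invented for illustration only; the real IML schema is defined by the IRC project:

```python
import xml.etree.ElementTree as ET

# Hypothetical IML-like instrument description. The element and attribute
# names are invented for this sketch; they are not the real IML schema.
iml = """
<instrument name="ExampleCamera">
  <command name="setExposure">
    <argument name="milliseconds" type="int"/>
  </command>
  <command name="readFrame"/>
</instrument>
"""

root = ET.fromstring(iml)

# A description-driven framework would build its command dispatch table
# (and GUI) from the parsed description rather than from hard-coded logic.
commands = {cmd.get("name"): [arg.get("name") for arg in cmd.findall("argument")]
            for cmd in root.findall("command")}
print(root.get("name"), commands)
```

Driving the control software from such a description is what lets one framework serve many instruments: supporting a new instrument means writing a new XML file, not new command-and-control code.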

  1. Thermodynamic framework for compact q-Gaussian distributions

    Science.gov (United States)

    Souza, Andre M. C.; Andrade, Roberto F. S.; Nobre, Fernando D.; Curado, Evaldo M. F.

    2018-02-01

    Recent works have associated systems of particles, characterized by short-range repulsive interactions and evolving under overdamped motion, to a nonlinear Fokker-Planck equation within the class of nonextensive statistical mechanics, with a nonlinear diffusion contribution whose exponent is given by ν = 2 - q. The particular case ν = 2 applies to interacting vortices in type-II superconductors, whereas ν > 2 covers systems of particles characterized by short-range power-law interactions, where correlations among particles are taken into account. In the former case, several studies presented a consistent thermodynamic framework based on the definition of an effective temperature θ (presenting experimental values much higher than typical room temperatures T, so that thermal noise could be neglected), conjugated to a generalized entropy sν (with ν = 2). Herein, the whole thermodynamic scheme is revisited and extended to systems of particles interacting repulsively, through short-ranged potentials, described by an entropy sν, with ν > 1, covering the ν = 2 (vortices in type-II superconductors) and ν > 2 (short-range power-law interactions) physical examples. One basic requirement concerns a cutoff in the equilibrium distribution Peq(x) , approached due to a confining external harmonic potential, ϕ(x) = αx2 / 2 (α > 0). The main results achieved are: (a) The definition of an effective temperature θ conjugated to the entropy sν; (b) The construction of a Carnot cycle, whose efficiency is shown to be η = 1 -(θ2 /θ1) , where θ1 and θ2 are the effective temperatures associated with two isothermal transformations, with θ1 >θ2; (c) Thermodynamic potentials, Maxwell relations, and response functions. The present thermodynamic framework, for a system of interacting particles under the above-mentioned conditions, and associated to an entropy sν, with ν > 1, certainly enlarges the possibility of experimental verifications.
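The cutoff in the equilibrium distribution mentioned above can be checked numerically. The sketch below evaluates the standard compact q-exponential form Peq(x) ∝ [1 − (1 − q)βx²]^{1/(1−q)} for q < 1 (equivalently ν = 2 − q > 1); normalization is omitted and the parameter values are illustrative:

```python
import numpy as np

# Compact q-Gaussian (unnormalized). For q < 1 the distribution has
# compact support: it vanishes identically beyond the cutoff
# x_c = 1 / sqrt((1 - q) * beta), consistent with a confining harmonic
# potential phi(x) = alpha x^2 / 2. Parameter values are illustrative.
def compact_q_gaussian(x, q=0.0, beta=1.0):
    base = np.maximum(1.0 - (1.0 - q) * beta * x**2, 0.0)
    return base ** (1.0 / (1.0 - q))

q, beta = 0.0, 1.0
x_c = 1.0 / np.sqrt((1.0 - q) * beta)    # cutoff: 1.0 for these values
x = np.linspace(-2, 2, 401)
p = compact_q_gaussian(x, q, beta)
print(p[np.abs(x) > x_c].max())  # exactly 0 outside the cutoff
```

The Carnot-cycle result quoted in (b), η = 1 − (θ2/θ1), then plays the role of the usual Carnot efficiency with the effective temperatures θ replacing T.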

  2. A Generalized Framework for Modeling Next Generation 911 Implementations.

    Energy Technology Data Exchange (ETDEWEB)

    Kelic, Andjelka; Aamir, Munaf Syed; Kelic, Andjelka; Jrad, Ahmad M.; Mitchell, Roger

    2018-02-01

    This document summarizes the current state of Sandia 911 modeling capabilities and then addresses key aspects of Next Generation 911 (NG911) architectures for expansion of existing models. Analysis of three NG911 implementations was used to inform heuristics, associated key data requirements, and assumptions needed to capture NG911 architectures in the existing models. Modeling of NG911 necessitates careful consideration of its complexity and the diversity of implementations. Draft heuristics for constructing NG911 models are presented based on the analysis, along with a summary of current challenges and ways to improve future NG911 modeling efforts. We found that NG911 relies on Enhanced 911 (E911) assets such as 911 selective routers to route calls originating from traditional telephony service, which are a majority of 911 calls. We also found that the diversity and transitional nature of NG911 implementations necessitates significant and frequent data collection to ensure that adequate models are available for crisis action support.

  3. Testing the Consolidated Framework for Implementation Research on health care innovations from South Yorkshire.

    Science.gov (United States)

    Ilott, Irene; Gerrish, Kate; Booth, Andrew; Field, Becky

    2013-10-01

    There is an international imperative to implement research into clinical practice to improve health care. Understanding the dynamics of change requires knowledge from theoretical and empirical studies. This paper presents a novel approach to testing a new meta theoretical framework: the Consolidated Framework for Implementation Research. The utility of the Framework was evaluated using a post hoc, deductive analysis of 11 narrative accounts of innovation in health care services and practice from England, collected in 2010. A matrix, comprising the five domains and 39 constructs of the Framework was developed to examine the coherence of the terminology, to compare results across contexts and to identify new theoretical developments. The Framework captured the complexity of implementation across 11 diverse examples, offering theoretically informed, comprehensive coverage. The Framework drew attention to relevant points in individual cases together with patterns across cases; for example, all were internally developed innovations that brought direct or indirect patient advantage. In 10 cases, the change was led by clinicians. Most initiatives had been maintained for several years and there was evidence of spread in six examples. Areas for further development within the Framework include sustainability and patient/public engagement in implementation. Our analysis suggests that this conceptual framework has the potential to offer useful insights, whether as part of a situational analysis or by developing context-specific propositions for hypothesis testing. Such studies are vital now that innovation is being promoted as core business for health care. © 2012 John Wiley & Sons Ltd.

  4. Teachers implementing context-based teaching materials : a framework for case-analysis in chemistry

    NARCIS (Netherlands)

    Vos, M.A.J.; Taconis, R.; Jochems, W.M.G.; Pilot, A.

    2010-01-01

    We present a framework for analysing the interplay between context-based teaching material and teachers, and for evaluating the adequacy of the resulting implementation of context-based pedagogy in chemistry classroom practice. The development of the framework is described, including an account of…

  5. Mass customization and sustainability an assessment framework and industrial implementation

    CERN Document Server

    Boër, Claudio R; Bettoni, Andrea; Sorlini, Marzio

    2013-01-01

    To adapt to global competitive pressures, manufacturers must develop methods and enabling technologies towards a personalized, customer oriented and sustainable manufacturing. Mass Customization and Sustainability defines the two concepts of mass customization and sustainability and introduces a framework to establish a link between the two concepts to answer the questions: Are these two aspects empowering one another? Or are they hindering one another?   These questions investigate mass customization as one of the main driving forces to achieve effective sustainability.  A methodology to assess the contribution of mass customization to sustainability is developed, providing an assessment model composed by a set of indicators covering the three aspects of sustainability: social, economical and environmental. This is supported and further explained using ideas and new concepts compiled from recent European research.   Researchers, scientists, managers and industry professionals alike can follow a set of ...

  6. Implementing Distributed Operations: A Comparison of Two Deep Space Missions

    Science.gov (United States)

    Mishkin, Andrew; Larsen, Barbara

    2006-01-01

    Two very different deep space exploration missions--Mars Exploration Rover and Cassini--have made use of distributed operations for their science teams. In the case of MER, the distributed operations capability was implemented only after the prime mission was completed, as the rovers continued to operate well in excess of their expected mission lifetimes; Cassini, designed for a mission of more than ten years, had planned for distributed operations from its inception. The rapid command turnaround timeline of MER, as well as many of the operations features implemented to support it, have proven to be conducive to distributed operations. These features include: a single science team leader during the tactical operations timeline, highly integrated science and engineering teams, processes and file structures designed to permit multiple team members to work in parallel to deliver sequencing products, web-based spacecraft status and planning reports for team-wide access, and near-elimination of paper products from the operations process. Additionally, MER has benefited from the initial co-location of its entire operations team, and from having a single Principal Investigator, while Cassini operations have had to reconcile multiple science teams distributed from before launch. Cassini has faced greater challenges in implementing effective distributed operations. Because extensive early planning is required to capture science opportunities on its tour and because sequence development takes significantly longer than sequence execution, multiple teams are contributing to multiple sequences concurrently. The complexity of integrating inputs from multiple teams is exacerbated by spacecraft operability issues and resource contention among the teams, each of which has its own Principal Investigator. Finally, much of the technology that MER has exploited to facilitate distributed operations was not available when the Cassini ground system was designed, although later adoption…

  7. Developing theory-informed behaviour change interventions to implement evidence into practice: a systematic approach using the Theoretical Domains Framework

    Directory of Open Access Journals (Sweden)

    French Simon D

    2012-04-01

    Full Text Available Abstract Background There is little systematic operational guidance about how best to develop complex interventions to reduce the gap between practice and evidence. This article is one in a series of articles documenting the development and use of the Theoretical Domains Framework (TDF) to advance the science of implementation research. Methods The intervention was developed considering three main components: theory, evidence, and practical issues. We used a four-step approach, consisting of guiding questions, to direct the choice of the most appropriate components of an implementation intervention: Who needs to do what, differently? Using a theoretical framework, which barriers and enablers need to be addressed? Which intervention components (behaviour change techniques and mode(s) of delivery could overcome the modifiable barriers and enhance the enablers? And how can behaviour change be measured and understood? Results A complex implementation intervention was designed that aimed to improve acute low back pain management in primary care. We used the TDF to identify the barriers and enablers to the uptake of evidence into practice and to guide the choice of intervention components. These components were then combined into a cohesive intervention. The intervention was delivered via two facilitated interactive small group workshops. We also produced a DVD to distribute to all participants in the intervention group. We chose outcome measures in order to assess the mediating mechanisms of behaviour change. Conclusions We have illustrated a four-step systematic method for developing an intervention designed to change clinical practice based on a theoretical framework. The method of development provides a systematic framework that could be used by others developing complex implementation interventions. While this framework should be iteratively adjusted and refined to suit other contexts and settings, we believe that the four-step process should be…

  8. Developing theory-informed behaviour change interventions to implement evidence into practice: a systematic approach using the Theoretical Domains Framework.

    Science.gov (United States)

    French, Simon D; Green, Sally E; O'Connor, Denise A; McKenzie, Joanne E; Francis, Jill J; Michie, Susan; Buchbinder, Rachelle; Schattner, Peter; Spike, Neil; Grimshaw, Jeremy M

    2012-04-24

    There is little systematic operational guidance about how best to develop complex interventions to reduce the gap between practice and evidence. This article is one in a Series of articles documenting the development and use of the Theoretical Domains Framework (TDF) to advance the science of implementation research. The intervention was developed considering three main components: theory, evidence, and practical issues. We used a four-step approach, consisting of guiding questions, to direct the choice of the most appropriate components of an implementation intervention: Who needs to do what, differently? Using a theoretical framework, which barriers and enablers need to be addressed? Which intervention components (behaviour change techniques and mode(s) of delivery) could overcome the modifiable barriers and enhance the enablers? And how can behaviour change be measured and understood? A complex implementation intervention was designed that aimed to improve acute low back pain management in primary care. We used the TDF to identify the barriers and enablers to the uptake of evidence into practice and to guide the choice of intervention components. These components were then combined into a cohesive intervention. The intervention was delivered via two facilitated interactive small group workshops. We also produced a DVD to distribute to all participants in the intervention group. We chose outcome measures in order to assess the mediating mechanisms of behaviour change. We have illustrated a four-step systematic method for developing an intervention designed to change clinical practice based on a theoretical framework. The method of development provides a systematic framework that could be used by others developing complex implementation interventions. 
While this framework should be iteratively adjusted and refined to suit other contexts and settings, we believe that the four-step process should be maintained as the primary framework to guide researchers through a

  9. Implementing the Marine Strategy Framework Directive: A policy perspective on regulatory, institutional and stakeholder impediments to effective implementation

    NARCIS (Netherlands)

    Leeuwen, van J.; Raakjaer, J.; Hoof, van L.J.W.; Tatenhove, van J.P.M.; Long, R.; Ounanian, K.

    2014-01-01

    The implementation of the European Union (EU) Marine Strategy Framework Directive (MSFD) requires EU Member States to draft a program of measures to achieve Good Environmental Status (GES). Central argument of this paper, based on an analysis of the unique, holistic character of the MSFD, is that

  10. A prompt start: Implementing the framework convention on climate change

    International Nuclear Information System (INIS)

    Chayes, A.; Skolnikoff, E.B.; Victor, D.G.

    1992-01-01

A Framework Convention on Climate Change is under active negotiation in the United Nations, with the expectation that it will be ready for signature at the Rio Conference this June. Under the most optimistic projections, a Convention will not come into force and be an effective instrument for months, probably years. In recognition of the several institutional tasks that will be of crucial importance whatever the detailed content of the Convention, a small group of high officials from international organizations involved in the negotiations was convened at the Rockefeller Foundation's Conference Center at Bellagio in January. The discussions at Bellagio on the need for a Prompt Start on these institutional tasks benefitted from earlier meetings at Harvard in March and in Bermuda in May 1991, which the co-organizers convened to discuss these and related aspects of the negotiations on a Climate Convention. Those meetings were attended by members of the academic community, officials from the United Nations, and representatives of governments involved in the negotiations

  11. Analysing the agricultural cost and non-market benefits of implementing the water framework directive

    NARCIS (Netherlands)

    Bateman, I.J.; Brouwer, R.; Davies, H.; Day, B.H.; Deflandre, A.; Di Falco, S.; Georgiou, S.; Hadley, D.; Hutchins, M.; Jones, A.P.; Kay, D.; Leeks, G.; Lewis, M.; Lovett, A.A.; Neal, C.; Posen, P.; Rigby, D.; Turner, R.K.

    2006-01-01

    Implementation of the Water Framework Directive (WFD) represents a fundamental change in the management of water in Europe with a requirement that member states ensure 'good ecological status' for all water bodies by 2015. Agriculture is expected to bear a major share of WFD implementation costs as

  12. A more ‘autonomous’ European social dialogue: the implementation of the Framework Agreement On Telework

    NARCIS (Netherlands)

    Visser, J.; Ramos Martín, N.

    2008-01-01

    This paper examines the implementation of the first ‘autonomous’ agreement signed by the European social partners. The European Framework Agreement on Telework of July 2002 was to be implemented ‘in accordance with the procedures and practices specific to management and labour and the Member

  13. Water and spatial development: the implementation of the water framework directive in the Netherlands

    NARCIS (Netherlands)

    Knaap, van der W.G.M.; Pijnappels, M.

    2010-01-01

    This paper discusses how water managers and spatial planners could co-operate on local level in combination with the implementation of the Water Framework Directive and the Birds and Habitats Directives in the Netherlands. Recent evaluations of the European Commission show that implementation of

  14. A framework for plasticity implementation on the SpiNNaker neural architecture.

    Science.gov (United States)

    Galluppi, Francesco; Lagorce, Xavier; Stromatias, Evangelos; Pfeiffer, Michael; Plana, Luis A; Furber, Steve B; Benosman, Ryad B

    2014-01-01

Many of the precise biological mechanisms of synaptic plasticity remain elusive, but simulations of neural networks have greatly enhanced our understanding of how specific global functions arise from the massively parallel computation of neurons and local Hebbian or spike-timing dependent plasticity rules. For simulating large portions of neural tissue, this has created an increasingly strong need for large scale simulations of plastic neural networks on special purpose hardware platforms, because synaptic transmissions and updates are badly matched to the computing style supported by current architectures. Because of the great diversity of biological plasticity phenomena and the corresponding diversity of models, there is a great need for testing various hypotheses about plasticity before committing to one hardware implementation. Here we present a novel framework for investigating different plasticity approaches on the SpiNNaker distributed digital neural simulation platform. The key innovation of the proposed architecture is to exploit the reconfigurability of the ARM processors inside SpiNNaker, dedicating a subset of them exclusively to processing synaptic plasticity updates, while the rest perform the usual neural and synaptic simulations. We demonstrate the flexibility of the proposed approach by showing the implementation of a variety of spike- and rate-based learning rules, including standard spike-timing dependent plasticity (STDP), voltage-dependent STDP, and the rate-based BCM rule. We analyze their performance and validate them by running classical learning experiments in real time on a 4-chip SpiNNaker board. The result is an efficient, modular, flexible and scalable framework, which provides a valuable tool for the fast and easy exploration of learning models of very different kinds on the parallel and reconfigurable SpiNNaker system.
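The pair-based STDP rule mentioned in the abstract can be sketched in a few lines. This is an illustrative textbook formulation, not SpiNNaker code; the parameter values (amplitudes and time constants) are invented for the example.

```python
import math

# Illustrative pair-based STDP rule (not SpiNNaker-specific code):
# potentiate when the presynaptic spike precedes the postsynaptic one,
# depress otherwise, with exponentially decaying time windows.
def stdp_dw(dt_ms, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Weight change for a pre/post spike pair separated by dt = t_post - t_pre."""
    if dt_ms >= 0:   # pre before post -> long-term potentiation
        return a_plus * math.exp(-dt_ms / tau_plus)
    else:            # post before pre -> long-term depression
        return -a_minus * math.exp(dt_ms / tau_minus)

def apply_stdp(weight, pre_spikes, post_spikes, w_min=0.0, w_max=1.0):
    """Accumulate all-pairs STDP updates and clip the weight to its bounds."""
    for t_pre in pre_spikes:
        for t_post in post_spikes:
            weight += stdp_dw(t_post - t_pre)
    return max(w_min, min(w_max, weight))

# Two pre and two post spikes (times in ms) around an initial weight of 0.5.
w = apply_stdp(0.5, pre_spikes=[10.0, 30.0], post_spikes=[12.0, 28.0])
```

On dedicated hardware such as SpiNNaker, the point of the paper is precisely that updates like this can be routed to processors reserved for plasticity, separate from the neural simulation itself.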

  15. Evaluation Framework and Tools for Distributed Energy Resources

    Energy Technology Data Exchange (ETDEWEB)

    Gumerman, Etan Z.; Bharvirkar, Ranjit R.; LaCommare, Kristina Hamachi; Marnay, Chris

    2003-02-01

    The Energy Information Administration's (EIA) 2002 Annual Energy Outlook (AEO) forecast anticipates the need for 375 MW of new generating capacity (or about one new power plant) per week for the next 20 years, most of which is forecast to be fueled by natural gas. The Distributed Energy and Electric Reliability Program (DEER) of the Department of Energy (DOE), has set a national goal for DER to capture 20 percent of new electric generation capacity additions by 2020 (Office of Energy Efficiency and Renewable Energy 2000). Cumulatively, this amounts to about 40 GW of DER capacity additions from 2000-2020. Figure ES-1 below compares the EIA forecast and DEER's assumed goal for new DER by 2020 while applying the same definition of DER to both. This figure illustrates that the EIA forecast is consistent with the overall DEER DER goal. For the purposes of this study, Berkeley Lab needed a target level of small-scale DER penetration upon which to hinge consideration of benefits and costs. Because the AEO2002 forecasted only 3.1 GW of cumulative additions from small-scale DER in the residential and commercial sectors, another approach was needed to estimate the small-scale DER target. The focus here is on small-scale DER technologies under 500 kW. The technology size limit is somewhat arbitrary, but the key results of interest are marginal additional costs and benefits around an assumed level of penetration that existing programs might achieve. Berkeley Lab assumes that small-scale DER has the same growth potential as large scale DER in AEO2002, about 38 GW. This assumption makes the small-scale goal equivalent to 380,000 DER units of average size 100 kW. This report lays out a framework whereby the consequences of meeting this goal might be estimated and tallied up. The framework is built around a list of major benefits and a set of tools that might be applied to estimate them. This study lists some of the major effects of an emerging paradigm shift away from

  16. Advances in the spatially distributed ages-w model: parallel computation, java connection framework (JCF) integration, and streamflow/nitrogen dynamics assessment

    Science.gov (United States)

    AgroEcoSystem-Watershed (AgES-W) is a modular, Java-based spatially distributed model which implements hydrologic and water quality (H/WQ) simulation components under the Java Connection Framework (JCF) and the Object Modeling System (OMS) environmental modeling framework. AgES-W is implicitly scala...

  17. Implementing ATML in Distributed ATS for SG-III Prototype

    International Nuclear Information System (INIS)

    Chen Ming; Yang Cunbang; Lu Junfeng; Ding Yongkun; Yin Zejie; Zheng Zhijian

    2007-01-01

    With the forthcoming large-scale scientific experimental systems, we are looking for ways to construct an open, distributed architecture within the new and the existing automatic test systems. The new standard of Automatic Test Markup Language meets our demand for data exchange for this architecture through defining the test routines and resultant data in the XML format. This paper introduces the concept of ATML (Automatic Test Markup Language) and related standards, and the significance of these new standards for a distributed automatic test system. It also describes the implementation of ATML through the integration of this technology among the existing and new test systems
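The data-exchange idea behind ATML can be illustrated with a minimal XML round trip. The element and attribute names below are placeholders invented for illustration, not the actual ATML schema; only the general pattern (test results carried as structured XML between systems) reflects the abstract.

```python
import xml.etree.ElementTree as ET

# Minimal sketch of exchanging test results as XML, in the spirit of ATML.
# Element/attribute names here are illustrative placeholders, not the real schema.
def build_result_doc(test_name, outcome, value, unit):
    """Serialize one test result into an XML string."""
    root = ET.Element("TestResults")
    result = ET.SubElement(root, "TestResult", name=test_name, outcome=outcome)
    datum = ET.SubElement(result, "Datum", unit=unit)
    datum.text = str(value)
    return ET.tostring(root, encoding="unicode")

def read_outcomes(xml_text):
    """Parse a results document back into a {test_name: outcome} mapping."""
    root = ET.fromstring(xml_text)
    return {r.get("name"): r.get("outcome") for r in root.iter("TestResult")}

doc = build_result_doc("laser_energy", "Passed", 1.8, "kJ")
outcomes = read_outcomes(doc)
```

Because both producer and consumer agree only on the XML vocabulary, new and legacy test stations can exchange results without sharing any implementation, which is the interoperability argument the paper makes.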

  18. Distributed learning process: principles of design and implementation

    Directory of Open Access Journals (Sweden)

    G. N. Boychenko

    2016-01-01

    Full Text Available At the present stage, broad usage of information and communication technologies (ICT) in educational practice is one of the leading trends in the development of the global education system. This trend has led to the transformation of instructional interaction models. Scientists have developed the theory of distributed cognition (Salomon, G.; Hutchins, E.) and of distributed education and training (Fiore, S. M.; Salas, E.; Oblinger, D. G.; Barone, C. A.; Hawkins, B. L.). The educational process is based on two sub-processes, learning and teaching, separated in time and space, which are aimed at organizing flexible interactions between learners, teachers and educational content located in different non-centralized places. The purpose of this design research is to find a solution to the problem of formalizing the design and realization of a distributed learning process, which is significant in instructional design. The solution to this problem should take into account the specifics of distributed interactions between team members, who become the collective subject of distributed cognition in the distributed learning process. This makes it necessary to design the roles and functions of the individual team members performing distributed educational activities. Personal educational objectives should be determined by decomposing team objectives into the functional roles of its members, considering the personal and learning needs and interests of students. Theoretical and empirical methods used in the study: theoretical analysis of philosophical, psychological, and pedagogical literature on the issue; analysis of international standards in the e-learning domain; exploration of the practical usage of distributed learning in the academic and corporate sectors; generalization, abstraction, cognitive modelling, and ontology engineering methods. The result of the research is a methodology for the design and implementation of a distributed learning process based on the competency approach. Methodology proposed by

  19. Policy implementation in practice: the case of national service frameworks in general practice.

    Science.gov (United States)

    Checkland, Kath; Harrison, Stephen

    2004-10-01

    National Service Frameworks are an integral part of the government's drive to 'modernise' the NHS, intended to standardise both clinical care and the design of the services used to deliver that clinical care. This article uses evidence from qualitative case studies in three general practices to illustrate the difficulties associated with the implementation of such top-down guidelines and models of service. In these studies it was found that, while there had been little explicit activity directed at implementation overall, the National Service Framework for coronary heart disease had in general fared better than that for older people. Gunn's notion of 'perfect implementation' is used to make sense of the findings.

  20. Identifying a practice-based implementation framework for sustainable interventions for improving the evolving working environment

    DEFF Research Database (Denmark)

    Højberg, Helene; Nørregaard Rasmussen, Charlotte Diana; Osborne, Richard H.

    2018-01-01

    Our aim was to identify implementation components for sustainable working environment interventions in the nursing assistant sector to generate a framework to optimize the implementation of workplace improvement initiatives. The implementation framework was informed by: 1) an industry advisory group, 2) interviews with key stakeholders, 3) concept mapping workshops, and 4) an e-mail survey. Thirty-five stakeholders were interviewed and contributed in the concept mapping workshops. Eleven implementation components were derived across four domains: 1) A supportive organizational platform, 2) An engaged workplace with mutual goals, 3) The intervention is sustainably fitted to the workplace, and 4) The intervention is an attractive choice. The highest rated component was “Engaged and Active Management” (mean 4.1) and the lowest rated was “Delivered in an Attractive Form” (mean 2.8). The framework...

  1. The Development of a Practical Framework for the Implementation of JIT Manufacturing

    OpenAIRE

    Hallihan, A.

    1996-01-01

    This research develops a framework to guide practitioners through the process of implementing Just In Time manufacturing in the commercial aircraft manufacturing industry. The scope of Just In Time manufacturing is determined through an analysis of its evolution and current use. Current approaches to its implementation are reviewed and shortcomings are identified. A requirement to allow practitioners to tailor the approach to the implementation of Just In Time manufacturing, ...

  2. Introducing the Canadian Thoracic Society Framework for Guideline Dissemination and Implementation, with Concurrent Evaluation

    Directory of Open Access Journals (Sweden)

    Samir Gupta

    2013-01-01

    Full Text Available The Canadian Thoracic Society (CTS) is leveraging its strengths in guideline production to enable respiratory guideline implementation in Canada. The authors describe the new CTS Framework for Guideline Dissemination and Implementation, with Concurrent Evaluation, which has three spheres of action: guideline production, implementation infrastructure and knowledge translation (KT) methodological support. The Canadian Institutes of Health Research ‘Knowledge-to-Action’ process was adopted as the model of choice for conceptualizing KT interventions. Within the framework, new evidence on formatting guideline recommendations to enhance the intrinsic implementability of future guidelines was applied. Clinical assemblies will consider implementability early in the guideline production cycle when selecting clinical questions, and new practice guidelines will include a section dedicated to KT. The framework describes the development of a web-based repository and communication forum to inventory existing KT resources and to facilitate collaboration and communication among implementation stakeholders through an online discussion board. A national forum for presentation and peer review of proposed KT projects is described. The framework outlines expert methodological support for KT planning, development and evaluation, including a practical guide for implementers, a novel ‘Clinical Assembly – KT Action Team’, and in-kind logistical support and assistance in securing peer-reviewed funding.

  3. A research framework for the development and implementation of interventions preventing work-related musculoskeletal disorders.

    Science.gov (United States)

    van der Beek, Allard J; Dennerlein, Jack T; Huysmans, Maaike A; Mathiassen, Svend Erik; Burdorf, Alex; van Mechelen, Willem; van Dieën, Jaap H; Frings-Dresen, Monique Hw; Holtermann, Andreas; Janwantanakul, Prawit; van der Molen, Henk F; Rempel, David; Straker, Leon; Walker-Bone, Karen; Coenen, Pieter

    2017-11-01

    Objectives Work-related musculoskeletal disorders (MSD) are highly prevalent and put a large burden on (working) society. Primary prevention of work-related MSD focuses often on physical risk factors (such as manual lifting and awkward postures) but has not been too successful in reducing the MSD burden. This may partly be caused by insufficient knowledge of etiological mechanisms and/or a lack of adequately feasible interventions (theory failure and program failure, respectively), possibly due to limited integration of research disciplines. A research framework could link research disciplines thereby strengthening the development and implementation of preventive interventions. Our objective was to define and describe such a framework for multi-disciplinary research on work-related MSD prevention. Methods We described a framework for MSD prevention research, partly based on frameworks from other research fields (ie, sports injury prevention and public health). Results The framework is composed of a repeated sequence of six steps comprising the assessment of (i) incidence and severity of MSD, (ii) risk factors for MSD, and (iii) underlying mechanisms; and the (iv) development, (v) evaluation, and (vi) implementation of preventive intervention(s). Conclusions In the present framework for optimal work-related MSD prevention, research disciplines are linked. This framework can thereby help to improve theories and strengthen the development and implementation of prevention strategies for work-related MSD.

  4. A Framework for Process Reengineering in Higher Education: A case study of distance learning exam scheduling and distribution

    Directory of Open Access Journals (Sweden)

    M'hammed Abdous

    2008-10-01

    Full Text Available In this paper, we propose a conceptual and operational framework for process reengineering (PR) in higher education (HE) institutions. Using a case study aimed at streamlining exam scheduling and distribution in a distance learning (DL) unit, we outline a sequential and non-linear four-step framework designed to reengineer processes. The first two steps of this framework, initiating and analyzing, are used to initiate, document, and flowchart the process targeted for reengineering, and the last two steps, reengineering/implementing and evaluating, are intended to prototype, implement, and evaluate the reengineered process. Our early involvement of all stakeholders, and our in-depth analysis and documentation of the existing process, allowed us to avoid the traditional pitfalls associated with business process reengineering (BPR). Consequently, the outcome of our case study indicates a streamlined and efficient process with higher faculty satisfaction and a substantial cost reduction.

  5. Validation of the theoretical domains framework for use in behaviour change and implementation research.

    Science.gov (United States)

    Cane, James; O'Connor, Denise; Michie, Susan

    2012-04-24

    An integrative theoretical framework, developed for cross-disciplinary implementation and other behaviour change research, has been applied across a wide range of clinical situations. This study tests the validity of this framework. Validity was investigated by behavioural experts sorting 112 unique theoretical constructs using closed and open sort tasks. The extent of replication was tested by Discriminant Content Validation and Fuzzy Cluster Analysis. There was good support for a refinement of the framework comprising 14 domains of theoretical constructs (average silhouette value 0.29): 'Knowledge', 'Skills', 'Social/Professional Role and Identity', 'Beliefs about Capabilities', 'Optimism', 'Beliefs about Consequences', 'Reinforcement', 'Intentions', 'Goals', 'Memory, Attention and Decision Processes', 'Environmental Context and Resources', 'Social Influences', 'Emotions', and 'Behavioural Regulation'. The refined Theoretical Domains Framework has a strengthened empirical base and provides a method for theoretically assessing implementation problems, as well as professional and other health-related behaviours as a basis for intervention development.
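The silhouette value reported above (average 0.29) measures how well each item fits its assigned cluster relative to the nearest other cluster, on a scale from -1 to 1. A toy one-dimensional illustration, with made-up data rather than the study's theoretical constructs:

```python
# Toy silhouette computation: a(i) is the mean distance from a point to the
# rest of its own cluster, b(i) the smallest mean distance to another cluster;
# the silhouette is (b - a) / max(a, b). Data here are invented for illustration.
def silhouette(point, own_cluster, other_clusters):
    a = sum(abs(point - q) for q in own_cluster if q != point) / max(len(own_cluster) - 1, 1)
    b = min(sum(abs(point - q) for q in c) / len(c) for c in other_clusters)
    return (b - a) / max(a, b)

cluster_1 = [1.0, 1.2, 0.9]   # tightly grouped items
cluster_2 = [5.0, 5.3, 4.8]   # a clearly separate cluster
s = silhouette(1.0, cluster_1, [cluster_2])  # close to 1: well placed
```

Values near 1 indicate items firmly inside their cluster; the study's modest average of 0.29 is consistent with theoretical constructs that overlap across domains.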

  6. A distributed cloud-based cyberinfrastructure framework for integrated bridge monitoring

    Science.gov (United States)

    Jeong, Seongwoon; Hou, Rui; Lynch, Jerome P.; Sohn, Hoon; Law, Kincho H.

    2017-04-01

    This paper describes a cloud-based cyberinfrastructure framework for the management of the diverse data involved in bridge monitoring. Bridge monitoring involves various hardware systems, software tools and laborious activities that include, for example, a structural health monitoring (SHM) sensor network, engineering analysis programs and visual inspection. Very often, these monitoring systems, tools and activities are not coordinated, and the collected information is not shared. A well-designed integrated data management framework can support the effective use of the data and, thereby, enhance bridge management and maintenance operations. The cloud-based cyberinfrastructure framework presented herein is designed to manage not only sensor measurement data acquired from the SHM system, but also other relevant information, such as bridge engineering models and traffic videos, in an integrated manner. For scalability and flexibility, cloud computing services and distributed database systems are employed. The information stored can be accessed through standard web interfaces. For demonstration, the cyberinfrastructure system is implemented for the monitoring of the bridges located along the I-275 Corridor in the state of Michigan.

  7. Validation of the theoretical domains framework for use in behaviour change and implementation research

    OpenAIRE

    Cane, James E.; O'Connor, Denise; Michie, Susan

    2012-01-01

    Abstract Background An integrative theoretical framework, developed for cross-disciplinary implementation and other behaviour change research, has been applied across a wide range of clinical situations. This study tests the validity of this framework. Methods Validity was investigated by behavioural experts sorting 112 unique theoretical constructs using closed and open sort tasks. The extent of replication was tested by Discriminant Content Validation and Fuzzy Cluster Analysis. Results The...

  8. Efficient and Flexible Climate Analysis with Python in a Cloud-Based Distributed Computing Framework

    Science.gov (United States)

    Gannon, C.

    2017-12-01

    As climate models become progressively more advanced, and spatial resolution further improved through various downscaling projects, climate projections at a local level are increasingly insightful and valuable. However, the raw size of climate datasets presents numerous hurdles for analysts wishing to develop customized climate risk metrics or perform site-specific statistical analysis. Four Twenty Seven, a climate risk consultancy, has implemented a Python-based distributed framework to analyze large climate datasets in the cloud. With the freedom afforded by efficiently processing these datasets, we are able to customize and continually develop new climate risk metrics using the most up-to-date data. Here we outline our process for using Python packages such as XArray and Dask to evaluate netCDF files in a distributed framework, StarCluster to operate in a cluster-computing environment, cloud computing services to access publicly hosted datasets, and how this setup is particularly valuable for generating climate change indicators and performing localized statistical analysis.
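The blocked-computation idea behind the Dask setup described above can be sketched without the heavy dependencies: split a large series into chunks, reduce each chunk in parallel, then combine the partial results. This is a pure-Python stand-in (a real pipeline would use xarray and Dask over netCDF files, as the abstract describes), and the temperature series is synthetic.

```python
from concurrent.futures import ThreadPoolExecutor

# Pure-Python sketch of blocked computation, the core idea behind Dask:
# per-chunk partial reductions that are cheap to combine globally.
def chunked(seq, size):
    return [seq[i:i + size] for i in range(0, len(seq), size)]

def chunk_stats(chunk):
    """Partial reduction for one chunk: (count, sum, max)."""
    return (len(chunk), sum(chunk), max(chunk))

def combine(partials):
    """Combine partial reductions into a global mean and max."""
    n = sum(p[0] for p in partials)
    total = sum(p[1] for p in partials)
    return total / n, max(p[2] for p in partials)

# Synthetic stand-in for a long daily temperature series.
daily_temps = [20.0 + 0.1 * i for i in range(1000)]
with ThreadPoolExecutor() as pool:
    partials = list(pool.map(chunk_stats, chunked(daily_temps, 100)))
mean_temp, max_temp = combine(partials)
```

The same associativity that lets these chunk results combine cleanly is what lets Dask schedule chunks across a StarCluster-style compute cluster rather than across threads.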

  9. A framework of quality improvement interventions to implement evidence-based practices for pressure ulcer prevention.

    Science.gov (United States)

    Padula, William V; Mishra, Manish K; Makic, Mary Beth F; Valuck, Robert J

    2014-06-01

    To enhance the learner's competence with knowledge about a framework of quality improvement (QI) interventions to implement evidence-based practices for pressure ulcer (PrU) prevention. This continuing education activity is intended for physicians and nurses with an interest in skin and wound care. After participating in this educational activity, the participant should be better able to: 1. Summarize the process of creating and initiating the best-practice framework of QI for PrU prevention. 2. Identify the domains and QI interventions for the best-practice framework of QI for PrU prevention. Pressure ulcer (PrU) prevention is a priority issue in US hospitals. The National Pressure Ulcer Advisory Panel endorses an evidence-based practice (EBP) protocol to help prevent PrUs. Effective implementation of EBPs requires systematic change of existing care units. Quality improvement interventions offer a mechanism of change to existing structures in order to effectively implement EBPs for PrU prevention. The best-practice framework developed by Nelson et al is a useful model of quality improvement interventions that targets process improvement in 4 domains: leadership, staff, information and information technology, and performance and improvement. At 2 academic medical centers, the best-practice framework was shown to physicians, nurses, and health services researchers. Their insight was used to modify the best-practice framework as a reference tool for quality improvement interventions in PrU prevention. The revised framework includes 25 elements across 4 domains. Many of these elements support EBPs for PrU prevention, such as updates in PrU staging and risk assessment. The best-practice framework offers a reference point to initiating a bundle of quality improvement interventions in support of EBPs. Hospitals and clinicians tasked with quality improvement efforts can use this framework to problem-solve PrU prevention and other critical issues.

  10. Implementation of Web-based Information Systems in Distributed Organizations

    DEFF Research Database (Denmark)

    Bødker, Keld; Pors, Jens Kaaber; Simonsen, Jesper

    2004-01-01

    This article presents results elicited from studies conducted in relation to implementing a web-based information system throughout a large distributed organization. We demonstrate the kind of expectations and conditions for change that management face in relation to open-ended, configurable, and context-specific web-based information systems like Lotus QuickPlace. Our synthesis from the empirical findings is related to two recent models: the improvisational change management model suggested by Orlikowski and Hofman (1997), and Gallivan's (2001) model for organizational adoption and assimilation. In line with comparable approaches from the knowledge management area (Dixon 2000; Markus 2001), we relate to, refine, and operationalize the models from an overall organizational view by identifying and characterizing four different and general implementation contexts

  11. Designing and Implementing a Retrospective Earthquake Detection Framework at the U.S. Geological Survey National Earthquake Information Center

    Science.gov (United States)

    Patton, J.; Yeck, W.; Benz, H.

    2017-12-01

    The U.S. Geological Survey National Earthquake Information Center (USGS NEIC) is implementing and integrating new signal detection methods such as subspace correlation, continuous beamforming, multi-band picking and automatic phase identification into near-real-time monitoring operations. Leveraging the additional information from these techniques helps the NEIC utilize a large and varied network on local to global scales. The NEIC is developing an ordered, rapid, robust, and decentralized framework for distributing seismic detection data as well as a set of formalized formatting standards. These frameworks and standards enable the NEIC to implement a seismic event detection framework that supports basic tasks, including automatic arrival time picking, social-media-based event detections, and automatic association of different seismic detection data into seismic earthquake events. In addition, this framework enables retrospective detection processing such as automated S-wave arrival time picking given a detected event, discrimination and classification of detected events by type, back-azimuth and slowness calculations, and ensuring aftershock and induced sequence detection completeness. These processes and infrastructure improve the NEIC's capabilities, accuracy, and speed of response. In addition, this same infrastructure provides an improved and convenient structure to support access to automatic detection data for both research and algorithmic development.
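Automatic arrival-time picking, one of the basic tasks mentioned above, is classically done with an STA/LTA (short-term average / long-term average) trigger. The sketch below is that textbook method on a synthetic trace, a simple stand-in for the NEIC's more advanced subspace-correlation and multi-band pickers; window lengths and threshold are illustrative.

```python
# Classic STA/LTA trigger: declare a pick when short-window signal energy
# rises well above the long-window background level. Parameters are illustrative.
def sta_lta_pick(samples, n_sta=5, n_lta=50, threshold=4.0):
    """Return the first index where STA/LTA energy ratio exceeds the
    threshold, or None if no trigger occurs."""
    energy = [s * s for s in samples]
    for i in range(n_lta, len(samples)):
        sta = sum(energy[i - n_sta:i]) / n_sta
        lta = sum(energy[i - n_lta:i]) / n_lta
        if lta > 0 and sta / lta >= threshold:
            return i
    return None

# Synthetic trace: low-amplitude noise, then a large "arrival" at sample 200.
trace = [0.1 if i < 200 else 2.0 for i in range(400)]
pick = sta_lta_pick(trace)  # triggers just after the arrival
```

Retrospective processing of the kind the abstract describes would rerun such pickers over archived waveforms once an event is known, e.g. to recover S-wave arrivals the real-time system missed.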

  12. Spatially-Distributed Cost-Effectiveness Analysis Framework to Control Phosphorus from Agricultural Diffuse Pollution.

    Directory of Open Access Journals (Sweden)

    Runzhe Geng

    Full Text Available Best management practices (BMPs) for agricultural diffuse pollution control are implemented at the field or small-watershed scale. However, quantifying the benefits of BMP implementation for receiving water quality at multiple spatial scales is an ongoing challenge. In this paper, we introduce an integrated approach that combines risk assessment (i.e., a phosphorus (P) index), model simulation techniques (Hydrological Simulation Program-FORTRAN), and a BMP placement tool at various scales to identify the optimal locations for implementing multiple BMPs and to estimate BMP effectiveness after implementation. A statistically significant decrease in nutrient discharge from watersheds is proposed to evaluate the effectiveness of BMPs strategically targeted within watersheds. Specifically, we estimate two types of cost-effectiveness curves (total pollution reduction and proportion of watersheds improved) for four allocation approaches. Selection of a "best approach" depends on the relative importance of the two types of effectiveness, which involves a value judgment based on the random/aggregated degree of BMP distribution among and within sub-watersheds. A statistical optimization framework is developed and evaluated in the Chaohe River Watershed, located in the northern mountain area of Beijing. Results show that BMP implementation significantly (p < 0.001) decreased P loss from the watershed. Remedial strategies where BMPs were targeted to areas at high risk of P loss decreased P loads compared with strategies where BMPs were randomly located across watersheds. Sensitivity analysis indicated that aggregated BMP placement in particular watersheds is the most cost-effective scenario to decrease P loss. The optimization approach outlined in this paper is a spatially hierarchical method for targeting nonpoint source controls across a range of scales from field to farm, to watersheds, to regions.
Further, model estimates showed targeting at multiple scales is necessary to optimize program
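A minimal way to see why targeted placement beats random placement is a greedy budget allocation: rank candidate BMP sites by phosphorus reduction per unit cost and fund the best first. The numbers and site names below are invented; the paper's actual framework couples a P index, HSPF simulation, and a placement tool rather than this simple heuristic.

```python
# Greedy cost-effectiveness targeting sketch (illustrative only):
# fund BMP placements with the highest P-reduction-per-cost ratio
# until the budget is exhausted.
def target_bmps(candidates, budget):
    """candidates: list of (name, cost, p_reduction_kg).
    Returns (chosen names, total P reduction) within the budget."""
    ranked = sorted(candidates, key=lambda c: c[2] / c[1], reverse=True)
    chosen, total_cost, total_reduction = [], 0.0, 0.0
    for name, cost, reduction in ranked:
        if total_cost + cost <= budget:
            chosen.append(name)
            total_cost += cost
            total_reduction += reduction
    return chosen, total_reduction

candidates = [("buffer_strip_A", 10.0, 8.0),   # hypothetical sites
              ("cover_crop_B", 4.0, 5.0),
              ("wetland_C", 20.0, 12.0)]
chosen, reduction = target_bmps(candidates, budget=15.0)
```

Sweeping the budget and recording the achieved reduction at each level traces out exactly the kind of cost-effectiveness curve the abstract compares across allocation approaches.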

  13. Competing Through Lean – Towards Sustainable Resource-Oriented Implementation Framework

    Directory of Open Access Journals (Sweden)

    Rymaszewska Anna

    2014-11-01

    Full Text Available This paper addresses the needs of SME manufacturing companies, which due to their limited resources are often unable to introduce radical changes in their strategies. The main focus is on analyzing the principles of lean manufacturing and management regarding their potential contribution to building a company's competitive advantage. The paper analyses lean from a strategic management viewpoint while combining its implementation with achieving a competitive advantage. The ultimate result is a framework for lean implementation aimed at building a competitive advantage for companies. The proposed framework focuses on the idea of a closed loop with embedded sustainability.

  14. Implementation of High Speed Distributed Data Acquisition System

    Science.gov (United States)

    Raju, Anju P.; Sekhar, Ambika

    2012-09-01

    This paper introduces a high speed distributed data acquisition system based on a field programmable gate array (FPGA). The aim is to develop a "distributed" data acquisition interface. The development of instruments such as personal computers and engineering workstations based on "standard" platforms is the motivation behind this effort. Using standard platforms as the controlling unit allows independence in hardware from a particular vendor and hardware platform. The distributed approach also has advantages from a functional point of view: acquisition resources become available to multiple instruments, and the acquisition front-end can be physically remote from the rest of the instrument. The high speed data acquisition system transmits data to a remote computer system through an Ethernet interface. Data are acquired through 16 analog input channels. The input channels are multiplexed and digitized, and the data are stored in a 1K buffer for each input channel. The main control unit in this design is a 16-bit processor implemented in the FPGA. This processor sets up and initializes the data source and the Ethernet controller, and controls the flow of data from the memory elements to the NIC. Using this processor, the different configuration registers in the Ethernet controller can be initialized and controlled in an easy manner. The data packets are then sent to the remote PC through the Ethernet interface. The main advantages of using an FPGA as the standard platform are its flexibility, low power consumption, short design duration, fast time to market, programmability and high density. The main advantages of the AX88796 Ethernet controller over others are its non-PCI interface, the embedded SRAM in which the transmit and receive buffers are located, and its high-performance SRAM-like interface. The paper presents the implementation of the distributed data acquisition system on an FPGA in VHDL. 
The main advantages of this system are high
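The buffering scheme described above (16 multiplexed channels, a 1K buffer per channel, packets handed to the Ethernet controller) can be sketched in software. This Python sketch is purely illustrative: the class and field names are invented, and the real design implements this logic in VHDL on the FPGA.

```python
from collections import deque

NUM_CHANNELS = 16    # analog input channels, as in the paper
BUFFER_DEPTH = 1024  # "1K buffer" per channel

class AcquisitionBuffer:
    """Per-channel buffers fed by a multiplexed sample stream (hypothetical)."""
    def __init__(self):
        self.channels = [deque(maxlen=BUFFER_DEPTH) for _ in range(NUM_CHANNELS)]

    def ingest(self, channel_id, sample):
        # The FPGA multiplexes and digitizes the 16 inputs; here we just
        # route each digitized sample to its channel's buffer.
        self.channels[channel_id].append(sample)

    def make_packet(self, channel_id):
        # Drain one channel's buffer into a payload for the Ethernet
        # controller (framing, CRC and the NIC itself are omitted).
        payload = list(self.channels[channel_id])
        self.channels[channel_id].clear()
        return {"channel": channel_id, "samples": payload}

acq = AcquisitionBuffer()
for i in range(32):                  # a short multiplexed burst
    acq.ingest(i % NUM_CHANNELS, i)  # sample value = i for illustration
pkt = acq.make_packet(0)             # {'channel': 0, 'samples': [0, 16]}
```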

  15. Adjusting Estimates of the Expected Value of Information for Implementation: Theoretical Framework and Practical Application.

    Science.gov (United States)

    Andronis, Lazaros; Barton, Pelham M

    2016-04-01

    Value of information (VoI) calculations give the expected benefits of decision making under perfect information (EVPI) or sample information (EVSI), typically on the premise that any treatment recommendations made in light of this information will be implemented instantly and fully. This assumption is unlikely to hold in health care; evidence shows that obtaining further information typically leads to "improved" rather than "perfect" implementation. The objective is to present a method of calculating the expected value of further research that accounts for the reality of improved implementation. This work extends an existing conceptual framework by introducing additional states of the world regarding information (sample information, in addition to current and perfect information) and implementation (improved implementation, in addition to current and optimal implementation). The extension allows calculating the "implementation-adjusted" EVSI (IA-EVSI), a measure that accounts for different degrees of implementation. Calculations of implementation-adjusted estimates are illustrated under different scenarios through a stylized case study in non-small cell lung cancer. In the particular case study, the population values for EVSI and IA-EVSI were £25 million and £8 million, respectively; thus, a decision assuming perfect implementation would have overestimated the expected value of research by about £17 million. IA-EVSI was driven by the assumed time horizon and, importantly, the specified rate of change in implementation: the higher the rate, the greater the IA-EVSI and the lower the difference between IA-EVSI and EVSI. Traditionally calculated measures of population VoI rely on unrealistic assumptions about implementation. This article provides a simple framework that accounts for improved, rather than perfect, implementation and offers more realistic estimates of the expected value of research. © The Author(s) 2015.
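The adjustment idea can be illustrated with a toy calculation: population value accumulates per-patient value over the time horizon, weighted by the implementation level in each year. All numbers below are invented for illustration; the paper's £25M/£8M figures come from a full decision model, not from this arithmetic.

```python
def population_value(per_patient_value, annual_incidence, horizon_years, uptake):
    """Accumulate per-patient value over the horizon, weighted by the
    fraction of patients whose care reflects the new information
    (the implementation level) in each year. No discounting, for brevity."""
    return sum(per_patient_value * annual_incidence * uptake(t)
               for t in range(horizon_years))

per_patient_evsi = 100.0  # illustrative per-patient EVSI
incidence = 1000          # patients per year (illustrative)
horizon = 10              # years

# Perfect implementation: uptake jumps to 100% immediately.
evsi = population_value(per_patient_evsi, incidence, horizon, lambda t: 1.0)

# Improved implementation: uptake starts at 30% and rises 10 points
# per year, capped at 100%.
ia_evsi = population_value(per_patient_evsi, incidence, horizon,
                           lambda t: min(1.0, 0.3 + 0.1 * t))
# ia_evsi < evsi: assuming perfect implementation overstates the value.
```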

  16. cMsg - A general purpose, publish-subscribe, interprocess communication implementation and framework

    International Nuclear Information System (INIS)

    Timmer, C; Abbott, D; Gyurjyan, V; Heyes, G; Jastrzembski, E; Wolin, E

    2008-01-01

    cMsg is software used to send and receive messages in the Jefferson Lab online and run-control systems. It was created to replace the several IPC software packages in use with a single API. cMsg is asynchronous in nature, running a callback for each message received; however, it also includes synchronous routines for convenience. On the framework level, cMsg is a thin API layer in Java, C, or C++ that can be used to wrap most message-based interprocess communication protocols. The top layer of cMsg uses this same API and multiplexes user calls to one of many such wrapped protocols (or domains) based on a URL-like string which we call a Uniform Domain Locator or UDL. One such domain is a complete implementation of a publish-subscribe messaging system using network communications and written in Java (with user APIs in C and C++ as well). This domain is built in a way which allows it to be used as a proxy server to other domains (protocols). Performance is excellent, allowing the system to be used not only for messaging but also as a data distribution system.
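The publish-subscribe, callback-per-message pattern that cMsg implements can be sketched generically. The class and method names below are hypothetical and far simpler than the actual cMsg API (no UDLs, domains, or networking), but they show the core subscribe/publish flow.

```python
class MessageBus:
    """Minimal publish-subscribe bus: a callback runs for every message
    whose (subject, type) matches a subscription, mirroring cMsg's
    callback-per-message style. Names are hypothetical."""

    def __init__(self):
        self.subscriptions = {}  # (subject, msg_type) -> list of callbacks

    def subscribe(self, subject, msg_type, callback):
        self.subscriptions.setdefault((subject, msg_type), []).append(callback)

    def publish(self, subject, msg_type, payload):
        # A real implementation dispatches asynchronously over the network;
        # here delivery is a synchronous, in-process function call.
        for callback in self.subscriptions.get((subject, msg_type), []):
            callback(payload)

bus = MessageBus()
received = []
bus.subscribe("daq", "status", received.append)
bus.publish("daq", "status", "run 42 started")   # delivered
bus.publish("daq", "error", "unmatched type")    # no subscriber, dropped
```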

  17. Hydra: a scalable proteomic search engine which utilizes the Hadoop distributed computing framework.

    Science.gov (United States)

    Lewis, Steven; Csordas, Attila; Killcoyne, Sarah; Hermjakob, Henning; Hoopmann, Michael R; Moritz, Robert L; Deutsch, Eric W; Boyle, John

    2012-12-05

    For shotgun mass spectrometry-based proteomics, the most computationally expensive step is matching the spectra against an increasingly large database of sequences and their post-translational modifications with known masses. Each mass spectrometer can generate data at an astonishingly high rate, and the scope of what is searched for is continually increasing. Therefore, solutions for improving our ability to perform these searches are needed. We present a sequence database search engine that is specifically designed to run efficiently on the Hadoop MapReduce distributed computing framework. The search engine implements the K-score algorithm, generating comparable output for the same input files as the original implementation. The scalability of the system is shown, and the architecture required for the development of such distributed processing is discussed. The software is scalable in its ability to handle a large peptide database, numerous modifications and large numbers of spectra. Performance scales with the number of processors in the cluster, allowing throughput to expand with the available resources.
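The division of the search into map and reduce steps can be sketched as follows. This is a toy stand-in, not Hydra's code: matching is reduced to a precursor-mass tolerance test, and the reduce step keeps the closest match instead of computing the K-score.

```python
def map_phase(spectra, peptides, tolerance=0.5):
    """Map step: emit (peptide, spectrum) candidate pairs whose precursor
    masses agree within a tolerance. A toy stand-in for the search logic."""
    for spec_id, spec_mass in spectra:
        for pep, pep_mass in peptides:
            delta = abs(spec_mass - pep_mass)
            if delta <= tolerance:
                yield pep, (spec_id, delta)

def reduce_phase(pairs):
    """Reduce step: for each peptide key, keep the closest-mass spectrum.
    A real engine would apply the K-score to each candidate here."""
    best = {}
    for pep, (spec_id, delta) in pairs:
        if pep not in best or delta < best[pep][1]:
            best[pep] = (spec_id, delta)
    return best

spectra = [("s1", 500.2), ("s2", 800.1)]             # (id, precursor mass)
peptides = [("PEPTIDE", 500.0), ("PROTEIN", 799.9)]  # (sequence, mass)
matches = reduce_phase(map_phase(spectra, peptides))
```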

  18. DiSC: A Simulation Framework for Distribution System Voltage Control

    DEFF Research Database (Denmark)

    Pedersen, Rasmus; Sloth, Christoffer Eg; Andresen, Gorm

    2015-01-01

    This paper presents the MATLAB simulation framework, DiSC, for verifying voltage control approaches in power distribution systems. It consists of real consumption data, stochastic models of renewable resources, flexible assets, electrical grid, and models of the underlying communication channels....... The simulation framework makes it possible to validate control approaches, and thus advance realistic and robust control algorithms for distribution system voltage control. Two examples demonstrate the potential voltage issues from penetration of renewables in the distribution grid, along with simple control...
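One of the simple control approaches such a framework can validate is a local volt/var droop on a PV inverter. The sketch below is a generic illustration with invented parameter values, not a controller shipped with DiSC.

```python
def droop_q(v_pu, v_ref=1.0, gain=2.0, q_max=0.3):
    """Local volt/var droop: reactive power (per unit) proportional to the
    voltage deviation, saturated at the inverter limit. Overvoltage makes
    q negative (absorb vars); undervoltage makes it positive (inject)."""
    q = -gain * (v_pu - v_ref)
    return max(-q_max, min(q_max, q))

q_high = droop_q(1.05)  # ~ -0.1 p.u.: absorb vars to pull voltage down
q_sat = droop_q(1.30)   # saturates at the -0.3 p.u. limit
q_low = droop_q(0.95)   # ~ +0.1 p.u.: inject vars to lift voltage
```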

  19. A Modular Framework for Modeling Hardware Elements in Distributed Engine Control Systems

    Science.gov (United States)

    Zinnecker, Alicia M.; Culley, Dennis E.; Aretskin-Hariton, Eliot D.

    2015-01-01

    Progress toward the implementation of distributed engine control in an aerospace application may be accelerated through the development of a hardware-in-the-loop (HIL) system for testing new control architectures and hardware outside of a physical test cell environment. One component required in an HIL simulation system is a high-fidelity model of the control platform: sensors, actuators, and the control law. The control system developed for the Commercial Modular Aero-Propulsion System Simulation 40k (C-MAPSS40k) provides a verifiable baseline for development of a model for simulating a distributed control architecture. This distributed controller model will contain enhanced hardware models, capturing the dynamics of the transducer and the effects of data processing, and a model of the controller network. A multilevel framework is presented that establishes three sets of interfaces in the control platform: communication with the engine (through sensors and actuators), communication between hardware and controller (over a network), and the physical connections within individual pieces of hardware. This introduces modularity at each level of the model, encouraging collaboration in the development and testing of various control schemes or hardware designs. At the hardware level, this modularity is leveraged through the creation of a Simulink library containing blocks for constructing smart transducer models complying with the IEEE 1451 specification. These hardware models were incorporated in a distributed version of the baseline C-MAPSS40k controller, and simulations were run to compare the performance of the two models. The overall tracking ability differed only due to quantization effects in the feedback measurements in the distributed controller. It was also found that the added complexity of the smart transducer models did not prevent real-time operation of the distributed controller model, a requirement of an HIL system.
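The quantization effect noted in the feedback measurements can be reproduced with a uniform-quantizer model of the smart transducer's ADC. The full-scale range, bit depth, and sensed value below are invented for illustration.

```python
def quantize(value, full_scale, bits):
    """Uniform quantizer modelling the ADC in a smart sensor node: the
    continuous measurement is rounded to one of 2**bits levels spanning
    [0, full_scale]."""
    step = full_scale / 2 ** bits
    return round(value / step) * step

true_speed = 10412.7                          # illustrative sensed value
reported = quantize(true_speed, 20000.0, 12)  # what the network carries
error = reported - true_speed                 # bounded by half a step
```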

  20. Regulation of electricity distribution: Issues for implementing a norm model

    International Nuclear Information System (INIS)

    Bjoerndal, Endre; Bjoerndal, Mette; Bjoernenak, Trond; Johnsen, Thore

    2005-01-01

    The Norwegian regulation of transmission and distribution of electricity is currently under revision, and several proposals, including price caps, various norm models and adjustments to the present revenue cap model, have been considered by the Norwegian regulator, NVE. Our starting point is that a successful and sustainable income regulation model for electricity distribution should be in accordance with the way of thinking, and the managerial tools, of modern businesses. In the regulation it is assumed that decisions regarding operations and investments are made by independent, business-oriented entities. The ambition of a dynamically efficient industry therefore requires that the regulatory model and its implementation support best-practice business performance. This will influence how the cost base is determined and the way investments are dealt with. We investigate a possible implementation of a regulatory model based on cost norms, distinguishing between customer-driven costs on the one hand and costs related to the network itself on the other. The network-related costs, which account for approximately 80% of the total cost of electricity distribution, include the costs of operating and maintaining the network, as well as capital costs. These are the "difficult" costs, as their levels depend on structural and climatic factors, as well as the number of customers and the load that is served. Additionally, the costs are not separable, since, for instance, maintenance and investments can be substitutable activities. The work concentrates on verifying the cost model and evaluating implications for the use of the present efficiency model (DEA) in the regulation. Moreover, we consider how network-related costs can be managed in a norm model. Finally, it is highlighted that an important part of a regulatory model based on cost norms is to devise quality measures and decide how to use them in the economic regulation. (Author)

  1. A general framework for implementing NLO calculations in shower Monte Carlo programs. The POWHEG BOX

    Energy Technology Data Exchange (ETDEWEB)

    Alioli, Simone [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany); Nason, Paolo [INFN, Milano-Bicocca (Italy); Oleari, Carlo [INFN, Milano-Bicocca (Italy); Milano-Bicocca Univ. (Italy); Re, Emanuele [Durham Univ. (United Kingdom). Inst. for Particle Physics Phenomenology

    2010-02-15

    In this work we illustrate the POWHEG BOX, a general computer code framework for implementing NLO calculations in shower Monte Carlo programs according to the POWHEG method. The aim of this work is to provide an illustration of the needed theoretical ingredients, a view of how the code is organized, and a description of what a user should provide in order to use it. (orig.)

  2. Communication Channels as Implementation Determinants of Performance Management Framework in Kenya

    Science.gov (United States)

    Sang, Jane

    2016-01-01

    The purpose of this study was to assess communication channels as implementation determinants of the performance management framework in Kenya at Moi Teaching and Referral Hospital (MTRH). Communication theory was used to inform the study. This study adopted an explanatory design. The study targeted a sample of 510 respondents through simple random and stratified…

  3. A general framework for implementing NLO calculations in shower Monte Carlo programs. The POWHEG BOX

    International Nuclear Information System (INIS)

    Alioli, Simone; Nason, Paolo; Oleari, Carlo; Re, Emanuele

    2010-02-01

    In this work we illustrate the POWHEG BOX, a general computer code framework for implementing NLO calculations in shower Monte Carlo programs according to the POWHEG method. The aim of this work is to provide an illustration of the needed theoretical ingredients, a view of how the code is organized, and a description of what a user should provide in order to use it. (orig.)

  4. Implementation of a Framework for Collaborative Social Networks in E-Learning

    Science.gov (United States)

    Maglajlic, Seid

    2016-01-01

    This paper describes the implementation of a framework for the construction and utilization of social networks in E-Learning. These social networks aim to enhance collaboration between all E-Learning participants (i.e. both trainee-to-trainee and trainee-to-tutor communication are targeted). E-Learning systems that include a so-called "social…

  5. PIRPOSAL Model of Integrative STEM Education: Conceptual and Pedagogical Framework for Classroom Implementation

    Science.gov (United States)

    Wells, John G.

    2016-01-01

    The PIRPOSAL model is both a conceptual and pedagogical framework intended for use as a pragmatic guide to classroom implementation of Integrative STEM Education. Designerly questioning prompted by a "need to know" serves as the basis for transitioning student designers within and among multiple phases while they progress toward an…

  6. Teacher Competencies for the Implementation of Collaborative Learning in the Classroom: A Framework and Research Review

    Science.gov (United States)

    Kaendler, Celia; Wiedmann, Michael; Rummel, Nikol; Spada, Hans

    2015-01-01

    This article describes teacher competencies for implementing collaborative learning in the classroom. Research has shown that the effectiveness of collaborative learning largely depends on the quality of student interaction. We therefore focus on what a "teacher" can do to foster student interaction. First, we present a framework that…

  7. Improving district level health planning and priority setting in Tanzania through implementing accountability for reasonableness framework

    DEFF Research Database (Denmark)

    Maluka, Stephen; Kamuzora, Peter; Sebastián, Miguel San

    2010-01-01

    In 2006, researchers and decision-makers launched a five-year project - Response to Accountable Priority Setting for Trust in Health Systems (REACT) - to improve planning and priority-setting through implementing the Accountability for Reasonableness framework in Mbarali District, Tanzania...

  8. The Use of Ethical Frameworks for Implementing Science as a Human Endeavour in Year 10 Biology

    Science.gov (United States)

    Yap, Siew Fong; Dawson, Vaille

    2014-01-01

    This research focuses on the use of ethical frameworks as a pedagogical model for socio-scientific education in implementing the "Science as a Human Endeavour" (SHE) strand of the Australian Curriculum: Science in a Year 10 biology class in a Christian college in metropolitan Perth, Western Australia. Using a case study approach, a mixed…

  9. Cost-effectiveness analysis for the implementation of the EU Water Framework Directive

    NARCIS (Netherlands)

    van Engelen, D.M.; Seidelin, Christian; van der Veeren, Rob; Barton, David N.; Queb, Kabir

    2008-01-01

    The EU Water Framework Directive (WFD) prescribes cost-effectiveness analysis (CEA) as an economic tool for the minimisation of costs when formulating programmes of measures to be implemented in the European river basins by the year 2009. The WFD does not specify, however, which approach to CEA has

  10. A planning and analysis framework for evaluating distributed generation and utility strategies

    International Nuclear Information System (INIS)

    Ault, Graham W.

    2000-01-01

    The numbers of smaller scale distributed power generation units connected to the distribution networks of electricity utilities in the UK and elsewhere have grown significantly in recent years. Numerous economic and political drivers have stimulated this growth and continue to provide the environment for future growth in distributed generation. The simple fact that distributed generation is independent from the distribution utility complicates planning and operational tasks for the distribution network. The uncertainty relating to the number, location and type of distributed generating units to connect to the distribution network in the future makes distribution planning a particularly difficult activity. This thesis concerns the problem of distribution network and business planning in the era of distributed generation. A distributed generation strategic analysis framework is proposed to provide the required analytical capability and planning and decision making framework to enable distribution utilities to deal effectively with the challenges and opportunities presented to them by distributed generation. The distributed generation strategic analysis framework is based on the best features of modern planning and decision making methodologies and facilitates scenario based analysis across many utility strategic options and uncertainties. Case studies are presented and assessed to clearly illustrate the potential benefits of such an approach to distributed generation planning in the UK electricity supply industry. (author)
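Scenario-based analysis across strategic options and uncertainties of the kind described can be illustrated with a minimax-regret screen: for each utility strategy, compute its worst-case regret across DG-growth scenarios and prefer the strategy that minimizes it. The strategies, scenarios, and payoffs below are invented for illustration.

```python
def minimax_regret(payoffs):
    """For each strategy, regret in a scenario is the gap to the best
    strategy for that scenario; rank strategies by their worst-case regret."""
    scenarios = {s for row in payoffs.values() for s in row}
    best = {s: max(row[s] for row in payoffs.values()) for s in scenarios}
    regret = {strategy: max(best[s] - row[s] for s in scenarios)
              for strategy, row in payoffs.items()}
    return min(regret, key=regret.get), regret

payoffs = {  # illustrative net benefit per DG-growth scenario
    "reinforce_network":   {"low_dg": 5.0, "high_dg": 2.0},
    "contract_dg_support": {"low_dg": 3.0, "high_dg": 6.0},
}
strategy, regrets = minimax_regret(payoffs)  # prefers "contract_dg_support"
```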

  11. Implementation of a PETN failure model using ARIA's general chemistry framework

    Energy Technology Data Exchange (ETDEWEB)

    Hobbs, Michael L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-01-01

    We previously developed a PETN thermal decomposition model that accurately predicts thermal ignition and detonator failure [1]. This model was originally developed for CALORE [2] and required several complex user subroutines. Recently, a simplified version of the PETN decomposition model was implemented into ARIA [3] using a general chemistry framework without need for user subroutines. Detonator failure was also predicted with this new model using ENCORE. The model was simplified by 1) basing the model on moles rather than mass, 2) simplifying the thermal conductivity model, and 3) implementing ARIA’s new phase change model. This memo briefly describes the model, implementation, and validation.

  12. Assessment of school wellness policies implementation by benchmarking against diffusion of innovation framework.

    Science.gov (United States)

    Harriger, Dinah; Lu, Wenhua; McKyer, E Lisako J; Pruitt, Buzz E; Goodson, Patricia

    2014-04-01

    The School Wellness Policy (SWP) mandate marks one of the first innovative and extensive efforts of the US government to address the child obesity epidemic and the influence of the school environment on child health. However, no systematic review has been conducted to examine the implementation of the mandate. This study examines the literature on SWP implementation by using the Diffusion of Innovations Theory as a framework. Empirically based literature on SWP was systematically searched and analyzed. A theory-driven approach was used to categorize the articles by 4 diffusion stages: restructuring/redefining, clarifying, routinizing, and multiple stages. Twenty-one studies were identified, and 3 key characteristics of the reviewed literature were captured: (1) uniformity in methodology, (2) the role of context in analyzing policy implementation, and (3) lack of information related to policy clarification. Over half of the studies were published by overlapping sets of authors, and only 1 study employed a purely qualitative methodology. Only 2 articles included an explicit theoretical framework to study theory-driven constructs related to SWP implementation. Policy implementation research can inform the policy process; therefore, it is essential that policy implementation is measured accurately. Failing to clearly define implementation constructs may result in misguided conclusions. © 2014, American School Health Association.

  13. Communication Optimizations for a Wireless Distributed Prognostic Framework

    Data.gov (United States)

    National Aeronautics and Space Administration — Distributed architecture for prognostics is an essential step in prognostic research in order to enable feasible real-time system health management. Communication...

  14. A Framework for Modeling and Analyzing Complex Distributed Systems

    National Research Council Canada - National Science Library

    Lynch, Nancy A; Shvartsman, Alex Allister

    2005-01-01

    Report developed under STTR contract for topic AF04-T023. This Phase I project developed a modeling language and laid a foundation for computational support tools for specifying, analyzing, and verifying complex distributed system designs...

  15. Visapult: A Prototype Remote and Distributed Visualization Application and Framework

    International Nuclear Information System (INIS)

    Bethel, Wes

    2000-01-01

    We describe an approach used for implementing a highly efficient and scalable method for direct volume rendering. Our approach uses a pipelined-parallel decomposition composed of parallel computers and commodity desktop hardware. With our approach, desktop interactivity is divorced from the latency inherent in network-based applications

  16. Implementation of force distribution analysis for molecular dynamics simulations

    Directory of Open Access Journals (Sweden)

    Seifert Christian

    2011-04-01

    Full Text Available Abstract Background The way mechanical stress is distributed inside and propagated by proteins and other biopolymers largely defines their function. Yet, determining the network of interactions propagating internal strain remains a challenge for both experiment and theory. Based on molecular dynamics simulations, we developed force distribution analysis (FDA), a method that allows visualizing strain propagation in macromolecules. Results To be immediately applicable to a wide range of systems, FDA was implemented as an extension to Gromacs, a commonly used package for molecular simulations. The FDA code comes with an easy-to-use command line interface and can be applied directly to every system built using Gromacs. We provide an additional R package with functions for advanced statistical analysis and presentation of the FDA data. Conclusions Using FDA, we were able to explain the origin of mechanical robustness in immunoglobulin domains and silk fibers. By elucidating propagation of internal strain upon ligand binding, we previously also successfully revealed the functionality of a stiff allosteric protein. FDA thus has the potential to be a valuable tool in the investigation and rational design of mechanical properties in proteins and nano-materials.
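The core bookkeeping behind FDA, accumulating pairwise forces and comparing them between two states (e.g., with and without a bound ligand), can be sketched with a toy 1-D harmonic model. Real FDA extracts the pairwise terms from the MD force field inside Gromacs; everything below is illustrative.

```python
import itertools

def pairwise_forces(positions, k=1.0, r0=1.0):
    """Toy pairwise force magnitudes |F_ij| = k * |r_ij - r0| for a 1-D
    chain of atoms with harmonic pair interactions."""
    forces = {}
    for (i, pi), (j, pj) in itertools.combinations(enumerate(positions), 2):
        r = abs(pi - pj)
        forces[(i, j)] = k * abs(r - r0)
    return forces

def strain_shift(state_a, state_b):
    """Per-pair change in force between two states: the quantity FDA
    visualizes to trace how strain propagates through a structure."""
    fa, fb = pairwise_forces(state_a), pairwise_forces(state_b)
    return {pair: fb[pair] - fa[pair] for pair in fa}

# Displacing the middle atom shifts force onto both of its pair terms.
shift = strain_shift([0.0, 1.0, 2.0], [0.0, 1.2, 2.0])
```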

  17. Implementation of parallel processing in the basf2 framework for Belle II

    International Nuclear Information System (INIS)

    Itoh, Ryosuke; Lee, Soohyung; Katayama, N; Mineo, S; Moll, A; Kuhr, T; Heck, M

    2012-01-01

    Recent PC servers are equipped with multi-core CPUs, and it is desirable to utilize their full processing power for data analysis in large-scale HEP experiments. The software framework basf2 is being developed for use in the Belle II experiment, a new-generation B-factory experiment at KEK, and parallel event processing to utilize multi-core CPUs is part of its design for use in massive data production. The details of the implementation of parallel event processing in the basf2 framework are discussed, with a report of a preliminary performance study in realistic use on a 32-core PC server.
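Event-level parallelism of the kind basf2 implements can be sketched with a worker pool: events are independent, so they can be processed concurrently and the results merged. basf2 actually forks processes and streams event data between them; the thread pool and function names here are only an illustration.

```python
from concurrent.futures import ThreadPoolExecutor

def process_event(event):
    """Stand-in for the module chain basf2 applies to one event."""
    return {"event": event["id"], "ntracks": len(event["hits"]) // 2}

events = [{"id": i, "hits": list(range(i % 5))} for i in range(8)]

# Events are independent, so they can be farmed out to a pool of workers
# and the (order-preserving) results collected afterwards.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(process_event, events))
```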

  18. Distributed automata in an assumption-commitment framework

    Indian Academy of Sciences (India)

    We propose a class of finite state systems of synchronizing distributed processes, where processes make assumptions at local states about the state of other processes in the system. This constrains the global states of the system to those where assumptions made by a process about another are compatible with the ...

  19. MODELS AND SOLUTIONS FOR THE IMPLEMENTATION OF DISTRIBUTED SYSTEMS

    Directory of Open Access Journals (Sweden)

    Tarca Naiana

    2011-07-01

    Full Text Available Software applications may have different degrees of complexity depending on the problems they try to solve, and can integrate very complex elements that bring together functionality that is sometimes competing or conflicting. We can take for example a mobile communications system. The functionalities of such a system are difficult to understand, and they add to the non-functional requirements such as usability, performance, cost, durability and security. The transition from local computer networks to wide-area networks that connect millions of machines around the world at speeds exceeding one gigabit per second has allowed universal access to data and the design of applications that require the simultaneous use of the computing power of several interconnected systems. These technologies have enabled the evolution from centralized to distributed systems that connect a large number of computers. To exploit the advantages of distributed systems, software and communication tools were developed that enabled the implementation of distributed processing of complex solutions. The objective of this document is to present the hardware, software and communication tools closely related to the possibility of their application at an integrated social and economic level, as a result of globalization and the evolution of e-society. These objectives and national priorities are based on the current needs and realities of Romanian society, while being consistent with the requirements of Romania's European orientation towards the knowledge society, strengthening the information society, the target goal representing the accomplishment of e-Romania, with its strategic e-government component. Achieving this objective repositions Romania and gives an advantage for sustainable growth, positive international image, rapid convergence in Europe, inclusion and strengthening areas of high competence, in line with Europe 2020, launched by the

  20. A modified theoretical framework to assess implementation fidelity of adaptive public health interventions.

    Science.gov (United States)

    Pérez, Dennis; Van der Stuyft, Patrick; Zabala, María del Carmen; Castro, Marta; Lefèvre, Pierre

    2016-07-08

    One of the major debates in implementation research turns around fidelity and adaptation. Fidelity is the degree to which an intervention is implemented as intended by its developers. It is meant to ensure that the intervention maintains its intended effects. Adaptation is the process of implementers or users bringing changes to the original design of an intervention. Depending on the nature of the modifications brought, adaptation could either be potentially positive or could carry the risk of threatening the theoretical basis of the intervention, resulting in a negative effect on expected outcomes. Adaptive interventions are those for which adaptation is allowed or even encouraged. Classical fidelity dimensions and conceptual frameworks do not address the issue of how to adapt an intervention while still maintaining its effectiveness. We support the idea that fidelity and adaptation co-exist and that adaptations can impact either positively or negatively on the intervention's effectiveness. For adaptive interventions, research should answer the question how an adequate fidelity-adaptation balance can be reached. One way to address this issue is by looking systematically at the aspects of an intervention that are being adapted. We conducted fidelity research on the implementation of an empowerment strategy for dengue prevention in Cuba. In view of the adaptive nature of the strategy, we anticipated that the classical fidelity dimensions would be of limited use for assessing adaptations. The typology we used in the assessment-implemented, not-implemented, modified, or added components of the strategy-also had limitations. It did not allow us to answer the question which of the modifications introduced in the strategy contributed to or distracted from outcomes. We confronted our empirical research with existing literature on fidelity, and as a result, considered that the framework for implementation fidelity proposed by Carroll et al. 
in 2007 could potentially meet

  1. Implementing change in primary care practices using electronic medical records: a conceptual framework.

    Science.gov (United States)

    Nemeth, Lynne S; Feifer, Chris; Stuart, Gail W; Ornstein, Steven M

    2008-01-16

    Implementing change in primary care is difficult, and little practical guidance is available to assist small primary care practices. Methods to structure care and develop new roles are often needed to implement an evidence-based practice that improves care. This study explored the process of change used to implement clinical guidelines for primary and secondary prevention of cardiovascular disease in primary care practices that used a common electronic medical record (EMR). Multiple conceptual frameworks informed the design of this study, which sought to explain the complex phenomena of implementing change in primary care practice. Qualitative methods were used to examine the processes of change that practice members used to implement the guidelines. Purposive sampling in eight primary care practices within the Practice Partner Research Network-Translating Research into Practice (PPRNet-TRIP II) clinical trial yielded 28 staff members and clinicians who were interviewed regarding how change in practice occurred while implementing clinical guidelines for primary and secondary prevention of cardiovascular disease and strokes. A conceptual framework for implementing clinical guidelines into primary care practice was developed through this research. 
Seven concepts and their relationships were modelled within this framework: leaders setting a vision with clear goals for staff to embrace; involving the team to enable the goals and vision for the practice to be achieved; enhancing communication systems to reinforce goals for patient care; developing the team to enable the staff to contribute toward practice improvement; taking small steps, encouraging practices' tests of small changes in practice; assimilating the electronic medical record to maximize clinical effectiveness, enhancing practices' use of the electronic tool they have invested in for patient care improvement; and providing feedback within a culture of improvement, leading to an iterative cycle of goal setting

  2. A Distributed Framework for Supporting 3D Swarming Applications

    OpenAIRE

    Pour Sadrollah, Ghazaleh; Barca, Jan Carlo; Khan, Asad; Eliasson, Jens; Senthooran, Ilankaikone

    2014-01-01

    Abstract—In-flight wireless sensor networks (WSN) are of increased interest owing to efficiency gains in weight and operational lifetime of IP-enabled computers. High impact 3D swarming applications for such systems include autonomous mapping, surveying, servicing, environmental monitoring and disaster site management. For distributed robotic applications, such as quad copter swarms, it is critical that the robots are able to localise themselves autonomously with respect to other robots and to share ...

  3. An Optimization Framework for Dynamic, Distributed Real-Time Systems

    Science.gov (United States)

    Eckert, Klaus; Juedes, David; Welch, Lonnie; Chelberg, David; Bruggerman, Carl; Drews, Frank; Fleeman, David; Parrott, David; Pfarr, Barbara

    2003-01-01

    Abstract. This paper presents a model that is useful for developing resource allocation algorithms for distributed real-time systems that operate in dynamic environments. Interesting aspects of the model include dynamic environments and utility and service levels, which provide a means for graceful degradation in resource-constrained situations and support optimization of the allocation of resources. The paper also provides an allocation algorithm that illustrates how to use the model for producing feasible, optimal resource allocations.
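    The utility/service-level idea described in the abstract lends itself to a small sketch: a greedy allocator that starts every task at its lowest service level and upgrades whichever task yields the best utility gain per unit resource, stopping when the budget is exhausted. The task names, service levels and budget below are hypothetical illustrations, not data or the algorithm from the paper.

    ```python
    # Toy illustration of utility-based graceful degradation: each task offers
    # discrete service levels as (resource cost, utility) pairs; a greedy
    # allocator upgrades whichever task gives the best utility gain per unit
    # of extra resource, while a budget constraint holds.
    tasks = {
        "tracking":  [(1, 2.0), (3, 5.0)],   # (cost, utility) per service level
        "telemetry": [(1, 1.0), (2, 4.0)],
    }
    BUDGET = 4

    level = {name: 0 for name in tasks}               # everyone starts at level 0
    used = sum(levels[0][0] for levels in tasks.values())

    while True:
        # Candidate upgrades: (utility gain per unit cost, task name, extra cost).
        candidates = []
        for name, levels in tasks.items():
            i = level[name]
            if i + 1 < len(levels):
                dc = levels[i + 1][0] - levels[i][0]
                du = levels[i + 1][1] - levels[i][1]
                if used + dc <= BUDGET:
                    candidates.append((du / dc, name, dc))
        if not candidates:
            break
        _, name, dc = max(candidates)                 # best marginal utility
        level[name] += 1
        used += dc

    total_utility = sum(tasks[n][level[n]][1] for n in tasks)
    ```

    With these numbers the allocator upgrades telemetry (utility gain 3 for cost 1) but leaves tracking degraded at its lowest level, because its upgrade would exceed the budget.
    
    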

  4. How can we improve guideline use? A conceptual framework of implementability

    Directory of Open Access Journals (Sweden)

    Lemieux-Charles Louise

    2011-03-01

    Full Text Available Abstract Background Guidelines continue to be underutilized, and a variety of strategies to improve their use have been suboptimal. Modifying guideline features represents an alternative, but untested way to promote their use. The purpose of this study was to identify and define features that facilitate guideline use, and examine whether and how they are included in current guidelines. Methods A guideline implementability framework was developed by reviewing the implementation science literature. We then examined whether guidelines included these, or additional implementability elements. Data were extracted from publicly available high quality guidelines reflecting primary and institutional care, reviewed independently by two individuals, who through discussion resolved conflicts, then by the research team. Results The final implementability framework included 22 elements organized in the domains of adaptability, usability, validity, applicability, communicability, accommodation, implementation, and evaluation. Data were extracted from 20 guidelines on the management of diabetes, hypertension, leg ulcer, and heart failure. Most contained a large volume of graded, narrative evidence, and tables featuring complementary clinical information. Few contained additional features that could improve guideline use. These included alternate versions for different users and purposes, summaries of evidence and recommendations, information to facilitate interaction with and involvement of patients, details of resource implications, and instructions on how to locally promote and monitor guideline use. There were no consistent trends by guideline topic. Conclusions Numerous opportunities were identified by which guidelines could be modified to support various types of decision making by different users. New governance structures may be required to accommodate development of guidelines with these features. Further research is needed to validate the proposed

  5. A lightweight messaging-based distributed processing and workflow execution framework for real-time and big data analysis

    Science.gov (United States)

    Laban, Shaban; El-Desouky, Aly

    2014-05-01

    To achieve rapid, simple and reliable parallel processing of different types of tasks and big data processing on any compute cluster, a lightweight messaging-based distributed applications processing and workflow execution framework model is proposed. The framework is based on Apache ActiveMQ and the Simple (or Streaming) Text Oriented Message Protocol (STOMP). ActiveMQ, a popular and powerful open source persistence messaging and integration patterns server with scheduler capabilities, acts as the message broker in the framework. STOMP provides an interoperable wire format that allows framework programs to talk and interact with each other and with ActiveMQ easily. In order to use the message broker efficiently, a unified message and topic naming pattern is utilized. Only three Python programs and a simple library, which unify and simplify the use of ActiveMQ and the STOMP protocol, are needed to use the framework. A watchdog program is used to monitor, remove, add, start and stop any machine and/or its different tasks when necessary. Each machine runs exactly one dedicated zookeeper program, which starts the different functions or tasks, i.e., the stompShell programs needed for executing the user's required workflow. The stompShell instances execute workflow jobs based on received messages. A well-defined, simple and flexible message structure, based on JavaScript Object Notation (JSON), is used to build any complex workflow system. JSON is also used for configuration and for communication between machines and programs. The framework is platform independent. Although the framework is built using Python, the actual workflow programs or jobs can be implemented in any programming language. The generic framework can be used in small national data centres for processing seismological and radionuclide data received from the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear
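    The JSON message structure and unified topic naming pattern mentioned in the abstract can be sketched as follows. The topic pattern and field names here are assumptions for illustration; the framework defines its own schema, and a real deployment would send the serialized body to an ActiveMQ broker over STOMP rather than just decoding it locally.

    ```python
    import json

    def build_task_message(machine, task_id, command, args):
        """Build an illustrative JSON task message for a STOMP broker.

        The "/topic/workflow.<machine>.<task>" naming pattern and the payload
        fields are hypothetical stand-ins for the framework's own conventions.
        """
        topic = f"/topic/workflow.{machine}.{task_id}"
        payload = {
            "task_id": task_id,
            "command": command,
            "args": args,
            "status": "pending",
        }
        return topic, json.dumps(payload)

    topic, body = build_task_message("node01", "t42", "process_waveform", ["input.dat"])
    # A receiving stompShell-style worker would parse the body back into a dict:
    decoded = json.loads(body)
    ```

    Because the wire format is plain JSON over STOMP, the consumer on the other end of the topic can be written in any language, which is the portability property the abstract emphasizes.
    
    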

  6. Participation in the implementation of the Water Framework Directive in Denmark

    DEFF Research Database (Denmark)

    Wright, Stuart Anthony Lewis; Jacobsen, Brian Højland

    2011-01-01

    Public participation in the form of informing, consulting and actively involving all interested parties is required during the implementation of the Water Framework Directive (WFD). This paper discusses progress with implementation of the WFD in Denmark and the measures taken to conform to the participation requirements. The paper then presents the Danish AGWAPLAN project, which actively involved farmers in selecting measures to reduce diffuse nutrient pollution from agriculture. The second aim of the paper is to establish whether nationwide implementation of the AGWAPLAN concept is worthwhile. AGWAPLAN resulted in outcomes which could potentially increase the effectiveness of the WFD. Furthermore, the adoption of the project approach would also be one way to satisfy the requirement for active involvement in the Directive. However, some problems exist, relating to time, administrative costs and problems with control...

  7. A Testing and Implementation Framework (TIF) for Climate Adaptation Innovations : Initial Version of the TIF - Deliverable 5.1

    NARCIS (Netherlands)

    Sebastian, A.G.; Lendering, K.T.; van Loon-Steensma, J.M.; Paprotny, D.; Bellamy, Rob; Willems, Patrick; van Loenhout, Joris; Colaço, Conceição; Dias, Susana; Nunes, Leónia; Rego, Francisco; Koundouri, Phoebe; Xepapadeas, Petros; Vassilopoulos, Achilleas; Wiktor, Paweł; Wysocka-Golec, Justyna

    2017-01-01

    Currently there is no internationally accepted framework for assessing the readiness of innovations that reduce disaster risk. To fill this gap, BRIGAID is developing a standard, comprehensive Testing and Implementation Framework (TIF). The TIF is designed to provide innovators with a framework for

  8. A Framework for Semi-Automated Implementation of Multidimensional Data Models

    Directory of Open Access Journals (Sweden)

    Ilona Mariana NAGY

    2012-08-01

    Full Text Available Data warehousing solution development represents a challenging task which requires the employment of considerable resources on behalf of enterprises and sustained commitment from the stakeholders. Costs derive mostly from the amount of time invested in the design and physical implementation of these large projects, time that we consider may be decreased through the automation of several processes. Thus, we present a framework for semi-automated implementation of multidimensional data models and introduce an automation prototype intended to reduce the time of data structure generation in the warehousing environment. Our research is focused on the design of an automation component and the development of a corresponding prototype from technical metadata.
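    The metadata-driven generation the abstract describes can be sketched in miniature: a dictionary of technical metadata is turned into DDL for a star-schema fact table. The metadata layout, naming conventions (`fact_` prefix, `_key` suffix) and column types below are assumptions for illustration, not the paper's prototype.

    ```python
    # Hypothetical technical metadata describing one fact table of a
    # multidimensional model: its dimensions and its measures.
    meta = {
        "fact": "sales",
        "dimensions": ["date", "product", "store"],
        "measures": [("amount", "DECIMAL(12,2)"), ("quantity", "INTEGER")],
    }

    def fact_table_ddl(meta):
        """Generate CREATE TABLE DDL for a star-schema fact table from metadata."""
        cols = [f"{d}_key INTEGER NOT NULL" for d in meta["dimensions"]]
        cols += [f"{name} {sqltype}" for name, sqltype in meta["measures"]]
        body = ",\n  ".join(cols)
        return f"CREATE TABLE fact_{meta['fact']} (\n  {body}\n);"

    ddl = fact_table_ddl(meta)
    ```

    Extending the same idea to dimension tables, surrogate keys and foreign-key constraints is what makes such generation only "semi-automated": the metadata still has to be designed by hand.
    
    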

  9. More performance results and implementation of an object oriented track reconstruction model in different OO frameworks

    International Nuclear Information System (INIS)

    Gaines, Irwin; Qian Sijin

    2001-01-01

    This is an update of the report about an Object Oriented (OO) track reconstruction model, which was presented at the previous AIHENP'99 at Crete, Greece. The OO model for the Kalman filtering method has been designed for high energy physics experiments at high luminosity hadron colliders. It has been coded in the C++ programming language and successfully implemented into a few different OO computing environments of the CMS and ATLAS experiments at the future Large Hadron Collider at CERN. We shall report: (1) more performance results; (2) the implementation of the OO model into the new OO software framework 'Athena' of the ATLAS experiment and some upgrades of the OO model itself

  10. Development and Implementation of a Telecommuting Evaluation Framework, and Modeling the Executive Telecommuting Adoption Process

    Science.gov (United States)

    Vora, V. P.; Mahmassani, H. S.

    2002-02-01

    This work proposes and implements a comprehensive evaluation framework to document the telecommuter, organizational, and societal impacts of telecommuting through telecommuting programs. Evaluation processes and materials within the outlined framework are also proposed and implemented. As the first component of the evaluation process, the executive survey is administered within a public sector agency. The survey data is examined through exploratory analysis and is compared to a previous survey of private sector executives. The ordinal probit, dynamic probit, and dynamic generalized ordinal probit (DGOP) models of telecommuting adoption are calibrated to identify factors which significantly influence executive adoption preferences and to test the robustness of such factors. The public sector DGOP model of executive willingness to support telecommuting under different program scenarios is compared with an equivalent private sector DGOP model. Through the telecommuting program, a case study of telecommuting travel impacts is performed to further substantiate research.

  11. ICT, Policy, Politics, and Democracy: An Integrated Framework for G2G Implementation

    Directory of Open Access Journals (Sweden)

    Iliana Mizinova

    2006-12-01

    Full Text Available This research approaches the issue of G2G digitization using an integrated policy dynamics model. The essence of the contradictions in the G2G integration discourse is followed by a description of two policy paradigms that are then incorporated into an integrated or synthetic framework to evaluate the specifics of the G2G implementation in DHS and HUD. Speculations are made about the implications of this study for the democratic principles of government rule.

  12. National Qualifications Framework For Higher Education in Turkey, and Architectural Education: Problems and Challenges of Implementation

    Directory of Open Access Journals (Sweden)

    Emel AKÖZER

    2013-01-01

    Full Text Available The Council of Higher Education (CoHE) adopted the National Qualifications Framework for Higher Education in Turkey (NQF-HETR) in May 2009, as part of the Bologna reforms. In January 2010, the CoHE decided on full implementation of the NQF-HETR at institutional and program levels, and in this decision it was foreseen that the process would be completed by the end of December 2012. The NQF-HETR has been aligned both to the overarching Framework for Qualifications in the European Higher Education Area (QF-EHEA, 2005) and to the European Qualifications Framework for lifelong learning (EQF-LLL, 2008). The latter was introduced to facilitate European cooperation in education and training, in line with the goals of the European Union's (EU) Lisbon Strategy. This paper focuses on some of the problems that have become apparent during the NQF-HETR's implementation at the levels of “narrow fields of education” and architecture programs, and the challenges ahead. Following a discussion of the significance of the two European frameworks in light of the goals of the EHEA, the Education and Training 2010 work programme (ET 2010) and the strategic framework for European cooperation in Education and Training (ET 2020), it covers two problem areas concerning qualifications in architecture: (i) terminological and classificatory problems entailed by the NQF-HETR; (ii) the lack of alignment between the European qualifications frameworks and the EU Directive on the Recognition of Professional Qualifications (Directive EC/2005/36), which covers seven “sectoral professions” including architecture. The paper also reviews the latest developments for the modernization of the EU Directive in order to provide progression in forming an integrated European Higher Education Area.

  13. The FERMI-Elettra distributed real-time framework

    International Nuclear Information System (INIS)

    Pivetta, L.; Gaio, G.; Passuello, R.; Scalamera, G.

    2012-01-01

    FERMI-Elettra is a Free Electron Laser (FEL) based on a 1.5 GeV linac. The pulsed operation of the accelerator and the necessity to characterize and control each electron bunch requires synchronous acquisition of the beam diagnostics together with the ability to drive actuators in real-time at the linac repetition rate. The Adeos/Xenomai real-time extensions have been adopted in order to add real-time capabilities to the Linux based control system computers running the Tango software. A software communication protocol based on Gigabit Ethernet and known as Network Reflective Memory (NRM) has been developed to implement a shared memory across the whole control system, allowing computers to communicate in real-time. The NRM architecture, the real-time performance and the integration in the control system are described. (authors)

  14. Operations management in distribution networks within a smart city framework.

    Science.gov (United States)

    Cerulli, Raffaele; Dameri, Renata Paola; Sciomachen, Anna

    2017-02-20

    This article studies a vehicle routing problem with environmental constraints that are motivated by the requirements for sustainable urban transport. The empirical research presents a fleet planning problem that takes into consideration both minimum-cost vehicle routes and minimum pollution. The problem is formulated as a mixed integer linear programming model and experimentally validated using data collected from a real situation: a grocery company delivering goods ordered via e-channels to customers spread across the urban and metropolitan area of the Genoa smart city. The proposed model is a variant of the vehicle routing problem tailored to include environmental issues and street limitations. Its novelty also regards the use of real data instances provided by the B2C grocery company. Managerial implications include the choice of both the routes and the number and type of vehicles. Results show that commercial distribution strategies achieve better results in terms of both business and environmental performance, provided the smart mobility goals and constraints are included in the distribution model from the beginning. © The authors 2017. Published by Oxford University Press on behalf of the Institute of Mathematics and its Applications. All rights reserved.
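    A toy version of the decision the model formalizes, minimizing route cost subject to an emission cap, can be sketched by brute force on a single-vehicle instance. The distance matrix, emission factor and cap below are invented for illustration; the paper itself uses a mixed integer linear programming model on real company data.

    ```python
    from itertools import permutations

    # Hypothetical symmetric distance matrix (km): depot is node 0, customers 1-3.
    dist = [
        [0, 4, 6, 8],
        [4, 0, 3, 5],
        [6, 3, 0, 2],
        [8, 5, 2, 0],
    ]
    EMISSION_PER_KM = 0.2   # kg CO2 per km, assumed constant for the vehicle
    EMISSION_CAP = 4.0      # kg CO2 allowed per route (environmental constraint)

    def route_length(route):
        stops = [0, *route, 0]                     # start and end at the depot
        return sum(dist[a][b] for a, b in zip(stops, stops[1:]))

    # Enumerate all visiting orders; keep the cheapest route whose emissions
    # stay under the cap (infeasible orderings are filtered out first).
    best = min(
        (p for p in permutations([1, 2, 3])
         if route_length(p) * EMISSION_PER_KM <= EMISSION_CAP),
        key=route_length,
    )
    ```

    Brute force only works for tiny instances; the MILP formulation in the paper is what scales this decision to a real fleet, and it also covers street limitations and vehicle-type choice that this sketch omits.
    
    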

  15. Analyse of The Legal Framework in Colombia for implementation of Bioprospecting Practices

    International Nuclear Information System (INIS)

    Duarte, Oscar; Velho Lea

    2008-01-01

    The practice of bioprospecting is inherently linked with the traditional knowledge and practices of local communities in the South as well as with the commercial activities of industries (e.g., the pharmaceutical sector, agriculture) in the North. A series of actors operate at this interface, such as Non-Governmental Organizations (NGOs), research centers, universities, science and technology sponsor institutions and the State. As these actors have divergent interests and powers of negotiation, an appropriate regulatory framework is necessary to regulate their interaction. This paper analyzes the existing legal framework in a mega-diverse country like Colombia for the implementation of bioprospecting practices. The research consisted of two key components: (i) a review of the state of the art of bioprospecting; (ii) work in situ in Colombia, which consisted of analysis of information and genetic resources related to bioprospecting, participation in the implementation of a legal framework for bioprospecting practices, and interviews with Colombian professionals in the field of biodiversity conservation. Our research determined that: (i) national authorities encounter a multitude of difficulties in implementing a legal framework in Colombia, especially the Andean regional normativity; (ii) the execution of research projects related to bioprospecting in Colombia faces numerous challenges

  16. Implementing Peer Learning in Clinical Education: A Framework to Address Challenges In the "Real World".

    Science.gov (United States)

    Tai, Joanna Hong Meng; Canny, Benedict J; Haines, Terry P; Molloy, Elizabeth K

    2017-01-01

    Phenomenon: Peer learning has many benefits and can assist students in gaining the educational skills required in future years when they become teachers themselves. Peer learning may be particularly useful in clinical learning environments, where students report feeling marginalized, overwhelmed, and unsupported. Educational interventions often fail in the workplace environment, as they are often conceived in the "ideal" rather than the complex, messy real world. This work sought to explore barriers and facilitators to implementing peer learning activities in a clinical curriculum. Previous peer learning research results and a matrix of empirically derived peer learning activities were presented to local clinical education experts to generate discussion around the realities of implementing such activities. Potential barriers and limitations of and strategies for implementing peer learning in clinical education were the focus of the individual interviews. Thematic analysis of the data identified three key considerations for real-world implementation of peer learning: culture, epistemic authority, and the primacy of patient-centered care. Strategies for peer learning implementation were also developed from themes within the data, focusing on developing a culture of safety in which peer learning could be undertaken, engaging both educators and students, and establishing expectations for the use of peer learning. Insights: This study identified considerations and strategies for the implementation of peer learning activities, which took into account both educator and student roles. Reported challenges were reflective of those identified within the literature. The resultant framework may aid others in anticipating implementation challenges. Further work is required to test the framework's application in other contexts and its effect on learner outcomes.

  17. A benchmarking program to reduce red blood cell outdating: implementation, evaluation, and a conceptual framework.

    Science.gov (United States)

    Barty, Rebecca L; Gagliardi, Kathleen; Owens, Wendy; Lauzon, Deborah; Scheuermann, Sheena; Liu, Yang; Wang, Grace; Pai, Menaka; Heddle, Nancy M

    2015-07-01

    Benchmarking is a quality improvement tool that compares an organization's performance to that of its peers for selected indicators, to improve practice. Processes to develop evidence-based benchmarks for red blood cell (RBC) outdating in Ontario hospitals, based on RBC hospital disposition data from Canadian Blood Services, have been previously reported. These benchmarks were implemented in 160 hospitals provincewide with a multifaceted approach, which included hospital education, inventory management tools and resources, summaries of best practice recommendations, recognition of high-performing sites, and audit tools on the Transfusion Ontario website (http://transfusionontario.org). In this study we describe the implementation process and the impact of the benchmarking program on RBC outdating. A conceptual framework for continuous quality improvement of a benchmarking program was also developed. The RBC outdating rate for all hospitals trended downward continuously from April 2006 to February 2012, irrespective of hospitals' transfusion rates or their distance from the blood supplier. The highest annual outdating rate was 2.82%, at the beginning of the observation period. Each year brought further reductions, with a nadir outdating rate of 1.02% achieved in 2011. The key elements of the successful benchmarking strategy included dynamic targets, a comprehensive and evidence-based implementation strategy, ongoing information sharing, and a robust data system to track information. The Ontario benchmarking program for RBC outdating resulted in continuous and sustained quality improvement. Our conceptual iterative framework for benchmarking provides a guide for institutions implementing a benchmarking program. © 2015 AABB.

  18. A Framework for ERP Post-Implementation Amendments: A Literature Analysis

    Directory of Open Access Journals (Sweden)

    Taiwo Oseni

    2017-06-01

    Full Text Available Post-implementation amendments to ERP systems (ERP-PIA) are of importance for advancing ERP research, but more importantly are essential if ERP systems are to be used as a strategic and competitive business tool. For clarity, we have adopted the term “amendments” to encompass the main forms of post-implementation activities: maintenance, enhancements and upgrades. The term “amendments” is used to counteract one of the major findings from this research - the inconsistency of terms used by many authors to describe post-implementation activities. This paper presents a review of the ERP post-implementation amendment literature in order to provide answers to two specific questions: first, what is the current state of research in the field of ERP-PIA; and second, what are the future research directions that need to be explored in the field of ERP-PIA. From the review, we develop a framework to identify: (a) major themes concerning ERP post-implementation amendments, (b) inherent gaps in the post-implementation amendments literature, and (c) specific areas that require further research attention influencing the uptake of amendments. Suggestions on the empirical evaluation of research directions and their relevance to the extension of existing literature are presented.

  19. Normalisation process theory: a framework for developing, evaluating and implementing complex interventions

    LENUS (Irish Health Repository)

    Murray, Elizabeth

    2010-10-20

    Abstract Background The past decade has seen considerable interest in the development and evaluation of complex interventions to improve health. Such interventions can only have a significant impact on health and health care if they are shown to be effective when tested, are capable of being widely implemented and can be normalised into routine practice. To date, there is still a problematic gap between research and implementation. The Normalisation Process Theory (NPT) addresses the factors needed for successful implementation and integration of interventions into routine work (normalisation). Discussion In this paper, we suggest that the NPT can act as a sensitising tool, enabling researchers to think through issues of implementation while designing a complex intervention and its evaluation. The need to ensure trial procedures that are feasible and compatible with clinical practice is not limited to trials of complex interventions, and NPT may improve trial design by highlighting potential problems with recruitment or data collection, as well as ensuring the intervention has good implementation potential. Summary The NPT is a new theory which offers trialists a consistent framework that can be used to describe, assess and enhance implementation potential. We encourage trialists to consider using it in their next trial.

  20. Normalisation process theory: a framework for developing, evaluating and implementing complex interventions

    Directory of Open Access Journals (Sweden)

    Ong Bie

    2010-10-01

    Full Text Available Abstract Background The past decade has seen considerable interest in the development and evaluation of complex interventions to improve health. Such interventions can only have a significant impact on health and health care if they are shown to be effective when tested, are capable of being widely implemented and can be normalised into routine practice. To date, there is still a problematic gap between research and implementation. The Normalisation Process Theory (NPT) addresses the factors needed for successful implementation and integration of interventions into routine work (normalisation). Discussion In this paper, we suggest that the NPT can act as a sensitising tool, enabling researchers to think through issues of implementation while designing a complex intervention and its evaluation. The need to ensure trial procedures that are feasible and compatible with clinical practice is not limited to trials of complex interventions, and NPT may improve trial design by highlighting potential problems with recruitment or data collection, as well as ensuring the intervention has good implementation potential. Summary The NPT is a new theory which offers trialists a consistent framework that can be used to describe, assess and enhance implementation potential. We encourage trialists to consider using it in their next trial.

  1. Behavior and Convergence of Wasserstein Metric in the Framework of Stable Distributions

    Czech Academy of Sciences Publication Activity Database

    Omelchenko, Vadym

    2012-01-01

    Roč. 2012, č. 30 (2012), s. 124-138 ISSN 1212-074X R&D Projects: GA ČR GAP402/10/0956 Institutional research plan: CEZ:AV0Z10750506 Institutional support: RVO:67985556 Keywords : Wasserstein Metric * Stable Distributions * Empirical Distribution Function Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2013/E/omelchenko-behavior and convergence of wasserstein metric in the framework of stable distributions.pdf
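    For the empirical distribution functions this record studies, the one-dimensional Wasserstein metric has a convenient closed form: the optimal coupling matches order statistics, so for two equal-size samples the 1-Wasserstein distance reduces to the mean absolute difference of the sorted values. A minimal stdlib-only sketch (an illustration of the metric, not code from the paper):

    ```python
    def wasserstein_1d(xs, ys):
        """1-Wasserstein distance between two equal-size empirical samples.

        In one dimension the optimal transport plan pairs the i-th smallest
        point of xs with the i-th smallest point of ys, so W1 is the mean
        absolute difference of the order statistics.
        """
        if len(xs) != len(ys):
            raise ValueError("samples must have equal size")
        n = len(xs)
        return sum(abs(a - b) for a, b in zip(sorted(xs), sorted(ys))) / n

    # Shifting every point of a sample by 1 moves the empirical distribution
    # a Wasserstein distance of exactly 1.
    d = wasserstein_1d([0.0, 1.0, 2.0], [1.0, 2.0, 3.0])
    ```

    For heavy-tailed stable distributions, as in the record, convergence of this empirical quantity to its population counterpart is exactly the delicate question, since W1 requires a finite first moment.
    
    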

  2. Links in a distributed database: Theory and implementation

    International Nuclear Information System (INIS)

    Karonis, N.T.; Kraimer, M.R.

    1991-12-01

    This document addresses the problem of extending database links across Input/Output Controller (IOC) boundaries. It lays a foundation by reviewing the current system and proposing an implementation specification designed to guide all work in this area. The document also describes an implementation that is less ambitious than our formally stated proposal, one that does not extend the reach of all database links across IOC boundaries. Specifically, it introduces an implementation of input and output links and comments on that overall implementation. We include a set of manual pages describing each of the new functions the implementation provides

  3. A QDWH-Based SVD Software Framework on Distributed-Memory Manycore Systems

    KAUST Repository

    Sukkari, Dalal

    2017-01-01

    This paper presents a high performance software framework for computing a dense SVD on distributed-memory manycore systems. Originally introduced by Nakatsukasa et al. (Nakatsukasa et al. 2010; Nakatsukasa and Higham 2013), the SVD solver relies on the polar decomposition using the QR Dynamically-Weighted Halley algorithm (QDWH). Although the QDWH-based SVD algorithm performs a significant amount of extra floating-point operations compared to the traditional SVD with the one-stage bidiagonal reduction, the inherent high level of concurrency associated with Level 3 BLAS compute-bound kernels ultimately compensates for the arithmetic complexity overhead. Using the ScaLAPACK two-dimensional block cyclic data distribution with a rectangular processor topology, the resulting QDWH-SVD further reduces excessive communications during the panel factorization, while increasing the degree of parallelism during the update of the trailing submatrix, as opposed to relying on the default square processor grid. After detailing the algorithmic complexity and the memory footprint of the algorithm, we conduct a thorough performance analysis and study the impact of the grid topology on the performance by looking at the communication and computation profiling trade-offs. We report performance results against state-of-the-art existing QDWH software implementations (e.g., Elemental) and their SVD extensions on large-scale distributed-memory manycore systems based on commodity Intel x86 Haswell processors and the Knights Landing (KNL) architecture. The QDWH-SVD framework achieves up to 3/8-fold speedups on the Haswell/KNL-based platforms, respectively, against ScaLAPACK PDGESVD, and turns out to be a competitive alternative for well- and ill-conditioned matrices. We finally derive a performance model based on these empirical results. Our QDWH-based polar decomposition and its SVD extension are freely available at https://github.com/ecrc/qdwh.git and https
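    The polar decomposition at the heart of QDWH can be illustrated with the simpler Newton iteration X_{k+1} = (X_k + X_k^{-T}) / 2, which converges to the same orthogonal polar factor for any nonsingular matrix. The pure-Python 2x2 sketch below is illustrative only: QDWH replaces this with dynamically weighted Halley steps built on QR factorizations precisely to avoid explicit inverses and to converge in very few iterations.

    ```python
    def inv_t(m):
        """Inverse transpose of a 2x2 matrix given as [[a, b], [c, d]]."""
        (a, b), (c, d) = m
        det = a * d - b * c
        inv = [[d / det, -b / det], [-c / det, a / det]]
        # Transpose the inverse.
        return [[inv[0][0], inv[1][0]], [inv[0][1], inv[1][1]]]

    def polar_factor(m, iters=20):
        """Newton iteration X <- (X + X^{-T}) / 2 for the orthogonal polar factor."""
        x = [row[:] for row in m]
        for _ in range(iters):
            y = inv_t(x)
            x = [[(x[i][j] + y[i][j]) / 2 for j in range(2)] for i in range(2)]
        return x

    # A = [[0, -2], [3, 0]] factors as A = Q H with Q the 90-degree rotation
    # [[0, -1], [1, 0]] and H = diag(3, 2); the iteration recovers Q.
    q = polar_factor([[0.0, -2.0], [3.0, 0.0]])
    ```

    Once the polar factor Q is known, A = Q H with H symmetric positive semidefinite, and an eigendecomposition of H yields the SVD, which is the route the QDWH-SVD framework industrializes at scale.
    
    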

  4. A guide to using the Theoretical Domains Framework of behaviour change to investigate implementation problems.

    Science.gov (United States)

    Atkins, Lou; Francis, Jill; Islam, Rafat; O'Connor, Denise; Patey, Andrea; Ivers, Noah; Foy, Robbie; Duncan, Eilidh M; Colquhoun, Heather; Grimshaw, Jeremy M; Lawton, Rebecca; Michie, Susan

    2017-06-21

    Implementing new practices requires changes in the behaviour of relevant actors, and this is facilitated by understanding of the determinants of current and desired behaviours. The Theoretical Domains Framework (TDF) was developed by a collaboration of behavioural scientists and implementation researchers who identified theories relevant to implementation and grouped constructs from these theories into domains. The collaboration aimed to provide a comprehensive, theory-informed approach to identify determinants of behaviour. The first version was published in 2005, and a subsequent version following a validation exercise was published in 2012. This guide offers practical guidance for those who wish to apply the TDF to assess implementation problems and support intervention design. It presents a brief rationale for using a theoretical approach to investigate and address implementation problems, summarises the TDF and its development, and describes how to apply the TDF to achieve implementation objectives. Examples from the implementation research literature are presented to illustrate relevant methods and practical considerations. Researchers from Canada, the UK and Australia attended a 3-day meeting in December 2012 to build an international collaboration among researchers and decision-makers interested in the advancing use of the TDF. The participants were experienced in using the TDF to assess implementation problems, design interventions, and/or understand change processes. This guide is an output of the meeting and also draws on the authors' collective experience. Examples from the implementation research literature judged by authors to be representative of specific applications of the TDF are included in this guide. 
We explain and illustrate methods, with a focus on qualitative approaches, for selecting and specifying target behaviours key to implementation, selecting the study design, deciding the sampling strategy, developing study materials, collecting and

  5. Consolidating tactical planning and implementation frameworks for integrated vector management in Uganda.

    Science.gov (United States)

    Okia, Michael; Okui, Peter; Lugemwa, Myers; Govere, John M; Katamba, Vincent; Rwakimari, John B; Mpeka, Betty; Chanda, Emmanuel

    2016-04-14

    Integrated vector management (IVM) is the recommended approach for controlling some vector-borne diseases (VBD). In the face of current challenges to disease vector control, IVM is vital to achieving national targets set for VBD control. Though global efforts, especially for combating malaria, now focus on elimination and eradication, IVM remains useful for Uganda, which is principally still in the control phase of the malaria continuum. This paper outlines the processes undertaken to consolidate tactical planning and implementation frameworks for IVM in Uganda. The Uganda National Malaria Control Programme, with its efforts to implement an IVM approach to vector control, was the 'case' for this study. Integrated management of malaria vectors in Uganda remained an underdeveloped component of malaria control policy. In 2012, knowledge and perceptions of malaria vector control policy and IVM were assessed, and recommendations for a specific IVM policy were made. In 2014, a thorough vector control needs assessment (VCNA) was conducted according to WHO recommendations. The findings of the VCNA informed the development of the national IVM strategic guidelines. Information sources for this study included all available data and accessible archived documentary records on VBD control in Uganda. The literature was reviewed, adapted to the local context, and translated into the consolidated tactical framework. WHO recommends implementation of IVM as the main strategy for vector control and has encouraged member states to adopt the approach. However, many VBD-endemic countries lack IVM policy frameworks to guide implementation of the approach. In Uganda, most VBDs coexist and could be managed more effectively in tandem. In order to successfully control malaria and other VBD and move towards their elimination, the country needs to scale up proven and effective vector control interventions and also learn from the experience of other countries.
The IVM strategy is important in

  6. A unified framework for spiking and gap-junction interactions in distributed neuronal network simulations

    Directory of Open Access Journals (Sweden)

    Jan Hahne

    2015-09-01

    Contemporary simulators for networks of point and few-compartment model neurons come with a plethora of ready-to-use neuron and synapse models and support complex network topologies. Recent technological advancements have broadened the spectrum of application further to the efficient simulation of brain-scale networks on supercomputers. In distributed network simulations the amount of spike data that accrues per millisecond and process is typically low, such that a common optimization strategy is to communicate spikes at relatively long intervals, where the upper limit is given by the shortest synaptic transmission delay in the network. This approach is well-suited for simulations that employ only chemical synapses but it has so far impeded the incorporation of gap-junction models, which require instantaneous neuronal interactions. Here, we present a numerical algorithm based on a waveform-relaxation technique which allows for network simulations with gap junctions in a way that is compatible with the delayed communication strategy. Using a reference implementation in the NEST simulator, we demonstrate that the algorithm and the required data structures can be smoothly integrated with existing code such that they complement the infrastructure for spiking connections. To show that the unified framework for gap-junction and spiking interactions achieves high performance and delivers high accuracy...
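    The waveform-relaxation scheme the abstract describes can be sketched compactly: each neuron is integrated over a full communication interval using the other neuron's voltage trace from the previous iteration, and the exchange is repeated until the traces converge. The following Python sketch (passive membranes, forward Euler, illustrative parameter values; this is not NEST's actual solver) shows two gap-junction-coupled leaky cells reaching a consistent joint solution.

```python
import numpy as np

def integrate_neuron(V0, V_other, g_gap, dt, tau=10.0, E_L=-65.0):
    # Integrate one passive membrane over the whole interval, treating the
    # other neuron's voltage waveform V_other as a fixed, known input.
    V = np.empty_like(V_other)
    V[0] = V0
    for k in range(len(V_other) - 1):
        dV = (-(V[k] - E_L) + g_gap * (V_other[k] - V[k])) / tau
        V[k + 1] = V[k] + dt * dV
    return V

def waveform_relaxation(V1_0, V2_0, g_gap, dt, steps, iters=20):
    # Initial guess: each neuron sees the other's voltage held constant.
    V1 = np.full(steps + 1, V1_0)
    V2 = np.full(steps + 1, V2_0)
    for _ in range(iters):
        # Re-solve each neuron against the other's previous waveform
        # (Jacobi-style fixed-point iteration over the whole interval).
        V1, V2 = (integrate_neuron(V1_0, V2, g_gap, dt),
                  integrate_neuron(V2_0, V1, g_gap, dt))
    return V1, V2

# 100 ms at dt = 0.1 ms: both cells relax toward E_L and toward each other.
V1, V2 = waveform_relaxation(-70.0, -55.0, g_gap=0.5, dt=0.1, steps=1000)
```

    Because both the leak and the gap-junction coupling are contractive, a modest number of iterations suffices here; the point of the fixed-point structure is that whole waveforms, not instantaneous values, are exchanged, which is what makes it compatible with communication at long intervals.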

  7. ClimateSpark: An in-memory distributed computing framework for big climate data analytics

    Science.gov (United States)

    Hu, Fei; Yang, Chaowei; Schnase, John L.; Duffy, Daniel Q.; Xu, Mengchao; Bowen, Michael K.; Lee, Tsengdar; Song, Weiwei

    2018-06-01

    The unprecedented growth of climate data creates new opportunities for climate studies, yet big climate data pose a grand challenge to climatologists, who must manage and analyze them efficiently. The complexity of climate data content and analytical algorithms increases the difficulty of implementing algorithms on high performance computing systems. This paper proposes an in-memory, distributed computing framework, ClimateSpark, to facilitate complex big data analytics and time-consuming computational tasks. A chunked data structure improves parallel I/O efficiency, while a spatiotemporal index is built for the chunks to avoid unnecessary data reading and preprocessing. An integrated, multi-dimensional, array-based data model (ClimateRDD) and ETL operations are developed to address big climate data variety by integrating the processing components of the climate data lifecycle. ClimateSpark utilizes Spark SQL and Apache Zeppelin to develop a web portal that facilitates interaction among climatologists, climate data, analytic operations and computing resources (e.g., using SQL queries and Scala/Python notebooks). Experimental results show that ClimateSpark conducts different spatiotemporal data queries/analytics with high efficiency and data locality. ClimateSpark is easily adaptable to other big multi-dimensional, array-based datasets in various geoscience domains.
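    At query time, the chunk-plus-index design described above reduces to bounding-box pruning: a chunk is read and decoded only if its spatiotemporal extent intersects the query region. A minimal Python sketch of the pruning test follows; the field names and chunk layout are invented for illustration and are not ClimateSpark's API.

```python
def chunk_overlaps(chunk, box):
    # Keep a chunk only if its bounding box intersects the query box in
    # every dimension (time, latitude, longitude); all other chunks are
    # skipped without any I/O or preprocessing.
    return all(chunk[lo] <= box[hi] and box[lo] <= chunk[hi]
               for lo, hi in (("t0", "t1"), ("lat0", "lat1"), ("lon0", "lon1")))

chunks = [
    {"id": 0, "t0": 0,  "t1": 9,  "lat0": -90, "lat1": 0,  "lon0": -180, "lon1": 180},
    {"id": 1, "t0": 0,  "t1": 9,  "lat0": 0,   "lat1": 90, "lon0": -180, "lon1": 180},
    {"id": 2, "t0": 10, "t1": 19, "lat0": -90, "lat1": 0,  "lon0": -180, "lon1": 180},
]
query = {"t0": 5, "t1": 12, "lat0": -30, "lat1": -10, "lon0": 0, "lon1": 30}

# Only the two southern-hemisphere chunks overlapping the time window are read.
hits = [c["id"] for c in chunks if chunk_overlaps(c, query)]  # -> [0, 2]
```

    The same test, evaluated against an index of chunk bounding boxes rather than the chunks themselves, is what lets a distributed engine assign only relevant chunks to workers and preserve data locality.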

  8. A Theory of Information Quality and a Framework for its Implementation in the Requirements Engineering Process

    Science.gov (United States)

    Grenn, Michael W.

    This dissertation introduces a theory of information quality to explain macroscopic behavior observed in the systems engineering process. The theory extends principles of Shannon's mathematical theory of communication [1948] and statistical mechanics to information development processes concerned with the flow, transformation, and meaning of information. The meaning of requirements information in the systems engineering context is estimated or measured in terms of the cumulative requirements quality Q, which corresponds to the distribution of the requirements among the available quality levels. The requirements entropy framework (REF) implements the theory to address the requirements engineering problem. The REF defines the relationship between requirements changes, requirements volatility, requirements quality, requirements entropy and uncertainty, and engineering effort. The REF is evaluated via simulation experiments to assess its practical utility as a new method for measuring, monitoring and predicting requirements trends and engineering effort at any given time in the process. The REF treats the requirements engineering process as an open system in which the requirements are discrete information entities that transition from initial states of high entropy, disorder and uncertainty toward the desired state of minimum entropy as engineering effort is input and requirements increase in quality. The distribution of the total number of requirements R among the N discrete quality levels is determined by the number of defined quality attributes accumulated by R at any given time. Quantum statistics are used to estimate the number of possibilities P for arranging R among the available quality levels. The requirements entropy H_R is estimated using R, N and P by extending principles of information theory and statistical mechanics to the requirements engineering process.
The information I increases as H_R and uncertainty decrease, and the change in information ΔI needed
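The entropy notion at the core of the REF can be illustrated with a plain Shannon-entropy estimate over the quality-level distribution (a simplification of the dissertation's quantum-statistics treatment; the counts below are invented):

```python
import math

def requirements_entropy(counts):
    # Shannon entropy (bits) of R requirements distributed over N quality
    # levels: an even spread gives the maximum log2(N), while concentration
    # at the top level drives the entropy toward the desired minimum.
    R = sum(counts)
    return -sum((n / R) * math.log2(n / R) for n in counts if n > 0)

early = requirements_entropy([25, 25, 25, 25])  # project start: even spread
late = requirements_entropy([0, 0, 5, 95])      # after effort: mostly top level
```

With four levels, the even spread scores exactly 2 bits while the concentrated distribution scores about 0.29, matching the framework's picture of engineering effort driving the requirement set from disorder toward minimum entropy.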

  9. Using the "customer service framework" to successfully implement patient- and family-centered care.

    Science.gov (United States)

    Rangachari, Pavani; Bhat, Anita; Seol, Yoon-Ho

    2011-01-01

    Despite the growing momentum toward patient- and family-centered care at the federal policy level, the organizational literature remains divided on its effectiveness, especially in regard to its key dimension of involving patients and families in treatment decisions and safety practices. Although some have argued for the universal adoption of patient involvement, others have questioned both the effectiveness and feasibility of patient involvement. In this article, we apply a well-established theoretical perspective, the Service Quality Model (SQM), also known as the "customer service framework," to the health care context to reconcile the debate related to patient involvement. The application helps support the case for universal adoption of patient involvement and question the arguments against it. A key contribution of the SQM lies in highlighting a set of fundamental service quality determinants emanating from basic consumer service needs. It also provides a simple framework for understanding how gaps between consumer expectations and management perceptions of those expectations can affect the gap between "expected" and "perceived" service quality from a consumer's perspective. Simultaneously, the SQM also outlines "management requirements" for the successful implementation of a customer service strategy. Applying the SQM to the health care context therefore not only helps reconcile the debate on patient involvement but also identifies specific steps health care managers could take to successfully implement patient- and family-centered care. Correspondingly, the application also provides insights into strategies for the successful implementation of policy recommendations related to patient- and family-centered care in health care organizations.

  10. Towards a resilience management framework for complex enterprise systems upgrade implementation

    Science.gov (United States)

    Teoh, Say Yen; Yeoh, William; Zadeh, Hossein Seif

    2017-05-01

    The lack of knowledge of how resilience management supports enterprise system (ES) projects accounts for the failure of firms to leverage their investments in costly ES implementations. Using a structured-pragmatic-situational (SPS) case study research approach, this paper reports on an investigation into the resilience management of a large utility company as it implemented an ES upgrade. Drawing on the literature and on the case study findings, we developed a process-based resilience management framework that involves three strategies (developing situation awareness, demystifying threats, and executing restoration plans) and four organisational capabilities that transform resilience management concepts into practices. We identified the crucial phases of ES upgrade implementation and developed indicators for how different strategies and capabilities of resilience management can assist managers at different stages of an ES upgrade. This research advances the state of existing knowledge by providing specific and verifiable propositions for attaining a state of resilience, the knowledge being grounded in the empirical reality of a case study. Moreover, the framework offers ES practitioners a roadmap to better identify appropriate responses and levels of preparedness.

  11. Developing a Framework for Traceability Implementation in the Textile Supply Chain

    Directory of Open Access Journals (Sweden)

    Vijay Kumar

    2017-04-01

    Traceability has recently gained considerable attention in the textile industry. Traceability stands for information sharing about a product, including the product history, specification, or location. With the involvement of globally dispersed actors in the textile supply chain, ensuring appropriate product quality with timely supplies is crucial for surviving in this industry with ever-increasing competition. Hence, it is of paramount importance for a supply chain actor to track every product and trace its history in the supply chain. In this context, this paper presents a framework to implement traceability in the textile supply chain. A systems approach has been followed, where first the usage requirements of traceability are defined, and then a framework for implementing intra-actor or internal traceability and inter-actor or external traceability is discussed. This article further presents a sequence diagram to demonstrate the interaction and information exchange between the actors in the supply chain when traceability information is requested. An example is also illustrated for data storage using a relational database management system and information exchange using XML for the textile weaver. Finally, the article discusses challenges and future studies required to implement traceability in the textile supply chain.
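    The weaver example above (relational storage internally, XML for inter-actor exchange) can be sketched as follows; the table schema, lot identifiers, and element names are all hypothetical:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Internal traceability: one row per production lot, with a link to the
# upstream actor's lot so the chain can be traced backwards.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE lot (
    lot_id       TEXT PRIMARY KEY,
    product      TEXT,
    supplier_lot TEXT,
    produced_on  TEXT)""")
db.execute("INSERT INTO lot VALUES ('W-1042', 'greige fabric', 'Y-0317', '2017-03-01')")

def lot_to_xml(lot_id):
    # External traceability: answer another actor's information request
    # as a small XML document.
    row = db.execute("SELECT lot_id, product, supplier_lot, produced_on "
                     "FROM lot WHERE lot_id = ?", (lot_id,)).fetchone()
    root = ET.Element("traceabilityRecord")
    for tag, value in zip(("lotId", "product", "supplierLot", "producedOn"), row):
        ET.SubElement(root, tag).text = value
    return ET.tostring(root, encoding="unicode")

xml_doc = lot_to_xml("W-1042")
```

    The supplier_lot column is what links one actor's internal records to the next actor upstream; following it hop by hop, with each actor answering XML requests like this one, yields the end-to-end trace the framework describes.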

  12. Implementing the European Marine Strategy Framework Directive: Scientific challenges and opportunities

    Science.gov (United States)

    Newton, Alice; Borja, Angel; Solidoro, Cosimo; Grégoire, Marilaure

    2015-10-01

    The Marine Strategy Framework Directive (MSFD; EC, 2008) is an ambitious European policy instrument that aims to achieve Good Environmental Status (GES) in the 5,720,000 km² of European seas by 2020, using an Ecosystem Approach. GES is to be assessed using 11 descriptors and up to 56 indicators (European Commission, 2010), and the goal is clean, healthy and productive seas as the basis for marine-based development, known as Blue Growth. The MSFD is one of many policy instruments, such as the Water Framework Directive, the Common Fisheries Policy and the Habitats Directive, that, together, should result in "Healthy Oceans and Productive Ecosystems - HOPE". Researchers working together with stakeholders, such as the Member States' environmental agencies, the European Environment Agency, and the Regional Sea Conventions, are to provide the scientific knowledge base for the implementation of the MSFD. This represents both a fascinating challenge and a stimulating opportunity.

  13. European Union water policy--tasks for implementing the "Water Framework Directive" in pre-accession countries.

    Science.gov (United States)

    Sözen, Seval; Avcioglu, Ebru; Ozabali, Asli; Görgun, Erdem; Orhon, Derin

    2003-08-01

    The Water Framework Directive, which aims to maintain and improve the aquatic environment in the EU, was launched by the European Parliament in 2000. According to this directive, control of quantity is an ancillary element in securing good water quality, and therefore measures on quantity, serving the objective of ensuring good quality, should also be established. Accordingly, it is a comprehensive and coordinated package that will ensure all European waters are protected to a common standard. It therefore refers to all other directives related to water resources management, such as the Urban Wastewater Treatment Directive, the Nitrates Directive, the Drinking Water Directive, the Integrated Pollution Prevention and Control Directive, etc. Turkey, as a candidate state targeting full membership, should complete the necessary preparations for the implementation of the "Water Framework Directive" as soon as possible. In this study, the necessary legislative, political, institutional, and technical steps for pre-accession countries are discussed, and practical recommendations are offered for future activities in Turkey.

  14. The SBIRT program matrix: a conceptual framework for program implementation and evaluation.

    Science.gov (United States)

    Del Boca, Frances K; McRee, Bonnie; Vendetti, Janice; Damon, Donna

    2017-02-01

    Screening, Brief Intervention and Referral to Treatment (SBIRT) is a comprehensive, integrated, public health approach to the delivery of services to those at risk for the adverse consequences of alcohol and other drug use, and for those with probable substance use disorders. Research on successful SBIRT implementation has lagged behind studies of efficacy and effectiveness. This paper (1) outlines a conceptual framework, the SBIRT Program Matrix, to guide implementation research and program evaluation and (2) specifies potential implementation outcomes. Overview and narrative description of the SBIRT Program Matrix. The SBIRT Program Matrix has five components, each of which includes multiple elements: SBIRT services; performance sites; provider attributes; patient/client populations; and management structure and activities. Implementation outcomes include program adoption, acceptability, appropriateness, feasibility, fidelity, costs, penetration, sustainability, service provision and grant compliance. The Screening, Brief Intervention and Referral to Treatment Program Matrix provides a template for identifying, classifying and organizing the naturally occurring commonalities and variations within and across SBIRT programs, and for investigating which variables are associated with implementation success and, ultimately, with treatment outcomes and other impacts. © 2017 Society for the Study of Addiction.

  15. Reducing Binge Drinking in Adolescents through Implementation of the Strategic Prevention Framework

    Science.gov (United States)

    Anderson-Carpenter, Kaston D.; Watson-Thompson, Jomella; Chaney, Lisa; Jones, Marvia

    2016-01-01

    The Strategic Prevention Framework (SPF) is a conceptual model that supports coalition-driven efforts to address underage drinking and related consequences. Although the SPF has been promoted by the U.S. Substance Abuse and Mental Health Services Administration’s Center for Substance Abuse Prevention and implemented in multiple U.S. states and territories, there is limited research on the SPF’s effectiveness in improving targeted outcomes and associated influencing factors. The present quasi-experimental study examines the effects of SPF implementation on binge drinking and on enforcement of existing underage drinking laws as an influencing factor. The intervention group encompassed 11 school districts that were implementing the SPF with local prevention coalitions across eight Kansas communities. The comparison group consisted of 14 school districts that were matched based on demographic variables. The intervention districts collectively facilitated 137 community-level changes, including new or modified programs, policies, and practices. SPF implementation supported significant improvements in binge drinking and enforcement outcomes over time (p < .05). Overall, the findings provide a basis for guiding future research and community-based prevention practice in implementing and evaluating the SPF. PMID:27217310

  16. Implementing a framework for goal setting in community based stroke rehabilitation: a process evaluation.

    Science.gov (United States)

    Scobbie, Lesley; McLean, Donald; Dixon, Diane; Duncan, Edward; Wyke, Sally

    2013-05-24

    Goal setting is considered 'best practice' in stroke rehabilitation; however, there is no consensus regarding the key components of goal setting interventions or how they should be optimally delivered in practice. We developed a theory-based goal setting and action planning framework (G-AP) to guide goal setting practice. G-AP has 4 stages: goal negotiation, goal setting, action planning & coping planning, and appraisal & feedback. All stages are recorded in a patient-held record. In this study we examined the implementation, acceptability and perceived benefits of G-AP in one community rehabilitation team with people recovering from stroke. G-AP was implemented for 6 months with 23 stroke patients. In-depth interviews with 8 patients and 8 health professionals were analysed thematically to investigate views of its implementation, acceptability and perceived benefits. Case notes of interviewed patients were analysed descriptively to assess the fidelity of G-AP implementation. G-AP was mostly implemented according to protocol, with deviations noted at the planning and appraisal and feedback stages. Each stage was felt to make a useful contribution to the overall process; however, in practice, goal negotiation and goal setting merged into one stage, and the appraisal and feedback stage included an explicit decision making component. Only two issues were raised regarding G-AP's acceptability: (i) health professionals were concerned about the impact of goal non-attainment on patients' well-being (patients did not share their concerns), and (ii) some patients and health professionals found the patient-held record unhelpful. G-AP was felt to have a positive impact on patient goal attainment and professional goal setting practice. Collaborative partnerships between health professionals and patients were apparent throughout the process. G-AP has been perceived as both beneficial and broadly acceptable in one community rehabilitation team; however, implementation of novel

  18. Organizational Health Literacy: Review of Theories, Frameworks, Guides, and Implementation Issues

    Science.gov (United States)

    Farmanova, Elina; Bonneville, Luc; Bouchard, Louise

    2018-01-01

    Organizational health literacy is described as an organization-wide effort to transform organization and delivery of care and services to make it easier for people to navigate, understand, and use information and services to take care of their health. Several health literacy guides have been developed to assist healthcare organizations with this effort, but their content has not been systematically reviewed to understand the scope and practical implications of this transformation. The objective of this study was to review (1) theories and frameworks that inform the concept of organizational health literacy, (2) the attributes of organizational health literacy as described in the guides, (3) the evidence for the effectiveness of the guides, and (4) the barriers and facilitators to implementing organizational health literacy. Drawing on a metanarrative review method, 48 publications were reviewed, of which 15 dealt with the theories and operational frameworks, 20 presented health literacy guides, and 13 addressed guided implementation of organizational health literacy. Seven theories and 9 operational frameworks have been identified. Six health literacy dimensions and 9 quality-improvement characteristics were reviewed for each health literacy guide. Evidence about the effectiveness of health literacy guides is limited at this time, but experiences with the guides were positive. Thirteen key barriers (conceived also as facilitators) were identified. Further development of organizational health literacy requires a strong and a clear connection between its vision and operationalization as an implementation strategy to patient-centered care. For many organizations, becoming health literate will require multiple, simultaneous, and radical changes. Organizational health literacy has to make sense from clinical and financial perspectives in order for organizations to embark on such a transformative journey. PMID:29569968

  20. Implementing accountability for reasonableness framework at district level in Tanzania: a realist evaluation

    Directory of Open Access Journals (Sweden)

    Ndawi Benedict

    2011-02-01

    Background Despite the growing importance of the Accountability for Reasonableness (A4R) framework in priority setting worldwide, there is still an inadequate understanding of the processes and mechanisms underlying its influence on legitimacy and fairness, as conceived and reflected in service management processes and outcomes. As a result, the ability to draw scientifically sound lessons for the application of the framework to services and interventions is limited. This paper evaluates the experiences of implementing the A4R approach in Mbarali District, Tanzania, in order to find out how the innovation was shaped, enabled, and constrained by the interaction between contexts, mechanisms and outcomes. Methods This study draws on the principles of realist evaluation -- a largely qualitative approach, chiefly concerned with testing and refining programme theories by exploring the complex interactions of contexts, mechanisms, and outcomes. Mixed methods were used in data collection, including individual interviews, non-participant observation, and document reviews. A thematic framework approach was adopted for the data analysis. Results The study found that while the A4R approach to priority setting was helpful in strengthening transparency, accountability, stakeholder engagement, and fairness, the efforts at integrating it into the current district health system were challenging. Participatory structures under the decentralisation framework, central government's call for partnership in district-level planning and priority setting, perceived needs of stakeholders, as well as active engagement between researchers and decision makers all facilitated the adoption and implementation of the innovation. In contrast, however, limited local autonomy, low level of public awareness, unreliable and untimely funding, inadequate accountability mechanisms, and limited local resources were the major contextual factors that hampered the full

  1. Surveillance indicators and their use in implementation of the Marine Strategy Framework Directive

    DEFF Research Database (Denmark)

    Shephard, Samuel; Greenstreet, Simon P. R.; Piet, GerJan J.

    2015-01-01

    The European Union Marine Strategy Framework Directive (MSFD) uses indicators to track ecosystem state in relation to Good Environmental Status (GES). These indicators were initially expected to be “operational”, i.e. to have well-understood relationships between state and specified anthropogenic pressure(s), and to have defined targets. Recent discussion on MSFD implementation has highlighted an additional class of “surveillance” indicators. Surveillance indicators monitor key aspects of the ecosystem for which there is: first, insufficient evidence to define targets and support formal state...

  2. Researcher readiness for participating in community-engaged dissemination and implementation research: a conceptual framework of core competencies.

    Science.gov (United States)

    Shea, Christopher M; Young, Tiffany L; Powell, Byron J; Rohweder, Catherine; Enga, Zoe K; Scott, Jennifer E; Carter-Edwards, Lori; Corbie-Smith, Giselle

    2017-09-01

    Participating in community-engaged dissemination and implementation (CEDI) research is challenging for a variety of reasons. Currently, there is not specific guidance or a tool available for researchers to assess their readiness to conduct CEDI research. We propose a conceptual framework that identifies detailed competencies for researchers participating in CEDI and maps these competencies to domains. The framework is a necessary step toward developing a CEDI research readiness survey that measures a researcher's attitudes, willingness, and self-reported ability for acquiring the knowledge and performing the behaviors necessary for effective community engagement. The conceptual framework for CEDI competencies was developed by a team of eight faculty and staff affiliated with a university's Clinical and Translational Science Award (CTSA). The authors developed CEDI competencies by identifying the attitudes, knowledge, and behaviors necessary for carrying out commonly accepted CE principles. After collectively developing an initial list of competencies, team members individually mapped each competency to a single domain that provided the best fit. Following the individual mapping, the group held two sessions in which the sorting preferences were shared and discrepancies were discussed until consensus was reached. During this discussion, modifications to wording of competencies and domains were made as needed. The team then engaged five community stakeholders to review and modify the competencies and domains. The CEDI framework consists of 40 competencies organized into nine domains: perceived value of CE in D&I research, introspection and openness, knowledge of community characteristics, appreciation for stakeholders' experience with and attitudes toward research, preparing the partnership for collaborative decision-making, collaborative planning for the research design and goals, communication effectiveness, equitable distribution of resources and credit, and

  3. A framework for implementation of user-centric identity management systems

    DEFF Research Database (Denmark)

    Adjei, Joseph K.; Olesen, Henning

    2010-01-01

    Users increasingly become part of electronic transactions, which take place instantaneously without direct user involvement. This leads to the risk of data manipulation, identity theft and privacy violation, and it has become a major concern for individuals and businesses around the world. Governments in many countries are implementing identity management systems (IdMS) to curtail these incidences and to offer citizens the power to exercise informational self-determination. Using concepts from technology adoption and fit-viability theories as well as the laws of identity, this paper analyzes the criteria for successful implementation and defines a framework for a citizen-centric national IdMS. Results from a survey conducted in Ghana are also included.

  4. A Framework for Enhancing the Value of Research for Dissemination and Implementation.

    Science.gov (United States)

    Neta, Gila; Glasgow, Russell E; Carpenter, Christopher R; Grimshaw, Jeremy M; Rabin, Borsika A; Fernandez, Maria E; Brownson, Ross C

    2015-01-01

    A comprehensive guide that identifies critical evaluation and reporting elements necessary to move research into practice is needed. We propose a framework that highlights the domains required to enhance the value of dissemination and implementation research for end users. We emphasize the importance of transparent reporting on the planning phase of research in addition to delivery, evaluation, and long-term outcomes. We highlight key topics for which well-established reporting and assessment tools are underused (e.g., cost of intervention, implementation strategy, adoption) and where such tools are inadequate or lacking (e.g., context, sustainability, evolution) within the context of existing reporting guidelines. Consistent evaluation of and reporting on these issues with standardized approaches would enhance the value of research for practitioners and decision-makers.

  5. Using a knowledge translation framework to implement asthma clinical practice guidelines in primary care.

    Science.gov (United States)

    Licskai, Christopher; Sands, Todd; Ong, Michael; Paolatto, Lisa; Nicoletti, Ivan

    2012-10-01

    Quality problem International guidelines establish evidence-based standards for asthma care; however, recommendations are often not implemented and many patients do not meet control targets. Initial assessment Regional pilot data demonstrated a knowledge-to-practice gap. Choice of solutions We engineered health system change in a multi-step approach described by the Canadian Institutes of Health Research knowledge translation framework. Implementation Knowledge translation occurred at multiple levels: patient, practice and local health system. A regional administrative infrastructure and inter-disciplinary care teams were developed. The key project deliverable was a guideline-based interdisciplinary asthma management program. Six community organizations, 33 primary care physicians and 519 patients participated. The program operating cost was $290/patient. Evaluation Six guideline-based care elements were implemented, including spirometry measurement, asthma controller therapy, a written self-management action plan and general asthma education, including the inhaler device technique, role of medications and environmental control strategies in 93, 95, 86, 100, 97 and 87% of patients, respectively. Of the total patients 66% were adults, 61% were female, the mean age was 35.7 (SD = ± 24.2) years. At baseline 42% had two or more symptoms beyond acceptable limits vs. 17% (P< 0.001) post-intervention; 71% reported urgent/emergent healthcare visits at baseline (2.94 visits/year) vs. 45% (1.45 visits/year) (P< 0.001); 39% reported absenteeism (5.0 days/year) vs. 19% (3.0 days/year) (P< 0.001). The mean follow-up interval was 22 (SD = ± 7) months. Lessons learned A knowledge-translation framework can guide multi-level organizational change, facilitate asthma guideline implementation, and improve health outcomes in community primary care practices. Program costs are similar to those of diabetes programs. Program savings offset costs in a ratio of 2.1:1.

  6. Climate Services Information System Activities in Support of The Global Framework for Climate Services Implementation

    Science.gov (United States)

    Timofeyeva-Livezey, M. M.; Horsfall, F. M. C.; Pulwarty, R. S.; Klein-Tank, A.; Kolli, R. K.; Hechler, P.; Dilley, M.; Ceron, J. P.; Goodess, C.

    2017-12-01

    The WMO Commission on Climatology (CCl) supports the implementation of the Global Framework for Climate Services (GFCS) with a particular focus on the Climate Services Information System (CSIS), which is the core operational component of GFCS at the global, regional, and national level. CSIS is designed for producing, packaging and operationally delivering authoritative climate information data and products through appropriate operational systems, practices, data exchange, technical standards, authentication, communication, and product delivery. Its functions include climate analysis and monitoring, assessment and attribution, prediction (monthly, seasonal, decadal), and projection (centennial scale) as well as tailoring the associated products to suit user requirements. A central, enabling piece of implementation of CSIS is a Climate Services Toolkit (CST). In its development phase, CST exists as a prototype (www.wmo.int/cst) as a compilation of tools for generating tailored data and products for decision-making, with a special focus on national requirements in developing countries. WMO provides a server to house the CST prototype as well as support operations and maintenance. WMO members provide technical expertise and other in-kind support, including leadership of the CSIS development team. Several recent WMO events have helped with the deployment of CST within the eight countries that have been recognized by GFCS as illustrative for developing their climate services at national levels. Currently these countries are developing climate services projects focusing on service development and delivery for selected economic sectors, such as health, agriculture, energy, water resources, and hydrometeorological disaster risk reduction. These countries are working together with their respective WMO Regional Climate Centers (RCCs), which provide technical assistance with implementation of climate services projects at the country level and facilitate development of

  7. Using a knowledge translation framework to implement asthma clinical practice guidelines in primary care

    Science.gov (United States)

    Licskai, Christopher; Sands, Todd; Ong, Michael; Paolatto, Lisa; Nicoletti, Ivan

    2012-01-01

    Quality problem International guidelines establish evidence-based standards for asthma care; however, recommendations are often not implemented and many patients do not meet control targets. Initial assessment Regional pilot data demonstrated a knowledge-to-practice gap. Choice of solutions We engineered health system change in a multi-step approach described by the Canadian Institutes of Health Research knowledge translation framework. Implementation Knowledge translation occurred at multiple levels: patient, practice and local health system. A regional administrative infrastructure and inter-disciplinary care teams were developed. The key project deliverable was a guideline-based interdisciplinary asthma management program. Six community organizations, 33 primary care physicians and 519 patients participated. The program operating cost was $290/patient. Evaluation Six guideline-based care elements were implemented, including spirometry measurement, asthma controller therapy, a written self-management action plan and general asthma education, including the inhaler device technique, role of medications and environmental control strategies in 93, 95, 86, 100, 97 and 87% of patients, respectively. Of the total patients 66% were adults, 61% were female, the mean age was 35.7 (SD = ±24.2) years. At baseline 42% had two or more symptoms beyond acceptable limits vs. 17% (P< 0.001) post-intervention; 71% reported urgent/emergent healthcare visits at baseline (2.94 visits/year) vs. 45% (1.45 visits/year) (P< 0.001); 39% reported absenteeism (5.0 days/year) vs. 19% (3.0 days/year) (P< 0.001). The mean follow-up interval was 22 (SD = ±7) months. Lessons learned A knowledge-translation framework can guide multi-level organizational change, facilitate asthma guideline implementation, and improve health outcomes in community primary care practices. Program costs are similar to those of diabetes programs. Program savings offset costs in a ratio of 2.1:1 PMID:22893665

  8. Using an ontology as a model for the implementation of the National Cybersecurity Policy Framework for South Africa

    CSIR Research Space (South Africa)

    Jansen van Vuuren, JC

    2014-03-01

    Full Text Available National Cybersecurity Policy Framework that is easy to understand and implement. In this paper, the authors motivate that an ontology can assist in defining a model that describes the relationships between different stakeholders and cybersecurity...

  9. Hybrid Multi-Agent Control in Microgrids: Framework, Models and Implementations Based on IEC 61850

    Directory of Open Access Journals (Sweden)

    Xiaobo Dou

    2014-12-01

    Full Text Available Operation control is a vital and complex issue for microgrids. The objective of this paper is to explore the practical means of applying decentralized control by using a multi-agent system in actual microgrids and devices. This paper presents a hierarchical control framework (HCF) consisting of local reaction control (LRC) level, local decision control (LDC) level, horizontal cooperation control (HCC) level and vertical cooperation control (VCC) level to meet different control requirements of a microgrid. Then, a hybrid multi-agent control model (HAM) is proposed to implement HCF, and the properties, functionalities and operating rules of HAM are described. Furthermore, the paper elaborates on the implementation of HAM based on the IEC 61850 Standard, and proposes some new implementation methods, such as extended information models of IEC 61850 with agent communication language and a bidirectional interaction mechanism of generic object oriented substation event (GOOSE) communication. A hardware design and software system are proposed, and the results of simulation and laboratory tests verify the effectiveness of the proposed strategies, models and implementations.
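    The four-level escalation described in record 9 can be illustrated with a chain-of-responsibility sketch. This is not the paper's HAM implementation or its IEC 61850 mappings: the numeric "severity" thresholds and the handler logic below are hypothetical stand-ins for whatever control criteria a real microgrid would apply at each level.

```python
class ControlLevel:
    """One level in a hierarchical control chain: a disturbance is handled
    at the lowest level able to resolve it, otherwise escalated upward
    (LRC -> LDC -> HCC -> VCC), mirroring the paper's four-level HCF."""

    def __init__(self, name, can_handle, parent=None):
        self.name, self.can_handle, self.parent = name, can_handle, parent

    def handle(self, severity):
        if self.can_handle(severity):      # resolve locally if possible
            return self.name
        if self.parent:                    # otherwise escalate one level up
            return self.parent.handle(severity)
        return "unresolved"

# Hypothetical severity thresholds per level (illustrative only).
vcc = ControlLevel("VCC", lambda s: True)
hcc = ControlLevel("HCC", lambda s: s < 3, parent=vcc)
ldc = ControlLevel("LDC", lambda s: s < 2, parent=hcc)
lrc = ControlLevel("LRC", lambda s: s < 1, parent=ldc)

print(lrc.handle(0.5), lrc.handle(2.5))   # LRC HCC
```

A chain of responsibility keeps each level ignorant of the levels above it, which matches the decentralized intent of the framework: a local controller only needs to know its own capability and its immediate superior.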

  10. The Midwifery Services Framework: Lessons learned from the initial stages of implementation in six countries.

    Science.gov (United States)

    Garg, Shantanu; Moyo, Nester T; Nove, Andrea; Bokosi, Martha

    2018-07-01

    In 2015, the International Confederation of Midwives (ICM) launched the Midwifery Services Framework (MSF): an evidence-based tool to guide countries through the process of improving their sexual, reproductive, maternal and newborn health services through strengthening and developing the midwifery workforce. The MSF is aligned with key global architecture for sexual, reproductive, maternal and newborn health and human resources for health. This third in a series of three papers describes the experience of starting to implement the MSF in the first six countries that requested ICM support to adopt the tool, and the lessons learned during these early stages of implementation. The early adopting countries selected a variety of priority work areas, but nearly all highlighted the importance of improving the attractiveness of midwifery as a career so as to improve attraction and retention, and several saw the need for improvements to midwifery regulation, pre-service education, availability and/or accessibility of midwives. Key lessons from the early stages of implementation include the need to ensure a broad range of stakeholder involvement from the outset and the need for an in-country lead organisation to maintain the momentum of implementation even when there are changes in political leadership, security concerns or other barriers to progress. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  11. Implementing forward recovery using checkpointing in distributed systems

    Science.gov (United States)

    Long, Junsheng; Fuchs, W. K.; Abraham, Jacob A.

    1991-01-01

    The paper describes the implementation of a forward recovery scheme using checkpoints and replicated tasks. The implementation is based on the concept of lookahead execution and rollback validation. In the experiment, two tasks are selected for the normal execution and one for rollback validation. It is shown that the recovery strategy has nearly error-free execution time and an average redundancy lower than TMR.
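    The lookahead-and-rollback idea in record 11 reduces to a small control loop. The sketch below is a single-process illustration only, not the authors' replicated-task scheme: `run_with_checkpoints`, `flaky_step`, and the validation predicate are hypothetical names, and real forward recovery would run the lookahead and validation tasks concurrently on replicas.

```python
import copy

def run_with_checkpoints(steps, validate):
    """Run step functions over a shared state dict, checkpointing before
    each step; if rollback validation fails, restore the checkpoint and
    re-execute the step once."""
    state = {}
    for step in steps:
        checkpoint = copy.deepcopy(state)   # save state before the step
        step(state)
        if not validate(state):             # validation failed ...
            state = checkpoint              # ... roll back to checkpoint ...
            step(state)                     # ... and retry the step
    return state

# A transient fault: the step produces a bad value on its first attempt.
attempts = {"n": 0}
def flaky_step(state):
    attempts["n"] += 1
    state["x"] = 0 if attempts["n"] == 1 else 42

result = run_with_checkpoints([flaky_step], validate=lambda s: s.get("x") != 0)
print(result)   # {'x': 42} after one rollback and retry
```

The point of the forward-recovery variant is that, unlike plain rollback, the retry proceeds from validated state rather than re-running the whole computation, which is why the paper reports near error-free execution time.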

  12. Implementation of Enterprise Risk Management (ERM) Framework in Enhancing Business Performances in Oil and Gas Sector

    Directory of Open Access Journals (Sweden)

    Sanmugam Annamalah

    2018-01-01

    Full Text Available This study empirically investigated the ERM implementation model and proposed a framework to identify and manage risks in the oil and gas sector in Malaysia. The study examined the role of ERM framework implementation in improving business performance by utilizing Economic Value Added as a measurement tool. The study also provides insights to the oil and gas sector on gaining higher profit returns, reducing the cost of capital, and improving shareholder value. Moreover, it contributes significantly to the field of enterprise risk management in Malaysia. The identification and management of risk is significant to organizations in managing risks efficiently. Stakeholders place high expectations on executives and boards of directors to manage risk effectively. Linear regression analysis was utilized to analyze the data collected for this paper. Purposive sampling was employed to select firms operating in the Malaysian oil and gas sector. Primary data were collected using structured questions and interview techniques involving semi-structured questions. The results of the regression analysis suggest a significant and positive relationship between enterprise risk management and operational risk; market risk; political risk; health, safety and environmental risk; and business performance.
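    The core analysis in record 12 is ordinary least squares. As a hedged illustration of the method only (the study's actual variables and data are not reproduced here), the snippet below fits an intercept-plus-three-factor linear model to synthetic risk-factor scores and recovers the assumed coefficients:

```python
import numpy as np

# Hypothetical data: regress a business-performance score on three
# risk-management factor scores. All numbers here are invented.
rng = np.random.default_rng(0)
X = rng.uniform(1, 5, size=(50, 3))               # factor scores (e.g. Likert-style)
beta = np.array([0.8, 0.5, 0.3])                  # assumed true effects
y = 2.0 + X @ beta + rng.normal(0, 0.1, size=50)  # performance = intercept + effects + noise

A = np.column_stack([np.ones(len(X)), X])         # prepend intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)      # ordinary least squares
print(coef.round(2))   # close to [2.0, 0.8, 0.5, 0.3]
```

With real survey data one would also report standard errors and p-values (e.g. via a statistics package) rather than just point estimates, since the study's claims rest on significance tests.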

  13. Implementation and Evaluation of Technology Mentoring Program Developed for Teacher Educators: A 6M-Framework

    Directory of Open Access Journals (Sweden)

    Selim Gunuc

    2015-06-01

    Full Text Available The purpose of this basic research is to determine the problems experienced in the Technology Mentoring Program (TMP), and the study discusses how these problems affect the process in general. The implementation was carried out with teacher educators in the education faculty. 8 doctorate students (mentors) provided technology mentoring implementation for one academic term to 9 teacher educators (mentees) employed in the Education Faculty. The data were collected via the mentee and the mentor interview form, mentor reflections and organization meeting reflections. As a result, the problems based on the mentor, on the mentee and on the organization/institution were determined. In order to carry out TMP more effectively and successfully, a 6M-framework (Modifying, Meeting, Matching, Managing, Mentoring, Monitoring) was suggested within the scope of this study. It could be stated that fewer problems will be encountered and that the process will be carried out more effectively and successfully when the structure in this framework is taken into consideration.

  14. A Transparent Framework for Evaluating the Effects of DGPV on Distribution System Costs

    Energy Technology Data Exchange (ETDEWEB)

    Horowitz, Kelsey A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Mather, Barry A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ding, Fei [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Denholm, Paul L [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Palmintier, Bryan S [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-04-02

    Assessing the costs and benefits of distributed photovoltaic generators (DGPV) to the power system and electricity consumers is key to determining appropriate policies, tariff designs, and power system upgrades for the modern grid. We advance understanding of this topic by providing a transparent framework, terminology, and data set for evaluating distribution system upgrade costs, line losses, and interconnection costs as a function of DGPV penetration level.

  15. Transactive control: a framework for operating power systems characterized by high penetration of distributed energy resources

    DEFF Research Database (Denmark)

    Hu, Junjie; Yang, Guangya; Kok, Koen

    2016-01-01

    The increasing number of distributed energy resources connected to power systems raises operational challenges for the network operator, such as introducing grid congestion and voltage deviations in the distribution network level, as well as increasing balancing needs at the whole system level......, followed by a literature review and demonstration projects that apply to transactive control. Cases are then presented to illustrate the transactive control framework. At the end, discussions and research directions are presented, for applying transactive control to operating power systems, characterized...

  16. DeepSpark: A Spark-Based Distributed Deep Learning Framework for Commodity Clusters

    OpenAIRE

    Kim, Hanjoo; Park, Jaehong; Jang, Jaehee; Yoon, Sungroh

    2016-01-01

    The increasing complexity of deep neural networks (DNNs) has made it challenging to exploit existing large-scale data processing pipelines for handling massive data and parameters involved in DNN training. Distributed computing platforms and GPGPU-based acceleration provide a mainstream solution to this computational challenge. In this paper, we propose DeepSpark, a distributed and parallel deep learning framework that exploits Apache Spark on commodity clusters. To support parallel operation...
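    The map-then-average pattern behind Spark-based training frameworks like the one record 16 describes can be sketched without Spark itself. The toy below performs one synchronous data-parallel update: each simulated "worker" computes a gradient on its shard and the driver averages them. DeepSpark's actual design (asynchronous updates, GPU backends, Spark RDDs) is considerably more elaborate; all names here are illustrative.

```python
import numpy as np

def worker_gradient(w, shard):
    """Least-squares gradient computed on one worker's data shard."""
    X, y = shard
    return X.T @ (X @ w - y) / len(y)

def parallel_step(w, shards, lr=0.1):
    """One synchronous data-parallel update: map a gradient computation
    over the shards, reduce by averaging, then update shared parameters."""
    grads = [worker_gradient(w, s) for s in shards]   # map (on workers)
    return w - lr * sum(grads) / len(grads)           # reduce + update (driver)

# Toy run: 4 "workers", noiseless linear data, 200 synchronous steps.
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(200, 2))
w_true = np.array([1.0, -2.0])
y = X @ w_true
shards = [(X[i::4], y[i::4]) for i in range(4)]       # round-robin sharding
w = np.zeros(2)
for _ in range(200):
    w = parallel_step(w, shards)
print(w.round(2))   # approaches [1., -2.]
```

The synchronous average is the simplest scheme; frameworks such as DeepSpark relax it (stale or asynchronous updates) precisely because waiting for the slowest worker dominates wall-clock time on commodity clusters.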

  17. Mobile agent-enabled framework for structuring and building distributed systems on the internet

    Institute of Scientific and Technical Information of China (English)

    CAO Jiannong; ZHOU Jingyang; ZHU Weiwei; LI Xuhui

    2006-01-01

    Mobile agent has shown its promise as a powerful means to complement and enhance existing technology in various application areas. In particular, existing work has demonstrated that MA can simplify the development and improve the performance of certain classes of distributed applications, especially for those running on a wide-area, heterogeneous, and dynamic networking environment like the Internet. In our previous work, we extended the application of MA to the design of distributed control functions, which require the maintenance of logical relationships among and/or coordination of processing entities in a distributed system. A novel framework is presented for structuring and building distributed systems, which uses cooperating mobile agents as an aid to carry out coordination and cooperation tasks in distributed systems. The framework has been used for designing various distributed control functions such as load balancing and mutual exclusion in our previous work. In this paper, we use the framework to propose a novel approach to detecting deadlocks in distributed systems by using mobile agents, which demonstrates the adaptability and flexibility of mobile agents. We first describe the MAEDD (Mobile Agent Enabled Deadlock Detection) scheme, in which mobile agents are dispatched to collect and analyze deadlock information distributed across the network sites and, based on the analysis, to detect and resolve deadlocks. Then the design of an adaptive hybrid algorithm derived from the framework is presented. The algorithm can dynamically adapt itself to changes in system state by using different deadlock detection strategies. The performance of the proposed algorithm has been evaluated using simulations. The results show that the algorithm can outperform existing algorithms that use a fixed deadlock detection strategy.
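    At the heart of any deadlock-detection scheme such as MAEDD (record 17) is a cycle search over a wait-for graph; the agent dispatch and distributed information collection are the paper's contribution and are omitted here. A minimal centralized sketch, with hypothetical process names:

```python
def find_deadlock(wait_for):
    """Detect a cycle in a wait-for graph {process: [processes it waits on]}
    via depth-first search. Returns one deadlocked cycle as a list of
    process names, or None if the graph is acyclic."""
    WHITE, GREY, BLACK = 0, 1, 2          # unvisited / on current path / done
    color = {p: WHITE for p in wait_for}

    def dfs(p, path):
        color[p] = GREY
        path.append(p)
        for q in wait_for.get(p, []):
            if color.get(q, WHITE) == GREY:        # back edge: cycle found
                return path[path.index(q):]
            if color.get(q, WHITE) == WHITE:
                cycle = dfs(q, path)
                if cycle:
                    return cycle
        path.pop()
        color[p] = BLACK
        return None

    for p in list(wait_for):
        if color[p] == WHITE:
            cycle = dfs(p, [])
            if cycle:
                return cycle
    return None

print(find_deadlock({"P1": ["P2"], "P2": ["P3"], "P3": ["P1"]}))  # ['P1', 'P2', 'P3']
```

In the mobile-agent setting, the interesting part is assembling `wait_for` at all: agents migrate between sites to gather local wait-for edges, which is what lets the scheme adapt its detection strategy to the system state.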

  18. A Strategic Approach to Curriculum Design for Information Literacy in Teacher Education--Implementing an Information Literacy Conceptual Framework

    Science.gov (United States)

    Klebansky, Anna; Fraser, Sharon P.

    2013-01-01

    This paper details a conceptual framework that situates curriculum design for information literacy and lifelong learning, through a cohesive developmental information literacy based model for learning, at the core of teacher education courses at UTAS. The implementation of the framework facilitates curriculum design that systematically,…

  19. Beyond Ambiguity: A Practical Framework for Developing and Implementing Open Government Reforms

    Directory of Open Access Journals (Sweden)

    Merlin Chatwin

    2017-12-01

    Full Text Available The broad idea of ‘Open Government’ is widely accepted as a facilitator for rebuilding trust and validation in governments around the world. The Open Government Partnership is a significant driver of this movement, with over 75 member nations, 15 subnational government participants and many other local governments implementing reforms within their national frameworks. The central tenets of transparency, accountability, participation, and collaboration are well understood within scholarly works and practitioner publications. However, open government has yet to be given a universally acknowledged definition. This leads to questions about the adaptability and salience of the concept of open government across diverse contexts. This paper addresses these questions by utilizing a human systems framework called the Dialogue Boxes. To develop an understanding of how open government is currently positioned within scholarly works and practitioner publications, an extensive literature search was conducted. The search utilized major search engines, often-cited references, direct journal searches and colleague-provided references. Using existing definitions and descriptions, this paper populates the framework with available information and allows context-specific content to be populated by future users. Ultimately, the aim of the paper is to support the development of open government action plans that maximize the direct positive impact on people’s lives.

  20. A Framework for Implementing and Valuing Biodiversity Offsets in Colombia: A Landscape Scale Perspective

    Directory of Open Access Journals (Sweden)

    Shirley Saenz

    2013-11-01

    Full Text Available Biodiversity offsets provide a mechanism for maintaining or enhancing environmental values in situations where development is sought, despite negative environmental impacts. They seek to ensure that unavoidable deleterious environmental impacts of development are balanced by environmental gains. When onsite impacts warrant the use of offsets, little attention is often paid to ensuring that the location of offset sites provides the greatest conservation benefit and is consistent with landscape-level conservation goals. In most offset frameworks it is difficult for developers to know proactively the offset requirements they will need to implement. Here we propose a framework to address these needs. We propose a series of rules for selecting offset sites that meet the conservation needs of potentially impacted biological targets. We then discuss an accounting approach that seeks to support offset ratio determinations based on a structured and transparent approach. To demonstrate the approach, we present a framework developed in partnership with the Colombian Ministry of Environment and Sustainable Development to reform existing mitigation regulatory processes.

  1. A Web GIS Framework for Participatory Sensing Service: An Open Source-Based Implementation

    Directory of Open Access Journals (Sweden)

    Yu Nakayama

    2017-04-01

    Full Text Available Participatory sensing is the process in which individuals or communities collect and analyze systematic data using mobile phones and cloud services. To efficiently develop participatory sensing services, some server-side technologies have been proposed. Although they provide a good platform for participatory sensing, they are not optimized for spatial data management and processing. For the purpose of spatial data collection and management, many web GIS approaches have been studied. However, they still have not focused on the optimal framework for participatory sensing services. This paper presents a web GIS framework for participatory sensing service (FPSS). The proposed FPSS enables an integrated deployment of spatial data capture, storage, and data management functions. In various types of participatory sensing experiments, users can collect and manage spatial data in a unified manner. This feature is realized by the optimized system architecture and use case based on the general requirements for participatory sensing. We developed an open source GIS-based implementation of the proposed framework, which can overcome the financial difficulties that are one of the major problems of deploying sensing experiments. We confirmed with the prototype that participatory sensing experiments can be performed efficiently with the proposed FPSS.
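    The capture-store-query loop that record 1's FPSS integrates can be mocked in a few lines. This in-memory sketch is not the FPSS API: the class and method names are invented, and a real deployment would back the store with a spatial database (e.g. PostGIS) rather than a Python list.

```python
from dataclasses import dataclass, field

@dataclass
class Observation:
    """One geotagged reading submitted by a participant."""
    lat: float
    lon: float
    value: dict

@dataclass
class SensingStore:
    """Minimal in-memory stand-in for a participatory-sensing data store:
    contributors submit geotagged observations; clients query by
    bounding box. Names and structure are hypothetical."""
    records: list = field(default_factory=list)

    def submit(self, lat, lon, **value):
        self.records.append(Observation(lat, lon, value))

    def query_bbox(self, south, west, north, east):
        return [r for r in self.records
                if south <= r.lat <= north and west <= r.lon <= east]

store = SensingStore()
store.submit(35.68, 139.69, noise_db=55)           # a geotagged reading
print(len(store.query_bbox(30, 130, 40, 145)))     # 1
```

The linear scan in `query_bbox` is the part a GIS backend replaces with a spatial index, which is exactly the optimization gap the paper says generic server-side platforms leave open.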

  2. Understanding effects in reviews of implementation interventions using the Theoretical Domains Framework.

    Science.gov (United States)

    Little, Elizabeth A; Presseau, Justin; Eccles, Martin P

    2015-06-17

    Behavioural theory can be used to better understand the effects of behaviour change interventions targeting healthcare professional behaviour to improve quality of care. However, the explicit use of theory is rarely reported despite interventions inevitably involving at least an implicit idea of what factors to target to implement change. There is a quality of care gap in the post-fracture investigation (bone mineral density (BMD) scanning) and management (bisphosphonate prescription) of patients at risk of osteoporosis. We aimed to use the Theoretical Domains Framework (TDF) within a systematic review of interventions to improve quality of care in post-fracture investigation. Our objectives were to explore which theoretical factors the interventions in the review may have been targeting and how this might be related to the size of the effect on rates of BMD scanning and osteoporosis treatment with bisphosphonate medication. A behavioural scientist and a clinician independently coded TDF domains in intervention and control groups. Quantitative analyses explored the relationship between intervention effect size and both the total number of times domains were targeted and the number of different domains targeted. Nine randomised controlled trials (RCTs) (10 interventions) were analysed. The five theoretical domains most frequently coded as being targeted by the interventions in the review included "memory, attention and decision processes", "knowledge", "environmental context and resources", "social influences" and "beliefs about consequences". Each intervention targeted a combination of at least four of these five domains. Analyses identified an inverse relationship between both the number of times domains were coded and the number of different domains coded and the effect size for BMD scanning, but not for bisphosphonate prescription, suggesting that the more domains the intervention targeted, the lower the observed effect size. When explicit use of theory to inform interventions is absent, it is possible to

  3. A Market Framework for Enabling Electric Vehicles Flexibility Procurement at the Distribution Level Considering Grid Constraints

    DEFF Research Database (Denmark)

    Gadea, Ana; Marinelli, Mattia; Zecchino, Antonio

    2018-01-01

    In a context of extensive electrification of the transport sector, the use of flexibility services from electric vehicles (EVs) is becoming of paramount importance. This paper defines a market framework for enabling EV flexibility at the distribution level, considering grid constraints. The main...... the benefit for DSOs and society, proving a technically and economically feasible solution....

  4. Development, implementation and critique of a bioethics framework for pharmaceutical sponsors of human biomedical research.

    Science.gov (United States)

    Van Campen, Luann E; Therasse, Donald G; Klopfenstein, Mitchell; Levine, Robert J

    2015-11-01

    Pharmaceutical human biomedical research is a multi-dimensional endeavor that requires collaboration among many parties, including those who sponsor, conduct, participate in, or stand to benefit from the research. Human subjects' protections have been promulgated to ensure that the benefits of such research are accomplished with respect for and minimal risk to individual research participants, and with an overall sense of fairness. Although these protections are foundational to clinical research, most ethics guidance primarily highlights the responsibilities of investigators and ethics review boards. Currently, there is no published resource that comprehensively addresses bioethical responsibilities of industry sponsors; including their responsibilities to parties who are not research participants, but are, nevertheless key stakeholders in the endeavor. To fill this void, in 2010 Eli Lilly and Company instituted a Bioethics Framework for Human Biomedical Research. This paper describes how the framework was developed and implemented and provides a critique based on four years of experience. A companion article provides the actual document used by Eli Lilly and Company to guide ethical decisions regarding all phases of human clinical trials. While many of the concepts presented in this framework are not novel, compiling them in a manner that articulates the ethical responsibilities of a sponsor is novel. By utilizing this type of bioethics framework, we have been able to develop bioethics positions on various topics, provide research ethics consultations, and integrate bioethics into the daily operations of our human biomedical research. We hope that by sharing these companion papers we will stimulate discussion within and outside the biopharmaceutical industry for the benefit of the multiple parties involved in pharmaceutical human biomedical research.

  5. Distributed Prognostics System Implementation on Wireless Embedded Devices

    Data.gov (United States)

    National Aeronautics and Space Administration — Distributed prognostics is the next step in the evolution of prognostic methodologies. It is an important enabling technology for the emerging Condition Based...

  6. Foundational Report Series: Advanced Distribution Management Systems for Grid Modernization, Implementation Strategy for a Distribution Management System

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Ravindra [Argonne National Lab. (ANL), Argonne, IL (United States); Reilly, James T. [Reilly Associates, Pittston, PA (United States); Wang, Jianhui [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-03-01

    Electric distribution utilities encounter many challenges to successful deployment of Distribution Management Systems (DMSs). The key challenges are documented in this report, along with suggestions for overcoming them. This report offers a recommended list of activities for implementing a DMS. It takes a strategic approach to implementing DMS from a project management perspective. The project management strategy covers DMS planning, procurement, design, building, testing, installation, commissioning, and system integration issues and solutions. It identifies the risks that are associated with implementation and suggests strategies for utilities to use to mitigate them or avoid them altogether. Attention is given to common barriers to successful DMS implementation. This report begins with an overview of the implementation strategy for a DMS and proceeds to put forward a basic approach for procuring hardware and software for a DMS; designing the interfaces with external corporate computing systems such as EMS, GIS, OMS, and AMI; and implementing a complete solution.

  7. Supporting the Evaluation and Implementation of Musculoskeletal Models of Care: A Globally Informed Framework for Judging Readiness and Success.

    Science.gov (United States)

    Briggs, Andrew M; Jordan, Joanne E; Jennings, Matthew; Speerin, Robyn; Bragge, Peter; Chua, Jason; Woolf, Anthony D; Slater, Helen

    2017-04-01

    To develop a globally informed framework to evaluate readiness for implementation and success after implementation of musculoskeletal models of care (MOCs). Three phases were undertaken: 1) a qualitative study with 27 Australian subject matter experts (SMEs) to develop a draft framework; 2) an eDelphi study with an international panel of 93 SMEs across 30 nations to evaluate face validity, and refine and establish consensus on the framework components; and 3) translation of the framework into a user-focused resource and evaluation of its acceptability with the eDelphi panel. A comprehensive evaluation framework was developed for judging the readiness and success of musculoskeletal MOCs. The framework consists of 9 domains, with each domain containing a number of themes underpinned by detailed elements. In the first Delphi round, scores of "partly agree" or "completely agree" with the draft framework ranged from 96.7% to 100%. In the second round, "essential" scores ranged from 58.6% to 98.9%, resulting in 14 of 34 themes being classified as essential. SMEs strongly agreed or agreed that the final framework was useful (98.8%), usable (95.1%), credible (100%), and appealing (93.9%). Overall, 96.3% strongly supported or supported the final structure of the framework as it was presented, while 100%, 96.3%, and 100% strongly supported or supported the content within the readiness, initiating implementation, and success streams, respectively. An empirically derived framework to evaluate the readiness and success of musculoskeletal MOCs was strongly supported by an international panel of SMEs. The framework provides an important internationally applicable benchmark for the development, implementation, and evaluation of musculoskeletal MOCs. © 2016, American College of Rheumatology.

  8. Nash Bargaining Game-Theoretic Framework for Power Control in Distributed Multiple-Radar Architecture Underlying Wireless Communication System

    Directory of Open Access Journals (Sweden)

    Chenguang Shi

    2018-04-01

    This paper presents a novel Nash bargaining solution (NBS)-based cooperative game-theoretic framework for power control in a distributed multiple-radar architecture underlying a wireless communication system. Our primary objective is to minimize the total power consumption of the distributed multiple-radar system (DMRS) while protecting the wireless communication user's transmission and guaranteeing each radar's target detection requirement. A unified cooperative game-theoretic framework is proposed for the optimization problem, where interference power constraints (IPCs) are imposed to protect the communication user's transmission, and a minimum signal-to-interference-plus-noise ratio (SINR) requirement is employed to provide reliable target detection for each radar. The existence, uniqueness, and fairness of the NBS to this cooperative game are proven. An iterative Nash bargaining power control algorithm with low computational complexity and fast convergence is developed and is shown to converge to a Pareto-optimal equilibrium for the cooperative game model. Numerical simulations and analyses are further presented to highlight the advantages and verify the efficiency of the proposed cooperative game algorithm. It is demonstrated that the distributed algorithm is effective for power control and can protect the communication system with limited implementation overhead.
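    The flavor of such a distributed, SINR-constrained power-control iteration can be illustrated with the classical Foschini-Miljanic fixed-point scheme, which is much simpler than the paper's Nash bargaining algorithm: each radar repeatedly rescales its transmit power so that its locally measured SINR meets the target. All channel gains, SINR targets, and the power cap below are invented for illustration.

    ```python
    import numpy as np

    def iterative_power_control(G, gamma, noise, p_max, tol=1e-9, max_iter=1000):
        """Distributed fixed-point power control (Foschini-Miljanic style).

        G[i][j]  : channel gain from transmitter j to receiver i
        gamma[i] : minimum required SINR at receiver i
        noise[i] : noise power at receiver i
        Each node updates p_i := gamma_i * (interference + noise) / G_ii,
        a quantity it can compute from local measurements alone.
        """
        n = len(gamma)
        p = np.zeros(n)
        for _ in range(max_iter):
            # interference-plus-noise seen at each receiver (own signal excluded)
            interf = G @ p - np.diag(G) * p + noise
            p_new = np.minimum(gamma * interf / np.diag(G), p_max)
            if np.max(np.abs(p_new - p)) < tol:
                return p_new
            p = p_new
        return p
    ```

    For a feasible system the iteration converges to the minimal power vector meeting every SINR target, which is the sense in which such schemes minimize total power consumption.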

  9. Implementing Run-Time Evaluation of Distributed Timing Constraints in a Real-Time Environment

    DEFF Research Database (Denmark)

    Kristensen, C. H.; Drejer, N.

    1994-01-01

    In this paper we describe a solution to the problem of implementing run-time evaluation of timing constraints in distributed real-time environments...

  10. Implementing Run-time Evaluation of Distributed Timing Constraints in a Micro Kernel

    DEFF Research Database (Denmark)

    Kristensen, C.H.; Drejer, N.; Nielsen, Jens Frederik Dalsgaard

    In the present paper we describe a solution to the problem of implementing time-optimal evaluation of timing constraints in distributed real-time systems...

  11. The SOPHY Framework

    DEFF Research Database (Denmark)

    Laursen, Karl Kaas; Pedersen, Martin Fejrskov; Bendtsen, Jan Dimon

    The goal of the Sophy framework (Simulation, Observation and Planning in Hybrid Systems) is to implement a multi-level framework for description, simulation, observation, fault detection and recovery, diagnosis and autonomous planning in distributed embedded hybrid systems. A Java-based distributed...

  12. The SOPHY framework

    DEFF Research Database (Denmark)

    Laursen, Karl Kaas; Pedersen, M. F.; Bendtsen, Jan Dimon

    2005-01-01

    The goal of the Sophy framework (Simulation, Observation and Planning in Hybrid Systems) is to implement a multi-level framework for description, simulation, observation, fault detection and recovery, diagnosis and autonomous planning in distributed embedded hybrid systems. A Java-based distributed...

  13. Identifying a practice-based implementation framework for sustainable interventions for improving the evolving working environment: Hitting the Moving Target Framework.

    Science.gov (United States)

    Højberg, Helene; Rasmussen, Charlotte Diana Nørregaard; Osborne, Richard H; Jørgensen, Marie Birk

    2018-02-01

    Our aim was to identify implementation components for sustainable working environment interventions in the nursing assistant sector in order to generate a framework for optimizing the implementation of workplace improvement initiatives. The implementation framework was informed by: 1) an industry advisory group, 2) interviews with key stakeholders, 3) concept mapping workshops, and 4) an e-mail survey. Thirty-five stakeholders were interviewed and contributed to the concept mapping workshops. Eleven implementation components were derived across four domains: 1) A supportive organizational platform, 2) An engaged workplace with mutual goals, 3) The intervention is sustainably fitted to the workplace, and 4) The intervention is an attractive choice. The highest rated component was "Engaged and Active Management" (mean 4.1) and the lowest rated was "Delivered in an Attractive Form" (mean 2.8). The framework provides new insights into implementation in an evolving working environment and aims to help address gaps in the effectiveness of workplace interventions and in implementation success. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  14. Biomass energy projects for joint implementation of the UN FCCC [Framework Convention on Climate Change

    International Nuclear Information System (INIS)

    Swisher, Joel N.; Renner, Frederick P.

    1998-01-01

    The UN Framework Convention on Climate Change (FCCC) allows for the joint implementation (JI) of measures to mitigate the emissions of greenhouse gases. The concept of JI refers to the implementation of such measures in one country with partial or full financial and/or technical support from another country, potentially fulfilling some of the supporting country's emission-reduction commitment under the FCCC. This paper addresses some key issues related to JI under the FCCC as they relate to the development of biomass energy projects for carbon offsets in developing countries. Issues include the reference case or baseline, carbon accounting and net carbon storage, potential project implementation barriers and risks, monitoring and verification, local agreements and host-country approval. All of these issues are important in project design and evaluation. We briefly discuss several case studies of biomass-fueled co-generation projects under development at large sugar mills in the Philippines, India and Brazil as potential JI projects. The case studies illustrate the benefits of bioenergy for reducing carbon emissions and some of the important barriers and difficulties in developing and crediting such projects. Results to date illustrate both the achievements and the difficulties of this type of project. (author)

  15. Implementation of density-based solver for all speeds in the framework of OpenFOAM

    Science.gov (United States)

    Shen, Chun; Sun, Fengxian; Xia, Xinlin

    2014-10-01

    In the framework of the open-source CFD code OpenFOAM, a density-based solver for all-speed flow fields is developed. In this solver the preconditioned all-speeds AUSM+(P) scheme is adopted, and the dual time scheme is implemented to handle the unsteady process. Parallel computation can be employed to accelerate the solving process. Different interface reconstruction algorithms are implemented, and their accuracy with respect to convection is compared. Three benchmark tests of lid-driven cavity flow, flow crossing over a bump, and flow over a forward-facing step are presented to show the accuracy of the AUSM+(P) solver for low-speed incompressible flow, transonic flow, and supersonic/hypersonic flow. First, for the lid-driven cavity flow, the computational results obtained by different interface reconstruction algorithms are compared. It is shown that the one-dimensional reconstruction scheme adopted in this solver possesses high accuracy and that the solver developed in this paper can effectively capture the features of low-speed incompressible flow. Then, via the test cases of the flow crossing over the bump and over the forward-facing step, the ability to capture the characteristics of transonic and supersonic/hypersonic flows is confirmed. The forward-facing step proves to be the most challenging for the preconditioned solvers with and without the dual time scheme. Nonetheless, the solvers described in this paper reproduce the main features of this flow, including the evolution of the initial transient.
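    For readers unfamiliar with the flux family named above, the sketch below implements the basic (non-preconditioned) AUSM+ interface flux of Liou for the 1D perfect-gas Euler equations, with the standard split-Mach and split-pressure polynomials (beta = 1/8, alpha = 3/16). It illustrates the scheme's structure only; it is not the AUSM+(P) variant or the OpenFOAM implementation described in the paper, and the value of GAMMA is an assumption.

    ```python
    import numpy as np

    GAMMA = 1.4  # perfect-gas ratio of specific heats (assumed)

    def ausm_plus_flux(rhoL, uL, pL, rhoR, uR, pR):
        """1D AUSM+ interface flux for the Euler equations (after Liou 1996).

        Returns the flux of [mass, momentum, total energy].
        """
        aL = np.sqrt(GAMMA * pL / rhoL)
        aR = np.sqrt(GAMMA * pR / rhoR)
        a = 0.5 * (aL + aR)                  # simple common interface sound speed
        ML, MR = uL / a, uR / a

        def M4(M, sgn):  # split Mach polynomial, beta = 1/8
            if abs(M) >= 1.0:
                return 0.5 * (M + sgn * abs(M))
            return sgn * 0.25 * (M + sgn) ** 2 + sgn * 0.125 * (M * M - 1.0) ** 2

        def P5(M, sgn):  # split pressure polynomial, alpha = 3/16
            if abs(M) >= 1.0:
                return 0.5 * (1.0 + sgn * np.sign(M))
            return 0.25 * (M + sgn) ** 2 * (2.0 - sgn * M) + sgn * 0.1875 * M * (M * M - 1.0) ** 2

        m = M4(ML, +1) + M4(MR, -1)              # interface Mach number
        p = P5(ML, +1) * pL + P5(MR, -1) * pR    # interface pressure
        mdot = a * m * (rhoL if m > 0.0 else rhoR)   # upwinded mass flux

        HL = aL * aL / (GAMMA - 1.0) + 0.5 * uL * uL  # total enthalpy
        HR = aR * aR / (GAMMA - 1.0) + 0.5 * uR * uR
        psi = np.array([1.0, uL, HL]) if m > 0.0 else np.array([1.0, uR, HR])
        return mdot * psi + p * np.array([0.0, 1.0, 0.0])
    ```

    A quick sanity check of the splittings: for identical left and right states the split polynomials sum to M and 1 respectively, so the interface flux reduces to the exact Euler flux of that state.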

  16. Joint Implementation under the UN Framework Convention on Climate Change. Technical and institutional challenges

    International Nuclear Information System (INIS)

    Swisher, J.N.

    1997-01-01

    The UN Framework Convention on Climate Change (FCCC) allows for the Joint Implementation (JI) of measures to mitigate the emissions of greenhouse gases. The concept of JI refers to the implementation of such measures in one country with partial or full financial and/or technical support from another country, potentially fulfilling some of the supporting country's emission-reduction commitment under the FCCC. At present, all JI transactions are voluntary, and no country has claimed JI credit against existing FCCC commitments. Nevertheless, JI could have important implications for both the economic efficiency and the international equity of the implementation of the FCCC. The paper discusses some of the information needs of JI projects and seeks to clarify some of the common assumptions and arguments about JI. Issues are distinguished according to whether they are specific to JI or common to other types of regimes and transactions. The focus is on the position of developing countries and their potential risks and benefits regarding JI. 2 figs., 3 tabs., 35 refs

  17. A New Mode of European Regulation? The Implementation of the Autonomous Framework Agreement on Telework in Five Countries

    OpenAIRE

    Larsen , Trine P.; Andersen , Søren Kaj

    2007-01-01

    This article examines the implementation of the first autonomous framework agreement signed by European social partners in a number of member states. Although the telework agreement states that it is to be implemented in accordance with national procedures and practices specific to management and labour, practice is often different. The approach adopted reflects the specific ...

  18. A Linear Algebra Framework for Static High Performance Fortran Code Distribution

    Directory of Open Access Journals (Sweden)

    Corinne Ancourt

    1997-01-01

    High Performance Fortran (HPF) was developed to support data parallel programming for single-instruction multiple-data (SIMD) and multiple-instruction multiple-data (MIMD) machines with distributed memory. The programmer is provided a familiar uniform logical address space and specifies the data distribution by directives. The compiler then exploits these directives to allocate arrays in the local memories, to assign computations to elementary processors, and to migrate data between processors when required. We show here that linear algebra is a powerful framework to encode HPF directives and to synthesize distributed code with space-efficient array allocation, tight loop bounds, and vectorized communications for INDEPENDENT loops. The generated code includes traditional optimizations such as guard elimination, message vectorization and aggregation, and overlap analysis. The systematic use of an affine framework makes it possible to prove the compilation scheme correct.
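    The affine ownership maps that such a linear-algebra framework encodes can be seen in miniature in the HPF CYCLIC(b) distribution: the global index, owning processor, and local index are related by integer-linear formulas. The helper names below are mine; the mapping itself is the standard block-cyclic one.

    ```python
    def cyclic_owner(i, b, p):
        """Owner processor of global index i under an HPF CYCLIC(b)
        distribution over p processors (all indices 0-based)."""
        return (i // b) % p

    def local_index(i, b, p):
        """Local array index of global element i on its owner:
        the local block number times b, plus the offset within the block."""
        return (i // (b * p)) * b + (i % b)

    def global_index(proc, l, b, p):
        """Inverse map: global index of local element l on processor proc."""
        return ((l // b) * p + proc) * b + (l % b)
    ```

    A plain BLOCK distribution of n elements is the special case CYCLIC(ceil(n/p)), which is one reason a single affine framework can cover both directives.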

  19. A multi-criteria vertical coordination framework for a reliable aid distribution

    International Nuclear Information System (INIS)

    Regis-Hernández, Fabiola; Mora-Vargas, Jaime; Ruíz, Angel

    2017-01-01

    This study proposes a methodology that translates multiple humanitarian supply chain stakeholders’ preferences from qualitative to quantitative values, enabling these preferences to be integrated into optimization models to ensure their balanced and simultaneous implementation during the decision-making process. Design/methodology/approach: An extensive literature review is used to justify the importance of developing a strategy that minimizes the impact of a lack of coordination on humanitarian logistics decisions. A methodology for a multi-criteria framework is presented that allows humanitarian stakeholders’ interests to be integrated into the humanitarian decision-making process. Findings: The findings suggest that integrating stakeholders’ interests into the humanitarian decision-making process will improve its reliability. Research limitations/implications: To further validate the weights of each stakeholder’s interests obtained from the literature review requires interviews with the corresponding organizations. However, the literature review supports the statements in this paper. Practical implications: The cost of a lack of coordination between stakeholders in humanitarian logistics has been increasing during the last decade. These coordination costs can be minimized if humanitarian logistics’ decision-makers measure and simultaneously consider multiple stakeholders’ preferences. Social implications: When stakeholders’ goals are aligned, the humanitarian logistics response becomes more efficient, increasing the quality of delivered aid and providing timely assistance to the affected population in order to minimize their suffering. This study provides a methodology that translates humanitarian supply chain stakeholders’ interests into quantitative values, enabling them to be integrated into mathematical models to ensure relief distribution based on the stakeholders’ preferences.

  20. A multi-criteria vertical coordination framework for a reliable aid distribution

    Energy Technology Data Exchange (ETDEWEB)

    Regis-Hernández, Fabiola; Mora-Vargas, Jaime; Ruíz, Angel

    2017-07-01

    This study proposes a methodology that translates multiple humanitarian supply chain stakeholders’ preferences from qualitative to quantitative values, enabling these preferences to be integrated into optimization models to ensure their balanced and simultaneous implementation during the decision-making process. Design/methodology/approach: An extensive literature review is used to justify the importance of developing a strategy that minimizes the impact of a lack of coordination on humanitarian logistics decisions. A methodology for a multi-criteria framework is presented that allows humanitarian stakeholders’ interests to be integrated into the humanitarian decision-making process. Findings: The findings suggest that integrating stakeholders’ interests into the humanitarian decision-making process will improve its reliability. Research limitations/implications: To further validate the weights of each stakeholder’s interests obtained from the literature review requires interviews with the corresponding organizations. However, the literature review supports the statements in this paper. Practical implications: The cost of a lack of coordination between stakeholders in humanitarian logistics has been increasing during the last decade. These coordination costs can be minimized if humanitarian logistics’ decision-makers measure and simultaneously consider multiple stakeholders’ preferences. Social implications: When stakeholders’ goals are aligned, the humanitarian logistics response becomes more efficient, increasing the quality of delivered aid and providing timely assistance to the affected population in order to minimize their suffering. This study provides a methodology that translates humanitarian supply chain stakeholders’ interests into quantitative values, enabling them to be integrated into mathematical models to ensure relief distribution based on the stakeholders’ preferences.

  1. A framework for implementation of organ effect models in TOPAS with benchmarks extended to proton therapy

    International Nuclear Information System (INIS)

    Ramos-Méndez, J; Faddegon, B; Perl, J; Schümann, J; Paganetti, H; Shin, J

    2015-01-01

    The aim of this work was to develop a framework for modeling organ effects within TOPAS (TOol for PArticle Simulation), a wrapper of the Geant4 Monte Carlo toolkit that facilitates particle therapy simulation. The DICOM interface for TOPAS was extended to permit contour input, used to assign voxels to organs. The following dose response models were implemented: the Lyman–Kutcher–Burman model, the critical element model, the population-based critical volume model, the parallel-serial model, a sigmoid-based model of Niemierko for normal tissue complication probability and tumor control probability (TCP), and a Poisson-based model for TCP. The framework allows easy manipulation of the parameters of these models and the implementation of other models. As part of the verification, results for the parallel-serial and Poisson model for x-ray irradiation of a water phantom were compared to data from the AAPM Task Group 166. When using the task group dose-volume histograms (DVHs), results were found to be sensitive to the number of points in the DVH, with differences up to 2.4%, some of which are attributable to differences between the implemented models. New results are given with the point spacing specified. When using Monte Carlo calculations with TOPAS, despite the relatively good match to the published DVHs, differences up to 9% were found for the parallel-serial model (for a maximum DVH difference of 2%) and up to 0.5% for the Poisson model (for a maximum DVH difference of 0.5%). However, differences of 74.5% (in Rectangle1), 34.8% (in PTV), and 52.1% (in Triangle) were found for the critical element, critical volume, and sigmoid-based models, respectively. We propose a new benchmark for verification of organ effect models in proton therapy. The benchmark consists of customized structures in the spread-out Bragg peak plateau, normal tissue, tumor, penumbra and in the distal region. The DVHs, DVH point spacing, and results of the organ effect models are
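    As background to the models listed, the Lyman-Kutcher-Burman NTCP reduces a DVH to a generalized equivalent uniform dose (gEUD) and passes it through a probit curve. A minimal sketch follows, with all parameter values (n, m, TD50) purely illustrative and the DVH given as differential (dose, volume-fraction) pairs:

    ```python
    import math

    def geud(dvh, n):
        """Generalized EUD from a differential DVH given as (dose_Gy,
        volume_fraction) pairs whose volume fractions sum to 1;
        n is the LKB volume-effect parameter."""
        return sum(v * d ** (1.0 / n) for d, v in dvh) ** n

    def lkb_ntcp(dvh, n, m, td50):
        """Lyman-Kutcher-Burman NTCP: probit (cumulative normal) of
        the gEUD, centered at TD50 with slope parameter m."""
        t = (geud(dvh, n) - td50) / (m * td50)
        return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))
    ```

    By construction, a uniform dose equal to TD50 yields an NTCP of exactly 0.5, which is a convenient unit-test anchor for this family of models.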

  2. Improved Diagnosis and Care for Rare Diseases through Implementation of Precision Public Health Framework.

    Science.gov (United States)

    Baynam, Gareth; Bowman, Faye; Lister, Karla; Walker, Caroline E; Pachter, Nicholas; Goldblatt, Jack; Boycott, Kym M; Gahl, William A; Kosaki, Kenjiro; Adachi, Takeya; Ishii, Ken; Mahede, Trinity; McKenzie, Fiona; Townshend, Sharron; Slee, Jennie; Kiraly-Borri, Cathy; Vasudevan, Anand; Hawkins, Anne; Broley, Stephanie; Schofield, Lyn; Verhoef, Hedwig; Groza, Tudor; Zankl, Andreas; Robinson, Peter N; Haendel, Melissa; Brudno, Michael; Mattick, John S; Dinger, Marcel E; Roscioli, Tony; Cowley, Mark J; Olry, Annie; Hanauer, Marc; Alkuraya, Fowzan S; Taruscio, Domenica; Posada de la Paz, Manuel; Lochmüller, Hanns; Bushby, Kate; Thompson, Rachel; Hedley, Victoria; Lasko, Paul; Mina, Kym; Beilby, John; Tifft, Cynthia; Davis, Mark; Laing, Nigel G; Julkowska, Daria; Le Cam, Yann; Terry, Sharon F; Kaufmann, Petra; Eerola, Iiro; Norstedt, Irene; Rath, Ana; Suematsu, Makoto; Groft, Stephen C; Austin, Christopher P; Draghia-Akli, Ruxandra; Weeramanthri, Tarun S; Molster, Caron; Dawkins, Hugh J S

    2017-01-01

    Public health relies on technologies to produce and analyse data, as well as effectively develop and implement policies and practices. An example is the public health practice of epidemiology, which relies on computational technology to monitor the health status of populations, identify disadvantaged or at risk population groups and thereby inform health policy and priority setting. Critical to achieving health improvements for the underserved population of people living with rare diseases is early diagnosis and best care. In the rare diseases field, the vast majority of diseases are caused by destructive but previously difficult to identify protein-coding gene mutations. The reduction in cost of genetic testing and advances in the clinical use of genome sequencing, data science and imaging are converging to provide more precise understandings of the 'person-time-place' triad. That is: who is affected (people); when the disease is occurring (time); and where the disease is occurring (place). Consequently, we are witnessing a paradigm shift in public health policy and practice towards 'precision public health'. Patient and stakeholder engagement has informed the need for a national public health policy framework for rare diseases. The engagement approach in different countries has produced highly comparable outcomes and objectives. Knowledge and experience sharing across the international rare diseases networks and partnerships has informed the development of the Western Australian Rare Diseases Strategic Framework 2015-2018 (RD Framework) and Australian government health briefings on the need for a National plan. The RD Framework is guiding the translation of genomic and other technologies into the Western Australian health system, leading to greater precision in diagnostic pathways and care, and is an example of how a precision public health framework can improve health outcomes for the rare diseases population. Five vignettes are used to illustrate how policy

  3. A System for Distributed Mechanisms: Design, Implementation and Applications

    NARCIS (Netherlands)

    K.R. Apt (Krzysztof); F. Arbab (Farhad); H. Ma (Huiye)

    2007-01-01

    We describe here a structured system for distributed mechanism design. In our approach the players dynamically form a network in which they know neither their neighbours nor the size of the network and interact to jointly take decisions. The only assumption concerning the underlying

  4. Implementation of the framework convention on tobacco control in Africa: current status of legislation.

    Science.gov (United States)

    Tumwine, Jacqueline

    2011-11-01

    To describe, as of July 2011, the status of tobacco control legislation in Africa in three key areas of the Framework Convention on Tobacco Control (FCTC): (1) Protection from exposure to tobacco smoke, (2) Packaging and labelling of tobacco products, and (3) Tobacco advertising, promotion and sponsorship. Review and analysis of tobacco control legislation in Africa, media reports, journal articles, tobacco industry documents and data published in the 2011 WHO Report on the Global Tobacco Epidemic. Modest progress in FCTC implementation in Africa with many countries having legislation or policies on the protection from exposure to tobacco smoke, however, only a handful of countries meet the standards of the FCTC Article 8 and its Guidelines particularly with regards to designated smoking areas. Little progress on packaging and labelling of tobacco products, with few countries having legislation meeting the minimum standards of the FCTC Article 11 and its Guidelines. Mauritius is the only African country with graphic or pictorial health warnings in place and has the largest warning labels in Africa. Slightly better progress in banning tobacco advertising, promotion and sponsorship has been shown by African countries, although the majority of legislation falls short of the standards of the FCTC Article 13 and its Guidelines. Despite their efforts, African countries' FCTC implementation at national level has not matched the strong regional commitment demonstrated during the FCTC treaty negotiations. This study highlights the need for Africa to step up efforts to adopt and implement effective tobacco control legislation that is fully compliant with the FCTC. In order to achieve this, countries should prioritise resources for capacity building for drafting strong FCTC compliant legislation, research to inform policy and boost political will, and countering the tobacco industry which is a major obstacle to FCTC implementation in Africa.

  5. Implementation of the Framework Convention on Tobacco Control in Africa: Current Status of Legislation

    Directory of Open Access Journals (Sweden)

    Jacqueline Tumwine

    2011-11-01

    Objective: To describe, as of July 2011, the status of tobacco control legislation in Africa in three key areas of the Framework Convention on Tobacco Control (FCTC): (1) Protection from exposure to tobacco smoke, (2) Packaging and labelling of tobacco products, and (3) Tobacco advertising, promotion and sponsorship. Methods: Review and analysis of tobacco control legislation in Africa, media reports, journal articles, tobacco industry documents and data published in the 2011 WHO Report on the Global Tobacco Epidemic. Results: Modest progress in FCTC implementation in Africa, with many countries having legislation or policies on the protection from exposure to tobacco smoke; however, only a handful of countries meet the standards of the FCTC Article 8 and its Guidelines, particularly with regard to designated smoking areas. Little progress on packaging and labelling of tobacco products, with few countries having legislation meeting the minimum standards of the FCTC Article 11 and its Guidelines. Mauritius is the only African country with graphic or pictorial health warnings in place and has the largest warning labels in Africa. Slightly better progress in banning tobacco advertising, promotion and sponsorship has been shown by African countries, although the majority of legislation falls short of the standards of the FCTC Article 13 and its Guidelines. Despite their efforts, African countries’ FCTC implementation at national level has not matched the strong regional commitment demonstrated during the FCTC treaty negotiations. Conclusion: This study highlights the need for Africa to step up efforts to adopt and implement effective tobacco control legislation that is fully compliant with the FCTC. In order to achieve this, countries should prioritise resources for capacity building for drafting strong FCTC-compliant legislation, research to inform policy and boost political will, and countering the tobacco industry, which is a major obstacle to FCTC

  6. Can the theoretical domains framework account for the implementation of clinical quality interventions?

    Science.gov (United States)

    Lipworth, Wendy; Taylor, Natalie; Braithwaite, Jeffrey

    2013-12-21

    The health care quality improvement movement is a complex enterprise. Implementing clinical quality initiatives requires attitude and behaviour change on the part of clinicians, but this has proven to be difficult. In an attempt to solve this kind of behavioural challenge, the theoretical domains framework (TDF) has been developed. The TDF consists of 14 domains from psychological and organisational theory said to influence behaviour change. No systematic research has been conducted into the ways in which clinical quality initiatives map on to the domains of the framework. We therefore conducted a qualitative mapping experiment to determine to what extent, and in what ways, the TDF is relevant to the implementation of clinical quality interventions. We conducted a thematic synthesis of the qualitative literature exploring clinicians' perceptions of various clinical quality interventions. We analysed and synthesised 50 studies in total, in five domains of clinical quality interventions: clinical quality interventions in general, structural interventions, audit-type interventions, interventions aimed at making practice more evidence-based, and risk management interventions. Data were analysed thematically, followed by synthesis of these themes into categories and concepts, which were then mapped to the domains of the TDF. Our results suggest that the TDF is highly relevant to the implementation of clinical quality interventions. It can be used to map most, if not all, of the attitudinal and behavioural barriers and facilitators of uptake of clinical quality interventions. Each of these 14 domains appeared to be relevant to many different types of clinical quality interventions. One possible additional domain might relate to perceived trustworthiness of those instituting clinical quality interventions. The TDF can be usefully applied to a wide range of clinical quality interventions. Because all 14 of the domains emerged as relevant, and we did not identify any

  7. Paving the Road to Success: A Framework for Implementing the Success Tutoring Approach

    Directory of Open Access Journals (Sweden)

    Spark Linda

    2017-12-01

    The exponential growth of higher education enrolment in South Africa has resulted in increased diversity of the student body, leading to a proliferation of factors that affect student performance and success. Various initiatives have been adopted by tertiary institutions to mitigate the negative impact these factors may have on student success, and it is suggested that interventions that include aspects of social integration are the most successful. This paper outlines an approach called Success Tutoring (a non-academic tutorial approach used as part of a student success and support programme in the Faculty of Commerce, Law, and Management at the University of the Witwatersrand), which is underscored by empirical evidence drawn from evaluation data collected during Success Tutor symposia. The authors draw conclusions and make recommendations based on a thematic analysis of the dataset, and ultimately provide readers with a framework for implementing Success Tutoring at their tertiary institutions.

  8. Microplastics in seawater: Recommendations from the Marine Strategy Framework Directive implementation process

    Directory of Open Access Journals (Sweden)

    Jesus Gago

    2016-11-01

    Microplastic litter is a pervasive pollutant present in marine systems across the globe. The legacy of microplastics pollution in the marine environment today may remain for years to come due to the persistence of these materials. Microplastics are emerging contaminants of potential concern and as yet there are few recognised approaches for monitoring. In 2008, the EU Marine Strategy Framework Directive (MSFD, 2008/56/EC) included microplastics as an aspect to be measured. Here we outline the approach as discussed by the European Union expert group on marine litter, the Technical Subgroup on Marine Litter (TSG-ML), with a focus on the implementation of monitoring of microplastics in seawater in European seas. It is concluded that harmonization and coherence are needed to achieve reliable monitoring.

  9. A Framework for Sentiment Analysis Implementation of Indonesian Language Tweet on Twitter

    Science.gov (United States)

    Asniar; Aditya, B. R.

    2017-01-01

    Sentiment analysis is the process of understanding, extracting, and processing textual data automatically to obtain information. It can be used to gauge opinion on an issue and to identify the public's response to something. Millions of digital data items remain unexploited and therefore provide no useful information, especially for government. Sentiment analysis in government is used to monitor government work programs, such as those of the Government of Bandung City, through social media data. The analysis can serve as a quick tool for gauging the public response to these work programs, so that the next strategic steps can be taken. This paper adopts the Support Vector Machine as a supervised algorithm for sentiment analysis and presents a framework for implementing sentiment analysis of Indonesian-language tweets on Twitter for the work programs of the Government of Bandung City. The results can serve as a reference for decision making in local government.
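    The record names the Support Vector Machine as its supervised classifier but gives no implementation detail. A minimal sketch of such a pipeline (TF-IDF features feeding a linear SVM, built here with scikit-learn; the tiny English placeholder tweets stand in for the Indonesian-language corpus, which is not available):

    ```python
    # Illustrative TF-IDF + linear-SVM sentiment pipeline (not the authors' exact system).
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    # Toy labelled tweets, placeholders for a real annotated corpus.
    train_texts = [
        "great program, the city is improving",
        "excellent public service today",
        "terrible response, very disappointed",
        "the program failed and nothing works",
    ]
    train_labels = ["positive", "positive", "negative", "negative"]

    # Word and bigram TF-IDF features, classified with a linear SVM.
    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
    model.fit(train_texts, train_labels)

    print(model.predict(["great public service"])[0])
    ```

    In a real deployment the same pipeline would be trained on thousands of manually labelled tweets, and the predicted labels aggregated per work program to monitor the public response over time.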

  10. Implementation of the Integrated Alarm System for KOMAC facility using EPICS framework and Eclipse

    International Nuclear Information System (INIS)

    Song, Young-Gi; Kim, Jae-Ha; Kim, Han-Sung; Kwon, Hyeok-Jung; Cho, Yong-Sub

    2017-01-01

    The alarm detecting layer is the component that monitors alarm signals, which are transported to the processing part through a message queue. The main purpose of the processing part is to transfer the alarm signals, linking each alarm's identification and state, to the database system. The operation interface for system-level signal links has been developed with the EPICS framework, and EPICS tools have been used for monitoring device alarm status. The KOMAC alarm system was developed to offer a user-friendly, intuitive user interface. It is implemented with an EPICS IOC for the alarm server, the Eclipse Mars integrated development tool for the alarm viewer, and MariaDB for the alarm log. The new alarm system supports an intuitive user interface for alarm information and alarm history. Planned additions to the alarm viewer include a login function, user permissions for alarm acknowledgement and PV import, and search and report functions.

  11. Framework and operational procedure for implementing Strategic Environmental Assessment in China

    International Nuclear Information System (INIS)

    Bao Cunkuan; Lu Yongsen; Shang Jincheng

    2004-01-01

    Over the last 20 years, Environmental Impact Assessment (EIA) has been implemented and has become an important instrument for decision-making in development projects in China. The Environmental Impact Assessment Law of the P.R. China was promulgated on 28 October 2002 and came into effect on 1 September 2003. The law provides that Strategic Environmental Assessment (SEA) is required for regional and sector plans and programs. This paper introduces the research achievements and practice of SEA in China, and discusses the relationship between SEA and the 'integration of environment and development in decision-making' (IEDD), as well as the relevant political and legal basis of SEA. The framework and operational procedures of SEA administration and enforcement are presented. Nine cases are analyzed and some proposals are given.

  12. Framework for the impact analysis and implementation of Clinical Prediction Rules (CPRs)

    LENUS (Irish Health Repository)

    Wallace, Emma

    2011-10-14

    Abstract Clinical Prediction Rules (CPRs) are tools that quantify the contribution of symptoms, clinical signs and available diagnostic tests, and in doing so stratify patients according to the probability of having a target outcome or needing a specified treatment. Most research focuses on the derivation stage, with only a minority of CPRs progressing to validation and very few undergoing impact analysis. Impact analysis studies remain the most efficient way of assessing whether incorporating CPRs into a decision-making process improves patient care. However, there is a lack of clear methodology for the design of high-quality impact analysis studies. We have developed a sequential four-phased framework, based on the literature and the collective experience of our international working group, to help researchers identify and overcome the specific challenges in designing and conducting an impact analysis of a CPR. There is a need to shift emphasis from deriving new CPRs to validating and implementing existing CPRs. The proposed framework provides a structured approach to this topical and complex area of research.

  13. Implementation of the ATLAS trigger within the multi-threaded software framework AthenaMT

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00225867; The ATLAS collaboration

    2017-01-01

    We present an implementation of the ATLAS High Level Trigger, HLT, that provides parallel execution of trigger algorithms within the ATLAS multithreaded software framework, AthenaMT. This development will enable the ATLAS HLT to meet future challenges due to the evolution of computing hardware and upgrades of the Large Hadron Collider, LHC, and ATLAS Detector. During the LHC data-taking period starting in 2021, luminosity will reach up to three times the original design value. Luminosity will increase further, to up to 7.5 times the design value, in 2026 following LHC and ATLAS upgrades. This includes an upgrade of the ATLAS trigger architecture that will result in an increase in the HLT input rate by a factor of 4 to 10 compared to the current maximum rate of 100 kHz. The current ATLAS multiprocess framework, AthenaMP, manages a number of processes that each execute algorithms sequentially for different events. AthenaMT will provide a fully multi-threaded environment that will additionally enable concurrent ...

  14. A sustainable livelihood framework to implement CSR project in coal mining sector

    Directory of Open Access Journals (Sweden)

    Sapna A. Narula

    2017-01-01

    Full Text Available Corporate social responsibility (CSR) in mining areas has gained momentum, especially in countries like India where it has been made mandatory. The primary objective of this paper is to document the actual social challenges of mining in field areas and to find out how companies in the coal sector can work systematically to uplift affected communities. The first part of the paper draws evidence from three different bodies of literature, i.e. CSR and coal mining, capacity building, and livelihood generation in mining areas. We converge this literature to propose a novel framework for livelihood generation through capacity building supported by CSR investments. The paper also documents a live case of planning and implementing capacity building activities in the Muriadih coal mines in the Jharkhand state of India, and offers lessons to both business and policy makers. The proposed framework has only been tested in a local context, yet it has the potential to be replicated in other mining areas.

  15. A methodological approach and framework for sustainability assessment in NGO-implemented primary health care programs.

    Science.gov (United States)

    Sarriot, Eric G; Winch, Peter J; Ryan, Leo J; Bowie, Janice; Kouletio, Michelle; Swedberg, Eric; LeBan, Karen; Edison, Jay; Welch, Rikki; Pacqué, Michel C

    2004-01-01

    An estimated 10.8 million children under 5 continue to die each year in developing countries from causes that are easily treatable or preventable. Non-governmental organizations (NGOs) are frontline implementers of low-cost and effective child health interventions, but their progress toward sustainable child health gains is a challenge to evaluate. This paper presents the Child Survival Sustainability Assessment (CSSA) methodology--a framework and process--to map progress towards sustainable child health from the community level upward. The CSSA was developed with NGOs through a participatory process of research and dialogue. Commitment to sustainability requires a systematic and systemic consideration of human, social and organizational processes beyond a purely biomedical perspective. The CSSA is organized around three interrelated dimensions of evaluation: (1) health and health services; (2) capacity and viability of local organizations; and (3) capacity of the community in its social-ecological context. The CSSA uses a participatory, action-planning process, engaging a 'local system' of stakeholders in the contextual definition of objectives and indicators. Improved conditions measured along the three dimensions correspond to progress toward a sustainable health situation for the population. This framework opens new opportunities for evaluation and research design and places sustainability at the center of primary health care programming.

  16. Challenges to the Implementation of a New Framework for Safeguarding Financial Stability

    Directory of Open Access Journals (Sweden)

    Vlahović Ana

    2014-09-01

    Full Text Available There is probably no single economic concept that has attracted more attention and intrigued scientific and professional circles than financial stability. For over a decade there have been efforts to establish the starting point for explaining this condition or characteristic of the financial system, since some find that the key to defining financial stability lies in stability, while others argue in favour of the opposite, instability. Unfortunately, no agreement has been reached on a universal definition that would be widely accepted at the international level. Consequently, this gave rise to open discussions on systemic risk, on creating a framework for preserving financial stability, and on the role of central banks in this process. This article analyses the results achieved in developing a theoretical concept of financial stability and in its practical implementation. A consensus has been reached on the necessity of removing rigid barriers between macro- and microprudential policies and on the necessity of their coordinated action. The primary objectives of monetary and fiscal stability have been shifted towards preserving financial stability. The isolated macroprudential principle has rightfully earned the epithet of an archaic approach. Coordinated micro- and macroprudential policies have definitely prevailed and become reality in many countries, including Montenegro. The institutional frameworks created for safeguarding financial stability at all levels - national, pan-European and global - represent a challenge for further comparative studies.

  17. MATLAB implementation of satellite positioning error overbounding by generalized Pareto distribution

    Science.gov (United States)

    Ahmad, Khairol Amali; Ahmad, Shahril; Hashim, Fakroul Ridzuan

    2018-02-01

    In the satellite navigation community, error overbounding has been implemented as part of integrity monitoring. In this work, MATLAB programming is used to implement the overbounding of the satellite positioning error CDF. Using a reference trajectory, the horizontal position errors (HPE) are computed and their non-parametric distribution is given by the empirical cumulative distribution function (ECDF). According to the results, these errors have a heavy-tailed distribution. Since the ECDF of the HPE in an urban environment is not Gaussian, the ECDF is overbounded with the CDF of the generalized Pareto distribution (GPD).
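    The record does not give the MATLAB code, but the peaks-over-threshold fit and the tail overbound check it describes can be sketched in Python with SciPy. The synthetic heavy-tailed sample, the 90th-percentile threshold, and the 1.3× scale inflation used to make the fitted tail conservative are all assumptions for this sketch, not values from the paper:

    ```python
    # Sketch: overbounding a heavy-tailed error ECDF tail with a generalized Pareto fit.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    hpe = np.abs(stats.t.rvs(df=3, size=5000, random_state=rng))  # synthetic heavy-tailed "HPE"

    # Empirical CDF of the errors.
    x = np.sort(hpe)
    ecdf = np.arange(1, len(x) + 1) / len(x)

    # Fit a GPD to exceedances over a high threshold (peaks-over-threshold).
    u = np.quantile(hpe, 0.9)
    exceed = hpe[hpe > u] - u
    c, _, scale = stats.genpareto.fit(exceed, floc=0.0)

    # Inflate the fitted scale so the GPD tail dominates (overbounds) the empirical tail.
    tail = x > u
    p_emp = 1.0 - ecdf[tail]                                   # empirical exceedance probability
    p_gpd = 0.1 * stats.genpareto.sf(x[tail] - u, c, scale=1.3 * scale)
    frac = float(np.mean(p_gpd >= p_emp))
    print(frac)  # fraction of tail points whose exceedance probability is overbounded
    ```

    For a GPD with positive shape parameter, enlarging the scale increases the survival function at every point, so the inflated fit sits above the fitted one everywhere; the printed fraction checks how much of the empirical tail it actually dominates.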

  18. Implementing electric vehicles in urban distribution: A discrete event simulation

    OpenAIRE

    Lebeau, Philippe; Macharis, Cathy; Mierlo, Joeri Van; Maes, Guillaume

    2013-01-01

    Urban freight transport becomes increasingly important with the development of cities. However, it also generates inefficiencies in social, economic and environmental terms. A possible solution is the use of urban distribution centres in order to rationalise deliveries and to operate the last miles with clean vehicles. Electric vehicles have been gaining attention lately, but some barriers remain. Since cost barriers were already investigated, the paper aimed at evaluating the difference of p...

  19. The Framework for KM Implementation in Product and Service Oriented SMEs: Evidence from Field Studies in Taiwan

    Directory of Open Access Journals (Sweden)

    Yao Chin Lin

    2015-03-01

    Full Text Available Knowledge management (KM) is a core competency that determines the success of small and medium-sized enterprises (SMEs) in the knowledge-based economy. Rather than competing on the basis of physical and financial capital, the success of SMEs is driven by the knowledge, experience and skills of the owners and employees. Unfortunately, many SMEs still struggle with KM implementation because they lack a comprehensive KM framework. This study aims to identify enablers of KM success and to build a framework for KM implementation in service- and product-oriented SMEs. Using multiple research methods, the study collects data from SMEs in Taiwan to validate the suggested enablers and the reference KM framework. The suggested framework can provide useful assistance and guidance for holistic KM solutions. The K-object concept, which adopts the XML standard, may become a significant managerial and technical element in KM practice. The enhanced KM framework mandates every employee's participation in knowledge activities, not just that of a few elite knowledge workers. The findings provide useful implications for researchers and practitioners, offering templates for implementing KM initiatives in different industries and a more comprehensive framework for KM implementation in different types of SMEs.

  20. A framework for establishing the technical efficiency of Electricity Distribution Counties (EDCs) using Data Envelopment Analysis

    International Nuclear Information System (INIS)

    Mullarkey, Shane; Caulfield, Brian; McCormack, Sarah; Basu, Biswajit

    2015-01-01

    Highlights: • Six models are employed to establish the technical efficiency of Electricity Distribution Counties. • A diagnostic parameter is incorporated to account for differences across Electricity Distribution Counties. • The amalgamation of Electricity Distribution Counties leads to improved efficiency in the production of energy. - Abstract: European energy market liberalization has entailed the restructuring of electric power markets through the unbundling of electricity generation, transmission, distribution and supply activities, and the introduction of competition into electricity generation. Under these new electricity market regimes, it is important to have an evaluation tool capable of examining the impacts of these market changes. The adoption of Data Envelopment Analysis as a form of benchmarking for electricity distribution regulation is one method of conducting this analysis. This paper applies a Data Envelopment Analysis framework to the electricity distribution network in Ireland to explore the merits of this approach and to determine the technical efficiency and the potential scope for efficiency improvements through reorganization and amalgamation of the distribution network. The results presented show that overall grid efficiency is improved through this restructuring. A diagnostic parameter is defined and pursued to account for aberrations across Electricity Distribution Counties, as opposed to the traditionally employed environmental variables. The adoption of this diagnostic parameter leads to a more intuitive understanding of Electricity Distribution Counties
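    The record does not specify its six models, but the standard input-oriented CCR formulation that DEA benchmarking builds on is one small linear program per unit: minimise the radial input contraction θ subject to a convex-cone combination of peers producing at least the unit's output with at most θ times its inputs. A hedged sketch with toy "county" data (invented for illustration, solved via `scipy.optimize.linprog`):

    ```python
    # Input-oriented CCR DEA efficiency scores via linear programming.
    # Toy data: 4 hypothetical "distribution counties", 2 inputs (staff, network km),
    # 1 output (energy delivered) - not the Irish data used in the paper.
    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[20.0, 300.0], [30.0, 200.0], [40.0, 500.0], [25.0, 250.0]])  # inputs per DMU
    Y = np.array([[100.0], [120.0], [110.0], [100.0]])                           # outputs per DMU

    def ccr_efficiency(o):
        n, m = X.shape
        s = Y.shape[1]
        # Decision variables [theta, lambda_1..lambda_n]; minimise theta.
        c = np.r_[1.0, np.zeros(n)]
        # Peer inputs must fit within theta * x_o:  X^T lam - theta x_o <= 0
        A_in = np.hstack([-X[o].reshape(m, 1), X.T])
        # Peer outputs must reach y_o:  -Y^T lam <= -y_o
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(m), -Y[o]]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(None, None)] + [(0, None)] * n)
        return res.fun

    scores = [round(ccr_efficiency(o), 3) for o in range(len(X))]
    print(scores)  # a score of 1.0 marks a unit on the efficient frontier
    ```

    Scores below 1.0 quantify the radial input reduction available to an inefficient unit, which is the kind of evidence the paper uses when assessing reorganization and amalgamation of the network.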

  1. Dist-Orc: A Rewriting-based Distributed Implementation of Orc with Formal Analysis

    Directory of Open Access Journals (Sweden)

    José Meseguer

    2010-09-01

    Full Text Available Orc is a theory of orchestration of services that allows structured programming of distributed and timed computations. Several formal semantics have been proposed for Orc, including a rewriting logic semantics developed by the authors. Orc also has a fully fledged implementation in Java with functional programming features. However, as with descriptions of most distributed languages, there exists a fairly substantial gap between Orc's formal semantics and its implementation, in that: (i) programs in Orc are not easily deployable in a distributed implementation just by using Orc's formal semantics, and (ii) they are not readily formally analyzable at the level of a distributed Orc implementation. In this work, we overcome problems (i) and (ii) for Orc. Specifically, we describe an implementation technique based on rewriting logic and Maude that narrows this gap considerably. The enabling feature of this technique is Maude's support for external objects through TCP sockets. We describe how sockets are used to implement Orc site calls and returns, and to provide real-time timing information to Orc expressions and sites. We then show how Orc programs in the resulting distributed implementation can be formally analyzed at a reasonable level of abstraction by defining an abstract model of time and of the socket communication infrastructure, and discuss the assumptions under which the analysis can be deemed correct. Finally, the distributed implementation and the formal analysis methodology are illustrated with a case study.

  2. An Ambient Intelligence Framework for the Provision of Geographically Distributed Multimedia Content to Mobility Impaired Users

    Science.gov (United States)

    Kehagias, Dionysios D.; Giakoumis, Dimitris; Tzovaras, Dimitrios; Bekiaris, Evangelos; Wiethoff, Marion

    This chapter presents an ambient intelligence framework whose goal is to serve the information needs of mobility-impaired users on the move. The framework couples users with geographically distributed services and the corresponding multimedia content, enabling access to context-sensitive information based on the user's geographic location and the use case under consideration. It provides a multi-modal facility realized through a set of mobile devices and user interfaces that address the needs of ten different types of user impairments. The overall ambient intelligence framework enables users equipped with mobile devices to access multimedia content in order to undertake activities relevant to one or more of the following domains: transportation, tourism and leisure, personal support services, work, business, education, and social relations and community building. User experience is explored against those activities through a specific usage scenario.

  3. A survey on the progress with implementation of the radiography profession's career progression framework in UK radiotherapy centres

    International Nuclear Information System (INIS)

    James, Sarah; Beardmore, Charlotte; Dumbleton, Claire

    2012-01-01

    Aim: The purpose of the survey was to benchmark progress with implementing the radiography profession's career progression framework within radiotherapy centres across the United Kingdom (UK). Methods: A survey questionnaire was constructed using the Survey Monkey™ tool to assess implementation of the career progression framework of the Society and College of Radiographers. Once constructed, an online link to the survey questionnaire was emailed to all radiotherapy centre managers in the UK (N = 67), who were invited to provide one response per centre. The survey comprised twenty-nine questions grouped into nine sections. Key results: The workforce profile indicates that increases in assistant, advanced and consultant level practitioners are required to meet National Radiotherapy Advisory Group recommendations, with only a small number of centres having fully implemented the career progression framework. The overall vacancy level across the therapeutic radiography workforce was 4.6% at the time of the survey. Conclusions and Recommendations: The survey has highlighted some progress with implementation of the career progression framework across the UK since its launch in 2000. However, the current level of implementation is disappointing, considering it is a key recommendation within the NRAG Report 2007 with respect to England. It is recommended that all centres undertake a multi-professional workforce review to embed the career progression framework within their service, in order to meet the workforce challenge associated with the anticipated large growth in radiotherapy capacity.

  4. Leveraging the Zachman framework implementation using action-research methodology - a case study: aligning the enterprise architecture and the business goals

    Science.gov (United States)

    Nogueira, Juan Manuel; Romero, David; Espadas, Javier; Molina, Arturo

    2013-02-01

    With the emergence of new enterprise models, such as technology-based enterprises, and the large quantity of information generated through technological advances, the Zachman framework continues to represent a modelling tool of great utility and value for constructing an enterprise architecture (EA) that can integrate and align the IT infrastructure and business goals. Nevertheless, implementing an EA requires considerable effort within an enterprise. Small technology-based enterprises and start-ups can take advantage of EAs and frameworks, but because these enterprises have limited resources to allocate to this task, an enterprise framework implementation is not feasible in most cases. This article proposes a new methodology based on action-research for the implementation of the business, system and technology models of the Zachman framework, to assist and facilitate its implementation. Following an explanation of the cycles of the proposed methodology, a case study is presented to illustrate the results of implementing the Zachman framework in a technology-based enterprise, PyME CREATIVA, using the action-research approach.

  5. Psychological first aid following trauma: implementation and evaluation framework for high-risk organizations.

    Science.gov (United States)

    Forbes, David; Lewis, Virginia; Varker, Tracey; Phelps, Andrea; O'Donnell, Meaghan; Wade, Darryl J; Ruzek, Josef I; Watson, Patricia; Bryant, Richard A; Creamer, Mark

    2011-01-01

    International clinical practice guidelines for the management of psychological trauma recommend Psychological First Aid (PFA) as an early intervention for survivors of potentially traumatic events. These recommendations are consensus-based, and there is little published evidence assessing the effectiveness of PFA. This is not surprising given the nature of the intervention and the complicating factors involved in any evaluation of PFA. There is, nevertheless, an urgent need for stronger evidence evaluating its effectiveness. The current paper posits that the implementation and evaluation of PFA within high-risk organizational settings is an ideal place to start. The paper provides a framework for a phased approach to implementing PFA within such settings and presents a model for evaluating its effectiveness using a logic- or theory-based approach which considers both pre-event and post-event factors. Phases 1 and 2 of the PFA model are pre-event actions, and phases 3 and 4 are post-event actions. It is hoped that by using the Phased PFA model and evaluation method proposed in this paper, future researchers will begin to undertake the important task of building the evidence about the most effective approach to providing PFA in high-risk organizational and community disaster settings.

  6. Interdisciplinary Priorities for Dissemination, Implementation, and Improvement Science: Frameworks, Mechanics, and Measures.

    Science.gov (United States)

    Brunner, Julian W; Sankaré, Ibrahima C; Kahn, Katherine L

    2015-12-01

    Much of dissemination, implementation, and improvement (DII) science is conducted by social scientists, healthcare practitioners, and biomedical researchers. While each of these groups has its own venues for sharing methods and findings, forums that bring together the diverse DII science workforce provide important opportunities for cross-disciplinary collaboration and learning. In particular, such forums are uniquely positioned to foster the sharing of three important components of research. First, they allow the sharing of conceptual frameworks for DII science that focus on the use and spread of innovations. Second, they provide an opportunity to share strategies for initiating and governing DII research, including approaches for eliciting and incorporating the research priorities of patients, study participants, healthcare practitioners, and decision-makers. Third, they allow the sharing of outcome measures well suited to the goals of DII science, thereby helping to validate these outcomes in diverse contexts, improving the comparability of findings across settings, and elevating the study of the implementation process itself. © 2015 Wiley Periodicals, Inc.

  7. BOT Contract through the optics of Albanian legal provisions - Issues of the implementation and transfer framework

    Directory of Open Access Journals (Sweden)

    Entela Prifti

    2016-07-01

    Full Text Available The last years have seen an increase in concession contracts in Albania, accompanied by a revised, modern legal framework. Beside the debate on whether the government should perform most activities itself rather than assigning them to the private sector through concession contracts, concessions are nowadays a reality and as such they should be studied and analysed carefully. The scope of this article is limited to the provisions of Albanian legislation and its approach to international provisions regarding the BOT (build-operate-transfer) concession contract. A detailed analysis leads to conclusions as to what extent Albanian concession legislation complies with the internationally accepted principles of Public Private Partnership, concerning mainly the implementation and transfer phases of a BOT contract. Albanian Public Private Partnership legislation has gone through many revisions and amendments during the last twenty years, resulting in a challenging situation for everybody who deals with any aspect of a concession. Having a detailed understanding of the legal provisions is indeed the core element of a successful implementation process for any concession, resulting in the highest profitability for the concession parties, the public entity and the private investor, and consequently serving the best interest of the population.

  8. eTOXlab, an open source modeling framework for implementing predictive models in production environments.

    Science.gov (United States)

    Carrió, Pau; López, Oriol; Sanz, Ferran; Pastor, Manuel

    2015-01-01

    Computational models based on Quantitative Structure-Activity Relationship (QSAR) methodologies are widely used tools for predicting the biological properties of new compounds. In many instances, such models are used routinely in industry (e.g. the food, cosmetic or pharmaceutical industries) for the early assessment of the biological properties of new compounds. However, most of the tools currently available for developing QSAR models are not well suited to supporting the whole QSAR model life cycle in production environments. We have developed eTOXlab, an open source modeling framework designed to be used at the core of a self-contained virtual machine that can be easily deployed in production environments, providing predictions as web services. eTOXlab consists of a collection of object-oriented Python modules with methods mapping common tasks of standard modeling workflows. This framework allows building and validating QSAR models, as well as predicting the properties of new compounds, using either a command line interface or a graphical user interface (GUI). Simple models can be generated by setting a few parameters, while more complex models can be implemented by overriding pieces of the original source code. eTOXlab benefits from the object-oriented capabilities of Python to provide high flexibility: any model implemented using eTOXlab inherits the features implemented in the parent model, like common tools and services or the automatic exposure of the models as prediction web services. The particular eTOXlab architecture as a self-contained, portable prediction engine allows building models with confidential information within corporate facilities, which can be safely exported and used for prediction without disclosing the structures of the training series. The software presented here provides full support for the specific needs of users who want to develop, use and maintain predictive models in corporate environments. The technologies used by e

  9. Nonlocal approach to the analysis of the stress distribution in granular systems. I. Theoretical framework

    Science.gov (United States)

    Kenkre, V. M.; Scott, J. E.; Pease, E. A.; Hurd, A. J.

    1998-05-01

    A theoretical framework for the analysis of the stress distribution in granular materials is presented. It makes use of a transformation of the vertical spatial coordinate into a formal time variable and the subsequent study of a generally non-Markoffian, i.e., memory-possessing (nonlocal) propagation equation. Previous treatments are obtained as particular cases corresponding to, respectively, wavelike and diffusive limits of the general evolution. Calculations are presented for stress propagation in bounded and unbounded media. They can be used to obtain desired features such as a prescribed stress distribution within the compact.
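    The wave and diffusion limits mentioned in the abstract can be made concrete with a schematic memory-kernel equation (this is a generic sketch in our own notation, not the paper's exact formulation): writing $t$ for the transformed vertical coordinate and $\sigma$ for the stress,

    ```latex
    \frac{\partial \sigma}{\partial t}
      = \int_0^t \phi(t-t')\, \nabla_{\!\perp}^{2}\, \sigma(t')\, \mathrm{d}t'
    ```

    A local (memoryless) kernel $\phi(t) = D\,\delta(t)$ recovers the diffusive limit $\partial_t \sigma = D\,\nabla_{\!\perp}^{2}\sigma$, while a non-decaying kernel $\phi(t) = c^{2}$ yields, after one differentiation in $t$, the wavelike limit $\partial_t^{2} \sigma = c^{2}\,\nabla_{\!\perp}^{2}\sigma$; kernels with finite memory interpolate between the two regimes, which is the sense in which the previous treatments appear as particular cases.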

  10. A Parallel and Distributed Surrogate Model Implementation for Computational Steering

    KAUST Repository

    Butnaru, Daniel

    2012-06-01

    Understanding the influence of multiple parameters in a complex simulation setting is a difficult task. In the ideal case, the scientist can freely steer such a simulation and is immediately presented with the results for a certain configuration of the input parameters. Such an exploration process is however not possible if the simulation is computationally too expensive. For these cases we present in this paper a scalable computational steering approach utilizing a fast surrogate model as substitute for the time-consuming simulation. The surrogate model we propose is based on the sparse grid technique, and we identify the main computational tasks associated with its evaluation and its extension. We further show how distributed data management combined with the specific use of accelerators allows us to approximate and deliver simulation results to a high-resolution visualization system in real-time. This significantly enhances the steering workflow and facilitates the interactive exploration of large datasets. © 2012 IEEE.

  11. ICT4RED 12 - Component implementation framework: A conceptual framework for integrating mobile technology into resource-constrained rural schools

    CSIR Research Space (South Africa)

    Ford, M

    2014-05-01

    Full Text Available ICT for Rural Education Development (ICT4RED) is a large-scale pilot that is testing the use of tablets in 26 deep rural schools in the Eastern Cape Province of South Africa. The aim is to develop a replicable framework that will enable evidence...

  12. The Road to Basel III – Quantitative Impact Study, the Basel III Framework and Implementation in the EU

    OpenAIRE

    Anastasia Gromova-Schneider; Caroline Niziolek

    2011-01-01

    In response to the financial crisis, the Basel Committee on Banking Supervision (BCBS) in December 2009 published its first consultative proposals to review the Basel II regulatory framework. Following a consultation process and a quantitative impact study (QIS), on December 16, 2010, the BCBS published the final Basel III framework for tightening the globally applicable capital adequacy and liquidity rules. The implementation of the new provisions in the EU is currently under way. The Europe...

  13. Distributed Geant4 simulation in medical and space science applications using DIANE framework and the GRID

    CERN Document Server

    Moscicki, J T; Mantero, A; Pia, M G

    2003-01-01

    Distributed computing is one of the most important trends in IT, which has recently gained significance for large-scale scientific applications. The Distributed Analysis Environment (DIANE) is an R&D study focusing on semi-interactive parallel and remote data analysis and simulation, which has been conducted at CERN. DIANE provides the necessary software infrastructure for parallel scientific applications in the master-worker model. Advanced error recovery policies, automatic book-keeping of distributed jobs, and on-line monitoring and control tools are provided. DIANE makes transparent use of a number of different middleware implementations such as load balancing services (LSF, PBS, GRID Resource Broker, Condor) and security services (GSI, Kerberos, openssh). A number of distributed Geant4 simulations have been deployed and tested, ranging from interactive radiotherapy treatment planning using dedicated clusters in hospitals, to globally-distributed simulations of astrophysics experiments using the European data g...

  14. A Framework for Federated Two-Factor Authentication Enabling Cost-Effective Secure Access to Distributed Cyberinfrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Ezell, Matthew A [ORNL; Rogers, Gary L [University of Tennessee, Knoxville (UTK); Peterson, Gregory D. [University of Tennessee, Knoxville (UTK)

    2012-01-01

    As cyber attacks become increasingly sophisticated, the security measures used to mitigate the risks must also increase in sophistication. One time password (OTP) systems provide strong authentication because security credentials are not reusable, thus thwarting credential replay attacks. The credential changes regularly, making brute-force attacks significantly more difficult. In high performance computing, end users may require access to resources housed at several different service provider locations. The ability to share a strong token between multiple computing resources reduces cost and complexity. The National Science Foundation (NSF) Extreme Science and Engineering Discovery Environment (XSEDE) provides access to digital resources, including supercomputers, data resources, and software tools. XSEDE will offer centralized strong authentication for services amongst service providers that leverage their own user databases and security profiles. This work implements a scalable framework built on standards to provide federated secure access to distributed cyberinfrastructure.
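The abstract does not name a specific OTP algorithm; the usual standard for such hardware and software tokens is the HOTP/TOTP scheme of RFC 4226/6238, sketched below (the secret shown in the usage note is the RFC test key, used purely for illustration).

```python
import hashlib
import hmac
import struct
import time

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC-SHA1 over a big-endian counter, then
    dynamic truncation to a short decimal code."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                          # low nibble picks the window
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

def totp(secret: bytes, t=None, step: int = 30) -> str:
    """RFC 6238 TOTP: HOTP with the counter derived from wall-clock time."""
    now = time.time() if t is None else t
    return hotp(secret, int(now // step))
```

Because the code changes every time step and is derived from a shared secret rather than replayed, a compromised code is useless moments later, which is the property the federated framework relies on.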

  15. Development of a Dynamically Configurable,Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation

    Science.gov (United States)

    Afjeh, Abdollah A.; Reed, John A.

    2003-01-01

    This research is aimed at developing a new and advanced simulation framework that will significantly improve the overall efficiency of aerospace systems design and development. This objective will be accomplished through an innovative integration of object-oriented and Web-based technologies with both new and proven simulation methodologies. The basic approach involves three major areas of research: aerospace system and component representation using a hierarchical object-oriented component model which enables the use of multimodels and enforces component interoperability; a collaborative software environment that streamlines the process of developing, sharing and integrating aerospace design and analysis models; and development of a distributed infrastructure which enables Web-based exchange of models to simplify the collaborative design process, and to support computationally intensive aerospace design and analysis processes. Research for the first year dealt with the design of the basic architecture and supporting infrastructure, an initial implementation of that design, and a demonstration of its application to an example aircraft engine system simulation.

  16. Framework and implementation of a continuous network-wide health monitoring system for roadways

    Science.gov (United States)

    Wang, Ming; Birken, Ralf; Shahini Shamsabadi, Salar

    2014-03-01

    According to the 2013 ASCE report card America's infrastructure scores only a D+. There are more than four million miles of roads (grade D) in the U.S. requiring a broad range of maintenance activities. The nation faces a monumental problem of infrastructure management in the scheduling and implementation of maintenance and repair operations, and in the prioritization of expenditures within budgetary constraints. The efficient and effective performance of these operations however is crucial to ensuring roadway safety, preventing catastrophic failures, and promoting economic growth. There is a critical need for technology that can cost-effectively monitor the condition of a network-wide road system and provide accurate, up-to-date information for maintenance activity prioritization. The Versatile Onboard Traffic Embedded Roaming Sensors (VOTERS) project provides a framework and the sensing capability to complement periodic localized inspections with continuous network-wide health monitoring. Research focused on the development of a cost-effective, lightweight package of multi-modal sensor systems compatible with this framework. An innovative software infrastructure is created that collects, processes, and evaluates these large time-lapse multi-modal data streams. A GIS-based control center manages multiple inspection vehicles and the data for further analysis, visualization, and decision making. VOTERS' technology can monitor road conditions at both the surface and sub-surface levels while the vehicle is navigating through daily traffic going about its normal business, thereby allowing for network-wide frequent assessment of roadways. This deterioration process monitoring at unprecedented time and spatial scales provides unique experimental data that can be used to improve life-cycle cost analysis models.

  17. Toward the Framework and Implementation for Clearance of Materials from Regulated Facilities

    International Nuclear Information System (INIS)

    Chen, Shih-Yew; Moeller, Dade W.; Dornsife, William P.; Meyer, H Robert; Lamastra, Anthony; Lubenau, Joel O.; Strom, Daniel J.; Yusko, James G.

    2005-01-01

    important disposition option for solid materials, establish the framework and basis of release, and discuss resolutions regarding the implementation of such a disposition option

  18. Toward the framework and implementation for clearance of materials from regulated facilities.

    Science.gov (United States)

    Chen, S Y; Moeller, D W; Dornsife, W P; Meyer, H R; Lamastra, A; Lubenau, J O; Strom, D J; Yusko, J G

    2005-08-01

    clearance as an important disposition option for solid materials, establish the framework and basis of release, and discuss resolutions regarding the implementation of such a disposition option.

  19. Distributed Attention Is Implemented through Theta-Rhythmic Gamma Modulation.

    Science.gov (United States)

    Landau, Ayelet Nina; Schreyer, Helene Marianne; van Pelt, Stan; Fries, Pascal

    2015-08-31

    When subjects monitor a single location, visual target detection depends on the pre-target phase of an ∼8 Hz brain rhythm. When multiple locations are monitored, performance decrements suggest a division of the 8 Hz rhythm over the number of locations, indicating that different locations are sequentially sampled. Indeed, when subjects monitor two locations, performance benefits alternate at a 4 Hz rhythm. These performance alternations were revealed after a reset of attention to one location. Although resets are common and important events for attention, it is unknown whether, in the absence of resets, ongoing attention samples stimuli in alternation. Here, we examined whether spatially specific attentional sampling can be revealed by ongoing pre-target brain rhythms. Visually induced gamma-band activity plays a role in spatial attention. Therefore, we hypothesized that performance on two simultaneously monitored stimuli can be predicted by a 4 Hz modulation of gamma-band activity. Brain rhythms were assessed with magnetoencephalography (MEG) while subjects monitored bilateral grating stimuli for a unilateral target event. The corresponding contralateral gamma-band responses were subtracted from each other to isolate spatially selective, target-related fluctuations. The resulting lateralized gamma-band activity (LGA) showed opposite pre-target 4 Hz phases for detected versus missed targets. The 4 Hz phase of pre-target LGA accounted for a 14.5% modulation in performance. These findings suggest that spatial attention is a theta-rhythmic sampling process that is continuously ongoing, with each sampling cycle being implemented through gamma-band synchrony. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Strategic Framework for Implementing the Potential of Import Substitution on the Example of Railway Engineering

    Directory of Open Access Journals (Sweden)

    Yuliya Georgievna Lavrikova

    2015-07-01

    Full Text Available At present, Russia’s economy is dependent on import in some of its strategically important sectors. The recent economic and political developments such as the aggravation of geopolitical situation and termination of economic partnership between Russia and a number of countries and entities, and also the Government’s policy that aims to reduce import dependence determine the need to expand the interaction between domestic producers and the need to use domestic resources, materials and equipment in economic activities. Import substitution in Russia can become a driving force of its industrial growth. The paper presents different interpretations of the term “import substitution” contained in several publications of recent years; it also reveals a common approach of the authors to this problem. The article summarizes existing proposals on priority areas of import substitution such as the shift towards import-substituting production and technology in strategically important industries. Mechanical engineering is seen as a most important industry in this respect. Russia’s machine-building complex is a highly diversified industry, therefore the policy of import substitution implies that it will be implemented efficiently in various sectors of mechanical engineering on the basis of the differentiated approach, with regard to industry and sectoral specifics. The article considers a strategic framework for the implementation of the import substitution potential on the example of railway engineering. The authors reveal trends in the development of the internal market of railway engineering products; they determine the degree of import dependence for individual sectors of the industry on the basis of statistical data. The article substantiates priorities and possibilities of import substitution in different sectors, and in high-tech sectors of railway engineering. The authors point out a goal of import substitution in these sectors, the goal is to

  1. Research and Implementation of Distributed Database HBase Monitoring System

    Directory of Open Access Journals (Sweden)

    Guo Lisi

    2017-01-01

    Full Text Available With the arrival of the big data age, the distributed database HBase has become an important tool for storing massive amounts of data. The normal operation of the HBase database is an important guarantee of data storage security, so designing a reasonable HBase monitoring system is of great significance in practice. In this article, we introduce a solution, comprising performance monitoring and fault alarm function modules, to meet an operator's demand for HBase database monitoring in their actual production projects. We designed a monitoring system which consists of a flexible and extensible monitoring agent, a monitoring server based on the SSM architecture, and a concise monitoring display layer. Moreover, to deal with the problem that pages render too slowly in actual operation, we present a solution: reducing the number of SQL queries. It has been proved that reducing SQL queries can effectively improve system performance and user experience. The system works well in monitoring the status of the HBase database, flexibly extending the monitoring indices, and issuing a warning when a fault occurs, so that it is able to improve the working efficiency of the administrator and ensure the smooth operation of the project.

  2. Enabling pathways to health equity: developing a framework for implementing social capital in practice.

    Science.gov (United States)

    Putland, Christine; Baum, Fran; Ziersch, Anna; Arthurson, Kathy; Pomagalska, Dorota

    2013-05-29

    Mounting evidence linking aspects of social capital to health and wellbeing outcomes, in particular to reducing health inequities, has led to intense interest in social capital theory within public health in recent decades. As a result, governments internationally are designing interventions to improve health and wellbeing by addressing levels of social capital in communities. The application of theory to practice is uneven, however, reflecting differing views on the pathways between social capital and health, and divergent theories about social capital itself. Unreliable implementation may restrict the potential to contribute to health equity by this means, yet to date there has been limited investigation of how the theory is interpreted at the level of policy and then translated into practice. The paper outlines a collaborative research project designed to address this knowledge deficit in order to inform more effective implementation. Undertaken in partnership with government departments, the study explored the application of social capital theory in programs designed to promote health and wellbeing in Adelaide, South Australia. It comprised three case studies of community-based practice, employing qualitative interviews and focus groups with community participants, practitioners, program managers and policy makers, to examine the ways in which the concept was interpreted and operationalized and identify the factors influencing success. These key lessons informed the development of practical resources comprising a guide for practitioners and briefing for policy makers. Overall the study showed that effective community projects can contribute to population health and wellbeing and reducing health inequities. Of specific relevance to this paper, however, is the finding that community projects rely for their effectiveness on a broader commitment expressed through policies and frameworks at the highest level of government decision making. In particular this

  3. Generic framework for vessel detection and tracking based on distributed marine radar image data

    Science.gov (United States)

    Siegert, Gregor; Hoth, Julian; Banyś, Paweł; Heymann, Frank

    2018-04-01

    Situation awareness is understood as a key requirement for safe and secure shipping at sea. The primary sensor for maritime situation assessment is still the radar, with the AIS being introduced as a supplemental service only. In this article, we present a framework to assess the current situation picture based on marine radar image processing. Essentially, the framework comprises a centralized IMM-JPDA multi-target tracker in combination with a fully automated scheme for track management, i.e., target acquisition and track depletion. This tracker is conditioned on measurements extracted from radar images. To gain a more robust and complete situation picture, we exploit the aspect-angle diversity of multiple marine radars by fusing them prior to the tracking process. Due to the generic structure of the proposed framework, different techniques for radar image processing can be implemented and compared, namely the BLOB detector and SExtractor. The overall framework performance in terms of multi-target state estimation is compared for both methods based on a dedicated measurement campaign in the Baltic Sea with multiple static and mobile targets.
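The record does not give the IMM-JPDA tracker's internals; as a minimal illustration of the predict/update cycle each individual track runs on measurements extracted from radar images, here is a scalar random-walk Kalman filter (the noise variances q and r are assumed values, not from the paper).

```python
def kalman_step(x, p, z, q=1e-3, r=0.1):
    """One predict/update cycle of a scalar random-walk Kalman filter.

    x, p : current state estimate and its variance
    z    : new measurement (e.g., a range extracted from a radar image)
    q, r : process and measurement noise variances (assumed values)
    """
    p = p + q                  # predict: uncertainty grows between scans
    k = p / (p + r)            # Kalman gain weighs estimate vs. measurement
    x = x + k * (z - x)        # update: move toward the measurement
    p = (1.0 - k) * p          # uncertainty shrinks after the update
    return x, p
```

A full IMM-JPDA tracker runs a bank of such filters per target under different motion models and weighs measurement-to-track associations probabilistically; the scalar update above is only the innermost building block.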

  4. Distributed Leadership and Organizational Change: Implementation of a Teaching Performance Measure

    Science.gov (United States)

    Sloan, Tine

    2013-01-01

    This article explores leadership practice and change as evidenced in multiple data sources gathered during a self-study implementation of a teaching performance assessment. It offers promising models of distributed leadership and organizational change that can inform future program implementers and the field in general. Our experiences suggest…

  5. Implementation of an Agent-Based Parallel Tissue Modelling Framework for the Intel MIC Architecture

    Directory of Open Access Journals (Sweden)

    Maciej Cytowski

    2017-01-01

    Full Text Available Timothy is a novel large-scale modelling framework that allows simulation of biological processes involving different cellular colonies growing and interacting with a variable environment. Timothy was designed for execution on massively parallel High Performance Computing (HPC) systems. The high parallel scalability of the implementation allows for simulations of up to 10⁹ individual cells (i.e., simulations at tissue spatial scales of up to 1 cm³ in size). With the recent advancements of the Timothy model, it has become critical to ensure an appropriate performance level on emerging HPC architectures. For instance, the introduction of blood vessels supplying nutrients to the tissue is a very important step towards realistic simulations of complex biological processes, but it greatly increased the computational complexity of the model. In this paper, we describe the process of modernizing the application in order to achieve high computational performance on HPC hybrid systems based on the modern Intel® MIC architecture. Experimental results on the Intel Xeon Phi™ coprocessor x100 and the Intel Xeon Phi processor x200 are presented.

  6. Evaluating the usefulness of dynamic pollutant fate models for implementing the EU Water Framework Directive.

    Science.gov (United States)

    Gevaert, Veerle; Verdonck, Frederik; Benedetti, Lorenzo; De Keyser, Webbey; De Baets, Bernard

    2009-06-01

    The European Water Framework Directive (WFD) aims at achieving a good ecological and chemical status of surface waters in river basins by 2015. The chemical status is considered good if the Environmental Quality Standards (EQSs) are met for all substances listed on the priority list and eight additional specific emerging substances. To check compliance with these standards, the WFD requires the establishment of monitoring programmes. The minimum measuring frequency for priority substances is currently set at once per month. This can result in non-representative sampling and increased probability of misinterpretation of the surface water quality status. To assist in the classification of the water body, the combined use of monitoring data and pollutant fate models is recommended. More specifically, dynamic models are suggested, as possible exceedance of the quality standards can be predicted by such models. In the presented work, four realistic scenarios are designed and discussed to illustrate the usefulness of dynamic pollutant fate models for implementing the WFD. They comprise a combination of two priority substances and two rivers, representative for Western Europe.

  7. Developing A Framework for Low-Volume Road Implementation of Pervious Concrete Pavements

    Directory of Open Access Journals (Sweden)

    Sonia Rahman, BSc

    2015-03-01

    Full Text Available Pervious concrete pavement is one of the promising pavement technologies, as it can help overcome traditional pavement environmental impacts, assist with stormwater management, and provide an effective low impact development solution. There are many benefits associated with pervious concrete pavement, such as assisting with water filtration, absorbing heavy metals and reducing pollution. The most significant aspect, which draws the attention of environmental agencies, cities and municipalities, is its ability to reduce storm water runoff. Pervious concrete is documented as the paramount solution in storm water management by the United States Environmental Protection Agency. Though it has been used in the southern United States for years, the practice of using pervious concrete is more recent in northern climates where freeze-thaw is observed. In Canada, several pervious concrete parking lots have been constructed over the past few years. However, barriers exist for implementing the technology, as designers are not always fully informed on the various functional and structural design considerations. In this paper, a framework is developed to identify how pervious concrete can be integrated into low-volume infrastructure. This paper also summarizes the structural performance and drainage characteristics of pervious concrete parking lots constructed in various provinces of Canada, demonstrating the viability of pervious concrete for low-volume northern applications.

  8. Democratic governance and political rationalities in the implementation of the water framework directive in the Netherlands

    NARCIS (Netherlands)

    Behagel, J.H.; Arts, B.J.M.

    2014-01-01

    Multi-level governance, network governance, and, more recently, experimentalist governance are important analytical frameworks through which to understand democratic governance in the EU. However, these analytical frameworks carry normative assumptions that build on functionalist roots and

  9. Democratic governance and political rationalities in the implementation of the water framework directive in the Netherlands

    NARCIS (Netherlands)

    Behagel, J.H.; Arts, B.

    2013-01-01

    Multi-level governance, network governance, and, more recently, experimentalist governance are important analytical frameworks through which to understand democratic governance in the EU. However, these analytical frameworks carry normative assumptions that build on functionalist roots and

  10. A framework for evaluating distributed control systems in nuclear power plants

    International Nuclear Information System (INIS)

    O'Donell, C.; Jiang, J.

    2004-01-01

    A framework for evaluating the use of distributed control systems (DCS) in nuclear power plants (NPP) is proposed in this paper. The framework consists of advanced communication, control, hardware and software technology. This paper presents the results of an experiment using the framework test-bench, and elaborates on a variety of other research possibilities. Using a hardware-in-the-loop (HIL) system, a DeltaV M3 controller from Emerson Process is connected to a desktop NPP simulator. The industry-standard communication protocol, Modbus, has been selected in this study. A simplified boiler pressure control (BPC) module is created on the NPP simulator. The test-bench provides an interface between the controller and the simulator. Through software monitoring, the performance of the DCS can be evaluated. Controller access and response times over the Modbus network are observed and compared with theoretical values. The controller accomplishes its task under the specifications set out for the BPC. This novel framework allows a performance metric to be applied against different industrial controllers. (author)
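As an aside on the Modbus protocol used in this test-bench, a read-holding-registers request over Modbus TCP is a small fixed-layout frame; the sketch below is a generic illustration of that layout (MBAP header plus PDU), not the authors' test-bench code.

```python
import struct

def modbus_read_request(txn_id: int, unit_id: int,
                        start_addr: int, quantity: int) -> bytes:
    """Build a Modbus TCP 'read holding registers' request frame.

    PDU  = function code 0x03 + start address + register count.
    MBAP = transaction id, protocol id (0 for Modbus), remaining
           byte count (unit id + PDU), unit id.
    All fields are big-endian per the Modbus specification.
    """
    pdu = struct.pack(">BHH", 0x03, start_addr, quantity)
    mbap = struct.pack(">HHHB", txn_id, 0, len(pdu) + 1, unit_id)
    return mbap + pdu
```

For example, `modbus_read_request(1, 1, 0, 2)` yields the 12-byte frame `000100000006010300000002` in hex, which asks unit 1 for two registers starting at address 0; measuring the round-trip time of such requests is one way to obtain the access and response times discussed above.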

  11. A Bayesian Framework for Estimating the Concordance Correlation Coefficient Using Skew-elliptical Distributions.

    Science.gov (United States)

    Feng, Dai; Baumgartner, Richard; Svetnik, Vladimir

    2018-04-05

    The concordance correlation coefficient (CCC) is a widely used scaled index in the study of agreement. In this article, we propose estimating the CCC by a unified Bayesian framework that can (1) accommodate symmetric or asymmetric and light- or heavy-tailed data; (2) select model from several candidates; and (3) address other issues frequently encountered in practice such as confounding covariates and missing data. The performance of the proposal was studied and demonstrated using simulated as well as real-life biomarker data from a clinical study of an insomnia drug. The implementation of the proposal is accessible through a package in the Comprehensive R Archive Network.
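For reference, the classical sample version of the CCC (Lin's estimator), which the Bayesian framework above generalizes, can be computed directly; the sketch below is illustrative and is not the paper's Bayesian implementation.

```python
def ccc(x, y):
    """Lin's sample concordance correlation coefficient.

    CCC = 2*s_xy / (s_x^2 + s_y^2 + (mean_x - mean_y)^2),
    i.e., correlation penalized for location and scale shifts.
    """
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sx = sum((v - mx) ** 2 for v in x) / n
    sy = sum((v - my) ** 2 for v in y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return 2 * sxy / (sx + sy + (mx - my) ** 2)
```

Unlike Pearson's correlation, the CCC drops below 1 when one rater is systematically shifted or scaled relative to the other, which is why it is preferred for agreement studies such as the biomarker example above.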

  12. The Climate Change Education Evidence Base: Lessons Learned from NOAA's Monitoring and Evaluation Framework Implementation

    Science.gov (United States)

    Baek, J.

    2012-12-01

    effort has provided some shared understanding and general guidance, there is still a lack of guidance to make decisions at any level of the community. A recent memorandum from the Office of Management and Budget provides more specific guidance around the generation and utilization of evidence. For example, the amount of funding awarded through grants should be weighted by the level of the evidence supporting a proposed project. As the field of climate change education establishes an evidence base, study designs should address a greater number of internal validity threats through comparison groups and reliable common measures. In addition, OMB invites agencies to develop systematic measurement of costs and costs per outcome. A growing evidence base, one that includes data that includes costs and even monetizes benefits, can inform decisions based on the strongest returns on investments within a portfolio. This paper will provide examples from NOAA's Monitoring and Evaluation Framework Implementation project that illustrate how NOAA is facing these challenges. This is intended to inform climate change educators, evaluators, and researchers in ways to integrate evaluation into the management of their programs while providing insight across the portfolio.

  13. 76 FR 22944 - Pipeline Safety: Notice of Public Webinars on Implementation of Distribution Integrity Management...

    Science.gov (United States)

    2011-04-25

    ... oversight program and operating conditions as well as the evolutionary process that distribution system... 20590. Hand Delivery: Docket Management System, Room W12-140, on the ground floor of the West Building... PHMSA-2011-0084] Pipeline Safety: Notice of Public Webinars on Implementation of Distribution Integrity...

  14. Implementing and Investigating Distributed Leadership in a National University Network--SaMnet

    Science.gov (United States)

    Sharma, Manjula D.; Rifkin, Will; Tzioumis, Vicky; Hill, Matthew; Johnson, Elizabeth; Varsavsky, Cristina; Jones, Susan; Beames, Stephanie; Crampton, Andrea; Zadnik, Marjan; Pyke, Simon

    2017-01-01

    The literature suggests that collaborative approaches to leadership, such as distributed leadership, are essential for supporting educational innovators in leading change in teaching in universities. This paper briefly describes the array of activities, processes and resources to support distributed leadership in the implementation of a network,…

  15. IMPLEMENTATION OF MULTIAGENT REINFORCEMENT LEARNING MECHANISM FOR OPTIMAL ISLANDING OPERATION OF DISTRIBUTION NETWORK

    DEFF Research Database (Denmark)

    Saleem, Arshad; Lind, Morten

    2008-01-01

    among electric power utilities to utilize modern information and communication technologies (ICT) in order to improve the automation of the distribution system. In this paper we present our work for the implementation of a dynamic multi-agent based distributed reinforcement learning mechanism...

  16. Efficient LIDAR Point Cloud Data Managing and Processing in a Hadoop-Based Distributed Framework

    Science.gov (United States)

    Wang, C.; Hu, F.; Sha, D.; Han, X.

    2017-10-01

    Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering and other fields. However, it is challenging to efficiently store, query and analyze high-resolution 3D LiDAR data due to its volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, taking advantage of Hadoop's storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to conduct the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experiment results show that the proposed framework can efficiently manage and process big LiDAR data.
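The record does not show the MapReduce jobs themselves; as a hedged, Hadoop-free illustration of the pattern, the sketch below bins LiDAR points into grid cells in a map phase and aggregates per-cell statistics in a reduce phase (the cell size and the chosen statistics are illustrative, not from the paper).

```python
from collections import defaultdict

def map_phase(points, cell=10.0):
    """Map: emit (grid-cell key, elevation) for each (x, y, z) point."""
    for x, y, z in points:
        yield (int(x // cell), int(y // cell)), z

def reduce_phase(pairs):
    """Reduce: aggregate per-cell point count and mean elevation."""
    acc = defaultdict(lambda: [0, 0.0])
    for key, z in pairs:
        acc[key][0] += 1
        acc[key][1] += z
    return {k: (n, s / n) for k, (n, s) in acc.items()}
```

In an actual Hadoop deployment, the map and reduce functions run on many nodes over HDFS blocks and the framework handles the shuffle between the two phases; the sequential version above preserves only the data-flow shape.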

  17. Progress towards and barriers to implementation of a risk framework for US federal wildland fire policy and decision making

    Science.gov (United States)

    David C. Calkin; Mark A. Finney; Alan A. Ager; Matthew P. Thompson; Krista M. Gebert

    2011-01-01

    In this paper we review progress towards the implementation of a risk management framework for US federal wildland fire policy and operations. We first describe new developments in wildfire simulation technology that catalyzed the development of risk-based decision support systems for strategic wildfire management. These systems include new analytical methods to measure...

  18. Design and implementation of the reconstruction software for the photon multiplicity detector in object oriented programming framework

    International Nuclear Information System (INIS)

    Chattopadhayay, Subhasis; Ghosh, Premomoy; Gupta, R.; Mishra, D.; Phatak, S.C.; Sood, G.

    2002-01-01

    The high-granularity photon multiplicity detector (PMD) is scheduled to take data at the Relativistic Heavy Ion Collider (RHIC) this year. A detailed scheme has been designed and implemented in an object-oriented programming framework using C++ for the monitoring and reconstruction of PMD data.

  19. Assessing the risk of impact of farming intensification on calcareous grasslands in Europe: a quantitative implementation of the MIRABEL framework

    NARCIS (Netherlands)

    Petit, S.; Elbersen, B.S.

    2006-01-01

    Intensification of farming practices is still a major driver of biodiversity loss in Europe, despite the implementation of policies that aim to reverse this trend. A conceptual framework called MIRABEL was previously developed that enabled a qualitative and expert-based assessment of the impact of

  20. A Multi-Functional Fully Distributed Control Framework for AC Microgrids

    DEFF Research Database (Denmark)

    Shafiee, Qobad; Nasirian, Vahidreza; Quintero, Juan Carlos Vasquez

    2018-01-01

    This paper proposes a fully distributed control methodology for secondary control of AC microgrids. The control framework includes three modules: voltage regulator, reactive power regulator, and active power/frequency regulator. The voltage regulator module maintains the average voltage of the microgrid distribution line at the rated value. The reactive power regulator compares the local normalized reactive power of an inverter with its neighbors' powers on a communication graph and, accordingly, fine-tunes Q-V droop coefficients to mitigate any reactive power mismatch. Collectively, these two … /reactive power sharing. An AC microgrid is prototyped to experimentally validate the proposed control methodology against load change, plug-and-play operation, and communication constraints such as delay, packet loss, and limited bandwidth.
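Estimating an average quantity such as voltage across inverters that only talk to neighbors on a communication graph is typically done with a consensus iteration; the sketch below is a generic discrete-time consensus update, not the authors' controller (the gain `eps` and the example graph are illustrative).

```python
def consensus_step(values, neighbors, eps=0.2):
    """One synchronous consensus iteration on a communication graph.

    Each node nudges its value toward its neighbors' values; on a
    connected undirected graph all values converge to the network
    average, which is preserved at every step.
    """
    return [v + eps * sum(values[j] - v for j in neighbors[i])
            for i, v in enumerate(values)]
```

Stability requires `eps` times the maximum node degree to stay below 1; in a secondary controller, each inverter would run such an update on locally measured voltage and feed the converged average back into its droop set-point.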

  1. Implementing the water framework directive in Denmark - Lessons on agricultural measures from a legal and regulatory perspective

    DEFF Research Database (Denmark)

    Jacobsen, Brian H.; Anker, Helle Tegner; Baaner, Lasse

    2017-01-01

    One of the major challenges in the implementation of the Water Framework Directive (WFD) is how to address diffuse agricultural pollution of the aquatic environment. In Denmark, the implementation of agricultural measures has been fraught with difficulty, in the form of delays, legal proceedings and a policy failure. It is argued that the adoption of more flexible measures, implemented at the local level, could have resulted in fewer difficulties from an economic and legal point of view, as measures could have been applied where there was a clear environmental benefit, and possibly also at a lower cost.

  2. The SCEC Unified Community Velocity Model (UCVM) Software Framework for Distributing and Querying Seismic Velocity Models

    Science.gov (United States)

    Maechling, P. J.; Taborda, R.; Callaghan, S.; Shaw, J. H.; Plesch, A.; Olsen, K. B.; Jordan, T. H.; Goulet, C. A.

    2017-12-01

    Crustal seismic velocity models and datasets play a key role in regional three-dimensional numerical earthquake ground-motion simulation, full waveform tomography, modern physics-based probabilistic earthquake hazard analysis, as well as in other related fields including geophysics, seismology, and earthquake engineering. The standard material properties provided by a seismic velocity model are P- and S-wave velocities and density for any arbitrary point within the geographic volume for which the model is defined. Many seismic velocity models and datasets are constructed by synthesizing information from multiple sources and the resulting models are delivered to users in multiple file formats, such as text files, binary files, HDF-5 files, structured and unstructured grids, and through computer applications that allow for interactive querying of material properties. The Southern California Earthquake Center (SCEC) has developed the Unified Community Velocity Model (UCVM) software framework to facilitate the registration and distribution of existing and future seismic velocity models to the SCEC community. The UCVM software framework is designed to provide a standard query interface to multiple, alternative velocity models, even if the underlying velocity models are defined in different formats or use different geographic projections. The UCVM framework provides a comprehensive set of open-source tools for querying seismic velocity model properties, combining regional 3D models and 1D background models, visualizing 3D models, and generating computational models in the form of regular grids or unstructured meshes that can be used as inputs for ground-motion simulations. The UCVM framework helps researchers compare seismic velocity models and build equivalent simulation meshes from alternative velocity models. These capabilities enable researchers to evaluate the impact of alternative velocity models in ground-motion simulations and seismic hazard analysis applications.
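    A hedged sketch of the kind of standard query interface the framework describes. This is not the actual UCVM API; model names, the coverage test, and the material-property values are all illustrative:

```python
# Sketch: dispatch a material-property lookup to whichever registered
# regional velocity model covers the requested point, falling back to a
# 1D background model. Not the real UCVM API; values are illustrative.

class VelocityModel:
    def __init__(self, name, covers, query):
        self.name = name
        self.covers = covers   # (lon, lat, depth) -> bool
        self.query = query     # (lon, lat, depth) -> {"vp", "vs", "rho"}

def make_unified_query(regional_models, background):
    """Return one query function spanning several underlying models."""
    def unified(lon, lat, depth):
        for m in regional_models:
            if m.covers(lon, lat, depth):
                return dict(m.query(lon, lat, depth), model=m.name)
        return dict(background.query(lon, lat, depth), model=background.name)
    return unified

# Toy regional 3D model over a small box, plus a depth-dependent 1D background
box = VelocityModel(
    "toy3d",
    covers=lambda lon, lat, d: -119 <= lon <= -117 and 33 <= lat <= 35,
    query=lambda lon, lat, d: {"vp": 5000.0, "vs": 2886.0, "rho": 2700.0},
)
bkg = VelocityModel(
    "1d-background",
    covers=lambda lon, lat, d: True,
    query=lambda lon, lat, d: {"vp": 6000.0 + 0.5 * d, "vs": 3464.0,
                               "rho": 2800.0},
)
query = make_unified_query([box], bkg)
inside = query(-118.0, 34.0, 1000.0)   # served by the regional model
outside = query(-121.0, 36.0, 1000.0)  # falls back to the background model
```

    The point of the indirection is that simulation codes see one uniform interface regardless of how each underlying model is stored or projected.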

  3. TMD PDFs. A Monte Carlo implementation for the sea quark distribution

    International Nuclear Information System (INIS)

    Hautmann, F.

    2012-05-01

    This article gives an introduction to transverse momentum dependent (TMD) parton distribution functions and their use in shower Monte Carlo event generators for high-energy hadron collisions, and describes recent progress in the treatment of sea quark effects within a TMD parton-shower framework.

  4. The development of an implementation framework for service-learning during the undergraduate nursing programme in the Western Cape Province

    Directory of Open Access Journals (Sweden)

    Hester Julie

    2015-11-01

    Full Text Available Background: Service-learning (SL) is a contested field of knowledge, and issues of sustainability and scholarship have been raised about it. The South African Higher Education Quality Committee (HEQC) has provided policy documents to guide higher education institutions (HEIs) in the facilitation of SL institutionalisation in their academic programmes. An implementation framework was therefore needed to institutionalise the necessary epistemological shifts advocated in the national SL policy guidelines. Objectives: This article is based on the findings of a doctoral thesis that aimed at developing an SL implementation framework for the School of Nursing (SoN) at the University of the Western Cape (UWC). Method: Mixed methods were used during the first four phases of the design and development intervention research model developed by Rothman and Thomas. Results: The SL implementation framework that was developed during Phase 3 specified the intervention elements to address the gaps that had been identified by the core findings of Phases 1 and 2. Four intervention elements were specified for the SL implementation framework. The first intervention element focused on the assessment of readiness for SL institutionalisation. The development of SL capacity and SL scholarship was regarded as the pivotal intervention element for the three remaining elements: the development of a contextual SL definition, an SL pedagogical model, and a monitoring and evaluation system for SL institutionalisation. Conclusion: The SL implementation framework satisfies the goals of SL institutionalisation, namely to develop a common language and a set of principles to guide practice, and to ensure the allocation of resources in order to facilitate the SL teaching methodology. The contextualised SL definition that was formulated for the SoN contributes to the SL operationalisation discourse at the HEI.

  5. Implementation of evidence into practice for cancer-related fatigue management of hospitalized adult patients using the PARIHS framework.

    Directory of Open Access Journals (Sweden)

    Li Tian

    Full Text Available This study aimed to explore an evidence-based nursing practice model of CRF management in hospitalized adult patients using the PARIHS evidence-implementation framework as the theoretical structure to provide guidance for similar nursing practices. The implementation of guideline evidence into clinical practice was conducted on the oncology and radiotherapy wards of a university-affiliated hospital. The process of integrating the guideline into the symptom management system of cancer patients was described. The impact of the evidence implementation was evaluated from three aspects: organizational innovations and outcome measures associated with nurses and with patients pre- and post-evidence implementation. During the implementation of evidence into practice on the wards, a nursing process, health education, a quality control sheet and CRF training courses were established. Through this implementation, compliance with evidence related to CRF increased significantly on the two wards, with that of ward B being higher than that of ward A. Regarding nursing outcomes, nursing knowledge, attitude and behavior scores with respect to CRF nursing care increased substantially after its application on the two wards, and the ward B nurses' scoring was higher than that of the ward A nurses. Qualitative analysis concerning the nurses suggested that leadership, patient concern about CRF management, and the need for professional development were the main motivators of the application, whereas the shortage and mobility of nursing human resources and insufficient communication between doctors and nurses were the main barriers. Additionally, most nurses felt more professional and confident about their work. Regarding patient outcomes, patient knowledge, attitude and behavior scores regarding CRF self-management increased significantly. Patients' post-implementation CRF was alleviated compared with the pre-implementation treatment cycle. The PARIHS framework may

  6. The implementation of a global fund grant in Lesotho: applying a framework on knowledge absorptive capacity.

    Science.gov (United States)

    Biesma, Regien; Makoa, Elsie; Mpemi, Regina; Tsekoa, Lineo; Odonkor, Philip; Brugha, Ruairi

    2012-02-01

    One of the biggest challenges in scaling up health interventions in sub-Saharan Africa for government recipients is to effectively manage the rapid influx of aid from different donors, each with its own requirements and conditions. However, there is little empirical evidence on how governments absorb knowledge from new donors in order to satisfy their requirements. This case study applies Cuellar and Gallivan's (2006) framework on knowledge absorptive capacity (AC) to illustrate how recipient government organisations in Lesotho identified, assimilated and utilised knowledge on how to meet the disbursement and reporting requirements of Lesotho's Round 5 grant from the Global Fund to Fight AIDS, TB and Malaria (Global Fund). In-depth topic guided interviews with 22 respondents and document reviews were conducted between July 2008 and February 2009. Analysis focused on six organisational determinants that affect an organisation's absorptive capacity: prior-related knowledge, combinative capabilities, motivation, organisational structure, cultural match, and communication channels. Absorptive capacity was mostly evident at the level of the Principal Recipient, the Ministry of Finance, who established a new organisational unit to meet the requirements of Global Fund Grants, while the level of AC was less advanced among the Ministry of Health (Sub-Recipient) and district level implementers. Recipient organisations can increase their absorptive capacity, not only through prior knowledge of donor requirements, but also by deliberately changing their organisational form and through combinative capabilities. The study also revealed how vulnerable African governments are to loss of staff capacity. The application of organisational theory to analyse the interactions of donor agencies with public and non-public country stakeholders illustrates the complexity of the environment that aid recipient governments have to manage. Copyright © 2011 Elsevier Ltd. All rights reserved.

  7. A framework for the assessment of the global potential of joint implementation

    International Nuclear Information System (INIS)

    Bollen, J.C.; Van Minnen, J.G.; Toet, A.M.C.; Kuik, O.J.; Bennis, M.

    1995-09-01

    Joint Implementation (JI) is a means for reaching cheaper solutions to CO2-emission reductions. The global potential for JI is defined as the portion of the necessary regional CO2-emission reductions to meet the goals defined for a target year that can be more cost-effectively implemented in other regions. The goals will come from multilateral negotiations and may be the starting point of any bilateral negotiation process concerning JI agreements. The more the goals take cost-effectiveness criteria into account, the less scope there will be for additional JI agreements. In cases where the goals are stricter, the globally required emission reductions will be larger, and consequently larger emission reductions could be achieved in other regions. However, as compared to the other cases presented in the report, the potential for JI will be lower, since the potential is defined as a fraction of the necessary global emission reduction. JI could reduce the total costs of CO2-emission reduction by more than 75% compared to the situation without JI, depending on the initial distribution of CO2 targets, the target year and the scenario assumptions. For donor countries the cost reductions could be more than 50%, even when it is assumed that the subsidies for emission reduction measures in receptor countries are 50% higher than the actual costs. In donor countries the resulting domestic CO2-emission reductions after JI could be more than 50% less than the original goals before JI. In comparison to other OECD regions (Japan, Oceania, and the United States), Western Europe's gains from JI are smaller. In other words, in a world market for JI projects, Western Europe could face strong competition. 7 figs., 25 tabs., 37 refs., 1 appendix

  8. Effect of implementing the 5As of obesity management framework on provider-patient interactions in primary care.

    Science.gov (United States)

    Rueda-Clausen, C F; Benterud, E; Bond, T; Olszowka, R; Vallis, M T; Sharma, A M

    2014-02-01

    Obesity counselling in primary care is positively associated with self-reported behaviour change in patients with obesity. Obesity counselling is rare, and when it does occur, it is often of low quality because of poor training and/or competency of providers in obesity management, lack of time and economic disincentives, and negative attitudes towards obesity and obesity management. 5As frameworks are routinely used for behaviour-change counselling and addiction management (e.g. smoking cessation), but few studies have examined their efficacy for weight management. This study presents pilot data from the implementation and evaluation of an obesity management tool (5As of Obesity Management, developed by the Canadian Obesity Network) in a primary care setting. Results show that the tool facilitates weight management in primary care by promoting physician-patient communication, medical assessments for obesity and plans for follow-up care. Obesity remains poorly managed in primary care. The 5As of Obesity Management is a theory-driven, evidence-based minimal intervention designed to facilitate obesity counselling and management by primary care practitioners. This project tested the impact of implementing this tool in primary care clinics. Electronic self-administered surveys were completed by pre-screened obese subjects at the end of their appointments in four primary care clinics (over 25 healthcare providers [HCPs]). These measurements were performed before (baseline, n = 51) and 1 month after implementing the 5As of Obesity Management (post-intervention, n = 51). The intervention consisted of one online training session (90 min) and distribution of the 5As toolkit to HCPs of participating clinics. Subjects completing the survey before and after the intervention were comparable in terms of age, sex, body mass index, comorbidities, satisfaction and self-reported health status (P > 0.2). Implementing the 5As of Obesity Management resulted in a twofold increase

  9. Implementation of a Wireless Time Distribution Testbed Protected with Quantum Key Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Bonior, Jason D [ORNL; Evans, Philip G [ORNL; Sheets, Gregory S [ORNL; Jones, John P [ORNL; Flynn, Toby H [ORNL; O' Neil, Lori Ross [Pacific Northwest National Laboratory (PNNL); Hutton, William [Pacific Northwest National Laboratory (PNNL); Pratt, Richard [Pacific Northwest National Laboratory (PNNL); Carroll, Thomas E. [Pacific Northwest National Laboratory (PNNL)

    2017-01-01

    Secure time transfer is critical for many time-sensitive applications. The Global Positioning System (GPS), which is often used for this purpose, has been shown to be susceptible to spoofing attacks. Quantum key distribution (QKD) offers a way to securely generate encryption keys at two locations. Through careful use of this information, it is possible to create a system that is more resistant to spoofing attacks. In this paper we describe our work to create a testbed which utilizes QKD and traditional RF links. This testbed will be used for the development of more secure and spoofing-resistant time distribution protocols.
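    One hedged way to picture how a QKD-derived key can harden time transfer: authenticate each timestamp with an HMAC, so packets forged without the shared key are rejected. The packet layout and key handling below are illustrative, not the testbed's actual protocol:

```python
import hmac
import hashlib
import struct

def sign_timestamp(key: bytes, t_ns: int) -> bytes:
    """Append an HMAC-SHA256 tag to a big-endian 64-bit nanosecond timestamp."""
    msg = struct.pack(">Q", t_ns)
    return msg + hmac.new(key, msg, hashlib.sha256).digest()

def verify_timestamp(key: bytes, packet: bytes):
    """Return the timestamp if the tag checks out, else None."""
    msg, tag = packet[:8], packet[8:]
    if not hmac.compare_digest(tag, hmac.new(key, msg, hashlib.sha256).digest()):
        return None  # spoofed or corrupted packet is rejected
    return struct.unpack(">Q", msg)[0]

shared_key = b"\x02" * 32  # stand-in for a key negotiated over the QKD link
pkt = sign_timestamp(shared_key, 1_700_000_000_000_000_000)
ok = verify_timestamp(shared_key, pkt)       # recovers the timestamp
spoof = verify_timestamp(b"\x03" * 32, pkt)  # wrong key -> None, rejected
```

    A real protocol would also need replay protection (e.g. sequence numbers) and periodic key rotation from the QKD layer; those details are omitted here.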

  10. Measuring implementation behaviour of menu guidelines in the childcare setting: confirmatory factor analysis of a theoretical domains framework questionnaire (TDFQ).

    Science.gov (United States)

    Seward, Kirsty; Wolfenden, Luke; Wiggers, John; Finch, Meghan; Wyse, Rebecca; Oldmeadow, Christopher; Presseau, Justin; Clinton-McHarg, Tara; Yoong, Sze Lin

    2017-04-04

    While there are a number of frameworks that focus on supporting the implementation of evidence-based approaches, few psychometrically valid measures exist to assess constructs within these frameworks. This study aimed to develop and psychometrically assess a scale measuring each domain of the Theoretical Domains Framework for use in assessing the implementation of dietary guidelines within a non-health-care setting (childcare services). A 75-item, 14-domain Theoretical Domains Framework Questionnaire (TDFQ) was developed and administered via telephone interview to 202 centre-based childcare service cooks who had a role in planning the service menu. Confirmatory factor analysis (CFA) was undertaken to assess the reliability, discriminant validity and goodness of fit of the 14-domain theoretical domains framework measure. For the CFA, five iterative processes of adjustment were undertaken in which 14 items were removed, resulting in a final measure consisting of 14 domains and 61 items. For the final measure, the chi-square goodness-of-fit statistic was 3447.19; the Standardized Root Mean Square Residual (SRMR) was 0.070; the Root Mean Square Error of Approximation (RMSEA) was 0.072; and the Comparative Fit Index (CFI) had a value of 0.78. While only one of the three indices supports goodness of fit of the measurement model tested, the 14-domain model with 61 items showed good discriminant validity and internally consistent items. Future research should aim to assess the psychometric properties of the developed TDFQ in other community-based settings.
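    The reported fit statistics can be checked against commonly cited cutoffs. The thresholds below are conventional rules of thumb and vary by author; they are illustrative, not the paper's own criteria:

```python
# Illustrative cutoffs: SRMR <= .08, RMSEA <= .06, CFI >= .90
def assess_fit(srmr, rmsea, cfi):
    return {"SRMR ok": srmr <= 0.08,
            "RMSEA ok": rmsea <= 0.06,
            "CFI ok": cfi >= 0.90}

# Values reported for the final 61-item, 14-domain model
checks = assess_fit(srmr=0.070, rmsea=0.072, cfi=0.78)
# Only SRMR meets its cutoff, consistent with the paper's conclusion that
# just one of the three indices supports goodness of fit
```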

  11. Toward enhancing the distributed video coder under a multiview video codec framework

    Science.gov (United States)

    Lee, Shih-Chieh; Chen, Jiann-Jone; Tsai, Yao-Hong; Chen, Chin-Hua

    2016-11-01

    The advance of video coding technology enables multiview video (MVV) or three-dimensional television (3-D TV) display for users with or without glasses. For mobile devices or wireless applications, a distributed video coder (DVC) can be utilized to shift the encoder complexity to the decoder under the MVV coding framework, denoted as multiview distributed video coding (MDVC). We proposed to exploit both inter- and intraview video correlations to enhance side information (SI) and improve the MDVC performance: (1) based on the multiview motion estimation (MVME) framework, a categorized block matching prediction with fidelity weights (COMPETE) was proposed to yield a high-quality SI frame for better DVC reconstructed images; (2) the block transform coefficient properties, i.e., DCs and ACs, were exploited to design priority rate control for the turbo code, such that DVC decoding can be carried out with the fewest parity bits. In comparison, the proposed COMPETE method demonstrated lower time complexity while presenting better reconstructed video quality. Simulations show that the proposed COMPETE can reduce the time complexity of MVME by a factor of 1.29 to 2.56 compared to previous hybrid MVME methods, while the peak signal-to-noise ratios (PSNRs) of decoded video can be improved by 0.2 to 3.5 dB compared to H.264/AVC intracoding.

  12. Integrating a Trust Framework with a Distributed Certificate Validation Scheme for MANETs

    Directory of Open Access Journals (Sweden)

    Marias Giannis F

    2006-01-01

    Full Text Available Many trust establishment solutions in mobile ad hoc networks (MANETs) rely on public key certificates. Therefore, they should be accompanied by an efficient mechanism for certificate revocation and validation. Ad hoc distributed OCSP for trust (ADOPT) is a lightweight, distributed, on-demand scheme based on cached OCSP responses, which provides certificate status information to the nodes of a MANET. In this paper we discuss the ADOPT scheme and issues concerning its deployment over MANETs. We present some possible threats to ADOPT and suggest the use of a trust assessment and establishment framework, named the ad hoc trust framework (ATF), to support ADOPT's robustness and efficiency. ADOPT is deployed as a trust-aware application that provides feedback to ATF, which calculates the trustworthiness of the peer nodes' functions and helps ADOPT improve its performance by rapidly locating valid certificate status information. Moreover, we introduce the TrustSpan algorithm to reduce the overhead that ATF produces, and the TrustPath algorithm to identify and use trusted routes for propagating sensitive information, such as third parties' accusations. Simulation results show that ATF adds limited overhead compared to its efficiency in detecting and isolating malicious and selfish nodes. ADOPT's reliability is increased, since it can rapidly locate a legitimate response by using information provided by ATF.

  13. Intercomparison of Streamflow Simulations between WRF-Hydro and Hydrology Laboratory-Research Distributed Hydrologic Model Frameworks

    Science.gov (United States)

    KIM, J.; Smith, M. B.; Koren, V.; Salas, F.; Cui, Z.; Johnson, D.

    2017-12-01

    The National Oceanic and Atmospheric Administration (NOAA)-National Weather Service (NWS) developed the Hydrology Laboratory-Research Distributed Hydrologic Model (HL-RDHM) framework as an initial step towards spatially distributed modeling at River Forecast Centers (RFCs). Recently, the NOAA/NWS worked with the National Center for Atmospheric Research (NCAR) to implement the National Water Model (NWM) for nationally consistent water resources prediction. The NWM is based on the WRF-Hydro framework and is run at a 1-km spatial resolution and 1-hour time step over the contiguous United States (CONUS) and contributing areas in Canada and Mexico. In this study, we compare streamflow simulations from HL-RDHM and WRF-Hydro to observations from 279 USGS stations. For streamflow simulations, HL-RDHM is run on 4-km grids with a temporal resolution of 1 hour for a 5-year period (Water Years 2008-2012), using a priori parameters provided by NOAA-NWS. The WRF-Hydro streamflow simulations for the same time period are extracted from NCAR's retrospective run of the NWM (version 1.0) over CONUS, based on 1-km grids. We choose 279 USGS stations that are relatively less affected by dams or reservoirs, in the domains of six different RFCs. We use daily average values of simulations and observations for ease of comparison. The main purpose of this research is to evaluate how HL-RDHM and WRF-Hydro perform at USGS gauge stations. We compare daily time series of observations and both simulations, and calculate error values using a variety of error functions. Using these plots and error values, we evaluate the performance of the HL-RDHM and WRF-Hydro models. Our results show a mix of model performance across geographic regions.

  14. A framework for stochastic simulation of distribution practices for hotel reservations

    Energy Technology Data Exchange (ETDEWEB)

    Halkos, George E.; Tsilika, Kyriaki D. [Laboratory of Operations Research, Department of Economics, University of Thessaly, Korai 43, 38 333, Volos (Greece)

    2015-03-10

    The focus of this study is primarily on the Greek hotel industry. The objective is to design and develop a framework for stochastic simulation of reservation requests, reservation arrivals, cancellations and hotel occupancy with a planning horizon of a tourist season. In the Greek hospitality industry there were two competing policies for the reservation planning process up to 2003: reservations coming directly from customers, and reservations management relying on tour operator(s). Recently the Internet, along with other emerging technologies, has offered the potential to disrupt enduring distribution arrangements. The focus of the study is on the choice of distribution intermediaries. We present an empirical model for the hotel reservation planning process that makes use of symbolic simulation and the Monte Carlo method, as requests for reservations, cancellations, and arrival rates are all sources of uncertainty. We consider as a case study the problem of determining the optimal booking strategy for a medium-size hotel on Skiathos Island, Greece. Probability distributions and parameter estimation result from the historical data available and from suggestions made in the relevant literature. The results of this study may assist hotel managers in defining distribution strategies for hotel rooms and evaluating the performance of the reservations management system.

  15. A framework for stochastic simulation of distribution practices for hotel reservations

    International Nuclear Information System (INIS)

    Halkos, George E.; Tsilika, Kyriaki D.

    2015-01-01

    The focus of this study is primarily on the Greek hotel industry. The objective is to design and develop a framework for stochastic simulation of reservation requests, reservation arrivals, cancellations and hotel occupancy with a planning horizon of a tourist season. In the Greek hospitality industry there were two competing policies for the reservation planning process up to 2003: reservations coming directly from customers, and reservations management relying on tour operator(s). Recently the Internet, along with other emerging technologies, has offered the potential to disrupt enduring distribution arrangements. The focus of the study is on the choice of distribution intermediaries. We present an empirical model for the hotel reservation planning process that makes use of symbolic simulation and the Monte Carlo method, as requests for reservations, cancellations, and arrival rates are all sources of uncertainty. We consider as a case study the problem of determining the optimal booking strategy for a medium-size hotel on Skiathos Island, Greece. Probability distributions and parameter estimation result from the historical data available and from suggestions made in the relevant literature. The results of this study may assist hotel managers in defining distribution strategies for hotel rooms and evaluating the performance of the reservations management system.
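    A minimal Monte Carlo sketch in the spirit of the simulation described above, with assumed Poisson request arrivals and Bernoulli cancellations. The rates, capacity, and season length are illustrative, not the paper's estimates from historical data:

```python
import math
import random

def poisson(rng, lam):
    """Knuth's method for sampling a Poisson-distributed count."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def simulate_season(days=120, capacity=60, request_rate=50.0,
                    cancel_prob=0.1, seed=0):
    """One stochastic season: daily requests, cancellations, capped occupancy."""
    rng = random.Random(seed)
    occupancy = []
    for _ in range(days):
        requests = poisson(rng, request_rate)
        # each accepted request independently cancels before arrival
        arrivals = sum(1 for _ in range(requests)
                       if rng.random() >= cancel_prob)
        occupancy.append(min(arrivals, capacity))  # overflow requests are lost
    return occupancy

# Repeat the season many times to estimate expected nightly occupancy
runs = [simulate_season(seed=s) for s in range(100)]
mean_occ = sum(map(sum, runs)) / (len(runs) * 120)  # roughly 45 rooms a night
```

    Comparing `mean_occ` across alternative request/cancellation assumptions (e.g. direct bookings versus tour-operator allotments) is the kind of distribution-strategy question the framework is meant to support.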

  16. Capataz: a framework for distributing algorithms via the World Wide Web

    Directory of Open Access Journals (Sweden)

    Gonzalo J. Martínez

    2015-08-01

    Full Text Available In recent years, some scientists have embraced the distributed computing paradigm. As experiments and simulations demand ever more computing power, coordinating the efforts of many different processors is often the only reasonable resort. We developed an open-source distributed computing framework based on web technologies, and named it Capataz. The framework acts as an HTTP server; web browsers running on many different devices can connect to it to contribute to the execution of distributed algorithms written in JavaScript. Capataz takes advantage of architectures with many cores using web workers. This paper presents an improvement in Capataz's usability and why it was needed. In previous experiments, the total time of distributed algorithms proved to be susceptible to changes in the execution time of the jobs. The system now adapts by bundling jobs together if they are too simple. The computational experiment to test the solution is a brute-force estimation of pi. The benchmark results show that by bundling jobs, the overall performance is greatly increased.
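    The bundling idea can be sketched briefly: when individual jobs are too cheap to offset dispatch overhead, grouping them amortizes the per-dispatch cost. The brute-force pi estimation mirrors the paper's benchmark, though the job granularity and bundle size here are illustrative (and in Python rather than the framework's JavaScript):

```python
import random

def pi_job(samples, seed):
    """One small job: count random points falling inside the unit circle."""
    rng = random.Random(seed)
    return sum(1 for _ in range(samples)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)

def bundle_jobs(job_args, bundle_size):
    """Group cheap jobs so each dispatch to a worker does meaningful work."""
    return [job_args[i:i + bundle_size]
            for i in range(0, len(job_args), bundle_size)]

def run_bundle(bundle):
    # In Capataz this would run in a browser web worker; here it runs locally
    return [pi_job(samples, seed) for samples, seed in bundle]

args = [(1000, seed) for seed in range(100)]
bundles = bundle_jobs(args, bundle_size=10)  # 100 jobs -> 10 dispatches
hits = sum(sum(run_bundle(b)) for b in bundles)
pi_estimate = 4.0 * hits / (1000 * 100)      # close to 3.14
```

    The trade-off is latency versus overhead: larger bundles cut per-job dispatch cost but make the work distribution coarser across slow and fast devices.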

  17. Using a framework to implement large-scale innovation in medical education with the intent of achieving sustainability.

    Science.gov (United States)

    Hudson, Judith N; Farmer, Elizabeth A; Weston, Kathryn M; Bushnell, John A

    2015-01-16

    Particularly when undertaken on a large scale, implementing innovation in higher education poses many challenges. Sustaining the innovation requires early adoption of a coherent implementation strategy. Using an example from clinical education, this article describes a process used to implement a large-scale innovation with the intent of achieving sustainability. Desire to improve the effectiveness of undergraduate medical education has led to growing support for a longitudinal integrated clerkship (LIC) model. This involves a move away from the traditional clerkship of 'block rotations' with frequent changes in disciplines towards clerkships of longer duration, with opportunity for students to build sustained relationships with supervisors, mentors, colleagues and patients. A growing number of medical schools have adopted the LIC model for a small percentage of their students. At a time when increasing medical school numbers and class sizes are leading to competition for clinical supervisors, it is, however, a daunting challenge to provide a longitudinal clerkship for an entire medical school class. This challenge is presented to illustrate the strategy used to implement sustainable large-scale innovation. A strategy to implement and build a sustainable longitudinal integrated community-based clerkship experience for all students was derived from a framework arising from Roberto and Levesque's research in business. The framework's four core processes, chartering, learning, mobilising and realigning, provided guidance in preparing and rolling out the 'whole of class' innovation. Roberto and Levesque's framework proved useful for identifying the foundations of the implementation strategy, with special emphasis on the relationship building required to implement such an ambitious initiative. Although this was innovation in a new school, it required change within the school, the wider university and the health community. Challenges encountered included some resistance to

  18. Hydroclimatic regimes: a distributed water-balance framework for hydrologic assessment, classification, and management

    Science.gov (United States)

    Weiskel, Peter K.; Wolock, David M.; Zarriello, Phillip J.; Vogel, Richard M.; Levin, Sara B.; Lent, Robert M.

    2014-01-01

    Runoff-based indicators of terrestrial water availability are appropriate for humid regions, but have tended to limit our basic hydrologic understanding of drylands – the dry-subhumid, semiarid, and arid regions which presently cover nearly half of the global land surface. In response, we introduce an indicator framework that gives equal weight to humid and dryland regions, accounting fully for both vertical (precipitation + evapotranspiration) and horizontal (groundwater + surface-water) components of the hydrologic cycle in any given location – as well as fluxes into and out of landscape storage. We apply the framework to a diverse hydroclimatic region (the conterminous USA) using a distributed water-balance model consisting of 53 400 networked landscape hydrologic units. Our model simulations indicate that about 21% of the conterminous USA either generated no runoff or consumed runoff from upgradient sources on a mean-annual basis during the 20th century. Vertical fluxes exceeded horizontal fluxes across 76% of the conterminous area. Long-term-average total water availability (TWA) during the 20th century, defined here as the total influx to a landscape hydrologic unit from precipitation, groundwater, and surface water, varied spatially by about 400 000-fold, a range of variation ~100 times larger than that for mean-annual runoff across the same area. The framework includes but is not limited to classical, runoff-based approaches to water-resource assessment. It also incorporates and reinterprets the green- and blue-water perspective now gaining international acceptance. Implications of the new framework for several areas of contemporary hydrology are explored, and the data requirements of the approach are discussed in relation to the increasing availability of gridded global climate, land-surface, and hydrologic data sets.
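    The framework's accounting can be sketched as a per-unit balance: total water availability (TWA) is the sum of precipitation plus groundwater and surface-water inflows, and net runoff is what remains after evapotranspiration and storage change. The figures below are illustrative, not values from the study:

```python
def total_water_availability(precip, gw_in, sw_in):
    """TWA: total influx to a landscape unit (all terms in mm/yr here)."""
    return precip + gw_in + sw_in

def net_runoff(precip, et, gw_in, sw_in, storage_change=0.0):
    """Negative values indicate a unit that consumes upgradient runoff."""
    return total_water_availability(precip, gw_in, sw_in) - et - storage_change

humid = net_runoff(precip=1200.0, et=700.0, gw_in=50.0, sw_in=100.0)  # 650.0
dryland = net_runoff(precip=250.0, et=400.0, gw_in=20.0, sw_in=60.0)  # -70.0
```

    The dryland example is the case runoff-based indicators miss: the unit still has substantial TWA (330 mm/yr) even though it generates no net runoff.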

  19. A New Framework for Universiti Kebangsaan Malaysia Soft Skills Course: Implementation and Challenges

    Science.gov (United States)

    Che-Ani, Adi-Irfan; Ismail, Khaidzir; Ahmad, Azizan; Ariffin, Kadir; Razak, Mohd Zulhanif Abd

    2014-01-01

    The importance of soft skills for graduates competing in the working world is undeniable. Soft skills are complementary to the academic qualifications held by students. Recognizing this, Universiti Kebangsaan Malaysia (UKM) has established a new framework for its Soft Skills course to improve the existing framework of the course. The…

  20. Design and implementation of distributed spatial computing node based on WPS

    International Nuclear Information System (INIS)

    Liu, Liping; Li, Guoqing; Xie, Jibo

    2014-01-01

    Currently, research on SIG (Spatial Information Grid) technology mostly emphasizes spatial data sharing in the grid environment, while the importance of spatial computing resources is ignored. In order to enable the sharing of and cooperation among spatial computing resources in a grid environment, this paper systematically studies the key technologies for constructing a Spatial Computing Node based on the WPS (Web Processing Service) specification of the OGC (Open Geospatial Consortium). A framework for the Spatial Computing Node is then designed according to the features of spatial computing resources. Finally, a prototype Spatial Computing Node is implemented and the relevant verification work in this environment is completed.
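As a hedged sketch of how a client might address such a Spatial Computing Node, the snippet below builds standard WPS 1.0.0 key-value-pair requests; the endpoint URL and the `Buffer` process identifier are placeholders for illustration, not details taken from the paper.

```python
from urllib.parse import urlencode

# Hypothetical node endpoint (placeholder host, not from the paper).
ENDPOINT = "http://example.org/wps"

def wps_url(request, **extra):
    """Build a WPS 1.0.0 key-value-pair request URL."""
    params = {"service": "WPS", "version": "1.0.0", "request": request, **extra}
    return ENDPOINT + "?" + urlencode(params)

# Discover the node's offered processes, then ask about one of them.
capabilities = wps_url("GetCapabilities")
describe = wps_url("DescribeProcess", identifier="Buffer")
```

A real client would then POST an `Execute` request with the process inputs; the KVP form above covers the discovery half of the exchange.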

  1. Data Distribution Service-Based Interoperability Framework for Smart Grid Testbed Infrastructure

    Directory of Open Access Journals (Sweden)

    Tarek A. Youssef

    2016-03-01

    Full Text Available This paper presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurements and control network. The advantages of utilizing the data-centric over the message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and an automatic discovery feature for dynamically participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface for the testbed infrastructure are developed in order to facilitate interoperability and remote access to the testbed. This interface allows controlling, monitoring, and performing experiments remotely. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).
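The data-centric idea in the abstract can be illustrated without the actual DDS API. The following pure-Python sketch (all names invented for illustration) shows the key property: state lives on topics rather than in point-to-point messages, so a late-joining subscriber automatically discovers the current value and there is no single routing node to fail.

```python
from collections import defaultdict

class DataBus:
    """Toy data-centric bus: samples are keyed by topic, and any node can
    read the latest sample or subscribe, with no point-to-point links."""
    def __init__(self):
        self._store = {}                # topic -> latest sample (the "data space")
        self._subs = defaultdict(list)  # topic -> subscriber callbacks

    def publish(self, topic, sample):
        self._store[topic] = sample     # cache so late joiners see current state
        for cb in self._subs[topic]:
            cb(sample)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)
        if topic in self._store:        # automatic discovery of existing state
            callback(self._store[topic])

bus = DataBus()
bus.publish("feeder1/voltage", 0.98)
seen = []
bus.subscribe("feeder1/voltage", seen.append)  # late joiner still receives 0.98
bus.publish("feeder1/voltage", 1.02)
```

In a message-centric design the late subscriber would have missed the first sample; here the topic itself holds the state, which is the property DDS exploits for dynamically joining nodes.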

  2. Implementation of internet-delivered cognitive behavior therapy within community mental health clinics: a process evaluation using the consolidated framework for implementation research.

    Science.gov (United States)

    Hadjistavropoulos, H D; Nugent, M M; Dirkse, D; Pugh, N

    2017-09-12

    Depression and anxiety are prevalent and undertreated conditions that create an enormous burden for the patient and the health system. Internet-delivered cognitive behavior therapy (ICBT) improves patient access to treatment by providing therapeutic information via the Internet, presented in sequential lessons, accompanied by brief weekly therapist support. While there is growing research supporting ICBT, use of ICBT within community mental health clinics is limited. In a recent trial, an external unit specializing in ICBT facilitated use of ICBT in community mental health clinics in one Canadian province (ISRCTN42729166; registered November 5, 2013). Patient outcomes were very promising and uptake was encouraging. This paper reports on a parallel process evaluation designed to understand the facilitators and barriers impacting the uptake and implementation of ICBT. Therapists (n = 22) and managers (n = 11) from seven community mental health clinics dispersed across one Canadian province who were involved in implementing ICBT over ~2 years completed an online survey (including open and closed-ended questions) about ICBT experiences. The questions were based on the Consolidated Framework for Implementation Research (CFIR), which outlines diverse constructs that have the potential to impact program implementation. Analyses suggested ICBT implementation was perceived to be most prominently facilitated by intervention characteristics (namely the relative advantages of ICBT compared to face-to-face therapy, the quality of the ICBT program that was delivered, and evidence supporting ICBT) and implementation processes (namely the use of an external facilitation unit that aided with engaging patients, therapists, and managers and ICBT implementation). The inner setting was identified as the most significant barrier to implementation as a result of limited resources for ICBT combined with greater priority given to face-to-face care. The results contribute to understanding

  3. Adaptive digital filter implementation with distributed arithmetic structure

    OpenAIRE

    Babič, Rudolf; Horvat, Bogomir; Osebik, Davorin

    2001-01-01

    Adaptive digital filters have a wide range of applications in the area of signal processing where only minimal a priori knowledge of the signal characteristics is available. In this article an adaptive FIR digital filter implementation based on the distributed arithmetic technique is described. The major problem with a conventional adaptive digital filter is the need for fast multipliers; in a hardware implementation, these multipliers account for a disproportionate share of the overall cost...
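Distributed arithmetic removes those per-tap multipliers by precomputing a lookup table of partial coefficient sums and shift-accumulating over the input bit-planes. A minimal sketch, assuming unsigned fixed-point inputs (the signed two's-complement case adds a sign correction on the MSB plane):

```python
def da_fir(x, h, bits=8):
    """FIR dot product sum(x[k]*h[k]) via distributed arithmetic,
    for unsigned `bits`-bit samples x and integer coefficients h."""
    n = len(h)
    # LUT[addr] = sum of h[k] over the taps whose bit is set in addr.
    lut = [sum(h[k] for k in range(n) if addr >> k & 1)
           for addr in range(1 << n)]
    y = 0
    for b in range(bits):             # one input bit-plane per step
        addr = 0
        for k in range(n):            # gather bit b of every tap's sample
            addr |= ((x[k] >> b) & 1) << k
        y += lut[addr] << b           # shift-accumulate: no multiplier used
    return y
```

Because y = Σ_b 2^b Σ_k x_{k,b} h_k = Σ_k h_k x_k, the result equals the ordinary dot product; in hardware the inner loops become a LUT address register and a shifter, which is the cost saving the abstract refers to.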

  4. Large Survey Database: A Distributed Framework for Storage and Analysis of Large Datasets

    Science.gov (United States)

    Juric, Mario

    2011-01-01

    The Large Survey Database (LSD) is a Python framework and DBMS for distributed storage, cross-matching and querying of large survey catalogs (>10^9 rows, >1 TB). The primary driver behind its development is the analysis of Pan-STARRS PS1 data. It is specifically optimized for fast queries and parallel sweeps of positionally and temporally indexed datasets. It transparently scales to more than 10^2 nodes, and can be made to function in "shared nothing" architectures. An LSD database consists of a set of vertically and horizontally partitioned tables, physically stored as compressed HDF5 files. Vertically, we partition the tables into groups of related columns ('column groups'), storing together logically related data (e.g., astrometry, photometry). Horizontally, the tables are partitioned into partially overlapping "cells" by position in space (lon, lat) and time (t). This organization allows for fast lookups based on spatial and temporal coordinates, as well as data and task distribution. The design was inspired by the success of Google BigTable (Chang et al., 2006). Our programming model is a pipelined extension of MapReduce (Dean and Ghemawat, 2004). An SQL-like query language is used to access data. For complex tasks, map-reduce "kernels" that operate on query results on a per-cell basis can be written, with the framework taking care of scheduling and execution. The combination leverages users' familiarity with SQL, while offering a fully distributed computing environment. LSD adds little overhead compared to direct Python file I/O. In tests, we swept through 1.1 billion rows of Pan-STARRS+SDSS data (220 GB) in less than 15 minutes on a dual-CPU machine. In a cluster environment, we achieved bandwidths of 17 Gbit/s (I/O limited). Based on current experience, we believe LSD should scale to be useful for analysis and storage of LSST-scale datasets. It can be downloaded from http://mwscience.net/lsd.
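The per-cell map-reduce kernels described above can be sketched in miniature. The toy version below (invented for illustration, not the LSD API) partitions rows into cells by longitude band only, runs a mapper independently per cell, and reduces the partial results; because cells are processed independently, they could equally be farmed out to cluster nodes.

```python
from collections import defaultdict

def partition(rows, cell_size=10.0):
    """Group rows into spatial 'cells' by longitude band: a toy stand-in
    for LSD's (lon, lat, t) cell partitioning."""
    cells = defaultdict(list)
    for row in rows:
        cells[int(row["lon"] // cell_size)].append(row)
    return cells

def map_reduce(rows, mapper, reducer):
    # The mapper sees one cell's rows at a time, independently of other cells.
    partials = [mapper(cell_rows) for cell_rows in partition(rows).values()]
    return reducer(partials)

rows = [{"lon": 1.0, "mag": 20.1},
        {"lon": 2.0, "mag": 19.5},
        {"lon": 15.0, "mag": 21.0}]
count = map_reduce(rows, mapper=len, reducer=sum)  # per-cell counts, then total
```

Swapping in a mapper that cross-matches or filters a cell's rows gives the shape of a real kernel, with the framework owning scheduling and execution.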

  5. A policy-based multi-objective optimisation framework for residential distributed energy system design★

    Directory of Open Access Journals (Sweden)

    Wouters Carmen

    2017-01-01

    Full Text Available Distributed energy systems (DES) are increasingly being introduced as solutions to alleviate conventional energy system challenges related to energy security, climate change and increasing demands. From a technological and economic perspective, distributed energy resources are already becoming viable. The question remains as to how these technologies and practices can be "best" selected, sized and integrated within consumer areas. To aid decision-makers and enable widespread DES adoption, a strategic superstructure design framework is therefore still required that balances multiple stakeholder interests and fits the liberalised energy system objectives of competition, security of supply and sustainability. Such a design framework is presented in this work. An optimisation-based approach for the design of neighbourhood-based DES is developed that meets their yearly electricity, heating and cooling needs by appropriately selecting, sizing and locating technologies and energy interactions. To this end, a pool of poly-generation and storage technologies is considered, combined with local energy sharing between participating prosumers through thermal pipeline design and microgrid operation, and a bi-directional connection with the central distribution grid. A superstructure mixed-integer linear programming (MILP) approach is proposed to trade off three minimisation objectives in the design process: total annualised cost, annual CO2 emissions and electrical system unavailability, aligned with the three central energy system objectives. The developed model is applied to a small South Australian neighbourhood. The approach enables identifying "knee-point" neighbourhood energy system designs through Pareto trade-offs between objectives and serves to inform decision-makers about the impact of policy objectives on DES development strategies.
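The Pareto trade-off step at the heart of such multi-objective design can be illustrated with a brute-force non-dominance filter. The candidate designs and their objective values below are hypothetical, not taken from the paper; a real study would obtain them from the MILP solver.

```python
def pareto_front(designs):
    """Keep designs not dominated in every objective (all minimised)."""
    def dominates(a, b):
        # a dominates b: no worse everywhere, strictly better somewhere
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    return [d for d in designs
            if not any(dominates(o["obj"], d["obj"]) for o in designs)]

# Objectives: (cost k$/yr, t CO2/yr, unavailability) -- invented numbers.
designs = [
    {"name": "grid-only",  "obj": (120, 90, 0.002)},
    {"name": "pv+battery", "obj": (150, 40, 0.004)},
    {"name": "chp+pv",     "obj": (140, 55, 0.001)},
    {"name": "oversized",  "obj": (200, 60, 0.004)},  # dominated by chp+pv
]
front = {d["name"] for d in pareto_front(designs)}
```

The surviving set is the Pareto front from which "knee-point" designs would be picked; the dominated "oversized" option is worse than "chp+pv" on every objective and drops out.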

  6. Climate Services for Development Planning and Implementation: A Framework for Assessing and Valuing Climate Services

    Science.gov (United States)

    Anderson, G.

    2012-04-01

    While weather forecasting products have been available globally for decades, the full suite of climate services - including historical and real-time observational meteorological data; daily, weekly, and seasonal forecasts; and longer-term climate projections - has only been under development for the last 15 to 20 years. Climate services have been developed and implemented quite quickly in developed countries for public- and private-sector users. However, diffusion of these tools in developing countries has been slower for several reasons: 1) lack of awareness of the opportunities and benefits of climate services; 2) a spotty record of managing local weather and climate data; and 3) limited resources to build and sustain capacity in providing climate services. The Climate Services Partnership (CSP) was formed during the International Conference on Climate Services (ICCS) in October 2011. The CSP seeks to improve the provision and development of climate services worldwide. During the ICCS, three working groups were formed to carry out the work program of the CSP leading up to the second ICCS in Berlin in September 2012. The Economic Valuation of Climate Services Working Group, chaired by John Zillman and myself, is collaborating on several activities to demonstrate the benefits of climate services and help providers prioritize opportunities for expanding their use. The proposed paper will provide an overview of the Working Group's activities leading up to the next ICCS and describe specific work that is underway and expected to be completed prior to the EGU meetings. The focal point of the Working Group's activities is the development of a matrix to help identify and value the best opportunities for using climate services. Different categories of climate services will be listed in rows and potential users of

  7. Generic-distributed framework for cloud services marketplace based on unified ontology

    Directory of Open Access Journals (Sweden)

    Samer Hasan

    2017-11-01

    Full Text Available Cloud computing is a pattern for delivering ubiquitous, on-demand computing resources based on a pay-as-you-use financial model. Typically, cloud providers advertise cloud service descriptions in various formats on the Internet. On the other hand, cloud consumers use available search engines (Google and Yahoo) to explore cloud service descriptions and find an adequate service. Unfortunately, general-purpose search engines are not designed to provide a small and complete set of results, which makes the process a big challenge. This paper presents a generic-distributed framework for a cloud services marketplace to automate the cloud service discovery and selection process and remove the barriers between service providers and consumers. Additionally, this work implements two instances of the generic framework by adopting two different matching algorithms: a dominant and recessive attributes algorithm borrowed from gene science, and a semantic similarity algorithm based on a unified cloud service ontology. Finally, this paper presents a unified cloud services ontology and models real-life cloud services according to the proposed ontology. To the best of the authors’ knowledge, this is the first attempt to build a cloud services marketplace where cloud providers and cloud consumers can trade cloud services as utilities. In comparison with existing work, the semantic approach reduced the execution time by 20% and maintained the same values for all other parameters. On the other hand, the dominant and recessive attributes approach reduced the execution time by 57% but showed a lower value for recall.

  8. Generic-distributed framework for cloud services marketplace based on unified ontology.

    Science.gov (United States)

    Hasan, Samer; Valli Kumari, V

    2017-11-01

    Cloud computing is a pattern for delivering ubiquitous, on-demand computing resources based on a pay-as-you-use financial model. Typically, cloud providers advertise cloud service descriptions in various formats on the Internet. On the other hand, cloud consumers use available search engines (Google and Yahoo) to explore cloud service descriptions and find an adequate service. Unfortunately, general-purpose search engines are not designed to provide a small and complete set of results, which makes the process a big challenge. This paper presents a generic-distributed framework for a cloud services marketplace to automate the cloud service discovery and selection process and remove the barriers between service providers and consumers. Additionally, this work implements two instances of the generic framework by adopting two different matching algorithms: a dominant and recessive attributes algorithm borrowed from gene science, and a semantic similarity algorithm based on a unified cloud service ontology. Finally, this paper presents a unified cloud services ontology and models real-life cloud services according to the proposed ontology. To the best of the authors' knowledge, this is the first attempt to build a cloud services marketplace where cloud providers and cloud consumers can trade cloud services as utilities. In comparison with existing work, the semantic approach reduced the execution time by 20% and maintained the same values for all other parameters. On the other hand, the dominant and recessive attributes approach reduced the execution time by 57% but showed a lower value for recall.
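As an illustration of attribute-based service matching in such a marketplace (a simple stand-in, not the paper's actual algorithms), the sketch below ranks hypothetical service descriptions against a query by Jaccard similarity over terms from a shared vocabulary:

```python
def jaccard(a, b):
    """Similarity of two term sets: |intersection| / |union|."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

# Hypothetical service descriptions keyed by vocabulary terms (invented).
services = {
    "svc-A": {"iaas", "linux", "ssd", "eu-region"},
    "svc-B": {"paas", "python", "autoscale"},
    "svc-C": {"iaas", "linux", "hdd", "us-region"},
}
query = {"iaas", "linux", "ssd"}

# Pick the service whose description best overlaps the consumer's query.
best = max(services, key=lambda s: jaccard(services[s], query))
```

A unified ontology plays the role of the shared vocabulary here: both providers and consumers describe services in the same terms, so set overlap becomes a meaningful ranking signal.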

  9. Examination of the utility of the promoting action on research implementation in health services framework for implementation of evidence based practice in residential aged care settings.

    Science.gov (United States)

    Perry, Lin; Bellchambers, Helen; Howie, Andrew; Moxey, Annette; Parkinson, Lynne; Capra, Sandra; Byles, Julie

    2011-10-01

    This study examined the relevance and fit of the PARiHS framework (Promoting Action on Research Implementation in Health Services) as an explanatory model for practice change in residential aged care. Translation of research knowledge into routine practice is a complex matter in health and social care environments. Examination of the environment may identify factors likely to support and hinder practice change, inform strategy development, predict and explain successful uptake of new ways of working. Frameworks to enable this have been described but none has been tested in residential aged care. This paper reports preliminary qualitative analyses from the Encouraging Best Practice in Residential Aged Care Nutrition and Hydration project conducted in New South Wales in 2007-2009. We examined congruence with the PARiHS framework of factors staff described as influential for practice change during 29 digitally recorded and transcribed staff interviews and meetings at three facilities. Unique features of the setting were flagged, with facilities simultaneously filling the roles of residents' home, staff's workplace and businesses. Participants discussed many of the same characteristics identified by the PARiHS framework, but in addition temporal dimensions of practice change were flagged. Overall factors described by staff as important for practice change in aged care settings showed good fit with those of the PARiHS framework. This framework can be recommended for use in this setting. Widespread adoption will enable cross-project and international synthesis of findings, a major step towards building a cumulative science of knowledge translation and practice change. © 2011 The Authors. Journal of Advanced Nursing © 2011 Blackwell Publishing Ltd.

  10. The Development Of A Theoretical Lean Culture Causal Framework To Support The Effective Implementation Of Lean In Automotive Component Manufacturers

    Directory of Open Access Journals (Sweden)

    Van der Merwe, Karl Robert

    2014-05-01

    Full Text Available Although it is generally accepted that lean manufacturing improves operational performance, many organisations are struggling to adapt to the lean philosophy. The purpose of this study is to contribute to a more effective strategy for implementing the lean manufacturing improvement philosophy. The study sets out both to integrate well-researched findings and theories related to generic organisational culture with more recent research and experience related to lean culture, and to examine the role that culture plays in the effective implementation of lean manufacturing principles and techniques. The ultimate aim of this exercise is to develop a theoretical lean culture causal framework.

  11. Time-reversal symmetric work distributions for closed quantum dynamics in the histories framework

    International Nuclear Information System (INIS)

    Miller, Harry J D; Anders, Janet

    2017-01-01

    A central topic in the emerging field of quantum thermodynamics is the definition of thermodynamic work in the quantum regime. One widely used solution is to define work for a closed system undergoing non-equilibrium dynamics according to the two-point energy measurement scheme. However, due to the invasive nature of measurement the two-point quantum work probability distribution cannot describe the statistics of energy change from the perspective of the system alone. We here introduce the quantum histories framework as a method to characterise the thermodynamic properties of the unmeasured, closed dynamics. Constructing continuous power operator trajectories allows us to derive an alternative quantum work distribution for closed quantum dynamics that fulfils energy conservation and is time-reversal symmetric. This opens the possibility to compare the measured work with the unmeasured work, contrasting with the classical situation where measurement does not affect the work statistics. We find that the work distribution of the unmeasured dynamics leads to deviations from the classical Jarzynski equality and can have negative values, highlighting distinctly non-classical features of quantum work. (fast track communication)

  12. A framework for multi-object tracking over distributed wireless camera networks

    Science.gov (United States)

    Gau, Victor; Hwang, Jenq-Neng

    2010-07-01

    In this paper, we propose a unified framework targeting two important issues in a distributed wireless camera network, i.e., object tracking and network communication, to achieve reliable multi-object tracking over distributed wireless camera networks. In the object tracking part, we propose a fully automated approach for tracking multiple objects across multiple cameras with overlapping and non-overlapping fields of view, without initial training. To effectively exchange tracking information among the distributed cameras, we propose an idle-probability-based broadcasting method, iPro, which adaptively adjusts the broadcast probability to improve broadcast effectiveness in a dense, saturated camera network. Experimental results for multi-object tracking demonstrate the promising performance of our approach on real video sequences for cameras with overlapping and non-overlapping views. The modeling and ns-2 simulation results show that iPro almost approaches the theoretical performance upper bound if cameras are within each other's transmission range. In more general scenarios, e.g., in the case of hidden node problems, the simulation results show that iPro significantly outperforms standard IEEE 802.11, especially when the number of competing nodes increases.

  13. BIOMedical Search Engine Framework: Lightweight and customized implementation of domain-specific biomedical search engines.

    Science.gov (United States)

    Jácome, Alberto G; Fdez-Riverola, Florentino; Lourenço, Anália

    2016-07-01

    Text mining and semantic analysis approaches can be applied to the construction of biomedical domain-specific search engines and provide an attractive alternative for creating personalized and enhanced search experiences. Therefore, this work introduces the new open-source BIOMedical Search Engine Framework for the fast and lightweight development of domain-specific search engines. The rationale behind this framework is to combine core features typically available in search engine frameworks with flexible and extensible technologies to retrieve biomedical documents, annotate meaningful domain concepts, and develop highly customized Web search interfaces. The BIOMedical Search Engine Framework integrates taggers for major biomedical concepts, such as diseases, drugs, genes, proteins, compounds and organisms, and enables the use of domain-specific controlled vocabularies. Technologies from the Typesafe Reactive Platform, the AngularJS JavaScript framework and the Bootstrap HTML/CSS framework support the customization of the domain-oriented search application. Moreover, the RESTful API of the BIOMedical Search Engine Framework allows integration of the search engine into existing systems or complete personalization of the web interface. The construction of the Smart Drug Search is described as a proof of concept of the BIOMedical Search Engine Framework. This public search engine catalogs scientific literature about antimicrobial resistance, microbial virulence and related topics. The keyword-based queries of users are transformed into concepts, and search results are presented and ranked accordingly. The semantic graph view portrays all the concepts found in the results, and the researcher may look into the relevance of different concepts, the strength of direct relations, and non-trivial, indirect relations. The number of occurrences of a concept shows its importance to the query, and the frequency of concept co-occurrence is indicative of biological relations

  14. Web Application to Monitor Logistics Distribution of Disaster Relief Using the CodeIgniter Framework

    Science.gov (United States)

    Jamil, Mohamad; Ridwan Lessy, Mohamad

    2018-03-01

    Disaster management is the responsibility of the central government and local governments. The principles of disaster management include being quick and precise, setting priorities, coordination and cohesion, and working in an efficient and effective manner. The help most needed by communities is logistical assistance covering people's everyday needs, such as food, instant noodles, fast food, blankets, mattresses, etc. Logistical assistance is needed for disaster management, especially in times of disaster, and must be timely and delivered to the right location, target, quality, quantity, and needs. The purpose of this study is to build a web application to monitor the logistics distribution of disaster relief using the CodeIgniter framework. Through this application, the mechanisms of aid delivery will be easily controlled from and to the disaster site.

  15. Web Application To Monitor Logistics Distribution of Disaster Relief Using the CodeIgniter Framework

    Directory of Open Access Journals (Sweden)

    Mohamad Jamil

    2017-10-01

    Full Text Available Disaster management is the responsibility of the central government and local governments. The principles of disaster management include being quick and precise, setting priorities, coordination and cohesion, and working in an efficient and effective manner. The help most needed by communities is logistical assistance covering people's everyday needs, such as food, instant noodles, fast food, blankets, mattresses, etc. Logistical assistance is needed for disaster management, especially in times of disaster, and must be timely and delivered to the right location, target, quality, quantity, and needs. The purpose of this study is to build a web application to monitor the logistics distribution of disaster relief using the CodeIgniter framework. Through this application, the mechanisms of aid delivery will be easily controlled from and to the disaster site.

  16. Framework and Method for Controlling a Robotic System Using a Distributed Computer Network

    Science.gov (United States)

    Sanders, Adam M. (Inventor); Barajas, Leandro G. (Inventor); Permenter, Frank Noble (Inventor); Strawser, Philip A. (Inventor)

    2015-01-01

    A robotic system for performing an autonomous task includes a humanoid robot having a plurality of compliant robotic joints, actuators, and other integrated system devices that are controllable in response to control data from various control points, and having sensors for measuring feedback data at the control points. The system includes a multi-level distributed control framework (DCF) for controlling the integrated system components over multiple high-speed communication networks. The DCF has a plurality of first controllers each embedded in a respective one of the integrated system components, e.g., the robotic joints, a second controller coordinating the components via the first controllers, and a third controller for transmitting a signal commanding performance of the autonomous task to the second controller. The DCF virtually centralizes all of the control data and the feedback data in a single location to facilitate control of the robot across the multiple communication networks.

  17. A Practical Framework for Sharing and Rendering Real-World Bidirectional Scattering Distribution Functions

    Energy Technology Data Exchange (ETDEWEB)

    Ward, Greg [Anywhere Software, Albany, CA (United States); Kurt, Murat [International Computer Institute, Ege University (Turkey); Bonneel, Nicolas [Harvard Univ., Cambridge, MA (United States)

    2012-09-30

    The utilization of real-world materials has been hindered by a lack of standards for sharing and interpreting measured data. This paper presents an XML representation and an Open Source C library to support bidirectional scattering distribution functions (BSDFs) in data-driven lighting simulation and rendering applications. The library provides for the efficient representation, query, and Monte Carlo sampling of arbitrary BSDFs in a model-free framework. Currently, we support two BSDF data representations: one using a fixed subdivision of the hemisphere, and one with adaptive density. The fixed type has advantages for certain matrix operations, while the adaptive type can more accurately represent highly peaked data. We discuss advanced methods for data-driven BSDF rendering for both types, including the proxy of detailed geometry to enhance appearance and accuracy. We also present an advanced interpolation method to reduce measured data into these standard representations. We end with our plan for future extensions and sharing of BSDF data.
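Monte Carlo sampling of a fixed-subdivision BSDF table reduces to drawing a hemisphere patch with probability proportional to its tabulated scattering value. A minimal sketch under that assumption (the 4-patch table is invented; the real library works on much finer subdivisions and maps the patch index back to a direction):

```python
import bisect
import random

def sample_patch(values, rng=random.random):
    """Draw a patch index with probability proportional to its tabulated
    value, via inversion of the discrete CDF."""
    cdf, total = [], 0.0
    for v in values:
        total += v
        cdf.append(total)          # running sum = unnormalised CDF
    return bisect.bisect_left(cdf, rng() * total)

# Hypothetical 4-patch hemisphere with a strong specular peak in patch 2.
patches = [0.1, 0.2, 4.0, 0.1]
idx = sample_patch(patches, rng=lambda: 0.5)  # deterministic draw for illustration
```

With a uniform random draw the peaked patch is selected roughly 4.0/4.4 of the time, which is why highly peaked data favours the adaptive-density representation the paper also supports.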

  18. EFFICIENT LIDAR POINT CLOUD DATA MANAGING AND PROCESSING IN A HADOOP-BASED DISTRIBUTED FRAMEWORK

    Directory of Open Access Journals (Sweden)

    C. Wang

    2017-10-01

    Full Text Available Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering and other fields. However, it is challenging to efficiently store, query and analyze high-resolution 3D LiDAR data due to its volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, taking advantage of Hadoop's storage and computing ability. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to run the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experimental results show that the proposed framework can efficiently manage and process big LiDAR data.

  19. On distribution reduction and algorithm implementation in inconsistent ordered information systems.

    Science.gov (United States)

    Zhang, Yanqin

    2014-01-01

    As one part of our work on ordered information systems, distribution reduction is studied in inconsistent ordered information systems (OISs). Some important properties of distribution reduction are studied and discussed. The dominance matrix is restated for reduction acquisition in dominance-relation-based information systems. A matrix algorithm for distribution reduction acquisition is presented step by step, and a program implementing the algorithm is provided. The approach provides an effective tool both for theoretical research and for practical applications of ordered information systems. For more detailed and valid illustration, cases are employed to explain and verify the algorithm and the program, which shows the effectiveness of the algorithm in complicated information systems.
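A dominance matrix of the kind restated in the abstract can be sketched directly: entry D[i][j] records whether object i takes values at least as large as object j on every criterion, i.e. whether i dominates j under the dominance relation. The three-object, two-criterion table below is a hypothetical illustration, not data from the paper:

```python
def dominance_matrix(objects):
    """objects: list of criterion-value tuples.
    D[i][j] = 1 iff object i's value >= object j's value on every criterion."""
    n = len(objects)
    return [[int(all(a >= b for a, b in zip(objects[i], objects[j])))
             for j in range(n)]
            for i in range(n)]

# Hypothetical ordered information table: three objects, two criteria.
U = [(2, 3), (1, 3), (2, 1)]
D = dominance_matrix(U)
```

Reduction algorithms then work on this matrix, e.g. searching for attribute subsets whose dominance matrix preserves the distribution of the full table.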

  20. Automated Energy Distribution and Reliability System: Validation Integration - Results of Future Architecture Implementation

    Energy Technology Data Exchange (ETDEWEB)

    Buche, D. L.

    2008-06-01

    This report describes Northern Indiana Public Service Co.'s project efforts to develop an automated energy distribution and reliability system. The purpose of this project was to implement a database-driven GIS solution that would manage all of the company's gas, electric, and landbase objects. This report is the second in a series of reports detailing this effort.

  1. Criteria for selecting implementation science theories and frameworks: results from an international survey

    Directory of Open Access Journals (Sweden)

    Sarah A. Birken

    2017-10-01

    Full Text Available Abstract Background Theories provide a synthesizing architecture for implementation science. The underuse, superficial use, and misuse of theories pose a substantial scientific challenge for implementation science and may relate to challenges in selecting from the many theories in the field. Implementation scientists may benefit from guidance for selecting a theory for a specific study or project. Understanding how implementation scientists select theories will help inform efforts to develop such guidance. Our objective was to identify which theories implementation scientists use, how they use theories, and the criteria used to select theories. Methods We identified initial lists of uses and criteria for selecting implementation theories based on seminal articles and an iterative consensus process. We incorporated these lists into a self-administered survey for completion by self-identified implementation scientists. We recruited potential respondents at the 8th Annual Conference on the Science of Dissemination and Implementation in Health and via several international email lists. We used frequencies and percentages to report results. Results Two hundred twenty-three implementation scientists from 12 countries responded to the survey. They reported using more than 100 different theories spanning several disciplines. Respondents reported using theories primarily to identify implementation determinants, inform data collection, enhance conceptual clarity, and guide implementation planning. Of the 19 criteria presented in the survey, the criteria used by the most respondents to select theory included analytic level (58%), logical consistency/plausibility (56%), empirical support (53%), and description of a change process (54%). The criteria used by the fewest respondents included fecundity (10%), uniqueness (12%), and falsifiability (15%). Conclusions Implementation scientists use a large number of criteria to select theories, but there is little

  2. Criteria for selecting implementation science theories and frameworks: results from an international survey.

    Science.gov (United States)

    Birken, Sarah A; Powell, Byron J; Shea, Christopher M; Haines, Emily R; Alexis Kirk, M; Leeman, Jennifer; Rohweder, Catherine; Damschroder, Laura; Presseau, Justin

    2017-10-30

    Theories provide a synthesizing architecture for implementation science. The underuse, superficial use, and misuse of theories pose a substantial scientific challenge for implementation science and may relate to challenges in selecting from the many theories in the field. Implementation scientists may benefit from guidance for selecting a theory for a specific study or project. Understanding how implementation scientists select theories will help inform efforts to develop such guidance. Our objective was to identify which theories implementation scientists use, how they use theories, and the criteria used to select theories. We identified initial lists of uses and criteria for selecting implementation theories based on seminal articles and an iterative consensus process. We incorporated these lists into a self-administered survey for completion by self-identified implementation scientists. We recruited potential respondents at the 8th Annual Conference on the Science of Dissemination and Implementation in Health and via several international email lists. We used frequencies and percentages to report results. Two hundred twenty-three implementation scientists from 12 countries responded to the survey. They reported using more than 100 different theories spanning several disciplines. Respondents reported using theories primarily to identify implementation determinants, inform data collection, enhance conceptual clarity, and guide implementation planning. Of the 19 criteria presented in the survey, the criteria used by the most respondents to select theory included analytic level (58%), logical consistency/plausibility (56%), empirical support (53%), and description of a change process (54%). The criteria used by the fewest respondents included fecundity (10%), uniqueness (12%), and falsifiability (15%). Implementation scientists use a large number of criteria to select theories, but there is little consensus on which are most important. 

  3. Design and implementation of an architectural framework for web portals in a ubiquitous pervasive environment.

    Science.gov (United States)

    Raza, Muhammad Taqi; Yoo, Seung-Wha; Kim, Ki-Hyung; Joo, Seong-Soon; Jeong, Wun-Cheol

    2009-01-01

    Web Portals function as a single point of access to information on the World Wide Web (WWW). The web portal always contacts the portal's gateway for the information flow, which causes network traffic over the Internet. Moreover, it provides real-time/dynamic access to stored information, but not access to real-time information itself. This inherent functionality of web portals limits their role for resource-constrained digital devices in the Ubiquitous era (U-era). This paper presents a framework for the web portal in the U-era. We have introduced the concept of Local Regions in the proposed framework, so that local queries can be resolved locally rather than having to be routed over the Internet. Moreover, our framework enables one-to-one device communication for real-time information flow. For an in-depth analysis, we first provide an analytical model of query processing at the servers for our framework-oriented web portal. We then deployed a testbed, one of the world's largest IP-based wireless sensor network testbeds, and the real-time measurements observed confirm the efficacy and workability of the proposed framework.
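The Local Region idea from the abstract can be sketched as a simple routing decision: a query is answered from the region's own index when the target device is local, and only otherwise forwarded to the portal's gateway over the Internet. The class and method names below are illustrative assumptions, not the paper's API.

```python
# Hypothetical sketch of Local Region query routing: local queries are resolved
# locally; only misses are forwarded to the (simulated) portal gateway.
class LocalRegionRouter:
    def __init__(self, region_id, local_index):
        self.region_id = region_id      # region this router serves
        self.local_index = local_index  # device_id -> latest reading

    def resolve(self, device_id):
        """Return (source, value): 'local' if the device is in this region,
        otherwise defer to the remote portal gateway."""
        if device_id in self.local_index:
            return "local", self.local_index[device_id]
        return "gateway", self._query_gateway(device_id)

    def _query_gateway(self, device_id):
        # Stand-in for an Internet round trip to the portal's gateway.
        return f"remote-lookup:{device_id}"

router = LocalRegionRouter("region-A", {"sensor-1": 21.5})
print(router.resolve("sensor-1"))  # answered from the local index
print(router.resolve("sensor-9"))  # forwarded to the gateway
```

In a real deployment the local index would be kept fresh by the one-to-one device communication the abstract mentions, so that local answers are also real-time answers.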

  4. A Framework to Implement IoT Network Performance Modelling Techniques for Network Solution Selection

    Directory of Open Access Journals (Sweden)

    Declan T. Delaney

    2016-12-01

    Full Text Available No single network solution for Internet of Things (IoT) networks can provide the required level of Quality of Service (QoS) for all applications in all environments. This leads to an increasing number of solutions created to fit particular scenarios. Given the increasing number and complexity of solutions available, it becomes difficult for an application developer to choose the solution which is best suited for an application. This article introduces a framework which autonomously chooses the best solution for the application given the current deployed environment. The framework utilises a performance model to predict the expected performance of a particular solution in a given environment. The framework can then choose an apt solution for the application from a set of available solutions. This article presents the framework with a set of models built using data collected from simulation. The modelling technique can determine with up to 85% accuracy the solution which performs the best for a particular performance metric given a set of solutions. The article highlights the fractured and disjointed practice currently in place for examining and comparing communication solutions, aims to open a discussion on harmonising testing procedures so that different solutions can be directly compared, and offers a framework to achieve this within IoT networks.

  5. A Framework to Implement IoT Network Performance Modelling Techniques for Network Solution Selection.

    Science.gov (United States)

    Delaney, Declan T; O'Hare, Gregory M P

    2016-12-01

    No single network solution for Internet of Things (IoT) networks can provide the required level of Quality of Service (QoS) for all applications in all environments. This leads to an increasing number of solutions created to fit particular scenarios. Given the increasing number and complexity of solutions available, it becomes difficult for an application developer to choose the solution which is best suited for an application. This article introduces a framework which autonomously chooses the best solution for the application given the current deployed environment. The framework utilises a performance model to predict the expected performance of a particular solution in a given environment. The framework can then choose an apt solution for the application from a set of available solutions. This article presents the framework with a set of models built using data collected from simulation. The modelling technique can determine with up to 85% accuracy the solution which performs the best for a particular performance metric given a set of solutions. The article highlights the fractured and disjointed practice currently in place for examining and comparing communication solutions and aims to open a discussion on harmonising testing procedures so that different solutions can be directly compared and offers a framework to achieve this within IoT networks.
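The selection step the abstract describes can be sketched in a few lines: each candidate network solution carries a performance model that predicts a metric for the current environment, and the framework picks the best-scoring solution. The solution names, model forms, and numbers below are invented for illustration.

```python
# Illustrative model-driven solution selection: predict a metric per solution
# for the deployed environment, then choose the argmax.
def select_solution(models, environment):
    """models: {name: callable(environment) -> predicted metric, higher is better}
    Returns the best solution name and all predictions."""
    predictions = {name: model(environment) for name, model in models.items()}
    best = max(predictions, key=predictions.get)
    return best, predictions

# Hypothetical models for two routing solutions, scored by predicted delivery ratio.
models = {
    "rpl-etx":  lambda env: 0.95 - 0.04 * env["interference"],
    "rpl-hops": lambda env: 0.90 - 0.01 * env["interference"],
}
best, scores = select_solution(models, {"interference": 3})
print(best)  # the hop-count model degrades less under interference
```

In the paper's setting the models are trained from simulation data rather than written by hand; the selection logic stays the same.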

  6. Implementation of a Real-Time Microgrid Simulation Platform Based on Centralized and Distributed Management

    Directory of Open Access Journals (Sweden)

    Omid Abrishambaf

    2017-06-01

    Full Text Available Demand response and distributed generation are key components of power systems. Several challenges are raised at both technical and business model levels for integration of those resources in smart grids and microgrids. The implementation of a distribution network as a test bed can be difficult and not cost-effective; using computational modeling is not sufficient for producing realistic results. Real-time simulation allows us to validate the business model’s impact at the technical level. This paper comprises a platform supporting the real-time simulation of a microgrid connected to a larger distribution network. The implemented platform allows us to use both centralized and distributed energy resource management. Using an optimization model for the energy resource operation, a virtual power player manages all the available resources. Then, the simulation platform allows us to technically validate the actual implementation of the requested demand reduction in the scope of demand response programs. The case study has 33 buses, 220 consumers, and 68 distributed generators. It demonstrates the impact of demand response events, also performing resource management in the presence of an energy shortage.
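The centralized management step described above — a virtual power player asking demand-response participants to reduce load during an energy shortage — can be sketched as a simple merit-order dispatch. Consumer names, capacities, and prices are illustrative assumptions, not figures from the paper's 220-consumer case study.

```python
# Minimal sketch of demand-response dispatch: cover a generation shortage by
# requesting reductions from enrolled consumers, cheapest remuneration first.
def dispatch_demand_response(shortage_kw, offers):
    """offers: list of (consumer, available_reduction_kw, price_per_kwh).
    Returns the requested reductions and any unmet shortage."""
    plan, remaining = [], shortage_kw
    for consumer, reduction, price in sorted(offers, key=lambda o: o[2]):
        if remaining <= 0:
            break
        taken = min(reduction, remaining)
        plan.append((consumer, taken))
        remaining -= taken
    return plan, max(remaining, 0)

offers = [("c1", 40, 0.12), ("c2", 30, 0.08), ("c3", 50, 0.15)]
plan, unmet = dispatch_demand_response(60, offers)
print(plan, unmet)  # cheapest offers are exhausted first
```

The paper's optimization model is richer than this greedy rule; the sketch only shows where a demand-reduction request originates before the real-time simulator validates its technical feasibility.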

  7. A framework to assess plan implementation maturity with an application to flood management in Vietnam

    NARCIS (Netherlands)

    Phi, Ho Long; Hermans, L.M.; Douven, W.J.A.M.; Halsema, Van G.E.; Khan, Malik Fida

    2015-01-01

    Implementation failure is a long-known Achilles’ heel of water and flood management plans. Contemporary planning approaches address the implementation challenge by using more participatory planning processes to ensure support for plans, assuming that this support will also benefit plan

  8. Towards a framework of critical success factors for implementing supply-chain information systems

    NARCIS (Netherlands)

    Denolf, J.M.; Wognum, P.M.; Trienekens, J.H.; Vorst, van der J.G.A.J.; Omta, S.W.F.

    2015-01-01

    Supply chain information systems (SCISs) have emerged as the core of successful management in supply chains. However, the difficulties of SCIS implementations have been widely cited in the literature. Research on the critical success factors (CSFs) for SCIS implementation is rather scarce and

  9. Implementing a Reentry Framework at a Correctional Facility: Challenges to the Culture

    Science.gov (United States)

    Rudes, Danielle S.; Lerch, Jennifer; Taxman, Faye S.

    2011-01-01

    Implementation research is emerging in the field of corrections, but few studies have examined the complexities associated with implementing change among frontline workers embedded in specific organizational cultures. Using a mixed methods approach, the authors examine the challenges faced by correctional workers in a work release correctional…

  10. A Unified Algebraic and Logic-Based Framework Towards Safe Routing Implementations

    Science.gov (United States)

    2015-08-13

    Software-defined Networks (SDN). We developed a declarative platform for implementing SDN protocols using declarative networking...and debugging several SDN applications. Example-based SDN synthesis. The recent emergence of software-defined networks offers an opportunity to design...

  11. Education Policy Implementation: A Literature Review and Proposed Framework. OECD Education Working Papers, No. 162

    Science.gov (United States)

    Viennet, Romane; Pont, Beatriz

    2017-01-01

    This literature review focuses on education policy implementation, its definition, processes and determinants. It aims to clarify what implementing policies involve in complex education systems to support policy work, building on the literature and country examples. An introduction delves into the reasons behind the need to update the concept of…

  12. dCache: implementing a high-end NFSv4.1 service using a Java NIO framework

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    dCache is a high-performance, scalable storage system widely used by the HEP community. In addition to a set of home-grown protocols, we also provide industry-standard access mechanisms like WebDAV and NFSv4.1. This support places dCache as a direct competitor to commercial solutions. Nevertheless, conforming to a protocol is not enough; our implementations must perform comparably to or even better than commercial systems. To achieve this, dCache uses two high-end IO frameworks from well-known application servers: GlassFish and JBoss. This presentation describes how we implemented an RFC 1831- and RFC 2203-compliant ONC RPC (Sun RPC) service based on the Grizzly NIO framework, part of the GlassFish application server. This ONC RPC service is the key component of dCache's NFSv4.1 implementation, but is independent of dCache and available for other projects. We will also show some details of the dCache NFSv4.1 implementation, describe some of the Java NIO techniques used and, finally, present details of our performance e...

  13. A study on the establishment of national regulatory framework for effective implementation of exemption or clearance concept

    International Nuclear Information System (INIS)

    Cheong, J.H.; Park, S.H.; Suk, T.W.

    1998-01-01

    The concepts of exemption and clearance have a lot of advantages in the aspects of effective use of limited resources, land, and optimization of regulatory works. The exact scopes and extent of the implementation of the concepts, however, can widely vary depending upon each country's own specific situations. In order to support the political decision-making on the practical implementation, a series of possible alternatives, general methodology for decision-making, and factors to be considered were proposed. Five primary categories and subsequent nineteen secondary categories were suggested and discussed, and four-step-approach was introduced in order to show the general guidelines for establishing an appropriate national regulatory framework. Though the specific procedure for each country to get to the practical implementation of the exemption and clearance concepts was not described, it is anticipated that the basic guidelines proposed in this paper can be used as a general reference. (author)

  14. A framework for assessing cost management system changes: the case of activity-based costing implementation at food industry

    Directory of Open Access Journals (Sweden)

    Tayebeh Faraji

    2015-04-01

    Full Text Available An opportunity to investigate the technical and organizational effects of management accounting system changes has appeared with companies' adoption of activity-based costing (ABC). This paper presents an empirical investigation of the effects of an ABC system in a case study from the food industry in Iran. From this case, the paper develops a framework for assessing ABC implementation and hypotheses about factors that influence implementation. The study identifies five cost centers and, for each cost center, determines different cost drivers. The results of our survey indicate that implementation of an ABC system not only supports precise allocation of overhead costs but also helps the company's internal management with better planning and control of production and with making better decisions for the company's profits.
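The core ABC mechanics the abstract relies on — pooling overhead in cost centers and allocating it to products in proportion to cost-driver consumption — can be sketched briefly. All cost centers, driver volumes, and figures below are invented for illustration, not taken from the paper's case study.

```python
# Hedged sketch of activity-based costing allocation: each cost center's
# overhead is spread over products by their share of that center's driver.
def abc_allocate(cost_centers, usage):
    """cost_centers: {center: (overhead, total_driver_volume)}
    usage: {product: {center: driver units consumed}}
    Returns {product: allocated overhead}."""
    rates = {c: overhead / volume for c, (overhead, volume) in cost_centers.items()}
    return {
        product: sum(rates[c] * units for c, units in drivers.items())
        for product, drivers in usage.items()
    }

centers = {"machining": (10000.0, 500), "packaging": (2000.0, 400)}  # machine-hours, boxes
usage = {"bread": {"machining": 300, "packaging": 100},
         "cake":  {"machining": 200, "packaging": 300}}
print(abc_allocate(centers, usage))  # per-product overhead by driver consumption
```

The driver rates (here 20.0 per machine-hour and 5.0 per box) are exactly the per-center figures a study like this would derive before comparing ABC allocations against a traditional single-rate allocation.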

  15. An organizational framework and strategic implementation for system-level change to enhance research-based practice: QUERI Series

    Directory of Open Access Journals (Sweden)

    Mittman Brian S

    2008-05-01

    Full Text Available Abstract Background The continuing gap between available evidence and current practice in health care reinforces the need for more effective solutions, in particular related to organizational context. Considerable advances have been made within the U.S. Veterans Health Administration (VA) in systematically implementing evidence into practice. These advances have been achieved through a system-level program focused on collaboration and partnerships among policy makers, clinicians, and researchers. The Quality Enhancement Research Initiative (QUERI) was created to generate research-driven initiatives that directly enhance health care quality within the VA and, simultaneously, contribute to the field of implementation science. This paradigm-shifting effort provided a natural laboratory for exploring organizational change processes. This article describes the underlying change framework and implementation strategy used to operationalize QUERI. Strategic approach to organizational change QUERI used an evidence-based organizational framework focused on three contextual elements: (1) cultural norms and values, in this case related to the role of health services researchers in evidence-based quality improvement; (2) capacity, in this case among researchers and key partners to engage in implementation research; and (3) supportive infrastructures to reinforce expectations for change and to sustain new behaviors as part of the norm. As part of a QUERI Series in Implementation Science, this article describes the framework's application in an innovative integration of health services research, policy, and clinical care delivery. Conclusion QUERI's experience and success provide a case study in organizational change. It demonstrates that progress requires a strategic, systems-based effort. QUERI's evidence-based initiative involved a deliberate cultural shift, requiring ongoing commitment in multiple forms and at multiple levels. 

  16. Understanding the interaction between wild fire and vegetation distribution within the NCAR CESM framework

    Science.gov (United States)

    Seo, H.; Kim, Y.; Kim, H. J.

    2017-12-01

    Every year, wildfire burns about 400 Mha of land, releasing about 2 Pg of carbon from the surface. Fire thus affects not only the carbon cycle but also terrestrial ecosystems. This study aims to understand the role of fire in geographic vegetation distribution and terrestrial carbon balances within the NCAR CESM framework, specifically with the CLM-BGC and CLM-BGC-DV. Global climate data from the Climate Research Unit (CRU)-National Centers for Environmental Prediction (NCEP) dataset, ranging from 1901 to 2010, are used to drive the land models. First, by comparing fire-on and fire-off simulations with the CLM-BGC-DV, the fire impacts in dynamic vegetation are quantified by the fractional land areas of the different plant functional types. In addition, we examine how changes in vegetation distribution affect the total sum of the burned areas and the carbon balances. This study would provide the limits of and suggestions for the fire and dynamic vegetation modules of the CLM-BGC. Acknowledgements: This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT & Future Planning (2015R1C1A2A01054800) and by the Korea Meteorological Administration R&D Program under Grant KMIPA 2015-6180. This work was also supported by the Yonsei University Future-leading Research Initiative of 2015 (2016-22-0061).

  17. Where-Fi: a dynamic energy-efficient multimedia distribution framework for MANETs

    Science.gov (United States)

    Mohapatra, Shivajit; Carbunar, Bogdan; Pearce, Michael; Chaudhri, Rohit; Vasudevan, Venu

    2008-01-01

    Next generation mobile ad-hoc applications will revolve around users' need for sharing content/presence information with co-located devices. However, keeping such information fresh requires frequent meta-data exchanges, which could result in significant energy overheads. To address this issue, we propose distributed algorithms for energy efficient dissemination of presence and content usage information between nodes in mobile ad-hoc networks. First, we introduce a content dissemination protocol (called CPMP) for effectively distributing frequent small meta-data updates between co-located devices using multicast. We then develop two distributed algorithms that use the CPMP protocol to achieve "phase locked" wake up cycles for all the participating nodes in the network. The first algorithm is designed for fully-connected networks and then extended in the second to handle hidden terminals. The "phase locked" schedules are then exploited to adaptively transition the network interface to a deep sleep state for energy savings. We have implemented a prototype system (called "Where-Fi") on several Motorola Linux-based cell phone models. Our experimental results show that for all network topologies our algorithms were able to achieve "phase locking" between nodes even in the presence of hidden terminals. Moreover, we achieved battery lifetime extensions of as much as 28% for fully connected networks and about 20% for partially connected networks.
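The "phase locked" wake-up idea above can be illustrated with a toy synchronisation rule: nodes sharing a duty-cycle period nudge their wake offsets toward the circular mean of the offsets advertised in neighbours' beacons, so wake windows converge and radios can sleep through the rest of the cycle. The update rule, gain, and constants are assumptions for illustration, not the CPMP protocol itself.

```python
# Toy phase-locking of duty-cycle wake offsets via a circular-mean update.
import cmath
import math

def phase_lock_step(offsets, period, gain=0.5):
    """One gossip round: every node moves part-way (gain) toward the circular
    mean of all advertised wake offsets, along the shortest circular path."""
    mean_angle = cmath.phase(sum(cmath.exp(2j * math.pi * o / period) for o in offsets))
    target = (mean_angle / (2 * math.pi) * period) % period
    return [(o + gain * (((target - o + period / 2) % period) - period / 2)) % period
            for o in offsets]

def circ_dist(a, b, period=100.0):
    """Shortest distance between two offsets on a circular schedule."""
    d = abs(a - b) % period
    return min(d, period - d)

offsets = [10.0, 90.0, 20.0]   # initially scattered wake offsets (period = 100)
for _ in range(8):             # a few synchronisation rounds
    offsets = phase_lock_step(offsets, period=100.0)
spread = max(circ_dist(a, b) for a in offsets for b in offsets)
print(round(spread, 3))        # wake windows have converged close together
```

Once the spread is smaller than the wake-window length, every node hears every beacon while awake, which is what lets the algorithms in the paper push the interface into deep sleep between windows.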

  18. [Sustainable Implementation of Evidence-Based Programmes in Health Promotion: A Theoretical Framework and Concept of Interactive Knowledge to Action].

    Science.gov (United States)

    Rütten, A; Wolff, A; Streber, A

    2016-03-01

    This article discusses 2 current issues in the field of public health research: (i) transfer of scientific knowledge into practice and (ii) sustainable implementation of good practice projects. It also supports integration of scientific and practice-based evidence production. Furthermore, it supports utilisation of interactive models that transcend deductive approaches to the process of knowledge transfer. Existing theoretical approaches, pilot studies and thoughtful conceptual considerations are incorporated into a framework showing the interplay of science, politics and prevention practice, which fosters a more sustainable implementation of health promotion programmes. The framework depicts 4 key processes of interaction between science and prevention practice: interactive knowledge to action, capacity building, programme adaptation and adaptation of the implementation context. Ensuring sustainability of health promotion programmes requires a concentrated process of integrating scientific and practice-based evidence production in the context of implementation. Central to the integration process is the approach of interactive knowledge to action, which especially benefits from capacity building processes that facilitate participation and systematic interaction between relevant stakeholders. Intense cooperation also induces a dynamic interaction between multiple actors and components such as health promotion programmes, target groups, relevant organisations and social, cultural and political contexts. The reciprocal adaptation of programmes and key components of the implementation context can foster effectiveness and sustainability of programmes. Sustainable implementation of evidence-based health promotion programmes requires alternatives to recent deductive models of knowledge transfer. Interactive approaches prove to be promising alternatives. Simultaneously, they change the responsibilities of science, policy and public health practice. Existing boundaries

  19. Running ATLAS workloads within massively parallel distributed applications using Athena Multi-Process framework (AthenaMP)

    CERN Document Server

    Calafiura, Paolo; The ATLAS collaboration; Seuster, Rolf; Tsulaia, Vakhtang; van Gemmeren, Peter

    2015-01-01

    AthenaMP is a multi-process version of the ATLAS reconstruction and data analysis framework Athena. By leveraging Linux fork and copy-on-write, it allows the sharing of memory pages between event processors running on the same compute node with little to no change in the application code. Originally targeted to optimize the memory footprint of reconstruction jobs, AthenaMP has demonstrated that it can reduce the memory usage of certain configurations of ATLAS production jobs by a factor of 2. AthenaMP has also evolved to become the parallel event-processing core of the recently developed ATLAS infrastructure for fine-grained event processing (Event Service) which allows running AthenaMP inside massively parallel distributed applications on hundreds of compute nodes simultaneously. We present the architecture of AthenaMP, various strategies implemented by AthenaMP for scheduling workload to worker processes (for example: Shared Event Queue and Shared Distributor of Event Tokens) and the usage of AthenaMP in the...

  20. Running ATLAS workloads within massively parallel distributed applications using Athena Multi-Process framework (AthenaMP)

    CERN Document Server

    Calafiura, Paolo; Seuster, Rolf; Tsulaia, Vakhtang; van Gemmeren, Peter

    2015-01-01

    AthenaMP is a multi-process version of the ATLAS reconstruction, simulation and data analysis framework Athena. By leveraging Linux fork and copy-on-write, it allows for sharing of memory pages between event processors running on the same compute node with little to no change in the application code. Originally targeted to optimize the memory footprint of reconstruction jobs, AthenaMP has demonstrated that it can reduce the memory usage of certain configurations of ATLAS production jobs by a factor of 2. AthenaMP has also evolved to become the parallel event-processing core of the recently developed ATLAS infrastructure for fine-grained event processing (Event Service) which allows running AthenaMP inside massively parallel distributed applications on hundreds of compute nodes simultaneously. We present the architecture of AthenaMP, various strategies implemented by AthenaMP for scheduling workload to worker processes (for example: Shared Event Queue and Shared Distributor of Event Tokens) and the usage of Ath...
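The fork/copy-on-write pattern plus a shared event queue that both AthenaMP records describe can be sketched generically: the parent builds a large read-only structure once, then forks workers that share its pages copy-on-write and pull event numbers from a common queue. This is a conceptual sketch, not ATLAS code; names are illustrative.

```python
# Conceptual sketch of the AthenaMP pattern: fork workers after loading large
# read-only data, so its pages are shared copy-on-write, and feed events
# through a shared queue (cf. AthenaMP's Shared Event Queue).
import multiprocessing as mp

GEOMETRY = list(range(1_000_000))  # stand-in for detector geometry/conditions

def worker(queue, results):
    while True:
        event = queue.get()
        if event is None:          # sentinel: no more events
            break
        # Read-only access keeps these pages shared with the parent.
        results.put((event, GEOMETRY[event % len(GEOMETRY)]))

ctx = mp.get_context("fork")       # 'fork' reproduces the copy-on-write behaviour
queue, results = ctx.Queue(), ctx.Queue()
procs = [ctx.Process(target=worker, args=(queue, results)) for _ in range(2)]
for p in procs:
    p.start()
for event in range(6):             # enqueue work, then one sentinel per worker
    queue.put(event)
for _ in procs:
    queue.put(None)
out = sorted(results.get() for _ in range(6))
for p in procs:
    p.join()
print(out)
```

The memory saving comes entirely from forking after initialisation: pages are only duplicated if a worker writes to them, which is why AthenaMP needs "little to no change in the application code".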

  1. An analysis of Cobit 5 as a framework for the implementation of it governance with reference to King III

    Directory of Open Access Journals (Sweden)

    Maseko, L.

    2016-02-01

    Full Text Available Owing to the complexity and general lack of understanding of information technology (“IT”), the management of IT is often treated as a separately managed value-providing asset. This has resulted in IT rarely receiving the necessary attention of the board, thus creating a disconnect between the board and IT. The King Code of Governance for South Africa 2009 (hereafter referred to as “King III”) provides principles and recommended practices for effective IT governance in order to create a greater awareness at board level. King III, however, provides no detailed guidance with regard to the practical implementation of these principles and practices. It is worth noting that numerous international guidelines are recommended within King III that can be adopted as frameworks to assist in the effective implementation of IT governance. COBIT 5 provides, as part of its governance process practices, related guidance activities linking it to the seven IT governance principles of King III, thus making it a practical framework for the implementation of King III recommendations. This study sought to establish the extent to which the governance processes, practices and activities of COBIT 5 are mapped to the recommended practices of IT governance as highlighted in King III in order to resolve COBIT 5 as the de facto framework for IT governance in terms of King III. The study found that though King III principles and practices may be interpreted as vague with regard to how to implement IT governance principles, COBIT 5 succeeds in bridging the gap between control requirements, technical issues, information systems and business risk, which consequently results in a better facilitation of IT governance. The study also revealed that COBIT 5 contains additional activities to assist the board in more transparent reporting of IT performance and conformance management to stakeholders as well as activities which enable the connection of resource management with human

  2. A Design Based Research Framework for Implementing a Transnational Mobile and Blended Learning Solution

    Science.gov (United States)

    Palalas, Agnieszka; Berezin, Nicole; Gunawardena, Charlotte; Kramer, Gretchen

    2015-01-01

    The article proposes a modified Design-Based Research (DBR) framework which accommodates the various socio-cultural factors that emerged in the longitudinal PA-HELP research study at Central University College (CUC) in Ghana, Africa. A transnational team of stakeholders from Ghana, Canada, and the USA collaborated on the development,…

  3. Examining Teachers' Beliefs about and Implementation of a Balanced Literacy Framework

    Science.gov (United States)

    Bingham, Gary E.; Hall-Kenyon, Kendra M.

    2013-01-01

    While many embrace balanced literacy as a framework for quality literacy instruction, the way in which teachers operationalise the tenets of balanced literacy can vary greatly. In the present study, 581 teachers in the United States completed questionnaires concerning: (a) their beliefs about literacy skills and literacy instructional strategies…

  4. Using the Consolidated Framework for Implementation Research (CFIR) to produce actionable findings: a rapid-cycle evaluation approach to improving implementation.

    Science.gov (United States)

    Keith, Rosalind E; Crosson, Jesse C; O'Malley, Ann S; Cromp, DeAnn; Taylor, Erin Fries

    2017-02-10

    Much research does not address the practical needs of stakeholders responsible for introducing health care delivery interventions into organizations working to achieve better outcomes. In this article, we present an approach to using the Consolidated Framework for Implementation Research (CFIR) to guide systematic research that supports rapid-cycle evaluation of the implementation of health care delivery interventions and produces actionable evaluation findings intended to improve implementation in a timely manner. To present our approach, we describe a formative cross-case qualitative investigation of 21 primary care practices participating in the Comprehensive Primary Care (CPC) initiative, a multi-payer supported primary care practice transformation intervention led by the Centers for Medicare and Medicaid Services. Qualitative data include observational field notes and semi-structured interviews with primary care practice leadership, clinicians, and administrative and medical support staff. We use intervention-specific codes, and CFIR constructs to reduce and organize the data to support cross-case analysis of patterns of barriers and facilitators relating to different CPC components. Using the CFIR to guide data collection, coding, analysis, and reporting of findings supported a systematic, comprehensive, and timely understanding of barriers and facilitators to practice transformation. Our approach to using the CFIR produced actionable findings for improving implementation effectiveness during this initiative and for identifying improvements to implementation strategies for future practice transformation efforts. The CFIR is a useful tool for guiding rapid-cycle evaluation of the implementation of practice transformation initiatives. Using the approach described here, we systematically identified where adjustments and refinements to the intervention could be made in the second year of the 4-year intervention. We think the approach we describe has broad

  5. A theoretical framework for convergence and continuous dependence of estimates in inverse problems for distributed parameter systems

    Science.gov (United States)

    Banks, H. T.; Ito, K.

    1988-01-01

    Numerical techniques for parameter identification in distributed-parameter systems are developed analytically. A general convergence and stability framework (for continuous dependence on observations) is derived for first-order systems on the basis of (1) a weak formulation in terms of sesquilinear forms and (2) the resolvent convergence form of the Trotter-Kato approximation. The extension of this framework to second-order systems is considered.
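The two ingredients named in the abstract can be written compactly. The notation below (Gelfand triple V ⊂ H ⊂ V*, parameter q, approximating generators A^N on subspaces with projections P^N, semigroups T(t;q)) is the standard setting for this framework and is an assumption on our part, not a quotation from the paper.

```latex
% (1) Weak formulation via a parameter-dependent sesquilinear form \sigma(q):
%     find u(t) \in V such that, for all v \in V,
\begin{align*}
  &\langle \dot u(t),\, v\rangle + \sigma(q)\bigl(u(t),\, v\bigr)
    = \langle f(t),\, v\rangle \qquad \forall\, v \in V,\\
% (2) Trotter--Kato: resolvent convergence of the approximating generators
%     implies convergence of the semigroups, uniformly in t on compacts:
  &\bigl(\lambda I - A^{N}(q^{N})\bigr)^{-1} P^{N} x \;\longrightarrow\;
   \bigl(\lambda I - A(q)\bigr)^{-1} x
   \;\;\Longrightarrow\;\;
   T^{N}(t; q^{N})\, P^{N} x \;\longrightarrow\; T(t; q)\, x .
\end{align*}
```

The second implication is what yields convergence and continuous dependence of the parameter estimates as the approximating problems are refined; the extension to second-order systems mentioned in the abstract proceeds by rewriting them in first-order form.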

  6. Implementation and validation of the condensation model for containment hydrogen distribution studies

    International Nuclear Information System (INIS)

    Ravva, Srinivasa Rao; Iyer, Kannan N.; Gupta, S.K.; Gaikwad, Avinash J.

    2014-01-01

    Highlights: • A condensation model based on diffusion was implemented in FLUENT. • Validation of the condensation model for H2 distribution studies was performed. • Multi-component diffusion is used in the present work. • An appropriate grid and turbulence model were identified. - Abstract: This paper aims at the implementation details of a condensation model in the CFD code FLUENT and its validation so that it can be used in performing containment hydrogen distribution studies. In such studies, computational fluid dynamics simulations are necessary for obtaining accurate predictions. While steam condensation plays an important role, commercial CFD codes such as FLUENT do not have an in-built condensation model. Therefore, a condensation model was developed and implemented in the FLUENT code through user defined functions (UDFs) for the sink terms in the mass, momentum, energy and species balance equations, together with associated turbulence quantities, viz., kinetic energy and dissipation rate. The implemented model was validated against the ISP-47 test of the TOSQAN facility using the standard wall functions and enhanced wall treatment approaches. The most suitable grid size and turbulence model for low-density gas (He) distribution studies are brought out in this paper.
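A diffusion-based condensation sink of the kind the abstract describes can be sketched for one wall cell: the steam mass flux toward the wall follows a Stefan-flow (logarithmic) diffusion-layer expression, and the same flux multiplied by the latent heat gives the energy sink. The formula choice, the crude wall-area estimate, and all numbers are illustrative assumptions, not the paper's exact UDF.

```python
# Hedged sketch of a diffusion-layer condensation sink for a wall cell.
import math

def condensation_sink(rho, D, delta, y_bulk, y_wall, h_fg, cell_volume):
    """Return (mass sink [kg/(m^3 s)], energy sink [W/m^3]) for one wall cell.
    rho: mixture density, D: vapour diffusivity, delta: cell-centre-to-wall
    distance, y_bulk/y_wall: steam mass fraction in the cell and at the wall
    (saturation value), h_fg: latent heat of condensation."""
    # Stefan-flow diffusion flux toward the wall (positive when condensing):
    flux = rho * (D / delta) * math.log((1.0 - y_wall) / (1.0 - y_bulk))
    area = cell_volume / delta           # crude wall-area estimate for the sketch
    s_mass = -flux * area / cell_volume  # steam removed per unit cell volume
    return s_mass, s_mass * h_fg         # energy sink via latent heat release

s_m, s_e = condensation_sink(rho=1.0, D=2.5e-5, delta=1e-3,
                             y_bulk=0.4, y_wall=0.2, h_fg=2.25e6,
                             cell_volume=1e-6)
print(s_m, s_e)  # both negative: mass and energy leave the gas phase
```

In FLUENT such an expression would live inside a `DEFINE_SOURCE` UDF hooked to the mass, energy and species equations, with matching sink terms for momentum and the turbulence quantities, as the abstract indicates.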

  7. Improving district level health planning and priority setting in Tanzania through implementing accountability for reasonableness framework: Perceptions of stakeholders.

    Science.gov (United States)

    Maluka, Stephen; Kamuzora, Peter; San Sebastián, Miguel; Byskov, Jens; Ndawi, Benedict; Hurtig, Anna-Karin

    2010-12-01

    In 2006, researchers and decision-makers launched a five-year project - Response to Accountable Priority Setting for Trust in Health Systems (REACT) - to improve planning and priority-setting through implementing the Accountability for Reasonableness framework in Mbarali District, Tanzania. The objective of this paper is to explore the acceptability of Accountability for Reasonableness from the perspectives of the Council Health Management Team, local government officials, health workforce and members of user boards and committees. Individual interviews were carried out with different categories of actors and stakeholders in the district. The interview guide consisted of a series of questions, asking respondents to describe their perceptions regarding each condition of the Accountability for Reasonableness framework in terms of priority setting. Interviews were analysed using thematic framework analysis. Documentary data were used to support, verify and highlight the key issues that emerged. Almost all stakeholders viewed Accountability for Reasonableness as an important and feasible approach for improving priority-setting and health service delivery in their context. However, a few aspects of Accountability for Reasonableness were seen as too difficult to implement given the socio-political conditions and traditions in Tanzania. Respondents mentioned: budget ceilings and guidelines, low level of public awareness, unreliable and untimely funding, as well as the limited capacity of the district to generate local resources as the major contextual factors that hampered the full implementation of the framework in their context. This study was one of the first assessments of the applicability of Accountability for Reasonableness in health care priority-setting in Tanzania. The analysis, overall, suggests that the Accountability for Reasonableness framework could be an important tool for improving priority-setting processes in the context of resource-poor settings.

  8. EVALUATION OF SUSTAINABLE DEVELOPMENT FRAMEWORK FOR VOCATIONAL COLLEGES: IMPLEMENTATION AND CHALLENGES

    OpenAIRE

    Minghat, Asnul Dahar; Safie, Siti Nadia Mohd; Mustakim, Siti Salina

    2018-01-01

    The main purpose of this study was to explore the extensive growth of courses among the Vocational Colleges developed by the Ministry of Education since 2012. In order to determine the sustainability of the development, this study specifically seeks to explore the scope and quality of Vocational Colleges’ Curriculum Standard implementation via the integration of sustainability development among educators, and to investigate challenges that occurred during the implementation of the college ever sin...

  9. From situation modelling to a distributed rule-based platform for situation awareness : an ontological framework for disaster management applications

    NARCIS (Netherlands)

    Moreira, João

    2015-01-01

    Situation-aware (SA) applications are particularly useful for disaster management. The complex nature of emergency scenarios presents challenges to the development of collaborative and distributed SA solutions. These challenges concern the whole lifecycle, from specification to implementation

  10. Deconstructing public participation in the Water Framework Directive: implementation and compliance with the letter or with the spirit of the law

    NARCIS (Netherlands)

    Ker Rault, P.A.; Jeffrey, P.J.

    2008-01-01

    This article offers a fresh reading of the Water Framework Directive (WFD) and of the Common Implementation Strategy guidance document number 8 on public participation (PP) aimed at identifying the conditions required for successful implementation. We propose that a central barrier to implementing

  11. Guidelines for Implementing Advanced Distribution Management Systems-Requirements for DMS Integration with DERMS and Microgrids

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Jianhui [Argonne National Lab. (ANL), Argonne, IL (United States); Chen, Chen [Argonne National Lab. (ANL), Argonne, IL (United States); Lu, Xiaonan [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-08-01

    This guideline focuses on the integration of DMS with DERMS and microgrids connected to the distribution grid by defining generic and fundamental design and implementation principles and strategies. It starts by addressing the current status, objectives, and core functionalities of each system, and then discusses the new challenges and the common principles of DMS design and implementation for integration with DERMS and microgrids to realize enhanced grid operation reliability and quality power delivery to consumers while also achieving the maximum energy economics from the DER and microgrid connections.

  12. A Framework for the Generation and Dissemination of Drop Size Distribution (DSD) Characteristics Using Multiple Platforms

    Science.gov (United States)

    Wolf, David B.; Tokay, Ali; Petersen, Walt; Williams, Christopher; Gatlin, Patrick; Wingo, Mathew

    2010-01-01

    Proper characterization of the precipitation drop size distribution (DSD) is integral to providing realistic and accurate space- and ground-based precipitation retrievals. Current technology allows for the development of DSD products from a variety of platforms, including disdrometers, vertical profilers and dual-polarization radars. Up to now, however, the dissemination or availability of such products has been limited to individual sites and/or field campaigns, in a variety of formats, often using inconsistent algorithms for computing the integral DSD parameters, such as the median- and mass-weighted drop diameter, total number concentration, liquid water content, rain rate, etc. We propose to develop a framework for the generation and dissemination of DSD characteristic products using a unified structure, capable of handling the myriad collection of disdrometers, profilers, and dual-polarization radar data currently available and to be collected during several upcoming GPM Ground Validation field campaigns. This DSD super-structure paradigm is an adaptation of the radar super-structure developed for NASA's Radar Software Library (RSL) and RSL_in_IDL. The goal is to provide the DSD products in a well-documented format, most likely NetCDF, along with tools to ingest and analyze the products. In so doing, we can develop a robust archive of DSD products from multiple sites and platforms, which should greatly benefit the development and validation of precipitation retrieval algorithms for GPM and other precipitation missions. An outline of this proposed framework will be provided as well as a discussion of the algorithms used to calculate the DSD parameters.
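The integral DSD parameters named in this record follow directly from a binned disdrometer spectrum. A minimal sketch under stated assumptions (bin centres in mm, concentrations in m⁻³ mm⁻¹; the Atlas-Ulbrich power-law fall speed v(D) = 3.78 D^0.67 m/s is one of several possible choices):

```python
import math

def dsd_parameters(diam_mm, conc, widths_mm):
    """Integral parameters from a binned drop size distribution.

    diam_mm:   bin-centre diameters [mm]
    conc:      concentrations N(D) [m^-3 mm^-1]
    widths_mm: bin widths [mm]
    """
    m3 = sum(n * d**3 * w for n, d, w in zip(conc, diam_mm, widths_mm))
    m4 = sum(n * d**4 * w for n, d, w in zip(conc, diam_mm, widths_mm))
    nt = sum(n * w for n, w in zip(conc, widths_mm))   # total concentration, m^-3
    lwc = math.pi / 6.0 * 1e-3 * m3                    # liquid water content, g m^-3
    dm = m4 / m3                                       # mass-weighted mean diameter, mm
    # rain rate with the assumed fall-speed law v(D) = 3.78 D^0.67 m/s
    rr = 6.0 * math.pi * 1e-4 * sum(
        n * d**3 * 3.78 * d**0.67 * w
        for n, d, w in zip(conc, diam_mm, widths_mm))  # mm h^-1
    return {"Nt": nt, "LWC": lwc, "Dm": dm, "R": rr}
```

For an exponential spectrum N(D) = N0 exp(-ΛD), these sums recover the analytic values Nt = N0/Λ and Dm = 4/Λ, which makes the routine easy to sanity-check.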

  13. Monitoring and evaluation of spatially managed areas: A generic framework for implementation of ecosystem based marine management and its application

    DEFF Research Database (Denmark)

    Stelzenmüller, Vanessa; Breen, Patricia; Stamford, Tammy

    2013-01-01

    This study introduces a framework for the monitoring and evaluation of spatially managed areas (SMAs), which is currently being tested by nine European case studies. The framework provides guidance on the selection, mapping, and assessment of ecosystem components and human pressures, the evaluation of management effectiveness and potential adaptations to management. Moreover, it provides a structured approach with advice on spatially explicit tools for practical tasks like the assessment of cumulative impacts of human pressures or pressure-state relationships. The case studies revealed emerging challenges … on qualitative information are addressed. The lessons learned will provide a better insight into the full range of methods and approaches required to support the implementation of the ecosystem approach to marine spatial management in Europe and elsewhere.

  14. A Framework to Evaluate Ecological and Social Outcomes of Collaborative Management: Lessons from Implementation with a Northern Arizona Collaborative Group

    Science.gov (United States)

    Muñoz-Erickson, Tischa A.; Aguilar-González, Bernardo; Loeser, Matthew R. R.; Sisk, Thomas D.

    2010-01-01

    As collaborative groups gain popularity as an alternative means for addressing conflict over management of public lands, the need for methods to evaluate their effectiveness in achieving ecological and social goals increases. However, frameworks that examine both effectiveness of the collaborative process and its outcomes are poorly developed or altogether lacking. This paper presents and evaluates the utility of the holistic ecosystem health indicator (HEHI), a framework that integrates multiple ecological and socioeconomic criteria to evaluate management effectiveness of collaborative processes. Through the development and application of the HEHI to a collaborative in northern Arizona, the Diablo Trust, we present the opportunities and challenges in using this framework to evaluate the ecological and social outcomes of collaborative adaptive management. Baseline results from the first application of the HEHI are presented as an illustration of its potential as a co-adaptive management tool. We discuss lessons learned from the process of selecting indicators and potential issues to their long-term implementation. Finally, we provide recommendations for applying this framework to monitoring and adaptive management in the context of collaborative management.

  15. Building an Ensemble Seismic Hazard Model for the Magnitude Distribution by Using Alternative Bayesian Implementations

    Science.gov (United States)

    Taroni, M.; Selva, J.

    2017-12-01

    In this work we show how we built an ensemble seismic hazard model for the magnitude distribution for the TSUMAPS-NEAM EU project (http://www.tsumaps-neam.eu/). The considered source area includes the whole NEAM region (North East Atlantic, Mediterranean and connected seas). We build our models by using the catalogs (EMEC and ISC), their completeness and the regionalization provided by the project. We developed four alternative implementations of a Bayesian model, considering tapered or truncated Gutenberg-Richter distributions, and fixed or variable b-value. The frequency size distribution is based on the Weichert formulation. This allows for simultaneously assessing all the frequency-size distribution parameters (a-value, b-value, and corner magnitude), using multiple completeness periods for the different magnitudes. With respect to previous studies, we introduce the tapered Pareto distribution (in addition to the classical truncated Pareto), and we build a novel approach to quantify the prior distribution. For each alternative implementation, we set the prior distributions using the global seismic data grouped according to the different types of tectonic setting, and assigned them to the related regions. The estimation is based on the complete (not declustered) local catalog in each region. Using the complete catalog also allows us to consider foreshocks and aftershocks in the seismic rate computation: the Poissonicity of the tsunami events (and similarly the exceedances of the PGA) will be ensured by Le Cam's theorem. This Bayesian approach provides robust estimates even in zones where few events are available, and also leaves open the possibility to explore the uncertainty associated with the estimation of the magnitude distribution parameters (e.g. with the classical Metropolis-Hastings Monte Carlo method). Finally we merge all the models with their uncertainty to create the ensemble model that represents our knowledge of the seismicity in the
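The tapered Pareto (Kagan) form this record contrasts with the truncated Pareto multiplies the power-law tail in seismic moment by an exponential taper above a corner moment. A hedged sketch in magnitude terms, using the Hanks-Kanamori moment conversion (the parameter values in the usage below are illustrative, not the project's estimates):

```python
import math

def moment_from_mw(mw):
    """Seismic moment [N*m] from moment magnitude (Hanks-Kanamori)."""
    return 10 ** (1.5 * mw + 9.1)

def tapered_gr_ccdf(mw, mw_min, beta, mw_corner):
    """P(M > m) for the tapered Pareto distribution in moment space.

    beta is the moment-space slope (beta ~ (2/3) * b for Gutenberg-Richter
    b-value); mw_corner sets the exponential taper.
    """
    m = moment_from_mw(mw)
    mt = moment_from_mw(mw_min)
    mc = moment_from_mw(mw_corner)
    return (mt / m) ** beta * math.exp((mt - m) / mc)
```

At the completeness magnitude the CCDF equals 1, and above the corner magnitude the taper pulls the tail below the pure Pareto, which is the behaviour the tapered alternative is introduced to capture.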

  16. An Equivalent cross-section Framework for improving computational efficiency in Distributed Hydrologic Modelling

    Science.gov (United States)

    Khan, Urooj; Tuteja, Narendra; Ajami, Hoori; Sharma, Ashish

    2014-05-01

    While the potential uses and benefits of distributed catchment simulation models are undeniable, their practical usage is often hindered by the computational resources they demand. To reduce the computational time/effort in distributed hydrological modelling, a new approach of modelling over an equivalent cross-section is investigated, where topographical and physiographic properties of first-order sub-basins are aggregated to constitute modelling elements. To formulate an equivalent cross-section, a homogenization test is conducted to assess the loss in accuracy when averaging topographic and physiographic variables, i.e. length, slope, soil depth and soil type. The homogenization test indicates that the accuracy lost in weighting the soil type is greatest; therefore it needs to be weighted in a systematic manner to formulate equivalent cross-sections. If the soil type remains the same within the sub-basin, a single equivalent cross-section is formulated for the entire sub-basin. If the soil type follows a specific pattern, i.e. different soil types near the centre of the river, middle of hillslope and ridge line, three equivalent cross-sections (left bank, right bank and head water) are required. If the soil types are complex and do not follow any specific pattern, multiple equivalent cross-sections are required based on the number of soil types. The equivalent cross-sections are formulated for a series of first-order sub-basins by implementing different weighting methods of topographic and physiographic variables of landforms within the entire or part of a hillslope. The formulated equivalent cross-sections are then simulated using a 2-dimensional, Richards' equation based distributed hydrological model. The simulated fluxes are multiplied by the weighted area of each equivalent cross-section to calculate the total fluxes from the sub-basins. The simulated fluxes include horizontal flow, transpiration, soil evaporation, deep drainage and soil moisture. To assess
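The two aggregation steps this record describes — area-weighting topographic variables to build an equivalent cross-section, then multiplying each cross-section's simulated flux by the area it represents — can be sketched as follows (function names and the linear weighting are assumptions for illustration):

```python
def equivalent_property(areas, values):
    """Area-weighted average of a topographic/physiographic variable
    (e.g. slope or soil depth) over the landforms it aggregates."""
    total = sum(areas)
    return sum(a * v for a, v in zip(areas, values)) / total

def subbasin_flux(weighted_areas, unit_fluxes):
    """Total sub-basin flux: each equivalent cross-section's simulated flux
    per unit area, scaled by the area it represents, then summed."""
    return sum(a * f for a, f in zip(weighted_areas, unit_fluxes))
```

For example, two landforms of area 2 and 1 km² with slopes 0.1 and 0.4 give an equivalent slope of 0.2; the same weighted-sum pattern recovers total horizontal flow, transpiration, or drainage from per-unit-area fluxes.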

  17. Implementing Relative Ranking Evaluation Framework at Department of Energy (DOE) installations

    International Nuclear Information System (INIS)

    Sharma, S.K.; Williamson, D.; Treichel, L.C.; James, L.M.

    1996-01-01

    The US Department of Energy (DOE) Office of Environmental Restoration (EM-40) has developed the Relative Ranking Evaluation Framework (RREF) to help categorize release sites, facilities and buildings requiring restoration or decommissioning. Based on this framework, a computer tool, the Relative Rank Evaluation Program (RREP), has been developed to evaluate release sites, facilities and buildings, and to manage information pertaining to relative ranking evaluations. The relative ranking information is being used by both Headquarters and field project managers, and other environmental personnel responsible for planning, executing and evaluating environmental restoration activities at DOE installations. External stakeholders, such as representatives of federal and state regulatory agencies, local governments and communities in the vicinity of current and formerly used DOE installations, may use this data to review proposed and planned activities.

  18. Implementation of Electricity Business Competition Framework with Economic Dispatch Direct Method

    Directory of Open Access Journals (Sweden)

    Yusra Sabri

    2012-12-01

    Technically, electricity business under a competition structure is more complex than that of a vertically integrated one. The main problems here are how to create an applicable competition framework and how to solve electric calculations quickly enough to obtain optimal energy pricing, cost of losses, congestion and transportation costs in less than 15 minutes. This paper proposes a competition framework with the electric calculations, where a bilateral contract has been accommodated. Optimal energy price in the paper is calculated based on the direct method of economic dispatch to obtain the result very quickly. The proposed method has been simulated on a 4-bus system. The simulation results show that the method works well and complies with the expectation. Therefore, electric power business under a competition structure can be well realized by the proposed method.

  19. Designing and implementing the logical security framework for e-commerce based on service oriented architecture

    OpenAIRE

    Luhach, Ashish Kr.; Dwivedi, Sanjay K; Jha, C K

    2014-01-01

    Rapid evolution of information technology has contributed to the evolution of more sophisticated E-commerce systems with better transaction time and protection. The currently used E-commerce models lack quality properties such as logical security because of their poor design and the highly equipped and trained intruders they face. This editorial proposes a security framework for small and medium sized E-commerce, based on service oriented architecture, and gives an analysis of the emin...

  20. FireCalc: An XML-based framework for distributed data analysis

    International Nuclear Information System (INIS)

    Duarte, A.S.; Santos, J.H.; Fernandes, H.; Neto, A.; Pereira, T.; Varandas, C.A.F.

    2008-01-01

    Requirements and specifications for Control Data Access and Communication (CODAC) systems in fusion reactors point towards flexible and modular solutions, independent from operating system and computer architecture. These concepts can also be applied to calculation and data analysis systems, where highly standardized solutions must also apply in order to anticipate long time-scales and high technology evolution changes. FireCalc is an analysis tool based on standard Extensible Markup Language (XML) technologies. Actions are described in an XML file, which contains necessary data specifications and the code or references to scripts. This is used by the user to send the analysis code and data to a server, which can be running either locally or remotely. Communications between the user and the server are performed through XML-RPC, an XML based remote procedure call, thus enabling the client and server to be coded in different computer languages. Access to the database, security procedures and calls to the code interpreter are handled through independent modules, which unbinds them from specific solutions. Currently there is an implementation of the FireCalc framework in Java, that uses the Shared Data Access System (SDAS) for accessing the ISTTOK database and the Scilab kernel for the numerical analysis
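The client-server exchange FireCalc is built on can be mimicked with Python's standard-library XML-RPC modules. This is a toy stand-in, not FireCalc's actual interface: the `run_analysis` function and the action string are hypothetical, and the point is only that XML-RPC lets client and server be written in different languages.

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def run_analysis(action_xml):
    # Hypothetical stand-in for dispatching an XML action description
    # (data specifications plus code or script references) to an interpreter.
    return f"executed:{action_xml}"

# Server side: register the callable and serve on an OS-assigned port.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False, allow_none=True)
server.register_function(run_analysis)
port = server.server_address[1]
t = threading.Thread(target=server.serve_forever, daemon=True)
t.start()

# Client side: the remote call is serialized as XML over HTTP.
client = ServerProxy(f"http://127.0.0.1:{port}")
result = client.run_analysis("<action name='fft'/>")
server.shutdown()
```

In the real framework the server-side handler would pass the action through the independent database-access, security, and interpreter modules the abstract describes, rather than echoing it back.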

  1. FireCalc: An XML-based framework for distributed data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Duarte, A.S. [Associacao Euratom/IST, Centro de Fusao Nuclear, Av. Rovisco Pais P-1049-001 Lisboa (Portugal)], E-mail: andre.duarte@cfn.ist.utl.pt; Santos, J.H.; Fernandes, H.; Neto, A.; Pereira, T.; Varandas, C.A.F. [Associacao Euratom/IST, Centro de Fusao Nuclear, Av. Rovisco Pais P-1049-001 Lisboa (Portugal)

    2008-04-15

    Requirements and specifications for Control Data Access and Communication (CODAC) systems in fusion reactors point towards flexible and modular solutions, independent from operating system and computer architecture. These concepts can also be applied to calculation and data analysis systems, where highly standardized solutions must also apply in order to anticipate long time-scales and high technology evolution changes. FireCalc is an analysis tool based on standard Extensible Markup Language (XML) technologies. Actions are described in an XML file, which contains necessary data specifications and the code or references to scripts. This is used by the user to send the analysis code and data to a server, which can be running either locally or remotely. Communications between the user and the server are performed through XML-RPC, an XML based remote procedure call, thus enabling the client and server to be coded in different computer languages. Access to the database, security procedures and calls to the code interpreter are handled through independent modules, which unbinds them from specific solutions. Currently there is an implementation of the FireCalc framework in Java, that uses the Shared Data Access System (SDAS) for accessing the ISTTOK database and the Scilab kernel for the numerical analysis.

  2. A novel framework of ERP implementation in Indian SMEs: Kernel principal component analysis and intuitionistic Fuzzy TOPSIS driven approach

    Directory of Open Access Journals (Sweden)

    Indranil Ghosh

    2016-04-01

    Over the years, organizations have witnessed a transformational change at the global market place. Integration of operations and partnership have become the key success factors for organizations. In order to achieve inclusive growth while operating in a dynamic uncertain environment, organizations irrespective of the scale of business need to stay connected across the entire value chain. The purpose of this paper is to analyze the Enterprise Resource Planning (ERP) implementation process for Small and Medium Enterprises (SMEs) in India to identify the key enablers. An exhaustive survey of existing literature, as part of the secondary research work, has been conducted to identify the critical success factors and usefulness of ERP implementation in different industrial sectors, and to examine the impact of those factors in Indian SMEs. Kernel Principal Component Analysis (KPCA) has been applied on the survey responses to recognize the key constructs related to Critical Success Factors (CSFs) and tangible benefits of ERP implementation. The Intuitionistic Fuzzy set theory based Technique of Order Preference by Similarity to Ideal Solution (TOPSIS) method is then used to rank the respective CSFs by mapping their contribution to the benefits realized through implementing ERP. Overall this work attempts to present a guideline for the ERP adoption process in the said sector utilizing the framework built upon KPCA and Intuitionistic Fuzzy TOPSIS. Findings of this work can act as guidelines for monitoring the entire ERP implementation project.
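The final ranking step can be illustrated with classical (crisp) TOPSIS — a deliberate simplification of the intuitionistic fuzzy variant the paper actually uses. All criteria are treated as benefit criteria in this sketch, and the decision matrix and weights in the usage are invented for illustration:

```python
import math

def topsis(matrix, weights):
    """Rank alternatives by closeness to the ideal solution (classical TOPSIS).

    matrix: rows = alternatives, columns = criteria (all benefit criteria here).
    weights: one weight per criterion.
    Returns a closeness score in [0, 1] per alternative (higher is better).
    """
    ncols = len(matrix[0])
    # Vector-normalize each column, then apply the criterion weights.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
    v = [[weights[j] * row[j] / norms[j] for j in range(ncols)] for row in matrix]
    best = [max(col) for col in zip(*v)]    # positive ideal solution
    worst = [min(col) for col in zip(*v)]   # negative ideal solution
    scores = []
    for row in v:
        d_plus = math.sqrt(sum((x - b) ** 2 for x, b in zip(row, best)))
        d_minus = math.sqrt(sum((x - w) ** 2 for x, w in zip(row, worst)))
        scores.append(d_minus / (d_plus + d_minus))
    return scores
```

The intuitionistic fuzzy version replaces the crisp matrix entries with membership/non-membership pairs and the Euclidean distances with fuzzy distance measures, but the ranking logic is the same.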

  3. MODELING AND IMPLEMENTATION OF A DISTRIBUTED SHOP FLOOR MANAGEMENT AND CONTROL SYSTEM

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Adopting a distributed control architecture is an important development direction for shop floor management and control systems, and is also a requirement for making them agile, intelligent and concurrent. Some key problems in achieving a distributed control architecture are researched. An activity model of the shop floor is presented as the requirement definition of the prototype system. The multi-agent based software architecture is constructed, and it is shown how the core part of a shop floor management and control system, production planning and scheduling, is achieved. The cooperation of different agents is illustrated. Finally, the implementation of the prototype system is narrated.

  4. Development of a framework towards successful implementation of e-governance initiatives in health sector in India.

    Science.gov (United States)

    Ray, Subhasis; Mukherjee, Amitava

    2007-01-01

    The purpose of this paper is to explore the route map for employing efficient e-governance so that at least existing resources and infrastructure are better utilized and deficiencies are tracked for future planning. National health is one of the most important factors in a country's economic growth. India seems to be a victim of the vicious cycle around poor economy and poor health conditions. A detailed study was carried out to find out India's healthcare infrastructure and its standing in e-governance initiatives. After consolidating the fact that effective e-governance can enhance the quality of healthcare service even within limited resources, the authors explored success and failure factors of many e-governance initiatives in India and abroad. Finally, an e-governance framework is suggested based on the above factors together with the authors' own experience of implementing e-governance projects in India and abroad. The suggested framework is based on a phased implementation approach. The first phase, "Information Dissemination", is more geared towards breaking the "digital divide" across three dimensions: G2Business; G2Citizen; and G2Agent. The most advanced stage is aimed towards joining up healthcare information across the above three dimensions and drawing meaningful analytics out of it. The recommendations also include management of Policies, Scope, Process Reform, Infrastructure, Technology, Finance, Partnership and People for efficient implementation of such e-governance initiatives. The paper provides measures for continuous evaluation of systems as one passes through various stages of implementation. However, the framework can be tested on a real or simulated environment to prove its worthiness. This paper can be a potential frame of reference for nation-wide e-healthcare projects not only in India but also in other developing countries. The paper also describes challenges that are most likely to be faced during implementation. Since the paper is practical in

  5. Toward the sustainability of health interventions implemented in sub-Saharan Africa: a systematic review and conceptual framework.

    Science.gov (United States)

    Iwelunmor, Juliet; Blackstone, Sarah; Veira, Dorice; Nwaozuru, Ucheoma; Airhihenbuwa, Collins; Munodawafa, Davison; Kalipeni, Ezekiel; Jutal, Antar; Shelley, Donna; Ogedegebe, Gbenga

    2016-03-23

    Sub-Saharan Africa (SSA) is facing a double burden of disease with a rising prevalence of non-communicable diseases (NCDs) while the burden of communicable diseases (CDs) remains high. Despite these challenges, there remains a significant need to understand how or under what conditions health interventions implemented in sub-Saharan Africa are sustained. The purpose of this study was to conduct a systematic review of empirical literature to explore how health interventions implemented in SSA are sustained. We searched MEDLINE, Biological Abstracts, CINAHL, Embase, PsycInfo, SCIELO, Web of Science, and Google Scholar for available research investigating the sustainability of health interventions implemented in sub-Saharan Africa. We also used narrative synthesis to examine factors, whether positive or negative, that may influence the sustainability of health interventions in the region. The search identified 1819 citations, and following removal of duplicates and our inclusion/exclusion criteria, only 41 papers were eligible for inclusion in the review. Twenty-six countries were represented in this review, with Kenya and Nigeria having the most representation of available studies examining sustainability. Study dates ranged from 1996 to 2015. Of note, the majority of these studies (30 %) were published in 2014. The most common framework utilized was the sustainability framework, which was discussed in four of the studies. Nineteen out of 41 studies (46 %) reported sustainability outcomes focused on communicable diseases, with HIV and AIDS represented in the majority of the studies, followed by malaria. Only 21 out of 41 studies had clear definitions of sustainability. Community ownership and mobilization were recognized by many of the reviewed studies as crucial facilitators for intervention sustainability, both early on and after intervention implementation, while social and ecological conditions as well as societal upheavals were barriers that influenced the sustainment

  6. Piloting a logic-based framework for understanding organisational change process for a health IT implementation.

    Science.gov (United States)

    Diment, Kieren; Garrety, Karin; Yu, Ping

    2011-01-01

    This paper describes how a method for evaluating organisational change based on the theory of logical types can be used for classifying organisational change processes to understand change after the implementation of an electronic documentation system in a residential aged care facility. In this instance we assess the organisational change reflected by care staff's perceptions of the benefits of the new documentation system at one site, at pre-implementation, and at 12 months post-implementation. The results show how a coherent view from the staff as a whole of the personal benefits, the benefits for others and the benefits for the organization create a situation of positive feedback leading to embeddedness of the documentation system into the site, and a broader appreciation of the potential capabilities of the electronic documentation system.

  7. ClimateSpark: An In-memory Distributed Computing Framework for Big Climate Data Analytics

    Science.gov (United States)

    Hu, F.; Yang, C. P.; Duffy, D.; Schnase, J. L.; Li, Z.

    2016-12-01

    Massive array-based climate data is being generated from global surveillance systems and model simulations. They are widely used to analyze environmental problems, such as climate change, natural hazards, and public health. However, extracting the underlying information from these big climate datasets is challenging due to both data- and computing-intensive issues in data processing and analysis. To tackle the challenges, this paper proposes ClimateSpark, an in-memory distributed computing framework to support big climate data processing. In ClimateSpark, a spatiotemporal index is developed to enable Apache Spark to treat array-based climate data (e.g. netCDF4, HDF4) as native formats, which are stored in the Hadoop Distributed File System (HDFS) without any preprocessing. Based on the index, spatiotemporal query services are provided to retrieve datasets according to a defined geospatial and temporal bounding box. The data subsets are read out, and a data partition strategy is applied to equally split the queried data across the computing nodes, storing them in memory as climateRDDs for processing. By leveraging Spark SQL and User Defined Functions (UDFs), climate data analysis operations can be conducted in intuitive SQL. ClimateSpark is evaluated by two use cases using the NASA Modern-Era Retrospective Analysis for Research and Applications (MERRA) climate reanalysis dataset. One use case is to conduct the spatiotemporal query and visualize the subset results in animation; the other one is to compare different climate model outputs using a Taylor-diagram service. Experimental results show that ClimateSpark can significantly accelerate data query and processing, and enable complex analysis services to be served in an SQL-style fashion.
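The core idea behind the spatiotemporal index — pruning the search to only those file chunks whose bounding boxes intersect the query box, so that data can be read selectively without preprocessing — can be sketched in plain Python. ClimateSpark itself builds this on Spark and HDFS; the index layout and names below are assumptions for illustration:

```python
def query_chunks(index, t0, t1, lat0, lat1, lon0, lon1):
    """Return the ids of chunks whose bounding boxes intersect the query box.

    index maps chunk_id -> (t_start, t_end, lat_min, lat_max, lon_min, lon_max).
    Two intervals [a0, a1] and [b0, b1] overlap iff a0 <= b1 and a1 >= b0;
    the test below applies that check on all three axes.
    """
    hits = []
    for chunk_id, (ts, te, la0, la1, lo0, lo1) in index.items():
        if (ts <= t1 and te >= t0 and
                la0 <= lat1 and la1 >= lat0 and
                lo0 <= lon1 and lo1 >= lon0):
            hits.append(chunk_id)
    return hits
```

In the real system each hit would become a partition read directly from HDFS into an in-memory climateRDD; chunks that fail the test are never touched, which is where the query speed-up comes from.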

  8. A distributed big data storage and data mining framework for solar-generated electricity quantity forecasting

    Science.gov (United States)

    Wang, Jianzong; Chen, Yanjun; Hua, Rui; Wang, Peng; Fu, Jia

    2012-02-01

    Photovoltaics is a method of generating electrical power by converting solar radiation into direct current electricity using semiconductors that exhibit the photovoltaic effect. Photovoltaic power generation employs solar panels composed of a number of solar cells containing a photovoltaic material. Due to the growing demand for renewable energy sources, the manufacturing of solar cells and photovoltaic arrays has advanced considerably in recent years. Solar photovoltaics is growing rapidly, albeit from a small base, to a total global capacity of 40,000 MW at the end of 2010. More than 100 countries use solar photovoltaics. Driven by advances in technology and increases in manufacturing scale and sophistication, the cost of photovoltaics has declined steadily since the first solar cells were manufactured. Net metering and financial incentives, such as preferential feed-in tariffs for solar-generated electricity, have supported solar photovoltaic installations in many countries. However, the power generated by solar photovoltaics is dramatically affected by the weather and other natural factors. Predicting photovoltaic energy accurately is important for intelligent power dispatch, in order to reduce energy dissipation and maintain the security of the power grid. In this paper, we propose a big data system, the Solar Photovoltaic Power Forecasting System (SPPFS), to calculate and predict the power according to real-time conditions. In this system, we utilize a distributed mixed database to speed up the collection, storage and analysis of meteorological data. To improve the accuracy of power prediction, a neural network algorithm has been incorporated into SPPFS. Through extensive experiments, we show that the framework can provide high forecast accuracy, with an error rate below 15%, and achieve low computing latency by deploying the mixed distributed database architecture for solar-generated electricity.
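The forecasting step can be illustrated with a deliberately tiny stand-in: a one-input linear model (the degenerate case of a neural network) fitted by stochastic gradient descent to map irradiance to PV output. The data, learning rate, and mapping are made-up placeholders, not the paper's actual network or measurements.

```python
# Toy stand-in for a PV forecasting model: fit power ≈ w * irradiance + b
# by per-sample gradient descent. All numbers here are illustrative.

def train(samples, lr=0.1, epochs=500):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in samples:
            err = (w * x + b) - y   # prediction error for this sample
            w -= lr * err * x       # gradient step on the weight
            b -= lr * err           # gradient step on the bias
    return w, b

# Synthetic noiseless "measurements": power = 0.8 * irradiance.
data = [(x / 10, 0.8 * x / 10) for x in range(1, 11)]
w, b = train(data)
forecast = w * 0.5 + b   # predicted output at irradiance 0.5
```

A real SPPFS-style model would take many weather features and a multi-layer network, but the train/predict loop has the same shape.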

  9. Report on the OECD's base erosion and profit shifting: origin and implementation within an international and global framework

    Directory of Open Access Journals (Sweden)

    Fernando Serrano Antón

    2014-07-01

    Full Text Available This work analyzes the circumstances leading to the OECD's report on base erosion and profit shifting. Inconsistency of tax systems and unilateralism within the current framework of economic globalization may have led to asymmetric tax situations, mostly exploited by multinational companies. The means and tools used and proposed by several international institutions to implement legally binding actions through soft law, and their acceptance by different countries as a method in the fight against tax avoidance and fraud, are also discussed.

  10. A Framework for Evaluating Implementation of Community College Workforce Education Partnerships and Programs

    Science.gov (United States)

    Yarnall, Louise; Tennant, Elizabeth; Stites, Regie

    2016-01-01

    Greater investments in community college workforce education are fostering large-scale partnerships between employers and educators. However, the evaluation work in this area has focused on outcome and productivity metrics, rather than addressing measures of implementation quality, which is critical to scaling any innovation. To deepen…

  11. A Framework for the Assessment of the Global Potential of Joint Implementation

    NARCIS (Netherlands)

    Bollen JC; Minnen JG van; Toet AMC; Bennis M; Kuik OJ; MTV; VUA/IVM

    1995-01-01

    Joint Implementation (JI) is an instrument for reducing global CO2 emissions at lower cost. The global potential for JI is defined as the fraction of the required reduction in regional CO2 emissions needed to reach the emission target in a given horizon year,

  12. A Framework for Institutional Adoption and Implementation of Blended Learning in Higher Education

    Science.gov (United States)

    Graham, Charles R.; Woodfield, Wendy; Harrison, J. Buckley

    2013-01-01

    There has been rapid growth in blended learning implementation and research focused on course-level issues such as improved learning outcomes, but very limited research focused on institutional policy and adoption issues. More institutional-level blended learning research is needed to guide institutions of higher education in strategically…

  13. Examining the Quality of Technology Implementation in STEM Classrooms: Demonstration of an Evaluative Framework

    Science.gov (United States)

    Parker, Caroline E.; Stylinski, Cathlyn D.; Bonney, Christina R.; Schillaci, Rebecca; McAuliffe, Carla

    2015-01-01

    Technology applications aligned with science, technology, engineering, and math (STEM) workplace practices can engage students in real-world pursuits but also present dramatic challenges for classroom implementation. We examined the impact of teacher professional development focused on incorporating these workplace technologies in the classroom.…

  14. Implementing a Quality Management Framework in a Higher Education Organisation: A Case Study

    Science.gov (United States)

    O'Mahony, Kim; Garavan, Thomas N.

    2012-01-01

    Purpose: This paper aims to report and analyse the lessons learned from a case study on the implementation of a quality management system within an IT Division in a higher education (HE) organisation. Design/methodology/approach: The paper is based on a review of the relevant literatures and the use of primary sources such as document analysis,…

  15. Implementations of FroboMind using the Robot Operating System framework

    DEFF Research Database (Denmark)

    Nielsen, Søren Hundevadt; Bøgild, Anders; Jensen, Kjeld

    Conclusion The work provides a highly domain specific architecture in form of the field robotic vehicle conceptual architecture FroboMind (Jensen et al. 2011). This architecture is currently, as a work in progress, being implemented in ROS to evaluate how well FroboMind maps into ROS. A prominent...

  16. PBL as a Framework for Implementing Video Games in the Classroom

    Science.gov (United States)

    Watson, William R.; Fang, Jun

    2012-01-01

    Video games and problem-based learning (PBL) are both significant trends in progressive approaches to education. The literature demonstrates a fit between the two approaches, indicating they may be mutually beneficial. With limited literature on implementing games in the classroom, and a growing body of researchers highlighting the importance of…

  17. Defining the challenges for ecodesign implementation in companies: development and consolidation of a framework

    DEFF Research Database (Denmark)

    Dekoninck, Elies A.; Domingo, Lucie; O'Hare, Jamie Alexander

    2016-01-01

    This study addresses the problem of the slow take-up of ecodesign in industry by identifying and categorising the implementation challenges faced by practitioners. Case studies from nine manufacturing companies from five different countries are reported based on interviews with key ecodesign pers...

  18. A conceptual framework for outsourcing of materials handling activities in automotive : differentiation and implementation

    NARCIS (Netherlands)

    Klingenberg, W.; Boksma, J. D.

    2010-01-01

    This article discusses the outsourcing of materials handling activities and investigates different options for its implementation. The article uses descriptive case studies found in literature from the Western European automotive industry to map out differences in current practice and to evaluate

  19. Implementing Competency-Based Education: Challenges, Strategies, and a Decision-Making Framework

    Science.gov (United States)

    Dragoo, Amie; Barrows, Richard

    2016-01-01

    The number of competency-based education (CBE) degree programs has increased rapidly over the past five years, yet there is little research on CBE program development. This study utilized conceptual models of higher education change and a qualitative methodology to analyze the strategies and challenges in implementing CBE business degree programs…

  20. Valuation-Based Framework for Considering Distributed Generation Photovoltaic Tariff Design: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Zinaman, O. R.; Darghouth, N. R.

    2015-02-01

    While an export tariff is only one element of a larger regulatory framework for distributed generation, we choose to focus on tariff design because of the significant impact this program design component has on the various flows of value among power sector stakeholders. In that context, this paper is organized as a series of steps that can be taken during the design of a DGPV export tariff. To that end, this paper outlines a holistic, high-level approach to the complex undertaking of DGPV tariff design, the crux of which is an iterative cost-benefit analysis process. We propose a multi-step progression that aims to promote transparent, focused, and informed dialogue on CBA study methodologies and assumptions. When studies are completed, the long-run marginal avoided cost of the DGPV program should be compared against the costs imposed on utilities and non-participating customers, recognizing that these can be defined differently depending on program objectives. The results of this comparison can then be weighed against other program objectives to formulate tariff options. Potential changes to tariff structures can be iteratively fed back into established analytical tools to inform further discussions.
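The core comparison in that iterative CBA loop, avoided costs versus costs imposed on utilities and non-participating customers, can be sketched in a few lines. The cost categories and dollar figures below are hypothetical placeholders, not values from the paper.

```python
# Illustrative net-benefit comparison for a DGPV tariff CBA iteration.
# Categories and $/MWh figures are made up for the example.

def net_benefit(avoided_costs, imposed_costs):
    """Long-run avoided costs minus costs borne by the utility and
    non-participating customers, both expressed in $/MWh of DGPV output."""
    return sum(avoided_costs.values()) - sum(imposed_costs.values())

avoided = {"energy": 38.0, "capacity": 12.0, "losses": 3.0}   # $/MWh
imposed = {"integration": 6.0, "lost_revenue": 41.0}          # $/MWh

nb = net_benefit(avoided, imposed)   # 53.0 - 47.0 = 6.0 $/MWh
```

In practice each iteration would revise the category definitions and re-run the comparison before weighing the result against other program objectives.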

  1. A Distributed OpenCL Framework using Redundant Computation and Data Replication

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Junghyun [Seoul National University, Korea; Gangwon, Jo [Seoul National University, Korea; Jaehoon, Jung [Seoul National University, Korea; Lee, Jaejin [Seoul National University, Korea

    2016-01-01

    Applications written solely in OpenCL or CUDA cannot execute on a cluster as a whole. Most previous approaches that extend these programming models to clusters are based on a common idea: designating a centralized host node and coordinating the other nodes with the host for computation. However, the centralized host node is a serious performance bottleneck when the number of nodes is large. In this paper, we propose a scalable and distributed OpenCL framework called SnuCL-D for large-scale clusters. SnuCL-D's remote device virtualization provides an OpenCL application with an illusion that all compute devices in a cluster are confined in a single node. To reduce the amount of control-message and data communication between nodes, SnuCL-D replicates the OpenCL host program execution and data in each node. We also propose a new OpenCL host API function and a queueing optimization technique that significantly reduce the overhead incurred by the previous centralized approaches. To show the effectiveness of SnuCL-D, we evaluate SnuCL-D with a microbenchmark and eleven benchmark applications on a large-scale CPU cluster and a medium-scale GPU cluster.
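The replicated-execution idea can be mimicked in miniature: every node runs the same command stream, but each node executes only the commands whose target device it owns, so no central host has to ship commands over the network. The node/device layout and command list below are hypothetical, and this sketch abstracts away OpenCL entirely.

```python
# Minimal sketch of replicated host-program execution: each node filters the
# shared command stream down to its locally owned devices.

def run_node(node_id, device_map, commands):
    executed = []
    for cmd, device in commands:
        if device_map[device] == node_id:   # only local devices execute
            executed.append(cmd)
    return executed

device_map = {"gpu0": 0, "gpu1": 0, "gpu2": 1}   # device -> owning node
commands = [("kernelA", "gpu0"), ("kernelB", "gpu2"), ("kernelC", "gpu1")]

per_node = [run_node(n, device_map, commands) for n in (0, 1)]
# node 0 runs kernelA and kernelC; node 1 runs kernelB
```

The point of the design is that the filtering decision is local and deterministic, which is what removes the centralized host bottleneck.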

  2. Implementation of the EU-policy framework WFD and GWD in Europe - Activities of CIS Working Group Groundwater

    Science.gov (United States)

    Grath, Johannes; Ward, Rob; Hall, Anna

    2013-04-01

    At the European level, the basic elements for groundwater management and protection are laid down in the Water Framework Directive (WFD) (2000/60/EC) and the Groundwater Daughter Directive (2006/118/EC). EU Member States, Norway and the European Commission (EC) have jointly developed a common strategy for supporting the implementation of the WFD. The main aim of this Common Implementation Strategy (CIS) is to ensure the coherent and harmonious implementation of the directives through the clarification of a number of methodological questions, enabling a common understanding to be reached on the technical and scientific implications of the WFD (European Communities, 2008). Groundwater-specific issues are dealt with in Working Group C Groundwater. Members of the working group are experts nominated by Member States, Norway, Switzerland and Accession Countries (from administrative bodies, research institutes, …) and representatives of relevant stakeholders and NGOs. Working Group C Groundwater has produced numerous guidance documents and technical reports that have been endorsed by EU Water Directors to support and enable Member States to implement the directives. All the documents are published by the EC. Access is available via the following link: http://ec.europa.eu/environment/water/water-framework/groundwater/activities.htm Having addressed implementation issues during the 1st river basin planning cycle, WG C Groundwater is currently focussing on the following issues: groundwater-dependent ecosystems, and climate change and groundwater. In the future, the outcome and recommendations of the "Blueprint" - to safeguard Europe's water resources - which was recently published by the EC will be of utmost importance in setting the agenda for the group. Most likely this will include water pricing, water demand management and water abstraction. Complementary to the particular working groups, a Science Policy Interface (SPI) activity has been established. Its purpose is

  3. Studying the implementation of the Water Framework Directive in Europe: a meta-analysis of 89 journal articles

    Directory of Open Access Journals (Sweden)

    Blandine Boeuf

    2016-06-01

    Full Text Available The Water Framework Directive (WFD is arguably the most ambitious piece of European Union (EU legislation in the field of water. The directive defines a general framework for integrated river basin management in Europe with a view to achieving "good water status" by 2015. Institutional novelties include, among others, water management at hydrological scales, the involvement of nonstate actors in water planning, and various economic principles, as well as a common strategy to support EU member states during the implementation of the directive. More than 15 years after the adoption of the WFD, and with the passing of an important milestone, 2015, we believe it is time for an interim assessment. This article provides a systematic review of existing scholarship on WFD implementation. We identify well-documented areas of research, describe largely uncharted territories, and suggest avenues for future studies. Methodologically, we relied on a meta-analysis. Based on a codebook of more than 35 items, we analyzed 89 journal articles reporting on the implementation of the directive in EU member states. Our review is organized around three major themes. The first is "who, when, and where"; we explore publication patterns, thereby looking into authors, timelines, and target journals. The second is "what"; we analyze the object of study in our source articles with a particular focus on case study countries, policy levels, the temporal stage of WFD implementation, and if the directive was not studied in its entirety, the aspect of the WFD that received scholarly attention. The third is "how," i.e., theoretical and methodological choices made when studying the WFD.
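The mechanics of a codebook-driven meta-analysis reduce to coding each article on a fixed set of items and then tallying the codes. A hedged sketch, with invented codebook items and article codings, not the authors' actual 35-item codebook:

```python
from collections import Counter

# Hypothetical coded articles: each record carries values for two codebook
# items ("country" studied and "stage" of WFD implementation examined).

articles = [
    {"country": "Germany", "stage": "planning"},
    {"country": "Spain", "stage": "implementation"},
    {"country": "Germany", "stage": "implementation"},
]

by_country = Counter(a["country"] for a in articles)
by_stage = Counter(a["stage"] for a in articles)
# by_country tallies case-study countries; by_stage tallies temporal stages
```

Scaled to 89 articles and 35+ items, the same aggregation yields the publication-pattern and object-of-study tables a review like this reports.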

  4. Implementing a modular framework in a conditions database explorer for ATLAS

    Energy Technology Data Exchange (ETDEWEB)

    Simoes, J; Amorim, A; Batista, J; Lopes, L; Neves, R; Pereira, P [SIM and FCUL, University of Lisbon, Campo Grande, P-1749-016 Lisbon (Portugal); Kolos, S [University of California, Irvine, California 92697-4575 (United States); Soloviev, I [Petersburg Nuclear Physics Institute, Gatchina, St-Petersburg RU-188350 (Russian Federation)], E-mail: jalmeida@mail.cern.ch, E-mail: Antonio.Amorim@sim.fc.ul.pt

    2008-07-15

    The ATLAS conditions databases will be used to manage information of quite diverse nature and level of complexity. The usage of a relational database manager like Oracle, together with the object managers POOL and OKS developed in-house, poses special difficulties in browsing the available data while understanding its structure in a general way. This is particularly relevant for the database browser projects where it is difficult to link with the class defining libraries generated by general frameworks such as Athena. A modular approach to tackle these problems is presented here. The database infrastructure is under development using the LCG COOL infrastructure, and provides a powerful information sharing gateway upon many different systems. The nature of the stored information ranges from temporal series of simple values up to very complex objects describing the configuration of systems like ATLAS' TDAQ infrastructure, including also associations to large objects managed outside of the database infrastructure. An important example of this architecture is the Online Objects Extended Database BrowsEr (NODE), which is designed to access and display all data, available in the ATLAS Monitoring Data Archive (MDA), including histograms and data tables. To deal with the special nature of the monitoring objects, a plugin from the MDA framework to the Time managed science Instrument Databases (TIDB2) is used. The database browser is extended, in particular to include operations on histograms such as display, overlap, comparisons as well as commenting and local storage.

  5. Pattern-Based Development of Enterprise Systems: from Conceptual Framework to Series of Implementations

    Directory of Open Access Journals (Sweden)

    Sergey V. Zykov

    2013-04-01

    Full Text Available Building enterprise software is a dramatic challenge due to data size, complexity, and the rapid growth of both over time. The issue becomes even more dramatic when it comes to integrating heterogeneous applications. Therewith, a uniform approach is required, which combines formal models and CASE tools. The methodology is based on extracting common ERP module-level patterns and applying them to a series of heterogeneous implementations. The approach includes a lifecycle model, which extends the conventional spiral model with formal data representation/management models and DSL-based "low-level" CASE tools supporting the formalisms. The methodology has been successfully implemented as a series of portal-based ERP systems in the ITERA oil-and-gas corporation, and in a number of trading/banking enterprise applications for other enterprises. A semantic network-based airline dispatch system and a 6D-model-driven nuclear power plant construction support system are currently in progress.

  6. Governance Strengths and Weaknesses to implement the Marine Strategy Framework Directive in European Waters

    DEFF Research Database (Denmark)

    Freire-Gibb, L. Carlos; Koss, Rebecca; Piotr, Margonski

    2014-01-01

    This paper addresses the Strengths, Weaknesses, Opportunities and Threats (SWOT) of the current European marine governance structures and their relationship to implementing the MSFD. Results of the SWOT analysis were acquired through a combination of approaches with MSFD experts and stakeholders, including 30 face-to-face interviews, an online survey with 264 stakeholder respondents, and focus groups within each European marine region. The SWOT analysis concurrently identifies common strengths and weaknesses and key governance issues for implementing the MSFD across European marine regions. This paper forms one assessment within the governance component of the Options for Delivering Ecosystem Based Marine Management (ODEMM) project and presents timely issues that can be of benefit to national and European Union policy makers.

  7. Implementation of the Master Plan Activities in Serayu River Voyage (SRV Within the Framework of Tourism Development in Banyumas Regency

    Directory of Open Access Journals (Sweden)

    Imam Pamungkas

    2015-02-01

    Full Text Available The Master Plan Activity of Serayu River Voyage (SRV) for tourism development in Banyumas Regency was expected to be completed within five years, from 2008 to 2012, but as of 2013 most programs and activities had not been implemented. The results show that the Master Plan of SRV in the framework of tourism development in Banyumas Regency has not been implemented properly. The causes are the absence of good coordination between agencies, the lack of integration of programs and activities, supporting documents that have not been revised, the absence of good socialization, and the lack of private-sector contribution. The factors that constrain and support implementation of the Master Plan are as follows. Supporting factors: competent human resources (implementers) are already available at the managerial level and have tourism expertise, with additional personnel needed only in the culture sector; the availability of an adequate budget; institutions that are effective and efficient; high community response; the high commitment of the Banyumas Regent and the cooperation of related parties (stakeholders); and the natural conditions of the Serayu, which tends to be calm, with a small river slope. Constraining factors: regulatory policies; the integration of programs and activities; and coordination and socialization, which revealed sectoral egos that need to be addressed. Keywords: implementation, master plan, Serayu River Voyage, human resources, regulation

  8. Using the ecological framework to identify barriers and enablers to implementing Namaste Care in Canada's long-term care system.

    Science.gov (United States)

    Hunter, Paulette V; Kaasalainen, Sharon; Froggatt, Katherine A; Ploeg, Jenny; Dolovich, Lisa; Simard, Joyce; Salsali, Mahvash

    2017-10-01

    Higher acuity of care at the time of admission to long-term care (LTC) is resulting in a shorter period to time of death, yet most LTC homes in Canada do not have formalized approaches to palliative care. Namaste Care is a palliative care approach specifically tailored to persons with advanced cognitive impairment who are living in LTC. The purpose of this study was to employ the ecological framework to identify barriers and enablers to an implementation of Namaste Care. Six group interviews were conducted with families, unlicensed staff, and licensed staff at two Canadian LTC homes that were planning to implement Namaste Care. None of the interviewees had prior experience implementing Namaste Care. The resulting qualitative data were analyzed using a template organizing approach. We found that the strongest implementation enablers were positive perceptions of need for the program, benefits of the program, and fit within a resident-centred or palliative approach to care. Barriers included a generally low resource base for LTC, the need to adjust highly developed routines to accommodate the program, and reliance on a casual work force. We conclude that within the Canadian LTC system, positive perceptions of Namaste Care are tempered by concerns about organizational capacity to support new programming.

  9. Framework for Instructional Technology: Methods of Implementing Adaptive Training and Education

    Science.gov (United States)

    2014-01-01

    whether the instructional environment actually has the time and resources to implement an adaptive strategy. For this reason, in this paper we...referred to as fading. Just as a person healing from a broken leg may go from crutches to a cane to no assistance, ultimately, the learner should be...Chicago. Retrieved from http://www.fossati.us/papers/ilist-phdthesis.pdf Graesser, A. C., Jeon, M. & Dufty, D. (2008). Agent technologies designed

  10. How lessons learnt informed the development of an implementation framework in an ICT4D initiative

    CSIR Research Space (South Africa)

    Botha, Adèle

    2015-05-01

    Full Text Available problem and phenomena central to the investigation in order to meet the purpose of the research [16; 17]. Snowball sampling identifies research participants through a chain reaction as a result of word of mouth. Researchers find one person who comes... Marketing strategy, Social Media Strategy, Knowledge Management Monitoring & Evolution Learners, Teachers, Schools Evidence-Based Policy Academic Research, Implementation guidelines, Policy guidelines Community Engagement Learners & Parents...

  11. Implementing a Mentally Healthy Schools Framework Based on the Population Wide Act-Belong-Commit Mental Health Promotion Campaign: A Process Evaluation

    Science.gov (United States)

    Anwar-McHenry, Julia; Donovan, Robert John; Nicholas, Amberlee; Kerrigan, Simone; Francas, Stephanie; Phan, Tina

    2016-01-01

    Purpose: Mentally Healthy WA developed and implemented the Mentally Healthy Schools Framework in 2010 in response to demand from schools wanting to promote the community-based Act-Belong-Commit mental health promotion message within a school setting. Schools are an important setting for mental health promotion, therefore, the Framework encourages…

  12. POLARIS: Agent-based modeling framework development and implementation for integrated travel demand and network and operations simulations

    Energy Technology Data Exchange (ETDEWEB)

    Auld, Joshua; Hope, Michael; Ley, Hubert; Sokolov, Vadim; Xu, Bo; Zhang, Kuilin

    2016-03-01

    This paper discusses the development of an agent-based modeling software development kit, and the implementation and validation of a model built with it that integrates dynamic simulation of travel demand, network supply and network operations. A description is given of the core utilities in the kit: a parallel discrete event engine, an interprocess exchange engine, and a memory allocator, as well as a number of ancillary utilities: a visualization library, a database IO library, and a scenario manager. The overall framework emphasizes the design goals of generality, code agility, and high performance. This framework allows several aspects of the transportation system that are typically handled by separate stand-alone software applications to be modeled in a high-performance and extensible manner. The integration of such models as dynamic traffic assignment and disaggregate demand models has been a long-standing issue for transportation modelers. The integrated approach shows a possible way to resolve this difficulty. The simulation model built from the POLARIS framework is a single, shared-memory process handling all aspects of the integrated urban simulation. The resulting gains in computational efficiency and performance allow planning models to be extended to include previously separate aspects of the urban system, enhancing the utility of such models from the planning perspective. Initial tests with case studies involving traffic management center impacts on various network events, such as accidents, congestion and weather events, show the potential of the system.
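A discrete event engine like the core utility named above can be sketched in a few lines with a priority queue: events are popped in time order, and an event's handler may schedule further events. This single-threaded sketch is illustrative only (POLARIS's engine is parallel), and the event names and times are invented.

```python
import heapq

# Bare-bones discrete-event engine: a time-ordered priority queue of events,
# where handlers may schedule follow-on events.

class EventEngine:
    def __init__(self):
        self._queue = []
        self.log = []

    def schedule(self, time, name, action=None):
        heapq.heappush(self._queue, (time, name, action))

    def run(self):
        while self._queue:
            time, name, action = heapq.heappop(self._queue)
            self.log.append((time, name))
            if action:
                action(self, time)   # handler can schedule more events

engine = EventEngine()
engine.schedule(5.0, "vehicle_departs")
engine.schedule(1.0, "signal_green",
                lambda eng, t: eng.schedule(t + 2.0, "signal_red"))
engine.run()
# fires in time order: signal_green (1.0), signal_red (3.0), vehicle_departs (5.0)
```

The design choice worth noting is that demand, supply, and operations models can all share one such event loop, which is what makes the single-process integration described above possible.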

  13. Development and implementation of the IPCS conceptual framework for evaluating mode of action of chemical carcinogens

    International Nuclear Information System (INIS)

    Dybing, E.

    2002-01-01

    The framework developed by the International Programme on Chemical Safety (IPCS) for assessing the mode of action of tumour induction of chemicals in experimental animals has been illustrated with d-limonene, sodium saccharin, di(2-ethylhexyl)phthalate (DEHP) and sulfamethazine as examples. d-Limonene causes renal tumours only in male rats through a response associated with α 2u -globulin. Sodium saccharin induces urinary bladder tumours only in male rats through formation of a urinary precipitate causing erosion of the bladder surface and extensive regenerative hyperplasia. DEHP causes liver tumours in rats and mice through activation of the receptor PPARα leading to peroxisome proliferation and hepatocellular proliferation. Sulfamethazine induces thyroid follicular cell tumours in rats and mice through a mechanism involving altered thyroid hormone homeostasis

  14. Implementing an overdose education and naloxone distribution program in a health system.

    Science.gov (United States)

    Devries, Jennifer; Rafie, Sally; Polston, Gregory

    To design and implement a health system-wide program increasing provision of take-home naloxone in patients at risk for opioid overdose, with the downstream aim of reducing fatalities. The program includes health care professional education and guidelines, development, and dissemination of patient education materials, electronic health record changes to promote naloxone prescriptions, and availability of naloxone in pharmacies. Academic health system, San Diego, California. University of California, San Diego Health (UCSDH), offers both inpatient and outpatient primary care and specialty services with 563 beds spanning 2 hospitals and 6 pharmacies. UCSDH is part of the University of California health system, and it serves as the county's safety net hospital. In January 2016, a multisite academic health system initiated a system-wide overdose education and naloxone distribution program to prevent opioid overdose and opioid overdose-related deaths. An interdisciplinary, interdepartmental team came together to develop and implement the program. To strengthen institutional support, naloxone prescribing guidelines were developed and approved for the health system. Education on naloxone for physicians, pharmacists, and nurses was provided through departmental trainings, bulletins, and e-mail notifications. Alerts in the electronic health record and preset naloxone orders facilitated co-prescribing of naloxone with opioid prescriptions. Electronic health record reports captured naloxone prescriptions ordered. Summary reports on the electronic health record measured naloxone reminder alerts and response rates. Since the start of the program, the health system has trained 252 physicians, pharmacists, and nurses in overdose education and take-home naloxone. There has been an increase in the number of prescriptions for naloxone from a baseline of 4.5 per month to an average of 46 per month during the 3 months following full implementation of the program including

  15. Implementation of Pilot Protection System for Large Scale Distribution System like The Future Renewable Electric Energy Distribution Management Project

    Science.gov (United States)

    Iigaya, Kiyohito

    A robust, fast and accurate protection system based on the pilot protection concept was developed previously; a few alterations were made to that algorithm to make it faster and more reliable, and it was then applied to smart distribution grids to verify the results. The new 10-sample window method was adapted into the pilot protection program and its performance for the test bed system operation was tabulated. Following that, the hardware results for the same algorithm were compared with the simulation results. The development of the dual-slope percentage differential method, its comparison with the 10-sample average window pilot protection system, and the effects of CT saturation on the pilot protection system are also shown in this thesis. The 10-sample average window pilot protection system is implemented on multiple distribution grids, including Green Hub v4.3, IEEE 34, the LSSS loop and a modified LSSS loop. Case studies of these multi-terminal models are presented, and the results are also shown in this thesis. The results obtained show that the new algorithm for the previously proposed protection system successfully identifies faults on the test bed, that the hardware and software simulation results match, and that the response time is less than approximately a quarter of a cycle, which is fast compared to present commercial protection systems and satisfies the FREEDM system requirement.
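The dual-slope percentage differential criterion mentioned above can be sketched from its textbook definition: trip when the differential current exceeds a restraint characteristic with a low slope for light load and a steeper slope above a knee point (to ride through CT saturation errors at high through-current). The pickup, slopes, and knee settings below are illustrative assumptions, not the thesis's actual settings.

```python
# Hedged sketch of a dual-slope percentage differential trip criterion
# (per-unit quantities). Settings are illustrative placeholders.

def should_trip(i_diff, i_rest, pickup=0.2, slope1=0.3, slope2=0.7, knee=2.0):
    """Trip when differential current exceeds the dual-slope restraint curve."""
    if i_rest <= knee:
        threshold = max(pickup, slope1 * i_rest)        # low-slope region
    else:
        threshold = slope1 * knee + slope2 * (i_rest - knee)  # high-slope region
    return i_diff > threshold

# Internal fault: large differential current at modest restraint -> trip.
# Heavy through-load with CT mismatch: small i_diff, large i_rest -> block.
```

The steeper second slope is the feature that desensitizes the relay exactly where CT saturation makes spurious differential current most likely.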

  16. Implementation of an evolutionary algorithm in planning investment in a power distribution system

    Directory of Open Access Journals (Sweden)

    Carlos Andrés García Montoya

    2011-06-01

    Full Text Available The definition of an investment plan to implement in a distribution power system is a task constantly faced by utilities. This work presents a methodology for determining the investment plan for a distribution power system over the short term, using as criteria for evaluating investment projects their associated costs and the benefit to customers from their implementation. Given the number of projects carried out annually on the system, the definition of an investment plan requires the use of computational tools to evaluate, among a set of possibilities, the one that best suits the present needs of the system and yields better results. For this reason, this work implements a multi-objective evolutionary algorithm, SPEA (Strength Pareto Evolutionary Algorithm), which, based on the principles of Pareto optimality, delivers to the planning expert the best solutions found in the optimization process. The performance of the algorithm is tested using a set of projects to determine the best among the possible plans. We also analyze the effect of operators on the performance of the evolutionary algorithm and the results.

  17. From Tobacco to Obesity Prevention Policies: A Framework for Implementing Community-Driven Policy Change.

    Science.gov (United States)

    Walter, Lauren; Dumke, Kelly; Oliva, Ariana; Caesar, Emily; Phillips, Zoë; Lehman, Nathan; Aragon, Linda; Simon, Paul; Kuo, Tony

    2018-04-01

    Efforts to reverse the obesity epidemic require policy, systems, and environmental (PSE) change strategies. Despite the availability of evidence-based and other promising PSE interventions, limited evidence exists on the "how-to" of transitioning them into practice. For the past 13 years, the Los Angeles County Department of Public Health has been building capacity among community residents and other stakeholders to create effective community coalitions and to implement well-designed policy strategy campaigns using an evidence-based approach to policy change, the policy adoption model (PAM). Implementing a phase-based approach to policy change, the PAM was initially used to support the passage of over 140 tobacco control and prevention policies in Los Angeles County. Following these successes, Los Angeles County Department of Public Health applied the PAM to obesity prevention, operationalizing the policy process by training community residents and other stakeholders on the use of the model. The PAM has shown to be helpful in promoting PSE change in tobacco control and obesity prevention, suggesting a local-level model potentially applicable to other fields of public health seeking sustainable, community-driven policy change.

  18. A lightweight distributed framework for computational offloading in mobile cloud computing.

    Directory of Open Access Journals (Sweden)

    Muhammad Shiraz

    Full Text Available The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading with the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC.
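
The record does not publish the framework's decision logic, but any offloading framework ultimately weighs local execution against transfer-plus-cloud execution. A minimal, hypothetical sketch of such a decision rule follows; every parameter name and figure is our assumption, not the authors' design.

```python
# Toy offloading decision: offload a component only when the estimated
# remote turnaround (state transfer + cloud execution) beats local execution.
# All inputs are illustrative estimates an offloading framework might hold.

def should_offload(cycles, data_bytes, local_ips, cloud_ips, bandwidth_bps):
    """cycles: component workload; data_bytes: state to ship to the cloud;
    *_ips: instructions/second locally and in the cloud."""
    local_time = cycles / local_ips
    remote_time = data_bytes * 8 / bandwidth_bps + cycles / cloud_ips
    return remote_time < local_time

# Compute-heavy component (5e9 cycles, 2 MB state) over a 10 Mbps link:
heavy = should_offload(5e9, 2e6, 1e9, 20e9, 10e6)      # True: cloud wins
# Data-heavy, compute-light component (50 MB state, 1e8 cycles):
chatty = should_offload(1e8, 5e7, 1e9, 20e9, 10e6)     # False: stay local
```

The same comparison extends naturally to energy cost by swapping time estimates for per-joule estimates.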

  19. A Lightweight Distributed Framework for Computational Offloading in Mobile Cloud Computing

    Science.gov (United States)

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds for mitigating resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time consuming and resource intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading with the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC. PMID:25127245

  20. Models of Coupled Settlement and Habitat Networks for Biodiversity Conservation: Conceptual Framework, Implementation and Potential Applications

    Directory of Open Access Journals (Sweden)

    Maarten J. van Strien

    2018-04-01

    Full Text Available Worldwide, the expansion of settlement and transport infrastructure is one of the most important proximate as well as ultimate causes of biodiversity loss. Just as every modern human society depends on a network of settlements that is well connected by transport infrastructure (i.e., a settlement network), animal and plant species depend on networks of habitats between which they can move (i.e., habitat networks). However, changes to a settlement network in a region often threaten the integrity of the region's habitat networks. Determining plans and policy to prevent these threats is made difficult by the numerous interactions and feedbacks that exist between and within the settlement and habitat networks. Mathematical models of coupled settlement and habitat networks can help us understand the dynamics of this social-ecological system. Yet, few attempts have been made to develop such mathematical models. In this paper, we promote the development of models of coupled settlement and habitat networks for biodiversity conservation. First, we present a conceptual framework of key variables that are ideally considered when operationalizing the coupling of settlement and habitat networks. In this framework, we first describe important network-internal interactions by differentiating between the structural properties of either network (i.e., relating to purely physical conditions determining the suitability of a location for living or movement) and the functional properties (i.e., relating to the actual presence, abundance or movement of people or other organisms). We then describe the main one-way influences that a settlement network can exert on the habitat networks and vice versa. Second, we give several recommendations for the mathematical modeling of coupled settlement and habitat networks and present several existing modeling approaches (e.g., habitat network models and land-use transport interaction models) that could be used for this purpose. Lastly, we elaborate
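
One of the one-way influences the framework describes (settlement growth degrading habitat connectivity) can be sketched with two plain graphs; the networks, node names, and the connectivity metric below are our invented toy, not the paper's model.

```python
# Toy coupling: expanding a settlement removes the overlapping habitat patch,
# and we measure the effect on habitat connectivity via the size of the
# largest connected component (a common proxy in habitat-network models).

def largest_component(adj):
    """Size of the largest connected component of an undirected graph,
    given as {node: [neighbours]}."""
    seen, best = set(), 0
    for start in adj:
        if start in seen:
            continue
        stack, size = [start], 0
        while stack:
            node = stack.pop()
            if node in seen:
                continue
            seen.add(node)
            size += 1
            stack.extend(adj[node])
        best = max(best, size)
    return best

def remove_patch(adj, node):
    """Habitat network after a settlement paves over `node`."""
    return {k: [v for v in vs if v != node]
            for k, vs in adj.items() if k != node}

# Hypothetical habitat chain A-B-C-D; patch "B" overlaps a growing settlement.
habitat = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C"]}
before = largest_component(habitat)                      # 4: fully connected
after = largest_component(remove_patch(habitat, "B"))    # 2: A is isolated
```

A coupled model would iterate steps like this against a settlement-network update rule, which is where the feedbacks the abstract mentions would enter.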

  1. A tale of two directories Implementing distributed shared objects in Java

    CERN Document Server

    Herlihy, M

    1999-01-01

    A directory service keeps track of the location and status of mobile objects in a distributed system. This paper describes our experience implementing two distributed directory protocols as part of the Aleph toolkit, a distributed shared object system implemented in Java. One protocol is a conventional home-based protocol, in which a fixed node keeps track of the object's location and status. The other is a novel arrow protocol, based on a simple path-reversal algorithm. We were surprised to discover that the arrow protocol outperformed the home protocol, sometimes substantially, across a range of system sizes. This paper describes a series of experiments testing whether the discrepancy is due to an artifact of the Java run-time system (such as differences in thread management or object serialization costs), or whether it is something inherent in the protocols themselves. In the end, we use insights gained from these experimental results to design a new directory protocol that usually outperforms both. (29 re...
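
The arrow protocol is described as a simple path-reversal algorithm; the toy reconstruction below (ours, not the Aleph code) shows the rule: an acquire request follows arrows toward the object's owner, flipping each traversed arrow back toward the requester, so arrows always point toward the current owner.

```python
# Minimal path-reversal directory, sketched from the protocol's description.
# Each node stores one "arrow" to a neighbour (None at the current owner).

class ArrowDirectory:
    def __init__(self, arrows, owner):
        self.arrow = dict(arrows)   # node -> node its arrow points to
        self.owner = owner

    def acquire(self, requester):
        """Route a request from `requester` to the owner, reversing the path."""
        node, prev = requester, None
        visited = []
        while self.arrow[node] is not None:
            nxt = self.arrow[node]
            self.arrow[node] = prev        # path reversal toward requester
            visited.append(node)
            prev, node = node, nxt
        self.arrow[node] = prev            # old owner now points back
        self.owner = requester
        return visited + [node]

# Hypothetical 3-node chain 0 -> 1 -> 2, with the object held at node 2.
d = ArrowDirectory({0: 1, 1: 2, 2: None}, owner=2)
visited = d.acquire(0)   # request travels 0 -> 1 -> 2
```

After the call, node 0 owns the object and every arrow (1 -> 0, 2 -> 1) points toward it, so the next request from any node still routes correctly.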

  2. Implementing a framework for integrating toxicokinetics into human health risk assessment for agrochemicals.

    Science.gov (United States)

    Terry, Claire; Hays, Sean; McCoy, Alene T; McFadden, Lisa G; Aggarwal, Manoj; Rasoulpour, Reza J; Juberg, Daland R

    2016-03-01

    A strategic and comprehensive program in which toxicokinetic (TK) measurements are made for all agrochemicals undergoing toxicity testing (both new compounds and compounds already registered for use) is described. This approach provides the data to more accurately assess the toxicokinetics of agrochemicals and their metabolites in laboratory animals and humans. Having this knowledge provides the ability to conduct more insightful toxicity studies, refine and interpret exposure assessments and reduce uncertainty in risk assessments. By developing a better understanding of TK across species, including humans via in vitro metabolism studies, any differences across species in TK can be identified early and the most relevant species can be selected for toxicity tests. It also provides the ability to identify any non-linearities in TK as a function of dose, which in turn can be used to identify a kinetically derived maximum dose (KMD) and avoid dosing inappropriately outside of the kinetic linear range. Measuring TK in key life stages also helps to identify changes in ADME parameters from in utero to adults. A robust TK database can also be used to set internal concentration based "Reference Concentrations" and Biomonitoring Equivalents (BE), and support selection of Chemical Specific Adjustment Factors (CSAF). All of these factors support the reduction of uncertainty throughout the entire risk assessment process. This paper outlines how a TK research strategy can be integrated into new agrochemical toxicity testing programs, together with a proposed Framework for future use. Copyright © 2015 Elsevier Inc. All rights reserved.
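
The abstract's idea of a kinetically derived maximum dose (KMD) (avoiding doses outside the kinetic linear range) can be illustrated numerically: in the linear range, AUC scales proportionally with dose, so dose-normalized exposure (AUC/dose) is roughly constant. The screening rule, tolerance, and study data below are our invented illustration, not the authors' procedure.

```python
# Toy KMD screen: flag the highest dose whose dose-normalized AUC stays
# within a fractional tolerance of the lowest dose's value; above that,
# TK is non-linear and higher doses are kinetically inappropriate.

def kmd_candidate(doses, aucs, tolerance=0.25):
    """doses ascending; aucs are matching exposure measurements."""
    ref = aucs[0] / doses[0]            # low-dose AUC/dose as the linear anchor
    kmd = doses[0]
    for dose, auc in zip(doses, aucs):
        if abs(auc / dose - ref) / ref <= tolerance:
            kmd = dose                  # still proportional: keep extending
    return kmd

# Hypothetical rodent study (mg/kg): exposure turns supra-linear above 100.
doses = [10, 30, 100, 300, 1000]
aucs = [5.0, 15.2, 51.0, 260.0, 1900.0]
kmd = kmd_candidate(doses, aucs)        # 100: AUC/dose jumps beyond it
```

Real KMD selection uses proper regression and study design, but the proportionality check is the core of it.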

  3. Have artificial neural networks met expectations in drug discovery as implemented in QSAR framework?

    Science.gov (United States)

    Dobchev, Dimitar; Karelson, Mati

    2016-07-01

    Artificial neural networks (ANNs) are highly adaptive nonlinear optimization algorithms that have been applied in many diverse scientific endeavors, ranging from economics, engineering, physics, and chemistry to medical science. Notably, in the past two decades, ANNs have been used widely in the process of drug discovery. In this review, the authors discuss advantages and disadvantages of ANNs in drug discovery as incorporated into the quantitative structure-activity relationships (QSAR) framework. Furthermore, the authors examine the recent studies, which span over a broad area with various diseases in drug discovery. In addition, the authors attempt to answer the question about the expectations of the ANNs in drug discovery and discuss the trends in this field. The old pitfalls of overtraining and interpretability are still present with ANNs. However, despite these pitfalls, the authors believe that ANNs have likely met many of the expectations of researchers and are still considered as excellent tools for nonlinear data modeling in QSAR. It is likely that ANNs will continue to be used in drug development in the future.

  4. Evaluation of the implementation of a whole-workplace walking programme using the RE-AIM framework

    Directory of Open Access Journals (Sweden)

    Emma J. Adams

    2017-05-01

    Full Text Available Abstract Background Promoting walking for the journey to/from work and during the working day is one potential approach to increase physical activity in adults. Walking Works was a practice-led, whole-workplace walking programme delivered by employees (walking champions). This study aimed to evaluate the implementation of Walking Works using the RE-AIM framework and provide recommendations for future delivery of whole-workplace walking programmes. Methods Two cross-sectional surveys were conducted; 1544 employees (28%) completed the baseline survey and 918 employees (21%) completed the follow-up survey. Effectiveness was assessed using baseline and follow-up data; reach, implementation and maintenance were assessed using follow-up data only. For categorical data, Chi-square tests were conducted to assess differences between surveys or groups. Continuous data were analysed to test for significant differences using a Mann-Whitney U test. Telephone interviews were conducted with the lead organisation co-ordinator, eight walking champions and three business representatives at follow-up. Interviews were transcribed verbatim and analysed to identify key themes related to adoption, implementation and maintenance. Results Adoption: Five workplaces participated in Walking Works. Reach: 480 (52.3%) employees were aware of activities and 221 (24.1%) participated. Implementation: A variety of walking activities were delivered. Some programme components were not delivered as planned, which was partly due to barriers in using walking champions to deliver activities. These included the walking champions' capacity, skills, support needs, ability to engage senior management, and the number and type of activities they could deliver. Other barriers included lack of management support, difficulties communicating information about activities and challenges embedding the programme into normal business activities. Effectiveness: No significant changes in walking to

  5. Implementation of a framework for multi-species, multi-objective adaptive management in Delaware Bay

    Science.gov (United States)

    McGowan, Conor P.; Smith, David R.; Nichols, James D.; Lyons, James E.; Sweka, John A.; Kalasz, Kevin; Niles, Lawrence J.; Wong, Richard; Brust, Jeffrey; Davis, Michelle C.; Spear, Braddock

    2015-01-01

    Decision analytic approaches have been widely recommended as well suited to solving disputed and ecologically complex natural resource management problems with multiple objectives and high uncertainty. However, the difference between theory and practice is substantial, as there are very few actual resource management programs that represent formal applications of decision analysis. We applied the process of structured decision making to Atlantic horseshoe crab harvest decisions in the Delaware Bay region to develop a multispecies adaptive management (AM) plan, which is currently being implemented. Horseshoe crab harvest has been a controversial management issue since the late 1990s. A largely unregulated horseshoe crab harvest caused a decline in crab spawning abundance. That decline coincided with a major decline in migratory shorebird populations that consume horseshoe crab eggs on the sandy beaches of Delaware Bay during spring migration. Our approach incorporated multiple stakeholders, including fishery and shorebird conservation advocates, to account for diverse management objectives and varied opinions on ecosystem function. Through consensus building, we devised an objective statement and quantitative objective function to evaluate alternative crab harvest policies. We developed a set of competing ecological models accounting for the leading hypotheses on the interaction between shorebirds and horseshoe crabs. The models were initially weighted based on stakeholder confidence in these hypotheses, but weights will be adjusted based on monitoring and Bayesian model weight updating. These models were used together to predict the effects of management actions on the crab and shorebird populations. Finally, we used a dynamic optimization routine to identify the state dependent optimal harvest policy for horseshoe crabs, given the possible actions, the stated objectives and our competing hypotheses about system function. The AM plan was reviewed, accepted and
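
The Bayesian model-weight updating the AM plan relies on is a one-line application of Bayes' rule: each competing ecological model's weight is multiplied by the likelihood of the monitoring observation under that model, then renormalized. The priors and likelihoods in the sketch are invented for illustration; the plan's actual models and data are not reproduced here.

```python
# Bayesian weight update across competing ecological models, as used in
# adaptive management: weights shift toward models that predict the
# monitoring data well, and future optimizations use the new weights.

def update_weights(priors, likelihoods):
    """priors: current model weights (sum to 1);
    likelihoods: P(observation | model) for this year's monitoring data."""
    posterior = [p * l for p, l in zip(priors, likelihoods)]
    total = sum(posterior)
    return [p / total for p in posterior]

# Two hypothetical crab-shorebird linkage hypotheses, equal initial weight;
# this year's counts fit model A three times better than model B.
weights = update_weights([0.5, 0.5], [0.6, 0.2])   # -> [0.75, 0.25]
```

Iterating this update each monitoring cycle is what lets the harvest policy "learn" which hypothesis about system function is better supported.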

  6. Fast implementation of length-adaptive privacy amplification in quantum key distribution

    International Nuclear Information System (INIS)

    Zhang Chun-Mei; Li Mo; Huang Jing-Zheng; Li Hong-Wei; Li Fang-Yi; Wang Chuan; Yin Zhen-Qiang; Chen Wei; Han Zhen-Fu; Treeviriyanupab Patcharapong; Sripimanwat Keattisak

    2014-01-01

    Post-processing is indispensable in quantum key distribution (QKD), which is aimed at sharing secret keys between two distant parties. It mainly consists of key reconciliation and privacy amplification, which are used for sharing identical keys and for distilling unconditionally secure keys, respectively. In this paper, we focus on speeding up the privacy amplification process by choosing a simple multiplicative universal class of hash functions. By constructing an optimal multiplication algorithm based on four basic multiplication algorithms, we give a fast software implementation of length-adaptive privacy amplification. “Length-adaptive” indicates that the implementation of privacy amplification automatically adapts to different lengths of input blocks. When the lengths of the input blocks are 1 Mbit and 10 Mbit, the speed of privacy amplification can be as fast as 14.86 Mbps and 10.88 Mbps, respectively. Thus, it is practical for GHz or even higher repetition frequency QKD systems. (general)
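
A multiplicative universal hash compresses the reconciled key by one big modular multiplication followed by a shift; the toy below illustrates that idea only (the multiplier, key, block sizes, and output length are ours, and the paper's optimized multiplication algorithms are not reproduced).

```python
# Toy multiplicative-hash privacy amplification:
#   y = ((a * x) mod 2^n) >> (n - m)
# where x is the n-bit reconciled key, a is a public odd n-bit multiplier,
# and m is the target secret length ("length-adaptive": m is a parameter).

def privacy_amplify(key_bits, a, out_len):
    n = len(key_bits)
    x = int("".join(map(str, key_bits)), 2)
    y = (a * x) % (1 << n) >> (n - out_len)
    return [(y >> i) & 1 for i in reversed(range(out_len))]

# Hypothetical 16-bit reconciled key compressed to 6 secret bits.
key = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1]
secret = privacy_amplify(key, a=0x9E37, out_len=6)
```

Production implementations hash megabit blocks, which is why fast large-integer multiplication dominates the cost the paper optimizes.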

  7. State-of-the-Art: Research Theoretical Framework of Information Systems Implementation Research in the Health Sector in Sub-Saharan Africa

    DEFF Research Database (Denmark)

    Tetteh, Godwin Kofi

    2014-01-01

    This study is about the state of the art of reference theories and theoretical frameworks in information systems implementation research in the health industry in Sub-Saharan countries, from a process perspective. A process–variance framework (Poole et al., 2000; Markus & Robey, 1988; Shaw & Jarvenpaa, 1997) is employed to examine reference theories used in research on information systems implementation in the health sector in the Sub-Saharan region, published between 2003 and 2013. Using a number of key words and searching a number of databases (EBSCO, CSA...)... the process theoretical framework to enhance our insight into successful information systems implementation in the region. We are optimistic that the process-based theoretical framework will be useful for information system practitioners, organisational managers and researchers in the health sector...

  8. Implementing and Sustaining Data Lifecycle best Practices: a Framework for Researchers and Repositories

    Science.gov (United States)

    Stall, S.

    2016-02-01

    Emerging data management mandates in conjunction with cross-domain international interoperability are posing new challenges for researchers and repositories. Domain repositories are serving in this critical, growing role monitoring and leading data management standards and capability within their own repository and working on mappings between repositories internationally. Leading research institutions and companies will also be important as they develop and expand data curation efforts. This landscape poses a number of challenges for developing and ensuring the use of best practices in curating research data, enabling discovery, elevating quality across diverse repositories, and helping researchers collect and organize it through the full data life cycle. This multidimensional challenge will continue to grow in complexity. The American Geophysical Union (AGU) is developing two programs to help researchers and data repositories develop and elevate best practices and address these challenges. The goal is to provide tools for the researchers and repositories, whether domain, institutional, or other, that improve performance throughout the data lifecycle across the Earth and space science community. For scientists and researchers, AGU is developing courses around handling data that can lead toward a certification in geoscience data management. Course materials will cover metadata management and collection, data analysis, integration of data, and data presentation. The course topics are being finalized by the advisory board with the first one planned to be available later this year. AGU is also developing a program aimed at helping data repositories, large and small, domain-specific to general, assess and improve data management practices. AGU has partnered with the CMMI® Institute to adapt their Data Management Maturity (DMM)SM framework within the Earth and space sciences. A data management assessment using the DMMSM involves identifying accomplishments and

  9. [Registries for rare diseases : OSSE - An open-source framework for technical implementation].

    Science.gov (United States)

    Storf, Holger; Schaaf, Jannik; Kadioglu, Dennis; Göbel, Jens; Wagner, Thomas O F; Ückert, Frank

    2017-05-01

    Meager amounts of data stored locally, a small number of experts, and a broad spectrum of technological solutions incompatible with each other characterize the landscape of registries for rare diseases in Germany. Hence, the free software Open Source Registry for Rare Diseases (OSSE) was created to unify and streamline the process of establishing specific rare disease patient registries. The data to be collected is specified based on metadata descriptions within the registry framework's so-called metadata repository (MDR), which was developed according to the ISO/IEC 11179 standard. The use of a central MDR allows for sharing the same data elements across any number of registries, thus providing a technical prerequisite for making data comparable and mergeable between registries and promoting interoperability.With OSSE, the foundation is laid to operate linked patient registries while respecting strong data protection regulations. Using the federated search feature, data for clinical studies can be identified across registries. Data integrity, however, remains intact since no actual data leaves the premises without the owner's consent. Additionally, registry solutions other than OSSE can participate via the OSSE bridgehead, which acts as a translator between OSSE registry networks and non-OSSE registries. The pseudonymization service Mainzelliste adds further data protection.Currently, more than 10 installations are under construction in clinical environments (including university hospitals in Frankfurt, Hamburg, Freiburg and Münster). The feedback given by the users will influence further development of OSSE. As an example, the installation process of the registry for undiagnosed patients at University Hospital Frankfurt is described in more detail.

  10. Benefits of the implementation and use of a warehouse management system in a distribution center

    Directory of Open Access Journals (Sweden)

    Alexsander Machado

    2011-12-01

    Full Text Available The aim of this article was to describe how the deployment and use of a Warehouse Management System (WMS) can help increase productivity, reduce errors and speed up the flow of information in a distribution center. The research method was the case study. We chose a distributor of goods, located in Vale do Rio dos Sinos, RS, which sells and distributes products for business use to companies throughout Brazil. The main research technique was participant observation. In order to highlight the observed results, we collected two indicators: productivity and errors in the picking of items for orders. After four months of observation, both showed significant improvement, strengthening the hypothesis that the selection and implementation of the management system was beneficial for the company.

  11. Considerations for control system software verification and validation specific to implementations using distributed processor architectures

    International Nuclear Information System (INIS)

    Munro, J.K. Jr.

    1993-01-01

    Until recently, digital control systems have been implemented on centralized processing systems to function in one of several ways: (1) as a single processor control system; (2) as a supervisor at the top of a hierarchical network of multiple processors; or (3) in a client-server mode. Each of these architectures uses a very different set of communication protocols. The latter two architectures also belong to the category of distributed control systems. Distributed control systems can have a central focus, as in the cases just cited, or be quite decentralized in a loosely coupled, shared responsibility arrangement. This last architecture is analogous to autonomous hosts on a local area network. Each of the architectures identified above will have a different set of architecture-associated issues to be addressed in the verification and validation activities during software development. This paper summarizes results of efforts to identify, describe, contrast, and compare these issues

  12. Distributed models of radionuclide transport on watersheds: development and implementation for the Chernobyl and Fukushima catchments

    Energy Technology Data Exchange (ETDEWEB)

    Kivva, S.; Zheleznyak, M. [Institute of Environmental Radioactivity, Fukushima University (Japan)

    2014-07-01

    The distributed hydrological 'rainfall-runoff' models provide possibilities for physically based simulation of surface and subsurface flow on watersheds based on GIS-processed data. The success of such modeling approaches for predictions of runoff and soil erosion provides a basis for the implementation of distributed radionuclide transport watershed models. Two distributed watershed models of radionuclide transport, RUNTOX and DHSVM-R, have been used to simulate radionuclide transport in the basin of the Dnieper River, Ukraine, and in watersheds of Fukushima Prefecture. RUNTOX is used for the simulation of radionuclide wash-off from experimental plots and small watersheds, and DHSVM-R is used for medium and large watersheds. RUNTOX is a two-dimensional distributed hydrological model based on the finite-difference solution of coupled equations for surface flow, subsurface flow and groundwater flow, together with advection-dispersion equations for the transport of sediments (eroded soil) and radionuclides in liquid and solid phases, parameterizing the radionuclide exchanges between the liquid and solid phases. This model has been applied to experimental plots in Ukraine after the Chernobyl accident and to experimental plots in Fukushima Prefecture. The experience of RUNTOX development and application has been used for the extension of the distributed hydrological model DHSVM by including a module for watershed radionuclide transport. The updated model was named DHSVM-R. The original DHSVM (Distributed Hydrology Soil Vegetation Model) was developed at the University of Washington and Pacific Northwest National Laboratory. DHSVM is a physical distributed hydrology-vegetation model for complex terrain based on the numerical solution of a network of one-dimensional equations. The model accounts explicitly for the spatial distribution of land-surface processes, and can be applied over a range of scales, from plot to large
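
The advection part of the advection-dispersion transport such models solve can be illustrated with a one-cell-at-a-time upwind finite-difference step; the scheme below is a generic textbook sketch (not RUNTOX or DHSVM-R code), with invented channel data.

```python
# Explicit upwind step for 1-D advective transport of a dissolved
# radionuclide concentration along a channel:
#   c_i(t+dt) = c_i - (u*dt/dx) * (c_i - c_{i-1}),
# stable for Courant number u*dt/dx <= 1.

def advect(conc, u, dt, dx, inflow=0.0):
    cr = u * dt / dx                    # Courant number
    assert cr <= 1.0, "unstable time step"
    out = conc[:]
    for i in range(len(conc)):
        upstream = conc[i - 1] if i > 0 else inflow
        out[i] = conc[i] - cr * (conc[i] - upstream)
    return out

# A pulse of activity entering a hypothetical 5-cell channel reach:
c = [1.0, 0.0, 0.0, 0.0, 0.0]
c = advect(c, u=0.5, dt=1.0, dx=1.0)    # Courant number 0.5
```

Full models add dispersion, sorption to sediment (the liquid-solid exchange the abstract mentions), and 2-D or network geometry on top of this kernel.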

  13. Multi-criteria decision support framework for sustainable implementation of effective green supply chain management practices.

    Science.gov (United States)

    Boutkhoum, Omar; Hanine, Mohamed; Boukhriss, Hicham; Agouti, Tarik; Tikniouine, Abdessadek

    2016-01-01

    At present, environmental issues have become critical barriers for many supply chain corporations concerning the sustainability of their businesses. In this context, several studies have been proposed by both academia and industry that try to develop new measurements related to green supply chain management (GSCM) practices to overcome these barriers, helping to create new environmental strategies and to implement those practices in manufacturing processes. The objective of this study is to present the technical and analytical contribution that multi-criteria decision analysis (MCDA) can bring to environmental decision-making problems, and especially to the GSCM field. For this reason, a multi-criteria decision-making methodology, combining the fuzzy analytic hierarchy process (fuzzy AHP) and the fuzzy technique for order preference by similarity to ideal solution (fuzzy TOPSIS), is proposed to contribute to a better understanding of new sustainable strategies through the identification and evaluation of the most appropriate GSCM practices to be adopted by industrial organizations. The fuzzy AHP process is used to construct hierarchies of the influential criteria and then identify the importance weights of the selected criteria, while the fuzzy TOPSIS process employs these weighted criteria as inputs to evaluate and measure the performance of each alternative. To illustrate the effectiveness and performance of our MCDA approach, we have applied it to a chemical industry corporation located in Safi, Morocco.
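
The TOPSIS half of the methodology ranks alternatives by closeness to an ideal solution. The sketch below uses crisp numbers rather than the paper's fuzzy ones, to keep the ranking idea visible; the three practices, criteria, and weights are invented.

```python
# Crisp TOPSIS sketch (the paper uses fuzzy AHP-derived weights and fuzzy
# TOPSIS; this simplification keeps the same ranking mechanics).

def topsis(matrix, weights):
    """matrix: rows = alternatives, columns = benefit criteria."""
    cols = list(zip(*matrix))
    norms = [sum(v * v for v in col) ** 0.5 for col in cols]
    scored = [[w * v / n for v, w, n in zip(row, weights, norms)]
              for row in matrix]
    ideal = [max(col) for col in zip(*scored)]   # best value per criterion
    anti = [min(col) for col in zip(*scored)]    # worst value per criterion
    closeness = []
    for row in scored:
        d_pos = sum((v - i) ** 2 for v, i in zip(row, ideal)) ** 0.5
        d_neg = sum((v - a) ** 2 for v, a in zip(row, anti)) ** 0.5
        closeness.append(d_neg / (d_pos + d_neg))
    return closeness

# Three hypothetical GSCM practices scored on
# (environmental benefit, cost saving, feasibility); weights from an
# AHP-style elicitation (invented here).
scores = topsis([[7, 5, 8], [9, 4, 6], [6, 8, 7]], weights=[0.5, 0.2, 0.3])
best = max(range(3), key=lambda i: scores[i])    # practice index 1 wins
```

The fuzzy variants replace each crisp score and weight with a triangular fuzzy number and use fuzzy distances, but the ideal/anti-ideal closeness ratio is the same.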

  14. Evaluating the implementation of a quality improvement process in General Practice using a realist evaluation framework.

    Science.gov (United States)

    Moule, Pam; Clompus, Susan; Fieldhouse, Jon; Ellis-Jones, Julie; Barker, Jacqueline

    2018-05-25

    Underuse of anticoagulants in atrial fibrillation is known to increase the risk of stroke and is an international problem. The National Institute for Health and Care Excellence (NICE) guidance CG180 seeks to reduce atrial fibrillation related strokes through prescriptions of Non-vitamin K antagonist Oral Anticoagulants. A quality improvement programme was established by the West of England Academic Health Science Network (West of England AHSN) to implement this guidance in General Practice. A realist evaluation identified whether the quality improvement programme worked, determining how and in what circumstances. Six General Practices in one region became the case study sites. Quality improvement team, doctor, and pharmacist meetings within each of the General Practices were recorded at three stages: initial planning, review, and final. Additionally, 15 interviews conducted with the practice leads explored experiences of the quality improvement process. Observation and interview data were analysed and compared against the initial programme theory. The quality improvement resources available were used variably, with the training being valued by all. The initial programme theories were refined. In particular, local workload pressures and individual General Practitioner experiences and pre-conceived ideas were acknowledged. Where key motivators were in place, such as prior experience, the programme achieved optimal outcomes and secured a lasting quality improvement legacy. The employment of a quality improvement programme can deliver practice change and improvement legacy outcomes when particular mechanisms are employed and in contexts where there is a commitment to improve service. © 2018 John Wiley & Sons, Ltd.

  15. Formalization, implementation, and modeling of institutional controllers for distributed robotic systems.

    Science.gov (United States)

    Pereira, José N; Silva, Porfírio; Lima, Pedro U; Martinoli, Alcherio

    2014-01-01

    The work described is part of a long term program of introducing institutional robotics, a novel framework for the coordination of robot teams that stems from institutional economics concepts. Under the framework, institutions are cumulative sets of persistent artificial modifications made to the environment or to the internal mechanisms of a subset of agents, thought to be functional for the collective order. In this article we introduce a formal model of institutional controllers based on Petri nets. We define executable Petri nets-an extension of Petri nets that takes into account robot actions and sensing-to design, program, and execute institutional controllers. We use a generalized stochastic Petri net view of the robot team controlled by the institutional controllers to model and analyze the stochastic performance of the resulting distributed robotic system. The ability of our formalism to replicate results obtained using other approaches is assessed through realistic simulations of up to 40 e-puck robots. In particular, we model a robot swarm and its institutional controller with the goal of maintaining wireless connectivity, and successfully compare our model predictions and simulation results with previously reported results, obtained by using finite state automaton models and controllers.
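
The executable Petri nets the article introduces extend ordinary Petri-net firing with robot actions and sensing. A minimal toy of the firing rule follows; the place names, transition structure, and callback are our invention, not the authors' formalism.

```python
# Minimal executable-Petri-net step: a transition is enabled when every
# input place holds enough tokens; firing moves tokens and runs an attached
# action callback (standing in for a robot command or sensing call).

def enabled(marking, transition):
    return all(marking[p] >= w for p, w in transition["in"].items())

def fire(marking, transition):
    assert enabled(marking, transition), "transition not enabled"
    m = dict(marking)
    for place, weight in transition["in"].items():
        m[place] -= weight
    for place, weight in transition["out"].items():
        m[place] += weight
    if transition.get("action"):
        transition["action"]()          # e.g. command the robot controller
    return m

# Hypothetical institutional-controller fragment: a robot leaves "idle"
# for "moving", issuing a velocity command as the transition fires.
log = []
t_start = {"in": {"idle": 1}, "out": {"moving": 1},
           "action": lambda: log.append("set_velocity")}
m0 = {"idle": 1, "moving": 0}
m1 = fire(m0, t_start)                  # token moves idle -> moving
```

The article's generalized stochastic Petri net analysis adds firing rates on top of exactly this token-game semantics.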

  16. Evaluation of the implementation of a whole-workplace walking programme using the RE-AIM framework.

    Science.gov (United States)

    Adams, Emma J; Chalkley, Anna E; Esliger, Dale W; Sherar, Lauren B

    2017-05-18

    Promoting walking for the journey to/from work and during the working day is one potential approach to increase physical activity in adults. Walking Works was a practice-led, whole-workplace walking programme delivered by employees (walking champions). This study aimed to evaluate the implementation of Walking Works using the RE-AIM framework and provide recommendations for future delivery of whole-workplace walking programmes. Two cross-sectional surveys were conducted: 1544 employees (28%) completed the baseline survey and 918 employees (21%) completed the follow-up survey. Effectiveness was assessed using baseline and follow-up data; reach, implementation and maintenance were assessed using follow-up data only. For categorical data, Chi-square tests were conducted to assess differences between surveys or groups. Continuous data were analysed to test for significant differences using a Mann-Whitney U test. Telephone interviews were conducted with the lead organisation co-ordinator, eight walking champions and three business representatives at follow-up. Interviews were transcribed verbatim and analysed to identify key themes related to adoption, implementation and maintenance. Adoption: Five workplaces participated in Walking Works. Reach: 480 (52.3%) employees were aware of activities and 221 (24.1%) participated. A variety of walking activities were delivered. Some programme components were not delivered as planned, partly due to barriers in using walking champions to deliver activities. These included the walking champions' capacity, skills, support needs, ability to engage senior management, and the number and type of activities they could deliver. Other barriers included lack of management support, difficulties communicating information about activities and challenges embedding the programme into normal business activities. Effectiveness: No significant changes in walking to/from work or walking during the working day were observed. Maintenance

  17. An Architectural Based Framework for the Distributed Collection, Analysis and Query from Inhomogeneous Time Series Data Sets and Wearables for Biofeedback Applications

    Directory of Open Access Journals (Sweden)

    James Lee

    2017-02-01

    Full Text Available The increasing professionalism of sports persons and the desire of consumers to imitate this has led to an increased metrification of sport. This has been driven in no small part by the widespread availability of comparatively cheap assessment technologies and, more recently, wearable technologies. Historically, whilst these have produced large data sets, often only the most rudimentary analysis has taken place (Wisbey et al in: “Quantifying movement demands of AFL football using GPS tracking”). This paucity of analysis is due in no small part to the challenges of analysing large sets of data that are often from disparate data sources to glean useful key performance indicators, which has largely been a labour-intensive process. This paper presents a framework that can be cloud based for the gathering, storing and algorithmic interpretation of large and inhomogeneous time series data sets. The framework is architecture based and technology agnostic in the data sources it can gather, and presents a model for multi-set analysis across and within devices and individual subjects. A sample implementation demonstrates the utility of the framework for sports performance data collected from distributed inertial sensors in the sport of swimming.

  18. Costa Rica’s Implementation of the Framework Convention on Tobacco Control: Overcoming decades of industry dominance

    Directory of Open Access Journals (Sweden)

    Eric Crosbie

    2016-01-01

    Full Text Available Objective. To analyze the passage of Costa Rica’s 2012 tobacco control law. Materials and methods. Review of legislation, newspaper articles, and key informant interviews. Results. Tobacco control advocates, in close collaboration with international health groups, recruited national, regional and international experts to testify in the Legislative Assembly, implemented grassroots advocacy campaigns, and generated media coverage to enact strong legislation in March 2012 consistent with the World Health Organization Framework Convention on Tobacco Control, despite tobacco industry lobbying efforts that for decades blocked effective tobacco control legislation. Conclusion. Costa Rica’s experience illustrates how with resources, good strategic planning, aggressive tactics and perseverance tobacco control advocates can overcome tobacco industry opposition in the Legislative Assembly and Executive Branch. This determined approach has positioned Costa Rica to become a regional leader in tobacco control.

  19. Practical Considerations regarding Implementation of Wind Power Applications into Real-Time Hardware-In-The-Loop Framework

    DEFF Research Database (Denmark)

    Petersen, Lennart; Iov, Florin

    2017-01-01

    This paper addresses the system implementation of voltage control architecture in wind power plants into a Real-Time Hardware-In-The-Loop framework, where the focus is laid on the model development in a real-time simulator. It enables verification of the functionality of developed controls, which is one of the research priorities due to the increased complexity of large wind power plants requiring a high level of communication between plant control ... The increasing amount of wind power penetration into the power systems has engaged the wind power plants to take over the responsibility for adequate control of the node voltages, which has previously been accomplished by conventional generation. Voltage support at the point of common coupling is realized by an overall wind power plant controller which requires a high-performance and robust control solution. In most cases the system including all ...

  20. Hydrogeologic framework and salinity distribution of the Floridan aquifer system of Broward County, Florida

    Science.gov (United States)

    Reese, Ronald S.; Cunningham, Kevin J.

    2014-01-01

    Concerns about water-level decline and seawater intrusion in the surficial Biscayne aquifer, currently the principal source of water supply to Broward County, prompted a study to refine the hydrogeologic framework of the underlying Floridan aquifer system to evaluate its potential as an alternative source of supply. This report presents cross sections that illustrate the stratigraphy and hydrogeology in eastern Broward County; maps of the upper surfaces and thicknesses of several geologic formations or units within the Floridan aquifer system; and maps of two of the potentially productive water-bearing zones within the system, the Upper Floridan aquifer and the Avon Park permeable zone. An analysis of data on rock depositional textures, associated pore networks, and flow zones in the Floridan aquifer system shows that groundwater moves through the system in two ways. These data support a conceptual, dual-porosity model of the system wherein groundwater moves either as concentrated flow in discrete, thin bedding-plane vugs or zones of vuggy megaporosity, or as diffuse flow through rocks with primarily interparticle and moldic-particle porosity. Because considerable exchange of groundwater may occur between the zones of vuggy and matrix-dominated porosity, understanding the distribution of that porosity and flow zone types is important to evaluating the suitability of the several units within the Floridan aquifer system for managing the water through practices such as aquifer storage and recovery (ASR). The salinity of the water in the Floridan aquifer system is highest in the central part of the study area, and lower toward the north and south. Although salinity generally increases with depth, in the western part of the study area a zone of relatively high saline water is perched above water of lower salinity in the underlying Avon Park permeable zone. Overall, the areas of highest salinity in the aquifer system coincide with those with the lowest estimated

  1. Uniframe: A Unified Framework for Developing Service-Oriented, Component-Based Distributed Software Systems

    National Research Council Canada - National Science Library

    Raje, Rajeev R; Olson, Andrew M; Bryant, Barrett R; Burt, Carol C; Auguston, Makhail

    2005-01-01

    .... It describes how this approach employs a unifying framework for specifying such systems to unite the concepts of service-oriented architectures, a component-based software engineering methodology...

  2. Implementation of the ATLAS trigger within the ATLAS Multi-Threaded Software Framework AthenaMT

    CERN Document Server

    Wynne, Benjamin; The ATLAS collaboration

    2016-01-01

    We present an implementation of the ATLAS High Level Trigger that provides parallel execution of trigger algorithms within the ATLAS multi-threaded software framework, AthenaMT. This development will enable the ATLAS High Level Trigger to meet future challenges due to the evolution of computing hardware and upgrades of the Large Hadron Collider, LHC, and ATLAS Detector. During the LHC data-taking period starting in 2021, luminosity will reach up to three times the original design value. Luminosity will increase further, to up to 7.5 times the design value, in 2026 following LHC and ATLAS upgrades. This includes an upgrade of the ATLAS trigger architecture that will result in an increase in the High Level Trigger input rate by a factor of 4 to 10 compared to the current maximum rate of 100 kHz. The current ATLAS multiprocess framework, AthenaMP, manages a number of processes that process events independently, executing algorithms sequentially in each process. AthenaMT will provide a fully multi-threaded env...

  3. A conceptual framework to study the role of communication through social software for coordination in globally-distributed software teams

    DEFF Research Database (Denmark)

    Giuffrida, Rosalba; Dittrich, Yvonne

    2015-01-01

    Background In Global Software Development (GSD) the lack of face-to-face communication is a major challenge and effective computer-mediated practices are necessary to mitigate the effect of physical distance. Communication through Social Software (SoSo) supports team coordination, helping to deal with geographical distance; however, in Software Engineering literature, there is a lack of suitable theoretical concepts to analyze and describe everyday practices of globally-distributed software development teams and to study the role of communication through SoSo. Objective The paper proposes a theoretical framework for analyzing how communicative and coordinative practices are constituted and maintained in globally-distributed teams. Method The framework is based on the concepts of communicative genres and coordination mechanisms; it is motivated and explicated through examples from two qualitative empirical ...

  4. COMDES-II: A Component-Based Framework for Generative Development of Distributed Real-Time Control Systems

    DEFF Research Database (Denmark)

    Ke, Xu; Sierszecki, Krzysztof; Angelov, Christo K.

    2007-01-01

    The paper presents a generative development methodology and component models of COMDES-II, a component-based software framework for distributed embedded control systems with real-time constraints. The adopted methodology allows for rapid modeling and validation of control software at a higher level ... The paper presents the development methodology for COMDES-II from a general perspective, describes the component models in detail and demonstrates their application through a DC-Motor control system case study.

  5. A WEB-BASED FRAMEWORK FOR VISUALIZING INDUSTRIAL SPATIOTEMPORAL DISTRIBUTION USING STANDARD DEVIATIONAL ELLIPSE AND SHIFTING ROUTES OF GRAVITY CENTERS

    Directory of Open Access Journals (Sweden)

    Y. Song

    2017-09-01

    Full Text Available Analysing the spatiotemporal distribution patterns of different industries and their dynamics can help us learn the macro-level developing trends of those industries, and in turn provides references for industrial spatial planning. However, the analysis is a challenging task which requires an easy-to-understand information presentation mechanism and a powerful computational technology to support visual analytics of big data on the fly. For this reason, this research proposes a web-based framework to enable such a visual analytics requirement. The framework uses the standard deviational ellipse (SDE) and shifting routes of gravity centers to show the spatial distribution and yearly developing trends of different enterprise types according to their industry categories. The calculation of gravity centers and ellipses is parallelized using Apache Spark to accelerate the processing. In the experiments, we use the enterprise registration dataset in Mainland China from year 1960 to 2015 that contains fine-grained location information (i.e., coordinates of each individual enterprise) to demonstrate the feasibility of this framework. The experiment result shows that the developed visual analytics method is helpful to understand the multi-level patterns and developing trends of different industries in China. Moreover, the proposed framework can be used to analyse any natural or social spatiotemporal point process with large data volume, such as crime and disease.

  6. a Web-Based Framework for Visualizing Industrial Spatiotemporal Distribution Using Standard Deviational Ellipse and Shifting Routes of Gravity Centers

    Science.gov (United States)

    Song, Y.; Gui, Z.; Wu, H.; Wei, Y.

    2017-09-01

    Analysing the spatiotemporal distribution patterns of different industries and their dynamics can help us learn the macro-level developing trends of those industries, and in turn provides references for industrial spatial planning. However, the analysis is a challenging task which requires an easy-to-understand information presentation mechanism and a powerful computational technology to support visual analytics of big data on the fly. For this reason, this research proposes a web-based framework to enable such a visual analytics requirement. The framework uses the standard deviational ellipse (SDE) and shifting routes of gravity centers to show the spatial distribution and yearly developing trends of different enterprise types according to their industry categories. The calculation of gravity centers and ellipses is parallelized using Apache Spark to accelerate the processing. In the experiments, we use the enterprise registration dataset in Mainland China from year 1960 to 2015 that contains fine-grained location information (i.e., coordinates of each individual enterprise) to demonstrate the feasibility of this framework. The experiment result shows that the developed visual analytics method is helpful to understand the multi-level patterns and developing trends of different industries in China. Moreover, the proposed framework can be used to analyse any natural or social spatiotemporal point process with large data volume, such as crime and disease.
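The geometry behind the visualization can be sketched in a few lines. The unweighted mean-center and standard deviational ellipse formulas below are the standard ones; the paper itself parallelizes this computation with Apache Spark, so the plain-Python function names here are illustrative only.

```python
import math

def gravity_center(points, weights=None):
    """Weighted mean center (gravity center) of (x, y) points."""
    if weights is None:
        weights = [1.0] * len(points)
    total = sum(weights)
    cx = sum(w * x for (x, _), w in zip(points, weights)) / total
    cy = sum(w * y for (_, y), w in zip(points, weights)) / total
    return cx, cy

def standard_deviational_ellipse(points):
    """Return ((cx, cy), theta, sigma_x, sigma_y): center, rotation angle,
    and axis standard deviations of the unweighted SDE.
    Note: angle conventions differ between GIS packages."""
    cx, cy = gravity_center(points)
    dx = [x - cx for x, _ in points]
    dy = [y - cy for _, y in points]
    n = len(points)
    a = sum(d * d for d in dx) - sum(d * d for d in dy)
    c = sum(xi * yi for xi, yi in zip(dx, dy))
    if c:
        theta = math.atan((a + math.sqrt(a * a + 4 * c * c)) / (2 * c))
    else:
        theta = 0.0 if a >= 0 else math.pi / 2
    cos_t, sin_t = math.cos(theta), math.sin(theta)
    sigma_x = math.sqrt(sum((xi * cos_t - yi * sin_t) ** 2
                            for xi, yi in zip(dx, dy)) / n)
    sigma_y = math.sqrt(sum((xi * sin_t + yi * cos_t) ** 2
                            for xi, yi in zip(dx, dy)) / n)
    return (cx, cy), theta, sigma_x, sigma_y
```

Computing these per year and per industry category, then connecting the yearly gravity centers, yields the "shifting routes" the framework renders.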

  7. Applying Cross-Docking and Activity-Based Costing to Military Distribution Centers: A Proposed Framework

    National Research Council Canada - National Science Library

    Elliott, Jonathan

    1997-01-01

    .... Cross-docking is a commercially proven approach to material distribution through a distribution center that can help reduce inventories, speed material flows, and cut related logistics activity costs...

  8. Design and Analysis of Electrical Distribution Networks and Balancing Markets in the UK: A New Framework with Applications

    Directory of Open Access Journals (Sweden)

    Vijayanarasimha Hindupur Pakka

    2016-02-01

    Full Text Available We present a framework for the design and simulation of electrical distribution systems and short term electricity markets specific to the UK. The modelling comprises packages relating to the technical and economic features of the electrical grid. The first package models the medium/low distribution networks with elements such as transformers, voltage regulators, distributed generators, composite loads, distribution lines and cables. This model forms the basis for elementary analysis such as load flow and short circuit calculations and also enables the investigation of effects of integrating distributed resources, voltage regulation, resource scheduling and the like. The second part of the modelling exercise relates to the UK short term electricity market with specific features such as balancing mechanism and bid-offer strategies. The framework is used for investigating methods of voltage regulation using multiple control technologies, to demonstrate the effects of high penetration of wind power on balancing prices and finally use these prices towards achieving demand response through aggregated prosumers.

  9. Distributed Energy Systems in the Built Environment - European Legal Framework on Distributed Energy Systems in the Built Environment

    NARCIS (Netherlands)

    Pront-van Bommel, S.; Bregman, A.

    2013-01-01

    This paper aims to outline the stimuli provided by European law for promoting integrated planning of distributed renewable energy installations on the one hand and its limitations thereof on the other hand, also with regard to the role of local governments.

  10. Optimal Electricity Distribution Framework for Public Space: Assessing Renewable Energy Proposals for Freshkills Park, New York City

    Directory of Open Access Journals (Sweden)

    Kaan Ozgun

    2015-03-01

    Full Text Available Integrating renewable energy into public space is becoming more common as a climate change solution. However, this approach is often guided by the environmental pillar of sustainability, with less focus on the economic and social pillars. The purpose of this paper is to examine this issue in the speculative renewable energy propositions for Freshkills Park in New York City submitted for the 2012 Land Art Generator Initiative (LAGI) competition. This paper first proposes an optimal electricity distribution (OED) framework in and around public spaces based on relevant ecology and energy theory (Odum’s fourth and fifth laws of thermodynamics). This framework addresses social engagement related to public interaction, and economic engagement related to the estimated quantity of electricity produced, in conjunction with environmental engagement related to the embodied energy required to construct the renewable energy infrastructure. Next, the study uses the OED framework to analyse the top twenty-five projects submitted for the LAGI 2012 competition. The findings reveal an electricity distribution imbalance and suggest a lack of in-depth understanding about sustainable electricity distribution within public space design. The paper concludes with suggestions for future research.

  11. A trial of distributed portable data acquisition and processing system implementation: the qdpb - data processing with branchpoints

    International Nuclear Information System (INIS)

    Gritsaj, K.I.; Isupov, A.Yu.

    2001-01-01

    A trial of the distributed portable data acquisition and processing system qdpb is described. Experimental-setup-specific data handling and hardware-dependent code are separated from the generic part of the qdpb system. The implementation of the generic part is described.

  12. Describing the implementation of an innovative intervention and evaluating its effectiveness in increasing research capacity of advanced clinical nurses: using the consolidated framework for implementation research.

    Science.gov (United States)

    McKee, Gabrielle; Codd, Margaret; Dempsey, Orla; Gallagher, Paul; Comiskey, Catherine

    2017-01-01

    Despite advanced nursing roles having a research competency, participation in research is low. There are many barriers to participation in research and few interventions have been developed to address these. This paper aims to describe the implementation of an intervention to increase research participation in advanced clinical nursing roles and evaluate its effectiveness. The implementation of the intervention was carried out within one hospital site. The evaluation utilised a mixed methods design and an implementation science framework. All staff in advanced nursing roles were invited to take part; all those who were interested and had a project in mind could volunteer to participate in the intervention. The intervention consisted of the development of small research groups working on projects developed by the nurse participants and supported by an academic and a research fellow. The main evaluation was through focus groups. Output was analysed using thematic analysis. In addition, a survey questionnaire was circulated to all participants to ascertain their self-reported research skills before and after the intervention. The results of the survey were analysed using descriptive statistics. Finally, an inventory of research outputs was collated. In the first year, twelve new clinical nurse-led research projects were conducted and reported in six peer-reviewed papers, two non-peer-reviewed papers and 20 conference presentations. The main strengths of the intervention were its promptness in completing research, publishing and showcasing clinical innovations. The main barriers identified were time and appropriate support from academics and from peers. The majority of participants had increased experience at scientific writing and data analysis. This study shows that an intervention, with minor financial resources, a top-down approach, support of a hands-on research fellow, peer collaboration with academics, and strong clinical ownership by the clinical nurse researcher

  13. Online Data Monitoring Framework Based on Histogram Packaging in Network Distributed Data Acquisition Systems

    International Nuclear Information System (INIS)

    Konno, T; Ishitsuka, M; Kuze, M; Cabarera, A; Sakamoto, Y

    2011-01-01

    Online monitor framework is a new general software framework for online data monitoring, which provides a way to collect information from online systems, including data acquisition, and display it to shifters far from experimental sites. 'Monitor Server', a core system in this framework, gathers the monitoring information from the online subsystems; the information is handled as collections of histograms named Histogram Packages. Monitor Server broadcasts the histogram packages to 'Monitor Viewers', graphical user interfaces in the framework. We developed two types of viewers with different technologies: Java and web browser. We adopted XML-based files for the configuration of GUI components on the windows and graphical objects on the canvases. Monitor Viewer creates its GUIs automatically from the configuration files. This monitoring framework has been developed for the Double Chooz reactor neutrino oscillation experiment in France, but can be extended for general application in other experiments. This document reports the structure of the online monitor framework with some examples from its adaptation to the Double Chooz experiment.
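A minimal sketch of the histogram-package pattern described above: subsystems push named histograms, the server bundles them per package and broadcasts each update to subscribed viewers. This is illustrative only, not the Double Chooz code; all class and package names are hypothetical.

```python
from collections import defaultdict

class MonitorServer:
    """Toy histogram-package server: subsystems push named histograms,
    the server groups them into packages and notifies subscribed viewers.
    Illustrative sketch, not the experiment's implementation."""
    def __init__(self):
        self.packages = defaultdict(dict)   # package name -> {hist name: bins}
        self.viewers = []                   # callbacks playing the Viewer role

    def subscribe(self, viewer):
        self.viewers.append(viewer)

    def update(self, package, hist_name, bins):
        self.packages[package][hist_name] = bins
        for viewer in self.viewers:         # broadcast the updated package
            viewer(package, dict(self.packages[package]))

received = []
server = MonitorServer()
server.subscribe(lambda name, pkg: received.append((name, pkg)))
server.update("daq_rates", "trigger_rate", [5, 7, 6])
```

Broadcasting whole packages rather than individual histograms keeps every viewer's display consistent with one snapshot of the monitoring state.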

  14. Stress distribution in Co-Cr implant frameworks after laser or TIG welding.

    Science.gov (United States)

    de Castro, Gabriela Cassaro; de Araújo, Cleudmar Amaral; Mesquita, Marcelo Ferraz; Consani, Rafael Leonardo Xediek; Nóbilo, Mauro Antônio de Arruda

    2013-01-01

    Lack of passivity has been associated with biomechanical problems in implant-supported prostheses. The aim of this study was to evaluate, by photoelasticity, the passivity of three techniques used to fabricate an implant framework from a Co-Cr alloy. The model was obtained from a steel die simulating an edentulous mandible with 4 external hexagon analog implants with a standard platform. On this model, five frameworks were fabricated for each group: monoblock frameworks (control), and laser- and TIG-welded frameworks. The photoelastic model was made from a flexible epoxy resin. On the photoelastic analysis, the frameworks were bolted onto the model for verification of maximum shear stress at 34 selected points around the implants and 5 points in the middle of the model. The stresses were compared all over the photoelastic model, between the right, left, and center regions and between the cervical and apical regions. The values were subjected to two-way ANOVA and Tukey's test (α=0.05). There was no significant difference among the groups and studied areas (p>0.05). It was concluded that the stresses generated around the implants were similar for all techniques.

  15. Designing and Implementing a Distributed System Architecture for the Mars Rover Mission Planning Software (Maestro)

    Science.gov (United States)

    Goldgof, Gregory M.

    2005-01-01

    Distributed systems allow scientists from around the world to plan missions concurrently, while being updated on the revisions of their colleagues in real time. However, permitting multiple clients to simultaneously modify a single data repository can quickly lead to data corruption or inconsistent states between users. Since our message broker, the Java Message Service, does not ensure that messages will be received in the order they were published, we must implement our own numbering scheme to guarantee that changes to mission plans are performed in the correct sequence. Furthermore, distributed architectures must ensure that as new users connect to the system, they synchronize with the database without missing any messages or falling into an inconsistent state. Robust systems must also guarantee that all clients will remain synchronized with the database even in the case of multiple client failure, which can occur at any time due to lost network connections or a user's own system instability. The final design for the distributed system behind the Mars rover mission planning software fulfills all of these requirements and upon completion will be deployed to MER at the end of 2005 as well as Phoenix (2007) and MSL (2009).
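The numbering scheme described above can be sketched as a reorder buffer: each published change carries a monotonically increasing sequence number, and out-of-order arrivals are held back until the gap fills. This is an illustrative reconstruction under that assumption, not the actual Maestro code.

```python
class SequencedChannel:
    """Delivers messages in publication order even if the transport
    (e.g. a JMS-style broker) hands them over shuffled.
    Illustrative sketch, not the Maestro implementation."""
    def __init__(self):
        self.next_seq = 0       # next sequence number we may deliver
        self.pending = {}       # seq -> message, held back until in order
        self.delivered = []     # messages applied to the local mission plan

    def on_receive(self, seq, message):
        self.pending[seq] = message
        # Drain every consecutive message starting at next_seq.
        while self.next_seq in self.pending:
            self.delivered.append(self.pending.pop(self.next_seq))
            self.next_seq += 1

chan = SequencedChannel()
for seq, msg in [(2, "c"), (0, "a"), (3, "d"), (1, "b")]:
    chan.on_receive(seq, msg)
print(chan.delivered)   # ['a', 'b', 'c', 'd']
```

The same counter also supports the synchronization requirement above: a newly connected client can request all changes from its last delivered sequence number onward and detect missed messages by the gap.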

  16. Achieving behaviour change for detection of Lynch syndrome using the Theoretical Domains Framework Implementation (TDFI) approach: a study protocol.

    Science.gov (United States)

    Taylor, Natalie; Long, Janet C; Debono, Deborah; Williams, Rachel; Salisbury, Elizabeth; O'Neill, Sharron; Eykman, Elizabeth; Braithwaite, Jeffrey; Chin, Melvin

    2016-03-12

    Lynch syndrome is an inherited disorder associated with a range of cancers, and found in 2-5 % of colorectal cancers. Lynch syndrome is diagnosed through a combination of significant family and clinical history and pathology. The definitive diagnostic germline test requires formal patient consent after genetic counselling. If diagnosed early, carriers of Lynch syndrome can undergo increased surveillance for cancers, which in turn can prevent late stage cancers, optimise treatment and decrease mortality for themselves and their relatives. However, over the past decade, international studies have reported that only a small proportion of individuals with suspected Lynch syndrome were referred for genetic consultation and possible genetic testing. The aim of this project is to use behaviour change theory and implementation science approaches to increase the number and speed of healthcare professional referrals of colorectal cancer patients with a high-likelihood risk of Lynch syndrome to appropriate genetic counselling services. The six-step Theoretical Domains Framework Implementation (TDFI) approach will be used at two large, metropolitan hospitals treating colorectal cancer patients. Steps are: 1) form local multidisciplinary teams to map current referral processes; 2) identify target behaviours that may lead to increased referrals using discussion supported by a retrospective audit; 3) identify barriers to those behaviours using the validated Influences on Patient Safety Behaviours Questionnaire and TDFI guided focus groups; 4) co-design interventions to address barriers using focus groups; 5) co-implement interventions; and 6) evaluate intervention impact. Chi square analysis will be used to test the difference in the proportion of high-likelihood risk Lynch syndrome patients being referred for genetic testing before and after intervention implementation. A paired t-test will be used to assess the mean time from the pathology test results to referral for high

  17. Design of the HELICS High-Performance Transmission-Distribution-Communication-Market Co-Simulation Framework

    Energy Technology Data Exchange (ETDEWEB)

    Palmintier, Bryan S [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Krishnamurthy, Dheepak [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Top, Philip [Lawrence Livermore National Laboratories; Smith, Steve [Lawrence Livermore National Laboratories; Daily, Jeff [Pacific Northwest National Laboratory; Fuller, Jason [Pacific Northwest National Laboratory

    2017-10-12

    This paper describes the design rationale for a new cyber-physical-energy co-simulation framework for electric power systems. This new framework will support very large-scale (100,000+ federates) co-simulations with off-the-shelf power-systems, communication, and end-use models. Other key features include cross-platform operating system support, integration of both event-driven (e.g. packetized communication) and time-series (e.g. power flow) simulation, and the ability to co-iterate among federates to ensure model convergence at each time step. After describing requirements, we begin by evaluating existing co-simulation frameworks, including HLA and FMI, and conclude that none provide the required features. Then we describe the design for the new layered co-simulation architecture.
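Co-iteration among federates, as described above, amounts to a fixed-point loop over the exchanged boundary variables at each time step: federates trade values until nothing changes beyond a tolerance. The toy boundary models below are assumptions for illustration, not the HELICS API.

```python
def co_iterate(transmission, distribution, v0, tol=1e-6, max_iter=50):
    """Fixed-point co-iteration between two federates at one time step:
    exchange boundary values until they stop changing (model convergence).
    Illustrative sketch; real co-simulation exchanges via the framework."""
    v = v0
    for _ in range(max_iter):
        p = distribution(v)      # distribution federate: load given voltage
        v_new = transmission(p)  # transmission federate: voltage given load
        if abs(v_new - v) < tol:
            return v_new, p      # converged: advance to the next time step
        v = v_new
    raise RuntimeError("co-iteration did not converge")

# Hypothetical linearized boundary models (per-unit values):
transmission = lambda p: 1.05 - 0.02 * p   # voltage sags with load
distribution = lambda v: 2.0 * v           # load grows with voltage
v, p = co_iterate(transmission, distribution, v0=1.0)
```

Because the composed update here is a contraction (slope magnitude 0.04 < 1), the loop converges quickly; a real framework must also handle federates whose coupling is not so well behaved.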

  18. Implementation of continuous-variable quantum key distribution with composable and one-sided-device-independent security against coherent attacks

    DEFF Research Database (Denmark)

    Gehring, Tobias; Haendchen, Vitus; Duhme, Joerg

    2015-01-01

    Secret communication over public channels is one of the central pillars of a modern information society. Using quantum key distribution this is achieved without relying on the hardness of mathematical problems, which might be compromised by improved algorithms or by future quantum computers. State-of-the-art quantum key distribution requires composable security against coherent attacks for a finite number of distributed quantum states as well as robustness against implementation side channels. Here we present an implementation of continuous-variable quantum key distribution satisfying these requirements. Our ... with conventional optical communication technology, our work is a step towards practical implementations of quantum key distribution with state-of-the-art security based solely on telecom components.

  19. NASA System Safety Handbook. Volume 1; System Safety Framework and Concepts for Implementation

    Science.gov (United States)

    Dezfuli, Homayoon; Benjamin, Allan; Everett, Christopher; Smith, Curtis; Stamatelatos, Michael; Youngblood, Robert

    2011-01-01

    System safety assessment is defined in NPR 8715.3C, NASA General Safety Program Requirements as a disciplined, systematic approach to the analysis of risks resulting from hazards that can affect humans, the environment, and mission assets. Achievement of the highest practicable degree of system safety is one of NASA's highest priorities. Traditionally, system safety assessment at NASA and elsewhere has focused on the application of a set of safety analysis tools to identify safety risks and formulate effective controls. Familiar tools used for this purpose include various forms of hazard analyses, failure modes and effects analyses, and probabilistic safety assessment (commonly also referred to as probabilistic risk assessment (PRA)). In the past, it has been assumed that to show that a system is safe, it is sufficient to provide assurance that the process for identifying the hazards has been as comprehensive as possible and that each identified hazard has one or more associated controls. The NASA Aerospace Safety Advisory Panel (ASAP) has made several statements in its annual reports supporting a more holistic approach. In 2006, it recommended that "... a comprehensive risk assessment, communication and acceptance process be implemented to ensure that overall launch risk is considered in an integrated and consistent manner." In 2009, it advocated for "... a process for using a risk-informed design approach to produce a design that is optimally and sufficiently safe." As a rationale for the latter advocacy, it stated that "... the ASAP applauds switching to a performance-based approach because it emphasizes early risk identification to guide designs, thus enabling creative design approaches that might be more efficient, safer, or both." For purposes of this preface, it is worth mentioning three areas where the handbook emphasizes a more holistic type of thinking. 
First, the handbook takes the position that it is important to not just focus on risk on an individual

  20. Modelling altered revenue function based on varying power consumption distribution and electricity tariff charge using data analytics framework

    Science.gov (United States)

    Zainudin, W. N. R. A.; Ramli, N. A.

    2017-09-01

    In 2010, the Energy Commission (EC) introduced Incentive Based Regulation (IBR) to ensure a sustainable Malaysian Electricity Supply Industry (MESI), promote transparent and fair returns, encourage maximum efficiency and maintain a policy-driven end-user tariff. To cater for such a revolutionary transformation, a sophisticated system to generate policy-driven electricity tariff structures is greatly needed. Hence, this study presents a data analytics framework that generates an altered revenue function based on a varying power consumption distribution and tariff charge function. For the purpose of this study, the power consumption distribution is proxied by the proportion of household consumption and electricity consumed in kWh, and the tariff charge function is proxied by a three-tiered increasing block tariff (IBT). The altered revenue function is useful to indicate whether changes in the power consumption distribution and tariff charges will have a positive or negative impact on the economy. The methodology for this framework begins by defining revenue as a function of the power consumption distribution and the tariff charge function. Then, the proportion of household consumption and the tariff charge function are derived within certain intervals of electric power. Any changes in those proportions are conjectured to contribute towards changes in the revenue function. Thus, these changes can potentially indicate whether changes in the power consumption distribution and tariff charge function have a positive or negative impact on TNB revenue. Based on the findings of this study, major changes in the tariff charge function appear to affect the altered revenue function more than the power consumption distribution does. However, the paper concludes that both the power consumption distribution and the tariff charge function can influence TNB revenue to a great extent.
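
    The core of the framework, revenue as a function of a consumption distribution and a tiered tariff charge function, can be sketched in a few lines. The tier boundaries, rates, and household proportions below are illustrative assumptions, not TNB's actual schedule:

```python
def ibt_charge(kwh, tiers=((200, 0.218), (100, 0.334), (float("inf"), 0.516))):
    """Three-tiered increasing block tariff (IBT) charge for one household.

    `tiers` is a sequence of (block width in kWh, rate per kWh); the
    boundaries and rates here are illustrative, not an actual schedule.
    """
    charge, remaining = 0.0, kwh
    for width, rate in tiers:
        block = min(remaining, width)
        charge += block * rate
        remaining -= block
        if remaining <= 0:
            break
    return charge

def revenue(distribution):
    """Aggregate revenue from a power consumption distribution, given as
    a list of (proportion of households, mean kWh consumed)."""
    return sum(p * ibt_charge(kwh) for p, kwh in distribution)

base = [(0.5, 150), (0.3, 400), (0.2, 800)]
shifted = [(0.4, 150), (0.3, 400), (0.3, 800)]  # more heavy users
print(revenue(base), revenue(shifted))
```

    Shifting probability mass toward heavier consumers raises revenue under an increasing block tariff, which is the kind of altered-revenue comparison the framework automates.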

  1. A Unified Framework for Verification and Complexity Analysis of Real-Time and Distributed Systems

    National Research Council Canada - National Science Library

    Lynch, Nancy

    1997-01-01

    .... These examples arise from a diverse set of application areas, including connection management protocols, clock synchronization, fault-tolerant distributed consensus, group communication, and real...

  2. A Distributed Control Framework for Integrated Photovoltaic-Battery-Based Islanded Microgrids

    DEFF Research Database (Denmark)

    Golsorkhi, Mohammad; Shafiee, Qobad; Lu, Dylan Dah-Chuan

    2017-01-01

    This paper proposes a new cooperative control framework for coordination of energy storage units (ESUs), photovoltaic (PV) panels and controllable load units in single-phase low voltage microgrids (MGs). The control objectives are defined and acted upon using a two level structure; primary...

  3. Distributed Power System Virtual Inertia Implemented by Grid-Connected Power Converters

    DEFF Research Database (Denmark)

    Fang, Jingyang; Li, Hongchang; Tang, Yi

    2018-01-01

    Renewable energy sources (RESs), e.g. wind and solar photovoltaics, have been increasingly used to meet worldwide growing energy demands and reduce greenhouse gas emissions. However, RESs are normally coupled to the power grid through fast-response power converters without any inertia, leading to decreased power system inertia. As a result, the grid frequency may easily go beyond the acceptable range under severe frequency events, resulting in undesirable load-shedding, cascading failures, or even large-scale blackouts. To address the ever-decreasing inertia issue, this paper proposes the concept of distributed power system virtual inertia, which can be implemented by grid-connected power converters. Without modifications of system hardware, power system inertia can be emulated by the energy stored in the dc-link capacitors of grid-connected power converters. By regulating the dc-link voltages...
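
    The energy bookkeeping behind such virtual inertia can be illustrated with a hedged sketch: if the converter couples its dc-link voltage to grid frequency through a droop-style gain, a frequency dip releases part of the capacitor's stored energy into the grid. The capacitance, voltage, and gain values below are arbitrary, and the linear voltage-frequency coupling is a simplification rather than the paper's actual control law:

```python
def dclink_energy(c_dc, v_dc):
    """Energy stored in a dc-link capacitor (J): E = 0.5 * C * V^2."""
    return 0.5 * c_dc * v_dc**2

def virtual_inertia_response(c_dc, v0, f0, df, k):
    """Energy released when the dc-link voltage tracks grid frequency.

    The converter regulates v_dc = v0 * (1 + k * df / f0), so a frequency
    dip (df < 0) lowers v_dc and injects the capacitor's energy
    difference into the grid, emulating inertia. `k` is a hypothetical
    voltage-frequency coupling gain.
    """
    v = v0 * (1 + k * df / f0)
    return dclink_energy(c_dc, v0) - dclink_energy(c_dc, v)

# 5 mF dc link at 700 V, 50 Hz grid, 0.5 Hz dip, gain k = 2
released = virtual_inertia_response(5e-3, 700.0, 50.0, -0.5, 2.0)
print(f"{released:.1f} J injected")
```

    For these example numbers the dip releases roughly 48.5 J; scaling the capacitance or the gain scales the emulated inertia, which is why no hardware modification is needed.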

  4. Implementation of Parallel Dynamic Simulation on Shared-Memory vs. Distributed-Memory Environments

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Shuangshuang; Chen, Yousu; Wu, Di; Diao, Ruisheng; Huang, Zhenyu

    2015-12-09

    Power system dynamic simulation computes the system response to a sequence of large disturbances, such as sudden changes in generation or load, or a network short circuit followed by a protective branch-switching operation. It consists of a large set of differential and algebraic equations, which is computationally intensive and challenging to solve with a single-processor dynamic simulation solution. High-performance computing (HPC) based parallel computing is a promising technology for speeding up the computation and facilitating the simulation process. This paper presents two parallel implementations of power grid dynamic simulation: Open Multi-Processing (OpenMP) on a shared-memory platform, and the Message Passing Interface (MPI) on distributed-memory clusters. The differences between the parallel simulation algorithms and architectures of the two HPC technologies are illustrated, and their performance for running parallel dynamic simulation is compared and demonstrated.
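
    The paper's implementations use OpenMP and MPI in compiled code; as a language-neutral illustration of the two programming models, the sketch below contrasts shared-state workers (all workers update one structure under a lock, as on a shared-memory platform) with message-passing ranks (each rank owns its partition and sends its partial result explicitly, as on a distributed-memory cluster). Python threads stand in for OpenMP threads and MPI processes purely for illustration:

```python
import threading
import queue

# Shared-memory style (cf. OpenMP): workers update one shared total in
# place, coordinating with a lock; no data is copied between them.
def shared_memory_sum(values, n_workers=4):
    total = [0.0]
    lock = threading.Lock()

    def worker(chunk):
        s = sum(chunk)       # local work
        with lock:           # synchronised update of shared state
            total[0] += s

    chunks = [values[i::n_workers] for i in range(n_workers)]
    threads = [threading.Thread(target=worker, args=(c,)) for c in chunks]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return total[0]

# Distributed-memory style (cf. MPI): each "rank" owns its partition and
# communicates its partial result explicitly over a channel.
def message_passing_sum(values, n_ranks=4):
    inbox = queue.Queue()

    def rank(chunk):
        inbox.put(sum(chunk))  # explicit send of the partial sum

    chunks = [values[i::n_ranks] for i in range(n_ranks)]
    threads = [threading.Thread(target=rank, args=(c,)) for c in chunks]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sum(inbox.get() for _ in range(n_ranks))  # "reduce" at the root

data = list(range(1000))
assert shared_memory_sum(data) == message_passing_sum(data) == sum(data)
```

    The trade-off the paper measures follows from these models: shared memory avoids copying but requires synchronisation, while message passing scales across nodes at the cost of explicit communication.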

  5. A multistage framework for reliability-based distribution expansion planning considering distributed generations by a self-adaptive global-based harmony search algorithm

    International Nuclear Information System (INIS)

    Shivaie, Mojtaba; Ameli, Mohammad T.; Sepasian, Mohammad S.; Weinsier, Philip D.; Vahidinasab, Vahid

    2015-01-01

    In this paper, the authors present a new multistage framework for reliability-based Distribution Expansion Planning (DEP) in which the expansion options are the reinforcement and/or installation of substations, feeders, and Distributed Generations (DGs). The proposed framework takes into account not only costs associated with investment, maintenance, and operation, but also expected customer interruption cost in the optimization as four problem objectives. At the same time, operational restrictions, Kirchhoff's laws, the radial structure limitation, voltage limits, and the capital expenditure budget restriction are considered as problem constraints. The proposed model is a non-convex optimization problem having a non-linear, mixed-integer nature. Hence, a hybrid Self-adaptive Global-based Harmony Search Algorithm (SGHSA) and Optimal Power Flow (OPF) were used, followed by a fuzzy satisfying method, in order to obtain the final optimal solution. The SGHSA is a recently developed optimization algorithm which imitates the music improvisation process, in which the harmonists improvise their instrument pitches, searching for the perfect state of harmony. The planning methodology was applied to the 27-node, 13.8-kV test system in order to demonstrate the feasibility and capability of the proposed model. Simulation results illustrated the sufficiency and profitability of the newly developed framework when compared with other methods. - Highlights: • A new multistage framework is presented for the reliability-based DEP problem. • In this paper, DGs are considered as an expansion option to increase the flexibility of the proposed model. • In this paper, effective factors of the DEP problem are incorporated as a multi-objective model. • In this paper, three new algorithms HSA, IHSA and SGHSA are proposed. • Results obtained by the proposed SGHSA algorithm are better than those of the others.
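
    A minimal, non-self-adaptive harmony search conveys the improvisation metaphor the abstract describes. SGHSA additionally self-adapts the parameters hmcr, par and bw, which are fixed here for brevity, and the quadratic toy objective stands in for the actual multi-objective DEP cost:

```python
import random

def harmony_search(cost, bounds, hms=10, hmcr=0.9, par=0.3, bw=0.1,
                   iters=2000, seed=1):
    """Minimal harmony search sketch (not the paper's SGHSA).

    `cost` maps a candidate vector to a scalar; `bounds` gives
    (low, high) per decision variable. Each iteration improvises a new
    harmony and replaces the worst member of the harmony memory if the
    new one is better.
    """
    rng = random.Random(seed)
    hms_memory = [[rng.uniform(lo, hi) for lo, hi in bounds]
                  for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:              # memory consideration
                x = rng.choice(hms_memory)[d]
                if rng.random() < par:           # pitch adjustment
                    x = min(hi, max(lo, x + rng.uniform(-bw, bw)))
            else:                                # random improvisation
                x = rng.uniform(lo, hi)
            new.append(x)
        worst = max(range(hms), key=lambda i: cost(hms_memory[i]))
        if cost(new) < cost(hms_memory[worst]):
            hms_memory[worst] = new
    return min(hms_memory, key=cost)

# Toy stand-in for the planning objective: a quadratic bowl at (3, 3, 3).
best = harmony_search(lambda v: sum((x - 3) ** 2 for x in v), [(-10, 10)] * 3)
```

    In the paper, each improvised harmony would encode expansion decisions, with an OPF evaluating its cost and constraint violations instead of this closed-form objective.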

  6. Direction dependence analysis: A framework to test the direction of effects in linear models with an implementation in SPSS.

    Science.gov (United States)

    Wiedermann, Wolfgang; Li, Xintong

    2018-04-16

    In nonexperimental data, at least three possible explanations exist for the association of two variables x and y: (1) x is the cause of y, (2) y is the cause of x, or (3) an unmeasured confounder is present. Statistical tests that identify which of the three explanatory models fits best would be a useful adjunct to the use of theory alone. The present article introduces one such statistical method, direction dependence analysis (DDA), which assesses the relative plausibility of the three explanatory models on the basis of higher-moment information about the variables (i.e., skewness and kurtosis). DDA involves the evaluation of three properties of the data: (1) the observed distributions of the variables, (2) the residual distributions of the competing models, and (3) the independence properties of the predictors and residuals of the competing models. When the observed variables are nonnormally distributed, we show that DDA components can be used to uniquely identify each explanatory model. Statistical inference methods for model selection are presented, and macros to implement DDA in SPSS are provided. An empirical example is given to illustrate the approach. Conceptual and empirical considerations are discussed for best-practice applications in psychological data, and sample size recommendations based on previous simulation studies are provided.
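
    One of the three DDA components, comparing the observed variable distributions, can be sketched directly: when x truly causes y and x is non-normal, the outcome mixes the skewed cause with symmetric noise, so its skewness is attenuated relative to the cause. This toy check illustrates only that fragment of DDA, not the SPSS macros the article provides:

```python
import random
import statistics

def skewness(v):
    """Sample skewness: third central moment over cubed population SD."""
    m = statistics.fmean(v)
    s = statistics.pstdev(v)
    return sum((x - m) ** 3 for x in v) / (len(v) * s ** 3)

def dda_variable_check(x, y):
    """Returns 'x->y' if x is the more skewed variable, else 'y->x'.

    Under a true linear model x -> y with normal errors, y inherits only
    an attenuated share of x's skewness, so |skew(y)| < |skew(x)|.
    """
    return "x->y" if abs(skewness(x)) > abs(skewness(y)) else "y->x"

rng = random.Random(0)
x = [rng.expovariate(1.0) for _ in range(5000)]      # skewed cause
y = [0.8 * xi + rng.gauss(0, 1) for xi in x]         # outcome
print(dda_variable_check(x, y))
```

    The full method also examines residual distributions and predictor-residual independence before preferring one explanatory model over the others.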

  7. Web Services Implementations at Land Process and Goddard Earth Sciences Distributed Active Archive Centers

    Science.gov (United States)

    Cole, M.; Bambacus, M.; Lynnes, C.; Sauer, B.; Falke, S.; Yang, W.

    2007-12-01

    NASA's vast array of scientific data within its Distributed Active Archive Centers (DAACs) is especially valuable both to traditional research scientists and to the emerging market of Earth Science Information Partners. For example, the air quality science and management communities are increasingly using satellite-derived observations in their analyses and decision making. The Air Quality Cluster in the Federation of Earth Science Information Partners (ESIP) uses web infrastructures of interoperability, or Service Oriented Architecture (SOA), to extend data exploration, use, and analysis, and provides a user environment for DAAC products. In an effort to continually offer these NASA data to the broadest research community audience, reusing emerging technologies, both NASA's Goddard Earth Science (GES) and Land Process (LP) DAACs have engaged in a web services pilot project. Through these projects, both GES and LP have exposed data through the Open Geospatial Consortium's (OGC) Web Services standards. Reusing several different existing applications and implementation techniques, GES and LP successfully exposed a variety of data through distributed systems to be ingested into multiple end-user systems. The results of this project will enable researchers worldwide to access some of NASA's GES and LP DAAC data through OGC protocols. This functionality encourages interdisciplinary research while increasing data use through advanced technologies. This paper concentrates on the implementation and use of OGC Web Services, specifically Web Map and Web Coverage Services (WMS, WCS), at the GES and LP DAACs, and the value of these services within scientific applications, including integration with the DataFed air quality web infrastructure and the development of data analysis web applications.
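
    A WMS GetMap request of the kind these services answer is just a parameterized URL. The endpoint and layer name below are placeholders, not the DAACs' actual service addresses; note that WMS 1.3.0 with EPSG:4326 uses latitude-first axis order in the BBOX:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width=512, height=512,
                   fmt="image/png", crs="EPSG:4326"):
    """Builds an OGC WMS 1.3.0 GetMap request URL.

    `bbox` is (min_lat, min_lon, max_lat, max_lon) for EPSG:4326 under
    WMS 1.3.0 axis-order rules.
    """
    params = {
        "SERVICE": "WMS", "VERSION": "1.3.0", "REQUEST": "GetMap",
        "LAYERS": layer, "CRS": crs,
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width, "HEIGHT": height, "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer name, for illustration only.
url = wms_getmap_url("https://example.gov/wms", "aerosol_optical_depth",
                     (-90, -180, 90, 180))
```

    A WCS GetCoverage request follows the same pattern with `REQUEST=GetCoverage` and coverage-specific parameters, which is what lets clients such as DataFed ingest the underlying data rather than rendered maps.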

  8. How do small groups make decisions? : A theoretical framework to inform the implementation and study of clinical competency committees.

    Science.gov (United States)

    Chahine, Saad; Cristancho, Sayra; Padgett, Jessica; Lingard, Lorelei

    2017-06-01

    In the competency-based medical education (CBME) approach, clinical competency committees are responsible for making decisions about trainees' competence. However, we currently lack a theoretical model for group decision-making to inform this emerging assessment phenomenon. This paper proposes an organizing framework to study and guide the decision-making processes of clinical competency committees. This is an explanatory, non-exhaustive review, tailored to identify relevant theoretical and evidence-based papers related to small group decision-making. The search was conducted using Google Scholar, Web of Science, MEDLINE, ERIC, and PsycINFO for relevant literature. Using a thematic analysis, two researchers (SC & JP) met four times between April and June 2016 to consolidate the literature included in this review. Three theoretical orientations towards group decision-making emerged from the review: schema, constructivist, and social influence. Schema orientations focus on how groups use algorithms for decision-making. Constructivist orientations focus on how groups construct their shared understanding. Social influence orientations focus on how individual members influence the group's perspective on a decision. Moderators of decision-making relevant to all orientations include: guidelines, stressors, authority, and leadership. Clinical competency committees are the mechanisms by which groups of clinicians will be in charge of interpreting multiple assessment data points and coming to a shared decision about trainee competence. The way in which these committees make decisions can have huge implications for trainee progression and, ultimately, patient care. Therefore, there is a pressing need to build the science of how such group decision-making works in practice. This synthesis suggests a preliminary organizing framework that can be used in the implementation and study of clinical competency committees.

  9. Using the Consolidated Framework for Implementation Research to Identify Barriers and Facilitators for the Implementation of an Internet-Based Patient-Provider Communication Service in Five Settings: A Qualitative Study.

    Science.gov (United States)

    Varsi, Cecilie; Ekstedt, Mirjam; Gammon, Deede; Ruland, Cornelia M

    2015-11-18

    Although there is growing evidence of the positive effects of Internet-based patient-provider communication (IPPC) services for both patients and health care providers, their implementation into clinical practice continues to be a challenge. The 3 aims of this study were to (1) identify and compare barriers and facilitators influencing the implementation of an IPPC service in 5 hospital units using the Consolidated Framework for Implementation Research (CFIR), (2) assess the ability of the different constructs of CFIR to distinguish between high and low implementation success, and (3) compare our findings with those from other studies that used the CFIR to discriminate between high and low implementation success. This study was based on individual interviews with 10 nurses, 6 physicians, and 1 nutritionist who had used the IPPC to answer messages from patients. Of the 36 CFIR constructs, 28 were addressed in the interviews, of which 12 distinguished between high and low implementation units. Most of the distinguishing constructs were related to the inner setting domain of CFIR, indicating that institutional factors were particularly important for successful implementation. Health care providers' beliefs in the intervention as useful for themselves and their patients, as well as the implementation process itself, were also important. A comparison of constructs across ours and 2 other studies that also used the CFIR to discriminate between high and low implementation success showed that 24 CFIR constructs distinguished between high and low implementation units in at least 1 study; 11 constructs distinguished in 2 studies. However, only 2 constructs (patient needs and resources, and available resources) distinguished consistently between high and low implementation units in all 3 studies. The CFIR is a helpful framework for illuminating barriers and facilitators influencing IPPC implementation. However, CFIR's strength of being broad and comprehensive also limits its

  10. Traffic and trend analysis of local- and wide-area networks for a distributed PACS implementation

    Science.gov (United States)

    Gac, Robert J., Jr.; Harding, Douglas, Jr.; Weiser, John C.; Chacko, Anna K.; Radvany, Martin; Romlein, John R.

    2000-05-01

    Inductive Modeling Techniques (IMT) in a stand-alone, distributed Picture Archiving and Communication System (PACS) or telemedicine environment can be utilized to monitor SNMP (Simple Network Management Protocol) enabled devices such as network switches, servers, or workstations. A comprehensive approach using IMT is presented across the stages of the PACS lifecycle: Pre-PACS, Implementation, and Clinical Use. At each stage of the cycle, the results of IMT can be utilized to assist in assessing and forecasting future system loading. This loading represents a clinical trend analysis equating to the clinical workflow and delivery of services. Specific attention is directed to an understanding and thorough depiction of IMT methodology, focusing on the use of SNMP, the Management Information Base (MIB), and the data stream output that is mapped and placed in an object-oriented database and made available for web-based, real-time, in-depth viewing and/or analysis. A thorough description of these outputs is presented, spotlighting potential report applications such as system failures; existing system, CPU, workstation, server and LAN/WAN link utilization; packet rates; application isolation; notification of system alarms; fault isolation; high/low bandwidth users; and data transfer rates. These types of data are increasingly required for programming LAN/WAN upgrades as digital imaging and PACS are implemented.

  11. Recognition of simple visual images using a sparse distributed memory: Some implementations and experiments

    Science.gov (United States)

    Jaeckel, Louis A.

    1990-01-01

    Previously, a method was described for representing a class of simple visual images so that they could be used with a Sparse Distributed Memory (SDM). Herein, two possible implementations of an SDM are described, for which these images, suitably encoded, will serve both as addresses to the memory and as data to be stored in the memory. A key feature of both implementations is that a pattern that is represented as an unordered set with a variable number of members can be used as an address to the memory. In the 1st model, an image is encoded as a 9072-bit string to be used as a read or write address; the bit string may also be used as data to be stored in the memory. Another representation, in which an image is encoded as a 256-bit string, may be used with either model as data to be stored in the memory, but not as an address. In the 2nd model, an image is not represented as a vector of fixed length to be used as an address. Instead, a rule is given for determining which memory locations are to be activated in response to an encoded image. This activation rule treats the pieces of an image as an unordered set. With this model, the memory can be simulated, based on a method of computing the approximate result of a read operation.
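
    The activation-and-counter mechanics common to both models can be sketched as a toy Kanerva-style SDM. The sizes below are far smaller than the paper's 9072- and 256-bit encodings, and the fixed Hamming-radius activation rule is the conventional one rather than either of the paper's image-specific rules:

```python
import random

class SparseDistributedMemory:
    """Minimal Kanerva-style SDM: fixed random hard locations,
    Hamming-radius activation, and counter-based storage."""

    def __init__(self, n_locations=200, dim=64, radius=28, seed=0):
        rng = random.Random(seed)
        self.dim, self.radius = dim, radius
        self.addresses = [[rng.randint(0, 1) for _ in range(dim)]
                          for _ in range(n_locations)]
        self.counters = [[0] * dim for _ in range(n_locations)]

    def _active(self, addr):
        """Hard locations within the Hamming radius of `addr`."""
        return [i for i, loc in enumerate(self.addresses)
                if sum(a != b for a, b in zip(addr, loc)) <= self.radius]

    def write(self, addr, data):
        for i in self._active(addr):
            for j, bit in enumerate(data):
                self.counters[i][j] += 1 if bit else -1

    def read(self, addr):
        sums = [0] * self.dim
        for i in self._active(addr):
            for j in range(self.dim):
                sums[j] += self.counters[i][j]
        return [1 if s > 0 else 0 for s in sums]

rng = random.Random(1)
pattern = [rng.randint(0, 1) for _ in range(64)]
sdm = SparseDistributedMemory()
sdm.write(pattern, pattern)      # autoassociative store
recalled = sdm.read(pattern)
```

    Because many locations near the address absorb each write, a read from a slightly corrupted address still pools enough consistent counters to reconstruct the stored pattern, which is the property the paper exploits for recognizing noisy images.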

  12. Using the Quadruple Aim Framework to Measure Impact of Health Technology Implementation: A Case Study of eConsult.

    Science.gov (United States)

    Liddy, Clare; Keely, Erin

    2018-01-01

    Health technology solutions are too often implemented without a true understanding of the system-level problem they seek to address, resulting in excessive costs, poor adoption, ineffectiveness, and ultimately failure. Before implementing or adopting health care innovations, stakeholders should complete a thorough assessment to ensure effectiveness and value. In this article, we describe how to evaluate the impact of a health technology innovation through the 4 dimensions of care outlined by the Quadruple Aim Framework, using our experience with the Champlain Building Access to Specialists through eConsultation (BASE) eConsult service as a case example. A descriptive overview of data was collected between April 1, 2011, and August 31, 2017, using 4 dimensions of care outlined by the Quadruple Aim Framework: patient experience, provider experience, costs, and population health. Findings were drawn from use data, primary care provider closeout surveys, surveys/interviews with patients and providers, and costing data. Overall, patients have received access to specialist advice within days and find the advice useful in 86% of cases. Provider experience is very positive, with satisfaction ratings of high/very high value in 94% of cases. The service cost a weighted average of $47.35/case, compared with $133.60/case for traditional referrals. In total, 1,299 primary care providers have enrolled in the service, completing 28,838 cases since 2011. Monthly case volumes have grown from an average of 13 cases/month in 2011 to 969 cases/month in 2016. The eConsult service has been widely adopted in our region and is currently expanding to new jurisdictions across Canada. However, although we successfully demonstrated eConsult's impact on patient experience, provider satisfaction, and reducing costs, we met several challenges in evaluating its impact on population health. More work is needed to evaluate eConsult's impact on key population health metrics (eg, mortality, morbidity

  13. Overcoming the Challenges of Implementing a Multi-Mission Distributed Workflow System

    Science.gov (United States)

    Sayfi, Elias; Cheng, Cecilia; Lee, Hyun; Patel, Rajesh; Takagi, Atsuya; Yu, Dan

    2009-01-01

    A multi-mission approach to solving the same problems for various projects is enticing. However, the multi-mission approach leads to the need to develop a configurable, adaptable and distributed system to meet unique project requirements. That, in turn, leads to a set of challenges varying from handling synchronization issues to coming up with a smart design that allows the "unknowns" to be decided later. This paper discusses the challenges that the Multi-mission Automated Task Invocation Subsystem (MATIS) team has come up against while designing the distributed workflow system, and elaborates on the solutions that were implemented. The first challenge is to design an easily adaptable system that requires no code changes as a result of configuration changes. The number of formal deliveries is often limited because each delivery costs time and money. Changes such as the sequence of programs being called, or a change of a parameter value in the program that is being automated, should not result in code changes or redelivery.
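
    The "no code changes for configuration changes" goal can be sketched as a config-interpreting executor: the task sequence and parameter values live in data, so reordering steps or editing a value needs no redelivery of code. The task names and registry below are hypothetical illustrations, not MATIS interfaces:

```python
import json

# Hypothetical workflow configuration: the program sequence and
# parameters live here, not in the executor code.
CONFIG = json.loads("""
{
  "workflow": [
    {"task": "fetch",   "params": {"source": "telemetry"}},
    {"task": "process", "params": {"gain": 2}},
    {"task": "publish", "params": {"target": "archive"}}
  ]
}
""")

# Registry mapping task names to callables; each records its effect.
REGISTRY = {
    "fetch":   lambda state, source: state + [f"fetched:{source}"],
    "process": lambda state, gain: state + [f"processed:x{gain}"],
    "publish": lambda state, target: state + [f"published:{target}"],
}

def run(config, registry):
    """Interprets the configured sequence; reordering steps or editing a
    parameter value changes behaviour without touching this code."""
    state = []
    for step in config["workflow"]:
        state = registry[step["task"]](state, **step["params"])
    return state

log = run(CONFIG, REGISTRY)
```

    Deferring the "unknowns" then amounts to leaving them as configuration entries to be filled in later, rather than hard-coding them into a delivered build.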

  14. A framework for crafting and implementing a congregational strategy in the local congregations of the Reformed Churches of South Africa

    Directory of Open Access Journals (Sweden)

    Aldeon B. Grobler

    2012-12-01

    The church is not like any other institution or organisation in society. Although the church is primarily invisible and spiritual, it is a visible organisation in the world, and it spans across borders of nations, languages and countries. John Calvin strongly rejected the notion that the church is only a spiritual organisation whose visible administrative side may be downplayed. The fellowship of the church must not be seen only as a mystical relation with Jesus Christ. During 2010, an empirical study was done on the extent to which congregations of the Reformed Churches of South Africa (RCSA) adhere to the request to have a well-designed congregational strategy. The knowledge gained from a literature study of the science of Strategic Management and the results of the empirical study were combined into a framework for crafting and executing a congregational strategy. This framework can be used by congregational leaders to guide them through their own process of crafting and executing their unique congregational strategy. The research concluded with a recommendation that the Theological School of the RCSA should consider including a course on Strategic Management in the training syllabus of aspiring ministers. Considering that Strategic Management is a specialised management science, and external Strategic Management consultants tend to be expensive, the research also recommended that the Administrative Bureau of the RCSA consider employing its own Strategic Management consultant for the RCSA, with the specific assignment of assisting and guiding all congregations with their congregational strategy. A framework for crafting and implementing a congregational strategy in the local congregations of the Reformed Churches in South Africa: a church is a unique organisation. The church is primarily invisible and spiritual in nature. Yet it functions as an institution in the world and must be managed efficiently and effectively.

  15. An HDF5-based framework for the distribution and analysis of ultrasonic concrete data

    Science.gov (United States)

    Prince, Luke; Clayton, Dwight; Santos-Villalobos, Hector

    2017-02-01

    There are many commercial ultrasonic tomography devices (UTDs) available for use in nondestructive evaluation (NDE) of reinforced concrete structures. These devices emit, measure, and store ultrasonic signals, typically in the 25 kHz to 5 MHz frequency range. UTDs are characterized by a composition of multiple transducers, also known as a transducer array or phased array. Often, UTD data are in a proprietary format. Consequently, NDE research data is limited to those who have prior non-disclosure agreements or the appropriate licenses. Thus, there is a need for a universal data framework such that proprietary file datasets for different concrete specimens can be converted, organized, and stored with relevant metadata for individual or collaborative NDE research. Building upon the Hierarchical Data Format (HDF5) model, we have developed a UTD data management framework and Graphical User Interface (GUI) to promote the algorithmic reconstruction of ultrasonic data in a controlled environment for easily reproducible and publishable results.
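
    With h5py, the layout of such a framework might look like the hedged sketch below: one HDF5 group per specimen, one dataset per scan, and acquisition metadata stored as attributes. The group, dataset, and attribute names are illustrative assumptions, not the framework's actual schema:

```python
import io

import h5py
import numpy as np

# Build a small HDF5 file in memory; a real converter would write the
# decoded proprietary UTD data to disk instead.
buf = io.BytesIO()
with h5py.File(buf, "w") as f:
    # Hierarchy: /specimens/<specimen>/<scan>, with metadata as attrs.
    scan = f.create_group("specimens/slab_A").create_dataset(
        "scan_001", data=np.zeros((8, 1024), dtype="f4"))  # 8 channels
    scan.attrs["center_frequency_hz"] = 50e3
    scan.attrs["transducer_array"] = "4x2 phased array"
    scan.attrs["device"] = "anonymised UTD"

with h5py.File(buf, "r") as f:
    ds = f["specimens/slab_A/scan_001"]
    print(ds.shape, ds.attrs["center_frequency_hz"])
```

    Keeping the waveform array and its acquisition metadata in one self-describing file is what lets reconstruction algorithms run on data from different devices without the proprietary readers.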

  16. INCOME DISTRIBUTIONAL IMPACTS OF TRADE POLICIES IN A MULTI-MARKET FRAMEWORK: A CASE IN PAKISTAN

    OpenAIRE

    Hudson, Darren; Ethridge, Don E.

    2000-01-01

    The impacts of using export taxes as a price control in a multi-market framework are explored using the cotton and yarn sectors in Pakistan as examples. Results show that the export tax on cotton increased domestic consumption and decreased exports of cotton in Pakistan, transferring income from cotton producers to yarn spinners and the government. There was a social loss to Pakistan in the cotton sector. The export tax on cotton increased domestic yarn production, consumption, exports, and i...
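
    The single-market mechanism underlying the result can be sketched with stylised linear supply and demand: an export tax drives the domestic price of the exportable below the world price, raising domestic consumption and cutting exports. All coefficients below are arbitrary; the paper's multi-market model additionally links the cotton and yarn sectors:

```python
def cotton_market(world_price, export_tax, a=100, b=2, c=10, d=3):
    """Stylised single-market illustration of an export tax.

    Demand is a - b*p and supply is c + d*p with arbitrary coefficients.
    The tax t pushes the domestic price of the exportable to
    world_price - t, so domestic use rises and exports fall.
    """
    p = world_price - export_tax          # domestic price
    demand = a - b * p                    # domestic consumption
    supply = c + d * p
    exports = max(0.0, supply - demand)
    tax_revenue = export_tax * exports
    return {"price": p, "demand": demand, "exports": exports,
            "tax_revenue": tax_revenue}

free = cotton_market(30, 0)
taxed = cotton_market(30, 5)
```

    The gap between producer losses and the gains to consumers (here, yarn spinners) plus tax revenue is the social loss the paper identifies in the cotton sector.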

  17. Lessons from the implementation of LLIN distribution campaign in Ilorin Kwara State, Nigeria.

    Science.gov (United States)

    Obembe, Abiodun; Anyaele, Okorie Okogbue; Oduola, Adedayo Olatunbosun

    2014-05-28

    Studies implemented to evaluate the success of Long-lasting insecticidal nets (LLIN) distribution campaigns are often limited to ownership and utilization rates, neglecting other factors that directly affect the efficacy of the tool in malaria control. This study investigates sleeping habits and net maintenance behaviour in addition to LLIN ownership, utilization and the challenges associated with LLIN use among residents in Ilorin City where the tool has been massively distributed. A cross-sectional survey was conducted using a pre-tested interviewer-administered questionnaire to obtain information from randomly selected household respondents in Ilorin, the Kwara State Capital. The study was conducted in July 2012, about sixteen months after the March 2011 distribution of LLIN in the locality. The results were analyzed using the EPI INFO 2007 version. LLIN ownership (85%) and utilization (37%) rates improved compared to earlier reports, though 29% of net users have noticed holes in the nets even as 26% claimed to have actually experienced mosquito bites under it. Most (92%) of the respondents who slept under LLIN the previous night before the study spent the first five hours of the night (19.00-23.00 hr) outdoors while 88% also engage in inappropriate net washing practices. All the LLIN users claimed to have experienced at least one malaria episode while 43% have had two or more episodes within the past twelve months. The use of LLIN among the respondents in this study was accompanied by risky sleeping habits, inappropriate net maintenance practices and repeated experience of mosquito bites under the nets. This shows the need to sustain the will and confidence of LLIN users in this area through frequent monitoring and surveillance visits targeted at enlightening the people on habits that increase malaria exposure risks as well as proper use and maintenance of LLIN for maximum malaria vector control benefits.

  18. Barriers and enablers to implementing antenatal magnesium sulphate for fetal neuroprotection guidelines: a study using the theoretical domains framework.

    Science.gov (United States)

    Bain, Emily; Bubner, Tanya; Ashwood, Pat; Van Ryswyk, Emer; Simmonds, Lucy; Reid, Sally; Middleton, Philippa; Crowther, Caroline A

    2015-08-18

    Strong evidence supports administration of magnesium sulphate prior to birth at less than 30 weeks' gestation to prevent very preterm babies dying or developing cerebral palsy. This study was undertaken as part of The WISH (Working to Improve Survival and Health for babies born very preterm) Project, to assess health professionals' self-reported use of antenatal magnesium sulphate, and barriers and enablers to implementation of 2010 Australian and New Zealand clinical practice guidelines. Semi-structured, one-to-one interviews were conducted with obstetric and neonatal consultants and trainees, and midwives in 2011 (n = 24) and 2012-2013 (n = 21) at the Women's and Children's Hospital, South Australia. Transcribed interview data were coded using the Theoretical Domains Framework (describing 14 domains related to behaviour change) for analysis of barriers and enablers. In 2012-13, health professionals more often reported 'routinely' or 'sometimes' administering or advising their colleagues to administer magnesium sulphate for fetal neuroprotection (86% in 2012-13 vs. 46% in 2011). 'Knowledge and skills', 'memory, attention and decision processes', 'environmental context and resources', 'beliefs about consequences' and 'social influences' were key domains identified in the barrier and enabler analysis. Perceived barriers were the complex administration processes, time pressures, and the unpredictability of preterm birth. Enablers included education for staff and women at risk of very preterm birth, reminders and 'prompts', simplified processes for administration, and influential colleagues. This study has provided valuable data on barriers and enablers to implementing magnesium sulphate for fetal neuroprotection, with implications for designing and modifying future behaviour change strategies, to ensure optimal uptake of this neuroprotective therapy for very preterm infants.

  19. A QDWH-Based SVD Software Framework on Distributed-Memory Manycore Systems

    KAUST Repository

    Sukkari, Dalal; Ltaief, Hatem; Esposito, Aniello; Keyes, David E.

    2017-01-01

    ... the inherent high level of concurrency associated with Level 3 BLAS compute-bound kernels ultimately compensates for the arithmetic complexity overhead. Using the ScaLAPACK two-dimensional block cyclic data distribution with a rectangular processor topology...

  20. Research on the framework and key technologies of panoramic visualization for smart distribution network

    Science.gov (United States)

    Du, Jian; Sheng, Wanxing; Lin, Tao; Lv, Guangxian

    2018-05-01

    Nowadays, the smart distribution network has made tremendous progress, and the business visualization becomes even more significant and indispensable. Based on the summarization of traditional visualization technologies and demands of smart distribution network, a panoramic visualization application is proposed in this paper. The overall architecture, integrated architecture and service architecture of panoramic visualization application is firstly presented. Then, the architecture design and main functions of panoramic visualization system are elaborated in depth. In addition, the key technologies related to the application is discussed briefly. At last, two typical visualization scenarios in smart distribution network, which are risk warning and fault self-healing, proves that the panoramic visualization application is valuable for the operation and maintenance of the distribution network.