WorldWideScience

Sample records for framework implementing distributed

  1. On Design and Implementation of the Distributed Modular Audio Recognition Framework: Requirements and Specification Design Document

    OpenAIRE

    Mokhov, Serguei A.

    2009-01-01

    We present the requirements and design specification of the open-source Distributed Modular Audio Recognition Framework (DMARF), a distributed extension of MARF. The distributed version aggregates a number of distributed technologies (e.g. Java RMI, CORBA, Web Services) in a pluggable and modular model along with the provision of advanced distributed systems algorithms. We outline the associated challenges incurred during the design and implementation as well as overall specification of the p...
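
    The pluggable aggregation of several communication technologies can be pictured as a small registry that selects a transport backend at run time. The sketch below only illustrates that idea in Python (DMARF itself is Java-based, built on RMI, CORBA and Web Services); the backend names and the recognize call are invented for illustration.

    ```python
    # Illustrative sketch (not DMARF code): a minimal pluggable-backend registry
    # in the spirit of DMARF's modular choice among transport technologies.
    from abc import ABC, abstractmethod


    class RecognitionBackend(ABC):
        """Common interface every communication backend must implement."""

        @abstractmethod
        def recognize(self, audio_sample: bytes) -> str:
            ...


    class LocalBackend(RecognitionBackend):
        def recognize(self, audio_sample: bytes) -> str:
            # Placeholder for an in-process recognition pipeline.
            return "speaker-unknown"


    class WebServiceBackend(RecognitionBackend):
        def __init__(self, endpoint: str):
            self.endpoint = endpoint  # hypothetical remote service URL

        def recognize(self, audio_sample: bytes) -> str:
            # A real implementation would send the sample to self.endpoint.
            raise NotImplementedError("remote transport not wired up in this sketch")


    BACKENDS = {"local": LocalBackend, "webservice": WebServiceBackend}


    def make_backend(name: str, **kwargs) -> RecognitionBackend:
        """Select a pluggable transport by name, mirroring the modular design idea."""
        return BACKENDS[name](**kwargs)


    if __name__ == "__main__":
        print(make_backend("local").recognize(b""))  # "speaker-unknown"
    ```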

  2. A novel optimal distribution system planning framework implementing distributed generation in a deregulated electricity market

    Energy Technology Data Exchange (ETDEWEB)

    Porkar, S. [Department of Electrical Engineering, Sharif University of Technology, Tehran (Iran); Groupe de Recherches en Electrotechnique et Electronique de Nancy, GREEN-UHP, Universite Henri Poincare de Nancy I, BP 239, 54506 Vandoeuvre les Nancy Cedex (France); Poure, P. [Laboratoire d' Instrumentation Electronique de Nancy, LIEN, EA 3440, Universite Henri Poincare de Nancy I, BP 239, 54506 Vandoeuvre les Nancy Cedex (France); Abbaspour-Tehrani-fard, A. [Department of Electrical Engineering, Sharif University of Technology, Tehran (Iran); Saadate, S. [Groupe de Recherches en Electrotechnique et Electronique de Nancy, GREEN-UHP, Universite Henri Poincare de Nancy I, BP 239, 54506 Vandoeuvre les Nancy Cedex (France)

    2010-07-15

    This paper introduces a new framework, including a mathematical model and a new software package interfacing two powerful tools (MATLAB and GAMS), for obtaining optimal distributed generation (DG) capacity sizing and siting investments, with the capability to simulate large distribution system planning problems. The proposed optimization model minimizes total system planning costs: DG investment, DG operation and maintenance, power purchased by the distribution companies (DISCOs) from transmission companies (TRANSCOs), and system power losses. The model provides not only the DG size and site but also the new market price. Three cases depending on system conditions and three scenarios depending on planning alternatives and electricity market structures have been considered. They validate the economic and electrical benefits of introducing DG by solving the distribution system planning problem and by improving the power quality of the distribution system. DG installation increases the feeders' lifetime by reducing their loading and allows the existing distribution system to serve further load growth without feeder upgrades. Moreover, by investing in DG, the DISCO can minimize its total planning cost and reduce its customers' bills. (author)
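
    The abstract names the cost terms the optimization minimizes; a schematic form of such an objective (the notation and the constraint list below are assumptions for illustration, not taken from the paper) is

    \[
      \min \; C_{\text{plan}} \;=\; C^{\text{inv}}_{\text{DG}} \;+\; C^{\text{O\&M}}_{\text{DG}} \;+\; C^{\text{purchase}}_{\text{DISCO} \leftarrow \text{TRANSCO}} \;+\; C_{\text{losses}},
    \]

    subject to power-flow, bus-voltage, and feeder-capacity constraints.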

  3. Implementation of Portion Approach in Distributed Firewall Application for Network Security Framework

    Directory of Open Access Journals (Sweden)

    Harleen Kaur

    2011-11-01

    Full Text Available The motivation of this research is the collaboration of firewalls, which can provide distributed points of security policy enforcement. The front-end entity interacts most with potential intruders, so separating it from the back-end entity is necessary to create a securely protected domain. Collaborative security entities carry out various tasks in the organization and a defined security policy must be applied to them; entities like the DPFF must themselves be protected from outsiders. Firewalls are typically used as the main layer of security in the network framework. This research presents the segment of the proposed framework in which the DPFF, built on the developed iptables firewall, forms the layers of defense protecting the framework's front end and back end, with dynamic security and policy updates controlling the framework's safeguards through the proposed portion-approach algorithm, which reduces traffic and improves the efficiency of detection and of the policy-update mechanism. The policy-update mechanism for the DPFF and the way it is employed are described. The complete framework constitutes a distributed firewall in which the administrator configures the policy rule set, either separately or from the administration nodes.

  4. FleCSPH - a parallel and distributed SPH implementation based on the FleCSI framework

    Energy Technology Data Exchange (ETDEWEB)

    2017-06-20

    FleCSPH is a multi-physics compact application that exercises FleCSI parallel data structures for tree-based particle methods. In particular, FleCSPH implements a smoothed-particle hydrodynamics (SPH) solver for the solution of Lagrangian problems in astrophysics and cosmology. FleCSPH includes support for gravitational forces using the fast multipole method (FMM).
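
    For context, the core operation of any SPH solver such as FleCSPH is a kernel-weighted summation over neighbouring particles, e.g. the standard density estimate (a textbook formula, not quoted from the FleCSPH documentation):

    \[
      \rho_i = \sum_j m_j \, W(\lVert \mathbf{r}_i - \mathbf{r}_j \rVert, h),
    \]

    where the m_j are particle masses, W is the smoothing kernel and h the smoothing length; the tree data structures and the FMM accelerate the neighbour search and the gravitational sum.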

  5. Modeling and Implementation of Cattle/Beef Supply Chain Traceability Using a Distributed RFID-Based Framework in China

    Science.gov (United States)

    Liang, Wanjie; Cao, Jing; Fan, Yan; Zhu, Kefeng; Dai, Qiwei

    2015-01-01

    In recent years, traceability systems have been developed as effective tools for improving the transparency of supply chains, thereby guaranteeing the quality and safety of food products. In this study, we proposed a cattle/beef supply chain traceability model and a traceability system based on radio frequency identification (RFID) technology and the EPCglobal network. First, the transformations of traceability units were defined and analyzed throughout the cattle/beef chain. Second, we described in detail the internal and external traceability information acquisition, transformation, and transmission processes throughout the beef supply chain, and explained a methodology for modeling traceability information using the electronic product code information service (EPCIS) framework. Then, the traceability system was implemented based on the Fosstrak and FreePastry software packages, and animal ear tag codes and electronic product codes (EPC) were employed to identify traceability units. Finally, a cattle/beef supply chain comprising a breeding business, a slaughter and processing business, a distribution business, and a sales outlet was used as a case study to evaluate the traceability system. The results demonstrated that the major advantages of the system are the effective sharing of information among businesses and the gapless traceability of the cattle/beef supply chain. PMID:26431340

  6. Design and Implement a MapReduce Framework for Executing Standalone Software Packages in Hadoop-based Distributed Environments

    Directory of Open Access Journals (Sweden)

    Chao-Chun Chen

    2013-12-01

    Full Text Available Hadoop MapReduce is a programming model for designing automatically scalable distributed computing applications. It provides developers with an effective environment for attaining automatic parallelization. However, most existing manufacturing systems are arduous and restrictive to migrate to a MapReduce private cloud, due to platform incompatibility and the tremendous complexity of system reconstruction. To increase the efficiency of manufacturing systems with minimum modification of existing systems, we design a framework in this work, called MC-Framework: Multi-uses-based Cloudizing-Application Framework. It provides a simple interface for users to fairly execute requested tasks that work with traditional standalone software packages in MapReduce-based private cloud environments. Moreover, this work focuses on multiuser workloads, where the default Hadoop scheduling scheme, i.e., FIFO, would increase delay. Hence, we also propose a new scheduling mechanism, called Job-Sharing Scheduling, to explore and fairly share the jobs among machines in the MapReduce-based private cloud. We then prototype an experimental virtual-metrology module of a manufacturing system as a case study to verify and analyze the proposed MC-Framework. The results of our experiments indicate that the proposed framework greatly improves time performance compared with the original package.
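
    The pattern of wrapping a standalone executable inside a MapReduce job can be illustrated with a Hadoop Streaming mapper that shells out to the legacy program. This is a minimal sketch of that general pattern under assumptions of our own (the virtual_metrology binary name and the one-task-per-line input format are hypothetical); it is not the MC-Framework code.

    ```python
    #!/usr/bin/env python3
    # mapper.py -- Hadoop Streaming mapper that wraps a standalone executable.
    # Each input line is treated as one task description for the legacy package.
    import subprocess
    import sys

    LEGACY_TOOL = "./virtual_metrology"  # hypothetical standalone package binary


    def main() -> None:
        for line in sys.stdin:
            task = line.strip()
            if not task:
                continue
            # Run the unmodified standalone package on this task and capture output.
            result = subprocess.run(
                [LEGACY_TOOL, task], capture_output=True, text=True, check=False
            )
            # Emit "task<TAB>output" so a reducer can group results per task.
            print(f"{task}\t{result.stdout.strip()}")


    if __name__ == "__main__":
        main()
    ```

    When such a mapper is submitted through Hadoop Streaming, the cluster parallelizes invocations of the unmodified package across nodes, which is, at a high level, the kind of interface the abstract describes.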

  7. Distributed Energy Implementation Options

    Energy Technology Data Exchange (ETDEWEB)

    Shah, Chandralata N [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-13

    This presentation covers the options for implementing distributed energy projects. It distinguishes between options available for distributed energy that is government owned versus privately owned, with a focus on the privately owned options including Energy Savings Performance Contract Energy Sales Agreements (ESPC ESAs). The presentation covers the new ESPC ESA Toolkit and other Federal Energy Management Program resources.

  8. A coordination-based framework for distributed constraint solving

    NARCIS (Netherlands)

    P. Zoeteweij (Peter)

    2002-01-01

    DICE (DIstributed Constraint Environment) is a framework for the construction of distributed constraint solvers from software components in a number of predefined categories. The framework is implemented using the Manifold coordination language, and coordinates the components of a

  9. Invertebrate distribution patterns and river typology for the implementation of the water framework directive in Martinique, French Lesser Antilles

    Directory of Open Access Journals (Sweden)

    Bernadet C.

    2013-03-01

    Full Text Available Over the past decade, Europe’s Water Framework Directive provided compelling reasons for developing tools for the biological assessment of freshwater ecosystem health in member States. Yet, the lack of published studies for Europe’s overseas regions reflects minimal knowledge of the distribution patterns of aquatic species in the Community’s outermost areas. Benthic invertebrates (84 taxa) and land-cover, physical habitat and water chemistry descriptors (26 variables) were recorded at fifty-one stations in Martinique, French Lesser Antilles. Canonical Correspondence Analysis and Ward’s algorithm were used to bring out patterns in community structure in relation to environmental conditions, and variation partitioning was used to specify the influence of geomorphology and anthropogenic disturbance on invertebrate communities. Species richness decreased from headwater to lowland streams, and species composition changed from northern to southern areas. The proportion of variation explained by geomorphological variables was globally higher than that explained by anthropogenic variables. Geomorphology and land cover played key roles in delineating ecological sub-regions for the freshwater biota. Despite this and the small surface area of Martinique (1080 km²), invertebrate communities showed a clear spatial turnover in composition and biological traits (e.g., insects, crustaceans and molluscs) in relation to natural conditions.

  10. Implementation of Parallelization Contract Mechanism Extension of Map Reduce Framework for the Efficient Execution Time over Geo-Distributed Dataset

    Directory of Open Access Journals (Sweden)

    Ms. Kirtimalini N.

    2014-12-01

    Full Text Available The world is surrounded by technology and the Internet, with extremely dynamic changes occurring day by day, and as a result quintillions of bytes of data are created. This data arrives at the scale of petabytes and zettabytes, which is known as Big Data. Examples of such data are climate information, trajectory information, transaction records, web site usage data, etc. Because this data is so abundant, it is not easy to process and requires more time to execute. Hadoop can reliably store and process petabytes, and it plays an important role in processing and handling big data. It includes MapReduce (an offline computing engine), HDFS (the Hadoop Distributed File System) and HBase (online data access). MapReduce works by dividing input files into chunks and processing these in a series of parallelizable steps; mapping and reducing constitute the essential phases of a MapReduce job. This framework provides a solution for large numbers of data nodes by providing a distributed environment. Moving all input data to a single datacenter before processing is expensive. Hence we concentrate on the geographical distribution of geo-distributed data for sequential execution of MapReduce jobs to optimize execution time. However, it is observed from various results that the mapping and reducing functions alone are not sufficient for all types of data processing. The fixed execution strategy of a MapReduce program is not optimal for many tasks because it knows nothing about the behavior of the functions. Thus, to overcome these issues, we enhance our proposed work with parallelization contracts. These contracts help to capture a reasonable amount of semantics for executing any type of task with reduced time consumption. The parallelization contracts include input and output contracts, which capture the constraints and functions of data execution. The main aim of this paper is to discuss the known MapReduce techniques available for geo-distributed data.

  11. DIRAC distributed secure framework

    Science.gov (United States)

    Casajus, A.; Graciani, R.; LHCb DIRAC Team

    2010-04-01

    DIRAC, the LHCb community Grid solution, provides access to a vast amount of computing and storage resources to a large number of users. In DIRAC users are organized in groups with different needs and permissions. In order to ensure that only allowed users can access the resources and to enforce that there are no abuses, security is mandatory. All DIRAC services and clients use secure connections that are authenticated using certificates and grid proxies. Once a client has been authenticated, authorization rules are applied to the requested action based on the presented credentials. These authorization rules and the list of users and groups are centrally managed in the DIRAC Configuration Service. Users submit jobs to DIRAC using their local credentials. From then on, DIRAC has to interact with different Grid services on behalf of this user. DIRAC has a proxy management service where users upload short-lived proxies to be used when DIRAC needs to act on behalf of them. Long duration proxies are uploaded by users to a MyProxy service, and DIRAC retrieves new short delegated proxies when necessary. This contribution discusses the details of the implementation of this security infrastructure in DIRAC.
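
    The authorization step described above (mapping presented credentials to a group and checking that group's permissions against the requested action) can be sketched as follows. The data structures and function names are illustrative only and are not the actual DIRAC API.

    ```python
    # Illustrative sketch of credential-to-group authorization (not DIRAC's API).

    # Central registry, conceptually what the DIRAC Configuration Service holds.
    USER_GROUPS = {
        "/DC=ch/CN=alice": "lhcb_user",
        "/DC=ch/CN=bob": "lhcb_prod",
    }
    GROUP_PROPERTIES = {
        "lhcb_user": {"JobSubmit", "JobMonitor"},
        "lhcb_prod": {"JobSubmit", "JobMonitor", "ProxyManagement"},
    }


    def authorize(certificate_dn: str, requested_action: str) -> bool:
        """Return True if the credential's group is allowed the requested action."""
        group = USER_GROUPS.get(certificate_dn)
        if group is None:
            return False  # unknown credential: reject
        return requested_action in GROUP_PROPERTIES.get(group, set())


    if __name__ == "__main__":
        print(authorize("/DC=ch/CN=alice", "ProxyManagement"))  # False
        print(authorize("/DC=ch/CN=bob", "ProxyManagement"))    # True
    ```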

  12. A conceptual framework for implementation fidelity

    Directory of Open Access Journals (Sweden)

    Booth Andrew

    2007-11-01

    Full Text Available Abstract Background Implementation fidelity refers to the degree to which an intervention or programme is delivered as intended. Only by understanding and measuring whether an intervention has been implemented with fidelity can researchers and practitioners gain a better understanding of how and why an intervention works, and the extent to which outcomes can be improved. Discussion The authors undertook a critical review of existing conceptualisations of implementation fidelity and developed a new conceptual framework for understanding and measuring the process. The resulting theoretical framework requires testing by empirical research. Summary Implementation fidelity is an important source of variation affecting the credibility and utility of research. The conceptual framework presented here offers a means for measuring this variable and understanding its place in the process of intervention implementation.

  13. National Authentication Framework Implementation Study

    Science.gov (United States)

    2009-12-01

    mapped to the user at the SP end. b. Criterion #2 In terms of criterion 2, for the Centralized Model, the use of a global identifier may require a...PPI to map currently-existing IDs with SPs to the new global identifier. The nature of the Federated Model supports this requirement organically...at 10 cm. Large-scale implementations include the Octopus Card in Hong Kong and the EZ-Link card in Singapore. While the cost of a contactless smart

  14. Implementing the water framework directive in Denmark

    DEFF Research Database (Denmark)

    Jacobsen, Brian H.; Anker, Helle Tegner; Baaner, Lasse

    2017-01-01

    One of the major challenges in the implementation of the Water Framework Directive (WFD) is how to address diffuse agricultural pollution of the aquatic environment. In Denmark the implementation of agricultural measures has been fraught with difficulty in the form of delays and legal proceedings...

  15. A systematic review of implementation frameworks of innovations in healthcare and resulting generic implementation framework.

    Science.gov (United States)

    Moullin, Joanna C; Sabater-Hernández, Daniel; Fernandez-Llimos, Fernando; Benrimoj, Shalom I

    2015-03-14

    Implementation science and knowledge translation have developed across multiple disciplines with the common aim of bringing innovations to practice. Numerous implementation frameworks, models, and theories have been developed to target a diverse array of innovations. As such, it is plausible that not all frameworks include the full range of concepts now thought to be involved in implementation. Users face the decision of selecting a single or combining multiple implementation frameworks. To aid this decision, the aim of this review was to assess the comprehensiveness of existing frameworks. A systematic search was undertaken in PubMed to identify implementation frameworks of innovations in healthcare published from 2004 to May 2013. Additionally, titles and abstracts from Implementation Science journal and references from identified papers were reviewed. The orientation, type, and presence of stages and domains, along with the degree of inclusion and depth of analysis of factors, strategies, and evaluations of implementation of included frameworks were analysed. Frameworks were assessed individually and grouped according to their targeted innovation. Frameworks for particular innovations had similar settings, end-users, and 'type' (descriptive, prescriptive, explanatory, or predictive). On the whole, frameworks were descriptive and explanatory more often than prescriptive and predictive. A small number of the reviewed frameworks covered an implementation concept(s) in detail, however, overall, there was limited degree and depth of analysis of implementation concepts. The core implementation concepts across the frameworks were collated to form a Generic Implementation Framework, which includes the process of implementation (often portrayed as a series of stages and/or steps), the innovation to be implemented, the context in which the implementation is to occur (divided into a range of domains), and influencing factors, strategies, and evaluations. The selection of

  16. Corporate compliance: framework and implementation.

    Science.gov (United States)

    Fowler, N

    1999-01-01

    The federal government has created numerous programs to combat fraud and abuse. The government now encourages healthcare facilities to have a corporate compliance program (CCP), a plan that reduces the chances that the facility will violate laws or regulations. A CCP is an organization-wide program comprised of a code of conduct and written policies, internal monitoring and auditing standards, employee training, feedback mechanisms and other features, all designed to prevent and detect violations of governmental laws, regulations and policies. It is a system or method ensuring that employees understand and will comply with laws that apply to what they do every day. Seven factors, based on federal sentencing guidelines, provide the framework for developing a CCP. First, a facility must establish rules that are reasonably capable of reducing criminal conduct. Second, high-level personnel must oversee the compliance effort. Third, a facility must use due care in delegating authority in the compliance initiative. Fourth, standards must be communicated effectively to employees, and fifth, a facility must take reasonable steps to achieve compliance. Sixth, standards must be enforced consistently across the organization and last, standards must be modified or changed for reported concerns, to ensure they are not repeated. PROMINA Health System, Inc. in Atlanta, Ga., designed a program to meet federal guidelines. It started with a self-assessment to define its areas or risk. Next, it created the internal structure and assigned organizational responsibility for running the CCP. PROMINA then developed standards of business and professional conduct, established vehicles of communication and trained employees on the standards. Finally, it continues to develop evidence of the program's effectiveness by monitoring and documenting its compliance activities.

  17. A Review of Telehealth Service Implementation Frameworks

    Directory of Open Access Journals (Sweden)

    Liezl Van Dyk

    2014-01-01

    Full Text Available Despite the potential of telehealth services to increase the quality and accessibility of healthcare, the success rate of such services has been disappointing. The purpose of this paper is to find and compare existing frameworks for the implementation of telehealth services that can contribute to the success rate of future endeavors. After a thorough discussion of these frameworks, this paper outlines the development methodologies in terms of theoretical background, methodology and validation. Finally, the common themes and formats are identified for consideration in future implementation. It was confirmed that a holistic implementation approach is needed, which includes technology, organizational structures, change management, economic feasibility, societal impacts, perceptions, user-friendliness, evaluation and evidence, legislation, policy and governance. Furthermore, there is some scope for scientifically rigorous framework development and validation approaches.

  18. Designing Directories in Distributed Systems: A Systematic Framework

    OpenAIRE

    Chandy, K. Mani; Schooler, Eve M.

    1996-01-01

    This paper proposes a framework for the systematic design of directory-based distributed applications. We evaluate a space of directory designs using our framework. We present a case study consisting of design, implementation and analysis of directories for a multicast application. Our framework is based on a model that extends the formal concept of process knowledge in distributed systems. This concept is used informally in phrases such as "process p knows when it is in state s that process ...

  19. Implementing a National Qualifications Framework in Lithuania

    Science.gov (United States)

    Tutlys, Vidmantas; Spudyte, Irma

    2011-01-01

    The design of the national qualifications framework (NQF) in Lithuania started in 2006. The NQF was officially approved by the government decree in May 2010. This article explores the influence of the processes of institutional change on the reform of the national system of qualifications in Lithuania through the implementation of the NQF, looking…

  20. THE LABVIEW RADE FRAMEWORK DISTRIBUTED ARCHITECTURE

    CERN Document Server

    Andreassen, O O; Raimondo, A; Rijllart, A; Shaipov, V; Sorokoletov, R

    2011-01-01

    For accelerator GUI applications there is a need for a rapid development environment to create expert tools or to prototype operator applications. Typically a variety of tools are used, such as Matlab or Excel, but their scope is limited, either because of their low flexibility or their limited integration into the accelerator infrastructure. In addition, having several tools obliges users to deal with different programming techniques and data structures. We have addressed these limitations by using LabVIEW, extending it with interfaces to C++ and Java. In this way it fulfils the requirements of ease of use, flexibility and connectivity, which make up what we refer to as the RADE framework. Recent application requirements could only be met by implementing a distributed architecture with multiple servers running multiple services. This brought the additional advantage of allowing redundant services, increasing availability and making updates transparent. We will present two applications requiring high a...

  1. A Framework of Semantic Information Representation in Distributed Environments

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    An information representation framework is designed in this paper to overcome the problem of semantic heterogeneity in distributed environments. Emphasis is placed on establishing an XML-oriented semantic data model and the mapping between XML data based on a global ontology semantic view. The framework is implemented as a Web Service, which enhances information processing efficiency and accuracy as well as semantic interoperability.

  2. Distributed Computing Framework for Synthetic Radar Application

    Science.gov (United States)

    Gurrola, Eric M.; Rosen, Paul A.; Aivazis, Michael

    2006-01-01

    We are developing an extensible software framework in response to Air Force and NASA needs for distributed computing facilities for a variety of radar applications. The objective of this work is to develop a Python-based software framework, that is, the framework elements of the middleware that allow developers to control processing flow on a grid in a distributed computing environment. Framework architectures to date allow developers to connect processing functions together as interchangeable objects, thereby allowing a data-flow graph to be devised for a specific problem to be solved. The Pyre framework, developed at the California Institute of Technology (Caltech), and now being used as the basis for next-generation radar processing at JPL, is a Python-based software framework. We have extended the Pyre framework to include new facilities to deploy processing components as services, including components that monitor and assess the state of the distributed network for eventual real-time control of grid resources.
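
    A data-flow graph of interchangeable processing components, as described above, can be sketched in plain Python. This is only an illustration of the pattern, not the Pyre API, and the radar-flavoured stage names are invented.

    ```python
    # Illustrative sketch of a tiny data-flow pipeline of interchangeable
    # components (the pattern described above, not the Pyre framework itself).
    from typing import Callable, Iterable


    class Component:
        """A processing stage that transforms its input and passes it on."""

        def __init__(self, name: str, func: Callable):
            self.name = name
            self.func = func

        def __call__(self, data):
            return self.func(data)


    def run_pipeline(stages: Iterable[Component], data):
        """Connect components in sequence, forming a simple data-flow graph."""
        for stage in stages:
            data = stage(data)
        return data


    if __name__ == "__main__":
        pipeline = [
            Component("range_compress", lambda d: [x * 2 for x in d]),
            Component("azimuth_filter", lambda d: [x + 1 for x in d]),
        ]
        print(run_pipeline(pipeline, [1, 2, 3]))  # [3, 5, 7]
    ```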

  3. Knowledge Framework Implementation with Multiple Architectures - 13090

    Energy Technology Data Exchange (ETDEWEB)

    Upadhyay, H.; Lagos, L.; Quintero, W.; Shoffner, P. [Applied Research Center, Florida International University, Miami, FL 33174 (United States); DeGregory, J. [Office of D and D and Facility Engineering, Environmental Management, Department of Energy (United States)

    2013-07-01

    Multiple kinds of knowledge management systems are operational in public and private enterprises, and in large and small organizations with a variety of business models, which makes the design, implementation and operation of integrated knowledge systems very difficult. In recent years there have been sweeping advances in the information technology area, leading to the development of sophisticated frameworks and architectures. These platforms need to be used for the development of integrated knowledge management systems which provide a common platform for sharing knowledge across the enterprise, thereby reducing operational inefficiencies and delivering cost savings. This paper discusses the knowledge framework and architecture that can be used for such system development and its application to the real-life needs of the nuclear industry. A case study of deactivation and decommissioning (D and D) is discussed with the Knowledge Management Information Tool platform and framework. D and D work is a high priority activity across the Department of Energy (DOE) complex. Subject matter specialists (SMS) associated with DOE sites, the Energy Facility Contractors Group (EFCOG) and the D and D community have gained extensive knowledge and experience over the years in the cleanup of the legacy waste from the Manhattan Project. To prevent the D and D knowledge and expertise from being lost over time as the workforce evolves and ages, DOE and the Applied Research Center (ARC) at Florida International University (FIU) proposed to capture and maintain this valuable information in a universally available and easily usable system. (authors)

  4. Arcade: A Web-Java Based Framework for Distributed Computing

    Science.gov (United States)

    Chen, Zhikai; Maly, Kurt; Mehrotra, Piyush; Zubair, Mohammad; Bushnell, Dennis M. (Technical Monitor)

    2000-01-01

    Distributed heterogeneous environments are being increasingly used to execute a variety of large size simulations and computational problems. We are developing Arcade, a web-based environment to design, execute, monitor, and control distributed applications. These targeted applications consist of independent heterogeneous modules which can be executed on a distributed heterogeneous environment. In this paper we describe the overall design of the system and discuss the prototype implementation of the core functionalities required to support such a framework.

  5. A Software Rejuvenation Framework for Distributed Computing

    Science.gov (United States)

    Chau, Savio

    2009-01-01

    A performability-oriented conceptual framework for software rejuvenation has been constructed as a means of increasing levels of reliability and performance in distributed stateful computing. As used here, performability-oriented signifies that the construction of the framework is guided by the concept of analyzing the ability of a given computing system to deliver services with gracefully degradable performance. The framework is especially intended to support applications that involve stateful replicas of server computers.

  6. Managing Risks in Distributed Software Projects: An Integrative Framework

    DEFF Research Database (Denmark)

    Persson, John Stouby; Mathiassen, Lars; Boeg, Jesper

    2009-01-01

    Software projects are increasingly geographically distributed with limited face-to-face interaction between participants. These projects face particular challenges that need careful managerial attention. While risk management has been adopted with success to address other challenges within software development, there are currently no frameworks available for managing risks related to geographical distribution. On this background, we systematically review the literature on geographically distributed software projects. Based on the review, we synthesize what we know about risks and risk resolution techniques into an integrative framework for managing risks in distributed contexts. Subsequent implementation of a Web-based tool helped us refine the framework based on empirical evaluation of its practical usefulness. We conclude by discussing implications for both research and practice.

  7. Making sense of implementation theories, models and frameworks

    National Research Council Canada - National Science Library

    Nilsen, Per

    2015-01-01

    .... The aim of this article is to propose a taxonomy that distinguishes between different categories of theories, models and frameworks in implementation science, to facilitate appropriate selection...

  8. Maintenance Management in Network Utilities Framework and Practical Implementation

    CERN Document Server

    Gómez Fernández, Juan F

    2012-01-01

    In order to satisfy the needs of their customers, network utilities require specially developed maintenance management capabilities. Maintenance management information systems are essential to ensure control, gain knowledge and improve decision making in companies dealing with network infrastructure, such as distribution of gas, water, electricity and telecommunications. Maintenance Management in Network Utilities studies the specific characteristics of maintenance management in this sector to offer a practical approach to defining and implementing the best management practices and suitable frameworks. Divided into three major sections, Maintenance Management in Network Utilities defines a series of stages which can be followed to manage maintenance frameworks properly. Different case studies provide detailed descriptions which illustrate the experience in real company situations. An introduction to the concepts is followed by main sections including: • A Literature Review: covering the basic concepts an...

  9. A framework for distributed manufacturing applications

    OpenAIRE

    Leitão, Paulo; Restivo, Francisco

    2000-01-01

    The new organisational structures used in worldwide manufacturing systems require the development of distributed applications which present solutions to their requirements. Research work in distributed manufacturing control leads to emergent paradigms, such as Holonic Manufacturing Systems (HMS) and Bionic Manufacturing Systems (BMS), which translate concepts from social organisations and biological systems to the manufacturing world. This paper presents a Framework for the deve...

  10. Development of a South African cybersecurity policy implementation framework

    CSIR Research Space (South Africa)

    Jansen van Vuuren, JC

    2013-03-01

    Full Text Available of the policy is still in its very early stages. In this paper, the authors propose and describe a possible cybersecurity implementation framework for South Africa. This implementation framework is based on previous analysis of structures in other countries, a...

  11. Assessing citation networks for dissemination and implementation research frameworks.

    Science.gov (United States)

    Skolarus, Ted A; Lehmann, Todd; Tabak, Rachel G; Harris, Jenine; Lecy, Jesse; Sales, Anne E

    2017-07-28

    A recent review of frameworks used in dissemination and implementation (D&I) science described 61 judged to be related either to dissemination, implementation, or both. The current use of these frameworks and their contributions to D&I science more broadly has yet to be reviewed. For these reasons, our objective was to determine the role of these frameworks in the development of D&I science. We used the Web of Science™ Core Collection and Google Scholar™ to conduct a citation network analysis for the key frameworks described in a recent systematic review of D&I frameworks (Am J Prev Med 43(3):337-350, 2012). From January to August 2016, we collected framework data including title, reference, publication year, and citations per year and conducted descriptive and main path network analyses to identify those most important in holding the current citation network for D&I frameworks together. The source article contained 119 cited references, with 50 published articles and 11 documents identified as a primary framework reference. The average citations per year for the 61 frameworks reviewed ranged from 0.7 to 103.3 among articles published from 1985 to 2012. Citation rates from all frameworks are reported with citation network analyses for the framework review article and ten highly cited framework seed articles. The main path for the D&I framework citation network is presented. We examined citation rates and the main paths through the citation network to delineate the current landscape of D&I framework research, and opportunities for advancing framework development and use. Dissemination and implementation researchers and practitioners may consider frequency of framework citation and our network findings when planning implementation efforts to build upon this foundation and promote systematic advances in D&I science.

  12. Post Implementation Review Framework and Procedures

    Data.gov (United States)

    Social Security Administration — This template outlines the Social Security Administration's (SSA) approach to initiating, conducting, and completing Post Implementation Reviews (PIRs). The template...

  13. Object-oriented framework for distributed simulation

    Science.gov (United States)

    Hunter, Julia; Carson, John A.; Colley, Martin; Standeven, John; Callaghan, Victor

    1999-06-01

    The benefits of object-oriented technology are widely recognized in software engineering. This paper describes the use of the object-oriented paradigm to create distributed simulations. The University of Essex Robotics and Intelligent Machines group has been carrying out research into distributed vehicle simulation since 1992. Part of this research has focused on the development of simulation systems to assist in the design of robotic vehicles. This paper describes the evolution of these systems, from an early toolkit used for teaching robotics to recent work on using simulation as a design tool in the creation of a new generation of unmanned underwater vehicles. It outlines experiences gained in using PVM, and ongoing research into the use of the emerging High Level Architecture as the basis for these frameworks. The paper concludes with the perceived benefits of adopting object-oriented methodologies as the basis for simulation frameworks.

  14. Distributed Particle Filter Implementation with Intermittent/Irregular Consensus Convergence

    CERN Document Server

    Mohammadi, Arash

    2011-01-01

    Motivated by non-linear, non-Gaussian, distributed multi-sensor/agent navigation and tracking applications, we propose a multi-rate consensus/fusion based framework for distributed implementation of the particle filter (CF/DPF). The CF/DPF framework is based on running localized particle filters to estimate the overall state vector at each observation node. Separate fusion filters are designed to consistently assimilate the local filtering distributions into the global posterior by compensating for the common past information between neighbouring nodes. The CF/DPF offers two distinct advantages over its counterparts. First, the CF/DPF framework is suitable for scenarios where network connectivity is intermittent and consensus cannot be reached between two consecutive observations. Second, the CF/DPF is not limited to the Gaussian approximation for the global posterior density. A third contribution of the paper is the derivation of the optimal posterior Cramér-Rao lower bound (PCRLB) for the distributed arc...
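
    Each observation node in such a framework runs a localized particle filter. A minimal bootstrap-filter update (generic textbook form, not the CF/DPF algorithm itself; the transition and likelihood callables are assumptions supplied by the user) looks like this:

    ```python
    # Minimal bootstrap particle filter step at one node (illustrative, generic).
    import numpy as np


    def particle_filter_step(particles, weights, observation,
                             transition, likelihood, rng):
        """One predict/update/resample cycle for a local filter.

        particles: (N, d) array; weights: (N,) array summing to one.
        transition(particles, rng) propagates the (possibly non-linear) dynamics.
        likelihood(observation, particles) returns per-particle likelihoods.
        """
        # Predict: propagate particles through the state dynamics.
        particles = transition(particles, rng)
        # Update: reweight by the likelihood of the local observation.
        weights = weights * likelihood(observation, particles)
        weights /= weights.sum()
        # Resample when the effective sample size degenerates.
        if 1.0 / np.sum(weights ** 2) < 0.5 * len(particles):
            idx = rng.choice(len(particles), size=len(particles), p=weights)
            particles = particles[idx]
            weights = np.full(len(particles), 1.0 / len(particles))
        return particles, weights
    ```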

  15. MES Development Framework: Concepts, Ideas, Implementation

    DEFF Research Database (Denmark)

    Caponetti, Fabio; Papaleo, Massimiliano

    to the definition of a novel MES Development Framework, MDF. MDF empowers Visual Studio .NET with an Addin for MES definition. A custom class generator (CCG) generates from scratch to a fully customized ISA-95 compliant database and related C#.NET project in a few seconds. Customized classes are created as a data...

  16. Framework for the implementation of Business Cybersecurity

    CSIR Research Space (South Africa)

    Jacobs, PC

    2015-05-01

    Full Text Available typically prescribe sets of controls to be implemented, such as technical controls, administrative controls and physical controls. Most of these documents also describe very specific capabilities that a business has to develop in securing their cyberdomain...

  17. Global sourcing practices: a framework to improve sourcing strategy implementation

    OpenAIRE

    Mohamad, Marini Nurbanum Binti

    2009-01-01

    The aim of the research reported in this thesis is to gain understanding of global sourcing practices of companies in the UK and to develop a framework to improve sourcing strategy implementation. This research was conducted by carrying out literature review, analysis of case studies through semi-structured interviews, analysis of an online-based survey, development of a global sourcing framework, feedback process and finally the refinement of the framework. The global sourc...

  18. MES Development Framework: Concepts, Ideas, Implementation

    DEFF Research Database (Denmark)

    Caponetti, Fabio; Papaleo, Massimiliano

    through product design, coding and testing. Cutting down the time between early analysis and working releases makes the adoption of an iterative development procedure possible and effective, like Unified Process. The key idea relies on the definition of a system. It should be able to use as input a MES...... to the definition of a novel MES Development Framework, MDF. MDF empowers Visual Studio .NET with an Addin for MES definition. A custom class generator (CCG) generates from scratch to a fully customized ISA-95 compliant database and related C#.NET project in a few seconds. Customized classes are created as a data...... interface and as a model of physical entities. Through the C# partial class techniques, further customisations are possible. Even if the MES is regenerated several times, any custom change will remain. MES behavior is described by a work-flow in a graphical way as a work-flow foundation project. High...

  19. Implementation framework for e-solutions for trade facilitation

    NARCIS (Netherlands)

    Stijn, E. van; Phuaphanthong, T.; Keretho, S.; Pikart, M.; Hofman, W.J.; Tan, Y.-H.

    2011-01-01

    To offer practical guidelines for the implementation of e-Solutions for Trade Facilitation (e-ST), such as e-Customs and Single Window, we provide the Implementation Framework for e-Solutions for Trade facilitation (e-STIF). The e-STIF is meant for policy managers, who are responsible for overseeing

  20. Implementation of Computational Electromagnetic on Distributed Systems

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The new generation of technology raises the bar for distributed computing, and solving computational electromagnetics problems on distributed systems with parallel computing techniques has become a trend. In this paper, we analyze the parallel characteristics of the distributed system and the possibility of setting up a tightly coupled distributed system using the LAN in our lab. An analysis of the performance of different computational methods, such as FEM, MoM, FDTD and the finite difference method, is given. Our work on setting up a distributed system and the performance of the test bed are also included. Finally, we describe the implementation of one of our computational electromagnetics codes.
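
    As a reminder of the kind of computational kernel being distributed here, the standard one-dimensional Yee (FDTD) leapfrog update is (a textbook formula, not taken from this paper):

    \[
      H_y^{\,n+1/2}(i+\tfrac12) = H_y^{\,n-1/2}(i+\tfrac12) + \frac{\Delta t}{\mu\,\Delta x}\left[ E_z^{\,n}(i+1) - E_z^{\,n}(i) \right],
    \]
    \[
      E_z^{\,n+1}(i) = E_z^{\,n}(i) + \frac{\Delta t}{\varepsilon\,\Delta x}\left[ H_y^{\,n+1/2}(i+\tfrac12) - H_y^{\,n+1/2}(i-\tfrac12) \right].
    \]

    Each grid point needs only its nearest neighbours, which is why domain decomposition across LAN-connected nodes maps well onto such codes.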

  1. Global Framework for Climate Services (GFCS): status of implementation

    Science.gov (United States)

    Lucio, Filipe

    2014-05-01

    The GFCS is a global partnership of governments and UN and international agencies that produce and use climate information and services. WMO, which is leading the initiative, and its collaborators UN ISDR, WHO, WFP, FAO, UNESCO, UNDP and other UN and international partners are pooling their expertise and resources in order to co-design and co-produce knowledge, information and services to support effective decision making in response to climate variability and change in four priority areas (agriculture and food security, water, health and disaster risk reduction). To address the entire value chain for the effective production and application of climate services, the main components or pillars of the GFCS are being implemented, namely:
    • User Interface Platform — to provide ways for climate service users and providers to interact to identify needs and capacities and improve the effectiveness of the Framework and its climate services;
    • Climate Services Information System — to produce and distribute climate data, products and information according to the needs of users and to agreed standards;
    • Observations and Monitoring — to generate the necessary data for climate services according to agreed standards;
    • Research, Modelling and Prediction — to harness science capabilities and results and develop appropriate tools to meet the needs of climate services;
    • Capacity Building — to support the systematic development of the institutions, infrastructure and human resources needed for effective climate services.
    Activities are being implemented in various countries in Africa, the Caribbean and the South Pacific islands. This paper will provide details on the status of implementation of the GFCS worldwide.

  2. Distributed tactical reasoning framework for intelligent vehicles

    Science.gov (United States)

    Sukthankar, Rahul; Pomerleau, Dean A.; Thorpe, Chuck E.

    1998-01-01

    In independent vehicle concepts for the Automated Highway System (AHS), the ability to make competent tactical-level decisions in real time is crucial. Traditional approaches to tactical reasoning typically involve the implementation of large monolithic systems, such as decision trees or finite state machines. However, as the complexity of the environment grows, the unforeseen interactions between components can make modifications to such systems very challenging. For example, changing an overtaking behavior may require several non-local changes to car-following, lane-changing and gap-acceptance rules. This paper presents a distributed solution to the problem. PolySAPIENT consists of a collection of autonomous modules, each specializing in a particular aspect of the driving task, classified by traffic entities rather than tactical behavior. Thus, the influence of the vehicle ahead on the available actions is managed by one reasoning object, while the implications of an approaching exit are managed by another. The independent recommendations from these reasoning objects are expressed in the form of votes and vetos over a 'tactical action space', and are resolved by a voting arbiter. This local independence enables PolySAPIENT reasoning objects to be developed independently, using a heterogeneous implementation. PolySAPIENT vehicles are implemented in the SHIVA tactical highway simulator, whose vehicles are based on the Carnegie Mellon Navlab robots.
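
    The vote-and-veto arbitration over a tactical action space described above can be sketched as follows. The action names, scores and fallback are invented for illustration and are not PolySAPIENT's actual interface.

    ```python
    # Illustrative sketch of vote/veto arbitration over a tactical action space
    # (the general pattern described above, not PolySAPIENT's implementation).
    ACTIONS = ["keep_lane", "change_left", "change_right"]


    def arbitrate(recommendations):
        """Combine votes and vetos from independent reasoning objects.

        `recommendations` is a list of dicts mapping action -> score,
        where None means a hard veto of that action.
        """
        totals = {a: 0.0 for a in ACTIONS}
        vetoed = set()
        for rec in recommendations:
            for action, score in rec.items():
                if score is None:
                    vetoed.add(action)
                else:
                    totals[action] += score
        allowed = {a: s for a, s in totals.items() if a not in vetoed}
        if not allowed:
            return "keep_lane"  # fall back to a safe default
        return max(allowed, key=allowed.get)


    if __name__ == "__main__":
        car_ahead = {"keep_lane": 0.2, "change_left": 0.8, "change_right": 0.5}
        exit_soon = {"keep_lane": 0.6, "change_left": None, "change_right": 0.9}
        print(arbitrate([car_ahead, exit_soon]))  # change_right
    ```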

  3. Implementing Distributed Controllers for Systems with Priorities

    CERN Document Server

    Ben-Hafaiedh, Imene; Khairallah, Hammadi; 10.4204/EPTCS.30.3

    2010-01-01

    Implementing a component-based system in a distributed way so that it ensures some global constraints is a challenging problem. We consider here abstract specifications consisting of a composition of components and a controller given in the form of a set of interactions and a priority order amongst them. In the context of distributed systems, such a controller must be executed in a distributed fashion while still respecting the global constraints imposed by interactions and priorities. We present in this paper an implementation of an algorithm that allows a distributed execution of systems with (binary) interactions and priorities. We also present a comprehensive simulation analysis that shows how sensitive to changes our algorithm is, in particular changes related to the degree of conflict in the system.

  4. ARC Control Tower: A flexible generic distributed job management framework

    Science.gov (United States)

    Nilsen, J. K.; Cameron, D.; Filipčič, A.

    2015-12-01

    While current grid middleware implementations are quite advanced in terms of connecting jobs to resources, their client tools are generally quite minimal and features for managing large sets of jobs are left to the user to implement. The ARC Control Tower (aCT) is a very flexible job management framework that can be run on anything from a single user's laptop to a multi-server distributed setup. aCT was originally designed to enable ATLAS jobs to be submitted to the ARC CE. However, with the recent redesign of aCT, where the ATLAS-specific elements are clearly separated from the ARC job management parts, the control tower can now easily be reused as a flexible generic distributed job manager for other communities. This paper gives a detailed explanation of how aCT works as a job management framework and goes through the steps needed to create a simple job manager using aCT, showing that it can easily manage thousands of jobs.
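
    A job-management layer of this kind typically amounts to a loop that walks jobs through a small state machine (submit, poll, fetch). The sketch below shows only that generic pattern; it does not use the real aCT or ARC client APIs, and the function names are placeholders.

    ```python
    # Generic job-manager loop (illustrative pattern only; not the aCT API).
    import time

    PENDING, SUBMITTED, RUNNING, DONE = "pending", "submitted", "running", "done"


    class Job:
        def __init__(self, job_id, description):
            self.job_id = job_id
            self.description = description
            self.state = PENDING


    def submit(job):          # placeholder for a real compute-element submission call
        job.state = SUBMITTED


    def poll(job):            # placeholder for querying the compute element
        if job.state in (SUBMITTED, RUNNING):
            job.state = DONE


    def manage(jobs, max_cycles=3, sleep_seconds=0.1):
        """Drive all jobs to completion, one state transition per cycle."""
        for _ in range(max_cycles):
            for job in jobs:
                if job.state == PENDING:
                    submit(job)
                elif job.state in (SUBMITTED, RUNNING):
                    poll(job)
            if all(j.state == DONE for j in jobs):
                break
            time.sleep(sleep_seconds)
        return jobs


    if __name__ == "__main__":
        finished = manage([Job(i, f"task-{i}") for i in range(3)])
        print([(j.job_id, j.state) for j in finished])
    ```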

  5. 2016-2020 Strategic Plan and Implementing Framework

    Energy Technology Data Exchange (ETDEWEB)

    None

    2015-11-01

    The 2016-2020 Strategic Plan and Implementing Framework from the Office of Energy Efficiency and Renewable Energy (EERE) is the blueprint for launching the nation’s leadership in the global clean energy economy. This document will guide the organization to build on decades of progress in powering our nation from clean, affordable and secure energy.

  6. An Empirical Framework for Implementing Lifelong Learning Systems.

    Science.gov (United States)

    Law, Song Seng; Low, Sock Hwee

    Based on a literature review of factors that affect the provision of learning opportunities for adults and the experiences of Singapore's Institute of Technical Education (ITE), this paper proposes an empirical framework for developing and implementing lifelong learning systems. Following an introduction, the theoretical foundation for the…

  7. A Framework for Implementing TQM in Higher Education Programs

    Science.gov (United States)

    Venkatraman, Sitalakshmi

    2007-01-01

    Purpose: This paper aims to provide a TQM framework that stresses continuous improvements in teaching as a plausible means of TQM implementation in higher education programs. Design/methodology/approach: The literature survey of the TQM philosophies and the comparative analysis of TQM adoption in industry versus higher education provide the…

  8. Forum on Implementing Accessibility Frameworks for ALL Students

    Science.gov (United States)

    Warren, S.; Christensen, L.; Chartrand, A.; Shyyan, V.; Lazarus, S.; Thurlow, M.

    2015-01-01

    Sixty individuals representing staff from state departments of education, school districts, other countries, testing and testing-related companies, and other educational organizations participated in a forum on June 22, 2015 in San Diego, California, to discuss implementing accessibility frameworks for all students, including students in general…

  9. A Framework for Distributed Problem Solving

    Science.gov (United States)

    Leone, Joseph; Shin, Don G.

    1989-03-01

    This work explores a distributed problem solving (DPS) approach, namely the AM/AG model, to cooperative memory recall. The AM/AG model is a hierarchic social system metaphor for DPS based on Mintzberg's model of organizations. At the core of the model are information flow mechanisms, named amplification and aggregation. Amplification is a process of expounding a given task, called an agenda, into a set of subtasks with a magnified degree of specificity and distributing them to multiple processing units downward in the hierarchy. Aggregation is a process of combining the results reported from multiple processing units into a unified view, called a resolution, and promoting the conclusion upward in the hierarchy. The combination of amplification and aggregation can account for a memory recall process which primarily relies on the ability to make associations between vast amounts of related concepts, sort out the combined results, and promote the most plausible ones. The amplification process is discussed in detail. An implementation of the amplification process is presented. The process is illustrated by an example.

  10. Distributed Framework for Prototyping of Observability Concepts in Smart Grids

    DEFF Research Database (Denmark)

    Prostejovsky, Alexander; Gehrke, Oliver; Kosek, Anna Magdalena

    2015-01-01

    Development and testing of distributed monitoring, visualisation, and decision support concepts for future power systems require appropriate modelling tools that represent both the electrical side of the grid, as well as the communication and logical relations between the acting entities...... via an abstract information model. Data is acquired dynamically over low-level data interfaces that allow for easy integration within heterogeneous environments. A Multi-Agent System platform was chosen for implementation, where agents represent the different electrical and logical grid elements...... and perform data acquisition, processing, and exchange. The basic capabilities of the framework together with a simple visualisation concept are demonstrated using a simulation of the Power Networks Demonstration Centre (PNDC) laboratory distribution grid....

  11. On ASGS framework: general requirements and an example of implementation

    Institute of Scientific and Technical Information of China (English)

    KULESZA Kamil; KOTULSKI Zbigniew

    2007-01-01

    In the paper we propose a general, abstract framework for Automatic Secret Generation and Sharing (ASGS) that should be independent of the underlying Secret Sharing Scheme (SSS). ASGS makes it possible to prevent the Dealer from knowing the secret. The Basic Property Conjecture (BPC) forms the base of the framework. Due to the level of abstraction, the results are portable into the realm of quantum computing. Two situations are discussed. The first concerns simultaneous generation and sharing of a random, previously nonexistent secret. Such a secret remains unknown until it is reconstructed. Next, we propose the framework for automatic sharing of a known secret. In this case the Dealer does not know the secret and the secret Owner does not know the shares. We present opportunities for joining ASGS with other extended capabilities, with special emphasis on PVSS and pre-positioned secret sharing. Finally, we illustrate the framework with a practical implementation.
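
    The paper deliberately keeps the underlying SSS abstract. As one concrete, standard instance on which an ASGS-style layer could sit (our choice of example, not the paper's), Shamir's (k, n) threshold scheme hides the secret s as the constant term of a random polynomial over a prime field:

    \[
      f(x) = s + a_1 x + a_2 x^2 + \cdots + a_{k-1} x^{k-1} \pmod{p}, \qquad \text{share}_i = (i, f(i)), \; i = 1, \dots, n.
    \]

    Any k shares recover s by Lagrange interpolation at x = 0, while any k - 1 shares reveal nothing about s.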

  12. Quality Assurance Framework Implementation Guide for Isolated Community Power Systems

    Energy Technology Data Exchange (ETDEWEB)

    Esterly, Sean R. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Baring-Gould, Edward I. [National Renewable Energy Lab. (NREL), Golden, CO (United States); Burman, Kari A. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-08-15

    This implementation guide is a companion document to the 'Quality Assurance Framework for Mini-Grids' technical report. This document is intended to be used by one of the many stakeholder groups that take part in the implementation of isolated power systems. Although the QAF could be applied to a single system, it was designed primarily to be used within the context of a larger national or regional rural electrification program in which many individual systems are being installed. This guide includes a detailed overview of the Quality Assurance Framework and provides guidance focused on the implementation of the Framework from the perspective of the different stakeholders that are commonly involved in expanding energy development within specific communities or regions. For the successful long-term implementation of a specific rural electrification program using mini-grid systems, six key stakeholders have been identified that are typically engaged, each with a different set of priorities: 1. Regulatory agency; 2. Governmental ministry; 3. System developers; 4. Mini-utility; 5. Investors; 6. Customers/consumers. This document is broken into two distinct sections. The first focuses on the administrative processes in the development and operation of community-based mini-grid programs, while the second focuses on the process around the installation of the mini-grid project itself.

  13. Architectural notes: a framework for distributed systems development

    NARCIS (Netherlands)

    Pires, L.F.; Ferreira Pires, Luis

    1994-01-01

    This thesis develops a framework of methods and techniques for distributed systems development. This framework consists of two related domains in which design concepts for distributed systems are defined: the entity domain and the behaviour domain. In the entity domain we consider structures of

  14. Private cloud implementation and security: using Eucalyptus and Xen Frameworks

    OpenAIRE

    2013-01-01

    Private cloud is a new, improved way to organize and manage IT resources and services within an enterprise. This can be achieved by establishing a private cloud framework behind the corporate firewall of an enterprise. This is done in order to promote better efficiency in determining workload, usage priorities, and security threats in an enterprise as well as secure sensitive company information, which other cloud offerings may not always effective...

  16. Implementing a Mobile Social Media Framework for Designing Creative Pedagogies

    Directory of Open Access Journals (Sweden)

    Thomas Cochrane

    2014-08-01

    Full Text Available The rise of mobile social media provides unique opportunities for new and creative pedagogies. Pedagogical change requires a catalyst, and we argue that mobile social media can be utilized as such a catalyst. However, the mobile learning literature is dominated by case studies that retrofit traditional pedagogical strategies and pre-existing course activities onto mobile devices and social media. From our experiences of designing and implementing a series of mobile social media projects, the authors have developed a mobile social media framework for creative pedagogies. We illustrate the implementation of our mobile social media framework within the development of a new media minor (an elective set of four courses) that explicitly integrates the unique technical and pedagogical affordances of mobile social media, with a focus upon student-generated content and student-determined learning (heutagogy). We argue that our mobile social media framework is potentially transferable to a range of educational contexts, providing a simple design framework for new pedagogies.

  17. Architectural frameworks: defining the structures for implementing learning health systems.

    Science.gov (United States)

    Lessard, Lysanne; Michalowski, Wojtek; Fung-Kee-Fung, Michael; Jones, Lori; Grudniewicz, Agnes

    2017-06-23

    The vision of transforming health systems into learning health systems (LHSs) that rapidly and continuously transform knowledge into improved health outcomes at lower cost is generating increased interest in government agencies, health organizations, and health research communities. While existing initiatives demonstrate that different approaches can succeed in making the LHS vision a reality, they are too varied in their goals, focus, and scale to be reproduced without undue effort. Indeed, the structures necessary to effectively design and implement LHSs on a larger scale are lacking. In this paper, we propose the use of architectural frameworks to develop LHSs that adhere to a recognized vision while being adapted to their specific organizational context. Architectural frameworks are high-level descriptions of an organization as a system; they capture the structure of its main components at varied levels, the interrelationships among these components, and the principles that guide their evolution. Because these frameworks support the analysis of LHSs and allow their outcomes to be simulated, they act as pre-implementation decision-support tools that identify potential barriers and enablers of system development. They thus increase the chances of successful LHS deployment. We present an architectural framework for LHSs that incorporates five dimensions-goals, scientific, social, technical, and ethical-commonly found in the LHS literature. The proposed architectural framework is comprised of six decision layers that model these dimensions. The performance layer models goals, the scientific layer models the scientific dimension, the organizational layer models the social dimension, the data layer and information technology layer model the technical dimension, and the ethics and security layer models the ethical dimension. We describe the types of decisions that must be made within each layer and identify methods to support decision-making. In this paper, we outline

  18. The Astrophysics Simulation Collaboratory portal: A framework for effective distributed research

    Energy Technology Data Exchange (ETDEWEB)

    Bondarescu, Ruxandra; Allen, Gabrielle; Daues, Gregory; Kelly, Ian; Russell, Michael; Seidel, Edward; Shalf, John; Tobias, Malcolm

    2003-03-03

    We describe the motivation, architecture, and implementation of the Astrophysics Simulation Collaboratory (ASC) portal. The ASC project provides a web-based problem solving framework for the astrophysics community that harnesses the capabilities of emerging computational grids.

  19. Framework of ERP System Implementation for SMEs In Punjab

    Directory of Open Access Journals (Sweden)

    Sarvjit Singh

    2012-08-01

    Full Text Available An enterprise resource planning (ERP) system, which can effectively reduce product cost, improve the customer service experience, and increase enterprise competitiveness, is one of the most significant information systems for Punjab enterprises. However, the successful implementation rate of ERP systems is much lower than initially planned, and many enterprises did not achieve their intended goals. Many factors (e.g., high implementation costs, technical complexity, lack of well-trained employees, and lack of incentive mechanisms) contribute to this situation, but the key element is the separation of the relationship between the ERP system and enterprise leadership, organizational structure and business processes. Aiming at this key issue, an IT governance framework for ERP system implementation is developed based on information technology governance methodology, which includes six components (i.e., enterprise strategy and organization, ERP-related organization and desirable behavior, ERP-related governance arrangements, ERP-related governance mechanisms, ERP-related business performance goals, and ERP-related metrics and resource accountabilities). The requirements of each component are described. Furthermore, a case of ERP system implementation is used to illustrate the proposed framework, which sufficiently takes into account the relationships among the ERP system, business objectives, business processes, and business performance.

  20. Development of framework for sustainable Lean implementation: an ISM approach

    Science.gov (United States)

    Jadhav, Jagdish Rajaram; Mantha, S. S.; Rane, Santosh B.

    2014-07-01

    The survival of any organization depends upon its competitive edge. Even though Lean is one of the most powerful quality improvement methodologies, nearly two-thirds of Lean implementations result in failures, and less than one-fifth of those implemented have sustained results. One of the most significant tasks of top management is to identify, understand and deploy the significant Lean practices like quality circles, Kanban, just-in-time purchasing, etc. The term `bundle' is used to make groups of inter-related and internally consistent Lean practices. Eight significant Lean practice bundles have been identified based on the literature reviewed and the opinion of experts. The order of execution of Lean practice bundles is very important, and Lean practitioners must be able to understand the interrelationships between these practice bundles. The objective of this paper is to develop a framework for sustainable Lean implementation using an interpretive structural modelling approach.

  1. Implementing invasive species management in an adaptive management framework

    Directory of Open Access Journals (Sweden)

    Llewellyn C. Foxcroft

    2011-05-01

    Full Text Available Adaptive management theory has attracted substantial interest in recent years, in natural resource management in general and also for invasive alien species management. However, whilst many theoretical and conceptual advances have been made, documented cases of practical applications are rare. Coupling invasive species management components with adaptive feedback processes is not without challenges, requiring a substantial change in the thinking and practice of all those involved. Drawing on a decade of experience in South African National Parks, we suggest an approach to implementing adaptive management for controlling invasive alien species. Whilst efforts have been made to advance components of the overall management strategy, the absence of a framework for decision making and feedback mechanisms, inflexibility in the system and shortcomings in the governance structure are all identified as barriers to learning and knowledge integration for the purposes of effective invasive alien species management. The framework provided here, encompassing documents, committees and processes, is aimed at addressing these shortcomings. Conservation implication: Adaptive management theory offers a robust tool for managing inherently complex systems. Its practical application, however, requires distilling the theory into useable functions. We offer a framework to advance implementation of strategic adaptive management for the control of invasive alien species using experiences gained from South African National Parks.

  2. Distributed Video Coding: CODEC Architecture and Implementation

    Directory of Open Access Journals (Sweden)

    Vijay Kumar Kodavalla

    2011-03-01

    Full Text Available Distributed Video Coding (DVC) is a new coding paradigm for video compression, based on the Slepian-Wolf (lossless coding) and Wyner-Ziv (lossy coding) information theoretic results. DVC is useful for emerging applications such as wireless video cameras, wireless low-power surveillance networks and disposable video cameras for medical applications. The primary objective of DVC is low-complexity video encoding, where the bulk of computation is shifted to the decoder, as opposed to the low-complexity decoder in conventional video compression standards such as H.264 and MPEG. There are a couple of early architectures and implementations of DVC: from Stanford University [2][3] in 2002, Berkeley University's PRISM (Power-efficient, Robust, hIgh-compression, Syndrome-based Multimedia coding) [4][5] in 2002, and the European project DISCOVER (DIStributed COding for Video SERvices) [6] in 2007. Primarily there are two types of DVC techniques, namely pixel domain and transform domain based. A transform domain design has better rate-distortion (RD) performance, as it exploits the spatial correlation between neighbouring samples and compacts the block energy into as few transform coefficients as possible (aka energy compaction). In this paper, the architecture, implementation details and “C” model results of our transform domain DVC are presented.
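
    The sketch below is not the paper's DVC codec; it only illustrates the energy-compaction property that motivates transform-domain coding: an orthonormal 2-D DCT applied to a synthetic 8x8 block concentrates most of the block energy in a few coefficients.

        # Energy-compaction sketch behind transform-domain coding (not the
        # paper's DVC codec): apply an orthonormal 2-D DCT-II to an 8x8 block
        # and measure how much energy the largest few coefficients retain.
        import numpy as np

        N = 8


        def dct_matrix(n):
            """Orthonormal DCT-II matrix."""
            k = np.arange(n).reshape(-1, 1)
            i = np.arange(n).reshape(1, -1)
            m = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * i + 1) * k / (2 * n))
            m[0, :] = np.sqrt(1.0 / n)
            return m


        rng = np.random.default_rng(0)
        # Smooth synthetic block (correlated samples), loosely mimicking natural image data.
        x = np.cumsum(np.cumsum(rng.normal(size=(N, N)), axis=0), axis=1)

        D = dct_matrix(N)
        coeffs = D @ x @ D.T                      # 2-D DCT: transform rows and columns

        energy = np.sort((coeffs ** 2).ravel())[::-1]
        kept = 8                                  # keep only the 8 largest coefficients
        print("fraction of energy in top 8 of 64 coefficients:",
              energy[:kept].sum() / energy.sum())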

  3. A conceptual framework for the vehicle-to-grid (V2G) implementation

    Energy Technology Data Exchange (ETDEWEB)

    Guille, Christophe; Gross, George [Department of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign, Urbana, IL 61801 (United States)

    2009-11-15

    The paper presents a proposed framework to effectively integrate aggregated battery vehicles into the grid as distributed energy resources, acting as controllable loads to levelize the demand on the system during off-peak conditions and as a generation/storage resource during the day to provide capacity and energy services to the grid. The paper also presents practical approaches for two key implementation steps - the computer/communication/control network and the incentive program. (author)

  4. Parallel Implementation of a Semidefinite Programming Solver based on CSDP in a distributed memory cluster

    NARCIS (Netherlands)

    Ivanov, I.D.; de Klerk, E.

    2007-01-01

    In this paper we present the algorithmic framework and practical aspects of implementing a parallel version of a primal-dual semidefinite programming solver on a distributed memory computer cluster. Our implementation is based on the CSDP solver and uses a message passing interface (MPI), and the Sc
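
    The record's abstract is truncated and shows no code, so the sketch below only illustrates the generic MPI data-parallel pattern (via mpi4py) of distributing row blocks of a dense matrix product across ranks, the kind of linear-algebra step an MPI-based solver relies on; it is not the CSDP implementation itself.

        # Generic MPI sketch (not the CSDP solver itself): distribute row
        # blocks of a dense matrix-matrix product across ranks and gather
        # the result on rank 0.
        # Run with e.g.: mpiexec -n 4 python parallel_matmul.py
        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        n = 512
        if rank == 0:
            A = np.random.rand(n, n)
            B = np.random.rand(n, n)
            row_blocks = np.array_split(A, size, axis=0)
        else:
            B = None
            row_blocks = None

        B = comm.bcast(B, root=0)                    # every rank needs the full B
        my_rows = comm.scatter(row_blocks, root=0)   # each rank gets a block of rows
        my_part = my_rows @ B                        # local compute

        parts = comm.gather(my_part, root=0)
        if rank == 0:
            C = np.vstack(parts)
            print("max error vs. serial product:", np.max(np.abs(C - A @ B)))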

  5. Parallel Implementation of a Semidefinite Programming Solver based on CSDP in a distributed memory cluster

    NARCIS (Netherlands)

    Ivanov, I.D.; de Klerk, E.

    2007-01-01

    In this paper we present the algorithmic framework and practical aspects of implementing a parallel version of a primal-dual semidefinite programming solver on a distributed memory computer cluster. Our implementation is based on the CSDP solver and uses a message passing interface (MPI), and the Sc

  6. Creating a Framework for Applying OAIS to Distributed Digital Preservation

    DEFF Research Database (Denmark)

    Zierau, Eld; Schultz, Matt

    2013-01-01

    This paper describes work being done towards a Framework for Applying the Reference Model for an Open Archival Information System (OAIS) to Distributed Digital Preservation (DDP). Such a Framework will be helpful for future analyses and/or audits of repositories that are performing digital...... information on this set of work, describe the research carried out to-date, and explain the proposed Framework components, including concepts and terminology, placement of OAIS functional entities, and roles and responsibilities for carrying out DDP....

  7. Supporting Collective Inquiry: A Technology Framework for Distributed Learning

    Science.gov (United States)

    Tissenbaum, Michael

    This design-based study describes the implementation and evaluation of a technology framework to support smart classrooms and Distributed Technology Enhanced Learning (DTEL) called SAIL Smart Space (S3). S3 is an open-source technology framework designed to support students engaged in inquiry investigations as a knowledge community. To evaluate the effectiveness of S3 as a generalizable technology framework, a curriculum named PLACE (Physics Learning Across Contexts and Environments) was developed to support two grade-11 physics classes (n = 22; n = 23) engaged in a multi-context inquiry curriculum based on the Knowledge Community and Inquiry (KCI) pedagogical model. This dissertation outlines three initial design studies that established a set of design principles for DTEL curricula, and related technology infrastructures. These principles guided the development of PLACE, a twelve-week inquiry curriculum in which students drew upon their community-generated knowledge base as a source of evidence for solving ill-structured physics problems based on the physics of Hollywood movies. During the culminating smart classroom activity, the S3 framework played a central role in orchestrating student activities, including managing the flow of materials and students using real-time data mining and intelligent agents that responded to emergent class patterns. S3 supported students' construction of knowledge through the use of individual, collective and collaborative scripts and technologies, including tablets and interactive large-format displays. Aggregate and real-time ambient visualizations helped the teacher act as a wondering facilitator, supporting students in their inquiry where needed. A teacher orchestration tablet gave the teacher some control over the flow of the scripted activities, and alerted him to critical moments for intervention. Analysis focuses on S3's effectiveness in supporting students' inquiry across multiple learning contexts and scales of time, and in

  8. Creating a Framework for Applying OAIS to Distributed Digital Preservation

    DEFF Research Database (Denmark)

    Zierau, Eld; Schultz, Matt

    2013-01-01

    This paper describes work being done towards a Framework for Applying the Reference Model for an Open Archival Information System (OAIS) to Distributed Digital Preservation (DDP). Such a Framework will be helpful for future analyses and/or audits of repositories that are performing digital preser...

  9. A Framework of Auto-Adapting Distributed Object for Mobile Computing

    Institute of Scientific and Technical Information of China (English)

    WANG Chen; ZHOU Ying; ZHANG Defu

    1999-01-01

    Low bandwidth hinders the development of mobile computing. Besides providing relatively higher bandwidth at the communication layer, constructing adaptable upper-layer applications is important. In this paper, a framework of auto-adapting distributed objects is proposed, and methods for evaluating object performance are given as well. Distributed objects can adjust their behaviors automatically in the framework and maintain relatively good performance to serve requests from remote applications. It is an efficient way to implement performance transparency for mobile clients.

  10. Practical Framework: Implementing OEE Method in Manufacturing Process Environment

    Science.gov (United States)

    Maideen, N. C.; Sahudin, S.; Mohd Yahya, N. H.; Norliawati, A. O.

    2016-02-01

    The manufacturing process environment requires reliable machinery in order to satisfy market demand. Ideally, a reliable machine is expected to be operated and to produce a quality product at its maximum designed capability. However, for various reasons, a machine is usually unable to achieve the desired performance. Since this performance affects the productivity of the system, a measurement technique should be applied. Overall Equipment Effectiveness (OEE) is a good method to measure the performance of a machine, and the reliable result produced by OEE can then be used to propose a suitable corrective action. There are many published papers on the purpose and benefits of OEE, covering the 'what' and 'why' factors; however, the 'how' factor has not yet been revealed, especially the implementation of OEE in a manufacturing process environment. Thus, this paper presents a practical framework to implement OEE, and a case study is discussed to explain each proposed step in detail. The proposed framework is beneficial to engineers, especially beginners, who want to start measuring their machine performance and later improve it.
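
    The abstract does not reproduce the framework's steps, so the sketch below shows only the standard OEE arithmetic (availability × performance × quality) with made-up shift figures, not data from the paper's case study.

        # Standard OEE arithmetic (availability x performance x quality) as it
        # is usually defined; the shift figures below are made-up sample data,
        # not values from the paper's case study.

        def oee(planned_time_min, downtime_min, ideal_cycle_time_min,
                total_count, defect_count):
            run_time = planned_time_min - downtime_min
            availability = run_time / planned_time_min
            performance = (ideal_cycle_time_min * total_count) / run_time
            quality = (total_count - defect_count) / total_count
            return availability, performance, quality, availability * performance * quality


        if __name__ == "__main__":
            a, p, q, overall = oee(planned_time_min=480, downtime_min=47,
                                   ideal_cycle_time_min=1.0,
                                   total_count=400, defect_count=12)
            print(f"availability={a:.2%} performance={p:.2%} "
                  f"quality={q:.2%} OEE={overall:.2%}")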

  11. Implementing a collaborative framework for academic support for registered nurses.

    Science.gov (United States)

    Elliott, Debra; Ugboma, Debra; Knight, Jessica

    2013-12-01

    This paper describes the collaboration between a national health service acute hospital trust and a higher education institution to implement a framework for academic support for registered nurses undertaking learning beyond registration. A small percentage of the educational budget was utilised to fund two academic staff (0.6 whole time equivalent) to work within the trust's own learning and development department. The initial aim of the project was to maximise the utilisation of the funding available for learning beyond registration study. The focus of the project was at both a strategic level and with individual staff. Embedding within the culture of the trust was important for the academic staff to understand and gain the service/user perspective on some of the barriers or issues concerning learning beyond registration. Following a scoping exercise, the multiplicity of issues that required action led to the creation of an academic support framework. This framework identified potential for intervention in four phases: planning for study, application and access to learning, during study, and outcome of study. Interventions were identified that were complementary and adjunct to the academic support provided by the higher education institution. New resources and services were also developed, such as pathway planning support and study skill workshops. One important resource was a dedicated point of contact for staff. A "live" database also proved useful in tracking and following up students.

  12. Towards a Better Distributed Framework for Learning Big Data

    Science.gov (United States)

    2017-06-14

    AFRL-AFOSR-JP-TR-2017-0046. Towards a Better Distributed Framework for Learning Big Data. Final report covering 14 May 2015 to 13 May 2017. Principal Investigator: Shou-De Lin, National Taiwan University (sdlin@csie.ntu.edu.tw).

  13. Distributed Software Development Modelling and Control Framework

    Directory of Open Access Journals (Sweden)

    Yi Feng

    2012-10-01

    Full Text Available With the rapid progress of internet technology, more and more software projects adopt e-development to facilitate the software development process in a world-wide context. However, distributed software development activity itself is a complex orchestration. It involves many people working together without the barrier of time and space differences. Therefore, how to efficiently monitor and control software e-development from a global perspective becomes an important issue for any internet-based software development project. In this paper, we present a novel approach to tackle this crucial issue by means of controlling the e-development process, collaborative task progress and communication quality. Meanwhile, we also present our e-development supporting environment prototype, Caribou, to demonstrate the viability of our approach.

  14. An ORM-Driven Implementation Framework for Database Federations

    Science.gov (United States)

    Balsters, Herman; Haarsma, Bouke

    Database federations are employed more and more in situations involving virtual and integrated information on demand, e.g., real-time integration of two databases. Heterogeneity in hardware and software platforms, as well as heterogeneity in the underlying semantics of participating local databases, makes it a hard challenge to design a consistent and well-performing global database federation. The ORM modeling framework not only allows for precise modeling of a data federation, but also hosts tools for reverse engineering, enabling local databases to recapture their intended meanings on a linguistic basis. We will demonstrate how ORM models together with reverse engineering techniques can be used in combination with actual, industrial-strength implementation platforms to develop semantically consistent and high-performance database federations.

  15. Open heavy flavour production: conceptual framework and implementation issues

    Energy Technology Data Exchange (ETDEWEB)

    Tung, Wu-Ki [Department of Physics and Astronomy, Michigan State University, MI (United States)]. E-mail: Tung@pa.msu.edu; Kretzer, Stefan; Schmidt, Carl [Department of Physics and Astronomy, Michigan State University, MI (United States)

    2002-05-01

    Heavy flavour production is an important quantum chromodynamics (QCD) process both in its own right and as a key component of precision global QCD analysis. Apparent disagreements between fixed-flavour scheme calculations of b-production rate with experimental measurements in hadro-, lepto- and photo-production provide new impetus for a thorough examination of the theory and phenomenology of this process. We review existing methods of calculation and place them in the context of the general perturbative QCD framework of Collins. A distinction is drawn between scheme dependence and implementation issues related to quark mass effects near threshold. We point out a so far overlooked kinematic constraint on the threshold behaviour, which greatly simplifies the variable flavour number scheme. This obviates the need for the elaborate existing prescriptions and leads to robust predictions. It can facilitate the study of current issues on heavy flavour production as well as precision global QCD analysis. (author)

  16. Kodiak: An Implementation Framework for Branch and Bound Algorithms

    Science.gov (United States)

    Smith, Andrew P.; Munoz, Cesar A.; Narkawicz, Anthony J.; Markevicius, Mantas

    2015-01-01

    Recursive branch and bound algorithms are often used to refine and isolate solutions to several classes of global optimization problems. A rigorous computation framework for the solution of systems of equations and inequalities involving nonlinear real arithmetic over hyper-rectangular variable and parameter domains is presented. It is derived from a generic branch and bound algorithm that has been formally verified, and utilizes self-validating enclosure methods, namely interval arithmetic and, for polynomials and rational functions, Bernstein expansion. Since bounds computed by these enclosure methods are sound, this approach may be used reliably in software verification tools. Advantage is taken of the partial derivatives of the constraint functions involved in the system, firstly to reduce the branching factor by the use of bisection heuristics and secondly to permit the computation of bifurcation sets for systems of ordinary differential equations. The associated software development, Kodiak, is presented, along with examples of three different branch and bound problem types it implements.
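
    The sketch below is not Kodiak itself (which uses formally verified algorithms, interval arithmetic and Bernstein enclosures); it only shows the generic branch-and-bound loop the abstract describes, applied to a toy univariate function: evaluate an interval enclosure, prune boxes that cannot contain a zero, and bisect the rest.

        # Minimal branch-and-bound sketch (not Kodiak itself): bound the zeros
        # of f(x) = x^2 - 2 on [0, 2] by interval evaluation, discarding boxes
        # whose enclosure excludes 0 and bisecting the rest until they are small.

        def f_interval(lo, hi):
            """Interval enclosure of f(x) = x^2 - 2 for 0 <= lo <= hi."""
            return lo * lo - 2.0, hi * hi - 2.0   # x^2 is monotone on [0, inf)


        def branch_and_bound(lo, hi, tol=1e-6):
            boxes, results = [(lo, hi)], []
            while boxes:
                a, b = boxes.pop()
                flo, fhi = f_interval(a, b)
                if flo > 0.0 or fhi < 0.0:        # prune: enclosure excludes 0
                    continue
                if b - a < tol:                   # small enough: keep as candidate
                    results.append((a, b))
                    continue
                mid = 0.5 * (a + b)               # branch: bisect the box
                boxes.extend([(a, mid), (mid, b)])
            return results


        if __name__ == "__main__":
            for a, b in branch_and_bound(0.0, 2.0):
                print(f"zero enclosed in [{a:.7f}, {b:.7f}]")  # contains sqrt(2)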

  17. Quantum key distribution: vulnerable if imperfectly implemented

    Science.gov (United States)

    Leuchs, G.

    2013-10-01

    We report several vulnerabilities found in Clavis2, the flagship quantum key distribution (QKD) system from ID Quantique. We show the hacking of a calibration sequence run by Clavis2 to synchronize the Alice and Bob devices before performing the secret key exchange. This hack induces a temporal detection efficiency mismatch in Bob that can allow Eve to break the security of the cryptosystem using faked states. We also experimentally investigate the superlinear behaviour in the single-photon detectors (SPDs) used by Bob. Due to this superlinearity, the SPDs feature an actual multi-photon detection probability which is generally higher than the theoretically-modelled value. We show how this increases the risk of detector control attacks on QKD systems (including Clavis2) employing such SPDs. Finally, we review the experimental feasibility of Trojan-horse attacks. In the case of Clavis2, the objective is to read Bob's phase modulator to acquire knowledge of his basis choice, as this information suffices for constructing the raw key in the Scarani-Acin-Ribordy-Gisin 2004 (SARG04) protocol. We work in close collaboration with ID Quantique, and we notified them of all these loopholes in advance. Wherever possible, we or ID Quantique proposed countermeasures, and they implemented suitable patches and upgraded their systems.

  18. MVC Design Pattern for the multi framework distributed applications using XML, spring and struts framework

    Directory of Open Access Journals (Sweden)

    Praveen Gupta,

    2010-07-01

    Full Text Available The model view controller (MVC) is a fundamental design pattern for the separation between user interface logic and business logic. Since applications are very large these days, the MVC design pattern can weaken the coupling among the different tiers of an application. This paper presents a web application framework based on MVC on the J2EE platform, and extends it with XML so that the framework is more flexible, extensible and easy to maintain. This is a multi-tier system including a presentation layer, business layer, data persistence layer and database layer. This separates code, and improves the maintainability and reusability of the application. In this paper, we implemented MVC using the Spring and Struts frameworks. Our research study shows that applying multiple frameworks to design applications using MVC concepts makes applications easier to build compared to using a single framework.
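
    The paper wires MVC together with Spring and Struts on J2EE; the toy sketch below shows only the MVC separation itself (a model holding business rules, a view doing presentation, a controller mediating input), written in Python rather than Java to keep the pattern concrete and language-neutral.

        # Toy illustration of the MVC separation (not the paper's Spring/Struts
        # wiring): the controller keeps the view free of business logic.

        class TaskModel:                      # Model: state + business rules
            def __init__(self):
                self._tasks = []

            def add(self, title):
                if not title.strip():
                    raise ValueError("empty title")
                self._tasks.append(title.strip())

            def all(self):
                return list(self._tasks)


        def render_tasks(tasks):              # View: presentation only
            return "\n".join(f"- {t}" for t in tasks) or "(no tasks)"


        class TaskController:                 # Controller: mediates user input
            def __init__(self, model):
                self.model = model

            def handle_add(self, raw_input):
                self.model.add(raw_input)
                return render_tasks(self.model.all())


        if __name__ == "__main__":
            controller = TaskController(TaskModel())
            print(controller.handle_add("write abstract"))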

  19. Load balancing in distributed framework for frequency based thread pools

    Directory of Open Access Journals (Sweden)

    Sheraz Ahmad

    2016-12-01

    Full Text Available This paper examines the effect of load balancing algorithms on a thread pool framework, namely the distributed load balancing frequency-based optimization scheme (LDFBOS), with the aim of increasing its performance. The scheme is built in Java on top of the distributed frequency-based thread pool (DFBOS), whose performance is slowed by context switching and synchronization overhead between nodes. We demonstrate the design and implementation of LDFBOS, which makes use of the DFBOS synchronization primitives and offers the benefits of significant scalability and dynamism. We compared the performance of the schemes using Thread Pool Tester, a Java application simulator, and the results show that LDFBOS outperforms the preceding DFBOS scheme.
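
    The abstract does not give the details of LDFBOS, so the sketch below shows only the underlying idea of dispatching work to the least-loaded of several worker pools, using queue length as a stand-in load signal; it is not the authors' scheme.

        # Minimal sketch of least-loaded dispatch between worker pools (queue
        # length as the load signal); illustrative only, not LDFBOS itself.
        import queue
        import threading
        import time


        def worker(q):
            while True:
                task = q.get()
                if task is None:             # poison pill shuts the worker down
                    break
                time.sleep(task)             # simulate `task` seconds of work
                q.task_done()


        pools = [queue.Queue() for _ in range(2)]
        threads = [threading.Thread(target=worker, args=(q,), daemon=True) for q in pools]
        for t in threads:
            t.start()


        def submit(task):
            """Dispatch to the pool with the shortest backlog."""
            target = min(pools, key=lambda q: q.qsize())
            target.put(task)


        for duration in [0.05, 0.2, 0.05, 0.1, 0.05]:
            submit(duration)
        for q in pools:
            q.join()                         # wait until all queued work is done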

  20. Online Monitor Framework for Network Distributed Data Acquisition Systems

    Science.gov (United States)

    Konno, Tomoyuki; Cabrera, Anatael; Ishitsuka, Masaki; Kuze, Masahiro; Sakamoto, Yasunobu; the Double Chooz Collaboration

    Data acquisition (DAQ) systems for recent high energy physics experiments consist of many subsystems distributed across the local area network. Therefore, scalability in the number of connections from subsystems and availability of access via the Internet are required. The "Online monitor framework" is a general software framework for online data monitoring, which provides a way to collect monitoring information distributed in the network and pass it through firewalls. The framework consists of two subsystems: "Monitor Server" and "Monitor Viewer". Monitor Server is the core system of the framework; it collects monitoring information from the DAQ subsystems and provides it to Monitor Viewer. Monitor Viewer is the graphical user interface of the framework, which displays the plots. We adopted two technologies, Java and HTML5 with Google Web Toolkit, which are independent of operating systems and of plugin libraries like ROOT, and which provide functionality for communicating via the Internet and drawing graphics. The monitoring framework was developed for the Double Chooz reactor neutrino oscillation experiment but is general enough for other experiments. This document reports the structure of the online monitor framework with some examples from its adaptation to the Double Chooz experiment.

  1. Global Framework for Climate Services (GFCS): implementation approach

    Science.gov (United States)

    Lucio, Filipe

    2013-04-01

    The Extraordinary Session of the World Meteorological Congress, held from 29 to 31 October 2012, adopted the Implementation Plan of the Global Framework for Climate Services, for subsequent consideration by the Intergovernmental Board on Climate Services, which will host its first session in July 2013. The Extraordinary Congress called for an immediate move to action, so that the work undertaken can result in activities on the ground which will benefit, in particular, vulnerable countries. The development of the GFCS through a broad consultation process across the pillars of the GFCS (User Interface Platform; Observations and Monitoring; Climate Services Information System; Research, Modelling and Prediction; and Capacity Development) and the initial four priority areas (Agriculture and Food Security; Water; Health; and Disaster Risk Reduction) identified a number of challenges, which in some cases constitute barriers to implementation: - Accessibility: many countries do not have climate services at all, and all countries have scope to improve access to such services; - Capacity: many countries lack the capacity to anticipate and manage climate-related risks and opportunities; - Data: the current availability and quality of climate observations and impacts data are inadequate for large parts of the globe; - Partnerships: mechanisms to enhance interaction between climate users and providers are not always well developed, and user requirements are not always adequately understood and addressed; - Quality: operational climate services are lagging advances in climate and applications science, and the spatial and temporal resolution of information to support decision-making is often insufficient to match user requirements. To address these challenges, the Implementation Plan of the GFCS identified initial implementation projects and activities. The initial priority is to establish the leadership and management capacity to take the GFCS forward at all levels. Capacity

  2. Creating a Framework for Applying OAIS to Distributed Digital Preservation

    DEFF Research Database (Denmark)

    Zierau, Eld; Schultz, Matt; Skinner, Katherine

    apparatuses in order to achieve the reliable persistence of digital content. Although the use of distribution is common within the preservation field, there is not yet an accepted definition for “distributed digital preservation”. As the preservation field has matured, the term “distributed digital...... Library, Internet Archive, and Archivematica are developing a framework that identifies, defines, and provides a documented model for the range of relationships and interactions that occur in Distributed Digital Preservation (DDP) environments. This effort seeks to: 1. Provide a set of concepts...

  3. Framework for the Design and Implementation of Fault Detection and Isolation Project

    Data.gov (United States)

    National Aeronautics and Space Administration — SySense, Inc. proposes to develop a framework for the design and implementation of fault detection and isolation (FDI) systems. The framework will include protocols...

  4. Creating a Framework for Applying OAIS to Distributed Digital Preservation

    DEFF Research Database (Denmark)

    Zierau, Eld; Schultz, Matt; Skinner, Katherine

    2013-01-01

    This poster describes the creation of a framework that addresses the challenges in applying the Reference Model for an Open Archival Information System (OAIS) to the Distributed Digital Preservation (DDP) environment. The purpose of this initiative is to identify, define, and provide a documented...... Library, Internet Archive, and Archivematica are developing a framework that identifies, defines, and provides a documented model for the range of relationships and interactions that occur in Distributed Digital Preservation (DDP) environments. This effort seeks to: 1. Provide a set of concepts...... framework for the range of technical and organizational relationships and interactions that occur in DDP environments. A white paper regarding this work was recently written, and the aim of this poster is to inform the broader digital preservation community of developers, designers, architects, archival...

  5. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science

    Directory of Open Access Journals (Sweden)

    Alexander Jeffery A

    2009-08-01

    Full Text Available Abstract Background Many interventions found to be effective in health services research studies fail to translate into meaningful patient care outcomes across multiple contexts. Health services researchers recognize the need to evaluate not only summative outcomes but also formative outcomes to assess the extent to which implementation is effective in a specific setting, prolongs sustainability, and promotes dissemination into other settings. Many implementation theories have been published to help promote effective implementation. However, they overlap considerably in the constructs included in individual theories, and a comparison of theories reveals that each is missing important constructs included in other theories. In addition, terminology and definitions are not consistent across theories. We describe the Consolidated Framework For Implementation Research (CFIR) that offers an overarching typology to promote implementation theory development and verification about what works where and why across multiple contexts. Methods We used a snowball sampling approach to identify published theories that were evaluated to identify constructs based on strength of conceptual or empirical support for influence on implementation, consistency in definitions, alignment with our own findings, and potential for measurement. We combined constructs across published theories that had different labels but were redundant or overlapping in definition, and we parsed apart constructs that conflated underlying concepts. Results The CFIR is composed of five major domains: intervention characteristics, outer setting, inner setting, characteristics of the individuals involved, and the process of implementation. Eight constructs were identified related to the intervention (e.g., evidence strength and quality), four constructs were identified related to outer setting (e.g., patient needs and resources), 12 constructs were identified related to inner setting (e.g., culture

  6. Distributed implementation of functional program evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Fasel, J.H.; Douglass, R.J.; Michelsen, R.; Hudak, P.

    1985-01-01

    In this paper, we explore the potential of the functional model, particularly as it pertains to architecture. In Section 2, we describe the graph-reduction operational model of computation and its relation to AI problems. In Section 3, we discuss a class of architectures that implement graph reduction and a prototype implementation in this class being developed at Los Alamos. Finally, we speculate on the applicability of graph reduction to some other classes of architecture.

  7. Designing and implementing test automation frameworks with QTP

    CERN Document Server

    Bhargava, Ashish

    2013-01-01

    A tutorial-based approach, showing basic coding and designing techniques to build test automation frameworks. If you are a beginner, an automation engineer, an aspiring test automation engineer, a manual tester, a test lead or a test architect who wants to learn, create, and maintain test automation frameworks, this book will accelerate your ability to develop and adapt the framework.

  8. A maximum entropy framework for non-exponential distributions

    CERN Document Server

    Peterson, Jack; Dill, Ken A

    2015-01-01

    Probability distributions having power-law tails are observed in a broad range of social, economic, and biological systems. We describe here a potentially useful common framework. We derive distribution functions $\{p_k\}$ for situations in which a `joiner particle' $k$ pays some form of price to enter a `community' of size $k-1$, where costs are subject to economies-of-scale (EOS). Maximizing the Boltzmann-Gibbs-Shannon entropy subject to this energy-like constraint predicts a distribution having a power-law tail; it reduces to the Boltzmann distribution in the absence of EOS. We show that the predicted function gives excellent fits to 13 different distribution functions, ranging from friendship links in social networks, to protein-protein interactions, to the severity of terrorist attacks. This approach may give useful insights into when to expect power-law distributions in the natural and social sciences.
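
    The LaTeX sketch below spells out the standard maximum-entropy step the abstract refers to. The logarithmic cost form E_k = epsilon ln k is an assumed economies-of-scale cost, used here only to show how a power-law tail can emerge and how a linear cost recovers the ordinary Boltzmann exponential.

        % Sketch of the maximum-entropy argument behind the abstract. The
        % logarithmic cost E_k = \epsilon \ln k is an assumed economies-of-scale
        % form, not the paper's full derivation.
        \documentclass{article}
        \usepackage{amsmath}
        \begin{document}
        Maximize $S = -\sum_k p_k \ln p_k$ subject to $\sum_k p_k = 1$ and a fixed
        mean cost $\sum_k p_k E_k = \langle E \rangle$. Stationarity of the
        Lagrangian gives the Boltzmann form
        \begin{equation}
          p_k = \frac{e^{-\lambda E_k}}{\sum_j e^{-\lambda E_j}} .
        \end{equation}
        If joining a community of size $k$ carries a cost that grows only
        logarithmically because of economies of scale, $E_k = \epsilon \ln k$, then
        \begin{equation}
          p_k \propto e^{-\lambda \epsilon \ln k} = k^{-\lambda \epsilon},
        \end{equation}
        a power-law tail; with a cost linear in $k$ (no economies of scale) the
        same argument returns the ordinary exponential Boltzmann distribution.
        \end{document}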

  9. An Extensible and Secure Framework for Distributed Applications

    Directory of Open Access Journals (Sweden)

    Aneesha Sharma

    2012-01-01

    Full Text Available Availability, scalability, reliability, security and resource sharing are the key issues for the success of any application, and they are well addressed by distributed applications. Distributed applications provide services to different computers located at various locations that are connected by some means of a communication network. In distributed systems, a particular site consists of various computing facilities and an interface to local users and to a communication network. This paper discusses various issues that must be taken into consideration while developing distributed systems; these issues offer a secure framework on top of which any distributed application can be developed. Among them are certain commonly occurring issues that distributed systems fall victim to.

  10. A Framework for Distributed Deep Learning Layer Design in Python

    OpenAIRE

    McLeod, Clay

    2015-01-01

    In this paper, a framework for testing Deep Neural Network (DNN) design in Python is presented. First, big data, machine learning (ML), and Artificial Neural Networks (ANNs) are discussed to familiarize the reader with the importance of such a system. Next, the benefits and detriments of implementing such a system in Python are presented. Lastly, the specifics of the system are explained, and some experimental results are presented to prove the effectiveness of the system.
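
    The record does not show the cited framework's API, so the numpy sketch below only illustrates what a DNN "layer design" has to expose in practice: a forward pass, a backward pass, and a parameter update. It is not the cited framework's code.

        # Minimal dense-layer sketch in numpy (not the cited framework's API):
        # forward pass, backward pass, and an in-place SGD update.
        import numpy as np


        class Dense:
            def __init__(self, n_in, n_out, rng):
                self.W = rng.normal(scale=1.0 / np.sqrt(n_in), size=(n_in, n_out))
                self.b = np.zeros(n_out)

            def forward(self, x):
                self.x = x                      # cache input for the backward pass
                return x @ self.W + self.b

            def backward(self, grad_out, lr=0.1):
                grad_W = self.x.T @ grad_out
                grad_b = grad_out.sum(axis=0)
                grad_in = grad_out @ self.W.T   # gradient w.r.t. the layer input
                self.W -= lr * grad_W           # in-place SGD update
                self.b -= lr * grad_b
                return grad_in


        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            layer = Dense(4, 1, rng)
            x = rng.normal(size=(32, 4))
            y = x @ np.array([[1.0], [-2.0], [0.5], [0.0]])  # synthetic linear target
            for _ in range(200):
                pred = layer.forward(x)
                layer.backward(2 * (pred - y) / len(x))      # gradient of MSE loss
            print("final MSE:", float(np.mean((layer.forward(x) - y) ** 2)))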

  11. A Distributed Key Based Security Framework for Private Clouds

    Directory of Open Access Journals (Sweden)

    Ali Shahbazi

    2013-10-01

    Full Text Available Cloud computing in its various forms continues to grow in popularity as organizations of all sizes seek to capitalize on the cloud’s scalability, externalization of infrastructure and administration and generally reduced application deployment costs. But while the attractiveness of these public cloud services is obvious, the ability to capitalize on these benefits is significantly limited for those organizations requiring high levels of data security. It is often difficult if not impossible from a legal or regulatory perspective for government agencies or health services organizations, for instance, to use these cloud services given their many documented data security issues. As a middle ground between the benefits and security concerns of public clouds, hybrid clouds have emerged as an attractive alternative; limiting access, conceptually, to users within an organization or within a specific subset of users within an organization. Private clouds being significant options in hybrid clouds, however, are still susceptible to security vulnerabilities, a fact which points to the necessity of security frameworks capable of addressing these issues. In this paper we introduce the Treasure Island Security Framework (TISF), a conceptual security framework designed to specifically address the security needs of private clouds. We have based our framework on a Distributed Key and Sequentially Addressing Distributed file system (DKASA), itself borrowing heavily from the Google File System and Hadoop. Our approach utilizes a distributed key methodology combined with sequential chunk addressing and dynamic reconstruction of metadata to produce a more secure private cloud. The goal of this work is not to evaluate the framework from an operational perspective but to instead provide the conceptual underpinning for the TISF. Experimental findings from our evaluation of the framework within a pilot project will be provided in a subsequent work.

  12. MomConnect: an exemplar implementation of the Health Normative Standards Framework in South Africa

    CSIR Research Space (South Africa)

    Seebregts, C

    2016-05-01

    Full Text Available In August 2014, the National Department of Health implemented MomConnect as a national digital maternal health program that implements the South African mobile health (mHealth) strategy and the National Health Normative Standards Framework...

  13. Framework and implementation for improving physics essential skills via computer-based practice: Vector math

    Science.gov (United States)

    Mikula, Brendon D.; Heckler, Andrew F.

    2017-06-01

    We propose a framework for improving accuracy, fluency, and retention of basic skills essential for solving problems relevant to STEM introductory courses, and implement the framework for the case of basic vector math skills over several semesters in an introductory physics course. Using an iterative development process, the framework begins with a careful identification of target skills and the study of specific student difficulties with these skills. It then employs computer-based instruction, immediate feedback, mastery grading, and well-researched principles from cognitive psychology such as interleaved training sequences and distributed practice. We implemented this with more than 1500 students over 2 semesters. Students completed the mastery practice for an average of about 13 min/week, for a total of about 2-3 h for the whole semester. Results reveal large (>1 SD) pretest to post-test gains in accuracy in vector skills, even compared to a control group, and these gains were retained at least 2 months after practice. We also find evidence of improved fluency, student satisfaction, and that awarding regular course credit results in higher participation and higher learning gains than awarding extra credit. In all, we find that simple computer-based mastery practice is an effective and efficient way to improve a set of basic and essential skills for introductory physics.
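
    The sketch below is a toy illustration, not the authors' training materials: it generates a random vector-addition item and gives immediate feedback on the typed answer, which is the basic interaction loop such computer-based mastery practice builds on.

        # Toy sketch of a computer-based practice item with immediate feedback
        # (illustrative only; not the authors' materials): generate a random
        # vector-addition question and check the typed answer at once.
        import random


        def make_item(rng):
            a = (rng.randint(-5, 5), rng.randint(-5, 5))
            b = (rng.randint(-5, 5), rng.randint(-5, 5))
            answer = (a[0] + b[0], a[1] + b[1])
            prompt = f"A = {a}, B = {b}.  What is A + B?  (format: x,y) "
            return prompt, answer


        def run_item(rng, get_response=input):
            prompt, answer = make_item(rng)
            try:
                x, y = (int(v) for v in get_response(prompt).split(","))
            except ValueError:
                return "Please answer as two integers, e.g. 3,-1"
            if (x, y) == answer:
                return "Correct!"
            return f"Not quite: add the components separately; the answer is {answer}."


        if __name__ == "__main__":
            print(run_item(random.Random()))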

  14. Framework and implementation for improving physics essential skills via computer-based practice: Vector math

    Directory of Open Access Journals (Sweden)

    Brendon D. Mikula

    2017-05-01

    Full Text Available We propose a framework for improving accuracy, fluency, and retention of basic skills essential for solving problems relevant to STEM introductory courses, and implement the framework for the case of basic vector math skills over several semesters in an introductory physics course. Using an iterative development process, the framework begins with a careful identification of target skills and the study of specific student difficulties with these skills. It then employs computer-based instruction, immediate feedback, mastery grading, and well-researched principles from cognitive psychology such as interleaved training sequences and distributed practice. We implemented this with more than 1500 students over 2 semesters. Students completed the mastery practice for an average of about 13 min/week, for a total of about 2–3 h for the whole semester. Results reveal large (>1 SD) pretest to post-test gains in accuracy in vector skills, even compared to a control group, and these gains were retained at least 2 months after practice. We also find evidence of improved fluency, student satisfaction, and that awarding regular course credit results in higher participation and higher learning gains than awarding extra credit. In all, we find that simple computer-based mastery practice is an effective and efficient way to improve a set of basic and essential skills for introductory physics.

  15. Large-Scale CORBA-Distributed Software Framework for NIF Controls

    Energy Technology Data Exchange (ETDEWEB)

    Carey, R W; Fong, K W; Sanchez, R J; Tappero, J D; Woodruff, J P

    2001-10-16

    The Integrated Computer Control System (ICCS) is based on a scalable software framework that is distributed over some 325 computers throughout the NIF facility. The framework provides templates and services at multiple levels of abstraction for the construction of software applications that communicate via CORBA (Common Object Request Broker Architecture). Various forms of object-oriented software design patterns are implemented as templates to be extended by application software. Developers extend the framework base classes to model the numerous physical control points, thereby sharing the functionality defined by the base classes. About 56,000 software objects each individually addressed through CORBA are to be created in the complete ICCS. Most objects have a persistent state that is initialized at system start-up and stored in a database. Additional framework services are provided by centralized server programs that implement events, alerts, reservations, message logging, database/file persistence, name services, and process management. The ICCS software framework approach allows for efficient construction of a software system that supports a large number of distributed control points representing a complex control application.

  16. Communication Optimizations for a Wireless Distributed Prognostic Framework

    Science.gov (United States)

    Saha, Sankalita; Saha, Bhaskar; Goebel, Kai

    2009-01-01

    Distributed architectures for prognostics are an essential step in prognostics research in order to enable feasible real-time system health management. Communication overhead is an important design problem for such systems. In this paper we focus on communication issues faced in the distributed implementation of an important class of algorithms for prognostics - particle filters. In spite of being computation and memory intensive, particle filters lend themselves well to distributed implementation except for one significant step - resampling. We propose a new resampling scheme called parameterized resampling that attempts to reduce communication between collaborating nodes in a distributed wireless sensor network. Analysis and comparison with relevant resampling schemes, in the context of minimizing communication overhead, are also presented. A battery health management system is used as the target application. Our proposed resampling scheme performs significantly better than other schemes by attempting to reduce both the communication message length and the total number of communication messages exchanged, while not compromising prediction accuracy and precision. Future work will explore the effects of the new resampling scheme on the overall computational performance of the whole system, as well as a full implementation of the new schemes on Sun SPOT devices. Exploring different network architectures for efficient communication is an important future research direction as well.
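
    The paper's parameterized resampling is not specified in the abstract, so the sketch below shows standard systematic resampling, the baseline particle-filter step whose communication cost a distributed scheme has to reduce.

        # Standard systematic resampling for a particle filter (the baseline
        # step; the paper's parameterized resampling is not reproduced here).
        import numpy as np


        def systematic_resample(weights, rng):
            n = len(weights)
            positions = (rng.random() + np.arange(n)) / n   # one stratified draw
            cumulative = np.cumsum(weights)
            cumulative[-1] = 1.0                            # guard against round-off
            return np.searchsorted(cumulative, positions)   # indices of surviving particles


        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            particles = rng.normal(size=1000)
            weights = np.exp(-0.5 * (particles - 0.7) ** 2)  # pretend likelihood
            weights /= weights.sum()
            resampled = particles[systematic_resample(weights, rng)]
            print("posterior mean estimate:", resampled.mean())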

  17. People, Power, and the Coast: a Conceptual Framework for Understanding and Implementing Benefit Sharing

    Directory of Open Access Journals (Sweden)

    Rachel Wynberg

    2014-03-01

    Full Text Available The concept of benefit sharing has seen growing adoption in recent years by a variety of sectors. However, its conceptual underpinnings, definitions, and framework remain poorly articulated and developed. We aim to help address this gap by presenting a new conceptual approach for enhancing understanding about benefit sharing and its implementation. We use the coast as a lens through which the analysis is framed because of the intricate governance challenges which coastal social-ecological systems present, the increasing development and exploitation pressures on these systems, and the growing need to improve understanding about the way in which greater equity and reduced inequalities could reduce conflicts, protect coastal ecosystems, and ensure greater social justice. Key elements of the framework include the range of actors involved, the natural resources they access and use, the interventions introduced to redistribute benefits, and the benefits and losses that result from these interventions. The framework underscores the importance of process in determining who gets what, as well as the wider institutional, political, social, and economic context. Power relations and imbalances underpin many of these elements and remain the central reason for benefits being distributed in the way that they are. The framework has relevance and application for coastal livelihoods, rural governance, and resource sustainability in a context in which community rights are increasingly undermined through land grabbing, unequal power relations, and externally driven development interventions.

  18. Global Framework for Climate Services (GFCS): status of implementation

    Science.gov (United States)

    Lucio, Filipe

    2015-04-01

    The World Climate Conference-3 (Geneva 2009) unanimously decided to establish the Global Framework for Climate Services (GFCS), a UN-led initiative spearheaded by WMO to guide the development and application of science-based climate information and services in support of decision-making in climate sensitive sectors. By promoting science-based decision-making, the GFCS is empowering governments, communities and companies to build climate resilience, reduce vulnerabilities and adapt to impacts. The initial priority areas of GFCS are Agriculture and Food Security; Disaster Risk Reduction; Health; and Water Resources. The implementation of GFCS is well underway with a governance structure now fully established. The governance structure of GFCS includes the Partner Advisory Committee (PAC), which is GFCS's stakeholder engagement mechanism. The membership of the PAC allows for a broad participation of stakeholders. The European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT), the European Commission (EC), the Food and Agriculture Organization of the UN (FAO), the Global Water Partnership (GWP), the International Federation of Red Cross and Red Crescent Societies (IFRC), the International Union of Geodesy and Geophysics (IUGG), United Nations Environment Programme (UNEP), the United Nations Institute for Training and Research (UNITAR), the World Business Council for Sustainable Development (WBCSD), the World Food Programme (WFP) and WMO have already joined the PAC. Activities are being implemented in various countries in Africa, the Caribbean, Asia and Pacific Small Islands Developing States through flagship projects and activities in the four priority areas of GFCS to enable the development of a Proof of Concept. The focus at national level is on strengthening institutional capacities needed for development of capacities for co-design and co-production of climate services and their application in support of decision-making in climate sensitive

  19. Adopting a Resilience Practice Framework: A Case Study in What to Select and How to Implement

    Science.gov (United States)

    Antcliff, Greg; Mildon, Robyn; Baldwin, Laura; Michaux, Annette; Nay, Cherie

    2014-01-01

    This paper describes the collaborative application of three theoretical models for supporting service planning (Hunter, 2006), programme planning (Chorpita et al, 2005a), and implementation (Meyers et al, 2012) to develop and implement a Resilience Practice Framework (RPF). Specifically, we (1) describe a theory of change framework (Hunter, 2006)…

  20. Overview of parton distributions and the quantum chromodynamics (QCD) framework

    Energy Technology Data Exchange (ETDEWEB)

    Tung, Wu-Ki (Institute of Gas Technology, Chicago, IL (USA) Fermi National Accelerator Lab., Batavia, IL (USA))

    1990-08-01

    The perturbative QCD framework as the basis of the parton model is reviewed with emphasis on several issues pertinent to next-to-leading order (NLO) applications to a wide range of high energy processes. The current status of leading-order and NLO parton distributions is summarized and evaluated. Relevant issues and open questions for second-generation global analyses are discussed in order to provide an overview of topics to be covered by the Workshop. 15 refs., 6 figs., 1 tabs.

  1. Teachers Implementing Context-Based Teaching Materials: A Framework for Case-Analysis in Chemistry

    Science.gov (United States)

    Vos, Martin A. J.; Taconis, Ruurd; Jochems, Wim M. G.; Pilot, Albert

    2010-01-01

    We present a framework for analysing the interplay between context-based teaching material and teachers, and for evaluating the adequacy of the resulting implementation of context-based pedagogy in chemistry classroom practice. The development of the framework is described, including an account of its theoretical foundations. The framework needs…

  2. Teachers Implementing Context-Based Teaching Materials: A Framework for Case-Analysis in Chemistry

    Science.gov (United States)

    Vos, Martin A. J.; Taconis, Ruurd; Jochems, Wim M. G.; Pilot, Albert

    2010-01-01

    We present a framework for analysing the interplay between context-based teaching material and teachers, and for evaluating the adequacy of the resulting implementation of context-based pedagogy in chemistry classroom practice. The development of the framework is described, including an account of its theoretical foundations. The framework needs…

  3. Framework and Implementation for Improving Physics Essential Skills via Computer-Based Practice: Vector Math

    Science.gov (United States)

    Mikula, Brendon D.; Heckler, Andrew F.

    2017-01-01

    We propose a framework for improving accuracy, fluency, and retention of basic skills essential for solving problems relevant to STEM introductory courses, and implement the framework for the case of basic vector math skills over several semesters in an introductory physics course. Using an iterative development process, the framework begins with…

  4. a Framework for Distributed Mixed Language Scientific Applications

    Science.gov (United States)

    Quarrie, D. R.

    The Object Management Group has defined an architecture (CORBA) for distributed object applications based on an Object Request Broker and Interface Definition Language. This project builds upon this architecture to establish a framework for the creation of mixed language scientific applications. A prototype compiler has been written that generates FORTRAN 90 or Eiffel stubs and skeletons and the required C++ glue code from an input IDL file that specifies object interfaces. This generated code can be used directly for non-distributed mixed language applications or in conjunction with the C++ code generated from a commercial IDL compiler for distributed applications. A feasibility study is presently underway to see whether a fully integrated software development environment for distributed, mixed-language applications can be created by modifying the back-end code generator of a commercial CASE tool to emit IDL.

  5. An Effective Framework for Distributed Geospatial Query Processing in Grids

    Directory of Open Access Journals (Sweden)

    CHEN, B.

    2010-08-01

    Full Text Available The emergence of the Internet has greatly revolutionized the way that geospatial information is collected, managed, processed and integrated. Several important research issues must be addressed for distributed geospatial applications. First, the performance of geospatial applications needs to be considered in the Internet environment. In this regard, the Grid, as an effective distributed computing paradigm, is a good choice. The Grid uses a series of middleware components to interconnect and merge various distributed resources into a super-computer capable of high-performance computation. Secondly, it is necessary to ensure the secure use of independent geospatial applications in the Internet environment. The Grid provides exactly this utility of secure access to distributed geospatial resources. Additionally, it makes good sense to overcome the heterogeneity between individual geospatial information systems on the Internet. The Open Geospatial Consortium (OGC) proposes a number of generalized geospatial standards, e.g. OGC Web Services (OWS), to achieve interoperable access to geospatial applications. The OWS solution is feasible and widely adopted by both the academic and industry communities. Therefore, we propose an integrated framework that incorporates OWS standards into Grids. Upon this framework, distributed geospatial queries can be performed in an interoperable, high-performance and secure Grid environment.

  6. A Generalized Cauchy Distribution Framework for Problems Requiring Robust Behavior

    Science.gov (United States)

    Carrillo, Rafael E.; Aysal, Tuncer C.; Barner, Kenneth E.

    2010-12-01

    Statistical modeling is at the heart of many engineering problems. The importance of statistical modeling emanates not only from the desire to accurately characterize stochastic events, but also from the fact that distributions are the central models utilized to derive sample processing theories and methods. The generalized Cauchy distribution (GCD) family has a closed-form pdf expression across the whole family as well as algebraic tails, which makes it suitable for modeling many real-life impulsive processes. This paper develops a GCD theory-based approach that allows challenging problems to be formulated in a robust fashion. Notably, the proposed framework subsumes generalized Gaussian distribution (GGD) family-based developments, thereby guaranteeing performance improvements over traditional GCD-based problem formulation techniques. This robust framework can be adapted to a variety of applications in signal processing. As examples, we formulate four practical applications under this framework: (1) filtering for power line communications, (2) estimation in sensor networks with noisy channels, (3) reconstruction methods for compressed sensing, and (4) fuzzy clustering.
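
    As a minimal illustration of the kind of robust formulation the GCD framework motivates (not the paper's own estimator), the sketch below contrasts a least-squares location estimate with an M-estimate under a Cauchy-type, algebraic-tailed cost that down-weights impulsive outliers; the scale parameter gamma and the sample data are hypothetical.

        import numpy as np
        from scipy.optimize import minimize_scalar

        # Hypothetical data: a constant signal plus Gaussian noise and a few impulsive outliers.
        rng = np.random.default_rng(0)
        x = 2.0 + 0.1 * rng.standard_normal(200)
        x[:5] += 50.0  # impulsive contamination

        def cauchy_cost(theta, data, gamma=0.1):
            # Algebraic-tailed (Lorentzian) cost: log(1 + (e/gamma)^2) grows slowly for large errors.
            e = data - theta
            return np.sum(np.log1p((e / gamma) ** 2))

        ls_estimate = x.mean()  # least-squares (Gaussian-model) location estimate
        robust = minimize_scalar(cauchy_cost, args=(x,), bounds=(x.min(), x.max()), method="bounded")

        print(f"least squares: {ls_estimate:.3f}  robust (Cauchy cost): {robust.x:.3f}")

    The robust estimate stays near 2.0 despite the outliers, whereas the sample mean is pulled away; this is the qualitative behaviour that GCD-based formulations exploit.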

  7. A Generalized Cauchy Distribution Framework for Problems Requiring Robust Behavior

    Directory of Open Access Journals (Sweden)

    Carrillo RafaelE

    2010-01-01

    Full Text Available Statistical modeling is at the heart of many engineering problems. The importance of statistical modeling emanates not only from the desire to accurately characterize stochastic events, but also from the fact that distributions are the central models utilized to derive sample processing theories and methods. The generalized Cauchy distribution (GCD) family has a closed-form pdf expression across the whole family as well as algebraic tails, which makes it suitable for modeling many real-life impulsive processes. This paper develops a GCD theory-based approach that allows challenging problems to be formulated in a robust fashion. Notably, the proposed framework subsumes generalized Gaussian distribution (GGD) family-based developments, thereby guaranteeing performance improvements over traditional GCD-based problem formulation techniques. This robust framework can be adapted to a variety of applications in signal processing. As examples, we formulate four practical applications under this framework: (1) filtering for power line communications, (2) estimation in sensor networks with noisy channels, (3) reconstruction methods for compressed sensing, and (4) fuzzy clustering.

  8. Implementing Value-Based Payment Reform: A Conceptual Framework and Case Examples.

    Science.gov (United States)

    Conrad, Douglas A; Vaughn, Matthew; Grembowski, David; Marcus-Smith, Miriam

    2016-08-01

    This article develops a conceptual framework for implementation of value-based payment (VBP) reform and then draws on that framework to systematically examine six distinct multi-stakeholder coalition VBP initiatives in three different regions of the United States. The VBP initiatives deploy the following payment models: reference pricing, "shadow" primary care capitation, bundled payment, pay for performance, shared savings within accountable care organizations, and global payment. The conceptual framework synthesizes prior models of VBP implementation. It describes how context, project objectives, payment and care delivery strategies, and the barriers and facilitators to translating strategy into implementation affect VBP implementation and value for patients. We next apply the framework to six case examples of implementation, and conclude by discussing the implications of the case examples and the conceptual framework for future practice and research.

  9. Implementation Framework of Green Building for Malaysian Government Building: KKR2 Study

    Directory of Open Access Journals (Sweden)

    Sharif Shiela

    2016-01-01

    Full Text Available The purpose of this study is to develop an Implementation Framework of Green Building for government projects. Implementation of green building design is very important in Malaysia for conducting and executing green government projects. The research was intended to answer questions about which factors are involved in the development of the framework, whether significant relationships exist between the factors in the Implementation Framework, and what those relationships are. A total of 30 respondents were selected from multiple levels of the KKR2 project team, including engineers, assistant engineers, technical assistants, stakeholders, contractors and consultants. In conclusion, the study answered the pertinent questions regarding the factors involved in the development of the Implementation Framework of Green Building through quantitative research and hypothesis testing.

  10. CORBA-based distributed software framework for the NIF integrated computer control system

    Energy Technology Data Exchange (ETDEWEB)

    Stout, E.A. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, CA 94550 (United States)], E-mail: stout6@llnl.gov; Carey, R.W.; Estes, C.M.; Fisher, J.M.; Lagin, L.J.; Mathisen, D.G.; Reynolds, C.A.; Sanchez, R.J. [Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, CA 94550 (United States)

    2008-04-15

    The National Ignition Facility (NIF), currently under construction at the Lawrence Livermore National Laboratory, is a stadium-sized facility containing a 192-beam, 1.8 MJ, 500 TW, ultra-violet laser system together with a 10-meter diameter target chamber with room for nearly 100 experimental diagnostics. The NIF is operated by the Integrated Computer Control System (ICCS) which is a scalable, framework-based control system distributed over 800 computers throughout the NIF. The framework provides templates and services at multiple levels of abstraction for the construction of software applications that communicate via CORBA (Common Object Request Broker Architecture). Object-oriented software design patterns are implemented as templates and extended by application software. Developers extend the framework base classes to model the numerous physical control points and implement specializations of common application behaviors. An estimated 140,000 software objects, each individually addressable through CORBA, will be active at full scale. Many of these objects have persistent configuration information stored in a database. The configuration data is used to initialize the objects at system start-up. Centralized server programs that implement events, alerts, reservations, data archival, name service, data access, and process management provide common system wide services. At the highest level, a model-driven, distributed shot automation system provides a flexible and scalable framework for automatic sequencing of workflow for control and monitoring of NIF shots. The shot model, in conjunction with data defining the parameters and goals of an experiment, describes the steps to be performed by each subsystem in order to prepare for and fire a NIF shot. Status and usage of this distributed framework are described.
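
    The extension pattern described above (framework base classes specialized to model physical control points, initialized from persistent configuration) can be pictured schematically as follows; the class, attribute and device names are hypothetical, and the example is plain Python rather than the actual CORBA/ICCS code.

        class ControlPoint:
            """Framework base class: common behaviour for an addressable control point."""
            def __init__(self, name, config):
                self.name = name
                self.config = config          # persistent configuration, e.g. loaded from a database
                self.apply_configuration()

            def apply_configuration(self):    # called at system start-up
                for key, value in self.config.items():
                    setattr(self, key, value)

            def status(self):
                raise NotImplementedError

        class MotorizedMirror(ControlPoint):
            """Application specialization modelling one physical device."""
            def move_to(self, angle):
                self.angle = angle            # a real system would command hardware here

            def status(self):
                return {"name": self.name, "angle": getattr(self, "angle", None)}

        mirror = MotorizedMirror("beamline-07/mirror-3", {"angle": 0.0})
        mirror.move_to(12.5)
        print(mirror.status())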

  11. CORBA-Based Distributed Software Framework for the NIF Integrated Computer Control System

    Energy Technology Data Exchange (ETDEWEB)

    Stout, E A; Carey, R W; Estes, C M; Fisher, J M; Lagin, L J; Mathisen, D G; Reynolds, C A; Sanchez, R J

    2007-11-20

    The National Ignition Facility (NIF), currently under construction at the Lawrence Livermore National Laboratory, is a stadium-sized facility containing a 192-beam, 1.8 Megajoule, 500-Terawatt, ultra-violet laser system together with a 10-meter diameter target chamber with room for nearly 100 experimental diagnostics. The NIF is operated by the Integrated Computer Control System (ICCS) which is a scalable, framework-based control system distributed over 800 computers throughout the NIF. The framework provides templates and services at multiple levels of abstraction for the construction of software applications that communicate via CORBA (Common Object Request Broker Architecture). Object-oriented software design patterns are implemented as templates and extended by application software. Developers extend the framework base classes to model the numerous physical control points and implement specializations of common application behaviors. An estimated 140 thousand software objects, each individually addressable through CORBA, will be active at full scale. Many of these objects have persistent configuration information stored in a database. The configuration data is used to initialize the objects at system start-up. Centralized server programs that implement events, alerts, reservations, data archival, name service, data access, and process management provide common system wide services. At the highest level, a model-driven, distributed shot automation system provides a flexible and scalable framework for automatic sequencing of work-flow for control and monitoring of NIF shots. The shot model, in conjunction with data defining the parameters and goals of an experiment, describes the steps to be performed by each subsystem in order to prepare for and fire a NIF shot. Status and usage of this distributed framework are described.

  12. Making sense of complexity in context and implementation: the Context and Implementation of Complex Interventions (CICI) framework.

    Science.gov (United States)

    Pfadenhauer, Lisa M; Gerhardus, Ansgar; Mozygemba, Kati; Lysdahl, Kristin Bakke; Booth, Andrew; Hofmann, Bjørn; Wahlster, Philip; Polus, Stephanie; Burns, Jacob; Brereton, Louise; Rehfuess, Eva

    2017-02-15

    The effectiveness of complex interventions, as well as their success in reaching relevant populations, is critically influenced by their implementation in a given context. Current conceptual frameworks often fail to address context and implementation in an integrated way and, where addressed, they tend to focus on organisational context and are mostly concerned with specific health fields. Our objective was to develop a framework to facilitate the structured and comprehensive conceptualisation and assessment of context and implementation of complex interventions. The Context and Implementation of Complex Interventions (CICI) framework was developed in an iterative manner and underwent extensive application. An initial framework based on a scoping review was tested in rapid assessments, revealing inconsistencies with respect to the underlying concepts. Thus, pragmatic utility concept analysis was undertaken to advance the concepts of context and implementation. Based on these findings, the framework was revised and applied in several systematic reviews, one health technology assessment (HTA) and one applicability assessment of very different complex interventions. Lessons learnt from these applications and from peer review were incorporated, resulting in the CICI framework. The CICI framework comprises three dimensions-context, implementation and setting-which interact with one another and with the intervention dimension. Context comprises seven domains (i.e., geographical, epidemiological, socio-cultural, socio-economic, ethical, legal, political); implementation consists of five domains (i.e., implementation theory, process, strategies, agents and outcomes); setting refers to the specific physical location in which the intervention is put into practice. The intervention and the way it is implemented in a given setting and context can occur on a micro, meso and macro level. Tools to operationalise the framework comprise a checklist, data extraction tools for

  13. A Parameterized Design Framework for Hardware Implementation of Particle Filters

    Science.gov (United States)

    2008-03-01

    …synchronization operations. 2.2 Design Framework: Figure 2 shows the overall design framework. We use Xilinx's System Generator for design and functional verification and the Xilinx ISE tool-set for synthesis. Xilinx System Generator provides a hardware library that consists of various architectural units, such as… [Figure 2 block diagram: interface modules, parameterized HDL libraries, Xilinx System Generator modules, parameterized particle filter system, synthesis and code generation.]
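
    For readers unfamiliar with the particle filter operations such a hardware framework parameterizes, the following generic software sketch shows the weight-update and systematic-resampling steps; it is illustrative only, uses a hypothetical measurement model, and is unrelated to the report's HDL libraries.

        import numpy as np

        def systematic_resample(particles, weights, rng):
            """Draw N particles with probability proportional to their weights (systematic scheme)."""
            n = len(particles)
            positions = (rng.random() + np.arange(n)) / n
            cumulative = np.cumsum(weights)
            indices = np.searchsorted(cumulative, positions)
            return particles[indices]

        rng = np.random.default_rng(1)
        particles = rng.normal(0.0, 1.0, size=1000)                 # hypothetical state hypotheses
        observation = 0.8
        likelihood = np.exp(-0.5 * (observation - particles) ** 2)  # hypothetical measurement model
        weights = likelihood / likelihood.sum()
        particles = systematic_resample(particles, weights, rng)
        print("posterior mean estimate:", particles.mean())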

  14. Telemedicine: what framework, what levels of proof, implementation rules.

    Science.gov (United States)

    Zannad, Faiez; Maugendre, Philippe; Audry, Antoine; Avril, Carole; Blaise, Lucile; Blin, Olivier; Burnel, Philippe; Falise-Mirat, Béatrice; Girault, Danièle; Giri, Isabelle; Goehrs, Jean-Marie; Lassale, Catherine; Le Meur, Roland; Leurent, Pierre; Ratignier-Carbonneil, Christelle; Rossignol, Patrick; Satonnet, Evelyne; Simon, Pierre; Treluyer, Laurent

    2014-01-01

    The concept of telemedicine was formalised in France in the 2009 "Hospital, patients, health territories" (loi hôpital, patients, santé, territoire) law and the 2010 decree through which it was applied. Many experiments have been carried out and the regulatory institutions (Ministry, Regional Health Agency [Agence régionale de santé, ARS], French National Health Authority [Haute autorité de santé, HAS], etc.) have issued various guidance statements and recommendations on its organisation and on the expectations of its evaluation. With this background, the round table wanted to produce recommendations on different areas of medical telemonitoring (the role of telemonitoring, the regulatory system, the principles for assessment, methods of use and conditions for sustained and seamless deployment). Whilst many studies carried out on new medical telemonitoring approaches have led to the postulate that it offers benefit, both clinically and in terms of patient quality of life, more information is needed to demonstrate its impact on the organisation of healthcare and the associated medico-economic benefit (criteria, methods, resources). Similarly, contractual frameworks for deployment of telemonitoring do exist, although they are complicated and involve many different stakeholders (Director General of the Care Offering [Direction générale de l'offre de soins, DGOS], ARS, HAS, Agency for Shared Health Information Systems [Agence des systèmes d'information partagés de santé, ASIP], French National Data Protection Commission [Commission nationale informatique et libertés, CNIL], French National Medical Council [Conseil national de l'Ordre des médecins, CNOM], etc.) that would benefit from a shared approach and seamless exchange between the partners involved. The current challenge is also to define the conditions required to validate a stable economic model in order to promote organisational change. One topical issue is placing the emphasis on its evaluation and

  15. A Distributed Python HPC Framework: ODIN, PyTrilinos, & Seamless

    Energy Technology Data Exchange (ETDEWEB)

    Grant, Robert [Enthought, Inc., Austin, TX (United States)

    2015-11-23

    Under this grant, three significant software packages were developed or improved, all with the goal of improving the ease-of-use of HPC libraries. The first component is a Python package, named DistArray (originally named Odin), that provides a high-level interface to distributed array computing. This interface is based on the popular and widely used NumPy package and is integrated with the IPython project for enhanced interactive parallel distributed computing. The second Python package is the Distributed Array Protocol (DAP) that enables separate distributed array libraries to share arrays efficiently without copying or sending messages. If a distributed array library supports the DAP, it is then automatically able to communicate with any other library that also supports the protocol. This protocol allows DistArray to communicate with the Trilinos library via PyTrilinos, which was also enhanced during this project. A third package, PyTrilinos, was extended to support distributed structured arrays (in addition to the unstructured arrays of its original design), allow more flexible distributed arrays (i.e., the restriction to double precision data was lifted), and implement the DAP. DAP support includes both exporting the protocol so that external packages can use distributed Trilinos data structures, and importing the protocol so that PyTrilinos can work with distributed data from external packages.
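
    As a rough illustration of the kind of distribution metadata such a protocol has to convey (not the actual Distributed Array Protocol structures), the sketch below shows a one-dimensional block distribution and the global-to-local index mapping that each process needs; the sizes and rank counts are hypothetical.

        # Hypothetical 1-D block distribution of a global array over several processes.
        def block_bounds(global_size, nprocs, rank):
            """Return the [start, stop) slice of the global index space owned by `rank`."""
            base, extra = divmod(global_size, nprocs)
            start = rank * base + min(rank, extra)
            stop = start + base + (1 if rank < extra else 0)
            return start, stop

        def owner_and_local(global_index, global_size, nprocs):
            """Map a global index to (owning rank, local index)."""
            for rank in range(nprocs):
                start, stop = block_bounds(global_size, nprocs, rank)
                if start <= global_index < stop:
                    return rank, global_index - start
            raise IndexError(global_index)

        print(block_bounds(10, 3, 0))        # (0, 4)
        print(owner_and_local(7, 10, 3))     # (2, 0)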

  16. A Framework for Distributed Dynamic Load Balancing in Heterogeneous Cluster

    Directory of Open Access Journals (Sweden)

    Neeraj Nehra

    2007-01-01

    Full Text Available Distributed dynamic load balancing (DDLB) is an important system function intended to distribute workload among available processors to improve the throughput and/or execution times of a parallel computer in cluster computing. Instead of balancing the load in a cluster by process migration, or by moving an entire process to a less loaded computer, we attempt to balance load by splitting processes into separate jobs and then balancing them across nodes. To achieve this, we use mobile agents (MA) to distribute load among nodes in a cluster. In this study, a multi-agent framework for load balancing in a heterogeneous cluster is given. The total load on a node is calculated using queue length, which is measured as the total number of processes in the queue. We introduce the types of agents along with the policies needed to meet the requirements of the proposed load balancing. Different metrics are used to compare the load-balancing mechanism with existing message-passing technology. The experiment is carried out on a cluster of PCs divided into multiple LANs using PMADE (Platform for Mobile Agent Distribution and Execution). Preliminary experimental results demonstrate that the proposed framework is more effective than existing ones.
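
    A trivial sketch of the dispatch rule implied above (send the next job to the node with the shortest process queue) is given below; the node names and queue lengths are hypothetical, and the real framework uses mobile agents rather than a central dictionary.

        # Hypothetical per-node load measured as queue length (number of processes waiting).
        queue_lengths = {"node-a": 12, "node-b": 4, "node-c": 9}

        def least_loaded(loads):
            """Return the node with the shortest queue."""
            return min(loads, key=loads.get)

        def dispatch(job, loads):
            target = least_loaded(loads)
            loads[target] += 1           # the dispatched job joins the target's queue
            return target

        print(dispatch("job-42", queue_lengths))   # node-b
        print(queue_lengths)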

  17. A framework for effective implementation of lean production in Small and Medium-sized Enterprises

    Directory of Open Access Journals (Sweden)

    Amine Belhadi

    2016-09-01

    Full Text Available Purpose: The present paper aims to develop an effective framework that includes all the components necessary for implementing lean production properly in Small and Medium-sized Enterprises. Design/methodology/approach: The paper begins with a review of the main existing frameworks for lean implementation in order to highlight a shortcoming in the literature: the lack of a suitable framework for small companies. To overcome this gap, data on successful lean implementation initiatives were collected using a multiple case study approach. These initiatives were juxtaposed in order to develop a new, practical and effective framework that includes all the components (process, tools, success factors) necessary to implement lean in Small and Medium-sized Enterprises. Findings: The proposed framework makes several significant contributions. First, it overcomes the limitations of the existing frameworks by offering consultants, researchers and organizations an effective framework for lean implementation in SMEs that allows SMEs to benefit from the competitive advantages gained through lean. Second, it brings together a set of the most essential and critical elements of lean implementation commonly used by SMEs, derived from their practical experience of lean implementation. Finally, it highlights the successful experiences of small companies in implementing lean programmes and thereby shows that lean can deliver relevant results even for SMEs. Research limitations/implications: The proposed framework has a number of limitations and invites extension in further research: although it was derived from the practical experience of SMEs, the proposed framework has not yet been supported by practical implementation. On the other hand, even though the elements of the proposed framework are drawn from the practical experience of four SMEs, the identified elements still need to be generalized and enriched by conducting

  18. MAPI: a software framework for distributed biomedical applications

    Directory of Open Access Journals (Sweden)

    Karlsson Johan

    2013-01-01

    Full Text Available Abstract Background The amount of web-based resources (databases, tools, etc.) in biomedicine has increased, but the integrated usage of those resources is complex due to differences in access protocols and data formats. However, distributed data processing is becoming inevitable in several domains, in particular in biomedicine, where researchers face rapidly increasing data sizes. This big data is difficult to process locally because of the large processing, memory and storage capacity required. Results This manuscript describes a framework, called MAPI, which provides a uniform representation of resources available over the Internet, in particular for Web Services. The framework enhances their interoperability and collaborative use by enabling a uniform and remote access. The framework functionality is organized in modules that can be combined and configured in different ways to fulfil concrete development requirements. Conclusions The framework has been tested in the biomedical application domain where it has been a base for developing several clients that are able to integrate different web resources. The MAPI binaries and documentation are freely available at http://www.bitlab-es.com/mapi under the Creative Commons Attribution-No Derivative Works 2.5 Spain License. The MAPI source code is available by request (GPL v3 license).

  19. A scalable distribution network risk evaluation framework via symbolic dynamics.

    Directory of Open Access Journals (Sweden)

    Kai Yuan

    Full Text Available Evaluations of electric power distribution network risks must address the problems of incomplete information and changing dynamics. A risk evaluation framework should be adaptable to a specific situation and an evolving understanding of risk. This study investigates the use of symbolic dynamics to abstract raw data. After introducing symbolic dynamics operators, Kolmogorov-Sinai entropy and Kullback-Leibler relative entropy are used to quantitatively evaluate relationships between risk sub-factors and main factors. For layered risk indicators, where the factors are categorized into four main factors - device, structure, load and special operation - a merging algorithm using operators to calculate the risk factors is discussed. Finally, an example from the Sanya Power Company is given to demonstrate the feasibility of the proposed method. Distribution networks are exposed and can be affected by many things. The topology and the operating mode of a distribution network are dynamic, so the faults and their consequences are probabilistic.
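
    To make the entropy quantities concrete, the sketch below computes the Kullback-Leibler relative entropy between two discrete symbol distributions, the kind of comparison the framework uses to relate risk sub-factors to main factors; the symbol probabilities are hypothetical.

        import numpy as np

        def kl_divergence(p, q):
            """Kullback-Leibler relative entropy D(p || q) in bits for discrete distributions."""
            p = np.asarray(p, dtype=float)
            q = np.asarray(q, dtype=float)
            mask = p > 0                      # terms with p_i = 0 contribute nothing
            return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

        # Hypothetical symbol distributions for a risk sub-factor and its main factor.
        sub_factor  = [0.50, 0.30, 0.15, 0.05]
        main_factor = [0.40, 0.35, 0.20, 0.05]
        print(f"D(sub || main) = {kl_divergence(sub_factor, main_factor):.4f} bits")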

  20. A Scalable Distribution Network Risk Evaluation Framework via Symbolic Dynamics

    Science.gov (United States)

    Yuan, Kai; Liu, Jian; Liu, Kaipei; Tan, Tianyuan

    2015-01-01

    Background Evaluations of electric power distribution network risks must address the problems of incomplete information and changing dynamics. A risk evaluation framework should be adaptable to a specific situation and an evolving understanding of risk. Methods This study investigates the use of symbolic dynamics to abstract raw data. After introducing symbolic dynamics operators, Kolmogorov-Sinai entropy and Kullback-Leibler relative entropy are used to quantitatively evaluate relationships between risk sub-factors and main factors. For layered risk indicators, where the factors are categorized into four main factors – device, structure, load and special operation – a merging algorithm using operators to calculate the risk factors is discussed. Finally, an example from the Sanya Power Company is given to demonstrate the feasibility of the proposed method. Conclusion Distribution networks are exposed and can be affected by many things. The topology and the operating mode of a distribution network are dynamic, so the faults and their consequences are probabilistic. PMID:25789859

  1. The Consolidated Framework for Implementation Research (CFIR): a useful theoretical framework for guiding and evaluating a guideline implementation process in a hospital-based nursing practice.

    Science.gov (United States)

    Breimaier, Helga E; Heckemann, Birgit; Halfens, Ruud J G; Lohrmann, Christa

    2015-01-01

    Implementing clinical practice guidelines (CPGs) in healthcare settings is a complex intervention involving both independent and interdependent components. Although the Consolidated Framework for Implementation Research (CFIR) has never been evaluated in a practical context, it appeared to be a suitable theoretical framework to guide an implementation process. The aim of this study was to evaluate the comprehensiveness, applicability and usefulness of the CFIR in the implementation of a fall-prevention CPG in nursing practice to improve patient care in an Austrian university teaching hospital setting. The evaluation of the CFIR was based on (1) team-meeting minutes, (2) the main investigator's research diary, containing a record of a before-and-after, mixed-methods study design embedded in a participatory action research (PAR) approach for guideline implementation, and (3) an analysis of qualitative and quantitative data collected from graduate and assistant nurses in two Austrian university teaching hospital departments. The CFIR was used to organise data per and across time point(s) and assess their influence on the implementation process, resulting in implementation and service outcomes. Overall, the CFIR could be demonstrated to be a comprehensive framework for the implementation of a guideline into a hospital-based nursing practice. However, the CFIR did not account for some crucial factors during the planning phase of an implementation process, such as consideration of stakeholder aims and wishes/needs when implementing an innovation, pre-established measures related to the intended innovation and pre-established strategies for implementing an innovation. For the CFIR constructs reflecting & evaluating and engaging, a more specific definition is recommended. The framework and its supplements could easily be used by researchers, and their scope was appropriate for the complexity of a prospective CPG-implementation project. The CFIR facilitated qualitative data

  2. A classification framework for clinical information system implementation in hospitals.

    NARCIS (Netherlands)

    Meulendijks, A.; Batenburg, R.; Wetering, R. van de

    2012-01-01

    In the last decade, many information system (IS) implementations took place in the healthcare organisations. Mainstream reasons for this evolvement are the increase of quality and safety of care, and reducing costs. As in many other sectors IS implementations in healthcare are complex, and confronte

  3. Framework of Distributed Coupled Atmosphere-Ocean-Wave Modeling System

    Institute of Scientific and Technical Information of China (English)

    WEN Yuanqiao; HUANG Liwen; DENG Jian; ZHANG Jinfeng; WANG Sisi; WANG Lijun

    2006-01-01

    In order to research the interactions between the atmosphere and ocean as well as their important role in the intensive weather systems of coastal areas, and to improve the forecasting ability of the hazardous weather processes of coastal areas, a coupled atmosphere-ocean-wave modeling system has been developed.The agent-based environment framework for linking models allows flexible and dynamic information exchange between models. For the purpose of flexibility, portability and scalability, the framework of the whole system takes a multi-layer architecture that includes a user interface layer, computational layer and service-enabling layer. The numerical experiment presented in this paper demonstrates the performance of the distributed coupled modeling system.

  4. Implementing accountability for reasonableness framework at district level in Tanzania

    DEFF Research Database (Denmark)

    Maluka, Stephen; Kamuzora, Peter; SanSebastián, Miguel

    2011-01-01

    Despite the growing importance of the Accountability for Reasonableness (A4R) framework in priority setting worldwide, there is still an inadequate understanding of the processes and mechanisms underlying its influence on legitimacy and fairness, as conceived and reflected in service management p...

  5. First Thoughts on Implementing the Framework for Information Literacy

    Science.gov (United States)

    Jacobson, Trudi E.; Gibson, Craig

    2015-01-01

    Following the action of the ACRL Board in February 2015 in accepting the "Framework for Information Literacy for Higher Education" as one of the "constellation of documents" that promote and guide information literacy instruction and program development, discussion in the library community continues about steps in implementing…

  6. Implementing the C3 Framework: Monitoring the Instructional Shifts

    Science.gov (United States)

    Herczog, Michelle M.

    2014-01-01

    The College, Career, and Civic Life (C3) Framework for Social Studies State Standards calls upon social studies teachers to enhance the rigor of civics, economics, geography, history and the other social studies disciplines while building the critical thinking, problem solving, and participatory skills of students to help them become actively…

  7. Combined use of the Consolidated Framework for Implementation Research (CFIR) and the Theoretical Domains Framework (TDF): a systematic review.

    Science.gov (United States)

    Birken, Sarah A; Powell, Byron J; Presseau, Justin; Kirk, M Alexis; Lorencatto, Fabiana; Gould, Natalie J; Shea, Christopher M; Weiner, Bryan J; Francis, Jill J; Yu, Yan; Haines, Emily; Damschroder, Laura J

    2017-01-05

    Over 60 implementation frameworks exist. Using multiple frameworks may help researchers to address multiple study purposes, levels, and degrees of theoretical heritage and operationalizability; however, using multiple frameworks may result in unnecessary complexity and redundancy if doing so does not address study needs. The Consolidated Framework for Implementation Research (CFIR) and the Theoretical Domains Framework (TDF) are both well-operationalized, multi-level implementation determinant frameworks derived from theory. As such, the rationale for using the frameworks in combination (i.e., CFIR + TDF) is unclear. The objective of this systematic review was to elucidate the rationale for using CFIR + TDF by (1) describing studies that have used CFIR + TDF, (2) describing how they used CFIR + TDF, and (3) describing their stated rationale for using CFIR + TDF. We undertook a systematic review to identify studies that mentioned both the CFIR and the TDF, were written in English, were peer-reviewed, and reported either a protocol or results of an empirical study in MEDLINE/PubMed, PsycInfo, Web of Science, or Google Scholar. We then abstracted data into a matrix and analyzed it qualitatively, identifying salient themes. We identified five protocols and seven completed studies that used CFIR + TDF. CFIR + TDF was applied to studies in several countries, to a range of healthcare interventions, and at multiple intervention phases; used many designs, methods, and units of analysis; and assessed a variety of outcomes. Three studies indicated that using CFIR + TDF addressed multiple study purposes. Six studies indicated that using CFIR + TDF addressed multiple conceptual levels. Four studies did not explicitly state their rationale for using CFIR + TDF. Differences in the purposes that authors of the CFIR (e.g., comprehensive set of implementation determinants) and the TDF (e.g., intervention development) propose help to justify the use of CFIR

  8. Rethinking Sustainability, Scaling Up, and Enabling Environment: A Framework for Their Implementation in Drinking Water Supply

    Directory of Open Access Journals (Sweden)

    Urooj Q. Amjad

    2015-04-01

    Full Text Available The terms sustainability, scaling up, and enabling environment are inconsistently used in implementing water supply projects. To clarify these terms we develop a framework based on Normalization Process Theory, and apply the framework to a hypothetical water supply project in schools. The resulting framework provides guidance on how these terms could be implemented and analyzed in water supply projects. We conclude that effective use of the terms sustainability, scaling up, and enabling environment would focus on purpose, process, and perspective. This is the first known attempt to analyze the implementation of the three terms together in the context of water supply services.

  9. The Spatial Development Framework for Implementation of National ...

    African Journals Online (AJOL)

    kagoyire

    …methodology for the NUP implementation in Rwanda. … economic growth which is sustainable and guided by green economic criteria, whereby urban … In this meeting the next step was also conceptually presented (A5), the SMCE analysis.

  10. Establishing an intrapreneurial orientation as strategy: A framework for implementation

    Directory of Open Access Journals (Sweden)

    H. Jacobs

    2001-12-01

    Full Text Available This paper describes a study aimed at increasing an organisation's ability to implement a strategy for establishing an intrapreneurial orientation effectively. Establishing an intrapreneurial orientation will be treated from a strategic management point of view, with the emphasis on the implementation phase of strategic management. As such, this study seeks to integrate theory and practice from the fields of strategic management and entrepreneurship.

  11. A CONCEPTUAL FRAMEWORK OF DISTRIBUTIVE JUSTICE IN ISLAMIC ECONOMICS

    Directory of Open Access Journals (Sweden)

    Shafinah Begum Abdul Rahim

    2015-06-01

    Full Text Available …political, behavioural and social sciences, both mainstream and Islamic. Given its increasing relevance to the global village we share and the intensity of socio-economic problems invariably related to the distribution of resources amongst us, this work is aimed at adding value through a deeper understanding and appreciation of the justice placed by the Syariah in all domains of our economic lives. The existing works in this area appear to lean mostly towards the redistributive mechanisms available in the revealed knowledge. Hence, a comprehensive analysis of the notion of distributive justice, translated from the theoretical level into practical terms, is expected to contribute significantly to policymakers committed to finding permanent solutions to economic problems, especially in the Muslim world. It is a modest yet serious attempt to bridge the gap between distributive justice in letter and in spirit as clearly ordained in the Holy Quran. The entire analysis is based on critical reviews and appraisals of all relevant literature on distributive justice in Islamic Economics. The final product is a conceptual framework that can be used as a blueprint for establishing the notion of justice in the distribution of economic resources, i.e. income and wealth, as aspired to by the Syariah.

  12. Qualitative study on the implementation of professional pharmacy services in Australian community pharmacies using framework analysis.

    Science.gov (United States)

    Moullin, Joanna C; Sabater-Hernández, Daniel; Benrimoj, Shalom I

    2016-08-25

    Multiple studies have explored the implementation process and influences, however it appears there is no study investigating these influences across the stages of implementation. Community pharmacy is attempting to implement professional services (pharmaceutical care and other health services). The use of implementation theory may assist the achievement of widespread provision, support and integration. The objective was to investigate professional service implementation in community pharmacy to contextualise and advance the concepts of a generic implementation framework previously published. Purposeful sampling was used to investigate implementation across a range of levels of implementation in community pharmacies in Australia. Twenty-five semi-structured interviews were conducted and analysed using a framework methodology. Data was charted using implementation stages as overarching themes and each stage was thematically analysed, to investigate the implementation process, the influences and their relationships. Secondary analyses were performed of the factors (barriers and facilitators) using an adapted version of the Consolidated Framework for Implementation Research (CFIR), and implementation strategies and interventions, using the Expert Recommendations for Implementing Change (ERIC) discrete implementation strategy compilation. Six stages emerged, labelled as development or discovery, exploration, preparation, testing, operation and sustainability. Within the stages, a range of implementation activities/steps and five overarching influences (pharmacy's direction and impetus, internal communication, staffing, community fit and support) were identified. The stages and activities were not applied strictly in a linear fashion. There was a trend whereby the greater the number of activities considered, the greater the apparent integration into the pharmacy organization. Implementation factors varied over the implementation stages, and additional factors were added

  13. Framework for measuring the sustainability performance of ecodesign implementation

    DEFF Research Database (Denmark)

    Rodrigues, Vinicius Picanco; Pigosso, Daniela Cristina Antelmi; McAloone, Tim C.

    Companies and academic studies consistently report several potential business benefits gained from ecodesign implementation, such as increased innovation potential, development of new markets and business models, reduction in environmental liability, risks and costs, and improvement of organizational brand and legal compliance, among others. However, a number of challenges still hamper corporate adoption of ecodesign, mainly regarding the capture and measurement of the estimated business benefits. Furthermore, ecodesign efforts have been primarily evaluated in terms of environmental performance and product-related (technical) measures, such as shape, material and energy consumption. Because the ecodesign business benefits go beyond pure environmental performance and its implementation should follow a consistent process-oriented integration, an approach based on the triple…

  14. [Migration policies and population distribution in Mexico: regional implementation and impact].

    Science.gov (United States)

    Reyna Bernal, A

    1991-01-01

    The author begins by analyzing the legal framework of migration and population redistribution policies in Mexico since 1973, when the General Law on Population was passed. In that framework, she points out several institutional deficiencies and contradictions. In the second section, she discusses demographic planning in this field, and conducts a follow-up on implementation based on reports from the institutions involved and official documents. To conclude, she examines practical applications of policy instruments, comparing the behavior of governmental variables that had a bearing on the phenomenon before and after the implementation of said policy (1970 to 1990). One of her findings is that the basic elements making population policies on migration operational did not move in the desired direction, i.e., to modify migratory and population distribution so as to achieve expected goals. Despite the continuity of population policies on migration, the implementation of such policies still poses diverse obstacles.

  15. Establishing a framework to implement 4D XCAT Phantom for 4D radiotherapy research

    Directory of Open Access Journals (Sweden)

    Raj K Panta

    2012-01-01

    Conclusions: An integrated computer program has been developed to generate, review, analyse, process, and export the 4D XCAT images. A framework has been established to implement the 4D XCAT phantom for 4D RT research.

  16. Framework and guidelines for implementing the co-management approach: volume I : context, concept and principles

    National Research Council Canada - National Science Library

    1999-01-01

    The purpose of this document is to develop a common understanding of the co-management approach in fisheries management and to provide an overview of the various frameworks used for its implementation...

  17. Design Of Real-Time Implementable Distributed Suboptimal Control: An LQR Perspective

    KAUST Repository

    Jaleel, Hassan

    2017-09-29

    We propose a framework for multiagent systems in which the agents compute their control actions in real time, based on local information only. The novelty of the proposed framework is that the process of computing a suboptimal control action is divided into two phases: an offline phase and an online phase. In the offline phase, an approximate problem is formulated with a cost function that is close to the optimal cost in some sense and is distributed, i.e., the costs of non-neighboring nodes are not coupled. This phase is centralized and is completed before the deployment of the system. In the online phase, the approximate problem is solved in real time by implementing any efficient distributed optimization algorithm. To quantify the performance loss, we derive upper bounds for the maximum error between the optimal performance and the performance under the proposed framework. Finally, the proposed framework is applied to an example setup in which a team of mobile nodes is assigned the task of establishing a communication link between two base stations with minimum energy consumption. We show through simulations that the performance under the proposed framework is close to the optimal performance and the suboptimal policy can be efficiently implemented online.
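
    The online phase described above can, for example, be realized with a simple distributed gradient method, since the approximate cost only couples neighbouring nodes; the sketch below is a generic illustration (quadratic per-node and per-edge costs, a line graph, a fixed step size), not the authors' algorithm.

        import numpy as np

        # Hypothetical line graph: node i is coupled only to i-1 and i+1.
        n = 5
        neighbors = {i: [j for j in (i - 1, i + 1) if 0 <= j < n] for i in range(n)}
        targets = np.linspace(0.0, 4.0, n)     # hypothetical per-node reference positions

        def local_gradient(i, x):
            """Gradient of node i's share of the separable cost: tracking term + coupling to neighbours."""
            g = 2.0 * (x[i] - targets[i])
            for j in neighbors[i]:
                g += 2.0 * (x[i] - x[j])
            return g

        x = np.zeros(n)
        step = 0.1
        for _ in range(200):                   # each node updates using local information only
            x = np.array([x[i] - step * local_gradient(i, x) for i in range(n)])

        print(np.round(x, 3))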

  18. A Unified Algebraic and Logic-Based Framework Towards Safe Routing Implementations

    Science.gov (United States)

    2015-08-13

    Report AFRL-AFOSR-VA-TR-2015-0269: "(YIP) A Unified Algebraic & Logic-Based Framework Towards Safe Routing Implementations", Boon Loo, Trustees of the University of …, 2015. The final report covers the training provided and includes a list of publications. Subject terms: formal methods, networking, logic, routing algebra, software-defined networking.

  19. Design & implementation of distributed spatial computing node based on WPS

    Science.gov (United States)

    Liu, Liping; Li, Guoqing; Xie, Jibo

    2014-03-01

    Currently, research on SIG (Spatial Information Grid) technology mostly emphasizes spatial data sharing in the grid environment, while the importance of spatial computing resources is ignored. In order to implement the sharing and cooperation of spatial computing resources in the grid environment, this paper systematically investigates the key technologies for constructing a Spatial Computing Node based on the WPS (Web Processing Service) specification by the OGC (Open Geospatial Consortium). A framework for the Spatial Computing Node is designed according to the features of spatial computing resources. Finally, a prototype Spatial Computing Node is implemented and the relevant verification work in this environment is completed.
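
    For orientation, a WPS node of this kind is typically exercised through OGC key-value-pair requests such as GetCapabilities and Execute; the sketch below builds such request URLs for a hypothetical endpoint and process identifier (the server address and everything other than the standard WPS keys are assumptions).

        from urllib.parse import urlencode

        # Hypothetical Spatial Computing Node endpoint.
        WPS_ENDPOINT = "http://example.org/wps"

        def get_capabilities_url():
            return WPS_ENDPOINT + "?" + urlencode(
                {"service": "WPS", "version": "1.0.0", "request": "GetCapabilities"})

        def execute_url(process_id, data_inputs):
            # DataInputs uses the WPS KVP encoding: key=value pairs joined by semicolons.
            return WPS_ENDPOINT + "?" + urlencode(
                {"service": "WPS", "version": "1.0.0", "request": "Execute",
                 "identifier": process_id,
                 "DataInputs": ";".join(f"{k}={v}" for k, v in data_inputs.items())})

        print(get_capabilities_url())
        print(execute_url("gs:Buffer", {"distance": "100", "features": "rivers"}))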

  20. Developing a Framework and Implementing User-Driven Innovation in Supply and Value Network

    DEFF Research Database (Denmark)

    Jacobsen, Alexia; Lassen, Astrid Heidemann; Wandahl, Søren

    2011-01-01

    This paper serves to create a framework for, and subsequently implement, user-driven innovation in a construction material industry network. The research has its outset in Project InnoDoors, which consists of a Danish university and a construction material network. The framework and the implemen…

  1. The Road of ERP Success: A Framework Model for Successful ERP Implementation

    Directory of Open Access Journals (Sweden)

    Sevenpri Candra

    2011-11-01

    Full Text Available To compete in today's business environment, companies implement technology and align it with their business strategy. One technology that is commonly implemented is Enterprise Resource Planning (ERP). This research examines the critical success factors of ERP implementation and their impact on business outcomes. A framework model for ERP implementation success is constructed from several previous studies of ERP implementation. This study extends the research field of successful ERP implementation and its implications for business practice, providing more knowledge about ERP implementation and business strategy.

  2. Effective Implementation of Agile Practices - Ingenious and Organized Theoretical Framework

    Directory of Open Access Journals (Sweden)

    Veerapaneni Esther Jyothi

    2011-03-01

    Full Text Available Traditional ways of delivering software are challenged by agile software development, which provides a very different approach to software development. Agile methods aim to be faster, lighter and more efficient than other rigorous methods and to develop and support the customer's business without being chaotic. Agile software development methods claim to be people-oriented rather than process-oriented, and adaptive rather than predictive. Solid determination and dedicated effort are required in agile development to overcome the drawbacks of a predefined set of steps and changing requirements, to see the desirable outcome and to avoid predictable results. These methods reach the target promptly by linking developers and stakeholders. The focus of this research paper is twofold. The first part studies different agile methodologies, identifies the practical difficulties in agile software development and suggests possible solutions within a collaborative and innovative framework. The second part concentrates on the importance of handling traceability in agile software development and finally proposes an ingenious and organized theoretical framework with a systematic approach to agile software development.

  3. OpenCluster: A Flexible Distributed Computing Framework for Astronomical Data Processing

    Science.gov (United States)

    Wei, Shoulin; Wang, Feng; Deng, Hui; Liu, Cuiyin; Dai, Wei; Liang, Bo; Mei, Ying; Shi, Congming; Liu, Yingbo; Wu, Jingping

    2017-02-01

    The volume of data generated by modern astronomical telescopes is extremely large and rapidly growing. However, current high-performance data processing architectures/frameworks are not well suited for astronomers because of their limitations and programming difficulties. In this paper, we therefore present OpenCluster, an open-source distributed computing framework to support rapidly developing high-performance processing pipelines of astronomical big data. We first detail the OpenCluster design principles and implementations and present the APIs facilitated by the framework. We then demonstrate a case in which OpenCluster is used to resolve complex data processing problems for developing a pipeline for the Mingantu Ultrawide Spectral Radioheliograph. Finally, we present our OpenCluster performance evaluation. Overall, OpenCluster provides not only high fault tolerance and simple programming interfaces, but also a flexible means of scaling up the number of interacting entities. OpenCluster thereby provides an easily integrated distributed computing framework for quickly developing a high-performance data processing system of astronomical telescopes and for significantly reducing software development expenses.
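
    To convey the manager/worker style of pipeline such a framework supports, here is a deliberately simplified single-machine sketch using Python's multiprocessing; it does not use OpenCluster's actual API, and the per-frame task function is hypothetical.

        from multiprocessing import Pool

        def reduce_spectrum(frame_id):
            # Hypothetical per-frame processing step in an astronomical pipeline.
            return frame_id, sum(i * i for i in range(1000)) % 97

        if __name__ == "__main__":
            frame_ids = list(range(16))
            with Pool(processes=4) as pool:   # a real framework would spread this over a cluster
                results = pool.map(reduce_spectrum, frame_ids)
            print(dict(results))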

  4. Design and Implementation of Telemedicine based on Java Media Framework

    Science.gov (United States)

    Xiong, Fengguang; Jia, Zhiyan

    After analysing the importance of and problems with telemedicine, this paper proposes a telemedicine system based on JMF (Java Media Framework) to design and implement the capture, compression, storage, transmission, reception and playback of medical audio and video. The telemedicine system can solve existing problems such as medical information not being shared, high platform dependence, and software incompatibilities. Experimental data prove that the system has low hardware cost, supports straightforward transmission and storage, and is portable and powerful.

  5. A portable implementation of ARPACK for distributed memory parallel architectures

    Energy Technology Data Exchange (ETDEWEB)

    Maschhoff, K.J.; Sorensen, D.C.

    1996-12-31

    ARPACK is a package of Fortran 77 subroutines which implement the Implicitly Restarted Arnoldi Method used for solving large sparse eigenvalue problems. A parallel implementation of ARPACK is presented which is portable across a wide range of distributed memory platforms and requires minimal changes to the serial code. The communication layers used for message passing are the Basic Linear Algebra Communication Subprograms (BLACS) developed for the ScaLAPACK project and the Message Passing Interface (MPI).
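
    For context, the Implicitly Restarted Arnoldi/Lanczos method that ARPACK implements is most easily tried from Python through SciPy's serial ARPACK wrapper, as in the sketch below; this illustrates the algorithm on a sparse test matrix, not the parallel BLACS/MPI implementation described in the record.

        import numpy as np
        from scipy.sparse import diags
        from scipy.sparse.linalg import eigsh   # ARPACK's symmetric driver (Implicitly Restarted Lanczos)

        # Large sparse symmetric matrix: a 1-D Laplacian (tridiagonal).
        n = 2000
        laplacian = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")

        # Six eigenvalues closest to zero via shift-invert mode.
        vals, vecs = eigsh(laplacian, k=6, sigma=0.0, which="LM")
        print(np.sort(vals))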

  6. Developing theory-informed behaviour change interventions to implement evidence into practice: a systematic approach using the Theoretical Domains Framework.

    Science.gov (United States)

    French, Simon D; Green, Sally E; O'Connor, Denise A; McKenzie, Joanne E; Francis, Jill J; Michie, Susan; Buchbinder, Rachelle; Schattner, Peter; Spike, Neil; Grimshaw, Jeremy M

    2012-04-24

    There is little systematic operational guidance about how best to develop complex interventions to reduce the gap between practice and evidence. This article is one in a Series of articles documenting the development and use of the Theoretical Domains Framework (TDF) to advance the science of implementation research. The intervention was developed considering three main components: theory, evidence, and practical issues. We used a four-step approach, consisting of guiding questions, to direct the choice of the most appropriate components of an implementation intervention: Who needs to do what, differently? Using a theoretical framework, which barriers and enablers need to be addressed? Which intervention components (behaviour change techniques and mode(s) of delivery) could overcome the modifiable barriers and enhance the enablers? And how can behaviour change be measured and understood? A complex implementation intervention was designed that aimed to improve acute low back pain management in primary care. We used the TDF to identify the barriers and enablers to the uptake of evidence into practice and to guide the choice of intervention components. These components were then combined into a cohesive intervention. The intervention was delivered via two facilitated interactive small group workshops. We also produced a DVD to distribute to all participants in the intervention group. We chose outcome measures in order to assess the mediating mechanisms of behaviour change. We have illustrated a four-step systematic method for developing an intervention designed to change clinical practice based on a theoretical framework. The method of development provides a systematic framework that could be used by others developing complex implementation interventions. While this framework should be iteratively adjusted and refined to suit other contexts and settings, we believe that the four-step process should be maintained as the primary framework to guide researchers through a

  7. Developing theory-informed behaviour change interventions to implement evidence into practice: a systematic approach using the Theoretical Domains Framework

    Directory of Open Access Journals (Sweden)

    French Simon D

    2012-04-01

    Full Text Available Abstract Background There is little systematic operational guidance about how best to develop complex interventions to reduce the gap between practice and evidence. This article is one in a Series of articles documenting the development and use of the Theoretical Domains Framework (TDF) to advance the science of implementation research. Methods The intervention was developed considering three main components: theory, evidence, and practical issues. We used a four-step approach, consisting of guiding questions, to direct the choice of the most appropriate components of an implementation intervention: Who needs to do what, differently? Using a theoretical framework, which barriers and enablers need to be addressed? Which intervention components (behaviour change techniques and mode(s) of delivery) could overcome the modifiable barriers and enhance the enablers? And how can behaviour change be measured and understood? Results A complex implementation intervention was designed that aimed to improve acute low back pain management in primary care. We used the TDF to identify the barriers and enablers to the uptake of evidence into practice and to guide the choice of intervention components. These components were then combined into a cohesive intervention. The intervention was delivered via two facilitated interactive small group workshops. We also produced a DVD to distribute to all participants in the intervention group. We chose outcome measures in order to assess the mediating mechanisms of behaviour change. Conclusions We have illustrated a four-step systematic method for developing an intervention designed to change clinical practice based on a theoretical framework. The method of development provides a systematic framework that could be used by others developing complex implementation interventions. While this framework should be iteratively adjusted and refined to suit other contexts and settings, we believe that the four-step process should be

  8. Mass customization and sustainability an assessment framework and industrial implementation

    CERN Document Server

    Boër, Claudio R; Bettoni, Andrea; Sorlini, Marzio

    2013-01-01

    To adapt to global competitive pressures, manufacturers must develop methods and enabling technologies towards a personalized, customer oriented and sustainable manufacturing. Mass Customization and Sustainability defines the two concepts of mass customization and sustainability and introduces a framework to establish a link between the two concepts to answer the questions: Are these two aspects empowering one another? Or are they hindering one another?   These questions investigate mass customization as one of the main driving forces to achieve effective sustainability.  A methodology to assess the contribution of mass customization to sustainability is developed, providing an assessment model composed by a set of indicators covering the three aspects of sustainability: social, economical and environmental. This is supported and further explained using ideas and new concepts compiled from recent European research.   Researchers, scientists, managers and industry professionals alike can follow a set of ...

  9. A distributed implementation of a mode switching control program

    DEFF Research Database (Denmark)

    Holdgaard, Michael; Eriksen, Thomas Juul; Ravn, Anders P.

    1995-01-01

    A distributed implementation of a mode switched control program for a robot is described. The design of the control program is given by a set of real-time automatons. One of them plans a schedule for switching between a fixed set of control functions, another dispatches the control functions according to the schedule, and a final one monitors the system for exceptions that shall lead to a halt. The implementation uses four transputers with a distribution of phases of the automatons over the individual processors. The main technical result of the paper is calculations that illustrate how...
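
    The record above outlines a scheduler/dispatcher/monitor split realized as real-time automatons on transputers. As a rough, hypothetical illustration of that division of responsibilities (not the paper's transputer implementation), the Python sketch below runs a planned schedule of control modes, dispatches the active control function each control period, and halts when a monitor flags an exception; the mode names, periods and the fault flag are invented for the example.

```python
import time

# Hypothetical control functions standing in for the paper's fixed set of control modes.
def hold_position(state):
    return {"cmd": "hold"}

def move_to_target(state):
    return {"cmd": "move", "target": state.get("target")}

class Scheduler:
    """Plans a schedule of (mode name, duration in seconds) pairs."""
    def plan(self):
        return [("hold", 0.2), ("move", 0.3), ("hold", 0.1)]

class Monitor:
    """Watches the system state for exceptions that shall lead to a halt."""
    def check(self, state):
        return state.get("fault", False)

class Dispatcher:
    """Dispatches the control function of the currently scheduled mode."""
    def __init__(self, modes):
        self.modes = modes

    def run(self, schedule, state, monitor):
        for mode, duration in schedule:
            deadline = time.monotonic() + duration
            while time.monotonic() < deadline:
                if monitor.check(state):      # exception detected -> halt
                    return "halted"
                self.modes[mode](state)       # invoke the active control law
                time.sleep(0.05)              # placeholder control period
        return "done"

if __name__ == "__main__":
    dispatcher = Dispatcher({"hold": hold_position, "move": move_to_target})
    print(dispatcher.run(Scheduler().plan(), {"target": (1.0, 2.0)}, Monitor()))
```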

  10. Mobile Autonomous Sensing Unit (MASU): A Framework That Supports Distributed Pervasive Data Sensing

    Directory of Open Access Journals (Sweden)

    Esunly Medina

    2016-07-01

    Full Text Available Pervasive data sensing is a major issue that traverses various research areas and application domains. It allows identifying people’s behaviour and patterns without overwhelming the monitored persons. Although there are many pervasive data sensing applications, they are typically focused on addressing specific problems in a single application domain, making them difficult to generalize or reuse. On the other hand, the platforms for supporting pervasive data sensing impose restrictions on the devices and operational environments that make them unsuitable for monitoring loosely-coupled or fully distributed work. In order to help address this challenge, this paper presents a framework that supports distributed pervasive data sensing in a generic way. Developers can use this framework to facilitate the implementation of their applications, thus reducing complexity and effort in such an activity. The framework was evaluated using simulations and also through an empirical test, and the obtained results indicate that it is useful to support such a sensing activity in loosely-coupled or fully distributed work scenarios.

  11. Implementing a Trust Overlay Framework for Digital Ecosystems

    Science.gov (United States)

    Malone, Paul; McGibney, Jimmy; Botvich, Dmitri; McLaughlin, Mark

    Digital Ecosystems, being decentralised in nature, are inherently untrustworthy environments. This is due to the fact that these environments lack a centralised gatekeeper and identity provider. In order for businesses to operate in these environments there is a need for security measures to support accountability and traceability. This paper describes a trust overlay network developed in the OPAALS project to allow entities participating in digital ecosystems to share experience through the exchange of trust values and to leverage on this network to determine reputation based trustworthiness of unknown and initially untrusted entities. An overlay network is described together with sample algorithms and a discussion on implementation.
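
    The abstract describes peers exchanging trust values and deriving the reputation of initially unknown entities from them. A minimal sketch of such an aggregation, assuming a simple weighted average rather than the actual OPAALS algorithms (the peer names, weights and scores below are invented):

```python
def reputation(recommendations, trust_in_recommender):
    """Estimate the reputation of an unknown entity from trust values shared by peers.

    recommendations: {recommender: reported trust in the target, in [0, 1]}
    trust_in_recommender: {recommender: how much we trust that recommender, in [0, 1]}
    """
    weighted = sum(trust_in_recommender.get(r, 0.0) * v for r, v in recommendations.items())
    total = sum(trust_in_recommender.get(r, 0.0) for r in recommendations)
    return weighted / total if total else 0.0  # no trusted recommenders -> no reputation

# Example: two peers we trust to different degrees report their experience with a new entity.
print(reputation({"peerA": 0.9, "peerB": 0.4}, {"peerA": 0.8, "peerB": 0.3}))
```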

  12. A System for Distributed Mechanisms: Design, Implementation and Applications

    CERN Document Server

    Apt, Krzysztof R; Ma, Huiye

    2007-01-01

    We describe here a structured system for distributed mechanism design. In our approach the players dynamically form a network in which they know neither their neighbours nor the size of the network and interact to jointly take decisions. The only assumption concerning the underlying communication layer is that for each pair of processes there is a path of neighbours connecting them. This allows us to deal with arbitrary network topologies. We also discuss the implementation of this system that consists of a sequence of layers. The lower layers deal with the operations relevant for distributed computing only, while the upper layers are concerned only with communication among players, including broadcasting and multicasting, and distributed decision making. This yields a highly flexible distributed system whose specific applications are realized as instances of a top layer. This design is implemented in Java. The system can be used for a repeated creation of dynamically formed networks of players interested in ...

  13. A Generic Deployment Framework for Grid Computing and Distributed Applications

    CERN Document Server

    Flissi, Areski

    2006-01-01

    Deployment of distributed applications on large systems, and especially on grid infrastructures, is becoming an increasingly complex task. Grid users spend a lot of time preparing, installing and configuring middleware and application binaries on nodes before they can eventually start their applications. The problem is that the deployment process is composed of many heterogeneous tasks that have to be orchestrated in a specific correct order. As a consequence, automation of the deployment process is currently very difficult to achieve. To address this problem, we propose in this paper a generic deployment framework that automates the execution of the heterogeneous tasks composing the whole deployment process. Our approach is based on the reification of all required deployment mechanisms and existing tools as software components. Grid users only have to describe the configuration to deploy in a simple natural language instead of programming or scripting how the deployment process is executed. As a toy example, this framew...
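
    Since the abstract stresses that deployment is a set of heterogeneous tasks that must be orchestrated in a specific correct order, a small sketch of such ordering with a dependency graph and a topological sort may help; the task names and dependencies are purely illustrative and are not taken from the framework itself:

```python
from graphlib import TopologicalSorter  # Python 3.9+ standard library

# Hypothetical deployment tasks mapped to their prerequisites.
tasks = {
    "configure_node": set(),
    "install_middleware": {"configure_node"},
    "install_app_binaries": {"install_middleware"},
    "start_application": {"install_app_binaries", "configure_node"},
}

def run(task_name):
    # In a real framework each task would wrap a heterogeneous tool (script, API call, ...).
    print("running", task_name)

for name in TopologicalSorter(tasks).static_order():
    run(name)  # executes tasks in a dependency-respecting order
```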

  14. Distributed automata in an assumption-commitment framework

    Indian Academy of Sciences (India)

    Swarup Mohalik; R Ramanujam

    2002-04-01

    We propose a class of finite state systems of synchronizing distributed processes, where processes make assumptions at local states about the state of other processes in the system. This constrains the global states of the system to those where assumptions made by a process about another are compatible with the commitments offered by the other at that state. We model examples like reliable bit transmission and sequence transmission protocols in this framework and discuss how assumption-commitment structure facilitates compositional design of such protocols. We prove a decomposition theorem which states that every protocol specified globally as a finite state system can be decomposed into such an assumption compatible system. We also present a syntactic characterization of this class using top level parallel composition.

  15. Analysis of Grid Based Distributed Data Mining System for Service Oriented Frameworks

    Directory of Open Access Journals (Sweden)

    Praseeda Manoj

    2013-01-01

    Full Text Available Distribution of data and computation allows for solving larger problems and executing applications that are distributed in nature. A Grid is a distributed computing infrastructure that enables the management of large amounts of data and the running of business applications supporting consumers and end users. The Grid can play a significant role in providing an effective computational infrastructure that enables coordinated resource sharing within dynamic organizations. Several systems have been proposed for building distributed data mining. This paper analyses different grid-based distributed data mining applications, which helps to give an overview of how Grid computing can be used to support distributed data mining. In addition, the synergy between data mining and grid technology is also discussed. This concept is implemented in Weka4WS, a framework that extends the widely used open source Weka toolkit to support distributed data mining on WSRF-enabled Grids. Weka4WS adopts the WSRF technology for running remote data mining algorithms and managing distributed computations.

  16. A framework for plasticity implementation on the SpiNNaker neural architecture

    Directory of Open Access Journals (Sweden)

    Francesco eGalluppi

    2015-01-01

    Full Text Available Many of the precise biological mechanisms of synaptic plasticity remain elusive, but simulations of neural networks have greatly enhanced our understanding of how specific global functions arise from the massively parallel computation of neurons and local Hebbian or spike-timing dependent plasticity rules. For simulating large portions of neural tissue, this has created an increasingly strong need for large scale simulations of plastic neural networks on special purpose hardware platforms, because synaptic transmissions and updates are badly matched to computing style supported by current architectures. Because of the great diversity of biological plasticity phenomena and the corresponding diversity of models, there is a great need for testing various hypotheses about plasticity before committing to one hardware implementation. Here we present a novel framework for investigating different plasticity approaches on the SpiNNaker distributed digital neural simulation platform. The key innovation of the proposed architecture is to exploit the reconfigurability of the ARM processors inside SpiNNaker, dedicating a subset of them exclusively to process synaptic plasticity updates, while the rest perform the usual neural and synaptic simulations. We demonstrate the flexibility of the proposed approach by showing the implementation of a variety of spike- and rate-based learning rules, including standard Spike-Timing dependent plasticity (STDP), voltage-dependent STDP, and the rate-based BCM rule. We analyze their performance and validate them by running classical learning experiments in real time on a 4-chip SpiNNaker board. The result is an efficient, modular, flexible and scalable framework, which provides a valuable tool for the fast and easy exploration of learning models of very different kinds on the parallel and reconfigurable SpiNNaker system.
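
    As a reminder of what a pair-based rule such as the "standard STDP" mentioned above computes, here is a minimal sketch of the exponential weight update for a single pre/post spike pair; the amplitudes and time constants are typical textbook values, not parameters taken from the SpiNNaker implementation:

```python
import math

A_PLUS, A_MINUS = 0.01, 0.012   # potentiation / depression amplitudes (assumed values)
TAU_PLUS = TAU_MINUS = 20.0     # time constants in ms

def stdp_dw(t_pre, t_post):
    """Weight change for a single pre/post spike pair (spike times in ms)."""
    dt = t_post - t_pre
    if dt > 0:    # pre fires before post -> potentiation
        return A_PLUS * math.exp(-dt / TAU_PLUS)
    if dt < 0:    # post fires before pre -> depression
        return -A_MINUS * math.exp(dt / TAU_MINUS)
    return 0.0

w = 0.5
for t_pre, t_post in [(10.0, 15.0), (40.0, 32.0)]:
    w = min(1.0, max(0.0, w + stdp_dw(t_pre, t_post)))  # clip weight to [0, 1]
print(f"weight after two spike pairs: {w:.4f}")
```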

  17. A framework for plasticity implementation on the SpiNNaker neural architecture.

    Science.gov (United States)

    Galluppi, Francesco; Lagorce, Xavier; Stromatias, Evangelos; Pfeiffer, Michael; Plana, Luis A; Furber, Steve B; Benosman, Ryad B

    2014-01-01

    Many of the precise biological mechanisms of synaptic plasticity remain elusive, but simulations of neural networks have greatly enhanced our understanding of how specific global functions arise from the massively parallel computation of neurons and local Hebbian or spike-timing dependent plasticity rules. For simulating large portions of neural tissue, this has created an increasingly strong need for large scale simulations of plastic neural networks on special purpose hardware platforms, because synaptic transmissions and updates are badly matched to computing style supported by current architectures. Because of the great diversity of biological plasticity phenomena and the corresponding diversity of models, there is a great need for testing various hypotheses about plasticity before committing to one hardware implementation. Here we present a novel framework for investigating different plasticity approaches on the SpiNNaker distributed digital neural simulation platform. The key innovation of the proposed architecture is to exploit the reconfigurability of the ARM processors inside SpiNNaker, dedicating a subset of them exclusively to process synaptic plasticity updates, while the rest perform the usual neural and synaptic simulations. We demonstrate the flexibility of the proposed approach by showing the implementation of a variety of spike- and rate-based learning rules, including standard Spike-Timing dependent plasticity (STDP), voltage-dependent STDP, and the rate-based BCM rule. We analyze their performance and validate them by running classical learning experiments in real time on a 4-chip SpiNNaker board. The result is an efficient, modular, flexible and scalable framework, which provides a valuable tool for the fast and easy exploration of learning models of very different kinds on the parallel and reconfigurable SpiNNaker system.

  18. A FRAMEWORK FOR THE IMPLEMENTATION OF E-PROCUREMENT

    Directory of Open Access Journals (Sweden)

    M.V. Jooste

    2012-01-01

    Full Text Available

    ENGLISH ABSTRACT: e-Procurement is a sound solution for companies to initiate e-business initiatives since it promises several distinct benefits, such as process cost and lead time reduction, improving strategic sourcing of commodities and enabling companies to negotiate better prices. Procurement is a broad term that touches on internal and external aspects of the company, complicating the decision to choose the most suitable e-procurement solution. An evaluation framework is proposed to assist in evaluating the susceptibility of a company’s products to e-procurement and choosing the most suitable e-procurement solution.

    AFRIKAANSE OPSOMMING: e-Procurement is a good starting point for companies to expose themselves to e-business, since it promises several benefits such as process cost and lead time reduction, improved strategic sourcing of products, and the ability to negotiate better prices with suppliers. Procurement is a broad term covering many aspects inside and outside the enterprise, which complicates the decision to choose the most suitable e-procurement solution. An evaluation framework is proposed to support the evaluation of the susceptibility of a company's products to e-procurement and the choice of the most suitable e-procurement solution.

  19. Introducing the Canadian Thoracic Society Framework for Guideline Dissemination and Implementation, with Concurrent Evaluation

    Directory of Open Access Journals (Sweden)

    Samir Gupta

    2013-01-01

    Full Text Available The Canadian Thoracic Society (CTS) is leveraging its strengths in guideline production to enable respiratory guideline implementation in Canada. The authors describe the new CTS Framework for Guideline Dissemination and Implementation, with Concurrent Evaluation, which has three spheres of action: guideline production, implementation infrastructure and knowledge translation (KT) methodological support. The Canadian Institutes of Health Research ‘Knowledge-to-Action’ process was adopted as the model of choice for conceptualizing KT interventions. Within the framework, new evidence for formatting guideline recommendations to enhance the intrinsic implementability of future guidelines were applied. Clinical assemblies will consider implementability early in the guideline production cycle when selecting clinical questions, and new practice guidelines will include a section dedicated to KT. The framework describes the development of a web-based repository and communication forum to inventory existing KT resources and to facilitate collaboration and communication among implementation stakeholders through an online discussion board. A national forum for presentation and peer-review of proposed KT projects is described. The framework outlines expert methodological support for KT planning, development and evaluation including a practical guide for implementers and a novel ‘Clinical Assembly – KT Action Team’, and in-kind logistical support and assistance in securing peer-reviewed funding.

  20. Introducing the Canadian Thoracic Society framework for guideline dissemination and implementation, with concurrent evaluation.

    Science.gov (United States)

    Gupta, Samir; Licskai, Christopher; Van Dam, Anne; Boulet, Louis-Philippe

    2013-01-01

    The Canadian Thoracic Society (CTS) is leveraging its strengths in guideline production to enable respiratory guideline implementation in Canada. The authors describe the new CTS Framework for Guideline Dissemination and Implementation, with Concurrent Evaluation, which has three spheres of action: guideline production, implementation infrastructure and knowledge translation (KT) methodological support. The Canadian Institutes of Health Research 'Knowledge-to-Action' process was adopted as the model of choice for conceptualizing KT interventions. Within the framework, new evidence for formatting guideline recommendations to enhance the intrinsic implementability of future guidelines were applied. Clinical assemblies will consider implementability early in the guideline production cycle when selecting clinical questions, and new practice guidelines will include a section dedicated to KT. The framework describes the development of a web-based repository and communication forum to inventory existing KT resources and to facilitate collaboration and communication among implementation stakeholders through an online discussion board. A national forum for presentation and peer-review of proposed KT projects is described. The framework outlines expert methodological support for KT planning, development and evaluation including a practical guide for implementers and a novel 'Clinical Assembly-KT Action Team', and in-kind logistical support and assistance in securing peer-reviewed funding.

  1. Towards Scalable Distributed Framework for Urban Congestion Traffic Patterns Warehousing

    Directory of Open Access Journals (Sweden)

    A. Boulmakoul

    2015-01-01

    Full Text Available We put forward the architecture of a framework for the integration of data from moving objects related to an urban transportation network. Most of this research relies on GPS outdoor geolocation technology and uses a distributed cloud infrastructure with a big data NoSQL database. A network of intelligent mobile sensors, distributed over the urban network, produces congestion traffic patterns. Congestion predictions are based on an extended simulation model. This model provides traffic indicator calculations, which are fused with the GPS data to allow estimation of traffic states across the whole network. The discovery process for congestion patterns uses the semantic trajectories metamodel given in our previous works. The challenge of the proposed solution is to store traffic patterns, with the aim of ensuring surveillance and intelligent real-time control of the network to reduce congestion and avoid its consequences. The fusion of real-time data from GPS-enabled smartphones with data provided by existing traffic systems improves traffic congestion knowledge, as well as generating new information for soft operational control and providing intelligent added value for transportation system deployment.
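
    To make the idea of congestion traffic patterns concrete, the following sketch bins GPS speed samples per road segment and hour and derives a simple congestion index; the segment identifiers, free-flow speeds and index definition are assumptions for illustration, not the semantic trajectory metamodel of the paper:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical GPS records: (segment_id, hour_of_day, speed_kmh)
samples = [("seg-12", 8, 14.0), ("seg-12", 8, 11.5), ("seg-12", 14, 42.0),
           ("seg-07", 8, 55.0)]
FREE_FLOW = {"seg-12": 50.0, "seg-07": 60.0}   # assumed free-flow speeds per segment

patterns = defaultdict(list)
for seg, hour, speed in samples:
    patterns[(seg, hour)].append(speed)

# Congestion index in [0, 1]: 0 = free flow, 1 = fully congested.
congestion = {key: 1.0 - min(1.0, mean(speeds) / FREE_FLOW[key[0]])
              for key, speeds in patterns.items()}
print(congestion)   # such key/value patterns map naturally onto a NoSQL store
```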

  2. New Implementation Framework for Saturation-Based Reasoning

    CERN Document Server

    Riazanov, Alexandre

    2008-01-01

    The saturation-based reasoning methods are among the most theoretically developed ones and are used by most of the state-of-the-art first-order logic reasoners. In the last decade there was a sharp increase in performance of such systems, which I attribute to the use of advanced calculi and the intensified research in implementation techniques. However, nowadays we are witnessing a slowdown in performance progress, which may be considered as a sign that the saturation-based technology is reaching its inherent limits. The position I am trying to put forward in this paper is that such scepticism is premature and a sharp improvement in performance may potentially be reached by adopting new architectural principles for saturation. The top-level algorithms and corresponding designs used in the state-of-the-art saturation-based theorem provers have (at least) two inherent drawbacks: the insufficient flexibility of the used inference selection mechanisms and the lack of means for intelligent prioritising of search d...

  3. Minimization of Power Loss in Distribution Systems by Implementation of High Voltage Distribution System

    Directory of Open Access Journals (Sweden)

    PARWAL Arvind

    2013-05-01

    Full Text Available The loads in rural areas are predominantly pump sets used for various applications, i.e. lift irrigation systems. These loads have a low power factor and a low load factor. Further, because the loads are widely dispersed, the load density is low. The present distribution system consists of three-phase 11 kV/433 V distribution transformers with extended LT lines. In this system, the voltage profile and reliability are poor. In this paper, HVDS is introduced with small-capacity distribution transformers. A simple load flow technique is used for solving the distribution networks before and after implementation of HVDS. The advantages of implementing HVDS over the LVDS system are discussed.
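
    The loss reduction claimed for HVDS follows directly from the I²R relationship: for the same delivered power, line current scales inversely with voltage, so losses scale with the square of the voltage ratio. A small sketch with a hypothetical feeder illustrates the magnitude (the load and resistance figures are invented, not taken from the paper):

```python
def line_loss_kw(power_kw, voltage_v, resistance_ohm, power_factor=0.8):
    """Three-phase I^2 R line loss for a given delivered power."""
    current = power_kw * 1e3 / (3 ** 0.5 * voltage_v * power_factor)  # line current in A
    return 3 * current ** 2 * resistance_ohm / 1e3                    # loss in kW

# Hypothetical feeder: 50 kW of pump-set load over a line of 0.5 ohm per phase.
lv = line_loss_kw(50, 433, 0.5)      # low-voltage distribution (433 V)
hv = line_loss_kw(50, 11_000, 0.5)   # high-voltage distribution (11 kV)
print(f"LV loss ~ {lv:.2f} kW, HV loss ~ {hv:.4f} kW, ratio ~ {lv / hv:.0f}x")
```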

  4. A framework for addressing implementation gap in global drowning prevention interventions: experiences from Bangladesh.

    Science.gov (United States)

    Hyder, Adnan A; Alonge, Olakunle; He, Siran; Wadhwaniya, Shirin; Rahman, Fazlur; El Arifeen, Shams

    2014-12-01

    Drowning is the commonest cause of injury-related deaths among under-five children worldwide, and 95% of deaths occur in low- and middle-income countries (LMICs) where there are implementation gaps in the drowning prevention interventions. This article reviews common interventions for drowning prevention, introduces a framework for effective implementation of such interventions, and describes the Saving of Lives from Drowning (SoLiD) Project in Bangladesh, which is based on this framework. A review of the systematic reviews on drowning interventions was conducted, and original research articles were pulled and summarized into broad prevention categories. The implementation framework builds upon two existing frameworks and categorizes the implementing process for drowning prevention interventions into four phases: planning, engaging, executing, and evaluating. Eleven key characteristics are mapped in these phases. The framework was applied to drowning prevention projects that have been undertaken in some LMICs to illustrate major challenges to implementation. The implementation process for the SoLiD Project in Bangladesh is used as an example to illustrate the practical utilization of the framework. Drowning interventions, such as pool fencing and covering of water hazards, are effective in high-income countries; however, most of these interventions have not been tested in LMICs. The critical components of the four phases of implementing drowning prevention interventions may include: (i) planning-global funding, political will, scale, sustainability, and capacity building; (ii) engaging-coordination, involvement of appropriate individuals; (iii) executing-focused action, multisectoral actions, quality of execution; and (iv) evaluating-rigorous monitoring and evaluation. Some of the challenges to implementing drowning prevention interventions in LMICs include insufficient funds, lack of technical capacity, and limited coordination among stakeholders and implementers

  5. Advances in the spatially distributed ages-w model: parallel computation, java connection framework (JCF) integration, and streamflow/nitrogen dynamics assessment

    Science.gov (United States)

    AgroEcoSystem-Watershed (AgES-W) is a modular, Java-based spatially distributed model which implements hydrologic and water quality (H/WQ) simulation components under the Java Connection Framework (JCF) and the Object Modeling System (OMS) environmental modeling framework. AgES-W is implicitly scala...

  6. [Good drug distribution practice and its implementation in drug distribution companies].

    Science.gov (United States)

    Draksiene, Gailute

    2002-01-01

    Good Distribution Practice is based on the Directive of the Board of the European Community 92/25/EEC regarding the wholesale distribution of drugs for human consumption. It is stated in the Directive that the whole drug distribution channel is to be controlled from the point of drug production or import down to the supplies to the end user. In order to reach the goal, the drug distribution company must create the quality assurance system and facilitate its correct functioning. This aim requires development of the rules of the Good Distribution Practice. Those rules set the general requirements of the Good Distribution Practice for distribution companies that they must conduct. The article explains main requirements postulated in the rules of the Good Distribution Practice and implementation of the Good Distribution Practice requirements in drug distribution companies.

  7. Implementing Distributed Operations: A Comparison of Two Deep Space Missions

    Science.gov (United States)

    Mishkin, Andrew; Larsen, Barbara

    2006-01-01

    Two very different deep space exploration missions--Mars Exploration Rover and Cassini--have made use of distributed operations for their science teams. In the case of MER, the distributed operations capability was implemented only after the prime mission was completed, as the rovers continued to operate well in excess of their expected mission lifetimes; Cassini, designed for a mission of more than ten years, had planned for distributed operations from its inception. The rapid command turnaround timeline of MER, as well as many of the operations features implemented to support it, have proven to be conducive to distributed operations. These features include: a single science team leader during the tactical operations timeline, highly integrated science and engineering teams, processes and file structures designed to permit multiple team members to work in parallel to deliver sequencing products, web-based spacecraft status and planning reports for team-wide access, and near-elimination of paper products from the operations process. Additionally, MER has benefited from the initial co-location of its entire operations team, and from having a single Principal Investigator, while Cassini operations have had to reconcile multiple science teams distributed from before launch. Cassini has faced greater challenges in implementing effective distributed operations. Because extensive early planning is required to capture science opportunities on its tour and because sequence development takes significantly longer than sequence execution, multiple teams are contributing to multiple sequences concurrently. The complexity of integrating inputs from multiple teams is exacerbated by spacecraft operability issues and resource contention among the teams, each of which has their own Principal Investigator. Finally, much of the technology that MER has exploited to facilitate distributed operations was not available when the Cassini ground system was designed, although later adoption

  8. Validation of the theoretical domains framework for use in behaviour change and implementation research.

    Science.gov (United States)

    Cane, James; O'Connor, Denise; Michie, Susan

    2012-04-24

    An integrative theoretical framework, developed for cross-disciplinary implementation and other behaviour change research, has been applied across a wide range of clinical situations. This study tests the validity of this framework. Validity was investigated by behavioural experts sorting 112 unique theoretical constructs using closed and open sort tasks. The extent of replication was tested by Discriminant Content Validation and Fuzzy Cluster Analysis. There was good support for a refinement of the framework comprising 14 domains of theoretical constructs (average silhouette value 0.29): 'Knowledge', 'Skills', 'Social/Professional Role and Identity', 'Beliefs about Capabilities', 'Optimism', 'Beliefs about Consequences', 'Reinforcement', 'Intentions', 'Goals', 'Memory, Attention and Decision Processes', 'Environmental Context and Resources', 'Social Influences', 'Emotions', and 'Behavioural Regulation'. The refined Theoretical Domains Framework has a strengthened empirical base and provides a method for theoretically assessing implementation problems, as well as professional and other health-related behaviours as a basis for intervention development.

  9. Adaptive management of river flows in Europe: A transferable framework for implementation

    Science.gov (United States)

    Summers, M. F.; Holman, I. P.; Grabowski, R. C.

    2015-12-01

    The evidence base for defining flow regimes to support healthy river ecosystems is weak, as there are few studies which quantify the ecological impact associated with different degrees of hydrological alteration. As a result, river flow standards used to manage water abstraction are largely based on expert judgement. Planned adaptive management studies on multiple rivers under the European Water Framework Directive represent an opportunity to learn about ecological flow requirements and improve the quantitative evidence base. However, identifying clear ecological responses to flow alteration can be a significant challenge, because of the complexity of river systems and the other factors which may confound the response. This paper describes the Adaptive River Management (ARM) framework, a flexible framework for implementing adaptive management of river flows that is transferable to other regions of the world. Application of the framework will ensure that the effectiveness of implemented management actions is appraised and that transferable quantitative data are collected that can be used in other geographical regions.

  10. Implementing Concurrency Control in Reliable Distributed Object-Oriented Systems

    Science.gov (United States)

    Parrington, Graham D.; Shrivastava, Santosh K.

    One of the key concepts available in many object-oriented programming languages is that of type-inheritance, which permits new types to be derived from, and inherit the capabilities of, old types. This paper describes how to exploit this property in a very simple fashion to implement object-oriented concurrency control. We show how by using type-inheritance, objects may control their own level of concurrency in a type-specific manner. Simple examples demonstrate the applicability of the approach. The implementation technique described here is being used to develop Arjuna, a fault-tolerant distributed programming system supporting atomic actions.
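
    Arjuna itself predates Python, but the idea of letting objects inherit, and then specialize, their own concurrency control can be sketched in a few lines; the classes below are an analogy under that assumption, not Arjuna's actual interfaces:

```python
import threading

class LockManager:
    """Base type: provides mutual exclusion that derived types inherit."""
    def __init__(self):
        self._lock = threading.RLock()

    def atomic(self, operation, *args):
        with self._lock:                 # default policy: exclusive access per object
            return operation(*args)

class Counter(LockManager):
    """Derived type inherits locking and exposes type-specific atomic operations."""
    def __init__(self):
        super().__init__()
        self._value = 0

    def increment(self):
        return self.atomic(self._unsafe_increment)

    def _unsafe_increment(self):
        self._value += 1
        return self._value

c = Counter()
threads = [threading.Thread(target=c.increment) for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(c.atomic(lambda: c._value))   # prints 100: updates were serialized by the base type
```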

  11. A General SDN-Based IoT Framework with NFV Implementation

    Institute of Scientific and Technical Information of China (English)

    Jie Li; Eitan Altman; Corinne Touati

    2015-01-01

    The emerging technologies of the Internet of Things (IoT), software defined networking (SDN), and network function virtualization (NFV) have great potential for information service innovation in the cloud and big data era. The architecture models of IoT and SDN with NFV implementation are studied in this paper. A general SDN-based IoT framework with NFV implementation is presented. This framework takes advantage of SDN and NFV and improves the IoT architecture.

  12. Sociomateriality: a theoretical framework for studying distributed medical education.

    Science.gov (United States)

    MacLeod, Anna; Kits, Olga; Whelan, Emma; Fournier, Cathy; Wilson, Keith; Power, Gregory; Mann, Karen; Tummons, Jonathan; Brown, Peggy Alexiadis

    2015-11-01

    Distributed medical education (DME) is a type of distance learning in which students participate in medical education from diverse geographic locations using Web conferencing, videoconferencing, e-learning, and similar tools. DME is becoming increasingly widespread in North America and around the world. Although relatively new to medical education, distance learning has a long history in the broader field of education and a related body of literature that speaks to the importance of engaging in rigorous and theoretically informed studies of distance learning. The existing DME literature is helpful, but it has been largely descriptive and lacks a critical "lens", that is, a theoretical perspective from which to rigorously conceptualize and interrogate DME's social (relationships, people) and material (technologies, tools) aspects. The authors describe DME and theories about distance learning and show that such theories focus on social, pedagogical, and cognitive considerations without adequately taking into account material factors. They address this gap by proposing sociomateriality as a theoretical framework allowing researchers and educators to study DME and (1) understand and consider previously obscured actors, infrastructure, and other factors that, on the surface, seem unrelated and even unimportant; (2) see clearly how the social and material components of learning are intertwined in fluid, messy, and often uncertain ways; and (3) perhaps think differently, even in ways that disrupt traditional approaches, as they explore DME. The authors conclude that DME brings with it substantial investments of social and material resources, and therefore needs careful study, using approaches that embrace its complexity.

  13. Systematic Implementation of Innovation Best Practices: Thota Framework for Innovation

    Institute of Scientific and Technical Information of China (English)

    Hamsa Thota

    2011-01-01

    The Product Development and Management Association (PDMA) comparative performance assessment studies (CPAS) teach us that new product development (NPD) and innovation management practices continue to evolve. However, advances in the practice of NPD and innovation did not yield significant increases in new product success rates. Nevertheless, the PDMA 2003 CPAS reported a significant difference in the success rates for top performing businesses versus the "rest". This paper summarizes innovation practices of top performing businesses from CPAS and compares them with the practices of PDMA Outstanding Corporate Innovator (OCI) Award winning companies. The paper verifies that what the top performers say they do in research surveys is actually what the outstanding corporate innovators do in practice. The paper then raises the question of how underperforming companies can implement key learnings from the best to improve their own innovation performance. The development of the Thota framework for innovation as a systematic framework to help underperformers implement the best practices is discussed. As a case study, the implementation of the Thota framework for innovation is illustrated with BMW, an OCI award winning company. The Thota Framework for Innovation is based on three organizing principles and eight actions. Underperforming businesses can improve their innovation performance by internalizing the three organizing principles and rigorously implementing the eight actions specified in the Thota framework for innovation.

  14. Competing Through Lean – Towards Sustainable Resource-Oriented Implementation Framework

    Directory of Open Access Journals (Sweden)

    Rymaszewska Anna

    2014-11-01

    Full Text Available This paper addresses the needs of SME manufacturing companies which, due to their limited resources, are often unable to introduce radical changes in their strategies. The main focus is on analyzing the principles of lean manufacturing and management regarding their potential contribution to building a company's competitive advantage. The paper analyses lean from a strategic management viewpoint while combining its implementation with achieving a competitive advantage. The ultimate result is a framework for lean implementation aimed at building a competitive advantage for companies. The proposed framework focuses on the idea of a closed loop with embedded sustainability.

  15. Design of A Sustainable Building: A Conceptual Framework for Implementing Sustainability in the Building Sector

    Directory of Open Access Journals (Sweden)

    Paul O. Olomolaiye

    2012-05-01

    Full Text Available This paper presents a conceptual framework aimed at implementing sustainability principles in the building industry. The proposed framework, based on the sustainable triple bottom line principle, includes resource conservation, cost efficiency and design for human adaptation. Following a thorough literature review, each principle, involving strategies and methods to be applied during the life cycle of building projects, is explained, and a few case studies are presented for clarity on the methods. The framework will allow design teams to have an appropriate balance between economic, social and environmental issues, changing the way construction practitioners think about the information they use when assessing building projects, thereby facilitating the sustainability of the building industry.

  16. Dynamic fragmentation and query translation based security framework for distributed databases

    Directory of Open Access Journals (Sweden)

    Arunabha Sengupta

    2015-09-01

    Full Text Available The existing security models for distributed databases suffer from several drawbacks, viz. tight coupling with the choice of database; lack of dynamism, granularity and flexibility; non-scalability; and vulnerability to intrusion attacks. There is a lack of an integrated, flexible and interoperable security framework that can dynamically control access to table-, row-, column- and field-level data entities. The objective of this proposed framework is to address the issue of security in distributed query processing using dynamic fragmentation and query translation methodologies based on a parameterized security model which can be tailored to the business requirements to take care of relational-level, record-level and column-level as well as atomic data-element-level security and access requirements. This solution has been implemented and tested for DML operations on distributed relational databases, and the execution results are found to be very promising in terms of restricting access to data elements with higher security clearance; blocking queries that return data at/below the user’s level but whose evaluation requires accessing columns/rows with higher security clearance; and blocking aggregate queries used for inferring classified information.
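
    A toy sketch of the kind of clearance-based column pruning and row filtering the abstract alludes to (the clearance labels, levels and row structure are invented for illustration; the real framework operates on fragments and translated SQL):

```python
# Hypothetical column clearance labels: higher number = more sensitive.
COLUMN_CLEARANCE = {"name": 1, "salary": 3, "diagnosis": 4}

def translate_select(columns, user_level):
    """Drop requested columns above the user's level; refuse if nothing remains."""
    allowed = [c for c in columns if COLUMN_CLEARANCE.get(c, 99) <= user_level]
    if not allowed:
        raise PermissionError("query touches only columns above the user's clearance")
    return allowed

def filter_rows(rows, user_level):
    """Row-level security: keep rows whose own label is at or below the user's level."""
    return [r for r in rows if r.get("_row_level", 0) <= user_level]

rows = [{"name": "a", "salary": 10, "_row_level": 1},
        {"name": "b", "salary": 20, "_row_level": 4}]
cols = translate_select(["name", "salary"], user_level=3)
print([{c: r[c] for c in cols} for r in filter_rows(rows, user_level=3)])
```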

  17. A visualization tool for parallel and distributed computing using the Lilith framework

    Energy Technology Data Exchange (ETDEWEB)

    Gentile, A.C.; Evensky, D.A.; Wyckoff, P.

    1998-05-01

    The authors present a visualization tool for the monitoring and debugging of codes run in a parallel and distributed computing environment, called Lilith Lights. This tool can be used both for debugging parallel codes as well as for resource management of clusters. It was developed under Lilith, a framework for creating scalable software tools for distributed computing. The use of Lilith provides scalable, non-invasive debugging, as opposed to other commonly used software debugging and visualization tools. Furthermore, by implementing the visualization tool in software rather than in hardware (as available on some MPPs), Lilith Lights is easily transferable to other machines, and well adapted for use on distributed clusters of machines. The information provided in a clustered environment can further be used for resource management of the cluster. In this paper, they introduce Lilith Lights, discussing its use on the Computational Plant cluster at Sandia National Laboratories, show its design and development under the Lilith framework, and present metrics for resource use and performance.

  18. Design and Implementation of a Heterogeneous Distributed Database System

    Institute of Scientific and Technical Information of China (English)

    金志权; 柳诚飞; et al.

    1990-01-01

    This paper introduces a heterogeneous distributed database system called the LSZ system, where LSZ is an abbreviation of Li Shizhen, an ancient Chinese medical scientist. The LSZ system adopts the cluster as its distributed database node (or site). Each cluster consists of one or several microcomputers and one server. The paper describes its basic architecture and the prototype implementation, which includes query processing and optimization, transaction management and data language translation. The system provides a uniform retrieval and update user interface through the global relational data language GRDL.

  19. A distributed infrastructure for publishing VO services: an implementation

    Science.gov (United States)

    Cepparo, Francesco; Scagnetto, Ivan; Molinaro, Marco; Smareglia, Riccardo

    2016-07-01

    This contribution describes both the design and the implementation details of a new solution for publishing VO services, highlighting its maintainable, distributed, modular and scalable architecture. Indeed, the new publisher is multithreaded and multiprocess. Multiple instances of the modules can run on different machines to ensure high performance and high availability, and this holds both for the interface modules of the services and for the back-end data access ones. The system uses message passing to let its components communicate through an AMQP message broker that can itself be distributed to provide better scalability and availability.
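
    The decoupling of interface modules from back-end data-access modules through an AMQP broker can be sketched with the pika client; the broker location, queue name and message payload below are assumptions, and a RabbitMQ (or other AMQP) broker must be running for the snippet to execute:

```python
import pika  # assumes an AMQP broker, e.g. RabbitMQ, reachable on localhost

connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.queue_declare(queue="vo_data_requests", durable=True)  # hypothetical queue name

# An interface module publishes a request for a back-end data-access module.
channel.basic_publish(exchange="", routing_key="vo_data_requests",
                      body=b'{"table": "obscore", "id": 42}')

def handle_request(ch, method, properties, body):
    print("back-end received:", body)              # the real module would query the archive
    ch.basic_ack(delivery_tag=method.delivery_tag)  # acknowledge once processed

channel.basic_consume(queue="vo_data_requests", on_message_callback=handle_request)
channel.start_consuming()  # a back-end worker would normally run this in its own process
```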

  20. Implementing ATML in Distributed ATS for SG-III Prototype

    Institute of Scientific and Technical Information of China (English)

    CHEN Ming; YANG Cunbang; LU Junfeng; DING Yongkun; YIN Zejie; ZHENG Zhijian

    2007-01-01

    With the forthcoming large-scale scientific experimental systems, we are looking for ways to construct an open, distributed architecture within the new and the existing automatic test systems. The new Automatic Test Markup Language standard meets our demand for data exchange in this architecture by defining the test routines and resultant data in the XML format. This paper introduces the concept of ATML (Automatic Test Markup Language) and related standards, and the significance of these new standards for a distributed automatic test system. It also describes the implementation of ATML through the integration of this technology among the existing and new test systems.
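
    The essence of the approach, exchanging test results as XML that every subsystem can produce and parse, can be illustrated with a simplified, hypothetical document; this is not the real ATML schema (ATML is a family of IEEE-standardized schemas), just a stand-in showing the mechanics:

```python
import xml.etree.ElementTree as ET

# Build a simplified, hypothetical test-result document (not the real ATML schema).
result = ET.Element("TestResults")
test = ET.SubElement(result, "Test", name="laser_energy_check")
ET.SubElement(test, "Outcome").text = "Passed"
ET.SubElement(test, "Value", unit="J").text = "1.25"

payload = ET.tostring(result, encoding="unicode")
print(payload)

# Any other subsystem in the distributed ATS can parse the same document back.
parsed = ET.fromstring(payload)
print(parsed.find("./Test/Outcome").text, parsed.find("./Test/Value").attrib["unit"])
```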

  1. Distributed learning process: principles of design and implementation

    Directory of Open Access Journals (Sweden)

    G. N. Boychenko

    2016-01-01

    Full Text Available At the present stage, broad information and communication technologies (ICT) usage in educational practices is one of the leading trends of global education system development. This trend has led to the transformation of instructional interaction models. Scientists have developed the theory of distributed cognition (Salomon, G.; Hutchins, E.) and of distributed education and training (Fiore, S. M.; Salas, E.; Oblinger, D. G.; Barone, C. A.; Hawkins, B. L.). The educational process is based on two sub-processes, separated in time and space, of learning and teaching, which are aimed at the organization of flexible interactions between learners, teachers and educational content located in different non-centralized places. The purpose of this design research is to find a solution for the problem of formalizing distributed learning process design and realization that is significant in instructional design. The solution to this problem should take into account the specifics of distributed interactions between team members, which become the collective subject of distributed cognition in the distributed learning process. This makes it necessary to design roles and functions of the individual team members performing distributed educational activities. Personal educational objectives should be determined by decomposition of team objectives into functional roles of its members, considering the personal and learning needs and interests of students. Theoretical and empirical methods used in the study: theoretical analysis of philosophical, psychological, and pedagogical literature on the issue; analysis of international standards in the e-learning domain; exploration of the practical usage of distributed learning in academic and corporate sectors; generalization, abstraction, cognitive modelling, and ontology engineering methods. The result of the research is a methodology for design and implementation of a distributed learning process based on the competency approach. Methodology proposed by

  2. Implementation of evidence-based practices: Applying a goal commitment framework.

    Science.gov (United States)

    Chou, Ann F; Vaughn, Thomas E; McCoy, Kimberly D; Doebbeling, Bradley N

    2011-01-01

    The implementation of evidence-based practices translates research findings into practice to reduce inappropriate care. However, this process is slow and unpredictable. The lack of a coherent theoretical basis for understanding individual and organizational behavior limits our ability to formulate effective implementation strategies. The study objectives are (a) to test the goal commitment framework that explains mechanisms impacting outcomes of major depressive disorder (MDD) screening guideline implementation and (b) to understand the effects of implementation outcomes on provider practice related to MDD screening. Using data from the Determinants of Clinical Practice Guideline Implementation Effectiveness Study, the national sample included 2,438 clinicians from 139 Veteran Affairs acute care hospitals with primary care clinics. We used hierarchical generalized linear modeling to assess the following implementation outcomes: agreement with, adherence to, improvement in knowledge of guidelines, and delivery of best practices as a function of clinician input into implementation, teamwork, involvement in quality improvement activities, participative culture, interdepartmental coordination, frequency, and utility of performance feedback. We then estimated self-reported MDD screening practices as a function of these four implementation outcomes. Results showed that having input into implementation, involvement in quality of care improvement, teamwork, and perceived value of performance feedback were positively associated with implementation outcomes. Provider self-assessed guideline adherence was positively associated with the likelihood of appropriate MDD screening. Factors related to increased goal commitment positively predicted key implementation outcomes, which in turn enhanced care delivery. This study demonstrates that the goal commitment framework is useful in assisting managers to assess factors that facilitate implementation. In particular, participation

  3. Evaluation Framework and Tools for Distributed Energy Resources

    Energy Technology Data Exchange (ETDEWEB)

    Gumerman, Etan Z.; Bharvirkar, Ranjit R.; LaCommare, Kristina Hamachi; Marnay , Chris

    2003-02-01

    The Energy Information Administration's (EIA) 2002 Annual Energy Outlook (AEO) forecast anticipates the need for 375 MW of new generating capacity (or about one new power plant) per week for the next 20 years, most of which is forecast to be fueled by natural gas. The Distributed Energy and Electric Reliability Program (DEER) of the Department of Energy (DOE), has set a national goal for DER to capture 20 percent of new electric generation capacity additions by 2020 (Office of Energy Efficiency and Renewable Energy 2000). Cumulatively, this amounts to about 40 GW of DER capacity additions from 2000-2020. Figure ES-1 below compares the EIA forecast and DEER's assumed goal for new DER by 2020 while applying the same definition of DER to both. This figure illustrates that the EIA forecast is consistent with the overall DEER DER goal. For the purposes of this study, Berkeley Lab needed a target level of small-scale DER penetration upon which to hinge consideration of benefits and costs. Because the AEO2002 forecasted only 3.1 GW of cumulative additions from small-scale DER in the residential and commercial sectors, another approach was needed to estimate the small-scale DER target. The focus here is on small-scale DER technologies under 500 kW. The technology size limit is somewhat arbitrary, but the key results of interest are marginal additional costs and benefits around an assumed level of penetration that existing programs might achieve. Berkeley Lab assumes that small-scale DER has the same growth potential as large scale DER in AEO2002, about 38 GW. This assumption makes the small-scale goal equivalent to 380,000 DER units of average size 100 kW. This report lays out a framework whereby the consequences of meeting this goal might be estimated and tallied up. The framework is built around a list of major benefits and a set of tools that might be applied to estimate them. This study lists some of the major effects of an emerging paradigm shift away from

  4. Implementation and performance of FDPS: a framework for developing parallel particle simulation codes

    Science.gov (United States)

    Iwasawa, Masaki; Tanikawa, Ataru; Hosono, Natsuki; Nitadori, Keigo; Muranushi, Takayuki; Makino, Junichiro

    2016-08-01

    We present the basic idea, implementation, measured performance, and performance model of FDPS (Framework for Developing Particle Simulators). FDPS is an application-development framework which helps researchers to develop simulation programs using particle methods for large-scale distributed-memory parallel supercomputers. A particle-based simulation program for distributed-memory parallel computers needs to perform domain decomposition, exchange of particles which are not in the domain of each computing node, and gathering of the particle information in other nodes which are necessary for interaction calculation. Also, even if distributed-memory parallel computers are not used, in order to reduce the amount of computation, algorithms such as the Barnes-Hut tree algorithm or the Fast Multipole Method should be used in the case of long-range interactions. For short-range interactions, some methods to limit the calculation to neighbor particles are required. FDPS provides all of these functions which are necessary for efficient parallel execution of particle-based simulations as "templates," which are independent of the actual data structure of particles and the functional form of the particle-particle interaction. By using FDPS, researchers can write their programs with the amount of work necessary to write a simple, sequential and unoptimized program of O(N²) calculation cost, and yet the program, once compiled with FDPS, will run efficiently on large-scale parallel supercomputers. A simple gravitational N-body program can be written in around 120 lines. We report the actual performance of these programs and the performance model. The weak scaling performance is very good, and almost linear speed-up was obtained for up to the full system of the K computer. The minimum calculation time per timestep is in the range of 30 ms (N = 10⁷) to 300 ms (N = 10⁹). These are currently limited by the time for the calculation of the domain decomposition and communication
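
    For reference, the "simple, sequential and unoptimized program of O(N²) calculation cost" that the abstract uses as its baseline looks essentially like the following direct-summation gravity kernel (plain Python, G = 1, with a softening length; FDPS itself is a C++ template library, so this is only the conceptual baseline, not FDPS code):

```python
import random

def direct_nbody_accel(pos, mass, eps=1e-3):
    """O(N^2) direct-summation gravitational accelerations (G = 1)."""
    n = len(pos)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = [pos[j][k] - pos[i][k] for k in range(3)]
            r2 = sum(d * d for d in dx) + eps * eps   # softened squared distance
            inv_r3 = r2 ** -1.5
            for k in range(3):
                acc[i][k] += mass[j] * dx[k] * inv_r3
    return acc

random.seed(0)
pos = [[random.random() for _ in range(3)] for _ in range(64)]
mass = [1.0 / 64] * 64
print(direct_nbody_accel(pos, mass)[0])   # acceleration of the first particle
```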

  5. A Framework for Process Reengineering in Higher Education: A case study of distance learning exam scheduling and distribution

    Directory of Open Access Journals (Sweden)

    M'hammed Abdous

    2008-10-01

    Full Text Available In this paper, we propose a conceptual and operational framework for process reengineering (PR) in higher education (HE) institutions. Using a case study aimed at streamlining exam scheduling and distribution in a distance learning (DL) unit, we outline a sequential and non-linear four-step framework designed to reengineer processes. The first two steps of this framework – initiating and analyzing – are used to initiate, document, and flowchart the process targeted for reengineering, and the last two steps – reengineering/implementing and evaluating – are intended to prototype, implement, and evaluate the reengineered process. Our early involvement of all stakeholders, and our in-depth analysis and documentation of the existing process, allowed us to avoid the traditional pitfalls associated with business process reengineering (BPR). Consequently, the outcome of our case study indicates a streamlined and efficient process with higher faculty satisfaction at a substantial cost reduction.

  6. Implementation framework for chronic disease intervention effectiveness in Māori and other indigenous communities.

    Science.gov (United States)

    Oetzel, John; Scott, Nina; Hudson, Maui; Masters-Awatere, Bridgette; Rarere, Moana; Foote, Jeff; Beaton, Angela; Ehau, Terry

    2017-09-05

    About 40% of all health burden in New Zealand is due to cancer, cardiovascular disease, and type 2 diabetes/obesity. Outcomes for Māori (indigenous people) are significantly worse than non-Māori; these inequities mirror those found in indigenous communities elsewhere. Evidence-based interventions with established efficacy may not be effective in indigenous communities without addressing specific implementation challenges. We present an implementation framework for interventions to prevent and treat chronic conditions for Māori and other indigenous communities. The He Pikinga Waiora Implementation Framework has indigenous self-determination at its core and consists of four elements: cultural-centeredness, community engagement, systems thinking, and integrated knowledge translation. All elements have conceptual fit with Kaupapa Māori aspirations (i.e., indigenous knowledge creation, theorizing, and methodology) and all have demonstrated evidence of positive implementation outcomes. A coding scheme derived from the Framework was applied to 13 studies of diabetes prevention in indigenous communities in Australia, Canada, New Zealand, and the United States from a systematic review. Cross-tabulations demonstrated that cultural-centeredness (p = .008) and community engagement (p = .009) explained differences in diabetes outcomes and community engagement (p = .098) explained differences in blood pressure outcomes. The He Pikinga Waiora Implementation Framework appears to be well suited to advance implementation science for indigenous communities in general and Māori in particular. The framework has promise as a policy and planning tool to evaluate and design effective interventions for chronic disease prevention in indigenous communities.

  7. Teacher Competencies for the Implementation of Collaborative Learning in the Classroom: A Framework and Research Review

    Science.gov (United States)

    Kaendler, Celia; Wiedmann, Michael; Rummel, Nikol; Spada, Hans

    2015-01-01

    This article describes teacher competencies for implementing collaborative learning in the classroom. Research has shown that the effectiveness of collaborative learning largely depends on the quality of student interaction. We therefore focus on what a "teacher" can do to foster student interaction. First, we present a framework that…

  8. A Conceptual Framework for the Development, Implementation, and Evaluation of Formal Mentoring Programs.

    Science.gov (United States)

    Gaskill, LuAnn Ricketts

    1993-01-01

    Data from a survey of executive development directors were the basis for this mentoring program framework, consisting of (1) program development (protege and mentor selection, training, and linkage); (2) implementation (career and psychosocial functions); and (3) evaluation (formal and informal outcomes assessment). (SK)

  9. Proposing a New Framework and an Innovative Approach to Teaching Reengineering and ERP Implementation Concepts

    Science.gov (United States)

    Pellerin, Robert; Hadaya, Pierre

    2008-01-01

    Recognizing the need to teach ERP implementation and business process reengineering (BPR) concepts simultaneously, as well as the pedagogical limitations of the case teaching method and simulation tools, the objective of this study is to propose a new framework and an innovative teaching approach to improve the ERP training experience for IS…

  10. Implementations of FroboMind using the Robot Operating System framework

    OpenAIRE

    Nielsen, Søren Hundevadt; Bøgild, Anders; Jensen, Kjeld; Bertelsen, Keld Kjærhus

    2011-01-01

    Several different architectures have been suggested for agricultural robotic vehicles, such as Agriture, the Hortibot, the AgroBot Linux shared-memory based architecture, and FroboMind. This work presents the implementation of the latter conceptual architecture, FroboMind, utilising the open-source cross-language robotic framework ROS. Secondly, a rugged hardware platform, the FroboBox, is presented.

  11. Teacher Competencies for the Implementation of Collaborative Learning in the Classroom: A Framework and Research Review

    Science.gov (United States)

    Kaendler, Celia; Wiedmann, Michael; Rummel, Nikol; Spada, Hans

    2015-01-01

    This article describes teacher competencies for implementing collaborative learning in the classroom. Research has shown that the effectiveness of collaborative learning largely depends on the quality of student interaction. We therefore focus on what a "teacher" can do to foster student interaction. First, we present a framework that…

  12. The Use of Ethical Frameworks for Implementing Science as a Human Endeavour in Year 10 Biology

    Science.gov (United States)

    Yap, Siew Fong; Dawson, Vaille

    2014-01-01

    This research focuses on the use of ethical frameworks as a pedagogical model for socio-scientific education in implementing the "Science as a Human Endeavour" (SHE) strand of the Australian Curriculum: Science in a Year 10 biology class in a Christian college in metropolitan Perth, Western Australia. Using a case study approach, a mixed…

  13. Communication Channels as Implementation Determinants of Performance Management Framework in Kenya

    Science.gov (United States)

    Sang, Jane

    2016-01-01

    The purpose of this study was to assess communication channels as implementation determinants of the performance management framework in Kenya at Moi Teaching and Referral Hospital (MTRH). Communication theory was used to inform the study. This study adopted an explanatory design. The study targeted 510 respondents through simple random and stratified…

  14. Implementation of a Framework for Collaborative Social Networks in E-Learning

    Science.gov (United States)

    Maglajlic, Seid

    2016-01-01

    This paper describes the implementation of a framework for the construction and utilization of social networks in E-Learning. These social networks aim to enhance collaboration between all E-Learning participants (i.e. both trainee-to-trainee and trainee-to-tutor communication are targeted). E-Learning systems that include a so-called "social…

  15. Let's Talk Learning Analytics: A Framework for Implementation in Relation to Student Retention

    Science.gov (United States)

    West, Deborah; Heath, David; Huijser, Henk

    2016-01-01

    This paper presents a dialogical tool for the advancement of learning analytics implementation for student retention in Higher Education institutions. The framework was developed as an outcome of a project commissioned and funded by the Australian Government's "Office for Learning and Teaching". The project took a mixed-method approach…

  16. Implementing a Multidimensional Poverty Measure Using Mixed Methods and a Participatory Framework

    Science.gov (United States)

    Mitra, Sophie; Jones, Kris; Vick, Brandon; Brown, David; McGinn, Eileen; Alexander, Mary Jane

    2013-01-01

    Recently, there have been advances in the development of multidimensional poverty measures. Work is needed however on how to implement such measures. This paper deals with the process of selecting dimensions and setting weights in multidimensional poverty measurement using qualitative and quantitative methods in a participatory framework. We…

  17. A general framework for implementing NLO calculations in shower Monte Carlo programs. The POWHEG BOX

    Energy Technology Data Exchange (ETDEWEB)

    Alioli, Simone [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany)]; Nason, Paolo [INFN, Milano-Bicocca (Italy)]; Oleari, Carlo [INFN, Milano-Bicocca (Italy); Milano-Bicocca Univ. (Italy)]; Re, Emanuele [Durham Univ. (United Kingdom). Inst. for Particle Physics Phenomenology]

    2010-02-15

    In this work we illustrate the POWHEG BOX, a general computer code framework for implementing NLO calculations in shower Monte Carlo programs according to the POWHEG method. The aim of this work is to provide an illustration of the needed theoretical ingredients, a view of how the code is organized, and a description of what a user should provide in order to use it. (orig.)

  18. Improving district level health planning and priority setting in Tanzania through implementing accountability for reasonableness framework

    DEFF Research Database (Denmark)

    Maluka, Stephen; Kamuzora, Peter; Sebastián, Miguel San

    2010-01-01

    In 2006, researchers and decision-makers launched a five-year project - Response to Accountable Priority Setting for Trust in Health Systems (REACT) - to improve planning and priority-setting through implementing the Accountability for Reasonableness framework in Mbarali District, Tanzania...

  19. Implementation of a PETN failure model using ARIA's general chemistry framework

    Energy Technology Data Exchange (ETDEWEB)

    Hobbs, Michael L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)]

    2017-01-01

    We previously developed a PETN thermal decomposition model that accurately predicts thermal ignition and detonator failure [1]. This model was originally developed for CALORE [2] and required several complex user subroutines. Recently, a simplified version of the PETN decomposition model was implemented into ARIA [3] using a general chemistry framework without need for user subroutines. Detonator failure was also predicted with this new model using ENCORE. The model was simplified by 1) basing the model on moles rather than mass, 2) simplifying the thermal conductivity model, and 3) implementing ARIA’s new phase change model. This memo briefly describes the model, implementation, and validation.

  20. Spatially-Distributed Cost-Effectiveness Analysis Framework to Control Phosphorus from Agricultural Diffuse Pollution.

    Directory of Open Access Journals (Sweden)

    Runzhe Geng

    Best management practices (BMPs) for agricultural diffuse pollution control are implemented at the field or small-watershed scale. However, assessing the benefits of BMP implementation on receiving water quality at multiple spatial scales remains an ongoing challenge. In this paper, we introduce an integrated approach that combines risk assessment (i.e., a Phosphorus (P) index), model simulation techniques (Hydrological Simulation Program-FORTRAN), and a BMP placement tool at various scales to identify the optimal locations for implementing multiple BMPs and to estimate BMP effectiveness after implementation. A statistically significant decrease in nutrient discharge from watersheds is proposed to evaluate the effectiveness of BMPs, strategically targeted within watersheds. Specifically, we estimate two types of cost-effectiveness curves (total pollution reduction and proportion of watersheds improved) for four allocation approaches. Selection of a "best approach" depends on the relative importance of the two types of effectiveness, which involves a value judgment based on the random/aggregated degree of BMP distribution among and within sub-watersheds. A statistical optimization framework is developed and evaluated in the Chaohe River Watershed, located in the northern mountain area of Beijing. Results show that BMP implementation significantly (p < 0.001) decreased P loss from the watershed. Remedial strategies in which BMPs were targeted to areas at high risk of P loss decreased P loads compared with strategies in which BMPs were randomly located across watersheds. Sensitivity analysis indicated that aggregated BMP placement in particular watersheds is the most cost-effective scenario for decreasing P loss. The optimization approach outlined in this paper is a spatially hierarchical method for targeting nonpoint source controls across a range of scales from field to farm, to watersheds, to regions. Further, model estimates showed targeting at multiple scales is necessary to optimize program

  1. Spatially-Distributed Cost–Effectiveness Analysis Framework to Control Phosphorus from Agricultural Diffuse Pollution

    Science.gov (United States)

    Geng, Runzhe; Wang, Xiaoyan; Sharpley, Andrew N.; Meng, Fande

    2015-01-01

    Best management practices (BMPs) for agricultural diffuse pollution control are implemented at the field or small-watershed scale. However, assessing the benefits of BMP implementation on receiving water quality at multiple spatial scales remains an ongoing challenge. In this paper, we introduce an integrated approach that combines risk assessment (i.e., a Phosphorus (P) index), model simulation techniques (Hydrological Simulation Program–FORTRAN), and a BMP placement tool at various scales to identify the optimal locations for implementing multiple BMPs and to estimate BMP effectiveness after implementation. A statistically significant decrease in nutrient discharge from watersheds is proposed to evaluate the effectiveness of BMPs, strategically targeted within watersheds. Specifically, we estimate two types of cost-effectiveness curves (total pollution reduction and proportion of watersheds improved) for four allocation approaches. Selection of a "best approach" depends on the relative importance of the two types of effectiveness, which involves a value judgment based on the random/aggregated degree of BMP distribution among and within sub-watersheds. A statistical optimization framework is developed and evaluated in the Chaohe River Watershed, located in the northern mountain area of Beijing. Results show that BMP implementation significantly (p < 0.001) decreased P loss from the watershed. Remedial strategies in which BMPs were targeted to areas at high risk of P loss decreased P loads compared with strategies in which BMPs were randomly located across watersheds. Sensitivity analysis indicated that aggregated BMP placement in particular watersheds is the most cost-effective scenario for decreasing P loss. The optimization approach outlined in this paper is a spatially hierarchical method for targeting nonpoint source controls across a range of scales from field to farm, to watersheds, to regions. Further, model estimates showed targeting at multiple scales is necessary to optimize program

  2. Design and Implementation of an Efficient Software Communications Architecture Core Framework for a Digital Signal Processors Platform

    Directory of Open Access Journals (Sweden)

    Wael A. Murtada

    2011-01-01

    Problem statement: The Software Communications Architecture (SCA) was developed to improve software reuse and interoperability in Software Defined Radios (SDR). However, there have been performance concerns since its conception. Arguably, the majority of the problems and inefficiencies associated with the SCA can be attributed to the assumption of modular distributed platforms relying on General Purpose Processors (GPPs) to perform all signal processing. Approach: Significant improvements in cost and power consumption can be obtained by utilizing specialized and more efficient platforms. Digital Signal Processors (DSPs) present such a platform and have been widely used in the communications industry. Improvements in development tools and middleware technology opened the possibility of fully integrating DSPs into the SCA. This approach takes advantage of the exceptional power, cost and performance characteristics of DSPs, while still enjoying the flexibility and portability of the SCA. Results: This study presents the design and implementation of an SCA Core Framework (CF) for a TI TMS320C6416 DSP. The framework is deployed on a C6416 Device Cycle Accurate Simulator and a TI C6416 development board. The SCA CF is implemented by leveraging OSSIE, an open-source implementation of the SCA, to support the DSP platform. OIS's ORBExpress DSP and DSP/BIOS are used as the middleware and operating system, respectively. A sample waveform was developed to demonstrate the framework's functionality. Benchmark results for the framework and sample applications are provided. Conclusion: Benchmark results show that using OIS's ORBExpress DSP ORB middleware decreases the software memory footprint and increases system performance compared with PrismTech's e*ORB middleware.

  3. Implementation of Web-based Information Systems in Distributed Organizations

    DEFF Research Database (Denmark)

    Bødker, Keld; Pors, Jens Kaaber; Simonsen, Jesper

    2004-01-01

    This article presents results elicited from studies conducted in relation to implementing a web-based information system throughout a large distributed organization. We demonstrate the kind of expectations and conditions for change that management face in relation to open-ended, configurable…, and context-specific web-based information systems like Lotus QuickPlace. Our synthesis from the empirical findings is related to two recent models, the improvisational change management model suggested by Orlikowski and Hofman (1997), and Gallivan's (2001) model for organizational adoption and assimilation...

  4. Implementation of Physical Layer Key Distribution using Software Defined Radios

    Directory of Open Access Journals (Sweden)

    S. Kambala

    2013-01-01

    It was well known from Shannon’s days that characteristics of the physical channel like attenuation, fading and noise can impair reliable communication. But it was more recently that the beneficial side effects of channel characteristics in ensuring secret communication started getting attention. Studies have been made to quantify the amount of secrecy that can be reaped by combining channel coding with security protocols. The Wiretap channel proposed by Wyner is arguably one of the oldest models of physical layer security protocols. In this paper, we present a brief tutorial introduction to the Wiretap channel, followed by an application of the physical layer model to a class of Key Distribution protocols. We present results from an implementation of key distribution protocols using Software Defined Radio tools along with physical RF hardware peripherals. We believe this approach is much more tangible and informative than computer based simulation studies.
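
    As a side note to this record, the central quantity behind such wiretap-channel protocols can be shown with a short, self-contained sketch. The Python snippet below is not taken from the paper; it computes the secrecy capacity of a degraded binary-symmetric wiretap channel from two assumed crossover probabilities.

```python
import math

def h2(p: float) -> float:
    """Binary entropy function, in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_wiretap_secrecy_capacity(p_main: float, p_eve: float) -> float:
    """Secrecy capacity (bits per channel use) of a binary-symmetric wiretap channel.

    p_main: crossover probability of the legitimate (main) channel
    p_eve:  crossover probability of the eavesdropper's channel
    The capacity is positive only when the eavesdropper's channel is noisier.
    """
    return max(0.0, h2(p_eve) - h2(p_main))

if __name__ == "__main__":
    # Assumed example values: main-channel BER 1%, eavesdropper BER 10%.
    print(f"secrecy capacity = {bsc_wiretap_secrecy_capacity(0.01, 0.10):.3f} bit/use")
```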

  5. [Documentation of good distribution practice of medicines and its implementation in Lithuanian drug distribution companies].

    Science.gov (United States)

    Draksiene, Gailute; Petkevicius, Henrikas; Radziūnas, Raimondas

    2003-01-01

    Good Distribution Practice of medicinal products for human use is a quality warranty system, which includes requirements for purchase, receiving, storage and export of drugs, intended human consumption. A drug is a specific product and its mishandling is dangerous to human health and life. Therefore it is necessary to strictly control the movement of the drug from the producer to the consumer so that poor quality drugs do not have access to the market. Good Distribution Practice rules set the general requirements for good wholesale distribution practice of drugs, intended for human consumption. In order for company to meet the specified requirements, the drug distribution company must have all suitable and necessary premises, machinery, equipment, the required number of employees and specified documentation. The preparation of the Good Distribution Practice documentation is one of the most important and complex aspects when implementing the Good Distribution Practice in the companies. The article deals with the analysis of results obtained during the research of drug distribution companies in Lithuania. The research revealed that drug distribution companies put emphasis on the equipment of storage premises. Less attention is being paid to the preparation of the documents of Good Distribution Practice. The article thus presents the analysis of Good Distribution Practice documents prepared by the drug distribution companies.

  6. Hydra: a scalable proteomic search engine which utilizes the Hadoop distributed computing framework

    Directory of Open Access Journals (Sweden)

    Lewis Steven

    2012-12-01

    Background: For shotgun mass spectrometry based proteomics the most computationally expensive step is in matching the spectra against an increasingly large database of sequences and their post-translational modifications with known masses. Each mass spectrometer can generate data at an astonishingly high rate, and the scope of what is searched for is continually increasing. Therefore solutions for improving our ability to perform these searches are needed. Results: We present a sequence database search engine that is specifically designed to run efficiently on the Hadoop MapReduce distributed computing framework. The search engine implements the K-score algorithm, generating comparable output for the same input files as the original implementation. The scalability of the system is shown, and the architecture required for the development of such distributed processing is discussed. Conclusion: The software is scalable in its ability to handle a large peptide database, numerous modifications and large numbers of spectra. Performance scales with the number of processors in the cluster, allowing throughput to expand with the available resources.
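
    The record itself contains no code; as a rough illustration of the map/reduce decomposition it describes, the sketch below scores spectra against a toy peptide list in a Hadoop Streaming style mapper and keeps the best match per spectrum in the reducer. The input format, peptide list and scoring function are stand-ins invented for the example, not Hydra's actual K-score implementation.

```python
# Hadoop Streaming-style sketch (illustrative only, not the Hydra code).
# Mapper input lines: "<spectrum_id>\t<precursor_mass>"
import sys

PEPTIDES = {"PEPTIDER": 1050.5, "SAMPLEK": 763.4, "TESTPEPK": 905.4}  # toy database

def toy_score(precursor_mass: float, peptide_mass: float) -> float:
    """Stand-in for a real spectrum-peptide match score such as K-score."""
    return 1.0 / (1.0 + abs(precursor_mass - peptide_mass))

def mapper(lines):
    for line in lines:
        spectrum_id, mass = line.rstrip("\n").split("\t")
        for peptide, pmass in PEPTIDES.items():
            print(f"{spectrum_id}\t{peptide}\t{toy_score(float(mass), pmass):.4f}")

def reducer(lines):
    best = {}  # spectrum_id -> (peptide, score)
    for line in lines:
        spectrum_id, peptide, score = line.rstrip("\n").split("\t")
        if float(score) > best.get(spectrum_id, ("", -1.0))[1]:
            best[spectrum_id] = (peptide, float(score))
    for spectrum_id, (peptide, score) in best.items():
        print(f"{spectrum_id}\t{peptide}\t{score:.4f}")

if __name__ == "__main__":
    (mapper if sys.argv[1:] == ["map"] else reducer)(sys.stdin)
```

    Saved as, say, hydra_sketch.py (a hypothetical file name), the two stages can be exercised locally with `cat spectra.tsv | python hydra_sketch.py map | sort | python hydra_sketch.py reduce` before being handed to Hadoop Streaming.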

  7. Hydra: a scalable proteomic search engine which utilizes the Hadoop distributed computing framework.

    Science.gov (United States)

    Lewis, Steven; Csordas, Attila; Killcoyne, Sarah; Hermjakob, Henning; Hoopmann, Michael R; Moritz, Robert L; Deutsch, Eric W; Boyle, John

    2012-12-05

    For shotgun mass spectrometry based proteomics the most computationally expensive step is in matching the spectra against an increasingly large database of sequences and their post-translational modifications with known masses. Each mass spectrometer can generate data at an astonishingly high rate, and the scope of what is searched for is continually increasing. Therefore solutions for improving our ability to perform these searches are needed. We present a sequence database search engine that is specifically designed to run efficiently on the Hadoop MapReduce distributed computing framework. The search engine implements the K-score algorithm, generating comparable output for the same input files as the original implementation. The scalability of the system is shown, and the architecture required for the development of such distributed processing is discussed. The software is scalable in its ability to handle a large peptide database, numerous modifications and large numbers of spectra. Performance scales with the number of processors in the cluster, allowing throughput to expand with the available resources.

  8. Implementation of High Speed Distributed Data Acquisition System

    Science.gov (United States)

    Raju, Anju P.; Sekhar, Ambika

    2012-09-01

    This paper introduces a high speed distributed data acquisition system based on a field programmable gate array (FPGA). The aim is to develop a "distributed" data acquisition interface. The development of instruments such as personal computers and engineering workstations based on "standard" platforms is the motivation behind this effort. Using standard platforms as the controlling unit allows independence in hardware from a particular vendor and hardware platform. The distributed approach also has advantages from a functional point of view: acquisition resources become available to multiple instruments; the acquisition front-end can be physically remote from the rest of the instrument. The high speed data acquisition system transmits data at high speed to a remote computer system through an Ethernet interface. The data are acquired through 16 analog input channels. The input channels are multiplexed and digitized, and the data are then stored in a 1K buffer for each input channel. The main control unit in this design is a 16 bit processor implemented in the FPGA. This 16 bit processor is used to set up and initialize the data source and the Ethernet controller, as well as to control the flow of data from the memory element to the NIC. Using this processor we can initialize and control the different configuration registers in the Ethernet controller in an easy manner. These data packets are then sent to the remote PC through the Ethernet interface. The main advantages of using an FPGA as the standard platform are its flexibility, low power consumption, short design duration, fast time to market, programmability and high density. The main advantages of using the Ethernet controller AX88796 over others are its non-PCI interface, the presence of embedded SRAM where transmit and reception buffers are located, and its high-performance SRAM-like interface. The paper describes the implementation of the distributed data acquisition on an FPGA in VHDL. The main advantages of this system are high
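
    For illustration of the receiving (PC) side of such a system, the sketch below accepts a TCP connection and unpacks fixed-size packets into one 16-bit sample per channel. The packet layout, port and endianness are assumptions made for the example, not details taken from the paper.

```python
import socket
import struct

HOST, PORT = "0.0.0.0", 5000             # assumed listening address, not from the paper
CHANNELS = 16
PACKET_FMT = "<" + "H" * CHANNELS         # assumed layout: 16 little-endian unsigned 16-bit samples
PACKET_SIZE = struct.calcsize(PACKET_FMT)

def recv_exact(conn: socket.socket, n: int) -> bytes:
    """Read exactly n bytes from a TCP stream (TCP may deliver partial chunks)."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("acquisition front-end closed the connection")
        buf += chunk
    return buf

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
    srv.bind((HOST, PORT))
    srv.listen(1)
    conn, addr = srv.accept()
    with conn:
        while True:
            samples = struct.unpack(PACKET_FMT, recv_exact(conn, PACKET_SIZE))
            print(dict(enumerate(samples)))  # channel index -> raw sample
```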

  9. A C++ framework for active objects in embedded real-time systems-bridging the gap between modeling and implementation

    DEFF Research Database (Denmark)

    Caspersen, Michael Edelgaard

    1999-01-01

    In research communities it is now well-accepted that the object paradigm provides a good foundation for the challenges of concurrent and distributed computing. For practitioners, however, it is not without problems to combine the concepts of processes and objects. A major reason for this is that the predominant object-oriented programming language in industry, C++, does not support concurrency. In this paper we present a simple and powerful approach to extending C++ with constructs for concurrent programming. We discuss the design, application, and implementation of a framework that supports standard concurrency constructs and, contrary to what is suggested in several books on object-oriented modeling techniques for real-time systems, we demonstrate that it is possible to integrate the notions of object and process and maintain a smooth, virtually non-existing, transition from modeling to implementation.
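
    The record concerns C++, but the active-object idea it builds on, an object whose requests are served by its own thread through a message queue, can be sketched briefly. The Python snippet below is only an illustration of that general pattern, not the framework described in the record.

```python
import queue
import threading

class ActiveObject:
    """Minimal active-object sketch: callers enqueue requests, a private thread serves them."""

    def __init__(self):
        self._mailbox: queue.Queue = queue.Queue()
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def _run(self):
        while True:
            func, args, result = self._mailbox.get()
            if func is None:          # shutdown sentinel
                break
            result.append(func(*args))

    def call(self, func, *args) -> list:
        """Schedule func(*args) on the object's own thread; the result list is filled when served."""
        result: list = []             # crude stand-in for a future
        self._mailbox.put((func, args, result))
        return result

    def stop(self):
        self._mailbox.put((None, (), []))
        self._thread.join()

if __name__ == "__main__":
    obj = ActiveObject()
    result = obj.call(sum, [1, 2, 3])
    obj.stop()                        # joining guarantees the request was served
    print(result)                     # [6]
```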

  10. How to make more out of community data? A conceptual framework and its implementation as models and software.

    Science.gov (United States)

    Ovaskainen, Otso; Tikhonov, Gleb; Norberg, Anna; Guillaume Blanchet, F; Duan, Leo; Dunson, David; Roslin, Tomas; Abrego, Nerea

    2017-03-20

    Community ecology aims to understand what factors determine the assembly and dynamics of species assemblages at different spatiotemporal scales. To facilitate the integration between conceptual and statistical approaches in community ecology, we propose Hierarchical Modelling of Species Communities (HMSC) as a general, flexible framework for modern analysis of community data. While non-manipulative data allow for only correlative and not causal inference, this framework facilitates the formulation of data-driven hypotheses regarding the processes that structure communities. We model environmental filtering by variation and covariation in the responses of individual species to the characteristics of their environment, with potential contingencies on species traits and phylogenetic relationships. We capture biotic assembly rules by species-to-species association matrices, which may be estimated at multiple spatial or temporal scales. We operationalise the HMSC framework as a hierarchical Bayesian joint species distribution model, and implement it as R- and Matlab-packages which enable computationally efficient analyses of large data sets. Armed with this tool, community ecologists can make sense of many types of data, including spatially explicit data and time-series data. We illustrate the use of this framework through a series of diverse ecological examples.

  11. A modified theoretical framework to assess implementation fidelity of adaptive public health interventions.

    Science.gov (United States)

    Pérez, Dennis; Van der Stuyft, Patrick; Zabala, María del Carmen; Castro, Marta; Lefèvre, Pierre

    2016-07-08

    One of the major debates in implementation research turns around fidelity and adaptation. Fidelity is the degree to which an intervention is implemented as intended by its developers. It is meant to ensure that the intervention maintains its intended effects. Adaptation is the process of implementers or users bringing changes to the original design of an intervention. Depending on the nature of the modifications brought, adaptation could either be potentially positive or could carry the risk of threatening the theoretical basis of the intervention, resulting in a negative effect on expected outcomes. Adaptive interventions are those for which adaptation is allowed or even encouraged. Classical fidelity dimensions and conceptual frameworks do not address the issue of how to adapt an intervention while still maintaining its effectiveness. We support the idea that fidelity and adaptation co-exist and that adaptations can impact either positively or negatively on the intervention's effectiveness. For adaptive interventions, research should answer the question how an adequate fidelity-adaptation balance can be reached. One way to address this issue is by looking systematically at the aspects of an intervention that are being adapted. We conducted fidelity research on the implementation of an empowerment strategy for dengue prevention in Cuba. In view of the adaptive nature of the strategy, we anticipated that the classical fidelity dimensions would be of limited use for assessing adaptations. The typology we used in the assessment-implemented, not-implemented, modified, or added components of the strategy-also had limitations. It did not allow us to answer the question which of the modifications introduced in the strategy contributed to or distracted from outcomes. We confronted our empirical research with existing literature on fidelity, and as a result, considered that the framework for implementation fidelity proposed by Carroll et al. in 2007 could potentially meet

  12. Implementing change in primary care practices using electronic medical records: a conceptual framework

    Directory of Open Access Journals (Sweden)

    Stuart Gail W

    2008-01-01

    Background: Implementing change in primary care is difficult, and little practical guidance is available to assist small primary care practices. Methods to structure care and develop new roles are often needed to implement an evidence-based practice that improves care. This study explored the process of change used to implement clinical guidelines for primary and secondary prevention of cardiovascular disease in primary care practices that used a common electronic medical record (EMR). Methods: Multiple conceptual frameworks informed the design of this study, which sought to explain the complex phenomena of implementing change in primary care practice. Qualitative methods were used to examine the processes of change that practice members used to implement the guidelines. Purposive sampling in eight primary care practices within the Practice Partner Research Network-Translating Research into Practice (PPRNet-TRIP II) clinical trial yielded 28 staff members and clinicians who were interviewed regarding how change in practice occurred while implementing clinical guidelines for primary and secondary prevention of cardiovascular disease and strokes. Results: A conceptual framework for implementing clinical guidelines into primary care practice was developed through this research. Seven concepts and their relationships were modelled within this framework: leaders setting a vision with clear goals for staff to embrace; involving the team to enable the goals and vision for the practice to be achieved; enhancing communication systems to reinforce goals for patient care; developing the team to enable the staff to contribute toward practice improvement; taking small steps, encouraging practices' tests of small changes in practice; assimilating the electronic medical record to maximize clinical effectiveness, enhancing practices' use of the electronic tool they have invested in for patient care improvement; and providing feedback within a culture of

  13. MVC Design Pattern for the multi framework distributed applications using XML, spring and struts framework

    OpenAIRE

    Praveen Gupta; Prof. M.C. Govil

    2010-01-01

    The model view controller (MVC) is a fundamental design pattern for the separation between user interface logic and business logic. Since applications are very large in size these days, the MVC design pattern can weaken the coupling among the different application tiers. This paper presents a web application framework based on MVC in the J2EE platform, and extends it with XML so that the framework is more flexible, expansible and easy to maintain. This is a multi-tier system includ...

  14. A Modular Framework for Modeling Hardware Elements in Distributed Engine Control Systems

    Science.gov (United States)

    Zinnecker, Alicia Mae; Culley, Dennis E.; Aretskin-Hariton, Eliot D.

    2014-01-01

    Progress toward the implementation of distributed engine control in an aerospace application may be accelerated through the development of a hardware-in-the-loop (HIL) system for testing new control architectures and hardware outside of a physical test cell environment. One component required in an HIL simulation system is a high-fidelity model of the control platform: sensors, actuators, and the control law. The control system developed for the Commercial Modular Aero-Propulsion System Simulation 40k (40,000 pound force thrust) (C-MAPSS40k) provides a verifiable baseline for development of a model for simulating a distributed control architecture. This distributed controller model will contain enhanced hardware models, capturing the dynamics of the transducer and the effects of data processing, and a model of the controller network. A multilevel framework is presented that establishes three sets of interfaces in the control platform: communication with the engine (through sensors and actuators), communication between hardware and controller (over a network), and the physical connections within individual pieces of hardware. This introduces modularity at each level of the model, encouraging collaboration in the development and testing of various control schemes or hardware designs. At the hardware level, this modularity is leveraged through the creation of a Simulink (R) library containing blocks for constructing smart transducer models complying with the IEEE 1451 specification. These hardware models were incorporated in a distributed version of the baseline C-MAPSS40k controller and simulations were run to compare the performance of the two models. The overall tracking ability differed only due to quantization effects in the feedback measurements in the distributed controller. Additionally, it was also found that the added complexity of the smart transducer models did not prevent real-time operation of the distributed controller model, a requirement of an HIL

  15. A Modular Framework for Modeling Hardware Elements in Distributed Engine Control Systems

    Science.gov (United States)

    Zinnecker, Alicia M.; Culley, Dennis E.; Aretskin-Hariton, Eliot D.

    2015-01-01

    Progress toward the implementation of distributed engine control in an aerospace application may be accelerated through the development of a hardware-in-the-loop (HIL) system for testing new control architectures and hardware outside of a physical test cell environment. One component required in an HIL simulation system is a high-fidelity model of the control platform: sensors, actuators, and the control law. The control system developed for the Commercial Modular Aero-Propulsion System Simulation 40k (C-MAPSS40k) provides a verifiable baseline for development of a model for simulating a distributed control architecture. This distributed controller model will contain enhanced hardware models, capturing the dynamics of the transducer and the effects of data processing, and a model of the controller network. A multilevel framework is presented that establishes three sets of interfaces in the control platform: communication with the engine (through sensors and actuators), communication between hardware and controller (over a network), and the physical connections within individual pieces of hardware. This introduces modularity at each level of the model, encouraging collaboration in the development and testing of various control schemes or hardware designs. At the hardware level, this modularity is leveraged through the creation of a Simulink (R) library containing blocks for constructing smart transducer models complying with the IEEE 1451 specification. These hardware models were incorporated in a distributed version of the baseline C-MAPSS40k controller and simulations were run to compare the performance of the two models. The overall tracking ability differed only due to quantization effects in the feedback measurements in the distributed controller. Additionally, it was also found that the added complexity of the smart transducer models did not prevent real-time operation of the distributed controller model, a requirement of an HIL system.
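
    To make the quantization effect mentioned above concrete, here is a minimal sketch of a "smart sensor" model that applies a first-order lag, range saturation and ADC quantization to a measurement. The range, 12-bit resolution and time constant are assumed values for illustration, not parameters of the C-MAPSS40k hardware library.

```python
from dataclasses import dataclass

@dataclass
class SmartSensorModel:
    """Toy smart-transducer model: first-order lag, saturation and quantization (illustrative only)."""
    lo: float
    hi: float
    bits: int = 12          # assumed ADC resolution
    tau: float = 0.02       # assumed first-order lag time constant [s]
    _state: float = 0.0

    def sample(self, true_value: float, dt: float) -> float:
        # first-order lag dynamics x' = (u - x) / tau, explicit Euler step
        self._state += dt * (true_value - self._state) / self.tau
        # saturate to the transducer range, then snap to the ADC grid
        clipped = min(max(self._state, self.lo), self.hi)
        step = (self.hi - self.lo) / (2 ** self.bits - 1)
        return self.lo + round((clipped - self.lo) / step) * step

if __name__ == "__main__":
    sensor = SmartSensorModel(lo=0.0, hi=100.0)
    for _ in range(5):
        print(sensor.sample(true_value=42.37, dt=0.005))
```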

  16. How can we improve guideline use? A conceptual framework of implementability

    Directory of Open Access Journals (Sweden)

    Lemieux-Charles Louise

    2011-03-01

    Background: Guidelines continue to be underutilized, and a variety of strategies to improve their use have been suboptimal. Modifying guideline features represents an alternative, but untested way to promote their use. The purpose of this study was to identify and define features that facilitate guideline use, and examine whether and how they are included in current guidelines. Methods: A guideline implementability framework was developed by reviewing the implementation science literature. We then examined whether guidelines included these, or additional implementability elements. Data were extracted from publicly available high quality guidelines reflecting primary and institutional care, reviewed independently by two individuals, who through discussion resolved conflicts, then by the research team. Results: The final implementability framework included 22 elements organized in the domains of adaptability, usability, validity, applicability, communicability, accommodation, implementation, and evaluation. Data were extracted from 20 guidelines on the management of diabetes, hypertension, leg ulcer, and heart failure. Most contained a large volume of graded, narrative evidence, and tables featuring complementary clinical information. Few contained additional features that could improve guideline use. These included alternate versions for different users and purposes, summaries of evidence and recommendations, information to facilitate interaction with and involvement of patients, details of resource implications, and instructions on how to locally promote and monitor guideline use. There were no consistent trends by guideline topic. Conclusions: Numerous opportunities were identified by which guidelines could be modified to support various types of decision making by different users. New governance structures may be required to accommodate development of guidelines with these features. Further research is needed to validate the proposed

  17. Global health rights: Employing human rights to develop and implement the Framework Convention on Global Health.

    Science.gov (United States)

    Gable, Lance; Meier, Benjamin Mason

    2013-06-14

    The Framework Convention on Global Health (FCGH) represents an important idea for addressing the expanding array of governance challenges in global health. Proponents of the FCGH suggest that it could further the right to health through its incorporation of rights into national laws and policies, using litigation and community empowerment to advance rights claims and prominently establish the right to health as central to global health governance. Building on efforts to expand development and influence of the right to health through the implementation of the FCGH, in this article we find that human rights correspondingly holds promise in justifying the FCGH. By employing human rights as a means to develop and implement the FCGH, the existing and evolving frameworks of human rights can complement efforts to reform global health governance, with the FCGH and human rights serving as mutually reinforcing bases of norms and accountability in global health.

  18. Development and Implementation of a Telecommuting Evaluation Framework, and Modeling the Executive Telecommuting Adoption Process

    Science.gov (United States)

    Vora, V. P.; Mahmassani, H. S.

    2002-02-01

    This work proposes and implements a comprehensive evaluation framework to document the telecommuter, organizational, and societal impacts of telecommuting through telecommuting programs. Evaluation processes and materials within the outlined framework are also proposed and implemented. As the first component of the evaluation process, the executive survey is administered within a public sector agency. The survey data is examined through exploratory analysis and is compared to a previous survey of private sector executives. The ordinal probit, dynamic probit, and dynamic generalized ordinal probit (DGOP) models of telecommuting adoption are calibrated to identify factors which significantly influence executive adoption preferences and to test the robustness of such factors. The public sector DGOP model of executive willingness to support telecommuting under different program scenarios is compared with an equivalent private sector DGOP model. Through the telecommuting program, a case study of telecommuting travel impacts is performed to further substantiate research.

  19. A distributed software architecture design framework based on attributed grammar

    Institute of Scientific and Technical Information of China (English)

    JIA Xiao-lin; QIN Zheng; HE Jian; YU Fan

    2005-01-01

    Software architectures shift the focus of developers from lines-of-code to coarser-grained architectural elements and their overall interconnection structure. There are, however, many features of distributed software that make its development quite different from traditional approaches, and traditional centralized approaches with fixed interfaces cannot adapt to the flexible requirements of distributed software. In this paper, the attributed grammar (AG) is extended to capture the characteristics of distributed software, a distributed software architecture description language (DSADL) based on attributed grammar is introduced, and a model of an integrated environment for software architecture design is proposed. Practical experience demonstrates that DSADL can help programmers analyze and design distributed software effectively, so development efficiency can be improved greatly.

  20. Improving district level health planning and priority setting in Tanzania through implementing accountability for reasonableness framework

    DEFF Research Database (Denmark)

    Maluka, Stephen; Kamuzora, Peter; Sebastián, Miguel San;

    2010-01-01

    In 2006, researchers and decision-makers launched a five-year project - Response to Accountable Priority Setting for Trust in Health Systems (REACT) - to improve planning and priority-setting through implementing the Accountability for Reasonableness framework in Mbarali District, Tanzania....... The objective of this paper is to explore the acceptability of Accountability for Reasonableness from the perspectives of the Council Health Management Team, local government officials, health workforce and members of user boards and committees....

  1. Can the theoretical domains framework account for the implementation of clinical quality interventions?

    OpenAIRE

    Lipworth, Wendy; Taylor, Natalie; Braithwaite, Jeffrey

    2013-01-01

    Background The health care quality improvement movement is a complex enterprise. Implementing clinical quality initiatives requires attitude and behaviour change on the part of clinicians, but this has proven to be difficult. In an attempt to solve this kind of behavioural challenge, the theoretical domains framework (TDF) has been developed. The TDF consists of 14 domains from psychological and organisational theory said to influence behaviour change. No systematic research has been conducte...

  2. A framework for training health professionals in implementation and dissemination science.

    Science.gov (United States)

    Gonzales, Ralph; Handley, Margaret A; Ackerman, Sara; O'Sullivan, Patricia S

    2012-03-01

    The authors describe a conceptual framework for implementation and dissemination science (IDS) and propose competencies for IDS training. Their framework is designed to facilitate the application of theories and methods from the distinct domains of clinical disciplines (e.g., medicine, public health), population sciences (e.g., biostatistics, epidemiology), and translational disciplines (e.g., social and behavioral sciences, business administration education). They explore three principles that guided the development of their conceptual framework: Behavior change among organizations and/or individuals (providers, patients) is inherent in the translation process; engagement of stakeholder organizations, health care delivery systems, and individuals is imperative to achieve effective translation and sustained improvements; and IDS research is iterative, benefiting from cycles and collaborative, bidirectional relationships. The authors propose seven domains for IDS training (team science, context identification, literature identification and assessment, community engagement, intervention design and research implementation, evaluation of the effect of translational activity, and behavioral change communication strategies) and define 12 IDS training competencies within these domains. As a model, they describe specific courses introduced at the University of California, San Francisco, which they designed to develop these competencies. The authors encourage other training programs and institutions to use or adapt the design principles, conceptual framework, and proposed competencies to evaluate their current IDS training needs and to support new program development.

  3. Design, implementation and validation of a novel open framework for agile development of mobile health applications.

    Science.gov (United States)

    Banos, Oresti; Villalonga, Claudia; Garcia, Rafael; Saez, Alejandro; Damas, Miguel; Holgado-Terriza, Juan A; Lee, Sungyong; Pomares, Hector; Rojas, Ignacio

    2015-01-01

    The delivery of healthcare services has experienced tremendous changes during the last years. Mobile health or mHealth is a key engine of advance in the forefront of this revolution. Although there exists a growing development of mobile health applications, there is a lack of tools specifically devised for their implementation. This work presents mHealthDroid, an open source Android implementation of a mHealth Framework designed to facilitate the rapid and easy development of mHealth and biomedical apps. The framework is particularly planned to leverage the potential of mobile devices such as smartphones or tablets, wearable sensors and portable biomedical systems. These devices are increasingly used for the monitoring and delivery of personal health care and wellbeing. The framework implements several functionalities to support resource and communication abstraction, biomedical data acquisition, health knowledge extraction, persistent data storage, adaptive visualization, system management and value-added services such as intelligent alerts, recommendations and guidelines. An exemplary application is also presented along this work to demonstrate the potential of mHealthDroid. This app is used to investigate on the analysis of human behavior, which is considered to be one of the most prominent areas in mHealth. An accurate activity recognition model is developed and successfully validated in both offline and online conditions.

  4. Implementing Peer Learning in Clinical Education: A Framework to Address Challenges In the "Real World".

    Science.gov (United States)

    Tai, Joanna Hong Meng; Canny, Benedict J; Haines, Terry P; Molloy, Elizabeth K

    2017-01-01

    Phenomenon: Peer learning has many benefits and can assist students in gaining the educational skills required in future years when they become teachers themselves. Peer learning may be particularly useful in clinical learning environments, where students report feeling marginalized, overwhelmed, and unsupported. Educational interventions often fail in the workplace environment, as they are often conceived in the "ideal" rather than the complex, messy real world. This work sought to explore barriers and facilitators to implementing peer learning activities in a clinical curriculum. Previous peer learning research results and a matrix of empirically derived peer learning activities were presented to local clinical education experts to generate discussion around the realities of implementing such activities. Potential barriers and limitations of and strategies for implementing peer learning in clinical education were the focus of the individual interviews. Thematic analysis of the data identified three key considerations for real-world implementation of peer learning: culture, epistemic authority, and the primacy of patient-centered care. Strategies for peer learning implementation were also developed from themes within the data, focusing on developing a culture of safety in which peer learning could be undertaken, engaging both educators and students, and establishing expectations for the use of peer learning. Insights: This study identified considerations and strategies for the implementation of peer learning activities, which took into account both educator and student roles. Reported challenges were reflective of those identified within the literature. The resultant framework may aid others in anticipating implementation challenges. Further work is required to test the framework's application in other contexts and its effect on learner outcomes.

  5. Implementation of EUnetHTA core Model® in Lombardia: the VTS framework.

    Science.gov (United States)

    Radaelli, Giovanni; Lettieri, Emanuele; Masella, Cristina; Merlino, Luca; Strada, Alberto; Tringali, Michele

    2014-01-01

    This study describes the health technology assessment (HTA) framework introduced by Regione Lombardia to regulate the introduction of new technologies. The study outlines the process and dimensions adopted to prioritize, assess and appraise the requests of new technologies. The HTA framework incorporates and adapts elements from the EUnetHTA Core Model and the EVIDEM framework. It includes dimensions, topics, and issues provided by EUnetHTA Core Model to collect data and process the assessment. Decision making is instead supported by the criteria and Multi-Criteria Decision Analysis technique from the EVIDEM consortium. The HTA framework moves along three process stages: (i) prioritization of requests, (ii) assessment of prioritized technology, (iii) appraisal of technology in support of decision making. Requests received by Regione Lombardia are first prioritized according to their relevance along eight dimensions (e.g., costs, efficiency and efficacy, organizational impact, safety). Evidence about the impacts of the prioritized technologies is then collected following the issues and topics provided by EUnetHTA Core Model. Finally, the Multi-Criteria Decision Analysis technique is used to appraise the novel technology and support Regione Lombardia decision making. The VTS (Valutazione delle Tecnologie Sanitarie) framework has been successfully implemented at the end of 2011. From its inception, twenty-six technologies have been processed.
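
    As a generic illustration of the appraisal step, the sketch below computes a weighted-sum Multi-Criteria Decision Analysis value for two hypothetical technologies. The criteria, weights and scores are invented for the example; they are not the actual dimensions or weights of the VTS framework.

```python
def mcda_value(weights: dict[str, float], scores: dict[str, float]) -> float:
    """Weighted-sum MCDA value: normalize the weights and combine per-criterion scores in [0, 1]."""
    total_w = sum(weights.values())
    return sum((w / total_w) * scores[criterion] for criterion, w in weights.items())

# Hypothetical criteria and judgments, for illustration only.
weights = {"clinical benefit": 0.30, "safety": 0.25, "budget impact": 0.25, "organizational impact": 0.20}
technology_a = {"clinical benefit": 0.8, "safety": 0.7, "budget impact": 0.4, "organizational impact": 0.6}
technology_b = {"clinical benefit": 0.6, "safety": 0.9, "budget impact": 0.7, "organizational impact": 0.5}

for name, scores in [("A", technology_a), ("B", technology_b)]:
    print(f"technology {name}: MCDA value = {mcda_value(weights, scores):.2f}")
```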

  6. Conceptual framework for the thermal distribution method of test

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, J.W.

    1994-11-01

    A Standard Method of Test for residential thermal distribution efficiency is being developed under the auspices of the American Society of Heating, Refrigerating, and Air-Conditioning Engineers (ASHRAE). Thermal distribution systems are the ductwork, piping, or other means used to transport heat or cooling effect from the equipment that produces this thermal energy to the building spaces that need it. Because thermal distribution systems are embedded in and interact with the larger building system as a whole, a new set of parameters has been developed to describe these systems. This paper was written to fill a perceived need for a concise introduction to this terminology.

  7. Communication Optimizations for a Wireless Distributed Prognostic Framework

    Data.gov (United States)

    National Aeronautics and Space Administration — Distributed architecture for prognostics is an essential step in prognostic research in order to enable feasible real-time system health management. Communication...

  8. A workflow-oriented framework-driven implementation and local adaptation of clinical information systems: a case study of nursing documentation system implementation at a tertiary rehabilitation hospital.

    Science.gov (United States)

    Choi, Jeeyae; Kim, Hyeoneui

    2012-08-01

    Health information systems are often designed and developed without integrating users' specific needs and preferences. This decreases the users' productivity, satisfaction, and acceptance of the system and increases the necessity for a local adaptation process to reduce the unwanted outcomes after implementation. A workflow-oriented framework developed in a previous study indicates that users' needs and preferences could be incorporated into the system when implementation follows the steps of the framework, eventually increasing satisfaction with and usefulness of the system. The overall goal of this study was to demonstrate application of the workflow-oriented framework to the implementation of a nursing documentation system at Spaulding Rehabilitation Hospital. In this case study, we present specific steps of implementing and adapting a health information system at a local site and raise critical questions that need to be answered in each step based on the workflow-oriented framework.

  9. A Framework for Modeling and Analyzing Complex Distributed Systems

    Science.gov (United States)

    2005-08-15

    …the Open-Kronos model checker for timed automata. Monte Carlo model checking has already been implemented in Open-Kronos and has demonstrated significant

  10. Participation in the implementation of the Water Framework Directive in Denmark

    DEFF Research Database (Denmark)

    Wright, Stuart Anthony Lewis; Jacobsen, Brian Højland

    2011-01-01

    Public participation in the form of informing, consulting and actively involving all interested parties is required during the implementation of the Water Framework Directive (WFD). This paper discusses progress with implementation of the WFD in Denmark and the measures taken to conform to the requirements for public participation. The first aim of the paper is to establish whether enough is being done regarding participation in Denmark, the conclusion being that whilst Denmark is in line with statutory requirements, consultation appears limited whilst evidence of active involvement is lacking. The paper then presents the Danish AGWAPLAN project which actively involved farmers in selecting measures to reduce diffuse nutrient pollution from agriculture. The second aim of the paper is to establish whether nationwide implementation of the AGWAPLAN concept is worthwhile. AGWAPLAN resulted in outcomes

  11. Implementation and performance of FDPS: A Framework Developing Parallel Particle Simulation Codes

    CERN Document Server

    Iwasawa, Masaki; Hosono, Natsuki; Nitadori, Keigo; Muranushi, Takayuki; Makino, Junichiro

    2016-01-01

    We have developed FDPS (Framework for Developing Particle Simulator), which enables researchers and programmers to develop high-performance parallel particle simulation codes easily. The basic idea of FDPS is to separate the program code for complex parallelization including domain decomposition, redistribution of particles, and exchange of particle information for interaction calculation between nodes, from actual interaction calculation and orbital integration. FDPS provides the former part and the users write the latter. Thus, a user can implement a high-performance fully parallelized $N$-body code only in 120 lines. In this paper, we present the structure and implementation of FDPS, and describe its performance on three sample applications: disk galaxy simulation, cosmological simulation and Giant impact simulation. All codes show very good parallel efficiency and scalability on K computer and XC30. FDPS lets the researchers concentrate on the implementation of physics and mathematical schemes, without wa...
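
    To illustrate the separation the framework makes between parallelization and interaction calculation, here is a sketch of the user-supplied piece only: a softened direct-summation gravity kernel. It is written in Python rather than C++ and is not FDPS code; the softening length and particle set are arbitrary.

```python
import numpy as np

def gravity_kernel(pos_i: np.ndarray, pos_j: np.ndarray, mass_j: np.ndarray,
                   eps: float = 1e-2) -> np.ndarray:
    """Softened direct-summation gravitational acceleration on particles i due to particles j.

    This is only the interaction-calculation piece a user would supply; domain decomposition
    and particle exchange are what a framework like FDPS handles.
    """
    dr = pos_j[None, :, :] - pos_i[:, None, :]          # pairwise separations, shape (Ni, Nj, 3)
    r2 = np.sum(dr * dr, axis=-1) + eps * eps           # softened squared distances
    inv_r3 = r2 ** -1.5
    return np.sum(mass_j[None, :, None] * dr * inv_r3[:, :, None], axis=1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    pos = rng.random((128, 3))
    mass = np.full(128, 1.0 / 128)
    acc = gravity_kernel(pos, pos, mass)                # self-terms vanish because dr = 0
    print(acc.shape)                                    # (128, 3)
```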

  12. A lightweight messaging-based distributed processing and workflow execution framework for real-time and big data analysis

    Science.gov (United States)

    Laban, Shaban; El-Desouky, Aly

    2014-05-01

    To achieve rapid, simple and reliable parallel processing of different types of tasks and big data processing on any compute cluster, a lightweight messaging-based distributed applications processing and workflow execution framework model is proposed. The framework is based on Apache ActiveMQ and the Simple (or Streaming) Text Oriented Message Protocol (STOMP). ActiveMQ, a popular and powerful open source persistent messaging and integration patterns server with scheduler capabilities, acts as the message broker in the framework. STOMP provides an interoperable wire format that allows framework programs to talk and interact with each other and with ActiveMQ easily. In order to use the message broker efficiently, a unified message and topic naming pattern is utilized to achieve the required operation. Only three Python programs and a simple library, used to unify and simplify the use of ActiveMQ and the STOMP protocol, are needed to use the framework. A watchdog program is used to monitor, remove, add, start and stop any machine and/or its different tasks when necessary. For every machine a single dedicated zoo keeper program is used to start the different functions or tasks (stompShell programs) needed for executing the user-required workflow. The stompShell instances are used to execute any workflow jobs based on received messages. A well-defined, simple and flexible message structure, based on JavaScript Object Notation (JSON), is used to build any complex workflow systems. JSON is also used for configuration and for communication between machines and programs. The framework is platform independent. Although the framework is built using Python, the actual workflow programs or jobs can be implemented in any programming language. The generic framework can be used in small national data centres for processing seismological and radionuclide data received from the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear
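
    As a minimal illustration of the wire format involved, the following sketch publishes a JSON job message to an ActiveMQ queue using raw STOMP 1.2 frames over a TCP socket. The destination name and message fields are assumptions for the example, not the framework's actual naming pattern or message schema.

```python
import json
import socket

HOST, PORT = "localhost", 61613            # default ActiveMQ STOMP port
job = {                                     # hypothetical workflow message, not the paper's schema
    "task": "process_waveform",
    "machine": "node-01",
    "args": {"station": "ABC", "window_s": 600},
}

def frame(command: str, headers: dict, body: str = "") -> bytes:
    """Build a STOMP frame: command, header lines, blank line, body, NUL terminator."""
    head = "\n".join(f"{k}:{v}" for k, v in headers.items())
    return f"{command}\n{head}\n\n{body}\x00".encode()

with socket.create_connection((HOST, PORT)) as sock:
    sock.sendall(frame("CONNECT", {"accept-version": "1.2", "host": "/"}))
    sock.recv(4096)                         # expect a CONNECTED frame back from the broker
    body = json.dumps(job)
    sock.sendall(frame("SEND", {
        "destination": "/queue/node-01.jobs",
        "content-type": "application/json",
        "content-length": len(body),
    }, body))
    sock.sendall(frame("DISCONNECT", {}))
```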

  13. A distributed Canny edge detector: algorithm and FPGA implementation.

    Science.gov (United States)

    Xu, Qian; Varadarajan, Srenivas; Chakrabarti, Chaitali; Karam, Lina J

    2014-07-01

    The Canny edge detector is one of the most widely used edge detection algorithms due to its superior performance. Unfortunately, not only is it computationally more intensive as compared with other edge detection algorithms, but it also has a higher latency because it is based on frame-level statistics. In this paper, we propose a mechanism to implement the Canny algorithm at the block level without any loss in edge detection performance compared with the original frame-level Canny algorithm. Directly applying the original Canny algorithm at the block-level leads to excessive edges in smooth regions and to loss of significant edges in high-detailed regions since the original Canny computes the high and low thresholds based on the frame-level statistics. To solve this problem, we present a distributed Canny edge detection algorithm that adaptively computes the edge detection thresholds based on the block type and the local distribution of the gradients in the image block. In addition, the new algorithm uses a nonuniform gradient magnitude histogram to compute block-based hysteresis thresholds. The resulting block-based algorithm has a significantly reduced latency and can be easily integrated with other block-based image codecs. It is capable of supporting fast edge detection of images and videos with high resolutions, including full-HD since the latency is now a function of the block size instead of the frame size. In addition, quantitative conformance evaluations and subjective tests show that the edge detection performance of the proposed algorithm is better than the original frame-based algorithm, especially when noise is present in the images. Finally, this algorithm is implemented using a 32 computing engine architecture and is synthesized on the Xilinx Virtex-5 FPGA. The synthesized architecture takes only 0.721 ms (including the SRAM READ/WRITE time and the computation time) to detect edges of 512 × 512 images in the USC SIPI database when clocked at 100
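
    A software-only way to see the block-level idea is sketched below with OpenCV: the image is tiled and Canny is run per block with hysteresis thresholds derived from that block's local gradient statistics. The percentile rule is a simple stand-in, not the paper's block-classification and non-uniform histogram scheme, and nothing here reflects the FPGA architecture.

```python
import cv2
import numpy as np

def block_canny(gray: np.ndarray, block: int = 64) -> np.ndarray:
    """Block-wise Canny with locally adaptive thresholds (illustrative, not the paper's design)."""
    edges = np.zeros_like(gray)
    h, w = gray.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = gray[y:y + block, x:x + block]
            # local gradient-magnitude statistics drive the hysteresis thresholds
            gx = cv2.Sobel(tile, cv2.CV_32F, 1, 0)
            gy = cv2.Sobel(tile, cv2.CV_32F, 0, 1)
            mag = cv2.magnitude(gx, gy)
            high = float(np.percentile(mag, 90))   # assumed rule of thumb
            low = 0.4 * high
            if high > 1e-3:                        # skip nearly flat blocks to avoid spurious edges
                edges[y:y + block, x:x + block] = cv2.Canny(tile, low, high)
    return edges

if __name__ == "__main__":
    img = cv2.imread("test_image.png", cv2.IMREAD_GRAYSCALE)  # any 8-bit grayscale test image
    cv2.imwrite("edges.png", block_canny(img))
```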

  14. A Framework for Implementing the National Diabetes Prevention Program in Los Angeles County.

    Science.gov (United States)

    Mosst, Jennifer T; DeFosset, Amelia; Gase, Lauren; Baetscher, Laura; Kuo, Tony

    2017-08-24

    Preventing type 2 diabetes is a public health priority in the United States. An estimated 86 million Americans aged 20 years or older have prediabetes, 90% of whom are unaware they have it. The National Diabetes Prevention Program (NDPP) has the potential to reduce the incidence of type 2 diabetes; however, little is known about the best way to institutionalize such a program in a jurisdiction with a racially/ethnically diverse population. The objective of this study was to develop a practice-grounded framework for implementing the NDPP in Los Angeles County. In 2015, the Los Angeles County Department of Public Health (LACDPH) partnered with Ad Lucem Consulting to conduct a 3-stage formative assessment that consisted of 1) in-depth interviews with key informants representing community-based organizations to learn about their experiences implementing the NDPP and similar lifestyle-change programs and 2) 2 strategic planning sessions to obtain input and feedback from the Los Angeles County Diabetes Prevention Coalition. LACDPH identified core activities to increase identification of people with type 2 diabetes and referral and enrollment of eligible populations in the NDPP. We worked with LACDPH and key informants to develop a 3-pronged framework of core activities to implement NDPP: expanding outreach and education, improving health care referral systems and protocols, and increasing access to and insurance coverage for NDPP. The framework will use a diverse partner network to advance these strategies. The framework has the potential to identify people with prediabetes and to expand NDPP among priority populations in Los Angeles County and other large jurisdictions by using a diverse partner network.

  15. Implementation of Physical Layer Key Distribution using Software Defined Radios

    Directory of Open Access Journals (Sweden)

    S. Kambala

    2013-01-01

    Full Text Available It was well known from Shannon's days that characteristics of the physical channel like attenuation, fading and noise can impair reliable communication. But it was more recently that the beneficial side effects of channel characteristics in ensuring secret communication started getting attention. Studies have been made to quantify the amount of secrecy that can be reaped by combining channel coding with security protocols. The Wiretap channel proposed by Wyner is arguably one of the oldest models of physical layer security protocols. In this paper, we present a brief tutorial introduction to the Wiretap channel, followed by an application of the physical layer model to a class of Key Distribution protocols. We present results from an implementation of key distribution protocols using Software Defined Radio tools along with physical RF hardware peripherals. We believe this approach is much more tangible and informative than computer based simulation studies. Defence Science Journal, 2013, 63(1), pp. 6-14, DOI: http://dx.doi.org/10.14429/dsj.63.3758
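
    For intuition only, here is a toy example of physical-layer key generation from a shared, reciprocal channel observation. This is a generic textbook-style illustration, not the wiretap-coding protocol implemented in the paper, and all numbers are made up:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Alice and Bob probe a reciprocal fading channel and see nearly the same
    # gains; an eavesdropper at another location sees an independent channel.
    channel = rng.normal(size=256)
    alice = channel + 0.1 * rng.normal(size=256)
    bob = channel + 0.1 * rng.normal(size=256)
    eve = rng.normal(size=256)

    def quantise(x):
        """Quantise observations around their median into key bits."""
        return (x > np.median(x)).astype(int)

    k_alice, k_bob, k_eve = quantise(alice), quantise(bob), quantise(eve)
    print("Alice-Bob bit agreement:", np.mean(k_alice == k_bob))   # close to 1.0
    print("Alice-Eve bit agreement:", np.mean(k_alice == k_eve))   # close to 0.5
    ```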

  16. Implementation of force distribution analysis for molecular dynamics simulations

    Directory of Open Access Journals (Sweden)

    Seifert Christian

    2011-04-01

    Full Text Available Abstract Background The way mechanical stress is distributed inside and propagated by proteins and other biopolymers largely defines their function. Yet, determining the network of interactions propagating internal strain remains a challenge for both experiment and theory. Based on molecular dynamics simulations, we developed force distribution analysis (FDA), a method that allows visualizing strain propagation in macromolecules. Results To be immediately applicable to a wide range of systems, FDA was implemented as an extension to Gromacs, a commonly used package for molecular simulations. The FDA code comes with an easy-to-use command line interface and can directly be applied to every system built using Gromacs. We provide an additional R package providing functions for advanced statistical analysis and presentation of the FDA data. Conclusions Using FDA, we were able to explain the origin of mechanical robustness in immunoglobulin domains and silk fibers. By elucidating propagation of internal strain upon ligand binding, we previously also successfully revealed the functionality of a stiff allosteric protein. FDA thus has the potential to be a valuable tool in the investigation and rational design of mechanical properties in proteins and nano-materials.

  17. Normalisation process theory: a framework for developing, evaluating and implementing complex interventions

    LENUS (Irish Health Repository)

    Murray, Elizabeth

    2010-10-20

    Abstract Background The past decade has seen considerable interest in the development and evaluation of complex interventions to improve health. Such interventions can only have a significant impact on health and health care if they are shown to be effective when tested, are capable of being widely implemented and can be normalised into routine practice. To date, there is still a problematic gap between research and implementation. The Normalisation Process Theory (NPT) addresses the factors needed for successful implementation and integration of interventions into routine work (normalisation). Discussion In this paper, we suggest that the NPT can act as a sensitising tool, enabling researchers to think through issues of implementation while designing a complex intervention and its evaluation. The need to ensure trial procedures that are feasible and compatible with clinical practice is not limited to trials of complex interventions, and NPT may improve trial design by highlighting potential problems with recruitment or data collection, as well as ensuring the intervention has good implementation potential. Summary The NPT is a new theory which offers trialists a consistent framework that can be used to describe, assess and enhance implementation potential. We encourage trialists to consider using it in their next trial.

  18. Normalisation process theory: a framework for developing, evaluating and implementing complex interventions

    Directory of Open Access Journals (Sweden)

    Ong Bie

    2010-10-01

    Full Text Available Abstract Background The past decade has seen considerable interest in the development and evaluation of complex interventions to improve health. Such interventions can only have a significant impact on health and health care if they are shown to be effective when tested, are capable of being widely implemented and can be normalised into routine practice. To date, there is still a problematic gap between research and implementation. The Normalisation Process Theory (NPT) addresses the factors needed for successful implementation and integration of interventions into routine work (normalisation). Discussion In this paper, we suggest that the NPT can act as a sensitising tool, enabling researchers to think through issues of implementation while designing a complex intervention and its evaluation. The need to ensure trial procedures that are feasible and compatible with clinical practice is not limited to trials of complex interventions, and NPT may improve trial design by highlighting potential problems with recruitment or data collection, as well as ensuring the intervention has good implementation potential. Summary The NPT is a new theory which offers trialists a consistent framework that can be used to describe, assess and enhance implementation potential. We encourage trialists to consider using it in their next trial.

  19. Science-policy interfacing in support of the Water Framework Directive implementation.

    Science.gov (United States)

    Vaes, G; Willems, P; Swartenbroekx, P; Kramer, K; de Lange, W; Kober, K

    2009-01-01

    Many current water-related RTD projects have established operational links with practitioners, which allow the needs of policy makers to be taken into account. However, RTD results are not easily available to water policy implementers, and research scientists may lack insight into the needs of policy makers and implementers (i.e. the European Commission and water managers). The SPI-Water project worked out a number of concrete actions to bridge these gaps in communication by developing and implementing a 'science-policy interface', enhancing the use of RTD results in the Water Framework Directive (WFD) implementation. This project is part of a wider EC perspective aiming to bridge the gap between science and policy, specifically with respect to the WFD implementation. As a first action, existing science-policy links are investigated. RTD and LIFE projects that are of direct relevance for the implementation of the WFD are identified and analysed. Secondly, an information system (Harmoni-CA's WISE RTD Web Portal) has been further developed to cater for an efficient and easy-to-use tool for dissemination as well as retrieval of RTD results. As a third action, this science-policy interfacing on WFD-related topics is extended to non-EU countries, taking into account their specific needs.

  20. MODELS AND SOLUTIONS FOR THE IMPLEMENTATION OF DISTRIBUTED SYSTEMS

    Directory of Open Access Journals (Sweden)

    Tarca Naiana

    2011-07-01

    Full Text Available Software applications may have different degrees of complexity depending on the problems they try to solve, and can integrate very complex elements that bring together functionality that is sometimes competing or conflicting. We can take, for example, a mobile communications system. The functionalities of such a system are difficult to understand, and they add to the non-functional requirements such as practical usability, performance, cost, durability and security. The transition from local computer networks to wide-area networks that connect millions of machines around the world at speeds exceeding one gigabit per second has allowed universal access to data and the design of applications that require the simultaneous use of the computing power of several interconnected systems. These technologies have enabled the evolution from centralized to distributed systems that connect a large number of computers. To exploit the advantages of distributed systems, software and communication tools have been developed that enable the implementation of distributed processing of complex solutions. The objective of this document is to present the hardware, software and communication tools closely related to the possibility of their application at an integrated social and economic level as a result of globalization and the evolution of the e-society. These objectives and national priorities are based on the current needs and realities of Romanian society, while being consistent with the requirements of Romania's European orientation towards the knowledge society, strengthening the information society, the target goal being the accomplishment of e-Romania, with its strategic e-government component. Achieving this objective repositions Romania and provides an advantage for sustainable growth, a positive international image, rapid convergence in Europe, inclusion and the strengthening of areas of high competence, in line with Europe 2020, launched by the European Commission.

  1. Distributed Optimal Economic Dispatch Based on Multi-Agent System Framework in Combined Heat and Power Systems

    Directory of Open Access Journals (Sweden)

    Yu-Shuai Li

    2016-10-01

    Full Text Available In this paper, a novel distributed method is presented to solve the combined heat and power economic dispatch problem, which is formulated as a distributed coupled optimization problem. The optimization goal is achieved by establishing two modified consensus protocols with two corresponding feedback parts while satisfying the electrical and heat supply–demand balance. Moreover, an alternating iterative method is proposed to handle the heat-electrical coupling existing in the objective function and the feasible operating regions. In addition, the proposed distributed method is implemented by a multi-agent system framework, which only requires local information exchange among neighboring agents. Simulation results obtained on a 16-bus test system are provided to illustrate the effectiveness of the proposed distributed method.
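
    A toy sketch of the general consensus-plus-feedback idea for the electric side only. The cost coefficients, network weights and 3-agent topology are assumptions, not the paper's modified protocols or its 16-bus test system, and the global mismatch is used directly here for brevity, whereas a fully distributed scheme would estimate it locally:

    ```python
    import numpy as np

    # Three generators with quadratic costs C_i(P_i) = a_i P_i^2 + b_i P_i.
    a = np.array([0.04, 0.03, 0.05])
    b = np.array([2.0, 1.8, 2.2])
    demand = 300.0

    # Row-stochastic consensus weights for a fully connected 3-agent network.
    W = np.array([[0.6, 0.2, 0.2],
                  [0.2, 0.6, 0.2],
                  [0.2, 0.2, 0.6]])

    lam = b.copy()          # each agent's estimate of the incremental cost
    eps = 0.002             # feedback gain on the supply-demand mismatch
    for _ in range(2000):
        P = (lam - b) / (2 * a)            # local optimal output for current lambda
        mismatch = demand - P.sum()        # imbalance (global here, for brevity)
        lam = W @ lam + eps * mismatch     # consensus step plus feedback term

    P = (lam - b) / (2 * a)
    print("incremental costs:", lam.round(3))
    print("outputs:", P.round(1), "total:", round(P.sum(), 1))
    ```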

  2. A guide to using the Theoretical Domains Framework of behaviour change to investigate implementation problems.

    Science.gov (United States)

    Atkins, Lou; Francis, Jill; Islam, Rafat; O'Connor, Denise; Patey, Andrea; Ivers, Noah; Foy, Robbie; Duncan, Eilidh M; Colquhoun, Heather; Grimshaw, Jeremy M; Lawton, Rebecca; Michie, Susan

    2017-06-21

    Implementing new practices requires changes in the behaviour of relevant actors, and this is facilitated by understanding of the determinants of current and desired behaviours. The Theoretical Domains Framework (TDF) was developed by a collaboration of behavioural scientists and implementation researchers who identified theories relevant to implementation and grouped constructs from these theories into domains. The collaboration aimed to provide a comprehensive, theory-informed approach to identify determinants of behaviour. The first version was published in 2005, and a subsequent version following a validation exercise was published in 2012. This guide offers practical guidance for those who wish to apply the TDF to assess implementation problems and support intervention design. It presents a brief rationale for using a theoretical approach to investigate and address implementation problems, summarises the TDF and its development, and describes how to apply the TDF to achieve implementation objectives. Examples from the implementation research literature are presented to illustrate relevant methods and practical considerations. Researchers from Canada, the UK and Australia attended a 3-day meeting in December 2012 to build an international collaboration among researchers and decision-makers interested in the advancing use of the TDF. The participants were experienced in using the TDF to assess implementation problems, design interventions, and/or understand change processes. This guide is an output of the meeting and also draws on the authors' collective experience. Examples from the implementation research literature judged by authors to be representative of specific applications of the TDF are included in this guide. We explain and illustrate methods, with a focus on qualitative approaches, for selecting and specifying target behaviours key to implementation, selecting the study design, deciding the sampling strategy, developing study materials, collecting and

  3. A FRAMEWORK FOR MEASURING AND IMPROVING EFFICIENCY IN DISTRIBUTION CHANNELS

    Directory of Open Access Journals (Sweden)

    Milan Andrejić

    2016-06-01

    Full Text Available Distribution of products is largely conditioned by the efficiency of logistics processes. Efficient logistics processes provide loyal and satisfied customers, a dominant position on the market, and revenue. In this paper, a new approach for measuring and improving the efficiency of logistics processes in a distribution channel is proposed. A model based on the Principal Component Analysis – Data Envelopment Analysis (PCA–DEA) approach evaluates the efficiency of ordering, warehousing, packaging, inventory management and transport processes as well as overall distribution channel efficiency. The proposed approach also gives information about corrective actions for efficiency improvement. According to the results, efficiency should be improved in several ways: improving the information system, decreasing failures, and increasing utilization and output. The results of testing the proposed approach show its great applicability.
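
    For concreteness, a minimal input-oriented CCR DEA sketch using scipy (plain DEA only, without the paper's PCA step; the decision-making units, inputs and outputs below are made-up stand-ins for distribution-channel processes):

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Made-up data: 4 decision-making units (e.g. warehouses), 2 inputs, 1 output.
    X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0]])   # inputs
    Y = np.array([[1.0], [1.0], [1.0], [1.0]])                        # outputs
    n, m, s = X.shape[0], X.shape[1], Y.shape[1]

    def ccr_efficiency(o):
        """Input-oriented CCR efficiency of DMU o (envelopment form).
        Variables: [theta, lambda_1, ..., lambda_n]; minimise theta."""
        c = np.r_[1.0, np.zeros(n)]
        A_in = np.c_[-X[o], X.T]                 # sum_j lambda_j x_ij <= theta * x_io
        A_out = np.c_[np.zeros((s, 1)), -Y.T]    # sum_j lambda_j y_rj >= y_ro
        A_ub = np.vstack([A_in, A_out])
        b_ub = np.r_[np.zeros(m), -Y[o]]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                      bounds=[(None, None)] + [(0, None)] * n)
        return res.x[0]

    for o in range(n):
        print(f"DMU {o}: CCR efficiency = {ccr_efficiency(o):.3f}")
    ```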

  4. Platform-level Distributed Warfare Model-based on Multi-Agent System Framework

    Directory of Open Access Journals (Sweden)

    Xiong Li

    2012-05-01

    Full Text Available The multi-agent paradigm has become a useful tool in solving military problems. However, one of the key challenges in multi-agent modelling of distributed warfare is how to describe the microcosmic actions of tactical warfare platforms. In this paper, a platform-level distributed warfare model based on a multi-agent system framework is designed to tackle this challenge. The basic ideas include: establishing a multi-agent model by mapping the tactical warfare system's members, i.e., warfare platforms, to respective agents; performing task decomposition and task allocation using a task-tree decomposition method and an improved contract net protocol technique; and implementing the simulation by presenting a battlefield terrain environment analysis algorithm based on a grid approach. The simulation demonstration results show that our model provides a feasible and effective approach to supporting the abstraction and representation of microcosmic tactical actions for a complex warfare system. Defence Science Journal, 2012, 62(1), pp. 180-186, DOI: http://dx.doi.org/10.14429/dsj.62.964
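
    A very small sketch of the announce-bid-award pattern behind a contract net, for illustration only; the platform names, cost figures and the flat task list are assumptions and do not reflect the paper's improved protocol or its task-tree decomposition:

    ```python
    # Each agent bids its estimated cost for a task; the manager awards the task
    # to the lowest bidder. Costs here are arbitrary made-up numbers.
    def estimated_cost(agent, task):
        costs = {
            ("frigate", "patrol"): 4.0, ("frigate", "recon"): 6.0,
            ("uav", "patrol"): 5.0, ("uav", "recon"): 2.0,
            ("submarine", "patrol"): 7.0, ("submarine", "recon"): 8.0,
        }
        return costs[(agent, task)]

    def contract_net(tasks, agents):
        awards = {}
        for task in tasks:                                        # announce phase
            bids = {a: estimated_cost(a, task) for a in agents}   # bid phase
            awards[task] = min(bids, key=bids.get)                # award phase
        return awards

    print(contract_net(["patrol", "recon"], ["frigate", "uav", "submarine"]))
    # {'patrol': 'frigate', 'recon': 'uav'}
    ```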

  5. New Methods in Acquisition, Update and Dissemination of Nature Conservation Geodata - Implementation of an Integrated Framework

    Science.gov (United States)

    Tintrup gen. Suntrup, G.; Jalke, T.; Streib, L.; Keck, N.; Nieland, S.; Moran, N.; Kleinschmit, B.; Trapp, M.

    2015-04-01

    Within the framework of this project, methods are being tested and implemented a) to introduce remote sensing based approaches into the existing process of biotope mapping and b) to develop a framework serving the multiple requirements arising from different users' backgrounds and thus the need for comprehensive data interoperability. Therefore, state-wide high resolution land cover vector data have been generated in an automated, object-oriented workflow based on aerial imagery and a normalised digital surface model. These data have been enriched by an extensive characterisation of the individual objects by e.g. site-specific, contextual or spectral parameters, utilising multitemporal satellite images, DEM derivatives and multiple relevant geo-data. Parameters are tested for relevance with regard to the classification process using different data mining approaches and have been used to formalise categories of the European nature information system (EUNIS) in a semantic framework. The classification will be realised by ontology-based reasoning. Dissemination and storage of data is developed to be fully INSPIRE-compatible and facilitated via a web portal. The main objectives of the project are a) maximum exploitation of existing "standard" data provided by state authorities, b) combination of these data with satellite imagery (Copernicus), c) creation of land cover objects and achievement of data interoperability through a low number of classes but comprehensive characterisation, and d) implementation of algorithms and methods suitable for automated processing on large scales.

  6. Consolidating tactical planning and implementation frameworks for integrated vector management in Uganda.

    Science.gov (United States)

    Okia, Michael; Okui, Peter; Lugemwa, Myers; Govere, John M; Katamba, Vincent; Rwakimari, John B; Mpeka, Betty; Chanda, Emmanuel

    2016-04-14

    Integrated vector management (IVM) is the recommended approach for controlling some vector-borne diseases (VBD). In the face of current challenges to disease vector control, IVM is vital to achieve national targets set for VBD control. Though global efforts, especially for combating malaria, now focus on elimination and eradication, IVM remains useful for Uganda, which is principally still in the control phase of the malaria continuum. This paper outlines the processes undertaken to consolidate tactical planning and implementation frameworks for IVM in Uganda. The Uganda National Malaria Control Programme, with its efforts to implement an IVM approach to vector control, was the 'case' for this study. Integrated management of malaria vectors in Uganda remained an underdeveloped component of malaria control policy. In 2012, knowledge and perceptions of malaria vector control policy and IVM were assessed, and recommendations for a specific IVM policy were made. In 2014, a thorough vector control needs assessment (VCNA) was conducted according to WHO recommendations. The findings of the VCNA informed the development of the national IVM strategic guidelines. Information sources for this study included all available data and accessible archived documentary records on VBD control in Uganda. The literature was reviewed, adapted to the local context and translated into the consolidated tactical framework. WHO recommends implementation of IVM as the main strategy for vector control and has encouraged member states to adopt the approach. However, many VBD-endemic countries lack IVM policy frameworks to guide implementation of the approach. In Uganda, most VBDs coexist and could be managed more effectively in tandem. In order to successfully control malaria and other VBDs and move towards their elimination, the country needs to scale up proven and effective vector control interventions and also learn from the experience of other countries. The IVM strategy is important in

  7. A Framework for Seamless Interoperation of Heterogeneous Distributed Software Components

    Science.gov (United States)

    2005-05-01


  8. Theoretical background for market emergence framework. Case: Electricity distribution industry

    Energy Technology Data Exchange (ETDEWEB)

    Immonen, M.; Laaksonen, P.; Vilko, J.; Tahvanainen, K.; Viljainen, S.; Partanen, J.

    2009-07-01

    Both the competitive environment and the internal structure of an industrial organization are typically included in the processes which describe the strategic management processes of the firm, but less attention has been paid to the interdependence between these views. Therefore, this research focuses on explaining the particular conditions of an industry change, which lead managers to realign the firm in respect of its environment for generating competitive advantage. The research question that directs the development of the theoretical framework is: Why do firms outsource some of their functions? The three general stages of the analysis are related to the following research topics: (i) understanding forces that shape the industry, (ii) estimating the impacts of transforming customer preferences, rivalry, and changing capability bases on the relevance of existing assets and activities, and emergence of new business models, and (iii) developing optional structures for future value chains and understanding general boundaries for market emergence. The defined research setting contributes to the managerial research questions 'Why do firms reorganize their value chains?', 'Why and how are decisions made?' Combining Transaction Cost Economics (TCE) and Resource-Based View (RBV) within an integrated framework makes it possible to evaluate the two dimensions of a company's resources, namely the strategic value and transferability. The final decision of restructuring will be made based on an analysis of the actual business potential of the outsourcing, where benefits and risks are evaluated. The firm focuses on the risk of opportunism, hold-up problems, pricing, and opportunities to reach a complete contract, and finally on the direct benefits and risks for financial performance. The supplier analyzes the business potential of an activity outside the specific customer, the amount of customer-specific investments, the service provider

  9. Risk mitigation in the implementation of AMTs: A guiding framework for future

    Directory of Open Access Journals (Sweden)

    Bhaskar Nagar

    2012-04-01

    Full Text Available The fast pace of industrial development increases different types of risks for industries. Many risk factors are inherent in the implementation of advanced manufacturing technologies (AMTs). Industries are developing methodologies for risk prevention and protection. The present research focuses on identifying various risks that could influence the implementation of AMTs and on developing a framework to mitigate them. For this framework, interpretive structural modeling (ISM) has been used to depict the relationship and priority among the various risks. This research provides a path for managers and indicates the dominant risks on the basis of higher driving power. Also, this research classifies the relationship among various risks in AMT implementation according to their driving power and dependence. The risks have been categorized into four categories: autonomous risks, linkage risks, dependent risks and independent risks. The proposed hierarchical model would help management to effectively handle and develop strategies against the risks, and hence new and the latest technologies can be adopted with ease and effectiveness.

  10. Using the "customer service framework" to successfully implement patient- and family-centered care.

    Science.gov (United States)

    Rangachari, Pavani; Bhat, Anita; Seol, Yoon-Ho

    2011-01-01

    Despite the growing momentum toward patient- and family-centered care at the federal policy level, the organizational literature remains divided on its effectiveness, especially in regard to its key dimension of involving patients and families in treatment decisions and safety practices. Although some have argued for the universal adoption of patient involvement, others have questioned both the effectiveness and feasibility of patient involvement. In this article, we apply a well-established theoretical perspective, that is, the Service Quality Model (SQM) (also known as the "customer service framework") to the health care context, to reconcile the debate related to patient involvement. The application helps support the case for universal adoption of patient involvement and also question the arguments against it. A key contribution of the SQM lies in highlighting a set of fundamental service quality determinants emanating from basic consumer service needs. It also provides a simple framework for understanding how gaps between consumer expectations and management perceptions of those expectations can affect the gap between "expected" and "perceived" service quality from a consumer's perspective. Simultaneously, the SQM also outlines "management requirements" for the successful implementation of a customer service strategy. Applying the SQM to the health care context therefore, in addition to reconciling the debate on patient involvement, helps identify specific steps health care managers could take to successfully implement patient- and family-centered care. Correspondingly, the application also provides insights into strategies for the successful implementation of policy recommendations related to patient- and family-centered care in health care organizations.

  11. Developing a Framework for Traceability Implementation in the Textile Supply Chain

    Directory of Open Access Journals (Sweden)

    Vijay Kumar

    2017-04-01

    Full Text Available Traceability has recently gained considerable attention in the textile industry. Traceability stands for information sharing about a product including the product history, specification, or location. With the involvement of globally dispersed actors in the textile supply chain, ensuring appropriate product quality with timely supplies is crucial for surviving in this industry with ever increasing competition. Hence it is of paramount importance for a supply chain actor to track every product and trace its history in the supply chain. In this context, this paper presents a framework to implement traceability in the textile supply chain. A system approach has been followed, where firstly the usage requirement of traceability is defined, and then a framework for implementing intra-actor or internal traceability and inter-actor or external traceability is discussed. This article further presents a sequential diagram to demonstrate the interaction and information exchange between the actors in the supply chain, when the traceability information is requested. An example is also illustrated for data storage using a relational database management system and information exchange using XML for the textile weaver. Finally, the article discusses challenges and future studies required to implement traceability in the textile supply chain.

  12. Possibilities of implementation of bioavailability methods for organic contaminants in the Dutch Soil Quality Assessment Framework.

    Science.gov (United States)

    Brand, Ellen; Lijzen, Johannes; Peijnenburg, Willie; Swartjes, Frank

    2013-10-15

    In the Netherlands, risk assessment of contaminated soils is based on determining the total contaminant concentration. If this measured soil concentration exceeds the Soil Quality Standards (SQS), a higher tier risk evaluation must be performed. Experiences from the field have given rise to the perception that performing risk evaluations based on (measured) total concentrations may lead to an inaccurate assessment of the actual risks. Assuming that only the bioavailable fraction is capable of exerting adverse effects in the soil ecosystem, it is suggested that by taking bioavailability into account in a (higher tier) risk evaluation, a more effect-based risk assessment can be performed. Bioavailability has been a subject of research for several decades. However, up to now bioavailability has not been implemented in the Dutch Soil Quality Assessment Framework. First actions were taken in the Netherlands to determine whether the concept of bioavailability could be implemented in the risk assessment of contaminated soils and to find out how bioavailability can become part of the Dutch Soil Quality Assessment Framework. These actions have led to a concrete proposal for the implementation of bioavailability methods in the risk assessment of organic contaminants in soils. This paper focuses on the chemical prediction of bioavailability for ecological risk assessment of contaminated soils.

  13. Sustained Implementation of Evidence-based Programs in Disadvantaged Communities: A Conceptual Framework of Supporting Factors.

    Science.gov (United States)

    Hodge, Lauren M; Turner, Karen M T

    2016-09-01

    This paper presents a review of the empirical literature for studies evaluating factors that facilitate and create barriers to sustained program implementation in disadvantaged communities. It outlines study methodology and sustainment outcomes and proposes a conceptual model that involves implementation sustainment support for providers delivering evidence-based health and family services in disadvantaged communities. Sustained program implementation in the community setting is a significant issue as only 43% of studies reported successfully sustained programs. The review identified 18 factors that facilitate success and create barriers to program sustainment. The factors are synthesized into three themes; program characteristics, workplace capacity, and process and interaction factors. The majority of factors map onto commonly cited sustainability influences in implementation science. However, there was an additional focus for studies included in this review on the importance of factors such as program burden, program familiarity and perceived competence in program skills, workplace support for the program, staff mobility and turnover, supervision and peer support, and ongoing technical assistance. The need to use a conceptual framework and develop measures to guide and evaluate capacity building in EBP implementation and sustainment in low-resource community settings is highlighted.

  14. Implementing the European Marine Strategy Framework Directive: Scientific challenges and opportunities

    Science.gov (United States)

    Newton, Alice; Borja, Angel; Solidoro, Cosimo; Grégoire, Marilaure

    2015-10-01

    The Marine Strategy Framework Directive (MSFD; EC, 2008) is an ambitious European policy instrument that aims to achieve Good Environmental Status (GES) in the 5,720,000 km2 of European seas by 2020, using an Ecosystem Approach. GES is to be assessed using 11 descriptors and up to 56 indicators (European Commission, 2010), and the goal is for clean, healthy and productive seas that are the basis for marine-based development, known as Blue-Growth. The MSFD is one of many policy instruments, such as the Water Framework Directive, the Common Fisheries Policy and the Habitats Directive that, together, should result in "Healthy Oceans and Productive Ecosystems - HOPE". Researchers working together with stakeholders such as the Member States environmental agencies, the European Environmental Agency, and the Regional Sea Conventions, are to provide the scientific knowledge basis for the implementation of the MSFD. This represents both a fascinating challenge and a stimulating opportunity.

  15. Surveillance indicators and their use in implementation of the Marine Strategy Framework Directive

    DEFF Research Database (Denmark)

    Shephard, Samuel; Greenstreet, Simon P. R.; Piet, GerJan J.

    2015-01-01

    The European Union Marine Strategy Framework Directive (MSFD) uses indicators to track ecosystem state in relation to Good Environmental Status (GES). These indicators were initially expected to be "operational", i.e. to have well-understood relationships between state and specified anthropogenic pressure(s), and to have defined targets. Recent discussion on MSFD implementation has highlighted an additional class of "surveillance" indicators. Surveillance indicators monitor key aspects of the ecosystem for which there is: first, insufficient evidence to define targets and support formal state … warning signals) that present a broader and more holistic picture of state, and inform and support science, policy, and management. In this study, we (i) present a framework for including surveillance indicators into the Activity–Pressure–State–Response process, (ii) consider a range of possible …

  16. European union water policy--tasks for implementing "Water Framework Directive" in pre-accession countries.

    Science.gov (United States)

    Sözen, Seval; Avcioglu, Ebru; Ozabali, Asli; Görgun, Erdem; Orhon, Derin

    2003-08-01

    The Water Framework Directive, aiming to maintain and improve the aquatic environment in the EU, was launched by the European Parliament in 2000. According to this directive, control of quantity is an ancillary element in securing good water quality, and therefore measures on quantity, serving the objective of ensuring good quality, should also be established. Accordingly, it is a comprehensive and coordinated package that will ensure all European waters are protected according to a common standard. Therefore, it refers to all other Directives related to water resources management, such as the Urban Wastewater Treatment Directive, the Nitrates Directive, the Drinking Water Directive, Integrated Pollution Prevention and Control, etc. Turkey, as a candidate state targeting full membership, should complete the necessary preparations for the implementation of the "Water Framework Directive" as soon as possible. In this study, the necessary legislative, political, institutional, and technical efforts of the pre-accession countries have been discussed and effective recommendations have been offered for future activities in Turkey.

  17. Measuring elementary educators' understanding and readiness for implementing a new framework in science education

    Science.gov (United States)

    Nollmeyer, Gustave Evan

    The NRC's (2012) report, A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas, ushered in a new era of science education. It is unclear how prepared elementary educators are for the framework. This study sought to establish measures for assessing inservice educators' self-reported understanding of the new framework and readiness to implement the ideas in their science instruction. Designing and validating an instrument to assess these constructs followed procedures established in the literature. First, the literature on science education was examined to identify themes that could be used in constructing instrument items. This item pool was examined and modified through expert review. Next, the modified instrument was piloted with a small sample, N = 13, of inservice educators. After final adjustments to the instrument, it was used in a large-scale validation study. Inservice elementary educators from four states (Montana, Idaho, Wyoming, and Utah) participated in the validation study, N = 167. Since understanding and readiness were determined to assess separate constructs, the two were handled individually during statistical analyses. Exploratory analysis on both scales, understanding and readiness, revealed stable factor models that were further validated through confirmatory factor analysis. The internal consistency reliability of the scales was determined through Cronbach's alpha. With solid statistical evidence, conclusions were drawn from the study. Each instrument could be used in similar contexts to measure elementary educators' understanding of or readiness to implement the new framework for science education. The unique factor structures of the two scales suggest important differences between understanding and readiness. These differences should inform professional development efforts.

  18. Implementing accountability for reasonableness framework at district level in Tanzania: a realist evaluation

    Directory of Open Access Journals (Sweden)

    Ndawi Benedict

    2011-02-01

    Full Text Available Abstract Background Despite the growing importance of the Accountability for Reasonableness (A4R) framework in priority setting worldwide, there is still an inadequate understanding of the processes and mechanisms underlying its influence on legitimacy and fairness, as conceived and reflected in service management processes and outcomes. As a result, the ability to draw scientifically sound lessons for the application of the framework to services and interventions is limited. This paper evaluates the experiences of implementing the A4R approach in Mbarali District, Tanzania, in order to find out how the innovation was shaped, enabled, and constrained by the interaction between contexts, mechanisms and outcomes. Methods This study draws on the principles of realist evaluation -- a largely qualitative approach, chiefly concerned with testing and refining programme theories by exploring the complex interactions of contexts, mechanisms, and outcomes. Mixed methods were used in data collection, including individual interviews, non-participant observation, and document reviews. A thematic framework approach was adopted for the data analysis. Results The study found that while the A4R approach to priority setting was helpful in strengthening transparency, accountability, stakeholder engagement, and fairness, the efforts at integrating it into the current district health system were challenging. Participatory structures under the decentralisation framework, central government's call for partnership in district-level planning and priority setting, perceived needs of stakeholders, as well as active engagement between researchers and decision makers all facilitated the adoption and implementation of the innovation. In contrast, however, limited local autonomy, low level of public awareness, unreliable and untimely funding, inadequate accountability mechanisms, and limited local resources were the major contextual factors that hampered the full

  19. Lilith: A software framework for the rapid development of scalable tools for distributed computing

    Energy Technology Data Exchange (ETDEWEB)

    Gentile, A.C.; Evensky, D.A.; Armstrong, R.C.

    1998-03-01

    Lilith is a general purpose framework, written in Java, that provides a highly scalable distribution of user code across a heterogeneous computing platform. By creation of suitable user code, the Lilith framework can be used for tool development. The scalable performance provided by Lilith is crucial to the development of effective tools for large distributed systems. Furthermore, since Lilith handles the details of code distribution and communication, the user code need focus primarily on the tool functionality, thus, greatly decreasing the time required for tool development. In this paper, the authors concentrate on the use of the Lilith framework to develop scalable tools. The authors review the functionality of Lilith and introduce a typical tool capitalizing on the features of the framework. They present new Objects directly involved with tool creation. They explain details of development and illustrate with an example. They present timing results demonstrating scalability.

  20. Defining the challenges for ecodesign implementation in companies: development and consolidation of a framework

    DEFF Research Database (Denmark)

    Dekoninck, Elies A.; Domingo, Lucie; O'Hare, Jamie Alexander

    2016-01-01

    This study addresses the problem of the slow take-up of ecodesign in industry by identifying and categorising the implementation challenges faced by practitioners. Case studies from nine manufacturing companies from five different countries are reported based on interviews with key ecodesign personnel. A literature-derived framework is used to analyse each case, allowing for robust cross-case analysis. Challenges are identified in five areas: strategy, tools, collaboration, management and knowledge. The management category of challenges is the most frequently mentioned by the companies sampled … the development of a more consolidated body of knowledge in this field.

  1. Implementation of the rio conventions in framework of national sustainable development policy in Ukraine

    Directory of Open Access Journals (Sweden)

    L.V. Zharova

    2014-09-01

    Full Text Available The aim of the article. In the article, the national regulatory and legal framework for environmental economics and various bylaws and regulations in the area of environmental protection and sustainable management of natural resources are analyzed. The analysis of laws and regulations concerning environmental economics was aligned along the following main axes representing key activity areas: strategic planning, technical regulation, economic situation and macroeconomic forecasting, trade development, financial policy, and cooperation between Ukraine and the European Union. It is argued that a major strength in the technical and financial regulation area is the presence of a suite of regulations concerning nature reserves, protected areas and natural resource cadastres that can be adapted to meet modern needs, as well as legislated sanctions that are in place to prosecute environmental law offenders. The strengths and opportunities for enhancing the capacity for implementing the Rio Conventions are summarized. The results of the analysis. The analysis allowed the authors to mark out gaps threatening the implementation process; gaps undermining the efficiency of sectoral policy; options to overcome these obstacles; and progress achieved in addressing the identified gaps. The authors argue that Ukraine has an extensive suite of environmental laws and regulations and that, at the same time, the level of implementation of the Rio Conventions can be described as insufficient due to objective and subjective factors, which are described in detail in the paper. The researchers underline that legislative reform should involve systemic changes in the legislation, including general and specialized measures. General measures are those that aim to change the system of environmental legislation in general, while specialized measures focus on the improvement of specific laws and regulations that form part of this system. Conclusions concerned the range of general measures

  2. Researcher readiness for participating in community-engaged dissemination and implementation research: a conceptual framework of core competencies.

    Science.gov (United States)

    Shea, Christopher M; Young, Tiffany L; Powell, Byron J; Rohweder, Catherine; Enga, Zoe K; Scott, Jennifer E; Carter-Edwards, Lori; Corbie-Smith, Giselle

    2017-03-24

    Participating in community-engaged dissemination and implementation (CEDI) research is challenging for a variety of reasons. Currently, there is not specific guidance or a tool available for researchers to assess their readiness to conduct CEDI research. We propose a conceptual framework that identifies detailed competencies for researchers participating in CEDI and maps these competencies to domains. The framework is a necessary step toward developing a CEDI research readiness survey that measures a researcher's attitudes, willingness, and self-reported ability for acquiring the knowledge and performing the behaviors necessary for effective community engagement. The conceptual framework for CEDI competencies was developed by a team of eight faculty and staff affiliated with a university's Clinical and Translational Science Award (CTSA). The authors developed CEDI competencies by identifying the attitudes, knowledge, and behaviors necessary for carrying out commonly accepted CE principles. After collectively developing an initial list of competencies, team members individually mapped each competency to a single domain that provided the best fit. Following the individual mapping, the group held two sessions in which the sorting preferences were shared and discrepancies were discussed until consensus was reached. During this discussion, modifications to wording of competencies and domains were made as needed. The team then engaged five community stakeholders to review and modify the competencies and domains. The CEDI framework consists of 40 competencies organized into nine domains: perceived value of CE in D&I research, introspection and openness, knowledge of community characteristics, appreciation for stakeholder's experience with and attitudes toward research, preparing the partnership for collaborative decision-making, collaborative planning for the research design and goals, communication effectiveness, equitable distribution of resources and credit, and

  3. Viewpoints: a framework for object oriented database modelling and distribution

    Directory of Open Access Journals (Sweden)

    Fouzia Benchikha

    2006-01-01

    Full Text Available The viewpoint concept has received widespread attention recently. Its integration into a data model improves the flexibility of the conventional object-oriented data model and allows one to improve the modelling power of objects. The viewpoint paradigm can be used as a means of providing multiple descriptions of an object and as a means of mastering the complexity of current database systems enabling them to be developed in a distributed manner. The contribution of this paper is twofold: to define an object data model integrating viewpoints in databases and to present a federated database system integrating multiple sources following a local-as-extended-view approach.

  4. Participation in the implementation of the Water Framework Directive in Denmark

    DEFF Research Database (Denmark)

    Wright, Stuart Anthony Lewis; Jacobsen, Brian Højland

    2011-01-01

    Public participation in the form of informing, consulting and actively involving all interested parties is required during the implementation of the Water Framework Directive (WFD). This paper discusses progress with implementation of the WFD in Denmark and the measures taken to conform to the requirements for public participation. The first aim of the paper is to establish whether enough is being done regarding participation in Denmark, the conclusion being that whilst Denmark is in line with statutory requirements, consultation appears limited whilst evidence of active involvement is lacking … which could potentially increase the effectiveness of the WFD. Furthermore, the adoption of the project approach would also be one way to satisfy the requirement for active involvement in the Directive. However, some problems exist, relating to time, administrative costs, problems with control

  5. MPOWER and the Framework Convention on Tobacco Control implementation in the South-East Asia region

    Directory of Open Access Journals (Sweden)

    P K Singh

    2012-01-01

    Full Text Available The 11 member states of WHO's South-East Asia Region share common factors of high prevalence of tobacco use, practice of several forms of tobacco use, increasing prevalence of tobacco use among the youth and women, the link of tobacco use with poverty, and the influence of tobacco advertisements in propagating the use of tobacco, especially among young girls and women. The effects of tobacco use are manifold, leading to high morbidity and mortality rates as well as loss of gross domestic product (GDP) to the respective countries. The WHO Regional Office for South-East Asia has been actively involved in curbing this menace, essentially by assisting member states in implementing the WHO Framework Convention on Tobacco Control (FCTC). This paper gives an overview of these activities and discusses the opportunities and challenges in implementing the FCTC and possible practical solutions.

  6. Sraffa and Goodwin: A Unifying Framework for Standards of Value in the Income Distribution Framework

    NARCIS (Netherlands)

    Steenge, A.E.

    1995-01-01

    In this paper we show that direct links exist between Sraffian and Goodwinian methodologies in income distribution theory. In fact, Sraffa's Standard commodity approach and Goodwin's transformation of axes are shown to be different manifestations of the same structural approach. Both are brought

  7. Operations management in distribution networks within a smart city framework.

    Science.gov (United States)

    Cerulli, Raffaele; Dameri, Renata Paola; Sciomachen, Anna

    2017-02-20

    This article studies a vehicle routing problem with environmental constraints that are motivated by the requirements for sustainable urban transport. The empirical research presents a fleet planning problem that takes into consideration both minimum-cost vehicle routes and minimum pollution. The problem is formulated as a mixed integer linear programming model and experimentally validated using data collected from a real situation: a grocery company delivering goods ordered via e-channels to customers spread across the urban and metropolitan area of Genoa smart city. The proposed model is a variant of the vehicle routing problem tailored to include environmental issues and street limitations. Its novelty also lies in the use of real data instances provided by the B2C grocery company. Managerial implications concern the choice of both the routes and the number and type of vehicles. Results show that commercial distribution strategies achieve better results in terms of both business and environmental performance, provided the smart mobility goals and constraints are included in the distribution model from the beginning.
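
    As a rough, non-authoritative illustration of routing under a capacity limit and an emission-style limit (a greedy heuristic sketch only; the paper formulates the problem as a mixed integer linear program, and the coordinates, demands and limits below are invented):

    ```python
    import math

    depot = (0.0, 0.0)
    # customer name -> ((x, y), demand); all numbers are made up.
    customers = {"a": ((2, 1), 3), "b": ((5, 4), 4), "c": ((1, 6), 2), "d": ((6, 1), 5)}
    CAPACITY = 8          # load limit per vehicle
    EMISSION_CAP = 20.0   # travelled-distance proxy for the per-route emission limit

    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    def build_routes():
        pending = dict(customers)
        routes = []
        while pending:
            route, load, length, pos = [], 0, 0.0, depot
            while True:
                # nearest customer still feasible w.r.t. capacity and emission cap
                feasible = [(dist(pos, c[0]), name) for name, c in pending.items()
                            if load + c[1] <= CAPACITY
                            and length + dist(pos, c[0]) + dist(c[0], depot) <= EMISSION_CAP]
                if not feasible:
                    break
                step, name = min(feasible)
                coords, demand = pending.pop(name)
                route.append(name)
                load += demand
                length += step
                pos = coords
            if not route:
                raise ValueError("remaining customers infeasible for a single vehicle")
            routes.append((route, length + dist(pos, depot)))
        return routes

    for stops, km in build_routes():
        print(stops, round(km, 2))
    ```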

  8. A Semantics-Based Information Distribution Framework for Large Web-Based Course Forum System

    Science.gov (United States)

    Chim, Hung; Deng, Xiaotie

    2008-01-01

    We propose a novel data distribution framework for developing a large Web-based course forum system. In the distributed architectural design, each forum server is fully equipped with the ability to support some course forums independently. The forum servers collaborating with each other constitute the whole forum system. Therefore, the workload of…

  9. Conceptual Framework for Curriculum Decisions in Education for Marketing and Distribution Careers.

    Science.gov (United States)

    Gordon, Alice K.; And Others

    Developed to provide bases for curriculum decisions in education for marketing and distribution careers, the conceptual framework presented here contains the following elements: Identification of social, economic and educational trends which affect employment and education in marketing and distribution; an assessment of current education practice;…

  10. Using a knowledge translation framework to implement asthma clinical practice guidelines in primary care.

    Science.gov (United States)

    Licskai, Christopher; Sands, Todd; Ong, Michael; Paolatto, Lisa; Nicoletti, Ivan

    2012-10-01

    Quality problem International guidelines establish evidence-based standards for asthma care; however, recommendations are often not implemented and many patients do not meet control targets. Initial assessment Regional pilot data demonstrated a knowledge-to-practice gap. Choice of solutions We engineered health system change in a multi-step approach described by the Canadian Institutes of Health Research knowledge translation framework. Implementation Knowledge translation occurred at multiple levels: patient, practice and local health system. A regional administrative infrastructure and inter-disciplinary care teams were developed. The key project deliverable was a guideline-based interdisciplinary asthma management program. Six community organizations, 33 primary care physicians and 519 patients participated. The program operating cost was $290/patient. Evaluation Six guideline-based care elements were implemented, including spirometry measurement, asthma controller therapy, a written self-management action plan and general asthma education, including the inhaler device technique, role of medications and environmental control strategies in 93, 95, 86, 100, 97 and 87% of patients, respectively. Of the total patients 66% were adults, 61% were female, the mean age was 35.7 (SD = ± 24.2) years. At baseline 42% had two or more symptoms beyond acceptable limits vs. 17% (P< 0.001) post-intervention; 71% reported urgent/emergent healthcare visits at baseline (2.94 visits/year) vs. 45% (1.45 visits/year) (P< 0.001); 39% reported absenteeism (5.0 days/year) vs. 19% (3.0 days/year) (P< 0.001). The mean follow-up interval was 22 (SD = ± 7) months. Lessons learned A knowledge-translation framework can guide multi-level organizational change, facilitate asthma guideline implementation, and improve health outcomes in community primary care practices. Program costs are similar to those of diabetes programs. Program savings offset costs in a ratio of 2.1:1.

  11. Distributed Caching in a Multi-Server Environment : A study of Distributed Caching mechanisms and an evaluation of Distributed Caching Platforms available for the .NET Framework

    OpenAIRE

    Herber, Robert

    2010-01-01

    This paper discusses the problems Distributed Caching can be used to solve and evaluates a couple of Distributed Caching Platforms targeting the .NET Framework. Basic concepts and functionality that is general for all distributed caching platforms is covered in chapter 2. We discuss how Distributed Caching can resolve synchronization problems when using multiple local caches, how a caching tier can relieve the database and improve the scalability of the system, and also how memory consumption...

  12. A Strategic Approach to Curriculum Design for Information Literacy in Teacher Education--Implementing an Information Literacy Conceptual Framework

    Science.gov (United States)

    Klebansky, Anna; Fraser, Sharon P.

    2013-01-01

    This paper details a conceptual framework that situates curriculum design for information literacy and lifelong learning, through a cohesive developmental information literacy based model for learning, at the core of teacher education courses at UTAS. The implementation of the framework facilitates curriculum design that systematically,…

  13. Hybrid Multi-Agent Control in Microgrids: Framework, Models and Implementations Based on IEC 61850

    Directory of Open Access Journals (Sweden)

    Xiaobo Dou

    2014-12-01

    Full Text Available Operation control is a vital and complex issue for microgrids. The objective of this paper is to explore the practical means of applying decentralized control by using a multi agent system in actual microgrids and devices. This paper presents a hierarchical control framework (HCF) consisting of local reaction control (LRC) level, local decision control (LDC) level, horizontal cooperation control (HCC) level and vertical cooperation control (VCC) level to meet different control requirements of a microgrid. Then, a hybrid multi-agent control model (HAM) is proposed to implement HCF, and the properties, functionalities and operating rules of HAM are described. Furthermore, the paper elaborates on the implementation of HAM based on the IEC 61850 Standard, and proposes some new implementation methods, such as extended information models of IEC 61850 with agent communication language and bidirectional interaction mechanism of generic object oriented substation event (GOOSE) communication. A hardware design and software system are proposed and the results of simulation and laboratory tests verify the effectiveness of the proposed strategies, models and implementations.

  14. Links in a distributed database: Theory and implementation

    Energy Technology Data Exchange (ETDEWEB)

    Karonis, N.T.; Kraimer, M.R.

    1991-12-01

    This document addresses the problem of extending database links across Input/Output Controller (IOC) boundaries. It lays a foundation by reviewing the current system and proposing an implementation specification designed to guide all work in this area. The document also describes an implementation that is less ambitious than our formally stated proposal, one that does not extend the reach of all database links across IOC boundaries. Specifically, it introduces an implementation of input and output links and comments on that overall implementation. We include a set of manual pages describing each of the new functions the implementation provides.

  15. Distributed Earth observation data integration and on-demand services based on a collaborative framework of geospatial data service gateway

    Science.gov (United States)

    Xie, Jibo; Li, Guoqing

    2015-04-01

    Earth observation (EO) data obtained by airborne or spaceborne sensors are heterogeneous and geographically distributed in storage. The data sources belong to different organizations or agencies whose data management and storage methods differ widely, and each source provides its own publishing platform or portal. As more remote sensing sensors are used for EO missions, space agencies have archived massive amounts of EO data in distributed locations. This distribution of EO data archives and system heterogeneity make it difficult to use geospatial data efficiently in many EO applications, such as hazard mitigation. To solve the interoperability problems of different EO data systems, this paper introduces an advanced architecture of distributed geospatial data infrastructure that addresses the complexity of integrating and processing distributed, heterogeneous EO data on demand. The concept and architecture of a geospatial data service gateway (GDSG) is proposed to connect heterogeneous EO data sources so that EO data can be retrieved and accessed through unified interfaces. The GDSG consists of a set of tools and services that encapsulate heterogeneous geospatial data sources into homogeneous service modules. The GDSG modules include EO metadata harvesters and translators, adaptors to different types of data systems, unified data query and access interfaces, EO data cache management, and a gateway GUI. The GDSG framework is used to implement interoperability and synchronization between distributed EO data sources with heterogeneous architectures. An on-demand distributed EO data platform was developed to validate the GDSG architecture and implementation techniques, with several distributed EO data archives used for testing. Flood and earthquake response serve as two scenarios for the use cases of distributed EO data integration and interoperability.
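
    A minimal sketch of the gateway pattern described here, assuming invented source layouts and query fields: per-source adapters normalise native records so that a single gateway query fans out across heterogeneous catalogues. It is illustrative only and not the GDSG API.

```python
# Sketch of the gateway idea: heterogeneous EO data sources are wrapped by
# adapters that expose one unified query interface. Source names, fields and
# the query signature are invented for illustration.


def _intersects(a, b):
    """Axis-aligned bounding-box intersection test, boxes as (w, s, e, n)."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])


class CatalogueAdapterA:
    """Adapter for a source whose native records use one metadata layout."""

    def __init__(self, records):
        self._records = records  # e.g. [{"id": ..., "bbox": (w, s, e, n)}]

    def search(self, bbox):
        return [
            {"source": "A", "id": r["id"], "bbox": r["bbox"]}
            for r in self._records if _intersects(r["bbox"], bbox)
        ]


class CatalogueAdapterB:
    """Adapter for a source with a different native layout."""

    def __init__(self, records):
        self._records = records  # e.g. [{"name": ..., "corners": (w, s, e, n)}]

    def search(self, bbox):
        return [
            {"source": "B", "id": r["name"], "bbox": r["corners"]}
            for r in self._records if _intersects(r["corners"], bbox)
        ]


class Gateway:
    """Fans a single query out to every registered adapter and merges results."""

    def __init__(self, adapters):
        self.adapters = adapters

    def search(self, bbox):
        results = []
        for adapter in self.adapters:
            results.extend(adapter.search(bbox))
        return results


if __name__ == "__main__":
    gw = Gateway([
        CatalogueAdapterA([{"id": "scene-001", "bbox": (10, 45, 12, 47)}]),
        CatalogueAdapterB([{"name": "granule-9", "corners": (11, 46, 13, 48)}]),
    ])
    print(gw.search(bbox=(11, 46, 11.5, 46.5)))  # hits both sources
```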

  16. Exploring the link between PPM implementation and company success in achieving strategic goals: an empirical framework

    Directory of Open Access Journals (Sweden)

    Oosthuizen, Chiara

    2016-11-01

    Full Text Available Organisations are constantly under pressure to innovate and grow by successfully executing their business strategies. The ever-increasing rate of change in technology has implications for product lifecycles, cost pressures, expectations of higher quality, and a larger variety of products and services. These trends result in mounting pressures and a huge increase in complexity, as the drivers of technology must be managed to achieve a competitive advantage. Project portfolio management (PPM) is a solution for unravelling the complexities of multi-projects. In theory, PPM assists an organisation to achieve this competitive advantage through implementing its business strategy, balancing its portfolios, maximising value, and ensuring resource adequacy. There is, however, a lack of empirical evidence on the use and success of PPM approaches in South Africa. This article presents a framework that lays the foundation of an empirical study that will aim to explore the link between PPM implementation and company success in achieving strategic objectives. We base our framework on the factors of good practice in PPM, which include (1) single-project-level characteristics and activities; (2) multi-project-level characteristics and activities; (3) the link between projects and strategy process; and (4) availability and quality of project information.

  17. Efficient Parallel Video Processing Techniques on GPU: From Framework to Implementation

    Directory of Open Access Journals (Sweden)

    Huayou Su

    2014-01-01

    Full Text Available Through reorganizing the execution order and optimizing the data structure, we proposed an efficient parallel framework for the H.264/AVC encoder based on a massively parallel architecture. We implemented the proposed framework with CUDA on NVIDIA’s GPU. Not only are the compute-intensive components of the H.264 encoder parallelized, but the control-intensive components, such as CAVLC and the deblocking filter, are also realized effectively. In addition, we proposed serial optimization methods, including multiresolution multiwindow motion estimation, a multilevel parallel strategy to enhance the parallelism of intracoding as much as possible, component-based parallel CAVLC, and a direction-priority deblocking filter. More than 96% of the workload of the H.264 encoder is offloaded to the GPU. Experimental results show that the parallel implementation achieves a 20-fold speedup over the serial program and satisfies the requirement of real-time HD encoding at 30 fps. The loss of PSNR ranges from 0.14 dB to 0.77 dB at the same bitrate. Through analysis of the kernels, we found that the speedup ratios of the compute-intensive algorithms are proportional to the computational power of the GPU. However, the performance of the control-intensive parts (CAVLC) is closely related to the memory bandwidth, which gives insight for new architecture design.
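
    The reported figures (over 96% of the workload offloaded and roughly a 20-fold overall speedup) are consistent with a simple Amdahl's-law estimate; the short Python check below is a back-of-the-envelope illustration with assumed per-kernel speedups, not an analysis from the paper.

```python
# Back-of-the-envelope Amdahl's-law check of the reported figures: with about
# 96% of the encoder offloaded, the overall speedup is capped near 1/0.04 = 25x,
# so a measured ~20x is plausible. The per-kernel speedup values are assumed.

def overall_speedup(parallel_fraction, parallel_speedup):
    """Amdahl's law: serial part runs at 1x, parallel part at `parallel_speedup`."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / parallel_speedup)


if __name__ == "__main__":
    p = 0.96  # fraction of work offloaded to the GPU (from the abstract)
    for kernel_speedup in (50, 100, 1000):  # assumed GPU speedups on the offloaded part
        print(kernel_speedup, round(overall_speedup(p, kernel_speedup), 1))
    # Upper bound as the kernel speedup grows without limit:
    print("ceiling", round(1.0 / (1.0 - p), 1))  # 25.0
```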

  18. Implementation and Evaluation of Technology Mentoring Program Developed for Teacher Educators: A 6M-Framework

    Directory of Open Access Journals (Sweden)

    Selim Gunuc

    2015-06-01

    Full Text Available The purpose of this basic research is to determine the problems experienced in the Technology Mentoring Program (TMP), and the study discusses how these problems affect the process in general. The implementation was carried out with teacher educators in the education faculty. 8 doctorate students (mentors) provided technology mentoring implementation for one academic term to 9 teacher educators (mentees) employed in the Education Faculty. The data were collected via the mentee and the mentor interview form, mentor reflections and organization meeting reflections. As a result, the problems based on the mentor, on the mentee and on the organization/institution were determined. In order to carry out TMP more effectively and successfully, a 6M-framework (Modifying, Meeting, Matching, Managing, Mentoring, Monitoring) was suggested within the scope of this study. It could be stated that fewer problems will be encountered and that the process will be carried out more effectively and successfully when the structure in this framework is taken into consideration.

  19. Technical design and system implementation of region-line primitive association framework

    Science.gov (United States)

    Wang, Min; Xing, Jinjin; Wang, Jie; Lv, Guonian

    2017-08-01

    Apart from regions, image edge lines are an important information source, and they deserve more attention in object-based image analysis (OBIA) than they currently receive. In the region-line primitive association framework (RLPAF), we promote straight-edge lines as line primitives to achieve powerful OBIAs. Along with regions, straight lines become basic units for subsequent extraction and analysis of OBIA features. This study develops a new software system called remote-sensing knowledge finder (RSFinder) to implement RLPAF for engineering application purposes. This paper introduces the extended technical framework, a comprehensively designed feature set, key technology, and software implementation. To our knowledge, RSFinder is the world's first OBIA system based on two types of primitives, namely, regions and lines. It is fundamentally different from other well-known region-only-based OBIA systems, such as eCognition and the ENVI feature extraction module. This paper has important reference value for the development of similarly structured OBIA systems and line-based algorithms for extracting remote sensing information.

  20. Efficient parallel video processing techniques on GPU: from framework to implementation.

    Science.gov (United States)

    Su, Huayou; Wen, Mei; Wu, Nan; Ren, Ju; Zhang, Chunyuan

    2014-01-01

    Through reorganizing the execution order and optimizing the data structure, we proposed an efficient parallel framework for the H.264/AVC encoder based on a massively parallel architecture. We implemented the proposed framework with CUDA on NVIDIA's GPU. Not only are the compute-intensive components of the H.264 encoder parallelized, but the control-intensive components, such as CAVLC and the deblocking filter, are also realized effectively. In addition, we proposed serial optimization methods, including multiresolution multiwindow motion estimation, a multilevel parallel strategy to enhance the parallelism of intracoding as much as possible, component-based parallel CAVLC, and a direction-priority deblocking filter. More than 96% of the workload of the H.264 encoder is offloaded to the GPU. Experimental results show that the parallel implementation achieves a 20-fold speedup over the serial program and satisfies the requirement of real-time HD encoding at 30 fps. The loss of PSNR ranges from 0.14 dB to 0.77 dB at the same bitrate. Through analysis of the kernels, we found that the speedup ratios of the compute-intensive algorithms are proportional to the computational power of the GPU. However, the performance of the control-intensive parts (CAVLC) is closely related to the memory bandwidth, which gives insight for new architecture design.

  1. A Framework for Implementing and Valuing Biodiversity Offsets in Colombia: A Landscape Scale Perspective

    Directory of Open Access Journals (Sweden)

    Shirley Saenz

    2013-11-01

    Full Text Available Biodiversity offsets provide a mechanism for maintaining or enhancing environmental values in situations where development is sought, despite negative environmental impacts. They seek to ensure that unavoidable deleterious environmental impacts of development are balanced by environmental gains. When onsite impacts warrant the use of offsets there is often little attention paid to make sure that the location of offset sites provides the greatest conservation benefit, ensuring they are consistent with landscape level conservation goals. In most offset frameworks it is difficult for developers to proactively know the offset requirements they will need to implement. Here we propose a framework to address these needs. We propose a series of rules for selecting offset sites that meet the conservation needs of potentially impacted biological targets. We then discuss an accounting approach that seeks to support offset ratio determinations based on a structured and transparent approach. To demonstrate the approach, we present a framework developed in partnership with the Colombian Ministry of Environment and Sustainable Development to reform existing mitigation regulatory processes.

  2. A Web GIS Framework for Participatory Sensing Service: An Open Source-Based Implementation

    Directory of Open Access Journals (Sweden)

    Yu Nakayama

    2017-04-01

    Full Text Available Participatory sensing is the process in which individuals or communities collect and analyze systematic data using mobile phones and cloud services. To efficiently develop participatory sensing services, some server-side technologies have been proposed. Although they provide a good platform for participatory sensing, they are not optimized for spatial data management and processing. For the purpose of spatial data collection and management, many web GIS approaches have been studied. However, they still have not focused on the optimal framework for participatory sensing services. This paper presents a web GIS framework for participatory sensing service (FPSS. The proposed FPSS enables an integrated deployment of spatial data capture, storage, and data management functions. In various types of participatory sensing experiments, users can collect and manage spatial data in a unified manner. This feature is realized by the optimized system architecture and use case based on the general requirements for participatory sensing. We developed an open source GIS-based implementation of the proposed framework, which can overcome financial difficulties that are one of the major problems of deploying sensing experiments. We confirmed with the prototype that participatory sensing experiments can be performed efficiently with the proposed FPSS.
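
    As a hedged illustration of the capture-and-query loop such a service supports, the sketch below stores georeferenced observations in memory and answers a radius query; the field names and the haversine-based query are assumptions, not the FPSS interfaces.

```python
# Minimal sketch of participatory sensing data capture and spatial query:
# contributors submit georeferenced observations, and the service returns those
# near a point of interest. Field names and the radius query are assumptions.
import math


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two WGS84 points, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))


class SensingCampaign:
    """In-memory stand-in for the storage and query functions of a sensing service."""

    def __init__(self):
        self.observations = []

    def submit(self, user, lat, lon, payload):
        self.observations.append({"user": user, "lat": lat, "lon": lon, "payload": payload})

    def near(self, lat, lon, radius_km):
        return [
            o for o in self.observations
            if haversine_km(lat, lon, o["lat"], o["lon"]) <= radius_km
        ]


if __name__ == "__main__":
    campaign = SensingCampaign()
    campaign.submit("alice", 35.68, 139.76, {"noise_db": 72})
    campaign.submit("bob", 35.70, 139.70, {"noise_db": 64})
    print(campaign.near(35.68, 139.75, radius_km=2.0))
```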

  3. Distribution Matching with the Bhattacharyya Similarity: A Bound Optimization Framework.

    Science.gov (United States)

    Ben Ayed, Ismail; Punithakumar, Kumaradevan; Shuo Li

    2015-09-01

    We present efficient graph cut algorithms for three problems: (1) finding a region in an image, so that the histogram (or distribution) of an image feature within the region most closely matches a given model; (2) co-segmentation of image pairs and (3) interactive image segmentation with a user-provided bounding box. Each algorithm seeks the optimum of a global cost function based on the Bhattacharyya measure, a convenient alternative to other matching measures such as the Kullback-Leibler divergence. Our functionals are not directly amenable to graph cut optimization as they contain non-linear functions of fractional terms, which make the ensuing optimization problems challenging. We first derive a family of parametric bounds of the Bhattacharyya measure by introducing an auxiliary labeling. Then, we show that these bounds are auxiliary functions of the Bhattacharyya measure, a result which allows us to solve each problem efficiently via graph cuts. We show that the proposed optimization procedures converge within very few graph cut iterations. Comprehensive and various experiments, including quantitative and comparative evaluations over two databases, demonstrate the advantages of the proposed algorithms over related works in regard to optimality, computational load, accuracy and flexibility.
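
    For orientation, the Bhattacharyya coefficient between two normalised histograms p and q is the sum over bins of sqrt(p_i * q_i), equal to 1 for identical distributions and smaller for dissimilar ones. The snippet below computes the plain coefficient on toy histograms; it does not reproduce the paper's bound-optimization graph cuts.

```python
# Bhattacharyya coefficient on toy histograms: BC(p, q) = sum_i sqrt(p_i * q_i)
# after normalisation. Histogram values below are invented for illustration.
import math


def bhattacharyya_coefficient(p, q):
    if len(p) != len(q):
        raise ValueError("histograms must have the same number of bins")
    sp, sq = sum(p), sum(q)
    return sum(math.sqrt((pi / sp) * (qi / sq)) for pi, qi in zip(p, q))


if __name__ == "__main__":
    model = [10, 30, 40, 20]      # target feature distribution
    region_a = [12, 28, 38, 22]   # close match -> coefficient near 1
    region_b = [40, 40, 10, 10]   # poor match -> noticeably lower
    print(round(bhattacharyya_coefficient(model, region_a), 4))
    print(round(bhattacharyya_coefficient(model, region_b), 4))
```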

  4. Understanding effects in reviews of implementation interventions using the Theoretical Domains Framework.

    Science.gov (United States)

    Little, Elizabeth A; Presseau, Justin; Eccles, Martin P

    2015-06-17

    Behavioural theory can be used to better understand the effects of behaviour change interventions targeting healthcare professional behaviour to improve quality of care. However, the explicit use of theory is rarely reported despite interventions inevitably involving at least an implicit idea of what factors to target to implement change. There is a quality of care gap in the post-fracture investigation (bone mineral density (BMD) scanning) and management (bisphosphonate prescription) of patients at risk of osteoporosis. We aimed to use the Theoretical Domains Framework (TDF) within a systematic review of interventions to improve quality of care in post-fracture investigation. Our objectives were to explore which theoretical factors the interventions in the review may have been targeting and how this might be related to the size of the effect on rates of BMD scanning and osteoporosis treatment with bisphosphonate medication. A behavioural scientist and a clinician independently coded TDF domains in intervention and control groups. Quantitative analyses explored the relationship between intervention effect size and total number of domains targeted, and as number of different domains targeted. Nine randomised controlled trials (RCTs) (10 interventions) were analysed. The five theoretical domains most frequently coded as being targeted by the interventions in the review included "memory, attention and decision processes", "knowledge", "environmental context and resources", "social influences" and "beliefs about consequences". Each intervention targeted a combination of at least four of these five domains. Analyses identified an inverse relationship between both number of times and number of different domains coded and the effect size for BMD scanning but not for bisphosphonate prescription, suggesting that the more domains the intervention targeted, the lower the observed effect size. When explicit use of theory to inform interventions is absent, it is possible to

  5. A unified framework for spiking and gap-junction interactions in distributed neuronal network simulations

    Directory of Open Access Journals (Sweden)

    Jan eHahne

    2015-09-01

    Full Text Available Contemporary simulators for networks of point and few-compartment model neurons come with a plethora of ready-to-use neuron and synapse models and support complex network topologies. Recent technological advancements have broadened the spectrum of application further to the efficient simulation of brain-scale networks on supercomputers. In distributed network simulations the amount of spike data that accrues per millisecond and process is typically low, such that a common optimization strategy is to communicate spikes at relatively long intervals, where the upper limit is given by the shortest synaptic transmission delay in the network. This approach is well-suited for simulations that employ only chemical synapses but it has so far impeded the incorporation of gap-junction models, which require instantaneous neuronal interactions. Here, we present a numerical algorithm based on a waveform-relaxation technique which allows for network simulations with gap junctions in a way that is compatible with the delayed communication strategy. Using a reference implementation in the NEST simulator, we demonstrate that the algorithm and the required data structures can be smoothly integrated with existing code such that they complement the infrastructure for spiking connections. To show that the unified framework for gap-junction and spiking interactions achieves high performance and delivers high accuracy...
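
    To convey the waveform-relaxation idea in miniature, the sketch below iterates two gap-junction-coupled leaky units over one communication interval, each sweep re-using the other unit's waveform from the previous sweep; the model, parameters and tolerances are invented and this is not the NEST implementation.

```python
# Toy waveform relaxation for two leaky units coupled by a gap junction: each
# unit is integrated over a communication interval using the other unit's
# waveform from the previous sweep, and sweeps repeat until the waveforms stop
# changing. Model and parameters are invented for illustration.


def integrate(v0, other_waveform, dt, tau=10.0, g=0.3):
    """Forward-Euler integration of dV/dt = -V/tau + g*(V_other - V)."""
    v = v0
    waveform = [v]
    for v_other in other_waveform[:-1]:
        v = v + dt * (-v / tau + g * (v_other - v))
        waveform.append(v)
    return waveform


def waveform_relaxation(v1_0, v2_0, steps=20, dt=0.1, tol=1e-9, max_sweeps=50):
    # Initial guess: both waveforms held constant over the interval.
    w1 = [v1_0] * (steps + 1)
    w2 = [v2_0] * (steps + 1)
    for sweep in range(1, max_sweeps + 1):
        new_w1 = integrate(v1_0, w2, dt)  # uses the other unit's previous waveform
        new_w2 = integrate(v2_0, w1, dt)
        change = max(abs(a - b) for a, b in zip(new_w1 + new_w2, w1 + w2))
        w1, w2 = new_w1, new_w2
        if change < tol:
            break
    return w1, w2, sweep


if __name__ == "__main__":
    w1, w2, sweeps = waveform_relaxation(v1_0=1.0, v2_0=-1.0)
    print("converged after", sweeps, "sweeps")
    print("end of interval:", round(w1[-1], 6), round(w2[-1], 6))
```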

  6. Supporting evaluation and implementation of musculoskeletal Models of Care: A globally-informed framework for judging 'readiness' and 'success'.

    Science.gov (United States)

    Briggs, Andrew M; Jordan, Joanne E; Jennings, Matthew; Speerin, Robyn; Bragge, Peter; Chua, Jason; Woolf, Anthony D; Slater, Helen

    2016-06-06

    Objective To develop a globally-informed framework to evaluate 'readiness' for implementation and 'success' after implementation of musculoskeletal Models of Care (MoCs). Methods Three-phases were undertaken: 1) qualitative study with 27 Australian subject matter experts (SMEs) to develop a draft Framework; 2) eDelphi study with an international panel of 93 SMEs across 30 nations to evaluate face validity, refine and establish consensus on the Framework components; and 3) translation of the Framework into a user-focused resource and evaluation of its acceptability with the eDelphi panel. Results A comprehensive evaluation framework was developed for judging 'readiness' and 'success' of musculoskeletal MoCs. The Framework consists of nine domains, with each domain containing a number of themes underpinned by detailed elements. In the first Delphi round, scores of 'partly agree' or 'completely agree' with the draft Framework ranged from 96.7-100%. In the second round, 'essential' scores ranged from 58.6-98.9%, resulting in 14 of 34 themes being classified as essential. SMEs strongly agreed or agreed that the final Framework was useful (98.8%), usable (95.1%), credible (100%) and appealing (93.9%). Overall, 96.3% strongly supported or supported the final structure of the Framework as it was presented, while 100%, 96.3% and 100% strongly supported or supported the content within the readiness, initiating implementation and success streams, respectively. Conclusions An empirically-derived framework to evaluate the readiness and success of musculoskeletal MoCs was strongly supported by an international panel of SMEs. The Framework provides an important internationally-applicable benchmark for the development, implementation and evaluation of musculoskeletal MoCs. This article is protected by copyright. All rights reserved.

  7. Proposing a Strategic Framework for Distributed Manufacturing Execution System Using Cloud Computing

    Directory of Open Access Journals (Sweden)

    Shiva Khalili Gheidari

    2013-07-01

    Full Text Available This paper introduces a strategic framework that uses service-oriented architecture to design a distributed MES over the cloud. In this study, the main structure of the framework is defined in terms of a series of modules that communicate with each other through a design pattern called the mediator. The framework focuses on the main module, which coordinates distributed orders with the other modules, and the paper finally discusses the benefit of using the cloud in comparison with previous architectures. The main structure of the framework (the mediator) and the benefit of focusing on the main module by using the cloud should be described in more detail; the aim and the results of comparing this method with previous architectures, in terms of both quality and quantity, are not described.
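
    A minimal sketch of the mediator idea, assuming invented module names and message layouts: modules register with a mediator and exchange order messages only through it, so they stay loosely coupled.

```python
# Minimal mediator sketch: MES modules never reference each other directly;
# they register with a mediator and exchange messages through it. Module names
# and the message layout are invented for illustration.


class Mediator:
    def __init__(self):
        self._modules = {}

    def register(self, name, module):
        self._modules[name] = module
        module.mediator = self

    def send(self, sender, recipient, message):
        # Routing is centralised here, so modules stay loosely coupled.
        print(f"[mediator] {sender} -> {recipient}: {message}")
        self._modules[recipient].receive(sender, message)


class Module:
    def __init__(self, name):
        self.name = name
        self.mediator = None

    def receive(self, sender, message):
        print(f"[{self.name}] received {message} from {sender}")


class OrderModule(Module):
    def place_order(self, order_id, site):
        # The order handler only talks to the mediator, which routes the message.
        self.mediator.send(self.name, site, {"order": order_id, "action": "schedule"})


if __name__ == "__main__":
    mediator = Mediator()
    orders = OrderModule("orders")
    mediator.register("orders", orders)
    mediator.register("plant-a", Module("plant-a"))
    mediator.register("plant-b", Module("plant-b"))
    orders.place_order("PO-1001", site="plant-b")
```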

  8. Development, implementation and critique of a bioethics framework for pharmaceutical sponsors of human biomedical research.

    Science.gov (United States)

    Van Campen, Luann E; Therasse, Donald G; Klopfenstein, Mitchell; Levine, Robert J

    2015-11-01

    Pharmaceutical human biomedical research is a multi-dimensional endeavor that requires collaboration among many parties, including those who sponsor, conduct, participate in, or stand to benefit from the research. Human subjects' protections have been promulgated to ensure that the benefits of such research are accomplished with respect for and minimal risk to individual research participants, and with an overall sense of fairness. Although these protections are foundational to clinical research, most ethics guidance primarily highlights the responsibilities of investigators and ethics review boards. Currently, there is no published resource that comprehensively addresses bioethical responsibilities of industry sponsors; including their responsibilities to parties who are not research participants, but are, nevertheless key stakeholders in the endeavor. To fill this void, in 2010 Eli Lilly and Company instituted a Bioethics Framework for Human Biomedical Research. This paper describes how the framework was developed and implemented and provides a critique based on four years of experience. A companion article provides the actual document used by Eli Lilly and Company to guide ethical decisions regarding all phases of human clinical trials. While many of the concepts presented in this framework are not novel, compiling them in a manner that articulates the ethical responsibilities of a sponsor is novel. By utilizing this type of bioethics framework, we have been able to develop bioethics positions on various topics, provide research ethics consultations, and integrate bioethics into the daily operations of our human biomedical research. We hope that by sharing these companion papers we will stimulate discussion within and outside the biopharmaceutical industry for the benefit of the multiple parties involved in pharmaceutical human biomedical research.

  9. Developing a framework for implementing intensive care unit diaries: a focused review of the literature.

    Science.gov (United States)

    Beg, Muna; Scruth, Elizabeth; Liu, Vincent

    2016-11-01

    Intensive care unit diaries have been shown to improve post-critical illness recovery; however, prior reports of diary implementation are heterogeneous. We sought to construct a common framework for designing and implementing intensive care unit diaries based on prior studies. We conducted a focused review of the literature regarding intensive care diaries based on a systematic search of several databases. Two reviewers assessed 56 studies and data were abstracted from a total of 25 eligible studies conducted between 1990 and 2014. We identified key information regarding the development, design, and implementation of the journals. We then grouped elements that appeared consistently across these studies within three main categories: (1) diary target populations; (2) diary format and content; and (3) the manner of diary return and follow-up. Most studies were conducted in European countries in adult intensive care units and targeted patients in both medical and surgical units. The timing of diary initiation was based on the elapsed length of stay or duration of mechanical ventilation. We categorised diary format and content as: entry content, authors, use of standardised headings, type of language, initiation, frequency of entries, and physical location of diaries. Diaries were handwritten, and many studies found that photographs were an essential element in ICU diaries. We categorised the manner of diary return and follow-up. The context in which intensive care unit diaries were returned was felt to be an important factor in improving the use of diaries in recovery. In conclusion, we describe a common framework for the future development of intensive care unit diaries that revolves around the target population for the diaries, their format and content, and the timing of their use. Future studies should address how these elements impact the mechanisms by which intensive care diaries exert beneficial effects. Copyright © 2016 Australian College of Critical Care Nurses

  10. The SOPHY Framework

    DEFF Research Database (Denmark)

    Laursen, Karl Kaas; Pedersen, Martin Fejrskov; Bendtsen, Jan Dimon;

    The goal of the Sophy framework (Simulation, Observation and Planning in Hybrid Systems) is to implement a multi-level framework for description, simulation, observation, fault detection and recovery, diagnosis and autonomous planning in distributed embedded hybrid systems. A Java-based distributed...

  11. Implementation of Data Integration using Distributed Systems: A Review

    OpenAIRE

    Rakesh Jaitly

    2011-01-01

    Data integration in distributed data systems is introduced to solve the problems that arise from differing data models, and data integration in distributed systems can be supported effectively. Data conversion is still a challenge in distributed system integration. A community-based system is used for distributed data integration; it comprises three elements: community, data model and communication protocol. The integration system solves the data heterogeneity problem in production management, making ...

  12. A conceptual framework for organizational readiness to implement nutrition and physical activity programs in early childhood education settings.

    Science.gov (United States)

    Sharma, Shreela V; Upadhyaya, Mudita; Schober, Daniel J; Byrd-Williams, Courtney

    2014-10-30

    Across multiple sectors, organizational readiness predicts the success of program implementation. However, the factors influencing readiness of early childhood education (ECE) organizations for implementation of new nutrition and physical activity programs is poorly understood. This study presents a new conceptual framework to measure organizational readiness to implement nutrition and physical activity programs in ECE centers serving children aged 0 to 5 years. The framework was validated for consensus on relevance and generalizability by conducting focus groups; the participants were managers (16 directors and 2 assistant directors) of ECE centers. The framework theorizes that it is necessary to have "collective readiness," which takes into account such factors as resources, organizational operations, work culture, and the collective attitudes, motivation, beliefs, and intentions of ECE staff. Results of the focus groups demonstrated consensus on the relevance of proposed constructs across ECE settings. Including readiness measures during program planning and evaluation could inform implementation of ECE programs targeting nutrition and physical activity behaviors.

  13. The Distributed Logical Reasoning Language D-Tuili and Its Implementation on Microcomputer Network

    Institute of Scientific and Technical Information of China (English)

    高全泉; 陆汝钤; 等

    1992-01-01

    D-Tuili, which has been implemented on a microcomputer network, is a distributed logical reasoning programming language. D-Tuili supports parallel programming at the language level and couples loosely with the distributed database management system, so data in distributed databases can be used in distributed logic programs. In this paper, we mainly introduce the components of D-Tuili used to design distributed logic programs. Furthermore, the main principles behind the implementation of D-Tuili and the main technologies adopted in its implemented system are described.

  14. Implementation of the Framework Convention on Tobacco Control in Africa: Current Status of Legislation

    Directory of Open Access Journals (Sweden)

    Jacqueline Tumwine

    2011-11-01

    Full Text Available Objective: To describe, as of July 2011, the status of tobacco control legislation in Africa in three key areas of the Framework Convention on Tobacco Control (FCTC): (1) Protection from exposure to tobacco smoke, (2) Packaging and labelling of tobacco products, and (3) Tobacco advertising, promotion and sponsorship. Methods: Review and analysis of tobacco control legislation in Africa, media reports, journal articles, tobacco industry documents and data published in the 2011 WHO Report on the Global Tobacco Epidemic. Results: Modest progress in FCTC implementation in Africa with many countries having legislation or policies on the protection from exposure to tobacco smoke; however, only a handful of countries meet the standards of the FCTC Article 8 and its Guidelines, particularly with regard to designated smoking areas. Little progress on packaging and labelling of tobacco products, with few countries having legislation meeting the minimum standards of the FCTC Article 11 and its Guidelines. Mauritius is the only African country with graphic or pictorial health warnings in place and has the largest warning labels in Africa. Slightly better progress in banning tobacco advertising, promotion and sponsorship has been shown by African countries, although the majority of legislation falls short of the standards of the FCTC Article 13 and its Guidelines. Despite their efforts, African countries’ FCTC implementation at national level has not matched the strong regional commitment demonstrated during the FCTC treaty negotiations. Conclusion: This study highlights the need for Africa to step up efforts to adopt and implement effective tobacco control legislation that is fully compliant with the FCTC. In order to achieve this, countries should prioritise resources for capacity building for drafting strong FCTC compliant legislation, research to inform policy and boost political will, and countering the tobacco industry which is a major obstacle to FCTC

  15. Complex governance structures and incoherent policies: Implementing the EU water framework directive in Sweden.

    Science.gov (United States)

    Söderberg, Charlotta

    2016-12-01

    Contemporary processes of environmental policymaking in general span several territorial tiers. This also holds for the EU Water Framework Directive system of environmental quality standards (EQS), which are part of a complex multi-level institutional landscape embracing the EU, national and sub-national levels. Recent evaluations show that many EU member states, including Sweden, have not reached the ecological goals for water in 2015. Departing from theories on policy coherence and multi-level governance, this paper therefore analyses Swedish water governance as a case to further our understanding of policy implementation in complex governance structures: how does policy coherence (or the lack thereof) affect policy implementation in complex governance structures? To answer this question, the paper maps out the formal structure of the water governance system, focusing on power directions within the system, analyses policy coherence in Swedish water governance by mapping out policy conflicts between the EQS for water and other goals/regulations, and explores how they are handled by national and sub-national water bureaucrats. The study concludes that without clear central guidance, 'good ecological status' for Swedish water will be difficult to achieve, since incoherent policies make policy implementation inefficient due to constant power struggles between different authorities, and since environmental goals are often overridden by economic and other societal goals. Further research is needed in order to explore if similar policy conflicts between water quality and other objectives occur in other EU member states and how bureaucrats handle such conflicts in different institutional settings. This study of the Swedish case indicates that the role of the state as a navigator and rudder-holder is important in order to improve policy implementation in complex governance structures; otherwise, bureaucrats risk being lost in an incoherent archipelago of

  16. Open Heavy Flavor Production in QCD -- Conceptual Framework and Implementation Issues

    CERN Document Server

    Tung, W K; Schmidt, C; Tung, Wu-Ki; Kretzer, Stefan; Schmidt, Carl

    2002-01-01

    Heavy flavor production is an important QCD process both in its own right and as a key component of precision global QCD analysis. Apparent disagreements between fixed-flavor scheme calculations of b-production rate with experimental measurements in hadro-, lepto-, and photo-production provide new impetus to a thorough examination of the theory and phenomenology of this process. We review existing methods of calculation, and place them in the context of the general PQCD framework of Collins. A distinction is drawn between scheme dependence and implementation issues related to quark mass effects near threshold. We point out a so far overlooked kinematic constraint on the threshold behavior, which greatly simplifies the variable flavor number scheme. It obviates the need for the elaborate existing prescriptions, and leads to robust predictions. It can facilitate the study of current issues on heavy flavor production as well as precision global QCD analysis.

  17. Microplastics in seawater: Recommendations from the Marine Strategy Framework Directive implementation process

    Directory of Open Access Journals (Sweden)

    Jesus Gago

    2016-11-01

    Full Text Available Microplastic litter is a pervasive pollutant present in marine systems across the globe. The legacy of microplastics pollution in the marine environment today may remain for years to come due to the persistence of these materials. Microplastics are emerging contaminants of potential concern and as yet there are few recognised approaches for monitoring. In 2008, the EU Marine Strategy Framework Directive (MSFD, 2008/56/EC) included microplastics as an aspect to be measured. Here we outline the approach as discussed by the European Union expert group on marine litter, the Technical Subgroup on Marine Litter (TSG-ML), with a focus on the implementation of monitoring microplastics in seawater in European seas. It is concluded that harmonization and coherence are needed to achieve reliable monitoring.

  18. A Framework for Sentiment Analysis Implementation of Indonesian Language Tweet on Twitter

    Science.gov (United States)

    Asniar; Aditya, B. R.

    2017-01-01

    Sentiment analysis is the process of understanding, extracting, and processing textual data automatically to obtain information. It can be used to gauge opinion on an issue and to identify responses to it. Large volumes of digital data remain unused even though they could provide useful information, especially for government. Sentiment analysis in government is used to monitor government work programs, such as those of the Government of Bandung City, through social media data. The analysis can be used quickly as a tool to see the public response to the work programs, so that the next strategic steps can be taken. This paper adopts a Support Vector Machine as a supervised algorithm for sentiment analysis and presents a framework for implementing sentiment analysis of Indonesian-language tweets on Twitter for the work programs of the Government of Bandung City. The results of this paper can serve as a reference for decision making in local government.
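
    In the spirit of the described approach, a supervised pipeline might combine bag-of-words features with a linear SVM, as in the scikit-learn sketch below; the tiny labelled tweet set is invented, and a real study would use a labelled Indonesian-language corpus with a proper train/test split.

```python
# Illustrative supervised sentiment pipeline: TF-IDF bag-of-words features
# feeding a linear SVM. The labelled examples are invented. Requires scikit-learn.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

# Toy training data; labels: 1 = positive response, 0 = negative response.
train_texts = [
    "great program, the new park is wonderful",
    "thank you for fixing the roads quickly",
    "terrible service, the queue took hours",
    "the program failed and nothing improved",
]
train_labels = [1, 1, 0, 0]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
model.fit(train_texts, train_labels)

new_tweets = [
    "wonderful improvement, thank you",
    "nothing improved, terrible program",
]
print(list(zip(new_tweets, model.predict(new_tweets))))
```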

  19. Challenges to the Implementation of a New Framework for Safeguarding Financial Stability

    Directory of Open Access Journals (Sweden)

    Vlahović Ana

    2014-09-01

    Full Text Available There is probably no single economic concept that has attracted more attention and intrigued scientific and professional circles than financial stability. For over a decade now there have been efforts to establish the starting point in explaining this condition or characteristic of the financial system, since some find that the key to defining financial stability lies in stability while others argue in favour of the opposite, instability. Unfortunately, no agreement has been reached on a universal definition that would be widely accepted at the international level. Consequently, this gave rise to open discussions on systemic risk, creating a framework for preserving financial stability, and the role of central banks in this process. This article analyses the results achieved in the development of a theoretical concept of financial stability and its practical implementation. A consensus has been reached on the necessity of removing rigid barriers between macro and prudential policies and on the necessity of their coordinated actions. The primary objectives of monetary and fiscal stability have been shifted towards preserving financial stability. The isolated macroprudential principle has rightfully earned the epithet of an archaic approach. Coordinated micro and macroprudential policies have definitely prevailed and become reality in many countries, including Montenegro. The created institutional frameworks for safeguarding financial stability at all levels - national, Pan-European and global - represent a challenge for further comparative studies.

  20. Framework for the impact analysis and implementation of Clinical Prediction Rules (CPRs

    Directory of Open Access Journals (Sweden)

    Verbakel Jan

    2011-10-01

    Full Text Available Clinical Prediction Rules (CPRs) are tools that quantify the contribution of symptoms, clinical signs and available diagnostic tests, and in doing so stratify patients according to the probability of having a target outcome or need for a specified treatment. Most focus on the derivation stage with only a minority progressing to validation and very few undergoing impact analysis. Impact analysis studies remain the most efficient way of assessing whether incorporating CPRs into a decision making process improves patient care. However, there is a lack of clear methodology for the design of high quality impact analysis studies. We have developed a sequential four-phased framework based on the literature and the collective experience of our international working group to help researchers identify and overcome the specific challenges in designing and conducting an impact analysis of a CPR. There is a need to shift emphasis from deriving new CPRs to validating and implementing existing CPRs. The proposed framework provides a structured approach to this topical and complex area of research.

  1. Framework for the impact analysis and implementation of Clinical Prediction Rules (CPRs)

    LENUS (Irish Health Repository)

    Wallace, Emma

    2011-10-14

    Clinical Prediction Rules (CPRs) are tools that quantify the contribution of symptoms, clinical signs and available diagnostic tests, and in doing so stratify patients according to the probability of having a target outcome or need for a specified treatment. Most focus on the derivation stage with only a minority progressing to validation and very few undergoing impact analysis. Impact analysis studies remain the most efficient way of assessing whether incorporating CPRs into a decision making process improves patient care. However, there is a lack of clear methodology for the design of high quality impact analysis studies. We have developed a sequential four-phased framework based on the literature and the collective experience of our international working group to help researchers identify and overcome the specific challenges in designing and conducting an impact analysis of a CPR. There is a need to shift emphasis from deriving new CPRs to validating and implementing existing CPRs. The proposed framework provides a structured approach to this topical and complex area of research.

  2. Implementation of the ATLAS trigger within the multi-threaded software framework AthenaMT

    CERN Document Server

    Wynne, Benjamin; The ATLAS collaboration

    2016-01-01

    We present an implementation of the ATLAS High Level Trigger that provides parallel execution of trigger algorithms within the ATLAS multi-threaded software framework, AthenaMT. This development will enable the ATLAS High Level Trigger to meet future challenges due to the evolution of computing hardware and upgrades of the Large Hadron Collider, LHC, and ATLAS Detector. During the LHC data-taking period starting in 2021, luminosity will reach up to three times the original design value. Luminosity will increase further, to up to 7.5 times the design value, in 2026 following LHC and ATLAS upgrades. This includes an upgrade of the ATLAS trigger architecture that will result in an increase in the High Level Trigger input rate by a factor of 4 to 10 compared to the current maximum rate of 100 kHz. The current ATLAS multiprocess framework, AthenaMP, manages a number of processes that process events independently, executing algorithms sequentially in each process. AthenaMT will provide a fully multi-threaded env...
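
    As a generic illustration of the shift from sequential per-event algorithm execution to concurrent execution, the Python sketch below runs independent stand-in algorithms for each event on a thread pool; it is not the AthenaMT scheduler, and the event and algorithm names are invented.

```python
# Generic illustration of moving from sequential per-event algorithm execution
# to concurrent execution with a thread pool. Event contents and the "algorithm"
# work are invented stand-ins.
from concurrent.futures import ThreadPoolExecutor
import time


def run_algorithm(event_id, algorithm_id):
    time.sleep(0.01)  # stand-in for reconstruction / selection work
    return f"event {event_id}: algorithm {algorithm_id} done"


def process_event(event_id, n_algorithms=4):
    # Independent algorithms for one event can run concurrently.
    with ThreadPoolExecutor(max_workers=n_algorithms) as pool:
        results = list(pool.map(lambda a: run_algorithm(event_id, a), range(n_algorithms)))
    return results


if __name__ == "__main__":
    start = time.perf_counter()
    for event_id in range(3):
        for line in process_event(event_id):
            print(line)
    print(f"elapsed: {time.perf_counter() - start:.2f}s")
```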

  3. Development and implementation of a nutrition intervention programme in North West Pakistan: a realist framework.

    Science.gov (United States)

    Lhussier, Monique; Bangash, Sonia; Dykes, Fiona; Zaman, Mukhtiar; Lowe, Nicola M

    2012-12-01

    Maternal and infant malnutrition is prevalent in rural regions of NW Pakistan. This article reports on the use of a combination of a realist Context-Mechanism-Outcome framework and participatory appraisal methods to facilitate the development of a locally sensitive and responsive nutritional intervention programme. Data were gathered through a series of focus group (FG) discussions with local lady health workers, as well as pregnant and breastfeeding women attending an Emergency Field Hospital in North West Pakistan between May 2008 and March 2009. A nutrition intervention programme was implemented that involved cookery demonstration kitchens and free food supplements, coupled with nutrition and healthcare information and advice for pregnant and breastfeeding women. Subsequent FG discussions revealed that the programme had a positive impact on knowledge gained by women in the community and generated an openness to receiving and spreading knowledge. The framework, which rested on the use of a double feedback loop, involving local women, lady health workers, local researchers and UK-based researchers, has enabled not only the establishment of the programme, but has also given the local team the tools to apply for, and gain, further funding for the development of nutrition support services. The development of such methodological tools, which empower local researchers and service providers (wherever located) to operationalize local knowledge and assess interventions, is particularly relevant in international financially-constrained contexts.

  4. Implementing and Managing framework for PaaS in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Junaid Qayyum

    2011-09-01

    Full Text Available With the rapid development of the Internet and Cloud computing, there are more and more network resources. Sharing, management and on-demand allocation of network resources are particularly important in Cloud computing. Platform as a Service (PaaS) is one of the key services in Cloud computing. PaaS is very attractive for schools, research institutions and enterprises which need to reduce IT costs, improve computing platform sharing and meet license constraints. However, nearly all currently available cloud computing platforms are either proprietary or their software infrastructure is invisible to the research community, except for a few open-source platforms. For universities and research institutes, more open and testable experimental platforms are needed at lab level with PCs. In this paper, a framework for managing PaaS in a virtual Cloud computing lab is developed. The framework implements user management, resource management and access management. The system has good expandability and can improve resource sharing and utilization.

  5. A framework for sustainable implementation of e-medicine in transitioning countries.

    Science.gov (United States)

    Isabalija, Stephen Robert; Mbarika, Victor; Kituyi, Geoffrey Mayoka

    2013-01-01

    Organizations in developed countries such as the United States of America and Canada face difficulties and challenges in technology transfer from one organization to another; the complexity of problems easily compounds when such transfers are attempted from developed to developing countries due to differing socioeconomic and cultural environments. There is a gap in the formation of research and education programs to address technology transfer issues that go beyond just transferring the technologies to sustaining such transfers for longer periods. This study examined telemedicine transfer challenges in three Sub-Sahara African countries and developed a framework for sustainable implementation of e-medicine. Both quantitative and qualitative research methods were used. The study findings indicate that e-medicine sustainability in Sub-Saharan Africa is affected by institutional factors such as institutional environment and knowledge management practices; technical factors such as the technological environment and technology transfer project environment; social environmental factors such as social environment and donor involvement. These factors were used to model the proposed framework.

  6. A Framework for Sustainable Implementation of E-Medicine in Transitioning Countries

    Directory of Open Access Journals (Sweden)

    Stephen Robert Isabalija

    2013-01-01

    Full Text Available Organizations in developed countries such as the United States of America and Canada face difficulties and challenges in technology transfer from one organization to another; the complexity of problems easily compounds when such transfers are attempted from developed to developing countries due to differing socioeconomic and cultural environments. There is a gap in the formation of research and education programs to address technology transfer issues that go beyond just transferring the technologies to sustaining such transfers for longer periods. This study examined telemedicine transfer challenges in three Sub-Sahara African countries and developed a framework for sustainable implementation of e-medicine. Both quantitative and qualitative research methods were used. The study findings indicate that e-medicine sustainability in Sub-Saharan Africa is affected by institutional factors such as institutional environment and knowledge management practices; technical factors such as the technological environment and technology transfer project environment; social environmental factors such as social environment and donor involvement. These factors were used to model the proposed framework.

  7. The Framework for KM Implementation in Product and Service Oriented SMEs: Evidence from Field Studies in Taiwan

    Directory of Open Access Journals (Sweden)

    Yao Chin Lin

    2015-03-01

    Full Text Available Knowledge management (KM is a core competency that determines the success of small and medium-sized enterprises (SMEs in this knowledge-based economy. Instead of competing on the basis of physical and financial capital, the success of SMEs is influenced by the knowledge, experience and skills of the owners and its employees. Unfortunately, many SMEs are still struggling with KM implementation due to lacking a comprehensive KM framework. This study aims to identify enablers for KM success and build up a framework for KM implementation in service and product oriented SMEs. By using multiple research methods, this study collects data from SMEs in Taiwan to prove our suggested enablers and reference KM framework. The suggested framework can provide useful assistance and guidance for holistic KM solutions. The K-object concept, which adopted the XML standard, may become a significant managerial and technical element in the KM practice. The enhanced KM framework mandates every employee’s participation in knowledge activities, not just some elite knowledge workers. The findings provide useful implications for researchers and practitioners by providing useful templates for implementing KM initiatives in different industries and more comprehensive framework for KM implementation in different types of SMEs.

  8. Leveraging the Zachman framework implementation using action - research methodology - a case study: aligning the enterprise architecture and the business goals

    Science.gov (United States)

    Nogueira, Juan Manuel; Romero, David; Espadas, Javier; Molina, Arturo

    2013-02-01

    With the emergence of new enterprise models, such as technology-based enterprises, and the large quantity of information generated through technological advances, the Zachman framework continues to represent a modelling tool of great utility and value to construct an enterprise architecture (EA) that can integrate and align the IT infrastructure and business goals. Nevertheless, implementing an EA requires an important effort within an enterprise. Small technology-based enterprises and start-ups can take advantage of EAs and frameworks but, because these enterprises have limited resources to allocate for this task, an enterprise framework implementation is not feasible in most cases. This article proposes a new methodology based on action-research for the implementation of the business, system and technology models of the Zachman framework to assist and facilitate its implementation. Following the explanation of cycles of the proposed methodology, a case study is presented to illustrate the results of implementing the Zachman framework in a technology-based enterprise: PyME CREATIVA, using action-research approach.

  9. Mobile agent-enabled framework for structuring and building distributed systems on the internet

    Institute of Scientific and Technical Information of China (English)

    CAO Jiannong; ZHOU Jingyang; ZHU Weiwei; LI Xuhui

    2006-01-01

    Mobile agent (MA) technology has shown its promise as a powerful means to complement and enhance existing technology in various application areas. In particular, existing work has demonstrated that MA can simplify the development and improve the performance of certain classes of distributed applications, especially those running in a wide-area, heterogeneous, and dynamic networking environment like the Internet. In our previous work, we extended the application of MA to the design of distributed control functions, which require the maintenance of logical relationships among, and/or coordination of, processing entities in a distributed system. A novel framework is presented for structuring and building distributed systems, which uses cooperating mobile agents as an aid to carry out coordination and cooperation tasks in distributed systems. The framework has been used for designing various distributed control functions such as load balancing and mutual exclusion in our previous work. In this paper, we use the framework to propose a novel approach to detecting deadlocks in distributed systems by using mobile agents, which demonstrates the adaptability and flexibility of mobile agents. We first describe the MAEDD (Mobile Agent Enabled Deadlock Detection) scheme, in which mobile agents are dispatched to collect and analyze deadlock information distributed across the network sites and, based on the analysis, to detect and resolve deadlocks. Then the design of an adaptive hybrid algorithm derived from the framework is presented. The algorithm can dynamically adapt itself to the changes in system state by using different deadlock detection strategies. The performance of the proposed algorithm has been evaluated using simulations. The results show that the algorithm can outperform existing algorithms that use a fixed deadlock detection strategy.
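
    The core check such a visiting agent performs can be pictured as cycle detection on the merged wait-for graph, as in the sketch below; the edges are invented, and the MAEDD agent migration and resolution protocol are not modelled here.

```python
# Core check behind distributed deadlock detection: once an agent has gathered
# wait-for edges from the sites it visited, a cycle in the merged wait-for graph
# indicates a deadlock. The edges below are invented for illustration.


def find_cycle(wait_for):
    """Return one cycle in a wait-for graph {process: [processes it waits on]}, or None."""
    visited, on_stack = set(), []

    def dfs(node):
        if node in on_stack:
            return on_stack[on_stack.index(node):] + [node]
        if node in visited:
            return None
        visited.add(node)
        on_stack.append(node)
        for nxt in wait_for.get(node, []):
            cycle = dfs(nxt)
            if cycle:
                return cycle
        on_stack.pop()
        return None

    for start in wait_for:
        cycle = dfs(start)
        if cycle:
            return cycle
    return None


if __name__ == "__main__":
    # Edges collected from three sites by a visiting agent (invented example).
    merged = {"P1": ["P2"], "P2": ["P3"], "P3": ["P1"], "P4": ["P2"]}
    print("deadlock cycle:", find_cycle(merged))  # e.g. ['P1', 'P2', 'P3', 'P1']
```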

  10. Transactive control: a framework for operating power systems characterized by high penetration of distributed energy resources

    DEFF Research Database (Denmark)

    Hu, Junjie; Yang, Guangya; Kok, Koen

    2016-01-01

    The increasing number of distributed energy resources connected to power systems raises operational challenges for the network operator, such as introducing grid congestion and voltage deviations in the distribution network level, as well as increasing balancing needs at the whole system level......, followed by a literature review and demonstration projects that apply to transactive control. Cases are then presented to illustrate the transactive control framework. At the end, discussions and research directions are presented, for applying transactive control to operating power systems, characterized...
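
    As a toy illustration of a transactive exchange, the sketch below lets flexible devices express demand as a decreasing function of price while an operator searches for the price at which aggregate demand fits under a feeder limit; the bid curves, limits and bisection search are invented and do not represent any specific market design.

```python
# Toy transactive exchange: flexible devices bid demand as a decreasing function
# of price, and the operator searches for the price at which aggregate demand
# fits under a feeder limit. All device parameters are invented.


def device_demand(price, max_kw, comfort_price):
    """Simple bid curve: full demand below comfort_price, tapering to zero at 2x."""
    if price <= comfort_price:
        return max_kw
    if price >= 2 * comfort_price:
        return 0.0
    return max_kw * (2 * comfort_price - price) / comfort_price


def clear_price(devices, feeder_limit_kw, lo=0.0, hi=1.0, iters=50):
    """Bisection on price so that total responsive demand meets the feeder limit."""
    for _ in range(iters):
        mid = (lo + hi) / 2
        total = sum(device_demand(mid, *d) for d in devices)
        if total > feeder_limit_kw:
            lo = mid  # demand too high -> raise the price
        else:
            hi = mid
    return hi


if __name__ == "__main__":
    devices = [(3.0, 0.20), (2.0, 0.25), (4.0, 0.15)]  # (max_kw, comfort_price)
    limit = 6.0
    p = clear_price(devices, limit)
    total = sum(device_demand(p, *d) for d in devices)
    print(f"cleared price: {p:.3f}, total demand: {total:.2f} kW (limit {limit} kW)")
```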

  11. An RFID-Based Manufacturing Control Framework for Loosely Coupled Distributed Manufacturing System Supporting Mass Customization

    Science.gov (United States)

    Chen, Ruey-Shun; Tsai, Yung-Shun; Tu, Arthur

    In this study we propose a manufacturing control framework based on radio-frequency identification (RFID) technology and a distributed information system to construct a mass-customization production process in a loosely coupled shop-floor control environment. On the basis of this framework, we developed RFID middleware and an integrated information system for tracking and controlling the manufacturing process flow. A bicycle manufacturer was used to demonstrate the prototype system. The findings of this study were that the proposed framework can improve the visibility and traceability of the manufacturing process as well as enhance process quality control and real-time production pedigree access. Using this framework, an enterprise can easily integrate an RFID-based system into its manufacturing environment to facilitate mass customization and a just-in-time production model.
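
    A minimal event-driven sketch of the tracking idea, assuming invented station names and event layout: each tag read appends to the item's production pedigree and is checked against the expected routing.

```python
# Minimal sketch of RFID-driven process tracking: each tag read at a station
# appends to the item's production pedigree, giving real-time visibility of
# where every work item is. Station names and the event layout are invented.
from datetime import datetime, timezone


class TrackingService:
    def __init__(self, routing):
        self.routing = routing  # expected station sequence
        self.pedigree = {}      # tag_id -> list of (station, timestamp)

    def on_tag_read(self, tag_id, station):
        history = self.pedigree.setdefault(tag_id, [])
        expected = self.routing[len(history)] if len(history) < len(self.routing) else None
        history.append((station, datetime.now(timezone.utc).isoformat()))
        if station != expected:
            print(f"ALERT: tag {tag_id} seen at {station}, expected {expected}")
        return history

    def current_station(self, tag_id):
        history = self.pedigree.get(tag_id)
        return history[-1][0] if history else None


if __name__ == "__main__":
    tracker = TrackingService(routing=["frame-welding", "painting", "assembly", "packing"])
    tracker.on_tag_read("TAG-0001", "frame-welding")
    tracker.on_tag_read("TAG-0001", "painting")
    tracker.on_tag_read("TAG-0001", "packing")  # skipped assembly -> alert
    print("current station:", tracker.current_station("TAG-0001"))
```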

  12. Foundational Report Series: Advanced Distribution Management Systems for Grid Modernization, Implementation Strategy for a Distribution Management System

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Ravindra [Argonne National Lab. (ANL), Argonne, IL (United States); Reilly, James T. [Reilly Associates, Pittston, PA (United States); Wang, Jianhui [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-03-01

    Electric distribution utilities encounter many challenges to successful deployment of Distribution Management Systems (DMSs). The key challenges are documented in this report, along with suggestions for overcoming them. This report offers a recommended list of activities for implementing a DMS. It takes a strategic approach to implementing DMS from a project management perspective. The project management strategy covers DMS planning, procurement, design, building, testing, Installation, commissioning, and system integration issues and solutions. It identifies the risks that are associated with implementation and suggests strategies for utilities to use to mitigate them or avoid them altogether. Attention is given to common barriers to successful DMS implementation. This report begins with an overview of the implementation strategy for a DMS and proceeds to put forward a basic approach for procuring hardware and software for a DMS; designing the interfaces with external corporate computing systems such as EMS, GIS, OMS, and AMI; and implementing a complete solution.

  13. A comprehensive model to evaluate implementation of the world health organization framework convention of tobacco control.

    Science.gov (United States)

    Sarrafzadegan, Nizal; Kelishad, Roya; Rabiei, Katayoun; Abedi, Heidarali; Mohaseli, Khadijeh Fereydoun; Masooleh, Hasan Azaripour; Alavi, Mousa; Heidari, Gholamreza; Ghaffari, Mostafa; O'Loughlin, Jennifer

    2012-03-01

    Iran is one of the countries that have ratified the World Health Organization Framework Convention on Tobacco Control (WHO-FCTC) and has implemented a series of tobacco control interventions, including the Comprehensive Tobacco Control Law. Enforcement of this legislation and assessment of its outcomes require a dedicated evaluation system. This study aimed to develop a generic model, based on the WHO-FCTC articles, to evaluate the implementation of the Comprehensive Tobacco Control Law in Iran. Using a grounded theory approach, qualitative data were collected from 265 subjects in individual interviews and focus group discussions with policymakers who designed the legislation, key stakeholders, and members of the target community. In addition, field observation data were collected in supermarkets/shops, restaurants, teahouses and coffee shops. Data were analyzed in two stages through conceptual theoretical coding. Overall, 617 open codes were extracted from the data into tables; 72 level-3 codes were retained from the level-2 code series. Using a Model Met paradigm, the relationships between the components of each paradigm were depicted graphically. The evaluation model entailed three levels, namely short-term results, process evaluation and long-term results. The central concept of the evaluation process is that enforcing the law influences a variety of internal and environmental factors, including legislative changes; these factors are examined during the process and context evaluation. The current model can be applied to provide FCTC evaluation tools across other jurisdictions.

  14. BOT Contract through the optics of Albanian legal provisions - Issues of the implementation and transfer framework

    Directory of Open Access Journals (Sweden)

    Entela Prifti

    2016-07-01

    Full Text Available Recent years have seen an increase in concession contracts in Albania, accompanied by a revised, modern legal framework. Beyond the debate on whether the government should perform most of these activities itself instead of handing them to the private sector through a concession contract, concession contracts are nowadays a reality and as such should be studied and analysed carefully. The scope of this article is limited to the provisions of Albanian legislation and its approach to the international provisions regarding the BOT (build-operate-transfer) concession contract. A detailed analysis leads to a conclusion as to what extent Albanian concession legislation complies with the internationally accepted principles of Public Private Partnership, mainly concerning the implementation and transfer phases of a BOT contract. Albanian Public Private Partnership legislation has gone through many revisions and amendments during the last twenty years, resulting in a challenging situation for everybody who deals with any aspect of a concession. A detailed understanding of the legal provisions is indeed the core element of a successful implementation process for any concession, yielding the highest profitability for the concession parties, the public entity and the private investor, and consequently serving the best interest of the population.

  15. Searching the short-period variable stars with the photometric algorithm implemented in LUIZA framework

    Science.gov (United States)

    Obara, Lukasz; Żarnecki, Aleksander Filip

    2015-09-01

    Pi of the Sky is a system of wide field-of-view robotic telescopes, which search for short timescale astrophysical phenomena, especially for prompt optical GRB emission. The system was designed for autonomous operation, monitoring a large fraction of the sky with 12m-13m range and time resolution of the order of 1 - 100 seconds. LUIZA is a dedicated framework developed for efficient off-line processing of the Pi of the Sky data, implemented in C++. The photometric algorithm based on ASAS photometry was implemented in LUIZA and compared with the algorithm based on the pixel cluster reconstruction and simple aperture photometry algorithm. Optimized photometry algorithms were then applied to the sample of test images, which were modified to include different patterns of variability of the stars (training sample). Different statistical estimators are considered for developing the general variable star identification algorithm. The algorithm will then be used to search for short-period variable stars in the real data.
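    The abstract mentions a simple aperture photometry algorithm. The sketch below shows the basic idea only (sum the flux inside a circular aperture and subtract a background level estimated from a surrounding annulus); it is written in Java for illustration, is not the LUIZA (C++) or ASAS implementation, and the image, aperture radii, and star position are invented.

```java
// Minimal sketch of simple aperture photometry on a pixel array: sum the flux
// inside a circular aperture and subtract a background level estimated from an
// annulus. Purely illustrative; not the LUIZA or ASAS photometry code.
public class AperturePhotometry {

    public static double measure(double[][] image, double cx, double cy,
                                 double rAperture, double rInner, double rOuter) {
        double apertureSum = 0.0;
        int aperturePixels = 0;
        double backgroundSum = 0.0;
        int backgroundPixels = 0;

        for (int y = 0; y < image.length; y++) {
            for (int x = 0; x < image[y].length; x++) {
                double r = Math.hypot(x - cx, y - cy);
                if (r <= rAperture) {
                    apertureSum += image[y][x];
                    aperturePixels++;
                } else if (r >= rInner && r <= rOuter) {
                    backgroundSum += image[y][x];
                    backgroundPixels++;
                }
            }
        }
        // mean background per pixel, scaled to the aperture area
        double background = backgroundPixels > 0 ? backgroundSum / backgroundPixels : 0.0;
        return apertureSum - background * aperturePixels;
    }

    public static void main(String[] args) {
        double[][] image = new double[21][21];
        for (double[] row : image) java.util.Arrays.fill(row, 10.0); // sky level
        image[10][10] += 500.0; // a "star" on top of the sky
        double flux = measure(image, 10, 10, 3.0, 5.0, 8.0);
        System.out.printf("Background-subtracted flux: %.1f%n", flux); // ~500
    }
}
```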

  16. Hierarchical control framework for integrated coordination between distributed energy resources and demand response

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Di; Lian, Jianming; Sun, Yannan; Yang, Tao; Hansen, Jacob

    2017-09-01

    Demand response represents a significant but largely untapped resource that can greatly enhance the flexibility and reliability of power systems. In this paper, a hierarchical control framework is proposed to facilitate the integrated coordination between distributed energy resources and demand response. The proposed framework consists of coordination and device layers. In the coordination layer, various resource aggregations are optimally coordinated in a distributed manner to achieve the system-level objectives. In the device layer, individual resources are controlled in real time to follow the optimal power generation or consumption dispatched from the coordination layer. For the purpose of practical applications, a method is presented to determine the utility functions of controllable loads by taking into account the real-time load dynamics and the preferences of individual customers. The effectiveness of the proposed framework is validated by detailed simulation studies.
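    The coordination-layer idea of dispatching resources against customer utility functions can be illustrated with a toy price-coordination loop: each controllable load derives its demand from a simple quadratic utility, and an aggregator searches for the price at which total demand matches the available capacity. This Java sketch is not the paper's algorithm; all utility parameters and the capacity value are invented.

```java
import java.util.List;

// Illustrative sketch (not the paper's method): a coordination layer searches
// for a clearing price at which the aggregate demand of controllable loads
// matches the available capacity; each load's demand follows from a quadratic
// utility u(d) = a*d - 0.5*b*d^2, giving demand d(p) = (a - p) / b, clipped to
// [0, dMax]. All parameter values are made up for illustration.
public class PriceCoordination {

    static class Load {
        final double a, b, dMax;
        Load(double a, double b, double dMax) { this.a = a; this.b = b; this.dMax = dMax; }
        double demandAt(double price) {
            double d = (a - price) / b;
            return Math.max(0.0, Math.min(dMax, d));
        }
    }

    // Bisection on the price: aggregate demand is non-increasing in price.
    static double clearingPrice(List<Load> loads, double capacity) {
        double lo = 0.0, hi = 100.0;
        for (int i = 0; i < 60; i++) {
            double mid = 0.5 * (lo + hi);
            double total = loads.stream().mapToDouble(l -> l.demandAt(mid)).sum();
            if (total > capacity) lo = mid; else hi = mid;
        }
        return 0.5 * (lo + hi);
    }

    public static void main(String[] args) {
        List<Load> loads = List.of(new Load(40, 2, 15), new Load(30, 1, 20), new Load(50, 4, 10));
        double capacity = 25.0; // kW available from distributed resources
        double price = clearingPrice(loads, capacity);
        System.out.printf("Clearing price: %.2f%n", price);
        loads.forEach(l -> System.out.printf("  load consumes %.2f kW%n", l.demandAt(price)));
    }
}
```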

  17. A Multi-Functional Fully Distributed Control Framework for AC Microgrids

    DEFF Research Database (Denmark)

    Shafiee, Qobad; Nasirian, Vahidreza; Quintero, Juan Carlos Vasquez

    2017-01-01

    This paper proposes a fully distributed control methodology for secondary control of AC microgrids. The control framework includes three modules: voltage regulator, reactive power regulator, and active power/frequency regulator. The voltage regulator module maintains the average voltage of the mi...

  18. A GIS framework for the stochastic distributed modelling of rainfall induced shallow landslides

    Science.gov (United States)

    Raia, S.; Rossi, M.; Marchesini, I.; Baum, R. L.; Godt, J. W.; Guzzetti, F.

    2011-12-01

    Deterministic distributed models to forecast shallow landslides spatially extend site-specific slope stability and infiltration models. A problem with using existing deterministic models to forecast shallow landslides is the difficulty in obtaining accurate values for the several variables that describe the material properties of the slopes, particularly over large areas. An additional problem is the operational difficulty in performing the simulations. This is because of the amount and diversity of the topographic, geological, hydrological, and rainfall data required by the numerical models. To overcome these problems, we propose a stochastic approach to the distributed modelling of shallow rainfall-induced landslides in a GIS environment. For this purpose, we developed a new stochastic version of the Transient Rainfall Infiltration and Grid-based Regional Slope-stability analysis code (TRIGRS). The new code (TRIGRS-S) uses Gaussian and uniform probability distributions to describe the mechanical and hydrological properties of the slope materials. A Monte Carlo approach is used to investigate the variability of the model parameters. To help in the preparation of the model input data, and in the execution of the simulations, we implemented the TRIGRS-S code in the GRASS GIS environment. Statistical analysis of the results is performed in R, a programming language and software environment for statistical computing and plotting. TRIGRS-S was tested in a 3-km2 area north of Seattle, USA, and in a 13-km2 area south of Perugia, Italy. Adoption of the stochastic framework in the two study areas has resulted in improved spatial forecasts of shallow landslides, when compared to the deterministic forecasts. We attribute the difference to the natural variability of the mechanical and hydrological properties of the slope materials, and to the uncertainty associated with the simplified slope-stability and infiltration models. We expect the stochastic approach, code TRIGRS
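    The stochastic element of TRIGRS-S can be conveyed with a much smaller example: draw soil parameters from probability distributions and evaluate a slope-stability criterion many times to estimate a probability of failure. The Java sketch below uses the standard dry infinite-slope factor-of-safety formula rather than TRIGRS physics, and every parameter value is invented.

```java
import java.util.Random;

// Toy sketch of the stochastic idea behind TRIGRS-S (not its actual code):
// draw soil parameters from probability distributions and evaluate the dry
// infinite-slope factor of safety
//   FS = tan(phi)/tan(beta) + c / (gamma * z * sin(beta) * cos(beta)),
// counting how often FS < 1. Parameter values are invented for illustration.
public class MonteCarloSlopeStability {

    public static void main(String[] args) {
        Random rng = new Random(42);
        int trials = 100_000;

        double slopeDeg = 35.0;      // slope angle beta
        double depth = 2.0;          // failure depth z [m]
        double gamma = 18.0;         // unit weight [kN/m^3]
        double meanCohesion = 5.0, sdCohesion = 1.5;   // kPa
        double meanPhiDeg = 30.0, sdPhiDeg = 3.0;      // degrees

        double beta = Math.toRadians(slopeDeg);
        int failures = 0;
        for (int i = 0; i < trials; i++) {
            double c = Math.max(0.0, meanCohesion + sdCohesion * rng.nextGaussian());
            double phi = Math.toRadians(meanPhiDeg + sdPhiDeg * rng.nextGaussian());
            double fs = Math.tan(phi) / Math.tan(beta)
                      + c / (gamma * depth * Math.sin(beta) * Math.cos(beta));
            if (fs < 1.0) failures++;
        }
        System.out.printf("Estimated probability of failure: %.3f%n", (double) failures / trials);
    }
}
```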

  19. A Framework for the Development of Scalable Heterogeneous Robot Teams with Dynamically Distributed Processing

    Science.gov (United States)

    Martin, Adrian

    As the applications of mobile robotics evolve it has become increasingly less practical for researchers to design custom hardware and control systems for each problem. This research presents a new approach to control system design that looks beyond end-of-lifecycle performance and considers control system structure, flexibility, and extensibility. Toward these ends the Control ad libitum philosophy is proposed, stating that to make significant progress in the real-world application of mobile robot teams the control system must be structured such that teams can be formed in real-time from diverse components. The Control ad libitum philosophy was applied to the design of the HAA (Host, Avatar, Agent) architecture: a modular hierarchical framework built with provably correct distributed algorithms. A control system for exploration and mapping, search and deploy, and foraging was developed to evaluate the architecture in three sets of hardware-in-the-loop experiments. First, the basic functionality of the HAA architecture was studied, specifically the ability to: a) dynamically form the control system, b) dynamically form the robot team, c) dynamically form the processing network, and d) handle heterogeneous teams. Secondly, the real-time performance of the distributed algorithms was tested, and proved effective for the moderate sized systems tested. Furthermore, the distributed Just-in-time Cooperative Simultaneous Localization and Mapping (JC-SLAM) algorithm demonstrated accuracy equal to or better than traditional approaches in resource starved scenarios, while reducing exploration time significantly. The JC-SLAM strategies are also suitable for integration into many existing particle filter SLAM approaches, complementing their unique optimizations. Thirdly, the control system was subjected to concurrent software and hardware failures in a series of increasingly complex experiments. Even with unrealistically high rates of failure the control system was able to

  20. Distributed Framework for Data Mining As a Service on Private Cloud

    Directory of Open Access Journals (Sweden)

    Shraddha Masih

    2014-11-01

    Full Text Available Data mining research faces two great challenges: i. automated mining, and ii. mining of distributed data. Conventional mining techniques are centralized: the data needs to be accumulated at a central location and a mining tool needs to be installed on the computer before performing data mining, so extra time is incurred in collecting the data. Mining is done by specialized analysts who have access to mining tools. This approach is not optimal when the data is distributed over the network. To perform data mining in a distributed scenario, we need to design a different framework to improve efficiency. Also, the size of accumulated data grows exponentially with time and is difficult to mine using a single computer. Personal computers have limitations in terms of computation capability and storage capacity. Cloud computing can be exploited for compute-intensive and data-intensive applications. Data mining algorithms are both compute- and data-intensive, therefore cloud-based tools can provide an infrastructure for distributed data mining. This paper uses cloud computing to support distributed data mining. We propose a cloud-based data mining model which provides mass data storage along with a distributed data mining facility. The paper provides a solution for distributed data mining on the Hadoop framework using an interface that runs the algorithm on a specified number of nodes without any user-level configuration. Hadoop is configured over private servers and clients can process their data through the common framework from anywhere in the private network. Data to be mined can either be chosen from the cloud data server or uploaded from private computers on the network. It is observed that the framework is helpful in processing large data in less time compared to a single system.
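    The split/process/aggregate pattern that underlies such Hadoop-based mining can be shown without a cluster. The Java sketch below runs a trivial mining task (item frequency counting) over data partitions on a local thread pool and merges the partial results; it is an illustration of the pattern, not the proposed Hadoop framework.

```java
import java.util.*;
import java.util.concurrent.*;

// Minimal sketch of the split/process/aggregate pattern behind distributed
// mining frameworks such as the Hadoop-based one described above (this is
// plain Java threads, not Hadoop): each worker mines one data partition
// (here: item frequency counting) and the results are merged at the end.
public class PartitionedFrequencyMining {

    static Map<String, Integer> mine(List<String> partition) {
        Map<String, Integer> counts = new HashMap<>();
        for (String item : partition) counts.merge(item, 1, Integer::sum);
        return counts;
    }

    public static void main(String[] args) throws Exception {
        List<String> data = Arrays.asList("a", "b", "a", "c", "b", "a", "d", "c", "a", "b");
        int workers = 3;
        ExecutorService pool = Executors.newFixedThreadPool(workers);

        // split the data into roughly equal partitions, one per worker
        List<Future<Map<String, Integer>>> futures = new ArrayList<>();
        int chunk = (data.size() + workers - 1) / workers;
        for (int i = 0; i < data.size(); i += chunk) {
            List<String> part = data.subList(i, Math.min(data.size(), i + chunk));
            Callable<Map<String, Integer>> task = () -> mine(part);
            futures.add(pool.submit(task));
        }

        // aggregate the partial results
        Map<String, Integer> total = new HashMap<>();
        for (Future<Map<String, Integer>> f : futures) {
            f.get().forEach((k, v) -> total.merge(k, v, Integer::sum));
        }
        pool.shutdown();
        System.out.println("Item frequencies: " + total); // {a=4, b=3, c=2, d=1}
    }
}
```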

  1. Distributed Prognostics System Implementation on Wireless Embedded Devices

    Data.gov (United States)

    National Aeronautics and Space Administration — Distributed prognostics is the next step in the evolution of prognostic methodologies. It is an important enabling technology for the emerging Condition Based...

  2. A distributed SIRT implementation for the ASTRA Toolbox

    NARCIS (Netherlands)

    Palenstijn, W.J.; Bédorf, J.; Batenburg, K.J.; King, M.; Glick, S.; Mueller, K.

    2015-01-01

    The ASTRA Toolbox is a software toolbox that enables rapid development of GPU accelerated tomography algorithms. It contains GPU implementations of forward and backprojection operations for common scanning geometries, as well as a set of algorithms for iterative reconstruction. These algorithms are
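    SIRT itself is compact enough to sketch on the CPU: the update is x <- x + C * A^T * R * (b - A*x), where R and C hold the reciprocal row and column sums of the projection matrix A. The Java example below applies this to a tiny invented system and is unrelated to the ASTRA Toolbox's GPU code.

```java
// CPU sketch of the SIRT update used in iterative reconstruction (not the
// ASTRA GPU implementation): x <- x + C * A^T * R * (b - A*x), where R and C
// hold the reciprocal row and column sums of the system matrix A.
public class SirtSketch {

    public static void main(String[] args) {
        double[][] A = { {1, 1, 0}, {0, 1, 1}, {1, 0, 1} }; // toy projection matrix
        double[] xTrue = { 2, 3, 1 };
        double[] b = multiply(A, xTrue);                    // simulated measurements

        int rows = A.length, cols = A[0].length;
        double[] rowSumInv = new double[rows], colSumInv = new double[cols];
        for (int i = 0; i < rows; i++)
            for (int j = 0; j < cols; j++) { rowSumInv[i] += A[i][j]; colSumInv[j] += A[i][j]; }
        for (int i = 0; i < rows; i++) rowSumInv[i] = 1.0 / rowSumInv[i];
        for (int j = 0; j < cols; j++) colSumInv[j] = 1.0 / colSumInv[j];

        double[] x = new double[cols]; // start from zero
        for (int iter = 0; iter < 200; iter++) {
            double[] ax = multiply(A, x);
            double[] residual = new double[rows];
            for (int i = 0; i < rows; i++) residual[i] = rowSumInv[i] * (b[i] - ax[i]);
            for (int j = 0; j < cols; j++) {
                double backprojected = 0;
                for (int i = 0; i < rows; i++) backprojected += A[i][j] * residual[i];
                x[j] += colSumInv[j] * backprojected;
            }
        }
        System.out.println(java.util.Arrays.toString(x)); // approaches [2.0, 3.0, 1.0]
    }

    static double[] multiply(double[][] m, double[] v) {
        double[] out = new double[m.length];
        for (int i = 0; i < m.length; i++)
            for (int j = 0; j < v.length; j++) out[i] += m[i][j] * v[j];
        return out;
    }
}
```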

  3. The Ecological Marine Units Project as a Framework for Collaborative Data Exploration, Distribution, and Knowledge Building

    Science.gov (United States)

    Wright, Dawn; Sayre, Roger; Breyer, Sean; Butler, Kevin; VanGraafeiland, Keith; Goodin, Kathy; Kavanaugh, Maria; Costello, Mark; Cressie, Noel; Basher, Zeenatul; Harris, Peter; Guinotte, John

    2017-04-01

    scientific research on species distributions and their relationships to the marine physical environment. To further benefit the community and facilitate collaborative knowledge building, data products are shared openly and interoperably via www.esri.com/ecological-marine-units. This includes provision of 3D point mesh and EMU clusters at the surface, bottom, and within the water column in varying formats via download, web services or web apps, as well as generic algorithms and GIS workflows that scale from global to regional and local. A major aim is for community members to move the research forward with higher-resolution data from their own field studies or areas of interest, with the original EMU project team assisting with GIS implementation (especially via a new online discussion forum), or hosting of additional data products as needed.

  4. Research and Implementation of Distributed Virtual Simulation Platform Based on Components

    Institute of Scientific and Technical Information of China (English)

    SUN Zhi-xin; WANG Ru-chuan; WANG Shao-di

    2004-01-01

    This paper proposes a combination of systems-theoretic simulation methodology with virtual reality technology as the basis for a component-based virtual simulation framework. The resulting universal framework can be used in different fields, such as driving training and air-combat training. The result of this synergy is a powerful component-based virtual simulation framework. After briefly introducing the concepts and principles of distributed component objects, the paper describes a component-based software development method. A method of component-based virtual simulation system modeling is then proposed, and the integrated framework supporting distributed virtual simulation, together with its key technologies, is discussed at length. Our experiments indicate that the framework can be widely used in simulation fields such as weapon-system confrontation and driving simulation.

  5. Modelling Framework and the Quantitative Analysis of Distributed Energy Resources in Future Distribution Networks

    DEFF Research Database (Denmark)

    Han, Xue; Sandels, Claes; Zhu, Kun;

    2013-01-01

    , comprising distributed generation, active demand and electric vehicles. Subsequently, quantitative analysis was made on the basis of the current and envisioned DER deployment scenarios proposed for Sweden. Simulations are performed in two typical distribution network models for four seasons. The simulation...... results show that in general the DER deployment brings in the possibilities to reduce the power losses and voltage drops by compensating power from the local generation and optimizing the local load profiles....

  6. A Linear Algebra Framework for Static High Performance Fortran Code Distribution

    Directory of Open Access Journals (Sweden)

    Corinne Ancourt

    1997-01-01

    Full Text Available High Performance Fortran (HPF) was developed to support data parallel programming for single-instruction multiple-data (SIMD) and multiple-instruction multiple-data (MIMD) machines with distributed memory. The programmer is provided a familiar uniform logical address space and specifies the data distribution by directives. The compiler then exploits these directives to allocate arrays in the local memories, to assign computations to elementary processors, and to migrate data between processors when required. We show here that linear algebra is a powerful framework to encode HPF directives and to synthesize distributed code with space-efficient array allocation, tight loop bounds, and vectorized communications for INDEPENDENT loops. The generated code includes traditional optimizations such as guard elimination, message vectorization and aggregation, and overlap analysis. The systematic use of an affine framework makes it possible to prove the compilation scheme correct.

  7. ICT4RED 12 - Component implementation framework: A conceptual framework for integrating mobile technology into resource-constrained rural schools

    CSIR Research Space (South Africa)

    Ford, M

    2014-05-01

    Full Text Available ICT for Rural Education Development (ICT4RED) is a large-scale pilot that is testing the use of tablets in 26 deep rural schools in the Eastern Cape Province of South Africa. The aim is to develop a replicable framework that will enable evidence...

  8. Design and Implementation of Distributed N-Tier Applications

    Directory of Open Access Journals (Sweden)

    PECHERLE George

    2010-05-01

    Full Text Available This paper describes the process of creating multitier applications from design to implementation, putting particular emphasis on solutions that use Java Remote Method Invocation (Java RMI). Starting with a short description of concepts like tier and n-tier software, it continues by presenting the goal of the article, followed closely by a proposed design, based on RMI, and an implementation in Java, with advantages and disadvantages, rounded up by a comparison between our solution and other popular technologies.
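    A minimal, self-contained Java RMI example of the pattern the article builds on is shown below: a remote interface, its server-side implementation, and a client call through the RMI registry, all in one file for brevity. It is a generic illustration of RMI, not the authors' proposed n-tier design; the service name and port are arbitrary.

```java
import java.rmi.Remote;
import java.rmi.RemoteException;
import java.rmi.registry.LocateRegistry;
import java.rmi.registry.Registry;
import java.rmi.server.UnicastRemoteObject;

// Minimal Java RMI example of the middle-tier pattern the article discusses
// (a simplification, not the authors' design): a remote service interface,
// its server-side implementation, and a client call through the RMI registry.
public class RmiNTierDemo {

    // The remote contract exposed by the business tier.
    public interface GreetingService extends Remote {
        String greet(String name) throws RemoteException;
    }

    // Server-side implementation of the remote interface.
    public static class GreetingServiceImpl implements GreetingService {
        @Override
        public String greet(String name) {
            return "Hello, " + name + " (served remotely)";
        }
    }

    public static void main(String[] args) throws Exception {
        // --- server side: export the object and register it under a name ---
        GreetingService service = new GreetingServiceImpl();
        GreetingService stub = (GreetingService) UnicastRemoteObject.exportObject(service, 0);
        Registry registry = LocateRegistry.createRegistry(1099);
        registry.rebind("GreetingService", stub);

        // --- client side: look up the stub and invoke the remote method ---
        Registry clientRegistry = LocateRegistry.getRegistry("localhost", 1099);
        GreetingService remote = (GreetingService) clientRegistry.lookup("GreetingService");
        System.out.println(remote.greet("presentation tier"));

        UnicastRemoteObject.unexportObject(service, true); // clean shutdown
    }
}
```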

  9. Strategic Framework for Implementing the Potential of Import Substitution on the Example of Railway Engineering

    Directory of Open Access Journals (Sweden)

    Yuliya Georgievna Lavrikova

    2015-07-01

    Full Text Available At present, Russia's economy is dependent on imports in some of its strategically important sectors. Recent economic and political developments, such as the aggravation of the geopolitical situation and the termination of economic partnerships between Russia and a number of countries and entities, together with the Government's policy aimed at reducing import dependence, determine the need to expand the interaction between domestic producers and the need to use domestic resources, materials and equipment in economic activities. Import substitution in Russia can become a driving force of its industrial growth. The paper presents different interpretations of the term "import substitution" contained in several publications of recent years; it also reveals a common approach of the authors to this problem. The article summarizes existing proposals on priority areas of import substitution such as the shift towards import-substituting production and technology in strategically important industries. Mechanical engineering is seen as one of the most important industries in this respect. Russia's machine-building complex is a highly diversified industry; therefore, the policy of import substitution implies that it will be implemented in various sectors of mechanical engineering on the basis of a differentiated approach, with regard to industry and sectoral specifics. The article considers a strategic framework for the implementation of the import substitution potential using the example of railway engineering. The authors reveal trends in the development of the internal market of railway engineering products; they determine the degree of import dependence for individual sectors of the industry on the basis of statistical data. The article substantiates priorities and possibilities of import substitution in different sectors, including high-tech sectors, of railway engineering. The authors point out a goal of import substitution in these sectors, the goal is to

  10. A configurable distributed high-performance computing framework for satellite's TDI-CCD imaging simulation

    Science.gov (United States)

    Xue, Bo; Mao, Bingjing; Chen, Xiaomei; Ni, Guoqiang

    2010-11-01

    This paper presents a configurable distributed high-performance computing (HPC) framework for TDI-CCD imaging simulation. It uses the strategy pattern to accommodate multiple algorithms, helping to decrease simulation time at low expense. Imaging simulation for a TDI-CCD mounted on a satellite comprises four processes: 1) degradation caused by the atmosphere, 2) degradation caused by the optical system, 3) degradation and re-sampling caused by the TDI-CCD electronics, and 4) data integration. Processes 1) to 3) use diverse data-intensive algorithms such as FFT, convolution and Lagrange interpolation, which require powerful CPUs. Even using an Intel Xeon X5550 processor, a conventional serial processing method takes more than 30 hours for a simulation whose result image size is 1500 * 1462. A literature study found no mature distributed HPC framework in this field. We therefore developed a distributed computing framework for TDI-CCD imaging simulation, based on WCF[1], which uses a Client/Server (C/S) architecture and harnesses the free CPU resources in the LAN. The server pushes the tasks of processes 1) to 3) to this free computing capacity. Ultimately we achieved HPC at low cost. In a computing experiment with 4 symmetric nodes and 1 server, this framework reduced simulation time by about 74%. Adding more asymmetric nodes to the computing network decreased the time further. In conclusion, this framework can provide virtually unlimited computation capacity, provided that the network and the task management server can support it, and it offers a new HPC solution for TDI-CCD imaging simulation and similar applications.

  11. Framework and implementation of a continuous network-wide health monitoring system for roadways

    Science.gov (United States)

    Wang, Ming; Birken, Ralf; Shahini Shamsabadi, Salar

    2014-03-01

    According to the 2013 ASCE report card, America's infrastructure scores only a D+. There are more than four million miles of roads (grade D) in the U.S. requiring a broad range of maintenance activities. The nation faces a monumental problem of infrastructure management in the scheduling and implementation of maintenance and repair operations, and in the prioritization of expenditures within budgetary constraints. The efficient and effective performance of these operations, however, is crucial to ensuring roadway safety, preventing catastrophic failures, and promoting economic growth. There is a critical need for technology that can cost-effectively monitor the condition of a network-wide road system and provide accurate, up-to-date information for maintenance activity prioritization. The Versatile Onboard Traffic Embedded Roaming Sensors (VOTERS) project provides a framework and the sensing capability to complement periodic localized inspections with continuous network-wide health monitoring. Research focused on the development of a cost-effective, lightweight package of multi-modal sensor systems compatible with this framework. An innovative software infrastructure is created that collects, processes, and evaluates these large time-lapse multi-modal data streams. A GIS-based control center manages multiple inspection vehicles and the data for further analysis, visualization, and decision making. VOTERS' technology can monitor road conditions at both the surface and sub-surface levels while the vehicle is navigating through daily traffic going about its normal business, thereby allowing for network-wide frequent assessment of roadways. This deterioration process monitoring at unprecedented time and spatial scales provides unique experimental data that can be used to improve life-cycle cost analysis models.

  12. GraphLab: A Distributed Framework for Machine Learning in the Cloud

    CERN Document Server

    Low, Yucheng; Kyrola, Aapo; Bickson, Danny; Guestrin, Carlos

    2011-01-01

    Machine Learning (ML) techniques are indispensable in a wide range of fields. Unfortunately, the exponential increase in dataset sizes is rapidly extending the runtime of sequential algorithms and threatening to slow future progress in ML. With the promise of affordable large-scale parallel computing, Cloud systems offer a viable platform to resolve the computational challenges in ML. However, designing and implementing efficient, provably correct distributed ML algorithms is often prohibitively challenging. To enable ML researchers to easily and efficiently use parallel systems, we introduced the GraphLab abstraction which is designed to represent the computational patterns in ML algorithms while permitting efficient parallel and distributed implementations. In this paper we provide a formal description of the GraphLab parallel abstraction and present an efficient distributed implementation. We conduct a comprehensive evaluation of GraphLab on three state-of-the-art ML algorithms using real large-scale data...
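    The flavour of expressing ML computations as local updates on a graph can be conveyed with a tiny, synchronous, single-machine sketch; GraphLab's real abstraction (update functions over vertex scopes with asynchronous, consistency-controlled scheduling) is considerably richer, and the code below does not use the GraphLab API. The graph and damping factor are invented; the computation is plain PageRank.

```java
import java.util.*;

// Simplified illustration of the vertex-centric style of computation that
// abstractions like GraphLab support (NOT the GraphLab API, and synchronous
// and single-machine): each vertex repeatedly updates its value from its
// neighbours' values, here computing PageRank.
public class VertexCentricPageRank {

    public static void main(String[] args) {
        // directed edges: source -> list of targets
        Map<Integer, List<Integer>> edges = Map.of(
                0, List.of(1, 2),
                1, List.of(2),
                2, List.of(0),
                3, List.of(2));
        int n = 4;
        double damping = 0.85;

        // build reverse adjacency (who points at me) and out-degrees
        Map<Integer, List<Integer>> inbound = new HashMap<>();
        int[] outDegree = new int[n];
        edges.forEach((src, targets) -> {
            outDegree[src] = targets.size();
            targets.forEach(t -> inbound.computeIfAbsent(t, k -> new ArrayList<>()).add(src));
        });

        double[] rank = new double[n];
        Arrays.fill(rank, 1.0 / n);
        for (int iter = 0; iter < 50; iter++) {
            double[] next = new double[n];
            for (int v = 0; v < n; v++) {
                // "gather" contributions from in-neighbours, "apply" the new value
                double sum = 0.0;
                for (int u : inbound.getOrDefault(v, List.of())) sum += rank[u] / outDegree[u];
                next[v] = (1 - damping) / n + damping * sum;
            }
            rank = next;
        }
        System.out.println(Arrays.toString(rank));
    }
}
```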

  13. A FRAMEWORK OF SETTING UP GOAL CONTROL LIMITS OF TARGET COSTING FOR NON-NORMAL DISTRIBUTIONS

    Institute of Scientific and Technical Information of China (English)

    Hsin-Hung WU; Fong-Jung YU

    2007-01-01

    This study provides a framework of target costing to extend its original scope when the underlying distribution is non-normal. The new specification limits can be derived by listening to the market price through the Taguchi loss function. The new specification limits can then be linked through the non-normality-based Cpk value, along with non-normality-based X-R control charts, to derive goal control limits. Moreover, an example is provided to illustrate the usefulness of the proposed framework of target costing by relentlessly reducing cost and improving product quality to gain competitiveness in the marketplace.

  14. Enabling pathways to health equity: developing a framework for implementing social capital in practice.

    Science.gov (United States)

    Putland, Christine; Baum, Fran; Ziersch, Anna; Arthurson, Kathy; Pomagalska, Dorota

    2013-05-29

    Mounting evidence linking aspects of social capital to health and wellbeing outcomes, in particular to reducing health inequities, has led to intense interest in social capital theory within public health in recent decades. As a result, governments internationally are designing interventions to improve health and wellbeing by addressing levels of social capital in communities. The application of theory to practice is uneven, however, reflecting differing views on the pathways between social capital and health, and divergent theories about social capital itself. Unreliable implementation may restrict the potential to contribute to health equity by this means, yet to date there has been limited investigation of how the theory is interpreted at the level of policy and then translated into practice. The paper outlines a collaborative research project designed to address this knowledge deficit in order to inform more effective implementation. Undertaken in partnership with government departments, the study explored the application of social capital theory in programs designed to promote health and wellbeing in Adelaide, South Australia. It comprised three case studies of community-based practice, employing qualitative interviews and focus groups with community participants, practitioners, program managers and policy makers, to examine the ways in which the concept was interpreted and operationalized and identify the factors influencing success. These key lessons informed the development of practical resources comprising a guide for practitioners and briefing for policy makers. Overall the study showed that effective community projects can contribute to population health and wellbeing and reducing health inequities. Of specific relevance to this paper, however, is the finding that community projects rely for their effectiveness on a broader commitment expressed through policies and frameworks at the highest level of government decision making. In particular this

  15. Design and Implementation of Ceph: A Scalable Distributed File System

    Energy Technology Data Exchange (ETDEWEB)

    Weil, S A; Brandt, S A; Miller, E L; Long, D E; Maltzahn, C

    2006-04-19

    File system designers continue to look to new architectures to improve scalability. Object-based storage diverges from server-based (e.g. NFS) and SAN-based storage systems by coupling processors and memory with disk drives, delegating low-level allocation to object storage devices (OSDs) and decoupling I/O (read/write) from metadata (file open/close) operations. Even recent object-based systems inherit decades-old architectural choices going back to early UNIX file systems, however, limiting their ability to effectively scale to hundreds of petabytes. We present Ceph, a distributed file system that provides excellent performance and reliability with unprecedented scalability. Ceph maximizes the separation between data and metadata management by replacing allocation tables with a pseudo-random data distribution function (CRUSH) designed for heterogeneous and dynamic clusters of unreliable OSDs. We leverage OSD intelligence to distribute data replication, failure detection and recovery with semi-autonomous OSDs running a specialized local object storage file system (EBOFS). Finally, Ceph is built around a dynamic distributed metadata management cluster that provides extremely efficient metadata management that seamlessly adapts to a wide range of general purpose and scientific computing file system workloads. We present performance measurements under a variety of workloads that show superior I/O performance and scalable metadata management (more than a quarter million metadata ops/sec).
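    The key idea that placement is computed rather than looked up can be illustrated with ordinary rendezvous (highest-random-weight) hashing, sketched below in Java. This is not the CRUSH algorithm, which additionally handles device weights, hierarchy, and failure domains; the device names and hash mix are invented for the example.

```java
import java.util.List;
import java.util.stream.Collectors;

// Sketch of the idea behind CRUSH-style placement: the location of an object is
// computed from a deterministic pseudo-random function of (object, device) rather
// than stored in an allocation table. This uses plain rendezvous (highest-random-
// weight) hashing as a stand-in; it is NOT the actual CRUSH algorithm, which adds
// weights, hierarchy, and failure-domain awareness.
public class PseudoRandomPlacement {

    // pick the 'replicas' devices with the highest hash score for this object
    static List<String> place(String objectId, List<String> devices, int replicas) {
        return devices.stream()
                .sorted((d1, d2) -> Long.compare(score(objectId, d2), score(objectId, d1)))
                .limit(replicas)
                .collect(Collectors.toList());
    }

    // deterministic 64-bit mix of object id and device id (illustrative only)
    static long score(String objectId, String device) {
        long h = 1125899906842597L;
        for (char c : (objectId + "/" + device).toCharArray()) {
            h = 31 * h + c;
        }
        h ^= (h >>> 33);
        h *= 0xff51afd7ed558ccdL;
        h ^= (h >>> 33);
        return h;
    }

    public static void main(String[] args) {
        List<String> devices = List.of("osd.0", "osd.1", "osd.2", "osd.3", "osd.4");
        for (String obj : List.of("objectA", "objectB", "objectC")) {
            System.out.println(obj + " -> " + place(obj, devices, 3));
        }
        // Any client that knows the device list computes the same mapping,
        // so no central allocation table has to be consulted.
    }
}
```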

  16. Dist-Orc: A Rewriting-based Distributed Implementation of Orc with Formal Analysis

    Directory of Open Access Journals (Sweden)

    José Meseguer

    2010-09-01

    Full Text Available Orc is a theory of orchestration of services that allows structured programming of distributed and timed computations. Several formal semantics have been proposed for Orc, including a rewriting logic semantics developed by the authors. Orc also has a fully fledged implementation in Java with functional programming features. However, as with descriptions of most distributed languages, there exists a fairly substantial gap between Orc's formal semantics and its implementation, in that: (i) programs in Orc are not easily deployable in a distributed implementation just by using Orc's formal semantics, and (ii) they are not readily formally analyzable at the level of a distributed Orc implementation. In this work, we overcome problems (i) and (ii) for Orc. Specifically, we describe an implementation technique based on rewriting logic and Maude that narrows this gap considerably. The enabling feature of this technique is Maude's support for external objects through TCP sockets. We describe how sockets are used to implement Orc site calls and returns, and to provide real-time timing information to Orc expressions and sites. We then show how Orc programs in the resulting distributed implementation can be formally analyzed at a reasonable level of abstraction by defining an abstract model of time and the socket communication infrastructure, and discuss the assumptions under which the analysis can be deemed correct. Finally, the distributed implementation and the formal analysis methodology are illustrated with a case study.
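    The socket mechanism the paper relies on can be reduced to a very small Java example: a "site" listens on a TCP port, a site call is a line written to the socket, and the site's publication is the line read back. The sketch below is plain Java, not the Maude-based Dist-Orc implementation, and the uppercase "site" is invented.

```java
import java.io.*;
import java.net.ServerSocket;
import java.net.Socket;

// Tiny illustration of the socket mechanism for remote "site calls": a call is
// sent as a line over TCP and the site's response is the line read back. This
// is plain Java, not the Maude-based Dist-Orc implementation.
public class SiteCallOverSocket {

    public static void main(String[] args) throws Exception {
        ServerSocket server = new ServerSocket(0); // the "site" listens on a free port
        int port = server.getLocalPort();

        // the remote site: answers one call per connection
        Thread site = new Thread(() -> {
            try (Socket conn = server.accept();
                 BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()));
                 PrintWriter out = new PrintWriter(conn.getOutputStream(), true)) {
                String call = in.readLine();          // receive the site call
                out.println(call.toUpperCase());      // publish the site's response
            } catch (IOException e) {
                e.printStackTrace();
            }
        });
        site.start();

        // the orchestration side: perform the site call and wait for the return
        try (Socket socket = new Socket("localhost", port);
             PrintWriter out = new PrintWriter(socket.getOutputStream(), true);
             BufferedReader in = new BufferedReader(new InputStreamReader(socket.getInputStream()))) {
            out.println("hello site");
            System.out.println("site returned: " + in.readLine()); // HELLO SITE
        }
        site.join();
        server.close();
    }
}
```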

  17. Adaptive invasive species distribution models: A framework for modeling incipient invasions

    Science.gov (United States)

    Uden, Daniel R.; Allen, Craig R.; Angeler, David G.; Corral, Lucia; Fricke, Kent A.

    2015-01-01

    The utilization of species distribution model(s) (SDM) for approximating, explaining, and predicting changes in species’ geographic locations is increasingly promoted for proactive ecological management. Although frameworks for modeling non-invasive species distributions are relatively well developed, their counterparts for invasive species—which may not be at equilibrium within recipient environments and often exhibit rapid transformations—are lacking. Additionally, adaptive ecological management strategies address the causes and effects of biological invasions and other complex issues in social-ecological systems. We conducted a review of biological invasions, species distribution models, and adaptive practices in ecological management, and developed a framework for adaptive, niche-based, invasive species distribution model (iSDM) development and utilization. This iterative, 10-step framework promotes consistency and transparency in iSDM development, allows for changes in invasive drivers and filters, integrates mechanistic and correlative modeling techniques, balances the avoidance of type 1 and type 2 errors in predictions, encourages the linking of monitoring and management actions, and facilitates incremental improvements in models and management across space, time, and institutional boundaries. These improvements are useful for advancing coordinated invasive species modeling, management and monitoring from local scales to the regional, continental and global scales at which biological invasions occur and harm native ecosystems and economies, as well as for anticipating and responding to biological invasions under continuing global change.

  18. Numerical Implementation of Spatial Elastoplastic Damage Model of Concrete in the Framework of Isogeometric Analysis Approach

    Directory of Open Access Journals (Sweden)

    Cheng Ma

    2016-01-01

    Full Text Available This paper is a study of the numerical implementation of the spatial elastoplastic damage model of concrete by the isogeometric analysis (IGA) method from three perspectives: the geometric modeling and the numerical formulation via the IGA method, the constitutive model of concrete, and the solution algorithms for the local and global problems. The plasticity of concrete is considered on the basis of a nonassociated flow rule, where a three-parameter Barcelona yield surface and a modified Drucker-Prager plastic potential are used. The damage evolution of concrete driven by the internal variables is expressed by a piecewise function. In the study, the return-mapping algorithm and the substepping strategy are used for stress updating, and a new dissipation-based arc-length method with constraint path that considers the combined contribution of plasticity and damage to the energy dissipation is employed to trace the equilibrium path. After comparisons between simulation results and experimental data, the use of the elastoplastic damage model in the framework of the IGA approach is proven to be practical in reflecting the material properties of concrete.

  19. ICT Tools for Implementation the European Qualification Framework in the Agricultural Sector

    Directory of Open Access Journals (Sweden)

    Miklós Herdon

    2011-11-01

    Full Text Available The development of the European Qualifications Framework for Lifelong Learning (EQF) commenced in 2004 in response to requests from Member States, the social partners and other stakeholders for a common reference tool to increase the transparency of qualifications. Although qualifications within the agricultural sector in Europe share a common base, each country presents significant geographical differences that result in variable learning outcomes. The ImpAQ project (Implement Agriculture Qualification) recognizes the importance of researching different national qualifications in order to contribute to comparative analysis at national and European level. ImpAQ aims to compare the qualifications related to the agricultural sector by identifying and analyzing the main issues to be addressed, with the purpose of connecting them to the EQF and focusing on the best resolving approaches following the "best fit" criterion. Within the ImpAQ project, the consortium developed and applied ICT tools for collecting information from the countries of consortium members to build an Inventory Database of Agricultural Qualifications and an Agricultural Matrix. The matrix cells indicate which qualifications entitle the holder to work on a given product/process. The Inventory Database and the Agricultural Matrix are used for comparing qualifications. In this article we describe the concept and the ICT tools that were used in the project for filling the matrix and uploading information on Hungarian qualifications into the database.

  20. Prospects for Learning in River Management: Exploring the Initial Implementation of the Water Framework Directive in a Swedish River Basin

    Science.gov (United States)

    Lundmark, Carina; Jonsson, Gunnar

    2014-01-01

    This case study explores the initial implementation of the EU Water Framework Directive (WFD) in the Lule River basin, Sweden, examining how and to what extent administrative procedures enable learning through dialogue and stakeholder collaboration. Theorising on adaptive co-management and social learning is used to structure what is to be learnt,…

  1. Supporting the Implementation of Externally Generated Learning Outcomes and Learning-Centered Curriculum Development: An Integrated Framework

    Science.gov (United States)

    Hubball, Harry; Gold, Neil; Mighty, Joy; Britnell, Judy

    2007-01-01

    This article provides an overview of one Canadian provincially initiated curriculum reform effort in which several generic learning outcomes were established. It also presents a flexible, practical, and integrated framework for the development, implementation, and evaluation of program-level learning outcomes in undergraduate curricula contexts.…

  2. That Your Education May Be Complete: Implementing the Bishops' Curriculum Framework in Continuity with the Christian Teaching Tradition

    Science.gov (United States)

    Manning, Patrick R.

    2012-01-01

    While the U.S. Bishops' Doctrinal Elements of a Curriculum Framework provides robust content guidelines for a national high school Religion curriculum, its successful implementation will depend largely on concurrent development of, and training in, pedagogy suited to Christian education. This paper directs educators to existing catechetical…

  3. Applying the knowledge to action framework to plan a strategy for implementing breast cancer screening guidelines: an interprofessional perspective.

    Science.gov (United States)

    Munce, Sarah; Kastner, Monika; Cramm, Heidi; Lal, Shalini; Deschêne, Sarah-Maude; Auais, Mohammad; Stacey, Dawn; Brouwers, Melissa

    2013-09-01

    Integrated knowledge translation (IKT) interventions may be one solution to improving the uptake of clinical guidelines. IKT research initiatives are particularly relevant for breast cancer research and for initiatives targeting the implementation of clinical guidelines, where collaboration with an interdisciplinary team of practitioners, patients, caregivers, and policy makers is needed to produce optimum patient outcomes. The objective of this paper was to describe the process of developing an IKT strategy that could be used by guideline developers to improve the uptake of their new clinical practice guidelines on breast cancer screening. An interprofessional group of students as well as two faculty members met six times over three days at the KT Canada Summer Institute in 2011. The team used all of the phases of the action cycle in the Knowledge to Action Framework as an organizing framework. While the entire framework was used, the step involving assessing barriers to knowledge use was judged to be particularly relevant in anticipating implementation problems and being able to inform the specific KT interventions that would be appropriate to mitigate these challenges and to accomplish goals and outcomes. This activity also underscored the importance of group process and teamwork in IKT. We propose that an a priori assessment of barriers to knowledge use (i.e., level and corresponding barriers), along with the other phases of the Knowledge to Action Framework, is a strategic approach for KT strategy development, implementation, and evaluation planning and could be used in the future planning of KT strategies.

  4. The Climate Change Education Evidence Base: Lessons Learned from NOAA's Monitoring and Evaluation Framework Implementation

    Science.gov (United States)

    Baek, J.

    2012-12-01

    effort has provided some shared understanding and general guidance, there is still a lack of guidance to make decisions at any level of the community. A recent memorandum from the Office of Management and Budget provides more specific guidance around the generation and utilization of evidence. For example, the amount of funding awarded through grants should be weighted by the level of the evidence supporting a proposed project. As the field of climate change education establishes an evidence base, study designs should address a greater number of internal validity threats through comparison groups and reliable common measures. In addition, OMB invites agencies to develop systematic measurement of costs and costs per outcome. A growing evidence base, one that includes data that includes costs and even monetizes benefits, can inform decisions based on the strongest returns on investments within a portfolio. This paper will provide examples from NOAA's Monitoring and Evaluation Framework Implementation project that illustrate how NOAA is facing these challenges. This is intended to inform climate change educators, evaluators, and researchers in ways to integrate evaluation into the management of their programs while providing insight across the portfolio.

  5. Research and Implementation of Architecture for Distributed Service Performance Management

    Institute of Scientific and Technical Information of China (English)

    CHEN Jing; YIN Xiao-chuan; ZHANG Shui-ping

    2006-01-01

    An architecture is proposed for a system that discovers quantitative models for service performance management online. The system is capable of constructing the quantitative models without prior knowledge of the managed elements, and the models can be updated continuously in response to changes in provider configurations and the evolution of business demands. Because the distributed service metrics are strongly correlated with response times, linear and hyper-linear quantitative models are constructed, each using a stepwise multiple linear regression algorithm. The simulation results show the effectiveness of the quantitative model construction system and its model construction algorithms.
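    The kind of quantitative model being constructed can be illustrated with a one-variable ordinary least-squares fit of response time against a single service metric; the paper itself uses stepwise multiple linear regression, so the Java sketch below, with invented sample data, only conveys the modelling idea.

```java
// Minimal sketch of the kind of quantitative model such a system constructs:
// an ordinary least-squares fit of response time against one service metric.
// The paper uses stepwise *multiple* linear regression; this one-variable
// closed form and the sample data are only illustrative.
public class ResponseTimeModel {

    public static void main(String[] args) {
        // hypothetical observations: requests per second vs. mean response time (ms)
        double[] load     = { 50, 100, 150, 200, 250, 300 };
        double[] response = { 12, 19,  31,  38,  52,  61 };

        int n = load.length;
        double sumX = 0, sumY = 0, sumXY = 0, sumXX = 0;
        for (int i = 0; i < n; i++) {
            sumX += load[i];
            sumY += response[i];
            sumXY += load[i] * response[i];
            sumXX += load[i] * load[i];
        }
        // closed-form least-squares estimates for y = a + b*x
        double b = (n * sumXY - sumX * sumY) / (n * sumXX - sumX * sumX);
        double a = (sumY - b * sumX) / n;

        System.out.printf("response_time = %.2f + %.3f * load%n", a, b);
        System.out.printf("predicted at 275 req/s: %.1f ms%n", a + b * 275);
    }
}
```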

  6. a Hadoop-Based Distributed Framework for Efficient Managing and Processing Big Remote Sensing Images

    Science.gov (United States)

    Wang, C.; Hu, F.; Hu, X.; Zhao, S.; Wen, W.; Yang, C.

    2015-07-01

    Various sensors from airborne and satellite platforms are producing large volumes of remote sensing images for mapping, environmental monitoring, disaster management, military intelligence, and other applications. However, it is challenging to efficiently store, query and process such big data because it is both data- and computing-intensive. In this paper, a Hadoop-based framework is proposed to manage and process the big remote sensing data in a distributed and parallel manner. In particular, remote sensing data can be directly fetched from other data platforms into the Hadoop Distributed File System (HDFS). The Orfeo toolbox, a ready-to-use tool for large image processing, is integrated into MapReduce to provide a rich set of image processing operations. With the integration of HDFS, the Orfeo toolbox and MapReduce, these remote sensing images can be directly processed in parallel in a scalable computing environment. The experiment results show that the proposed framework can efficiently manage and process such big remote sensing data.

  7. A Middleware-based Distributed Management Framework for Next-generation Network

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    As the basis for next-generation application middleware, distributed computing techniques are increasingly used in network management to integrate diverse management systems. Furthermore, Web technology is widely used to provide user-friendly interaction. This article puts forward an MNMI (Middleware-based Network Management Integration) framework for managing the IPv6 test bed NSFCNET (National Natural Science Foundation of China Network). The framework uses a logically layered design, from the user interface down to managers and pseudo-agents; it applies distributed management and groups managed objects for retrieval, thereby ensuring network management performance through fast access to multiple objects.

  8. A fully distributed implementation of mean annual streamflow regional regression equations

    Science.gov (United States)

    Verdin, K.L.; Worstell, B.

    2008-01-01

    Estimates of mean annual streamflow are needed for a variety of hydrologic assessments. Away from gage locations, regional regression equations that are a function of upstream area, precipitation, and temperature are commonly used. Geographic information systems technology has facilitated their use for projects, but traditional approaches using the polygon overlay operator have been too inefficient for national scale applications. As an alternative, the Elevation Derivatives for National Applications (EDNA) database was used as a framework for a fully distributed implementation of mean annual streamflow regional regression equations. The raster "flow accumulation" operator was used to efficiently achieve spatially continuous parameterization of the equations for every 30 m grid cell of the conterminous United States (U.S.). Results were confirmed by comparing with measured flows at stations of the Hydro-Climatic Data Network, and their applications value demonstrated in the development of a national geospatial hydropower assessment. Interactive tools at the EDNA website make possible the fast and efficient query of mean annual streamflow for any location in the conterminous U.S., providing a valuable complement to other national initiatives (StreamStats and the National Hydrography Dataset Plus). ?? 2008 American Water Resources Association.
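    The flow-accumulation idea behind the fully distributed parameterization can be sketched on a toy flow network: each cell knows its downstream neighbour, upstream area and precipitation are accumulated for every cell, and a regression equation is then evaluated cell by cell. The Java example below is not the EDNA implementation, and the grid, flow directions, and regression coefficients are all invented.

```java
import java.util.Arrays;

// Toy sketch of the fully distributed idea: every cell knows its downstream
// neighbour, a flow-accumulation pass sums the contributions of all upstream
// cells, and a regression equation is then evaluated at every cell from the
// accumulated quantities. The tiny network, flow directions, and regression
// coefficients below are all invented (this is not the EDNA implementation or
// an actual regional regression equation).
public class FlowAccumulationRegression {

    public static void main(String[] args) {
        // 6 cells; downstream[i] = receiving cell, -1 = outlet
        int[] downstream = { 2, 2, 4, 4, 5, -1 };
        double[] cellAreaKm2 = { 1, 1, 1, 1, 1, 1 };
        double[] cellPrecipMm = { 900, 950, 800, 1000, 850, 780 };

        int n = downstream.length;
        double[] accArea = Arrays.copyOf(cellAreaKm2, n);
        double[] accPrecipVol = new double[n];
        for (int i = 0; i < n; i++) accPrecipVol[i] = cellPrecipMm[i] * cellAreaKm2[i];

        // push each cell's own contribution down its entire flow path
        for (int i = 0; i < n; i++) {
            for (int j = downstream[i]; j != -1; j = downstream[j]) {
                accArea[j] += cellAreaKm2[i];
                accPrecipVol[j] += cellPrecipMm[i] * cellAreaKm2[i];
            }
        }

        // hypothetical regression: Q = c * A^p1 * Pmean^p2 (coefficients invented)
        double c = 1.0e-4, p1 = 0.95, p2 = 1.1;
        for (int i = 0; i < n; i++) {
            double meanPrecip = accPrecipVol[i] / accArea[i];
            double q = c * Math.pow(accArea[i], p1) * Math.pow(meanPrecip, p2);
            System.out.printf("cell %d: upstream area %.1f km2, mean precip %.0f mm, Q = %.3f%n",
                    i, accArea[i], meanPrecip, q);
        }
    }
}
```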

  9. Measuring determinants of implementation behavior: Psychometric properties of a questionnaire based on the theoretical domains framework

    NARCIS (Netherlands)

    Huijg, J.M.; Gebhardt, W.A.; Dusseldorp, E.; Verheijden, M.W.; Zouwe, N. van der; Middelkoop, B.J.C.; Crone, M.R.

    2014-01-01

    Background: To be able to design effective strategies to improve healthcare professionals' implementation behaviors, a valid and reliable questionnaire is needed to assess potential implementation determinants. The present study describes the development of the Determinants of Implementation Behavio

  10. A Framework for Federated Two-Factor Authentication Enabling Cost-Effective Secure Access to Distributed Cyberinfrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Ezell, Matthew A [ORNL; Rogers, Gary L [University of Tennessee, Knoxville (UTK); Peterson, Gregory D. [University of Tennessee, Knoxville (UTK)

    2012-01-01

    As cyber attacks become increasingly sophisticated, the security measures used to mitigate the risks must also increase in sophistication. One time password (OTP) systems provide strong authentication because security credentials are not reusable, thus thwarting credential replay attacks. The credential changes regularly, making brute-force attacks significantly more difficult. In high performance computing, end users may require access to resources housed at several different service provider locations. The ability to share a strong token between multiple computing resources reduces cost and complexity. The National Science Foundation (NSF) Extreme Science and Engineering Discovery Environment (XSEDE) provides access to digital resources, including supercomputers, data resources, and software tools. XSEDE will offer centralized strong authentication for services amongst service providers that leverage their own user databases and security profiles. This work implements a scalable framework built on standards to provide federated secure access to distributed cyberinfrastructure.
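    One-time-password tokens of the kind described typically follow the HOTP algorithm (RFC 4226), with TOTP deriving the counter from the current time. The Java sketch below computes HOTP codes with the JDK's HMAC-SHA1 to show why each code is single-use; the shared secret is made up, and this is not the XSEDE deployment.

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

// Compact sketch of an HMAC-based one-time password (HOTP, RFC 4226), the kind
// of algorithm OTP tokens typically implement; TOTP simply derives the counter
// from the current time. This is an illustration with a made-up key, not the
// XSEDE deployment described above.
public class HotpSketch {

    static int hotp(byte[] key, long counter, int digits) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA1");
        mac.init(new SecretKeySpec(key, "HmacSHA1"));
        byte[] hash = mac.doFinal(ByteBuffer.allocate(8).putLong(counter).array());

        // dynamic truncation (RFC 4226, section 5.3)
        int offset = hash[hash.length - 1] & 0x0f;
        int binary = ((hash[offset] & 0x7f) << 24)
                   | ((hash[offset + 1] & 0xff) << 16)
                   | ((hash[offset + 2] & 0xff) << 8)
                   |  (hash[offset + 3] & 0xff);
        return binary % (int) Math.pow(10, digits);
    }

    public static void main(String[] args) throws Exception {
        byte[] sharedSecret = "demo-shared-secret".getBytes(StandardCharsets.UTF_8);
        // each counter value produces a different, non-reusable 6-digit code
        for (long counter = 0; counter < 3; counter++) {
            System.out.printf("counter %d -> %06d%n", counter, hotp(sharedSecret, counter, 6));
        }
    }
}
```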

  11. An Integrated Framework for Model-Based Distributed Diagnosis and Prognosis

    Science.gov (United States)

    2012-09-01

    Orchard & Vachtsevanos, 2009). However, the integration of diagnosis and prognosis algorithms is seldom studied. In fact, many diagnosis...tasks to be performed in a distributed way. We show how different submodels can be automatically constructed to solve the local diagnosis and prognosis...or the prognosis task. Some works have proposed the integration of both tasks within a common framework (Patrick et al., 2007; Orchard

  12. Towards Distributed Information Retrieval in the Semantic Web: Query Reformulation Using the oMAP Framework

    NARCIS (Netherlands)

    Straccia, U.; Troncy, R.

    2006-01-01

    This paper introduces a general methodology for performing distributed search in the Semantic Web. We propose to define this task as a three steps process, namely resource selection, query reformulation/ontology alignment and rank aggregation/data fusion. For the second problem, we have implemented

  13. The development of an implementation framework for service-learning during the undergraduate nursing programme in the Western Cape Province

    Directory of Open Access Journals (Sweden)

    Hester Julie

    2015-07-01

    Full Text Available Background: Service-learning (SL) is a contested field of knowledge and issues of sustainability and scholarship have been raised about it. The South African Higher Education Quality Committee (HEQC) has provided policy documents to guide higher education institutions (HEIs) in the facilitation of SL institutionalisation in their academic programmes. An implementation framework was therefore needed to institutionalise the necessary epistemological shifts advocated in the national SL policy guidelines. Objectives: This article is based on the findings of a doctoral thesis that aimed at developing an SL implementation framework for the School of Nursing (SoN) at the University of the Western Cape (UWC). Method: Mixed methods were used during the first four phases of the design and development intervention research model developed by Rothman and Thomas. Results: The SL implementation framework that was developed during Phase 3 specified the intervention elements to address the gaps that had been identified by the core findings of Phases 1 and 2. Four intervention elements were specified for the SL implementation framework. The first intervention element focused on the assessment of readiness for SL institutionalisation. The development of SL capacity and SL scholarship was regarded as the pivotal intervention element for three of the elements: the development of a contextual SL definition, an SL pedagogical model, and a monitoring and evaluation system for SL institutionalisation. Conclusion: The SL implementation framework satisfies the goals of SL institutionalisation, namely to develop a common language and a set of principles to guide practice, and to ensure the allocation of resources in order to facilitate the SL teaching methodology. The contextualised SL definition that was formulated for the SoN contributes to the SL operationalisation discourse at the HEI.

  14. The development of an implementation framework for service-learning during the undergraduate nursing programme in the Western Cape Province.

    Science.gov (United States)

    Julie, Hester

    2015-11-13

    Service-learning (SL) is a contested field of knowledge and issues of sustainability and scholarship have been raised about it. The South African Higher Education Quality Committee (HEQC) has provided policy documents to guide higher education institutions (HEIs) in the facilitation of SL institutionalisation in their academic programmes. An implementation framework was therefore needed to institutionalise the necessary epistemological shifts advocated in the national SL policy guidelines. This article is based on the findings of a doctoral thesis that aimed at developing an SL implementation framework for the School of Nursing (SoN) at the University of the Western Cape (UWC). Mixed methods were used during the first four phases of the design and development intervention research model developed by Rothman and Thomas. The SL implementation framework that was developed during Phase 3 specified the intervention elements to address the gaps that had been identified by the core findings of Phases 1 and 2. Four intervention elements were specified for the SL implementation framework. The first intervention element focused on the assessment of readiness for SL institutionalisation. The development of SL capacity and SL scholarship was regarded as the pivotal intervention element for three of the elements: the development of a contextual SL definition, an SL pedagogical model, and a monitoring and evaluation system for SL institutionalisation. The SL implementation framework satisfies the goals of SL institutionalisation, namely to develop a common language and a set of principles to guide practice, and to ensure the allocation of resources in order to facilitate the SL teaching methodology. The contextualised SL definition that was formulated for the SoN contributes to the SL operationalisation discourse at the HEI.

  15. Lilith: A software framework for the rapid development of scalable tools for distributed computing

    Energy Technology Data Exchange (ETDEWEB)

    Gentile, A.C.; Evensky, D.A.; Armstrong, R.C.

    1997-12-31

    Lilith is a general purpose tool that provides highly scalable, easy distribution of user code across a heterogeneous computing platform. By handling the details of code distribution and communication, such a framework allows for the rapid development of tools for the use and management of large distributed systems. This speed-up in development not only enables the easy creation of tools as needed but also facilitates the ultimate development of more refined, hard-coded tools. Lilith is written in Java, providing platform independence and further facilitating rapid tool development through object reuse and ease of development. The authors present the user-involved objects in the Lilith Distributed Object System and the Lilith User API. They present an example of tool development, illustrating the user calls, and present results demonstrating Lilith's scalability.

  16. Optimization by Estimation of Distribution with DEUM Framework Based on Markov Random Fields

    Institute of Scientific and Technical Information of China (English)

    Siddhartha Shakya; John McCall

    2007-01-01

    This paper presents a Markov random field (MRF) approach to estimating and sampling the probability distribution in populations of solutions. The approach is used to define a class of algorithms under the general heading distribution estimation using Markov random fields (DEUM). DEUM is a subclass of estimation of distribution algorithms (EDAs) where interaction between solution variables is represented as an undirected graph and the joint probability of a solution is factorized as a Gibbs distribution derived from the structure of the graph. The focus of this paper is on describing the three main characteristics of the DEUM framework, which distinguish it from traditional EDAs. They are: 1) use of MRF models, 2) a fitness modeling approach to estimating the parameters of the model and 3) a Monte Carlo approach to sampling from the model.
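
    For orientation, the Gibbs factorization referred to above has the standard MRF form (textbook notation, not a formula quoted from this particular paper):

        p(x) = \frac{1}{Z} \exp\!\left(-\frac{U(x)}{T}\right), \qquad U(x) = \sum_{c \in \mathcal{C}} V_c(x_c), \qquad Z = \sum_{x} \exp\!\left(-\frac{U(x)}{T}\right)

    Here U is the energy of a solution x, the V_c are potentials over the cliques C of the undirected graph, T is a temperature parameter and Z is the partition function; in the fitness-modelling step the parameters of U are estimated from the fitness of sampled solutions, and new candidate solutions are then drawn from p(x) by Monte Carlo sampling.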

  17. A Framework for Analyzing Massive Astrophysical Datasets on a Distributed Grid

    Science.gov (United States)

    Gardner, J. P.; Connolly, A.; McBride, C.

    2007-10-01

    Virtual observatories will give astronomers easy access to an unprecedented amount of data. Extracting scientific knowledge from these data will increasingly demand both efficient algorithms as well as the power of parallel computers. Such machines will range in size from small Beowulf clusters to large, massively parallel platforms (MPPs) to collections of MPPs distributed across a Grid, such as the NSF TeraGrid facility. Nearly all efficient analyses of large astronomical datasets use trees as their fundamental data structure. Writing efficient tree-based techniques, a task that is time-consuming even on single-processor computers, is exceedingly cumbersome on parallel or grid-distributed resources. We have developed a framework, Ntropy, that provides a flexible, extensible, and easy-to-use way of developing tree-based data analysis algorithms for both serial and parallel platforms. Our experience has shown that not only does our framework save development time, it also delivers an increase in serial performance. Furthermore, our framework makes it easy for an astronomer with little or no parallel programming experience to scale their application quickly to a distributed, multi-processor environment. By minimizing development time for efficient and scalable data analysis, we will enable wide-scale knowledge discovery on massive datasets.

  18. Practical Considerations regarding Implementation of Wind Power Applications into Real-Time Hardware-In-The-Loop Framework

    DEFF Research Database (Denmark)

    Petersen, Lennart; Iov, Florin

    2017-01-01

    This paper addresses the system implementation of voltage control architecture in wind power plants into a Real-Time Hardware-In-The-Loop framework. The increasing amount of wind power penetration into the power systems has engaged the wind power plants to take over the responsibility for adequate...... controls is reproduced in continuous-time domain using Laplace transform, while in practical implementation digital control systems are employed. The scope of this paper is to elaborate on the practical implementation of the voltage control architecture into a Real-Time Hardware-In-The-Loop framework......, where the focus is laid on the model development in a real-time simulator. It makes it possible to verify the functionality of the developed controls, which is one of the research priorities due to the increased complexity of large wind power plants requiring a high level of communication between plant control

  19. Developing a pedagogical framework for the design and the implementation of e-portfolios in educational practice

    Directory of Open Access Journals (Sweden)

    Athanassios Jimoyiannis

    2012-01-01

    Full Text Available A theoretical framework for designing, implementing and researching students’ engagement, learning, and personal development in e-portfolios is described in this article. After providing an overview of the research on e-portfolios in education, the paper analyses the theoretical foundations of e-portfolio learning. Following this, it proposes a conceptual and organizational framework for teachers and instructors (a) to conceptualize principles of student motivation, self-directed learning and reflection, and (b) to implement effective e-portfolio learning initiatives at secondary and higher education, and in teacher professional development. Finally, the article presents representative case studies and good practice examples regarding the implementation of e-portfolio initiatives using different tools in various educational contexts and programs.

  20. A Parallel and Distributed Surrogate Model Implementation for Computational Steering

    KAUST Repository

    Butnaru, Daniel

    2012-06-01

    Understanding the influence of multiple parameters in a complex simulation setting is a difficult task. In the ideal case, the scientist can freely steer such a simulation and is immediately presented with the results for a certain configuration of the input parameters. Such an exploration process is however not possible if the simulation is computationally too expensive. For these cases we present in this paper a scalable computational steering approach utilizing a fast surrogate model as substitute for the time-consuming simulation. The surrogate model we propose is based on the sparse grid technique, and we identify the main computational tasks associated with its evaluation and its extension. We further show how distributed data management combined with the specific use of accelerators allows us to approximate and deliver simulation results to a high-resolution visualization system in real-time. This significantly enhances the steering workflow and facilitates the interactive exploration of large datasets. © 2012 IEEE.
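
    The division of labour described above can be sketched as follows (hypothetical class names, with a trivial nearest-neighbour lookup standing in for the authors' sparse-grid surrogate; only the overall steering pattern is illustrated):

      import java.util.HashMap;
      import java.util.Map;

      // Generic stand-in for a computationally expensive simulation.
      interface Simulation {
          double run(double[] params);
      }

      // Cheap surrogate: a naive nearest-neighbour cache, NOT a sparse grid.
      class NearestNeighbourSurrogate {
          private final Map<double[], Double> samples = new HashMap<>();

          void addSample(double[] params, double value) {
              samples.put(params.clone(), value);
          }

          // Approximate the simulation result from the closest stored sample.
          double evaluate(double[] params) {
              double best = Double.NaN, bestDist = Double.MAX_VALUE;
              for (Map.Entry<double[], Double> e : samples.entrySet()) {
                  double d = 0;
                  for (int i = 0; i < params.length; i++) {
                      double diff = params[i] - e.getKey()[i];
                      d += diff * diff;
                  }
                  if (d < bestDist) { bestDist = d; best = e.getValue(); }
              }
              return best;
          }
      }

      public class SteeringLoop {
          public static void main(String[] args) {
              Simulation sim = p -> Math.sin(p[0]) * Math.cos(p[1]); // toy "simulation"
              NearestNeighbourSurrogate surrogate = new NearestNeighbourSurrogate();

              // Extension phase: run the real simulation at a few parameter points.
              for (double x = 0; x <= 1.0; x += 0.25)
                  for (double y = 0; y <= 1.0; y += 0.25)
                      surrogate.addSample(new double[]{x, y}, sim.run(new double[]{x, y}));

              // Interactive steering phase: only the cheap surrogate is queried.
              double[] query = {0.4, 0.6};
              System.out.println("surrogate estimate: " + surrogate.evaluate(query));
          }
      }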

  1. Distributed Geant4 simulation in medical and space science applications using DIANE framework and the GRID

    CERN Document Server

    Moscicki, J T; Mantero, A; Pia, M G

    2003-01-01

    Distributed computing is one of the most important trends in IT, which has recently gained significance for large-scale scientific applications. Distributed analysis environment (DIANE) is an R&D study, focusing on semi-interactive parallel and remote data analysis and simulation, which has been conducted at CERN. DIANE provides the necessary software infrastructure for parallel scientific applications in the master-worker model. Advanced error recovery policies, automatic book-keeping of distributed jobs and on-line monitoring and control tools are provided. DIANE makes transparent use of a number of different middleware implementations such as load balancing service (LSF, PBS, GRID Resource Broker, Condor) and security service (GSI, Kerberos, openssh). A number of distributed Geant 4 simulations have been deployed and tested, ranging from interactive radiotherapy treatment planning using dedicated clusters in hospitals, to globally-distributed simulations of astrophysics experiments using the European data g...

  2. Distributed Leadership and Organizational Change: Implementation of a Teaching Performance Measure

    Science.gov (United States)

    Sloan, Tine

    2013-01-01

    This article explores leadership practice and change as evidenced in multiple data sources gathered during a self-study implementation of a teaching performance assessment. It offers promising models of distributed leadership and organizational change that can inform future program implementers and the field in general. Our experiences suggest…

  4. Distribution theory approach to implementing directional acoustic sensors.

    Science.gov (United States)

    Schmidlin, Dean J

    2010-01-01

    The objective of directional acoustic sensors is to provide high directivity while occupying a small amount of space. An idealized point sensor achieves this objective from a knowledge of the spatial partial derivatives of acoustic pressure at a point in space. Direct measurement of these derivatives is difficult in practice. Consequently, it is expedient to come up with indirect methods. The use of pressure sensors to construct finite-difference approximations is an example of such a method. This paper utilizes the theory of distributions to derive another indirect method for estimating the various spatial partial derivatives of the pressure. This alternate method is then used to construct a multichannel filter which processes the acoustic pressure by means of three-dimensional integral transforms throughout a 6ε-length cube centered at the origin. The output of the multichannel filter is a spatially and temporally filtered version of the pressure at the origin. The temporal filter is a lowpass Gaussian filter whose bandwidth is inversely proportional to ε. Finally, the lattice method for numerical multiple integration is utilized to develop a discrete-spatial version of the multichannel filter.
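
    For comparison, the finite-difference route mentioned above estimates a first spatial derivative of the pressure from two closely spaced pressure sensors via the standard central-difference formula

        \frac{\partial p}{\partial x}(x_0, t) \approx \frac{p(x_0 + \epsilon, t) - p(x_0 - \epsilon, t)}{2\epsilon},

    whose error is of order \epsilon^2; the distribution-theory method replaces such pointwise differences with integral transforms taken over the 6ε-length cube.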

  5. The implementation of a global fund grant in Lesotho: applying a framework on knowledge absorptive capacity.

    Science.gov (United States)

    Biesma, Regien; Makoa, Elsie; Mpemi, Regina; Tsekoa, Lineo; Odonkor, Philip; Brugha, Ruairi

    2012-02-01

    One of the biggest challenges in scaling up health interventions in sub-Saharan Africa for government recipients is to effectively manage the rapid influx of aid from different donors, each with its own requirements and conditions. However, there is little empirical evidence on how governments absorb knowledge from new donors in order to satisfy their requirements. This case study applies Cuellar and Gallivan's (2006) framework on knowledge absorptive capacity (AC) to illustrate how recipient government organisations in Lesotho identified, assimilated and utilised knowledge on how to meet the disbursement and reporting requirements of Lesotho's Round 5 grant from the Global Fund to Fight AIDS, TB and Malaria (Global Fund). In-depth topic guided interviews with 22 respondents and document reviews were conducted between July 2008 and February 2009. Analysis focused on six organisational determinants that affect an organisation's absorptive capacity: prior-related knowledge, combinative capabilities, motivation, organisational structure, cultural match, and communication channels. Absorptive capacity was mostly evident at the level of the Principal Recipient, the Ministry of Finance, who established a new organisational unit to meet the requirements of Global Fund Grants, while the level of AC was less advanced among the Ministry of Health (Sub-Recipient) and district level implementers. Recipient organisations can increase their absorptive capacity, not only through prior knowledge of donor requirements, but also by deliberately changing their organisational form and through combinative capabilities. The study also revealed how vulnerable African governments are to loss of staff capacity. The application of organisational theory to analyse the interactions of donor agencies with public and non-public country stakeholders illustrates the complexity of the environment that aid recipient governments have to manage. Copyright © 2011 Elsevier Ltd. All rights reserved.

  6. Comparison of Two Spatial Optimization Techniques: A Framework to Solve Multiobjective Land Use Distribution Problems

    Science.gov (United States)

    Meyer, Burghard Christian; Lescot, Jean-Marie; Laplana, Ramon

    2009-02-01

    Two spatial optimization approaches, developed from the opposing perspectives of ecological economics and landscape planning and aimed at the definition of new distributions of farming systems and of land use elements, are compared and integrated into a general framework. The first approach, applied to a small river catchment in southwestern France, uses SWAT (Soil and Water Assessment Tool) and a weighted goal programming model in combination with a geographical information system (GIS) for the determination of optimal farming system patterns, based on selected objective functions to minimize deviations from the goals of reducing nitrogen and maintaining income. The second approach, demonstrated in a suburban landscape near Leipzig, Germany, defines a GIS-based predictive habitat model for the search of unfragmented regions suitable for hare populations ( Lepus europaeus), followed by compromise optimization with the aim of planning a new habitat structure distribution for the hare. The multifunctional problem is solved by the integration of the three landscape functions (“production of cereals,” “resistance to soil erosion by water,” and “landscape water retention”). Through the comparison, we propose a framework for the definition of optimal land use patterns based on optimization techniques. The framework includes the main aspects to solve land use distribution problems with the aim of finding the optimal or best land use decisions. It integrates indicators, goals of spatial developments and stakeholders, including weighting, and model tools for the prediction of objective functions and risk assessments. Methodological limits of the uncertainty of data and model outcomes are stressed. The framework clarifies the use of optimization techniques in spatial planning.
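
    For readers unfamiliar with the technique, a weighted goal programming model has the generic form (a textbook formulation, not the specific SWAT-coupled model of this study):

        \min \sum_i \left( w_i^{+} d_i^{+} + w_i^{-} d_i^{-} \right) \quad \text{s.t.} \quad f_i(x) + d_i^{-} - d_i^{+} = g_i, \qquad d_i^{+}, d_i^{-} \ge 0, \qquad x \in X,

    where the g_i are the target levels (here, for example, nitrogen reduction and income maintenance), f_i(x) the levels achieved under land-use decision x, and the deviations d_i^{+}, d_i^{-} measure the weighted over- and under-achievement of each goal.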

  8. Measuring implementation behaviour of menu guidelines in the childcare setting: confirmatory factor analysis of a theoretical domains framework questionnaire (TDFQ).

    Science.gov (United States)

    Seward, Kirsty; Wolfenden, Luke; Wiggers, John; Finch, Meghan; Wyse, Rebecca; Oldmeadow, Christopher; Presseau, Justin; Clinton-McHarg, Tara; Yoong, Sze Lin

    2017-04-04

    While there are a number of frameworks which focus on supporting the implementation of evidence-based approaches, few psychometrically valid measures exist to assess constructs within these frameworks. This study aimed to develop and psychometrically assess a scale measuring each domain of the Theoretical Domains Framework for use in assessing the implementation of dietary guidelines within a non-health care setting (childcare services). A 75-item, 14-domain Theoretical Domains Framework Questionnaire (TDFQ) was developed and administered via telephone interview to 202 centre-based childcare service cooks who had a role in planning the service menu. Confirmatory factor analysis (CFA) was undertaken to assess the reliability, discriminant validity and goodness of fit of the 14-domain Theoretical Domains Framework measure. For the CFA, five iterative processes of adjustment were undertaken in which 14 items were removed, resulting in a final measure consisting of 14 domains and 61 items. For the final measure: the Chi-Square goodness of fit statistic was 3447.19; the Standardized Root Mean Square Residual (SRMR) was 0.070; the Root Mean Square Error of Approximation (RMSEA) was 0.072; and the Comparative Fit Index (CFI) had a value of 0.78. While only one of the three indices supports goodness of fit of the measurement model tested, the 14-domain model with 61 items showed good discriminant validity and internally consistent items. Future research should aim to assess the psychometric properties of the developed TDFQ in other community-based settings.

  9. Implementation of a Systematic Accountability Framework in 2014 to Improve the Performance of the Nigerian Polio Program

    Science.gov (United States)

    Tegegne, Sisay G.; MKanda, Pascal; Yehualashet, Yared G.; Erbeto, Tesfaye B.; Touray, Kebba; Nsubuga, Peter; Banda, Richard; Vaz, Rui G.

    2016-01-01

    Background. An accountability framework is a central feature of managing human and financial resources. One of its primary goals is to improve program performance through close monitoring of selected priority activities. The principal objective of this study was to determine the contribution of a systematic accountability framework to improving the performance of the World Health Organization (WHO)–Nigeria polio program staff, as well as the program itself. Methods. The effect of implementation of the accountability framework was evaluated using data on administrative actions and select process indicators associated with acute flaccid paralysis (AFP) surveillance, routine immunization, and polio supplemental immunization activities. Data were collected in 2014 during supportive supervision, using Magpi software (software for collecting data using mobile phones). A total of 2500 staff were studied. Results. Data on administrative actions and process indicators from quarters 2–4 in 2014 were compared. With respect to administrative actions, 1631 personnel (74%) received positive feedback (written or verbal commendation) in quarter 4 through the accountability framework, compared with 1569 (73%) and 1152 (61%) during quarters 3 and 2, respectively. These findings accorded with data on process indicators associated with AFP surveillance and routine immunization, showing statistically significant improvements in staff performance at the end of quarter 4, compared with other quarters. Conclusions. Improvements in staff performance and process indicators were observed for the WHO-Nigeria polio program after implementation of a systematic accountability framework. PMID:26823334

  10. Introducing an accountability framework for polio eradication in Ethiopia: results from the first year of implementation 2014-2015.

    Science.gov (United States)

    Kassahun, Aron; Braka, Fiona; Gallagher, Kathleen; Gebriel, Aregai Wolde; Nsubuga, Peter; M'pele-Kilebou, Pierre

    2017-01-01

    The World Health Organization (WHO), Ethiopia country office, introduced an accountability framework into its Polio Eradication Program in 2014 with the aim of improving the program's performance. Our study aims to evaluate staff performance and key program indicators following the introduction of the accountability framework. The impact of the WHO accountability framework was reviewed after its first year of implementation, from June 2014 to June 2015. We analyzed selected program and staff performance indicators associated with acute flaccid paralysis (AFP) surveillance from a database available at WHO. Data on managerial actions taken were also reviewed. Performance of a total of 38 staff was evaluated during our review. Our review of results for the first four quarters of implementation of the polio eradication accountability framework showed improvement both at the program and individual level when compared with the previous year. Managerial actions taken during the study period based on the results from the monitoring tool included eleven written acknowledgments, six discussions regarding performance improvement, six rotations of staff, four written first-warning letters and nine non-renewals of contracts. The introduction of the accountability framework resulted in improvement in staff performance and overall program indicators for AFP surveillance.

  11. Using a framework to implement large-scale innovation in medical education with the intent of achieving sustainability.

    Science.gov (United States)

    Hudson, Judith N; Farmer, Elizabeth A; Weston, Kathryn M; Bushnell, John A

    2015-01-16

    Particularly when undertaken on a large scale, implementing innovation in higher education poses many challenges. Sustaining the innovation requires early adoption of a coherent implementation strategy. Using an example from clinical education, this article describes a process used to implement a large-scale innovation with the intent of achieving sustainability. Desire to improve the effectiveness of undergraduate medical education has led to growing support for a longitudinal integrated clerkship (LIC) model. This involves a move away from the traditional clerkship of 'block rotations' with frequent changes in disciplines, to a focus upon clerkships with longer duration and opportunity for students to build sustained relationships with supervisors, mentors, colleagues and patients. A growing number of medical schools have adopted the LIC model for a small percentage of their students. At a time when increasing medical school numbers and class sizes are leading to competition for clinical supervisors, it is, however, a daunting challenge to provide a longitudinal clerkship for an entire medical school class. This challenge is presented to illustrate the strategy used to implement sustainable large-scale innovation. A strategy to implement and build a sustainable longitudinal integrated community-based clerkship experience for all students was derived from a framework arising from Roberto and Levesque's research in business. The framework's four core processes (chartering, learning, mobilising and realigning) provided guidance in preparing and rolling out the 'whole of class' innovation. Roberto and Levesque's framework proved useful for identifying the foundations of the implementation strategy, with special emphasis on the relationship building required to implement such an ambitious initiative. Although this was an innovation in a new school, it required change within the school, the wider university and the health community. Challenges encountered included some resistance to

  12. A Conceptual Framework for Organizational Readiness to Implement Nutrition and Physical Activity Programs in Early Childhood Education Settings

    Science.gov (United States)

    Upadhyaya, Mudita; Schober, Daniel J.; Byrd-Williams, Courtney

    2014-01-01

    Across multiple sectors, organizational readiness predicts the success of program implementation. However, the factors influencing the readiness of early childhood education (ECE) organizations for implementation of new nutrition and physical activity programs are poorly understood. This study presents a new conceptual framework to measure organizational readiness to implement nutrition and physical activity programs in ECE centers serving children aged 0 to 5 years. The framework was validated for consensus on relevance and generalizability by conducting focus groups; the participants were managers (16 directors and 2 assistant directors) of ECE centers. The framework theorizes that it is necessary to have “collective readiness,” which takes into account such factors as resources, organizational operations, work culture, and the collective attitudes, motivation, beliefs, and intentions of ECE staff. Results of the focus groups demonstrated consensus on the relevance of proposed constructs across ECE settings. Including readiness measures during program planning and evaluation could inform implementation of ECE programs targeting nutrition and physical activity behaviors. PMID:25357258

  13. Using an Implementation Research Framework to Identify Potential Facilitators and Barriers of an Intervention to Increase HPV Vaccine Uptake.

    Science.gov (United States)

    Selove, Rebecca; Foster, Maya; Mack, Raquel; Sanderson, Maureen; Hull, Pamela C

    Although the incidence of cervical cancer has been decreasing in the United States over the last decade, Hispanic and African American women have substantially higher rates than Caucasian women. The human papillomavirus (HPV) is a necessary, although insufficient, cause of cervical cancer. In the United States in 2013, only 37.6% of girls 13 to 17 years of age received the recommended 3 doses of a vaccine that is almost 100% efficacious for preventing infection with viruses that are responsible for 70% of cervical cancers. Implementation research has been underutilized in interventions for increasing vaccine uptake. The Consolidated Framework for Implementation Research (CFIR), an approach for designing effective implementation strategies, integrates 5 domains that may include barriers and facilitators of HPV vaccination. These include the innovative practice (Intervention), communities where youth and parents live (Outer Setting), agencies offering vaccination (Inner Setting), health care staff (Providers), and planned execution and evaluation of intervention delivery (Implementation Process). Secondary qualitative analysis of transcripts of interviews with 30 community health care providers was conducted using the CFIR to code potential barriers and facilitators of HPV vaccination implementation. All CFIR domains except Implementation Process were well represented in providers' statements about challenges and supports for HPV vaccination. A comprehensive implementation framework for promoting HPV vaccination may increase vaccination rates in ethnically diverse communities. This study suggests that the CFIR can be used to guide clinicians in planning implementation of new approaches to increasing HPV vaccine uptake in their settings. Further research is needed to determine whether identifying implementation barriers and facilitators in all 5 CFIR domains as part of developing an intervention contributes to improved HPV vaccination rates.

  14. Implementing and Investigating Distributed Leadership in a National University Network--SaMnet

    Science.gov (United States)

    Sharma, Manjula D.; Rifkin, Will; Tzioumis, Vicky; Hill, Matthew; Johnson, Elizabeth; Varsavsky, Cristina; Jones, Susan; Beames, Stephanie; Crampton, Andrea; Zadnik, Marjan; Pyke, Simon

    2017-01-01

    The literature suggests that collaborative approaches to leadership, such as distributed leadership, are essential for supporting educational innovators in leading change in teaching in universities. This paper briefly describes the array of activities, processes and resources to support distributed leadership in the implementation of a network,…

  16. Depth distribution of Moho and tectonic framework in eastern Asian continent and its adjacent ocean areas

    Institute of Scientific and Technical Information of China (English)

    TENG; Jiwen; (滕吉文); ZENG; Rongsheng; (曾融生); YAN; Yafen; (闫雅芬); ZHANG; Hui; (张慧)

    2003-01-01

    With the results of interpretation of seismic sounding profiles acquired in the past 30 years in the continent of China and its adjacent countries and ocean regions, such as Russia, Kazakhstan, Japan, India, Pakistan, the Philippine ocean basin, and the Pacific and Indian Oceans, we compiled a 2D Moho distribution map for the continent and its adjacent areas of eastern Asia. From the features of depth distribution and undulation of the Moho, it is suggested that the eastern Asian region can be divided into 18 gradient belts with different sizes, 18 crustal blocks, and 20 sediment basins and depression zones. The depth of the Moho varies smoothly in each block, while the boundary (separating different blocks) delineates the abrupt variation of Moho depth. Then, some subjects, such as orogens and sediment basins, fault systems and rifts, plate boundaries, ocean-continent coupling and the tectonic framework, are discussed based on the distribution gradient belts and block partition features of Moho depth in eastern Asia and its adjacent regions.

  17. SCondi: A Smart Context Distribution Framework Based on a Messaging Service for the Internet of Things

    Directory of Open Access Journals (Sweden)

    Jongmoon Park

    2014-01-01

    Full Text Available When developing IoT (Internet of Things) applications, context distribution is a key feature to support effective delivery of related contextual data obtained from things to all interested entities. With the advent of the IoT era, billions of devices can generate huge amounts of data that might be used in IoT applications. In this paper, we present a context distribution framework named SCondi utilizing the messaging service which supports MQTT—an OASIS standard IoT messaging protocol. SCondi provides the notion of a context channel as a core feature to support an efficient and reliable mechanism for distributing large volumes of context information in the IoT environment. The context channel provides a pluggable filter mechanism that supports effective extraction, tailoring, authentication, and security of information.
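
    A minimal, framework-agnostic sketch of the context-channel-with-pluggable-filters idea is given below (hypothetical interfaces; it reproduces neither SCondi's actual API nor its MQTT transport):

      import java.util.ArrayList;
      import java.util.List;
      import java.util.function.Consumer;

      // A piece of contextual data produced by a thing/device.
      record ContextEvent(String topic, String payload) {}

      // Pluggable filter: may tailor, authenticate or drop an event (null = drop).
      interface ContextFilter {
          ContextEvent apply(ContextEvent event);
      }

      // Context channel: applies the filter chain, then notifies subscribers.
      class ContextChannel {
          private final List<ContextFilter> filters = new ArrayList<>();
          private final List<Consumer<ContextEvent>> subscribers = new ArrayList<>();

          void addFilter(ContextFilter f) { filters.add(f); }
          void subscribe(Consumer<ContextEvent> s) { subscribers.add(s); }

          void publish(ContextEvent event) {
              for (ContextFilter f : filters) {
                  event = f.apply(event);
                  if (event == null) return;          // filtered out
              }
              for (Consumer<ContextEvent> s : subscribers) s.accept(event);
          }
      }

      public class ContextChannelDemo {
          public static void main(String[] args) {
              ContextChannel channel = new ContextChannel();
              // Keep only temperature readings; a real deployment would also
              // authenticate and secure the payload at this point.
              channel.addFilter(e -> e.topic().startsWith("sensor/temp") ? e : null);
              channel.subscribe(e -> System.out.println(e.topic() + " -> " + e.payload()));

              channel.publish(new ContextEvent("sensor/temp/room1", "21.5"));
              channel.publish(new ContextEvent("sensor/humidity/room1", "40"));  // dropped
          }
      }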

  18. Anticipating species distributions: Handling sampling effort bias under a Bayesian framework.

    Science.gov (United States)

    Rocchini, Duccio; Garzon-Lopez, Carol X; Marcantonio, Matteo; Amici, Valerio; Bacaro, Giovanni; Bastin, Lucy; Brummitt, Neil; Chiarucci, Alessandro; Foody, Giles M; Hauffe, Heidi C; He, Kate S; Ricotta, Carlo; Rizzoli, Annapaola; Rosà, Roberto

    2017-04-15

    Anticipating species distributions in space and time is necessary for effective biodiversity conservation and for prioritising management interventions. This is especially true when considering invasive species. In such a case, anticipating their spread is important to effectively plan management actions. However, considering uncertainty in the output of species distribution models is critical for correctly interpreting results and avoiding inappropriate decision-making. In particular, when dealing with species inventories, the bias resulting from sampling effort may lead to an over- or under-estimation of the local density of occurrences of a species. In this paper we propose an innovative method to i) map sampling effort bias using cartogram models and ii) explicitly consider such uncertainty in the modeling procedure under a Bayesian framework, which allows the integration of multilevel input data with prior information to improve the anticipation of species distributions.
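
    The Bayesian update underlying this procedure is the usual one (stated generically; the paper's contribution lies in how the cartogram-based map of sampling effort enters the model, not in this formula):

        p(\theta \mid y) = \frac{p(y \mid \theta)\, p(\theta)}{p(y)} \propto p(y \mid \theta)\, p(\theta),

    with y the (effort-biased) occurrence records, \theta the parameters governing the species distribution, and p(\theta) the prior through which external information, such as the mapped sampling effort, can be injected.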

  19. Technical and Economic Assessment of the Implementation of Measures for Reducing Energy Losses in Distribution Systems

    Science.gov (United States)

    Aguila, Alexander; Wilson, Jorge

    2017-07-01

    This paper develops a methodology to assess a group of measures for electrical improvements in distribution systems, combining technical and economic criteria. In order to address the problem of energy losses in distribution systems, a technical and economic analysis was performed based on a mathematical model that establishes a direct relationship between the energy saved through minimized losses and the costs of implementing the proposed measures. The paper analyses the feasibility of reducing energy losses in distribution systems by replacing existing network conductors with larger cross-section conductors and by raising the distribution voltage to higher levels. The methodology provides an efficient mathematical tool for analysing the feasibility of improvement projects on the basis of their costs, which is very useful for distribution companies and can serve as a starting point for the analysis of this type of project in distribution systems.
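
    As a generic illustration of the kind of relationship such a model captures (not the paper's actual formulation), the resistive loss in a three-phase feeder and a simple payback time for a loss-reduction investment can be written as

        E_{\text{loss}} = 3 I^{2} R\, t, \qquad R = \frac{\rho\, l}{A}, \qquad \Delta E = 3 I^{2} (R_{1} - R_{2})\, t, \qquad T_{\text{payback}} \approx \frac{C_{\text{inv}}}{c_{e}\, \Delta E},

    where c_e is the energy price. A larger conductor cross-section A lowers R, and a higher distribution voltage lowers the current I drawn for the same delivered power (I = S / (\sqrt{3} V)); both measures therefore enter the analysis through the savings term \Delta E, which is weighed against the investment cost C_{\text{inv}}.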

  20. Developing and Implementing a Framework of Participatory Simulation for Mobile Learning Using Scaffolding

    Science.gov (United States)

    Yin, Chengjiu; Song, Yanjie; Tabata, Yoshiyuki; Ogata, Hiroaki; Hwang, Gwo-Jen

    2013-01-01

    This paper proposes a conceptual framework, scaffolding participatory simulation for mobile learning (SPSML), used on mobile devices for helping students learn conceptual knowledge in the classroom. As the pedagogical design, the framework adopts an experiential learning model, which consists of five sequential but cyclic steps: the initial stage,…

  2. An optimized framework for degree distribution in LT codes based on power law

    Institute of Scientific and Technical Information of China (English)

    Asim; Muhammad; Choi; GoangSeog

    2013-01-01

    LT codes are a practical realization of digital fountain codes, which provide the concept of rateless coding. In this scheme, encoded symbols can be generated without limit from k information symbols, and the decoder uses only (1+α)k encoded symbols to recover the original information. The degree distribution function in LT codes is used to generate a random graph, also referred to as a Tanner graph. The structure of the Tanner graph determines the computational complexity and overhead of LT codes. Intuitively, a well-designed degree distribution leads to an efficient implementation of LT codes. The degree distribution function is studied as a power-law function, and LT codes are classified into two categories: SFLT and RLT codes. Also, two different degree distributions are proposed and analyzed for SFLT codes, which guarantee optimal performance in terms of computational complexity and overhead.
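
    For context, the classical baseline for LT-code degree design is the ideal soliton distribution, ρ(1) = 1/k and ρ(d) = 1/(d(d-1)) for d = 2..k. The sketch below samples encoding degrees from it by inverse-CDF lookup; it does not implement the power-law distributions proposed in the paper.

      import java.util.Random;

      // Samples encoded-symbol degrees from the ideal soliton distribution,
      // the classical starting point for LT-code degree design.
      public class IdealSolitonSampler {
          private final double[] cdf;   // cumulative distribution over degrees 1..k
          private final Random rng = new Random();

          public IdealSolitonSampler(int k) {
              cdf = new double[k + 1];            // index 0 unused
              double cumulative = 0.0;
              for (int d = 1; d <= k; d++) {
                  double p = (d == 1) ? 1.0 / k : 1.0 / ((double) d * (d - 1));
                  cumulative += p;
                  cdf[d] = cumulative;            // cdf[k] == 1.0 up to rounding
              }
          }

          // Inverse-CDF sampling: smallest degree d with cdf[d] >= u.
          public int sampleDegree() {
              double u = rng.nextDouble();
              for (int d = 1; d < cdf.length; d++) {
                  if (u <= cdf[d]) return d;
              }
              return cdf.length - 1;
          }

          public static void main(String[] args) {
              IdealSolitonSampler sampler = new IdealSolitonSampler(1000);
              // Each encoded symbol would XOR 'degree' randomly chosen input symbols.
              for (int i = 0; i < 5; i++) {
                  System.out.println("degree = " + sampler.sampleDegree());
              }
          }
      }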

  3. TMD PDFs. A Monte Carlo implementation for the sea quark distribution

    Energy Technology Data Exchange (ETDEWEB)

    Hautmann, F. [Oxford Univ. (United Kingdom). Dept. of Theoretical Physics; Hentschinski, M. [Univ. Autonoma de Madrid (Spain). Dept. Fisica Teorica UAM/CSIC; Jung, H. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany); European Organization for Nuclear Research (CERN), Geneva (Switzerland)

    2012-05-15

    This article gives an introduction to transverse momentum dependent (TMD) parton distribution functions and their use in shower Monte Carlo event generators for high-energy hadron collisions, and describes recent progress in the treatment of sea quark effects within a TMD parton-shower framework.

  5. TMD PDFs: a Monte Carlo implementation for the sea quark distribution

    CERN Document Server

    Hautmann, F; Jung, H

    2012-01-01

    This article gives an introduction to transverse momentum dependent (TMD) parton distribution functions and their use in shower Monte Carlo event generators for high-energy hadron collisions, and describes recent progress in the treatment of sea quark effects within a TMD parton-shower framework.

  6. The Development Of A Theoretical Lean Culture Causal Framework To Support The Effective Implementation Of Lean In Automotive Component Manufacturers

    Directory of Open Access Journals (Sweden)

    Van der Merwe, Karl Robert

    2014-05-01

    Full Text Available Although it is generally accepted that lean manufacturing improves operational performance, many organisations are struggling to adapt to the lean philosophy. The purpose of this study is to contribute to a more effective strategy for implementing the lean manufacturing improvement philosophy. The study sets out both to integrate well-researched findings and theories related to generic organisational culture with more recent research and experience related to lean culture, and to examine the role that culture plays in the effective implementation of lean manufacturing principles and techniques. The ultimate aim of this exercise is to develop a theoretical lean culture causal framework.

  7. A simplistic pedagogical formulation of a thermal speed distribution using a relativistic framework

    Indian Academy of Sciences (India)

    Ashmeet Singh

    2013-07-01

    A novel pedagogical technique is presented that can be used in the undergraduate (UG) class to formulate a relativistically extended kinetic theory of gases and thermal speed distribution, while assuming the basic thermal symmetry arguments of the famous Maxwell–Boltzmann distribution as presented at the UG level. The adopted framework can be used by students to understand the physics of a thermally governed system at high temperature and speeds, without having to indulge in high-level tensor-based mathematics, as has been done by the previous works on the subject. Our approach, a logical extension of that proposed by Maxwell, will first recapitulate what is taught and known in the UG class and then present a methodology inspired by the Maxwell–Boltzmann framework that will help students to understand and derive the physics of relativistic thermal systems. The methodology uses simple tools well known to undergraduates and involves a component of computational techniques that can be used to involve students in this exercise. We have tried to place the current work in a larger perspective with regard to earlier work and to emphasize its simplicity and accessibility to students. Towards the end, interesting implications of the relativistically extended distribution are presented and compared with the Maxwell–Boltzmann results at various temperatures.
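
    For reference, the non-relativistic Maxwell–Boltzmann speed distribution that serves as the undergraduate starting point is

        f(v) = 4\pi \left( \frac{m}{2\pi k_B T} \right)^{3/2} v^{2} \exp\!\left( -\frac{m v^{2}}{2 k_B T} \right), \qquad \int_{0}^{\infty} f(v)\, dv = 1,

    which assigns non-zero probability to arbitrarily large speeds; the relativistically extended distribution discussed here must instead confine all probability weight to speeds below c.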

  8. Smart-Energy Operating-System - A Framework for Implementing Flexible Electric Energy Systems in Smart Cities

    DEFF Research Database (Denmark)

    Madsen, Henrik; Parvizi, Jacopo; Bacher, Peder

    The Smart-Energy Operating-System (SE-OS) framework has been developed within the CITIES research project (www.smart-cities-centre.org). This framework enables a systematic approach for implementing flexible electric energy systems in smart cities. The SE-OS methodologies are based on methods...... for data analytics, cyber physical modelling, forecasting, control, optimization, IoT, IoS, and cloud computing. The SE-OS concept has been used for enabling flexibility and demand response in smart cities in a large number of demo projects. Finally it is shown that SE-OS in combination with methods...... for energy systems (gas, thermal, power, biomass, fuel) integration can provide virtual energy storage solutions on all relevant time scales, i.e. from minutes to seasonal storage. The Smart-Energy Operating-System (SE-OS) is used to develop, implement and test solutions (layers: data, models, optimization...

  9. European Expert Consensus Paper on the implementation of Article 14 of the WHO Framework Convention on Tobacco Control.

    Science.gov (United States)

    Clancy, Luke

    2016-11-01

    On 24 November 2015, under the auspices of the European Policy Roundtable on Smoking Cessation, 15 experts on tobacco control and dependence from across the European Union, chaired by Professor Luke Clancy, met in Oslo, Norway, to discuss the implementation of the Tobacco Products Directive and the WHO Framework Convention on Tobacco Control, namely Article 14. On the occasion of the 10th anniversary of the Framework Convention on Tobacco Control, this paper reports the consensus reached by all Roundtable participants on the need to further advance the availability and access to services to support cessation of tobacco use. The implementation of services to support cessation of tobacco use in line with Article 14 can and should be significantly improved to protect the health of European citizens. The meeting was initiated and funded by Pfizer.

  10. Implementing the Gribov-Zwanziger framework in N=1 Super Yang-Mills in the Landau gauge

    CERN Document Server

    Capri, M A L; Guimaraes, M S; Justo, I F; Palhares, L F; Sorella, S P; Vercauteren, D

    2014-01-01

    The Gribov-Zwanziger framework accounting for the existence of Gribov copies is extended to N=1 Super Yang-Mills theories quantized in the Landau gauge. We show that the restriction of the domain of integration in the Euclidean functional integral to the first Gribov horizon can be implemented in a way that recovers non-perturbative features of N=1 Super Yang-Mills theories, namely: the existence of the gluino condensate as well as the vanishing of the vacuum energy.

  11. Design and implementation of a QoS measurement and monitoring framework for Diff-Serv network

    Science.gov (United States)

    Ge, Fei; Cao, Yang

    2004-04-01

    The QoS measurement and monitoring framework proposed for Diff-Serv consists of three conceptual modules: the QoS data acquisition agent, the QoS data correlating agent and the QoS data analysis agent. The Common Object Request Broker Architecture (CORBA) is used when implementing the framework. The construction of the measurement packets, the transmission rules for active measurement packets, and path measurement are studied. An abbreviated approach to QoS data correlating and analysis is introduced.

  12. Study of isospin nonconservation in the framework of spectral distribution theory

    CERN Document Server

    Kar, Kamales

    2014-01-01

    The observed isospin-symmetry breaking in light nuclei is caused not only by the Coulomb interaction but by the isovector one- and two-body plus isotensor two-body nuclear interactions as well. Spectral distribution theory, which treats nuclear spectroscopy and other structural properties in a statistical framework, was earlier applied to isospin-conserving Hamiltonians only. In this paper we extend that to include the nuclear interactions non-scalar in isospin and work out examples in the sd shell to calculate the linear term in the isobaric mass-multiplet equation originating from these non-scalar parts.

  13. Study of isospin nonconservation in the framework of spectral distribution theory

    Science.gov (United States)

    Kar, Kamales; Sarkar, Sukhendusekhar

    2015-05-01

    The observed isospin-symmetry breaking in light nuclei is caused not only by the Coulomb interaction but also by the isovector one- and two-body plus isotensor two-body nuclear interactions. Spectral distribution theory, which treats nuclear spectroscopy and other structural properties in a statistical framework, has been applied mostly to isospin-conserving Hamiltonians. In this paper we extend that to include the nuclear interactions non-scalar in isospin and work out examples in the sd shell to calculate the linear term in the isobaric mass-multiplet equation originating from these non-isoscalar parts.
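
    The isobaric mass-multiplet equation referred to in both versions of this work has the standard quadratic form

        M(\alpha, T, T_z) = a(\alpha, T) + b(\alpha, T)\, T_z + c(\alpha, T)\, T_z^{2},

    where T_z is the isospin projection; the linear coefficient b is the term that the spectral-distribution calculation evaluates here from the non-isoscalar parts of the interaction, with sd-shell nuclei as worked examples.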

  14. Pegasus: A Framework for Mapping Complex Scientific Workflows onto Distributed Systems

    Directory of Open Access Journals (Sweden)

    Ewa Deelman

    2005-01-01

    Full Text Available This paper describes the Pegasus framework that can be used to map complex scientific workflows onto distributed resources. Pegasus enables users to represent the workflows at an abstract level without needing to worry about the particulars of the target execution systems. The paper describes general issues in mapping applications and the functionality of Pegasus. We present the results of improving application performance through workflow restructuring which clusters multiple tasks in a workflow into single entities. A real-life astronomy application is used as the basis for the study.

  15. The latest ‘big thing’ for South African companies: Enterprise and supplier development – proposing an implementation framework

    Directory of Open Access Journals (Sweden)

    R.I. David Pooe

    2016-01-01

    Full Text Available Background: Although enterprise development and supplier development are two distinct concepts in the organisational and management literature, the Broad-based Black Economic Empowerment (B-BBEE) legislation refers to enterprise development and supplier development as a single concept. As a result, many companies have a single policy, strategy and structural arrangements to manage enterprise development and supplier development programs and activities, thereby conflating the two concepts. Objectives: The aim of this conceptual article was to propose an implementation framework for enterprise and supplier development and to show the rationale for keeping enterprise development and supplier development as two distinct but related activities. Method: The conceptual article provided an overview of the policy context regarding enterprise development, followed by a discussion on enterprise development and its complexities. The article drew from the literature on supplier development and from the supplier adaptation, relational view, and learning and knowledge perspective theories in the development of its argument. Results: The article presented an implementation framework for enterprise and supplier development and then concluded with some recommendations and direction for future research. Conclusion: The implications of this research have great value for organisations as they prepare to implement enterprise and supplier development (ESD) programmes. The proposed framework will also contribute to a better understanding of the ESD process and the link between enterprise development and supplier development processes.

  16. Lilith: A Java framework for the development of scalable tools for high performance distributed computing platforms

    Energy Technology Data Exchange (ETDEWEB)

    Evensky, D.A.; Gentile, A.C.; Armstrong, R.C.

    1998-03-19

    Increasingly, high performance computing constitutes the use of very large heterogeneous clusters of machines. The use and maintenance of such clusters are subject to complexities of communication between the machines in a time efficient and secure manner. Lilith is a general purpose tool that provides a highly scalable, secure, and easy distribution of user code across a heterogeneous computing platform. By handling the details of code distribution and communication, such a framework allows for the rapid development of tools for the use and management of large distributed systems. Lilith is written in Java, taking advantage of Java's unique features of loading and distributing code dynamically, its platform independence, its thread support, and its provision of graphical components to facilitate easy-to-use resultant tools. The authors describe the use of Lilith in a tool developed for the maintenance of the large distributed cluster at their institution and present details of the Lilith architecture and user API for the general user development of scalable tools.
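
    A minimal sketch of the master/worker fan-out that such a framework automates across a cluster, written here with local threads standing in for remote hosts (class and method names are hypothetical and are not Lilith's actual API):

      import java.util.ArrayList;
      import java.util.List;
      import java.util.concurrent.Callable;
      import java.util.concurrent.ExecutorService;
      import java.util.concurrent.Executors;
      import java.util.concurrent.Future;

      // "User code" that the framework would ship to every node.
      interface NodeTask<T> extends Callable<T> {}

      public class MiniMasterWorker {
          public static void main(String[] args) throws Exception {
              // Stand-in for the set of remote hosts the framework would fan out to.
              List<String> hosts = List.of("node01", "node02", "node03", "node04");
              ExecutorService pool = Executors.newFixedThreadPool(hosts.size());

              // Distribute the same user task to each "host" and collect results:
              // the scalable fan-out/fan-in that a real framework performs
              // hierarchically and securely across machines.
              List<Future<String>> results = new ArrayList<>();
              for (String host : hosts) {
                  NodeTask<String> task = () -> host + ": uptime=" + System.nanoTime();
                  results.add(pool.submit(task));
              }
              for (Future<String> f : results) {
                  System.out.println(f.get());
              }
              pool.shutdown();
          }
      }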

  17. BIOMedical Search Engine Framework: Lightweight and customized implementation of domain-specific biomedical search engines.

    Science.gov (United States)

    Jácome, Alberto G; Fdez-Riverola, Florentino; Lourenço, Anália

    2016-07-01

    Text mining and semantic analysis approaches can be applied to the construction of biomedical domain-specific search engines and provide an attractive alternative to create personalized and enhanced search experiences. Therefore, this work introduces the new open-source BIOMedical Search Engine Framework for the fast and lightweight development of domain-specific search engines. The rationale behind this framework is to incorporate core features typically available in search engine frameworks with flexible and extensible technologies to retrieve biomedical documents, annotate meaningful domain concepts, and develop highly customized Web search interfaces. The BIOMedical Search Engine Framework integrates taggers for major biomedical concepts, such as diseases, drugs, genes, proteins, compounds and organisms, and enables the use of domain-specific controlled vocabulary. Technologies from the Typesafe Reactive Platform, the AngularJS JavaScript framework and the Bootstrap HTML/CSS framework support the customization of the domain-oriented search application. Moreover, the RESTful API of the BIOMedical Search Engine Framework allows the integration of the search engine into existing systems or a complete web interface personalization. The construction of the Smart Drug Search is described as a proof-of-concept of the BIOMedical Search Engine Framework. This public search engine catalogs scientific literature about antimicrobial resistance, microbial virulence and related topics. The keyword-based queries of the users are transformed into concepts and search results are presented and ranked accordingly. The semantic graph view portrays all the concepts found in the results, and the researcher may look into the relevance of different concepts, the strength of direct relations, and non-trivial, indirect relations. The number of occurrences of the concept shows its importance to the query, and the frequency of concept co-occurrence is indicative of biological relations

  18. Implementation and Scalability of a Pure Java Parallel Framework with Application to Hyperbolic Conservation Laws (Preprint)

    Science.gov (United States)

    2008-02-04

    However, this may not always be the case. A master may want to tap into the resources of a Beowulf cluster for example, but may only do so through the ... framework tailored for explicitly parallel, Single Program, Multiple Data (SPMD) programming applications on clustered networks. In particular, the ... horizontal and vertical scalability over clusters of shared-memory machines. Communication within the framework via RMI is discussed in Section 3 for
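
    Since RMI is singled out as the communication layer, the fragment below shows the essential java.rmi boilerplate such a framework builds on (standard library usage only, not the preprint's own code):

      import java.rmi.Remote;
      import java.rmi.RemoteException;
      import java.rmi.registry.LocateRegistry;
      import java.rmi.registry.Registry;
      import java.rmi.server.UnicastRemoteObject;

      // Remote interface visible to both master and workers.
      interface Worker extends Remote {
          double compute(double[] chunk) throws RemoteException;
      }

      // Worker-side implementation exported as a remote object.
      class WorkerImpl extends UnicastRemoteObject implements Worker {
          protected WorkerImpl() throws RemoteException { super(); }

          @Override
          public double compute(double[] chunk) throws RemoteException {
              double sum = 0;
              for (double v : chunk) sum += v;   // trivial SPMD-style partial result
              return sum;
          }
      }

      public class RmiWorkerNode {
          public static void main(String[] args) throws Exception {
              // The worker registers itself; the master later looks it up by name:
              //   Worker w = (Worker) LocateRegistry.getRegistry(host).lookup("worker-1");
              Registry registry = LocateRegistry.createRegistry(1099);
              registry.rebind("worker-1", new WorkerImpl());
              System.out.println("worker-1 ready on port 1099");
          }
      }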

  19. A WEB-BASED FRAMEWORK FOR VISUALIZING INDUSTRIAL SPATIOTEMPORAL DISTRIBUTION USING STANDARD DEVIATIONAL ELLIPSE AND SHIFTING ROUTES OF GRAVITY CENTERS

    National Research Council Canada - National Science Library

    Y. Song; Z. Gui; H. Wu; Y. Wei

    2017-01-01

    .... The framework uses standard deviational ellipse (SDE) and shifting route of gravity centers to show the spatial distribution and yearly developing trends of different enterprise types according to their industry categories...
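
    A small sketch of the second indicator, the weighted gravity centre and its year-to-year shifting route, is given below (toy data and hypothetical field names; the standard deviational ellipse computation is omitted for brevity):

      import java.util.List;
      import java.util.Map;
      import java.util.TreeMap;

      // One enterprise location with an optional weight (e.g. employment, output).
      record Enterprise(int year, double x, double y, double weight) {}

      public class GravityCenterRoute {
          // Weighted mean centre of all enterprises observed in a given year.
          static double[] gravityCenter(List<Enterprise> points) {
              double sx = 0, sy = 0, sw = 0;
              for (Enterprise e : points) {
                  sx += e.x() * e.weight();
                  sy += e.y() * e.weight();
                  sw += e.weight();
              }
              return new double[]{sx / sw, sy / sw};
          }

          public static void main(String[] args) {
              List<Enterprise> data = List.of(          // toy coordinates
                      new Enterprise(2015, 114.30, 30.58, 120),
                      new Enterprise(2015, 114.42, 30.51, 80),
                      new Enterprise(2016, 114.35, 30.60, 150),
                      new Enterprise(2016, 114.50, 30.49, 60));

              // Group by year and print the shifting route of the gravity centre.
              Map<Integer, List<Enterprise>> byYear = new TreeMap<>();
              for (Enterprise e : data)
                  byYear.computeIfAbsent(e.year(), y -> new java.util.ArrayList<>()).add(e);

              for (Map.Entry<Integer, List<Enterprise>> entry : byYear.entrySet()) {
                  double[] c = gravityCenter(entry.getValue());
                  System.out.printf("%d -> (%.4f, %.4f)%n", entry.getKey(), c[0], c[1]);
              }
          }
      }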

  20. A simplistic pedagogical formulation of the Maxwell-Boltzmann Thermal Speed Distribution using a relativistic framework

    CERN Document Server

    Singh, Ashmeet

    2012-01-01

    A novel pedagogical technique is presented that can be used in the undergraduate (UG) class to formulate a relativistically extended Kinetic Theory of Gases and Maxwell-Boltzmann thermal speed distribution, while keeping the basic thermal symmetry arguments intact. The adopted framework can be used by students to understand the physics in a thermally governed system at high temperature and speeds, without having to indulge in high level tensor based mathematics. Our approach will first recapitulate what is taught and known in the UG class and then present a methodology that will help students to understand and derive the physics of relativistic thermal systems. The methodology uses simple tools well known in the UG class and involves a component of computational techniques that can be used to involve students in this exercise. We also present towards the end the interesting implications of the relativistically extended distribution and compare it with Maxwell-Boltzmann results at various temperatures.

  1. A framework for structural modelling of an RFID-enabled intelligent distributed manufacturing control system

    Directory of Open Access Journals (Sweden)

    Barenji, Ali Vatankhah

    2014-08-01

    Full Text Available A modern manufacturing facility typically contains several distributed control systems, such as machining stations, assembly stations, and material handling and storage systems. Integrating Radio Frequency Identification (RFID) technology into these control systems provides a basis for monitoring and configuring their components in real-time. With the right structural modelling, it is then possible to evaluate designs and translate them into new operational applications almost immediately. This paper proposes an architecture for the structural modelling of an intelligent distributed control system for a manufacturing facility, by utilising RFID technology. Emphasis is placed on a requirements analysis of the manufacturing system, the design of RFID-enabled intelligent distributed control systems using Unified Modelling Language (UML) diagrams, and the use of efficient algorithms and tools for the implementation of these systems.

  2. Evaluating the successful implementation of evidence into practice using the PARiHS framework: theoretical and practical challenges

    Directory of Open Access Journals (Sweden)

    McCormack Brendan

    2008-01-01

    Full Text Available Abstract Background The PARiHS framework (Promoting Action on Research Implementation in Health Services) has proved to be a useful practical and conceptual heuristic for many researchers and practitioners in framing their research or knowledge translation endeavours. However, as a conceptual framework it still remains untested and therefore its contribution to the overall development and testing of theory in the field of implementation science is largely unquantified. Discussion This being the case, the paper provides an integrated summary of our conceptual and theoretical thinking so far and introduces a typology (derived from social policy analysis) used to distinguish between the terms conceptual framework, theory and model – important definitional and conceptual issues in trying to refine theoretical and methodological approaches to knowledge translation. Secondly, the paper describes the next phase of our work, in particular concentrating on the conceptual thinking and mapping that has led to the generation of the hypothesis that the PARiHS framework is best utilised as a two-stage process: as a preliminary (diagnostic and evaluative) measure of the elements and sub-elements of evidence (E) and context (C), and then using the aggregated data from these measures to determine the most appropriate facilitation method. The exact nature of the intervention is thus determined by the specific actors in the specific context at a specific time and place. In the process of refining this next phase of our work, we have had to consider the wider issues around the use of theories to inform and shape our research activity; the ongoing challenges of developing robust and sensitive measures; facilitation as an intervention for getting research into practice; and finally to note how the current debates around evidence into practice are adopting wider notions that fit innovations more generally. Summary The paper concludes by suggesting that the future

  3. dCache: implementing a high-end NFSv4.1 service using a Java NIO framework

    CERN Document Server

    CERN. Geneva

    2012-01-01

    dCache is a high-performance scalable storage system widely used by the HEP community. In addition to a set of home-grown protocols, we also provide industry-standard access mechanisms like WebDAV and NFSv4.1. This support places dCache as a direct competitor to commercial solutions. Nevertheless, conforming to a protocol is not enough; our implementations must perform comparably to or even better than commercial systems. To achieve this, dCache uses two high-end IO frameworks from well-known application servers: GlassFish and JBoss. This presentation describes how we implemented an RFC 1831 and RFC 2203 compliant ONC RPC (Sun RPC) service based on the Grizzly NIO framework, part of the GlassFish application server. This ONC RPC service is the key component of dCache’s NFSv4.1 implementation, but is independent of dCache and available for other projects. We will also show some details of the dCache NFSv4.1 implementation, describe some of the Java NIO techniques used and, finally, present details of our performance e...
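    For readers unfamiliar with ONC RPC transport, the sketch below shows the RFC 1831 record-marking framing that any TCP-based Sun RPC service has to implement: a 4-byte header whose top bit flags the last fragment and whose remaining 31 bits carry the fragment length. This is a generic illustration in Python, not dCache or Grizzly code.

```python
import struct

LAST_FRAGMENT = 0x80000000

def frame_record(payload: bytes) -> bytes:
    # Single-fragment record: header = last-fragment bit | 31-bit length.
    return struct.pack(">I", LAST_FRAGMENT | len(payload)) + payload

def read_record(stream) -> bytes:
    # Reassemble a record that may span several fragments.
    record, last = b"", False
    while not last:
        (header,) = struct.unpack(">I", stream.read(4))
        last = bool(header & LAST_FRAGMENT)
        record += stream.read(header & 0x7FFFFFFF)
    return record
```

    A caller would wrap an XDR-encoded CALL message with frame_record() before writing it to the TCP socket; read_record() is the mirror operation on the receiving side.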

  4. Offering an Operational Framework for Measuring the Risk Level in the Implementation of Re-engineering Projects of Business Processes

    Directory of Open Access Journals (Sweden)

    Tahmoures Hassan-Gholipour

    2013-01-01

    Full Text Available The appearance of new technologies, increased competition, and new economic challenges have forced organizations to implement business process re-engineering projects. Successful implementation of a re-engineering project is a difficult undertaking and depends on several key factors. The present study seeks to offer a new framework for measuring the risk level in the implementation of re-engineering projects by analyzing their critical success and failure factors. These factors were identified by reviewing the research literature and exploring experts' viewpoints. The framework offered in this study consists of five success factors and one failure factor. The success factors include uniform leadership, a cooperative workplace, senior management commitment, supportive management, and the use of information technologies; the failure factor is resistance to change. Because this model identifies the risk level of project failure before the implementation of re-engineering projects, it is a useful instrument for organizations that seek to re-engineer their business processes.
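    A toy scoring sketch of how such a framework could be operationalised. The factor names follow the abstract, but the 1-5 scores, equal weights and aggregation rule below are placeholders, not the authors' instrument.

```python
# Illustrative only: scores and weights are hypothetical.
success_factors = {          # higher score = condition better satisfied (1-5)
    "uniform_leadership": 4,
    "cooperative_workplace": 3,
    "senior_management_commitment": 5,
    "supportive_management": 4,
    "use_of_information_technologies": 2,
}
failure_factors = {"resistance_to_change": 4}   # higher = stronger resistance

def risk_level(success, failure):
    # Risk grows as success conditions weaken and failure conditions strengthen.
    lack_of_success = sum(5 - s for s in success.values()) / (4 * len(success))
    failure_pressure = sum(f - 1 for f in failure.values()) / (4 * len(failure))
    return 0.5 * lack_of_success + 0.5 * failure_pressure   # 0 (low) .. 1 (high)

print(f"estimated implementation risk: {risk_level(success_factors, failure_factors):.2f}")
```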

  5. Toward enhancing the distributed video coder under a multiview video codec framework

    Science.gov (United States)

    Lee, Shih-Chieh; Chen, Jiann-Jone; Tsai, Yao-Hong; Chen, Chin-Hua

    2016-11-01

    The advance of video coding technology enables multiview video (MVV) or three-dimensional television (3-D TV) display for users with or without glasses. For mobile devices or wireless applications, a distributed video coder (DVC) can be utilized to shift the encoder complexity to the decoder under the MVV coding framework, denoted as multiview distributed video coding (MDVC). We proposed to exploit both inter- and intraview video correlations to enhance side information (SI) and improve the MDVC performance: (1) based on the multiview motion estimation (MVME) framework, a categorized block matching prediction with fidelity weights (COMPETE) was proposed to yield a high-quality SI frame for better DVC reconstructed images; (2) the block transform coefficient properties, i.e., DCs and ACs, were exploited to design the priority rate control for the turbo code, such that the DVC decoding can be carried out with the fewest parity bits. In comparison, the proposed COMPETE method demonstrated lower time complexity while presenting better reconstructed video quality. Simulations show that the proposed COMPETE can reduce the time complexity of MVME by a factor of 1.29 to 2.56, as compared to previous hybrid MVME methods, while the peak signal-to-noise ratios (PSNRs) of the decoded video can be improved by 0.2 to 3.5 dB, as compared to H.264/AVC intracoding.
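    The fragment below illustrates one building block of fidelity-weighted side-information generation: blending candidate predictor blocks (for example a temporal and an inter-view prediction) with weights that fall as the matching cost rises. It is a generic sketch, not the COMPETE categorization or weighting rule.

```python
import numpy as np

def fuse_candidates(candidates, sads):
    """Blend candidate predictor blocks into one side-information block,
    weighting each candidate by an inverse-distortion fidelity weight."""
    w = 1.0 / (np.asarray(sads, dtype=float) + 1e-6)   # lower SAD -> higher weight
    w /= w.sum()
    return sum(wi * c.astype(float) for wi, c in zip(w, candidates))

# toy usage: two 8x8 candidate blocks with their matching costs
rng = np.random.default_rng(0)
cands = [rng.integers(0, 255, (8, 8)), rng.integers(0, 255, (8, 8))]
print(fuse_candidates(cands, sads=[120.0, 300.0]).round(1))
```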

  6. Integrating a Trust Framework with a Distributed Certificate Validation Scheme for MANETs

    Directory of Open Access Journals (Sweden)

    Marias Giannis F

    2006-01-01

    Full Text Available Many trust establishment solutions in mobile ad hoc networks (MANETs) rely on public key certificates. Therefore, they should be accompanied by an efficient mechanism for certificate revocation and validation. Ad hoc distributed OCSP for trust (ADOPT) is a lightweight, distributed, on-demand scheme based on cached OCSP responses, which provides certificate status information to the nodes of a MANET. In this paper we discuss the ADOPT scheme and issues on its deployment over MANETs. We present some possible threats to ADOPT and suggest the use of a trust assessment and establishment framework, named ad hoc trust framework (ATF), to support ADOPT's robustness and efficiency. ADOPT is deployed as a trust-aware application that provides feedback to ATF, which calculates the trustworthiness of the peer nodes' functions and helps ADOPT to improve its performance by rapidly locating valid certificate status information. Moreover, we introduce the TrustSpan algorithm to reduce the overhead that ATF produces, and the TrustPath algorithm to identify and use trusted routes for propagating sensitive information, such as third parties' accusations. Simulation results show that ATF adds limited overhead compared to its efficiency in detecting and isolating malicious and selfish nodes. ADOPT's reliability is increased, since it can rapidly locate a legitimate response by using information provided by ATF.
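    A hypothetical sketch of the feedback loop described above: ADOPT-style observations update per-node trust values, and the node with the highest trust among those caching a certificate-status response is queried first. The moving-average update rule and the trust threshold are assumptions, not ATF's actual computation.

```python
trust = {}   # node id -> trust value in [0, 1]

def report(node, successful, alpha=0.2):
    # Exponential moving average over direct observations (application feedback).
    old = trust.get(node, 0.5)
    trust[node] = (1 - alpha) * old + alpha * (1.0 if successful else 0.0)

def pick_responder(caching_nodes, threshold=0.6):
    # Prefer the most trustworthy node holding a cached certificate-status response.
    trusted = [n for n in caching_nodes if trust.get(n, 0.5) >= threshold]
    return max(trusted, key=lambda n: trust[n], default=None)

for node, ok in [("n1", True), ("n2", False), ("n1", True), ("n3", True)]:
    report(node, ok)
print(pick_responder(["n1", "n2", "n3"]))   # -> "n1"
```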

  7. Implementation of a Wireless Time Distribution Testbed Protected with Quantum Key Distribution

    Energy Technology Data Exchange (ETDEWEB)

    Bonior, Jason D [ORNL; Evans, Philip G [ORNL; Sheets, Gregory S [ORNL; Jones, John P [ORNL; Flynn, Toby H [ORNL; O' Neil, Lori Ross [Pacific Northwest National Laboratory (PNNL); Hutton, William [Pacific Northwest National Laboratory (PNNL); Pratt, Richard [Pacific Northwest National Laboratory (PNNL); Carroll, Thomas E. [Pacific Northwest National Laboratory (PNNL)

    2017-01-01

    Secure time transfer is critical for many time-sensitive applications. The Global Positioning System (GPS), which is often used for this purpose, has been shown to be susceptible to spoofing attacks. Quantum key distribution (QKD) offers a way to securely generate encryption keys at two locations. Through careful use of this information, it is possible to create a system that is more resistant to spoofing attacks. In this paper we describe our work to create a testbed which utilizes QKD and traditional RF links. This testbed will be used for the development of more secure and spoofing-resistant time distribution protocols.
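    One simple way QKD-derived key material can harden time transfer is by authenticating each time message so spoofed packets are rejected. The sketch below tags a timestamp with an HMAC keyed by a (stand-in) QKD-derived key; it is a simplified illustration, not the testbed's actual protocol.

```python
import hmac, hashlib, struct, time

def make_time_message(qkd_key: bytes, seq: int) -> bytes:
    # sequence number + timestamp, followed by an HMAC-SHA256 tag
    body = struct.pack(">Qd", seq, time.time())
    tag = hmac.new(qkd_key, body, hashlib.sha256).digest()
    return body + tag

def verify_time_message(qkd_key: bytes, msg: bytes):
    body, tag = msg[:-32], msg[-32:]
    if not hmac.compare_digest(tag, hmac.new(qkd_key, body, hashlib.sha256).digest()):
        raise ValueError("authentication failed - possible spoofing")
    return struct.unpack(">Qd", body)   # (seq, timestamp)

key = bytes(32)                          # stand-in for a fresh QKD-derived key
print(verify_time_message(key, make_time_message(key, seq=1)))
```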

  8. A framework for assessing cost management system changes: the case of activity-based costing implementation at food industry

    Directory of Open Access Journals (Sweden)

    Tayebeh Faraji

    2015-04-01

    Full Text Available An opportunity to investigate the technical and organizational effects of management accounting system changes has appeared with companies' adoption of activity-based costing (ABC). This paper presents an empirical investigation to study the effects of an ABC system for a case study from the food industry in Iran. From this case, the paper develops a framework for assessing ABC implementation and hypotheses about factors that influence implementation. The study identifies five cost centers and, for each cost center, determines different cost drivers. The results of our survey indicate that implementing the ABC system not only supports more precise allocation of overhead costs but also helps the company's internal management with better planning and control of production and with making better decisions for the company's profit.
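    The mechanics of an ABC allocation can be shown in a few lines: each cost pool is divided by its driver volume to get a rate, and products absorb overhead in proportion to their driver consumption. The cost pools, drivers and numbers below are made up for illustration, not the case-study data.

```python
cost_pools = {                      # cost centre -> (overhead cost, driver name)
    "machine setup":   (40_000, "setups"),
    "quality control": (25_000, "inspections"),
    "packaging":       (15_000, "cartons"),
}
driver_volumes = {"setups": 200, "inspections": 500, "cartons": 30_000}
product_usage = {                   # product -> driver consumption
    "juice":   {"setups": 120, "inspections": 300, "cartons": 20_000},
    "compote": {"setups": 80,  "inspections": 200, "cartons": 10_000},
}

rates = {pool: cost / driver_volumes[driver]
         for pool, (cost, driver) in cost_pools.items()}

for product, usage in product_usage.items():
    overhead = sum(rates[pool] * usage[driver]
                   for pool, (_, driver) in cost_pools.items())
    print(f"{product}: allocated overhead = {overhead:,.0f}")
```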

  9. An organizational framework and strategic implementation for system-level change to enhance research-based practice: QUERI Series

    Directory of Open Access Journals (Sweden)

    Mittman Brian S

    2008-05-01

    Full Text Available Abstract Background The continuing gap between available evidence and current practice in health care reinforces the need for more effective solutions, in particular related to organizational context. Considerable advances have been made within the U.S. Veterans Health Administration (VA) in systematically implementing evidence into practice. These advances have been achieved through a system-level program focused on collaboration and partnerships among policy makers, clinicians, and researchers. The Quality Enhancement Research Initiative (QUERI) was created to generate research-driven initiatives that directly enhance health care quality within the VA and, simultaneously, contribute to the field of implementation science. This paradigm-shifting effort provided a natural laboratory for exploring organizational change processes. This article describes the underlying change framework and implementation strategy used to operationalize QUERI. Strategic approach to organizational change QUERI used an evidence-based organizational framework focused on three contextual elements: (1) cultural norms and values, in this case related to the role of health services researchers in evidence-based quality improvement; (2) capacity, in this case among researchers and key partners to engage in implementation research; and (3) supportive infrastructures to reinforce expectations for change and to sustain new behaviors as part of the norm. As part of a QUERI Series in Implementation Science, this article describes the framework's application in an innovative integration of health services research, policy, and clinical care delivery. Conclusion QUERI's experience and success provide a case study in organizational change. It demonstrates that progress requires a strategic, systems-based effort. QUERI's evidence-based initiative involved a deliberate cultural shift, requiring ongoing commitment in multiple forms and at multiple levels. VA's commitment to QUERI came in the

  10. A Framework to Implement IoT Network Performance Modelling Techniques for Network Solution Selection

    Directory of Open Access Journals (Sweden)

    Declan T. Delaney

    2016-12-01

    Full Text Available No single network solution for Internet of Things (IoT) networks can provide the required level of Quality of Service (QoS) for all applications in all environments. This leads to an increasing number of solutions created to fit particular scenarios. Given the increasing number and complexity of solutions available, it becomes difficult for an application developer to choose the solution which is best suited for an application. This article introduces a framework which autonomously chooses the best solution for the application given the current deployed environment. The framework utilises a performance model to predict the expected performance of a particular solution in a given environment. The framework can then choose an apt solution for the application from a set of available solutions. This article presents the framework with a set of models built using data collected from simulation. The modelling technique can determine with up to 85% accuracy the solution which performs the best for a particular performance metric given a set of solutions. The article highlights the fractured and disjointed practice currently in place for examining and comparing communication solutions and aims to open a discussion on harmonising testing procedures so that different solutions can be directly compared and offers a framework to achieve this within IoT networks.
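    The selection step itself can be pictured as in the sketch below: each candidate solution exposes a performance model as a function of the observed environment, and the framework picks the solution whose prediction is best for the metric of interest. Solution names, model forms and the environment variable are hypothetical.

```python
models = {
    # solution -> predicted metric (e.g. packet delivery ratio) given the environment
    "csma_low_power": lambda env: 0.90 - 0.02 * env["nodes"] / 10,
    "tdma_schedule":  lambda env: 0.80 - 0.005 * env["nodes"] / 10,
}

def select_solution(env, higher_is_better=True):
    predictions = {name: model(env) for name, model in models.items()}
    pick = max if higher_is_better else min
    return pick(predictions, key=predictions.get), predictions

print(select_solution({"nodes": 120}))   # the TDMA-style solution wins once the network is dense
```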

  11. A Framework to Implement IoT Network Performance Modelling Techniques for Network Solution Selection.

    Science.gov (United States)

    Delaney, Declan T; O'Hare, Gregory M P

    2016-12-01

    No single network solution for Internet of Things (IoT) networks can provide the required level of Quality of Service (QoS) for all applications in all environments. This leads to an increasing number of solutions created to fit particular scenarios. Given the increasing number and complexity of solutions available, it becomes difficult for an application developer to choose the solution which is best suited for an application. This article introduces a framework which autonomously chooses the best solution for the application given the current deployed environment. The framework utilises a performance model to predict the expected performance of a particular solution in a given environment. The framework can then choose an apt solution for the application from a set of available solutions. This article presents the framework with a set of models built using data collected from simulation. The modelling technique can determine with up to 85% accuracy the solution which performs the best for a particular performance metric given a set of solutions. The article highlights the fractured and disjointed practice currently in place for examining and comparing communication solutions and aims to open a discussion on harmonising testing procedures so that different solutions can be directly compared and offers a framework to achieve this within IoT networks.

  12. A Framework to Implement IoT Network Performance Modelling Techniques for Network Solution Selection †

    Science.gov (United States)

    Delaney, Declan T.; O’Hare, Gregory M. P.

    2016-01-01

    No single network solution for Internet of Things (IoT) networks can provide the required level of Quality of Service (QoS) for all applications in all environments. This leads to an increasing number of solutions created to fit particular scenarios. Given the increasing number and complexity of solutions available, it becomes difficult for an application developer to choose the solution which is best suited for an application. This article introduces a framework which autonomously chooses the best solution for the application given the current deployed environment. The framework utilises a performance model to predict the expected performance of a particular solution in a given environment. The framework can then choose an apt solution for the application from a set of available solutions. This article presents the framework with a set of models built using data collected from simulation. The modelling technique can determine with up to 85% accuracy the solution which performs the best for a particular performance metric given a set of solutions. The article highlights the fractured and disjointed practice currently in place for examining and comparing communication solutions and aims to open a discussion on harmonising testing procedures so that different solutions can be directly compared and offers a framework to achieve this within IoT networks. PMID:27916929

  13. Design and implementation of an architectural framework for web portals in a ubiquitous pervasive environment.

    Science.gov (United States)

    Raza, Muhammad Taqi; Yoo, Seung-Wha; Kim, Ki-Hyung; Joo, Seong-Soon; Jeong, Wun-Cheol

    2009-01-01

    Web portals function as a single point of access to information on the World Wide Web (WWW). A web portal always contacts the portal's gateway for the information flow, which causes network traffic over the Internet. Moreover, it provides real-time/dynamic access to stored information, but not access to real-time information. This inherent functionality of web portals limits their role for resource-constrained digital devices in the Ubiquitous era (U-era). This paper presents a framework for the web portal in the U-era. We have introduced the concept of Local Regions in the proposed framework, so that local queries can be resolved locally rather than having to be routed over the Internet. Moreover, our framework enables one-to-one device communication for real-time information flow. To provide an in-depth analysis, we first provide an analytical model for query processing at the servers for our framework-oriented web portal. Finally, we have deployed a testbed, one of the world's largest IP-based wireless sensor network testbeds, and real-time measurements are observed that prove the efficacy and workability of the proposed framework.
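    A hypothetical sketch of the Local Region idea: answer a query from devices inside the region when possible and fall back to the remote portal gateway otherwise. Class and method names are illustrative, not the paper's API.

```python
class LocalRegion:
    def __init__(self, devices):
        self.devices = devices            # device id -> latest sensed values

    def resolve(self, query):
        kind, where = query               # e.g. ("temperature", "room-101")
        return self.devices.get(where, {}).get(kind)   # None if not answerable locally

def handle_query(region, gateway_lookup, query):
    answer = region.resolve(query)
    if answer is not None:
        return answer, "answered locally (no Internet traffic)"
    return gateway_lookup(query), "forwarded to portal gateway"

region = LocalRegion({"room-101": {"temperature": 21.5}})
print(handle_query(region, lambda q: "remote result", ("temperature", "room-101")))
print(handle_query(region, lambda q: "remote result", ("humidity", "room-101")))
```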

  14. Towards a framework of critical success factors for implementing supply-chain information systems

    NARCIS (Netherlands)

    Denolf, J.M.; Wognum, P.M.; Trienekens, J.H.; Vorst, van der J.G.A.J.; Omta, S.W.F.

    2015-01-01

    Supply chain information systems (SCISs) have emerged as the core of successful management in supply chains. However, the difficulties of SCIS implementations have been widely cited in the literature. Research on the critical success factors (CSFs) for SCIS implementation is rather scarce and

  15. Participation for effective environmental governance? Evidence from Water Framework Directive implementation in Germany, Spain and the United Kingdom.

    Science.gov (United States)

    Kochskämper, Elisa; Challies, Edward; Newig, Jens; Jager, Nicolas W

    2016-10-01

    Effectiveness of participation in environmental governance is a proliferating assertion in literature that is also reflected in European legislation, such as the European Water Framework Directive (WFD). The Directive mandates participatory river basin management planning across the EU aiming at the delivery of better policy outputs and enhanced implementation. Yet, the impact of this planning mode in WFD implementation remains unclear, though the first planning phase was completed in 2009 and the first implementation cycle by the end of 2015. Notwithstanding the expanding body of literature on WFD implementation, a rather scattered single case study approach seems to predominate. This paper reports on implementation of the WFD in three case studies from Germany, Spain and the United Kingdom, reflecting three substantially different approaches to participatory river basin management planning, on the basis of a comparative case study design. We ask if and how participation improved the environmental standard of outputs and the quality of implementation. We found an increasing quality of outputs with increasing intensity of local participation. Further, social outcomes such as learning occurred within dialogical settings, whereas empowerment and network building emerged also in the case characterized mainly by one-way information. Finally, one important finding deviant from the literature is that stakeholder acceptance seems to be more related to processes than to outputs.

  16. [Sustainable Implementation of Evidence-Based Programmes in Health Promotion: A Theoretical Framework and Concept of Interactive Knowledge to Action].

    Science.gov (United States)

    Rütten, A; Wolff, A; Streber, A

    2016-03-01

    This article discusses 2 current issues in the field of public health research: (i) transfer of scientific knowledge into practice and (ii) sustainable implementation of good practice projects. It also supports integration of scientific and practice-based evidence production. Furthermore, it supports utilisation of interactive models that transcend deductive approaches to the process of knowledge transfer. Existing theoretical approaches, pilot studies and thoughtful conceptual considerations are incorporated into a framework showing the interplay of science, politics and prevention practice, which fosters a more sustainable implementation of health promotion programmes. The framework depicts 4 key processes of interaction between science and prevention practice: interactive knowledge to action, capacity building, programme adaptation and adaptation of the implementation context. Ensuring sustainability of health promotion programmes requires a concentrated process of integrating scientific and practice-based evidence production in the context of implementation. Central to the integration process is the approach of interactive knowledge to action, which especially benefits from capacity building processes that facilitate participation and systematic interaction between relevant stakeholders. Intense cooperation also induces a dynamic interaction between multiple actors and components such as health promotion programmes, target groups, relevant organisations and social, cultural and political contexts. The reciprocal adaptation of programmes and key components of the implementation context can foster effectiveness and sustainability of programmes. Sustainable implementation of evidence-based health promotion programmes requires alternatives to recent deductive models of knowledge transfer. Interactive approaches prove to be promising alternatives. Simultaneously, they change the responsibilities of science, policy and public health practice. Existing boundaries

  17. Experimental implementation of non-Gaussian attacks on a continuous-variable quantum key distribution system

    CERN Document Server

    Lodewyck, J; Garcia-Patron, R; Tualle-Brouri, R; Cerf, N J; Grangier, P; Lodewyck, Jerome; Debuisschert, Thierry; Garcia-Patron, Raul; Tualle-Brouri, Rosa; Cerf, Nicolas J.; Grangier, Philippe

    2007-01-01

    An intercept-resend attack on a continuous-variable quantum-key-distribution protocol is investigated experimentally. By varying the interception fraction, one can implement a family of attacks where the eavesdropper totally controls the channel parameters. In general, such attacks add excess noise in the channel, and may also result in non-Gaussian output distributions. We implement and characterize the measurements needed to detect these attacks, and evaluate experimentally the information rates available to the legitimate users and the eavesdropper. The results are consistent with the optimality of Gaussian attacks resulting from the security proofs.
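    A back-of-the-envelope sketch, not the paper's analysis: it assumes the commonly quoted result that a full intercept-resend attack on coherent states adds about two shot-noise units (SNU) of excess noise and that a partial attack scales linearly with the intercepted fraction mu.

```python
def excess_noise(mu, full_attack_noise_snu=2.0):
    # mu = fraction of pulses intercepted and resent by the eavesdropper
    return mu * full_attack_noise_snu

for mu in (0.0, 0.25, 0.5, 1.0):
    print(f"interception fraction {mu:.2f} -> excess noise ~ {excess_noise(mu):.2f} SNU")
```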

  18. Conceptual framework for distributed expert-system use in time-sensitive hierarchical control

    Energy Technology Data Exchange (ETDEWEB)

    Henningsen, J.R.

    1987-01-01

    There are many problems faced by decision makers involved in complex, time-sensitive hierarchical control systems. These may include maintaining knowledge of the functional status of the system components, forecasting the impact of past and future events, transferring information to a distant or poorly connected location, changing the requirements for an operation according to resources available, or creating an independent course of action when system connectivity falls. These problems are transdisciplinary in nature, so decision makers in a variety of organizations face them. This research develops a framework for the use of distributed expert systems in support of time-sensitive hierarchical control systems. Attention is focused on determining ways to enhance the likelihood that a system will remain functional during a crisis in which one or more of the system nodes fail. Options in the use of distributed expert systems for this purpose are developed following investigation of related research in the areas of cooperative and distributed systems. A prototype under development of a generic system model called DES (distributed expert systems) is described. DES is a trimular form of support structure, where a trimule is defined to be a combination of a human decision agent, a component system model and an expert system. This concept is an extension of the domular theory of Tenney and Sandell (1981).

  19. A framework for stochastic simulation of distribution practices for hotel reservations

    Energy Technology Data Exchange (ETDEWEB)

    Halkos, George E.; Tsilika, Kyriaki D. [Laboratory of Operations Research, Department of Economics, University of Thessaly, Korai 43, 38 333, Volos (Greece)

    2015-03-10

    The focus of this study is primarily on the Greek hotel industry. The objective is to design and develop a framework for stochastic simulation of reservation requests, reservation arrivals, cancellations and hotel occupancy with a planning horizon of a tourist season. In the Greek hospitality industry there were two competing policies for the reservation planning process up to 2003: reservations coming directly from customers, and reservations management relying on tour operator(s). Recently the Internet, along with other emerging technologies, has offered the potential to disrupt enduring distribution arrangements. The focus of the study is on the choice of distribution intermediaries. We present an empirical model for the hotel reservation planning process that makes use of stochastic (Monte Carlo) simulation, as requests for reservations, cancellations, and arrival rates are all sources of uncertainty. We consider as a case study the problem of determining the optimal booking strategy for a medium-size hotel on Skiathos Island, Greece. Probability distributions and parameter estimates result from the historical data available and from suggestions made in the relevant literature. The results of this study may assist hotel managers in defining distribution strategies for hotel rooms and evaluating the performance of the reservations management system.
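    A minimal Monte Carlo sketch of one season of reservations for a single hotel. The distributions and parameter values (Poisson requests, binomial cancellations, uniform lengths of stay) are placeholders, not the estimates used in the study.

```python
import numpy as np

rng = np.random.default_rng(42)
DAYS, ROOMS, RUNS = 180, 60, 1000
occupied = np.zeros((RUNS, DAYS))

for r in range(RUNS):
    requests = rng.poisson(lam=8.0, size=DAYS)            # daily booking requests
    cancels = rng.binomial(requests, p=0.15)               # a share is cancelled
    stays = rng.integers(1, 6, size=DAYS)                  # length of stay, 1-5 nights
    rooms_free = np.full(DAYS, ROOMS)
    for day in range(DAYS):
        end = min(DAYS, day + stays[day])
        arrivals = min(requests[day] - cancels[day], rooms_free[day:end].min())
        rooms_free[day:end] -= arrivals
        occupied[r, day:end] += arrivals

print(f"mean seasonal occupancy: {occupied.mean() / ROOMS:.1%}")
print("5th-95th percentile of daily occupancy:",
      np.percentile(occupied / ROOMS, [5, 95]).round(2))
```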

  20. Capataz: a framework for distributing algorithms via the World Wide Web

    Directory of Open Access Journals (Sweden)

    Gonzalo J. Martínez

    2015-08-01

    Full Text Available In recent years, some scientists have embraced the distributed computing paradigm. As experiments and simulations demand ever more computing power, coordinating the efforts of many different processors is often the only reasonable resort. We developed an open-source distributed computing framework based on web technologies, and named it Capataz. Acting as an HTTP server, it allows web browsers running on many different devices to connect and contribute to the execution of distributed algorithms written in Javascript. Capataz takes advantage of architectures with many cores using web workers. This paper presents an improvement in Capataz's usability and why it was needed. In previous experiments, the total time of distributed algorithms proved to be susceptible to changes in the execution time of the jobs. The system now adapts by bundling jobs together if they are too simple. The computational experiment to test the solution is a brute force estimation of pi. The benchmark results show that by bundling jobs, the overall performance is greatly increased.
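    The bundling idea can be sketched independently of the Javascript implementation: if individual jobs finish too quickly, shipping overhead dominates, so several jobs are packed into one bundle. The target bundle duration, cap and update rule below are illustrative assumptions.

```python
def choose_bundle_size(avg_job_seconds, target_bundle_seconds=2.0, max_bundle=64):
    if avg_job_seconds <= 0:
        return max_bundle
    return max(1, min(max_bundle, round(target_bundle_seconds / avg_job_seconds)))

def bundle_jobs(jobs, avg_job_seconds):
    size = choose_bundle_size(avg_job_seconds)
    return [jobs[i:i + size] for i in range(0, len(jobs), size)]

jobs = [{"id": i, "samples": 10_000} for i in range(100)]   # e.g. pi-estimation slices
print(len(bundle_jobs(jobs, avg_job_seconds=0.05)), "bundles of",
      choose_bundle_size(0.05), "jobs each")
```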

  1. Advancing a distributed multi-scale computing framework for large-scale high-throughput discovery in materials science.

    Science.gov (United States)

    Knap, J; Spear, C E; Borodin, O; Leiter, K W

    2015-10-30

    We describe the development of a large-scale high-throughput application for discovery in materials science. Our point of departure is a computational framework for distributed multi-scale computation. We augment the original framework with a specialized module whose role is to route evaluation requests needed by the high-throughput application to a collection of available computational resources. We evaluate the feasibility and performance of the resulting high-throughput computational framework by carrying out a high-throughput study of battery solvents. Our results indicate that distributed multi-scale computing, by virtue of its adaptive nature, is particularly well-suited for building high-throughput applications.

  2. Data Distribution Service-Based Interoperability Framework for Smart Grid Testbed Infrastructure

    Directory of Open Access Journals (Sweden)

    Tarek A. Youssef

    2016-03-01

    Full Text Available This paper presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurements and control network. The advantages of utilizing the data-centric over the message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves the communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and an automatic discovery feature for dynamically participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface for the testbed infrastructure were developed in order to facilitate interoperability and remote access to the testbed. This interface allows remote control, monitoring, and execution of experiments. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).
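    The data-centric idea is that publishers and subscribers are decoupled through topics rather than addressed messages. The sketch below is a minimal in-process stand-in for that pattern; it is not the DDS API, and real DDS adds QoS policies, discovery and network transport.

```python
from collections import defaultdict

class DataBus:
    """Minimal topic-based publish/subscribe bus illustrating data-centric decoupling."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, sample):
        for cb in self._subs[topic]:
            cb(sample)

bus = DataBus()
bus.subscribe("grid/measurements", lambda s: print("controller saw", s))
bus.subscribe("grid/measurements", lambda s: print("logger saw", s))
bus.publish("grid/measurements", {"bus": 3, "voltage_pu": 0.98})
```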

  3. Deconstructing public participation in the Water Framework Directive: implementation and compliance with the letter or with the spirit of the law

    NARCIS (Netherlands)

    Ker Rault, P.A.; Jeffrey, P.J.

    2008-01-01

    This article offers a fresh reading of the Water Framework Directive (WFD) and of the Common Implementation Strategy guidance document number 8 on public participation (PP) aimed at identifying the conditions required for successful implementation. We propose that a central barrier to implementing A

  4. The limits of splitting: a framework to test model spatial distribution

    Science.gov (United States)

    Lobligeois, F.; Andréassian, V.; Perrin, C.; Loumagne, C.

    2012-04-01

    When it comes to deciding on the necessary spatial representation of a catchment, hydrologists need to choose between spatially lumped and spatially distributed approaches. This decision is not trivial: on the one hand, lumped models have proved both efficient and robust over the years (moreover, their relatively low number of parameters limits numerical problems such as secondary optima, parameter interaction, and poor sensitivity); on the other hand, many hydrologists believe that distributed models could potentially have a greater ability to take into account the spatial heterogeneity of both rainfall and the land surface. Few attempts have been made to rigorously test alternative distributed schemes (see the discussion of semi-lumped and semi-distributed alternatives in Andréassian et al. (2004)). The purpose of our work was to identify whether an optimum level of spatialisation exists: to investigate "the limits of splitting" (Beven, 1996). We propose a framework to evaluate the effect of the distribution over a large set of 181 French catchments, using a newly available high-resolution rainfall product of Météo France combining radar data and raingage measurements. Five grid sizes are studied, as catchments are split into 1, 2, 4, 8 and 16 sub-catchments and streamflow simulation results are analysed in validation mode. For each type of basin, we study the trend of model efficiency with the number of sub-catchments. We find paradoxical results: while some catchments clearly benefit from the distribution, others show opposite trends. The large variability between basins underlines the necessity of having enough case studies to reach a robust conclusion. Andréassian, V. et al., 2004. Impact of spatial aggregation of inputs and parameters on the efficiency of rainfall-runoff models: a theoretical study using chimera watersheds. Water Resour. Res., 40(5): W05209, doi: 10.1029/2003WR002854. Beven, K., 1996. The limits of splitting: hydrology. The Science of the

  5. Generic-distributed framework for cloud services marketplace based on unified ontology.

    Science.gov (United States)

    Hasan, Samer; Valli Kumari, V

    2017-11-01

    Cloud computing is a pattern for delivering ubiquitous and on-demand computing resources based on a pay-as-you-use financial model. Typically, cloud providers advertise cloud service descriptions in various formats on the Internet. On the other hand, cloud consumers use available search engines (Google and Yahoo) to explore cloud service descriptions and find the adequate service. Unfortunately, general-purpose search engines are not designed to provide a small and complete set of results, which makes the process a big challenge. This paper presents a generic distributed framework for a cloud services marketplace to automate the cloud service discovery and selection process and remove the barriers between service providers and consumers. Additionally, this work implements two instances of the generic framework by adopting two different matching algorithms, namely a dominant and recessive attributes algorithm borrowed from gene science and a semantic similarity algorithm based on a unified cloud service ontology. Finally, this paper presents a unified cloud services ontology and models real-life cloud services according to the proposed ontology. To the best of the authors' knowledge, this is the first attempt to build a cloud services marketplace where cloud providers and cloud consumers can trade cloud services as utilities. In comparison with existing work, the semantic approach reduced the execution time by 20% and maintained the same values for all other parameters. On the other hand, the dominant and recessive attributes approach reduced the execution time by 57% but showed a lower value for recall.
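    A simplified matcher in the spirit of the framework: hard ("dominant") requirements must match exactly, while soft ("recessive") preferences only add to a similarity score. The attribute split, service data and scoring rule are assumptions, not the paper's algorithm.

```python
services = [
    {"name": "s1", "region": "eu", "storage_gb": 500, "backup": True},
    {"name": "s2", "region": "us", "storage_gb": 1000, "backup": True},
    {"name": "s3", "region": "eu", "storage_gb": 200, "backup": False},
]

def match(services, dominant, recessive):
    ranked = []
    for s in services:
        if any(s.get(k) != v for k, v in dominant.items()):
            continue                      # hard requirement violated
        score = sum(1 for k, v in recessive.items() if s.get(k) == v)
        ranked.append((score, s["name"]))
    return [name for _, name in sorted(ranked, reverse=True)]

print(match(services, dominant={"region": "eu"}, recessive={"backup": True}))
# -> ['s1', 's3']
```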

  6. A policy-based multi-objective optimisation framework for residential distributed energy system design★

    Directory of Open Access Journals (Sweden)

    Wouters Carmen

    2017-01-01

    Full Text Available Distributed energy systems (DES) are increasingly being introduced as solutions to alleviate conventional energy system challenges related to energy security, climate change and increasing demands. From a technological and economic perspective, distributed energy resources are already becoming viable. The question still remains as to how these technologies and practices can be “best” selected, sized and integrated within consumer areas. To aid decision-makers and enable widespread DES adoption, a strategic superstructure design framework is therefore still required that ensures balancing of multiple stakeholder interests and fits in with liberalised energy system objectives of competition, security of supply and sustainability. Such a design framework is presented in this work. An optimisation-based approach for the design of neighbourhood-based DES is developed that enables meeting their yearly electricity, heating and cooling needs by appropriately selecting, sizing and locating technologies and energy interactions. A pool of poly-generation and storage technologies is considered, combined with local energy sharing between participating prosumers through thermal pipeline design and microgrid operation, and a bi-directional connection with the central distribution grid. A superstructure mixed-integer linear programming (MILP) approach is proposed to trade off three minimisation objectives in the design process: total annualised cost, annual CO2 emissions and electrical system unavailability, aligned with the three central energy system objectives. The developed model is applied to a small South Australian neighbourhood. The approach enables identifying “knee-point” neighbourhood energy system designs through Pareto trade-offs between objectives and serves to inform decision-makers about the impact of policy objectives on DES development strategies.
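    Only the trade-off step is sketched below: given candidate neighbourhood designs already evaluated on the three objectives (all to be minimised), keep the non-dominated ones, which is the set on which "knee-point" designs lie. The numbers are made up; in the paper the candidates come from the MILP superstructure model.

```python
designs = {
    "A": (1.20e6, 900, 0.0010),   # annualised cost, CO2 [t/y], unavailability [-]
    "B": (1.05e6, 1100, 0.0015),
    "C": (1.30e6, 850, 0.0009),
    "D": (1.25e6, 950, 0.0012),   # dominated by A
}

def dominates(a, b):
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

pareto = [n for n, obj in designs.items()
          if not any(dominates(o2, obj) for m, o2 in designs.items() if m != n)]
print("non-dominated designs:", pareto)   # ['A', 'B', 'C']
```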

  7. How far are we from full implementation of health promoting workplace concepts? A review of implementation tools and frameworks in workplace interventions.

    Science.gov (United States)

    Motalebi G, Masoud; Keshavarz Mohammadi, Nastaran; Kuhn, Karl; Ramezankhani, Ali; Azari, Mansour R

    2017-01-08

    Health promoting workplace frameworks provide a holistic view of the determinants of workplace health and the links between individuals, work and environment; however, the operationalization of these frameworks has not been very clear. This study provides a typology of the different understandings and frameworks/tools used in workplace health promotion practice and research worldwide. It discusses the degree of their conformity with the Ottawa Charter's spirit and the key actions expected to be implemented in health promoting settings such as workplaces. A comprehensive online search was conducted utilizing relevant key words. The search also included official websites of related international, regional, and national organizations. After exclusion, 27 texts were analysed utilizing conventional content analysis. The results of the analysis were categorized as dimensions (level or main structure) of a healthy or health promoting workplace and subcategorized as characteristics/criteria of a healthy/health promoting workplace. Our analysis shows diversity and ambiguity in the workplace health literature regarding the domains and characteristics of a healthy/health promoting workplace. This may have roots in the lack of a common understanding of the concepts or in different social and work environment contexts. Development of global or national health promoting workplace standards in a participatory process might be considered as a potential solution.

  8. IMPLEMENTATION OF MULTIAGENT REINFORCEMENT LEARNING MECHANISM FOR OPTIMAL ISLANDING OPERATION OF DISTRIBUTION NETWORK

    DEFF Research Database (Denmark)

    Saleem, Arshad; Lind, Morten

    2008-01-01

    The Electric Power system of Denmark exhibits some unique characteristics. An increasing part of the electricity is produced by local generators called distributed generators (DGs). Most of these DGs are connected to the network through the distribution system. This situation has created an incentive among electric power utilities to utilize modern information and communication technologies (ICT) in order to improve the automation of the distribution system. In this paper we present our work on the implementation of a dynamic multi-agent based distributed reinforcement learning mechanism for the islanding operation of the distribution system. The purpose of this system is to dynamically divide the distribution network into different sections (islands) in a fault scenario, when they are separated from the main utility system, and make them survive on local DGs.
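    A toy tabular Q-learning sketch of a single island agent deciding how much load to shed so that local DG output covers demand after separation from the main grid. The state and action discretisation, reward and learning parameters are illustrative assumptions, not the paper's multi-agent design.

```python
import random

random.seed(1)
states = range(-3, 4)              # discretised power imbalance (DG output - load)
actions = [-1, 0, +1]              # -1: reconnect a load block, 0: do nothing, +1: shed a load block
Q = {(s, a): 0.0 for s in states for a in actions}
alpha, gamma, eps = 0.3, 0.9, 0.1

def step(s, a):
    s2 = max(-3, min(3, s + a))            # shedding load raises the balance
    reward = 1.0 if s2 == 0 else -abs(s2)  # best reward when the island is balanced
    return s2, reward

for episode in range(2000):
    s = random.choice(list(states))
    for _ in range(10):
        a = random.choice(actions) if random.random() < eps else \
            max(actions, key=lambda x: Q[(s, x)])
        s2, r = step(s, a)
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, x)] for x in actions) - Q[(s, a)])
        s = s2

# learned policy: shed load (+1) when the imbalance is negative, reconnect (-1) when positive
print({s: max(actions, key=lambda a: Q[(s, a)]) for s in states})
```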

  9. A Design Based Research Framework for Implementing a Transnational Mobile and Blended Learning Solution

    Science.gov (United States)

    Palalas, Agnieszka; Berezin, Nicole; Gunawardena, Charlotte; Kramer, Gretchen

    2015-01-01

    The article proposes a modified Design-Based Research (DBR) framework which accommodates the various socio-cultural factors that emerged in the longitudinal PA-HELP research study at Central University College (CUC) in Ghana, Africa. A transnational team of stakeholders from Ghana, Canada, and the USA collaborated on the development,…

  10. Implementation of the international and regional human rights framework for the elimination of female genital mutilation

    NARCIS (Netherlands)

    Middelburg, M.J.; Desiderio, Rene

    2014-01-01

    A human rights approach to FGM places the practice within a broader social justice agenda — one that emphasizes the responsibilities of governments to ensure realization of the full spectrum of women’s and girls’ rights. In order to place FGM within a human rights framework, it is critical to know m

  11. Surveillance indicators and their use in implementation of the marine strategy framework directive

    NARCIS (Netherlands)

    Shephard, Samuel; Greenstreet, S.P.R.; Piet, G.J.; Rindorf, Anna; Dickey-Collas, Mark

    2015-01-01

    The European Union Marine Strategy Framework Directive (MSFD) uses indicators to track ecosystem state in relation to Good Environmental Status (GES). These indicators were initially expected to be "operational", i.e., to have well-understood relationships between state and specified anthropogenic

  13. Implementing a Standards Development Framework for the Coalition Battle Management Language

    Science.gov (United States)

    2013-06-01

    [Extraction residue from presentation slides; recoverable terms: Voice-to-Message (e.g., 9-Liners), Natural Language Translation Interfaces, Current Command & Staff Training, NATO COPD, the US Joint Intelligence Community/DoD Content Discovery and Retrieval (IC/DoD CDR) Model, and the C-BML Standard Development Framework.]

  15. Framework for Implementing Engineering Senior Design Capstone Courses and Design Clinics

    Science.gov (United States)

    Franchetti, Matthew; Hefzy, Mohamed Samir; Pourazady, Mehdi; Smallman, Christine

    2012-01-01

    Senior design capstone projects for engineering students are essential components of an undergraduate program that enhances communication, teamwork, and problem solving skills. Capstone projects with industry are well established in management, but not as heavily utilized in engineering. This paper outlines a general framework that can be used by…

  16. Grounding a new information technology implementation framework in behavioral science: a systematic analysis of the literature on IT use.

    Science.gov (United States)

    Kukafka, Rita; Johnson, Stephen B; Linfante, Allison; Allegrante, John P

    2003-06-01

    Many interventions to improve the success of information technology (IT) implementations are grounded in behavioral science, using theories, and models to identify conditions and determinants of successful use. However, each model in the IT literature has evolved to address specific theoretical problems of particular disciplinary concerns, and each model has been tested and has evolved using, in most cases, a more or less restricted set of IT implementation procedures. Functionally, this limits the perspective for taking into account the multiple factors at the individual, group, and organizational levels that influence use behavior. While a rich body of literature has emerged, employing prominent models such as the Technology Adoption Model, Social-Cognitive Theory, and Diffusion of Innovation Theory, the complexity of defining a suitable multi-level intervention has largely been overlooked. A gap exists between the implementation of IT and the integration of theories and models that can be utilized to develop multi-level approaches to identify factors that impede usage behavior. We present a novel framework that is intended to guide synthesis of more than one theoretical perspective for the purpose of planning multi-level interventions to enhance IT use. This integrative framework is adapted from PRECEDE/PROCEDE, a conceptual framework used by health planners in hundreds of published studies to direct interventions that account for the multiple determinants of behavior. Since we claim that the literature on IT use behavior does not now include a multi-level approach, we undertook a systematic literature analysis to confirm this assertion. Our framework facilitated organizing this literature synthesis and our analysis was aimed at determining if the IT implementation approaches in the published literature were characterized by an approach that considered at least two levels of IT usage determinants. We found that while 61% of studies mentioned or referred to

  17. Explaining willingness of public professionals to implement new policies: A policy alienation framework

    NARCIS (Netherlands)

    L.G. Tummers (Lars)

    2010-01-01

    textabstractNowadays, public professionals are often unwilling to implement new policies. We analyse this problem using an interdisciplinary approach, combining public administration and change management literature. From public administration, we use the policy alienation concept, consisting of fiv

  19. Implementing the Victory Access Control Framework in a Military Ground Vehicle

    Science.gov (United States)

    2015-08-01

    Protocol (SOAP) message body, but lacked the ability to encrypt individual XML elements within the SOAP body. Several attempts were made to augment the C...not cost-effective to implement the VACF components in C or C++. The complexity of the VACF message sets and the preponderance of existing open...availability of supporting software precludes the cost-effective implementation of VACF components in languages such as C/C++ • Unknown support

  20. Comparative assessment of surface fluxes from different sources: a framework based on probability distributions

    Science.gov (United States)

    Gulev, S.

    2015-12-01

    Surface turbulent heat fluxes from modern-era and first-generation reanalyses (NCEP-DOE, ERA-Interim, MERRA, NCEP-CFSR, JRA) as well as from satellite products (SEAFLUX, IFREMER, HOAPS) were intercompared using a framework of probability distributions for sensible and latent heat fluxes. For approximation of the probability distributions and estimation of extreme flux values, the Modified Fisher-Tippett (MFT) distribution has been used. Besides mean flux values, consideration is given to the comparative analysis of (i) the parameters of the MFT probability density functions (scale and location), (ii) extreme flux values corresponding to high-order percentiles of fluxes (e.g., the 99th and higher) and (iii) the fractional contribution of extreme surface flux events to the total surface turbulent fluxes integrated over months and seasons. The latter was estimated using both the fractional distribution derived from the MFT and empirical estimates based upon occurrence histograms. The strongest differences in the parameters of the probability distributions of surface fluxes and in extreme surface flux values between different reanalyses are found in the western boundary current extension regions and at high latitudes, while the largest differences in the fractional contributions of surface fluxes may occur in mid-ocean regions, being closely associated with atmospheric synoptic dynamics. Generally, satellite surface flux products demonstrate relatively stronger extreme fluxes compared to reanalyses, even in the Northern Hemisphere midlatitudes where the data assimilation input in reanalyses is quite dense compared to the Southern Ocean regions. Our assessment also discriminated different reanalyses and satellite products with respect to their ability to quantify the role of extreme surface turbulent fluxes in forming ocean heat release in different regions.
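    The assessment steps can be illustrated on synthetic fluxes using the ordinary Gumbel (Fisher-Tippett type I) distribution from SciPy as a stand-in for the modified Fisher-Tippett distribution used in the study: fit location and scale, read off a high percentile as the extreme flux value, and estimate the fractional contribution of events above it empirically.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
flux = stats.gumbel_r.rvs(loc=120.0, scale=45.0, size=5000, random_state=rng)  # W m-2

loc, scale = stats.gumbel_r.fit(flux)               # location / scale parameters
p99 = stats.gumbel_r.ppf(0.99, loc, scale)          # extreme flux value (99th percentile)

# fractional contribution of extreme events to the total (empirical estimate)
frac_extreme = flux[flux >= p99].sum() / flux.sum()

print(f"location={loc:.1f}  scale={scale:.1f}  99th percentile={p99:.1f} W m-2")
print(f"fraction of total flux carried by events above the 99th pct: {frac_extreme:.1%}")
```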

  1. Automated Energy Distribution and Reliability System: Validation Integration - Results of Future Architecture Implementation

    Energy Technology Data Exchange (ETDEWEB)

    Buche, D. L.

    2008-06-01

    This report describes Northern Indiana Public Service Co.'s project efforts to develop an automated energy distribution and reliability system. The purpose of this project was to implement a database-driven GIS solution that would manage all of the company's gas, electric, and landbase objects. This report is the second in a series of reports detailing this effort.

  2. Asynchronous Implementation of Distributed Coordination Algorithms : Conditions Using Partially Scrambling and Essentially Cyclic Matrices

    NARCIS (Netherlands)

    Chen, Yao; Xia, Weiguo; Cao, Ming; Lu, Jinhu

    2016-01-01

    Given a distributed coordination algorithm (DCA) for agents coupled by a network, which can be characterized by a stochastic matrix, we say that the DCA can be asynchronously implemented if the consensus property is preserved when the agents are activated to update their states according to their ow
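    A toy experiment in the spirit of the abstract: run a consensus-type DCA defined by a row-stochastic matrix both synchronously and with randomly activated agents, and check that each run still reaches agreement. The example matrix is arbitrary and is not claimed to satisfy the partially scrambling or essentially cyclic conditions studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
W = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.6, 0.2],
              [0.0, 0.4, 0.6]])            # row-stochastic update matrix
x_sync = x_async = rng.uniform(0, 10, 3)   # common random initial states

for _ in range(200):
    x_sync = W @ x_sync                    # synchronous: all agents update together
    i = rng.integers(3)                    # asynchronous: one randomly activated agent
    x_async = x_async.copy()
    x_async[i] = W[i] @ x_async

print("synchronous :", x_sync.round(4))    # agents agree on a common value
print("asynchronous:", x_async.round(4))   # agreement is preserved, though the limit may differ
```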

  3. A framework for multi-object tracking over distributed wireless camera networks

    Science.gov (United States)

    Gau, Victor; Hwang, Jenq-Neng

    2010-07-01

    In this paper, we propose a unified framework targeting two important issues in a distributed wireless camera network, i.e., object tracking and network communication, to achieve reliable multi-object tracking over distributed wireless camera networks. In the object tracking part, we propose a fully automated approach for tracking multiple objects across multiple cameras with overlapping and non-overlapping fields of view, without initial training. To effectively exchange tracking information among the distributed cameras, we propose an idle-probability-based broadcasting method, iPro, which adaptively adjusts the broadcast probability to improve broadcast effectiveness in a dense, saturated camera network. Experimental results for multi-object tracking demonstrate the promising performance of our approach on real video sequences for cameras with overlapping and non-overlapping views. The modeling and ns-2 simulation results show that iPro almost approaches the theoretical performance upper bound if cameras are within each other's transmission range. In more general scenarios, e.g., in case of hidden node problems, the simulation results show that iPro significantly outperforms standard IEEE 802.11, especially when the number of competing nodes increases.
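    A sketch of the adaptive-broadcast idea as described: each camera adjusts its probability of broadcasting tracking updates to how idle the shared channel appears, so a dense network does not saturate. The update rule, step size and target idle fraction below are assumptions, not the iPro specification.

```python
import random

class AdaptiveBroadcaster:
    def __init__(self, p=0.5, p_min=0.05, p_max=1.0, step=0.05):
        self.p, self.p_min, self.p_max, self.step = p, p_min, p_max, step

    def observe_channel(self, idle_fraction, target_idle=0.3):
        # More idle time than the target -> broadcast more aggressively, and vice versa.
        if idle_fraction > target_idle:
            self.p = min(self.p_max, self.p + self.step)
        else:
            self.p = max(self.p_min, self.p - self.step)

    def maybe_broadcast(self, send, message):
        if random.random() < self.p:
            send(message)

b = AdaptiveBroadcaster()
for idle in (0.6, 0.5, 0.2, 0.1):        # channel getting busier over time
    b.observe_channel(idle)
print(round(b.p, 2))                      # probability adapted back down to 0.5
```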

  4. Semantics-based distributed I/O with the ParaMEDIC framework.

    Energy Technology Data Exchange (ETDEWEB)

    Balaji, P.; Feng, W.; Lin, H.; Mathematics and Computer Science; Virginia Tech; North Carolina State Univ.

    2008-01-01

    Many large-scale applications simultaneously rely on multiple resources for efficient execution. For example, such applications may require both large compute and storage resources; however, very few supercomputing centers can provide large quantities of both. Thus, data generated at the compute site oftentimes has to be moved to a remote storage site for either storage or visualization and analysis. Clearly, this is not an efficient model, especially when the two sites are distributed over a wide-area network. Thus, we present a framework called 'ParaMEDIC: Parallel Metadata Environment for Distributed I/O and Computing' which uses application-specific semantic information to convert the generated data to orders-of-magnitude smaller metadata at the compute site, transfer the metadata to the storage site, and re-process the metadata at the storage site to regenerate the output. Specifically, ParaMEDIC trades a small amount of additional computation (in the form of data post-processing) for a potentially significant reduction in data that needs to be transferred in distributed environments.

  5. IMPLEMENTATION OF DISTRIBUTION CENTERS AS LOGISTICS COMPETITIVE ADVANTAGE: STUDY ON OIL COMPANY DISTRIBUTOR IN SOUTHEAST BRAZIL

    Directory of Open Access Journals (Sweden)

    Henrique Albernaz

    2014-12-01

    Full Text Available This study aims to show how the implementation of distribution centers in an organization can provide a competitive advantage over its competitors. The qualitative research was based on multiple case studies focused on a national distribution company in the lubricants segment. The intention was to introduce improvements by recognizing the importance of distribution centers (DCs) as a competitive advantage. The DCs in Macaé-RJ and Piracicaba-SP were chosen to represent this scenario. As a result, increased sales and operating leverage were found within the market in which the company operates.

  6. Hyper-Distributed Hyper-Parallel Implementation of Heuristic Search of Implicit AND/OR Graph

    Institute of Scientific and Technical Information of China (English)

    帅典勋

    1997-01-01

    This paper presents hierarchic chaotic cellular networks for the hardware implementation of hyper-distributed hyper-parallel intelligent problem solving based on competitive wave propagation. By using the bifurcation and synchronization of distributed chaotic dynamic systems, and by improving Chua's circuit, the mechanism and algorithms of heuristic search of an implicit AND/OR graph are realized in a hyper-distributed hyper-parallel environment. This paper's approach has many advantages in comparison with traditional systolic structures based on symbolic logic algorithms.

  7. On distribution reduction and algorithm implementation in inconsistent ordered information systems.

    Science.gov (United States)

    Zhang, Yanqin

    2014-01-01

    As one part of our work on ordered information systems, distribution reduction is studied in inconsistent ordered information systems (OISs). Some important properties of distribution reduction are studied and discussed. The dominance matrix is restated for reduction acquisition in dominance-relation based information systems. A matrix algorithm for distribution reduction acquisition is presented step by step, and a program implementing the algorithm is developed. The approach provides an effective tool for theoretical research on, and practical applications of, ordered information systems. For more detailed and valid illustration, cases are employed to explain and verify the algorithm and the program, which shows the effectiveness of the algorithm in complicated information systems.
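    The dominance matrix itself is easy to picture for a small ordered information system: entry (i, j) is 1 when object j is at least as good as object i on every criterion-type attribute. The sketch below computes only this matrix; the reduction computation built on top of it is not shown.

```python
import numpy as np

objects = np.array([
    [2, 3, 1],    # attribute values are ordinal, larger = better
    [1, 3, 1],
    [2, 2, 2],
    [3, 3, 2],
])

def dominance_matrix(values):
    n = len(values)
    return np.array([[int(np.all(values[j] >= values[i])) for j in range(n)]
                     for i in range(n)])

print(dominance_matrix(objects))
```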

  8. CubeSat Cloud: A framework for distributed storage, processing and communication of remote sensing data on cubesat clusters

    Science.gov (United States)

    Challa, Obulapathi Nayudu

    CubeSat Cloud is a novel vision for a space-based remote sensing network that includes a collection of small satellites (including CubeSats), ground stations, and a server, where a CubeSat is a miniaturized satellite with the volume of a 10x10x10 cm cube and a weight of approximately 1 kg. The small form factor of CubeSats limits their processing and communication capabilities. Implemented and deployed CubeSats have demonstrated about 1 GHz processing speed and 9.6 kbps communication speed. A CubeSat in its current state can take hours to process a 100 MB image and more than a day to downlink the same, which prohibits remote sensing, considering the limitations in ground station access time for a CubeSat. This dissertation designs an architecture and supporting networking protocols to create CubeSat Cloud, a distributed processing, storage and communication framework that will enable faster execution of remote sensing missions on CubeSat clusters. The core components of CubeSat Cloud are the CubeSat Distributed File System, CubeSat MapReduce, and CubeSat Torrent. The CubeSat Distributed File System has been created for distributing large amounts of data among the satellites in the cluster. Once the data is distributed, CubeSat MapReduce has been created to process the data in parallel, thereby reducing the processing load for each CubeSat. Finally, CubeSat Torrent has been created to downlink the data at each CubeSat to a distributed set of ground stations, enabling faster asynchronous downloads. Ground stations send the downlinked data to the server to reconstruct the original image and store it for later retrieval. Analysis of the proposed CubeSat Cloud architecture was performed using a custom-designed simulator, called CubeNet, and an emulation testbed using Raspberry Pi devices. Results show that for cluster sizes ranging from 5 to 25 small satellites, download speeds 4 to 22 times faster can be achieved when using CubeSat Cloud, compared to a
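    Only the distribution step is sketched here: split a large sensing product into chunks, spread them round-robin over the cluster, and note which ground station each satellite should downlink to. The chunk size, satellite names and station names are assumptions for illustration.

```python
def distribute(image_bytes, satellites, ground_stations, chunk_size=1_000_000):
    chunks = [image_bytes[i:i + chunk_size]
              for i in range(0, len(image_bytes), chunk_size)]
    placement = {}                                    # chunk index -> (satellite, station, size)
    for idx, chunk in enumerate(chunks):
        sat = satellites[idx % len(satellites)]
        station = ground_stations[idx % len(ground_stations)]
        placement[idx] = (sat, station, len(chunk))
    return placement

plan = distribute(bytes(100 * 1_000_000),             # a 100 MB image of zeros
                  [f"cubesat-{i}" for i in range(5)],
                  ["gs-alpha", "gs-beta", "gs-gamma"])
print(len(plan), "chunks;", plan[0])
```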

  9. Implementation of Electricity Business Competition Framework with Economic Dispatch Direct Method

    Directory of Open Access Journals (Sweden)

    Yusra Sabri

    2012-12-01

    Full Text Available Technically, an electricity business under a competition structure is more complex than a vertically integrated one. The main problems here are how to create an applicable competition framework and how to solve the electric calculations very quickly, obtaining optimal energy pricing, cost of losses, congestion and transportation costs in less than 15 minutes. This paper proposes a competition framework with the electric calculations, where a bilateral contract has been accommodated. The optimal energy price in the paper is calculated based on a direct method of economic dispatch to obtain the result very quickly. The proposed method has been simulated on a 4-bus system. The simulation results show that the method works well and complies with expectations. Therefore, an electric power business under a competition structure can be well realized by the proposed method.
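    Although the paper's direct method is not reproduced here, the classic closed-form equal-incremental-cost dispatch for quadratic generator costs (neglecting losses and limits) illustrates why such calculations can be completed very quickly; the cost coefficients and demand below are invented for illustration.

```python
# Minimal sketch of a "direct" (closed-form) economic dispatch for
# quadratic generator costs C_i(P) = a_i + b_i*P + c_i*P^2, neglecting
# network losses and generator limits.  The cost coefficients and the
# demand are illustrative values, not data from the paper.

def dispatch(gens, demand):
    """Equal incremental cost: lambda = (D + sum b/2c) / sum 1/2c."""
    num = demand + sum(b / (2 * c) for _, b, c in gens)
    den = sum(1.0 / (2 * c) for _, b, c in gens)
    lam = num / den
    return lam, [(lam - b) / (2 * c) for _, b, c in gens]

generators = [   # (a, b, c)
    (100.0, 7.0, 0.010),
    (120.0, 7.5, 0.012),
    (80.0,  8.0, 0.015),
]
lam, outputs = dispatch(generators, demand=800.0)
print(f"lambda = {lam:.3f} $/MWh, P = {[round(p, 1) for p in outputs]}")
assert abs(sum(outputs) - 800.0) < 1e-6    # dispatch meets demand
```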

  10. Implementing a pragmatic framework for authentic patient-researcher partnerships in clinical research.

    Science.gov (United States)

    Fagan, Maureen B; Morrison, Constance Rc; Wong, Celene; Carnie, Martha B; Gabbai-Saldate, Paulette

    2016-05-01

    In response to the creation of the Patient-Centered Outcomes Research Institute in 2010, researchers have begun to incorporate patient and family stakeholders into the research process as equal partners, bringing their unique perspectives and experiences to the table. Nonetheless, there is a dearth of literature around how best to engage patients and families and many barriers to doing so effectively. This paper outlines a pragmatic framework of collaborative engagement and partnership between research investigators and patient and family advisors from existing patient and family advisory councils (PFACs) at an academic medical center. This framework includes the role for each party throughout the clinical research process (launch, hypothesis, specific aims, measures/methods, results, interpretations/recommendation and dissemination).

  11. Implementing a framework for goal setting in community based stroke rehabilitation: a process evaluation

    OpenAIRE

    Scobbie, L.; McLean, D; Dixon, D; Duncan, E; Wyke, S

    2013-01-01

    Background: Goal setting is considered ‘best practice’ in stroke rehabilitation; however, there is no consensus regarding the key components of goal setting interventions or how they should be optimally delivered in practice. We developed a theory-based goal setting and action planning framework (G-AP) to guide goal setting practice. G-AP has 4 stages: goal negotiation; goal setting; action planning and coping planning; and appraisal and feedback. All stages are recorded in a patient-held reco...

  12. Microplastics in Seawater: Recommendations from the Marine Strategy Framework Directive Implementation Process

    OpenAIRE

    2016-01-01

    Microplastic litter is a pervasive pollutant present in marine systems across the globe. The legacy of microplastics pollution in the marine environment today may remain for years to come due to the persistence of these materials. Microplastics are emerging contaminants of potential concern and as yet there are few recognised approaches for monitoring. In 2008, the EU Marine Strategy Framework Directive (MSFD, 2008/56/EC) included microplastics as an aspect to be measured. Here we outline the...

  13. Implementation of Two-Dimensional Polycrystalline Grains in Object Oriented Micromagnetic Framework

    Directory of Open Access Journals (Sweden)

    Lau, J. W.

    2009-01-01

    Full Text Available In response to the growing need for a more accurate micromagnetic model to understand switching phenomena in nanoscale magnets, we developed the capability to simulate two-dimensional polycrystalline grains using the Object Oriented Micromagnetic Framework (OOMMF). This addition allows users full flexibility in determining the magnetocrystalline anisotropy and axes in each grain as well as the inter- and intragranular exchange coupling strength.

  14. Towards a global water scarcity risk assessment framework: using scenarios and risk distributions

    Science.gov (United States)

    Veldkamp, Ted; Wada, Yoshihide; Aerts, Jeroen; Ward, Philip

    2016-04-01

    Over the past decades, changing hydro-climatic and socioeconomic conditions have led to increased water scarcity problems. A large number of studies have shown that these water scarcity conditions will worsen in the near future. Despite numerous calls for risk-based assessments of water scarcity, a framework that includes UNISDR's definition of risk does not yet exist at the global scale. This study provides a first step towards such a risk-based assessment, applying a Gamma distribution to estimate water scarcity conditions at the global scale under historic and future conditions, using multiple climate change projections and socioeconomic scenarios. Our study highlights that water scarcity risk increases under all future scenarios, affecting up to >56.2% of the global population in 2080. Looking at the drivers of risk, we find that population growth outweighs the impacts of climate change at global and regional scales. Using a risk-based method to assess water scarcity in terms of Expected Annual Exposed Population, we show the results to be less sensitive than traditional water scarcity assessments to the use of a fixed threshold to represent different levels of water scarcity. This becomes especially important when moving from global to local scales, whereby deviations increase up to 50% of estimated risk levels. Covering hazard, exposure, and vulnerability, risk-based methods are well-suited to assess water scarcity adaptation. Completing the presented risk framework therefore offers water managers a promising perspective to increase water security in a well-informed and adaptive manner.
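    A minimal sketch of this kind of risk calculation is given below; the synthetic availability data, the gamma fit and the scarcity threshold are illustrative assumptions, not the study's data or parameterisation.

```python
# Sketch of a risk-based water scarcity calculation: fit a Gamma
# distribution to annual water availability per capita, then integrate
# exposure over the distribution instead of applying a single fixed
# threshold to one value.  The sample data, the 1000 m3/cap/yr scarcity
# level and the population figure are illustrative assumptions only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
availability = rng.gamma(shape=4.0, scale=450.0, size=200)   # m3/cap/yr, synthetic

shape, loc, scale = stats.gamma.fit(availability, floc=0.0)
population = 5e6
scarcity_threshold = 1000.0                                  # m3/cap/yr

# Probability of falling below the scarcity level in a given year ...
p_scarce = stats.gamma.cdf(scarcity_threshold, shape, loc=loc, scale=scale)
# ... gives an Expected Annual Exposed Population style indicator.
eaep = p_scarce * population
print(f"P(scarcity) = {p_scarce:.2f}, expected exposed population = {eaep:,.0f}")
```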

  15. Distributed blackboard decision-making framework for collaborative planning based on nested genetic algorithm

    Institute of Scientific and Technical Information of China (English)

    Yaozhong Zhang; Lei Zhang; Zhiqiang Du

    2015-01-01

    A distributed blackboard decision-making framework for collaborative planning based on a nested genetic algorithm (NGA) is proposed. By using a blackboard-based communication paradigm and a shared data structure, multiple decision-makers (DMs) can collaboratively solve tasks-platforms allocation scheduling problems dynamically through the coordinator. This methodology, combined with the NGA, maximizes task execution accuracy and also minimizes the weighted total workload of the DMs, which is measured in terms of intra-DM and inter-DM coordination. The intra-DM employs an optimization-based scheduling algorithm to match the tasks-platforms assignment request with its own platforms. The inter-DM coordinates the exchange of collaborative request information and platforms among DMs using the blackboard architecture. The numerical results show that the proposed blackboard DM framework based on the NGA can obtain a near-optimal solution for the tasks-platforms collaborative planning problem. The assignment of platforms to tasks and the patterns of coordination achieve a good trade-off between intra-DM and inter-DM coordination workload.

  16. Regulating the spatial distribution of metal nanoparticles within metal-organic frameworks to enhance catalytic efficiency

    Science.gov (United States)

    Yang, Qiu; Liu, Wenxian; Wang, Bingqing; Zhang, Weina; Zeng, Xiaoqiao; Zhang, Cong; Qin, Yongji; Sun, Xiaoming; Wu, Tianpin; Liu, Junfeng; Huo, Fengwei; Lu, Jun

    2017-01-01

    Composites incorporating metal nanoparticles (MNPs) within metal-organic frameworks (MOFs) have broad applications in many fields. However, the controlled spatial distribution of the MNPs within MOFs remains a challenge for addressing key issues in catalysis, for example, the efficiency of catalysts due to the limitation of molecular diffusion within MOF channels. Here we report a facile strategy that enables MNPs to be encapsulated into MOFs with controllable spatial localization by using metal oxide both as support to load MNPs and as a sacrificial template to grow MOFs. This strategy is versatile to a variety of MNPs and MOF crystals. By localizing the encapsulated MNPs closer to the surface of MOFs, the resultant MNPs@MOF composites not only exhibit effective selectivity derived from MOF cavities, but also enhanced catalytic activity due to the spatial regulation of MNPs as close as possible to the MOF surface. PMID:28195131

  17. Regulating the spatial distribution of metal nanoparticles within metal-organic frameworks to enhance catalytic efficiency

    Science.gov (United States)

    Yang, Qiu; Liu, Wenxian; Wang, Bingqing; Zhang, Weina; Zeng, Xiaoqiao; Zhang, Cong; Qin, Yongji; Sun, Xiaoming; Wu, Tianpin; Liu, Junfeng; Huo, Fengwei; Lu, Jun

    2017-02-01

    Composites incorporating metal nanoparticles (MNPs) within metal-organic frameworks (MOFs) have broad applications in many fields. However, the controlled spatial distribution of the MNPs within MOFs remains a challenge for addressing key issues in catalysis, for example, the efficiency of catalysts due to the limitation of molecular diffusion within MOF channels. Here we report a facile strategy that enables MNPs to be encapsulated into MOFs with controllable spatial localization by using metal oxide both as support to load MNPs and as a sacrificial template to grow MOFs. This strategy is versatile to a variety of MNPs and MOF crystals. By localizing the encapsulated MNPs closer to the surface of MOFs, the resultant MNPs@MOF composites not only exhibit effective selectivity derived from MOF cavities, but also enhanced catalytic activity due to the spatial regulation of MNPs as close as possible to the MOF surface.

  18. STATE SPACE GENERATION FRAMEWORK BASED ON BINARY DECISION DIAGRAM FOR DISTRIBUTED EXPLICIT MODEL CHECKING

    Directory of Open Access Journals (Sweden)

    Nacer Tabib

    2016-01-01

    Full Text Available This paper proposes a new framework based on Binary Decision Diagrams (BDDs) for the graph distribution problem in the context of explicit model checking. BDDs are already used to represent the state space in symbolic model checking. Thus, we take advantage of the high compression ratio of BDDs to encode not only the state space, but also the place where each state will be put. A fitness function that enables a good load balance of states over the nodes of a homogeneous network is used. Furthermore, a detailed explanation of how to calculate the inter-site edges between different nodes based on the adapted data structure is presented.
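    The two quantities such a distribution fitness function trades off, load balance and inter-site edges, can be sketched as follows; the toy state graph, hash-based placement and equal weighting are assumptions for illustration and do not reproduce the paper's BDD encoding.

```python
# Sketch of the two quantities a state-distribution fitness function
# balances: how evenly states are spread over the network nodes, and
# how many transitions become inter-site edges.  The toy state graph,
# the hash-based placement and the 0.5/0.5 weighting are assumptions
# for this example, not the paper's actual encoding or fitness.

transitions = [("s0", "s1"), ("s1", "s2"), ("s2", "s0"),
               ("s2", "s3"), ("s3", "s4"), ("s4", "s1")]
states = sorted({s for t in transitions for s in t})
nodes = 2

placement = {s: hash(s) % nodes for s in states}        # which node owns each state

loads = [sum(1 for s in states if placement[s] == n) for n in range(nodes)]
imbalance = max(loads) - min(loads)
cross_edges = sum(1 for u, v in transitions if placement[u] != placement[v])

fitness = 0.5 * imbalance + 0.5 * cross_edges            # lower is better
print(f"loads={loads}, inter-site edges={cross_edges}, fitness={fitness}")
```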

  19. A Practical Framework for Sharing and Rendering Real-World Bidirectional Scattering Distribution Functions

    Energy Technology Data Exchange (ETDEWEB)

    Ward, Greg [Anywhere Software, Albany, CA (United States); Kurt, Murat [International Computer Institute, Ege University (Turkey); Bonneel, Nicolas [Harvard Univ., Cambridge, MA (United States)

    2012-09-30

    The utilization of real-world materials has been hindered by a lack of standards for sharing and interpreting measured data. This paper presents an XML representation and an Open Source C library to support bidirectional scattering distribution functions (BSDFs) in data-driven lighting simulation and rendering applications. The library provides for the efficient representation, query, and Monte Carlo sampling of arbitrary BSDFs in a model-free framework. Currently, we support two BSDF data representations: one using a fixed subdivision of the hemisphere, and one with adaptive density. The fixed type has advantages for certain matrix operations, while the adaptive type can more accurately represent highly peaked data. We discuss advanced methods for data-driven BSDF rendering for both types, including the proxy of detailed geometry to enhance appearance and accuracy. We also present an advanced interpolation method to reduce measured data into these standard representations. We end with our plan for future extensions and sharing of BSDF data.
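    The model-free idea of importance-sampling a tabulated BSDF can be illustrated with a small discretized example; the patch table and inverse-CDF sampling below are assumptions for illustration and do not follow the library's actual XML schema or API.

```python
# Sketch of Monte Carlo sampling from a tabulated BSDF over a fixed
# subdivision of the hemisphere: build a discrete CDF weighted by
# patch value x solid angle, then draw outgoing patches by inverse
# transform sampling.  The 4-patch table and weights are invented for
# illustration; the library's fixed/adaptive data layouts differ.
import bisect, random

patches = [                 # (patch id, BSDF value [1/sr], solid angle [sr])
    ("up",      0.30, 1.5),
    ("grazing", 0.10, 1.5),
    ("peak",    2.00, 0.2),
    ("side",    0.05, 3.0),
]

weights = [value * omega for _, value, omega in patches]
total = sum(weights)
cdf, acc = [], 0.0
for w in weights:
    acc += w
    cdf.append(acc / total)

def sample_patch(u: float) -> str:
    """Inverse-CDF lookup: map a uniform random number to a patch id."""
    return patches[bisect.bisect_left(cdf, u)][0]

random.seed(1)
draws = [sample_patch(random.random()) for _ in range(10000)]
print({p: draws.count(p) / len(draws) for p, _, _ in patches})
```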

  20. Framework and Method for Controlling a Robotic System Using a Distributed Computer Network

    Science.gov (United States)

    Sanders, Adam M. (Inventor); Barajas, Leandro G. (Inventor); Permenter, Frank Noble (Inventor); Strawser, Philip A. (Inventor)

    2015-01-01

    A robotic system for performing an autonomous task includes a humanoid robot having a plurality of compliant robotic joints, actuators, and other integrated system devices that are controllable in response to control data from various control points, and having sensors for measuring feedback data at the control points. The system includes a multi-level distributed control framework (DCF) for controlling the integrated system components over multiple high-speed communication networks. The DCF has a plurality of first controllers each embedded in a respective one of the integrated system components, e.g., the robotic joints, a second controller coordinating the components via the first controllers, and a third controller for transmitting a signal commanding performance of the autonomous task to the second controller. The DCF virtually centralizes all of the control data and the feedback data in a single location to facilitate control of the robot across the multiple communication networks.

  1. Report on OCDE’s tax bases erosion and shifting benefits: origin and implementation within international and global framework

    Directory of Open Access Journals (Sweden)

    Fernando Serrano Antón

    2014-07-01

    Full Text Available This work analyzes the circumstances leading to the OECD report on tax base erosion and profit shifting. Inconsistency of tax systems and unilateralism in the current framework of economic globalization may have led to asymmetric tax situations, mostly exploited by multinational companies. The means and tools used and proposed by several international institutions to implement legally binding actions through soft law, and their acceptance by different countries as a method in the fight against tax avoidance and fraud, are also discussed.

  2. Implementation of Theeuropeanwater Framework Directive In France: New Challenges For River Basin Organisat Ion, Planning and Participation

    Science.gov (United States)

    Allain, S.

    The European Water Framework Directive (2000/60/EC) establishes a system of participatory river basin planning for national and international basins. The French institutional framework for water management is already very close to this system: the 1964 Water Law actually set up basin bodies, the Agences de l'Eau ("Water Agencies"), at the level of large river basins, and multipartite basin commissions, the Comités de Bassin ("River Basin Authorities"), in order to monitor the Agences de l'Eau's policies; besides, the 1992 Water Law created a planning procedure at this level, the Schéma Directeur d'Aménagement et de Gestion des Eaux (SDAGE : "General Water Management Plan"), aiming to determine general orientations for the management of water resources and having to be defined by the Comités de Bassin. At first glance therefore, the implementation of the European Water Framework Directive should not raise a lot of problems in France. However, a quick analysis of the current situation shows that it is not so obvious : if the French Water Policy set up two basin organisations, neither of them deals concretely with the management of the water resources, and the implementation of water management plans depends on many stakeholders; the SDAGE itself only partially meets the demands of the Directive, regarding e. g. the economic analysis; finally, in spite of the creation of multipartite basin commissions, the public participation is very restricted. Such an analysis leads to pay more attention to the relations to establish between organisation, planning and participation at the level of large river basins. An analysis of other elements of the French institutional framework can help us in this way : another planning procedure was actually created by the 1992 Water Law, the Schéma d'Aménagement et de Gestion des Eaux (SAGE : "Water Management Plan"), aiming to fix general objectives to manage the water resources at the level of small river basins, and having to be

  3. Design and Implementation of Key Techniques for Mobile Ad hoc Network Adaptive QoS Provisioning Framework

    Institute of Scientific and Technical Information of China (English)

    YAOYinxiong; LIUJianxun; TANGXinhuai

    2004-01-01

    MAQF is an adaptive QoS provisioning framework for mobile ad hoc networks (MANETs), newly proposed by the authors. By modifying the architecture of INSIGNIA and adding some components, MAQF overcomes many disadvantages of related works and supports QoS guarantees for MANETs. This paper focuses on the design and implementation of some key techniques in MAQF, including QoS routing, in-band signaling, an adaptive control mechanism, a dynamic resource adaptation algorithm, etc. Simulation results are presented and verify the validity of MAQF.

  4. Piloting a logic-based framework for understanding organisational change process for a health IT implementation.

    Science.gov (United States)

    Diment, Kieren; Garrety, Karin; Yu, Ping

    2011-01-01

    This paper describes how a method for evaluating organisational change based on the theory of logical types can be used for classifying organisational change processes to understand change after the implementation of an electronic documentation system in a residential aged care facility. In this instance we assess the organisational change reflected by care staff's perceptions of the benefits of the new documentation system at one site, at pre-implementation, and at 12 months post-implementation. The results show how a coherent view from the staff as a whole of the personal benefits, the benefits for others and the benefits for the organization create a situation of positive feedback leading to embeddedness of the documentation system into the site, and a broader appreciation of the potential capabilities of the electronic documentation system.

  5. Design and Implementation of Distributed Numerical Control in Flexible Manufacturing System

    Institute of Scientific and Technical Information of China (English)

    周炳海; 余传猛; 奚立峰; 曹永上

    2004-01-01

    To monitor, control and manage the work process of computer numerical control machine tools in a flexible manufacturing system (FMS) effectively, the distributed numerical control (distributed-NC) software should be redesigned with the characteristics of modularity and reconfigurability. In this paper, distributed-NC functions in the FMS environment are first described. Then we present a design and development method for real-time distributed-NC that is based on a reconfigurable software and hardware platform and an object-oriented model concept. Finally, to verify the proposed method, the distributed-NC software has been implemented in VC++ 6.0 and has been tested in connection with different physical flexible manufacturing shops.

  6. Governance Strengths and Weaknesses to implement the Marine Strategy Framework Directive in European Waters

    DEFF Research Database (Denmark)

    Freire-Gibb, L. Carlos; Koss, Rebecca; Piotr, Margonski

    2014-01-01

    addresses the Strengths, Weaknesses, Opportunities and Threats (SWOT) of the current European marine governance structures and their relationship to implementing the MSFD. Results of the SWOT analysis were acquired through a combination of approaches with MSFD experts and stakeholders including: 30 face-to-face interviews, an online survey with 264 stakeholder respondents and focus groups within each European marine region. The SWOT analysis concurrently identifies common strengths and weaknesses and key governance issues for implementing the MSFD for European marine regions. This paper forms one assessment within...

  7. Implementing the distributed consensus-based estimation of environmental variables in unattended wireless sensor networks

    Science.gov (United States)

    Contreras, Rodrigo; Restrepo, Silvia E.; Pezoa, Jorge E.

    2014-10-01

    In this paper, the prototype implementation of a scalable, distributed protocol for calculating the global average of sensed environmental variables in unattended wireless sensor networks (WSNs) is presented. The design and implementation of the protocol introduce a communication scheme for discovering the WSN topology. This scheme uses a synchronous flooding algorithm, which was implemented over an unreliable radiogram-based wireless channel. The topology discovery protocol has been synchronized with the sampling time of the WSN and must be executed before the consensus-based estimation of the global averages. An average consensus algorithm, suited for clustered WSNs with static topologies, was selected from the literature. The algorithm was properly modified so that its implementation guarantees that the convergence time is bounded and less than the sampling time of the WSN. Moreover, to implement the consensus algorithm, a reliable packet-passing protocol was designed to exchange the weighting factors among the sensor nodes. Since the amount of data exchanged in each packet is bounded by the degree of the WSN, the scalability of the protocol is guaranteed to be linear. The proposed protocol was implemented on the Sun SPOT hardware/software platform using the Java programming language. All the radio communications were implemented over the IEEE 802.15.4 standard, and the sensed environmental variables corresponded to temperature and luminosity.
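    The consensus step itself reduces to a weighted-neighbour update; the sketch below shows the generic average-consensus iteration on a small static topology (the ring topology, step size and iteration count are illustrative, not the Sun SPOT implementation).

```python
# Generic average-consensus iteration on a static topology:
#   x_i(k+1) = x_i(k) + eps * sum_{j in N(i)} (x_j(k) - x_i(k))
# The 4-node ring, step size and iteration count are illustrative
# assumptions, not the parameters of the implemented protocol.

neighbours = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}   # ring of 4 sensor nodes
x = [21.0, 23.5, 22.0, 25.5]                                # local temperature readings
eps = 0.25                  # step size; eps < 1/max_degree keeps the update stable

target = sum(x) / len(x)
for _ in range(50):
    x = [x[i] + eps * sum(x[j] - x[i] for j in neighbours[i]) for i in range(len(x))]

print([round(v, 3) for v in x], "target:", target)          # all nodes approach the mean
```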

  8. Examining the Quality of Technology Implementation in STEM Classrooms: Demonstration of an Evaluative Framework

    Science.gov (United States)

    Parker, Caroline E.; Stylinski, Cathlyn D.; Bonney, Christina R.; Schillaci, Rebecca; McAuliffe, Carla

    2015-01-01

    Technology applications aligned with science, technology, engineering, and math (STEM) workplace practices can engage students in real-world pursuits but also present dramatic challenges for classroom implementation. We examined the impact of teacher professional development focused on incorporating these workplace technologies in the classroom.…

  9. Explaining the willingness of public professionals to implement new policies: A policy alienation framework

    NARCIS (Netherlands)

    L.G. Tummers (Lars)

    2011-01-01

    Nowadays, many public policies focus on economic values, such as efficiency and client choice. Public professionals often show resistance to implementing such policies. We analyse this problem using an interdisciplinary approach. From public administration, we draw on the policy

  10. A framework for implementation of user-centric identity management systems

    DEFF Research Database (Denmark)

    Adjei, Joseph K.; Olesen, Henning

    2010-01-01

    Governments in many countries are implementing identity management systems (IdMS) to curtail these incidences and to offer citizens the power to exercise informational self-determination. Using concepts from technology adoption and fit-viability theories as well as the laws of identity, this paper analyzes...

  11. Implementing a Quality Management Framework in a Higher Education Organisation: A Case Study

    Science.gov (United States)

    O'Mahony, Kim; Garavan, Thomas N.

    2012-01-01

    Purpose: This paper aims to report and analyse the lessons learned from a case study on the implementation of a quality management system within an IT Division in a higher education (HE) organisation. Design/methodology/approach: The paper is based on a review of the relevant literatures and the use of primary sources such as document analysis,…

  12. Examining the Quality of Technology Implementation in STEM Classrooms: Demonstration of an Evaluative Framework

    Science.gov (United States)

    Parker, Caroline E.; Stylinski, Cathlyn D.; Bonney, Christina R.; Schillaci, Rebecca; McAuliffe, Carla

    2015-01-01

    Technology applications aligned with science, technology, engineering, and math (STEM) workplace practices can engage students in real-world pursuits but also present dramatic challenges for classroom implementation. We examined the impact of teacher professional development focused on incorporating these workplace technologies in the classroom.…

  13. A conceptual framework for outsourcing of materials handling activities in automotive : differentiation and implementation

    NARCIS (Netherlands)

    Klingenberg, W.; Boksma, J. D.

    2010-01-01

    This article discusses the outsourcing of materials handling activities and investigates different options for its implementation. The article uses descriptive case studies found in literature from the Western European automotive industry to map out differences in current practice and to evaluate

  14. A Framework for Institutional Adoption and Implementation of Blended Learning in Higher Education

    Science.gov (United States)

    Graham, Charles R.; Woodfield, Wendy; Harrison, J. Buckley

    2013-01-01

    There has been rapid growth in blended learning implementation and research focused on course-level issues such as improved learning outcomes, but very limited research focused on institutional policy and adoption issues. More institutional-level blended learning research is needed to guide institutions of higher education in strategically…

  15. Implementations of FroboMind using the Robot Operating System framework

    DEFF Research Database (Denmark)

    Nielsen, Søren Hundevadt; Bøgild, Anders; Jensen, Kjeld

    Conclusion: The work provides a highly domain-specific architecture in the form of the field robotic vehicle conceptual architecture FroboMind (Jensen et al. 2011). This architecture is currently, as a work in progress, being implemented in ROS to evaluate how well FroboMind maps into ROS. A prominent...

  16. Implementing Competency-Based Education: Challenges, Strategies, and a Decision-Making Framework

    Science.gov (United States)

    Dragoo, Amie; Barrows, Richard

    2016-01-01

    The number of competency-based education (CBE) degree programs has increased rapidly over the past five years, yet there is little research on CBE program development. This study utilized conceptual models of higher education change and a qualitative methodology to analyze the strategies and challenges in implementing CBE business degree programs…

  17. Implementing a Quality Management Framework in a Higher Education Organisation: A Case Study

    Science.gov (United States)

    O'Mahony, Kim; Garavan, Thomas N.

    2012-01-01

    Purpose: This paper aims to report and analyse the lessons learned from a case study on the implementation of a quality management system within an IT Division in a higher education (HE) organisation. Design/methodology/approach: The paper is based on a review of the relevant literatures and the use of primary sources such as document analysis,…

  18. Efficient implementation of superquadric particles in Discrete Element Method within an open-source framework

    Science.gov (United States)

    Podlozhnyuk, Alexander; Pirker, Stefan; Kloss, Christoph

    2016-09-01

    Particle shape representation is a fundamental problem in the Discrete Element Method (DEM). Spherical particles with well known contact force models remain popular in DEM due to their relative simplicity in terms of ease of implementation and low computational cost. However, in real applications particles are mostly non-spherical, and more sophisticated particle shape models, like superquadric shape, must be introduced in DEM. The superquadric shape can be considered as an extension of spherical or ellipsoidal particles and can be used for modeling of spheres, ellipsoids, cylinder-like and box(dice)-like particles just varying five shape parameters. In this study we present an efficient C++ implementation of superquadric particles within the open-source and parallel DEM package LIGGGHTS. To reduce computational time several ideas are employed. In the particle-particle contact detection routine we use the minimum bounding spheres and the oriented bounding boxes to reduce the number of potential contact pairs. For the particle-wall contact an accurate analytical solution was found. We present all necessary mathematics for the contact detection and contact force calculation. The superquadric DEM code implementation was verified on test cases such as angle of repose and hopper/silo discharge. The simulation results are in good agreement with experimental data and are presented in this paper. We show adequacy of the superquadric shape model and robustness of the implemented superquadric DEM code.
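    The broad-phase/narrow-phase split can be sketched with the standard superquadric inside-outside function; the shape parameters below are illustrative and this is not the LIGGGHTS C++ implementation.

```python
# Sketch of superquadric contact pre-checks: a conservative bounding
# sphere for the broad phase and the inside-outside function for a
# point-membership test.  The shape parameters (a, b, c, eps1, eps2)
# below are illustrative; this is not the LIGGGHTS implementation.
import math

def inside_outside(p, half_axes, eps1, eps2):
    """Barr superquadric function F(p): <0 inside, 0 on surface, >0 outside."""
    x, y, z = p
    a, b, c = half_axes
    xy = (abs(x / a) ** (2.0 / eps2) + abs(y / b) ** (2.0 / eps2)) ** (eps2 / eps1)
    return xy + abs(z / c) ** (2.0 / eps1) - 1.0

def bounding_radius(half_axes):
    """Conservative broad-phase radius (covers the box-like corner case)."""
    return math.sqrt(sum(h * h for h in half_axes))

def may_contact(c1, c2, half_axes1, half_axes2):
    """Broad phase: only particles whose bounding spheres overlap go on."""
    return math.dist(c1, c2) <= bounding_radius(half_axes1) + bounding_radius(half_axes2)

box_like = ((0.5, 0.5, 1.0), 0.2, 0.2)       # half-axes, eps1, eps2 (dice-like particle)
print(inside_outside((0.1, 0.1, 0.5), *box_like))                      # negative -> inside
print(may_contact((0.0, 0.0, 0.0), (2.0, 0.0, 0.0), box_like[0], box_like[0]))
```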

  19. A Framework for Institutional Adoption and Implementation of Blended Learning in Higher Education

    Science.gov (United States)

    Graham, Charles R.; Woodfield, Wendy; Harrison, J. Buckley

    2013-01-01

    There has been rapid growth in blended learning implementation and research focused on course-level issues such as improved learning outcomes, but very limited research focused on institutional policy and adoption issues. More institutional-level blended learning research is needed to guide institutions of higher education in strategically…

  20. Efficient implementation of superquadric particles in Discrete Element Method within an open-source framework

    Science.gov (United States)

    Podlozhnyuk, Alexander; Pirker, Stefan; Kloss, Christoph

    2017-01-01

    Particle shape representation is a fundamental problem in the Discrete Element Method (DEM). Spherical particles with well known contact force models remain popular in DEM due to their relative simplicity in terms of ease of implementation and low computational cost. However, in real applications particles are mostly non-spherical, and more sophisticated particle shape models, like superquadric shape, must be introduced in DEM. The superquadric shape can be considered as an extension of spherical or ellipsoidal particles and can be used for modeling of spheres, ellipsoids, cylinder-like and box(dice)-like particles just varying five shape parameters. In this study we present an efficient C++ implementation of superquadric particles within the open-source and parallel DEM package LIGGGHTS. To reduce computational time several ideas are employed. In the particle-particle contact detection routine we use the minimum bounding spheres and the oriented bounding boxes to reduce the number of potential contact pairs. For the particle-wall contact an accurate analytical solution was found. We present all necessary mathematics for the contact detection and contact force calculation. The superquadric DEM code implementation was verified on test cases such as angle of repose and hopper/silo discharge. The simulation results are in good agreement with experimental data and are presented in this paper. We show adequacy of the superquadric shape model and robustness of the implemented superquadric DEM code.

  1. The SOPHY framework

    DEFF Research Database (Denmark)

    Laursen, Karl Kaas; Pedersen, M. F.; Bendtsen, Jan Dimon

    2005-01-01

    , hybrid simulator is implemented to demonstrate the virtues of Sophy. The simulator is set up using subsystem models described in human readable XML combined with a composition structure allowing virtual interconnection of subsystems in a simulation scenario. The performance of the simulator has shown......The goal of the Sophy framework (Simulation, Observation and Planning in Hybrid Systems) is to implement a multi-level framework for description, simulation, observation, fault detection and recovery, diagnosis and autonomous planning in distributed embedded hybrid systems. A Java-based distributed...

  2. Running ATLAS workloads within massively parallel distributed applications using Athena Multi-Process framework (AthenaMP)

    Science.gov (United States)

    Calafiura, Paolo; Leggett, Charles; Seuster, Rolf; Tsulaia, Vakhtang; Van Gemmeren, Peter

    2015-12-01

    AthenaMP is a multi-process version of the ATLAS reconstruction, simulation and data analysis framework Athena. By leveraging Linux fork and copy-on-write mechanisms, it allows for sharing of memory pages between event processors running on the same compute node with little to no change in the application code. Originally targeted to optimize the memory footprint of reconstruction jobs, AthenaMP has demonstrated that it can reduce the memory usage of certain configurations of ATLAS production jobs by a factor of 2. AthenaMP has also evolved to become the parallel event-processing core of the recently developed ATLAS infrastructure for fine-grained event processing (Event Service) which allows the running of AthenaMP inside massively parallel distributed applications on hundreds of compute nodes simultaneously. We present the architecture of AthenaMP, various strategies implemented by AthenaMP for scheduling workload to worker processes (for example: Shared Event Queue and Shared Distributor of Event Tokens) and the usage of AthenaMP in the diversity of ATLAS event processing workloads on various computing resources: Grid, opportunistic resources and HPC.
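    The shared-event-queue strategy can be illustrated generically with forked workers pulling event indices from a common queue; this is an illustrative sketch of the pattern only, not AthenaMP or Athena code.

```python
# Generic sketch of a shared-event-queue scheduling pattern: the parent
# forks worker processes (copy-on-write on Linux) and each worker pulls
# event indices from one shared queue until it drains.  Illustrative
# only; this is not AthenaMP/Athena code.
import multiprocessing as mp

def worker(event_queue, results):
    while True:
        evt = event_queue.get()
        if evt is None:                        # sentinel: no more events
            break
        results.put((evt, mp.current_process().name))   # stand-in for event processing

if __name__ == "__main__":
    ctx = mp.get_context("fork")               # fork + copy-on-write, as on Linux
    n_events, n_workers = 20, 4
    events, results = ctx.Queue(), ctx.Queue()

    workers = [ctx.Process(target=worker, args=(events, results)) for _ in range(n_workers)]
    for p in workers:
        p.start()
    for evt in range(n_events):
        events.put(evt)
    for _ in workers:
        events.put(None)                       # one sentinel per worker
    processed = [results.get() for _ in range(n_events)]   # drain before joining
    for p in workers:
        p.join()

    print(f"{len(processed)} events processed by {n_workers} workers")
```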

  3. Running ATLAS workloads within massively parallel distributed applications using Athena Multi-Process framework (AthenaMP)

    CERN Document Server

    Calafiura, Paolo; The ATLAS collaboration; Seuster, Rolf; Tsulaia, Vakhtang; van Gemmeren, Peter

    2015-01-01

    AthenaMP is a multi-process version of the ATLAS reconstruction and data analysis framework Athena. By leveraging Linux fork and copy-on-write, it allows the sharing of memory pages between event processors running on the same compute node with little to no change in the application code. Originally targeted to optimize the memory footprint of reconstruction jobs, AthenaMP has demonstrated that it can reduce the memory usage of certain configurations of ATLAS production jobs by a factor of 2. AthenaMP has also evolved to become the parallel event-processing core of the recently developed ATLAS infrastructure for fine-grained event processing (Event Service), which allows running AthenaMP inside massively parallel distributed applications on hundreds of compute nodes simultaneously. We present the architecture of AthenaMP, various strategies implemented by AthenaMP for scheduling workload to worker processes (for example: Shared Event Queue and Shared Distributor of Event Tokens) and the usage of AthenaMP in the...

  4. Running ATLAS workloads within massively parallel distributed applications using Athena Multi-Process framework (AthenaMP)

    CERN Document Server

    Calafiura, Paolo; Seuster, Rolf; Tsulaia, Vakhtang; van Gemmeren, Peter

    2015-01-01

    AthenaMP is a multi-process version of the ATLAS reconstruction, simulation and data analysis framework Athena. By leveraging Linux fork and copy-on-write, it allows for sharing of memory pages between event processors running on the same compute node with little to no change in the application code. Originally targeted to optimize the memory footprint of reconstruction jobs, AthenaMP has demonstrated that it can reduce the memory usage of certain configurations of ATLAS production jobs by a factor of 2. AthenaMP has also evolved to become the parallel event-processing core of the recently developed ATLAS infrastructure for fine-grained event processing (Event Service), which allows running AthenaMP inside massively parallel distributed applications on hundreds of compute nodes simultaneously. We present the architecture of AthenaMP, various strategies implemented by AthenaMP for scheduling workload to worker processes (for example: Shared Event Queue and Shared Distributor of Event Tokens) and the usage of Ath...

  5. Implementation of the Master Plan Activities in Serayu River Voyage (SRV Within the Framework of Tourism Development in Banyumas Regency

    Directory of Open Access Journals (Sweden)

    Imam Pamungkas

    2015-02-01

    Full Text Available The Master Plan activities of the Serayu River Voyage (SRV) for tourism development in Banyumas Regency were expected to be completed within five years, from 2008 to 2012, but by 2013 most programs and activities had not been implemented. The results show that the Master Plan of the SRV in the framework of tourism development in Banyumas Regency has not been implemented properly. The causes are the absence of good coordination between agencies, the lack of integration of programs and activities, supporting documents that have not been revised, the absence of good socialization, and the lack of private sector contribution. The factors that constrain and support implementation of the Master Plan are described as follows. Supporting factors: competent human resources (implementers) are already available at the managerial level and have knowledge of tourism, so only additional personnel in the cultural sector are needed; an adequate budget is available; institutions are effective and efficient; community response is high; the Banyumas Regent shows high commitment and cooperates with related parties (stakeholders); and the natural conditions of the Serayu tend to be calm, with a small river slope. Constraining factors: regulatory policies, the integration of programs and activities, and coordination and socialization reveal sectoral egos that need to be addressed. Keywords: implementation, master plan, Serayu River Voyage, human resources, regulation

  6. Implementation of the World Health Organization Framework Convention on Tobacco Control in China: An arduous and long-term task.

    Science.gov (United States)

    Xiao, Dan; Bai, Chun-Xue; Chen, Zheng-Ming; Wang, Chen

    2015-09-01

    China is the largest producer and consumer of tobacco in the world. Consequently, the burden of tobacco-related diseases in China is enormous. Implementation of the World Health Organization Framework Convention on Tobacco Control (WHO FCTC) may lead to a significant reduction in tobacco-related morbidity and mortality both in China and globally. In this review, the authors summarize the epidemic of tobacco use and the progress made in implementing the WHO FCTC, including the promotion of legislation for smoke-free public places; smoking-cessation assistance; labeling of tobacco packaging; enforcement of bans on tobacco advertising, promotion, and sponsorship; increased taxes on tobacco products; increased tobacco prices; improvements in public awareness of the dangers of smoking; and identifying the barriers to implementing effective tobacco-control measures in China. Since the WHO FCTC officially took effect in China on January 9, 2006, China has taken some important steps, especially in promoting legislation for smoke-free public places. Because tobacco permeates the fabric of society, business, commerce, and politics in China, commitments and actions from the government are crucial, and implementing the WHO FCTC in China will be an arduous and long-term task.

  7. A theoretical framework for convergence and continuous dependence of estimates in inverse problems for distributed parameter systems

    Science.gov (United States)

    Banks, H. T.; Ito, K.

    1988-01-01

    Numerical techniques for parameter identification in distributed-parameter systems are developed analytically. A general convergence and stability framework (for continuous dependence on observations) is derived for first-order systems on the basis of (1) a weak formulation in terms of sesquilinear forms and (2) the resolvent convergence form of the Trotter-Kato approximation. The extension of this framework to second-order systems is considered.
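    In the notation typical of such frameworks, the first-order system is posed weakly and the parameter is recovered by output least squares; the schematic below uses generic placeholder spaces and operators rather than the paper's exact setting.

```latex
% Schematic weak formulation and output-least-squares estimate; V and H
% are generic placeholder spaces and C a generic observation operator.
\begin{align}
  \langle \dot{u}(t), v \rangle_{H} + \sigma(q)\bigl(u(t), v\bigr)
      &= \langle F(t), v \rangle_{H} \qquad \forall\, v \in V,\\
  \widehat{q}^{\,N} &= \arg\min_{q \in Q_{\mathrm{ad}}}
      \sum_{i=1}^{m} \bigl\lVert C\, u^{N}(t_i; q) - z_i \bigr\rVert^{2}.
\end{align}
```

    Convergence of the Galerkin estimates and continuous dependence on the observations are then tied to resolvent (Trotter-Kato) convergence of the approximating semigroups, as the abstract describes.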

  8. A secure and easy-to-implement web-based communication framework for caregiving robot teams

    Science.gov (United States)

    Tuna, G.; Daş, R.; Tuna, A.; Örenbaş, H.; Baykara, M.; Gülez, K.

    2016-03-01

    In recent years, robots have become more commonplace in our lives, from factory floors to museums, festivals and shows. They have started to change how we work and play. With the increase in the elderly population, they have also started to be used for caregiving services, and hence many countries have been investing in robot development. The advancements in robotics and wireless communications have led to the emergence of autonomous caregiving robot teams which cooperate to accomplish a set of tasks assigned by human operators. Although wireless communications and devices are flexible and convenient, they are vulnerable to many risks compared to traditional wired networks. Since robots with wireless communication capability transmit all data types, including sensory, coordination, and control data, through radio frequencies, they are open to intruders and attackers unless protected, and this openness may lead to many security issues such as data theft, passive listening, and service interruption. In this paper, a secure web-based communication framework is proposed to address potential security threats due to wireless communication in robot-robot and human-robot interaction. The proposed framework is simple and practical, and can be used by caregiving robot teams in the exchange of sensory data as well as coordination and control data.

  9. Guidelines for Implementing Advanced Distribution Management Systems-Requirements for DMS Integration with DERMS and Microgrids

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Jianhui [Argonne National Lab. (ANL), Argonne, IL (United States); Chen, Chen [Argonne National Lab. (ANL), Argonne, IL (United States); Lu, Xiaonan [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-08-01

    This guideline focuses on the integration of DMS with DERMS and microgrids connected to the distribution grid by defining generic and fundamental design and implementation principles and strategies. It starts by addressing the current status, objectives, and core functionalities of each system, and then discusses the new challenges and the common principles of DMS design and implementation for integration with DERMS and microgrids to realize enhanced grid operation reliability and quality power delivery to consumers while also achieving the maximum energy economics from the DER and microgrid connections.

  10. Pattern-Based Development of Enterprise Systems: from Conceptual Framework to Series of Implementations

    Directory of Open Access Journals (Sweden)

    Sergey V. Zykov

    2013-04-01

    Full Text Available Building enterprise software is a dramatic challenge due to data size, complexity and the rapid growth of both over time. The issue becomes even more dramatic when it comes to integrating heterogeneous applications. Therefore, a uniform approach is required that combines formal models and CASE tools. The methodology is based on extracting common ERP module-level patterns and applying them to a series of heterogeneous implementations. The approach includes a lifecycle model, which extends the conventional spiral model with formal data representation/management models and DSL-based "low-level" CASE tools supporting the formalisms. The methodology has been successfully implemented as a series of portal-based ERP systems in the ITERA oil-and-gas corporation, and in a number of trading/banking enterprise applications for other enterprises. A semantic network-based airline dispatch system and a 6D-model-driven nuclear power plant construction support system are currently in progress.

  11. Ecodesign maturity model: a management framework to support ecodesign implementation into manufacturing companies

    DEFF Research Database (Denmark)

    Pigosso, Daniela Cristina Antelmi; Rozenfeld, Henrique; McAloone, Tim C.

    2013-01-01

    Over the last few decades, ecodesign has emerged as a promising approach to integrate environmental concerns into the product development and related processes. Ecodesign aims to minimize environmental impacts throughout the product’s life cycle, without compromising other essential criteria such as performance and cost. Despite the potential benefits of ecodesign and the existence of several tools and techniques for product design, the actual application of ecodesign has not reached companies worldwide, mainly due to difficulties in ecodesign implementation and management. This paper introduces... and improvement projects to be applied, by adopting a continuous improvement approach for process improvement. The model is thus intended to support ecodesign managers in their deployment of strategic and tactical roadmaps for ecodesign implementation. The paper discusses the main concept of the model...

  12. Hierarchical multiscale framework for materials modeling: Equation of state implementation and application to a Taylor anvil impact test of RDX

    Science.gov (United States)

    Barnes, Brian C.; Spear, Carrie E.; Leiter, Ken W.; Becker, Richard; Knap, Jaroslaw; Lísal, Martin; Brennan, John K.

    2017-01-01

    In order to progress towards a materials-by-design capability, we present work on a challenge in continuum-scale modeling: the direct incorporation of complex physical processes in the constitutive evaluation. In this work, we use an adaptive scale-bridging computational framework executing in parallel in a heterogeneous computational environment to couple a fine-scale, particle-based model computing the equation of state (EOS) to the constitutive response in a finite-element multi-physics simulation. The EOS is obtained from high fidelity materials simulations performed via dissipative particle dynamics (DPD) methods. This scale-bridging framework is progress towards an innovation infrastructure that will be of great utility for systems in which essential aspects of material response are too complex to capture by closed form material models. The design, implementation, and performance of the scale-bridging framework are discussed. Also presented is a proof-of-concept Taylor anvil impact test of non-reacting 1,3,5-trinitrohexahydro-s-triazine (RDX).

  13. Implementing Extreme Programming in Distributed Software Project Teams: Strategies and Challenges

    Science.gov (United States)

    Maruping, Likoebe M.

    Agile software development methods and distributed forms of organizing teamwork are two team process innovations that are gaining prominence in today's demanding software development environment. Individually, each of these innovations has yielded gains in the practice of software development. Agile methods have enabled software project teams to meet the challenges of an ever turbulent business environment through enhanced flexibility and responsiveness to emergent customer needs. Distributed software project teams have enabled organizations to access highly specialized expertise across geographic locations. Although much progress has been made in understanding how to more effectively manage agile development teams and how to manage distributed software development teams, managers have little guidance on how to leverage these two potent innovations in combination. In this chapter, I outline some of the strategies and challenges associated with implementing agile methods in distributed software project teams. These are discussed in the context of a study of a large-scale software project in the United States that lasted four months.

  14. Canonical-Dissipative Nonequilibrium Energy Distributions: Parameter Estimation via Implicit Moment Method, Implementation and Application

    Science.gov (United States)

    Frank, T. D.; Kim, S.; Dotov, D. G.

    2013-11-01

    Canonical-dissipative nonequilibrium energy distributions play an important role in the life sciences. In one of the most fundamental forms, such energy distributions correspond to two-parametric normal distributions truncated to the left. We present an implicit moment method involving the first and second energy moments to estimate the distribution parameters. It is shown that the method is consistent with Cohen's 1949 formula. The implementation of the algorithm is discussed and the range of admissible parameter values is identified. In addition, an application to an earlier study on human oscillatory hand movements is presented. In this earlier study, energy was conceptualized as the energy of a Hamiltonian oscillator model. The canonical-dissipative approach allows for studying the systematic change of the model parameters with oscillation frequency. It is shown that the results obtained with the implicit moment method are consistent with those derived in the earlier study by other means.
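    A small numerical sketch of the implicit-moment idea for a normal distribution truncated at zero is given below; the synthetic data and the root-finding approach are illustrative choices and do not reproduce the paper's closed-form treatment via Cohen's formula.

```python
# Sketch of an implicit-moment fit for a normal distribution truncated
# to the left at zero: choose (mu, sigma) so the model's first and
# second moments match the sample moments.  Synthetic data and the
# fsolve-based root finding are illustrative choices, not the paper's
# closed-form (Cohen, 1949) treatment.
import numpy as np
from scipy import stats
from scipy.optimize import fsolve

true_mu, true_sigma = 1.2, 0.8
sample = stats.truncnorm.rvs(-true_mu / true_sigma, np.inf,
                             loc=true_mu, scale=true_sigma,
                             size=5000, random_state=12345)
m1, m2 = sample.mean(), np.mean(sample ** 2)       # first and second sample moments

def moment_equations(params):
    mu, sigma = params
    a = -mu / sigma                                # truncation point 0 in standard units
    mean = stats.truncnorm.mean(a, np.inf, loc=mu, scale=sigma)
    var = stats.truncnorm.var(a, np.inf, loc=mu, scale=sigma)
    return [mean - m1, (var + mean ** 2) - m2]     # implicit moment conditions

mu_hat, sigma_hat = fsolve(moment_equations, x0=[m1, sample.std()])
print(f"estimated mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}")
```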

  15. Guidelines for implementing medical operations in the Counterinsurgency (COIN) fight: a framework for engagement.

    Science.gov (United States)

    Hamid, Simon

    2011-01-01

    Several articles have been published over the last decade that describe the current role of medical operations (variously known as MEDCAPs, Medical Civic Action Programs; CMEs, Co-Operative Medical Engagements; etc.) in COIN and stability operations. Many of these articles focus on the experiences of healthcare and support personnel and their observations of inappropriately used U.S. Military healthcare resources. These medical assets were often used to provide fragmented and direct patient care to local populations, and these operations were conducted in a non-sustainable fashion. Most importantly, poorly organized efforts damage COIN efforts and alienate local populations. Effective medical operations must be nested within the larger realm of overall COIN actions. In this paper, a fundamental framework is presented to align medical operations within COIN missions.

  16. Home Education: Global Education in Manufacturing: Basic Framework, Industrial Survey and Possible Implementation

    Directory of Open Access Journals (Sweden)

    Asbjorn Rolstadas

    2006-09-01

    Full Text Available Many new challenges and opportunities have arisen for Slovenia since May 2004 when it became a full member of the EU. On the one hand we have some successful economic players who can definitely gain from new opportunities, on the other hand some structural changes still have to be accomplished. One of the most demanding tasks is related to higher education and in particular to harmonization of EU and global educational systems. The paper presents the results of the international framework for a Master degree curriculum in manufacturing strategy and an example of the integration of competence in technology and business. A good example of meeting Bologna goals is to establish a system of easily recognisable and comparable educational degrees and to accelerate the employment of EU citizens as well as the competitiveness of the European higher educational system.

  17. Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation

    Science.gov (United States)

    Afjeh, Abdollah A.; Reed, John A.

    2003-01-01

    The following reports are presented on this project: A first year progress report on: Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; A second year progress report on: Development of a Dynamically Configurable, Object-Oriented Framework for Distributed, Multi-modal Computational Aerospace Systems Simulation; An Extensible, Interchangeable and Sharable Database Model for Improving Multidisciplinary Aircraft Design; Interactive, Secure Web-enabled Aircraft Engine Simulation Using XML Databinding Integration; and Improving the Aircraft Design Process Using Web-based Modeling and Simulation.

  18. MODELING AND IMPLEMENTATION OF A DISTRIBUTED SHOP FLOOR MANAGEMENT AND CONTROL SYSTEM

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Adopting a distributed control architecture is an important development direction for shop floor management and control systems, and is also a requirement for making them agile, intelligent and concurrent. Some key problems in achieving a distributed control architecture are researched. An activity model of the shop floor is presented as the requirement definition of the prototype system. The multi-agent based software architecture is constructed, and it is shown how the core part of a shop floor management and control system, production planning and scheduling, is achieved. The cooperation of different agents is illustrated. Finally, the implementation of the prototype system is described.

  19. Application Research and Implementation of the CORBA-Based Web Distributed Network Management System

    Institute of Scientific and Technical Information of China (English)

    WANG Feng; SHI Bing-xin

    2004-01-01

    Distributed management has become an important development trend for network management systems (NMS) with the growth of the Internet. Based on an analysis of the CORBA (Common Object Request Broker Architecture) technique, we mainly discuss the applicability of the approach by which CORBA combined with Java has been applied to the system model and Web architecture, and address the application framework and the interface definitions that are the key technologies for implementing Distributed Object Computing (DOC). In addition, we also research its advantages and disadvantages and further expected improvements.

  20. Implementing sustainable drainage systems for urban surface water management within the regulatory framework in England and Wales.

    Science.gov (United States)

    Ellis, J Bryan; Lundy, Lian

    2016-12-01

    The UK 2007 floods resulted in damages estimated to exceed £4 billion. This triggered a national review of strategic flood risk management (Pitt, 2008), with its recommendations informing, and implemented by, the Flood and Water Management Act (FWMA, 2010). Estimating that up to two-thirds of the properties flooded in the 2007 event did so as a direct result of overloaded sewer systems, the FWMA set out an ambitious overhaul of flood risk management approaches, including identifying bodies responsible for the management of local flood risk (local municipalities) and the development of over-arching Lead Local Flood Authorities (LLFAs) at a regional level. LLFAs' duties include developing local flood risk management strategies and, aligned with this, many LLFAs and local municipalities produced sustainable drainage system (SUDS) guidance notes. In parallel, changes to the national planning policy framework (NPPF) in England give priority to the use of SUDS in new major developments, as does the related Town and Country Planning Order (2015). However, whilst all three pieces of legislation refer to the preferential use of SUDS, these requirements remain "economically proportionate" and thus the inclusion of SUDS within development controls remains a desirable - but not mandatory - obligation. Within this dynamic policy context, reignited most recently by the December 2015 floods, this paper examines some of the challenges to the implementation of SUDS in England and Wales posed by the new regulatory frameworks. In particular, it examines how emerging organisational procedures and processes are likely to impact on future SUDS implementation, and highlights the need for further cross-sectoral working to ensure that opportunities for cross-sectoral benefits, such as those accrued by reducing stormwater flows within combined sewer systems for water companies, property developers and environmental protection, are not lost.