WorldWideScience

Sample records for decentralized workflow scheduling

  1. A three-level atomicity model for decentralized workflow management systems

    Science.gov (United States)

    Ben-Shaul, Israel Z.; Heineman, George T.

    1996-12-01

    A workflow management system (WFMS) employs a workflow manager (WM) to execute and automate the various activities within a workflow. To protect the consistency of data, the WM encapsulates each activity with a transaction; a transaction manager (TM) then guarantees the atomicity of activities. Since workflows often group several activities together, the TM is responsible for guaranteeing the atomicity of these units. There are scalability issues, however, with centralized WFMSs. Decentralized WFMSs provide an architecture for multiple autonomous WFMSs to interoperate, thus accommodating multiple workflows and geographically dispersed teams. When atomic units are composed of activities spread across multiple WFMSs, however, there is a conflict between global atomicity and local autonomy of each WFMS. This paper describes a decentralized atomicity model that enables workflow administrators to specify the scope of multi-site atomicity based upon the desired semantics of multi-site tasks in the decentralized WFMS. We describe an architecture that realizes our model and execution paradigm.

  2. Requirements for Secure Logging of Decentralized Cross-Organizational Workflow Executions

    NARCIS (Netherlands)

    Wombacher, Andreas; Wieringa, Roelf J.; Jonker, Willem; Knezevic, P.; Pokraev, S.; Meersman, R.; Tari, Z.; Herrero, P.; Méndez, G.; Cavedon, L.; Martin, D.; Hinze, A.; Buchanan, G.

    2005-01-01

    The control of actions performed by parties involved in a decentralized cross-organizational workflow is done by several independent workflow engines. Due to the lack of centralized coordination control, auditing is required to support reliable and secure detection of malicious actions.

  3. Decentralized Ground Staff Scheduling

    DEFF Research Database (Denmark)

    Sørensen, M. D.; Clausen, Jens

    2002-01-01

    Decentralized ground staff scheduling is investigated. The airport terminal is divided into zones, where each zone consists of a set of stands geographically adjacent to each other. Staff are assigned to work in only one zone, and staff scheduling is planned decentrally for each zone. The advantage of this approach is that the staff … work in a smaller area of the terminal and thus spend less time walking between stands. When planning decentrally, the allocation of stands to flights influences the staff scheduling, since the workload in a zone depends on which flights are allocated to stands in that zone. Hence solving the problem … depends on the actual stand allocation but also on the number of zones and their layout. A mathematical model of the problem is proposed, which integrates the stand allocation and the staff scheduling. A heuristic solution method is developed and applied to a real case from British Airways, London …

  4. Schedule-Aware Workflow Management Systems

    Science.gov (United States)

    Mans, Ronny S.; Russell, Nick C.; van der Aalst, Wil M. P.; Moleman, Arnold J.; Bakker, Piet J. M.

    Contemporary workflow management systems offer work-items to users through specific work-lists. Users select the work-items they will perform without having a specific schedule in mind. However, in many environments work needs to be scheduled and performed at particular times. For example, in hospitals many work-items are linked to appointments, e.g., a doctor cannot perform surgery without reserving an operating theater and making sure that the patient is present. One of the problems when applying workflow technology in such domains is the lack of calendar-based scheduling support. In this paper, we present an approach that supports the seamless integration of unscheduled (flow) and scheduled (schedule) tasks. Using CPN Tools we have developed a specification and simulation model for schedule-aware workflow management systems. Based on this, a system has been realized that uses YAWL, Microsoft Exchange Server 2007, Outlook, and a dedicated scheduling service. The approach is illustrated using a real-life case study at the AMC hospital in the Netherlands. In addition, we elaborate on the experiences obtained when developing and implementing a system of this scale using formal techniques.

  5. Elastic Scheduling of Scientific Workflows under Deadline Constraints in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Nazia Anwar

    2018-01-01

    Scientific workflow applications are collections of several structured activities and fine-grained computational tasks. Scientific workflow scheduling in cloud computing is a challenging research topic due to its distinctive features. In cloud environments, it has become critical to perform efficient task scheduling that reduces scheduling overhead, minimizes cost and maximizes resource utilization while still meeting the user-specified overall deadline. This paper proposes a strategy, Dynamic Scheduling of Bag of Tasks based workflows (DSB), for scheduling scientific workflows with the aim to minimize the financial cost of leasing Virtual Machines (VMs) under a user-defined deadline constraint. The proposed model groups the workflow into Bags of Tasks (BoTs) based on data dependency and priority constraints and thereafter optimizes the allocation and scheduling of BoTs on elastic, heterogeneous and dynamically provisioned cloud resources (VMs) in order to attain the proposed method's objectives. The proposed approach considers pay-as-you-go Infrastructure as a Service (IaaS) clouds having inherent features such as elasticity, abundance, heterogeneity and VM provisioning delays. A trace-based simulation using benchmark scientific workflows representing real-world applications demonstrates a significant reduction in workflow computation cost while the workflow deadline is met. The results validate that the proposed model produces better success rates in meeting deadlines and better cost efficiencies in comparison to adapted state-of-the-art algorithms for similar problems.
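
    For illustration, a minimal Python sketch (not the authors' DSB implementation) of the two steps the abstract describes: grouping a DAG into bags of tasks by dependency level, then picking the cheapest VM type that keeps the makespan within the deadline. The VM catalogue, prices and runtimes are hypothetical.

      from collections import defaultdict

      # Hypothetical VM catalogue: (name, price per hour, speed factor).
      VM_TYPES = [("small", 0.10, 1.0), ("medium", 0.20, 2.0), ("large", 0.40, 4.0)]

      def bag_by_level(tasks, deps):
          """Group tasks into Bags of Tasks: tasks at the same DAG depth
          have no dependencies among themselves and form one bag."""
          level = {}
          def depth(t):
              if t not in level:
                  level[t] = 1 + max((depth(p) for p in deps.get(t, ())), default=-1)
              return level[t]
          bags = defaultdict(list)
          for t in tasks:
              bags[depth(t)].append(t)
          return [bags[k] for k in sorted(bags)]

      def schedule(tasks, deps, runtime, deadline):
          """Per bag, pick the cheapest VM type whose speed keeps the
          accumulated makespan within the deadline (each task in a bag
          gets its own VM, so a bag's span is its longest task)."""
          elapsed, cost, plan = 0.0, 0.0, []
          for bag in bag_by_level(tasks, deps):
              for name, price, speed in sorted(VM_TYPES, key=lambda v: v[1]):
                  span = max(runtime[t] for t in bag) / speed
                  if elapsed + span <= deadline:
                      break          # cheapest feasible type found
              elapsed += span
              cost += price * span * len(bag)
              plan.append((bag, name))
          return plan, elapsed, cost

      deps = {"t2": ["t1"], "t3": ["t1"], "t4": ["t2", "t3"]}
      runtime = {"t1": 2.0, "t2": 4.0, "t3": 3.0, "t4": 1.0}
      print(schedule(list(runtime), deps, runtime, deadline=8.0))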

  6. Workflow as a Service in the Cloud: Architecture and Scheduling Algorithms

    Science.gov (United States)

    Wang, Jianwu; Korambath, Prakashan; Altintas, Ilkay; Davis, Jim; Crawl, Daniel

    2017-01-01

    With more and more workflow systems adopting the cloud as their execution environment, it becomes increasingly challenging to efficiently manage various workflows, virtual machines (VMs) and workflow execution on VM instances. To make the system scalable and easy to extend, we design a Workflow as a Service (WFaaS) architecture with independent services. A core part of the architecture is how to efficiently respond to continuous workflow requests from users and schedule their executions in the cloud. Based on different targets, we propose four heuristic workflow scheduling algorithms for the WFaaS architecture, and analyze the differences and best usages of the algorithms in terms of performance, cost and the price/performance ratio via experimental studies. PMID:29399237

  7. Multi-Objective Approach for Energy-Aware Workflow Scheduling in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Sonia Yassa; Hubert Kadima; Bertrand Granado

    2013-01-01

    We address the problem of scheduling workflow applications on heterogeneous computing systems like cloud computing infrastructures. In general, cloud workflow scheduling is a complex optimization problem that requires considering different criteria so as to meet a large number of QoS (Quality of Service) requirements. Traditional research in workflow scheduling mainly focuses on optimization constrained by time or cost, without paying attention to energy consumption. The main contribution of this study is to propose a new approach for multi-objective workflow scheduling in clouds, and to present a hybrid PSO algorithm to optimize the scheduling performance. Our method is based on the Dynamic Voltage and Frequency Scaling (DVFS) technique to minimize energy consumption. This technique allows processors to operate at different voltage supply levels by sacrificing clock frequencies. This multiple-voltage operation involves a compromise between the quality of schedules and energy consumption. Simulation results on synthetic and real-world scientific applications highlight the robust performance of the proposed approach.
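
    The DVFS trade-off the abstract relies on can be seen in a few lines: dynamic power scales roughly with V^2 * f, so lower operating points save energy at the price of a longer schedule. A toy sketch with illustrative voltage/frequency pairs, not values from the paper:

      # Illustrative DVFS operating points: (supply voltage V, relative frequency f).
      DVFS_LEVELS = [(1.2, 1.00), (1.0, 0.80), (0.8, 0.60)]

      def energy_and_time(cycles, voltage, freq, capacitance=1.0):
          """Dynamic power ~ C * V^2 * f and runtime t = cycles / f, so
          dynamic energy ~ C * V^2 * cycles: lower V saves energy while
          lower f stretches the schedule."""
          t = cycles / freq
          energy = capacitance * voltage ** 2 * freq * t
          return energy, t

      for v, f in DVFS_LEVELS:
          e, t = energy_and_time(cycles=1e9, voltage=v, freq=f)
          print(f"V={v:.1f} f={f:.2f}: energy={e:.2e} a.u., time={t:.2e} a.u.")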

  8. An extended Intelligent Water Drops algorithm for workflow scheduling in cloud computing environment

    Directory of Open Access Journals (Sweden)

    Shaymaa Elsherbiny

    2018-03-01

    Cloud computing is emerging as a high-performance computing environment with a large-scale, heterogeneous collection of autonomous systems and a flexible computational architecture. Many resource management methods may enhance the efficiency of the whole cloud computing system. The key part of cloud computing resource management is resource scheduling. Optimized scheduling of tasks on the cloud virtual machines is an NP-hard problem, and many algorithms have been presented to solve it. The variations among these schedulers are due to the fact that their scheduling strategies are adapted to the changing environment and the types of tasks. The focus of this paper is on workflow scheduling in cloud computing, which has been gaining attention because workflows have emerged as a paradigm for representing complex computing problems. We propose a novel algorithm extending the nature-inspired Intelligent Water Drops (IWD) algorithm to optimize the scheduling of workflows on the cloud. The proposed algorithm is implemented and embedded within a workflow simulation toolkit and tested in different simulated cloud environments with different cost models. Our algorithm showed noticeable enhancements over the classical workflow scheduling algorithms. We compared the proposed IWD-based algorithm with other well-known scheduling algorithms, including Min-Min, Max-Min, Round Robin, FCFS, MCT, PSO and C-PSO; the proposed algorithm presented noticeable enhancements in performance and cost in most situations.
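
    For readers unfamiliar with IWD, a compact Python sketch of the classical update rules the paper extends (parameter values and the tiny two-edge example are illustrative, not the paper's): drops prefer and accelerate on low-soil edges, and erode soil where they travel.

      import random

      A_V, B_V, C_V = 1.0, 0.01, 1.0   # velocity-update constants (illustrative)
      A_S, B_S, C_S = 1.0, 0.01, 1.0   # soil-update constants
      RHO = 0.9                        # soil persistence on the chosen edge

      def pick_next(node, candidates, soil):
          """Edges carrying less soil get a higher selection probability."""
          weight = {j: 1.0 / (0.01 + max(soil[node, j], 0.0)) for j in candidates}
          r, acc = random.uniform(0, sum(weight.values())), 0.0
          for j in candidates:
              acc += weight[j]
              if acc >= r:
                  return j
          return candidates[-1]

      def iwd_step(node, nxt, velocity, soil, hud):
          """One IWD move: the drop speeds up on low-soil edges and erodes
          soil there; `hud` is a heuristic desirability (e.g. estimated
          task cost) used as the local travel time."""
          velocity += A_V / (B_V + C_V * soil[node, nxt] ** 2)
          time = hud[nxt] / max(velocity, 1e-9)
          dsoil = A_S / (B_S + C_S * time ** 2)
          soil[node, nxt] = RHO * soil[node, nxt] - (1 - RHO) * dsoil
          return velocity

      soil = {("s", j): 100.0 for j in ("a", "b")}
      hud = {"a": 2.0, "b": 5.0}
      step = pick_next("s", ["a", "b"], soil)
      print(step, iwd_step("s", step, velocity=4.0, soil=soil, hud=hud), soil)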

  9. Deadline-constrained workflow scheduling algorithms for Infrastructure as a Service Clouds

    NARCIS (Netherlands)

    Abrishami, S.; Naghibzadeh, M.; Epema, D.H.J.

    2013-01-01

    The advent of Cloud computing as a new model of service provisioning in distributed systems encourages researchers to investigate its benefits and drawbacks for executing scientific applications such as workflows. One of the most challenging problems in Clouds is workflow scheduling, i.e., the …

  10. Decentralized vs. centralized scheduling in wireless sensor networks for data fusion

    OpenAIRE

    Mitici, M.A.; Goseling, Jasper; de Graaf, Maurits; Boucherie, Richardus J.

    2014-01-01

    We consider the problem of data estimation in a wireless sensor network where sensors transmit their observations according to decentralized and centralized transmission schedules. A data collector is interested in achieving a data estimate using several sensor observations such that the variance of the estimate is below a targeted threshold. We analyze the waiting time for the collector to receive sufficient sensor observations. We show that, for sufficiently large sensor sets, the decentralized …

  11. The robust schedule - A link to improved workflow

    DEFF Research Database (Denmark)

    Lindhard, Søren; Wandahl, Søren

    2012-01-01

    … down the contractors and force them to rigorously adhere to the initial schedule. If delayed, the work pace or manpower has to be increased to keep to the schedule. In an attempt to improve productivity, three independent site managers have been interviewed about time scheduling. Their experiences and opinions have been … analyzed, and weaknesses in existing time scheduling have been found. The findings showed a negative side effect of keeping the schedule too tight. A too-tight schedule is inflexible and cannot absorb variability in production. Flexibility is necessary because the contractors are interacting and interdependent. … The result is a chaotic, complex and uncontrolled construction site. Furthermore, strict time limits entail that the workflow is optimized under non-optimal conditions. Even though productivity seems to be increasing, productivity per man-hour is decreasing, resulting in increased cost. To increase productivity …

  12. A Hybrid Task Graph Scheduler for High Performance Image Processing Workflows.

    Science.gov (United States)

    Blattner, Timothy; Keyrouz, Walid; Bhattacharyya, Shuvra S; Halem, Milton; Brady, Mary

    2017-12-01

    Designing applications for scalability is key to improving their performance in hybrid and cluster computing. Scheduling code to utilize parallelism is difficult, particularly when dealing with data dependencies, memory management, data motion, and processor occupancy. The Hybrid Task Graph Scheduler (HTGS) is an abstract execution model, framework, and API that increases programmer productivity when implementing hybrid workflows for multi-core and multi-GPU systems. HTGS manages dependencies between tasks, represents CPU and GPU memories independently, overlaps computations with disk I/O and memory transfers, keeps multiple GPUs occupied, and uses all available compute resources. Through these abstractions, data motion and memory are explicit; this makes data locality decisions more accessible. To demonstrate the HTGS application program interface (API), we present implementations of two example algorithms: (1) a matrix multiplication that shows how easily task graphs can be used; and (2) a hybrid implementation of microscopy image stitching that reduces code size by ≈43% compared to a manually coded hybrid workflow implementation and showcases the minimal overhead of task graphs in HTGS. Both of the HTGS-based implementations show good performance. In image stitching the HTGS implementation achieves similar performance to the hybrid workflow implementation. Matrix multiplication with HTGS achieves 1.3× and 1.8× speedup over the multi-threaded OpenBLAS library for 16k × 16k and 32k × 32k matrices, respectively.
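
    HTGS itself is a C++ API, but the core idea, tasks connected by queues whose stages overlap in time, can be sketched in a few lines of Python (a toy two-stage graph, not the HTGS implementation):

      import queue, threading

      def task(fn, q_in, q_out):
          """A pipeline task: consume items, apply fn, forward results.
          A None item is the poison pill that shuts the stage down."""
          def run():
              while (item := q_in.get()) is not None:
                  q_out.put(fn(item))
              q_out.put(None)
          t = threading.Thread(target=run)
          t.start()
          return t

      # Two-stage graph: square -> increment; the stages overlap in time,
      # the same overlap (compute vs. I/O) that HTGS exploits on CPU/GPU.
      q0, q1, q2 = queue.Queue(), queue.Queue(), queue.Queue()
      threads = [task(lambda x: x * x, q0, q1), task(lambda x: x + 1, q1, q2)]
      for i in range(5):
          q0.put(i)
      q0.put(None)
      while (r := q2.get()) is not None:
          print(r)
      for t in threads:
          t.join()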

  13. A Hybrid Metaheuristic for Multi-Objective Scientific Workflow Scheduling in a Cloud Environment

    Directory of Open Access Journals (Sweden)

    Nazia Anwar

    2018-03-01

    Cloud computing has emerged as a high-performance computing environment with a large pool of abstracted, virtualized, flexible, and on-demand resources and services. Scheduling of scientific workflows in a distributed environment is a well-known NP-complete problem and therefore intractable with exact solutions. It becomes even more challenging in the cloud computing platform due to its dynamic and heterogeneous nature. The aim of this study is to optimize multi-objective scheduling of scientific workflows in a cloud computing environment based on the proposed metaheuristic-based algorithm, Hybrid Bio-inspired Metaheuristic for Multi-objective Optimization (HBMMO). The strong global exploration ability of the nature-inspired metaheuristic Symbiotic Organisms Search (SOS) is enhanced by involving an efficient list-scheduling heuristic, Predict Earliest Finish Time (PEFT), in the proposed algorithm to obtain better convergence and diversity of the approximate Pareto front in terms of reduced makespan, minimized cost, and efficient load balance of the Virtual Machines (VMs). The experiments using different scientific workflow applications highlight the effectiveness, practicality, and better performance of the proposed algorithm.

  14. Decentralized Consistency Checking in Cross-organizational Workflows

    NARCIS (Netherlands)

    Wombacher, Andreas

    Service Oriented Architectures facilitate loosely coupled composed services, which are established in a decentralized way. One challenge for such composed services is to guarantee consistency, i.e., deadlock-freeness. This paper presents a decentralized approach to consistency checking, which …

  15. MaGate Simulator: A Simulation Environment for a Decentralized Grid Scheduler

    Science.gov (United States)

    Huang, Ye; Brocco, Amos; Courant, Michele; Hirsbrunner, Beat; Kuonen, Pierre

    This paper presents a simulator for a decentralized modular grid scheduler named MaGate. MaGate's design emphasizes scheduler interoperability by providing intelligent scheduling that serves the grid community as a whole. Each MaGate scheduler instance is able to deal with dynamic scheduling conditions, with continuously arriving grid jobs. Received jobs are either allocated on local resources, or delegated to other MaGates for remote execution. The proposed MaGate simulator is based on the GridSim toolkit and the Alea simulator, and abstracts the features and behaviors of complex fundamental grid elements, such as grid jobs, grid resources, and grid users. Simulation of scheduling tasks is supported by a grid network overlay simulator executing distributed ant-based swarm intelligence algorithms to provide services such as group communication and resource discovery. For evaluation, a comparison of the behaviors of different collaborative policies among a community of MaGates is provided. Results support the use of the proposed approach as a functional, ready-to-use grid scheduler simulator.

  1. A Chaotic Particle Swarm Optimization-Based Heuristic for Market-Oriented Task-Level Scheduling in Cloud Workflow Systems

    Directory of Open Access Journals (Sweden)

    Xuejun Li

    2015-01-01

    Cloud workflow systems are a kind of platform service based on cloud computing; they facilitate the automation of workflow applications. Compared with its counterparts, the market-oriented business model is one of the most prominent factors of a cloud workflow system. The optimization of task-level scheduling in cloud workflow systems is a hot topic. As the scheduling is an NP-hard problem, Ant Colony Optimization (ACO) and Particle Swarm Optimization (PSO) have been proposed to optimize the cost. However, they suffer from premature convergence in the optimization process and therefore cannot effectively reduce the cost. To solve these problems, a Chaotic Particle Swarm Optimization (CPSO) algorithm with a chaotic sequence and an adaptive inertia weight factor is applied to task-level scheduling. The chaotic sequence, with its high randomness, improves the diversity of solutions, and its regularity assures good global convergence. The adaptive inertia weight factor depends on the estimated cost; it makes the scheduling avoid premature convergence by properly balancing global and local exploration. The experimental simulation shows that the cost obtained by our scheduling is always lower than that of the two representative counterparts.
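
    The two ingredients named in the abstract are easy to sketch (illustrative Python, not the authors' code): a logistic-map chaotic sequence and an inertia weight that adapts to a particle's estimated cost.

      def chaotic_sequence(n, x0=0.37, mu=4.0):
          """Logistic map x' = mu*x*(1-x); at mu = 4 the map is chaotic,
          yielding high-randomness values in (0, 1) with good coverage."""
          xs, x = [], x0
          for _ in range(n):
              x = mu * x * (1 - x)
              xs.append(x)
          return xs

      def adaptive_inertia(cost, cost_min, cost_max, w_min=0.4, w_max=0.9):
          """Cheap (good) particles get small inertia, favouring local
          search; expensive ones get large inertia, favouring global
          exploration."""
          if cost_max == cost_min:
              return w_min
          return w_min + (w_max - w_min) * (cost - cost_min) / (cost_max - cost_min)

      print(chaotic_sequence(5))
      print(adaptive_inertia(cost=12.0, cost_min=10.0, cost_max=20.0))  # -> 0.5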

  2. Scheduling Multilevel Deadline-Constrained Scientific Workflows on Clouds Based on Cost Optimization

    Directory of Open Access Journals (Sweden)

    Maciej Malawski

    2015-01-01

    This paper presents a cost optimization model for scheduling scientific workflows on IaaS clouds such as Amazon EC2 or RackSpace. We assume multiple IaaS clouds with heterogeneous virtual machine instances, with a limited number of instances per cloud and hourly billing. Input and output data are stored on a cloud object store such as Amazon S3. Applications are scientific workflows modeled as DAGs, as in the Pegasus Workflow Management System. We assume that tasks in the workflows are grouped into levels of identical tasks. Our model is specified using mathematical programming languages (AMPL and CMPL) and allows us to minimize the cost of workflow execution under deadline constraints. We present results obtained using our model and benchmark workflows representing real scientific applications in a variety of domains. The data used for evaluation come from synthetic workflows, from general-purpose cloud benchmarks, and from data measured in our own experiments with Montage, an astronomical application, executed on the Amazon EC2 cloud. We indicate how this model can be used for scenarios that require resource planning for scientific workflows and their ensembles.
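
    A toy version of the underlying optimization (Python brute force instead of AMPL/CMPL, with hypothetical instance types, and levels simply run back to back) shows the shape of the model: hourly-billed instance choices per level, cost minimized, deadline as a constraint.

      from itertools import product
      from math import ceil

      # Hypothetical instance types: (name, price per hour, tasks per hour).
      TYPES = [("m1", 0.12, 10), ("m2", 0.24, 22), ("m3", 0.48, 40)]

      def plan(levels, deadline_hours):
          """Exhaustive search over one instance type per level (levels run
          sequentially here); hourly billing rounds partial hours up, as on
          EC2-style clouds. Returns the cheapest deadline-feasible choice."""
          best = None
          for choice in product(TYPES, repeat=len(levels)):
              hours = [ceil(n / rate) for n, (_, _, rate) in zip(levels, choice)]
              cost = round(sum(h * price for h, (_, price, _) in zip(hours, choice)), 2)
              if sum(hours) <= deadline_hours and (best is None or cost < best[0]):
                  best = (cost, [name for name, _, _ in choice], hours)
          return best

      # Three levels of identical tasks, as in the abstract's model.
      print(plan(levels=[120, 60, 200], deadline_hours=12))
      # -> (4.56, ['m3', 'm2', 'm3'], [3, 3, 5])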

  3. Flexible Data-Aware Scheduling for Workflows over an In-Memory Object Store

    Energy Technology Data Exchange (ETDEWEB)

    Duro, Francisco Rodrigo; Garcia Blas, Javier; Isaila, Florin; Wozniak, Justin M.; Carretero, Jesus; Ross, Rob

    2016-01-01

    This paper explores novel techniques for improving the performance of many-task workflows based on the Swift scripting language. We propose novel programmer options for automated distributed data placement and task scheduling. These options trigger a data placement mechanism used for distributing intermediate workflow data over the servers of Hercules, a distributed key-value store that can be used to cache file system data. We demonstrate that these new mechanisms can significantly improve the aggregated throughput of many-task workflows by up to 86x, reduce contention on the shared file system, exploit data locality, and trade off locality and load balance.

  4. Insightful Workflow For Grid Computing

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Charles Earl

    2008-10-09

    We developed a workflow adaptation and scheduling system for Grid workflows. The system currently interfaces with and uses the Karajan workflow system. We developed machine-learning agents that provide the planner/scheduler with the information needed to make decisions about when and how to replan. The Kubrick system restructures workflows at runtime, making it unique among workflow scheduling systems. The existing Kubrick system provides a platform on which to integrate additional quality-of-service constraints and in which to explore the use of an ensemble of scheduling and planning algorithms. This will be the principal thrust of our Phase II work.

  5. Low Latency Workflow Scheduling and an Application of Hyperspectral Brightness Temperatures

    Science.gov (United States)

    Nguyen, P. T.; Chapman, D. R.; Halem, M.

    2012-12-01

    New system analytics for Big Data computing holds the promise of major scientific breakthroughs and discoveries from the exploration and mining of the massive data sets becoming available to the science community. However, such data-intensive scientific applications face severe challenges in accessing, managing and analyzing petabytes of data. While the Hadoop MapReduce environment has been successfully applied to data-intensive problems arising in business, there are still many scientific problem domains where limitations in the functionality of MapReduce systems prevent its wide adoption by those communities. This is mainly because MapReduce does not readily support unique science-discipline needs such as special science data formats, graphic and computational data analysis tools, maintaining high degrees of computational accuracy, and interfacing with an application's existing components across heterogeneous computing processors. We address some of these limitations by exploiting the MapReduce programming model for satellite data-intensive scientific problems, and we address scalability, reliability, scheduling, and data management issues when dealing with climate data records and their complex observational challenges. In addition, we present techniques to support unique Earth science discipline needs such as dealing with special science data formats (HDF and NetCDF). We have developed a Hadoop task scheduling algorithm that improves latency by 2x for a scientific workflow including the gridding of the EOS AIRS hyperspectral Brightness Temperatures (BT). This workflow processing algorithm has been tested at the Multicore Computing Center's private Hadoop-based Intel Nehalem cluster, as well as in a virtual mode under the open-source Eucalyptus cloud. The 55 TB AIRS hyperspectral L1b Brightness Temperature record has been gridded at a resolution of 0.5 x 1.0 degrees, and we have computed a 0.9 annual anti-correlation to the El Niño Southern Oscillation in …
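
    The gridding step lends itself naturally to the map/reduce shape the abstract describes; a minimal Python sketch (illustrative, not the project's Hadoop code) maps observations to 0.5 x 1.0 degree cells and reduces each cell by averaging.

      from collections import defaultdict

      def cell(lat, lon, dlat=0.5, dlon=1.0):
          """Grid cell index on a 0.5 x 1.0 degree lat/lon grid, the
          resolution quoted for the AIRS brightness-temperature record."""
          return (int((lat + 90) // dlat), int((lon + 180) // dlon))

      def grid(observations):
          """Map observations to cells, then reduce each cell by averaging,
          the same shape as a Hadoop map/reduce gridding job."""
          acc = defaultdict(lambda: [0.0, 0])
          for lat, lon, bt in observations:                 # map phase
              s = acc[cell(lat, lon)]
              s[0] += bt
              s[1] += 1
          return {c: s / n for c, (s, n) in acc.items()}    # reduce phase

      obs = [(10.1, 20.2, 250.0), (10.3, 20.7, 252.0), (-5.0, 100.0, 280.0)]
      print(grid(obs))   # first two observations fall in the same cell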

  6. Distributed late-binding micro-scheduling and data caching for data-intensive workflows

    International Nuclear Information System (INIS)

    Delgado Peris, A.

    2015-01-01

    Today's world is flooded with vast amounts of digital information coming from innumerable sources. Moreover, it seems clear that this trend will only intensify in the future. Industry, society and, remarkably, science are not indifferent to this fact. On the contrary, they are struggling to get the most out of this data, which means that they need to capture, transfer, store and process it in a timely and efficient manner, using a wide range of computational resources. And this task is not always simple. A very representative example of the challenges posed by the management and processing of large quantities of data is that of the Large Hadron Collider experiments, which handle tens of petabytes of physics information every year. Based on the experience of one of these collaborations, we have studied the main issues involved in the management of huge volumes of data and in the completion of sizeable workflows that consume it. In this context, we have developed a general-purpose architecture for the scheduling and execution of workflows with heavy data requirements: the Task Queue. This new system builds on the late-binding overlay model, which has helped experiments successfully overcome the problems associated with the heterogeneity and complexity of large computational grids. Our proposal introduces several enhancements to existing systems. The execution agents of the Task Queue architecture share a Distributed Hash Table (DHT) and perform job matching and assignment cooperatively. In this way, the scalability problems of centralized matching algorithms are avoided and workflow execution times are improved. Scalability makes fine-grained micro-scheduling possible and enables new functionalities, like the implementation of a distributed data cache on the execution nodes and the integration of data location information in the scheduling decisions …
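
    A toy sketch of the DHT-flavoured, data-aware matching idea (illustrative Python; the actual Task Queue protocol is more involved): keys hash onto a ring of worker nodes, and a job is matched to the node that would hold its input data.

      import hashlib
      from bisect import bisect

      def h(key):
          return int(hashlib.sha1(key.encode()).hexdigest(), 16) % (2 ** 32)

      class Ring:
          """Toy DHT ring: data keys map to nodes; a job is matched to the
          node caching its input, echoing data-aware late binding."""
          def __init__(self, nodes):
              self.ring = sorted((h(n), n) for n in nodes)
          def node_for(self, key):
              keys = [k for k, _ in self.ring]
              return self.ring[bisect(keys, h(key)) % len(self.ring)][1]

      ring = Ring(["wn01", "wn02", "wn03"])
      for job, data in [("job-a", "/data/f1"), ("job-b", "/data/f2")]:
          print(job, "->", ring.node_for(data))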

  7. Optimization of workflow scheduling in Utility Management System with hierarchical neural network

    Directory of Open Access Journals (Sweden)

    Srdjan Vukmirovic

    2011-08-01

    Grid computing could be the future computing paradigm for enterprise applications, one of its benefits being that it can be used for executing large-scale applications. Utility Management Systems execute very large numbers of workflows with very high resource requirements. This paper proposes an architecture for a new scheduling mechanism that dynamically executes a scheduling algorithm using feedback about the current status of Grid nodes. Two Artificial Neural Networks were created in order to solve the scheduling problem. A case study is presented for the Meter Data Management system with measurements from the Smart Metering system for the city of Novi Sad, Serbia. Performance tests show that a significant improvement in overall execution time can be achieved with Hierarchical Artificial Neural Networks.

  8. Integration of services into workflow applications

    CERN Document Server

    Czarnul, Pawel

    2015-01-01

    Describing state-of-the-art solutions in distributed system architectures, Integration of Services into Workflow Applications presents a concise approach to the integration of loosely coupled services into workflow applications. It discusses key challenges related to the integration of distributed systems and proposes solutions, both in terms of theoretical aspects such as models and workflow scheduling algorithms, and technical solutions such as software tools and APIs. The book provides an in-depth look at workflow scheduling and proposes a way to integrate several different types of services.

  9. Decentralized Utilitarian Mechanisms for Scheduling Games

    NARCIS (Netherlands)

    Cole, R.; Correa, J.; Gkatzelis, V.; Mirrokni, V.; Olver, N.K.

    2015-01-01

    Game Theory and Mechanism Design are by now standard tools for studying and designing massive decentralized systems. Unfortunately, designing mechanisms that induce socially efficient outcomes often requires full information and prohibitively large computational resources. In this work we study …

  10. Task Balanced Workflow Scheduling Technique considering Task Processing Rate in Spot Market

    Directory of Open Access Journals (Sweden)

    Daeyong Jung

    2014-01-01

    Cloud computing is a recent computing paradigm that constitutes an advanced computing environment evolved from distributed computing, and it provides acquired computing resources in a pay-as-you-go manner. For example, Amazon EC2 offers Infrastructure-as-a-Service (IaaS) instances in three different ways, with different prices, reliability, and performance. Our study is based on an environment using spot instances. Spot instances can significantly decrease costs compared to reserved and on-demand instances. However, spot instances give a more unreliable environment than other instances. In this paper, we propose a workflow scheduling scheme that reduces the out-of-bid situation; consequently, the total task completion time is decreased. The simulation results reveal that, compared to various instance types, our scheme achieves performance improvements in terms of an average combined metric of 12.76% over a workflow scheme that does not consider the processing rate. However, the cost of our scheme is higher than that of an instance with low performance and lower than that of an instance with high performance.
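
    The spot-versus-on-demand trade-off in the abstract can be made concrete with a toy expected-cost model (illustrative numbers; no checkpointing is assumed, so an out-of-bid event re-runs the whole task):

      def expected_cost(task_hours, spot_price, ondemand_price, p_out_of_bid):
          """Toy model: an out-of-bid event loses the work in flight, so the
          task is re-run from scratch. With per-run failure probability p,
          the expected number of runs is 1 / (1 - p) (geometric retries)."""
          if p_out_of_bid >= 1.0:
              return float("inf"), ondemand_price * task_hours
          runs = 1.0 / (1.0 - p_out_of_bid)
          return spot_price * task_hours * runs, ondemand_price * task_hours

      spot, ondemand = expected_cost(task_hours=4, spot_price=0.03,
                                     ondemand_price=0.10, p_out_of_bid=0.3)
      print(f"spot expected: {spot:.3f}, on-demand: {ondemand:.3f}")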

  11. A performance study of grid workflow engines

    NARCIS (Netherlands)

    Stratan, C.; Iosup, A.; Epema, D.H.J.

    2008-01-01

    To benefit from grids, scientists require grid workflow engines that automatically manage the execution of inter-related jobs on the grid infrastructure. So far, the workflows community has focused on scheduling algorithms and on interface tools. Thus, while several grid workflow engines have been …

  12. Rabix: An Open-Source Workflow Executor Supporting Recomputability and Interoperability of Workflow Descriptions.

    Science.gov (United States)

    Kaushik, Gaurav; Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz

    2017-01-01

    As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, and optimizations to computation and job scheduling, and which allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions.

  13. Constructing Workflows from Script Applications

    Directory of Open Access Journals (Sweden)

    Mikołaj Baranowski

    2012-01-01

    For programming and executing complex applications on grid infrastructures, scientific workflows have been proposed as a convenient high-level alternative to solutions based on general-purpose programming languages, APIs and scripts. GridSpace is a collaborative programming and execution environment which is based on a scripting approach; it extends the Ruby language with a high-level API for invoking operations on remote resources. In this paper we describe a tool which converts GridSpace application source code into a workflow representation which, in turn, may be used for scheduling, provenance, or visualization. We describe how we addressed the issues of analyzing Ruby source code, resolving variable and method dependencies, and building the workflow representation. The solutions to these problems have been developed and evaluated by testing them on complex grid application workflows such as CyberShake, Epigenomics and Montage. The evaluation is enriched by representing typical workflow control-flow patterns.

  14. Comparison of Resource Platform Selection Approaches for Scientific Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Ramakrishnan, Lavanya

    2010-03-05

    Cloud computing is increasingly considered as an additional computational resource platform for scientific workflows. The cloud offers the opportunity to scale out applications from desktops and local cluster resources. At the same time, it can eliminate the challenges of restricted software environments and queue delays in shared high-performance computing environments. Choosing from these diverse resource platforms for a workflow execution poses a challenge for many scientists. Scientists are often faced with deciding resource platform selection trade-offs with limited information on the actual workflows. While many workflow planning methods have explored task scheduling onto different resources, these methods often require fine-scale characterization of the workflow that is onerous for a scientist. In this position paper, we describe our early exploratory work into using blackbox characteristics to do a cost-benefit analysis of using cloud platforms. We use only very limited high-level information on the workflow length, width, and data sizes. The length and width are indicative of the workflow duration and parallelism. The data size characterizes the IO requirements. We compare the effectiveness of this approach to other resource selection models using two exemplar scientific workflows scheduled on desktops, local clusters, HPC centers, and clouds. Early results suggest that the blackbox model often makes the same resource selections as a more fine-grained whitebox model. We believe the simplicity of the blackbox model can help inform a scientist on the applicability of cloud computing resources even before porting an existing workflow.
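
    A sketch of what such a blackbox estimate might look like (hypothetical platform parameters; only length, width and data size are consumed, as in the abstract):

      # Hypothetical platforms: (cores, cost per core-hour, IO MB/s, queue wait h).
      PLATFORMS = {
          "desktop": (4,    0.00,  80, 0.0),
          "cluster": (64,   0.00, 200, 1.0),
          "hpc":     (1024, 0.00, 800, 6.0),
          "cloud":   (256,  0.05, 150, 0.1),
      }

      def blackbox_estimate(length_h, width, data_mb):
          """Blackbox model: workflow length (critical-path hours), width
          (parallel tasks) and data size are the only inputs used."""
          out = {}
          for name, (cores, price, mbps, wait) in PLATFORMS.items():
              waves = -(-width // cores)            # ceil: waves of tasks
              runtime = wait + length_h * waves + data_mb / mbps / 3600
              cost = price * min(width, cores) * length_h * waves
              out[name] = (round(runtime, 2), round(cost, 2))
          return out

      for name, (t, c) in blackbox_estimate(length_h=2, width=300, data_mb=5000).items():
          print(f"{name:8s} est. hours={t:7.2f} est. cost=${c:.2f}")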

  15. Data analysis with the DIANA meta-scheduling approach

    International Nuclear Information System (INIS)

    Anjum, A; McClatchey, R; Willers, I

    2008-01-01

    The concepts, design and evaluation of the Data Intensive and Network Aware (DIANA) meta-scheduling approach for solving the challenges of data analysis being faced by CERN experiments are discussed in this paper. Our results suggest that data analysis can be made robust by employing fault-tolerant and decentralized meta-scheduling algorithms supported in our DIANA meta-scheduler. The DIANA meta-scheduler supports data-intensive bulk scheduling, is network aware, and follows a policy-centric meta-scheduling approach. In this paper, we demonstrate that a decentralized and dynamic meta-scheduling approach is an effective strategy to cope with increasing numbers of users, jobs and datasets. We present quality-of-service-related statistics for physics analysis through the application of a policy-centric fair-share scheduling model. The DIANA meta-schedulers create a peer-to-peer hierarchy of schedulers to accomplish resource management that changes with evolving loads, is dynamic, and adapts to the volatile nature of the resources.

  16. Decentralized Services Orchestration Using Intelligent Mobile Agents with Deadline Restrictions

    OpenAIRE

    Magalhães, Alex; Lung, Lau Cheuk; Rech, Luciana

    2010-01-01

    The necessity for better performance drives service orchestration towards decentralization. There is a recent approach where the integrator, which traditionally centralizes all corporate services and business logic, remains as a repository of interface services but no longer knows all business logic and business workflows. There are several techniques using this recent approach, including hybrid solutions, peer-to-peer solutions and trigger-based mechanisms. …

  17. Decentralizing the Team Station: Simulation before Reality as a Best-Practice Approach.

    Science.gov (United States)

    Charko, Jackie; Geertsen, Alice; O'Brien, Patrick; Rouse, Wendy; Shahid, Ammarah; Hardenne, Denise

    2016-01-01

    The purpose of this article is to share the logistical planning requirements and simulation experience of one Canadian hospital as it prepared its staff for the change from a centralized inpatient unit model to the decentralized design planned for its new community hospital. With the commitment and support of senior leadership, project management resources and clinical leads worked collaboratively to design a decentralized prototype in the form of a pod-style environment in the hospital's current setting. Critical success factors included engaging the right stakeholders, providing an opportunity to test new workflows and technology, creating a strong communication plan and building on lessons learned as subsequent pod prototypes are launched.

  18. Workflow Scheduling Using Hybrid GA-PSO Algorithm in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Ahmad M. Manasrah

    2018-01-01

    Cloud computing environments provide several on-demand services and resource sharing for clients. Business processes are managed using workflow technology over the cloud; using the resources efficiently is challenging due to the dependencies between tasks. In this paper, a hybrid GA-PSO algorithm is proposed to allocate tasks to the resources efficiently. The hybrid GA-PSO algorithm aims to reduce the makespan and the cost and to balance the load of the dependent tasks over the heterogeneous resources in cloud computing environments. The experimental results show that the GA-PSO algorithm decreases the total execution time of the workflow tasks in comparison with the GA, PSO, HSGA, WSGA, and MTCT algorithms. Furthermore, it reduces the execution cost. In addition, it improves the load balancing of the workflow application over the available resources. Finally, the obtained results also show that the proposed algorithm converges to optimal solutions faster and with higher quality compared to the other algorithms.
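
    A minimal sketch of the hybrid GA-PSO pattern on a toy continuous objective (illustrative, not the paper's scheduling encoding): PSO moves the swarm each iteration, and a GA step re-seeds the worse half by crossover and mutation.

      import random

      def fitness(x):                # toy surrogate for makespan/cost: minimize
          return sum((xi - 3) ** 2 for xi in x)

      def pso_move(x, v, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
          """Standard PSO velocity and position update."""
          v = [w * vi + c1 * random.random() * (p - xi) + c2 * random.random() * (g - xi)
               for xi, vi, p, g in zip(x, v, pbest, gbest)]
          return [xi + vi for xi, vi in zip(x, v)], v

      def ga_refresh(pop):
          """GA step on the worse half: one-point crossover of two good
          parents plus a small Gaussian mutation."""
          order = sorted(range(len(pop)), key=lambda i: fitness(pop[i]))
          half = len(pop) // 2
          for i in order[half:]:
              a, b = (pop[j] for j in random.sample(order[:half], 2))
              cut = random.randrange(1, len(a))
              child = a[:cut] + b[cut:]
              child[random.randrange(len(child))] += random.gauss(0, 0.1)
              pop[i] = child

      dim, n = 4, 12
      pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
      vel = [[0.0] * dim for _ in range(n)]
      pbest = [p[:] for p in pop]
      for _ in range(60):
          gbest = min(pbest, key=fitness)
          for i in range(n):
              pop[i], vel[i] = pso_move(pop[i], vel[i], pbest[i], gbest)
              if fitness(pop[i]) < fitness(pbest[i]):
                  pbest[i] = pop[i][:]
          ga_refresh(pop)
      print(round(fitness(min(pbest, key=fitness)), 6))   # ~0: near the optimum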

  19. Integrating prediction, provenance, and optimization into high energy workflows

    Energy Technology Data Exchange (ETDEWEB)

    Schram, M.; Bansal, V.; Friese, R. D.; Tallent, N. R.; Yin, J.; Barker, K. J.; Stephan, E.; Halappanavar, M.; Kerbyson, D. J.

    2017-10-01

    We propose a novel approach for efficient execution of workflows on distributed resources. The key components of this framework include: performance modeling to quantitatively predict workflow component behavior; optimization-based scheduling such as choosing an optimal subset of resources to meet demand and assignment of tasks to resources; distributed I/O optimizations such as prefetching; and provenance methods for collecting performance data. In preliminary results, these techniques improve throughput on a small Belle II workflow by 20%.

  20. A decentralized scheduling algorithm for time synchronized channel hopping

    Directory of Open Access Journals (Sweden)

    Andrew Tinka

    2011-09-01

    Time Synchronized Channel Hopping (TSCH) is an existing Medium Access Control scheme which enables robust communication through channel hopping and high data rates through synchronization. It is based on a time-slotted architecture, and its correct functioning depends on a schedule which is typically computed by a central node. This paper presents, to our knowledge, the first scheduling algorithm for TSCH networks which is both distributed and able to cope with mobile nodes. Two variations on the scheduling algorithm are presented. Aloha-based scheduling allocates one channel for broadcasting advertisements for new neighbors. Reservation-based scheduling augments Aloha-based scheduling with a dedicated timeslot for targeted advertisements based on gossip information. A mobile ad hoc motorized sensor network with frequent connectivity changes is studied, and the performance of the two proposed algorithms is assessed. This performance analysis uses both simulation results and the results of a field deployment of floating wireless sensors in an estuarial canal environment. Reservation-based scheduling performs significantly better than Aloha-based scheduling, suggesting that the improved network reactivity is worth the increased algorithmic complexity and resource consumption.
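
    A toy sketch of the slotframe bookkeeping behind such schedules (illustrative Python; the paper's distributed gossip and mobility handling are not modelled): links claim free (timeslot, channel) cells while one slot stays shared for advertisements.

      import random

      SLOTFRAME, CHANNELS = 31, 4      # illustrative TSCH matrix dimensions
      ADVERT_SLOT = 0                  # shared Aloha-style advertisement slot

      def build_schedule(links):
          """Each (src, dst) link claims a free (timeslot, channel) cell;
          timeslot 0 is left shared for Aloha-style neighbour
          advertisements."""
          schedule = {}                # (timeslot, channel) -> link
          for link in links:
              free = [(s, c) for s in range(1, SLOTFRAME)
                      for c in range(CHANNELS) if (s, c) not in schedule]
              schedule[random.choice(free)] = link
          return schedule

      print(build_schedule([("a", "b"), ("b", "c"), ("c", "a")]))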

  1. LQCD workflow execution framework: Models, provenance and fault-tolerance

    International Nuclear Information System (INIS)

    Piccoli, Luciano; Simone, James N; Kowalkowlski, James B; Dubey, Abhishek

    2010-01-01

    Large computing clusters used for scientific processing suffer from systemic failures when operated over long continuous periods for executing workflows. Diagnosing job problems and faults leading to eventual failures in this complex environment is difficult, specifically when the success of an entire workflow might be affected by a single job failure. In this paper, we introduce a model-based, hierarchical, reliable execution framework that encompasses workflow specification, data provenance, execution tracking and online monitoring of each workflow task, also referred to as participants. The sequence of participants is described in an abstract parameterized view, which is translated into a concrete data-dependency-based sequence of participants with defined arguments. As participants belonging to a workflow are mapped onto machines and executed, periodic and on-demand monitoring of vital health parameters on allocated nodes is enabled according to pre-specified rules. These rules specify conditions that must be true pre-execution, during execution and post-execution. Monitoring information for each participant is propagated upwards through the reflex and healing architecture, which consists of a hierarchical network of decentralized fault management entities, called reflex engines. They are instantiated as state machines or timed automata that change state and initiate reflexive mitigation action(s) upon the occurrence of certain faults. We describe how this cluster reliability framework is combined with the workflow execution framework using formal rules and actions specified within a structure of first-order predicate logic; this enables a dynamic management design that reduces manual administrative workload and increases cluster productivity.

  2. Load scheduling for decentralized CHP plants

    DEFF Research Database (Denmark)

    Nielsen, Henrik Aalborg; Madsen, Henrik; Nielsen, Torben Skov

    … (ii) an interactive decision support tool by which optimal schedules can be found given the forecasts or user-defined modifications of the forecasts, and (iii) an automatic on-line system for monitoring when conditions have changed so that rescheduling is appropriate. In this report the focus is on methods applicable … be obtained. Furthermore, we believe that all relevant forecasting methods are far too complicated to allow for this integration; both uncertainties originating from the dependence of heat load on climate and uncertainties from meteorological forecasts need to be taken into account. Instead we suggest that the decision … By letting the system find optimal schedules for each of these realizations, the operator can gain some insight into the importance of the uncertainties. It is shown that with modern personal computers (e.g. a 1 GHz Pentium III), operating systems (e.g. RedHat Linux 6.0), and compilers (e.g. GNU C 2 …

  3. Concurrent processes scheduling with scarce resources in small and medium enterprises

    Institute of Scientific and Technical Information of China (English)

    马嵩华

    2016-01-01

    Scarce resources, precedence, and non-deterministic time-lags are three constraints commonly found in small and medium manufacturing enterprises (SMEs), and they are deemed to block the application of workflow management systems (WfMS). To tackle this problem, a workflow scheduling approach is proposed based on timing workflow nets (TWF-nets) and a genetic algorithm (GA). The workflow is modeled in the form of a TWF-net in favor of process simulation and resource-conflict checking. After simplifying and reconstructing the set of workflow instances, the conflict resolution problem is transformed into a resource-constrained project scheduling problem (RCPSP), which can be efficiently solved by a heuristic method such as a GA. Finally, problems of various sizes are used to test the performance of the proposed algorithm and to compare it with a first-come-first-served (FCFS) strategy. The evaluation demonstrates that the proposed method is an effective and superior approach for scheduling concurrent processes with precedence and resource constraints.

  4. Consensus based scheduling of storage capacities in a virtual microgrid

    DEFF Research Database (Denmark)

    Brehm, Robert; Top, Søren; Mátéfi-Tempfli, Stefan

    2017-01-01

    We present a distributed, decentralized method for the coordinated scheduling of charge/discharge intervals of storage capacities in a utility-grid-integrated microgrid. The decentralized algorithm is based on a consensus scheme and solves an optimisation problem whose objective is to minimise, by use of storage capacities, the power flow over the transformer substation from/to the utility-grid-integrated microgrid. It is shown that when this coordinated scheduling algorithm is used, load-profile flattening (peak shaving) for the utility grid is achieved. Additionally, mutual charge …
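
    The consensus ingredient can be sketched in a few lines (illustrative Python, not the authors' algorithm): each storage unit repeatedly averages its planned power with its neighbours', converging to the flat profile the transformer would see.

      # Average-consensus sketch over a line topology of three storage units.
      NEIGHBOURS = {"s1": ["s2"], "s2": ["s1", "s3"], "s3": ["s2"]}
      power = {"s1": 9.0, "s2": 3.0, "s3": 0.0}     # kW, initial schedules

      def consensus_round(power, eps=0.3):
          """Standard consensus update x_i += eps * sum_j (x_j - x_i);
          it converges for eps below 1 / max_degree."""
          return {i: p + eps * sum(power[j] - p for j in NEIGHBOURS[i])
                  for i, p in power.items()}

      for _ in range(40):
          power = consensus_round(power)
      print({k: round(v, 3) for k, v in power.items()})   # all near the mean, 4.0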

  5. A Prudent Approach to Fair Use Workflow

    Directory of Open Access Journals (Sweden)

    Karey Patterson

    2018-02-01

    This poster will outline a new, highly efficient workflow for the management of copyright materials that is prudent and accommodates generally and legally accepted Fair Use limits. The workflow allows library or copyright staff to keep on top of their copyright obligations, manage licenses, and review and adjust schedules, while remaining a highly efficient means to cope with large numbers of requests to use materials. The poster details speed and efficiency gains for professors and library staff while reducing legal exposure.

  6. Large-Scale Compute-Intensive Analysis via a Combined In-situ and Co-scheduling Workflow Approach

    Energy Technology Data Exchange (ETDEWEB)

    Messer, Bronson [ORNL]; Sewell, Christopher [Los Alamos National Laboratory (LANL)]; Heitmann, Katrin [ORNL]; Finkel, Dr. Hal J. [Argonne National Laboratory (ANL)]; Fasel, Patricia [Los Alamos National Laboratory (LANL)]; Zagaris, George [Lawrence Livermore National Laboratory (LLNL)]; Pope, Adrian [Los Alamos National Laboratory (LANL)]; Habib, Salman [ORNL]; Parete-Koon, Suzanne T. [ORNL]

    2015-01-01

    Large-scale simulations can produce tens of terabytes of data per analysis cycle, complicating and limiting the efficiency of workflows. Traditionally, outputs are stored on the file system and analyzed in post-processing. With the rapidly increasing size and complexity of simulations, this approach faces an uncertain future. Trending techniques consist of performing the analysis in situ, utilizing the same resources as the simulation, and/or off-loading subsets of the data to a compute-intensive analysis system. We introduce an analysis framework developed for HACC, a cosmological N-body code, that uses both in situ and co-scheduling approaches for handling Petabyte-size outputs. An initial in situ step is used to reduce the amount of data to be analyzed, and to separate out the data-intensive tasks handled off-line. The analysis routines are implemented using the PISTON/VTK-m framework, allowing a single implementation of an algorithm that simultaneously targets a variety of GPU, multi-core, and many-core architectures.

  7. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    Rzehorz, Gerhard Ferdinand; The ATLAS collaboration

    2016-01-01

    This contribution reports on the feasibility of executing data-intensive workflows on Cloud infrastructures. In order to assess this, the metric ETC = Events/Time/Cost is formed, which quantifies the different workflow and infrastructure configurations that are tested against each other. In these tests, ATLAS reconstruction jobs are run, examining the effects of overcommitting (more parallel processes running than CPU cores available), scheduling (staggered execution) and scaling (number of cores). The desirability of commissioning storage in the Cloud is evaluated, in conjunction with a simple analytical model of the system, and correlated with questions about the network bandwidth, caches and what kind of storage to utilise. In the end, a cost/benefit evaluation of different infrastructure configurations and workflows is undertaken, with the goal of finding the maximum of the ETC value.

  8. Design of an autonomous decentralized MAC protocol for wireless sensor networks

    NARCIS (Netherlands)

    van Hoesel, L.F.W.; Dal Pont, L.; Havinga, Paul J.M.

    In this document the design of a MAC protocol for wireless sensor networks is discussed. The autonomous, decentralized, TDMA-based MAC protocol minimizes power consumption by efficiently implementing unicast/omnicast, scheduled rendezvous times and wakeup calls. The MAC protocol is ongoing research …

  9. Game-Based Virtual Worlds as Decentralized Virtual Activity Systems

    Science.gov (United States)

    Scacchi, Walt

    There is widespread interest in the development and use of decentralized systems and virtual world environments as possible new places for engaging in collaborative work activities. Similarly, there is widespread interest in stimulating new technological innovations that enable people to come together through social networking, file/media sharing, and networked multi-player computer game play. A decentralized virtual activity system (DVAS) is a networked computer supported work/play system whose elements and social activities can be both virtual and decentralized (Scacchi et al. 2008b). Massively multi-player online games (MMOGs) such as World of Warcraft and online virtual worlds such as Second Life are each popular examples of a DVAS. Furthermore, these systems are beginning to be used for research, development, and education activities in different science, technology, and engineering domains (Bainbridge 2007, Bohannon et al. 2009; Rieber 2005; Scacchi and Adams 2007; Shaffer 2006), which are also of interest here. This chapter explores two case studies of DVASs developed at the University of California at Irvine that employ game-based virtual worlds to support collaborative work/play activities in different settings. The settings include those that model and simulate practical or imaginative physical worlds in different domains of science, technology, or engineering through alternative virtual worlds where players/workers engage in different kinds of quests or quest-like workflows (Jakobsson 2006).

  11. Abstract flexibility description for virtual power plant scheduling

    OpenAIRE

    Fröhling, Judith

    2017-01-01

    In the ongoing paradigm shift of the energy market from big power plants to more and more small and decentralized power plants, virtual power plants (VPPs) play an important role. VPPs bundle the capacities of the small and decentralized resources (DER). Planning of VPP operation, also called scheduling, relies on the flexibilities of controllable DER in the VPP, e.g., combined heat and power plants (CHPs), heat pumps and batteries. The aim of this thesis is the development of an abstr...

  12. Application of Workflow Technology for Big Data Analysis Service

    Directory of Open Access Journals (Sweden)

    Bin Zhang

    2018-04-01

    Full Text Available This study presents a lightweight representational state transfer-based cloud workflow system to construct a big data intelligent software-as-a-service (SaaS) platform. The system supports the dynamic construction and operation of an intelligent data analysis application, and realizes rapid development and flexible deployment of the business analysis process that can improve the interaction and response time of the process. The proposed system integrates offline-batch and online-streaming analysis models that allow users to conduct batch and streaming computing simultaneously. Users can rent cloud capabilities and customize a set of big data analysis applications in the form of workflow processes. This study elucidates the architecture and application modeling, customization, dynamic construction, and scheduling of a cloud workflow system. A chain workflow foundation mechanism is proposed to combine several analysis components into a chain component that can promote efficiency. Four practical application cases are provided to verify the analysis capability of the system. Experimental results show that the proposed system can support multiple users in accessing the system concurrently and effectively uses data analysis algorithms. The proposed SaaS workflow system has been used in network operators and has achieved good results.
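
    The chain workflow mechanism described, fusing several analysis components into one chain component, amounts to function composition. A minimal sketch, assuming hypothetical analysis steps (none of these names come from the system itself):

        from functools import reduce

        def chain(components):
            """Fuse several analysis components into a single chain component,
            so intermediate results stay in-process instead of crossing the
            workflow engine between every step."""
            return lambda data: reduce(lambda acc, f: f(acc), components, data)

        # Hypothetical stand-ins for real big-data analysis components.
        clean     = lambda rows: [r for r in rows if r is not None]
        normalize = lambda rows: [r / max(rows) for r in rows]
        summarize = lambda rows: sum(rows) / len(rows)

        pipeline = chain([clean, normalize, summarize])
        print(pipeline([4.0, None, 2.0, 8.0]))  # one call instead of three hops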

  13. Multi-core processing and scheduling performance in CMS

    International Nuclear Information System (INIS)

    Hernández, J M; Evans, D; Foulkes, S

    2012-01-01

    Commodity hardware is going many-core. We might soon not be able to satisfy the job memory needs per core in the current single-core processing model in High Energy Physics. In addition, an ever increasing number of independent and incoherent jobs running on the same physical hardware not sharing resources might significantly affect processing performance. It will be essential to effectively utilize the multi-core architecture. CMS has incorporated support for multi-core processing in the event processing framework and the workload management system. Multi-core processing jobs share common data in memory, such as the code libraries, detector geometry and conditions data, resulting in a much lower memory usage than standard single-core independent jobs. Exploiting this new processing model requires a new model in computing resource allocation, departing from the standard single-core allocation for a job. The experiment job management system needs to have control over a larger quantum of resource since multi-core aware jobs require the scheduling of multiple cores simultaneously. CMS is exploring the approach of using whole nodes as the unit in the workload management system, where all cores of a node are allocated to a multi-core job. Whole-node scheduling allows for optimization of the data/workflow management (e.g. I/O caching, local merging) but efficient utilization of all scheduled cores is challenging. Dedicated whole-node queues have been set up at all Tier-1 centers for exploring multi-core processing workflows in CMS. We present the evaluation of the performance of scheduling and executing multi-core workflows in whole-node queues compared to the standard single-core processing workflows.
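
    The memory argument for multi-core jobs can be made concrete with a back-of-the-envelope model: N single-core jobs each carry the full shared footprint, whereas one whole-node multi-core job pays it once. The figures below are invented for illustration only:

        # Rough memory model for one node with N cores. Sizes are hypothetical.
        shared_mb  = 1500   # code libraries, detector geometry, conditions data
        private_mb = 500    # per-process event data
        cores      = 8

        single_core_total = cores * (shared_mb + private_mb)   # N independent jobs
        multi_core_total  = shared_mb + cores * private_mb     # one whole-node job

        print(f"{cores} single-core jobs: {single_core_total} MB")
        print(f"1 multi-core job  : {multi_core_total} MB")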

  14. PRACTICAL IMPLICATIONS OF LOCATION-BASED SCHEDULING

    DEFF Research Database (Denmark)

    Andersson, Niclas; Christensen, Knud

    2007-01-01

    The traditional method for planning, scheduling and controlling activities and resources in construction projects is the CPM-scheduling, which has been the predominant scheduling method since its introduction in the late 1950s. Over the years, CPM has proven to be a very powerful technique...... that will be used in this study. LBS is a scheduling method that rests upon the theories of line-of-balance and which uses the graphic representation of a flowline chart. As such, LBS is adapted for planning and management of workflows and, thus, may provide a solution to the identified shortcomings of CPM. Even...

  15. A framework for service enterprise workflow simulation with multi-agents cooperation

    Science.gov (United States)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Process dynamic modelling for service business is the key technique for Service-Oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach used to analyse service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service workflow-oriented framework for the process simulation of service businesses using multi-agent cooperation to address the above issues. Social rationality of agents is introduced into the proposed framework. Adopting rationality as one social factor for decision-making strategies, flexible scheduling of activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.

  16. Proportional green time scheduling for traffic lights

    NARCIS (Netherlands)

    P. Kovacs; Le, T. (Tung); R. Núñez Queija (Rudesindo); Vu, H. (Hai); N. Walton

    2016-01-01

    We consider the decentralized scheduling of a large number of urban traffic lights. We investigate factors determining system performance, in particular, the length of the traffic light cycle and the proportion of green time allocated to each junction. We study the effect of the length
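
    A proportional green-time rule of the kind studied here is simple to state in code: each approach receives a share of the usable cycle proportional to its demand. The queue lengths, cycle length and lost time below are invented placeholders:

        def green_times(queues, cycle_length, lost_time):
            """Split the usable green time of one cycle across approaches
            in proportion to their observed demand (queue lengths)."""
            usable = cycle_length - lost_time   # time not spent switching phases
            total = sum(queues) or 1.0          # guard against an empty junction
            return [usable * q / total for q in queues]

        # Hypothetical junction: three approaches, 90 s cycle, 12 s lost time.
        print(green_times([30, 10, 20], cycle_length=90.0, lost_time=12.0))
        # -> [39.0, 13.0, 26.0] seconds of green per approach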

  17. Distributed late-binding micro-scheduling and data caching for data-intensive workflows

    Energy Technology Data Exchange (ETDEWEB)

    Delgado Peris, A.

    2015-07-01

    Today's world is flooded with vast amounts of digital information coming from innumerable sources. Moreover, it seems clear that this trend will only intensify in the future. Industry, society and remarkably science are not indifferent to this fact. On the contrary, they are struggling to get the most out of this data, which means that they need to capture, transfer, store and process it in a timely and efficient manner, using a wide range of computational resources. And this task is not always simple. A very representative example of the challenges posed by the management and processing of large quantities of data is that of the Large Hadron Collider experiments, which handle tens of petabytes of physics information every year. Based on the experience of one of these collaborations, we have studied the main issues involved in the management of huge volumes of data and in the completion of sizeable workflows that consume it. In this context, we have developed a general-purpose architecture for the scheduling and execution of workflows with heavy data requirements: the Task Queue. This new system builds on the late-binding overlay model, which has helped experiments to successfully overcome the problems associated with the heterogeneity and complexity of large computational grids. Our proposal introduces several enhancements to the existing systems. The execution agents of the Task Queue architecture share a Distributed Hash Table (DHT) and perform job matching and assignment cooperatively. In this way, scalability problems of centralized matching algorithms are avoided and workflow execution times are improved. Scalability makes fine-grained micro-scheduling possible and enables new functionalities, like the implementation of a distributed data cache on the execution nodes and the integration of data location information in the scheduling decisions...(Author)
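
    Late binding means an execution agent claims a concrete task only at the moment it is ready to run, and cooperative matching lets agents prefer tasks whose input data they already hold. A minimal sketch, with a plain in-process queue standing in for the shared DHT and all task fields invented:

        import queue

        # A shared queue stands in for the DHT through which the Task Queue
        # agents cooperate; each task hints where its input data lives.
        tasks = queue.Queue()
        for t in [{"id": 1, "data_at": "nodeA"}, {"id": 2, "data_at": "nodeB"}]:
            tasks.put(t)

        def pilot(node, cache):
            """Execution agent: late-binds to a task at run time, preferring
            tasks whose input data is local or already cached."""
            pending, chosen = [], None
            while not tasks.empty():
                t = tasks.get()
                if chosen is None and (t["data_at"] == node or t["data_at"] in cache):
                    chosen = t
                else:
                    pending.append(t)
            if chosen is None and pending:   # nothing local: take any task
                chosen = pending.pop(0)
            for t in pending:                # return the rest for other agents
                tasks.put(t)
            return chosen

        print(pilot("nodeB", cache=set()))   # picks task 2: its data is local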

  18. Exploring Dental Providers' Workflow in an Electronic Dental Record Environment.

    Science.gov (United States)

    Schwei, Kelsey M; Cooper, Ryan; Mahnke, Andrea N; Ye, Zhan; Acharya, Amit

    2016-01-01

    A workflow is defined as a predefined set of work steps and partial ordering of these steps in any environment to achieve the expected outcome. Few studies have investigated the workflow of providers in a dental office. It is important to understand the interaction of dental providers with the existing technologies at point of care to assess breakdown in the workflow which could contribute to better technology designs. The study objective was to assess electronic dental record (EDR) workflows using time and motion methodology in order to identify breakdowns and opportunities for process improvement. A time and motion methodology was used to study the human-computer interaction and workflow of dental providers with an EDR in four dental centers at a large healthcare organization. A data collection tool was developed to capture the workflow of dental providers and staff while they interacted with an EDR during initial, planned, and emergency patient visits, and at the front desk. Qualitative and quantitative analysis was conducted on the observational data. Breakdowns in workflow were identified while posting charges, viewing radiographs, e-prescribing, and interacting with the patient scheduler. EDR interaction time was significantly different between dentists and dental assistants (6:20 min vs. 10:57 min, p = 0.013) and between dentists and dental hygienists (6:20 min vs. 9:36 min, p = 0.003). On average, a dentist spent far less time than dental assistants and dental hygienists in data recording within the EDR.

  19. Decentralization and mechanism design for online machine scheduling

    NARCIS (Netherlands)

    Arge, Lars; Heydenreich, Birgit; Müller, Rudolf; Freivalds, Rusins; Uetz, Marc Jochen

    We study the online version of the classical parallel machine scheduling problem to minimize the total weighted completion time from a new perspective: We assume that the data of each job, namely its release date $r_j$, its processing time $p_j$ and its weight $w_j$, are only known to the job itself,
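
    The objective named here, total weighted completion time $\sum_j w_j C_j$, is easy to state in code. The sketch below evaluates it for an arbitrary order and for the classical weighted-shortest-processing-time (WSPT) rule as a baseline; the job data is invented and release dates are ignored for brevity:

        # Total weighted completion time sum(w_j * C_j) on a single machine.
        # Job data is hypothetical; release dates r_j are omitted for brevity.
        jobs = [
            {"p": 3.0, "w": 1.0},
            {"p": 1.0, "w": 4.0},
            {"p": 2.0, "w": 2.0},
        ]

        def weighted_completion(order):
            t = total = 0.0
            for j in order:
                t += j["p"]            # C_j: the time at which job j completes
                total += j["w"] * t
            return total

        wspt = sorted(jobs, key=lambda j: j["p"] / j["w"])   # smallest p/w first
        print(weighted_completion(jobs))   # 31.0 in the given order
        print(weighted_completion(wspt))   # 16.0 under WSPT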

  20. Integration of the radiotherapy irradiation planning in the digital workflow

    International Nuclear Information System (INIS)

    Roehner, F.; Schmucker, M.; Henne, K.; Bruggmoser, G.; Grosu, A.L.; Frommhold, H.; Heinemann, F.E.; Momm, F.

    2013-01-01

    Background and purpose: At the Clinic of Radiotherapy at the University Hospital Freiburg, all relevant workflow is paperless. After implementing the Operating Schedule System (OSS) as a framework, all processes are being implemented into the departmental system MOSAIQ. Designing a digital workflow for radiotherapy irradiation planning is a major challenge: it requires interdisciplinary expertise, and therefore the interfaces between the professions also have to be interdisciplinary. For every single step of radiotherapy irradiation planning, distinct responsibilities have to be defined and documented. All aspects of digital storage, backup and long-term availability of data were considered and have already been realized during the OSS project. Method: After an analysis of the complete workflow and the statutory requirements, a detailed project plan was designed. In an interdisciplinary workgroup, problems were discussed and a detailed flowchart was developed. The new functionalities were implemented in a testing environment by the Clinical and Administrative IT Department (CAI). After extensive tests they were integrated into the new modular department system. Results and conclusion: The Clinic of Radiotherapy succeeded in realizing a completely digital workflow for radiotherapy irradiation planning. During the testing phase, our digital workflow was examined and afterwards was approved by the responsible authority. (orig.)

  1. Policy Implementation Decentralization Government in Indonesia

    Directory of Open Access Journals (Sweden)

    Kardin M. Simanjuntak

    2015-06-01

    Full Text Available Decentralization in Indonesia is a reform that has not been completed, and its implementation to date has not been maximized or successful. The essence of decentralization is 'internalising cost and benefit' for the people and bringing the government closer to the people; that is its most important essence. However, the implementation of decentralization in Indonesia is still far from these expectations: decentralization has benefited only elites and local authorities, has acted as a neo-liberal octopus, has produced public services lacking in character, has proceeded without institutional efficiency, has fostered corruption in the regions, and has remained only quasi-fiscal.

  2. Exploring Dental Providers’ Workflow in an Electronic Dental Record Environment

    Science.gov (United States)

    Schwei, Kelsey M; Cooper, Ryan; Mahnke, Andrea N.; Ye, Zhan

    2016-01-01

    Summary Background A workflow is defined as a predefined set of work steps and partial ordering of these steps in any environment to achieve the expected outcome. Few studies have investigated the workflow of providers in a dental office. It is important to understand the interaction of dental providers with the existing technologies at point of care to assess breakdown in the workflow which could contribute to better technology designs. Objective The study objective was to assess electronic dental record (EDR) workflows using time and motion methodology in order to identify breakdowns and opportunities for process improvement. Methods A time and motion methodology was used to study the human-computer interaction and workflow of dental providers with an EDR in four dental centers at a large healthcare organization. A data collection tool was developed to capture the workflow of dental providers and staff while they interacted with an EDR during initial, planned, and emergency patient visits, and at the front desk. Qualitative and quantitative analysis was conducted on the observational data. Results Breakdowns in workflow were identified while posting charges, viewing radiographs, e-prescribing, and interacting with the patient scheduler. EDR interaction time was significantly different between dentists and dental assistants (6:20 min vs. 10:57 min, p = 0.013) and between dentists and dental hygienists (6:20 min vs. 9:36 min, p = 0.003). Conclusions On average, a dentist spent far less time than dental assistants and dental hygienists in data recording within the EDR. PMID:27437058

  3. A Scheduling Algorithm for the Distributed Student Registration System in Transaction-Intensive Environment

    Science.gov (United States)

    Li, Wenhao

    2011-01-01

    Distributed workflow technology has been widely used in modern education and e-business systems. Distributed web applications have shown cross-domain and cooperative characteristics to meet the need of current distributed workflow applications. In this paper, the author proposes a dynamic and adaptive scheduling algorithm PCSA (Pre-Calculated…

  4. A market approach to decentralized control of a manufacturing cell

    International Nuclear Information System (INIS)

    Shao Xinyu; Ma Li; Guan Zailin

    2009-01-01

    Based on a fictitious market model, a decentralized approach is presented for workstation scheduling in a CNC workshop. A multi-agent framework is proposed, where job agents and resource agents act as buyers and sellers of resources in the virtual market. With cost and benefit calculations of these agent activities, which reflect the state of the production environment, the various and often conflicting goals and interests influencing the scheduling process in practice can be balanced through a unified instrument offered by the markets. The paper first introduces a heuristic procedure that makes scheduling reservations in a periodic manner. A multi-agent framework is then introduced, in which job agents and resource agents seek appropriate job-workstation matches through bidding in the construction of the above periodic 'micro-schedules'. A pricing policy is proposed for the price-directed coordination of agent activities in this framework. Simulation results demonstrate the feasibility of the proposed approach and give some insights on the effects of some decision-making parameters. Future work will focus on designing more sophisticated coordination mechanisms and their deployment.

  5. A market approach to decentralized control of a manufacturing cell

    Energy Technology Data Exchange (ETDEWEB)

    Shao Xinyu [State Key Lab of Digital Manufacturing and Equipments, Huazhong University of Science and Technology, Wuhan 430074, Hubei (China)], E-mail: shaoxy@hust.edu.cn; Ma Li [State Key Lab of Digital Manufacturing and Equipments, Huazhong University of Science and Technology, Wuhan 430074, Hubei (China)], E-mail: china_ml@163.com; Guan Zailin [State Key Lab of Digital Manufacturing and Equipments, Huazhong University of Science and Technology, Wuhan 430074, Hubei (China)], E-mail: zlguan@hust.edu.cn

    2009-03-15

    Based on a fictitious market model, a decentralized approach is presented for workstation scheduling in a CNC workshop. A multi-agent framework is proposed, where job agents and resource agents act as buyers and sellers of resources in the virtual market. With cost and benefit calculations of these agent activities, which reflect the state of the production environment, the various and often conflicting goals and interests influencing the scheduling process in practice can be balanced through a unified instrument offered by the markets. The paper first introduces a heuristic procedure that makes scheduling reservations in a periodic manner. A multi-agent framework is then introduced, in which job agents and resource agents seek appropriate job-workstation matches through bidding in the construction of the above periodic 'micro-schedules'. A pricing policy is proposed for the price-directed coordination of agent activities in this framework. Simulation results demonstrate the feasibility of the proposed approach and give some insights on the effects of some decision-making parameters. Future work will focus on designing more sophisticated coordination mechanisms and their deployment.
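
    The bidding step in such a market-based scheduler can be pictured as a single-round auction per workstation slot: job agents submit bids reflecting their urgency, and the resource agent awards the slot subject to a reserve price covering its cost. All names and values below are invented for illustration:

        # One auction round for a single workstation slot. Figures invented.
        bids = {
            "job_A": 12.0,   # bid derived from, e.g., the job's tardiness cost
            "job_B": 15.0,
            "job_C":  9.0,
        }
        reserve_price = 10.0     # seller's cost of providing the slot

        eligible = {job: b for job, b in bids.items() if b >= reserve_price}
        winner = max(eligible, key=eligible.get) if eligible else None
        print("slot awarded to:", winner, "at price", eligible.get(winner))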

  6. Real-Time Electronic Dashboard Technology and Its Use to Improve Pediatric Radiology Workflow.

    Science.gov (United States)

    Shailam, Randheer; Botwin, Ariel; Stout, Markus; Gee, Michael S

    The purpose of our study was to create a real-time electronic dashboard in the pediatric radiology reading room providing a visual display of updated information regarding scheduled and in-progress radiology examinations that could help radiologists to improve clinical workflow and efficiency. To accomplish this, a script was set up to automatically send real-time HL7 messages from the radiology information system (Epic Systems, Verona, WI) to an Iguana Interface engine, with relevant data regarding examinations stored in an SQL Server database for visual display on the dashboard. Implementation of an electronic dashboard in the reading room of a pediatric radiology academic practice has led to several improvements in clinical workflow, including decreasing the time interval for radiologist protocol entry for computed tomography or magnetic resonance imaging examinations as well as fewer telephone calls related to unprotocoled examinations. Other advantages include enhanced ability of radiologists to anticipate and attend to examinations requiring radiologist monitoring or scanning, as well as to work with technologists and operations managers to optimize scheduling in radiology resources. We foresee increased utilization of electronic dashboard technology in the future as a method to improve radiology workflow and quality of patient care. Copyright © 2017 Elsevier Inc. All rights reserved.
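
    The plumbing described, HL7 messages landing in a SQL store that feeds the dashboard, can be sketched with the standard library alone; the segment layout and field positions below are simplified placeholders, not the actual Epic feed:

        import sqlite3

        # Minimal sketch: parse a simplified HL7-like segment and store it
        # for the dashboard to display. Field positions are illustrative only.
        msg = "OBR|1|ACC123|MRI BRAIN|SCHEDULED|2017-03-01T09:30"

        def parse(segment):
            f = segment.split("|")
            return f[2], f[3], f[4], f[5]   # accession, exam, status, time

        db = sqlite3.connect(":memory:")
        db.execute("CREATE TABLE exams(accession TEXT PRIMARY KEY, exam TEXT,"
                   " status TEXT, scheduled TEXT)")
        db.execute("INSERT OR REPLACE INTO exams VALUES (?,?,?,?)", parse(msg))

        # Dashboard query: everything scheduled but not yet in progress.
        for row in db.execute("SELECT * FROM exams WHERE status='SCHEDULED'"):
            print(row)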

  7. Real-time Energy Resource Scheduling considering a Real Portuguese Scenario

    DEFF Research Database (Denmark)

    Silva, Marco; Sousa, Tiago; Morais, Hugo

    2014-01-01

    The development in power systems and the introduction of decentralized generation and Electric Vehicles (EVs), both connected to distribution networks, represent a major challenge in planning and operation issues. This new paradigm requires a new energy resources management approach which...... scheduling in smart grids, considering day-ahead, hour-ahead and real-time scheduling. The case study considers a 33-bus distribution network with high penetration of distributed energy resources. The wind generation profile is based on a real Portuguese wind farm. Four scenarios are presented...... taking into account 0, 1, 2 and 5 periods (hours or minutes) ahead of the scheduling period in the hour-ahead and real-time scheduling.

  8. Lift scheduling organization: Lift Concept for Lemminkainen

    OpenAIRE

    Mingalimov, Iurii

    2015-01-01

    The purpose of the work was to make a simple schedule for the main contractors and clients to check and control the workflow connected with lifts. It gathers the work on electricity, construction, engineering networks, equipment installation and commissioning. The schedule was drawn up while working on the Aino building site in Saint Petersburg for Lemminkäinen. The duration of the work was 5 months. The lift concept at Lemminkäinen is very well controlled in comparison with other buil...

  9. SPECT/CT workflow and imaging protocols

    Energy Technology Data Exchange (ETDEWEB)

    Beckers, Catherine [University Hospital of Liege, Division of Nuclear Medicine and Oncological Imaging, Department of Medical Physics, Liege (Belgium); Hustinx, Roland [University Hospital of Liege, Division of Nuclear Medicine and Oncological Imaging, Department of Medical Physics, Liege (Belgium); Domaine Universitaire du Sart Tilman, Service de Medecine Nucleaire et Imagerie Oncologique, CHU de Liege, Liege (Belgium)

    2014-05-15

    Introducing a hybrid imaging method such as single photon emission computed tomography (SPECT)/CT greatly alters the routine in the nuclear medicine department. It requires designing new workflow processes and the revision of original scheduling process and imaging protocols. In addition, the imaging protocol should be adapted for each individual patient, so that performing CT is fully justified and the CT procedure is fully tailored to address the clinical issue. Such refinements often occur before the procedure is started but may be required at some intermediate stage of the procedure. Furthermore, SPECT/CT leads in many instances to a new partnership with the radiology department. This article presents practical advice and highlights the key clinical elements which need to be considered to help understand the workflow process of SPECT/CT and optimise imaging protocols. The workflow process using SPECT/CT is complex in particular because of its bimodal character, the large spectrum of stakeholders, the multiplicity of their activities at various time points and the need for real-time decision-making. With help from analytical tools developed for quality assessment, the workflow process using SPECT/CT may be separated into related, but independent steps, each with its specific human and material resources to use as inputs or outputs. This helps identify factors that could contribute to failure in routine clinical practice. At each step of the process, practical aspects to optimise imaging procedure and protocols are developed. A decision-making algorithm for justifying each CT indication as well as the appropriateness of each CT protocol is the cornerstone of routine clinical practice using SPECT/CT. In conclusion, implementing hybrid SPECT/CT imaging requires new ways of working. It is highly rewarding from a clinical perspective, but it also proves to be a daily challenge in terms of management. (orig.)

  10. SPECT/CT workflow and imaging protocols

    International Nuclear Information System (INIS)

    Beckers, Catherine; Hustinx, Roland

    2014-01-01

    Introducing a hybrid imaging method such as single photon emission computed tomography (SPECT)/CT greatly alters the routine in the nuclear medicine department. It requires designing new workflow processes and the revision of original scheduling process and imaging protocols. In addition, the imaging protocol should be adapted for each individual patient, so that performing CT is fully justified and the CT procedure is fully tailored to address the clinical issue. Such refinements often occur before the procedure is started but may be required at some intermediate stage of the procedure. Furthermore, SPECT/CT leads in many instances to a new partnership with the radiology department. This article presents practical advice and highlights the key clinical elements which need to be considered to help understand the workflow process of SPECT/CT and optimise imaging protocols. The workflow process using SPECT/CT is complex in particular because of its bimodal character, the large spectrum of stakeholders, the multiplicity of their activities at various time points and the need for real-time decision-making. With help from analytical tools developed for quality assessment, the workflow process using SPECT/CT may be separated into related, but independent steps, each with its specific human and material resources to use as inputs or outputs. This helps identify factors that could contribute to failure in routine clinical practice. At each step of the process, practical aspects to optimise imaging procedure and protocols are developed. A decision-making algorithm for justifying each CT indication as well as the appropriateness of each CT protocol is the cornerstone of routine clinical practice using SPECT/CT. In conclusion, implementing hybrid SPECT/CT imaging requires new ways of working. It is highly rewarding from a clinical perspective, but it also proves to be a daily challenge in terms of management. (orig.)

  11. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    Science.gov (United States)

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    Due to the upcoming deluge of genome data, the need for storing and processing large-scale genome data, easy access to biomedical analysis tools, and efficient data sharing and retrieval has presented significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    Science.gov (United States)

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba

    2015-12-01

    The scientific discovery process can be advanced by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components to these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.

  13. Workflow management in large distributed systems

    International Nuclear Information System (INIS)

    Legrand, I; Newman, H; Voicu, R; Dobre, C; Grigoras, C

    2011-01-01

    The MonALISA (Monitoring Agents using a Large Integrated Services Architecture) framework provides a distributed service system capable of controlling and optimizing large-scale, data-intensive applications. An essential part of managing large-scale, distributed data-processing facilities is a monitoring system for computing facilities, storage, networks, and the very large number of applications running on these systems in near realtime. All this monitoring information gathered for all the subsystems is essential for developing the required higher-level services—the components that provide decision support and some degree of automated decisions—and for maintaining and optimizing workflow in large-scale distributed systems. These management and global optimization functions are performed by higher-level agent-based services. We present several applications of MonALISA's higher-level services including optimized dynamic routing, control, data-transfer scheduling, distributed job scheduling, dynamic allocation of storage resource to running jobs and automated management of remote services among a large set of grid facilities.

  14. Workflow management in large distributed systems

    Science.gov (United States)

    Legrand, I.; Newman, H.; Voicu, R.; Dobre, C.; Grigoras, C.

    2011-12-01

    The MonALISA (Monitoring Agents using a Large Integrated Services Architecture) framework provides a distributed service system capable of controlling and optimizing large-scale, data-intensive applications. An essential part of managing large-scale, distributed data-processing facilities is a monitoring system for computing facilities, storage, networks, and the very large number of applications running on these systems in near realtime. All this monitoring information gathered for all the subsystems is essential for developing the required higher-level services—the components that provide decision support and some degree of automated decisions—and for maintaining and optimizing workflow in large-scale distributed systems. These management and global optimization functions are performed by higher-level agent-based services. We present several applications of MonALISA's higher-level services including optimized dynamic routing, control, data-transfer scheduling, distributed job scheduling, dynamic allocation of storage resource to running jobs and automated management of remote services among a large set of grid facilities.

  15. Improved compliance by BPM-driven workflow automation.

    Science.gov (United States)

    Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin

    2014-12-01

    Using methods and technologies of business process management (BPM) for laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of model-driven development is the improved transparency of processes and the alignment of process diagrams and technical code. First experiences of using the business process model and notation (BPMN) show that easy-to-read graphical process models can provide standardization of laboratory workflows. Model-based development allows processes to be changed quickly and adapted easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs for quality assurance. BPMN 2.0, as an automation language able to control every kind of activity or subprocess, addresses complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (i.e., screening procedures or analytical processes). That means that, with the BPM standard, a method for sharing process knowledge among laboratories is also available. © 2014 Society for Laboratory Automation and Screening.

  16. Coordinating decentralized optimization of truck and shovel mining operations

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, R.; Fraser Forbes, J. [Alberta Univ., Edmonton, AB (Canada). Dept. of Chemical and Materials Engineering]; San Yip, W. [Suncor Energy, Fort McMurray, AB (Canada)]

    2006-07-01

    Canada's oil sands contain the largest known reserve of oil in the world. Oil sands mining uses 3 functional processes, ore hauling, overburden removal and mechanical maintenance. The industry relies mainly on truck-and-shovel technology in its open-pit mining operations which contributes greatly to the overall mining operation cost. Coordination between operating units is crucial for achieving an enterprise-wide optimal operation level. Some of the challenges facing the industry include multiple or conflicting objectives such as minimizing the use of raw materials and energy while maximizing production. The large sets of constraints that define the feasible domain pose a challenge, as does the uncertainty in system parameters. One solution lies in assigning truck resources to various activities. This fully decentralized approach would treat the optimization of ore production, waste removal and equipment maintenance independently. It was emphasized that mine-wide optimal operation can only be achieved by coordinating ore hauling and overburden removal processes. For that reason, this presentation proposed a coordination approach for a decentralized optimization system. The approach is based on the Dantzig-Wolfe decomposition and auction-based methods that have been previously used to decompose large-scale optimization problems. The treatment of discrete variables and coordinator design was described and the method was illustrated with a simple truck and shovel mining simulation study. The approach can be applied to a wide range of applications such as coordinating decentralized optimal control systems and scheduling. 16 refs., 3 tabs., 2 figs.

  17. Decentralized Quasi-Newton Methods

    Science.gov (United States)

    Eisen, Mark; Mokhtari, Aryan; Ribeiro, Alejandro

    2017-05-01

    We introduce the decentralized Broyden-Fletcher-Goldfarb-Shanno (D-BFGS) method as a variation of the BFGS quasi-Newton method for solving decentralized optimization problems. The D-BFGS method is of interest in problems that are not well conditioned, making first order decentralized methods ineffective, and in which second order information is not readily available, making second order decentralized methods impossible. D-BFGS is a fully distributed algorithm in which nodes approximate curvature information of themselves and their neighbors through the satisfaction of a secant condition. We additionally provide a formulation of the algorithm in asynchronous settings. Convergence of D-BFGS is established formally in both the synchronous and asynchronous settings and strong performance advantages relative to first order methods are shown numerically.
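
    The secant condition mentioned here is the standard quasi-Newton relation; in the decentralized variant each node enforces it over the variables of itself and its neighbors. In standard notation (the neighborhood restriction is our gloss, not the paper's exact formulation):

        % Secant condition satisfied by the curvature approximation B_{k+1}:
        B_{k+1}\, s_k = y_k,
        \qquad s_k = x_{k+1} - x_k,
        \qquad y_k = \nabla f(x_{k+1}) - \nabla f(x_k).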

  18. Federalism and Decentralization of Education in Argentina. Unintended Consequences of Decentralization of Expenditures in a Federal Country.

    Science.gov (United States)

    Falleti, Tulia G.

    By analyzing the process of decentralization of education in Argentina, this paper complements the existing literature on decentralization and federalism in two ways: (1) it studies the impact of federal institutions on the origins and evolution of decentralization; and (2) it analyzes a case of decentralization of education that, in a way not…

  19. An Organizational and Qualitative Approach to Improving University Course Scheduling

    Science.gov (United States)

    Hill, Duncan L.

    2010-01-01

    Focusing on the current timetabling process at the University of Toronto Mississauga (UTM), I apply David Wesson's theoretical framework in order to understand (1) how increasing enrollment interacts with a decentralized timetabling process to limit the flexibility of course schedules and (2) the resultant impact on educational quality. I then…

  20. The Two Edge Knife of Decentralization

    Directory of Open Access Journals (Sweden)

    Ahmad Khoirul Umam

    2011-07-01

    Full Text Available A centralistic government model has become a trend in a number of developing countries, in which the idiosyncratic aspect becomes a pivotal key in policy making. That situation breeds authoritarianism, cronyism, and corruption. To break the impasse, a decentralized system is proposed to bring people closer to public policy making. Decentralization is also believed to be the solution for creating good governance. But a number of facts in developing countries demonstrate that decentralization has indeed ignited backfires such as decentralized corruption, parochialism, horizontal conflict, local political instability and others. This article elaborates a theoretical framework on decentralization's output as a double-edged knife. In simple words, the concept of decentralization does not have a permanent relationship with the creation of good governance and development. Without substantive democracy, decentralization has the potential to become a destructive political instrument threatening the state's future.

  1. Decentralized neural control application to robotics

    CERN Document Server

    Garcia-Hernandez, Ramon; Sanchez, Edgar N; Alanis, Alma y; Ruz-Hernandez, Jose A

    2017-01-01

    This book provides a decentralized approach for the identification and control of robotics systems. It also presents recent research in decentralized neural control and includes applications to robotics. Decentralized control is free from difficulties due to complexity in design, debugging, data gathering and storage requirements, making it preferable for interconnected systems. Furthermore, as opposed to the centralized approach, it can be implemented with parallel processors. This approach deals with four decentralized control schemes, which are able to identify the robot dynamics. The training of each neural network is performed on-line using an extended Kalman filter (EKF). The first indirect decentralized control scheme applies the discrete-time block control approach, to formulate a nonlinear sliding manifold. The second direct decentralized neural control scheme is based on the backstepping technique, approximated by a high order neural network. The third control scheme applies a decentralized neural i...

  2. Access Control with Delegated Authorization Policy Evaluation for Data-Driven Microservice Workflows

    Directory of Open Access Journals (Sweden)

    Davy Preuveneers

    2017-09-01

    Full Text Available Microservices offer a compelling competitive advantage for building data flow systems as a choreography of self-contained data endpoints that each implement a specific data processing functionality. Such a ‘single responsibility principle’ design makes them well suited for constructing scalable and flexible data integration and real-time data flow applications. In this paper, we investigate microservice based data processing workflows from a security point of view, i.e., (1) how to constrain data processing workflows with respect to dynamic authorization policies granting or denying access to certain microservice results depending on the flow of the data; (2) how to let multiple microservices contribute to a collective data-driven authorization decision and (3) how to put adequate measures in place such that the data within each individual microservice is protected against illegitimate access from unauthorized users or other microservices. Due to this multifold objective, enforcing access control on the data endpoints to prevent information leakage or preserve one’s privacy becomes far more challenging, as authorization policies can have dependencies and decision outcomes cross-cutting data in multiple microservices. To address this challenge, we present and evaluate a workflow-oriented authorization framework that enforces authorization policies in a decentralized manner and where the delegated policy evaluation leverages feature toggles that are managed at runtime by software circuit breakers to secure the distributed data processing workflows. The benefit of our solution is that, on the one hand, authorization policies restrict access to the data endpoints of the microservices, and on the other hand, microservices can safely rely on other data endpoints to collectively evaluate cross-cutting access control decisions without having to rely on a shared storage backend holding all the necessary information for the policy evaluation.
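
    The delegated, toggle-guarded policy evaluation described can be sketched as a guard around each microservice endpoint; the toggle store, the two policy rules and the service names are all invented for illustration:

        # Sketch: each microservice evaluates its own slice of the authorization
        # policy; a feature toggle (flipped at runtime by a circuit breaker)
        # disables delegation to a peer service when that peer is unhealthy.
        FEATURE_TOGGLES = {"delegate_to_profile_service": True}  # managed at runtime

        def local_policy(user, resource):
            return user.get("role") == "analyst"          # this service's own rule

        def delegated_policy(user, resource):
            # Stand-in for a call to another microservice's policy endpoint.
            return resource.get("owner") == user.get("id")

        def authorize(user, resource):
            if not local_policy(user, resource):
                return False
            if FEATURE_TOGGLES["delegate_to_profile_service"]:
                return delegated_policy(user, resource)   # collective decision
            return True                                   # degraded, local-only mode

        print(authorize({"id": "u1", "role": "analyst"}, {"owner": "u1"}))  # True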

  3. Workflow-Based Software Development Environment

    Science.gov (United States)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment).

  4. Development of a pharmacy resident rotation to expand decentralized clinical pharmacy services.

    Science.gov (United States)

    Hill, John D; Williams, Jonathan P; Barnes, Julie F; Greenlee, Katie M; Leonard, Mandy C

    2017-07-15

    The development of a pharmacy resident rotation to expand decentralized clinical pharmacy services is described. In an effort to align with the initiatives proposed within the ASHP Practice Advancement Initiative, the department of pharmacy at Cleveland Clinic, a 1,400-bed academic, tertiary acute care medical center in Cleveland, Ohio, established a goal to provide decentralized clinical pharmacy services for 100% of patient care units within the hospital. Patient care units that previously had no decentralized pharmacy services were evaluated to identify opportunities for expansion. Metrics analyzed included number of medication orders verified per hour, number of pharmacy dosing consultations, and number of patient discharge counseling sessions. A pilot study was conducted to assess the feasibility of this service and potential resident learning opportunities. A learning experience description was drafted, and feedback was solicited regarding the development of educational components utilized throughout the rotation. Pharmacists who were providing services to similar patient populations were identified to serve as preceptors. Staff pharmacists were deployed to previously uncovered patient care units, with pharmacy residents providing decentralized services on previously covered areas. A rotating preceptor schedule was developed based on geographic proximity and clinical expertise. An initial postimplementation assessment of this resident-driven service revealed that pharmacy residents provided a comparable level of pharmacy services to that of staff pharmacists. Feedback collected from nurses, physicians, and pharmacy staff also supported residents' ability to operate sufficiently in this role to optimize patient care. A learning experience developed for pharmacy residents in a large medical center enabled the expansion of decentralized clinical services without requiring additional pharmacist full-time equivalents. Copyright © 2017 by the American Society of

  5. Organizational decentralization in radiology.

    Science.gov (United States)

    Aas, I H Monrad

    2006-01-01

    At present, most hospitals have a department of radiology where images are captured and interpreted. Decentralization is the opposite of centralization and means 'away from the centre'. With a Picture Archiving and Communication System (PACS) and broadband communications, transmitting radiology images between sites will be far easier than before. Qualitative interviews of 26 resource persons were performed in Norway. There was a response rate of 90%. Decentralization of radiology interpretations seems less relevant than centralization, but several forms of decentralization have a role to play. The respondents mentioned several advantages, including exploitation of capacity and competence. They also mentioned several disadvantages, including splitting professional communities and reduced contact between radiologists and clinicians. With the new technology decentralization and centralization of image interpretation are important possibilities in organizational change. This will be important for the future of teleradiology.

  6. (De)Centralization of the Global Informational Ecosystem

    Directory of Open Access Journals (Sweden)

    Johanna Möller

    2017-09-01

    Full Text Available Centralization and decentralization are key concepts in debates that focus on the (anti)democratic character of digital societies. Centralization is understood as the control over communication and data flows, and decentralization as giving it (back) to users. Communication and media research focuses on centralization put forward by dominant digital media platforms, such as Facebook and Google, and governments. Decentralization is investigated regarding its potential in civil society, i.e., hacktivism, (encryption) technologies, and grass-root technology movements. As content-based media companies increasingly engage with technology, they move into the focus of critical media studies. Moreover, as formerly nationally oriented companies now compete with global media platforms, they share several interests with civil society decentralization agents. Based on 26 qualitative interviews with leading media managers, we investigate (de)centralization strategies applied by content-oriented media companies. Theoretically, this perspective on media companies as agents of (de)centralization expands (de)centralization research beyond traditional democratic stakeholders by considering economic actors within the “global informational ecosystem” (Birkinbine, Gómez, & Wasko, 2017). We provide a three-dimensional framework to empirically investigate (de)centralization. From critical media studies, we borrow the (de)centralization of data and infrastructures; from media business research, the (de)centralization of content distribution.

  7. Decentralization in Air Transportation

    NARCIS (Netherlands)

    Udluft, H.

    2017-01-01

    In this work, we demonstrate that decentralized control can result in stable, efficient, and robust operations in the Air Transportation System. We implement decentralized control for aircraft taxiing operations and use Agent-Based Modeling and Simulation to analyze the resulting system behavior.

  8. Decentralized control of complex systems

    CERN Document Server

    Siljak, Dragoslav D

    2011-01-01

    Complex systems require fast control action in response to local input, and perturbations dictate the use of decentralized information and control structures. This much-cited reference book explores the approaches to synthesizing control laws under decentralized information structure constraints. Starting with a graph-theoretic framework for structural modeling of complex systems, the text presents results related to robust stabilization via decentralized state feedback. Subsequent chapters explore optimization, output feedback, the manipulative power of graphs, overlapping decompositions and t

  9. What supervisors want to know about decentralization.

    Science.gov (United States)

    Boissoneau, R; Belton, P

    1991-06-01

    Many organizations in various industries have tended to move away from strict centralization, yet some centralization is still vital to top management. With 19 of the 22 executives interviewed favoring or implementing some form of decentralization, it is probable that traditionally centralized organizations will follow the trend and begin to decentralize their organizational structures. The incentives and advantages of decentralization are too attractive to ignore. Decentralization provides responsibility, clear objectives, accountability for results, and more efficient and effective decision making. However, one must remember that decentralization can be overextended and that centralization is still viable in certain functions. Finding the correct balance between control and autonomy is a key to decentralization. Too much control and too much autonomy are the primary reasons for decentralization failures. In today's changing, competitive environment, structures must be continuously redefined, with the goal of finding an optimal balance between centralization and decentralization. Organizations are cautioned not to seek out and install a single philosopher-king to impose unified direction, but to unify leadership goals, participation, style, and control to develop improved methods of making all responsible leaders of one mind about the organization's needs and goals.

  10. Towards an Intelligent Workflow Designer based on the Reuse of Workflow Patterns

    NARCIS (Netherlands)

    Iochpe, Cirano; Chiao, Carolina; Hess, Guillermo; Nascimento, Gleison; Thom, Lucinéia; Reichert, Manfred

    2007-01-01

    To develop process-aware information systems, we need sophisticated methods and concepts for designing and modeling processes. Recently, research on workflow patterns has emerged in order to increase the reuse of recurring workflow structures. However, current workflow modeling tools do not

  11. Location-based Scheduling

    DEFF Research Database (Denmark)

    Andersson, Niclas; Christensen, Knud

    on the market. However, CPM is primarily an activity-based method that takes the activity as the unit of focus, and criticism has been raised, specifically in the case of construction projects, that the method manages construction work and the continuous flow of resources deficiently. To seek solutions...... to the identified limitations of the CPM method, an alternative planning and scheduling methodology that includes locations is tested. Location-based Scheduling (LBS) implies a shift in focus, from primarily the activities to the flow of work through the various locations of the project, i.e. the building. LBS uses...... the graphical presentation technique of Line-of-balance, which is adapted for planning and management of work-flows and facilitates resources performing their work without interruptions caused by other resources working with other activities in the same location. As such, LBS and Lean Construction share

  12. Decentralized portfolio management

    OpenAIRE

    Coutinho, Paulo; Tabak, Benjamin Miranda

    2003-01-01

    We use a mean-variance model to analyze the problem of decentralized portfolio management. We find the solution for the optimal portfolio allocation for a head trader operating in n different markets, which is called the optimal centralized portfolio. However, as there are many traders specialized in different markets, the solution to the problem of optimal decentralized allocation should be different from the centralized case. In this paper we derive conditions for the solutions to be equiva...

  13. Data Workflow - A Workflow Model for Continuous Data Processing

    NARCIS (Netherlands)

    Wombacher, Andreas

    2010-01-01

    Online data or streaming data are getting more and more important for enterprise information systems, e.g. by integrating sensor data and workflows. The continuous flow of data provided e.g. by sensors requires new workflow models addressing the data perspective of these applications, since

  14. Wage Dispersion and Decentralization of Wage Bargaining

    DEFF Research Database (Denmark)

    Dahl, Christian Møller; le Maire, Christian Daniel; Munch, Jakob R.

    2013-01-01

    This article studies how decentralization of wage bargaining from sector to firm level influences wage levels and wage dispersion. We use detailed panel data covering a period of decentralization in the Danish labor market. The decentralization process provides variation in the individual worker......'s wage-setting system that facilitates identification of the effects of decentralization. We find a wage premium associated with firm-level bargaining relative to sector-level bargaining and that the return to skills is higher under the more decentralized wage-setting systems. Using quantile regression......, we also find that wages are more dispersed under firm-level bargaining compared to more centralized wage-setting systems....

  15. Decentralization and Governance in Indonesia

    NARCIS (Netherlands)

    Holzhacker, Ronald; Wittek, Rafael; Woltjer, Johan

    2016-01-01

    I. Theoretical Reflections on Decentralization and Governance for Sustainable Society 1. Decentralization and Governance for Sustainable Society in Indonesia Ronald Holzhacker, Rafael Wittek and Johan Woltjer 2. Good Governance Contested: Exploring Human Rights and Sustainability as Normative Goals

  16. Workflow in Almaraz NPP

    International Nuclear Information System (INIS)

    Gonzalez Crego, E.; Martin Lopez-Suevos, C.

    2000-01-01

    Almaraz NPP decided to incorporate Workflow into its information system in response to the need to provide exhaustive follow-up and monitoring of each phase of the different procedures it manages. Oracle's Workflow was chosen for this purpose and it was integrated with previously developed applications. The objectives to be met in the incorporation of Workflow were as follows: Strict monitoring of procedures and processes. Detection of bottlenecks in the flow of information. Notification of those affected by pending tasks. Flexible allocation of tasks to user groups. Improved monitoring of management procedures. Improved communication. Similarly, special care was taken to: Integrate workflow processes with existing control panels. Synchronize workflow with installation procedures. Ensure that the system reflects use of paper forms. At present the Corrective Maintenance Request module is being operated using Workflow and the Work Orders and Notice of Order modules are about to follow suit. (Author)

  17. A contingency approach to decentralization

    NARCIS (Netherlands)

    Fleurke, F.; Hulst, J.R.

    2006-01-01

    After decades of centralization, in 1980 the central government of the Netherlands embarked upon an ambitious project to decentralize the administrative system. It proclaimed a series of general decentralization measures that aimed to improve the performance of the administrative system and to boost

  18. On Decentralization and Life Satisfaction

    DEFF Research Database (Denmark)

    Bjørnskov, Christian; Dreher, Axel; Fischer, Justina A.V.

    2008-01-01

    We empirically analyze the impact of fiscal and political decentralization on subjective well-being in a cross-section of 60,000 individuals from 66 countries. More spending or revenue decentralization raises well-being while greater local autonomy is beneficial only via government consumption sp...

  19. Wage Dispersion and Decentralization of Wage Bargaining

    DEFF Research Database (Denmark)

    Dahl, Christian M.; Le Maire, Christian Daniel; Munch, Jakob Roland

    This paper studies how decentralization of wage bargaining from sector to firm level influences wage levels and wage dispersion. We use a detailed panel data set covering a period of decentralization in the Danish labor market. The decentralization process provides exogenous variation in the individual worker's wage-setting system that facilitates identification of the effects of decentralization. Consistent with predictions, we find that wages are more dispersed under firm-level bargaining compared to more centralized wage-setting systems. However, the differences across wage-setting systems...

  20. Decentralized Procurement in Light of Strategic Inventories

    DEFF Research Database (Denmark)

    Frimor, Hans; Arya, Anil; Mittendorf, Brian

    2015-01-01

    The centralization versus decentralization choice is perhaps the quintessential organizational structure decision. In the operations realm, this choice is particularly critical when it comes to the procurement function. Why firms may opt to decentralize procurement has been often studied and confirmed to be a multifaceted choice. This paper complements existing studies by detailing the trade-offs in the centralization versus decentralization decision in light of strategic inventories: a firm's decision to cede procurement choices to its individual divisions can help moderate inventory levels and provide a natural salve...

  1. Wind Farm Decentralized Dynamic Modeling With Parameters

    DEFF Research Database (Denmark)

    Soltani, Mohsen; Shakeri, Sayyed Mojtaba; Grunnet, Jacob Deleuran

    2010-01-01

    Development of dynamic wind flow models for wind farms is part of the research in the European FP7 project AEOLUS. The objective of this report is to provide decentralized dynamic wind flow models with parameters. The report presents a structure for decentralized flow models with inputs from...... local models. The results of this report are especially useful, though not limited, to the design of a decentralized wind farm controller, since in centralized controller design one can also use the model and update it in a central computing node.

  2. Progress in digital color workflow understanding in the International Color Consortium (ICC) Workflow WG

    Science.gov (United States)

    McCarthy, Ann

    2006-01-01

    The ICC Workflow WG serves as the bridge between ICC color management technologies and use of those technologies in real world color production applications. ICC color management is applicable to and is used in a wide range of color systems, from highly specialized digital cinema color special effects to high volume publications printing to home photography. The ICC Workflow WG works to align ICC technologies so that the color management needs of these diverse use case systems are addressed in an open, platform independent manner. This report provides a high level summary of the ICC Workflow WG objectives and work to date, focusing on the ways in which workflow can impact image quality and color systems performance. The 'ICC Workflow Primitives' and 'ICC Workflow Patterns and Dimensions' workflow models are covered in some detail. Consider the questions, "How much of dissatisfaction with color management today is the result of 'the wrong color transformation at the wrong time' and 'I can't get to the right conversion at the right point in my work process'?" Put another way, consider how image quality through a workflow can be negatively affected when the coordination and control level of the color management system is not sufficient.

  3. Ubiquitous consultation tool for decentral knowledge workers

    OpenAIRE

    Nazari Shirehjini, A.A.; Rühl, C.; Noll, S.

    2003-01-01

    The specific aim of this initial study is to examine the current work situation of consulting companies, and to elaborate a concept for supporting consultants who work in a decentralized fashion. The concept addresses significant challenges of decentralized work processes by deploying the peer-to-peer methodology for decentralized expert and knowledge management, cooperation, and enterprise resource planning.

  4. Query Optimizations over Decentralized RDF Graphs

    KAUST Repository

    Abdelaziz, Ibrahim; Mansour, Essam; Ouzzani, Mourad; Aboulnaga, Ashraf; Kalnis, Panos

    2017-01-01

    Applications in life sciences, decentralized social networks, Internet of Things, and statistical linked dataspaces integrate data from multiple decentralized RDF graphs via SPARQL queries. Several approaches have been proposed to optimize query

  5. Workflows in bioinformatics: meta-analysis and prototype implementation of a workflow generator

    Directory of Open Access Journals (Sweden)

    Thoraval Samuel

    2005-04-01

    Background: Computational methods for problem solving need to interleave information access and algorithm execution in a problem-specific workflow. The structures of these workflows are defined by a scaffold of syntactic, semantic and algebraic objects capable of representing them. Despite the proliferation of GUIs (Graphic User Interfaces) in bioinformatics, only some of them provide workflow capabilities; surprisingly, no meta-analysis of workflow operators and components in bioinformatics has been reported. Results: We present a set of syntactic components and algebraic operators capable of representing analytical workflows in bioinformatics. Iteration, recursion, the use of conditional statements, and management of suspend/resume tasks have traditionally been implemented on an ad hoc basis and hard-coded; by having these operators properly defined it is possible to use and parameterize them as generic re-usable components. To illustrate how these operations can be orchestrated, we present GPIPE, a prototype graphic pipeline generator for PISE that allows the definition of a pipeline, parameterization of its component methods, and storage of metadata in XML formats. This implementation goes beyond the macro capacities currently in PISE. As the entire analysis protocol is defined in XML, a complete bioinformatic experiment (linked sets of methods, parameters and results) can be reproduced or shared among users. Availability: http://if-web1.imb.uq.edu.au/Pise/5.a/gpipe.html (interactive), ftp://ftp.pasteur.fr/pub/GenSoft/unix/misc/Pise/ (download). Conclusion: From our meta-analysis we have identified syntactic structures and algebraic operators common to many workflows in bioinformatics. The workflow components and algebraic operators can be assimilated into re-usable software components. GPIPE, a prototype implementation of this framework, provides a GUI builder to facilitate the generation of workflows and integration of heterogeneous

  6. Decentralized Job Scheduling in the Cloud Based on a Spatially Generalized Prisoner’s Dilemma Game

    Directory of Open Access Journals (Sweden)

    Gąsior Jakub

    2015-12-01

    We present in this paper a novel distributed solution to a security-aware job scheduling problem in cloud computing infrastructures. We assume that the assignment of the available resources is governed exclusively by specialized brokers assigned to individual users submitting their jobs to the system. The goal of this scheme is to allocate a limited quantity of resources to a specific number of jobs while minimizing their execution failure probability and total completion time. Our approach is based on the Pareto dominance relationship and implemented at the individual user level. To select the best scheduling strategies from the resulting Pareto frontiers and construct a global scheduling solution, we developed a decision-making mechanism based on the game-theoretic model of the Spatial Prisoner's Dilemma, realized by selfish agents operating in a two-dimensional cellular automata space. Their behavior is conditioned by the objectives of the various entities involved in the scheduling process and driven towards a Nash equilibrium solution by the employed social welfare criteria. The performance of the applied scheduler is verified by a number of numerical experiments. The related results show the effectiveness and scalability of the scheme in the presence of a large number of jobs and resources involved in the scheduling process.
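
    To make the decision layer concrete, here is a minimal sketch assuming a Nowak-May-style spatial Prisoner's Dilemma rather than the paper's exact payoff and welfare setup: broker agents sit on a two-dimensional cellular-automata grid, accumulate payoffs against their neighbors, and imitate the best-scoring neighboring strategy each round. The grid size and payoff values are illustrative assumptions.

```python
import numpy as np

RNG = np.random.default_rng(0)
N = 32                            # grid side length (assumption)
T, R, P, S = 1.4, 1.0, 0.1, 0.0   # temptation/reward/punishment/sucker payoffs

# strategy 1 = cooperate (respect the socially optimal schedule),
# strategy 0 = defect (schedule selfishly)
strategy = RNG.integers(0, 2, size=(N, N))

def payoff(a, b):
    """Pairwise Prisoner's Dilemma payoff for the row player."""
    if a == 1 and b == 1: return R
    if a == 1 and b == 0: return S
    if a == 0 and b == 1: return T
    return P

def step(strategy):
    score = np.zeros((N, N))
    # accumulate payoffs against the four von Neumann neighbours (torus)
    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        shifted = np.roll(np.roll(strategy, dx, 0), dy, 1)
        for i in range(N):
            for j in range(N):
                score[i, j] += payoff(strategy[i, j], shifted[i, j])
    # each agent imitates the strategy of its best-scoring neighbour
    new = strategy.copy()
    for i in range(N):
        for j in range(N):
            best_s, best_v = strategy[i, j], score[i, j]
            for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = (i + dx) % N, (j + dy) % N
                if score[ni, nj] > best_v:
                    best_s, best_v = strategy[ni, nj], score[ni, nj]
            new[i, j] = best_s
    return new

for _ in range(50):
    strategy = step(strategy)
print("fraction of cooperating brokers:", strategy.mean())
```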

  7. ADRES : autonomous decentralized regenerative energy systems

    Energy Technology Data Exchange (ETDEWEB)

    Brauner, G.; Einfalt, A.; Leitinger, C.; Tiefgraber, D. [Vienna Univ. of Technology (Austria)

    2007-07-01

    The autonomous decentralized regenerative energy systems (ADRES) research project demonstrates that decentralized, network-independent microgrids are the target power systems of the future. This paper presented a typical structure of a microgrid, demonstrating that all available types of generation can be integrated, from wind and small hydro to photovoltaic, fuel cell, and biomass- or biogas-operated Stirling engines and micro turbines. In grid-connected operation, the balancing energy and reactive power for voltage control will come from the public grid. If there is no interconnection to a superior grid, it will form an autonomous microgrid. In order to reduce peak power demand and base energy, autonomous microgrid technology requires highly efficient appliances. Otherwise, large collector designs and high storage and balancing generation capacities would be necessary, which would increase costs. End-use energy efficiency was discussed with reference to demand side management (DSM) strategies that match energy demand with actual supply in order to minimize the storage size needed. This paper also discussed network controls that comprise active and reactive power. Decentralized robust algorithms were investigated with reference to black-start ability and congestion management features. It was concluded that the trend to develop small decentralized grids in parallel to existing large systems will improve security of supply and reduce greenhouse gas emissions. Decentralized grids will also increase energy efficiency because regenerative energy will be used where it is collected, in the form of electricity and heat, thus avoiding transport and the extension of transmission lines. Decentralized energy technology is now becoming more economic through efficient and economic mass production of components. Although decentralized energy technology requires energy automation, computer intelligence is becoming increasingly cost efficient. 2 refs., 4 figs.

  8. Decentralization and the local development state

    DEFF Research Database (Denmark)

    Emmenegger, Rony Hugo

    2016-01-01

    This article explores the politics of decentralization and state-peasant encounters in rural Oromiya, Ethiopia. Breaking with a centralized past, the incumbent government of the Ethiopian People's Revolutionary Democratic Front (EPRDF) committed itself to a decentralization policy in the early 1990s and has since then created a number of new sites for state-citizen interactions. In the context of electoral authoritarianism, however, decentralization has been interpreted as a means for the expansion of the party-state at the grass-roots level. Against this backdrop, this article attempts...... between the 2005 and 2010 elections. Based on ethnographic field research, the empirical case presented discloses that decentralization and state-led development serve the expansion of state power into rural areas, but that state authority is simultaneously constituted and undermined in the course...

  9. Decentralized or Centralized Systems for Colleges and Universities?

    Science.gov (United States)

    Heydinger, Richard B.; Norris, Donald M.

    1979-01-01

    Arguments for and against decentralization of data management, analysis, and planning systems are presented. It is suggested that technological advances have encouraged decentralization. Caution in this direction is urged and the development of an articulated decentralization program is proposed. (SF)

  10. REPNET: project scheduling and workflow optimization for Construction Projects

    Directory of Open Access Journals (Sweden)

    Marco Alvise Bragadin

    2013-10-01

    Project planning and control are core processes for construction management. In practice, project planning is achieved by network-based techniques like the Precedence Diagramming Method (PDM). Indeed, many researchers and practitioners claim that networking techniques as such do not provide a suitable model for construction projects: construction process modeling should incorporate the specific features of resource flows through project activities. Therefore an improved resource scheduling method for construction is developed, called REPNET, based on a precedence network plotted on a resource–space chart and presented with a flow-line chart. The heuristics of REPNET are used to carry out resource timing while optimizing process flows and resource usage. The method has been tested on a sample project.
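
    As an illustration of the flow-line idea (not the REPNET heuristic itself), the sketch below schedules each crew to move continuously through the locations while never entering a location before the preceding activity has cleared it. The activities, locations and durations are invented.

```python
locations = ["floor1", "floor2", "floor3", "floor4"]
activities = {            # hours per location for each crew (assumed data)
    "walls":    [8, 8, 8, 8],
    "services": [6, 10, 6, 6],
    "finishes": [12, 12, 12, 12],
}

finish_prev = [0.0] * len(locations)   # finish times of the preceding activity
schedule = {}
for act, durs in activities.items():
    # earliest continuous start: shift the whole flow line right until no
    # location is entered before the preceding activity has finished there
    offset = 0.0
    for i, d in enumerate(durs):
        start_i = offset + sum(durs[:i])   # start in location i with a continuous crew
        offset += max(0.0, finish_prev[i] - start_i)
    starts = [offset + sum(durs[:i]) for i in range(len(durs))]
    finish_prev = [s + d for s, d in zip(starts, durs)]
    schedule[act] = starts

for act, starts in schedule.items():
    print(act, [f"{s:.0f}" for s in starts])
```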

  11. Decentralization and Economic Growth per capita in Europe

    NARCIS (Netherlands)

    Crucq, Pieter; Hemminga, Hendrik-Jan

    2007-01-01

    In this paper the relationship between decentralization and economic growth is investigated. The focus is on decentralization from the national government to the highest substate level in a country, which we define as regional decentralization. Section 2 discusses the different dimensions of

  12. MACROECONOMIC IMPACT OF DECENTRALIZATION

    Directory of Open Access Journals (Sweden)

    Emilia Cornelia STOICA

    2014-05-01

    The concept of decentralization has a variety of expressions, but the generally accepted meaning refers to the transfer of authority and responsibility for public functions from the central government to sub-national public entities or even to the private sector. The decentralization process is complex, affecting many aspects of social and economic life and public management, and its design and implementation cover several stages, depending on the cyclical and structural developments of the country. From an economic perspective, decentralization is seen as a means of primary importance for improving the effectiveness and efficiency of public services and macroeconomic stability, due to the redistribution of public finances in much closer accordance with government policy objectives. But the decentralization process carries some risks as well, because it involves the implementation of appropriate mechanisms for programming income and expenditure at the subnational level, which, if not correlated with macroeconomic policy imperatives, can lead to major imbalances, both financially and in terms of economic and social life. Equally, ensuring a balanced budget at the local level is imperative, a goal that imposes a legal framework and specific procedures to size transfers of public funds, targeted or untargeted. Also, public and local authorities have to adopt appropriate laws and regulations such that sub-national public entities can access loans - such as bank loans or debentures on the domestic or external market - under strict monitoring of national financial stability. In all aspects of decentralization - political, administrative, financial - public authorities should develop and implement the most effective mechanisms to coordinate macroeconomic objectives with both sectoral and local interests and establish clear responsibilities - exclusive or shared - for all parties involved in the

  13. Querying Workflow Logs

    Directory of Open Access Journals (Sweden)

    Yan Tang

    2018-01-01

    A business process or workflow is an assembly of tasks that accomplishes a business goal. Business process management is the study of the design, configuration/implementation, enactment and monitoring, analysis, and re-design of workflows. The traditional methodology for the re-design and improvement of workflows relies on the well-known sequence of extract, transform, and load (ETL), data/process warehousing, and online analytical processing (OLAP) tools. In this paper, we study the ad hoc querying of process enactments for (data-centric) business processes, bypassing the traditional methodology for more flexibility in querying. We develop an algebraic query language based on "incident patterns" with four operators inspired by the Business Process Model and Notation (BPMN) representation, allowing the user to formulate ad hoc queries directly over workflow logs. A formal semantics of this query language, a preliminary query evaluation algorithm, and a group of elementary properties of the operators are provided.
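
    A minimal sketch of this style of querying, with one hand-coded incident-style pattern instead of the paper's four-operator algebra; the event log and task names are invented:

```python
from collections import defaultdict

# event log as (case_id, task, timestamp) triples (invented data)
log = [
    (1, "submit", 0), (1, "review", 1), (1, "approve", 2),
    (2, "submit", 0), (2, "review", 1), (2, "reject", 2), (2, "approve", 3),
    (3, "submit", 0), (3, "review", 1),
]

def eventually_follows(log, first, then, absent):
    """Return case ids where `first` is later followed by `then`
    with no `absent` event in between."""
    traces = defaultdict(list)
    for case, task, ts in sorted(log, key=lambda e: (e[0], e[2])):
        traces[case].append(task)
    hits = []
    for case, tasks in traces.items():
        for i, t in enumerate(tasks):
            if t != first:
                continue
            for u in tasks[i + 1:]:
                if u == absent:
                    break
                if u == then:
                    hits.append(case)
                    break
            if case in hits:
                break
    return hits

# "cases where 'review' is eventually followed by 'approve', no 'reject' between"
print(eventually_follows(log, "review", "approve", "reject"))  # -> [1]
```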

  14. Performance Analysis of the Decentralized Eigendecomposition and ESPRIT Algorithm

    Science.gov (United States)

    Suleiman, Wassim; Pesavento, Marius; Zoubir, Abdelhak M.

    2016-05-01

    In this paper, we consider performance analysis of the decentralized power method for the eigendecomposition of the sample covariance matrix based on the averaging consensus protocol. An analytical expression of the second order statistics of the eigenvectors obtained from the decentralized power method which is required for computing the mean square error (MSE) of subspace-based estimators is presented. We show that the decentralized power method is not an asymptotically consistent estimator of the eigenvectors of the true measurement covariance matrix unless the averaging consensus protocol is carried out over an infinitely large number of iterations. Moreover, we introduce the decentralized ESPRIT algorithm which yields fully decentralized direction-of-arrival (DOA) estimates. Based on the performance analysis of the decentralized power method, we derive an analytical expression of the MSE of DOA estimators using the decentralized ESPRIT algorithm. The validity of our asymptotic results is demonstrated by simulations.
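
    The following is a minimal sketch of the decentralized power method under assumed conditions: each node holds a local sample covariance, and the global matrix-vector product is approximated by a finite number of averaging-consensus rounds over a ring network, which (as the paper notes) leaves a residual error in the eigenvector estimate.

```python
import numpy as np

rng = np.random.default_rng(1)
K, d = 6, 4                                  # nodes, signal dimension
R_local = []
for _ in range(K):
    A = rng.standard_normal((40, d))
    R_local.append(A.T @ A / 40)             # local sample covariances

# doubly stochastic mixing weights for a ring network (assumption)
W = np.zeros((K, K))
for k in range(K):
    W[k, k] = 0.5
    W[k, (k - 1) % K] = 0.25
    W[k, (k + 1) % K] = 0.25

def consensus_average(values, rounds):
    """Each row of `values` is one node's vector; mix them `rounds` times."""
    for _ in range(rounds):
        values = W @ values
    return values

x = rng.standard_normal(d)
for _ in range(100):                         # decentralized power iterations
    prods = np.stack([R_k @ x for R_k in R_local])   # local products R_k x
    avg = consensus_average(prods, rounds=10)[0]     # node 0's estimate of the mean
    x = avg / np.linalg.norm(avg)

R_global = sum(R_local) / K
true_v = np.linalg.eigh(R_global)[1][:, -1]
# slightly below 1: finitely many consensus rounds bias the estimate
print("alignment with true principal eigenvector:", abs(true_v @ x))
```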

  15. Office 2010 Workflow Developing Collaborative Solutions

    CERN Document Server

    Mann, David; Enterprises, Creative

    2010-01-01

    Workflow is the glue that binds information worker processes, users, and artifacts. Without workflow, information workers are just islands of data and potential. Office 2010 Workflow details how to implement workflow in SharePoint 2010 and the client Microsoft Office 2010 suite to help information workers share data, enforce processes and business rules, and work more efficiently together or solo. This book covers everything you need to know-from what workflow is all about to creating new activities; from the SharePoint Designer to Visual Studio 2010; from out-of-the-box workflows to state mac

  16. (De)centralization of the global informational ecosystem

    OpenAIRE

    Möller, Johanna; Rimscha, M. Bjørn von

    2017-01-01

    Centralization and decentralization are key concepts in debates that focus on the (anti)democratic character of digital societies. Centralization is understood as the control over communication and data flows, and decentralization as giving it (back) to users. Communication and media research focuses on centralization put forward by dominant digital media platforms, such as Facebook and Google, and governments. Decentralization is investigated regarding its potential in civil society, i.e., h...

  17. Decentralized control using compositional analysis techniques

    NARCIS (Netherlands)

    Kerber, F.; van der Schaft, A. J.

    2011-01-01

    Decentralized control strategies aim at achieving a global control target by means of distributed local controllers acting on individual subsystems of the overall plant. In this sense, decentralized control is a dual problem to compositional analysis where a global verification task is decomposed

  18. Decentralization of Health System in Islamic Republic of Iran

    Directory of Open Access Journals (Sweden)

    MJ Kabir

    2008-10-01

    Decentralization is the process of dispersing decision-making closer to the point of service or action in peripheral areas. Decentralized governance, if properly planned and implemented, offers important opportunities for enhanced human development. Studies of this issue in different countries show that most decentralization has been implemented in European countries; in comparison, the Middle East countries have utilized lower degrees of decentralization. In fact, decentralization in the health system is a policy pursued for a variety of purposes, including increasing service delivery effectiveness and equity, improving efficiency and quality, fairness of financial contribution, and planning for choosing the most appropriate interventions for the health priorities of peripheral regions. To implement decentralized governance, there is a spectrum of choices whose degrees the government should regulate. Providing an appropriate atmosphere for decentralization is essential; otherwise, lack of planning and achievement can result in complications for the system.

  19. Partially Decentralized Control Architectures for Satellite Formations

    Science.gov (United States)

    Carpenter, J. Russell; Bauer, Frank H.

    2002-01-01

    In a partially decentralized control architecture, more than one but less than all nodes have supervisory capability. This paper describes an approach to choosing the number of supervisors in such an architecture, based on a reliability vs. cost trade. It also considers the implications of these results for the design of navigation systems for satellite formations that could be controlled with a partially decentralized architecture. Using an assumed cost model, analytic and simulation-based results indicate that it may be cheaper to achieve a given overall system reliability with a partially decentralized architecture containing only a few supervisors than with either fully decentralized or purely centralized architectures. Nominally, the subset of supervisors may act as centralized estimation and control nodes for corresponding subsets of the remaining subordinate nodes, and act as decentralized estimation and control peers with respect to each other. However, in the context of partially decentralized satellite formation control, the absolute positions and velocities of each spacecraft are unique, so that correlations which make estimates using only local information suboptimal only occur through common biases and process noise. Covariance and Monte Carlo analysis of a simplified system show that this lack of correlation may allow simplification of the local estimators while preserving the global optimality of the maneuvers commanded by the supervisors.

  20. DECENTRALIZATION IN THE SYSTEM OF NATIONAL ECONOMY MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Stepaniuk Nataliia

    2018-03-01

    Introduction. The article investigates theoretical approaches to the notion of decentralization in the system of management of the national economy. Purpose. It has been found that for the effective functioning of the state it is necessary to achieve a rational relationship between centralization and decentralization and to change the role, responsibility and powers of local self-government and executive authority. Results. It is substantiated that most scientific works are devoted to the study of decentralization of power, the implementation of public finance reform, and the transfer of power to localities as a guarantee of the development of the national economy. It is emphasized that the main idea of decentralization is to transfer competence to local government to address local needs. Consequently, decentralization is closely linked to the organization of public administration and promotes the building of effective relations between state authorities and local government. The main advantages of decentralization are: simplified management at the local level, closer connection with civil society, increased transparency of managerial decisions and a higher level of responsibility to the territorial community. Organizational and legal aspects of introducing decentralization in Ukraine are considered. It is noted that the course towards decentralization outlines both prospects and implementation problems. Among the main risks of decentralization are inconsistencies between the development of separate territorial units and strategic goals, the loss of state mobility, reduction of workplaces in the state apparatus, and risks of complicated coordination between levels of management. Conclusions. It has been determined that for the reform to be efficient and effective, decentralization principles require wide introduction in the administrative, political, budgetary, financial and social spheres

  1. DEWEY: the DICOM-enabled workflow engine system.

    Science.gov (United States)

    Erickson, Bradley J; Langer, Steve G; Blezek, Daniel J; Ryan, William J; French, Todd L

    2014-06-01

    Workflow is a widely used term to describe the sequence of steps to accomplish a task. The use of workflow technology in medicine and medical imaging in particular is limited. In this article, we describe the application of a workflow engine to improve workflow in a radiology department. We implemented a DICOM-enabled workflow engine system in our department. We designed it in a way to allow for scalability, reliability, and flexibility. We implemented several workflows, including one that replaced an existing manual workflow and measured the number of examinations prepared in time without and with the workflow system. The system significantly increased the number of examinations prepared in time for clinical review compared to human effort. It also met the design goals defined at its outset. Workflow engines appear to have value as ways to efficiently assure that complex workflows are completed in a timely fashion.
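
    A minimal sketch of the core engine mechanics described above (dependency-gated steps with status tracking); the radiology step names are invented, and a real DICOM-enabled engine would be driven by DICOM events rather than run inline:

```python
from collections import deque

class WorkflowEngine:
    def __init__(self):
        self.steps = {}            # name -> (callable, prerequisites)
        self.status = {}           # name -> "pending" | "done"

    def add_step(self, name, func, after=()):
        self.steps[name] = (func, tuple(after))
        self.status[name] = "pending"

    def run(self):
        queue = deque(self.steps)
        while queue:
            name = queue.popleft()
            func, prereqs = self.steps[name]
            if any(self.status[p] != "done" for p in prereqs):
                queue.append(name)         # prerequisites not ready; retry later
                continue
            func()
            self.status[name] = "done"

engine = WorkflowEngine()
engine.add_step("fetch_priors", lambda: print("fetching prior exams"))
engine.add_step("preprocess", lambda: print("preprocessing images"),
                after=["fetch_priors"])
engine.add_step("notify", lambda: print("exam ready for review"),
                after=["preprocess"])
engine.run()
```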

  2. Centralized vs. de-centralized multinationals and taxes

    OpenAIRE

    Nielsen, Søren Bo; Raimondos-Møller, Pascalis; Schjelderup, Guttorm

    2005-01-01

    The paper examines how country tax differences affect a multinational enterprise's choice to centralize or de-centralize its decision structure. Within a simple model that emphasizes the multiple conflicting roles of transfer prices in MNEs – here, as a strategic pre-commitment device and a tax manipulation instrument –, we show that (de-)centralized decisions are more profitable when tax differentials are (small) large. Keywords: Centralized vs. de-centralized decisions, taxes, MNEs. ...

  3. geoKepler Workflow Module for Computationally Scalable and Reproducible Geoprocessing and Modeling

    Science.gov (United States)

    Cowart, C.; Block, J.; Crawl, D.; Graham, J.; Gupta, A.; Nguyen, M.; de Callafon, R.; Smarr, L.; Altintas, I.

    2015-12-01

    The NSF-funded WIFIRE project has developed an open-source, online geospatial workflow platform for unifying geoprocessing tools and models for fire and other geospatially dependent modeling applications. It is a product of WIFIRE's objective to build an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. geoKepler includes a set of reusable GIS components, or actors, for the Kepler Scientific Workflow System (https://kepler-project.org). Actors exist for reading and writing GIS data in formats such as Shapefile, GeoJSON, and KML, and for using OGC web services such as WFS. The actors also allow for calling geoprocessing tools in other packages such as GDAL and GRASS. Kepler integrates functions from multiple platforms and file formats into one framework, thus enabling optimal GIS interoperability, model coupling, and scalability. Products of the GIS actors can be fed directly to models such as FARSITE and WRF. Kepler's ability to schedule and scale processes using Hadoop and Spark also makes geoprocessing ultimately extensible and computationally scalable. The reusable workflows in geoKepler can be made to run automatically when alerted by real-time environmental conditions. Here, we show breakthroughs in the speed of creating complex data for hazard assessments with this platform. We also demonstrate geoKepler workflows that use data assimilation to ingest real-time weather data into wildfire simulations, and data mining techniques to gain insight into environmental conditions affecting fire behavior. Existing machine learning tools and libraries such as R and MLlib are being leveraged for this purpose in Kepler, as well as Kepler's Distributed Data Parallel (DDP) capability to provide a framework for scalable processing. geoKepler workflows can be executed via an iPython notebook as a part of a Jupyter hub at UC San Diego for sharing and reporting of the scientific analysis and results from

  4. Sustainability evaluation of decentralized electricity generation

    International Nuclear Information System (INIS)

    Karger, Cornelia R.; Hennings, Wilfried

    2009-01-01

    Decentralized power generation is gaining significance in liberalized electricity markets. An increasing decentralization of power supply is expected to make a particular contribution to climate protection. This article investigates the advantages and disadvantages of decentralized electricity generation according to the overall concept of sustainable development. On the basis of a hierarchically structured set of sustainability criteria, four future scenarios for Germany are assessed, all of which describe different concepts of electricity supply in the context of the corresponding social and economic developments. The scenarios are developed in an explorative way according to the scenario method and the sustainability criteria are established by a discursive method with societal actors. The evaluation is carried out by scientific experts. By applying an expanded analytic hierarchy process (AHP), a multicriteria evaluation is conducted that identifies dissent among the experts. The results demonstrate that decentralized electricity generation can contribute to climate protection. The extent to which it simultaneously guarantees security of supply is still a matter of controversy. However, experts agree that technical and economic boundary conditions are of major importance in this field. In the final section, the article discusses the method employed here as well as implications for future decentralized energy supply. (author)
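
    A minimal sketch of the AHP step used in such multicriteria evaluations, assuming invented pairwise-comparison judgments: the criteria weights come from the principal eigenvector of the comparison matrix, and a consistency ratio flags incoherent judgments.

```python
import numpy as np

# pairwise-comparison matrix over three criteria (invented judgments)
criteria = ["climate protection", "security of supply", "cost"]
A = np.array([[1,   3,   5],
              [1/3, 1,   2],
              [1/5, 1/2, 1]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                  # normalized criteria weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)          # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]           # Saaty's random index
print(dict(zip(criteria, w.round(3))), "CR =", round(ci / ri, 3))
```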

  5. Decentralized Decision Making Toward Educational Goals.

    Science.gov (United States)

    Monahan, William W.; Johnson, Homer M.

    This monograph provides guidelines to help those school districts considering a more decentralized form of management. The authors discuss the levels at which different types of decisions should be made, describe the changing nature of the educational environment, identify different centralization-decentralization models, and suggest a flexible…

  6. Decentralized Software Architecture

    National Research Council Canada - National Science Library

    Khare, Rohit

    2002-01-01

    .... While the term "decentralization" is familiar from political and economic contexts, it has been applied extensively, if indiscriminately, to describe recent trends in software architecture towards...

  7. Computing for Decentralized Systems (lecture 1)

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    With the rise of Bitcoin, Ethereum, and other cryptocurrencies, the paradigm shift towards decentralized computing is becoming apparent. Computer engineers will need to understand this shift when developing systems in the coming years. Transferring value over the Internet is just one of the first working use cases of decentralized systems, but it is expected they will be used for a number of different services such as general purpose computing, data storage, or even new forms of governance. Decentralized systems, however, pose a series of challenges that cannot be addressed with traditional approaches in computing. Not having a central authority implies truth must be agreed upon rather than simply trusted and, so, consensus protocols, cryptographic data structures like the blockchain, and incentive models like mining rewards become critical for the correct behavior of decentralized systems. This series of lectures will be a fast track to introduce these fundamental concepts through working examples and pra...

  8. Computing for Decentralized Systems (lecture 2)

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    With the rise of Bitcoin, Ethereum, and other cryptocurrencies, the paradigm shift towards decentralized computing is becoming apparent. Computer engineers will need to understand this shift when developing systems in the coming years. Transferring value over the Internet is just one of the first working use cases of decentralized systems, but it is expected they will be used for a number of different services such as general purpose computing, data storage, or even new forms of governance. Decentralized systems, however, pose a series of challenges that cannot be addressed with traditional approaches in computing. Not having a central authority implies truth must be agreed upon rather than simply trusted and, so, consensus protocols, cryptographic data structures like the blockchain, and incentive models like mining rewards become critical for the correct behavior of decentralized systems. This series of lectures will be a fast track to introduce these fundamental concepts through working examples and pra...

  9. A method to mine workflows from provenance for assisting scientific workflow composition

    NARCIS (Netherlands)

    Zeng, R.; He, X.; Aalst, van der W.M.P.

    2011-01-01

    Scientific workflows have recently emerged as a new paradigm for representing and managing complex distributed scientific computations and are used to accelerate the pace of scientific discovery. In many disciplines, individual workflows are large and complicated due to the large quantities of data

  10. Decentralization or centralization: striking a balance.

    Science.gov (United States)

    Dirschel, K M

    1994-09-01

    An Executive Vice President for Nursing can provide the necessary link to meet diverse clinical demands when encountering centralization--decentralization decisions. Centralized communication links hospital departments giving nurses a unified voice. Decentralization acknowledges the need for diversity and achieves the right balance of uniformity through a responsive communications network.

  11. Decentralized Bribery and Market Participation

    OpenAIRE

    Popov, Sergey V.

    2012-01-01

    I propose a bribery model that examines decentralized bureaucratic decision-making. There are multiple stable equilibria. High levels of bribery reduce an economy's productivity because corruption suppresses small business, and reduces the total graft, even though the size of an individual bribe might increase. Decentralization prevents movement towards a Pareto-dominant equilibrium. Anticorruption efforts, even temporary ones, might be useful to improve participation, if they lower the bribe...

  12. Supporting Real-Time Operations and Execution through Timeline and Scheduling Aids

    Science.gov (United States)

    Marquez, Jessica J.; Pyrzak, Guy; Hashemi, Sam; Ahmed, Samia; McMillin, Kevin Edward; Medwid, Joseph Daniel; Chen, Diana; Hurtle, Esten

    2013-01-01

    Since 2003, the NASA Ames Research Center has been actively involved in researching and advancing the state of the art of planning and scheduling tools for NASA mission operations. Our planning toolkit SPIFe (Scheduling and Planning Interface for Exploration) has supported a variety of missions and field tests, scheduling activities for Mars rovers as well as crew on board the International Space Station and NASA earth analogs. The scheduled plan is the integration of all the activities for the day(s). In turn, the agents (rovers, landers, spaceships, crew) execute from this schedule while the mission support team members (e.g., flight controllers) follow the schedule during execution. Over the last couple of years, our team has begun to research and validate methods that will better support users during real-time operations and execution of scheduled activities. Our team utilizes human-computer interaction principles to research user needs, identify workflow processes, prototype software aids, and user test these. This paper discusses three specific prototypes developed and user tested to support real-time operations: Score Mobile, Playbook, and Mobile Assistant for Task Execution (MATE).

  13. A customizable, scalable scheduling and reporting system.

    Science.gov (United States)

    Wood, Jody L; Whitman, Beverly J; Mackley, Lisa A; Armstrong, Robert; Shotto, Robert T

    2014-06-01

    Scheduling is essential for running a facility smoothly and for summarizing activities in use reports. The Penn State Hershey Clinical Simulation Center has developed a scheduling interface that uses off-the-shelf components, with customizations that adapt to each institution's data collection and reporting needs. The system is designed using programs within the Microsoft Office 2010 suite. Outlook provides the scheduling component, while the reporting is performed using Access or Excel. An account with a calendar is created for the main schedule, with separate resource accounts created for each room within the center. The Outlook appointment form's two default tabs are used, in addition to a customized third tab. The data are then copied from the calendar into either a database table or a spreadsheet, where the reports are generated. Incorporating this system into an institution-wide structure allows integration of personnel lists and potentially enables all users to check the schedule from their desktops. Outlook also has a Web-based application for viewing the basic schedule from outside the institution, although customized data cannot be accessed. The scheduling and reporting functions have been used for a year at the Penn State Hershey Clinical Simulation Center. The schedule has increased workflow efficiency, improved the quality of recorded information, and provided more accurate reporting. The Penn State Hershey Clinical Simulation Center's scheduling and reporting system can be adapted easily to most simulation centers and can expand and change to meet future growth with little or no expense to the center.

  14. On Lifecycle Constraints of Artifact-Centric Workflows

    Science.gov (United States)

    Kucukoguz, Esra; Su, Jianwen

    Data plays a fundamental role in modeling and management of business processes and workflows. Among the recent "data-aware" workflow models, artifact-centric models are particularly interesting. (Business) artifacts are the key data entities that are used in workflows and can reflect both the business logic and the execution states of a running workflow. The notion of artifacts succinctly captures the fluidity aspect of data during workflow executions. However, much of the technical dimension concerning artifacts in workflows is not well understood. In this paper, we study a key concept of an artifact "lifecycle". In particular, we allow declarative specifications/constraints of artifact lifecycle in the spirit of DecSerFlow, and formulate the notion of lifecycle as the set of all possible paths an artifact can navigate through. We investigate two technical problems: (Compliance) does a given workflow (schema) contain only lifecycle allowed by a constraint? And (automated construction) from a given lifecycle specification (constraint), is it possible to construct a "compliant" workflow? The study is based on a new formal variant of artifact-centric workflow model called "ArtiNets" and two classes of lifecycle constraints named "regular" and "counting" constraints. We present a range of technical results concerning compliance and automated construction, including: (1) compliance is decidable when workflow is atomic or constraints are regular, (2) for each constraint, we can always construct a workflow that satisfies the constraint, and (3) sufficient conditions where atomic workflows can be constructed.
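
    A minimal sketch of the compliance question, assuming a toy workflow given as a labeled transition system (rather than an ArtiNet) and a regular lifecycle constraint given as a DFA: explore the product of the two machines and reject if the workflow can complete outside the constraint language or take a forbidden action.

```python
workflow = {                     # state -> [(action, next_state), ...]
    "s0": [("create", "s1")],
    "s1": [("edit", "s1"), ("approve", "s2")],
    "s2": [("archive", "s3")],
    "s3": [],
}
wf_start, wf_final = "s0", {"s3"}

# regular constraint: 'approve' must occur before 'archive'
dfa_delta = {
    ("q0", "create"): "q0", ("q0", "edit"): "q0",
    ("q0", "approve"): "q1", ("q0", "archive"): "dead",
    ("q1", "create"): "q1", ("q1", "edit"): "q1",
    ("q1", "approve"): "q1", ("q1", "archive"): "q1",
}
dfa_start, dfa_accept = "q0", {"q0", "q1"}   # 'dead' is the rejecting sink

def compliant():
    seen, stack = set(), [(wf_start, dfa_start)]
    while stack:
        w, q = stack.pop()
        if (w, q) in seen:
            continue
        seen.add((w, q))
        if w in wf_final and q not in dfa_accept:
            return False             # a completed path violates the constraint
        for action, w2 in workflow[w]:
            q2 = dfa_delta.get((q, action), "dead")
            if q2 == "dead":
                return False         # conservative: a forbidden action is reachable
            stack.append((w2, q2))
    return True

print(compliant())   # -> True for this toy pair
```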

  15. Petri Net-Based Workflow Access Control Model

    Institute of Scientific and Technical Information of China (English)

    陈卓; 骆婷; 石磊; 洪帆

    2004-01-01

    Access control is an important protection mechanism for information systems. This paper shows how to implement access control in workflow systems. We present a workflow access control model (WACM) based on several current access control models. The model supports role assignment and dynamic authorization. The paper defines the workflow using Petri nets. It first gives the definition and description of the workflow, then analyzes the architecture of the workflow access control model (WACM). Finally, an example of an e-commerce workflow access control model is discussed in detail.
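
    A minimal sketch in the spirit of WACM, with an invented net, roles and assignments: a transition may fire only when its input places are marked and the acting user holds an authorized role.

```python
net = {   # transition -> (input places, output places, authorized roles)
    "submit_order":  ({"start"},     {"submitted"}, {"customer"}),
    "check_payment": ({"submitted"}, {"paid"},      {"clerk"}),
    "ship_goods":    ({"paid"},      {"shipped"},   {"warehouse"}),
}
user_roles = {"alice": {"customer"}, "bob": {"clerk", "warehouse"}}
marking = {"start"}

def fire(transition, user):
    inputs, outputs, roles = net[transition]
    if not inputs <= marking:
        raise RuntimeError(f"{transition}: input places not marked")
    if user_roles[user].isdisjoint(roles):
        raise PermissionError(f"{user} may not fire {transition}")
    # consume input tokens, produce output tokens
    marking.difference_update(inputs)
    marking.update(outputs)

fire("submit_order", "alice")
fire("check_payment", "bob")
fire("ship_goods", "bob")
print("final marking:", marking)   # -> {'shipped'}
```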

  16. From Requirements via Colored Workflow Nets to an Implementation in Several Workflow Systems

    DEFF Research Database (Denmark)

    Mans, Ronnie S:; van der Aalst, Wil M.P.; Bakker, Piet J.M.

    2007-01-01

    Care organizations, such as hospitals, need to support complex and dynamic workflows. Moreover, many disciplines are involved. This makes it important to avoid the typical disconnect between requirements and the actual implementation of the system. This paper proposes an approach where an Executable Use Case (EUC) and Colored Workflow Net (CWN) are used to close the gap between the given requirements specification and the realization of these requirements with the help of a workflow system. This paper describes a large case study where the diagnostic trajectory of the gynaecological oncology care process of the Academic Medical Center (AMC) hospital is used as the reference process. The process consists of hundreds of activities. These have been modeled and analyzed using an EUC and a CWN. Moreover, based on the CWN, the process has been implemented using four different workflow systems.

  17. Decentralization and Participatory Rural Development: A Literature Review

    Directory of Open Access Journals (Sweden)

    Muhammad Shakil Ahmad

    2011-12-01

    Most developing nations are still struggling for efficient use of their resources. In order to overcome the physical and administrative constraints of development, it is necessary to transfer power from the central government to local authorities. Distribution of power improves the management of resources and community participation, which is considered key to sustainable development. Advocates of decentralization argue that decentralized government is a source of improved community participation in rural development. Decentralized government is considered more responsive towards local needs and the development of poor peoples. There are many obstacles to expanding citizen participation in rural areas. There are many approaches to participatory development, but all face the same challenges. The current paper reviews the literature on decentralization and participatory rural development. The concept and modalities of decentralization, dimensions of participation, types of rural participation and obstacles to participation are also part of this paper.

  18. Trends in research on forestry decentralization policies

    DEFF Research Database (Denmark)

    Lund, Jens Friis; Rutt, Rebecca Leigh; Ribot, Jesse

    2018-01-01

    institutions; studies focusing on power and the role of elites in forestry decentralization, and; studies that historicize and contextualize forestry decentralization as reflective of broader societal phenomena. We argue that these strands reflect disciplinary differences in values, epistemologies, and methods...

  19. Comparison of centralized and decentralized energy supply systems

    OpenAIRE

    Pfeifer, Thomas; Fahl, Ulrich; Voß, Alfred

    1991-01-01

    Communal energy programs are often embedded in a conception of a decentralized energy supply system where electricity is produced by a number of smaller power plants. For a comprehensive survey, the question arises whether these decentralized systems are more advantageous than centralized systems with regard to the criteria of energy consumption, security of supply, environmental compatibility and economy. In the following, after a definition of the term "decentralized", the present structure of ...

  20. Decentralized Control of Autonomous Vehicles

    Science.gov (United States)

    2003-01-01

    Decentralized Control of Autonomous Vehicles, by John S. Baras, Xiaobo Tan, and Pedram Hovareshti. CSHCN TR 2003-8 (ISR TR 2003-14).

  1. Robust Decentralized Formation Flight Control

    Directory of Open Access Journals (Sweden)

    Zhao Weihua

    2011-01-01

    Motivated by the idea of multiplexed model predictive control (MMPC), this paper introduces a new framework for unmanned aerial vehicle (UAV) formation flight and coordination. Formulated using the MMPC approach, the whole centralized formation flight system is considered as a linear periodic system with the control inputs of each UAV subsystem as its periodic inputs. Divided into decentralized subsystems, the whole formation flight system is guaranteed stable if proper terminal costs and terminal constraints are added to each decentralized MPC formulation of the UAV subsystems. A decentralized robust MPC formulation for each UAV subsystem with bounded input disturbances and model uncertainties is also presented. Furthermore, an obstacle avoidance control scheme for obstacles of any shape and size, including those not known a priori, is integrated under the unified MPC framework. The results from simulations demonstrate that the proposed framework can successfully achieve robust collision-free formation flights.
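
    The sketch below is not MMPC; it is a much simpler decentralized formation-keeping feedback law meant only to illustrate the decentralized setting, in which each UAV computes its control from its own state and its neighbors' broadcast positions. The dynamics, gains, offsets and communication graph are all assumptions.

```python
import numpy as np

dt, steps = 0.1, 400
kp, kd = 1.0, 1.6
offsets = np.array([[0, 0], [-4, 3], [-4, -3]], float)   # desired V formation
pos = np.array([[0, 0], [-9, 9], [-10, -7]], float)      # initial positions
vel = np.zeros_like(pos)
neighbours = {0: [1, 2], 1: [0], 2: [0]}                 # communication graph

for _ in range(steps):
    acc = np.zeros_like(pos)
    for i, nbrs in neighbours.items():
        for j in nbrs:
            # local error: hold my formation offset relative to each neighbour
            desired = offsets[i] - offsets[j]
            acc[i] += kp * (pos[j] + desired - pos[i]) - kd * vel[i] / len(nbrs)
    vel += dt * acc          # double-integrator dynamics, semi-implicit Euler
    pos += dt * vel

# relative positions converge to the formation offsets
print(np.round(pos - pos[0], 2))
```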

  2. ERROR HANDLING IN INTEGRATION WORKFLOWS

    Directory of Open Access Journals (Sweden)

    Alexey M. Nazarenko

    2017-01-01

    Simulation experiments performed while solving multidisciplinary engineering and scientific problems require the joint usage of multiple software tools. Further, when following a preset plan of experiment or searching for optimum solutions, the same sequence of calculations is run multiple times with various simulation parameters, input data, or conditions, while the overall workflow does not change. Automation of simulations like these requires implementing a workflow where tool execution and data exchange are usually controlled by a special type of software, an integration environment or platform. The result is an integration workflow (a platform-dependent implementation of some computing workflow) which, in the context of automation, is a composition of weakly coupled (in terms of communication intensity) typical subtasks. These compositions can then be decomposed back into a few workflow patterns (types of subtask interaction). The patterns, in their turn, can be interpreted as higher-level subtasks. This paper considers the execution control and data exchange rules that should be imposed by the integration environment when an error is encountered by some integrated software tool. An error is defined as any abnormal behavior of a tool that invalidates its result data, thus disrupting the data flow within the integration workflow. The main requirement for the error handling mechanism implemented by the integration environment is to prevent abnormal termination of the entire workflow in case of missing intermediate result data. Error handling rules are formulated on the basic pattern level and on the level of a composite task that can combine several basic patterns as next-level subtasks. Cases where workflow behavior may differ, depending on the user's purposes, when an error takes place, and possible error handling options that can be specified by the user, are also noted in the work.
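
    A minimal sketch of such per-step error-handling rules, with invented tools and policies: each integrated tool is wrapped with a retry bound, and a step that yields no valid data aborts the workflow instead of passing invalid intermediate results downstream.

```python
class ToolError(Exception):
    """Abnormal tool behavior that invalidates its result data."""

def run_step(name, tool, policy="abort", retries=2):
    attempts = 1 + (retries if policy == "retry" else 0)
    for attempt in range(1, attempts + 1):
        try:
            return tool()
        except ToolError as exc:
            print(f"{name}: attempt {attempt} failed ({exc})")
    # no attempt produced valid data: stop the whole workflow rather than
    # pass invalid intermediate results to downstream steps
    raise RuntimeError(f"workflow aborted: step '{name}' yielded no data")

flaky_calls = {"n": 0}
def flaky_solver():
    flaky_calls["n"] += 1
    if flaky_calls["n"] < 2:
        raise ToolError("license server timeout")
    return "mesh.dat"

mesh = run_step("mesh generation", flaky_solver, policy="retry")
print("downstream step consumes", mesh)
```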

  3. Dynamic reusable workflows for ocean science

    Science.gov (United States)

    Signell, Richard; Fernandez, Filipe; Wilcox, Kyle

    2016-01-01

    Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog search and data access make it now possible to create catalog-driven workflows that automate — end-to-end — data search, analysis and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill-assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enters the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased use of dynamic
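
    A minimal sketch of the catalog-search step of such a workflow, assuming the Python owslib package; the CSW endpoint URL and search term are placeholders, not the IOOS services used in the paper.

```python
from owslib.csw import CatalogueServiceWeb
from owslib.fes import PropertyIsLike

csw = CatalogueServiceWeb("https://data.example.org/csw")   # hypothetical endpoint

# search catalog metadata for water-temperature datasets
query = PropertyIsLike("csw:AnyText", "%sea_water_temperature%")
csw.getrecords2(constraints=[query], maxrecords=10)

for rec_id, rec in csw.records.items():
    print(rec.title)
    # each record carries service endpoints (OPeNDAP, SOS, ...) that the
    # downstream analysis steps of the workflow can open directly
    for ref in rec.references:
        print("   ", ref.get("scheme"), ref.get("url"))
```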

  4. Dynamic Reusable Workflows for Ocean Science

    Directory of Open Access Journals (Sweden)

    Richard P. Signell

    2016-10-01

    Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog searches and data access now make it possible to create catalog-driven workflows that automate—end-to-end—data search, analysis, and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused, and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enter the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased

  5. Decentralization: Another Perspective

    Science.gov (United States)

    Chapman, Robin

    1973-01-01

    This paper attempts to pursue the centralization-decentralization dilemma. A setting for this discussion is provided by noting some of the uses of terminology, followed by a consideration of inherent difficulties in conceptualizing. (Author)

  6. Snakemake-a scalable bioinformatics workflow engine

    NARCIS (Netherlands)

    J. Köster (Johannes); S. Rahmann (Sven)

    2012-01-01

    Snakemake is a workflow engine that provides a readable Python-based workflow definition language and a powerful execution environment that scales from single-core workstations to compute clusters without modifying the workflow. It is the first system to support the use of automatically

  7. Distribution of decentralized renewable energy resources

    International Nuclear Information System (INIS)

    Bal, J.L.; Benque, J.P.

    1996-01-01

    The existence of a great number of inhabitants without electricity, living in areas of low population density, with modest energy requirements and low income, provides a major potential market for decentralized renewable energy sources. In 1993, Ademe and EDF made two agreements concerning the development of renewable energy sources. The first aims at promoting their decentralized use in France in pertinent cases. The second agreement concerns other countries and has two ambitions: to facilitate short-term developments and to produce, in the longer term, a standardised proposal for decentralized energy production using renewable energy sources to a considerable extent. These ideas are explained, and the principles behind the implementation of both Ademe-EDF agreements as well as their future prospects are described. (R.P.)

  9. Towards Automatic Decentralized Control Structure Selection

    DEFF Research Database (Denmark)

    Jørgensen, John Bagterp; Jørgensen, Sten Bay

    2000-01-01

    A subtask in integration of design and control of chemical processes is the selection of a control structure. Automating the selection of the control structure enables sequential integration of process and control design. As soon as the process is specified or computed, a structure for decentralized control is determined automatically, and the resulting decentralized control structure is automatically tuned using standard techniques. Dynamic simulation of the resulting process system gives immediate feedback to the process design engineer regarding practical operability of the process. The control structure selection problem is formulated as a special MILP employing cost coefficients which are computed using Parseval's theorem combined with RGA and IMC concepts. This approach enables selection and tuning of large-scale plant-wide decentralized controllers through efficient combination...

  10. Multidetector-row CT: economics and workflow

    International Nuclear Information System (INIS)

    Pottala, K.M.; Kalra, M.K.; Saini, S.; Ouellette, K.; Sahani, D.; Thrall, J.H.

    2005-01-01

    With rapid evolution of multidetector-row CT (MDCT) technology and applications, several factors such as technology upgrades and turf battles over sharing cost and profitability affect MDCT workflow and economics. MDCT workflow optimization can enhance productivity and reduce unit costs as well as increase profitability, in spite of decreasing reimbursement rates. Strategies for workflow management include standardization, automation, and constant assessment of the various steps involved in MDCT operations. In this review article, we describe issues related to MDCT economics and workflow. (orig.)

  11. Decentral Smart Grid Control

    Science.gov (United States)

    Schäfer, Benjamin; Matthiae, Moritz; Timme, Marc; Witthaut, Dirk

    2015-01-01

    Stable operation of complex flow and transportation networks requires balanced supply and demand. For the operation of electric power grids—due to their increasing fraction of renewable energy sources—a pressing challenge is to fit the fluctuations in decentralized supply to the distributed and temporally varying demands. To achieve this goal, common smart grid concepts suggest collecting consumer demand data, evaluating them centrally given current supply, and sending price information back to customers for them to decide about usage. Besides restrictions regarding cyber security, privacy protection and large required investments, it remains unclear how such central smart grid options guarantee overall stability. Here we propose a Decentral Smart Grid Control, where the price is directly linked to the local grid frequency at each customer. The grid frequency provides all necessary information about the current power balance, such that it is sufficient to match supply and demand without the need for a centralized IT infrastructure. We analyze the performance and the dynamical stability of the power grid with such a control system. Our results suggest that the proposed Decentral Smart Grid Control is feasible independent of effective measurement delays, if frequencies are averaged over sufficiently large time intervals.
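
    The feedback loop can be caricatured in a few lines: in a swing-equation picture, linking demand (via price) to the locally measured frequency deviation pushes the grid back toward balance. All parameters below are invented for illustration; this is a sketch, not the paper's model:

      # Toy single-node model: M*dw/dt = P_gen - P_demand(w) - D*w,
      # with w the deviation from the nominal grid frequency.
      M, D = 1.0, 0.5            # inertia and damping (illustrative units)
      c = 2.0                    # demand sensitivity to the local frequency
      P_gen, P_base = 1.0, 1.2   # generation and nominal demand (imbalanced)

      w, dt = 0.0, 0.01
      for _ in range(2000):
          # Low frequency (w < 0) signals scarcity, hence a high price,
          # hence reduced demand -- the decentral control action.
          P_demand = P_base + c * w
          w += dt * (P_gen - P_demand - D * w) / M

      # Settles near w* = (P_gen - P_base) / (c + D) = -0.08 here.
      print(f"steady-state frequency deviation: {w:.4f}")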

  12. Decentral Smart Grid Control

    International Nuclear Information System (INIS)

    Schäfer, Benjamin; Matthiae, Moritz; Timme, Marc; Witthaut, Dirk

    2015-01-01

    Stable operation of complex flow and transportation networks requires balanced supply and demand. For the operation of electric power grids—due to their increasing fraction of renewable energy sources—a pressing challenge is to fit the fluctuations in decentralized supply to the distributed and temporally varying demands. To achieve this goal, common smart grid concepts suggest collecting consumer demand data, evaluating them centrally given current supply, and sending price information back to customers for them to decide about usage. Besides restrictions regarding cyber security, privacy protection and large required investments, it remains unclear how such central smart grid options guarantee overall stability. Here we propose a Decentral Smart Grid Control, where the price is directly linked to the local grid frequency at each customer. The grid frequency provides all necessary information about the current power balance, such that it is sufficient to match supply and demand without the need for a centralized IT infrastructure. We analyze the performance and the dynamical stability of the power grid with such a control system. Our results suggest that the proposed Decentral Smart Grid Control is feasible independent of effective measurement delays, if frequencies are averaged over sufficiently large time intervals. (paper)

  13. Policy Recommendations on Decentralization, Local Power and ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2010-12-22

    Policy Recommendations on Decentralization, Local Power and Women's Rights. December 22, 2010. The present document comprises a set of policy recommendations that define a global agenda on gender and decentralization. It emerged from the analysis and experiences shared during the Conference and the ...

  14. Rethinking Decentralization in Education in terms of Administrative Problems

    Directory of Open Access Journals (Sweden)

    Vasiliki Papadopoulou

    2013-11-01

    Full Text Available The general purpose of this study is to thoroughly examine decentralization in education according to the literature and previous research, and to discuss the applicability of educational decentralization practices in Turkey. The literature was reviewed for the study and findings reported. It has been observed that decentralization in education was put into practice in many countries after the 1980s. It is obvious that the educational system in Turkey has difficulty in meeting needs, and encounters many problems due to its present centralist state. Educational decentralization can provide effective solutions for stakeholder engagement, educational financing, and for problems in decision making and operation within the education system. However, the present state of local governments, the legal framework, and geographical, cultural and social features indicate that Turkey's conditions are not ready for decentralization in education. A decentralization model realized in the long run, adapted to Turkey's conditions and arrived at through social consensus, can help resolve the problems of the Turkish education system.

  15. Safety and feasibility of STAT RAD: Improvement of a novel rapid tomotherapy-based radiation therapy workflow by failure mode and effects analysis.

    Science.gov (United States)

    Jones, Ryan T; Handsfield, Lydia; Read, Paul W; Wilson, David D; Van Ausdal, Ray; Schlesinger, David J; Siebers, Jeffrey V; Chen, Quan

    2015-01-01

    The clinical challenge of radiation therapy (RT) for painful bone metastases requires clinicians to consider both treatment efficacy and patient prognosis when selecting a radiation therapy regimen. The traditional RT workflow requires several weeks for common palliative RT schedules of 30 Gy in 10 fractions or 20 Gy in 5 fractions. At our institution, we have created a new RT workflow termed "STAT RAD" that allows clinicians to perform computed tomographic (CT) simulation, planning, and highly conformal single-fraction treatment delivery within 2 hours. In this study, we evaluate the safety and feasibility of the STAT RAD workflow. A failure mode and effects analysis (FMEA) was performed on the STAT RAD workflow, including development of a process map; identification of potential failure modes; description of the cause and effect, temporal occurrence, and team member involvement of each failure mode; and examination of existing safety controls. A risk priority number (RPN) was calculated for each failure mode. As necessary, workflow adjustments were then made to safeguard failure modes of significant RPN values. After workflow alterations, RPNs were recomputed. A total of 72 potential failure modes were identified in the pre-FMEA STAT RAD workflow, of which 22 met the RPN threshold for clinical significance. Workflow adjustments included the addition of a team member checklist, changing simulation from megavoltage CT to kilovoltage CT, alteration of patient-specific quality assurance testing, and allocating increased time for critical workflow steps. After these modifications, only one failure mode maintained RPN significance: patient motion after alignment or during treatment. Performing the FMEA for the STAT RAD workflow before clinical implementation has significantly strengthened the safety and feasibility of STAT RAD. The FMEA proved a valuable evaluation tool, identifying potential problem areas so that we could create a safer workflow
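
    The RPN arithmetic in an FMEA is simple to reproduce; a sketch with invented failure modes and ratings, using the common convention of multiplying severity, occurrence, and detectability, each rated on a 1-10 scale:

      # Minimal FMEA bookkeeping: RPN = severity * occurrence * detectability.
      # Failure modes, ratings, and the threshold are invented for illustration.
      failure_modes = [
          ("wrong patient position at simulation", 8, 3, 4),
          ("plan transferred to wrong machine",    9, 2, 2),
          ("patient motion during treatment",      7, 4, 6),
      ]

      RPN_THRESHOLD = 100  # example cutoff for "clinically significant"

      for name, severity, occurrence, detectability in failure_modes:
          rpn = severity * occurrence * detectability
          flag = "REVIEW" if rpn >= RPN_THRESHOLD else "ok"
          print(f"{name:40s} RPN={rpn:4d}  {flag}")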

  16. GUEST EDITOR'S INTRODUCTION: Guest Editor's introduction

    Science.gov (United States)

    Chrysanthis, Panos K.

    1996-12-01

    of the critical organizational/business processes. In particular, this paper examines the issues of execution atomicity and failure atomicity, differentiating between correctness requirements of system failures and logical failures, and surveys techniques that can be used to ensure data consistency in workflow management systems. While the first paper is concerned with correctness assuming transactional workflows in which selective transactional properties are associated with individual tasks or the entire workflow, the second paper, `Scheduling workflows by enforcing intertask dependencies' by Attie et al., assumes that the tasks can be either transactions or other activities involving legacy systems. This second paper describes the modelling and specification of conditions involving events and dependencies among tasks within a workflow using temporal logic and finite state automata. It also presents a scheduling algorithm that enforces all stated dependencies by executing at any given time only those events that are allowed by all the dependency automata and in an order as specified by the dependencies. In any system with decentralized control, there is a need to effectively cope with the tension that exists between autonomy and consistency requirements. In `A three-level atomicity model for decentralized workflow management systems', Ben-Shaul and Heineman focus on the specific requirement of enforcing failure atomicity in decentralized, autonomous and interacting workflow management systems. Their paper describes a model in which each workflow manager must be able to specify the sequence of tasks that comprise an atomic unit for the purposes of correctness, and the degrees of local and global atomicity for the purpose of cooperation with other workflow managers. The paper also discusses a realization of this model in which treaties and summits provide an agreement mechanism, while underlying transaction managers are responsible for maintaining failure atomicity

  17. Energy and air emission implications of a decentralized wastewater system

    International Nuclear Information System (INIS)

    Shehabi, Arman; Stokes, Jennifer R; Horvath, Arpad

    2012-01-01

    Both centralized and decentralized wastewater systems have distinct engineering, financial and societal benefits. This paper presents a framework for analyzing the environmental effects of decentralized wastewater systems and an evaluation of the environmental impacts associated with two currently operating systems in California, one centralized and one decentralized. A comparison of energy use, greenhouse gas emissions and criteria air pollutants from the systems shows that the scale economies of the centralized plant help lower the environmental burden to less than a fifth of that of the decentralized utility for the same volume treated. The energy and emission burdens of the decentralized plant are reduced when accounting for high-yield wastewater reuse if it supplants an energy-intensive water supply like a desalination one. The centralized facility also reduces greenhouse gases by flaring methane generated during the treatment process, while methane is directly emitted from the decentralized system. The results are compelling enough to indicate that the life-cycle environmental impacts of decentralized designs should be carefully evaluated as part of the design process. (letter)

  18. A Formal Framework for Workflow Analysis

    Science.gov (United States)

    Cravo, Glória

    2010-09-01

    In this paper we provide a new formal framework to model and analyse workflows. A workflow is the formal definition of a business process that consists of the execution of tasks in order to achieve a certain objective. In our work we describe a workflow as a graph whose vertices represent tasks and whose arcs are associated with workflow transitions. Each task has an associated input/output logic operator, which can be the logical AND (•), the OR (⊗), or the XOR, i.e., exclusive-or (⊕). Moreover, we introduce algebraic concepts in order to completely describe the structure of workflows. We also introduce the concept of logical termination. Finally, we provide a necessary and sufficient condition for this property to hold.
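
    A small sketch of the data structure this suggests, with an intentionally naive termination check (the paper's necessary-and-sufficient condition is algebraic, not this recursion); task names are invented:

      # Tasks as vertices; each task carries a logic operator over its outgoing
      # arcs: "AND" (all branches fire), "XOR"/"OR" (approximated here by any).
      workflow = {
          "start":   ("AND", ["check", "log"]),
          "check":   ("XOR", ["approve", "reject"]),
          "log":     ("AND", ["end"]),
          "approve": ("AND", ["end"]),
          "reject":  ("AND", ["end"]),
          "end":     ("AND", []),
      }

      def reaches_end(task, seen=frozenset()):
          """Naive check that execution can logically terminate at 'end'."""
          if task == "end":
              return True
          if task in seen:
              return False  # cycle with no way out
          op, successors = workflow[task]
          branches = [reaches_end(t, seen | {task}) for t in successors]
          return all(branches) if op == "AND" else any(branches)

      print(reaches_end("start"))  # True for this toy graph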

  19. Decentralization in Botswana: the reluctant process | Dipholo ...

    African Journals Online (AJOL)

    Botswana's decentralization process has always been justified in terms of democracy and development. Consequently, the government has always argued that it is fully committed to decentralization in order to promote popular participation as well as facilitating sustainable rural development. Yet the government does not ...

  20. Ferret Workflow Anomaly Detection System

    National Research Council Canada - National Science Library

    Smith, Timothy J; Bryant, Stephany

    2005-01-01

    The Ferret workflow anomaly detection system project 2003-2004 has provided validation and anomaly detection in accredited workflows in secure knowledge management systems through the use of continuous, automated audits...

  1. Decentralized indirect methods for learning automata games.

    Science.gov (United States)

    Tilak, Omkar; Martin, Ryan; Mukhopadhyay, Snehasis

    2011-10-01

    We discuss the application of indirect learning methods in zero-sum and identical payoff learning automata games. We propose a novel decentralized version of the well-known pursuit learning algorithm. Such a decentralized algorithm has significant computational advantages over its centralized counterpart. The theoretical study of such a decentralized algorithm requires the analysis to be carried out in a nonstationary environment. We use a novel bootstrapping argument to prove the convergence of the algorithm. To our knowledge, this is the first time that such analysis has been carried out for zero-sum and identical payoff games. Extensive simulation studies are reported, which demonstrate the proposed algorithm's fast and accurate convergence in a variety of game scenarios. We also introduce the framework of partial communication in the context of identical payoff games of learning automata. In such games, the automata may not communicate with each other or may communicate selectively. This comprehensive framework has the capability to model both centralized and decentralized games discussed in this paper.
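
    For orientation, the classical single-automaton pursuit update can be sketched as follows; the paper's decentralized variant distributes this across automata, which this toy does not attempt. The environment and parameters are invented:

      import random

      def pursuit_automaton(reward_probs, lam=0.01, steps=20000):
          """Classical pursuit learning against a stationary environment."""
          n = len(reward_probs)
          p = [1.0 / n] * n     # action probability vector
          est = [0.0] * n       # running reward estimates d_hat
          counts = [0] * n
          for _ in range(steps):
              a = random.choices(range(n), weights=p)[0]
              r = 1.0 if random.random() < reward_probs[a] else 0.0
              counts[a] += 1
              est[a] += (r - est[a]) / counts[a]   # incremental average
              best = max(range(n), key=lambda i: est[i])
              # Pursue the current best estimate: p <- (1-lam)*p + lam*e_best
              p = [(1 - lam) * pi + (lam if i == best else 0.0)
                   for i, pi in enumerate(p)]
          return p

      print(pursuit_automaton([0.2, 0.5, 0.8]))  # mass converges on action 2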

  2. The effects of fiscal decentralization in Albania

    Directory of Open Access Journals (Sweden)

    Dr.Sc. Blerta Dragusha

    2012-06-01

    Full Text Available "Basically decentralization is a democratic reform which seeks to transfer the political, administrative, financial and planning authority from central to local government. It seeks to develop civic participation, empowerment of local people in the decision making process, and to promote accountability and reliability, to achieve efficiency and effectiveness in the collection and management of resources and service delivery." The interest and curiosity of knowing how our country is doing in this process, still unfinished, served as a motivation for me to treat this topic: fiscal decentralization as a process of giving 'power' to local governments, not only in terms of the rights deriving from this process but also the responsibilities that come with it. What are the stages before and after decentralization, and how has the process affected several key indicators? Is decentralization solely a good process, or can some of its effects be seen as a disadvantage?

  3. Subsidiarity in Principle: Decentralization of Water Resources Management

    Directory of Open Access Journals (Sweden)

    Ryan Stoa

    2014-05-01

    Full Text Available The subsidiarity principle of water resources management suggests that water management and service delivery should take place at the lowest appropriate governance level. The principle is attractive for several reasons, primarily because: (1) the governance level can be reduced to reflect environmental characteristics, such as the hydrological borders of a watershed that would otherwise cross administrative boundaries; (2) decentralization promotes community and stakeholder engagement when decision-making is localized; (3) inefficiencies are reduced by eliminating reliance on central government bureaucracies and budgetary constraints; and (4) laws and institutions can be adapted to reflect localized conditions at a scale where integrated natural resources management and climate change adaptation is more focused. Accordingly, the principle of subsidiarity has been welcomed by many states committed to decentralized governance, integrated water resources management, and/or civic participation. However, applications of decentralization have not been uniform, and in some cases have produced frustrating outcomes for states and water resources. Successful decentralization strategies are heavily dependent on dedicated financial resources and human resource capacity. This article explores the nexus between the principle of subsidiarity and the enabling environment, in the hope of articulating factors likely to contribute to, or detract from, the success of decentralized water resources management. Case studies from Haiti, Rwanda, and the United States’ Florida Water Management Districts provide examples of the varied stages of decentralization.

  4. FISCAL DECENTRALIZATION IN THE DRC: EVIDENCE OFREVENUE ASSIGNMENT

    Directory of Open Access Journals (Sweden)

    Angelita Kithatu-Kiwekete

    2017-07-01

    Full Text Available The rationale for central government to devolve resources for service provision has been debated in decentralization literature. Decentralization enhances democracy, encourages participation in local development initiatives and promotes local political accountability. This discourse has been complemented by the implementation of fiscal decentralization to increase the ability of sub-national government in financing municipal service delivery. Fiscal decentralization has often been adopted by African states since the onset of the New Public Management era in an effort to improve the standard of governance. The concern is that African states have taken minimal steps to adopt fiscal devolution that promotes revenue assignment, which in turn limits sub-national governments' ability to generate own-source revenues. This article examines the revenue assignment function of fiscal decentralization in the Democratic Republic of Congo (DRC) in the light of decentralization concerns that have been raised by civil society, as the country charts its course to democracy. The article is a desktop study that considers documents and policies in the DRC on the national, provincial and local level as far as state revenue sources are concerned. Revenue assignment should enable the DRC's provinces and local authorities to generate significant revenue independently. However, post-conflict reconstruction and development efforts in the Great Lakes region and in the DRC have largely isolated decentralization, which would otherwise entrench local fiscal autonomy in financing for local services and development. The article concludes that revenue generation for local authorities and the provinces in the DRC is still very centralised by the national government. The article proposes policy recommendations that will be useful for the country to ensure that decentralization efforts include fiscal devolution to enhance the financing for local development initiatives.

  5. Fuzzy-Logic-Based Gain-Scheduling Control for State-of-Charge Balance of Distributed Energy Storage Systems for DC Microgrids

    DEFF Research Database (Denmark)

    Aldana, Nelson Leonardo Diaz; Dragicevic, Tomislav; Vasquez, Juan Carlos

    2014-01-01

    ...overcharge or deep-discharge in one of the energy storage units. Primary control in a microgrid is responsible for power sharing among units, and droop control is typically used in this stage. This paper proposes a modular and decentralized gain-scheduling control strategy based on fuzzy logic that ensures balanced...
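
    The flavor of the approach can be hinted at with the usual DC droop law, v = v_ref - r_d * i, where each unit's virtual resistance r_d is scheduled from its state of charge; the linear schedule below is a crude stand-in for the paper's fuzzy inference, with all numbers invented:

      def droop_voltage(v_ref, r_d, i_out):
          """Conventional DC droop: output voltage sags with output current."""
          return v_ref - r_d * i_out

      def schedule_droop_gain(soc, r_min=0.1, r_max=1.0):
          """Toy gain schedule: high state of charge -> low virtual resistance,
          so fuller storage units take a larger share of the load. A fuzzy
          controller would interpolate this mapping with membership rules."""
          return r_max - (r_max - r_min) * soc  # soc in [0, 1]

      for soc in (0.2, 0.5, 0.9):
          r_d = schedule_droop_gain(soc)
          v = droop_voltage(48.0, r_d, 10.0)
          print(f"SoC={soc:.1f}  r_d={r_d:.2f}  v={v:.1f} V")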

  6. Least-cost network evaluation of centralized and decentralized contributions to global electrification

    International Nuclear Information System (INIS)

    Levin, Todd; Thomas, Valerie M.

    2012-01-01

    The choice between centralized and decentralized electricity generation is examined for 150 countries as a function of population distribution, electricity consumption, transmission cost, and the cost difference between decentralized and centralized electricity generation. A network algorithm is developed to find the shortest centralized transmission network that spans a given fraction of the population in a country. The least-cost combination of centralized and decentralized electricity that serves the country is determined. Case studies of Botswana, Uganda, and Bangladesh illustrate situations that are more and less suited for decentralized electrification. Specific maps for centralized and decentralized generation are presented to show how the least-cost option varies with the relative costs of centralized and decentralized generation and transmission cost. Centralized and decentralized fractions are calculated for 150 countries. For most of the world's population, centralized electricity is the least-cost option. For a number of countries, particularly in Africa, substantial populations and regions may be most cost-effectively served by decentralized electricity. - Highlights: ► Centralized and decentralized electrification are compared for 150 countries. ► A cost-optimized network algorithm finds the least-cost electrification system. ► Least-cost infrastructures combine centralized and decentralized portions. ► For most people, centralized electricity is the cheapest option. ► In much of Africa, decentralized electricity may be cheaper than centralized.
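
    A stylized miniature of such an evaluation, using Prim-style greedy growth of the transmission network and a per-settlement cost comparison; coordinates, populations, and all costs are invented, and the early-exit greedy rule is a simplification of the paper's network algorithm:

      import math

      # Settlements: (x, y) location in km, and population.
      settlements = {"A": ((0, 0), 5000), "B": ((10, 2), 1200),
                     "C": ((3, 40), 300), "D": ((45, 45), 150)}

      COST_LINE_PER_KM = 5_000       # transmission cost, $/km
      COST_CENTRAL_PER_CAP = 50      # centralized generation, $/person
      COST_DECENTRAL_PER_CAP = 120   # decentralized system, $/person

      def dist(a, b):
          (x1, y1), (x2, y2) = settlements[a][0], settlements[b][0]
          return math.hypot(x1 - x2, y1 - y2)

      # Grow the grid from A; connect a settlement only while wire plus
      # centralized generation beats a local decentralized system.
      on_grid = {"A"}
      while len(on_grid) < len(settlements):
          d, v = min((dist(u, v), v) for u in on_grid
                     for v in settlements if v not in on_grid)
          pop = settlements[v][1]
          if d * COST_LINE_PER_KM + pop * COST_CENTRAL_PER_CAP \
                  < pop * COST_DECENTRAL_PER_CAP:
              on_grid.add(v)
          else:
              break  # the rest are cheaper to serve locally

      print("centralized:", sorted(on_grid),
            " decentralized:", sorted(set(settlements) - on_grid))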

  7. Radiology information system: a workflow-based approach

    International Nuclear Information System (INIS)

    Zhang, Jinyan; Lu, Xudong; Nie, Hongchao; Huang, Zhengxing; Aalst, W.M.P. van der

    2009-01-01

    Introducing workflow management technology in healthcare seems promising for dealing with the problem that current healthcare information systems cannot provide sufficient support for process management, although several challenges still exist. The purpose of this paper is to study the method of developing a workflow-based information system in a radiology department as a use case. First, a workflow model of the typical radiology process was established. Second, based on the model, the system could be designed and implemented as a group of loosely coupled components. Each component corresponded to one task in the process and could be assembled by the workflow management system. The legacy systems could be taken as special components, which also corresponded to the tasks and were integrated through transferring non-workflow-aware interfaces to the standard ones. Finally, a workflow dashboard was designed and implemented to provide an integral view of radiology processes. The workflow-based Radiology Information System was deployed in the radiology department of Zhejiang Chinese Medicine Hospital in China. The results showed that it could be adjusted flexibly in response to the needs of changing processes, and enhance process management in the department. It can also provide a more workflow-aware integration method, compared with other methods such as IHE-based ones. The workflow-based approach is a new method of developing a radiology information system with more flexibility, more functionalities of process management and more workflow-aware integration. The work of this paper is an initial endeavor for introducing workflow management technology in healthcare. (orig.)

  8. Responsiveness and flexibility in a Decentralized Supply Chain

    DEFF Research Database (Denmark)

    Petersen, Kristian Rasmus; Bilberg, Arne; Hadar, Ronen

    Today’s supply chains are not capable of managing the instabilities present in the market. Instead, there is a need to develop supply chains that are capable of adapting to changes. Through a case study of LEGO, the authors suggest a possible solution: a decentralized supply chain serving … independent and self-sufficient local factories. The decentralized supply chain is provided with materials, parts and pre-assembled elements from local suppliers and supplies the local market in return. Keywords: Decentralize, Responsiveness, Flexibility...

  9. Decentralized Portfolio Management

    Directory of Open Access Journals (Sweden)

    Benjamin Miranda Tabak

    2003-12-01

    Full Text Available We use a mean-variance model to analyze the problem of decentralized portfolio management. We find the solution for the optimal portfolio allocation for a head trader operating in n different markets, which is called the optimal centralized portfolio. However, as there are many traders specialized in different markets, the solution to the problem of optimal decentralized allocation should be different from the centralized case. In this paper we derive conditions for the solutions to be equivalent. We use multivariate normal returns and a negative exponential function to solve the problem analytically. We generate the equivalence of solutions by assuming that different traders face different interest rates for borrowing and lending. This interest rate is dependent on the ratio of the degrees of risk aversion of the trader and the head trader, on the excess return, and on the correlation between asset returns.
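
    In symbols, with negative exponential (CARA) utility and multivariate normal returns, the head trader's problem is the familiar mean-variance program; a sketch in notation chosen here for illustration, where λ is the head trader's risk aversion and r_f a risk-free rate:

      % Maximizing expected CARA utility of terminal wealth reduces to:
      \[
        \max_{w}\; w^{\top}(\mu - r_f \mathbf{1}) \;-\; \tfrac{\lambda}{2}\, w^{\top}\Sigma w
        \qquad\Longrightarrow\qquad
        w^{*} \;=\; \tfrac{1}{\lambda}\,\Sigma^{-1}\,(\mu - r_f \mathbf{1}).
      \]
      % Decentralization: trader i, with risk aversion \lambda_i, recovers the
      % block of w^* for market i when lending/borrowing at an internal rate
      % that depends on \lambda_i / \lambda, the excess returns, and the
      % cross-market correlations -- the equivalence condition the abstract
      % describes.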

  10. Communication network for decentralized remote tele-science during the Spacelab mission IML-2

    Science.gov (United States)

    Christ, Uwe; Schulz, Klaus-Juergen; Incollingo, Marco

    1994-01-01

    The ESA communication network for decentralized remote telescience during the Spacelab mission IML-2, called Interconnection Ground Subnetwork (IGS), provided data, voice conferencing, video distribution/conferencing and high rate data services to 5 remote user centers in Europe. The combination of services allowed the experimenters to interact with their experiments as they would normally do from the Payload Operations Control Center (POCC) at MSFC. In addition, to enhance their science results, they were able to make use of reference facilities and computing resources in their home laboratory, which typically are not available in the POCC. Characteristics of the IML-2 communications implementation were the adaptation to the different user needs based on modular service capabilities of IGS and the cost optimization for the connectivity. This was achieved by using a combination of traditional leased lines, satellite based VSAT connectivity and N-ISDN according to the simulation and mission schedule for each remote site. The central management system of IGS allows minimization of staffing and the involvement of communications personnel at the remote sites. The successful operation of IGS for IML-2 as a precursor network for the Columbus Orbital Facility (COF) has proven the concept for communications to support the operation of the COF decentralized scenario.

  11. Decentralization, Local Rights and the Construction of Women's ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Decentralization, Local Rights and the Construction of Women's Citizenship : a Comparative Study in Kenya, Tanzania and Uganda - Phase II. Kenya, Tanzania and Uganda have adopted new land laws, policies and institutional arrangements to accommodate decentralization of land administration and management.

  12. Decentralized Blended Acquisition

    NARCIS (Netherlands)

    Berkhout, A.J.

    2013-01-01

    The concept of blending and deblending is reviewed, making use of traditional and dispersed source arrays. The network concept of distributed blended acquisition is introduced. A million-trace robot system is proposed, illustrating that decentralization may bring about a revolution in the way we

  13. Accelerating the scientific exploration process with scientific workflows

    International Nuclear Information System (INIS)

    Altintas, Ilkay; Barney, Oscar; Cheng, Zhengang; Critchlow, Terence; Ludaescher, Bertram; Parker, Steve; Shoshani, Arie; Vouk, Mladen

    2006-01-01

    Although an increasing amount of middleware has emerged in the last few years to achieve remote data access, distributed job execution, and data management, orchestrating these technologies with minimal overhead still remains a difficult task for scientists. Scientific workflow systems improve this situation by creating interfaces to a variety of technologies and automating the execution and monitoring of the workflows. Workflow systems provide domain-independent customizable interfaces and tools that combine different tools and technologies along with efficient methods for using them. As simulations and experiments move into the petascale regime, the orchestration of long running data and compute intensive tasks is becoming a major requirement for the successful steering and completion of scientific investigations. A scientific workflow is the process of combining data and processes into a configurable, structured set of steps that implement semi-automated computational solutions of a scientific problem. Kepler is a cross-project collaboration, co-founded by the SciDAC Scientific Data Management (SDM) Center, whose purpose is to develop a domain-independent scientific workflow system. It provides a workflow environment in which scientists design and execute scientific workflows by specifying the desired sequence of computational actions and the appropriate data flow, including required data transformations, between these steps. Currently deployed workflows range from local analytical pipelines to distributed, high-performance and high-throughput applications, which can be both data- and compute-intensive. The scientific workflow approach offers a number of advantages over traditional scripting-based approaches, including ease of configuration, improved reusability and maintenance of workflows and components (called actors), automated provenance management, 'smart' re-running of different versions of workflow instances, on-the-fly updateable parameters, monitoring

  14. Biowep: a workflow enactment portal for bioinformatics applications.

    Science.gov (United States)

    Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano

    2007-03-08

    The huge amount of biological information, its distribution over the Internet, and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable for the majority of unskilled researchers. A portal enabling these researchers to profit from new technologies is still missing. We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. Main workflows' processing steps are annotated on the basis of their input and output, elaboration type and application domain by using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software and the creation of

  15. Biowep: a workflow enactment portal for bioinformatics applications

    Directory of Open Access Journals (Sweden)

    Romano Paolo

    2007-03-01

    Full Text Available Background The huge amount of biological information, its distribution over the Internet, and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable for the majority of unskilled researchers. A portal enabling these researchers to profit from new technologies is still missing. Results We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. Main workflows' processing steps are annotated on the basis of their input and output, elaboration type and application domain by using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. Conclusion We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical

  16. Workflow Support for Advanced Grid-Enabled Computing

    OpenAIRE

    Xu, Fenglian; Eres, M.H.; Tao, Feng; Cox, Simon J.

    2004-01-01

    The Geodise project brings together the skills of computer scientists and engineers to build a service-oriented computing environment for engineers to perform complicated computations in a distributed system. The workflow tool is a front-end GUI that provides a full life cycle of workflow functions for Grid-enabled computing. The full life cycle of workflow functions has been enhanced based on our initial research and development. The life cycle starts with the composition of a workflow, followed by an ins...

  17. The Two Edge Knife of Decentralization

    OpenAIRE

    Umam, Ahmad Khoirul

    2011-01-01

    A centralistic government model has become a trend in a number of developing countries, in which the idiosyncratic aspect becomes the pivotal key in policy making. This situation fosters authoritarianism, cronyism, and corruption. To break the impasse, a decentralized system is proposed to bring people closer to public policy making. Decentralization is also believed to be the solution for creating good governance. But a number of facts in developing countries demonstrate that dec...

  18. ATLAS Grid Workflow Performance Optimization

    CERN Document Server

    Elmsheuser, Johannes; The ATLAS collaboration

    2018-01-01

    The CERN ATLAS experiment grid workflow system routinely manages 250 to 500 thousand concurrently running production and analysis jobs to process simulation and detector data. In total, more than 300 PB of data are distributed over more than 150 sites in the WLCG. At this scale, small improvements in software and computing performance and in workflows can lead to significant resource usage gains. ATLAS is reviewing, together with CERN IT experts, several typical simulation and data processing workloads for potential performance improvements in terms of memory and CPU usage, disk and network I/O. All ATLAS production and analysis grid jobs are instrumented to collect many performance metrics for detailed statistical studies using modern data analytics tools like ElasticSearch and Kibana. This presentation will review and explain the performance gains of several ATLAS simulation and data processing workflows and present analytics studies of the ATLAS grid workflows.

  19. Behavioral technique for workflow abstraction and matching

    NARCIS (Netherlands)

    Klai, K.; Ould Ahmed M'bareck, N.; Tata, S.; Dustdar, S.; Fiadeiro, J.L.; Sheth, A.

    2006-01-01

    This work is in line with the CoopFlow approach dedicated to workflow advertisement, interconnection, and cooperation in virtual organizations. In order to advertise workflows in a registry, we present in this paper a novel method to abstract behaviors of workflows into symbolic observation

  20. Decentralization : Local Partnerships for Health Services in the ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Cameroon, like most other sub-Saharan African countries, has adopted laws devolving various responsibilities to local administrations. In the local political discourse, decentralization is seen as bringing essential services closer to the users, especially those in greatest need. However, the national decentralization program ...

  1. Autonomic Management of Application Workflows on Hybrid Computing Infrastructure

    Directory of Open Access Journals (Sweden)

    Hyunjoo Kim

    2011-01-01

    Full Text Available In this paper, we present a programming and runtime framework that enables the autonomic management of complex application workflows on hybrid computing infrastructures. The framework is designed to address system and application heterogeneity and dynamics to ensure that application objectives and constraints are satisfied. The need for such autonomic system and application management is becoming critical as computing infrastructures become increasingly heterogeneous, integrating different classes of resources from high-end HPC systems to commodity clusters and clouds. For example, the framework presented in this paper can be used to provision the appropriate mix of resources based on application requirements and constraints. The framework also monitors the system/application state and adapts the application and/or resources to respond to changing requirements or environment. To demonstrate the operation of the framework and to evaluate its capabilities, we employ a workflow used to characterize an oil reservoir, executing on a hybrid infrastructure composed of TeraGrid nodes and Amazon EC2 instances of various types. Specifically, we show how different application objectives, such as acceleration, conservation and resilience, can be effectively achieved while satisfying deadline and budget constraints, using an appropriate mix of dynamically provisioned resources. Our evaluations also demonstrate that public clouds can be used to complement and reinforce the scheduling and usage of traditional high performance computing infrastructure.

  2. Decentralization and financial autonomy: a challenge for local public authorities in the Republic of Moldova

    Directory of Open Access Journals (Sweden)

    Tatiana MANOLE

    2017-09-01

    Full Text Available This article reflects the decentralization process currently taking place in the Republic of Moldova. The purpose of the research is to acquaint readers with the fundamental concept of decentralization, with the areas of administrative decentralization, with the forms of manifestation of financial decentralization: fiscal decentralization and budget decentralization. The priorities of the decentralization process are identified.

  3. Workflow Management in CLARIN-DK

    DEFF Research Database (Denmark)

    Jongejan, Bart

    2013-01-01

    The CLARIN-DK infrastructure is not only a repository of resources, but also a place where users can analyse, annotate, reformat and potentially even translate resources, using tools that are integrated in the infrastructure as web services. In many cases a single tool does not produce the desired … with the features that describe her goal, because the workflow manager not only executes chains of tools in a workflow, but also takes care of autonomously devising workflows that serve the user’s intention, given the tools that currently are integrated in the infrastructure as web services. To do this...

  4. Multilevel Workflow System in the ATLAS Experiment

    CERN Document Server

    Borodin, M; The ATLAS collaboration; Golubkov, D; Klimentov, A; Maeno, T; Vaniachine, A

    2015-01-01

    The ATLAS experiment is scaling up Big Data processing for the next LHC run using a multilevel workflow system comprised of many layers. In Big Data processing ATLAS deals with datasets, not individual files. Similarly a task (comprised of many jobs) has become a unit of the ATLAS workflow in distributed computing, with about 0.8M tasks processed per year. In order to manage the diversity of LHC physics (exceeding 35K physics samples per year), the individual data processing tasks are organized into workflows. For example, the Monte Carlo workflow is composed of many steps: generate or configure hard-processes, hadronize signal and minimum-bias (pileup) events, simulate energy deposition in the ATLAS detector, digitize electronics response, simulate triggers, reconstruct data, convert the reconstructed data into ROOT ntuples for physics analysis, etc. Outputs are merged and/or filtered as necessary to optimize the chain. The bi-level workflow manager - ProdSys2 - generates actual workflow tasks and their jobs...

  5. Centralized, Decentralized, and Hybrid Purchasing Organizations

    DEFF Research Database (Denmark)

    Bals, Lydia; Turkulainen, Virpi

    This paper addresses one of the focal issues in purchasing and supply management – global sourcing – from an organizational design perspective. In particular, we elaborate the traditional classification of global sourcing organization designs into centralized, decentralized, and hybrid models. We … organization we can identify organization designs beyond the classical centralization-decentralization continuum. We also provide explanations for the observed organization design at GCC. The study contributes to research on purchasing and supply management as well as research on organization design.

  6. CMS Distributed Computing Workflow Experience

    CERN Document Server

    Haas, Jeffrey David

    2010-01-01

    The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level, which contain re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail. The operational optimization of resource usage is described. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies as well as their impact on the delivery of physics results is discussed and lessons are drawn from this experience. The simul...

  7. DECENTRALIZATION OF MUNICIPAL SERVICES – LEARNING BY DOING

    Directory of Open Access Journals (Sweden)

    Cristina Elena NICOLESCU

    2017-05-01

    Full Text Available Public services decentralization is a major concern for policy makers when it comes to identifying the optimum model for reorganizing these services in light of the 3Es of organizational performance. Field experience shows that this process differs both from one state to another and across activity sectors, among which the local transport service stands out as an ‘institutional orphan’. Taking into account one of the recognition criteria for smart cities, urban mobility, the paper aims to substantiate that, although the incrementalism specific to public services decentralization has a negative impact on service efficiency, in the case of the local transport service the recognition of the right to mobility, and the need to ensure the environment for exercising this right, impel the ‘bureaucratic apparatus’ to accelerate and consolidate the decentralization of this service. Therefore, the paper puts forward a case study on the impact of decentralization upon the local public transport service of the Bucharest municipality.

  8. Towards a Decentralized Magnetic Indoor Positioning System

    Science.gov (United States)

    Kasmi, Zakaria; Norrdine, Abdelmoumen; Blankenbach, Jörg

    2015-01-01

    Decentralized magnetic indoor localization is a sophisticated method for processing sampled magnetic data directly on a mobile station (MS), thereby decreasing or even avoiding the need for communication with the base station. In contrast to central-oriented positioning systems, which transmit raw data to a base station, decentralized indoor localization pushes application-level knowledge into the MS. A decentralized position solution thus has strong potential to increase energy efficiency and to prolong the lifetime of the MS. In this article, we present a complete architecture and an implementation for a decentralized positioning system. Furthermore, we introduce a technique for the synchronization of the observed magnetic field on the MS with the artificially-generated magnetic field from the coils. Based on real-time clocks (RTCs) and a preemptive operating system, this method allows stand-alone control of the coils and a proper assignment of the measured magnetic fields on the MS. Stand-alone control and synchronization of the coils and the MS have exceptional potential to implement a positioning system without the need for wired or wireless communication, and enable the deployment of applications for rescue scenarios, like localization of miners or firefighters. PMID:26690145

  9. Towards a Decentralized Magnetic Indoor Positioning System

    Directory of Open Access Journals (Sweden)

    Zakaria Kasmi

    2015-12-01

    Full Text Available Decentralized magnetic indoor localization is a sophisticated method for processing sampled magnetic data directly on a mobile station (MS), thereby decreasing or even avoiding the need for communication with the base station. In contrast to central-oriented positioning systems, which transmit raw data to a base station, decentralized indoor localization pushes application-level knowledge into the MS. A decentralized position solution thus has strong potential to increase energy efficiency and to prolong the lifetime of the MS. In this article, we present a complete architecture and an implementation for a decentralized positioning system. Furthermore, we introduce a technique for the synchronization of the observed magnetic field on the MS with the artificially-generated magnetic field from the coils. Based on real-time clocks (RTCs) and a preemptive operating system, this method allows stand-alone control of the coils and a proper assignment of the measured magnetic fields on the MS. Stand-alone control and synchronization of the coils and the MS have exceptional potential to implement a positioning system without the need for wired or wireless communication, and enable the deployment of applications for rescue scenarios, like localization of miners or firefighters.

  10. From Requirements via Colored Workflow Nets to an Implementation in Several Workflow Systems

    DEFF Research Database (Denmark)

    Mans, Ronny S.; van der Aalst, Willibrordus Martinus Pancratius; Molemann, A.J.

    2007-01-01

    Care organizations, such as hospitals, need to support complex and dynamic workflows. Moreover, many disciplines are involved. This makes it important to avoid the typical disconnect between requirements and the actual implementation of the system. This paper proposes an approach where an Executable Use Case (EUC) and a Colored Workflow Net...

  11. The PBase Scientific Workflow Provenance Repository

    Directory of Open Access Journals (Sweden)

    Víctor Cuevas-Vicenttín

    2014-10-01

    Full Text Available Scientific workflows and their supporting systems are becoming increasingly popular for compute-intensive and data-intensive scientific experiments. The advantages scientific workflows offer include rapid and easy workflow design, software and data reuse, scalable execution, sharing and collaboration, and other advantages that altogether facilitate “reproducible science”. In this context, provenance – information about the origin, context, derivation, ownership, or history of some artifact – plays a key role, since scientists are interested in examining and auditing the results of scientific experiments. However, in order to perform such analyses on scientific results as part of extended research collaborations, an adequate environment and tools are required. Concretely, the need arises for a repository that will facilitate the sharing of scientific workflows and their associated execution traces in an interoperable manner, also enabling querying and visualization. Furthermore, such functionality should be supported while taking performance and scalability into account. With this purpose in mind, we introduce PBase: a scientific workflow provenance repository implementing the ProvONE proposed standard, which extends the emerging W3C PROV standard for provenance data with workflow specific concepts. PBase is built on the Neo4j graph database, thus offering capabilities such as declarative and efficient querying. Our experiences demonstrate the power gained by supporting various types of queries for provenance data. In addition, PBase is equipped with a user friendly interface tailored for the visualization of scientific workflow provenance data, making the specification of queries and the interpretation of their results easier and more effective.
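
    As a flavor of the declarative querying such a graph store enables, a lineage-style lookup through a hypothetical Python session; the node labels, relationship types, endpoint, and credentials are invented for illustration and do not reproduce PBase's actual ProvONE schema:

      from neo4j import GraphDatabase  # official Neo4j Python driver

      # Hypothetical schema: (:Execution)-[:USED]->(:Data),
      #                      (:Execution)-[:GENERATED]->(:Data)
      LINEAGE = """
      MATCH (d:Data {id: $data_id})-[:GENERATED|USED*]-(step:Execution)
      RETURN DISTINCT step.name AS step
      """

      driver = GraphDatabase.driver("bolt://localhost:7687",
                                    auth=("neo4j", "password"))
      with driver.session() as session:
          # All executions connected to this output through provenance edges.
          for record in session.run(LINEAGE, data_id="run42/output.nc"):
              print(record["step"])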

  12. Dynamic Voltage Frequency Scaling Simulator for Real Workflows Energy-Aware Management in Green Cloud Computing.

    Science.gov (United States)

    Cotes-Ruiz, Iván Tomás; Prado, Rocío P; García-Galán, Sebastián; Muñoz-Expósito, José Enrique; Ruiz-Reyes, Nicolás

    2017-01-01

    Nowadays, the growing computational capabilities of Cloud systems rely on the reduction of the consumed power of their data centers to make them sustainable and economically profitable. The efficient management of computing resources is at the heart of any energy-aware data center and of special relevance is the adaptation of its performance to workload. Intensive computing applications in diverse areas of science generate complex workload called workflows, whose successful management in terms of energy saving is still at its beginning. WorkflowSim is currently one of the most advanced simulators for research on workflows processing, offering advanced features such as task clustering and failure policies. In this work, an expected power-aware extension of WorkflowSim is presented. This new tool integrates a power model based on a computing-plus-communication design to allow the optimization of new management strategies in energy saving considering computing, reconfiguration and networks costs as well as quality of service, and it incorporates the preeminent strategy for on host energy saving: Dynamic Voltage Frequency Scaling (DVFS). The simulator is designed to be consistent in different real scenarios and to include a wide repertory of DVFS governors. Results showing the validity of the simulator in terms of resources utilization, frequency and voltage scaling, power, energy and time saving are presented. Also, results achieved by the intra-host DVFS strategy with different governors are compared to those of the data center using a recent and successful DVFS-based inter-host scheduling strategy as overlapped mechanism to the DVFS intra-host technique.
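
    The saving mechanism DVFS exploits can be shown with the standard cubic frequency-power rule; the coefficients below are invented, and real governors switch frequency dynamically rather than fixing it per task as this toy does:

      # Toy CMOS-style power model: P(f) = P_static + c * f**3.
      P_STATIC, C = 10.0, 20.0   # watts; illustrative coefficients

      def energy(cycles, f):
          """Energy (J) to run `cycles` (1e9) at f GHz: time = cycles / f."""
          t = cycles / f
          return (P_STATIC + C * f**3) * t

      work = 5.0  # billions of cycles
      for f in (1.0, 1.5, 2.0):
          print(f"f={f:.1f} GHz  time={work/f:5.2f} s  "
                f"energy={energy(work, f):7.1f} J")

      # Lower frequency trades execution time for energy: at f=1.0 the job
      # takes 5.0 s for 150 J, at f=2.0 only 2.5 s but 425 J.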

  13. Dynamic Voltage Frequency Scaling Simulator for Real Workflows Energy-Aware Management in Green Cloud Computing.

    Directory of Open Access Journals (Sweden)

    Iván Tomás Cotes-Ruiz

    Full Text Available Nowadays, the growing computational capabilities of Cloud systems rely on the reduction of the consumed power of their data centers to make them sustainable and economically profitable. The efficient management of computing resources is at the heart of any energy-aware data center and of special relevance is the adaptation of its performance to workload. Intensive computing applications in diverse areas of science generate complex workload called workflows, whose successful management in terms of energy saving is still at its beginning. WorkflowSim is currently one of the most advanced simulators for research on workflows processing, offering advanced features such as task clustering and failure policies. In this work, an expected power-aware extension of WorkflowSim is presented. This new tool integrates a power model based on a computing-plus-communication design to allow the optimization of new management strategies in energy saving considering computing, reconfiguration and networks costs as well as quality of service, and it incorporates the preeminent strategy for on host energy saving: Dynamic Voltage Frequency Scaling (DVFS). The simulator is designed to be consistent in different real scenarios and to include a wide repertory of DVFS governors. Results showing the validity of the simulator in terms of resources utilization, frequency and voltage scaling, power, energy and time saving are presented. Also, results achieved by the intra-host DVFS strategy with different governors are compared to those of the data center using a recent and successful DVFS-based inter-host scheduling strategy as overlapped mechanism to the DVFS intra-host technique.

  14. Digital workflows in contemporary orthodontics

    Directory of Open Access Journals (Sweden)

    Lars R Christensen

    2017-01-01

    Digital workflows are increasingly possible in orthodontic practice. Workflows designed to improve the customization of orthodontic appliances are now available through laboratories and orthodontic manufacturing facilities in many parts of the world, and they have the potential to improve certain aspects of patient care.

  15. Decentralization: A panacea for functional education and national ...

    African Journals Online (AJOL)

    Decentralization of power from the federal government to state and local governments is the way to go, especially in the management of our education system. Education can be best delivered at the state and local government levels. Decentralization of educational management in Nigeria will encourage creativity and ...

  16. Decentralized flight trajectory planning of multiple aircraft

    OpenAIRE

    Yokoyama, Nobuhiro; 横山 信宏

    2008-01-01

    Conventional decentralized algorithms for optimal trajectory planning tend to require prohibitive computational time as the number of aircraft increases. To overcome this drawback, this paper proposes a novel decentralized trajectory planning algorithm adopting a constraints decoupling approach for parallel optimization. The constraints decoupling approach is formulated as the path constraints of the real-time trajectory optimization problem based on nonlinear programming. Due to the parallel...

  17. Pathology economic model tool: a novel approach to workflow and budget cost analysis in an anatomic pathology laboratory.

    Science.gov (United States)

    Muirhead, David; Aoun, Patricia; Powell, Michael; Juncker, Flemming; Mollerup, Jens

    2010-08-01

    The need for higher efficiency, maximum quality, and faster turnaround time is a continuous focus for anatomic pathology laboratories and drives changes in work scheduling, instrumentation, and management control systems. To determine the costs of generating routine, special, and immunohistochemical microscopic slides in a large, academic anatomic pathology laboratory using a top-down approach. The Pathology Economic Model Tool was used to analyze workflow processes at The Nebraska Medical Center's anatomic pathology laboratory. Data from the analysis were used to generate complete cost estimates, which included not only materials, consumables, and instrumentation but also specific labor and overhead components for each of the laboratory's subareas. The cost data generated by the Pathology Economic Model Tool were compared with the cost estimates generated using relative value units. Despite the use of automated systems for different processes, the workflow in the laboratory was found to be relatively labor intensive. The effect of labor and overhead on per-slide costs was significantly underestimated by traditional relative-value unit calculations when compared with the Pathology Economic Model Tool. Specific workflow defects with significant contributions to the cost per slide were identified. The cost of providing routine, special, and immunohistochemical slides may be significantly underestimated by traditional methods that rely on relative value units. Furthermore, a comprehensive analysis may identify specific workflow processes requiring improvement.
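
    As a toy illustration of the top-down idea, a per-slide cost can be assembled by spreading labor, overhead and depreciation over annual slide volume before adding per-slide consumables; all figures below are hypothetical, not the paper's data.

    # Toy top-down per-slide cost estimate (all figures hypothetical).
    subarea = {
        "labor_cost": 180_000.0,        # annual salaries for the subarea
        "overhead": 95_000.0,           # space, admin, maintenance share
        "instrument_depreciation": 40_000.0,
        "consumables_per_slide": 1.75,
        "slides_per_year": 60_000,
    }

    def cost_per_slide(s):
        # Spread fixed components over annual slide volume, then add consumables.
        fixed = s["labor_cost"] + s["overhead"] + s["instrument_depreciation"]
        return fixed / s["slides_per_year"] + s["consumables_per_slide"]

    print(f"estimated cost per slide: ${cost_per_slide(subarea):.2f}")
    # (180000 + 95000 + 40000) / 60000 + 1.75 = 5.25 + 1.75 = 7.00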

  18. U-Form vs. M-Form: How to Understand Decision Autonomy Under Healthcare Decentralization?

    Science.gov (United States)

    Bustamante, Arturo Vargas

    2016-01-01

    For more than three decades healthcare decentralization has been promoted in developing countries as a way of improving the financing and delivery of public healthcare. Decision autonomy under healthcare decentralization would determine the role and scope of responsibility of local authorities. Jalal Mohammed, Nicola North, and Toni Ashton analyze decision autonomy within decentralized services in Fiji. They conclude that the narrow decision space allowed to local entities might have limited the benefits of decentralization on users and providers. To discuss the costs and benefits of healthcare decentralization this paper uses the U-form and M-form typology to further illustrate the role of decision autonomy under healthcare decentralization. This paper argues that when evaluating healthcare decentralization, it is important to determine whether the benefits from decentralization are greater than its costs. The U-form and M-form framework is proposed as a useful typology to evaluate different types of institutional arrangements under healthcare decentralization. Under this model, the more decentralized organizational form (M-form) is superior if the benefits from flexibility exceed the costs of duplication and the more centralized organizational form (U-form) is superior if the savings from economies of scale outweigh the costly decision-making process from the center to the regions. Budgetary and financial autonomy and effective mechanisms to maintain local governments accountable for their spending behavior are key decision autonomy variables that could sway the cost-benefit analysis of healthcare decentralization. PMID:27694684

  19. The equivalency between logic Petri workflow nets and workflow nets.

    Science.gov (United States)

    Wang, Jing; Yu, ShuXia; Du, YuYue

    2015-01-01

    Logic Petri nets (LPNs) can describe and analyze batch processing functions and passing-value indeterminacy in cooperative systems. Logic Petri workflow nets (LPWNs) are proposed based on LPNs in this paper. Process mining is regarded as an important bridge between the modeling and analysis of data mining and business processes. Workflow nets (WF-nets) are an extension of Petri nets (PNs) and have been used successfully in process mining. Some shortcomings cannot be avoided in process mining, such as duplicate tasks, invisible tasks, and noisy logs. An online shop in electronic commerce is modeled in this paper to prove the equivalence between LPWNs and WF-nets, and the advantages of LPWNs are presented.
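
    For readers unfamiliar with the formalism, the sketch below shows the firing rule that WF-nets inherit from ordinary Petri nets: a transition is enabled when every input place holds a token, and firing consumes input tokens and produces output tokens. The tiny net is an invented example, not the paper's online-shop model.

    # Minimal Petri net firing rule (illustrative net, not the paper's model).
    # Each transition maps to (input places, output places).
    transitions = {
        "receive_order": ({"start"}, {"ordered"}),
        "pay":           ({"ordered"}, {"paid"}),
        "ship":          ({"paid"}, {"end"}),
    }

    def enabled(marking, t):
        inputs, _ = transitions[t]
        return all(marking.get(p, 0) > 0 for p in inputs)

    def fire(marking, t):
        inputs, outputs = transitions[t]
        m = dict(marking)
        for p in inputs:
            m[p] -= 1
        for p in outputs:
            m[p] = m.get(p, 0) + 1
        return m

    m = {"start": 1}            # WF-nets start with one token in the source place
    for t in ["receive_order", "pay", "ship"]:
        assert enabled(m, t)
        m = fire(m, t)
    print(m)                    # {'start': 0, 'ordered': 0, 'paid': 0, 'end': 1}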

  20. The Equivalency between Logic Petri Workflow Nets and Workflow Nets

    Science.gov (United States)

    Wang, Jing; Yu, ShuXia; Du, YuYue

    2015-01-01

    Logic Petri nets (LPNs) can describe and analyze batch processing functions and passing-value indeterminacy in cooperative systems. Logic Petri workflow nets (LPWNs) are proposed based on LPNs in this paper. Process mining is regarded as an important bridge between the modeling and analysis of data mining and business processes. Workflow nets (WF-nets) are an extension of Petri nets (PNs) and have been used successfully in process mining. Some shortcomings cannot be avoided in process mining, such as duplicate tasks, invisible tasks, and noisy logs. An online shop in electronic commerce is modeled in this paper to prove the equivalence between LPWNs and WF-nets, and the advantages of LPWNs are presented. PMID:25821845

  1. A comparison of decentralized, distributed, and centralized vibro-acoustic control.

    Science.gov (United States)

    Frampton, Kenneth D; Baumann, Oliver N; Gardonio, Paolo

    2010-11-01

    Direct velocity feedback control of structures is well known to increase structural damping and thus reduce vibration. In multi-channel systems the way in which the velocity signals are used to inform the actuators ranges from decentralized control, through distributed or clustered control to fully centralized control. The objective of distributed controllers is to exploit the anticipated performance advantage of the centralized control while maintaining the scalability, ease of implementation, and robustness of decentralized control. However, and in seeming contradiction, some investigations have concluded that decentralized control performs as well as distributed and centralized control, while other results have indicated that distributed control has significant performance advantages over decentralized control. The purpose of this work is to explain this seeming contradiction in results, to explore the effectiveness of decentralized, distributed, and centralized vibro-acoustic control, and to expand the concept of distributed control to include the distribution of the optimization process and the cost function employed.
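
    Direct velocity feedback is simple enough to sketch: each actuator is driven by the negated velocity of its own collocated sensor, so damping is added with no communication between channels. The two-mass chain below is invented for illustration; the masses, stiffnesses and gain are assumptions, not parameters from the paper.

    # Decentralized direct velocity feedback on a 2-mass spring chain (illustrative).
    import numpy as np

    m, k, g = 1.0, 100.0, 4.0             # mass, stiffness, feedback gain (assumed)
    K = np.array([[2 * k, -k], [-k, 2 * k]])  # stiffness matrix of the chain

    def step(x, v, dt=1e-3):
        # Each channel uses only its own velocity: u_i = -g * v_i (decentralized).
        u = -g * v
        a = (u - K @ x) / m
        v = v + dt * a
        x = x + dt * v                    # semi-implicit Euler keeps the oscillation stable
        return x, v

    x, v = np.array([0.01, 0.0]), np.zeros(2)   # initial displacement on mass 1
    for _ in range(5000):
        x, v = step(x, v)
    print(np.abs(x).max())                # decays toward zero: feedback added damping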

  2. Decentralization : Local Partnerships for Health Services in the ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    However, the national decentralization program is having a hard time getting on track. In the face of day-to-day difficulties Zenü Network, a nongovernmental organization, would like to make a contribution to this social project. The Network would like to demonstrate that civil society can work with decentralized government ...

  3. Responsive web design workflow

    OpenAIRE

    LAAK, TIMO

    2013-01-01

    Responsive Web Design Workflow is a literature review about responsive web design, a modern web design paradigm based on web standards. The goals of this research were to define what responsive web design is, to determine its importance in building modern websites, and to describe a workflow for responsive web design projects. Responsive web design is a paradigm for creating adaptive websites, which respond to the properties of the media used to render them. The three key elements of responsi...

  4. Pro WF: Windows Workflow in .NET 4.0

    CERN Document Server

    Bukovics, Bruce

    2010-01-01

    Windows Workflow Foundation (WF) is a revolutionary part of the .NET 4 Framework that allows you to orchestrate human and system interactions as a series of workflows that can be easily mapped, analyzed, adjusted, and implemented. As business problems become more complex, the need for workflow-based solutions has never been more evident. WF provides a simple and consistent way to model and implement complex problems. As a developer, you focus on developing the business logic for individual workflow tasks. The runtime handles the execution of those tasks after they have been composed into a workflow.

  5. Patient-centered care requires a patient-oriented workflow model.

    Science.gov (United States)

    Ozkaynak, Mustafa; Brennan, Patricia Flatley; Hanauer, David A; Johnson, Sharon; Aarts, Jos; Zheng, Kai; Haque, Saira N

    2013-06-01

    Effective design of health information technology (HIT) for patient-centered care requires consideration of workflow from the patient's perspective, termed 'patient-oriented workflow.' This approach organizes the building blocks of work around the patients who are moving through the care system. Patient-oriented workflow complements the more familiar clinician-oriented workflow approaches, and offers several advantages, including the ability to capture simultaneous, cooperative work, which is essential in care delivery. Patient-oriented workflow models can also provide an understanding of healthcare work taking place in various formal and informal health settings in an integrated manner. We present two cases demonstrating the potential value of patient-oriented workflow models. Significant theoretical, methodological, and practical challenges must be met to ensure adoption of patient-oriented workflow models. Patient-oriented workflow models define meaningful system boundaries and can lead to HIT implementations that are more consistent with cooperative work and its emergent features.

  6. Using Mobile Agents to Implement Workflow System

    Institute of Scientific and Technical Information of China (English)

    LI Jie; LIU Xian-xing; GUO Zheng-wei

    2004-01-01

    Current workflow management systems usually adopt existing technologies, such as TCP/IP-based Web technologies and CORBA, for the underlying communications. Very often these have been considered only from a theoretical point of view, mainly for the lack of concrete possibilities of flexible execution. Mobile Agent Technology (MAT) represents a very attractive approach to the distributed control of computer networks and a valid alternative for implementing workflow system strategies. This paper focuses on improving the performance of workflow systems by using MAT. First, the performance of workflow systems based on both CORBA and mobile agents is summarized and analyzed; second, a performance comparison is presented by introducing a mathematical model of each kind of data-interaction process; finally, a mobile agent-based workflow system named MAWMS is presented and described in detail.

  7. Driving up Standards: Civil Service Management and Decentralization: Case Study of Uganda

    Directory of Open Access Journals (Sweden)

    Lazarus Nabaho

    2012-12-01

    There is a consensus that decentralization by devolution leads to improved service delivery, but the debate on the appropriate type of personnel arrangements for delivering decentralized services is far from over. Put differently, the discourse on whether civil service management should be decentralized or devolved still rages on. Little wonder that countries which started off with decentralized civil service management models in the 1990s are currently centralizing some aspects of personnel management, while others have centralized and decentralized personnel arrangements operating side by side in sub-national governments. The paper argues that civil service management should be decentralized whenever a country chooses the path of decentralization by devolution. Using Uganda’s example, the paper highlights two major challenges of managing the civil service under separate personnel arrangements: civil service appointments devoid of merit, and the perennial failure to attract and retain qualified human resources. The paper presents proposals on how to ensure meritocracy in appointments and how to bolster the attraction and retention of human capital in local governments.

  8. Taking advantage of HTML5 browsers to realize the concepts of session state and workflow sharing in web-tool applications

    Science.gov (United States)

    Suftin, I.; Read, J. S.; Walker, J.

    2013-12-01

    Scientists prefer not having to be tied down to a specific machine or operating system in order to analyze local and remote data sets or publish work. Increasingly, analysis has been migrating to decentralized web services and data sets, using web clients to provide the analysis interface. While simplifying workflow access, analysis, and publishing of data, the move does bring with it its own unique set of issues. Web clients used for analysis typically offer workflows geared towards a single user, with steps and results that are often difficult to recreate and share with others. Furthermore, workflow results often may not be easily used as input for further analysis. Older browsers further complicate things by having no way to maintain larger chunks of information, often offloading the job of storage to the back-end server or trying to squeeze it into a cookie. It has been difficult to provide a concept of "session storage" or "workflow sharing" without a complex orchestration of the back end for storage, depending on either a centralized file system or a database. With the advent of HTML5, browsers gained the ability to store more information through the use of the Web Storage API (a browser cookie holds a maximum of 4 kilobytes). Web Storage gives us the ability to store megabytes of arbitrary data in-browser, either with an expiration date or just for a session. This allows scientists to create, update, persist and share their workflows without depending on the back end to store session information, providing the flexibility for new web-based workflows to emerge. In the DSASWeb portal ( http://cida.usgs.gov/DSASweb/ ), using these techniques, the representation of every step in the analyst's workflow is stored as plain-text serialized JSON, which we can generate as a text file and provide to the analyst as an upload. This file may then be shared with others and loaded back into the application, restoring the application to the state it was in when the session file was saved.
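
    The pattern amounts to serializing each workflow step to JSON, persisting the document under a session key, and rebuilding the application state from it later. The sketch below mimics that round trip in Python, with a plain dict standing in for the browser's Web Storage API; the step structure is invented, not DSASweb's actual session format.

    # Session-state round trip, with a dict standing in for window.sessionStorage.
    import json

    session_storage = {}   # browser Web Storage analogue (string keys/values only)

    def save_session(key, workflow_steps):
        # Every step is plain serializable data, so the workflow is one JSON text.
        session_storage[key] = json.dumps({"version": 1, "steps": workflow_steps})

    def load_session(key):
        doc = json.loads(session_storage[key])
        return doc["steps"]

    steps = [
        {"tool": "select_shoreline", "params": {"layer": "coast_2012"}},
        {"tool": "compute_transects", "params": {"spacing_m": 50}},
    ]
    save_session("dsas-session", steps)

    # The serialized text could equally be offered as a file download and
    # re-uploaded by a collaborator to restore the same workflow.
    restored = load_session("dsas-session")
    assert restored == steps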

  9. EFFECT OF FISCAL DECENTRALIZATION ON CAPITAL EXPENDITURE, GROWTH, AND WELFARE

    OpenAIRE

    Badrudin, Rudy

    2013-01-01

    This research analyzes the influence of fiscal decentralization on capital expenditure, economic growth, and social welfare of 29 regencies and 6 cities in Central Java Province, based on data from 2004 to 2008. The method used to analyze the hypotheses is Partial Least Squares. The results show that fiscal decentralization has no significant effect on capital expenditure; fiscal decentralization has a significant effect on economic growth and social welfare; capital expenditure has ...

  10. Tavaxy: integrating Taverna and Galaxy workflows with cloud computing support.

    Science.gov (United States)

    Abouelhoda, Mohamed; Issa, Shadi Alaa; Ghanem, Moustafa

    2012-05-04

    Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: it allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible: users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-)workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high-performance cloud computing to cope with increasing data sizes and analysis complexity. The system can be accessed either through a

  11. Tavaxy: Integrating Taverna and Galaxy workflows with cloud computing support

    Directory of Open Access Journals (Sweden)

    Abouelhoda Mohamed

    2012-05-01

    Background Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. Results In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: it allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible: users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Conclusions Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-)workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high-performance cloud computing to cope with the increasing data size and

  12. Tavaxy: Integrating Taverna and Galaxy workflows with cloud computing support

    Science.gov (United States)

    2012-01-01

    Background Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. Results In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: it allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible: users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Conclusions Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-)workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high-performance cloud computing to cope with the increasing data size and complexity of analysis. The system

  13. Making decentralization work for women in Uganda

    OpenAIRE

    Lakwo, A.

    2009-01-01

    This book is about engendering local governance. It explores the euphoria with which Uganda's decentralization policy took centre stage as a sufficient driver to engender local development responsiveness and accountability. Using a case study of AFARD in Nebbi district, it shows first that decentralized governance is gendered and technocratic as grassroots women's effective participation is lacking. Second, it shows that the insertion of women in local governance is merely a symbolic politica...

  14. ADVANCED APPROACH TO PRODUCTION WORKFLOW COMPOSITION ON ENGINEERING KNOWLEDGE PORTALS

    OpenAIRE

    Novogrudska, Rina; Kot, Tatyana; Globa, Larisa; Schill, Alexander

    2016-01-01

    Background. In the environment of engineering knowledge portals, a great number of partial workflows is concentrated. Such workflows are composed into a general workflow aimed at performing a real, complex production task. The characteristics of partial workflows and the structure of the general workflow are not studied enough, which makes dynamic composition of the general production workflow impossible. Objective. Creating an approach to dynamic composition of the general production workflow based on the partial wor...

  15. Wildfire: distributed, Grid-enabled workflow construction and execution

    Directory of Open Access Journals (Sweden)

    Issac Praveen

    2005-03-01

    Background We observe two trends in bioinformatics: (i) analyses are increasing in complexity, often requiring several applications to be run as a workflow; and (ii) multiple CPU clusters and Grids are available to more scientists. The traditional solution to the problem of running workflows across multiple CPUs required programming, often in a scripting language such as Perl. Programming places such solutions beyond the reach of many bioinformatics consumers. Results We present Wildfire, a graphical user interface for constructing and running workflows. Wildfire borrows user interface features from Jemboss and adds a drag-and-drop interface allowing the user to compose EMBOSS (and other) programs into workflows. For execution, Wildfire uses GEL, the underlying workflow execution engine, which can exploit available parallelism on multi-CPU machines including Beowulf-class clusters and Grids. Conclusion Wildfire simplifies the tasks of constructing and executing bioinformatics workflows.
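
    An execution engine in the spirit of GEL walks the dependency graph and launches every task whose prerequisites have finished, in parallel where possible. The sketch below is a generic, hypothetical version of that idea using Python's standard library; it is not GEL's actual interface or scheduling algorithm.

    # Generic parallel DAG execution (hypothetical engine, not GEL's API).
    from concurrent.futures import ThreadPoolExecutor, wait, FIRST_COMPLETED

    # task -> set of tasks it depends on
    deps = {"align": set(), "trim": set(), "merge": {"align", "trim"}, "report": {"merge"}}

    def run(task):
        print("running", task)
        return task

    done, futures = set(), {}
    with ThreadPoolExecutor(max_workers=4) as pool:
        while len(done) < len(deps):
            # launch every task whose prerequisites are all finished
            for t, d in deps.items():
                if t not in done and t not in futures and d <= done:
                    futures[t] = pool.submit(run, t)
            finished, _ = wait(futures.values(), return_when=FIRST_COMPLETED)
            for t in [t for t, f in futures.items() if f in finished]:
                done.add(t)
                del futures[t]
    print("all tasks completed:", sorted(done))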

  16. Centralization vs. Decentralization: A Location Analysis Approach for Librarians

    Science.gov (United States)

    Raffel, Jeffrey; Shishko, Robert

    1972-01-01

    An application of location theory to the question of centralized versus decentralized library facilities for a university, with relevance for special libraries, is presented. The analysis provides models for a single library, for two or more libraries, or for decentralized facilities.

  17. The Rhetoric of Decentralization

    Science.gov (United States)

    Ravitch, Diane

    1974-01-01

    Questions the rationale for and possible consequences of political decentralization of New York City. Suggests that the disadvantages--reduced level of professionalism, increased expense in multiple government operation, "stabilization" of residential segregation, necessity for budget negotiations because of public disclosure of tax…

  18. Database Support for Workflow Management: The WIDE Project

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Pernici, B; Sánchez, G.; Unknown, [Unknown

    1999-01-01

    Database Support for Workflow Management: The WIDE Project presents the results of the ESPRIT WIDE project on advanced database support for workflow management. The book discusses the state of the art in combining database management and workflow management technology, especially in the areas of

  19. Decentralized energy supply and electricity market structures

    OpenAIRE

    Weber, Christoph; Vogel, Philip

    2005-01-01

    Small decentralized power generation units (DG) are politically promoted because of their potential to reduce GHG-emissions and the existing dependency on fossil fuels. A long term goal of this promotion should be the creation of a level playing field for DG and conventional power generation. Due to the impact of DG on the electricity grid infrastructure, future regulation should consider the costs and benefits of the integration of decentralized energy generation units. Without an adequate c...

  20. Decentralized electricity production. v. 1 and 2

    International Nuclear Information System (INIS)

    1991-01-01

    The first part of the symposium is concerned with market analysis, case studies and prospects for the decentralized production of electricity in France: cogeneration, heat networks, municipal waste incineration, etc. Financing systems and microeconomic analysis are presented. The second part is devoted to macroeconomic outlooks (mainly for France and Europe) on decentralized electricity production (cogeneration, small-scale hydroelectric power plants), to other countries' experience (PV systems connected to the grid, cogeneration, etc.) and to price contracts and regulations.

  1. FINANCIAL CONSOLIDATION OF THE ADMINISTRATIVE-TERRITORIAL ENTITY IN THE LIGHT OF DECENTRALIZATION

    Directory of Open Access Journals (Sweden)

    Tatiana MANOLE

    2017-02-01

    „Should we head towards ‘self-government’ required by many of the participants, would that be a self-government of the citizens or of the elected representatives? Whatever happens, decentralization is, in a way, the book of our society, a book in which we find its aspirations, discrepancies and questions… It is well led from above, but it is well administered from the bottom.” (Xavier Frège, Paris, 1986) This article presents the results of a study regarding the decentralization process currently underway in the Republic of Moldova. The purpose of the study is to highlight the fundamental concept of decentralization, the areas of administrative decentralization, and the forms of manifestation of financial decentralization (fiscal decentralization and budget decentralization), to identify the priorities of the decentralization process, and to establish indicators for measuring the degree of decentralization. Based on statistical analysis and the synthesis method, the current state of the decentralization process in the administrative-territorial entities was determined in relation to the public finance management reform. Proposals were formulated to accelerate the process of financial decentralization and self-government.

  2. FISCAL DECENTRALIZATION IN ALBANIA: EFFECTS OF TERRITORIAL AND ADMINISTRATIVE REFORM

    Directory of Open Access Journals (Sweden)

    Mariola KAPIDANI

    2015-12-01

    The principle of decentralization is a fundamental principle for the establishment and operation of local government. It refers to the process of redistributing authority and responsibility for certain functions from central government to local government units. In many countries, particularly developing countries, fiscal decentralization and local governance issues are addressed as highly important to economic development. According to Stigler (1957), fiscal decentralization brings government closer to the people, and a representative government works best when it is closer to the people. Albania is still undergoing the process of decentralization in all aspects: political, economic, fiscal and administrative. The decentralization process is essential to sustainable economic growth and the efficient allocation of resources to meet the needs of citizens. Albania has a fragmented system of local government with a very large number of local government units that have neither sufficient fiscal nor human capacity to provide public services at a reasonable level (World Bank). However, the recent administrative and territorial reform is expected to have a significant impact on many issues related to local autonomy and revenue management. This paper is focused on the progress of the fiscal decentralization process in Albania, stating key issues and ongoing challenges for an improved system. The purpose of this study is to analyze the effects of the recent territorial reform, identifying problems and opportunities to be addressed in the future.

  3. Decentralizing decision making in modularization strategies

    DEFF Research Database (Denmark)

    Israelsen, Poul; Jørgensen, Brian

    2011-01-01

    which distorts the economic effects of modularization at the level of the individual product. This has the implication that decisions on modularization can only be made by top management if decision authority and relevant information are to be aligned. To overcome this problem, we suggest a solution that aligns the descriptions of the economic consequences of modularization at the project and portfolio level, which makes it possible to decentralize decision making while ensuring that local goals are congruent with the global ones in order to avoid suboptimal behaviour. Keywords: Modularization; Accounting; Cost allocation; Decision rule; Decentralization

  4. Near optimal decentralized H_inf control

    DEFF Research Database (Denmark)

    Stoustrup, J.; Niemann, Hans Henrik

    It is shown that for a class of decentralized control problems there does not exist a sequence of controllers of bounded order which obtains near-optimal control. Neither does there exist an infinite-dimensional optimal controller. Using the insight of the line of proof of these results, a heuri...

  5. Contracts for Cross-Organizational Workflow Management

    NARCIS (Netherlands)

    Koetsier, M.J.; Grefen, P.W.P.J.; Vonk, J.

    1999-01-01

    Nowadays, many organizations form dynamic partnerships to deal effectively with market requirements. As companies use automated workflow systems to control their processes, a way of linking workflow processes in different organizations is useful in turning the co-operating companies into a seamless

  6. Workflow Patterns for Business Process Modeling

    NARCIS (Netherlands)

    Thom, Lucineia Heloisa; Lochpe, Cirano; Reichert, M.U.

    For its reuse advantages, workflow patterns (e.g., control flow patterns, data patterns, resource patterns) are increasingly attracting the interest of both researchers and vendors. Frequently, business process or workflow models can be assembled out of a set of recurrent process fragments (or

  7. The Bases of Federalism and Decentralization in Education

    Directory of Open Access Journals (Sweden)

    Carlos Ornelas

    2003-05-01

    Full Text Available This essay uses the Weberian-type ideal to define the conceptual bases of federalism and the decentralization of education. Classic federalism, ficticious federalism (corporativism, the origins and the indigenous version of the new federalism are discussed. We conclude that Mexican constitutional federalism is baroque and ambiguous. Based on theory and the experiences of various countries, bureaucratic centralism and its main characteristics are defined. As a contrast, a typology of educational decentralization is developed. Taken into account are its political, judicial and administrative definitions; a distinction is made between delegation and decentralization. It is argued that with the signing of the Agreement for the Modernization of Basic Education, the Mexican government sought to increase its legitimacy without losing control of education.

  8. Implementing Workflow Reconfiguration in WS-BPEL

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Dragoni, Nicola; Zhou, Mu

    2012-01-01

    This paper investigates the problem of dynamic reconfiguration by means of a workflow-based case study used for discussion. We state the requirements on a system implementing the workflow and its reconfiguration, and we describe the system’s design in BPMN. WS-BPEL, a language that would not naturally support dynamic change, is used as a target for implementation. The WS-BPEL recovery framework is exploited here to implement the reconfiguration using principles derived from previous research in process algebra, and two mappings from BPMN to WS-BPEL are presented, one automatic and only mostly...

  9. Integrated workflows for spiking neuronal network simulations

    Directory of Open Access Journals (Sweden)

    Ján eAntolík

    2013-12-01

    The increasing availability of computational resources is enabling more detailed, realistic modelling in computational neuroscience, resulting in a shift towards more heterogeneous models of neuronal circuits and the employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeller's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modellers either to handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualisation into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organised configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualisation stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modelling studies by relieving the user from manual handling of the flow of metadata between the individual

  10. Evaluation of Workflow Management Systems - A Meta Model Approach

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    1998-11-01

    The automated enactment of processes through the use of workflow management systems enables the outsourcing of the control flow from application systems. By now a large number of systems that follow different workflow paradigms are available. This leads to the problem of selecting the appropriate workflow management system for a given situation. In this paper we outline the benefits of a meta model approach for the evaluation and comparison of different workflow management systems. After a general introduction to the topic of meta modeling, the meta models of the workflow management systems WorkParty (Siemens Nixdorf) and FlowMark (IBM) are compared as an example. These product-specific meta models can be generalized to meta reference models, which help to specify a workflow methodology. As an example, an organisational reference meta model is presented, which helps users in specifying their requirements for a workflow management system.

  11. The political economy of decentralization of health and social services in Canada.

    Science.gov (United States)

    Tsalikis, G

    1989-01-01

    A trend to decentralization in Canada's 'welfare state' has received support from the Left and from the Right. Some social critics of the Left expect decentralization to result in holistic services adjusted to local needs. Others, moreover, feel we are in the dawn of a new epoch in which major economic transformations are to bring about, through new class alliances and conflict, decentralization of power and a better quality of life in communities. These assumptions and their theoretical pitfalls are discussed here following an historical overview of the centralization/decentralization issue in Canadian social policy. It is argued that recent proposals of decentralization are a continuation of reactionary tendencies to constrain social expenditures, but not a path to better quality of life.

  12. The interplay between decentralization and privacy: the case of blockchain technologies

    OpenAIRE

    De Filippi , Primavera

    2016-01-01

    Decentralized architectures are gaining popularity as a way to protect one's privacy against the ubiquitous surveillance of states and corporations. Yet, in spite of the obvious benefits they provide when it comes to data sovereignty, decentralized architectures also present certain characteristics that, if not properly accounted for, might ultimately impinge upon users' privacy. While they are capable of preserving the confidentiality of data, decentralized architecture...

  13. Centralization Versus Decentralization: A Location Analysis Approach for Librarians.

    Science.gov (United States)

    Shishko, Robert; Raffel, Jeffrey

    One of the questions that seems to perplex many university and special librarians is whether to move in the direction of centralizing or decentralizing the library's collections and facilities. Presented is a theoretical approach, employing location theory, to the library centralization-decentralization question. Location theory allows the analyst…

  14. SU-E-J-73: Extension of a Clinical OIS/EMR/R&V System to Deliver Safe and Efficient Adaptive Plan-Of-The-Day Treatments Using a Fully Customizable Plan-Library-Based Workflow

    Energy Technology Data Exchange (ETDEWEB)

    Akhiat, A. [Erasmus MC Cancer Institute, Radiation Oncology, Rotterdam (Netherlands); Elekta, Sunnyvale, CA (United States); Kanis, A.P.; Penninkhof, J.J.; Sodjo, S.; O’Neill, T.; Quint, S.; Doorn, X. van; Schillemans, W.; Heijmen, B.; Hoogeman, M. [Erasmus MC Cancer Institute, Radiation Oncology, Rotterdam (Netherlands); Linton, N.; Coleman, A. [Elekta, Sunnyvale, CA (United States)

    2015-06-15

    Purpose: To extend a clinical Record and Verify (R&V) system to enable a safe and fast workflow for Plan-of-the-Day (PotD) adaptive treatments based on patient-specific plan libraries. Methods: Plan libraries for PotD adaptive treatments contain for each patient several pre-treatment generated treatment plans. They may be generated for various patient anatomies or CTV-PTV margins. For each fraction, a Cone Beam CT scan is acquired to support the selection of the plan that best fits the patient’s anatomy-of-the-day. To date, there are no commercial R&V systems that support PotD delivery strategies. Consequently, the clinical workflow requires many manual interventions. Moreover, multiple scheduled plans have a high risk of excessive dose delivery. In this work we extended a commercial R&V system (MOSAIQ) to support PotD workflows using IQ-scripting. The PotD workflow was designed after extensive risk analysis of the manual procedure, and all identified risks were incorporated as logical checks. Results: All manual PotD activities were automated. The workflow first identifies if the patient is scheduled for PotD, then performs safety checks, and continues to treatment plan selection only if no issues were found. The user selects the plan to deliver from a list of candidate plans. After plan selection, the workflow makes the treatment fields of the selected plan available for delivery by adding them to the treatment calendar. Finally, control is returned to the R&V system to commence treatment. Additional logic was added to incorporate off-line changes such as updating the plan library. After extensive testing including treatment fraction interrupts and plan-library updates during the treatment course, the workflow is running successfully in a clinical pilot, in which 35 patients have been treated since October 2014. Conclusion: We have extended a commercial R&V system for improved safety and efficiency in library-based adaptive strategies enabling a wide
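
    The scripted workflow described above essentially gates plan selection behind automated safety checks. The sketch below is a hypothetical reconstruction of that control flow; all names are invented and MOSAIQ's IQ-scripting interface is not represented.

    # Hypothetical plan-of-the-day gating logic (invented names; not MOSAIQ IQ-scripting).
    from dataclasses import dataclass

    @dataclass
    class Patient:
        potd_enrolled: bool
        plan_library: list        # pre-treatment generated candidate plans
        scheduled_plans: list     # plans currently on the treatment calendar

    def select_plan_of_the_day(patient, choose):
        if not patient.potd_enrolled:
            return None                       # ordinary, non-adaptive pathway
        # Safety checks first: multiple scheduled plans risk excessive dose delivery.
        if patient.scheduled_plans:
            raise RuntimeError("calendar not clean: refusing to schedule a second plan")
        if not patient.plan_library:
            raise RuntimeError("empty plan library")
        plan = choose(patient.plan_library)   # user picks best fit for today's CBCT
        patient.scheduled_plans.append(plan)  # only now expose fields for delivery
        return plan

    p = Patient(True, ["small-margin", "large-margin"], [])
    print(select_plan_of_the_day(p, choose=lambda plans: plans[0]))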

  15. SU-E-J-73: Extension of a Clinical OIS/EMR/R&V System to Deliver Safe and Efficient Adaptive Plan-Of-The-Day Treatments Using a Fully Customizable Plan-Library-Based Workflow

    International Nuclear Information System (INIS)

    Akhiat, A.; Kanis, A.P.; Penninkhof, J.J.; Sodjo, S.; O’Neill, T.; Quint, S.; Doorn, X. van; Schillemans, W.; Heijmen, B.; Hoogeman, M.; Linton, N.; Coleman, A.

    2015-01-01

    Purpose: To extend a clinical Record and Verify (R&V) system to enable a safe and fast workflow for Plan-of-the-Day (PotD) adaptive treatments based on patient-specific plan libraries. Methods: Plan libraries for PotD adaptive treatments contain for each patient several pre-treatment generated treatment plans. They may be generated for various patient anatomies or CTV-PTV margins. For each fraction, a Cone Beam CT scan is acquired to support the selection of the plan that best fits the patient’s anatomy-of-the-day. To date, there are no commercial R&V systems that support PotD delivery strategies. Consequently, the clinical workflow requires many manual interventions. Moreover, multiple scheduled plans have a high risk of excessive dose delivery. In this work we extended a commercial R&V system (MOSAIQ) to support PotD workflows using IQ-scripting. The PotD workflow was designed after extensive risk analysis of the manual procedure, and all identified risks were incorporated as logical checks. Results: All manual PotD activities were automated. The workflow first identifies if the patient is scheduled for PotD, then performs safety checks, and continues to treatment plan selection only if no issues were found. The user selects the plan to deliver from a list of candidate plans. After plan selection, the workflow makes the treatment fields of the selected plan available for delivery by adding them to the treatment calendar. Finally, control is returned to the R&V system to commence treatment. Additional logic was added to incorporate off-line changes such as updating the plan library. After extensive testing including treatment fraction interrupts and plan-library updates during the treatment course, the workflow is running successfully in a clinical pilot, in which 35 patients have been treated since October 2014. Conclusion: We have extended a commercial R&V system for improved safety and efficiency in library-based adaptive strategies enabling a wide

  16. Multilevel Workflow System in the ATLAS Experiment

    International Nuclear Information System (INIS)

    Borodin, M; De, K; Navarro, J Garcia; Golubkov, D; Klimentov, A; Maeno, T; Vaniachine, A

    2015-01-01

    The ATLAS experiment is scaling up Big Data processing for the next LHC run using a multilevel workflow system comprised of many layers. In Big Data processing ATLAS deals with datasets, not individual files. Similarly, a task (comprised of many jobs) has become a unit of the ATLAS workflow in distributed computing, with about 0.8M tasks processed per year. In order to manage the diversity of LHC physics (exceeding 35K physics samples per year), the individual data processing tasks are organized into workflows. For example, the Monte Carlo workflow is composed of many steps: generate or configure hard processes, hadronize signal and minimum-bias (pileup) events, simulate energy deposition in the ATLAS detector, digitize the electronics response, simulate triggers, reconstruct data, convert the reconstructed data into ROOT ntuples for physics analysis, etc. Outputs are merged and/or filtered as necessary to optimize the chain. The bi-level workflow manager - ProdSys2 - generates the actual workflow tasks, and their jobs are executed across more than a hundred distributed computing sites by PanDA - the ATLAS job-level workload management system. On the outer level, the Database Engine for Tasks (DEfT) empowers production managers with templated workflow definitions. On the next level, the Job Execution and Definition Interface (JEDI) is integrated with PanDA to provide dynamic job definitions tailored to the sites' capabilities. We report on scaling up the production system to accommodate a growing number of requirements from the main ATLAS areas: Trigger, Physics and Data Preparation.

  17. Corruption and government spending : The role of decentralization

    OpenAIRE

    Korneliussen, Kristine

    2009-01-01

    This thesis points to a possible weakness of the empirical literature on corruption and government spending. That corruption affects the composition of government spending, and in particular that it affects education and health spending adversely, seems to be empirically well established. However, there exists additional literature closely related to corruption and government spending, treating (i) a relationship between corruption and decentralization, and (ii) a relationship between decentral...

  18. Workflow User Interfaces Patterns

    Directory of Open Access Journals (Sweden)

    Jean Vanderdonckt

    2012-03-01

    A collection of user interface design patterns for workflow information systems is presented that contains forty-three resource patterns classified in seven categories. These categories and their corresponding patterns have been logically identified from the task life cycle, based on offering and allocation operations. Each Workflow User Interface Pattern (WUIP) is characterized by properties expressed in the PLML markup language for expressing patterns and augmented by additional attributes and models attached to the pattern: the abstract user interface and the corresponding task model. These models are specified in a User Interface Description Language. All WUIPs are stored in a library and can be retrieved within a workflow editor that links each workflow pattern to its corresponding WUIP, thus giving rise to a user interface for each workflow pattern.

  19. On Deciding How to Decide: To Centralize or Decentralize.

    Science.gov (United States)

    Chaffee, Ellen Earle

    Issues concerning whether to centralize or decentralize decision-making are addressed, with applications for colleges. Centralization/decentralization (C/D) must be analyzed with reference to a particular decision. Three components of C/D are locus of authority, breadth of participation, and relative contribution by the decision-maker's staff. C/D…

  20. Optimal resource assignment in workflows for maximizing cooperation

    NARCIS (Netherlands)

    Kumar, Akhil; Dijkman, R.M.; Song, Minseok; Daniel, Fl.; Wang, J.; Weber, B.

    2013-01-01

    A workflow is a team process since many actors work on various tasks to complete an instance. Resource management in such workflows deals with assignment of tasks to workers or actors. In team formation, it is necessary to ensure that members of a team are compatible with each other. When a workflow
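
    One way to make the compatibility objective concrete is to score every feasible assignment by the pairwise compatibility of the resulting team and keep the best; the brute-force sketch below uses invented compatibility scores and is not the paper's algorithm.

    # Brute-force team formation maximizing pairwise compatibility (illustrative only).
    from itertools import permutations

    tasks = ["review", "approve"]
    workers = ["ann", "bob", "cho"]
    compat = {("ann", "bob"): 0.9, ("ann", "cho"): 0.4, ("bob", "cho"): 0.7}

    def team_score(team):
        # Sum compatibility over all unordered pairs of distinct team members.
        members = sorted(set(team))
        return sum(compat.get((a, b), 0.0)
                   for i, a in enumerate(members) for b in members[i + 1:])

    best = max(permutations(workers, len(tasks)), key=team_score)
    print(dict(zip(tasks, best)), team_score(best))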

  1. Research on calculation of the IOL tilt and decentration based on surface fitting.

    Science.gov (United States)

    Li, Lin; Wang, Ke; Yan, Yan; Song, Xudong; Liu, Zhicheng

    2013-01-01

    The tilt and decentration of an intraocular lens (IOL) result in defocusing, astigmatism, and wavefront aberration after the operation. The objective is to give a method to estimate the tilt and decentration of the IOL more accurately. Based on AS-OCT images of twelve eyes from eight cases with subluxated lenses after operation, we fitted a spherical equation to the data obtained from the images of the anterior and posterior surfaces of the IOL. Using the established relationship between IOL tilt (decentration) and the scanned angle at which a piece of AS-OCT image was taken by the instrument, the IOL tilt and decentration were calculated. The IOL tilt angle and decentration of each subject were given. Moreover, the horizontal and vertical tilt was also obtained. Accordingly, the possible errors in IOL tilt and decentration inherent in the method employed by the AS-OCT instrument were identified. Based on 6-12 pieces of AS-OCT images at different directions, the tilt angles and decentration values were shown, respectively. The method of surface fitting to the IOL surface can accurately analyze the IOL's location, and six pieces of AS-OCT images at three pairs of symmetrical directions are enough to get the tilt angle and decentration value of the IOL more precisely.
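
    Fitting a sphere to sampled surface points reduces to linear least squares, since x² + y² + z² = 2ax + 2by + 2cz + d is linear in the center (a, b, c) and the constant d = r² − a² − b² − c²; tilt can then be read off the line joining the two fitted surface centers. The sketch below demonstrates that algebra on synthetic points, not the paper's AS-OCT data.

    # Least-squares sphere fit: linear in the center coords (synthetic data).
    import numpy as np

    def fit_sphere(pts):
        # Solve x^2 + y^2 + z^2 = 2ax + 2by + 2cz + d for (a, b, c, d).
        A = np.column_stack([2 * pts, np.ones(len(pts))])
        f = (pts ** 2).sum(axis=1)
        (a, b, c, d), *_ = np.linalg.lstsq(A, f, rcond=None)
        r = np.sqrt(d + a * a + b * b + c * c)
        return np.array([a, b, c]), r

    # Synthetic cap of a sphere centered at (0.1, -0.05, 11) with radius 11 mm.
    rng = np.random.default_rng(0)
    theta = rng.uniform(0, 0.4, 200)
    phi = rng.uniform(0, 2 * np.pi, 200)
    center_true, r_true = np.array([0.1, -0.05, 11.0]), 11.0
    pts = center_true + r_true * np.column_stack(
        [np.sin(theta) * np.cos(phi), np.sin(theta) * np.sin(phi), -np.cos(theta)])

    center, r = fit_sphere(pts)
    print(center, r)   # recovers the generating center and radius
    # The IOL axis could then be taken as the line through the two surface centers,
    # and tilt as its angle to the reference (e.g., visual) axis.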

  2. Research on Calculation of the IOL Tilt and Decentration Based on Surface Fitting

    Directory of Open Access Journals (Sweden)

    Lin Li

    2013-01-01

    The tilt and decentration of an intraocular lens (IOL) result in defocusing, astigmatism, and wavefront aberration after the operation. The objective is to give a method to estimate the tilt and decentration of the IOL more accurately. Based on AS-OCT images of twelve eyes from eight cases with subluxated lenses after operation, we fitted a spherical equation to the data obtained from the images of the anterior and posterior surfaces of the IOL. Using the established relationship between IOL tilt (decentration) and the scanned angle at which a piece of AS-OCT image was taken by the instrument, the IOL tilt and decentration were calculated. The IOL tilt angle and decentration of each subject were given. Moreover, the horizontal and vertical tilt was also obtained. Accordingly, the possible errors in IOL tilt and decentration inherent in the method employed by the AS-OCT instrument were identified. Based on 6–12 pieces of AS-OCT images at different directions, the tilt angles and decentration values were shown, respectively. The method of surface fitting to the IOL surface can accurately analyze the IOL’s location, and six pieces of AS-OCT images at three pairs of symmetrical directions are enough to get the tilt angle and decentration value of the IOL more precisely.

  3. Performing Workflows in Pervasive Environments Based on Context Specifications

    OpenAIRE

    Xiping Liu; Jianxin Chen

    2010-01-01

    The workflow performance consists of the performance of activities and of transitions between activities. Along with the fast development of varied computing devices, activities in workflows and transitions between activities can be performed in pervasive ways, which means that workflow execution needs to migrate from traditional computing environments to pervasive environments. To perform workflows in pervasive environments, one needs to take account of the context information, which affects b...

  4. The Wolf and the Caribou: Coexistence of Decentralized Economies and Competitive Markets

    Directory of Open Access Journals (Sweden)

    Andreas Freund

    2018-05-01

    Starting with BitTorrent and then Bitcoin, decentralized technologies have been on the rise over the last 15+ years, gaining significant momentum in the last 2+ years with the advent of platform ecosystems such as the Blockchain platform Ethereum. New projects have evolved from decentralized games to marketplaces to open funding models to decentralized autonomous organizations. The hype around cryptocurrency and the valuation of innovative projects drove the market cap of cryptocurrencies to over a trillion dollars at one point in 2017. These highly valued technologies are now enabling something new: globally scaled and decentralized business models. Despite their valuation and the hype, these new business ecosystems are frail, not only because the underlying technology is rapidly evolving, but also because competitive markets see a profit opportunity in exponential cryptocurrency returns. This extracts value from these ecosystems, which could lead to their collapse if unchecked. In this paper, we explore novel ways for decentralized economies to protect themselves from, and coexist with, competitive markets at a global scale, utilizing decentralized technologies such as Blockchain.

  5. Analysis and design of robust decentralized controllers for nonlinear systems

    Energy Technology Data Exchange (ETDEWEB)

    Schoenwald, D.A.

    1993-07-01

    Decentralized control strategies for nonlinear systems are achieved via feedback linearization techniques. New results on optimization and parameter robustness of non-linear systems are also developed. In addition, parametric uncertainty in large-scale systems is handled by sensitivity analysis and optimal control methods in a completely decentralized framework. This idea is applied to alleviate uncertainty in friction parameters for the gimbal joints on Space Station Freedom. As an example of decentralized nonlinear control, singular perturbation methods and distributed vibration damping are merged into a control strategy for a two-link flexible manipulator.

  6. Decentralization in Ethiopia

    OpenAIRE

    Gemechu, Mulugeta Debebbe

    2012-01-01

    Ethiopia officially launched the District Level Decentralization Program (DLDP) by the year 2002. The program flagged core objectives such as institutionalizing viable development centers at local levels, deepening devolution of power, enhancing the democratization process through broad-based participatory strategy, promoting good governance and improving service delivery. Since the inception of this program two strategic planning terms (one strategic term is five years) have already elapsed ...

  7. CMS distributed computing workflow experience

    Science.gov (United States)

    Adelman-McCarthy, Jennifer; Gutsche, Oliver; Haas, Jeffrey D.; Prosper, Harrison B.; Dutta, Valentina; Gomez-Ceballos, Guillelmo; Hahn, Kristian; Klute, Markus; Mohapatra, Ajit; Spinoso, Vincenzo; Kcira, Dorian; Caudron, Julien; Liao, Junhui; Pin, Arnaud; Schul, Nicolas; De Lentdecker, Gilles; McCartin, Joseph; Vanelderen, Lukas; Janssen, Xavier; Tsyganov, Andrey; Barge, Derek; Lahiff, Andrew

    2011-12-01

    The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level; these include re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail. The operational optimization of resource usage is described. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies as well as their impact on the delivery of physics results is discussed and lessons are drawn from this experience. The simulation of proton-proton collisions for the CMS experiment is primarily carried out at the second tier of the CMS computing infrastructure. Half of the Tier-2 sites of CMS are reserved for central Monte Carlo (MC) production while the other half is available for user analysis. This paper summarizes the large throughput of the MC production operation during the data taking period of 2010 and discusses the latencies and efficiencies of the various types of MC production workflows. We present the operational procedures to optimize the usage of available resources and we present the operational model of CMS for including opportunistic resources, such as the larger Tier-3 sites, into the central production operation.

  8. CMS distributed computing workflow experience

    International Nuclear Information System (INIS)

    Adelman-McCarthy, Jennifer; Gutsche, Oliver; Haas, Jeffrey D; Prosper, Harrison B; Dutta, Valentina; Gomez-Ceballos, Guillelmo; Hahn, Kristian; Klute, Markus; Mohapatra, Ajit; Spinoso, Vincenzo; Kcira, Dorian; Caudron, Julien; Liao Junhui; Pin, Arnaud; Schul, Nicolas; Lentdecker, Gilles De; McCartin, Joseph; Vanelderen, Lukas; Janssen, Xavier; Tsyganov, Andrey

    2011-01-01

    The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level; these include re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail. The operational optimization of resource usage is described. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies as well as their impact on the delivery of physics results is discussed and lessons are drawn from this experience. The simulation of proton-proton collisions for the CMS experiment is primarily carried out at the second tier of the CMS computing infrastructure. Half of the Tier-2 sites of CMS are reserved for central Monte Carlo (MC) production while the other half is available for user analysis. This paper summarizes the large throughput of the MC production operation during the data taking period of 2010 and discusses the latencies and efficiencies of the various types of MC production workflows. We present the operational procedures to optimize the usage of available resources and we present the operational model of CMS for including opportunistic resources, such as the larger Tier-3 sites, into the central production operation.

  9. Verifying generalized soundness for workflow nets

    NARCIS (Netherlands)

    Hee, van K.M.; Oanea, O.I.; Sidorova, N.; Voorhoeve, M.; Virbitskaite, I.; Voronkov, A.

    2007-01-01

    We improve the decision procedure from [10] for the problem of generalized soundness of workflow nets. A workflow net is generalized sound iff every marking reachable from an initial marking with k tokens on the initial place terminates properly, i.e. it can reach a marking with k tokens on the final place.
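
    Spelled out in standard notation (a paraphrase of the definition above, with i and f denoting the initial and final places of the net N), this reads:

      N \text{ is } k\text{-sound} \iff \forall M :\; k\cdot[i] \xrightarrow{\,*\,} M \implies M \xrightarrow{\,*\,} k\cdot[f]

      N \text{ is generalized sound} \iff N \text{ is } k\text{-sound for every } k \ge 1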

  10. Decentralized control and communication

    Czech Academy of Sciences Publication Activity Database

    Bakule, Lubomír; Papík, Martin

    2012-01-01

    Roč. 36, č. 1 (2012), s. 1-10 ISSN 1367-5788 R&D Projects: GA MŠk(CZ) LG12014 Institutional research plan: CEZ:AV0Z10750506 Keywords : decentralization * communication * large-scale complex systems Subject RIV: BC - Control Systems Theory Impact factor: 1.289, year: 2012

  11. Worklist handling in workflow-enabled radiological application systems

    Science.gov (United States)

    Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim; von Berg, Jens

    2000-05-01

    For the next generation of integrated information systems for health care applications, more emphasis has to be put on systems which, by design, support the reduction of cost, the increase in efficiency and the improvement of the quality of services. A substantial contribution to this will be the modeling, optimization, automation and enactment of processes in health care institutions. One of the perceived key success factors for the system integration of processes will be the application of workflow management, with workflow management systems as key technology components. In this paper we address workflow management in radiology. We focus on an important aspect of workflow management, the generation and handling of worklists, which provide workflow participants automatically with work items that reflect tasks to be performed. The display of worklists and the functions associated with work items are the visible part for the end-users of an information system using a workflow management approach. Appropriate worklist design and implementation will influence the user friendliness of a system and will largely influence work efficiency. Technically, in current imaging department information system environments (modality-PACS-RIS installations), a data-driven approach has been taken: worklists -- if present at all -- are generated from filtered views on application databases. In a future workflow-based approach, worklists will be generated by autonomous workflow services based on explicit process models and organizational models. This process-oriented approach will provide us with an integral view of entire health care processes or sub-processes. The paper describes the basic mechanisms of this approach and summarizes its benefits.
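
    The contrast between the two approaches can be made concrete with a small sketch. The Python fragment below is a hypothetical rendering of the process-oriented approach, where a workflow service offers work items derived from an explicit process model and an organizational model; all names are invented for illustration.

      from dataclasses import dataclass, field

      @dataclass
      class WorkItem:
          task: str      # a task from the explicit process model
          case_id: str   # e.g. a radiological examination
          role: str      # organizational role allowed to perform the task

      @dataclass
      class WorkflowService:
          # process-driven: work items come from the workflow engine's
          # state, not from filtered views over application databases
          pending: list = field(default_factory=list)

          def offer(self, item):
              self.pending.append(item)

          def worklist_for(self, user_roles):
              return [w for w in self.pending if w.role in user_roles]

      svc = WorkflowService()
      svc.offer(WorkItem("report-reading", "exam-0042", "radiologist"))
      print(svc.worklist_for({"radiologist"}))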

  12. A Mixed Method Research for Finding a Model of Administrative Decentralization

    OpenAIRE

    Tahereh Feizy; Alireza Moghali; Masuod Geramipoor; Reza Zare

    2015-01-01

    One of the critical issues of administrative decentralization in translating theory into practice is understanding its meaning. An important method to identify administrative decentralization is to address how it can be planned and implemented, and what are its implications, and how it would overcome challenges. The purpose of this study is finding a model for analyzing and evaluating administrative decentralization, so a mixed method research was used to explore and confirm the model of Admi...

  13. A virtual radiation therapy workflow training simulation

    International Nuclear Information System (INIS)

    Bridge, P.; Crowe, S.B.; Gibson, G.; Ellemor, N.J.; Hargrave, C.; Carmichael, M.

    2016-01-01

    Aim: Simulation forms an increasingly vital component of clinical skills development in a wide range of professional disciplines. Simulation of clinical techniques and equipment is designed to better prepare students for placement by providing an opportunity to learn technical skills in a “safe” academic environment. In radiotherapy training over the last decade or so this has predominantly comprised treatment planning software and small ancillary equipment such as mould room apparatus. Recent virtual reality developments have dramatically changed this approach. Innovative new simulation applications and file processing and interrogation software have helped to fill in the gaps to provide a streamlined virtual workflow solution. This paper outlines the innovations that have enabled this, along with an evaluation of the impact on students and educators. Method: Virtual reality software and workflow applications have been developed to enable the following steps of radiation therapy to be simulated in an academic environment: CT scanning using a 3D virtual CT scanner simulation; batch CT duplication; treatment planning; 3D plan evaluation using a virtual linear accelerator; quantitative plan assessment, patient setup with lasers; and image guided radiotherapy software. Results: Evaluation of the impact of the virtual reality workflow system highlighted substantial time saving for academic staff as well as positive feedback from students relating to preparation for clinical placements. Students valued practice in the “safe” environment and the opportunity to understand the clinical workflow ahead of clinical department experience. Conclusion: Simulation of most of the radiation therapy workflow and tasks is feasible using a raft of virtual reality simulation applications and supporting software. Benefits of this approach include time-saving, embedding of a case-study based approach, increased student confidence, and optimal use of the clinical environment

  14. Design, Modelling and Analysis of a Workflow Reconfiguration

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Abouzaid, Faisal; Dragoni, Nicola

    2011-01-01

    This paper describes a case study involving the reconfiguration of an office workflow. We state the requirements on a system implementing the workflow and its reconfiguration, and describe the system’s design in BPMN. We then use an asynchronous pi-calculus and Web.1 to model the design and to verify...

  15. Similarity measures for scientific workflows

    OpenAIRE

    Starlinger, Johannes

    2016-01-01

    Over the course of the last ten years, scientific workflows have gained attention as a tool for creating reproducible, data-processing in-silico experiments, into which both local scripts and applications as well as web services can be integrated. Such workflows can be published and reused through specialized online libraries, so-called repositories. As these repositories grow in size, similarity measures for scientific workfl...

  16. Enabling Structured Exploration of Workflow Performance Variability in Extreme-Scale Environments

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin; Stephan, Eric G.; Raju, Bibi; Altintas, Ilkay; Elsethagen, Todd O.; Krishnamoorthy, Sriram

    2015-11-15

    Workflows are taking an increasingly important role in orchestrating complex scientific processes in extreme-scale and highly heterogeneous environments. However, to date we cannot reliably predict, understand, and optimize workflow performance. Sources of performance variability, and in particular the interdependencies of workflow design, execution environment and system architecture, are not well understood. While there is a rich portfolio of tools for performance analysis, modeling and prediction for single applications in homogeneous computing environments, these are not applicable to workflows, due to the number and heterogeneity of the involved workflow and system components and their strong interdependencies. In this paper, we investigate workflow performance goals and identify factors that could have a relevant impact. Based on our analysis, we propose a new workflow performance provenance ontology, the Open Provenance Model-based WorkFlow Performance Provenance, or OPM-WFPP, that will enable the empirical study of workflow performance characteristics and variability including complex source attribution.

  17. Analysis of power and frequency control requirements in view of increased decentralized production and market liberalization

    International Nuclear Information System (INIS)

    Roffel, B.; Boer, W.W. de

    2003-01-01

    This paper presents a systematic approach to the analysis of the minimum control requirements that are imposed on power producing units in the Netherlands, especially in the case when decentralized production increases. Some effects of the liberalization on the control behavior are also analyzed. First, an overview is given of the amount and type of power production in the Netherlands, followed by a review of the control requirements. Next, models are described, including a simplified model for the UCTE power system. The model was tested against frequency and power measurements after failure of a 558 MW production unit in the Netherlands. Agreement between measurements and model predictions proved to be good. The model was subsequently used to analyze the primary and secondary control requirements and the impact of an increase in decentralized power production on the fault restoration capabilities of the power system. Since the latter production units are not actively participating in primary and secondary control, fault restoration takes longer and becomes unacceptable when only 35% of the power producing units participate in secondary control. Finally, the model was used to study the impact of deregulation, especially the effect of 'block scheduling', on additional control actions of the secondary control. (Author)
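
    As a rough illustration of why a low participation share degrades fault restoration, consider a generic one-bus aggregate frequency model with droop-based primary control. This is a textbook sketch, not the paper's UCTE model, and every parameter value below is an assumption.

      # Generic one-bus primary-control response to a generation loss
      # (illustrative textbook model; all values are assumptions).
      f0 = 50.0      # nominal frequency, Hz
      H = 5.0        # aggregate inertia constant, s
      D = 1.0        # load damping, p.u. power per p.u. frequency
      R = 0.05       # droop of the units that do participate in control
      share = 0.35   # fraction of generation participating in primary control
      dP = -0.01     # lost generation, p.u. (one large unit tripping)

      dt, T, df = 0.01, 30.0, 0.0   # df: frequency deviation in p.u.
      for _ in range(int(T / dt)):
          p_primary = -(share / R) * df           # droop response
          df += dt * (dP + p_primary - D * df) / (2.0 * H)
      # steady state: df = dP / (share/R + D); a smaller participating
      # share means a larger deviation and slower restoration
      print(f"quasi-steady-state deviation: {df * f0:.3f} Hz")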

  18. Decentralized Interleaving of Paralleled Dc-Dc Buck Converters: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, Brian B [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Rodriguez, Miguel [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Sinha, Mohit [University of Minnesota]; Dhople, Sairaj [University of Minnesota]; Poon, Jason [University of California at Berkeley]

    2017-09-01

    We present a decentralized control strategy that yields switch interleaving among parallel connected dc-dc buck converters without communication. The proposed method is based on the digital implementation of the dynamics of a nonlinear oscillator circuit as the controller. Each controller is fully decentralized, i.e., it only requires the locally measured output current to synthesize the pulse width modulation (PWM) carrier waveform. By virtue of the intrinsic electrical coupling between converters, the nonlinear oscillator-based controllers converge to an interleaved state with uniform phase-spacing across PWM carriers. To the knowledge of the authors, this work represents the first fully decentralized strategy for switch interleaving of paralleled dc-dc buck converters.
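
    The convergence to uniform phase-spacing can be illustrated with a toy phase-oscillator model. This is only an analogy to the oscillator-based controllers, not the paper's control law; the mean-field repulsion and all constants are assumptions.

      import numpy as np

      # Toy analogy (not the paper's controller): n identical phase
      # oscillators repelled from their mean field, viewed in a frame
      # rotating at the common switching frequency. For n = 3 the only
      # configuration with a vanishing mean field is uniform 2*pi/3
      # spacing, i.e. an interleaved ("splay") state.
      def interleave_sim(n=3, steps=50000, dt=1e-3, k=10.0, seed=7):
          rng = np.random.default_rng(seed)
          theta = rng.uniform(0.0, 2.0 * np.pi, n)
          for _ in range(steps):
              z = np.exp(1j * theta).mean()        # Kuramoto order parameter
              r, psi = np.abs(z), np.angle(z)
              # each phase is pushed away from the mean phase psi
              theta = (theta + dt * k * r * np.sin(theta - psi)) % (2.0 * np.pi)
          return np.sort(theta)

      phases = interleave_sim()
      gaps = np.diff(np.append(phases, phases[0] + 2.0 * np.pi))
      print(gaps)   # each gap approaches 2*pi/3 once interleaved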

  19. Decentralization can help reduce deforestation when user groups engage with local government

    Science.gov (United States)

    Wright, Glenn D.; Gibson, Clark C.; Evans, Tom P.

    2016-01-01

    Policy makers around the world tout decentralization as an effective tool in the governance of natural resources. Despite the popularity of these reforms, there is limited scientific evidence on the environmental effects of decentralization, especially in tropical biomes. This study presents evidence on the institutional conditions under which decentralization is likely to be successful in sustaining forests. We draw on common-pool resource theory to argue that the environmental impact of decentralization hinges on the ability of reforms to engage local forest users in the governance of forests. Using matching techniques, we analyze longitudinal field observations on both social and biophysical characteristics in a large number of local government territories in Bolivia (a country with a decentralized forestry policy) and Peru (a country with a much more centralized forestry policy). We find that territories with a decentralized forest governance structure have more stable forest cover, but only when local forest user groups actively engage with the local government officials. We provide evidence in support of a possible causal process behind these results: When user groups engage with the decentralized units, it creates a more enabling environment for effective local governance of forests, including more local government-led forest governance activities, fora for the resolution of forest-related conflicts, intermunicipal cooperation in the forestry sector, and stronger technical capabilities of the local government staff. PMID:27956644

  20. Decentralization can help reduce deforestation when user groups engage with local government.

    Science.gov (United States)

    Wright, Glenn D; Andersson, Krister P; Gibson, Clark C; Evans, Tom P

    2016-12-27

    Policy makers around the world tout decentralization as an effective tool in the governance of natural resources. Despite the popularity of these reforms, there is limited scientific evidence on the environmental effects of decentralization, especially in tropical biomes. This study presents evidence on the institutional conditions under which decentralization is likely to be successful in sustaining forests. We draw on common-pool resource theory to argue that the environmental impact of decentralization hinges on the ability of reforms to engage local forest users in the governance of forests. Using matching techniques, we analyze longitudinal field observations on both social and biophysical characteristics in a large number of local government territories in Bolivia (a country with a decentralized forestry policy) and Peru (a country with a much more centralized forestry policy). We find that territories with a decentralized forest governance structure have more stable forest cover, but only when local forest user groups actively engage with the local government officials. We provide evidence in support of a possible causal process behind these results: When user groups engage with the decentralized units, it creates a more enabling environment for effective local governance of forests, including more local government-led forest governance activities, fora for the resolution of forest-related conflicts, intermunicipal cooperation in the forestry sector, and stronger technical capabilities of the local government staff.

  1. Decentralization of health care systems and health outcomes: Evidence from a natural experiment.

    Science.gov (United States)

    Jiménez-Rubio, Dolores; García-Gómez, Pilar

    2017-09-01

    While many countries worldwide are shifting responsibilities for their health systems to local levels of government, there is to date insufficient evidence about the potential impact of these policy reforms. We estimate the impact of decentralization of the health services on infant and neonatal mortality using a natural experiment: the devolution of health care decision making powers to Spanish regions. The devolution was implemented gradually and asymmetrically over a twenty-year period (1981-2002). The order in which the regions were decentralized was driven by political factors and hence can be considered exogenous to health outcomes. In addition, we exploit the dynamic effect of decentralization of health services and allow for heterogeneous effects by the two main types of decentralization implemented across regions: full decentralization (political and fiscal powers) versus political decentralization only. Our difference in differences results based on a panel dataset for the 50 Spanish provinces over the period 1980 to 2010 show that the lasting benefit of decentralization accrues only to regions which enjoy almost full fiscal and political powers and which are also among the richest regions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Decentralized Budgeting: Getting the Most Out of Disbursements of Funds.

    Science.gov (United States)

    Jefferson, Anne L.

    1995-01-01

    Decentralizing educational budgets allows the disbursement of funds aimed at maximizing student development. Three strategies for decentralizing budgets are program budgeting, which eliminates line-item budgeting and allows administrators to address questions regarding the relative value of educational programs; zero-based budgeting, which allows…

  3. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of real-world business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations... of events, reward-based properties and best- and worst-case scenarios. We develop a simple example of a medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow for more complex annotations and ultimately to automatically synthesise workflows by composing predefined sub-processes, in order to achieve a configuration that is optimal for parameters of interest...

  4. Centralization or decentralization of facial structures in Korean young adults.

    Science.gov (United States)

    Yoo, Ja-Young; Kim, Jeong-Nam; Shin, Kang-Jae; Kim, Soon-Heum; Choi, Hyun-Gon; Jeon, Hyun-Soo; Koh, Ki-Seok; Song, Wu-Chul

    2013-05-01

    It is well known that facial beauty is dictated by facial type, and harmony between the eyes, nose, and mouth. Furthermore, facial impression is judged according to the overall facial contour and the relationship between the facial structures. The aims of the present study were to determine the optimal criteria for the assessment of gathering or separation of the facial structures and to define standardized ratios for centralization or decentralization of the facial structures. Four different lengths were measured, and 2 indexes were calculated from standardized photographs of 551 volunteers. Centralization and decentralization were assessed using the width index (interpupillary distance / facial width) and height index (eyes-mouth distance / facial height). The mean ranges of the width index and height index were 42.0 to 45.0 and 36.0 to 39.0, respectively. The width index did not differ with sex, but males had more decentralized faces, and females had more centralized faces, vertically. The incidence rate of decentralized faces among the men was 30.3%, and that of centralized faces among the women was 25.2%. The mean ranges in width and height indexes have been determined in a Korean population. Faces with width and height index scores under and over the median ranges are determined to be "centralized" and "decentralized," respectively.
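
    Both indexes are plain ratios; assuming, as the reported ranges suggest, that they are expressed as percentages, the computation and classification look as follows. The thresholds restate the reported median ranges, and the sample measurements are invented.

      def width_index(interpupillary_mm, facial_width_mm):
          # width index = interpupillary distance / facial width, in percent
          return 100.0 * interpupillary_mm / facial_width_mm

      def height_index(eyes_mouth_mm, facial_height_mm):
          # height index = eyes-mouth distance / facial height, in percent
          return 100.0 * eyes_mouth_mm / facial_height_mm

      def classify(index, lo, hi):
          # below the median range -> centralized, above -> decentralized
          if index < lo:
              return "centralized"
          return "decentralized" if index > hi else "average"

      # vertical classification against the reported 36.0-39.0 range
      print(classify(height_index(70.0, 200.0), 36.0, 39.0))   # centralized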

  5. LabelFlow Framework for Annotating Workflow Provenance

    Directory of Open Access Journals (Sweden)

    Pinar Alper

    2018-02-01

    Full Text Available Scientists routinely analyse and share data for others to use. Successful data (re)use relies on having metadata describing the context of analysis of the data. In many disciplines the creation of contextual metadata is referred to as reporting. One method of implementing analyses is with workflows. A stand-out feature of workflows is their ability to record provenance from executions. Provenance is useful when analyses are executed with changing parameters (changing contexts) and results need to be traced to respective parameters. In this paper we investigate whether provenance can be exploited to support reporting. Specifically, we outline a case-study based on a real-world workflow and set of reporting queries. We observe that provenance, as collected from workflow executions, is of limited use for reporting, as it supports queries only partially. We identify that this is due to the generic nature of provenance and its lack of domain-specific contextual metadata. We observe that the required information is available in implicit form, embedded in data. We describe LabelFlow, a framework comprised of four Labelling Operators for decorating provenance with domain-specific Labels. LabelFlow can be instantiated for a domain by plugging it with domain-specific metadata extractors. We provide a tool that takes as input a workflow, and produces as output a Labelling Pipeline for that workflow, comprised of Labelling Operators. We revisit the case-study and show how Labels provide a more complete implementation of reporting queries.
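
    As a rough sketch of the idea, a labelling operator can be thought of as a function that decorates provenance records using a pluggable, domain-specific metadata extractor. The fragment below is a hypothetical illustration, not LabelFlow's API; all names are invented.

      def make_label_operator(extractor):
          # A labelling operator decorates provenance records with
          # domain-specific labels produced by a pluggable extractor.
          def operator(provenance_records):
              for rec in provenance_records:
                  rec.setdefault("labels", {}).update(extractor(rec["data"]))
              return provenance_records
          return operator

      # hypothetical extractor: pull an organism name out of the data payload
      label_organism = make_label_operator(
          lambda data: {"organism": data.get("species", "unknown")})
      print(label_organism([{"data": {"species": "H. sapiens"}}]))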

  6. Economic Inequalities and the Level of Decentralization in European Countries: Cluster Analysis

    Directory of Open Access Journals (Sweden)

    Laboutková Šárka

    2016-12-01

    Full Text Available This article identifies relations between the degree of decentralization and economic imbalances on the basis of a cluster (exploratory) analysis. Two indicators have been chosen for measuring economic inequalities: an indicator of dispersion of regional GDP per capita, representing performance imbalances within countries (it measures the economic development gap among regions in European countries); and a multidimensional inequality-adjusted human development index, representing inequalities in the distribution of wealth within countries. Decentralization is measured by means of a decentralization index, which contains both quantitative and qualitative components. Although groups of countries characterised by a high degree of decentralization do not necessarily show the lowest degrees of economic imbalances, it is possible to conclude that the countries in groups with a higher degree of decentralization are among those countries with more favourable values of the monitored economic imbalance indicators. As part of the research, two clusters of countries were identified which are identical in their degree of decentralization but differ in the results connected with economic imbalances. The differences are caused by different institutional qualities in the two groups.

  7. Provenance-based refresh in data-oriented workflows

    KAUST Repository

    Ikeda, Robert; Salihoglu, Semih; Widom, Jennifer

    2011-01-01

    We consider a general workflow setting in which input data sets are processed by a graph of transformations to produce output results. Our goal is to perform efficient selective refresh of elements in the output data, i.e., compute the latest values of specific output elements when the input data may have changed. We explore how data provenance can be used to enable efficient refresh. Our approach is based on capturing one-level data provenance at each transformation when the workflow is run initially. Then at refresh time provenance is used to determine (transitively) which input elements are responsible for given output elements, and the workflow is rerun only on that portion of the data needed for refresh. Our contributions are to formalize the problem setting and the problem itself, to specify properties of transformations and provenance that are required for efficient refresh, and to provide algorithms that apply to a wide class of transformations and workflows. We have built a prototype system supporting the features and algorithms presented in the paper. We report preliminary experimental results on the overhead of provenance capture, and on the crossover point between selective refresh and full workflow recomputation. © 2011 ACM.
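
    The refresh scheme, capturing one-level provenance during the initial run and later rerunning only the portion of the workflow a given output element depends on, can be sketched generically. This toy restricts itself to an element-wise transformation (the paper covers a much wider class); the dictionary-based provenance store and all names are assumptions.

      def run_with_provenance(transform, inputs):
          # Initial run: apply a per-element transformation and record
          # one-level provenance (which input each output came from).
          outputs, provenance = {}, {}
          for key, value in inputs.items():
              out_key = "out:" + key
              outputs[out_key] = transform(value)
              provenance[out_key] = {key}
          return outputs, provenance

      def selective_refresh(transform, inputs, outputs, provenance, out_key):
          # Refresh one output element: trace its provenance back to the
          # inputs it depends on and rerun only on that portion of the data.
          for key in provenance[out_key]:
              outputs[out_key] = transform(inputs[key])
          return outputs[out_key]

      inputs = {"a": 1, "b": 2}
      outputs, prov = run_with_provenance(lambda v: v * 10, inputs)
      inputs["a"] = 5                              # an input element changes
      print(selective_refresh(lambda v: v * 10, inputs, outputs, prov, "out:a"))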

  8. Decentralized guaranteed cost static output feedback vibration control for piezoelectric smart structures

    International Nuclear Information System (INIS)

    Jiang, Jian-ping; Li, Dong-xu

    2010-01-01

    This paper is devoted to the study of the decentralized guaranteed cost static output feedback vibration control for piezoelectric smart structures. A smart panel with collocated piezoelectric actuators and velocity sensors is modeled using a finite element method, and then the size of the model is reduced in the state space using the modal Hankel singular value. The necessary and sufficient conditions of decentralized guaranteed cost static output feedback control for the reduced system have been presented. The decentralized and centralized static output feedback matrices can be obtained from solving two linear matrix inequalities. A comparison between centralized control and decentralized control is performed in order to investigate their effectiveness in suppressing vibration of a smart panel. Numerical results show that when the system is subjected to initial displacement or white noise disturbance, the decentralized and centralized controls are both very effective and the control results are very close

  9. [Analysis of the healthcare service decentralization process in Côte d'Ivoire].

    Science.gov (United States)

    Soura, B D; Coulibaly, S S

    2014-01-01

    The decentralization of healthcare services is becoming increasingly important in strategies of public sector management. This concept is analyzed from various points of view, including legal, economic, political, and sociological. Several typologies have been proposed in the literature to analyze this decentralization process, which can take different forms ranging from simple deconcentration to more elaborate devolution. In some instances, decentralization can be analyzed by the degree of autonomy given to local authorities. This article applies these typologies to analyze the healthcare system decentralization process in Côte d'Ivoire. Special attention is paid to the new forms of community healthcare organizations. These decentralized structures enjoy a kind of autonomy, with characteristics closer to those of devolution. The model might serve as an example for population involvement in defining and managing healthcare problems in Côte d'Ivoire. We end with proposals for the improvement of the process.

  10. COSMOS: Python library for massively parallel workflows.

    Science.gov (United States)

    Gafni, Erik; Luquette, Lovelace J; Lancaster, Alex K; Hawkins, Jared B; Jung, Jae-Yoon; Souilmi, Yassine; Wall, Dennis P; Tonellato, Peter J

    2014-10-15

    Efficient workflows to shepherd clinically generated genomic data through the multiple stages of a next-generation sequencing pipeline are of critical importance in translational biomedical science. Here we present COSMOS, a Python library for workflow management that allows formal description of pipelines and partitioning of jobs. In addition, it includes a user interface for tracking the progress of jobs, abstraction of the queuing system and fine-grained control over the workflow. Workflows can be created on traditional computing clusters as well as cloud-based services. Source code is available for academic non-commercial research purposes. Links to code and documentation are provided at http://lpm.hms.harvard.edu and http://wall-lab.stanford.edu. dpwall@stanford.edu or peter_tonellato@hms.harvard.edu. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.

  11. A Multi-Dimensional Classification Model for Scientific Workflow Characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Ramakrishnan, Lavanya; Plale, Beth

    2010-04-05

    Workflows have been used to model repeatable tasks or operations in manufacturing, business processes, and software. In recent years, workflows have increasingly been used for the orchestration of science discovery tasks that use distributed resources and web services environments through resource models such as grid and cloud computing. Workflows have disparate requirements and constraints that affect how they might be managed in distributed environments. In this paper, we present a multi-dimensional classification model illustrated by workflow examples obtained through a survey of scientists from different domains, including bioinformatics and biomedical, weather and ocean modeling, and astronomy, detailing their data and computational requirements. The survey results and classification model contribute to a high-level understanding of scientific workflows.

  12. Automated data reduction workflows for astronomy. The ESO Reflex environment

    Science.gov (United States)

    Freudling, W.; Romaniello, M.; Bramich, D. M.; Ballester, P.; Forchi, V.; García-Dabló, C. E.; Moehler, S.; Neeser, M. J.

    2013-11-01

    Context. Data from complex modern astronomical instruments often consist of a large number of different science and calibration files, and their reduction requires a variety of software tools. The execution chain of the tools represents a complex workflow that needs to be tuned and supervised, often by individual researchers who are not necessarily experts in any specific instrument. Aims: The efficiency of data reduction can be improved by using automatic workflows to organise data and execute a sequence of data reduction steps. To realize such efficiency gains, we designed a system that allows intuitive representation, execution and modification of the data reduction workflow, and has facilities for inspection and interaction with the data. Methods: The European Southern Observatory (ESO) has developed Reflex, an environment to automate data reduction workflows. Reflex is implemented as a package of customized components for the Kepler workflow engine. Kepler provides the graphical user interface to create an executable flowchart-like representation of the data reduction process. Key features of Reflex are a rule-based data organiser, infrastructure to re-use results, thorough book-keeping, data progeny tracking, interactive user interfaces, and a novel concept to exploit information created during data organisation for the workflow execution. Results: Automated workflows can greatly increase the efficiency of astronomical data reduction. In Reflex, workflows can be run non-interactively as a first step. Subsequent optimization can then be carried out while transparently re-using all unchanged intermediate products. We found that such workflows enable the reduction of complex data by non-expert users and minimize mistakes due to book-keeping errors. Conclusions: Reflex includes novel concepts to increase the efficiency of astronomical data processing. While Reflex is a specific implementation of astronomical scientific workflows within the Kepler workflow

  13. Decentralized Gauss-Newton method for nonlinear least squares on wide area network

    Science.gov (United States)

    Liu, Lanchao; Ling, Qing; Han, Zhu

    2014-10-01

    This paper presents a decentralized approach to the Gauss-Newton (GN) method for nonlinear least squares (NLLS) on a wide area network (WAN). In a multi-agent system, a centralized GN for NLLS requires the global GN Hessian matrix to be available at a central computing unit, which may incur large communication overhead. In the proposed decentralized alternative, each agent only needs its local GN Hessian matrix to update iterates with the cooperation of neighbors. The detailed formulation of decentralized NLLS on a WAN is given, and the iteration at each agent is defined. The convergence property of the decentralized approach is analyzed, and numerical results validate the effectiveness of the proposed algorithm.
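
    For context, the centralized GN step that the decentralized scheme approximates is the usual normal-equations update (standard notation, not taken from the paper): with residual vector r(x) and Jacobian J,

      x^{(t+1)} = x^{(t)} - \bigl(J^\top J\bigr)^{-1} J^\top r\bigl(x^{(t)}\bigr), \qquad J = \left.\frac{\partial r}{\partial x}\right|_{x^{(t)}}

    When the residuals decompose across the N agents, the GN Hessian and gradient are sums of local terms,

      J^\top J = \sum_{i=1}^{N} J_i^\top J_i, \qquad J^\top r = \sum_{i=1}^{N} J_i^\top r_i,

    so a centralized solver must gather every local term at one unit, whereas each agent in the decentralized variant keeps only its own term and approximates the global step through neighbor-to-neighbor exchanges.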

  14. WS-VLAM: A GT4 based workflow management system

    NARCIS (Netherlands)

    Wibisono, A.; Vasyunin, D.; Korkhov, V.; Zhao, Z.; Belloum, A.; de Laat, C.; Adriaans, P.; Hertzberger, B.

    2007-01-01

    Generic Grid middleware, e.g., Globus Toolkit 4 (GT4), provides basic services for scientific workflow management systems to discover, store and integrate workflow components. Using the state of the art Grid services can advance the functionality of workflow engine in orchestrating distributed Grid

  15. Modelling and analysis of workflow for lean supply chains

    Science.gov (United States)

    Ma, Jinping; Wang, Kanliang; Xu, Lida

    2011-11-01

    Cross-organisational workflow systems are a component of enterprise information systems which support collaborative business processes among organisations in a supply chain. Currently, the majority of workflow systems are developed from the perspective of information modelling, without considering the actual requirements of supply chain management. In this article, we focus on the modelling and analysis of cross-organisational workflow systems in the context of the lean supply chain (LSC) using Petri nets. First, the article describes the assumed conditions of the cross-organisation workflow net according to the idea of the LSC and then discusses the standardisation of collaborating business processes between organisations in the context of the LSC. Second, the concept of labelled time Petri nets (LTPNs) is defined by combining labelled Petri nets with time Petri nets, and the concept of labelled time workflow nets (LTWNs) is also defined based on LTPNs. Cross-organisational labelled time workflow nets (CLTWNs) are then defined based on LTWNs. Third, the article proposes the notion of OR-silent CLTWNs and a verifying approach to the soundness of LTWNs and CLTWNs. Finally, this article illustrates how to use the proposed method by a simple example. The purpose of this research is to establish a formal method for the modelling and analysis of workflow systems for LSCs. This study initiates a new perspective of research on cross-organisational workflow management and promotes the operation management of LSCs in real-world settings.

  16. Multi-level meta-workflows: new concept for regularly occurring tasks in quantum chemistry.

    Science.gov (United States)

    Arshad, Junaid; Hoffmann, Alexander; Gesing, Sandra; Grunzke, Richard; Krüger, Jens; Kiss, Tamas; Herres-Pawlis, Sonja; Terstyanszky, Gabor

    2016-01-01

    In Quantum Chemistry, many tasks recur frequently, e.g. geometry optimizations, benchmarking series, etc. Here, workflows can help to reduce the time spent on manual job definition and output extraction. These workflows are executed on computing infrastructures and may require large computing and data resources. Scientific workflows hide these infrastructures and the resources needed to run them. It requires significant effort and specific expertise to design, implement and test these workflows. Many of these workflows are complex and monolithic entities that can be used for particular scientific experiments. Hence, their modification is not straightforward, which makes it almost impossible to share them. To address these issues we propose developing atomic workflows and embedding them in meta-workflows. Atomic workflows deliver a well-defined, research-domain-specific function. Publishing workflows in repositories enables workflow sharing inside and/or among scientific communities. We formally specify atomic and meta-workflows in order to define data structures to be used in repositories for uploading and sharing them. Additionally, we present a formal description focused on the orchestration of atomic workflows into meta-workflows. We investigated the operations that represent basic functionalities in Quantum Chemistry, developed the relevant atomic workflows and combined them into meta-workflows. Having these workflows, we defined the structure of the Quantum Chemistry workflow library and uploaded these workflows to the SHIWA Workflow Repository. Graphical Abstract: Meta-workflows and embedded workflows in the template representation.

  17. Design Tools and Workflows for Braided Structures

    DEFF Research Database (Denmark)

    Vestartas, Petras; Heinrich, Mary Katherine; Zwierzycki, Mateusz

    2017-01-01

    and merits of our method, demonstrated through four example design and analysis workflows. The workflows frame specific aspects of enquiry for the ongoing research project flora robotica. These include modelling target geometries, automatically producing instructions for fabrication, conducting structural...

  18. A Review of Characteristics and Experiences of Decentralization of Education

    Science.gov (United States)

    Mwinjuma, Juma Saidi; Kadir, Suhaida bte Abd.; Hamzah, Azimi; Basri, Ramli

    2015-01-01

    This paper scrutinizes the decentralization of education with reference to some countries around the world. We consider the discussion on decentralization to be a complex, critical and broad question in contemporary education planning, administration and the politics of education reforms. Even though the debate on and implementation of decentralization…

  19. Decentralized energy studies: compendium of international studies and research

    Energy Technology Data Exchange (ETDEWEB)

    Wallace, C.

    1980-03-01

    The purpose of the compendium is to provide information about research activities in decentralized energy systems to researchers, government officials, and interested citizens. The compendium lists and briefly describes a number of studies in other industrialized nations that involve decentralized energy systems. A contact person is given for each of the activities listed so that interested readers can obtain more information.

  20. Business and scientific workflows a web service-oriented approach

    CERN Document Server

    Tan, Wei

    2013-01-01

    Focuses on how to use web service computing and service-based workflow technologies to develop timely, effective workflows for both business and scientific fields Utilizing web computing and Service-Oriented Architecture (SOA), Business and Scientific Workflows: A Web Service-Oriented Approach focuses on how to design, analyze, and deploy web service-based workflows for both business and scientific applications in many areas of healthcare and biomedicine. It also discusses and presents the recent research and development results. This informative reference features app

  1. The Politics of Fiscal Decentralization Revisited: a Typology and Comparative Evidence

    Directory of Open Access Journals (Sweden)

    Jorge P. Gordin

    2009-06-01

    Full Text Available Although the practice of fiscal decentralization is worldwide and its implementation and effects vary from country to country, its political significance has often been neglected or, worse, treated as implicit to decentralization. This study considers the sources of politicization of fiscal decentralization, focusing on the determination and manipulation of intergovernmental transfers. It develops a new index of fiscal politicization and proposes an explanatory typology that takes into account subnational transfer dependency and the extent to which transfers are politically determined. This analysis renders a conceptual tool that captures nuanced facts about the intergovernmental level of conflict to a larger extent than conventional measures of fiscal decentralization do. We found that the effects of fiscal dependency are intertwined with political asymmetries derived from legislative overrepresentation of territorial units and intergovernmental bargaining strategies.

  2. Implementing bioinformatic workflows within the bioextract server

    Science.gov (United States)

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed servi...

  3. Regional Development and Decentralization – two Options to Overcome Lack of Funding

    Directory of Open Access Journals (Sweden)

    Dubravka JURLINA ALIBEGOVIC

    2014-11-01

    Full Text Available Decentralization can be generally described as a process in which selected functions are assigned to sub-national units. The literature identifies a number of positive consequences of decentralization which all lead to a better satisfaction of citizens’ needs for public services. Although the decentralization process in Croatia started more than ten years ago, it has not yet been completed. While leading to a new allocation of authorities and responsibilities to local government units, the level of fiscal decentralization remained lower than in the EU countries. In this paper we analyze the fiscal capacity of local government units to provide an insight into the main problems of decentralization in Croatia. We show that most local government units have very low fiscal capacity, which is insufficient for financing basic public functions with their own resources. The paper presents the results of a survey relating to the decentralization process conducted among local councilors at the regional level in Croatia. We explore how local councilors at the regional level evaluate different goals of decentralization. With the lack of fiscal capacity in mind, we identify two possible solutions for an optimal provision of public functions. The first one is the level of political will for a joint provision of public functions by different local units, and the second one is a change in the territorial organization of the country. We measure the difference in the attitudes toward these questions across counties.

  4. Decentralization in Zambia: resource allocation and district performance.

    Science.gov (United States)

    Bossert, Thomas; Chitah, Mukosha Bona; Bowser, Diana

    2003-12-01

    Zambia implemented an ambitious process of health sector decentralization in the mid 1990s. This article presents an assessment of the degree of decentralization, called 'decision space', that was allowed to districts in Zambia, and an analysis of data on districts available at the national level to assess allocation choices made by local authorities and some indicators of the performance of the health systems under decentralization. The Zambian officials in health districts had a moderate range of choice over expenditures, user fees, contracting, targeting and governance. Their choices were quite limited over salaries and allowances and they did not have control over additional major sources of revenue, like local taxes. The study found that the formula for allocation of government funding which was based on population size and hospital beds resulted in relatively equal per capita expenditures among districts. Decentralization allowed the districts to make decisions on internal allocation of resources and on user fee levels and expenditures. General guidelines for the allocation of resources established a maximum and minimum percentage to be allocated to district offices, hospitals, health centres and communities. Districts tended to exceed the maximum for district offices, but the large urban districts and those without public district hospitals were not even reaching the minimum for hospital allocations. Wealthier and urban districts were more successful in raising revenue through user fees, although the proportion of total expenditures that came from user fees was low. An analysis of available indicators of performance, such as the utilization of health services, immunization coverage and family planning activities, found little variation during the period 1995-98 except for a decline in immunization coverage, which may have also been affected by changes in donor funding. These findings suggest that decentralization may not have had either a positive or

  5. Federated Database Services for Wind Tunnel Experiment Workflows

    Directory of Open Access Journals (Sweden)

    A. Paventhan

    2006-01-01

    Full Text Available Enabling the full life cycle of scientific and engineering workflows requires robust middleware and services that support effective data management, near-realtime data movement and custom data processing. Many existing solutions exploit the database as a passive metadata catalog. In this paper, we present an approach that makes use of federation of databases to host data-centric wind tunnel application workflows. The user is able to compose customized application workflows based on database services. We provide a reference implementation that leverages typical business tools and technologies: Microsoft SQL Server for database services and Windows Workflow Foundation for workflow services. The application data and user's code are both hosted in federated databases. With the growing interest in XML Web Services in scientific Grids, and with databases beginning to support native XML types and XML Web services, we can expect the role of databases in scientific computation to grow in importance.

  6. Decentralized control: An overview

    Czech Academy of Sciences Publication Activity Database

    Bakule, Lubomír

    2008-01-01

    Roč. 32, č. 1 (2008), s. 87-98 ISSN 1367-5788 R&D Projects: GA AV ČR(CZ) IAA200750802; GA MŠk(CZ) LA 282 Institutional research plan: CEZ:AV0Z10750506 Keywords : decentralized control * large-scale systems * decomposition Subject RIV: BC - Control Systems Theory Impact factor: 1.109, year: 2008

  7. Economic analysis of centralized vs. decentralized electronic data capture in multi-center clinical studies.

    Science.gov (United States)

    Walden, Anita; Nahm, Meredith; Barnett, M Edwina; Conde, Jose G; Dent, Andrew; Fadiel, Ahmed; Perry, Theresa; Tolk, Chris; Tcheng, James E; Eisenstein, Eric L

    2011-01-01

    New data management models are emerging in multi-center clinical studies. We evaluated the incremental costs associated with decentralized vs. centralized models. We developed clinical research network economic models to evaluate three data management models: centralized, decentralized with local software, and decentralized with shared database. Descriptive information from three clinical research studies served as inputs for these models. The primary outcome was total data management costs. Secondary outcomes included: data management costs for sites, local data centers, and central coordinating centers. Both decentralized models were more costly than the centralized model for each clinical research study: the decentralized with local software model was the most expensive. Decreasing the number of local data centers and case book pages reduced cost differentials between models. Decentralized vs. centralized data management in multi-center clinical research studies is associated with increases in data management costs.

  8. Analysing scientific workflows: Why workflows not only connect web services

    NARCIS (Netherlands)

    Wassink, I.; van der Vet, P.E.; Wolstencroft, K.; Neerincx, P.B.T.; Roos, M.; Rauwerda, H.; Breit, T.M.; Zhang, L.J.

    2009-01-01

    Life science workflow systems are developed to help life scientists to conveniently connect various programs and web services. In practice however, much time is spent on data conversion, because web services provided by different organisations use different data formats. We have analysed all the

  9. Analysing scientific workflows: why workflows not only connect web services

    NARCIS (Netherlands)

    Wassink, I.; van der Vet, P.E.; Wolstencroft, K.; Neerincx, P.B.T.; Roos, M.; Rauwerda, H.; Breit, T.M.; Zhang, LJ.

    2009-01-01

    Life science workflow systems are developed to help life scientists to conveniently connect various programs and web services. In practice however, much time is spent on data conversion, because web services provided by different organisations use different data formats. We have analysed all the

  10. A Model of Workflow Composition for Emergency Management

    Science.gov (United States)

    Xin, Chen; Bin-ge, Cui; Feng, Zhang; Xue-hui, Xu; Shan-shan, Fu

    Commonly used workflow technology is not flexible enough in dealing with concurrent emergency situations. The paper proposes a novel model for defining emergency plans, in which workflow segments appear as a constituent part. A formal abstraction, which contains four operations, is defined to compose workflow segments under constraint rules. A software system for the construction and composition of business process resources is implemented and integrated into the Emergency Plan Management Application System.
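
    The abstract does not spell out the four composition operations. Purely as a hypothetical reading (sequence, parallel split, choice and iteration are common choices in workflow formalisms), composition under a constraint rule might look like the following Python sketch; every name in it is invented.

      from dataclasses import dataclass

      @dataclass
      class Segment:
          # a workflow segment: a reusable fragment of an emergency plan
          name: str

      @dataclass
      class Composite(Segment):
          op: str        # hypothetical operations: seq | par | choice | loop
          parts: tuple

      def compose(op, *parts, constraint=lambda ps: True):
          # compose segments under a constraint rule, rejecting violations
          if not constraint(parts):
              raise ValueError("constraint violated for " + op)
          names = ", ".join(p.name for p in parts)
          return Composite(name=op + "(" + names + ")", op=op, parts=parts)

      evacuate, triage = Segment("evacuate"), Segment("triage")
      plan = compose("par", evacuate, triage,
                     constraint=lambda ps: len(ps) >= 2)  # parallel needs >= 2
      print(plan.name)   # par(evacuate, triage)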

  11. Reliable Decentralized Control of Fuzzy Discrete-Event Systems and a Test Algorithm.

    Science.gov (United States)

    Liu, Fuchun; Dziong, Zbigniew

    2013-02-01

    A framework for decentralized control of fuzzy discrete-event systems (FDESs) has recently been presented to guarantee the achievement of a given specification under the joint control of all local fuzzy supervisors. As a continuation, this paper addresses the reliable decentralized control of FDESs in the face of possible failures of some local fuzzy supervisors. Roughly speaking, for an FDES equipped with n local fuzzy supervisors, a decentralized supervisor is called k-reliable (1 ≤ k ≤ n) provided that the control performance will not be degraded even when n - k local fuzzy supervisors fail. A necessary and sufficient condition for the existence of k-reliable decentralized supervisors of FDESs is proposed by introducing the notions of M̃uc-controllability and k-reliable coobservability of fuzzy languages. In particular, a polynomial-time algorithm to test the k-reliable coobservability is developed via a constructive methodology, which indicates that the existence of k-reliable decentralized supervisors of FDESs can be checked with polynomial complexity.
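
    In symbols, the reliability requirement can be paraphrased as follows (our notation, not the paper's): writing P for the plant, K for the specification, and L({S_i}/P) for the controlled behavior under a set of local fuzzy supervisors, a decentralized supervisor {S_1, ..., S_n} is k-reliable iff

      \forall G \subseteq \{1,\dots,n\} \text{ with } |G| \ge k :\quad L\bigl(\{S_i\}_{i \in G} \,/\, P\bigr) = K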

  12. Workflow Fault Tree Generation Through Model Checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2014-01-01

    We present a framework for the automated generation of fault trees from models of real-world process workflows, expressed in a formalised subset of the popular Business Process Modelling and Notation (BPMN) language. To capture uncertainty and unreliability in workflows, we extend this formalism...

  13. Community Development in the context of the power decentralization in Ukraine

    Directory of Open Access Journals (Sweden)

    V. P. Berezinskiy

    2017-03-01

    Full Text Available The aim of the study is to identify opportunities for community development during the implementation of the power decentralization reform in Ukraine. It is shown that the principle of decentralization provides for the territorial and political unity of the state through a legal delimitation of powers between central government agencies and regional public authorities (or local authorities). This makes it clear that the issue of power decentralization in Ukraine has a constitutional and legal framework, as the Main Law states that the power system is based on a combination of centralization and decentralization. The requirement of power decentralization has been constitutionally justified. According to the State Regional Development Strategy, the priorities of the state regional policy are: increasing the competitiveness of regions; territorial socio-economic integration and spatial development; and effective governance in regional development. In Ukraine, the deepening of decentralization is aimed at strengthening the role of local self-government, empowering the representative authorities of local communities to take on more authority for managing local affairs, depriving local state authorities of the preparation and fulfilment of budgets in the regions, and transferring significant powers and financial resources from the government to local self-governmental authorities. It is argued that decentralization contributes to the democratization of local government and the development of local communities, as the ultimate goals of the power decentralization reform are the creation and maintenance of a good living environment for citizens. This reform should correspond to the interests of citizens in all spheres of life, and it must find support in the relevant territories. In this regard, a series of legislative acts was adopted («On a voluntary association of local communities»,

  14. Decentralization and Distribution Primary Education Access in Indonesia 2014

    OpenAIRE

    Benita, Novinaz

    2016-01-01

    This paper examines decentralisation and the distribution of access to primary school in Indonesia. Data come from the Indonesia National Socio-Economic Survey 2014 and statistical reports from the Ministry of Education, the Ministry of Finance, and the General Election Commission. Descriptive statistics are used to describe the spatial distribution of decentralization in the primary education system and the distribution of primary education access. The results show there are district disparities in the decentralization of primary...

  15. Emergent Semantics Interoperability in Large-Scale Decentralized Information Systems

    CERN Document Server

    Cudré-Mauroux, Philippe

    2008-01-01

    Peer-to-peer systems are evolving with new information-system architectures, leading to the idea that the principles of decentralization and self-organization will offer new approaches in informatics, especially for systems that scale with the number of users or for which central authorities do not prevail. This book describes a new way of building global agreements (semantic interoperability) based only on decentralized, self-organizing interactions.

  16. Workflows for microarray data processing in the Kepler environment

    Science.gov (United States)

    2012-01-01

    Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or

  17. Workflows for microarray data processing in the Kepler environment

    Directory of Open Access Journals (Sweden)

    Stropp Thomas

    2012-05-01

    Full Text Available Abstract Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data and therefore are close to

  18. Workflows for microarray data processing in the Kepler environment.

    Science.gov (United States)

    Stropp, Thomas; McPhillips, Timothy; Ludäscher, Bertram; Bieda, Mark

    2012-05-17

    Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or R
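
    The abstracts above liken these pipelines to shell scripting over local programs and data; a minimal Python sketch of that local-resources style (not Kepler itself, and the tool names are placeholders) might look like:

      import subprocess
      from pathlib import Path

      def step(cmd, output):
          # Run one pipeline stage only if its output file is missing.
          if not Path(output).exists():
              subprocess.run(cmd, check=True)
          return output

      # Hypothetical two-stage microarray pipeline: normalize, then call peaks.
      norm = step(["normalize_tool", "chip_chip.gff", "-o", "normalized.gff"],
                  "normalized.gff")
      peaks = step(["peak_caller", norm, "-o", "peaks.bed"], "peaks.bed")
      print("pipeline finished:", peaks)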

  19. On decentralized adaptive full-order sliding mode control of multiple UAVs.

    Science.gov (United States)

    Xiang, Xianbo; Liu, Chao; Su, Housheng; Zhang, Qin

    2017-11-01

    In this study, a novel decentralized adaptive full-order sliding mode control framework is proposed for the robust synchronized formation motion of multiple unmanned aerial vehicles (UAVs) subject to system uncertainty. First, a full-order sliding mode surface is designed in a decentralized manner to incorporate both the individual position tracking error and the synchronized formation error while the UAV group builds a desired geometric pattern in three-dimensional space. Second, a decentralized virtual plant controller is constructed whose embedded low-pass filter gives the sliding mode controller its chattering-free property. In addition, a robust adaptive technique is integrated into the decentralized chattering-free sliding control design to handle unknown bounded uncertainties, without requiring a priori knowledge of bounds on the system uncertainties, as is assumed in conventional chattering-free control methods. Subsequently, the robustness and stability of the decentralized full-order sliding mode control of multiple UAVs are synthesized. Numerical simulation results illustrate the effectiveness of the proposed control framework in achieving robust 3D formation flight of the multi-UAV system. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
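
    The abstract does not reproduce the surface itself; one plausible full-order sliding surface of the kind described, combining the individual tracking error with a synchronized formation term over the neighbour set N_i (an illustrative form written in LaTeX, not the authors' exact notation), is

      s_i = \ddot{e}_i + c_2\,\dot{e}_i + c_1\,e_i
            + \beta \sum_{j \in \mathcal{N}_i} (e_i - e_j),
      \qquad e_i = p_i - p_i^{d},

    where p_i and p_i^d are the actual and desired positions of UAV i in the formation pattern and c_1, c_2, \beta > 0 are design gains.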

  20. Conventions and workflows for using Situs

    International Nuclear Information System (INIS)

    Wriggers, Willy

    2012-01-01

    Recent developments of the Situs software suite for multi-scale modeling are reviewed. Typical workflows and conventions encountered during the processing of biophysical data from electron microscopy, tomography or small-angle X-ray scattering are described. Situs is a modular program package for the multi-scale modeling of atomic resolution structures and low-resolution biophysical data from electron microscopy, tomography or small-angle X-ray scattering. This article provides an overview of recent developments in the Situs package, with an emphasis on workflows and conventions that are important for practical applications. The modular design of the programs facilitates scripting in the bash shell, allowing specific programs to be combined in creative ways that go beyond the original intent of the developers. Several scripting-enabled functionalities, such as flexible transformations of data type, the use of symmetry constraints or the creation of two-dimensional projection images, are described. The processing of low-resolution biophysical maps in such workflows follows not only first principles but often also relies on implicit conventions. Situs conventions related to map formats, resolution, correlation functions and feature detection are reviewed and summarized. The compatibility of the Situs workflow with CCP4 conventions and programs is discussed

  1. Building and documenting workflows with python-based snakemake

    OpenAIRE

    Köster, Johannes; Rahmann, Sven

    2012-01-01

    Snakemake is a novel workflow engine with a simple Python-derived workflow definition language and an optimizing execution environment. It is the first system that supports multiple named wildcards (or variables) in the input and output filenames of each rule definition. It also allows users to write human-readable workflows that document themselves. We have found Snakemake especially useful for building high-throughput sequencing data analysis pipelines and present examples from this area....
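
    A minimal Snakefile sketch showing the multiple named wildcards the abstract highlights ({genome} and {sample} both appear in the input and output filenames of one rule); the mapping command is just a typical example:

      # Snakemake resolves {genome} and {sample} from the requested output path.
      rule map_reads:
          input:
              ref="refs/{genome}.fa",
              reads="reads/{sample}.fastq"
          output:
              "mapped/{genome}/{sample}.bam"
          shell:
              "bwa mem {input.ref} {input.reads} | samtools view -b - > {output}"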

  2. Concurrency & Asynchrony in Declarative Workflows

    DEFF Research Database (Denmark)

    Debois, Søren; Hildebrandt, Thomas; Slaats, Tijs

    2015-01-01

    In this paper, we propose a notion of concurrency for declarative process models, formulated in the context of Dynamic Condition Response (DCR) graphs, and exploiting the so-called “true concurrency” semantics of Labelled Asynchronous Transition Systems. We demonstrate how this semantic underpinning of concurrency in DCR Graphs admits asynchronous execution of declarative workflows, both conceptually and by reporting on a prototype implementation of a distributed declarative workflow engine. Both the theoretical development and the implementation are supported by an extended example; moreover, the theoretical...

  3. Improving adherence to the Epic Beacon ambulatory workflow.

    Science.gov (United States)

    Chackunkal, Ellen; Dhanapal Vogel, Vishnuprabha; Grycki, Meredith; Kostoff, Diana

    2017-06-01

    Computerized physician order entry has been shown to significantly improve chemotherapy safety by reducing the number of prescribing errors. Epic's Beacon Oncology Information System of computerized physician order entry and electronic medication administration was implemented in Henry Ford Health System's ambulatory oncology infusion centers on 9 November 2013. Since that time, compliance to the infusion workflow had not been assessed. The objective of this study was to optimize the current workflow and improve compliance to this workflow in the ambulatory oncology setting. This study was a retrospective, quasi-experimental study which analyzed the composite workflow compliance rate of patient encounters from 9 to 23 November 2014. Based on this analysis, an intervention was identified and implemented in February 2015 to improve workflow compliance. The primary endpoint was to compare the composite compliance rate to the Beacon workflow before and after a pharmacy-initiated intervention. The intervention, which was education of infusion center staff, was initiated by ambulatory-based oncology pharmacists and implemented by a multi-disciplinary team of pharmacists and nurses. The composite compliance rate was then reassessed for patient encounters from 2 to 13 March 2015 in order to analyze the effects of the intervention on compliance. The initial analysis in November 2014 revealed a composite compliance rate of 38%, and data analysis after the intervention revealed a statistically significant increase in the composite compliance rate to 83% (p < 0.001). This study shows that a pharmacist-initiated educational intervention can improve compliance to an ambulatory oncology infusion workflow.

  4. The standard-based open workflow system in GeoBrain (Invited)

    Science.gov (United States)

    Di, L.; Yu, G.; Zhao, P.; Deng, M.

    2013-12-01

    GeoBrain is an Earth science Web-service system developed and operated by the Center for Spatial Information Science and Systems, George Mason University. In GeoBrain, a standard-based open workflow system has been implemented to accommodate the automated processing of geospatial data through a set of complex geo-processing functions for advanced product generation. GeoBrain models complex geoprocessing at two levels, the conceptual and the concrete. At the conceptual level, the workflows exist in the form of data and service types defined by ontologies. The workflows at the conceptual level are called geo-processing models and are cataloged in GeoBrain as virtual product types. A conceptual workflow is instantiated into a concrete, executable workflow when a user requests a product that matches a virtual product type. Both conceptual and concrete workflows are encoded in the Business Process Execution Language (BPEL). A BPEL workflow engine, called BPELPower, has been implemented to execute the workflow for product generation. A provenance capturing service has been implemented to generate the ISO 19115-compliant complete product provenance metadata before and after the workflow execution. The generation of provenance metadata before the workflow execution allows users to examine the usability of the final product before the lengthy and expensive execution takes place. The three modes of workflow execution defined in ISO 19119, transparent, translucent, and opaque, are available in GeoBrain. A geoprocessing modeling portal has been developed to allow domain experts to develop geoprocessing models at the type level with the support of both data and service/processing ontologies. The geoprocessing models capture the knowledge of the domain experts and become the operational offerings of the products after a proper peer review of the models is conducted. Automated workflow composition has also been successfully experimented with, based on ontologies and artificial

  5. (De)Centralization of Public Procurement at the Local Level in the EU

    OpenAIRE

    Boštjan BREZOVNIK; Žan Jan OPLOTNIK; Borut VOJINOVIĆ

    2015-01-01

    The so-called decentralization of public procurement in EU Member States is accepted as the most suitable design of the public procurement system, often justified by greater economic efficiency and by the possibility of boosting the development of small and medium-sized enterprises, which act on the public procurement market as providers of goods, services and works. Despite the existence of highly decentralized public procurement systems which reflect the decentralization of administrativ...

  6. Deploying and sharing U-Compare workflows as web services.

    Science.gov (United States)

    Kontonatsios, Georgios; Korkontzelos, Ioannis; Kolluru, Balakrishna; Thompson, Paul; Ananiadou, Sophia

    2013-02-18

    U-Compare is a text mining platform that allows the construction, evaluation and comparison of text mining workflows. U-Compare contains a large library of components that are tuned to the biomedical domain. Users can rapidly develop biomedical text mining workflows by mixing and matching U-Compare's components. Workflows developed using U-Compare can be exported and sent to other users who, in turn, can import and re-use them. However, the resulting workflows are standalone applications, i.e., software tools that run and are accessible only via a local machine, and that can only be run with the U-Compare platform. We address the above issues by extending U-Compare to convert standalone workflows into web services automatically, via a two-click process. The resulting web services can be registered on a central server and made publicly available. Alternatively, users can make web services available on their own servers, after installing the web application framework, which is part of the extension to U-Compare. We have performed a user-oriented evaluation of the proposed extension, by asking users who have tested the enhanced functionality of U-Compare to complete questionnaires that assess its functionality, reliability, usability, efficiency and maintainability. The results obtained reveal that the new functionality is well received by users. The web services produced by U-Compare are built on top of open standards, i.e., REST and SOAP protocols, and therefore, they are decoupled from the underlying platform. Exported workflows can be integrated with any application that supports these open standards. We demonstrate how the newly extended U-Compare enhances the cross-platform interoperability of workflows, by seamlessly importing a number of text mining workflow web services exported from U-Compare into Taverna, i.e., a generic scientific workflow construction platform.
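
    Because the exported workflows become plain REST services, any HTTP client can invoke them; a hedged Python sketch follows (the URL, payload fields, and response shape are hypothetical, not U-Compare's documented interface):

      import requests

      # Hypothetical endpoint of a text mining workflow exported as a web service.
      SERVICE_URL = "http://example.org/ucompare/workflows/ner"

      resp = requests.post(SERVICE_URL,
                           json={"text": "BRCA1 is a human tumor suppressor gene."})
      resp.raise_for_status()
      print(resp.json())  # e.g., the annotations produced by the workflow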

  7. A standard-enabled workflow for synthetic biology

    KAUST Repository

    Myers, Chris J.

    2017-06-15

    A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools to be connected from a diversity of sources. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce types of design information, including multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in their publications.
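
    A small sketch of producing SBOL design data at the start of such a workflow, assuming the pySBOL2 Python library; the part name is invented:

      import sbol2

      doc = sbol2.Document()
      promoter = sbol2.ComponentDefinition('pExample')  # hypothetical promoter part
      promoter.roles = [sbol2.SO_PROMOTER]              # annotate its genetic role
      doc.addComponentDefinition(promoter)
      doc.write('pExample.xml')  # an SBOL file that downstream tools can import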

  8. Optimizing perioperative decision making: improved information for clinical workflow planning.

    Science.gov (United States)

    Doebbeling, Bradley N; Burton, Matthew M; Wiebke, Eric A; Miller, Spencer; Baxter, Laurence; Miller, Donald; Alvarez, Jorge; Pekny, Joseph

    2012-01-01

    Perioperative care is complex and involves multiple interconnected subsystems. Delayed starts, prolonged cases, and overtime are common. Surgical procedures account for 40-70% of hospital revenues and 30-40% of total costs. Most planning and scheduling in healthcare is done without modern planning tools, which have the potential to improve access by supporting operations planning. We identified key planning scenarios of interest to perioperative leaders in order to examine the feasibility of applying combinatorial optimization software to some of those planning issues in the operative setting. Perioperative leaders desire a broad range of tools for planning and assessing alternate solutions. Our models generated feasible solutions that varied as expected with resource and policy assumptions and found better utilization of scarce resources. Combinatorial optimization modeling can effectively evaluate alternatives to support key decisions for planning clinical workflow and improving care efficiency and satisfaction.
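
    A toy version of one such planning scenario, assuming the PuLP Python library: assign surgical cases to operating rooms so that the busiest room's total minutes (a proxy for overtime) are minimized. The durations and room count are invented.

      from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary

      cases = {"c1": 120, "c2": 90, "c3": 60, "c4": 150}  # minutes (invented)
      rooms = ["OR1", "OR2"]

      prob = LpProblem("or_assignment", LpMinimize)
      x = LpVariable.dicts("assign", [(c, r) for c in cases for r in rooms],
                           cat=LpBinary)
      makespan = LpVariable("makespan", lowBound=0)
      prob += makespan                                   # minimize busiest room
      for c in cases:
          prob += lpSum(x[(c, r)] for r in rooms) == 1   # one room per case
      for r in rooms:
          prob += lpSum(cases[c] * x[(c, r)] for c in cases) <= makespan
      prob.solve()
      print([(c, r) for (c, r) in x if x[(c, r)].value() == 1])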

  9. A lightweight messaging-based distributed processing and workflow execution framework for real-time and big data analysis

    Science.gov (United States)

    Laban, Shaban; El-Desouky, Aly

    2014-05-01

    To achieve rapid, simple and reliable parallel processing of different types of tasks and big-data processing on any compute cluster, a lightweight messaging-based distributed application processing and workflow execution framework model is proposed. The framework is based on Apache ActiveMQ and the Simple (or Streaming) Text Oriented Message Protocol (STOMP). ActiveMQ, a popular and powerful open-source persistence messaging and integration-patterns server with scheduler capabilities, acts as the message broker in the framework. STOMP provides an interoperable wire format that allows framework programs to talk and interact with each other and with ActiveMQ easily. In order to use the message broker efficiently, a unified message and topic naming pattern is utilized to achieve the required operation. Only three Python programs and a simple library, used to unify and simplify the use of ActiveMQ and the STOMP protocol, are needed to use the framework. A watchdog program is used to monitor, remove, add, start and stop any machine and/or its different tasks when necessary. For every machine, one and only one dedicated zoo keeper program is used to start the different functions or tasks (stompShell programs) needed for executing the user-required workflow. The stompShell instances execute workflow jobs based on received messages. A well-defined, simple and flexible message structure, based on JavaScript Object Notation (JSON), is used to build complex workflow systems; JSON is also used in configuration and in communication between machines and programs. The framework is platform independent. Although the framework is built using Python, the actual workflow programs or jobs can be implemented in any programming language. The generic framework can be used in small national data centres for processing seismological and radionuclide data received from the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear
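
    A minimal sketch of the described pattern, assuming the stomp.py client library and a local ActiveMQ broker; the topic name and JSON job fields are illustrative, not the framework's actual naming convention:

      import json, time, stomp

      class JobListener(stomp.ConnectionListener):
          def on_message(self, frame):  # stomp.py 8.x signature; older versions differ
              job = json.loads(frame.body)
              print("received job:", job["task"])

      conn = stomp.Connection([("localhost", 61613)])
      conn.set_listener("", JobListener())
      conn.connect("admin", "admin", wait=True)
      conn.subscribe(destination="/topic/workflow.machine1", id=1, ack="auto")

      # A workflow step encoded as a JSON message, as in the framework.
      conn.send(destination="/topic/workflow.machine1",
                body=json.dumps({"task": "process_waveform", "args": ["station01"]}))
      time.sleep(2)  # give the listener time to receive the message
      conn.disconnect()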

  10. DECENTRALIZATION OF PUBLIC AND LOCAL AUTHORITIES IN UKRAINE

    Directory of Open Access Journals (Sweden)

    Lyudmila Pron’ko

    2016-11-01

    Full Text Available The purpose of the research is to examine the modern system of local government in Ukraine, to analyse scientifically the feasibility and benefits of the reforms implemented for the decentralization and subsidiarity of local authorities and the decentralization of public power and public control, and to substantiate the need to strengthen the political status of local governments. Methodology. The methodological basis for research on decentralization and local government reforms is the Constitution of Ukraine, the Laws of Ukraine and Decrees of the President of Ukraine, as well as publications on these issues by domestic and foreign authors. Results. The study determined that, according to Article 5 of the Law of Ukraine “On local government in Ukraine”, the elements of local government are: the local community; village, town and city councils; village, town and city mayors; the executive bodies of village, town and city councils; district-in-city councils, created in cities with district division by decision of the territorial community or city council; district and regional councils, which represent the common interests of the territorial communities of villages, towns and cities; and BSP. The system of government in Ukraine is not fulfilling the role assigned to it, because there is twofold subordination and uncertainty in the powers of representative and executive bodies. Today there is a three-level administrative division: the basic level (village, town or city), the district level and the regional level. A region has both a local government council and an executive body (all decisions and programmes approved by a regional council are carried out by the regional state administration, a public authority). Thus there is a need for continued reform of local government on the principles of decentralization and subsidiarity, because they build the foundation of the state; one of the hallmarks of a modern democratic society has become political

  11. Agreement Workflow Tool (AWT)

    Data.gov (United States)

    Social Security Administration — The Agreement Workflow Tool (AWT) is a role-based Intranet application used for processing SSA's Reimbursable Agreements according to SSA's standards. AWT provides...

  12. Jealousy Graphs: Structure and Complexity of Decentralized Stable Matching

    Science.gov (United States)

    2013-01-01

    The stable matching... market. Using this structure, we are able to provide a finer analysis of the complexity of a subclass of decentralized matching markets.

  13. Computational State Transfer: An Architectural Style for Decentralized Systems

    OpenAIRE

    Gorlick, Michael Martin

    2016-01-01

    A decentralized system is a distributed system that operates under multiple, distinct spheres of authority in which collaboration among the principals is characterized by mutual distrust. Now commonplace, decentralized systems appear in a number of disparate domains: commerce, logistics, medicine, software development, manufacturing, and financial trading to name but a few. These systems of systems face two overlapping demands: security and safety to protect against errors, omissions and thre...

  14. Decentralized investment management: evidence from the pension fund industry

    OpenAIRE

    Blake, David; Timmermann, Allan; Tonks, Ian; Wermers, Russ

    2010-01-01

    The past few decades have seen a major shift from centralized to decentralized investment management by pension fund sponsors, despite the increased coordination problems that this brings. Using a unique, proprietary dataset of pension sponsors and managers, we identify two secular decentralization trends: sponsors switched (i) from generalist (balanced) to specialist managers across asset classes and (ii) from single to multiple competing managers within each asset class. We study the effe...

  15. Scientific Workflow Management in Proteomics

    Science.gov (United States)

    de Bruin, Jeroen S.; Deelder, André M.; Palmblad, Magnus

    2012-01-01

    Data processing in proteomics can be a challenging endeavor, requiring extensive knowledge of many different software packages, all with different algorithms, data format requirements, and user interfaces. In this article we describe the integration of a number of existing programs and tools in Taverna Workbench, a scientific workflow manager currently being developed in the bioinformatics community. We demonstrate how a workflow manager provides a single, visually clear and intuitive interface to complex data analysis tasks in proteomics, from raw mass spectrometry data to protein identifications and beyond. PMID:22411703

  16. Kronos: a workflow assembler for genome analytics and informatics

    Science.gov (United States)

    Taghiyar, M. Jafar; Rosner, Jamie; Grewal, Diljot; Grande, Bruno M.; Aniba, Radhouane; Grewal, Jasleen; Boutros, Paul C.; Morin, Ryan D.

    2017-01-01

    Abstract Background: The field of next-generation sequencing informatics has matured to a point where algorithmic advances in sequence alignment and individual feature detection methods have stabilized. Practical and robust implementation of complex analytical workflows (where such tools are structured into “best practices” for automated analysis of next-generation sequencing datasets) still requires significant programming investment and expertise. Results: We present Kronos, a software platform for facilitating the development and execution of modular, auditable, and distributable bioinformatics workflows. Kronos obviates the need for explicit coding of workflows by compiling a text configuration file into executable Python applications. Making analysis modules would still require programming. The framework of each workflow includes a run manager to execute the encoded workflows locally (or on a cluster or cloud), parallelize tasks, and log all runtime events. The resulting workflows are highly modular and configurable by construction, facilitating flexible and extensible meta-applications that can be modified easily through configuration file editing. The workflows are fully encoded for ease of distribution and can be instantiated on external systems, a step toward reproducible research and comparative analyses. We introduce a framework for building Kronos components that function as shareable, modular nodes in Kronos workflows. Conclusions: The Kronos platform provides a standard framework for developers to implement custom tools, reuse existing tools, and contribute to the community at large. Kronos is shipped with both Docker and Amazon Web Services Machine Images. It is free, open source, and available through the Python Package Index and at https://github.com/jtaghiyar/kronos. PMID:28655203
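
    A hedged Python sketch of the general idea of compiling a text configuration into an executable pipeline (Kronos's real YAML configuration schema is richer and is not reproduced here):

      # A dict stands in for a parsed configuration file: each task names a
      # callable and the task whose output it consumes.
      TASKS = {
          "align": {"run": lambda _: "aligned.bam", "after": None},
          "call":  {"run": lambda inp: f"variants.vcf (from {inp})", "after": "align"},
      }

      def execute(tasks):
          results = {}
          for name, spec in tasks.items():      # assumes tasks are listed in order
              results[name] = spec["run"](results.get(spec["after"]))
              print(name, "->", results[name])

      execute(TASKS)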

  17. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    Science.gov (United States)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models emerging in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting the collaborative design of such workflows: to name a few shortcomings, they do not support real-time co-design, track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists, or capture and retrieve collaboration knowledge on workflow design (the discussions that lead to a design). To address the aforementioned challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big-data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool that supports an elastic number of groups of Earth scientists in collaboratively designing and composing data analytics workflows through the Internet. Instead of reinventing the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  18. Query Optimizations over Decentralized RDF Graphs

    KAUST Repository

    Abdelaziz, Ibrahim

    2017-05-18

    Applications in life sciences, decentralized social networks, Internet of Things, and statistical linked dataspaces integrate data from multiple decentralized RDF graphs via SPARQL queries. Several approaches have been proposed to optimize query processing over a small number of heterogeneous data sources by utilizing schema information. In the case of schema similarity and interlinks among sources, these approaches cause unnecessary data retrieval and communication, leading to poor scalability and response time. This paper addresses these limitations and presents Lusail, a system for scalable and efficient SPARQL query processing over decentralized graphs. Lusail achieves scalability and low query response time through various optimizations at compile and run times. At compile time, we use a novel locality-aware query decomposition technique that maximizes the number of query triple patterns sent together to a source based on the actual location of the instances satisfying these triple patterns. At run time, we use selectivity-awareness and parallel query execution to reduce network latency and to increase parallelism by delaying the execution of subqueries expected to return large results. We evaluate Lusail using real and synthetic benchmarks, with data sizes up to billions of triples on an in-house cluster and a public cloud. We show that Lusail outperforms state-of-the-art systems by orders of magnitude in terms of scalability and response time.
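
    Lusail is a complete system, but its basic building block — issuing SPARQL against remote RDF endpoints — can be sketched in Python with the SPARQLWrapper library (the endpoint and query are generic examples, not Lusail's decomposed subqueries):

      from SPARQLWrapper import SPARQLWrapper, JSON

      endpoint = SPARQLWrapper("https://dbpedia.org/sparql")  # any public endpoint
      endpoint.setQuery("""
          SELECT ?s WHERE { ?s a <http://dbpedia.org/ontology/City> } LIMIT 5
      """)
      endpoint.setReturnFormat(JSON)
      results = endpoint.query().convert()
      for binding in results["results"]["bindings"]:
          print(binding["s"]["value"])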

  19. Decentralized energy policy turnaround. Opportunities and challenges; Dezentrale Energiewende. Chancen und Herausforderungen

    Energy Technology Data Exchange (ETDEWEB)

    Eiselt, Juergen

    2012-07-01

    This book supplies a comprehensive inventory of a decentralized energy policy turnaround that has already begun. The potentials of an effective energy policy turnaround are described by means of present structures, technologies and scientifically proven results. The book presents new technologies and effective concepts for replacing centralised energy supply with decentralized structures. Cost reduction and energy-autonomous systems, up to zero-tariff heating, are the main impulses for the utilization of decentralized forms of energy.

  20. On the Support of Scientific Workflows over Pub/Sub Brokers

    Directory of Open Access Journals (Sweden)

    Edwin Cedeño

    2013-08-01

    Full Text Available The execution of scientific workflows is gaining importance as more computing resources are available in the form of grid environments. The Publish/Subscribe paradigm offers well-proven solutions for sustaining distributed scenarios while maintaining the high level of task decoupling required by scientific workflows. In this paper, we propose a new model for supporting scientific workflows that improves the dissemination of control events. The proposed solution is based on the mapping of workflow tasks to the underlying Pub/Sub event layer, and the definition of interfaces and procedures for execution on brokers. In this paper we also analyze the strengths and weaknesses of current solutions that are based on existing message exchange models for scientific workflows. Finally, we explain how our model improves the information dissemination, event filtering, task decoupling and the monitoring of scientific workflows.

  1. On the support of scientific workflows over Pub/Sub brokers.

    Science.gov (United States)

    Morales, Augusto; Robles, Tomas; Alcarria, Ramon; Cedeño, Edwin

    2013-08-20

    The execution of scientific workflows is gaining importance as more computing resources are available in the form of grid environments. The Publish/Subscribe paradigm offers well-proven solutions for sustaining distributed scenarios while maintaining the high level of task decoupling required by scientific workflows. In this paper, we propose a new model for supporting scientific workflows that improves the dissemination of control events. The proposed solution is based on the mapping of workflow tasks to the underlying Pub/Sub event layer, and the definition of interfaces and procedures for execution on brokers. In this paper we also analyze the strengths and weaknesses of current solutions that are based on existing message exchange models for scientific workflows. Finally, we explain how our model improves the information dissemination, event filtering, task decoupling and the monitoring of scientific workflows.
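
    A minimal in-memory Python sketch of the mapping both records describe — workflow tasks decoupled behind event topics (a real deployment would use a broker; the topic names are invented):

      from collections import defaultdict

      class Broker:
          """Toy Pub/Sub broker: each topic maps to subscriber callbacks."""
          def __init__(self):
              self.topics = defaultdict(list)
          def subscribe(self, topic, callback):
              self.topics[topic].append(callback)
          def publish(self, topic, event):
              for cb in self.topics[topic]:
                  cb(event)

      broker = Broker()
      # Task B reacts to task A's completion event; the two tasks never refer
      # to each other directly, which is the decoupling being described.
      broker.subscribe("workflow.taskA.done",
                       lambda e: print("task B starts with", e))
      broker.publish("workflow.taskA.done", {"output": "intermediate.dat"})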

  2. Decentralized H∞ Control of Interconnected Systems with Time-varying Delays

    Directory of Open Access Journals (Sweden)

    Amal Zouhri

    2017-01-01

    Full Text Available This paper focuses on the problem of delay-dependent stability/stabilization of interconnected systems with time-varying delays. The approach is based on a new Lyapunov-Krasovskii functional. A decentralized delay-dependent stability analysis is performed to characterize, in terms of linear matrix inequalities (LMIs), the conditions under which every local subsystem of the linear interconnected delay system is asymptotically stable. We then design a decentralized state-feedback stabilization scheme such that the family of closed-loop feedback subsystems enjoys delay-dependent asymptotic stability for each subsystem. The decentralized feedback gains are determined by convex optimization over LMIs. All the developed results are tested on a representative example and compared with some recent previous ones.
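
    The functional itself is not reproduced in the abstract; a typical Lyapunov-Krasovskii functional used for delay-dependent LMI conditions of this kind (an illustrative standard form in LaTeX, not necessarily the one proposed) is

      V(x_t) = x^{\top}(t) P x(t)
             + \int_{t-\tau(t)}^{t} x^{\top}(s) Q x(s)\,ds
             + \int_{-\bar{\tau}}^{0} \int_{t+\theta}^{t}
               \dot{x}^{\top}(s) R \dot{x}(s)\,ds\,d\theta,

    with P, Q, R \succ 0 and \tau(t) \le \bar{\tau}; requiring \dot{V} < 0 along the subsystem trajectories yields the LMI conditions.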

  3. The EOS imaging system: Workflow and radiation dose in scoliosis examinations

    DEFF Research Database (Denmark)

    Mussmann, Bo; Torfing, Trine; Jespersen, Stig

    Introduction: The EOS imaging system is a biplane slot-beam scanner capable of full-body scans at low radiation dose and without geometrical distortion. It was implemented in our department in early 2012, and all scoliosis examinations are now performed in EOS. The system offers improved possibilities for measuring the rotation of individual vertebrae, and vertebral curves can be assessed in 3D. Leg length discrepancy measurements are performed in one exposure, without geometrical distortion and with no stitching. Full-body scans for sagittal balance are also performed with the equipment after spine surgery. Purpose: The purpose of the study was to evaluate workflow, defined as scheduled time per examination, and radiation dose in scoliosis examinations in EOS compared to conventional X-ray evaluation. Materials and Methods: The Dose Area Product (DAP) was measured with a dosimeter, and a comparison between conventional X

  4. Declarative Modelling and Safe Distribution of Healthcare Workflows

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao; Slaats, Tijs

    2012-01-01

    We present a formal technique for the safe distribution of workflow processes described declaratively as Nested Condition Response (NCR) Graphs, and apply the technique to a distributed healthcare workflow. Concretely, we provide a method to synthesize, from an NCR Graph and any distribution of its events... cross-organizational case management. The contribution of this paper is to adapt the technique to allow for nested processes and milestones, and to apply it to a healthcare workflow identified in a previous field study at Danish hospitals.

  5. Workflow Lexicons in Healthcare: Validation of the SWIM Lexicon.

    Science.gov (United States)

    Meenan, Chris; Erickson, Bradley; Knight, Nancy; Fossett, Jewel; Olsen, Elizabeth; Mohod, Prerna; Chen, Joseph; Langer, Steve G

    2017-06-01

    For clinical departments seeking to successfully navigate the challenges of modern health reform, obtaining access to operational and clinical data to establish and sustain goals for improving quality is essential. More broadly, health delivery organizations are also seeking to understand performance across multiple facilities and often across multiple electronic medical record (EMR) systems. Interpreting operational data across multiple vendor systems can be challenging, as various manufacturers may describe different departmental workflow steps in different ways and sometimes even within a single vendor's installed customer base. In 2012, The Society for Imaging Informatics in Medicine (SIIM) recognized the need for better quality and performance data standards and formed SIIM's Workflow Initiative for Medicine (SWIM), an initiative designed to consistently describe workflow steps in radiology departments as well as defining operational quality metrics. The SWIM lexicon was published as a working model to describe operational workflow steps and quality measures. We measured the prevalence of the SWIM lexicon workflow steps in both academic and community radiology environments using real-world patient observations and correlated that information with automatically captured workflow steps from our clinical information systems. Our goal was to measure frequency of occurrence of workflow steps identified by the SWIM lexicon in a real-world clinical setting, as well as to correlate how accurately departmental information systems captured patient flow through our health facility.

  6. Decentralized control: Status and outlook

    Czech Academy of Sciences Publication Activity Database

    Bakule, Lubomír

    2014-01-01

    Roč. 38, č. 1 (2014), s. 71-80 ISSN 1367-5788 R&D Projects: GA ČR GA13-02149S Institutional support: RVO:67985556 Keywords : decentralized control * networked control systems * event-triggered approach Subject RIV: BC - Control Systems Theory Impact factor: 2.518, year: 2014

  7. Integrating configuration workflows with project management system

    International Nuclear Information System (INIS)

    Nilsen, Dimitri; Weber, Pavel

    2014-01-01

    The complexity of the heterogeneous computing resources, services and recurring infrastructure changes at the GridKa WLCG Tier-1 computing center require a structured approach to configuration management and optimization of interplay between functional components of the whole system. A set of tools deployed at GridKa, including Puppet, Redmine, Foreman, SVN and Icinga, provides the administrative environment giving the possibility to define and develop configuration workflows, reduce the administrative effort and improve sustainable operation of the whole computing center. In this presentation we discuss the developed configuration scenarios implemented at GridKa, which we use for host installation, service deployment, change management procedures, service retirement etc. The integration of Puppet with a project management tool like Redmine provides us with the opportunity to track problem issues, organize tasks and automate these workflows. The interaction between Puppet and Redmine results in automatic updates of the issues related to the executed workflow performed by different system components. The extensive configuration workflows require collaboration and interaction between different departments like network, security, production etc. at GridKa. Redmine plugins developed at GridKa and integrated in its administrative environment provide an effective way of collaboration within the GridKa team. We present the structural overview of the software components, their connections, communication protocols and show a few working examples of the workflows and their automation.

  8. Decentralization – the way of democratization and modernization of the Republic of Moldova

    Directory of Open Access Journals (Sweden)

    Iurie ŢAP

    2017-06-01

    Full Text Available Decentralization, as a way of organizing a state, represents the path to its democratization and effectiveness. Furthermore, territorial decentralization shapes the relations between the state and local communities; for it to be efficient, certain fundamental theoretical conditions must be respected, guidelines followed, and two major balances achieved. More generally, an appropriate decentralization can be a catalyst for development and a remedy for overcoming internal crises.

  9. Corruption, accountability, and decentralization: theory and evidence from Mexico

    OpenAIRE

    Goodspeed, Timothy J.

    2011-01-01

    One of the fundamental tenets of fiscal federalism is that, absent various sorts of externalities, decentralized governments that rely on own-source revenues should be more fiscally efficient than decentralized governments that rely on grant financing. The argument relies in part on the idea that sub-national governments, being closer to the people, are more accountable to its citizens. Accountability to citizens is also important in understanding the presence of corruption in government. Thi...

  10. Peer Matcher : Decentralized Partnership Formation

    NARCIS (Netherlands)

    Bozdog, Nicolae Vladimir; Voulgaris, Spyros; Bal, Henri; van Halteren, Aart

    2015-01-01

    This paper presents PeerMatcher, a fully decentralized algorithm solving the k-clique matching problem. The aim of k-clique matching is to cluster a set of nodes having pairwise weights into k-size groups of maximal total weight. Since solving the problem requires exponential time, PeerMatcher

  11. Music Libraries: Centralization versus Decentralization.

    Science.gov (United States)

    Kuyper-Rushing, Lois

    2002-01-01

    Considers the decision that branch libraries, music libraries in particular, have struggled with concerning a centralized location in the main library versus a decentralized collection. Reports on a study of the Association of Research Libraries that investigated the location of music libraries, motivation for the location, degrees offered,…

  12. PeerMatcher: Decentralized Partnership Formation

    NARCIS (Netherlands)

    Bozdog, N.V.; Voulgaris, S.; Bal, H.E.; van Halteren, A.

    2015-01-01

    This paper presents PeerMatcher, a fully decentralized algorithm solving the k-clique matching problem. The aim of k-clique matching is to cluster a set of nodes having pairwise weights into k-size groups of maximal total weight. Since solving the problem requires exponential time, PeerMatcher
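
    A small centralized greedy baseline for the k-clique matching objective, in Python (PeerMatcher itself is decentralized and this sketch only illustrates what is being optimized, with invented weights):

      from itertools import combinations

      def greedy_k_clique_matching(nodes, weight, k):
          # Repeatedly pick the remaining k-subset of maximal total pairwise weight.
          remaining, groups = set(nodes), []
          while len(remaining) >= k:
              best = max(combinations(sorted(remaining), k),
                         key=lambda g: sum(weight[frozenset(p)]
                                           for p in combinations(g, 2)))
              groups.append(best)
              remaining -= set(best)
          return groups

      nodes = ["a", "b", "c", "d"]
      pair_weights = [5, 1, 1, 1, 1, 5]  # ab, ac, ad, bc, bd, cd
      weight = {frozenset(p): w
                for p, w in zip(combinations(nodes, 2), pair_weights)}
      print(greedy_k_clique_matching(nodes, weight, k=2))  # [('a','b'), ('c','d')]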

  13. Workflow Dynamics and the Imaging Value Chain: Quantifying the Effect of Designating a Nonimage-Interpretive Task Workflow.

    Science.gov (United States)

    Lee, Matthew H; Schemmel, Andrew J; Pooler, B Dustin; Hanley, Taylor; Kennedy, Tabassum A; Field, Aaron S; Wiegmann, Douglas; Yu, John-Paul J

    To assess the impact of separate non-image interpretive task and image-interpretive task workflows in an academic neuroradiology practice. A prospective, randomized, observational investigation of a centralized academic neuroradiology reading room was performed. The primary reading room fellow was observed over a one-month period using a time-and-motion methodology, recording frequency and duration of tasks performed. Tasks were categorized into separate image interpretive and non-image interpretive workflows. Post-intervention observation of the primary fellow was repeated following the implementation of a consult assistant responsible for non-image interpretive tasks. Pre- and post-intervention data were compared. Following separation of image-interpretive and non-image interpretive workflows, time spent on image-interpretive tasks by the primary fellow increased from 53.8% to 73.2% while non-image interpretive tasks decreased from 20.4% to 4.4%. Mean time duration of image interpretation nearly doubled, from 05:44 to 11:01 (p = 0.002). Decreases in specific non-image interpretive tasks, including phone calls/paging (2.86/hr versus 0.80/hr), in-room consultations (1.36/hr versus 0.80/hr), and protocoling (0.99/hr versus 0.10/hr), were observed. The consult assistant experienced 29.4 task switching events per hour. Rates of specific non-image interpretive tasks for the CA were 6.41/hr for phone calls/paging, 3.60/hr for in-room consultations, and 3.83/hr for protocoling. Separating responsibilities into NIT and IIT workflows substantially increased image interpretation time and decreased TSEs for the primary fellow. Consolidation of NITs into a separate workflow may allow for more efficient task completion. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. The myth of standardized workflow in primary care.

    Science.gov (United States)

    Holman, G Talley; Beasley, John W; Karsh, Ben-Tzion; Stone, Jamie A; Smith, Paul D; Wetterneck, Tosha B

    2016-01-01

    Primary care efficiency and quality are essential for the nation's health. The demands on primary care physicians (PCPs) are increasing as healthcare becomes more complex. A more complete understanding of PCP workflow variation is needed to guide future healthcare redesigns. This analysis evaluates workflow variation in terms of the sequence of tasks performed during patient visits. Two patient visits from 10 PCPs from 10 different United States Midwestern primary care clinics were analyzed to determine physician workflow. Tasks and the progressive sequence of those tasks were observed, documented, and coded by task category using a PCP task list. Variations in the sequence and prevalence of tasks at each stage of the primary care visit were assessed considering the physician, the patient, the visit's progression, and the presence of an electronic health record (EHR) at the clinic. PCP workflow during patient visits varies significantly, even for an individual physician, with no single or even common workflow pattern being present. The prevalence of specific tasks shifts significantly as primary care visits progress to their conclusion but, notably, PCPs collect patient information throughout the visit. PCP workflows were unpredictable during face-to-face patient visits. Workflow emerges as the result of a "dance" between physician and patient as their separate agendas are addressed, a side effect of patient-centered practice. Future healthcare redesigns should support a wide variety of task sequences to deliver high-quality primary care. The development of tools such as electronic health records must be based on the realities of primary care visits if they are to successfully support a PCP's mental and physical work, resulting in effective, safe, and efficient primary care. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  15. Comparative LCA of decentralized wastewater treatment alternatives for non-potable urban reuse.

    Science.gov (United States)

    Opher, Tamar; Friedler, Eran

    2016-11-01

    Municipal wastewater (WW) effluent represents a reliable and significant source of reclaimed water, which is much needed nowadays. Water reclamation and reuse have become an attractive option for conserving and extending available water sources. The decentralized approach to domestic WW treatment benefits from the advantages of source separation, which makes simple small-scale systems and on-site reuse available, can be constructed on a short time schedule, and can occasionally be upgraded with new technological developments. In this study we perform a Life Cycle Assessment to compare the environmental impacts of four alternatives for a hypothetical city's water-wastewater service system. The baseline alternative is the most common, centralized approach to WW treatment, in which WW is conveyed to and treated in a large wastewater treatment plant (WWTP) and is then discharged to a stream. The other three alternatives represent different scales of distribution of the WW treatment phase, along with urban irrigation and domestic non-potable water reuse (toilet flushing). The first alternative includes centralized treatment at a WWTP, with part of the reclaimed WW (RWW) supplied back to the urban consumers. The second and third alternatives implement decentralized greywater (GW) treatment with local reuse, one at cluster level (320 households) and one at building level (40 households). Life cycle impact assessment results show a consistent disadvantage of the prevailing centralized approach under local conditions in Israel, where seawater desalination is the marginal source of water supply. The alternative of source separation and GW reuse at cluster level seems to be the most preferable one, though its environmental performance is only slightly better than GW reuse at building level. Centralized WW treatment with urban reuse of WWTP effluents is not advantageous over decentralized treatment of GW because the supply of RWW back to consumers is very costly in materials and

  16. A standard-enabled workflow for synthetic biology.

    Science.gov (United States)

    Myers, Chris J; Beal, Jacob; Gorochowski, Thomas E; Kuwahara, Hiroyuki; Madsen, Curtis; McLaughlin, James Alastair; Mısırlı, Göksel; Nguyen, Tramy; Oberortner, Ernst; Samineni, Meher; Wipat, Anil; Zhang, Michael; Zundel, Zach

    2017-06-15

    A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools to be connected from a diversity of sources. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce types of design information, including multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in their publications. © 2017 The Author(s); published by Portland Press Limited on behalf of the Biochemical Society.

  17. Decentralization's impact on the health workforce: Perspectives of managers, workers and national leaders

    Directory of Open Access Journals (Sweden)

    Kolehmainen-Aitken Riitta-Liisa

    2004-05-01

    Full Text Available Abstract Designers and implementers of decentralization and other reform measures have focused much attention on financial and structural reform measures, but ignored their human resource implications. Concern is mounting about the impact that the reallocation of roles and responsibilities has had on the health workforce and its management, but the experiences and lessons of different countries have not been widely shared. This paper examines evidence from published literature on decentralization's impact on the demand side of the human resource equation, as well as the factors that have contributed to the impact. The elements that make such an impact analysis exceptionally complex are identified. They include the mode of decentralization that a country is implementing, the level of responsibility for the salary budget and pay determination, and the civil service status of transferred health workers. The main body of the paper is devoted to examining decentralization's impact on human resource issues from three different perspectives: that of local health managers, health workers themselves, and national health leaders. These three groups have different concerns in the human resource realm, and consequently, have been differently affected by decentralization processes. The paper concludes with recommendations regarding three key concerns that national authorities and international agencies should give prompt attention to. They are (1) defining the essential human resource policy, planning and management skills for national human resource managers who work in decentralized countries, and developing training programs to equip them with such skills; (2) supporting research that focuses on improving the knowledge base of how different modes of decentralization impact on staffing equity; and (3) identifying factors that most critically influence health worker motivation and performance under decentralization, and documenting the most cost-effective best

  18. Decentralization's impact on the health workforce: Perspectives of managers, workers and national leaders.

    Science.gov (United States)

    Kolehmainen-Aitken, Riitta-Liisa

    2004-05-14

    Designers and implementers of decentralization and other reform measures have focused much attention on financial and structural reform measures, but ignored their human resource implications. Concern is mounting about the impact that the reallocation of roles and responsibilities has had on the health workforce and its management, but the experiences and lessons of different countries have not been widely shared. This paper examines evidence from published literature on decentralization's impact on the demand side of the human resource equation, as well as the factors that have contributed to the impact. The elements that make such an impact analysis exceptionally complex are identified. They include the mode of decentralization that a country is implementing, the level of responsibility for the salary budget and pay determination, and the civil service status of transferred health workers. The main body of the paper is devoted to examining decentralization's impact on human resource issues from three different perspectives: that of local health managers, health workers themselves, and national health leaders. These three groups have different concerns in the human resource realm, and consequently, have been differently affected by decentralization processes. The paper concludes with recommendations regarding three key concerns that national authorities and international agencies should give prompt attention to. They are (1) defining the essential human resource policy, planning and management skills for national human resource managers who work in decentralized countries, and developing training programs to equip them with such skills; (2) supporting research that focuses on improving the knowledge base of how different modes of decentralization impact on staffing equity; and (3) identifying factors that most critically influence health worker motivation and performance under decentralization, and documenting the most cost-effective best practices to improve them.

  19. Decentralized Economic Dispatch Scheme With Online Power Reserve for Microgrids

    DEFF Research Database (Denmark)

    Nutkani, I. U.; Loh, Poh Chiang; Wang, P.

    2017-01-01

    Decentralized economic operation schemes have several advantages when compared with the traditional centralized management system for microgrids. Specifically, decentralized schemes are more flexible, less computationally intensive, and easier to implement without relying on communication infrastructure. Economic operation of existing decentralized schemes is also usually achieved by either tuning the droop characteristics of distributed generators (DGs) or prioritizing their dispatch order. For the latter, an earlier scheme has tried to prioritize the DG dispatch based on their no-load costs, their power ratings, and other necessary constraints, before deciding the DG dispatch priorities and droop characteristics. The proposed scheme also allows online power reserve to be set and regulated within the microgrid. This, together with the generation cost saved, has been verified.
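
    As a rough illustration of the dispatch-priority idea described above, the following Python sketch ranks generators by an assumed per-unit cost and dispatches them in that order until demand plus an online reserve is covered. The unit names, costs, ratings, and reserve rule are invented for illustration and are not the paper's actual scheme.

        # Illustrative only: dispatch DGs in ascending cost order until
        # load plus the online power reserve is met.
        dgs = [  # (name, cost per kWh, power rating in kW) -- made-up values
            ("diesel", 0.30, 50.0),
            ("microturbine", 0.20, 40.0),
            ("fuel_cell", 0.25, 30.0),
        ]

        def dispatch(load_kw, reserve_kw):
            remaining = load_kw + reserve_kw      # demand plus reserve margin
            plan = []
            for name, cost, rating in sorted(dgs, key=lambda d: d[1]):
                output = min(rating, remaining)   # cheapest units run first
                if output > 0:
                    plan.append((name, output))
                    remaining -= output
            if remaining > 0:
                raise RuntimeError("insufficient generation capacity")
            return plan

        print(dispatch(load_kw=70.0, reserve_kw=10.0))
        # -> [('microturbine', 40.0), ('fuel_cell', 30.0), ('diesel', 10.0)]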

  20. Shorter Decentralized Attribute-Based Encryption via Extended Dual System Groups

    Directory of Open Access Journals (Sweden)

    Jie Zhang

    2017-01-01

    Decentralized attribute-based encryption (ABE) is a special form of multiauthority ABE systems, in which no central authority and global coordination are required other than creating the common reference parameters. In this paper, we propose a new decentralized ABE in prime-order groups by using extended dual system groups. We formulate some assumptions used to prove the security of our scheme. Our proposed scheme is fully secure under the standard k-Lin assumption in the random oracle model and can support any monotone access structure. Compared with existing fully secure decentralized ABE systems, our construction has shorter ciphertexts and secret keys. Moreover, fast decryption is achieved in our system, in which ciphertexts can be decrypted with a constant number of pairings.

  1. Decentralized Optimization for a Novel Control Structure of HVAC System

    Directory of Open Access Journals (Sweden)

    Shiqiang Wang

    2016-01-01

    A decentralized control structure is introduced into the heating, ventilation, and air conditioning (HVAC) system to solve the high maintenance and labor cost problem in actual engineering. Based on this new control system, a decentralized optimization method is presented for sensor fault repair and optimal group control of HVAC equipment. The convergence property of the novel method is theoretically analyzed considering both convex and nonconvex systems with constraints. In this decentralized control system, each traditional device is fitted with a control chip such that it becomes a smart device. The smart device can communicate and operate collaboratively with the other devices to accomplish designated tasks. The effectiveness of the presented method is verified by simulations and hardware tests.

  2. Scientific workflows as productivity tools for drug discovery.

    Science.gov (United States)

    Shon, John; Ohkawa, Hitomi; Hammer, Juergen

    2008-05-01

    Large pharmaceutical companies annually invest tens to hundreds of millions of US dollars in research informatics to support their early drug discovery processes. Traditionally, most of these investments are designed to increase the efficiency of drug discovery. The introduction of do-it-yourself scientific workflow platforms has enabled research informatics organizations to shift their efforts toward scientific innovation, ultimately resulting in a possible increase in the return on their investments. Unlike most approaches to scientific data handling and application integration, researchers apply scientific workflows to in silico experimentation and exploration, leading to scientific discoveries that lie beyond automation and integration. This review highlights some key requirements for scientific workflow environments in the pharmaceutical industry that are necessary for increasing research productivity. Examples of the application of scientific workflows in research and a summary of recent platform advances are also provided.

  3. Examining daily activity routines of older adults using workflow.

    Science.gov (United States)

    Chung, Jane; Ozkaynak, Mustafa; Demiris, George

    2017-07-01

    We evaluated the value of workflow analysis supported by a novel visualization technique to better understand the daily routines of older adults and highlight their patterns of daily activities and normal variability in physical functions. We used a self-reported activity diary to obtain data from six community-dwelling older adults for 14 consecutive days. Workflow for daily routine was analyzed using the EventFlow tool, which aggregates workflow information to highlight patterns and variabilities. A total of 1453 events were included in the data analysis. To demonstrate the patterns and variability of each individual's daily activities, participant activity workflows were visualized and compared. The workflow analysis revealed great variability in activity types, regularity, frequency, duration, and timing of performing certain activities across individuals. Also, when the workflow approach was applied to spatial information about activities, the analysis revealed meaningful data on individuals' mobility at different levels of life space, from home to community. Results suggest that using workflows to characterize the daily activities of older adults will help clinicians and researchers understand their daily routines and prepare education and prevention strategies tailored to each individual's activity level. This tool also has the potential to be integrated into consumer informatics technologies, such as patient portals or personal health records, so that consumers may be encouraged to become actively involved in monitoring and managing their health. Copyright © 2017 Elsevier Inc. All rights reserved.
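
    The kind of aggregation such a visualization tool performs can be sketched in a few lines of Python; the diary events below are invented, and this is only a schematic of per-activity frequency and duration summaries, not the EventFlow tool itself.

        # Group self-reported diary events by participant and activity,
        # then summarize frequency and mean duration (minutes).
        from collections import defaultdict
        from statistics import mean

        events = [  # (participant, activity, duration_min) -- invented data
            ("P1", "meal", 30), ("P1", "walk", 45), ("P1", "meal", 25),
            ("P2", "meal", 40), ("P2", "tv", 120), ("P2", "walk", 20),
        ]

        summary = defaultdict(list)
        for participant, activity, duration in events:
            summary[(participant, activity)].append(duration)

        for (participant, activity), durations in sorted(summary.items()):
            print(participant, activity, "count=%d" % len(durations),
                  "mean=%.1f min" % mean(durations))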

  4. Distributed Workflow Service Composition Based on CTR Technology

    Science.gov (United States)

    Feng, Zhilin; Ye, Yanming

    Recently, WS-BPEL has gradually become the basis of a standard for web service description and composition. However, WS-BPEL cannot efficiently describe distributed workflow services because it lacks the necessary expressive power and formal semantics. This paper presents a novel method for modeling distributed workflow service composition with Concurrent TRansaction logic (CTR). The syntactic structures of WS-BPEL and CTR are analyzed, and new rules for mapping WS-BPEL into CTR are given. A case study shows that the proposed method is appropriate for modeling workflow business services in distributed environments.

  5. What is needed for effective open access workflows?

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Institutions and funders are pushing forward open access with ever new guidelines and policies. Since institutional repositories are important maintainers of green open access, they should support easy and fast workflows for researchers and libraries to release publications. Based on the requirements specifications of researchers, libraries and publishers, possible supporting software extensions are discussed. What does a typical workflow look like? What has to be considered by the researchers, and by the editors in the library, before releasing a green open access publication? Where and how can software support and improve existing workflows?

  6. The Dynamics of Decentralization Arrangements in Indonesia Constitutional System

    Directory of Open Access Journals (Sweden)

    Haposan Siallagan

    2016-06-01

    Local autonomy has long been implemented in Indonesia and has passed through a number of phases within the governmental system. This paper seeks to fathom the dynamics of decentralization arrangements. The discussion shows that, according to the substance of the various decentralization policies that have been issued, local autonomy arrangements tend to be framed in a broad sense, frequently known as the broadest local autonomy. Through the local autonomy mechanism, local governments are given flexibility to manage and administer their own domestic households. In order to maximize the implementation of the broadest local autonomy, local governments have to be well prepared to handle the many tasks of local government. Such preparations relate to human resource capacity, competence in carrying out the tasks, and financial management capacity.

  7. Evaluation of Meta scheduler Architectures and Task assignment Policies for High throughput Computing

    CERN Document Server

    Caron, E; Tsaregorodtsev, A Yu

    2006-01-01

    In this paper we present a model and simulator for many clusters of heterogeneous PCs belonging to a local network. These clusters are assumed to be connected to each other through a global network, and each cluster is managed via a local scheduler which is shared by many users. We validate our simulator by comparing the experimental and analytical results of an M/M/4 queuing system. These studies indicate that the simulator is consistent. Next, we compare with a real batch system and obtain an average error of 10.5% for the response time and 12% for the makespan. We conclude that the simulator is realistic and describes well the behaviour of a large-scale system. Thus we can study the scheduling of our system, called DIRAC, in a high-throughput context. We justify our decentralized, adaptive and opportunistic approach in comparison to a centralized approach in such a context.
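
    The analytical side of such a validation can be reproduced with the standard M/M/c formulas. The sketch below computes the mean response time of an M/M/4 queue from the Erlang C formula; the arrival and service rates are arbitrary example values, not the paper's.

        # Mean response time W of an M/M/c queue:
        # W = C(c, a) / (c*mu - lam) + 1/mu, with C the Erlang C probability.
        from math import factorial

        def mmc_response_time(lam, mu, c):
            a = lam / mu                    # offered load
            rho = a / c                     # utilization, must be < 1
            tail = (a**c / factorial(c)) / (1 - rho)
            erlang_c = tail / (sum(a**k / factorial(k) for k in range(c)) + tail)
            return erlang_c / (c * mu - lam) + 1 / mu

        print(mmc_response_time(lam=3.0, mu=1.0, c=4))  # ~1.51 time units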

  8. Decentralized substations for low-temperature district heating with no Legionella risk, and low return temperatures

    International Nuclear Information System (INIS)

    Yang, Xiaochen; Li, Hongwei; Svendsen, Svend

    2016-01-01

    To improve energy efficiency and give more access to renewable energy sources, low-temperature district heating (LTDH) is a promising concept for the future. However, concern about Legionella proliferation restricts applying low-temperature district heating in conventional systems with domestic hot water (DHW) circulation. In this study, a system with decentralized substations was analysed as a solution to this problem. Furthermore, a modification of the decentralized substation system was proposed in order to reduce the average return temperature. Models of a conventional system with medium-temperature district heating, a decentralized substation system with LTDH, and an innovative decentralized substation system with LTDH were built based on information from a case building. The annual distribution heat loss and the operating costs of the three scenarios were calculated and compared. The results show that realizing LTDH with decentralized substation units saves 30% of the annual distribution heat loss inside the building compared to a conventional system with medium-temperature district heating. Replacing the bypass pipe with an in-line supply pipe and a heat pump, the innovative decentralized substation system can reduce distribution heat loss by 39% compared to the conventional system and by 12% compared to the normal decentralized substation system with bypass. - Highlights: • The system of decentralized substations can realize low-temperature district heating without running the risk of Legionella. • Decentralized substations help reduce the distribution heat loss inside the building compared to a conventional system. • A new concept that can reduce the return temperature for district heating is proposed and analysed.
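
    The three quoted percentages are roughly consistent with each other, which a short check makes explicit (taking the conventional system's distribution heat loss as the reference; the base value of 100 is arbitrary):

        conventional = 100.0
        decentralized = conventional * (1 - 0.30)  # 30% saving -> 70.0
        innovative = conventional * (1 - 0.39)     # 39% saving -> 61.0
        print(1 - innovative / decentralized)      # ~0.13, close to the quoted 12%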

  9. Centralized vs. decentralized child mental health services.

    Science.gov (United States)

    Adams, M S

    1977-09-01

    One of the basic tenets of the Community Mental Health Center movement is that services should be provided in the consumers' community. Various centers across the country have attempted to do this in either a centralized or decentralized fashion. Historically, most health services have been provided centrally, a good example being the traditional general hospital with its centralized medical services. Over the years, some of these services have become decentralized to take the form of local health centers, health maintenance organizations, community clinics, etc, and now various large mental health centers are also being broken down into smaller community units. An example of each type of mental health facility is delineated here.

  10. wft4galaxy: a workflow testing tool for galaxy.

    Science.gov (United States)

    Piras, Marco Enrico; Pireddu, Luca; Zanetti, Gianluigi

    2017-12-01

    Workflow managers for scientific analysis provide a high-level programming platform facilitating standardization, automation, collaboration and access to sophisticated computing resources. The Galaxy workflow manager provides a prime example of this type of platform. As compositions of simpler tools, workflows effectively comprise specialized computer programs implementing often very complex analysis procedures. To date, no simple way to automatically test Galaxy workflows and ensure their correctness has appeared in the literature. With wft4galaxy we offer a tool to bring automated testing to Galaxy workflows, making it feasible to bring continuous integration to their development and ensuring that defects are detected promptly. wft4galaxy can be easily installed as a regular Python program or launched directly as a Docker container, with the latter reducing installation effort to a minimum. Available at https://github.com/phnmnl/wft4galaxy under the Academic Free License v3.0. marcoenrico.piras@crs4.it. © The Author 2017. Published by Oxford University Press.

  11. Two-Layer Transaction Management for Workflow Management Applications

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Vonk, J.; Boertjes, E.M.; Apers, Peter M.G.

    Workflow management applications require advanced transaction management that is not offered by traditional database systems. For this reason, a number of extended transaction models has been proposed in the past. None of these models seems completely adequate, though, because workflow management

  12. The MPO system for automatic workflow documentation

    Energy Technology Data Exchange (ETDEWEB)

    Abla, G.; Coviello, E.N.; Flanagan, S.M. [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Greenwald, M. [Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Lee, X. [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Romosan, A. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Schissel, D.P., E-mail: schissel@fusion.gat.com [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Shoshani, A. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Stillerman, J.; Wright, J. [Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Wu, K.J. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)

    2016-11-15

    Highlights: • Data model, infrastructure, and tools for data tracking, cataloging, and integration. • Automatically document workflow and data provenance in the widest sense. • Fusion Science as test bed but the system’s framework and data model is quite general. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for critical applications. However, it is not the mere existence of data that is important, but our ability to make use of it. Experience has shown that when metadata is better organized and more complete, the underlying data becomes more useful. Traditionally, capturing the steps of scientific workflows and metadata was the role of the lab notebook, but the digital era has resulted instead in the fragmentation of data, processing, and annotation. This paper presents the Metadata, Provenance, and Ontology (MPO) System, the software that can automate the documentation of scientific workflows and associated information. Based on recorded metadata, it provides explicit information about the relationships among the elements of workflows in notebook form augmented with directed acyclic graphs. A set of web-based graphical navigation tools and Application Programming Interface (API) have been created for searching and browsing, as well as programmatically accessing the workflows and data. We describe the MPO concepts and its software architecture. We also report the current status of the software as well as the initial deployment experience.

  13. The MPO system for automatic workflow documentation

    International Nuclear Information System (INIS)

    Abla, G.; Coviello, E.N.; Flanagan, S.M.; Greenwald, M.; Lee, X.; Romosan, A.; Schissel, D.P.; Shoshani, A.; Stillerman, J.; Wright, J.; Wu, K.J.

    2016-01-01

    Highlights: • Data model, infrastructure, and tools for data tracking, cataloging, and integration. • Automatically document workflow and data provenance in the widest sense. • Fusion Science as test bed but the system’s framework and data model is quite general. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for critical applications. However, it is not the mere existence of data that is important, but our ability to make use of it. Experience has shown that when metadata is better organized and more complete, the underlying data becomes more useful. Traditionally, capturing the steps of scientific workflows and metadata was the role of the lab notebook, but the digital era has resulted instead in the fragmentation of data, processing, and annotation. This paper presents the Metadata, Provenance, and Ontology (MPO) System, the software that can automate the documentation of scientific workflows and associated information. Based on recorded metadata, it provides explicit information about the relationships among the elements of workflows in notebook form augmented with directed acyclic graphs. A set of web-based graphical navigation tools and Application Programming Interface (API) have been created for searching and browsing, as well as programmatically accessing the workflows and data. We describe the MPO concepts and its software architecture. We also report the current status of the software as well as the initial deployment experience.

  14. VLAM-G: Interactive Data Driven Workflow Engine for Grid-Enabled Resources

    Directory of Open Access Journals (Sweden)

    Vladimir Korkhov

    2007-01-01

    Grid brings the power of many computers to scientists. However, the development of Grid-enabled applications requires knowledge of the Grid infrastructure and low-level APIs to Grid services. In turn, workflow management systems provide a high-level environment for rapid prototyping of experimental computing systems. Coupling Grid and workflow paradigms is important for the scientific community: it makes the power of the Grid easily available to the end user. The paradigm of data-driven workflow execution is one way to enable distributed workflow on the Grid. The work presented in this paper is carried out in the context of the Virtual Laboratory for e-Science project. We present the VLAM-G workflow management system and its core component, the Run-Time System (RTS). The RTS is a dataflow-driven workflow engine which utilizes Grid resources, hiding the complexity of the Grid from the scientist. Special attention is paid to the concept of dataflow and direct data streaming between distributed workflow components. We present the architecture and components of the RTS, describe the features of VLAM-G workflow execution, and evaluate the system by performance measurements and a real-life use case.

  15. Decentralized control of discrete-time linear time invariant systems with input saturation

    NARCIS (Netherlands)

    Deliu, C.; Deliu, Ciprian; Malek, Babak; Roy, Sandip; Saberi, Ali; Stoorvogel, Antonie Arij

    We study decentralized stabilization of discrete-time linear time invariant (LTI) systems subject to actuator saturation, using LTI controllers. The requirement of stabilization under both saturation constraints and decentralization impose obvious necessary conditions on the open-loop plant, namely

  16. Decentralized control of discrete-time linear time invariant systems with input saturation

    NARCIS (Netherlands)

    Deliu, Ciprian; Deliu, C.; Malek, Babak; Roy, Sandip; Saberi, Ali; Stoorvogel, Antonie Arij

    2009-01-01

    We study decentralized stabilization of discrete time linear time invariant (LTI) systems subject to actuator saturation, using LTI controllers. The requirement of stabilization under both saturation constraints and decentralization impose obvious necessary conditions on the open-loop plant, namely

  17. Decentralizing conservation and diversifying livelihoods within Kanchenjunga Conservation Area, Nepal.

    Science.gov (United States)

    Parker, Pete; Thapa, Brijesh; Jacob, Aerin

    2015-12-01

    To alleviate poverty and enhance conservation in resource dependent communities, managers must identify existing livelihood strategies and the associated factors that impede household access to livelihood assets. Researchers increasingly advocate reallocating management power from exclusionary central institutions to a decentralized system of management based on local and inclusive participation. However, it is yet to be shown if decentralizing conservation leads to diversified livelihoods within a protected area. The purpose of this study was to identify and assess factors affecting household livelihood diversification within Nepal's Kanchenjunga Conservation Area Project, the first protected area in Asia to decentralize conservation. We randomly surveyed 25% of Kanchenjunga households to assess household socioeconomic and demographic characteristics and access to livelihood assets. We used a cluster analysis with the ten most common income generating activities (both on- and off-farm) to group the strategies households use to diversify livelihoods, and a multinomial logistic regression to identify predictors of livelihood diversification. We found four distinct groups of household livelihood strategies with a range of diversification that directly corresponded to household income. The predictors of livelihood diversification were more related to pre-existing socioeconomic and demographic factors (e.g., more landholdings and livestock, fewer dependents, receiving remittances) than activities sponsored by decentralizing conservation (e.g., microcredit, training, education, interaction with project staff). Taken together, our findings indicate that without direct policies to target marginalized groups, decentralized conservation in Kanchenjunga will continue to exclude marginalized groups, limiting a household's ability to diversify their livelihood and perpetuating their dependence on natural resources. Copyright © 2015 Elsevier Ltd. All rights reserved.

  18. CO2 Storage Feasibility: A Workflow for Site Characterisation

    Directory of Open Access Journals (Sweden)

    Nepveu Manuel

    2015-04-01

    In this paper, we present an overview of the SiteChar workflow model for site characterisation and assessment for CO2 storage. Site characterisation and assessment is required when permits are requested from the legal authorities in the process of starting CO2 storage at a given site. The goal is to assess whether a proposed CO2 storage site can indeed be used for permanent storage while meeting the safety requirements demanded by the European Commission (EC) Storage Directive (2009/31/EC). Many issues have to be scrutinised, and the workflow presented here is put forward to help efficiently organise this complex task. Three issues are highlighted: communication within the working team and with the authorities; interdependencies in the workflow and feedback loops; and the risk-based character of the workflow. A general overview (helicopter view) of the workflow is given; the issues involved in communication and the risk assessment process are described in more detail. The workflow as described has been tested within the SiteChar project on five potential storage sites throughout Europe. This resulted in a list of key aspects of site characterisation which can help prepare and focus new site characterisation studies.

  19. THE DECENTRALIZATION PROCESS IN ROMANIA HAS BEEN AFFECTED BY THE FINANCIAL CRISIS OR NOT; ARGUMENTS IN FAVOR OR AGAINST DECENTRALIZATION IN THE MANAGEMENT OF THE FINANCIAL CRISIS

    Directory of Open Access Journals (Sweden)

    OLIVIA MANOLE

    2012-12-01

    Typically, the decentralization process is extremely complicated and involves many challenges, if we take into account local conflicts, the interests of the central government and the complexity of simultaneous decentralization on the administrative, political and economic planes. The financial crisis has added another dimension to the complexity of this phenomenon, unbalancing the economy and creating fiscal pressure at both the central and local levels. In this context, the question arises whether the management of the financial crisis can be better realised within a decentralized system or whether it may create pressure to return to a centralized form of government.

  20. Reasoning about repairability of workflows at design time

    NARCIS (Netherlands)

    Tagni, Gaston; Ten Teije, Annette; Van Harmelen, Frank

    2009-01-01

    This paper describes an approach for reasoning about the repairability of workflows at design time. We propose a heuristic-based analysis of a workflow that aims at evaluating its definition, considering different design aspects and characteristics that affect its repairability (called repairability

  1. Distributed Global Transaction Support for Workflow Management Applications

    NARCIS (Netherlands)

    Vonk, J.; Grefen, P.W.P.J.; Boertjes, E.M.; Apers, Peter M.G.

    Workflow management systems require advanced transaction support to cope with their inherently long-running processes. The recent trend to distribute workflow executions requires an even more advanced transaction support system that is able to handle distribution. This paper presents a model as well

  2. Agent-based Decentralization of Applications in Distributed Smart Grid Systems

    DEFF Research Database (Denmark)

    Kienesberger, Georg; Xypolytou, Evangelia; Marchgraber, Jurgen

    2015-01-01

    Smart grid technology promises to prepare today's power systems for the challenges of the future by extensive integration of information and communication technology (ICT). One key aspect is the control paradigm, which will have to be shifted from completely centralized control systems to more decentralized concepts in order to adapt to the distributed nature of smart grids. Multi-agent systems (MAS) are a very promising approach for designing distributed, decentralized systems, naturally also in the field of smart grids. This work introduces the notion of decentralized multi-agent-based control systems (DMACS) and aims to give an overview of the different requirements and challenges on the way from current centralized control systems to DMACS. Therefore, different ICT scenarios and MAS topologies are employed to discuss the decentralization of three exemplary smart grid applications, including voltage control.

  3. Decaf: Decoupled Dataflows for In Situ High-Performance Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Dreher, M.; Peterka, T.

    2017-07-31

    Decaf is a dataflow system for the parallel communication of coupled tasks in an HPC workflow. The dataflow can perform arbitrary data transformations ranging from simply forwarding data to complex data redistribution. Decaf does this by allowing the user to allocate resources and execute custom code in the dataflow. All communication through the dataflow is efficient parallel message passing over MPI. The runtime for calling tasks is entirely message-driven; Decaf executes a task when all messages for the task have been received. Such a message-driven runtime allows cyclic task dependencies in the workflow graph, for example, to enact computational steering based on the result of downstream tasks. Decaf includes a simple Python API for describing the workflow graph. This allows Decaf to stand alone as a complete workflow system, but Decaf can also be used as the dataflow layer by one or more other workflow systems to form a heterogeneous task-based computing environment. In one experiment, we couple a molecular dynamics code with a visualization tool using the FlowVR and Damaris workflow systems and Decaf for the dataflow. In another experiment, we test the coupling of a cosmology code with Voronoi tessellation and density estimation codes using MPI for the simulation, the DIY programming model for the two analysis codes, and Decaf for the dataflow. Such workflows consisting of heterogeneous software infrastructures exist because components are developed separately with different programming models and runtimes, and this is the first time that such heterogeneous coupling of diverse components was demonstrated in situ on HPC systems.
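
    The message-driven execution rule described here (a task fires once messages from all of its upstream links have arrived) can be sketched generically. The toy Python runtime below uses an invented four-task graph and is not Decaf's actual API; it also omits the cyclic dependencies that Decaf additionally supports.

        # Toy message-driven DAG runtime: a task runs only when it has
        # received one message per upstream link.
        from collections import deque

        graph = {"sim": ["tessellate", "density"], "tessellate": ["viz"],
                 "density": ["viz"], "viz": []}
        waiting = {t: 0 for t in graph}       # messages still expected
        for src, dsts in graph.items():
            for d in dsts:
                waiting[d] += 1

        ready = deque(t for t, n in waiting.items() if n == 0)
        while ready:
            task = ready.popleft()
            print("running", task)            # placeholder for real work
            for downstream in graph[task]:
                waiting[downstream] -= 1      # deliver one message
                if waiting[downstream] == 0:  # all inputs received
                    ready.append(downstream)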

  4. Decentralized forest governance in central Vietnam

    NARCIS (Netherlands)

    Tran Nam, T.; Burgers, P.P.M.

    2012-01-01

    A major challenge in decentralized forest governance in Vietnam is developing a mechanism that would support both reforestation and poverty reduction among people in rural communities. To help address this challenge, Forest Land Allocation (FLA) policies recognize local communities and individuals

  5. Building and documenting workflows with python-based snakemake

    NARCIS (Netherlands)

    J. Köster (Johannes); S. Rahmann (Sven)

    2012-01-01

    Snakemake is a novel workflow engine with a simple Python-derived workflow definition language and an optimizing execution environment. It is the first system that supports multiple named wildcards (or variables) in input and output filenames of each rule definition. It also allows to
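
    A minimal Snakefile fragment, written in Snakemake's Python-derived rule syntax, illustrates the multiple-named-wildcard feature highlighted above; the paths and the alignment command are invented for illustration.

        # Both the 'sample' and 'lane' wildcards appear in the input and
        # output filenames of one rule; Snakemake infers their values by
        # matching requested output files against the output pattern.
        rule align:
            input:
                "reads/{sample}_{lane}.fastq"
            output:
                "aligned/{sample}_{lane}.bam"
            shell:
                "bwa mem ref.fa {input} | samtools view -b - > {output}"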

  6. Support for Taverna workflows in the VPH-Share cloud platform.

    Science.gov (United States)

    Kasztelnik, Marek; Coto, Ernesto; Bubak, Marian; Malawski, Maciej; Nowakowski, Piotr; Arenas, Juan; Saglimbeni, Alfredo; Testi, Debora; Frangi, Alejandro F

    2017-07-01

    To address the increasing need for collaborative endeavours within the Virtual Physiological Human (VPH) community, the VPH-Share collaborative cloud platform allows researchers to expose and share sequences of complex biomedical processing tasks in the form of computational workflows. The Taverna Workflow System is a very popular tool for orchestrating complex biomedical & bioinformatics processing tasks in the VPH community. This paper describes the VPH-Share components that support the building and execution of Taverna workflows, and explains how they interact with other VPH-Share components to improve the capabilities of the VPH-Share platform. Taverna workflow support is delivered by the Atmosphere cloud management platform and the VPH-Share Taverna plugin. These components are explained in detail, along with the two main procedures that were developed to enable this seamless integration: workflow composition and execution. 1) Seamless integration of VPH-Share with other components and systems. 2) Extended range of different tools for workflows. 3) Successful integration of scientific workflows from other VPH projects. 4) Execution speed improvement for medical applications. The presented workflow integration provides VPH-Share users with a wide range of different possibilities to compose and execute workflows, such as desktop or online composition, online batch execution, multithreading, remote execution, etc. The specific advantages of each supported tool are presented, as are the roles of Atmosphere and the VPH-Share plugin within the VPH-Share project. The combination of the VPH-Share plugin and Atmosphere endows the VPH-Share infrastructure with far more flexible, powerful and usable capabilities for the VPH-Share community. As both components can continue to evolve and improve independently, we acknowledge that further improvements are still to be developed and will be described. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. OL-DEC-MDP Model for Multiagent Online Scheduling with a Time-Dependent Probability of Success

    Directory of Open Access Journals (Sweden)

    Cheng Zhu

    2014-01-01

    Focusing on the online multiagent scheduling problem, this paper considers the time-dependent probability of success and processing duration and proposes an OL-DEC-MDP (opportunity-loss decentralized Markov decision process) model that includes opportunity loss in scheduling decisions to improve overall performance. The success probability of job processing, as well as the processing duration, depends on the time at which the processing is started. The probability of completing the assigned job would be higher when the process is started earlier, but the opportunity loss could also be high due to the longer engagement. As a result, the OL-DEC-MDP model introduces a reward function considering the opportunity loss, which is estimated based on a prediction of upcoming jobs obtained by sampling the job arrival process. Heuristic strategies are introduced for computing the best starting time for an incoming job by each agent, and an incoming job is always scheduled to the agent with the highest reward among all agents under their best starting policies. The simulation experiments show that the OL-DEC-MDP model improves the overall scheduling performance compared with models not considering opportunity loss in heavy-load environments.
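
    The start-time trade-off the model captures (starting earlier raises the probability of success but engages the agent longer, increasing opportunity loss) can be made concrete with a toy Python calculation. The functional forms and constants below are invented and are not the paper's model.

        from math import exp

        def success_prob(t):        # high for early starts, drops off later
            return 1.0 / (1.0 + exp(t - 5.0))

        def engage_duration(t):     # earlier starts tie the agent up longer
            return 12.0 - t

        def expected_reward(t, value=100.0, loss_rate=2.0):
            # job value weighted by success probability, minus the
            # opportunity loss of being unavailable while engaged
            return success_prob(t) * value - loss_rate * engage_duration(t)

        best_t = max(range(10), key=expected_reward)
        print(best_t, round(expected_reward(best_t), 2))  # -> 1 76.2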

  8. A Kepler Workflow Tool for Reproducible AMBER GPU Molecular Dynamics.

    Science.gov (United States)

    Purawat, Shweta; Ieong, Pek U; Malmstrom, Robert D; Chan, Garrett J; Yeung, Alan K; Walker, Ross C; Altintas, Ilkay; Amaro, Rommie E

    2017-06-20

    With the drive toward high throughput molecular dynamics (MD) simulations involving ever-greater numbers of simulation replicates run for longer, biologically relevant timescales (microseconds), the need for improved computational methods that facilitate fully automated MD workflows gains more importance. Here we report the development of an automated workflow tool to perform AMBER GPU MD simulations. Our workflow tool capitalizes on the capabilities of the Kepler platform to deliver a flexible, intuitive, and user-friendly environment and the AMBER GPU code for a robust and high-performance simulation engine. Additionally, the workflow tool reduces user input time by automating repetitive processes and facilitates access to GPU clusters, whose high-performance processing power makes simulations of large numerical scale possible. The presented workflow tool facilitates the management and deployment of large sets of MD simulations on heterogeneous computing resources. The workflow tool also performs systematic analysis on the simulation outputs and enhances simulation reproducibility, execution scalability, and MD method development including benchmarking and validation. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  9. A Retrospective Analysis of the Development of Fiscal Decentralization

    Directory of Open Access Journals (Sweden)

    Rekova Nataliia Yu.

    2017-12-01

    The study forms the theoretical basis for the implementation of fiscal decentralization in Ukraine by determining the correspondence between the evolution of scientific approaches to the formation of an effective model of public administration and the degree of power centralization at a particular stage of the development of society. The views of thinkers of the ancient states of Egypt, Mesopotamia, India, China, Rome and Greece are generalized, and the priority given to centralized public administration, without distinguishing forms of centralization, is established. The degree of centralization in the period of development of feudal states is characterized. The scientific views of representatives of the neoinstitutional school of economic thought are analyzed in detail, and the stages of the formation of decentralization, fiscal decentralization in particular, as a separate theory are defined. The stages of, and the corresponding organizational and legislative documents for, the implementation of decentralization in Ukraine are outlined, and its results are characterized.

  10. Decentralization Challenges for Management of Cultural Patrimony in Ecuador

    Directory of Open Access Journals (Sweden)

    Dr.C. Carlos Leonel Escudero-Sánchez

    2015-10-01

    In Latin America, new decentralization policies in forms of government are challenging institutional processes and management practices. In the context of Ecuador, this responds to a constitutional mandate expressed in the reformulation of the powers of the autonomous municipal governments. Consequently, the main purpose of the article is to present the main principles expressed in the protection, evaluation and dissemination of cultural heritage. The exercise of autonomy and decentralization is governed by the principles of solidarity, subsidiarity, territorial equity, integration and participation. Hence, the main results are part of the systematization of the socio-economic, institutional, governance and participation, and legal and financial contexts of cultural heritage management. Keywords: decentralization, cultural heritage, cultural management, citizen participation.

  11. Logical provenance in data-oriented workflows?

    KAUST Repository

    Ikeda, R.

    2013-04-01

    We consider the problem of defining, generating, and tracing provenance in data-oriented workflows, in which input data sets are processed by a graph of transformations to produce output results. We first give a new general definition of provenance for general transformations, introducing the notions of correctness, precision, and minimality. We then determine when properties such as correctness and minimality carry over from the individual transformations' provenance to the workflow provenance. We describe a simple logical-provenance specification language consisting of attribute mappings and filters. We provide an algorithm for provenance tracing in workflows where logical provenance for each transformation is specified using our language. We consider logical provenance in the relational setting, observing that for a class of Select-Project-Join (SPJ) transformations, logical provenance specifications encode minimal provenance. We have built a prototype system supporting the features and algorithms presented in the paper, and we report a few preliminary experimental results. © 2013 IEEE.
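
    A toy version of tracing with an attribute mapping plus a filter is easy to write down; the schema, mapping, and data below are invented and only echo the flavor of the specification language described above.

        # Input relation: (order_id, customer, amount). The transformation
        # keeps rows with amount > 50 and projects to (order_id, customer).
        # Logical provenance spec: output.order_id maps to input.order_id,
        # guarded by the filter amount > 50.
        inputs = [(1, "alice", 40), (2, "bob", 75), (3, "alice", 90)]
        outputs = [(oid, cust) for oid, cust, amt in inputs if amt > 50]

        def trace(output_tuple):
            """Input tuples admitted by the attribute mapping and filter."""
            oid, _ = output_tuple
            return [t for t in inputs if t[0] == oid and t[2] > 50]

        print(trace((2, "bob")))  # -> [(2, 'bob', 75)]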

  12. Impact of CGNS on CFD Workflow

    Science.gov (United States)

    Poinot, M.; Rumsey, C. L.; Mani, M.

    2004-01-01

    CFD tools are an integral part of industrial and research processes, for which the amount of data is increasing at a high rate. These data are used in a multi-disciplinary fluid dynamics environment, including structural, thermal, chemical or even electrical topics. We show that the data specification is an important challenge that must be tackled to achieve an efficient workflow for use in this environment. We compare the process with other software techniques, such as network or database type, where past experiences showed how difficult it was to bridge the gap between completely general specifications and dedicated specific applications. We show two aspects of the use of CFD General Notation System (CGNS) that impact CFD workflow: as a data specification framework and as a data storage means. Then, we give examples of projects involving CFD workflows where the use of the CGNS standard leads to a useful method either for data specification, exchange, or storage.

  13. Patient Experiences of Decentralized HIV Treatment and Care in Plateau State, North Central Nigeria: A Qualitative Study

    Directory of Open Access Journals (Sweden)

    Grace O. Kolawole

    2017-01-01

    Background. Decentralization of care and treatment for HIV infection in Africa makes services available in local health facilities. Decentralization has been associated with improved retention and comparable or superior treatment outcomes, but patient experiences are not well understood. Methods. We conducted a qualitative study of patient experiences in decentralized HIV care in Plateau State, north central Nigeria. Five decentralized care sites in the Plateau State Decentralization Initiative were purposefully selected. Ninety-three patients and 16 providers at these sites participated in individual interviews and focus groups. Data collection activities were audio-recorded and transcribed. Transcripts were inductively content analyzed to derive descriptive categories representing patient experiences of decentralized care. Results. Patient participants in this study experienced the transition to decentralized care as a series of “trade-offs.” Advantages cited included saving time and money on travel to clinic visits, avoiding dangers on the road, and the “family-like atmosphere” found in some decentralized clinics. Disadvantages were loss of access to ancillary services, reduced opportunities for interaction with providers, and increased risk of disclosure. Participants preferred decentralized services overall. Conclusion. Difficulty and cost of travel remain a fundamental barrier to accessing HIV care outside urban centers, suggesting increased availability of community-based services will be enthusiastically received.

  14. Detecting dissonance in clinical and research workflow for translational psychiatric registries.

    Science.gov (United States)

    Cofiel, Luciana; Bassi, Débora U; Ray, Ryan Kumar; Pietrobon, Ricardo; Brentani, Helena

    2013-01-01

    The interplay between the workflow for clinical tasks and research data collection is often overlooked, ultimately making it ineffective. To the best of our knowledge, no previous studies have developed standards that allow for the comparison of workflow models derived from clinical and research tasks toward the improvement of data collection processes. In this study we used the term dissonance for occurrences where there was a discord between clinical and research workflows. We developed workflow models for a translational research study in psychiatry and the clinic where its data collection was carried out. After identifying points of dissonance between clinical and research models, we derived a corresponding classification system that ultimately enabled us to re-engineer the data collection workflow. We considered (1) the number of patients approached for enrollment and (2) the number of patients enrolled in the study as indicators of efficiency in research workflow. We also recorded the number of dissonances before and after the workflow modification. We identified 22 episodes of dissonance across 6 dissonance categories: actor, communication, information, artifact, time, and space. We were able to eliminate 18 episodes of dissonance and increase the number of patients approached and enrolled in the research study through workflow modification. The classification developed in this study is useful for guiding the identification of dissonances and reveals the modifications required to align the workflow of data collection with the clinical setting. The methodology described in this study can be used by researchers to standardize the data collection process.

  15. Text mining meets workflow: linking U-Compare with Taverna

    Science.gov (United States)

    Kano, Yoshinobu; Dobson, Paul; Nakanishi, Mio; Tsujii, Jun'ichi; Ananiadou, Sophia

    2010-01-01

    Summary: Text mining from the biomedical literature is of increasing importance, yet it is not easy for the bioinformatics community to create and run text mining workflows due to the lack of accessibility and interoperability of the text mining resources. The U-Compare system provides a wide range of bio text mining resources in a highly interoperable workflow environment where workflows can very easily be created, executed, evaluated and visualized without coding. We have linked U-Compare to Taverna, a generic workflow system, to expose text mining functionality to the bioinformatics community. Availability: http://u-compare.org/taverna.html, http://u-compare.org Contact: kano@is.s.u-tokyo.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20709690

  16. Distributed execution of aggregated multi domain workflows using an agent framework

    NARCIS (Netherlands)

    Zhao, Z.; Belloum, A.; de Laat, C.; Adriaans, P.; Hertzberger, B.; Zhang, L.J.; Watson, T.J.; Yang, J.; Hung, P.C.K.

    2007-01-01

    In e-Science, meaningful experiment processes and workflow engines emerge as important scientific resources. A complex experiment often involves services and processes developed in different scientific domains. Aggregating different workflows into one meta workflow avoids unnecessary rewriting of

  17. Analysis for corruption and decentralization (Case study: earlier decentralization era in Indonesia)

    OpenAIRE

    Haryanto, Joko Tri; Astuti S.A., Esther Sri

    2017-01-01

    In many countries, the relationship between the decentralization of government activities and the extent of rent extraction by private parties is an important element in the recent debate on institutional design. The topic of corruption was actively and openly debated in Indonesia: the government, its development partners, and a broadly based group of political and civil society leaders were engaged in meetings and exchange on a daily basis. In the ongoing debate on corruption a lot of attention is pai...

  18. Decentralized Energy from Waste Systems

    Directory of Open Access Journals (Sweden)

    Blanca Antizar-Ladislao

    2010-01-01

    In the last five years or so, biofuels have been given notable consideration worldwide as an alternative to fossil fuels, due to their potential to reduce greenhouse gas emissions by partial replacement of oil as a transport fuel. A sustainable approach to biofuel production should consider local production of biofuels, obtained from local feedstocks and adapted to the socio-economic and environmental characteristics of the particular region where they are developed. Thus, decentralized energy from waste systems will exploit local biomass to optimize its production and consumption. Waste streams such as agricultural and wood residues, municipal solid waste, vegetable oils, and algae residues can all be integrated in energy from waste systems. An integral optimization of decentralized energy from waste systems should be based not on the optimization of each single process, but on the overall optimization of the whole process, obtaining optimal energy and environmental benefits as well as beneficial co-products such as soil fertilizers, which will result in higher food crop production and in carbon dioxide fixation that will abate climate change.

  19. Decentralized energy from waste systems

    International Nuclear Information System (INIS)

    Antizar-Ladislao, B.; Turrion-Gomez, J. L.

    2010-01-01

    In the last five years or so, biofuels have been given notable consideration worldwide as an alternative to fossil fuels, due to their potential to reduce greenhouse gas emissions by partial replacement of oil as a transport fuel. A sustainable approach to biofuel production should consider local production of biofuels, obtained from local feedstocks and adapted to the socio-economic and environmental characteristics of the particular region where they are developed. Thus, decentralized energy from waste systems will exploit local biomass to optimize its production and consumption. Waste streams such as agricultural and wood residues, municipal solid waste, vegetable oils, and algae residues can all be integrated in energy from waste systems. An integral optimization of decentralized energy from waste systems should be based not on the optimization of each single process, but on the overall optimization of the whole process, obtaining optimal energy and environmental benefits as well as beneficial co-products such as soil fertilizers, which will result in higher food crop production and in carbon dioxide fixation that will abate climate change. (author)

  20. The P2P approach to interorganizational workflows

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Weske, M.H.; Dittrich, K.R.; Geppert, A.; Norrie, M.C.

    2001-01-01

    This paper describes in an informal way the Public-To-Private (P2P) approach to interorganizational workflows, which is based on a notion of inheritance. The approach consists of three steps: (1) create a common understanding of the interorganizational workflow by specifying a shared public

  1. Open source workflow : a viable direction for BPM?

    NARCIS (Netherlands)

    Wohed, P.; Russell, N.C.; Hofstede, ter A.H.M.; Andersson, B.; Aalst, van der W.M.P.; Bellahsène, Z.; Léonard, M.

    2008-01-01

    With the growing interest in open source software in general and business process management and workflow systems in particular, it is worthwhile investigating the state of open source workflow management. The plethora of these offerings (recent surveys such as [4,6], each contain more than 30 such

  2. Privacy-aware workflow management

    NARCIS (Netherlands)

    Alhaqbani, B.; Adams, M.; Fidge, C.J.; Hofstede, ter A.H.M.; Glykas, M.

    2013-01-01

    Information security policies play an important role in achieving information security. Confidentiality, Integrity, and Availability are classic information security goals attained by enforcing appropriate security policies. Workflow Management Systems (WfMSs) also benefit from inclusion of these

  3. Verification of Timed Healthcare Workflows Using Component Timed-Arc Petri Nets

    DEFF Research Database (Denmark)

    Bertolini, Cristiano; Liu, Zhiming; Srba, Jiri

    2013-01-01

    Workflows in modern healthcare systems are becoming increasingly complex and their execution involves concurrency and sharing of resources. The definition, analysis and management of collaborative healthcare workflows requires abstract model notations with a precisely defined semantics and a supp...

  4. Asynchronous decentralized method for interconnected electricity markets

    International Nuclear Information System (INIS)

    Huang, Anni; Joo, Sung-Kwan; Song, Kyung-Bin; Kim, Jin-Ho; Lee, Kisung

    2008-01-01

    This paper presents an asynchronous decentralized method to solve the optimization problem of interconnected electricity markets. The proposed method decomposes the optimization problem of combined electricity markets into individual optimization problems. The impact of neighboring markets' information is included in the objective function of the individual market optimization problem by the standard Lagrangian relaxation method. Most decentralized optimization methods use synchronous models of communication to exchange updated market information among markets during the iterative process. In this paper, however, the solutions of the individual optimization problems are coordinated through an asynchronous communication model until they converge to the global optimal solution of combined markets. Numerical examples are presented to demonstrate the advantages of the proposed asynchronous method over the existing synchronous methods. (author)
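
    The decomposition itself can be illustrated with a small Lagrangian-relaxation loop: each market responds to a shared price signal by solving its own subproblem, and the multiplier on the coupling constraint is adjusted until the combined balance holds. The quadratic costs, demand, and step size below are invented, and the loop is written synchronously for brevity, whereas the paper's contribution is precisely to coordinate such updates asynchronously.

        # Two markets with costs 0.5*a_i*g_i^2 share a balance constraint
        # g1 + g2 = D, relaxed with multiplier lam (a price signal).
        a = [2.0, 4.0]           # cost coefficients (invented)
        D = 12.0                 # combined demand
        lam, step = 0.0, 0.5     # multiplier and subgradient step size

        for _ in range(200):
            g = [lam / ai for ai in a]   # each market's best response:
                                         # argmin 0.5*ai*g^2 - lam*g
            lam += step * (D - sum(g))   # raise the price if short,
                                         # lower it if over-supplied

        print([round(gi, 2) for gi in g], round(lam, 2))  # [8.0, 4.0] 16.0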

  5. Staffing, qualification and organization for centralized and decentralized training

    International Nuclear Information System (INIS)

    Holyoak, R.H.

    1985-01-01

    This paper covers an extensive area. First a brief history of the training at Commonwealth Edison is presented so that the reader can get some idea of why some of the problems mentioned exist. Next is a discussion of the centralized and decentralized Commonwealth Edison production training organization. A brief review of the development of the Instructor Qualification Program and the training of instructors follows. Finally, a review of the problems and some solutions related to managing a centralized/decentralized training system is included

  6. A set of decentralized PID controllers for an n–link robot manipulator

    Indian Academy of Sciences (India)

    A class of stabilizing decentralized proportional integral derivative (PID) controllers for an n-link robot manipulator system is proposed. The range of decentralized PID controller parameters for an n-link robot manipulator is obtained using the Kharitonov theorem and stability boundary equations. Basically, the proposed design ...
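
    Decentralized here means one independent PID loop per joint, each acting only on its own tracking error. The sketch below runs such loops against a toy first-order joint model; the gains, targets, and plant are invented and say nothing about the Kharitonov-based gain ranges the paper derives.

        class PID:
            def __init__(self, kp, ki, kd):
                self.kp, self.ki, self.kd = kp, ki, kd
                self.integral, self.prev_err = 0.0, 0.0

            def step(self, err, dt):
                self.integral += err * dt
                deriv = (err - self.prev_err) / dt
                self.prev_err = err
                return self.kp * err + self.ki * self.integral + self.kd * deriv

        dt, n = 0.01, 3
        controllers = [PID(20.0, 2.0, 0.05) for _ in range(n)]  # one per link
        theta = [0.0] * n                 # joint angles
        target = [1.0, 0.5, -0.5]         # desired angles

        for _ in range(1000):
            for i in range(n):
                u = controllers[i].step(target[i] - theta[i], dt)
                theta[i] += u * dt        # toy first-order joint dynamics

        print([round(x, 3) for x in theta])  # approaches the targets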

  7. Decentralized Reinforcement Learning of robot behaviors

    NARCIS (Netherlands)

    Leottau, David L.; Ruiz-del-Solar, Javier; Babuska, R.

    2018-01-01

    A multi-agent methodology is proposed for Decentralized Reinforcement Learning (DRL) of individual behaviors in problems where multi-dimensional action spaces are involved. When using this methodology, sub-tasks are learned in parallel by individual agents working toward a common goal. In

  8. Satellite Power System (SPS) centralization/decentralization

    Science.gov (United States)

    Naisbitt, J.

    1978-01-01

    The decentralization of government in the United States of America is described and its effect on the solution of energy problems is given. The human response to the introduction of new technologies is considered as well as the behavioral aspects of multiple options.

  9. Decentralized substations for low-temperature district heating with no Legionella risk, and low return temperatures

    DEFF Research Database (Denmark)

    Yang, Xiaochen; Li, Hongwei; Svendsen, Svend

    2016-01-01

    ... with domestic hot water (DHW) circulation. In this study, a system with decentralized substations was analysed as a solution to this problem. Furthermore, a modification of the decentralized substation system was proposed in order to reduce the average return temperature. Models of a conventional system with medium-temperature district heating, a decentralized substation system with LTDH, and an innovative decentralized substation system with LTDH were built based on the information of a case building. The annual distribution heat loss and the operating costs of the three scenarios were calculated and compared... From the results, realizing LTDH with the decentralized substation unit saves 30% of the annual distribution heat loss inside the building compared to a conventional system with medium-temperature district heating. Replacing the bypass pipe with an in-line supply pipe and a heat pump...

  10. Fiscal Decentralization and Delivery of Public Services: Evidence from Education Sector in Pakistan

    Directory of Open Access Journals (Sweden)

    Rauf Abdur

    2017-04-01

    Fiscal decentralization is the devolution of fiscal assignments to lower governments for high growth and better delivery of public services. The current study, covering the period from 1972 to 2009, is an attempt to find out the impacts of fiscal decentralization on public service delivery in Pakistan. Public services are proxied by gross enrollment at primary school level, while fiscal decentralization is captured by the fiscal transfer and expenditure sides of devolution. Using time series data, it is found that the individual impact of fiscal transfers, although insignificant, still supports the theoretical proposition regarding the relationship between fiscal decentralization and public services, while delegation of expenditure responsibilities helps in improving gross enrollment at primary school level. Furthermore, the study provides evidence that complete delegation of fiscal responsibilities to lower governments enhances the enrollment ratio in Pakistan.

  11. Spectrum Allocation for Decentralized Transmission Strategies: Properties of Nash Equilibria

    Directory of Open Access Journals (Sweden)

    Peter von Wrycza

    2009-01-01

    The interaction of two transmit-receive pairs coexisting in the same area and communicating using the same portion of the spectrum is analyzed from a game theoretic perspective. Each pair utilizes a decentralized iterative water-filling scheme to greedily maximize the individual rate. We study the dynamics of such a game and find properties of the resulting Nash equilibria. The region of achievable operating points is characterized for both low- and high-interference systems, and the dependence on the various system parameters is explicitly shown. We derive the region of possible signal space partitioning for the iterative water-filling scheme and show how the individual utility functions can be modified to alter its range. Utilizing global system knowledge, we design a modified game encouraging better operating points in terms of sum rate compared to those obtained using the iterative water-filling algorithm and show how such a game can be imitated in a decentralized noncooperative setting. Although we restrict the analysis to a two player game, analogous concepts can be used to design decentralized algorithms for scenarios with more players. The performance of the modified decentralized game is evaluated and compared to the iterative water-filling algorithm by numerical simulations.
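
    For concreteness, a minimal NumPy sketch of the two-player iterative water-filling dynamics analyzed above follows; the channel model, noise level, power budget, and iteration counts are invented for illustration.

        import numpy as np

        def waterfill(noise_over_gain, budget):
            # Classic water-filling: p_n = max(0, mu - noise_over_gain_n), with the
            # water level mu found by bisection so the power budget is met.
            lo, hi = 0.0, noise_over_gain.max() + budget
            for _ in range(60):
                mu = 0.5 * (lo + hi)
                if np.maximum(0.0, mu - noise_over_gain).sum() > budget:
                    hi = mu
                else:
                    lo = mu
            return np.maximum(0.0, lo - noise_over_gain)

        rng = np.random.default_rng(0)
        N, noise, budget = 8, 0.1, 1.0
        H = rng.uniform(0.1, 1.0, size=(2, 2, N))  # H[t, r, n]: gain tx t -> rx r, subchannel n
        p = np.zeros((2, N))
        for _ in range(50):                        # greedy best responses toward equilibrium
            for i in range(2):
                j = 1 - i
                interference = noise + H[j, i] * p[j]   # other user's power treated as noise
                p[i] = waterfill(interference / H[i, i], budget)
        rates = [np.log2(1.0 + H[i, i] * p[i] / (noise + H[1 - i, i] * p[1 - i])).sum()
                 for i in range(2)]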

  12. The recent process of decentralization and democratic management of education in Brazil

    Science.gov (United States)

    Santos Filho, José Camilo Dos

    1993-09-01

    Brazilian society is beginning a new historical period in which the principle of decentralization is beginning to predominate over centralization, which held sway during the last 25 years. In contrast to recent Brazilian history, there is now a search for political, democratic and participatory decentralization more consonant with grass-roots aspirations. The first section of this article presents a brief analysis of some decentralization policies implemented by the military regime of 1964, and discusses relevant facts related to the resistance of civil society to state authoritarianism, and to the struggle for the democratization and organization of civil society up to the end of the 1970s. The second section analyzes some new experiences of democratic public school administration initiated in the 1970s and 1980s. The final section discusses the move toward decentralization and democratization of public school administration in the new Federal and State Constitutions, and in the draft of the new Law of National Education.

  13. Scientific Workflows and the Sensor Web for Virtual Environmental Observatories

    Science.gov (United States)

    Simonis, I.; Vahed, A.

    2008-12-01

    interfaces. All data sets and sensor communication follow well-defined abstract models and corresponding encodings, mostly developed by the OGC Sensor Web Enablement initiative. Scientific progress is currently accelerated by an emerging new concept called scientific workflows, which organize and manage complex distributed computations. A scientific workflow represents and records the highly complex processes that a domain scientist typically would follow in exploration, discovery and ultimately, transformation of raw data to publishable results. The challenge is now to integrate the benefits of scientific workflows with those provided by the Sensor Web in order to leverage all resources for scientific exploration, problem solving, and knowledge generation. Scientific workflows for the Sensor Web represent the next evolutionary step towards efficient, powerful, and flexible earth observation frameworks and platforms. Those platforms support the entire process from capturing data, sharing and integrating, to requesting additional observations. Multiple sites and organizations will participate on single platforms, and scientists from different countries and organizations will interact and contribute to large-scale research projects. Simultaneously, the data and information overload becomes manageable, as multiple layers of abstraction will free scientists from dealing with underlying data, processing, or storage peculiarities. The vision is automated investigation and discovery mechanisms that allow scientists to pose queries to the system, which in turn identifies potentially related resources, schedules processing tasks, and assembles all parts in workflows that may satisfy the query.

  14. Quantitative workflow based on NN for weighting criteria in landfill suitability mapping

    Science.gov (United States)

    Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Alkhasawneh, Mutasem Sh.; Aziz, Hamidi Abdul

    2017-10-01

    Our study aims to introduce a new quantitative workflow that integrates neural networks (NNs) and multi-criteria decision analysis (MCDA). Existing MCDA workflows reveal a number of drawbacks because of the reliance on human knowledge in the weighting stage. Thus, a new workflow is presented to form suitability maps at the regional scale for solid waste planning based on NNs. A feed-forward neural network is employed in the workflow. A total of 34 criteria were pre-processed to establish the input dataset for NN modelling. The final learned network is used to acquire the weights of the criteria. Accuracies of 95.2% and 93.2% were achieved for the training dataset and testing dataset, respectively. The workflow was found to be capable of reducing human interference to generate highly reliable maps. The proposed workflow reveals the applicability of NNs in generating landfill suitability maps and the feasibility of integrating them with existing MCDA workflows.
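
    The abstract leaves open how weights are read off the learned network; one common post-hoc choice is Garson's connection-weight method, sketched below under that assumption for a single-hidden-layer feed-forward network (the array shapes and random weights are placeholders).

        import numpy as np

        def garson_criterion_weights(W_in, w_out):
            # W_in:  (n_criteria, n_hidden) input-to-hidden weights
            # w_out: (n_hidden,)            hidden-to-output weights
            contrib = np.abs(W_in) * np.abs(w_out)         # per-criterion, per-hidden contribution
            contrib /= contrib.sum(axis=0, keepdims=True)  # each hidden node's shares sum to 1
            importance = contrib.sum(axis=1)
            return importance / importance.sum()           # normalized weights, summing to 1

        rng = np.random.default_rng(1)
        weights = garson_criterion_weights(rng.normal(size=(34, 10)), rng.normal(size=10))
        print(weights.round(3))   # one weight per criterion, usable in the MCDA overlay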

  15. Decentralized Networked Control of Building Structures

    Czech Academy of Sciences Publication Activity Database

    Bakule, Lubomír; Rehák, Branislav; Papík, Martin

    2016-01-01

    Vol. 31, No. 11 (2016), pp. 871-886, ISSN 1093-9687. R&D Projects: GA ČR GA13-02149S. Institutional support: RVO:67985556. Keywords: decentralized control * networked control * building structures. Subject RIV: BC - Control Systems Theory. Impact factor: 5.786, year: 2016

  16. Decentralized Development Planning and Fragmentation of ...

    African Journals Online (AJOL)

    Using the Greater Accra Metropolitan Area (GAMA) as a case study, this paper argues that the proliferation of autonomous local government areas, within the context of urban sprawl and other challenges, has inhibited metropolitan-wide development planning. Keywords: Decentralization; local government; urban growth; ...

  17. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of real-world business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations... of events, reward-based properties and best- and worst-case scenarios. We develop a simple example of a medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...
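
    Since the abstract ties reward-based properties to provisioning drug stocks, a toy Python calculation of expected reward over a probabilistic branching workflow may help fix ideas; the workflow, probabilities, and per-step dose rewards are invented, and the paper's own analysis uses model-checking machinery rather than this direct recursion.

        # Each entry maps a state to a list of (probability, next_state, doses) branches.
        workflow = {
            "triage":    [(0.7, "standard", 1), (0.3, "intensive", 3)],
            "standard":  [(1.0, "done", 0)],
            "intensive": [(0.8, "done", 2), (0.2, "standard", 1)],
            "done":      [],
        }

        def expected_doses(state):
            # Expected total reward (drug doses) accumulated from `state` onward.
            return sum(p * (doses + expected_doses(nxt))
                       for p, nxt, doses in workflow[state])

        print(expected_doses("triage"))   # expected drug consumption per patient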

  18. Climate Data Analytics Workflow Management

    Science.gov (United States)

    Zhang, J.; Lee, S.; Pan, L.; Mattmann, C. A.; Lee, T. J.

    2016-12-01

    In this project we aim to pave a novel path to create a sustainable building block toward Earth science big data analytics and knowledge sharing. Closely studying how Earth scientists conduct data analytics research in their daily work, we have developed a provenance model to record their activities, and a technology to automatically generate workflows for scientists from the provenance. On top of this, we have built the prototype of a data-centric provenance repository and established a PDSW (People, Data, Service, Workflow) knowledge network to support workflow recommendation. To ensure the scalability and performance of the expected recommendation system, we have leveraged the Apache OODT system technology. The community-approved, metrics-based performance evaluation web service will allow a user to select a metric from the list of several community-approved metrics and to evaluate model performance using the metric as well as the reference dataset. This service will facilitate the use of reference datasets that are generated in support of model-data intercomparison projects such as Obs4MIPs and Ana4MIPs. The data-centric repository infrastructure will allow us to capture richer provenance to further facilitate knowledge sharing and scientific collaboration in the Earth science community. This project is part of the Apache incubator CMDA project.

  19. A set of decentralized PID controllers for an n – link robot manipulator

    Indian Academy of Sciences (India)

    The solution of the decentralized tracking control problem for a robot manipulator is slightly complex since we .... Figure 1 shows the decentralized control scheme for the ith joint of system (10). ...

  20. Contract-Based Transaction Management in Cross-Organizational Workflow Management

    NARCIS (Netherlands)

    Grefen, P.W.P.J.

    Cross-organizational workflow management is an essential ingredient for process integration in virtual enterprises. To obtain cross-organizational workflow processes with robust semantics, these processes should be supported by high-level cross-organizational transaction management. In this context,

  1. Optimal placement and decentralized robust vibration control for spacecraft smart solar panel structures

    International Nuclear Information System (INIS)

    Jiang, Jian-ping; Li, Dong-xu

    2010-01-01

    The decentralized robust vibration control with collocated piezoelectric actuator and strain sensor pairs is considered in this paper for spacecraft solar panel structures. Each actuator is driven individually by the output of the corresponding sensor so that only local feedback control is implemented, with each actuator, sensor and controller operating independently. Firstly, an optimal placement method for the location of the collocated piezoelectric actuator and strain gauge sensor pairs is developed based on the degree of observability and controllability indices for solar panel structures. Secondly, a decentralized robust H∞ controller is designed to suppress the vibration induced by external disturbance. Finally, a numerical comparison between centralized and decentralized control systems is performed in order to investigate their effectiveness in suppressing vibration of the smart solar panel. The simulation results show that the vibration can be significantly suppressed with permitted actuator voltages by the controllers. The decentralized control system has almost the same disturbance attenuation level as the centralized control system, at slightly higher control voltages. More importantly, the decentralized controller, composed of four three-order systems, is more practical to implement than a high-order centralized controller.

  2. Web-video-mining-supported workflow modeling for laparoscopic surgeries.

    Science.gov (United States)

    Liu, Rui; Zhang, Xiaoli; Zhang, Hao

    2016-11-01

    As quality assurance is of strong concern in advanced surgeries, intelligent surgical systems are expected to have knowledge such as knowledge of the surgical workflow model (SWM) to support intuitive cooperation with surgeons. Generating a robust and reliable SWM requires a large amount of training data. However, training data collected by physically recording surgical operations is often limited, and data collection is time-consuming and labor-intensive, severely limiting the knowledge scalability of surgical systems. The objective of this research is to solve the knowledge scalability problem in surgical workflow modeling in a low-cost and labor-efficient way. A novel web-video-mining-supported surgical workflow modeling (webSWM) method is developed. A novel video quality analysis method based on topic analysis and sentiment analysis techniques is developed to select high-quality videos from abundant and noisy web videos. A statistical learning method is then used to build the workflow model based on the selected videos. To test the effectiveness of the webSWM method, 250 web videos were mined to generate a surgical workflow for robotic cholecystectomy. The generated workflow was evaluated against 4 web-retrieved videos and 4 operating-room-recorded videos. The evaluation results (video selection consistency n-index ≥0.60; surgical workflow matching degree ≥0.84) proved the effectiveness of the webSWM method in generating robust and reliable SWM knowledge by mining web videos. With the webSWM method, abundant web videos were selected and a reliable SWM was modeled in a short time with low labor cost. Satisfactory performance in mining web videos and learning surgery-related knowledge shows that the webSWM method is promising for scaling knowledge for intelligent surgical systems.

  3. COOPERATION AND TRUST IN THE CONTEXT OF DECENTRALIZATION REFORMS IN RURAL TANZANIA

    Directory of Open Access Journals (Sweden)

    Zacharia S. Masanyiwa

    2014-01-01

    This paper investigates the impact of decentralization reforms on cooperation and trust at the village level in Tanzania, using a gender perspective. The paper draws on survey and qualitative data from ten villages in two rural districts. The findings show that the reforms have revitalized 'formal' cooperative efforts and social networks and groups aimed at improving public services and poverty reduction. Citizens' participation in decision-making processes and users' satisfaction with public services are significantly related to social and political trust, in which gender plays a role as well. There is a two-way interface between trust and decentralization reforms. 'Good' decentralization outcomes generate trust while 'bad' outcomes decrease trust.

  4. Centralized vs decentralized contests

    OpenAIRE

    Beviá, Carmen; Corchón, Luis C.

    2015-01-01

    We compare two contests: a decentralized one, in which there are several independent contests with non-overlapping contestants, and a centralized one, in which all contestants fight for a unique prize that is the sum of all the prizes in the small contests. We study the relationship between payoffs and efforts across these two contests.

  5. The impact of electronic medical record systems on outpatient workflows: a longitudinal evaluation of its workflow effects.

    Science.gov (United States)

    Vishwanath, Arun; Singh, Sandeep Rajan; Winkelstein, Peter

    2010-11-01

    The promise of the electronic medical record (EMR) lies in its ability to reduce the costs of health care delivery and improve the overall quality of care--a promise that is realized through major changes in workflows within the health care organization. Yet little systematic information exists about the workflow effects of EMRs. Moreover, some of the research to date points to reduced satisfaction among physicians after implementation of the EMR and increased time, i.e., negative workflow effects. A better understanding of the impact of the EMR on workflows is, hence, vital to understanding what the technology really does offer that is new and unique. (i) To empirically develop a physician-centric conceptual model of the workflow effects of EMRs; (ii) to use the model to understand the antecedents to physicians' workflow expectations of the new EMR; (iii) to track physicians' satisfaction over time, 3 months and 20 months after implementation of the EMR; (iv) to explore the impact of technology learning curves on physicians' reported satisfaction levels. The current research uses the mixed-method technique of concept mapping to empirically develop the conceptual model of an EMR's workflow effects. The model is then used within a controlled study to track physician expectations of a new EMR system as well as their assessments of the EMR's performance 3 months and 20 months after implementation. The research tracks the actual implementation of a new EMR within the outpatient clinics of a large northeastern research hospital. The pre-implementation survey netted 20 physician responses; the post-implementation Time 1 survey netted 22 responses, and the Time 2 survey netted 26 physician responses. The implementation of the actual EMR served as the intervention. Since the study was conducted within the same setting and tracked a homogeneous group of respondents, the overall study design ensured against extraneous influences on the results. Outcome measures were derived

  6. A practical workflow for making anatomical atlases for biological research.

    Science.gov (United States)

    Wan, Yong; Lewis, A Kelsey; Colasanto, Mary; van Langeveld, Mark; Kardon, Gabrielle; Hansen, Charles

    2012-01-01

    The anatomical atlas has been at the intersection of science and art for centuries. These atlases are essential to biological research, but high-quality atlases are often scarce. Recent advances in imaging technology have made high-quality 3D atlases possible. However, until now there has been a lack of practical workflows using standard tools to generate atlases from images of biological samples. With certain adaptations, CG artists' workflow and tools, traditionally used in the film industry, are practical for building high-quality biological atlases. Researchers have developed a workflow for generating a 3D anatomical atlas using accessible artists' tools. They used this workflow to build a mouse limb atlas for studying the musculoskeletal system's development. This research aims to raise the awareness of using artists' tools in scientific research and promote interdisciplinary collaborations between artists and scientists. This video (http://youtu.be/g61C-nia9ms) demonstrates a workflow for creating an anatomical atlas.

  7. Modeling, Design, and Implementation of a Cloud Workflow Engine Based on Aneka

    OpenAIRE

    Zhou, Jiantao; Sun, Chaoxin; Fu, Weina; Liu, Jing; Jia, Lei; Tan, Hongyan

    2014-01-01

    This paper presents a Petri net-based model for cloud workflow, which plays a key role in industry. Three kinds of parallelism in cloud workflow are characterized and modeled. Based on the analysis of the modeling, a cloud workflow engine is designed and implemented in the Aneka cloud environment. The experimental results validate the effectiveness of our approach to the modeling, design, and implementation of cloud workflow.

  8. DQM: Decentralized Quadratically Approximated Alternating Direction Method of Multipliers

    Science.gov (United States)

    Mokhtari, Aryan; Shi, Wei; Ling, Qing; Ribeiro, Alejandro

    2016-10-01

    This paper considers decentralized consensus optimization problems where nodes of a network have access to different summands of a global objective function. Nodes cooperate to minimize the global objective by exchanging information with neighbors only. A decentralized version of the alternating directions method of multipliers (DADMM) is a common method for solving this category of problems. DADMM exhibits linear convergence rate to the optimal objective but its implementation requires solving a convex optimization problem at each iteration. This can be computationally costly and may result in large overall convergence times. The decentralized quadratically approximated ADMM algorithm (DQM), which minimizes a quadratic approximation of the objective function that DADMM minimizes at each iteration, is proposed here. The consequent reduction in computational time is shown to have minimal effect on convergence properties. Convergence still proceeds at a linear rate with a guaranteed constant that is asymptotically equivalent to the DADMM linear convergence rate constant. Numerical results demonstrate advantages of DQM relative to DADMM and other alternatives in a logistic regression problem.
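
    The key step, replacing DADMM's per-iteration convex subproblem with a quadratic model so that the primal update reduces to a linear solve, can be written schematically as follows; this is a generic rendering of the idea, not the paper's exact update, with $f_i$ node $i$'s local summand and $H_i^k$ a curvature matrix such as $\nabla^2 f_i(x_i^k)$:

        \[
        x_i^{k+1} = \arg\min_{x_i} \Big\{ f_i(x_i^k) + \nabla f_i(x_i^k)^{\top}(x_i - x_i^k)
          + \tfrac{1}{2}(x_i - x_i^k)^{\top} H_i^k (x_i - x_i^k) + \phi_i^k(x_i) \Big\},
        \]

    where $\phi_i^k$ collects the (quadratic) ADMM penalty and multiplier terms. Since every term is at most quadratic, $x_i^{k+1}$ is obtained from a single linear system rather than an inner optimization loop, which is the source of the reduced per-iteration cost described above.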

  9. Decentralization, healthcare access, and inequality in Mpumalanga, South Africa.

    Science.gov (United States)

    Winchester, Margaret S; King, Brian

    2018-04-27

    Healthcare access and utilization remain key challenges in the Global South. South Africa represents this given that more than twenty years after the advent of democratic elections, the national government continues to confront historical systems of spatial manipulation that generated inequities in healthcare access. While the country has made significant advancements, governmental agencies have mirrored international strategies of healthcare decentralization and focused on local provision of primary care to increase healthcare access. In this paper, we show the significance of place in shaping access and health experiences for rural populations. Using data from a structured household survey, focus group discussions, qualitative interviews, and clinic data conducted in northeast South Africa from 2013 to 2016, we argue that decentralization fails to resolve the uneven landscapes of healthcare in the contemporary period. This is evidenced by the continued variability across the study area in terms of government-sponsored healthcare, and constraints in the clinics in terms of staffing, privacy, and patient loads, all of which challenge the access-related assumptions of healthcare decentralization.

  10. Design of a decentralized detection of interacting LTI systems

    Directory of Open Access Journals (Sweden)

    Shankar Shamanth

    2002-01-01

    In this paper, the problem of designing a decentralized detection filter for a large homogeneous collection of LTI systems is considered. The collection of systems considered here draws inspiration from platoons of vehicles, and the considered interactions amongst systems in the collection are banded and lower triangular, mimicking the typical “look-ahead” nature of interactions in a platoon of vehicles. A fault in a system propagates to other systems in the collection via such interactions. The decentralized detection filter for the collection is composed of interacting detection filters, one for each system. The feasibility of communicating the state estimates to other systems in the collection is assumed here. An important concern is the propagation of state estimation errors. In order that the state estimation errors not amplify as they propagate, an ℋ∞ constraint on the state estimation error propagation dynamics is imposed. A sufficient condition for constructing a decentralized detection filter for the collection is presented. An example is provided to illustrate the design procedure.

  11. Subsecond Tsunamis and Delays in Decentralized Electronic Systems

    Directory of Open Access Journals (Sweden)

    Pedro D. Manrique

    2017-10-01

    Driven by technological advances and economic gain, society's electronic systems are becoming larger, faster, more decentralized and autonomous, and yet with increasing global reach. A prime example is the network of financial markets, which—in contrast to popular perception—is largely all-electronic and decentralized with no top-down real-time controller. This prototypical system generates complex subsecond dynamics that emerge from a decentralized network comprising heterogeneous hardware and software components, communications links, and a diverse ecology of trading algorithms that operate and compete within this all-electronic environment. Indeed, these same technological and economic drivers are likely to generate a similarly competitive all-electronic ecology in a variety of future cyberphysical domains such as e-commerce, defense and the transportation system, including the likely appearance of large numbers of autonomous vehicles on the streets of many cities. Hence there is an urgent need to deepen our understanding of stability, safety and security across a wide range of ultrafast, large, decentralized all-electronic systems—in short, society will eventually need to understand what extreme behaviors can occur, why, and what might be the impact of both intentional and unintentional system perturbations. Here we set out a framework for addressing this issue, using a generic model of heterogeneous, adaptive, autonomous components where each has a realistic limit on the amount of information and processing power available to it. We focus on the specific impact of delayed information, possibly through an accidental shift in the latency of information transmission, or an intentional attack from the outside. While much remains to be done in terms of developing formal mathematical results for this system, our preliminary results indicate the type of impact that can occur and the structure of a mathematical theory which may

  12. A Workflow to Improve the Alignment of Prostate Imaging with Whole-mount Histopathology.

    Science.gov (United States)

    Yamamoto, Hidekazu; Nir, Dror; Vyas, Lona; Chang, Richard T; Popert, Rick; Cahill, Declan; Challacombe, Ben; Dasgupta, Prokar; Chandra, Ashish

    2014-08-01

    Evaluation of prostate imaging tests against whole-mount histology specimens requires accurate alignment between radiologic and histologic data sets. Misalignment results in false-positive and -negative zones as assessed by imaging. We describe a workflow for three-dimensional alignment of prostate imaging data against whole-mount prostatectomy reference specimens and assess its performance against a standard workflow. Ethical approval was granted. Patients underwent motorized transrectal ultrasound (Prostate Histoscanning) to generate a three-dimensional image of the prostate before radical prostatectomy. The test workflow incorporated steps for axial alignment between imaging and histology, size adjustments following formalin fixation, and use of custom-made parallel cutters and digital caliper instruments. The control workflow comprised freehand cutting and assumed homogeneous block thicknesses at the same relative angles between pathology and imaging sections. Thirty radical prostatectomy specimens were histologically and radiologically processed, either by an alignment-optimized workflow (n = 20) or a control workflow (n = 10). The optimized workflow generated tissue blocks of heterogeneous thicknesses but with no significant drifting in the cutting plane. The control workflow resulted in significantly nonparallel blocks, accurately matching only one out of four histology blocks to their respective imaging data. The image-to-histology alignment accuracy was 20% greater in the optimized workflow (P < ...); ... alignment was observed in the optimized workflow. Evaluation of prostate imaging biomarkers using whole-mount histology references should include a test-to-reference spatial alignment workflow.

  13. Text mining for the biocuration workflow.

    Science.gov (United States)

    Hirschman, Lynette; Burns, Gully A P C; Krallinger, Martin; Arighi, Cecilia; Cohen, K Bretonnel; Valencia, Alfonso; Wu, Cathy H; Chatr-Aryamontri, Andrew; Dowell, Karen G; Huala, Eva; Lourenço, Anália; Nash, Robert; Veuthey, Anne-Lise; Wiegers, Thomas; Winter, Andrew G

    2012-01-01

    Molecular biology has become heavily dependent on biological knowledge encoded in expert curated biological databases. As the volume of biological literature increases, biocurators need help in keeping up with the literature; (semi-) automated aids for biocuration would seem to be an ideal application for natural language processing and text mining. However, to date, there have been few documented successes for improving biocuration throughput using text mining. Our initial investigations took place for the workshop on 'Text Mining for the BioCuration Workflow' at the third International Biocuration Conference (Berlin, 2009). We interviewed biocurators to obtain workflows from eight biological databases. This initial study revealed high-level commonalities, including (i) selection of documents for curation; (ii) indexing of documents with biologically relevant entities (e.g. genes); and (iii) detailed curation of specific relations (e.g. interactions); however, the detailed workflows also showed many variabilities. Following the workshop, we conducted a survey of biocurators. The survey identified biocurator priorities, including the handling of full text indexed with biological entities and support for the identification and prioritization of documents for curation. It also indicated that two-thirds of the biocuration teams had experimented with text mining and almost half were using text mining at that time. Analysis of our interviews and survey provide a set of requirements for the integration of text mining into the biocuration workflow. These can guide the identification of common needs across curated databases and encourage joint experimentation involving biocurators, text mining developers and the larger biomedical research community.

  14. Workflow management: an overview

    NARCIS (Netherlands)

    Ouyang, C.; Adams, M.; Wynn, M.T.; Hofstede, ter A.H.M.; Brocke, vom J.; Rosemann, M.

    2010-01-01

    Workflow management has its origin in the office automation systems of the seventies, but it is not until fairly recently that conceptual and technological breakthroughs have led to its widespread adoption. In fact, nowadays, process awareness has become an accepted and integral part of various types

  15. COINSTAC: Decentralizing the future of brain imaging analysis [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Jing Ming

    2017-08-01

    In the era of Big Data, sharing neuroimaging data across multiple sites has become increasingly important. However, researchers who want to engage in centralized, large-scale data sharing and analysis must often contend with problems such as high database cost, long data transfer time, extensive manual effort, and privacy issues for sensitive data. To remove these barriers and enable easier data sharing and analysis, we introduced a new, decentralized, privacy-enabled infrastructure model for brain imaging data called COINSTAC in 2016. We have continued development of COINSTAC since this model was first introduced. One of the challenges with such a model is adapting the required algorithms to function within a decentralized framework. In this paper, we report on how we are solving this problem, along with our progress on several fronts, including implementation of additional decentralized algorithms, user interface enhancement, decentralized regression statistic calculation, and complete pipeline specifications.

  16. A history-tracing XML-based provenance framework for workflows

    NARCIS (Netherlands)

    Gerhards, M; Belloum, A.; Berretz, F.; Sander, V.; Skorupa, S.

    2010-01-01

    The importance of validating and reproducing the outcome of computational processes is fundamental to many application domains. Assuring the provenance of workflows will likely become even more important with respect to the incorporation of human tasks into standard workflows by emerging standards

  17. Conceptual framework and architecture for service mediating workflow management

    NARCIS (Netherlands)

    Hu, Jinmin; Grefen, P.W.P.J.

    2003-01-01

    This paper proposes a three-layer workflow concept framework to realize workflow enactment flexibility by dynamically binding activities to their implementations at run time. A service mediating layer is added to bridge business process definition and its implementation. Based on this framework, we

  18. Factors influencing flap and INTACS decentration after femtosecond laser application in normal and keratoconic eyes.

    Science.gov (United States)

    Ertan, Aylin; Karacal, Humeyra

    2008-10-01

    To compare the accuracy of LASIK flap and INTACS centration following femtosecond laser application in normal and keratoconic eyes. This is a retrospective case series comprising 133 eyes of 128 patients referred for refractive surgery. All eyes were divided into two groups according to preoperative diagnosis: group 1 (LASIK group) comprised 74 normal eyes of 72 patients undergoing LASIK with a femtosecond laser (IntraLase), and group 2 (INTACS group) consisted of 59 eyes of 39 patients with keratoconus for whom INTACS were implanted using a femtosecond laser (IntraLase). Decentration of the LASIK flap and INTACS was analyzed using Pentacam. Temporal decentration was 612.56 +/- 384.24 microm (range: 30 to 2120 microm) in the LASIK group and 788.33 +/- 500.34 microm (range: 30 to 2450 microm) in the INTACS group. A statistically significant difference was noted between the groups in terms of decentration (P < ...). Decentration of the LASIK flap and INTACS correlated with the central corneal thickness in the LASIK group and preoperative sphere and cylinder in the INTACS group, respectively. Decentration with the IntraLase occurred in most cases, especially in keratoconic eyes. The applanation performed for centralization during IntraLase application may flatten and shift the pupil center, and thus cause decentration of the LASIK flap and INTACS. Central corneal thickness in the LASIK group and preoperative sphere and cylinder in the INTACS group proved to be statistically significant parameters associated with decentration.

  19. Governing decentralization in health care under tough budget constraint: what can we learn from the Italian experience?

    Science.gov (United States)

    Tediosi, Fabrizio; Gabriele, Stefania; Longo, Francesco

    2009-05-01

    In many European countries, since the World War II, there has been a trend towards decentralization of health policy to lower levels of governments, while more recently there have been re-centralization processes. Whether re-centralization will be the new paradigm of European health policy or not is difficult to say. In the Italian National Health Service (SSN) decentralization raised two related questions that might be interesting for the international debate on decentralization in health care: (a) what sort of regulatory framework and institutional balances are required to govern decentralization in health care in a heterogeneous country under tough budget constraints? (b) how can it be ensured that the most advanced parts of the country remain committed to solidarity, supporting the weakest ones? To address these questions this article describes the recent trends in SSN funding and expenditure, it reviews the strategy adopted by the Italian government for governing the decentralization process and discusses the findings to draw policy conclusions. The main lessons emerging from this experience are that: (1) when the differences in administrative and policy skills, in socio-economic standards and social capital are wide, decentralization may lead to undesirable divergent evolution paths; (2) even in decentralized systems, the role of the Central government can be very important to contain health expenditure; (3) a strong governance of the Central government may help and not hinder the enforcement of decentralization; and (4) supporting the weakest Regions and maintaining inter-regional solidarity is hard but possible. In Italy, despite an increasing role of the Central government in steering the SSN, the pattern of regional decentralization of health sector decision making does not seem at risk. Nevertheless, the Italian case confirms the complexity of decentralization and re-centralization processes that sometimes can be paradoxically reinforcing each other.

  20. It's All About the Data: Workflow Systems and Weather

    Science.gov (United States)

    Plale, B.

    2009-05-01

    Digital data is fueling new advances in the computational sciences, particularly geospatial research as environmental sensing grows more practical through reduced technology costs, broader network coverage, and better instruments. e-Science research (i.e., cyberinfrastructure research) has responded to data intensive computing with tools, systems, and frameworks that support computationally oriented activities such as modeling, analysis, and data mining. Workflow systems support execution of sequences of tasks on behalf of a scientist. These systems, such as Taverna, Apache ODE, and Kepler, when built as part of a larger cyberinfrastructure framework, give the scientist tools to construct task graphs of execution sequences, often through a visual interface for connecting task boxes together with arcs representing control flow or data flow. Unlike business processing workflows, scientific workflows expose a high degree of detail and control during configuration and execution. Data-driven science imposes unique needs on workflow frameworks. Our research is focused on two issues. The first is the support for workflow-driven analysis over all kinds of data sets, including real time streaming data and locally owned and hosted data. The second is the essential role metadata/provenance collection plays in data driven science, for discovery, determining quality, for science reproducibility, and for long-term preservation. The research has been conducted over the last 6 years in the context of cyberinfrastructure for mesoscale weather research carried out as part of the Linked Environments for Atmospheric Discovery (LEAD) project. LEAD has pioneered new approaches for integrating complex weather data, assimilation, modeling, mining, and cyberinfrastructure systems. Workflow systems have the potential to generate huge volumes of data. Without some form of automated metadata capture, either metadata description becomes largely a manual task that is difficult if not impossible

  1. Decentralization and centralization in a federal system: the case of democratic Brazil

    Directory of Open Access Journals (Sweden)

    Maria Hermínia Tavares de Almeida

    2006-01-01

    This paper discusses the contradictory impulses towards decentralization and centralization in Brazil during the 1990s and early 2000s. After discussing the analytical issues related to the specific nature of decentralization in federal systems, the paper examines two sets of policy issues: those regulating the fiscal relations between national and sub-national governments and those redefining responsibilities for social services provision (basic education, health care, social assistance). Against conventional academic wisdom, it sustains that although there has been some re-centralization of fiscal decisions and of targeted income transfer programs, a clear re-centralization tendency cannot be said to exist. Decentralization and centralization trends coexist, propelled by different forces, with different motives and different outcomes.

  2. Decentralized Development Planning and Fragmentation of ...

    African Journals Online (AJOL)

    Using the GAMA as a case study, this paper examines the proliferation of .... These spatial definitions give territorial meaning to decentralization as dis- ... Formulated and implemented under a military regime, the Provisional ..... increased to four in 2004 following the creation of new districts in the country, and as part of.

  3. FISCAL DECENTRALIZATION DETERMINANTS AND LOCAL ECONOMIC DEVELOPMENT IN EU COUNTRIES

    Directory of Open Access Journals (Sweden)

    Anca Florentina GAVRILUŢĂ (VATAMANU)

    2017-12-01

    This work aims to assess the impact of fiscal decentralization on local (regional) development in the EU Member States while controlling for macroeconomic and local-autonomy-specific factors. Using a panel data approach with dynamic effects, we examined the implications of fiscal decentralization for local development across European Union countries over the 1990-2004 period. The novelty of the study lies in including in the analysis a variable which tests local fiscal discipline, more exactly, the Fiscal Rule Strength Index for the local level of government. Our findings suggest that the prosperity of regions, measured in GDP growth, depends on variables such as the characteristics of decentralization undertaken by each country and local fiscal discipline, confirming our primary hypothesis. This supports the view that recently implemented reforms aiming to enforce fiscal discipline following up on the Fiscal Compact strengthened the local budgetary framework and therefore restrained the local discretionary power to act towards development.

  4. Decentralized DC Microgrid Monitoring and Optimization via Primary Control Perturbations

    Science.gov (United States)

    Angjelichinoski, Marko; Scaglione, Anna; Popovski, Petar; Stefanovic, Cedomir

    2018-06-01

    We treat the emerging power systems with direct current (DC) MicroGrids, characterized by high penetration of power electronic converters. We rely on the power electronics to propose a decentralized solution for autonomous learning of and adaptation to the operating conditions of the DC MicroGrids; the goal is to eliminate the need to rely on an external communication system for such purposes. The solution works within the primary droop control loops and uses only local bus voltage measurements. Each controller is able to estimate (i) the generation capacities of power sources, (ii) the load demands, and (iii) the conductances of the distribution lines. To define a well-conditioned estimation problem, we employ a decentralized strategy where the primary droop controllers temporarily switch between operating points in a coordinated manner, following amplitude-modulated training sequences. We study the use of the estimator in a decentralized solution of the Optimal Economic Dispatch problem. The evaluations confirm the usefulness of the proposed solution for autonomous MicroGrid operation.

  5. Development of a completely decentralized control system for modular continuous conveyors

    OpenAIRE

    Mayer, Stephan H.

    2009-01-01

    To increase the flexibility of application of continuous conveyor systems, a completely decentralized control system for a modular conveyor system is introduced in the paper. This system is able to carry conveyor units without any centralized infrastructure. Based on existing methods of decentralized data transfer in IT networks, single modules operate autonomously and, after being positioned into the required topology, independently connect together to become a functioning conveyor system.

  6. Integrating centralized and decentralized organization structures: an education and development model.

    Science.gov (United States)

    Sheriff, R; Banks, A

    2001-01-01

    Organization change efforts have led to critically examining the structure of education and development departments within hospitals. This qualitative study evaluated an education and development model in an academic health sciences center. The model combines centralization and decentralization. The study results can be used by staff development educators and administrators when organization structure is questioned. This particular model maximizes the benefits and minimizes the limitations of centralized and decentralized structures.

  7. Workflow automation based on OSI job transfer and manipulation

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Joosten, Stef M.M.; Guareis de farias, Cléver

    1999-01-01

    This paper shows that Workflow Management Systems (WFMS) and a data communication standard called Job Transfer and Manipulation (JTM) are built on the same concepts, even though different words are used. The paper analyses the correspondence of workflow concepts and JTM concepts. Besides, the

  8. Decentralized stabilization of semi-active vibrating structures

    Science.gov (United States)

    Pisarski, Dominik

    2018-02-01

    A novel method of decentralized structural vibration control is presented. The control is assumed to be realized by a semi-active device. The objective is to stabilize a vibrating system with the optimal rates of decrease of the energy. The controller relies on an easily implemented decentralized switched state-feedback control law. It uses a set of communication channels to exchange the state information between the neighboring subcontrollers. The performance of the designed method is validated by means of numerical experiments performed for a double cantilever system equipped with a set of elastomers with controlled viscoelastic properties. In terms of the assumed objectives, the proposed control strategy significantly outperforms the passive damping cases and is competitive with a standard centralized control. The presented methodology can be applied to a class of bilinear control systems concerned with smart structural elements.

  9. Financial management systems under decentralization and their effect on malaria control in Uganda.

    Science.gov (United States)

    Kivumbi, George W; Nangendo, Florence; Ndyabahika, Boniface Rutagira

    2004-01-01

    A descriptive case study with multiple sites and a single level of analysis was carried out in four purposefully selected administrative districts of Uganda to investigate the effect of financial management systems under decentralization on malaria control. Data were primarily collected from 36 interviews with district managers, staff at health units and local leaders. A review of records and documents related to decentralization at the central and district level was also used to generate data for the study. We found that a long, tedious, and bureaucratic process, combined with a lack of knowledge of the new financial systems among several actors, characterized financial flows under decentralization. This affected the timely use of financial resources for malaria control, in that there were funds in the system that could not be accessed for use. We were also told that sometimes these funds were returned to the central government because of non-use, due to difficulties in accessing them and/or stringent conditions not to divert them to other uses. Our data showed that a cocktail of bureaucratic control systems, corruption and incompetence makes the financial management system under decentralization counter-productive for malaria control. The main conclusion is that good governance through appropriate and efficient financial management systems is very important for effective malaria control under decentralization.

  10. Dynamic Service Selection in Workflows Using Performance Data

    Directory of Open Access Journals (Sweden)

    David W. Walker

    2007-01-01

    An approach to dynamic workflow management and optimisation using near-realtime performance data is presented. Strategies are discussed for choosing an optimal service (based on user-specified criteria) from several semantically equivalent Web services. Such an approach may involve finding "similar" services, by first pruning the set of discovered services based on service metadata, and subsequently selecting an optimal service based on performance data. The current implementation of the prototype workflow framework is described and demonstrated with a simple workflow. Performance results are presented that show the performance benefits of dynamic service selection. A statistical analysis based on the first order statistic is used to investigate the likely improvement in service response time arising from dynamic service selection.
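
    A minimal Python sketch of the two-stage selection described above -- prune on service metadata, then choose on observed performance -- follows; the service records, field names, and scoring rule are illustrative assumptions.

        from statistics import mean

        services = [
            {"url": "http://a.example/solve", "operation": "solve", "version": 2, "latencies": [0.8, 0.9, 1.1]},
            {"url": "http://b.example/solve", "operation": "solve", "version": 2, "latencies": [0.4, 0.5, 0.6]},
            {"url": "http://c.example/render", "operation": "render", "version": 1, "latencies": [0.2]},
        ]

        def select(operation, min_version):
            # Stage 1: metadata pruning to semantically "similar" candidates.
            similar = [s for s in services
                       if s["operation"] == operation and s["version"] >= min_version]
            # Stage 2: pick the candidate with the best recent mean response time.
            return min(similar, key=lambda s: mean(s["latencies"]))

        print(select("solve", 2)["url"])   # -> http://b.example/solve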

  11. On Secure Workflow Decentralisation on the Internet

    Directory of Open Access Journals (Sweden)

    Petteri Kaskenpalo

    2010-06-01

    Decentralised workflow management systems are a new research area, where most work to date has focused on the system's overall architecture. As little attention has been given to the security aspects of such systems, we follow a security-driven approach and consider, from the perspective of available security building blocks, how security can be implemented and what new opportunities are presented when empowering the decentralised environment with modern distributed security protocols. Our research is motivated by a more general question of how to combine the positive enablers that email exchange enjoys with the general benefits of workflow systems, and more specifically with the benefits that can be introduced in a decentralised environment. This aims to equip email users with a set of tools to manage the semantics of a message exchange, contents, participants and their roles in the exchange, in an environment that provides inherent assurances of security and privacy. This work is based on a survey of contemporary distributed security protocols and considers how these protocols could be used in implementing a distributed workflow management system with decentralised control. We review a set of these protocols, focusing on the required message sequences, and discuss how these security protocols provide the foundations for implementing core control-flow, data, and resource patterns in a distributed workflow environment.

  12. PLAStiCC: Predictive Look-Ahead Scheduling for Continuous dataflows on Clouds

    Energy Technology Data Exchange (ETDEWEB)

    Kumbhare, Alok [Univ. of Southern California, Los Angeles, CA (United States); Simmhan, Yogesh [Indian Inst. of Technology (IIT), Bangalore (India); Prasanna, Viktor K. [Univ. of Southern California, Los Angeles, CA (United States)

    2014-05-27

    Scalable stream processing and continuous dataflow systems are gaining traction with the rise of big data due to the need for processing high-velocity data in near real time. Unlike batch processing systems such as MapReduce and workflows, static scheduling strategies fall short for continuous dataflows due to variations in the input data rates and the need for sustained throughput. The elastic resource provisioning of cloud infrastructure is valuable for meeting the changing resource needs of such continuous applications. However, multi-tenant cloud resources introduce yet another dimension of performance variability that impacts the application’s throughput. In this paper we propose PLAStiCC, an adaptive scheduling algorithm that balances resource cost and application throughput using a prediction-based look-ahead approach. It addresses not only variations in the input data rates but also variability in the underlying cloud infrastructure. In addition, we propose several simpler static scheduling heuristics that operate in the absence of an accurate performance prediction model. These static and adaptive heuristics are evaluated through extensive simulations using performance traces obtained from public and private IaaS clouds. Our results show an improvement of up to 20% in the overall profit as compared to the reactive adaptation algorithm.
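
    The contrast with reactive adaptation can be illustrated with a toy look-ahead provisioning rule (not the PLAStiCC algorithm itself; the prediction window, per-VM throughput, and cost figures are invented):

        import math

        def lookahead_provision(predicted_rates, vm_throughput, vm_cost_per_window):
            # Provision for the predicted peak of the coming window, so throughput
            # is sustained instead of reacting after a backlog has already formed.
            peak = max(predicted_rates)
            vms = math.ceil(peak / vm_throughput)
            return vms, vms * vm_cost_per_window

        vms, cost = lookahead_provision([120, 340, 260], vm_throughput=100, vm_cost_per_window=5)
        print(vms, cost)   # 4 VMs at cost 20 for the window ahead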

  13. The Paradox of Decentralizing Schools: Lessons from Business, Government, and the Catholic Church.

    Science.gov (United States)

    Murphy, Jerome T.

    1989-01-01

    By the year 2000, school decentralization could become another unfortunate, ineffectual pendulum swing. According to this article, a dynamic, ever-changing system of decentralization and centralization balances the benefits of local administrative autonomy with the pursuit of unified goals and helps each leadership level understand its…

  14. "Intelligent" tools for workflow process redesign : a research agenda

    NARCIS (Netherlands)

    Netjes, M.; Vanderfeesten, I.T.P.; Reijers, H.A.; Bussler, C.; Haller, A.

    2006-01-01

    Although much attention is being paid to business processes during the past decades, the design of business processes and particularly workflow processes is still more art than science. In this workshop paper, we present our view on modeling methods for workflow processes and introduce our research

  15. When Workflow Management Systems and Logging Systems Meet: Analyzing Large-Scale Execution Traces

    Energy Technology Data Exchange (ETDEWEB)

    Gunter, Daniel

    2008-07-31

    This poster shows the benefits of integrating a workflow management system with logging and log mining capabilities. By combining two existing, mature technologies, Pegasus-WMS and NetLogger, we are able to efficiently process execution logs of earthquake science workflows consisting of hundreds of thousands to one million tasks. In particular we show results of processing logs of CyberShake, a workflow application running on the TeraGrid. Client-side tools allow scientists to quickly gather statistics about a workflow run and find out which tasks executed, where they were executed, what their runtimes were, etc. These statistics can be used to understand the performance characteristics of a workflow and help tune the execution parameters of the workflow management system. This poster shows the scalability of the system by presenting results of uploading task execution records into the system and results of querying the system for overall workflow performance information.
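
    The flavor of such client-side statistics can be sketched in a few lines of Python; the log record format below is an assumption for illustration, not NetLogger's actual schema.

        import csv, io, statistics
        from collections import defaultdict

        # Toy execution log: one record per completed task.
        log = io.StringIO("task,host,runtime\nt1,node1,12.0\nt2,node1,15.5\nt3,node2,11.2\n")

        by_host = defaultdict(list)
        for row in csv.DictReader(log):
            by_host[row["host"]].append(float(row["runtime"]))

        for host, runtimes in sorted(by_host.items()):
            # Tasks executed per host and their mean runtime.
            print(host, len(runtimes), round(statistics.mean(runtimes), 2))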

  16. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    Rzehorz, Gerhard Ferdinand; The ATLAS collaboration

    2018-01-01

    From 2025 onwards, the ATLAS collaboration at the Large Hadron Collider (LHC) at CERN will experience a massive increase in data quantity as well as complexity. Even including mitigating factors, the computing power available by that time will fulfil only one tenth of the requirement. This contribution focuses on Cloud computing as an approach to help overcome this challenge by providing flexible hardware that can be configured to the specific needs of a workflow. Experience with Cloud computing exists, but there is large uncertainty as to whether, and to what degree, it can reduce the burden by 2025. In order to understand and quantify the benefits of Cloud computing, the "Workflow and Infrastructure Model" was created. It estimates the viability of Cloud computing by combining different inputs from the workflow side with infrastructure specifications. The model delivers metrics that enable the comparison of different Cloud configurations as well as different Cloud offerings with each other. A wide range of r...

  17. Decentralized central heating

    Energy Technology Data Exchange (ETDEWEB)

    Savic, S.; Hudjera, A.

    1994-08-04

    The decentralized central heating is essentially based on new technical solutions for an independent heating unit, which allow up to 20% savings in collectible energy and up to 15% savings in built-in material. These savings are made possible by the fact that the elements described under point A are eliminated from the classical heating system. The elements thus made superfluous are replaced by the new technical solutions described under point B - technical problem - and point E - patent claim. The technical solutions described in detail under point B and point E together form a technical unit and are essential parts of the invention protected by the patent. (author)

  18. Declarative Event-Based Workflow as Distributed Dynamic Condition Response Graphs

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2010-01-01

    We present Dynamic Condition Response Graphs (DCR Graphs) as a declarative, event-based process model inspired by the workflow language employed by our industrial partner and conservatively generalizing prime event structures. A dynamic condition response graph is a directed graph with nodes repr...... exemplify the use of distributed DCR Graphs on a simple workflow taken from a field study at a Danish hospital, pointing out their flexibility compared to imperative workflow models. Finally we provide a mapping from DCR Graphs to Büchi automata.
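
    For readers unfamiliar with the model, here is a minimal sketch of DCR-style execution semantics covering only the condition and response relations (event names and the toy workflow are invented for illustration; exclusion, inclusion and milestones are omitted):

      # An included event is enabled once all of its included condition
      # events have been executed; executing an event makes its response
      # events pending.
      class DCRGraph:
          def __init__(self, events, conditions, responses):
              self.conditions = conditions  # event -> set of required events
              self.responses = responses    # event -> set of events made pending
              self.executed, self.pending = set(), set()
              self.included = set(events)

          def enabled(self, e):
              required = self.conditions.get(e, set()) & self.included
              return e in self.included and required <= self.executed

          def execute(self, e):
              assert self.enabled(e), f"{e} is not enabled"
              self.executed.add(e)
              self.pending.discard(e)
              self.pending |= self.responses.get(e, set())

      # Toy hospital-style workflow: medicine may only be given after it is
      # prescribed, and prescribing requires that it is eventually given.
      g = DCRGraph(["prescribe", "give"],
                   conditions={"give": {"prescribe"}},
                   responses={"prescribe": {"give"}})
      g.execute("prescribe")
      print(g.enabled("give"), g.pending)  # True {'give'}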

  19. Organizational precedents for ownership and management of decentralized renewable-energy systems

    Energy Technology Data Exchange (ETDEWEB)

    Meunier, R.; Silversmith, J.A.

    1981-03-01

    Three existing organizational types that meet the decentralization criteria of local consumer ownership and control - cooperatives, Rural Electric Cooperatives, and municipal utilities - are examined. These three organizational precedents are analyzed in terms of their histories, structures, legal powers, sources of capital, and social and political aspects. Examples of related experiments with renewable energy technologies are given, and inferences are drawn regarding the organizations' suitability as vehicles for future implementation of decentralized renewable energy systems.

  20. Design decisions in workflow management and quality of work.

    NARCIS (Netherlands)

    Waal, B.M.E. de; Batenburg, R.

    2009-01-01

    In this paper, the design and implementation of a workflow management (WFM) system in a large Dutch social insurance organisation is described. The effect of workflow design decisions on the quality of work is explored theoretically and empirically, using the model of Zur Mühlen as a frame of

  1. Job life cycle management libraries for CMS workflow management projects

    International Nuclear Information System (INIS)

    Lingen, Frank van; Wilkinson, Rick; Evans, Dave; Foulkes, Stephen; Afaq, Anzar; Vaandering, Eric; Ryu, Seangchan

    2010-01-01

    Scientific analysis and simulation require the processing and generation of millions of data samples. These tasks often comprise multiple smaller tasks divided over multiple (computing) sites. This paper discusses the Compact Muon Solenoid (CMS) workflow infrastructure, and specifically the Python-based workflow library which is used for so-called task lifecycle management. The CMS workflow infrastructure consists of three layers: high-level specification of the various tasks based on input/output data sets, lifecycle management of task instances derived from the high-level specification, and execution management. The workflow library is the result of a convergence of three CMS sub-projects that respectively deal with scientific analysis, simulation, and real-time data aggregation from the experiment. This convergence reduces duplication and hence development and maintenance costs.
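
    The lifecycle-management layer described above can be pictured as a small state machine over task states; the sketch below is a hypothetical illustration (state names and the retry policy are assumptions, not the CMS workflow library's actual API):

      # Hypothetical task lifecycle state machine with a bounded retry budget.
      ALLOWED = {
          "created":   {"submitted"},
          "submitted": {"running"},
          "running":   {"success", "failed"},
          "failed":    {"submitted"},  # failed tasks may be resubmitted
      }

      class Task:
          def __init__(self, name, max_retries=3):
              self.name, self.state, self.retries = name, "created", 0
              self.max_retries = max_retries

          def advance(self, new_state):
              if new_state not in ALLOWED.get(self.state, set()):
                  raise ValueError(f"{self.state} -> {new_state} not allowed")
              if self.state == "failed" and new_state == "submitted":
                  self.retries += 1
                  if self.retries > self.max_retries:
                      raise RuntimeError(f"{self.name}: retry budget exhausted")
              self.state = new_state

      t = Task("merge_step")
      for s in ("submitted", "running", "failed", "submitted", "running", "success"):
          t.advance(s)
      print(t.state, t.retries)  # success 1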

  2. To decentralize or to continue on the centralization track: The cases of authoritarian regimes in Russia and Kazakhstan

    Directory of Open Access Journals (Sweden)

    Irina Busygina

    2018-01-01

    Decisions on decentralization versus centralization come as a result of strategic choices made by politicians after weighing their costs and benefits. In authoritarian regimes, the highest-priority political task is that of restraining political competition and securing power in the hands of the incumbent. This task incentivizes politicians to restrict political decentralization (or at least to block reforms promoting such decentralization). At the same time, external economic pressures (e.g. globalization) place the task of national competitiveness in global markets on the agenda, and increase incentives for fiscal and administrative decentralization. Thus, political and economic pressures create contradicting incentives, and in weighing costs and benefits, politicians in different authoritarian regimes make different choices that lead to variation in the form, degree and success of decentralization/centralization policies. In this article we compare authoritarian decentralization in Russia and Kazakhstan.

  3. Automated evolutionary restructuring of workflows to minimise errors via stochastic model checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents a framework for the automated restructuring of workflows that allows one to minimise the impact of errors on a production workflow. The framework allows for the modelling of workflows by means of a formalised subset of the Business Process Modelling and Notation (BPMN) language...

  4. Decentralization and public expenditure: Does special local autonomy affect regional economic growth?

    Directory of Open Access Journals (Sweden)

    Martapina Anggai

    2018-04-01

    This study examines the relationship between public expenditure under the regional autonomy policy and economic growth in the West Papua and Papua provinces. We distinguish two kinds of decentralized public expenditure – operational and capital – as well as private expenditure. We use unbalanced panel data over the period 2007-2010 to investigate whether these expenditures enhance regional economic growth. We find that government operating expenditure and private expenditure have a positive effect on local economic growth, but find no relationship between decentralized capital expenditure and economic growth. The findings do not conform to a priori efficiency expectations, suggesting a need to reform the regional autonomy and fiscal decentralization policy in both provinces.

  5. Centralized or decentralized electricity production

    International Nuclear Information System (INIS)

    Boer, H.A. de.

    1975-01-01

    Because of the low overall efficiency of electric power generation, it is argued that energy provision based on gas, combined with locally decentralized electricity production, saves slightly more fossil fuel for the Netherlands than nuclear technologies do, and makes the country independent of uranium resources. The reason the Netherlands pursues this approach is that a large part of the energy is ultimately used for heating at normal or moderate temperatures

  6. Making decentralization work for women in Uganda

    NARCIS (Netherlands)

    Lakwo, A.

    2009-01-01

    This book is about engendering local governance. It explores the euphoria with which Uganda's decentralization policy took centre stage as a sufficient driver to engender local development responsiveness and accountability. Using a case study of AFARD in Nebbi district, it shows first that

  7. Decentralized data fusion with inverse covariance intersection

    NARCIS (Netherlands)

    Noack, B.; Sijs, J.; Reinhardt, M.; Hanebeck, U.D.

    2017-01-01

    In distributed and decentralized state estimation systems, fusion methods are employed to systematically combine multiple estimates of the state into a single, more accurate estimate. An often encountered problem in the fusion process relates to unknown common information that is shared by the

  8. Comment on ‘Energy and air emission implications of a decentralized wastewater system’

    International Nuclear Information System (INIS)

    Vedachalam, Sridhar; Riha, Susan J

    2013-01-01

    In the article ‘Energy and air emission implications of a decentralized wastewater system’ published in Environmental Research Letters (2012 Environ. Res. Lett. 7 024007), Shehabi et al compared a decentralized and a centralized system on the basis of energy use, greenhouse gas emissions and air pollutants, and claimed that economies of scale lower the environmental impacts from a centralized system on a per-volume basis. In this comment, we present literature and data from New York State, USA to argue that the authors’ comparison between a small decentralized system (0.015 MGD) and a large centralized system (66.5 MGD) is unconventional and inappropriate. (comment)

  9. Incorporating Workflow Interference in Facility Layout Design: The Quartic Assignment Problem

    OpenAIRE

    Wen-Chyuan Chiang; Panagiotis Kouvelis; Timothy L. Urban

    2002-01-01

    Although many authors have noted the importance of minimizing workflow interference in facility layout design, traditional layout research tends to focus on minimizing the distance-based transportation cost. This paper formalizes the concept of workflow interference from a facility layout perspective. A model, formulated as a quartic assignment problem, is developed that explicitly considers the interference of workflow. Optimal and heuristic solution methodologies are developed and evaluated.

  10. From the desktop to the grid: scalable bioinformatics via workflow conversion.

    Science.gov (United States)

    de la Garza, Luis; Veit, Johannes; Szolek, Andras; Röttig, Marc; Aiche, Stephan; Gesing, Sandra; Reinert, Knut; Kohlbacher, Oliver

    2016-03-12

    Reproducibility is one of the tenets of the scientific method. Scientific experiments often comprise complex data flows, selection of adequate parameters, and analysis and visualization of intermediate and end results. Breaking down the complexity of such experiments into the joint collaboration of small, repeatable, well-defined tasks, each with well-defined inputs, parameters, and outputs, offers immediate benefits such as identifying bottlenecks and pinpointing sections that could benefit from parallelization. Workflows rest upon the notion of splitting complex work into the joint effort of several manageable tasks. There are several engines that give users the ability to design and execute workflows. Each engine was created to address the problems of a specific community, so each one has its advantages and shortcomings. Furthermore, not all features of all workflow engines are royalty-free - an aspect that could potentially drive away members of the scientific community. We have developed a set of tools that enables the scientific community to benefit from workflow interoperability. We developed a platform-free structured representation of the parameters, inputs, and outputs of command-line tools in so-called Common Tool Descriptor documents. We have also overcome the shortcomings and combined the features of two royalty-free workflow engines with a substantial user community: the Konstanz Information Miner, an engine which we see as a formidable workflow editor, and the Grid and User Support Environment, a web-based framework able to interact with several high-performance computing resources. We have thus created a free and highly accessible way to design workflows on a desktop computer and execute them on high-performance computing resources. Our work will not only reduce time spent on designing scientific workflows, but also make executing workflows on remote high-performance computing resources more accessible to technically inexperienced users. We
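
    To illustrate the idea of a platform-free tool description, here is a minimal sketch of a descriptor object that can render the same tool invocation on any platform (a hypothetical simplification; the real Common Tool Descriptor format is a far more detailed XML schema):

      from dataclasses import dataclass, field

      @dataclass
      class ToolDescriptor:
          executable: str
          inputs: dict = field(default_factory=dict)   # flag -> input file
          outputs: dict = field(default_factory=dict)  # flag -> output file
          params: dict = field(default_factory=dict)   # flag -> value

          def command_line(self):
              parts = [self.executable]
              for mapping in (self.inputs, self.outputs, self.params):
                  for flag, value in mapping.items():
                      parts += [flag, str(value)]
              return parts

      # Invented example tool and flags, purely for illustration.
      ctd = ToolDescriptor("peptide_search",
                           inputs={"-in": "spectra.mzML"},
                           outputs={"-out": "hits.idXML"},
                           params={"-threads": 4})
      print(" ".join(ctd.command_line()))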

  11. Provenance-Based Debugging and Drill-Down in Data-Oriented Workflows

    KAUST Repository

    Ikeda, Robert; Cho, Junsang; Fang, Charlie; Salihoglu, Semih; Torikai, Satoshi; Widom, Jennifer

    2012-01-01

    Panda (for Provenance and Data) is a system that supports the creation and execution of data-oriented workflows, with automatic provenance generation and built-in provenance tracing operations. Workflows in Panda are arbitrary acyclic graphs

  12. Centralization vs. Decentralization in Medical School Libraries

    Science.gov (United States)

    Crawford, Helen

    1966-01-01

    Does the medical school library in the United States operate more commonly under the university library or the medical school administration? University-connected medical school libraries were asked to indicate (a) the source of their budgets, whether from the central library or the medical school, and (b) the responsibility for their acquisitions and cataloging. Returns received from sixty-eight of the seventy eligible institutions showed decentralization to be much the most common: 71 percent of the libraries are funded by their medical schools; 79 percent are responsible for their own acquisitions and processing. The factor most often associated with centralization of both budget and operation is public ownership. Decentralization is associated with service to one or two rather than three or more professional schools. Location of the medical school in a different city from the university is highly favorable to autonomy. Other factors associated with these trends are discussed. PMID:5945568

  13. Centralization vs. decentralization in medical school libraries.

    Science.gov (United States)

    Crawford, H

    1966-07-01

    Does the medical school library in the United States operate more commonly under the university library or the medical school administration? University-connected medical school libraries were asked to indicate (a) the source of their budgets, whether from the central library or the medical school, and (b) the responsibility for their acquisitions and cataloging. Returns received from sixty-eight of the seventy eligible institutions showed decentralization to be much the most common: 71 percent of the libraries are funded by their medical schools; 79 percent are responsible for their own acquisitions and processing. The factor most often associated with centralization of both budget and operation is public ownership. Decentralization is associated with service to one or two rather than three or more professional schools. Location of the medical school in a different city from the university is highly favorable to autonomy. Other factors associated with these trends are discussed.

  14. Papers by the Decentralized Wastewater Management MOU Partnership

    Science.gov (United States)

    Four position papers for state, local, and tribal government officials and interested stakeholders. These papers include information on the uses and benefits of decentralized wastewater treatment and examples of its effective use.

  15. Hermes: Seamless delivery of containerized bioinformatics workflows in hybrid cloud (HTC) environments

    Science.gov (United States)

    Kintsakis, Athanassios M.; Psomopoulos, Fotis E.; Symeonidis, Andreas L.; Mitkas, Pericles A.

    Hermes introduces a new "describe once, run anywhere" paradigm for the execution of bioinformatics workflows in hybrid cloud environments. It combines the traditional features of parallelization-enabled workflow management systems and of distributed computing platforms in a container-based approach. It offers seamless deployment, overcoming the burden of setting up and configuring the software and network requirements. Most importantly, Hermes fosters the reproducibility of scientific workflows by supporting standardization of the software execution environment, thus leading to consistent scientific workflow results and accelerating scientific output.

  16. Conceptual-level workflow modeling of scientific experiments using NMR as a case study

    Directory of Open Access Journals (Sweden)

    Gryk Michael R

    2007-01-01

    Background: Scientific workflows improve the process of scientific experiments by making computations explicit, underscoring data flow, and emphasizing the participation of humans in the process when intuition and human reasoning are required. Workflows for experiments also highlight transitions among experimental phases, allowing intermediate results to be verified and supporting the proper handling of semantic mismatches and different file formats among the various tools used in the scientific process. Thus, scientific workflows are important for the modeling and subsequent capture of bioinformatics-related data. While much research has been conducted on the implementation of scientific workflows, the initial process of actually designing and generating the workflow at the conceptual level has received little consideration. Results: We propose a structured process to capture scientific workflows at the conceptual level that allows workflows to be documented efficiently, results in concise models of the workflow and more-correct workflow implementations, and provides insight into the scientific process itself. The approach uses three modeling techniques to model the structural, data flow, and control flow aspects of the workflow. The domain of biomolecular structure determination using Nuclear Magnetic Resonance (NMR) spectroscopy is used to demonstrate the process. Specifically, we show the application of the approach to capture the workflow for the process of conducting biomolecular analysis using NMR spectroscopy. Conclusion: Using the approach, we were able to accurately document, in a short amount of time, numerous steps in the process of conducting an experiment using NMR spectroscopy. The resulting models are correct and precise, as outside validation of the models identified only minor omissions in the models. In addition, the models provide an accurate visual description of the control flow for conducting

  17. High performance workflow implementation for protein surface characterization using grid technology

    Directory of Open Access Journals (Sweden)

    Clematis Andrea

    2005-12-01

    Background: This study concerns the development of a high-performance workflow that, using grid technology, correlates different kinds of bioinformatics data, starting from the base pairs of the nucleotide sequence and reaching the exposed residues of the protein surface. The implementation of this workflow is based on the infrastructure of the Italian Grid.it project, a network of several computational resources and storage facilities distributed across different grid sites. Methods: Workflows are very common in bioinformatics because they allow large quantities of data to be processed by delegating the management of resources to the information streaming. Grid technology optimizes the computational load during the different workflow steps, dividing the more expensive tasks into a set of small jobs. Results: Grid technology allows efficient database management, a crucial problem for obtaining good results in bioinformatics applications. The proposed workflow is implemented to integrate huge amounts of data, and the results themselves must be stored in a relational database, which constitutes the added value to the global knowledge. Conclusion: A web interface has been developed to make this technology accessible to grid users. Once the workflow has started, by means of the simplified interface it is possible to follow all the different steps throughout the data processing. Eventually, when the workflow has terminated, the different features of the protein, like the amino acids exposed on the protein surface, can be compared with the data present in the output database.

  18. A Collaborative Workflow for the Digitization of Unique Materials

    Science.gov (United States)

    Gueguen, Gretchen; Hanlon, Ann M.

    2009-01-01

    This paper examines the experience of one institution, the University of Maryland Libraries, as it made organizational efforts to harness existing workflows and to capture digitization done in the course of responding to patron requests. By examining the way this organization adjusted its existing workflows to put in place more systematic methods…

  19. Decentralized Planning for Pre-Conflict and Post-Conflict ...

    African Journals Online (AJOL)

    Decentralized Planning for Pre-Conflict and Post-Conflict Management in the Bawku Municipal ... institutional arrangements for conflict monitoring and evaluation. Such processes are 'sine qua non' to pre-conflict and post-conflict prevention.

  20. Decentralized Control of Unmanned Aerial Robots for Wireless Airborne Communication Networks

    Directory of Open Access Journals (Sweden)

    Deok-Jin Lee

    2010-09-01

    This paper presents a cooperative control strategy for a team of aerial robotic vehicles to establish wireless airborne communication networks between distributed heterogeneous vehicles. Each aerial robot serves as a flying mobile sensor acting as a reconfigurable communication relay node, which enables communication networks with static or slow-moving nodes on the ground or ocean. For distributed optimal deployment of the aerial vehicles for communication networks, an adaptive hill-climbing type decentralized control algorithm is developed to seek out local extrema for optimal localization of the vehicles. The sensor networks established by the decentralized cooperative control approach can adapt their configuration in response to signal strength as a function of the relative distance between the autonomous aerial robots and distributed sensor nodes in the sensed environment. Simulation studies are conducted to evaluate the effectiveness of the proposed decentralized cooperative control technique for robust communication networks.
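
    The adaptive hill-climbing idea can be sketched in a toy one-dimensional form (the signal-strength model and all parameters below are invented stand-ins, not the controller from the paper): each relay probes a nearby position, keeps the move if its locally measured signal strength improves, and reverts otherwise.

      import random

      def signal_strength(x, ground_nodes):
          # Invented metric: sum of inverse squared distances to ground nodes.
          return sum(1.0 / (1e-3 + (x - n) ** 2) for n in ground_nodes)

      def hill_climb(x, ground_nodes, step=0.5, iters=200):
          best = signal_strength(x, ground_nodes)
          for _ in range(iters):
              candidate = x + random.uniform(-step, step)  # local probe only
              s = signal_strength(candidate, ground_nodes)
              if s > best:  # keep only improving moves
                  x, best = candidate, s
          return x

      print(round(hill_climb(x=0.0, ground_nodes=[2.0, 3.0, 8.0]), 2))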

  1. Development of the workflow kine systems for support on KAIZEN.

    Science.gov (United States)

    Mizuno, Yuki; Ito, Toshihiko; Yoshikawa, Toru; Yomogida, Satoshi; Morio, Koji; Sakai, Kazuhiro

    2012-01-01

    In this paper, we introduce a new workflow line system consisting of location and image recording, which enables the acquisition of workflow information and the display of its analysis. From the results of a workflow line investigation, we considered the anticipated effects of, and the problems with, KAIZEN. Workflow line information includes location information and action content information. These technologies suggest viewpoints that help improvement, for example, the exclusion of useless movement, the redesign of layout and the review of work procedures. In the manufacturing factory, it was clear that there was much movement away from the standard operation place and accumulated residence time. Concretely, this investigation showed that a more efficient layout could be suggested by the system. In the case of the hospital, similarly, it was pointed out that the workflow has problems of layout and setup operations, based on the effective movement patterns of the experts. This system can be adapted to routine as well as non-routine work. Through the development of this system, which can fit and adapt to industrial diversification, more effective "visual management" (visualization of work) is expected in the future.

  2. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    International Nuclear Information System (INIS)

    Herbert, L.T.; Hansen, Z.N.L.

    2016-01-01

    This paper presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults and incorporate an intention-preserving stochastic semantics able to model both probabilistic and non-deterministic behaviour. Stochastic model checking techniques are employed to generate the state-space of a given workflow. Possible improvements obtained by restructuring are measured by employing the framework's capacity for tracking real-valued quantities associated with states and transitions of the workflow. The space of possible restructurings of a workflow is explored by means of an evolutionary algorithm, where the goals for improvement are defined in terms of optimising quantities, typically employed to model resources, associated with a workflow. The approach is fully automated and only the modelling of the production workflows, potential faults and the expression of the goals require manual input. We present the design of a software tool implementing this framework and explore the practical utility of this approach through an industrial case study in which the risk of production failures and their impact are reduced by restructuring the workflow. - Highlights: • We present a framework which allows for the automated restructuring of workflows. • This framework seeks to minimise the impact of errors on the workflow. • We illustrate a scalable software implementation of this framework. • We explore the practical utility of this approach through an industry case. • The impact of errors can be substantially reduced by restructuring the workflow.
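
    A generic sketch of the evolutionary loop described above (the workflow encoding and the fault-cost fitness are invented placeholders; the paper's framework evaluates candidate restructurings by stochastic model checking instead):

      import random

      # A 'workflow' here is simply a task ordering; fitness is a stand-in
      # for the expected fault impact the framework would compute.
      def fitness(workflow, fault_cost):
          # Assumption: placing fault-prone tasks earlier is cheaper.
          return sum(fault_cost[t] * pos for pos, t in enumerate(workflow))

      def mutate(workflow):
          w = workflow[:]
          i, j = random.sample(range(len(w)), 2)  # swap two tasks
          w[i], w[j] = w[j], w[i]
          return w

      def evolve(tasks, fault_cost, pop_size=20, generations=100):
          population = [random.sample(tasks, len(tasks)) for _ in range(pop_size)]
          for _ in range(generations):
              population.sort(key=lambda w: fitness(w, fault_cost))
              survivors = population[:pop_size // 2]  # elitist selection
              population = survivors + [mutate(random.choice(survivors))
                                        for _ in range(pop_size - len(survivors))]
          return min(population, key=lambda w: fitness(w, fault_cost))

      cost = {"cut": 5.0, "weld": 1.0, "paint": 0.2, "inspect": 0.1}
      print(evolve(list(cost), cost))  # fault-prone tasks drift to the front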

  3. Decentring the Creative Self

    DEFF Research Database (Denmark)

    Glaveanu, Vlad Petre; Lubart, Todd

    2014-01-01

    to themes depicting the interaction between these different others and the creator. Findings reveal both similarities and differences across the five domains in terms of the specific contribution of others to the creative process. Social interactions play a key formative, regulatory, motivational...... and informational role in relation to creative work. From ‘internalized’ to ‘distant’, other people are an integral part of the equation of creativity calling for a de-centring of the creative self and its re-centring in a social space of actions and interactions....

  4. Decentralization and Living Conditions in the EU

    NARCIS (Netherlands)

    Vries, M.S. de; Goymen, K.; Sazak, O.

    2014-01-01

    This paper investigates the effects of decentralization on living conditions in core cities in the European Union. It uses data from the Urban Audit to investigate whether the level of local expenditures relative to central government expenditures has an impact on the subjective appreciation of

  5. Decentralization Fails Women in Sudan | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2010-11-05

    Nov 5, 2010 ... In Sudan, decentralization is a process that has occurred over time and is ... In northern Sudan, some women travel three days to reach the nearest hospital. ... Accord stipulate that basic education is free, “in real life, it is not.”.

  6. Centralized vs. De-centralized Multinationals and Taxes

    DEFF Research Database (Denmark)

    Nielsen, Søren Bo; Raimondos-Møller, Pascalis; Schjelderup, Guttorm

    2005-01-01

    The paper examines how country tax differences affect a multinational enterprise's choice to centralize or de-centralize its decision structure. Within a simple model that emphasizes the multiple conflicting roles of transfer prices in MNEs - here, as a strategic pre-commitment device and a tax...

  7. Hermes: Seamless delivery of containerized bioinformatics workflows in hybrid cloud (HTC environments

    Directory of Open Access Journals (Sweden)

    Athanassios M. Kintsakis

    2017-01-01

    Hermes introduces a new “describe once, run anywhere” paradigm for the execution of bioinformatics workflows in hybrid cloud environments. It combines the traditional features of parallelization-enabled workflow management systems and of distributed computing platforms in a container-based approach. It offers seamless deployment, overcoming the burden of setting up and configuring the software and network requirements. Most importantly, Hermes fosters the reproducibility of scientific workflows by supporting standardization of the software execution environment, thus leading to consistent scientific workflow results and accelerating scientific output.

  8. Scheduling the scheduling task : a time management perspective on scheduling

    NARCIS (Netherlands)

    Larco Martinelli, J.A.; Wiers, V.C.S.; Fransoo, J.C.

    2013-01-01

    Time is the most critical resource at the disposal of schedulers. Hence, an adequate management of time from the schedulers may impact positively on the scheduler’s productivity and responsiveness to uncertain scheduling environments. This paper presents a field study of how schedulers make use of

  9. Enhancing and Customizing Laboratory Information Systems to Improve/Enhance Pathologist Workflow.

    Science.gov (United States)

    Hartman, Douglas J

    2015-06-01

    Optimizing pathologist workflow can be difficult because it is affected by many variables. Surgical pathologists must complete many tasks that culminate in a final pathology report. Several software systems can be used to enhance pathologist workflow, including voice recognition software, pre-sign-out quality assurance, image utilization, and computerized provider order entry. Recent changes in diagnostic coding and the more prominent role of centralized electronic health records represent further opportunities to enhance pathologist workflow. Additional unforeseen changes to pathologist workflow may accompany the introduction of whole-slide imaging technology into routine diagnostic work. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. Remembering the Future of Centralized Control-Decentralized Execution

    National Research Council Canada - National Science Library

    Sheets, Patrick

    2003-01-01

    ... concepts which should drive system development. To realize the significance of the USAF C2 tenet of "centralized control-decentralized execution," one must understand how C2 is executed, in contingency theaters of operation...

  11. Electronic Health Record-Driven Workflow for Diagnostic Radiologists.

    Science.gov (United States)

    Geeslin, Matthew G; Gaskin, Cree M

    2016-01-01

    In most settings, radiologists maintain a high-throughput practice in which efficiency is crucial. The conversion from film-based to digital study interpretation and data storage launched the era of PACS-driven workflow, leading to significant gains in speed. The advent of electronic health records improved radiologists' access to patient data; however, many still find this aspect of workflow to be relatively cumbersome. Nevertheless, the ability to guide a diagnostic interpretation with clinical information, beyond that provided in the examination indication, can add significantly to the specificity of a radiologist's interpretation. Responsibilities of the radiologist include, but are not limited to, protocoling examinations, interpreting studies, chart review, peer review, writing notes, placing orders, and communicating with referring providers. Most of the aforementioned activities are not PACS-centric and require a login to one or more additional applications. Consolidation of these tasks for completion through a single interface can simplify workflow, save time, and potentially reduce the incidence of errors. Here, the authors describe diagnostic radiology workflow that leverages the electronic health record to significantly add to a radiologist's ability to be part of the health care team, provide relevant interpretations, and improve efficiency and quality. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  12. Decentralized control of units in smart grids for the support of renewable energy supply

    Energy Technology Data Exchange (ETDEWEB)

    Sonnenschein, Michael, E-mail: Michael.Sonnenschein@Uni-Oldenburg.DE [University of Oldenburg, Department of Computing Science, D-26111 Oldenburg (Germany); Lünsdorf, Ontje, E-mail: Ontje.Luensdorf@OFFIS.DE [OFFIS Institute for Information Technology, Escherweg 2, D-26121 Oldenburg (Germany); Bremer, Jörg, E-mail: Joerg.Bremer@Uni-Oldenburg.DE [University of Oldenburg, Department of Computing Science, D-26111 Oldenburg (Germany); Tröschel, Martin, E-mail: Martin.Troeschel@OFFIS.DE [OFFIS Institute for Information Technology, Escherweg 2, D-26121 Oldenburg (Germany)

    2015-04-15

    Due to the significant environmental impact of power production from fossil fuels and nuclear fission, future energy systems will increasingly rely on distributed and renewable energy sources (RES). The electrical feed-in from photovoltaic (PV) systems and wind energy converters (WEC) varies greatly both over short and long time periods (from minutes to seasons), and (not only) by this effect the supply of electrical power from RES and the demand for electrical power are not per se matching. In addition, with a growing share of generation capacity especially in distribution grids, the top-down paradigm of electricity distribution is gradually replaced by a bottom-up power supply. This altogether leads to new problems regarding the safe and reliable operation of power grids. In order to address these challenges, the notion of Smart Grids has been introduced. The inherent flexibilities, i.e. the set of feasible power schedules, of distributed power units have to be controlled in order to support demand–supply matching as well as stable grid operation. Controllable power units are e.g. combined heat and power plants, power storage systems such as batteries, and flexible power consumers such as heat pumps. By controlling the flexibilities of these units we are particularly able to optimize the local utilization of RES feed-in in a given power grid by integrating both supply and demand management measures with special respect to the electrical infrastructure. In this context, decentralized systems, autonomous agents and the concept of self-organizing systems will become key elements of the ICT based control of power units. In this contribution, we first show how a decentralized load management system for battery charging/discharging of electrical vehicles (EVs) can increase the locally used share of supply from PV systems in a low voltage grid. For a reliable demand side management of large sets of appliances, dynamic clustering of these appliances into uniformly
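
    The decentralized charging idea mentioned above can be caricatured in a few lines (the equal-share rule and all numbers are illustrative assumptions, not the system described in this contribution): each EV agent observes only the local PV surplus signal and sets its charging power accordingly.

      # Each EV takes an equal share of the locally observed PV surplus,
      # capped by its own charger rating.
      def charge_setpoints(pv_surplus_kw, ev_max_kw):
          n = len(ev_max_kw)
          share = max(pv_surplus_kw, 0.0) / n if n else 0.0
          return [min(share, cap) for cap in ev_max_kw]

      print(charge_setpoints(pv_surplus_kw=10.0, ev_max_kw=[3.7, 3.7, 11.0]))
      # -> three setpoints of ~3.33 kW: the PV surplus is consumed locally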

  13. Decentralized control of units in smart grids for the support of renewable energy supply

    International Nuclear Information System (INIS)

    Sonnenschein, Michael; Lünsdorf, Ontje; Bremer, Jörg; Tröschel, Martin

    2015-01-01

    Due to the significant environmental impact of power production from fossil fuels and nuclear fission, future energy systems will increasingly rely on distributed and renewable energy sources (RES). The electrical feed-in from photovoltaic (PV) systems and wind energy converters (WEC) varies greatly both over short and long time periods (from minutes to seasons), and (not only) by this effect the supply of electrical power from RES and the demand for electrical power are not per se matching. In addition, with a growing share of generation capacity especially in distribution grids, the top-down paradigm of electricity distribution is gradually replaced by a bottom-up power supply. This altogether leads to new problems regarding the safe and reliable operation of power grids. In order to address these challenges, the notion of Smart Grids has been introduced. The inherent flexibilities, i.e. the set of feasible power schedules, of distributed power units have to be controlled in order to support demand–supply matching as well as stable grid operation. Controllable power units are e.g. combined heat and power plants, power storage systems such as batteries, and flexible power consumers such as heat pumps. By controlling the flexibilities of these units we are particularly able to optimize the local utilization of RES feed-in in a given power grid by integrating both supply and demand management measures with special respect to the electrical infrastructure. In this context, decentralized systems, autonomous agents and the concept of self-organizing systems will become key elements of the ICT based control of power units. In this contribution, we first show how a decentralized load management system for battery charging/discharging of electrical vehicles (EVs) can increase the locally used share of supply from PV systems in a low voltage grid. For a reliable demand side management of large sets of appliances, dynamic clustering of these appliances into uniformly

  14. Styx Grid Services: Lightweight Middleware for Efficient Scientific Workflows

    Directory of Open Access Journals (Sweden)

    J.D. Blower

    2006-01-01

    The service-oriented approach to performing distributed scientific research is potentially very powerful but is not yet widely used in many scientific fields. This is partly due to the technical difficulties involved in creating services and workflows and the inefficiency of many workflow systems with regard to handling large datasets. We present the Styx Grid Service, a simple system that wraps command-line programs and allows them to be run over the Internet exactly as if they were local programs. Styx Grid Services are very easy to create and use and can be composed into powerful workflows with simple shell scripts or more sophisticated graphical tools. An important feature of the system is that data can be streamed directly from service to service, significantly increasing the efficiency of workflows that use large data volumes. The status and progress of Styx Grid Services can be monitored asynchronously using a mechanism that places very few demands on firewalls. We show how Styx Grid Services can interoperate with Web Services and WS-Resources using suitable adapters.

  15. Analyzing the Gap between Workflows and their Natural Language Descriptions

    NARCIS (Netherlands)

    Groth, P.T.; Gil, Y

    2009-01-01

    Scientists increasingly use workflows to represent and share their computational experiments. Because of their declarative nature, focus on pre-existing component composition and the availability of visual editors, workflows provide a valuable start for creating user-friendly environments for end

  16. Assessment of the Nurse Medication Administration Workflow Process

    Directory of Open Access Journals (Sweden)

    Nathan Huynh

    2016-01-01

    This paper presents findings of an observational study of the Registered Nurse (RN) Medication Administration Process (MAP) conducted on two comparable medical units in a large urban tertiary care medical center in Columbia, South Carolina. A total of 305 individual MAP observations were recorded over a 6-week period, with an average of 5 MAP observations per RN participant across both clinical units. A key MAP variation was identified in terms of unbundled versus bundled MAP performance. In the unbundled workflow, an RN engages in the MAP by performing only MAP tasks during a care episode. In the bundled workflow, an RN completes medication administration along with other patient care responsibilities during the care episode. Using a discrete-event simulation model, this paper addresses the difference between unbundled and bundled workflows and their effects on simulated redesign interventions.
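
    A minimal Monte Carlo sketch of the bundled-versus-unbundled comparison (all durations and the interruption model are invented placeholders, not the study's calibrated discrete-event simulation):

      import random

      # Assumption: each extra task bundled into a care episode adds one
      # more interruption risk draw, and every interruption costs recovery
      # time. All numbers are invented.
      def episode_time(n_bundled_tasks, trials=10_000):
          total = 0.0
          for _ in range(trials):
              t = 8.0 + 4.0 * n_bundled_tasks       # base minutes of work
              for _ in range(n_bundled_tasks + 1):  # one risk draw per task
                  if random.random() < 0.25:        # interruption probability
                      t += random.uniform(1.0, 5.0) # recovery time, minutes
              total += t
          return total / trials

      print(f"unbundled: {episode_time(0):.1f} min/episode")
      print(f"bundled:   {episode_time(2):.1f} min/episode")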

  17. Developing integrated workflows for the digitisation of herbarium specimens using a modular and scalable approach.

    Science.gov (United States)

    Haston, Elspeth; Cubey, Robert; Pullan, Martin; Atkins, Hannah; Harris, David J

    2012-01-01

    Digitisation programmes in many institutes frequently involve disparate and irregular funding, diverse selection criteria and scope, with different members of staff managing and operating the processes. These factors have influenced the decision at the Royal Botanic Garden Edinburgh to develop an integrated workflow for the digitisation of herbarium specimens which is modular and scalable, enabling a single overall workflow to be used for all digitisation projects. This integrated workflow comprises three principal elements: a specimen workflow, a data workflow and an image workflow. The specimen workflow is strongly linked to curatorial processes which will impact on the prioritisation, selection and preparation of the specimens. The importance of including a conservation element within the digitisation workflow is highlighted. The data workflow includes the concept of three main categories of collection data: label data, curatorial data and supplementary data. It is shown that each category of data has its own properties which influence the timing of data capture within the workflow. Software has been developed for the rapid capture of curatorial data, and optical character recognition (OCR) software is being used to increase the efficiency of capturing label data and supplementary data. The large number and size of the images has necessitated the inclusion of automated systems within the image workflow.

  18. Summer Student Report - AV Workflow

    CERN Document Server

    Abramson, Jessie

    2014-01-01

    The AV Workflow is a web application which allows CERN users to publish, update and delete videos from CDS. During my summer internship I implemented the backend of the new version of the AV Workflow in Python using the Django framework.

  19. Dynamical Orders of Decentralized H-infinity Controllers

    DEFF Research Database (Denmark)

    Stoustrup, Jakob; Niemann, Hans Henrik

    1996-01-01

    The problem of decentralized control is addressed, i.e. the problem of designing a controller where each control input is allowed to use only some of the measurements. It is shown that for such problems there does not always exist a sequence of controllers of bounded order which obtains near optimal cont...

  20. Women's Political Representation and Participation in Decentralized ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Huairou Commission User

    • facilitate people's participation in national development through ensuring sound local-level politics.
    • RC evolved into local councils, which then led to the implementation of decentralization through the Local Government Act (1997).
    • This policy has provided opportunities for women to participate in local leadership from.

  1. Of decentralization of public power Ukrainian land that belonged to Lithuanian (XIII – the early XVII century

    Directory of Open Access Journals (Sweden)

    C. V. Manuilova

    2016-06-01

    A comprehensive crisis in Ukraine and the continued military confrontation in the Donbass have demonstrated the urgent need to establish effective governance, which implies decentralization of public power. Note that the International Monetary Fund and the United Nations Development Programme insist on the implementation of decentralization of power in Ukraine; the transfer of authority to the regions and the decentralization of power in Ukraine are also among the points of the Minsk agreements and of Ukraine's obligations to the EU. The article deals with the topical issue of the features of decentralization of public power in the Ukrainian lands in the XIII - early XVII century. The importance of the topic is due to the need to study the historical experience of implementing decentralization: the success of reforms depends largely on historical experience and on the features of the decentralization of public power in the past. The development of local government in the Ukrainian lands that were part of the Lithuanian state is characterized. The purpose of the article is to clarify the characteristics of decentralization of public authority in the Ukrainian lands that were part of the Lithuanian state during the XIII - early XVII century. To address this goal, we outline the decentralization of public power in the state; analyze the competence of local government in the Ukrainian lands that belonged to the Lithuanian state; and determine how close the power was to the people. The level of decentralization of public power in the Grand Duchy of Lithuania in the XIII - early XVII century was high. It was found that Lithuania had not established a centralized state. It is noted that the Council of the nobility limited princely power, and that deputies delegated from different parts of the Lithuanian state took part in the decisions of the nobility Council. Clarified are the facts that confirm the existence of decentralization of public power in Lithuania: the functioning of local

  2. Remote handling of decentralized power generation plants; Fernwirken von dezentralen Energieerzeugungsanlagen

    Energy Technology Data Exchange (ETDEWEB)

    Conrad, Michael [IDS GmbH, Ettlingen (Germany). Geschaeftsbereich Entwicklung-Prozessautomatisierung; Thomas, Ralf [IDS GmbH, Ettlingen (Germany). Bereich Business Development und Marketing

    2011-05-15

    The increasing number of decentralized power generation systems requires new grid solutions, i.e. so-called smart grids. One important function is the monitoring and control of, e.g., decentralized PV, wind power and cogeneration systems. The data interfaces used are highly diverse and as a rule are taken from measuring and automation technology, i.e. they must be adapted to the data models and transmission procedures of remote control and guidance systems. A compact protocol gateway enables standardized control and diagnosis.

  3. Parametric Room Acoustic workflows with real-time acoustic simulation

    DEFF Research Database (Denmark)

    Parigi, Dario

    2017-01-01

    The paper investigates and assesses the opportunities that real-time acoustic simulation offers to engage in parametric acoustic workflows and to influence architectural designs from the early design stages.

  4. Exploring the impact of an automated prescription-filling device on community pharmacy technician workflow

    Science.gov (United States)

    Walsh, Kristin E.; Chui, Michelle Anne; Kieser, Mara A.; Williams, Staci M.; Sutter, Susan L.; Sutter, John G.

    2012-01-01

    Objective To explore community pharmacy technician workflow change after implementation of an automated robotic prescription-filling device. Methods At an independent community pharmacy in rural Mayville, WI, pharmacy technicians were observed before and 3 months after installation of an automated robotic prescription-filling device. The main outcome measures were sequences and timing of technician workflow steps, workflow interruptions, automation surprises, and workarounds. Results Of the 77 and 80 observations made before and 3 months after robot installation, respectively, 17 different workflow sequences were observed before installation and 38 after installation. Average prescription filling time was reduced by 40 seconds per prescription with use of the robot. Workflow interruptions per observation increased from 1.49 to 1.79 (P = 0.11), and workarounds increased from 10% to 36% after robot use. Conclusion Although automated prescription-filling devices can increase efficiency, workflow interruptions and workarounds may negate that efficiency. Assessing changes in workflow and sequencing of tasks that may result from the use of automation can help uncover opportunities for workflow policy and procedure redesign. PMID:21896459

  5. From Paper Based Clinical Practice Guidelines to Declarative Workflow Management

    DEFF Research Database (Denmark)

    Lyng, Karen Marie; Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2009-01-01

    a sub workflow can be described in a declarative workflow management system: the Resultmaker Online Consultant (ROC). The example demonstrates that declarative primitives allow one to naturally extend the paper-based flowchart to an executable model without introducing a complex cyclic control flow graph.

  6. Workflow interruptions, cognitive failure and near-accidents in health care.

    Science.gov (United States)

    Elfering, Achim; Grebner, Simone; Ebener, Corinne

    2015-01-01

    Errors are frequent in health care. A specific model was tested that affirms failure in cognitive action regulation to mediate the influence of nurses' workflow interruptions and safety conscientiousness on near-accidents in health care. One hundred and sixty-five nurses from seven Swiss hospitals participated in a questionnaire survey. Structural equation modelling confirmed the hypothesised mediation model. Cognitive failure in action regulation significantly mediated the influence of workflow interruptions on near-accidents (p < .05). The influence of safety conscientiousness on near-accidents via cognitive failure in action regulation was also significant (p < .05). Compliance with safety regulations was related to near-accidents; moreover, cognitive failure mediated the association between compliance and near-accidents (p < .05). Contrary to expectations, compliance with safety regulations was not related to workflow interruptions. Workflow interruptions caused by colleagues, patients and organisational constraints are likely to trigger errors in nursing. Work redesign is recommended to reduce cognitive failure and improve the safety of nurses and patients.

  7. Engaging Social Capital for Decentralized Urban Stormwater Management

    Science.gov (United States)

    Decentralized approaches to urban stormwater management, whereby installations of green infrastructure (e.g., rain gardens, bioswales, and constructed wetlands) are dispersed throughout a management area, are cost-effective solutions with co-benefits beyond water abatement. Inste...

  8. Distributing Workflows over a Ubiquitous P2P Network

    Directory of Open Access Journals (Sweden)

    Eddie Al-Shakarchi

    2007-01-01

    This paper discusses issues in the distribution of bundled workflows across ubiquitous peer-to-peer networks for the application of music information retrieval. The underlying motivation for this work is provided by the DART project, which aims to develop a novel music recommendation system by gathering statistical data using collaborative filtering techniques and the analysis of the audio itself, in order to create a reliable and comprehensive database of the music that people own and listen to. To achieve this, the DART scientists creating the algorithms need the ability to distribute the Triana workflows they create, representing the analysis to be performed, across the network on a regular basis (perhaps even daily) in order to update the network as a whole with new workflows to be executed for the analysis. DART uses a similar approach to BOINC but differs in that the workers receive input data in the form of a bundled Triana workflow, which is executed in order to process any MP3 files that they own on their machine. Once analysed, the results are returned to DART's distributed database, which collects and aggregates the resulting information. DART employs the use of package repositories to decentralise the distribution of such workflow bundles, and this approach is validated in this paper through simulations showing that suitable scalability is maintained as the number of participants increases. The results clearly illustrate the effectiveness of the approach.

  9. Centralization and Decentralization: An International Viewpoint on an American Dilemma. A Special CASEA Report.

    Science.gov (United States)

    Walker, William G.

    This report outlines the history of the centralization-decentralization dilemma in the governance of organizations, discusses two types of centralization-decentralization continua, and suggests further research. The first type of continuum discussed -- the traditional American -- refers to decisionmaking in the areas of public debate and partisan…

  10. Enforcement and Environmental Quality in a Decentralized Emission Trading System

    Energy Technology Data Exchange (ETDEWEB)

    D'Amato, Alessio (Univ. of Rome 'Tor Vergata', Rome (Italy)); Valentini, Edilio (Univ. G. D'Annunzio di Chieti-Pescara, DEST, Fac. di Economia, Pescara (Italy))

    2008-07-01

    This paper addresses the issue of whether the powers of monitoring compliance and allocating allowances under emissions trading within an economic union should be centralized or delegated to single states. To this end, we develop a two-stage game played by two governments, which choose allowances and monitoring effort to achieve full compliance, and their respective polluting industries. We show that a cost advantage in favor of national states is not sufficient to justify decentralization. Nevertheless, a cost differential in monitoring violations can imply lower emissions and greater welfare under a decentralized institutional setting than under a centralized one.

  11. Fiscal Decentralization and Disparity of Access to Primary Education in Indonesia

    Directory of Open Access Journals (Sweden)

    Shinta Doriza

    2013-12-01

    In education, one crucial issue of development is the disparity of access to primary education. Using a database of 440 regions over 2005-2009, this study aims to analyze the impact of fiscal decentralization in reducing the disparity of primary education enrolment in Indonesia. Three factors were included, i.e. fiscal decentralization, socioeconomic factors and regional characteristics. The result of panel data estimation using a fixed-effect approach is that DAK for Education, DAK Non-Education, and PAD have a significant impact in reducing education access disparity, along with poverty and regional characteristics such as the Java/non-Java distinction. At the education level, other variables were also found significant, including the education of the society and regional characteristics such as proliferated/non-proliferated regions. In general, the evidence shows that fiscal decentralization improves equality of education access, but several efforts need to be made to optimize the equalization of primary education access in Indonesia.

  12. [Health care reform, decentralization, prevention and control of vector-borne diseases].

    Science.gov (United States)

    Schmunis, G A; Dias, J C

    2000-01-01

    Economic policies are changing Latin American health programs, particularly promoting decentralization. Numerous difficulties thus arise for the control of endemic diseases, since such activities traditionally depend on vertical and centralized structures. Theoretical arguments in favor of decentralization notwithstanding, no such tradition exists at the county level. The lack of program expertise at peripheral levels, intensive staff turnover, and even corruption are additional difficulties. Hence, the simple bureaucratic transfer of activities from the Federal to county level is often irresponsible. The loss of priority for control of endemic diseases in Latin America may mean the inexorable extinction of traditional control services. Malaria, dengue fever, and Chagas disease programs are examples of the loss of expertise and effectiveness in Latin America. A better strategy for responsible decentralization is required. In particular, a shared transition involving all governmental levels is desirable to effectively modernize programs.

  13. Flexible Early Warning Systems with Workflows and Decision Tables

    Science.gov (United States)

    Riedel, F.; Chaves, F.; Zeiner, H.

    2012-04-01

    Decision support systems that facilitate communication and collaboration are an essential part of early warning systems and systems for crisis management. Often official policies specify how different organizations collaborate and what information is communicated to whom. For early warning systems it is crucial that information is exchanged dynamically in a timely manner and all participants get exactly the information they need to fulfil their role in the crisis management process. Information technology obviously lends itself to automating parts of the process. We have experienced, however, that in current operational systems the information logistics processes are hard-coded, even though they are subject to change. In addition, systems are tailored to the policies and requirements of a certain organization, and changes can require major software refactoring. We seek to develop a system that can be deployed and adapted to multiple organizations with different dynamic runtime policies. A major requirement for such a system is that changes can be applied locally without affecting larger parts of the system. In addition to the flexibility regarding changes in policies and processes, the system needs to be able to evolve; when new information sources become available, it should be possible to integrate and use these in the decision process. In general, this kind of flexibility comes with a significant increase in complexity. This implies that only IT professionals can maintain a system that can be reconfigured and adapted; end-users are unable to utilise the provided flexibility. In the business world similar problems arise and previous work suggested using business process management systems (BPMS) or workflow management systems (WfMS) to guide and automate early warning processes or crisis management plans. However, the usability and flexibility of current WfMS are limited, because current notations and user interfaces are still not suitable for end-users, and workflows
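
    To make the workflow-plus-decision-table combination concrete, here is a minimal sketch of a decision-table evaluator in which the first rule whose conditions all match the situation fires (the rules are invented examples, not policies from any operational system):

      RULES = [
          ({"hazard": "flood", "level": "severe"},
           ["alert_population", "notify_civil_protection"]),
          ({"hazard": "flood"},
           ["notify_water_authority"]),
          ({}, ["log_event"]),  # default rule: always matches
      ]

      def decide(situation):
          for conditions, actions in RULES:
              if all(situation.get(k) == v for k, v in conditions.items()):
                  return actions
          return []

      print(decide({"hazard": "flood", "level": "severe"}))
      # -> ['alert_population', 'notify_civil_protection']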

  14. Decentralized Feedback Controllers for Exponential Stabilization of Hybrid Periodic Orbits: Application to Robotic Walking*

    Science.gov (United States)

    Hamed, Kaveh Akbari; Gregg, Robert D.

    2016-01-01

    This paper presents a systematic algorithm to design time-invariant decentralized feedback controllers to exponentially stabilize periodic orbits for a class of hybrid dynamical systems arising from bipedal walking. The algorithm assumes a class of parameterized and nonlinear decentralized feedback controllers which coordinate lower-dimensional hybrid subsystems based on a common phasing variable. The exponential stabilization problem is translated into an iterative sequence of optimization problems involving bilinear and linear matrix inequalities, which can be easily solved with available software packages. A set of sufficient conditions for the convergence of the iterative algorithm to a stabilizing decentralized feedback control solution is presented. The power of the algorithm is demonstrated by designing a set of local nonlinear controllers that cooperatively produce stable walking for a 3D autonomous biped with 9 degrees of freedom, 3 degrees of underactuation, and a decentralization scheme motivated by amputee locomotion with a transpelvic prosthetic leg. PMID:27990059
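
    The paper's iterative BMI algorithm is beyond a short snippet, but the core idea of enforcing a decentralized structure through matrix inequalities can be illustrated with a classical (and more conservative) convex relaxation: forcing the Lyapunov and gain variables to be block-diagonal turns decentralized state-feedback design into a single LMI. The toy plant below is invented and is not the paper's biped model.

        # Hedged sketch: decentralized state feedback via an LMI with
        # block-diagonal variables (a standard relaxation, not the paper's
        # BMI iteration). Requires numpy and cvxpy.
        import numpy as np
        import cvxpy as cp

        # Two weakly coupled, individually unstable 2-state subsystems.
        A = np.array([[0.0, 1.0, 0.1, 0.0],
                      [2.0, -1.0, 0.0, 0.1],
                      [0.1, 0.0, 0.0, 1.0],
                      [0.0, 0.1, 2.0, -1.0]])
        B = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 0.0], [0.0, 1.0]])

        # Block-diagonal Q and Y make K = Y Q^{-1} block-diagonal, i.e.
        # each controller uses only its own subsystem's states.
        Q1 = cp.Variable((2, 2), symmetric=True)
        Q2 = cp.Variable((2, 2), symmetric=True)
        Y1 = cp.Variable((1, 2))
        Y2 = cp.Variable((1, 2))
        Z2, z2 = np.zeros((2, 2)), np.zeros((1, 2))
        Q = cp.bmat([[Q1, Z2], [Z2, Q2]])
        Q = (Q + Q.T) / 2                    # make symmetry explicit
        Y = cp.bmat([[Y1, z2], [z2, Y2]])

        # Lyapunov LMI: A Q + Q A' + B Y + Y' B' < 0 with Q > 0.
        M = A @ Q + Q @ A.T + B @ Y + Y.T @ B.T
        M = (M + M.T) / 2
        prob = cp.Problem(cp.Minimize(0),
                          [Q >> 1e-6 * np.eye(4), M << -1e-6 * np.eye(4)])
        prob.solve(solver=cp.SCS)

        K = Y.value @ np.linalg.inv(Q.value)      # u = K x, block-diagonal
        # Closed-loop eigenvalues (should have negative real parts).
        print(np.linalg.eigvals(A + B @ K).real)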

  15. Fully decentralized control of a soft-bodied robot inspired by true slime mold.

    Science.gov (United States)

    Umedachi, Takuya; Takeda, Koichi; Nakagaki, Toshiyuki; Kobayashi, Ryo; Ishiguro, Akio

    2010-03-01

    Animals exhibit astoundingly adaptive and supple locomotion under real-world constraints. In order to endow robots with similar capabilities, we must implement many degrees of freedom, equivalent to animals', into the robots' bodies. For taming many degrees of freedom, the concept of autonomous decentralized control plays a pivotal role. However, a systematic way of designing such autonomous decentralized control systems is still missing. Aiming at understanding the principles that underlie animals' locomotion, we have focused on the true slime mold, a primitive living organism, and extracted a design scheme for an autonomous decentralized control system. In order to validate this design scheme, this article presents a soft-bodied amoeboid robot inspired by the true slime mold. Significant features of this robot are twofold: (1) the robot has a truly soft and deformable body built from real-time tunable springs and protoplasm, the former serving as the outer skin of the body and the latter satisfying the law of conservation of mass; and (2) fully decentralized control using coupled oscillators with a completely local sensory feedback mechanism, realized by exploiting the long-distance physical interaction between body parts that stems from the law of conservation of protoplasmic mass. Simulation results show that this robot exhibits highly supple and adaptive locomotion without relying on any hierarchical structure. The results obtained are expected to shed new light on design methodology for autonomous decentralized control systems.
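
    A minimal sketch of the control principle (not the authors' exact model): phase oscillators whose only coupling is a globally conserved quantity, standing in for the protoplasmic mass, with each unit feeding back only its locally sensed pressure. All parameters below are invented.

        # Coupled phase oscillators with purely local feedback; the units
        # interact only through a conserved "protoplasm" budget.
        import numpy as np

        N = 8                       # body segments around the robot
        omega = 1.0                 # intrinsic frequency (rad per unit time)
        sigma = 0.8                 # local feedback gain
        dt = 0.05
        rng = np.random.default_rng(0)
        phi = rng.uniform(0.0, 2.0 * np.pi, N)

        for _ in range(2000):
            radial = np.cos(phi)    # contraction (+) / expansion (-) per unit
            # Conservation of mass: total volume is fixed, so the average
            # contraction shows up everywhere as an internal pressure.
            pressure = radial.mean()
            # Local rule: each unit senses only the pressure at its own
            # location and advances or delays its own phase accordingly.
            phi += dt * (omega + sigma * pressure * np.sin(phi))

        # Inspect the final phase distribution.
        print(np.round(np.sort(phi % (2.0 * np.pi)), 2))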

  16. Automation of Flexible Migration Workflows

    Directory of Open Access Journals (Sweden)

    Dirk von Suchodoletz

    2011-03-01

    Full Text Available Many digital preservation scenarios are based on the migration strategy, which itself is heavily tool-dependent. For popular, well-defined and often open file formats – e.g., digital images such as PNG, GIF, JPEG – a wide range of tools exists. Migration workflows become more difficult with proprietary formats, as used by the many text processing applications that have appeared over the last two decades. If a certain file format cannot be rendered with current software, emulation of the original environment remains a valid option. For instance, within the original Lotus AmiPro or Word Perfect, it is not a problem to save an object of this type as ASCII text or Rich Text Format. In specific environments it is even possible to send the file to a virtual printer, thereby producing a PDF as migration output. Such manual migration tasks typically involve human interaction, which may be feasible for a small number of objects, but not for larger batches of files. We propose a novel approach using a software-operated VNC abstraction layer in order to replace human interaction with machine interaction. Emulators or virtualization tools equipped with a VNC interface are very well suited for this approach, but screen, keyboard and mouse interaction is just part of the setup. Digital objects also need to be transferred into the original environment and extracted again after processing. Nevertheless, the complexity of this new generation of migration services is rising quickly; a preservation workflow now comprises not only the migration tool itself, but a complete software and virtual hardware stack, with recorded workflows linked to every supported migration scenario. Thus the requirements of OAIS management must include proper software archiving, emulator selection, and system image and recording handling. The concept of view-paths could help either to automatically determine the proper pre-configured virtual environment or to set up system
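
    One way to realize such machine interaction over VNC is sketched below with the Python vncdotool library; the host, file name, key sequences and reference screenshot are invented, and a real migration workflow would wrap steps like these in the recording and replay infrastructure described above.

        # Hedged sketch: driving a legacy application inside an emulator
        # through its VNC interface (vncdotool is one suitable library).
        import time
        from vncdotool import api

        client = api.connect("emulator-host::5900", password=None)

        # Open the legacy document in the emulated word processor ...
        client.keyPress("ctrl-o")
        for ch in "report.sam":          # hypothetical legacy file name
            client.keyPress(ch)
        client.keyPress("enter")
        time.sleep(5)                    # give the application time to render

        # ... then "print" it to the virtual PDF printer.
        client.keyPress("ctrl-p")
        client.keyPress("enter")

        # Screen comparison verifies the expected dialog actually appeared,
        # replacing the human who would otherwise watch the screen.
        client.expectScreen("print_dialog_reference.png", maxrms=10)
        client.disconnect()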

  17. Considering Time in Orthophotography Production: from a General Workflow to a Shortened Workflow for a Faster Disaster Response

    Science.gov (United States)

    Lucas, G.

    2015-08-01

    This article deals with the production time of orthophoto imagery from a medium-size digital frame camera. The workflow examination covers two main parts: data acquisition and post-processing. The objectives of the research are fourfold: 1/ gathering time references for the most important steps of orthophoto production (it turned out that literature is missing on this topic); these figures are used later for estimating total production time; 2/ identifying levers for reducing orthophoto production time; 3/ building a simplified production workflow for emergency response, less demanding with regard to accuracy and faster, and comparing it to a classical workflow; 4/ providing methodical elements for estimating the production time of a custom project. In the data acquisition part, a comprehensive review lists and describes all the factors that may affect acquisition efficiency. Using a simulation with different variables (average line length, duration of the turns, flight speed), their effect on acquisition efficiency is quantitatively examined. Regarding post-processing, the time reference figures were collected from the processing of a 1000-frame case study with 15 cm GSD covering a rectangular area of 447 km2; the time required to achieve each step of the production was recorded. When several technical options are possible, each one is tested and its time documented, so that all alternatives are available. Based on a technical choice for the workflow and using the compiled time references of the elementary steps, a total time is calculated for the post-processing of the 1000 frames. Two scenarios are compared with regard to time and accuracy. The first follows "normal" practice, comprising triangulation, orthorectification and advanced mosaicking methods (feature detection, seam line editing and seam applicator); the second is simplified and makes compromises on positional accuracy (using direct geo-referencing) and seam line preparation in order to achieve
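
    The acquisition-efficiency simulation varies exactly the kind of quantities shown in this back-of-the-envelope sketch; the numbers are illustrative, not the study's.

        # Fraction of total flight time spent actually acquiring imagery:
        # long flight lines amortize the unproductive turns between lines.
        def acquisition_efficiency(line_length_km, turn_time_s, speed_ms):
            line_time_s = line_length_km * 1000.0 / speed_ms
            return line_time_s / (line_time_s + turn_time_s)

        for length in (2, 5, 10, 20):                            # km
            e = acquisition_efficiency(length, turn_time_s=120, speed_ms=60)
            print(f"{length:>3} km lines -> {e:.0%} productive time")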

  18. Planning bioinformatics workflows using an expert system

    Science.gov (United States)

    Chen, Xiaoling; Chang, Jeffrey T.

    2017-01-01

    Abstract Motivation: Bioinformatic analyses are becoming formidably complex due to the increasing number of steps required to process the data, as well as the proliferation of methods that can be used in each step. To alleviate this difficulty, pipelines are commonly employed. However, pipelines are typically implemented to automate a specific analysis, and are thus difficult to use for exploratory analyses requiring systematic changes to the software or parameters used. Results: To automate the development of pipelines, we have investigated expert systems. We created the Bioinformatics ExperT SYstem (BETSY), which includes a knowledge base where the capabilities of bioinformatics software are explicitly and formally encoded. BETSY is a backward-chaining rule-based expert system comprising a data model that can capture the richness of biological data and an inference engine that reasons on the knowledge base to produce workflows. Currently, the knowledge base is populated with rules to analyze microarray and next generation sequencing data. We evaluated BETSY and found that it could generate workflows that reproduce and go beyond previously published bioinformatics results. Finally, a meta-investigation of the workflows generated from the knowledge base produced a quantitative measure of the technical burden imposed by each step of bioinformatics analyses, revealing the large number of steps devoted to the pre-processing of data. In sum, an expert system approach can facilitate exploratory bioinformatic analysis by automating the development of workflows, a task that requires significant domain expertise. Availability and Implementation: https://github.com/jefftc/changlab Contact: jeffrey.t.chang@uth.tmc.edu PMID:28052928
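
    The backward-chaining idea behind BETSY can be shown in miniature: rules declare what a tool consumes and produces, and the planner chains backwards from the requested result to the data at hand. The rules and tool names below are invented for illustration, not BETSY's actual knowledge base.

        # Toy backward-chaining workflow planner.
        RULES = [
            # (produces, consumes, tool)
            ("aligned_reads", "fastq", "bwa-mem"),
            ("sorted_bam", "aligned_reads", "samtools sort"),
            ("variants_vcf", "sorted_bam", "gatk HaplotypeCaller"),
        ]
        AVAILABLE = {"fastq"}

        def plan(goal):
            """Return an ordered list of tool invocations yielding `goal`."""
            if goal in AVAILABLE:
                return []
            for produces, consumes, tool in RULES:
                if produces == goal:
                    upstream = plan(consumes)
                    if upstream is not None:
                        return upstream + [tool]
            return None  # no rule chain reaches the goal

        print(plan("variants_vcf"))
        # -> ['bwa-mem', 'samtools sort', 'gatk HaplotypeCaller']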

  19. Developing integrated workflows for the digitisation of herbarium specimens using a modular and scalable approach

    Directory of Open Access Journals (Sweden)

    Elspeth Haston

    2012-07-01

    Full Text Available Digitisation programmes in many institutes frequently involve disparate and irregular funding, diverse selection criteria and scope, with different members of staff managing and operating the processes. These factors influenced the decision at the Royal Botanic Garden Edinburgh to develop an integrated workflow for the digitisation of herbarium specimens which is modular and scalable, enabling a single overall workflow to be used for all digitisation projects. This integrated workflow comprises three principal elements: a specimen workflow, a data workflow and an image workflow. The specimen workflow is strongly linked to curatorial processes, which impact on the prioritisation, selection and preparation of the specimens. The importance of including a conservation element within the digitisation workflow is highlighted. The data workflow includes the concept of three main categories of collection data: label data, curatorial data and supplementary data. Each category of data has its own properties, which influence the timing of its capture within the workflow. Software has been developed for the rapid capture of curatorial data, and optical character recognition (OCR) software is being used to increase the efficiency of capturing label data and supplementary data. The large number and size of the images has necessitated the inclusion of automated systems within the image workflow.
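
    The modular principle can be sketched as a pipeline of interchangeable stages; the stage names echo the workflow elements described above, and the record fields are invented for illustration.

        # Minimal sketch of a modular digitisation pipeline; each project
        # composes only the stages it needs, in the order it needs them.
        from typing import Callable, Dict, List

        Stage = Callable[[Dict], Dict]

        def curatorial_capture(rec: Dict) -> Dict:
            rec["curatorial"] = {"cabinet": "cab-12"}   # rapid keyed entry
            return rec

        def imaging(rec: Dict) -> Dict:
            rec["image"] = rec["barcode"] + ".tif"      # camera station output
            return rec

        def label_ocr(rec: Dict) -> Dict:
            rec["label_text"] = "pending OCR review"    # OCR, checked later
            return rec

        def run(pipeline: List[Stage], rec: Dict) -> Dict:
            for stage in pipeline:
                rec = stage(rec)
            return rec

        pipeline = [curatorial_capture, imaging, label_ocr]
        print(run(pipeline, {"barcode": "E00123456"}))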

  20. Data mining workflow templates for intelligent discovery assistance in RapidMiner

    OpenAIRE

    Kietz, J U; Serban, F; Bernstein, A; Fischer, S

    2010-01-01

    Knowledge Discovery in Databases (KDD) has evolved during the last years and reached a mature stage, offering plenty of operators to solve complex tasks. User support for building workflows, in contrast, has not increased proportionally. The large number of operators available in current KDD systems makes it difficult for users to successfully analyze data. Moreover, workflows easily contain a large number of operators, and parts of the workflows are applied several times; thus it is hard for us...