WorldWideScience

Sample records for cambafx workflow design

  1. CamBAfx: workflow design, implementation and application for neuroimaging

    Directory of Open Access Journals (Sweden)

    Cinly Ooi

    2009-08-01

Full Text Available CamBAfx is a workflow application designed for both researchers who use workflows to process data (consumers) and those who design them (designers). It provides a front-end (user interface) optimized for data processing designed in a way familiar to consumers. The back-end uses a pipeline model to represent workflows since this is a common and useful metaphor used by designers and is easy to manipulate compared to other representations like programming scripts. As an Eclipse Rich Client Platform application, CamBAfx's pipelines and functions can be bundled with the software or downloaded post-installation. The user interface contains all the workflow facilities expected by consumers. Using the Eclipse Extension Mechanism, designers are encouraged to customize CamBAfx for their own pipelines. CamBAfx wraps a workflow facility around neuroinformatics software without modification. CamBAfx's design, licensing and Eclipse Branding Mechanism allow it to be used as the user interface for other software, facilitating exchange of innovative computational tools between originating labs.
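
The pipeline metaphor this abstract describes can be sketched very simply: a workflow is an ordered list of processing stages, each consuming the previous stage's output. The stage names and functions below are illustrative inventions, not part of CamBAfx.

```python
from typing import Callable, List

def run_pipeline(stages: List[Callable], data):
    """Apply each stage to the output of the previous one."""
    for stage in stages:
        data = stage(data)
    return data

# Hypothetical neuroimaging-flavoured stages operating on a list of values.
def normalize(values):
    peak = max(values)
    return [v / peak for v in values]

def threshold(values, cutoff=0.5):
    return [v for v in values if v >= cutoff]

result = run_pipeline([normalize, threshold], [2.0, 8.0, 10.0, 4.0])
```

Because each stage is just a callable, new stages can be registered post-installation, which is the property the Eclipse Extension Mechanism provides in the real application.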

  2. CamBAfx: workflow design, implementation and application for neuroimaging

    OpenAIRE

    Cinly Ooi; Bullmore, Edward T; Alle-Meije Wink; Levent Sendur; Anna Barnes; Sophie Achard; John Aspden; Sanja Abbott; Shigang Yue; Manfred Kitzbichler; David Meunier; Voichita Maxim; Raymond Salvador; Julian Henty; Roger Tait

    2009-01-01

    CamBAfx is a workflow application designed for both researchers who use workflows to process data (consumers) and those who design them (designers). It provides a front-end (user interface) optimized for data processing designed in a way familiar to consumers. The back-end uses a pipeline model to represent workflows since this is a common and useful metaphor used by designers and is easy to manipulate compared to other representations like programming scripts. As an Eclipse Rich Client Platf...

  3. CamBAfx: Workflow Design, Implementation and Application for Neuroimaging

    OpenAIRE

    Ooi, Cinly; Bullmore, Edward T; Wink, Alle-Meije; Sendur, Levent; Barnes, Anna; Achard, Sophie; Aspden, John; Abbott, Sanja; Yue, Shigang; Kitzbichler, Manfred; Meunier, David; Maxim, Voichita; Salvador, Raymond; Henty, Julian; Tait, Roger

    2009-01-01

    CamBAfx is a workflow application designed for both researchers who use workflows to process data (consumers) and those who design them (designers). It provides a front-end (user interface) optimized for data processing designed in a way familiar to consumers. The back-end uses a pipeline model to represent workflows since this is a common and useful metaphor used by designers and is easy to manipulate compared to other representations like programming scripts. As an Eclipse Rich Client Platf...

  4. Towards Automated Design, Analysis and Optimization of Declarative Curation Workflows

    Directory of Open Access Journals (Sweden)

    Tianhong Song

    2014-10-01

    Full Text Available Data curation is increasingly important. Our previous work on a Kepler curation package has demonstrated advantages that come from automating data curation pipelines by using workflow systems. However, manually designed curation workflows can be error-prone and inefficient due to a lack of user understanding of the workflow system, misuse of actors, or human error. Correcting problematic workflows is often very time-consuming. A more proactive workflow system can help users avoid such pitfalls. For example, static analysis before execution can be used to detect the potential problems in a workflow and help the user to improve workflow design. In this paper, we propose a declarative workflow approach that supports semi-automated workflow design, analysis and optimization. We show how the workflow design engine helps users to construct data curation workflows, how the workflow analysis engine detects different design problems of workflows and how workflows can be optimized by exploiting parallelism.
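
The static analysis the paper motivates can be illustrated with a minimal pre-execution check: verify that a workflow's dependency graph is acyclic before running it. The graph representation and actor names below are invented for illustration and are not Kepler's API.

```python
def find_cycle(deps):
    """deps maps each actor to the actors it depends on.
    Return True if the dependency graph contains a cycle."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {node: WHITE for node in deps}

    def visit(node):
        color[node] = GRAY              # node is on the current DFS path
        for dep in deps.get(node, []):
            if color.get(dep, WHITE) == GRAY:
                return True             # back edge: cycle found
            if color.get(dep, WHITE) == WHITE and visit(dep):
                return True
        color[node] = BLACK             # fully explored
        return False

    return any(color[n] == WHITE and visit(n) for n in deps)

ok_workflow = {"clean": [], "validate": ["clean"], "export": ["validate"]}
bad_workflow = {"a": ["b"], "b": ["a"]}
```

A workflow analysis engine would run checks like this (plus type and actor-usage checks) before execution, so design errors are reported proactively rather than discovered at run time.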

  5. The design of cloud workflow systems

    CERN Document Server

    Liu, Xiao; Zhang, Gaofeng

    2011-01-01

Cloud computing is the latest market-oriented computing paradigm which brings software design and development into a new era characterized by "XaaS", i.e. everything as a service. Cloud workflows, as typical software applications in the cloud, are composed of a set of partially ordered cloud software services to achieve specific goals. However, due to the low QoS (quality of service) nature of the cloud environment, the design of workflow systems in the cloud becomes a challenging issue for the delivery of high quality cloud workflow applications. To address such an issue, this book presents

  6. Towards an Intelligent Workflow Designer based on the Reuse of Workflow Patterns

    OpenAIRE

    Iochpe, C.; Chiao, C.; Hess, G; Nascimento, G.S.; Thom, L.H.; Reichert, M.U.

    2007-01-01

To implement process-aware information systems, we need sophisticated methods and concepts for designing and modeling processes. Recently, research on workflow patterns has emerged in order to increase the reuse of recurring workflow structures. However, current workflow modeling tools do not provide functionalities that enable users to define, query, and reuse workflow patterns properly. In this paper we gather a suite for both process modeling and normalization based on workflow patte...

  7. Designing Flexible E-Business Workflow Systems

    Directory of Open Access Journals (Sweden)

    Cătălin Silvestru

    2010-01-01

Full Text Available In today’s business environment, organizations must cope with complex interactions between actors, adapt quickly to frequent market changes and be innovative. In this context, integrating knowledge with processes and Business Intelligence is a major step towards improving organization agility. Therefore, traditional environments for workflow design have been adapted to answer the new business models and current requirements in the field of collaborative processes. This paper approaches the design of flexible and dynamic workflow management systems for electronic businesses that can lead to agility.

  8. Towards Automated Design, Analysis and Optimization of Declarative Curation Workflows

    OpenAIRE

    Tianhong Song; Sven Köhler; Bertram Ludäscher; James Hanken; Maureen Kelly; David Lowery; Macklin, James A.; Morris, Paul J.; Morris, Robert A.

    2014-01-01

    Data curation is increasingly important. Our previous work on a Kepler curation package has demonstrated advantages that come from automating data curation pipelines by using workflow systems. However, manually designed curation workflows can be error-prone and inefficient due to a lack of user understanding of the workflow system, misuse of actors, or human error. Correcting problematic workflows is often very time-consuming. A more proactive workflow system can help users avoid such pitfal...

  9. Web-based Collaborative Workflow Design

    OpenAIRE

    Held, Markus

    2010-01-01

In recent years Scientific Workflows have enormously gained importance, while Workflow Management has been a major aspect of enterprise systems since the 1990s. Business processes as well as queries of biological databases are modelled as workflows. In comparison to other programs, workflows contain a very high degree of domain-specific logic, thus rendering a close cooperation of subject matter experts and software engineers inevitable. This dissertation presents concepts and processes for co...

  10. Design, Modelling and Analysis of a Workflow Reconfiguration

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Abouzaid, Faisal; Dragoni, Nicola; Bhattacharyya, Anirban

    This paper describes a case study involving the reconfiguration of an office workflow. We state the requirements on a system implementing the workflow and its reconfiguration, and describe the system’s design in BPMN. We then use an asynchronous pi-calculus and Web.1 to model the design and to...

  11. Computer-Assisted Scientific Workflow Design

    OpenAIRE

    Cerezo N.; Montagnat J.; Blay-Fornarino M.

    2013-01-01

    Workflows are increasingly adopted to describe large-scale data- and compute-intensive processes that can take advantage of today's Distributed Computing Infrastructures. Still, most Scientific Workflow formalisms are notoriously difficult to fully exploit, as they entangle the description of scientific processes and their implementation, blurring the lines between what is done and how it is done as well as between what is and what is not infrastructure-dependent. This work addresses the prob...

  12. Grid-enabled Workflows for Industrial Product Design

    OpenAIRE

    Boniface, M.J.; Ferris, J.; Ghanem, M; Azam, N

    2006-01-01

    This paper presents a generic approach for developing and using Grid-based workflow technology for enabling cross-organizational engineering applications. Using industrial product design examples from the automotive and aerospace industries we highlight the main requirements and challenges addressed by our approach and describe how it can be used for enabling interoperability between heterogeneous workflow engines.

  13. A prototype of workflow management system for construction design projects

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

A great deal of benefit can be achieved if information and processes are integrated within the building design project. This paper aims to establish a prototype workflow management system for construction design projects through the application of workflow technology. The composition and function of the prototype are presented to satisfy the needs of information sharing and process integration. By integrating all subsystems and modules of the prototype, the whole system can deal with design information-flow modeling, emulation and optimization, task planning and distribution, automatic tracking and monitoring, as well as network services, etc. In this way, the collaborative design environment of the building design project is brought into being.

  14. Implementation of the electronic DDA workflow for NSSS system design

    International Nuclear Information System (INIS)

To improve NSSS design quality and productivity, several integrated management systems of nuclear-developed nations, such as Mitsubishi's NUWINGS (Japan), AECL's CANDID (Canada) and Duke Power's (USA), were investigated, and this report studies the system implementation of NSSS design document computerization and the major workflow process of the DDA (Document Distribution for Agreement). On the basis of the requirements of design document computerization, which covered preparation, review, approval and distribution of the engineering documents, the KAERI Engineering Information Management System (KEIMS) was implemented. The major results of this report are the implementation of a GUI panel for input and retrieval of document index information, the setup of an electronic document workflow, and the provision of quality assurance verification by tracing the workflow history. The major effects of NSSS design document computerization are improved efficiency and reliability and engineering cost reduction by means of fast document verification capability and the electronic document transfer system. 2 tabs., 16 figs., 9 refs. (Author)

  15. Design decisions in workflow management and quality of work.

    OpenAIRE

    Waal, B.M.E. de; Batenburg, R.

    2009-01-01

    In this paper, the design and implementation of a workflow management (WFM) system in a large Dutch social insurance organisation is described. The effect of workflow design decisions on the quality of work is explored theoretically and empirically, using the model of Zur Mühlen as a frame of reference. It was found among a total sample of 66 employees that there was no change in the experience of work quality before and after the introduction of the WFM system. There are however, significant...

  16. Modeling, Design, and Implementation of a Cloud Workflow Engine Based on Aneka

    OpenAIRE

    2014-01-01

    This paper presents a Petri net-based model for cloud workflow which plays a key role in industry. Three kinds of parallelisms in cloud workflow are characterized and modeled. Based on the analysis of the modeling, a cloud workflow engine is designed and implemented in Aneka cloud environment. The experimental results validate the effectiveness of our approach of modeling, design, and implementation of cloud workflow.
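
One of the kinds of parallelism such an engine exploits can be sketched as the concurrent dispatch of independent workflow tasks followed by a join. The task function and names below are invented for illustration; this is not the Aneka API.

```python
from concurrent.futures import ThreadPoolExecutor

def task(name, value):
    # Stand-in for a real cloud task; doubles its input.
    return (name, value * 2)

def run_parallel(tasks):
    """Dispatch independent (name, value) tasks concurrently,
    then join and collect results keyed by task name."""
    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = [pool.submit(task, n, v) for n, v in tasks]
        return dict(f.result() for f in futures)

results = run_parallel([("a", 1), ("b", 2), ("c", 3)])
```

In Petri-net terms, the `submit` calls correspond to a transition forking tokens into parallel places, and the `result` collection to the synchronizing join transition.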

  17. The Taverna workflow suite: designing and executing workflows of Web Services on the desktop, web or in the cloud.

    Science.gov (United States)

    Wolstencroft, Katherine; Haines, Robert; Fellows, Donal; Williams, Alan; Withers, David; Owen, Stuart; Soiland-Reyes, Stian; Dunlop, Ian; Nenadic, Aleksandra; Fisher, Paul; Bhagat, Jiten; Belhajjame, Khalid; Bacall, Finn; Hardisty, Alex; Nieva de la Hidalga, Abraham; Balcazar Vargas, Maria P; Sufi, Shoaib; Goble, Carole

    2013-07-01

    The Taverna workflow tool suite (http://www.taverna.org.uk) is designed to combine distributed Web Services and/or local tools into complex analysis pipelines. These pipelines can be executed on local desktop machines or through larger infrastructure (such as supercomputers, Grids or cloud environments), using the Taverna Server. In bioinformatics, Taverna workflows are typically used in the areas of high-throughput omics analyses (for example, proteomics or transcriptomics), or for evidence gathering methods involving text mining or data mining. Through Taverna, scientists have access to several thousand different tools and resources that are freely available from a large range of life science institutions. Once constructed, the workflows are reusable, executable bioinformatics protocols that can be shared, reused and repurposed. A repository of public workflows is available at http://www.myexperiment.org. This article provides an update to the Taverna tool suite, highlighting new features and developments in the workbench and the Taverna Server. PMID:23640334

  18. Designing Collaborative Healthcare Technology for the Acute Care Workflow

    Directory of Open Access Journals (Sweden)

    Michael Gonzales

    2015-10-01

Full Text Available Preventable medical errors in hospitals are the third leading cause of death in the United States. Many of these are caused by poor situational awareness, especially in acute care resuscitation scenarios. While a number of checklists and technological interventions have been developed to reduce cognitive load and improve situational awareness, these tools often do not fit the clinical workflow. To better understand the challenges faced by clinicians in acute care codes, we conducted a qualitative study with interprofessional clinicians at three regional hospitals. Our key findings are: current documentation processes are inadequate (with information recorded on paper towels); reference guides can serve as fixation points, reducing rather than enhancing situational awareness; the physical environment imposes significant constraints on workflow; homegrown solutions are often used to compensate for unstandardized processes; and simulation scenarios do not match real-world practice. We present a number of considerations for collaborative healthcare technology design and discuss the implications of our findings on current work for the development of more effective interventions for acute care resuscitation scenarios.

  19. Effects of the Interactions Between LPS and BIM on Workflow in Two Building Design Projects

    OpenAIRE

    Khan, Sheriz; Tzortzopoulos, Patricia

    2014-01-01

    Variability in design workflow causes delays and undermines the performance of building projects. As lean processes, the Last Planner System (LPS) and Building Information Modeling (BIM) can improve workflow in building projects through features that reduce waste. Since its introduction, BIM has had significant positive influence on workflow in building design projects, but these have been rarely considered in combination with LPS. This paper is part of a postgraduate research focusing on the...

  20. A Multi-Fidelity Workflow to Derive Physics-Based Conceptual Design Methods

    OpenAIRE

    Böhnke, Daniel

    2015-01-01

    The present study developed a multi-fidelity workflow to derive physics-based conceptual design methods from models of higher-fidelity usually employed during preliminary aircraft design. The multi-fidelity workflow consists of a design of experiments, a multi-fidelity loop, and symbolic regression as surrogate modeling technique. Results are presented for conventional and unconventional aircraft configurations.

  1. Incorporating Workflow Interference in Facility Layout Design: The Quartic Assignment Problem

    OpenAIRE

    Wen-Chyuan Chiang; Panagiotis Kouvelis; Timothy L. Urban

    2002-01-01

    Although many authors have noted the importance of minimizing workflow interference in facility layout design, traditional layout research tends to focus on minimizing the distance-based transportation cost. This paper formalizes the concept of workflow interference from a facility layout perspective. A model, formulated as a quartic assignment problem, is developed that explicitly considers the interference of workflow. Optimal and heuristic solution methodologies are developed and evaluated.
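
The quartic structure arises because interference between two flows (i→j) and (k→l) depends on where all four facilities are placed, so the objective sums over four indices. The toy sketch below makes this concrete with a 1-D corridor, made-up flow data, and a hypothetical overlap indicator; it is not the paper's model.

```python
from itertools import permutations

# Workflow volume between pairs of facilities (made-up data).
flows = {(0, 1): 3, (2, 3): 2}

def paths_cross(a, b, c, d):
    """Hypothetical interference indicator: 1 if the straight-line
    paths a->b and c->d on a 1-D corridor overlap."""
    lo1, hi1 = sorted((a, b))
    lo2, hi2 = sorted((c, d))
    return int(max(lo1, lo2) < min(hi1, hi2))

def interference(assignment):
    """assignment[i] is the location assigned to facility i.
    Sum pairwise flow interference -- note the four indices i, j, k, l."""
    total = 0
    for (i, j), f1 in flows.items():
        for (k, l), f2 in flows.items():
            if (i, j) < (k, l):  # count each unordered pair of flows once
                total += f1 * f2 * paths_cross(
                    assignment[i], assignment[j],
                    assignment[k], assignment[l])
    return total

# Brute force over all assignments of 4 facilities to 4 locations.
best = min(permutations(range(4)), key=interference)
```

Brute force works only at toy scale; the paper's point is that real instances need the optimal and heuristic methods it develops.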

  2. Improving design workflow with the Last Planner System: two action research studies

    OpenAIRE

    Khan, Sheriz; Tzortzopoulos, Patricia

    2015-01-01

    Variability in workflow during the design stage of building projects has been widely acknowledged as a problem related to poor planning and control of design tasks and has been identified as a major cause of delay in building projects. The Last Planner system (LPS) of production planning and control helps to create predictable and reliable workflow by enabling the management of the range of relationships, interfaces and deliverables involved in a project. This paper presents results of implem...

  3. BioFlow: a web based workflow management software for design and execution of genomics pipelines

    Science.gov (United States)

    2014-01-01

Background Bioinformatics data analysis is usually done sequentially by chaining together multiple tools. These are created by writing scripts and tracking the inputs and outputs of all stages. Writing such scripts requires programming skills. Executing multiple pipelines in parallel and keeping track of all the generated files is difficult and error prone. Checking results and task completion requires users to remotely log in to their servers and run commands to identify process status. Users would benefit from a web-based tool that allows creation and execution of pipelines remotely. The tool should also keep track of all the files generated and maintain a history of user activities. Results A software tool for building and executing workflows is described here. The individual tools in the workflows can be any command line executable or script. The software has an intuitive mechanism for adding new tools to be used in workflows. It contains a workflow designer where workflows can be created by visually connecting various components. Workflows are executed by job runners. The outputs and the job history are saved. The tool is a web-based software tool and all actions can be performed remotely. Conclusions Users without scripting knowledge can utilize the tool to build pipelines for executing tasks. Pipelines can be modeled as workflows that are reusable. BioFlow enables users to easily add new tools to the database. The workflows can be created and executed remotely. A number of parallel jobs can be easily controlled. Distributed execution is possible by running multiple instances of the application. Any number of tasks can be executed and the output will be stored, making it easy to correlate the outputs to the jobs executed.
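
The core ideas in this abstract — a registry of tools, workflows built by connecting them, and a job history recording every execution — can be sketched as follows. The class and method names are illustrative, not BioFlow's actual API.

```python
class WorkflowEngine:
    def __init__(self):
        self.tools = {}      # tool name -> callable (stand-in for a CLI tool)
        self.history = []    # (tool name, output) per executed step

    def register(self, name, func):
        """Add a new tool to the registry, as BioFlow adds tools to its database."""
        self.tools[name] = func

    def run(self, steps, data):
        """Execute the named tools in order, recording each output."""
        for name in steps:
            data = self.tools[name](data)
            self.history.append((name, data))
        return data

engine = WorkflowEngine()
engine.register("trim", lambda seq: seq.strip("N"))    # toy sequence trimmer
engine.register("length", lambda seq: len(seq))
final = engine.run(["trim", "length"], "NNACGTNN")
```

In the real system each tool wraps a command-line executable and the history is persisted server-side, which is what lets users check job status from a browser instead of logging into the server.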

  4. Integrated Environmental Design and Robotic Fabrication Workflow for Ceramic Shading Systems

    OpenAIRE

    Bechthold, Martin; King, Nathan; Kane, Anthony Owen; Niemasz, Jeffrey; Reinhart, Christoph

    2011-01-01

    The current design practice for high performance, custom facade systems disconnects the initial façade design from the fabrication phase. The early design phases typically involve a series of iterative tests during which the environmental performance of different design variants is verified through simulations or physical measurements. After completing the environmental design, construction and fabrication constraints are incorporated. Time, budget constraints, and workflow incompatibilities ...

  5. Design and Evaluation of Data Annotation Workflows for CAVE-like Virtual Environments.

    Science.gov (United States)

    Pick, Sebastian; Weyers, Benjamin; Hentschel, Bernd; Kuhlen, Torsten W

    2016-04-01

    Data annotation finds increasing use in Virtual Reality applications with the goal to support the data analysis process, such as architectural reviews. In this context, a variety of different annotation systems for application to immersive virtual environments have been presented. While many interesting interaction designs for the data annotation workflow have emerged from them, important details and evaluations are often omitted. In particular, we observe that the process of handling metadata to interactively create and manage complex annotations is often not covered in detail. In this paper, we strive to improve this situation by focusing on the design of data annotation workflows and their evaluation. We propose a workflow design that facilitates the most important annotation operations, i.e., annotation creation, review, and modification. Our workflow design is easily extensible in terms of supported annotation and metadata types as well as interaction techniques, which makes it suitable for a variety of application scenarios. To evaluate it, we have conducted a user study in a CAVE-like virtual environment in which we compared our design to two alternatives in terms of a realistic annotation creation task. Our design obtained good results in terms of task performance and user experience. PMID:26780799

  6. From Design to Production Control Through the Integration of Engineering Data Management and Workflow Management Systems

    CERN Document Server

    Le Goff, J M; Bityukov, S; Estrella, F; Kovács, Z; Le Flour, T; Lieunard, S; McClatchey, R; Murray, S; Organtini, G; Vialle, J P; Bazan, A; Chevenier, G

    1997-01-01

At a time when many companies are under pressure to reduce "times-to-market" the management of product information from the early stages of design through assembly to manufacture and production has become increasingly important. Similarly in the construction of high energy physics devices the collection of (often evolving) engineering data is central to the subsequent physics analysis. Traditionally in industry design engineers have employed Engineering Data Management Systems (also called Product Data Management Systems) to coordinate and control access to documented versions of product designs. However, these systems provide control only at the collaborative design level and are seldom used beyond design. Workflow management systems, on the other hand, are employed in industry to coordinate and support the more complex and repeatable work processes of the production environment. Commercial workflow products cannot support the highly dynamic activities found both in the design stages of product developmen...

  7. Linear CMOS RF power amplifiers a complete design workflow

    CERN Document Server

    Ruiz, Hector Solar

    2013-01-01

The work establishes the design flow for the optimization of linear CMOS power amplifiers from the first steps of the design to the final IC implementation and tests. The authors also focus on design guidelines for the inductor's geometrical characteristics for power applications and cover their measurement and characterization. Additionally, a model is proposed which would facilitate designs in terms of transistor sizing, required inductor quality factors or minimum supply voltage. The model considers limitations that CMOS processes can impose on implementation. The book also provides diffe

  8. Echoes of Semiotically-Based Design in the Development and Testing of a Workflow System

    Directory of Open Access Journals (Sweden)

    Clarisse Sieckenius de Souza

    2001-05-01

Full Text Available Workflow systems are information-intensive task-oriented computer applications that typically involve a considerable number of users playing a wide variety of roles. Since communication, coordination and decision-making processes are essential for such systems, representing, interpreting and negotiating collective meanings are a crucial issue for software design and development processes. In this paper, we report and discuss our experience in implementing Qualitas, a web-based workflow system. Semiotic theory was extensively used to support design decisions and negotiations with users about technological signs. Taking scenarios as a type-sign exchanged throughout the whole process, we could trace the theoretic underpinnings of our experience and draw some revealing conclusions about the product and the process of technologically reified discourse. Although it is present in all information technology applications, this kind of discourse is seldom analyzed by software designers and developers. Our conjecture is that outside semiotic theory, professionals involved with human-computer interaction and software engineering practices have difficulty to coalesce concepts derived from such different disciplines as psychology, anthropology, linguistics and sociology, to name a few. Semiotics, however, can by itself provide a unifying ontological basis for interdisciplinary knowledge, raising issues and proposing alternatives that may help professionals gain insights at lower learning costs. Keywords: semiotic engineering, workflow systems, information-intensive task-oriented systems, scenario based design and development of computer systems, human-computer interaction

  9. A computational workflow for the design of irreversible inhibitors of protein kinases.

    Science.gov (United States)

    Del Rio, Alberto; Sgobba, Miriam; Parenti, Marco Daniele; Degliesposti, Gianluca; Forestiero, Rosetta; Percivalle, Claudia; Conte, Pier Franco; Freccero, Mauro; Rastelli, Giulio

    2010-03-01

    Design of irreversible inhibitors is an emerging and relatively less explored strategy for the design of protein kinase inhibitors. In this paper, we present a computational workflow that was specifically conceived to assist such design. The workflow takes the form of a multi-step procedure that includes: the creation of a database of already known reversible inhibitors of protein kinases, the selection of the most promising scaffolds that bind one or more desired kinase templates, the modification of the scaffolds by introduction of chemically reactive groups (suitable cysteine traps) and the final evaluation of the reversible and irreversible protein-ligand complexes with molecular dynamics simulations and binding free energy predictions. Most of these steps were automated. In order to prove that this is viable, the workflow was tested on a database of known inhibitors of ERK2, a protein kinase possessing a cysteine in the ATP site. The modeled ERK2-ligand complexes and the values of the estimated binding free energies of the putative ligands provide useful indicators of their aptitude to bind reversibly and irreversibly to the protein kinase. Moreover, the computational data are used to rank the ligands according to their computed binding free energies and their ability to bind specific protein residues in the reversible and irreversible complexes, thereby providing a useful decision-making tool for each step of the design. In this work we present the overall procedure and the first proof of concept results. PMID:20306284
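
The final decision-making step described above — ranking candidate ligands by computed binding free energy among those positioned to react with the cysteine — can be sketched with toy data. All names and values below are invented; the real workflow derives them from molecular dynamics simulations.

```python
# Toy candidate records: predicted binding free energy (kcal/mol, made up)
# and a flag for whether the ligand's reactive group reaches the cysteine.
candidates = [
    {"name": "lig1", "dG": -8.2, "reaches_cys": True},
    {"name": "lig2", "dG": -9.5, "reaches_cys": False},
    {"name": "lig3", "dG": -7.1, "reaches_cys": True},
]

def rank_irreversible(cands):
    """Keep only ligands positioned to form the covalent bond,
    then sort by predicted binding free energy (most negative first)."""
    eligible = [c for c in cands if c["reaches_cys"]]
    return sorted(eligible, key=lambda c: c["dG"])

ranking = [c["name"] for c in rank_irreversible(candidates)]
```

Note that the best reversible binder (lig2 here) is excluded: for irreversible inhibition, geometric access to the cysteine trap matters as much as affinity, which is exactly why the workflow evaluates both reversible and irreversible complexes.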

  10. End-to-end interoperability and workflows from building architecture design to one or more simulations

    Energy Technology Data Exchange (ETDEWEB)

    Chao, Tian-Jy; Kim, Younghun

    2015-02-10

    An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.
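
The "data definition language that defines and creates a table schema" step can be illustrated with a minimal relational sketch: building entities linked to simulation entities. The table and column names are invented for illustration and are not the patent's actual schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE building (
    id      INTEGER PRIMARY KEY,
    name    TEXT NOT NULL
);
CREATE TABLE simulation (
    id          INTEGER PRIMARY KEY,
    building_id INTEGER NOT NULL REFERENCES building(id),
    kind        TEXT NOT NULL      -- e.g. 'energy', 'daylight'
);
""")
conn.execute("INSERT INTO building (id, name) VALUES (1, 'Lab A')")
conn.execute(
    "INSERT INTO simulation (building_id, kind) VALUES (1, 'energy')")
rows = conn.execute(
    "SELECT b.name, s.kind FROM simulation s "
    "JOIN building b ON b.id = s.building_id").fetchall()
```

The data management services and web services the abstract mentions would sit on top of a schema like this, mediating all reads and writes so the BIM data model stays the single source of truth for every simulation.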

  11. Design of an Integrated Role-Based Access Control Infrastructure for Adaptive Workflow Systems

    OpenAIRE

    C Narendra, Nanjangud

    2003-01-01

    With increasing numbers of organizations automating their business processes by using workflow systems, security aspects of workflow systems has become a heavily researched area. Also, most workflow processes nowadays need to be adaptive, i.e., constantly changing, to meet changing business conditions. However, little attention has been paid to integrating Security and Adaptive Workflow. In this paper, we investigate this important research topic, with emphasis on Role Based Access Control (R...

  12. The use of workflows in the design and implementation of complex experiments in macromolecular crystallography

    International Nuclear Information System (INIS)

    A powerful and easy-to-use workflow environment has been developed at the ESRF for combining experiment control with online data analysis on synchrotron beamlines. This tool provides the possibility of automating complex experiments without the need for expertise in instrumentation control and programming, but rather by accessing defined beamline services. The automation of beam delivery, sample handling and data analysis, together with increasing photon flux, diminishing focal spot size and the appearance of fast-readout detectors on synchrotron beamlines, have changed the way that many macromolecular crystallography experiments are planned and executed. Screening for the best diffracting crystal, or even the best diffracting part of a selected crystal, has been enabled by the development of microfocus beams, precise goniometers and fast-readout detectors that all require rapid feedback from the initial processing of images in order to be effective. All of these advances require the coupling of data feedback to the experimental control system and depend on immediate online data-analysis results during the experiment. To facilitate this, a Data Analysis WorkBench (DAWB) for the flexible creation of complex automated protocols has been developed. Here, example workflows designed and implemented using DAWB are presented for enhanced multi-step crystal characterizations, experiments involving crystal reorientation with kappa goniometers, crystal-burning experiments for empirically determining the radiation sensitivity of a crystal system and the application of mesh scans to find the best location of a crystal to obtain the highest diffraction quality. Beamline users interact with the prepared workflows through a specific brick within the beamline-control GUI MXCuBE

  13. Towards workflow ecosystems through standard representations

    OpenAIRE

    Garijo Verdejo, Daniel; Gil, Yolanda; Corcho, Oscar

    2014-01-01

    Workflows are increasingly used to manage and share scientific computations and methods. Workflow tools can be used to design, validate, execute and visualize scientific workflows and their execution results. Other tools manage workflow libraries or mine their contents. There has been a lot of recent work on workflow system integration as well as common workflow interlinguas, but the interoperability among workflow systems remains a challenge. Ideally, these tools would f...

  14. Design and implementation of a secure workflow system based on PKI/PMI

    Science.gov (United States)

    Yan, Kai; Jiang, Chao-hui

    2013-03-01

    Traditional workflow systems have several weaknesses in privilege management: low privilege management efficiency, an overburdened administrator and the lack of a trusted authority. After an in-depth study of the security requirements of workflow systems, a secure workflow model based on PKI/PMI is proposed. This model achieves static and dynamic authorization by verifying a user's identity through a public-key certificate (PKC) and validating the user's privilege information through an attribute certificate (AC) in the workflow system. Practice shows that this system can meet the security requirements of a WfMS. Moreover, it not only improves system security but also ensures the integrity, confidentiality, availability and non-repudiation of the data in the system.
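
    The PKC/AC split described above can be illustrated schematically. This is a conceptual sketch only, with hypothetical class and function names; it omits real certificate parsing and signature verification entirely:

```python
from dataclasses import dataclass, field

@dataclass
class PublicKeyCertificate:   # identity (PKC), issued by a CA
    subject: str
    valid: bool = True

@dataclass
class AttributeCertificate:   # privileges (AC), issued by an attribute authority
    holder: str
    roles: frozenset = field(default_factory=frozenset)
    valid: bool = True

def authorize(pkc, ac, required_role):
    """Grant access only if the identity is verified, the AC belongs to the
    same subject, and it carries the role the workflow task requires."""
    if not (pkc.valid and ac.valid):
        return False
    if pkc.subject != ac.holder:
        return False
    return required_role in ac.roles

pkc = PublicKeyCertificate("alice")
ac = AttributeCertificate("alice", frozenset({"approver"}))
# authorize(pkc, ac, "approver") -> True
# authorize(pkc, ac, "admin")    -> False
```

    Separating identity (PKC) from privilege (AC) is what lets the administrator change a user's roles without reissuing the identity certificate.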

  15. A web accessible scientific workflow system for vadoze zone performance monitoring: design and implementation examples

    Science.gov (United States)

    Mattson, E.; Versteeg, R.; Ankeny, M.; Stormberg, G.

    2005-12-01

    Long term performance monitoring has been identified by DOE, DOD and EPA as one of the most challenging and costly elements of contaminated site remedial efforts. Such monitoring should provide timely and actionable information relevant to a multitude of stakeholder needs. This information should be obtained in a manner which is auditable, cost effective and transparent. Over the last several years INL staff has designed and implemented a web accessible scientific workflow system for environmental monitoring. This workflow environment integrates distributed, automated data acquisition from diverse sensors (geophysical, geochemical and hydrological) with server-side data management and information visualization through flexible browser-based data access tools. Component technologies include a rich browser-based client (using dynamic JavaScript and HTML/CSS) for data selection, a back-end server which uses PHP for data processing, user management and result delivery, and third-party applications which are invoked by the back-end using web services. This system has been implemented and is operational for several sites, including the Ruby Gulch Waste Rock Repository (a capped mine waste rock dump on the Gilt Edge Mine Superfund Site), the INL Vadoze Zone Research Park and an alternative cover landfill. Implementations for other vadoze zone sites are currently in progress. These systems allow for autonomous performance monitoring through automated data analysis and report generation. This performance monitoring has allowed users to obtain insights into system dynamics, regulatory compliance and residence times of water. Our system uses modular components for data selection and graphing, and WSDL-compliant web services for external functions such as statistical analyses and model invocations. Thus, implementing this system for novel sites and extending its functionality (e.g. adding novel models) is relatively straightforward. As system access requires a standard web browser

  16. Achieving E-learning with IMS Learning Design - Workflow Implications at the Open University of the Netherlands

    OpenAIRE

    Westera, Wim; Brouns, Francis; Pannekeet, Kees; Janssen, José; Manderveld, Jocelyn

    2005-01-01

    Please refer to the original article in: Westera, W., Brouns, F., Pannekeet, K., Janssen, J., & Manderveld, J. (2005). Achieving E-learning with IMS Learning Design - Workflow Implications at the Open University of the Netherlands. Educational Technology & Society, 8 (3), 216-225. (URL: http://www.ifets.info/others/abstract.php?art_id=570)

  17. Achieving E-learning with IMS Learning Design - Workflow Implications at the Open University of the Netherlands

    NARCIS (Netherlands)

    Westera, Wim; Brouns, Francis; Pannekeet, Kees; Janssen, José; Manderveld, Jocelyn

    2005-01-01

    Please refer to the original article in: Westera, W., Brouns, F., Pannekeet, K., Janssen, J., & Manderveld, J. (2005). Achieving E-learning with IMS Learning Design - Workflow Implications at the Open University of the Netherlands. Educational Technology & Society, 8 (3), 216-225. (URL: http://www.i

  18. Mining workflow processes from distributed workflow enactment event logs

    Directory of Open Access Journals (Sweden)

    Kwanghoon Pio Kim

    2012-12-01

    Full Text Available Workflow management systems help to execute, monitor and manage work process flow and execution. As they execute, these systems keep a record of who does what and when (e.g. a log of events). The activity of using computer software to examine these records and derive structural information from them is called workflow mining. In general, workflow mining needs to encompass the behavioral (process/control-flow), social, informational (data-flow) and organizational perspectives, as well as others, because workflow systems are "people systems" that must be designed, deployed and understood within their social and organizational contexts. This paper focuses in particular on mining the behavioral aspect of workflows from XML-based workflow enactment event logs that are vertically (semantic-driven distribution) or horizontally (syntactic-driven distribution) distributed over networked workflow enactment components. That is, this paper proposes distributed workflow mining approaches that are able to rediscover ICN-based structured workflow process models by incrementally amalgamating a series of vertically or horizontally fragmented temporal workcases. Each approach consists of a temporal fragment discovery algorithm, which discovers a set of temporal fragment models from the fragmented workflow enactment event logs, and a workflow process mining algorithm, which rediscovers a structured workflow process model from the discovered temporal fragment models. Here, the temporal fragment model represents the concrete model of the XML-based distributed workflow fragment event logs.
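
    As a toy illustration of the amalgamation step described above (not the paper's ICN-based algorithms), horizontally fragmented event logs can be merged by workcase identifier and timestamp, and a basic control-flow model derived from the merged traces:

```python
from collections import defaultdict

def amalgamate(fragments):
    """Merge horizontally fragmented event logs into complete workcases.

    Each fragment is a list of (case_id, timestamp, activity) events as
    they might be collected from one workflow enactment component.
    """
    cases = defaultdict(list)
    for fragment in fragments:
        for case_id, ts, activity in fragment:
            cases[case_id].append((ts, activity))
    return {cid: [a for _, a in sorted(evts)] for cid, evts in cases.items()}

def direct_follows(cases):
    """Derive the direct-follows relation, a basic control-flow model."""
    relation = set()
    for trace in cases.values():
        relation.update(zip(trace, trace[1:]))
    return relation

# Hypothetical fragments from two enactment components:
frag_a = [("c1", 1, "receive"), ("c1", 3, "approve"), ("c2", 1, "receive")]
frag_b = [("c1", 2, "check"), ("c2", 2, "check"), ("c2", 3, "reject")]
model = direct_follows(amalgamate([frag_a, frag_b]))
# model == {("receive", "check"), ("check", "approve"), ("check", "reject")}
```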

  19. Workflow Agents vs. Expert Systems: Problem Solving Methods in Work Systems Design

    Science.gov (United States)

    Clancey, William J.; Sierhuis, Maarten; Seah, Chin

    2009-01-01

    During the 1980s, a community of artificial intelligence researchers became interested in formalizing problem solving methods as part of an effort called "second generation expert systems" (2nd GES). How do the motivations and results of this research relate to building tools for the workplace today? We provide an historical review of how the theory of expertise has developed, a progress report on a tool for designing and implementing model-based automation (Brahms), and a concrete example of how we apply 2nd GES concepts today in an agent-based system for space flight operations (OCAMS). Brahms incorporates an ontology for modeling work practices, what people are doing in the course of a day, characterized as "activities." OCAMS was developed using a simulation-to-implementation methodology, in which a prototype tool was embedded in a simulation of future work practices. OCAMS uses model-based methods to interactively plan its actions and keep track of the work to be done. The problem solving methods of practice are interactive, employing reasoning for and through action in the real world. Analogously, it is as if a medical expert system were charged not just with interpreting culture results, but actually interacting with a patient. Our perspective shifts from building a "problem solving" (expert) system to building an actor in the world. The reusable components in work system designs include entire "problem solvers" (e.g., a planning subsystem), interoperability frameworks, and workflow agents that use and revise models dynamically in a network of people and tools. Consequently, the research focus shifts so "problem solving methods" include ways of knowing that models do not fit the world, and ways of interacting with other agents and people to gain or verify information and (ultimately) adapt rules and procedures to resolve problematic situations.

  20. Service-Oriented Architectures: from Design to Production exploiting Workflow Patterns

    Directory of Open Access Journals (Sweden)

    Saverio GIALLORENZO

    2015-03-01

    Full Text Available In Service-Oriented Architectures (SOA), services are composed by coordinating their communications into a flow of interactions. Coloured Petri nets (CPN) offer a formal yet easy tool for modelling abstract SOAs. Still, mapping abstract SOAs into executable ones requires a non-trivial and time-costly analysis. Here, we propose a methodology that maps CPN-modelled SOAs into executable Jolie SOAs (our target language). To this end, we employ a collection of recurring control-flow patterns, called Workflow Patterns, as composable blocks of the translation. Following our methodology, we discuss how the Workflow Patterns we consider are translated into Jolie. Finally, we validate our methodology with a realistic use case. As an additional result of our research, we pragmatically assess the expressiveness of Jolie in relation to the considered Workflow Patterns.
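
    Two of the classic Workflow Patterns mentioned above, sequence and parallel split with synchronization, can be sketched in Python rather than Jolie; the composition style, not the target language, is the point here, and the service operations are hypothetical:

```python
from concurrent.futures import ThreadPoolExecutor

def sequence(*steps):
    """Sequence pattern: run each step on the result of the previous one."""
    def run(value):
        for step in steps:
            value = step(value)
        return value
    return run

def parallel_split(*branches):
    """Parallel split + synchronization: run branches concurrently on the
    same input, then join (wait for) all their results."""
    def run(value):
        with ThreadPoolExecutor() as pool:
            futures = [pool.submit(branch, value) for branch in branches]
            return [f.result() for f in futures]
    return run

# Hypothetical service operations composed into a small SOA flow:
validate = lambda order: {**order, "valid": True}
bill     = lambda order: ("billed", order["id"])
ship     = lambda order: ("shipped", order["id"])

process = sequence(validate, parallel_split(bill, ship))
result = process({"id": 7})
# result == [("billed", 7), ("shipped", 7)]
```

    Because patterns compose (a pattern's output is another pattern's input), a translation can proceed block by block, which is the spirit of the pattern-based mapping the article describes.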

  1. Ontology-Based Workflow Validation

    OpenAIRE

    Pham, Tuan Anh; Nguyen, Thi Hoa Hue; Le Thanh, Nhan

    2015-01-01

    Many approaches have been introduced to ensure that a workflow is executed correctly, but few of them consider the semantic correctness of the workflow at design time and run time. In this paper, a solution to check the semantic correctness of a workflow automatically is presented. For this, the workflow must be represented in a machine-understandable form, so an ontology-based approach to representing a workflow is proposed. In addition, we also provide a set of changed operati...

  2. From benchtop to desktop: important considerations when designing amplicon sequencing workflows.

    Directory of Open Access Journals (Sweden)

    Dáithí C Murray

    Full Text Available Amplicon sequencing has been the method of choice in many high-throughput DNA sequencing (HTS) applications. To date there has been a heavy focus on the means by which to analyse the burgeoning amount of data afforded by HTS. In contrast, there has been a distinct lack of attention paid to considerations surrounding the importance of sample preparation and the fidelity of library generation. No amount of high-end bioinformatics can compensate for poorly prepared samples, and it is therefore imperative that careful attention is given to sample preparation and library generation within workflows, especially those involving multiple PCR steps. This paper redresses this imbalance by focusing on aspects pertaining to the benchtop within typical amplicon workflows: sample screening, the target region and library generation. Empirical data are provided to illustrate the scope of the problem. Lastly, the impact of various data analysis parameters is also investigated in the context of how the data were initially generated. It is hoped this paper may serve to highlight the importance of pre-analysis workflows in achieving meaningful, future-proof data that can be analysed appropriately. As amplicon sequencing gains traction in a variety of diagnostic applications, from forensics to environmental DNA (eDNA), it is paramount that workflows and analytics are both fit for purpose.

  3. Tvorba workflow aplikací (Creating Workflow Applications)

    OpenAIRE

    Hanák, Tomáš

    2012-01-01

    Analysis, design and implementation of a workflow application for an auto service using the Bonita open-source process engine. The thesis introduces the main terminology of process applications, workflow management systems and BPMS. The ISAC, PDIT and BORM methods of process analysis are examined. The BPMN 2.0 graphical notation for process modeling is briefly described. Finally, the workflow application "IT System for Auto Service" (ISA) is designed and implemented on Bonita Open Solution - Commu...

  4. Service-Oriented Architectures: From Design to Production Exploiting Workflow Patterns

    OpenAIRE

    Gabbrielli, Maurizio; Giallorenzo, Saverio; Montesi, Fabrizio

    2014-01-01

    In Service-Oriented Architectures (SOA), services are composed by coordinating their communications into a flow of interactions. Coloured Petri nets (CPN) offer a formal yet easy tool for modelling interactions in SOAs; however, mapping abstract SOAs into executable ones requires a non-trivial and time-costly analysis. Here, we propose a methodology that maps CPN-modelled SOAs into Jolie SOAs (our target language), exploiting a collection of recurring control-flow patterns, called Workflow P...

  5. Agile parallel bioinformatics workflow management using Pwrake

    Directory of Open Access Journals (Sweden)

    Tanaka Masahiro

    2011-09-01

    Full Text Available Abstract Background In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method through iterative development phases after trial and error. Here, we show the application of a scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. Findings We implemented the Pwrake workflows to process next generation sequencing data using the Genomic Analysis Toolkit (GATK and Dindel. GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that in practice, actual scientific workflow development iterates over two phases, the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate modularity of the GATK and Dindel workflows. Conclusions Pwrake enables agile management of scientific workflows in the bioinformatics domain. 
The internal domain-specific language design built on Ruby gives the flexibility of Rakefiles for writing scientific workflows.
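
    Rake's dependency-driven task model, which Pwrake parallelizes, can be sketched in Python: a task runs only after its prerequisites, and each task runs at most once. The pipeline below is hypothetical and is not Pwrake's implementation:

```python
def run(task, tasks, done=None):
    """Execute `task` after its prerequisites, each at most once,
    in the spirit of Rake's dependency-driven model.

    `tasks` maps a task name to (prerequisite_names, action).
    """
    done = done if done is not None else set()
    if task in done:
        return done
    deps, action = tasks[task]
    for dep in deps:
        run(dep, tasks, done)
    action()
    done.add(task)
    return done

order = []
tasks = {
    # hypothetical NGS-style pipeline: align reads, call variants, report
    "align":  ([],                lambda: order.append("align")),
    "call":   (["align"],         lambda: order.append("call")),
    "report": (["align", "call"], lambda: order.append("report")),
}
run("report", tasks)
# order == ["align", "call", "report"]  (shared prerequisite runs once)
```

    Pwrake's contribution on top of this model is executing independent prerequisites in parallel across compute nodes; the dependency semantics stay the same.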

  6. Approaches to Workflow Analysis in Healthcare Settings

    OpenAIRE

    Sheehan, Barbara; Bakken, Suzanne

    2012-01-01

    Attention to workflow is an important component of a comprehensive approach to designing usable information systems. In healthcare, inattention to workflow is associated with poorly accepted systems and unforeseen effects of use. How best to examine workflow for the purpose of system design is in itself the subject of scientific inquiry. Several disciplines offer approaches to the study of workflow that can be tailored to meet the needs of systems designers in healthcare settings. This paper ...

  7. New Interactions with Workflow Systems

    OpenAIRE

    Wassink, I.; Vet, de, H.C.W.; Veer, van der, P.T.; Roos, M.; Dijk, van, G.; Norros, L.; Koskinen, H; Salo, L; Savioja, P.

    2009-01-01

    This paper describes the evaluation of our early design ideas for an ad-hoc workflow system. Using the teach-back technique, we have performed a hermeneutic analysis of the mockup implementation named NIWS to get corrective and creative feedback at the functional, dialogue and representation levels of the new workflow system.

  8. Office 2010 Workflow Developing Collaborative Solutions

    CERN Document Server

    Mann, David; Enterprises, Creative

    2010-01-01

    Workflow is the glue that binds information worker processes, users, and artifacts. Without workflow, information workers are just islands of data and potential. Office 2010 Workflow details how to implement workflow in SharePoint 2010 and the client Microsoft Office 2010 suite to help information workers share data, enforce processes and business rules, and work more efficiently together or solo. This book covers everything you need to know, from what workflow is all about to creating new activities; from the SharePoint Designer to Visual Studio 2010; from out-of-the-box workflows to state mac

  9. Design and first implementation of business process visualization for a task manager supporting the workflow in an operating room

    Science.gov (United States)

    Fink, E.; Wiemuth, M.; Burgert, O.

    2015-03-01

    An operating room is a stressful work environment. Nevertheless, all involved persons have to work safely as there is no space for mistakes. To ensure a high level of concentration and seamless interaction, all involved persons have to know their own tasks and the tasks of their colleagues. The entire team must work synchronously at all times. To optimize the overall workflow, a task manager supporting the team was developed. In parallel, a common conceptual design of a business process visualization was developed, which makes all relevant information accessible in real-time during a surgery. In this context an overview of all processes in the operating room was created and different concepts for the graphical representation of these user-dependent processes were developed. This paper describes the concept of the task manager as well as the general concept in the field of surgery.

  10. Integration of the result visualization into a workflow modeling tool

    OpenAIRE

    Xuejing, Chen

    2010-01-01

    Scientific workflows are frequently used for simulations, experiment analyses, or the design and execution of experiments. The operation of a workflow management system must be self-describing, intuitive and abstracted from the underlying technology, because many non-computer scientists want to use workflow technology. Visualization workflows are scientific workflows for visualization. For different visualization methods, there should be different visualization workflows....

  11. DEWEY: The DICOM-Enabled Workflow Engine System

    OpenAIRE

    Erickson, Bradley J.; Langer, Steve G.; Blezek, Daniel J.; Ryan, William J.; French, Todd L.

    2014-01-01

    Workflow is a widely used term to describe the sequence of steps to accomplish a task. The use of workflow technology in medicine and medical imaging in particular is limited. In this article, we describe the application of a workflow engine to improve workflow in a radiology department. We implemented a DICOM-enabled workflow engine system in our department. We designed it in a way to allow for scalability, reliability, and flexibility. We implemented several workflows, including one that re...

  12. Toward Design, Modelling and Analysis of Dynamic Workflow Reconfigurations - A Process Algebra Perspective

    DEFF Research Database (Denmark)

    Mazzara, M.; Abouzaid, F.; Dragoni, Nicola; Bhattacharyya, A.

    2011-01-01

    This paper describes a case study involving the dynamic reconfiguration of an office workflow. We state the requirements on a system implementing the workflow and its reconfiguration, and describe the system's design in BPMN. We then use an asynchronous π-calculus and webπ∞ to model the design and to...

  13. E-BioFlow: Different Perspectives on Scientific Workflows

    OpenAIRE

    Wassink, I.; Rauwerda, H.; Vet, van der, Paul E.; Breit, T.; Nijholt, A.; Elloumi, M.; Küng, J.; Linial, M.; Murphy, R F; Schneider, K.; Toma, C

    2008-01-01

    We introduce a new type of workflow design system called e-BioFlow and illustrate it by means of a simple sequence alignment workflow. E-BioFlow, intended to model advanced scientific workflows, enables the user to model a workflow from three different but strongly coupled perspectives: the control flow perspective, the data flow perspective, and the resource perspective. All three perspectives are of equal importance, but workflow designers from different domains prefer different perspective...

  14. Workflow Mining of More Perspectives of Workflow

    OpenAIRE

    Peng Liu; Bosheng Zhou

    2008-01-01

    The goal of workflow mining is to obtain objective and valuable information from event logs. The research on workflow mining is of great significance for deploying new business processes as well as for analyzing and improving already deployed ones. Many information systems log event data about executed tasks. Workflow mining is concerned with the derivation of a graphical process model out of this data. Currently, workflow mining research is narrowly focused on the rediscovery of control flow m...

  15. From Design to Production Control Through the Integration of Engineering Data Management and Workflow Management Systems

    OpenAIRE

    Le Goff, J-M. (CERN); Chevenier, G.; Bazan, A.; Le Flour, T.; Lieunard, S.; Murray, S.; Vialle, J-P. (LAPP, Annecy); Baker, N.; Estrella, F. (UWE); Kovacs, Z. (UWE); McClatchey, R.; Organtini, G.; Bityukov, S.

    1998-01-01

    At a time when many companies are under pressure to reduce "times-to-market" the management of product information from the early stages of design through assembly to manufacture and production has become increasingly important. Similarly in the construction of high energy physics devices the collection of (often evolving) engineering data is central to the subsequent physics analysis. Traditionally in industry design engineers have employed Engineering Data Management Syste...

  16. Integrated design workflow and a new tool for urban rainwater management.

    Science.gov (United States)

    Chen, Yujiao; Samuelson, Holly W; Tong, Zheming

    2016-09-15

    Low Impact Development (LID) practices provide more sustainable solutions than traditional piping and storm ponds in stormwater management. However, architects are not equipped with the knowledge to perform runoff calculations at the early design stage. In response to this dilemma, we have developed an open-source stormwater runoff evaluation and management tool, Rainwater+. It is seamlessly integrated into computer-aided design (CAD) software to provide an instant estimate of the stormwater runoff volume of architecture and landscape designs. Designers can thereby develop appropriate rainwater management strategies based on local precipitation data, specific standards, site conditions and economic considerations. We employed Rainwater+ to conduct two case studies illustrating the importance of considering stormwater runoff in the early design stage. The first case study showed that integrating rainwater management into design modeling is critical for determining LID practice at any specific site. The second case study demonstrated the need for visualizing runoff flow direction when placing LID practices at proper locations on terrain of great complexity. PMID:27208392
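
    The abstract does not state which runoff model Rainwater+ uses internally; a common early-design estimate of the kind such a tool could provide is the NRCS (SCS) curve-number method, sketched here with hypothetical inputs:

```python
def scs_runoff(precip_in, curve_number):
    """NRCS (SCS) curve-number direct-runoff estimate; all depths in inches.

    S  = potential maximum retention, derived from the curve number (CN)
    Ia = initial abstraction, conventionally 0.2 * S
    Q  = direct runoff depth (zero when rainfall does not exceed Ia)
    """
    s = 1000.0 / curve_number - 10.0
    ia = 0.2 * s
    if precip_in <= ia:
        return 0.0
    return (precip_in - ia) ** 2 / (precip_in - ia + s)

# Hypothetical 3-inch storm on an impervious-heavy urban site (CN = 90):
q = scs_runoff(3.0, 90)
# S = 10/9 ≈ 1.11 in, Ia ≈ 0.22 in, Q ≈ 1.98 in of runoff
```

    Multiplying the runoff depth by the contributing area gives the runoff volume a designer would compare against LID storage capacity.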

  17. An Intelligent Software Workflow Process Design for Location Management on Mobile Devices

    CERN Document Server

    Rao, N Mallikharjuna

    2012-01-01

    Advances in the technologies of networking, wireless communication and the miniaturization of computers have led to rapid development of mobile communication infrastructure and have drastically changed information processing on mobile devices. Users carrying portable devices can freely move around while still connected to the network. This provides flexibility in accessing information anywhere at any time. To further improve flexibility on mobile devices, the new challenges in designing software systems for mobile networks include location and mobility management, channel allocation, power saving and security. In this paper, we propose an intelligent software tool for software design on mobile devices to address the new challenges of mobile location and mobility management. In this study, the proposed Business Process Redesign (BPR) concept aims at extending the capabilities of an existing, widely used process modeling tool in industry with 'intelligent' capabilities to suggest favorable alternatives to an ...

  18. Lattice QCD workflows

    Energy Technology Data Exchange (ETDEWEB)

    Piccoli, Luciano (Fermilab; IIT, Chicago); Kowalkowski, James B. (Fermilab); Simone, James N. (Fermilab); Sun, Xian-He (IIT, Chicago); Jin, Hui (IIT, Chicago); Holmgren, Donald J. (Fermilab); Seenu, Nirmal (Fermilab); Singh, Amitoj G. (Fermilab)

    2008-12-01

    This paper discusses the application of existing workflow management systems to a real world science application (LQCD). Typical workflows and execution environment used in production are described. Requirements for the LQCD production system are discussed. The workflow management systems Askalon and Swift were tested by implementing the LQCD workflows and evaluated against the requirements. We report our findings and future work.

  19. Workflow automation architecture standard

    Energy Technology Data Exchange (ETDEWEB)

    Moshofsky, R.P.; Rohen, W.T. [Boeing Computer Services Co., Richland, WA (United States)

    1994-11-14

    This document presents an architectural standard for the application of workflow automation technology. The standard includes a functional architecture, a process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and the results of a proof-of-concept prototype.

  20. Professional Windows Workflow Foundation

    CERN Document Server

    Kitta, Todd

    2007-01-01

    If you want to gain the skills to build Windows Workflow Foundation solutions, then this is the book for you. It provides a clear, practical guide on how to develop workflow-based software and integrate it into existing technology landscapes. Throughout the pages, you'll also find numerous real-world examples and sample code that will help you get started quickly. Each major area of Windows Workflow Foundation is explored in depth, along with some of the fundamental operations related to generic workflow applications. You'll also find detailed coverage on how to develop workflow in

  1. Optimize Internal Workflow Management

    Directory of Open Access Journals (Sweden)

    Lucia RUSU

    2010-01-01

    Full Text Available Workflow management has the role of creating and maintaining an efficient flow of information and tasks inside an organization. The major benefit of workflows is that they provide solutions to the growing needs of organizations. The external and internal processes associated with a business need to be carefully organized in order to provide a strong foundation for the daily work. This paper focuses on internal workflow within a company, attempts to provide some basic principles related to workflows, and presents a workflow solution for modeling and deployment using Visual Studio and SharePoint Server.

  2. Fluent Logic Workflow Analyser: A Tool for The Verification of Workflow Properties

    OpenAIRE

    Regis, Germán; Villar, Fernando; Ricci, Nicolás

    2014-01-01

    In this paper we present the design and implementation, as well as a use case, of a tool for workflow analysis. The tool provides an assistant for the specification of properties of a workflow model. The specification language for property description is Fluent Linear Time Temporal Logic. Fluents provide an adequate flexibility for capturing properties of workflows. Both the model and the properties are encoded, in an automated way, as Labelled Transition Systems, and the analysis is reduced ...
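
    The fluent semantics underlying the property language above can be illustrated with a toy evaluator (this is not the tool's implementation): a fluent becomes true on an initiating action, false on a terminating one, and properties are then checked over the resulting valuation. All names here are hypothetical:

```python
def fluent_values(trace, initiating, terminating, initially=False):
    """Truth value of a fluent after each event of a workflow trace.

    A fluent becomes true on an initiating action and false on a
    terminating one (the usual fluent semantics)."""
    value, values = initially, []
    for event in trace:
        if event in initiating:
            value = True
        elif event in terminating:
            value = False
        values.append(value)
    return values

def holds_false_at_end(values):
    """Check the property 'the fluent is false when the workflow ends',
    e.g. no task is still executing on termination."""
    return not values[-1]

trace = ["start_review", "submit", "start_review", "approve"]
executing = fluent_values(trace, initiating={"start_review"},
                          terminating={"submit", "approve"})
# executing == [True, False, True, False]
# holds_false_at_end(executing) == True
```

    A model checker generalizes this idea from one trace to all traces of the Labelled Transition System encoding the workflow.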

  3. The design and implementation of the radiation therapy information management system (RTIMS) based on the workflow of radiation therapy

    International Nuclear Information System (INIS)

    Objective: To meet the special needs of a department of radiation oncology, a radiation therapy information management system (RTIMS) has been developed since 2007 as a secondary database system to supplement Varian Varis/Aria. Methods: The RTIMS server ran a database and web service built on Apache + PHP + MySQL. The RTIMS server's web service could be accessed with Internet Explorer (IE) to input, search, count and print information from about 30 workstations and 20 personal computers. As some workstations were installed with English-only versions of Windows and IE, some functions were provided in an English version. Results: Over the past five years, as the RTIMS was implemented in the department, further needs were met and more practical functions were developed. The RTIMS now covers almost the whole workflow of radiation therapy (RT). By September 2011, the patient data recorded in the RTIMS were as follows: 3900 patients, 2600 outpatient RT records, 6800 progress notes, 1900 RT summaries, 6700 charge records, 83000 workload records, 3900 plan application forms and 1600 ICRT records, etc. Conclusions: The RTIMS, based on the workflow of RT, has been successfully developed and clinically implemented. It was demonstrated to be user-friendly and was proven to significantly improve the efficiency of the department. Since it is an in-house developed system, more functions can be added or modified to further enhance its potential in research and clinical practice. (authors)

  4. A Middleware Independent Grid Workflow Builder for Scientific Applications

    OpenAIRE

    Johnson, David; Meacham, Ken. E; Kornmayer, H.

    2009-01-01

    particular workflow engines built into Grid middleware, or are application specific and are designed to interact with specific software implementations. g-Eclipse is a middleware independent Grid workbench that aims to provide a unified abstraction of the Grid and includes a Grid workflow builder to allow users to author and deploy workflows to the Grid. This paper describes the g-Eclipse Workflow Builder and its implementations for two Grid middlewares, gLite and GRIA, and a case study utili...

  5. Automated data reduction workflows for astronomy

    CERN Document Server

    Freudling, W; Bramich, D M; Ballester, P; Forchi, V; Garcia-Dablo, C E; Moehler, S; Neeser, M J

    2013-01-01

    Data from complex modern astronomical instruments often consist of a large number of different science and calibration files, and their reduction requires a variety of software tools. The execution chain of the tools represents a complex workflow that needs to be tuned and supervised, often by individual researchers who are not necessarily experts in any specific instrument. The efficiency of data reduction can be improved by using automatic workflows to organise data and execute the sequence of data reduction steps. To realize such efficiency gains, we designed a system that allows intuitive representation, execution and modification of the data reduction workflow, and has facilities for inspection of and interaction with the data. The European Southern Observatory (ESO) has developed Reflex, an environment to automate data reduction workflows. Reflex is implemented as a package of customized components for the Kepler workflow engine. Kepler provides the graphical user interface to create an executable flowch...

  6. RESTFul based heterogeneous Geoprocessing workflow interoperation for Sensor Web Service

    Science.gov (United States)

    Yang, Chao; Chen, Nengcheng; Di, Liping

    2012-10-01

    Advanced sensors on board satellites offer detailed Earth observations. A workflow is one approach for designing, implementing and constructing a flexible and live link between these sensors' resources and users. It can coordinate, organize and aggregate the distributed sensor Web services to meet the requirement of a complex Earth observation scenario. A RESTFul based workflow interoperation method is proposed to integrate heterogeneous workflows into an interoperable unit. The Atom protocols are applied to describe and manage workflow resources. The XML Process Definition Language (XPDL) and Business Process Execution Language (BPEL) workflow standards are applied to structure a workflow that accesses sensor information and one that processes it separately. Then, a scenario for nitrogen dioxide (NO2) from a volcanic eruption is used to investigate the feasibility of the proposed method. The RESTFul based workflows interoperation system can describe, publish, discover, access and coordinate heterogeneous Geoprocessing workflows.
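The Atom-based description and management of workflow resources outlined above can be illustrated with a minimal registry sketch. All class, field, and URI names below are assumptions for illustration, not the paper's actual interface; the XPDL/BPEL split (data access vs. processing) follows the abstract.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class WorkflowEntry:
    """Atom-like entry describing one workflow resource."""
    wf_id: str
    title: str
    language: str   # e.g. "XPDL" (data-access workflow) or "BPEL" (processing workflow)
    definition: str  # the serialized workflow document
    updated: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

class WorkflowRegistry:
    """Minimal REST-style registry: publish, discover, and fetch workflow resources."""
    def __init__(self):
        self._entries = {}

    def publish(self, entry: WorkflowEntry) -> str:
        uri = f"/workflows/{entry.wf_id}"  # resource URI, playing the role of the Atom "id"
        self._entries[uri] = entry
        return uri

    def discover(self, language=None):
        """Feed-style listing of workflow URIs, optionally filtered by workflow language."""
        return [uri for uri, e in self._entries.items()
                if language is None or e.language == language]

    def get(self, uri: str) -> WorkflowEntry:
        return self._entries[uri]

registry = WorkflowRegistry()
registry.publish(WorkflowEntry("no2-access", "Fetch NO2 sensor observations", "XPDL", "<xpdl.../>"))
registry.publish(WorkflowEntry("no2-process", "Process NO2 plume data", "BPEL", "<bpel.../>"))
print(registry.discover("BPEL"))  # → ['/workflows/no2-process']
```

A real deployment would serve these entries over HTTP as Atom feeds; the sketch only shows how uniform resource identifiers let heterogeneous XPDL and BPEL workflows be discovered through one interface.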

  7. Agreement Workflow Tool (AWT)

    Data.gov (United States)

    Social Security Administration — The Agreement Workflow Tool (AWT) is a role-based Intranet application used for processing SSA's Reimbursable Agreements according to SSA's standards. AWT provides...

  8. Metaworkflows and Workflow Interoperability for Heliophysics

    Science.gov (United States)

    Pierantoni, Gabriele; Carley, Eoin P.

    2014-06-01

    Heliophysics is a relatively new branch of physics that investigates the relationship between the Sun and the other bodies of the solar system. To investigate such relationships, heliophysicists can rely on various tools developed by the community. Some of these tools are on-line catalogues that list events (such as Coronal Mass Ejections, CMEs) and their characteristics as they were observed on the surface of the Sun or on the other bodies of the Solar System. Other tools offer on-line data analysis and access to images and data catalogues. During their research, heliophysicists often perform investigations that need to coordinate several of these services and to repeat these complex operations until the phenomena under investigation are fully analyzed. Heliophysicists combine the results of these services; this service orchestration is best suited for workflows. This approach has been investigated in the HELIO project. The HELIO project developed an infrastructure for a Virtual Observatory for Heliophysics and implemented service orchestration using TAVERNA workflows. HELIO developed a set of workflows that proved to be useful but lacked flexibility and re-usability. The TAVERNA workflows also needed to be executed directly in the TAVERNA workbench, and this forced all users to learn how to use the workbench. Within the SCI-BUS and ER-FLOW projects, we have started an effort to re-think and re-design the heliophysics workflows with the aim of fostering re-usability and ease of use. We base our approach on two key concepts, that of meta-workflows and that of workflow interoperability. We have divided the produced workflows into three different layers. The first layer is Basic Workflows, developed both in the TAVERNA and WS-PGRADE languages. They are building blocks that users compose to address their scientific challenges. They implement well-defined Use Cases that usually involve only one service. The second layer is Science Workflows usually developed in TAVERNA. They

  9. From Workflow to Interworkflow

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Workflow management systems are being introduced in many organizations to automate the business process. The initial emphasis of introducing a workflow management system is on its application to the workflow in a given organization. The next step is to interconnect the workflow across organizations. We call this interworkflow, and the total support technologies necessary for its realization, the interworkflow management mechanism. Interworkflow is expected to be a supporting mechanism for Business-to-Business Electronic Commerce. We had proposed this management mechanism and confirmed its realization with a prototype. At the same time, the interface and the protocol for interconnecting heterogeneous workflow management systems have been standardized by the WfMC. So, we advance the project of implementing an interworkflow management system for practical use and its experimental proof.

  10. P-Graph-based Workflow Modelling

    Directory of Open Access Journals (Sweden)

    József Tick

    2007-03-01

    Full Text Available Workflow modelling has been successfully introduced and implemented in several application fields. Therefore, its significance has increased dramatically. Several workflow modelling techniques have been published so far, out of which quite a number are widespread applications. For instance, the Petri-Net-based modelling has become popular partly due to its graphical design and partly due to its correct mathematical background. The workflow modelling based on Unified Modelling Language is important because of its practical usage. This paper introduces and examines the workflow modelling technique based on the Process-graph as a possible new solution next to the already existing modelling techniques.
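A Process-graph (P-graph) is bipartite: operating-unit nodes (here, workflow tasks) connect only to material nodes (here, data artifacts), and vice versa. The sketch below illustrates that structure with made-up task names and a simple stdlib-only fixed-point pass; it is an illustration of the representation, not the paper's algorithm.

```python
# Each operating unit maps a set of input materials to a set of output materials,
# which is exactly the bipartite P-graph structure (tasks never link to tasks directly).
operating_units = {
    "scan":    ({"paper_form"}, {"image"}),
    "ocr":     ({"image"}, {"text"}),
    "archive": ({"text", "image"}, {"record"}),
}

def producible(raw_materials, units):
    """Fixed-point pass: which materials can be produced starting from the raw set."""
    available = set(raw_materials)
    changed = True
    while changed:
        changed = False
        for name, (inputs, outputs) in units.items():
            # A unit can fire once all its input materials are available.
            if inputs <= available and not outputs <= available:
                available |= outputs
                changed = True
    return available

print(producible({"paper_form"}, operating_units))
```

Starting from only `paper_form`, every unit eventually fires, so `image`, `text`, and `record` all become producible; analyses like this underpin the "correct mathematical background" the abstract attributes to graph-based workflow models.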

  11. P-Graph-based Workflow Modelling

    OpenAIRE

    József Tick

    2007-01-01

    Workflow modelling has been successfully introduced and implemented in several application fields. Therefore, its significance has increased dramatically. Several workflow modelling techniques have been published so far, out of which quite a number are widespread applications. For instance, the Petri-Net-based modelling has become popular partly due to its graphical design and partly due to its correct mathematical background. The workflow modelling based on Unified Modelling Language is important...

  12. A Novel Approach for Bioinformatics Workflow Discovery

    OpenAIRE

    Walaa Nagy; Hoda M.O. Mokhtar

    2014-01-01

    Workflow systems are a typical fit for the explorative research of bioinformaticians. These systems can help bioinformaticians to design and run their experiments and to automatically capture and store the data generated at runtime. On the other hand, Web services are increasingly used as the preferred method for accessing and processing the information coming from the diverse life science sources. In this work we provide an efficient approach for creating bioinformatics workflows for all-serv...

  13. The Design and Implementation of an Agent-based Workflow Model%一种基于Agent的工作流模型的设计与实现

    Institute of Scientific and Technical Information of China (English)

    陈善国; 高济

    2000-01-01

    In this paper, we present SaFlow, a workflow management system based on agents. After describing the architecture, we discuss in detail the composition and implementation of MSA, the major part of SaFlow. Finally, a product quotation workflow system is demonstrated as an application of SaFlow.

  14. Context-aware Workflow Model for Supporting Composite Workflows

    Institute of Scientific and Technical Information of China (English)

    Jong-sun CHOI; Jae-young CHOI; Yong-yun CHO

    2010-01-01

    In recent years, several researchers have applied workflow technologies for service automation in ubiquitous computing environments. However, most context-aware workflows do not offer a method to compose several workflows in order to obtain a larger-scale or more complicated workflow. They only provide a simple workflow model, not a composite workflow model. In this paper, the authors propose a context-aware workflow model to support composite workflows by expanding the patterns of the existing context-aware workflows, which support the basic workflow patterns. The suggested workflow model offers composite workflow patterns for a context-aware workflow, which consists of various flow patterns, such as simple, split, and parallel flows, and subflows. With the suggested model, existing workflows can easily be reused to make a new workflow. As a result, it can save the development effort and time of context-aware workflows and increase workflow reusability. Therefore, the suggested model is expected to make it easy to develop applications related to context-aware workflow services in ubiquitous computing environments.
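The composition idea described above (simple, split/parallel, and subflow patterns, with existing workflows reused inside larger ones) can be sketched with a few composable classes. This is a generic illustration of the pattern, not the authors' actual model; all task names are invented, and the "parallel" branches here simply run in declaration order.

```python
class Task:
    """A single workflow step."""
    def __init__(self, name):
        self.name = name
    def run(self, log):
        log.append(self.name)

class Sequence:
    """Simple flow: steps run one after another."""
    def __init__(self, *steps):
        self.steps = steps
    def run(self, log):
        for s in self.steps:
            s.run(log)

class Parallel:
    """Split/join pattern; in this sketch branches execute in declaration order."""
    def __init__(self, *branches):
        self.branches = branches
    def run(self, log):
        for b in self.branches:
            b.run(log)

# An existing simple workflow, reused unchanged as a subflow of a composite one.
check_in = Sequence(Task("detect_user"), Task("greet"))
composite = Sequence(check_in,
                     Parallel(Task("adjust_lights"), Task("play_music")),
                     Task("log_event"))

log = []
composite.run(log)
print(log)  # → ['detect_user', 'greet', 'adjust_lights', 'play_music', 'log_event']
```

Because `Sequence` and `Parallel` accept any object with a `run` method, whole workflows nest inside other workflows, which is exactly the reuse benefit the abstract claims for composite patterns.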

  15. A Novel Approach for Bioinformatics Workflow Discovery

    Directory of Open Access Journals (Sweden)

    Walaa Nagy

    2014-11-01

    Full Text Available Workflow systems are a typical fit for the explorative research of bioinformaticians. These systems can help bioinformaticians to design and run their experiments and to automatically capture and store the data generated at runtime. On the other hand, Web services are increasingly used as the preferred method for accessing and processing the information coming from the diverse life science sources. In this work we provide an efficient approach for creating bioinformatics workflows for all-service architecture systems (i.e., all system components are services). This architecture style simplifies the user interaction with workflow systems and facilitates both the change of individual components and the addition of new components to adapt to other workflow tasks if required. We finally present a case study for the bioinformatics domain to elaborate the applicability of our proposed approach.

  16. Aspect-Oriented Workflow Languages

    OpenAIRE

    Charfi, Anis

    2007-01-01

    This thesis focuses on the modularity of workflow process specifications. In particular, it studies the expression support for crosscutting concerns and workflow changes in current workflow languages and workflow management systems. To illustrate the issues, two workflow languages are considered: a visual graph-based language and the Web Service composition language BPEL. This thesis starts by describing the implementation of several crosscutting concerns such as data collection for billing, ...

  17. Achieving E-learning with IMS Learning Design--Workflow Implications at the Open University of the Netherlands

    Science.gov (United States)

    Westera, Wim; Brouns, Francis; Pannekeet, Kees; Janssen, Jose; Manderveld, Jocelyn

    2005-01-01

    This paper uses the Open University of the Netherlands as an instructive case for the introduction of e-learning based on the IMS Learning Design specification (IMS LD). The IMS LD specification, as approved by the IMS Global Learning Consortium in 2003, enables the specification and encoding of learning scenarios that describe any design of a…

  18. Integrated workflows for spiking neuronal network simulations

    Directory of Open Access Journals (Sweden)

    Andrew P Davison

    2013-12-01

    Full Text Available The increasing availability of computational resources is enabling more detailed, realistic modelling in computational neuroscience, resulting in a shift towards more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeller's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modellers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualisation into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organised configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualisation stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modelling studies by relieving the user from manual handling of the flow of metadata between the individual
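The "hierarchically organised configuration files" mentioned above follow a common pattern: experiment-specific settings overlay shared defaults. The sketch below shows that general overlay idea only; it is not Mozaik's actual API, and all parameter names are invented for illustration.

```python
def merge(base, override):
    """Recursively overlay `override` onto `base` (nested dicts), leaving `base` untouched."""
    out = dict(base)
    for key, val in override.items():
        if isinstance(val, dict) and isinstance(out.get(key), dict):
            out[key] = merge(out[key], val)  # descend into nested sections
        else:
            out[key] = val                   # leaf: override wins
    return out

# Shared defaults for a model, and one experiment's overrides (illustrative names).
defaults = {"sheet": {"neurons": 1000, "model": "IF"},
            "recording": {"variables": ["spikes"]}}
experiment = {"sheet": {"neurons": 2500},
              "recording": {"variables": ["spikes", "v"]}}

config = merge(defaults, experiment)
print(config)
```

The experiment file only states what differs (`neurons`, extra recorded variables), while untouched defaults such as the neuron model carry through, which is what makes hierarchical configuration declarative and compact.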

  19. Integrated workflows for spiking neuronal network simulations.

    Science.gov (United States)

    Antolík, Ján; Davison, Andrew P

    2013-01-01

    The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages. PMID

  20. Workflow Partitioning and Deployment on the Cloud using Orchestra

    OpenAIRE

    Jaradat, Ward; Dearle, Alan; Barker, Adam

    2014-01-01

    Orchestrating service-oriented workflows is typically based on a design model that routes both data and control through a single point - the centralised workflow engine. This causes scalability problems that include the unnecessary consumption of the network bandwidth, high latency in transmitting data between the services, and performance bottlenecks. These problems are highly prominent when orchestrating workflows that are composed from services dispersed across distant geographical locatio...

  1. A Specification Language for the WIDE Workflow Model

    OpenAIRE

    Chan, Daniel K.C.; Vonk, Jochem; Sánchez, Gabriel; Paul W. P. J. Grefen; Apers, Peter M.G.

    1998-01-01

    This paper presents a workflow specification language developed in the WIDE project. The language provides a rich organisation model, an information model including presentation details, and a sophisticated process model. Workflow application developers should find the language a useful and compact means to capture and investigate design details. Workflow system developers would discover the language a good vehicle to study the interaction between different features as well as facilitate the ...

  2. Rockslide susceptibility and hazard assessment for mitigation works design along vertical rocky cliffs: workflow proposal based on a real case-study conducted in Sacco (Campania), Italy

    Science.gov (United States)

    Pignalosa, Antonio; Di Crescenzo, Giuseppe; Marino, Ermanno; Terracciano, Rosario; Santo, Antonio

    2015-04-01

    field mapping and on punctual geomechanical data from scan-line surveying, allowed the rock mass partitioning in homogeneous geomechanical sectors and data interpolation through bounded geostatistical analyses on 3D models. All data, resulting from both approaches, have been referenced and filed in a single spatial database and considered in global geostatistical analyses for deriving a fully modelled and comprehensive evaluation of the rockslide susceptibility. The described workflow yielded the following innovative results: a) a detailed census of single potential instabilities, through a spatial database recording the geometrical, geological and mechanical features, along with the expected failure modes; b) a high-resolution characterization of the whole slope's rockslide susceptibility, based on the partitioning of the area according to the stability and mechanical conditions, which can be directly related to specific hazard mitigation systems; c) the exact extent of the area exposed to the rockslide hazard, along with the dynamic parameters of expected phenomena; d) an intervention design for hazard mitigation.

  3. Partitioning Uncertain Workflows

    CERN Document Server

    Huberman, Bernardo A

    2015-01-01

    It is common practice to partition complex workflows into separate channels in order to speed up their completion times. When this is done within a distributed environment, unavoidable fluctuations make individual realizations depart from the expected average gains. We present a method for breaking any complex workflow into several workloads in such a way that once their outputs are joined, their full completion takes less time and exhibits a smaller variance than when running in only one channel. We demonstrate the effectiveness of this method in two different scenarios: the optimization of a convex function and the transmission of a large computer file over the Internet.
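The claim above, that a suitable partition both shortens completion time and shrinks its variance, can be illustrated with a toy Monte Carlo sketch: each channel's running time fluctuates multiplicatively, and a joined partition completes when its slowest part does. The lognormal fluctuation model and every parameter below are assumptions for illustration, not the paper's setup.

```python
import random

random.seed(42)

def single_channel(work, sigma, n=5000):
    """Completion times when the whole workload runs in one fluctuating channel."""
    return [work * random.lognormvariate(0.0, sigma) for _ in range(n)]

def partitioned(work, k, sigma, n=5000):
    """Split into k equal workloads; the join completes when the slowest part does."""
    return [max((work / k) * random.lognormvariate(0.0, sigma) for _ in range(k))
            for _ in range(n)]

def mean(xs):
    return sum(xs) / len(xs)

def var(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

one = single_channel(100.0, 0.5)
four = partitioned(100.0, 4, 0.5)
print(mean(one) > mean(four), var(one) > var(four))  # → True True
```

Each of the four parts carries only a quarter of the work, so even the slowest of them typically finishes well before one channel carrying everything, and taking the maximum of several independent draws also damps the spread; the actual paper optimizes the partition rather than assuming equal splits.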

  4. Tavaxy: Integrating Taverna and Galaxy workflows with cloud computing support

    Directory of Open Access Journals (Sweden)

    Abouelhoda Mohamed

    2012-05-01

    Full Text Available Abstract Background Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. Results In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on using an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Conclusions Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-)workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and

  5. Business and scientific workflows a web service-oriented approach

    CERN Document Server

    Tan, Wei

    2013-01-01

    Focuses on how to use web service computing and service-based workflow technologies to develop timely, effective workflows for both business and scientific fields Utilizing web computing and Service-Oriented Architecture (SOA), Business and Scientific Workflows: A Web Service-Oriented Approach focuses on how to design, analyze, and deploy web service-based workflows for both business and scientific applications in many areas of healthcare and biomedicine. It also discusses and presents the recent research and development results. This informative reference features app

  6. Automated data reduction workflows for astronomy. The ESO Reflex environment

    Science.gov (United States)

    Freudling, W.; Romaniello, M.; Bramich, D. M.; Ballester, P.; Forchi, V.; García-Dabló, C. E.; Moehler, S.; Neeser, M. J.

    2013-11-01

    Context. Data from complex modern astronomical instruments often consist of a large number of different science and calibration files, and their reduction requires a variety of software tools. The execution chain of the tools represents a complex workflow that needs to be tuned and supervised, often by individual researchers that are not necessarily experts for any specific instrument. Aims: The efficiency of data reduction can be improved by using automatic workflows to organise data and execute a sequence of data reduction steps. To realize such efficiency gains, we designed a system that allows intuitive representation, execution and modification of the data reduction workflow, and has facilities for inspection and interaction with the data. Methods: The European Southern Observatory (ESO) has developed Reflex, an environment to automate data reduction workflows. Reflex is implemented as a package of customized components for the Kepler workflow engine. Kepler provides the graphical user interface to create an executable flowchart-like representation of the data reduction process. Key features of Reflex are a rule-based data organiser, infrastructure to re-use results, thorough book-keeping, data progeny tracking, interactive user interfaces, and a novel concept to exploit information created during data organisation for the workflow execution. Results: Automated workflows can greatly increase the efficiency of astronomical data reduction. In Reflex, workflows can be run non-interactively as a first step. Subsequent optimization can then be carried out while transparently re-using all unchanged intermediate products. We found that such workflows enable the reduction of complex data by non-expert users and minimizes mistakes due to book-keeping errors. Conclusions: Reflex includes novel concepts to increase the efficiency of astronomical data processing. While Reflex is a specific implementation of astronomical scientific workflows within the Kepler workflow

  7. Workflow Management in Electronic Commerce

    OpenAIRE

    Grefen, P.W.P.J.; Spaccapietra, S.; March, S.T.; Kambayashi, Y.

    2002-01-01

    This tutorial addresses the application of workflow management (WFM) for process support in both these cases. The tutorial is organized into three parts. In the first part, we pay attention to (classical) workflow management in the context of a single organization. In the second part, we extend this to workflow management across the boundaries of organizations. In the third part, we further extend this model by making service processes - implemented as workflows - the objects traded in ecomme...

  8. The PBase Scientific Workflow Provenance Repository

    Directory of Open Access Journals (Sweden)

    Víctor Cuevas-Vicenttín

    2014-10-01

    Full Text Available Scientific workflows and their supporting systems are becoming increasingly popular for compute-intensive and data-intensive scientific experiments. The advantages scientific workflows offer include rapid and easy workflow design, software and data reuse, scalable execution, sharing and collaboration, and other advantages that altogether facilitate “reproducible science”. In this context, provenance – information about the origin, context, derivation, ownership, or history of some artifact – plays a key role, since scientists are interested in examining and auditing the results of scientific experiments. However, in order to perform such analyses on scientific results as part of extended research collaborations, an adequate environment and tools are required. Concretely, the need arises for a repository that will facilitate the sharing of scientific workflows and their associated execution traces in an interoperable manner, also enabling querying and visualization. Furthermore, such functionality should be supported while taking performance and scalability into account. With this purpose in mind, we introduce PBase: a scientific workflow provenance repository implementing the ProvONE proposed standard, which extends the emerging W3C PROV standard for provenance data with workflow specific concepts. PBase is built on the Neo4j graph database, thus offering capabilities such as declarative and efficient querying. Our experiences demonstrate the power gained by supporting various types of queries for provenance data. In addition, PBase is equipped with a user friendly interface tailored for the visualization of scientific workflow provenance data, making the specification of queries and the interpretation of their results easier and more effective.

  9. CloudWF: A Computational Workflow System for Clouds Based on Hadoop

    Science.gov (United States)

    Zhang, Chen; de Sterck, Hans

    This paper describes CloudWF, a scalable and lightweight computational workflow system for clouds on top of Hadoop. CloudWF can run workflow jobs composed of multiple Hadoop MapReduce or legacy programs. Its novelty lies in several aspects: a simple workflow description language that encodes workflow blocks and block-to-block dependencies separately as standalone executable components; a new workflow storage method that uses Hadoop HBase sparse tables to store workflow information internally and reconstruct workflow block dependencies implicitly for efficient workflow execution; transparent file staging with Hadoop DFS; and decentralized workflow execution management relying on the MapReduce framework for task scheduling and fault tolerance. This paper describes the design and implementation of CloudWF.
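CloudWF's idea of encoding workflow blocks and block-to-block dependencies as separate standalone records, then reconstructing the dependency graph for execution, can be sketched as follows. Plain dicts stand in for HBase sparse tables, the block names are invented, and topological ordering via Kahn's algorithm is a generic stand-in for CloudWF's MapReduce-driven scheduling.

```python
# Blocks and dependencies stored as separate flat rows (sparse-table style).
blocks = {
    "fetch":  "prog_fetch.sh",      # legacy program
    "map":    "mapreduce_job_a",    # Hadoop MapReduce job
    "reduce": "mapreduce_job_b",
    "report": "prog_report.sh",
}
dependencies = [("fetch", "map"), ("map", "reduce"),
                ("fetch", "report"), ("reduce", "report")]

def execution_order(blocks, dependencies):
    """Reconstruct the dependency graph and return a valid run order (Kahn's algorithm)."""
    incoming = {b: 0 for b in blocks}
    children = {b: [] for b in blocks}
    for src, dst in dependencies:
        incoming[dst] += 1
        children[src].append(dst)
    ready = sorted(b for b, n in incoming.items() if n == 0)  # blocks with no prerequisites
    order = []
    while ready:
        b = ready.pop(0)
        order.append(b)
        for c in children[b]:
            incoming[c] -= 1
            if incoming[c] == 0:
                ready.append(c)
    return order

print(execution_order(blocks, dependencies))  # → ['fetch', 'map', 'reduce', 'report']
```

Because each block row and each dependency row stands alone, blocks can be reused across workflows and the graph never needs to be stored explicitly, which is the storage property the abstract highlights.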

  10. Design and implementation of a kind of cross-organizational flexible workflow engine%一种跨组织柔性工作流引擎的设计与实现

    Institute of Scientific and Technical Information of China (English)

    蔡丽丽

    2014-01-01

    To resolve the problem of business process cooperation between different systems, and the problem of dynamic adaptation to process changes caused by business emergencies, this paper presents a design for a workflow engine that combines cross-organizational and flexible workflow technologies. Existing research results on cross-organizational workflow and flexible workflow technologies are summarized and analyzed. On this basis, a workflow model is designed that supports both cross-system business interaction and dynamic change of internal business processes. Combining analysis of actual scenarios, the paper expounds the functional design of the model's cross-organizational business processing and its mechanism for dynamic business change, and introduces a prototype implementation of the model using process rollback as an example.

  11. Development of a Theoretical Monitoring System Design for a HLW Repository Based on the 'MoDeRn Monitoring Workflow' (A Case Study) - 12044

    International Nuclear Information System (INIS)

    In this paper, a generic German disposal concept in rock salt is used as an example to discuss the design of a repository monitoring system. The approach used is based on a generic structured approach to monitoring - the MoDeRn Monitoring Workflow - which is being developed and tested as part of an on-going European Commission Seventh Framework project. As a first step in the study, the requirements on the monitoring program were identified through consideration of the national context, including regulatory guidelines, host rock properties and waste to be disposed of. These are stated as general monitoring objectives. An analysis of the German safety concept for safe confinement of the radioactive waste allows these general objectives to be converted into specific sub-objectives, and for the sub-objectives to be related to specific monitoring processes and parameters. The safety concept identified the key safety components, each of them having specific associated safety functions. The safety functions can be related to the list of features, events and processes (FEPs) that contains all processes related to the future repository evolution. By screening the FEP list, all processes that potentially can affect the safety functions have been identified. In a next step the parameters that would be affected by the individual processes were determined, leading to a preliminary list of parameters to be monitored. By evaluating available techniques and monitoring equipment, this preliminary list was investigated with respect to its technical feasibility at the intended locations. Prior to final system selection, potential impacts of the monitoring system on safety or other measurements are evaluated. To avoid potential pathways for fluids that may compromise the integrity of a barrier, considerations on the application of wireless data transmission systems and techniques for autonomous, long-term power supply were given. (authors)

  12. Insightful Workflow For Grid Computing

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Charles Earl

    2008-10-09

    We developed a workflow adaptation and scheduling system for Grid workflows. The system currently interfaces with and uses the Karajan workflow system. We developed machine learning agents that provide the planner/scheduler with the information needed to make decisions about when and how to replan. Kubrick restructures workflows at runtime, making it unique among workflow scheduling systems. The existing Kubrick system provides a platform on which to integrate additional quality-of-service constraints and in which to explore the use of an ensemble of scheduling and planning algorithms. This will be the principal thrust of our Phase II work.

  13. Implementing Workflow Reconfiguration in WS-BPEL

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Dragoni, Nicola; Zhou, Mu

    This paper investigates the problem of dynamic reconfiguration by means of a workflow-based case study used for discussion. We state the requirements on a system implementing the workflow and its reconfiguration, and we describe the system’s design in BPMN. WS-BPEL, a language that would not naturally support dynamic change, is used as a target for implementation. The WS-BPEL recovery framework is here exploited to implement the reconfiguration using principles derived from previous research in process algebra, and two mappings from BPMN to WS-BPEL are presented, one automatic and only mostly...

  14. DMS systems and workflow

    OpenAIRE

    Jakeš, Jiří

    2008-01-01

    This work covers systems for document management (DMS) and the support of internal processes by integrated workflow modules. It presents the main reasons for implementing a DMS and the benefits it brings, describes the functionality of typical DMS systems, defines the components of such a system and the state of the relevant market, and outlines trends for future development. The work focuses on practical use and tries to find a way from technology to business.

  15. Make Your Workflows Smarter

    Science.gov (United States)

    Jones, Corey; Kapatos, Dennis; Skradski, Cory

    2012-01-01

    Do you have workflows with many manual tasks that slow down your business? Or, do you scale back workflows because there are simply too many manual tasks? Basic workflow robots can automate some common tasks, but not everything. This presentation will show how advanced robots called "expression robots" can be set up to perform everything from simple tasks such as moving, creating folders, renaming, changing or creating an attribute, and revising, to more complex tasks like creating a PDF, or even launching a session of Creo Parametric and performing a specific modeling task. Expression robots are able to utilize the Java API and Info*Engine to do almost anything you can imagine! Best of all, these tools are supported by PTC and will work with later releases of Windchill. Limited knowledge of Java, Info*Engine, and XML is required. The attendee will learn what tasks expression robots are capable of performing. The attendee will learn what is involved in setting up an expression robot. The attendee will gain a basic understanding of simple Info*Engine tasks

  16. AN APPROACH TO E-WORKFLOW SYSTEMS WITH THE USE OF PATTERNS

    OpenAIRE

    John Ndeta; Stamatia A. Katriou; Siakas, Kerstin V.

    2015-01-01

    In today’s highly competitive and rapidly changing environment, e-businesses constantly have to modify their business processes, i.e. the flow of documents and tasks in a business also known as workflow. More flexible Workflow Management Systems are required to support these constantly changing processes. In this research a platform independent architecture for the design of e-workflow systems is illustrated. The architecture includes an information pool, namely a Workflow Pattern Repository,...

  17. Patient-centered care requires a patient-oriented workflow model

    OpenAIRE

    Ozkaynak, Mustafa; Flatley Brennan, Patricia; Hanauer, David A.; Johnson, Sharon; Aarts, Jos; Zheng, Kai; Haque, Saira N.

    2013-01-01

    Effective design of health information technology (HIT) for patient-centered care requires consideration of workflow from the patient's perspective, termed ‘patient-oriented workflow.’ This approach organizes the building blocks of work around the patients who are moving through the care system. Patient-oriented workflow complements the more familiar clinician-oriented workflow approaches, and offers several advantages, including the ability to capture simultaneous, cooperative work, which is...

  18. Resource scheduling of workflow multi-instance migration based on the shuffled leapfrog algorithm

    OpenAIRE

    Yang Mingshun; Gao Xinqin; Cao Yuan; Liu Yong; Li Yan

    2015-01-01

    Purpose: When a workflow changes, resource scheduling optimization during the migration of currently running instances has become a hot issue in research on workflow flexibility. The purpose of this article is to investigate the resource scheduling problem of workflow multi-instance migration. Design/methodology/approach: The time and cost relationships between activities and resources in the workflow instance migration process are analyzed and a resource scheduling optimiza...

  19. Research and design of lightweight workflow model based on improved activity-on-vertex network

    Institute of Scientific and Technical Information of China (English)

    於正琳; 孙精科

    2013-01-01

    To address the deficiencies of existing workflow models in dealing with large and complex systems, the concept of a lightweight model was introduced and a lightweight workflow model based on an improved Activity-On-Vertex (AOV) network was proposed to meet the requirements of large and complex business processes in workflow management. Along with a detailed definition and design of the model, two critical algorithms in process scheduling, a scheduling algorithm for branch points and a synchronization algorithm for convergence points, are given to ensure the accurate operation of the process. The lightweight advantage of the model is demonstrated through workflow modeling of a concrete example. Static and dynamic verification of the model using graph theory proves that the model is reasonable.

  20. Accelerating the scientific exploration process with scientific workflows

    International Nuclear Information System (INIS)

    Although an increasing amount of middleware has emerged in the last few years to achieve remote data access, distributed job execution, and data management, orchestrating these technologies with minimal overhead still remains a difficult task for scientists. Scientific workflow systems improve this situation by creating interfaces to a variety of technologies and automating the execution and monitoring of the workflows. Workflow systems provide domain-independent customizable interfaces and tools that combine different tools and technologies along with efficient methods for using them. As simulations and experiments move into the petascale regime, the orchestration of long running data and compute intensive tasks is becoming a major requirement for the successful steering and completion of scientific investigations. A scientific workflow is the process of combining data and processes into a configurable, structured set of steps that implement semi-automated computational solutions of a scientific problem. Kepler is a cross-project collaboration, co-founded by the SciDAC Scientific Data Management (SDM) Center, whose purpose is to develop a domain-independent scientific workflow system. It provides a workflow environment in which scientists design and execute scientific workflows by specifying the desired sequence of computational actions and the appropriate data flow, including required data transformations, between these steps. Currently deployed workflows range from local analytical pipelines to distributed, high-performance and high-throughput applications, which can be both data- and compute-intensive. 
The scientific workflow approach offers a number of advantages over traditional scripting-based approaches, including ease of configuration, improved reusability and maintenance of workflows and components (called actors), automated provenance management, 'smart' re-running of different versions of workflow instances, on-the-fly updateable parameters, monitoring
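
    The notion of a workflow as a configurable, structured set of steps with data flowing between them can be illustrated with a minimal sketch; the step functions and the `run_workflow` helper are illustrative, not Kepler's API.

```python
# A workflow as a sequence of steps ("actors" in Kepler terms): each
# step consumes the previous step's output, so the data flow is the
# chain of function applications.
def run_workflow(steps, data):
    for step in steps:
        data = step(data)
    return data

# Two toy analysis steps: scale values to [0, 1], then keep large ones.
normalize = lambda xs: [x / max(xs) for x in xs]
threshold = lambda xs: [x for x in xs if x >= 0.5]

result = run_workflow([normalize, threshold], [2, 5, 10])
print(result)  # [0.5, 1.0]
```

    A real workflow system adds what this sketch omits: typed ports between actors, provenance capture, and distributed execution of the steps.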

  1. Analysis of Enterprise Workflow Solutions

    Science.gov (United States)

    Chen, Cui-E.; Wang, Shulin; Chen, Ying; Meng, Yang; Ma, Hua

    Since the 90s, workflow technology has been widely applied in various industries, such as office automation (OA), manufacturing, telecommunications services, banking, securities, insurance and other financial services, research institutes and education services, and so on, to improve business process automation and integration capabilities. In this paper, based on workflow theory, the authors propose a policy-based workflow approach in order to support dynamic workflow patterns. Through the expansion of the functions of Shark, they implemented a workflow engine component, OAShark, which supports a retrieval/rollback function. The related classes were programmed. The technology was applied to the OA system of an enterprise project. The realization of the enterprise workflow solution greatly improved the efficiency of the office automation.
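
    The retrieval/rollback idea can be sketched as follows; the `Engine` class and its methods are hypothetical illustrations, not OAShark's actual interface.

```python
# Hypothetical sketch of an engine that records each completed step
# together with a compensating action, so a failed workflow instance
# can be rolled back in reverse order.
class Engine:
    def __init__(self):
        self.done = []          # (name, undo) pairs, in execution order

    def run(self, name, action, undo):
        action()
        self.done.append((name, undo))

    def rollback(self):
        while self.done:
            name, undo = self.done.pop()   # undo in reverse order
            undo()

log = []
eng = Engine()
eng.run("reserve", lambda: log.append("+reserve"), lambda: log.append("-reserve"))
eng.run("notify",  lambda: log.append("+notify"),  lambda: log.append("-notify"))
eng.rollback()
print(log)  # ['+reserve', '+notify', '-notify', '-reserve']
```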

  2. CSP for Executable Scientific Workflows

    OpenAIRE

    Friborg, Rune Møllegaard

    2011-01-01

    This thesis presents CSP as a means of orchestrating the execution of tasks in a scientific workflow. Scientific workflow systems are popular in a wide range of scientific areas, where tasks are organised in directed graphs. Execution of such graphs is handled by the scientific workflow systems and can usually benefit performance-wise from both multiprocessing, cluster and grid environments.PyCSP is an implementation of Communicating Sequential Processes (CSP) for the Python programming langu...

  3. DOCFLOW: AN INTEGRATED DOCUMENT WORKFLOW FOR BUSINESS PROCESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Boonsit Yimwadsana

    2011-01-01

    Document management and workflow management systems have been widely used in large business enterprises to improve productivity. However, they have not gained wide acceptance in small and medium-sized businesses due to their cost and complexity. In addition, document management and workflow management systems are often used separately because they solve different problems, even though only some parts of document management systems need to be tied together with workflow management systems. In most business environments, documents actually flow according to workflow definitions. Our work, thus, combines the two concepts and simplifies the management of both documents and workflows to fit business users. Our application, DocFlow, is designed with simplicity in mind while still maintaining the necessary workflow and document management standard features with security. An approval mechanism is naturally included in the workflow, and the approval can be performed by a group of actors such that one team member alone is sufficient to make the group's decision. A case study of a news publishing process is shown to demonstrate how DocFlow can be used to create a workflow that fits the news publishing process.
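
    The group-approval rule described above, where one member's sign-off suffices for the group, can be sketched as follows; the function and the group names are illustrative, not DocFlow's API.

```python
# One approval from any member of the assigned group decides for the
# whole group; decisions from users outside the group are ignored.
def group_approved(group, decisions):
    """`decisions` maps user -> True/False."""
    return any(decisions.get(member) for member in group)

editors = {"alice", "bob", "carol"}
print(group_approved(editors, {"bob": True}))    # one editor approved
print(group_approved(editors, {"dave": True}))   # dave is not in the group
```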

  4. Data Exchange in Grid Workflow

    Institute of Scientific and Technical Information of China (English)

    ZENG Hongwei; MIAO Huaikou

    2006-01-01

    In existing web services-based workflows, data exchange across the web services is centralized; the workflow engine mediates at each step of the application sequence. However, many grid applications, especially data-intensive scientific applications, require exchanging large amounts of data across the grid services. Having a central workflow engine relay the data between the services would result in a bottleneck in these cases. This paper proposes a data exchange model for individual grid workflows and for multi-workflow composition respectively. The model enables direct communication of large amounts of data between two grid services. To enable data exchange among multiple workflows, a bridge data service is used.
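
    The direct-exchange idea, where the engine passes only lightweight references and the services fetch bulk data from each other, can be sketched as follows; the `Service` class is an illustrative stand-in, not the paper's model.

```python
# The engine never touches the payload: the producer returns a small
# reference, and the consumer resolves it service-to-service.
class Service:
    def __init__(self, name):
        self.name, self.store = name, {}

    def produce(self, key, data):
        self.store[key] = data
        return (self.name, key)            # lightweight reference

    def pull(self, services, ref):
        src, key = ref
        return services[src].store[key]    # direct fetch, no engine relay

services = {"A": Service("A"), "B": Service("B")}
ref = services["A"].produce("result", list(range(5)))   # engine only sees `ref`
data = services["B"].pull(services, ref)
print(data)  # [0, 1, 2, 3, 4]
```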

  5. Radiology information system: a workflow-based approach

    International Nuclear Information System (INIS)

    Introducing workflow management technology in healthcare seems prospective in dealing with the problem that current healthcare information systems cannot provide sufficient support for process management, although several challenges still exist. The purpose of this paper is to study the method of developing a workflow-based information system in a radiology department as a use case. First, a workflow model of the typical radiology process was established. Second, based on the model, the system could be designed and implemented as a group of loosely coupled components. Each component corresponded to one task in the process and could be assembled by the workflow management system. The legacy systems could be taken as special components, which also corresponded to tasks and were integrated through transferring non-workflow-aware interfaces to the standard ones. Finally, a workflow dashboard was designed and implemented to provide an integral view of radiology processes. The workflow-based Radiology Information System was deployed in the radiology department of Zhejiang Chinese Medicine Hospital in China. The results showed that it could be adjusted flexibly in response to the needs of changing processes, and enhance process management in the department. It can also provide a more workflow-aware integration method, compared with other methods such as IHE-based ones. The workflow-based approach is a new method of developing radiology information systems with more flexibility, more functionalities of process management and more workflow-aware integration. The work of this paper is an initial endeavor for introducing workflow management technology in healthcare. (orig.)

  6. Conventions and workflows for using Situs

    International Nuclear Information System (INIS)

    Recent developments of the Situs software suite for multi-scale modeling are reviewed. Typical workflows and conventions encountered during processing of biophysical data from electron microscopy, tomography or small-angle X-ray scattering are described. Situs is a modular program package for the multi-scale modeling of atomic resolution structures and low-resolution biophysical data from electron microscopy, tomography or small-angle X-ray scattering. This article provides an overview of recent developments in the Situs package, with an emphasis on workflows and conventions that are important for practical applications. The modular design of the programs facilitates scripting in the bash shell that allows specific programs to be combined in creative ways that go beyond the original intent of the developers. Several scripting-enabled functionalities, such as flexible transformations of data type, the use of symmetry constraints or the creation of two-dimensional projection images, are described. The processing of low-resolution biophysical maps in such workflows follows not only first principles but often relies on implicit conventions. Situs conventions related to map formats, resolution, correlation functions and feature detection are reviewed and summarized. The compatibility of the Situs workflow with CCP4 conventions and programs is discussed

  7. Workflow Management in Electronic Commerce

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Spaccapietra, S.; March, S.T.; Kambayashi, Y.

    2002-01-01

    This tutorial addresses the application of workflow management (WFM) for process support in both these cases. The tutorial is organized into three parts. In the first part, we pay attention to (classical) workflow management in the context of a single organization. In the second part, we extend this

  8. Constructing Workflows from Script Applications

    Directory of Open Access Journals (Sweden)

    Mikołaj Baranowski

    2012-01-01

    For programming and executing complex applications on grid infrastructures, scientific workflows have been proposed as a convenient high-level alternative to solutions based on general-purpose programming languages, APIs and scripts. GridSpace is a collaborative programming and execution environment, which is based on a scripting approach and extends the Ruby language with a high-level API for invoking operations on remote resources. In this paper we describe a tool which converts GridSpace application source code into a workflow representation which, in turn, may be used for scheduling, provenance, or visualization. We describe how we addressed the issues of analyzing Ruby source code, resolving variable and method dependencies, and building the workflow representation. The solutions to these problems have been developed and were evaluated by testing them on complex grid application workflows such as CyberShake, Epigenomics and Montage. The evaluation is enriched by representing typical workflow control flow patterns.
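
    The core of such a conversion is a def/use dependency analysis. A toy sketch, operating on pre-extracted def/use sets rather than parsed Ruby, might look like the following; the statement format is an illustrative assumption.

```python
# Each statement defines one variable and uses others; an edge runs
# from the statement that last defined a variable to every later
# statement that reads it, yielding the workflow DAG.
def build_dag(statements):
    """statements: list of (id, defines, uses). Returns edges as
    (producer_id, consumer_id) pairs."""
    last_def, edges = {}, []
    for sid, defines, uses in statements:
        for var in uses:
            if var in last_def:
                edges.append((last_def[var], sid))
        last_def[defines] = sid
    return edges

script = [("s1", "raw",   []),
          ("s2", "clean", ["raw"]),
          ("s3", "plot",  ["clean", "raw"])]
print(build_dag(script))  # [('s1', 's2'), ('s2', 's3'), ('s1', 's3')]
```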

  9. Implementing Oracle Workflow

    CERN Document Server

    Mathieson, D W

    1999-01-01

    CERN (see [CERN]) is the world's largest physics research centre. Currently there are around 5,000 people working at the CERN site located on the border of France and Switzerland near Geneva, along with another 4,000 working remotely at institutes situated all around the globe. CERN is currently working on the construction of its newest scientific instrument, the Large Hadron Collider (LHC); the construction alone of this 27-kilometre particle accelerator will not complete until 2005. Like many businesses in the current economic climate, CERN is expected to continue growing, yet staff numbers are planned to fall in the coming years. In essence, do more with less. In an environment such as this, it is critical that the administration is as efficient as possible. One of the ways that administrative procedures are streamlined is by the use of an organisation-wide workflow system.

  10. CMS Alignment and Calibration workflows: lessons learned and future plans

    CERN Document Server

    De Guio, Federico

    2014-01-01

    We review the online and offline workflows designed to align and calibrate the CMS detector. Starting from the experience gained during the first LHC run, we discuss the expected developments for Run II. In particular, we describe the envisioned stages, from alignment using cosmic ray data to detector alignment and calibration using the first proton-proton collision data ( O(100 pb-1) ) and a larger dataset ( O(1 fb-1) ) to reach the target precision. The automatisation of the workflow and the integration in the online and offline activity (dedicated triggers and datasets, data skims, workflows to compute the calibration and alignment constants) are discussed.

  11. Using SharePoint server for managing workflow in complex projects

    OpenAIRE

    ORAČ, ROMAN

    2011-01-01

    The aim of the thesis is to find efficient ways of using SharePoint server in an organization and to study in detail the use of workflows for project management. For this purpose, we set up a SharePoint server, which serves as a test environment, and design a workflow. SharePoint Server uses the concept of workflow for project management. Workflows can consistently manage common business processes within an organization. A workflow can be described as a series of tasks which produces a result. Wor...

  12. Advanced Architectures for Transactional Workflows or Advanced Transactions in Workflow Architectures

    NARCIS (Netherlands)

    Grefen, Paul

    1999-01-01

    In this short paper, we outline the workflow management systems research in the Information Systems division at the University of Twente. We discuss the two main themes in this research: architecture design and advanced transaction management. Attention is paid to the coverage of these themes in the

  13. Managing and Documenting Legacy Scientific Workflows.

    Science.gov (United States)

    Acuña, Ruben; Chomilier, Jacques; Lacroix, Zoé

    2015-01-01

    Scientific legacy workflows are often developed over many years, poorly documented and implemented with scripting languages. In the context of our cross-disciplinary projects we face the problem of maintaining such scientific workflows. This paper presents the Workflow Instrumentation for Structure Extraction (WISE) method used to process several ad-hoc legacy workflows written in Python and automatically produce their workflow structural skeleton. Unlike many existing methods, WISE does not assume input workflows to be preprocessed in a known workflow formalism. It is also able to identify and analyze calls to external tools. We present the method and report its results on several scientific workflows. PMID:26673793
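
    A minimal sketch of this kind of analysis, using Python's `ast` module to spot calls to external tools, could look like the following; the assumption that scripts shell out via `subprocess` and the function name are illustrative, not WISE's actual method.

```python
import ast

# Walk the script's AST and collect the command strings passed to
# subprocess.* calls as candidate workflow tasks.
def external_calls(source):
    tasks = []
    for node in ast.walk(ast.parse(source)):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Attribute)
                and isinstance(node.func.value, ast.Name)
                and node.func.value.id == "subprocess"):
            first = node.args[0] if node.args else None
            if isinstance(first, ast.Constant):
                tasks.append(first.value)
    return tasks

script = (
    'import subprocess\n'
    'subprocess.run("blastp -query in.fa")\n'
    'subprocess.run("muscle -in hits.fa")\n'
)
print(external_calls(script))  # ['blastp -query in.fa', 'muscle -in hits.fa']
```

    Real legacy scripts also pass dynamically built command lines, which is where static analysis alone stops being sufficient.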

  14. An Architecture for Decentralised Orchestration of Web Service Workflows

    OpenAIRE

    Jaradat, Ward; Dearle, Alan; Barker, Adam

    2013-01-01

    Service-oriented workflows are typically executed using a centralised orchestration approach that presents significant scalability challenges. These challenges include the consumption of network bandwidth, degradation of performance, and single-points of failure. We provide a decentralised orchestration architecture that attempts to address these challenges. Our architecture adopts a design model that permits the computation to be moved "closer" to services in a workflow. This is achieved by ...

  15. Design and implementation of electricity information acquisition system based on workflow engine

    Institute of Scientific and Technical Information of China (English)

    罗天; 赵丹阳; 郑静雯

    2015-01-01

    One important problem of current electricity information collection systems is that business processes are hard-coded in the programs: the program must be modified to adapt to changes in the business logic, which causes a large amount of construction and maintenance work. A workflow engine can be used to solve this problem. It can automatically execute predefined business processes and drive the software to run in an orderly fashion. Based on workflow technology, this article designs and implements an electricity information acquisition system in which workflows can be defined, managed and monitored, effectively improving the efficiency of business handling. The workflow can also easily be adjusted or redefined to adapt to changes in the actual business logic, greatly reducing development and maintenance costs.

  16. CA-PLAN, a Service-Oriented Workflow

    Institute of Scientific and Technical Information of China (English)

    Shung-Bin Yan; Feng-Jian Wang

    2005-01-01

    Workflow management systems (WfMSs) are accepted worldwide due to their ability to model and control business processes. Previously, we defined an intra-organizational workflow specification model, Process LANguage (PLAN). PLAN, with associated tools, allowed a user to describe a graph specification for processes, artifacts, and participants in an organization. PLAN has been successfully implemented in Agentflow to support workflow (Agentflow) applications. PLAN, and most current WfMSs, are designed to adopt a centralized architecture so that they can be applied to a single organization. However, in such a structure, participants in Agentflow applications in different organizations cannot serve each other with workflows. In this paper, a service-oriented cooperative workflow model, Cooperative Agentflow Process LANguage (CA-PLAN), is presented. CA-PLAN proposes a workflow component model to model inter-organizational processes. In CA-PLAN, an inter-organizational process is partitioned into several intra-organizational processes. Each workflow system inside an organization is modeled as an Integrated Workflow Component (IWC). Each IWC contains a process service interface, specifying process services provided by an organization, in conjunction with a remote process interface specifying what remote processes are used to refer to remote process services provided by other organizations, and intra-organizational processes. An IWC is a workflow node and participant. An inter-organizational process is made up of connections among these process services and remote processes with respect to different IWCs. In this paper, the related service techniques and supporting tools provided in Agentflow systems are presented.

  17. Provenance-Powered Automatic Workflow Generation and Composition

    Science.gov (United States)

    Zhang, J.; Lee, S.; Pan, L.; Lee, T. J.

    2015-12-01

    In recent years, scientists have learned how to codify tools into reusable software modules that can be chained into multi-step executable workflows. Existing scientific workflow tools, created by computer scientists, require domain scientists to meticulously design their multi-step experiments before analyzing data. However, this is oftentimes contradictory to a domain scientist's daily routine of conducting research and exploration. We hope to resolve this dispute. Imagine this: An Earth scientist starts her day applying NASA Jet Propulsion Laboratory (JPL) published climate data processing algorithms over ARGO deep ocean temperature and AMSRE sea surface temperature datasets. Throughout the day, she tunes the algorithm parameters to study various aspects of the data. Suddenly, she notices some interesting results. She then turns to a computer scientist and asks, "can you reproduce my results?" By tracking and reverse engineering her activities, the computer scientist creates a workflow. The Earth scientist can now rerun the workflow to validate her findings, modify the workflow to discover further variations, or publish the workflow to share the knowledge. In this way, we aim to revolutionize computer-supported Earth science. We have developed a prototyping system to realize the aforementioned vision, in the context of service-oriented science. We have studied how Earth scientists conduct service-oriented data analytics research in their daily work, developed a provenance model to record their activities, and developed a technology to automatically generate workflows from user behavior, supporting the adaptability and reuse of these workflows for replicating and improving scientific studies. A data-centric repository infrastructure is established to capture richer provenance to further facilitate collaboration in the science community.
We have also established a Petri nets-based verification instrument for provenance-based automatic workflow generation and recommendation.
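
    The record-then-replay idea can be sketched as follows; the `Recorder` class, service names, and parameters are hypothetical, not the prototype's API.

```python
# Record each service invocation the scientist performs as a provenance
# entry; the accumulated log can then be frozen into a reproducible
# workflow (an ordered list of steps and their parameters).
class Recorder:
    def __init__(self):
        self.log = []

    def invoke(self, service, params, fn):
        self.log.append((service, params))     # provenance record
        return fn(params)

    def as_workflow(self):
        """Freeze the captured activity into a replayable step list."""
        return list(self.log)

rec = Recorder()
rec.invoke("regrid",  {"res": 1.0},   lambda p: p["res"] * 2)
rec.invoke("anomaly", {"base": 1981}, lambda p: p["base"] + 30)
print(rec.as_workflow())  # [('regrid', {'res': 1.0}), ('anomaly', {'base': 1981})]
```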

  18. Workflows for microarray data processing in the Kepler environment

    Directory of Open Access Journals (Sweden)

    Stropp Thomas

    2012-05-01

    Background: Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results: We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data and therefore are close to

  19. The design of the workflow-based national geological project information system

    Institute of Scientific and Technical Information of China (English)

    王新春; 张怀东

    2011-01-01

    A workflow-based national geological project information system was designed in this paper. It includes all the workflow templates involved in geological project management, so that flow instances at the various stages can be routed among users holding different roles and authorities, realizing the transmission of each type of information. The system raises the working efficiency of the management of national geological projects and makes the work procedure standard, open, and systematic. In addition, it is beneficial to future geological project management.

  20. Summer Student Report - AV Workflow

    CERN Document Server

    Abramson, Jessie

    2014-01-01

    The AV Workflow is a web application which allows CERN users to publish, update and delete videos from CDS. During my summer internship I implemented the backend of the new version of the AV Workflow in Python using the Django framework.

  1. Automated attribute inference in complex service workflows based on sharing analysis

    OpenAIRE

    Ivanovic, Dragan; Carro Liñares, Manuel; Hermenegildo, Manuel V.

    2011-01-01

    The properties of data and activities in business processes can be used to greatly facilitate several relevant tasks performed at design- and run-time, such as fragmentation, compliance checking, or top-down design. Business processes are often described using workflows. We present an approach for mechanically inferring business domain-specific attributes of workflow components (including data items, activities, and elements of sub-workflows), taking as starting point known attrib...

  2. Customizable Isolation in Transactional Workflow

    OpenAIRE

    Guabtni, Adnene; Charoy, François; Godart, Claude

    2005-01-01

    In Workflow Management Systems (WFMSs), safety of execution is a need of more and more business processes, and transactional workflows are a real need inside enterprises. Previous transactional models consider mainly atomicity as the main issue regarding long-term transactions. They rarely consider the fact that many processes may run concurrently and thus access and update the same data. Usually, the main isolation item is the data, on which we apply locking approaches, and this atti...

  3. SPECT/CT workflow and imaging protocols

    International Nuclear Information System (INIS)

    Introducing a hybrid imaging method such as single photon emission computed tomography (SPECT)/CT greatly alters the routine in the nuclear medicine department. It requires designing new workflow processes and revising the original scheduling processes and imaging protocols. In addition, the imaging protocol should be adapted for each individual patient, so that performing CT is fully justified and the CT procedure is fully tailored to address the clinical issue. Such refinements often occur before the procedure is started but may be required at some intermediate stage of the procedure. Furthermore, SPECT/CT leads in many instances to a new partnership with the radiology department. This article presents practical advice and highlights the key clinical elements which need to be considered to help understand the workflow process of SPECT/CT and optimise imaging protocols. The workflow process using SPECT/CT is complex, in particular because of its bimodal character, the large spectrum of stakeholders, the multiplicity of their activities at various time points and the need for real-time decision-making. With help from analytical tools developed for quality assessment, the workflow process using SPECT/CT may be separated into related, but independent steps, each with its specific human and material resources to use as inputs or outputs. This helps identify factors that could contribute to failure in routine clinical practice. At each step of the process, practical aspects to optimise imaging procedures and protocols are developed. A decision-making algorithm for justifying each CT indication as well as the appropriateness of each CT protocol is the cornerstone of routine clinical practice using SPECT/CT. In conclusion, implementing hybrid SPECT/CT imaging requires new ways of working. It is highly rewarding from a clinical perspective, but it also proves to be a daily challenge in terms of management. (orig.)

  4. LQCD workflow execution framework: Models, provenance and fault-tolerance

    International Nuclear Information System (INIS)

    Large computing clusters used for scientific processing suffer from systemic failures when operated over long continuous periods for executing workflows. Diagnosing job problems and faults leading to eventual failures in this complex environment is difficult, specifically when the success of an entire workflow might be affected by a single job failure. In this paper, we introduce a model-based, hierarchical, reliable execution framework that encompasses workflow specification, data provenance, execution tracking and online monitoring of each workflow task, also referred to as participants. The sequence of participants is described in an abstract parameterized view, which is translated into a concrete data-dependency-based sequence of participants with defined arguments. As participants belonging to a workflow are mapped onto machines and executed, periodic and on-demand monitoring of vital health parameters on allocated nodes is enabled according to pre-specified rules. These rules specify conditions that must be true pre-execution, during execution and post-execution. Monitoring information for each participant is propagated upwards through the reflex and healing architecture, which consists of a hierarchical network of decentralized fault management entities, called reflex engines. They are instantiated as state machines or timed automata that change state and initiate reflexive mitigation action(s) upon occurrence of certain faults. We describe how this cluster reliability framework is combined with the workflow execution framework using formal rules and actions specified within a structure of first-order predicate logic, enabling a dynamic management design that reduces manual administrative workload and increases cluster productivity.
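The record above describes reflex engines as state machines that change state and trigger reflexive mitigation actions when faults occur. A minimal sketch of that idea in Python; the states, events and mitigation action are hypothetical illustrations, not the LQCD framework's actual API:

```python
from enum import Enum

class State(Enum):
    HEALTHY = "healthy"
    DEGRADED = "degraded"
    FAILED = "failed"

class ReflexEngine:
    """Toy state machine reacting to node-health events (hypothetical rules)."""
    def __init__(self):
        self.state = State.HEALTHY
        self.actions = []  # mitigation actions taken so far

    def on_event(self, event):
        # Rules: a fault degrades a healthy node; a second fault fails it,
        # triggering a reflexive mitigation action (here: reschedule).
        if event == "fault":
            if self.state is State.HEALTHY:
                self.state = State.DEGRADED
            elif self.state is State.DEGRADED:
                self.state = State.FAILED
                self.actions.append("reschedule_participant")
        elif event == "recovered":
            self.state = State.HEALTHY

engine = ReflexEngine()
for ev in ["fault", "fault"]:
    engine.on_event(ev)
print(engine.state, engine.actions)
```

In the real framework such engines form a hierarchy, with monitoring information propagating upwards; the sketch models only a single engine.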

  5. LQCD workflow execution framework: Models, provenance and fault-tolerance

    Science.gov (United States)

    Piccoli, Luciano; Dubey, Abhishek; Simone, James N.; Kowalkowlski, James B.

    2010-04-01

    Large computing clusters used for scientific processing suffer from systemic failures when operated over long continuous periods for executing workflows. Diagnosing job problems and faults leading to eventual failures in this complex environment is difficult, specifically when the success of an entire workflow might be affected by a single job failure. In this paper, we introduce a model-based, hierarchical, reliable execution framework that encompasses workflow specification, data provenance, execution tracking and online monitoring of each workflow task, also referred to as participants. The sequence of participants is described in an abstract parameterized view, which is translated into a concrete data-dependency-based sequence of participants with defined arguments. As participants belonging to a workflow are mapped onto machines and executed, periodic and on-demand monitoring of vital health parameters on allocated nodes is enabled according to pre-specified rules. These rules specify conditions that must be true pre-execution, during execution and post-execution. Monitoring information for each participant is propagated upwards through the reflex and healing architecture, which consists of a hierarchical network of decentralized fault management entities, called reflex engines. They are instantiated as state machines or timed automata that change state and initiate reflexive mitigation action(s) upon occurrence of certain faults. We describe how this cluster reliability framework is combined with the workflow execution framework using formal rules and actions specified within a structure of first-order predicate logic, enabling a dynamic management design that reduces manual administrative workload and increases cluster productivity.

  6. Development of a Workflow Integration Survey (WIS) for Implementing Computerized Clinical Decision Support

    OpenAIRE

    Flanagan, Mindy; Arbuckle, Nicole; Saleem, Jason J; Militello, Laura G.; Haggstrom, David A.; Doebbeling, Bradley N

    2011-01-01

    Interventions that focus on improving computerized clinical decision support (CDS) demonstrate that successful workflow integration can increase the adoption and use of CDS. However, metrics for assessing workflow integration in clinical settings are not well established. The goal of this study was to develop and validate a survey to assess the extent to which CDS is integrated into workflow. Qualitative data on CDS design, usability, and integration from four sites were collected by direct observation...

  7. A component-based product line architecture for workflow management systems

    OpenAIRE

    Lazilha, Fabrício Ricardo; Barroca, Leonor; de Oliveira Junior, Edson Alves; de Souza Gimenes, Itana Maria

    2004-01-01

    This paper presents a component-based product line for workflow management systems. The process followed to design the product line was based on the Catalysis method. Extensions were made to represent variability across the process. The domain of workflow management systems has been shown to be appropriate to the application of the product line approach as there are a standard architecture and models established by a regulatory board, the Workflow Management Coalition. In addition, there is a...

  8. Concurrency & Asynchrony in Declarative Workflows

    DEFF Research Database (Denmark)

    Debois, Søren; Hildebrandt, Thomas; Slaats, Tijs

    2015-01-01

    Declarative or constraint-based business process and workflow notations have received increasing interest in the last decade as possible means of addressing the challenge of supporting, at the same time, flexibility in execution, adaptability and compliance. However, the definition of concurrent semantics, which is a necessary foundation for asynchronously executing distributed processes, is not obvious for declarative formalisms and is so far virtually unexplored. This is in stark contrast to the very successful Petri-net-based process languages, which have an inherent notion of concurrency. We show that concurrency in DCR Graphs admits asynchronous execution of declarative workflows, both conceptually and by reporting on a prototype implementation of a distributed declarative workflow engine. Both the theoretical development and the implementation are supported by an extended example...

  9. Advanced Architectures for Transactional Workflows or Advanced Transactions in Workflow Architectures

    OpenAIRE

    Grefen, Paul

    1999-01-01

    In this short paper, we outline the workflow management systems research in the Information Systems division at the University of Twente. We discuss the two main themes in this research: architecture design and advanced transaction management. Attention is paid to the coverage of these themes in the context of the completed Mercurius and WIDE projects and in the new CrossFlow project. In the latter project, contracts are introduced as a new theme to support electronic commerce aspects in workflow...

  10. Workflow processes in the cloud

    OpenAIRE

    Peralta, Mario; Salgado, Carlos Humberto; Baigorria, Lorena; Montejano, Germán Antonio; Riesco, Daniel Eduardo

    2014-01-01

    Given the globalisation of information, organisations tend to virtualise their businesses: to move their business into the Cloud. From the perspective of the complexity of business processes, one of the most significant technologies for supporting their automation is Workflow Management Systems, which provide computational support to define, synchronise and execute process activities using workflows. To favour such systems and give them flexibility, it is fundamental to have tools...

  11. Similarity measures for scientific workflows

    OpenAIRE

    Starlinger, Johannes

    2016-01-01

    Over the last ten years, scientific workflows have gained attention as a tool for creating reproducible, data-processing in-silico experiments, into which both local scripts and applications as well as web services can be integrated. Such workflows can be published and reused via specialised online libraries, so-called repositories. As these repositories grow in size, similarity measures for scientific workflows...

  12. Workflow in Astronomy : the VO France Workflow Working Group experience

    Science.gov (United States)

    Schaaff, A.; Petit, F. L.; Prugniel, P.; Slezak, E.; Surace, C.

    2008-08-01

    The French Action Spécifique Observatoires Virtuels created the Workflow Working Group in 2005. Its aim is to explore the use of the workflow paradigm in the astronomical domain. The first consensus was the definition of a workflow as a sequence of tasks realized in a controlled context (at various levels: intelligence in the choice of algorithms, flow control, etc.), based on use-case studies, in an architecture which takes into account VO standards. The current roadmap is to provide scientific use cases in several domains (image, spectrum, simulation, data mining, etc.) and to improve them, mainly with existing VO tools. Another important point is to develop collaborations with the IT community (links to EGEE, ...). Use cases are useful for comparing the pertinence of the possible workflow models and for understanding how to implement them as efficiently as possible with the existing tools (e.g. AstroGrid, AÏDA, WebCom-G, etc.). The execution (local machine, cluster, grid) through this kind of tool and the use of VO functionalities (Web Services, Grid, VOSpace, etc.) becomes almost transparent.

  13. Reenginering of the i4 workflow engine

    OpenAIRE

    Likar, Tilen

    2013-01-01

    I4 is an enterprise resource planning system which allows you to manage business processes. Due to increasing demands for managing complex processes and adjusting those processes to global standards, a renewal of part of the system was required. In this thesis we address the reengineering of the workflow engine and the corresponding data model. We designed a business process diagram in Bizagi Process Modeler. The import to i4 and the export from i4 were developed around the XPDL file exported from the model...

  14. PRODUCT-ORIENTED WORKFLOW MANAGEMENT IN CAPP

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    A product-oriented process workflow management model is proposed based on multi-agent technology. The autonomy, interoperability, scalability and flexibility of agents are used to coordinate the whole process planning and achieve full sharing of resources and information. Thus, unnecessary waste of human labor, time and work is reduced, and the adaptability and stability of the computer-aided process planning (CAPP) system are improved. In the detailed implementation, according to the product's BOM (bill of materials) in structural design, task assignment, management control, automatic process making, process examination and process sanction are combined into a unified management scheme, making adjustment, control and management convenient.

  15. CMS data and workflow management system

    CERN Document Server

    Fanfani, A; Bacchi, W; Codispoti, G; De Filippis, N; Pompili, A; My, S; Abbrescia, M; Maggi, G; Donvito, G; Silvestris, L; Calzolari, F; Sarkar, S; Spiga, D; Cinquili, M; Lacaprara, S; Biasotto, M; Farina, F; Merlo, M; Belforte, S; Kavka, C; Sala, L; Harvey, J; Hufnagel, D; Fanzago, F; Corvo, M; Magini, N; Rehn, J; Toteva, Z; Feichtinger, D; Tuura, L; Eulisse, G; Bockelman, B; Lundstedt, C; Egeland, R; Evans, D; Mason, D; Gutsche, O; Sexton-Kennedy, L; Dagenhart, D W; Afaq, A; Guo, Y; Kosyakov, S; Lueking, L; Sekhri, V; Fisk, I; McBride, P; Bauerdick, L; Bakken, J; Rossman, P; Wicklund, E; Wu, Y; Jones, C; Kuznetsov, V; Riley, D; Dolgert, A; van Lingen, F; Narsky, I; Paus, C; Klute, M; Gomez-Ceballos, G; Piedra-Gomez, J; Miller, M; Mohapatra, A; Lazaridis, C; Bradley, D; Elmer, P; Wildish, T; Wuerthwein, F; Letts, J; Bourilkov, D; Kim, B; Smith, P; Hernandez, J M; Caballero, J; Delgado, A; Flix, J; Cabrillo-Bartolome, I; Kasemann, M; Flossdorf, A; Stadie, H; Kreuzer, P; Khomitch, A; Hof, C; Zeidler, C; Kalini, S; Trunov, A; Saout, C; Felzmann, U; Metson, S; Newbold, D; Geddes, N; Brew, C; Jackson, J; Wakefield, S; De Weirdt, S; Adler, V; Maes, J; Van Mulders, P; Villella, I; Hammad, G; Pukhaeva, N; Kurca, T; Semneniouk, I; Guan, W; Lajas, J A; Teodoro, D; Gregores, E; Baquero, M; Shehzad, A; Kadastik, M; Kodolova, O; Chao, Y; Ming Kuo, C; Filippidis, C; Walzel, G; Han, D; Kalinowski, A; Giro de Almeida, N M; Panyam, N

    CMS expects to manage many tens of petabytes of data, distributed over several computing centers around the world. The CMS distributed computing and analysis model is designed to serve, process and archive the large number of events that will be generated when the CMS detector starts taking data. The underlying concepts and the overall architecture of the CMS data and workflow management system will be presented. In addition, the experience of using the system for MC production, initial detector commissioning activities and data analysis will be summarized.

  16. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee

    2016-01-01

    This article presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults and model resources associated with a workflow. The approach is fully automated; only the modelling of the production workflows, the potential faults and the expression of the goals requires manual input. We present the design of a software tool implementing this framework and explore the practical utility of this approach through an industrial case study in which the risk of production failures and their impact are reduced by restructuring the workflow.
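The restructuring idea above, searching for workflow variants that reduce the expected impact of faults, can be illustrated with a toy (1+1) evolutionary search over task orderings. The tasks, fault probabilities and cost model below are invented for illustration and are not the paper's BPMN-based formalism:

```python
import random

# Toy model: a workflow is an ordering of tasks; each task has a fault
# probability, and a failure wastes the work already completed before it.
# Expected impact of an ordering: sum over tasks of
#   P(fail at task) * (units of work completed before it).
TASKS = {"cut": 0.01, "weld": 0.10, "paint": 0.02, "inspect": 0.05}

def expected_impact(order):
    done, impact = 0.0, 0.0
    for t in order:
        impact += TASKS[t] * done   # failing here wastes prior work
        done += 1.0
    return impact

def mutate(order):
    # Swap two random positions to produce a candidate restructuring.
    i, j = random.sample(range(len(order)), 2)
    new = list(order)
    new[i], new[j] = new[j], new[i]
    return new

random.seed(0)
best = list(TASKS)
for _ in range(200):               # simple (1+1) evolutionary search
    cand = mutate(best)
    if expected_impact(cand) <= expected_impact(best):
        best = cand
print(best, round(expected_impact(best), 3))
```

For this cost model the search converges on placing failure-prone tasks early, so less completed work is at risk; the paper's framework instead evaluates candidates via stochastic model checking.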

  17. Constructing workflows from script applications

    NARCIS (Netherlands)

    M. Baranowski; A. Belloum; M. Bubak; M. Malawski

    2013-01-01

    For programming and executing complex applications on grid infrastructures, scientific workflows have been proposed as a convenient high-level alternative to solutions based on general-purpose programming languages, APIs and scripts. GridSpace is a collaborative programming and execution environment...

  18. Integration of the radiotherapy irradiation planning in the digital workflow

    International Nuclear Information System (INIS)

    Background and purpose: At the Clinic of Radiotherapy at the University Hospital Freiburg, all relevant workflows are paperless. After implementing the Operating Schedule System (OSS) as a framework, all processes are being implemented in the departmental system MOSAIQ. Designing a digital workflow for radiotherapy irradiation planning is a large challenge: it requires interdisciplinary expertise, and therefore the interfaces between the professions also have to be interdisciplinary. For every single step of radiotherapy irradiation planning, distinct responsibilities have to be defined and documented. All aspects of digital storage, backup and long-term availability of data were considered and had already been realized during the OSS project. Method: After an analysis of the complete workflow and the statutory requirements, a detailed project plan was designed. In an interdisciplinary workgroup, problems were discussed and a detailed flowchart was developed. The new functionalities were implemented in a testing environment by the Clinical and Administrative IT Department (CAI). After extensive tests they were integrated into the new modular department system. Results and conclusion: The Clinic of Radiotherapy succeeded in realizing a completely digital workflow for radiotherapy irradiation planning. During the testing phase, our digital workflow was examined and afterwards approved by the responsible authority. (orig.)

  19. Continuous digital workflows for earth science research

    OpenAIRE

    Klump, J.; Löwe, P.

    2007-01-01

    The wealth of data available in the earth sciences is underutilised due to the absence of continuous digital workflows. The emergence of standardised web services for geospatial data, sensor network integration and grid technology now offer tools for the creation of such workflows, orchestrated by workflow engines. The creation of continuous digital workflows enables us to create new tools for global collaboration in the earth sciences by integrating the acquisition of data and metadata, and ...

  20. Delegation Protocols in Human-Centric Workflows

    OpenAIRE

    Gaaloul, Khaled; Proper, Erik; Charoy, François

    2011-01-01

    Work in organisations is facilitated and conducted using workflow management systems. Currently, we observe a tendency to move away from strict workflow modelling towards dynamic approaches supporting human interactions when deploying a workflow. One specific approach ensuring human-centric workflows is task delegation. Delegating a task may require access to specific and potentially sensitive data that have to be secured and specified in authorisation policies. In this...

  1. Ship 3D Collaborative Design Workflow Research Based on Integrated Platform

    Institute of Scientific and Technical Information of China (English)

    苏绍娟; 刘寅东; 刘晓明

    2011-01-01

    First, the characteristics of modern ship design and manufacture, and the limitations that remain as information technology develops, are analyzed. On this basis, the need for parallel and collaborative design is put forward. The collaborative design management software Windchill is studied in depth and applied; an integrated platform combining the 3D design software SolidWorks with the management software Windchill is constructed through middleware, and its key technologies are analyzed. A 3D collaborative design workflow is carried out on a ship design example. The integrated platform creates conditions for the further implementation of parallel, dynamic, real-time control of the design process.

  2. Performance engineering method for workflow systems : an integrated view of human and computerised work processes

    OpenAIRE

    Brataas, Gunnar

    1996-01-01

    A method for designing workflow systems which satisfy performance requirements is proposed in this thesis. Integration of human and computerised performance is particularly useful for workflow systems where human and computerised processes are intertwined. The proposed framework encompasses human and computerised resources. Even though systematic performance engineering is not common practice in information system development, current best practice shows that performance engineering of software...

  3. Modeling Workflow Using UML Activity Diagram

    Institute of Scientific and Technical Information of China (English)

    Wei Yinxing(韦银星); Zhang Shensheng

    2004-01-01

    An enterprise can improve its adaptability in a changing market by means of workflow technologies. At build time, the main function of a Workflow Management System (WFMS) is to model the business process. A workflow model is an abstract representation of the real-world business process. The Unified Modeling Language (UML) activity diagram is an important visual process modeling language proposed by the Object Management Group (OMG). The novelty of this paper is representing workflow models by means of UML activity diagrams. A translation from UML activity diagrams to π-calculus is established. Using π-calculus, the deadlock property of workflows is analyzed.
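As a much simpler stand-in for the π-calculus analysis described above, a deadlock in a workflow model can be illustrated as a reachable state from which the final node can no longer be reached. The graphs below are invented examples, not the paper's formalisation:

```python
from collections import deque

# A workflow as a directed graph of activity nodes. A node is deadlocked
# if the final node is unreachable from it; a workflow is deadlock-free
# if every node reachable from the start can still reach the final node.
def reachable(graph, start):
    seen, queue = {start}, deque([start])
    while queue:
        n = queue.popleft()
        for m in graph.get(n, []):
            if m not in seen:
                seen.add(m)
                queue.append(m)
    return seen

def deadlock_free(graph, start, final):
    return all(final in reachable(graph, n) for n in reachable(graph, start))

ok = {"start": ["a"], "a": ["b"], "b": ["final"]}
bad = {"start": ["a"], "a": ["b"], "b": ["a"]}  # a/b cycle never reaches final
print(deadlock_free(ok, "start", "final"), deadlock_free(bad, "start", "final"))
```

The π-calculus treatment additionally captures concurrency and communication, which plain graph reachability does not.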

  4. Workflow Management in CLARIN-DK

    DEFF Research Database (Denmark)

    Jongejan, Bart

    2013-01-01

    In CLARIN-DK, the user describes her goal with a set of features, because the workflow manager not only executes chains of tools in a workflow, but also takes care of autonomously devising workflows that serve the user's intention, given the tools that currently are integrated in the infrastructure as web services. To do this, the workflow manager needs stringent and complete information about each integrated tool. We discuss how such information is structured in CLARIN-DK. Provided that many tools are made available to and through the CLARIN-DK infrastructure, the automatically created workflows, although simple linear programs...

  5. Evaluating data caching techniques in DMCF workflows using Hercules

    OpenAIRE

    Rodrigo Duro, Francisco José; Marozzo, Fabrizio; García Blas, Javier; Carretero Pérez, Jesús; Talia, Domenico; Trunfio, Paolo

    2015-01-01

    The Data Mining Cloud Framework (DMCF) is an environment for designing and executing data analysis workflows on cloud platforms. Currently, DMCF relies on the default storage of the public cloud provider for any I/O-related operation. This implies that the I/O performance of DMCF is limited by the performance of the default storage. In this work we propose using the Hercules system within DMCF as an ad-hoc storage system for temporary data produced inside workflow-based applications. Hercules...

  6. Nexus: A modular workflow management system for quantum simulation codes

    Science.gov (United States)

    Krogel, Jaron T.

    2016-01-01

    The management of simulation workflows represents a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.
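The composition style described above, workflows assembled from job descriptions that resemble a simulation code's input file and then executed in dependency order, might be sketched as follows. The job names and dictionary interface are hypothetical illustrations, not Nexus's actual API:

```python
# Hypothetical, much-simplified sketch of a quantum-simulation workflow:
# jobs declare what they depend on, and a scheduler runs them in
# dependency order (the real Nexus interface is far richer).
jobs = {
    "relax":  {"after": [],        "run": lambda: "relaxed structure"},
    "scf":    {"after": ["relax"], "run": lambda: "converged density"},
    "qmc":    {"after": ["scf"],   "run": lambda: "total energy"},
}

def topo_order(jobs):
    # Repeatedly schedule any job whose dependencies are all done.
    order, done = [], set()
    while len(order) < len(jobs):
        for name, spec in jobs.items():
            if name not in done and all(d in done for d in spec["after"]):
                order.append(name)
                done.add(name)
    return order

results = {name: jobs[name]["run"]() for name in topo_order(jobs)}
print(results)
```

The appeal of this style is that the user only declares jobs and dependencies; the manager decides execution order and, in Nexus's case, also handles submission to workstations or supercomputing centers.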

  7. Design and Implementation of Scientific Research Management System Based on Workflow

    Institute of Scientific and Technical Information of China (English)

    孟建良; 赵强

    2014-01-01

    As scientific research work progresses, research capability has become an important index for measuring the overall strength of colleges and universities, and growing research resources place higher demands on the management of research work. Based on the definition and characteristics of workflow technology, and combined with the development process of management information systems, this paper proposes design ideas and a scheme that realize the informatization of scientific research management, so as to meet the needs of the development of research in colleges and universities.

  8. Managing Library IT Workflow with Bugzilla

    Directory of Open Access Journals (Sweden)

    Nina McHale

    2010-09-01

    Prior to September 2008, all technology issues at the University of Colorado Denver's Auraria Library were reported to a dedicated departmental phone line. A variety of staff changes necessitated a more formal means of tracking, delegating, and resolving reported issues, and the department turned to Bugzilla, an open source bug tracking application designed by Mozilla.org developers. While designed with software development bug tracking in mind, Bugzilla can be easily customized and modified to serve as an IT ticketing system. Twenty-three months and over 2,300 trouble tickets later, Auraria's IT department workflow is much smoother and more efficient. This article includes two Perl Template Toolkit code samples for customizing Bugzilla screens for use in a library environment; readers will be able to easily replicate the project in their own environments.

  9. Common motifs in scientific workflows: An empirical analysis

    OpenAIRE

    Garijo Verdejo, Daniel; Alper, P.; Belhajjame, K.; Corcho, Oscar; Gil, Yolanda; Goble, C.

    2013-01-01

    Workflow technology continues to play an important role as a means for specifying and enacting computational experiments in modern science. Reusing and re-purposing workflows allows scientists to do new experiments faster, since the workflows capture useful expertise from others. As workflow libraries grow, scientists face the challenge of finding workflows appropriate for their task, understanding what each workflow does, and reusing relevant portions of a given workflow. We believe that workflows...

  10. Operational Semantic of Workflow Engine and the Realizing Technique

    Institute of Scientific and Technical Information of China (English)

    FU Yan-ning; LIU Lei; ZHAO Dong-fan; JIN Long-fei

    2005-01-01

    At present, there is no formalized description of the executing procedure of workflow models. This paper describes, using operational semantics, the procedure by which workflow models execute in a workflow engine. The formalized description of process instances and activity instances leads to a very clear structure for the workflow engine, enables easy cooperation among heterogeneous workflow engines and guides the realization of the workflow engine's functions. Meanwhile, the workflow engine software has been completed by means of the formalized description.
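The operational-semantics idea above, describing engine execution as transition rules applied to instance states, can be sketched as a small-step transition function. The states, events and rules below are illustrative, not the paper's formalisation:

```python
# Small-step sketch: an activity instance's state evolves by firing
# transition rules of the form (state, event) -> next state.
RULES = {
    ("created", "start"):  "running",
    ("running", "finish"): "completed",
}

def step(activity_state, event):
    # Apply a rule if one matches; otherwise the state is unchanged,
    # i.e. the event is not enabled in this state.
    return RULES.get((activity_state, event), activity_state)

state = "created"
for event in ["start", "finish"]:
    state = step(state, event)
print(state)
```

Defining the engine this way makes each permitted transition explicit, which is what allows heterogeneous engines to agree on behaviour.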

  11. Quality of Data Driven Simulation Workflows

    Directory of Open Access Journals (Sweden)

    Michael Reiter

    2014-01-01

    Simulations are long-running computations driven by non-trivial data dependencies. Workflow technology helps to automate these simulations and enables using Quality of Data (QoD) frameworks to determine the goodness of simulation data. However, existing frameworks are specific to scientific domains, individual applications, or proprietary workflow engine extensions. In this paper, we propose a generic approach to use QoD as a uniform means to steer complex interdisciplinary simulations implemented as workflows. The approach enables scientists to specify abstract QoD requirements that are considered to steer the workflow for ensuring a precise final result. To realize these Quality-of-Data-driven workflows, we present a middleware architecture and a WS-Policy-based language to describe QoD requirements and capabilities. To prove technical feasibility, we present a prototype for controlling and steering simulation workflows and a real-world simulation scenario.

  12. Use of designed experiments for the improvement of pre-analytical workflow for the quantification of intracellular nucleotides in cultured cell lines.

    Science.gov (United States)

    Machon, Christelle; Bordes, Claire; Cros-Perrial, Emeline; Clement, Yohann; Jordheim, Lars Petter; Lanteri, Pierre; Guitton, Jérôme

    2015-07-31

    The present study is focused on the development of a pre-analytical strategy for the quantification of intracellular nucleotides from cultured cell lines. Different protocols, including cell recovery, nucleotide extraction and purification, were compared on a panel of nucleoside mono-, di- and triphosphates from four cell lines (adherent and suspension cells). The quantification of nucleotides was performed using a validated technique with on-line solid-phase extraction coupled with liquid chromatography-triple quadrupole tandem mass spectrometry (LC-MS/MS). Designed experiments were implemented to investigate, in a rigorous and limited-testing experimental approach, the influence of several operating parameters. Results showed that the technique used to harvest adherent cells drastically affected the amounts of intracellular nucleotides. Scraping cells was deleterious because of a major leakage (more than 70%) of intracellular nucleotides during scraping. Moreover, some other tested conditions should be avoided, such as using pure methanol as extraction solvent (decrease of over 50% in intracellular nucleotides extracted from NCI-H292 cells) or adding a purification step with chloroform. Designed experiments allowed the identification of an interaction between the percentage of methanol and the presence of chloroform. The mixture methanol/water (70/30, v/v) was considered the best compromise across the nucleoside mono-, di- and triphosphates and the four cell lines studied. This work highlights the importance of the pre-analytical step, combined with the cell lines studied and a sensitive, validated assay, for the quantification of nucleotides in biological matrices. PMID:26094139
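The designed-experiments approach above varies several pre-analytical factors systematically rather than one at a time, which is what exposes interactions such as the one between methanol percentage and chloroform. A two-level full factorial design can be generated as follows; the factor names are loosely based on the abstract and the levels are purely illustrative:

```python
from itertools import product

# Toy two-level full factorial design over three pre-analytical factors.
# Every combination of levels becomes one experimental run, so two-factor
# interactions (e.g. methanol% x chloroform) can be estimated.
factors = {
    "harvest":    ["scraping", "trypsinisation"],
    "methanol%":  [70, 100],
    "chloroform": [False, True],
}
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(design))  # 2**3 = 8 runs
for run in design[:2]:
    print(run)
```

One-factor-at-a-time testing over the same factors would need fewer runs but could not separate the interaction term from the main effects.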

  13. The LabFlow system for workflow management in large scale biology research laboratories.

    Science.gov (United States)

    Goodman, N; Rozen, S; Stein, L D

    1998-01-01

    LabFlow is a workflow management system designed for large scale biology research laboratories. It provides a workflow model in which objects flow from task to task under programmatic control. The model supports parallelism, meaning that an object can flow down several paths simultaneously, and sub-workflows which can be invoked subroutine-style from a task. The system allocates tasks to Unix processes to achieve requisite levels of multiprocessing. The system uses the LabBase data management system to store workflow state and laboratory results. LabFlow provides a Perl5 object-oriented framework for defining workflows, and an engine for executing these. The software is freely available. PMID:9783211
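The LabFlow model above lets one object flow down several paths simultaneously. That fan-out can be sketched as follows; the task names are hypothetical, and the sketch is in Python even though LabFlow itself is a Perl5 framework:

```python
# Sketch of the LabFlow idea: the same object flows to every downstream
# task on a fan-out (modelled here sequentially for simplicity; LabFlow
# allocates tasks to separate Unix processes).
def sequence_task(obj):
    return obj + ":sequenced"

def archive_task(obj):
    return obj + ":archived"

def fan_out(obj, tasks):
    # One object, several simultaneous downstream paths.
    return [task(obj) for task in tasks]

results = fan_out("sample42", [sequence_task, archive_task])
print(results)
```

In the real system each path would also record its workflow state in LabBase rather than returning strings.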

  14. Enabling adaptive scientific workflows via trigger detection

    OpenAIRE

    Salloum, Maher; Bennett, Janine C.; PINAR, Ali; Bhagatwala, Ankit; Chen, Jacqueline H.

    2015-01-01

    Next generation architectures necessitate a shift away from traditional workflows in which the simulation state is saved at prescribed frequencies for post-processing analysis. While the need to shift to in situ workflows has been acknowledged for some time, much of the current research is focused on static workflows, where the analysis that would have been done as a post-process is performed concurrently with the simulation at user-prescribed frequencies. Recently, research efforts are striving...

  15. Agile parallel bioinformatics workflow management using Pwrake

    OpenAIRE

    Tanaka Masahiro; Sasaki Kensaku; Mishima Hiroyuki; Tatebe Osamu; Yoshiura Koh-ichiro

    2011-01-01

    Background: In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment...

  16. VO-compliant workflows and science gateways

    Science.gov (United States)

    Castelli, G.; Taffoni, G.; Sciacca, E.; Becciani, U.; Costa, A.; Krokos, M.; Pasian, F.; Vuerli, C.

    2015-06-01

    Workflow and science gateway technologies have been adopted by scientific communities as valuable tools to carry out complex experiments. They offer the possibility to perform computations for data analysis and simulations, while hiding details of the complex infrastructures underneath. There are many workflow management systems covering a large variety of generic services for coordinating the execution of workflows. In this paper we describe our experiences in creating workflow-oriented science gateways based on gUSE/WS-PGRADE technology, and in particular we discuss the efforts devoted to developing a VO-compliant web environment.

  17. Pro WF: Windows Workflow in .NET 4.0

    CERN Document Server

    Bukovics, Bruce

    2010-01-01

    Windows Workflow Foundation (WF) is a revolutionary part of the .NET 4 Framework that allows you to orchestrate human and system interactions as a series of workflows that can be easily mapped, analyzed, adjusted, and implemented. As business problems become more complex, the need for workflow-based solutions has never been more evident. WF provides a simple and consistent way to model and implement complex problems. As a developer, you focus on developing the business logic for individual workflow tasks. The runtime handles the execution of those tasks after they have been composed into a wor

  18. Swabs to genomes: a comprehensive workflow

    Directory of Open Access Journals (Sweden)

    Madison I. Dunitz

    2015-05-01

    Full Text Available The sequencing, assembly, and basic analysis of microbial genomes, once a painstaking and expensive undertaking, has become much easier for research labs with access to standard molecular biology and computational tools. However, there is a confusing variety of options available for DNA library preparation and sequencing, and inexperience with bioinformatics can pose a significant barrier to entry for many who may be interested in microbial genomics. The objective of the present study was to design, test, troubleshoot, and publish a simple, comprehensive workflow from the collection of an environmental sample (a swab) to a published microbial genome, empowering even a lab or classroom with limited resources and bioinformatics experience to perform it.

  19. Integrated Enterprise Modeling Method Based on Workflow Model and Multiviews

    Institute of Scientific and Technical Information of China (English)

    林慧苹; 范玉顺; 吴澄

    2001-01-01

    Many enterprise modeling methods have been proposed to model the business processes of enterprises and to implement CIM systems. But difficulties are still encountered when these methods are applied to CIM system design and implementation. This paper proposes a new integrated enterprise modeling methodology based on the workflow model. The system architecture and the integrated modeling environment are described with a new simulation strategy. The modeling process and the relationship between the workflow model and the views are discussed.

  20. Analysis of the Use of Workflow Products (Analýza využití workflow produktů)

    OpenAIRE

    Šich, Jan

    2012-01-01

    The thesis focuses on workflow processes and on the systems used for their design and management. It aims to familiarize readers with the nature of these systems and the features they should support, and it pursues three goals. The first goal is to determine the criteria by which the quality of a workflow management system can be judged. These criteria must be ranked by their importance in specific situations, for which criterion weights and a technique for their calculation are designed...

  1. Workflow Planning and Execution - Final Results

    Directory of Open Access Journals (Sweden)

    Ravikant Dewangan

    2016-01-01

    Full Text Available Abstract workflow generation is the process of choosing and configuring application components to form an abstract workflow. The application components are chosen by examining the specifications of their capabilities and checking whether they can generate the desired data products. They are configured by assigning input files that exist or that may be generated by other application components. The abstract workflow specifies the order in which the components must be executed. Concrete workflow generation is the selection of specific resources, files, and additional jobs required to form a concrete workflow that can be executed in the Grid environment. In order to generate a concrete workflow, each component in the abstract workflow is turned into an executable job by specifying the locations of the physical files of the component and data, as well as the resources assigned to the component in the execution environment. Additional jobs may be included in the concrete workflow, for example, jobs that transfer files to the appropriate locations where resources are available to execute the application components.
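
    The abstract-to-concrete mapping described above can be sketched roughly as follows. All catalog names, sites, and job fields here are illustrative assumptions, not the interface of any specific Grid planner.

    ```python
    # Sketch: mapping an abstract workflow (ordered components with logical
    # file names) to a concrete workflow by binding physical file locations
    # and inserting transfer jobs. All names are illustrative only.

    replica_catalog = {            # logical file name -> physical location
        "raw.dat": "gsiftp://storage.example.org/raw.dat",
    }
    site_catalog = {"preprocess": "cluster-a", "analyze": "cluster-b"}

    abstract_workflow = [          # components with logical input/output files
        {"component": "preprocess", "inputs": ["raw.dat"], "outputs": ["clean.dat"]},
        {"component": "analyze", "inputs": ["clean.dat"], "outputs": ["result.dat"]},
    ]

    def concretize(abstract):
        concrete = []
        for step in abstract:
            site = site_catalog[step["component"]]
            for lfn in step["inputs"]:
                if lfn in replica_catalog:
                    # Existing data: add a job staging the file to the site.
                    # (Files not in the catalog are produced by earlier steps.)
                    concrete.append({"job": "transfer",
                                     "file": replica_catalog[lfn], "to": site})
            concrete.append({"job": step["component"], "site": site})
        return concrete

    for job in concretize(abstract_workflow):
        print(job)
    ```

    Here the transfer job for `raw.dat` is inserted before `preprocess`, while `clean.dat` needs no transfer job because it is generated by an upstream component.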

  2. Introduction to the Workflow Systems in Management

    OpenAIRE

    Aleksander Wocial

    2007-01-01

    The article concerns the ontology of workflow management systems. The fundamental diagrams and their constituent elements are presented, as well as the meaning of the components and the relations or interactions among them. The first is a conceptual model of the flow process, followed by a meta-model of the process definition. Understanding these terms is crucial for IT or management specialists involved in the area of workflow.

  3. Implementing bioinformatic workflows within the bioextract server

    Science.gov (United States)

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed servi...

  4. WIDE - A Distributed Architecture for Workflow Management

    OpenAIRE

    S. Ceri; Grefen, P.W.P.J.; G. Sánchez

    1997-01-01

    This paper presents the distributed architecture of the WIDE workflow management system. We show how distribution and scalability are obtained by the use of a distributed object model, a client/server architecture, and a distributed workflow server architecture. Specific attention is paid to the extended transaction support and active rule support subarchitectures.

  5. A quantitative fitness analysis workflow.

    Science.gov (United States)

    Banks, A P; Lawless, C; Lydall, D A

    2012-01-01

    Quantitative Fitness Analysis (QFA) is an experimental and computational workflow for comparing fitnesses of microbial cultures grown in parallel(1,2,3,4). QFA can be applied to focused observations of single cultures but is most useful for genome-wide genetic interaction or drug screens investigating up to thousands of independent cultures. The central experimental method is the inoculation of independent, dilute liquid microbial cultures onto solid agar plates which are incubated and regularly photographed. Photographs from each time-point are analyzed, producing quantitative cell density estimates, which are used to construct growth curves, allowing quantitative fitness measures to be derived. Culture fitnesses can be compared to quantify and rank genetic interaction strengths or drug sensitivities. The effect on culture fitness of any treatments added into substrate agar (e.g. small molecules, antibiotics or nutrients) or applied to plates externally (e.g. UV irradiation, temperature) can be quantified by QFA. The QFA workflow produces growth rate estimates analogous to those obtained by spectrophotometric measurement of parallel liquid cultures in 96-well or 200-well plate readers. Importantly, QFA has significantly higher throughput compared with such methods. QFA cultures grow on a solid agar surface and are therefore well aerated during growth without the need for stirring or shaking. QFA throughput is not as high as that of some Synthetic Genetic Array (SGA) screening methods(5,6). However, since QFA cultures are heavily diluted before being inoculated onto agar, QFA can capture more complete growth curves, including exponential and saturation phases(3). For example, growth curve observations allow culture doubling times to be estimated directly with high precision, as discussed previously(1). Here we present a specific QFA protocol applied to thousands of S. cerevisiae cultures which are automatically handled by robots during inoculation, incubation and
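
    As an illustration of the growth-curve-to-fitness step mentioned above, a culture doubling time can be estimated from exponential-phase density measurements with a log-linear least-squares fit. This is a generic pure-Python sketch of the idea, not code from the QFA package.

    ```python
    # Sketch: estimate culture doubling time from exponential-phase cell
    # density estimates via a log-linear least-squares fit.
    # Generic illustration; not the QFA package's implementation.
    import math

    def doubling_time(times_h, densities):
        # Fit ln(density) = a + r*t by ordinary least squares;
        # the doubling time is then ln(2) / r.
        logs = [math.log(d) for d in densities]
        n = len(times_h)
        mean_t = sum(times_h) / n
        mean_y = sum(logs) / n
        cov = sum((t - mean_t) * (y - mean_y) for t, y in zip(times_h, logs))
        var = sum((t - mean_t) ** 2 for t in times_h)
        growth_rate = cov / var      # r, in 1/h
        return math.log(2) / growth_rate

    # Densities doubling every 2 h: expect a doubling time of ~2.0 h.
    print(doubling_time([0, 2, 4, 6], [1.0, 2.0, 4.0, 8.0]))  # → 2.0
    ```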

  6. WARP (workflow for automated and rapid production): a framework for end-to-end automated digital print workflows

    Science.gov (United States)

    Joshi, Parag

    2006-02-01

    The publishing industry is experiencing a major paradigm shift with the advent of digital publishing technologies. A large number of components in the publishing and print production workflow are transformed in this shift. However, the process as a whole requires a great deal of human intervention for decision making and for resolving exceptions during job execution. Furthermore, a majority of the best-of-breed applications for publishing and print production are intrinsically designed and developed to be driven by humans. Thus, the human-intensive nature of the current prepress process accounts for a very significant amount of the overhead costs in fulfillment of jobs on press. It is a challenge to automate the functionality of applications built with the model of human-driven execution. Another challenge is to orchestrate various components in the publishing and print production pipeline such that they work in a seamless manner to enable the system to perform automatic detection of potential failures and take corrective actions in a proactive manner. Thus, there is a great need for a coherent and unifying workflow architecture that streamlines the process and automates it as a whole in order to create an end-to-end digital automated print production workflow that does not involve any human intervention. This paper describes an architecture and building blocks that lay the foundation for a plurality of automated print production workflows.

  7. Implementation of WPDL Conforming Workflow Model

    Institute of Scientific and Technical Information of China (English)

    张志君; 范玉顺

    2003-01-01

    Workflow process definition language (WPDL) facilitates the transfer of workflow process definitions between separate workflow products. However, much work is still needed to transfer the specific workflow model to a WPDL conforming model. CIMFlow is a workflow management system developed by the National CIMS Engineering Research Center. This paper discusses the methods by which the CIMFlow model conforms to the WPDL meta-model and the differences between the WPDL meta-model and the CIMFlow model. Some improvements are proposed for the WPDL specification. Finally, the mapping and translating methods between the entities and attributes are given for the two models. The proposed methods and improvements are valuable as a reference for other mapping applications and the WPDL specification.

  8. Using Mobile Agents to Implement Workflow System

    Institute of Scientific and Technical Information of China (English)

    LI Jie; LIU Xian-xing; GUO Zheng-wei

    2004-01-01

    Current workflow management systems usually adopt existing technologies such as TCP/IP-based Web technologies and CORBA to fulfill the underlying communications. Very often these have been considered only from a theoretical point of view, mainly for the lack of concrete possibilities of elastic execution. Mobile Agent Technology (MAT) represents a very attractive approach to the distributed control of computer networks and a valid alternative for implementing workflow system strategies. This paper mainly focuses on improving the performance of workflow systems by using MAT. Firstly, the performance of workflow systems based on both CORBA and mobile agents is summarized and analyzed. Secondly, a performance comparison is presented by introducing a mathematical model of each kind of data-interaction process. Lastly, a mobile agent-based workflow system named MAWMS is presented and described in detail.

  9. CSP for Executable Scientific Workflows

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard

    can usually benefit performance-wise from both multiprocessing, cluster and grid environments. PyCSP is an implementation of Communicating Sequential Processes (CSP) for the Python programming language and takes advantage of CSP's formal and verifiable approach to controlling concurrency and the...... readability of Python source code. Python is a popular programming language in the scientific community, with many scientific libraries (modules) and simple integration to external languages. This thesis presents a PyCSP extended with many new features and a more robust implementation to allow scientific...... is demonstrated through examples. By providing a robust library for organising scientific workflows in a Python application I hope to inspire scientific users to adopt PyCSP. As a proof-of-concept this thesis demonstrates three scientific applications: kNN, stochastic minimum search and McStas to...

  10. Dynamic Workflow in Grid-MAS Integration Context

    OpenAIRE

    Salle, Paola; Duvert, Frédéric; Hérin, Danièle; Stefano A. CERRI

    2007-01-01

    This paper addresses the architectural foundations of dynamic workflows in distributed multi-agent systems (MAS) integrated in Grid context. The purpose is to design an architecture at the same time taking into consideration tasks dependencies among agents, adaptation with respect to historic lessons learnt from past behaviour (memory) and the autonomous decisions when an unpredicted event occurs. In order to do this, given one ontology, called AGIO, which describes Agent-Grid Integration, we...

  11. Taverna Workflows in the Virtual Observatory

    Science.gov (United States)

    Benson, K.; Cecconi, B.

    2015-12-01

    Taverna workflows used in the Virtual Observatory. Planetary and Solar applications developed over the last decade generate data at a previously unimaginable scale. One of these programmes, which builds on the strengths of IDIS of Europlanet FP7, is the Virtual European Solar and Planetary Access (VESPA). With VESPA more data will be distributed and the connectivity of tools and infrastructure will improve. VESPA enables growth of the user and provider community. However, the challenge of connectivity persists throughout application data services. VESPA calls are formed in part by tools and interaction services. One such tool and interaction service is the Taverna workflow management system. Workflows address the challenges of data interconnectivity by establishing pipelines to services offered by other data-streaming services. Workflows offer the capability to cross domains and overcome interoperability issues. Furthermore, Taverna supports sharing of workflows; the academic community 'myExperiment', a social site for scientists, supports search and opens access to pre-existing workflows. This presentation focuses on cross-domain workflows, including use of the infrastructure set up with the Helio, EUROPLANET and VAMDC projects. A hands-on demonstration and an opportunity to join the community discussion will make the presentation more interactive.

  12. Application of workflow technology for workshop scheduling

    Institute of Scientific and Technical Information of China (English)

    ZHOU Wan-kun; ZHU Jian-ying

    2005-01-01

    This paper attempts to address the complexity of scheduling problems and meet the requirements of the ever-changing manufacturing environment. A new Workflow-Based Scheduling System (WBSS) is proposed. The integration of a Workflow Management System (WfMS) with a rule-based scheduler provides an effective way of generating a task-sheet according to the states of the system and the scheduled objects. First, the definition of the workflow model for scheduling is proposed, followed by the architecture and mechanism of the proposed WBSS. Finally, an application is given to show how the established system works.
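
    The rule-based task-sheet generation described above can be sketched as follows. The rules and state fields are invented for illustration; the paper does not publish its rule set.

    ```python
    # Sketch: a rule-based scheduler that generates a task-sheet from the
    # current workshop state. Rules and state fields are illustrative only.

    rules = [
        # (condition on the system state, task to emit when it holds)
        (lambda s: s["machine_idle"] and s["queue"], "dispatch next job to machine"),
        (lambda s: s["tool_wear"] > 0.8, "schedule tool replacement"),
        (lambda s: not s["queue"], "request work orders"),
    ]

    def generate_task_sheet(state):
        # The task-sheet is the list of tasks whose conditions fire.
        return [task for cond, task in rules if cond(state)]

    state = {"machine_idle": True, "queue": ["job-17"], "tool_wear": 0.9}
    print(generate_task_sheet(state))
    # → ['dispatch next job to machine', 'schedule tool replacement']
    ```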

  13. Use of contextual inquiry to understand anatomic pathology workflow: Implications for digital pathology adoption

    OpenAIRE

    Jonhan Ho; Orly Aridor; Parwani, Anil V.

    2012-01-01

    Background: For decades, anatomic pathology (AP) workflow has been a highly manual process based on the use of an optical microscope and glass slides. Recent innovations in scanning and digitizing entire glass slides are accelerating a move toward widespread adoption and implementation of a workflow based on digital slides and their supporting information management software. To support the design of digital pathology systems and ensure their adoption into pathology practice, the needs of ...

  14. Verifying the Interplay of Authorization Policies and Workflow in Service-Oriented Architectures (Full version)

    OpenAIRE

    Barletta, Michele; Ranise, Silvio; Viganò, Luca

    2009-01-01

    A widespread design approach in distributed applications based on the service-oriented paradigm, such as web-services, consists of clearly separating the enforcement of authorization policies and the workflow of the applications, so that the interplay between the policy level and the workflow level is abstracted away. While such an approach is attractive because it is quite simple and permits one to reason about crucial properties of the policies under consideration, it does not provide the r...

  15. Proposing a Formal Method for Workflow Modelling: Temporal Logic of Actions (TLA)

    OpenAIRE

    Caro, Jose L.

    2014-01-01

    The study and implementation of formal techniques to aid the design and implementation of Workflow Management Systems (WfMS) are still required. Using these techniques, we can provide this technology with automated reasoning capacities, which are required for the automated demonstration of the properties that verify a given model. This paper develops a formalization of the workflow paradigm based on communication (speech-act theory) by using a temporal logic, namely, the Temporal Logic of...

  16. A multi-parametric workflow for the prioritization of mitochondrial DNA variants of clinical interest

    OpenAIRE

    Santorsola, Mariangela; Calabrese, Claudia; Girolimetti, Giulia; Diroma, Maria Angela; Gasparre, Giuseppe; Attimonelli, Marcella

    2015-01-01

    Assigning a pathogenic role to mitochondrial DNA (mtDNA) variants and unveiling the potential involvement of the mitochondrial genome in diseases are challenging tasks in human medicine. Assuming that rare variants are more likely to be damaging, we designed a phylogeny-based prioritization workflow to obtain a reliable pool of candidate variants for further investigations. The prioritization workflow relies on an exhaustive functional annotation through the mtDNA extraction pipeline MToolBox...

  17. Parallel workflow tools to facilitate human brain MRI post-processing

    OpenAIRE

    Gaolang Gong

    2015-01-01

    Multi-modal magnetic resonance imaging (MRI) techniques are widely applied in human brain studies. To obtain specific brain measures of interest from MRI datasets, a number of complex image post-processing steps are typically required. Parallel workflow tools have recently been developed, concatenating individual processing steps and enabling fully automated processing of raw MRI data to obtain the final results. These workflow tools are also designed to make optimal use of available computat...

  18. A Community-Driven Workflow Recommendations and Reuse Infrastructure

    Science.gov (United States)

    Zhang, J.; Votava, P.; Lee, T. J.; Lee, C.; Xiao, S.; Nemani, R. R.; Foster, I.

    2013-12-01

    usage history to help Earth scientists better understand existing artifacts and how to use them in a proper manner? R2: Informed by insights derived from their computing contexts, how could such hidden knowledge be used to facilitate artifact reuse by Earth scientists? Our study of the two research questions will provide answers to three technical questions aiming to assist NEX users during workflow development: 1) How to determine what topics interest the researcher? 2) How to find appropriate artifacts? and 3) How to advise the researcher in artifact reuse? In this paper, we report our on-going efforts of leveraging social networking theory and analysis techniques to provide dynamic advice on artifact reuse to NEX users based on their surrounding contexts. As a proof of concept, we have designed and developed a plug-in to the VisTrails workflow design tool. When users develop workflows using VisTrails, our plug-in will proactively recommend most relevant sub-workflows to the users.

  19. Structured Composition of Dataflow and Control-Flow for Reusable and Robust Scientific Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Bowers, S; Ludaescher, B; Ngu, A; Critchlow, T

    2005-09-07

    Data-centric scientific workflows are often modeled as dataflow process networks. The simplicity of the dataflow framework facilitates workflow design, analysis, and optimization. However, some workflow tasks are particularly "control-flow intensive", e.g., procedures to make workflows more fault-tolerant and adaptive in an unreliable, distributed computing environment. Modeling complex control-flow directly within a dataflow framework often leads to overly complicated workflows that are hard to comprehend, reuse, schedule, and maintain. In this paper, we develop a framework that allows a structured embedding of control-flow intensive subtasks within dataflow process networks. In this way, we can seamlessly handle complex control-flows without sacrificing the benefits of dataflow. We build upon a flexible actor-oriented modeling and design approach and extend it with (actor) frames and (workflow) templates. A frame is a placeholder for an (existing or planned) collection of components with similar function and signature. A template partially specifies the behavior of a subworkflow by leaving "holes" (i.e., frames) in the subworkflow definition. Taken together, these abstraction mechanisms facilitate the separation and structured re-combination of control-flow and dataflow in scientific workflow applications. We illustrate our approach with a real-world scientific workflow from the astrophysics domain. This data-intensive workflow requires remote execution and file transfer in a semi-reliable environment. For such workflows, we propose a 3-layered architecture: The top-level, typically a dataflow process network, includes Generic Data Transfer (GDT) frames and Generic remote eXecution (GX) frames. At the second level, the user can specialize the behavior of these generic components by embedding a suitable template (here: transducer templates for control-flow intensive tasks). At the third level, frames inside the
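
    The frame/template idea described above can be sketched in miniature: a frame is a named placeholder for any component with a matching signature, and a template is a subworkflow definition with frames as holes that are filled before execution. This is an invented Python analogy, not the actor-oriented implementation the paper describes.

    ```python
    # Sketch of "frames" (placeholders) and "templates" (subworkflows with
    # holes), as described above. All names are illustrative analogies.

    class Frame:
        """Placeholder for a component with a given role/signature."""
        def __init__(self, role):
            self.role = role

    def run_template(template, bindings, data):
        # A template is a list of steps; each step is either a concrete
        # callable or a Frame that must be filled from `bindings`.
        for step in template:
            func = bindings[step.role] if isinstance(step, Frame) else step
            data = func(data)
        return data

    # A generic-data-transfer frame specialized with a concrete transducer.
    template = [str.strip, Frame("transfer"), Frame("execute")]
    bindings = {"transfer": lambda s: s + " [staged]",
                "execute": lambda s: s.upper()}
    print(run_template(template, bindings, " job "))  # → 'JOB [STAGED]'
    ```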

  20. Facilitating hydrological data analysis workflows in R: the RHydro package

    Science.gov (United States)

    Buytaert, Wouter; Moulds, Simon; Skoien, Jon; Pebesma, Edzer; Reusser, Dominik

    2015-04-01

    The advent of new technologies such as web-services and big data analytics holds great promise for hydrological data analysis and simulation. Driven by the need for better water management tools, it allows for the construction of much more complex workflows, that integrate more and potentially more heterogeneous data sources with longer tool chains of algorithms and models. With the scientific challenge of designing the most adequate processing workflow comes the technical challenge of implementing the workflow with a minimal risk for errors. A wide variety of new workbench technologies and other data handling systems are being developed. At the same time, the functionality of available data processing languages such as R and Python is increasing at an accelerating pace. Because of the large diversity of scientific questions and simulation needs in hydrology, it is unlikely that one single optimal method for constructing hydrological data analysis workflows will emerge. Nevertheless, languages such as R and Python are quickly gaining popularity because they combine a wide array of functionality with high flexibility and versatility. The object-oriented nature of high-level data processing languages makes them particularly suited for the handling of complex and potentially large datasets. In this paper, we explore how handling and processing of hydrological data in R can be facilitated further by designing and implementing a set of relevant classes and methods in the experimental R package RHydro. We build upon existing efforts such as the sp and raster packages for spatial data and the spacetime package for spatiotemporal data to define classes for hydrological data (HydroST). In order to handle simulation data from hydrological models conveniently, a HM class is defined. Relevant methods are implemented to allow for an optimal integration of the HM class with existing model fitting and simulation functionality in R. Lastly, we discuss some of the design challenges

  1. Security aspects in teleradiology workflow

    Science.gov (United States)

    Soegner, Peter I.; Helweg, Gernot; Holzer, Heimo; zur Nedden, Dieter

    2000-05-01

    The medicolegal necessity of privacy, security and confidentiality was the aim of the attempt to develop a secure teleradiology workflow between the telepartners -- the radiologist and the referring physician. To avoid a lack of data protection and data security, we introduced biometric fingerprint scanners in combination with smart cards to identify the teleradiology partners, and communicated over an encrypted TCP/IP satellite link between Innsbruck and Reutte. We used an asymmetric cryptography method to guarantee authentication, integrity of the data packages and confidentiality of the medical data. It was necessary to use a biometric feature to avoid cases of mistaken identity of persons who wanted access to the system. Only an invariable electronic identification allows legal liability for the final report, and only a secure data connection allows the exchange of sensitive medical data between different partners of health care networks. In our study we selected the user-friendly combination of a smart card and a biometric fingerprint technique, called the SkymedTM Double Guard Secure Keyboard (Agfa-Gevaert), to confirm identities and log into the imaging workstations and the electronic patient record. We examined the interoperability of the software used with the existing platforms. Only the WIN-XX operating systems could be protected at the time of our study.

  2. Workflow Optimization in Vertebrobasilar Occlusion

    International Nuclear Information System (INIS)

    Objective: In vertebrobasilar occlusion, rapid recanalization is the only substantial means to improve the prognosis. We introduced a standard operating procedure (SOP) for interventional therapy to analyze its effects on interdisciplinary time management. Methods: Intrahospital time periods between hospital admission and neuroradiological intervention were retrospectively analyzed, together with the patients’ outcome, before (n = 18) and after (n = 20) implementation of the SOP. Results: After implementation of the SOP, we observed statistically significant improvement of postinterventional patient neurological status (p = 0.017). In addition, we found a decrease of 5:33 h in the mean time period from hospital admission until neuroradiological intervention. The recanalization rate increased from 72.2% to 80% after implementation of the SOP. Conclusion: Our results underscore the relevance of SOP implementation and analysis of time management for clinical workflow optimization. Both may raise awareness of the need for efficient interdisciplinary time management. This could be an explanation for the decreased time periods and improved postinterventional patient status after SOP implementation.

  3. Workflow Management in Occupational Medicine Using the Simple Workflow Access Protocol (SWAP)

    OpenAIRE

    McClay, James

    2001-01-01

    There are over nine million reported work-related injuries a year administered through the workers' compensation system. Workers' compensation requires extensive communication with employers and payers. Workflow automation tools exist in segments of the industry, but there isn't a common communication system. The Internet Engineering Task Force (IETF) Working Group on Simple Workflow Access Protocol (SWAP) is addressing the specifications for workflow across the Internet. We are adapting these p...

  4. Building Scientific Workflows for the Geosciences with Open Community Software

    Science.gov (United States)

    Pierce, M. E.; Marru, S.; Weerawarana, S. M.

    2012-12-01

    We describe the design and development of the Apache Airavata scientific workflow software and its application to problems in geosciences. Airavata is based on Service Oriented Architecture principles and is developed as general purpose software for managing large-scale science applications on supercomputing resources such as the NSF's XSEDE. Based on the NSF-funded EarthCube Workflow Working Group activities, we discuss the application of this software relative to specific requirements (such as data stream data processing, event triggering, dealing with large data sets, and advanced distributed execution patterns involved in data mining). We also consider the role of governance in EarthCube software development and present the development of Airavata software through the Apache Software Foundation's community development model. We discuss the potential impacts on software accountability and sustainability using this model.

  5. Modular Workflow Engine for Distributed Services using Lightweight Java Clients

    CERN Document Server

    Vetter, R -M; Peetz, J -V

    2009-01-01

    In this article we introduce the concept and the first implementation of a lightweight client-server-framework as middleware for distributed computing. On the client side an installation without administrative rights or privileged ports can turn any computer into a worker node. Only a Java runtime environment and the JAR files comprising the workflow client are needed. To connect all clients to the engine one open server port is sufficient. The engine submits data to the clients and orchestrates their work by workflow descriptions from a central database. Clients request new task descriptions periodically, thus the system is robust against network failures. In the basic set-up, data up- and downloads are handled via HTTP communication with the server. The performance of the modular system could additionally be improved using dedicated file servers or distributed network file systems. We demonstrate the design features of the proposed engine in real-world applications from mechanical engineering. We have used ...
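
    The poll-for-work pattern described above (clients periodically request task descriptions, so the system tolerates network failures) can be sketched as follows. The fetch/upload callables stand in for the real HTTP calls; the loop shape and all names are assumptions, not the engine's actual protocol.

    ```python
    # Sketch of a worker node's poll-for-work loop: periodically request a
    # task description, execute it, upload the result. fetch_task and
    # upload_result are injected stand-ins for the real HTTP communication.
    import time

    def worker_loop(fetch_task, upload_result, poll_interval_s=0.0, max_polls=5):
        for _ in range(max_polls):
            task = fetch_task()              # e.g. HTTP GET for the next task
            if task is None:
                time.sleep(poll_interval_s)  # nothing to do; poll again later
                continue
            result = task["func"](*task["args"])
            upload_result(task["id"], result)  # e.g. HTTP POST of the result

    # Simulated server: two queued tasks, then empty responses.
    pending = [{"id": 1, "func": pow, "args": (2, 10)},
               {"id": 2, "func": max, "args": (3, 7)}]
    results = {}
    worker_loop(lambda: pending.pop(0) if pending else None,
                results.__setitem__)
    print(results)  # → {1: 1024, 2: 7}
    ```

    Because the client initiates every exchange, a worker that misses a poll due to a network failure simply picks up work on its next request, which is what makes the design robust.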

  6. Workflow Based Software Development Environment Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed research is to investigate and develop a workflow based tool, the Software Developers Assistant, to facilitate the collaboration between...

  7. The MPO API: A tool for recording scientific workflows

    Energy Technology Data Exchange (ETDEWEB)

    Wright, John C., E-mail: jcwright@mit.edu [MIT Plasma Science and Fusion Center, Cambridge, MA (United States); Greenwald, Martin; Stillerman, Joshua [MIT Plasma Science and Fusion Center, Cambridge, MA (United States); Abla, Gheni; Chanthavong, Bobby; Flanagan, Sean; Schissel, David; Lee, Xia [General Atomics, San Diego, CA (United States); Romosan, Alex; Shoshani, Arie [Lawrence Berkeley Laboratory, Berkeley, CA (United States)

    2014-05-15

    Highlights: • A description of a new framework and tool for recording scientific workflows, especially those resulting from simulation and analysis. • An explanation of the underlying technologies used to implement this web based tool. • Several examples of using the tool. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for high-consequence applications. The Metadata, Provenance and Ontology (MPO) project builds on previous work [M. Greenwald, Fusion Eng. Des. 87 (2012) 2205–2208] and is focused on providing documentation of workflows, data provenance and the ability to data-mine large sets of results. While there are important design and development aspects to the data structures and user interfaces, we concern ourselves in this paper with the application programming interface (API) – the set of functions that interface with the data server. Our approach for the data server is to follow the Representational State Transfer (RESTful) software architecture style for client–server communication. At its core, the API uses the POST and GET methods of the HTTP protocol to transfer workflow information in message bodies to targets specified in the URL to and from the database via a web server. Higher level API calls are built upon this core API. This design facilitates implementation on different platforms and in different languages and is robust to changes in the underlying technologies used. The command line client implementation can communicate with the data server from any machine with HTTP access.
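
    The RESTful core described above, POST and GET requests carrying workflow information to URL targets, can be sketched as a minimal in-process dispatcher; the endpoint paths and record fields are assumptions for illustration, not the actual MPO schema.

```python
import json

DB = {}  # stands in for the data server's database behind the web server

def handle(method, path, body=None):
    """Dispatch an HTTP-style request: POST appends a record to the
    collection named by the URL path, GET reads the collection back."""
    if method == "POST":
        record = json.loads(body)
        DB.setdefault(path, []).append(record)
        return {"status": 201, "uid": len(DB[path])}
    if method == "GET":
        return {"status": 200, "records": DB.get(path, [])}
    return {"status": 405}

# Record a workflow and one of its data objects, then read them back.
handle("POST", "/workflow", json.dumps({"name": "equilibrium-reconstruction"}))
handle("POST", "/dataobject", json.dumps({"uri": "file:///shots/1234.h5"}))
reply = handle("GET", "/workflow")
print(reply["records"][0]["name"])  # -> equilibrium-reconstruction
```

    Higher-level calls can then be layered on this core, which is the design property the abstract credits for portability across platforms and languages.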

  8. The MPO API: A tool for recording scientific workflows

    International Nuclear Information System (INIS)

    Highlights: • A description of a new framework and tool for recording scientific workflows, especially those resulting from simulation and analysis. • An explanation of the underlying technologies used to implement this web based tool. • Several examples of using the tool. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for high-consequence applications. The Metadata, Provenance and Ontology (MPO) project builds on previous work [M. Greenwald, Fusion Eng. Des. 87 (2012) 2205–2208] and is focused on providing documentation of workflows, data provenance and the ability to data-mine large sets of results. While there are important design and development aspects to the data structures and user interfaces, we concern ourselves in this paper with the application programming interface (API) – the set of functions that interface with the data server. Our approach for the data server is to follow the Representational State Transfer (RESTful) software architecture style for client–server communication. At its core, the API uses the POST and GET methods of the HTTP protocol to transfer workflow information in message bodies to targets specified in the URL to and from the database via a web server. Higher level API calls are built upon this core API. This design facilitates implementation on different platforms and in different languages and is robust to changes in the underlying technologies used. The command line client implementation can communicate with the data server from any machine with HTTP access

  9. Using Technology to Facilitate Technical Services Workflows

    OpenAIRE

    Getz, Kelli; Castro, Jeanne M

    2013-01-01

    Managing workflows in a complex and evolving environment is a challenge for technical services librarians. By taking advantage of technology, technical services librarians at the University of Houston Libraries currently develop and revise workflows using tools such as Google Docs, Microsoft Outlook Tasks, and Drupal-based forms. By embracing technology and harnessing the power of these tools, the UH librarians are able to successfully pair effective communication with a high-level of transpa...

  10. Proof-of-concept engineering workflow demonstrator

    OpenAIRE

    Molinari, M; Cox, SJ; Takeda, K.

    2006-01-01

    When Microsoft needed a proof-of-concept implementation of bespoke engineering workflow software for their customer, BAE Systems, it called on the software engineering skills and experience of the Microsoft Institute for High Performance Computing. BAE Systems was looking into converting their in-house SOLAR software suite to run on the MS Compute Cluster Server product with 64-bit MPI support in conjunction with an extended Windows Workflow environment for use by their engineers

  11. OBJECTFLOW: a modular workflow management system

    OpenAIRE

    Camilo, Ocampo; Botella López, Pere

    1997-01-01

    Workflow Management (WM) is an emerging area that involves cross-disciplinary fields such as databases, software engineering, business management and human coordination. A Workflow Management System (WMS) is a software tool to automate Business Processes (BPs) and coordinate the people of an organization. BPs are sets of linked procedures directed at reaching a business goal, normally following a set of procedural rules. This work presents the OBJECTFLOW(2) project, result of ...

  12. A mixed methods approach for measuring the impact of delivery-centric interventions on clinician workflow.

    Science.gov (United States)

    Cady, Rhonda G; Finkelstein, Stanley M

    2012-01-01

    Health interventions vary widely. Pharmaceuticals, medical devices and wellness promotion are defined as 'outcome-centric.' They are implemented by clinicians for the use and benefit of consumers, and intervention effectiveness is measured by a change in health outcome. Electronic health records, computerized physician order entry systems and telehealth technologies are defined as 'delivery-centric.' They are implemented by organizations for use by clinicians to manage and facilitate consumer health, and the impact of these interventions on clinician workflow has become increasingly important. The methodological framework introduced in this paper uses a two-phase sequential mixed methods design that qualitatively explores clinician workflow before and after implementation of a delivery-centric intervention, and uses this information to quantitatively measure changes to workflow activities. The mixed methods protocol provides a standardized approach for understanding and determining the impact of delivery-centric interventions on clinician workflow. PMID:23304393

  13. Electronic health information in use: Characteristics that support employee workflow and patient care.

    Science.gov (United States)

    Russ, Alissa L; Saleem, Jason J; Justice, Connie F; Woodward-Hagg, Heather; Woodbridge, Peter A; Doebbeling, Bradley N

    2010-12-01

    The aim of this investigation was to assess helpful and challenging aspects of electronic health information with respect to clinical workflow and identify a set of characteristics that support patient care processes. We conducted 20 semi-structured interviews at a Veterans Affairs Medical Center, with a fully implemented electronic health record (EHR), and elicited positive and negative examples of how information technology (IT) affects the work of healthcare employees. Responses naturally shed light on information characteristics that aid work processes. We performed a secondary analysis on interview data and inductively identified characteristics of electronic information that support healthcare workflow. Participants provided 199 examples of how electronic information affects workflow. Seventeen characteristics emerged along with four primary domains: trustworthy and reliable; ubiquitous; effectively displayed; and adaptable to work demands. Each characteristic may be used to help evaluate health information technology pre- and post-implementation. Results provide several strategies to improve EHR design and implementation to better support healthcare workflow. PMID:21216808

  14. Multilevel Workflow System in the ATLAS Experiment

    CERN Document Server

    Borodin, M; The ATLAS collaboration; Golubkov, D; Klimentov, A; Maeno, T; Vaniachine, A

    2015-01-01

    The ATLAS experiment is scaling up Big Data processing for the next LHC run using a multilevel workflow system comprised of many layers. In Big Data processing ATLAS deals with datasets, not individual files. Similarly a task (comprised of many jobs) has become a unit of the ATLAS workflow in distributed computing, with about 0.8M tasks processed per year. In order to manage the diversity of LHC physics (exceeding 35K physics samples per year), the individual data processing tasks are organized into workflows. For example, the Monte Carlo workflow is composed of many steps: generate or configure hard-processes, hadronize signal and minimum-bias (pileup) events, simulate energy deposition in the ATLAS detector, digitize electronics response, simulate triggers, reconstruct data, convert the reconstructed data into ROOT ntuples for physics analysis, etc. Outputs are merged and/or filtered as necessary to optimize the chain. The bi-level workflow manager - ProdSys2 - generates actual workflow tasks and their jobs...
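
    The chained steps described above can be sketched generically as a workflow that threads each step's output into the next; the step bodies below are placeholders, not ATLAS code.

```python
# Each stage consumes the previous stage's output, as in the Monte Carlo
# chain (generate -> simulate -> reconstruct); the arithmetic is a stand-in.
def generate(n):
    return list(range(n))            # stand-in for event generation

def simulate(events):
    return [e * 2 for e in events]   # stand-in for detector simulation

def reconstruct(events):
    return [e + 1 for e in events]   # stand-in for reconstruction

def run_workflow(steps, seed):
    data = seed
    for step in steps:
        data = step(data)
    return data

ntuple = run_workflow([generate, simulate, reconstruct], 4)
print(ntuple)  # -> [1, 3, 5, 7]
```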

  15. Multilevel Workflow System in the ATLAS Experiment

    CERN Document Server

    Borodin, M; The ATLAS collaboration; De, K; Golubkov, D; Klimentov, A; Maeno, T; Vaniachine, A

    2014-01-01

    The ATLAS experiment is scaling up Big Data processing for the next LHC run using a multilevel workflow system comprised of many layers. In Big Data processing ATLAS deals with datasets, not individual files. Similarly a task (comprised of many jobs) has become a unit of the ATLAS workflow in distributed computing, with about 0.8M tasks processed per year. In order to manage the diversity of LHC physics (exceeding 35K physics samples per year), the individual data processing tasks are organized into workflows. For example, the Monte Carlo workflow is composed of many steps: generate or configure hard-processes, hadronize signal and minimum-bias (pileup) events, simulate energy deposition in the ATLAS detector, digitize electronics response, simulate triggers, reconstruct data, convert the reconstructed data into ROOT ntuples for physics analysis, etc. Outputs are merged and/or filtered as necessary to optimize the chain. The bi-level workflow manager - ProdSys2 - generates actual workflow tasks and their jobs...

  16. Re-engineering Workflows: Changing the Life Cycle of an Electronic Health Record System

    Directory of Open Access Journals (Sweden)

    Jane M. Brokel

    2011-01-01

    An existing electronic health record (EHR) system was re-engineered with cross-functional workflows to enhance the efficiency and clinical utility of health information technology. The new designs were guided by a systematic review of clinicians' requests, which were garnered by direct interviews. To design cross-functional, patient-centered workflows, several multi-disciplinary teams from the health system of hospitals, clinics, and other services participated. We identified gaps and inconsistencies with current care processes and implemented changes that improved workflow for patients and clinicians. Our findings emphasize that, to coordinate care between many providers, process workflow must be standardized within and across settings and focus on patient care processes, not the technology. These new, comprehensive, admission-to-discharge workflows replaced the older, functional- and departmental-process flow charts that had fallen short. Our experience led to integrated redesign of the workflows, review prior to implementation and ongoing maintenance of this process knowledge across 37 hospital facilities.

  17. The demand for consistent web-based workflow editors

    OpenAIRE

    Gesing, Sandra; Atkinson, Malcolm; Klampanos, Iraklis; Galea, Michelle; Berthold, Michael; Barbera, Robert; Scardaci, Diego; Terstyanszky, Gabor; Kiss, Tamas; Kacsuk, Peter

    2016-01-01

    This paper identifies the high value to researchers in many disciplines of having web-based graphical editors for scientific workflows and draws attention to two technological transitions: good quality editors can now run in a browser and workflow enactment systems are emerging that manage multiple workflow languages and support multi-lingual workflows. We contend that this provides a unique opportunity to introduce multi-lingual graphical workflow editors which in turn would yield substantia...

  18. SPATIAL DATA QUALITY AND A WORKFLOW TOOL

    Directory of Open Access Journals (Sweden)

    M. Meijer

    2015-08-01

    Although perceived by many as important, spatial data quality has hardly ever taken centre stage unless something went wrong due to bad quality. However, we think this is going to change soon. We are relying more and more on data-driven processes, and due to the increased availability of data, there is a choice in what data to use. How to make that choice? We think spatial data quality has potential as a selection criterion. In this paper we focus on how a workflow tool can help the consumer as well as the producer get a better understanding of which product characteristics are important. For this purpose, we have developed a framework in which we define different roles (consumer, producer and intermediary) and differentiate between product specifications and quality specifications. A number of requirements are stated that can be translated into quality elements. We used case studies to validate our framework. This framework is designed following the fitness-for-use principle. Also part of this framework is software that in some cases can help ascertain the quality of datasets.

  19. a Workflow for UAV's Integration Into a Geodesign Platform

    Science.gov (United States)

    Anca, P.; Calugaru, A.; Alixandroae, I.; Nazarie, R.

    2016-06-01

    This paper presents a workflow for the development of various Geodesign scenarios. The subject is important in the context of identifying patterns and designing solutions for a Smart City with optimized public transportation, efficient buildings, efficient utilities, recreational facilities and so on. The workflow describes the procedures starting with acquiring data in the field, followed by data processing, orthophoto generation, DTM generation, integration into a GIS platform and analysis to better support Geodesign. Esri's City Engine is used mostly for its 3D modeling capabilities, which enable the user to obtain realistic 3D models. The workflow uses as inputs information extracted from images acquired using UAV technologies, namely eBee, existing 2D GIS geodatabases, and a set of CGA rules. The method used further on, called procedural modeling, applies rules to extrude buildings, the street network, parcel zoning and side details based on the initial attributes from the geodatabase. The resulting products are various scenarios for redesigning and for analyzing new exploitation sites. Finally, these scenarios can be published as interactive web scenes for internal, group or public consultation. In this way, problems like the impact of new constructions being built, re-arranging green spaces or changing routes for public transportation are revealed through impact, visibility or shadowing analysis and are brought to citizens' attention. This leads to better decisions.
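
    The procedural-modeling step, extruding buildings from geodatabase attributes, can be illustrated with a toy rule; the attribute names (floors, floor_height) are assumptions, and real CityEngine rules are written in CGA, not Python.

```python
# A "rule" that turns each 2D parcel footprint into a 3D block whose height
# is derived from attributes carried in the geodatabase.
def extrude(parcel):
    height = parcel["floors"] * parcel["floor_height"]
    return {"footprint": parcel["footprint"], "height": height}

parcels = [
    {"footprint": [(0, 0), (10, 0), (10, 8), (0, 8)], "floors": 4, "floor_height": 3.0},
    {"footprint": [(12, 0), (20, 0), (20, 6), (12, 6)], "floors": 2, "floor_height": 3.0},
]
models = [extrude(p) for p in parcels]
print([m["height"] for m in models])  # -> [12.0, 6.0]
```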

  20. SHIWA Services for Workflow Creation and Sharing in Hydrometeorology

    Science.gov (United States)

    Terstyanszky, Gabor; Kiss, Tamas; Kacsuk, Peter; Sipos, Gergely

    2014-05-01

    Researchers want to run scientific experiments on Distributed Computing Infrastructures (DCI) to access large pools of resources and services. To run these experiments requires specific expertise that they may not have. Workflows can hide resources and services as a virtualisation layer providing a user interface that researchers can use. There are many scientific workflow systems but they are not interoperable. To learn a workflow system and create workflows may require significant efforts. Considering these efforts it is not reasonable to expect that researchers will learn new workflow systems if they want to run workflows developed in other workflow systems. To overcome it requires creating workflow interoperability solutions to allow workflow sharing. The FP7 'Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs' (SHIWA) project developed the Coarse-Grained Interoperability concept (CGI). It enables recycling and sharing workflows of different workflow systems and executing them on different DCIs. SHIWA developed the SHIWA Simulation Platform (SSP) to implement the CGI concept integrating three major components: the SHIWA Science Gateway, the workflow engines supported by the CGI concept and DCI resources where workflows are executed. The science gateway contains a portal, a submission service, a workflow repository and a proxy server to support the whole workflow life-cycle. The SHIWA Portal allows workflow creation, configuration, execution and monitoring through a Graphical User Interface using the WS-PGRADE workflow system as the host workflow system. The SHIWA Repository stores the formal description of workflows and workflow engines plus executables and data needed to execute them. It offers a wide-range of browse and search operations. To support non-native workflow execution the SHIWA Submission Service imports the workflow and workflow engine from the SHIWA Repository. This service either invokes locally or remotely

  1. YesWorkflow: A User-Oriented, Language-Independent Tool for Recovering Workflow Information from Scripts

    OpenAIRE

    Timothy McPhillips; Tianhong Song; Tyler Kolisnik; Steve Aulenbach; Khalid Belhajjame; R Kyle Bocinsky; Yang Cao; James Cheney; Fernando Chirigati; Saumen Dey; Juliana Freire; Christopher Jones; James Hanken; Kintigh, Keith W.; Kohler, Timothy A.

    2015-01-01

    Scientific workflow management systems offer features for composing complex computational pipelines from modular building blocks, for executing the resulting automated workflows, and for recording the provenance of data products resulting from workflow runs. Despite the advantages such features provide, many automated workflows continue to be implemented and executed outside of scientific workflow systems due to the convenience and familiarity of scripting languages (such as Perl, Python, R, ...

  2. Workflow reengineering: a methodology for business process reengineering with workflow management technology

    OpenAIRE

    Bitzer, Sharon Marie.

    1995-01-01

    All organizations, both private and public, must improve their business practices to survive in today's volatile and highly competitive marketplace. This thesis overviews business process reengineering principles, and examines four methodologies for its accomplishment. Based on existing approaches, the thesis develops a new reengineering procedure, called the Workflow Reengineering Methodology. This methodology uses workflow automation as an enabler for efficiently and eff...

  3. Progress in digital color workflow understanding in the International Color Consortium (ICC) Workflow WG

    Science.gov (United States)

    McCarthy, Ann

    2006-01-01

    The ICC Workflow WG serves as the bridge between ICC color management technologies and use of those technologies in real world color production applications. ICC color management is applicable to and is used in a wide range of color systems, from highly specialized digital cinema color special effects to high volume publications printing to home photography. The ICC Workflow WG works to align ICC technologies so that the color management needs of these diverse use case systems are addressed in an open, platform independent manner. This report provides a high level summary of the ICC Workflow WG objectives and work to date, focusing on the ways in which workflow can impact image quality and color systems performance. The 'ICC Workflow Primitives' and 'ICC Workflow Patterns and Dimensions' workflow models are covered in some detail. Consider the questions, "How much of dissatisfaction with color management today is the result of 'the wrong color transformation at the wrong time' and 'I can't get to the right conversion at the right point in my work process'?" Put another way, consider how image quality through a workflow can be negatively affected when the coordination and control level of the color management system is not sufficient.

  4. YesWorkflow: A User-Oriented, Language-Independent Tool for Recovering Workflow Information from Scripts

    Directory of Open Access Journals (Sweden)

    Timothy McPhillips

    2015-02-01

    Scientific workflow management systems offer features for composing complex computational pipelines from modular building blocks, executing the resulting automated workflows, and recording the provenance of data products resulting from workflow runs. Despite the advantages such features provide, many automated workflows continue to be implemented and executed outside of scientific workflow systems due to the convenience and familiarity of scripting languages (such as Perl, Python, R, and MATLAB), and to the high productivity many scientists experience when using these languages. YesWorkflow is a set of software tools that aim to provide such users of scripting languages with many of the benefits of scientific workflow systems. YesWorkflow requires neither the use of a workflow engine nor the overhead of adapting code to run effectively in such a system. Instead, YesWorkflow enables scientists to annotate existing scripts with special comments that reveal the computational modules and dataflows otherwise implicit in these scripts. YesWorkflow tools extract and analyze these comments, represent the scripts in terms of entities based on the typical scientific workflow model, and provide graphical renderings of this workflow-like view of the scripts. Future versions of YesWorkflow will also allow the prospective provenance of the data products of these scripts to be queried in ways similar to those available to users of scientific workflow systems.
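
    The annotation mechanism described above can be illustrated with a toy extractor. The @begin/@in/@out comment keywords follow the YesWorkflow style, but this parser is a simplified sketch, not the YesWorkflow implementation.

```python
import re

# A script annotated with YesWorkflow-style comments; load() and dropna()
# are hypothetical functions, present only to make the dataflow visible.
script = """\
# @begin fetch_data  @out raw
raw = load("input.csv")
# @end fetch_data
# @begin clean  @in raw  @out table
table = dropna(raw)
# @end clean
"""

def extract_blocks(text):
    """Recover the module names and their declared inputs/outputs."""
    blocks = {}
    for line in text.splitlines():
        m = re.match(r"#\s*@begin\s+(\w+)", line)
        if m:
            blocks[m.group(1)] = {
                "in": re.findall(r"@in\s+(\w+)", line),
                "out": re.findall(r"@out\s+(\w+)", line),
            }
    return blocks

wf = extract_blocks(script)
print(wf["clean"])  # -> {'in': ['raw'], 'out': ['table']}
```

    From such extracted entities a tool can render the script as a dataflow graph without ever executing it, which is the core of the approach.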

  5. SwinDeW-C: A Peer-to-Peer Based Cloud Workflow System

    Science.gov (United States)

    Liu, Xiao; Yuan, Dong; Zhang, Gaofeng; Chen, Jinjun; Yang, Yun

    Workflow systems are designed to support the process automation of large scale business and scientific applications. In recent years, many workflow systems have been deployed on high performance computing infrastructures such as cluster, peer-to-peer (p2p), and grid computing (Moore, 2004; Wang, Jie, & Chen, 2009; Yang, Liu, Chen, Lignier, & Jin, 2007). One of the driving forces is the increasing demand for large scale instance-intensive and data/computation-intensive workflow applications (large scale workflow applications for short), which are common in both eBusiness and eScience application areas. Typical examples (detailed in Section 13.2.1) include the transaction-intensive nation-wide insurance claim application process and the data- and computation-intensive pulsar searching process in astrophysics. Generally speaking, instance-intensive applications are processes which need to be executed a large number of times sequentially within a very short period, or concurrently with a large number of instances (Liu, Chen, Yang, & Jin, 2008; Liu et al., 2010; Yang et al., 2008). Therefore, large scale workflow applications normally require the support of high performance computing infrastructures (e.g. advanced CPU units, large memory space and high speed networks), especially when the workflow activities are themselves data and computation intensive. In the real world, to accommodate such demands, expensive computing infrastructure such as supercomputers and data servers is bought, installed, integrated and maintained at huge cost by system users

  6. Integrating configuration workflows with project management system

    International Nuclear Information System (INIS)

    The complexity of the heterogeneous computing resources, services and recurring infrastructure changes at the GridKa WLCG Tier-1 computing center require a structured approach to configuration management and optimization of interplay between functional components of the whole system. A set of tools deployed at GridKa, including Puppet, Redmine, Foreman, SVN and Icinga, provides the administrative environment giving the possibility to define and develop configuration workflows, reduce the administrative effort and improve sustainable operation of the whole computing center. In this presentation we discuss the developed configuration scenarios implemented at GridKa, which we use for host installation, service deployment, change management procedures, service retirement etc. The integration of Puppet with a project management tool like Redmine provides us with the opportunity to track problem issues, organize tasks and automate these workflows. The interaction between Puppet and Redmine results in automatic updates of the issues related to the executed workflow performed by different system components. The extensive configuration workflows require collaboration and interaction between different departments like network, security, production etc. at GridKa. Redmine plugins developed at GridKa and integrated in its administrative environment provide an effective way of collaboration within the GridKa team. We present the structural overview of the software components, their connections, communication protocols and show a few working examples of the workflows and their automation.
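
    The Puppet-to-Redmine update flow might be sketched as follows; the report fields and the update format are hypothetical, since the actual integration relies on Redmine plugins and interfaces not shown here.

```python
# Turn a (simplified) configuration-run report into issue updates, so that
# issues tracking a workflow are updated automatically after each run.
def report_to_updates(report):
    updates = []
    for resource, status in report["resources"].items():
        if status == "failed":
            updates.append({"issue": report["issue_id"],
                            "note": f"{resource} failed on {report['host']}"})
        elif status == "changed":
            updates.append({"issue": report["issue_id"],
                            "note": f"{resource} updated on {report['host']}"})
    return updates

report = {"host": "node01", "issue_id": 4711,
          "resources": {"Service[icinga]": "changed", "Package[svn]": "failed"}}
print(report_to_updates(report))
```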

  7. Workflow for fast and efficient integration of Petrel-based fault models into coupled hydro-mechanical TOUGH2-MP - FLAC3D simulations of CO2 storage

    OpenAIRE

    B. Nakaten; T. Kempka

    2014-01-01

    We discuss a workflow implemented for coupling arbitrary numerical simulators considering complex geological models with discrete faults. This includes grid conversion of geological model grids generated with the Petrel software package to different simulator input formats within a few minutes for multi-million element models. We introduce the conceptual workflow design and tools required for the workflow realization. In this context, different fault representations can be realized including ...

  8. Scientific Workflow Applications on Amazon EC2

    CERN Document Server

    Juve, Gideon; Vahi, Karan; Mehta, Gaurang; Berriman, Bruce; Berman, Benjamin P; Maechling, Phil

    2010-01-01

    The proliferation of commercial cloud computing providers has generated significant interest in the scientific computing community. Much recent research has attempted to determine the benefits and drawbacks of cloud computing for scientific applications. Although clouds have many attractive features, such as virtualization, on-demand provisioning, and "pay as you go" usage-based pricing, it is not clear whether they are able to deliver the performance required for scientific applications at a reasonable price. In this paper we examine the performance and cost of clouds from the perspective of scientific workflow applications. We use three characteristic workflows to compare the performance of a commercial cloud with that of a typical HPC system, and we analyze the various costs associated with running those workflows in the cloud. We find that the performance of clouds is not unreasonable given the hardware resources provided, and that performance comparable to HPC systems can be achieved given similar resour...

  9. Logical provenance in data-oriented workflows

    KAUST Repository

    Ikeda, R.

    2013-04-01

    We consider the problem of defining, generating, and tracing provenance in data-oriented workflows, in which input data sets are processed by a graph of transformations to produce output results. We first give a new general definition of provenance for general transformations, introducing the notions of correctness, precision, and minimality. We then determine when properties such as correctness and minimality carry over from the individual transformations' provenance to the workflow provenance. We describe a simple logical-provenance specification language consisting of attribute mappings and filters. We provide an algorithm for provenance tracing in workflows where logical provenance for each transformation is specified using our language. We consider logical provenance in the relational setting, observing that for a class of Select-Project-Join (SPJ) transformations, logical provenance specifications encode minimal provenance. We have built a prototype system supporting the features and algorithms presented in the paper, and we report a few preliminary experimental results. © 2013 IEEE.
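
    The attribute-mapping-plus-filter idea can be illustrated with a toy tracer; the specification format below is invented for illustration and is not the paper's language.

```python
# Trace backward through one transformation: an output tuple's provenance is
# the set of input tuples that pass the filter and agree on mapped attributes.
def trace(output_tuple, inputs, mapping, predicate):
    """Return the input tuples that could have produced output_tuple."""
    candidates = []
    for t in inputs:
        if not predicate(t):
            continue  # filter: tuple never reached the transformation's output
        if all(t[src] == output_tuple[dst] for src, dst in mapping.items()):
            candidates.append(t)
    return candidates

inputs = [
    {"id": 1, "city": "Boston", "sales": 90},
    {"id": 2, "city": "Boston", "sales": 10},
    {"id": 3, "city": "Oslo", "sales": 70},
]
# Transformation: select tuples with sales > 50, project city -> region.
mapping = {"city": "region"}
predicate = lambda t: t["sales"] > 50
print(trace({"region": "Boston"}, inputs, mapping, predicate))  # only tuple id 1
```

    Chaining such per-transformation specifications along a workflow graph is what the paper's tracing algorithm does.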

  10. Workflow-Based Dynamic Enterprise Modeling

    Institute of Scientific and Technical Information of China (English)

    黄双喜; 范玉顺; 罗海滨; 林慧萍

    2002-01-01

    Traditional systems for enterprise modeling and business process control are often static and cannot adapt to the changing environment. This paper presents a workflow-based method to dynamically execute the enterprise model. This method gives an explicit representation of the business process logic and the relationships between the elements involved in the process. An execution-oriented integrated enterprise modeling system is proposed in combination with other enterprise views. The enterprise model can be established and executed dynamically in the actual environment due to the dynamic properties of the workflow model.

  11. From Requirements via Colored Workflow Nets to an Implementation in Several Workflow Systems

    DEFF Research Database (Denmark)

    Mans, Ronnie S.; van der Aalst, Wil M.P.; Bakker, Piet J.M.;

    2007-01-01

    Care organizations, such as hospitals, need to support complex and dynamic workflows. Moreover, many disciplines are involved. This makes it important to avoid the typical disconnect between requirements and the actual implementation of the system. This paper proposes an approach where an Executable Use Case (EUC) and Colored Workflow Net (CWN) are used to close the gap between the given requirements specification and the realization of these requirements with the help of a workflow system. This paper describes a large case study where the diagnostic trajectory of the gynaecological oncology...

  12. Understanding the Potential of Digital Intraoral and Benchtop Scanning Workflows.

    Science.gov (United States)

    Jansen, Curtis E

    2015-01-01

    Although the overwhelming majority of dental offices now use digital radiography and patient records, relatively few yet use either stand-alone intraoral scanning systems (6%) or complete systems that combine intraoral scanning with computer-aided design and computer-aided manufacturing (12%). This should change as dentists become more aware of the numerous advantages scanning systems offer in terms of patient care and communication of patient information, particularly with the dental laboratory. This article reviews the various types of scanner architecture as well as potential workflow models. PMID:26625165

  13. Creating Bioinformatic Workflows within the BioExtract Server

    Science.gov (United States)

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows generally require access to multiple, distributed data sources and analytic tools. The requisite data sources may include large public data repositories, community...

  14. From remote sensing data about information extraction for 3D geovisualization - Development of a workflow

    International Nuclear Information System (INIS)

    With the increased availability of high (spatial) resolution remote sensing imagery since the late nineties, the need to develop operative workflows for the automated extraction, provision and communication of information from such data has grown. Monitoring requirements aimed at the implementation of environmental or conservation targets, the management of (environmental) resources, and regional planning, as well as international initiatives, especially the joint initiative of the European Commission and ESA (European Space Agency) for Global Monitoring for Environment and Security (GMES), also play a major part. This thesis addresses the development of an integrated workflow for the automated provision of information derived from remote sensing data. With regard to the data used and the fields of application, this work aims to make the workflow as generic as possible. The following research questions are discussed: What are the requirements of a workflow architecture that seamlessly links the individual workflow elements in a timely manner and effectively secures the accuracy of the extracted information? How can the workflow retain its efficiency when large amounts of data are processed? How can the workflow be improved with regard to automated object-based image analysis (OBIA)? Which recent developments could be of use? What are the limitations, or which workarounds could be applied, in order to generate relevant results? How can relevant information be prepared in a target-oriented way and communicated effectively? How can the more recently developed, freely available virtual globes be used for the delivery of conditioned information, taking the third dimension into consideration as an additional, explicit carrier of information? Based on case studies comprising different data sets and fields of application, it is demonstrated how methods to extract and process information, as well as to effectively communicate results, can be improved and successfully combined within one workflow. It is shown that (1

  15. Evolutionary optimization of production materials workflow processes

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter;

    2014-01-01

    We present an evolutionary optimisation technique for stochastic production processes, which is able to find improved production materials workflow processes with respect to arbitrary combinations of numerical quantities associated with the production process. Working from a core fragment of the ...... where a baked goods company seeks to improve production time while simultaneously minimising the cost and use of resources....

  16. Workflow Automation: A Collective Case Study

    Science.gov (United States)

    Harlan, Jennifer

    2013-01-01

    Knowledge management has proven to be a sustainable competitive advantage for many organizations. Knowledge management systems are abundant, with multiple functionalities. The literature reinforces the use of workflow automation with knowledge management systems to benefit organizations; however, it was not known if process automation yielded…

  17. KDE Bioscience: platform for bioinformatics analysis workflows.

    Science.gov (United States)

    Lu, Qiang; Hao, Pei; Curcin, Vasa; He, Weizhong; Li, Yuan-Yuan; Luo, Qing-Ming; Guo, Yi-Ke; Li, Yi-Xue

    2006-08-01

    Bioinformatics is a dynamic research area in which a large number of algorithms and programs have been developed rapidly and independently without much consideration so far of the need for standardization. The lack of such common standards, combined with unfriendly interfaces, makes it difficult for biologists to learn how to use these tools and to translate data formats from one to another. Consequently, the construction of an integrative bioinformatics platform to facilitate biologists' research is an urgent and challenging task. KDE Bioscience is a Java-based software platform that collects a variety of bioinformatics tools and provides a workflow mechanism to integrate them. Nucleotide and protein sequences from local flat files, web sites, and relational databases can be entered, annotated, and aligned. Several home-made or third-party viewers are built in to provide visualization of annotations or alignments. KDE Bioscience can also be deployed in client-server mode, where simultaneous execution of the same workflow is supported for multiple users. Moreover, workflows can be published as web pages that can be executed from a web browser. The power of KDE Bioscience comes from its integrated algorithms and data sources. With its generic workflow mechanism, other novel calculations and simulations can be integrated to augment the current sequence analysis functions. Because of this flexible and extensible architecture, KDE Bioscience makes an ideal integrated informatics environment for future bioinformatics or systems biology research. PMID:16260186
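A workflow mechanism of the kind this abstract describes can be pictured as a pipeline of named steps, where each step consumes the previous step's output. The sketch below is a minimal illustration only; the `Workflow` class and the toy sequence-processing steps are invented for this example and are not KDE Bioscience APIs.

```python
# Minimal pipeline-style workflow sketch (illustrative; not a KDE Bioscience API).
# A workflow is an ordered list of named steps; running it threads the data
# through each step in turn.

class Workflow:
    def __init__(self, name):
        self.name = name
        self.steps = []

    def add_step(self, label, func):
        self.steps.append((label, func))
        return self  # returning self allows fluent chaining

    def run(self, data):
        for label, func in self.steps:
            data = func(data)
        return data

# Toy sequence-analysis steps, invented for this sketch
def clean(seq):
    """Uppercase the sequence and drop anything that is not A/C/G/T."""
    return "".join(c for c in seq.upper() if c in "ACGT")

def reverse_complement(seq):
    pairs = {"A": "T", "T": "A", "C": "G", "G": "C"}
    return "".join(pairs[c] for c in reversed(seq))

wf = Workflow("demo")
wf.add_step("clean", clean).add_step("revcomp", reverse_complement)
print(wf.run("acg tN"))  # → ACGT
```

Publishing such a workflow as a web page, as the platform reportedly does, would amount to exposing `run` behind an HTTP endpoint; that part is not sketched here.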

  18. Adaptive workflow simulation of emergency response

    NARCIS (Netherlands)

    Bruinsma, Guido Wybe Jan

    2010-01-01

    Recent incidents and major training exercises in and outside the Netherlands have persistently shown that not having or not sharing information during emergency response are major sources of emergency response inefficiency and error, and affect incident mitigation outcomes through workflow planning

  19. On Secure Workflow Decentralisation on the Internet

    Directory of Open Access Journals (Sweden)

    Petteri Kaskenpalo

    2010-06-01

    Full Text Available Decentralised workflow management systems are a new research area, where most work to date has focused on the system's overall architecture. As little attention has been given to the security aspects of such systems, we follow a security-driven approach and consider, from the perspective of available security building blocks, how security can be implemented and what new opportunities are presented when empowering the decentralised environment with modern distributed security protocols. Our research is motivated by the more general question of how to combine the positive enablers that email exchange enjoys with the general benefits of workflow systems, and more specifically with the benefits that can be introduced in a decentralised environment. This aims to equip email users with a set of tools to manage the semantics of a message exchange, its contents, participants and their roles in the exchange, in an environment that provides inherent assurances of security and privacy. This work is based on a survey of contemporary distributed security protocols and considers how these protocols could be used in implementing a distributed workflow management system with decentralised control. We review a set of these protocols, focusing on the required message sequences, and discuss how these security protocols provide the foundations for implementing core control-flow, data, and resource patterns in a distributed workflow environment.

  20. Soundness of Timed-Arc Workflow Nets

    DEFF Research Database (Denmark)

    Mateo, Jose Antonio; Srba, Jiri; Sørensen, Mathias Grund

    2014-01-01

    . Finally, we demonstrate the usability of our theory on the case studies of a Brake System Control Unit used in aircraft certification, the MPEG2 encoding algorithm, and a blood transfusion workflow. The implementation of the algorithms is freely available as a part of the model checker TAPAAL....

  1. Building Digital Audio Preservation Infrastructure and Workflows

    Science.gov (United States)

    Young, Anjanette; Olivieri, Blynne; Eckler, Karl; Gerontakos, Theodore

    2010-01-01

    In 2009 the University of Washington (UW) Libraries special collections received funding for the digital preservation of its audio indigenous language holdings. The university libraries, where the authors work in various capacities, had begun digitizing image and text collections in 1997. Because of this, at the onset of the project, workflows (a…

  2. MBAT: A scalable informatics system for unifying digital atlasing workflows

    Directory of Open Access Journals (Sweden)

    Sane Nikhil

    2010-12-01

    Full Text Available Abstract Background Digital atlases provide a common semantic and spatial coordinate system that can be leveraged to compare, contrast, and correlate data from disparate sources. As the quality and amount of biological data continue to advance and grow, searching, referencing, and comparing this data with a researcher's own data is essential. However, the integration process is cumbersome and time-consuming due to misaligned data, implicitly defined associations, and incompatible data sources. This work addresses these challenges by providing a unified and adaptable environment to accelerate the workflow to gather, align, and analyze the data. Results The MouseBIRN Atlasing Toolkit (MBAT) project was developed as a cross-platform, free open-source application that unifies and accelerates the digital atlas workflow. A tiered, plug-in architecture was designed for the neuroinformatics and genomics goals of the project to provide a modular and extensible design. MBAT provides the ability to use a single query to search and retrieve data from multiple data sources, align image data using the user's preferred registration method, composite data from multiple sources in a common space, and link relevant informatics information to the current view of the data or atlas. The workspaces leverage tool plug-ins to extend the basic workspace functionality and allow future extensions. A wide variety of tool plug-ins were developed that integrate pre-existing as well as newly created technology into each workspace. Novel atlasing features were also developed, such as support for multiple label sets, dynamic selection and grouping of labels, and synchronized, context-driven display of ontological data. 
Conclusions MBAT empowers researchers to discover correlations among disparate data by providing a unified environment for bringing together distributed reference resources, a user's image data, and biological atlases into the same spatial or semantic context

  3. Evaluation of Workflow Management Systems - A Meta Model Approach

    OpenAIRE

    Michael Rosemann; Michael zur Muehlen

    1998-01-01

    The automated enactment of processes through the use of workflow management systems enables the outsourcing of the control flow from application systems. By now a large number of systems that follow different workflow paradigms are available. This leads to the problem of selecting the appropriate workflow management system for a given situation. In this paper we outline the benefits of a meta model approach for the evaluation and comparison of different workflow management systems. Af...

  4. From Requirements via Colored Workflow Nets to an Implementation in Several Workflow Systems

    DEFF Research Database (Denmark)

    Mans, Ronny S.; van der Aalst, Willibrordus Martinus Pancratius; Molemann, A.J.;

    2007-01-01

    Care organizations, such as hospitals, need to support complex and dynamic workflows. Moreover, many disciplines are involved. This makes it important to avoid the typical disconnect between requirements and the actual implementation of the system. This paper proposes an approach where an Executable Use Case (EUC) and Colored Workflow Net (CWN) are used to close the gap between the given requirements specification and the realization of these requirements with the help of a workflow system. This paper describes a large case study where the diagnostic trajectory of the...

  5. Workflows in bioinformatics: meta-analysis and prototype implementation of a workflow generator

    Directory of Open Access Journals (Sweden)

    Thoraval Samuel

    2005-04-01

    Full Text Available Abstract Background Computational methods for problem solving need to interleave information access and algorithm execution in a problem-specific workflow. The structures of these workflows are defined by a scaffold of syntactic, semantic and algebraic objects capable of representing them. Despite the proliferation of GUIs (Graphic User Interfaces) in bioinformatics, only some of them provide workflow capabilities; surprisingly, no meta-analysis of workflow operators and components in bioinformatics has been reported. Results We present a set of syntactic components and algebraic operators capable of representing analytical workflows in bioinformatics. Iteration, recursion, the use of conditional statements, and management of suspend/resume tasks have traditionally been implemented on an ad hoc basis and hard-coded; by having these operators properly defined it is possible to use and parameterize them as generic re-usable components. To illustrate how these operations can be orchestrated, we present GPIPE, a prototype graphic pipeline generator for PISE that allows the definition of a pipeline, parameterization of its component methods, and storage of metadata in XML formats. This implementation goes beyond the macro capacities currently in PISE. As the entire analysis protocol is defined in XML, a complete bioinformatic experiment (linked sets of methods, parameters and results) can be reproduced or shared among users. Availability: http://if-web1.imb.uq.edu.au/Pise/5.a/gpipe.html (interactive), ftp://ftp.pasteur.fr/pub/GenSoft/unix/misc/Pise/ (download). Conclusion From our meta-analysis we have identified syntactic structures and algebraic operators common to many workflows in bioinformatics. The workflow components and algebraic operators can be assimilated into re-usable software components. GPIPE, a prototype implementation of this framework, provides a GUI builder to facilitate the generation of workflows and integration of heterogeneous
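The operators named in this abstract (sequencing, conditionals, iteration) can be expressed as higher-order functions. The sketch below is a hedged illustration of that idea in Python; the function names are invented here, and GPIPE itself defines pipelines in XML rather than code.

```python
# Generic, re-usable workflow operators as higher-order functions
# (illustrative only; not GPIPE's actual XML-based representation).

def seq(*steps):
    """Sequence operator: compose steps left to right."""
    def run(x):
        for step in steps:
            x = step(x)
        return x
    return run

def cond(pred, then_step, else_step):
    """Conditional operator: choose a branch based on the data."""
    return lambda x: then_step(x) if pred(x) else else_step(x)

def iterate(step, until):
    """Iteration operator: apply step repeatedly until the predicate holds."""
    def run(x):
        while not until(x):
            x = step(x)
        return x
    return run

# Example: repeatedly halve even numbers and increment odd ones until <= 1
pipeline = iterate(
    cond(lambda n: n % 2 == 0, lambda n: n // 2, lambda n: n + 1),
    until=lambda n: n <= 1,
)
print(pipeline(10))  # 10 -> 5 -> 6 -> 3 -> 4 -> 2 -> 1
```

Because each operator returns an ordinary function, pipelines built this way can be nested, stored, and re-parameterized like any other value, which is the reusability the meta-analysis argues for.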

  6. Sharing of Cluster Resources among Multiple Workflow Applications

    Directory of Open Access Journals (Sweden)

    Uma Boregowda

    2014-04-01

    Full Text Available Many computational solutions can be expressed as workflows. A cluster of processors is a shared resource among several users, hence the need for a scheduler which deals with multi-user jobs presented as workflows. The scheduler must find the number of processors to be allotted for each workflow and schedule tasks on the allotted processors. In this work, a new method to find the optimal and maximum number of processors that can be allotted for a workflow is proposed. Regression analysis is used to find the best possible way to share available processors among a suitable number of submitted workflows. An instance of a scheduler is created for each workflow, which schedules tasks on the allotted processors. Towards this end, a new framework to receive online submissions of workflows, to allot processors to each workflow and to schedule tasks is proposed and experimented with using a discrete-event based simulator. This space-sharing of processors among multiple workflows shows better performance than other methods found in the literature. Because of space-sharing, an instance of a scheduler must be used for each workflow within the allotted processors. Since the number of processors for each workflow is known only at runtime, a static schedule cannot be used. Hence a hybrid scheduler which tries to combine the advantages of static and dynamic schedulers is proposed. The proposed framework is thus a promising solution to scheduling multiple workflows on a cluster.
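As a rough illustration of the space-sharing idea, the sketch below splits a fixed processor pool among submitted workflows in proportion to each workflow's maximum useful processor count. The proportional rule is an assumption made for this example; the paper derives the actual shares via regression analysis.

```python
# Hypothetical space-sharing allotment (illustrative; the paper's method uses
# regression analysis to choose shares). Each workflow receives a share of the
# pool proportional to its maximum useful processor count, with a minimum of 1.

def allot_processors(total, max_useful):
    demand = sum(max_useful.values())
    allotment = {w: max(1, (total * m) // demand) for w, m in max_useful.items()}
    # Distribute processors lost to integer division, largest request first
    leftover = total - sum(allotment.values())
    for w in sorted(max_useful, key=max_useful.get, reverse=True):
        if leftover <= 0:
            break
        allotment[w] += 1
        leftover -= 1
    return allotment

requests = {"wf_a": 8, "wf_b": 4, "wf_c": 4}  # hypothetical workflow demands
print(allot_processors(16, requests))  # → {'wf_a': 8, 'wf_b': 4, 'wf_c': 4}
```

An instance of a scheduler would then be created per workflow to place its tasks within the allotment, as the abstract describes.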

  7. Agent-Based Workflow Systems in Electronic Distance Education.

    Science.gov (United States)

    Dlodlo, Nomusa; Dlodlo, Joseph B.; Masiye, Bighton S.

    Current workflow systems largely assume a closed network where all the software is available on a homogenous platform and all participants are locally linked together at the same time. The field of Electronic Distance Education (EDE) on the other hand, requires the next-generation workflow that will integrate workflows from a distributed…

  8. Flexible workflow sharing and execution services for e-scientists

    Science.gov (United States)

    Kacsuk, Péter; Terstyanszky, Gábor; Kiss, Tamas; Sipos, Gergely

    2013-04-01

    The sequence of computational and data manipulation steps required to perform a specific scientific analysis is called a workflow. Workflows that orchestrate data and/or compute intensive applications on Distributed Computing Infrastructures (DCIs) recently became standard tools in e-science. At the same time the broad and fragmented landscape of workflows and DCIs slows down the uptake of workflow-based work. The development, sharing, integration and execution of workflows is still a challenge for many scientists. The FP7 "Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs" (SHIWA) project significantly improved the situation, with a simulation platform that connects different workflow systems, different workflow languages, different DCIs and workflows into a single, interoperable unit. The SHIWA Simulation Platform is a service package, already used by various scientific communities, and used as a tool by the recently started ER-flow FP7 project to expand the use of workflows among European scientists. The presentation will introduce the SHIWA Simulation Platform and the services that ER-flow provides based on the platform to space and earth science researchers. The SHIWA Simulation Platform includes: 1. SHIWA Repository: a database where workflows and meta-data about workflows can be stored. The database is a central repository to discover and share workflows within and among communities. 2. SHIWA Portal: a web portal that is integrated with the SHIWA Repository and includes a workflow executor engine that can orchestrate various types of workflows on various grid and cloud platforms. 3. SHIWA Desktop: a desktop environment that provides similar access capabilities to the SHIWA Portal, however it runs on the users' desktops/laptops instead of a portal server. 4. Workflow engines: the ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflow engines are already

  9. How to Take HRMS Process Management to the Next Level with Workflow Business Event System

    Science.gov (United States)

    Rajeshuni, Sarala; Yagubian, Aram; Kunamaneni, Krishna

    2006-01-01

    Oracle Workflow with the Business Event System offers a complete process management solution for enterprises to manage business processes cost-effectively. Using Workflow event messaging, event subscriptions, AQ Servlet and advanced queuing technologies, this presentation will demonstrate the step-by-step design and implementation of system solutions to integrate two dissimilar systems and establish communication remotely. As a case study, the presentation walks you through the process of propagating organization name changes originating in the HRMS module to other applications without changing application code. The solution can be applied to your particular business cases for streamlining or modifying business processes across Oracle and non-Oracle applications.

  10. Evaluation of Workflow Management Systems - A Meta Model Approach

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    1998-11-01

    Full Text Available The automated enactment of processes through the use of workflow management systems enables the outsourcing of the control flow from application systems. By now a large number of systems that follow different workflow paradigms are available. This leads to the problem of selecting the appropriate workflow management system for a given situation. In this paper we outline the benefits of a meta model approach for the evaluation and comparison of different workflow management systems. After a general introduction to the topic of meta modeling, the meta models of the workflow management systems WorkParty (Siemens Nixdorf) and FlowMark (IBM) are compared as an example. These product-specific meta models can be generalized to meta reference models, which helps to specify a workflow methodology. As an example, an organisational reference meta model is presented, which helps users specify their requirements for a workflow management system.

  11. Reduction of Hospital Physicians' Workflow Interruptions: A Controlled Unit-Based Intervention Study

    Directory of Open Access Journals (Sweden)

    Matthias Weigl

    2012-01-01

    Full Text Available Highly interruptive clinical environments may cause work stress and suboptimal clinical care. This study features an intervention to reduce workflow interruptions by re-designing work and organizational practices among hospital physicians providing ward coverage. A prospective, controlled intervention was conducted in two surgical and two internal wards. The intervention was based on physician quality circles, a participative technique to involve employees in the development of solutions to overcome work-related stressors. The outcome measure was the frequency of observed workflow interruptions. Workflow interruptions by fellow physicians and nursing staff were significantly lower after the intervention. However, a similar decrease was also observed in the control units. Additional interviews to explore process-related factors suggested that there might have been spill-over effects, in the sense that solutions were not strictly confined to the intervention group. Recommendations for further research on the effectiveness and consequences of such interventions for professional communication and patient safety are discussed.

  12. Parametric Workflow (BIM) for the Repair Construction of Traditional Historic Architecture in Taiwan

    Science.gov (United States)

    Ma, Y.-P.; Hsu, C. C.; Lin, M.-C.; Tsai, Z.-W.; Chen, J.-Y.

    2015-08-01

    In Taiwan, numerous existing traditional buildings are constructed with wooden, brick, and stone structures. This paper focuses on Taiwanese traditional historic architecture, targeting traditional wooden-structure buildings as the design proposition, and develops a BIM workflow for modeling complex wooden combination geometry, integrating it with more traditional 2D documents, and visualizing repair-construction assumptions within the 3D model representation. The goal of this article is to explore the current problems to overcome in wooden historic building conservation, and to introduce BIM technology for conserving, documenting, managing, and creating full engineering drawings and information to effectively support historic conservation. Although BIM is mostly oriented to current construction praxis, there have been some attempts to investigate its applicability in historic conservation projects. This article also illustrates the importance and advantages of using a BIM workflow in the repair-construction process, compared with a generic workflow.

  13. RMCgui: a new interface for the workflow associated with running Reverse Monte Carlo simulations

    International Nuclear Information System (INIS)

    The Reverse Monte Carlo method enables the construction and refinement of large atomic models of materials that are tuned to give best agreement with experimental data, such as neutron and x-ray total scattering data, capturing both the average structure and fluctuations. The practical drawback of the current implementations of this approach is the relatively complex workflow required, from setting up the configuration and simulation details through to checking the final outputs and analysing the resultant configurations. In order to make this workflow more accessible to users, we have developed an end-to-end workflow wrapped within a graphical user interface, RMCgui, designed to make the Reverse Monte Carlo method more widely accessible. (paper)

  14. The application of workflow technology in the development of management procedures in NPPs

    International Nuclear Information System (INIS)

    According to the national nuclear safety standards and guides, operating organizations of NPPs should document management programs covering all safety-related activities. One of the preconditions for the implementation of these programs is to set up comprehensive instructions and procedures. Workflow technology, a concept originating in computer science, can help in analysing the work processes of different working areas in an NPP and in designing and developing the management procedure hierarchy and requirements. The application of workflow technology can comprehensively analyse not only the work process but also the personnel requirements related to that process, so that the procedures and programs developed meet the requirements of national nuclear safety standards and guides. This paper also covers the application of workflow technology in other areas in NPPs. (authors)

  15. KNOWLEDGE MANAGEMENT DRIVEN BUSINESS PROCESS AND WORKFLOW MODELING WITHIN AN ORGANIZATION FOR CUSTOMER SATISFACTION

    Directory of Open Access Journals (Sweden)

    Atsa Etoundi roger,

    2010-12-01

    Full Text Available To deal with the competitive pressure of the network economy, enterprises have to design their business processes and workflow systems around the satisfaction of customers. Therefore, mass production has to be abandoned in favour of individual and customized products. Enterprises that fail to meet this challenge will be obliged to step down. Those which tackle this problem need to manage the knowledge of various customers in order to come up with a set of criteria for the delivery of services or the production of products. These criteria should then be used to reengineer the business processes and workflows accordingly, for the satisfaction of each of the customers. In this paper, based on the knowledge management approach, we define an enterprise business process and workflow model for the delivery of services and production of goods based on the satisfaction of customers.

  16. IDD Archival Hardware Architecture and Workflow

    Energy Technology Data Exchange (ETDEWEB)

    Mendonsa, D; Nekoogar, F; Martz, H

    2008-10-09

    This document describes the functionality of every component in the DHS/IDD archival and storage hardware system shown in Fig. 1. The document describes the step-by-step process of image data being received at LLNL, then being processed and made available to authorized personnel and collaborators. Throughout this document, references will be made to one of two figures: Fig. 1, describing the elements of the architecture, and Fig. 2, describing the workflow and how the project utilizes the available hardware.

  17. Distributed interoperable workflow support for electronic commerce

    OpenAIRE

    Papazoglou, M.; Jeusfeld, M.A.; Weigand, H.; Jarke, M.

    1998-01-01

    Abstract. This paper describes a flexible distributed transactional workflow environment based on an extensible object-oriented framework built around class libraries, application programming interfaces, and shared services. The purpose of this environment is to support a range of EC-like business activities including the support of financial transactions and electronic contracts. This environment has as its aim to provide key infrastructure services for mediating and monitoring electronic co...

  18. Computing Workflows for Biologists: A Roadmap

    OpenAIRE

    Shade, Ashley; Teal, Tracy K.

    2015-01-01

    Extremely large datasets have become routine in biology. However, performing a computational analysis of a large dataset can be overwhelming, especially for novices. Here, we present a step-by-step guide to computing workflows with the biologist end-user in mind. Starting from a foundation of sound data management practices, we make specific recommendations on how to approach and perform computational analyses of large datasets, with a view to enabling sound, reproducible biological research.

  19. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    . We present an algorithm for the translation of such models into Markov Decision processes expressed in the syntax of the PRISM model checker. This enables analysis of business processes for the following properties: transient and steadystate probabilities, the timing, occurrence and ordering of...... for more complex annotations and ultimately to automatically synthesise workflows by composing predefined sub-processes, in order to achieve a configuration that is optimal for parameters of interest....
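To give a flavour of the quantitative properties mentioned, the sketch below computes steady-state probabilities for a tiny workflow modelled as a Markov chain by power iteration. This is a simplification made for illustration: the paper translates annotated BPMN into Markov Decision Processes analysed with the PRISM model checker, whereas this small chain has no nondeterminism and no PRISM syntax.

```python
# Steady-state probabilities of a small workflow Markov chain via power
# iteration (illustrative; the paper's analysis uses PRISM on MDP models).

def steady_state(P, iters=1000):
    """Iterate dist <- dist * P for a row-stochastic matrix P (list of lists)."""
    n = len(P)
    dist = [1.0 / n] * n
    for _ in range(iters):
        dist = [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]
    return dist

# States: 0 = working, 1 = review, 2 = rework (hypothetical workflow)
P = [
    [0.2, 0.8, 0.0],  # keep working, or submit for review
    [0.6, 0.1, 0.3],  # review passes (back to new work), stalls, or fails
    [0.0, 1.0, 0.0],  # rework always returns to review
]
pi = steady_state(P)
print([round(p, 3) for p in pi])  # → [0.366, 0.488, 0.146]
```

The long-run fraction of time spent in each state falls out directly; timing and ordering properties of the kind the abstract lists require the richer MDP translation and a model checker.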

  20. Text mining for the biocuration workflow.

    Science.gov (United States)

    Hirschman, Lynette; Burns, Gully A P C; Krallinger, Martin; Arighi, Cecilia; Cohen, K Bretonnel; Valencia, Alfonso; Wu, Cathy H; Chatr-Aryamontri, Andrew; Dowell, Karen G; Huala, Eva; Lourenço, Anália; Nash, Robert; Veuthey, Anne-Lise; Wiegers, Thomas; Winter, Andrew G

    2012-01-01

    Molecular biology has become heavily dependent on biological knowledge encoded in expert curated biological databases. As the volume of biological literature increases, biocurators need help in keeping up with the literature; (semi-) automated aids for biocuration would seem to be an ideal application for natural language processing and text mining. However, to date, there have been few documented successes for improving biocuration throughput using text mining. Our initial investigations took place for the workshop on 'Text Mining for the BioCuration Workflow' at the third International Biocuration Conference (Berlin, 2009). We interviewed biocurators to obtain workflows from eight biological databases. This initial study revealed high-level commonalities, including (i) selection of documents for curation; (ii) indexing of documents with biologically relevant entities (e.g. genes); and (iii) detailed curation of specific relations (e.g. interactions); however, the detailed workflows also showed many variabilities. Following the workshop, we conducted a survey of biocurators. The survey identified biocurator priorities, including the handling of full text indexed with biological entities and support for the identification and prioritization of documents for curation. It also indicated that two-thirds of the biocuration teams had experimented with text mining and almost half were using text mining at that time. Analysis of our interviews and survey provide a set of requirements for the integration of text mining into the biocuration workflow. These can guide the identification of common needs across curated databases and encourage joint experimentation involving biocurators, text mining developers and the larger biomedical research community. PMID:22513129

  1. ASP, Amalgamation and the Conceptual Blending Workflow

    OpenAIRE

    Eppe, Manfred; Maclean, Ewen; Confalonieri, Roberto; Kutz, Oliver; Schorlemmer, Marco; Plaza, Enric

    2015-01-01

    We present an amalgamation technique used for conceptual blending – a concept invention method that is advocated in cognitive science as a fundamental, and uniquely human engine for creative thinking. Herein, we employ the search capabilities of ASP to find commonalities among input concepts as part of the blending process, and we show how our approach fits within a generalised conceptual blending workflow. Specifically, we orchestrate ASP with imperative programming languages like Python, to...

  2. SU-E-T-419: Workflow and FMEA in a New Proton Therapy (PT) Facility

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, C; Wessels, B; Hamilton, H; Difranco, T; Mansur, D [University Hospitals Case Medical Center, Cleveland, OH (United States)]

    2014-06-01

    Purpose: Workflow is an important component in the operational planning of a new proton facility. By integrating the concept of failure mode and effect analysis (FMEA) and traditional QA requirements, a workflow for a proton therapy treatment course is set up. This workflow serves as the blueprint for the planning of computer hardware/software requirements and network flow. A slight modification of the workflow generates a process map (PM) for FMEA and the planning of the QA program in PT. Methods: A flowchart is first developed outlining the sequence of processes involved in a PT treatment course. Each process consists of a number of sub-processes to encompass a broad scope of treatment and QA procedures. For each sub-process, the personnel involved, the equipment needed and the computer hardware/software as well as network requirements are defined by a team of clinical staff, administrators and IT personnel. Results: Eleven intermediate processes with a total of 70 sub-processes involved in a PT treatment course are identified. The number of sub-processes varies, ranging from 2-12. The sub-processes within each process are used for the operational planning. For example, in the CT-Sim process, there are 12 sub-processes: three involve data entry/retrieval from a record-and-verify system, two are controlled by the CT computer, two require the department/hospital network, and the other five are setup procedures. IT then decides the number of computers needed and the software and network requirements. By removing the traditional QA procedures from the workflow, a PM is generated for FMEA analysis to design a QA program for PT. Conclusion: Significant efforts are involved in the development of the workflow in a PT treatment course. Our hybrid model of combining FMEA and a traditional QA program serves the dual purpose of efficient operational planning and the design of a QA program in PT.
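
    FMEA conventionally ranks each failure mode by a risk priority number (RPN), the product of severity, occurrence and detection scores. A minimal Python sketch of that ranking step follows; the failure modes and scores are invented for illustration and are not taken from this study:

    ```python
    # Classic FMEA ranking: risk priority number = severity x occurrence x detection,
    # each conventionally scored on a 1-10 scale. The failure modes and scores
    # below are invented for illustration, not taken from the study.
    def rpn(severity, occurrence, detection):
        return severity * occurrence * detection

    failure_modes = {
        "wrong patient record retrieved in CT-Sim": rpn(9, 2, 3),
        "network transfer of setup data fails":     rpn(4, 3, 2),
    }

    # Rank sub-process failure modes for the QA program, highest risk first
    for name, score in sorted(failure_modes.items(), key=lambda kv: -kv[1]):
        print(name, score)
    ```

    Sorting by descending RPN tells the QA designers which sub-process failures to mitigate first.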

  3. SU-E-T-419: Workflow and FMEA in a New Proton Therapy (PT) Facility

    International Nuclear Information System (INIS)

    Purpose: Workflow is an important component in the operational planning of a new proton facility. By integrating the concept of failure mode and effect analysis (FMEA) and traditional QA requirements, a workflow for a proton therapy treatment course is set up. This workflow serves as the blueprint for the planning of computer hardware/software requirements and network flow. A slight modification of the workflow generates a process map (PM) for FMEA and the planning of the QA program in PT. Methods: A flowchart is first developed outlining the sequence of processes involved in a PT treatment course. Each process consists of a number of sub-processes to encompass a broad scope of treatment and QA procedures. For each sub-process, the personnel involved, the equipment needed and the computer hardware/software as well as network requirements are defined by a team of clinical staff, administrators and IT personnel. Results: Eleven intermediate processes with a total of 70 sub-processes involved in a PT treatment course are identified. The number of sub-processes varies, ranging from 2-12. The sub-processes within each process are used for the operational planning. For example, in the CT-Sim process, there are 12 sub-processes: three involve data entry/retrieval from a record-and-verify system, two are controlled by the CT computer, two require the department/hospital network, and the other five are setup procedures. IT then decides the number of computers needed and the software and network requirements. By removing the traditional QA procedures from the workflow, a PM is generated for FMEA analysis to design a QA program for PT. Conclusion: Significant efforts are involved in the development of the workflow in a PT treatment course. Our hybrid model of combining FMEA and a traditional QA program serves the dual purpose of efficient operational planning and the design of a QA program in PT.

  4. Schedule-Aware Workflow Management Systems

    Science.gov (United States)

    Mans, Ronny S.; Russell, Nick C.; van der Aalst, Wil M. P.; Moleman, Arnold J.; Bakker, Piet J. M.

    Contemporary workflow management systems offer work-items to users through specific work-lists. Users select the work-items they will perform without having a specific schedule in mind. However, in many environments work needs to be scheduled and performed at particular times. For example, in hospitals many work-items are linked to appointments, e.g., a doctor cannot perform surgery without reserving an operating theater and making sure that the patient is present. One of the problems when applying workflow technology in such domains is the lack of calendar-based scheduling support. In this paper, we present an approach that supports the seamless integration of unscheduled (flow) and scheduled (schedule) tasks. Using CPN Tools we have developed a specification and simulation model for schedule-aware workflow management systems. Based on this a system has been realized that uses YAWL, Microsoft Exchange Server 2007, Outlook, and a dedicated scheduling service. The approach is illustrated using a real-life case study at the AMC hospital in the Netherlands. In addition, we elaborate on the experiences obtained when developing and implementing a system of this scale using formal techniques.
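
    The central idea, seamlessly mixing unscheduled (flow) and scheduled (schedule) tasks in one work-list, can be sketched as below. The `WorkItem` type, the 30-minute offering window and all task names are hypothetical illustrations, not taken from the paper:

    ```python
    from dataclasses import dataclass
    from datetime import datetime, timedelta
    from typing import Optional

    @dataclass
    class WorkItem:
        name: str
        # None for ordinary "flow" tasks; a fixed appointment time for "schedule" tasks
        scheduled_at: Optional[datetime] = None

    def offerable(items, now, window=timedelta(minutes=30)):
        """Work-items a user may start now: flow tasks are always offerable,
        schedule tasks only near their appointment time (hypothetical policy)."""
        return [it.name for it in items
                if it.scheduled_at is None or abs(it.scheduled_at - now) <= window]

    now = datetime(2024, 1, 1, 9, 0)
    items = [WorkItem("triage"),
             WorkItem("surgery", datetime(2024, 1, 1, 9, 15)),
             WorkItem("follow-up", datetime(2024, 1, 1, 14, 0))]
    print(offerable(items, now))  # the flow task plus the imminent appointment
    ```

    In the paper's architecture this offering decision is delegated to a dedicated scheduling service backed by a calendar server rather than computed in the work-list itself.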

  5. Autonomic Management of Application Workflows on Hybrid Computing Infrastructure

    Directory of Open Access Journals (Sweden)

    Hyunjoo Kim

    2011-01-01

    Full Text Available In this paper, we present a programming and runtime framework that enables the autonomic management of complex application workflows on hybrid computing infrastructures. The framework is designed to address system and application heterogeneity and dynamics to ensure that application objectives and constraints are satisfied. The need for such autonomic system and application management is becoming critical as computing infrastructures become increasingly heterogeneous, integrating different classes of resources from high-end HPC systems to commodity clusters and clouds. For example, the framework presented in this paper can be used to provision the appropriate mix of resources based on application requirements and constraints. The framework also monitors the system/application state and adapts the application and/or resources to respond to changing requirements or environment. To demonstrate the operation of the framework and to evaluate its ability, we employ a workflow used to characterize an oil reservoir executing on a hybrid infrastructure composed of TeraGrid nodes and Amazon EC2 instances of various types. Specifically, we show how different application objectives such as acceleration, conservation and resilience can be effectively achieved while satisfying deadline and budget constraints, using an appropriate mix of dynamically provisioned resources. Our evaluations also demonstrate that public clouds can be used to complement and reinforce the scheduling and usage of traditional high performance computing infrastructure.

  6. Workflow tool for engineers in a grid-enabled Matlab environment

    OpenAIRE

    Xu, Fenglain; Cox, Simon J.

    2003-01-01

    The Geodise Project aims to aid engineers in the design process by making available a suite of design search and optimisation tools and Computational Fluid Dynamics (CFD) analysis packages integrated with distributed Grid-enabled computing, databases and knowledge management technologies. Engineering Design Search and Optimisation (EDSO) is a long and repetitive process requiring a complex sequence of tasks to be scripted together. We have developed a visual workflow tool with a friendly ...

  7. Analysis of the use of a workflow engine for OTRUM system software

    OpenAIRE

    Altamimi, Mohamed

    2007-01-01

    Workflow engines are attracting more and more attention. Applications based on workflow engine technology are currently developed and deployed by many companies, such as OTRUM. In this project, we focus on the analysis and development of an efficient workflow engine for interactive TV. The research project will realize workflow engine solutions based on three options: a commercial workflow engine, an open source workflow engine, and a workflow engine implemented ...

  8. Wireless remote control clinical image workflow: utilizing a PDA for offsite distribution

    Science.gov (United States)

    Liu, Brent J.; Documet, Luis; Documet, Jorge; Huang, H. K.; Muldoon, Jean

    2004-04-01

    Last year we presented in RSNA an application to perform wireless remote control of PACS image distribution utilizing a handheld device such as a Personal Digital Assistant (PDA). This paper describes the clinical experiences, including workflow scenarios, of implementing the PDA application to route exams from the clinical PACS archive server to various locations for offsite distribution of clinical PACS exams. By utilizing this remote control application, radiologists can manage image workflow distribution with a single wireless handheld device without impacting their clinical workflow on diagnostic PACS workstations. A PDA application was designed and developed to perform DICOM Query and C-Move requests by a physician from a clinical PACS archive to a CD-burning device for automatic burning of PACS data for distribution offsite. In addition, it was also used for convenient routing of historical PACS exams to the local web server, local workstations, and teleradiology systems. The application was evaluated by radiologists as well as other clinical staff who need to distribute PACS exams to offsite referring physicians' offices and offsite radiologists. An application for image workflow management utilizing wireless technology was implemented in a clinical environment and evaluated. A PDA application was successfully utilized to perform DICOM Query and C-Move requests from the clinical PACS archive to various offsite exam distribution devices. Clinical staff can utilize the PDA to manage image workflow and PACS exam distribution conveniently for offsite consultations by referring physicians and radiologists. This solution allows radiologists to expand their effectiveness in health care delivery both within the radiology department and offsite by improving their clinical workflow.

  9. Wildfire: distributed, Grid-enabled workflow construction and execution

    Directory of Open Access Journals (Sweden)

    Issac Praveen

    2005-03-01

    Full Text Available Abstract Background We observe two trends in bioinformatics: (i) analyses are increasing in complexity, often requiring several applications to be run as a workflow; and (ii) multiple CPU clusters and Grids are available to more scientists. The traditional solution to the problem of running workflows across multiple CPUs required programming, often in a scripting language such as Perl. Programming places such solutions beyond the reach of many bioinformatics consumers. Results We present Wildfire, a graphical user interface for constructing and running workflows. Wildfire borrows user interface features from Jemboss and adds a drag-and-drop interface allowing the user to compose EMBOSS (and other) programs into workflows. For execution, Wildfire uses GEL, the underlying workflow execution engine, which can exploit available parallelism on multiple CPU machines including Beowulf-class clusters and Grids. Conclusion Wildfire simplifies the tasks of constructing and executing bioinformatics workflows.

  10. Use of contextual inquiry to understand anatomic pathology workflow: Implications for digital pathology adoption

    Directory of Open Access Journals (Sweden)

    Jonhan Ho

    2012-01-01

    Full Text Available Background: For decades, the anatomic pathology (AP) workflow has been a highly manual process based on the use of an optical microscope and glass slides. Recent innovations in scanning and digitizing of entire glass slides are accelerating a move toward widespread adoption and implementation of a workflow based on digital slides and their supporting information management software. To support the design of digital pathology systems and ensure their adoption into pathology practice, the needs of the main users within the AP workflow, the pathologists, should be identified. Contextual inquiry is a qualitative, user-centered, social method designed to identify and understand users' needs and is utilized for collecting, interpreting, and aggregating in-detail aspects of work. Objective: Contextual inquiry was utilized to document current AP workflow, identify processes that may benefit from the introduction of digital pathology systems, and establish design requirements for digital pathology systems that will meet pathologists' needs. Materials and Methods: Pathologists were observed and interviewed at a large academic medical center according to contextual inquiry guidelines established by Holtzblatt et al. 1998. Notes representing user-provided data were documented during observation sessions. An affinity diagram, a hierarchal organization of the notes based on common themes in the data, was created. Five graphical models were developed to help visualize the data, including sequence, flow, artifact, physical, and cultural models. Results: A total of six pathologists were observed by a team of two researchers. A total of 254 affinity notes were documented and organized using a system based on topical hierarchy, including 75 third-level, 24 second-level, and five main-level categories, including technology, communication, synthesis/preparation, organization, and workflow. Current AP workflow was labor intensive and lacked scalability. A large number

  11. Declarative Modelling and Safe Distribution of Healthcare Workflows

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao; Slaats, Tijs

    2012-01-01

    We present a formal technique for safe distribution of workflow processes described declaratively as Nested Condition Response (NCR) Graphs and apply the technique to a distributed healthcare workflow. Concretely, we provide a method to synthesize from an NCR Graph and any distribution of its even......-organizational case management. The contributions of this paper are to adapt the technique to allow for nested processes and milestones and to apply it to a healthcare workflow identified in a previous field study at Danish hospitals....

  12. UML based modeling of medical applications workflow in maxillofacial surgery

    OpenAIRE

    Toma, M; Busam, A; Ortmaier, T; Raczkowsky, J.; Höpner, C; Marmulla, R.

    2007-01-01

    This paper presents our research in medical workflow modeling for computer- and robot-based surgical intervention in maxillofacial surgery. Our goal is to provide a method for clinical workflow modeling, including workflow definition for pre- and intra-operative steps, analysis of new methods for combining conventional surgical procedures with robot- and computer-assisted procedures, and to facilitate easy implementation of hard- and software systems.

  13. Deductive Synthesis of Workflows for E-Science

    OpenAIRE

    Yang, B.; Bundy, Alan; Smaill, A.; Dixon, L

    2005-01-01

    In this paper we show that the automated reasoning technique of deductive synthesis can be applied to address the problem of machine-assisted composition of e-Science workflows according to users' specifications. We encode formal specifications of e-Science data, services and workflows, constructed from their descriptions, in the generic theorem prover Isabelle. Workflows meeting this specification are then synthesised as a side-effect of proving that these specifications can be met.

  14. Text mining meets workflow: linking U-Compare with Taverna

    OpenAIRE

    Kano, Yoshinobu; Dobson, Paul; Nakanishi, Mio; Tsujii, Jun'ichi; Ananiadou, Sophia

    2010-01-01

    Summary: Text mining from the biomedical literature is of increasing importance, yet it is not easy for the bioinformatics community to create and run text mining workflows due to the lack of accessibility and interoperability of the text mining resources. The U-Compare system provides a wide range of bio text mining resources in a highly interoperable workflow environment where workflows can very easily be created, executed, evaluated and visualized without coding. We have linked U-Compare t...

  15. CPOE/EHR-Driven Healthcare Workflow Generation and Scheduling

    OpenAIRE

    Han, Minmin; Song, Xiping; DeHaan, Jan; Cao, Hui; Kennedy, Rosemary; Gugerty, Brian

    2006-01-01

    Automated healthcare workflow generation and scheduling is an approach to ensure the use of evidence-based protocols. Generating efficient and practical workflows is challenging due to the dynamic nature of healthcare practice and operations. We propose to use Computerized Physician Order Entry (CPOE) and Electronic Health Record (EHR) components to generate workflows (consisting of scheduled work items) to aid healthcare (nursing) operations. Currently, we are prototyping and developing ...

  16. Dynamic Workflows and Advanced Data Management for Problem Solving Environments

    OpenAIRE

    Moisa, Dan

    2004-01-01

    Workflow management in problem solving environments (PSEs) is an emerging topic that aims to combine both data-oriented and execution-oriented views of scientific experiments, and closely integrate the processes underlying the practice of computational science with the software artifacts constituted by the PSE. This thesis presents a workflow management solution called BREW (BetteR Experiments through Workflow management) that provides functionality along four dimensions: components and insta...

  17. Building and Documenting Workflows with Python-Based Snakemake

    OpenAIRE

    Köster, Johannes; Rahmann, Sven

    2012-01-01

    Snakemake is a novel workflow engine with a simple Python-derived workflow definition language and an optimizing execution environment. It is the first system that supports multiple named wildcards (or variables) in input and output filenames of each rule definition. It also allows writing human-readable workflows that document themselves. We have found Snakemake especially useful for building high-throughput sequencing data analysis pipelines and present examples from this area. Snakemake e...
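
    The feature highlighted above, multiple named wildcards in input and output filenames, looks roughly like this in a Snakefile. This is a sketch: the rule name, file layout and `bwa` command are illustrative, not taken from the paper:

    ```
    # Requesting "aligned/A_1.bam" binds sample=A and lane=1; Snakemake then
    # resolves the matching input file "reads/A_1.fastq" automatically.
    rule align:
        input:
            "reads/{sample}_{lane}.fastq"
        output:
            "aligned/{sample}_{lane}.bam"
        shell:
            "bwa mem ref.fa {input} > {output}"
    ```

    Because both wildcards appear in the output pattern, Snakemake can infer their values from any requested target file and chain rules into a pipeline without explicit dependency lists.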

  18. Effective and Efficient Similarity Search in Scientific Workflow Repositories

    OpenAIRE

    Starlinger, Johannes; Cohen-Boulakia, Sarah; Khanna, Sanjeev; Davidson, Susan; Leser, Ulf

    2015-01-01

    Scientific workflows have become a valuable tool for large-scale data processing and analysis. This has led to the creation of specialized online repositories to facilitate workflow sharing and reuse. Over time, these repositories have grown to sizes that call for advanced methods to support workflow discovery, in particular for similarity search. Effective similarity search requires both high quality algorithms for the comparison of scientific workflows and efficient strategies for indexing,...

  19. Database management issues in workflow systems: a summary

    OpenAIRE

    Put, Ferdinand

    1996-01-01

    Currently most workflow systems use a database management system as supporting technology. But very little can be found about the actual database modeling issues in the context of workflow management. Also in the recent reference model of the Workflow Management Coalition (WFMC), nothing is mentioned about the position of a database management system, nor about modeling. This paper tries to indicate where databases come into action, and what specific problems are encountered. Discussed are...

  20. Research on an Integrated Enterprise Workflow Model

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    An integrated enterprise workflow model called PPROCE is first presented. Then, an enterprise ontology established by TOVE and the Process Specification Language (PSL) is studied. Combined with TOVE's partition idea, PSL is extended and new PSL extensions are created to define the ontology of process, organization, resource and product in the PPROCE model. As a result, the PPROCE model can be defined by a set of corresponding formal languages. This facilitates future work not only in model verification, model optimization and model simulation, but also in model translation.

  1. A Distributed Workflow Platform for Simulation

    OpenAIRE

    Nguyen, Toan; Trifan, Laurentiu; Desideri, Jean-Antoine

    2010-01-01

    This article presents an approach for designing, implementing and deploying a simulation platform based on distributed workflows. It supports the integration of existing software, e.g. Matlab, Scilab, Python, OpenFOAM and Paraview, as well as user-defined programs. The contribution here is application-level support for fault tolerance and exception handling, i.e. resilience.

  2. Evolving Workflow Graphs Using Typed Genetic Programming

    Czech Academy of Sciences Publication Activity Database

    Křen, T.; Pilát, M.; Neruda, Roman

    Los Alamitos: IEEE, 2015, pp. 1407-1414. ISBN 978-1-4799-7560-0. [SSCI 2015. Symposium Series on Computational Intelligence. Cape Town (ZA), 08.12.2015-10.12.2015] R&D Projects: GA ČR GA15-19877S; GA MŠk ED1.1.00/02.0070 Other grants: GA UK(CZ) 187115; GA UK(CZ) SVV 260224; GA MŠk(CZ) LM2011033 Institutional support: RVO:67985807 Keywords: typed genetic programming * meta-learning * workflow graphs Subject RIV: IN - Informatics, Computer Science

  3. SPATIAL DATA QUALITY AND A WORKFLOW TOOL

    OpenAIRE

    Meijer, M; Vullings, L.A.E.; J. D. Bulens; F. I. Rip; M. Boss; Hazeu, G.; Storm, M.

    2015-01-01

    Although perceived by many as important, spatial data quality has hardly ever taken centre stage unless something went wrong due to bad quality. However, we think this is going to change soon. We are relying more and more on data-driven processes, and due to the increased availability of data, there is a choice in what data to use. How to make that choice? We think spatial data quality has potential as a selection criterion. In this paper we focus on how a workflow tool can help th...

  4. A Formal Model For Declarative Workflows

    DEFF Research Database (Denmark)

    Mukkamala, Raghava Rao

    Current business process technology is good at supporting well-structured business processes that aim at achieving a fixed goal by carrying out an exact set of operations. In contrast, the exact operations needed to fulfill a business process/workflow may not always be possible to foresee...... have proved that it is sufficiently expressive to model ω-regular languages for infinite runs. The model has been extended with nested sub-graphs to express hierarchy, multi-instance sub-processes to model replicated behavior, and support for data. The second contribution of the thesis is to provide...

  5. Electronic resource management systems a workflow approach

    CERN Document Server

    Anderson, Elsa K

    2014-01-01

    To get to the bottom of a successful approach to Electronic Resource Management (ERM), Anderson interviewed staff at 11 institutions about their ERM implementations. Among her conclusions, presented in this issue of Library Technology Reports, is that grasping the intricacies of your workflow-analyzing each step to reveal the gaps and problems-at the beginning is crucial to selecting and implementing an ERM. Whether the system will be used to fill a gap, aggregate critical data, or replace a tedious manual process, the best solution for your library depends on factors such as your current soft

  6. A Safety Analysis Approach to Clinical Workflows: Application and Evaluation

    Directory of Open Access Journals (Sweden)

    Lamis Al-Qora’n

    2014-11-01

    Full Text Available Clinical workflows are safety-critical workflows as they have the potential to cause harm or death to patients. Their safety needs to be considered as early as possible in the development process. Effective safety analysis methods are required to ensure the safety of these high-risk workflows, because errors that may happen through routine workflow could propagate within the workflow to result in harmful failures of the system's output. This paper shows how to apply an approach for safety analysis of clinical workflows to analyse the safety of the workflow within a radiology department and evaluates the approach in terms of usability and benefits. The outcomes of using this approach include identification of the root causes of hazardous workflow failures that may put patients' lives at risk. We show that the approach is applicable to this area of healthcare and is able to present added value through the detailed information on possible failures, of both their causes and effects; therefore, it has the potential to improve the safety of radiology and other clinical workflows.

  7. Process Makna - A Semantic Wiki for Scientific Workflows

    CERN Document Server

    Paschke, Adrian

    2010-01-01

    Virtual e-Science infrastructures supporting Web-based scientific workflows are an example of knowledge-intensive collaborative and weakly-structured processes where the interaction with the human scientists during process execution plays a central role. In this paper we propose lightweight, dynamic, user-friendly interaction with humans during execution of scientific workflows via the low-barrier approach of Semantic Wikis as an intuitive interface for non-technical scientists. Our Process Makna Semantic Wiki system is a novel combination of a business process management system adapted for scientific workflows with a Corporate Semantic Web Wiki user interface supporting knowledge-intensive human interaction tasks during scientific workflow execution.

  8. Deploying and sharing U-Compare workflows as web services

    OpenAIRE

    Kontonatsios, Georgios; Korkontzelos, Ioannis; Kolluru, BalaKrishna; Thompson, Paul; Ananiadou, Sophia

    2013-01-01

    Background U-Compare is a text mining platform that allows the construction, evaluation and comparison of text mining workflows. U-Compare contains a large library of components that are tuned to the biomedical domain. Users can rapidly develop biomedical text mining workflows by mixing and matching U-Compare’s components. Workflows developed using U-Compare can be exported and sent to other users who, in turn, can import and re-use them. However, the resulting workflows are standalone applic...

  9. Comparison of Resource Platform Selection Approaches for Scientific Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Ramakrishnan, Lavanya

    2010-03-05

    Cloud computing is increasingly considered as an additional computational resource platform for scientific workflows. The cloud offers the opportunity to scale out applications from desktops and local cluster resources. At the same time, it can eliminate the challenges of restricted software environments and queue delays in shared high performance computing environments. Choosing from these diverse resource platforms for a workflow execution poses a challenge for many scientists. Scientists are often faced with deciding resource platform selection trade-offs with limited information on the actual workflows. While many workflow planning methods have explored task scheduling onto different resources, these methods often require fine-scale characterization of the workflow that is onerous for a scientist. In this position paper, we describe our early exploratory work into using blackbox characteristics to do a cost-benefit analysis of using cloud platforms. We use only very limited high-level information on the workflow length, width, and data sizes. The length and width are indicative of the workflow duration and parallelism. The data size characterizes the IO requirements. We compare the effectiveness of this approach to other resource selection models using two exemplar scientific workflows scheduled on desktops, local clusters, HPC centers, and clouds. Early results suggest that the blackbox model often makes the same resource selections as a more fine-grained whitebox model. We believe the simplicity of the blackbox model can help inform a scientist on the applicability of cloud computing resources even before porting an existing workflow.
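
    The black-box idea, comparing platforms using only workflow length, width and data size, can be illustrated with a toy cost model. Every number and the scoring formula below are invented assumptions for the sketch, not the authors' actual model:

    ```python
    def estimate_cost(platform, length, width, data_gb):
        """Rough makespan/cost estimate from black-box workflow features:
        length = critical-path task count, width = max parallelism.
        All platform numbers below are illustrative, not measured."""
        waves = -(-width // platform["slots"])  # ceil: batches of parallel tasks
        hours = length * waves * platform["hours_per_task"] + platform["queue_wait_h"]
        cost = hours * platform["usd_per_hour"] + data_gb * platform["usd_per_gb_io"]
        return hours, cost

    cloud = {"slots": 8, "hours_per_task": 1.2, "queue_wait_h": 0.0,
             "usd_per_hour": 2.4, "usd_per_gb_io": 0.1}
    hpc   = {"slots": 64, "hours_per_task": 1.0, "queue_wait_h": 6.0,
             "usd_per_hour": 0.0, "usd_per_gb_io": 0.0}

    # length = 10 tasks on the critical path, width = 16 parallel branches, 50 GB of IO
    print(estimate_cost(cloud, 10, 16, 50))
    print(estimate_cost(hpc, 10, 16, 50))
    ```

    Even this crude model exposes the trade-off the paper describes: the cloud avoids queue wait but bills for hours and IO, while the shared HPC center is free yet slower to start.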

  10. AN AI PLANNING APPROACH FOR GENERATING BIG DATA WORKFLOWS

    Directory of Open Access Journals (Sweden)

    Wesley Deneke

    2015-09-01

    Full Text Available The scale of big data causes the compositions of extract-transform-load (ETL) workflows to grow increasingly complex. With greater emphasis on the turnaround time for delivering solutions, stakeholders cannot continue to afford to wait the hundreds of hours it takes for domain experts to manually compose a workflow solution. This paper describes a novel AI planning approach that facilitates rapid composition and maintenance of ETL workflows. The workflow engine is evaluated on real-world scenarios from an industrial partner and results gathered from a prototype are reported to demonstrate the validity of the approach.

  11. A Model of Workflow-oriented Attributed Based Access Control

    Directory of Open Access Journals (Sweden)

    Guoping Zhang

    2011-02-01

    Full Text Available The emergence of the "Internet of Things" breaks with previous traditional thinking by integrating physical infrastructure and network infrastructure into a unified infrastructure. There will be a lot of resources and information in the IoT, so computing and processing of information is the core support of the IoT. In this paper, we introduce "Service-Oriented Computing" to solve the problem, where each device can offer its functionality as standard services. Here we mainly discuss the access control issue of service-oriented computing in the Internet of Things. This paper puts forward a model of Workflow-oriented Attribute Based Access Control (WABAC), and designs an access control framework based on the WABAC model. The model grants permissions to subjects according to subject attribute, resource attribute, environment attribute and current task, meeting the access control requirements of SOC. Using the presented approach can effectively enhance access control security for SOC applications and prevent the abuse of subject permissions.
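
    A decision function in this spirit, granting access only when subject, resource, environment and current-task conditions all hold, can be sketched in a few lines. The policy format, attribute names and the nurse example are hypothetical, not from the WABAC paper:

    ```python
    # Minimal sketch of workflow-oriented attribute-based access control.
    # A policy is a list of condition functions over the four inputs;
    # access is granted iff every condition of some policy holds.
    def permit(subject, resource, environment, current_task, policies):
        for rule in policies:
            if all(cond(subject, resource, environment, current_task) for cond in rule):
                return True
        return False

    policies = [
        [  # a nurse may read sensor data during her assigned monitoring task, on shift
            lambda s, r, e, t: s["role"] == "nurse",
            lambda s, r, e, t: r["type"] == "sensor-data",
            lambda s, r, e, t: e["on_shift"],
            lambda s, r, e, t: t == "monitor-patient",
        ],
    ]

    print(permit({"role": "nurse"}, {"type": "sensor-data"},
                 {"on_shift": True}, "monitor-patient", policies))
    print(permit({"role": "nurse"}, {"type": "sensor-data"},
                 {"on_shift": True}, "unrelated-task", policies))
    ```

    Making the current task a first-class input is what distinguishes this workflow-oriented variant from plain attribute-based access control: the same subject and resource attributes yield different decisions depending on the task being executed.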

  12. Chang'E-3 data pre-processing system based on scientific workflow

    Science.gov (United States)

    tan, xu; liu, jianjun; wang, yuanyuan; yan, wei; zhang, xiaoxia; li, chunlai

    2016-04-01

    The Chang'E-3 (CE3) mission has obtained a huge amount of lunar scientific data. Data pre-processing is an important segment of the CE3 ground research and application system. With a dramatic increase in the demand for data research and application, a Chang'E-3 data pre-processing system (CEDPS) based on scientific workflow is proposed, with the aim of making scientists more flexible and productive through data-driven automation. The system should allow the planning, conduct and control of the data processing procedure, with the following possibilities: • describe a data processing task, including: 1) define input/output data, 2) define the data relationship, 3) define the sequence of tasks, 4) define the communication between tasks, 5) define mathematical formulas, 6) define the relationship between task and data; • automatic processing of tasks. Accordingly, describing a task is the key to whether the system is flexible. We design a workflow designer, a visual environment for capturing processes as workflows, and discuss its three-level model: 1) the data relationship is established through a product tree; 2) the process model is constructed based on a directed acyclic graph (DAG); in particular, a set of process workflow constructs, including Sequence, Loop, Merge and Fork, can be composed with one another; 3) to reduce the modeling complexity of the mathematical formulas using a DAG, semantic modeling based on MathML is adopted. On top of that, we present how the CE3 data are processed with CEDPS.
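
    A DAG-based process model of this kind can be executed by visiting tasks in topological order; a minimal Python sketch using the standard library's `graphlib` follows (the task names and fork/merge shape are illustrative, not CE3's actual pipeline):

    ```python
    from graphlib import TopologicalSorter  # Python 3.9+

    # Toy pre-processing DAG: ingest feeds calibrate, which forks into two
    # branches (Fork) that both feed archive (Merge). Edges map each task
    # to the set of tasks it depends on.
    dag = {
        "calibrate":  {"ingest"},                  # Sequence: ingest -> calibrate
        "geometry":   {"calibrate"},               # Fork branch 1
        "radiometry": {"calibrate"},               # Fork branch 2
        "archive":    {"geometry", "radiometry"},  # Merge
    }

    # static_order() yields a valid execution order respecting all edges;
    # the two fork branches may appear in either order.
    order = list(TopologicalSorter(dag).static_order())
    print(order)
    ```

    A real engine would run the fork branches concurrently (`TopologicalSorter` also supports that via its `get_ready`/`done` protocol); the static order above is the simplest sequential schedule.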

  13. Workflow-Based Software Development Environment

    Science.gov (United States)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment

  14. Deriving DICOM surgical extensions from surgical workflows

    Science.gov (United States)

    Burgert, O.; Neumuth, T.; Gessat, M.; Jacobs, S.; Lemke, H. U.

    2007-03-01

    The generation, storage, transfer, and representation of image data in radiology are standardized by DICOM. To cover the needs of image-guided surgery or computer-assisted surgery in general, one needs to handle patient information besides image data. A large number of objects must be defined in DICOM to address the needs of surgery. We propose an analysis process based on Surgical Workflows that helps to identify these objects together with use cases and requirements motivating their specification. As the first result we confirmed the need for the specification of representation and transfer of geometric models. The analysis of Surgical Workflows has shown that geometric models are widely used to represent planned procedure steps, surgical tools, anatomical structures, or prostheses in the context of surgical planning, image-guided surgery, augmented reality, and simulation. So far, the models have been stored and transferred in several file formats devoid of contextual information. The standardization of data types that include contextual information, together with specifications for the handling of geometric models, allows a broader usage of such models. This paper explains the specification process leading to Geometry Mesh Service Object Pair classes. This process can be a template for the definition of further DICOM classes.

  15. Workflow management for a cosmology collaboratory

    International Nuclear Information System (INIS)

    The Nearby Supernova Factory Project will provide a unique opportunity to bring together simulation and observation to address crucial problems in particle and nuclear physics. Its goal is to significantly enhance our understanding of the nuclear processes in supernovae and to improve our ability to use both Type Ia and Type II supernovae as reference light sources (standard candles) in precision measurements of cosmological parameters. Over the past several years, astronomers and astrophysicists have been conducting in-depth sky searches with the goal of identifying supernovae in their earliest evolutionary stages and, during the 4 to 8 weeks of their most 'explosive' activity, measuring their changing magnitude and spectra. The search program currently under development at LBNL is an earth-based observation program utilizing observational instruments at Haleakala and Mauna Kea, Hawaii and Mt. Palomar, California. This new program provides a demanding testbed for the integration of computational, data management and collaboratory technologies. A critical element of this effort is the use of emerging workflow management tools to permit collaborating scientists to manage data processing and storage and to integrate advanced supernova simulation into the real-time control of the experiments. The authors describe the workflow management framework for the project, discuss security and resource allocation requirements and review emerging tools to support this important aspect of collaborative work.

  16. Workflow Management for a Cosmology Collaboratory

    Institute of Scientific and Technical Information of China (English)

    Stewart C. Loken; Charles McParland

    2001-01-01

    The Nearby Supernova Factory Project will provide a unique opportunity to bring together simulation and observation to address crucial problems in particle and nuclear physics. Its goal is to significantly enhance our understanding of the nuclear processes in supernovae and to improve our ability to use both Type Ia and Type II supernovae as reference light sources (standard candles) in precision measurements of cosmological parameters. Over the past several years, astronomers and astrophysicists have been conducting in-depth sky searches with the goal of identifying supernovae in their earliest evolutionary stages and, during the 4 to 8 weeks of their most "explosive" activity, measuring their changing magnitude and spectra. The search program currently under development at LBNL is an earth-based observation program utilizing observational instruments at Haleakala and Mauna Kea, Hawaii and Mt. Palomar, California. This new program provides a demanding testbed for the integration of computational, data management and collaboratory technologies. A critical element of this effort is the use of emerging workflow management tools to permit collaborating scientists to manage data processing and storage and to integrate advanced supernova simulation into the real-time control of the experiments. This paper describes the workflow management framework for the project, discusses security and resource allocation requirements and reviews emerging tools to support this important aspect of collaborative work.

  17. Delta: Data Reduction for Integrated Application Workflows.

    Energy Technology Data Exchange (ETDEWEB)

    Lofstead, Gerald Fredrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jean-Baptiste, Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Oldfield, Ron A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-06-01

    Integrated Application Workflows (IAWs) run multiple simulation workflow components concurrently on an HPC resource, connecting these components using compute area resources and compensating for any performance or data processing rate mismatches. These IAWs require high frequency and high volume data transfers between compute nodes and staging area nodes during the lifetime of a large parallel computation. The available network bandwidth between the two areas may not be enough to efficiently support the data movement. As the processing power available to compute resources increases, the requirements for this data transfer will become more difficult to satisfy and perhaps will not be satisfiable at all, since network capabilities are not expanding at a comparable rate. Furthermore, energy consumption in HPC environments is expected to grow by an order of magnitude as exascale systems become a reality. The energy cost of moving large amounts of data frequently will contribute to this issue. It is necessary to reduce the volume of data without reducing the quality of data when it is being processed and analyzed. Delta resolves the issue by addressing the lifetime data transfer operations. Delta removes subsequent identical copies of already transmitted data during transfers and restores those copies once the data has reached the destination. Delta is able to identify duplicated information and determine the most space efficient way to represent it. Initial tests show about 50% reduction in data movement while maintaining the same data quality and transmission frequency.
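    The core idea — detect identical copies of already transmitted data, send only a reference, and restore the copies at the destination — can be sketched with content hashing. This is an illustrative reconstruction of the technique, not Delta's actual implementation:

```python
import hashlib

def deduplicate(chunks):
    """Replace repeated chunks with a short reference to their first copy."""
    seen, stream = {}, []
    for c in chunks:
        h = hashlib.sha256(c).hexdigest()
        if h in seen:
            stream.append(("ref", h))        # already sent: reference only
        else:
            seen[h] = c
            stream.append(("data", h, c))    # first occurrence: full payload
    return stream

def restore(stream):
    """Rebuild the original chunk sequence on the receiving side."""
    cache, out = {}, []
    for item in stream:
        if item[0] == "data":
            _, h, c = item
            cache[h] = c
            out.append(c)
        else:                                # ("ref", h)
            out.append(cache[item[1]])
    return out
```

    A real system would additionally choose chunk boundaries and reference encodings to maximize the space saved, which is the "most space efficient way to represent it" part of the abstract.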

  18. Von der Prozeßorientierung zum Workflow Management - Teil 2: Prozeßmanagement, Workflow Management, Workflow-Management-Systeme

    OpenAIRE

    Maurer, Gerd

    1996-01-01

    The terms process orientation, process management, workflow management and workflow management systems are still not clearly defined and delimited from one another. Starting from a specific understanding of process orientation (working paper WI No. 9/1996), process management is defined as a comprehensive approach to the process-oriented design and management of enterprises. Workflow management constitutes the more formal, strongly IT-related component of process management and ...

  19. NeuroManager: A workflow analysis based simulation management engine for computational neuroscience

    Directory of Open Access Journals (Sweden)

    David Bruce Stockton

    2015-10-01

    Full Text Available We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in Matlab, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in twenty-two stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to Matlab's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.

  20. Image data compression in diagnostic imaging. International literature review and workflow recommendation

    Energy Technology Data Exchange (ETDEWEB)

    Braunschweig, R.; Kaden, Ingmar [Klinik fuer Bildgebende Diagnostik und Interventionsradiologie, BG-Kliniken Bergmannstrost Halle (Germany); Schwarzer, J.; Sprengel, C. [Dept. of Management Information System and Operations Research, Martin-Luther-Univ. Halle Wittenberg (Germany); Klose, K. [Medizinisches Zentrum fuer Radiologie, Philips-Univ. Marburg (Germany)

    2009-07-15

    Purpose: Today healthcare policy is based on effectiveness. Diagnostic imaging became a "pace-setter" due to amazing technical developments (e.g. multislice CT), extensive data volumes, and especially the well-defined workflow-orientated scenarios on a local and (inter)national level. To make centralized networks sufficient, image data compression has been regarded as the key to a simple and secure solution. In February 2008 specialized working groups of the DRG held a consensus conference. They drew up recommended data compression techniques and ratios. Materials and methods: The purpose of our paper is an international review of the literature on compression technologies, different imaging procedures (e.g. DR, CT etc.) and targets (abdomen, etc.), and to combine recommendations for compression ratios and techniques with different workflows. The studies were assigned to 4 different levels (0-3) according to the evidence. 51 studies were assigned to the highest level 3. Results: We recommend a compression factor of 1:8 (excluding cranial scans, 1:5). For workflow reasons data compression should be based on the modalities (CT, etc.). PACS-based compression is currently possible but fails to maximize workflow benefits. Only the modality-based scenarios achieve all benefits. (orig.)

  1. A multi-parametric workflow for the prioritization of mitochondrial DNA variants of clinical interest.

    Science.gov (United States)

    Santorsola, Mariangela; Calabrese, Claudia; Girolimetti, Giulia; Diroma, Maria Angela; Gasparre, Giuseppe; Attimonelli, Marcella

    2016-01-01

    Assigning a pathogenic role to mitochondrial DNA (mtDNA) variants and unveiling the potential involvement of the mitochondrial genome in diseases are challenging tasks in human medicine. Assuming that rare variants are more likely to be damaging, we designed a phylogeny-based prioritization workflow to obtain a reliable pool of candidate variants for further investigations. The prioritization workflow relies on an exhaustive functional annotation through the mtDNA extraction pipeline MToolBox and includes Macro Haplogroup Consensus Sequences to filter out fixed evolutionary variants and report rare or private variants, the nucleotide variability as reported in HmtDB, and the disease score based on several predictors of pathogenicity for non-synonymous variants. Cutoffs for both the disease score and the nucleotide variability index were established with the aim of discriminating sequence variants contributing to defective phenotypes. The workflow was validated on mitochondrial sequences from individuals affected by Leber's Hereditary Optic Neuropathy, successfully identifying 23 variants including the majority of the known causative ones. The application of the prioritization workflow to cancer datasets allowed us to trim down the number of candidates for subsequent functional analyses, unveiling among these a high percentage of somatic variants. Prioritization criteria were implemented in both the standalone ( http://sourceforge.net/projects/mtoolbox/ ) and web ( https://mseqdr.org/mtoolbox.php ) versions of MToolBox. PMID:26621530
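    The filtering step described — keep variants that are rare (low nucleotide variability) and, when a disease score exists, predicted damaging — can be sketched as below. The cutoff values and field names are placeholders for illustration, not the thresholds MToolBox actually uses:

```python
def prioritize(variants, max_variability=0.002, min_disease_score=0.43):
    """Keep candidate variants: rare by nucleotide variability and, for
    non-synonymous variants carrying a disease score, above the
    pathogenicity threshold. All cutoffs here are illustrative."""
    kept = []
    for v in variants:
        if v["variability"] > max_variability:
            continue                      # common/fixed evolutionary variant
        score = v.get("disease_score")    # absent for synonymous variants
        if score is not None and score < min_disease_score:
            continue                      # predicted benign
        kept.append(v)
    return kept

candidates = prioritize([
    {"id": "m.3460G>A", "variability": 0.0001, "disease_score": 0.90},
    {"id": "common",    "variability": 0.3000, "disease_score": 0.90},
    {"id": "rare-syn",  "variability": 0.0001},
    {"id": "benign",    "variability": 0.0001, "disease_score": 0.10},
])
print([v["id"] for v in candidates])  # ['m.3460G>A', 'rare-syn']
```

    Two independent cutoffs keep the filters separable: a variant must pass both to survive prioritization.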

  2. Image data compression in diagnostic imaging. International literature review and workflow recommendation

    International Nuclear Information System (INIS)

    Purpose: Today healthcare policy is based on effectiveness. Diagnostic imaging became a "pace-setter" due to amazing technical developments (e.g. multislice CT), extensive data volumes, and especially the well-defined workflow-orientated scenarios on a local and (inter)national level. To make centralized networks sufficient, image data compression has been regarded as the key to a simple and secure solution. In February 2008 specialized working groups of the DRG held a consensus conference. They drew up recommended data compression techniques and ratios. Materials and methods: The purpose of our paper is an international review of the literature on compression technologies, different imaging procedures (e.g. DR, CT etc.) and targets (abdomen, etc.), and to combine recommendations for compression ratios and techniques with different workflows. The studies were assigned to 4 different levels (0-3) according to the evidence. 51 studies were assigned to the highest level 3. Results: We recommend a compression factor of 1:8 (excluding cranial scans, 1:5). For workflow reasons data compression should be based on the modalities (CT, etc.). PACS-based compression is currently possible but fails to maximize workflow benefits. Only the modality-based scenarios achieve all benefits. (orig.)

  3. Conceptual Framework and Architecture for Service Mediating Workflow Management

    NARCIS (Netherlands)

    Hu, Jinmin; Grefen, Paul

    2003-01-01

    This paper proposes a three-layer workflow concept framework to realize workflow enactment flexibility by dynamically binding activities to their implementations at run time. A service mediating layer is added to bridge business process definition and its implementation. Based on this framework, we

  4. Research of Web-based Workflow Management System

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The state of the art of workflow management techniques in research is introduced. The research and development trends of Workflow Management Systems (WFMS) are presented. On the basis of an analysis and comparison of various WFMSs, a WFMS based on Web technology and distributed object management is proposed. Finally, the application of the WFMS in supply chain management is described in detail.

  5. A Collaborative Workflow for the Digitization of Unique Materials

    Science.gov (United States)

    Gueguen, Gretchen; Hanlon, Ann M.

    2009-01-01

    This paper examines the experience of one institution, the University of Maryland Libraries, as it made organizational efforts to harness existing workflows and to capture digitization done in the course of responding to patron requests. By examining the way this organization adjusted its existing workflows to put in place more systematic methods…

  6. Modelling and analysis of workflow for lean supply chains

    Science.gov (United States)

    Ma, Jinping; Wang, Kanliang; Xu, Lida

    2011-11-01

    Cross-organisational workflow systems are a component of enterprise information systems which support collaborative business processes among organisations in a supply chain. Currently, the majority of workflow systems are developed from the perspective of information modelling without considering the actual requirements of supply chain management. In this article, we focus on the modelling and analysis of cross-organisational workflow systems in the context of the lean supply chain (LSC) using Petri nets. First, the article describes the assumed conditions of a cross-organisational workflow net according to the idea of the LSC and then discusses the standardisation of the collaborating business process between organisations in the context of the LSC. Second, the concept of labelled time Petri nets (LTPNs) is defined by combining labelled Petri nets with time Petri nets, and the concept of labelled time workflow nets (LTWNs) is defined based on LTPNs. Cross-organisational labelled time workflow nets (CLTWNs) are then defined based on LTWNs. Third, the article proposes the notion of OR-silent CLTWNs and a verification approach to the soundness of LTWNs and CLTWNs. Finally, the article illustrates the proposed method with a simple example. The purpose of this research is to establish a formal method for the modelling and analysis of workflow systems for the LSC. This study initiates a new perspective of research on cross-organisational workflow management and promotes the operation management of the LSC in real-world settings.
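    The Petri-net machinery underlying these workflow nets rests on ordinary firing semantics: a transition is enabled when its input places hold enough tokens, and firing consumes tokens from inputs and produces them on outputs. A rough illustration (plain Petri nets, without the labels and time intervals the article adds):

```python
def enabled(marking, pre):
    """A transition is enabled if every input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Fire a transition: consume tokens from inputs, produce on outputs."""
    if not enabled(marking, pre):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

# A two-step sequential workflow net: i --t1--> p --t2--> o.
# Soundness of a workflow net requires that a run starting with one
# token on the source place i always ends with exactly one token on
# the sink place o and none left elsewhere.
m = {"i": 1}
m = fire(m, {"i": 1}, {"p": 1})   # fire t1
m = fire(m, {"p": 1}, {"o": 1})   # fire t2
print(m["o"])  # 1
```

    The article's LTWNs enrich each transition with a label and a time interval; the enabling and firing rules above are the base case those definitions extend.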

  7. The influence of workflow systems on team learning

    NARCIS (Netherlands)

    Offenbeek, Marjolein A.G. van

    1999-01-01

    The question is raised what influence a team’s use of a workflow system will have on team learning. In office environments where the work is organised in semi-autonomous teams that are responsible for whole processes, workflow systems are being implemented to effectively and efficiently realise the con

  8. Towards an actor-driven workflow management system for Grids

    NARCIS (Netherlands)

    F. Berretz; S. Skorupa; V. Sander; A. Belloum

    2010-01-01

    Currently, most workflow management systems in Grid environments provide push-oriented job distribution strategies, where jobs are explicitly delegated to resources. In those scenarios the dedicated resources execute submitted jobs according to the request of a workflow engine or Grid-wide scheduler

  9. Phonon Gas Model (PGM) workflow in the VLab Science Gateway

    Science.gov (United States)

    da Silveira, P.; Zhang, D.; Wentzcovitch, R. M.

    2013-12-01

    This contribution describes a scientific workflow for first-principles computations of the free energy of crystalline solids using the phonon gas model (PGM). This model was recently implemented as a hybrid method combining molecular dynamics and phonon normal mode analysis to extract temperature-dependent phonon frequencies and lifetimes beyond perturbation theory. This is a demanding high-throughput workflow and is currently being implemented in the VLab Cyberinfrastructure [da Silveira et al., 2008], which has recently been integrated into XSEDE. First we review the underlying PGM, its practical implementation, and calculation requirements. We then describe the workflow management and its general method for handling actions. We illustrate the PGM application with a calculation of MgSiO3-perovskite's anharmonic phonons. We conclude with an outlook on workflows to compute other materials' properties that will use the PGM workflow. Research supported by NSF award EAR-1019853.

  10. Federated Database Services for Wind Tunnel Experiment Workflows

    Directory of Open Access Journals (Sweden)

    A. Paventhan

    2006-01-01

    Full Text Available Enabling the full life cycle of scientific and engineering workflows requires robust middleware and services that support effective data management, near-realtime data movement and custom data processing. Many existing solutions exploit the database as a passive metadata catalog. In this paper, we present an approach that makes use of federation of databases to host data-centric wind tunnel application workflows. The user is able to compose customized application workflows based on database services. We provide a reference implementation that leverages typical business tools and technologies: Microsoft SQL Server for database services and Windows Workflow Foundation for workflow services. The application data and user's code are both hosted in federated databases. With the growing interest in XML Web Services in scientific Grids, and with databases beginning to support native XML types and XML Web services, we can expect the role of databases in scientific computation to grow in importance.

  11. A framework for interoperability of BPEL-based workflows

    Institute of Scientific and Technical Information of China (English)

    Li Xitong; Fan Yushun; Huang Shuangxi

    2008-01-01

    With the prevalence of service-oriented architecture (SOA), web services have become the dominating technology for constructing workflow systems. As a workflow is the composition of a series of interrelated web services which realize its activities, the interoperability of workflows can be treated as the composition of web services. To address this, a framework for the interoperability of business process execution language (BPEL)-based workflows is presented, which covers three phases: transformation, conformance test and execution. The core components of the framework are proposed, especially how these components promote interoperability. In particular, dynamic binding and re-composition of workflows in terms of web service testing are presented. Besides, an example of business-to-business (B2B) collaboration is provided to illustrate how to perform composition and conformance testing.

  12. A Multi-Dimensional Classification Model for Scientific Workflow Characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Ramakrishnan, Lavanya; Plale, Beth

    2010-04-05

    Workflows have been used to model repeatable tasks or operations in manufacturing, business processes, and software. In recent years, workflows are increasingly used for the orchestration of science discovery tasks that use distributed resources and web services environments through resource models such as grid and cloud computing. Workflows have disparate requirements and constraints that affect how they might be managed in distributed environments. In this paper, we present a multi-dimensional classification model illustrated by workflow examples obtained through a survey of scientists from different domains, including bioinformatics and biomedicine, weather and ocean modeling, and astronomy, detailing their data and computational requirements. The survey results and classification model contribute to the high-level understanding of scientific workflows.

  13. Using Cloud-Aware Provenance to Reproduce Scientific Workflow Execution on Cloud

    OpenAIRE

    Hasham, Khawar; Munir, Kamran; McClatchey, Richard

    2015-01-01

    Provenance has been thought of as a mechanism to verify a workflow and to provide workflow reproducibility. This provenance of scientific workflows has been effectively carried out in Grid-based scientific workflow systems. However, the recent adoption of Cloud-based scientific workflows presents an opportunity to investigate the suitability of existing approaches or to propose new approaches to collect provenance information from the Cloud and to utilize it for workflow repeatability in the Cloud infra...

  14. An Integrated Workflow for DNA Methylation Analysis

    Institute of Scientific and Technical Information of China (English)

    Pingchuan Li; Feray Demirci; Gayathri Mahalingam; Caghan Demirci; Mayumi Nakano; Blake C.Meyers

    2013-01-01

    The analysis of cytosine methylation provides a new way to assess and describe epigenetic regulation at a whole-genome level in many eukaryotes. DNA methylation has a demonstrated role in genome stability and protection, regulation of gene expression and many other aspects of genome function and maintenance. BS-seq is a relatively unbiased method for profiling DNA methylation, with a resolution capable of measuring methylation at individual cytosines. Here we describe, as an example, a workflow to handle DNA methylation analysis, from BS-seq library preparation to the data visualization. We describe some applications for the analysis and interpretation of these data. Our laboratory provides public access to plant DNA methylation data via visualization tools available at our "Next-Gen Sequence" websites (http://mpss.udel.edu), along with small RNA, RNA-seq and other data types.
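    At the single-cytosine resolution the abstract mentions, the core quantity a BS-seq workflow reports is simple: bisulfite treatment converts unmethylated cytosines to thymines, so the fraction of reads still showing a C at a position estimates its methylation level. A minimal sketch (the input format is hypothetical, not the article's pipeline):

```python
def methylation_levels(counts):
    """counts maps genomic position -> (reads_with_C, total_reads) after
    bisulfite conversion; reads retaining C come from methylated cytosines.
    Positions with no coverage get None rather than a misleading 0."""
    return {pos: (c / total if total else None)
            for pos, (c, total) in counts.items()}

levels = methylation_levels({101: (8, 10), 250: (0, 5), 400: (0, 0)})
print(levels)  # {101: 0.8, 250: 0.0, 400: None}
```

    Real pipelines additionally align bisulfite-converted reads and separate CpG, CHG and CHH contexts (important in plants), but each context's level reduces to this same ratio.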

  15. A Framework for Distributed Preservation Workflows

    Directory of Open Access Journals (Sweden)

    Rainer Schmidt

    2010-07-01

    Full Text Available The Planets Project is developing a service-oriented environment for the definition and evaluation of preservation strategies for human-centric data. It focuses on the question of logically preserving digital materials, as opposed to the physical preservation of content bit-streams. This includes the development of preservation tools for the automated characterisation, migration, and comparison of different types of Digital Objects, as well as the emulation of their original runtime environment in order to ensure long-term access and interpretability. The Planets integrated environment provides a number of end-user applications that allow data curators to execute and scientifically evaluate preservation experiments based on composable preservation services. In this paper, we focus on the middleware and programming model and show how it can be utilised to create complex preservation workflows.

  16. AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery

    International Nuclear Information System (INIS)

    New software has been developed for automating the experimental and data-processing stages of fragment-based drug discovery at a macromolecular crystallography beamline. A new workflow-automation framework orchestrates beamline-control and data-analysis software while organizing results from multiple samples. AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully

  17. AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery

    Energy Technology Data Exchange (ETDEWEB)

    Tsai, Yingssu [Stanford University, 2575 Sand Hill Road, Menlo Park, CA 94025 (United States); Stanford University, 333 Campus Drive, Mudd Building, Stanford, CA 94305-5080 (United States); McPhillips, Scott E.; González, Ana; McPhillips, Timothy M. [Stanford University, 2575 Sand Hill Road, Menlo Park, CA 94025 (United States); Zinn, Daniel [LogicBlox Inc., 1349 West Peachtree Street NW, Atlanta, GA 30309 (United States); Cohen, Aina E. [Stanford University, 2575 Sand Hill Road, Menlo Park, CA 94025 (United States); Feese, Michael D.; Bushnell, David [Cocrystal Discovery Inc., 19805 North Creek Parkway, Bothell, WA 98011 (United States); Tiefenbrunn, Theresa; Stout, C. David [The Scripps Research Institute, 10550 North Torrey Pines Road, La Jolla, CA 92037 (United States); Ludaescher, Bertram [University of California, One Shields Avenue, Davis, CA 95616 (United States); Hedman, Britt; Hodgson, Keith O. [Stanford University, 2575 Sand Hill Road, Menlo Park, CA 94025 (United States); Stanford University, 333 Campus Drive, Mudd Building, Stanford, CA 94305-5080 (United States); Soltis, S. Michael, E-mail: soltis@slac.stanford.edu [Stanford University, 2575 Sand Hill Road, Menlo Park, CA 94025 (United States)

    2013-05-01

    New software has been developed for automating the experimental and data-processing stages of fragment-based drug discovery at a macromolecular crystallography beamline. A new workflow-automation framework orchestrates beamline-control and data-analysis software while organizing results from multiple samples. AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully

  18. Software workflow for the automatic tagging of medieval manuscript images (SWATI)

    Science.gov (United States)

    Chandna, Swati; Tonne, Danah; Jejkal, Thomas; Stotzka, Rainer; Krause, Celia; Vanscheidt, Philipp; Busch, Hannah; Prabhune, Ajinkya

    2015-01-01

    Digital methods, tools and algorithms are gaining in importance for the analysis of digitized manuscript collections in the arts and humanities. One example is the BMBF-funded research project "eCodicology", which aims to design, evaluate and optimize algorithms for the automatic identification of macro- and micro-structural layout features of medieval manuscripts. The main goal of this research project is to provide better insights into high-dimensional datasets of medieval manuscripts for humanities scholars. The heterogeneous nature and size of the humanities data, and the need to create a database of automatically extracted, reproducible features for better statistical and visual analysis, are the main challenges in designing a workflow for the arts and humanities. This paper presents a concept for a workflow for the automatic tagging of medieval manuscripts. As a starting point, the workflow uses medieval manuscripts digitized within the scope of the project "Virtual Scriptorium St. Matthias". First, these digitized manuscripts are ingested into a data repository. Second, specific algorithms are adapted or designed for the identification of macro- and micro-structural layout elements like page size, writing space, number of lines, etc. Lastly, a statistical analysis and scientific evaluation of the manuscript groups are performed. The workflow is designed generically to process large amounts of data automatically with any desired algorithm for feature extraction. As a result, a database of objectified and reproducible features is created, which helps to analyze and visualize hidden relationships across around 170,000 pages. The workflow shows the potential of automatic image analysis by enabling the processing of a single page in less than a minute. Furthermore, accuracy tests of the workflow on a small set of manuscripts with respect to features like page size and text areas show that automatic and manual analysis are comparable. The usage of a computer
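As an illustration of the kind of layout features such a workflow extracts, the sketch below derives a page's writing space and text-area ratio from a toy binarized image. Plain Python lists stand in for real scans, and the feature names are invented for illustration, not eCodicology's actual API.

```python
# Toy binarized page: 1 = ink, 0 = background (stands in for a scanned image).
page = [
    [0, 0, 0, 0, 0, 0],
    [0, 1, 1, 1, 0, 0],
    [0, 1, 0, 1, 0, 0],
    [0, 0, 0, 0, 0, 0],
]

def writing_space(page):
    """Bounding box of ink pixels: a crude 'writing space' feature."""
    coords = [(r, c) for r, row in enumerate(page)
              for c, v in enumerate(row) if v]
    rows = [r for r, _ in coords]
    cols = [c for _, c in coords]
    return (min(rows), min(cols), max(rows), max(cols))

def features(page):
    box = writing_space(page)
    h = box[2] - box[0] + 1
    w = box[3] - box[1] + 1
    return {"page_size": (len(page), len(page[0])),
            "writing_space": box,
            "text_area_ratio": round(h * w / (len(page) * len(page[0])), 2)}

print(features(page))
# {'page_size': (4, 6), 'writing_space': (1, 1, 2, 3), 'text_area_ratio': 0.25}
```

A real pipeline would compute such features per page and load them into the repository for statistical analysis across the whole collection.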

  19. Understanding dental CAD/CAM for restorations--the digital workflow from a mechanical engineering viewpoint.

    Science.gov (United States)

    Tapie, L; Lebon, N; Mawussi, B; Fron Chabouis, H; Duret, F; Attal, J-P

    2015-01-01

    As digital technology infiltrates every area of daily life, including the field of medicine, so it is increasingly being introduced into dental practice. Apart from chairside practice, computer-aided design/computer-aided manufacturing (CAD/CAM) solutions are available for creating inlays, crowns, fixed partial dentures (FPDs), implant abutments, and other dental prostheses. CAD/CAM dental solutions can be considered a chain of digital devices and software for the almost automatic design and creation of dental restorations. However, dentists who want to use the technology often do not have the time or knowledge to understand it. A basic knowledge of the CAD/CAM digital workflow for dental restorations can help dentists to grasp the technology and purchase a CAD/CAM system that meets the needs of their office. This article provides a computer-science and mechanical-engineering approach to the CAD/CAM digital workflow to help dentists understand the technology. PMID:25911827

  20. End-user developed workflow-based hemodialysis nursing record system.

    Science.gov (United States)

    Tai, Hsin-Ling; Lin, Hsiu-Wen; Ke, Suh-Huei; Lin, Shu-Ai; Chang, Chiung-Chu; Chang, Polun

    2009-01-01

    We report how we decided to build our own hemodialysis nursing record system using an end-user computing strategy with Excel VBA. The project took one year to complete, since we used our off-duty time and started everything from the ground up. We are proud of the final system, which tightly fits our workflow and clinical needs. Its interface was carefully designed to be easy to use and stylish. PMID:19593037

  1. Supporting Effective Unexpected Exception Handling in Workflow Management Systems Within Organizational Contexts

    OpenAIRE

    Mourão, Hernâni Raul Vergueiro Monteiro Cidade, 1964-

    2008-01-01

    Doctoral thesis in Informatics (Informatics Engineering), presented to the University of Lisbon through the Faculty of Sciences, 2008. Workflow Management Systems (WfMS) support the execution of organizational processes within organizations. Processes are modelled using high-level languages specifying the sequence of tasks the organization has to perform. However, organizational processes do not always flow smoothly in conformance with any possible designed model, and exceptions to the ru...

  2. Using meta-mining to support data mining workflow planning and optimization

    OpenAIRE

    Kalousis, Alexandros; Nguyen, Phong; Hilario, Mélanie

    2014-01-01

    Knowledge Discovery in Databases is a complex process that involves many different data processing and learning operators. Today’s Knowledge Discovery Support Systems can contain several hundred operators. A major challenge is to assist the user in designing workflows which are not only valid but also – ideally – optimize some performance measure associated with the user goal. In this paper we present such a system. The system relies on a meta-mining module which analyses past data mining exp...

  3. Simplified Toolbar to Accelerate Repeated Tasks (START) for ArcGIS: Optimizing Workflows in Humanitarian Demining

    OpenAIRE

    Lacroix, Pierre Marcel Anselme; De Roulet, Pablo; Ray, Nicolas

    2014-01-01

    This paper presents START (Simplified Toolbar to Accelerate Repeated Tasks), a new, freely downloadable ArcGIS extension designed for non-expert GIS users. START was developed jointly by the Geneva International Centre for Humanitarian Demining (GICHD) and the University of Geneva to support frequent workflows relating to mine action. START brings together a series of basic ArcGIS tools in one toolbar and provides new geoprocessing, geometry and database management functions. The toolbar oper...

  4. A Workflow Process Mining Algorithm Based on Synchro-Net

    Institute of Scientific and Technical Information of China (English)

    Xing-Qi Huang; Li-Fu Wang; Wen Zhao; Shi-Kun Zhang; Chong-Yi Yuan

    2006-01-01

    Sometimes historical information about workflow execution is needed to analyze business processes. Process mining aims at extracting information from event logs to capture a business process in execution. In this paper a process mining algorithm is proposed based on Synchro-Net, which is a synchronization-based model of workflow logic and workflow semantics. With this mining algorithm based on the model, problems such as invisible tasks and short loops can be handled with ease. A process mining example is presented to illustrate the algorithm, and an evaluation is also given.
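The Synchro-Net algorithm itself is not given in the abstract, but the basic ingredient of most process-mining algorithms, deriving ordering relations from an event log, can be sketched in a few lines. This is a minimal alpha-algorithm-style fragment on an invented log, not the Synchro-Net method:

```python
# A toy event log: each trace is the ordered task sequence of one case.
log = [
    ["register", "check", "pay"],
    ["register", "pay", "check"],
]

def direct_succession(log):
    """Pairs (a, b) where b directly follows a in at least one trace."""
    ds = set()
    for trace in log:
        ds.update(zip(trace, trace[1:]))
    return ds

def causal_relations(ds):
    """a -> b is causal if b follows a directly but never vice versa."""
    return {(a, b) for (a, b) in ds if (b, a) not in ds}

causal = causal_relations(direct_succession(log))
print(sorted(causal))  # [('register', 'check'), ('register', 'pay')]
```

Here "check" and "pay" follow each other in different traces, so the miner treats them as concurrent rather than causally ordered.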

  5. Workflow logs analysis system for enterprise performance measurement

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Workflow logs that record the execution of business processes offer a very valuable data resource for real-time enterprise performance measurement. In this paper, a novel scheme that uses data warehouse and OLAP technology to explore workflow logs and create complex analysis reports for enterprise performance measurement is proposed. Three key points of this scheme are studied: 1) the measure set; 2) the open and flexible architecture for the workflow logs analysis system; 3) the data models in the WFMS and the data warehouse. A case study that shows the validity of the scheme is also provided.
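As a minimal illustration of the kind of measures such a scheme would load into a data warehouse, the sketch below computes per-case cycle times from a toy workflow log. The log schema (case id, task, start, end) is an assumption for illustration only:

```python
from statistics import mean

# Each log entry: (case_id, task, start, end), with times in minutes.
log = [
    ("c1", "receive", 0, 10),
    ("c1", "approve", 15, 30),
    ("c2", "receive", 5, 12),
    ("c2", "approve", 20, 50),
]

def cycle_times(log):
    """Cycle time per case: last task end minus first task start."""
    spans = {}
    for case, _, start, end in log:
        lo, hi = spans.get(case, (start, end))
        spans[case] = (min(lo, start), max(hi, end))
    return {case: hi - lo for case, (lo, hi) in spans.items()}

ct = cycle_times(log)
print(ct)                 # {'c1': 30, 'c2': 45}
print(mean(ct.values()))  # 37.5
```

An OLAP layer would then aggregate such measures along dimensions like process type, department or time period.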

  6. Mapping open access to e-resources workflows

    OpenAIRE

    Stone, Graham; Awre, Chris; Stainthorp, Paul

    2016-01-01

    Open Access workflows are often seen as a separate add-on set of processes. However, libraries already have processes in place to manage the e-resource life cycle. Therefore, as part of work package 8 (Library processes and open access), the HHuLOA team decided to investigate how open access workflows could be embedded into e-resource management. This poster accompanies the blog post at: https://library3.hud.ac.uk/blogs/hhuloa/2016/05/11/mapping-open-access-to-e-resources-workflows/

  7. Constraint-Guided Workflow Composition Based on the EDAM Ontology

    CERN Document Server

    Lamprecht, Anna-Lena; Steffen, Bernhard; Margaria, Tiziana

    2010-01-01

    Methods for the automatic composition of services into executable workflows need detailed knowledge about the application domain, in particular about the available services and their behavior in terms of input/output data descriptions. In this paper we discuss how the EMBRACE data and methods ontology (EDAM) can be used as background knowledge for the composition of bioinformatics workflows. We show by means of a small example domain that the EDAM knowledge facilitates finding possible workflows, but that additional knowledge is required to guide the search towards actually adequate solutions. We illustrate how the ability to flexibly formulate domain-specific and problem-specific constraints supports the workflow development process.
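The core idea, using input/output data types as background knowledge to find candidate workflows, can be sketched as a breadth-first search over type-annotated services. The service table and type names below are invented stand-ins for EDAM-annotated tools:

```python
from collections import deque

# Each service: (name, input type, output type); types stand in for
# EDAM ontology terms, and the tools are invented examples.
SERVICES = [
    ("fetch_seq", "accession", "sequence"),
    ("align", "sequence", "alignment"),
    ("build_tree", "alignment", "phylogeny"),
]

def compose(src, dst, services):
    """Breadth-first search for a service chain turning src data into dst."""
    queue = deque([(src, [])])
    seen = {src}
    while queue:
        ty, path = queue.popleft()
        if ty == dst:
            return path
        for name, i, o in services:
            if i == ty and o not in seen:
                seen.add(o)
                queue.append((o, path + [name]))
    return None

print(compose("accession", "phylogeny", SERVICES))
# ['fetch_seq', 'align', 'build_tree']
```

The paper's point is precisely that such type-driven search finds many syntactically valid chains, so extra domain and problem constraints are needed to prune it down to adequate ones.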

  8. Service-based flexible workflow system for virtual enterprise

    Institute of Scientific and Technical Information of China (English)

    WU Shao-fei

    2008-01-01

    Using the services provided by virtual enterprises, we presented a solution to implement flexible inter-enterprise workflow management. Services were responses to events that could be accessed programmatically on the Internet via the HTTP protocol, and were obtained according to standardized service templates. The workflow engine flexibly bound each request to appropriate services and their providers using a constraint-based, dynamic binding mechanism. Hence, a flexible and collaborative business was achieved. The workflow management system supports virtual enterprises, and the styles of virtual enterprises can be adjusted readily to adapt to various situations.
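A constraint-based, dynamic binding step of the kind described might look like the following sketch. The registry layout, template names and constraint semantics are assumptions for illustration, not the paper's actual design:

```python
# Service registry keyed by standardized service template name (assumed shape).
REGISTRY = {
    "payment": [
        {"provider": "bankA", "max_amount": 1000},
        {"provider": "bankB", "max_amount": 50000},
    ],
}

def bind(template, constraints):
    """Pick the first registered provider satisfying all numeric constraints."""
    for svc in REGISTRY.get(template, []):
        if all(svc.get(key, 0) >= needed for key, needed in constraints.items()):
            return svc["provider"]
    return None

print(bind("payment", {"max_amount": 20000}))  # bankB
print(bind("payment", {"max_amount": 500}))    # bankA
```

Because binding happens per request at run time, providers can be swapped without changing the workflow definition, which is what makes the workflow "flexible".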

  9. Solutions for complex, multi data type and multi tool analysis: principles and applications of using workflow and pipelining methods.

    Science.gov (United States)

    Munro, Robin E J; Guo, Yike

    2009-01-01

    Analytical workflow technology, sometimes also called data pipelining, is the fundamental component that provides the scalable analytical middleware that can be used to enable the rapid building and deployment of an analytical application. Analytical workflows enable researchers, analysts and informaticians to integrate and access data and tools from structured and non-structured data sources so that analytics can bridge different silos of information; compose multiple analytical methods and data transformations without coding; rapidly develop applications and solutions by visually constructing analytical workflows that are easy to revise should the requirements change; access domain-specific extensions for specific projects or areas, for example, text extraction, visualisation, reporting, genetics, cheminformatics, bioinformatics and patient-based analytics; automatically deploy workflows directly into web portals and as web services to be part of a service-oriented architecture (SOA). By performing workflow building, using a middleware layer for data integration, it is a relatively simple exercise to visually design an analytical process for data analysis and then publish this as a service to a web browser. All this is encapsulated into what can be referred to as an 'Embedded Analytics' methodology which will be described here with examples covering different scientifically focused data analysis problems. PMID:19597790
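The compose-without-coding idea at the heart of data pipelining reduces, in miniature, to function composition over a sequence of transformation steps; visual workflow editors build exactly this kind of chain. A toy sketch:

```python
from functools import reduce

def pipeline(*steps):
    """Compose processing steps left-to-right into one callable."""
    return lambda data: reduce(lambda acc, step: step(acc), steps, data)

# Three toy transformations standing in for workflow nodes.
def clean(rows):
    return [r.strip().lower() for r in rows]

def dedupe(rows):
    return sorted(set(rows))

def count(rows):
    return len(rows)

analyse = pipeline(clean, dedupe, count)
print(analyse(["A ", "b", " a"]))  # 2
```

Revising the workflow when requirements change means reordering or swapping steps, not rewriting the whole analysis, which is the maintainability argument the abstract makes.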

  10. Comprehensive and Scalable Highly Automated MS-Based Proteomic Workflow for Clinical Biomarker Discovery in Human Plasma.

    Science.gov (United States)

    Dayon, Loïc; Núñez Galindo, Antonio; Corthésy, John; Cominetti, Ornella; Kussmann, Martin

    2014-07-24

    Over the past decade, mass spectrometric performance has greatly improved in terms of sensitivity, dynamic range, and speed. By contrast, only limited progress has been accomplished with regard to automation, throughput, and robustness of the proteomic sample preparation process upstream of mass spectrometry. The present work delivers an optimized analysis of human plasma samples in both small preclinical and large clinical studies, enabled by the development of a highly automated quantitative proteomic workflow. Several iterative evaluation and validation steps were performed before process "design freeze" and development completion. A robotic liquid handling workflow and platform (including reduction, alkylation, digestion, TMT labeling, pooling, and purification) were shown to provide better quantitative trueness and precision than manual operation at the bench. Depletion of the most abundant human plasma proteins and subsequent buffer exchange were also developed and integrated. Finally, 96 identical pooled human plasma samples were prepared in a 96-well plate format, and each sample was individually subjected to our developed workflow. This test revealed increased throughput and robustness compared with previously published manual or less automated workflows. Our workflow is ready to use for future (pre-)clinical studies. We expect our work to facilitate, accelerate, and improve clinical proteomic discovery in human blood plasma. PMID:25058407

  11. DMS systems and workflow

    OpenAIRE

    Jakeš, Jiří

    2008-01-01

    This thesis deals with document management systems (DMS) and the support of intra-company processes through integrated workflow modules. It covers the main reasons for introducing a DMS, the benefits resulting from its introduction, and the functionality of typical DMS; it defines the individual components of such systems, maps the state of the relevant market, and outlines the trends the development of these systems will follow. The thesis is based primarily on practical experience and seeks the transition from technology to business...

  12. Resilient workflows for computational mechanics platforms

    International Nuclear Information System (INIS)

    Workflow management systems have recently been the focus of much interest and of many research and deployment efforts for scientific applications worldwide. Their ability to abstract applications by wrapping application codes has also stressed the usefulness of such systems for multidiscipline applications. When complex applications need to provide seamless interfaces hiding the technicalities of the computing infrastructures, their high-level modeling, monitoring and execution functionalities help give production teams seamless and effective facilities. Software integration infrastructures based on programming paradigms such as Python, Matlab and Scilab have also provided evidence of the usefulness of such approaches for the tight coupling of multidiscipline application codes. Also, high-performance computing based on multi-core multi-cluster infrastructures opens new opportunities for more accurate, more extensive and more effective robust multi-discipline simulations for the decades to come. This supports the goal of full flight dynamics simulation for 3D aircraft models within the next decade, opening the way to virtual flight tests and certification of aircraft in the future.

  13. Confusion Analysis and Detection for Workflow Nets

    Directory of Open Access Journals (Sweden)

    Xiao-liang Chen

    2014-01-01

    Option processes often occur in a business procedure with respect to resource competition. In a business procedure modeled with a workflow net (WF-net), all decision behavior and option operations for business tasks are modeled and performed by the conflicts in the corresponding WF-net. Concurrency in WF-nets is applied to keep business procedures operating at high performance. However, the firing of concurrent transitions in a WF-net may lead to the disappearance of conflicts in the WF-net. This phenomenon, usually called confusion, complicates the resolution of conflicts. This paper investigates confusion detection problems in WF-nets. First, confusions are formalized as a class of marked subnets with special conflicting and concurrent features. Second, a detection approach based on the characteristics of confusion subnets and integer linear programming (ILP) is developed, which does not require computing the reachability graph of a WF-net. Examples of confusion detection in WF-nets are presented. Finally, the impact of confusions on the properties of WF-nets is specified.
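The structural notions involved, enabledness, conflict (shared input places) and concurrency, can be sketched on a toy place/transition net. This illustrates the definitions only, not the paper's ILP-based detection method:

```python
# A tiny place/transition net: each transition maps to (preset, postset).
NET = {
    "t1": ({"p1"}, {"p2"}),
    "t2": ({"p1"}, {"p3"}),   # t1 and t2 share p1 -> structural conflict
    "t3": ({"p4"}, {"p5"}),   # independent of t1/t2
}

def enabled(marking, net):
    """Transitions whose input places are all marked."""
    return {t for t, (pre, _) in net.items() if pre <= marking}

def in_conflict(t, u, marking, net):
    """Two enabled transitions conflict if their presets overlap."""
    return (t in enabled(marking, net) and u in enabled(marking, net)
            and bool(net[t][0] & net[u][0]))

m = {"p1", "p4"}
print(sorted(enabled(m, NET)))          # ['t1', 't2', 't3']
print(in_conflict("t1", "t2", m, NET))  # True
print(in_conflict("t1", "t3", m, NET))  # False
```

A confusion arises when firing an independent transition such as t3 changes whether the t1/t2 conflict is present at all; detecting that situation without building the full reachability graph is the paper's contribution.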

  14. McRunjob: A High Energy Physics Workflow Planner for Grid Production Processing

    CERN Document Server

    Graham, G E; Bertram, I; Graham, Gregory E.; Evans, Dave; Bertram, Iain

    2003-01-01

    McRunjob is a powerful grid workflow manager used to manage the generation of large numbers of production processing jobs in High Energy Physics. In use at both the DZero and CMS experiments, McRunjob has been used to manage large Monte Carlo production processing since 1999 and is being extended to uses in regular production processing for analysis and reconstruction. Described at CHEP 2001, McRunjob converts core metadata into jobs submittable in a variety of environments. The powerful core metadata description language includes methods for converting the metadata into persistent forms, job descriptions, multi-step workflows, and data provenance information. The language features allow for structure in the metadata by including full expressions, namespaces, functional dependencies, site specific parameters in a grid environment, and ontological definitions. It also has simple control structures for parallelization of large jobs. McRunjob features a modular design which allows for easy expansion to new job d...

  15. ScyFlow: An Environment for the Visual Specification and Execution of Scientific Workflows

    Science.gov (United States)

    McCann, Karen M.; Yarrow, Maurice; DeVivo, Adrian; Mehrotra, Piyush

    2004-01-01

    With the advent of grid technologies, scientists and engineers are building more and more complex applications to utilize distributed grid resources. The core grid services provide a path for accessing and utilizing these resources in a secure and seamless fashion. However, what the scientists need is an environment that will allow them to specify their application runs at a high organizational level, and then support efficient execution across any given set or sets of resources. We have been designing and implementing ScyFlow, a dual-interface architecture (both GUI and API) that addresses this problem. The scientist/user specifies the application tasks along with the necessary control and data flow, and monitors and manages the execution of the resulting workflow across the distributed resources. In this paper, we utilize two scenarios to provide the details of the two modules of the project, the visual editor and the runtime workflow engine.

  16. Automated workflows for critical time-dependent calibrations at the CMS experiment.

    CERN Document Server

    Cerminara, Gianluca

    2015-01-01

    Fast and efficient methods for the calibration and the alignment of the detector are a key asset to exploit the physics potential of the Compact Muon Solenoid (CMS) detector and to ensure timely preparation of results for conferences and publications. To achieve this goal, the CMS experiment has set up a powerful framework. This includes automated workflows in the context of a prompt calibration concept, which allows for a quick turnaround of the calibration process following as fast as possible any change in running conditions. The presentation will review the design and operational experience of these workflows and the related monitoring system during the LHC Run I and focus on the development, deployment and commissioning in preparation of Run II.

  17. Delegation in Role Based Access Control Model for Workflow Systems

    Directory of Open Access Journals (Sweden)

    Prasanna H Bammigatti

    2008-03-01

    Role-based access control (RBAC) has been introduced in the last few years and offers a powerful means of specifying access control decisions. The RBAC model usually assumes that, if there is a role hierarchy, then access rights are inherited upwards through the hierarchy. In organizational workflow the main threat is access control, and role-based access control is one of the most suitable access control models one can think of. However, it is not only role hierarchies but also other control factors that affect access control in the workflow. The paper discusses these control factors and role hierarchies in workflow and introduces a new RBAC model. It also resolves the resulting conflicts and proves that the system is safe by applying the new model to the workflow.
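The upward inheritance of access rights that the RBAC model assumes can be sketched in a few lines; the role names and permissions below are invented for illustration:

```python
# Invented role hierarchy: each role inherits the permissions of its juniors.
JUNIORS = {"manager": ["clerk"], "clerk": ["employee"], "employee": []}
PERMS = {"employee": {"read"}, "clerk": {"submit"}, "manager": {"approve"}}

def effective_permissions(role):
    """Direct permissions plus everything inherited upwards."""
    perms = set(PERMS.get(role, set()))
    for junior in JUNIORS.get(role, []):
        perms |= effective_permissions(junior)
    return perms

print(sorted(effective_permissions("manager")))  # ['approve', 'read', 'submit']
print(sorted(effective_permissions("clerk")))    # ['read', 'submit']
```

The workflow-specific control factors the paper discusses would add further checks on top of this pure hierarchy-based inheritance, e.g. restricting which role may perform which task in a given process instance.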

  18. Approach for workflow modeling using π-calculus

    Institute of Scientific and Technical Information of China (English)

    杨东; 张申生

    2003-01-01

    As a variant of process algebra, π-calculus can describe the interactions between evolving processes. By modeling activities as processes interacting with other processes through ports, this paper presents a new approach: representing workflow models using π-calculus. As a result, the model can characterize the dynamic behavior of the workflow process in terms of the LTS (labeled transition system) semantics of π-calculus. The main advantage of the workflow model's formal semantics is that it allows verification of the model's properties, such as deadlock-freedom and normal termination. Moreover, the equivalence of workflow models can be checked through the weak bisimulation theorem in the π-calculus, thus facilitating the optimization of business processes.

  19. A Community-Driven Workflow Recommendation and Reuse Infrastructure Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Promote and encourage process and workflow reuse  within NASA Earth eXchange (NEX) by developing a proactive recommendation technology based on collective NEX...

  20. Network resource control for grid workflow management systems

    NARCIS (Netherlands)

    R. Strijkers; M. Cristea; V. Korkhov; D. Marchal; A. Belloum; C. de Laat; R. Meijer

    2010-01-01

    Grid workflow management systems automate the orchestration of scientific applications with large computational and data processing needs, but lack control over network resources. Consequently, the management system cannot prevent multiple communication intensive applications to compete for network

  1. Workflow-enabled distributed component-based information architecture for digital medical imaging enterprises.

    Science.gov (United States)

    Wong, Stephen T C; Tjandra, Donny; Wang, Huili; Shen, Weimin

    2003-09-01

    Few information systems today offer a flexible means to define and manage the automated part of radiology processes, which provide clinical imaging services for the entire healthcare organization. Even fewer of them provide a coherent architecture that can easily cope with heterogeneity and inevitable local adaptation of applications and can integrate clinical and administrative information to aid better clinical, operational, and business decisions. We describe an innovative enterprise architecture of image information management systems to fill these needs. Such a system is based on the interplay of production workflow management, distributed object computing, Java and Web techniques, and in-depth domain knowledge in radiology operations. Our design adapts the "4+1" architectural view approach. In this new architecture, PACS and RIS become one, while user interaction can be automated by customized workflow processes. Clinical service applications are implemented as active components. They can be reasonably substituted by applications of local adaptations and can be multiplied for fault tolerance and load balancing. Furthermore, the workflow-enabled digital radiology system would provide powerful query and statistical functions for managing resources and improving productivity. This paper will potentially lead to a new direction of image information management. We illustrate the innovative design with examples taken from an implemented system. PMID:14518730

  2. Coupling between a multi-physics workflow engine and an optimization framework

    Science.gov (United States)

    Di Gallo, L.; Reux, C.; Imbeaux, F.; Artaud, J.-F.; Owsiak, M.; Saoutic, B.; Aiello, G.; Bernardi, P.; Ciraolo, G.; Bucalossi, J.; Duchateau, J.-L.; Fausser, C.; Galassi, D.; Hertout, P.; Jaboulay, J.-C.; Li-Puma, A.; Zani, L.

    2016-03-01

    A generic coupling method between a multi-physics workflow engine and an optimization framework is presented in this paper. The coupling architecture has been developed in order to preserve the integrity of the two frameworks. The objective is to provide the possibility to replace a framework, a workflow or an optimizer by another one without changing the whole coupling procedure or modifying the main content of each framework. The coupling is achieved by using a socket-based communication library for exchanging data between the two frameworks. Among the algorithms provided by optimization frameworks, Genetic Algorithms (GAs) have demonstrated their efficiency on single- and multiple-criteria optimization. In addition to their robustness, GAs can handle non-valid data which may appear during the optimization; consequently GAs work in the most general cases. A parallelized framework has been developed to reduce the time spent on optimizations and the evaluation of large samples. A test has shown good scaling efficiency of this parallelized framework. This coupling method has been applied to the case of SYCOMORE (SYstem COde for MOdeling tokamak REactor), a system code developed in the form of a modular workflow for designing magnetic fusion reactors. The coupling of SYCOMORE with the optimization platform URANIE enables design optimization along various figures of merit and constraints.
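As a self-contained sketch of the optimization side, the toy GA below maximizes a figure of merit while tolerating non-valid design points, the property highlighted above. It is an invented example, not the URANIE/SYCOMORE coupling itself:

```python
import random

def figure_of_merit(x):
    """Toy objective with an invalid region, which a GA must tolerate."""
    if x < 0 or x > 10:
        return float("-inf")      # non-valid design point, ranked worst
    return -(x - 3.0) ** 2        # single criterion, maximum at x = 3

def genetic_search(generations=80, pop_size=30, seed=0):
    rng = random.Random(seed)
    pop = [rng.uniform(-5, 15) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=figure_of_merit, reverse=True)
        parents = pop[: pop_size // 2]            # elitist selection
        children = [(rng.choice(parents) + rng.choice(parents)) / 2
                    + rng.gauss(0, 0.5)           # crossover + mutation
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=figure_of_merit)

best = genetic_search()
print(f"best x = {best:.2f}")  # converges close to the optimum x = 3
```

In the coupled setting, evaluating `figure_of_merit` would mean running a full workflow execution and returning its results to the optimizer over the socket-based channel, which is also why a parallelized framework pays off.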

  3. Supporting exploration and collaboration in scientific workflow systems

    Science.gov (United States)

    Marini, L.; Kooper, R.; Bajcsy, P.; Myers, J.

    2007-12-01

    As the amount of observation data captured every day increases, running scientific workflows will soon become a fundamental step of scientific inquiry. Current scientific workflow systems offer ways to link together data, software and computational resources, but often accomplish this by requiring a deep understanding of the system, with a steep learning curve. Thus, there is a need to lower user adoption barriers for workflow systems and improve the plug-and-play functionality of these systems. We created a system that allows the user to easily create and share workflows, data and algorithms. Our goal of lowering user adoption barriers is to support discoveries and to provide means for conducting research more efficiently. Current paradigms for workflow creation focus on visual programming using a graph-based metaphor. This can be a powerful metaphor in the hands of expert users, but can become daunting when graphs become large, the steps in the graph include engineering-level steps such as loading and visualizing data, and the users are not very familiar with all the possible tools available. We present a different method of workflow creation that co-exists with the standard graph-based editors. The method builds on an exploratory interface using a macro-recording style, and focuses on the data being analyzed during the step-by-step creation of the workflow. Instead of storing data in system-specific data structures, the use of more flexible open standards that are platform independent would create systems that are easier to extend and that provide a simple interface for external applications to query and analyze the data and metadata produced. We have explored and implemented a system that stores workflows and related metadata using the Resource Description Framework (RDF) metadata model and that is built on top of the Tupelo data and metadata archiving system.
The scientific workflow system connects to shared content repositories, where users can easily share

  4. Optimization of data-intensive workflows: concepts and realization of a heuristic, rule-based optimizer

    OpenAIRE

    Vrhovnik, Marko

    2011-01-01

    To simplify the modeling of data-intensive workflows that process large relational data sets, leading vendors of workflow and database management systems have extended workflow description languages such as BPEL with SQL functionality. As a result, data-processing operations such as SQL statements or calls to user-defined procedures no longer need to be wrapped in web services, but can be defined directly at the workflow level. This results in...

  5. Organizational flexibility through workflow management systems?

    OpenAIRE

    Kirn, Stefan

    2008-01-01

    The use of workflow management systems is generally associated with an improvement in organizational flexibility. This is of essential importance when, as in the service sector, customers are to be offered tailor-made products. Starting from the theoretical foundations of the flexibility of process-oriented organizations, this contribution uses empirical data to examine the flexibility-relevant properties of workflow management systems. These depend...

  6. Workflow management systems, their security and access control mechanisms

    OpenAIRE

    Chehrazi, Golriz

    2007-01-01

    This paper gives an overview of workflow management systems (WfMSs) and their security requirements, with a focus on access mechanisms. It is a descriptive paper in which we examine the state of the art of workflow systems, describe what security risks affect WfMSs in particular, and how these can be diminished. WfMSs manage, illustrate and support business processes. They contribute to the performance, automation and optimization of processes, which is important in the global economy today. ...

  7. SURVEY OF WORKFLOW ANALYSIS IN PAST AND PRESENT ISSUES

    OpenAIRE

    Saravanan, M.S.; Rama Sree, R.J.

    2011-01-01

    This paper surveys the workflow analysis in the view of business process for all organizations. The business can be defined as an organization that provides goods and services to others, who want or need them. The concept of managing business processes is referred to as Business Process Management (BPM). A workflow is the automation of a business process, in whole or part, during which documents, information or tasks are passed from one participant to another for action, according to a set of...

  8. Optimization of tomographic reconstruction workflows on geographically distributed resources.

    Science.gov (United States)

    Bicer, Tekin; Gürsoy, Doğa; Kettimuthu, Rajkumar; De Carlo, Francesco; Foster, Ian T

    2016-07-01

    New technological advancements in synchrotron light sources enable data acquisition at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, the focus is on time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources. Two main challenges are considered: (i) modeling the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can
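The three-stage performance model described, transfer, queue wait and computation, amounts to a simple additive estimate. The sketch below uses invented numbers purely for illustration; the real models in the paper are fitted to measured resource behavior:

```python
def estimate_time(data_gb, bandwidth_gbps, queue_wait_s,
                  voxels, rate_voxels_per_s, iterations):
    """Additive model: data transfer + queue wait + iterative computation."""
    transfer_s = data_gb * 8 / bandwidth_gbps
    compute_s = iterations * voxels / rate_voxels_per_s
    return transfer_s + queue_wait_s + compute_s

# 50 GB over a 10 Gb/s link, 120 s queue wait, 5 iterations over 2e9 voxels
# processed at 1e8 voxels/s -- all numbers invented for illustration.
print(estimate_time(50, 10, 120, 2e9, 1e8, 5))  # 260.0 seconds
```

Evaluating such an estimate per candidate site is what lets the workflow management system pick the resource with the lowest predicted end-to-end time.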

  9. Scheduling Computational Workflows on Failure-Prone Platforms

    OpenAIRE

    Aupy, Guillaume; Benoit, Anne; Casanova, Henri; Robert, Yves

    2015-01-01

    We study the scheduling of computational workflows on compute resources that experience exponentially distributed failures. When a failure occurs, roll-back and recovery is used to resume the execution from the last checkpointed state. The scheduling problem is to minimize the expected execution time by deciding in which order to execute the tasks in the workflow and whether or not to checkpoint a task after it completes. We give a polynomial-time algorithm for fork graphs and show...
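    The checkpoint-or-not decision can be illustrated for a linear chain of tasks. The failure/recovery formula below is a standard first-order model from the checkpointing literature, assumed here rather than taken from the paper, and the brute-force search stands in for the paper's polynomial-time algorithm on fork graphs:

```python
import math
from itertools import product

def expected_time(work, lam, recovery):
    """Expected time to finish `work` units under Poisson failures of rate
    `lam`, restarting from the last checkpoint after a `recovery` delay.
    Uses the assumed textbook model E[T] = e^(lam*R) * (e^(lam*w) - 1) / lam."""
    if lam == 0:
        return work
    return math.exp(lam * recovery) * (math.exp(lam * work) - 1.0) / lam

def chain_makespan(tasks, decisions, lam, ckpt_cost, recovery):
    """Expected makespan of a linear task chain given per-task checkpoint
    decisions; work accumulates until the next checkpoint."""
    total, segment = 0.0, 0.0
    for work, checkpoint in zip(tasks, decisions):
        segment += work
        if checkpoint:
            total += expected_time(segment + ckpt_cost, lam, recovery)
            segment = 0.0
    if segment > 0:
        total += expected_time(segment, lam, recovery)
    return total

def best_plan(tasks, lam, ckpt_cost, recovery):
    """Brute-force the 2^n checkpoint decisions (fine for small chains)."""
    return min(product([False, True], repeat=len(tasks)),
               key=lambda d: chain_makespan(tasks, d, lam, ckpt_cost, recovery))
```

    Because re-executed work grows exponentially in segment length under this model, frequent checkpointing wins when the failure rate is high, while the final task need not be checkpointed.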

  10. INTRODUCTION OF WINDOWS WORKFLOW FOUNDATION INTO EXISTING APPLICATIONS

    OpenAIRE

    Kržič, Jernej

    2008-01-01

    In this dissertation we discuss the process of implementing the Windows Workflow Foundation model in existing applications. We present several possible approaches to solving this problem and their advantages, as well as limitations. First, we describe the main concepts behind Business Process Management and the Windows Workflow Foundation programming model. Other related technologies are also described, including the .NET Framework, Windows Communication Foundation, ASP.NET, etc. The practical p...

  11. Process Makna - A Semantic Wiki for Scientific Workflows

    OpenAIRE

    Paschke, Adrian; Zhao, Zhili

    2010-01-01

    Virtual e-Science infrastructures supporting Web-based scientific workflows are an example for knowledge-intensive collaborative and weakly-structured processes where the interaction with the human scientists during process execution plays a central role. In this paper we propose the lightweight dynamic user-friendly interaction with humans during execution of scientific workflows via the low-barrier approach of Semantic Wikis as an intuitive interface for non-technical scientists. Our Proces...

  12. Spheres of isolation: adaptation of isolation levels to transactional workflow

    OpenAIRE

    Guabtni, Adnene; Charoy, François; Godart, Claude

    2005-01-01

    In Workflow Management Systems (WFMSs), transaction isolation is managed most of the time by the underlying database system using ANSI SQL strategies. These strategies do not take sufficiently into account process aspects. Our work consists in studying with more depth the relation between isolation strategy and process dimension as well as the real isolation needs in workflow environments. To carry out these needs, we define `spheres of isolation' inspired from `spheres of control' proposed b...

  13. A Taxonomy of Workflow Management Systems for Grid Computing

    OpenAIRE

    Yu, Jia; Buyya, Rajkumar

    2005-01-01

    With the advent of Grid and application technologies, scientists and engineers are building more and more complex applications to manage and process large data sets, and execute scientific experiments on distributed resources. Such application scenarios require means for composing and executing complex workflows. Therefore, many efforts have been made towards the development of workflow management systems for Grid computing. In this paper, we propose a taxonomy that characterizes and classifi...

  14. Workflow-based semantics for peer-to-peer specifications

    Institute of Scientific and Technical Information of China (English)

    Antonio BROGI; Razvan POPESCU

    2008-01-01

    In this paper we introduce SMoL, a simplified BPEL-like language for specifying peer and service behaviour in P2P systems. We then define a transformational semantics of SMoL in terms of Yet Another Workflow Language (YAWL) workflows, which enables the simulation (e.g., testing possible execution scenarios) and analysis (e.g., verifying reachability or lock freedom) of the behaviour of P2P peers and services.

  15. How Workflow Systems Facilitate Business Process Reengineering and Improvement

    OpenAIRE

    Mohamed El Khadiri; Abdelaziz El Fazziki

    2012-01-01

    This paper investigates the relationship between workflow systems and business process reengineering and improvement. The study is based on a real case study at the "Centre Régional d'Investissement" (CRI) of Marrakech, Morocco. The CRI is entrusted to coordinate various investment projects at the regional level. Our previous work has shown that a workflow system can be a basis for business process reengineering. However, for continuous process improvement, the system has proved to be insufficient...

  16. BReW: Blackbox Resource Selection for e-Science Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh [Univ. of Southern California, Los Angeles, CA (United States); Soroush, Emad [Univ. of Washington, Seattle, WA (United States); Van Ingen, Catharine [Microsoft Research, San Francisco, CA (United States); Agarwal, Deb [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ramakrishnan, Lavanya [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2010-10-04

    Workflows are commonly used to model data-intensive scientific analysis. As computational resource needs increase for eScience, emerging platforms like clouds present additional resource choices for scientists and policy makers. We introduce BReW, a tool that enables users to make rapid, high-level platform selections for their workflows using limited workflow knowledge. This helps make informed decisions on whether to port a workflow to a new platform. Our analysis of synthetic and real eScience workflows shows that, using just total runtime length, maximum task fanout, and total data used and produced by the workflow, BReW can provide platform predictions comparable to whitebox models with detailed workflow knowledge.
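    A blackbox prediction over the three coarse features named above might be sketched as nearest-neighbour lookup over previously observed workflows. The toy history and squared-distance metric are illustrative assumptions, not BReW's actual model:

```python
# Blackbox sketch: predict a workflow's runtime on a target platform from
# only three coarse features (total runtime length, maximum task fanout,
# total data used/produced) via nearest neighbour over observed workflows.
# The history data and distance metric are illustrative assumptions.

def predict_runtime(history, features):
    """history: list of (features, observed_runtime) pairs;
    features: (total_runtime, max_fanout, total_data)."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(history, key=lambda h: sq_dist(h[0], features))[1]
```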

  17. CO2 Storage Feasibility: A Workflow for Site Characterisation

    Directory of Open Access Journals (Sweden)

    Nepveu Manuel

    2015-04-01

    Full Text Available In this paper, we present an overview of the SiteChar workflow model for site characterisation and assessment for CO2 storage. Site characterisation and assessment is required when permits are requested from the legal authorities in the process of starting a CO2 storage process at a given site. The goal is to assess whether a proposed CO2 storage site can indeed be used for permanent storage while meeting the safety requirements demanded by the European Commission (EC) Storage Directive (Directive 2009/31/EC). Many issues have to be scrutinised, and the workflow presented here is put forward to help efficiently organise this complex task. Three issues are highlighted: communication within the working team and with the authorities; interdependencies in the workflow and feedback loops; and the risk-based character of the workflow. A general overview (helicopter view) of the workflow is given; the issues involved in communication and the risk assessment process are described in more detail. The workflow as described has been tested within the SiteChar project on five potential storage sites throughout Europe. This resulted in a list of key aspects of site characterisation which can help prepare and focus new site characterisation studies.

  18. Collaboration Policies: Access Control Management in Decentralized Heterogeneous Workflows

    Directory of Open Access Journals (Sweden)

    Mine Altunay

    2006-07-01

    Full Text Available Service-oriented computing promotes collaboration by defining the standards layer that allows compatibility between disparate domains. Workflows, by taking advantage of the service-oriented framework, provide the necessary tools to harness services in order to tackle complicated problems. As a result, a service is no longer exposed to a small pre-determined homogeneous pool of users; instead it has a large, undefined, and heterogeneous pool of users. This paradigm shift in computing results in increased service exposure. The interactions among the services of a workflow must be carefully evaluated against the security risks associated with them. Classical security problems, such as delegation of rights, conflict of interest, and access control in general, become more complicated due to multiple autonomous security domains and the absence of pre-established trust relationships among the domains. Our work tackles these problems in two aspects: it provides a service owner with the necessary means to express and evaluate its trust requirements from a workflow (collaboration policies), and it incorporates these trust requirements into the workflow-planning framework (workflow authorization framework). Our policy-based framework allows bilateral peer-level trust evaluations that are based on each peer's collaboration policies, and incorporates the outcome of these evaluations into the workflow planning logic. As a result, our work provides the necessary tools for promoting multi-party ad-hoc collaborations, and aims to reduce the reluctance and hesitation towards these collaborations by attacking the security risks associated with them.
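    The bilateral peer-level evaluation described above can be sketched minimally: each peer publishes attributes and a collaboration policy over its partners' attributes, and a pair may collaborate only if each side's policy accepts the other. The attribute names and the equality-only policy shape are illustrative assumptions:

```python
# Bilateral trust evaluation sketch: both peers' collaboration policies
# must accept the other side before the planner admits the pairing.
# Attribute names and equality-only policies are illustrative assumptions.

def accepts(policy, partner_attrs):
    """A policy here is a dict of required attribute values."""
    return all(partner_attrs.get(k) == v for k, v in policy.items())

def can_collaborate(peer_a, peer_b):
    """Bilateral check: both evaluations must succeed."""
    return (accepts(peer_a["policy"], peer_b["attrs"])
            and accepts(peer_b["policy"], peer_a["attrs"]))
```

    A workflow planner would call `can_collaborate` for each candidate service pairing and prune plans that fail either side's policy.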

  19. A scientific workflow framework for (13)C metabolic flux analysis.

    Science.gov (United States)

    Dalman, Tolga; Wiechert, Wolfgang; Nöh, Katharina

    2016-08-20

    Metabolic flux analysis (MFA) with (13)C labeling data is a high-precision technique to quantify intracellular reaction rates (fluxes). One of the major challenges of (13)C MFA is the interactivity of the computational workflow according to which the fluxes are determined from the input data (metabolic network model, labeling data, and physiological rates). Here, the workflow assembly is inevitably determined by the scientist who has to consider interacting biological, experimental, and computational aspects. Decision-making is context dependent and requires expertise, rendering an automated evaluation process hardly possible. Here, we present a scientific workflow framework (SWF) for creating, executing, and controlling (13)C MFA workflows on demand. (13)C MFA-specific tools and libraries, such as the high-performance simulation toolbox 13CFLUX2, are wrapped as web services and thereby integrated into a service-oriented architecture. Besides workflow steering, the SWF features transparent provenance collection and enables full flexibility for ad hoc scripting solutions. To handle compute-intensive tasks, cloud computing is supported. We demonstrate how the challenges posed by (13)C MFA workflows can be solved with our approach on the basis of two proof-of-concept use cases. PMID:26721184

  20. DNA qualification workflow for next generation sequencing of histopathological samples.

    Directory of Open Access Journals (Sweden)

    Michele Simbolo

    Full Text Available Histopathological samples are a treasure-trove of DNA for clinical research. However, the quality of DNA can vary depending on the source or extraction method applied. Thus a standardized and cost-effective workflow for the qualification of DNA preparations is essential to guarantee interlaboratory reproducible results. The qualification process consists of the quantification of double strand DNA (dsDNA) and the assessment of its suitability for downstream applications, such as high-throughput next-generation sequencing. We tested the two most frequently used instrumentations to define their role in this process: NanoDrop, based on UV spectroscopy, and Qubit 2.0, which uses fluorochromes specifically binding dsDNA. Quantitative PCR (qPCR) was used as the reference technique as it simultaneously assesses DNA concentration and suitability for PCR amplification. We used 17 genomic DNAs from 6 fresh-frozen (FF) tissues, 6 formalin-fixed paraffin-embedded (FFPE) tissues, 3 cell lines, and 2 commercial preparations. Intra- and inter-operator variability was negligible, and intra-methodology variability was minimal, while consistent inter-methodology divergences were observed. In fact, NanoDrop measured DNA concentrations higher than Qubit and its consistency with dsDNA quantification by qPCR was limited to high molecular weight DNA from FF samples and cell lines, where total DNA and dsDNA quantity virtually coincide. In partially degraded DNA from FFPE samples, only Qubit proved highly reproducible and consistent with qPCR measurements. Multiplex PCR amplifying 191 regions of 46 cancer-related genes was designated the downstream application, using 40 ng dsDNA from FFPE samples calculated by Qubit. All but one sample produced amplicon libraries suitable for next-generation sequencing. NanoDrop UV-spectrum verified contamination of the unsuccessful sample. In conclusion, as qPCR has high costs and is labor intensive, an alternative effective standard

  1. The CESM Workflow Re-Engineering Project

    Science.gov (United States)

    Strand, G.

    2015-12-01

    The Community Earth System Model (CESM) Workflow Re-Engineering Project is a collaborative project between the CESM Software Engineering Group (CSEG) and the NCAR Computation and Information Systems Lab (CISL) Application Scalability and Performance (ASAP) Group to revamp how CESM saves its output. The CMIP3 and particularly CMIP5 experiences in submitting CESM data to those intercomparison projects revealed that the output format of the CESM is not well-suited for the data requirements common to model intercomparison projects. CESM, for efficiency reasons, creates output files containing all fields for each model time sampling, but MIPs require individual files for each field comprising all model time samples. This transposition of model output can be very time-consuming; depending on the volume of data written by the specific simulation, the time to re-orient the data can be comparable to the time required for the simulation to complete. Previous strategies included using serial tools to perform this transposition, but they are now far too inefficient to deal with the many terabytes of output a single simulation can generate. A new set of Python tools, using data parallelism, has been written to enable this re-orientation, and has achieved markedly improved I/O performance. The perspective of a data manager/data producer in the use of these new tools is presented, and likely future work on their development and use will be shown. These tools are a critical part of the NCAR CESM submission to the upcoming CMIP6, with the intention that a much more timely and efficient submission of the expected petabytes of data will be accomplished in the given time frame.

  2. Inverse IMRT workflow process at Austin health

    International Nuclear Information System (INIS)

    Full text: The work presented here will review the strategies adopted at Austin Health to bring IMRT into clinical use. IMRT is delivered using step-and-shoot mode on an Elekta Precise machine with 40 pairs of 1 cm wide MLC leaves. Planning is done using CMS Focus/XiO. A collaborative approach for ROs, physicists and RTs from concept to implementation was adopted. An overview will be given of the workflow for the clinic, the equipment used, tolerance levels and the lessons learned.
    1. Strategic planning for IMRT
    2. Training: a. MSKCC (New York) b. ESTRO (Amsterdam) c. Elekta (US and UK)
    3. Linac testing and data acquisition: a. Equipment and software review and selection b. Linac reliability/geometric and mechanical checks c. Draft patient QA procedure d. EPI image matching checks and procedures
    4. Planning system checks: a. Export of dose matrix (options) b. Dose calculation choices
    5. IMRT research initiatives: a. IMRT planning studies, stabilisation, on-line imaging
    6. Equipment procurement and testing: a. Physics and linac equipment, hardware, software/licences, stabilisation
    7. Establishing a DICOM environment: a. Prescription sending, image transfer for EPI checks b. QA files
    8. Physics QA (pre-treatment): a. Clinical plan review; DVH checks b. Geometry; dosimetry checks; DICOM checks c. 2D distance-to-agreement; mm difference reports; gamma function index
    9. Documentation: a. Protocol development (ICRU 50/62 reporting and prescribing) b. QA for physics c. QA for RTs d. Generation of a report for RO/patient history.
    Copyright (2004) Australasian College of Physical Scientists and Engineers in Medicine

  3. Anima: Modular workflow system for comprehensive image data analysis

    Directory of Open Access Journals (Sweden)

    Ville eRantanen

    2014-07-01

    Full Text Available Modern microscopes produce vast amounts of image data, and computational methods are needed to analyze and interpret these data. Furthermore, a single image analysis project may require tens or hundreds of analysis steps, starting from data import and preprocessing, through segmentation and statistical analysis, and ending with visualization and reporting. To manage such large-scale image data analysis projects, we present here a modular workflow system called Anima. Anima is designed for comprehensive and efficient image data analysis, and it contains several features that are crucial in high-throughput image data analysis: programming language independence, batch processing, easily customized data processing, interoperability with other software via application programming interfaces, and advanced multivariate statistical analysis. The utility of Anima is shown with two case studies: testing different algorithms developed on different imaging platforms, and an automated prediction of alive/dead C. elegans worms by integrating several analysis environments. Anima is fully open source and available with documentation at http://www.anduril.org/anima

  4. Using Make for Reproducible and Parallel Neuroimaging Workflow and Quality-Assurance.

    Science.gov (United States)

    Askren, Mary K; McAllister-Day, Trevor K; Koh, Natalie; Mestre, Zoé; Dines, Jennifer N; Korman, Benjamin A; Melhorn, Susan J; Peterson, Daniel J; Peverill, Matthew; Qin, Xiaoyan; Rane, Swati D; Reilly, Melissa A; Reiter, Maya A; Sambrook, Kelly A; Woelfer, Karl A; Grabowski, Thomas J; Madhyastha, Tara M

    2016-01-01

    The contribution of this paper is to describe how we can program neuroimaging workflows using Make, a software development tool designed for describing how to build executables from source files. A makefile (or a file of instructions for Make) consists of a set of rules that create or update target files if they have not been modified since their dependencies were last modified. These rules are processed to create a directed acyclic dependency graph that allows multiple entry points from which to execute the workflow. We show that using Make we can achieve many of the features of more sophisticated neuroimaging pipeline systems, including reproducibility, parallelization, fault tolerance, and quality assurance reports. We suggest that Make permits a large step toward these features with only a modest increase in programming demands over shell scripts. This approach reduces the technical skill and time required to write, debug, and maintain neuroimaging workflows in a dynamic environment, where pipelines are often modified to accommodate new best practices or to study the effect of alternative preprocessing steps, and where the underlying packages change frequently. This paper has a comprehensive accompanying manual with lab practicals and examples (see Supplemental Materials) and all data, scripts, and makefiles necessary to run the practicals and examples are available in the "makepipelines" project at NITRC. PMID:26869916
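    The rule semantics the abstract relies on (rebuild a target when it is missing or older than any dependency, after first building the dependencies) can be sketched in a few lines. Logical timestamps stand in for file mtimes so the sketch is self-contained; real Make also parallelizes independent branches of the dependency graph (make -j):

```python
# Minimal sketch of Make's core rebuild rule: a target is rebuilt when it
# is missing or older than any dependency, dependencies first (depth-first
# over the directed acyclic dependency graph). Logical timestamps replace
# file mtimes; names and the pipeline below are illustrative.

class MiniMake:
    def __init__(self):
        self.rules = {}   # target -> (dependencies, recipe)
        self.mtime = {}   # name -> logical timestamp
        self.clock = 0

    def rule(self, target, deps, recipe):
        self.rules[target] = (deps, recipe)

    def touch(self, name):
        self.clock += 1
        self.mtime[name] = self.clock

    def build(self, target, log):
        deps, recipe = self.rules.get(target, ([], None))
        for dep in deps:
            self.build(dep, log)   # build prerequisites first
        stale = (target not in self.mtime
                 or any(self.mtime[d] > self.mtime[target] for d in deps))
        if stale and recipe is not None:
            recipe()               # run the rule's commands
            log.append(target)
            self.touch(target)
```

    For a hypothetical two-step pipeline raw → skullstrip → register, a second build runs nothing, while touching the raw input reruns only the downstream steps, which is exactly the incremental behavior the paper exploits for neuroimaging QA.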

  5. INTEGRATED WORKFLOW-BASED SYSTEM FOR RISK MAPPING OF OIL SPILLS WITH USING HIGH PERFORMANCE CLUSTER

    OpenAIRE

    Kairat A. Bostanbekov; Jalal K. Jamalov; Dmitriy K. Kim; Daniyar B. Nurseitov; Ilyas E. Tursunov; Edige A. Zakarin; David L. Zaurbekov

    2013-01-01

    In this paper we present an integrated workflow-based system for risk mapping of oil spills using a high-performance cluster. The design and integration methodology of the system are based on a service-oriented architecture that allows easy, flexible integration of any service into any desktop or mobile client. We have designed and built a 4-tier SOA on the basis of the W3C Web service standard. The main objective of the system is to predict risk of a negative impact on a biota ...

  6. Visual Workflows for Oil and Gas Exploration

    KAUST Repository

    Hollt, Thomas

    2013-04-14

    The most important resources to fulfill today's energy demands are fossil fuels, such as oil and natural gas. When exploiting hydrocarbon reservoirs, a detailed and credible model of the subsurface structures is crucial for planning the path of the borehole, in order to minimize economic and ecological risks. Before that, the placement as well as the operations of oil rigs need to be planned carefully, as off-shore oil exploration is vulnerable to hazards caused by strong currents. The oil and gas industry therefore relies on accurate ocean forecasting systems for planning their operations. This thesis presents visual workflows for creating subsurface models as well as planning the placement and operations of off-shore structures. Creating a credible subsurface model poses two major challenges: first, the structures in highly ambiguous seismic data are interpreted in the time domain; second, a velocity model has to be built from this interpretation to match the model to depth measurements from wells. If it is not possible to obtain a match at all positions, the interpretation has to be updated, going back to the first step. This results in a lengthy back and forth between the different steps, or, in many cases, in an unphysical velocity model. We present a novel, integrated approach to interactively creating subsurface models from reflection seismics, by combining the interpretation of the seismic data, using an interactive horizon extraction technique based on piecewise global optimization, with velocity modeling. Computing and visualizing the effects of changes to the interpretation and velocity model on the depth-converted model on the fly creates an integrated feedback loop that enables a completely new connection between the seismic data in the time domain and well data in the depth domain. For planning the operations of off-shore structures we present a novel integrated visualization system that enables interactive visual analysis of ensemble simulations used in ocean

  7. Provenance-based refresh in data-oriented workflows

    KAUST Repository

    Ikeda, Robert

    2011-01-01

    We consider a general workflow setting in which input data sets are processed by a graph of transformations to produce output results. Our goal is to perform efficient selective refresh of elements in the output data, i.e., compute the latest values of specific output elements when the input data may have changed. We explore how data provenance can be used to enable efficient refresh. Our approach is based on capturing one-level data provenance at each transformation when the workflow is run initially. Then, at refresh time, provenance is used to determine (transitively) which input elements are responsible for given output elements, and the workflow is rerun only on that portion of the data needed for refresh. Our contributions are to formalize the problem setting and the problem itself, to specify properties of transformations and provenance that are required for efficient refresh, and to provide algorithms that apply to a wide class of transformations and workflows. We have built a prototype system supporting the features and algorithms presented in the paper. We report preliminary experimental results on the overhead of provenance capture, and on the crossover point between selective refresh and full workflow recomputation. © 2011 ACM.
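    The capture-then-refresh idea can be sketched for the simplest case, a map-style transformation where each output element derives from exactly one input element. Function and key names are illustrative; the paper's framework covers a much wider class of transformations and multi-step workflows:

```python
# One-level provenance sketch: the initial run records, per output element,
# the input keys it was derived from; refresh recomputes only outputs whose
# recorded sources intersect the changed inputs. One-to-one map case only.

def run_transform(inputs, fn):
    """Initial run: produce outputs and their one-level provenance."""
    outputs, provenance = {}, {}
    for key, value in inputs.items():
        out_key = "out:" + key
        outputs[out_key] = fn(value)
        provenance[out_key] = {key}
    return outputs, provenance

def refresh(outputs, provenance, inputs, fn, changed_keys):
    """Selective refresh: rerun the transformation only where needed."""
    recomputed = []
    for out_key, sources in provenance.items():
        if sources & changed_keys:
            (src,) = sources          # one-to-one in this sketch
            outputs[out_key] = fn(inputs[src])
            recomputed.append(out_key)
    return recomputed
```

    The crossover the abstract measures corresponds here to the point where so many inputs changed that rerunning everything is cheaper than consulting provenance.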

  8. Distributing Workflows over a Ubiquitous P2P Network

    Directory of Open Access Journals (Sweden)

    Eddie Al-Shakarchi

    2007-01-01

    Full Text Available This paper discusses issues in the distribution of bundled workflows across ubiquitous peer-to-peer networks for the application of music information retrieval. The underlying motivation for this work is provided by the DART project, which aims to develop a novel music recommendation system by gathering statistical data using collaborative filtering techniques and the analysis of the audio itself, in order to create a reliable and comprehensive database of the music that people own and which they listen to. To achieve this, the DART scientists creating the algorithms need the ability to distribute the Triana workflows they create, representing the analysis to be performed, across the network on a regular basis (perhaps even daily) in order to update the network as a whole with new workflows to be executed for the analysis. DART uses a similar approach to BOINC but differs in that the workers receive input data in the form of a bundled Triana workflow, which is executed in order to process any MP3 files that they own on their machine. Once analysed, the results are returned to DART's distributed database that collects and aggregates the resulting information. DART employs the use of package repositories to decentralise the distribution of such workflow bundles and this approach is validated in this paper through simulations that show that suitable scalability is maintained through the system as the number of participants increases. The results clearly illustrate the effectiveness of the approach.

  9. Development of the workflow kine systems for support on KAIZEN.

    Science.gov (United States)

    Mizuno, Yuki; Ito, Toshihiko; Yoshikawa, Toru; Yomogida, Satoshi; Morio, Koji; Sakai, Kazuhiro

    2012-01-01

    In this paper, we introduce a new workflow line system consisting of location and image recording, which enables the acquisition of workflow information and its analysis and display. From the results of the workflow line investigation, we considered the anticipated effects and the problems for KAIZEN. Workflow line information includes location information and action content information. These technologies suggest viewpoints to help improvement, for example, elimination of useless movement, the redesign of layout and the review of work procedures. In the manufacturing factory, it was clear that there was much movement away from the standard operation place and accumulated residence time. As a concrete result of this investigation, an efficient layout was suggested by this system. In the case of the hospital, similarly, it was pointed out that the workflow has problems of layout and setup operations, based on the effective movement patterns of the experts. This system can adapt to routine as well as non-routine work. Through the development of this system, which can fit and adapt to industrial diversification, more effective "visual management" (visualization of work) is expected in the future. PMID:22317594

  10. Trends in Use of Scientific Workflows: Insights from a Public Repository and Recommendations for Best Practice

    Directory of Open Access Journals (Sweden)

    Richard Littauer

    2012-12-01

    Full Text Available Scientific workflows are typically used to automate the processing, analysis and management of scientific data. Most scientific workflow programs provide a user-friendly graphical user interface that enables scientists to more easily create and visualize complex workflows that may be comprised of dozens of processing and analytical steps. Furthermore, many workflows provide mechanisms for tracing provenance and methodologies that foster reproducible science. Despite their potential for enabling science, few studies have examined how the process of creating, executing, and sharing workflows can be improved. In order to promote open discourse and access to scientific methods as well as data, we analyzed a wide variety of workflow systems and publicly available workflows on the public repository myExperiment. It is hoped that understanding the usage of workflows and developing a set of recommended best practices will lead to increased contribution of workflows to the public domain.

  11. Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows

    Science.gov (United States)

    Wilson, B. D.; Ramachandran, R.; Lynnes, C.

    2009-05-01

    A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community an unprecedented opportunity to collaborate by sharing their workflows with one another, reproducing and analyzing research results, and leveraging colleagues' expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions have come together to improve community collaboration in science analysis by developing a customizable "software appliance" to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like Finnish "talkoot" (a barn raising), to build a shared research space. Talkoot extends a freely available, open source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a "science story" in shareable web notebooks, including plots or animations, backed up by an executable workflow that directly reproduces the science analysis. New services and workflows of interest will be

  12. Hybrid Workflow Policy Management for Heart Disease Identification

    CERN Document Server

    Kim, Dong-Hyun; Youn, Chan-Hyun

    2010-01-01

    As science and technology grow, medical applications are becoming more complex in solving physiological problems within the expected time. Workflow management systems (WMS) in Grid computing are a promising solution to sophisticated problems such as genomic analysis, drug discovery, disease identification, etc. Although existing WMS can provide basic management functionality in a Grid environment, consideration of user requirements such as performance, reliability and interaction with the user is missing. In this paper, we propose a hybrid workflow management system for heart disease identification and discuss how to guarantee different user requirements according to the user SLA. The proposed system is applied to the Physio-Grid e-health platform to identify human heart disease with ECG analysis and Virtual Heart Simulation (VHS) workflow applications.

  13. A Hybrid Authorization Model For Project-Oriented Workflow

    Institute of Scientific and Technical Information of China (English)

    Zhang Xiaoguang(张晓光); Cao Jian; Zhang Shensheng

    2003-01-01

    In the context of workflow systems, a security-relevant aspect is the assignment of activities to (human or automated) agents. This paper intends to cast light on the management of project-oriented workflow. A comprehensive authorization model is proposed from the perspective of project management. In this model, the concepts of activity decomposition and team are introduced, which improve the security of conventional role-based access control. Furthermore, policies are provided to define static and dynamic constraints such as Separation of Duty (SoD). Validity of constraints is proposed to provide fine-grained assignment, which improves the performance of policy management. The model is applicable not only to project-oriented workflow applications but also to other teamwork environments such as virtual enterprises.
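    A static Separation-of-Duty constraint of the kind mentioned above can be sketched as a simple validity check over an activity-to-agent assignment. The activity and agent names are illustrative, not from the paper:

```python
# Static Separation-of-Duty (SoD) sketch: an assignment of workflow
# activities to agents is valid only if no single agent holds both
# activities of any declared mutually-exclusive pair.

def satisfies_sod(assignment, sod_pairs):
    """assignment: activity -> agent; sod_pairs: iterable of activity pairs."""
    for act_a, act_b in sod_pairs:
        agent = assignment.get(act_a)
        if agent is not None and agent == assignment.get(act_b):
            return False
    return True
```

    A dynamic SoD constraint would be checked at run time against the execution history rather than against the static assignment, but the validity test has the same shape.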

  14. Hybrid Workflow Policy Management for Heart Disease Identification

    Directory of Open Access Journals (Sweden)

    Dong-Hyun Kim

    2009-12-01

    Full Text Available As science and technology advance, medical applications are becoming more complex and must solve physiological problems within an expected time. Workflow management systems (WMS) in Grid computing are a promising solution to sophisticated problems such as genomic analysis, drug discovery and disease identification. Although existing WMS provide basic management functionality in Grid environments, consideration of user requirements such as performance, reliability and interaction with the user is missing. In this paper, we propose a hybrid workflow management system for heart disease identification and discuss how to guarantee different user requirements according to the user SLA. The proposed system is applied to the Physio-Grid e-health platform to identify human heart disease with ECG analysis and Virtual Heart Simulation (VHS) workflow applications.

  15. Implementation of Workflow Management System for Collaborative Process Planning

    Directory of Open Access Journals (Sweden)

    Su Ying-Ying

    2013-01-01

    Full Text Available Workflow management systems have generally been accepted as a paradigm for supporting processes in complex organizations. Since process planning is a large and complex task, several process planners should execute the planning together. Collaborative process planning is thus essential for saving time and cost through concurrent and collaborative engineering. Workflow technology, as an important branch of computer-supported cooperative work, has strong advantages in organization management and flow optimization. In this research, the structure and business flow of collaborative process planning are analyzed. The functions of a workflow management system for collaborative process planning are illustrated, and the system is implemented to effectively control and manage the flow of process planning.

  16. Dynamic Service Selection in Workflows Using Performance Data

    Directory of Open Access Journals (Sweden)

    David W. Walker

    2007-01-01

    Full Text Available An approach to dynamic workflow management and optimisation using near-realtime performance data is presented. Strategies are discussed for choosing an optimal service (based on user-specified criteria) from several semantically equivalent Web services. Such an approach may involve finding "similar" services by first pruning the set of discovered services based on service metadata, and subsequently selecting an optimal service based on performance data. The current implementation of the prototype workflow framework is described and demonstrated with a simple workflow. Performance results are presented that show the benefits of dynamic service selection. A statistical analysis based on the first-order statistic is used to investigate the likely improvement in service response time arising from dynamic service selection.
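
    The selection step described in this abstract can be illustrated with a minimal sketch: among semantically equivalent services that survived metadata pruning, pick the one with the lowest mean observed response time. The service names and latency samples below are purely illustrative, not data from the paper.

    ```python
    import statistics

    # Recent response-time samples (seconds) per candidate service.
    # Names and numbers are hypothetical.
    samples = {
        "svc-a": [1.8, 2.1, 1.9],
        "svc-b": [1.2, 1.4, 1.3],
        "svc-c": [2.5, 2.2, 2.8],
    }

    def select_service(perf):
        # Metadata-based pruning is assumed to have already happened;
        # rank the survivors by mean response time and take the minimum.
        return min(perf, key=lambda s: statistics.mean(perf[s]))

    print(select_service(samples))  # → svc-b
    ```

    A real implementation would refresh the samples from a monitoring service at workflow run time rather than using a static table.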

  17. A HYBRID PETRI-NET MODEL OF GRID WORKFLOW

    Institute of Scientific and Technical Information of China (English)

    Ji Yimu; Wang Ruchuan; Ren Xunyi

    2008-01-01

    In order to effectively control the random tasks submitted and executed in a grid workflow, a grid workflow model based on hybrid Petri nets is presented. This model is composed of stochastic Petri nets, colored Petri nets and general Petri nets. The stochastic Petri net describes the relationship between the number of grid users' random tasks and the size of the service window, and computes the service intensity of the grid system. The colored Petri net assigns different colors to places holding grid services and provides valid interfaces for grid resource allocation and task scheduling. The experiment indicated that the model presented in this letter can compute the threshold between the number of users' random tasks and the size of the grid service window in a grid workflow management system.
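
    The service-intensity quantity mentioned above resembles the utilization of a multi-server queue. As a minimal sketch, assuming an M/M/c-style model with arrival rate λ, per-server service rate μ, and service window size c (these symbols and the formula ρ = λ/(cμ) are our assumption, not notation from the paper):

    ```python
    # Illustrative only: treat the grid service window of size c as c
    # parallel servers; the service intensity is then the utilization
    # rho = arrival_rate / (window_size * service_rate).
    def service_intensity(arrival_rate, service_rate, window_size):
        return arrival_rate / (window_size * service_rate)

    # With 12 random tasks/min arriving, each server clearing 5 tasks/min,
    # a window of 3 servers runs at 80% intensity.
    rho = service_intensity(12.0, 5.0, 3)
    print(rho)  # → 0.8
    ```

    Keeping ρ below 1 is the usual stability condition, which is one way to read the paper's threshold between task count and window size.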

  18. ESO Reflex: A Graphical Workflow Engine for Astronomical Data Reduction

    Science.gov (United States)

    Hook, Richard; Romaniello, Martino; Ullgrén, Marko; Maisala, Sami; Solin, Otto; Oittinen, Tero; Savolainen, Villa; Järveläinen, Pekka; Tyynelä, Jani; Péron, Michèle; Izzo, Carlo; Ballester, Pascal; Gabasch, Armin

    2008-03-01

    ESO Reflex is a software tool that provides a novel approach to astronomical data reduction. The reduction sequence is rendered and controlled as a graphical workflow. Users can follow and interact with the processing in an intuitive manner, without the need for complex scripting. The graphical interface also allows the modification of existing workflows and the creation of new ones. ESO Reflex can invoke standard ESO data reduction recipes in a flexible way. Python scripts, IDL procedures and shell commands can also be easily brought into workflows and a variety of visualisation and display options, including custom product inspection and validation steps, are available. ESO Reflex was developed in the context of the Sampo project, a three-year effort led by ESO and conducted by a software development team from Finland as an in-kind contribution to joining ESO. It is planned that the software will be released to the community in late 2008.

  19. Scientific Workflow Systems for 21st Century e-Science, New Bottle or New Wine?

    CERN Document Server

    Zhao, Yong; Foster, Ian

    2008-01-01

    With the advances in e-Sciences and the growing complexity of scientific analyses, more and more scientists and researchers are relying on workflow systems for process coordination, derivation automation, provenance tracking, and bookkeeping. While workflow systems have been in use for decades, it is unclear whether scientific workflows can or even should build on existing workflow technologies, or whether they require fundamentally new approaches. In this paper, we analyze the status and challenges of scientific workflows, investigate both existing technologies and emerging languages, platforms and systems, and identify the key challenges that must be addressed by workflow systems for e-science in the 21st century.

  20. Science Gateways, Scientific Workflows and Open Community Software

    Science.gov (United States)

    Pierce, M. E.; Marru, S.

    2014-12-01

    Science gateways and scientific workflows occupy different ends of the spectrum of user-focused cyberinfrastructure. Gateways, sometimes called science portals, provide a way for enabling large numbers of users to take advantage of advanced computing resources (supercomputers, advanced storage systems, science clouds) by providing Web and desktop interfaces and supporting services. Scientific workflows, at the other end of the spectrum, support advanced usage of cyberinfrastructure that enables "power users" to undertake computational experiments that are not easily done through the usual mechanisms (managing simulations across multiple sites, for example). Despite these different target communities, gateways and workflows share many similarities and can potentially be accommodated by the same software system. For example, pipelines to process InSAR imagery sets or to datamine GPS time series data are workflows. The results and the ability to make downstream products may be made available through a gateway, and power users may want to provide their own custom pipelines. In this abstract, we discuss our efforts to build an open source software system, Apache Airavata, that can accommodate both gateway and workflow use cases. Our approach is general, and we have applied the software to problems in a number of scientific domains. In this talk, we discuss our applications to usage scenarios specific to earth science, focusing on earthquake physics examples drawn from the QuakSim.org and GeoGateway.org efforts. We also examine the role of the Apache Software Foundation's open community model as a way to build up common community codes that do not depend upon a single "owner" to sustain. Pushing beyond open source software, we also see the need to provide gateways and workflow systems as cloud services. These services centralize operations, provide well-defined programming interfaces, scale elastically, and have global-scale fault tolerance.
We discuss our work providing

  1. Exformatics Declarative Case Management Workflows as DCR Graphs

    DEFF Research Database (Denmark)

    Slaats, Tijs; Mukkamala, Raghava Rao; Hildebrandt, Thomas;

    2013-01-01

    Declarative workflow languages have been a growing research subject over the past ten years, but applications of the declarative approach in industry are still uncommon. Over the past two years Exformatics A/S, a Danish provider of Electronic Case Management systems, has been cooperating with...... researchers at IT University of Copenhagen (ITU) to create tools for the declarative workflow language Dynamic Condition Response Graphs (DCR Graphs) and incorporate them into their products and in teaching at ITU. In this paper we give a status report over the work. We start with an informal introduction to...

  2. Contextual cloud-based service oriented architecture for clinical workflow.

    Science.gov (United States)

    Moreno-Conde, Jesús; Moreno-Conde, Alberto; Núñez-Benjumea, Francisco J; Parra-Calderón, Carlos

    2015-01-01

    Multiple papers have highlighted that, for acceptance of systems within the healthcare domain, it is important to integrate tools with the clinical workflow. This paper analyses how clinical context management could be deployed in order to promote the adoption of advanced cloud services within the clinical workflow. This deployment will be able to integrate with the specifications promoted by the eHealth European Interoperability Framework. Throughout this paper, a cloud-based service-oriented architecture is proposed. This architecture implements a context management system aligned with the HL7 standard known as CCOW. PMID:25991217

  3. Biologically Inspired Execution Framework for Vulnerable Workflow Systems

    CERN Document Server

    Safdar, Sohail; Qureshi, Muhammad Aasim; Akbar, Rehan

    2009-01-01

    The main objective of this research is to introduce a biologically inspired execution framework for workflow systems under threat from an intrusion attack. Usually vulnerable systems need to be stopped and put into a wait state to ensure data security and privacy while being recovered. This research ensures the availability of services and data to the end user while keeping data security, privacy and integrity intact. To achieve these goals, the behavior of chameleons and the concept of hibernation have been considered in combination. The workflow systems thus become more robust using biologically inspired methods and remain safely available to business consumers even in a vulnerable state.

  4. Putting Lipstick on Pig: Enabling Database-style Workflow Provenance

    OpenAIRE

    Amsterdamer, Yael; Davidson, Susan B.; Deutch, Daniel; Milo, Tova; Stoyanovich, Julia; Tannen, Val

    2011-01-01

    Workflow provenance typically assumes that each module is a "black-box", so that each output depends on all inputs (coarse-grained dependencies). Furthermore, it does not model the internal state of a module, which can change between repeated executions. In practice, however, an output may depend on only a small subset of the inputs (fine-grained dependencies) as well as on the internal state of the module. We present a novel provenance framework that marries database-style and workflow-style...

  5. Public Clouds Work Sharing Exact Workflows in Time Limits

    Directory of Open Access Journals (Sweden)

    C.Thamizhannai

    2014-12-01

    Full Text Available We present an algorithm that uses idle time of provisioned resources and budget surplus to replicate tasks, increasing the likelihood that deadlines are met and reducing the total execution time of applications as the budget available for replication increases. The workflow description includes the tasks, the data transfer time between tasks running in different VMs (depicted on the arcs), and the execution time of tasks on three different VM types (labeled S1, S2, and S3); the deadline for executing such a workflow is 30 time units and the allocation interval is 10 time units. Two algorithms have previously been proposed for cost-optimized, deadline-constrained execution of workflows in Clouds; in these settings, the IC-PCP algorithm is the state-of-the-art algorithm for provisioning and scheduling of workflows in Clouds. The goal of the proposed Enhanced IC-PCP with Replication (EIPR) algorithm is to increase the likelihood of completing the execution of a scientific workflow application within a user-defined deadline in a public Cloud environment. Task scheduling determines the type of VMs to be used for workflow execution as well as the start and finish time of each VM (provisioning); task placement determines the data transfer start and end times of scheduled tasks, including the data transfers to the first scheduled task and from the last scheduled task; and task replication prepares virtual machines to receive data and tasks at the moment they are required, to meet the times estimated during the scheduling process. Future work includes new criteria for ranking candidate tasks for replication and workflow structure-aware scheduling of replicas, where the structure of the workflow application is considered not only during the selection of candidates for replication but also during the replicas' scheduling. We will also investigate how the replication-based approach can be used when the provisioning and scheduling process is
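
    The core idea of EIPR as described in this abstract, spending budget surplus to place task replicas into idle slots of already-provisioned VMs, can be sketched as follows. The data structures, the one-budget-unit-per-replica rule, and all numbers are illustrative assumptions, not details from the paper.

    ```python
    # Greedily place replicas of ranked candidate tasks into idle VM time
    # while budget surplus remains. Illustrative sketch only.
    idle_slots = {"vm1": 6, "vm2": 4}               # idle time units per VM
    candidates = [("t3", 5), ("t7", 4), ("t9", 3)]  # (task, runtime), ranked

    def place_replicas(slots, tasks, budget):
        placed = []
        for task, runtime in tasks:
            if budget <= 0:
                break
            # find a VM whose remaining idle time can absorb the replica
            for vm, idle in slots.items():
                if idle >= runtime:
                    slots[vm] = idle - runtime
                    budget -= 1      # assumed: one budget unit per replica
                    placed.append((task, vm))
                    break
        return placed

    print(place_replicas(idle_slots, candidates, budget=2))
    # → [('t3', 'vm1'), ('t7', 'vm2')]
    ```

    The actual EIPR algorithm additionally accounts for data-transfer times and VM provisioning intervals when deciding where a replica can fit.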

  6. The View from a Few Hundred Feet : A New Transparent and Integrated Workflow for UAV-collected Data

    Science.gov (United States)

    Peterson, F. S.; Barbieri, L.; Wyngaard, J.

    2015-12-01

    Unmanned Aerial Vehicles (UAVs) allow scientists and civilians to monitor earth and atmospheric conditions in remote locations. To keep up with the rapid evolution of UAV technology, data workflows must also be flexible, integrated, and introspective. Here, we present our data workflow for a project to assess the feasibility of detecting threshold levels of methane, carbon-dioxide, and other aerosols by mounting consumer-grade gas analysis sensors on UAVs. In particular, we highlight our use of Project Jupyter, a set of open-source software tools and documentation designed for developing "collaborative narratives" around scientific workflows. By embracing the GitHub-backed, multi-language systems available in Project Jupyter, we enable interaction and exploratory computation while simultaneously embracing distributed version control. Additionally, the transparency of this method builds trust with civilians and decision-makers and leverages collaboration and communication to resolve problems. The goal of this presentation is to provide a generic data workflow for scientific inquiries involving UAVs and to invite the participation of the AGU community in its improvement and curation.

  7. Improving Clinical Workflow in Ambulatory Care: Implemented Recommendations in an Innovation Prototype for the Veteran’s Health Administration

    Science.gov (United States)

    Patterson, Emily S.; Lowry, Svetlana Z.; Ramaiah, Mala; Gibbons, Michael C.; Brick, David; Calco, Robert; Matton, Greg; Miller, Anne; Makar, Ellen; Ferrer, Jorge A.

    2015-01-01

    Introduction: Human factors workflow analyses in healthcare settings prior to technology implementation are recommended to improve workflow in ambulatory care settings. In this paper we describe how insights from a workflow analysis conducted by NIST were implemented in a software prototype developed for a Veteran’s Health Administration (VHA) VAi2 innovation project, and the associated lessons learned. Methods: We organize the original recommendations and associated stages and steps visualized in process maps from NIST, and the VA’s lessons learned from implementing the recommendations in the VAi2 prototype, according to four stages: 1) before the patient visit, 2) during the visit, 3) discharge, and 4) visit documentation. NIST recommendations to improve workflow in ambulatory care (outpatient) settings and process map representations were based on reflective statements collected during one-hour discussions with three physicians. The development of the VAi2 prototype was initially conducted independently from the NIST recommendations, but at a midpoint in the development process all of the implementation elements were compared with the NIST recommendations and lessons learned were documented. Findings: Story-based displays and templates with default preliminary order sets were used to support scheduling, time-critical notifications, drafting medication orders, and supporting a diagnosis-based workflow. These templates enabled customization to the level of diagnostic uncertainty. Functionality was designed to support cooperative work across interdisciplinary team members, including shared documentation sessions with tracking of text modifications, medication lists, and patient education features. Displays were customized to the role and included access for consultants and site-defined educator teams. Discussion: Workflow, usability, and patient safety can be enhanced through clinician-centered design of electronic health records. The lessons learned from implementing

  8. Workflow for the use of a high-resolution image detector in endovascular interventional procedures

    Science.gov (United States)

    Rana, R.; Loughran, B.; Swetadri Vasan, S. N.; Pope, L.; Ionita, C. N.; Siddiqui, A.; Lin, N.; Bednarek, D. R.; Rudin, S.

    2014-03-01

    Endovascular image-guided intervention (EIGI) has become the primary interventional therapy for the most widespread vascular diseases. These procedures involve the insertion of a catheter into the femoral artery, which is then threaded under fluoroscopic guidance to the site of the pathology to be treated. Flat Panel Detectors (FPDs) are normally used for EIGIs; however, once the catheter is guided to the pathological site, high-resolution imaging capabilities can be used for accurately guiding a successful endovascular treatment. The Micro-Angiographic Fluoroscope (MAF) detector provides needed high-resolution, high-sensitivity, and real-time imaging capabilities. An experimental MAF enabled with a Control, Acquisition, Processing, Image Display and Storage (CAPIDS) system was installed and aligned on a detector changer attached to the C-arm of a clinical angiographic unit. The CAPIDS system was developed and implemented using LabVIEW software and provides a user-friendly interface that enables control of several clinical radiographic imaging modes of the MAF including: fluoroscopy, roadmap, radiography, and digital-subtraction-angiography (DSA). Using the automatic controls, the MAF detector can be moved to the deployed position, in front of a standard FPD, whenever higher resolution is needed during angiographic or interventional vascular imaging procedures. To minimize any possible negative impact to image guidance with the two detector systems, it is essential to have a well-designed workflow that enables smooth deployment of the MAF at critical stages of clinical procedures. For the ultimate success of this new imaging capability, a clear understanding of the workflow design is essential. This presentation provides a detailed description and demonstration of such a workflow design.

  9. Integrating imaging modalities: what makes sense from a workflow perspective?

    International Nuclear Information System (INIS)

    From a workflow/cost perspective, integrated imaging is not an obvious solution. An analysis of scanning costs as a function of system cost and relevant imaging times is presented. This analysis ignores potential clinical advantages of integrated imaging. An analysis comparing separate vs integrated imaging costs was performed by deriving pertinent equations and using reasonable cost numbers for imaging devices and systems, room and other variable costs. Integrated systems were divided into those scanning sequentially and those scanning simultaneously. Sequential scanning can be done with two devices placed in a single scanning room or in two different rooms. Graphs were derived which represent the cost difference between integrated imaging system options and their separate counterparts vs scanning time on one of the devices and the cost ratio of an integrated system to its counterpart of separate devices. Integrated systems are favoured by the fact that patients have to be up- and downloaded only once. If imaging times become longer than patient changing times, imaging on separate devices is advantageous. An integrated imaging cost advantage is achieved if the integrated system typically costs three fourths or less of the separate systems overall. If PET imaging takes 15 min or less, PET/CT imaging costs less than separate PET and CT imaging, while this time is below 5 min for SPECT/CT. A two-room integrated system has the added advantage that patient download time is not cost relevant when imaging times on the two devices differ by more than the patient download time. PET/CT scanning is a cost-effective implementation of an integrated system, unlike most current SPECT/CT systems. Integration of two devices in two rooms by a shuttle seems to be the way to make PET/MR cost-effective and may well also be a design option for SPECT/CT systems. (orig.)
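
    The break-even comparison sketched in this abstract can be illustrated with a toy cost model: cost per patient = (room + system rate per minute) × occupied time, where separate imaging pays the patient changeover twice and integrated imaging pays it once but at a higher combined rate. All rates and times below are invented for illustration and are not figures from the article.

    ```python
    # Toy model: per-patient cost of separate vs integrated imaging.
    def cost_per_patient(rate_per_min, imaging_min, changeover_min):
        return rate_per_min * (imaging_min + changeover_min)

    # Separate PET and CT rooms: one up/download per device.
    separate = (cost_per_patient(10.0, 15, 10)    # PET room
                + cost_per_patient(8.0, 5, 10))   # CT room
    # Integrated PET/CT: one changeover, both scans in one room,
    # at a higher (but not doubled) per-minute rate.
    integrated = cost_per_patient(12.0, 15 + 5, 10)

    print(separate, integrated)  # → 370.0 360.0
    ```

    With these assumed numbers the integrated system wins, consistent with the article's qualitative finding that short imaging times and a sufficiently low combined system cost favour integration.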

  10. Verification of Timed Healthcare Workflows Using Component Timed-Arc Petri Nets

    DEFF Research Database (Denmark)

    Bertolini, Cristiano; Liu, Zhiming; Srba, Jiri

    2013-01-01

    Workflows in modern healthcare systems are becoming increasingly complex and their execution involves concurrency and sharing of resources. The definition, analysis and management of collaborative healthcare workflows requires abstract model notations with a precisely defined semantics and a...

  11. Styx Grid Services: Lightweight Middleware for Efficient Scientific Workflows

    Directory of Open Access Journals (Sweden)

    J.D. Blower

    2006-01-01

    Full Text Available The service-oriented approach to performing distributed scientific research is potentially very powerful but is not yet widely used in many scientific fields. This is partly due to the technical difficulties involved in creating services and workflows and the inefficiency of many workflow systems with regard to handling large datasets. We present the Styx Grid Service, a simple system that wraps command-line programs and allows them to be run over the Internet exactly as if they were local programs. Styx Grid Services are very easy to create and use and can be composed into powerful workflows with simple shell scripts or more sophisticated graphical tools. An important feature of the system is that data can be streamed directly from service to service, significantly increasing the efficiency of workflows that use large data volumes. The status and progress of Styx Grid Services can be monitored asynchronously using a mechanism that places very few demands on firewalls. We show how Styx Grid Services can interoperate with Web Services and WS-Resources using suitable adapters.
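
    The wrapping idea described above, exposing a command-line program so a workflow can invoke it as if it were a local function with streamed input and output, can be sketched as follows. This illustrates only the concept, not the Styx protocol or its actual API.

    ```python
    import subprocess
    import sys

    # Hedged sketch: run a wrapped command-line program, feeding it text
    # on stdin and capturing stdout, the way a service wrapper would.
    def run_service(args, input_text):
        proc = subprocess.run(args, input=input_text, text=True,
                              capture_output=True, check=True)
        return proc.stdout

    # Wrap a tiny command-line program (here: Python reversing a line).
    out = run_service([sys.executable, "-c", "print(input()[::-1])"],
                      "workflow\n")
    print(out)  # → wolfkrow
    ```

    A real service wrapper would additionally stream data incrementally between processes rather than buffering whole outputs, which is the feature that makes large-dataset workflows efficient.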

  12. SURVEY OF WORKFLOW ANALYSIS IN PAST AND PRESENT ISSUES

    Directory of Open Access Journals (Sweden)

    Saravanan, M.S.

    2011-06-01

    Full Text Available This paper surveys workflow analysis from the viewpoint of business processes in organizations. A business can be defined as an organization that provides goods and services to others who want or need them. The concept of managing business processes is referred to as Business Process Management (BPM). A workflow is the automation of a business process, in whole or part, during which documents, information or tasks are passed from one participant to another for action, according to a set of procedural rules. Process mining aims at extracting useful and meaningful information from event logs, which are sets of real executions of business processes at organizations. This paper briefly reviews the state-of-the-art of business processes developed so far and the techniques adopted. The surveyed work on workflow analysis from the business process view can be broadly classified into four major categories: Business Process Modeling, Ontology-based Business Process Management, Workflow-based Business Process Controlling, and Business Process Mining.

  13. CrossFlow: Integrating Workflow Management and Electronic Commerce

    NARCIS (Netherlands)

    Hoffner, Y.; Ludwig, H.; Grefen, P.; Aberer, K.

    2001-01-01

    The CrossFlow architecture provides support for cross-organisational workflow management in dynamically established virtual enterprises. The creation of a business relationship between a service provider organisation performing a service on behalf of a consumer organisation can be made dynamic when

  14. Conceptual framework and architecture for service mediating workflow management

    NARCIS (Netherlands)

    Hu, Jinmin; Grefen, Paul

    2002-01-01

    Electronic service outsourcing creates a new paradigm for automated enterprise collaboration. The service-oriented paradigm requires a high level of flexibility of current workflow management systems and support for Business-to-Business (B2B) collaboration to realize collaborative enterprises. This

  15. Putting Lipstick on Pig: Enabling Database-style Workflow Provenance

    CERN Document Server

    Amsterdamer, Yael; Deutch, Daniel; Milo, Tova; Stoyanovich, Julia; Tannen, Val

    2012-01-01

    Workflow provenance typically assumes that each module is a "black-box", so that each output depends on all inputs (coarse-grained dependencies). Furthermore, it does not model the internal state of a module, which can change between repeated executions. In practice, however, an output may depend on only a small subset of the inputs (fine-grained dependencies) as well as on the internal state of the module. We present a novel provenance framework that marries database-style and workflow-style provenance, by using Pig Latin to expose the functionality of modules, thus capturing internal state and fine-grained dependencies. A critical ingredient in our solution is the use of a novel form of provenance graph that models module invocations and yields a compact representation of fine-grained workflow provenance. It also enables a number of novel graph transformation operations, allowing to choose the desired level of granularity in provenance querying (ZoomIn and ZoomOut), and supporting "what-if" workflow analyti...

  16. Content and Workflow Management for Library Websites: Case Studies

    Science.gov (United States)

    Yu, Holly, Ed.

    2005-01-01

    Using database-driven web pages or web content management (WCM) systems to manage increasingly diverse web content and to streamline workflows is a commonly practiced solution recognized in libraries today. However, limited library web content management models and funding constraints prevent many libraries from purchasing commercially available…

  17. From Paper Based Clinical Practice Guidelines to Declarative Workflow Management

    DEFF Research Database (Denmark)

    Lyng, Karen Marie; Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2009-01-01

    We present a field study of oncology workflow, involving doctors, nurses and pharmacists at Danish hospitals and discuss the obstacles, enablers and challenges for the use of computer based clinical practice guidelines. Related to the CIGDec approach of Pesic and van der Aalst we then describe how...

  18. Electronic Health Record-Driven Workflow for Diagnostic Radiologists.

    Science.gov (United States)

    Geeslin, Matthew G; Gaskin, Cree M

    2016-01-01

    In most settings, radiologists maintain a high-throughput practice in which efficiency is crucial. The conversion from film-based to digital study interpretation and data storage launched the era of PACS-driven workflow, leading to significant gains in speed. The advent of electronic health records improved radiologists' access to patient data; however, many still find this aspect of workflow to be relatively cumbersome. Nevertheless, the ability to guide a diagnostic interpretation with clinical information, beyond that provided in the examination indication, can add significantly to the specificity of a radiologist's interpretation. Responsibilities of the radiologist include, but are not limited to, protocoling examinations, interpreting studies, chart review, peer review, writing notes, placing orders, and communicating with referring providers. Most of the aforementioned activities are not PACS-centric and require a login to one or more additional applications. Consolidation of these tasks for completion through a single interface can simplify workflow, save time, and potentially reduce the incidence of errors. Here, the authors describe diagnostic radiology workflow that leverages the electronic health record to significantly add to a radiologist's ability to be part of the health care team, provide relevant interpretations, and improve efficiency and quality. PMID:26603098

  19. Managing Library IT Workflow with Bugzilla

    OpenAIRE

    Nina McHale

    2010-01-01

    Prior to September 2008, all technology issues at the University of Colorado Denver's Auraria Library were reported to a dedicated departmental phone line. A variety of staff changes necessitated a more formal means of tracking, delegating, and resolving reported issues, and the department turned to Bugzilla, an open source bug tracking application designed by Mozilla.org developers. While designed with software development bug tracking in mind, Bugzilla can be easily customized and modified ...

  20. A Workflow-Oriented Approach To Propagation Models In Heliophysics

    Directory of Open Access Journals (Sweden)

    Gabriele Pierantoni

    2014-01-01

    Full Text Available The Sun is responsible for the eruption of billions of tons of plasma and the generation of near light-speed particles that propagate throughout the solar system and beyond. If directed towards Earth, these events can be damaging to our technological infrastructure. Hence there is an effort to understand the cause of the eruptive events and how they propagate from Sun to Earth. However, the physics governing their propagation is not well understood, so there is a need to develop a theoretical description of their propagation, known as a Propagation Model, in order to predict when they may impact Earth. It is often difficult to define a single propagation model that correctly describes the physics of solar eruptive events, and even more difficult to implement models capable of catering for all these complexities and to validate them using real observational data. In this paper, we envisage that workflows offer both a theoretical and practical framework for a novel approach to propagation models. We define a mathematical framework that aims at encompassing the different modalities with which workflows can be used, and provide a set of generic building blocks written in the TAVERNA workflow language that users can use to build their own propagation models. Finally we test both the theoretical model and the composite building blocks of the workflow with a real Science Use Case that was discussed during the 4th CDAW (Coordinated Data Analysis Workshop) event held by the HELIO project. We show that generic workflow building blocks can be used to construct a propagation model that successfully describes the transit of solar eruptive events toward Earth and predicts a correct Earth-impact time.

  1. The Symbiotic Relationship between Scientific Workflow and Provenance (Invited)

    Science.gov (United States)

    Stephan, E.

    2010-12-01

    The purpose of this presentation is to describe the symbiotic nature of scientific workflows and provenance. We will also discuss the current trends and real world challenges facing these two distinct research areas. Although motivated differently, the needs of the international science communities are the glue that binds this relationship together. Understanding and articulating the science drivers to these communities is paramount as these technologies evolve and mature. Originally conceived for managing business processes, workflows are now becoming invaluable assets in both computational and experimental sciences. These reconfigurable, automated systems provide essential technology to perform complex analyses by coupling together geographically distributed disparate data sources and applications. As a result, workflows are capable of higher throughput in a shorter amount of time than performing the steps manually. Today many different workflow products exist; these could include Kepler and Taverna or similar products like MeDICI, developed at PNNL, that are standardized on the Business Process Execution Language (BPEL). Provenance, originating from the French term provenir, “to come from”, is used to describe the curation process of artwork as art is passed from owner to owner. The concept of provenance was adopted by digital libraries as a means to track the lineage of documents while standards such as the DublinCore began to emerge. In recent years the systems science community has increasingly expressed the need to expand the concept of provenance to formally articulate the history of scientific data. Communities such as the International Provenance and Annotation Workshop (IPAW) have formalized a provenance data model, the Open Provenance Model, and the W3C is hosting a provenance incubator group featuring the Proof Markup Language. Although both workflows and provenance have risen from different communities and operate independently, their mutual

  2. Improvement of workflow and processes to ease and enrich meaningful use of health information technology.

    Science.gov (United States)

    Singh, Ranjit; Singh, Ashok; Singh, Devan R; Singh, Gurdev

    2013-01-01

    The introduction of health information technology (HIT) can have unexpected and unintended patient safety and/or quality consequences. This highly desirable but complex intervention requires workflow changes in order to be effective. Workflow is often cited by providers as the number one 'pain point'. Its redesign needs to be tailored to the organizational context, current workflow, HIT system being introduced, and the resources available. Primary care practices lack the required expertise and need external assistance. Unfortunately, the current methods of using esoteric charts or software are alien to health care workers and are, therefore, perceived to be barriers. Most importantly and ironically, these do not readily educate or enable staff to inculcate a common vision, ownership, and empowerment among all stakeholders. These attributes are necessary for creating highly reliable organizations. We present a tool that addresses US Accreditation Council for Graduate Medical Education (ACGME) competency requirements. Of the six competencies called for by the ACGME, the two that this tool particularly addresses are 'system-based practice' and 'practice-based learning and continuing improvement'. This toolkit is founded on a systems engineering approach. It includes a motivational and orientation presentation, 128 magnetic pictorial and write-erase icons of 40 designs, dry-erase magnetic board, and five visual aids for reducing cognitive and emotive biases in staff. Pilot tests were carried out in practices in Western New York and Colorado, USA. In addition, the toolkit was presented at the 2011 North American Primary Care Research Group (NAPCRG) meeting and an Agency for Health Research and Quality (AHRQ) meeting in 2013 to solicit responses from attendees. It was also presented to the officers of the Office of the National Coordinator (ONC) for HIT. All qualitative feedback was extremely positive and enthusiastic.
The respondents recommended that the toolkit be disseminated

  3. Digital stereo photogrammetry for grain-scale monitoring of fluvial surfaces: Error evaluation and workflow optimisation

    Science.gov (United States)

    Bertin, Stephane; Friedrich, Heide; Delmas, Patrice; Chan, Edwin; Gimel'farb, Georgy

    2015-03-01

    Grain-scale monitoring of fluvial morphology is important for the evaluation of river system dynamics. Significant progress in remote sensing and computer performance allows rapid high-resolution data acquisition; however, applications in fluvial environments remain challenging. Even in a controlled environment, such as a laboratory, the extensive acquisition workflow is prone to the propagation of errors in digital elevation models (DEMs). This is valid for both of the common surface recording techniques: digital stereo photogrammetry and terrestrial laser scanning (TLS). The optimisation of the acquisition process, an effective way to reduce the occurrence of errors, is generally limited by the use of commercial software. Therefore, the removal of evident blunders during post processing is regarded as standard practice, although this may introduce new errors. This paper presents a detailed evaluation of a digital stereo-photogrammetric workflow developed for fluvial hydraulic applications. The introduced workflow is user-friendly and can be adapted to various close-range measurements: imagery is acquired with two Nikon D5100 cameras and processed using non-proprietary "on-the-job" calibration and dense scanline-based stereo matching algorithms. Novel ground truth evaluation studies were designed to identify the DEM errors, which resulted from a combination of calibration errors, inaccurate image rectifications and stereo-matching errors. To ensure optimum DEM quality, we show that systematic DEM errors must be minimised by ensuring a good distribution of control points throughout the image format during calibration. DEM quality is then largely dependent on the imagery utilised. We evaluated the open access multi-scale Retinex algorithm to facilitate the stereo matching, and quantified its influence on DEM quality. Occlusions, inherent to any roughness element, are still a major limiting factor to DEM accuracy. We show that a careful selection of the camera
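
The depth recovery underlying such stereo-photogrammetric DEMs rests on the textbook rectified-pinhole relation Z = f·B/d; the sketch below illustrates that relation only (the numbers are illustrative, not from the study):

```python
# Illustrative stereo depth recovery for a rectified pinhole camera pair.
# f_px: focal length in pixels, baseline_m: camera separation in metres,
# disparity_px: horizontal pixel offset of a point between the two images.
def depth_from_disparity(f_px, baseline_m, disparity_px):
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return f_px * baseline_m / disparity_px

# f = 2000 px, baseline = 0.5 m, disparity = 40 px:
print(depth_from_disparity(2000, 0.5, 40))  # 25.0 m
```

Small disparity errors from imperfect matching translate directly into depth (and hence DEM) errors, which is why the paper's calibration and stereo-matching evaluation matters.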

  4. A Workflow-based RBAC Model for Web Services in Multiple Autonomous Domains

    Directory of Open Access Journals (Sweden)

    Zhenwu WANG

    2013-03-01

    Full Text Available A workflow-based RBAC model for web services (WFRBAC4WS) has been proposed in this paper. The model organizes web services in different autonomous domains through a workflow mechanism, and maps the RBAC model to tasks of the workflow model. The paper details the authorization procedure of the WFRBAC4WS model, the lifetime management, the extension of authorization constraints and the formal descriptions of the proposed model. Compared with other RBAC models for web services, this model not only combines the RBAC model with workflow, but also describes the interactions between the workflow mechanism and the RBAC model in a web services environment; the authorization work of this model is dynamic and comprehensive.
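
The core idea of mapping workflow tasks to RBAC roles can be sketched as follows; all names here are illustrative and not taken from the paper's formal model:

```python
# Minimal sketch of a task-level RBAC check for a workflow
# (illustrative names; the paper's model adds domains and constraints).
class RBACWorkflow:
    def __init__(self):
        self.role_permissions = {}  # role -> set of permitted task names
        self.user_roles = {}        # user -> set of assigned roles

    def grant(self, role, task):
        self.role_permissions.setdefault(role, set()).add(task)

    def assign(self, user, role):
        self.user_roles.setdefault(user, set()).add(role)

    def can_execute(self, user, task):
        # A user may execute a workflow task if any of their roles permits it.
        return any(task in self.role_permissions.get(r, set())
                   for r in self.user_roles.get(user, set()))

wf = RBACWorkflow()
wf.grant("clerk", "submit_order")
wf.grant("manager", "approve_order")
wf.assign("alice", "clerk")
print(wf.can_execute("alice", "submit_order"))   # True
print(wf.can_execute("alice", "approve_order"))  # False
```

In a multi-domain setting such as WFRBAC4WS, each autonomous domain would hold its own role and permission tables, with the workflow engine mediating cross-domain authorization.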

  5. Networked Print Production: Does JDF Provide a Perfect Workflow?

    Directory of Open Access Journals (Sweden)

    Bernd Zipper

    2004-12-01

    Full Text Available The "networked printing works" is a well-worn slogan used by many providers in the graphics industry, and for the past number of years printing-works manufacturers have been working on the goal of achieving the "networked printing works". A turning point from the concept to real implementation can now be expected at drupa 2004: JDF (Job Definition Format) and thus "networked production" will form the center of interest here. The first approaches towards a complete, networked workflow between prepress, print and postpress in production are already available - the products and solutions will now be presented publicly at drupa 2004. So, drupa 2004 will undoubtedly be the "JDF-drupa" - the drupa where machines learn to communicate with each other digitally - the drupa where the dream of general system and job communication in the printing industry can first be realized. CIP3, which has since been renamed CIP4, is an international consortium of leading manufacturers from the printing and media industry who have taken on the task of integrating processes for prepress, print and postpress. The association, to which nearly all manufacturers in the graphics industry belong, has succeeded with CIP3 in developing a first international standard for the transmission of control data in the print workflow. Further development of the CIP4 standard now includes a more extensive "system language" called JDF, which will guarantee workflow communication beyond manufacturer boundaries. However, not only data for actual print production will be communicated with JDF: planning and calculation data for MIS (Management Information Systems) and calculation systems will also be prepared.
The German printing specialist Hans-Georg Wenke defines JDF as follows: "JDF takes over data from MIS for machines, aggregates and their control desks, data exchange within office applications, and finally ensures that data can be incorporated in the technical workflow

  6. Developing integrated workflows for the digitisation of herbarium specimens using a modular and scalable approach

    Directory of Open Access Journals (Sweden)

    Elspeth Haston

    2012-07-01

    Full Text Available Digitisation programmes in many institutes frequently involve disparate and irregular funding, diverse selection criteria and scope, with different members of staff managing and operating the processes. These factors have influenced the decision at the Royal Botanic Garden Edinburgh to develop an integrated workflow for the digitisation of herbarium specimens which is modular and scalable to enable a single overall workflow to be used for all digitisation projects. This integrated workflow is comprised of three principal elements: a specimen workflow, a data workflow and an image workflow. The specimen workflow is strongly linked to curatorial processes which will impact on the prioritisation, selection and preparation of the specimens. The importance of including a conservation element within the digitisation workflow is highlighted. The data workflow includes the concept of three main categories of collection data: label data, curatorial data and supplementary data. It is shown that each category of data has its own properties which influence the timing of data capture within the workflow. Development of software has been carried out for the rapid capture of curatorial data, and optical character recognition (OCR) software is being used to increase the efficiency of capturing label data and supplementary data. The large number and size of the images has necessitated the inclusion of automated systems within the image workflow.

  7. Developing integrated workflows for the digitisation of herbarium specimens using a modular and scalable approach.

    Science.gov (United States)

    Haston, Elspeth; Cubey, Robert; Pullan, Martin; Atkins, Hannah; Harris, David J

    2012-01-01

    Digitisation programmes in many institutes frequently involve disparate and irregular funding, diverse selection criteria and scope, with different members of staff managing and operating the processes. These factors have influenced the decision at the Royal Botanic Garden Edinburgh to develop an integrated workflow for the digitisation of herbarium specimens which is modular and scalable to enable a single overall workflow to be used for all digitisation projects. This integrated workflow is comprised of three principal elements: a specimen workflow, a data workflow and an image workflow. The specimen workflow is strongly linked to curatorial processes which will impact on the prioritisation, selection and preparation of the specimens. The importance of including a conservation element within the digitisation workflow is highlighted. The data workflow includes the concept of three main categories of collection data: label data, curatorial data and supplementary data. It is shown that each category of data has its own properties which influence the timing of data capture within the workflow. Development of software has been carried out for the rapid capture of curatorial data, and optical character recognition (OCR) software is being used to increase the efficiency of capturing label data and supplementary data. The large number and size of the images has necessitated the inclusion of automated systems within the image workflow. PMID:22859881

  8. Adopting Collaborative Workflow Pattern : Use Case

    Directory of Open Access Journals (Sweden)

    Antonio Capodieci

    2015-03-01

    Full Text Available In recent years, the use of web 2.0 tools has increased in companies and organisations. This phenomenon has modified common organisational and operative practices. This has led “knowledge workers” to change their working practices through the use of Web 2.0 communication tools. Unfortunately, these tools have not been integrated with existing enterprise information systems. This is an important problem in an organisational context, because knowledge of the information exchanged is needed. In previous works we demonstrated that it is possible to capture this knowledge using collaboration processes, which are processes of abstraction created in accordance with design patterns and applied to new organisational operative practices. In this article, we want to present the experience of the adoption of the methodology and the catalog of patterns by some pattern designers of a company operating in Public Administration and Finance, with the aim of shaping an Enterprise 2.0 information system which takes into account the collaborative processes.

  9. GO2OGS 1.0: a versatile workflow to integrate complex geological information with fault data into numerical simulation models

    Science.gov (United States)

    Fischer, T.; Naumov, D.; Sattler, S.; Kolditz, O.; Walther, M.

    2015-11-01

    We offer a versatile workflow to convert geological models built with the Paradigm™ GOCAD© (Geological Object Computer Aided Design) software into the open-source VTU (Visualization Toolkit unstructured grid) format for usage in numerical simulation models. Tackling relevant scientific questions or engineering tasks often involves multidisciplinary approaches. Conversion workflows are needed as a way of communication between the diverse tools of the various disciplines. Our approach offers an open-source, platform-independent, robust, and comprehensible method that is potentially useful for a multitude of environmental studies. With two application examples in the Thuringian Syncline, we show how a heterogeneous geological GOCAD model including multiple layers and faults can be used for numerical groundwater flow modeling, in our case employing the OpenGeoSys open-source numerical toolbox for groundwater flow simulations. The presented workflow offers the chance to incorporate increasingly detailed data, utilizing the growing availability of computational power to simulate numerical models.
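
To make the target format of such a conversion concrete, the sketch below writes a minimal VTU (VTK XML unstructured grid) document for a single tetrahedron. This only illustrates the file structure; a real converter like the one in the paper, or libraries such as VTK, would handle full geological meshes:

```python
# Minimal hand-rolled VTU (VTK XML UnstructuredGrid) writer for tetrahedra.
# Illustrative only: shows the connectivity/offsets/types layout of the format.
def write_vtu(points, cells):
    # points: list of (x, y, z); cells: list of 4-tuples of point indices
    pts = " ".join(f"{x} {y} {z}" for x, y, z in points)
    conn = " ".join(str(i) for cell in cells for i in cell)
    offsets = " ".join(str(4 * (k + 1)) for k in range(len(cells)))
    types = " ".join("10" for _ in cells)  # VTK cell type 10 = tetrahedron
    return f"""<?xml version="1.0"?>
<VTKFile type="UnstructuredGrid" version="0.1" byte_order="LittleEndian">
  <UnstructuredGrid>
    <Piece NumberOfPoints="{len(points)}" NumberOfCells="{len(cells)}">
      <Points>
        <DataArray type="Float64" NumberOfComponents="3" format="ascii">{pts}</DataArray>
      </Points>
      <Cells>
        <DataArray type="Int32" Name="connectivity" format="ascii">{conn}</DataArray>
        <DataArray type="Int32" Name="offsets" format="ascii">{offsets}</DataArray>
        <DataArray type="UInt8" Name="types" format="ascii">{types}</DataArray>
      </Cells>
    </Piece>
  </UnstructuredGrid>
</VTKFile>"""

# One unit tetrahedron:
doc = write_vtu([(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)], [(0, 1, 2, 3)])
```

Once geological layers and faults are expressed in this grid format, they can be consumed directly by simulators such as OpenGeoSys.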

  10. Staffing and Workflow of a Maturing Institutional Repository

    Directory of Open Access Journals (Sweden)

    Debora L. Madsen

    2013-02-01

    Full Text Available Institutional repositories (IRs have become established components of many academic libraries. As an IR matures it will face the challenge of how to scale up its operations to increase the amount and types of content archived. These challenges involve staffing, systems, workflows, and promotion. In the past eight years, Kansas State University's IR (K-REx has grown from a platform for student theses, dissertations, and reports to also include faculty works. The initial workforce of a single faculty member was expanded as a part of a library-wide reorganization, resulting in a cross-departmental team that is better able to accommodate the expansion of the IR. The resultant need to define staff responsibilities and develop resources to manage the workflows has led to the innovations described here, which may prove useful to the greater library community as other IRs mature.

  11. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction

    International Nuclear Information System (INIS)

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses

  12. Research on Architecture of Enterprise Modeling in Workflow System

    Institute of Scientific and Technical Information of China (English)

    李伟平; 齐慧彬; 薛劲松; 朱云龙

    2002-01-01

    The market that an enterprise faces is changing and cannot be forecasted accurately in this information age. In order to find the chances in the market, practitioners have focused on business processes through their re-engineering programmes to improve enterprise efficiency. It is necessary to manage an enterprise using process-based methods to meet the requirement of enhancing work efficiency and the ability to compete in the market. And information system developers have emphasized the use of standard models to accelerate the speed of configuration and implementation of integrated systems for enterprises. So we have to model an enterprise with a process-based modeling method. An architecture of enterprise modeling is presented in this paper. This architecture is composed of four views and supports the whole lifecycle of the enterprise model. Because workflow management systems are based on process definitions, this architecture can be directly used in a workflow management system. The implementation method of this model is thoroughly described, and the workflow management software supporting the building and running of the model is also given.

  13. Technical Perspectives on Knowledge Management in Bioinformatics Workflow Systems

    Directory of Open Access Journals (Sweden)

    Walaa N. Ismail

    2015-01-01

    Full Text Available Workflow systems by their nature can help bioinformaticians to plan their experiments and to store, capture and analyse the runtime-generated data. On the other hand, life science research usually produces new knowledge at an increasing speed; dealing with knowledge such as papers, databases and other systems is actually a complex task that requires much effort and time. Thus the management of knowledge is an important issue for life scientists. Approaches to organize biological knowledge sources and to record the provenance knowledge of an experiment in a readily accessible resource are presently being developed. This article focuses on the knowledge management of in silico experimentation in bioinformatics workflow systems.

  14. A framework for streamlining research workflow in neuroscience and psychology

    Directory of Open Access Journals (Sweden)

    Jonas Kubilius

    2014-01-01

    Full Text Available Successful accumulation of knowledge is critically dependent on the ability to verify and replicate every part of scientific conduct. However, such principles are difficult to enact when researchers continue to rely on ad hoc workflows and poorly maintained code bases. In this paper I examine the needs of the neuroscience and psychology community, and introduce psychopy_ext, a unifying framework that seamlessly integrates popular experiment building, analysis and manuscript preparation tools by choosing reasonable defaults and implementing relatively rigid patterns of workflow. This structure allows for automation of multiple tasks, such as generating user interfaces, unit testing, control analyses of stimuli, single-command access to descriptive statistics, and publication-quality plotting. Taken together, psychopy_ext opens an exciting possibility for faster, more robust code development and collaboration for researchers.

  15. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert [UConn Health, Department of Molecular Biology and Biophysics (United States); Martyn, Timothy O. [Rensselaer at Hartford, Department of Engineering and Science (United States); Ellis, Heidi J. C. [Western New England College, Department of Computer Science and Information Technology (United States); Gryk, Michael R., E-mail: gryk@uchc.edu [UConn Health, Department of Molecular Biology and Biophysics (United States)

    2015-07-15

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  16. How Workflow Systems Facilitate Business Process Reengineering and Improvement

    Directory of Open Access Journals (Sweden)

    Mohamed El Khadiri

    2012-03-01

    Full Text Available This paper investigates the relationship between workflow systems and business process reengineering and improvement. The study is based on a real case study at the “Centre Régional d'Investissement” (CRI) of Marrakech, Morocco. The CRI is entrusted to coordinate various investment projects at the regional level. Our previous work has shown that a workflow system can be a basis for business process reengineering. However, for continuous process improvement, the system has shown to be insufficient as it fails to deal with exceptions and problem resolutions that informal communications provide. However, when this system is augmented with an expanded corporate memory system that includes social tools, to capture informal communication and data, we are closer to a more complete system that facilitates business process reengineering and improvement.

  17. A Bayesian Approach to the Partitioning of Workflows

    CERN Document Server

    Chua, Freddy C

    2015-01-01

    When partitioning workflows in realistic scenarios, the knowledge of the processing units is often vague or unknown. A naive approach to addressing this issue is to perform many controlled experiments for different workloads, each consisting of multiple trials, in order to estimate the mean and variance of the specific workload. Since this controlled experimental approach can be quite costly in terms of time and resources, we propose a variant of the Gibbs Sampling algorithm that uses a sequence of Bayesian inference updates to estimate the processing characteristics of the processing units. Using the inferred characteristics of the processing units, we are able to determine the best way to split a workflow for processing it in parallel with the lowest expected completion time and least variance.
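
The flavour of such Bayesian inference updates can be sketched with a simple conjugate Normal model of a processing unit's mean task duration; this is a stand-in for the paper's Gibbs-sampling variant, and all numbers and names are illustrative:

```python
# Hedged sketch: estimate a processing unit's mean task duration by Bayesian
# updating as runtimes are observed, instead of running controlled experiments.
# Conjugate Normal prior with known observation variance (a simplification of
# the paper's Gibbs-sampling approach).
def posterior_update(prior_mean, prior_var, obs, obs_var):
    # Normal prior x Normal likelihood -> Normal posterior (known obs_var).
    precision = 1.0 / prior_var + len(obs) / obs_var
    post_var = 1.0 / precision
    post_mean = post_var * (prior_mean / prior_var + sum(obs) / obs_var)
    return post_mean, post_var

# Vague prior of 5 s is pulled toward observed runtimes of ~2 s:
mean, var = posterior_update(prior_mean=5.0, prior_var=4.0,
                             obs=[2.1, 1.9, 2.0], obs_var=0.25)
print(round(mean, 2))  # 2.06
```

With a posterior mean and variance per unit, the expected completion time of each candidate workflow split can be compared directly, which is the basis for choosing the lowest-variance parallel partition.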

  18. Enabling smart workflows over heterogeneous ID-sensing technologies.

    Science.gov (United States)

    Giner, Pau; Cetina, Carlos; Lacuesta, Raquel; Palacios, Guillermo

    2012-01-01

    Sensing technologies in mobile devices play a key role in reducing the gap between the physical and the digital world. The use of automatic identification capabilities can improve user participation in business processes where physical elements are involved (Smart Workflows). However, identifying all objects in the user surroundings does not automatically translate into meaningful services to the user. This work introduces Parkour, an architecture that allows the development of services that match the goals of each of the participants in a smart workflow. Parkour is based on a pluggable architecture that can be extended to provide support for new tasks and technologies. In order to facilitate the development of these plug-ins, tools that automate the development process are also provided. Several Parkour-based systems have been developed in order to validate the applicability of the proposal. PMID:23202193

  19. Enabling Smart Workflows over Heterogeneous ID-SensingTechnologies

    Directory of Open Access Journals (Sweden)

    Guillermo Palacios

    2012-11-01

    Full Text Available Sensing technologies in mobile devices play a key role in reducing the gap between the physical and the digital world. The use of automatic identification capabilities can improve user participation in business processes where physical elements are involved (Smart Workflows). However, identifying all objects in the user surroundings does not automatically translate into meaningful services to the user. This work introduces Parkour, an architecture that allows the development of services that match the goals of each of the participants in a smart workflow. Parkour is based on a pluggable architecture that can be extended to provide support for new tasks and technologies. In order to facilitate the development of these plug-ins, tools that automate the development process are also provided. Several Parkour-based systems have been developed in order to validate the applicability of the proposal.

  20. CrossFlow: Integrating Workflow Management and Electronic Commerce

    OpenAIRE

    Hoffner, Y.; Ludwig, H; Grefen, P.; Aberer, K.

    2001-01-01

    The CrossFlow1 architecture provides support for cross-organisational workflow management in dynamically established virtual enterprises. The creation of a business relationship between a service provider organisation performing a service on behalf of a consumer organisation can be made dynamic when augmented by virtual market technology, the dynamic configuration of the contract enactment infrastructures, and the provision of fine grained service monitoring and control. Standard ways of desc...

  1. Evolutionary multi-objective workflow scheduling in Cloud

    OpenAIRE

    Z. Zhu; Zhang, G.; M. Li; Liu, X.

    2015-01-01

    Cloud computing provides promising platforms for executing large applications with enormous computational resources to offer on demand. In a Cloud model, users are charged based on their usage of resources and the required quality of service (QoS) specifications. Although there are many existing workflow scheduling algorithms in traditional distributed or heterogeneous computing environments, they have difficulties in being directly applied to the Cloud environments since Cloud differs from t...

  2. Hybrid Workflow Policy Management for Heart Disease Identification

    OpenAIRE

    Dong-Hyun Kim; Woo-Ram Jung; Chan-Hyun Youn

    2010-01-01

    As science technology grows, medical applications are becoming more complex in solving physiological problems within the expected time. Workflow management systems (WMS) in Grid computing are a promising solution to sophisticated problems such as genomic analysis, drug discovery, disease identification, etc. Although existing WMS can provide basic management functionality in a Grid environment, consideration of user requirements such as performance, reliability and interaction with user is m...

  3. Analysis of Whole Transcriptome Sequencing Data: Workflow and Software.

    Science.gov (United States)

    Yang, In Seok; Kim, Sangwoo

    2015-12-01

    RNA is a polymeric molecule implicated in various biological processes, such as the coding, decoding, regulation, and expression of genes. Numerous studies have examined RNA features using whole transcriptome sequencing (RNA-seq) approaches. RNA-seq is a powerful technique for characterizing and quantifying the transcriptome and accelerates the development of bioinformatics software. In this review, we introduce routine RNA-seq workflow together with related software, focusing particularly on transcriptome reconstruction and expression quantification. PMID:26865842
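
Expression quantification, the final step of the routine RNA-seq workflow described above, often reports values such as TPM (transcripts per million); the sketch below shows that computation (toy numbers, not from the review):

```python
# Illustrative TPM (transcripts per million) computation, a common expression
# quantification step in RNA-seq workflows. Toy counts and lengths.
def tpm(counts, lengths_kb):
    # Normalise counts by transcript length (reads per kilobase), then scale
    # so that the values sum to one million across all transcripts.
    rates = [c / l for c, l in zip(counts, lengths_kb)]
    total = sum(rates)
    return [r / total * 1e6 for r in rates]

# Three transcripts with identical per-kilobase coverage:
vals = tpm(counts=[100, 200, 300], lengths_kb=[1.0, 2.0, 3.0])
print(vals)  # all three transcripts get the same TPM
```

Because TPM values always sum to one million per sample, they are comparable across transcripts within a sample, which is why many quantification tools in these workflows report them.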

  4. A Component Based Approach to Scientific Workflow Management

    OpenAIRE

    Goff, J. -M. Le; Kovacs, Z; Baker, N.; Brooks, P; R. McClatchey

    2001-01-01

    CRISTAL is a distributed scientific workflow system used in the manufacturing and production phases of HEP experiment construction at CERN. The CRISTAL project has studied the use of a description driven approach, using meta- modelling techniques, to manage the evolving needs of a large physics community. Interest from such diverse communities as bio-informatics and manufacturing has motivated the CRISTAL team to re-engineer the system to customize functionality according to end user requirem...

  5. A framework for streamlining research workflow in neuroscience and psychology

    OpenAIRE

    Jonas Kubilius

    2014-01-01

    Successful accumulation of knowledge is critically dependent on the ability to verify and replicate every part of scientific conduct. However, such principles are difficult to enact when researchers continue to resort on ad hoc workflows and with poorly maintained code base. In this paper I examine the needs of neuroscience and psychology community, and introduce psychopy_ext, a unifying framework that seamlessly integrates popular experiment building, analysis and manuscript preparation tool...

  6. A framework for streamlining research workflow in neuroscience and psychology

    OpenAIRE

    Kubilius, Jonas

    2014-01-01

    Successful accumulation of knowledge is critically dependent on the ability to verify and replicate every part of scientific conduct. However, such principles are difficult to enact when researchers continue to resort on ad-hoc workflows and with poorly maintained code base. In this paper I examine the needs of neuroscience and psychology community, and introduce psychopy_ext, a unifying framework that seamlessly integrates popular experiment building, analysis and manuscript preparation tool...

  7. Advanced Workflows for Fluid Transfer in Faulted Basins

    Directory of Open Access Journals (Sweden)

    Thibaut Muriel

    2014-07-01

    Full Text Available The traditional 3D basin modeling workflow is made of the following steps: construction of present day basin architecture, reconstruction of the structural evolution through time, together with fluid flow simulation and heat transfers. In this case, the forward simulation is limited to basin architecture, mainly controlled by erosion, sedimentation and vertical compaction. The tectonic deformation is limited to vertical slip along faults. Fault properties are modeled as vertical shear zones along which rock permeability is adjusted to enhance fluid flow or prevent flow to escape. For basins having experienced a more complex tectonic history, this approach is over-simplified. It fails in understanding and representing fluid flow paths due to structural evolution of the basin. This impacts overpressure build-up, and petroleum resources location. Over the past years, a new 3D basin forward code has been developed in IFP Energies nouvelles that is based on a cell centered finite volume discretization which preserves mass on an unstructured grid and describes the various changes in geometry and topology of a basin through time. At the same time, 3D restoration tools based on geomechanical principles of strain minimization were made available that offer a structural scenario at a discrete number of deformation stages of the basin. In this paper, we present workflows integrating these different innovative tools on complex faulted basin architectures where complex means moderate lateral as well as vertical deformation coupled with dynamic fault property modeling. Two synthetic case studies inspired by real basins have been used to illustrate how to apply the workflow, where the difficulties in the workflows are, and what the added value is compared with previous basin modeling approaches.

  8. Advanced Workflows for Fluid Transfer in Faulted Basins.

    OpenAIRE

    Thibaut Muriel; Jardin Anne; Faille Isabelle; Willien Françoise; Guichet Xavier

    2014-01-01

    The traditional 3D basin modeling workflow consists of the following steps: construction of present-day basin architecture, reconstruction of the structural evolution through time, together with fluid flow simulation and heat transfers. In this case, the forward simulation is limited to basin architecture, mainly controlled by erosion, sedimentation and vertical compaction. The tectonic deformation is limited to vertical slip along faults. Fault properties are modeled as vertical shear zones a...

  9. A graph model of data and workflow provenance

    OpenAIRE

    Acar, U.; Buneman, P.; J. Cheney; Van den Bussche, Jan; Kwasnikowska, Natalia; Vansummeren, Stijn

    2010-01-01

    Provenance has been studied extensively in both database and workflow management systems, so far with little convergence of definitions or models. Provenance in databases has generally been defined for relational or complex object data, by propagating fine-grained annotations or algebraic expressions from the input to the output. This kind of provenance has been found useful in other areas of computer science: annotation databases, probabilistic databases, schema and data integration, etc. In...

  10. A Complete Workflow for Development of Bangla OCR

    OpenAIRE

    Omee, Farjana Yeasmin; Himel, Shiam Shabbir; Bikas, Md. Abu Naser

    2012-01-01

    Developing a Bangla OCR requires a range of algorithms and methods. Many efforts have gone into developing a Bangla OCR, but none of them has produced an error-free system; each has its shortcomings. We discuss the problem scope of currently existing Bangla OCRs. In this paper, we present the basic steps required for developing a Bangla OCR and a complete workflow for its development, mentioning all the algorithms required.
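    One classic step in such an OCR workflow, character segmentation by column projection, can be sketched as follows. The function names `binarize` and `segment_chars` and the blank-column heuristic are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def binarize(img, thresh=128):
    """Mark ink pixels (dark on light background) as 1."""
    return (img < thresh).astype(np.uint8)

def segment_chars(binary):
    """Split a binarized text line into glyphs at fully blank columns
    (the classic vertical projection-profile step)."""
    ink_cols = binary.any(axis=0)
    chars, start = [], None
    for i, has_ink in enumerate(ink_cols):
        if has_ink and start is None:
            start = i                      # glyph begins
        elif not has_ink and start is not None:
            chars.append(binary[:, start:i])  # glyph ends
            start = None
    if start is not None:
        chars.append(binary[:, start:])
    return chars
```

    Note that Bangla script joins the letters of a word with a headline (matra), so plain blank-column projection is insufficient on its own; real Bangla OCR pipelines add matra detection before segmenting.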

  11. Enabling Smart Workflows over Heterogeneous ID-SensingTechnologies

    OpenAIRE

    Guillermo Palacios; Carlos Cetina; Raquel Lacuesta; Pau Giner

    2012-01-01

    Sensing technologies in mobile devices play a key role in reducing the gap between the physical and the digital world. The use of automatic identification capabilities can improve user participation in business processes where physical elements are involved (Smart Workflows). However, identifying all objects in the user surroundings does not automatically translate into meaningful services to the user. This work introduces Parkour, an architecture that allows the development of services that ...

  12. Multidimensional Interactive Radiology Report and Analysis: standardization of workflow and reporting for renal mass tracking and quantification

    Science.gov (United States)

    Hwang, Darryl H.; Ma, Kevin; Yepes, Fernando; Nadamuni, Mridula; Nayyar, Megha; Liu, Brent; Duddalwar, Vinay; Lepore, Natasha

    2015-12-01

    A conventional radiology report primarily consists of a large amount of unstructured text, and lacks clear, concise, consistent and content-rich information. Hence, an area of unmet clinical need consists of developing better ways to communicate radiology findings and information specific to each patient. Here, we design a new workflow and reporting system that combines and integrates advances in engineering technology with those from the medical sciences, the Multidimensional Interactive Radiology Report and Analysis (MIRRA). Until recently, clinical standards have primarily relied on 2D images for the purpose of measurement, but with the advent of 3D processing, many of the manually measured metrics can be automated, leading to better reproducibility and less subjective measurement placement. Hence, we make use of this newly available 3D processing in our workflow. Our pipeline is used here to standardize the labeling, tracking, and quantifying of metrics for renal masses.
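    As a sketch of how 3D processing automates manually measured metrics, the following computes the volume and bounding-box extents of a binary segmentation mask. This is a hypothetical illustration; MIRRA's actual metrics and interfaces are not specified in the abstract.

```python
import numpy as np

def mask_metrics(mask, spacing):
    """Volume and bounding-box extents of a binary 3D segmentation mask.

    `spacing` is the voxel size along each axis; the volume is returned
    in the cubed unit of `spacing`."""
    voxel_vol = float(np.prod(spacing))
    volume = int(mask.sum()) * voxel_vol          # voxel count x voxel volume
    idx = np.argwhere(mask)                       # coordinates of mask voxels
    extents = (idx.max(axis=0) - idx.min(axis=0) + 1) * np.asarray(spacing)
    return volume, extents
```

    Because both numbers derive deterministically from the mask, they are reproducible across readers, unlike caliper-style 2D measurements.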

  13. Color variance in PDF-based production workflow environments

    Science.gov (United States)

    Riordan, Michael

    2006-02-01

    Based on the production practices of a representative sampling of graphic arts professionals, a series of tests was conducted to determine the potential color variance incurred during specific production-based PDF workflows. The impact of key production variables--including the use of ICC profiles, methods and settings used for PDF distillation, and printer/RIP color management handling for PDF rendering--was examined for RGB, CMYK and select spot colors to determine the potential magnitude of color variation under normal production conditions. The results of the study, quantified via paired comparison and delta E, showed that, while color variance could be kept to a minimum using very specific workflow configurations, significant color variation was incurred in many of the common workflow configurations representative of the production environments observed in the sample population. Further, even compliance with the PDF/X-1a and PDF/X-3 specifications allowed for unwanted variation, depending on the specific production activities that preceded or followed the creation of the PDF/X file.
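    Delta E, the metric used to quantify the variation, can be computed in its classic CIE76 form as the Euclidean distance between two colors in CIELAB space. This is a sketch; the abstract does not state which delta E variant the study used.

```python
def delta_e76(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two (L*, a*, b*)
    triples in CIELAB space."""
    return sum((a - b) ** 2 for a, b in zip(lab1, lab2)) ** 0.5
```

    Newer variants such as CIE94 and CIEDE2000 weight the lightness and chroma components differently; CIE76 is shown only because it is the simplest.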

  14. IT-benchmarking of clinical workflows: concept, implementation, and evaluation.

    Science.gov (United States)

    Thye, Johannes; Straede, Matthias-Christopher; Liebe, Jan-David; Hübner, Ursula

    2014-01-01

    Due to the emerging evidence of health IT as opportunity and risk for clinical workflows, health IT must undergo a continuous measurement of its efficacy and efficiency. IT-benchmarks are a proven means for providing this information. The aim of this study was to enhance the methodology of an existing benchmarking procedure by including, in particular, new indicators of clinical workflows and by proposing new types of visualisation. Drawing on the concept of information logistics, we propose four workflow descriptors that were applied to four clinical processes. General and specific indicators were derived from these descriptors and processes. 199 chief information officers (CIOs) took part in the benchmarking. These hospitals were assigned to reference groups of a similar size and ownership from a total of 259 hospitals. Stepwise and comprehensive feedback was given to the CIOs. Most participants who evaluated the benchmark rated the procedure as very good, good, or rather good (98.4%). Benchmark information was used by CIOs for getting a general overview, advancing IT, preparing negotiations with board members, and arguing for a new IT project. PMID:24825693

  15. Research on a dynamic workflow access control model

    Science.gov (United States)

    Liu, Yiliang; Deng, Jinxia

    2007-12-01

    In recent years, access control technology has been widely researched for workflow systems; two typical approaches are the RBAC (Role-Based Access Control) and TBAC (Task-Based Access Control) models, which have to a certain extent been used successfully for role authorization and assignment. However, as a system's structure becomes more complex, these two technologies cannot minimize privileges or separate duties, and they are inapplicable when users frequently request changes to the workflow's process. To avoid these weaknesses, a variable-flow dynamic role_task_view fine-grained access control model (DRTVBAC) is constructed on the basis of the existing models. Within this model, an algorithm is constructed to satisfy users' application requirements and security needs under the fine-grained principles of least privilege and dynamic separation of duties. The DRTVBAC model has been implemented in an actual system; the results show that dynamic management of the tasks associated with roles, together with role assignment, makes granting and revoking authority more flexible. The model satisfies the principle of least privilege by activating role permissions only for a specific task, separates authority from the completion of duties in the workflow, prevents disclosure of sensitive information through a concise and dynamic view interface, and satisfies the requirement of frequently varying task flows.
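    The idea of activating a role's permissions only while a specific task is active (task-scoped least privilege) can be sketched as follows. The class and method names are hypothetical illustrations, not taken from the DRTVBAC paper.

```python
class TaskSession:
    """Toy task-scoped authorization: a permission is granted only while
    the user has activated a role *for a specific task* (least privilege)."""

    def __init__(self, role_perms):
        self.role_perms = role_perms   # role -> {task: {permissions}}
        self.active = {}               # user -> (role, task)

    def activate(self, user, role, task):
        self.active[user] = (role, task)

    def deactivate(self, user):
        self.active.pop(user, None)    # permissions vanish with the task

    def check(self, user, perm):
        role, task = self.active.get(user, (None, None))
        return perm in self.role_perms.get(role, {}).get(task, set())
```

    Deactivating the task revokes the permission automatically, which is the key difference from plain RBAC, where role permissions persist across tasks.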

  16. Layered Workflow Process Model Based on Extended Synchronizer

    Directory of Open Access Journals (Sweden)

    Gang Ni

    2014-07-01

    Full Text Available The layered workflow process model provides a modeling and analysis approach for key processes based on Petri nets. It not only describes the relation between business-flow processes and transition nodes clearly, but also limits the rapid growth in the number of places, transitions and directed arcs. This paper studies processes such as reservation and complaint handling in information management systems, especially the multi-merge and discriminator patterns, which cannot be modeled directly with existing synchronizers. Petri nets are adopted to provide a formal description of the workflow patterns, and the relation between arcs and weight classes is analyzed. Using the numbers of incoming and outgoing arcs, we generalize workflows into three synchronization modes: fully synchronous mode, competition-synchronous mode and asynchronous mode. Types and parameters for synchronization are added to extend the modeling ability of the synchronizers, and the synchronization distance is also expanded. The extended synchronizers are able to terminate branches automatically or to activate the next link several times, in addition to the abilities of the original synchronizers. Analyses of key business cases verify that the original synchronizers cannot model these patterns directly, while the extended synchronizers based on Petri nets can model the multi-merge and discriminator patterns.
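    The place/transition firing semantics underlying such models can be sketched with a minimal Petri net engine. This is an illustrative toy, not the paper's extended synchronizers.

```python
class PetriNet:
    """Minimal place/transition net with weighted arcs."""

    def __init__(self, marking):
        self.marking = dict(marking)   # place -> token count
        self.transitions = {}          # name -> (input arcs, output arcs)

    def add_transition(self, name, inputs, outputs):
        self.transitions[name] = (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= w for p, w in inputs.items())

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"transition {name!r} is not enabled")
        inputs, outputs = self.transitions[name]
        for p, w in inputs.items():
            self.marking[p] -= w       # consume input tokens
        for p, w in outputs.items():
            self.marking[p] = self.marking.get(p, 0) + w  # produce outputs
```

    A fully synchronous join is then a transition whose input arcs cover all incoming branches; the competition-synchronous and asynchronous modes relax that condition.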

  17. Optimal Workflow Scheduling in Critical Infrastructure Systems with Neural Networks

    Directory of Open Access Journals (Sweden)

    S. Vukmirović

    2012-04-01

    Full Text Available Critical infrastructure systems (CISs), such as power grids, transportation systems, communication networks and water systems, are the backbone of a country’s national security and industrial prosperity. These CISs execute large numbers of workflows with very high resource requirements that can span different systems and last for a long time. The proper functioning and synchronization of these workflows are essential, since human well-being depends on them; as a result, ensuring the availability and reliability of these services over a broad range of operating conditions is very challenging. This paper proposes an architecture that dynamically executes a scheduling algorithm using feedback about the current status of CIS nodes. Different artificial neural networks (ANNs) were created in order to solve the scheduling problem. Their performances were compared, and as the main result of this paper, an optimal ANN architecture for workflow scheduling in CISs is proposed. A case study is shown for a meter data management system with measurements from a power distribution management system in Serbia. Performance tests show that a significant improvement of the overall execution time can be achieved with ANNs.

  18. AnalyzeThis: An Analysis Workflow-Aware Storage System

    Energy Technology Data Exchange (ETDEWEB)

    Sim, Hyogi [ORNL; Kim, Youngjae [ORNL; Vazhkudai, Sudharshan S [ORNL; Tiwari, Devesh [ORNL; Anwar, Ali [Virginia Tech, Blacksburg, VA; Butt, Ali R [Virginia Tech, Blacksburg, VA; Ramakrishnan, Lavanya [Lawrence Berkeley National Laboratory (LBNL)

    2015-01-01

    The need for novel data analysis is urgent in the face of a data deluge from modern applications. Traditional approaches to data analysis incur significant data movement costs, moving data back and forth between the storage system and the processor. Emerging Active Flash devices enable processing on the flash, where the data already resides. An array of such Active Flash devices allows us to revisit how analysis workflows interact with storage systems. By seamlessly blending together the flash storage and data analysis, we create an analysis workflow-aware storage system, AnalyzeThis. Our guiding principle is that analysis-awareness be deeply ingrained in each and every layer of the storage, elevating data analyses as first-class citizens, and transforming AnalyzeThis into a potent analytics-aware appliance. We implement the AnalyzeThis storage system atop an emulation platform of the Active Flash array. Our results indicate that AnalyzeThis is viable, expediting workflow execution and minimizing data movement.

  19. Flexible End2End Workflow Automation of Hit-Discovery Research.

    Science.gov (United States)

    Holzmüller-Laue, Silke; Göde, Bernd; Thurow, Kerstin

    2014-08-01

    The article considers a new approach to more complex laboratory automation at the workflow layer. The authors propose the automation of end2end workflows. The combination of all relevant subprocesses, whether automated or performed manually, and independently of the organizational unit in which they take place, results in end2end processes that include all result dependencies. The end2end approach focuses not only on the classical experiments in synthesis or screening, but also on auxiliary processes such as the production and storage of chemicals, cell culturing, and maintenance, as well as preparatory activities and analyses of experiments. Furthermore, the connection of control flow and data flow in the same process model reduces the effort of data transfer between the involved systems, including the necessary data transformations. This end2end laboratory automation can be realized effectively with the modern methods of business process management (BPM). The approach is based on the new standardization of the process-modeling notation, Business Process Model and Notation 2.0. In drug discovery, several scientific disciplines act together, with manifold modern methods, technologies, and a wide range of automated instruments for the discovery and design of target-based drugs. The article discusses the novel BPM-based automation concept with an implemented example of a high-throughput screening of previously synthesized compound libraries. PMID:24464814

  20. MCRUNJOB: A High energy physics workflow planner for grid production processing

    International Nuclear Information System (INIS)

    McRunjob is a powerful grid workflow manager used to manage the generation of large numbers of production processing jobs in High Energy Physics. In use at both the DZero and CMS experiments, McRunjob has been used to manage large Monte Carlo production processing since 1999 and is being extended to uses in regular production processing for analysis and reconstruction. Described at CHEP 2001, McRunjob converts core metadata into jobs submittable in a variety of environments. The powerful core metadata description language includes methods for converting the metadata into persistent forms, job descriptions, multi-step workflows, and data provenance information. The language features allow for structure in the metadata by including full expressions, namespaces, functional dependencies, site specific parameters in a grid environment, and ontological definitions. It also has simple control structures for parallelization of large jobs. McRunjob features a modular design which allows for easy expansion to new job description languages or new application level tasks

  1. MCRUNJOB: A High energy physics workflow planner for grid production processing

    Energy Technology Data Exchange (ETDEWEB)

    Graham, Gregory E.

    2004-08-26

    McRunjob is a powerful grid workflow manager used to manage the generation of large numbers of production processing jobs in High Energy Physics. In use at both the DZero and CMS experiments, McRunjob has been used to manage large Monte Carlo production processing since 1999 and is being extended to uses in regular production processing for analysis and reconstruction. Described at CHEP 2001, McRunjob converts core metadata into jobs submittable in a variety of environments. The powerful core metadata description language includes methods for converting the metadata into persistent forms, job descriptions, multi-step workflows, and data provenance information. The language features allow for structure in the metadata by including full expressions, namespaces, functional dependencies, site specific parameters in a grid environment, and ontological definitions. It also has simple control structures for parallelization of large jobs. McRunjob features a modular design which allows for easy expansion to new job description languages or new application level tasks.

  2. Inter-laboratory evaluation of instrument platforms and experimental workflows for quantitative accuracy and reproducibility assessment

    Directory of Open Access Journals (Sweden)

    Andrew J. Percy

    2015-09-01

    Full Text Available The reproducibility of plasma protein quantitation between laboratories and between instrument types was examined in a large-scale international study involving 16 laboratories and 19 LC–MS/MS platforms, using two kits designed to evaluate instrument performance and one kit designed to evaluate the entire bottom-up workflow. There was little effect of instrument type on the quality of the results, demonstrating the robustness of LC/MRM-MS with isotopically labeled standards. Technician skill was a factor, as errors in sample preparation and sub-optimal LC–MS performance were evident. This highlights the importance of proper training and routine quality control before quantitation is done on patient samples.

  3. Considering Time in Orthophotography Production: from a General Workflow to a Shortened Workflow for a Faster Disaster Response

    Science.gov (United States)

    Lucas, G.

    2015-08-01

    This article deals with the production time of orthophoto imagery acquired with a medium-size digital frame camera. The workflow examination follows two main parts: data acquisition and post-processing. The objectives of the research are fourfold: 1/ gathering time references for the most important steps of orthophoto production (it turned out that literature is missing on this topic); these figures are used later for total production time estimation; 2/ identifying levers for reducing orthophoto production time; 3/ building a simplified production workflow for emergency response, less demanding with regard to accuracy and faster, and comparing it to a classical workflow; 4/ providing methodical elements for the estimation of production time for a custom project. In the data acquisition part, a comprehensive review lists and describes all the factors that may affect acquisition efficiency. Using a simulation with different variables (average line length, duration of the turns, flight speed), their effect on acquisition efficiency is quantitatively examined. Regarding post-processing, the time reference figures were collected from the processing of a 1000-frame case study with 15 cm GSD covering a rectangular area of 447 km2; the time required to achieve each step during the production is written down. When several technical options are possible, each one is tested and its time documented so that all alternatives are available. Based on a technical choice for the workflow and using the compiled time references of the elementary steps, a total time is calculated for the post-processing of the 1000 frames. Two scenarios are compared with regard to time and accuracy. The first one follows the "normal" practices, comprising triangulation, orthorectification and advanced mosaicking methods (feature detection, seam line editing and seam applicator); the second is simplified and makes compromises on positional accuracy (using direct geo-referencing) and seamline preparation in order to achieve
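    The acquisition-efficiency simulation described in this record (average line length, turn duration, flight speed) amounts to simple arithmetic; a toy version might look like the following. The function name and parameters are illustrative assumptions, not the article's model.

```python
def acquisition_time(n_lines, line_length_km, speed_kmh, turn_time_min):
    """Toy photo-flight duration estimate: fly every line at constant
    speed and make one turn between consecutive lines. Returns minutes."""
    fly_min = n_lines * (line_length_km / speed_kmh) * 60.0
    turn_min = (n_lines - 1) * turn_time_min
    return fly_min + turn_min
```

    Doubling the flight speed halves the flying time but leaves the turn overhead unchanged, which is why line length and turn duration act as separate levers on acquisition efficiency.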

  4. A practical data processing workflow for multi-OMICS projects.

    Science.gov (United States)

    Kohl, Michael; Megger, Dominik A; Trippler, Martin; Meckel, Hagen; Ahrens, Maike; Bracht, Thilo; Weber, Frank; Hoffmann, Andreas-Claudius; Baba, Hideo A; Sitek, Barbara; Schlaak, Jörg F; Meyer, Helmut E; Stephan, Christian; Eisenacher, Martin

    2014-01-01

    Multi-OMICS approaches aim at the integration of quantitative data obtained for different biological molecules in order to understand their interrelation and the functioning of larger systems. This paper deals with several data integration and data processing issues that frequently occur within this context. To this end, the data processing workflow within the PROFILE project is presented, a multi-OMICS project that aims at the identification of novel biomarkers and the development of new therapeutic targets for seven important liver diseases. Furthermore, software called CrossPlatformCommander is sketched, which facilitates several steps of the proposed workflow in a semi-automatic manner. Application of the software is presented for the detection of novel biomarkers, their ranking and annotation with existing knowledge, using the example of corresponding Transcriptomics and Proteomics data sets obtained from patients suffering from hepatocellular carcinoma. Additionally, a linear regression analysis of Transcriptomics vs. Proteomics data is presented and its performance assessed. It was shown that for capturing profound relations between Transcriptomics and Proteomics data, a simple linear regression analysis is not sufficient, and implementation and evaluation of alternative statistical approaches are needed. Additionally, the integration of multivariate variable selection and classification approaches is intended for further development of the software. Although this paper focuses only on the combination of data obtained from quantitative Proteomics and Transcriptomics experiments, several approaches and data integration steps are also applicable to other OMICS technologies. Keeping specific restrictions in mind, the suggested workflow (or at least parts of it) may be used as a template for similar projects that make use of different high throughput techniques. This article is part of a Special Issue entitled: Computational Proteomics in the Post
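    A linear regression of Transcriptomics vs. Proteomics values of the kind assessed in the paper can be sketched as a generic least-squares fit; this is not the PROFILE implementation.

```python
import numpy as np

def regress(x, y):
    """Least-squares fit y = a*x + b plus the Pearson correlation r."""
    a, b = np.polyfit(x, y, 1)      # slope and intercept
    r = np.corrcoef(x, y)[0, 1]     # Pearson correlation coefficient
    return a, b, r
```

    A low r on real transcript/protein pairs is precisely the observation that motivates the paper's conclusion that a simple linear model is not sufficient.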

  5. An iterative workflow for mining the human intestinal metaproteome

    Directory of Open Access Journals (Sweden)

    Beauvallet Christian

    2011-01-01

    Full Text Available Abstract Background Peptide spectrum matching (PSM) is the standard method in shotgun proteomics data analysis. It relies on the availability of an accurate and complete sample proteome that is used to make interpretation of the spectra feasible. Although this procedure has proven effective in many proteomics studies, the approach has limitations when applied to complex samples of microbial communities, such as those found in the human intestinal tract. Metagenome studies have indicated that the human intestinal microbiome contains over 100 times more genes than the human genome, and it has been estimated that this ecosystem contains over 5000 bacterial species. The genomes of the vast majority of these species have not yet been sequenced and hence their proteomes remain unknown. To enable analysis of shotgun proteomics data using PSM, and to circumvent the lack of a defined matched metaproteome, an iterative workflow was developed that is based on a synthetic metaproteome and on developing metagenomic databases that are both representative of, but not necessarily originating from, the sample of interest. Results Two human fecal samples for which metagenomic data had been collected were analyzed for their metaproteome using liquid chromatography-mass spectrometry and used to benchmark the developed iterative workflow against other methods. The results show that the developed method is able to detect over 3,000 peptides per fecal sample from the spectral data by circumventing the lack of a defined proteome, without naive translation of matched metagenomes or cross-species peptide identification. Conclusions The developed iterative workflow achieved an approximately two-fold increase in the amount of identified spectra at a false discovery rate of 1% and can be applied in metaproteomic studies of the human intestinal tract or other complex ecosystems.
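    The core idea of the iterative workflow, searching against a broad database and then re-searching against the reduced set of matched entries, can be sketched as a toy two-pass matcher. The names and the substring `match` predicate are illustrative; real pipelines add FDR control and rescoring at each pass.

```python
def two_pass_search(spectra, database, match):
    """Toy two-pass peptide-spectrum matching: pass 1 against the full
    database collects candidate proteins; pass 2 re-searches only against
    that reduced database."""
    candidates = {p for s in spectra for p in database if match(s, p)}
    reduced = [p for p in database if p in candidates]       # smaller search space
    return [(s, p) for s in spectra for p in reduced if match(s, p)]
```

    Shrinking the database between passes is what raises sensitivity at a fixed false discovery rate: the score threshold needed to reach 1% FDR drops when fewer decoy matches compete.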

  6. Leveraging an existing data warehouse to annotate workflow models for operations research and optimization.

    Science.gov (United States)

    Borlawsky, Tara; LaFountain, Jeanne; Petty, Lynda; Saltz, Joel H; Payne, Philip R O

    2008-01-01

    Workflow analysis is frequently performed in the context of operations research and process optimization. In order to develop a data-driven workflow model that can be employed to assess opportunities to improve the efficiency of perioperative care teams at The Ohio State University Medical Center (OSUMC), we have developed a method for integrating standard workflow modeling formalisms, such as UML activity diagrams with data-centric annotations derived from our existing data warehouse. PMID:18999220

  7. Introducing OAWAL: crowdsourcing best practices for open access workflows in academic libraries

    OpenAIRE

    Emery, Jill; Stone, Graham

    2014-01-01

    Currently in the formative stage, the intent of OAWAL is to create an openly accessible wiki for librarians working on the management of open access workflows within their given institutions. At this point, the team has developed significant areas of focus for workflow management and will build upon the current structure as informed through in-person and online crowdsourcing. The current sections to be developed are: advocacy, Creative Commons, the Library as Publisher, standards, workflow...

  8. Context-oriented scientific workflow system and its application in virtual screening

    OpenAIRE

    Fan, Xiaoliang; Brézillon, Patrick; Zhang, Ruisheng; Li, Lian

    2010-01-01

    Scientific workflow (SWF) systems are gradually liberating computational scientists from the burden of data-centric operations so that they can concentrate on decision making. However, contemporary SWF systems fail to address what varies when scientists seek to deliver new outcomes by reproducing a workflow: not only the workflow representation, but also its "context" of use. This failure is mainly due to the lack of means for representing and managing that "context". We propose a context-or...

  9. An event- and repository-based component framework for workflow system architecture

    OpenAIRE

    Tombros, Dimitrios

    1999-01-01

    During the past decade a new class of systems has emerged which plays an important role in the support of efficient business process implementation: workflow systems. Despite their proliferation, however, workflow systems are still being developed in an ad hoc way, without making use of advanced software engineering technologies such as component-based system development and reuse of architecture artifacts. This work proposes a modern approach to workflow system construction. The approach is ce...

  10. The Workflow Specification of Process Definition%工作流过程定义规范

    Institute of Scientific and Technical Information of China (English)

    缪晓阳; 石文俊; 吴朝晖

    2000-01-01

    This paper discusses the formal representation of a business process. There are three basic aspects: the concept of workflow process definition, on which the idea of process definition interchange is based; the workflow meta-model, which is used to describe the entities within the process definition and their attributes; and the workflow process definition language (WPDL), which is used to implement the process definition.
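    The meta-model aspect, entities and attributes of a process definition, can be illustrated with a minimal sketch. The entity names follow the WfMC meta-model in spirit; the classes below are illustrative, not WPDL syntax.

```python
from dataclasses import dataclass, field

@dataclass
class Activity:
    id: str                       # a unit of work in the process

@dataclass
class Transition:
    source: str                   # id of the activity the flow leaves
    target: str                   # id of the activity the flow enters

@dataclass
class Process:
    id: str
    activities: dict = field(default_factory=dict)
    transitions: list = field(default_factory=list)

    def add_activity(self, a):
        self.activities[a.id] = a

    def add_transition(self, t):
        # a transition may only connect activities already in the process
        assert t.source in self.activities and t.target in self.activities
        self.transitions.append(t)

    def successors(self, aid):
        return [t.target for t in self.transitions if t.source == aid]
```

    An interchange format such as WPDL serializes exactly these entities, which is what makes process definitions portable between workflow engines.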

  11. Pegasus: A Framework for Mapping Complex Scientific Workflows onto Distributed Systems

    OpenAIRE

    Ewa Deelman; Gurmeet Singh; Mei-Hui Su; James Blythe; Yolanda Gil; Carl Kesselman; Gaurang Mehta; Karan Vahi; G. Bruce Berriman; John Good; Anastasia Laity; Jacob, Joseph C.; Katz, Daniel S.

    2005-01-01

    This paper describes the Pegasus framework that can be used to map complex scientific workflows onto distributed resources. Pegasus enables users to represent the workflows at an abstract level without needing to worry about the particulars of the target execution systems. The paper describes general issues in mapping applications and the functionality of Pegasus. We present the results of improving application performance through workflow restructuring which clusters multiple tasks in a work...

  12. Bandwidth-Aware Scheduling of Workflow Application on Multiple Grid Sites

    OpenAIRE

    Harshadkumar B. Prajapati; Shah, Vipul A.

    2014-01-01

    Bandwidth-aware workflow scheduling is required to improve the performance of a workflow application in a multisite Grid environment, as the data movement cost between two low-bandwidth sites can adversely affect the makespan of the application. Pegasus WMS, an open-source and freely available WMS, cannot fully exploit its workflow-mapping capability because no bandwidth monitoring infrastructure is integrated with it. This paper develops the integration of Network Weather Se...

  13. High performance workflow implementation for protein surface characterization using grid technology

    OpenAIRE

    Clematis Andrea; D'Agostino Daniele; Morra Giulia; Merelli Ivan; Milanesi Luciano

    2005-01-01

    Abstract Background This study concerns the development of a high performance workflow that, using grid technology, correlates different kinds of Bioinformatics data, starting from the base pairs of the nucleotide sequence to the exposed residues of the protein surface. The implementation of this workflow is based on the Italian Grid.it project infrastructure, that is a network of several computational resources and storage facilities distributed at different grid sites. Methods Workflows are...

  14. A MVC framework for policy-based adaptation of workflow processes: A case study on confidentiality

    OpenAIRE

    Geebelen, Kristof; Kulikowski, Eryk; Truyen, Eddy; Joosen, Wouter

    2010-01-01

    Most work on adaptive workflows offers insufficient flexibility to enforce complex policies regarding dynamic, evolvable and robust workflows. In addition, many proposed approaches require customized workflow engines. This paper presents a portable framework for realistic enforcement of dynamic adaptation policies in business processes. The framework is based on the Model-View-Controller (MVC) pattern, commonly used for adding dynamism to web pages. To enhance reusability, our approach suppor...

  15. Using a suite of ontologies for preserving workflow-centric research objects

    OpenAIRE

    Belhajjame, Khalid; Zhao, Jun; Klyne, Graham; Goble, Carole; Garijo, Daniel; Gamble, Matthew; Hettne, Kristina; Palma, Raul; Mina, Eleni; Corcho, Oscar; Gómez-Pérez, José Manuel; Bechhofer, Sean

    2015-01-01

    Scientific workflows are a popular mechanism for specifying and automating data-driven in silico experiments. A significant aspect of their value lies in their potential to be reused. Once shared, workflows become useful building blocks that can be combined or modified for developing new experiments. However, previous studies have shown that storing workflow specifications alone is not sufficient to ensure that they can be successfully reused, without being able to understand what the workflo...

  16. CMS Data Processing Workflows during an Extended Cosmic Ray Run

    CERN Document Server

    Chatrchyan, S; Sirunyan, A M; Adam, W; Arnold, B; Bergauer, H; Bergauer, T; Dragicevic, M; Eichberger, M; Erö, J; Friedl, M; Frühwirth, R; Ghete, V M; Hammer, J; Hänsel, S; Hoch, M; Hörmann, N; Hrubec, J; Jeitler, M; Kasieczka, G; Kastner, K; Krammer, M; Liko, D; Magrans de Abril, I; Mikulec, I; Mittermayr, F; Neuherz, B; Oberegger, M; Padrta, M; Pernicka, M; Rohringer, H; Schmid, S; Schöfbeck, R; Schreiner, T; Stark, R; Steininger, H; Strauss, J; Taurok, A; Teischinger, F; Themel, T; Uhl, D; Wagner, P; Waltenberger, W; Walzel, G; Widl, E; Wulz, C E; Chekhovsky, V; Dvornikov, O; Emeliantchik, I; Litomin, A; Makarenko, V; Marfin, I; Mossolov, V; Shumeiko, N; Solin, A; Stefanovitch, R; Suarez Gonzalez, J; Tikhonov, A; Fedorov, A; Karneyeu, A; Korzhik, M; Panov, V; Zuyeuski, R; Kuchinsky, P; Beaumont, W; Benucci, L; Cardaci, M; De Wolf, E A; Delmeire, E; Druzhkin, D; Hashemi, M; Janssen, X; Maes, T; Mucibello, L; Ochesanu, S; Rougny, R; Selvaggi, M; Van Haevermaet, H; Van Mechelen, P; Van Remortel, N; Adler, V; Beauceron, S; Blyweert, S; D'Hondt, J; De Weirdt, S; Devroede, O; Heyninck, J; Kalogeropoulos, A; Maes, J; Maes, M; Mozer, M U; Tavernier, S; Van Doninck, W; Van Mulders, P; Villella, I; Bouhali, O; Chabert, E C; Charaf, O; Clerbaux, B; De Lentdecker, G; Dero, V; Elgammal, S; Gay, A P R; Hammad, G H; Marage, P E; Rugovac, S; Vander Velde, C; Vanlaer, P; Wickens, J; Grunewald, M; Klein, B; Marinov, A; Ryckbosch, D; Thyssen, F; Tytgat, M; Vanelderen, L; Verwilligen, P; Basegmez, S; Bruno, G; Caudron, J; Delaere, C; Demin, P; Favart, D; Giammanco, A; Grégoire, G; Lemaitre, V; Militaru, O; Ovyn, S; Piotrzkowski, K; Quertenmont, L; Schul, N; Beliy, N; Daubie, E; Alves, G A; Pol, M E; Souza, M H G; Carvalho, W; De Jesus Damiao, D; De Oliveira Martins, C; Fonseca De Souza, S; Mundim, L; Oguri, V; Santoro, A; Silva Do Amaral, S M; Sznajder, A; Fernandez Perez Tomei, T R; Ferreira Dias, M A; Gregores, E M; Novaes, S F; Abadjiev, K; Anguelov, T; Damgov, J; Darmenov, 
N; Dimitrov, L; Genchev, V; Iaydjiev, P; Piperov, S; Stoykova, S; Sultanov, G; Trayanov, R; Vankov, I; Dimitrov, A; Dyulendarova, M; Kozhuharov, V; Litov, L; Marinova, E; Mateev, M; Pavlov, B; Petkov, P; Toteva, Z; Chen, G M; Chen, H S; Guan, W; Jiang, C H; Liang, D; Liu, B; Meng, X; Tao, J; Wang, J; Wang, Z; Xue, Z; Zhang, Z; Ban, Y; Cai, J; Ge, Y; Guo, S; Hu, Z; Mao, Y; Qian, S J; Teng, H; Zhu, B; Avila, C; Baquero Ruiz, M; Carrillo Montoya, C A; Gomez, A; Gomez Moreno, B; Ocampo Rios, A A; Osorio Oliveros, A F; Reyes Romero, D; Sanabria, J C; Godinovic, N; Lelas, K; Plestina, R; Polic, D; Puljak, I; Antunovic, Z; Dzelalija, M; Brigljevic, V; Duric, S; Kadija, K; Morovic, S; Fereos, R; Galanti, M; Mousa, J; Papadakis, A; Ptochos, F; Razis, P A; Tsiakkouri, D; Zinonos, Z; Hektor, A; Kadastik, M; Kannike, K; Müntel, M; Raidal, M; Rebane, L; Anttila, E; Czellar, S; Härkönen, J; Heikkinen, A; Karimäki, V; Kinnunen, R; Klem, J; Kortelainen, M J; Lampén, T; Lassila-Perini, K; Lehti, S; Lindén, T; Luukka, P; Mäenpää, T; Nysten, J; Tuominen, E; Tuominiemi, J; Ungaro, D; Wendland, L; Banzuzi, K; Korpela, A; Tuuva, T; Nedelec, P; Sillou, D; Besancon, M; Chipaux, R; Dejardin, M; Denegri, D; Descamps, J; Fabbro, B; Faure, J L; Ferri, F; Ganjour, S; Gentit, F X; Givernaud, A; Gras, P; Hamel de Monchenault, G; Jarry, P; Lemaire, M C; Locci, E; Malcles, J; Marionneau, M; Millischer, L; Rander, J; Rosowsky, A; Rousseau, D; Titov, M; Verrecchia, P; Baffioni, S; Bianchini, L; Bluj, M; Busson, P; Charlot, C; Dobrzynski, L; Granier de Cassagnac, R; Haguenauer, M; Miné, P; Paganini, P; Sirois, Y; Thiebaux, C; Zabi, A; Agram, J L; Besson, A; Bloch, D; Bodin, D; Brom, J M; Conte, E; Drouhin, F; Fontaine, J C; Gelé, D; Goerlach, U; Gross, L; Juillot, P; Le Bihan, A C; Patois, Y; Speck, J; Van Hove, P; Baty, C; Bedjidian, M; Blaha, J; Boudoul, G; Brun, H; Chanon, N; Chierici, R; Contardo, D; Depasse, P; Dupasquier, T; El Mamouni, H; Fassi, F; Fay, J; Gascon, S; Ille, B; Kurca, T; Le 
Grand, T; Lethuillier, M; Lumb, N; Mirabito, L; Perries, S; Vander Donckt, M; Verdier, P; Djaoshvili, N; Roinishvili, N; Roinishvili, V; Amaglobeli, N; Adolphi, R; Anagnostou, G; Brauer, R; Braunschweig, W; Edelhoff, M; Esser, H; Feld, L; Karpinski, W; Khomich, A; Klein, K; Mohr, N; Ostaptchouk, A; Pandoulas, D; Pierschel, G; Raupach, F; Schael, S; Schultz von Dratzig, A; Schwering, G; Sprenger, D; Thomas, M; Weber, M; Wittmer, B; Wlochal, M; Actis, O; Altenhöfer, G; Bender, W; Biallass, P; Erdmann, M; Fetchenhauer, G; Frangenheim, J; Hebbeker, T; Hilgers, G; Hinzmann, A; Hoepfner, K; Hof, C; Kirsch, M; Klimkovich, T; Kreuzer, P; Lanske, D; Merschmeyer, M; Meyer, A; Philipps, B; Pieta, H; Reithler, H; Schmitz, S A; Sonnenschein, L; Sowa, M; Steggemann, J; Szczesny, H; Teyssier, D; Zeidler, C; Bontenackels, M; Davids, M; Duda, M; Flügge, G; Geenen, H; Giffels, M; Haj Ahmad, W; Hermanns, T; Heydhausen, D; Kalinin, S; Kress, T; Linn, A; Nowack, A; Perchalla, L; Poettgens, M; Pooth, O; Sauerland, P; Stahl, A; Tornier, D; Zoeller, M H; Aldaya Martin, M; Behrens, U; Borras, K; Campbell, A; Castro, E; Dammann, D; Eckerlin, G; Flossdorf, A; Flucke, G; Geiser, A; Hatton, D; Hauk, J; Jung, H; Kasemann, M; Katkov, I; Kleinwort, C; Kluge, H; Knutsson, A; Kuznetsova, E; Lange, W; Lohmann, W; Mankel, R; Marienfeld, M; Meyer, A B; Miglioranzi, S; Mnich, J; Ohlerich, M; Olzem, J; Parenti, A; Rosemann, C; Schmidt, R; Schoerner-Sadenius, T; Volyanskyy, D; Wissing, C; Zeuner, W D; Autermann, C; Bechtel, F; Draeger, J; Eckstein, D; Gebbert, U; Kaschube, K; Kaussen, G; Klanner, R; Mura, B; Naumann-Emme, S; Nowak, F; Pein, U; Sander, C; Schleper, P; Schum, T; Stadie, H; Steinbrück, G; Thomsen, J; Wolf, R; Bauer, J; Blüm, P; Buege, V; Cakir, A; Chwalek, T; De Boer, W; Dierlamm, A; Dirkes, G; Feindt, M; Felzmann, U; Frey, M; Furgeri, A; Gruschke, J; Hackstein, C; Hartmann, F; Heier, S; Heinrich, M; Held, H; Hirschbuehl, D; Hoffmann, K H; Honc, S; Jung, C; Kuhr, T; Liamsuwan, T; Martschei, 
D; Mueller, S; Müller, Th; Neuland, M B; Niegel, M; Oberst, O; Oehler, A; Ott, J; Peiffer, T; Piparo, D; Quast, G; Rabbertz, K; Ratnikov, F; Ratnikova, N; Renz, M; Saout, C; Sartisohn, G; Scheurer, A; Schieferdecker, P; Schilling, F P; Schott, G; Simonis, H J; Stober, F M; Sturm, P; Troendle, D; Trunov, A; Wagner, W; Wagner-Kuhr, J; Zeise, M; Zhukov, V; Ziebarth, E B; Daskalakis, G; Geralis, T; Karafasoulis, K; Kyriakis, A; Loukas, D; Markou, A; Markou, C; Mavrommatis, C; Petrakou, E; Zachariadou, A; Gouskos, L; Katsas, P; Panagiotou, A; Evangelou, I; Kokkas, P; Manthos, N; Papadopoulos, I; Patras, V; Triantis, F A; Bencze, G; Boldizsar, L; Debreczeni, G; Hajdu, C; Hernath, S; Hidas, P; Horvath, D; Krajczar, K; Laszlo, A; Patay, G; Sikler, F; Toth, N; Vesztergombi, G; Beni, N; Christian, G; Imrek, J; Molnar, J; Novak, D; Palinkas, J; Szekely, G; Szillasi, Z; Tokesi, K; Veszpremi, V; Kapusi, A; Marian, G; Raics, P; Szabo, Z; Trocsanyi, Z L; Ujvari, B; Zilizi, G; Bansal, S; Bawa, H S; Beri, S B; Bhatnagar, V; Jindal, M; Kaur, M; Kaur, R; Kohli, J M; Mehta, M Z; Nishu, N; Saini, L K; Sharma, A; Singh, A; Singh, J B; Singh, S P; Ahuja, S; Arora, S; Bhattacharya, S; Chauhan, S; Choudhary, B C; Gupta, P; Jain, S; Jain, S; Jha, M; Kumar, A; Ranjan, K; Shivpuri, R K; Srivastava, A K; Choudhury, R K; Dutta, D; Kailas, S; Kataria, S K; Mohanty, A K; Pant, L M; Shukla, P; Topkar, A; Aziz, T; Guchait, M; Gurtu, A; Maity, M; Majumder, D; Majumder, G; Mazumdar, K; Nayak, A; Saha, A; Sudhakar, K; Banerjee, S; Dugad, S; Mondal, N K; Arfaei, H; Bakhshiansohi, H; Fahim, A; Jafari, A; Mohammadi Najafabadi, M; Moshaii, A; Paktinat Mehdiabadi, S; Rouhani, S; Safarzadeh, B; Zeinali, M; Felcini, M; Abbrescia, M; Barbone, L; Chiumarulo, F; Clemente, A; Colaleo, A; Creanza, D; Cuscela, G; De Filippis, N; De Palma, M; De Robertis, G; Donvito, G; Fedele, F; Fiore, L; Franco, M; Iaselli, G; Lacalamita, N; Loddo, F; Lusito, L; Maggi, G; Maggi, M; Manna, N; Marangelli, B; My, S; Natali, S; 
Nuzzo, S; Papagni, G; Piccolomo, S; Pierro, G A; Pinto, C; Pompili, A; Pugliese, G; Rajan, R; Ranieri, A; Romano, F; Roselli, G; Selvaggi, G; Shinde, Y; Silvestris, L; Tupputi, S; Zito, G; Abbiendi, G; Bacchi, W; Benvenuti, A C; Boldini, M; Bonacorsi, D; Braibant-Giacomelli, S; Cafaro, V D; Caiazza, S S; Capiluppi, P; Castro, A; Cavallo, F R; Codispoti, G; Cuffiani, M; D'Antone, I; Dallavalle, G M; Fabbri, F; Fanfani, A; Fasanella, D; Giacomelli, P; Giordano, V; Giunta, M; Grandi, C; Guerzoni, M; Marcellini, S; Masetti, G; Montanari, A; Navarria, F L; Odorici, F; Pellegrini, G; Perrotta, A; Rossi, A M; Rovelli, T; Siroli, G; Torromeo, G; Travaglini, R; Albergo, S; Costa, S; Potenza, R; Tricomi, A; Tuve, C; Barbagli, G; Broccolo, G; Ciulli, V; Civinini, C; D'Alessandro, R; Focardi, E; Frosali, S; Gallo, E; Genta, C; Landi, G; Lenzi, P; Meschini, M; Paoletti, S; Sguazzoni, G; Tropiano, A; Benussi, L; Bertani, M; Bianco, S; Colafranceschi, S; Colonna, D; Fabbri, F; Giardoni, M; Passamonti, L; Piccolo, D; Pierluigi, D; Ponzio, B; Russo, A; Fabbricatore, P; Musenich, R; Benaglia, A; Calloni, M; Cerati, G B; D'Angelo, P; De Guio, F; Farina, F M; Ghezzi, A; Govoni, P; Malberti, M; Malvezzi, S; Martelli, A; Menasce, D; Miccio, V; Moroni, L; Negri, P; Paganoni, M; Pedrini, D; Pullia, A; Ragazzi, S; Redaelli, N; Sala, S; Salerno, R; Tabarelli de Fatis, T; Tancini, V; Taroni, S; Buontempo, S; Cavallo, N; Cimmino, A; De Gruttola, M; Fabozzi, F; Iorio, A O M; Lista, L; Lomidze, D; Noli, P; Paolucci, P; Sciacca, C; Azzi, P; Bacchetta, N; Barcellan, L; Bellan, P; Bellato, M; Benettoni, M; Biasotto, M; Bisello, D; Borsato, E; Branca, A; Carlin, R; Castellani, L; Checchia, P; Conti, E; Dal Corso, F; De Mattia, M; Dorigo, T; Dosselli, U; Fanzago, F; Gasparini, F; Gasparini, U; Giubilato, P; Gonella, F; Gresele, A; Gulmini, M; Kaminskiy, A; Lacaprara, S; Lazzizzera, I; Margoni, M; Maron, G; Mattiazzo, S; Mazzucato, M; Meneghelli, M; Meneguzzo, A T; Michelotto, M; Montecassiano, F; 
Nespolo, M; Passaseo, M; Pegoraro, M; Perrozzi, L; Pozzobon, N; Ronchese, P; Simonetto, F; Toniolo, N; Torassa, E; Tosi, M; Triossi, A; Vanini, S; Ventura, S; Zotto, P; Zumerle, G; Baesso, P; Berzano, U; Bricola, S; Necchi, M M; Pagano, D; Ratti, S P; Riccardi, C; Torre, P; Vicini, A; Vitulo, P; Viviani, C; Aisa, D; Aisa, S; Babucci, E; Biasini, M; Bilei, G M; Caponeri, B; Checcucci, B; Dinu, N; Fanò, L; Farnesini, L; Lariccia, P; Lucaroni, A; Mantovani, G; Nappi, A; Piluso, A; Postolache, V; Santocchia, A; Servoli, L; Tonoiu, D; Vedaee, A; Volpe, R; Azzurri, P; Bagliesi, G; Bernardini, J; Berretta, L; Boccali, T; Bocci, A; Borrello, L; Bosi, F; Calzolari, F; Castaldi, R; Dell'Orso, R; Fiori, F; Foà, L; Gennai, S; Giassi, A; Kraan, A; Ligabue, F; Lomtadze, T; Mariani, F; Martini, L; Massa, M; Messineo, A; Moggi, A; Palla, F; Palmonari, F; Petragnani, G; Petrucciani, G; Raffaelli, F; Sarkar, S; Segneri, G; Serban, A T; Spagnolo, P; Tenchini, R; Tolaini, S; Tonelli, G; Venturi, A; Verdini, P G; Baccaro, S; Barone, L; Bartoloni, A; Cavallari, F; Dafinei, I; Del Re, D; Di Marco, E; Diemoz, M; Franci, D; Longo, E; Organtini, G; Palma, A; Pandolfi, F; Paramatti, R; Pellegrino, F; Rahatlou, S; Rovelli, C; Alampi, G; Amapane, N; Arcidiacono, R; Argiro, S; Arneodo, M; Biino, C; Borgia, M A; Botta, C; Cartiglia, N; Castello, R; Cerminara, G; Costa, M; Dattola, D; Dellacasa, G; Demaria, N; Dughera, G; Dumitrache, F; Graziano, A; Mariotti, C; Marone, M; Maselli, S; Migliore, E; Mila, G; Monaco, V; Musich, M; Nervo, M; Obertino, M M; Oggero, S; Panero, R; Pastrone, N; Pelliccioni, M; Romero, A; Ruspa, M; Sacchi, R; Solano, A; Staiano, A; Trapani, P P; Trocino, D; Vilela Pereira, A; Visca, L; Zampieri, A; Ambroglini, F; Belforte, S; Cossutti, F; Della Ricca, G; Gobbo, B; Penzo, A; Chang, S; Chung, J; Kim, D H; Kim, G N; Kong, D J; Park, H; Son, D C; Bahk, S Y; Song, S; Jung, S Y; Hong, B; Kim, H; Kim, J H; Lee, K S; Moon, D H; Park, S K; Rhee, H B; Sim, K S; Kim, J; Choi, M; 
Hahn, G; Park, I C; Choi, S; Choi, Y; Goh, J; Jeong, H; Kim, T J; Lee, J; Lee, S; Janulis, M; Martisiute, D; Petrov, P; Sabonis, T; Castilla Valdez, H; Sánchez Hernández, A; Carrillo Moreno, S; Morelos Pineda, A; Allfrey, P; Gray, R N C; Krofcheck, D; Bernardino Rodrigues, N; Butler, P H; Signal, T; Williams, J C; Ahmad, M; Ahmed, I; Ahmed, W; Asghar, M I; Awan, M I M; Hoorani, H R; Hussain, I; Khan, W A; Khurshid, T; Muhammad, S; Qazi, S; Shahzad, H; Cwiok, M; Dabrowski, R; Dominik, W; Doroba, K; Konecki, M; Krolikowski, J; Pozniak, K; Romaniuk, Ryszard; Zabolotny, W; Zych, P; Frueboes, T; Gokieli, R; Goscilo, L; Górski, M; Kazana, M; Nawrocki, K; Szleper, M; Wrochna, G; Zalewski, P; Almeida, N; Antunes Pedro, L; Bargassa, P; David, A; Faccioli, P; Ferreira Parracho, P G; Freitas Ferreira, M; Gallinaro, M; Guerra Jordao, M; Martins, P; Mini, G; Musella, P; Pela, J; Raposo, L; Ribeiro, P Q; Sampaio, S; Seixas, J; Silva, J; Silva, P; Soares, D; Sousa, M; Varela, J; Wöhri, H K; Altsybeev, I; Belotelov, I; Bunin, P; Ershov, Y; Filozova, I; Finger, M; Finger, M Jr; Golunov, A; Golutvin, I; Gorbounov, N; Kalagin, V; Kamenev, A; Karjavin, V; Konoplyanikov, V; Korenkov, V; Kozlov, G; Kurenkov, A; Lanev, A; Makankin, A; Mitsyn, V V; Moisenz, P; Nikonov, E; Oleynik, D; Palichik, V; Perelygin, V; Petrosyan, A; Semenov, R; Shmatov, S; Smirnov, V; Smolin, D; Tikhonenko, E; Vasil'ev, S; Vishnevskiy, A; Volodko, A; Zarubin, A; Zhiltsov, V; Bondar, N; Chtchipounov, L; Denisov, A; Gavrikov, Y; Gavrilov, G; Golovtsov, V; Ivanov, Y; Kim, V; Kozlov, V; Levchenko, P; Obrant, G; Orishchin, E; Petrunin, A; Shcheglov, Y; Shchetkovskiy, A; Sknar, V; Smirnov, I; Sulimov, V; Tarakanov, V; Uvarov, L; Vavilov, S; Velichko, G; Volkov, S; Vorobyev, A; Andreev, Yu; Anisimov, A; Antipov, P; Dermenev, A; Gninenko, S; Golubev, N; Kirsanov, M; Krasnikov, N; Matveev, V; Pashenkov, A; Postoev, V E; Solovey, A; Solovey, A; Toropin, A; Troitsky, S; Baud, A; Epshteyn, V; Gavrilov, V; Ilina, N; Kaftanov, 
V; Kolosov, V; Kossov, M; Krokhotin, A; Kuleshov, S; Oulianov, A; Safronov, G; Semenov, S; Shreyber, I; Stolin, V; Vlasov, E; Zhokin, A; Boos, E; Dubinin, M; Dudko, L; Ershov, A; Gribushin, A; Klyukhin, V; Kodolova, O; Lokhtin, I; Petrushanko, S; Sarycheva, L; Savrin, V; Snigirev, A; Vardanyan, I; Dremin, I; Kirakosyan, M; Konovalova, N; Rusakov, S V; Vinogradov, A; Akimenko, S; Artamonov, A; Azhgirey, I; Bitioukov, S; Burtovoy, V; Grishin, V; Kachanov, V; Konstantinov, D; Krychkine, V; Levine, A; Lobov, I; Lukanin, V; Mel'nik, Y; Petrov, V; Ryutin, R; Slabospitsky, S; Sobol, A; Sytine, A; Tourtchanovitch, L; Troshin, S; Tyurin, N; Uzunian, A; Volkov, A; Adzic, P; Djordjevic, M; Jovanovic, D; Krpic, D; Maletic, D; Puzovic, J; Smiljkovic, N; Aguilar-Benitez, M; Alberdi, J; Alcaraz Maestre, J; Arce, P; Barcala, J M; Battilana, C; Burgos Lazaro, C; Caballero Bejar, J; Calvo, E; Cardenas Montes, M; Cepeda, M; Cerrada, M; Chamizo Llatas, M; Clemente, F; Colino, N; Daniel, M; De La Cruz, B; Delgado Peris, A; Diez Pardos, C; Fernandez Bedoya, C; Fernández Ramos, J P; Ferrando, A; Flix, J; Fouz, M C; Garcia-Abia, P; Garcia-Bonilla, A C; Gonzalez Lopez, O; Goy Lopez, S; Hernandez, J M; Josa, M I; Marin, J; Merino, G; Molina, J; Molinero, A; Navarrete, J J; Oller, J C; Puerta Pelayo, J; Romero, L; Santaolalla, J; Villanueva Munoz, C; Willmott, C; Yuste, C; Albajar, C; Blanco Otano, M; de Trocóniz, J F; Garcia Raboso, A; Lopez Berengueres, J O; Cuevas, J; Fernandez Menendez, J; Gonzalez Caballero, I; Lloret Iglesias, L; Naves Sordo, H; Vizan Garcia, J M; Cabrillo, I J; Calderon, A; Chuang, S H; Diaz Merino, I; Diez Gonzalez, C; Duarte Campderros, J; Fernandez, M; Gomez, G; Gonzalez Sanchez, J; Gonzalez Suarez, R; Jorda, C; Lobelle Pardo, P; Lopez Virto, A; Marco, J; Marco, R; Martinez Rivero, C; Martinez Ruiz del Arbol, P; Matorras, F; Rodrigo, T; Ruiz Jimeno, A; Scodellaro, L; Sobron Sanudo, M; Vila, I; Vilar Cortabitarte, R; Abbaneo, D; Albert, E; Alidra, M; Ashby, S; 
Auffray, E; Baechler, J; Baillon, P; Ball, A H; Bally, S L; Barney, D; Beaudette, F; Bellan, R; Benedetti, D; Benelli, G; Bernet, C; Bloch, P; Bolognesi, S; Bona, M; Bos, J; Bourgeois, N; Bourrel, T; Breuker, H; Bunkowski, K; Campi, D; Camporesi, T; Cano, E; Cattai, A; Chatelain, J P; Chauvey, M; Christiansen, T; Coarasa Perez, J A; Conde Garcia, A; Covarelli, R; Curé, B; De Roeck, A; Delachenal, V; Deyrail, D; Di Vincenzo, S; Dos Santos, S; Dupont, T; Edera, L M; Elliott-Peisert, A; Eppard, M; Favre, M; Frank, N; Funk, W; Gaddi, A; Gastal, M; Gateau, M; Gerwig, H; Gigi, D; Gill, K; Giordano, D; Girod, J P; Glege, F; Gomez-Reino Garrido, R; Goudard, R; Gowdy, S; Guida, R; Guiducci, L; Gutleber, J; Hansen, M; Hartl, C; Harvey, J; Hegner, B; Hoffmann, H F; Holzner, A; Honma, A; Huhtinen, M; Innocente, V; Janot, P; Le Godec, G; Lecoq, P; Leonidopoulos, C; Loos, R; Lourenço, C; Lyonnet, A; Macpherson, A; Magini, N; Maillefaud, J D; Maire, G; Mäki, T; Malgeri, L; Mannelli, M; Masetti, L; Meijers, F; Meridiani, P; Mersi, S; Meschi, E; Meynet Cordonnier, A; Moser, R; Mulders, M; Mulon, J; Noy, M; Oh, A; Olesen, G; Onnela, A; Orimoto, T; Orsini, L; Perez, E; Perinic, G; Pernot, J F; Petagna, P; Petiot, P; Petrilli, A; Pfeiffer, A; Pierini, M; Pimiä, M; Pintus, R; Pirollet, B; Postema, H; Racz, A; Ravat, S; Rew, S B; Rodrigues Antunes, J; Rolandi, G; Rovere, M; Ryjov, V; Sakulin, H; Samyn, D; Sauce, H; Schäfer, C; Schlatter, W D; Schröder, M; Schwick, C; Sciaba, A; Segoni, I; Sharma, A; Siegrist, N; Siegrist, P; Sinanis, N; Sobrier, T; Sphicas, P; Spiga, D; Spiropulu, M; Stöckli, F; Traczyk, P; Tropea, P; Troska, J; Tsirou, A; Veillet, L; Veres, G I; Voutilainen, M; Wertelaers, P; Zanetti, M; Bertl, W; Deiters, K; Erdmann, W; Gabathuler, K; Horisberger, R; Ingram, Q; Kaestli, H C; König, S; Kotlinski, D; Langenegger, U; Meier, F; Renker, D; Rohe, T; Sibille, J; Starodumov, A; Betev, B; Caminada, L; Chen, Z; Cittolin, S; Da Silva Di Calafiori, D R; Dambach, S; Dissertori, G; 
Dittmar, M; Eggel, C; Eugster, J; Faber, G; Freudenreich, K; Grab, C; Hervé, A; Hintz, W; Lecomte, P; Luckey, P D; Lustermann, W; Marchica, C; Milenovic, P; Moortgat, F; Nardulli, A; Nessi-Tedaldi, F; Pape, L; Pauss, F; Punz, T; Rizzi, A; Ronga, F J; Sala, L; Sanchez, A K; Sawley, M C; Sordini, V; Stieger, B; Tauscher, L; Thea, A; Theofilatos, K; Treille, D; Trüb, P; Weber, M; Wehrli, L; Weng, J; Zelepoukine, S; Amsler, C; Chiochia, V; De Visscher, S; Regenfus, C; Robmann, P; Rommerskirchen, T; Schmidt, A; Tsirigkas, D; Wilke, L; Chang, Y H; Chen, E A; Chen, W T; Go, A; Kuo, C M; Li, S W; Lin, W; Bartalini, P; Chang, P; Chao, Y; Chen, K F; Hou, W S; Hsiung, Y; Lei, Y J; Lin, S W; Lu, R S; Schümann, J; Shiu, J G; Tzeng, Y M; Ueno, K; Velikzhanin, Y; Wang, C C; Wang, M; Adiguzel, A; Ayhan, A; Azman Gokce, A; Bakirci, M N; Cerci, S; Dumanoglu, I; Eskut, E; Girgis, S; Gurpinar, E; Hos, I; Karaman, T; Karaman, T; Kayis Topaksu, A; Kurt, P; Önengüt, G; Önengüt Gökbulut, G; Ozdemir, K; Ozturk, S; Polatöz, A; Sogut, K; Tali, B; Topakli, H; Uzun, D; Vergili, L N; Vergili, M; Akin, I V; Aliev, T; Bilmis, S; Deniz, M; Gamsizkan, H; Guler, A M; Öcalan, K; Serin, M; Sever, R; Surat, U E; Zeyrek, M; Deliomeroglu, M; Demir, D; Gülmez, E; Halu, A; Isildak, B; Kaya, M; Kaya, O; Ozkorucuklu, S; Sonmez, N; Levchuk, L; Lukyanenko, S; Soroka, D; Zub, S; Bostock, F; Brooke, J J; Cheng, T L; Cussans, D; Frazier, R; Goldstein, J; Grant, N; Hansen, M; Heath, G P; Heath, H F; Hill, C; Huckvale, B; Jackson, J; Mackay, C K; Metson, S; Newbold, D M; Nirunpong, K; Smith, V J; Velthuis, J; Walton, R; Bell, K W; Brew, C; Brown, R M; Camanzi, B; Cockerill, D J A; Coughlan, J A; Geddes, N I; Harder, K; Harper, S; Kennedy, B W; Murray, P; Shepherd-Themistocleous, C H; Tomalin, I R; Williams, J H; Womersley, W J; Worm, S D; Bainbridge, R; Ball, G; Ballin, J; Beuselinck, R; Buchmuller, O; Colling, D; Cripps, N; Davies, G; Della Negra, M; Foudas, C; Fulcher, J; Futyan, D; Hall, G; Hays, J; Iles, G; 
Karapostoli, G; MacEvoy, B C; Magnan, A M; Marrouche, J; Nash, J; Nikitenko, A; Papageorgiou, A; Pesaresi, M; Petridis, K; Pioppi, M; Raymond, D M; Rompotis, N; Rose, A; Ryan, M J; Seez, C; Sharp, P; Sidiropoulos, G; Stettler, M; Stoye, M; Takahashi, M; Tapper, A; Timlin, C; Tourneur, S; Vazquez Acosta, M; Virdee, T; Wakefield, S; Wardrope, D; Whyntie, T; Wingham, M; Cole, J E; Goitom, I; Hobson, P R; Khan, A; Kyberd, P; Leslie, D; Munro, C; Reid, I D; Siamitros, C; Taylor, R; Teodorescu, L; Yaselli, I; Bose, T; Carleton, M; Hazen, E; Heering, A H; Heister, A; John, J St; Lawson, P; Lazic, D; Osborne, D; Rohlf, J; Sulak, L; Wu, S; Andrea, J; Avetisyan, A; Bhattacharya, S; Chou, J P; Cutts, D; Esen, S; Kukartsev, G; Landsberg, G; Narain, M; Nguyen, D; Speer, T; Tsang, K V; Breedon, R; Calderon De La Barca Sanchez, M; Case, M; Cebra, D; Chertok, M; Conway, J; Cox, P T; Dolen, J; Erbacher, R; Friis, E; Ko, W; Kopecky, A; Lander, R; Lister, A; Liu, H; Maruyama, S; Miceli, T; Nikolic, M; Pellett, D; Robles, J; Searle, M; Smith, J; Squires, M; Stilley, J; Tripathi, M; Vasquez Sierra, R; Veelken, C; Andreev, V; Arisaka, K; Cline, D; Cousins, R; Erhan, S; Hauser, J; Ignatenko, M; Jarvis, C; Mumford, J; Plager, C; Rakness, G; Schlein, P; Tucker, J; Valuev, V; Wallny, R; Yang, X; Babb, J; Bose, M; Chandra, A; Clare, R; Ellison, J A; Gary, J W; Hanson, G; Jeng, G Y; Kao, S C; Liu, F; Liu, H; Luthra, A; Nguyen, H; Pasztor, G; Satpathy, A; Shen, B C; Stringer, R; Sturdy, J; Sytnik, V; Wilken, R; Wimpenny, S; Branson, J G; Dusinberre, E; Evans, D; Golf, F; Kelley, R; Lebourgeois, M; Letts, J; Lipeles, E; Mangano, B; Muelmenstaedt, J; Norman, M; Padhi, S; Petrucci, A; Pi, H; Pieri, M; Ranieri, R; Sani, M; Sharma, V; Simon, S; Würthwein, F; Yagil, A; Campagnari, C; D'Alfonso, M; Danielson, T; Garberson, J; Incandela, J; Justus, C; Kalavase, P; Koay, S A; Kovalskyi, D; Krutelyov, V; Lamb, J; Lowette, S; Pavlunin, V; Rebassoo, F; Ribnik, J; Richman, J; Rossin, R; Stuart, D; To, W; 
Vlimant, J R; Witherell, M; Apresyan, A; Bornheim, A; Bunn, J; Chiorboli, M; Gataullin, M; Kcira, D; Litvine, V; Ma, Y; Newman, H B; Rogan, C; Timciuc, V; Veverka, J; Wilkinson, R; Yang, Y; Zhang, L; Zhu, K; Zhu, R Y; Akgun, B; Carroll, R; Ferguson, T; Jang, D W; Jun, S Y; Paulini, M; Russ, J; Terentyev, N; Vogel, H; Vorobiev, I; Cumalat, J P; Dinardo, M E; Drell, B R; Ford, W T; Heyburn, B; Luiggi Lopez, E; Nauenberg, U; Stenson, K; Ulmer, K; Wagner, S R; Zang, S L; Agostino, L; Alexander, J; Blekman, F; Cassel, D; Chatterjee, A; Das, S; Gibbons, L K; Heltsley, B; Hopkins, W; Khukhunaishvili, A; Kreis, B; Kuznetsov, V; Patterson, J R; Puigh, D; Ryd, A; Shi, X; Stroiney, S; Sun, W; Teo, W D; Thom, J; Vaughan, J; Weng, Y; Wittich, P; Beetz, C P; Cirino, G; Sanzeni, C; Winn, D; Abdullin, S; Afaq, M A; Albrow, M; Ananthan, B; Apollinari, G; Atac, M; Badgett, W; Bagby, L; Bakken, J A; Baldin, B; Banerjee, S; Banicz, K; Bauerdick, L A T; Beretvas, A; Berryhill, J; Bhat, P C; Biery, K; Binkley, M; Bloch, I; Borcherding, F; Brett, A M; Burkett, K; Butler, J N; Chetluru, V; Cheung, H W K; Chlebana, F; Churin, I; Cihangir, S; Crawford, M; Dagenhart, W; Demarteau, M; Derylo, G; Dykstra, D; Eartly, D P; Elias, J E; Elvira, V D; Evans, D; Feng, L; Fischler, M; Fisk, I; Foulkes, S; Freeman, J; Gartung, P; Gottschalk, E; Grassi, T; Green, D; Guo, Y; Gutsche, O; Hahn, A; Hanlon, J; Harris, R M; Holzman, B; Howell, J; Hufnagel, D; James, E; Jensen, H; Johnson, M; Jones, C D; Joshi, U; Juska, E; Kaiser, J; Klima, B; Kossiakov, S; Kousouris, K; Kwan, S; Lei, C M; Limon, P; Lopez Perez, J A; Los, S; Lueking, L; Lukhanin, G; Lusin, S; Lykken, J; Maeshima, K; Marraffino, J M; Mason, D; McBride, P; Miao, T; Mishra, K; Moccia, S; Mommsen, R; Mrenna, S; Muhammad, A S; Newman-Holmes, C; Noeding, C; O'Dell, V; Prokofyev, O; Rivera, R; Rivetta, C H; Ronzhin, A; Rossman, P; Ryu, S; Sekhri, V; Sexton-Kennedy, E; Sfiligoi, I; Sharma, S; Shaw, T M; Shpakov, D; Skup, E; Smith, R P; Soha, A; 
Spalding, W J; Spiegel, L; Suzuki, I; Tan, P; Tanenbaum, W; Tkaczyk, S; Trentadue, R; Uplegger, L; Vaandering, E W; Vidal, R; Whitmore, J; Wicklund, E; Wu, W; Yarba, J; Yumiceva, F; Yun, J C; Acosta, D; Avery, P; Barashko, V; Bourilkov, D; Chen, M; Di Giovanni, G P; Dobur, D; Drozdetskiy, A; Field, R D; Fu, Y; Furic, I K; Gartner, J; Holmes, D; Kim, B; Klimenko, S; Konigsberg, J; Korytov, A; Kotov, K; Kropivnitskaya, A; Kypreos, T; Madorsky, A; Matchev, K; Mitselmakher, G; Pakhotin, Y; Piedra Gomez, J; Prescott, C; Rapsevicius, V; Remington, R; Schmitt, M; Scurlock, B; Wang, D; Yelton, J; Ceron, C; Gaultney, V; Kramer, L; Lebolo, L M; Linn, S; Markowitz, P; Martinez, G; Rodriguez, J L; Adams, T; Askew, A; Baer, H; Bertoldi, M; Chen, J; Dharmaratna, W G D; Gleyzer, S V; Haas, J; Hagopian, S; Hagopian, V; Jenkins, M; Johnson, K F; Prettner, E; Prosper, H; Sekmen, S; Baarmand, M M; Guragain, S; Hohlmann, M; Kalakhety, H; Mermerkaya, H; Ralich, R; Vodopiyanov, I; Abelev, B; Adams, M R; Anghel, I M; Apanasevich, L; Bazterra, V E; Betts, R R; Callner, J; Castro, M A; Cavanaugh, R; Dragoiu, C; Garcia-Solis, E J; Gerber, C E; Hofman, D J; Khalatian, S; Mironov, C; Shabalina, E; Smoron, A; Varelas, N; Akgun, U; Albayrak, E A; Ayan, A S; Bilki, B; Briggs, R; Cankocak, K; Chung, K; Clarida, W; Debbins, P; Duru, F; Ingram, F D; Lae, C K; McCliment, E; Merlo, J P; Mestvirishvili, A; Miller, M J; Moeller, A; Nachtman, J; Newsom, C R; Norbeck, E; Olson, J; Onel, Y; Ozok, F; Parsons, J; Schmidt, I; Sen, S; Wetzel, J; Yetkin, T; Yi, K; Barnett, B A; Blumenfeld, B; Bonato, A; Chien, C Y; Fehling, D; Giurgiu, G; Gritsan, A V; Guo, Z J; Maksimovic, P; Rappoccio, S; Swartz, M; Tran, N V; Zhang, Y; Baringer, P; Bean, A; Grachov, O; Murray, M; Radicci, V; Sanders, S; Wood, J S; Zhukova, V; Bandurin, D; Bolton, T; Kaadze, K; Liu, A; Maravin, Y; Onoprienko, D; Svintradze, I; Wan, Z; Gronberg, J; Hollar, J; Lange, D; Wright, D; Baden, D; Bard, R; Boutemeur, M; Eno, S C; Ferencek, D; Hadley, 
N J; Kellogg, R G; Kirn, M; Kunori, S; Rossato, K; Rumerio, P; Santanastasio, F; Skuja, A; Temple, J; Tonjes, M B; Tonwar, S C; Toole, T; Twedt, E; Alver, B; Bauer, G; Bendavid, J; Busza, W; Butz, E; Cali, I A; Chan, M; D'Enterria, D; Everaerts, P; Gomez Ceballos, G; Hahn, K A; Harris, P; Jaditz, S; Kim, Y; Klute, M; Lee, Y J; Li, W; Loizides, C; Ma, T; Miller, M; Nahn, S; Paus, C; Roland, C; Roland, G; Rudolph, M; Stephans, G; Sumorok, K; Sung, K; Vaurynovich, S; Wenger, E A; Wyslouch, B; Xie, S; Yilmaz, Y; Yoon, A S; Bailleux, D; Cooper, S I; Cushman, P; Dahmes, B; De Benedetti, A; Dolgopolov, A; Dudero, P R; Egeland, R; Franzoni, G; Haupt, J; Inyakin, A; Klapoetke, K; Kubota, Y; Mans, J; Mirman, N; Petyt, D; Rekovic, V; Rusack, R; Schroeder, M; Singovsky, A; Zhang, J; Cremaldi, L M; Godang, R; Kroeger, R; Perera, L; Rahmat, R; Sanders, D A; Sonnek, P; Summers, D; Bloom, K; Bockelman, B; Bose, S; Butt, J; Claes, D R; Dominguez, A; Eads, M; Keller, J; Kelly, T; Kravchenko, I; Lazo-Flores, J; Lundstedt, C; Malbouisson, H; Malik, S; Snow, G R; Baur, U; Iashvili, I; Kharchilava, A; Kumar, A; Smith, K; Strang, M; Alverson, G; Barberis, E; Boeriu, O; Eulisse, G; Govi, G; McCauley, T; Musienko, Y; Muzaffar, S; Osborne, I; Paul, T; Reucroft, S; Swain, J; Taylor, L; Tuura, L; Anastassov, A; Gobbi, B; Kubik, A; Ofierzynski, R A; Pozdnyakov, A; Schmitt, M; Stoynev, S; Velasco, M; Won, S; Antonelli, L; Berry, D; Hildreth, M; Jessop, C; Karmgard, D J; Kolberg, T; Lannon, K; Lynch, S; Marinelli, N; Morse, D M; Ruchti, R; Slaunwhite, J; Warchol, J; Wayne, M; Bylsma, B; Durkin, L S; Gilmore, J; Gu, J; Killewald, P; Ling, T Y; Williams, G; Adam, N; Berry, E; Elmer, P; Garmash, A; Gerbaudo, D; Halyo, V; Hunt, A; Jones, J; Laird, E; Marlow, D; Medvedeva, T; Mooney, M; Olsen, J; Piroué, P; Stickland, D; Tully, C; Werner, J S; Wildish, T; Xie, Z; Zuranski, A; Acosta, J G; Bonnett Del Alamo, M; Huang, X T; Lopez, A; Mendez, H; Oliveros, S; Ramirez Vargas, J E; Santacruz, N; 
Zatzerklyany, A; Alagoz, E; Antillon, E; Barnes, V E; Bolla, G; Bortoletto, D; Everett, A; Garfinkel, A F; Gecse, Z; Gutay, L; Ippolito, N; Jones, M; Koybasi, O; Laasanen, A T; Leonardo, N; Liu, C; Maroussov, V; Merkel, P; Miller, D H; Neumeister, N; Sedov, A; Shipsey, I; Yoo, H D; Zheng, Y; Jindal, P; Parashar, N; Cuplov, V; Ecklund, K M; Geurts, F J M; Liu, J H; Maronde, D; Matveev, M; Padley, B P; Redjimi, R; Roberts, J; Sabbatini, L; Tumanov, A; Betchart, B; Bodek, A; Budd, H; Chung, Y S; de Barbaro, P; Demina, R; Flacher, H; Gotra, Y; Harel, A; Korjenevski, S; Miner, D C; Orbaker, D; Petrillo, G; Vishnevskiy, D; Zielinski, M; Bhatti, A; Demortier, L; Goulianos, K; Hatakeyama, K; Lungu, G; Mesropian, C; Yan, M; Atramentov, O; Bartz, E; Gershtein, Y; Halkiadakis, E; Hits, D; Lath, A; Rose, K; Schnetzer, S; Somalwar, S; Stone, R; Thomas, S; Watts, T L; Cerizza, G; Hollingsworth, M; Spanier, S; Yang, Z C; York, A; Asaadi, J; Aurisano, A; Eusebi, R; Golyash, A; Gurrola, A; Kamon, T; Nguyen, C N; Pivarski, J; Safonov, A; Sengupta, S; Toback, D; Weinberger, M; Akchurin, N; Berntzon, L; Gumus, K; Jeong, C; Kim, H; Lee, S W; Popescu, S; Roh, Y; Sill, A; Volobouev, I; Washington, E; Wigmans, R; Yazgan, E; Engh, D; Florez, C; Johns, W; Pathak, S; Sheldon, P; Andelin, D; Arenton, M W; Balazs, M; Boutle, S; Buehler, M; Conetti, S; Cox, B; Hirosky, R; Ledovskoy, A; Neu, C; Phillips II, D; Ronquest, M; Yohay, R; Gollapinni, S; Gunthoti, K; Harr, R; Karchin, P E; Mattson, M; Sakharov, A; Anderson, M; Bachtis, M; Bellinger, J N; Carlsmith, D; Crotty, I; Dasu, S; Dutta, S; Efron, J; Feyzi, F; Flood, K; Gray, L; Grogg, K S; Grothe, M; Hall-Wilton, R; Jaworski, M; Klabbers, P; Klukas, J; Lanaro, A; Lazaridis, C; Leonard, J; Loveless, R; Magrans de Abril, M; Mohapatra, A; Ott, G; Polese, G; Reeder, D; Savin, A; Smith, W H; Sourkov, A; Swanson, J; Weinberg, M; Wenman, D; Wensveen, M; White, A

    2010-01-01

    The CMS Collaboration conducted a month-long data taking exercise, the Cosmic Run At Four Tesla, during October-November 2008, with the goal of commissioning the experiment for extended operation. With all installed detector systems participating, CMS recorded 270 million cosmic ray events with the solenoid at a magnetic field strength of 3.8 T. This paper describes the data flow from the detector through the various online and offline computing systems, as well as the workflows used for recording the data, for aligning and calibrating the detector, and for analysis of the data.

  17. Reference and PDF-manager software: complexities, support and workflow.

    Science.gov (United States)

    Mead, Thomas L; Berryman, Donna R

    2010-10-01

In the past, librarians taught reference management by training library users to use established software programs such as RefWorks or EndNote. In today's environment, there is a proliferation of Web-based programs in use by library clientele that offer a new twist on the well-known reference management programs. Essentially, these new programs are PDF-manager software (e.g., Mendeley or Papers). Librarians are faced with new questions, issues, and concerns, given the new workflows and pathways that these PDF-manager programs present. This article takes a look at some of them. PMID: 21058181

  18. PhyloGrid: a development for a workflow in Phylogeny

    CERN Document Server

    Montes, Esther; Mayo, Rafael

    2010-01-01

In this work we present the development of a workflow based on Taverna which is to be implemented for calculations in phylogeny by means of the MrBayes tool. It has a friendly interface developed with the GridSphere framework. The user is able to define the parameters for the Bayesian calculation, determine the model of evolution, and check the accuracy of the results at intermediate stages, as well as perform a multiple alignment of the sequences prior to the final result. No knowledge of the underlying computational procedure is required on the user's part.
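The Bayesian settings the abstract refers to (the model of evolution, the MCMC parameters) are expressed in MrBayes as a command block appended to a NEXUS alignment file. A minimal illustrative block, not taken from PhyloGrid itself (all settings here are assumptions chosen for the sketch):

```
begin mrbayes;
    set autoclose=yes nowarn=yes;    [run unattended, as a grid back-end would]
    lset nst=6 rates=gamma;          [GTR+Gamma model of evolution]
    mcmc ngen=100000 samplefreq=100; [Bayesian MCMC sampling]
    sump;                            [summarize sampled parameters]
    sumt;                            [summarize sampled trees]
end;
```

A portal such as the one described would generate a block like this from the user's form input and submit it, together with the alignment, to the grid.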

  19. Transaktionale Datei- und Dokumentenverwaltung in Workflow-Management-Systemen

    OpenAIRE

    Täuber, Wolfgang

    1996-01-01

This thesis is being carried out within the Software-Labor of the Universität Stuttgart. The Software-Labor is an institution whose goal is the development of marketable software through close cooperation between the Universität Stuttgart and industry. One subproject of the Software-Labor has as its subject area the further development of IBM's workflow management system 'FlowMark'. Among other things, transactional concepts are to be worked out for 'FlowMark'. The goal of ...

  20. Dokumentenmanagement in einem WWW-basierten Workflow-System

    OpenAIRE

    Horster, Oliver

    1998-01-01

Workflow management systems enable the computer-supported processing of business procedures. Based on a process description, a WFMS determines the next activity to be executed and presents it to the workers, who then carry it out using application programs, for example by working on documents. For this purpose, documents are managed in so-called document management systems (DMS). The PoliFlow project is developing the WWW-based system SWATS, which ...

  1. WooW-II: Workshop on open workflows

    Directory of Open Access Journals (Sweden)

    Daniel Arribas-Bel

    2015-07-01

Full Text Available This resource describes WooW-II, a two-day workshop on open workflows for quantitative social scientists. The workshop is broken down into five main parts, each of which typically consists of an introductory tutorial and a hands-on assignment. The specific tools discussed in this workshop are Markdown, Pandoc, Git, GitHub, R, and RStudio, but the theoretical approach applies to a wider range of tools (e.g., LaTeX and Python). By the end of the workshop, participants should be able to reproduce a paper of their own and make it available in an open form, applying the concepts and tools introduced.

  2. Workflow for large-scale analysis of melanoma tissue samples

    Directory of Open Access Journals (Sweden)

    Maria E. Yakovleva

    2015-09-01

Full Text Available The aim of the present study was to create an optimal workflow for analysing a large cohort of malignant melanoma tissue samples. Samples were lysed with urea and enzymatically digested with trypsin or trypsin/Lys-C. Buffer exchange or dilution was used to reduce the urea concentration prior to digestion. The tissue digests were analysed directly, or following strong cation exchange (SCX) fractionation, by nano LC–MS/MS. The approach which resulted in the largest number of protein IDs involved a buffer exchange step before enzymatic digestion with trypsin and chromatographic separation with a 120 min gradient, followed by SCX–RP separation of peptides.

  3. Integration of Motion Capture into 3D Animation Workflows

    OpenAIRE

    Unver, Ertu; Hughes, Daniel; Walker, Bernard; Blackburn, Ryan; Chien, Lin

    2011-01-01

The research aims to test and evaluate Motion Capture (MoCap) technology on a live CG animation project and discover how it can actually contribute to the animation production workflow. MoCap is a technique for gathering data on the movements of the human body, with the intention of using this information to drive the movements of 3D models in computer-generated animation. MoCap offers significant advantages for producing natural and believable movement in 3D animation and opens up the pos...

  4. CMS Data Processing Workflows during an Extended Cosmic Ray Run

    Energy Technology Data Exchange (ETDEWEB)

    2009-11-01

    The CMS Collaboration conducted a month-long data taking exercise, the Cosmic Run At Four Tesla, during October-November 2008, with the goal of commissioning the experiment for extended operation. With all installed detector systems participating, CMS recorded 270 million cosmic ray events with the solenoid at a magnetic field strength of 3.8 T. This paper describes the data flow from the detector through the various online and offline computing systems, as well as the workflows used for recording the data, for aligning and calibrating the detector, and for analysis of the data.

  5. PhyloGrid: a development for a workflow in Phylogeny

    OpenAIRE

    Montes, Esther; Isea, Raul; Mayo, Rafael

    2010-01-01

In this work we present the development of a workflow based on Taverna which is to be implemented for calculations in phylogeny by means of the MrBayes tool. It has a friendly interface developed with the GridSphere framework. The user is able to define the parameters for the Bayesian calculation, determine the model of evolution, check the accuracy of the results in the intermediate stages, and perform a multiple alignment of the sequences prior to the final result. To do t...

  6. Big data analytics workflow management for eScience

    Science.gov (United States)

    Fiore, Sandro; D'Anca, Alessandro; Palazzo, Cosimo; Elia, Donatello; Mariello, Andrea; Nassisi, Paola; Aloisio, Giovanni

    2015-04-01

In many domains such as climate science and astrophysics, scientific data is often n-dimensional and requires tools that support specialized data types and primitives if it is to be properly stored, accessed, analysed and visualized. Currently, scientific data analytics relies on domain-specific software and libraries providing a huge set of operators and functionalities. However, most of these tools fail at large scale since they: (i) are desktop based, rely on local computing capabilities and need the data locally; (ii) cannot benefit from available multicore/parallel machines since they are based on sequential codes; (iii) do not provide declarative languages to express scientific data analysis tasks; and (iv) do not provide newer or more scalable storage models to better support the data multidimensionality. Additionally, most of them: (v) are domain-specific, which also means they support a limited set of data formats, and (vi) do not provide workflow support to enable the construction, execution and monitoring of more complex "experiments". The Ophidia project aims at addressing most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides several parallel operators to manipulate large datasets. Some relevant examples include: (i) data sub-setting (slicing and dicing), (ii) data aggregation, (iii) array-based primitives (the same operator applies to all the implemented UDF extensions), (iv) data cube duplication, (v) data cube pivoting, (vi) NetCDF import and export. Metadata operators are available too. Additionally, the Ophidia framework provides array-based primitives to perform data sub-setting, data aggregation (i.e. max, min, avg), array concatenation, algebraic expressions and predicate evaluation on large arrays of scientific data. Bit-oriented plugins have also been implemented to manage binary data cubes. Defining processing chains and workflows with tens, hundreds of data analytics operators is the
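The slicing/dicing and aggregation primitives described in this record can be illustrated with a minimal sketch. This is not the Ophidia API; the function names and the tiny nested-list "data cube" are invented for illustration only.

```python
# Illustrative sketch (not the Ophidia API): dicing and aggregation
# over a tiny 3-D data cube stored as nested lists [time][y][x].

def dice(cube, t_range, y_range, x_range):
    """Extract a sub-cube over half-open index ranges per dimension."""
    return [[row[x_range[0]:x_range[1]]
             for row in plane[y_range[0]:y_range[1]]]
            for plane in cube[t_range[0]:t_range[1]]]

def aggregate(cube, op):
    """Reduce a 3-D cube to a single value with max/min/avg."""
    flat = [v for plane in cube for row in plane for v in row]
    if op == "max":
        return max(flat)
    if op == "min":
        return min(flat)
    if op == "avg":
        return sum(flat) / len(flat)
    raise ValueError(op)

cube = [[[1, 2], [3, 4]],
        [[5, 6], [7, 8]]]                    # shape: 2 x 2 x 2
sub = dice(cube, (0, 1), (0, 2), (0, 1))     # one time step, first column
print(aggregate(cube, "avg"))                # 4.5
```

Real frameworks apply such operators in parallel across cube fragments; the point here is only the cube/operator model.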

  7. The robust schedule - a link to improved workflow

    DEFF Research Database (Denmark)

    Lindhard, Søren Munch; Wandahl, Søren

    2012-01-01

    In today’s construction, there is a paramount focus on time, and on the scheduling and control of time. Everything is organized with respect to time. The construction project has to be completed within a fixed and often tight deadline. Otherwise a daily penalty often has to be paid. This pins down...... result is a chaotic, complex and uncontrolled construction site. Furthermore, strict time limits entail the workflow to be optimized under sub-optimal conditions. Even though productivity overall seems to be increasing, productivity per man-hour is decreasing resulting in increased cost. To increase...

  8. The robust schedule - A link to improved workflow

    DEFF Research Database (Denmark)

    Lindhard, Søren; Wandahl, Søren

    2012-01-01

    In today’s construction, there is a paramount focus on time, and on the scheduling and control of time. Everything is organized with respect to time. The construction project has to be completed within a fixed and often tight deadline. Otherwise a daily penalty often has to be paid. This pin....... The result is a chaotic, complex and uncontrolled construction site. Furthermore, strict time limits entail the workflow to be optimized under non-optimal conditions. Even though productivity seems to be increasing, productivity per man-hour is decreasing resulting in increased cost. To increase...

  9. Workflow for High Throughput Screening of Gas Sensing Materials

    Directory of Open Access Journals (Sweden)

    Ulrich Simon

    2006-04-01

Full Text Available The workflow of a high throughput screening setup for the rapid identification of new and improved sensor materials is presented. The polyol method was applied to prepare nanoparticulate metal oxides as base materials, which were functionalised by surface doping. Using multi-electrode substrates and high throughput impedance spectroscopy (HT-IS), a wide range of materials could be screened in a short time. Applying HT-IS in the search for new selective gas sensing materials, a NO2-tolerant NO sensing material with reduced sensitivities towards other test gases was identified, based on iridium-doped zinc oxide. Analogous behaviour was observed for iridium-doped indium oxide.

  10. Integrating workflow and project management systems for PLM applications

    Directory of Open Access Journals (Sweden)

    Fabio Fonseca Pereira de Paula

    2008-07-01

Full Text Available The adoption of the Product Life-cycle Management (PLM) concept is fundamental to improving product development, mainly for small and medium enterprises (SMEs). One of the challenges is the integration between project management and product data management functions. The paper presents an analysis of the potential integration strategies for a specific product data management system (SMARTEAM) and a project management system (Microsoft Project), which are commonly used by SMEs. Finally, the article presents some considerations about the study of project management solutions in SMEs, considering the PLM approach. Keywords: integration, project management (PM), workflow, PDM, PLM.

  11. Enhancing and Customizing Laboratory Information Systems to Improve/Enhance Pathologist Workflow.

    Science.gov (United States)

    Hartman, Douglas J

    2016-03-01

Optimizing pathologist workflow can be difficult because it is affected by many variables. Surgical pathologists must complete many tasks that culminate in a final pathology report. Several software systems can be used to improve pathologist workflow, including voice recognition software, pre-sign-out quality assurance, image utilization, and computerized provider order entry. Recent changes in diagnostic coding and the more prominent role of centralized electronic health records represent potential areas for further improving the workflow of surgical pathologists. Additional unforeseen changes to pathologist workflow may accompany the introduction of whole-slide imaging technology into routine diagnostic work. PMID:26851662

  12. Submission of content to a digital object repository using a configurable workflow system

    CERN Document Server

    Hense, Andreas

    2007-01-01

The prototype of a workflow system for the submission of content to a digital object repository is presented here. It is based entirely on open-source standard components and features a service-oriented architecture. The front-end consists of Java Business Process Management (jBPM), Java Server Faces (JSF), and Java Server Pages (JSP). A Fedora repository and a MySQL database management system serve as the back-end. The communication between front-end and back-end uses a SOAP minimal binding stub. We describe the design principles and the construction of the prototype and discuss the possibilities and limitations of workflow creation by administrators. The code of the prototype is open-source and can be retrieved in the project escipub at http://sourceforge.net

13. Design and Optimization of Future Hybrid and Electric Propulsion Systems: An Advanced Tool Integrated in a Complete Workflow to Study Electric Devices / Développement et optimisation des futurs systèmes de propulsion hybride et électrique : un outil avancé et intégré dans une chaîne complète dédiée à l’étude des composants électriques

    Directory of Open Access Journals (Sweden)

    Le Berr F.

    2012-08-01

Full Text Available Electrification to reduce greenhouse gas emissions in the transport sector is now widely recognized as a relevant solution for the future, studied intensively by all actors in the domain. To this end, a tool for the design and characterization of electric machines has been developed at IFP Energies nouvelles. This tool, called EMTool, is based on physical equations and is integrated into a complete workflow of simulation tools, such as finite element models or system simulation. It offers the possibility to study several types of electric machine topologies: permanent magnet synchronous machines with radial or axial flux, induction machines, etc. This paper presents the main design principles and the main equations integrated in the EMTool, the methods used to evaluate electric machine performance, and the validations performed on existing machines. Finally, the position of the EMTool in the simulation tool workflow and application examples are presented, notably the coupling of the EMTool with advanced optimization algorithms or finite element models.

  14. Geometric processing workflow for vertical and oblique hyperspectral frame images collected using UAV

    Science.gov (United States)

    Markelin, L.; Honkavaara, E.; Näsi, R.; Nurminen, K.; Hakala, T.

    2014-08-01

Remote sensing based on unmanned airborne vehicles (UAVs) is a rapidly developing field of technology. UAVs enable accurate, flexible, low-cost and multiangular measurements of the 3D geometric, radiometric, and temporal properties of land and vegetation using various sensors. In this paper we present a geometric processing chain for a multiangular measurement system that is designed for measuring object directional reflectance characteristics in a wavelength range of 400-900 nm. The technique is based on a novel, lightweight spectral camera designed for UAV use. The multiangular measurement is conducted by collecting vertical and oblique area-format spectral images. End products of the geometric processing are image exterior orientations, 3D point clouds and digital surface models (DSM). This data is needed for the radiometric processing chain that produces reflectance image mosaics and multiangular bidirectional reflectance factor (BRF) observations. The geometric processing workflow consists of the following three steps: (1) determining approximate image orientations using Visual Structure from Motion (VisualSFM) software; (2) calculating improved orientations and sensor calibration using a method based on self-calibrating bundle block adjustment (standard photogrammetric software; this step is optional); and finally (3) creating dense 3D point clouds and DSMs using Photogrammetric Surface Reconstruction from Imagery (SURE) software, which is based on a semi-global matching algorithm and is capable of providing a point density corresponding to the pixel size of the image. We have tested the geometric processing workflow over various targets, including test fields, agricultural fields, lakes and complex 3D structures like forests.

  15. A Semi-Automated Workflow Solution for Data Set Publication

    Directory of Open Access Journals (Sweden)

    Suresh Vannan

    2016-03-01

Full Text Available To address the need for published data, considerable effort has gone into formalizing the process of data publication. From funding agencies to publishers, data publication has rapidly become a requirement. Digital Object Identifiers (DOIs) and data citations have enhanced the integration and availability of data. The challenge facing data publishers now is to deal with the increased number of publishable data products and, most importantly, the difficulties of publishing diverse data products into an online archive. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC), a NASA-funded data center, faces these challenges as it deals with data products created by individual investigators. This paper summarizes the challenges of curating data and describes a workflow solution that ORNL DAAC research and technical staff have created to deal with publication of the diverse data products. The workflow solution presented here is generic and can be applied to data from any scientific domain and data located at any data center.

  16. Accelerated partial breast irradiation utilizing brachytherapy: patient selection and workflow

    Science.gov (United States)

    Wobb, Jessica; Manyam, Bindu; Khan, Atif; Vicini, Frank

    2016-01-01

Accelerated partial breast irradiation (APBI) represents an evolving technique that is a standard-of-care option in appropriately selected women following breast conserving surgery. While multiple techniques now exist to deliver APBI, interstitial brachytherapy represents the technique used in several randomized trials (National Institute of Oncology, GEC-ESTRO). More recently, many centers have adopted applicator-based brachytherapy to deliver APBI due to the technical complexities of interstitial brachytherapy. The purpose of this article is to review methods to evaluate and select patients for APBI, as well as to define potential workflow mechanisms that allow for the safe and effective delivery of APBI. Multiple consensus statements have been developed to guide clinicians in determining appropriate candidates for APBI. However, recent studies have demonstrated that these guidelines fail to stratify patients according to the risk of local recurrence, and updated guidelines are expected in the years to come. Critical elements of workflow to ensure safe and effective delivery of APBI include a multidisciplinary approach and evaluation, optimization of target coverage and adherence to normal tissue guideline constraints, and proper quality assurance methods. PMID:26985202

  17. A High Throughput Workflow Environment for Cosmological Simulations

    CERN Document Server

    Erickson, Brandon M S; Evrard, August E; Becker, Matthew R; Busha, Michael T; Kravtsov, Andrey V; Marru, Suresh; Pierce, Marlon; Wechsler, Risa H

    2012-01-01

    The next generation of wide-area sky surveys offer the power to place extremely precise constraints on cosmological parameters and to test the source of cosmic acceleration. These observational programs will employ multiple techniques based on a variety of statistical signatures of galaxies and large-scale structure. These techniques have sources of systematic error that need to be understood at the percent-level in order to fully leverage the power of next-generation catalogs. Simulations of large-scale structure provide the means to characterize these uncertainties. We are using XSEDE resources to produce multiple synthetic sky surveys of galaxies and large-scale structure in support of science analysis for the Dark Energy Survey. In order to scale up our production to the level of fifty 10^10-particle simulations, we are working to embed production control within the Apache Airavata workflow environment. We explain our methods and report how the workflow has reduced production time by 40% compared to manua...

  18. Magallanes: a web services discovery and automatic workflow composition tool

    Directory of Open Access Journals (Sweden)

    Trelles Oswaldo

    2009-10-01

Full Text Available Abstract Background To aid in bioinformatics data processing and analysis, an increasing number of web-based applications are being deployed. Although this is a positive circumstance in general, the proliferation of tools makes it difficult to find the right tool, or more importantly, the right set of tools that can work together to solve real complex problems. Results Magallanes (Magellan) is a versatile, platform-independent Java library of algorithms aimed at discovering bioinformatics web services and associated data types. A second important feature of Magallanes is its ability to connect available and compatible web services into workflows that can process data sequentially to reach a desired output given a particular input. Magallanes' capabilities can be exploited both through an API and directly through a graphic user interface. The Magallanes API is freely available for academic use, and together with the Magallanes application has been tested on MS-Windows™ XP and Unix-like operating systems. Detailed implementation information, including user manuals and tutorials, is available at http://www.bitlab-es.com/magallanes. Conclusion Different implementations of the same client (web page, desktop applications, web services, etc.) have been deployed and are currently in use in real installations such as the National Institute of Bioinformatics (Spain) and the ACGT-EU project. This shows the potential utility and versatility of the software library, including the integration of novel tools in the domain, and provides strong evidence that it facilitates the automatic discovery and composition of workflows.
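The type-driven composition this record describes (connecting services whose output type matches the next service's input type) can be sketched as a simple graph search. The service names and data types below are invented examples, not Magallanes' actual registry.

```python
# Hypothetical sketch of type-driven workflow composition: chain services
# whose output type matches the next service's input type, via BFS.
from collections import deque

SERVICES = [
    {"name": "blast_search", "in": "FASTA",       "out": "BlastReport"},
    {"name": "parse_hits",   "in": "BlastReport", "out": "SequenceSet"},
    {"name": "align",        "in": "SequenceSet", "out": "Alignment"},
]

def compose(start_type, goal_type, services):
    """Breadth-first search for a chain of services turning start_type
    into goal_type; returns the list of service names, or None."""
    queue = deque([(start_type, [])])
    seen = {start_type}
    while queue:
        current, path = queue.popleft()
        if current == goal_type:
            return path
        for svc in services:
            if svc["in"] == current and svc["out"] not in seen:
                seen.add(svc["out"])
                queue.append((svc["out"], path + [svc["name"]]))
    return None

print(compose("FASTA", "Alignment", SERVICES))
# ['blast_search', 'parse_hits', 'align']
```

BFS yields a shortest chain; a real discovery tool would additionally rank candidates and handle services with multiple inputs and outputs.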

  19. Automation and workflow considerations for embedding Digimarc Barcodes at scale

    Science.gov (United States)

    Rodriguez, Tony; Haaga, Don; Calhoon, Sean

    2015-03-01

The Digimarc® Barcode is a digital watermark applied to packages and variable data labels that carries GS1-standard GTIN-14 data traditionally carried by a 1-D barcode. The Digimarc Barcode can be read with smartphones and imaging-based barcode readers commonly used in grocery and retail environments. Using smartphones, consumers can engage with products, and retailers can materially increase the speed of check-out, increasing store margins and providing a better experience for shoppers. Internal testing has shown an average 53% increase in scanning throughput, enabling hundreds of millions of dollars in cost savings [1] for retailers when deployed at scale. To get to scale, the process of embedding a digital watermark must be automated and integrated within existing workflows. Creating the tools and processes to do so represents a new challenge for the watermarking community. This paper presents a description and an analysis of the workflow implemented by Digimarc to deploy the Digimarc Barcode at scale. An overview of the tools created and lessons learned during the introduction of the technology to the market are provided.

  20. Scientific Workflows and the Sensor Web for Virtual Environmental Observatories

    Science.gov (United States)

    Simonis, I.; Vahed, A.

    2008-12-01

interfaces. All data sets and sensor communication follow well-defined abstract models and corresponding encodings, mostly developed by the OGC Sensor Web Enablement initiative. Scientific progress is currently accelerated by an emerging concept called scientific workflows, which organize and manage complex distributed computations. A scientific workflow represents and records the highly complex processes that a domain scientist typically follows in the exploration, discovery and, ultimately, transformation of raw data into publishable results. The challenge is now to integrate the benefits of scientific workflows with those provided by the Sensor Web in order to leverage all resources for scientific exploration, problem solving, and knowledge generation. Scientific workflows for the Sensor Web represent the next evolutionary step towards efficient, powerful, and flexible earth observation frameworks and platforms. These platforms support the entire process, from capturing data, through sharing and integration, to requesting additional observations. Multiple sites and organizations can participate on a single platform, and scientists from different countries and organizations can interact and contribute to large-scale research projects. Simultaneously, the data and information overload becomes manageable, as multiple layers of abstraction free scientists from dealing with underlying data, processing or storage peculiarities. The vision is automated investigation and discovery mechanisms that allow scientists to pose queries to the system, which in turn identifies potentially related resources, schedules processing tasks and assembles all parts into workflows that may satisfy the query.

  1. An open source workflow for 3D printouts of scientific data volumes

    Science.gov (United States)

    Loewe, P.; Klump, J. F.; Wickert, J.; Ludwig, M.; Frigeri, A.

    2013-12-01

As the amount of scientific data continues to grow, researchers need new tools to help them visualize complex data. Immersive data visualisations are helpful, yet fail to provide the tactile feedback and sensory feedback on spatial orientation provided by tangible objects. This gap in sensory feedback from virtual objects has led to the development of tangible representations of geospatial information to solve real world problems. Examples are animated globes [1], interactive environments like tangible GIS [2], and on-demand 3D prints. The production of a tangible representation of a scientific data set is one step in a line of scientific thinking, leading from the physical world into scientific reasoning and back: the process starts with a physical observation, or from a data stream generated by an environmental sensor. This data stream is turned into a geo-referenced data set, which is turned into a volume representation, which is converted into command sequences for the printing device, leading to the creation of a 3D printout. As a last, but crucial, step, this new object has to be documented, linked to the associated metadata, and curated in long-term repositories to preserve its scientific meaning and context. The workflow to produce tangible 3D data prints from science data at the German Research Centre for Geosciences (GFZ) was implemented as software based on the free and open source geoinformatics tools GRASS GIS and ParaView. The workflow was successfully validated in various application scenarios at GFZ using a RapMan printer to create 3D specimens of elevation models, geological underground models, ice-penetrating radar soundings for planetology, and space-time stacks for tsunami model quality assessment. While these first pilot applications have demonstrated the feasibility of the overall approach [3], current research focuses on the provision of the workflow as Software as a Service (SAAS), thematic generalisation of information content and

  2. Improvement of workflow and processes to ease and enrich meaningful use of health information technology

    Directory of Open Access Journals (Sweden)

    Singh R

    2013-11-01

Full Text Available Ranjit Singh,1 Ashok Singh,2 Devan R Singh,3 Gurdev Singh1 1Department of Family Medicine, UB Patient Safety Research Center, School of Medicine and Management, State University of NY at Buffalo, NY, USA; 2Niagara Family Medicine Associates, Niagara Falls, NY, USA; 3SaferPatients LLC, Lewiston, NY, USA Abstract: The introduction of health information technology (HIT) can have unexpected and unintended patient safety and/or quality consequences. This highly desirable but complex intervention requires workflow changes in order to be effective. Workflow is often cited by providers as the number one 'pain point'. Its redesign needs to be tailored to the organizational context, the current workflow, the HIT system being introduced, and the resources available. Primary care practices lack the required expertise and need external assistance. Unfortunately, the current methods of using esoteric charts or software are alien to health care workers and are, therefore, perceived to be barriers. Most importantly and ironically, these do not readily educate or enable staff to inculcate a common vision, ownership, and empowerment among all stakeholders. These attributes are necessary for creating highly reliable organizations. We present a tool that addresses US Accreditation Council for Graduate Medical Education (ACGME) competency requirements. Of the six competencies called for by the ACGME, the two that this tool particularly addresses are 'system-based practice' and 'practice-based learning and continuing improvement'. This toolkit is founded on a systems engineering approach. It includes a motivational and orientation presentation, 128 magnetic pictorial and write-erase icons of 40 designs, a dry-erase magnetic board, and five visual aids for reducing cognitive and emotive biases in staff. Pilot tests were carried out in practices in Western New York and Colorado, USA. In addition, the toolkit was presented at the 2011 North American Primary Care Research Group (NAPCRG

  3. Tools for Integrating Data Access from the IRIS DMC into Research Workflows

    Science.gov (United States)

    Reyes, C. G.; Suleiman, Y. Y.; Trabant, C.; Karstens, R.; Weertman, B. R.

    2012-12-01

Web service interfaces at the IRIS Data Management Center (DMC) provide access to a vast archive of seismological and related geophysical data. These interfaces are designed to easily incorporate data access into data processing workflows. Examples of data that may be accessed include time series data, related metadata, and earthquake information. The DMC has developed command line scripts, MATLAB® interfaces and a Java library to support a wide variety of data access needs. Users of these interfaces do not need to concern themselves with web service details, networking, or even (in most cases) data conversion. Fetch scripts allow access to the DMC archive and are a comfortable fit for command line users. These scripts are written in Perl and are well suited for automation and integration into existing workflows on most operating systems. For metadata and event information, the Fetch scripts even parse the returned data into simple text summaries. The IRIS Java Web Services Library (IRIS-WS Library) gives Java developers the ability to create programs that access the DMC archives seamlessly. By returning the data and information as native Java objects, the Library insulates the developer from data formats, network programming and web service details. The MATLAB interfaces leverage this library to allow users access to the DMC archive directly from within MATLAB (r2009b or newer), returning data into variables for immediate use. Data users and research groups are developing other toolkits that use the DMC's web services. Notably, the ObsPy framework developed at LMU Munich is a Python toolbox that allows seamless access to data and information via the DMC services. Another example is the MATLAB-based GISMO and Waveform Suite developments, which can now access data via web services. In summary, there now exist a host of ways that researchers can bring IRIS DMC data directly into their workflows. MATLAB users can use irisFetch.m, command line users can use the various
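The services these tools wrap follow the FDSN web service conventions; a script can build a query URL directly, as sketched below. The station and time-window values are examples only, and no network request is made here.

```python
# Sketch: building an FDSN dataselect query URL of the kind the IRIS
# Fetch scripts and client libraries construct internally.
from urllib.parse import urlencode

BASE = "http://service.iris.edu/fdsnws/dataselect/1/query"

params = {
    "net": "IU",                     # network code (example value)
    "sta": "ANMO",                   # station code (example value)
    "loc": "00",                     # location code
    "cha": "BHZ",                    # channel code
    "start": "2010-02-27T06:30:00",  # window start (UTC)
    "end": "2010-02-27T10:30:00",    # window end (UTC)
}

url = BASE + "?" + urlencode(params)
print(url)
```

Fetching this URL with any HTTP client would return miniSEED waveform data for the requested channel and time window.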

  4. A lightweight messaging-based distributed processing and workflow execution framework for real-time and big data analysis

    Science.gov (United States)

    Laban, Shaban; El-Desouky, Aly

    2014-05-01

To achieve rapid, simple and reliable parallel processing of different types of tasks and big data processing on any compute cluster, a lightweight messaging-based distributed application processing and workflow execution framework model is proposed. The framework is based on Apache ActiveMQ and the Simple (or Streaming) Text Oriented Message Protocol (STOMP). ActiveMQ, a popular and powerful open source persistence messaging and integration patterns server with scheduler capabilities, acts as the message broker in the framework. STOMP provides an interoperable wire format that allows framework programs to talk and interact with each other and with ActiveMQ easily. In order to use the message broker efficiently, a unified message and topic naming pattern is utilized to achieve the required operation. Only three Python programs and a simple library, which unifies and simplifies the use of ActiveMQ and the STOMP protocol, are needed to use the framework. A watchdog program is used to monitor, remove, add, start and stop any machine and/or its different tasks when necessary. For every machine, exactly one dedicated zookeeper program is used to start the different functions or tasks, such as the stompShell program, needed for executing the user's required workflow. The stompShell instances execute workflow jobs based on received messages. A well-defined, simple and flexible message structure, based on JavaScript Object Notation (JSON), is used to build complex workflow systems. JSON is also used for configuration and for communication between machines and programs. The framework is platform independent. Although the framework is built in Python, the actual workflow programs or jobs can be implemented in any programming language. The generic framework can be used in small national data centres for processing seismological and radionuclide data received from the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear
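A JSON-encoded job message of the sort such a framework would exchange over STOMP/ActiveMQ can be sketched as follows. The field names are invented for illustration; the paper defines its own message structure, and the actual STOMP send is omitted.

```python
# Minimal sketch of a JSON workflow job message (hypothetical fields),
# serialized as it would be before being sent over STOMP to ActiveMQ.
import json

job_message = {
    "workflow": "seismic-daily",
    "task_id": "task-042",
    "command": ["python", "process_waveforms.py", "--day", "2014-05-01"],
    "depends_on": ["task-041"],    # simple control dependency
    "reply_to": "/topic/results",  # destination for status messages
}

wire = json.dumps(job_message)     # what a producer would send via STOMP
received = json.loads(wire)        # what a stompShell worker would decode
print(received["task_id"])         # task-042
```

Text-based JSON over STOMP is what keeps the framework language-agnostic: any worker that can speak STOMP and parse JSON can execute jobs.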

  5. CrossFlow: Cross-Organizational Workflow Management in Dynamic Virtual Enterprises

    NARCIS (Netherlands)

    Grefen, Paul; Aberer, Karl; Hoffner, Yigal; Ludwig, Heiko

    2000-01-01

    In this report, we present the approach to cross-organizational workflow management of the CrossFlow project. CrossFlow is a European research project aiming at the support of cross-organizational workflows in dynamic virtual enterprises. The cooperation in these virtual enterprises is based on dyna

  6. Fault tolerant workflow scheduling based on replication and resubmission of tasks in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jayadivya S K

    2012-06-01

    Full Text Available The aim of a workflow scheduling system is to schedule workflows within the user-given deadline to achieve a good success rate. A workflow is a set of tasks processed in a predefined order based on data and control dependencies. Scheduling these workflows in a computing environment, such as a cloud environment, is an NP-complete problem, and it becomes more challenging when failures of tasks are considered. To overcome these failures, the workflow scheduling system should be fault tolerant. In this paper, the proposed Fault Tolerant Workflow Scheduling algorithm (FTWS) provides fault tolerance by using replication and resubmission of tasks based on the priority of the tasks. The replication of tasks depends on a heuristic metric, which is calculated by finding the tradeoff between the replication factor and the resubmission factor. This heuristic metric is needed because replication alone may lead to resource wastage, while resubmission alone may increase makespan. Tasks are prioritized based on their criticality, which is calculated using parameters such as out-degree, earliest deadline and high resubmission impact. Priority helps in meeting the deadline of a task and thereby reduces wastage of resources. FTWS schedules workflows within a deadline even in the presence of failures, without using any failure history. The experiments were conducted in a simulated cloud environment by scheduling workflows in the presence of randomly generated failures. The experimental results of the proposed work demonstrate an effective success rate in spite of various failures.
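
    The prioritisation step can be sketched as follows; the weights and the exact way the three parameters are combined are assumptions for illustration, not FTWS's published formula:

```python
def task_priority(out_degree, deadline, resubmission_impact,
                  w_deg=1.0, w_dl=1.0, w_res=1.0):
    # Higher out-degree, earlier deadline, and higher resubmission
    # impact all raise a task's priority. Weights are hypothetical.
    return (w_deg * out_degree
            + w_dl / max(deadline, 1e-9)
            + w_res * resubmission_impact)

tasks = {
    # task: (out_degree, deadline, resubmission_impact)
    "t1": (3, 10.0, 0.2),
    "t2": (1, 5.0, 0.8),
    "t3": (0, 20.0, 0.1),
}
order = sorted(tasks, key=lambda t: task_priority(*tasks[t]), reverse=True)
print(order)  # ['t1', 't2', 't3']
```

    A scheduler would then decide, per task in this order, whether to replicate it or rely on resubmission, guided by the replication/resubmission tradeoff metric.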

  7. Soundness of Timed-Arc Workflow Nets in Discrete and Continuous-Time Semantics

    DEFF Research Database (Denmark)

    Mateo, Jose Antonio; Srba, Jiri; Sørensen, Mathias Grund

    2015-01-01

    Analysis of workflow processes with quantitative aspects like timing is of interest in numerous time-critical applications. We suggest a workflow model based on timed-arc Petri nets and study the foundational problems of soundness and strong (time-bounded) soundness. We first consider the discrete-t...

  8. Scheduling Multilevel Deadline-Constrained Scientific Workflows on Clouds Based on Cost Optimization

    Directory of Open Access Journals (Sweden)

    Maciej Malawski

    2015-01-01

    Full Text Available This paper presents a cost optimization model for scheduling scientific workflows on IaaS clouds such as Amazon EC2 or RackSpace. We assume multiple IaaS clouds with heterogeneous virtual machine instances, with a limited number of instances per cloud and hourly billing. Input and output data are stored on a cloud object store such as Amazon S3. Applications are scientific workflows modeled as DAGs, as in the Pegasus Workflow Management System. We assume that tasks in the workflows are grouped into levels of identical tasks. Our model is specified using mathematical programming languages (AMPL and CMPL) and allows us to minimize the cost of workflow execution under deadline constraints. We present results obtained using our model and benchmark workflows representing real scientific applications in a variety of domains. The data used for evaluation come from synthetic workflows, from general-purpose cloud benchmarks, and from data measured in our own experiments with Montage, an astronomical application, executed on the Amazon EC2 cloud. We indicate how this model can be used for scenarios that require resource planning for scientific workflows and their ensembles.
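
    A toy version of the level-based cost model can be brute-forced in a few lines; the VM types, prices, and single-cloud simplification here are hypothetical, and the paper's actual model is expressed in AMPL/CMPL with many more constraints:

```python
from itertools import product
from math import ceil

# Hypothetical instance types: (hours of work cleared per hour, $/hour).
VM_TYPES = {
    "small": (1.0, 0.10),
    "large": (4.0, 0.35),
}

def best_plan(level_work_hours, deadline_hours):
    """Pick one VM type per workflow level minimizing cost within the
    deadline, with runtime rounded up to whole billed hours."""
    best = None
    for choice in product(VM_TYPES, repeat=len(level_work_hours)):
        hours = [ceil(w / VM_TYPES[t][0])
                 for w, t in zip(level_work_hours, choice)]
        runtime = sum(hours)
        cost = sum(h * VM_TYPES[t][1] for h, t in zip(hours, choice))
        if runtime <= deadline_hours and (best is None or cost < best[0]):
            best = (cost, choice)
    return best

print(best_plan([8.0, 2.0], deadline_hours=6))
```

    Exhaustive search only works for tiny instances; a mathematical programming solver, as used in the paper, scales to realistic workflow ensembles.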

  9. Next Generation Sequence Analysis and Computational Genomics Using Graphical Pipeline Workflows

    Directory of Open Access Journals (Sweden)

    Marquis P. Vawter

    2012-08-01

    Full Text Available Whole-genome and exome sequencing have already proven to be essential and powerful methods to identify genes responsible for simple Mendelian inherited disorders. These methods can be applied to complex disorders as well, and have been adopted as one of the current mainstream approaches in population genetics. These achievements have been made possible by next generation sequencing (NGS technologies, which require substantial bioinformatics resources to analyze the dense and complex sequence data. The huge analytical burden of data from genome sequencing might be seen as a bottleneck slowing the publication of NGS papers at this time, especially in psychiatric genetics. We review the existing methods for processing NGS data, to place into context the rationale for the design of a computational resource. We describe our method, the Graphical Pipeline for Computational Genomics (GPCG, to perform the computational steps required to analyze NGS data. The GPCG implements flexible workflows for basic sequence alignment, sequence data quality control, single nucleotide polymorphism analysis, copy number variant identification, annotation, and visualization of results. These workflows cover all the analytical steps required for NGS data, from processing the raw reads to variant calling and annotation. The current version of the pipeline is freely available at http://pipeline.loni.ucla.edu. These applications of NGS analysis may gain clinical utility in the near future (e.g., identifying miRNA signatures in diseases when the bioinformatics approach is made feasible. Taken together, the annotation tools and strategies that have been developed to retrieve information and test hypotheses about the functional role of variants present in the human genome will help to pinpoint the genetic risk factors for psychiatric disorders.

  10. Software Design for Empowering Scientists

    OpenAIRE

    De Roure, David; Goble, Carole

    2009-01-01

    Scientific research is increasingly digital. Some activities, such as data analysis, search, and simulation, can be accelerated by letting scientists write workflows and scripts that automate routine activities. These capture pieces of the scientific method that scientists can share. The Taverna Workbench, a widely deployed scientific-workflow-management system, together with the myExperiment social Web site for sharing scientific experiments, follow six principles of designing software for ad...

  11. Automated evolutionary restructuring of workflows to minimise errors via stochastic model checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents a framework for the automated restructuring of workflows that allows one to minimise the impact of errors on a production workflow. The framework allows for the modelling of workflows by means of a formalised subset of the Business Process Modelling and Notation (BPMN) language......, a well-established visual language for modelling workflows in a business context. The framework’s modelling language is extended to include the tracking of real-valued quantities associated with the process (such as time, cost, temperature). In addition, this language also allows for an intention...... by means of a case study from the food industry. Through this case study we explore the extent to which the risk of production faults can be reduced and the impact of these can be minimised, primarily through restructuring of the production workflows. This approach is fully automated and only the...

  12. Domain-Specific Languages For Developing and Deploying Signature Discovery Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Jacob, Ferosh; Wynne, Adam S.; Liu, Yan; Gray, Jeff

    2013-12-02

    Domain-agnostic Signature Discovery entails scientific investigation across multiple domains through the reuse of existing algorithms in workflows. The existing algorithms may be written in any programming language for various hardware architectures (e.g., desktops, commodity clusters, and specialized parallel hardware platforms). This raises an engineering issue in generating Web services for heterogeneous algorithms so that they can be composed into a scientific workflow environment (e.g., Taverna). In this paper, we present our software tool that defines two simple Domain-Specific Languages (DSLs) to automate these processes: SDL and WDL. Our Service Description Language (SDL) describes key elements of a signature discovery algorithm and generates the service code. The Workflow Description Language (WDL) describes the pipeline of services and generates deployable artifacts for the Taverna workflow management system. We demonstrate our tool with a landscape classification example that is represented by BLAST workflows composed of services that wrap the original scripts.

  13. Optimizing perioperative decision making: improved information for clinical workflow planning.

    Science.gov (United States)

    Doebbeling, Bradley N; Burton, Matthew M; Wiebke, Eric A; Miller, Spencer; Baxter, Laurence; Miller, Donald; Alvarez, Jorge; Pekny, Joseph

    2012-01-01

    Perioperative care is complex and involves multiple interconnected subsystems. Delayed starts, prolonged cases and overtime are common. Surgical procedures account for 40-70% of hospital revenues and 30-40% of total costs. Most planning and scheduling in healthcare is done without modern planning tools, which have the potential to improve access by supporting operations planning. We identified key planning scenarios of interest to perioperative leaders in order to examine the feasibility of applying combinatorial optimization software to solve some of those planning issues in the operative setting. Perioperative leaders desire a broad range of tools for planning and assessing alternate solutions. Our models generated feasible solutions that varied as expected based on resource and policy assumptions, and found better utilization of scarce resources. Combinatorial optimization modeling can effectively evaluate alternatives to support key decisions for planning clinical workflow and improving care efficiency and satisfaction. PMID:23304284
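
    The flavour of such planning problems can be sketched with a simple heuristic; this is a greedy longest-processing-time assignment of cases to rooms, far simpler than the combinatorial optimization software the paper applies, and the case durations and room count are hypothetical:

```python
import heapq

def assign_cases(durations, n_rooms):
    """Greedily assign each surgical case (longest first) to the
    operating room that frees up earliest."""
    rooms = [(0.0, r) for r in range(n_rooms)]  # (current load, room id)
    heapq.heapify(rooms)
    schedule = {r: [] for r in range(n_rooms)}
    for dur in sorted(durations, reverse=True):
        load, r = heapq.heappop(rooms)
        schedule[r].append(dur)
        heapq.heappush(rooms, (load + dur, r))
    return schedule

sched = assign_cases([3.0, 2.0, 2.0, 1.5, 1.0], n_rooms=2)
print({r: sum(cases) for r, cases in sched.items()})  # per-room hours
```

    Balanced per-room loads make overruns visible at a glance; a true combinatorial optimizer would additionally handle staff, equipment, and policy constraints.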

  14. Approach for workflow modeling using π-calculus

    Institute of Scientific and Technical Information of China (English)

    杨东; 张申生

    2003-01-01

    As a variant of process algebra, π-calculus can describe the interactions between evolving processes. By modeling an activity as a process interacting with other processes through ports, this paper presents a new approach: representing workflow models using π-calculus. As a result, the model can characterize the dynamic behaviors of the workflow process in terms of the LTS (Labeled Transition System) semantics of π-calculus. The main advantage of the workflow model's formal semantics is that it allows for verification of the model's properties, such as deadlock freedom and normal termination. Moreover, the equivalence of workflow models can be checked through the weak bisimulation theorem in the π-calculus, thus facilitating the optimization of business processes.
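
    As an illustration of the modeling idea (this encoding is a sketch, not the paper's actual formalisation), a single activity can be written as a process that receives its input on one port, performs an internal step, and emits its output on another; sequencing two activities is then parallel composition over a restricted linking channel:

```latex
% One activity: read input, internal step (tau), write output.
A \;\stackrel{\text{def}}{=}\; in(x).\,\tau.\,\overline{out}\langle x \rangle.\,\mathbf{0}

% Sequencing two activities via a private channel m:
W \;\stackrel{\text{def}}{=}\; (\nu m)\bigl(\, in(x).\,\overline{m}\langle x \rangle.\,\mathbf{0}
    \;\mid\; m(y).\,\overline{out}\langle y \rangle.\,\mathbf{0} \,\bigr)
```

    Properties such as deadlock freedom can then be checked on the labeled transition system generated by these processes.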

  15. The Distributed Workflow Management System--FlowAgent

    Institute of Scientific and Technical Information of China (English)

    王文军; 仲萃豪

    2000-01-01

    While mainframe and 2-tier client/server systems have serious problems in flexibility and scalability for large-scale business processes, 3-tier client/server architectures and object-oriented system modeling, which construct business processes on service components, seem to bring software systems some scalability. As an enabling infrastructure for object-oriented methodology, a distributed WFMS (Workflow Management System) can flexibly describe business rules among autonomous 'service tasks' and support the scalability of large-scale business processes. But current distributed WFMSs still have difficulty managing a large number of distributed tasks; the 'multi-TaskDomain' architecture of FlowAgent will try to solve this problem and bring a dynamic and distributed environment for task scheduling.

  16. The impact of medical technology on office workflow.

    Science.gov (United States)

    McEvoy, S P

    2003-01-01

    Digital technologies are gaining wider acceptance within the medical and dental professions. The lure of increased productivity and improved quality entices practices to adapt. These systems are beginning to have a profound impact on workflows within the practice, as well as putting new demands on existing resources. To successfully implement a new technology within your practice, you must look beyond the advertising and discover the real requirements of the system. Vendors rarely help beyond the sale and installation of their equipment, nor do they consider how their product might require you to modify the way you and your staff work. Acquiring the necessary knowledge through self-education, a consultant, or (preferably) a combination of the two is the best way to integrate a new technology into your practice. PMID:14606549

  17. Workflow: a new modeling concept in critical care units.

    Science.gov (United States)

    Yousfi, F; Beuscart, R; Geib, J M

    1995-01-01

    The term Groupware refers to computer-based systems that support groups of people engaged in a common task (goal) and that provide an interface to a shared environment [1]. The absence of a common tool for exchanges between physicians and nurses causes a multiplication of paper supports for the recording of information. Our objective is to study software architectures in particular medical units that allow task coordination and the management of conflicts between participants within a distributed environment. The final goal of this research is to propose a computer solution that could answer the user requirements in Critical Care Units (CCUs). This paper describes the Workflow management approach [5] for supporting group work in the health care field. The emphasis is especially on asynchronous cooperation. This approach was applied to CCUs through the analysis and the proposal of a new architecture [6]. We shall limit ourselves to explaining the control board and analyzing the message management we support. PMID:8591248

  18. Using Simulations to Integrate Technology into Health Care Aides' Workflow

    Directory of Open Access Journals (Sweden)

    Sharla King

    2013-07-01

    Full Text Available Health care aides (HCAs) are critical to home care, providing a range of services to people who have chronic conditions, are aging, or are unable to care for themselves independently. The current HCA supply will not keep up with this increasing demand without fundamental changes in their work environment. One possible solution to some of the workflow challenges and workplace stress of HCAs is hand-held tablet technology. To introduce the use of tablets to HCAs, simulations were developed. Once an HCA was comfortable with the tablet, a simulated client was introduced. The HCA interacted with the simulated client and used the tablet applications to assist in providing care. After the simulations, the HCAs participated in a focus group. HCAs completed a survey before and after the tablet training and simulation to determine their perception and acceptance of the tablet. Future deployment and implementation of technologies in home care should be further evaluated for outcomes.

  19. Experiment planning and execution workflow at ASDEX Upgrade

    International Nuclear Information System (INIS)

    We present the current workflow from experiment proposals to the actual execution and evaluation of discharges at the ASDEX Upgrade tokamak. Requests for experiments are solicited from both within the IPP and from external collaborators in the yearly call-for-proposals, checked for feasibility and compliance with the project's research goals and collected in a proposal database. During the campaign shot requests are derived from the proposals and in weekly operation meetings the requests are mapped to a schedule (shot list). Before the execution of discharges a complete set of configuration data needs to be assembled. After the execution follows the analysis (including the evaluation of the discharge as to its usefulness for the underlying proposal) and logging of the attained parameters in a physics logbook. The paper describes processes, software tools, and information management showing how they ultimately lead to an improved scientific productivity.

  20. Managing Evolving Business Workflows through the Capture of Descriptive Information

    CERN Document Server

    Gaspard, S; Dindeleux, R; McClatchey, R; Gaspard, Sebastien; Estrella, Florida

    2003-01-01

    Business systems these days need to be agile to address the needs of a changing world. In particular, the discipline of Enterprise Application Integration requires business process management to be highly reconfigurable, with the ability to support dynamic workflows, inter-application integration and process reconfiguration. Basing EAI systems on a model-resident, or so-called description-driven, approach enables aspects of flexibility, distribution, system evolution and integration to be addressed in a domain-independent manner. Such a system, called CRISTAL, is described in this paper with particular emphasis on its application to EAI problem domains. A practical example of the CRISTAL technology in the domain of manufacturing systems, called Agilium, is described to demonstrate the principles of model-driven system evolution and integration. The approach is compared to other model-driven development approaches such as the Model-Driven Architecture of the OMG and so-called Adaptive Object Models.

  1. A component based approach to scientific workflow management

    International Nuclear Information System (INIS)

    CRISTAL is a distributed scientific workflow system used in the manufacturing and production phases of HEP experiment construction at CERN. The CRISTAL project has studied the use of a description driven approach, using meta-modelling techniques, to manage the evolving needs of a large physics community. Interest from such diverse communities as bio-informatics and manufacturing has motivated the CRISTAL team to re-engineer the system to customize functionality according to end user requirements but maximize software reuse in the process. The next generation CRISTAL vision is to build a generic component architecture from which a complete software product line can be generated according to the particular needs of the target enterprise. This paper discusses the issues of adopting a component product line based approach and our experiences of software reuse

  2. Impact of survey workflow on precision and accuracy of terrestrial LiDAR datasets

    Science.gov (United States)

    Gold, P. O.; Cowgill, E.; Kreylos, O.

    2009-12-01

    Ground-based LiDAR (Light Detection and Ranging) survey techniques are enabling remote visualization and quantitative analysis of geologic features at unprecedented levels of detail. For example, digital terrain models computed from LiDAR data have been used to measure displaced landforms along active faults and to quantify fault-surface roughness. But how accurately do terrestrial LiDAR data represent the true ground surface, and in particular, how internally consistent and precise are the mosaiced LiDAR datasets from which surface models are constructed? Addressing this question is essential for designing survey workflows that capture the necessary level of accuracy for a given project while minimizing survey time and equipment, which is essential for effective surveying of remote sites. To address this problem, we seek to define a metric that quantifies how scan registration error changes as a function of survey workflow. Specifically, we are using a Trimble GX3D laser scanner to conduct a series of experimental surveys to quantify how common variables in field workflows impact the precision of scan registration. Primary variables we are testing include 1) use of an independently measured network of control points to locate scanner and target positions, 2) the number of known-point locations used to place the scanner and point clouds in 3-D space, 3) the type of target used to measure distances between the scanner and the known points, and 4) setting up the scanner over a known point as opposed to resectioning of known points. Precision of the registered point cloud is quantified using Trimble Realworks software by automatic calculation of registration errors (errors between locations of the same known points in different scans). Accuracy of the registered cloud (i.e., its ground-truth) will be measured in subsequent experiments. 
To obtain an independent measure of scan-registration errors and to better visualize the effects of these errors on a registered point
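
    The registration-error metric being quantified can be illustrated with a generic sketch; this is a plain RMS computation over shared control points, an assumption for illustration rather than Trimble RealWorks' actual procedure:

```python
from math import dist  # Euclidean distance (Python 3.8+)

def registration_rms(points_scan_a, points_scan_b):
    """RMS separation between the same control points as located in
    two registered scans (keys are control-point ids, values xyz)."""
    pairs = [(points_scan_a[k], points_scan_b[k])
             for k in points_scan_a if k in points_scan_b]
    if not pairs:
        raise ValueError("no common control points")
    return (sum(dist(a, b) ** 2 for a, b in pairs) / len(pairs)) ** 0.5

# Hypothetical coordinates (metres): two scans of the same two targets.
scan_a = {"cp1": (0.0, 0.0, 0.0), "cp2": (10.0, 0.0, 0.0)}
scan_b = {"cp1": (0.0, 0.003, 0.0), "cp2": (10.0, 0.0, 0.004)}
print(round(registration_rms(scan_a, scan_b), 4))  # millimetre-level RMS
```

    Tracking how this RMS changes across the workflow variables listed above (control network, number of known points, target type, setup method) gives the kind of precision metric the survey experiments seek.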

  3. A Chaotic Particle Swarm Optimization-Based Heuristic for Market-Oriented Task-Level Scheduling in Cloud Workflow Systems

    OpenAIRE

    Xuejun Li; Jia Xu; Yun Yang

    2015-01-01

    Cloud workflow systems are a kind of platform service based on cloud computing; they facilitate the automation of workflow applications. Compared with its counterparts, a cloud workflow system is most prominently distinguished by its market-oriented business model. The optimization of task-level scheduling in cloud workflow systems is a hot topic. As the scheduling is an NP problem, Ant Colony Optimization (ACO) and Particle Swarm Optimization (PSO) have been proposed to optimize the cost. However, they h...

  4. Grid-Workflow-Management-Systeme für die Ausführung wissenschaftlicher Prozessabläufe

    OpenAIRE

    Ekaterina Elts; Hans-Joachim Bungartz

    2016-01-01

    Executing complex scientific process flows (workflows) in a distributed and heterogeneous computing and software environment requires dedicated workflow management systems. In this report, several internationally recognized workflow management systems were examined and compared. Particular attention was paid to the special requirements of scientific workflows (as opposed to business processes) and to the respective particularities of the ...

  5. From Peer-Reviewed to Peer-Reproduced in Scholarly Publishing: The Complementary Roles of Data Models and Workflows in Bioinformatics

    Science.gov (United States)

    Zhao, Jun; Avila-Garcia, Maria Susana; Roos, Marco; Thompson, Mark; van der Horst, Eelke; Kaliyaperumal, Rajaram; Luo, Ruibang; Lee, Tin-Lap; Lam, Tak-wah; Edmunds, Scott C.; Sansone, Susanna-Assunta

    2015-01-01

    Motivation Reproducing the results from a scientific paper can be challenging due to the absence of data and the computational tools required for their analysis. In addition, details relating to the procedures used to obtain the published results can be difficult to discern due to the use of natural language when reporting how experiments have been performed. The Investigation/Study/Assay (ISA), Nanopublications (NP), and Research Objects (RO) models are conceptual data modelling frameworks that can structure such information from scientific papers. Computational workflow platforms can also be used to reproduce analyses of data in a principled manner. We assessed the extent to which ISA, NP, and RO models, together with the Galaxy workflow system, can capture the experimental processes and reproduce the findings of a previously published paper reporting on the development of SOAPdenovo2, a de novo genome assembler. Results Executable workflows were developed using Galaxy, which reproduced results that were consistent with the published findings. A structured representation of the information in the SOAPdenovo2 paper was produced by combining the use of ISA, NP, and RO models. By structuring the information in the published paper using these data and scientific workflow modelling frameworks, it was possible to explicitly declare elements of experimental design, variables, and findings. The models served as guides in the curation of scientific information, and this led to the identification of inconsistencies in the original published paper, thereby allowing its authors to publish corrections in the form of an erratum. Availability SOAPdenovo2 scripts, data, and results are available through the GigaScience Database: http://dx.doi.org/10.5524/100044; the workflows are available from GigaGalaxy: http://galaxy.cbiit.cuhk.edu.hk; and the representations using the ISA, NP, and RO models are available through the SOAPdenovo2 case study website http

  6. From Peer-Reviewed to Peer-Reproduced in Scholarly Publishing: The Complementary Roles of Data Models and Workflows in Bioinformatics.

    Directory of Open Access Journals (Sweden)

    Alejandra González-Beltrán

    Full Text Available Reproducing the results from a scientific paper can be challenging due to the absence of data and the computational tools required for their analysis. In addition, details relating to the procedures used to obtain the published results can be difficult to discern due to the use of natural language when reporting how experiments have been performed. The Investigation/Study/Assay (ISA), Nanopublications (NP), and Research Objects (RO) models are conceptual data modelling frameworks that can structure such information from scientific papers. Computational workflow platforms can also be used to reproduce analyses of data in a principled manner. We assessed the extent to which ISA, NP, and RO models, together with the Galaxy workflow system, can capture the experimental processes and reproduce the findings of a previously published paper reporting on the development of SOAPdenovo2, a de novo genome assembler. Executable workflows were developed using Galaxy, which reproduced results that were consistent with the published findings. A structured representation of the information in the SOAPdenovo2 paper was produced by combining the use of the ISA, NP, and RO models. By structuring the information in the published paper using these data and scientific workflow modelling frameworks, it was possible to explicitly declare elements of experimental design, variables, and findings. The models served as guides in the curation of scientific information, and this led to the identification of inconsistencies in the original published paper, thereby allowing its authors to publish corrections in the form of an erratum. SOAPdenovo2 scripts, data, and results are available through the GigaScience Database: http://dx.doi.org/10.5524/100044; the workflows are available from GigaGalaxy: http://galaxy.cbiit.cuhk.edu.hk; and the representations using the ISA, NP, and RO models are available through the SOAPdenovo2 case study website http://isa-tools.github.io/soapdenovo2

  7. Workflow Management Application Programming Interface Specification%工作流管理应用编程接口规范

    Institute of Scientific and Technical Information of China (English)

    刘华伟; 吴朝晖

    2000-01-01

    The document 'Workflow Management Application Programming Interface Specification' is distributed by the Workflow Management Coalition to specify standard APIs that can be supported by workflow management products. In this paper, we first introduce the two parts of this interface, then discuss the standardized data structures and function definitions, and finally discuss future work.

  8. The standard-based open workflow system in GeoBrain (Invited)

    Science.gov (United States)

    Di, L.; Yu, G.; Zhao, P.; Deng, M.

    2013-12-01

    GeoBrain is an Earth science Web-service system developed and operated by the Center for Spatial Information Science and Systems, George Mason University. In GeoBrain, a standard-based open workflow system has been implemented to accommodate the automated processing of geospatial data through a set of complex geo-processing functions for advanced product generation. GeoBrain models complex geoprocessing at two levels, conceptual and concrete. At the conceptual level, workflows exist in the form of data and service types defined by ontologies; these workflows are called geo-processing models and are cataloged in GeoBrain as virtual product types. A conceptual workflow is instantiated into a concrete, executable workflow when a user requests a product that matches a virtual product type. Both conceptual and concrete workflows are encoded in the Business Process Execution Language (BPEL). A BPEL workflow engine, called BPELPower, has been implemented to execute the workflows for product generation. A provenance capturing service has been implemented to generate ISO 19115-compliant complete product provenance metadata before and after workflow execution. Generating provenance metadata before the workflow execution allows users to examine the usability of the final product before the lengthy and expensive execution takes place. The three modes of workflow execution defined in ISO 19119, transparent, translucent, and opaque, are available in GeoBrain. A geoprocessing modeling portal has been developed to allow domain experts to develop geoprocessing models at the type level, with the support of both data and service/processing ontologies. The geoprocessing models capture the knowledge of the domain experts and become the operational offerings for the products after a proper peer review of the models. Automated workflow composition has been demonstrated successfully based on ontologies and artificial

  9. Overcoming the Challenges of Implementing a Multi-Mission Distributed Workflow System

    Science.gov (United States)

    Sayfi, Elias; Cheng, Cecilia; Lee, Hyun; Patel, Rajesh; Takagi, Atsuya; Yu, Dan

    2009-01-01

    A multi-mission approach to solving the same problems for various projects is enticing. However, the multi-mission approach leads to the need to develop a configurable, adaptable and distributed system to meet unique project requirements. That, in turn, leads to a set of challenges varying from handling synchronization issues to coming up with a smart design that allows the "unknowns" to be decided later. This paper discusses the challenges that the Multi-mission Automated Task Invocation Subsystem (MATIS) team has come up against while designing the distributed workflow system, and elaborates on the solutions that were implemented. The first challenge is to design an easily adaptable system that requires no code changes as a result of configuration changes. The number of formal deliveries is often limited because each delivery costs time and money. Changes such as the sequence of programs being called, or a change of a parameter value in a program being automated, should not result in code changes or redelivery.

  10. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    Science.gov (United States)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish a workflow for, and to demonstrate, a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  11. Ergonomic design for dental offices.

    Science.gov (United States)

    Ahearn, David J; Sanders, Martha J; Turcotte, Claudia

    2010-01-01

    The increasing complexity of the dental office environment influences productivity and workflow for dental clinicians. Advances in technology, and with them the range of products needed to provide services, have led to sprawl in operatory setups and the potential for awkward postures for dental clinicians during the delivery of oral health services. Although ergonomics often addresses the prevention of musculoskeletal disorders for specific populations of workers, concepts of workflow and productivity are integral to improved practice in work environments. This article provides suggestions for improving workflow and productivity for dental clinicians. The article applies ergonomic principles to dental practice issues such as equipment and supply management, office design, and workflow management. Implications for improved ergonomic processes and future research are explored. PMID:20448328

  12. High performance workflow implementation for protein surface characterization using grid technology

    Directory of Open Access Journals (Sweden)

    Clematis Andrea

    2005-12-01

    Full Text Available Abstract Background This study concerns the development of a high performance workflow that, using grid technology, correlates different kinds of Bioinformatics data, starting from the base pairs of the nucleotide sequence to the exposed residues of the protein surface. The implementation of this workflow is based on the Italian Grid.it project infrastructure, which is a network of several computational resources and storage facilities distributed at different grid sites. Methods Workflows are very common in Bioinformatics because they allow large quantities of data to be processed by delegating the management of resources to the information streaming. Grid technology optimizes the computational load during the different workflow steps, dividing the more expensive tasks into a set of small jobs. Results Grid technology allows efficient database management, a crucial problem for obtaining good results in Bioinformatics applications. The proposed workflow is implemented to integrate huge amounts of data, and the results themselves must be stored into a relational database, which represents the added value to the global knowledge. Conclusion A web interface has been developed to make this technology accessible to grid users. Once the workflow has started, by means of the simplified interface, it is possible to follow all the different steps throughout the data processing. Finally, when the workflow has terminated, the different features of the protein, such as the amino acids exposed on the protein surface, can be compared with the data present in the output database.
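The core scheduling idea, dividing one expensive task into a set of small independent jobs whose results are merged back, can be sketched as follows. The chunking scheme and the per-job computation are hypothetical placeholders, not the Grid.it middleware:

```python
def split_into_jobs(sequence, chunk_size):
    """Divide one expensive task over a long sequence into small independent jobs."""
    return [sequence[i:i + chunk_size] for i in range(0, len(sequence), chunk_size)]

def run_job(chunk):
    """Stand-in for an expensive per-residue computation on one grid node."""
    return sum(len(s) for s in chunk)

# Hypothetical input: residues whose surface exposure is to be analyzed.
residues = ["MET", "ALA", "GLY", "SER", "LYS"]
jobs = split_into_jobs(residues, 2)
total = sum(run_job(j) for j in jobs)   # merge step, e.g. insertion into the output database
print(jobs, total)
```

Because the jobs are independent, each chunk can be submitted to a different grid site and the merge performed once all jobs report back.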

  13. Measuring Semantic and Structural Information for Data Oriented Workflow Retrieval with Cost Constraints

    Directory of Open Access Journals (Sweden)

    Yinglong Ma

    2014-01-01

    Full Text Available The reuse of data oriented workflows (DOWs can reduce the cost of workflow system development and control the risk of project failure and therefore is crucial for accelerating the automation of business processes. Reusing workflows can be achieved by measuring the similarity among candidate workflows and selecting from them the workflow satisfying users' requirements. However, because DOWs are often developed in open, distributed, and heterogeneous environments, different users may impose diverse cost constraints on them. This makes the reuse of DOWs challenging, and there is no clear solution for retrieving DOWs with cost constraints. In this paper, we present a novel graph based model of DOWs with cost constraints, called constrained data oriented workflow (CDW, which can express the cost constraints that users are often concerned about. An approach is proposed for retrieving CDWs which seamlessly combines their semantic and structural information. A distance measure based on matrix theory is adopted to combine the semantic and structural similarities of CDWs for selecting and reusing them. Finally, experiments are presented to show the effectiveness and efficiency of our approach.
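The selection step, combining a semantic similarity and a structural similarity into one score and picking the best candidate workflow, can be sketched as below. The linear blend and the scores are illustrative assumptions standing in for the paper's matrix-based distance measure:

```python
def combined_similarity(sem, struct, alpha=0.5):
    """Blend semantic and structural similarity into one score in [0, 1].

    alpha weights the semantic part; this linear combination is an
    illustrative stand-in for the paper's matrix-theory distance measure.
    """
    return alpha * sem + (1 - alpha) * struct

def best_match(query_scores, candidates):
    """Pick the candidate workflow with the highest combined score."""
    return max(candidates, key=lambda c: combined_similarity(*query_scores[c]))

# Hypothetical (semantic, structural) similarities of two candidate workflows.
query_scores = {
    "wf_invoice": (0.9, 0.4),
    "wf_payment": (0.6, 0.8),
}
print(best_match(query_scores, ["wf_invoice", "wf_payment"]))
```

Note how the weighting matters: `wf_invoice` wins on semantics alone, but with equal weights `wf_payment` scores 0.7 against 0.65 and is selected.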

  14. An Inter-enterprise Workflow Model for Supply Chain and B2B E-commerce

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The goals of B2B electronic commerce and supply chain management systems are to implement interoperability of independent enterprises, to smooth the information flow between them, and to deploy business processes over multiple enterprises. The inherent characteristics of workflow systems make them suitable for implementing cross-organization management. This paper first proposes an agreement-based inter-enterprise workflow model to support the construction of supply chain management systems and B2B electronic commerce; this model extends the standard workflow model. An architecture supporting the model is then introduced, detailing in particular the structure and implementation of the interfaces between enterprises.

  15. Konsistenz und Integrität in Workflows als Kontrollmechanismen dynamischer Änderungen

    OpenAIRE

    Herzberger, Thomas

    1999-01-01

    Current workflow management systems still mostly prescribe a strict temporal separation and sequencing of the modeling and execution phases, which goes back to the origins of workflow management development and its motivation. Attempts to use workflow management systems in other areas to execute business processes encounter the difficulty that many processes are not completely structured. As a result, these can only be handled with considerable additional...

  16. Un modelo de referencia para definir la perspectiva organizacional de modelos de workflows

    OpenAIRE

    Stroppi, Luis Jesús Ramón; Villarreal, Pablo David

    2009-01-01

    The support that modeling tools and workflow management systems provide for the organizational perspective is limited. This is due to the lack of tools that make it possible to understand the work-distribution requirements defined in a workflow model. This work presents a reference model that supports the definition of the organizational perspective of workflow models. The model provides a set of activity attributes that make it possible to identify...

  17. Facilitating Stewardship of scientific data through standards based workflows

    Science.gov (United States)

    Bastrakova, I.; Kemp, C.; Potter, A. K.

    2013-12-01

    scientific data acquisition and analysis requirements and effective interoperable data management and delivery. This includes participating in national and international dialogue on the development of standards, embedding data management activities in business processes, and developing scientific staff as effective data stewards. A similar approach is applied to geophysical data. By ensuring that the geophysical datasets at GA strictly follow metadata and industry standards, we are able to implement a provenance based workflow in which the data is easily discoverable, geophysical processing can be applied to it, and results can be stored. The provenance based workflow enables metadata records for the results to be produced automatically from the input dataset metadata.
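Producing a result's metadata record automatically from the input dataset's metadata plus the processing step applied can be sketched as follows. The field names and the processing step are hypothetical, not GA's actual metadata schema:

```python
def derive_result_metadata(input_meta, process_name, params):
    """Build a metadata record for a processing result automatically,
    carrying forward the input dataset's identity and extending its lineage."""
    return {
        "title": f"{input_meta['title']} ({process_name})",
        "lineage": input_meta.get("lineage", []) + [process_name],
        "source_id": input_meta["id"],
        "parameters": params,
    }

# Hypothetical input dataset and processing step.
survey = {"id": "GA-MAG-001", "title": "Magnetic survey", "lineage": []}
result = derive_result_metadata(survey, "reduction-to-pole", {"inclination": -60})
print(result)
```

Each processing step appends itself to the lineage, so a result's full provenance chain remains discoverable without any manual metadata entry.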

  18. Enabling Efficient Climate Science Workflows in High Performance Computing Environments

    Science.gov (United States)

    Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.

    2015-12-01

    A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks provides a myriad of challenges when running in a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require a lot of forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project we highlight a stack of tools our team utilizes and has developed to ensure that large scale simulation and analysis work are commonplace, providing operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth Systems Grid Federation (ESGF), all while executing everything in between in a scalable environment in a task parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases to which they have been applied.
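The task-parallel pattern described above, independent analysis tasks mapped over data chunks with results gathered at the end, can be sketched with the standard library. CASCADE uses MPI for this; the executor here is a simplified stand-in, and the per-chunk analysis is a hypothetical block-maximum step of the kind used in extremes detection:

```python
from concurrent.futures import ThreadPoolExecutor  # stand-in for MPI task parallelism

def analyze(chunk):
    """Stand-in per-chunk analysis: a block maximum, as in extremes detection."""
    return max(chunk)

def task_parallel_analysis(chunks, workers=4):
    """Map independent analysis tasks over data chunks in parallel, gather results in order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(analyze, chunks))

# Hypothetical data: three chunks of a climate field, e.g. daily temperatures.
chunks = [[1.2, 3.4], [5.6, 0.1], [2.2, 2.3]]
print(task_parallel_analysis(chunks))
```

Because each chunk is analyzed independently, the same structure scales from a laptop pool to MPI ranks across a supercomputer without changing the analysis routine itself.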

  19. A novel workflow for seismic net pay estimation with uncertainty

    CERN Document Server

    Glinsky, Michael E; Unaldi, Muhlis; Nagassar, Vishal

    2016-01-01

    This paper presents a novel workflow for seismic net pay estimation with uncertainty, demonstrated on the Cassra/Iris Field. The theory for the stochastic wavelet derivation (which estimates the seismic noise level along with the wavelet, time-to-depth mapping, and their uncertainties), the stochastic sparse spike inversion, and the net pay estimation (using secant areas) along with its uncertainty will be outlined. This includes benchmarking of this methodology on a synthetic model. A critical part of this process is the calibration of the secant areas, which is done in two steps. First, a preliminary calibration is done with the stochastic reflection response modeling using rock physics relationships derived from the well logs. Second, a refinement is made to the calibration to account for the encountered net pay at the wells. Finally, a variogram structure is estimated from the extracted secant area map, then used to build in the lateral correlation to the ensemble of net pay maps while matc...
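The variogram estimation step uses the classical empirical semivariogram, which for a 1-D transect is gamma(h) = (1 / (2 N(h))) * sum over pairs at lag h of (z[i+h] - z[i])^2. A minimal sketch, with hypothetical net-pay samples standing in for a line through the secant-area map:

```python
def empirical_variogram(values, max_lag):
    """Classical 1-D empirical semivariogram:
    gamma(h) = (1 / (2 * N(h))) * sum of (z[i+h] - z[i])**2 over all pairs at lag h.
    """
    gammas = {}
    n = len(values)
    for h in range(1, max_lag + 1):
        pairs = [(values[i + h] - values[i]) ** 2 for i in range(n - h)]
        gammas[h] = sum(pairs) / (2 * len(pairs))
    return gammas

# Hypothetical net-pay samples along one line of the secant-area map.
z = [2.0, 2.5, 3.0, 2.0, 1.5]
print(empirical_variogram(z, 2))
```

The growth of gamma with lag summarizes the lateral correlation structure that is then imposed on the ensemble of net pay maps.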

  20. Accelerating Science Impact through Big Data Workflow Management and Supercomputing

    Science.gov (United States)

    De, K.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Ryabinkin, E.; Wenaus, T.

    2016-02-01

    The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. ATLAS, one of the largest collaborations ever assembled in the history of science, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment relies on a heterogeneous distributed computational infrastructure. To manage the workflow for all data processing on hundreds of data centers, the PanDA (Production and Distributed Analysis) Workload Management System is used. An ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF), is being realized within the BigPanDA and megaPanDA projects. These projects are now exploring how PanDA might be used for managing computing jobs that run on supercomputers including OLCF's Titan and NRC-KI HPC2. The main idea is to reuse, as much as possible, existing components of the PanDA system that are already deployed on the LHC Grid for analysis of physics data. The next generation of PanDA will allow many data-intensive sciences employing a variety of computing platforms to benefit from ATLAS experience and proven tools in highly scalable processing.