WorldWideScience

Sample records for cambafx workflow design

  1. CamBAfx: Workflow Design, Implementation and Application for Neuroimaging.

    Science.gov (United States)

    Ooi, Cinly; Bullmore, Edward T; Wink, Alle-Meije; Sendur, Levent; Barnes, Anna; Achard, Sophie; Aspden, John; Abbott, Sanja; Yue, Shigang; Kitzbichler, Manfred; Meunier, David; Maxim, Voichita; Salvador, Raymond; Henty, Julian; Tait, Roger; Subramaniam, Naresh; Suckling, John

    2009-01-01

    CamBAfx is a workflow application designed for both researchers who use workflows to process data (consumers) and those who design them (designers). It provides a front-end (user interface) optimized for data processing designed in a way familiar to consumers. The back-end uses a pipeline model to represent workflows since this is a common and useful metaphor used by designers and is easy to manipulate compared to other representations like programming scripts. As an Eclipse Rich Client Platform application, CamBAfx's pipelines and functions can be bundled with the software or downloaded post-installation. The user interface contains all the workflow facilities expected by consumers. Using the Eclipse Extension Mechanism designers are encouraged to customize CamBAfx for their own pipelines. CamBAfx wraps a workflow facility around neuroinformatics software without modification. CamBAfx's design, licensing and Eclipse Branding Mechanism allow it to be used as the user interface for other software, facilitating exchange of innovative computational tools between originating labs.
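The pipeline metaphor this abstract describes can be reduced to a small sketch: stages composed in order, with new stages attachable after installation. The Python below is purely illustrative and is not CamBAfx's actual API (CamBAfx is an Eclipse/Java application); all names are invented.

```python
# Minimal sketch of the pipeline metaphor for workflows: each stage is a
# function, and a pipeline is an ordered composition of stages.
# Illustrative only -- not CamBAfx's actual API.

class Pipeline:
    def __init__(self, *stages):
        self.stages = list(stages)  # ordered processing steps

    def add_stage(self, stage):
        """Append a stage, mimicking post-installation extension."""
        self.stages.append(stage)
        return self

    def run(self, data):
        """Push data through every stage in order."""
        for stage in self.stages:
            data = stage(data)
        return data

# Example stages a processing pipeline might chain together.
def smooth(volumes):
    return [v * 0.5 for v in volumes]

def threshold(volumes):
    return [v for v in volumes if v > 1.0]

pipeline = Pipeline(smooth).add_stage(threshold)
print(pipeline.run([1.0, 4.0, 6.0]))  # [2.0, 3.0]
```

The pipeline representation stays easy to manipulate precisely because adding, removing, or reordering stages is a list operation rather than a script edit.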

  2. CamBAfx: workflow design, implementation and application for neuroimaging

    Directory of Open Access Journals (Sweden)

    Cinly Ooi

    2009-08-01

Full Text Available CamBAfx is a workflow application designed for both researchers who use workflows to process data (consumers) and those who design them (designers). It provides a front-end (user interface) optimized for data processing designed in a way familiar to consumers. The back-end uses a pipeline model to represent workflows since this is a common and useful metaphor used by designers and is easy to manipulate compared to other representations like programming scripts. As an Eclipse Rich Client Platform application, CamBAfx's pipelines and functions can be bundled with the software or downloaded post-installation. The user interface contains all the workflow facilities expected by consumers. Using the Eclipse Extension Mechanism designers are encouraged to customize CamBAfx for their own pipelines. CamBAfx wraps a workflow facility around neuroinformatics software without modification. CamBAfx's design, licensing and Eclipse Branding Mechanism allow it to be used as the user interface for other software, facilitating exchange of innovative computational tools between originating labs.

  3. Towards Automated Design, Analysis and Optimization of Declarative Curation Workflows

    Directory of Open Access Journals (Sweden)

    Tianhong Song

    2014-10-01

    Full Text Available Data curation is increasingly important. Our previous work on a Kepler curation package has demonstrated advantages that come from automating data curation pipelines by using workflow systems. However, manually designed curation workflows can be error-prone and inefficient due to a lack of user understanding of the workflow system, misuse of actors, or human error. Correcting problematic workflows is often very time-consuming. A more proactive workflow system can help users avoid such pitfalls. For example, static analysis before execution can be used to detect the potential problems in a workflow and help the user to improve workflow design. In this paper, we propose a declarative workflow approach that supports semi-automated workflow design, analysis and optimization. We show how the workflow design engine helps users to construct data curation workflows, how the workflow analysis engine detects different design problems of workflows and how workflows can be optimized by exploiting parallelism.
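The pre-execution static analysis described above can be illustrated with a toy type check over actor connections: incompatible links are reported as design problems before anything runs. The actor names and type vocabulary here are invented, not part of the Kepler curation package.

```python
# Hypothetical sketch of static analysis on a declarative curation workflow:
# verify that each connection links an actor's output type to a compatible
# input type before execution. All names are illustrative.

actors = {
    "ReadRecords": {"consumes": None,      "produces": "records"},
    "CleanDates":  {"consumes": "records", "produces": "records"},
    "PlotSummary": {"consumes": "table",   "produces": "figure"},
}

workflow = [("ReadRecords", "CleanDates"), ("CleanDates", "PlotSummary")]

def analyze(workflow, actors):
    """Return a list of design problems found without executing anything."""
    problems = []
    for src, dst in workflow:
        out_t = actors[src]["produces"]
        in_t = actors[dst]["consumes"]
        if out_t != in_t:
            problems.append(f"{src} produces '{out_t}' but {dst} expects '{in_t}'")
    return problems

print(analyze(workflow, actors))
# Flags the records -> table mismatch before the curation pipeline is run.
```

Catching the mismatch statically is exactly the proactive behavior the abstract argues for: the user fixes the design instead of debugging a failed run.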

  4. The design of cloud workflow systems

    CERN Document Server

    Liu, Xiao; Zhang, Gaofeng

    2011-01-01

Cloud computing is the latest market-oriented computing paradigm which brings software design and development into a new era characterized by "XaaS", i.e. everything as a service. Cloud workflows, as typical software applications in the cloud, are composed of a set of partially ordered cloud software services to achieve specific goals. However, due to the low QoS (quality of service) nature of the cloud environment, the design of workflow systems in the cloud becomes a challenging issue for the delivery of high-quality cloud workflow applications. To address such an issue, this book presents

  5. Designing Flexible E-Business Workflow Systems

    Directory of Open Access Journals (Sweden)

    Cătălin Silvestru

    2010-01-01

Full Text Available In today’s business environment, organizations must cope with complex interactions between actors, adapt quickly to frequent market changes, and be innovative. In this context, integrating knowledge with processes and Business Intelligence is a major step towards improving organizational agility. Therefore, traditional environments for workflow design have been adapted to answer the new business models and current requirements in the field of collaborative processes. This paper approaches the design of flexible and dynamic workflow management systems for electronic businesses that can lead to agility.

  6. Design, Modelling and Analysis of a Workflow Reconfiguration

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Abouzaid, Faisal; Dragoni, Nicola

    2011-01-01

This paper describes a case study involving the reconfiguration of an office workflow. We state the requirements on a system implementing the workflow and its reconfiguration, and describe the system’s design in BPMN. We then use an asynchronous pi-calculus and webπ to model the design and to verify…

  7. Designing a road map for geoscience workflows

    Science.gov (United States)

    Duffy, Christopher; Gil, Yolanda; Deelman, Ewa; Marru, Suresh; Pierce, Marlon; Demir, Ibrahim; Wiener, Gerry

    2012-06-01

    Advances in geoscience research and discovery are fundamentally tied to data and computation, but formal strategies for managing the diversity of models and data resources in the Earth sciences have not yet been resolved or fully appreciated. The U.S. National Science Foundation (NSF) EarthCube initiative (http://earthcube.ning.com), which aims to support community-guided cyberinfrastructure to integrate data and information across the geosciences, recently funded four community development activities: Geoscience Workflows; Semantics and Ontologies; Data Discovery, Mining, and Integration; and Governance. The Geoscience Workflows working group, with broad participation from the geosciences, cyberinfrastructure, and other relevant communities, is formulating a workflows road map (http://sites.google.com/site/earthcubeworkflow/). The Geoscience Workflows team coordinates with each of the other community development groups given their direct relevance to workflows. Semantics and ontologies are mechanisms for describing workflows and the data they process.

  8. Design decisions in workflow management and quality of work.

    NARCIS (Netherlands)

    Waal, B.M.E. de; Batenburg, R.

    2009-01-01

In this paper, the design and implementation of a workflow management (WFM) system in a large Dutch social insurance organisation is described. The effect of workflow design decisions on the quality of work is explored theoretically and empirically, using the model of Zur Mühlen as a frame of reference.

  9. A prototype of workflow management system for construction design projects

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

Substantial benefits can be achieved if information and processes are integrated within a building design project. This paper aims to establish a prototype workflow management system for construction design projects through the application of workflow technology. The composition and functions of the prototype are presented to satisfy the needs of information sharing and process integration. By integrating all subsystems and modules of the prototype, the whole system can deal with design information-flow modeling, emulating and optimizing, task planning and distributing, automatic tracking and monitoring, as well as network service, etc. In this way, a collaborative design environment for building design projects is brought into being.

  10. A computational workflow for designing silicon donor qubits

    Science.gov (United States)

    Humble, Travis S.; Ericson, M. Nance; Jakowski, Jacek; Huang, Jingsong; Britton, Charles; Curtis, Franklin G.; Dumitrescu, Eugene F.; Mohiyaddin, Fahd A.; Sumpter, Bobby G.

    2016-10-01

    Developing devices that can reliably and accurately demonstrate the principles of superposition and entanglement is an on-going challenge for the quantum computing community. Modeling and simulation offer attractive means of testing early device designs and establishing expectations for operational performance. However, the complex integrated material systems required by quantum device designs are not captured by any single existing computational modeling method. We examine the development and analysis of a multi-staged computational workflow that can be used to design and characterize silicon donor qubit systems with modeling and simulation. Our approach integrates quantum chemistry calculations with electrostatic field solvers to perform detailed simulations of a phosphorus dopant in silicon. We show how atomistic details can be synthesized into an operational model for the logical gates that define quantum computation in this particular technology. The resulting computational workflow realizes a design tool for silicon donor qubits that can help verify and validate current and near-term experimental devices.

  11. Designing Collaborative Healthcare Technology for the Acute Care Workflow

    Directory of Open Access Journals (Sweden)

    Michael Gonzales

    2015-10-01

Full Text Available Preventable medical errors in hospitals are the third leading cause of death in the United States. Many of these are caused by poor situational awareness, especially in acute care resuscitation scenarios. While a number of checklists and technological interventions have been developed to reduce cognitive load and improve situational awareness, these tools often do not fit the clinical workflow. To better understand the challenges faced by clinicians in acute care codes, we conducted a qualitative study with interprofessional clinicians at three regional hospitals. Our key findings are: current documentation processes are inadequate (with information recorded on paper towels); reference guides can serve as fixation points, reducing rather than enhancing situational awareness; the physical environment imposes significant constraints on workflow; homegrown solutions are often used to handle unstandardized processes; and simulation scenarios do not match real-world practice. We present a number of considerations for collaborative healthcare technology design and discuss the implications of our findings on current work for the development of more effective interventions for acute care resuscitation scenarios.

  12. ANALYSIS OF WORKFLOW ON DESIGN PROJECTS IN INDIA

    Directory of Open Access Journals (Sweden)

    Senthilkumar Venkatachalam

    2010-12-01

Full Text Available Proposal: The increase in privately funded infrastructure construction in India has compelled project owners to demand highly compressed project schedules due to political risks and early revenue generation. As a result, many of the contracts are based on EPC (Engineering, Procurement and Construction) contracts, enabling the contractor to plan and control the EPC phases. Sole responsibility for the three phases has facilitated the use of innovative approaches such as fast-track construction and concurrent engineering in order to minimize project duration. As part of a research study to improve design processes, the first author spent a year as an observer on two design projects carried out by a leading EPC contractor in India. Both projects required accelerated design and fast-track construction. The first project involved the detailed design of a coal handling unit for a power plant, and the second the preliminary phase of a large airport design project. The research team had the mandate to analyze the design process and suggest changes to make it more efficient. On the first project, detailed data on the design/drawing workflow was collected and analyzed. The paper presents the analysis of the data identifying the bottlenecks in the process and compares the analysis results with the perceptions of the design team. On the second project, the overall organizational structure for coordinating the interfaces between the design processes was evaluated. The paper presents a structured method to organize the interface and interactions between the various design disciplines. The details of the method proposed, implementation issues and outcomes of implementation are also discussed.
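The bottleneck identification described above can be illustrated with a toy analysis over per-stage cycle times: the stage with the largest mean delay is flagged. The stage names and numbers below are invented, not data from the study.

```python
# Illustrative bottleneck detection over design/drawing workflow data:
# flag the stage with the largest mean cycle time. Data is invented.

from statistics import mean

cycle_times_days = {
    "structural_design": [4, 5, 6],
    "drawing_review": [9, 11, 10],        # approvals queue up here
    "issue_for_construction": [2, 3, 2],
}

def find_bottleneck(cycle_times):
    """Return (stage, mean_days) for the slowest stage."""
    stage = max(cycle_times, key=lambda s: mean(cycle_times[s]))
    return stage, mean(cycle_times[stage])

stage, days = find_bottleneck(cycle_times_days)
print(stage, days)  # drawing_review 10
```

Comparing such measured bottlenecks with the design team's perceptions is precisely the contrast the paper draws on its first project.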

  13. Design and Implementation of Visualized Workflow Modeling System Based on B/S Structure

    Institute of Scientific and Technical Information of China (English)

    WANG Jian; LI wei-li

    2007-01-01

According to the necessity of a flexible workflow management system, a solution to set up a visualized workflow modelling system based on the B/S structure is put forward, which conforms to the relevant specifications of the WfMC and the workflow process definition meta-model. The design of the system structure is presented in detail, and the key technologies for system implementation are also introduced. Additionally, an example is illustrated to demonstrate the validity of the system.

  14. Design and Evaluation of Data Annotation Workflows for CAVE-like Virtual Environments.

    Science.gov (United States)

    Pick, Sebastian; Weyers, Benjamin; Hentschel, Bernd; Kuhlen, Torsten W

    2016-04-01

    Data annotation finds increasing use in Virtual Reality applications with the goal to support the data analysis process, such as architectural reviews. In this context, a variety of different annotation systems for application to immersive virtual environments have been presented. While many interesting interaction designs for the data annotation workflow have emerged from them, important details and evaluations are often omitted. In particular, we observe that the process of handling metadata to interactively create and manage complex annotations is often not covered in detail. In this paper, we strive to improve this situation by focusing on the design of data annotation workflows and their evaluation. We propose a workflow design that facilitates the most important annotation operations, i.e., annotation creation, review, and modification. Our workflow design is easily extensible in terms of supported annotation and metadata types as well as interaction techniques, which makes it suitable for a variety of application scenarios. To evaluate it, we have conducted a user study in a CAVE-like virtual environment in which we compared our design to two alternatives in terms of a realistic annotation creation task. Our design obtained good results in terms of task performance and user experience.
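The three core operations the workflow design facilitates (annotation creation, review, and modification) can be sketched over a minimal annotation store with free-form metadata. The schema below is illustrative, not the paper's actual design.

```python
# Sketch of the three core annotation operations -- create, review, modify --
# over a simple store with extensible, free-form metadata. Illustrative only.

import itertools

class AnnotationStore:
    def __init__(self):
        self._annotations = {}
        self._ids = itertools.count(1)

    def create(self, anchor, text, **metadata):
        """Create an annotation at a 3D anchor point with arbitrary metadata."""
        aid = next(self._ids)
        self._annotations[aid] = {"anchor": anchor, "text": text,
                                  "metadata": dict(metadata)}
        return aid

    def review(self, aid):
        """Look up an annotation for inspection."""
        return self._annotations[aid]

    def modify(self, aid, text=None, **metadata):
        """Update the text and/or merge in new metadata fields."""
        ann = self._annotations[aid]
        if text is not None:
            ann["text"] = text
        ann["metadata"].update(metadata)  # extensible metadata types

store = AnnotationStore()
aid = store.create(anchor=(1.0, 0.5, 2.0), text="check clearance",
                   author="reviewer-1")
store.modify(aid, status="resolved")
print(store.review(aid)["metadata"]["status"])  # resolved
```

Keeping metadata as an open dictionary mirrors the extensibility requirement the abstract emphasizes: new annotation and metadata types slot in without changing the workflow.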

  15. Armadillo 1.1: an original workflow platform for designing and conducting phylogenetic analysis and simulations.

    Directory of Open Access Journals (Sweden)

    Etienne Lord

Full Text Available In this paper we introduce Armadillo v1.1, a novel workflow platform dedicated to designing and conducting phylogenetic studies, including comprehensive simulations. A number of important phylogenetic and general bioinformatics tools have been included in the first software release. As Armadillo is an open-source project, it allows scientists to develop their own modules as well as to integrate existing computer applications. Using our workflow platform, different complex phylogenetic tasks can be modeled and presented in a single workflow without any prior knowledge of programming techniques. The first version of Armadillo was successfully used by professors of bioinformatics at the Université du Québec à Montréal during graduate computational biology courses taught in 2010-11. The program and its source code are freely available at: .

  16. From Design to Production Control Through the Integration of Engineering Data Management and Workflow Management Systems

    CERN Document Server

    Le Goff, J M; Bityukov, S; Estrella, F; Kovács, Z; Le Flour, T; Lieunard, S; McClatchey, R; Murray, S; Organtini, G; Vialle, J P; Bazan, A; Chevenier, G

    1997-01-01

At a time when many companies are under pressure to reduce "times-to-market" the management of product information from the early stages of design through assembly to manufacture and production has become increasingly important. Similarly in the construction of high energy physics devices the collection of (often evolving) engineering data is central to the subsequent physics analysis. Traditionally in industry design engineers have employed Engineering Data Management Systems (also called Product Data Management Systems) to coordinate and control access to documented versions of product designs. However, these systems provide control only at the collaborative design level and are seldom used beyond design. Workflow management systems, on the other hand, are employed in industry to coordinate and support the more complex and repeatable work processes of the production environment. Commercial workflow products cannot support the highly dynamic activities found both in the design stages of product development...

  17. Linear CMOS RF power amplifiers a complete design workflow

    CERN Document Server

    Ruiz, Hector Solar

    2013-01-01

The work establishes the design flow for the optimization of linear CMOS power amplifiers from the first steps of the design to the final IC implementation and tests. The authors also focus on design guidelines for the inductor's geometrical characteristics for power applications and cover their measurement and characterization. Additionally, a model is proposed which would facilitate designs in terms of transistor sizing, required inductor quality factors or minimum supply voltage. The model considers limitations that CMOS processes can impose on implementation. The book also provides diffe

  18. Echoes of Semiotically-Based Design in the Development and Testing of a Workflow System

    Directory of Open Access Journals (Sweden)

    Clarisse Sieckenius de Souza

    2001-05-01

Full Text Available Workflow systems are information-intensive task-oriented computer applications that typically involve a considerable number of users playing a wide variety of roles. Since communication, coordination and decision-making processes are essential for such systems, representing, interpreting and negotiating collective meanings are a crucial issue for software design and development processes. In this paper, we report and discuss our experience in implementing Qualitas, a web-based workflow system. Semiotic theory was extensively used to support design decisions and negotiations with users about technological signs. Taking scenarios as a type-sign exchanged throughout the whole process, we could trace the theoretic underpinnings of our experience and draw some revealing conclusions about the product and the process of technologically reified discourse. Although it is present in all information technology applications, this kind of discourse is seldom analyzed by software designers and developers. Our conjecture is that outside semiotic theory, professionals involved with human-computer interaction and software engineering practices have difficulty to coalesce concepts derived from such different disciplines as psychology, anthropology, linguistics and sociology, to name a few. Semiotics, however, can by itself provide a unifying ontological basis for interdisciplinary knowledge, raising issues and proposing alternatives that may help professionals gain insights at lower learning costs. Keywords: semiotic engineering, workflow systems, information-intensive task-oriented systems, scenario-based design and development of computer systems, human-computer interaction

  19. End-to-end interoperability and workflows from building architecture design to one or more simulations

    Energy Technology Data Exchange (ETDEWEB)

    Chao, Tian-Jy; Kim, Younghun

    2015-02-10

    An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.
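The data-definition step described above (a language that defines and creates a table schema for the data model) can be illustrated with an invented two-entity schema using Python's built-in sqlite3 module; the patent's actual data model and DDL are not reproduced here.

```python
# Toy illustration of a data definition language creating a table schema
# for entities (buildings) and relationships (building -> simulations).
# The schema is invented for this example.

import sqlite3

DDL = """
CREATE TABLE building (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE simulation (
    id INTEGER PRIMARY KEY,
    building_id INTEGER NOT NULL REFERENCES building(id),
    kind TEXT NOT NULL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)  # the DDL defines and creates the schema

# Data management services would sit on top of such a schema.
conn.execute("INSERT INTO building (id, name) VALUES (1, 'Tower A')")
conn.execute("INSERT INTO simulation (building_id, kind) VALUES (1, 'energy')")
row = conn.execute(
    "SELECT b.name, s.kind FROM simulation s "
    "JOIN building b ON s.building_id = b.id"
).fetchone()
print(row)  # ('Tower A', 'energy')
```

The entity-relationship mapping is what lets downstream simulations and web services interact with building designs through one shared model.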

  20. Evaluation of User Interface and Workflow Design of a Bedside Nursing Clinical Decision Support System

    Science.gov (United States)

    Yuan, Michael Juntao; Finley, George Mike; Mills, Christy; Johnson, Ron Kim

    2013-01-01

    Background Clinical decision support systems (CDSS) are important tools to improve health care outcomes and reduce preventable medical adverse events. However, the effectiveness and success of CDSS depend on their implementation context and usability in complex health care settings. As a result, usability design and validation, especially in real world clinical settings, are crucial aspects of successful CDSS implementations. Objective Our objective was to develop a novel CDSS to help frontline nurses better manage critical symptom changes in hospitalized patients, hence reducing preventable failure to rescue cases. A robust user interface and implementation strategy that fit into existing workflows was key for the success of the CDSS. Methods Guided by a formal usability evaluation framework, UFuRT (user, function, representation, and task analysis), we developed a high-level specification of the product that captures key usability requirements and is flexible to implement. We interviewed users of the proposed CDSS to identify requirements, listed functions, and operations the system must perform. We then designed visual and workflow representations of the product to perform the operations. The user interface and workflow design were evaluated via heuristic and end user performance evaluation. The heuristic evaluation was done after the first prototype, and its results were incorporated into the product before the end user evaluation was conducted. First, we recruited 4 evaluators with strong domain expertise to study the initial prototype. Heuristic violations were coded and rated for severity. Second, after development of the system, we assembled a panel of nurses, consisting of 3 licensed vocational nurses and 7 registered nurses, to evaluate the user interface and workflow via simulated use cases. We recorded whether each session was successfully completed and its completion time. Each nurse was asked to use the National Aeronautics and Space Administration

  1. Scientific workflows for bibliometrics

    NARCIS (Netherlands)

    Guler, A.T.; Waaijer, C.J.; Palmblad, M.

    2016-01-01

Scientific workflows organize the assembly of specialized software into an overall data flow and are particularly well suited for multi-step analyses using different types of software tools. They are also favorable in terms of reusability, as previously designed workflows could be made publicly available

2. A web accessible scientific workflow system for vadose zone performance monitoring: design and implementation examples

    Science.gov (United States)

    Mattson, E.; Versteeg, R.; Ankeny, M.; Stormberg, G.

    2005-12-01

Long-term performance monitoring has been identified by DOE, DOD and EPA as one of the most challenging and costly elements of contaminated site remedial efforts. Such monitoring should provide timely and actionable information relevant to a multitude of stakeholder needs. This information should be obtained in a manner which is auditable, cost effective and transparent. Over the last several years, INL staff have designed and implemented a web-accessible scientific workflow system for environmental monitoring. This workflow environment integrates distributed, automated data acquisition from diverse sensors (geophysical, geochemical and hydrological) with server-side data management and information visualization through flexible browser-based data access tools. Component technologies include a rich browser-based client (using dynamic javascript and html/css) for data selection, a back-end server which uses PHP for data processing, user management, and result delivery, and third-party applications which are invoked by the back-end using web services. This system has been implemented and is operational for several sites, including the Ruby Gulch Waste Rock Repository (a capped mine waste rock dump on the Gilt Edge Mine Superfund Site), the INL Vadose Zone Research Park and an alternative cover landfill. Implementations for other vadose zone sites are currently in progress. These systems allow for autonomous performance monitoring through automated data analysis and report generation. This performance monitoring has allowed users to obtain insights into system dynamics, regulatory compliance and residence times of water. Our system uses modular components for data selection and graphing and WSDL-compliant web services for external functions such as statistical analyses and model invocations. Thus, implementing this system for novel sites and extending functionality (e.g. adding novel models) is relatively straightforward. As system access requires a standard web browser
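The modular structure described above (data-selection components plus externally invoked analysis functions) can be sketched as a registry of analysis callables behind one dispatch interface. The function names and sensor data are invented stand-ins for the WSDL-described services.

```python
# Sketch of a modular monitoring back-end: analyses register themselves
# under a name and are invoked through a uniform interface, so new ones
# can be added without touching the core. Names and data are invented.

from statistics import mean

ANALYSES = {}  # name -> callable, standing in for WSDL-described services

def register(name):
    def wrap(fn):
        ANALYSES[name] = fn
        return fn
    return wrap

@register("daily_mean")
def daily_mean(readings):
    return mean(readings)

@register("exceedances")
def exceedances(readings, limit=1.0):
    """Count readings above a regulatory limit."""
    return sum(1 for r in readings if r > limit)

def run_analysis(name, readings, **params):
    """Dispatch a monitoring request to a registered analysis."""
    return ANALYSES[name](readings, **params)

moisture = [0.8, 1.2, 0.9, 1.5]
print(run_analysis("daily_mean", moisture))              # about 1.1
print(run_analysis("exceedances", moisture, limit=1.0))  # 2
```

Extending the system for a novel site then amounts to registering another analysis, which is the "relatively straightforward" extensibility the abstract claims.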

  3. Achieving E-learning with IMS Learning Design - Workflow Implications at the Open University of the Netherlands

    NARCIS (Netherlands)

    Westera, Wim; Brouns, Francis; Pannekeet, Kees; Janssen, José; Manderveld, Jocelyn

    2005-01-01

    Please refer to the original article in: Westera, W., Brouns, F., Pannekeet, K., Janssen, J., & Manderveld, J. (2005). Achieving E-learning with IMS Learning Design - Workflow Implications at the Open University of the Netherlands. Educational Technology & Society, 8 (3), 216-225. (URL: http://www.i

  4. Benchmarking ETL Workflows

    Science.gov (United States)

    Simitsis, Alkis; Vassiliadis, Panos; Dayal, Umeshwar; Karagiannis, Anastasios; Tziovara, Vasiliki

Extraction-Transform-Load (ETL) processes comprise complex data workflows, which are responsible for the maintenance of a Data Warehouse. A plethora of ETL tools is currently available, constituting a multi-million dollar market. Each ETL tool uses its own technique for the design and implementation of an ETL workflow, making the task of assessing ETL tools extremely difficult. In this paper, we identify common characteristics of ETL workflows in an effort to propose a unified evaluation method for ETL. We also identify the main points of interest in designing, implementing, and maintaining ETL workflows. Finally, we propose a principled organization of test suites based on the TPC-H schema for the problem of experimenting with ETL workflows.
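The extract-transform-load shape common to these workflows can be shown in miniature; the row format and cleansing rule below are invented for illustration.

```python
# Miniature ETL workflow: extract raw rows, transform (cleanse and cast),
# load into a warehouse. All names and data are invented.

def extract():
    """Pull raw records from a source (here, hard-coded)."""
    return [{"id": 1, "price": "10"}, {"id": 2, "price": "n/a"}]

def transform(rows):
    """Cleanse: drop rows with unparseable prices, cast the rest."""
    clean = []
    for row in rows:
        try:
            clean.append({"id": row["id"], "price": float(row["price"])})
        except ValueError:
            pass  # reject malformed source data
    return clean

def load(rows, warehouse):
    """Append cleansed rows to the warehouse table."""
    warehouse.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)  # [{'id': 1, 'price': 10.0}]
```

Every commercial tool expresses these same three step kinds differently, which is why the paper argues for a unified, tool-neutral way to describe and benchmark them.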

  5. Mining workflow processes from distributed workflow enactment event logs

    Directory of Open Access Journals (Sweden)

    Kwanghoon Pio Kim

    2012-12-01

Full Text Available Workflow management systems help to execute, monitor and manage work process flow and execution. These systems, as they are executing, keep a record of who does what and when (e.g., a log of events). The activity of using computer software to examine these records and derive structural results from them is called workflow mining. The workflow mining activity, in general, needs to encompass behavioral (process/control-flow), social, informational (data-flow), and organizational perspectives, as well as other perspectives, because workflow systems are "people systems" that must be designed, deployed, and understood within their social and organizational contexts. This paper particularly focuses on mining the behavioral aspect of workflows from XML-based workflow enactment event logs, which are vertically (semantic-driven) or horizontally (syntactic-driven) distributed over the networked workflow enactment components. That is, this paper proposes distributed workflow mining approaches that are able to rediscover ICN-based structured workflow process models through incrementally amalgamating a series of vertically or horizontally fragmented temporal workcases. Each of the approaches consists of a temporal fragment discovery algorithm, which is able to discover a set of temporal fragment models from the fragmented workflow enactment event logs, and a workflow process mining algorithm which rediscovers a structured workflow process model from the discovered temporal fragment models. Here, the temporal fragment model represents the concrete model of the XML-based distributed workflow fragment events log.
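The behavioral-perspective mining step can be illustrated by deriving a direct-follows relation from fragmented logs and then amalgamating the fragments into one model. The log format and merge rule below are a drastic simplification of the paper's ICN-based approach, not a reproduction of it.

```python
# Sketch of behavioral workflow mining over fragmented enactment logs:
# derive a direct-follows relation per fragment, then amalgamate the
# fragment models. Activity names and log format are illustrative.

from collections import defaultdict

# Two horizontally fragmented logs of the same workflow's workcases.
fragment_a = [["receive", "check", "approve"]]
fragment_b = [["receive", "check", "reject"]]

def direct_follows(logs):
    """Map each activity to the set of activities directly following it."""
    edges = defaultdict(set)
    for trace in logs:
        for a, b in zip(trace, trace[1:]):
            edges[a].add(b)
    return edges

def amalgamate(*models):
    """Merge fragment models into one rediscovered process model."""
    merged = defaultdict(set)
    for model in models:
        for a, succs in model.items():
            merged[a] |= succs
    return merged

model = amalgamate(direct_follows(fragment_a), direct_follows(fragment_b))
print(sorted(model["check"]))  # ['approve', 'reject']
```

The merged model recovers the branch after "check" that neither fragment shows alone, which is the point of amalgamating distributed fragments incrementally.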

  6. Workflow Agents vs. Expert Systems: Problem Solving Methods in Work Systems Design

    Science.gov (United States)

    Clancey, William J.; Sierhuis, Maarten; Seah, Chin

    2009-01-01

    During the 1980s, a community of artificial intelligence researchers became interested in formalizing problem solving methods as part of an effort called "second generation expert systems" (2nd GES). How do the motivations and results of this research relate to building tools for the workplace today? We provide an historical review of how the theory of expertise has developed, a progress report on a tool for designing and implementing model-based automation (Brahms), and a concrete example how we apply 2nd GES concepts today in an agent-based system for space flight operations (OCAMS). Brahms incorporates an ontology for modeling work practices, what people are doing in the course of a day, characterized as "activities." OCAMS was developed using a simulation-to-implementation methodology, in which a prototype tool was embedded in a simulation of future work practices. OCAMS uses model-based methods to interactively plan its actions and keep track of the work to be done. The problem solving methods of practice are interactive, employing reasoning for and through action in the real world. Analogously, it is as if a medical expert system were charged not just with interpreting culture results, but actually interacting with a patient. Our perspective shifts from building a "problem solving" (expert) system to building an actor in the world. The reusable components in work system designs include entire "problem solvers" (e.g., a planning subsystem), interoperability frameworks, and workflow agents that use and revise models dynamically in a network of people and tools. Consequently, the research focus shifts so "problem solving methods" include ways of knowing that models do not fit the world, and ways of interacting with other agents and people to gain or verify information and (ultimately) adapt rules and procedures to resolve problematic situations.

  7. Scientific workflows for bibliometrics.

    Science.gov (United States)

    Guler, Arzu Tugce; Waaijer, Cathelijn J F; Palmblad, Magnus

Scientific workflows organize the assembly of specialized software into an overall data flow and are particularly well suited for multi-step analyses using different types of software tools. They are also favorable in terms of reusability, as previously designed workflows could be made publicly available through the myExperiment community and then used in other workflows. We here illustrate how scientific workflows and the Taverna workbench in particular can be used in bibliometrics. We discuss the specific capabilities of Taverna that make this software a powerful tool in this field, such as automated data import via Web services, data extraction from XML by XPaths, and statistical analysis and visualization with R. The support of the latter is particularly relevant, as it allows integration of a number of recently developed R packages specifically for bibliometrics. Examples are used to illustrate the possibilities of Taverna in the fields of bibliometrics and scientometrics.

  8. Agile parallel bioinformatics workflow management using Pwrake

    Directory of Open Access Journals (Sweden)

    Tanaka Masahiro

    2011-09-01

Full Text Available Abstract Background In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method through iterative development phases after trial and error. Here, we show the application of a scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. Findings We implemented the Pwrake workflows to process next generation sequencing data using the Genomic Analysis Toolkit (GATK) and Dindel. GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that in practice, actual scientific workflow development iterates over two phases, the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate modularity of the GATK and Dindel workflows. Conclusions Pwrake enables agile management of scientific workflows in the bioinformatics domain.
The internal domain specific language design built on Ruby gives the flexibility of rakefiles for writing scientific workflows.
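Pwrake itself declares tasks in Ruby rakefiles; as a language-neutral illustration of the same idea, the following Python sketch (the task names and the thread-pool scheduler are our own invention, not Pwrake's API) shows how declared prerequisites let independent tasks run in parallel while dependent ones wait:

```python
from concurrent.futures import ThreadPoolExecutor

# Rake-style task table (hypothetical names): each task lists its
# prerequisites and a body; independent tasks may run in parallel.
TASKS = {
    "fastq":    {"deps": [],        "run": lambda: "reads"},
    "align":    {"deps": ["fastq"], "run": lambda: "aligned"},
    "variants": {"deps": ["align"], "run": lambda: "vcf"},
}

def run_all(tasks):
    done, results = set(), {}
    with ThreadPoolExecutor() as pool:
        while len(done) < len(tasks):
            # a task is ready once all of its prerequisites have finished
            ready = [n for n, t in tasks.items()
                     if n not in done and all(d in done for d in t["deps"])]
            for name, res in zip(ready, pool.map(lambda n: tasks[n]["run"](), ready)):
                results[name] = res
                done.add(name)
    return results

print(run_all(TASKS)["variants"])  # prints "vcf"
```

Each pass over the table submits every currently ready task to the pool at once, which is the essence of the parallelism Pwrake adds on top of Rake's sequential dependency resolution.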

  9. Office 2010 Workflow Developing Collaborative Solutions

    CERN Document Server

    Mann, David; Enterprises, Creative

    2010-01-01

    Workflow is the glue that binds information worker processes, users, and artifacts. Without workflow, information workers are just islands of data and potential. Office 2010 Workflow details how to implement workflow in SharePoint 2010 and the client Microsoft Office 2010 suite to help information workers share data, enforce processes and business rules, and work more efficiently together or solo. This book covers everything you need to know-from what workflow is all about to creating new activities; from the SharePoint Designer to Visual Studio 2010; from out-of-the-box workflows to state mac

  10. Scientific Process Automation and Workflow Management

    Energy Technology Data Exchange (ETDEWEB)

    Ludaescher, Bertram T.; Altintas, Ilkay; Bowers, Shawn; Cummings, J.; Critchlow, Terence J.; Deelman, Ewa; De Roure, D.; Freire, Juliana; Goble, Carole; Jones, Matt; Klasky, S.; McPhillips, Timothy; Podhorszki, Norbert; Silva, C.; Taylor, I.; Vouk, M.

    2010-01-01

We introduce and describe scientific workflows, i.e., executable descriptions of automatable scientific processes such as computational science simulations and data analyses. Scientific workflows are often expressed in terms of tasks and their (data flow) dependencies. This chapter first provides an overview of the characteristic features of scientific workflows and outlines their life cycle. A detailed case study highlights workflow challenges and solutions in simulation management. We then provide a brief overview of how some concrete systems support the various phases of the workflow life cycle, i.e., design, resource management, execution, and provenance management. We conclude with a discussion on community-based workflow sharing.

  11. Continuous and fast calibration of the CMS experiment design of the automated workflows and operational experience.

    CERN Document Server

    Oramus, Piotr Karol; Pfeiffer, Andreas; Franzoni, Giovanni; Govi, Giacomo Maria; Musich, Marco; Di Guida, Salvatore

    2017-01-01

The exploitation of the full physics potential of the LHC experiments requires fast and efficient processing of the largest possible dataset with the most refined understanding of the detector conditions. To face this challenge, the CMS collaboration has set up an infrastructure for the continuous unattended computation of the alignment and calibration constants, allowing for a refined knowledge of the most time-critical parameters already a few hours after the data have been saved to disk. This is the prompt calibration framework which, since the beginning of the LHC Run-I, enables the analysis and the High Level Trigger of the experiment to consume the most up-to-date conditions optimizing the performance of the physics objects. In Run-II this setup has been further expanded to include even more complex calibration algorithms requiring higher statistics to reach the needed precision. This imposed the introduction of a new paradigm in the creation of the calibration datasets for unattended workflows and opened the door ...

  12. Designing Collaborative Developmental Standards by Refactoring of the Earth Science Models, Libraries, Workflows and Frameworks.

    Science.gov (United States)

    Mirvis, E.; Iredell, M.

    2015-12-01

The operational (OPS) NOAA National Centers for Environmental Prediction (NCEP) suite traditionally consists of a large set of multi-scale HPC models, workflows, scripts, tools and utilities, which depend heavily on a variety of additional components. Namely, this suite utilizes a unique collection of 20+ in-house developed shared libraries (NCEPLIBS), certain versions of 3rd-party libraries (like netcdf, HDF, ESMF, jasper, xml etc.), and an HPC workflow tool within a dedicated (sometimes even vendor-customized) homogeneous HPC system environment. This domain- and site-specific setup, combined with NCEP's product-driven large-scale real-time data operations, complicates NCEP collaborative development tremendously by reducing the chances of replicating this OPS environment anywhere else. The mission of the NOAA/NCEP Environmental Modeling Center (EMC) is to develop and improve numerical weather, climate, hydrological and ocean prediction through partnership with the research community. Realizing these difficulties, EMC has lately taken an innovative approach to improve the flexibility of the HPC environment by building the elements of, and a foundation for, an NCEP OPS functionally equivalent environment (FEE), which can also be used to ease external interface constructs. Aiming to reduce the turnaround time of community code enhancements via the Research-to-Operations (R2O) cycle, EMC developed and deployed several project sub-set standards that have already paved the road to NCEP OPS implementation standards. In this topic we will discuss the EMC FEE for O2R requirements and approaches in collaborative standardization, including NCEPLIBS FEE and models code version control paired with the models' derived customized HPC modules and FEE footprints.
We will share NCEP/EMC experience and potential in the refactoring of EMC development processes and legacy codes, and in securing model source code quality standards by using a combination of the Eclipse IDE, integrated with the

13. Design and Implementation of the Workflow Business Generation System

    Institute of Scientific and Technical Information of China (English)

    陈保华; 詹舒波

    2016-01-01

In recent years, customer relationship management (CRM) services in China have developed rapidly, and to improve the quality of customer service, business staff urgently need an easy-to-use, fully featured workflow business generation system. Combining data-flow and workflow-management functions, this work explores a convenient and effective way of generating workflow business, providing tens of thousands of business staff with an intuitive, flexible and extensible facility for generating business files quickly and efficiently. The system is currently running on the CCWF workflow engine; practice shows that it has improved the working efficiency of business staff, promoted the development of the CRM system, and advanced the use and development of the integration of call centers and workflow.

  14. Toward Design, Modelling and Analysis of Dynamic Workflow Reconfigurations - A Process Algebra Perspective

    DEFF Research Database (Denmark)

    Mazzara, M.; Abouzaid, F.; Dragoni, Nicola

    2011-01-01

This paper describes a case study involving the dynamic reconfiguration of an office workflow. We state the requirements on a system implementing the workflow and its reconfiguration, and describe the system's design in BPMN. We then use an asynchronous π-calculus and webπ∞ to model the design and to ve...

  15. E-BioFlow: Different perspectives on scientific workflows

    NARCIS (Netherlands)

    Wassink, I.; Rauwerda, H.; van der Vet, P.; Breit, T.; Nijholt, A.

    2008-01-01

    We introduce a new type of workflow design system called e-BioFlow and illustrate it by means of a simple sequence alignment workflow. E-BioFlow, intended to model advanced scientific workflows, enables the user to model a workflow from three different but strongly coupled perspectives: the control

  16. E-BioFlow: Different Perspectives on Scientific Workflows

    NARCIS (Netherlands)

    Wassink, I.; Rauwerda, H.; Vet, van der P.E.; Breit, T.; Nijholt, A.; Elloumi, M.; Küng, J.; Linial, M.; Murphy, R.F.; Schneider, K.; Toma, C.

    2008-01-01

    We introduce a new type of workflow design system called e-BioFlow and illustrate it by means of a simple sequence alignment workflow. E-BioFlow, intended to model advanced scientific workflows, enables the user to model a workflow from three different but strongly coupled perspectives: the control

  17. An Intelligent Software Workflow Process Design for Location Management on Mobile Devices

    CERN Document Server

    Rao, N Mallikharjuna

    2012-01-01

Advances in the technologies of networking, wireless communication and the trimness of computers have led to rapid development of mobile communication infrastructure and have drastically changed information processing on mobile devices. Users carrying portable devices can freely move around while still connected to the network. This provides flexibility in accessing information anywhere at any time. To improve flexibility on mobile devices further, the new challenges in designing software systems for mobile networks include location and mobility management, channel allocation, power saving and security. In this paper, we propose an intelligent software tool for software design on mobile devices to meet the new challenges of mobile location and mobility management. In this study, the proposed Business Process Redesign (BPR) concept aims at an extension of the capabilities of an existing, widely used process modeling tool in industry with 'Intelligent' capabilities to suggest favorable alternatives to an ...

  18. Workflow automation architecture standard

    Energy Technology Data Exchange (ETDEWEB)

    Moshofsky, R.P.; Rohen, W.T. [Boeing Computer Services Co., Richland, WA (United States)

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  19. Use of workflow technology to assist remote caretakers in a smart kitchen environment designed for elderly people suffering from dementia

    OpenAIRE

    Sarni, T. (Tomi)

    2013-01-01

The purpose of this study was to determine the feasibility of an information system that enables remote assistance between caretakers and elderly people suffering from dementia in a smart kitchen environment. Such a system could alleviate the stress experienced by caretakers by enabling the provisioning of care giving between any combination of informal and formal caretakers, and by increasing the mobility of caretakers. The second research problem was to evaluate the benefits and drawbacks of using workflow technology to...

  20. Professional Windows Workflow Foundation

    CERN Document Server

    Kitta, Todd

    2007-01-01

If you want to gain the skills to build Windows Workflow Foundation solutions, then this is the book for you. It provides you with a clear, practical guide on how to develop workflow-based software and integrate it into existing technology landscapes. Throughout the pages, you'll also find numerous real-world examples and sample code that will help you to get started quickly. Each major area of Windows Workflow Foundation is explored in depth along with some of the fundamental operations related to generic workflow applications. You'll also find detailed coverage on how to develop workflow in

  1. Optimize Internal Workflow Management

    Directory of Open Access Journals (Sweden)

    Lucia RUSU

    2010-01-01

Full Text Available Workflow Management has the role of creating and maintaining an efficient flow of information and tasks inside an organization. The major benefit of workflows is the solutions they provide to the growing needs of organizations. The external and the internal processes associated with a business need to be carefully organized in order to provide a strong foundation for the daily work. This paper focuses on internal workflow within a company, attempts to provide some basic principles related to workflows, and presents a workflow solution for modeling and deployment using Visual Studio and SharePoint Server.

  2. New Interactions with Workflow Systems

    NARCIS (Netherlands)

    Wassink, I.; Vet, van der P.E.; Veer, van der G.C.; Roos, M.; Dijk, van E.M.A.G.; Norros, L.; Koskinen, H.; Salo, L.; Savioja, P.

    2009-01-01

This paper describes the evaluation of our early design ideas for an ad-hoc workflow system. Using the teach-back technique, we have performed a hermeneutic analysis of the mockup implementation named NIWS to get corrective and creative feedback at the functional, dialogue and representation level.

  3. Automated data reduction workflows for astronomy

    CERN Document Server

    Freudling, W; Bramich, D M; Ballester, P; Forchi, V; Garcia-Dablo, C E; Moehler, S; Neeser, M J

    2013-01-01

Data from complex modern astronomical instruments often consist of a large number of different science and calibration files, and their reduction requires a variety of software tools. The execution chain of the tools represents a complex workflow that needs to be tuned and supervised, often by individual researchers who are not necessarily experts on any specific instrument. The efficiency of data reduction can be improved by using automatic workflows to organise data and execute the sequence of data reduction steps. To realize such efficiency gains, we designed a system that allows intuitive representation, execution and modification of the data reduction workflow, and has facilities for inspection and interaction with the data. The European Southern Observatory (ESO) has developed Reflex, an environment to automate data reduction workflows. Reflex is implemented as a package of customized components for the Kepler workflow engine. Kepler provides the graphical user interface to create an executable flowch...

  4. Agreement Workflow Tool (AWT)

    Data.gov (United States)

    Social Security Administration — The Agreement Workflow Tool (AWT) is a role-based Intranet application used for processing SSA's Reimbursable Agreements according to SSA's standards. AWT provides...

  5. RESTFul based heterogeneous Geoprocessing workflow interoperation for Sensor Web Service

    Science.gov (United States)

    Yang, Chao; Chen, Nengcheng; Di, Liping

    2012-10-01

    Advanced sensors on board satellites offer detailed Earth observations. A workflow is one approach for designing, implementing and constructing a flexible and live link between these sensors' resources and users. It can coordinate, organize and aggregate the distributed sensor Web services to meet the requirement of a complex Earth observation scenario. A RESTFul based workflow interoperation method is proposed to integrate heterogeneous workflows into an interoperable unit. The Atom protocols are applied to describe and manage workflow resources. The XML Process Definition Language (XPDL) and Business Process Execution Language (BPEL) workflow standards are applied to structure a workflow that accesses sensor information and one that processes it separately. Then, a scenario for nitrogen dioxide (NO2) from a volcanic eruption is used to investigate the feasibility of the proposed method. The RESTFul based workflows interoperation system can describe, publish, discover, access and coordinate heterogeneous Geoprocessing workflows.

  6. Metaworkflows and Workflow Interoperability for Heliophysics

    Science.gov (United States)

    Pierantoni, Gabriele; Carley, Eoin P.

    2014-06-01

Heliophysics is a relatively new branch of physics that investigates the relationship between the Sun and the other bodies of the solar system. To investigate such relationships, heliophysicists can rely on various tools developed by the community. Some of these tools are on-line catalogues that list events (such as Coronal Mass Ejections, CMEs) and their characteristics as they were observed on the surface of the Sun or on the other bodies of the Solar System. Other tools offer on-line data analysis and access to images and data catalogues. During their research, heliophysicists often perform investigations that need to coordinate several of these services and to repeat these complex operations until the phenomena under investigation are fully analyzed. Heliophysicists combine the results of these services; this service orchestration is best suited for workflows. This approach has been investigated in the HELIO project. The HELIO project developed an infrastructure for a Virtual Observatory for Heliophysics and implemented service orchestration using TAVERNA workflows. HELIO developed a set of workflows that proved to be useful but lacked flexibility and re-usability. The TAVERNA workflows also needed to be executed directly in TAVERNA workbench, and this forced all users to learn how to use the workbench. Within the SCI-BUS and ER-FLOW projects, we have started an effort to re-think and re-design the heliophysics workflows with the aim of fostering re-usability and ease of use. We base our approach on two key concepts, that of meta-workflows and that of workflow interoperability. We have divided the produced workflows into three different layers. The first layer is Basic Workflows, developed both in the TAVERNA and WS-PGRADE languages. They are building blocks that users compose to address their scientific challenges. They implement well-defined Use Cases that usually involve only one service. The second layer is Science Workflows usually developed in TAVERNA.
They

  7. Optimizing Workflow Data Footprint

    Directory of Open Access Journals (Sweden)

    Gurmeet Singh

    2007-01-01

Full Text Available In this paper we examine the issue of optimizing disk usage and scheduling large-scale scientific workflows onto distributed resources where the workflows are data-intensive, requiring large amounts of data storage, and the resources have limited storage resources. Our approach is two-fold: we minimize the amount of space a workflow requires during execution by removing data files at runtime when they are no longer needed, and we demonstrate that workflows may have to be restructured to reduce the overall data footprint of the workflow. We show the results of our data management and workflow restructuring solutions using a Laser Interferometer Gravitational-Wave Observatory (LIGO) application and an astronomy application, Montage, running on a large-scale production grid, the Open Science Grid. We show that although reducing the data footprint of Montage by 48% can be achieved with dynamic data cleanup techniques, LIGO Scientific Collaboration workflows require additional restructuring to achieve a 56% reduction in data space usage. We also examine the cost of the workflow restructuring in terms of the application's runtime.
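The dynamic data cleanup idea can be sketched as a reference-counting pass over the workflow's task list. The following Python sketch is our own illustration, not the paper's Pegasus/OSG implementation, and the task and file names are hypothetical:

```python
# Reference-count each intermediate file; emit an "rm" step as soon as the
# last consumer has run. Tasks are (name, inputs, outputs) in execution order.
def schedule_with_cleanup(tasks):
    remaining = {}                      # file -> number of tasks still needing it
    for _, inputs, _ in tasks:
        for f in inputs:
            remaining[f] = remaining.get(f, 0) + 1
    plan, live = [], set()
    for name, inputs, outputs in tasks:
        plan.append(("run", name))
        live.update(outputs)
        for f in inputs:
            remaining[f] -= 1
            if remaining[f] == 0 and f in live:   # no future task reads f
                plan.append(("rm", f))
                live.discard(f)
    return plan

# Hypothetical Montage-like fragment: reproject, difference, co-add
plan = schedule_with_cleanup([
    ("project", [],       ["img1"]),
    ("diff",    ["img1"], ["d1"]),
    ("add",     ["d1"],   ["mosaic"]),
])
print(plan)
```

Interleaving the `rm` steps into the plan this way bounds the peak storage to the files that are still live at any point, which is exactly the "data footprint" the paper seeks to minimize.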

  8. From Workflow to Interworkflow

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

Workflow management systems are being introduced in many organizations to automate the business process. The initial emphasis of introducing a workflow management system is on its application to the workflow in a given organization. The next step is to interconnect the workflow across organizations. We call it interworkflow, and the total support technologies, which are necessary for its realization, the interworkflow management mechanism. Interworkflow is expected as a supporting mechanism for Business-to-Business Electronic Commerce. We had proposed this management mechanism and confirmed its realization with the prototype. At the same time, the interface and the protocol for interconnecting heterogeneous workflow management systems have been standardized by the WfMC. So, we advance the project of the implementation of an interworkflow management system for practical use and its experimental proof.

9. Research and design of lightweight workflow engine based on MVC framework

    Institute of Scientific and Technical Information of China (English)

    丁苍峰

    2011-01-01

To overcome the shortcomings of traditional workflow engines, this paper designs a lightweight workflow engine based on the MVC framework and builds the overall architecture of a multi-layer workflow management system composed of a client layer, a presentation-logic layer, a persistence layer and a database-service layer. The engine is integrated into an engine controller, which controls the flow of tasks. The functional components of the lightweight workflow engine are described in detail; components can be flexibly added or removed as needed to extend the engine's functionality, and systems based on the engine can be flexibly customized for workflow management in different domains.

  10. Implementing Workflow Reconfiguration in WS-BPEL

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Dragoni, Nicola; Zhou, Mu

    2012-01-01

This paper investigates the problem of dynamic reconfiguration by means of a workflow-based case study used for discussion. We state the requirements on a system implementing the workflow and its reconfiguration, and we describe the system's design in BPMN. WS-BPEL, a language that would not naturally support dynamic change, is used as a target for implementation. The WS-BPEL recovery framework is here exploited to implement the reconfiguration using principles derived from previous research in process algebra, and two mappings from BPMN to WS-BPEL are presented, one automatic and only mostly...

  11. A navigation system for percutaneous needle interventions based on PET/CT images: design, workflow and error analysis of soft tissue and bone punctures.

    Science.gov (United States)

    Oliveira-Santos, Thiago; Klaeser, Bernd; Weitzel, Thilo; Krause, Thomas; Nolte, Lutz-Peter; Peterhans, Matthias; Weber, Stefan

    2011-01-01

    Percutaneous needle intervention based on PET/CT images is effective, but exposes the patient to unnecessary radiation due to the increased number of CT scans required. Computer assisted intervention can reduce the number of scans, but requires handling, matching and visualization of two different datasets. While one dataset is used for target definition according to metabolism, the other is used for instrument guidance according to anatomical structures. No navigation systems capable of handling such data and performing PET/CT image-based procedures while following clinically approved protocols for oncologic percutaneous interventions are available. The need for such systems is emphasized in scenarios where the target can be located in different types of tissue such as bone and soft tissue. These two tissues require different clinical protocols for puncturing and may therefore give rise to different problems during the navigated intervention. Studies comparing the performance of navigated needle interventions targeting lesions located in these two types of tissue are not often found in the literature. Hence, this paper presents an optical navigation system for percutaneous needle interventions based on PET/CT images. The system provides viewers for guiding the physician to the target with real-time visualization of PET/CT datasets, and is able to handle targets located in both bone and soft tissue. The navigation system and the required clinical workflow were designed taking into consideration clinical protocols and requirements, and the system is thus operable by a single person, even during transition to the sterile phase. Both the system and the workflow were evaluated in an initial set of experiments simulating 41 lesions (23 located in bone tissue and 18 in soft tissue) in swine cadavers. 
We also measured and decomposed the overall system error into distinct error sources, which allowed for the identification of particularities involved in the process as well

12. Design of Internet of Things Workflow Framework on Overlay Network

    Institute of Scientific and Technical Information of China (English)

    魏歌

    2016-01-01

A general approach is proposed for constructing a workflow system for the Internet of Things (IoT). Following the framework given in the standard YD/T 2437—2012, the application-supporting network in the network/business network layer is taken as the basis for supporting a virtual overlay network. In this framework, the overlay network is regarded as a support platform for IoT application development. On this basis, and according to the Workflow Management Coalition (WfMC) requirements for establishing a workflow system framework, the three stages of the system (establishment, operation and implementation) and their internal structures are designed. These methods are further explained by addressing, one by one, issues of hierarchy, functions and their interrelations within the framework.

13. The Design and Implementation of an Agent-based Workflow Model

    Institute of Scientific and Technical Information of China (English)

    陈善国; 高济

    2000-01-01

In this paper, we present SaFlow, a workflow management system based on agents. After describing the architecture, we discuss in detail the composition and implementation of MSA, the major part of SaFlow. Finally, a product quotation workflow system is demonstrated as an application of SaFlow.

  14. Context-aware Workflow Model for Supporting Composite Workflows

    Institute of Scientific and Technical Information of China (English)

    Jong-sun CHOI; Jae-young CHOI; Yong-yun CHO

    2010-01-01

In recent years, several researchers have applied workflow technologies for service automation in ubiquitous computing environments. However, most context-aware workflows do not offer a method to compose several workflows into a larger or more complicated workflow; they provide only a simple workflow model, not a composite workflow model. In this paper, the authors propose a context-aware workflow model to support composite workflows by expanding the patterns of existing context-aware workflows, which support the basic workflow patterns. The suggested workflow model offers composite workflow patterns for a context-aware workflow, which consists of various flow patterns, such as simple, split, parallel flows, and subflow. With the suggested model, existing workflows can easily be reused to make a new workflow. As a result, it can save the development effort and time for context-aware workflows and increase workflow reusability. Therefore, the suggested model is expected to make it easy to develop applications related to context-aware workflow services in ubiquitous computing environments.
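As a rough illustration of how composite patterns can be built from simple ones, the following Python sketch (our own toy model, not the paper's formalism; the step names are hypothetical) composes simple, split and parallel flows and reuses one workflow as a subflow of another:

```python
# Workflows as plain callables over a context dict, so an existing
# workflow can be nested as a subflow of a larger composite one.
def seq(*steps):             # simple flow: run steps in order, return last result
    return lambda ctx: [s(ctx) for s in steps][-1]

def split(cond, yes, no):    # split pattern: pick a branch from the context
    return lambda ctx: yes(ctx) if cond(ctx) else no(ctx)

def parallel(*flows):        # parallel pattern: run all branches, collect results
    return lambda ctx: [f(ctx) for f in flows]

# Hypothetical ubiquitous-computing steps
lights_on  = lambda ctx: "lights on"
lights_off = lambda ctx: "lights off"
brew       = lambda ctx: "coffee"
music      = lambda ctx: "music"

morning = seq(split(lambda ctx: ctx["dark"], lights_on, lights_off), brew)
coming_home = parallel(morning, music)   # `morning` reused as a subflow

print(coming_home({"dark": True}))   # prints "['coffee', 'music']"
```

Because every pattern returns another callable of the same shape, composition is closed: any workflow built this way can itself become a subflow, which is the reuse property the abstract emphasizes.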

15. Study and Design of Workflow Management System Based on JBPM and JPDL

    Institute of Scientific and Technical Information of China (English)

    凌正俊

    2011-01-01

This paper first explains the concepts of workflow and workflow management systems. It then studies JBPM in depth, including an overview of the JBPM workflow engine, its process definition language, the workflow development steps, persistence database mapping, transaction management, logging, and the process designer. Based on practical experience with the JBPM workflow engine and JPDL, a workflow management system built on the JBPM architecture is established to better implement process management and to simplify the design and development of process systems. This reduces the coupling between program code, saves software development cost, improves development efficiency, and improves the reusability, extensibility and maintainability of the system.

  16. Research and design of a dynamic workflow engine based on a migration strategy

    Institute of Scientific and Technical Information of China (English)

    谭巍巍; 李先国

    2014-01-01

    To enable a workflow system to support dynamic changes to process definitions at run time, this paper discusses a classification of the dynamic modifications a workflow process may undergo and analyzes the advantages and disadvantages of dynamic modification strategies including restart, continued execution, and migration. The implementation of the migration strategy is studied in detail, and a design method for dynamic workflows based on it is proposed. The method first loads the process definition information from data tables and dynamically creates the corresponding workflow objects and their methods via a reflection mechanism. A migration-strategy execution algorithm is then designed: according to migration rules written in an XML configuration file, the system converts running instances of the old workflow to the new workflow at run time, improving the dynamic adaptability of the workflow system.
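The migration step the abstract describes can be sketched as follows. The names and rule format are illustrative (the paper stores its migration rules in an XML configuration file; a plain dict stands in here), and the "restart" fallback is a simplification:

```python
# Sketch of instance migration: a running instance's current activity is
# mapped into the new process definition via migration rules.

def migrate(instance, new_definition, rules):
    """Return the instance repositioned in the new definition.

    rules: {old_activity: new_activity}. Activities without a rule fall
    back to the new definition's start activity (a crude 'restart').
    """
    old_pos = instance["current"]
    new_pos = rules.get(old_pos, new_definition["start"])
    if new_pos not in new_definition["activities"]:
        raise ValueError(f"rule maps to unknown activity: {new_pos}")
    return {**instance, "definition": new_definition["name"], "current": new_pos}

old_instance = {"definition": "order-v1", "current": "approve"}
new_def = {"name": "order-v2", "start": "register",
           "activities": {"register", "review", "approve2", "ship"}}
migrated = migrate(old_instance, new_def, {"approve": "approve2"})
print(migrated)
```

The point of the rule table is exactly what the abstract claims: the old instance keeps its accumulated state and simply continues at the mapped position in the new definition instead of restarting from scratch.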

  17. Enabling Structured Exploration of Workflow Performance Variability in Extreme-Scale Environments

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin; Stephan, Eric G.; Raju, Bibi; Altintas, Ilkay; Elsethagen, Todd O.; Krishnamoorthy, Sriram

    2015-11-15

    Workflows are taking an increasingly important role in orchestrating complex scientific processes in extreme-scale and highly heterogeneous environments. However, to date we cannot reliably predict, understand, and optimize workflow performance. Sources of performance variability, and in particular the interdependencies of workflow design, execution environment and system architecture, are not well understood. While there is a rich portfolio of tools for performance analysis, modeling and prediction for single applications in homogeneous computing environments, these are not applicable to workflows, due to the number and heterogeneity of the involved workflow and system components and their strong interdependencies. In this paper, we investigate workflow performance goals and identify factors that could have a relevant impact. Based on our analysis, we propose a new workflow performance provenance ontology, the Open Provenance Model-based WorkFlow Performance Provenance, or OPM-WFPP, that will enable the empirical study of workflow performance characteristics and variability, including complex source attribution.

  18. Tavaxy: Integrating Taverna and Galaxy workflows with cloud computing support

    Directory of Open Access Journals (Sweden)

    Abouelhoda Mohamed

    2012-05-01

    Abstract Background Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. This lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of the two systems; it limits sharing of workflows between the user communities and leads to duplication of development efforts. Results In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: it allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible: users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Conclusions Tavaxy reduces the workflow development cycle by introducing workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-)workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and

  19. Rockslide susceptibility and hazard assessment for mitigation works design along vertical rocky cliffs: workflow proposal based on a real case-study conducted in Sacco (Campania), Italy

    Science.gov (United States)

    Pignalosa, Antonio; Di Crescenzo, Giuseppe; Marino, Ermanno; Terracciano, Rosario; Santo, Antonio

    2015-04-01

    field mapping and on punctual geomechanical data from scan-line surveying, allowed the partitioning of the rock mass into homogeneous geomechanical sectors and data interpolation through bounded geostatistical analyses on 3D models. All data resulting from both approaches have been referenced and filed in a single spatial database and considered in global geostatistical analyses to derive a fully modelled and comprehensive evaluation of rockslide susceptibility. The described workflow yielded the following innovative results: a) a detailed census of individual potential instabilities, through a spatial database recording the geometrical, geological and mechanical features along with the expected failure modes; b) a high-resolution characterization of rockslide susceptibility across the whole slope, based on partitioning the area according to stability and mechanical conditions that can be directly related to specific hazard mitigation systems; c) the exact extension of the area exposed to rockslide hazard, along with the dynamic parameters of the expected phenomena; d) an intervention design for hazard mitigation.

  20. Designing an architectural style for dynamic medical Cross-Organizational Workflow management system: an approach based on agents and web services.

    Science.gov (United States)

    Bouzguenda, Lotfi; Turki, Manel

    2014-04-01

    This paper shows how the combined use of agent and web service technologies can help design an architectural style for a dynamic medical Cross-Organizational Workflow (COW) management system. Medical COW aims at supporting collaboration between several autonomous and possibly heterogeneous medical processes distributed over different organizations (hospitals, clinics or laboratories). Dynamic medical COW refers to occasional cooperation between these health organizations, free of structural constraints, where the medical partners involved and their number are not pre-defined. More precisely, this paper proposes a new architectural style based on agent and web service technologies to deal with two key coordination issues of dynamic COW: finding medical partners and negotiating between them. It also shows how the proposed architecture for a dynamic medical COW management system can connect to a multi-agent system coupling a Clinical Decision Support System (CDSS) with Computerized Prescriber Order Entry (CPOE). The idea is to assist health professionals such as doctors, nurses and pharmacists with decision-making tasks, such as determining a diagnosis or analyzing patient data, without stopping their clinical processes, so that they can act in a coherent way and give care to the patient.

  1. GeNeDA: An Open-Source Workflow for Design Automation of Gene Regulatory Networks Inspired from Microelectronics.

    Science.gov (United States)

    Madec, Morgan; Pecheux, François; Gendrault, Yves; Rosati, Elise; Lallement, Christophe; Haiech, Jacques

    2016-10-01

    The topic of this article is the development of an open-source automated design framework for synthetic biology, specifically for the design of artificial gene regulatory networks based on a digital approach. Unlike other tools, GeNeDA is open-source online software based on existing microelectronics tools that have proven their efficiency over the last 30 years. The complete framework is composed of a computation core directly adapted from an Electronic Design Automation tool, input and output interfaces, a library of elementary parts that can be achieved with gene regulatory networks, and an interface with an electrical circuit simulator. Each of these modules is an extension of microelectronics tools and concepts: ODIN II, ABC, the Verilog language, the SPICE simulator, and SystemC-AMS. GeNeDA is first validated on a benchmark of several combinatorial circuits. The results highlight the importance of the part library. This framework is then used for the design of a sequential circuit including a biological state machine.

  2. CaGrid Workflow Toolkit: A taverna based workflow tool for cancer grid

    Directory of Open Access Journals (Sweden)

    Sulakhe Dinanath

    2010-11-01

    Abstract Background In the biological and medical domain, the use of web services has made data and computational functionality accessible in a unified manner, helping automate data pipelines that were previously operated manually. Workflow technology is widely used in the orchestration of multiple services to facilitate in-silico research. The Cancer Biomedical Informatics Grid (caBIG) is an information network enabling the sharing of cancer research related resources, and caGrid is its underlying service-based computation infrastructure. caBIG requires that services be composed and orchestrated in a given sequence to realize data pipelines, which are often called scientific workflows. Results caGrid selected Taverna as its workflow execution system of choice due to its integration with web service technology, its support for a wide range of web services, and its plug-in architecture catering for easy integration of third-party extensions. The caGrid Workflow Toolkit (or the toolkit for short), an extension to the Taverna workflow system, is designed and implemented to ease building and running caGrid workflows. It provides users with support for various phases in using workflows: service discovery, composition and orchestration, data access, and secure service invocation, which have been identified by the caGrid community as challenging in a multi-institutional and cross-disciplinary domain. Conclusions By extending the Taverna Workbench, the caGrid Workflow Toolkit provides a comprehensive solution to compose and coordinate services in caGrid, which would otherwise remain isolated and disconnected from each other. Using it, users can access more than 140 services and are offered a rich set of features including discovery of data and analytical services, query and transfer of data, security protections for service invocations, state management in service interactions, and sharing of workflows, experiences and best practices. The proposed solution is

  3. Business and scientific workflows a web service-oriented approach

    CERN Document Server

    Tan, Wei

    2013-01-01

    Focuses on how to use web service computing and service-based workflow technologies to develop timely, effective workflows for both business and scientific fields Utilizing web computing and Service-Oriented Architecture (SOA), Business and Scientific Workflows: A Web Service-Oriented Approach focuses on how to design, analyze, and deploy web service-based workflows for both business and scientific applications in many areas of healthcare and biomedicine. It also discusses and presents the recent research and development results. This informative reference features app

  4. Design and Implementation of a Data-Driven Workflow Engine

    Institute of Scientific and Technical Information of China (English)

    陈义松; 汪芸

    2012-01-01

    A workflow is a computerized business process. A workflow engine provides the runtime environment for workflow processes and plays a key role in their execution. Traditional workflow technology lacks good modeling and adaptation mechanisms when dealing with complicated and changing processes. This paper proposes a data-driven execution mode that achieves loose coupling between activities, and designs and implements a workflow engine supporting this data-driven mode. The engine has strong processing capabilities for complicated and changing business processes and can model processes automatically.
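The data-driven execution mode described above can be sketched as follows: activities declare the data they consume and produce, and the engine fires an activity as soon as its inputs exist, with no explicit control-flow links between activities. This is an assumed illustration, not the paper's engine; activity and dataset names are invented:

```python
# Data-driven firing: an activity runs when all its input data is
# available; its outputs may in turn enable other activities.

def run_data_driven(activities, initial_data):
    """activities: list of (name, inputs, outputs). Returns firing order."""
    available = set(initial_data)
    pending = list(activities)
    order = []
    progress = True
    while pending and progress:
        progress = False
        for act in list(pending):
            name, inputs, outputs = act
            if set(inputs) <= available:       # all input data present
                order.append(name)
                available |= set(outputs)      # publishing may enable others
                pending.remove(act)
                progress = True
    return order

# Declaration order deliberately differs from execution order: the
# coupling is through data, not through listed sequence.
acts = [("report", ["cleaned", "stats"], []),
        ("clean",  ["raw"], ["cleaned"]),
        ("stats",  ["cleaned"], ["stats"])]
print(run_data_driven(acts, ["raw"]))
```

Note that inserting a new activity only requires naming its inputs and outputs; no existing activity has to be rewired, which is the loose coupling the abstract emphasizes.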

  5. The PBase Scientific Workflow Provenance Repository

    Directory of Open Access Journals (Sweden)

    Víctor Cuevas-Vicenttín

    2014-10-01

    Scientific workflows and their supporting systems are becoming increasingly popular for compute-intensive and data-intensive scientific experiments. The advantages scientific workflows offer include rapid and easy workflow design, software and data reuse, scalable execution, sharing and collaboration, and other advantages that altogether facilitate “reproducible science”. In this context, provenance – information about the origin, context, derivation, ownership, or history of some artifact – plays a key role, since scientists are interested in examining and auditing the results of scientific experiments. However, in order to perform such analyses on scientific results as part of extended research collaborations, an adequate environment and tools are required. Concretely, the need arises for a repository that will facilitate the sharing of scientific workflows and their associated execution traces in an interoperable manner, also enabling querying and visualization. Furthermore, such functionality should be supported while taking performance and scalability into account. With this purpose in mind, we introduce PBase: a scientific workflow provenance repository implementing the ProvONE proposed standard, which extends the emerging W3C PROV standard for provenance data with workflow specific concepts. PBase is built on the Neo4j graph database, thus offering capabilities such as declarative and efficient querying. Our experiences demonstrate the power gained by supporting various types of queries for provenance data. In addition, PBase is equipped with a user friendly interface tailored for the visualization of scientific workflow provenance data, making the specification of queries and the interpretation of their results easier and more effective.

  6. Insightful Workflow For Grid Computing

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Charles Earl

    2008-10-09

    We developed a workflow adaptation and scheduling system for Grid workflows. The system currently interfaces with and uses the Karajan workflow system. We developed machine learning agents that provide the planner/scheduler with the information needed to make decisions about when and how to replan. The Kubrick system restructures workflows at runtime, making it unique among workflow scheduling systems. The existing Kubrick system provides a platform on which to integrate additional quality-of-service constraints and in which to explore the use of an ensemble of scheduling and planning algorithms. This will be the principal thrust of our Phase II work.

  7. Design and implementation of a cross-organizational flexible workflow engine

    Institute of Scientific and Technical Information of China (English)

    蔡丽丽

    2014-01-01

    Workflows in practice often involve business process cooperation across different systems, as well as dynamic adaptation to sudden business changes. To address both problems, this paper presents a workflow engine design that combines cross-organizational and flexible workflow techniques. Existing research on cross-organizational workflows and flexible workflows is summarized and analyzed, and on that basis a workflow model is designed that supports both interaction across business systems and dynamic change of processes within a business. The functional design of the model's cross-organizational processing and dynamic change mechanisms is explained through analysis of real scenarios, and a prototype implementation of the model is introduced using process rollback as an example.

  8. A workflow for in silico design of hIL-10 and ebvIL-10 inhibitors using well-known miniprotein scaffolds.

    Science.gov (United States)

    Dueñas, Salvador; Aguila, Sergio A; Pimienta, Genaro

    2017-04-01

    The over-expression of immune suppressors such as IL-10 is a crucial landmark in both tumor progression and latent viral and parasite infection. IL-10 is a multifunctional protein. Besides its immune-cell suppressive function, it also promotes B-cell tumorigenesis in lymphomas and melanoma. Human pathogens such as unicellular parasites and viruses that remain latent inside B cells promote the over-expression of hIL-10 upon infection, which inhibits cell-mediated immune surveillance and at the same time mediates B-cell proliferation. The B-cell specific oncogenic latent virus Epstein-Barr virus (EBV) encodes a viral homologue of hIL-10 (ebvIL-10), expressed during lytic viral proliferation. Once expressed, ebvIL-10 inhibits cell-mediated immune surveillance, assuring EBV re-infection. During long-term latency, EBV-infected B cells over-express hIL-10 to assure B-cell proliferation, occasionally inducing EBV-mediated lymphomas. The amino acid sequences of hIL-10 and ebvIL-10 are more than 80% identical, and the proteins thus have very similar three-dimensional structures. Based on their published crystallographic structures bound to their human receptor IL10R1, we report a structure-based design of hIL-10 and ebvIL-10 inhibitors built on three loops from IL10R1 that establish specific hydrogen bonds with the two IL-10s. We have grafted these loops onto a permissive loop in three well-known miniprotein scaffolds: the Conus snail toxin MVIIA, the plant-derived trypsin inhibitor EETI, and the human appetite modulator AgRP. Our computational workflow, described in detail below, was strengthened by the negative and positive controls implemented, and therefore paves the way for future in vitro and in vivo validation assays of the engineered IL-10 inhibitors.

  9. Implementing and Running a Workflow Application on Cloud Resources

    Directory of Open Access Journals (Sweden)

    Gabriela Andreea MORAR

    2011-01-01

    Scientists need to run applications that are time- and resource-consuming, but not all of them have the required knowledge to run these applications in parallel using grid, cluster or cloud resources. In the past few years many workflow-building frameworks were developed to help scientists take better advantage of computing resources, by designing workflows based on their applications and executing them on heterogeneous resources. This paper presents a case study of implementing and running a workflow for an eBay data retrieval application. The workflow was designed using the Askalon framework and executed on cloud resources. The purpose of this paper is to demonstrate how workflows and cloud resources can be used by scientists to achieve speedup for their applications without needing to spend large amounts of money on computational resources.

  10. Data Exchange in Grid Workflow

    Institute of Scientific and Technical Information of China (English)

    ZENG Hongwei; MIAO Huaikou

    2006-01-01

    In existing web services-based workflows, data exchange across the web services is centralized: the workflow engine mediates at each step of the application sequence. However, many grid applications, especially data-intensive scientific applications, require exchanging large amounts of data across grid services. Having a central workflow engine relay the data between the services would result in a bottleneck in these cases. This paper proposes a data exchange model for individual grid workflows and for multi-workflow composition, respectively. The model enables direct communication of large amounts of data between two grid services. To enable data exchange among multiple workflows, a bridge data service is used.
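The core idea, passing lightweight data references through the engine while the bulk data moves directly between services, can be sketched as follows. The class and method names are illustrative, not the paper's API:

```python
# Sketch of reference-based data exchange: the engine only relays a
# small (service, key) reference; the payload travels peer-to-peer.

class Service:
    def __init__(self, name):
        self.name = name
        self.store = {}            # datasets held locally by this service

    def produce(self, key, data):
        self.store[key] = data
        return (self.name, key)    # a small reference, not the data itself

    def fetch(self, ref, registry):
        source_name, key = ref
        # Direct transfer from the producing service; the central
        # engine never touches the payload, avoiding the bottleneck.
        return registry[source_name].store[key]

registry = {}
a, b = Service("A"), Service("B")
registry["A"], registry["B"] = a, b

ref = a.produce("sim-output", list(range(5)))  # engine would relay only `ref`
data = b.fetch(ref, registry)
print(ref, len(data))
```

A bridge data service for cross-workflow exchange would play the same role as `registry` here: a shared place where references can be resolved even when producer and consumer belong to different workflows.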

  11. Design and development of a mobile computer application to reengineer workflows in the hospital and the methodology to evaluate its effectiveness.

    Science.gov (United States)

    Holzinger, Andreas; Kosec, Primoz; Schwantzer, Gerold; Debevc, Matjaz; Hofmann-Wellenhof, Rainer; Frühauf, Julia

    2011-12-01

    This paper describes a new method of collecting additional data for skin cancer research from patients in the hospital using the Mobile Computing in Medicine Graz (MoCoMed-Graz) system. This system departs from traditional paper-based questionnaire data collection and implements a new composition of evaluation methods to demonstrate its effectiveness. The patients fill out a questionnaire on a Tablet-PC (or iPad device) and the resulting medical data is integrated into the electronic patient record for display when the patient enters the doctor's examination room. Since the data is now part of the electronic patient record, the doctor can discuss the data together with the patient, making corrections or completions where necessary, thus enhancing data quality and patient empowerment. A further advantage is that all questionnaires are in the system at the end of the day, and manual entry is no longer necessary, consequently raising data completeness. The front end was developed using a user-centered design process for touch tablet computers and transfers the data in XML to the SAP-based enterprise hospital information system. The system was evaluated at the Graz University Hospital, where about 30 outpatients consult the pigmented lesion clinic each day, following Bronfenbrenner's three-level perspective of microlevel, mesolevel and macrolevel. On the microlevel, the questionnaires answered by 194 outpatients, evaluated with the System Usability Scale (SUS), resulted in a median score of 97.5 (min: 50, max: 100), which showed that the system is easy to use. On the mesolevel, the time spent by medical doctors was measured before and after the implementation of the system; the medical task performance time of 20 doctors (age median 43, min: 29, max: 50) showed a reduction of 90%. On the macrolevel, a cost model was developed to show how much money can be saved by the hospital management.
This showed that, for an average of 30 patients per day
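The SUS figures quoted above follow the standard System Usability Scale scoring rule: ten items answered on a 1-5 scale, odd-numbered items contributing (score - 1) and even-numbered items (5 - score), with the sum scaled by 2.5 to a 0-100 range. The function name is illustrative:

```python
# Standard SUS scoring (Brooke's original scheme), yielding 0-100.

def sus_score(responses):
    if len(responses) != 10:
        raise ValueError("SUS has exactly 10 items")
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Best possible answers (5 on odd items, 1 on even items) score 100.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))
```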

  12. Design and implementation of a WF-based workflow system

    Institute of Scientific and Technical Information of China (English)

    王淑蓉; 孟德恩

    2011-01-01

    To meet the requirements of enterprise office automation, the architecture of a WF-based web workflow system is presented, integrating Windows Workflow Foundation (WF) with ASP.NET. The key technologies involved in the system's development are studied, namely communication between a workflow and its host and the persistence of workflow instances. The workflow system was developed and implemented in the Visual Studio 2008 and SQL Server 2005 development environment.

  13. Workflow Management in Electronic Commerce

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Spaccapietra, S.; March, S.T.; Kambayashi, Y.

    2002-01-01

    This tutorial addresses the application of workflow management (WFM) for process support in both these cases. The tutorial is organized into three parts. In the first part, we pay attention to (classical) workflow management in the context of a single organization. In the second part, we extend this

  14. A workflow for digitalization projects

    OpenAIRE

    De Mulder, Tom

    2005-01-01

    More and more institutions want to convert their traditional content to digital formats. In such projects the digitalization and metadata stages often happen asynchronously. This paper identifies the importance of frequent cross-verification of both. We suggest a workflow to formalise this process, and a possible technical implementation to automate this workflow.

  15. Automating Workflow using Dialectical Argumentation

    NARCIS (Netherlands)

    Urovi, Visara; Bromuri, Stefano; McGinnis, Jarred; Stathis, Kostas; Omicini, Andrea

    2008-01-01

    This paper presents a multi-agent framework based on argumentative agent technology for the automation of the workflow selection and execution. In this framework, workflow selection is coordinated by agent interactions governed by the rules of a dialogue game whose purpose is to evaluate the workflo

  16. Monitoring of Grid scientific workflows

    NARCIS (Netherlands)

    Balis, B.; Bubak, M.; Łabno, B.

    2008-01-01

    Scientific workflows are a means of conducting in silico experiments in modern computing infrastructures for e-Science, often built on top of Grids. Monitoring of Grid scientific workflows is essential not only for performance analysis but also to collect provenance data and gather feedback useful i

  17. The Design and Implementation of a Student Financial Assistance Management System Based on a Workflow Mechanism

    Institute of Scientific and Technical Information of China (English)

    揭财明; 吴凌

    2014-01-01

    Based on the business content of college student financial assistance management work, this paper takes a workflow mechanism as the core and adopts the currently popular ASP.NET technology to explore the design and implementation of a student financial assistance management system, achieving initial results.

  18. Constructing Workflows from Script Applications

    Directory of Open Access Journals (Sweden)

    Mikołaj Baranowski

    2012-01-01

    For programming and executing complex applications on grid infrastructures, scientific workflows have been proposed as a convenient high-level alternative to solutions based on general-purpose programming languages, APIs and scripts. GridSpace is a collaborative programming and execution environment which is based on a scripting approach and extends the Ruby language with a high-level API for invoking operations on remote resources. In this paper we describe a tool which converts GridSpace application source code into a workflow representation which, in turn, may be used for scheduling, provenance, or visualization. We describe how we addressed the issues of analyzing Ruby source code, resolving variable and method dependencies, and building the workflow representation. The solutions to these problems have been developed and evaluated by testing them on complex grid application workflows such as CyberShake, Epigenomics and Montage. The evaluation is enriched by representing typical workflow control-flow patterns.
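The dependency-resolution step at the heart of such script-to-workflow conversion can be sketched as follows: once each operation's consumed and produced variables are known, an execution order (and hence a workflow graph) falls out of a topological traversal. Operation and variable names are invented for the example; this is not the paper's tool:

```python
# Sketch: derive a workflow execution order from resolved variable
# dependencies between script operations.

def to_workflow(ops):
    """ops: {op_name: (consumed_vars, produced_vars)} -> execution order."""
    # Which operation produces each variable (external inputs have none).
    producers = {v: name for name, (_, outs) in ops.items() for v in outs}
    order, done = [], set()

    def visit(name):
        if name in done:
            return
        done.add(name)
        consumed, _ = ops[name]
        for v in consumed:
            if v in producers:          # schedule the producer first
                visit(producers[v])
        order.append(name)

    for name in ops:
        visit(name)
    return order

# Listed out of order on purpose: the dependencies decide the order.
ops = {"render":  (["mosaic"], ["image"]),
       "mosaic":  (["tiles"], ["mosaic"]),
       "project": (["raw"], ["tiles"])}
print(to_workflow(ops))
```

The same producer/consumer table also directly gives the workflow's edges, which is what a scheduler, provenance recorder, or visualizer would consume.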

  19. CMS Alignment and Calibration workflows: lessons learned and future plans

    CERN Document Server

    De Guio, Federico

    2014-01-01

    We review the online and offline workflows designed to align and calibrate the CMS detector. Starting from the experience gained during the first LHC run, we discuss the expected developments for Run II. In particular, we describe the envisioned stages, from alignment using cosmic-ray data to detector alignment and calibration using the first proton-proton collision data (O(100 pb-1)) and a larger dataset (O(1 fb-1)) to reach the target precision. The automation of the workflows and their integration into the online and offline activity (dedicated triggers and datasets, data skims, workflows to compute the calibration and alignment constants) are discussed.

  20. Managing and Documenting Legacy Scientific Workflows.

    Science.gov (United States)

    Acuña, Ruben; Chomilier, Jacques; Lacroix, Zoé

    2015-10-06

    Scientific legacy workflows are often developed over many years, poorly documented and implemented with scripting languages. In the context of our cross-disciplinary projects we face the problem of maintaining such scientific workflows. This paper presents the Workflow Instrumentation for Structure Extraction (WISE) method used to process several ad-hoc legacy workflows written in Python and automatically produce their workflow structural skeleton. Unlike many existing methods, WISE does not assume input workflows to be preprocessed in a known workflow formalism. It is also able to identify and analyze calls to external tools. We present the method and report its results on several scientific workflows.

  1. Concurrency & Asynchrony in Declarative Workflows

    DEFF Research Database (Denmark)

    Debois, Søren; Hildebrandt, Thomas; Slaats, Tijs

    2015-01-01

    Declarative or constraint-based business process and workflow notations have received increasing interest in the last decade as possible means of addressing the challenge of supporting at the same time flexibility in execution, adaptability and compliance. However, the definition of concurrent...... of concurrency in DCR Graphs admits asynchronous execution of declarative workflows both conceptually and by reporting on a prototype implementation of a distributed declarative workflow engine. Both the theoretical development and the implementation is supported by an extended example; moreover, the theoretical...
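The condition/response core of DCR Graph semantics, on which the paper's notion of concurrency is built, can be sketched in a few lines. Include/exclude relations and the concurrency machinery itself are omitted for brevity; event names and the class shape are illustrative, not the authors' implementation:

```python
# Minimal DCR-style declarative execution: an event is enabled when its
# conditions have been executed; executing it may create pending
# responses, and the run is accepting only when nothing is pending.

class DCRGraph:
    def __init__(self, events, conditions, responses):
        self.events = set(events)
        self.conditions = conditions   # {e: events that must precede e}
        self.responses = responses     # {e: events that become pending after e}
        self.executed = set()
        self.pending = set()

    def enabled(self, e):
        return set(self.conditions.get(e, [])) <= self.executed

    def execute(self, e):
        if not self.enabled(e):
            raise RuntimeError(f"{e} is not enabled")
        self.executed.add(e)
        self.pending.discard(e)
        self.pending |= set(self.responses.get(e, []))

    def accepting(self):               # no outstanding obligations
        return not self.pending

g = DCRGraph({"propose", "accept", "archive"},
             conditions={"accept": ["propose"]},
             responses={"propose": ["accept"]})
g.execute("propose")                   # makes "accept" pending and enabled
g.execute("accept")                    # discharges the pending response
print(g.accepting())
```

Note there is no control-flow graph: any enabled event may fire at any time, which is exactly the execution flexibility that makes defining concurrency for such notations non-trivial.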

  2. Design and Implementation of an Outage Management System Based on Workflow

    Institute of Scientific and Technical Information of China (English)

    葛群波; 杨玉山

    2011-01-01

    This paper introduces workflow technology and the current state of daily outage management at the Gongshu Power Supply Bureau, proposes a workflow-based approach to outage management, and describes the design principles of the outage management system. Concrete implementation details such as process design and outage notification are discussed. The paper concludes that the outage management system is currently a good solution for the bureau's outage management work.

  3. CA-PLAN, a Service-Oriented Workflow

    Institute of Scientific and Technical Information of China (English)

    Shung-Bin Yan; Feng-Jian Wang

    2005-01-01

    Workflow management systems (WfMSs) are accepted worldwide due to their ability to model and control business processes. Previously, we defined an intra-organizational workflow specification model, Process LANguage (PLAN). PLAN, with associated tools, allowed a user to describe a graph specification for processes, artifacts, and participants in an organization. PLAN has been successfully implemented in Agentflow to support workflow (Agentflow) applications. PLAN, and most current WfMSs, are designed around a centralized architecture so that they can be applied within a single organization. However, in such a structure, participants in Agentflow applications in different organizations cannot serve each other with workflows. In this paper, a service-oriented cooperative workflow model, Cooperative Agentflow Process LANguage (CA-PLAN), is presented. CA-PLAN proposes a workflow component model for inter-organizational processes. In CA-PLAN, an inter-organizational process is partitioned into several intra-organizational processes. Each workflow system inside an organization is modeled as an Integrated Workflow Component (IWC). Each IWC contains a process service interface, specifying the process services provided by an organization, in conjunction with a remote process interface specifying which remote process services provided by other organizations are referred to, together with the intra-organizational processes. An IWC is both a workflow node and a participant. An inter-organizational process is made up of connections among these process services and remote processes with respect to the different IWCs. The related service techniques and supporting tools provided in Agentflow systems are also presented.

  4. Provenance-Powered Automatic Workflow Generation and Composition

    Science.gov (United States)

    Zhang, J.; Lee, S.; Pan, L.; Lee, T. J.

    2015-12-01

    In recent years, scientists have learned how to codify tools into reusable software modules that can be chained into multi-step executable workflows. Existing scientific workflow tools, created by computer scientists, require domain scientists to meticulously design their multi-step experiments before analyzing data. However, this is often at odds with a domain scientist's daily routine of conducting research and exploration. We hope to resolve this dispute. Imagine this: an Earth scientist starts her day applying NASA Jet Propulsion Laboratory (JPL) published climate data processing algorithms over ARGO deep ocean temperature and AMSRE sea surface temperature datasets. Throughout the day, she tunes the algorithm parameters to study various aspects of the data. Suddenly, she notices some interesting results. She then turns to a computer scientist and asks, "Can you reproduce my results?" By tracking and reverse engineering her activities, the computer scientist creates a workflow. The Earth scientist can now rerun the workflow to validate her findings, modify the workflow to discover further variations, or publish the workflow to share the knowledge. In this way, we aim to revolutionize computer-supported Earth science. We have developed a prototype system to realize this vision in the context of service-oriented science. We have studied how Earth scientists conduct service-oriented data analytics research in their daily work, developed a provenance model to record their activities, and developed a technology to automatically generate workflows from user behavior, supporting the adaptation and reuse of these workflows for replicating and improving scientific studies. A data-centric repository infrastructure is established to capture richer provenance and further facilitate collaboration in the science community. We have also established a Petri-net-based verification instrument for provenance-based automatic workflow generation and recommendation.
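    The core idea, deriving a workflow by reverse engineering recorded activities, can be illustrated with a toy provenance log. This is a hedged sketch under invented names (`derive_workflow` and the `(tool, input, output)` record shape are assumptions, not the paper's actual provenance model):

```python
# Illustrative only: derive a linear workflow from a provenance log of
# (tool, input artifact, output artifact) records by chaining on
# matching artifacts.
log = [("subset", "argo.nc", "argo_subset.nc"),
       ("regrid", "argo_subset.nc", "argo_grid.nc"),
       ("compare", "argo_grid.nc", "report.pdf")]

def derive_workflow(records):
    steps = [{"tool": t, "in": src, "out": dst} for t, src, dst in records]
    # sanity check: each step must consume the previous step's output
    for a, b in zip(steps, steps[1:]):
        assert b["in"] == a["out"], "log does not form a single chain"
    return [s["tool"] for s in steps]
```

    Rerunning the derived chain over the same inputs is what lets the scientist validate, vary, or share her result.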

  5. Workflows for microarray data processing in the Kepler environment

    Directory of Open Access Journals (Sweden)

    Stropp Thomas

    2012-05-01

    Full Text Available Abstract Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to

  6. Summer Student Report - AV Workflow

    CERN Document Server

    Abramson, Jessie

    2014-01-01

    The AV Workflow is a web application that allows CERN users to publish, update and delete videos from CDS. During my summer internship I implemented the backend of the new version of the AV Workflow in Python using the Django framework.

  7. Design and implementation of an electricity information acquisition system based on a workflow engine

    Institute of Scientific and Technical Information of China (English)

    罗天; 赵丹阳; 郑静雯

    2015-01-01

    An important problem with current electricity information collection systems is that business processes are hard-coded into the programs: adapting to changes in the business logic requires modifying the program itself, which causes a large amount of construction and maintenance work. A workflow engine can solve this problem by automatically executing predefined business processes and driving the software to run in an orderly fashion. Based on workflow technology, this article designs and implements an electricity information acquisition system in which workflows are defined, managed and monitored, effectively improving the efficiency of business handling. When the actual business logic changes, the workflows can simply be adjusted or redefined to match, greatly reducing development and maintenance costs.

  8. Standards for business analytics and departmental workflow.

    Science.gov (United States)

    Erickson, Bradley J; Meenan, Christopher; Langer, Steve

    2013-02-01

    Efficient workflow is essential for a successful business. However, there is relatively little literature on analytical tools and standards for defining workflow and measuring workflow efficiency. Here, we describe an effort to define a workflow lexicon for medical imaging departments, including the rationale, the process, and the resulting lexicon.

  9. Design of a government electronic procurement system based on workflow

    Institute of Scientific and Technical Information of China (English)

    徐静; 秦龙

    2012-01-01

    Government departments need to grasp procurement information promptly and fully, respond to it in real time, and, guided by suitable optimization strategies and sound evaluation methods, organize procurement quickly and effectively. Workflow technology, as an emerging solution, has therefore been introduced into government procurement in order to realize a well-organized and complete procurement process. Its introduction greatly improves procurement speed and efficiency and saves funds. This paper analyzes and discusses these issues.

  10. Design of a large warehouse logistics monitoring system based on workflow

    Institute of Scientific and Technical Information of China (English)

    李志民; 赵一丁

    2016-01-01

    Most current large warehouse logistics monitoring systems address only structured business processes, with limited functionality and poor monitoring performance. Workflow management technology can improve the flexibility of a warehouse logistics monitoring system and maximize enterprise benefit by shaping business processes efficiently, so a workflow-based large warehouse logistics monitoring system is proposed. The system adopts a component-based design, uses a scripting language to describe the workflow processes of warehouse logistics monitoring, and stores the workflow data in a database; clients interact with the system through a browser. The architecture comprises a data layer, a business layer and a presentation layer, and the system consists of six function modules: transportation management, warehouse management, customer management, system maintenance, monitoring management and workflow management. The key codes for the user's retrieval and query with the system are given. Experimental results show that the designed system performs well in monitoring time, monitoring efficiency and monitoring precision, and has high application value.

  11. SPECT/CT workflow and imaging protocols

    Energy Technology Data Exchange (ETDEWEB)

    Beckers, Catherine [University Hospital of Liege, Division of Nuclear Medicine and Oncological Imaging, Department of Medical Physics, Liege (Belgium); Hustinx, Roland [University Hospital of Liege, Division of Nuclear Medicine and Oncological Imaging, Department of Medical Physics, Liege (Belgium); Domaine Universitaire du Sart Tilman, Service de Medecine Nucleaire et Imagerie Oncologique, CHU de Liege, Liege (Belgium)

    2014-05-15

    Introducing a hybrid imaging method such as single photon emission computed tomography (SPECT)/CT greatly alters the routine in the nuclear medicine department. It requires designing new workflow processes and the revision of original scheduling process and imaging protocols. In addition, the imaging protocol should be adapted for each individual patient, so that performing CT is fully justified and the CT procedure is fully tailored to address the clinical issue. Such refinements often occur before the procedure is started but may be required at some intermediate stage of the procedure. Furthermore, SPECT/CT leads in many instances to a new partnership with the radiology department. This article presents practical advice and highlights the key clinical elements which need to be considered to help understand the workflow process of SPECT/CT and optimise imaging protocols. The workflow process using SPECT/CT is complex in particular because of its bimodal character, the large spectrum of stakeholders, the multiplicity of their activities at various time points and the need for real-time decision-making. With help from analytical tools developed for quality assessment, the workflow process using SPECT/CT may be separated into related, but independent steps, each with its specific human and material resources to use as inputs or outputs. This helps identify factors that could contribute to failure in routine clinical practice. At each step of the process, practical aspects to optimise imaging procedure and protocols are developed. A decision-making algorithm for justifying each CT indication as well as the appropriateness of each CT protocol is the cornerstone of routine clinical practice using SPECT/CT. In conclusion, implementing hybrid SPECT/CT imaging requires new ways of working. It is highly rewarding from a clinical perspective, but it also proves to be a daily challenge in terms of management. (orig.)

  12. Information Issues and Contexts that Impair Team Based Communication Workflow: A Palliative Sedation Case Study.

    Science.gov (United States)

    Cornett, Alex; Kuziemsky, Craig

    2015-01-01

    Implementing team based workflows can be complex because of the scope of providers involved and the extent of information exchange and communication that needs to occur. While a workflow may represent the ideal structure of communication that needs to occur, information issues and contextual factors may impact how the workflow is implemented in practice. Understanding these issues will help us better design systems to support team based workflows. In this paper we use a case study of palliative sedation therapy (PST) to model a PST workflow and then use it to identify purposes of communication, information issues and contextual factors that impact them. We then suggest how our findings could inform health information technology (HIT) design to support team based communication workflows.

  13. Flash Builder and Flash Catalyst The New Workflow

    CERN Document Server

    Peeters, Steven

    2010-01-01

    The Flash Platform is changing. Flash Builder and Flash Catalyst have brought a new separation of design and coding to web development that enables a much more efficient and streamlined workflow. For designers and developers used to the close confines of Flash, this is a hugely liberating but at first alien concept. This book teaches you the new workflow for the Flash platform. It gives an overview of the technologies involved and provides you with real-world project examples and best-practice guidelines to get from design to implementation with the tools at hand. * Includes many examples* Foc

  14. A Novel Verification Approach of Workflow Schema

    Institute of Scientific and Technical Information of China (English)

    WANG Guangqi; WANG Juying; WANG Yan; SONG Baoyan; YU Ge

    2006-01-01

    A workflow schema is an abstract description of the business processed by workflow model, and plays a critical role in analyzing, executing and reorganizing business processes. The verification issue on the correctness of complicated workflow schemas is difficult in the field of workflow. We make an intensive study of it in this paper. We present here local errors and schema logic errors (global errors) in workflow schemas in detail, and offer some constraint rules trying to avoid schema errors during modeling. In addition, we propose a verification approach based on graph reduction and graph spread, and give the algorithm. The algorithm is implemented in a workflow prototype system e-ScopeWork.
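    As a rough illustration of reduction-based verification (a simplified stand-in, not e-ScopeWork's actual reduction rules), a schema graph can be shrunk by repeatedly contracting nodes that have exactly one predecessor and one successor; a well-structured sequential schema collapses to a single start-to-end edge, while leftover structure signals a part of the schema to inspect for errors:

```python
def reduce_schema(edges):
    """Repeatedly contract nodes with exactly one predecessor and one
    successor; a well-structured sequential schema reduces to the
    single edge ("start", "end")."""
    edges = set(edges)
    changed = True
    while changed:
        changed = False
        nodes = ({u for u, _ in edges} | {v for _, v in edges}) - {"start", "end"}
        for n in nodes:
            preds = [u for u, v in edges if v == n]
            succs = [v for u, v in edges if u == n]
            if len(preds) == 1 and len(succs) == 1:
                # splice n out: pred -> n -> succ becomes pred -> succ
                edges -= {(preds[0], n), (n, succs[0])}
                edges.add((preds[0], succs[0]))
                changed = True
                break
    return edges
```

    A disconnected or malformed schema does not reduce fully, which is the cue for further analysis in a real verifier.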

  15. CMS data and workflow management system

    CERN Document Server

    Fanfani, A; Bacchi, W; Codispoti, G; De Filippis, N; Pompili, A; My, S; Abbrescia, M; Maggi, G; Donvito, G; Silvestris, L; Calzolari, F; Sarkar, S; Spiga, D; Cinquili, M; Lacaprara, S; Biasotto, M; Farina, F; Merlo, M; Belforte, S; Kavka, C; Sala, L; Harvey, J; Hufnagel, D; Fanzago, F; Corvo, M; Magini, N; Rehn, J; Toteva, Z; Feichtinger, D; Tuura, L; Eulisse, G; Bockelman, B; Lundstedt, C; Egeland, R; Evans, D; Mason, D; Gutsche, O; Sexton-Kennedy, L; Dagenhart, D W; Afaq, A; Guo, Y; Kosyakov, S; Lueking, L; Sekhri, V; Fisk, I; McBride, P; Bauerdick, L; Bakken, J; Rossman, P; Wicklund, E; Wu, Y; Jones, C; Kuznetsov, V; Riley, D; Dolgert, A; van Lingen, F; Narsky, I; Paus, C; Klute, M; Gomez-Ceballos, G; Piedra-Gomez, J; Miller, M; Mohapatra, A; Lazaridis, C; Bradley, D; Elmer, P; Wildish, T; Wuerthwein, F; Letts, J; Bourilkov, D; Kim, B; Smith, P; Hernandez, J M; Caballero, J; Delgado, A; Flix, J; Cabrillo-Bartolome, I; Kasemann, M; Flossdorf, A; Stadie, H; Kreuzer, P; Khomitch, A; Hof, C; Zeidler, C; Kalini, S; Trunov, A; Saout, C; Felzmann, U; Metson, S; Newbold, D; Geddes, N; Brew, C; Jackson, J; Wakefield, S; De Weirdt, S; Adler, V; Maes, J; Van Mulders, P; Villella, I; Hammad, G; Pukhaeva, N; Kurca, T; Semneniouk, I; Guan, W; Lajas, J A; Teodoro, D; Gregores, E; Baquero, M; Shehzad, A; Kadastik, M; Kodolova, O; Chao, Y; Ming Kuo, C; Filippidis, C; Walzel, G; Han, D; Kalinowski, A; Giro de Almeida, N M; Panyam, N

    CMS expects to manage many tens of petabytes of data distributed over several computing centers around the world. The CMS distributed computing and analysis model is designed to serve, process and archive the large number of events that will be generated when the CMS detector starts taking data. The underlying concepts and the overall architecture of the CMS data and workflow management system will be presented. In addition, the experience in using the system for MC production, initial detector commissioning activities and data analysis will be summarized.

  16. PRODUCT-ORIENTED WORKFLOW MANAGEMENT IN CAPP

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    A product-oriented process workflow management model is proposed based on multi-agent technology. The autonomy, inter-operability, scalability and flexibility of agents are used to coordinate the whole process planning and achieve full sharing of resources and information. Thus, unnecessary waste of human labor, time and work is reduced and the computer-aided process planning (CAPP) system's adaptability and stability are improved. In the detailed implementation, according to the product's BOM (bill of materials) in structural design, task assignment, management control, automatic process making, process examination and process sanction are combined into a unified management to make adjustment, control and management convenient.

  17. Modeling workflow using XML and Petri net

    Institute of Scientific and Technical Information of China (English)

    杨东; 温泉; 张申生

    2004-01-01

    Nowadays an increasing number of workflow products and research prototypes have begun to adopt XML for representing workflow models, owing to its ease of use and intelligibility to both people and machines. However, most workflow products and research prototypes provide little support for the verification of XML-based workflow models, such as checking deadlock-freedom, which is essential to the successful application of workflow technology. In this paper, we tackle this problem by mapping the XML-based workflow model into a Petri net, a well-known formalism for modeling, analyzing and verifying systems. As a result, the XML-based workflow model can be automatically verified with the help of general Petri net tools, such as DANAMICS. The presented approach not only enables end users to represent workflow models with an XML-based modeling language, but also ensures the correctness of the model, thus satisfying the needs of business processes.
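    A minimal sketch of the mapping idea in Python (the XML schema and function names here are invented for illustration; the paper's actual mapping and the DANAMICS tool are far richer): each `<task>` element becomes a transition, places chain the tasks, and token simulation checks that the final place is reachable.

```python
import xml.etree.ElementTree as ET

# Invented mini-schema for illustration: a purely sequential workflow.
xml_model = """
<workflow>
  <task name="receive"/>
  <task name="approve"/>
  <task name="archive"/>
</workflow>
"""

def to_petri_net(xml_text):
    """Map each <task> to a transition with one input and one output
    place, chaining the tasks: p0 -[receive]-> p1 -[approve]-> ..."""
    tasks = [t.get("name") for t in ET.fromstring(xml_text).findall("task")]
    transitions = {name: (f"p{i}", f"p{i + 1}") for i, name in enumerate(tasks)}
    return transitions, "p0", f"p{len(tasks)}"

def reaches_end(transitions, start, end):
    """Token simulation: fire enabled transitions until quiescence and
    check whether the token reached the final place (no deadlock)."""
    marking = {start}
    fired = True
    while fired:
        fired = False
        for pin, pout in transitions.values():
            if pin in marking:
                marking = (marking - {pin}) | {pout}
                fired = True
    return end in marking

transitions, start, end = to_petri_net(xml_model)
```

    Real workflow models also contain splits and joins, which map to transitions with multiple input or output places; the reachability check then becomes a genuine state-space exploration.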

  18. Rapid Energy Modeling Workflow Demonstration

    Science.gov (United States)

    2013-10-31

    BIM: Building Information Modeling; BPA: Building Performance Analysis; BTU: British Thermal Unit; CBECS: Commercial Buildings Energy Consumption Survey. ... geometry, orientation, weather, and materials, generates 3D Building Information Models (BIM) guided by satellite views of building footprints and ... Rapid Energy Modeling (REM) workflows that employed building information modeling (BIM) approaches and conceptual energy analysis.

  19. Concurrency & Asynchrony in Declarative Workflows

    DEFF Research Database (Denmark)

    Debois, Søren; Hildebrandt, Thomas; Slaats, Tijs

    2015-01-01

    Declarative or constraint-based business process and workflow notations have received increasing interest in the last decade as possible means of addressing the challenge of supporting at the same time flexibility in execution, adaptability and compliance. However, the definition of concurrent se...... development has been verified correct in the Isabelle-HOL interactive theorem prover....

  20. Constructing workflows from script applications

    NARCIS (Netherlands)

    Baranowski, M.; Belloum, A.; Bubak, M.; Malawski, M.

    2012-01-01

    For programming and executing complex applications on grid infrastructures, scientific workflows have been proposed as convenient high-level alternative to solutions based on general-purpose programming languages, APIs and scripts. GridSpace is a collaborative programming and execution environment,

  1. An Overview of Workflow Management on Mobile Agent Technology

    Directory of Open Access Journals (Sweden)

    Anup Patnaik

    2014-07-01

    Full Text Available Mobile agent workflow management/plugins are quite appropriate for handling control flows in open distributed systems; this is an emerging technology that can bring process-oriented tasks from diverse frameworks to run as a single unit. Workflow technology offers organizations the opportunity to reshape business processes beyond the boundaries of their own organizations, so that instead of static models, the modern era sees dynamic workflows that can respond to changes during execution, provide necessary security measures, offer a great degree of adaptivity, troubleshoot running processes, and recover lost states through fault tolerance. The prototype we plan to design aims to ensure reliability, security, robustness and scalability without being forced to trade off performance. This paper is concerned with the design, implementation and performance evaluation of the improved methods of the proposed prototype models, based on current research in this domain.

  2. Execution Time Estimation for Workflow Scheduling

    NARCIS (Netherlands)

    Chirkin, A.M.; Belloum, A..S.Z.; Kovalchuk, S.V.; Makkes, M.X.

    2014-01-01

    Estimation of the execution time is an important part of the workflow scheduling problem. The aim of this paper is to highlight common problems in estimating the workflow execution time and propose a solution that takes into account the complexity and the randomness of the workflow components and th
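    One simple way to account for the randomness of workflow components (a generic sketch, not necessarily the estimator proposed in the paper) is Monte Carlo sampling of task durations over the workflow DAG and averaging the critical-path length:

```python
import random

def estimate_makespan(tasks, deps, samples=2000, seed=0):
    """Average critical-path length over sampled task durations.
    `tasks` maps task name -> (min, max) duration, listed in
    topological order; `deps` maps task -> list of prerequisites."""
    rng = random.Random(seed)
    totals = []
    for _ in range(samples):
        finish = {}
        for t, (lo, hi) in tasks.items():
            start = max((finish[d] for d in deps.get(t, [])), default=0.0)
            finish[t] = start + rng.uniform(lo, hi)
        totals.append(max(finish.values()))
    return sum(totals) / len(totals)

tasks = {"fetch": (1, 2), "clean": (2, 4), "plot": (1, 1)}  # duration ranges
deps = {"clean": ["fetch"], "plot": ["clean"]}
# For this chain the expected makespan is about 1.5 + 3.0 + 1.0 = 5.5
```

    A scheduler can feed such estimates back into placement decisions; the example task names and duration ranges above are of course hypothetical.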

  3. Integration of the radiotherapy irradiation planning in the digital workflow; Integration der Bestrahlungsplanung in den volldigitalen Workflow

    Energy Technology Data Exchange (ETDEWEB)

    Roehner, F.; Schmucker, M.; Henne, K.; Bruggmoser, G.; Grosu, A.L.; Frommhold, H.; Heinemann, F.E. [Universitaetsklinikum Freiburg (Germany). Klinik fuer Strahlenheilkunde; Momm, F. [Ortenau Klinikum, Offenburg-Gengenbach (Germany). Radio-Onkologie

    2013-02-15

    Background and purpose: At the Clinic of Radiotherapy at the University Hospital Freiburg, all relevant workflow is paperless. After implementing the Operating Schedule System (OSS) as a framework, all processes are being implemented into the departmental system MOSAIQ. Designing a digital workflow for radiotherapy irradiation planning is a large challenge, it requires interdisciplinary expertise and therefore the interfaces between the professions also have to be interdisciplinary. For every single step of radiotherapy irradiation planning, distinct responsibilities have to be defined and documented. All aspects of digital storage, backup and long-term availability of data were considered and have already been realized during the OSS project. Method: After an analysis of the complete workflow and the statutory requirements, a detailed project plan was designed. In an interdisciplinary workgroup, problems were discussed and a detailed flowchart was developed. The new functionalities were implemented in a testing environment by the Clinical and Administrative IT Department (CAI). After extensive tests they were integrated into the new modular department system. Results and conclusion: The Clinic of Radiotherapy succeeded in realizing a completely digital workflow for radiotherapy irradiation planning. During the testing phase, our digital workflow was examined and afterwards was approved by the responsible authority. (orig.)

  4. The Prosthetic Workflow in the Digital Era

    Directory of Open Access Journals (Sweden)

    Lidia Tordiglione

    2016-01-01

    Full Text Available The purpose of this retrospective study was to clinically evaluate the benefits of adopting a full digital workflow for the implementation of fixed prosthetic restorations on natural teeth. To evaluate the effectiveness of these protocols, treatment plans were drawn up for 15 patients requiring rehabilitation of one or more natural teeth. All the dental impressions were taken using a Planmeca PlanScan® (Planmeca OY, Helsinki, Finland) intraoral scanner, which provided digital casts on which the restorations were digitally designed using Exocad® (Exocad GmbH, Germany, 2010) software and fabricated by CAM processing on 5-axis milling machines. A total of 28 single crowns were made from monolithic zirconia, 12 vestibular veneers from lithium disilicate, and 4 three-quarter vestibular veneers with palatal extension. While the restorations were applied, the authors could clinically appreciate the excellent match between the digitally produced prosthetic design and the cemented prostheses, which never required any occlusal or proximal adjustment. Out of all the restorations applied, only one exhibited premature failure and was replaced with no other complications or need for further scanning. From the clinical experience gained using a full digital workflow, the authors can confirm that these work processes enable the fabrication of clinically reliable restorations, with all the benefits that digital methods bring to the dentist, the dental laboratory, and the patient.

  5. The Prosthetic Workflow in the Digital Era

    Science.gov (United States)

    De Franco, Michele; Bosetti, Giovanni

    2016-01-01

    The purpose of this retrospective study was to clinically evaluate the benefits of adopting a full digital workflow for the implementation of fixed prosthetic restorations on natural teeth. To evaluate the effectiveness of these protocols, treatment plans were drawn up for 15 patients requiring rehabilitation of one or more natural teeth. All the dental impressions were taken using a Planmeca PlanScan® (Planmeca OY, Helsinki, Finland) intraoral scanner, which provided digital casts on which the restorations were digitally designed using Exocad® (Exocad GmbH, Germany, 2010) software and fabricated by CAM processing on 5-axis milling machines. A total of 28 single crowns were made from monolithic zirconia, 12 vestibular veneers from lithium disilicate, and 4 three-quarter vestibular veneers with palatal extension. While the restorations were applied, the authors could clinically appreciate the excellent match between the digitally produced prosthetic design and the cemented prostheses, which never required any occlusal or proximal adjustment. Out of all the restorations applied, only one exhibited premature failure and was replaced with no other complications or need for further scanning. From the clinical experience gained using a full digital workflow, the authors can confirm that these work processes enable the fabrication of clinically reliable restorations, with all the benefits that digital methods bring to the dentist, the dental laboratory, and the patient. PMID:27829834

  6. Modeling Workflow Using UML Activity Diagram

    Institute of Scientific and Technical Information of China (English)

    Wei Yinxing(韦银星); Zhang Shensheng

    2004-01-01

    An enterprise can improve its adaptability in a changing market by means of workflow technologies. At build time, the main function of a Workflow Management System (WFMS) is to model the business process. A workflow model is an abstract representation of a real-world business process. The Unified Modeling Language (UML) activity diagram is an important visual process modeling language proposed by the Object Management Group (OMG). The novelty of this paper is representing workflow models by means of UML activity diagrams. A translation from UML activity diagrams to π-calculus is established. Using π-calculus, the deadlock property of workflows is analyzed.

  7. Workflow-based approaches to neuroimaging analysis.

    Science.gov (United States)

    Fissell, Kate

    2007-01-01

    Analysis of functional and structural magnetic resonance imaging (MRI) brain images requires a complex sequence of data processing steps to proceed from raw image data to the final statistical tests. Neuroimaging researchers have begun to apply workflow-based computing techniques to automate data analysis tasks. This chapter discusses eight major components of workflow management systems (WFMSs): the workflow description language, editor, task modules, data access, verification, client, engine, and provenance, and their implementation in the Fiswidgets neuroimaging workflow system. Neuroinformatics challenges involved in applying workflow techniques in the domain of neuroimaging are discussed.

  8. Nursing medication administration and workflow using computerized physician order entry.

    Science.gov (United States)

    Tschannen, Dana; Talsma, Akkeneel; Reinemeyer, Nicholas; Belt, Christine; Schoville, Rhonda

    2011-07-01

    The benefits of computerized physician order entry systems have been described widely; however, the impact of computerized physician order entry on nursing workflow and its potential for error are unclear. The purpose of this study was to determine the impact of a computerized physician order entry system on nursing workflow. Using an exploratory design, nurses employed on an adult ICU (n = 36) and a general pediatric unit (n = 50) involved in computerized physician order entry-based medication delivery were observed. Nurses were also asked questions regarding the impact of computerized physician order entry on nursing workflow. Observations revealed total time required for administering medications averaged 8.45 minutes in the ICU and 9.93 minutes in the pediatric unit. Several additional steps were required in the process for pediatric patients, including preparing the medications and communicating with patients and family, which resulted in greater time associated with the delivery of medications. Frequent barriers to workflow were noted by nurses across settings, including system issues (ie, inefficient medication reconciliation processes, long order sets requiring more time to determine medication dosage), less frequent interaction between the healthcare team, and greater use of informal communication modes. Areas for nursing workflow improvement include (1) medication reconciliation/order duplication, (2) strategies to improve communication, and (3) evaluation of the impact of computerized physician order entry on practice standards.

  9. Nexus: A modular workflow management system for quantum simulation codes

    Science.gov (United States)

    Krogel, Jaron T.

    2016-01-01

    The management of simulation workflows represents a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.
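    The flavor of such modular job composition can be sketched as follows (a loose illustration; `Job` and `execution_order` are invented names, not Nexus's real API): jobs declare dependencies, and the manager resolves a dispatch order before running anything.

```python
class Job:
    """One simulation step (e.g. a DFT run feeding a QMC run)."""
    def __init__(self, name, depends_on=()):
        self.name, self.depends_on = name, list(depends_on)

def execution_order(jobs):
    """Topologically order jobs so every dependency runs first."""
    done, order = set(), []
    pending = {j.name: j for j in jobs}
    while pending:
        ready = [n for n, j in pending.items()
                 if all(d in done for d in j.depends_on)]
        if not ready:
            raise ValueError("cyclic dependency between jobs")
        for n in sorted(ready):
            order.append(n)
            done.add(n)
            del pending[n]
    return order

jobs = [Job("qmc", depends_on=["dft"]), Job("dft", depends_on=["relax"]),
        Job("relax")]
```

    A real manager would then submit each ready job to the workstation or batch scheduler; only the ordering step is shown here.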

  10. The Design of a Dynamic Workflow System for a Hospital Office System

    Institute of Scientific and Technical Information of China (English)

    柳适; 曾小平; 陈俊

    2015-01-01

    To address the increasingly complex workflows of hospital administration, improve the approval process, raise the efficiency of approval procedures, and make approvals standardized, informatized and paperless, the First Affiliated Hospital of Wenzhou Medical University designed and developed a dynamic workflow approval application suited to hospital use. This paper introduces the design of the application, the key technologies used, and the results of its use.

  11. A Drupal-Based Collaborative Framework for Science Workflows

    Science.gov (United States)

    Pinheiro da Silva, P.; Gandara, A.

    2010-12-01

    Cyber-infrastructure is built from utilizing technical infrastructure to support organizational practices and social norms to provide support for scientific teams working together or dependent on each other to conduct scientific research. Such cyber-infrastructure enables the sharing of information and data so that scientists can leverage knowledge and expertise through automation. Scientific workflow systems have been used to build automated scientific systems used by scientists to conduct scientific research and, as a result, create artifacts in support of scientific discoveries. These complex systems are often developed by teams of scientists who are located in different places, e.g., scientists working in distinct buildings, and sometimes in different time zones, e.g., scientists working in distinct national laboratories. The sharing of these specifications is currently supported by the use of version control systems such as CVS or Subversion. Discussions about the design, improvement, and testing of these specifications, however, often happen elsewhere, e.g., through the exchange of email messages and IM chatting. Carrying on a discussion about these specifications is challenging because comments and specifications are not necessarily connected. For instance, the person reading a comment about a given workflow specification may not be able to see the workflow, and even if the person can see the workflow, the person may not specifically know to which part of the workflow a given comment applies. In this paper, we discuss the design, implementation and use of CI-Server, a Drupal-based infrastructure, to support the collaboration of both local and distributed teams of scientists using scientific workflows.
CI-Server has three primary goals: to enable information sharing by providing tools that scientists can use within their scientific research to process data, publish and share artifacts; to build community by providing tools that support discussions between

  12. Managing Library IT Workflow with Bugzilla

    Directory of Open Access Journals (Sweden)

    Nina McHale

    2010-09-01

    Full Text Available Prior to September 2008, all technology issues at the University of Colorado Denver's Auraria Library were reported to a dedicated departmental phone line. A variety of staff changes necessitated a more formal means of tracking, delegating, and resolving reported issues, and the department turned to Bugzilla, an open source bug tracking application designed by Mozilla.org developers. While designed with software development bug tracking in mind, Bugzilla can be easily customized and modified to serve as an IT ticketing system. Twenty-three months and over 2300 trouble tickets later, Auraria's IT department workflow is much smoother and more efficient. This article includes two Perl Template Toolkit code samples for customized Bugzilla screens for its use in a library environment; readers will be able to easily replicate the project in their own environments.

  13. Operational Semantic of Workflow Engine and the Realizing Technique

    Institute of Scientific and Technical Information of China (English)

    FU Yan-ning; LIU Lei; ZHAO Dong-fan; JIN Long-fei

    2005-01-01

    At present, there is no formalized description of the executing procedure of workflow models. Here, the procedure by which workflow models execute in a workflow engine is described using operational semantics. The formalized description of process instances and activity instances gives the workflow engine a very clear structure, eases cooperation between heterogeneous workflow engines, and guides the realization of the workflow engine's functions. The workflow engine software has also been completed on the basis of this formalized description.
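    The operational-semantics view of a workflow engine can be illustrated as explicit state machines for activity and process instances. The states and events below are a generic sketch, not the paper's formalism:

```python
# Illustrative sketch (not the paper's formal semantics): activity instances
# as state machines with guarded transitions; a process instance completes
# when all of its activity instances have completed.
ACTIVITY_TRANSITIONS = {
    ("created", "start"): "running",
    ("running", "complete"): "completed",
    ("running", "suspend"): "suspended",
    ("suspended", "resume"): "running",
}

class ActivityInstance:
    def __init__(self, name):
        self.name, self.state = name, "created"

    def fire(self, event):
        key = (self.state, event)
        if key not in ACTIVITY_TRANSITIONS:
            raise ValueError(f"illegal transition {key} for {self.name}")
        self.state = ACTIVITY_TRANSITIONS[key]
        return self.state

class ProcessInstance:
    def __init__(self, activity_names):
        self.activities = [ActivityInstance(n) for n in activity_names]

    @property
    def state(self):
        done = all(a.state == "completed" for a in self.activities)
        return "completed" if done else "running"

p = ProcessInstance(["review", "approve"])
for a in p.activities:
    a.fire("start")
    a.fire("complete")
print(p.state)  # -> completed
```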

  14. Agile parallel bioinformatics workflow management using Pwrake.

    OpenAIRE

    2011-01-01

    Abstract Background In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environm...

  15. CSP for Executable Scientific Workflows

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard

    is demonstrated through examples. By providing a robust library for organising scientific workflows in a Python application I hope to inspire scientific users to adopt PyCSP. As a proof-of-concept this thesis demonstrates three scientific applications: kNN, stochastic minimum search and McStas to scale well...... on multi-processing and cluster computing using PyCSP. Additionally, McStas is demonstrated to utilise grid computing resources using PyCSP. Finally, this thesis presents a new dynamic channel model, which has not yet been implemented for PyCSP. The dynamic channel is able to change the internal...

  16. Integration of services into workflow applications

    CERN Document Server

    Czarnul, Pawel

    2015-01-01

    Describing state-of-the-art solutions in distributed system architectures, Integration of Services into Workflow Applications presents a concise approach to the integration of loosely coupled services into workflow applications. It discusses key challenges related to the integration of distributed systems and proposes solutions, both in terms of theoretical aspects such as models and workflow scheduling algorithms, and technical solutions such as software tools and APIs. The book provides an in-depth look at workflow scheduling and proposes a way to integrate several different types of services

  17. Pro WF Windows Workflow in NET 40

    CERN Document Server

    Bukovics, Bruce

    2010-01-01

    Windows Workflow Foundation (WF) is a revolutionary part of the .NET 4 Framework that allows you to orchestrate human and system interactions as a series of workflows that can be easily mapped, analyzed, adjusted, and implemented. As business problems become more complex, the need for workflow-based solutions has never been more evident. WF provides a simple and consistent way to model and implement complex problems. As a developer, you focus on developing the business logic for individual workflow tasks. The runtime handles the execution of those tasks after they have been composed into a wor

  18. Design and Implementation of an Online Customs Agent Management System Based on Workflow

    Institute of Scientific and Technical Information of China (English)

    周文琼; 王乐球; 郑述招

    2013-01-01

    With the continuous development of computer technology and the widespread use of the Internet, customs brokerage enterprises urgently need management information systems to establish internal control mechanisms and to improve the quality of customer service. Based on workflow technology, this paper describes the design and implementation of an online customs agent management system, focusing on the system's functional modules, process flow, operating environment, and database design. The system has been deployed at a port customs broker with good results; compared with similar systems, it is safer, more stable, and easier to extend and maintain.

  19. Integrated Enterprise Modeling Method Based on Workflow Model and Multiviews

    Institute of Scientific and Technical Information of China (English)

    林慧苹; 范玉顺; 吴澄

    2001-01-01

    Many enterprise modeling methods have been proposed to model the business processes of enterprises and to implement CIM systems. But difficulties are still encountered when these methods are applied to CIM system design and implementation. This paper proposes a new integrated enterprise modeling methodology based on the workflow model. The system architecture and the integrated modeling environment are described with a new simulation strategy. The modeling process and the relationship between the workflow model and the views are discussed.

  20. Workflow Fault Tree Generation Through Model Checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2014-01-01

    We present a framework for the automated generation of fault trees from models of realworld process workflows, expressed in a formalised subset of the popular Business Process Modelling and Notation (BPMN) language. To capture uncertainty and unreliability in workflows, we extend this formalism...

  1. Soundness of Timed-Arc Workflow Nets

    DEFF Research Database (Denmark)

    Mateo, Jose Antonio; Srba, Jiri; Sørensen, Mathias Grund

    2014-01-01

    Analysis of workflow processes with quantitative aspects like timing is of interest in numerous time-critical applications. We suggest a workflow model based on timed-arc Petri nets and study the foundational problems of soundness and strong (time-bounded) soundness. We explore the decidability o...

  2. Implementing bioinformatic workflows within the bioextract server

    Science.gov (United States)

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed servi...

  3. Contracts for Cross-Organizational Workflow Management

    NARCIS (Netherlands)

    Koetsier, M.; Grefen, P.; Vonk, J.

    1999-01-01

    Nowadays, many organizations form dynamic partnerships to deal effectively with market requirements. As companies use automated workflow systems to control their processes, a way of linking workflow processes in different organizations is useful in turning the co-operating companies into a seamless

  4. Workflow continuity--moving beyond business continuity in a multisite 24-7 healthcare organization.

    Science.gov (United States)

    Kolowitz, Brian J; Lauro, Gonzalo Romero; Barkey, Charles; Black, Harry; Light, Karen; Deible, Christopher

    2012-12-01

    As hospitals move towards providing in-house 24 × 7 services, there is an increasing need for information systems to be available around the clock. This study investigates one organization's need for a workflow continuity solution that provides around the clock availability for information systems that do not provide highly available services. The organization investigated is a large multifacility healthcare organization that consists of 20 hospitals and more than 30 imaging centers. A case analysis approach was used to investigate the organization's efforts. The results show an overall reduction in downtimes where radiologists could not continue their normal workflow on the integrated Picture Archiving and Communications System (PACS) solution by 94 % from 2008 to 2011. The impact of unplanned downtimes was reduced by 72 % while the impact of planned downtimes was reduced by 99.66 % over the same period. Additionally more than 98 h of radiologist impact due to a PACS upgrade in 2008 was entirely eliminated in 2011 utilizing the system created by the workflow continuity approach. Workflow continuity differs from high availability and business continuity in its design process and available services. Workflow continuity only ensures that critical workflows are available when the production system is unavailable due to scheduled or unscheduled downtimes. Workflow continuity works in conjunction with business continuity and highly available system designs. The results of this investigation revealed that this approach can add significant value to organizations because impact on users is minimized if not eliminated entirely.

  5. Workflow Management in CLARIN-DK

    DEFF Research Database (Denmark)

    Jongejan, Bart

    2013-01-01

    output, given the input resource at hand. Still, in such cases it may be possible to reach the set goal by chaining a number of tools. The approach presented here frees the user of having to meddle with tools and the construction of workflows. Instead, the user only needs to supply the workflow manager......The CLARIN-DK infrastructure is not only a repository of resources, but also a place where users can analyse, annotate, reformat and potentially even translate resources, using tools that are integrated in the infrastructure as web services. In many cases a single tool does not produce the desired...... with the features that describe her goal, because the workflow manager not only executes chains of tools in a workflow, but also takes care of autonomously devising workflows that serve the user’s intention, given the tools that currently are integrated in the infrastructure as web services. To do this...
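    The autonomous workflow devising described above can be sketched as a search over tool chains: each tool maps an input feature to an output feature, and the manager finds a chain from the resource's features to the user's goal. The tool names and features below are invented for illustration:

```python
# Hedged sketch of goal-driven tool chaining (tool names invented): BFS over
# features, where each tool is an edge from its input feature to its output
# feature; the result is the shortest chain reaching the goal, or None.
from collections import deque

def devise_workflow(tools, start, goal):
    """tools: dict name -> (input_feature, output_feature)."""
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        feature, chain = queue.popleft()
        if feature == goal:
            return chain
        for name, (src, dst) in tools.items():
            if src == feature and dst not in seen:
                seen.add(dst)
                queue.append((dst, chain + [name]))
    return None

tools = {
    "tokenizer":  ("plain-text", "tokens"),
    "pos-tagger": ("tokens", "pos-tags"),
    "lemmatizer": ("pos-tags", "lemmas"),
}
print(devise_workflow(tools, "plain-text", "lemmas"))
# -> ['tokenizer', 'pos-tagger', 'lemmatizer']
```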

  6. Implementation of WPDL Conforming Workflow Model

    Institute of Scientific and Technical Information of China (English)

    张志君; 范玉顺

    2003-01-01

    Workflow process definition language (WPDL) facilitates the transfer of workflow process definitions between separate workflow products. However, much work is still needed to transfer the specific workflow model to a WPDL conforming model. CIMFlow is a workflow management system developed by the National CIMS Engineering Research Center. This paper discusses the methods by which the CIMFlow model conforms to the WPDL meta-model and the differences between the WPDL meta-model and the CIMFlow model. Some improvements are proposed for the WPDL specification. Finally, the mapping and translating methods between the entities and attributes are given for the two models. The proposed methods and improvements are valuable as a reference for other mapping applications and the WPDL specification.

  7. WARP (workflow for automated and rapid production): a framework for end-to-end automated digital print workflows

    Science.gov (United States)

    Joshi, Parag

    2006-02-01

    Publishing industry is experiencing a major paradigm shift with the advent of digital publishing technologies. A large number of components in the publishing and print production workflow are transformed in this shift. However, the process as a whole requires a great deal of human intervention for decision making and for resolving exceptions during job execution. Furthermore, a majority of the best-of-breed applications for publishing and print production are intrinsically designed and developed to be driven by humans. Thus, the human-intensive nature of the current prepress process accounts for a very significant amount of the overhead costs in fulfillment of jobs on press. It is a challenge to automate the functionality of applications built with the model of human driven execution. Another challenge is to orchestrate various components in the publishing and print production pipeline such that they work in a seamless manner to enable the system to perform automatic detection of potential failures and take corrective actions in a proactive manner. Thus, there is a great need for a coherent and unifying workflow architecture that streamlines the process and automates it as a whole in order to create an end-to-end digital automated print production workflow that does not involve any human intervention. This paper describes an architecture and building blocks that lay the foundation for a plurality of automated print production workflows.

  8. CSP for Executable Scientific Workflows

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard

    and can usually benefit performance-wise from both multiprocessing, cluster and grid environments. PyCSP is an implementation of Communicating Sequential Processes (CSP) for the Python programming language and takes advantage of CSP's formal and verifiable approach to controlling concurrency...... and the readability of Python source code. Python is a popular programming language in the scientific community, with many scientific libraries (modules) and simple integration to external languages. This thesis presents a PyCSP extended with many new features and a more robust implementation to allow scientific...... is demonstrated through examples. By providing a robust library for organising scientific workflows in a Python application I hope to inspire scientific users to adopt PyCSP. As a proof-of-concept this thesis demonstrates three scientific applications: kNN, stochastic minimum search and McStas to scale well...

  9. Dynamic reusable workflows for ocean science

    Science.gov (United States)

    Signell, Richard; Fernandez, Filipe; Wilcox, Kyle

    2016-01-01

    Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog search and data access make it now possible to create catalog-driven workflows that automate — end-to-end — data search, analysis and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill-assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enters the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased use of dynamic

  10. Dynamic Reusable Workflows for Ocean Science

    Directory of Open Access Journals (Sweden)

    Richard P. Signell

    2016-10-01

    Full Text Available Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog searches and data access now make it possible to create catalog-driven workflows that automate—end-to-end—data search, analysis, and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused, and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enter the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased
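    The skill-assessment step can be illustrated with a simple skill score. The metric below (mean-squared-error skill relative to a reference forecast) is a common choice but is our assumption for illustration, not necessarily the metric used in the IOOS notebooks; the numbers are made up:

```python
# Minimal sketch of forecast skill assessment: SS = 1 - MSE(model)/MSE(reference),
# where 1 is a perfect forecast and 0 means no better than the reference.
def mse(pred, obs):
    return sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs)

def skill_score(model, reference, obs):
    return 1.0 - mse(model, obs) / mse(reference, obs)

obs       = [18.0, 18.5, 19.0, 19.5]   # observed water temperature (deg C)
model     = [18.1, 18.4, 19.2, 19.4]   # forecast model output
reference = [18.5] * 4                 # persistence/climatology baseline

print(round(skill_score(model, reference, obs), 3))  # -> 0.953
```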

  11. Workflow in the operating room: review of Arrowhead 2004 seminar on imaging and informatics (Invited Paper)

    Science.gov (United States)

    Lemke, Heinz U.; Ratib, Osman M.; Horii, Steven C.

    2005-04-01

    This review paper is based on the 2004 UCLA Seminar on Imaging and Informatics (http://www.radnet.ucla.edu/Arrowhead2004/), which is a joint endeavour between the UCLA and the CARS organization, focussing on workflow analysis tools and the digital operating room. Eleven specific presentations of the Arrowhead Seminar have been summarized in this review, referring to redesigning perioperative care for a high velocity OR, intraoperative ultrasound process and model, surgical workflow and surgical PACS: an integrated view, interactions in the surgical OR, workflow automation strategies and target applications, visualisation solutions for the operating room, navigating the fifth dimension, and design of digital operating rooms and interventional suites.

  12. Visual compression of workflow visualizations with automated detection of macro motifs.

    Science.gov (United States)

    Maguire, Eamonn; Rocca-Serra, Philippe; Sansone, Susanna-Assunta; Davies, Jim; Chen, Min

    2013-12-01

    This paper is concerned with the creation of 'macros' in workflow visualization as a support tool to increase the efficiency of data curation tasks. We propose computation of candidate macros based on their usage in large collections of workflows in data repositories. We describe an efficient algorithm for extracting macro motifs from workflow graphs. We discovered that the state transition information, used to identify macro candidates, characterizes the structural pattern of the macro and can be harnessed as part of the visual design of the corresponding macro glyph. This facilitates partial automation and consistency in glyph design applicable to a large set of macro glyphs. We tested this approach against a repository of biological data holding some 9,670 workflows and found that the algorithmically generated candidate macros are in keeping with domain expert expectations.
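    The motif-extraction idea can be sketched by counting recurring task chains across a collection of workflow graphs and proposing frequent chains as macro candidates. This is a simplified illustration of the concept, not the paper's algorithm:

```python
# Simplified illustration of motif mining: count length-2 task chains
# (a -> b -> c) across many workflows; chains seen in at least
# `min_support` workflows become macro candidates.
from collections import Counter

def chain_motifs(workflows, min_support=2):
    counts = Counter()
    for edges in workflows:            # each workflow: list of (src, dst) edges
        succ = {}
        for a, b in edges:
            succ.setdefault(a, []).append(b)
        for a, b in edges:             # extend each edge by one successor step
            for c in succ.get(b, []):
                counts[(a, b, c)] += 1
    return [motif for motif, n in counts.items() if n >= min_support]

workflows = [
    [("load", "clean"), ("clean", "annotate"), ("annotate", "store")],
    [("load", "clean"), ("clean", "annotate"), ("annotate", "plot")],
]
print(chain_motifs(workflows))  # -> [('load', 'clean', 'annotate')]
```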

  13. Facilitating hydrological data analysis workflows in R: the RHydro package

    Science.gov (United States)

    Buytaert, Wouter; Moulds, Simon; Skoien, Jon; Pebesma, Edzer; Reusser, Dominik

    2015-04-01

    The advent of new technologies such as web-services and big data analytics holds great promise for hydrological data analysis and simulation. Driven by the need for better water management tools, it allows for the construction of much more complex workflows, that integrate more and potentially more heterogeneous data sources with longer tool chains of algorithms and models. With the scientific challenge of designing the most adequate processing workflow comes the technical challenge of implementing the workflow with a minimal risk for errors. A wide variety of new workbench technologies and other data handling systems are being developed. At the same time, the functionality of available data processing languages such as R and Python is increasing at an accelerating pace. Because of the large diversity of scientific questions and simulation needs in hydrology, it is unlikely that one single optimal method for constructing hydrological data analysis workflows will emerge. Nevertheless, languages such as R and Python are quickly gaining popularity because they combine a wide array of functionality with high flexibility and versatility. The object-oriented nature of high-level data processing languages makes them particularly suited for the handling of complex and potentially large datasets. In this paper, we explore how handling and processing of hydrological data in R can be facilitated further by designing and implementing a set of relevant classes and methods in the experimental R package RHydro. We build upon existing efforts such as the sp and raster packages for spatial data and the spacetime package for spatiotemporal data to define classes for hydrological data (HydroST). In order to handle simulation data from hydrological models conveniently, a HM class is defined. Relevant methods are implemented to allow for an optimal integration of the HM class with existing model fitting and simulation functionality in R. 
Lastly, we discuss some of the design challenges

  14. A Community-Driven Workflow Recommendations and Reuse Infrastructure

    Science.gov (United States)

    Zhang, J.; Votava, P.; Lee, T. J.; Lee, C.; Xiao, S.; Nemani, R. R.; Foster, I.

    2013-12-01

    usage history to help Earth scientists better understand existing artifacts and how to use them in a proper manner? R2: Informed by insights derived from their computing contexts, how could such hidden knowledge be used to facilitate artifact reuse by Earth scientists? Our study of the two research questions will provide answers to three technical questions aiming to assist NEX users during workflow development: 1) How to determine what topics interest the researcher? 2) How to find appropriate artifacts? and 3) How to advise the researcher in artifact reuse? In this paper, we report our on-going efforts of leveraging social networking theory and analysis techniques to provide dynamic advice on artifact reuse to NEX users based on their surrounding contexts. As a proof of concept, we have designed and developed a plug-in to the VisTrails workflow design tool. When users develop workflows using VisTrails, our plug-in will proactively recommend most relevant sub-workflows to the users.

  15. Structured Composition of Dataflow and Control-Flow for Reusable and Robust Scientific Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Bowers, S; Ludaescher, B; Ngu, A; Critchlow, T

    2005-09-07

    Data-centric scientific workflows are often modeled as dataflow process networks. The simplicity of the dataflow framework facilitates workflow design, analysis, and optimization. However, some workflow tasks are particularly "control-flow intensive", e.g., procedures to make workflows more fault-tolerant and adaptive in an unreliable, distributed computing environment. Modeling complex control-flow directly within a dataflow framework often leads to overly complicated workflows that are hard to comprehend, reuse, schedule, and maintain. In this paper, we develop a framework that allows a structured embedding of control-flow intensive subtasks within dataflow process networks. In this way, we can seamlessly handle complex control-flows without sacrificing the benefits of dataflow. We build upon a flexible actor-oriented modeling and design approach and extend it with (actor) frames and (workflow) templates. A frame is a placeholder for an (existing or planned) collection of components with similar function and signature. A template partially specifies the behavior of a subworkflow by leaving "holes" (i.e., frames) in the subworkflow definition. Taken together, these abstraction mechanisms facilitate the separation and structured re-combination of control-flow and dataflow in scientific workflow applications. We illustrate our approach with a real-world scientific workflow from the astrophysics domain. This data-intensive workflow requires remote execution and file transfer in a semi-reliable environment. For such workflows, we propose a 3-layered architecture: The top-level, typically a dataflow process network, includes Generic Data Transfer (GDT) frames and Generic remote eXecution (GX) frames. At the second level, the user can specialize the behavior of these generic components by embedding a suitable template (here: transducer templates for control-flow intensive tasks). At the third level, frames inside the
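    The frame-and-template abstraction can be rendered roughly as follows: a frame is a named placeholder with a signature, and a template is a partial subworkflow whose frames (holes) are later filled by concrete components. The class and frame names here are illustrative, not the paper's implementation:

```python
# Rough sketch of frames and templates (names are ours): a frame is a hole
# with a required signature; instantiating a template fills every hole with
# a concrete component, failing if any frame is left unbound.
class Frame:
    def __init__(self, name, signature):
        self.name, self.signature = name, signature
        self.component = None          # concrete actor, bound later

    def fill(self, component):
        self.component = component

class Template:
    def __init__(self, frames):
        self.frames = {f.name: f for f in frames}

    def instantiate(self, bindings):
        for name, component in bindings.items():
            self.frames[name].fill(component)
        missing = [n for n, f in self.frames.items() if f.component is None]
        if missing:
            raise ValueError(f"unfilled frames: {missing}")
        return [f.component for f in self.frames.values()]

# A generic data-transfer template with one hole, filled by a concrete mover.
template = Template([Frame("GDT", signature=("src", "dst"))])
pipeline = template.instantiate({"GDT": "scp-transfer"})
print(pipeline)  # -> ['scp-transfer']
```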

  16. Security aspects in teleradiology workflow

    Science.gov (United States)

    Soegner, Peter I.; Helweg, Gernot; Holzer, Heimo; zur Nedden, Dieter

    2000-05-01

    The medicolegal necessity of privacy, security and confidentiality motivated the attempt to develop a secure teleradiology workflow between the telepartners: the radiologist and the referring physician. To avoid gaps in data protection and data security, we introduced biometric fingerprint scanners in combination with smart cards to identify the teleradiology partners, communicating over an encrypted TCP/IP satellite link between Innsbruck and Reutte. We used an asymmetric cryptography method to guarantee authentication, integrity of the data packages and confidentiality of the medical data. A biometric feature was necessary to avoid mistaken identity of persons seeking access to the system. Only an invariable electronic identification allows legal liability for the final report, and only a secure data connection allows the exchange of sensitive medical data between partners in health care networks. In our study we selected the user-friendly combination of a smart card and a biometric fingerprint technique, the Skymed™ Double Guard Secure Keyboard (Agfa-Gevaert), to confirm identities and log into the imaging workstations and the electronic patient record. We examined the interoperability of the software with the existing platforms. Only the WIN-XX operating systems could be protected at the time of our study.

  17. Iterative Workflows for Numerical Simulations in Subsurface Sciences

    Energy Technology Data Exchange (ETDEWEB)

    Chase, Jared M.; Schuchardt, Karen L.; Chin, George; Daily, Jeffrey A.; Scheibe, Timothy D.

    2008-07-08

    Numerical simulators are frequently used to assess future risks, support remediation and monitoring program decisions, and assist in the design of specific remedial actions with respect to groundwater contaminants. Due to the complexity of the subsurface environment and uncertainty in the models, many alternative simulations must be performed, each producing data that is typically post-processed and analyzed before deciding on the next set of simulations. Though parts of the process are readily amenable to automation through scientific workflow tools, the larger "research workflow" is not supported by current tools. We present a detailed use case for subsurface modeling, describe the use case in terms of workflow structure, briefly summarize a prototype that seeks to facilitate the overall modeling process, and discuss the many challenges of building such a comprehensive environment.

  18. Building Scientific Workflows for the Geosciences with Open Community Software

    Science.gov (United States)

    Pierce, M. E.; Marru, S.; Weerawarana, S. M.

    2012-12-01

    We describe the design and development of the Apache Airavata scientific workflow software and its application to problems in geosciences. Airavata is based on Service Oriented Architecture principles and is developed as general purpose software for managing large-scale science applications on supercomputing resources such as the NSF's XSEDE. Based on the NSF-funded EarthCube Workflow Working Group activities, we discuss the application of this software relative to specific requirements (such as data stream data processing, event triggering, dealing with large data sets, and advanced distributed execution patterns involved in data mining). We also consider the role of governance in EarthCube software development and present the development of Airavata software through the Apache Software Foundation's community development model. We discuss the potential impacts on software accountability and sustainability using this model.

  19. Workflow Based Software Development Environment Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed research is to investigate and develop a workflow based tool, the Software Developers Assistant, to facilitate the collaboration between...

  20. The MPO API: A tool for recording scientific workflows

    Energy Technology Data Exchange (ETDEWEB)

    Wright, John C., E-mail: jcwright@mit.edu [MIT Plasma Science and Fusion Center, Cambridge, MA (United States); Greenwald, Martin; Stillerman, Joshua [MIT Plasma Science and Fusion Center, Cambridge, MA (United States); Abla, Gheni; Chanthavong, Bobby; Flanagan, Sean; Schissel, David; Lee, Xia [General Atomics, San Diego, CA (United States); Romosan, Alex; Shoshani, Arie [Lawrence Berkeley Laboratory, Berkeley, CA (United States)

    2014-05-15

    Highlights: • A description of a new framework and tool for recording scientific workflows, especially those resulting from simulation and analysis. • An explanation of the underlying technologies used to implement this web based tool. • Several examples of using the tool. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for high-consequence applications. The Metadata, Provenance and Ontology (MPO) project builds on previous work [M. Greenwald, Fusion Eng. Des. 87 (2012) 2205–2208] and is focused on providing documentation of workflows, data provenance and the ability to data-mine large sets of results. While there are important design and development aspects to the data structures and user interfaces, we concern ourselves in this paper with the application programming interface (API) – the set of functions that interface with the data server. Our approach for the data server is to follow the Representational State Transfer (RESTful) software architecture style for client–server communication. At its core, the API uses the POST and GET methods of the HTTP protocol to transfer workflow information in message bodies to targets specified in the URL to and from the database via a web server. Higher level API calls are built upon this core API. This design facilitates implementation on different platforms and in different languages and is robust to changes in the underlying technologies used. The command line client implementation can communicate with the data server from any machine with HTTP access.
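The core REST pattern the abstract describes, workflow information carried in HTTP message bodies to targets named in the URL via POST and GET, can be sketched as a minimal client. The endpoint, route, and JSON schema below are invented for illustration and do not match the actual MPO server API:

```python
import json
import urllib.request

# Hypothetical base URL; the real MPO data server address and routes differ.
BASE_URL = "http://mpo.example.org/api/v1"

def record_activity(workflow_id, name, inputs):
    """Build a POST request that appends an activity to a workflow record.

    Mirrors the paper's core pattern: workflow information travels in the
    message body to a target named in the URL.
    """
    body = json.dumps({"workflow": workflow_id,
                       "name": name,
                       "inputs": inputs}).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/activity",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = record_activity("wf-42", "reconstruct", ["shot-7.dat"])
print(req.get_method(), req.full_url)
```

Building the request separately from sending it (via `urllib.request.urlopen`) keeps the sketch testable without a live server, in the spirit of the paper's claim that the thin HTTP core makes clients easy to implement on any platform.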

  1. A workflow learning model to improve geovisual analytics utility.

    Science.gov (United States)

    Roth, Robert E; Maceachren, Alan M; McCabe, Craig A

    2009-01-01

    INTRODUCTION: This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. OBJECTIVES: The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important, if not more so, than iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. METHODOLOGY: The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. 
To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on

  2. Formal Verification of Temporal Properties for Reduced Overhead in Grid Scientific Workflows

    Institute of Scientific and Technical Information of China (English)

    Jun-Wei Cao; Fan Zhang; Ke Xu; Lian-Chen Liu; Cheng Wu

    2011-01-01

    With the quick development of grid techniques and the growing complexity of grid applications, reasoning about temporal properties of grid workflows to probe potential pitfalls and errors is becoming critical for ensuring reliability and trustworthiness at the initial design phase. A state Pi calculus is proposed and implemented in this work, which not only enables flexible abstraction and management of historical grid system events, but also facilitates modeling and temporal verification of grid workflows. Furthermore, a relaxed region analysis (RRA) approach is proposed to decompose large-scale grid workflows into sequentially composed regions with relaxation of parallel workflow branches, and corresponding verification strategies are also decomposed following modular verification principles. Performance evaluation results show that the RRA approach can dramatically reduce the CPU time and memory usage of formal verification.

  3. Quantitative ethnographic study of physician workflow and interactions with electronic health record systems.

    Science.gov (United States)

    Asan, Onur; Chiou, Erin; Montague, Enid

    2015-09-01

    This study explores the relationship between primary care physicians' interactions with health information technology and primary care workflow. Clinical encounters were recorded with high-resolution video cameras to capture physicians' workflow and interaction with two objects of interest, the electronic health record (EHR) system, and their patient. To analyze the data, a coding scheme was developed based on a validated list of primary care tasks to define the presence or absence of a task, the time spent on each task, and the sequence of tasks. Results revealed divergent workflows and significant differences between physicians' EHR use surrounding common workflow tasks: gathering information, documenting information, and recommend/discuss treatment options. These differences suggest impacts of EHR use on primary care workflow, and capture types of workflows that can be used to inform future studies with larger sample sizes for more effective designs of EHR systems in primary care clinics. Future research on this topic and design strategies for effective health information technology in primary care are discussed.
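The coding scheme described, recording the presence or absence of a task, the time spent on each task, and the task sequence, reduces to simple aggregation once encounters are coded as timed intervals. A toy sketch (task names and timings are invented, not the study's validated primary care task list):

```python
from collections import defaultdict

# Toy coded timeline: (task, start_s, end_s) tuples as a coder might
# produce from an encounter video.
events = [
    ("gather_info", 0, 90),
    ("document", 90, 150),
    ("discuss_treatment", 150, 210),
    ("document", 210, 260),
]

def time_per_task(coded):
    """Total seconds spent on each task across the encounter."""
    totals = defaultdict(int)
    for task, start, end in coded:
        totals[task] += end - start
    return dict(totals)

print(time_per_task(events))
```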

  4. Workflow to numerically reproduce laboratory ultrasonic datasets

    Institute of Scientific and Technical Information of China (English)

    A. Biryukov; N. Tisato; G. Grasselli

    2014-01-01

    The risks and uncertainties related to the storage of high-level radioactive waste (HLRW) can be reduced thanks to focused studies and investigations. HLRWs are going to be placed in deep geological repositories, enveloped in an engineered bentonite barrier, whose physical conditions are subjected to change throughout the lifespan of the infrastructure. Seismic tomography can be employed to monitor its physical state and integrity. The design of the seismic monitoring system can be optimized via conducting and analyzing numerical simulations of wave propagation in representative repository geometry. However, the quality of the numerical results relies on their initial calibration. The main aim of this paper is to provide a workflow to calibrate numerical tools employing laboratory ultrasonic datasets. The finite difference code SOFI2D was employed to model ultrasonic waves propagating through a laboratory sample. Specifically, the input velocity model was calibrated to achieve a best match between experimental and numerical ultrasonic traces. Likely due to the imperfections of the contact surfaces, the resultant velocities of P- and S-wave propagation tend to be noticeably lower than those a priori assigned. Then, the calibrated model was employed to estimate the attenuation in a montmorillonite sample. The obtained low quality factors (Q) suggest that pronounced inelastic behavior of the clay has to be taken into account in geophysical modeling and analysis. Consequently, this contribution should be considered as a first step towards the creation of a numerical tool to evaluate wave propagation in nuclear waste repositories.

  5. a Workflow for UAV's Integration Into a Geodesign Platform

    Science.gov (United States)

    Anca, P.; Calugaru, A.; Alixandroae, I.; Nazarie, R.

    2016-06-01

    This paper presents a workflow for the development of various Geodesign scenarios. The subject is important in the context of identifying patterns and designing solutions for a Smart City with optimized public transportation, efficient buildings, efficient utilities, recreational facilities, and so on. The workflow describes the procedures starting with acquiring data in the field, data processing, orthophoto generation, DTM generation, integration into a GIS platform, and analysis for better Geodesign support. Esri's CityEngine is used mostly for its 3D modeling capabilities, which enable the user to obtain realistic 3D models. The workflow uses as inputs information extracted from images acquired using UAV technologies, namely the eBee, existing 2D GIS geodatabases, and a set of CGA rules. The method used, called procedural modeling, applies rules to extrude buildings, the street network, parcel zoning, and side details based on the initial attributes from the geodatabase. The resulting products are various scenarios for redesigning and for analyzing new exploitation sites. Finally, these scenarios can be published as interactive web scenes for internal, group, or public consultation. In this way, problems like the impact of new constructions being built, rearranging green spaces, or changing routes for public transportation are revealed through impact, visibility, and shadowing analysis and are brought to the citizens' attention. This leads to better decisions.

  6. Comparison of Workflow Scheduling Algorithms in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Navjot Kaur

    2011-10-01

    Full Text Available Cloud computing has gained popularity in recent times. Cloud computing is internet-based computing, whereby shared resources, software, and information are provided to computers and other devices on demand, like a public utility. Cloud computing uses the internet and central remote servers to maintain data and applications, allowing consumers and businesses to use applications without installation and to access their personal files from any computer with internet access. The main aim of this work is to study various problems, issues, and types of scheduling algorithms for cloud workflows, as well as to design new workflow algorithms for a cloud workflow management system. The proposed algorithms are implemented on a real-time cloud developed using Microsoft .Net technologies. The algorithms are compared with each other on the basis of parameters such as total execution time, execution time for the algorithm, and estimated execution time. Experimental results generated via simulation show that Algorithm 2 is much better than Algorithm 1, as it reduces makespan time.
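Makespan, the quantity the comparison above turns on, is simply the finish time of the busiest resource when each machine runs its assigned tasks back to back. A minimal sketch with two hypothetical task-to-VM assignments (invented numbers, not the paper's algorithms):

```python
def makespan(schedule):
    """Makespan = finish time of the most-loaded machine.

    `schedule` maps a VM name to the durations of the tasks assigned to
    it; tasks on one VM are assumed to run sequentially.
    """
    return max(sum(durations) for durations in schedule.values())

# Two hypothetical assignments of the same six tasks to three VMs:
# a naive assignment versus a more balanced one.
algo1 = {"vm1": [4, 4, 4], "vm2": [2], "vm3": [3, 3]}
algo2 = {"vm1": [4, 3], "vm2": [4, 3], "vm3": [4, 2]}
print(makespan(algo1), makespan(algo2))
```

The balanced assignment finishes earlier, which is exactly the sense in which one scheduling algorithm "reduces makespan" relative to another.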

  7. SHIWA Services for Workflow Creation and Sharing in Hydrometeorology

    Science.gov (United States)

    Terstyanszky, Gabor; Kiss, Tamas; Kacsuk, Peter; Sipos, Gergely

    2014-05-01

    Researchers want to run scientific experiments on Distributed Computing Infrastructures (DCIs) to access large pools of resources and services. Running these experiments requires specific expertise that they may not have. Workflows can hide resources and services behind a virtualisation layer, providing a user interface that researchers can use. There are many scientific workflow systems, but they are not interoperable. Learning a workflow system and creating workflows may require significant effort; given this effort, it is not reasonable to expect researchers to learn new workflow systems in order to run workflows developed in other workflow systems. Overcoming this requires workflow interoperability solutions that allow workflow sharing. The FP7 'Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs' (SHIWA) project developed the Coarse-Grained Interoperability (CGI) concept. It enables recycling and sharing workflows of different workflow systems and executing them on different DCIs. SHIWA developed the SHIWA Simulation Platform (SSP) to implement the CGI concept, integrating three major components: the SHIWA Science Gateway, the workflow engines supported by the CGI concept, and the DCI resources where workflows are executed. The science gateway contains a portal, a submission service, a workflow repository and a proxy server to support the whole workflow life-cycle. The SHIWA Portal allows workflow creation, configuration, execution and monitoring through a Graphical User Interface, using the WS-PGRADE workflow system as the host workflow system. The SHIWA Repository stores the formal description of workflows and workflow engines plus the executables and data needed to execute them, and offers a wide range of browse and search operations. To support non-native workflow execution, the SHIWA Submission Service imports the workflow and workflow engine from the SHIWA Repository. This service either invokes locally or remotely

  8. How Workflow Documentation Facilitates Curation Planning

    Science.gov (United States)

    Wickett, K.; Thomer, A. K.; Baker, K. S.; DiLauro, T.; Asangba, A. E.

    2013-12-01

    The description of the specific processes and artifacts that led to the creation of a data product provide a detailed picture of data provenance in the form of a workflow. The Site-Based Data Curation project, hosted by the Center for Informatics Research in Science and Scholarship at the University of Illinois, has been investigating how workflows can be used in developing curation processes and policies that move curation "upstream" in the research process. The team has documented an individual workflow for geobiology data collected during a single field trip to Yellowstone National Park. This specific workflow suggests a generalized three-part process for field data collection that comprises three distinct elements: a Planning Stage, a Fieldwork Stage, and a Processing and Analysis Stage. Beyond supplying an account of data provenance, the workflow has allowed the team to identify 1) points of intervention for curation processes and 2) data products that are likely candidates for sharing or deposit. Although these objects may be viewed by individual researchers as 'intermediate' data products, discussions with geobiology researchers have suggested that with appropriate packaging and description they may serve as valuable observational data for other researchers. Curation interventions may include the introduction of regularized data formats during the planning process, data description procedures, the identification and use of established controlled vocabularies, and data quality and validation procedures. We propose a poster that shows the individual workflow and our generalization into a three-stage process. We plan to discuss with attendees how well the three-stage view applies to other types of field-based research, likely points of intervention, and what kinds of interventions are appropriate and feasible in the example workflow.

  9. An analytical method for well-formed workflow/Petri net verification of classical soundness

    Directory of Open Access Journals (Sweden)

    Clempner Julio

    2014-12-01

    Full Text Available In this paper we consider workflow nets as dynamical systems governed by ordinary difference equations described by a particular class of Petri nets. Workflow nets are a formal model of business processes. Well-formed business processes correspond to sound workflow nets. Even if it seems necessary to require the soundness of workflow nets, there exist business processes with conditional behavior that will not necessarily satisfy the soundness property. In this sense, we propose an analytical method for showing that a workflow net satisfies the classical soundness property using a Petri net. To present our statement, we use Lyapunov stability theory to tackle the classical soundness verification problem for a class of dynamical systems described by Petri nets. This class of Petri nets allows a dynamical model representation that can be expressed in terms of difference equations. As a result, by applying Lyapunov theory, the classical soundness property for workflow nets is solved proving that the Petri net representation is stable. We show that a finite and non-blocking workflow net satisfies the soundness property if and only if its corresponding PN is stable, i.e., given the incidence matrix A of the corresponding PN, there exists a strictly positive m-vector Φ such that AΦ ≤ 0. The key contribution of the paper is the analytical method itself that satisfies part of the definition of the classical soundness requirements. The method is designed for practical applications, guarantees that anomalies can be detected without domain knowledge, and can be easily implemented into existing commercial systems that do not support the verification of workflows. The validity of the proposed method is successfully demonstrated by application examples.
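The stability criterion at the heart of this record, the existence of a strictly positive vector Φ with AΦ ≤ 0 for the net's incidence matrix A, can be checked by brute force on toy nets. The sketch below enumerates small positive integer vectors; a real verifier would solve this as a linear programming feasibility problem instead:

```python
from itertools import product

def find_stability_vector(A, candidates=range(1, 4)):
    """Search for a strictly positive integer vector phi with A*phi <= 0
    componentwise. Returns the first such phi, or None if none exists
    within the candidate range. Practical only for tiny matrices.
    """
    m = len(A[0])
    for phi in product(candidates, repeat=m):
        if all(sum(a * p for a, p in zip(row, phi)) <= 0 for row in A):
            return phi
    return None

# Toy incidence matrix of a two-place cycle (one token circulating):
# each row is one row of the inequality A*phi <= 0.
A_stable = [[-1, 1],
            [1, -1]]
A_unstable = [[1, 1],
              [1, 1]]
print(find_stability_vector(A_stable), find_stability_vector(A_unstable))
```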

  10. Integrating configuration workflows with project management system

    Science.gov (United States)

    Nilsen, Dimitri; Weber, Pavel

    2014-06-01

    The complexity of the heterogeneous computing resources, services and recurring infrastructure changes at the GridKa WLCG Tier-1 computing center require a structured approach to configuration management and optimization of interplay between functional components of the whole system. A set of tools deployed at GridKa, including Puppet, Redmine, Foreman, SVN and Icinga, provides the administrative environment giving the possibility to define and develop configuration workflows, reduce the administrative effort and improve sustainable operation of the whole computing center. In this presentation we discuss the developed configuration scenarios implemented at GridKa, which we use for host installation, service deployment, change management procedures, service retirement etc. The integration of Puppet with a project management tool like Redmine provides us with the opportunity to track problem issues, organize tasks and automate these workflows. The interaction between Puppet and Redmine results in automatic updates of the issues related to the executed workflow performed by different system components. The extensive configuration workflows require collaboration and interaction between different departments like network, security, production etc. at GridKa. Redmine plugins developed at GridKa and integrated in its administrative environment provide an effective way of collaboration within the GridKa team. We present the structural overview of the software components, their connections, communication protocols and show a few working examples of the workflows and their automation.

  11. Analysis and Design of Teaching System of "Work-Study Combination" by Workflow Technology%工作流技术下“工学结合”教学系统分析与设计

    Institute of Scientific and Technical Information of China (English)

    林风人; 金敏

    2011-01-01

    The curricular development theory and method based on working process, put forward by higher vocational education experts from the education ministry, needs to be bridged with practice in order to ensure the successful implementation of the experts' ideas and methods. This work stresses solving two problems: first, bridging the experts' theory and teaching practice; second, solving the study problem of "work-study combination" in software technology application. It innovatively combines the "work-study" curricular model and study model. It analyzes the curriculum development process and study model process based on working process, introduces software technology and JBPM technology, and develops a teaching system based on working process, so as to meet the requirements of "doing by learning, and learning by doing" and achieve the purpose of bridging theory and practice. The designed teaching system contains a professional-goal-oriented subsystem, a teaching subsystem, a learning subsystem, an evaluation subsystem, a training subsystem, and a management subsystem. It is developed in Browser/Server mode, combining J2EE technology and JBPM workflow technology. The system realization is based on process customization and adopts automatic form technology and process management automation technology.

  12. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    Rzehorz, Gerhard Ferdinand; The ATLAS collaboration

    2016-01-01

    This contribution reports on the feasibility of executing data intensive workflows on Cloud infrastructures. In order to assess this, the metric ETC = Events/Time/Cost is formed, which quantifies the different workflow and infrastructure configurations that are tested against each other. In these tests ATLAS reconstruction Jobs are run, examining the effects of overcommitting (more parallel processes running than CPU cores available), scheduling (staggered execution) and scaling (number of cores). The desirability of commissioning storage in the cloud is evaluated, in conjunction with a simple analytical model of the system, and correlated with questions about the network bandwidth, caches and what kind of storage to utilise. In the end a cost/benefit evaluation of different infrastructure configurations and workflows is undertaken, with the goal to find the maximum of the ETC value

  13. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    Rzehorz, Gerhard Ferdinand; The ATLAS collaboration

    2017-01-01

    This contribution reports on the feasibility of executing data intensive workflows on Cloud infrastructures. In order to assess this, the metric ETC = Events/Time/Cost is formed, which quantifies the different workflow and infrastructure configurations that are tested against each other. In these tests ATLAS reconstruction Jobs are run, examining the effects of overcommitting (more parallel processes running than CPU cores available), scheduling (staggered execution) and scaling (number of cores). The desirability of commissioning storage in the Cloud is evaluated, in conjunction with a simple analytical model of the system, and correlated with questions about the network bandwidth, caches and what kind of storage to utilise. In the end a cost/benefit evaluation of different infrastructure configurations and workflows is undertaken, with the goal to find the maximum of the ETC value.
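The ETC metric defined above is a straightforward ratio, so ranking infrastructure configurations reduces to computing Events/Time/Cost for each and taking the maximum. A sketch with invented numbers (not ATLAS measurements):

```python
def etc(events, hours, cost):
    """ETC = Events / Time / Cost; higher is better."""
    return events / hours / cost

# Hypothetical measurements for two Cloud configurations running the
# same reconstruction workload; values are illustrative only.
configs = {
    "8 cores, no overcommit": etc(events=10000, hours=5.0, cost=4.0),
    "8 cores, 2x overcommit": etc(events=14000, hours=5.0, cost=4.0),
}
best = max(configs, key=configs.get)
print(best)
```

Here overcommitting pushes more events through the same wall-clock time at the same cost, so it wins on ETC; the study's actual comparisons also fold in scheduling, scaling, and storage effects.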

  14. Logical provenance in data-oriented workflows?

    KAUST Repository

    Ikeda, R.

    2013-04-01

    We consider the problem of defining, generating, and tracing provenance in data-oriented workflows, in which input data sets are processed by a graph of transformations to produce output results. We first give a new general definition of provenance for general transformations, introducing the notions of correctness, precision, and minimality. We then determine when properties such as correctness and minimality carry over from the individual transformations\\' provenance to the workflow provenance. We describe a simple logical-provenance specification language consisting of attribute mappings and filters. We provide an algorithm for provenance tracing in workflows where logical provenance for each transformation is specified using our language. We consider logical provenance in the relational setting, observing that for a class of Select-Project-Join (SPJ) transformations, logical provenance specifications encode minimal provenance. We have built a prototype system supporting the features and algorithms presented in the paper, and we report a few preliminary experimental results. © 2013 IEEE.
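The logical-provenance specifications described pair attribute mappings with filters; for a Select-Project-Join-style transformation, tracing an output item back to its inputs means applying the mapping and filter in reverse. A toy sketch with an invented schema (not the paper's specification language):

```python
# A toy SPJ transformation: select rows with qty > 0, project (id, qty).
# Its logical provenance is the attribute mapping output.id <- input.id
# together with the filter predicate qty > 0.
input_rows = [
    {"id": 1, "qty": 5, "note": "a"},
    {"id": 2, "qty": 0, "note": "b"},
    {"id": 3, "qty": 7, "note": "c"},
]

def transform(rows):
    return [{"id": r["id"], "qty": r["qty"]} for r in rows if r["qty"] > 0]

def trace(output_row, rows):
    """Apply the mapping and filter in reverse: return the input rows
    the provenance specification points back to for this output row."""
    return [r for r in rows if r["id"] == output_row["id"] and r["qty"] > 0]

out = transform(input_rows)
print(trace(out[1], input_rows))
```

For this class of transformation the traced set contains exactly the rows that produced the output, illustrating the paper's observation that such specifications can encode minimal provenance.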

  15. Digital workflow management for quality assessment in pathology.

    Science.gov (United States)

    Kalinski, Thomas; Sel, Saadettin; Hofmann, Harald; Zwönitzer, Ralf; Bernarding, Johannes; Roessner, Albert

    2008-01-01

    Information systems (IS) are well established in the multitude of departments and practices of pathology. Apart from being a collection of doctor's reports, IS can be used to organize and evaluate workflow processes. We report on such a digital workflow management using IS at the Department of Pathology, University Hospital Magdeburg, Germany, and present an evaluation of workflow data collected over a whole year. This allows us to measure workflow processes and to distinguish the effects of alterations in the workflow for quality assessment. Moreover, digital workflow management provides the basis for the integration of diagnostic virtual microscopy.

  16. Perti Net-Based Workflow Access Control Model

    Institute of Scientific and Technical Information of China (English)

    陈卓; 骆婷; 石磊; 洪帆

    2004-01-01

    Access control is an important protection mechanism for information systems. This paper shows how to make access control in workflow system. We give a workflow access control model (WACM) based on several current access control models. The model supports roles assignment and dynamic authorization. The paper defines the workflow using Petri net. It firstly gives the definition and description of the workflow, and then analyzes the architecture of the workflow access control model (WACM). Finally, an example of an e-commerce workflow access control model is discussed in detail.
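The WACM ingredients named above, role assignment plus dynamic per-task authorization, can be illustrated with a minimal check. Users, roles, tasks, and workflow states below are invented, and the sketch omits the Petri net machinery the paper uses to model the workflow itself:

```python
# Static role assignment: which roles each user holds.
role_assignment = {"alice": {"clerk"}, "bob": {"manager"}}

# Dynamic authorization: which roles may execute a task, keyed by the
# workflow state so permissions can change as the case progresses.
task_authorization = {
    ("enter_order", "new"): {"clerk", "manager"},
    ("approve_order", "pending"): {"manager"},
}

def can_execute(user, task, state):
    """True if the user holds at least one role authorized for the task
    in the current workflow state."""
    allowed = task_authorization.get((task, state), set())
    return bool(role_assignment.get(user, set()) & allowed)

print(can_execute("alice", "approve_order", "pending"),
      can_execute("bob", "approve_order", "pending"))
```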

  17. 信息管理系统中的工作流技术研究与设计%Design and Realization of Workflow Technology Applied in Management Information System

    Institute of Scientific and Technical Information of China (English)

    张敬普; 马飞

    2012-01-01

    After introducing the background of workflow, this paper discusses the principles and design ideas of workflow in an educational administration system and proposes a workflow model that can be edited, managed, and monitored, with emphasis on the workflow running mechanism. Workflow technology helps automate business processes and promotes paperless information flow within enterprises and organizations. Its graphical process definition boosts work efficiency even as office procedures become more rigorous. A university has achieved good results applying workflow to its educational administration system.

  18. 基于关系数据库的部队信息化试点工作流设计与实现%Design of RDBMS-Based Forces Information Project Workflow Engine

    Institute of Scientific and Technical Information of China (English)

    赵鹏; 冯增辉; 周全; 张小锋

    2016-01-01

    Workflow technology is increasingly used in OA office systems and has been adopted by many critical business applications. Following the workflow definition of the WfMC, the UML activity diagram is used for dynamic modeling. This paper describes a database model for a workflow engine and implements it in Django. The implementation was applied by the authors in a forces informatization pilot project, with good results.

  19. Comprehensive workflow for wireline fluid sampling in unconsolidated formations utilizing new large-volume sampling equipment

    Energy Technology Data Exchange (ETDEWEB)

    Kvinnsland, S.; Brun, M. [TOTAL EandP Norge (Norway); Achourov, V.; Gisolf, A. [Schlumberger (Canada)

    2011-07-01

    Precise and accurate knowledge of fluid properties in unconsolidated formations is essential to the design of production facilities. Wireline formation testers (WFT) have a wide range of applications, and the latest WFTs can be used to define fluid properties in wells drilled with oil-based mud (OBM) by acquiring PVT and large-volume samples. To use these technologies, a comprehensive workflow has to be implemented, and the aim of this paper is to present such a workflow. Sampling was conducted in highly unconsolidated sand saturated with biodegradable fluid in the Hild field in the North Sea. Results showed the comprehensive workflow to be successful in obtaining large-volume samples with a contamination level below 1%. These samples allowed the oil to be precisely characterized and made design updates to the project possible. This paper highlights that the use of the latest WFT technologies can help better characterize fluids in unconsolidated formations and thus optimize production facility design.

  20. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of realworld business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... of events, reward-based properties and best- and worst- case scenarios. We develop a simple example of medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...
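Expected accumulated reward over probabilistic branching, the kind of quantity such an analysis computes for, e.g., provisioning drug stocks, can be illustrated on a tiny tree-shaped workflow fragment. This is a sketch of the underlying idea only, not the paper's BPMN formalisation or its model-checker translation:

```python
def expected_reward(node):
    """Expected accumulated reward of a tree-shaped workflow fragment.

    A node is None (end), ("reward", r, child) for an activity that
    accrues reward r, or ("branch", [(p, child), ...]) for a
    probabilistic split whose branch probabilities p sum to 1.
    """
    if node is None:
        return 0.0
    kind = node[0]
    if kind == "reward":
        _, r, child = node
        return r + expected_reward(child)
    _, branches = node
    return sum(p * expected_reward(child) for p, child in branches)

# Toy medical workflow: administer a dose (2 units of drug), then a 10%
# chance the protocol requires a repeat dose (2 more units).
wf = ("reward", 2.0,
      ("branch", [(0.9, None),
                  (0.1, ("reward", 2.0, None))]))
print(expected_reward(wf))
```

The expected consumption of 2.2 units per patient is exactly the sort of figure that supports accurate stock provisioning; the framework in the record additionally handles best- and worst-case scenarios and event probabilities.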

  1. Workflow-Based Dynamic Enterprise Modeling

    Institute of Scientific and Technical Information of China (English)

    黄双喜; 范玉顺; 罗海滨; 林慧萍

    2002-01-01

    Traditional systems for enterprise modeling and business process control are often static and cannot adapt to the changing environment. This paper presents a workflow-based method to dynamically execute the enterprise model. This method gives an explicit representation of the business process logic and the relationships between the elements involved in the process. An execution-oriented integrated enterprise modeling system is proposed in combination with other enterprise views. The enterprise model can be established and executed dynamically in the actual environment due to the dynamic properties of the workflow model.

  2. On Secure Workflow Decentralisation on the Internet

    Directory of Open Access Journals (Sweden)

    Petteri Kaskenpalo

    2010-06-01

    Full Text Available Decentralised workflow management systems are a new research area, where most work to date has focused on the system's overall architecture. As little attention has been given to the security aspects in such systems, we follow a security driven approach, and consider, from the perspective of available security building blocks, how security can be implemented and what new opportunities are presented when empowering the decentralised environment with modern distributed security protocols. Our research is motivated by a more general question of how to combine the positive enablers that email exchange enjoys, with the general benefits of workflow systems, and more specifically with the benefits that can be introduced in a decentralised environment. This aims to equip email users with a set of tools to manage the semantics of a message exchange, contents, participants and their roles in the exchange in an environment that provides inherent assurances of security and privacy. This work is based on a survey of contemporary distributed security protocols, and considers how these protocols could be used in implementing a distributed workflow management system with decentralised control. We review a set of these protocols, focusing on the required message sequences in reviewing the protocols, and discuss how these security protocols provide the foundations for implementing core control-flow, data, and resource patterns in a distributed workflow environment.

  3. Rapid Energy Modeling Workflow Demonstration Project

    Science.gov (United States)

    2014-01-01

    Rapid Energy Modeling (REM) utilizes information on operations, geometry, orientation, weather, and materials, generating three-dimensional (3D) Building Information Models (BIM). The project executed a demonstration of REM workflows that employed building information modeling (BIM) approaches and…

  4. Beyond Scientific Workflows: Networked Open Processes

    NARCIS (Netherlands)

    Cushing, R.; Bubak, M.; Belloum, A.; de Laat, C.

    2013-01-01

    The multitude of scientific services and processes being developed brings about challenges for future in silico distributed experiments. Choosing the correct service from an expanding body of processes means that the task of manually building workflows is becoming untenable. In this paper we pro…

  5. Building Digital Audio Preservation Infrastructure and Workflows

    Science.gov (United States)

    Young, Anjanette; Olivieri, Blynne; Eckler, Karl; Gerontakos, Theodore

    2010-01-01

    In 2009 the University of Washington (UW) Libraries special collections received funding for the digital preservation of its audio indigenous language holdings. The university libraries, where the authors work in various capacities, had begun digitizing image and text collections in 1997. Because of this, at the onset of the project, workflows (a…

  6. A Reference Architecture for Workflow Management Systems

    NARCIS (Netherlands)

    Grefen, Paul; Remmerts de Vries, Remmert

    1998-01-01

    In the workflow management field, fast developments are taking place. A growing number of systems is currently under development, both in academic and commercial environments. Consequently, a wide variety of ad hoc architectures has come into existence. Reference models are necessary, however, to al

  7. Workflow Automation: A Collective Case Study

    Science.gov (United States)

    Harlan, Jennifer

    2013-01-01

    Knowledge management has proven to be a sustainable competitive advantage for many organizations. Knowledge management systems are abundant, with multiple functionalities. The literature reinforces the use of workflow automation with knowledge management systems to benefit organizations; however, it was not known if process automation yielded…

  8. Adaptive workflow simulation of emergency response

    NARCIS (Netherlands)

    Bruinsma, Guido Wybe Jan

    2010-01-01

    Recent incidents and major training exercises in and outside the Netherlands have persistently shown that not having or not sharing information during emergency response are major sources of emergency response inefficiency and error, and affect incident mitigation outcomes through workflow planning

  9. Text mining for the biocuration workflow.

    Science.gov (United States)

    Hirschman, Lynette; Burns, Gully A P C; Krallinger, Martin; Arighi, Cecilia; Cohen, K Bretonnel; Valencia, Alfonso; Wu, Cathy H; Chatr-Aryamontri, Andrew; Dowell, Karen G; Huala, Eva; Lourenço, Anália; Nash, Robert; Veuthey, Anne-Lise; Wiegers, Thomas; Winter, Andrew G

    2012-01-01

    Molecular biology has become heavily dependent on biological knowledge encoded in expert curated biological databases. As the volume of biological literature increases, biocurators need help in keeping up with the literature; (semi-) automated aids for biocuration would seem to be an ideal application for natural language processing and text mining. However, to date, there have been few documented successes for improving biocuration throughput using text mining. Our initial investigations took place for the workshop on 'Text Mining for the BioCuration Workflow' at the third International Biocuration Conference (Berlin, 2009). We interviewed biocurators to obtain workflows from eight biological databases. This initial study revealed high-level commonalities, including (i) selection of documents for curation; (ii) indexing of documents with biologically relevant entities (e.g. genes); and (iii) detailed curation of specific relations (e.g. interactions); however, the detailed workflows also showed many variabilities. Following the workshop, we conducted a survey of biocurators. The survey identified biocurator priorities, including the handling of full text indexed with biological entities and support for the identification and prioritization of documents for curation. It also indicated that two-thirds of the biocuration teams had experimented with text mining and almost half were using text mining at that time. Analysis of our interviews and survey provide a set of requirements for the integration of text mining into the biocuration workflow. These can guide the identification of common needs across curated databases and encourage joint experimentation involving biocurators, text mining developers and the larger biomedical research community.
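The document-selection step identified above can be caricatured in a few lines: rank abstracts for curation by how many known entity mentions they contain. The entity list and scoring rule are invented placeholders; production biocuration pipelines use trained named-entity recognizers and classifiers.

```python
# Hypothetical entity gazetteer; real pipelines use NER models.
ENTITIES = {"brca1", "tp53", "interaction"}

def triage(docs):
    """Order documents by descending count of recognized entity mentions."""
    def score(text):
        return sum(w.strip(".,").lower() in ENTITIES for w in text.split())
    return sorted(docs, key=score, reverse=True)

docs = ["BRCA1 and TP53 interaction in tumours",
        "A survey of sequencing hardware"]
assert triage(docs)[0].startswith("BRCA1")   # entity-rich abstract ranks first
```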

  10. MBAT: A scalable informatics system for unifying digital atlasing workflows

    Directory of Open Access Journals (Sweden)

    Sane Nikhil

    2010-12-01

    Full Text Available Abstract Background Digital atlases provide a common semantic and spatial coordinate system that can be leveraged to compare, contrast, and correlate data from disparate sources. As the quality and amount of biological data continue to grow, searching, referencing, and comparing these data with a researcher's own data is essential. However, the integration process is cumbersome and time-consuming due to misaligned data, implicitly defined associations, and incompatible data sources. This work addresses these challenges by providing a unified and adaptable environment to accelerate the workflow to gather, align, and analyze the data. Results The MouseBIRN Atlasing Toolkit (MBAT) project was developed as a cross-platform, free open-source application that unifies and accelerates the digital atlas workflow. A tiered, plug-in architecture was designed for the neuroinformatics and genomics goals of the project to provide a modular and extensible design. MBAT provides the ability to use a single query to search and retrieve data from multiple data sources, align image data using the user's preferred registration method, composite data from multiple sources in a common space, and link relevant informatics information to the current view of the data or atlas. The workspaces leverage tool plug-ins to extend and allow future extensions of the basic workspace functionality. A wide variety of tool plug-ins were developed that integrate pre-existing as well as newly created technology into each workspace. Novel atlasing features were also developed, such as supporting multiple label sets, dynamic selection and grouping of labels, and synchronized, context-driven display of ontological data. Conclusions MBAT empowers researchers to discover correlations among disparate data by providing a unified environment for bringing together distributed reference resources, a user's image data, and biological atlases into the same spatial or semantic context.

  11. An Adaptable Seismic Data Format for Modern Scientific Workflows

    Science.gov (United States)

    Smith, J. A.; Bozdag, E.; Krischer, L.; Lefebvre, M.; Lei, W.; Podhorszki, N.; Tromp, J.

    2013-12-01

    Data storage, exchange, and access play a critical role in modern seismology. Current seismic data formats, such as SEED, SAC, and SEG-Y, were designed with specific applications in mind and are frequently a major bottleneck in implementing efficient workflows. We propose a new modern parallel format that can be adapted for a variety of seismic workflows. The Adaptable Seismic Data Format (ASDF) features high-performance parallel read and write support and the ability to store an arbitrary number of traces of varying sizes. Provenance information is stored inside the file so that users know the origin of the data as well as the precise operations that have been applied to the waveforms. The design of the new format is based on several real-world use cases, including earthquake seismology and seismic interferometry. The metadata is based on the proven XML schemas StationXML and QuakeML. Existing time-series analysis tool-kits are easily interfaced with this new format so that seismologists can use robust, previously developed software packages, such as ObsPy and the SAC library. ADIOS, netCDF4, and HDF5 can be used as the underlying container format. At Princeton University, we have chosen to use ADIOS as the container format because it has shown superior scalability for certain applications, such as dealing with big data on HPC systems. In the context of high-performance computing, we have implemented ASDF into the global adjoint tomography workflow on Oak Ridge National Laboratory's supercomputer Titan.
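The core idea, traces of arbitrary size stored next to a provenance log recording every operation applied to the waveforms, can be sketched in plain Python. The class and method names below are invented for illustration; the real format sits on ADIOS, netCDF4, or HDF5 containers with parallel I/O.

```python
from dataclasses import dataclass, field

@dataclass
class TraceContainer:
    """Toy container: named variable-length traces plus a provenance log."""
    traces: dict = field(default_factory=dict)
    provenance: list = field(default_factory=list)

    def add_trace(self, name, samples, origin):
        self.traces[name] = list(samples)
        self.provenance.append({"trace": name, "op": "ingest", "origin": origin})

    def apply(self, name, op_name, fn):
        """Apply a sample-wise operation and record it in the provenance."""
        self.traces[name] = [fn(x) for x in self.traces[name]]
        self.provenance.append({"trace": name, "op": op_name})

c = TraceContainer()
c.add_trace("BHZ.II.AAK", [1.0, -2.0, 3.0], origin="IRIS")
c.apply("BHZ.II.AAK", "demean", lambda x: x - 2.0 / 3.0)
# The provenance now records both the ingest and the demean operation.
```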

  12. From Requirements via Colored Workflow Nets to an Implementation in Several Workflow Systems

    DEFF Research Database (Denmark)

    Mans, Ronny S.; van der Aalst, Willibrordus Martinus Pancratius; Molemann, A.J.;

    2007-01-01

    Care organizations, such as hospitals, need to support complex and dynamic workflows. Moreover, many disciplines are involved. This makes it important to avoid the typical disconnect between requirements and the actual implementation of the system. This paper proposes an approach where an Executable Use Case (EUC) and Colored Workflow Net (CWN) are used to close the gap between the given requirements specification and the realization of these requirements with the help of a workflow system. This paper describes a large case study where the diagnostic trajectory…

  13. The application of workflows to digital heritage systems

    OpenAIRE

    Al-Barakati, Abdullah

    2012-01-01

    Digital heritage systems usually handle a rich and varied mix of digital objects, accompanied by complex and intersecting workflows and processes. However, they usually lack effective workflow management within their components as evident in the lack of integrated solutions that include workflow components. There are a number of reasons for this limitation in workflow management utilization including some technical challenges, the unique nature of each digital resource and the challenges impo...

  14. Flexible workflow sharing and execution services for e-scientists

    Science.gov (United States)

    Kacsuk, Péter; Terstyanszky, Gábor; Kiss, Tamas; Sipos, Gergely

    2013-04-01

    The sequence of computational and data manipulation steps required to perform a specific scientific analysis is called a workflow. Workflows that orchestrate data- and/or compute-intensive applications on Distributed Computing Infrastructures (DCIs) have recently become standard tools in e-science. At the same time, the broad and fragmented landscape of workflows and DCIs slows down the uptake of workflow-based work. The development, sharing, integration and execution of workflows is still a challenge for many scientists. The FP7 "Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs" (SHIWA) project significantly improved the situation with a simulation platform that connects different workflow systems, different workflow languages, different DCIs and workflows into a single, interoperable unit. The SHIWA Simulation Platform is a service package, already used by various scientific communities, and used as a tool by the recently started ER-flow FP7 project to expand the use of workflows among European scientists. The presentation will introduce the SHIWA Simulation Platform and the services that ER-flow provides, based on the platform, to space and earth science researchers. The SHIWA Simulation Platform includes: 1. SHIWA Repository: a database where workflows and metadata about workflows can be stored. The database is a central repository to discover and share workflows within and among communities. 2. SHIWA Portal: a web portal that is integrated with the SHIWA Repository and includes a workflow executor engine that can orchestrate various types of workflows on various grid and cloud platforms. 3. SHIWA Desktop: a desktop environment that provides similar access capabilities to the SHIWA Portal, but runs on the user's desktop/laptop instead of a portal server. 4. Workflow engines: the ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflow engines are already…

  15. A semi-automated workflow for biodiversity data retrieval, cleaning, and quality control

    Directory of Open Access Journals (Sweden)

    Cherian Mathew

    2014-12-01

    Full Text Available The compilation and cleaning of data needed for analyses and prediction of species distributions is a time consuming process requiring a solid understanding of data formats and service APIs provided by biodiversity informatics infrastructures. We designed and implemented a Taverna-based Data Refinement Workflow which integrates taxonomic data retrieval, data cleaning, and data selection into a consistent, standards-based, and effective system hiding the complexity of underlying service infrastructures. The workflow can be freely used both locally and through a web-portal which does not require additional software installations by users.
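The retrieval, cleaning and selection stages described above can be sketched as a chain of plain functions. The record fields, normalisation rules and occurrence threshold below are invented for illustration; the actual workflow orchestrates real biodiversity service APIs inside Taverna.

```python
def clean(rec):
    """Normalise the scientific name: collapse whitespace, fix capitalisation."""
    name = " ".join(rec["name"].split()).capitalize()
    return dict(rec, name=name)

def select(records, min_occurrences=50):
    """Keep only taxa with enough occurrence records for downstream analysis."""
    return [r for r in records if r["occurrences"] >= min_occurrences]

def refine(raw):
    """Retrieval output -> cleaned, deduplicated, quality-controlled records."""
    seen, out = set(), []
    for rec in map(clean, raw):
        if rec["name"] not in seen:      # duplicates collapse after cleaning
            seen.add(rec["name"])
            out.append(rec)
    return select(out)

raw = [{"name": "puma  concolor", "occurrences": 120},
       {"name": "Puma concolor", "occurrences": 120},
       {"name": "lynx lynx", "occurrences": 45}]
assert [r["name"] for r in refine(raw)] == ["Puma concolor"]
```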

  16. How to Take HRMS Process Management to the Next Level with Workflow Business Event System

    Science.gov (United States)

    Rajeshuni, Sarala; Yagubian, Aram; Kunamaneni, Krishna

    2006-01-01

    Oracle Workflow with the Business Event System offers a complete process management solution for enterprises to manage business processes cost-effectively. Using Workflow event messaging, event subscriptions, AQ Servlet and advanced queuing technologies, this presentation will demonstrate the step-by-step design and implementation of system solutions to integrate two dissimilar systems and establish communication remotely. As a case study, the presentation walks you through the process of propagating organization name changes that originate in the HRMS module to other applications, without changing application code. The solution can be applied to your particular business cases for streamlining or modifying business processes across Oracle and non-Oracle applications.
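The subscription mechanism can be illustrated with a toy in-process event bus (this is not Oracle's API; all names are invented): a name-change event raised by one "system" is delivered to every subscriber, which updates its own copy of the data.

```python
class EventBus:
    """Minimal publish/subscribe dispatcher, illustrative only."""

    def __init__(self):
        self.subs = {}

    def subscribe(self, event, handler):
        self.subs.setdefault(event, []).append(handler)

    def raise_event(self, event, payload):
        for handler in self.subs.get(event, []):
            handler(payload)

# A downstream application's local directory of organization names.
directory = {"org-17": "Old Name"}

bus = EventBus()
bus.subscribe("org.name.changed",
              lambda p: directory.__setitem__(p["id"], p["new_name"]))

# The originating system raises the event; the subscriber applies the change.
bus.raise_event("org.name.changed", {"id": "org-17", "new_name": "New Name"})
assert directory["org-17"] == "New Name"
```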

  17. Applying direct observation to model workflow and assess adoption.

    Science.gov (United States)

    Unertl, Kim M; Weinger, Matthew B; Johnson, Kevin B

    2006-01-01

    Lack of understanding about workflow can impair health IT system adoption. Observational techniques can provide valuable information about clinical workflow. A pilot study using direct observation was conducted in an outpatient chronic disease clinic. The goals of the study were to assess workflow and information flow and to develop a general model of workflow and information behavior. Over 55 hours of direct observation showed that the pilot site utilized many of the features of the informatics systems available to them, but also employed multiple non-electronic artifacts and workarounds. Gaps existed between clinic workflow and informatics tool workflow, as well as between institutional expectations of informatics tool use and actual use. Concurrent use of both paper-based and electronic systems resulted in duplication of effort and inefficiencies. A relatively short period of direct observation revealed important information about workflow and informatics tool adoption.

  18. Evaluation of Workflow Management Systems - A Meta Model Approach

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    1998-11-01

    Full Text Available The automated enactment of processes through the use of workflow management systems enables the outsourcing of the control flow from application systems. By now, a large number of systems that follow different workflow paradigms are available. This leads to the problem of selecting the appropriate workflow management system for a given situation. In this paper we outline the benefits of a meta model approach for the evaluation and comparison of different workflow management systems. After a general introduction on the topic of meta modeling, the meta models of the workflow management systems WorkParty (Siemens Nixdorf) and FlowMark (IBM) are compared as an example. These product-specific meta models can be generalized to meta reference models, which helps to specify a workflow methodology. As an example, an organisational reference meta model is presented, which helps users in specifying their requirements for a workflow management system.

  19. Parametric Workflow (BIM) for the Repair Construction of Traditional Historic Architecture in Taiwan

    Science.gov (United States)

    Ma, Y.-P.; Hsu, C. C.; Lin, M.-C.; Tsai, Z.-W.; Chen, J.-Y.

    2015-08-01

    In Taiwan, numerous existing traditional buildings are constructed with wooden structures, brick structures, and stone structures. This paper focuses on Taiwan's traditional historic architecture, targeting traditional wooden-structure buildings as the design proposition, and develops a BIM workflow for modeling complex wooden combination geometry, integrating it with traditional 2D documents, and visualizing repair construction assumptions within the 3D model representation. The goal of this article is to explore the current problems to overcome in wooden historic building conservation, and to introduce BIM technology for conserving, documenting, managing, and creating full engineering drawings and information that effectively support historic conservation. Although BIM is mostly oriented to current construction praxis, there have been some attempts to investigate its applicability in historic conservation projects. This article also illustrates the importance and advantages of using a BIM workflow in the repair construction process, compared with a generic workflow.

  20. Reduction of Hospital Physicians' Workflow Interruptions: A Controlled Unit-Based Intervention Study

    Directory of Open Access Journals (Sweden)

    Matthias Weigl

    2012-01-01

    Full Text Available Highly interruptive clinical environments may cause work stress and suboptimal clinical care. This study features an intervention to reduce workflow interruptions by re-designing work and organizational practices in hospital physicians providing ward coverage. A prospective, controlled intervention was conducted in two surgical and two internal wards. The intervention was based on physician quality circles - a participative technique to involve employees in the development of solutions to overcome work-related stressors. Outcome measures were the frequency of observed workflow interruptions. Workflow interruptions by fellow physicians and nursing staff were significantly lower after the intervention. However, a similar decrease was also observed in control units. Additional interviews to explore process-related factors suggested that there might have been spill-over effects in the sense that solutions were not strictly confined to the intervention group. Recommendations for further research on the effectiveness and consequences of such interventions for professional communication and patient safety are discussed.

  1. EPUB as publication format in Open Access journals: Tools and workflow

    Directory of Open Access Journals (Sweden)

    Trude Eikebrokk

    2014-04-01

    Full Text Available In this article, we present a case study of how the main publishing format of an Open Access journal was changed from PDF to EPUB by designing a new workflow using JATS as the basic XML source format. We state the reasons and discuss advantages for doing this, how we did it, and the costs of changing an established Microsoft Word workflow. As an example, we use one typical sociology article with tables, illustrations and references. We then follow the article from JATS markup through different transformations resulting in XHTML, EPUB and MOBI versions. In the end, we put everything together in an automated XProc pipeline. The process has been developed on free and open source tools, and we describe and evaluate these tools in the article. The workflow is suitable for non-professional publishers, and all code is attached and free for reuse by others.
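A single step of such a transformation chain, mapping a couple of JATS elements to their XHTML counterparts, might look like the recursive sketch below. The tag mapping is a tiny invented subset; the actual workflow uses XSLT stylesheets composed in an XProc pipeline.

```python
import xml.etree.ElementTree as ET

# Hypothetical, minimal JATS -> XHTML tag mapping for illustration.
JATS_TO_XHTML = {"article-title": "h1", "p": "p", "italic": "em"}

def transform(elem):
    """Recursively rebuild a JATS tree as XHTML, preserving text and tails."""
    out = ET.Element(JATS_TO_XHTML.get(elem.tag, "div"))
    out.text, out.tail = elem.text, elem.tail
    out.extend(transform(child) for child in elem)
    return out

jats = ET.fromstring(
    "<article><article-title>On Workflows</article-title>"
    "<p>See <italic>figure 1</italic>.</p></article>")
xhtml = transform(jats)
assert ET.tostring(xhtml, encoding="unicode") == (
    "<div><h1>On Workflows</h1><p>See <em>figure 1</em>.</p></div>")
```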

  2. KNOWLEDGE MANAGEMENT DRIVEN BUSINESS PROCESS AND WORKFLOW MODELING WITHIN AN ORGANIZATION FOR CUSTOMER SATISFACTION

    Directory of Open Access Journals (Sweden)

    Atsa Etoundi roger,

    2010-12-01

    Full Text Available To deal with the competitive pressure of the network economy, enterprises have to design their business processes and workflow systems based on the satisfaction of customers. Therefore, mass production has to be abandoned in favour of individual and customized products. Enterprises that fail to meet this challenge will be obliged to step down. Those which tackle this problem need to manage the knowledge of various customers in order to come out with a set of criteria for the delivery of services or production of products. These criteria should then be used to reengineer the business processes and workflows accordingly for the satisfaction of each of the customers. In this paper, based on the knowledge management approach, we define an enterprise business process and workflow model for the delivery of services and production of goods based on the satisfaction of customers.

  3. Statistical modeling and recognition of surgical workflow.

    Science.gov (United States)

    Padoy, Nicolas; Blum, Tobias; Ahmadi, Seyed-Ahmad; Feussner, Hubertus; Berger, Marie-Odile; Navab, Nassir

    2012-04-01

    In this paper, we contribute to the development of context-aware operating rooms by introducing a novel approach to modeling and monitoring the workflow of surgical interventions. We first propose a new representation of interventions in terms of multidimensional time-series formed by synchronized signals acquired over time. We then introduce methods based on Dynamic Time Warping and Hidden Markov Models to analyze and process this data. This results in workflow models combining low-level signals with high-level information such as predefined phases, which can be used to detect actions and trigger an event. Two methods are presented to train these models, using either fully or partially labeled training surgeries. Results are given based on tool usage recordings from sixteen laparoscopic cholecystectomies performed by several surgeons.
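Dynamic Time Warping, one of the two methods named above, aligns sequences that unfold at different speeds. A textbook scalar version is sketched below; the paper's inputs are multidimensional tool-usage signals, not the toy 1-D lists used here.

```python
def dtw(a, b):
    """Dynamic-time-warping distance between two 1-D sequences."""
    inf = float("inf")
    # d[i][j] = cost of best alignment of a[:i] with b[:j].
    d = [[inf] * (len(b) + 1) for _ in range(len(a) + 1)]
    d[0][0] = 0.0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[len(a)][len(b)]

# Same phase sequence performed at different tempos aligns perfectly.
assert dtw([0, 1, 1, 2], [0, 1, 2]) == 0.0
assert dtw([0, 0, 0], [1, 1, 1]) == 3.0
```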

  4. Grid workflow job execution service 'Pilot'

    Science.gov (United States)

    Shamardin, Lev; Kryukov, Alexander; Demichev, Andrey; Ilyin, Vyacheslav

    2011-12-01

    'Pilot' is a grid job execution service for workflow jobs. The main goal for the service is to automate computations with multiple stages since they can be expressed as simple workflows. Each job is a directed acyclic graph of tasks and each task is an execution of something on a grid resource (or 'computing element'). Tasks may be submitted to any WS-GRAM (Globus Toolkit 4) service. The target resources for the tasks execution are selected by the Pilot service from the set of available resources which match the specific requirements from the task and/or job definition. Some simple conditional execution logic is also provided. The 'Pilot' service is built on the REST concepts and provides a simple API through authenticated HTTPS. This service is deployed and used in production in a Russian national grid project GridNNN.
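The job structure, a directed acyclic graph of tasks executed in dependency order, can be sketched with the standard library's topological sorter. Task names and actions are invented; the real service dispatches each task to a WS-GRAM computing element rather than running it locally.

```python
from graphlib import TopologicalSorter

def run_job(tasks, deps):
    """Execute a DAG job. tasks: name -> callable; deps: name -> prerequisites."""
    done = []
    for name in TopologicalSorter(deps).static_order():
        tasks[name]()          # stand-in for submission to a grid resource
        done.append(name)
    return done

log = []
tasks = {n: (lambda n=n: log.append(n)) for n in ("fetch", "simulate", "plot")}
deps = {"simulate": {"fetch"}, "plot": {"simulate"}}

order = run_job(tasks, deps)
assert order == ["fetch", "simulate", "plot"]   # prerequisites always run first
```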

  5. IDD Archival Hardware Architecture and Workflow

    Energy Technology Data Exchange (ETDEWEB)

    Mendonsa, D; Nekoogar, F; Martz, H

    2008-10-09

    This document describes the functionality of every component in the DHS/IDD archival and storage hardware system shown in Fig. 1. The document describes the step-by-step process of image data being received at LLNL, then being processed and made available to authorized personnel and collaborators. Throughout this document, references will be made to one of two figures: Fig. 1, describing the elements of the architecture, and Fig. 2, describing the workflow and how the project utilizes the available hardware.

  6. Quantifying nursing workflow in medication administration.

    Science.gov (United States)

    Keohane, Carol A; Bane, Anne D; Featherstone, Erica; Hayes, Judy; Woolf, Seth; Hurley, Ann; Bates, David W; Gandhi, Tejal K; Poon, Eric G

    2008-01-01

    New medication administration systems are showing promise in improving patient safety at the point of care, but adoption of these systems requires significant changes in nursing workflow. To prepare for these changes, the authors report on a time-motion study that measured the proportion of time that nurses spend on various patient care activities, focusing on medication administration-related activities. Implications of their findings are discussed.

  7. Tailored business solutions by workflow technologies

    Directory of Open Access Journals (Sweden)

    Alexandra Fortiş

    2006-01-01

    Full Text Available VISP (Virtual Internet Service Provider) is an IST-STREP project conducting research in the field of these new technologies, targeted at telecom/ISP companies. One of the first tasks of the VISP project is to identify the most appropriate technologies in order to construct the VISP platform. This paper presents the most significant results in the field of choreography and orchestration, two key domains that must accompany process modeling in the construction of a workflow environment.

  8. EDMS based workflow for Printing Industry

    OpenAIRE

    Prathap Nayak; Anuradha Rao; Ramakrishna Nayak

    2013-01-01

    Information is an indispensable factor of any enterprise. It can be a record or a document generated for every transaction that is made, either paper-based or in electronic format, for future reference. The printing industry is one such industry in which managing information of various formats, with the latest workflows and technologies, can be a nightmare and a challenge for any operator or user, when each process from the least bit of information to a printed product is always depende...

  9. From chart tracking to workflow management.

    Science.gov (United States)

    Srinivasan, P; Vignes, G; Venable, C; Hazelwood, A; Cade, T

    1994-01-01

    The current interest in system-wide integration appears to be based on the assumption that an organization, by digitizing information and accepting a common standard for the exchange of such information, will improve the accessibility of this information and automatically experience benefits resulting from its more productive use. We do not dispute this reasoning, but assert that an organization's capacity for effective change is proportional to the understanding of the current structure among its personnel. Our workflow manager is based on the use of a Parameterized Petri Net (PPN) model which can be configured to represent an arbitrarily detailed picture of an organization. The PPN model can be animated to observe the model organization in action, and the results of the animation analyzed. This simulation is a dynamic, ongoing process which changes with the system and allows members of the organization to pose "what if" questions as a means of exploring opportunities for change. We present the "workflow management system" as the natural successor to the tracking program, incorporating modeling, scheduling, reactive planning, performance evaluation, and simulation. This workflow management system is more than adequate for meeting the needs of a paper chart tracking system and, as the patient record is computerized, will serve as a planning and evaluation tool in converting the paper-based health information system into a computer-based system.
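The firing rule at the heart of any place/transition net, and hence of the PPN model described above, fits in a few lines. The places and transitions below form an invented chart-tracking fragment, not the paper's parameterized model.

```python
def enabled(marking, transition):
    """A transition is enabled when every input place holds a token."""
    return all(marking.get(p, 0) >= 1 for p in transition["in"])

def fire(marking, transition):
    """Consume a token from each input place, produce one in each output."""
    m = dict(marking)
    for p in transition["in"]:
        m[p] -= 1
    for p in transition["out"]:
        m[p] = m.get(p, 0) + 1
    return m

# Invented chart-tracking fragment: a chart is requested, pulled, delivered.
pull_chart = {"in": ["requested"], "out": ["in_transit"]}
deliver = {"in": ["in_transit"], "out": ["at_clinic"]}

m = {"requested": 1}
assert enabled(m, pull_chart) and not enabled(m, deliver)
m = fire(m, pull_chart)
m = fire(m, deliver)
assert m["at_clinic"] == 1
```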

  10. Autonomic Management of Application Workflows on Hybrid Computing Infrastructure

    Directory of Open Access Journals (Sweden)

    Hyunjoo Kim

    2011-01-01

    Full Text Available In this paper, we present a programming and runtime framework that enables the autonomic management of complex application workflows on hybrid computing infrastructures. The framework is designed to address system and application heterogeneity and dynamics to ensure that application objectives and constraints are satisfied. The need for such autonomic system and application management is becoming critical as computing infrastructures become increasingly heterogeneous, integrating different classes of resources from high-end HPC systems to commodity clusters and clouds. For example, the framework presented in this paper can be used to provision the appropriate mix of resources based on application requirements and constraints. The framework also monitors the system/application state and adapts the application and/or resources to respond to changing requirements or environment. To demonstrate the operation of the framework and to evaluate its capabilities, we employ a workflow used to characterize an oil reservoir executing on a hybrid infrastructure composed of TeraGrid nodes and Amazon EC2 instances of various types. Specifically, we show how different application objectives such as acceleration, conservation and resilience can be effectively achieved while satisfying deadline and budget constraints, using an appropriate mix of dynamically provisioned resources. Our evaluations also demonstrate that public clouds can be used to complement and reinforce the scheduling and usage of traditional high performance computing infrastructure.
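A greedy caricature of objective-driven provisioning, picking the best throughput-per-dollar resource until demand is met or the hourly budget runs out, conveys the flavour of choosing a resource mix. The resource table and policy are invented; the paper's framework uses far richer objectives and adaptation.

```python
# Hypothetical resource catalogue: (name, tasks/hour, dollars/hour).
RESOURCES = [
    ("hpc_node", 40, 3.0),
    ("ec2_large", 25, 1.5),
    ("ec2_small", 10, 0.5),
]

def provision(demand, budget):
    """Greedily add resources by throughput-per-dollar within the budget."""
    chosen, rate, cost = [], 0, 0.0
    ranked = sorted(RESOURCES, key=lambda r: r[1] / r[2], reverse=True)
    while rate < demand:
        for name, r, c in ranked:          # best affordable value first
            if cost + c <= budget:
                chosen.append(name)
                rate += r
                cost += c
                break
        else:
            break                          # budget exhausted before demand met
    return chosen, rate, cost

mix, rate, cost = provision(demand=50, budget=3.0)
assert rate >= 50 and cost <= 3.0
```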

  11. From Requirements via Colored Workflow Nets to an Implementation in Several Workflow Systems

    DEFF Research Database (Denmark)

    Mans, Ronnie S:; van der Aalst, Wil M.P.; Bakker, Piet J.M.;

    2007-01-01

    Care organizations, such as hospitals, need to support complex and dynamic workflows. Moreover, many disciplines are involved. This makes it important to avoid the typical disconnect between requirements and the actual implementation of the system. This paper proposes an approach where an Executable Use Case (EUC) and Colored Workflow Net (CWN) are used to close this gap. The diagnostic care process of the Academic Medical Center (AMC) hospital is used as reference process. The process consists of hundreds of activities. These have been modeled and analyzed using an EUC and a CWN. Moreover, based on the CWN, the process has been implemented using four different workflow systems…

  12. Exploration of Management Workflow of Cataract Surgery in an Impoverished Population in Urban China

    Institute of Scientific and Technical Information of China (English)

    Haofeng Jiang; Haotian Lin; Bo Qu; Weirong Chen

    2014-01-01

    Purpose: To explore and establish a rational management workflow for a free cataract surgery program for the poor population in urban China, aiming to improve surgical efficiency. Methods: Establishment of a management workflow mainly includes system design and an auxiliary facility. System design procedures consist of outpatient screening, outpatient physical examination, surgical procedures, and postoperative clinic visits. After establishing the management workflow of cataract surgery, a free cataract surgery program was conducted for 15 months. Results: Based upon the established management mode, 9003 patients received preoperative screening and 2358 underwent cataract surgery. During the 15-month investigation, each procedure was successfully conducted, the efficiency of screening and operation attained the highest standards in China, and no surgical malpractice occurred intraoperatively. Conclusion: In this study, a management workflow for cataract surgery was designed for a poverty relief project in urban China. During the 15-month project, patient satisfaction was enhanced without disrupting the normal practice and safety of the sponsor hospital.

  13. Integrated Cloud-Based Services for Medical Workflow Systems

    Directory of Open Access Journals (Sweden)

    Gharbi Nada

    2016-12-01

    Full Text Available Recent years have witnessed significant progress of workflow systems in different business areas. However, in the medical domain, workflow systems are comparatively scarcely researched. In the medical domain, workflows are as important as in other areas. In fact, the flow of information in the healthcare industry is even more critical than it is in other industries. Workflow can provide a new way of looking at how processes and procedures are completed in particular medical systems, and it can help improve decision-making in these systems. Despite the potential capabilities of workflow systems, medical systems still often face critical challenges in maintaining patient medical information, resulting in difficulties in accessing patient data from different systems. In this paper, a new cloud-based service-oriented architecture is proposed. This architecture will support a medical workflow system integrated with cloud services aligned with medical standards to improve the healthcare system.

  14. Wildfire: distributed, Grid-enabled workflow construction and execution

    Directory of Open Access Journals (Sweden)

    Issac Praveen

    2005-03-01

    Full Text Available Abstract Background We observe two trends in bioinformatics: (i) analyses are increasing in complexity, often requiring several applications to be run as a workflow; and (ii) multiple CPU clusters and Grids are available to more scientists. The traditional solution to the problem of running workflows across multiple CPUs required programming, often in a scripting language such as Perl. Programming places such solutions beyond the reach of many bioinformatics consumers. Results We present Wildfire, a graphical user interface for constructing and running workflows. Wildfire borrows user interface features from Jemboss and adds a drag-and-drop interface allowing the user to compose EMBOSS (and other) programs into workflows. For execution, Wildfire uses GEL, the underlying workflow execution engine, which can exploit available parallelism on multiple-CPU machines including Beowulf-class clusters and Grids. Conclusion Wildfire simplifies the tasks of constructing and executing bioinformatics workflows.

  15. Use of contextual inquiry to understand anatomic pathology workflow: Implications for digital pathology adoption

    Directory of Open Access Journals (Sweden)

    Jonhan Ho

    2012-01-01

    Full Text Available Background: For decades, anatomic pathology (AP) workflow has been a highly manual process based on the use of an optical microscope and glass slides. Recent innovations in scanning and digitizing of entire glass slides are accelerating a move toward widespread adoption and implementation of a workflow based on digital slides and their supporting information management software. To support the design of digital pathology systems and ensure their adoption into pathology practice, the needs of the main users within the AP workflow, the pathologists, should be identified. Contextual inquiry is a qualitative, user-centered, social method designed to identify and understand users' needs and is utilized for collecting, interpreting, and aggregating in-detail aspects of work. Objective: Contextual inquiry was utilized to document current AP workflow, identify processes that may benefit from the introduction of digital pathology systems, and establish design requirements for digital pathology systems that will meet pathologists' needs. Materials and Methods: Pathologists were observed and interviewed at a large academic medical center according to contextual inquiry guidelines established by Holtzblatt et al. 1998. Notes representing user-provided data were documented during observation sessions. An affinity diagram, a hierarchical organization of the notes based on common themes in the data, was created. Five graphical models were developed to help visualize the data, including sequence, flow, artifact, physical, and cultural models. Results: A total of six pathologists were observed by a team of two researchers. A total of 254 affinity notes were documented and organized using a system based on topical hierarchy, including 75 third-level, 24 second-level, and five main-level categories, including technology, communication, synthesis/preparation, organization, and workflow. Current AP workflow was labor intensive and lacked scalability. A large number

  16. Workflow-based Context-aware Control of Surgical Robots

    OpenAIRE

    Beyl, Tim

    2015-01-01

    Surgical assistance systems such as medical robots have enhanced the capabilities of medical procedures in the last decades. This work presents a new perspective on the use of workflows with surgical robots in order to improve the technical capabilities and the ease of use of such systems. This is accomplished by a 3D perception system for the supervision of the surgical operating room and a workflow-based controller that allows monitoring of the surgical process using workflow-tracking techniques.

  17. Electronic resource management systems a workflow approach

    CERN Document Server

    Anderson, Elsa K

    2014-01-01

    To get to the bottom of a successful approach to Electronic Resource Management (ERM), Anderson interviewed staff at 11 institutions about their ERM implementations. Among her conclusions, presented in this issue of Library Technology Reports, is that grasping the intricacies of your workflow (analyzing each step to reveal the gaps and problems) at the beginning is crucial to selecting and implementing an ERM. Whether the system will be used to fill a gap, aggregate critical data, or replace a tedious manual process, the best solution for your library depends on factors such as your current soft

  18. Research on an Integrated Enterprise Workflow Model

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    An integrated enterprise workflow model called PPROCE is presented first. Then, an enterprise ontology established by TOVE and the Process Specification Language (PSL) is studied. Combined with TOVE's partition idea, PSL is extended and new PSL extensions are created to define the ontology of process, organization, resource and product in the PPROCE model. As a result, the PPROCE model can be defined by a set of corresponding formal languages. This facilitates future work not only in model verification, model optimization and model simulation, but also in model translation.

  19. Comparison of Resource Platform Selection Approaches for Scientific Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Ramakrishnan, Lavanya

    2010-03-05

    Cloud computing is increasingly considered as an additional computational resource platform for scientific workflows. The cloud offers opportunity to scale-out applications from desktops and local cluster resources. At the same time, it can eliminate the challenges of restricted software environments and queue delays in shared high performance computing environments. Choosing from these diverse resource platforms for a workflow execution poses a challenge for many scientists. Scientists are often faced with deciding resource platform selection trade-offs with limited information on the actual workflows. While many workflow planning methods have explored task scheduling onto different resources, these methods often require fine-scale characterization of the workflow that is onerous for a scientist. In this position paper, we describe our early exploratory work into using blackbox characteristics to do a cost-benefit analysis of using cloud platforms. We use only very limited high-level information on the workflow length, width, and data sizes. The length and width are indicative of the workflow duration and parallelism. The data size characterizes the IO requirements. We compare the effectiveness of this approach to other resource selection models using two exemplar scientific workflows scheduled on desktops, local clusters, HPC centers, and clouds. Early results suggest that the blackbox model often makes the same resource selections as a more fine-grained whitebox model. We believe the simplicity of the blackbox model can help inform a scientist on the applicability of cloud computing resources even before porting an existing workflow.
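
A blackbox cost-benefit comparison of this kind can be sketched in a few lines. The platform names, rates, queue waits, and the linear cost formula below are illustrative assumptions, not the paper's actual model; only the three coarse traits (length, width, data size) come from the abstract.

```python
# Hypothetical sketch: rank resource platforms using only coarse
# workflow traits -- length (duration), width (parallelism), data size.
platforms = {
    # cpu $/core-hour, io $/GB, typical queue wait (hours), capacity
    "local_cluster": {"cpu_rate": 0.00, "io_rate": 0.00, "wait_hr": 0,  "max_width": 4},
    "hpc_center":    {"cpu_rate": 0.02, "io_rate": 0.00, "wait_hr": 12, "max_width": 512},
    "cloud":         {"cpu_rate": 0.10, "io_rate": 0.09, "wait_hr": 0,  "max_width": 4096},
}

def estimate_cost(length_hr, width, data_gb, p, hour_value=1.0):
    """Blackbox cost: compute charge + IO charge + monetized queue delay."""
    if width > p["max_width"]:
        return float("inf")   # platform cannot host this much parallelism
    money = length_hr * width * p["cpu_rate"] + data_gb * p["io_rate"]
    return money + p["wait_hr"] * hour_value

def select_platform(length_hr, width, data_gb):
    return min(platforms,
               key=lambda n: estimate_cost(length_hr, width, data_gb, platforms[n]))
```

Small jobs stay on the free local cluster, mid-size jobs tolerate the HPC queue wait, and only very wide workflows justify cloud rates, mirroring the trade-offs the abstract describes.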

  20. Process Makna - A Semantic Wiki for Scientific Workflows

    CERN Document Server

    Paschke, Adrian

    2010-01-01

    Virtual e-Science infrastructures supporting Web-based scientific workflows are an example for knowledge-intensive collaborative and weakly-structured processes where the interaction with the human scientists during process execution plays a central role. In this paper we propose the lightweight dynamic user-friendly interaction with humans during execution of scientific workflows via the low-barrier approach of Semantic Wikis as an intuitive interface for non-technical scientists. Our Process Makna Semantic Wiki system is a novel combination of an business process management system adapted for scientific workflows with a Corporate Semantic Web Wiki user interface supporting knowledge intensive human interaction tasks during scientific workflow execution.

  1. Beginning WF Windows Workflow in .NET 4.0

    CERN Document Server

    Collins, M

    2010-01-01

    Windows Workflow Foundation is a ground-breaking addition to the core of the .NET Framework that allows you to orchestrate human and system interactions as a series of workflows that can be easily mapped, analyzed, adjusted, and implemented. As business problems become more complex, the need for a workflow-based solution has never been more evident. WF provides a simple and consistent way to model and implement complex problems. As a developer, you focus on developing the business logic for individual workflow tasks. The runtime handles the execution of those tasks after they have been compose

  2. Model Checking Workflow Net Based on Petri Net

    Institute of Scientific and Technical Information of China (English)

    ZHOU Conghua; CHEN Zhenyu

    2006-01-01

    Soundness is a very important criterion for the correctness of a workflow. Specifying soundness with Computation Tree Logic (CTL) allows us to verify it with symbolic model checkers, so the state explosion problem in verifying soundness can be overcome efficiently. When the property is not satisfied by the system, model checking can give a counter-example, which can guide us in correcting the workflow. In addition, relaxed soundness is another important criterion for workflows. We also prove that Computation Tree Logic* (CTL*) can be used to characterize the relaxed soundness of a workflow.
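
The reachability condition at the heart of classical soundness can be illustrated with a tiny explicit-state check (not the symbolic CTL approach the abstract describes). The example net, its transition names, and the 1-safe set-based markings are all simplifying assumptions for illustration.

```python
# Sketch: from every marking reachable from the initial marking {i},
# the final marking {o} must be reachable again -- the core of
# soundness for a workflow net. Markings are modeled as sets, i.e.
# the net is assumed 1-safe. The net itself is hypothetical.
NET = {  # transition -> (input places, output places)
    "register": ({"i"}, {"p1"}),
    "approve":  ({"p1"}, {"o"}),
    "reject":   ({"p1"}, {"o"}),
}

def fire(marking, t):
    ins, outs = NET[t]
    if not ins <= marking:
        return None                       # transition not enabled
    return frozenset((marking - ins) | outs)

def reachable(start):
    """Explicit-state exploration of all markings reachable from start."""
    seen, stack = {start}, [start]
    while stack:
        m = stack.pop()
        for t in NET:
            nxt = fire(m, t)
            if nxt is not None and nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return seen

def is_sound():
    final = frozenset({"o"})
    return all(final in reachable(m)
               for m in reachable(frozenset({"i"})))
```

A symbolic model checker performs essentially this check over a compactly encoded state space, which is what lets the CTL formulation sidestep state explosion.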

  3. A Model of Workflow-oriented Attributed Based Access Control

    Directory of Open Access Journals (Sweden)

    Guoping Zhang

    2011-02-01

    Full Text Available The emergence of the "Internet of Things" breaks with previous traditional thinking, integrating physical infrastructure and network infrastructure into a unified infrastructure. There will be a lot of resources and information in the IoT, so the computing and processing of information is the core support of the IoT. In this paper, we introduce "Service-Oriented Computing" to solve this problem, where each device can offer its functionality as standard services. Here we mainly discuss the access control issue of service-oriented computing in the Internet of Things. This paper puts forward a model of Workflow-oriented Attribute-Based Access Control (WABAC), and designs an access control framework based on the WABAC model. The model grants permissions to subjects according to subject attribute, resource attribute, environment attribute and current task, meeting the access control requirements of SOC. Using the approach presented can effectively enhance access control security for SOC applications and prevent the abuse of subject permissions.
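
A decision point in the spirit of this model combines the four inputs the abstract names: subject, resource, and environment attributes plus the current workflow task. The attribute names (role, type, risk) and the sample policy below are assumptions for illustration, not taken from the paper.

```python
# Hedged sketch of a workflow-aware ABAC policy decision point:
# a request is permitted only if some rule matches the subject,
# resource, and environment attributes AND covers the current task.
def permit(subject, resource, environment, task, policies):
    for rule in policies:
        if (subject.get("role") == rule["role"]
                and resource.get("type") == rule["resource_type"]
                and environment.get("risk", 99) <= rule["max_risk"]
                and task in rule["tasks"]):
            return True
    return False

# Hypothetical policy: nurses may read sensor data in low-risk
# contexts, but only while performing the "read_vitals" task.
policies = [
    {"role": "nurse", "resource_type": "sensor",
     "max_risk": 2, "tasks": {"read_vitals"}},
]
```

Tying the permission to the current task is what distinguishes the workflow-oriented variant from plain ABAC: the same subject and resource attributes yield a deny once the task changes.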

  4. Chang'E-3 data pre-processing system based on scientific workflow

    Science.gov (United States)

    Tan, Xu; Liu, Jianjun; Wang, Yuanyuan; Yan, Wei; Zhang, Xiaoxia; Li, Chunlai

    2016-04-01

    The Chang'E-3 (CE3) mission has obtained a huge amount of lunar scientific data. Data pre-processing is an important segment of the CE3 ground research and application system. With a dramatic increase in the demand for data research and application, a Chang'E-3 data pre-processing system (CEDPS) based on scientific workflow is proposed for the purpose of making scientists more flexible and productive by automating data-driven processing. The system should allow the planning, conduct and control of the data processing procedure with the following possibilities: • describe a data processing task, including: 1) define input data/output data, 2) define the data relationship, 3) define the sequence of tasks, 4) define the communication between tasks, 5) define mathematical formulas, 6) define the relationship between task and data; • automatic processing of tasks. Accordingly, describing a task is the key point in whether the system is flexible. We design a workflow designer, a visual environment for capturing processes as workflows; the three-level model for the workflow designer is discussed: 1) The data relationship is established through the product tree. 2) The process model is constructed based on a directed acyclic graph (DAG). In particular, a set of process workflow constructs, including Sequence, Loop, Merge, and Fork, are composable with one another. 3) To reduce the modeling complexity of the mathematical formulas using DAG, semantic modeling based on MathML is adopted. On top of that, we present how the CE3 data are processed with CEDPS.
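
The DAG-based process model described above can be sketched with a handful of tasks and dependency edges: the Sequence, Fork, and Merge constructs fall out of the edge structure once tasks are executed in topological order. The task names are illustrative, not taken from CEDPS.

```python
# Minimal sketch of a DAG workflow: each task maps to the set of
# tasks it depends on, and static_order() yields a valid execution
# sequence (every task after all of its dependencies).
from graphlib import TopologicalSorter  # Python 3.9+

deps = {
    "ingest":    set(),
    "calibrate": {"ingest"},                 # Sequence: ingest -> calibrate
    "denoise":   {"ingest"},                 # Fork: ingest feeds two branches
    "mosaic":    {"calibrate", "denoise"},   # Merge: join both branches
}

order = list(TopologicalSorter(deps).static_order())
```

A real engine would also dispatch the independent branches ("calibrate", "denoise") in parallel; the topological order merely guarantees a legal schedule.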

  5. Workflow Management for a Cosmology Collaboratory

    Institute of Scientific and Technical Information of China (English)

    Stewart C. Loken; Charles McParland

    2001-01-01

    The Nearby Supernova Factory Project will provide a unique opportunity to bring together simulation and observation to address crucial problems in particle and nuclear physics. Its goal is to significantly enhance our understanding of the nuclear processes in supernovae and to improve our ability to use both Type Ia and Type II supernovae as reference light sources (standard candles) in precision measurements of cosmological parameters. Over the past several years, astronomers and astrophysicists have been conducting in-depth sky searches with the goal of identifying supernovae in their earliest evolutionary stages and, during the 4 to 8 weeks of their most "explosive" activity, measure their changing magnitude and spectra. The search program currently under development at LBNL is an earth-based observation program utilizing observational instruments at Haleakala and Mauna Kea, Hawaii and Mt. Palomar, California. This new program provides a demanding testbed for the integration of computational, data management and collaboratory technologies. A critical element of this effort is the use of emerging workflow management tools to permit collaborating scientists to manage data processing and storage and to integrate advanced supernova simulation into the real-time control of the experiments. This paper describes the workflow management framework for the project, discusses security and resource allocation requirements and reviews emerging tools to support this important aspect of collaborative work.

  6. Delta: Data Reduction for Integrated Application Workflows.

    Energy Technology Data Exchange (ETDEWEB)

    Lofstead, Gerald Fredrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jean-Baptiste, Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Oldfield, Ron A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-06-01

    Integrated Application Workflows (IAWs) run multiple simulation workflow components concurrently on an HPC resource, connecting these components using compute area resources and compensating for any performance or data processing rate mismatches. These IAWs require high frequency and high volume data transfers between compute nodes and staging area nodes during the lifetime of a large parallel computation. The available network bandwidth between the two areas may not be enough to efficiently support the data movement. As the processing power available to compute resources increases, the requirements for this data transfer will become more difficult to satisfy and perhaps will not be satisfiable at all, since network capabilities are not expanding at a comparable rate. Furthermore, energy consumption in HPC environments is expected to grow by an order of magnitude as exascale systems become a reality. The energy cost of moving large amounts of data frequently will contribute to this issue. It is necessary to reduce the volume of data without reducing the quality of data when it is being processed and analyzed. Delta resolves the issue by addressing the lifetime data transfer operations. Delta removes subsequent identical copies of already transmitted data during transfers and restores those copies once the data has reached the destination. Delta is able to identify duplicated information and determine the most space efficient way to represent it. Initial tests show about 50% reduction in data movement while maintaining the same data quality and transmission frequency.
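
The core idea of removing identical copies before transfer and restoring them at the destination can be sketched with content-addressed chunks. Fixed-size chunking and SHA-256 chunk keys are assumptions made for this example; they are not Delta's actual mechanism.

```python
# Illustrative sketch of transfer-time deduplication: split the stream
# into chunks, transmit each distinct chunk once plus a list of chunk
# keys, and reassemble losslessly at the destination.
import hashlib

CHUNK = 4  # bytes per chunk (tiny, for demonstration)

def deflate(data: bytes):
    """Return (unique chunk store, ordered key list) for transmission."""
    store, keys = {}, []
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        key = hashlib.sha256(chunk).hexdigest()
        store.setdefault(key, chunk)   # identical chunks stored only once
        keys.append(key)
    return store, keys

def inflate(store, keys) -> bytes:
    """Restore the original byte stream at the destination."""
    return b"".join(store[k] for k in keys)
```

For a payload whose chunks repeat, only the distinct chunks plus a short key list cross the network, while the round trip remains exact.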

  7. Scalable Scientific Workflows Management System SWFMS

    Directory of Open Access Journals (Sweden)

    M. Abdul Rahman

    2016-11-01

    Full Text Available In today's electronic world, conducting scientific experiments, especially in the natural sciences domain, has become more and more challenging for domain scientists, since "science" today has turned out to be more complex due to a two-dimensional intricacy: one, assorted as well as complex computational (analytical) applications, and two, an increasingly large volume as well as heterogeneity of scientific data products processed by these applications. Furthermore, the involvement of an increasingly large number of scientific instruments such as sensors and machines makes scientific data management even more challenging, since the data generated from such instruments are highly complex. To reduce the amount of complexity in conducting scientific experiments as much as possible, an integrated framework that transparently implements the conceptual separation between both dimensions is direly needed. In order to facilitate scientific experiments, 'workflow' technology has in recent years emerged in scientific disciplines like biology, bioinformatics, geology, environmental science, and eco-informatics. Much research work has been done to develop scientific workflow systems. However, our analysis of these existing systems shows that they lack a well-structured conceptual modeling methodology to deal with the two complex dimensions in a transparent manner. This paper presents a scientific workflow framework that properly addresses these two dimensions of complexity.

  8. Workflow-Based Software Development Environment

    Science.gov (United States)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment

  9. Workflow management for a cosmology collaboratory

    Energy Technology Data Exchange (ETDEWEB)

    Loken, Stewart C.; McParland, Charles

    2001-07-20

    The Nearby Supernova Factory Project will provide a unique opportunity to bring together simulation and observation to address crucial problems in particle and nuclear physics. Its goal is to significantly enhance our understanding of the nuclear processes in supernovae and to improve our ability to use both Type Ia and Type II supernovae as reference light sources (standard candles) in precision measurements of cosmological parameters. Over the past several years, astronomers and astrophysicists have been conducting in-depth sky searches with the goal of identifying supernovae in their earliest evolutionary stages and, during the 4 to 8 weeks of their most "explosive" activity, measure their changing magnitude and spectra. The search program currently under development at LBNL is an earth-based observation program utilizing observational instruments at Haleakala and Mauna Kea, Hawaii and Mt. Palomar, California. This new program provides a demanding testbed for the integration of computational, data management and collaboratory technologies. A critical element of this effort is the use of emerging workflow management tools to permit collaborating scientists to manage data processing and storage and to integrate advanced supernova simulation into the real-time control of the experiments. This paper describes the workflow management framework for the project, discusses security and resource allocation requirements and reviews emerging tools to support this important aspect of collaborative work.

  10. Von der Prozeßorientierung zum Workflow Management - Teil 2: Prozeßmanagement, Workflow Management, Workflow-Management-Systeme

    OpenAIRE

    Maurer, Gerd

    1996-01-01

    The terms process orientation, process management, workflow management, and workflow management systems are still not clearly defined and delimited from one another. Starting from a specific understanding of process orientation (working paper WI No. 9/1996), process management is defined as a comprehensive approach to the process-oriented design and management of enterprises. Workflow management constitutes the more formal, strongly IT-related component of process management and...

  11. Cognitive Learning, Monitoring and Assistance of Industrial Workflows Using Egocentric Sensor Networks.

    Science.gov (United States)

    Bleser, Gabriele; Damen, Dima; Behera, Ardhendu; Hendeby, Gustaf; Mura, Katharina; Miezal, Markus; Gee, Andrew; Petersen, Nils; Maçães, Gustavo; Domingues, Hugo; Gorecky, Dominic; Almeida, Luis; Mayol-Cuevas, Walterio; Calway, Andrew; Cohn, Anthony G; Hogg, David C; Stricker, Didier

    2015-01-01

    Today, the workflows that are involved in industrial assembly and production activities are becoming increasingly complex. To efficiently and safely perform these workflows is demanding on the workers, in particular when it comes to infrequent or repetitive tasks. This burden on the workers can be eased by introducing smart assistance systems. This article presents a scalable concept and an integrated system demonstrator designed for this purpose. The basic idea is to learn workflows from observing multiple expert operators and then transfer the learnt workflow models to novice users. Being entirely learning-based, the proposed system can be applied to various tasks and domains. The above idea has been realized in a prototype, which combines components pushing the state of the art of hardware and software designed with interoperability in mind. The emphasis of this article is on the algorithms developed for the prototype: 1) fusion of inertial and visual sensor information from an on-body sensor network (BSN) to robustly track the user's pose in magnetically polluted environments; 2) learning-based computer vision algorithms to map the workspace, localize the sensor with respect to the workspace and capture objects, even as they are carried; 3) domain-independent and robust workflow recovery and monitoring algorithms based on spatiotemporal pairwise relations deduced from object and user movement with respect to the scene; and 4) context-sensitive augmented reality (AR) user feedback using a head-mounted display (HMD). A distinguishing key feature of the developed algorithms is that they all operate solely on data from the on-body sensor network and that no external instrumentation is needed. The feasibility of the chosen approach for the complete action-perception-feedback loop is demonstrated on three increasingly complex datasets representing manual industrial tasks. 
These limited-size datasets highlight the potential of the chosen technology as a

  12. Cognitive Learning, Monitoring and Assistance of Industrial Workflows Using Egocentric Sensor Networks.

    Directory of Open Access Journals (Sweden)

    Gabriele Bleser

    Full Text Available Today, the workflows that are involved in industrial assembly and production activities are becoming increasingly complex. To efficiently and safely perform these workflows is demanding on the workers, in particular when it comes to infrequent or repetitive tasks. This burden on the workers can be eased by introducing smart assistance systems. This article presents a scalable concept and an integrated system demonstrator designed for this purpose. The basic idea is to learn workflows from observing multiple expert operators and then transfer the learnt workflow models to novice users. Being entirely learning-based, the proposed system can be applied to various tasks and domains. The above idea has been realized in a prototype, which combines components pushing the state of the art of hardware and software designed with interoperability in mind. The emphasis of this article is on the algorithms developed for the prototype: 1) fusion of inertial and visual sensor information from an on-body sensor network (BSN) to robustly track the user's pose in magnetically polluted environments; 2) learning-based computer vision algorithms to map the workspace, localize the sensor with respect to the workspace and capture objects, even as they are carried; 3) domain-independent and robust workflow recovery and monitoring algorithms based on spatiotemporal pairwise relations deduced from object and user movement with respect to the scene; and 4) context-sensitive augmented reality (AR) user feedback using a head-mounted display (HMD). A distinguishing key feature of the developed algorithms is that they all operate solely on data from the on-body sensor network and that no external instrumentation is needed. The feasibility of the chosen approach for the complete action-perception-feedback loop is demonstrated on three increasingly complex datasets representing manual industrial tasks. These limited-size datasets highlight the potential of the chosen

  13. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience.

    Science.gov (United States)

    Stockton, David B; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.

  14. Modelling and analysis of workflow for lean supply chains

    Science.gov (United States)

    Ma, Jinping; Wang, Kanliang; Xu, Lida

    2011-11-01

    Cross-organisational workflow systems are a component of enterprise information systems which support collaborative business processes among organisations in a supply chain. Currently, the majority of workflow systems are developed from the perspective of information modelling, without considering the actual requirements of supply chain management. In this article, we focus on the modelling and analysis of cross-organisational workflow systems in the context of the lean supply chain (LSC) using Petri nets. First, the article describes the assumed conditions of a cross-organisation workflow net according to the idea of the LSC and then discusses the standardisation of collaborating business processes between organisations in the context of the LSC. Second, the concept of labelled time Petri nets (LTPNs) is defined by combining labelled Petri nets with time Petri nets, and the concept of labelled time workflow nets (LTWNs) is defined based on LTPNs. Cross-organisational labelled time workflow nets (CLTWNs) are then defined based on LTWNs. Third, the article proposes the notion of OR-silent CLTWNs and a verifying approach to the soundness of LTWNs and CLTWNs. Finally, this article illustrates how to use the proposed method with a simple example. The purpose of this research is to establish a formal method for the modelling and analysis of workflow systems for LSCs. This study initiates a new perspective of research on cross-organisational workflow management and promotes the operation management of LSCs in real world settings.
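
The time-Petri-net ingredient of an LTPN can be sketched with a single rule: a transition fires only when its input places are marked and the elapsed time since enabling lies in its static interval [eft, lft]. The transition, its label, and the place names below are illustrative assumptions, not taken from the article.

```python
# Hedged sketch of time-enabled firing in a labelled time Petri net:
# a transition is firable iff its input places are all marked AND the
# elapsed time falls inside its earliest/latest firing interval.
def firable(transition, marking, elapsed):
    ins = transition["inputs"]
    eft, lft = transition["interval"]
    return ins <= marking and eft <= elapsed <= lft

# Hypothetical labelled transition "ship": needs a token in 'order_ok'
# and may fire between 2 and 5 time units after becoming enabled.
ship = {"label": "ship", "inputs": {"order_ok"}, "interval": (2, 5)}
```

Soundness analysis of CLTWNs must then account for both dimensions at once: a structurally reachable final marking may still be unreachable if the timing intervals of the joined organisations never overlap.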

  15. Research of Web-based Workflow Management System

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The state of the art of workflow management techniques in research is introduced. The research and development trends of Workflow Management Systems (WFMS) are presented. On the basis of an analysis and comparison of various kinds of WFMSs, a WFMS based on Web technology and distributed object management is proposed. Finally, the application of the WFMS in supply chain management is described in detail.

  16. Workflow automation based on OSI job transfer and manipulation

    NARCIS (Netherlands)

    Sinderen, van Marten J.; Joosten, Stef M.M.; Guareis de Farias, Clever R.

    1999-01-01

    This paper shows that Workflow Management Systems (WFMS) and a data communication standard called Job Transfer and Manipulation (JTM) are built on the same concepts, even though different words are used. The paper analyses the correspondence of workflow concepts and JTM concepts. Besides, the corres

  17. An architecture including network QoS in scientific workflows

    NARCIS (Netherlands)

    Zhao, Z.; Grosso, P.; Koning, R.; van der Ham, J.; de Laat, C.

    2010-01-01

    The quality of the network services has so far rarely been considered in composing and executing scientific workflows. Currently, scientific applications tune the execution quality of workflows neglecting network resources, and by selecting only optimal software services and computing resources. One

  18. Towards an actor-driven workflow management system for Grids

    NARCIS (Netherlands)

    F. Berretz; S. Skorupa; V. Sander; A. Belloum

    2010-01-01

    Currently, most workflow management systems in Grid environments provide push-oriented job distribution strategies, where jobs are explicitly delegated to resources. In those scenarios the dedicated resources execute submitted jobs according to the request of a workflow engine or Grid wide scheduler

  19. From Paper Based Clinical Practice Guidelines to Declarative Workflow Management

    DEFF Research Database (Denmark)

    Lyng, Karen Marie; Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2009-01-01

    a sub workflow can be described in a declarative workflow management system: the Resultmaker Online Consultant (ROC). The example demonstrates that declarative primitives allow to naturally extend the paper based flowchart to an executable model without introducing a complex cyclic control flow graph....

  20. Conceptual Framework and Architecture for Service Mediating Workflow Management

    NARCIS (Netherlands)

    Hu, Jinmin; Grefen, Paul

    2003-01-01

    This paper proposes a three-layer workflow concept framework to realize workflow enactment flexibility by dynamically binding activities to their implementations at run time. A service mediating layer is added to bridge business process definition and its implementation. Based on this framework, we

  1. Distributed execution of aggregated multi domain workflows using an agent framework

    NARCIS (Netherlands)

    Z. Zhao; A. Belloum; C. de Laat; P. Adriaans; B. Hertzberger

    2007-01-01

    In e-Science, meaningful experiment processes and workflow engines emerge as important scientific resources. A complex experiment often involves services and processes developed in different scientific domains. Aggregating different workflows into one meta workflow avoids unnecessary rewriting of ex

  2. A Multi-Dimensional Classification Model for Scientific Workflow Characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Ramakrishnan, Lavanya; Plale, Beth

    2010-04-05

    Workflows have been used to model repeatable tasks or operations in manufacturing, business processes, and software. In recent years, workflows are increasingly used for the orchestration of science discovery tasks that use distributed resources and web services environments through resource models such as grid and cloud computing. Workflows have disparate requirements and constraints that affect how they might be managed in distributed environments. In this paper, we present a multi-dimensional classification model illustrated by workflow examples obtained through a survey of scientists from different domains, including bioinformatics and biomedicine, weather and ocean modeling, and astronomy, detailing their data and computational requirements. The survey results and classification model contribute to a high-level understanding of scientific workflows.

  3. A framework for interoperability of BPEL-based workflows

    Institute of Scientific and Technical Information of China (English)

    Li Xitong; Fan Yushun; Huang Shuangxi

    2008-01-01

    With the prevalence of service-oriented architecture (SOA), web services have become the dominant technology for constructing workflow systems. As a workflow is the composition of a series of interrelated web services that realize its activities, the interoperability of workflows can be treated as the composition of web services. To address this, a framework for the interoperability of business process execution language (BPEL)-based workflows is presented, covering three phases: transformation, conformance testing and execution. The core components of the framework are proposed, with emphasis on how these components promote interoperability. In particular, dynamic binding and re-composition of workflows in terms of web service testing are presented. An example of business-to-business (B2B) collaboration is provided to illustrate how to perform composition and conformance testing.

  4. Federated Database Services for Wind Tunnel Experiment Workflows

    Directory of Open Access Journals (Sweden)

    A. Paventhan

    2006-01-01

    Full Text Available Enabling the full life cycle of scientific and engineering workflows requires robust middleware and services that support effective data management, near-realtime data movement and custom data processing. Many existing solutions exploit the database as a passive metadata catalog. In this paper, we present an approach that makes use of federation of databases to host data-centric wind tunnel application workflows. The user is able to compose customized application workflows based on database services. We provide a reference implementation that leverages typical business tools and technologies: Microsoft SQL Server for database services and Windows Workflow Foundation for workflow services. The application data and user's code are both hosted in federated databases. With the growing interest in XML Web Services in scientific Grids, and with databases beginning to support native XML types and XML Web services, we can expect the role of databases in scientific computation to grow in importance.

  5. An Integrated Workflow for DNA Methylation Analysis

    Institute of Scientific and Technical Information of China (English)

    Pingchuan Li; Feray Demirci; Gayathri Mahalingam; Caghan Demirci; Mayumi Nakano; Blake C.Meyers

    2013-01-01

    The analysis of cytosine methylation provides a new way to assess and describe epigenetic regulation at a whole-genome level in many eukaryotes. DNA methylation has a demonstrated role in genome stability and protection, regulation of gene expression and many other aspects of genome function and maintenance. BS-seq is a relatively unbiased method for profiling DNA methylation, with a resolution capable of measuring methylation at individual cytosines. Here we describe, as an example, a workflow to handle DNA methylation analysis, from BS-seq library preparation to data visualization. We describe some applications for the analysis and interpretation of these data. Our laboratory provides public access to plant DNA methylation data via visualization tools available at our "Next-Gen Sequence" websites (http://mpss.udel.edu), along with small RNA, RNA-seq and other data types.
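
The per-cytosine computation at the heart of such a workflow is simple: the methylation level at a position is the fraction of bisulfite reads supporting methylation there. A minimal sketch with made-up counts (a real pipeline derives these from aligned BS-seq reads, which is the bulk of the work the paper describes):

```python
# Per-cytosine methylation level from (methylated, total) read counts.
# The counts below are invented for illustration.
def methylation_level(meth, total):
    # positions with no coverage get no call rather than a level of 0
    return None if total == 0 else meth / total

counts = {  # (chromosome, position) -> (methylated reads, total reads)
    ("chr1", 100): (9, 10),
    ("chr1", 150): (0, 8),
    ("chr1", 200): (0, 0),  # no coverage
}
levels = {pos: methylation_level(m, t) for pos, (m, t) in counts.items()}
```

Distinguishing "unmethylated" (0.0) from "uncovered" (None) matters downstream, since the two look identical if zero coverage is silently treated as zero methylation.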

  6. A Framework for Distributed Preservation Workflows

    Directory of Open Access Journals (Sweden)

    Rainer Schmidt

    2010-07-01

    Full Text Available The Planets Project is developing a service-oriented environment for the definition and evaluation of preservation strategies for human-centric data. It focuses on the question of logically preserving digital materials, as opposed to the physical preservation of content bit-streams. This includes the development of preservation tools for the automated characterisation, migration, and comparison of different types of Digital Objects as well as the emulation of their original runtime environment in order to ensure long-time access and interpretability. The Planets integrated environment provides a number of end-user applications that allow data curators to execute and scientifically evaluate preservation experiments based on composable preservation services. In this paper, we focus on the middleware and programming model and show how it can be utilised in order to create complex preservation workflows.

  7. A Formal Model For Declarative Workflows

    DEFF Research Database (Denmark)

    Mukkamala, Raghava Rao

    it as a general formal model for specification and execution of declarative, event-based business processes, as a generalization of a concurrency model, the classic event structures. The model allows for an intuitive operational semantics and mapping of execution state by a notion of markings of the graphs and we...... the declarative nature of the projected graphs (which are also DCR graphs). We have also provided semantics for distributed executions based on synchronous communication among network of projected graphs and proved that global and distributed executions are equivalent. Further, to support modeling of processes......Current business process technology is pretty good in supporting well-structured business processes and aim at achieving a fixed goal by carrying out an exact set of operations. In contrast, those exact operations needed to fulfill a business process/workflow may not be always possible to foresee...
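
The core of such declarative models can be sketched with two relations: a condition (what must have executed before an event is enabled) and a response (what becomes pending once an event executes). This is a much-reduced sketch of DCR-style semantics with hypothetical event names; the full model also has inclusion/exclusion relations, nesting and the distributed projections discussed above:

```python
# Reduced DCR-style sketch: "conditions" must have executed before an event
# is enabled; executing an event makes its "responses" pending obligations.
# Event names are hypothetical; include/exclude relations are omitted.
conditions = {"pay": {"order"}}   # "order" must precede "pay"
responses = {"order": {"pay"}}    # an order obliges an eventual payment

def enabled(event, executed):
    return conditions.get(event, set()) <= executed

def execute(event, executed, pending):
    assert enabled(event, executed)
    return executed | {event}, (pending - {event}) | responses.get(event, set())

# A run is accepting when no responses remain pending.
```

Note that nothing forces "pay" to happen immediately; the pending set records the obligation declaratively instead of encoding an explicit control-flow edge.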

  8. Software workflow for the automatic tagging of medieval manuscript images (SWATI)

    Science.gov (United States)

    Chandna, Swati; Tonne, Danah; Jejkal, Thomas; Stotzka, Rainer; Krause, Celia; Vanscheidt, Philipp; Busch, Hannah; Prabhune, Ajinkya

    2015-01-01

    Digital methods, tools and algorithms are gaining in importance for the analysis of digitized manuscript collections in the arts and humanities. One example is the BMBF-funded research project "eCodicology", which aims to design, evaluate and optimize algorithms for the automatic identification of macro- and micro-structural layout features of medieval manuscripts. The main goal of this research project is to provide better insights into high-dimensional datasets of medieval manuscripts for humanities scholars. The heterogeneous nature and size of the humanities data and the need to create a database of automatically extracted, reproducible features for better statistical and visual analysis are the main challenges in designing a workflow for the arts and humanities. This paper presents a concept of a workflow for the automatic tagging of medieval manuscripts. As a starting point, the workflow uses medieval manuscripts digitized within the scope of the project "Virtual Scriptorium St. Matthias". Firstly, these digitized manuscripts are ingested into a data repository. Secondly, specific algorithms are adapted or designed for the identification of macro- and micro-structural layout elements like page size, writing space, number of lines etc. Lastly, a statistical analysis and scientific evaluation of the manuscript groups are performed. The workflow is designed generically to process large amounts of data automatically with any desired algorithm for feature extraction. As a result, a database of objectified and reproducible features is created which helps to analyze and visualize hidden relationships of around 170,000 pages. The workflow shows the potential of automatic image analysis by enabling the processing of a single page in less than a minute. Furthermore, accuracy tests of the workflow on a small set of manuscripts with respect to features like page size and text areas show that automatic and manual analysis are comparable. The usage of a computer

  9. Understanding dental CAD/CAM for restorations--the digital workflow from a mechanical engineering viewpoint.

    Science.gov (United States)

    Tapie, L; Lebon, N; Mawussi, B; Fron Chabouis, H; Duret, F; Attal, J-P

    2015-01-01

    As digital technology infiltrates every area of daily life, including the field of medicine, so it is increasingly being introduced into dental practice. Apart from chairside practice, computer-aided design/computer-aided manufacturing (CAD/CAM) solutions are available for creating inlays, crowns, fixed partial dentures (FPDs), implant abutments, and other dental prostheses. CAD/CAM dental solutions can be considered a chain of digital devices and software for the almost automatic design and creation of dental restorations. However, dentists who want to use the technology often do not have the time or knowledge to understand it. A basic knowledge of the CAD/CAM digital workflow for dental restorations can help dentists to grasp the technology and purchase a CAD/CAM system that meets the needs of their office. This article provides a computer-science and mechanical-engineering approach to the CAD/CAM digital workflow to help dentists understand the technology.

  10. Cognitive processes as integrative component for developing expert decision-making systems: a workflow centered framework.

    Science.gov (United States)

    Jalote-Parmar, Ashis; Badke-Schaub, Petra; Ali, Wajid; Samset, Eigil

    2010-02-01

    The development of expert decision-making systems, which improve task performance and reduce errors within an intra-operative clinical workspace, is critically dependent on two main aspects: (a) analyzing the clinical requirements and cognitive processes within the workflow and (b) providing an optimal context for accurate situation awareness through effective intra-operative information visualization. This paper presents a workflow-centered framework and its theoretical underpinnings to design expert decision-making systems. The framework integrates knowledge of the clinical workflow based on the requirements within the clinical workspace. Furthermore, it builds upon and integrates the theory of situation awareness into system design to improve decision-making. As an application example, this framework has been used to design an intra-operative visualization system (IVS), which provides image guidance to clinicians performing minimally invasive procedures. An evaluative study, comparing the traditional ultrasound-guided procedure with the newly developed IVS, has been conducted with expert interventional radiologists and medical students. The results reveal significant evidence of improved decision-making when using the IVS. Therefore, it can be stated that this study demonstrates the benefits of integrating knowledge of cognitive processes into system development to support clinical decision-making and hence improve task performance and prevent errors.

  11. Service-based flexible workflow system for virtual enterprise

    Institute of Scientific and Technical Information of China (English)

    WU Shao-fei

    2008-01-01

    Using the services provided by virtual enterprises, we present a solution to implement flexible inter-enterprise workflow management. Services are responses to events that can be accessed programmatically on the Internet via the HTTP protocol, and are obtained according to standardized service templates. For flexible control, the workflow engine binds a request to appropriate services and their providers using a constraint-based, dynamic binding mechanism; hence, a flexible and collaborative business is achieved. The workflow management system supports virtual enterprises, and the styles of virtual enterprises can be adjusted readily to adapt to various situations.

  12. Editorial and Technological Workflow Tools to Promote Website Quality

    Directory of Open Access Journals (Sweden)

    Emily G. Morton-Owens

    2011-09-01

    Full Text Available Library websites are an increasingly visible representation of the library as an institution, which makes website quality an important way to communicate competence and trustworthiness to users. A website editorial workflow is one way to enforce a process and ensure quality. In a workflow, users receive roles, like author or editor, and content travels through various stages in which grammar, spelling, tone, and format are checked. One library used a workflow system to involve librarians in the creation of content. This system, implemented in Drupal, an open-source content management system, solved problems of coordination, quality, and comprehensiveness that existed on the library’s earlier, static website.

  13. A Workflow Process Mining Algorithm Based on Synchro-Net

    Institute of Scientific and Technical Information of China (English)

    Xing-Qi Huang; Li-Fu Wang; Wen Zhao; Shi-Kun Zhang; Chong-Yi Yuan

    2006-01-01

    Sometimes historic information about workflow execution is needed to analyze business processes. Process mining aims at extracting information from event logs for capturing a business process in execution. In this paper a process mining algorithm is proposed based on Synchro-Net, which is a synchronization-based model of workflow logic and workflow semantics. With this mining algorithm based on the model, problems such as invisible tasks and short loops can be handled with ease. A process mining example is presented to illustrate the algorithm, and an evaluation is also given.
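
The starting point of most process-mining algorithms is the directly-follows relation extracted from the event log. The sketch below shows that classic first step on a hypothetical two-trace log; it does not reproduce the Synchro-Net-specific handling of invisible tasks and short loops, which is precisely where this paper goes beyond the basic relation:

```python
# Classic first step of process mining: derive the directly-follows and
# causal relations from an event log (a list of traces of task names).
def directly_follows(log):
    return {(a, b) for trace in log for a, b in zip(trace, trace[1:])}

def causal(df):
    # a -> b when a is directly followed by b but never the reverse;
    # mutual succession (as between B and C here) suggests concurrency
    return {(a, b) for a, b in df if (b, a) not in df}

LOG = [["A", "B", "C"], ["A", "C", "B"]]  # hypothetical log: B and C swap order
```

Short loops like A,B,A break this naive causal inference (A > B and B > A both hold), which is one reason models such as Synchro-Net are needed.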

  14. Workflow logs analysis system for enterprise performance measurement

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Workflow logs that record the execution of business processes offer a very valuable data resource for real-time enterprise performance measurement. In this paper, a novel scheme that uses data warehouse and OLAP technology to explore workflow logs and create complex analysis reports for enterprise performance measurement is proposed. Three key points of this scheme are studied: 1) the measure set; 2) the open and flexible architecture for the workflow logs analysis system; 3) the data models in the WFMS and the data warehouse. A case study that shows the validity of the scheme is also provided.
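
One measure such a scheme might include is average task duration, aggregated from the raw log. A minimal sketch with a hypothetical log layout of (case id, task, start, end); a real system would load this into a data warehouse and serve it through OLAP rather than compute it in place:

```python
from collections import defaultdict

# Hypothetical workflow log rows: (case id, task, start hour, end hour).
LOG = [
    (1, "review", 0.0, 2.0),
    (1, "approve", 2.0, 2.5),
    (2, "review", 1.0, 4.0),
]

def avg_duration_by_task(log):
    """One example measure a performance data warehouse would materialise."""
    acc = defaultdict(lambda: [0.0, 0])  # task -> [total duration, count]
    for _case, task, start, end in log:
        acc[task][0] += end - start
        acc[task][1] += 1
    return {task: total / n for task, (total, n) in acc.items()}
```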

  15. What is needed for effective open access workflows?

    CERN Document Server

    CERN. Geneva

    2017-01-01

    Institutions and funders are pushing forward open access with ever new guidelines and policies. Since institutional repositories are important maintainers of green open access, they should support easy and fast workflows for researchers and libraries to release publications. Based on the requirements specifications of researchers, libraries and publishers, possible supporting software extensions are discussed. What does a typical workflow look like? What has to be considered by the researchers and by the editors in the library before releasing a green open access publication? Where and how can software support and improve existing workflows?

  16. Reduction techniques of workflow verification and its implementation

    Institute of Scientific and Technical Information of China (English)

    李沛武; 卢正鼎; 付湘林

    2004-01-01

    Many workflow management systems have emerged in recent years, but few of them provide any form of support for verification. This frequently results in runtime errors that need to be corrected at prohibitive cost. In Ref. [1], a few reduction rules for verifying workflow graphs are given. After analyzing the reduction rules, the overlapped reduction rule is found to be inaccurate. In this paper, improved reduction rules are presented and a matrix-based implementation algorithm is given, so that the scope of workflow verification is expanded and the efficiency of the algorithm is enhanced. The method is simple and natural, and its implementation is easy too.
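
The simplest of the reduction rules, sequence reduction, can be implemented directly on an adjacency matrix: a node with exactly one predecessor and one successor is removed and the two neighbours are connected. This sketch illustrates only that one rule, not the paper's corrected overlapped-reduction rule:

```python
# Sequence reduction on a 0/1 adjacency matrix: repeatedly remove any node
# with exactly one predecessor and one successor, bypassing it with an edge.
def reduce_sequences(adj):
    alive = set(range(len(adj)))
    changed = True
    while changed:
        changed = False
        for v in list(alive):
            preds = [u for u in alive if u != v and adj[u][v]]
            succs = [w for w in alive if w != v and adj[v][w]]
            if len(preds) == 1 and len(succs) == 1:
                adj[preds[0]][succs[0]] = 1  # bypass v
                alive.discard(v)
                changed = True
    return alive

# start -> a -> b -> end collapses to start -> end
m = [[0, 1, 0, 0],
     [0, 0, 1, 0],
     [0, 0, 0, 1],
     [0, 0, 0, 0]]
```

A workflow graph that reduces to a single source and sink this way is trivially well-structured; the interesting cases are the splits and joins that need the other rules.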

  17. Solutions for complex, multi data type and multi tool analysis: principles and applications of using workflow and pipelining methods.

    Science.gov (United States)

    Munro, Robin E J; Guo, Yike

    2009-01-01

    Analytical workflow technology, sometimes also called data pipelining, is the fundamental component that provides the scalable analytical middleware that can be used to enable the rapid building and deployment of an analytical application. Analytical workflows enable researchers, analysts and informaticians to integrate and access data and tools from structured and non-structured data sources so that analytics can bridge different silos of information; compose multiple analytical methods and data transformations without coding; rapidly develop applications and solutions by visually constructing analytical workflows that are easy to revise should the requirements change; access domain-specific extensions for specific projects or areas, for example, text extraction, visualisation, reporting, genetics, cheminformatics, bioinformatics and patient-based analytics; and automatically deploy workflows directly into web portals and as web services to be part of a service-oriented architecture (SOA). By building workflows on a middleware layer for data integration, it is a relatively simple exercise to visually design an analytical process for data analysis and then publish it as a service to a web browser. All this is encapsulated into what can be referred to as an 'Embedded Analytics' methodology, which will be described here with examples covering different scientifically focused data analysis problems.

  18. Workflow management in large distributed systems

    Science.gov (United States)

    Legrand, I.; Newman, H.; Voicu, R.; Dobre, C.; Grigoras, C.

    2011-12-01

    The MonALISA (Monitoring Agents using a Large Integrated Services Architecture) framework provides a distributed service system capable of controlling and optimizing large-scale, data-intensive applications. An essential part of managing large-scale, distributed data-processing facilities is a monitoring system for computing facilities, storage, networks, and the very large number of applications running on these systems in near realtime. All this monitoring information gathered for all the subsystems is essential for developing the required higher-level services—the components that provide decision support and some degree of automated decisions—and for maintaining and optimizing workflow in large-scale distributed systems. These management and global optimization functions are performed by higher-level agent-based services. We present several applications of MonALISA's higher-level services including optimized dynamic routing, control, data-transfer scheduling, distributed job scheduling, dynamic allocation of storage resource to running jobs and automated management of remote services among a large set of grid facilities.

  19. EDMS based workflow for Printing Industry

    Directory of Open Access Journals (Sweden)

    Prathap Nayak

    2013-04-01

    Full Text Available Information is an indispensable factor in any enterprise. It can be a record or a document generated for every transaction, kept either on paper or in electronic format for future reference. The printing industry is one in which managing information of various formats, with the latest workflows and technologies, can be a nightmare and a challenge for any operator or user, since every process, from the smallest piece of information to the printed product, depends on the others. Hence the information has to be harmonized artfully in order to avoid production downtime or employees pointing fingers at each other. This paper analyses how the implementation of an Electronic Document Management System (EDMS) could contribute to the printing industry by providing immediate access to stored documents within and across departments, irrespective of geographical boundaries. The paper opens with a brief history and a survey of contemporary EDMS systems, with illustrated examples from a study that chose the library as a pilot area for evaluating EDMS. The paper ends with a proposal that maps several document-management activities for the implementation of EDMS in a printing industry.

  20. Using Make for Reproducible and Parallel Neuroimaging Workflow and Quality Assurance

    Directory of Open Access Journals (Sweden)

    Mary K. Askren

    2016-02-01

    Full Text Available The contribution of this paper is to describe how we can program neuroimaging workflows using Make, a software development tool designed for describing how to build executables from source files. We show that we can achieve many of the features of more sophisticated neuroimaging pipeline systems, including reproducibility, parallelization, fault tolerance, and quality assurance reports. We suggest that Make represents a large step towards these features with only a modest increase in programming demands over shell scripts. This approach reduces the technical skill and time required to write, debug, and maintain neuroimaging workflows in a dynamic environment, where pipelines are often modified to accommodate new best practices or to study the effect of alternative preprocessing steps, and where the underlying packages change frequently. This paper has a comprehensive accompanying manual with lab practicals and examples (see Supplemental Materials), and all data, scripts and makefiles necessary to run the practicals and examples are available in the makepipelines project at NITRC.
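
Make's contribution is a declarative dependency graph: each target is rebuilt after its prerequisites, and each at most once. That core idea can be sketched as a toy resolver; the pipeline targets and recipe names below are hypothetical, not taken from the paper's practicals:

```python
# A toy make-style resolver: build prerequisites first, each target once.
# Targets and recipe names are invented for illustration.
RULES = {
    "brain_mask.nii": (["t1.nii"], "bet"),
    "smoothed.nii": (["brain_mask.nii"], "smooth"),
    "report.html": (["smoothed.nii"], "qa_report"),
}

def build(target, done=None, order=None):
    if done is None:
        done, order = set(), []
    if target in done or target not in RULES:
        return order  # source file, or already built this run
    deps, recipe = RULES[target]
    for dep in deps:
        build(dep, done, order)
    done.add(target)
    order.append(recipe)
    return order
```

Real Make adds what this sketch omits: timestamp comparison (skip up-to-date targets) and `-j` parallel execution of independent branches, which is exactly what the paper exploits for neuroimaging pipelines.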

  1. The Rutgers Workflow Management System: Migrating a Digital Object Management Utility to Open Source

    Directory of Open Access Journals (Sweden)

    Grace Agnew

    2007-12-01

    Full Text Available This article examines the development, architecture, and future plans for the Workflow Management System, software developed by Rutgers University Libraries (RUL to create and catalog digital objects for repository ingest and access. The Workflow Management System (WMS was created as a front-end utility for the Fedora open source repository platform and a vehicle for a flexible, extensible metadata architecture, to serve the information needs of a large university and its collaborators. The next phase of development for the WMS shifted to a re-engineering of the WMS as an open source application. This paper discusses the design and architecture of the WMS, its re-engineering for open source release, remaining issues to be addressed before application release, and future development plans for the WMS.

  2. Approach for workflow modeling using π-calculus

    Institute of Scientific and Technical Information of China (English)

    杨东; 张申生

    2003-01-01

    As a variant of process algebra, π-calculus can describe the interactions between evolving processes. By modeling an activity as a process interacting with other processes through ports, this paper presents a new approach: representing workflow models using π-calculus. As a result, the model can characterize the dynamic behaviors of the workflow process in terms of the LTS (Labelled Transition Semantics) semantics of π-calculus. The main advantage of the workflow model's formal semantics is that it allows for verification of the model's properties, such as deadlock-freedom and normal termination. Moreover, the equivalence of workflow models can be checked through the weak bisimulation theorem in the π-calculus, thus facilitating the optimization of business processes.

  3. Network resource control for grid workflow management systems

    NARCIS (Netherlands)

    Strijkers, R.J.; Cristea, M.; Korkhov, V.; Marchal, D.; Belloum, A.; Laat, C.de; Meijer, R.J.

    2010-01-01

    Grid workflow management systems automate the orchestration of scientific applications with large computational and data processing needs, but lack control over network resources. Consequently, the management system cannot prevent multiple communication intensive applications to compete for network

  4. A Community-Driven Workflow Recommendation and Reuse Infrastructure Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Promote and encourage process and workflow reuse within NASA Earth eXchange (NEX) by developing a proactive recommendation technology based on collective NEX...

  5. Coupling between a multi-physics workflow engine and an optimization framework

    Science.gov (United States)

    Di Gallo, L.; Reux, C.; Imbeaux, F.; Artaud, J.-F.; Owsiak, M.; Saoutic, B.; Aiello, G.; Bernardi, P.; Ciraolo, G.; Bucalossi, J.; Duchateau, J.-L.; Fausser, C.; Galassi, D.; Hertout, P.; Jaboulay, J.-C.; Li-Puma, A.; Zani, L.

    2016-03-01

    A generic coupling method between a multi-physics workflow engine and an optimization framework is presented in this paper. The coupling architecture has been developed in order to preserve the integrity of the two frameworks. The objective is to provide the possibility to replace a framework, a workflow or an optimizer by another one without changing the whole coupling procedure or modifying the main content of each framework. The coupling is achieved by using a socket-based communication library for exchanging data between the two frameworks. Among the algorithms provided by optimization frameworks, Genetic Algorithms (GAs) have demonstrated their efficiency on single- and multiple-criteria optimization. In addition to being robust, GAs can handle invalid data that may appear during the optimization; consequently, GAs work in the most general cases. A parallelized framework has been developed to reduce the time spent on optimizations and the evaluation of large samples. A test has shown good scaling efficiency of this parallelized framework. This coupling method has been applied to the case of SYCOMORE (SYstem COde for MOdeling tokamak REactor), a system code developed in the form of a modular workflow for designing magnetic fusion reactors. The coupling of SYCOMORE with the optimization platform URANIE enables design optimization with respect to various figures of merit and constraints.
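
The flavour of optimiser involved can be sketched in a few lines. This deliberately tiny real-valued GA (binary tournament selection, averaging crossover, Gaussian mutation, elitism) is only an illustration; platforms such as URANIE provide far richer single- and multi-criteria variants, and the evaluated function would be the workflow engine, not a local lambda:

```python
import random

def genetic_minimise(f, lo, hi, pop=20, gens=40, seed=0):
    """Tiny single-variable GA sketch. Parameter choices are illustrative."""
    rng = random.Random(seed)
    xs = [rng.uniform(lo, hi) for _ in range(pop)]
    best = min(xs, key=f)
    for _ in range(gens):
        def pick():
            a, b = rng.choice(xs), rng.choice(xs)
            return a if f(a) < f(b) else b  # binary tournament
        xs = [best] + [  # elitism: carry the incumbent forward unchanged
            min(hi, max(lo, 0.5 * (pick() + pick()) + rng.gauss(0, 0.02 * (hi - lo))))
            for _ in range(pop - 1)
        ]
        best = min(xs, key=f)  # with elitism, best never worsens
    return best
```

Because each candidate evaluation here is a full multi-physics workflow run, the parallelized framework the paper describes evaluates the population concurrently.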

  6. SYRMEP Tomo Project: a graphical user interface for customizing CT reconstruction workflows.

    Science.gov (United States)

    Brun, Francesco; Massimi, Lorenzo; Fratini, Michela; Dreossi, Diego; Billé, Fulvio; Accardo, Agostino; Pugliese, Roberto; Cedola, Alessia

    2017-01-01

    When considering the acquisition of experimental synchrotron radiation (SR) X-ray CT data, the reconstruction workflow cannot be limited to the essential computational steps of flat fielding and filtered back projection (FBP). More refined image processing is often required, usually to compensate for artifacts and enhance the quality of the reconstructed images. In principle, it would be desirable to optimize the reconstruction workflow at the facility during the experiment (beamtime). However, several practical factors affect the image reconstruction part of the experiment and users are likely to conclude the beamtime with sub-optimal reconstructed images. Through an example of application, this article presents SYRMEP Tomo Project (STP), an open-source software tool conceived to let users design custom CT reconstruction workflows. STP has been designed for post-beamtime (off-line) use and for new reconstructions of past archived data at the user's home institution, where simple computing resources are available. Releases of the software can be downloaded at the Elettra Scientific Computing group GitHub repository https://github.com/ElettraSciComp/STP-Gui.

  7. A Web-Based Rapid Prototyping Workflow Management Information System for Computer Repair and Maintenance

    Directory of Open Access Journals (Sweden)

    A. H. El-Mousa

    2008-01-01

    Full Text Available Problem statement: Response to paper-based requests for computer and peripheral repair and maintenance has become very troublesome and slow due to the large demand and expansion at the University of Jordan. The objectives of this study were to: (i) investigate the current system processes associated with the paper-based workflow system; and (ii) design and implement, using a suitable workflow management approach, a totally electronic alternative to improve performance. Approach: The methodology followed in transforming the business processes from paper-based to electronic was the rapid prototyping workflow management approach. This approach is seen to be especially effective in cases where there is direct, continuous interaction and involvement between the different stakeholders during the lifecycle of the project. Results: The system has been implemented and tested, with the result that efficiency, accountability and response time have greatly improved in handling repair and maintenance orders. The transformation from paper to electronic greatly enhanced user satisfaction, especially since the developed system provides automatic feedback regarding order status. Conclusion: The design process and results provide a working blueprint for various similar university-based business processes to be transformed to electronic form easily and quickly. This should highly motivate internal in-house restructuring of business processes to utilize technology and IT to enhance performance and accountability.

  8. Optimization of tomographic reconstruction workflows on geographically distributed resources.

    Science.gov (United States)

    Bicer, Tekin; Gürsoy, Doğa; Kettimuthu, Rajkumar; De Carlo, Francesco; Foster, Ian T

    2016-07-01

    New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources are focused on. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can
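    The three-stage breakdown above (transfer, queue, compute) lends itself to a back-of-the-envelope estimator. The formula below is an illustrative simplification under stated assumptions (fixed bandwidth, known queue wait, perfectly divisible tasks), not the paper's actual performance model.

```python
def estimate_workflow_time(data_gb, bandwidth_gbps, queue_wait_s,
                           n_tasks, task_time_s, n_workers):
    """Rough end-to-end estimate for a distributed reconstruction
    workflow: transfer time + queue wait + parallel compute time.
    All parameters and the formula itself are illustrative."""
    transfer = data_gb * 8 / bandwidth_gbps           # seconds on the wire
    compute = -(-n_tasks // n_workers) * task_time_s  # ceil(tasks/workers) rounds
    return transfer + queue_wait_s + compute

# 100 GB over a 10 Gb/s link, 60 s queue wait,
# 512 slices at 30 s each across 128 workers:
t = estimate_workflow_time(100, 10, 60, 512, 30, 128)
print(t)  # 80 + 60 + 4*30 = 260 seconds
```

    An estimator of this shape is what lets a scheduler compare candidate sites before moving any data.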

  9. Taverna: a tool for building and running workflows of services

    Science.gov (United States)

    Hull, Duncan; Wolstencroft, Katy; Stevens, Robert; Goble, Carole; Pocock, Mathew R.; Li, Peter; Oinn, Tom

    2006-01-01

    Taverna is an application that eases the use and integration of the growing number of molecular biology tools and databases available on the web, especially web services. It allows bioinformaticians to construct workflows or pipelines of services to perform a range of different analyses, such as sequence analysis and genome annotation. These high-level workflows can integrate many different resources into a single analysis. Taverna is available freely under the terms of the GNU Lesser General Public License (LGPL) from . PMID:16845108

  10. A scheduling framework applied to digital publishing workflows

    Science.gov (United States)

    Lozano, Wilson; Rivera, Wilson

    2006-02-01

    This paper presents the advances in developing a dynamic scheduling technique suitable for automating digital publishing workflows. Traditionally, scheduling in digital publishing has been limited to timing criteria. The proposed scheduling strategy takes into account contingency and priority fluctuations. The new scheduling algorithm, referred to as QB-MUF, gives high priority to jobs with low probability of failing according to artifact recognition and workflow modeling criteria. The experimental results show the suitability and efficiency of the scheduling strategy.
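    The core idea, running the jobs least likely to fail first, can be sketched with a priority queue. The real QB-MUF algorithm also weighs artifact-recognition and workflow-modeling criteria; this sketch orders by an estimated failure probability alone, and all job names and probabilities are invented.

```python
import heapq

def order_by_failure_risk(jobs):
    """Return job names ordered so that the lowest estimated
    failure probability runs first (simplified stand-in for the
    QB-MUF prioritization described above)."""
    heap = [(p_fail, name) for name, p_fail in jobs.items()]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[1] for _ in range(len(heap))]

# Hypothetical publishing jobs with estimated failure probabilities.
jobs = {"render": 0.30, "preflight": 0.05, "imposition": 0.12}
print(order_by_failure_risk(jobs))  # ['preflight', 'imposition', 'render']
```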

  11. Dynamic Voltage Frequency Scaling Simulator for Real Workflows Energy-Aware Management in Green Cloud Computing.

    Science.gov (United States)

    Cotes-Ruiz, Iván Tomás; Prado, Rocío P; García-Galán, Sebastián; Muñoz-Expósito, José Enrique; Ruiz-Reyes, Nicolás

    2017-01-01

    Nowadays, the growing computational capabilities of Cloud systems rely on the reduction of the consumed power of their data centers to make them sustainable and economically profitable. The efficient management of computing resources is at the heart of any energy-aware data center, and of special relevance is the adaptation of its performance to workload. Intensive computing applications in diverse areas of science generate complex workloads called workflows, whose successful management in terms of energy saving is still in its infancy. WorkflowSim is currently one of the most advanced simulators for research on workflow processing, offering advanced features such as task clustering and failure policies. In this work, an expected power-aware extension of WorkflowSim is presented. This new tool integrates a power model based on a computing-plus-communication design to allow the optimization of new energy-saving management strategies considering computing, reconfiguration and network costs as well as quality of service, and it incorporates the preeminent strategy for on-host energy saving: Dynamic Voltage Frequency Scaling (DVFS). The simulator is designed to be consistent in different real scenarios and to include a wide repertory of DVFS governors. Results showing the validity of the simulator in terms of resource utilization, frequency and voltage scaling, and power, energy and time saving are presented. Results achieved by the intra-host DVFS strategy with different governors are also compared to those obtained when the data center uses a recent, successful DVFS-based inter-host scheduling strategy overlapped with the intra-host DVFS technique.
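    The energy effect of DVFS can be illustrated with the classic CMOS dynamic-power relation P = C_eff * V^2 * f, the kind of relationship such a simulator integrates. The constants below are purely illustrative, not values from the WorkflowSim extension.

```python
def dynamic_power(c_eff, voltage, freq_hz):
    """Classic CMOS dynamic-power model: P = C_eff * V^2 * f."""
    return c_eff * voltage**2 * freq_hz

def energy_for_cycles(cycles, c_eff, voltage, freq_hz):
    # Energy = power * time, with time = cycles / frequency.
    # Lowering f usually permits lowering V, so energy for a fixed
    # amount of work drops roughly with V^2.
    return dynamic_power(c_eff, voltage, freq_hz) * cycles / freq_hz

full = energy_for_cycles(1e9, 1e-9, 1.2, 2.0e9)    # high V/f operating point
scaled = energy_for_cycles(1e9, 1e-9, 0.9, 1.0e9)  # DVFS-scaled point
print(full, scaled)  # roughly 1.44 J vs 0.81 J for the same work
```

    This is why a governor that scales V and f down during slack periods saves energy even though each task takes longer.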

  12. CDK-Taverna: an open workflow environment for cheminformatics

    Directory of Open Access Journals (Sweden)

    Zielesny Achim

    2010-03-01

    Full Text Available Abstract Background Small molecules are of increasing interest for bioinformatics in areas such as metabolomics and drug discovery. The recent release of large open access chemistry databases generates a demand for flexible tools to process them and discover new knowledge. To freely support open science based on these data resources, it is desirable for the processing tools to be open source and available to everyone. Results Here we describe a novel combination of the workflow engine Taverna and the cheminformatics library Chemistry Development Kit (CDK), resulting in an open source workflow solution for cheminformatics. We have implemented more than 160 different workers to handle specific cheminformatics tasks. We describe the applications of CDK-Taverna in various usage scenarios. Conclusions The combination of the workflow engine Taverna and the Chemistry Development Kit provides the first open source cheminformatics workflow solution for the biosciences. With the Taverna community working towards a more powerful workflow engine and a more user-friendly user interface, CDK-Taverna has the potential to become a free alternative to existing proprietary workflow tools.

  13. CO2 Storage Feasibility: A Workflow for Site Characterisation

    Directory of Open Access Journals (Sweden)

    Nepveu Manuel

    2015-04-01

    Full Text Available In this paper, we present an overview of the SiteChar workflow model for site characterisation and assessment for CO2 storage. Site characterisation and assessment is required when permits are requested from the legal authorities in the process of starting CO2 storage at a given site. The goal is to assess whether a proposed CO2 storage site can indeed be used for permanent storage while meeting the safety requirements demanded by the European Commission (EC) Storage Directive (2009/31/EC). Many issues have to be scrutinised, and the workflow presented here is put forward to help organise this complex task efficiently. Three issues are highlighted: communication within the working team and with the authorities; interdependencies in the workflow and feedback loops; and the risk-based character of the workflow. A general overview (helicopter view) of the workflow is given; the issues involved in communication and the risk assessment process are described in more detail. The workflow as described has been tested within the SiteChar project on five potential storage sites throughout Europe. This resulted in a list of key aspects of site characterisation which can help prepare and focus new site characterisation studies.

  14. A scientific workflow framework for (13)C metabolic flux analysis.

    Science.gov (United States)

    Dalman, Tolga; Wiechert, Wolfgang; Nöh, Katharina

    2016-08-20

    Metabolic flux analysis (MFA) with (13)C labeling data is a high-precision technique to quantify intracellular reaction rates (fluxes). One of the major challenges of (13)C MFA is the interactivity of the computational workflow according to which the fluxes are determined from the input data (metabolic network model, labeling data, and physiological rates). Here, the workflow assembly is inevitably determined by the scientist who has to consider interacting biological, experimental, and computational aspects. Decision-making is context dependent and requires expertise, rendering an automated evaluation process hardly possible. Here, we present a scientific workflow framework (SWF) for creating, executing, and controlling on demand (13)C MFA workflows. (13)C MFA-specific tools and libraries, such as the high-performance simulation toolbox 13CFLUX2, are wrapped as web services and thereby integrated into a service-oriented architecture. Besides workflow steering, the SWF features transparent provenance collection and enables full flexibility for ad hoc scripting solutions. To handle compute-intensive tasks, cloud computing is supported. We demonstrate how the challenges posed by (13)C MFA workflows can be solved with our approach on the basis of two proof-of-concept use cases.

  15. An ultrasound image-guided surgical workflow model

    Science.gov (United States)

    Guo, Bing; Lemke, Heinz; Liu, Brent; Huang, H. K.; Grant, Edward G.

    2006-03-01

    A 2003 report in the Annals of Surgery predicted an increase in demand for surgical services of as much as 14 to 47% in the workload of all surgical fields by 2020. Difficulties that are already apparent in the surgical OR (operating room) will be amplified in the near future, and it is necessary to address this problem and develop strategies to handle the workload. Workflow issues are central to OR efficiency, particularly in light of today's continuing workforce shortages and escalating costs. These issues include inefficient and redundant processes, system inflexibility, ergonomic deficiencies, scattered data, and a lack of guidelines, standards, and organization. The objective of this research is to validate the hypothesis that a workflow model improves the efficiency and quality of surgical procedures. We chose to study the image-guided surgical workflow for ultrasound (US) as a first proof of concept, minimizing the OR workflow issues. We developed and implemented deformable workflow models using existing and projected future clinical environment data, as well as a customized ICT system with seamless integration and real-time availability. A US image-guided surgical workflow (IG SWF) for a specific surgical procedure, the US IG liver biopsy, was studied to identify inefficient and redundant processes and scattered data in clinical systems, and to improve the overall quality of surgical procedures for the patient.

  16. Web API for biology with a workflow navigation system.

    Science.gov (United States)

    Kwon, Yeondae; Shigemoto, Yasumasa; Kuwana, Yoshikazu; Sugawara, Hideaki

    2009-07-01

    DNA Data Bank of Japan (DDBJ) provides Web-based systems for biological analysis, called Web APIs for biology (WABI). So far, we have developed over 20 SOAP services and several workflows that consist of a series of method invocations. In this article, we present newly developed services of WABI, that is, REST-based Web services, additional workflows and a workflow navigation system. Each Web service and workflow can be used as a complete service or a building block for programmers to construct more complex information processing systems. The workflow navigation system aims to help non-programming biologists perform analysis tasks by providing next applicable services on Web browsers according to the output of a previously selected service. With this function, users can apply multiple services consecutively only by following links without any programming or manual copy-and-paste operations on Web browsers. The listed services are determined automatically by the system referring to the dictionaries of service categories, the input/output types of services and HTML tags. WABI and the workflow navigation system are freely accessible at http://www.xml.nig.ac.jp/index.html and http://cyclamen.ddbj.nig.ac.jp/, respectively.
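    The navigation mechanism, determining the next applicable services from dictionaries of service input/output types, can be sketched as follows. The service names and type labels are invented for illustration and are not the actual WABI catalogue.

```python
# Each service declares the type it consumes and the type it produces.
# The "next applicable" services after a step are those whose input
# type matches the previous step's output type.
SERVICES = {
    "blast_search":   {"in": "sequence",  "out": "hit_list"},
    "fetch_entry":    {"in": "accession", "out": "sequence"},
    "clustalw_align": {"in": "sequence",  "out": "alignment"},
    "tree_builder":   {"in": "alignment", "out": "tree"},
}

def next_applicable(last_output_type):
    """List services that can consume the previous step's output."""
    return sorted(name for name, svc in SERVICES.items()
                  if svc["in"] == last_output_type)

print(next_applicable("sequence"))   # ['blast_search', 'clustalw_align']
print(next_applicable("alignment"))  # ['tree_builder']
```

    Rendering these candidates as links is what lets a non-programming user chain services by clicking rather than scripting.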

  17. BReW: Blackbox Resource Selection for e-Science Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh [Univ. of Southern California, Los Angeles, CA (United States); Soroush, Emad [Univ. of Washington, Seattle, WA (United States); Van Ingen, Catharine [Microsoft Research, San Francisco, CA (United States); Agarwal, Deb [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ramakrishnan, Lavanya [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2010-10-04

    Workflows are commonly used to model data-intensive scientific analysis. As computational resource needs increase for eScience, emerging platforms like clouds present additional resource choices for scientists and policy makers. We introduce BReW, a tool that enables users to make rapid, high-level platform selections for their workflows using limited workflow knowledge. This helps make informed decisions on whether to port a workflow to a new platform. Our analysis of synthetic and real eScience workflows shows that, using just total runtime length, maximum task fanout, and total data used and produced by the workflow, BReW can provide platform predictions comparable to whitebox models with detailed workflow knowledge.
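    As a rough illustration of black-box selection from only the three features named above (total runtime, maximum task fan-out, total data), the toy estimator below compares cost and turnaround for a cloud target. The formula and pricing constants are assumptions for the sketch, not BReW's actual model.

```python
def platform_estimate(runtime_s, max_fanout, data_gb,
                      rate_per_cpu_hr=0.10, egress_per_gb=0.09):
    """Toy cloud cost/turnaround estimate from the three black-box
    workflow features (hypothetical pricing and an ideal-speedup
    assumption; not BReW's model)."""
    cpu_hours = runtime_s / 3600 * max_fanout   # peak-parallel upper bound
    cost = cpu_hours * rate_per_cpu_hr + data_gb * egress_per_gb
    wall_clock = runtime_s / max_fanout         # ideal parallel speedup
    return {"cost_usd": round(cost, 2), "wall_clock_s": wall_clock}

# A 2-hour serial workflow with fan-out 16 moving 50 GB:
print(platform_estimate(runtime_s=7200, max_fanout=16, data_gb=50))
```

    Comparing such estimates across platforms is the kind of rapid, high-level decision support the tool targets.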

  18. DNA qualification workflow for next generation sequencing of histopathological samples.

    Directory of Open Access Journals (Sweden)

    Michele Simbolo

    Full Text Available Histopathological samples are a treasure-trove of DNA for clinical research. However, the quality of DNA can vary depending on the source or extraction method applied. Thus a standardized and cost-effective workflow for the qualification of DNA preparations is essential to guarantee interlaboratory reproducible results. The qualification process consists of the quantification of double strand DNA (dsDNA) and the assessment of its suitability for downstream applications, such as high-throughput next-generation sequencing. We tested the two most frequently used instrumentations to define their role in this process: NanoDrop, based on UV spectroscopy, and Qubit 2.0, which uses fluorochromes specifically binding dsDNA. Quantitative PCR (qPCR) was used as the reference technique as it simultaneously assesses DNA concentration and suitability for PCR amplification. We used 17 genomic DNAs from 6 fresh-frozen (FF) tissues, 6 formalin-fixed paraffin-embedded (FFPE) tissues, 3 cell lines, and 2 commercial preparations. Intra- and inter-operator variability was negligible, and intra-methodology variability was minimal, while consistent inter-methodology divergences were observed. In fact, NanoDrop measured DNA concentrations higher than Qubit and its consistency with dsDNA quantification by qPCR was limited to high molecular weight DNA from FF samples and cell lines, where total DNA and dsDNA quantity virtually coincide. In partially degraded DNA from FFPE samples, only Qubit proved highly reproducible and consistent with qPCR measurements. Multiplex PCR amplifying 191 regions of 46 cancer-related genes was designated the downstream application, using 40 ng dsDNA from FFPE samples calculated by Qubit. All but one sample produced amplicon libraries suitable for next-generation sequencing. NanoDrop UV-spectrum verified contamination of the unsuccessful sample. In conclusion, as qPCR has high costs and is labor intensive, an alternative effective standard

  19. DNA qualification workflow for next generation sequencing of histopathological samples.

    Science.gov (United States)

    Simbolo, Michele; Gottardi, Marisa; Corbo, Vincenzo; Fassan, Matteo; Mafficini, Andrea; Malpeli, Giorgio; Lawlor, Rita T; Scarpa, Aldo

    2013-01-01

    Histopathological samples are a treasure-trove of DNA for clinical research. However, the quality of DNA can vary depending on the source or extraction method applied. Thus a standardized and cost-effective workflow for the qualification of DNA preparations is essential to guarantee interlaboratory reproducible results. The qualification process consists of the quantification of double strand DNA (dsDNA) and the assessment of its suitability for downstream applications, such as high-throughput next-generation sequencing. We tested the two most frequently used instrumentations to define their role in this process: NanoDrop, based on UV spectroscopy, and Qubit 2.0, which uses fluorochromes specifically binding dsDNA. Quantitative PCR (qPCR) was used as the reference technique as it simultaneously assesses DNA concentration and suitability for PCR amplification. We used 17 genomic DNAs from 6 fresh-frozen (FF) tissues, 6 formalin-fixed paraffin-embedded (FFPE) tissues, 3 cell lines, and 2 commercial preparations. Intra- and inter-operator variability was negligible, and intra-methodology variability was minimal, while consistent inter-methodology divergences were observed. In fact, NanoDrop measured DNA concentrations higher than Qubit and its consistency with dsDNA quantification by qPCR was limited to high molecular weight DNA from FF samples and cell lines, where total DNA and dsDNA quantity virtually coincide. In partially degraded DNA from FFPE samples, only Qubit proved highly reproducible and consistent with qPCR measurements. Multiplex PCR amplifying 191 regions of 46 cancer-related genes was designated the downstream application, using 40 ng dsDNA from FFPE samples calculated by Qubit. All but one sample produced amplicon libraries suitable for next-generation sequencing. NanoDrop UV-spectrum verified contamination of the unsuccessful sample. In conclusion, as qPCR has high costs and is labor intensive, an alternative effective standard workflow for

  20. Design and implementation of a dynamic adjustment workflow engine based on GERT networks

    Institute of Scientific and Technical Information of China (English)

    Wu Bing; Leng Wenhao; Zhang Yan; Zhou Bin

    2011-01-01

    To cope with randomness in workflow process execution, a design method for a dynamic adjustment workflow engine based on GERT networks is presented. The Hikey data management system and signal flow graphs are first introduced, and the analytical algorithm of the GERT network is investigated. Taking the document-issuing process of the Shanghai Association for Science and Technology as an example, execution of the process is tracked by the Hikey data management system, and a GERT network model is established and solved to obtain the range of time consumed during process execution. Based on these results, the system administrator can set intervention times to monitor and manage the workflow in real time, redesigning or modifying the process definition according to the actual run-time state of the process. Experimental results show that the design effectively improves the flexibility, dynamic extensibility and degree of automation of business processes.

  1. The CESM Workflow Re-Engineering Project

    Science.gov (United States)

    Strand, G.

    2015-12-01

    The Community Earth System Model (CESM) Workflow Re-Engineering Project is a collaborative project between the CESM Software Engineering Group (CSEG) and the NCAR Computation and Information Systems Lab (CISL) Application Scalability and Performance (ASAP) Group to revamp how CESM saves its output. The CMIP3, and particularly CMIP5, experiences in submitting CESM data to those intercomparison projects revealed that the output format of the CESM is not well-suited for the data requirements common to model intercomparison projects. CESM, for efficiency reasons, creates output files containing all fields for each model time sampling, but MIPs require individual files for each field comprising all model time samples. This transposition of model output can be very time-consuming; depending on the volume of data written by the specific simulation, the time to re-orient the data can be comparable to the time required for the simulation to complete. Previous strategies included using serial tools to perform this transposition, but these are now far too inefficient to deal with the many terabytes of output a single simulation can generate. A new set of Python tools, using data parallelism, has been written to enable this re-orientation, and has achieved markedly improved I/O performance. The perspective of a data manager/data producer on the use of these new tools is presented, and likely future work on their development and use will be shown. These tools are a critical part of the NCAR CESM submission to the upcoming CMIP6, with the intention that a much more timely and efficient submission of the expected petabytes of data will be accomplished in the given time frame.
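    The transposition itself, turning per-time-step files that hold all fields into one time series per field, is conceptually simple; the in-memory sketch below shows the re-orientation. The production tools do this in parallel across NetCDF files, and the field names and values here are invented.

```python
import numpy as np

# Model output arrives as one "file" (dict) per time step containing
# every field; MIPs want one array per field spanning all time steps.
time_slices = [
    {"T": np.array([288.1, 287.9]), "PS": np.array([1013.2, 1011.8])},
    {"T": np.array([288.4, 288.0]), "PS": np.array([1012.9, 1012.1])},
]

def transpose_output(slices):
    """Re-orient time-slice records into per-field time series."""
    fields = slices[0].keys()
    return {f: np.stack([s[f] for s in slices]) for f in fields}

series = transpose_output(time_slices)
print(series["T"].shape)  # (2, 2): time x space, one array per field
```

    The expensive part at scale is not this loop but the I/O pattern, which is why the parallel Python tools were needed.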

  2. Next-generation sequencing meets genetic diagnostics: development of a comprehensive workflow for the analysis of BRCA1 and BRCA2 genes

    Science.gov (United States)

    Feliubadaló, Lídia; Lopez-Doriga, Adriana; Castellsagué, Ester; del Valle, Jesús; Menéndez, Mireia; Tornero, Eva; Montes, Eva; Cuesta, Raquel; Gómez, Carolina; Campos, Olga; Pineda, Marta; González, Sara; Moreno, Victor; Brunet, Joan; Blanco, Ignacio; Serra, Eduard; Capellá, Gabriel; Lázaro, Conxi

    2013-01-01

    Next-generation sequencing (NGS) is changing genetic diagnosis due to its huge sequencing capacity and cost-effectiveness. The aim of this study was to develop an NGS-based workflow for routine diagnostics for hereditary breast and ovarian cancer syndrome (HBOCS), to improve genetic testing for BRCA1 and BRCA2. An NGS-based workflow was designed using BRCA MASTR kit amplicon libraries followed by GS Junior pyrosequencing. Data analysis combined the freely available Variant Identification Pipeline software and ad hoc R scripts, including a cascade of filters to generate coverage and variant calling reports. A BRCA homopolymer assay was performed in parallel. A research scheme was designed in two parts. A Training Set of 28 DNA samples containing 23 unique pathogenic mutations and 213 other variants (33 unique) was used. The workflow was validated in a set of 14 samples from HBOCS families in parallel with the current diagnostic workflow (Validation Set). The NGS-based workflow developed permitted the identification of all pathogenic mutations and genetic variants, including those located in or close to homopolymers. The use of NGS for detecting copy-number alterations was also investigated. The workflow meets the sensitivity and specificity requirements for the genetic diagnosis of HBOCS and improves on the cost-effectiveness of current approaches. PMID:23249957
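    A cascade of filters feeding coverage and variant-calling reports, as mentioned above, can look like the sketch below. The thresholds, record fields, and genomic positions are invented for illustration; they are not the published pipeline's values.

```python
# Hypothetical thresholds for a two-stage filter cascade.
MIN_COVERAGE = 30     # minimum read depth at the position
MIN_VAR_FREQ = 0.20   # minimum variant allele frequency

variants = [
    {"pos": "17:41245466", "cov": 512, "freq": 0.48},  # passes both filters
    {"pos": "13:32906729", "cov": 18,  "freq": 0.51},  # fails: low coverage
    {"pos": "17:41244936", "cov": 210, "freq": 0.04},  # fails: likely noise
]

def filter_cascade(calls):
    """Apply depth and allele-frequency filters in sequence."""
    kept = [v for v in calls if v["cov"] >= MIN_COVERAGE]
    kept = [v for v in kept if v["freq"] >= MIN_VAR_FREQ]
    return kept

print([v["pos"] for v in filter_cascade(variants)])  # ['17:41245466']
```

    In a diagnostic setting each rejected call would also be logged, so the coverage report explains why a position was excluded.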

  3. The Workflow Management Specification Overview

    Institute of Scientific and Technical Information of China (English)

    Chen Chang; Wu Zhaohui

    2000-01-01

    Workflow management is a fast-evolving technology, and many software vendors have WFM products available in the market today. To enable interoperability between heterogeneous workflow products and to improve the integration of workflow applications with other IT services, it is necessary to work out common specifications. The purpose of this paper is to provide a framework of specifications for implementation in workflow products developed by the WFM Coalition. It provides a common "Reference Model" for workflow management systems.

  4. Design of an enterprise energy-conservation and emission-reduction project management system based on workflow and rapid development platform technology

    Institute of Scientific and Technical Information of China (English)

    Ding Jinquan; Ding Chen

    2015-01-01

    A typical model of the energy-conservation and emission-reduction project management process was built on workflow technology, separating the business logic from the specific functions of project management. This realizes unified, orderly project management and enhances the flexibility and adaptability of the system. A rapid development platform was chosen for implementing the business system. The platform provides a parameter-customization development mode covering intelligent reports, data maintenance, MVC business control and other parameters. Compared with the traditional programming development mode, it allows complex, high-quality business systems to be developed conveniently and quickly.

  5. Improving access to space weather data via workflows and web services

    Science.gov (United States)

    Sundaravel, Anu Swapna

    The Space Physics Interactive Data Resource (SPIDR) is a web-based interactive tool developed by NOAA's National Geophysical Data Center to provide access to historical space physics datasets. These data sets are widely used by physicists for space weather modeling and predictions. Built on a distributed network of databases and application servers, SPIDR offers services in two ways: via a web page interface and via a web service interface. SPIDR exposes several SOAP-based web services that client applications implement to connect to a number of data sources for data download and processing. At present, the usage of the web services has been difficult, adding unnecessary complexity to client applications and inconvenience to the scientists who want to use these datasets. The purpose of this study focuses on improving SPIDR's web interface to better support data access, integration and display. This is accomplished in two ways: (1) examining the needs of scientists to better understand what web services they require to better access and process these datasets and (2) developing a client application to support SPIDR's SOAP-based services using the Kepler scientific workflow system. To this end, we identified, designed and developed several web services for filtering the existing datasets and created several Kepler workflows to automate routine tasks associated with these datasets. These workflows are a part of the custom NGDC build of the Kepler tool. Scientists are already familiar with Kepler due to its extensive use in this domain. As a result, this approach provides them with tools that are less daunting than raw web services and ultimately more useful and customizable. We evaluated our work by interviewing various scientists who make use of SPIDR and having them use the developed Kepler workflows while recording their feedback and suggestions. Our work has improved SPIDR such that new web services are now available and scientists have access to a desktop
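    A filtering web service of the kind developed in this work reduces a dataset before download. The pure-Python stand-in below mimics such a filter; the record layout and field names are hypothetical, not SPIDR's actual schema.

```python
# Invented space-weather records: timestamp plus a Kp geomagnetic index.
records = [
    {"time": "2003-10-29T06:00", "kp": 9},
    {"time": "2003-10-30T12:00", "kp": 7},
    {"time": "2003-11-02T00:00", "kp": 4},
]

def filter_by_day(rows, start, end):
    """Keep records whose date (YYYY-MM-DD prefix) falls in [start, end].
    ISO-8601 dates compare correctly as strings."""
    return [r for r in rows if start <= r["time"][:10] <= end]

storm_window = filter_by_day(records, "2003-10-29", "2003-10-31")
print(len(storm_window))  # 2
```

    Wrapping a function like this as a web service, and then as a Kepler actor, is the pattern the study follows for the real datasets.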

  6. Improving the digital workflow: direct transfer from patient to virtual articulator.

    Science.gov (United States)

    Solaberrieta, Eneko; Minguez, Rikardo; Etxaniz, Olatz; Barrenetxea, Lander

    2013-01-01

    When designing a custom-made dental restoration, using a digital workflow represents an important advance over mechanical tools such as facebows or mechanical articulators. When using virtual scanning procedures, there is a direct transfer from the patient to the articulator. This paper presents a novel methodology to design custom-made restorations. This new approach permits the transfer of all data directly from the patient to the virtual articulator, always taking into account the kinematics of the mandible. Rapid further developments can be expected in the near future.

  7. Visual Workflows for Oil and Gas Exploration

    KAUST Repository

    Hollt, Thomas

    2013-04-14

    The most important resources for fulfilling today's energy demands are fossil fuels, such as oil and natural gas. When exploiting hydrocarbon reservoirs, a detailed and credible model of the subsurface structures is crucial for planning the path of the borehole and minimizing economic and ecological risks. Before that, the placement as well as the operations of oil rigs need to be planned carefully, as off-shore oil exploration is vulnerable to hazards caused by strong currents. The oil and gas industry therefore relies on accurate ocean forecasting systems for planning its operations. This thesis presents visual workflows for creating subsurface models as well as planning the placement and operations of off-shore structures. Creating a credible subsurface model poses two major challenges: first, the structures in highly ambiguous seismic data are interpreted in the time domain; second, a velocity model has to be built from this interpretation to match the model to depth measurements from wells. If it is not possible to obtain a match at all positions, the interpretation has to be updated, going back to the first step. This results in a lengthy back and forth between the different steps, or, in many cases, in an unphysical velocity model. We present a novel, integrated approach to interactively creating subsurface models from reflection seismics, integrating the interpretation of the seismic data, using an interactive horizon extraction technique based on piecewise global optimization, with velocity modeling. Computing and visualizing on the fly the effects of changes to the interpretation and velocity model on the depth-converted model enables an integrated feedback loop that creates a completely new connection between the seismic data in the time domain and well data in the depth domain. For planning the operations of off-shore structures we present a novel integrated visualization system that enables interactive visual analysis of ensemble simulations used in ocean

  8. ANALYSIS OF WORKFLOW ON DESIGN PROJECTS IN INDIA

    OpenAIRE

    Senthilkumar Venkatachalam; Koshy Varghese

    2010-01-01

    Proposal: The increase in privately funded infrastructure construction in India had compelled project owners to demand highly compressed project schedules due to political risks and early revenue generation. As a result, many of the contracts are based on EPC (Engineering Procurement and Construction) contract enabling the contractor to plan and control the EPC phases. Sole responsibility for the three phases has facilitated the use of innovative approaches such as fast-track construction an...

  9. Scientific Workflows + Provenance = Better (Meta-)Data Management

    Science.gov (United States)

    Ludaescher, B.; Cuevas-Vicenttín, V.; Missier, P.; Dey, S.; Kianmajd, P.; Wei, Y.; Koop, D.; Chirigati, F.; Altintas, I.; Belhajjame, K.; Bowers, S.

    2013-12-01

    The origin and processing history of an artifact is known as its provenance. Data provenance is an important form of metadata that explains how a particular data product came about, e.g., how and when it was derived in a computational process, which parameter settings and input data were used, etc. Provenance information provides transparency and helps to explain and interpret data products. Other common uses and applications of provenance include quality control, data curation, result debugging, and more generally, 'reproducible science'. Scientific workflow systems (e.g. Kepler, Taverna, VisTrails, and others) provide controlled environments for developing computational pipelines with built-in provenance support. Workflow results can then be explained in terms of workflow steps, parameter settings, input data, etc. using provenance that is automatically captured by the system. Scientific workflows themselves provide a user-friendly abstraction of the computational process and are thus a form of ('prospective') provenance in their own right. The full potential of provenance information is realized when combining workflow-level information (prospective provenance) with trace-level information (retrospective provenance). To this end, the DataONE Provenance Working Group (ProvWG) has developed an extension of the W3C PROV standard, called D-PROV. Whereas PROV provides a 'least common denominator' for exchanging and integrating provenance information, D-PROV adds new 'observables' that describe workflow-level information (e.g., the functional steps in a pipeline), as well as workflow-specific trace-level information (timestamps for each workflow step executed, the inputs and outputs used, etc.). Using examples, we will demonstrate how the combination of prospective and retrospective provenance provides added value in managing scientific data.
The DataONE ProvWG is also developing tools based on D-PROV that allow scientists to get more mileage from provenance metadata
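
The pairing of prospective and retrospective provenance that D-PROV enables can be illustrated with a small sketch. All structures and names below are invented for illustration; they are not the actual D-PROV vocabulary:

```python
# Prospective provenance: the workflow definition itself (steps and dataflow).
workflow = {
    "steps": ["ingest", "filter", "plot"],
    "edges": [("ingest", "filter"), ("filter", "plot")],
}

# Retrospective provenance: one record per executed step, with a timestamp
# and the concrete inputs/outputs observed at run time.
trace = [
    {"step": "ingest", "t": 0, "inputs": ["raw.csv"], "outputs": ["clean.csv"]},
    {"step": "filter", "t": 1, "inputs": ["clean.csv"], "outputs": ["subset.csv"]},
    {"step": "plot", "t": 2, "inputs": ["subset.csv"], "outputs": ["fig1.png"]},
]

def explain(product, trace):
    """Walk the trace backwards to collect every step that led to a product."""
    lineage, wanted = [], {product}
    for rec in reversed(trace):
        if wanted & set(rec["outputs"]):
            lineage.append(rec["step"])
            wanted |= set(rec["inputs"])
    return list(reversed(lineage))

history = explain("fig1.png", trace)  # -> ["ingest", "filter", "plot"]
```

Combining the two views lets a system answer both "what is this workflow supposed to do?" (prospective) and "how exactly was this file produced?" (retrospective).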

  10. Provenance-based refresh in data-oriented workflows

    KAUST Repository

    Ikeda, Robert

    2011-01-01

    We consider a general workflow setting in which input data sets are processed by a graph of transformations to produce output results. Our goal is to perform efficient selective refresh of elements in the output data, i.e., compute the latest values of specific output elements when the input data may have changed. We explore how data provenance can be used to enable efficient refresh. Our approach is based on capturing one-level data provenance at each transformation when the workflow is run initially. Then at refresh time provenance is used to determine (transitively) which input elements are responsible for given output elements, and the workflow is rerun only on that portion of the data needed for refresh. Our contributions are to formalize the problem setting and the problem itself, to specify properties of transformations and provenance that are required for efficient refresh, and to provide algorithms that apply to a wide class of transformations and workflows. We have built a prototype system supporting the features and algorithms presented in the paper. We report preliminary experimental results on the overhead of provenance capture, and on the crossover point between selective refresh and full workflow recomputation. © 2011 ACM.
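
The core idea of one-level provenance capture followed by selective refresh can be sketched as a toy model. The data structures below are assumptions made for the example; they assume an element-wise transformation (one output per input), whereas the paper formalizes a much wider class of transformations:

```python
# Toy model of provenance-based selective refresh.

def capture_provenance(transform, inputs):
    """Run the transformation, recording one-level provenance:
    for each output element, the input elements it was derived from."""
    outputs, provenance = [], {}
    for inp in inputs:
        out = transform(inp)
        outputs.append(out)
        provenance[out] = {inp}
    return outputs, provenance

def refresh(transform, provenance, changed):
    """Selective refresh: `changed` maps an old input element to its new
    value; only outputs whose provenance mentions a changed input are rerun."""
    updates = {}
    for out, sources in provenance.items():
        for inp in sources:
            if inp in changed:
                updates[out] = transform(changed[inp])
    return updates

double = lambda x: 2 * x
outs, prov = capture_provenance(double, [1, 2, 3])  # outs == [2, 4, 6]
updates = refresh(double, prov, {2: 5})             # only one element is rerun
```

The crossover the paper measures is between this kind of selective rerun and simply recomputing the whole workflow when the changed fraction of the input grows large.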

  11. Development of the workflow kine systems for support on KAIZEN.

    Science.gov (United States)

    Mizuno, Yuki; Ito, Toshihiko; Yoshikawa, Toru; Yomogida, Satoshi; Morio, Koji; Sakai, Kazuhiro

    2012-01-01

    In this paper, we introduce a new workflow line system consisting of location and image recording, which enables the acquisition of workflow information and its analysis and display. From the results of the workflow line investigation, we considered the anticipated effects and the problems for KAIZEN. Workflow line information includes location information and action content information. These technologies suggest viewpoints that help improvement, for example, the exclusion of useless movement, the redesign of layout, and the review of work procedures. In a manufacturing factory, it was clear that there was much movement away from the standard operation place and accumulated residence time. As a concrete result of this investigation, a more efficient layout was suggested by this system. In the case of the hospital, similarly, it was pointed out that the workflow has problems of layout and setup operations, based on the effective movement patterns of the experts. This system can adapt to non-routine work as well as routine work. Through the development of this system, which can fit and adapt to industrial diversification, more effective "visual management" (visualization of work) is expected in the future.

  12. Distributing Workflows over a Ubiquitous P2P Network

    Directory of Open Access Journals (Sweden)

    Eddie Al-Shakarchi

    2007-01-01

    Full Text Available This paper discusses issues in the distribution of bundled workflows across ubiquitous peer-to-peer networks for the application of music information retrieval. The underlying motivation for this work is provided by the DART project, which aims to develop a novel music recommendation system by gathering statistical data using collaborative filtering techniques and the analysis of the audio itself, in order to create a reliable and comprehensive database of the music that people own and which they listen to. To achieve this, the DART scientists creating the algorithms need the ability to distribute the Triana workflows they create, representing the analysis to be performed, across the network on a regular basis (perhaps even daily) in order to update the network as a whole with new workflows to be executed for the analysis. DART uses a similar approach to BOINC but differs in that the workers receive input data in the form of a bundled Triana workflow, which is executed in order to process any MP3 files that they own on their machine. Once analysed, the results are returned to DART's distributed database, which collects and aggregates the resulting information. DART employs the use of package repositories to decentralise the distribution of such workflow bundles, and this approach is validated in this paper through simulations showing that suitable scalability is maintained as the number of participants increases. The results clearly illustrate the effectiveness of the approach.

  13. Optimizing high performance computing workflow for protein functional annotation.

    Science.gov (United States)

    Stanberry, Larissa; Rekepalli, Bhanu; Liu, Yuan; Giblock, Paul; Higdon, Roger; Montague, Elizabeth; Broomall, William; Kolker, Natali; Kolker, Eugene

    2014-09-10

    Functional annotation of newly sequenced genomes is one of the major challenges in modern biology. With modern sequencing technologies, the protein sequence universe is rapidly expanding. Newly sequenced bacterial genomes alone contain over 7.5 million proteins. The rate of data generation has far surpassed that of protein annotation. The volume of protein data makes manual curation infeasible, whereas a high compute cost limits the utility of existing automated approaches. In this work, we present an improved and optimized automated workflow to enable large-scale protein annotation. The workflow uses high performance computing architectures and a low complexity classification algorithm to assign proteins into existing clusters of orthologous groups of proteins. On the basis of the Position-Specific Iterative Basic Local Alignment Search Tool, the algorithm ensures at least 80% specificity and sensitivity of the resulting classifications. The workflow utilizes highly scalable parallel applications for classification and sequence alignment. Using Extreme Science and Engineering Discovery Environment supercomputers, the workflow processed 1,200,000 newly sequenced bacterial proteins. With the rapid expansion of the protein sequence universe, the proposed workflow will enable scientists to annotate big genome data.

  14. Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows

    Science.gov (United States)

    Wilson, B. D.; Ramachandran, R.; Lynnes, C.

    2009-05-01

    A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community an unprecedented opportunity to collaborate by sharing their workflows with one another, reproducing and analyzing research results, and leveraging colleagues' expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions have come together to improve community collaboration in science analysis by developing a customizable "software appliance" to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like Finnish "talkoot" (a barn raising), to build a shared research space. Talkoot extends a freely available, open source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a "science story" in shareable web notebooks, including plots or animations, backed up by an executable workflow that directly reproduces the science analysis. 
New services and workflows of interest will be

  15. A HYBRID PETRI-NET MODEL OF GRID WORKFLOW

    Institute of Scientific and Technical Information of China (English)

    Ji Yimu; Wang Ruchuan; Ren Xunyi

    2008-01-01

    In order to effectively control the random tasks submitted and executed in a grid workflow, a grid workflow model based on hybrid Petri nets is presented. This model is composed of a random Petri net, a colored Petri net and a general Petri net. The random Petri net describes the relationship between the number of grid users' random tasks and the size of the service window, and computes the service intensity of the grid system. The colored Petri net sets a different color for places with grid services and provides valid interfaces for grid resource allocation and task scheduling. The experiment indicated that the model presented in this letter can compute the threshold between the number of users' random tasks and the size of the grid service window in a grid workflow management system.
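
The ordinary Petri net at the core of such a hybrid model reduces to a marking plus a firing rule, which a short sketch makes concrete. The class design and the task-queue example are illustrative assumptions, not taken from the paper:

```python
# Minimal firing rule for an ordinary Petri net, the common core of the
# hybrid (random + colored + general) model described above.

class PetriNet:
    def __init__(self, marking, transitions):
        # marking: place -> token count
        # transitions: name -> (consumed: {place: n}, produced: {place: n})
        self.marking = dict(marking)
        self.transitions = transitions

    def enabled(self, name):
        consumed, _ = self.transitions[name]
        return all(self.marking.get(p, 0) >= n for p, n in consumed.items())

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(f"transition {name!r} is not enabled")
        consumed, produced = self.transitions[name]
        for p, n in consumed.items():
            self.marking[p] -= n
        for p, n in produced.items():
            self.marking[p] = self.marking.get(p, 0) + n

# A task queue feeding a service window of size 1:
net = PetriNet(
    marking={"queued": 2, "window": 1},
    transitions={"dispatch": ({"queued": 1, "window": 1}, {"running": 1})},
)
net.fire("dispatch")
# marking is now {"queued": 1, "window": 0, "running": 1}
```

Once the window token is consumed, `dispatch` is no longer enabled: this is exactly the mechanism by which the service-window size throttles random task admission.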

  16. Dynamic Service Selection in Workflows Using Performance Data

    Directory of Open Access Journals (Sweden)

    David W. Walker

    2007-01-01

    Full Text Available An approach to dynamic workflow management and optimisation using near-realtime performance data is presented. Strategies are discussed for choosing an optimal service (based on user-specified criteria) from several semantically equivalent Web services. Such an approach may involve finding "similar" services by first pruning the set of discovered services based on service metadata, and subsequently selecting an optimal service based on performance data. The current implementation of the prototype workflow framework is described and demonstrated with a simple workflow. Performance results are presented that show the performance benefits of dynamic service selection. A statistical analysis based on the first order statistic is used to investigate the likely improvement in service response time arising from dynamic service selection.
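
The two-stage selection described above (prune on metadata, then choose on performance data) can be sketched in a few lines. The tag-based metadata filter and the timing records below are assumptions made for the example, not the framework's actual API:

```python
# Dynamic service selection: filter semantically equivalent services by
# required metadata, then pick the one with the best recent response time.

def select_service(services, required_tags, timings):
    """services: name -> set of metadata tags
    timings: name -> list of recent response times (seconds)."""
    candidates = [s for s, tags in services.items() if required_tags <= tags]
    if not candidates:
        return None
    # choose the candidate with the lowest mean observed response time
    return min(candidates, key=lambda s: sum(timings[s]) / len(timings[s]))

services = {"svcA": {"resample"}, "svcB": {"resample", "grid"}, "svcC": {"grid"}}
timings = {"svcA": [0.9, 1.1], "svcB": [0.4, 0.6], "svcC": [0.2]}
best = select_service(services, {"resample"}, timings)  # -> "svcB"
```

Replacing the mean with an order statistic over the recent samples would mirror the paper's statistical analysis of likely response-time improvement.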

  17. A Hybrid Authorization Model For Project-Oriented Workflow

    Institute of Scientific and Technical Information of China (English)

    Zhang Xiaoguang(张晓光); Cao Jian; Zhang Shensheng

    2003-01-01

    In the context of workflow systems, a security-relevant aspect is the assignment of activities to (human or automated) agents. This paper intends to cast light on the management of project-oriented workflow. A comprehensive authorization model is proposed from the perspective of project management. In this model, the concepts of activity decomposition and team are introduced, which improves the security of conventional role-based access control. Furthermore, policy is provided to define static and dynamic constraints such as Separation of Duty (SoD). Validity of constraints is proposed to provide fine-grained assignment, which improves the performance of policy management. The model is applicable not only to project-oriented workflow applications but also to other teamwork environments such as the virtual enterprise.
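
A static Separation of Duty constraint of the kind such a policy layer defines can be checked with a minimal sketch. The role names, activity names, and function signatures are invented for illustration; they are not the paper's model:

```python
# Static SoD: no user may hold two mutually exclusive roles, and an
# activity may only be assigned to a user via a role that user holds.

def violates_sod(user_roles, exclusive_pairs):
    roles = set(user_roles)
    return any(a in roles and b in roles for a, b in exclusive_pairs)

def can_assign(activity, user, role_of_activity, user_roles):
    """A user may be assigned an activity only via a role they hold."""
    return role_of_activity[activity] in user_roles[user]

exclusive = [("designer", "reviewer")]  # SoD: designer and reviewer must differ
user_roles = {"alice": {"designer"}, "bob": {"reviewer"}}
role_of_activity = {"design_part": "designer", "review_part": "reviewer"}

assert not violates_sod(user_roles["alice"], exclusive)
ok = can_assign("review_part", "bob", role_of_activity, user_roles)  # True
```

Dynamic SoD would additionally consult the execution history (who actually performed the design activity) rather than only the static role assignments.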

  18. Hybrid Workflow Policy Management for Heart Disease Identification

    Directory of Open Access Journals (Sweden)

    Dong-Hyun Kim

    2009-12-01

    Full Text Available As science technology grows, medical applications are becoming more complex to solve physiological problems within the expected time. Workflow management systems (WMS) in Grid computing are a promising solution to sophisticated problems such as genomic analysis, drug discovery, disease identification, etc. Although existing WMS can provide basic management functionality in a Grid environment, consideration of user requirements such as performance, reliability and interaction with the user is missing. In this paper, we propose a hybrid workflow management system for heart disease identification and discuss how to guarantee different user requirements according to the user SLA. The proposed system is applied to the Physio-Grid e-health platform to identify human heart disease with ECG analysis and Virtual Heart Simulation (VHS) workflow applications.

  19. Hybrid Workflow Policy Management for Heart Disease Identification

    CERN Document Server

    Kim, Dong-Hyun; Youn, Chan-Hyun

    2010-01-01

    As science technology grows, medical applications are becoming more complex to solve physiological problems within the expected time. Workflow management systems (WMS) in Grid computing are a promising solution to sophisticated problems such as genomic analysis, drug discovery, disease identification, etc. Although existing WMS can provide basic management functionality in a Grid environment, consideration of user requirements such as performance, reliability and interaction with the user is missing. In this paper, we propose a hybrid workflow management system for heart disease identification and discuss how to guarantee different user requirements according to the user SLA. The proposed system is applied to the Physio-Grid e-health platform to identify human heart disease with ECG analysis and Virtual Heart Simulation (VHS) workflow applications.

  20. Task Delegation Based Access Control Models for Workflow Systems

    Science.gov (United States)

    Gaaloul, Khaled; Charoy, François

    Processes in e-Government organisations are facilitated and conducted using workflow management systems. Role-based access control (RBAC) is recognised as an efficient access control model for large organisations. The application of RBAC in workflow systems cannot, however, grant permissions to users dynamically while business processes are being executed. We currently observe a move away from predefined strict workflow modelling towards approaches supporting flexibility at the organisational level. One specific approach is that of task delegation. Task delegation is a mechanism that supports organisational flexibility and ensures delegation of authority in access control systems. In this paper, we propose a Task-oriented Access Control (TAC) model based on RBAC to address these requirements. We aim to reason about tasks from organisational and resource perspectives to analyse and specify authorisation constraints. Moreover, we present a fine-grained access control protocol to support delegation based on the TAC model.

  1. Scientific Workflow Systems for 21st Century e-Science, New Bottle or New Wine?

    CERN Document Server

    Zhao, Yong; Foster, Ian

    2008-01-01

    With the advances in e-Science and the growing complexity of scientific analyses, more and more scientists and researchers are relying on workflow systems for process coordination, derivation automation, provenance tracking, and bookkeeping. While workflow systems have been in use for decades, it is unclear whether scientific workflows can or even should build on existing workflow technologies, or whether they require fundamentally new approaches. In this paper, we analyze the status and challenges of scientific workflows, investigate both existing technologies and emerging languages, platforms and systems, and identify the key challenges that must be addressed by workflow systems for e-science in the 21st century.

  2. Science Gateways, Scientific Workflows and Open Community Software

    Science.gov (United States)

    Pierce, M. E.; Marru, S.

    2014-12-01

    Science gateways and scientific workflows occupy different ends of the spectrum of user-focused cyberinfrastructure. Gateways, sometimes called science portals, provide a way for enabling large numbers of users to take advantage of advanced computing resources (supercomputers, advanced storage systems, science clouds) by providing Web and desktop interfaces and supporting services. Scientific workflows, at the other end of the spectrum, support advanced usage of cyberinfrastructure that enables "power users" to undertake computational experiments that are not easily done through the usual mechanisms (managing simulations across multiple sites, for example). Despite these different target communities, gateways and workflows share many similarities and can potentially be accommodated by the same software system. For example, pipelines to process InSAR imagery sets or to datamine GPS time series data are workflows. The results and the ability to make downstream products may be made available through a gateway, and power users may want to provide their own custom pipelines. In this abstract, we discuss our efforts to build an open source software system, Apache Airavata, that can accommodate both gateway and workflow use cases. Our approach is general, and we have applied the software to problems in a number of scientific domains. In this talk, we discuss our applications to usage scenarios specific to earth science, focusing on earthquake physics examples drawn from the QuakSim.org and GeoGateway.org efforts. We also examine the role of the Apache Software Foundation's open community model as a way to build up common community codes that do not depend upon a single "owner" to sustain them. Pushing beyond open source software, we also see the need to provide gateways and workflow systems as cloud services. These services centralize operations, provide well-defined programming interfaces, scale elastically, and have global-scale fault tolerance. We discuss our work providing

  3. An open source approach to enable the reproducibility of scientific workflows in the ocean sciences

    Science.gov (United States)

    Di Stefano, M.; Fox, P. A.; West, P.; Hare, J. A.; Maffei, A. R.

    2013-12-01

    Every scientist should be able to rerun data analyses conducted by his or her team and regenerate the figures in a paper. However, all too often the correct version of a script goes missing, or the original raw data is filtered by hand and the filtering process is undocumented, or there is a lack of collaboration and communication among scientists working in a team. Here we present 3 different use cases in ocean sciences in which end-to-end workflows are tracked. The main tool deployed to address these use cases is based on a web application (IPython Notebook) that provides the ability to work on very diverse and heterogeneous data and information sources, providing an effective way to share and track changes to the source code used to generate data products and associated metadata, as well as to track the overall workflow provenance to allow versioned reproducibility of a data product. Use cases selected for this work are: 1) A partial reproduction of the Ecosystem Status Report (ESR) for the Northeast U.S. Continental Shelf Large Marine Ecosystem. Our goal with this use case is to enable not just the traceability but also the reproducibility of this biannual report, keeping track of all the processes behind the generation and validation of time-series and spatial data and information products. An end-to-end workflow with code versioning is developed so that indicators in the report may be traced back to the source datasets. 2) Realtime generation of web pages to visualize one of the environmental indicators from the Ecosystem Advisory for the Northeast Shelf Large Marine Ecosystem web site. 3) Data and visualization integration for ocean climate forecasting. In this use case, we focus on a workflow to describe how to provide access to online data sources in the NetCDF format and other model data, and make use of multicore processing to generate video animation from time series of gridded data. For each use case we show how complete workflows

  4. Flexible Early Warning Systems with Workflows and Decision Tables

    Science.gov (United States)

    Riedel, F.; Chaves, F.; Zeiner, H.

    2012-04-01

    An essential part of early warning systems and systems for crisis management are decision support systems that facilitate communication and collaboration. Often official policies specify how different organizations collaborate and what information is communicated to whom. For early warning systems it is crucial that information is exchanged dynamically in a timely manner and all participants get exactly the information they need to fulfil their role in the crisis management process. Information technology obviously lends itself to automating parts of the process. We have experienced, however, that in current operational systems the information logistics processes are hard-coded, even though they are subject to change. In addition, systems are tailored to the policies and requirements of a certain organization, and changes can require major software refactoring. We seek to develop a system that can be deployed and adapted to multiple organizations with different dynamic runtime policies. A major requirement for such a system is that changes can be applied locally without affecting larger parts of the system. In addition to flexibility regarding changes in policies and processes, the system needs to be able to evolve; when new information sources become available, it should be possible to integrate and use these in the decision process. In general, this kind of flexibility comes with a significant increase in complexity. This implies that only IT professionals can maintain a system that can be reconfigured and adapted; end-users are unable to utilise the provided flexibility. In the business world similar problems arise, and previous work suggested using business process management systems (BPMS) or workflow management systems (WfMS) to guide and automate early warning processes or crisis management plans. However, the usability and flexibility of current WfMS are limited, because current notations and user interfaces are still not suitable for end-users, and workflows

  5. Linking Geobiology Fieldwork and Data Curation Through Workflow Documentation

    Science.gov (United States)

    Thomer, A.; Baker, K. S.; Jett, J. G.; Gordon, S.; Palmer, C. L.

    2014-12-01

    Describing the specific processes and artifacts that lead to the creation of data products provides a detailed picture of data provenance in the form of a high-level workflow. The resulting diagram identifies: 1. "points of intervention" at which curation processes can be moved upstream, and 2. data products that may be important for sharing and preservation. The Site-Based Data Curation project, an Institute of Museum and Library Services-funded project hosted by the Center for Informatics Research in Science and Scholarship at the University of Illinois, previously inferred a geobiologist's planning, field and laboratory workflows through close study of the data products produced during a single field trip to Yellowstone National Park (Wickett et al., 2013). We have since built on this work by documenting post hoc curation processes, and integrating them with the existing workflow. By holistically considering both data collection and curation, we are able to identify concrete steps that scientists can take to begin curating data in the field. This field-to-repository workflow represents a first step toward a more comprehensive and nuanced model of the research data lifecycle. Using our initial three-phase workflow, we identify key data products to prioritize for curation, and the points at which data curation best practices integrate with research processes with minimal interruption. We then document the processes that make key data products sharable and ready for preservation. We append the resulting curatorial phases to the field data collection workflow: Data Staging, Data Standardizing and Data Packaging. These refinements demonstrate: 1) the interdependence of research and curatorial phases; 2) the links between specific research products, research phases and curatorial processes; 3) the interdependence of laboratory-specific standards and community-wide best practices. We propose a poster that shows the six-phase workflow described above. We plan to discuss

  6. A Family of RBAC- Based Workflow Authorization Models

    Institute of Scientific and Technical Information of China (English)

    HONG Fan; XING Guang-lin

    2005-01-01

    A family of RBAC-based workflow authorization models, called RWAM, is proposed. RWAM consists of a basic model and other models constructed from the basic model. The basic model provides the notion of temporal permission, which means that a user can perform a certain operation on a task only for a time interval; this not only ensures that only authorized users can execute a task but also ensures that the authorization flow is synchronized with the workflow. The two advanced models of RWAM deal with role hierarchy and constraints, respectively. RWAM ranges from simple to complex and provides a general reference model for other research and development in this area.
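
The notion of temporal permission reduces to interval-bounded grants, which a small sketch can make concrete. The interval representation, function names, and example task are assumptions for illustration, not RWAM's formalism:

```python
# Temporal permission: a user may perform an operation on a task only
# within a granted time interval, keeping authorization flow in step
# with workflow progress.

def grant(perms, user, task, start, end):
    """Record an authorization window [start, end] for (user, task)."""
    perms.setdefault((user, task), []).append((start, end))

def permitted(perms, user, task, now):
    """True iff some granted interval for (user, task) contains `now`."""
    return any(start <= now <= end for start, end in perms.get((user, task), []))

perms = {}
grant(perms, "alice", "approve_claim", 100, 200)    # authorization window
in_window = permitted(perms, "alice", "approve_claim", 150)     # True
after_window = permitted(perms, "alice", "approve_claim", 250)  # False
```

Revoking the grant when the corresponding workflow task completes is what keeps the authorization flow synchronized with the workflow.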

  7. Declarative Modelling and Safe Distribution of Healthcare Workflows

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao; Slaats, Tijs

    2012-01-01

    We present a formal technique for safe distribution of workflow processes described declaratively as Nested Condition Response (NCR) Graphs and apply the technique to a distributed healthcare workflow. Concretely, we provide a method to synthesize from an NCR Graph and any distribution of its events a set of local process graphs communicating by shared events, such that the distributed execution of the local processes is equivalent to executing the original process. The technique is based on our recent similar work on safe distribution of Dynamic Condition Response (DCR) Graphs applied to cross...

  8. The View from a Few Hundred Feet : A New Transparent and Integrated Workflow for UAV-collected Data

    Science.gov (United States)

    Peterson, F. S.; Barbieri, L.; Wyngaard, J.

    2015-12-01

    Unmanned Aerial Vehicles (UAVs) allow scientists and civilians to monitor earth and atmospheric conditions in remote locations. To keep up with the rapid evolution of UAV technology, data workflows must also be flexible, integrated, and introspective. Here, we present our data workflow for a project to assess the feasibility of detecting threshold levels of methane, carbon dioxide, and other aerosols by mounting consumer-grade gas analysis sensors on UAVs. In particular, we highlight our use of Project Jupyter, a set of open-source software tools and documentation designed for developing "collaborative narratives" around scientific workflows. By embracing the GitHub-backed, multi-language systems available in Project Jupyter, we enable interaction and exploratory computation while simultaneously embracing distributed version control. Additionally, the transparency of this method builds trust with civilians and decision-makers and leverages collaboration and communication to resolve problems. The goal of this presentation is to provide a generic data workflow for scientific inquiries involving UAVs and to invite the participation of the AGU community in its improvement and curation.

  9. Workflow for the use of a high-resolution image detector in endovascular interventional procedures

    Science.gov (United States)

    Rana, R.; Loughran, B.; Swetadri Vasan, S. N.; Pope, L.; Ionita, C. N.; Siddiqui, A.; Lin, N.; Bednarek, D. R.; Rudin, S.

    2014-03-01

    Endovascular image-guided intervention (EIGI) has become the primary interventional therapy for the most widespread vascular diseases. These procedures involve the insertion of a catheter into the femoral artery, which is then threaded under fluoroscopic guidance to the site of the pathology to be treated. Flat Panel Detectors (FPDs) are normally used for EIGIs; however, once the catheter is guided to the pathological site, high-resolution imaging capabilities can be used for accurately guiding a successful endovascular treatment. The Micro-Angiographic Fluoroscope (MAF) detector provides needed high-resolution, high-sensitivity, and real-time imaging capabilities. An experimental MAF enabled with a Control, Acquisition, Processing, Image Display and Storage (CAPIDS) system was installed and aligned on a detector changer attached to the C-arm of a clinical angiographic unit. The CAPIDS system was developed and implemented using LabVIEW software and provides a user-friendly interface that enables control of several clinical radiographic imaging modes of the MAF including: fluoroscopy, roadmap, radiography, and digital-subtraction-angiography (DSA). Using the automatic controls, the MAF detector can be moved to the deployed position, in front of a standard FPD, whenever higher resolution is needed during angiographic or interventional vascular imaging procedures. To minimize any possible negative impact to image guidance with the two detector systems, it is essential to have a well-designed workflow that enables smooth deployment of the MAF at critical stages of clinical procedures. For the ultimate success of this new imaging capability, a clear understanding of the workflow design is essential. This presentation provides a detailed description and demonstration of such a workflow design.

  10. REDACLE A Database for the Workflow Management of the CMS ECAL Construction

    CERN Document Server

    Barone, Luciano; Costantini, Silvia; Dafinei, Ioan; Diemoz, Marcella; Organtini, Giovanni; Paramatti, Riccardo; Pellegrino, Fabio; Moro, R; Cau, S

    2003-01-01

    The REDACLE Project aims at the realization of a simple, flexible and fast database to assist the construction of the CMS electromagnetic calorimeter. The project started in January 2003 as a backup solution for the previously used product, CRISTAL. The REDACLE database was designed to be flexible enough to be used for the construction of virtually any kind of product. One of the key elements of the project was the complete decoupling between the database structure and the workflow process software: far from being a missing feature, this allows the database to be used for very different projects, ranging from very simple to much more complex systems.

  11. Visualized Workflow System Based on XML and Relational Database

    Institute of Scientific and Technical Information of China (English)

    陈谊; 侯堃; 新吉乐; 陈红倩

    2012-01-01

    In order to implement a workflow management system that supports the visual design and execution of project management workflows, a 4-layer workflow management system framework based on XML and a relational database is proposed. The paper covers the concepts, history and current advances in workflow management systems, introduces the dominant standards and technologies, and describes the 4-layer system structure and system composition, the data storage structures, the hand-over processing between procedures in a workflow, the XML parsing method, and the implementation techniques for visual workflow design. The workflow management system was realized in the ASP.NET environment. Application shows that the system can effectively support project management processes.
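
The XML side of such a system can be illustrated with a minimal parse step using Python's standard library. The schema below is invented for the example, since the record does not specify the system's actual storage structure:

```python
# Parse a workflow definition stored as XML into an ordered list of steps.
import xml.etree.ElementTree as ET

WORKFLOW_XML = """
<workflow name="project-approval">
  <step id="1" role="engineer">draft</step>
  <step id="2" role="manager">review</step>
  <step id="3" role="director">sign-off</step>
</workflow>
"""

def parse_workflow(xml_text):
    """Return the workflow name and its steps as (id, role, action) tuples."""
    root = ET.fromstring(xml_text)
    steps = [(int(s.get("id")), s.get("role"), s.text)
             for s in root.findall("step")]
    return root.get("name"), sorted(steps)  # keep steps in id order

name, steps = parse_workflow(WORKFLOW_XML)
# name == "project-approval"; steps[0] == (1, "engineer", "draft")
```

In a 4-layer design like the one described, a parsed structure of this kind would sit between the XML document store and the visual designer, with the relational database tracking workflow instances and their hand-over state.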

  12. Verification of Timed Healthcare Workflows Using Component Timed-Arc Petri Nets

    DEFF Research Database (Denmark)

    Bertolini, Cristiano; Liu, Zhiming; Srba, Jiri

    2013-01-01

    Workflows in modern healthcare systems are becoming increasingly complex and their execution involves concurrency and sharing of resources. The definition, analysis and management of collaborative healthcare workflows requires abstract model notations with a precisely defined semantics and a supp...

  13. Translating Unstructured Workflow Processes to Readable BPEL: Theory and Implementation

    DEFF Research Database (Denmark)

    van der Aalst, Willibrordus Martinus Pancratius; Lassen, Kristian Bisgaard

    2008-01-01

    and not easy to use by end-users. Therefore, we provide a mapping from Workflow Nets (WF-nets) to BPEL. This mapping builds on the rich theory of Petri nets and can also be used to map other languages (e.g., UML, EPC, BPMN, etc.) onto BPEL. In addition to this we have implemented the algorithm in a tool called...

  14. Putting Lipstick on Pig: Enabling Database-style Workflow Provenance

    CERN Document Server

    Amsterdamer, Yael; Deutch, Daniel; Milo, Tova; Stoyanovich, Julia; Tannen, Val

    2012-01-01

    Workflow provenance typically assumes that each module is a "black-box", so that each output depends on all inputs (coarse-grained dependencies). Furthermore, it does not model the internal state of a module, which can change between repeated executions. In practice, however, an output may depend on only a small subset of the inputs (fine-grained dependencies) as well as on the internal state of the module. We present a novel provenance framework that marries database-style and workflow-style provenance, by using Pig Latin to expose the functionality of modules, thus capturing internal state and fine-grained dependencies. A critical ingredient in our solution is the use of a novel form of provenance graph that models module invocations and yields a compact representation of fine-grained workflow provenance. It also enables a number of novel graph transformation operations, allowing users to choose the desired level of granularity in provenance querying (ZoomIn and ZoomOut), and supporting "what-if" workflow analyti...
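The contrast between fine- and coarse-grained dependencies, and the effect of a ZoomOut-style coarsening, can be illustrated with a toy mapping. The names and structure here are invented for illustration and are not the Lipstick system's actual API:

```python
# Fine-grained provenance: each output records only the inputs it actually
# used. "Zooming out" collapses this to the coarse black-box view, where
# every output is assumed to depend on all inputs of the module.
fine_provenance = {
    "out_a": {"in_1"},
    "out_b": {"in_2", "in_3"},
}

def zoom_out(fine):
    """Coarse-grained view: every output depends on the union of all inputs."""
    all_inputs = set().union(*fine.values())
    return {out: set(all_inputs) for out in fine}

coarse = zoom_out(fine_provenance)
```

The fine view supports precise "what-if" queries (changing `in_2` cannot affect `out_a`), while the coarse view over-approximates every dependency.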

  15. Content and Workflow Management for Library Websites: Case Studies

    Science.gov (United States)

    Yu, Holly, Ed.

    2005-01-01

    Using database-driven web pages or web content management (WCM) systems to manage increasingly diverse web content and to streamline workflows is a commonly practiced solution recognized in libraries today. However, limited library web content management models and funding constraints prevent many libraries from purchasing commercially available…

  16. SURVEY OF WORKFLOW ANALYSIS IN PAST AND PRESENT ISSUES

    Directory of Open Access Journals (Sweden)

    Saravanan, M.S.

    2011-06-01

    Full Text Available This paper surveys workflow analysis from the business process perspective across organizations. A business can be defined as an organization that provides goods and services to others who want or need them. The concept of managing business processes is referred to as Business Process Management (BPM). A workflow is the automation of a business process, in whole or part, during which documents, information or tasks are passed from one participant to another for action, according to a set of procedural rules. Process mining aims at extracting useful and meaningful information from event logs, which record real executions of business processes in an organization. This paper briefly reviews the state-of-the-art of business process techniques developed so far and the methods adopted. The survey classifies workflow analysis from the business process viewpoint into four major categories: Business Process Modeling, Ontology-based Business Process Management, Workflow-based Business Process Controlling, and Business Process Mining.
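A first step in most process-mining techniques the survey covers is extracting the directly-follows relation from an event log. The tiny log below is invented for illustration; real logs come from workflow or ERP systems:

```python
from collections import Counter

# Toy event log: each trace is the ordered list of activities in one case.
event_log = [
    ["register", "check", "approve"],
    ["register", "check", "reject"],
    ["register", "approve"],
]

def directly_follows(log):
    """Count how often activity a is immediately followed by activity b."""
    pairs = Counter()
    for trace in log:
        for a, b in zip(trace, trace[1:]):
            pairs[(a, b)] += 1
    return pairs

df = directly_follows(event_log)
```

Mining algorithms such as the alpha miner build a process model from exactly this kind of relation.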

  17. Actor-driven workflow execution in distributed environments

    NARCIS (Netherlands)

    F. Berretz; S. Skorupa; V. Sander; A. Belloum; M. Bubak

    2010-01-01

    Currently, most workflow management systems (WfMS) in Grid environments provide push-oriented task distribution strategies, where tasks are directly bound to suitable resources. In those scenarios the dedicated resources execute the submitted tasks according to the request of a WfMS or sometimes by

  18. Exformatics Declarative Case Management Workflows as DCR Graphs

    DEFF Research Database (Denmark)

    Slaats, Tijs; Mukkamala, Raghava Rao; Hildebrandt, Thomas

    2013-01-01

    with researchers at IT University of Copenhagen (ITU) to create tools for the declarative workflow language Dynamic Condition Response Graphs (DCR Graphs) and incorporate them into their products and in teaching at ITU. In this paper we give a status report over the work. We start with an informal introduction...

  19. Images crossing borders: image and workflow sharing on multiple levels.

    Science.gov (United States)

    Ross, Peeter; Pohjonen, Hanna

    2011-04-01

    Digitalisation of medical data makes it possible to share images and workflows between related parties. In addition to linear data flow where healthcare professionals or patients are the information carriers, a new type of matrix of many-to-many connections is emerging. Implementation of shared workflow brings challenges of interoperability and legal clarity. Sharing images or workflows can be implemented on different levels with different challenges: inside the organisation, between organisations, across country borders, or between healthcare institutions and citizens. Interoperability issues vary according to the level of sharing and are either technical or semantic, including language. Legal uncertainty increases when crossing national borders. Teleradiology is regulated by multiple European Union (EU) directives and legal documents, which makes interpretation of the legal system complex. To achieve wider use of eHealth and teleradiology several strategic documents were published recently by the EU. Despite EU activities, responsibility for organising, providing and funding healthcare systems remains with the Member States. Therefore, the implementation of new solutions requires strong co-operation between radiologists, societies of radiology, healthcare administrators, politicians and relevant EU authorities. The aim of this article is to describe different dimensions of image and workflow sharing and to analyse legal acts concerning teleradiology in the EU.

  20. Conceptual framework and architecture for service mediating workflow management

    NARCIS (Netherlands)

    Hu, Jinmin; Grefen, Paul

    2002-01-01

    Electronic service outsourcing creates a new paradigm for automated enterprise collaboration. The service-oriented paradigm requires a high level of flexibility of current workflow management systems and support for Business-to-Business (B2B) collaboration to realize collaborative enterprises. This

  1. Styx Grid Services: Lightweight Middleware for Efficient Scientific Workflows

    Directory of Open Access Journals (Sweden)

    J.D. Blower

    2006-01-01

    Full Text Available The service-oriented approach to performing distributed scientific research is potentially very powerful but is not yet widely used in many scientific fields. This is partly due to the technical difficulties involved in creating services and workflows and the inefficiency of many workflow systems with regard to handling large datasets. We present the Styx Grid Service, a simple system that wraps command-line programs and allows them to be run over the Internet exactly as if they were local programs. Styx Grid Services are very easy to create and use and can be composed into powerful workflows with simple shell scripts or more sophisticated graphical tools. An important feature of the system is that data can be streamed directly from service to service, significantly increasing the efficiency of workflows that use large data volumes. The status and progress of Styx Grid Services can be monitored asynchronously using a mechanism that places very few demands on firewalls. We show how Styx Grid Services can interoperate with Web Services and WS-Resources using suitable adapters.
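A rough local analogue of the streaming the record describes, stdout of one wrapped command-line program feeding stdin of the next without an intermediate file, is an ordinary subprocess pipeline. This sketch uses generic Unix tools (`sort`, `uniq`) for illustration and is not Styx Grid Service code:

```python
import subprocess

def run_pipeline(data: bytes) -> bytes:
    """Stream `data` through sort | uniq without writing temporary files."""
    sort = subprocess.Popen(["sort"], stdin=subprocess.PIPE, stdout=subprocess.PIPE)
    uniq = subprocess.Popen(["uniq"], stdin=sort.stdout, stdout=subprocess.PIPE)
    sort.stdin.write(data)
    sort.stdin.close()
    sort.stdout.close()  # let sort see SIGPIPE if uniq exits early
    out, _ = uniq.communicate()
    sort.wait()
    return out
```

Styx Grid Services generalise this idea across machines: the pipe crosses the network instead of staying on one host.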

  2. CrossFlow: Integrating Workflow Management and Electronic Commerce

    NARCIS (Netherlands)

    Hoffner, Y.; Ludwig, H.; Grefen, P.; Aberer, K.

    2001-01-01

    The CrossFlow architecture provides support for cross-organisational workflow management in dynamically established virtual enterprises. The creation of a business relationship between a service provider organisation performing a service on behalf of a consumer organisation can be made dynamic when

  3. Pegasus Workflow Management System: Helping Applications From Earth and Space

    Science.gov (United States)

    Mehta, G.; Deelman, E.; Vahi, K.; Silva, F.

    2010-12-01

    Pegasus WMS is a Workflow Management System that can manage large-scale scientific workflows across Grid, local and Cloud resources simultaneously. Pegasus WMS provides a means for representing the workflow of an application in an abstract XML form, agnostic of the resources available to run it and the location of data and executables. It then compiles these workflows into concrete plans by querying catalogs and farming computations across local and distributed computing resources, as well as emerging commercial and community cloud environments, in an easy and reliable manner. Pegasus WMS optimizes the execution as well as data movement by leveraging existing Grid and cloud technologies via a flexible pluggable interface and provides advanced features like reusing existing data, automatic cleanup of generated data, and recursive workflows with deferred planning. It also captures all the provenance of the workflow from the planning stage to the execution of the generated data, helping scientists to accurately measure performance metrics of their workflow as well as data reproducibility issues. Pegasus WMS was initially developed as part of the GriPhyN project to support large-scale high-energy physics and astrophysics experiments. Direct funding from the NSF enabled support for a wide variety of applications from diverse domains including earthquake simulation, bacterial RNA studies, helioseismology and ocean modeling. Earthquake Simulation: Pegasus WMS was recently used in a large-scale production run in 2009 by the Southern California Earthquake Center to run 192 million loosely coupled tasks and about 2000 tightly coupled MPI-style tasks on national cyberinfrastructure for generating a probabilistic seismic hazard map of the Southern California region. SCEC ran 223 workflows over a period of eight weeks, using on average 4,420 cores, with a peak of 14,540 cores. A total of 192 million files were produced totaling about 165TB out of which 11TB of data was saved
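The abstract-to-concrete compilation step the record describes, binding resource-agnostic tasks to sites by querying catalogs, can be caricatured in a few lines. Task names, catalog contents and site names below are invented; this is not Pegasus code:

```python
# Abstract workflow: tasks name only the data they need, not where they run.
abstract_workflow = [
    {"task": "extract", "needs": "input.dat"},
    {"task": "analyze", "needs": "extract.out"},
]

# Toy replica catalog: which site already holds which file.
replica_catalog = {"input.dat": "site_A", "extract.out": "site_B"}

def plan(workflow, catalog, default_site="local"):
    """Bind each abstract task to the site that already holds its input."""
    return {t["task"]: catalog.get(t["needs"], default_site) for t in workflow}

concrete = plan(abstract_workflow, replica_catalog)
```

The real planner also consults transformation and site catalogs, adds data-movement jobs, and can defer planning for sub-workflows; this sketch shows only the basic mapping idea.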

  4. What Not To Do: Anti-patterns for Developing Scientific Workflow Software Components

    Science.gov (United States)

    Futrelle, J.; Maffei, A. R.; Sosik, H. M.; Gallager, S. M.; York, A.

    2013-12-01

    Scientific workflows promise to enable efficient scaling-up of researcher code to handle large datasets and workloads, as well as documentation of scientific processing via standardized provenance records, etc. Workflow systems and related frameworks for coordinating the execution of otherwise separate components are limited, however, in their ability to overcome software engineering design problems commonly encountered in pre-existing components, such as scripts developed externally by scientists in their laboratories. In practice, this often means that components must be rewritten or replaced in a time-consuming, expensive process. In the course of an extensive workflow development project involving large-scale oceanographic image processing, we have begun to identify and codify 'anti-patterns'--problematic design characteristics of software--that make components fit poorly into complex automated workflows. We have gone on to develop and document low-effort solutions and best practices that efficiently address the anti-patterns we have identified. The issues, solutions, and best practices can be used to evaluate and improve existing code, as well as guiding the development of new components. For example, we have identified a common anti-pattern we call 'batch-itis' in which a script fails and then cannot perform more work, even if that work is not precluded by the failure. The solution we have identified--removing unnecessary looping over independent units of work--is often easier to code than the anti-pattern, as it eliminates the need for complex control flow logic in the component. Other anti-patterns we have identified are similarly easy to identify and often easy to fix. We have drawn upon experience working with three science teams at Woods Hole Oceanographic Institution, each of which has designed novel imaging instruments and associated image analysis code. 
By developing use cases and prototypes within these teams, we have undertaken formal evaluations of
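The 'batch-itis' anti-pattern and its fix can be contrasted directly. The functions below are a schematic illustration, not code from the project described: in the anti-pattern one failure aborts all remaining independent units, while the fix isolates each unit so the workflow engine can handle iteration and retries.

```python
def process_unit(x):
    """Stand-in for a science processing step; fails on invalid input."""
    if x < 0:
        raise ValueError("bad unit")
    return x * 2

def batchitis(units):
    """Anti-pattern: the first failure kills the whole batch."""
    return [process_unit(u) for u in units]

def per_unit(units):
    """Fix: isolate each independent unit; failures are recorded, not fatal."""
    results = {}
    for u in units:
        try:
            results[u] = process_unit(u)
        except ValueError as e:
            results[u] = f"failed: {e}"
    return results
```

As the record notes, the fix is often simpler than the anti-pattern: looping and error bookkeeping move out of the component and into the workflow system.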

  5. A Workflow-Oriented Approach To Propagation Models In Heliophysics

    Directory of Open Access Journals (Sweden)

    Gabriele Pierantoni

    2014-01-01

    Full Text Available The Sun is responsible for the eruption of billions of tons of plasma and the generation of near light-speed particles that propagate throughout the solar system and beyond. If directed towards Earth, these events can be damaging to our technological infrastructure. Hence there is an effort to understand the cause of the eruptive events and how they propagate from Sun to Earth. However, the physics governing their propagation is not well understood, so there is a need to develop a theoretical description of their propagation, known as a Propagation Model, in order to predict when they may impact Earth. It is often difficult to define a single propagation model that correctly describes the physics of solar eruptive events, and even more difficult to implement models capable of catering for all these complexities and to validate them using real observational data. In this paper, we envisage that workflows offer both a theoretical and practical framework for a novel approach to propagation models. We define a mathematical framework that aims at encompassing the different modalities with which workflows can be used, and provide a set of generic building blocks written in the TAVERNA workflow language that users can use to build their own propagation models. Finally we test both the theoretical model and the composite building blocks of the workflow with a real Science Use Case that was discussed during the 4th CDAW (Coordinated Data Analysis Workshop) event held by the HELIO project. We show that generic workflow building blocks can be used to construct a propagation model that successfully describes the transit of solar eruptive events toward Earth and predicts a correct Earth-impact time.

  6. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee

    2016-01-01

    This article presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults...

  7. Automated evolutionary restructuring of workflows to minimise errors via stochastic model checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents a framework for the automated restructuring of workflows that allows one to minimise the impact of errors on a production workflow. The framework allows for the modelling of workflows by means of a formalised subset of the Business Process Modelling and Notation (BPMN) language...

  8. A P2P approach to resource discovery in on-line monitoring of Grid workflows

    NARCIS (Netherlands)

    Łabno, B.; Bubak, M.; Baliś, B.

    2008-01-01

    On-line monitoring of Grid workflows is challenging since workflows are loosely coupled and highly dynamic. An efficient mechanism of automatic resource discovery is needed in order to discover new producers of workflow monitoring data fast. However, currently used Grid information systems are not s

  9. Characterizing workflow-based activity on a production e-infrastructure using provenance data.

    NARCIS (Netherlands)

    Madougou, S.; Shahand, S.; Santcroos, M.; van Schaik, B.; Benabdelkader, A.; van Kampen, A.; Olabarriaga, S.

    2013-01-01

    Grid computing and workflow management systems emerged as solutions to the challenges arising from the processing and storage of sheer volumes of data generated by modern simulations and data acquisition devices. Workflow management systems usually document the process of the workflow execution eith

  10. Performance prediction for Grid workflow activities based on features-ranked RBF network

    Institute of Scientific and Technical Information of China (English)

    Wang Jie; Duan Rubing; Farrukh Nadeem

    2009-01-01

    Accurate performance prediction of Grid workflow activities can help Grid schedulers map activities to appropriate Grid sites. This paper describes an approach based on a features-ranked RBF neural network to predict the performance of Grid workflow activities. Experimental results for two kinds of real-world Grid workflow activities are presented to show the effectiveness of our approach.
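The flavour of RBF-based runtime prediction can be sketched with a minimal kernel-weighted estimator: past executions act as centres, and a new activity's runtime is predicted from nearby history. This is a generic RBF sketch with invented data, not the features-ranked network the paper describes:

```python
import math

def rbf_predict(train, x, gamma=1.0):
    """Predict runtime for feature value x from (feature, runtime) history
    using Gaussian radial basis weights centred on the training points."""
    weights = [math.exp(-gamma * (xi - x) ** 2) for xi, _ in train]
    total = sum(weights)
    return sum(w * y for w, (_, y) in zip(weights, train)) / total

# Toy history: activities with input size 1.0 took 10 s, size 2.0 took 20 s, etc.
history = [(1.0, 10.0), (2.0, 20.0), (3.0, 30.0)]
est = rbf_predict(history, 2.0)
```

Feature ranking, the paper's contribution, would decide which activity features (input size, site load, etc.) feed the network in the first place.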

  11. Computer aided process planning system based on workflow technology and integrated bill of material tree

    Institute of Scientific and Technical Information of China (English)

    LU Chun-guang; MENG Li-li

    2006-01-01

    Process design procedures and the management of process data over the product life cycle are extremely important in a Computer Aided Process Planning (CAPP) system, but traditional CAPP systems have many shortcomings in these respects. To address them, the application of workflow technology in a CAPP system based on a web-integrated Bill of Material (BOM) tree is discussed, and the concept of an integrated BOM tree is put forward. Taking the integrated BOM as the thread, the CAPP process flow is analyzed. The function, system architecture, and implementation mechanism of a CAPP system based on the Browser/Server and Client/Server models are described. On this basis, the key technologies of the workflow management component are analyzed. Finally, the implementation mechanism of the integrated BOM tree is analyzed from the viewpoints of material information encoding, the design of organization nodes in the integrated BOM tree, the transformation from Engineering BOM (EBOM) to Process BOM (PBOM), and the programming implementation technology.

  12. Digital stereo photogrammetry for grain-scale monitoring of fluvial surfaces: Error evaluation and workflow optimisation

    Science.gov (United States)

    Bertin, Stephane; Friedrich, Heide; Delmas, Patrice; Chan, Edwin; Gimel'farb, Georgy

    2015-03-01

    Grain-scale monitoring of fluvial morphology is important for the evaluation of river system dynamics. Significant progress in remote sensing and computer performance allows rapid high-resolution data acquisition; however, applications in fluvial environments remain challenging. Even in a controlled environment, such as a laboratory, the extensive acquisition workflow is prone to the propagation of errors in digital elevation models (DEMs). This is valid for both of the common surface recording techniques: digital stereo photogrammetry and terrestrial laser scanning (TLS). The optimisation of the acquisition process, an effective way to reduce the occurrence of errors, is generally limited by the use of commercial software. Therefore, the removal of evident blunders during post processing is regarded as standard practice, although this may introduce new errors. This paper presents a detailed evaluation of a digital stereo-photogrammetric workflow developed for fluvial hydraulic applications. The introduced workflow is user-friendly and can be adapted to various close-range measurements: imagery is acquired with two Nikon D5100 cameras and processed using non-proprietary "on-the-job" calibration and dense scanline-based stereo matching algorithms. Novel ground truth evaluation studies were designed to identify the DEM errors, which resulted from a combination of calibration errors, inaccurate image rectifications and stereo-matching errors. To ensure optimum DEM quality, we show that systematic DEM errors must be minimised by ensuring a good distribution of control points throughout the image format during calibration. DEM quality is then largely dependent on the imagery utilised. We evaluated the open access multi-scale Retinex algorithm to facilitate the stereo matching, and quantified its influence on DEM quality. Occlusions, inherent to any roughness element, are still a major limiting factor to DEM accuracy. We show that a careful selection of the camera

  13. Networked Print Production: Does JDF Provide a Perfect Workflow?

    Directory of Open Access Journals (Sweden)

    Bernd Zipper

    2004-12-01

    Full Text Available The "networked printing works" is a well-worn slogan used by many providers in the graphics industry, and for the past number of years printing-works manufacturers have been working towards the goal of achieving the "networked printing works". A turning point from concept to real implementation can now be expected at drupa 2004: JDF (Job Definition Format) and thus "networked production" will form the center of interest here. The first approaches towards a complete, networked workflow between prepress, print and postpress in production are already available - the products and solutions will now be presented publicly at drupa 2004. So, drupa 2004 will undoubtedly be the "JDF-drupa" - the drupa where machines learn to communicate with each other digitally - the drupa where the dream of general system and job communication in the printing industry can first be realized. CIP3, which has since been renamed CIP4, is an international consortium of leading manufacturers from the printing and media industry who have taken on the task of integrating processes for prepress, print and postpress. The association, to which nearly all manufacturers in the graphics industry belong, has succeeded with CIP3 in developing a first international standard for the transmission of control data in the print workflow. Further development of the CIP4 standard now includes a more extensive "system language" called JDF, which will guarantee workflow communication beyond manufacturer boundaries. However, JDF (Job Definition Format) will communicate not only data for actual print production: planning and calculation data for MIS (Management Information Systems) and calculation systems will also be prepared.
The German printing specialist Hans-Georg Wenke defines JDF as follows: "JDF takes over data from MIS for machines, aggregates and their control desks, data exchange within office applications, and finally ensures that data can be incorporated in the technical workflow

  14. Adopting Collaborative Workflow Pattern : Use Case

    Directory of Open Access Journals (Sweden)

    Antonio Capodieci

    2015-03-01

    Full Text Available In recent years, the use of web 2.0 tools has increased in companies and organisations. This phenomenon has modified common organisational and operative practices, and has led "knowledge workers" to change their working practices through the use of Web 2.0 communication tools. Unfortunately, these tools have not been integrated with existing enterprise information systems. This is an important problem in an organisational context, because knowledge of the information exchanged is needed. In previous works we demonstrated that it is possible to capture this knowledge using collaboration processes, which are processes of abstraction created in accordance with design patterns and applied to new organisational operative practices. In this article, we present the experience of adopting the methodology and the catalog of patterns by some pattern designers of a company operating in Public Administration and Finance, with the aim of shaping an Enterprise 2.0 information system that takes the collaborative processes into account.

  15. Workflow Modelling and Analysis Based on the Construction of Task Models

    Directory of Open Access Journals (Sweden)

    Glória Cravo

    2015-01-01

    Full Text Available In this paper we describe the structure of a workflow as a graph whose vertices represent tasks and whose arcs are associated with workflow transitions. To each task an input/output logic operator is associated, and a Boolean term is associated with each transition present in the workflow. We identify the structure of workflows and describe their dynamism through the construction of new task models. This construction is simple and intuitive, since it is based on the analysis of all tasks present in the workflow, which allows us to describe the dynamism of the workflow very easily; this intuitiveness is an important strength of our approach. We also introduce the concept of logical termination of workflows and provide conditions under which this property holds. Finally, we provide a counter-example which shows that a conjecture presented in a previous article is false.
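One executable reading of this model: vertices are tasks, arcs are transitions, and each task's input logic operator decides when it can fire (AND needs all incoming transitions enabled, OR needs at least one). "Logical termination" is read here as the final task eventually being able to fire. The encoding is an illustrative guess at the formalism, not the authors' exact definitions:

```python
def can_fire(task, done):
    """A task fires when its input logic operator is satisfied."""
    ins = task["inputs"]
    if task["logic"] == "AND":
        return all(i in done for i in ins)   # all predecessors completed
    return not ins or any(i in done for i in ins)  # OR: at least one

def terminates(tasks, final):
    """Fixed-point iteration: fire every enabled task until nothing changes,
    then check whether the final task was reached."""
    done, changed = set(), True
    while changed:
        changed = False
        for name, task in tasks.items():
            if name not in done and can_fire(task, done):
                done.add(name)
                changed = True
    return final in done

# A diamond-shaped workflow: t4 needs only one of the two branches (OR).
tasks = {
    "t1": {"inputs": [], "logic": "AND"},
    "t2": {"inputs": ["t1"], "logic": "AND"},
    "t3": {"inputs": ["t1"], "logic": "AND"},
    "t4": {"inputs": ["t2", "t3"], "logic": "OR"},
}
```

Boolean terms on transitions would refine `can_fire` further, enabling or disabling individual arcs at run time.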

  16. When Workflow Management Systems and Logging Systems Meet: Analyzing Large-Scale Execution Traces

    Energy Technology Data Exchange (ETDEWEB)

    Gunter, Daniel

    2008-07-31

    This poster shows the benefits of integrating a workflow management system with logging and log mining capabilities. By combining two existing, mature technologies, Pegasus-WMS and Netlogger, we are able to efficiently process execution logs of earthquake science workflows consisting of hundreds of thousands to one million tasks. In particular we show results of processing logs of CyberShake, a workflow application running on the TeraGrid. Client-side tools allow scientists to quickly gather statistics about a workflow run and find out which tasks executed, where they were executed, what was their runtime, etc. These statistics can be used to understand the performance characteristics of a workflow and help tune the execution parameters of the workflow management system. The poster shows the scalability of the system by presenting results of uploading task execution records into the system and of querying the system for overall workflow performance information.
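The client-side statistics the poster mentions, which tasks ran where and for how long, amount to simple aggregation over task execution records. The record shape and field names below are invented for illustration; the real system mines Netlogger-format logs:

```python
from statistics import mean

# Toy task execution records, as a log-mining query might return them.
records = [
    {"task": "t1", "host": "node1", "runtime": 4.0},
    {"task": "t2", "host": "node1", "runtime": 6.0},
    {"task": "t3", "host": "node2", "runtime": 10.0},
]

def summarize(recs):
    """Per-host summary: task count and mean runtime."""
    hosts = {}
    for r in recs:
        hosts.setdefault(r["host"], []).append(r["runtime"])
    return {h: {"tasks": len(v), "mean_runtime": mean(v)} for h, v in hosts.items()}

stats = summarize(records)
```

At the scale quoted in the poster (up to a million tasks), the same aggregation would run as queries against the log database rather than in memory.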

  17. A workflow-oriented framework-driven implementation and local adaptation of clinical information systems: a case study of nursing documentation system implementation at a tertiary rehabilitation hospital.

    Science.gov (United States)

    Choi, Jeeyae; Kim, Hyeoneui

    2012-08-01

    Health information systems are often designed and developed without integrating users' specific needs and preferences. This decreases the users' productivity, satisfaction, and acceptance of the system and increases the necessity for a local adaptation process to reduce the unwanted outcomes after implementation. A workflow-oriented framework developed in a previous study indicates that users' needs and preferences could be incorporated into the system when implementation follows the steps of the framework, eventually increasing satisfaction with and usefulness of the system. The overall goal of this study was to demonstrate application of the workflow-oriented framework to the implementation of a nursing documentation system at Spaulding Rehabilitation Hospital. In this case study, we present specific steps of implementing and adapting a health information system at a local site and raise critical questions that need to be answered in each step based on the workflow-oriented framework.

  18. Understanding the impact on intensive care staff workflow due to the introduction of a critical care information system: a mixed methods research methodology.

    Science.gov (United States)

    Shaw, N T; Mador, R L; Ho, S; Mayes, D; Westbrook, J I; Creswick, N; Thiru, K; Brown, M

    2009-01-01

    The Intensive Care Unit (ICU) is a complex and dynamic tertiary care environment that requires health care providers to balance many competing tasks and responsibilities. Inefficient and interruption-driven workflow is believed to increase the likelihood of medical errors and, therefore, present a serious risk to patients in the ICU. The introduction of a Critical Care Information System (CCIS), is purported to result in fewer medical errors and better patient care by streamlining workflow. Little objective research, however, has investigated these assertions. This paper reports on the design of a research methodology to explore the impact of a CCIS on the workflow of Respiratory Therapists, Pediatric Intensivists, Nurses, and Unit Clerks in a Pediatric ICU (PICU) and a General Systems ICU (GSICU) in Northern Canada.

  19. Staffing and Workflow of a Maturing Institutional Repository

    Directory of Open Access Journals (Sweden)

    Debora L. Madsen

    2013-02-01

    Full Text Available Institutional repositories (IRs have become established components of many academic libraries. As an IR matures it will face the challenge of how to scale up its operations to increase the amount and types of content archived. These challenges involve staffing, systems, workflows, and promotion. In the past eight years, Kansas State University's IR (K-REx has grown from a platform for student theses, dissertations, and reports to also include faculty works. The initial workforce of a single faculty member was expanded as a part of a library-wide reorganization, resulting in a cross-departmental team that is better able to accommodate the expansion of the IR. The resultant need to define staff responsibilities and develop resources to manage the workflows has led to the innovations described here, which may prove useful to the greater library community as other IRs mature.

  20. Enabling Smart Workflows over Heterogeneous ID-Sensing Technologies

    Directory of Open Access Journals (Sweden)

    Guillermo Palacios

    2012-11-01

    Full Text Available Sensing technologies in mobile devices play a key role in reducing the gap between the physical and the digital world. The use of automatic identification capabilities can improve user participation in business processes where physical elements are involved (Smart Workflows). However, identifying all objects in the user surroundings does not automatically translate into meaningful services to the user. This work introduces Parkour, an architecture that allows the development of services that match the goals of each of the participants in a smart workflow. Parkour is based on a pluggable architecture that can be extended to provide support for new tasks and technologies. In order to facilitate the development of these plug-ins, tools that automate the development process are also provided. Several Parkour-based systems have been developed in order to validate the applicability of the proposal.

  1. Research on Architecture of Enterprise Modeling in Workflow System

    Institute of Scientific and Technical Information of China (English)

    李伟平; 齐慧彬; 薛劲松; 朱云龙

    2002-01-01

    In the information age, the market an enterprise faces is changing and cannot be forecast accurately. In order to find opportunities in the market, practitioners have focused on business processes through re-engineering programmes to improve enterprise efficiency. Managing an enterprise with process-based methods is necessary to enhance work efficiency and competitiveness in the market, and information system developers have emphasized the use of standard models to accelerate the configuration and implementation of integrated systems for enterprises. An enterprise should therefore be modelled with a process-based modelling method. An architecture of enterprise modeling is presented in this paper. It is composed of four views and supports the whole lifecycle of the enterprise model. Because workflow management systems are based on process definitions, this architecture can be used directly in a workflow management system. The implementation of the model is described in detail, together with the workflow management software that supports building and running the model.

  2. Enhanced Phosphoproteomic Profiling Workflow For Growth Factor Signaling Analysis

    DEFF Research Database (Denmark)

    Sylvester, Marc; Burbridge, Mike; Leclerc, Gregory;

    2010-01-01

    Background Our understanding of complex signaling networks is still fragmentary. Isolated processes have been studied extensively but cross-talk is omnipresent and precludes intuitive predictions of signaling outcomes. The need for quantitative data on dynamic systems is apparent especially for our...... A549 lung carcinoma cells were used as a model and stimulated with hepatocyte growth factor, epidermal growth factor or fibroblast growth factor. We employed a quick protein digestion workflow with spin filters without using urea. Phosphopeptides in general were enriched by sequential elution from...... transfer dissociation adds confidence in modification site assignment. The workflow is relatively simple but the integration of complementary techniques leads to a deeper insight into cellular signaling networks and the potential pharmacological intervention thereof....

  3. How Workflow Systems Facilitate Business Process Reengineering and Improvement

    Directory of Open Access Journals (Sweden)

    Mohamed El Khadiri

    2012-03-01

    Full Text Available This paper investigates the relationship between workflow systems and business process reengineering and improvement. The study is based on a real case study at the Centre Régional d'Investissement (CRI) of Marrakech, Morocco. The CRI is entrusted with coordinating various investment projects at the regional level. Our previous work has shown that a workflow system can be a basis for business process reengineering. For continuous process improvement, however, the system has proven insufficient, as it fails to deal with the exceptions and problem resolutions that informal communications provide. When this system is augmented with an expanded corporate memory system that includes social tools, to capture informal communication and data, we come closer to a complete system that facilitates business process reengineering and improvement.

  4. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert [UConn Health, Department of Molecular Biology and Biophysics (United States); Martyn, Timothy O. [Rensselaer at Hartford, Department of Engineering and Science (United States); Ellis, Heidi J. C. [Western New England College, Department of Computer Science and Information Technology (United States); Gryk, Michael R., E-mail: gryk@uchc.edu [UConn Health, Department of Molecular Biology and Biophysics (United States)

    2015-07-15

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  5. Networked Print Production: Does JDF Provide a Perfect Workflow?

    OpenAIRE

    Bernd Zipper

    2004-01-01

    The "networked printing works" is a well-worn slogan used by many providers in the graphics industry and for the past number of years printing-works manufacturers have been working on the goal of achieving the "networked printing works". A turning point from the concept to real implementation can now be expected at drupa 2004: JDF (Job Definition Format) and thus "networked production" will form the center of interest here. The first approaches towards a complete, networked workflow between p...

  6. Advanced Workflows for Fluid Transfer in Faulted Basins

    Directory of Open Access Journals (Sweden)

    Thibaut Muriel

    2014-07-01

    Full Text Available The traditional 3D basin modeling workflow is made of the following steps: construction of present day basin architecture, reconstruction of the structural evolution through time, together with fluid flow simulation and heat transfers. In this case, the forward simulation is limited to basin architecture, mainly controlled by erosion, sedimentation and vertical compaction. The tectonic deformation is limited to vertical slip along faults. Fault properties are modeled as vertical shear zones along which rock permeability is adjusted to enhance fluid flow or prevent flow to escape. For basins having experienced a more complex tectonic history, this approach is over-simplified. It fails in understanding and representing fluid flow paths due to structural evolution of the basin. This impacts overpressure build-up, and petroleum resources location. Over the past years, a new 3D basin forward code has been developed in IFP Energies nouvelles that is based on a cell centered finite volume discretization which preserves mass on an unstructured grid and describes the various changes in geometry and topology of a basin through time. At the same time, 3D restoration tools based on geomechanical principles of strain minimization were made available that offer a structural scenario at a discrete number of deformation stages of the basin. In this paper, we present workflows integrating these different innovative tools on complex faulted basin architectures where complex means moderate lateral as well as vertical deformation coupled with dynamic fault property modeling. Two synthetic case studies inspired by real basins have been used to illustrate how to apply the workflow, where the difficulties in the workflows are, and what the added value is compared with previous basin modeling approaches.

  7. Advanced Workflows for Fluid Transfer in Faulted Basins.

    OpenAIRE

    Thibaut Muriel; Jardin Anne; Faille Isabelle; Willien Françoise; Guichet Xavier

    2014-01-01

    basin modeling; fault; software;; International audience; The traditional 3D basin modeling workflow is made of the following steps: construction of present day basin architecture, reconstruction of the structural evolution through time, together with fluid flow simulation and heat transfers. In this case, the forward simulation is limited to basin architecture, mainly controlled by erosion, sedimentation and vertical compaction. The tectonic deformation is limited to vertical sli...

  8. Multidimensional Interactive Radiology Report and Analysis: standardization of workflow and reporting for renal mass tracking and quantification

    Science.gov (United States)

    Hwang, Darryl H.; Ma, Kevin; Yepes, Fernando; Nadamuni, Mridula; Nayyar, Megha; Liu, Brent; Duddalwar, Vinay; Lepore, Natasha

    2015-12-01

    A conventional radiology report primarily consists of a large amount of unstructured text, and lacks clear, concise, consistent and content-rich information. Hence, an area of unmet clinical need consists of developing better ways to communicate radiology findings and information specific to each patient. Here, we design a new workflow and reporting system that combines and integrates advances in engineering technology with those from the medical sciences, the Multidimensional Interactive Radiology Report and Analysis (MIRRA). Until recently, clinical standards have primarily relied on 2D images for the purpose of measurement, but with the advent of 3D processing, many of the manually measured metrics can be automated, leading to better reproducibility and less subjective measurement placement. Accordingly, we make use of this newly available 3D processing in our workflow. Our pipeline is used here to standardize the labeling, tracking, and quantifying of metrics for renal masses.
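The kind of automated 3D measurement the abstract describes can be illustrated with a small sketch; the binary lesion mask, voxel spacing, and metric names below are hypothetical examples, not MIRRA's actual pipeline:

```python
import numpy as np

def mass_metrics(mask, spacing=(1.0, 1.0, 1.0)):
    """Compute simple 3D metrics from a binary lesion mask.

    mask    -- 3D boolean array marking lesion voxels (hypothetical input)
    spacing -- voxel size in mm along each axis
    """
    voxel_volume = float(np.prod(spacing))            # mm^3 per voxel
    volume = mask.sum() * voxel_volume                # total lesion volume
    coords = np.argwhere(mask) * np.asarray(spacing)  # voxel centers in mm
    extent = coords.max(axis=0) - coords.min(axis=0) + np.asarray(spacing)
    return {"volume_mm3": volume, "extent_mm": extent.tolist()}

# A 2x2x2-voxel lesion inside a 10^3 volume, 0.5 mm isotropic voxels
mask = np.zeros((10, 10, 10), dtype=bool)
mask[4:6, 4:6, 4:6] = True
metrics = mass_metrics(mask, spacing=(0.5, 0.5, 0.5))
```

Because the metrics are derived directly from the segmentation, repeated runs give identical values, which is the reproducibility advantage over manual caliper placement.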

  9. Improved compliance by BPM-driven workflow automation.

    Science.gov (United States)

    Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin

    2014-12-01

    Using methods and technologies of business process management (BPM) for laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of model-driven development is improved transparency of processes and the alignment of process diagrams and technical code. First experiences of using the business process model and notation (BPMN) show that easy-to-read graphical process models can achieve and provide standardization of laboratory workflows. Model-based development allows processes to be changed quickly and adapted easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs for quality assurance. BPMN 2.0, as an automation language to control every kind of activity or subprocess, is directed at complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (e.g., screening procedures or analytical processes). This means that, with the BPMN standard, a communication method for sharing process knowledge among laboratories is also available.

  10. AnalyzeThis: An Analysis Workflow-Aware Storage System

    Energy Technology Data Exchange (ETDEWEB)

    Sim, Hyogi [ORNL; Kim, Youngjae [ORNL; Vazhkudai, Sudharshan S [ORNL; Tiwari, Devesh [ORNL; Anwar, Ali [Virginia Tech, Blacksburg, VA; Butt, Ali R [Virginia Tech, Blacksburg, VA; Ramakrishnan, Lavanya [Lawrence Berkeley National Laboratory (LBNL)

    2015-01-01

    The need for novel data analysis is urgent in the face of a data deluge from modern applications. Traditional approaches to data analysis incur significant data movement costs, moving data back and forth between the storage system and the processor. Emerging Active Flash devices enable processing on the flash, where the data already resides. An array of such Active Flash devices allows us to revisit how analysis workflows interact with storage systems. By seamlessly blending together the flash storage and data analysis, we create an analysis workflow-aware storage system, AnalyzeThis. Our guiding principle is that analysis-awareness be deeply ingrained in each and every layer of the storage, elevating data analyses as first-class citizens, and transforming AnalyzeThis into a potent analytics-aware appliance. We implement the AnalyzeThis storage system atop an emulation platform of the Active Flash array. Our results indicate that AnalyzeThis is viable, expediting workflow execution and minimizing data movement.

  11. Automated quality control in a file-based broadcasting workflow

    Science.gov (United States)

    Zhang, Lina

    2014-04-01

    Benefiting from the development of information and internet technologies, television broadcasting is transforming from inefficient tape-based production and distribution to integrated file-based workflows. However, no matter how many changes have taken place, successful broadcasting still depends on the ability to deliver a consistent, high-quality signal to audiences. After the transition from tape to file, traditional methods of manual quality control (QC) became inadequate, subjective, and inefficient. Based on China Central Television's fully file-based workflow at its new site, this paper introduces an automated quality control test system for the accurate detection of hidden defects in media content. It discusses the system framework and workflow control once automated QC is added. It puts forward a QC criterion and presents QC software that follows this criterion. It also reports experiments on QC speed using parallel processing and distributed computing. The performance of the test system shows that the adoption of automated QC can make production effective and efficient, and help the station achieve a competitive advantage in the media market.
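A minimal sketch of running independent QC checks in parallel, in the spirit of the abstract; the check names, clip fields, and thresholds are invented for illustration and are not CCTV's actual criteria:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical QC checks; a real system would probe codecs, levels, dropouts, etc.
def check_duration(clip):     return clip["duration_s"] > 0
def check_black_frames(clip): return clip["black_frames"] < 5
def check_loudness(clip):     return -24.0 <= clip["loudness_lufs"] <= -21.0

CHECKS = [check_duration, check_black_frames, check_loudness]

def qc_report(clips, max_workers=4):
    """Run every check on every clip in parallel; return failed checks per clip id."""
    def run_one(clip):
        failed = [c.__name__ for c in CHECKS if not c(clip)]
        return clip["id"], failed
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return dict(pool.map(run_one, clips))

clips = [
    {"id": "news01", "duration_s": 600, "black_frames": 0, "loudness_lufs": -23.0},
    {"id": "promo7", "duration_s": 0,   "black_frames": 9, "loudness_lufs": -23.0},
]
report = qc_report(clips)
```

Because each clip is checked independently, the work distributes naturally over threads, processes, or machines, which is where the speedups from parallel and distributed computing come from.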

  12. Layered Workflow Process Model Based on Extended Synchronizer

    Directory of Open Access Journals (Sweden)

    Gang Ni

    2014-07-01

    Full Text Available The layered workflow process model provides a modeling approach and analysis for the key process with Petri Nets. It not only describes the relation between the business flow process and transition nodes clearly, but also limits the rapid increase in the scale of places, transitions and directed arcs. This paper studies processes such as reservation and complaint handling in an information management system, especially the multi-merge and discriminator patterns, which cannot be directly modeled with existing synchronizers. Petri Nets are adopted to provide a formal description of the workflow patterns, and the relation between arcs and weight classes is also analyzed. We use the number of incoming and outgoing arcs to generalize the workflow into three synchronization modes: fully synchronous mode, competition synchronous mode and asynchronous mode. Types and parameters for synchronization are added to extend the modeling ability of the synchronizers, and the synchronous distance is also expanded. The extended synchronizers are able to terminate branches automatically or activate the next link many times, in addition to the abilities of the original synchronizers. Analysis of key business cases verifies that the original synchronizers cannot model these patterns directly, while the extended synchronizers based on Petri Nets can model the multi-merge and discriminator patterns.
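The difference between the synchronization modes can be illustrated with a toy Petri-net firing rule; the function names, marking layout, and mode names below are our own simplification, not the paper's formalism:

```python
# Minimal Petri-net sketch: a marking maps place names to token counts, and a
# transition fires when its synchronization mode is satisfied.

def enabled(marking, inputs, mode="all"):
    if mode == "all":   # fully synchronous: every input place must hold a token
        return all(marking[p] > 0 for p in inputs)
    if mode == "any":   # competition/asynchronous: one marked input suffices
        return any(marking[p] > 0 for p in inputs)
    raise ValueError(mode)

def fire(marking, inputs, outputs, mode="all"):
    """Consume one token from each marked input, add one to each output."""
    if not enabled(marking, inputs, mode):
        return marking  # transition blocked; marking unchanged
    m = dict(marking)
    for p in inputs:
        if m[p] > 0:
            m[p] -= 1
    for p in outputs:
        m[p] = m.get(p, 0) + 1
    return m

m0 = {"booked": 1, "paid": 0, "confirmed": 0}
m1 = fire(m0, ["booked", "paid"], ["confirmed"], mode="all")  # blocked: "paid" empty
m2 = fire(m0, ["booked", "paid"], ["confirmed"], mode="any")  # fires on "booked"
```

The "any" mode is what lets a discriminator-style pattern proceed on the first arriving branch instead of waiting for all of them.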

  13. Optimal Workflow Scheduling in Critical Infrastructure Systems with Neural Networks

    Directory of Open Access Journals (Sweden)

    S. Vukmirović

    2012-04-01

    Full Text Available Critical infrastructure systems (CISs, such as power grids, transportation systems, communication networks and water systems are the backbone of a country’s national security and industrial prosperity. These CISs execute large numbers of workflows with very high resource requirements that can span through different systems and last for a long time. The proper functioning and synchronization of these workflows is essential since humanity’s well-being is connected to it. Because of this, the challenge of ensuring availability and reliability of these services in the face of a broad range of operating conditions is very complicated. This paper proposes an architecture which dynamically executes a scheduling algorithm using feedback about the current status of CIS nodes. Different artificial neural networks (ANNs were created in order to solve the scheduling problem. Their performances were compared and as the main result of this paper, an optimal ANN architecture for workflow scheduling in CISs is proposed. A case study is shown for a meter data management system with measurements from a power distribution management system in Serbia. Performance tests show that significant improvement of the overall execution time can be achieved by ANNs.

  14. A Three-Layer Model for Business Processes——Process Logic, Case Semantics and Workflow Management

    Institute of Scientific and Technical Information of China (English)

    Chong-Yi Yuan; Wen Zhao; Shi-Kun Zhang; Yu Huang

    2007-01-01

    Workflow management aims at the controlling, monitoring, optimizing and supporting of business processes. Well designed formal models will facilitate such management since they provide explicit representations of business processes as the basis for computerized analysis, verification and execution. Petri Nets have been recognized as the most suitable candidate for workflow modeling, and as such, formal models based on Petri Nets have been proposed, among them WF-net by Aalst is the most popular one. But WF-net has turned out to be conceptually chaotic as will be illustrated in this paper with an example from Aalst's book. This paper proposes a series of models for the description and analysis of business processes at conceptually different hierarchical layers. Analytic goals and methods at these layers are also discussed. The underlying structure, shared by all these models, is SYNCHRONIZER, which is designed with the guidance of the synchrony theory of GNT (General Net Theory) and serves as the conceptual foundation of workflow formal models. Structurally, synchronizers connect tasks to form a whole, while dynamically synchronizers control tasks to achieve synchronization.

  15. MCRUNJOB: A High energy physics workflow planner for grid production processing

    Energy Technology Data Exchange (ETDEWEB)

    Graham, Gregory E.

    2004-08-26

    McRunjob is a powerful grid workflow manager used to manage the generation of large numbers of production processing jobs in High Energy Physics. In use at both the DZero and CMS experiments, McRunjob has been used to manage large Monte Carlo production processing since 1999 and is being extended to uses in regular production processing for analysis and reconstruction. Described at CHEP 2001, McRunjob converts core metadata into jobs submittable in a variety of environments. The powerful core metadata description language includes methods for converting the metadata into persistent forms, job descriptions, multi-step workflows, and data provenance information. The language features allow for structure in the metadata by including full expressions, namespaces, functional dependencies, site specific parameters in a grid environment, and ontological definitions. It also has simple control structures for parallelization of large jobs. McRunjob features a modular design which allows for easy expansion to new job description languages or new application level tasks.
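The expansion of a core metadata description into submittable job descriptions might be sketched as follows; all field names and the splitting rule are assumptions for illustration, not McRunjob's actual metadata language:

```python
# Hypothetical planner: one metadata record describing a production request is
# expanded into per-job descriptions by partitioning the event range.

def plan_jobs(meta):
    n_jobs = -(-meta["total_events"] // meta["events_per_job"])  # ceiling division
    jobs = []
    for i in range(n_jobs):
        first = i * meta["events_per_job"]
        jobs.append({
            "name": f'{meta["dataset"]}_job{i:03d}',
            "first_event": first,
            "n_events": min(meta["events_per_job"], meta["total_events"] - first),
            "executable": meta["executable"],
        })
    return jobs

meta = {"dataset": "mc_zmumu", "total_events": 2500,
        "events_per_job": 1000, "executable": "cmsRun"}
jobs = plan_jobs(meta)
```

A real planner would additionally render each description into the target batch or grid submission language and record provenance, but the core pattern is this metadata-to-jobs expansion.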

  16. Nurses' perceptions of how clinical information system implementation affects workflow and patient care.

    Science.gov (United States)

    Ward, Marcia M; Vartak, Smruti; Schwichtenberg, Tammy; Wakefield, Douglas S

    2011-09-01

    There is little evidence of the impact of clinical information system implementation on nurses' workflow and patient care to guide institutions across the nation as they implement electronic health records. This study compared changes in nurses' perceptions about patient care processes and workflow before and after a comprehensive clinical information system implementation at a rural referral hospital. The study used the Information Systems Expectations and Experiences survey, which consists of seven scales: provider-patient communication, interprovider communication, interorganizational communication, work-life changes, improved care, support and resources, and patient care processes. Survey responses were examined across three administrations: before training, after training, and after implementation. The survey responses decreased significantly for eight of the 47 survey items from the first administration to the second and for 37 items from the second administration to the third. Perceptions were more positive in nurses who had previous experience with electronic health records and less positive in nurses with more years of work experience. These findings point to the importance of setting realistic expectations, assessing user perceptions throughout the implementation process, designing training to meet the needs of the end user, and adapting training and implementation processes to support nurses who have concerns.

  17. WorkflowNet2BPEL4WS: A Tool for Translating Unstructured Workflow Processes to Readable BPEL

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; van der Aalst, Wil M. P.

    2007-01-01

    code and not easy to use by end-users. Therefore, we provide a mapping from WF-nets to BPEL. This mapping builds on the rich theory of Petri nets and can also be used to map other languages (e.g., UML, EPC, BPMN, etc.) onto BPEL. To evaluate WorkflowNet2BPEL4WS we used more than 100 processes modeled...

  18. Workflow in clinical trial sites & its association with near miss events for data quality: ethnographic, workflow & systems simulation.

    Directory of Open Access Journals (Sweden)

    Elias Cesar Araujo de Carvalho

    Full Text Available BACKGROUND: With the exponential expansion of clinical trials conducted in BRIC (Brazil, Russia, India, and China) and VISTA (Vietnam, Indonesia, South Africa, Turkey, and Argentina) countries, corresponding gains in cost and enrolment efficiency quickly outpace the consonant metrics in traditional countries in North America and the European Union. However, questions still remain regarding the quality of data being collected in these countries. We used ethnographic, mapping and computer simulation studies to identify/address areas of threat to near miss events for data quality in two cancer trial sites in Brazil. METHODOLOGY/PRINCIPAL FINDINGS: Two sites in São Paulo and Rio de Janeiro were evaluated using ethnographic observations of workflow during subject enrolment and data collection. Emerging themes related to threats to near miss events for data quality were derived from observations. They were then transformed into workflows using UML-AD and modeled using System Dynamics. 139 tasks were observed and mapped through the ethnographic study. The UML-AD detected four major activities in the workflow: evaluation of potential research subjects prior to signature of informed consent, visit to obtain the subject's informed consent, regular data collection sessions following the study protocol, and closure of the study protocol for a given project. Field observations pointed to three major emerging themes: (a) lack of a standardized process for data registration at the source document, (b) multiplicity of data repositories and (c) scarcity of decision support systems at the point of research intervention. Simulation with the policy model demonstrates a reduction of the rework problem. CONCLUSIONS/SIGNIFICANCE: Patterns of threats to data quality at the two sites were similar to the threats reported in the literature for American sites. The clinical trial site managers need to reorganize staff workflow by using information technology more efficiently, establish new standard procedures and

  19. Inter-laboratory evaluation of instrument platforms and experimental workflows for quantitative accuracy and reproducibility assessment

    Directory of Open Access Journals (Sweden)

    Andrew J. Percy

    2015-09-01

    Full Text Available The reproducibility of plasma protein quantitation between laboratories and between instrument types was examined in a large-scale international study involving 16 laboratories and 19 LC–MS/MS platforms, using two kits designed to evaluate instrument performance and one kit designed to evaluate the entire bottom-up workflow. There was little effect of instrument type on the quality of the results, demonstrating the robustness of LC/MRM-MS with isotopically labeled standards. Technician skill was a factor, as errors in sample preparation and sub-optimal LC–MS performance were evident. This highlights the importance of proper training and routine quality control before quantitation is done on patient samples.
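The quantitation with isotopically labeled standards that underlies this workflow reduces to a simple peak-area ratio calculation, sketched here with invented peak areas and concentrations:

```python
# Hedged sketch of stable-isotope-dilution quantitation: the endogenous
# ("light") peptide's peak area is compared against a spiked isotopically
# labeled ("heavy") standard of known concentration.  All numbers are
# illustrative, not from the study.

def quantify(light_area, heavy_area, heavy_conc_fmol_ul):
    """Concentration of the endogenous peptide from the light/heavy area ratio."""
    if heavy_area <= 0:
        raise ValueError("heavy standard not detected")
    return (light_area / heavy_area) * heavy_conc_fmol_ul

conc = quantify(light_area=4.2e6, heavy_area=2.1e6, heavy_conc_fmol_ul=50.0)
```

Because the heavy standard co-elutes and ionizes like the analyte, instrument-to-instrument response differences largely cancel in the ratio, which is one reason the study saw little effect of instrument type.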

  20. Observing health professionals' workflow patterns for diabetes care - First steps towards an ontology for EHR services.

    Science.gov (United States)

    Schweitzer, M; Lasierra, N; Hoerbst, A

    2015-01-01

    Increasing flexibility from a user perspective and enabling workflow-based interaction facilitates easy, user-friendly utilization of EHRs in healthcare professionals' daily work. To offer such versatile EHR functionality, our approach is based on the execution of clinical workflows by means of a composition of semantic web services. The backbone of such an architecture is an ontology which enables the representation of clinical workflows and facilitates the selection of suitable services. In this paper we present the methods and results of observations of routine diabetes consultations, which were conducted in order to identify those workflows and the relations among the included tasks. The workflows were first modeled in BPMN and then generalized. As a following step in our study, interviews will be conducted with clinical personnel to validate the modeled workflows.

  1. Learning Clinical Workflows to Identify Subgroups of Heart Failure Patients

    Science.gov (United States)

    Yan, Chao; Chen, You; Li, Bo; Liebovitz, David; Malin, Bradley

    2016-01-01

    Heart Failure (HF) is one of the most common indications for readmission to the hospital among elderly patients. This is due to the progressive nature of the disease, as well as its association with complex comorbidities (e.g., anemia, chronic kidney disease, chronic obstructive pulmonary disease, hyper- and hypothyroidism), which contribute to increased morbidity and mortality, as well as a reduced quality of life. Healthcare organizations (HCOs) have established diverse treatment plans for HF patients, but such routines are not always formalized and may, in fact, arise organically as a patient’s management evolves over time. This investigation was motivated by the hypothesis that patients associated with a certain subgroup of HF should follow a similar workflow that, once made explicit, could be leveraged by an HCO to more effectively allocate resources and manage HF patients. Thus, in this paper, we introduce a method to identify subgroups of HF through a similarity analysis of event sequences documented in the clinical setting. Specifically, we 1) structure event sequences for HF patients based on the patterns of electronic medical record (EMR) system utilization, 2) identify subgroups of HF patients by applying a k-means clustering algorithm on utilization patterns, 3) learn clinical workflows for each subgroup, and 4) label each subgroup with diagnosis and procedure codes that are distinguishing in the set of all subgroups. To demonstrate its potential, we applied our method to EMR event logs for 785 HF inpatient stays over a 4 month period at a large academic medical center. Our method identified 8 subgroups of HF, each of which was found to associate with a canonical workflow inferred through an inductive mining algorithm. Each subgroup was further confirmed to be affiliated with specific comorbidities, such as hyperthyroidism and hypothyroidism. PMID:28269922
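Step 2 of the method (clustering utilization patterns) can be sketched with a tiny k-means; the per-stay utilization vectors below are invented toy data, and a real analysis would use far richer EMR features:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Tiny k-means for small demos; points are equal-length numeric tuples."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center (squared Euclidean distance)
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centers[j])))
            clusters[j].append(p)
        # Recompute centers as cluster means (keep old center if cluster empty)
        centers = [tuple(sum(c) / len(c) for c in zip(*cl)) if cl else centers[j]
                   for j, cl in enumerate(clusters)]
    return centers, clusters

# Hypothetical per-stay EMR utilization vectors: (order entries, note views)
stays = [(90, 5), (85, 8), (88, 6), (10, 70), (12, 65), (8, 72)]
centers, clusters = kmeans(stays, k=2)
```

Each resulting cluster would then be mined for its canonical workflow and labeled with its distinguishing diagnosis and procedure codes, as in steps 3 and 4 of the paper.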

  2. Research on Custom Workflow in Office Automation

    Institute of Scientific and Technical Information of China (English)

    杨镒菲

    2012-01-01

    Custom workflow means that users define the document circulation nodes and conditions in the software system according to the actual business processes and activities, driving documents to flow along the designed work paths. For the analysis and design of the custom workflow, the paper uses the UML modeling language to give class diagrams and sequence diagrams of the system, and implements a prototype system. Applying the custom workflow to office automation systems, its flexibility avoids the need to modify the system when work processes change, and improves work efficiency.

  3. The Workflow Specification of Process Definition

    Institute of Scientific and Technical Information of China (English)

    缪晓阳; 石文俊; 吴朝晖

    2000-01-01

    This paper discusses the representation of a business process in a formal way. There are three basic aspects: the concept of workflow process definition, on which the idea of process definition interchange is based; the meta-model of workflow, which is used to describe the entities, and the attributes of entities, within the process definition; and the workflow process definition language (WPDL), which is used to implement the process definition.
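A minimal meta-model in this spirit might look as follows; the class and method names are our own illustration, not the WfMC/WPDL specification's:

```python
from dataclasses import dataclass, field

@dataclass
class ProcessDefinition:
    """Toy meta-model: a process owns activities and transitions between them."""
    name: str
    activities: set = field(default_factory=set)
    transitions: list = field(default_factory=list)  # (from, to) pairs

    def add_transition(self, src, dst):
        # Enforce the meta-model constraint that transitions only connect
        # declared activities.
        if src not in self.activities or dst not in self.activities:
            raise ValueError(f"undeclared activity in {src!r} -> {dst!r}")
        self.transitions.append((src, dst))

    def successors(self, activity):
        return [dst for src, dst in self.transitions if src == activity]

p = ProcessDefinition("leave_request", {"submit", "approve", "archive"})
p.add_transition("submit", "approve")
p.add_transition("approve", "archive")
```

A process definition language such as WPDL is essentially a textual serialization of entities and attributes like these, which is what makes interchange between tools possible.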

  4. Leveraging an existing data warehouse to annotate workflow models for operations research and optimization.

    Science.gov (United States)

    Borlawsky, Tara; LaFountain, Jeanne; Petty, Lynda; Saltz, Joel H; Payne, Philip R O

    2008-11-06

    Workflow analysis is frequently performed in the context of operations research and process optimization. In order to develop a data-driven workflow model that can be employed to assess opportunities to improve the efficiency of perioperative care teams at The Ohio State University Medical Center (OSUMC), we have developed a method for integrating standard workflow modeling formalisms, such as UML activity diagrams with data-centric annotations derived from our existing data warehouse.

  5. Allocation optimale multicontraintes des workflows aux ressources d’un environnement Cloud Computing

    OpenAIRE

    Yassa, Sonia

    2014-01-01

    Cloud Computing is increasingly recognized as a new way to use on-demand computing, storage and network services in a transparent and efficient way. In this thesis, we address the problem of workflow scheduling on the distributed, heterogeneous infrastructure of Cloud Computing. Existing workflow scheduling approaches mainly focus on bi-objective optimization of makespan and cost. In this thesis, we propose new workflow scheduling algorithms based on metaheuristics. Our algori...

  6. Robust Workflow Systems + Flexible Geoprocessing Services = Geo-enabled Model Web?

    OpenAIRE

    GRANELL CANUT CARLOS

    2013-01-01

    The chapter begins by briefly exploring the concept of modeling in the geosciences, which notably benefits from advances in the integration of geoprocessing services and workflow systems. In Section 3, we provide a comprehensive background on the technology trends treated in the chapter. On the one hand, we deal with workflow systems, normally categorized in the literature as scientific and business workflow systems (Barga and Gannon 2007). In particular, we introduce some prominent examples of scient...

  7. Big data analytics workflow management for eScience

    Science.gov (United States)

    Fiore, Sandro; D'Anca, Alessandro; Palazzo, Cosimo; Elia, Donatello; Mariello, Andrea; Nassisi, Paola; Aloisio, Giovanni

    2015-04-01

    In many domains such as climate and astrophysics, scientific data is often n-dimensional and requires tools that support specialized data types and primitives if it is to be properly stored, accessed, analysed and visualized. Currently, scientific data analytics relies on domain-specific software and libraries providing a huge set of operators and functionalities. However, most of these software tools fail at large scale since they: (i) are desktop based, rely on local computing capabilities and need the data locally; (ii) cannot benefit from available multicore/parallel machines since they are based on sequential codes; (iii) do not provide declarative languages to express scientific data analysis tasks, and (iv) do not provide newer or more scalable storage models to better support the data multidimensionality. Additionally, most of them: (v) are domain-specific, which also means they support a limited set of data formats, and (vi) do not provide workflow support, to enable the construction, execution and monitoring of more complex "experiments". The Ophidia project aims at facing most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides several parallel operators to manipulate large datasets. Some relevant examples include: (i) data sub-setting (slicing and dicing), (ii) data aggregation, (iii) array-based primitives (the same operator applies to all the implemented UDF extensions), (iv) data cube duplication, (v) data cube pivoting, (vi) NetCDF import and export. Metadata operators are available too. Additionally, the Ophidia framework provides array-based primitives to perform data sub-setting, data aggregation (e.g., max, min, avg), array concatenation, algebraic expressions and predicate evaluation on large arrays of scientific data. Bit-oriented plugins have also been implemented to manage binary data cubes. Defining processing chains and workflows with tens or hundreds of data analytics operators is the
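The array-based primitives the abstract lists (sub-setting, aggregation) can be sketched with NumPy; the cube layout and function names are assumptions for illustration, not Ophidia's actual operators:

```python
import numpy as np

# A toy data cube with an assumed (time, lat, lon) axis layout.
cube = np.arange(24, dtype=float).reshape(2, 3, 4)

def subset(c, time=slice(None), lat=slice(None), lon=slice(None)):
    """Slice-and-dice: restrict the cube along any combination of axes."""
    return c[time, lat, lon]

def aggregate(c, op="avg", axis=0):
    """Reduce along one axis with max/min/avg, as in the abstract's examples."""
    return {"max": np.max, "min": np.min, "avg": np.mean}[op](c, axis=axis)

slab = subset(cube, time=0)            # first time step, shape (3, 4)
tavg = aggregate(cube, "avg", axis=0)  # time mean, shape (3, 4)
```

In a framework like Ophidia the same logical operations run as parallel server-side operators over partitioned cubes, so they compose into the large workflows the abstract describes.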

  8. WooW-II: Workshop on open workflows

    Directory of Open Access Journals (Sweden)

    Daniel Arribas-Bel

    2015-07-01

    Full Text Available This resource describes WooW-II, a two-day workshop on open workflows for quantitative social scientists. The workshop is broken down into five main parts, each of which typically consists of an introductory tutorial and a hands-on assignment. The specific tools discussed in this workshop are Markdown, Pandoc, Git, GitHub, R, and RStudio, but the theoretical approach applies to a wider range of tools (e.g., LaTeX and Python). By the end of the workshop, participants should be able to reproduce a paper of their own and make it available in an open form by applying the concepts and tools introduced.

  9. Reference and PDF-manager software: complexities, support and workflow.

    Science.gov (United States)

    Mead, Thomas L; Berryman, Donna R

    2010-10-01

    In the past, librarians taught reference management by training library users to use established software programs such as RefWorks or EndNote. In today's environment, there is a proliferation of Web-based programs that are being used by library clientele that offer a new twist on the well-known reference management programs. Basically, these new programs are PDF-manager software (e.g., Mendeley or Papers). Librarians are faced with new questions, issues, and concerns, given the new workflows and pathways that these PDF-manager programs present. This article takes a look at some of those.

  10. PhyloGrid: a development for a workflow in Phylogeny

    CERN Document Server

    Montes, Esther; Mayo, Rafael

    2010-01-01

    In this work we present the development of a Taverna-based workflow to be used for phylogenetic calculations by means of the MrBayes tool. It has a user-friendly interface developed with the GridSphere framework. The user is able to define the parameters for the Bayesian calculation, determine the model of evolution, check the accuracy of the results at intermediate stages, and perform a multiple alignment of the sequences prior to the final result. No knowledge of the underlying computational procedure is required on the user's part.

  11. Workflow for large-scale analysis of melanoma tissue samples

    Directory of Open Access Journals (Sweden)

    Maria E. Yakovleva

    2015-09-01

    Full Text Available The aim of the present study was to create an optimal workflow for analysing a large cohort of malignant melanoma tissue samples. Samples were lysed with urea and enzymatically digested with trypsin or trypsin/Lys C. Buffer exchange or dilution was used to reduce the urea concentration prior to digestion. The tissue digests were analysed directly, or following strong cation exchange (SCX) fractionation, by nano LC–MS/MS. The approach that resulted in the largest number of protein IDs involved a buffer exchange step before enzymatic digestion with trypsin and chromatographic separation in a 120 min gradient, followed by SCX–RP separation of peptides.
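    The preparation and analysis steps above form a linear pipeline. A minimal sketch of chaining such steps in order (the step names mirror the protocol in the abstract and are hypothetical, not an actual lab-software API):

    ```python
    # Hypothetical sketch of chaining workflow steps; step names mirror the
    # protocol in the abstract and are not an actual lab-software API.

    def lyse_with_urea(sample):
        return sample + ["urea lysis"]

    def buffer_exchange(sample):
        return sample + ["buffer exchange"]

    def digest(sample, enzyme="trypsin"):
        return sample + [f"{enzyme} digestion"]

    def scx_fractionation(sample):
        return sample + ["SCX fractionation"]

    def lc_msms(sample, gradient_min=120):
        return sample + [f"nano LC-MS/MS ({gradient_min} min gradient)"]

    def run_pipeline(sample, steps):
        """Apply each step in order, accumulating a processing history."""
        for step in steps:
            sample = step(sample)
        return sample

    history = run_pipeline(["tissue sample"], [
        lyse_with_urea, buffer_exchange, digest, scx_fractionation, lc_msms,
    ])
    print(history[-1])  # nano LC-MS/MS (120 min gradient)
    ```

    Modelling each step as a function makes it easy to swap variants (e.g. trypsin/Lys C digestion, or skipping SCX fractionation) when comparing workflow branches.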

  12. CMS Data Processing Workflows during an Extended Cosmic Ray Run

    Energy Technology Data Exchange (ETDEWEB)

    2009-11-01

    The CMS Collaboration conducted a month-long data taking exercise, the Cosmic Run At Four Tesla, during October-November 2008, with the goal of commissioning the experiment for extended operation. With all installed detector systems participating, CMS recorded 270 million cosmic ray events with the solenoid at a magnetic field strength of 3.8 T. This paper describes the data flow from the detector through the various online and offline computing systems, as well as the workflows used for recording the data, for aligning and calibrating the detector, and for analysis of the data.

  13. CMS Data Processing Workflows during an Extended Cosmic Ray Run

    CERN Document Server

    Chatrchyan, S; Sirunyan, A M; Adam, W; Arnold, B; Bergauer, H; Bergauer, T; Dragicevic, M; Eichberger, M; Erö, J; Friedl, M; Frühwirth, R; Ghete, V M; Hammer, J; Hänsel, S; Hoch, M; Hörmann, N; Hrubec, J; Jeitler, M; Kasieczka, G; Kastner, K; Krammer, M; Liko, D; Magrans de Abril, I; Mikulec, I; Mittermayr, F; Neuherz, B; Oberegger, M; Padrta, M; Pernicka, M; Rohringer, H; Schmid, S; Schöfbeck, R; Schreiner, T; Stark, R; Steininger, H; Strauss, J; Taurok, A; Teischinger, F; Themel, T; Uhl, D; Wagner, P; Waltenberger, W; Walzel, G; Widl, E; Wulz, C E; Chekhovsky, V; Dvornikov, O; Emeliantchik, I; Litomin, A; Makarenko, V; Marfin, I; Mossolov, V; Shumeiko, N; Solin, A; Stefanovitch, R; Suarez Gonzalez, J; Tikhonov, A; Fedorov, A; Karneyeu, A; Korzhik, M; Panov, V; Zuyeuski, R; Kuchinsky, P; Beaumont, W; Benucci, L; Cardaci, M; De Wolf, E A; Delmeire, E; Druzhkin, D; Hashemi, M; Janssen, X; Maes, T; Mucibello, L; Ochesanu, S; Rougny, R; Selvaggi, M; Van Haevermaet, H; Van Mechelen, P; Van Remortel, N; Adler, V; Beauceron, S; Blyweert, S; D'Hondt, J; De Weirdt, S; Devroede, O; Heyninck, J; Kalogeropoulos, A; Maes, J; Maes, M; Mozer, M U; Tavernier, S; Van Doninck, W; Van Mulders, P; Villella, I; Bouhali, O; Chabert, E C; Charaf, O; Clerbaux, B; De Lentdecker, G; Dero, V; Elgammal, S; Gay, A P R; Hammad, G H; Marage, P E; Rugovac, S; Vander Velde, C; Vanlaer, P; Wickens, J; Grunewald, M; Klein, B; Marinov, A; Ryckbosch, D; Thyssen, F; Tytgat, M; Vanelderen, L; Verwilligen, P; Basegmez, S; Bruno, G; Caudron, J; Delaere, C; Demin, P; Favart, D; Giammanco, A; Grégoire, G; Lemaitre, V; Militaru, O; Ovyn, S; Piotrzkowski, K; Quertenmont, L; Schul, N; Beliy, N; Daubie, E; Alves, G A; Pol, M E; Souza, M H G; Carvalho, W; De Jesus Damiao, D; De Oliveira Martins, C; Fonseca De Souza, S; Mundim, L; Oguri, V; Santoro, A; Silva Do Amaral, S M; Sznajder, A; Fernandez Perez Tomei, T R; Ferreira Dias, M A; Gregores, E M; Novaes, S F; Abadjiev, K; Anguelov, T; Damgov, J; Darmenov, 
N; Dimitrov, L; Genchev, V; Iaydjiev, P; Piperov, S; Stoykova, S; Sultanov, G; Trayanov, R; Vankov, I; Dimitrov, A; Dyulendarova, M; Kozhuharov, V; Litov, L; Marinova, E; Mateev, M; Pavlov, B; Petkov, P; Toteva, Z; Chen, G M; Chen, H S; Guan, W; Jiang, C H; Liang, D; Liu, B; Meng, X; Tao, J; Wang, J; Wang, Z; Xue, Z; Zhang, Z; Ban, Y; Cai, J; Ge, Y; Guo, S; Hu, Z; Mao, Y; Qian, S J; Teng, H; Zhu, B; Avila, C; Baquero Ruiz, M; Carrillo Montoya, C A; Gomez, A; Gomez Moreno, B; Ocampo Rios, A A; Osorio Oliveros, A F; Reyes Romero, D; Sanabria, J C; Godinovic, N; Lelas, K; Plestina, R; Polic, D; Puljak, I; Antunovic, Z; Dzelalija, M; Brigljevic, V; Duric, S; Kadija, K; Morovic, S; Fereos, R; Galanti, M; Mousa, J; Papadakis, A; Ptochos, F; Razis, P A; Tsiakkouri, D; Zinonos, Z; Hektor, A; Kadastik, M; Kannike, K; Müntel, M; Raidal, M; Rebane, L; Anttila, E; Czellar, S; Härkönen, J; Heikkinen, A; Karimäki, V; Kinnunen, R; Klem, J; Kortelainen, M J; Lampén, T; Lassila-Perini, K; Lehti, S; Lindén, T; Luukka, P; Mäenpää, T; Nysten, J; Tuominen, E; Tuominiemi, J; Ungaro, D; Wendland, L; Banzuzi, K; Korpela, A; Tuuva, T; Nedelec, P; Sillou, D; Besancon, M; Chipaux, R; Dejardin, M; Denegri, D; Descamps, J; Fabbro, B; Faure, J L; Ferri, F; Ganjour, S; Gentit, F X; Givernaud, A; Gras, P; Hamel de Monchenault, G; Jarry, P; Lemaire, M C; Locci, E; Malcles, J; Marionneau, M; Millischer, L; Rander, J; Rosowsky, A; Rousseau, D; Titov, M; Verrecchia, P; Baffioni, S; Bianchini, L; Bluj, M; Busson, P; Charlot, C; Dobrzynski, L; Granier de Cassagnac, R; Haguenauer, M; Miné, P; Paganini, P; Sirois, Y; Thiebaux, C; Zabi, A; Agram, J L; Besson, A; Bloch, D; Bodin, D; Brom, J M; Conte, E; Drouhin, F; Fontaine, J C; Gelé, D; Goerlach, U; Gross, L; Juillot, P; Le Bihan, A C; Patois, Y; Speck, J; Van Hove, P; Baty, C; Bedjidian, M; Blaha, J; Boudoul, G; Brun, H; Chanon, N; Chierici, R; Contardo, D; Depasse, P; Dupasquier, T; El Mamouni, H; Fassi, F; Fay, J; Gascon, S; Ille, B; Kurca, T; Le 
Grand, T; Lethuillier, M; Lumb, N; Mirabito, L; Perries, S; Vander Donckt, M; Verdier, P; Djaoshvili, N; Roinishvili, N; Roinishvili, V; Amaglobeli, N; Adolphi, R; Anagnostou, G; Brauer, R; Braunschweig, W; Edelhoff, M; Esser, H; Feld, L; Karpinski, W; Khomich, A; Klein, K; Mohr, N; Ostaptchouk, A; Pandoulas, D; Pierschel, G; Raupach, F; Schael, S; Schultz von Dratzig, A; Schwering, G; Sprenger, D; Thomas, M; Weber, M; Wittmer, B; Wlochal, M; Actis, O; Altenhöfer, G; Bender, W; Biallass, P; Erdmann, M; Fetchenhauer, G; Frangenheim, J; Hebbeker, T; Hilgers, G; Hinzmann, A; Hoepfner, K; Hof, C; Kirsch, M; Klimkovich, T; Kreuzer, P; Lanske, D; Merschmeyer, M; Meyer, A; Philipps, B; Pieta, H; Reithler, H; Schmitz, S A; Sonnenschein, L; Sowa, M; Steggemann, J; Szczesny, H; Teyssier, D; Zeidler, C; Bontenackels, M; Davids, M; Duda, M; Flügge, G; Geenen, H; Giffels, M; Haj Ahmad, W; Hermanns, T; Heydhausen, D; Kalinin, S; Kress, T; Linn, A; Nowack, A; Perchalla, L; Poettgens, M; Pooth, O; Sauerland, P; Stahl, A; Tornier, D; Zoeller, M H; Aldaya Martin, M; Behrens, U; Borras, K; Campbell, A; Castro, E; Dammann, D; Eckerlin, G; Flossdorf, A; Flucke, G; Geiser, A; Hatton, D; Hauk, J; Jung, H; Kasemann, M; Katkov, I; Kleinwort, C; Kluge, H; Knutsson, A; Kuznetsova, E; Lange, W; Lohmann, W; Mankel, R; Marienfeld, M; Meyer, A B; Miglioranzi, S; Mnich, J; Ohlerich, M; Olzem, J; Parenti, A; Rosemann, C; Schmidt, R; Schoerner-Sadenius, T; Volyanskyy, D; Wissing, C; Zeuner, W D; Autermann, C; Bechtel, F; Draeger, J; Eckstein, D; Gebbert, U; Kaschube, K; Kaussen, G; Klanner, R; Mura, B; Naumann-Emme, S; Nowak, F; Pein, U; Sander, C; Schleper, P; Schum, T; Stadie, H; Steinbrück, G; Thomsen, J; Wolf, R; Bauer, J; Blüm, P; Buege, V; Cakir, A; Chwalek, T; De Boer, W; Dierlamm, A; Dirkes, G; Feindt, M; Felzmann, U; Frey, M; Furgeri, A; Gruschke, J; Hackstein, C; Hartmann, F; Heier, S; Heinrich, M; Held, H; Hirschbuehl, D; Hoffmann, K H; Honc, S; Jung, C; Kuhr, T; Liamsuwan, T; Martschei, 
D; Mueller, S; Müller, Th; Neuland, M B; Niegel, M; Oberst, O; Oehler, A; Ott, J; Peiffer, T; Piparo, D; Quast, G; Rabbertz, K; Ratnikov, F; Ratnikova, N; Renz, M; Saout, C; Sartisohn, G; Scheurer, A; Schieferdecker, P; Schilling, F P; Schott, G; Simonis, H J; Stober, F M; Sturm, P; Troendle, D; Trunov, A; Wagner, W; Wagner-Kuhr, J; Zeise, M; Zhukov, V; Ziebarth, E B; Daskalakis, G; Geralis, T; Karafasoulis, K; Kyriakis, A; Loukas, D; Markou, A; Markou, C; Mavrommatis, C; Petrakou, E; Zachariadou, A; Gouskos, L; Katsas, P; Panagiotou, A; Evangelou, I; Kokkas, P; Manthos, N; Papadopoulos, I; Patras, V; Triantis, F A; Bencze, G; Boldizsar, L; Debreczeni, G; Hajdu, C; Hernath, S; Hidas, P; Horvath, D; Krajczar, K; Laszlo, A; Patay, G; Sikler, F; Toth, N; Vesztergombi, G; Beni, N; Christian, G; Imrek, J; Molnar, J; Novak, D; Palinkas, J; Szekely, G; Szillasi, Z; Tokesi, K; Veszpremi, V; Kapusi, A; Marian, G; Raics, P; Szabo, Z; Trocsanyi, Z L; Ujvari, B; Zilizi, G; Bansal, S; Bawa, H S; Beri, S B; Bhatnagar, V; Jindal, M; Kaur, M; Kaur, R; Kohli, J M; Mehta, M Z; Nishu, N; Saini, L K; Sharma, A; Singh, A; Singh, J B; Singh, S P; Ahuja, S; Arora, S; Bhattacharya, S; Chauhan, S; Choudhary, B C; Gupta, P; Jain, S; Jain, S; Jha, M; Kumar, A; Ranjan, K; Shivpuri, R K; Srivastava, A K; Choudhury, R K; Dutta, D; Kailas, S; Kataria, S K; Mohanty, A K; Pant, L M; Shukla, P; Topkar, A; Aziz, T; Guchait, M; Gurtu, A; Maity, M; Majumder, D; Majumder, G; Mazumdar, K; Nayak, A; Saha, A; Sudhakar, K; Banerjee, S; Dugad, S; Mondal, N K; Arfaei, H; Bakhshiansohi, H; Fahim, A; Jafari, A; Mohammadi Najafabadi, M; Moshaii, A; Paktinat Mehdiabadi, S; Rouhani, S; Safarzadeh, B; Zeinali, M; Felcini, M; Abbrescia, M; Barbone, L; Chiumarulo, F; Clemente, A; Colaleo, A; Creanza, D; Cuscela, G; De Filippis, N; De Palma, M; De Robertis, G; Donvito, G; Fedele, F; Fiore, L; Franco, M; Iaselli, G; Lacalamita, N; Loddo, F; Lusito, L; Maggi, G; Maggi, M; Manna, N; Marangelli, B; My, S; Natali, S; 
Nuzzo, S; Papagni, G; Piccolomo, S; Pierro, G A; Pinto, C; Pompili, A; Pugliese, G; Rajan, R; Ranieri, A; Romano, F; Roselli, G; Selvaggi, G; Shinde, Y; Silvestris, L; Tupputi, S; Zito, G; Abbiendi, G; Bacchi, W; Benvenuti, A C; Boldini, M; Bonacorsi, D; Braibant-Giacomelli, S; Cafaro, V D; Caiazza, S S; Capiluppi, P; Castro, A; Cavallo, F R; Codispoti, G; Cuffiani, M; D'Antone, I; Dallavalle, G M; Fabbri, F; Fanfani, A; Fasanella, D; Giacomelli, P; Giordano, V; Giunta, M; Grandi, C; Guerzoni, M; Marcellini, S; Masetti, G; Montanari, A; Navarria, F L; Odorici, F; Pellegrini, G; Perrotta, A; Rossi, A M; Rovelli, T; Siroli, G; Torromeo, G; Travaglini, R; Albergo, S; Costa, S; Potenza, R; Tricomi, A; Tuve, C; Barbagli, G; Broccolo, G; Ciulli, V; Civinini, C; D'Alessandro, R; Focardi, E; Frosali, S; Gallo, E; Genta, C; Landi, G; Lenzi, P; Meschini, M; Paoletti, S; Sguazzoni, G; Tropiano, A; Benussi, L; Bertani, M; Bianco, S; Colafranceschi, S; Colonna, D; Fabbri, F; Giardoni, M; Passamonti, L; Piccolo, D; Pierluigi, D; Ponzio, B; Russo, A; Fabbricatore, P; Musenich, R; Benaglia, A; Calloni, M; Cerati, G B; D'Angelo, P; De Guio, F; Farina, F M; Ghezzi, A; Govoni, P; Malberti, M; Malvezzi, S; Martelli, A; Menasce, D; Miccio, V; Moroni, L; Negri, P; Paganoni, M; Pedrini, D; Pullia, A; Ragazzi, S; Redaelli, N; Sala, S; Salerno, R; Tabarelli de Fatis, T; Tancini, V; Taroni, S; Buontempo, S; Cavallo, N; Cimmino, A; De Gruttola, M; Fabozzi, F; Iorio, A O M; Lista, L; Lomidze, D; Noli, P; Paolucci, P; Sciacca, C; Azzi, P; Bacchetta, N; Barcellan, L; Bellan, P; Bellato, M; Benettoni, M; Biasotto, M; Bisello, D; Borsato, E; Branca, A; Carlin, R; Castellani, L; Checchia, P; Conti, E; Dal Corso, F; De Mattia, M; Dorigo, T; Dosselli, U; Fanzago, F; Gasparini, F; Gasparini, U; Giubilato, P; Gonella, F; Gresele, A; Gulmini, M; Kaminskiy, A; Lacaprara, S; Lazzizzera, I; Margoni, M; Maron, G; Mattiazzo, S; Mazzucato, M; Meneghelli, M; Meneguzzo, A T; Michelotto, M; Montecassiano, F; 
Nespolo, M; Passaseo, M; Pegoraro, M; Perrozzi, L; Pozzobon, N; Ronchese, P; Simonetto, F; Toniolo, N; Torassa, E; Tosi, M; Triossi, A; Vanini, S; Ventura, S; Zotto, P; Zumerle, G; Baesso, P; Berzano, U; Bricola, S; Necchi, M M; Pagano, D; Ratti, S P; Riccardi, C; Torre, P; Vicini, A; Vitulo, P; Viviani, C; Aisa, D; Aisa, S; Babucci, E; Biasini, M; Bilei, G M; Caponeri, B; Checcucci, B; Dinu, N; Fanò, L; Farnesini, L; Lariccia, P; Lucaroni, A; Mantovani, G; Nappi, A; Piluso, A; Postolache, V; Santocchia, A; Servoli, L; Tonoiu, D; Vedaee, A; Volpe, R; Azzurri, P; Bagliesi, G; Bernardini, J; Berretta, L; Boccali, T; Bocci, A; Borrello, L; Bosi, F; Calzolari, F; Castaldi, R; Dell'Orso, R; Fiori, F; Foà, L; Gennai, S; Giassi, A; Kraan, A; Ligabue, F; Lomtadze, T; Mariani, F; Martini, L; Massa, M; Messineo, A; Moggi, A; Palla, F; Palmonari, F; Petragnani, G; Petrucciani, G; Raffaelli, F; Sarkar, S; Segneri, G; Serban, A T; Spagnolo, P; Tenchini, R; Tolaini, S; Tonelli, G; Venturi, A; Verdini, P G; Baccaro, S; Barone, L; Bartoloni, A; Cavallari, F; Dafinei, I; Del Re, D; Di Marco, E; Diemoz, M; Franci, D; Longo, E; Organtini, G; Palma, A; Pandolfi, F; Paramatti, R; Pellegrino, F; Rahatlou, S; Rovelli, C; Alampi, G; Amapane, N; Arcidiacono, R; Argiro, S; Arneodo, M; Biino, C; Borgia, M A; Botta, C; Cartiglia, N; Castello, R; Cerminara, G; Costa, M; Dattola, D; Dellacasa, G; Demaria, N; Dughera, G; Dumitrache, F; Graziano, A; Mariotti, C; Marone, M; Maselli, S; Migliore, E; Mila, G; Monaco, V; Musich, M; Nervo, M; Obertino, M M; Oggero, S; Panero, R; Pastrone, N; Pelliccioni, M; Romero, A; Ruspa, M; Sacchi, R; Solano, A; Staiano, A; Trapani, P P; Trocino, D; Vilela Pereira, A; Visca, L; Zampieri, A; Ambroglini, F; Belforte, S; Cossutti, F; Della Ricca, G; Gobbo, B; Penzo, A; Chang, S; Chung, J; Kim, D H; Kim, G N; Kong, D J; Park, H; Son, D C; Bahk, S Y; Song, S; Jung, S Y; Hong, B; Kim, H; Kim, J H; Lee, K S; Moon, D H; Park, S K; Rhee, H B; Sim, K S; Kim, J; Choi, M; 
Hahn, G; Park, I C; Choi, S; Choi, Y; Goh, J; Jeong, H; Kim, T J; Lee, J; Lee, S; Janulis, M; Martisiute, D; Petrov, P; Sabonis, T; Castilla Valdez, H; Sánchez Hernández, A; Carrillo Moreno, S; Morelos Pineda, A; Allfrey, P; Gray, R N C; Krofcheck, D; Bernardino Rodrigues, N; Butler, P H; Signal, T; Williams, J C; Ahmad, M; Ahmed, I; Ahmed, W; Asghar, M I; Awan, M I M; Hoorani, H R; Hussain, I; Khan, W A; Khurshid, T; Muhammad, S; Qazi, S; Shahzad, H; Cwiok, M; Dabrowski, R; Dominik, W; Doroba, K; Konecki, M; Krolikowski, J; Pozniak, K; Romaniuk, Ryszard; Zabolotny, W; Zych, P; Frueboes, T; Gokieli, R; Goscilo, L; Górski, M; Kazana, M; Nawrocki, K; Szleper, M; Wrochna, G; Zalewski, P; Almeida, N; Antunes Pedro, L; Bargassa, P; David, A; Faccioli, P; Ferreira Parracho, P G; Freitas Ferreira, M; Gallinaro, M; Guerra Jordao, M; Martins, P; Mini, G; Musella, P; Pela, J; Raposo, L; Ribeiro, P Q; Sampaio, S; Seixas, J; Silva, J; Silva, P; Soares, D; Sousa, M; Varela, J; Wöhri, H K; Altsybeev, I; Belotelov, I; Bunin, P; Ershov, Y; Filozova, I; Finger, M; Finger, M Jr; Golunov, A; Golutvin, I; Gorbounov, N; Kalagin, V; Kamenev, A; Karjavin, V; Konoplyanikov, V; Korenkov, V; Kozlov, G; Kurenkov, A; Lanev, A; Makankin, A; Mitsyn, V V; Moisenz, P; Nikonov, E; Oleynik, D; Palichik, V; Perelygin, V; Petrosyan, A; Semenov, R; Shmatov, S; Smirnov, V; Smolin, D; Tikhonenko, E; Vasil'ev, S; Vishnevskiy, A; Volodko, A; Zarubin, A; Zhiltsov, V; Bondar, N; Chtchipounov, L; Denisov, A; Gavrikov, Y; Gavrilov, G; Golovtsov, V; Ivanov, Y; Kim, V; Kozlov, V; Levchenko, P; Obrant, G; Orishchin, E; Petrunin, A; Shcheglov, Y; Shchetkovskiy, A; Sknar, V; Smirnov, I; Sulimov, V; Tarakanov, V; Uvarov, L; Vavilov, S; Velichko, G; Volkov, S; Vorobyev, A; Andreev, Yu; Anisimov, A; Antipov, P; Dermenev, A; Gninenko, S; Golubev, N; Kirsanov, M; Krasnikov, N; Matveev, V; Pashenkov, A; Postoev, V E; Solovey, A; Solovey, A; Toropin, A; Troitsky, S; Baud, A; Epshteyn, V; Gavrilov, V; Ilina, N; Kaftanov, 
V; Kolosov, V; Kossov, M; Krokhotin, A; Kuleshov, S; Oulianov, A; Safronov, G; Semenov, S; Shreyber, I; Stolin, V; Vlasov, E; Zhokin, A; Boos, E; Dubinin, M; Dudko, L; Ershov, A; Gribushin, A; Klyukhin, V; Kodolova, O; Lokhtin, I; Petrushanko, S; Sarycheva, L; Savrin, V; Snigirev, A; Vardanyan, I; Dremin, I; Kirakosyan, M; Konovalova, N; Rusakov, S V; Vinogradov, A; Akimenko, S; Artamonov, A; Azhgirey, I; Bitioukov, S; Burtovoy, V; Grishin, V; Kachanov, V; Konstantinov, D; Krychkine, V; Levine, A; Lobov, I; Lukanin, V; Mel'nik, Y; Petrov, V; Ryutin, R; Slabospitsky, S; Sobol, A; Sytine, A; Tourtchanovitch, L; Troshin, S; Tyurin, N; Uzunian, A; Volkov, A; Adzic, P; Djordjevic, M; Jovanovic, D; Krpic, D; Maletic, D; Puzovic, J; Smiljkovic, N; Aguilar-Benitez, M; Alberdi, J; Alcaraz Maestre, J; Arce, P; Barcala, J M; Battilana, C; Burgos Lazaro, C; Caballero Bejar, J; Calvo, E; Cardenas Montes, M; Cepeda, M; Cerrada, M; Chamizo Llatas, M; Clemente, F; Colino, N; Daniel, M; De La Cruz, B; Delgado Peris, A; Diez Pardos, C; Fernandez Bedoya, C; Fernández Ramos, J P; Ferrando, A; Flix, J; Fouz, M C; Garcia-Abia, P; Garcia-Bonilla, A C; Gonzalez Lopez, O; Goy Lopez, S; Hernandez, J M; Josa, M I; Marin, J; Merino, G; Molina, J; Molinero, A; Navarrete, J J; Oller, J C; Puerta Pelayo, J; Romero, L; Santaolalla, J; Villanueva Munoz, C; Willmott, C; Yuste, C; Albajar, C; Blanco Otano, M; de Trocóniz, J F; Garcia Raboso, A; Lopez Berengueres, J O; Cuevas, J; Fernandez Menendez, J; Gonzalez Caballero, I; Lloret Iglesias, L; Naves Sordo, H; Vizan Garcia, J M; Cabrillo, I J; Calderon, A; Chuang, S H; Diaz Merino, I; Diez Gonzalez, C; Duarte Campderros, J; Fernandez, M; Gomez, G; Gonzalez Sanchez, J; Gonzalez Suarez, R; Jorda, C; Lobelle Pardo, P; Lopez Virto, A; Marco, J; Marco, R; Martinez Rivero, C; Martinez Ruiz del Arbol, P; Matorras, F; Rodrigo, T; Ruiz Jimeno, A; Scodellaro, L; Sobron Sanudo, M; Vila, I; Vilar Cortabitarte, R; Abbaneo, D; Albert, E; Alidra, M; Ashby, S; 
Auffray, E; Baechler, J; Baillon, P; Ball, A H; Bally, S L; Barney, D; Beaudette, F; Bellan, R; Benedetti, D; Benelli, G; Bernet, C; Bloch, P; Bolognesi, S; Bona, M; Bos, J; Bourgeois, N; Bourrel, T; Breuker, H; Bunkowski, K; Campi, D; Camporesi, T; Cano, E; Cattai, A; Chatelain, J P; Chauvey, M; Christiansen, T; Coarasa Perez, J A; Conde Garcia, A; Covarelli, R; Curé, B; De Roeck, A; Delachenal, V; Deyrail, D; Di Vincenzo, S; Dos Santos, S; Dupont, T; Edera, L M; Elliott-Peisert, A; Eppard, M; Favre, M; Frank, N; Funk, W; Gaddi, A; Gastal, M; Gateau, M; Gerwig, H; Gigi, D; Gill, K; Giordano, D; Girod, J P; Glege, F; Gomez-Reino Garrido, R; Goudard, R; Gowdy, S; Guida, R; Guiducci, L; Gutleber, J; Hansen, M; Hartl, C; Harvey, J; Hegner, B; Hoffmann, H F; Holzner, A; Honma, A; Huhtinen, M; Innocente, V; Janot, P; Le Godec, G; Lecoq, P; Leonidopoulos, C; Loos, R; Lourenço, C; Lyonnet, A; Macpherson, A; Magini, N; Maillefaud, J D; Maire, G; Mäki, T; Malgeri, L; Mannelli, M; Masetti, L; Meijers, F; Meridiani, P; Mersi, S; Meschi, E; Meynet Cordonnier, A; Moser, R; Mulders, M; Mulon, J; Noy, M; Oh, A; Olesen, G; Onnela, A; Orimoto, T; Orsini, L; Perez, E; Perinic, G; Pernot, J F; Petagna, P; Petiot, P; Petrilli, A; Pfeiffer, A; Pierini, M; Pimiä, M; Pintus, R; Pirollet, B; Postema, H; Racz, A; Ravat, S; Rew, S B; Rodrigues Antunes, J; Rolandi, G; Rovere, M; Ryjov, V; Sakulin, H; Samyn, D; Sauce, H; Schäfer, C; Schlatter, W D; Schröder, M; Schwick, C; Sciaba, A; Segoni, I; Sharma, A; Siegrist, N; Siegrist, P; Sinanis, N; Sobrier, T; Sphicas, P; Spiga, D; Spiropulu, M; Stöckli, F; Traczyk, P; Tropea, P; Troska, J; Tsirou, A; Veillet, L; Veres, G I; Voutilainen, M; Wertelaers, P; Zanetti, M; Bertl, W; Deiters, K; Erdmann, W; Gabathuler, K; Horisberger, R; Ingram, Q; Kaestli, H C; König, S; Kotlinski, D; Langenegger, U; Meier, F; Renker, D; Rohe, T; Sibille, J; Starodumov, A; Betev, B; Caminada, L; Chen, Z; Cittolin, S; Da Silva Di Calafiori, D R; Dambach, S; Dissertori, G; 
Dittmar, M; Eggel, C; Eugster, J; Faber, G; Freudenreich, K; Grab, C; Hervé, A; Hintz, W; Lecomte, P; Luckey, P D; Lustermann, W; Marchica, C; Milenovic, P; Moortgat, F; Nardulli, A; Nessi-Tedaldi, F; Pape, L; Pauss, F; Punz, T; Rizzi, A; Ronga, F J; Sala, L; Sanchez, A K; Sawley, M C; Sordini, V; Stieger, B; Tauscher, L; Thea, A; Theofilatos, K; Treille, D; Trüb, P; Weber, M; Wehrli, L; Weng, J; Zelepoukine, S; Amsler, C; Chiochia, V; De Visscher, S; Regenfus, C; Robmann, P; Rommerskirchen, T; Schmidt, A; Tsirigkas, D; Wilke, L; Chang, Y H; Chen, E A; Chen, W T; Go, A; Kuo, C M; Li, S W; Lin, W; Bartalini, P; Chang, P; Chao, Y; Chen, K F; Hou, W S; Hsiung, Y; Lei, Y J; Lin, S W; Lu, R S; Schümann, J; Shiu, J G; Tzeng, Y M; Ueno, K; Velikzhanin, Y; Wang, C C; Wang, M; Adiguzel, A; Ayhan, A; Azman Gokce, A; Bakirci, M N; Cerci, S; Dumanoglu, I; Eskut, E; Girgis, S; Gurpinar, E; Hos, I; Karaman, T; Karaman, T; Kayis Topaksu, A; Kurt, P; Önengüt, G; Önengüt Gökbulut, G; Ozdemir, K; Ozturk, S; Polatöz, A; Sogut, K; Tali, B; Topakli, H; Uzun, D; Vergili, L N; Vergili, M; Akin, I V; Aliev, T; Bilmis, S; Deniz, M; Gamsizkan, H; Guler, A M; Öcalan, K; Serin, M; Sever, R; Surat, U E; Zeyrek, M; Deliomeroglu, M; Demir, D; Gülmez, E; Halu, A; Isildak, B; Kaya, M; Kaya, O; Ozkorucuklu, S; Sonmez, N; Levchuk, L; Lukyanenko, S; Soroka, D; Zub, S; Bostock, F; Brooke, J J; Cheng, T L; Cussans, D; Frazier, R; Goldstein, J; Grant, N; Hansen, M; Heath, G P; Heath, H F; Hill, C; Huckvale, B; Jackson, J; Mackay, C K; Metson, S; Newbold, D M; Nirunpong, K; Smith, V J; Velthuis, J; Walton, R; Bell, K W; Brew, C; Brown, R M; Camanzi, B; Cockerill, D J A; Coughlan, J A; Geddes, N I; Harder, K; Harper, S; Kennedy, B W; Murray, P; Shepherd-Themistocleous, C H; Tomalin, I R; Williams, J H; Womersley, W J; Worm, S D; Bainbridge, R; Ball, G; Ballin, J; Beuselinck, R; Buchmuller, O; Colling, D; Cripps, N; Davies, G; Della Negra, M; Foudas, C; Fulcher, J; Futyan, D; Hall, G; Hays, J; Iles, G; 
Karapostoli, G; MacEvoy, B C; Magnan, A M; Marrouche, J; Nash, J; Nikitenko, A; Papageorgiou, A; Pesaresi, M; Petridis, K; Pioppi, M; Raymond, D M; Rompotis, N; Rose, A; Ryan, M J; Seez, C; Sharp, P; Sidiropoulos, G; Stettler, M; Stoye, M; Takahashi, M; Tapper, A; Timlin, C; Tourneur, S; Vazquez Acosta, M; Virdee, T; Wakefield, S; Wardrope, D; Whyntie, T; Wingham, M; Cole, J E; Goitom, I; Hobson, P R; Khan, A; Kyberd, P; Leslie, D; Munro, C; Reid, I D; Siamitros, C; Taylor, R; Teodorescu, L; Yaselli, I; Bose, T; Carleton, M; Hazen, E; Heering, A H; Heister, A; John, J St; Lawson, P; Lazic, D; Osborne, D; Rohlf, J; Sulak, L; Wu, S; Andrea, J; Avetisyan, A; Bhattacharya, S; Chou, J P; Cutts, D; Esen, S; Kukartsev, G; Landsberg, G; Narain, M; Nguyen, D; Speer, T; Tsang, K V; Breedon, R; Calderon De La Barca Sanchez, M; Case, M; Cebra, D; Chertok, M; Conway, J; Cox, P T; Dolen, J; Erbacher, R; Friis, E; Ko, W; Kopecky, A; Lander, R; Lister, A; Liu, H; Maruyama, S; Miceli, T; Nikolic, M; Pellett, D; Robles, J; Searle, M; Smith, J; Squires, M; Stilley, J; Tripathi, M; Vasquez Sierra, R; Veelken, C; Andreev, V; Arisaka, K; Cline, D; Cousins, R; Erhan, S; Hauser, J; Ignatenko, M; Jarvis, C; Mumford, J; Plager, C; Rakness, G; Schlein, P; Tucker, J; Valuev, V; Wallny, R; Yang, X; Babb, J; Bose, M; Chandra, A; Clare, R; Ellison, J A; Gary, J W; Hanson, G; Jeng, G Y; Kao, S C; Liu, F; Liu, H; Luthra, A; Nguyen, H; Pasztor, G; Satpathy, A; Shen, B C; Stringer, R; Sturdy, J; Sytnik, V; Wilken, R; Wimpenny, S; Branson, J G; Dusinberre, E; Evans, D; Golf, F; Kelley, R; Lebourgeois, M; Letts, J; Lipeles, E; Mangano, B; Muelmenstaedt, J; Norman, M; Padhi, S; Petrucci, A; Pi, H; Pieri, M; Ranieri, R; Sani, M; Sharma, V; Simon, S; Würthwein, F; Yagil, A; Campagnari, C; D'Alfonso, M; Danielson, T; Garberson, J; Incandela, J; Justus, C; Kalavase, P; Koay, S A; Kovalskyi, D; Krutelyov, V; Lamb, J; Lowette, S; Pavlunin, V; Rebassoo, F; Ribnik, J; Richman, J; Rossin, R; Stuart, D; To, W; 
Vlimant, J R; Witherell, M; Apresyan, A; Bornheim, A; Bunn, J; Chiorboli, M; Gataullin, M; Kcira, D; Litvine, V; Ma, Y; Newman, H B; Rogan, C; Timciuc, V; Veverka, J; Wilkinson, R; Yang, Y; Zhang, L; Zhu, K; Zhu, R Y; Akgun, B; Carroll, R; Ferguson, T; Jang, D W; Jun, S Y; Paulini, M; Russ, J; Terentyev, N; Vogel, H; Vorobiev, I; Cumalat, J P; Dinardo, M E; Drell, B R; Ford, W T; Heyburn, B; Luiggi Lopez, E; Nauenberg, U; Stenson, K; Ulmer, K; Wagner, S R; Zang, S L; Agostino, L; Alexander, J; Blekman, F; Cassel, D; Chatterjee, A; Das, S; Gibbons, L K; Heltsley, B; Hopkins, W; Khukhunaishvili, A; Kreis, B; Kuznetsov, V; Patterson, J R; Puigh, D; Ryd, A; Shi, X; Stroiney, S; Sun, W; Teo, W D; Thom, J; Vaughan, J; Weng, Y; Wittich, P; Beetz, C P; Cirino, G; Sanzeni, C; Winn, D; Abdullin, S; Afaq, M A; Albrow, M; Ananthan, B; Apollinari, G; Atac, M; Badgett, W; Bagby, L; Bakken, J A; Baldin, B; Banerjee, S; Banicz, K; Bauerdick, L A T; Beretvas, A; Berryhill, J; Bhat, P C; Biery, K; Binkley, M; Bloch, I; Borcherding, F; Brett, A M; Burkett, K; Butler, J N; Chetluru, V; Cheung, H W K; Chlebana, F; Churin, I; Cihangir, S; Crawford, M; Dagenhart, W; Demarteau, M; Derylo, G; Dykstra, D; Eartly, D P; Elias, J E; Elvira, V D; Evans, D; Feng, L; Fischler, M; Fisk, I; Foulkes, S; Freeman, J; Gartung, P; Gottschalk, E; Grassi, T; Green, D; Guo, Y; Gutsche, O; Hahn, A; Hanlon, J; Harris, R M; Holzman, B; Howell, J; Hufnagel, D; James, E; Jensen, H; Johnson, M; Jones, C D; Joshi, U; Juska, E; Kaiser, J; Klima, B; Kossiakov, S; Kousouris, K; Kwan, S; Lei, C M; Limon, P; Lopez Perez, J A; Los, S; Lueking, L; Lukhanin, G; Lusin, S; Lykken, J; Maeshima, K; Marraffino, J M; Mason, D; McBride, P; Miao, T; Mishra, K; Moccia, S; Mommsen, R; Mrenna, S; Muhammad, A S; Newman-Holmes, C; Noeding, C; O'Dell, V; Prokofyev, O; Rivera, R; Rivetta, C H; Ronzhin, A; Rossman, P; Ryu, S; Sekhri, V; Sexton-Kennedy, E; Sfiligoi, I; Sharma, S; Shaw, T M; Shpakov, D; Skup, E; Smith, R P; Soha, A; 
Spalding, W J; Spiegel, L; Suzuki, I; Tan, P; Tanenbaum, W; Tkaczyk, S; Trentadue, R; Uplegger, L; Vaandering, E W; Vidal, R; Whitmore, J; Wicklund, E; Wu, W; Yarba, J; Yumiceva, F; Yun, J C; Acosta, D; Avery, P; Barashko, V; Bourilkov, D; Chen, M; Di Giovanni, G P; Dobur, D; Drozdetskiy, A; Field, R D; Fu, Y; Furic, I K; Gartner, J; Holmes, D; Kim, B; Klimenko, S; Konigsberg, J; Korytov, A; Kotov, K; Kropivnitskaya, A; Kypreos, T; Madorsky, A; Matchev, K; Mitselmakher, G; Pakhotin, Y; Piedra Gomez, J; Prescott, C; Rapsevicius, V; Remington, R; Schmitt, M; Scurlock, B; Wang, D; Yelton, J; Ceron, C; Gaultney, V; Kramer, L; Lebolo, L M; Linn, S; Markowitz, P; Martinez, G; Rodriguez, J L; Adams, T; Askew, A; Baer, H; Bertoldi, M; Chen, J; Dharmaratna, W G D; Gleyzer, S V; Haas, J; Hagopian, S; Hagopian, V; Jenkins, M; Johnson, K F; Prettner, E; Prosper, H; Sekmen, S; Baarmand, M M; Guragain, S; Hohlmann, M; Kalakhety, H; Mermerkaya, H; Ralich, R; Vodopiyanov, I; Abelev, B; Adams, M R; Anghel, I M; Apanasevich, L; Bazterra, V E; Betts, R R; Callner, J; Castro, M A; Cavanaugh, R; Dragoiu, C; Garcia-Solis, E J; Gerber, C E; Hofman, D J; Khalatian, S; Mironov, C; Shabalina, E; Smoron, A; Varelas, N; Akgun, U; Albayrak, E A; Ayan, A S; Bilki, B; Briggs, R; Cankocak, K; Chung, K; Clarida, W; Debbins, P; Duru, F; Ingram, F D; Lae, C K; McCliment, E; Merlo, J P; Mestvirishvili, A; Miller, M J; Moeller, A; Nachtman, J; Newsom, C R; Norbeck, E; Olson, J; Onel, Y; Ozok, F; Parsons, J; Schmidt, I; Sen, S; Wetzel, J; Yetkin, T; Yi, K; Barnett, B A; Blumenfeld, B; Bonato, A; Chien, C Y; Fehling, D; Giurgiu, G; Gritsan, A V; Guo, Z J; Maksimovic, P; Rappoccio, S; Swartz, M; Tran, N V; Zhang, Y; Baringer, P; Bean, A; Grachov, O; Murray, M; Radicci, V; Sanders, S; Wood, J S; Zhukova, V; Bandurin, D; Bolton, T; Kaadze, K; Liu, A; Maravin, Y; Onoprienko, D; Svintradze, I; Wan, Z; Gronberg, J; Hollar, J; Lange, D; Wright, D; Baden, D; Bard, R; Boutemeur, M; Eno, S C; Ferencek, D; Hadley, 
N J; Kellogg, R G; Kirn, M; Kunori, S; Rossato, K; Rumerio, P; Santanastasio, F; Skuja, A; Temple, J; Tonjes, M B; Tonwar, S C; Toole, T; Twedt, E; Alver, B; Bauer, G; Bendavid, J; Busza, W; Butz, E; Cali, I A; Chan, M; D'Enterria, D; Everaerts, P; Gomez Ceballos, G; Hahn, K A; Harris, P; Jaditz, S; Kim, Y; Klute, M; Lee, Y J; Li, W; Loizides, C; Ma, T; Miller, M; Nahn, S; Paus, C; Roland, C; Roland, G; Rudolph, M; Stephans, G; Sumorok, K; Sung, K; Vaurynovich, S; Wenger, E A; Wyslouch, B; Xie, S; Yilmaz, Y; Yoon, A S; Bailleux, D; Cooper, S I; Cushman, P; Dahmes, B; De Benedetti, A; Dolgopolov, A; Dudero, P R; Egeland, R; Franzoni, G; Haupt, J; Inyakin, A; Klapoetke, K; Kubota, Y; Mans, J; Mirman, N; Petyt, D; Rekovic, V; Rusack, R; Schroeder, M; Singovsky, A; Zhang, J; Cremaldi, L M; Godang, R; Kroeger, R; Perera, L; Rahmat, R; Sanders, D A; Sonnek, P; Summers, D; Bloom, K; Bockelman, B; Bose, S; Butt, J; Claes, D R; Dominguez, A; Eads, M; Keller, J; Kelly, T; Kravchenko, I; Lazo-Flores, J; Lundstedt, C; Malbouisson, H; Malik, S; Snow, G R; Baur, U; Iashvili, I; Kharchilava, A; Kumar, A; Smith, K; Strang, M; Alverson, G; Barberis, E; Boeriu, O; Eulisse, G; Govi, G; McCauley, T; Musienko, Y; Muzaffar, S; Osborne, I; Paul, T; Reucroft, S; Swain, J; Taylor, L; Tuura, L; Anastassov, A; Gobbi, B; Kubik, A; Ofierzynski, R A; Pozdnyakov, A; Schmitt, M; Stoynev, S; Velasco, M; Won, S; Antonelli, L; Berry, D; Hildreth, M; Jessop, C; Karmgard, D J; Kolberg, T; Lannon, K; Lynch, S; Marinelli, N; Morse, D M; Ruchti, R; Slaunwhite, J; Warchol, J; Wayne, M; Bylsma, B; Durkin, L S; Gilmore, J; Gu, J; Killewald, P; Ling, T Y; Williams, G; Adam, N; Berry, E; Elmer, P; Garmash, A; Gerbaudo, D; Halyo, V; Hunt, A; Jones, J; Laird, E; Marlow, D; Medvedeva, T; Mooney, M; Olsen, J; Piroué, P; Stickland, D; Tully, C; Werner, J S; Wildish, T; Xie, Z; Zuranski, A; Acosta, J G; Bonnett Del Alamo, M; Huang, X T; Lopez, A; Mendez, H; Oliveros, S; Ramirez Vargas, J E; Santacruz, N; 
Zatzerklyany, A; Alagoz, E; Antillon, E; Barnes, V E; Bolla, G; Bortoletto, D; Everett, A; Garfinkel, A F; Gecse, Z; Gutay, L; Ippolito, N; Jones, M; Koybasi, O; Laasanen, A T; Leonardo, N; Liu, C; Maroussov, V; Merkel, P; Miller, D H; Neumeister, N; Sedov, A; Shipsey, I; Yoo, H D; Zheng, Y; Jindal, P; Parashar, N; Cuplov, V; Ecklund, K M; Geurts, F J M; Liu, J H; Maronde, D; Matveev, M; Padley, B P; Redjimi, R; Roberts, J; Sabbatini, L; Tumanov, A; Betchart, B; Bodek, A; Budd, H; Chung, Y S; de Barbaro, P; Demina, R; Flacher, H; Gotra, Y; Harel, A; Korjenevski, S; Miner, D C; Orbaker, D; Petrillo, G; Vishnevskiy, D; Zielinski, M; Bhatti, A; Demortier, L; Goulianos, K; Hatakeyama, K; Lungu, G; Mesropian, C; Yan, M; Atramentov, O; Bartz, E; Gershtein, Y; Halkiadakis, E; Hits, D; Lath, A; Rose, K; Schnetzer, S; Somalwar, S; Stone, R; Thomas, S; Watts, T L; Cerizza, G; Hollingsworth, M; Spanier, S; Yang, Z C; York, A; Asaadi, J; Aurisano, A; Eusebi, R; Golyash, A; Gurrola, A; Kamon, T; Nguyen, C N; Pivarski, J; Safonov, A; Sengupta, S; Toback, D; Weinberger, M; Akchurin, N; Berntzon, L; Gumus, K; Jeong, C; Kim, H; Lee, S W; Popescu, S; Roh, Y; Sill, A; Volobouev, I; Washington, E; Wigmans, R; Yazgan, E; Engh, D; Florez, C; Johns, W; Pathak, S; Sheldon, P; Andelin, D; Arenton, M W; Balazs, M; Boutle, S; Buehler, M; Conetti, S; Cox, B; Hirosky, R; Ledovskoy, A; Neu, C; Phillips II, D; Ronquest, M; Yohay, R; Gollapinni, S; Gunthoti, K; Harr, R; Karchin, P E; Mattson, M; Sakharov, A; Anderson, M; Bachtis, M; Bellinger, J N; Carlsmith, D; Crotty, I; Dasu, S; Dutta, S; Efron, J; Feyzi, F; Flood, K; Gray, L; Grogg, K S; Grothe, M; Hall-Wilton, R; Jaworski, M; Klabbers, P; Klukas, J; Lanaro, A; Lazaridis, C; Leonard, J; Loveless, R; Magrans de Abril, M; Mohapatra, A; Ott, G; Polese, G; Reeder, D; Savin, A; Smith, W H; Sourkov, A; Swanson, J; Weinberg, M; Wenman, D; Wensveen, M; White, A

    2010-01-01

    The CMS Collaboration conducted a month-long data taking exercise, the Cosmic Run At Four Tesla, during October-November 2008, with the goal of commissioning the experiment for extended operation. With all installed detector systems participating, CMS recorded 270 million cosmic ray events with the solenoid at a magnetic field strength of 3.8 T. This paper describes the data flow from the detector through the various online and offline computing systems, as well as the workflows used for recording the data, for aligning and calibrating the detector, and for analysis of the data.

  14. Integrating workflow and project management systems for PLM applications

    Directory of Open Access Journals (Sweden)

    Fabio Fonseca Pereira de Paula

    2008-07-01

    Full Text Available The adoption of the Product Life-cycle Management (PLM) systems concept is fundamental to improving product development, especially for small and medium enterprises (SMEs). One of the challenges is the integration between project management and product data management functions. The paper presents an analysis of potential integration strategies for a specific product data management system (SMARTEAM) and a project management system (Microsoft Project), both commonly used by SMEs. Finally, the article presents some considerations about the study of project management solutions in SMEs, considering the PLM approach. Key-words: integration, project management (PM), workflow, PDM, PLM.

  15. Semantic Document Library: A Virtual Research Environment for Documents, Data and Workflows Sharing

    Science.gov (United States)

    Kotwani, K.; Liu, Y.; Myers, J.; Futrelle, J.

    2008-12-01

    The Semantic Document Library (SDL) was driven by use cases from the environmental observatory communities and is designed to provide conventional document repository features of uploading, downloading, editing and versioning of documents as well as value-adding features of tagging, querying, sharing, annotating, ranking, provenance, social networking and geo-spatial mapping services. It allows users to organize a catalogue of watershed observation data, model output, workflows, as well as publications and documents related to the same watershed study through the tagging capability. Users can tag all relevant materials using the same watershed name and find all of them easily later using this tag. The underpinning semantic content repository can store materials from other cyberenvironments such as workflow or simulation tools, and SDL provides an effective interface to query and organize materials from various sources. Advanced features of the SDL allow users to visualize the provenance of the materials, such as the source and how the output data is derived. Other novel features include visualizing all geo-referenced materials on a geospatial map. SDL as a component of a cyberenvironment portal (the NCSA Cybercollaboratory) has the goal of efficient management of information and relationships between published artifacts (validated models, vetted data, workflows, annotations, best practices, reviews and papers) produced from raw research artifacts (data, notes, plans etc.) through agents (people, sensors etc.). The tremendous scientific potential of artifacts is realized through mechanisms of sharing, reuse and collaboration - empowering scientists to spread their knowledge and protocols and to benefit from the knowledge of others. SDL successfully implements web 2.0 technologies and design patterns along with a semantic content management approach that enables use of multiple ontologies and dynamic evolution (e.g. folksonomies) of terminology. 
Scientific documents involved with

  16. Bandwidth-Aware Scheduling of Workflow Application on Multiple Grid Sites

    Directory of Open Access Journals (Sweden)

    Harshadkumar B. Prajapati

    2014-01-01

    Full Text Available Bandwidth-aware workflow scheduling is required to improve the performance of a workflow application in a multisite Grid environment, as the data movement cost between two low-bandwidth sites can adversely affect the makespan of the application. Pegasus WMS, an open-source and freely available WMS, cannot fully utilize its workflow mapping capability because no bandwidth monitoring infrastructure is integrated into it. This paper develops an integration of the Network Weather Service (NWS) into Pegasus WMS to enable bandwidth-aware mapping of scientific workflows. Our work demonstrates the applicability of this integration by making the existing Heft site-selector of Pegasus WMS bandwidth aware. Furthermore, this paper proposes and implements a new workflow scheduling algorithm, Level based Highest Input and Processing Weight First. The results of the performed experiments indicate that the bandwidth-aware workflow scheduling algorithms perform better than the bandwidth-unaware algorithms (Random and Heft of Pegasus WMS), and that our proposed algorithm performs better than the bandwidth-aware Heft algorithm. Thus, the proposed bandwidth-aware workflow scheduling enhances the capability of Pegasus WMS and can increase the performance of workflow applications.
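
    To make the core idea concrete, a bandwidth-aware site selection can be sketched as below. This is a deliberately minimal, hypothetical model (estimated finish time = input-transfer time + compute time), with made-up function and parameter names; it is not the Heft implementation in Pegasus WMS.

    ```python
    # Hypothetical sketch of bandwidth-aware site selection (not Pegasus code):
    # estimate each candidate site's finish time as input-transfer time plus
    # compute time, and pick the minimum.

    def pick_site(task_mb, task_gflop, data_site, sites, bandwidth_mbps, gflops):
        """Return the site with the smallest estimated finish time (seconds)."""
        best, best_t = None, float("inf")
        for s in sites:
            transfer = 0.0 if s == data_site else task_mb * 8 / bandwidth_mbps[(data_site, s)]
            t = transfer + task_gflop / gflops[s]
            if t < best_t:
                best, best_t = s, t
        return best

    sites = ["A", "B"]
    bw = {("A", "B"): 100.0, ("B", "A"): 100.0}   # Mbit/s
    speed = {"A": 10.0, "B": 50.0}                # GFLOP/s
    # 1000 MB input stored at A, 500 GFLOP of work:
    # A: 0 + 50 s = 50 s; B: 80 s transfer + 10 s = 90 s -> A wins despite being slower
    print(pick_site(1000, 500, "A", sites, bw, speed))  # -> A
    ```

    A bandwidth-unaware selector would pick the fastest site B here; accounting for the low-bandwidth transfer reverses the decision, which is exactly the effect on makespan the abstract describes.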

  17. Pegasus: A Framework for Mapping Complex Scientific Workflows onto Distributed Systems

    Directory of Open Access Journals (Sweden)

    Ewa Deelman

    2005-01-01

    Full Text Available This paper describes the Pegasus framework that can be used to map complex scientific workflows onto distributed resources. Pegasus enables users to represent the workflows at an abstract level without needing to worry about the particulars of the target execution systems. The paper describes general issues in mapping applications and the functionality of Pegasus. We present the results of improving application performance through workflow restructuring which clusters multiple tasks in a workflow into single entities. A real-life astronomy application is used as the basis for the study.
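
    The task-clustering idea can be illustrated with a small hypothetical sketch (not Pegasus's actual implementation): tasks are assigned to DAG levels and each level is cut into clusters of a fixed size, so many short tasks are submitted as fewer, larger entities.

    ```python
    # Assumed illustration of level-based task clustering in a workflow DAG.
    from collections import defaultdict

    def levels(dag):
        """dag: {task: [parents]} -> {task: level}, level = longest path from a root."""
        memo = {}
        def lvl(t):
            if t not in memo:
                memo[t] = 0 if not dag[t] else 1 + max(lvl(p) for p in dag[t])
            return memo[t]
        return {t: lvl(t) for t in dag}

    def cluster(dag, size):
        """Group the tasks of each level into clusters of at most `size` tasks."""
        by_level = defaultdict(list)
        for t, l in sorted(levels(dag).items()):
            by_level[l].append(t)
        return [lv[i:i + size] for lv in by_level.values()
                for i in range(0, len(lv), size)]

    dag = {"a": [], "b": [], "c": [], "d": ["a"], "e": ["b"], "f": ["c"]}
    print(cluster(dag, 2))  # -> [['a', 'b'], ['c'], ['d', 'e'], ['f']]
    ```

    Grouping only within a level preserves dependencies: every task in a cluster has all its parents in earlier levels, so clusters can still be scheduled in level order.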

  18. A comprehensive workflow for general-purpose neural modeling with highly configurable neuromorphic hardware systems.

    Science.gov (United States)

    Brüderle, Daniel; Petrovici, Mihai A; Vogginger, Bernhard; Ehrlich, Matthias; Pfeil, Thomas; Millner, Sebastian; Grübl, Andreas; Wendt, Karsten; Müller, Eric; Schwartz, Marc-Olivier; de Oliveira, Dan Husmann; Jeltsch, Sebastian; Fieres, Johannes; Schilling, Moritz; Müller, Paul; Breitwieser, Oliver; Petkov, Venelin; Muller, Lyle; Davison, Andrew P; Krishnamurthy, Pradeep; Kremkow, Jens; Lundqvist, Mikael; Muller, Eilif; Partzsch, Johannes; Scholze, Stefan; Zühl, Lukas; Mayr, Christian; Destexhe, Alain; Diesmann, Markus; Potjans, Tobias C; Lansner, Anders; Schüffny, René; Schemmel, Johannes; Meier, Karlheinz

    2011-05-01

    In this article, we present a methodological framework that meets novel requirements emerging from upcoming types of accelerated and highly configurable neuromorphic hardware systems. We describe in detail a device with 45 million programmable and dynamic synapses that is currently under development, and we sketch the conceptual challenges that arise from taking this platform into operation. More specifically, we aim at the establishment of this neuromorphic system as a flexible and neuroscientifically valuable modeling tool that can be used by non-hardware experts. We consider various functional aspects to be crucial for this purpose, and we introduce a consistent workflow with detailed descriptions of all involved modules that implement the suggested steps: The integration of the hardware interface into the simulator-independent model description language PyNN; a fully automated translation between the PyNN domain and appropriate hardware configurations; an executable specification of the future neuromorphic system that can be seamlessly integrated into this biology-to-hardware mapping process as a test bench for all software layers and possible hardware design modifications; an evaluation scheme that deploys models from a dedicated benchmark library, compares the results generated by virtual or prototype hardware devices with reference software simulations and analyzes the differences. The integration of these components into one hardware-software workflow provides an ecosystem for ongoing preparative studies that support the hardware design process and represents the basis for the maturity of the model-to-hardware mapping software. The functionality and flexibility of the latter is proven with a variety of experimental results.

  19. Geometric processing workflow for vertical and oblique hyperspectral frame images collected using UAV

    Science.gov (United States)

    Markelin, L.; Honkavaara, E.; Näsi, R.; Nurminen, K.; Hakala, T.

    2014-08-01

    Remote sensing based on unmanned airborne vehicles (UAVs) is a rapidly developing field of technology. UAVs enable accurate, flexible, low-cost and multiangular measurements of 3D geometric, radiometric, and temporal properties of land and vegetation using various sensors. In this paper we present a geometric processing chain for a multiangular measurement system that is designed for measuring object directional reflectance characteristics in a wavelength range of 400-900 nm. The technique is based on a novel, lightweight spectral camera designed for UAV use. The multiangular measurement is conducted by collecting vertical and oblique area-format spectral images. End products of the geometric processing are image exterior orientations, 3D point clouds and digital surface models (DSM). These data are needed for the radiometric processing chain that produces reflectance image mosaics and multiangular bidirectional reflectance factor (BRF) observations. The geometric processing workflow consists of the following three steps: (1) determining approximate image orientations using Visual Structure from Motion (VisualSFM) software, (2) calculating improved orientations and sensor calibration using a method based on self-calibrating bundle block adjustment (standard photogrammetric software; this step is optional), and finally (3) creating dense 3D point clouds and DSMs using Photogrammetric Surface Reconstruction from Imagery (SURE) software, which is based on the semi-global matching algorithm and is capable of providing a point density corresponding to the pixel size of the image. We have tested the geometric processing workflow over various targets, including test fields, agricultural fields, lakes and complex 3D structures like forests.

  20. LiSIs: An Online Scientific Workflow System for Virtual Screening.

    Science.gov (United States)

    Kannas, Christos C; Kalvari, Ioanna; Lambrinidis, George; Neophytou, Christiana M; Savva, Christiana G; Kirmitzoglou, Ioannis; Antoniou, Zinonas; Achilleos, Kleo G; Scherf, David; Pitta, Chara A; Nicolaou, Christos A; Mikros, Emanuel; Promponas, Vasilis J; Gerhauser, Clarissa; Mehta, Rajendra G; Constantinou, Andreas I; Pattichis, Constantinos S

    2015-01-01

    Modern methods of drug discovery and development in recent years make wide use of computational algorithms. These methods utilise Virtual Screening (VS), which is the computational counterpart of experimental screening. In this manner the in silico models and tools initially replace the wet lab methods, saving time and resources. This paper presents the overall design and implementation of a web based scientific workflow system for virtual screening, the Life Sciences Informatics (LiSIs) platform. The LiSIs platform consists of the following layers: the input layer covering the data file input; the pre-processing layer covering the descriptor calculation and the docking preparation components; the processing layer covering the attribute filtering, compound similarity, substructure matching, docking prediction, predictive modelling and molecular clustering; the post-processing layer covering the output reformatting and binary file merging components; and the output layer covering the storage component. The potential of the LiSIs platform has been demonstrated through two case studies designed to illustrate the preparation of tools for the identification of promising chemical structures. The first case study involved the development of a Quantitative Structure Activity Relationship (QSAR) model on a literature dataset, while the second case study implemented a docking-based virtual screening experiment. Our results show that VS workflows utilizing docking, predictive models and other in silico tools as implemented in the LiSIs platform can identify compounds in line with expert expectations. We anticipate that the deployment of LiSIs, as currently implemented and available for use, can enable drug discovery researchers to more easily use state of the art computational techniques in their search for promising chemical compounds. 
The LiSIs platform is freely accessible (i) under the GRANATUM platform at: http://www.granatum.org and (ii) directly at: http://lisis.cs.ucy.ac.cy.

  1. Magallanes: a web services discovery and automatic workflow composition tool

    Directory of Open Access Journals (Sweden)

    Trelles Oswaldo

    2009-10-01

    Full Text Available Abstract Background To aid in bioinformatics data processing and analysis, an increasing number of web-based applications are being deployed. Although this is a positive circumstance in general, the proliferation of tools makes it difficult to find the right tool, or more importantly, the right set of tools that can work together to solve real complex problems. Results Magallanes (Magellan) is a versatile, platform-independent Java library of algorithms aimed at discovering bioinformatics web services and associated data types. A second important feature of Magallanes is its ability to connect available and compatible web services into workflows that can process data sequentially to reach a desired output given a particular input. Magallanes' capabilities can be exploited either through an API or directly through a graphic user interface. The Magallanes API is freely available for academic use, and together with the Magallanes application has been tested on MS-Windows™ XP and Unix-like operating systems. Detailed implementation information, including user manuals and tutorials, is available at http://www.bitlab-es.com/magallanes. Conclusion Different implementations of the same client (web page, desktop applications, web services, etc.) have been deployed and are currently in use in real installations such as the National Institute of Bioinformatics (Spain) and the ACGT-EU project. This shows the potential utility and versatility of the software library and provides strong evidence of its ability to facilitate the automatic discovery and composition of workflows.
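
    The automatic composition idea described above can be sketched as a search over typed services: each service converts one data type into another, and a breadth-first search finds a chain from a given input type to the desired output type. The service names and types below are invented for illustration; this is not Magallanes code.

    ```python
    # Assumed sketch of workflow composition by type-matching service chaining.
    from collections import deque

    def compose(services, start, goal):
        """services: list of (name, in_type, out_type); returns a service-name path."""
        queue = deque([(start, [])])
        seen = {start}
        while queue:
            dtype, path = queue.popleft()
            if dtype == goal:
                return path
            for name, i, o in services:
                if i == dtype and o not in seen:
                    seen.add(o)
                    queue.append((o, path + [name]))
        return None  # no chain of compatible services exists

    svc = [("blast", "fasta", "alignment"),
           ("tree", "alignment", "newick"),
           ("render", "newick", "png")]
    print(compose(svc, "fasta", "png"))  # -> ['blast', 'tree', 'render']
    ```

    Because the search is breadth-first, the first chain found is also one of the shortest, which is a reasonable default when several tool combinations could produce the requested output.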

  2. A High Throughput Workflow Environment for Cosmological Simulations

    CERN Document Server

    Erickson, Brandon M S; Evrard, August E; Becker, Matthew R; Busha, Michael T; Kravtsov, Andrey V; Marru, Suresh; Pierce, Marlon; Wechsler, Risa H

    2012-01-01

    The next generation of wide-area sky surveys offers the power to place extremely precise constraints on cosmological parameters and to test the source of cosmic acceleration. These observational programs will employ multiple techniques based on a variety of statistical signatures of galaxies and large-scale structure. These techniques have sources of systematic error that need to be understood at the percent level in order to fully leverage the power of next-generation catalogs. Simulations of large-scale structure provide the means to characterize these uncertainties. We are using XSEDE resources to produce multiple synthetic sky surveys of galaxies and large-scale structure in support of science analysis for the Dark Energy Survey. In order to scale up our production to the level of fifty 10^10-particle simulations, we are working to embed production control within the Apache Airavata workflow environment. We explain our methods and report how the workflow has reduced production time by 40% compared to manua...

  3. Automation and workflow considerations for embedding Digimarc Barcodes at scale

    Science.gov (United States)

    Rodriguez, Tony; Haaga, Don; Calhoon, Sean

    2015-03-01

    The Digimarc® Barcode is a digital watermark applied to packages and variable data labels that carries the GS1 standard GTIN-14 data traditionally carried by a 1-D barcode. The Digimarc Barcode can be read with smartphones and imaging-based barcode readers commonly used in grocery and retail environments. Using smartphones, consumers can engage with products, and retailers can materially increase the speed of check-out, increasing store margins and providing a better experience for shoppers. Internal testing has shown an average 53% increase in scanning throughput, enabling hundreds of millions of dollars in cost savings [1] for retailers when deployed at scale. To get to scale, the process of embedding a digital watermark must be automated and integrated within existing workflows. Creating the tools and processes to do so represents a new challenge for the watermarking community. This paper presents a description and an analysis of the workflow implemented by Digimarc to deploy the Digimarc Barcode at scale. An overview of the tools created and lessons learned during the introduction of the technology to the market are provided.

  4. A novel spectral library workflow to enhance protein identifications.

    Science.gov (United States)

    Li, Haomin; Zong, Nobel C; Liang, Xiangbo; Kim, Allen K; Choi, Jeong Ho; Deng, Ning; Zelaya, Ivette; Lam, Maggie; Duan, Huilong; Ping, Peipei

    2013-04-09

    The innovations in mass spectrometry-based investigations in proteome biology enable systematic characterization of molecular details in pathophysiological phenotypes. However, the process of delineating large-scale raw proteomic datasets into a biological context requires high-throughput data acquisition and processing. A spectral library search engine makes use of previously annotated experimental spectra as references for subsequent spectral analyses. This workflow delivers many advantages, including elevated analytical efficiency and specificity as well as reduced demands in computational capacity. In this study, we created a spectral matching engine to address challenges commonly associated with a library search workflow. In particular, an improved sliding dot product algorithm that is robust to systematic drifts of mass measurement in spectra is introduced. Furthermore, a noise management protocol distinguishes spectral correlation attributable to noise from that attributable to peptide fragments. It enables elevated separation between target spectral matches and false matches, thereby suppressing the possibility of propagating inaccurate peptide annotations from library spectra to query spectra. Moreover, preservation of the original spectra also accommodates user contributions to further enhance the quality of the library. Collectively, this search engine supports reproducible data analyses using curated references, thereby broadening the accessibility of proteomics resources to biomedical investigators. This article is part of a Special Issue entitled: From protein structures to clinical applications.
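
    The shift-tolerant comparison described above can be sketched in a few lines: the query spectrum is compared against the library spectrum at several small integer bin offsets and the best normalized dot product is kept, so a systematic m/z calibration drift does not destroy the score. This is an assumed illustration, not the authors' engine.

    ```python
    # Minimal sketch of a "sliding" dot product between two binned spectra.
    import math

    def norm_dot(a, b):
        """Normalized dot product (cosine similarity) of two intensity vectors."""
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(x * x for x in b))
        return sum(x * y for x, y in zip(a, b)) / (na * nb) if na and nb else 0.0

    def sliding_dot(library, query, max_shift=2):
        """Best normalized dot product over bin shifts in [-max_shift, max_shift]."""
        best = 0.0
        for s in range(-max_shift, max_shift + 1):
            shifted = [0.0] * len(query)
            for i, v in enumerate(query):
                if 0 <= i + s < len(shifted):
                    shifted[i + s] = v  # move every peak by s bins
            best = max(best, norm_dot(library, shifted))
        return best

    lib = [0, 5, 0, 9, 0, 3, 0, 0]
    qry = [0, 0, 5, 0, 9, 0, 3, 0]   # same peaks, drifted by one bin
    print(round(sliding_dot(lib, qry), 3))  # -> 1.0 (perfect match at shift -1)
    ```

    A plain dot product between `lib` and `qry` would be near zero here because the peaks do not line up; allowing a small shift recovers the match, which is the robustness property the abstract highlights.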

  5. Design and Optimization of Future Hybrid and Electric Propulsion Systems: An Advanced Tool Integrated in a Complete Workflow to Study Electric Devices

    Directory of Open Access Journals (Sweden)

    Le Berr F.

    2012-08-01

    Full Text Available Electrification to reduce greenhouse gases in the transport sector is now well known as a relevant solution for the future, studied intensively by all actors in the domain. To reach this objective, a tool for the design and characterization of electric machines has been developed at IFP Energies nouvelles. This tool, called EMTool, is based on physical equations and is integrated into a complete workflow of simulation tools, such as finite element models or system simulation. The tool offers the possibility to study several types of electric machine topologies: permanent magnet synchronous machines with radial or axial flux, induction machines, etc. This paper presents the main design principles and the main equations integrated in the EMTool, the methods used to evaluate electric machine performance, and the validations performed on existing machines. Finally, the position of the EMTool in the simulation tool workflow and application examples are presented, notably the coupling of the EMTool with advanced optimization algorithms or finite element models.

  6. Scientific Workflows and the Sensor Web for Virtual Environmental Observatories

    Science.gov (United States)

    Simonis, I.; Vahed, A.

    2008-12-01

    interfaces. All data sets and sensor communication follow well-defined abstract models and corresponding encodings, mostly developed by the OGC Sensor Web Enablement initiative. Scientific progress is currently accelerated by an emerging concept called scientific workflows, which organize and manage complex distributed computations. A scientific workflow represents and records the highly complex processes that a domain scientist typically follows in exploration, discovery and, ultimately, transformation of raw data into publishable results. The challenge is now to integrate the benefits of scientific workflows with those provided by the Sensor Web in order to leverage all resources for scientific exploration, problem solving, and knowledge generation. Scientific workflows for the Sensor Web represent the next evolutionary step towards efficient, powerful, and flexible earth observation frameworks and platforms. Those platforms support the entire process from capturing data, sharing and integrating, to requesting additional observations. Multiple sites and organizations will participate on single platforms, and scientists from different countries and organizations will interact and contribute to large-scale research projects. Simultaneously, the data and information overload becomes manageable, as multiple layers of abstraction will free scientists from dealing with underlying data, processing or storage peculiarities. The vision is automated investigation and discovery mechanisms that allow scientists to pose queries to the system, which in turn identifies potentially related resources, schedules processing tasks and assembles all parts into workflows that may satisfy the query.

  7. Improvement of workflow and processes to ease and enrich meaningful use of health information technology

    Directory of Open Access Journals (Sweden)

    Singh R

    2013-11-01

    Full Text Available Ranjit Singh,1 Ashok Singh,2 Devan R Singh,3 Gurdev Singh1 1Department of Family Medicine, UB Patient Safety Research Center, School of Medicine and Management, State University of NY at Buffalo, NY, USA; 2Niagara Family Medicine Associates, Niagara Falls, NY, USA; 3SaferPatients LLC, Lewiston, NY, USA Abstract: The introduction of health information technology (HIT) can have unexpected and unintended patient safety and/or quality consequences. This highly desirable but complex intervention requires workflow changes in order to be effective. Workflow is often cited by providers as the number one 'pain point'. Its redesign needs to be tailored to the organizational context, the current workflow, the HIT system being introduced, and the resources available. Primary care practices lack the required expertise and need external assistance. Unfortunately, the current methods of using esoteric charts or software are alien to health care workers and are, therefore, perceived to be barriers. Most importantly and ironically, these do not readily educate or enable staff to inculcate a common vision, ownership, and empowerment among all stakeholders. These attributes are necessary for creating highly reliable organizations. We present a tool that addresses US Accreditation Council for Graduate Medical Education (ACGME) competency requirements. Of the six competencies called for by the ACGME, the two that this tool particularly addresses are 'system-based practice' and 'practice-based learning and continuing improvement'. This toolkit is founded on a systems engineering approach. It includes a motivational and orientation presentation, 128 magnetic pictorial and write-erase icons of 40 designs, a dry-erase magnetic board, and five visual aids for reducing cognitive and emotive biases in staff. Pilot tests were carried out in practices in Western New York and Colorado, USA. In addition, the toolkit was presented at the 2011 North American Primary Care Research Group (NAPCRG

  8. An open source workflow for 3D printouts of scientific data volumes

    Science.gov (United States)

    Loewe, P.; Klump, J. F.; Wickert, J.; Ludwig, M.; Frigeri, A.

    2013-12-01

    As the amount of scientific data continues to grow, researchers need new tools to help them visualize complex data. Immersive data visualisations are helpful, yet fail to provide the tactile feedback and sensory feedback on spatial orientation provided by tangible objects. This gap in sensory feedback from virtual objects has led to the development of tangible representations of geospatial information to solve real world problems. Examples are animated globes [1], interactive environments like tangible GIS [2], and on-demand 3D prints. The production of a tangible representation of a scientific data set is one step in a line of scientific thinking, leading from the physical world into scientific reasoning and back: the process starts with a physical observation, or from a data stream generated by an environmental sensor. This data stream is turned into a geo-referenced data set, which is turned into a volume representation, which is converted into command sequences for the printing device, leading to the creation of a 3D printout. As a last, but crucial, step, this new object has to be documented, linked to the associated metadata, and curated in long-term repositories to preserve its scientific meaning and context. The workflow to produce tangible 3D data prints from science data at the German Research Centre for Geosciences (GFZ) was implemented as software based on the free and open source geoinformatics tools GRASS GIS and ParaView. The workflow was successfully validated in various application scenarios at GFZ using a RapMan printer to create 3D specimens of elevation models, geological underground models, ice-penetrating radar soundings for planetology, and space-time stacks for tsunami model quality assessment. While these first pilot applications have demonstrated the feasibility of the overall approach [3], current research focuses on the provision of the workflow as Software as a Service (SaaS), thematic generalisation of information content and
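
    One step in the chain above, turning a gridded data set into printer-ready geometry, can be sketched as follows. This is an assumed, minimal illustration (not the GFZ implementation): a small elevation grid is written as an ASCII STL surface, the mesh format most 3D-printing toolchains accept, with each grid cell split into two triangles.

    ```python
    # Assumed sketch: gridded elevation model -> ASCII STL top surface.
    def heightmap_to_stl(z, name="dem"):
        """z: 2-D list of elevations; returns an ASCII STL string."""
        rows, cols = len(z), len(z[0])
        facets = []
        for r in range(rows - 1):
            for c in range(cols - 1):
                a = (c, r, z[r][c]); b = (c + 1, r, z[r][c + 1])
                d = (c, r + 1, z[r + 1][c]); e = (c + 1, r + 1, z[r + 1][c + 1])
                for tri in ((a, b, d), (b, e, d)):  # two triangles per cell
                    facets.append(
                        "facet normal 0 0 1\n outer loop\n"
                        + "".join("  vertex %g %g %g\n" % v for v in tri)
                        + " endloop\nendfacet")
        # Normals are left as +z here; most slicers recompute them anyway.
        return "solid %s\n%s\nendsolid %s" % (name, "\n".join(facets), name)

    stl = heightmap_to_stl([[0, 1], [1, 2]])
    print(stl.count("facet normal"))  # -> 2 triangles for a 2x2 grid
    ```

    A real pipeline would also close the sides and bottom to make a watertight solid; the sketch only shows the data-volume-to-geometry conversion step.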

  9. A lightweight messaging-based distributed processing and workflow execution framework for real-time and big data analysis

    Science.gov (United States)

    Laban, Shaban; El-Desouky, Aly

    2014-05-01

    To achieve rapid, simple and reliable parallel processing of different types of tasks and big data processing on any compute cluster, a lightweight messaging-based distributed processing and workflow execution framework model is proposed. The framework is based on Apache ActiveMQ and the Simple (or Streaming) Text Oriented Message Protocol (STOMP). ActiveMQ, a popular and powerful open source persistent messaging and integration patterns server with scheduler capabilities, acts as the message broker in the framework. STOMP provides an interoperable wire format that allows framework programs to talk and interact with each other and with ActiveMQ easily. In order to use the message broker efficiently, a unified message and topic naming pattern is utilized to achieve the required operation. Only three Python programs and a simple library, used to unify and simplify the use of ActiveMQ and the STOMP protocol, are needed to use the framework. A watchdog program is used to monitor, remove, add, start and stop any machine and/or its different tasks when necessary. For every machine, exactly one dedicated zookeeper program is used to start the different functions or tasks and the stompShell program needed for executing the user's required workflow. The stompShell instances execute workflow jobs based on received messages. A well-defined, simple and flexible message structure, based on JavaScript Object Notation (JSON), is used to build complex workflow systems. JSON is also used for configuration and for communication between machines and programs. The framework is platform independent. Although the framework is built using Python, the actual workflow programs or jobs can be implemented in any programming language. The generic framework can be used in small national data centres for processing seismological and radionuclide data received from the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear
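
    A unified topic naming pattern and JSON message structure of the kind described might look like the sketch below. The field names, topic hierarchy, and example payload are all assumptions for illustration, not taken from the paper, and no broker connection is shown.

    ```python
    # Hypothetical sketch of the unified topic naming and JSON message format.
    import json

    def topic(site, machine, role):
        """Build a hierarchical STOMP destination, e.g. /topic/idc.node7.zookeeper."""
        return "/topic/%s.%s.%s" % (site, machine, role)

    def make_message(sender, command, payload):
        """Serialize a workflow-control message as a JSON string."""
        return json.dumps({"from": sender, "command": command, "payload": payload})

    msg = make_message("watchdog", "start_task",
                       {"task": "seismic_filter", "args": ["--station", "ABC"]})
    decoded = json.loads(msg)  # any program, in any language, can parse this
    print(topic("idc", "node7", "stompShell"), decoded["command"])
    ```

    Because both the destination names and the message bodies are plain text, any language with a STOMP client and a JSON parser can join the framework, which is the interoperability property the abstract emphasizes.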

  10. CrossFlow: Cross-Organizational Workflow Management in Dynamic Virtual Enterprises

    NARCIS (Netherlands)

    Grefen, Paul; Aberer, Karl; Hoffner, Yigal; Ludwig, Heiko

    2000-01-01

    In this report, we present the approach to cross-organizational workflow management of the CrossFlow project. CrossFlow is a European research project aiming at the support of cross-organizational workflows in dynamic virtual enterprises. The cooperation in these virtual enterprises is based on dyna

  11. CrossFlow: cross-organizational workflow management in dynamic virtual enterprises

    NARCIS (Netherlands)

    Grefen, Paul; Aberer, Karl; Hoffner, Yigal; Ludwig, Heiko

    2000-01-01

    This paper gives a detailed overview of the approach to cross-organizational workflow management developed in the CrossFlow project. CrossFlow is a European research project aiming at the support of cross-organizational workflows in dynamic virtual enterprises. The cooperation in these virtual enter

  12. Scheduling Multilevel Deadline-Constrained Scientific Workflows on Clouds Based on Cost Optimization

    Directory of Open Access Journals (Sweden)

    Maciej Malawski

    2015-01-01

    Full Text Available This paper presents a cost optimization model for scheduling scientific workflows on IaaS clouds such as Amazon EC2 or RackSpace. We assume multiple IaaS clouds with heterogeneous virtual machine instances, with a limited number of instances per cloud and hourly billing. Input and output data are stored on a cloud object store such as Amazon S3. Applications are scientific workflows modeled as DAGs, as in the Pegasus Workflow Management System. We assume that tasks in the workflows are grouped into levels of identical tasks. Our model is specified using mathematical programming languages (AMPL and CMPL) and allows us to minimize the cost of workflow execution under deadline constraints. We present results obtained using our model and the benchmark workflows representing real scientific applications in a variety of domains. The data used for evaluation come from synthetic workflows and from general purpose cloud benchmarks, as well as from data measured in our own experiments with Montage, an astronomical application, executed on the Amazon EC2 cloud. We indicate how this model can be used for scenarios that require resource planning for scientific workflows and their ensembles.
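
    The interaction of hourly billing with a deadline constraint can be shown with a toy model, far simpler than the paper's AMPL/CMPL formulation (one task, no levels, no instance limits); all names and numbers below are invented for illustration.

    ```python
    # Toy version of cost minimization under a deadline with hourly billing.
    import math

    def cheapest_instance(work_unit_hours, deadline_h, instances):
        """instances: {name: (relative_speed, price_per_hour)}.
        Return (name, cost) of the cheapest type meeting the deadline, or None."""
        best = None
        for name, (speed, price) in instances.items():
            runtime = work_unit_hours / speed
            if runtime <= deadline_h:
                cost = math.ceil(runtime) * price  # billing rounds hours up
                if best is None or cost < best[1]:
                    best = (name, cost)
        return best

    vms = {"small": (1.0, 0.10), "large": (4.0, 0.50)}
    # 8 unit-hours of work, 3-hour deadline: small needs 8 h (misses the deadline),
    # large finishes in 2 h and costs 2 * 0.50 = 1.00
    print(cheapest_instance(8.0, 3.0, vms))  # -> ('large', 1.0)
    ```

    With no deadline the small instance would be cheaper (8 hours at 0.10); the deadline constraint forces the more expensive type, which is the trade-off the optimization model explores at workflow scale.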

  13. MapReduce Operations with WS-VLAM Workflow Management System

    NARCIS (Netherlands)

    Baranowski, M.; Belloum, A.; Bubak, M.

    2013-01-01

    Workflow management systems are widely used to solve scientific problems as they enable orchestration of remote and local services such as database queries, job submission and running an application. To extend the role that workflow systems play in data-intensive science, we propose a solution tha

  14. Soundness of Timed-Arc Workflow Nets in Discrete and Continuous-Time Semantics

    DEFF Research Database (Denmark)

    Mateo, Jose Antonio; Srba, Jiri; Sørensen, Mathias Grund

    2015-01-01

    Analysis of workflow processes with quantitative aspects like timing is of interest in numerous time-critical applications. We suggest a workflow model based on timed-arc Petri nets and study the foundational problems of soundness and strong (time-bounded) soundness. We first consider the discrete-t...

  15. VLAM-G: Interactive Data Driven Workflow Engine for Grid-Enabled Resources

    Directory of Open Access Journals (Sweden)

    Vladimir Korkhov

    2007-01-01

    Full Text Available Grid brings the power of many computers to scientists. However, the development of Grid-enabled applications requires knowledge about Grid infrastructure and low-level API to Grid services. In turn, workflow management systems provide a high-level environment for rapid prototyping of experimental computing systems. Coupling Grid and workflow paradigms is important for the scientific community: it makes the power of the Grid easily available to the end user. The paradigm of data driven workflow execution is one of the ways to enable distributed workflow on the Grid. The work presented in this paper is carried out in the context of the Virtual Laboratory for e-Science project. We present the VLAM-G workflow management system and its core component: the Run-Time System (RTS. The RTS is a dataflow driven workflow engine which utilizes Grid resources, hiding the complexity of the Grid from a scientist. Special attention is paid to the concept of dataflow and direct data streaming between distributed workflow components. We present the architecture and components of the RTS, describe the features of VLAM-G workflow execution, and evaluate the system by performance measurements and a real life use case.
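
    The dataflow-driven execution idea, where a component fires as soon as tokens are available on all of its input ports, can be sketched with a toy in-process scheduler. This is an illustration of the paradigm only, not the RTS itself:

```python
from collections import deque

class Component:
    """A workflow node that fires when tokens are present on all input ports."""
    def __init__(self, name, func, n_inputs):
        self.name, self.func = name, func
        self.inputs = [deque() for _ in range(n_inputs)]
        self.successors = []  # (component, port) pairs receiving our output

def run(source_tokens, entry, components):
    """Push source tokens into the entry port and run until nothing can fire."""
    for tok in source_tokens:
        entry.inputs[0].append(tok)
    results, fired = [], True
    while fired:
        fired = False
        for comp in components:
            while all(q for q in comp.inputs):  # all ports have a token
                args = [q.popleft() for q in comp.inputs]
                out = comp.func(*args)
                fired = True
                if comp.successors:
                    for succ, port in comp.successors:
                        succ.inputs[port].append(out)  # stream data downstream
                else:
                    results.append(out)  # sink component
    return results
```

    Wiring two components in a chain and pushing tokens through mimics the direct data streaming between distributed workflow components that the paper emphasizes.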

  16. The Impact of Computerized Provider Order Entry Systems on Inpatient Clinical Workflow: A Literature Review

    NARCIS (Netherlands)

    Z. Niazkhani (Zahra); H. Pirnejad (Habibollah); M. Berg (Marc); J.E.C.M. Aarts (Jos)

    2009-01-01

    Previous studies have shown the importance of workflow issues in the implementation of CPOE systems and patient safety practices. To understand the impact of CPOE on clinical workflow, we developed a conceptual framework and conducted a literature search for CPOE evaluations between 1990

  17. CrossFlow: Cross-Organizational Workflow Management for Service Outsourcing in Dynamic Virtual Enterprises

    NARCIS (Netherlands)

    Grefen, Paul; Aberer, Karl; Ludwig, Heiko; Hoffner, Yigal

    2001-01-01

    In this report, we present the approach to cross-organizational workflow management of the CrossFlow project. CrossFlow is a European research project aiming at the support of cross-organizational workflows in dynamic virtual enterprises. The cooperation in these virtual enterprises is based on dyna

  18. A Six‐Stage Workflow for Robust Application of Systems Pharmacology

    Science.gov (United States)

    Gadkar, K; Kirouac, DC; Mager, DE; van der Graaf, PH

    2016-01-01

    Quantitative and systems pharmacology (QSP) is increasingly being applied in pharmaceutical research and development. One factor critical to the ultimate success of QSP is the establishment of commonly accepted language, technical criteria, and workflows. We propose an integrated workflow that bridges conceptual objectives with underlying technical detail to support the execution, communication, and evaluation of QSP projects. PMID:27299936

  19. Provenance for Runtime Workflow Steering and Validation in Computational Seismology

    Science.gov (United States)

    Spinuso, A.; Krischer, L.; Krause, A.; Filgueira, R.; Magnoni, F.; Muraleedharan, V.; David, M.

    2014-12-01

    Provenance systems may be offered by modern workflow engines to collect metadata about the data transformations at runtime. If combined with effective visualisation and monitoring interfaces, these provenance recordings can speed up the validation process of an experiment, suggesting interactive or automated interventions with immediate effects on the lifecycle of a workflow run. For instance, in the field of computational seismology, if we consider research applications performing long lasting cross correlation analysis and high resolution simulations, the immediate notification of logical errors and the rapid access to intermediate results, can produce reactions which foster a more efficient progress of the research. These applications are often executed in secured and sophisticated HPC and HTC infrastructures, highlighting the need for a comprehensive framework that facilitates the extraction of fine grained provenance and the development of provenance aware components, leveraging the scalability characteristics of the adopted workflow engines, whose enactment can be mapped to different technologies (MPI, Storm clusters, etc). This work looks at the adoption of W3C-PROV concepts and data model within a user driven processing and validation framework for seismic data, supporting also computational and data management steering. Validation needs to balance automation with user intervention, considering the scientist as part of the archiving process. Therefore, the provenance data is enriched with community-specific metadata vocabularies and control messages, making an experiment reproducible and its description consistent with the community understandings. Moreover, it can contain user defined terms and annotations. The current implementation of the system is supported by the EU-Funded VERCE (http://verce.eu). It provides, as well as the provenance generation mechanisms, a prototypal browser-based user interface and a web API built on top of a NoSQL storage
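
    The runtime provenance capture described above can be illustrated with a minimal sketch that records a PROV-style activity for each workflow step. The field names and metadata terms are illustrative assumptions, not the VERCE implementation or a W3C-PROV serialization:

```python
import datetime

class ProvenanceLog:
    """Collect PROV-style activity records while workflow steps run.

    A sketch of the idea only; field names and metadata vocabulary are
    illustrative, not the VERCE system's actual schema."""

    def __init__(self):
        self.records = []

    def run_step(self, name, func, inputs, metadata=None):
        """Execute one workflow step and record what it used and generated."""
        start = datetime.datetime.now(datetime.timezone.utc).isoformat()
        output = func(*inputs)
        self.records.append({
            "prov:type": "prov:Activity",
            "name": name,
            "startTime": start,
            "used": [repr(i) for i in inputs],
            "generated": repr(output),
            "metadata": metadata or {},  # community-specific vocabulary terms
        })
        return output
```

    Because each record is written at runtime, a monitoring interface can inspect intermediate results and flag logical errors while the workflow is still executing.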

  20. Next Generation Sequence Analysis and Computational Genomics Using Graphical Pipeline Workflows

    Directory of Open Access Journals (Sweden)

    Marquis P. Vawter

    2012-08-01

    Full Text Available Whole-genome and exome sequencing have already proven to be essential and powerful methods to identify genes responsible for simple Mendelian inherited disorders. These methods can be applied to complex disorders as well, and have been adopted as one of the current mainstream approaches in population genetics. These achievements have been made possible by next generation sequencing (NGS technologies, which require substantial bioinformatics resources to analyze the dense and complex sequence data. The huge analytical burden of data from genome sequencing might be seen as a bottleneck slowing the publication of NGS papers at this time, especially in psychiatric genetics. We review the existing methods for processing NGS data, to place into context the rationale for the design of a computational resource. We describe our method, the Graphical Pipeline for Computational Genomics (GPCG, to perform the computational steps required to analyze NGS data. The GPCG implements flexible workflows for basic sequence alignment, sequence data quality control, single nucleotide polymorphism analysis, copy number variant identification, annotation, and visualization of results. These workflows cover all the analytical steps required for NGS data, from processing the raw reads to variant calling and annotation. The current version of the pipeline is freely available at http://pipeline.loni.ucla.edu. These applications of NGS analysis may gain clinical utility in the near future (e.g., identifying miRNA signatures in diseases when the bioinformatics approach is made feasible. Taken together, the annotation tools and strategies that have been developed to retrieve information and test hypotheses about the functional role of variants present in the human genome will help to pinpoint the genetic risk factors for psychiatric disorders.

  1. An Improvement on Algorithm of Grid-Workflow Based on QoS

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yun-feng; GE Wei

    2004-01-01

    With the emergence of grid computing, new challenges have arisen in workflow task scheduling. The goal of grid-workflow task scheduling is to achieve high system throughput and to match application needs with the available computing resources. This matching of resources in a non-deterministically shared, heterogeneous environment leads to concerns about quality of service (QoS). This paper presents the grid concept together with the QoS requirements of workflow tasks, and proposes an improved scheduling algorithm, the ILGSS algorithm. The complexity of the improved scheduling algorithm is analyzed. The experimental results show that the improved algorithm leads to significant performance gains in various applications. An important research domain, adaptive workflow transactions in grid computing environments, has been explored, and a new solution for the scheduling of distributed workflows in grid environments has been brought forward.
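
    The general flavor of QoS-constrained matching of workflow tasks to heterogeneous resources can be sketched with a greedy heuristic. This is not the paper's ILGSS algorithm, only an illustration of deadline-aware, cost-minimizing assignment:

```python
def qos_schedule(tasks, resources):
    """Greedy QoS-aware matching: each task goes to the cheapest resource whose
    current queue still meets the task's deadline.

    tasks: (name, work_units, deadline) triples;
    resources: name -> (speed, cost_per_unit).
    A sketch of QoS-constrained scheduling, not the ILGSS algorithm."""
    finish = {r: 0.0 for r in resources}  # when each resource becomes free
    plan = {}
    for name, work, deadline in sorted(tasks, key=lambda t: t[2]):
        feasible = []
        for r, (speed, cost) in resources.items():
            done = finish[r] + work / speed
            if done <= deadline:
                feasible.append((work * cost, done, r))
        if not feasible:
            return None  # the QoS requirement cannot be met
        _, done, r = min(feasible)  # cheapest feasible resource
        finish[r] = done
        plan[name] = r
    return plan
```

    Returning `None` when no resource can meet a deadline makes the QoS violation explicit rather than silently degrading service.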

  2. The Distributed Workflow Management System--FlowAgent

    Institute of Scientific and Technical Information of China (English)

    王文军; 仲萃豪

    2000-01-01

    While mainframe and 2-tier client/server systems have serious problems with flexibility and scalability for large-scale business processes, 3-tier client/server architecture and object-oriented system modeling, which construct business processes on service components, seem to bring software systems some scalability. As enabling infrastructure for object-oriented methodology, a distributed WFMS (Workflow Management System) can flexibly describe business rules among autonomous 'service tasks' and support the scalability of large-scale business processes. However, current distributed WFMSs still have difficulty managing a large number of distributed tasks; the 'multi-TaskDomain' architecture of FlowAgent tries to solve this problem and to provide a dynamic, distributed environment for task scheduling.

  3. Approach for workflow modeling using π-calculus

    Institute of Scientific and Technical Information of China (English)

    杨东; 张申生

    2003-01-01

    As a variant of process algebra, π-calculus can describe the interactions between evolving processes. By modeling an activity as a process interacting with other processes through ports, this paper presents a new approach: representing workflow models using π-calculus. As a result, the model can characterize the dynamic behaviors of the workflow process in terms of the LTS (Labeled Transition Semantics) of π-calculus. The main advantage of the workflow model's formal semantics is that it allows for verification of the model's properties, such as deadlock-freedom and normal termination. Moreover, the equivalence of workflow models can be checked through the weak bisimulation theorem in the π-calculus, thus facilitating the optimization of business processes.
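
    The verification benefit mentioned above, checking deadlock-freedom and normal termination over the model's labeled transition system, can be sketched as a reachability analysis. This is a generic LTS check, not a π-calculus encoding:

```python
def check_soundness(lts, start, final):
    """Check two workflow properties over a labeled transition system:
    every reachable state can still reach a final state (deadlock-freedom),
    and final states have no outgoing transitions (normal termination).

    `lts` maps state -> list of (label, next_state) pairs."""
    # collect reachable states with a depth-first search
    seen, frontier = {start}, [start]
    while frontier:
        s = frontier.pop()
        for _, t in lts.get(s, []):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    # backward closure: states that can reach some final state
    can_finish = set(final)
    changed = True
    while changed:
        changed = False
        for s in seen:
            if s not in can_finish and any(t in can_finish for _, t in lts.get(s, [])):
                can_finish.add(s)
                changed = True
    deadlock_free = seen <= can_finish
    proper_termination = all(not lts.get(f) for f in final)
    return deadlock_free and proper_termination
```

    A state from which no final state is reachable (e.g., a "stuck" branch) makes the check fail, which is exactly the kind of property the formal semantics makes decidable.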

  4. A Workflow for Differentially-Private Graph Synthesis

    CERN Document Server

    Proserpio, Davide; McSherry, Frank

    2012-01-01

    We present a new workflow for differentially-private publication of graph topologies. First, we produce differentially private measurements of interesting graph statistics using our new version of the PINQ programming language, Weighted PINQ, which is based on a generalization of differential privacy to weighted sets. Next, we show how to generate graphs that fit any set of measured graph statistics, even if they are inconsistent (due to noise), or if they are only indirectly related to actual statistics that we want our synthetic graph to preserve. We do this by combining the answers to Weighted PINQ queries with an incremental evaluator (Markov Chain Monte Carlo (MCMC)) that allows us to synthesize graphs where the statistic of interest aligns with that of the protected graph. This paper presents our preliminary results; we show how to cast a few graph statistics (degree distribution, edge multiplicity, joint degree distribution) as queries in Weighted PINQ, and then present experimental results of syntheti...
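
    The core idea of fitting a synthetic graph to noisy measured statistics with an incremental evaluator can be sketched as a plain Metropolis-style search over edge replacements. This is an illustration of the MCMC fitting step only, not Weighted PINQ and not a differentially private mechanism by itself:

```python
import random

def degree_hist(edges, n):
    """Degree of each of n nodes given an undirected edge list."""
    deg = [0] * n
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return deg

def synthesize(n, n_edges, target_degrees, steps=5000, seed=0):
    """Search for a graph whose degree sequence matches (possibly noisy,
    inconsistent) target measurements, by proposing random edge replacements
    and accepting those that reduce the mismatch."""
    rng = random.Random(seed)
    edges = set()
    while len(edges) < n_edges:
        u, v = rng.sample(range(n), 2)
        edges.add((min(u, v), max(u, v)))
    edges = list(edges)

    def score(es):
        deg = degree_hist(es, n)
        return sum((d - t) ** 2 for d, t in zip(sorted(deg), sorted(target_degrees)))

    current = score(edges)
    for _ in range(steps):
        i = rng.randrange(len(edges))
        u, v = rng.sample(range(n), 2)
        cand = edges[:]
        cand[i] = (min(u, v), max(u, v))
        if len(set(cand)) < len(cand):
            continue  # reject duplicate edges
        s = score(cand)
        # accept improvements always, worse moves occasionally (Metropolis flavor)
        if s <= current or rng.random() < 0.01:
            edges, current = cand, s
    return edges, current
```

    Because the target statistics may be inconsistent due to privacy noise, the search only minimizes the mismatch rather than expecting an exact fit.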

  5. A Component Based Approach to Scientific Workflow Management

    CERN Document Server

    Le Goff, Jean-Marie; Baker, Nigel; Brooks, Peter; McClatchey, Richard

    2001-01-01

    CRISTAL is a distributed scientific workflow system used in the manufacturing and production phases of HEP experiment construction at CERN. The CRISTAL project has studied the use of a description driven approach, using meta- modelling techniques, to manage the evolving needs of a large physics community. Interest from such diverse communities as bio-informatics and manufacturing has motivated the CRISTAL team to re-engineer the system to customize functionality according to end user requirements but maximize software reuse in the process. The next generation CRISTAL vision is to build a generic component architecture from which a complete software product line can be generated according to the particular needs of the target enterprise. This paper discusses the issues of adopting a component product line based approach and our experiences of software reuse.

  6. Managing Evolving Business Workflows through the Capture of Descriptive Information

    CERN Document Server

    Gaspard, S; Dindeleux, R; McClatchey, R; Gaspard, Sebastien; Estrella, Florida

    2003-01-01

    Business systems these days need to be agile to address the needs of a changing world. In particular the discipline of Enterprise Application Integration requires business process management to be highly reconfigurable with the ability to support dynamic workflows, inter-application integration and process reconfiguration. Basing EAI systems on model-resident or on a so-called description-driven approach enables aspects of flexibility, distribution, system evolution and integration to be addressed in a domain-independent manner. Such a system called CRISTAL is described in this paper with particular emphasis on its application to EAI problem domains. A practical example of the CRISTAL technology in the domain of manufacturing systems, called Agilium, is described to demonstrate the principles of model-driven system evolution and integration. The approach is compared to other model-driven development approaches such as the Model-Driven Architecture of the OMG and so-called Adaptive Object Models.

  7. Using Simulations to Integrate Technology into Health Care Aides' Workflow

    Directory of Open Access Journals (Sweden)

    Sharla King

    2013-07-01

    Full Text Available Health care aides (HCAs) are critical to home care, providing a range of services to people who have chronic conditions, are aging, or are unable to care for themselves independently. The current HCA supply will not keep up with the increasing demand without fundamental changes in their work environment. One possible solution to some of the workflow challenges and workplace stress of HCAs is hand-held tablet technology. To introduce the use of tablets with HCAs, simulations were developed. Once an HCA was comfortable with the tablet, a simulated client was introduced. The HCA interacted with the simulated client and used the tablet applications to assist with providing care. After the simulations, the HCAs participated in a focus group. HCAs completed a survey before and after the tablet training and simulation to determine their perception and acceptance of the tablet. Future deployment and implementation of technologies in home care should be further evaluated for outcomes.

  8. Impact of survey workflow on precision and accuracy of terrestrial LiDAR datasets

    Science.gov (United States)

    Gold, P. O.; Cowgill, E.; Kreylos, O.

    2009-12-01

    Ground-based LiDAR (Light Detection and Ranging) survey techniques are enabling remote visualization and quantitative analysis of geologic features at unprecedented levels of detail. For example, digital terrain models computed from LiDAR data have been used to measure displaced landforms along active faults and to quantify fault-surface roughness. But how accurately do terrestrial LiDAR data represent the true ground surface, and in particular, how internally consistent and precise are the mosaiced LiDAR datasets from which surface models are constructed? Addressing this question is essential for designing survey workflows that capture the necessary level of accuracy for a given project while minimizing survey time and equipment, which is essential for effective surveying of remote sites. To address this problem, we seek to define a metric that quantifies how scan registration error changes as a function of survey workflow. Specifically, we are using a Trimble GX3D laser scanner to conduct a series of experimental surveys to quantify how common variables in field workflows impact the precision of scan registration. Primary variables we are testing include 1) use of an independently measured network of control points to locate scanner and target positions, 2) the number of known-point locations used to place the scanner and point clouds in 3-D space, 3) the type of target used to measure distances between the scanner and the known points, and 4) setting up the scanner over a known point as opposed to resectioning of known points. Precision of the registered point cloud is quantified using Trimble Realworks software by automatic calculation of registration errors (errors between locations of the same known points in different scans). Accuracy of the registered cloud (i.e., its ground-truth) will be measured in subsequent experiments. 
To obtain an independent measure of scan-registration errors and to better visualize the effects of these errors on a registered point
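
    The scan-registration error the authors quantify, differences between locations of the same known points in different scans, can be computed as an RMS distance. The point format below is an illustrative assumption:

```python
import math

def registration_error(scan_a, scan_b):
    """RMS distance between the same control points located in two registered scans.

    Each scan maps point ID -> (x, y, z); only shared IDs contribute."""
    shared = scan_a.keys() & scan_b.keys()
    if not shared:
        raise ValueError("no common control points")
    sq = 0.0
    for pid in shared:
        sq += sum((a - b) ** 2 for a, b in zip(scan_a[pid], scan_b[pid]))
    return math.sqrt(sq / len(shared))
```

    Tracking this metric across workflow variants (target types, number of known points, resectioning vs. known-point setup) gives the per-workflow precision comparison the experiments aim for.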

  9. The standard-based open workflow system in GeoBrain (Invited)

    Science.gov (United States)

    Di, L.; Yu, G.; Zhao, P.; Deng, M.

    2013-12-01

    GeoBrain is an Earth science Web-service system developed and operated by the Center for Spatial Information Science and Systems, George Mason University. In GeoBrain, a standard-based open workflow system has been implemented to accommodate the automated processing of geospatial data through a set of complex geo-processing functions for advanced product generation. GeoBrain models the complex geoprocessing at two levels, the conceptual and the concrete. At the conceptual level, the workflows exist in the form of data and service types defined by ontologies. The workflows at the conceptual level are called geo-processing models and are cataloged in GeoBrain as virtual product types. A conceptual workflow is instantiated into a concrete, executable workflow when a user requests a product that matches a virtual product type. Both conceptual and concrete workflows are encoded in Business Process Execution Language (BPEL). A BPEL workflow engine, called BPELPower, has been implemented to execute the workflow for the product generation. A provenance capturing service has been implemented to generate the ISO 19115-compliant complete product provenance metadata before and after the workflow execution. The generation of provenance metadata before the workflow execution allows users to examine the usability of the final product before the lengthy and expensive execution takes place. The three modes of workflow execution defined in ISO 19119 (transparent, translucent, and opaque) are available in GeoBrain. A geoprocessing modeling portal has been developed to allow domain experts to develop geoprocessing models at the type level with the support of both data and service/processing ontologies. The geoprocessing models capture the knowledge of the domain experts and become the operational offerings of the products after proper peer review of the models. An automated workflow composition has been demonstrated successfully based on ontologies and artificial
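
    The two-level modeling idea, instantiating a conceptual workflow of abstract types into a concrete executable chain, can be sketched as a registry lookup. The service names are illustrative, not GeoBrain's ontologies or its BPEL encoding:

```python
# Toy registry mapping abstract service types (as an ontology would describe
# them) to concrete executable services; names are illustrative, not GeoBrain's.
REGISTRY = {
    "Reproject": lambda data: f"reprojected({data})",
    "Mosaic": lambda data: f"mosaic({data})",
}

def instantiate(conceptual_workflow):
    """Turn a list of abstract service types into a concrete, executable chain."""
    try:
        return [REGISTRY[step] for step in conceptual_workflow]
    except KeyError as missing:
        raise ValueError(f"no concrete service for type {missing}") from None

def execute(concrete_workflow, data):
    """Run the concrete services in order, feeding each output to the next."""
    for service in concrete_workflow:
        data = service(data)
    return data
```

    Keeping the conceptual description separate from the bound services is what lets a user request a virtual product type and have the system pick concrete services at instantiation time.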

  10. Workflow Management Application Programming Interface Specification%工作流管理应用编程接口规范

    Institute of Scientific and Technical Information of China (English)

    刘华伟; 吴朝晖

    2000-01-01

    The document 'Workflow Management Application Programming Interface Specification' is distributed by the Workflow Management Coalition to specify standard APIs which can be supported by workflow management products. In this paper, we first introduce the two parts of this interface, then discuss the standardized data structures and function definitions, and finally address future work.

  11. An automated and reproducible workflow for running and analyzing neural simulations using Lancet and IPython Notebook

    Directory of Open Access Journals (Sweden)

    Jean-Luc Richard Stevens

    2013-12-01

    Full Text Available Lancet is a new, simulator-independent Python utility for succinctly specifying, launching, and collating results from large batches of interrelated computationally demanding program runs. This paper demonstrates how to combine Lancet with IPython Notebook to provide a flexible, lightweight, and agile workflow for fully reproducible scientific research. This informal and pragmatic approach uses IPython Notebook to capture the steps in a scientific computation as it is gradually automated and made ready for publication, without mandating the use of any separate application that can constrain scientific exploration and innovation. The resulting notebook concisely records each step involved in even very complex computational processes that led to a particular figure or numerical result, allowing the complete chain of events to be replicated automatically. Lancet was originally designed to help solve problems in computational neuroscience, such as analyzing the sensitivity of a complex simulation to various parameters, or collecting the results from multiple runs with different random starting points. However, because it is never possible to know in advance what tools might be required in future tasks, Lancet has been designed to be completely general, supporting any type of program as long as it can be launched as a process and can return output in the form of files. For instance, Lancet is also heavily used by one of the authors in a separate research group for launching batches of microprocessor simulations. This general design will allow Lancet to continue supporting a given research project even as the underlying approaches and tools change.

  12. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    Science.gov (United States)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish workflow for and demonstrate a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  13. An automated and reproducible workflow for running and analyzing neural simulations using Lancet and IPython Notebook.

    Science.gov (United States)

    Stevens, Jean-Luc R; Elver, Marco; Bednar, James A

    2013-01-01

    Lancet is a new, simulator-independent Python utility for succinctly specifying, launching, and collating results from large batches of interrelated computationally demanding program runs. This paper demonstrates how to combine Lancet with IPython Notebook to provide a flexible, lightweight, and agile workflow for fully reproducible scientific research. This informal and pragmatic approach uses IPython Notebook to capture the steps in a scientific computation as it is gradually automated and made ready for publication, without mandating the use of any separate application that can constrain scientific exploration and innovation. The resulting notebook concisely records each step involved in even very complex computational processes that led to a particular figure or numerical result, allowing the complete chain of events to be replicated automatically. Lancet was originally designed to help solve problems in computational neuroscience, such as analyzing the sensitivity of a complex simulation to various parameters, or collecting the results from multiple runs with different random starting points. However, because it is never possible to know in advance what tools might be required in future tasks, Lancet has been designed to be completely general, supporting any type of program as long as it can be launched as a process and can return output in the form of files. For instance, Lancet is also heavily used by one of the authors in a separate research group for launching batches of microprocessor simulations. This general design will allow Lancet to continue supporting a given research project even as the underlying approaches and tools change.

  14. Geo-processing workflow driven wildfire hot pixel detection under sensor web environment

    Science.gov (United States)

    Chen, Nengcheng; Di, Liping; Yu, Genong; Gong, Jianya

    2010-03-01

    Integrating Sensor Web Enablement (SWE) services with Geo-Processing Workflows (GPW) has become a bottleneck for Sensor Web-based applications, especially remote-sensing observations. This paper presents a common GPW framework for Sensor Web data service as part of the NASA Sensor Web project. This abstract framework includes abstract GPW model construction, GPW chains from service combination, and data retrieval components. The concrete framework consists of a data service node, a data processing node, a data presentation node, a Catalogue Service node, and a BPEL engine. An abstract model designer is used to design the top level GPW model, a model instantiation service is used to generate the concrete Business Process Execution Language (BPEL), and the BPEL execution engine is adopted. This framework is used to generate several kinds of data: raw data from live sensors, coverage or feature data, geospatial products, or sensor maps. A prototype, including a model designer, model instantiation service, and GPW engine-BPELPower is presented. A scenario for an EO-1 Sensor Web data service for wildfire hot pixel detection is used to test the feasibility of the proposed framework. The execution time and influences of the EO-1 live Hyperion data wildfire classification service framework are evaluated. The benefits and high performance of the proposed framework are discussed. The experiments of EO-1 live Hyperion data wildfire classification service show that this framework can improve the quality of services for sensor data retrieval and processing.

  15. High performance workflow implementation for protein surface characterization using grid technology

    Directory of Open Access Journals (Sweden)

    Clematis Andrea

    2005-12-01

    Full Text Available Abstract Background This study concerns the development of a high performance workflow that, using grid technology, correlates different kinds of Bioinformatics data, starting from the base pairs of the nucleotide sequence to the exposed residues of the protein surface. The implementation of this workflow is based on the Italian Grid.it project infrastructure, that is a network of several computational resources and storage facilities distributed at different grid sites. Methods Workflows are very common in Bioinformatics because they allow to process large quantities of data by delegating the management of resources to the information streaming. Grid technology optimizes the computational load during the different workflow steps, dividing the more expensive tasks into a set of small jobs. Results Grid technology allows efficient database management, a crucial problem for obtaining good results in Bioinformatics applications. The proposed workflow is implemented to integrate huge amounts of data, and the results themselves are stored in a relational database, which constitutes the added value to the global knowledge. Conclusion A web interface has been developed to make this technology accessible to grid users. Once the workflow has started, by means of the simplified interface, it is possible to follow all the different steps throughout the data processing. Finally, when the workflow has terminated, the different features of the protein, like the amino acids exposed on the protein surface, can be compared with the data present in the output database.
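
    The pattern of dividing an expensive task into a set of small jobs and merging their partial results can be sketched as follows, with local threads standing in for grid worker nodes:

```python
from concurrent.futures import ThreadPoolExecutor

def split_job(items, n_chunks):
    """Divide one expensive task into nearly equal small jobs."""
    k, m = divmod(len(items), n_chunks)
    chunks, start = [], 0
    for i in range(n_chunks):
        end = start + k + (1 if i < m else 0)
        chunks.append(items[start:end])
        start = end
    return chunks

def run_distributed(items, worker, n_chunks=4):
    """Run the worker over each chunk in parallel and merge the partial results.

    Threads stand in for grid worker nodes in this sketch."""
    with ThreadPoolExecutor(max_workers=n_chunks) as pool:
        partials = pool.map(worker, split_job(items, n_chunks))
    results = []
    for part in partials:
        results.extend(part)  # merged output, ready for the result database
    return results
```

    On a real grid the chunks would be submitted as independent jobs and the merged output loaded into the relational results database described in the abstract.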

  16. myExperiment: a repository and social network for the sharing of bioinformatics workflows.

    Science.gov (United States)

    Goble, Carole A; Bhagat, Jiten; Aleksejevs, Sergejs; Cruickshank, Don; Michaelides, Danius; Newman, David; Borkum, Mark; Bechhofer, Sean; Roos, Marco; Li, Peter; De Roure, David

    2010-07-01

    myExperiment (http://www.myexperiment.org) is an online research environment that supports the social sharing of bioinformatics workflows. These workflows are procedures consisting of a series of computational tasks using web services, which may be performed on data from its retrieval, integration and analysis, to the visualization of the results. As a public repository of workflows, myExperiment allows anybody to discover those that are relevant to their research, which can then be reused and repurposed to their specific requirements. Conversely, developers can submit their workflows to myExperiment and enable them to be shared in a secure manner. Since its release in 2007, myExperiment currently has over 3500 registered users and contains more than 1000 workflows. The social aspect to the sharing of these workflows is facilitated by registered users forming virtual communities bound together by a common interest or research project. Contributors of workflows can build their reputation within these communities by receiving feedback and credit from individuals who reuse their work. Further documentation about myExperiment including its REST web service is available from http://wiki.myexperiment.org. Feedback and requests for support can be sent to bugs@myexperiment.org.

  17. Measuring Semantic and Structural Information for Data Oriented Workflow Retrieval with Cost Constraints

    Directory of Open Access Journals (Sweden)

    Yinglong Ma

    2014-01-01

    Full Text Available The reuse of data oriented workflows (DOWs can reduce the cost of workflow system development and control the risk of project failure and therefore is crucial for accelerating the automation of business processes. Reusing workflows can be achieved by measuring the similarity among candidate workflows and selecting from them the workflow that satisfies the requirements of users. However, because DOWs are often developed in an open, distributed, and heterogeneous environment, different users can often impose diverse cost constraints on data oriented workflows. This makes the reuse of DOWs challenging, and there is no clear solution for retrieving DOWs with cost constraints. In this paper, we present a novel graph based model of DOWs with cost constraints, called constrained data oriented workflow (CDW, which can express the cost constraints that users are often concerned about. An approach is proposed for retrieving CDWs which seamlessly combines semantic and structural information of CDWs. A distance measure based on matrix theory is adopted to seamlessly combine semantic and structural similarities of CDWs for selecting and reusing them. Finally, experiments are reported to show the effectiveness and efficiency of our approach.
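
    Combining semantic and structural similarity into one retrieval score can be sketched as follows. The Jaccard label overlap and adjacency-matrix distance are illustrative stand-ins for the paper's ontology-based and matrix-theoretic measures:

```python
def semantic_similarity(labels_a, labels_b):
    """Jaccard overlap of task labels (a stand-in for ontology-based matching)."""
    a, b = set(labels_a), set(labels_b)
    return len(a & b) / len(a | b) if a | b else 1.0

def structural_distance(adj_a, adj_b):
    """Frobenius-style distance between adjacency matrices of equal-size workflows."""
    d = sum((x - y) ** 2 for row_a, row_b in zip(adj_a, adj_b)
            for x, y in zip(row_a, row_b))
    return d ** 0.5

def combined_score(labels_a, adj_a, labels_b, adj_b, alpha=0.5):
    """Blend semantic similarity and inverted structural distance into one score."""
    sem = semantic_similarity(labels_a, labels_b)
    struct = 1.0 / (1.0 + structural_distance(adj_a, adj_b))
    return alpha * sem + (1 - alpha) * struct
```

    Ranking candidate workflows by this score, after filtering out those violating a user's cost constraints, mirrors the select-then-reuse retrieval the paper describes.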

  18. An Implementation-Focused Bio/Algorithmic Workflow for Synthetic Biology.

    Science.gov (United States)

    Goñi-Moreno, Angel; Carcajona, Marta; Kim, Juhyun; Martínez-García, Esteban; Amos, Martyn; de Lorenzo, Víctor

    2016-10-21

    As synthetic biology moves away from trial and error and embraces more formal processes, workflows have emerged that cover the roadmap from conceptualization of a genetic device to its construction and measurement. This latter aspect (i.e., characterization and measurement of synthetic genetic constructs) has received relatively little attention to date, but it is crucial for their outcome. An end-to-end use case for engineering a simple synthetic device is presented, which is supported by information standards and computational methods and focuses on such characterization/measurement. This workflow captures the main stages of genetic device design and description and offers standardized tools for both population-based measurement and single-cell analysis. To this end, three separate aspects are addressed. First, the specific vector features are discussed. Although device/circuit design has been successfully automated, important structural information is usually overlooked, as in the case of plasmid vectors. The use of the Standard European Vector Architecture (SEVA) is advocated for selecting the optimal carrier of a design and its thorough description in order to unequivocally correlate digital definitions and molecular devices. A digital version of this plasmid format was developed with the Synthetic Biology Open Language (SBOL) along with a software tool that allows users to embed genetic parts in vector cargoes. This enables annotation of a mathematical model of the device's kinetic reactions formatted with the Systems Biology Markup Language (SBML). From that point onward, the experimental results and their in silico counterparts proceed alongside, with constant feedback to preserve consistency between them. A second aspect involves a framework for the calibration of fluorescence-based measurements. One of the most challenging endeavors in standardization, metrology, is tackled by reinterpreting the experimental output in light of simulation results, allowing

  19. Flexible Data-Aware Scheduling for Workflows over an In-Memory Object Store

    Energy Technology Data Exchange (ETDEWEB)

    Duro, Francisco Rodrigo; Garcia Blas, Javier; Isaila, Florin; Wozniak, Justin M.; Carretero, Jesus; Ross, Rob

    2016-01-01

    This paper explores novel techniques for improving the performance of many-task workflows based on the Swift scripting language. We propose novel programmer options for automated distributed data placement and task scheduling. These options trigger a data placement mechanism that distributes intermediate workflow data over the servers of Hercules, a distributed key-value store that can be used to cache file system data. We demonstrate that these new mechanisms can improve the aggregated throughput of many-task workflows by up to 86x, reduce contention on the shared file system, exploit data locality, and trade off locality against load balance.
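    The core idea of steering intermediate data to a known key-value server, so that the consuming task can be scheduled next to its data, can be sketched with a deterministic hash placement. The server names and the hash policy below are hypothetical stand-ins for the paper's placement options.

```python
import hashlib

# Hypothetical Hercules server names; the hash-based policy is a
# stand-in illustrating deterministic data placement, not the paper's
# actual scheduler.
SERVERS = ["hercules-0", "hercules-1", "hercules-2"]

def place(key, servers=SERVERS):
    """Deterministically map an intermediate-data key to one server."""
    h = int(hashlib.md5(key.encode()).hexdigest(), 16)
    return servers[h % len(servers)]

# Producer and consumer agree on the location without a directory lookup,
# so the scheduler can co-locate the consuming task with its input.
server = place("stage2/part-0042.dat")
```

    Because placement is a pure function of the key, any worker can recompute it, which avoids a central metadata lookup on the critical path.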

  20. Implementation of a Workflow Management System for Non-Expert Users

    DEFF Research Database (Denmark)

    Jongejan, Bart

    2016-01-01

    In the Danish CLARIN-DK infrastructure, chaining language technology (LT) tools into a workflow is easy even for a non-expert user, because she only needs to specify the input and the desired output of the workflow. With this information and the registered input and output profiles of the available...... tools, the CLARIN-DK workflow management system (WMS) computes combinations of tools that will give the desired result. This advanced functionality was originally not envisaged, but came within reach by writing the WMS partly in Java and partly in a programming language for symbolic computation, Bracmat...
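    The WMS behaviour described above, computing tool combinations from registered input and output profiles, can be sketched as a breadth-first search over a tool registry. The tool names and profiles below are hypothetical, not CLARIN-DK's actual registry.

```python
from collections import deque

# Hypothetical tool registry: name -> (input profile, output profile).
TOOLS = {
    "tokeniser":  ("text/plain", "tokens"),
    "tagger":     ("tokens", "tagged"),
    "lemmatiser": ("tagged", "lemmas"),
}

def find_chain(src, dst, tools=TOOLS):
    """Breadth-first search for a tool pipeline turning `src` into `dst`."""
    queue = deque([(src, [])])
    seen = {src}
    while queue:
        fmt, chain = queue.popleft()
        if fmt == dst:
            return chain
        for name, (i, o) in tools.items():
            if i == fmt and o not in seen:
                seen.add(o)
                queue.append((o, chain + [name]))
    return None  # no combination of tools yields the desired output

chain = find_chain("text/plain", "lemmas")
```

    The user only supplies the source and target profiles; the search discovers the intermediate tools, which mirrors the "specify input and desired output" interaction described in the abstract.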

  1. An Inter-enterprise Workflow Model for Supply Chain and B2B E-commerce

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The goals of B2B electronic commerce and supply chain management systems are to implement interoperability between independent enterprises, to smooth the information flow between them, and to deploy business processes over multiple enterprises. The inherent characteristics of workflow systems make them suitable for implementing cross-organization management. This paper first proposes an inter-enterprise workflow model, based on agreements, to support the construction of supply chain management systems and B2B electronic commerce. The model extends the standard workflow model. An architecture supporting the model is then introduced, detailing in particular the structure and implementation of the interfaces between enterprises.

  2. Inheritance Optimization of the Extended Case Transfer Model of Interorganizational Workflow Management

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    The extended case transfer architecture for inter-organizational workflow management fits the needs of collaborative commerce. However, during the third step of the extended case transfer architecture, modifications of private workflows may cause fatal problems such as deadlocks, livelocks and dead tasks. These problems can compromise the soundness and efficiency of the overall workflow. This paper presents a Petri-net-based approach to protect the inheritance of public workflows in private domains, and discusses an implementation of our collaborative commerce workflow model.

  3. Facilitating Stewardship of scientific data through standards based workflows

    Science.gov (United States)

    Bastrakova, I.; Kemp, C.; Potter, A. K.

    2013-12-01

    scientific data acquisition and analysis requirements and effective interoperable data management and delivery. This includes participating in national and international dialogue on the development of standards, embedding data management activities in business processes, and developing scientific staff as effective data stewards. A similar approach is applied to geophysical data. By ensuring that the geophysical datasets at GA strictly follow metadata and industry standards, we are able to implement a provenance-based workflow in which the data is easily discoverable, geophysical processing can be applied to it, and results can be stored. The provenance-based workflow enables metadata records for the results to be produced automatically from the metadata of the input dataset.

  4. Managing Written Directives: A Software Solution to Streamline Workflow.

    Science.gov (United States)

    Wagner, Robert H; Savir-Baruch, Bital; Gabriel, Medhat Sam; Halama, James; Bova, Davide

    2017-03-09

    A written directive (WD) is a requirement of the United States Nuclear Regulatory Commission (USNRC) regulations and is required for all uses of I-131 above 1.11 MBq (30 microcuries) and for patients receiving therapy with radiopharmaceuticals. These regulations have also been adopted and are required to be enforced by the agreement states. A paper trail method of WD management is inefficient and prone to error, loss, and duplication. As the options for therapy in Nuclear Medicine increase with the introduction of new radiopharmaceuticals, the time spent on the regulatory burden and paperwork has also increased. The management of regulatory requirements has a significant impact on physician and technologist time utilization and these pressures may increase the potential for inaccurate or incomplete WD data and subsequent regulatory violations. A software tool for the management of WDs using a HIPAA compliant database has been created. This WD software allows for the secure sharing of data among physicians, technologists and managers while saving time, reducing errors and eliminating the possibility of loss and duplication. Methods: Software development was performed using Microsoft Visual Basic® (Microsoft Corporation, Redmond, WA) which is part of the Microsoft Visual Studio® development environment for the Microsoft Windows® platform. The database repository for patient data is Microsoft Access® and stored locally on a HIPAA secure server or hard disk. Once a working version was developed, it was installed and used at our institution for the management of WDs. Updates and modifications were released regularly until no significant problems were found with the operation of the software. Results: The software has been in use at our institution for over two years and has reliably kept track of all directives during that time. All physicians and technologists use the software as part of their daily workflow and find it superior to paper directives. We are able to

  5. The Live Access Server Scientific Product Generation Through Workflow Orchestration

    Science.gov (United States)

    Hankin, S.; Calahan, J.; Li, J.; Manke, A.; O'Brien, K.; Schweitzer, R.

    2006-12-01

    The Live Access Server (LAS) is a well-established Web-application for display and analysis of geo-science data sets. The software, which can be downloaded and installed by anyone, gives data providers an easy way to establish services for their on-line data holdings, so their users can make plots; create and download data sub-sets; compare (difference) fields; and perform simple analyses. Now at version 7.0, LAS has been in operation since 1994. The current "Armstrong" release of LAS V7 consists of three components in a tiered architecture: user interface, workflow orchestration and Web Services. The LAS user interface (UI) communicates with the LAS Product Server via an XML protocol embedded in an HTTP "get" URL. Libraries (APIs) have been developed in Java, JavaScript and perl that can readily generate this URL. As a result of this flexibility it is common to find LAS user interfaces of radically different character, tailored to the nature of specific datasets or the mindset of specific users. When a request is received by the LAS Product Server (LPS -- the workflow orchestration component), business logic converts this request into a series of Web Service requests invoked via SOAP. These "back-end" Web services perform data access and generate products (visualizations, data subsets, analyses, etc.). LPS then packages these outputs into final products (typically HTML pages) via Jakarta Velocity templates for delivery to the end user. "Fine grained" data access is performed by back-end services that may utilize JDBC for data base access; the OPeNDAP "DAPPER" protocol; or (in principle) the OGC WFS protocol. Back-end visualization services are commonly legacy science applications wrapped in Java or Python (or perl) classes and deployed as Web Services accessible via SOAP. Ferret is the default visualization application used by LAS, though other applications such as Matlab, CDAT, and GrADS can also be used. Other back-end services may include generation of Google

  6. Relation-Based Lightweight Workflow Engine

    Institute of Scientific and Technical Information of China (English)

    何清法; 李国杰; 焦丽梅; 刘力力

    2001-01-01

    Workflow technology plays an important role in the development of critical business applications. Based on an analysis of the requirements for developing critical business applications, a framework for a relation-based lightweight workflow engine is presented, built on top of a conventional relational database system. It consists of three components: an organization model, an information model and a control model. The reasons for adopting the concepts of relations and lightweight design in the workflow engine are discussed in depth, and the design principles, representation and implementation of the organization, information and control models are described in turn. A prototype of this workflow engine has been applied to a project on the automation of trademark registration and administration, which shows that the workflow engine can markedly shorten the development cycle of critical business applications.

  7. Bridging ImmunoGenomic Data Analysis Workflow Gaps (BIGDAWG): An integrated case-control analysis pipeline.

    Science.gov (United States)

    Pappas, Derek J; Marin, Wesley; Hollenbach, Jill A; Mack, Steven J

    2016-03-01

    Bridging ImmunoGenomic Data-Analysis Workflow Gaps (BIGDAWG) is an integrated data-analysis pipeline designed for the standardized analysis of highly-polymorphic genetic data, specifically for the HLA and KIR genetic systems. Most modern genetic analysis programs are designed for the analysis of single nucleotide polymorphisms, but the highly polymorphic nature of HLA and KIR data require specialized methods of data analysis. BIGDAWG performs case-control data analyses of highly polymorphic genotype data characteristic of the HLA and KIR loci. BIGDAWG performs tests for Hardy-Weinberg equilibrium, calculates allele frequencies and bins low-frequency alleles for k×2 and 2×2 chi-squared tests, and calculates odds ratios, confidence intervals and p-values for each allele. When multi-locus genotype data are available, BIGDAWG estimates user-specified haplotypes and performs the same binning and statistical calculations for each haplotype. For the HLA loci, BIGDAWG performs the same analyses at the individual amino-acid level. Finally, BIGDAWG generates figures and tables for each of these comparisons. BIGDAWG obviates the error-prone reformatting needed to traffic data between multiple programs, and streamlines and standardizes the data-analysis process for case-control studies of highly polymorphic data. BIGDAWG has been implemented as the bigdawg R package and as a free web application at bigdawg.immunogenomics.org.
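    The 2×2 statistics mentioned above (chi-squared test, odds ratio, confidence interval) reduce to straightforward arithmetic per allele. The sketch below illustrates that arithmetic in standalone form; it is not the bigdawg R package itself, and the Wald-style confidence interval is a common textbook choice, not necessarily BIGDAWG's.

```python
import math

def allele_2x2(a, b, c, d):
    """2x2 test for one allele.

    a/c = cases with/without the allele, b/d = controls with/without.
    Returns the 1-df chi-squared statistic (no continuity correction),
    the odds ratio, and a 95% Wald confidence interval.
    """
    n = a + b + c + d
    expected = [(a + b) * (a + c) / n, (a + b) * (b + d) / n,
                (c + d) * (a + c) / n, (c + d) * (b + d) / n]
    chi2 = sum((o - e) ** 2 / e for o, e in zip([a, b, c, d], expected))
    odds = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo, hi = (math.exp(math.log(odds) + s * 1.96 * se) for s in (-1, 1))
    return chi2, odds, (lo, hi)

chi2, odds, (lo, hi) = allele_2x2(10, 10, 10, 10)
```

    A perfectly balanced table gives chi-squared 0 and an odds ratio of 1, with the confidence interval straddling 1.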

  8. Laboratory Information Management Software for genotyping workflows: applications in high throughput crop genotyping

    Directory of Open Access Journals (Sweden)

    Prasanth VP

    2006-08-01

    Full Text Available Abstract Background With the advances in DNA sequencer-based technologies, it has become possible to automate several steps of the genotyping process leading to increased throughput. To efficiently handle the large amounts of genotypic data generated and help with quality control, there is a strong need for a software system that can help with the tracking of samples and capture and management of data at different steps of the process. Such systems, while serving to manage the workflow precisely, also encourage good laboratory practice by standardizing protocols, recording and annotating data from every step of the workflow. Results A laboratory information management system (LIMS has been designed and implemented at the International Crops Research Institute for the Semi-Arid Tropics (ICRISAT that meets the requirements of a moderately high throughput molecular genotyping facility. The application is designed as modules and is simple to learn and use. The application leads the user through each step of the process from starting an experiment to the storing of output data from the genotype detection step with auto-binning of alleles; thus ensuring that every DNA sample is handled in an identical manner and all the necessary data are captured. The application keeps track of DNA samples and generated data. Data entry into the system is through the use of forms for file uploads. The LIMS provides functions to trace back to the electrophoresis gel files or sample source for any genotypic data and for repeating experiments. The LIMS is being presently used for the capture of high throughput SSR (simple-sequence repeat genotyping data from the legume (chickpea, groundnut and pigeonpea and cereal (sorghum and millets crops of importance in the semi-arid tropics. Conclusion A laboratory information management system is available that has been found useful in the management of microsatellite genotype data in a moderately high throughput genotyping

  9. Research on Office Automation Workflow Based on Petri Nets in Colleges and Universities

    Institute of Scientific and Technical Information of China (English)

    董淑娟; 张哲

    2014-01-01

    This article analyzes the basic structure and modeling process of the office automation subsystem workflow in colleges and universities, and designs the Petri-net workflow model for document circulation, the workflow routing, the document information and the document-processing information. Furthermore, it discusses how the model realizes document circulation, illustrated with a concrete example.
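    A Petri-net model of document circulation can be sketched as places holding tokens (document states) and transitions consuming and producing them. The place and transition names below are illustrative, not the article's actual model.

```python
# Minimal Petri-net sketch of document routing (illustrative
# place/transition names, not the article's actual model).
PLACES = {"drafted": 1, "reviewed": 0, "approved": 0}
TRANSITIONS = {
    "review":  ({"drafted": 1},  {"reviewed": 1}),
    "approve": ({"reviewed": 1}, {"approved": 1}),
}

def enabled(t, marking):
    """A transition is enabled when every input place holds enough tokens."""
    pre, _ = TRANSITIONS[t]
    return all(marking[p] >= k for p, k in pre.items())

def fire(t, marking):
    """Consume tokens from input places and produce them in output places."""
    assert enabled(t, marking), f"{t} not enabled"
    pre, post = TRANSITIONS[t]
    for p, k in pre.items():
        marking[p] -= k
    for p, k in post.items():
        marking[p] += k

m = dict(PLACES)
fire("review", m)
fire("approve", m)
```

    After both firings the single token has moved from "drafted" to "approved", tracing one document through the workflow.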

  10. Accelerating Science Impact through Big Data Workflow Management and Supercomputing

    Science.gov (United States)

    De, K.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Ryabinkin, E.; Wenaus, T.

    2016-02-01

    The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. ATLAS, one of the largest collaborations ever assembled in the history of science, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment relies on a heterogeneous distributed computational infrastructure. To manage the workflow for all data processing on hundreds of data centers, the PanDA (Production and Distributed Analysis) Workload Management System is used. An ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF), is being realized within the BigPanDA and megaPanDA projects. These projects are now exploring how PanDA might be used for managing computing jobs that run on supercomputers including OLCF's Titan and NRC-KI HPC2. The main idea is to reuse, as much as possible, existing components of the PanDA system that are already deployed on the LHC Grid for analysis of physics data. The next generation of PanDA will allow many data-intensive sciences employing a variety of computing platforms to benefit from ATLAS experience and proven tools in highly scalable processing.

  11. A novel workflow for seismic net pay estimation with uncertainty

    CERN Document Server

    Glinsky, Michael E; Unaldi, Muhlis; Nagassar, Vishal

    2016-01-01

    This paper presents a novel workflow for seismic net pay estimation with uncertainty, demonstrated on the Cassra/Iris Field. The theory for the stochastic wavelet derivation (which estimates the seismic noise level along with the wavelet, the time-to-depth mapping, and their uncertainties), the stochastic sparse spike inversion, and the net pay estimation (using secant areas) along with its uncertainty will be outlined. This includes benchmarking of the methodology on a synthetic model. A critical part of this process is the calibration of the secant areas, which is done in a two-step process. First, a preliminary calibration is made with stochastic reflection response modeling using rock physics relationships derived from the well logs. Second, the calibration is refined to account for the net pay encountered at the wells. Finally, a variogram structure is estimated from the extracted secant area map, then used to build in the lateral correlation to the ensemble of net pay maps while matc...

  12. Enabling Efficient Climate Science Workflows in High Performance Computing Environments

    Science.gov (United States)

    Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.

    2015-12-01

    A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks presents a myriad of challenges when running on a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require a lot of forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project we highlight a stack of tools our team utilizes and has developed to ensure that large scale simulation and analysis work are commonplace and provide operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth Systems Grid Federation (ESGF), all while executing everything in between in a scalable environment in a task parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases they have been applied to.
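    The task-parallel pattern described above, many independent per-file analyses fanned out across workers, can be sketched with the standard library as a stand-in for MPI. The analysis routine and file names below are hypothetical placeholders.

```python
from concurrent.futures import ThreadPoolExecutor

def analyse(path):
    """Hypothetical per-file analysis routine; here just a summary stub."""
    return path, len(path)

# Hypothetical simulation output files.
files = [f"sim_{i}.nc" for i in range(8)]

# Fan the independent analyses out across a worker pool; with MPI the
# same shape appears as rank-distributed tasks instead of threads.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = dict(pool.map(analyse, files))
```

    Because each file is processed independently, the same structure scales from a laptop pool to rank-per-task MPI scheduling on a machine like Edison.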

  13. Automated Finite State Workflow for Distributed Data Production

    Science.gov (United States)

    Hajdu, L.; Didenko, L.; Lauret, J.; Amol, J.; Betts, W.; Jang, H. J.; Noh, S. Y.

    2016-10-01

    In statistically hungry science domains, data deluges can be both a blessing and a curse. They allow the narrowing of statistical errors from known measurements, and open the door to new scientific opportunities as research programs mature. They are also a testament to the efficiency of experimental operations. However, growing data samples may need to be processed with little or no opportunity for huge increases in computing capacity. A standard strategy has thus been to share resources across multiple experiments at a given facility. Another has been to use middleware that “glues” resources across the world so they are able to locally run the experimental software stack (either natively or virtually). We describe a framework STAR has successfully used to reconstruct a ~400 TB dataset consisting of over 100,000 jobs submitted to a remote site in Korea from STAR's Tier 0 facility at the Brookhaven National Laboratory. The framework automates the full workflow, taking raw data files from tape and writing Physics-ready output back to tape without operator or remote site intervention. Through hardening we have demonstrated 97(±2)% efficiency, over a period of 7 months of operation. The high efficiency is attributed to finite state checking with retries to encourage resilience in the system over capricious and fallible infrastructure.
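    The finite-state checking with retries credited above for the framework's resilience can be sketched as a small state machine driving one job. The state names and retry policy are illustrative, not STAR's actual implementation.

```python
def run_with_retries(task, max_retries=3):
    """Drive one job through a tiny finite-state machine with retries.

    Illustrative states (QUEUED/RUNNING/RETRY/DONE/FAILED), not the
    STAR framework's actual state set.
    """
    state, attempts = "QUEUED", 0
    while state not in ("DONE", "FAILED"):
        if state == "QUEUED":
            state = "RUNNING"
        elif state == "RUNNING":
            attempts += 1
            try:
                task()
                state = "DONE"
            except Exception:
                state = "RETRY" if attempts < max_retries else "FAILED"
        elif state == "RETRY":
            state = "QUEUED"  # requeue and try again
    return state, attempts

calls = {"n": 0}
def flaky():
    """Simulated capricious infrastructure: fails twice, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")

outcome = run_with_retries(flaky)
```

    Transient failures are absorbed by requeueing; only after exhausting the retry budget does the job reach a terminal FAILED state, which is the behaviour that buys resilience over fallible infrastructure.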

  14. Automating radiologist workflow part 1: the digital consultation.

    Science.gov (United States)

    Reiner, Bruce

    2008-10-01

    With the widespread adoption of picture archiving and communication systems and filmless imaging, ubiquitous and instantaneous access to imaging data has resulted in decreased radiologist-clinician consultations. It is therefore imperative that the radiology community develop new communication strategies to improve both the timeliness and the perceived value of the radiology report. One strategy to accomplish this goal is the creation of electronic consultation tools, which can effectively recreate radiologist workflow and the identification of key pathologic findings in an easy-to-use, well-organized, and timely fashion. This would be accomplished by recording radiologist-computer interactions using an electronic auditing tool, storing these interactions in an extensible markup language schema, which can subsequently be played back at a later point in time to recreate the radiologist consultation. This approach has the added benefits of allowing the radiologist to selectively edit content to the needs of different clinician users, index the comprehensive consultation into pathology-specific components, and perform asynchronous bidirectional consultations. This electronic consultation tool would result in the creation of context and user-specific consultation files, which can in turn be integrated with clinical data from electronic medical records.

  15. A workflow model to analyse pediatric emergency overcrowding.

    Science.gov (United States)

    Zgaya, Hayfa; Ajmi, Ines; Gammoudi, Lotfi; Hammadi, Slim; Martinot, Alain; Beuscart, Régis; Renard, Jean-Marie

    2014-01-01

    The greatest source of delay in patient flow is the waiting time from the health care request, and especially from the bed request to exit from the Pediatric Emergency Department (PED) for hospital admission. This wait accounts for 70% of the time these patients spend in the PED waiting rooms. Our objective in this study is to identify strain indicators and bottlenecks that contribute to overcrowding. Patient flow through the PED was mapped over a continuous two-year period from January 2011 to December 2012. Our method uses real data, collected from visits to the PED of the Regional University Hospital Center (CHRU) of Lille (France), to construct an accurate and complete representation of the PED processes. The result is a workflow model of the patient journey that represents, as faithfully as possible, the reality of the PED of the CHRU of Lille. This model allowed us to identify sources of delay in patient flow and aspects of PED activity that could be improved. It must be sufficiently detailed to support an analysis that identifies the dysfunctions of the PED and proposes and estimates strain-prevention indicators. Our study is integrated into the French National Research Agency project titled "Hospital: optimization, simulation and avoidance of strain" (ANR HOST).

  16. Accelerating Science Impact through Big Data Workflow Management and Supercomputing

    Directory of Open Access Journals (Sweden)

    De K.

    2016-01-01

    Full Text Available The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. ATLAS, one of the largest collaborations ever assembled in the history of science, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment relies on a heterogeneous distributed computational infrastructure. To manage the workflow for all data processing on hundreds of data centers, the PanDA (Production and Distributed Analysis) Workload Management System is used. An ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF), is being realized within the BigPanDA and megaPanDA projects. These projects are now exploring how PanDA might be used for managing computing jobs that run on supercomputers including OLCF's Titan and NRC-KI HPC2. The main idea is to reuse, as much as possible, existing components of the PanDA system that are already deployed on the LHC Grid for analysis of physics data. The next generation of PanDA will allow many data-intensive sciences employing a variety of computing platforms to benefit from ATLAS experience and proven tools in highly scalable processing.

  17. A flow cytometry-based workflow for detection and quantification of anti-plasmodial antibodies in vaccinated and naturally exposed individuals

    DEFF Research Database (Denmark)

    Ajua, Anthony; Engleitner, Thomas; Esen, Meral;

    2012-01-01

    information about natural exposure and vaccine immunogenicity. A novel, cytometry-based workflow for the quantitative detection of anti-plasmodial antibodies in human serum is presented. METHODS: Fixed red blood cells (RBCs), infected with late stages of P. falciparum, were utilized to detect malaria......-specific antibodies by flow cytometry with subsequent automated data analysis. Available methods for data-driven analysis of cytometry data were assessed and a new overlap subtraction algorithm (OSA) based on open source software was developed. The complete workflow was evaluated using sera from two GMZ2 malaria...... vaccine trials in semi-immune adults and pre-school children residing in a malaria endemic area. RESULTS: Fixation, permeabilization, and staining of infected RBCs were adapted for best operation in flow cytometry. As asexual vaccine candidates are designed to induce antibody patterns similar to semi...

  18. A framework for service enterprise workflow simulation with multi-agents cooperation

    Science.gov (United States)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Dynamic process modelling for service businesses is the key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach for analysing service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service-workflow-oriented framework for the process simulation of service businesses that uses multi-agent cooperation to address these issues. The social rationality of agents is introduced into the proposed framework: adopting rationality as one social factor in decision-making strategies, flexible scheduling of activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.

  19. Task-driven equipment inspection system based on safe workflow model

    Science.gov (United States)

    Guo, Xinyou; Liu, Yangguang

    2010-12-01

    An equipment inspection system is one that contains a number of equipment queues served in cyclic order. In order to satisfy the multi-task scheduling and multi-task combination requirements of equipment inspection systems, we propose in this paper a model based on an inspection workflow. On the one hand, the model organizes all kinds of equipment according to the inspection workflow, elemental work units according to inspection tasks, and combination elements according to tasks defined by users. We propose a three-dimensional workflow model for equipment inspection systems comprising an organization sub-model, a process sub-model and a data sub-model. On the other hand, the model is based on security authorization defined by the relations among roles, tasks, pre-defined business workflows and inspection data. A system based on the proposed framework is safe and efficient, and our implementation shows that it is easy to operate and manage.
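    Serving equipment queues in cyclic order, as described above, amounts to round-robin scheduling. The sketch below is a minimal illustration with hypothetical queue names, not the paper's scheduler.

```python
from collections import deque

def cyclic_service(queues, quantum=1):
    """Serve named inspection queues in cyclic order, `quantum` items per visit.

    `queues` maps a queue name to a list of pending inspection jobs;
    returns the (queue, job) pairs in the order they were served.
    """
    order, served = deque(queues), []
    while any(queues.values()):
        name = order[0]
        order.rotate(-1)  # advance the cycle to the next queue
        for _ in range(quantum):
            if queues[name]:
                served.append((name, queues[name].pop(0)))
    return served

# Two hypothetical equipment queues with pending inspection jobs.
log = cyclic_service({"pump-A": ["job1", "job2"], "valve-B": ["job3"]})
```

    Each queue gets one job per pass, so no queue starves even when another holds a long backlog.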

  20. A Web application for the management of clinical workflow in image-guided and adaptive proton therapy for prostate cancer treatments.

    Science.gov (United States)

    Yeung, Daniel; Boes, Peter; Ho, Meng Wei; Li, Zuofeng

    2015-05-08

    Image-guided radiotherapy (IGRT), based on radiopaque markers placed in the prostate gland, was used for proton therapy of prostate patients. Orthogonal X-rays and the IBA Digital Image Positioning System (DIPS) were used for setup correction prior to treatment and were repeated after treatment delivery. Following a rationale for margin estimates similar to that of van Herk,(1) the daily post-treatment DIPS data were analyzed to determine if an adaptive radiotherapy plan was necessary. A Web application using ASP.NET MVC5, Entity Framework, and an SQL database was designed to automate this process. The designed features included state-of-the-art Web technologies, a domain model closely matching the workflow, a database-supporting concurrency and data mining, access to the DIPS database, secured user access and roles management, and graphing and analysis tools. The Model-View-Controller (MVC) paradigm allowed clean domain logic, unit testing, and extensibility. Client-side technologies, such as jQuery, jQuery Plug-ins, and Ajax, were adopted to achieve a rich user environment and fast response. Data models included patients, staff, treatment fields and records, correction vectors, DIPS images, and association logics. Data entry, analysis, workflow logics, and notifications were implemented. The system effectively modeled the clinical workflow and IGRT process.
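    The margin rationale "similar to that of van Herk" cited above is commonly summarized by the population recipe M = 2.5Σ + 0.7σ, combining systematic (Σ) and random (σ) setup errors. Whether this paper uses exactly those coefficients is not stated, so the sketch below should be read as the textbook formula, not the authors' implementation.

```python
def van_herk_margin(sigma_sys, sigma_rand):
    """Textbook population margin recipe: M = 2.5*Sigma + 0.7*sigma.

    sigma_sys: systematic setup error (e.g. mm), sigma_rand: random
    setup error; the 2.5/0.7 coefficients are the classic van Herk
    values, assumed here rather than taken from this paper.
    """
    return 2.5 * sigma_sys + 0.7 * sigma_rand

# Example: 2 mm systematic and 3 mm random error.
margin = van_herk_margin(2.0, 3.0)
```

    Post-treatment DIPS vectors collected daily supply the Σ and σ estimates; when the implied margin exceeds the planned one, an adaptive replan is indicated.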

  1. Data Intensive Scientific Workflows on a Federated Cloud: CRADA Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Garzoglio, Gabriele [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2015-10-31

    The Fermilab Scientific Computing Division and the KISTI Global Science Experimental Data Hub Center have built a prototypical large-scale infrastructure to run stakeholders' scientific workflows on multiple cloud resources. The demonstrations have been in the areas of (a) Data-Intensive Scientific Workflows on Federated Clouds, (b) Interoperability and Federation of Cloud Resources, and (c) Virtual Infrastructure Automation to enable On-Demand Services.

  2. Workflows for fluxomics in the framework of PhenoMeNal project

    OpenAIRE

    2016-01-01

    In the framework of the PhenoMeNal project and e-infrastructures (www.phenomenal-h2020.eu/home/), two workflows were prepared using previously developed programs for analysis of intracellular fluxes from isotopologue distributions. Isotopologue distributions will be available on data repositories in MetaboLights (www.ebi.ac.uk/metabolights). With this aim, programs were adapted to the workflows, docker container dockerfiles for each program were uploaded to GitHub repositories, added to Jenki...

  3. Precise Quantitative Analysis of Probabilistic Business Process Model and Notation Workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2013-01-01

    We present a framework for modeling and analysis of real-world business workflows. We present a formalized core subset of the business process modeling and notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... the entire BPMN language, allow for more complex annotations and ultimately to automatically synthesize workflows by composing predefined subprocesses, in order to achieve a configuration that is optimal for parameters of interest....
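The kind of quantitative analysis described here, evaluating reward annotations over probabilistic branching, can be illustrated with a minimal fixed-point sketch. This is not the authors' tool (which targets formal model checking); the states, probabilities and rewards below are hypothetical:

```python
def expected_reward(transitions, rewards, start, sweeps=200):
    """Expected total reward accumulated from `start` until termination.

    transitions: {state: [(prob, successor), ...]}, probabilities summing
    to 1; states absent from `transitions` are terminal. Solves
    V(s) = r(s) + sum_i p_i * V(s_i) by repeated fixed-point sweeps.
    """
    states = set(transitions) | {s for succs in transitions.values()
                                 for _, s in succs}
    V = {s: 0.0 for s in states}
    for _ in range(sweeps):
        for s, succs in transitions.items():
            V[s] = rewards.get(s, 0.0) + sum(p * V[t] for p, t in succs)
    return V[start]

# Task A, then with equal probability a cheap task B or an expensive
# task C, then the end event; rewards model per-task cost.
flow = {"A": [(0.5, "B"), (0.5, "C")],
        "B": [(1.0, "end")],
        "C": [(1.0, "end")]}
cost = {"A": 1.0, "B": 2.0, "C": 4.0}
```

The same fixed-point iteration also handles loops with escape probability, e.g. a retry task that repeats with probability 0.5.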

  4. An Optimization Algorithm for Multipath Parallel Allocation for Service Resource in the Simulation Task Workflow

    OpenAIRE

    Zhiteng Wang; Hongjun Zhang; Rui Zhang; Yong Li; Xuliang Zhang

    2014-01-01

    Service-oriented modeling and simulation are hot issues in the field of modeling and simulation, and service resources must be invoked while a simulation task workflow is running. How to optimize service resource allocation to ensure that the task completes effectively is an important issue in this area. In the military modeling and simulation field, it is important to improve the probability of success and the timeliness of the simulation task workflow. Therefore, this paper proposes an optimiz...

  5. Data Processing Workflows to Support Reproducible Data-driven Research in Hydrology

    Science.gov (United States)

    Goodall, J. L.; Essawy, B.; Xu, H.; Rajasekar, A.; Moore, R. W.

    2015-12-01

    Geoscience analyses often require the use of existing data sets that are large, heterogeneous, and maintained by different organizations. A particular challenge in creating reproducible analyses using these data sets is automating the workflows required to transform raw datasets into model-specific input files and finally into publication-ready visualizations. Data grids, such as the Integrated Rule-Oriented Data System (iRODS), are architectures that allow scientists to access and share large data sets that are geographically distributed on the Internet, but appear to the scientist as a single file management system. The DataNet Federation Consortium (DFC) project is built on iRODS and aims to demonstrate data and computational interoperability across scientific communities. This paper leverages iRODS and the DFC to demonstrate how hydrologic modeling procedures can be encapsulated as workflows using the iRODS concept of Workflow Structured Objects (WSO). An example use case is presented for automating hydrologic model post-processing routines that demonstrates how WSOs can be created and used within the DFC to automate the creation of data visualizations from large model output collections. By co-locating the workflow used to create the visualization with the data collection, the use case demonstrates how data grid technology aids in the reuse, reproducibility, and sharing of workflows within scientific communities.

  6. An iterative expanding and shrinking process for processor allocation in mixed-parallel workflow scheduling.

    Science.gov (United States)

    Huang, Kuo-Chan; Wu, Wei-Ya; Wang, Feng-Jian; Liu, Hsiao-Ching; Hung, Chun-Hao

    2016-01-01

    Parallel computation has been widely applied in a variety of large-scale scientific and engineering applications. Many studies indicate that exploiting both task and data parallelism, i.e. mixed-parallel workflows, to solve large computational problems achieves better efficiency than either pure task parallelism or pure data parallelism. Scheduling traditional workflows of pure task parallelism on parallel systems has long been known to be an NP-complete problem. Mixed-parallel workflow scheduling has to deal with the additional challenging issue of processor allocation. In this paper, we explore the processor allocation issue in scheduling mixed-parallel workflows of moldable tasks, called M-tasks, and propose an Iterative Allocation Expanding and Shrinking (IAES) approach. Compared to previous approaches, our IAES has two distinguishing features. The first is allocating more processors to the tasks on allocated critical paths to effectively reduce the makespan of workflow execution. The second is allowing the processor allocation of an M-task to shrink during the iterative procedure, resulting in a more flexible and effective process for finding better allocations. The proposed IAES approach has been evaluated with a series of simulation experiments and compared to several well-known previous methods, including CPR, CPA, MCPA, and MCPA2. The experimental results indicate that our IAES approach outperforms those previous methods significantly in most situations, especially when nodes of the same layer in a workflow might have unequal workloads.
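The expand-and-shrink idea can be illustrated with a toy single-layer sketch. This is not the authors' IAES algorithm; it assumes an ideal linear speedup model and hypothetical task workloads:

```python
def allocate_layer(tasks, work, total_procs):
    """Toy processor allocation for one layer of moldable tasks.

    Assumes runtime(t) = work[t] / procs(t) (ideal linear speedup).
    Every task first gets one processor; spare processors are handed to
    the current critical (longest-running) task, and afterwards single
    processors migrate from lightly loaded tasks to the critical task
    while that strictly shortens the layer makespan.
    """
    alloc = {t: 1 for t in tasks}
    rt = lambda t: work[t] / alloc[t]
    for _ in range(total_procs - len(tasks)):   # expand onto the critical task
        alloc[max(tasks, key=rt)] += 1
    while True:                                  # shrink a donor, expand crit
        crit = max(tasks, key=rt)
        donor = min((t for t in tasks if t != crit and alloc[t] > 1),
                    key=rt, default=None)
        if donor is None:
            break
        new_crit = work[crit] / (alloc[crit] + 1)
        new_donor = work[donor] / (alloc[donor] - 1)
        if max(new_crit, new_donor) < rt(crit):
            alloc[crit] += 1
            alloc[donor] -= 1
        else:
            break
    return alloc
```

Each accepted move strictly reduces the number of tasks running at the current makespan, so the loop terminates; the real IAES additionally works across layers and allocated critical paths.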

  7. Provenance of things - describing geochemistry observation workflows using PROV-O

    Science.gov (United States)

    Cox, S. J. D.; Car, N. J.

    2015-12-01

    Geochemistry observations typically follow a complex preparation process after sample retrieval from the field. Descriptions of these are required to allow readers and other data users to assess the reliability of the data produced, and to ensure reproducibility. While laboratory notebooks are used for private record-keeping, and laboratory information management systems (LIMS) on a facility basis, these data are not generally published, and there are no standard formats for transfer. And while there is some standardization of workflows, this is often scoped to a lab or an instrument. New procedures and workflows are being developed continually; in fact this is a key expectation in the development of the science. Thus formalization of the description of sample preparation and observations must be both rigorous and flexible. We have been exploring the use of the W3C Provenance model (PROV) to capture complete traces, including both the real-world things and the data generated. PROV has a core data model that distinguishes between entities, agents and activities involved in producing a piece of data or thing in the world. While the design of PROV was primarily conditioned by stories concerning information resources, application is not restricted to the production of digital or information assets; PROV allows a comprehensive trace of predecessor entities and transformations at any level of detail. In this paper we demonstrate the use of PROV for describing specimens managed for scientific observations. Two examples are considered: a geological sample which undergoes a typical preparation process for measurements of the concentration of a particular chemical substance, and the collection, taxonomic classification and eventual publication of an insect specimen. PROV enables the material that goes into the instrument to be linked back to the sample retrieved in the field. This complements the IGSN system, which focuses on registration of field sample identity to support the
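A PROV-style trace of the geological-sample example can be sketched with plain data structures. The entity and activity names below are hypothetical, and a real system would emit PROV-O RDF rather than Python tuples, but the core relations (prov:used, prov:wasGeneratedBy, prov:wasDerivedFrom) are the ones named in the PROV data model:

```python
# PROV-style trace as subject-predicate-object triples; names hypothetical.
trace = [
    ("crushed_split", "wasDerivedFrom", "field_sample"),
    ("dissolution", "used", "crushed_split"),
    ("solution", "wasGeneratedBy", "dissolution"),
    ("icpms_run", "used", "solution"),
    ("concentration_result", "wasGeneratedBy", "icpms_run"),
]

def ancestors(entity):
    """All predecessor entities of `entity`, following derivations and
    the inputs used by each generating activity."""
    preds, frontier = set(), {entity}
    while frontier:
        node = frontier.pop()
        for subj, pred, obj in trace:
            if subj != node:
                continue
            if pred == "wasDerivedFrom":
                found = {obj}
            elif pred == "wasGeneratedBy":
                found = {o for s, p, o in trace if s == obj and p == "used"}
            else:
                continue
            frontier |= found - preds
            preds |= found
    return preds
```

Walking the trace back from the measurement result recovers the full chain down to the field sample, which is exactly the link to IGSN-registered identity that the abstract describes.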

  8. Integrated exploration workflow in the south Middle Magdalena Valley (Colombia)

    Science.gov (United States)

    Moretti, Isabelle; Charry, German Rodriguez; Morales, Marcela Mayorga; Mondragon, Juan Carlos

    2010-03-01

    HC exploration is presently active in the southern part of the Middle Magdalena Valley, but only moderate-size discoveries have been made to date. The majority of these discoveries are at shallow depth in the Tertiary section. The structures located in the Valley are faulted anticlines charged by lateral migration from the Cretaceous source rocks that are assumed to be present and mature eastward below the main thrusts and the Guaduas Syncline. Upper Cretaceous reservoirs have also been positively tested. To reduce the risks linked to the exploration of deeper structures below the western thrusts of the Eastern Cordillera, an integrated study was carried out. It includes the acquisition of new seismic data, the integration of all surface and subsurface data within a 3D geomodel, a quality control of the structural model by restoration, and a modeling of the petroleum system (presence and maturity of the Cretaceous source rocks, potential migration pathways). The various steps of this workflow will be presented as well as the main conclusions in terms of source rock, deformation phases and timing of thrust emplacement versus oil maturation and migration. Our data suggest (or confirm): the good potential of the Umir Fm as a source rock; the early (Paleogene) deformation of the Bituima Trigo fault area; the maturity gap within the Cretaceous source rock between the hanging wall and footwall of the Bituima fault, which proves an initial offset of Cretaceous burial in the range of 4.5 km between the Upper Cretaceous series westward and the Lower Cretaceous ones eastward of this fault zone; and the post-Miocene weak reactivation as dextral strike-slip of Cretaceous faults such as the San Juan de Rio Seco fault, which corresponds to a change in the Cretaceous thickness and therefore in the depth of the thrust decollement.

  9. Text-mining-assisted biocuration workflows in Argo.

    Science.gov (United States)

    Rak, Rafal; Batista-Navarro, Riza Theresa; Rowley, Andrew; Carter, Jacob; Ananiadou, Sophia

    2014-01-01

    Biocuration activities have been broadly categorized into the selection of relevant documents, the annotation of biological concepts of interest and the identification of interactions between the concepts. Text mining has been shown to have the potential to significantly reduce the effort of biocurators in all three activities, and various semi-automatic methodologies have been integrated into curation pipelines to support them. We investigate the suitability of Argo, a workbench for building text-mining solutions with the use of a rich graphical user interface, for the process of biocuration. Central to Argo are customizable workflows that users compose by arranging available elementary analytics to form task-specific processing units. A built-in manual annotation editor is the single most used biocuration tool of the workbench, as it allows users to create annotations directly in text, as well as modify or delete annotations created by automatic processing components. Apart from syntactic and semantic analytics, the ever-growing library of components includes several data readers and consumers that support well-established as well as emerging data interchange formats such as XMI, RDF and BioC, which facilitate the interoperability of Argo with other platforms or resources. To validate the suitability of Argo for curation activities, we participated in the BioCreative IV challenge, whose purpose was to evaluate Web-based systems addressing user-defined biocuration tasks. Argo proved to have the edge over other systems in terms of flexibility of defining biocuration tasks. As expected, the versatility of the workbench inevitably lengthened the time the curators spent on learning the system before taking on the task, which may have affected the usability of Argo. The participation in the challenge gave us an opportunity to gather valuable feedback and identify areas of improvement, some of which have already been introduced. Database URL: http://argo.nactem.ac.uk.

  10. FLOSYS--a web-accessible workflow system for protocol-driven biomolecular sequence analysis.

    Science.gov (United States)

    Badidi, E; Lang, B F; Burger, G

    2004-11-01

    FLOSYS is an interactive web-accessible bioinformatics workflow system designed to assist biologists in multi-step data analyses. FLOSYS allows the user to create complex analysis pathways (protocols) graphically, similar to drawing a flowchart: icons representing particular bioinformatics tools are dragged and dropped onto a canvas and lines connecting those icons are drawn to specify the relationships between the tools. In addition, FLOSYS permits the user to select input data, execute the protocol and store the results in a personal workspace. The three-tier architecture of FLOSYS has been implemented in Java and uses a relational database system together with new technologies for distributed and web computing such as CORBA, RMI, JSP and JDBC. The prototype of FLOSYS, which is part of the bioinformatics workbench AnaBench, is accessible on-line at http://malawimonas.bcm.umontreal.ca:8091/anabench. The entire package is available on request to academic groups who wish to have a customized local analysis environment for research or teaching.
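The protocol-as-flowchart idea, a DAG of tools each executed once its inputs are ready, can be sketched as follows. FLOSYS itself is a Java/CORBA system; the tool names and functions here are hypothetical stand-ins:

```python
def run_protocol(graph, tools, inputs):
    """Execute a protocol DAG, memoizing each tool's result.

    graph maps each tool to the tools whose outputs it consumes; tools
    maps names to functions taking a list of upstream results. Tools
    with no predecessors receive the raw `inputs`.
    """
    results = {}
    def run(name):
        if name not in results:
            deps = graph.get(name, [])
            args = [run(d) for d in deps] if deps else [inputs]
            results[name] = tools[name](args)
        return results[name]
    for name in tools:
        run(name)
    return results

# A three-step protocol: clean the sequence data, align it, build a tree.
out = run_protocol(
    {"clean": [], "align": ["clean"], "tree": ["align"]},
    {"clean": lambda a: a[0].strip(),
     "align": lambda a: a[0] + "|aligned",
     "tree": lambda a: a[0] + "|tree"},
    "  seq  ")  # out["tree"] == "seq|aligned|tree"
```

Memoization means a tool consumed by several downstream icons runs only once, mirroring how a drawn protocol graph is evaluated.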

  11. iLAP: a workflow-driven software for experimental protocol development, data acquisition and analysis

    Directory of Open Access Journals (Sweden)

    McNally James

    2009-01-01

    Full Text Available Abstract Background In recent years, the genome biology community has expended considerable effort to confront the challenges of managing heterogeneous data in a structured and organized way, and has developed laboratory information management systems (LIMS) for both raw and processed data. On the other hand, electronic notebooks were developed to record and manage scientific data and facilitate data sharing. Software which enables both management of large datasets and digital recording of laboratory procedures would serve a real need in laboratories using medium- and high-throughput techniques. Results We have developed iLAP (Laboratory data management, Analysis, and Protocol development), a workflow-driven information management system specifically designed to create and manage experimental protocols, and to analyze and share laboratory data. The system combines experimental protocol development, wizard-based data acquisition, and high-throughput data analysis into a single, integrated system. We demonstrate the power and the flexibility of the platform using a microscopy case study based on a combinatorial multiple fluorescence in situ hybridization (m-FISH) protocol and 3D-image reconstruction. iLAP is freely available under the open source license AGPL from http://genome.tugraz.at/iLAP/. Conclusion iLAP is a flexible and versatile information management system, which has the potential to close the gap between electronic notebooks and LIMS and can therefore be of great value for a broad scientific community.

  12. BEAM: A computational workflow system for managing and modeling material characterization data in HPC environments

    Energy Technology Data Exchange (ETDEWEB)

    Lingerfelt, Eric J [ORNL; Endeve, Eirik [ORNL; Ovchinnikov, Oleg S [ORNL; Borreguero Calvo, Jose M [ORNL; Park, Byung H [ORNL; Archibald, Richard K [ORNL; Symons, Christopher T [ORNL; Kalinin, Sergei V [ORNL; Messer, Bronson [ORNL; Shankar, Mallikarjun [ORNL; Jesse, Stephen [ORNL

    2016-01-01

    Improvements in scientific instrumentation allow imaging at mesoscopic to atomic length scales in many spectroscopic modes, and with the rise of multimodal acquisition systems and the associated processing capability, the era of multidimensional, informationally dense data sets has arrived. Technical issues in these combinatorial scientific fields are exacerbated by computational challenges best summarized as a necessity for drastic improvement in the capability to transfer, store, and analyze large volumes of data. The Bellerophon Environment for Analysis of Materials (BEAM) platform provides material scientists the capability to directly leverage the integrated computational and analytical power of High Performance Computing (HPC) to perform scalable data analysis and simulation via an intuitive, cross-platform client user interface. This framework delivers authenticated, push-button execution of complex user workflows that deploy data analysis algorithms and computational simulations utilizing the converged compute-and-data infrastructure at Oak Ridge National Laboratory's (ORNL) Compute and Data Environment for Science (CADES) and HPC environments like Titan at the Oak Ridge Leadership Computing Facility (OLCF). In this work we address the underlying HPC needs for characterization in the materials science community, elaborate how BEAM's design and infrastructure tackle those needs, and present a small subset of use cases where scientists utilized BEAM across a broad range of analytical techniques and analysis modes.

  13. College educational administration management system based on workflow technology

    Institute of Scientific and Technical Information of China (English)

    张昊

    2011-01-01

    Workflow technology can separate an information management system into function modules and process management, modeling and controlling each separately so that processing is automated and people and applications coordinate their work, letting the right person or software perform the right task at the right time. This paper analyzes the work of college educational administration. In a Web development environment, it introduces workflow technology into a college educational administration management system and designs a Petri-net-based workflow modeling method and rules. The process of building a workflow model is analyzed in detail using the example of approving a student's leaving school, and a college educational administration management system based on the B/S architecture is designed. The resulting design is flexible and easy to maintain, conveniently handles dynamic workflow modeling in educational administration business processes, improves the efficiency of system management, and helps modernize and network college educational administration.
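The Petri-net modeling approach can be sketched minimally as a place/transition net. The places and transitions below are a hypothetical rendering of a leave-school approval process, not the paper's actual model:

```python
class PetriNet:
    """Minimal place/transition net: a transition is enabled when each of
    its input places holds a token; firing moves tokens along."""
    def __init__(self, transitions, marking):
        self.transitions = transitions   # {name: (input_places, output_places)}
        self.marking = dict(marking)     # {place: token count}

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) > 0 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise ValueError(name + " is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Hypothetical leave-school approval: submission, department approval,
# registrar clearance.
net = PetriNet(
    {"submit": (["start"], ["dept_review"]),
     "dept_approve": (["dept_review"], ["registrar"]),
     "registrar_clear": (["registrar"], ["done"])},
    {"start": 1})
for step in ["submit", "dept_approve", "registrar_clear"]:
    net.fire(step)
```

Because enabling is computed from the marking at run time, the same engine can load a different transition table without code changes, which is the dynamic-modeling property the abstract emphasizes.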

  14. Research of an SOA-Based BPO Workflow Model and Its Practical Application

    Institute of Scientific and Technical Information of China (English)

    秦凤梅; 张桂华; 邱玉辉

    2013-01-01

    With the development of enterprise information technology, enterprises pay increasing attention to their core business and outsource non-core business to specialized companies; business process outsourcing (BPO) has emerged as a result. This article introduces BPO workflow, analyzes the structural characteristics of current BPO workflows and, based on service-oriented architecture (SOA), describes the methods and steps of encapsulating BPO business logic as services using a service-to-workflow mapping model. A BPO complaint-handling workflow management system model is then designed on an SOA architecture, characterized by loose coupling, extensibility, reusability and ease of maintenance.

  15. Workflow Security Access Model Based on Dynamic Control Mechanism

    Institute of Scientific and Technical Information of China (English)

    巫茜; 周庆

    2012-01-01

    To ensure that a workflow system works securely and reliably, this paper introduces three relation elements, Target Case (TC), User Management (UM) and Target (T), into the conventional role-based access control model, designs a dynamic authorization mechanism, and builds a workflow security access model based on dynamic control, whose safe operation is guaranteed by basic constraint relations and dynamic constraint conditions. The model is integrated with a workflow engine component to provide security authorization services for applications in independent security domains. Application results show that the model separates dynamic and reciprocal responsibilities well, binds dynamic responsibilities, and provides technical support for secure access to system workflows.
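The combination of static role permissions with dynamic constraints can be illustrated with a small sketch. This is not the paper's model; the rule, roles and case names are hypothetical, with a separation-of-duty constraint standing in for the dynamic conditions:

```python
def authorize(user, task, target, roles, role_tasks, dynamic_rules):
    """Grant access when some role of `user` statically permits `task`
    and every dynamic constraint also allows it on this target."""
    static_ok = any(task in role_tasks.get(r, ()) for r in roles.get(user, ()))
    return static_ok and all(rule(user, task, target) for rule in dynamic_rules)

# Hypothetical separation-of-duty constraint: nobody may approve a case
# they themselves submitted.
history = {("alice", "submit", "case1")}
def separation_of_duty(user, task, target):
    return not (task == "approve" and (user, "submit", target) in history)

roles = {"alice": ["clerk", "manager"], "bob": ["manager"]}
role_tasks = {"clerk": {"submit"}, "manager": {"approve"}}
```

Here Alice holds the manager role, so the static check alone would let her approve case1; the dynamic rule vetoes it because she submitted that case, while Bob's approval goes through.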

  16. A Comparison of Using Taverna and BPEL in Building Scientific Workflows: the case of caGrid.

    Science.gov (United States)

    Tan, Wei; Missier, Paolo; Foster, Ian; Madduri, Ravi; Goble, Carole

    2010-06-25

    With the emergence of "service oriented science," the need arises to orchestrate multiple services to facilitate scientific investigation-that is, to create "science workflows." We present here our findings in providing a workflow solution for the caGrid service-based grid infrastructure. We choose BPEL and Taverna as candidates, and compare their usability in the lifecycle of a scientific workflow, including workflow composition, execution, and result analysis. Our experience shows that BPEL as an imperative language offers a comprehensive set of modeling primitives for workflows of all flavors; while Taverna offers a dataflow model and a more compact set of primitives that facilitates dataflow modeling and pipelined execution. We hope that this comparison study not only helps researchers select a language or tool that meets their specific needs, but also offers some insight on how a workflow language and tool can fulfill the requirement of the scientific community.

  17. Development of Vacuum Vessel Design and Analysis Module for CFETR Integration Design Platform

    Directory of Open Access Journals (Sweden)

    Chen Zhu

    2016-01-01

    Full Text Available An integration design platform is under development for the design of the China Fusion Engineering Test Reactor (CFETR. It mainly includes the integration physical design platform and the integration engineering design platform. The integration engineering design platform aims at performing detailed engineering design for each tokamak component (e.g., breeding blanket, divertor, and vacuum vessel. The vacuum vessel design and analysis module is a part of the integration engineering design platform. The main idea of this module is to integrate the popular CAD/CAE software to form a consistent development environment. Specifically, the software OPTIMUS provides the approach to integrate the CAD/CAE software such as CATIA and ANSYS and form a design/analysis workflow for the vacuum vessel module. This design/analysis workflow could automate the process of modeling and finite element (FE analysis for vacuum vessel. Functions such as sensitivity analysis and optimization of geometric parameters have been provided based on the design/analysis workflow. In addition, data from the model and FE analysis could be easily exchanged among different modules by providing a unifying data structure to maintain the consistency of the global design. This paper describes the strategy and methodology of the workflow in the vacuum vessel module. An example is given as a test of the workflow and functions of the vacuum vessel module. The results indicate that the module is a feasible framework for future application.

  18. Assessing color reproduction tolerances in commercial print workflow

    Science.gov (United States)

    Beretta, Giordano B.; Hoarau, Eric; Kothari, Sunil; Lin, I.-Jong; Zeng, Jun

    2012-01-01

    Except for linear devices like CRTs, color transformations from colorimetric specifications to device coordinates are mostly obtained by measuring a set of samples, inverting the table, and looking up values in the table (including interpolation), and mapping the gamut from input to output device. The accuracy of a transformation is determined by reproducing a second set of samples and measuring the reproduction errors. Accuracy as the average predicted perceptual error is then used as a metric for quality. Accuracy and precision are important metrics in commercial print because a print service provider can charge a higher price for more accurate color, or can widen his tolerances when customers prefer cheap prints. The disadvantage of determining tolerances through averaging perceptual errors is that the colors in the sample sets are independent and this is not necessarily a good correlate of print quality as determined through psychophysics studies. Indeed, images consist of color palettes and the main quality factor is not color fidelity but color integrity. For example, if the divergence of the field of error vectors is zero, color constancy is likely to take over and humans will perceive the color reproduction as being of good quality, even if the average error is relatively large. However, if the errors are small but in random directions, the perceived image quality is poor because the relation among colors is altered. We propose a standard practice to determine tolerance based on the Farnsworth-Munsell 100-hue test (FM-100) for the second set and to evaluate the color transpositions-a metric for color integrity-instead of the color differences. The quality metric is then the FM-100 score. There are industry standards for the tolerances of color judges, and the same tolerances and classification can be use for print workflows or its components (e.g., presses, proofers, displays). 
We generalize this practice to arbitrary perceptually uniform scales tailored to

  19. A workflow for the 3D visualization of meteorological data

    Science.gov (United States)

    Helbig, Carolin; Rink, Karsten

    2014-05-01

    In the future, climate change will strongly influence our environment and living conditions. To predict possible changes, climate models that include basic and process conditions have been developed, and large data sets are produced as a result of simulations. The combination of various variables of climate models with spatial data from different sources helps to identify correlations and to study key processes. For our case study we use results of the Weather Research and Forecasting (WRF) model for two regions at different scales that include various landscapes in Northern Central Europe and Baden-Württemberg. We visualize these simulation results in combination with observation data and geographic data, such as river networks, to evaluate processes and analyze whether the model represents the atmospheric system sufficiently. For this purpose, a continuous workflow that leads from the integration of heterogeneous raw data to visualization using open source software (e.g. OpenGeoSys Data Explorer, ParaView) is developed. These visualizations can be displayed on a desktop computer or in an interactive virtual reality environment. We established a concept that includes recommended 3D representations and a color scheme for the variables of the data based on existing guidelines and established traditions in the specific domain. To examine changes over time in observation and simulation data, we added the temporal dimension to the visualization. In a first step of the analysis, the visualizations are used to get an overview of the data and detect areas of interest such as regions of convection or wind turbulence. Then, subsets of data sets are extracted and the included variables can be examined in detail. An evaluation by experts from the domains of visualization and atmospheric sciences establishes whether they are self-explanatory and clearly arranged. These easy-to-understand visualizations of complex data sets are the basis for scientific communication. In addition, they have

  20. Seamless online science workflow development and collaboration using IDL and the ENVI Services Engine

    Science.gov (United States)

    Harris, A. T.; Ramachandran, R.; Maskey, M.

    2013-12-01

    The Exelis-developed IDL and ENVI software are ubiquitous tools in Earth science research environments. The IDL Workbench is used by the Earth science community for programming custom data analysis and visualization modules. ENVI is a software solution for processing and analyzing geospatial imagery that combines support for multiple Earth observation scientific data types (optical, thermal, multi-spectral, hyperspectral, SAR, LiDAR) with advanced image processing and analysis algorithms. The ENVI & IDL Services Engine (ESE) is an Earth science data processing engine that allows researchers to use open standards to rapidly create, publish and deploy advanced Earth science data analytics within any existing enterprise infrastructure. Although powerful in many ways, the tools lack collaborative features out of the box. Thus, as part of the NASA funded project, Collaborative Workbench to Accelerate Science Algorithm Development, researchers at the University of Alabama in Huntsville and Exelis have developed plugins that allow seamless research collaboration from within the IDL workbench. Such additional features are possible because the IDL workbench is built using the Eclipse Rich Client Platform (RCP). RCP applications allow custom plugins to be dropped in for extended functionality. Specific functionalities of the plugins include creating complex workflows based on IDL application source code, submitting workflows to be executed by ESE in the cloud, and sharing and cloning of workflows among collaborators. All these functionalities are available to scientists without leaving their IDL workbench. Because ESE can interoperate with any middleware, scientific programmers can readily string together IDL processing tasks (or tasks written in other languages like C++, Java or Python) to create complex workflows for deployment within their current enterprise architecture (e.g. ArcGIS Server, GeoServer, Apache ODE or SciFlo from JPL). 
Using the collaborative IDL

  1. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    Science.gov (United States)

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba

    2015-12-01

    The scientific discovery process can be advanced by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components to these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.

  2. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    Science.gov (United States)

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    Due to the upcoming deluge of genome data, the need for storing and processing large-scale genome data, easy access to biomedical analysis tools, and efficient data sharing and retrieval has presented significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tools integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach.

  3. An Integrated Workflow For Secondary Use of Patient Data for Clinical Research.

    Science.gov (United States)

    Bouzillé, Guillaume; Sylvestre, Emmanuelle; Campillo-Gimenez, Boris; Renault, Eric; Ledieu, Thibault; Delamarre, Denis; Cuggia, Marc

    2015-01-01

    This work proposes an integrated workflow for secondary use of medical data to serve feasibility studies, and the prescreening and monitoring of research studies. All research issues are initially addressed by the Clinical Research Office through a research portal and subsequently redirected to relevant experts in the determined field of concentration. For secondary use of data, the workflow is then based on the clinical data warehouse of the hospital. A datamart with potentially eligible research candidates is constructed. Datamarts can produce aggregated data, de-identified data, or identified data, according to the kind of study being treated. In conclusion, integrating the secondary use of data process into a general research workflow allows visibility of information technologies and improves the accessibility of clinical data.

  4. Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows (Invited)

    Science.gov (United States)

    Wilson, B. D.; Ramachandran, R.; Lynnes, C.

    2009-12-01

    A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community an unprecedented opportunity to collaborate by sharing their workflows with one another, reproducing and analyzing research results, and leveraging colleagues’ expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions have come together to improve community collaboration in science analysis by developing a customizable “software appliance” to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like Finnish “talkoot” (a barn raising), to build a shared research space. Talkoot extends a freely available, open source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a “science story” in shareable web notebooks, including plots or animations, backed up by an executable workflow that directly reproduces the science analysis. New services and workflows of

  5. The EOS imaging system: Workflow and radiation dose in scoliosis examinations

    DEFF Research Database (Denmark)

    Mussmann, Bo; Torfing, Trine; Jespersen, Stig;

    Introduction: The EOS imaging system is a biplane slot-beam scanner capable of full-body scans at low radiation dose and without geometrical distortion. It was implemented in our department in early 2012 and all scoliosis examinations are now performed in EOS. The system offers improved possibility...... The purpose of the study was to evaluate workflow, defined as scheduled time per examination, and radiation dose in scoliosis examinations in EOS compared to conventional X-ray. Materials and Methods: The Dose Area Product (DAP) was measured with a dosimeter and a comparison between conventional X......-ray and EOS was made. The workflow in 2011 was compared to the workflow in 2013 with regard to the total number of examinations and the scheduled examination time for scoliosis examinations. Results: DAP for a scoliosis examination in conventional X-ray was 185 mGy*cm2 and 60.36 mGy*cm2 for EOS...

  6. Loosen Couple Workflow Mode of Lean Operator Improvement Based on Positive Feedback

    Directory of Open Access Journals (Sweden)

    Yao Li

    2013-04-01

    Full Text Available In order to strengthen the core competitiveness of telecom operators facing fine-grained market operation, this article compares the ECTA mode (Extension Case Transmission Mode) and the LCA mode (Loosen Couple Mode), both of which are promoted by the WfMC. By comparing these two modes, the situations each mode is suited to are determined. We also carry out an empirical analysis based on the mobile-phone customization arrangement between China Telecom and handset manufacturers, and expound the benefit that an agile, loosely coupled telecom workflow with positive feedback brings to telecom enterprises. Finally, on the basis of the positive feedback system, the task complexity and information transparency of the LCA mode are improved, so that the semantics of the public flow mode is kept unchanged and the sub-workflow is optimized when it is modified.

  7. Declarative Event-Based Workflow as Distributed Dynamic Condition Response Graphs

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2010-01-01

    We present Dynamic Condition Response Graphs (DCR Graphs) as a declarative, event-based process model inspired by the workflow language employed by our industrial partner and conservatively generalizing prime event structures. A dynamic condition response graph is a directed graph with nodes...... representing the events that can happen and arrows representing four relations between events: condition, response, include, and exclude. Distributed DCR Graphs are then obtained by assigning roles to events and principals. We give a graphical notation inspired by related work by van der Aalst et al. We...... exemplify the use of distributed DCR Graphs on a simple workflow taken from a field study at a Danish hospital, pointing out their flexibility compared to imperative workflow models. Finally we provide a mapping from DCR Graphs to Büchi automata....
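    The four relations named in the abstract have a small executable core: an event is enabled when it is included and all of its included conditions have happened, and executing it updates the pending responses and the included set. The sketch below is a simplified illustration of those semantics, not the authors' formal definition (it omits roles, principals and acceptance conditions).

    ```python
    # Minimal, simplified DCR Graph semantics: condition, response,
    # include, exclude. Illustrative only.
    class DCRGraph:
        def __init__(self, events, condition=(), response=(), include=(), exclude=()):
            self.events = set(events)
            self.condition = set(condition)   # (a, b): a must precede b
            self.response = set(response)     # (a, b): executing a makes b pending
            self.include = set(include)       # (a, b): executing a includes b
            self.exclude = set(exclude)       # (a, b): executing a excludes b
            self.executed, self.pending = set(), set()
            self.included = set(events)       # all events included initially

        def enabled(self, e):
            if e not in self.included:
                return False
            # every *included* condition of e must already have happened
            return all(a in self.executed or a not in self.included
                       for (a, b) in self.condition if b == e)

        def execute(self, e):
            assert self.enabled(e), f"{e} is not enabled"
            self.executed.add(e)
            self.pending.discard(e)
            self.pending |= {b for (a, b) in self.response if a == e}
            self.included |= {b for (a, b) in self.include if a == e}
            self.included -= {b for (a, b) in self.exclude if a == e}

    g = DCRGraph({"sign", "review"}, condition={("review", "sign")},
                 response={("sign", "review")})
    print(g.enabled("sign"))   # False: "review" is a condition for "sign"
    g.execute("review")
    print(g.enabled("sign"))   # True
    ```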

  8. New strategies for medical data mining, part 3: automated workflow analysis and optimization.

    Science.gov (United States)

    Reiner, Bruce

    2011-02-01

    The practice of evidence-based medicine calls for the creation of "best practice" guidelines, leading to improved clinical outcomes. One of the primary factors limiting evidence-based medicine in radiology today is the relative paucity of standardized databases. The creation of standardized medical imaging databases offers the potential to enhance radiologist workflow and diagnostic accuracy through objective data-driven analytics, which can be categorized in accordance with specific variables relating to the individual examination, patient, provider, and technology being used. In addition to this "global" database analysis, "individual" radiologist workflow can be analyzed through the integration of electronic auditing tools into the PACS. The combination of these individual and global analyses can ultimately identify best practice patterns, which can be adapted to the individual attributes of end users and ultimately used in the creation of automated evidence-based medicine workflow templates.

  9. Differentiated protection services with failure probability guarantee for workflow-based applications

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Jin, Yaohui; Sun, Weiqiang; Hu, Weisheng

    2010-12-01

    A cost-effective and service-differentiated provisioning strategy is very desirable to service providers so that they can offer users satisfactory services, while optimizing network resource allocation. Providing differentiated protection services to connections for surviving link failure has been extensively studied in recent years. However, differentiated protection services for workflow-based applications, which consist of many interdependent tasks, have scarcely been studied. This paper investigates the problem of providing differentiated services for workflow-based applications in an optical grid. In this paper, we develop three differentiated protection service provisioning strategies which can provide security-level guarantees and network-resource optimization for workflow-based applications. The simulation demonstrates that these heuristic algorithms provide protection cost-effectively while satisfying the applications' failure probability requirements.

  10. An Extended Policy Language for Role Resolution in Project-Oriented Workflow

    Institute of Scientific and Technical Information of China (English)

    张晓光; 曹健; 张申生; 牟玉洁

    2004-01-01

    HP defines an SQL-like language to specify organizational policies (or constraints) in workflow systems. Three types of policies were studied, including qualification, requirement and substitution policies, which cannot handle complex role resolution such as Separation of Roles and Binding of Roles, or several exception situations such as Role Delegation and Role Unavailable. From the perspective of project-oriented workflow, a project and its sub-projects can be under the charge of teams (or virtual teams). The teams should satisfy the role resolution of the projects managed by the team. To support the above requirements, based on a team-enabled organization model, this paper extends HP's policy language to support role resolution in project-oriented workflow, and provides its modeling and enforcement mechanism.

  11. Provenance-Based Debugging and Drill-Down in Data-Oriented Workflows

    KAUST Repository

    Ikeda, Robert

    2012-04-01

    Panda (for Provenance and Data) is a system that supports the creation and execution of data-oriented workflows, with automatic provenance generation and built-in provenance tracing operations. Workflows in Panda are arbitrary acyclic graphs containing both relational (SQL) processing nodes and opaque processing nodes programmed in Python. For both types of nodes, Panda generates logical provenance - provenance information stored at the processing-node level - and uses the generated provenance to support record-level backward tracing and forward tracing operations. In our demonstration we use Panda to integrate, process, and analyze actual education data from multiple sources. We specifically demonstrate how Panda's provenance generation and tracing capabilities can be very useful for workflow debugging, and for drilling down on specific results of interest. © 2012 IEEE.
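    Record-level backward tracing over node-level ("logical") provenance can be illustrated with a toy two-node workflow: each node records which input records explain each output record, and tracing walks the chain in reverse. The classes and data here are invented for illustration, not Panda's actual interfaces.

    ```python
    # Sketch of record-level backward tracing over logical provenance.
    class Node:
        """A processing node that can explain, for each output record,
        which of its input records produced it."""
        def __init__(self, func, explain):
            self.func = func          # maps an input list to an output list
            self.explain = explain    # maps one output record to a set of inputs

    def backward_trace(nodes, record):
        """Trace one final output record back to the workflow's inputs."""
        lineage = [record]
        for node in reversed(nodes):
            sources = set()
            for r in lineage:
                sources |= node.explain(r)
            lineage = sorted(sources)
        return lineage

    # Two opaque nodes: double each value, then keep even values.
    double = Node(lambda xs: [2 * x for x in xs], lambda r: {r // 2})
    keep_even = Node(lambda xs: [x for x in xs if x % 2 == 0],
                     lambda r: {r})  # a filter has identity provenance

    data = [1, 2, 3]
    out = keep_even.func(double.func(data))
    print(out)                                     # [2, 4, 6]
    print(backward_trace([double, keep_even], 4))  # [2]
    ```

    Forward tracing is the symmetric operation: starting from an input record and following each node's mapping toward the outputs.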

  12. Perti Net-Based Workflow Access Control Model%基于Perti网的工作流访问控制模型研究

    Institute of Scientific and Technical Information of China (English)

    陈卓; 骆婷; 石磊; 洪帆

    2004-01-01

    Access control is an important protection mechanism for information systems. This paper shows how to provide access control in workflow systems. We give a workflow access control model (WACM) based on several current access control models. The model supports role assignment and dynamic authorization. The paper defines the workflow using Petri nets. It first gives the definition and description of the workflow, and then analyzes the architecture of the workflow access control model (WACM). Finally, an example of an e-commerce workflow access control model is discussed in detail.
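    The two features the abstract names, role assignment and dynamic authorization, can be sketched as a static role check plus a history-dependent constraint (a separation-of-duty rule is used here as the dynamic example). This is an illustrative simplification, not the authors' Petri-net formalization.

    ```python
    # Illustrative role-based authorization for workflow tasks with one
    # dynamic (history-dependent) rule.
    class WorkflowACL:
        def __init__(self):
            self.task_roles = {}   # task -> roles allowed to perform it
            self.user_roles = {}   # user -> roles currently held
            self.history = []      # executed (user, task) pairs

        def allow(self, task, *roles):
            self.task_roles.setdefault(task, set()).update(roles)

        def assign(self, user, *roles):
            self.user_roles.setdefault(user, set()).update(roles)

        def authorize(self, user, task):
            # static check: the user holds a role permitted for the task
            if not (self.user_roles.get(user, set())
                    & self.task_roles.get(task, set())):
                return False
            # dynamic check (separation of duty): whoever executed "order"
            # in this case may not also execute "approve"
            if task == "approve" and (user, "order") in self.history:
                return False
            return True

        def execute(self, user, task):
            if self.authorize(user, task):
                self.history.append((user, task))
                return True
            return False

    acl = WorkflowACL()
    acl.allow("order", "clerk")
    acl.allow("approve", "manager")
    acl.assign("alice", "clerk", "manager")
    print(acl.execute("alice", "order"))    # True
    print(acl.execute("alice", "approve"))  # False: separation of duty
    ```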

  13. SegMine workflows for semantic microarray data analysis in Orange4WS

    Directory of Open Access Journals (Sweden)

    Kulovesi Kimmo

    2011-10-01

    Full Text Available Abstract Background In experimental data analysis, bioinformatics researchers increasingly rely on tools that enable the composition and reuse of scientific workflows. The utility of current bioinformatics workflow environments can be significantly increased by offering advanced data mining services as workflow components. Such services can support, for instance, knowledge discovery from diverse distributed data and knowledge sources (such as GO, KEGG, PubMed, and experimental databases). Specifically, cutting-edge data analysis approaches, such as semantic data mining, link discovery, and visualization, have not yet been made available to researchers investigating complex biological datasets. Results We present a new methodology, SegMine, for semantic analysis of microarray data by exploiting general biological knowledge, and a new workflow environment, Orange4WS, with integrated support for web services in which the SegMine methodology is implemented. The SegMine methodology consists of two main steps. First, the semantic subgroup discovery algorithm is used to construct elaborate rules that identify enriched gene sets. Then, a link discovery service is used for the creation and visualization of new biological hypotheses. The utility of SegMine, implemented as a set of workflows in Orange4WS, is demonstrated in two microarray data analysis applications. In the analysis of senescence in human stem cells, the use of SegMine resulted in three novel research hypotheses that could improve understanding of the underlying mechanisms of senescence and identification of candidate marker genes. Conclusions Compared to the available data analysis systems, SegMine offers improved hypothesis generation and data interpretation for bioinformatics in an easy-to-use integrated workflow environment.

  14. Creating OGC Web Processing Service workflows using a web-based editor

    Science.gov (United States)

    de Jesus, J.; Walker, P.; Grant, M.

    2012-04-01

    The OGC WPS (Web Processing Service) specifies how geospatial algorithms may be accessed in an SOA (Service Oriented Architecture). Service providers can encode both simple and sophisticated algorithms as WPS processes and publish them as web services. These services are not only useful individually but may be built into complex processing chains (workflows) that can solve complex data analysis and/or scientific problems. The NETMAR project has extended the Web Processing Service (WPS) framework to provide transparent integration between it and the commonly used WSDL (Web Service Description Language) that describes the web services and its default SOAP (Simple Object Access Protocol) binding. The extensions allow WPS services to be orchestrated using commonly used tools (in this case Taverna Workbench, but BPEL based systems would also be an option). We have also developed a WebGUI service editor, based on HTML5 and the WireIt! Javascript API, that allows users to create these workflows using only a web browser. The editor is coded entirely in Javascript and performs all XSLT transformations needed to produce a Taverna compatible (T2FLOW) workflow description which can be exported and run on a local Taverna Workbench or uploaded to a web-based orchestration server and run there. Here we present the NETMAR WebGUI service chain editor and discuss the problems associated with the development of a WebGUI for scientific workflow editing; content transformation into the Taverna orchestration language (T2FLOW/SCUFL); final orchestration in the Taverna engine and how to deal with the large volumes of data being transferred between different WPS services (possibly running on different servers) during workflow orchestration. We will also demonstrate using the WebGUI for creating a simple workflow making use of published web processing services, showing how simple services may be chained together to produce outputs that would previously have required a GIS (Geographic

  15. Implementation of workflow engine technology to deliver basic clinical decision support functionality

    Directory of Open Access Journals (Sweden)

    Oberg Ryan

    2011-04-01

    Full Text Available Abstract Background Workflow engine technology represents a new class of software with the ability to graphically model step-based knowledge. We present application of this novel technology to the domain of clinical decision support. Successful implementation of decision support within an electronic health record (EHR) remains an unsolved research challenge. Previous research efforts were mostly based on healthcare-specific representation standards and execution engines and did not reach wide adoption. We focus on two challenges in decision support systems: the ability to test decision logic on retrospective data prior to prospective deployment and the challenge of user-friendly representation of clinical logic. Results We present our implementation of a workflow engine technology that addresses the two above-described challenges in delivering clinical decision support. Our system is based on a cross-industry standard, the XML (extensible markup language) process definition language (XPDL). The core components of the system are a workflow editor for modeling clinical scenarios and a workflow engine for execution of those scenarios. We demonstrate, with an open-source and publicly available workflow suite, that clinical decision support logic can be executed on retrospective data. The same flowchart-based representation can also function in a prospective mode where the system can be integrated with an EHR system and respond to real-time clinical events. We limit the scope of our implementation to decision support content generation (which can be EHR system vendor independent. We do not focus on supporting complex decision support content delivery mechanisms due to lack of standardization of EHR systems in this area. We present results of our evaluation of the flowchart-based graphical notation as well as architectural evaluation of our implementation using an established evaluation framework for clinical decision support architecture. Conclusions We
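    The retrospective-testing idea above amounts to replaying the same decision logic over historical records and collecting the alerts it would have fired. The snippet below sketches that with an invented rule format and toy data; a real engine would execute logic read from an XPDL definition instead.

    ```python
    # Replay flowchart-style decision rules over retrospective records and
    # collect the alerts that would have fired. Rule format and thresholds
    # are illustrative only.
    flowchart = [
        ("high_glucose", lambda p: p["glucose"] > 180, "alert: hyperglycemia"),
        ("low_glucose",  lambda p: p["glucose"] < 70,  "alert: hypoglycemia"),
    ]

    def run_on_records(records):
        """Apply every rule to every retrospective record."""
        fired = []
        for rec in records:
            for name, test, alert in flowchart:
                if test(rec):
                    fired.append((rec["id"], alert))
        return fired

    retrospective = [
        {"id": 1, "glucose": 200},
        {"id": 2, "glucose": 95},
        {"id": 3, "glucose": 60},
    ]
    print(run_on_records(retrospective))
    # [(1, 'alert: hyperglycemia'), (3, 'alert: hypoglycemia')]
    ```

    The same rule table could be evaluated prospectively, one incoming record at a time, which is the dual mode of operation the paper describes.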

  16. Magnetic resonance only workflow and validation of dose calculations for radiotherapy of prostate cancer

    DEFF Research Database (Denmark)

    Christiansen, Rasmus Lübeck; Jensen, Henrik R.; Brink, Carsten

    2017-01-01

    Background: Current state of the art radiotherapy planning of prostate cancer utilises magnetic resonance (MR) for soft tissue delineation and computed tomography (CT) to provide an electron density map for dose calculation. This dual scan workflow is prone to setup and registration error....... This study evaluates the feasibility of an MR-only workflow and the validity of dose calculation from an MR derived pseudo CT. Material and methods: Thirty prostate cancer patients were CT and MR scanned. Clinical treatment plans were generated on CT using a single 18 MV arc volumetric modulated arc therapy...

  17. Computational workflow for analysis of gain and loss of genes in distantly related genomes

    Directory of Open Access Journals (Sweden)

    Ptitsyn Andrey

    2012-09-01

    Full Text Available Abstract Background Early evolution of animals led to profound changes in body plan organization, symmetry and the rise of tissue complexity including formation of muscular and nervous systems. This process was associated with massive restructuring of animal genomes as well as deletion, acquisition and rapid differentiation of genes from a common metazoan ancestor. Here, we present a simple but efficient workflow for elucidation of gene gain and gene loss within major branches of the animal kingdom. Methods We have designed a pipeline of sequence comparison, clustering and functional annotation using 12 major phyla as illustrative examples. Specifically, for the input we used sets of ab initio predicted gene models from the genomes of six bilaterians, three basal metazoans (Cnidaria, Placozoa, Porifera), two unicellular eukaryotes (Monosiga and Capsospora) and the green plant Arabidopsis as an out-group. Due to the large amounts of data the software required a high-performance Linux cluster. The final results can be imported into standard spreadsheet analysis software and queried for the numbers and specific sets of genes absent in specific genomes, uniquely present or shared among different taxons. Results and conclusions The developed software is open source and available free of charge under Open Source principles. It allows the user to address a number of specific questions regarding gene gain and gene loss in particular genomes, and user-defined groups of genomes can be formulated in a type of logical expression. For example, our analysis of 12 sequenced genomes indicated that these genomes possess at least 90,000 unique genes and gene families, suggesting enormous diversity of the genome repertoire in the animal kingdom. Approximately 9% of these gene families are shared universally (homologous) among all genomes, 53% are unique to specific taxa, and the rest are shared between two or more distantly related genomes.
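    The final querying step the abstract describes (families shared by all genomes, unique to one, or shared by a subset) reduces to set operations over gene-family membership per genome. The toy data below is invented; the real input would be the clustered gene models from the pipeline.

    ```python
    # Classify gene families as universal, unique to one genome, or shared
    # by a subset of genomes. Toy membership data for illustration.
    genomes = {
        "human":    {"hox", "wnt", "p53", "actin"},
        "cnidaria": {"wnt", "p53", "actin"},
        "plant":    {"actin", "rubisco"},
    }

    families = set().union(*genomes.values())
    universal = set.intersection(*genomes.values())
    unique = {f for f in families
              if sum(f in g for g in genomes.values()) == 1}
    shared_some = families - universal - unique  # in >1 but not all genomes

    print(sorted(universal))    # ['actin']
    print(sorted(unique))       # ['hox', 'rubisco']
    print(sorted(shared_some))  # ['p53', 'wnt']
    ```

    User-defined genome groups, mentioned in the abstract, would simply intersect and subtract different selections of the `genomes` values.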

  18. TU-A-304-01: Introduction and Workflow of Image-Guided SBRT

    Energy Technology Data Exchange (ETDEWEB)

    Salter, B. [University of Utah Huntsman Cancer Institute (United States)

    2015-06-15

    Increased use of SBRT and hypofractionation in radiation oncology practice has posed a number of challenges to medical physicists, ranging from planning, image-guided patient setup and on-treatment monitoring, to quality assurance (QA) and dose delivery. This symposium is designed to provide updated knowledge necessary for the safe and efficient implementation of SBRT on various linac platforms, including the emerging digital linacs equipped with high dose rate FFF beams. Issues related to 4D CT, PET and MRI simulations, 3D/4D CBCT guided patient setup, real-time image guidance during SBRT dose delivery using gated/un-gated VMAT or IMRT, and technical advancements in QA of SBRT (in particular, strategies dealing with high dose rate FFF beams) will be addressed. The symposium will help the attendees to gain a comprehensive understanding of the SBRT workflow and facilitate their clinical implementation of state-of-the-art imaging and planning techniques. Learning Objectives: Present background knowledge of SBRT, describe essential requirements for safe implementation of SBRT, and discuss issues specific to SBRT treatment planning and QA. Update on the use of multi-dimensional (3D and 4D) and multi-modality (CT, beam-level X-ray imaging, pre- and on-treatment 3D/4D MRI, PET, robotic ultrasound, etc.) imaging for reliable guidance of SBRT. Provide a comprehensive overview of emerging digital linacs and summarize the key geometric and dosimetric features of the new generation of linacs for substantially improved SBRT. Discuss treatment planning and quality assurance issues specific to SBRT. Research grant from Varian Medical Systems.

  19. Application of intra-oral dental scanners in the digital workflow of implantology.

    Directory of Open Access Journals (Sweden)

    Wicher J van der Meer

    Full Text Available Intra-oral scanners will play a central role in digital dentistry in the near future. In this study the accuracy of three intra-oral scanners was compared. A master model made of stone was fitted with three high-precision manufactured PEEK cylinders and scanned with three intra-oral scanners: the CEREC (Sirona), the iTero (Cadent) and the Lava COS (3M). In software the digital files were imported and the distance between the centres of the cylinders and the angulation between the cylinders were assessed. These values were compared to the measurements made on a high-accuracy 3D scan of the master model. The distance errors were the smallest and most consistent for the Lava COS. The distance errors for the CEREC were the largest and least consistent. All the angulation errors were small. The Lava COS in combination with a high-accuracy scanning protocol resulted in the smallest and most consistent errors of all three scanners tested when considering mean distance errors in full-arch impressions, both in absolute values and in consistency, for both measured distances. For the mean angulation errors, the Lava COS had the smallest errors between cylinders 1-2 and the largest errors between cylinders 1-3, although the absolute difference with the smallest mean value (iTero) was very small (0.0529°). An expected increase in distance and/or angular errors over the length of the arch, due to an accumulation of registration errors of the patched 3D surfaces, could be observed in this study design, but the effects were statistically not significant. For making impressions of implant cases for digital workflows, the most accurate scanner with the scanning protocol that will ensure the most accurate digital impression should be used. In our study model that was the Lava COS with the high-accuracy scanning protocol.
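    The two error measures the study uses, distance between cylinder centres and angulation between cylinder axes, are straightforward to compute once each scan yields centre coordinates and axis vectors. The coordinates below are made up for illustration; they are not the study's measurements.

    ```python
    # Distance and angulation errors between a scan and a reference model.
    import math

    def distance(p, q):
        """Euclidean distance between two 3D points."""
        return math.dist(p, q)

    def angle_deg(u, v):
        """Angle between two axis vectors, in degrees."""
        dot = sum(a * b for a, b in zip(u, v))
        nu = math.sqrt(sum(a * a for a in u))
        nv = math.sqrt(sum(b * b for b in v))
        return math.degrees(math.acos(dot / (nu * nv)))

    # Reference scan vs intra-oral scan of cylinders 1 and 2 (toy values, mm)
    ref_c1, ref_c2 = (0.0, 0.0, 0.0), (20.0, 0.0, 0.0)
    scan_c1, scan_c2 = (0.0, 0.0, 0.0), (20.05, 0.1, 0.0)

    # distance error: how much the scanned centre-to-centre distance deviates
    dist_error = distance(scan_c1, scan_c2) - distance(ref_c1, ref_c2)
    # angulation error: angle between a reference axis and its scanned axis
    ang_error = angle_deg((0.0, 0.0, 1.0), (0.02, 0.0, 1.0))
    print(round(dist_error, 3))  # mm
    print(round(ang_error, 2))   # degrees
    ```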

  20. Characterizing Strain Variation in Engineered E. coli Using a Multi-Omics-Based Workflow

    DEFF Research Database (Denmark)

    Brunk, Elizabeth; George, Kevin W.; Alonso-Gutierrez, Jorge;

    2016-01-01

    . Application of this workflow identified the roles of candidate genes, pathways, and biochemical reactions in observed experimental phenomena and facilitated the construction of a mutant strain with improved productivity. The contributed workflow is available as an open-source tool in the form of iPython...... notebooks....

  1. The BEL information extraction workflow (BELIEF): evaluation in the BioCreative V BEL and IAT track

    Science.gov (United States)

    Madan, Sumit; Hodapp, Sven; Senger, Philipp; Ansari, Sam; Szostak, Justyna; Hoeng, Julia; Peitsch, Manuel; Fluck, Juliane

    2016-01-01

    Network-based approaches have become extremely important in systems biology to achieve a better understanding of biological mechanisms. For network representation, the Biological Expression Language (BEL) is well designed to collate findings from the scientific literature into biological network models. To facilitate encoding and biocuration of such findings in BEL, a BEL Information Extraction Workflow (BELIEF) was developed. BELIEF provides a web-based curation interface, the BELIEF Dashboard, that incorporates text mining techniques to support the biocurator in the generation of BEL networks. The underlying UIMA-based text mining pipeline (BELIEF Pipeline) uses several named entity recognition processes and relationship extraction methods to detect concepts and BEL relationships in literature. The BELIEF Dashboard allows easy curation of the automatically generated BEL statements and their context annotations. Resulting BEL statements and their context annotations can be syntactically and semantically verified to ensure consistency in the BEL network. In summary, the workflow supports experts in different stages of systems biology network building. Based on the BioCreative V BEL track evaluation, we show that the BELIEF Pipeline automatically extracts relationships with an F-score of 36.4% and fully correct statements can be obtained with an F-score of 30.8%. Participation in the BioCreative V Interactive task (IAT) track with BELIEF revealed a systems usability scale (SUS) of 67. Considering the complexity of the task for new users—learning BEL, working with a completely new interface, and performing complex curation—a score so close to the overall SUS average highlights the usability of BELIEF. Database URL: BELIEF is available at http://www.scaiview.com/belief/ PMID:27694210

  2. DiscopFlow: A new Tool for Discovering Organizational Structures and Interaction Protocols in WorkFlow

    CERN Document Server

    Abdelkafi, Mahdi; Gargouri, Faiez

    2012-01-01

    This work deals with Workflow Mining (WM), a very active and promising research area. First, in this paper we give a critical and comparative study of three representative WM systems in this area: the ProM, InWolve and WorkflowMiner systems. The comparison is made according to quality criteria that we have defined, such as the capacity to filter and convert a workflow log, the capacity to discover workflow perspectives and the capacity to support multi-analysis of processes. The major drawback of these systems is that they cannot address the discovery of the organizational perspective. By organizational perspective we mean the organizational structures (federation, coalition, market or hierarchy) and interaction protocols (contract net, auction or vote). This paper defends the idea that the organizational dimension in Multi-Agent Systems is an appropriate approach to support the discovery of this organizational perspective. Second, the paper proposes a Workflow log meta-model which extends the classical one ...

  3. Integration of 3D photogrammetric outcrop models in the reservoir modelling workflow

    Science.gov (United States)

    Deschamps, Remy; Joseph, Philippe; Lerat, Olivier; Schmitz, Julien; Doligez, Brigitte; Jardin, Anne

    2014-05-01

    3D technologies are now widely used in geosciences to reconstruct outcrops in 3D. The technology used for 3D reconstruction is usually based on Lidar, which provides very precise models. Such datasets offer the possibility to build well-constrained outcrop analogue models for reservoir study purposes. Photogrammetry is an alternative methodology whose principle is to determine the geometric properties of an object from photographs taken from different angles. Outcrop data acquisition is easy, and this methodology allows constructing 3D outcrop models with many advantages such as: - light and fast acquisition, - moderate processing time (depending on the size of the area of interest), - integration of field data and 3D outcrops into the reservoir modelling tools. Whatever the method, the advantages of digital outcrop models are numerous, as already highlighted by Hodgetts (2013), McCaffrey et al. (2005) and Pringle et al. (2006): collection of data from otherwise inaccessible areas, access to different angles of view, an increase in the possible measurements, attribute analysis, a fast rate of data collection, and of course training and communication. This paper proposes a workflow where 3D geocellular models are built by integrating all sources of information from outcrops (surface picking, sedimentological sections, structural and sedimentary dips…). The 3D geomodels that are reconstructed can be used at the reservoir scale, in order to compare the outcrop information with subsurface models: the detailed facies models of the outcrops are transferred into petrophysical and acoustic models, which are used to test different scenarios of seismic and fluid flow modelling. The detailed 3D models are also used to test new techniques of static reservoir modelling, based either on geostatistical approaches or on deterministic (process-based) simulation techniques.
A modelling workflow has been designed to model reservoir geometries and properties from

  4. Refinement of arrival-time picks using a cross-correlation based workflow

    Science.gov (United States)

    Akram, Jubran; Eaton, David W.

    2016-12-01

    We propose a new iterative workflow based on cross-correlation for improved arrival-time picking on microseismic data. In this workflow, signal-to-noise ratio (S/N) and polarity-weighted stacking are used to minimize the effect of S/N and polarity fluctuations on the pilot waveform computation. We use an exhaustive search technique for polarity estimation through stack-power maximization. We use pseudo-synthetic and real microseismic data from western Canada in order to demonstrate the effectiveness of the proposed workflow relative to the Akaike information criterion (AIC) and a previously published cross-correlation based method. The pseudo-synthetic microseismic waveforms are obtained by introducing Gaussian noise and polarity fluctuations into waveforms from a high-S/N microseismic event. We find that the cross-correlation based approaches yield more accurate arrival-time picks than AIC for low-S/N waveforms. AIC is not affected by waveform polarities because it works on individual receiver levels, whereas the accuracy of the existing cross-correlation method decreases in spite of using envelope correlation. We show that our proposed workflow yields better and more consistent arrival-time picks regardless of waveform amplitude and polarity variations within the receiver array. After refinement, the initial arrival-time picks are located closer to the best estimated manual picks.
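
    The core refinement step — aligning each trace to a polarity-weighted stacked pilot waveform via cross-correlation — can be sketched as follows. This is a minimal illustration of the idea, not the authors' implementation: the stack here is weighted by polarity only (the paper also weights by S/N), and all function names are ours.

```python
def xcorr_lag(a, b):
    """Sample lag at which trace a best matches trace b (a ~ b shifted by lag)."""
    n = len(a)
    best_lag, best_val = 0, float("-inf")
    for lag in range(-(n - 1), n):
        # Cross-correlation value at this lag over the overlapping samples.
        v = sum(a[i] * b[i - lag] for i in range(max(0, lag), min(n, n + lag)))
        if v > best_val:
            best_lag, best_val = lag, v
    return best_lag

def refine_picks(traces, polarities, pilot_pick):
    """One iteration of pick refinement: build a polarity-weighted stack as
    the pilot waveform, then place each trace's pick at pilot_pick plus the
    trace's lag to the pilot (traces are polarity-corrected before
    correlating)."""
    n = min(len(t) for t in traces)
    pilot = [sum(p * t[i] for t, p in zip(traces, polarities)) for i in range(n)]
    return [pilot_pick + xcorr_lag([p * x for x in t[:n]], pilot)
            for t, p in zip(traces, polarities)]
```

    In a full workflow this step would be iterated, with S/N weights and polarity estimates updated between iterations.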

  5. Effects of Lean Work Organization and Industrialization on Workflow and Productive Time in Housing Renovation Projects

    NARCIS (Netherlands)

    Vrijhoef, Ruben

    2016-01-01

    This paper presents work aimed at improved organization and performance of production in housing renovation projects. The purpose is to explore and demonstrate the potential of lean work organization and industrialized product technology to improve workflow and productive time. The research included

  6. A system for deduction-based formal verification of workflow-oriented software models

    Directory of Open Access Journals (Sweden)

    Klimek Radosław

    2014-12-01

    Full Text Available The work concerns formal verification of workflow-oriented software models using the deductive approach. The formal correctness of a model’s behaviour is considered. Manually building logical specifications, which are regarded as a set of temporal logic formulas, seems to be a significant obstacle for an inexperienced user when applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages when compared with traditional deduction strategies. An algorithm for automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is a standard and dominant notation for the modeling of business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as a kind of (logical) primitives which enable the transformation of models into temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach has gone some way towards supporting, and hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.
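
    The pattern-to-formula idea — treating predefined workflow patterns as (logical) primitives that expand into temporal logic formulas — can be illustrated with a toy generator. The pattern set and LTL syntax below are simplified stand-ins, not the paper's actual definitions.

```python
def seq(a, b):
    """Sequence pattern: whenever activity a completes, b eventually follows."""
    return f"G({a} -> F({b}))"

def xor_split(c, a, b):
    """Exclusive choice after c: eventually exactly one branch runs."""
    return f"G({c} -> F(({a} & !{b}) | ({b} & !{a})))"

def specification(pattern_formulas):
    """A logical specification is the conjunction of all pattern formulas."""
    return " & ".join(pattern_formulas)
```

    A model made of such patterns is thus translated, pattern by pattern, into a conjunction of formulas that a semantic-tableaux prover can check.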

  7. An optimization algorithm for multipath parallel allocation for service resource in the simulation task workflow.

    Science.gov (United States)

    Wang, Zhiteng; Zhang, Hongjun; Zhang, Rui; Li, Yong; Zhang, Xuliang

    2014-01-01

    Service-oriented modeling and simulation are hot issues in the field of modeling and simulation, and there is a need to call service resources when a simulation task workflow is running. How to optimize the service resource allocation to ensure that the task is completed effectively is an important issue in this area. In the military modeling and simulation field, it is important to improve the probability of success and timeliness of a simulation task workflow. Therefore, this paper proposes an optimization algorithm for multipath parallel allocation of service resources, in which a multipath parallel service resource allocation model is built and a multiple-chain coding scheme quantum optimization algorithm is used for optimization and solution. The multiple-chain coding scheme extends the parallel search space to improve search efficiency. Through simulation experiments, this paper investigates the effect of different optimization algorithms, service allocation strategies, and path numbers on the probability of success of the simulation task workflow, and the results show that the proposed multipath parallel allocation algorithm is an effective method to improve the probability of success and timeliness of the simulation task workflow.
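
    The benefit of multipath parallel allocation can be made concrete with a simple probability model. Assuming independent paths (an assumption of this sketch, not necessarily of the paper), allocating a service request to several paths in parallel succeeds if any one path succeeds:

```python
def parallel_success_probability(path_probs):
    """Overall success probability when a request is allocated to several
    independent paths in parallel: 1 minus the chance that all paths fail."""
    fail = 1.0
    for p in path_probs:
        fail *= (1.0 - p)
    return 1.0 - fail
```

    With two paths of 0.9 reliability, the request succeeds with probability 0.99, which is why adding paths raises the workflow's probability of success (at the cost of consuming more service resources, the trade-off the optimization targets).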

  8. Asking for Permission: A Survey of Copyright Workflows for Institutional Repositories

    Science.gov (United States)

    Hanlon, Ann; Ramirez, Marisa

    2011-01-01

    An online survey of institutional repository (IR) managers identified copyright clearance trends in staffing and workflows. The majority of respondents followed a mediated deposit model, and reported that library personnel, instead of authors, engaged in copyright clearance activities for IRs. The most common "information gaps" pertained to the…

  9. Automation of lidar-based hydrologic feature extraction workflows using GIS

    Science.gov (United States)

    Borlongan, Noel Jerome B.; de la Cruz, Roel M.; Olfindo, Nestor T.; Perez, Anjillyn Mae C.

    2016-10-01

    With the advent of LiDAR technology, higher-resolution datasets have become available for use in different remote sensing and GIS applications. One significant application of LiDAR datasets in the Philippines is resource feature extraction. Feature extraction using LiDAR datasets requires complex and repetitive workflows which can take a lot of time for researchers through manual execution and supervision. The Development of the Philippine Hydrologic Dataset for Watersheds from LiDAR Surveys (PHD), a project under the Nationwide Detailed Resources Assessment Using LiDAR (Phil-LiDAR 2) program, created a set of scripts, the PHD Toolkit, to automate the processes and workflows necessary for hydrologic feature extraction, specifically Streams and Drainages, Irrigation Network, and Inland Wetlands, using LiDAR datasets. These scripts are written in Python and can be added to the ArcGIS® environment as a toolbox. The toolkit is currently being used as an aid for researchers in hydrologic feature extraction by simplifying the workflows, eliminating human errors when providing the inputs, and providing quick and easy-to-use tools for repetitive tasks. This paper discusses the actual implementation of the different workflows developed by Phil-LiDAR 2 Project 4 for Streams, Irrigation Network and Inland Wetlands extraction.

  10. High throughput workflow for coacervate formation and characterization in shampoo systems.

    Science.gov (United States)

    Kalantar, T H; Tucker, C J; Zalusky, A S; Boomgaard, T A; Wilson, B E; Ladika, M; Jordan, S L; Li, W K; Zhang, X; Goh, C G

    2007-01-01

    Cationic cellulosic polymers find wide utility as benefit agents in shampoo. Deposition of these polymers onto hair has been shown to mend split ends, improve appearance and wet combing, and provide controlled delivery of insoluble actives. The deposition is thought to be enhanced by the formation of a polymer/surfactant complex that phase-separates from the bulk solution upon dilution. A standard method has been developed to characterize coacervate formation upon dilution, but the test is prohibitive in time and materials. We have developed a semi-automated high-throughput workflow to characterize the coacervate-forming behavior of different shampoo formulations. A procedure that allows testing of real-use shampoo dilutions without first formulating a complete shampoo was identified. This procedure was adapted to a Tecan liquid handler by optimizing the parameters for liquid dispensing as well as for mixing. The high-throughput workflow enabled preparation and testing of hundreds of formulations with different types and levels of cationic cellulosic polymers and surfactants, and for each formulation a haze diagram was constructed. Optimal formulations, and the dilutions that give substantial coacervate formation (determined by haze measurements), were identified. Results from this high-throughput workflow were shown to reproduce standard haze and bench-top turbidity measurements, and the workflow has the advantages of using less material and allowing more variables to be tested with significant time savings.

  11. Additions to the Human Plasma Proteome via a Tandem MARS Depletion iTRAQ-Based Workflow

    Directory of Open Access Journals (Sweden)

    Zhiyun Cao

    2013-01-01

    Full Text Available Robust platforms for determining differentially expressed proteins in biomarker and discovery studies using human plasma are of great interest. While increased depth in proteome coverage is desirable, it is associated with costs of experimental time due to necessary sample fractionation. We evaluated a robust quantitative proteomics workflow for its ability (1) to provide increased depth in plasma proteome coverage and (2) to give statistical insight useful for establishing differentially expressed plasma proteins. The workflow involves dual-stage immunodepletion on a multiple affinity removal system (MARS) column, iTRAQ tagging, offline strong-cation exchange chromatography, and liquid chromatography tandem mass spectrometry (LC-MS/MS). Independent workflow experiments were performed in triplicate on four plasma samples tagged with iTRAQ 4-plex reagents. After stringent criteria were applied to database searched results, 689 proteins with at least two spectral counts (SC) were identified. Depth in proteome coverage was assessed by comparison to the 2010 Human Plasma Proteome Reference Database in which our studies reveal 399 additional proteins which have not been previously reported. Additionally, we report on the technical variation of this quantitative workflow which ranges from ±11 to 30%.

  12. Additions to the Human Plasma Proteome via a Tandem MARS Depletion iTRAQ-Based Workflow.

    Science.gov (United States)

    Cao, Zhiyun; Yende, Sachin; Kellum, John A; Robinson, Renã A S

    2013-01-01

    Robust platforms for determining differentially expressed proteins in biomarker and discovery studies using human plasma are of great interest. While increased depth in proteome coverage is desirable, it is associated with costs of experimental time due to necessary sample fractionation. We evaluated a robust quantitative proteomics workflow for its ability (1) to provide increased depth in plasma proteome coverage and (2) to give statistical insight useful for establishing differentially expressed plasma proteins. The workflow involves dual-stage immunodepletion on a multiple affinity removal system (MARS) column, iTRAQ tagging, offline strong-cation exchange chromatography, and liquid chromatography tandem mass spectrometry (LC-MS/MS). Independent workflow experiments were performed in triplicate on four plasma samples tagged with iTRAQ 4-plex reagents. After stringent criteria were applied to database searched results, 689 proteins with at least two spectral counts (SC) were identified. Depth in proteome coverage was assessed by comparison to the 2010 Human Plasma Proteome Reference Database in which our studies reveal 399 additional proteins which have not been previously reported. Additionally, we report on the technical variation of this quantitative workflow which ranges from ±11 to 30%.
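
    A technical-variation figure like the ±11-30% reported above is typically a coefficient of variation of a protein's quantities across the triplicate workflow runs. A minimal sketch, with illustrative numbers rather than the study's data:

```python
from statistics import mean, stdev

def technical_cv_percent(replicate_values):
    """Coefficient of variation (%) of one protein's reporter-ion
    quantities across independent workflow replicates (sample stdev)."""
    return 100.0 * stdev(replicate_values) / mean(replicate_values)
```

    Computing this per protein across the three independent workflow experiments yields the distribution of technical variation summarized in the abstract.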

  13. Towards an integrated workflow for structural reservoir model updating and history matching

    NARCIS (Netherlands)

    Leeuwenburgh, O.; Peters, E.; Wilschut, F.

    2011-01-01

    A history matching workflow, as typically used for updating of petrophysical reservoir model properties, is modified to include structural parameters including the top reservoir and several fault properties: position, slope, throw and transmissibility. A simple 2D synthetic oil reservoir produced by

  14. A data-independent acquisition workflow for qualitative screening of new psychoactive substances in biological samples.

    Science.gov (United States)

    Kinyua, Juliet; Negreira, Noelia; Ibáñez, María; Bijlsma, Lubertus; Hernández, Félix; Covaci, Adrian; van Nuijs, Alexander L N

    2015-11-01

    Identification of new psychoactive substances (NPS) is challenging. Developing targeted methods for their analysis can be difficult and costly due to their impermanence on the drug scene. Accurate-mass mass spectrometry (AMMS) using a quadrupole time-of-flight (QTOF) analyzer can be useful for wide-scope screening since it provides sensitive, full-spectrum MS data. Our article presents a qualitative screening workflow based on data-independent acquisition mode (all-ions MS/MS) on liquid chromatography (LC) coupled to QTOFMS for the detection and identification of NPS in biological matrices. The workflow combines fundamentals of target and suspect screening data-processing techniques in a structured algorithm, allowing the detection and tentative identification of NPS and their metabolites. We have applied the workflow to two actual case studies involving drug intoxications, in which we detected and confirmed the parent compounds ketamine, 25B-NBOMe, 25C-NBOMe, and several predicted phase I and II metabolites not previously reported in urine and serum samples. The screening workflow demonstrates added value for the detection and identification of NPS in biological matrices.
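
    The suspect-screening core of such a workflow — matching measured accurate masses against a suspect list within a ppm tolerance — can be sketched as below. Compound names and masses are hypothetical placeholders, and a real identification would add retention-time and fragment-ion confirmation from the all-ions MS/MS data.

```python
def ppm_error(measured, theoretical):
    """Mass error of a measurement in parts per million."""
    return 1e6 * (measured - theoretical) / theoretical

def screen(features, suspects, tol_ppm=5.0):
    """Flag detected m/z features whose accurate mass matches a suspect
    within tol_ppm. `suspects` maps names to theoretical masses; both are
    illustrative here."""
    hits = []
    for mz in features:
        for name, theo in suspects.items():
            if abs(ppm_error(mz, theo)) <= tol_ppm:
                hits.append((name, mz))
    return hits
```

    Tentative hits from this step would then be ranked and confirmed against predicted metabolite spectra, as the workflow describes.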

  15. Full 3-dimensional digital workflow for multicomponent dental appliances : A proof of concept

    NARCIS (Netherlands)

    van der Meer, W. Joerd; Vissink, Arjan; Ren, Yijin

    2016-01-01

    BACKGROUND: The authors used a 3-dimensional (3D) printer and a bending robot to produce a multicomponent dental appliance to assess whether 3D digital models of the dentition are applicable for a full digital workflow. METHODS: The authors scanned a volunteer's dentition with an intraoral scanner (

  16. An Optimization Algorithm for Multipath Parallel Allocation for Service Resource in the Simulation Task Workflow

    Directory of Open Access Journals (Sweden)

    Zhiteng Wang

    2014-01-01

    Full Text Available Service-oriented modeling and simulation are hot issues in the field of modeling and simulation, and there is a need to call service resources when a simulation task workflow is running. How to optimize the service resource allocation to ensure that the task is completed effectively is an important issue in this area. In the military modeling and simulation field, it is important to improve the probability of success and timeliness of a simulation task workflow. Therefore, this paper proposes an optimization algorithm for multipath parallel allocation of service resources, in which a multipath parallel service resource allocation model is built and a multiple-chain coding scheme quantum optimization algorithm is used for optimization and solution. The multiple-chain coding scheme extends the parallel search space to improve search efficiency. Through simulation experiments, this paper investigates the effect of different optimization algorithms, service allocation strategies, and path numbers on the probability of success of the simulation task workflow, and the results show that the proposed multipath parallel allocation algorithm is an effective method to improve the probability of success and timeliness of the simulation task workflow.

  17. Full 3-dimensional digital workflow for multicomponent dental appliances A proof of concept

    NARCIS (Netherlands)

    Meer, van der Joerd; Vissink, Arjan; Ren, Yijin

    2016-01-01

    Background. The authors used a 3-dimensional (3D) printer and a bending robot to produce a multicomponent dental appliance to assess whether 3D digital models of the dentition are applicable for a full digital workflow. Methods. The authors scanned a volunteer's dentition with an intraoral scanner (

  18. Biocuration workflows and text mining: overview of the BioCreative 2012 Workshop Track II.

    Science.gov (United States)

    Lu, Zhiyong; Hirschman, Lynette

    2012-01-01

    Manual curation of data from the biomedical literature is a rate-limiting factor for many expert curated databases. Despite the continuing advances in biomedical text mining and the pressing needs of biocurators for better tools, few existing text-mining tools have been successfully integrated into production literature curation systems such as those used by the expert curated databases. To close this gap and better understand all aspects of literature curation, we invited submissions of written descriptions of curation workflows from expert curated databases for the BioCreative 2012 Workshop Track II. We received seven qualified contributions, primarily from model organism databases. Based on these descriptions, we identified commonalities and differences across the workflows, the common ontologies and controlled vocabularies used and the current and desired uses of text mining for biocuration. Compared to a survey done in 2009, our 2012 results show that many more databases are now using text mining in parts of their curation workflows. In addition, the workshop participants identified text-mining aids for finding gene names and symbols (gene indexing), prioritization of documents for curation (document triage) and ontology concept assignment as those most desired by the biocurators. DATABASE URL: http://www.biocreative.org/tasks/bc-workshop-2012/workflow/.

  19. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology.

    Science.gov (United States)

    Cock, Peter J A; Grüning, Björn A; Paszkiewicz, Konrad; Pritchard, Leighton

    2013-01-01

    The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of "effector" proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen's predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu).

  20. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology

    Directory of Open Access Journals (Sweden)

    Peter J.A. Cock

    2013-09-01

    Full Text Available The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of “effector” proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen’s predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu).

  1. geoKepler Workflow Module for Computationally Scalable and Reproducible Geoprocessing and Modeling

    Science.gov (United States)

    Cowart, C.; Block, J.; Crawl, D.; Graham, J.; Gupta, A.; Nguyen, M.; de Callafon, R.; Smarr, L.; Altintas, I.

    2015-12-01

    The NSF-funded WIFIRE project has developed an open-source, online geospatial workflow platform for unifying geoprocessing tools and models for fire and other geospatially dependent modeling applications. It is a product of WIFIRE's objective to build an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. geoKepler includes a set of reusable GIS components, or actors, for the Kepler Scientific Workflow System (https://kepler-project.org). Actors exist for reading and writing GIS data in formats such as Shapefile, GeoJSON, KML, and using OGC web services such as WFS. The actors also allow for calling geoprocessing tools in other packages such as GDAL and GRASS. Kepler integrates functions from multiple platforms and file formats into one framework, thus enabling optimal GIS interoperability, model coupling, and scalability. Products of the GIS actors can be fed directly to models such as FARSITE and WRF. Kepler's ability to schedule and scale processes using Hadoop and Spark also makes geoprocessing ultimately extensible and computationally scalable. The reusable workflows in geoKepler can be made to run automatically when alerted by real-time environmental conditions. Here, we show breakthroughs in the speed of creating complex data for hazard assessments with this platform. We also demonstrate geoKepler workflows that use Data Assimilation to ingest real-time weather data into wildfire simulations, and data mining techniques to gain insight into environmental conditions affecting fire behavior. Existing machine learning tools and libraries such as R and MLlib are being leveraged for this purpose in Kepler, as well as Kepler's Distributed Data Parallel (DDP) capability to provide a framework for scalable processing.
geoKepler workflows can be executed via an iPython notebook as a part of a Jupyter hub at UC San Diego for sharing and reporting of the scientific analysis and results from
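
    A GIS "actor" in the sense above is essentially a reusable step that reads or writes geospatial data for downstream model steps. A minimal sketch of a GeoJSON-reading step (illustrative only; the actual geoKepler actors are Kepler workflow components, not this function):

```python
import json

def read_geojson_features(text):
    """Minimal GIS 'actor': parse a GeoJSON FeatureCollection and yield
    (geometry type, properties) pairs for downstream model steps."""
    doc = json.loads(text)
    return [(f["geometry"]["type"], f["properties"]) for f in doc["features"]]
```

    In a workflow system, the output of such a reader would be wired into subsequent actors (e.g., a geoprocessing or simulation step) rather than returned to the caller.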

  2. Making Sense of Complexity with FRE, a Scientific Workflow System for Climate Modeling (Invited)

    Science.gov (United States)

    Langenhorst, A. R.; Balaji, V.; Yakovlev, A.

    2010-12-01

    A workflow is a description of a sequence of activities that is both precise and comprehensive. Capturing the workflow of climate experiments provides a record which can be queried or compared, and allows reproducibility of the experiments - sometimes even to the bit level of the model output. This reproducibility helps to verify the integrity of the output data, and enables easy perturbation experiments. GFDL's Flexible Modeling System Runtime Environment (FRE) is a production-level software project which defines and implements building blocks of the workflow as command line tools. The scientific, numerical and technical input needed to complete the workflow of an experiment is recorded in an experiment description file in XML format. Several key features add convenience and automation to the FRE workflow:
    ● Experiment inheritance makes it possible to define a new experiment with only a reference to the parent experiment and the parameters to override.
    ● Testing is a basic element of the FRE workflow: experiments define short test runs which are verified before the main experiment is run, and a set of standard experiments are verified with new code releases.
    ● FRE is flexible enough to support short runs with mere megabytes of data, to high-resolution experiments that run on thousands of processors for months, producing terabytes of output data. Experiments run in segments of model time; after each segment, the state is saved and the model can be checkpointed at that level. Segment length is defined by the user, but the number of segments per system job is calculated to fit optimally in the batch scheduler requirements. FRE provides job control across multiple segments, and tools to monitor and alter the state of long-running experiments.
    ● Experiments are entered into a Curator Database, which stores query-able metadata about the experiment and the experiment's output.
    ● FRE includes a set of standardized post-processing functions as well as the ability
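
    Experiment inheritance — defining a new experiment by a parent reference plus parameter overrides — amounts to a recursive parameter merge. A minimal sketch with illustrative experiment and parameter names (FRE records this in XML; this is not its implementation):

```python
def resolve(experiments, name):
    """Resolve an experiment's parameters by walking its inheritance chain,
    applying child overrides on top of parent values."""
    exp = experiments[name]
    params = {}
    if exp.get("inherit"):
        params.update(resolve(experiments, exp["inherit"]))
    params.update(exp.get("params", {}))
    return params
```

    A perturbation experiment then only needs to state its parent and the handful of parameters it changes; everything else is inherited, which is what makes such experiments cheap to define and reproduce.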

  3. Optimizing CyberShake Seismic Hazard Workflows for Large HPC Resources

    Science.gov (United States)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2014-12-01

    The CyberShake computational platform is a well-integrated collection of scientific software and middleware that calculates 3D simulation-based probabilistic seismic hazard curves and hazard maps for the Los Angeles region. Currently each CyberShake model comprises about 235 million synthetic seismograms from about 415,000 rupture variations computed at 286 sites. CyberShake integrates large-scale parallel and high-throughput serial seismological research codes into a processing framework in which early stages produce files used as inputs by later stages. Scientific workflow tools are used to manage the jobs, data, and metadata. The Southern California Earthquake Center (SCEC) developed the CyberShake platform using USC High Performance Computing and Communications systems and open-science NSF resources. CyberShake calculations were migrated to the NSF Track 1 system NCSA Blue Waters when it became operational in 2013, via an interdisciplinary team approach including domain scientists, computer scientists, and middleware developers. Due to the excellent performance of Blue Waters and CyberShake software optimizations, we reduced the makespan (a measure of wallclock time-to-solution) of a CyberShake study from 1467 to 342 hours. We will describe the technical enhancements behind this improvement, including judicious introduction of new GPU software, improved scientific software components, increased workflow-based automation, and Blue Waters-specific workflow optimizations. Our CyberShake performance improvements highlight the benefits of scientific workflow tools. The CyberShake workflow software stack includes the Pegasus Workflow Management System (Pegasus-WMS, which includes Condor DAGMan), HTCondor, and Globus GRAM, with Pegasus-mpi-cluster managing the high-throughput tasks on the HPC resources. The workflow tools handle data management, automatically transferring about 13 TB back to SCEC storage. We will present performance metrics from the most recent Cyber

  4. A Strategy for Workflow Recovery Using Log Files

    Institute of Scientific and Technical Information of China (English)

    高军; 王海洋

    2000-01-01

    Traditionally, when the execution of a workflow is interrupted, compensation worktasks are used for workflow recovery in order to keep the database consistent. In this paper, we open a new, feasible way toward a system-supported recovery mechanism based on the database log file and the workflow log file.

  5. Toward Exascale Seismic Imaging: Taming Workflow and I/O Issues

    Science.gov (United States)

    Lefebvre, M. P.; Bozdag, E.; Lei, W.; Rusmanugroho, H.; Smith, J. A.; Tromp, J.; Yuan, Y.

    2013-12-01

    Providing a better understanding of the physics and chemistry of Earth's interior through numerical simulations has always required tremendous computational resources. Post-petascale supercomputers are now available to solve complex scientific problems that were thought unreachable a few decades ago. They also bring a cohort of concerns on how to obtain optimum performance. Several issues are currently being investigated by the HPC community. To name a few, we can list energy consumption, fault resilience, scalability of the current parallel paradigms, large workflow management, I/O performance and feature extraction with large datasets. For this presentation, we focus on the last three issues. In the context of seismic imaging, in particular for simulations based on adjoint methods, workflows are well defined. They consist of a few collective steps (e.g., mesh generation or model updates) and of a large number of independent steps (e.g., forward and adjoint simulations of each seismic event, pre- and postprocessing of seismic traces). The greater goal is to reduce the time to solution, that is, obtaining a more precise representation of the subsurface as fast as possible. This brings us to consider both the workflow in its entirety and the parts composing it. The usual approach is to speedup the purely computational parts by code tuning in order to reach higher FLOPS and better memory usage. This still remains an important concern, but larger scale experiments show that the imaging workflow suffers from a severe I/O bottleneck. This limitation occurs both for purely computational data and seismic time series. The latter are dealt with by the introduction of a new Adaptable Seismic Data Format (ASDF). In both cases, a parallel I/O library, ORNL's ADIOS, is used to drastically lessen the weight of disk access. Moreover, parallel visualization tools, such as VisIt, are able to take advantage of the metadata included in our ADIOS outputs to extract features and

  6. Cloud Computing Oriented Workflow Technology

    Institute of Scientific and Technical Information of China (English)

    柴学智; 曹健

    2012-01-01

    With the development of cloud computing, improving quality of service (QoS) while reducing operation cost has become a pressing goal. In this context, workflow technology is regarded as a promising solution: for cloud computing users, workflows provide abstract definition, flexible configuration, and automated execution of complex applications; for cloud service providers, workflows enable automatic task scheduling and the optimization and management of computing resources. This paper introduces workflow technology and cloud computing, describes the background from which cloud workflow emerged, analyzes the technical characteristics of cloud workflow and its similarities to and differences from other workflows (business workflow, grid workflow), and lists and compares four implementation cases of cloud workflow management systems. Finally, the paper concludes with a summary and an outlook on the future development of cloud workflow technology.

  7. Design

    DEFF Research Database (Denmark)

    Volf, Mette

    This publication is unique in its demystification and operationalization of the complex and elusive nature of the design process. The publication portrays the designer’s daily work and the creative process, which the designer is a part of. Apart from displaying the designer’s work methods and design parameters, the publication shows examples from renowned Danish design firms. Through these examples the reader gets an insight into the designer’s reality.

  8. Realization of Key Technologies for Enterprise Collaboration Office Automation System Based on Workflow

    Institute of Scientific and Technical Information of China (English)

    缪永; 周健; 陶亮

    2011-01-01

    Office Automation (OA) is a comprehensive technology built on advanced science and technology, information technology, systems science and behavioral science. It is a very active technology field with strong vitality in the new technology revolution, and a product of the information society. As the modern office environment grows more complex and its processes more refined, the demands on cooperation within office processes keep increasing. This paper first gives a brief introduction to workflow, and then studies some key technologies for a workflow-based enterprise collaborative office automation system, including workflow design, workflow engine design and system interface design.
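The abstract above names three pieces of such a system: workflow design, workflow engine design and system interface design. A minimal sketch of how the three relate, with all step and role names invented for illustration (the paper's actual design is not available here):

```python
# Hypothetical sketch of the three pieces named in the abstract.

# 1) Workflow design: ordered steps, each with the role allowed to act on it
STEPS = [
    ("draft", "author"),
    ("review", "manager"),
    ("approve", "director"),
    ("archived", None),       # terminal step, no further action
]

class WorkflowEngine:
    """2) Workflow engine: tracks the current step and enforces who may advance it."""
    def __init__(self):
        self.index = 0

    @property
    def state(self):
        return STEPS[self.index][0]

    def advance(self, role):
        expected = STEPS[self.index][1]
        if expected is None:
            raise RuntimeError("workflow already finished")
        if role != expected:
            raise PermissionError(f"step {self.state!r} requires role {expected!r}")
        self.index += 1
        return self.state

# 3) System interface: a single entry point other modules would call
def submit_action(engine, role):
    return engine.advance(role)

engine = WorkflowEngine()
submit_action(engine, "author")    # draft -> review
submit_action(engine, "manager")   # review -> approve
submit_action(engine, "director")  # approve -> archived
```

A production OA system would persist state and route tasks to users; the sketch only shows the separation of definition, engine and interface.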

  9. Automatically updating predictive modeling workflows support decision-making in drug design.

    Science.gov (United States)

    Muegge, Ingo; Bentzien, Jörg; Mukherjee, Prasenjit; Hughes, Robert O

    2016-09-01

    Using predictive models for early decision-making in drug discovery has become standard practice. We suggest that model building needs to be automated with minimum input and low technical maintenance requirements. Models perform best when tailored to answering specific compound optimization related questions. If qualitative answers are required, 2-bin classification models are preferred. Integrating predictive modeling results with structural information stimulates better decision making. For in silico models supporting rapid structure-activity relationship cycles, the performance deteriorates within weeks. Frequent automated updates of predictive models ensure best predictions. Consensus between multiple modeling approaches increases the prediction confidence. Combining qualified and nonqualified data optimally uses all available information. Dose predictions provide a holistic alternative to multiple individual property predictions for reaching complex decisions.
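Two of the ideas above, 2-bin classification and consensus between multiple modeling approaches, can be sketched in a few lines. The "models" below are stand-in threshold functions, not real QSAR models; the thresholds and the majority-vote rule are illustrative assumptions, not the authors' method.

```python
# Hedged sketch: 2-bin classification (1 = active, 0 = inactive) plus a
# consensus vote across several stand-in "models" to raise confidence.

def model_a(x):  # stand-in for one modeling approach
    return 1 if x > 0.5 else 0

def model_b(x):  # stand-in for a second approach
    return 1 if x > 0.4 else 0

def model_c(x):  # stand-in for a third approach
    return 1 if x > 0.6 else 0

def consensus(x, models):
    """Majority vote; returns (label, agreement fraction) as a confidence proxy."""
    votes = [m(x) for m in models]
    label = 1 if sum(votes) > len(votes) / 2 else 0
    agreement = votes.count(label) / len(votes)
    return label, agreement

label, confidence = consensus(0.55, [model_a, model_b, model_c])
# votes are 1, 1, 0 -> label 1 with 2/3 agreement
```

The agreement fraction is one simple way to quantify how much the approaches concur; when all models agree, a decision can be made with higher confidence, as the abstract suggests.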

  10. Technology in the Classroom: Transitioning Lab and Design to an All-Digital Workflow

    Science.gov (United States)

    Anastasio, Daniel; Suresh, Aravind; Burkey, Daniel D.

    2013-01-01

    Mobile platforms and cloud computing allow collaborative sharing and submission of work in ways not feasible until recently. In this article, we detail how we took a writing-intensive laboratory and made the submission, reading, grading, and returning of student work online and paper-free, while maintaining familiar elements of pen-and-paper…

  11. En bloc prefabrication of vascularized bioartificial bone grafts in sheep and complete workflow for custom-made transplants.

    Science.gov (United States)

    Kokemüller, H; Jehn, P; Spalthoff, S; Essig, H; Tavassol, F; Schumann, P; Andreae, A; Nolte, I; Jagodzinski, M; Gellrich, N-C

    2014-02-01

    The aim of this pilot study was to determine, in a new experimental model, whether complex bioartificial monoblocs of relevant size and stability can be prefabricated in a defined three-dimensional design, in which the latissimus dorsi muscle serves as a natural bioreactor and the thoracodorsal vessel tree is prepared for axial construct perfusion. Eighteen sheep were included in the study, with six animals in each of three experimental groups. Vitalization of the β-tricalcium phosphate-based constructs was performed by direct application of unmodified osteogenic material from the iliac crest (group A), in vivo application of nucleated cell concentrate (NCC) from bone marrow aspirate (group B), and in vitro cultivation of bone marrow stromal cells (BMSC) in a perfusion bioreactor system (group C). The contours of the constructs were designed digitally and transferred onto the bioartificial bone grafts using a titanium cage, which was bent over a stereolithographic model of the defined subvolume intraoperatively. At the end of the prefabrication process, only the axial vascularized constructs of group A demonstrated vital bone formation with considerable stability. In groups B and C, the applied techniques were not able to induce ectopic bone formation. The presented computer-assisted workflow allows the prefabrication of custom-made bioartificial transplants.

  12. An efficient field and laboratory workflow for plant phylotranscriptomic projects

    Science.gov (United States)

    Yang, Ya; Moore, Michael J.; Brockington, Samuel F.; Timoneda, Alfonso; Feng, Tao; Marx, Hannah E.; Walker, Joseph F.; Smith, Stephen A.

    2017-01-01

    Premise of the study: We describe a field and laboratory workflow developed for plant phylotranscriptomic projects that involves cryogenic tissue collection in the field, RNA extraction and quality control, and library preparation. We also make recommendations for sample curation. Methods and Results: A total of 216 frozen tissue samples of Caryophyllales and other angiosperm taxa were collected from the field or botanical gardens. RNA was extracted, stranded mRNA libraries were prepared, and libraries were sequenced on Illumina HiSeq platforms. These included difficult mucilaginous tissues such as those of Cactaceae and Droseraceae. Conclusions: Our workflow is not only cost effective (ca. $270 per sample, as of August 2016, from tissue to reads) and time efficient (less than 50 h for 10–12 samples including all laboratory work and sample curation), but also has proven robust for extraction of difficult samples such as tissues containing high levels of secondary compounds. PMID:28337391

  13. The Architecture of an Intuitive Scientific Workflow System for Spatial Planning

    Directory of Open Access Journals (Sweden)

    Tiberiu Florescu

    2018-06-01

    Full Text Available In recent years, the quantity of open data has multiplied dramatically. This newly found wealth of data has raised a number of issues for any research exercise within the field of spatial planning: underpowered traditional tools and methods, difficulties in tracking data, transparency issues, sharing difficulties and, above all, an erratic workflow. Some new research tools to counter this irksome tendency do exist at the moment, but, unfortunately, we feel that there is still ample room for improvement. We have therefore commenced the development of our own research instrument, based on the Scientific Workflow System concept. This paper lays the foundation for its architecture. Once completed, both the instrument and the resulting data shall be freely available, true to the spirit of the open source and open data tradition. We strongly believe that spatial planning professionals and researchers might find it interesting and worthwhile for increasing the quality and speed of their work.

  14. Medical image registration algorithms assessment: Bronze Standard application enactment on grids using the MOTEUR workflow engine

    CERN Document Server

    Glatard, T; Pennec, X

    2006-01-01

    Medical image registration is a pre-processing step needed for many medical image analysis procedures. A very large number of registration algorithms are available today, but their performance is often not known and is very difficult to assess due to the lack of a gold standard. The Bronze Standard algorithm is a very data- and compute-intensive statistical approach for quantifying the accuracy of registration algorithms. In this paper, we describe the Bronze Standard application and we discuss the need for grids to tackle such computations on medical image databases. We demonstrate MOTEUR, a service-based workflow engine optimized for dealing with data-intensive applications. MOTEUR eases the enactment of the Bronze Standard and similar applications on the EGEE production grid infrastructure. It is a generic workflow engine, based on current standards and freely available, that can be used to instrument legacy application code at low cost.

  15. Towards Dynamic Authentication in the Grid — Secure and Mobile Business Workflows Using GSet

    Science.gov (United States)

    Mangler, Jürgen; Schikuta, Erich; Witzany, Christoph; Jorns, Oliver; Ul Haq, Irfan; Wanek, Helmut

    Until now, the research community has mainly focused on the technical aspects of Grid computing and neglected commercial issues. Recently, however, the community has come to accept that the success of the Grid crucially depends on commercial exploitation. In our vision, Foster's and Kesselman's statement "The Grid is all about sharing." has to be extended by "... and making money out of it!". To realize this vision, the trustworthiness of the underlying technology needs to be ensured. This can be achieved by the use of gSET (Gridified Secure Electronic Transaction) as a basic technology for trust management and secure accounting in the presented Grid-based workflow. We present a framework, conceptually and technically, from the area of the Mobile-Grid, which justifies the Grid infrastructure as a viable platform to enable commercially successful business workflows.

  16. From Data to Knowledge to Discoveries: Artificial Intelligence and Scientific Workflows

    Directory of Open Access Journals (Sweden)

    Yolanda Gil

    2009-01-01

    Full Text Available Scientific computing has entered a new era of scale and sharing with the arrival of cyberinfrastructure facilities for computational experimentation. A key emerging concept is scientific workflows, which provide a declarative representation of complex scientific applications that can be automatically managed and executed in distributed shared resources. In the coming decades, computational experimentation will push the boundaries of current cyberinfrastructure in terms of inter-disciplinary scope and integrative models of scientific phenomena under study. This paper argues that knowledge-rich workflow environments will provide necessary capabilities for that vision by assisting scientists to validate and vet complex analysis processes and by automating important aspects of scientific exploration and discovery.

  17. MO-D-213-01: Workflow Monitoring for a High Volume Radiation Oncology Center

    Energy Technology Data Exchange (ETDEWEB)

    Laub, S [CDH Proton Center, Warrenville, IL (United States); Dunn, M [Proton Collaborative Group, Warrenville, Illinois (United States); Galbreath, G [ProCure Treatment Centers Inc., New York, NY (United States); Gans, S [Northwestern Memorial Chicago Proton Center, Warrenville, Illinois (United States); Pankuch, M [Northwestern Medicine Chicago Proton Center, Warrenville, IL (United States)

    2015-06-15

    Purpose: Implement a center-wide communication system that increases interdepartmental transparency and accountability while decreasing redundant work and treatment delays by actively monitoring treatment planning workflow. Methods: Intake Management System (IMS), a program developed by ProCure Treatment Centers Inc., is a multi-function database that stores treatment planning process information. It was devised to work with the oncology information system (Mosaiq) to streamline interdepartmental workflow. Each step in the treatment planning process is visually represented and timelines for completion of individual tasks are established within the software. The currently active step of each patient’s planning process is highlighted either red or green according to whether the initially allocated amount of time has passed for the given process. This information is displayed as a Treatment Planning Process Monitor (TPPM), which is shown on screens in the relevant departments throughout the center. This display also includes the individuals who are responsible for each task. IMS is driven by Mosaiq’s quality checklist (QCL) functionality. Each step in the workflow is initiated by a Mosaiq user sending the responsible party a QCL assignment. IMS is connected to Mosaiq and the sending or completing of a QCL updates the associated field in the TPPM to the appropriate status. Results: Approximately one patient a week is identified during the workflow process as needing to have his/her treatment start date modified or resources re-allocated to address the most urgent cases. Being able to identify a realistic timeline for planning each patient and having multiple departments communicate their limitations and time constraints allows for quality plans to be developed and implemented without overburdening any one department. Conclusion: Monitoring the progression of the treatment planning process has increased transparency between departments, which enables efficient
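The colour rule described above, a step shows green while it is within its allocated time and turns red once that time has passed, reduces to a single comparison. A minimal sketch, with step names and allocations invented for illustration (the real IMS/Mosaiq QCL integration is not modeled):

```python
# Illustrative sketch of the TPPM red/green rule: compare elapsed time
# against the step's initially allocated time.
from datetime import datetime, timedelta

def step_status(started: datetime, allocated: timedelta,
                now: datetime) -> str:
    """Return 'green' while the step is within its allocation, else 'red'."""
    return "green" if now - started <= allocated else "red"

# Hypothetical example: a contouring step allocated 4 hours
start = datetime(2015, 6, 1, 9, 0)
step_status(start, timedelta(hours=4), start + timedelta(hours=3))  # within time
step_status(start, timedelta(hours=4), start + timedelta(hours=5))  # overdue
```

In the described system this check would run per active step per patient, with the QCL completion event resetting the clock for the next step.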

  18. Enhancing population pharmacokinetic modeling efficiency and quality using an integrated workflow.

    Science.gov (United States)

    Schmidt, Henning; Radivojevic, Andrijana

    2014-08-01

    Population pharmacokinetic (popPK) analyses are at the core of Pharmacometrics and need to be performed regularly. Although these analyses are relatively standard, a large variability can be observed in both the time (efficiency) and the way they are performed (quality). Main reasons for this variability include the level of experience of a modeler, personal preferences and tools. This paper aims to examine how the process of popPK model building can be supported in order to increase its efficiency and quality. The presented approach to the conduct of popPK analyses is centered around three key components: (1) identification of the most common and important popPK model features, (2) required information content and formatting of the data for modeling, and (3) methodology, workflow and workflow-supporting tools. This approach has been used in several popPK modeling projects and a documented example is provided in the supplementary material. Efficiency of model building is improved by avoiding repetitive coding and other labor-intensive tasks and by putting the emphasis on a fit-for-purpose model. Quality is improved by ensuring that the workflow and tools are in alignment with a popPK modeling guidance established within an organization. The main conclusion of this paper is that workflow-based approaches to popPK modeling are feasible and have significant potential to ameliorate its various aspects. However, the implementation of such an approach in a pharmacometric organization requires openness towards innovation and change: the key ingredient for the evolution of integrative and quantitative drug development in the pharmaceutical industry.

  19. Osiris: accessible and reproducible phylogenetic and phylogenomic analyses within the Galaxy workflow management system

    OpenAIRE

    2014-01-01

    Background Phylogenetic tools and ‘tree-thinking’ approaches increasingly permeate all biological research. At the same time, phylogenetic data sets are expanding at breakneck pace, facilitated by increasingly economical sequencing technologies. Therefore, there is an urgent need for accessible, modular, and sharable tools for phylogenetic analysis. Results We developed a suite of wrappers for new and existing phylogenetics tools for the Galaxy workflow management system that we call Osiris. ...

  20. Multivariate Analysis of High Through-Put Adhesively Bonded Single Lap Joints: Experimental and Workflow Protocols

    Science.gov (United States)

    2016-06-01

    ARL-TR-7696 ● JUNE 2016. US Army Research Laboratory. Multivariate Analysis of High Through-Put Adhesively Bonded Single Lap Joints: Experimental and Workflow Protocols.