WorldWideScience

Sample records for cambafx workflow design

  1. CamBAfx: Workflow Design, Implementation and Application for Neuroimaging

    Science.gov (United States)

    Ooi, Cinly; Bullmore, Edward T.; Wink, Alle-Meije; Sendur, Levent; Barnes, Anna; Achard, Sophie; Aspden, John; Abbott, Sanja; Yue, Shigang; Kitzbichler, Manfred; Meunier, David; Maxim, Voichita; Salvador, Raymond; Henty, Julian; Tait, Roger; Subramaniam, Naresh; Suckling, John

    2009-01-01

    CamBAfx is a workflow application designed both for researchers who use workflows to process data (consumers) and for those who design them (designers). It provides a front-end (user interface) optimized for data processing and designed in a way familiar to consumers. The back-end uses a pipeline model to represent workflows, since this is a common and useful metaphor among designers and is easier to manipulate than other representations such as programming scripts. As an Eclipse Rich Client Platform application, CamBAfx's pipelines and functions can be bundled with the software or downloaded post-installation. The user interface contains all the workflow facilities expected by consumers. Using the Eclipse Extension Mechanism, designers are encouraged to customize CamBAfx for their own pipelines. CamBAfx wraps a workflow facility around neuroinformatics software without modification. CamBAfx's design, licensing and Eclipse Branding Mechanism allow it to be used as the user interface for other software, facilitating the exchange of innovative computational tools between originating labs. PMID:19826470
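
    The pipeline metaphor is easy to make concrete. Below is a minimal Python sketch of the back-end idea; all names are hypothetical, and it stands in for, rather than reproduces, CamBAfx's actual Eclipse/Java implementation, in which each stage wraps an external neuroimaging tool without modifying it.

        # Minimal sketch of the pipeline metaphor (hypothetical Python, not
        # CamBAfx's actual Eclipse/Java implementation): each stage wraps an
        # external tool unmodified; the pipeline chains inputs to outputs.
        from dataclasses import dataclass
        from typing import Callable, List

        @dataclass
        class Stage:
            name: str
            run: Callable[[str], str]  # maps an input path to an output path

        class Pipeline:
            def __init__(self, stages: List[Stage]):
                self.stages = stages

            def execute(self, input_path: str) -> str:
                data = input_path
                for stage in self.stages:  # stages run strictly in sequence
                    data = stage.run(data)
                return data

        # A consumer only picks a ready-made pipeline and supplies data;
        # a designer composes new Stage objects around existing tools.
        smooth = Stage("smooth", lambda p: p + ".smoothed")
        stats = Stage("stats", lambda p: p + ".stats")
        print(Pipeline([smooth, stats]).execute("subject01.nii"))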

  2. Designing Workflows on the Fly Using e-BioFlow

    NARCIS (Netherlands)

    Wassink, I.; Ooms, Matthijs; van der Vet, P.E.; Baresi, Luciano; Chi, Chihung; Suzuki, Jun

    Life scientists use workflow systems for service orchestration to design their computer-based experiments. These workflow systems require life scientists to design complete workflows before they can be run. Traditional workflow systems do not support the explorative research approach life scientists...

  3. Towards Automated Design, Analysis and Optimization of Declarative Curation Workflows

    Directory of Open Access Journals (Sweden)

    Tianhong Song

    2014-10-01

    Data curation is increasingly important. Our previous work on a Kepler curation package has demonstrated the advantages of automating data curation pipelines with workflow systems. However, manually designed curation workflows can be error-prone and inefficient due to a lack of user understanding of the workflow system, misuse of actors, or human error. Correcting problematic workflows is often very time-consuming. A more proactive workflow system can help users avoid such pitfalls. For example, static analysis before execution can detect potential problems in a workflow and help the user improve its design. In this paper, we propose a declarative workflow approach that supports semi-automated workflow design, analysis and optimization. We show how the workflow design engine helps users construct data curation workflows, how the workflow analysis engine detects different design problems, and how workflows can be optimized by exploiting parallelism.
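
    As a toy illustration of the static analysis idea (hypothetical Python, not the Kepler implementation described in the paper), the sketch below checks a declaratively specified workflow before execution and reports any actor whose required input is never produced by another actor:

        # Hypothetical sketch of static analysis before execution: flag any
        # actor whose required input is never produced by another actor.
        workflow = {
            "load":     {"needs": [],                  "produces": ["raw"]},
            "validate": {"needs": ["raw"],             "produces": ["clean"]},
            "report":   {"needs": ["clean", "schema"], "produces": ["summary"]},
        }

        def check(workflow):
            produced = {d for actor in workflow.values() for d in actor["produces"]}
            return [f"{name}: input '{need}' is never produced"
                    for name, actor in workflow.items()
                    for need in actor["needs"] if need not in produced]

        print(check(workflow))  # -> ["report: input 'schema' is never produced"]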

  4. Design Tools and Workflows for Braided Structures

    DEFF Research Database (Denmark)

    Vestartas, Petras; Heinrich, Mary Katherine; Zwierzycki, Mateusz

    2017-01-01

    This paper presents design workflows for the representation, analysis and fabrication of braided structures. The workflows employ a braid pattern and simulation method which extends the state-of-the-art in the following ways: by supporting the braid design of both pre-determined target shapes and exploratory, generative, or evolved designs; by incorporating material and fabrication constraints generalised for both hand and machine; by providing a greater degree of design agency and supporting real-time modification of braid topologies. The paper first introduces braid as a technique, stating... ...and merits of our method, demonstrated through four example design and analysis workflows. The workflows frame specific aspects of enquiry for the ongoing research project flora robotica. These include modelling target geometries, automatically producing instructions for fabrication, conducting structural...

  5. Design Tools and Workflows for Braided Structures

    DEFF Research Database (Denmark)

    Vestartas, Petras; Heinrich, Mary Katherine; Zwierzycki, Mateusz

    2017-01-01

    This paper presents design workflows for the representation, analysis and fabrication of braided structures. The workflows employ a braid pattern and simulation method which extends the state-of-the-art in the following ways: by supporting the braid design of both pre-determined target shapes and exploratory, generative, or evolved designs; by incorporating material and fabrication constraints generalised for both hand and machine; by providing a greater degree of design agency and supporting real-time modification of braid topologies. The paper first introduces braid as a technique, stating the objectives and motivation for our exploration of braid within an architectural context and highlighting both the relevance of braid and the current lack of suitable design modelling tools to support our approach. We briefly introduce the state-of-the-art in braid representation and present the characteristics...

  6. Towards an Intelligent Workflow Designer based on the Reuse of Workflow Patterns

    NARCIS (Netherlands)

    Iochpe, C.; Chiao, C.; Hess, G.; Nascimento, G.S.; Thom, L.H.; Reichert, M.U.

    2007-01-01

    Process-aware information systems require sophisticated methods and concepts for designing and modeling processes. Recently, research on workflow patterns has emerged in order to increase the reuse of recurring workflow structures. However, current workflow modeling tools do not...

  7. Design, Modelling and Analysis of a Workflow Reconfiguration

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Abouzaid, Faisal; Dragoni, Nicola

    2011-01-01

    This paper describes a case study involving the reconfiguration of an office workflow. We state the requirements on a system implementing the workflow and its reconfiguration, and describe the system’s design in BPMN. We then use an asynchronous pi-calculus and webπ∞ to model the design and to verify...

  8. Designing a road map for geoscience workflows

    Science.gov (United States)

    Duffy, Christopher; Gil, Yolanda; Deelman, Ewa; Marru, Suresh; Pierce, Marlon; Demir, Ibrahim; Wiener, Gerry

    2012-06-01

    Advances in geoscience research and discovery are fundamentally tied to data and computation, but formal strategies for managing the diversity of models and data resources in the Earth sciences have not yet been resolved or fully appreciated. The U.S. National Science Foundation (NSF) EarthCube initiative (http://earthcube.ning.com), which aims to support community-guided cyberinfrastructure to integrate data and information across the geosciences, recently funded four community development activities: Geoscience Workflows; Semantics and Ontologies; Data Discovery, Mining, and Integration; and Governance. The Geoscience Workflows working group, with broad participation from the geosciences, cyberinfrastructure, and other relevant communities, is formulating a workflows road map (http://sites.google.com/site/earthcubeworkflow/). The Geoscience Workflows team coordinates with each of the other community development groups given their direct relevance to workflows. Semantics and ontologies are mechanisms for describing workflows and the data they process.

  9. Reasoning about repairability of workflows at design time

    NARCIS (Netherlands)

    Tagni, Gaston; Ten Teije, Annette; Van Harmelen, Frank

    2009-01-01

    This paper describes an approach for reasoning about the repairability of workflows at design time. We propose a heuristic-based analysis of a workflow that aims at evaluating its definition, considering different design aspects and characteristics that affect its repairability (called repairability...

  10. Design decisions in workflow management and quality of work.

    NARCIS (Netherlands)

    Waal, B.M.E. de; Batenburg, R.

    2009-01-01

    In this paper, the design and implementation of a workflow management (WFM) system in a large Dutch social insurance organisation is described. The effect of workflow design decisions on the quality of work is explored theoretically and empirically, using the model of Zur Mühlen as a frame of reference...

  11. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    Science.gov (United States)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models emerging in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting collaborative design of such workflows: among other gaps, they do not support real-time co-design; they cannot track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists; and they cannot capture and retrieve collaboration knowledge on workflow design (the discussions that lead to a design). To address these challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool supporting an elastic number of groups of Earth scientists who collaboratively design and compose data analytics workflows through the Internet. Instead of reinventing the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  12. A use case driven object-oriented design methodology for the design of multi-level workflow schemas

    Science.gov (United States)

    Chen, Pei-Hung

    Traditional workflow schema design largely involves manual processes. It begins with business-process models and ends with delivering a workflow schema that can be executed by a workflow management system. However, little effort has been made to develop methodological approaches for workflow design [BARE99]. As a consequence, this thesis develops a design methodology which enables workflow designers to model complex business processes in a simple and straightforward manner and to easily generate workflow schemas. Our methodology is a use case based approach to algorithmic multi-level workflow schema generation. It begins with an initial analysis phase to capture requirement specifications, and incorporates workflow technology to support business process modeling that captures business processes as workflow specifications. In the process, we extend the interaction diagram for modeling workflow applications and for integrating business rules. Consequently, the extended interaction diagrams, named workflow-based interaction diagrams, can support process-related concepts, including static and dynamic rules, multiple use case scenarios, event scheduling and delay features. We develop automatic conversion algorithms, which enable designers to generate different levels of workflow schemas automatically based on the workflow-based interaction diagrams generated from the use case analysis. The workflow schemas produced can be specified at different levels of abstraction; designers can further create a multi-level web interface based on these workflow schemas to support state- and process-defined data views and decision trees simultaneously.

  13. Modeling, Design, and Implementation of a Cloud Workflow Engine Based on Aneka

    Directory of Open Access Journals (Sweden)

    Jiantao Zhou

    2014-01-01

    This paper presents a Petri net-based model for cloud workflow, which plays a key role in industry. Three kinds of parallelism in cloud workflow are characterized and modeled. Based on the analysis of the modeling, a cloud workflow engine is designed and implemented in the Aneka cloud environment. The experimental results validate the effectiveness of our approach to the modeling, design, and implementation of cloud workflow.
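
    The Petri net view can be sketched compactly. In the hypothetical Python below (not the Aneka implementation), two transitions are enabled by the same marking and can fire independently, which is how task parallelism appears in the net:

        # Hypothetical Petri net sketch: transitions fire when every input
        # place holds a token; task_a and task_b share the start marking and
        # so can fire independently, modeling parallelism in the workflow.
        marking = {"start": 2, "a_done": 0, "b_done": 0, "end": 0}
        transitions = {
            "task_a": {"in": ["start"], "out": ["a_done"]},
            "task_b": {"in": ["start"], "out": ["b_done"]},
            "join":   {"in": ["a_done", "b_done"], "out": ["end"]},
        }

        def enabled(t):
            return all(marking[p] >= 1 for p in transitions[t]["in"])

        def fire(t):
            for p in transitions[t]["in"]:
                marking[p] -= 1
            for p in transitions[t]["out"]:
                marking[p] += 1

        for t in ("task_a", "task_b", "join"):  # one possible firing sequence
            if enabled(t):
                fire(t)
        print(marking)  # {'start': 0, 'a_done': 0, 'b_done': 0, 'end': 1}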

  14. Modeling, Design, and Implementation of a Cloud Workflow Engine Based on Aneka

    OpenAIRE

    Jiantao Zhou; Chaoxin Sun; Weina Fu; Jing Liu; Lei Jia; Hongyan Tan

    2014-01-01

    This paper presents a Petri net-based model for cloud workflow, which plays a key role in industry. Three kinds of parallelism in cloud workflow are characterized and modeled. Based on the analysis of the modeling, a cloud workflow engine is designed and implemented in the Aneka cloud environment. The experimental results validate the effectiveness of our approach to the modeling, design, and implementation of cloud workflow.

  15. The Taverna workflow suite: designing and executing workflows of Web Services on the desktop, web or in the cloud.

    Science.gov (United States)

    Wolstencroft, Katherine; Haines, Robert; Fellows, Donal; Williams, Alan; Withers, David; Owen, Stuart; Soiland-Reyes, Stian; Dunlop, Ian; Nenadic, Aleksandra; Fisher, Paul; Bhagat, Jiten; Belhajjame, Khalid; Bacall, Finn; Hardisty, Alex; Nieva de la Hidalga, Abraham; Balcazar Vargas, Maria P; Sufi, Shoaib; Goble, Carole

    2013-07-01

    The Taverna workflow tool suite (http://www.taverna.org.uk) is designed to combine distributed Web Services and/or local tools into complex analysis pipelines. These pipelines can be executed on local desktop machines or through larger infrastructure (such as supercomputers, Grids or cloud environments), using the Taverna Server. In bioinformatics, Taverna workflows are typically used in the areas of high-throughput omics analyses (for example, proteomics or transcriptomics), or for evidence gathering methods involving text mining or data mining. Through Taverna, scientists have access to several thousand different tools and resources that are freely available from a large range of life science institutions. Once constructed, the workflows are reusable, executable bioinformatics protocols that can be shared, reused and repurposed. A repository of public workflows is available at http://www.myexperiment.org. This article provides an update to the Taverna tool suite, highlighting new features and developments in the workbench and the Taverna Server.

  16. Surgical workflow management schemata for cataract procedures. process model-based design and validation of workflow schemata.

    Science.gov (United States)

    Neumuth, T; Liebmann, P; Wiedemann, P; Meixensberger, J

    2012-01-01

    Workflow guidance of surgical activities is a challenging task. Because of variations in patient properties and applied surgical techniques, surgical processes have high variability. The objective of this study was the design and implementation of a surgical workflow management system (SWFMS) that can provide robust guidance for surgical activities. We investigated how many surgical process models are needed to develop a SWFMS that can guide cataract surgeries robustly. We used 100 cases of cataract surgeries and acquired patient-individual surgical process models (iSPMs) from them. Of these, randomized subsets of iSPMs were selected as learning sets to create generic surgical process models (gSPMs). These gSPMs were mapped onto workflow nets as workflow schemata to define the behavior of the SWFMS. Finally, 10 iSPMs from the disjoint set were simulated to validate the workflow schema for the surgical processes. The measurement was the successful guidance of an iSPM. We demonstrated that a SWFMS with a workflow schema generated from a subset of 10 iSPMs is sufficient to guide approximately 65% of all surgical processes in the total set, and that a subset of 50 iSPMs is sufficient to guide approximately 80% of all processes. We designed a SWFMS that is able to guide surgical activities on a detailed level. The study demonstrated that the high inter-patient variability of surgical processes can be accommodated by our approach.
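
    The validation idea, building a generic model from a learning subset and measuring how many unseen patient-individual models it can guide, can be sketched as follows. This is a hypothetical Python toy in which an iSPM is reduced to a single activity sequence and a gSPM to the set of sequences seen in the learning set; the printed rates are illustrative only, not the study's 65%/80% figures.

        # Hypothetical sketch of the validation idea: a gSPM built from a
        # learning subset "guides" a test case if it contains that case's
        # activity sequence. Real iSPMs and workflow nets are far richer.
        import random

        def guidance_rate(cases, learn_n, trials=1000):
            hits = 0
            for _ in range(trials):
                gspm = set(random.sample(cases, learn_n))  # merged generic model
                hits += random.choice(cases) in gspm       # can it guide this case?
            return hits / trials

        # Toy population: 100 surgeries drawn from 6 process variants,
        # with a skewed frequency distribution (frequencies invented).
        cases = (["A-B-C"] * 50 + ["A-C-B"] * 20 + ["A-B-D"] * 15 +
                 ["A-D-C"] * 8 + ["B-A-C"] * 5 + ["A-C-D"] * 2)
        for n in (10, 50):
            print(n, "->", round(guidance_rate(cases, n), 2))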

  17. Designing Collaborative Healthcare Technology for the Acute Care Workflow

    Directory of Open Access Journals (Sweden)

    Michael Gonzales

    2015-10-01

    Preventable medical errors in hospitals are the third leading cause of death in the United States. Many of these are caused by poor situational awareness, especially in acute care resuscitation scenarios. While a number of checklists and technological interventions have been developed to reduce cognitive load and improve situational awareness, these tools often do not fit the clinical workflow. To better understand the challenges faced by clinicians in acute care codes, we conducted a qualitative study with interprofessional clinicians at three regional hospitals. Our key findings are: current documentation processes are inadequate (with information recorded on paper towels); reference guides can serve as fixation points, reducing rather than enhancing situational awareness; the physical environment imposes significant constraints on workflow; homegrown solutions are often used to solve unstandardized processes; and simulation scenarios do not match real-world practice. We present a number of considerations for collaborative healthcare technology design and discuss the implications of our findings for current work on the development of more effective interventions for acute care resuscitation scenarios.

  18. The Taverna workflow suite: designing and executing workflows of Web Services on the desktop, web or in the cloud

    NARCIS (Netherlands)

    Wolstencroft, K.; Haines, R.; Fellows, D.; Williams, A.; Withers, D.; Owen, S.; Soiland-Reyes, S.; Dunlop, I.; Nenadic, A.; Fisher, P.; Bhagat, J.; Belhajjame, K.; Bacall, F.; Hardisty, A.; Nieva de la Hidalga, A.; Balcazar Vargas, M.P.; Sufi, S.; Goble, C.

    2013-01-01

    The Taverna workflow tool suite (http://www.taverna.org.uk) is designed to combine distributed Web Services and/or local tools into complex analysis pipelines. These pipelines can be executed on local desktop machines or through larger infrastructure (such as supercomputers, Grids or cloud environments), using the Taverna Server...

  19. Effects of the Interactions Between LPS and BIM on Workflow in Two Building Design Projects

    OpenAIRE

    Khan, Sheriz; Tzortzopoulos, Patricia

    2014-01-01

    Variability in design workflow causes delays and undermines the performance of building projects. As lean processes, the Last Planner System (LPS) and Building Information Modeling (BIM) can improve workflow in building projects through features that reduce waste. Since its introduction, BIM has had significant positive influence on workflow in building design projects, but these have been rarely considered in combination with LPS. This paper is part of a postgraduate research focusing on the...

  20. ANALYSIS OF WORKFLOW ON DESIGN PROJECTS IN INDIA

    Directory of Open Access Journals (Sweden)

    Senthilkumar Venkatachalam

    2010-12-01

    Proposal: The increase in privately funded infrastructure construction in India has compelled project owners to demand highly compressed project schedules, due to political risks and early revenue generation. As a result, many of the contracts are based on the EPC (Engineering, Procurement and Construction) contract form, enabling the contractor to plan and control the EPC phases. Sole responsibility for the three phases has facilitated the use of innovative approaches such as fast-track construction and concurrent engineering in order to minimize project duration. As part of a research study to improve design processes, the first author spent a year as an observer on two design projects undertaken by a leading EPC contractor in India. Both projects required accelerated design and fast-track construction. The first project involved the detailed design of a coal handling unit for a power plant, and the second the preliminary phase of a large airport design project. The research team had the mandate to analyze the design process and suggest changes to make it more efficient. On the first project, detailed data on the design/drawing workflow was collected and analyzed. The paper presents the analysis of the data, identifying the bottlenecks in the process, and compares the analysis results with the perceptions of the design team. On the second project, the overall organizational structure for coordinating the interfaces between the design processes was evaluated. The paper presents a structured method to organize the interfaces and interactions between the various design disciplines. The details of the proposed method, implementation issues and outcomes of implementation are also discussed.

  1. The Design of an Automated Workflow for Metadata Generation

    Science.gov (United States)

    Manso-Callejo, Miguel; Wachowicz, Mónica; Bernabé-Poveda, Miguel

    The usefulness of digital resources depends on whether metadata is available and has been correctly catalogued and indexed, so that users can discover and use geospatial datasets. However, the cost and error-proneness of manual metadata creation, the lack of information provided by the producers of geospatial datasets and the lack of experience in cataloguing have motivated us to propose a new workflow for automated metadata generation for geospatial datasets. This paper describes this workflow, which is based on task synchronization and supports four metadata functions: discovery, use, evaluation and retrieval of digital geodata. The workflow was implemented using a multi-tier architecture in which the Data, Application and User Tiers can run as a stand-alone application as well as web services. The prototype evaluation is discussed in terms of the type of metadata being generated and the type of metadata function being supported by the workflow.
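
    A hypothetical Python sketch of the task-synchronization idea: independent tasks each derive some metadata fields from a dataset, and a synchronizer merges their results into one record. Field names and tasks are invented for illustration, not taken from the paper's implementation.

        # Hypothetical sketch of automated metadata generation: independent
        # tasks each contribute fields; a synchronizer merges them into one
        # record supporting discovery, use, evaluation and retrieval.
        def extent_task(ds):
            return {"bbox": ds["bbox"]}  # spatial extent for discovery

        def format_task(ds):
            return {"format": ds["path"].rsplit(".", 1)[-1]}

        def quality_task(ds):
            return {"completeness": ds["rows_ok"] / ds["rows"]}

        def generate_metadata(ds, tasks=(extent_task, format_task, quality_task)):
            record = {"title": ds["title"]}
            for task in tasks:           # tasks could run concurrently; their
                record.update(task(ds))  # results are synchronized into one record
            return record

        dataset = {"title": "roads_2008", "path": "roads_2008.shp",
                   "bbox": (-9.5, 35.9, 4.6, 43.8), "rows": 1200, "rows_ok": 1176}
        print(generate_metadata(dataset))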

  2. Design and implementation of workflow engine for service-oriented architecture

    Science.gov (United States)

    Peng, Shuqing; Duan, Huining; Chen, Deyun

    2009-04-01

    As computer networks develop rapidly and enterprise applications become increasingly distributed, traditional workflow engines show deficiencies such as complex structure, poor stability, poor portability, low reusability and difficult maintenance. In this paper, in order to improve the stability, scalability and flexibility of workflow management systems, a four-layer workflow engine architecture based on SOA is put forward following the XPDL standard of the Workflow Management Coalition; the route control mechanism of the control model is realized and scheduling strategies for cyclic and acyclic routing are designed; and the workflow engine is implemented using technologies such as XML, JSP and EJB.
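
    The route control mechanism can be illustrated with a hypothetical Python sketch (the engine described in the paper is XPDL-based, not Python): each activity names guarded outgoing routes, and a route that re-enters an earlier activity gives cyclic routing.

        # Hypothetical sketch of route control: each activity names guarded
        # outgoing routes; a route that re-enters an earlier activity gives
        # cyclic routing, one of the engine's two scheduling strategies.
        def run(process, state):
            current = process["start"]
            while current != "end":
                activity = process["activities"][current]
                state = activity["do"](state)
                # take the first outgoing route whose guard holds
                current = next(nxt for guard, nxt in activity["routes"]
                               if guard(state))
            return state

        review = {
            "start": "draft",
            "activities": {
                "draft": {"do": lambda s: {**s, "revision": s["revision"] + 1},
                          "routes": [(lambda s: True, "review")]},
                "review": {"do": lambda s: {**s, "approved": s["revision"] >= 2},
                           "routes": [(lambda s: s["approved"], "end"),
                                      (lambda s: True, "draft")]},  # cycle back
            },
        }
        print(run(review, {"revision": 0, "approved": False}))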

  3. Managing and Communicating Operational Workflow: Designing and Implementing an Electronic Outpatient Whiteboard.

    Science.gov (United States)

    Steitz, Bryan D; Weinberg, Stuart T; Danciu, Ioana; Unertl, Kim M

    2016-01-01

    Healthcare team members in emergency department contexts have used electronic whiteboard solutions to help manage operational workflow for many years. Ambulatory clinic settings have highly complex operational workflow, but are still limited in electronic assistance to communicate and coordinate work activities. To describe and discuss the design, implementation, use, and ongoing evolution of a coordination and collaboration tool supporting ambulatory clinic operational workflow at Vanderbilt University Medical Center (VUMC). The outpatient whiteboard tool was initially designed to support healthcare work related to an electronic chemotherapy order-entry application. After a highly successful initial implementation in an oncology context, a high demand emerged across the organization for the outpatient whiteboard implementation. Over the past 10 years, developers have followed an iterative user-centered design process to evolve the tool. The electronic outpatient whiteboard system supports 194 separate whiteboards and is accessed by over 2800 distinct users on a typical day. Clinics can configure their whiteboards to support unique workflow elements. Since initial release, features such as immunization clinical decision support have been integrated into the system, based on requests from end users. The success of the electronic outpatient whiteboard demonstrates the usefulness of an operational workflow tool within the ambulatory clinic setting. Operational workflow tools can play a significant role in supporting coordination, collaboration, and teamwork in ambulatory healthcare settings.

  4. Sources of variation in primary care clinical workflow: implications for the design of cognitive support.

    Science.gov (United States)

    Militello, Laura G; Arbuckle, Nicole B; Saleem, Jason J; Patterson, Emily; Flanagan, Mindy; Haggstrom, David; Doebbeling, Bradley N

    2014-03-01

    This article identifies sources of variation in clinical workflow and their implications for the design and implementation of electronic clinical decision support. Sources of variation in workflow were identified via rapid ethnographic observation, focus groups, and interviews across a total of eight medical centers, in both the Veterans Health Administration and academic medical centers nationally regarded as leaders in developing and using clinical decision support. Data were reviewed for types of variability within the social and technical subsystems and the external environment, as described in sociotechnical systems theory. Two researchers independently identified examples of variation and their sources, and then met to discuss them until consensus was reached. Sources of variation were categorized as environmental (clinic staffing and clinic pace), social (perception of health information technology and real-time use with patients), or technical (computer access and information access). Examples of sources of variation within each of the categories are described and discussed in terms of their impact on clinical workflow. As technologies are implemented, barriers to use become visible over time as users struggle to adapt workflow and work practices to accommodate new technologies. Each source of variability identified has implications for the effective design and implementation of useful health information technology. Accommodating moderate variability in workflow is anticipated to avoid brittle and inflexible workflow designs, while also avoiding unnecessary complexity for implementers and users.

  5. Design of efficient computational workflows for in silico drug repurposing.

    Science.gov (United States)

    Vanhaelen, Quentin; Mamoshina, Polina; Aliper, Alexander M; Artemov, Artem; Lezhnina, Ksenia; Ozerov, Ivan; Labat, Ivan; Zhavoronkov, Alex

    2017-02-01

    Here, we provide a comprehensive overview of the current status of in silico repurposing methods by establishing links between current technological trends, data availability and characteristics of the algorithms used in these methods. Using the case of the computational repurposing of fasudil as an alternative autophagy enhancer, we suggest a generic modular organization of a repurposing workflow. We also review 3D structure-based, similarity-based, inference-based and machine learning (ML)-based methods. We summarize the advantages and disadvantages of these methods to emphasize three current technical challenges. We finish by discussing current directions of research, including possibilities offered by new methods, such as deep learning.

  6. Modeling workflow to design machine translation applications for public health practice.

    Science.gov (United States)

    Turner, Anne M; Brownstein, Megumu K; Cole, Kate; Karasz, Hilary; Kirchhoff, Katrin

    2015-02-01

    Provide a detailed understanding of the information workflow processes related to translating health promotion materials for limited English proficiency individuals in order to inform the design of context-driven machine translation (MT) tools for public health (PH). We applied a cognitive work analysis framework to investigate the translation information workflow processes of two large health departments in Washington State. Researchers conducted interviews, performed a task analysis, and validated results with PH professionals to model translation workflow and identify functional requirements for a translation system for PH. The study resulted in a detailed description of work related to translation of PH materials, an information workflow diagram, and a description of attitudes towards MT technology. We identified a number of themes that hold design implications for incorporating MT in PH translation practice. A PH translation tool prototype was designed based on these findings. This study underscores the importance of understanding the work context and information workflow for which systems will be designed. Based on themes and translation information workflow processes, we identified key design guidelines for incorporating MT into PH translation work. Primary amongst these is that MT should be followed by human review for translations to be of high quality and for the technology to be adopted into practice. The time and costs of creating multilingual health promotion materials are barriers to translation. PH personnel were interested in MT's potential to improve access to low-cost translated PH materials, but expressed concerns about ensuring quality. We outline design considerations and a potential machine translation tool to best fit MT systems into PH practice.

  7. Quantum mechanics implementation in drug-design workflows: does it really help? [Corrigendum

    Directory of Open Access Journals (Sweden)

    Arodola OA

    2017-11-01

    Arodola OA, Soliman MES. Quantum mechanics implementation in drug-design workflows: does it really help? Drug Design, Development and Therapy. 2017;11:2551–2564. Figure 3 on page 2557 contains errors. The correct figure is shown.

  8. From Design to Production Control Through the Integration of Engineering Data Management and Workflow Management Systems

    CERN Document Server

    Le Goff, J M; Bityukov, S; Estrella, F; Kovács, Z; Le Flour, T; Lieunard, S; McClatchey, R; Murray, S; Organtini, G; Vialle, J P; Bazan, A; Chevenier, G

    1997-01-01

    At a time when many companies are under pressure to reduce "times-to-market", the management of product information from the early stages of design through assembly to manufacture and production has become increasingly important. Similarly, in the construction of high energy physics devices, the collection of (often evolving) engineering data is central to the subsequent physics analysis. Traditionally, design engineers in industry have employed Engineering Data Management Systems (also called Product Data Management Systems) to coordinate and control access to documented versions of product designs. However, these systems provide control only at the collaborative design level and are seldom used beyond design. Workflow management systems, on the other hand, are employed in industry to coordinate and support the more complex and repeatable work processes of the production environment. Commercial workflow products cannot support the highly dynamic activities found both in the design stages of product development...

  9. Linear CMOS RF power amplifiers a complete design workflow

    CERN Document Server

    Ruiz, Hector Solar

    2013-01-01

    The work establishes the design flow for the optimization of linear CMOS power amplifiers, from the first steps of the design to the final IC implementation and tests. The author also focuses on design guidelines for the inductor's geometrical characteristics for power applications and covers their measurement and characterization. Additionally, a model is proposed which facilitates designs in terms of transistor sizing, required inductor quality factors or minimum supply voltage. The model considers limitations that CMOS processes can impose on implementation. The book also provides diffe...

  10. MysiRNA-Designer: A Workflow for Efficient siRNA Design

    Science.gov (United States)

    Mysara, Mohamed; Garibaldi, Jonathan M.; ElHefnawi, Mahmoud

    2011-01-01

    The design of small interfering RNA (siRNA) is a multifactorial problem that has gained the attention of many researchers in the area of therapeutic and functional genomics. The MysiRNA score was previously introduced; it improves the correlation of siRNA activity prediction relative to state-of-the-art algorithms. In this paper, a new program, MysiRNA-Designer, is described, which integrates several factors in an automated workflow considering mRNA transcript variations, siRNA and mRNA target accessibility, and both near-perfect and partial off-target matches. It also features the MysiRNA score, a highly ranked, correlated siRNA efficacy prediction score for ranking the designed siRNAs, in addition to the top-scoring models Biopredsi, DSIR, Thermocomposition21 and i-Score, and integrates them in a unique siRNA score-filtration technique. This multi-score filtration layer retains only siRNAs that pass the 90% thresholds calculated from experimental dataset features. MysiRNA-Designer takes an accession, finds conserved regions among its transcript space, finds accessible regions within the mRNA, designs all possible siRNAs for these regions, filters them based on multi-score thresholds, and then performs SNP and off-target filtration. These strict selection criteria were tested against human genes; at least one active siRNA was designed for 95.7% of the genes. In addition, when tested against an experimental dataset, MysiRNA-Designer was found capable of rejecting 98% of the false-positive siRNAs, showing superiority over three state-of-the-art siRNA design programs. MysiRNA-Designer is a freely accessible (Microsoft Windows-based) desktop application that can be used to design siRNA with high accuracy and specificity. We believe that MysiRNA-Designer has the potential to play an important role in this area. PMID:22046244
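
    The multi-score filtration layer reduces to a simple rule: a candidate survives only if every prediction score clears its threshold. A hypothetical Python sketch follows; the sequences, score names and thresholds are invented for illustration, not MysiRNA's trained values.

        # Hypothetical sketch of multi-score filtration: a candidate siRNA
        # survives only if every prediction score clears its threshold.
        candidates = [
            {"seq": "GUAUCGUAAGCAGUACUGA", "mysirna": 0.91, "iscore": 0.88, "dsir": 0.93},
            {"seq": "AAGGCUAUGAAGAGAUACA", "mysirna": 0.95, "iscore": 0.52, "dsir": 0.90},
        ]
        thresholds = {"mysirna": 0.90, "iscore": 0.80, "dsir": 0.85}

        def passes(candidate):
            return all(candidate[score] >= cut for score, cut in thresholds.items())

        print([c["seq"] for c in candidates if passes(c)])  # first candidate only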

  11. Product-Based Workflow Design for Monitoring of Collaborative Business Processes

    OpenAIRE

    Comuzzi, M.; Vanderfeesten, I. T. P.

    2011-01-01

    Monitoring of cross-organizational processes requires the definition and implementation of monitoring processes that can deliver the right information to the right party in the collaboration. Monitoring processes should account for the temporal and aggregation dependencies among the monitoring information made available by the set of collaborating parties. We solve the problem of designing monitoring processes in collaborative settings using Product-Based Workflow Design (PBWD). We first discuss...

  12. Echoes of Semiotically-Based Design in the Development and Testing of a Workflow System

    Directory of Open Access Journals (Sweden)

    Clarisse Sieckenius de Souza

    2001-05-01

    Workflow systems are information-intensive, task-oriented computer applications that typically involve a considerable number of users playing a wide variety of roles. Since communication, coordination and decision-making processes are essential for such systems, representing, interpreting and negotiating collective meanings are a crucial issue for software design and development processes. In this paper, we report and discuss our experience in implementing Qualitas, a web-based workflow system. Semiotic theory was extensively used to support design decisions and negotiations with users about technological signs. Taking scenarios as a type-sign exchanged throughout the whole process, we could trace the theoretical underpinnings of our experience and draw some revealing conclusions about the product and the process of technologically reified discourse. Although it is present in all information technology applications, this kind of discourse is seldom analyzed by software designers and developers. Our conjecture is that, outside semiotic theory, professionals involved with human-computer interaction and software engineering practices have difficulty coalescing concepts derived from such different disciplines as psychology, anthropology, linguistics and sociology, to name a few. Semiotics, however, can by itself provide a unifying ontological basis for interdisciplinary knowledge, raising issues and proposing alternatives that may help professionals gain insights at lower learning costs. Keywords: semiotic engineering, workflow systems, information-intensive task-oriented systems, scenario-based design and development of computer systems, human-computer interaction

  13. Towards a workflow driven design for mHealth devices within temporary eye clinics in low-income settings.

    Science.gov (United States)

    Bolster, Nigel M; Bastawrous, Andrew; Giardini, Mario E

    2015-01-01

    Only a small minority of mobile healthcare technologies that have been successful in pilot studies have subsequently been integrated into healthcare systems. Understanding the reasons behind this discrepancy is crucial if such technologies are to be adopted. We believe that the mismatch is due to a breakdown in the relation between the technical soundness of the original mobile health (mHealth) device design and its integration into healthcare provision workflows. Quantitative workflow modelling provides an opportunity to test this hypothesis. In this paper we present our current progress in developing a clinical workflow model for mobile eye assessment in low-income settings. We test the model for determining the appropriateness of design parameters of an mHealth device within this workflow, by assessing their impact on the performance of the entire clinical workflow.
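
    As a toy example of testing a device design parameter against clinic workflow performance (hypothetical Python; all numbers invented, not from the paper's model), a longer per-patient acquisition time directly lowers the daily throughput of a temporary clinic:

        # Hypothetical sketch: how a device design parameter (per-patient
        # acquisition time) propagates to clinic-level throughput.
        def patients_seen(acquisition_min, day_min=480, triage_min=3.0):
            per_patient = triage_min + acquisition_min
            return int(day_min // per_patient)

        for acq in (1.5, 3.0, 6.0):  # candidate device designs
            print(f"{acq} min acquisition -> {patients_seen(acq)} patients/day")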

  14. An Assisted Workflow for the Early Design of Nearly Zero Emission Healthcare Buildings

    Directory of Open Access Journals (Sweden)

    Hassan A. Sleiman

    2017-07-01

    Energy efficiency in buildings is one of the main goals of many governmental policies due to buildings' high impact on carbon dioxide emissions in Europe. One of these targets is to reduce the energy consumption of healthcare buildings, which are known to be among the most energy-demanding building types. Although design decisions made at early design phases have a significant impact on the energy performance of the realized buildings, only a small portion of the possible early designs is analyzed, which does not ensure an optimal building design. We propose an automated early design support workflow, accompanied by a set of tools, for achieving nearly zero emission healthcare buildings. It is intended to be used by decision makers during the early design phase. It starts with the user-defined brief and the design rules, which are the input for the Early Design Configurator (EDC). The EDC generates multiple design alternatives following an evolutionary algorithm while trying to satisfy user requirements and geometric constraints. The generated alternatives are then validated by means of an Early Design Validator (EDV), and early energy and cost assessments are made using two early assessment tools. A user-friendly dashboard is used to guide the user and to illustrate the workflow results, and the alternative chosen at the end of the workflow is considered the starting point for the next design phases. Our proposal has been implemented using Building Information Models (BIM) and validated by means of a case study on a healthcare building and several real demonstrations from different countries in the context of the European project STREAMER.
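
    The EDC's generate-and-score loop can be sketched in miniature. The hypothetical Python below evolves candidate floor layouts toward a target area under one geometric constraint; the real EDC's design rules, requirements and scoring are far richer.

        # Hypothetical sketch of an evolutionary design configurator: evolve
        # candidate layouts (width, depth) toward a target floor area under
        # a maximum-width constraint. All numbers are invented.
        import random

        TARGET_AREA, MAX_WIDTH = 480.0, 30.0

        def fitness(layout):
            w, d = layout
            penalty = 1e6 if w > MAX_WIDTH else 0.0    # geometric constraint
            return abs(w * d - TARGET_AREA) + penalty  # requirement mismatch

        def evolve(pop_size=30, generations=60):
            pop = [(random.uniform(5, 60), random.uniform(5, 60))
                   for _ in range(pop_size)]
            for _ in range(generations):
                parents = sorted(pop, key=fitness)[:pop_size // 3]
                pop = [(max(1.0, w + random.gauss(0, 1)),  # mutate a parent
                        max(1.0, d + random.gauss(0, 1)))
                       for w, d in random.choices(parents, k=pop_size)]
            return min(pop, key=fitness)

        w, d = evolve()
        print(f"best alternative: {w:.1f} m x {d:.1f} m, area {w * d:.0f} m2")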

  15. End-to-end interoperability and workflows from building architecture design to one or more simulations

    Science.gov (United States)

    Chao, Tian-Jy; Kim, Younghun

    2015-02-10

    An end-to-end interoperability and workflows from building architecture design to one or more simulations, in one aspect, may comprise establishing a BIM enablement platform architecture. A data model defines data entities and entity relationships for enabling the interoperability and workflows. A data definition language may be implemented that defines and creates a table schema of a database associated with the data model. Data management services and/or application programming interfaces may be implemented for interacting with the data model. Web services may also be provided for interacting with the data model via the Web. A user interface may be implemented that communicates with users and uses the BIM enablement platform architecture, the data model, the data definition language, data management services and application programming interfaces to provide functions to the users to perform work related to building information management.

  16. Scientific workflows for bibliometrics

    OpenAIRE

    Guler, A.T.; Waaijer, C.J.; Palmblad, M.

    2016-01-01

    Scientific workflows organize the assembly of specialized software into an overall data flow and are particularly well suited for multi-step analyses using different types of software tools. They are also favorable in terms of reusability, as previously designed workflows could be made publicly available through the myExperiment community and then used in other workflows. We here illustrate how scientific workflows and the Taverna workbench in particular can be used in bibliometrics. We discuss...

  17. The use of workflows in the design and implementation of complex experiments in macromolecular crystallography

    Energy Technology Data Exchange (ETDEWEB)

    Brockhauser, Sandor, E-mail: brockhauser@embl.fr [European Molecular Biology Laboratory, 6 Rue Jules Horowitz, BP 181, 38042 Grenoble (France); UJF–EMBL–CNRS, UMI 3265, 6 Rue Jules Horowitz, 38042 Grenoble CEDEX 9 (France); Svensson, Olof; Bowler, Matthew W. [European Synchrotron Radiation Facility, 6 Rue Jules Horowitz, 38043 Grenoble (France); Nanao, Max [European Molecular Biology Laboratory, 6 Rue Jules Horowitz, BP 181, 38042 Grenoble (France); UJF–EMBL–CNRS, UMI 3265, 6 Rue Jules Horowitz, 38042 Grenoble CEDEX 9 (France); Gordon, Elspeth; Leal, Ricardo M. F.; Popov, Alexander; Gerring, Matthew [European Synchrotron Radiation Facility, 6 Rue Jules Horowitz, 38043 Grenoble (France); McCarthy, Andrew A. [European Molecular Biology Laboratory, 6 Rue Jules Horowitz, BP 181, 38042 Grenoble (France); UJF–EMBL–CNRS, UMI 3265, 6 Rue Jules Horowitz, 38042 Grenoble CEDEX 9 (France); Gotz, Andy [European Synchrotron Radiation Facility, 6 Rue Jules Horowitz, 38043 Grenoble (France); European Molecular Biology Laboratory, 6 Rue Jules Horowitz, BP 181, 38042 Grenoble (France)

    2012-08-01

    A powerful and easy-to-use workflow environment has been developed at the ESRF for combining experiment control with online data analysis on synchrotron beamlines. This tool provides the possibility of automating complex experiments without the need for expertise in instrumentation control and programming, but rather by accessing defined beamline services. The automation of beam delivery, sample handling and data analysis, together with increasing photon flux, diminishing focal spot size and the appearance of fast-readout detectors on synchrotron beamlines, have changed the way that many macromolecular crystallography experiments are planned and executed. Screening for the best diffracting crystal, or even the best diffracting part of a selected crystal, has been enabled by the development of microfocus beams, precise goniometers and fast-readout detectors that all require rapid feedback from the initial processing of images in order to be effective. All of these advances require the coupling of data feedback to the experimental control system and depend on immediate online data-analysis results during the experiment. To facilitate this, a Data Analysis WorkBench (DAWB) for the flexible creation of complex automated protocols has been developed. Here, example workflows designed and implemented using DAWB are presented for enhanced multi-step crystal characterizations, experiments involving crystal reorientation with kappa goniometers, crystal-burning experiments for empirically determining the radiation sensitivity of a crystal system and the application of mesh scans to find the best location of a crystal to obtain the highest diffraction quality. Beamline users interact with the prepared workflows through a specific brick within the beamline-control GUI MXCuBE.
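
    The mesh-scan workflow mentioned above reduces to scoring each grid position on the sample with an online analysis metric and centring data collection on the best one. A hypothetical Python sketch (positions and scores invented, not DAWB code):

        # Hypothetical sketch of the mesh-scan idea: score every grid
        # position with an online analysis metric and collect at the best.
        def best_position(scores):
            return max(scores, key=scores.get)

        mesh = {(x, y): 0.0 for x in range(5) for y in range(5)}
        mesh[(2, 3)] = 41.7  # e.g. diffraction spot count from fast feedback
        mesh[(2, 2)] = 35.2
        mesh[(1, 3)] = 28.9
        print(best_position(mesh))  # -> (2, 3)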

  18. The use of workflows in the design and implementation of complex experiments in macromolecular crystallography.

    Science.gov (United States)

    Brockhauser, Sandor; Svensson, Olof; Bowler, Matthew W; Nanao, Max; Gordon, Elspeth; Leal, Ricardo M F; Popov, Alexander; Gerring, Matthew; McCarthy, Andrew A; Gotz, Andy

    2012-08-01

    The automation of beam delivery, sample handling and data analysis, together with increasing photon flux, diminishing focal spot size and the appearance of fast-readout detectors on synchrotron beamlines, have changed the way that many macromolecular crystallography experiments are planned and executed. Screening for the best diffracting crystal, or even the best diffracting part of a selected crystal, has been enabled by the development of microfocus beams, precise goniometers and fast-readout detectors that all require rapid feedback from the initial processing of images in order to be effective. All of these advances require the coupling of data feedback to the experimental control system and depend on immediate online data-analysis results during the experiment. To facilitate this, a Data Analysis WorkBench (DAWB) for the flexible creation of complex automated protocols has been developed. Here, example workflows designed and implemented using DAWB are presented for enhanced multi-step crystal characterizations, experiments involving crystal reorientation with kappa goniometers, crystal-burning experiments for empirically determining the radiation sensitivity of a crystal system and the application of mesh scans to find the best location of a crystal to obtain the highest diffraction quality. Beamline users interact with the prepared workflows through a specific brick within the beamline-control GUI MXCuBE.

  19. Digital workflow for virtually designing and milling ceramic lithium disilicate veneers: a clinical report.

    Science.gov (United States)

    Zandinejad, A; Lin, W S; Atarodi, M; Abdel-Azim, T; Metz, M J; Morton, D

    2015-01-01

    Laminate veneers have been routinely used to restore and enhance the appearance of natural dentition. The traditional pathway for fabricating veneers consisted of making conventional polyvinyl siloxane impressions, producing stone casts, and fabricating final porcelain prostheses on stone dies. Pressed ceramics have successfully been used for laminate veneer fabrication for several years. Recently, digital computer-aided design/computer-aided manufacturing scanning has become commercially available to make a digital impression that is sent electronically to a dental laboratory or a chairside milling machine. However, technology has been developed to allow digital data acquisition in conjunction with electronically transmitted data that enables virtual design of restorations and milling at a remote production center. Following the aforementioned workflow will provide the opportunity to fabricate a physical cast-free restoration. This new technique has been reported recently for all-ceramic IPS e.max full-coverage pressed-ceramic restorations. However, laminate veneers are very delicate and technique-sensitive restorations when compared with all-ceramic full-coverage ones made from the same material. Complete digital design and fabrication of multiple consecutive laminate veneers seems to be very challenging. This clinical report presents the digital workflow for the virtual design and fabrication of multiple laminate veneers in a patient for enhancing the esthetics of his maxillary anterior teeth. A step-by-step process is presented with a discussion of the advantages and disadvantages of this novel technique. Additionally, the use of lithium disilicate ceramic as the material of choice and the rationale for such a decision is discussed.

  20. Evaluation of user interface and workflow design of a bedside nursing clinical decision support system.

    Science.gov (United States)

    Yuan, Michael Juntao; Finley, George Mike; Long, Ju; Mills, Christy; Johnson, Ron Kim

    2013-01-31

    Clinical decision support systems (CDSS) are important tools to improve health care outcomes and reduce preventable medical adverse events. However, the effectiveness and success of CDSS depend on their implementation context and usability in complex health care settings. As a result, usability design and validation, especially in real world clinical settings, are crucial aspects of successful CDSS implementations. Our objective was to develop a novel CDSS to help frontline nurses better manage critical symptom changes in hospitalized patients, hence reducing preventable failure-to-rescue cases. A robust user interface and implementation strategy that fit into existing workflows was key for the success of the CDSS. Guided by a formal usability evaluation framework, UFuRT (user, function, representation, and task analysis), we developed a high-level specification of the product that captures key usability requirements and is flexible to implement. We interviewed users of the proposed CDSS to identify requirements, listed functions, and operations the system must perform. We then designed visual and workflow representations of the product to perform the operations. The user interface and workflow design were evaluated via heuristic and end user performance evaluation. The heuristic evaluation was done after the first prototype, and its results were incorporated into the product before the end user evaluation was conducted. First, we recruited 4 evaluators with strong domain expertise to study the initial prototype. Heuristic violations were coded and rated for severity. Second, after development of the system, we assembled a panel of nurses, consisting of 3 licensed vocational nurses and 7 registered nurses, to evaluate the user interface and workflow via simulated use cases. We recorded whether each session was successfully completed and its completion time. Each nurse was asked to use the National Aeronautics and Space Administration (NASA) Task Load Index to self...

  1. Design and implementation of a secure workflow system based on PKI/PMI

    Science.gov (United States)

    Yan, Kai; Jiang, Chao-hui

    2013-03-01

    Traditional workflow systems have several weaknesses in privilege management: low privilege-management efficiency, an overburdened administrator, and the lack of a trusted authority. After an in-depth study of the security requirements of workflow systems, a secure workflow model based on PKI/PMI is proposed. The model achieves static and dynamic authorization by verifying a user's identity through a public-key certificate (PKC) and validating the user's privilege information using an attribute certificate (AC) in the workflow system. Practice shows that this system can meet the security requirements of a WfMS: it not only improves system security, but also ensures the integrity, confidentiality, availability and non-repudiation of the data in the system.
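
    The authorization rule the model describes, identity from a public-key certificate (PKC) and privileges from an attribute certificate (AC), can be sketched as follows. This is hypothetical Python; real systems verify signature chains against a trust anchor rather than boolean flags.

        # Hypothetical sketch of the PKI/PMI check: identity comes from a
        # public-key certificate (PKC), privileges from an attribute
        # certificate (AC); boolean flags stand in for real chain checks.
        from dataclasses import dataclass

        @dataclass
        class PKC:
            subject: str
            signature_valid: bool

        @dataclass
        class AC:
            holder: str
            roles: frozenset
            signature_valid: bool

        def authorize(pkc, ac, required_role):
            if not (pkc.signature_valid and ac.signature_valid):
                return False                   # untrusted certificate
            if ac.holder != pkc.subject:
                return False                   # AC not bound to this identity
            return required_role in ac.roles   # dynamic, role-based decision

        print(authorize(PKC("alice", True),
                        AC("alice", frozenset({"approver"}), True),
                        "approver"))  # True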

  2. Achieving E-learning with IMS Learning Design - Workflow Implications at the Open University of the Netherlands

    NARCIS (Netherlands)

    Westera, Wim; Brouns, Francis; Pannekeet, Kees; Janssen, José; Manderveld, Jocelyn

    2005-01-01

    Please refer to the original article: Westera, W., Brouns, F., Pannekeet, K., Janssen, J., & Manderveld, J. (2005). Achieving E-learning with IMS Learning Design - Workflow Implications at the Open University of the Netherlands. Educational Technology & Society, 8 (3), 216-225.

  3. Quantum mechanics implementation in drug-design workflows: does it really help?

    Science.gov (United States)

    Arodola, Olayide A; Soliman, Mahmoud Es

    2017-01-01

    The pharmaceutical industry is progressively operating in an era where development costs are constantly under pressure, higher percentages of drugs are demanded, and the drug-discovery process is a trial-and-error run. The profit that flows in with the discovery of new drugs has always been the motivation for the industry to keep up the pace and keep abreast with the endless demand for medicines. The process of finding a molecule that binds to the target protein using in silico tools has made computational chemistry a valuable tool in drug discovery in both academic research and pharmaceutical industry. However, the complexity of many protein-ligand interactions challenges the accuracy and efficiency of the commonly used empirical methods. The usefulness of quantum mechanics (QM) in drug-protein interaction cannot be overemphasized; however, this approach has little significance in some empirical methods. In this review, we discuss recent developments in, and application of, QM to medically relevant biomolecules. We critically discuss the different types of QM-based methods and their proposed application to incorporating them into drug-design and -discovery workflows while trying to answer a critical question: are QM-based methods of real help in drug-design and -discovery research and industry?

  4. Quantum mechanics implementation in drug-design workflows: does it really help?

    Science.gov (United States)

    Arodola, Olayide A; Soliman, Mahmoud ES

    2017-01-01

    The pharmaceutical industry is progressively operating in an era where development costs are constantly under pressure, higher percentages of drugs are demanded, and the drug-discovery process is a trial-and-error run. The profit that flows in with the discovery of new drugs has always been the motivation for the industry to keep up the pace and keep abreast with the endless demand for medicines. The process of finding a molecule that binds to the target protein using in silico tools has made computational chemistry a valuable tool in drug discovery in both academic research and pharmaceutical industry. However, the complexity of many protein–ligand interactions challenges the accuracy and efficiency of the commonly used empirical methods. The usefulness of quantum mechanics (QM) in drug–protein interaction cannot be overemphasized; however, this approach has little significance in some empirical methods. In this review, we discuss recent developments in, and application of, QM to medically relevant biomolecules. We critically discuss the different types of QM-based methods and their proposed application to incorporating them into drug-design and -discovery workflows while trying to answer a critical question: are QM-based methods of real help in drug-design and -discovery research and industry? PMID:28919707

  5. Workflow Management

    NARCIS (Netherlands)

    Grefen, P.W.P.J.

    2000-01-01

    Workflow management offers concepts and techniques for improving the efficiency and effectiveness of complex processes in administrative environments. Among other benefits, deploying workflow management technology can lead to shorter process throughput times and improved insight into the...

  6. Quantum mechanics implementation in drug-design workflows: does it really help?

    Directory of Open Access Journals (Sweden)

    Arodola OA

    2017-08-01

    Olayide A Arodola (1), Mahmoud ES Soliman (1,2); (1) Department of Pharmaceutical Chemistry, University of KwaZulu-Natal, Durban, South Africa; (2) Department of Pharmaceutical Organic Chemistry, Faculty of Pharmacy, Zagazig University, Egypt. Abstract: The pharmaceutical industry is progressively operating in an era where development costs are constantly under pressure, higher percentages of drugs are demanded, and the drug-discovery process is a trial-and-error run. The profit that flows in with the discovery of new drugs has always been the motivation for the industry to keep up the pace and keep abreast with the endless demand for medicines. The process of finding a molecule that binds to the target protein using in silico tools has made computational chemistry a valuable tool in drug discovery in both academic research and pharmaceutical industry. However, the complexity of many protein–ligand interactions challenges the accuracy and efficiency of the commonly used empirical methods. The usefulness of quantum mechanics (QM) in drug–protein interaction cannot be overemphasized; however, this approach has little significance in some empirical methods. In this review, we discuss recent developments in, and application of, QM to medically relevant biomolecules. We critically discuss the different types of QM-based methods and their proposed application to incorporating them into drug-design and -discovery workflows while trying to answer a critical question: are QM-based methods of real help in drug-design and -discovery research and industry? Keywords: quantum mechanics, drug discovery, drug design, molecular mechanics, molecular dynamics, in silico tools

  7. Scientific workflows for bibliometrics

    NARCIS (Netherlands)

    Guler, A.T.; Waaijer, C.J.; Palmblad, M.

    2016-01-01

    Scientific workflows organize the assembly of specialized software into an overall data flow and are particularly well suited for multi-step analyses using different types of software tools. They are also favorable in terms of reusability, as previously designed workflows could be made publicly

  8. Workflow Agents vs. Expert Systems: Problem Solving Methods in Work Systems Design

    Science.gov (United States)

    Clancey, William J.; Sierhuis, Maarten; Seah, Chin

    2009-01-01

During the 1980s, a community of artificial intelligence researchers became interested in formalizing problem solving methods as part of an effort called "second generation expert systems" (2nd GES). How do the motivations and results of this research relate to building tools for the workplace today? We provide an historical review of how the theory of expertise has developed, a progress report on a tool for designing and implementing model-based automation (Brahms), and a concrete example of how we apply 2nd GES concepts today in an agent-based system for space flight operations (OCAMS). Brahms incorporates an ontology for modeling work practices, what people are doing in the course of a day, characterized as "activities." OCAMS was developed using a simulation-to-implementation methodology, in which a prototype tool was embedded in a simulation of future work practices. OCAMS uses model-based methods to interactively plan its actions and keep track of the work to be done. The problem solving methods of practice are interactive, employing reasoning for and through action in the real world. Analogously, it is as if a medical expert system were charged not just with interpreting culture results, but actually interacting with a patient. Our perspective shifts from building a "problem solving" (expert) system to building an actor in the world. The reusable components in work system designs include entire "problem solvers" (e.g., a planning subsystem), interoperability frameworks, and workflow agents that use and revise models dynamically in a network of people and tools. Consequently, the research focus shifts so "problem solving methods" include ways of knowing that models do not fit the world, and ways of interacting with other agents and people to gain or verify information and (ultimately) adapt rules and procedures to resolve problematic situations.

  9. Workflow management systems in radiology

    Science.gov (United States)

    Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim

    1998-07-01

In a situation of shrinking health care budgets, increasing cost pressure and growing demands to increase the efficiency and the quality of medical services, health care enterprises are forced to optimize or completely re-design their processes. Although information technology is widely agreed to contribute to cost reduction and efficiency improvement, the real success factors are the re-definition and automation of processes: Business Process Re-engineering and Workflow Management. In this paper we discuss architectures for the use of workflow management systems in radiology. We propose to move forward from information systems in radiology (RIS, PACS) to Radiology Management Systems, in which workflow functionality (process definitions and process automation) is implemented through autonomous workflow management systems (WfMS). In a workflow-oriented architecture, an autonomous workflow enactment service communicates with workflow client applications via standardized interfaces. In this paper, we discuss the need for and the benefits of such an approach, emphasize the separation of workflow management system and application systems, and examine the consequences that arise for the architecture of workflow-oriented information systems. This includes an appropriate workflow terminology and the definition of standard interfaces for workflow-aware application systems. Workflow studies in various institutions have shown that most of the processes in radiology are well structured and suited to a workflow management approach. Numerous commercially available Workflow Management Systems (WfMS) were investigated, and some of them, which are process-oriented and application independent, appear suitable for use in radiology.
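
    To illustrate the architectural separation the authors advocate, here is a minimal sketch in Python; the class, step, and case names are invented and do not reflect any WfMS product or DICOM interface.

```python
# Minimal sketch of the proposed separation: an autonomous workflow
# enactment service that owns the routing logic and exposes only a small,
# standardized worklist interface to client applications (RIS, PACS, etc.).

from collections import deque

class WorkflowEnactmentService:
    """Drives process definitions; knows nothing about application internals."""

    def __init__(self, process_definition):
        self.process_definition = list(process_definition)  # ordered activity names
        self.worklists = {step: deque() for step in self.process_definition}

    def start_case(self, case_id):
        # A new case (e.g., a radiology examination) enters the first activity.
        self.worklists[self.process_definition[0]].append(case_id)

    def complete(self, step, case_id):
        # Routing logic lives in the engine, not in the applications.
        self.worklists[step].remove(case_id)
        nxt = self.process_definition.index(step) + 1
        if nxt < len(self.process_definition):
            self.worklists[self.process_definition[nxt]].append(case_id)

    def worklist(self, step):
        # The standardized interface a workflow-aware application calls.
        return list(self.worklists[step])

engine = WorkflowEnactmentService(["order", "acquisition", "reading", "reporting"])
engine.start_case("exam-001")
engine.complete("order", "exam-001")
print(engine.worklist("acquisition"))  # ['exam-001']
```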

  10. Service-Oriented Architectures: from Design to Production exploiting Workflow Patterns

    Directory of Open Access Journals (Sweden)

    Saverio GIALLORENZO

    2015-03-01

Full Text Available In Service-Oriented Architectures (SOA), services are composed by coordinating their communications into a flow of interactions. Coloured Petri nets (CPN) offer a formal yet easy tool for modelling abstract SOAs. Still, mapping abstract SOAs into executable ones requires a non-trivial and time-costly analysis. Here, we propose a methodology that maps CPN-modelled SOAs into executable Jolie SOAs (our target language). To this end, we employ a collection of recurring control-flow patterns, called Workflow Patterns, as composable blocks of the translation. Following our methodology, we discuss how the Workflow Patterns we consider are translated in Jolie. Finally, we validate our methodology with a realistic use case. As an additional result of our research, we pragmatically assess the expressiveness of Jolie in relation to the considered Workflow Patterns.

  11. Information Engineering and Workflow Design in a Clinical Decision Support System for Colorectal Cancer Screening in Iran.

    Science.gov (United States)

    Maserat, Elham; Seied Farajollah, Seiede Sedigheh; Safdari, Reza; Ghazisaeedi, Marjan; Aghdaei, Hamid Asadzadeh; Zali, Mohammad Reza

    2015-01-01

Colorectal cancer is a major cause of morbidity and mortality throughout the world. Colorectal cancer screening is an optimal way to reduce morbidity and mortality, and a clinical decision support system (CDSS) plays an important role in predicting the success of screening processes. A CDSS is a computer-based information system that improves the delivery of preventive care services. The aim of this article was to detail the engineering of information requirements and the workflow design of a CDSS for a colorectal cancer screening program. In the first stage, a screening minimum data set was determined; developed and developing countries were analyzed to identify this data set, and information deficiencies and gaps were then determined by checklist. The second stage was a qualitative survey with semi-structured interviews as the study tool; the perspectives of 15 users and stakeholders on the workflow of the CDSS were studied. Finally, the workflow of the DSS of the control program was designed according to standard clinical practice guidelines and these perspectives. The screening minimum data set of the national colorectal cancer screening program was defined in five sections, comprising the colonoscopy, surgery, pathology, genetics, and pedigree data sets. Deficiencies and information gaps were analyzed, a standard screening work process was designed, and the workflow of the DSS and the data-entry stage were determined. A CDSS facilitates complex decision making for screening and has a key role in designing optimal interactions between colonoscopy, pathology, and laboratory departments. Workflow analysis is also useful for identifying data reconciliation strategies to address documentation gaps. Following the recommendations of the CDSS should improve the quality of colorectal cancer screening.

  12. Mining workflow processes from distributed workflow enactment event logs

    Directory of Open Access Journals (Sweden)

    Kwanghoon Pio Kim

    2012-12-01

Full Text Available Workflow management systems help to execute, monitor and manage work process flow and execution. These systems, as they execute, keep a record of who does what and when (e.g., a log of events). The activity of using computer software to examine these records and derive structural results is called workflow mining. Workflow mining, in general, needs to encompass behavioral (process/control-flow), social, informational (data-flow), and organizational perspectives, as well as others, because workflow systems are "people systems" that must be designed, deployed, and understood within their social and organizational contexts. This paper focuses particularly on mining the behavioral aspect of workflows from XML-based workflow enactment event logs, which are vertically (semantic-driven distribution) or horizontally (syntactic-driven distribution) distributed over the networked workflow enactment components. That is, this paper proposes distributed workflow mining approaches that are able to rediscover ICN-based structured workflow process models by incrementally amalgamating a series of vertically or horizontally fragmented temporal workcases. Each of the approaches consists of a temporal fragment discovery algorithm, which is able to discover a set of temporal fragment models from the fragmented workflow enactment event logs, and a workflow process mining algorithm, which rediscovers a structured workflow process model from the discovered temporal fragment models. Here, the temporal fragment model represents the concrete model of the XML-based distributed workflow fragment event log.
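
    As a rough illustration of the amalgamation-then-mining idea, the sketch below (with invented events, and a simple direct-follows abstraction rather than the paper's ICN-based algorithms) merges fragmented events by workcase and timestamp, then derives a control-flow graph.

```python
# Toy sketch: fragments of the same workcase, scattered over distributed
# enactment components, are merged by case id and timestamp, and a
# direct-follows graph is mined from the merged traces.

from collections import defaultdict

# Each fragment: (case_id, timestamp, activity), as parsed from an event log.
fragments = [
    ("case1", 3, "approve"), ("case1", 1, "receive"), ("case1", 2, "check"),
    ("case2", 1, "receive"), ("case2", 2, "check"), ("case2", 3, "reject"),
]

# Step 1: amalgamate fragments into complete temporal workcases.
cases = defaultdict(list)
for case_id, ts, activity in fragments:
    cases[case_id].append((ts, activity))

# Step 2: rediscover control flow as direct-follows edges across all cases.
edges = defaultdict(int)
for events in cases.values():
    trace = [a for _, a in sorted(events)]
    for a, b in zip(trace, trace[1:]):
        edges[(a, b)] += 1

for (a, b), count in sorted(edges.items()):
    print(f"{a} -> {b}  (observed {count}x)")
```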

  13. Querying Workflow Logs

    Directory of Open Access Journals (Sweden)

    Yan Tang

    2018-01-01

Full Text Available A business process or workflow is an assembly of tasks that accomplishes a business goal. Business process management is the study of the design, configuration/implementation, enactment and monitoring, analysis, and re-design of workflows. The traditional methodology for the re-design and improvement of workflows relies on the well-known sequence of extract, transform, and load (ETL), data/process warehousing, and online analytical processing (OLAP) tools. In this paper, we study the ad hoc querying of process enactments for (data-centric) business processes, bypassing the traditional methodology for more flexibility in querying. We develop an algebraic query language based on "incident patterns" with four operators inspired by the Business Process Model and Notation (BPMN) representation, allowing the user to formulate ad hoc queries directly over workflow logs. A formal semantics of this query language, a preliminary query evaluation algorithm, and a group of elementary properties of the operators are provided.
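
    A minimal flavor of querying enactments directly, rather than via an ETL/OLAP pipeline, can be given with a single "follows" operator; the operator and log layout below are simplified stand-ins for the paper's incident-pattern algebra, not its actual language.

```python
# Hedged sketch of ad hoc querying over workflow logs: a tiny operator that
# selects the cases in which one activity eventually follows another.

def follows(log, first, then):
    """Return case ids where activity `then` occurs after activity `first`."""
    hits = []
    for case_id, trace in log.items():
        # `then` must appear strictly after the first occurrence of `first`.
        if first in trace and then in trace[trace.index(first) + 1:]:
            hits.append(case_id)
    return hits

log = {
    "case1": ["submit", "review", "approve", "archive"],
    "case2": ["submit", "review", "reject"],
    "case3": ["submit", "approve"],   # review was skipped
}

print(follows(log, "review", "approve"))   # ['case1']
```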

  14. Design and development of a workflow for microbial spray formulations including decision criteria.

    Science.gov (United States)

    Bejarano, Ana; Sauer, Ursula; Preininger, Claudia

    2017-10-01

Herein, we present a workflow for the development of talc-based microbial inoculants for foliar spray, consisting of four steps together with decision-making criteria: (1) the selection of additives based on their capability to wet juvenile maize leaves, (2) their adhesion on the plant, (3) their interaction with the biological systems, and (4) the choice of thickener for good dispersion stability. In total, 29 additives including polysaccharides and proteins, polyols, glycosides, oils, waxes, and surfactants (e.g., chitosan, gelatin, glycerol, saponin, castor oil, polyethylene, rhamnolipid) were evaluated. Contact angle and spreading index measurements revealed that the use of 5% Geloil, 1% rhamnolipid, or suitable combinations of Geloil + rhamnolipid and Nurture Yield S 2002 + rhamnolipid enhanced wetting of hydrophobic maize leaves and adherence, similarly to the commercial wetting agents recommended for plant protection, 1% Prev B2 and 1% Trifolio S Forte. Interaction of additives with biological systems was assessed by biocompatibility and phytotoxicity assays and by cell viability monitoring, using the endophytic Gram-negative bacterium Paraburkholderia phytofirmans PsJN. Results from the biocompatibility assays indicated that, in contrast to rhamnolipid and Prev B2, Geloil, Nurture Yield S 2002 and Trifolio S Forte fully supported bacterial growth within a concentration range of 1 to 5%. Dose-dependent phytotoxicity was observed in plants treated with rhamnolipid. The most efficient formulation was composed of PsJN, talc, xanthan, and Geloil. Beyond that, the proposed workflow is expected to provide general guidance for the development of spray formulations and to help other researchers optimize their choices in this area.

  15. Agile parallel bioinformatics workflow management using Pwrake

    Directory of Open Access Journals (Sweden)

    Tanaka Masahiro

    2011-09-01

Full Text Available Abstract: Background: In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method through iterative development phases after trial and error. Here, we show the application of a scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. Findings: We implemented the Pwrake workflows to process next generation sequencing data using the Genomic Analysis Toolkit (GATK) and Dindel. GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that in practice, actual scientific workflow development iterates over two phases, the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate modularity of the GATK and Dindel workflows. Conclusions: Pwrake enables agile management of scientific workflows in the bioinformatics domain. The internal domain specific language design built on Ruby gives the flexibility of rakefiles for writing scientific workflows

  16. Agile parallel bioinformatics workflow management using Pwrake.

    Science.gov (United States)

    Mishima, Hiroyuki; Sasaki, Kensaku; Tanaka, Masahiro; Tatebe, Osamu; Yoshiura, Koh-Ichiro

    2011-09-08

In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method through iterative development phases after trial and error. Here, we show the application of a scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. We implemented the Pwrake workflows to process next generation sequencing data using the Genomic Analysis Toolkit (GATK) and Dindel. GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that in practice, actual scientific workflow development iterates over two phases, the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate modularity of the GATK and Dindel workflows. Pwrake enables agile management of scientific workflows in the bioinformatics domain. The internal domain specific language design built on Ruby gives the flexibility of rakefiles for writing scientific workflows. Furthermore, readability and maintainability of rakefiles

  17. Continuous and fast calibration of the CMS experiment: design of the automated workflows and operational experience

    CERN Document Server

    Oramus, Piotr Karol; Pfeiffer, Andreas; Franzoni, Giovanni; Govi, Giacomo Maria; Musich, Marco; Di Guida, Salvatore

    2017-01-01

The exploitation of the full physics potential of the LHC experiments requires fast and efficient processing of the largest possible dataset with the most refined understanding of the detector conditions. To face this challenge, the CMS collaboration has set up an infrastructure for the continuous unattended computation of the alignment and calibration constants, allowing for a refined knowledge of the most time-critical parameters just a few hours after the data have been saved to disk. This is the prompt calibration framework which, since the beginning of LHC Run-I, has enabled the analysis and the High Level Trigger of the experiment to consume the most up-to-date conditions, optimizing the performance of the physics objects. In Run-II, this setup has been further expanded to include even more complex calibration algorithms requiring higher statistics to reach the needed precision. This imposed the introduction of a new paradigm in the creation of the calibration datasets for unattended workflows and o...

  18. Design and first implementation of business process visualization for a task manager supporting the workflow in an operating room

    Science.gov (United States)

    Fink, E.; Wiemuth, M.; Burgert, O.

    2015-03-01

An operating room is a stressful work environment. Nevertheless, all involved persons have to work safely, as there is no margin for error. To ensure a high level of concentration and seamless interaction, all involved persons have to know their own tasks and the tasks of their colleagues, and the entire team must work synchronously at all times. To optimize the overall workflow, a task manager supporting the team was developed. In parallel, a common conceptual design of a business process visualization was developed, which makes all relevant information accessible in real time during surgery. In this context, an overview of all processes in the operating room was created and different concepts for the graphical representation of these user-dependent processes were developed. This paper describes the concept of the task manager as well as the general visualization concept for the surgical domain.

  19. A workflow for column interchangeability in liquid chromatography using modeling software and quality-by-design principles.

    Science.gov (United States)

    Kormány, Róbert; Tamás, Katalin; Guillarme, Davy; Fekete, Szabolcs

    2017-11-30

The goal of the present study was to develop a generic workflow to evaluate chromatographic resolution across a large design space and easily find a replacement column for a method. To attain this objective from a limited number of initial experiments, modern LC modeling software (DryLab) was employed to study the behaviour of the compounds and visually compare the parts of the design spaces obtained with different columns in which a given criterion of critical resolution is fulfilled. A zone of robust operation can then easily be found by overlapping the design spaces. Using 50 × 2.1 mm columns packed with sub-2 μm fully porous particles (UHPLC), the resolution in the entire design space can be modeled on the basis of only 2–3 h of experimental work per column. To demonstrate the applicability of the developed procedure, amlodipine and its related pharmacopeia impurities were selected as a case study. It was demonstrated that two columns from different providers (Waters Acquity HSS C18, Thermo Hypersil Gold C18) can be interchanged, providing sufficient resolution at the same working point and a high degree of robustness around this condition.
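
    The design-space overlap can be illustrated with a small numerical sketch. The response surfaces below are synthetic stand-ins (in the study they come from DryLab models fitted to the initial experiments), so only the masking-and-intersection logic is the point here.

```python
# Sketch of the overlap idea: model critical resolution (Rs) over a
# gradient-time x temperature grid for two columns and intersect the
# regions where Rs >= 1.5. The fitted surfaces are hypothetical.

import numpy as np

t_grad = np.linspace(3, 15, 60)        # gradient time, min
temp = np.linspace(20, 60, 40)         # column temperature, deg C
TG, T = np.meshgrid(t_grad, temp)

# Hypothetical fitted response surfaces for the critical peak pair.
rs_column_a = 0.9 + 0.08 * TG - 0.004 * (T - 40) ** 2
rs_column_b = 0.7 + 0.10 * TG - 0.003 * (T - 45) ** 2

criterion = 1.5                        # required critical resolution
robust_a = rs_column_a >= criterion
robust_b = rs_column_b >= criterion
interchangeable = robust_a & robust_b  # overlap of the two design spaces

print(f"column A meets Rs >= {criterion} in {robust_a.mean():.0%} of the space")
print(f"column B meets Rs >= {criterion} in {robust_b.mean():.0%} of the space")
print(f"both (interchangeable zone): {interchangeable.mean():.0%}")
```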

  20. Office 2010 Workflow Developing Collaborative Solutions

    CERN Document Server

    Mann, David; Enterprises, Creative

    2010-01-01

Workflow is the glue that binds information worker processes, users, and artifacts. Without workflow, information workers are just islands of data and potential. Office 2010 Workflow details how to implement workflow in SharePoint 2010 and the client Microsoft Office 2010 suite to help information workers share data, enforce processes and business rules, and work more efficiently together or solo. This book covers everything you need to know, from what workflow is all about to creating new activities; from the SharePoint Designer to Visual Studio 2010; from out-of-the-box workflows to state mac

  1. Scientific Process Automation and Workflow Management

    Energy Technology Data Exchange (ETDEWEB)

    Ludaescher, Bertram T.; Altintas, Ilkay; Bowers, Shawn; Cummings, J.; Critchlow, Terence J.; Deelman, Ewa; De Roure, D.; Freire, Juliana; Goble, Carole; Jones, Matt; Klasky, S.; McPhillips, Timothy; Podhorszki, Norbert; Silva, C.; Taylor, I.; Vouk, M.

    2010-01-01

We introduce and describe scientific workflows, i.e., executable descriptions of automatable scientific processes such as computational science simulations and data analyses. Scientific workflows are often expressed in terms of tasks and their (data flow) dependencies. This chapter first provides an overview of the characteristic features of scientific workflows and outlines their life cycle. A detailed case study highlights workflow challenges and solutions in simulation management. We then provide a brief overview of how some concrete systems support the various phases of the workflow life cycle, i.e., design, resource management, execution, and provenance management. We conclude with a discussion on community-based workflow sharing.
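
    The task-plus-dependency view of a scientific workflow can be sketched in a few lines; the workflow below is invented, and the provenance record is a minimal stand-in for what real systems capture.

```python
# Minimal sketch of the chapter's core abstraction: a scientific workflow as
# tasks plus data-flow dependencies, executed in topological order, with a
# tiny provenance record kept per task run.

from graphlib import TopologicalSorter   # Python 3.9+

# task -> set of tasks it depends on (i.e., whose outputs it consumes)
workflow = {
    "fetch_data": set(),
    "simulate": {"fetch_data"},
    "analyze": {"simulate"},
    "visualize": {"analyze"},
    "archive": {"simulate", "analyze"},
}

provenance = []   # one record per task run, supporting the provenance phase

for task in TopologicalSorter(workflow).static_order():
    # A real system would dispatch to compute resources here.
    provenance.append({"task": task, "inputs": sorted(workflow[task])})
    print(f"running {task} (after {sorted(workflow[task]) or 'nothing'})")

print(f"{len(provenance)} provenance records captured")
```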

  2. LATUX: An Iterative Workflow for Designing, Validating, and Deploying Learning Analytics Visualizations

    Science.gov (United States)

    Martinez-Maldonado, Roberto; Pardo, Abelardo; Mirriahi, Negin; Yacef, Kalina; Kay, Judy; Clayphan, Andrew

    2015-01-01

    Designing, validating, and deploying learning analytics tools for instructors or students is a challenge that requires techniques and methods from different disciplines, such as software engineering, human-computer interaction, computer graphics, educational design, and psychology. Whilst each has established its own design methodologies, we now…

  3. Workflow in nuclear medicine.

    Science.gov (United States)

    Laet, G D; Naudts, J; Vandevivere, J

    2001-01-01

    This paper discusses a workflow management system for nuclear medicine. It augments the more conventional PACS with automatic transfer of studies along the chain of activities making up an examination in nuclear medicine. A prototype system has been designed, built, and installed in a department of nuclear medicine, active in a network of hospitals.

  4. Parametric Room Acoustic Workflows

    DEFF Research Database (Denmark)

    Parigi, Dario; Svidt, Kjeld; Molin, Erik

    2017-01-01

    The paper investigates and assesses different room acoustics software and the opportunities they offer to engage in parametric acoustics workflow and to influence architectural designs. The first step consists in the testing and benchmarking of different tools on the basis of accuracy, speed...

  5. Toward Design, Modelling and Analysis of Dynamic Workflow Reconfigurations - A Process Algebra Perspective

    DEFF Research Database (Denmark)

    Mazzara, M.; Abouzaid, F.; Dragoni, Nicola

    2011-01-01

This paper describes a case study involving the dynamic reconfiguration of an office workflow. We state the requirements on a system implementing the workflow and its reconfiguration, and describe the system's design in BPMN. We then use an asynchronous π-calculus and webπ to model the design and to verify...

  6. Research on Configurable Workflow Technology in PDM System

    OpenAIRE

    Cao Kang; Liu Li; Cheng Zheng

    2016-01-01

This paper analyzed the deficiencies of static process management in the traditional PDM workflow system, and proposed a new configurable workflow system model addressing the dynamic and varied characteristics of equipment product design processes. On the basis of the PDM workflow engine, the model introduced configurable workflow forms and logic-processing programs, an overall workflow template definition, and a customized workflow participant selection module, and put forward the key tech...

  7. Integrated design workflow and a new tool for urban rainwater management.

    Science.gov (United States)

    Chen, Yujiao; Samuelson, Holly W; Tong, Zheming

    2016-09-15

Low Impact Development (LID) practices provide more sustainable solutions for stormwater management than traditional piping and storm ponds. However, architects are not equipped with the knowledge to perform runoff calculations at the early design stage. In response to this dilemma, we have developed an open-source stormwater runoff evaluation and management tool, Rainwater+. It is seamlessly integrated into computer-aided design (CAD) software to provide instant estimates of the stormwater runoff volume of architecture and landscape designs. Designers can thereby develop appropriate rainwater management strategies based on local precipitation data, specific standards, site conditions, and economic considerations. We employed Rainwater+ in two case studies illustrating the importance of considering stormwater runoff in the early design stage. The first case study showed that integrating rainwater management into design modeling is critical for determining the LID practice at any specific site. The second demonstrated the need to visualize runoff flow direction when placing LID practices at proper locations on highly complex terrain.
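
    For a sense of the early-stage estimate such a tool can give designers, here is a minimal volumetric runoff sketch using textbook runoff coefficients; this simplification is an assumption for illustration, not necessarily the algorithm Rainwater+ implements.

```python
# Hedged sketch: site runoff volume as runoff-coefficient x rainfall depth x
# area per surface type. Coefficients and areas are invented examples.

SURFACES = [
    # (description, area in m^2, runoff coefficient C)
    ("roof",          420.0, 0.95),
    ("paved walkway", 150.0, 0.85),
    ("lawn",          600.0, 0.15),
    ("bioswale",      120.0, 0.05),   # an LID practice absorbing most rainfall
]

def runoff_volume_m3(rain_depth_mm: float) -> float:
    """Total site runoff volume (m^3) for a design storm of given depth."""
    depth_m = rain_depth_mm / 1000.0
    return sum(area * c * depth_m for _, area, c in SURFACES)

# A designer can compare surface mixes against a local design storm early on.
print(f"runoff for a 25 mm storm: {runoff_volume_m3(25.0):.1f} m^3")
```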

  8. Dynameomics: design of a computational lab workflow and scientific data repository for protein simulations.

    Science.gov (United States)

    Simms, Andrew M; Toofanny, Rudesh D; Kehl, Catherine; Benson, Noah C; Daggett, Valerie

    2008-06-01

    Dynameomics is a project to investigate and catalog the native-state dynamics and thermal unfolding pathways of representatives of all protein folds using solvated molecular dynamics simulations, as described in the preceding paper. Here we introduce the design of the molecular dynamics data warehouse, a scalable, reliable repository that houses simulation data that vastly simplifies management and access. In the succeeding paper, we describe the development of a complementary multidimensional database. A single protein unfolding or native-state simulation can take weeks to months to complete, and produces gigabytes of coordinate and analysis data. Mining information from over 3000 completed simulations is complicated and time-consuming. Even the simplest queries involve writing intricate programs that must be built from low-level file system access primitives and include significant logic to correctly locate and parse data of interest. As a result, programs to answer questions that require data from hundreds of simulations are very difficult to write. Thus, organization and access to simulation data have been major obstacles to the discovery of new knowledge in the Dynameomics project. This repository is used internally and is the foundation of the Dynameomics portal site http://www.dynameomics.org. By organizing simulation data into a scalable, manageable and accessible form, we can begin to address substantial questions that move us closer to solving biomedical and bioengineering problems.

  9. Design and evaluation of a multimedia electronic patient record "oncoflow" with clinical workflow assistance for head and neck tumor therapy.

    Science.gov (United States)

    Meier, Jens; Boehm, Andreas; Kielhorn, Anne; Dietz, Andreas; Bohn, Stefan; Neumuth, Thomas

    2014-11-01

The management of patient-specific information is a challenging task for surgeons and physicians because existing clinical information systems are insufficiently integrated into daily clinical routine and the information entities they contain are distributed across different proprietary databases. Thus, existing information is hardly usable for further electronic processing, workflow support or clinical studies. A Web-based clinical information system has been developed that automatically imports patient-specific information from different information systems. The system is tailored to the existing workflow for the treatment of patients with head and neck cancer. In this paper, the clinical assistance functions and a quantitative as well as a qualitative system evaluation are presented. The information system has been deployed at a clinical site and is in use in daily clinical routine. Two evaluation studies show that the information integration, the structured information presentation in the Web browser and the assistance functions improve the physician's workflow. The studies also show that using the new information system does not increase the time physicians need for a process step compared with the existing information system. Information integration is crucial for efficient workflow support in the clinic. The central access to information within a modern and structured user interface saves valuable time for the physician. The comprehensive database allows instant use of the existing information for clinical workflow support or the conduct of trial studies.

  10. E-BioFlow: Different perspectives on scientific workflows

    NARCIS (Netherlands)

    Wassink, I.; Rauwerda, H.; van der Vet, P.; Breit, T.; Nijholt, A.

    2008-01-01

    We introduce a new type of workflow design system called e-BioFlow and illustrate it by means of a simple sequence alignment workflow. E-BioFlow, intended to model advanced scientific workflows, enables the user to model a workflow from three different but strongly coupled perspectives: the control

  11. E-BioFlow: Different Perspectives on Scientific Workflows

    NARCIS (Netherlands)

    Wassink, I.; Rauwerda, H.; van der Vet, P.E.; Breit, T.; Nijholt, Antinus; Elloumi, M.; Küng, J.; Linial, M.; Murphy, R.F.; Schneider, K.; Toma, C.

    We introduce a new type of workflow design system called e-BioFlow and illustrate it by means of a simple sequence alignment workflow. E-BioFlow, intended to model advanced scientific workflows, enables the user to model a workflow from three different but strongly coupled perspectives: the control

  12. Managing and Communicating Operational Workflow

    Science.gov (United States)

    Weinberg, Stuart T.; Danciu, Ioana; Unertl, Kim M.

    2016-01-01

Background: Healthcare team members in emergency department contexts have used electronic whiteboard solutions to help manage operational workflow for many years. Ambulatory clinic settings have highly complex operational workflow, but are still limited in electronic assistance to communicate and coordinate work activities. Objective: To describe and discuss the design, implementation, use, and ongoing evolution of a coordination and collaboration tool supporting ambulatory clinic operational workflow at Vanderbilt University Medical Center (VUMC). Methods: The outpatient whiteboard tool was initially designed to support healthcare work related to an electronic chemotherapy order-entry application. After a highly successful initial implementation in an oncology context, high demand emerged across the organization for the outpatient whiteboard implementation. Over the past 10 years, developers have followed an iterative user-centered design process to evolve the tool. Results: The electronic outpatient whiteboard system supports 194 separate whiteboards and is accessed by over 2800 distinct users on a typical day. Clinics can configure their whiteboards to support unique workflow elements. Since the initial release, features such as immunization clinical decision support have been integrated into the system, based on requests from end users. Conclusions: The success of the electronic outpatient whiteboard demonstrates the usefulness of an operational workflow tool within the ambulatory clinic setting. Operational workflow tools can play a significant role in supporting coordination, collaboration, and teamwork in ambulatory healthcare settings. PMID:27081407

  13. Workflow management systems

    OpenAIRE

    Imsland, Geir Inge Struen

    2007-01-01

This master's thesis gives an insight into workflow technologies used to improve the efficiency of business processes. Ways to use such technologies to help users through tasks in MIPS (Material Integrated Production System) are discussed, and a description of how to implement a workflow management system for MIPS is provided. The workflow engine, workflow importer, and user interface are the three most important parts discussed.

  14. DEWEY: the DICOM-enabled workflow engine system.

    Science.gov (United States)

    Erickson, Bradley J; Langer, Steve G; Blezek, Daniel J; Ryan, William J; French, Todd L

    2014-06-01

Workflow is a widely used term to describe the sequence of steps to accomplish a task. The use of workflow technology in medicine, and in medical imaging in particular, is limited. In this article, we describe the application of a workflow engine to improve workflow in a radiology department. We implemented a DICOM-enabled workflow engine system in our department, designed to allow for scalability, reliability, and flexibility. We implemented several workflows, including one that replaced an existing manual workflow, and measured the number of examinations prepared in time with and without the workflow system. The system significantly increased the number of examinations prepared in time for clinical review compared to human effort. It also met the design goals defined at its outset. Workflow engines appear to have value as ways to efficiently assure that complex workflows are completed in a timely fashion.

  15. Integrating data from an online diabetes prevention program into an electronic health record and clinical workflow, a design phase usability study.

    Science.gov (United States)

    Mishuris, Rebecca Grochow; Yoder, Jordan; Wilson, Dan; Mann, Devin

    2016-07-11

Health information is increasingly stored and exchanged digitally. The public regularly collects and stores health-related data on their own electronic devices and in the cloud. Diabetes prevention is an increasingly important preventive health measure, and diet and exercise are key components of it. Patients are turning to online programs to help them lose weight. Despite primary care physicians being important to patients' weight-loss success, there is no exchange of information between the primary care provider (PCP) and these online weight-loss programs. There is an emerging opportunity to integrate these data directly into the electronic health record (EHR), but little is known about what information to share or how to share it most effectively. This study aims to characterize provider preferences concerning the integration of externally generated lifestyle modification data into a primary care EHR workflow. We performed a qualitative study using two rounds of semi-structured interviews with primary care providers, within an iterative design process involving primary care providers, health information technology software developers, and health services researchers. Using grounded-theory thematic analysis, four themes emerged from the interviews: (1) barriers to establishing healthy lifestyles, (2) features of a lifestyle modification program, (3) reporting of outcomes to the primary care provider, and (4) integration with primary care. These themes guided the rapid-cycle agile design of an interface bringing data from an online diabetes prevention program into the primary care EHR workflow. The integration of external health-related data into the EHR must be embedded into the provider workflow in order to be useful to the provider and beneficial for the patient. Accomplishing this requires evaluation of that clinical workflow during software design. The development of this novel interface used rapid-cycle iterative

  16. Workflow User Interfaces Patterns

    Directory of Open Access Journals (Sweden)

    Jean Vanderdonckt

    2012-03-01

Full Text Available A collection of user interface design patterns for workflow information systems is presented that contains forty-three resource patterns classified in seven categories. These categories and their corresponding patterns have been logically identified from the task life cycle based on offering and allocation operations. Each Workflow User Interface Pattern (WUIP) is characterized by properties expressed in the PLML markup language for expressing patterns and augmented by additional attributes and models attached to the pattern: the abstract user interface and the corresponding task model. These models are specified in a User Interface Description Language. All WUIPs are stored in a library and can be retrieved within a workflow editor that links each workflow pattern to its corresponding WUIP, thus giving rise to a user interface for each workflow pattern.

  17. Towards Analyzing Declarative Workflows

    OpenAIRE

    Fahland, Dirk

    2007-01-01

Enacting tasks in a workflow cannot always follow a pre-defined process model. In application domains like disaster management, workflows are partially specified and the circumstances of their enactment change. There exist various approaches to formal workflow models that are effective in such situations, like declarative specifications instead of operational models for formalizing flexible workflow processes. These powerful models leave a gap to existing techniques in the doma...

  18. Workflow automation architecture standard

    Energy Technology Data Exchange (ETDEWEB)

    Moshofsky, R.P.; Rohen, W.T. [Boeing Computer Services Co., Richland, WA (United States)

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  19. Optimize Internal Workflow Management

    Directory of Open Access Journals (Sweden)

    Lucia RUSU

    2010-01-01

Full Text Available Workflow management has the role of creating and maintaining an efficient flow of information and tasks inside an organization. The major benefit of workflows is the solutions they bring to the growing needs of organizations. The external and internal processes associated with a business need to be carefully organized in order to provide a strong foundation for the daily work. This paper focuses on internal workflow within a company, attempts to provide some basic principles related to workflows, and presents a workflow solution for modeling and deployment using Visual Studio and SharePoint Server.

  20. Disruption of Radiologist Workflow.

    Science.gov (United States)

    Kansagra, Akash P; Liu, Kevin; Yu, John-Paul J

    2016-01-01

The effect of disruptions has been studied extensively in surgery and emergency medicine, and a number of solutions, such as preoperative checklists, have been implemented to enforce the integrity of critical safety-related workflows. Disruptions of the highly complex and cognitively demanding workflow of modern clinical radiology have only recently attracted attention as a potential safety hazard. In this article, we describe the variety of disruptions that arise in the reading room environment, review approaches that other specialties have taken to mitigate workflow disruption, and suggest possible solutions for workflow improvement in radiology.

  1. Agreement Workflow Tool (AWT)

    Data.gov (United States)

    Social Security Administration — The Agreement Workflow Tool (AWT) is a role-based Intranet application used for processing SSA's Reimbursable Agreements according to SSA's standards. AWT provides...

  2. Research on Configurable Workflow Technology in PDM System

    Directory of Open Access Journals (Sweden)

    Cao Kang

    2016-01-01

Full Text Available This paper analyzed the deficiencies of static process management in the traditional PDM workflow system, and proposed a new configurable workflow system model addressing the dynamic and varied characteristics of equipment product design processes. On the basis of the PDM workflow engine, the model introduced configurable workflow forms and logic-processing programs, an overall workflow template definition, and a customized workflow participant selection module, and put forward the key technology solutions. This model solves the lack of flexibility in workflow design and maintenance. It has been successfully applied to the design management process for multiple equipment products, and has improved the dynamic process management capability of the PDM system.
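
    The two configurable elements highlighted above, an overall workflow template and customized participant selection, can be sketched as follows; the class, role, and step names are invented for illustration and do not reflect the paper's implementation.

```python
# Sketch of a configurable workflow template whose steps and participant-
# selection rules are set at deployment time rather than hard-coded.

class WorkflowTemplate:
    def __init__(self, name, steps):
        self.name = name
        self.steps = list(steps)          # ordered step names
        self.participants = {}            # step -> selection rule (callable)

    def assign(self, step, rule):
        self.participants[step] = rule

    def instantiate(self, product, people):
        # Resolve each step's participants for one concrete design process.
        return {
            step: [p for p in people if self.participants[step](p, product)]
            for step in self.steps
        }

template = WorkflowTemplate("equipment-design", ["draft", "review", "release"])
template.assign("draft",   lambda p, prod: p["role"] == "designer")
template.assign("review",  lambda p, prod: p["role"] == "checker" and prod in p["products"])
template.assign("release", lambda p, prod: p["role"] == "manager")

people = [
    {"name": "Li",   "role": "designer", "products": ["pump"]},
    {"name": "Chen", "role": "checker",  "products": ["pump", "valve"]},
    {"name": "Wang", "role": "manager",  "products": []},
]

for step, assignees in template.instantiate("pump", people).items():
    print(step, "->", [p["name"] for p in assignees])
```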

  3. Metaworkflows and Workflow Interoperability for Heliophysics

    Science.gov (United States)

    Pierantoni, Gabriele; Carley, Eoin P.

    2014-06-01

Heliophysics is a relatively new branch of physics that investigates the relationship between the Sun and the other bodies of the solar system. To investigate such relationships, heliophysicists can rely on various tools developed by the community. Some of these tools are on-line catalogues that list events (such as Coronal Mass Ejections, CMEs) and their characteristics as they were observed on the surface of the Sun or on the other bodies of the Solar System. Other tools offer on-line data analysis and access to images and data catalogues. During their research, heliophysicists often perform investigations that need to coordinate several of these services and to repeat these complex operations until the phenomena under investigation are fully analyzed. Heliophysicists combine the results of these services; this service orchestration is best suited to workflows. This approach has been investigated in the HELIO project. The HELIO project developed an infrastructure for a Virtual Observatory for Heliophysics and implemented service orchestration using TAVERNA workflows. HELIO developed a set of workflows that proved to be useful but lacked flexibility and re-usability. The TAVERNA workflows also needed to be executed directly in the TAVERNA workbench, which forced all users to learn how to use the workbench. Within the SCI-BUS and ER-FLOW projects, we have started an effort to re-think and re-design the heliophysics workflows with the aim of fostering re-usability and ease of use. We base our approach on two key concepts: that of meta-workflows and that of workflow interoperability. We have divided the produced workflows into three different layers. The first layer is Basic Workflows, developed both in the TAVERNA and WS-PGRADE languages. They are building blocks that users compose to address their scientific challenges. They implement well-defined Use Cases that usually involve only one service. The second layer is Science Workflows, usually developed in TAVERNA. They

  4. Parametric Room Acoustic workflows with real-time acoustic simulation

    DEFF Research Database (Denmark)

    Parigi, Dario

    2017-01-01

The paper investigates and assesses the opportunities that real-time acoustic simulation offers to engage in parametric acoustics workflows and to influence architectural designs from early design stages.

  5. Implementing Workflow Reconfiguration in WS-BPEL

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Dragoni, Nicola; Zhou, Mu

    2012-01-01

This paper investigates the problem of dynamic reconfiguration by means of a workflow-based case study used for discussion. We state the requirements on a system implementing the workflow and its reconfiguration, and we describe the system's design in BPMN. WS-BPEL, a language that would not naturally support dynamic change, is used as a target for implementation. The WS-BPEL recovery framework is here exploited to implement the reconfiguration using principles derived from previous research in process algebra, and two mappings from BPMN to WS-BPEL are presented, one automatic and only mostly...

  6. Provenance in bioinformatics workflows.

    Science.gov (United States)

    de Paula, Renato; Holanda, Maristela; Gomes, Luciana S A; Lifschitz, Sergio; Walter, Maria Emilia M T

    2013-01-01

In this work, we used the PROV-DM model to manage data provenance in workflows of genome projects. This provenance model allows the storage of details of one workflow execution, e.g., raw and produced data and computational tools, their versions and parameters. Using this model, biologists can access details of one particular execution of a workflow, compare results produced by different executions, and plan new experiments more efficiently. In addition, a provenance simulator was created, which facilitates the inclusion of provenance data from one genome project workflow execution. Finally, we discuss a case study, which aims to identify genes involved in specific metabolic pathways of Bacillus cereus, as well as to compare this isolate with other phylogenetically related bacteria from the Bacillus group. B. cereus is an extremophilic bacterium, collected in warm water in the Midwestern Region of Brazil, its DNA samples having been sequenced with an NGS machine.
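
    A toy version of this capture-and-compare idea is sketched below; the field names only loosely echo PROV-DM concepts (activity, agent, used/generated entities), and the decorator-based store is an assumption for illustration, not the authors' implementation.

```python
# Toy sketch of provenance capture for a workflow step: each execution
# records its inputs, outputs, tool name/version, and parameters so that
# two runs can later be compared.

import datetime
import functools

PROVENANCE = []   # stand-in for a provenance store

def provenance(tool, version):
    def wrap(fn):
        @functools.wraps(fn)
        def run(*args, **params):
            result = fn(*args, **params)
            PROVENANCE.append({
                "activity": fn.__name__,
                "agent": {"tool": tool, "version": version},
                "used": list(args),             # raw data consumed
                "parameters": dict(params),
                "generated": result,            # data produced
                "endedAtTime": datetime.datetime.now().isoformat(),
            })
            return result
        return run
    return wrap

@provenance(tool="toy-assembler", version="1.2")
def assemble(reads_file, k=31):
    return f"contigs({reads_file},k={k})"

assemble("bcereus_reads.fastq", k=25)
assemble("bcereus_reads.fastq", k=31)

# Comparing executions of the same activity with different parameters:
for rec in PROVENANCE:
    print(rec["activity"], rec["parameters"], "->", rec["generated"])
```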

  7. GeNeDA: An Open-Source Workflow for Design Automation of Gene Regulatory Networks Inspired from Microelectronics.

    Science.gov (United States)

    Madec, Morgan; Pecheux, François; Gendrault, Yves; Rosati, Elise; Lallement, Christophe; Haiech, Jacques

    2016-10-01

The topic of this article is the development of an open-source automated design framework for synthetic biology, specifically for the design of artificial gene regulatory networks based on a digital approach. In contrast to other tools, GeNeDA is an open-source online tool based on existing microelectronics tools that have proven their efficiency over the last 30 years. The complete framework is composed of a computation core directly adapted from an Electronic Design Automation tool, input and output interfaces, a library of elementary parts that can be achieved with gene regulatory networks, and an interface with an electrical circuit simulator. Each of these modules is an extension of microelectronics tools and concepts: ODIN II, ABC, the Verilog language, the SPICE simulator, and SystemC-AMS. GeNeDA is first validated on a benchmark of several combinatorial circuits. The results highlight the importance of the part library. Then, the framework is used for the design of a sequential circuit including a biological state machine.

  8. Toward a Visualization-Supported Workflow for Cyber Alert Management using Threat Models and Human-Centered Design

    Energy Technology Data Exchange (ETDEWEB)

    Franklin, Lyndsey; Pirrung, Megan A.; Blaha, Leslie M.; Dowling, Michelle V.; Feng, Mi

    2017-10-09

Cyber network analysts follow complex processes in their investigations of potential threats to their network. Much research is dedicated to providing automated tool support in the effort to make their tasks more efficient, accurate, and timely. This tool support comes in a variety of implementations, from machine learning algorithms that monitor streams of data to visual analytic environments for exploring rich and noisy data sets. Cyber analysts, however, often speak of a need for tools that help them merge the data they already have and help them establish appropriate baselines against which to compare potential anomalies. Furthermore, existing threat models that cyber analysts regularly use to structure their investigations are not often leveraged in support tools. We report on our work with cyber analysts to understand their analytic process and how one such model, the MITRE ATT&CK Matrix [32], is used to structure their analytic thinking. We present our efforts to map specific data needed by analysts into the threat model to inform our eventual visualization designs. We examine the data mapping for gaps where the threat model is under-supported by either data or tools, and discuss these gaps as potential design spaces for future research efforts. We also discuss the design of a prototype tool that combines machine-learning and visualization components to support cyber analysts working with this threat model.
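
    The data-mapping exercise can be sketched as a simple coverage computation. The ATT&CK technique identifiers below are real, but the mapping from data sources to techniques is invented for illustration and is not taken from MITRE or from the paper.

```python
# Hedged sketch: record which data sources are actually available against
# the techniques to be detected, and report techniques left uncovered.

# data source available on the network -> techniques it can help detect
coverage = {
    "process_creation_logs": {"T1059", "T1204"},   # scripting, user execution
    "netflow":               {"T1071"},            # application-layer C2
}

techniques_of_interest = {
    "T1059": "Command and Scripting Interpreter",
    "T1071": "Application Layer Protocol",
    "T1204": "User Execution",
    "T1003": "OS Credential Dumping",
}

covered = set().union(*coverage.values())
gaps = {t: name for t, name in techniques_of_interest.items() if t not in covered}

print("covered:", sorted(covered))
print("gaps needing new data or tools:", gaps)   # {'T1003': 'OS Credential Dumping'}
```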

  9. Designing an architectural style for dynamic medical Cross-Organizational Workflow management system: an approach based on agents and web services.

    Science.gov (United States)

    Bouzguenda, Lotfi; Turki, Manel

    2014-04-01

This paper shows how the combined use of agent and web services technologies can help to design an architectural style for a dynamic medical Cross-Organizational Workflow (COW) management system. Medical COW aims at supporting the collaboration between several autonomous and possibly heterogeneous medical processes, distributed over different organizations (hospitals, clinics or laboratories). Dynamic medical COW refers to occasional cooperation between these health organizations, free of structural constraints, where the medical partners involved and their number are not pre-defined. More precisely, this paper proposes a new architectural style based on agent and web services technologies to deal with two key coordination issues of dynamic COW: finding medical partners and negotiating between them. It also shows how the proposed dynamic medical COW management system can connect to a multi-agent system coupling a Clinical Decision Support System (CDSS) with Computerized Prescriber Order Entry (CPOE). The idea is to assist health professionals such as doctors, nurses and pharmacists with decision-making tasks, such as determining a diagnosis or analyzing patient data, without stopping their clinical processes, so that they act in a coherent way and give care to the patient.

  10. Enabling Structured Exploration of Workflow Performance Variability in Extreme-Scale Environments

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin; Stephan, Eric G.; Raju, Bibi; Altintas, Ilkay; Elsethagen, Todd O.; Krishnamoorthy, Sriram

    2015-11-15

Workflows are taking an increasingly important role in orchestrating complex scientific processes in extreme-scale and highly heterogeneous environments. However, to date we cannot reliably predict, understand, and optimize workflow performance. Sources of performance variability, and in particular the interdependencies of workflow design, execution environment, and system architecture, are not well understood. While there is a rich portfolio of tools for performance analysis, modeling, and prediction for single applications in homogeneous computing environments, these are not applicable to workflows, due to the number and heterogeneity of the involved workflow and system components and their strong interdependencies. In this paper, we investigate workflow performance goals and identify factors that could have a relevant impact. Based on our analysis, we propose a new workflow performance provenance ontology, the Open Provenance Model-based WorkFlow Performance Provenance, or OPM-WFPP, that will enable the empirical study of workflow performance characteristics and variability, including complex source attribution.

  11. Fluent Logic Workflow Analyser: A Tool for The Verification of Workflow Properties

    Directory of Open Access Journals (Sweden)

    Germán Regis

    2014-01-01

Full Text Available In this paper we present the design and implementation, as well as a use case, of a tool for workflow analysis. The tool provides an assistant for the specification of properties of a workflow model. The specification language for property description is Fluent Linear Time Temporal Logic. Fluents provide adequate flexibility for capturing properties of workflows. Both the model and the properties are encoded, in an automated way, as Labelled Transition Systems, and the analysis is reduced to model checking.
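
    The fluent idea can be shown on a single trace; the sketch below checks one fluent over a linear run rather than exploring a full Labelled Transition System as the tool's model checker does, and all event names are invented.

```python
# Minimal sketch of a fluent: a property switched on by initiating events
# and off by terminating ones, evaluated along one execution trace.

from dataclasses import dataclass

@dataclass(frozen=True)
class Fluent:
    initiating: frozenset
    terminating: frozenset
    initially: bool = False

def holds_at(fluent, trace):
    """Yield the fluent's truth value after each event of the trace."""
    value = fluent.initially
    for event in trace:
        if event in fluent.initiating:
            value = True
        elif event in fluent.terminating:
            value = False
        yield event, value

# Fluent "DocumentLocked": set by checkout, cleared by checkin/abort.
locked = Fluent(frozenset({"checkout"}), frozenset({"checkin", "abort"}))

trace = ["create", "checkout", "edit", "checkin", "publish"]

# A simple safety check in lieu of full model checking:
# "publish" must never happen while DocumentLocked holds.
for event, value in holds_at(locked, trace):
    assert not (event == "publish" and value), "published a locked document"
    print(f"after {event:<9} DocumentLocked={value}")
```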

  12. Development of a Theoretical Monitoring System Design for a HLW Repository Based on the 'MoDeRn Monitoring Workflow' (A Case Study) - 12044

    Energy Technology Data Exchange (ETDEWEB)

    Jobmann, M. [DBE TECHNOLOGY GmbH, Eschenstrasse 55, D-31224 Peine (Germany); Schroeder, T.J. [Nuclear Research and Consultancy Group - NRG, P.O. Box 25, NL-1755 ZG Petten (Netherlands); White, M. [Galson Sciences Limited, 5 Grosvenor House, Melton Road, Oakham, LE15 6AX (United Kingdom)

    2012-07-01

In this paper, a generic German disposal concept in rock salt is used as an example to discuss the design of a repository monitoring system. The approach used is based on a generic structured approach to monitoring - the MoDeRn Monitoring Workflow - which is being developed and tested as part of an on-going European Commission Seventh Framework project. As a first step in the study, the requirements on the monitoring program were identified through consideration of the national context, including regulatory guidelines, host rock properties and the waste to be disposed of. These are stated as general monitoring objectives. An analysis of the German safety concept for the safe confinement of the radioactive waste allows these general objectives to be converted into specific sub-objectives, and the sub-objectives to be related to specific monitoring processes and parameters. The safety concept identifies the key safety components, each with specific associated safety functions. The safety functions can be related to the list of features, events and processes (FEPs) that contains all processes relevant to the future repository evolution. By screening the FEP list, all processes that can potentially affect the safety functions were identified. In the next step, the parameters that would be affected by the individual processes were determined, leading to a preliminary list of parameters to be monitored. By evaluating available techniques and monitoring equipment, this preliminary list was assessed with respect to its technical feasibility at the intended locations. Prior to final system selection, potential impacts of the monitoring system on safety or other measurements are evaluated. To avoid creating potential pathways for fluids that might compromise the integrity of a barrier, consideration was given to wireless data-transmission systems and to techniques for autonomous, long-term power supply. (authors)
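
    The FEP-screening step lends itself to a short sketch; the FEPs, safety functions, and parameters below are invented examples rather than entries from the actual FEP catalogue used in the study.

```python
# Sketch of the screening step: walk a FEP list, keep the FEPs that can
# affect a safety function, and collect the parameters they influence into
# the preliminary monitoring list.

feps = [
    {"fep": "shaft seal corrosion", "affects": {"seal integrity"},
     "parameters": {"humidity", "chloride concentration"}},
    {"fep": "salt creep", "affects": {"backfill compaction"},
     "parameters": {"rock deformation", "backfill pressure"}},
    {"fep": "microbial gas generation", "affects": set(),   # screened out
     "parameters": {"gas pressure"}},
]

safety_functions = {"seal integrity", "backfill compaction"}

monitoring_parameters = set()
for fep in feps:
    if fep["affects"] & safety_functions:   # FEP can impair a safety function
        monitoring_parameters |= fep["parameters"]

# The preliminary list then goes to the technical-feasibility evaluation.
print(sorted(monitoring_parameters))
```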

  13. Tavaxy: integrating Taverna and Galaxy workflows with cloud computing support.

    Science.gov (United States)

    Abouelhoda, Mohamed; Issa, Shadi Alaa; Ghanem, Moustafa

    2012-05-04

    Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on using an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-) workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and complexity of analysis.The system can be accessed either through a
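
    The two concepts the record credits for interoperability, workflow patterns and hierarchical (sub-)workflows, can be sketched generically: a pattern is a combinator over steps, and a composed workflow is itself usable as a step. The Python below illustrates those concepts only; it is not Tavaxy's API, and the step names are hypothetical.

    ```
    from concurrent.futures import ThreadPoolExecutor

    def sequence(*steps):
        """Sequence pattern: run steps one after another."""
        def run(data):
            for step in steps:
                data = step(data)
            return data
        return run

    def parallel_split_join(branches, join):
        """Parallel split/synchronization pattern."""
        def run(data):
            with ThreadPoolExecutor() as pool:
                results = list(pool.map(lambda branch: branch(data), branches))
            return join(results)
        return run

    # Hypothetical steps standing in for Taverna/Galaxy sub-workflows.
    fetch = lambda q: f"sequences({q})"
    blast = lambda s: f"blast({s})"
    hmmer = lambda s: f"hmmer({s})"
    merge = " | ".join

    sub_workflow = parallel_split_join([blast, hmmer], merge)  # usable as one step
    pipeline = sequence(fetch, sub_workflow)
    print(pipeline("BRCA1"))
    ```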

  14. Tavaxy: Integrating Taverna and Galaxy workflows with cloud computing support

    Directory of Open Access Journals (Sweden)

    Abouelhoda Mohamed

    2012-05-01

    Full Text Available Abstract Background Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. Results In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on using an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Conclusions Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-)workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and

  15. Workflows for Full Waveform Inversions

    Science.gov (United States)

    Boehm, Christian; Krischer, Lion; Afanasiev, Michael; van Driel, Martin; May, Dave A.; Rietmann, Max; Fichtner, Andreas

    2017-04-01

    Despite many theoretical advances and the increasing availability of high-performance computing clusters, full seismic waveform inversions still face considerable challenges regarding data and workflow management. While the community has access to solvers which can harness modern heterogeneous computing architectures, the computational bottleneck has fallen to these often manpower-bounded issues that need to be overcome to facilitate further progress. Modern inversions involve huge amounts of data and require a tight integration between numerical PDE solvers, data acquisition and processing systems, nonlinear optimization libraries, and job orchestration frameworks. To this end we created a set of libraries and applications revolving around Salvus (http://salvus.io), a novel software package designed to solve large-scale full waveform inverse problems. This presentation focuses on solving passive source seismic full waveform inversions from local to global scales with Salvus. We discuss (i) design choices for the aforementioned components required for full waveform modeling and inversion, (ii) their implementation in the Salvus framework, and (iii) how it is all tied together by a usable workflow system. We combine state-of-the-art algorithms ranging from high-order finite-element solutions of the wave equation to quasi-Newton optimization algorithms using trust-region methods that can handle inexact derivatives. All is steered by an automated interactive graph-based workflow framework capable of orchestrating all necessary pieces. This naturally facilitates the creation of new Earth models and hopefully sparks new scientific insights. Additionally, and even more importantly, it enhances reproducibility and reliability of the final results.
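
    The optimization stage of such a workflow can be pictured as a quasi-Newton loop driven by a misfit value and its gradient. The toy sketch below uses SciPy's L-BFGS-B in place of the trust-region machinery, and a quadratic misfit in place of the wave-equation solves Salvus performs; it illustrates the loop structure only.

    ```
    import numpy as np
    from scipy.optimize import minimize

    observed = np.array([1.0, -0.5, 2.0])  # stand-in for recorded seismograms

    def misfit_and_gradient(model):
        synthetic = model  # a real workflow would solve the wave equation here
        residual = synthetic - observed
        return 0.5 * residual @ residual, residual  # L2 misfit and its gradient

    result = minimize(misfit_and_gradient, x0=np.zeros(3),
                      jac=True, method="L-BFGS-B")
    print(result.x)  # recovers the "observed" model in this toy setting
    ```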

  16. Business and scientific workflows a web service-oriented approach

    CERN Document Server

    Tan, Wei

    2013-01-01

    Focuses on how to use web service computing and service-based workflow technologies to develop timely, effective workflows for both business and scientific fields Utilizing web computing and Service-Oriented Architecture (SOA), Business and Scientific Workflows: A Web Service-Oriented Approach focuses on how to design, analyze, and deploy web service-based workflows for both business and scientific applications in many areas of healthcare and biomedicine. It also discusses and presents the recent research and development results. This informative reference features app

  17. Automated data reduction workflows for astronomy. The ESO Reflex environment

    Science.gov (United States)

    Freudling, W.; Romaniello, M.; Bramich, D. M.; Ballester, P.; Forchi, V.; García-Dabló, C. E.; Moehler, S.; Neeser, M. J.

    2013-11-01

    Context. Data from complex modern astronomical instruments often consist of a large number of different science and calibration files, and their reduction requires a variety of software tools. The execution chain of the tools represents a complex workflow that needs to be tuned and supervised, often by individual researchers that are not necessarily experts in any specific instrument. Aims: The efficiency of data reduction can be improved by using automatic workflows to organise data and execute a sequence of data reduction steps. To realize such efficiency gains, we designed a system that allows intuitive representation, execution and modification of the data reduction workflow, and has facilities for inspection and interaction with the data. Methods: The European Southern Observatory (ESO) has developed Reflex, an environment to automate data reduction workflows. Reflex is implemented as a package of customized components for the Kepler workflow engine. Kepler provides the graphical user interface to create an executable flowchart-like representation of the data reduction process. Key features of Reflex are a rule-based data organiser, infrastructure to re-use results, thorough book-keeping, data progeny tracking, interactive user interfaces, and a novel concept to exploit information created during data organisation for the workflow execution. Results: Automated workflows can greatly increase the efficiency of astronomical data reduction. In Reflex, workflows can be run non-interactively as a first step. Subsequent optimization can then be carried out while transparently re-using all unchanged intermediate products. We found that such workflows enable the reduction of complex data by non-expert users and minimize mistakes due to book-keeping errors. Conclusions: Reflex includes novel concepts to increase the efficiency of astronomical data processing. While Reflex is a specific implementation of astronomical scientific workflows within the Kepler workflow
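
    The rule-based data organiser is the most transferable idea here: files are classified by header keywords before the workflow runs. The snippet below sketches that mechanism with invented header fields and categories; it is not ESO's classification rule set.

    ```
    rules = [
        ("BIAS",    lambda h: h.get("OBJECT") == "BIAS"),
        ("FLAT",    lambda h: h.get("OBJECT") == "FLAT" and h.get("LAMP") == "on"),
        ("SCIENCE", lambda h: h.get("OBJECT") not in {"BIAS", "FLAT", None}),
    ]

    def classify(header):
        # First matching rule wins; unmatched files are flagged for inspection.
        for category, predicate in rules:
            if predicate(header):
                return category
        return "UNKNOWN"

    files = [
        {"name": "a.fits", "OBJECT": "BIAS"},
        {"name": "b.fits", "OBJECT": "FLAT", "LAMP": "on"},
        {"name": "c.fits", "OBJECT": "NGC1365"},
    ]
    for f in files:
        print(f["name"], "->", classify(f))
    ```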

  18. The PBase Scientific Workflow Provenance Repository

    Directory of Open Access Journals (Sweden)

    Víctor Cuevas-Vicenttín

    2014-10-01

    Full Text Available Scientific workflows and their supporting systems are becoming increasingly popular for compute-intensive and data-intensive scientific experiments. The advantages scientific workflows offer include rapid and easy workflow design, software and data reuse, scalable execution, sharing and collaboration, and other advantages that altogether facilitate “reproducible science”. In this context, provenance – information about the origin, context, derivation, ownership, or history of some artifact – plays a key role, since scientists are interested in examining and auditing the results of scientific experiments. However, in order to perform such analyses on scientific results as part of extended research collaborations, an adequate environment and tools are required. Concretely, the need arises for a repository that will facilitate the sharing of scientific workflows and their associated execution traces in an interoperable manner, also enabling querying and visualization. Furthermore, such functionality should be supported while taking performance and scalability into account. With this purpose in mind, we introduce PBase: a scientific workflow provenance repository implementing the ProvONE proposed standard, which extends the emerging W3C PROV standard for provenance data with workflow specific concepts. PBase is built on the Neo4j graph database, thus offering capabilities such as declarative and efficient querying. Our experiences demonstrate the power gained by supporting various types of queries for provenance data. In addition, PBase is equipped with a user friendly interface tailored for the visualization of scientific workflow provenance data, making the specification of queries and the interpretation of their results easier and more effective.
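
    To make the "declarative and efficient querying" concrete, the sketch below issues a provenance query through the official Neo4j Python driver. The connection details, node labels and relationship types are assumptions for illustration; PBase's actual ProvONE-based schema will differ.

    ```
    from neo4j import GraphDatabase

    driver = GraphDatabase.driver("bolt://localhost:7687",
                                  auth=("neo4j", "password"))

    # Hypothetical schema: which artifacts did a given workflow run generate?
    query = """
    MATCH (e:Entity)<-[:GENERATED]-(run:Execution {workflow: $wf})
    RETURN e.name AS artifact, run.started AS started
    ORDER BY started
    """

    with driver.session() as session:
        for record in session.run(query, wf="montage-run-42"):
            print(record["artifact"], record["started"])
    driver.close()
    ```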

  19. On the Evaluation of Workflow Systems in Business Processes

    NARCIS (Netherlands)

    Choenni, R.S.; Bakker, R; Baets, W.R.J.

    2003-01-01

    Although it is widely accepted that workflow systems add value to business processes, no substantial research has been reported in the literature that confirms this. Most of the efforts in the field of workflow systems are devoted to issues that are relevant to the design and implementation of this

  20. A standard-enabled workflow for synthetic biology

    KAUST Repository

    Myers, Chris J.

    2017-06-15

    A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools to be connected from a diversity of sources. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce types of design information, including multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in their publications.

  1. Insightful Workflow For Grid Computing

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Charles Earl

    2008-10-09

    We developed a workflow adaptation and scheduling system for Grid workflows. The system currently interfaces with and uses the Karajan workflow system. We developed machine learning agents that provide the planner/scheduler with information needed to make decisions about when and how to replan. The Kubrick system restructures workflows at runtime, making it unique among workflow scheduling systems. The existing Kubrick system provides a platform on which to integrate additional quality of service constraints and in which to explore the use of an ensemble of scheduling and planning algorithms. This will be the principal thrust of our Phase II work.

  2. Design and development of a mobile computer application to reengineer workflows in the hospital and the methodology to evaluate its effectiveness.

    Science.gov (United States)

    Holzinger, Andreas; Kosec, Primoz; Schwantzer, Gerold; Debevc, Matjaz; Hofmann-Wellenhof, Rainer; Frühauf, Julia

    2011-12-01

    This paper describes a new method of collecting additional data for the purpose of skin cancer research from the patients in the hospital using the system Mobile Computing in Medicine Graz (MoCoMed-Graz). This system departs from the traditional paper-based questionnaire data collection methods and implements a new composition of evaluation methods to demonstrate its effectiveness. The patients fill out a questionnaire on a Tablet-PC (or iPad Device) and the resulting medical data is integrated into the electronic patient record for display when the patient enters the doctor's examination room. Since the data is now part of the electronic patient record, the doctor can discuss the data together with the patient making corrections or completions where necessary, thus enhancing data quality and patient empowerment. A further advantage is that all questionnaires are in the system at the end of the day - and manual entry is no longer necessary - consequently raising data completeness. The front end was developed using a User Centered Design Process for touch tablet computers and transfers the data in XML to the SAP based enterprise hospital information system. The system was evaluated at the Graz University Hospital - where about 30 outpatients consult the pigmented lesion clinic each day - following Bronfenbrenner's three level perspective: The microlevel, the mesolevel and the macrolevel: On the microlevel, the questions answered by 194 outpatients, evaluated with the System Usability Scale (SUS) resulted in a median of 97.5 (min: 50, max: 100) which showed that it is easy to use. On the mesolevel, the time spent by medical doctors was measured before and after the implementation of the system; the medical task performance time of 20 doctors (age median 43 (min: 29; max: 50)) showed a reduction of 90%. On the macrolevel, a cost model was developed to show how much money can be saved by the hospital management. This showed that, for an average of 30 patients per day
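
    The usability figure quoted above comes from the System Usability Scale, whose scoring rule is standard and easy to reproduce: odd-numbered items contribute (response - 1), even-numbered items (5 - response), and the sum is scaled by 2.5 onto a 0-100 range. The responses below are made-up examples, not the study's data.

    ```
    def sus_score(responses):
        """Score ten 1-5 Likert responses with the standard SUS formula."""
        assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
        odd = sum(responses[i] - 1 for i in range(0, 10, 2))   # items 1,3,5,7,9
        even = sum(5 - responses[i] for i in range(1, 10, 2))  # items 2,4,6,8,10
        return 2.5 * (odd + even)

    print(sus_score([5, 1, 5, 1, 5, 1, 5, 2, 4, 1]))  # 95.0, a very usable system
    ```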

  3. Monitoring of Grid Scientific Workflows

    Directory of Open Access Journals (Sweden)

    Bartosz Balis

    2008-01-01

    Full Text Available Scientific workflows are a means of conducting in silico experiments in modern computing infrastructures for e-Science, often built on top of Grids. Monitoring of Grid scientific workflows is essential not only for performance analysis but also to collect provenance data and gather feedback useful in future decisions, e.g., related to optimization of resource usage. In this paper, basic problems related to monitoring of Grid scientific workflows are discussed. Being highly distributed, loosely coupled in space and time, heterogeneous, and heavily using legacy codes, workflows are exceptionally challenging from the monitoring point of view. We propose a Grid monitoring architecture for scientific workflows. Monitoring data correlation problem is described and an algorithm for on-line distributed collection of monitoring data is proposed. We demonstrate a prototype implementation of the proposed workflow monitoring architecture, the GEMINI monitoring system, and its use for monitoring of a real-life scientific workflow.

  4. Semantic Workflows and Provenance-Aware Software (Invited)

    Science.gov (United States)

    Gil, Y.

    2013-12-01

    Workflows are increasingly used in science to manage complex computations and data processing at large scale. Intelligent workflow systems provide assistance in setting up parameters and data, validating workflows created by users, and automating the generation of workflows from high-level user guidance. These systems use semantic workflows that extend workflow representations with semantic constraints that express characteristics of the data and analytic models. Reasoning algorithms propagate these semantic constraints throughout the workflow structure, select executable components for underspecified steps, and suggest parameter values. Semantic workflows also enhance provenance records with abstract steps that reflect the overall data analysis method rather than just execution traces. Intelligent workflow systems are provenance-aware, since they both use and generate provenance and metadata as the data is being processed. Provenance-aware software enhances scientific analysis by propagating upstream metadata and provenance to new data products. Through the use of provenance standards, such as the recent W3C PROV recommendation for provenance on the Web, provenance-aware software can significantly enhance scientific data analysis, publication, and reuse. New capabilities are enabled when provenance is brought to the forefront in the design of software systems for science.

  5. Implementing and Running a Workflow Application on Cloud Resources

    Directory of Open Access Journals (Sweden)

    Gabriela Andreea MORAR

    2011-01-01

    Full Text Available Scientists need to run applications that are time and resource consuming, but not all of them have the required knowledge to run these applications in a parallel manner, by using grid, cluster or cloud resources. In the past few years many workflow building frameworks were developed in order to help scientists take better advantage of computing resources, by designing workflows based on their applications and executing them on heterogeneous resources. This paper presents a case study of implementing and running a workflow for an eBay data retrieval application. The workflow was designed using the Askalon framework and executed on cloud resources. The purpose of this paper is to demonstrate how workflows and cloud resources can be used by scientists in order to achieve speedup for their applications without the need to spend large amounts of money on computational resources.

  6. Mining workflow processes from distributed workflow enactment event logs

    OpenAIRE

    Kwanghoon Pio Kim

    2012-01-01

    Workflow management systems help to execute, monitor and manage work process flow and execution. These systems, as they are executing, keep a record of who does what and when (e.g. log of events). The activity of using computer software to examine these records, and deriving various structural data results is called workflow mining. The workflow mining activity, in general, needs to encompass behavioral (process/control-flow), social, informational (data-flow), and organizational perspectives...
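
    The control-flow side of workflow mining starts from the directly-follows relation over the event log, from which algorithms such as the alpha miner derive a process model. A minimal sketch, with a fabricated log:

    ```
    from collections import Counter

    # Each trace is the ordered activity list of one case (invented data).
    log = [
        ["register", "check", "approve", "archive"],
        ["register", "check", "reject", "archive"],
        ["register", "check", "approve", "archive"],
    ]

    directly_follows = Counter(
        (a, b) for trace in log for a, b in zip(trace, trace[1:])
    )
    for (a, b), count in directly_follows.most_common():
        print(f"{a} -> {b}: {count}")
    ```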

  7. Workflow patterns the definitive guide

    CERN Document Server

    Russell, Nick; ter Hofstede, Arthur H M

    2016-01-01

    The study of business processes has emerged as a highly effective approach to coordinating an organization's complex service- and knowledge-based activities. The growing field of business process management (BPM) focuses on methods and tools for designing, enacting, and analyzing business processes. This volume offers a definitive guide to the use of patterns, which synthesize the wide range of approaches to modeling business processes. It provides a unique and comprehensive introduction to the well-known workflow patterns collection -- recurrent, generic constructs describing common business process modeling and execution scenarios, presented in the form of problem-solution dialectics. The underlying principles of the patterns approach ensure that they are independent of any specific enabling technology, representational formalism, or modeling approach, and thus broadly applicable across the business process modeling and business process technology domains. The authors, drawing on extensive research done by...

  8. Data Workflow - A Workflow Model for Continuous Data Processing

    NARCIS (Netherlands)

    Wombacher, Andreas

    2010-01-01

    Online data or streaming data are getting more and more important for enterprise information systems, e.g. by integrating sensor data and workflows. The continuous flow of data provided e.g. by sensors requires new workflow models addressing the data perspective of these applications, since

  9. Radiology information system: a workflow-based approach.

    Science.gov (United States)

    Zhang, Jinyan; Lu, Xudong; Nie, Hongchao; Huang, Zhengxing; van der Aalst, W M P

    2009-09-01

    Introducing workflow management technology in healthcare seems promising for dealing with the problem that current healthcare information systems cannot provide sufficient support for process management, although several challenges still exist. The purpose of this paper is to study the method of developing a workflow-based information system in a radiology department as a use case. First, a workflow model of the typical radiology process was established. Second, based on the model, the system could be designed and implemented as a group of loosely coupled components. Each component corresponded to one task in the process and could be assembled by the workflow management system. The legacy systems could be taken as special components, which also corresponded to the tasks and were integrated through transferring non-workflow-aware interfaces to the standard ones. Finally, a workflow dashboard was designed and implemented to provide an integral view of radiology processes. The workflow-based Radiology Information System was deployed in the radiology department of Zhejiang Chinese Medicine Hospital in China. The results showed that it could be adjusted flexibly in response to the needs of changing processes, and enhance process management in the department. It can also provide a more workflow-aware integration method, compared with other methods such as IHE-based ones. The workflow-based approach is a new method of developing radiology information systems with more flexibility, more functionalities of process management and more workflow-aware integration. The work of this paper is an initial endeavor for introducing workflow management technology in healthcare.

  10. Patient-centered care requires a patient-oriented workflow model.

    Science.gov (United States)

    Ozkaynak, Mustafa; Brennan, Patricia Flatley; Hanauer, David A; Johnson, Sharon; Aarts, Jos; Zheng, Kai; Haque, Saira N

    2013-06-01

    Effective design of health information technology (HIT) for patient-centered care requires consideration of workflow from the patient's perspective, termed 'patient-oriented workflow.' This approach organizes the building blocks of work around the patients who are moving through the care system. Patient-oriented workflow complements the more familiar clinician-oriented workflow approaches, and offers several advantages, including the ability to capture simultaneous, cooperative work, which is essential in care delivery. Patient-oriented workflow models can also provide an understanding of healthcare work taking place in various formal and informal health settings in an integrated manner. We present two cases demonstrating the potential value of patient-oriented workflow models. Significant theoretical, methodological, and practical challenges must be met to ensure adoption of patient-oriented workflow models. Patient-oriented workflow models define meaningful system boundaries and can lead to HIT implementations that are more consistent with cooperative work and its emergent features.

  11. Monitoring of Grid scientific workflows

    NARCIS (Netherlands)

    Balis, B.; Bubak, M.; Łabno, B.

    2008-01-01

    Scientific workflows are a means of conducting in silico experiments in modern computing infrastructures for e-Science, often built on top of Grids. Monitoring of Grid scientific workflows is essential not only for performance analysis but also to collect provenance data and gather feedback useful

  12. Metadata Standards and Workflow Systems

    Science.gov (United States)

    Habermann, T.

    2012-12-01

    All modern workflow systems include mechanisms for recording inputs, outputs and processes. These descriptions can include details required to reproduce the workflows exactly and, in some cases, can include virtual images of the hardware and operating system. There are several on-going and emerging standards for representing these detailed workflows including the Open Provenance Model (OPM) and the W3C PROV. At the same time, ISO metadata standards include a simple provenance or lineage model that includes many important elements of workflows. The ISO model could play a critical role in sharing and discovering workflow information for collections and perhaps in recording some details in granules. In order for this goal to be reached, connections between the detailed standards and ISO must be understood and conventions for using them must be developed.
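
    To illustrate the detailed-standards side of this comparison, the sketch below records one processing step as W3C PROV using the `prov` Python package (assumed installed); the namespace and identifiers are invented.

    ```
    from prov.model import ProvDocument

    doc = ProvDocument()
    doc.add_namespace("ex", "http://example.org/")

    raw = doc.entity("ex:granule-raw")
    calibrated = doc.entity("ex:granule-calibrated")
    step = doc.activity("ex:calibration-run-7")

    doc.used(step, raw)                   # the step read the raw granule
    doc.wasGeneratedBy(calibrated, step)  # and produced the calibrated one
    doc.wasDerivedFrom(calibrated, raw)   # lineage, as in ISO's simple model

    print(doc.get_provn())                # serialize in PROV-N notation
    ```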

  13. Constructing Workflows from Script Applications

    Directory of Open Access Journals (Sweden)

    Mikołaj Baranowski

    2012-01-01

    Full Text Available For programming and executing complex applications on grid infrastructures, scientific workflows have been proposed as a convenient high-level alternative to solutions based on general-purpose programming languages, APIs and scripts. GridSpace is a collaborative programming and execution environment, which is based on a scripting approach and extends the Ruby language with a high-level API for invoking operations on remote resources. In this paper we describe a tool which converts GridSpace application source code into a workflow representation which, in turn, may be used for scheduling, provenance, or visualization. We describe how we addressed the issues of analyzing Ruby source code, resolving variable and method dependencies, as well as building the workflow representation. The solutions to these problems have been developed and evaluated by testing them on complex grid application workflows such as CyberShake, Epigenomics and Montage. The evaluation is enriched by representing typical workflow control flow patterns.
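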

  14. The Future of Scientific Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Deelman, Ewa; Peterka, Tom; Altintas, Ilkay; Carothers, Christopher; Dam, Kerstin Kleese van; Moreland, Kenneth; Parashar, Manish; Ramakrishnan, Lavanya; Taufer, Michela; Vetter, Jeffery

    2018-01-01

    Today’s computational, experimental, and observational sciences rely on computations that involve many related tasks. The success of a scientific mission often hinges on the computer automation of these workflows. In April 2015, the US Department of Energy (DOE) invited a diverse group of domain and computer scientists from national laboratories supported by the Office of Science, the National Nuclear Security Administration, from industry, and from academia to review the workflow requirements of DOE’s science and national security missions, to assess the current state of the art in science workflows, to understand the impact of emerging extreme-scale computing systems on those workflows, and to develop requirements for automated workflow management in future and existing environments. This article is a summary of the opinions of over 50 leading researchers attending this workshop. We highlight use cases, computing systems, workflow needs and conclude by summarizing the remaining challenges this community sees that inhibit large-scale scientific workflows from becoming a mainstream tool for extreme-scale science.

  15. Essential Grid Workflow Monitoring Elements

    Energy Technology Data Exchange (ETDEWEB)

    Gunter, Daniel K.; Jackson, Keith R.; Konerding, David E.; Lee,Jason R.; Tierney, Brian L.

    2005-07-01

    Troubleshooting Grid workflows is difficult. A typical workflow involves a large number of components (networks, middleware, hosts, etc.) that can fail. Even when monitoring data from all these components is accessible, it is hard to tell whether failures and anomalies in these components are related to a given workflow. For the Grid to be truly usable, much of this uncertainty must be eliminated. We propose two new Grid monitoring elements, Grid workflow identifiers and consistent component lifecycle events, that will make Grid troubleshooting easier, and thus make Grids more usable, by simplifying the correlation of Grid monitoring data with a particular Grid workflow.

  16. Exploring Dental Providers' Workflow in an Electronic Dental Record Environment.

    Science.gov (United States)

    Schwei, Kelsey M; Cooper, Ryan; Mahnke, Andrea N; Ye, Zhan; Acharya, Amit

    2016-01-01

    A workflow is defined as a predefined set of work steps and a partial ordering of these steps in any environment to achieve the expected outcome. Few studies have investigated the workflow of providers in a dental office. It is important to understand the interaction of dental providers with the existing technologies at the point of care to assess breakdowns in the workflow, which could contribute to better technology designs. The study objective was to assess electronic dental record (EDR) workflows using time and motion methodology in order to identify breakdowns and opportunities for process improvement. A time and motion methodology was used to study the human-computer interaction and workflow of dental providers with an EDR in four dental centers at a large healthcare organization. A data collection tool was developed to capture the workflow of dental providers and staff while they interacted with an EDR during initial, planned, and emergency patient visits, and at the front desk. Qualitative and quantitative analysis was conducted on the observational data. Breakdowns in workflow were identified while posting charges, viewing radiographs, e-prescribing, and interacting with the patient scheduler. EDR interaction time was significantly different between dentists and dental assistants (6:20 min vs. 10:57 min, p = 0.013) and between dentists and dental hygienists (6:20 min vs. 9:36 min, p = 0.003). On average, a dentist spent far less time than dental assistants and dental hygienists in data recording within the EDR.
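
    The provider-type comparisons reported above are the kind of result a two-sample t-test over task times produces. A sketch with fabricated per-visit interaction times (in minutes), not the study's data:

    ```
    from scipy import stats

    dentists = [5.9, 6.4, 6.1, 6.8, 6.3, 6.0]
    assistants = [10.2, 11.4, 10.8, 11.1, 10.5, 11.7]

    # Welch's t-test: no equal-variance assumption between the two groups.
    t, p = stats.ttest_ind(dentists, assistants, equal_var=False)
    print(f"t = {t:.2f}, p = {p:.4f}")
    ```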

  17. Design of a Seismic Reflection Multi-Attribute Workflow for Delineating Karst Pore Systems Using Neural Networks and Statistical Dimensionality Reduction Techniques

    Science.gov (United States)

    Ebuna, D. R.; Kluesner, J.; Cunningham, K. J.; Edwards, J. H.

    2016-12-01

    An effective method for determining the approximate spatial extent of karst pore systems is critical for hydrological modeling in such environments. When using geophysical techniques, karst features are especially challenging to constrain due to their inherent heterogeneity and complex seismic signatures. We present a method for mapping these systems using three-dimensional seismic reflection data by combining applications of machine learning and modern data science. Supervised neural networks (NN) have been successfully implemented in seismic reflection studies to produce multi-attributes (or meta-attributes) for delineating faults, chimneys, salt domes, and slumps. Using a seismic reflection dataset from southeast Florida, we develop an objective multi-attribute workflow for mapping karst in which potential interpreter bias is minimized by applying linear and non-linear data transformations for dimensionality reduction. This statistical approach yields a reduced set of input seismic attributes to the NN by eliminating irrelevant and overly correlated variables, while still preserving the vast majority of the observed data variance. By initiating the supervised NN from an eigenspace that maximizes the separation between classes, the convergence time and accuracy of the computations are improved since the NN only needs to recognize small perturbations to the provided decision boundaries. We contend that this 3D seismic reflection, data-driven method for defining the spatial bounds of karst pore systems provides great value as a standardized preliminary step for hydrological characterization and modeling in these complex geological environments.
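
    The pipeline shape described, dimensionality reduction feeding a supervised neural network, can be sketched generically with scikit-learn; here PCA stands in for the paper's transformations, and synthetic data stands in for seismic attributes and karst labels.

    ```
    from sklearn.datasets import make_classification
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    # Synthetic "attributes": informative, redundant and noisy features.
    X, y = make_classification(n_samples=500, n_features=20, n_informative=6,
                               n_redundant=10, random_state=0)

    model = make_pipeline(
        StandardScaler(),
        PCA(n_components=0.95),  # keep components explaining 95% of variance
        MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0),
    )
    model.fit(X, y)
    print("training accuracy:", model.score(X, y))
    ```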

  18. Scientific workflow management in proteomics

    National Research Council Canada - National Science Library

    de Bruin, Jeroen S; Deelder, André M; Palmblad, Magnus

    2012-01-01

    .... In this article we describe the integration of a number of existing programs and tools in Taverna Workbench, a scientific workflow manager currently being developed in the bioinformatics community...

  19. Workflows for microarray data processing in the Kepler environment

    Directory of Open Access Journals (Sweden)

    Stropp Thomas

    2012-05-01

    Full Text Available Abstract Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data and therefore are close to

  20. Workflows for microarray data processing in the Kepler environment

    Science.gov (United States)

    2012-01-01

    Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or

  1. ERROR HANDLING IN INTEGRATION WORKFLOWS

    Directory of Open Access Journals (Sweden)

    Alexey M. Nazarenko

    2017-01-01

    Full Text Available Simulation experiments performed while solving multidisciplinary engineering and scientific problems require joint usage of multiple software tools. Further, when following a preset plan of experiment or searching for optimum solutions, the same sequence of calculations is run multiple times with various simulation parameters, input data, or conditions while the overall workflow does not change. Automation of simulations like these requires implementing a workflow where tool execution and data exchange is usually controlled by a special type of software, an integration environment or platform. The result is an integration workflow (a platform-dependent implementation of some computing workflow) which, in the context of automation, is a composition of weakly coupled (in terms of communication intensity) typical subtasks. These compositions can then be decomposed back into a few workflow patterns (types of subtask interaction). The patterns, in their turn, can be interpreted as higher level subtasks. This paper considers execution control and data exchange rules that should be imposed by the integration environment in the case of an error encountered by some integrated software tool. An error is defined as any abnormal behavior of a tool that invalidates its result data, thus disrupting the data flow within the integration workflow. The main requirement to the error handling mechanism implemented by the integration environment is to prevent abnormal termination of the entire workflow in case of missing intermediate results data. Error handling rules are formulated on the basic pattern level and on the level of a composite task that can combine several basic patterns as next level subtasks. The cases where workflow behavior may be different, depending on the user's purposes, when an error takes place, and possible error handling options that can be specified by the user are also noted in the work.
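
    One concrete reading of the main requirement, preventing abnormal termination of the whole workflow when a tool fails, is a bounded-retry wrapper that converts persistent failure into an explicit "missing result" signal. A sketch with a hypothetical flaky tool:

    ```
    import time

    class ToolError(Exception):
        """Abnormal tool behavior that invalidates its result data."""

    def run_with_retries(tool, data, attempts=3, delay_s=1.0):
        for attempt in range(1, attempts + 1):
            try:
                return tool(data)
            except ToolError as err:
                print(f"attempt {attempt} failed: {err}")
                time.sleep(delay_s)
        return None  # explicit missing-result signal instead of a crash

    calls = {"n": 0}
    def flaky_solver(data):
        calls["n"] += 1
        if calls["n"] < 3:
            raise ToolError("license server timeout")
        return data * 2

    print("result:", run_with_retries(flaky_solver, 21))  # 42 on the third try
    ```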

  2. Resource scheduling of workflow multi-instance migration based on the shuffled leapfrog algorithm

    Directory of Open Access Journals (Sweden)

    Yang Mingshun

    2015-01-01

    Full Text Available Purpose: When a workflow changes, resource scheduling optimization during the migration of currently running instances has become a hot issue in research on workflow flexibility; the purpose of this article is to investigate the resource scheduling problem of workflow multi-instance migration. Design/methodology/approach: The time and cost relationships between activities and resources in the workflow instance migration process are analyzed, and a resource scheduling optimization model for the process of workflow instance migration is set up. Research is performed on resource scheduling optimization in workflow multi-instance migration, and the shuffled leapfrog algorithm is adopted to obtain the optimal resource scheduling scheme. An example is given to verify the validity of the model and the algorithm. Findings: Under the constraints of resource cost and quantity, an optimal resource scheduling scheme for workflow migration is found, ensuring minimal running time and optimal cost.
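
    For readers unfamiliar with the algorithm family, the sketch below is a simplified shuffled frog-leaping style search: candidate solutions are sorted by cost, dealt into memeplexes, and in each memeplex the worst frog leaps toward the best (or is randomly reset). The toy cost function stands in for the paper's migration time/cost model; this is not the authors' exact formulation.

    ```
    import numpy as np

    def sfla(cost, dim, n_frogs=30, n_memeplexes=5, iterations=200, seed=0):
        rng = np.random.default_rng(seed)
        frogs = rng.uniform(0.0, 1.0, (n_frogs, dim))
        for _ in range(iterations):
            frogs = frogs[np.argsort([cost(f) for f in frogs])]  # sort by fitness
            for m in range(n_memeplexes):
                members = np.arange(m, n_frogs, n_memeplexes)    # deal into memeplexes
                best, worst = members[0], members[-1]
                leap = rng.uniform() * (frogs[best] - frogs[worst])
                candidate = np.clip(frogs[worst] + leap, 0.0, 1.0)
                if cost(candidate) < cost(frogs[worst]):
                    frogs[worst] = candidate
                else:
                    frogs[worst] = rng.uniform(0.0, 1.0, dim)    # random reset
        return frogs[0]

    best = sfla(lambda x: float(np.sum((x - 0.3) ** 2)), dim=4)
    print(best)  # approaches [0.3, 0.3, 0.3, 0.3]
    ```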

  3. An Automated Translator for Model Checking Time Constrained Workflow Systems

    Science.gov (United States)

    Mashiyat, Ahmed Shah; Rabbi, Fazle; Wang, Hao; Maccaull, Wendy

    Workflows have proven to be a useful conceptualization for the automation of business processes. While formal verification methods (e.g., model checking) can help ensure the reliability of workflow systems, the industrial uptake of such methods has been slow largely due to the effort involved in modeling and the memory required to verify complex systems. Incorporation of time constraints in such systems exacerbates the latter problem. We present an automated translator, YAWL2DVE-t, which takes as input a time constrained workflow model built with the graphical modeling tool YAWL, and outputs the model in DVE, the system specification language for the distributed LTL model checker DiVinE. The automated translator, together with the graphical editor and the distributed model checker, provides a method for rapid design, verification and refactoring of time constrained workflow systems. We present a realistic case study developed through collaboration with the local health authority.

  4. Assessment of Perioperative Ultrasound Workflow Understanding: A Consensus.

    Science.gov (United States)

    Yeh, Lu; Montealegre-Gallegos, Mario; Mahmood, Feroze; Hess, Philip E; Shnider, Marc; Mitchell, John D; Jones, Stephanie B; Mashari, Azad; Wong, Vanessa; Matyal, Robina

    2017-02-01

    Understanding of the workflow of perioperative ultrasound (US) examination is an integral component of proficiency. Workflow consists of the practical steps prior to executing an US examination (eg, equipment operation). Whereas other proficiency components (ie, cognitive knowledge and manual dexterity) can be tested, workflow understanding is difficult to define and assess due to its contextual and institution-specific nature. The objective was to define the workflow components of specific perioperative US applications using an iterative process to reach a consensus opinion. Expert consensus, survey study. Tertiary university hospital. This study sought expert consensus among a focus group of 9 members of an anesthesia department with experience in perioperative US. Afterward, 257 anesthesia faculty members from 133 academic centers across the United States were surveyed. A preliminary list of tasks was designed to establish the expectations of workflow understanding by an anesthesiology resident prior to clinical exposure to perioperative US. This list was modified by a focus group through an iterative process. Afterwards, a survey was sent to faculty members nationwide, and Likert scale ratings for each task were obtained and reviewed during a second round. Consensus among members of the focus group was reached after 2 iterations. 72 participants responded to the nationwide survey (28%), and consensus was reached after the second round (Cronbach's α = 0.99, ICC = 0.99) on a final list of 46 workflow-related tasks. Specific components of perioperative US workflow were identified. Evaluation of workflow understanding may be combined with cognitive knowledge and manual dexterity testing for assessing proficiency in perioperative US. Copyright © 2017 Elsevier Inc. All rights reserved.
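
    The reliability statistic quoted above (Cronbach's alpha) has a standard closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A sketch with fabricated ratings, not the survey data:

    ```
    import numpy as np

    def cronbach_alpha(scores):
        scores = np.asarray(scores, dtype=float)  # rows: raters, columns: items
        k = scores.shape[1]
        item_variances = scores.var(axis=0, ddof=1)
        total_variance = scores.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_variances.sum() / total_variance)

    ratings = [[5, 4, 5], [4, 4, 4], [5, 5, 5], [3, 3, 4]]
    print(round(cronbach_alpha(ratings), 3))  # values near 1 mean high consistency
    ```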

  5. CSP for Executable Scientific Workflows

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard

    This thesis presents CSP as a means of orchestrating the execution of tasks in a scientific workflow. Scientific workflow systems are popular in a wide range of scientific areas, where tasks are organised in directed graphs. Execution of such graphs is handled by the scientific workflow systems...... and the readability of Python source code. Python is a popular programming language in the scientific community, with many scientific libraries (modules) and simple integration to external languages. This thesis presents a PyCSP extended with many new features and a more robust implementation to allow scientific...... applications to run on heterogeneous hardware, combining multiple hardware architectures. This is especially important in scientific computing as the performance of computational tasks may be orders of magnitude faster depending on the hardware architecture used. To ensure the robustness of the PyCSP library...

  6. Concurrency & Asynchrony in Declarative Workflows

    DEFF Research Database (Denmark)

    Debois, Søren; Hildebrandt, Thomas; Slaats, Tijs

    2015-01-01

    semantics, which is a necessary foundation for asynchronously executing distributed processes, is not obvious for declarative formalisms and is so far virtually unexplored. This is in stark contrast to the very successful Petri-net–based process languages, which have an inherent notion of concurrency...... of concurrency in DCR Graphs admits asynchronous execution of declarative workflows both conceptually and by reporting on a prototype implementation of a distributed declarative workflow engine. Both the theoretical development and the implementation is supported by an extended example; moreover, the theoretical...

  7. Scientific Workflow Management in Proteomics

    Science.gov (United States)

    de Bruin, Jeroen S.; Deelder, André M.; Palmblad, Magnus

    2012-01-01

    Data processing in proteomics can be a challenging endeavor, requiring extensive knowledge of many different software packages, all with different algorithms, data format requirements, and user interfaces. In this article we describe the integration of a number of existing programs and tools in Taverna Workbench, a scientific workflow manager currently being developed in the bioinformatics community. We demonstrate how a workflow manager provides a single, visually clear and intuitive interface to complex data analysis tasks in proteomics, from raw mass spectrometry data to protein identifications and beyond. PMID:22411703

  8. CMS data and workflow management system

    CERN Document Server

    Fanfani, A; Bacchi, W; Codispoti, G; De Filippis, N; Pompili, A; My, S; Abbrescia, M; Maggi, G; Donvito, G; Silvestris, L; Calzolari, F; Sarkar, S; Spiga, D; Cinquili, M; Lacaprara, S; Biasotto, M; Farina, F; Merlo, M; Belforte, S; Kavka, C; Sala, L; Harvey, J; Hufnagel, D; Fanzago, F; Corvo, M; Magini, N; Rehn, J; Toteva, Z; Feichtinger, D; Tuura, L; Eulisse, G; Bockelman, B; Lundstedt, C; Egeland, R; Evans, D; Mason, D; Gutsche, O; Sexton-Kennedy, L; Dagenhart, D W; Afaq, A; Guo, Y; Kosyakov, S; Lueking, L; Sekhri, V; Fisk, I; McBride, P; Bauerdick, L; Bakken, J; Rossman, P; Wicklund, E; Wu, Y; Jones, C; Kuznetsov, V; Riley, D; Dolgert, A; van Lingen, F; Narsky, I; Paus, C; Klute, M; Gomez-Ceballos, G; Piedra-Gomez, J; Miller, M; Mohapatra, A; Lazaridis, C; Bradley, D; Elmer, P; Wildish, T; Wuerthwein, F; Letts, J; Bourilkov, D; Kim, B; Smith, P; Hernandez, J M; Caballero, J; Delgado, A; Flix, J; Cabrillo-Bartolome, I; Kasemann, M; Flossdorf, A; Stadie, H; Kreuzer, P; Khomitch, A; Hof, C; Zeidler, C; Kalini, S; Trunov, A; Saout, C; Felzmann, U; Metson, S; Newbold, D; Geddes, N; Brew, C; Jackson, J; Wakefield, S; De Weirdt, S; Adler, V; Maes, J; Van Mulders, P; Villella, I; Hammad, G; Pukhaeva, N; Kurca, T; Semneniouk, I; Guan, W; Lajas, J A; Teodoro, D; Gregores, E; Baquero, M; Shehzad, A; Kadastik, M; Kodolova, O; Chao, Y; Ming Kuo, C; Filippidis, C; Walzel, G; Han, D; Kalinowski, A; Giro de Almeida, N M; Panyam, N

    2008-01-01

    CMS expects to manage many tens of petabytes of data to be distributed over several computing centers around the world. The CMS distributed computing and analysis model is designed to serve, process and archive the large number of events that will be generated when the CMS detector starts taking data. The underlying concepts and the overall architecture of the CMS data and workflow management system will be presented. In addition, the experience in using the system for MC production, initial detector commissioning activities and data analysis will be summarized.

  9. A standard-enabled workflow for synthetic biology.

    Science.gov (United States)

    Myers, Chris J; Beal, Jacob; Gorochowski, Thomas E; Kuwahara, Hiroyuki; Madsen, Curtis; McLaughlin, James Alastair; Mısırlı, Göksel; Nguyen, Tramy; Oberortner, Ernst; Samineni, Meher; Wipat, Anil; Zhang, Michael; Zundel, Zach

    2017-06-15

    A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools to be connected from a diversity of sources. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce types of design information, including multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in their publications. © 2017 The Author(s); published by Portland Press Limited on behalf of the Biochemical Society.

  10. Constructing workflows from script applications

    NARCIS (Netherlands)

    Baranowski, M.; Belloum, A.; Bubak, M.; Malawski, M.

    2012-01-01

    For programming and executing complex applications on grid infrastructures, scientific workflows have been proposed as convenient high-level alternative to solutions based on general-purpose programming languages, APIs and scripts. GridSpace is a collaborative programming and execution environment,

  11. Rapid Energy Modeling Workflow Demonstration

    Science.gov (United States)

    2013-10-31

    BIM: Building Information Modeling; BPA: Building Performance Analysis; BTU: British Thermal Unit; CBECS: Commercial Building ... geometry, orientation, weather, and materials, generates 3D Building Information Models (BIM) guided by satellite views of building footprints and ... Rapid Energy Modeling (REM) workflows that employed building information modeling (BIM) approaches and conceptual energy analysis.

  12. Adobe Photoshop Lightroom and Photoshop workflow bible

    CERN Document Server

    Fitzgerald, Mark

    2013-01-01

    The digital photographer's workflow is divided into two distinct parts - the Production Workflow and the Creative Workflow. The Production workflow is used to import and organize large numbers of images, and prepare them for presentation via proof printing, Web, or slideshow. Increasingly, photographers are turning to Adobe's acclaimed new Lightroom software to manage this part of the workflow. After the best images are identified, photographers move to the second part of the workflow, the Creative Workflow, to fine-tune special images using a variety of advanced digital tools so that the creative vision is realized. An overwhelming majority of digital photographers use Photoshop for this advanced editing. Adobe Photoshop Lightroom & Photoshop Workflow Bible effectively guides digital photographers through both parts of this process. Author Mark Fitzgerald, an Adobe Certified Expert and Adobe Certified Instructor in Photoshop CS3 offers readers a clear path to using both Lightroom 2 and Photoshop CS3 to c...

  13. Contracts for Cross-Organizational Workflow Management

    NARCIS (Netherlands)

    Koetsier, M.J.; Grefen, P.W.P.J.; Vonk, J.; Bauknecht, K.; Kumar Madria, S.; Pernul, G.

    Nowadays, many organizations form dynamic partnerships to deal effectively with market requirements. As companies use automated workflow management systems (WFMSs) to control their processes, a way of linking workflow processes in different organizations is required for turning the cooperating

  14. An Auto-management Thesis Program WebMIS Based on Workflow

    Science.gov (United States)

    Chang, Li; Jie, Shi; Weibo, Zhong

    An auto-management WebMIS based on workflow for a bachelor thesis program is given in this paper. A module used for workflow dispatching is designed and realized using MySQL and J2EE according to the working principle of a workflow engine. The module can automatically dispatch the workflow according to the system date, login information and the work status of the user. The WebMIS changes the management from manual work to computer-based work, which not only standardizes the thesis program but also keeps the data and documents clean and consistent.

  15. Specifying workflow process requirements for an emergency medical service.

    Science.gov (United States)

    Poulymenopoulou, M; Malamateniou, F; Vassilacopoulos, G

    2003-08-01

    Recent trends in healthcare delivery have led to a gradual shift in the conceptualisation of healthcare information systems towards supporting healthcare processes in a more direct way. The move towards integrated and managed care, which requires designing healthcare processes around patient needs and incorporating efficiency considerations has led to an increased interest in process-oriented healthcare information systems based on workflow technology. This means to actively deliver the tasks to be performed to the right persons at the right time with the necessary information and the application functions needed. Moreover, workflow technology promotes a component-oriented development whereby the process logic is separated from application logic. This paper presents an approach to capturing process logic requirements for healthcare workflow systems with a view to design a system that is easily adjustable to process changes and to evolving organizational structures at a reasonable cost.

  16. Snakemake-a scalable bioinformatics workflow engine

    NARCIS (Netherlands)

    J. Köster (Johannes); S. Rahmann (Sven)

    2012-01-01

    Snakemake is a workflow engine that provides a readable Python-based workflow definition language and a powerful execution environment that scales from single-core workstations to compute clusters without modifying the workflow. It is the first system to support the use of automatically
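
    Snakemake's Python-based definition language is compact enough to show in full. A minimal Snakefile (the file names are placeholders): rules declare inputs and outputs, and Snakemake infers the dependency graph and parallelism from them, which is what lets the same workflow scale from a workstation to a cluster unchanged.

    ```
    SAMPLES = ["a", "b"]

    rule all:
        input:
            expand("results/{sample}.count", sample=SAMPLES)

    rule count_lines:
        input:
            "data/{sample}.txt"
        output:
            "results/{sample}.count"
        shell:
            "wc -l < {input} > {output}"
    ```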

  17. Execution Time Estimation for Workflow Scheduling

    NARCIS (Netherlands)

    Chirkin, A.M.; Belloum, A..S.Z.; Kovalchuk, S.V.; Makkes, M.X.

    2014-01-01

    Estimation of the execution time is an important part of the workflow scheduling problem. The aim of this paper is to highlight common problems in estimating the workflow execution time and propose a solution that takes into account the complexity and the randomness of the workflow components and

  18. Execution time estimation for workflow scheduling

    NARCIS (Netherlands)

    Chirkin, A.M.; Belloum, A.S.Z.; Kovalchuk, S.V.; Makkes, M.X.; Melnik, M.A.; Visheratin, A.A.; Nasonov, D.A.

    2017-01-01

    Estimation of the execution time is an important part of the workflow scheduling problem. The aim of this paper is to highlight common problems in estimating the workflow execution time and propose a solution that takes into account the complexity and the stochastic aspects of the workflow

  19. The equivalency between logic Petri workflow nets and workflow nets.

    Science.gov (United States)

    Wang, Jing; Yu, ShuXia; Du, YuYue

    2015-01-01

    Logic Petri nets (LPNs) can describe and analyze batch processing functions and passing-value indeterminacy in cooperative systems. Logic Petri workflow nets (LPWNs) are proposed based on LPNs in this paper. Process mining is regarded as an important bridge between the modeling and analysis of data mining and business processes. Workflow nets (WF-nets) are an extension of Petri nets (PNs) and have been successfully applied to process mining. Some shortcomings cannot be avoided in process mining, such as duplicate tasks, invisible tasks, and the noise of logs. An online shop in electronic commerce is modeled in this paper to prove the equivalence between LPWNs and WF-nets, and the advantages of LPWNs are presented.
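
    As background for readers new to WF-nets: a Petri net marking evolves by firing enabled transitions, consuming tokens from input places and producing them in output places. A toy Python sketch (the order-processing net is invented):

        # A transition maps input places (tokens consumed) to output places
        # (tokens produced); a marking counts tokens per place.
        TRANSITIONS = {
            "receive_order": ({"start": 1}, {"ordered": 1}),
            "pay":           ({"ordered": 1}, {"paid": 1}),
            "ship":          ({"paid": 1}, {"end": 1}),
        }

        def enabled(marking, t):
            inputs, _ = TRANSITIONS[t]
            return all(marking.get(p, 0) >= n for p, n in inputs.items())

        def fire(marking, t):
            assert enabled(marking, t), f"{t} is not enabled"
            inputs, outputs = TRANSITIONS[t]
            m = dict(marking)
            for p, n in inputs.items():
                m[p] -= n
            for p, n in outputs.items():
                m[p] = m.get(p, 0) + n
            return m

        m = {"start": 1}
        for t in ["receive_order", "pay", "ship"]:
            m = fire(m, t)
        print(m)  # {'start': 0, 'ordered': 0, 'paid': 0, 'end': 1}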

  20. The Prosthetic Workflow in the Digital Era

    Directory of Open Access Journals (Sweden)

    Lidia Tordiglione

    2016-01-01

    The purpose of this retrospective study was to clinically evaluate the benefits of adopting a full digital workflow for the implementation of fixed prosthetic restorations on natural teeth. To evaluate the effectiveness of these protocols, treatment plans were drawn up for 15 patients requiring rehabilitation of one or more natural teeth. All the dental impressions were taken using a Planmeca PlanScan® (Planmeca OY, Helsinki, Finland) intraoral scanner, which provided digital casts on which the restorations were digitally designed using Exocad® (Exocad GmbH, Germany, 2010) software and fabricated by CAM processing on 5-axis milling machines. A total of 28 single crowns were made from monolithic zirconia, 12 vestibular veneers from lithium disilicate, and 4 three-quarter vestibular veneers with palatal extension. While the restorations were applied, the authors could clinically appreciate the excellent match between the digitally produced prosthetic design and the cemented prostheses, which never required any occlusal or proximal adjustment. Out of all the restorations applied, only one exhibited premature failure and was replaced with no other complications or need for further scanning. From the clinical experience gained using a full digital workflow, the authors can confirm that these work processes enable the fabrication of clinically reliable restorations, with all the benefits that digital methods bring to the dentist, the dental laboratory, and the patient.

  1. The Prosthetic Workflow in the Digital Era.

    Science.gov (United States)

    Tordiglione, Lidia; De Franco, Michele; Bosetti, Giovanni

    2016-01-01

    The purpose of this retrospective study was to clinically evaluate the benefits of adopting a full digital workflow for the implementation of fixed prosthetic restorations on natural teeth. To evaluate the effectiveness of these protocols, treatment plans were drawn up for 15 patients requiring rehabilitation of one or more natural teeth. All the dental impressions were taken using a Planmeca PlanScan® (Planmeca OY, Helsinki, Finland) intraoral scanner, which provided digital casts on which the restorations were digitally designed using Exocad® (Exocad GmbH, Germany, 2010) software and fabricated by CAM processing on 5-axis milling machines. A total of 28 single crowns were made from monolithic zirconia, 12 vestibular veneers from lithium disilicate, and 4 three-quarter vestibular veneers with palatal extension. While the restorations were applied, the authors could clinically appreciate the excellent match between the digitally produced prosthetic design and the cemented prostheses, which never required any occlusal or proximal adjustment. Out of all the restorations applied, only one exhibited premature failure and was replaced with no other complications or need for further scanning. From the clinical experience gained using a full digital workflow, the authors can confirm that these work processes enable the fabrication of clinically reliable restorations, with all the benefits that digital methods bring to the dentist, the dental laboratory, and the patient.

  2. Workflow Management in CLARIN-DK

    DEFF Research Database (Denmark)

    Jongejan, Bart

    2013-01-01

    output, given the input resource at hand. Still, in such cases it may be possible to reach the set goal by chaining a number of tools. The approach presented here frees the user from having to meddle with tools and the construction of workflows. Instead, the user only needs to supply the workflow manager... with the features that describe her goal, because the workflow manager not only executes chains of tools in a workflow, but also takes care of autonomously devising workflows that serve the user's intention, given the tools currently integrated in the infrastructure as web services. To do this..., the workflow manager needs stringent and complete information about each integrated tool. We discuss how such information is structured in CLARIN-DK. Provided that many tools are made available to and through the CLARIN-DK infrastructure, the automatically created workflows, although simple linear programs...
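
    The automatic composition described here, devising a chain of tools from a goal specification, can be sketched as a graph search over tool input/output features. The registry below is an invented stand-in for the far richer tool metadata CLARIN-DK maintains.

        from collections import deque

        # Hypothetical registry: each tool declares the feature it consumes
        # and the feature it produces.
        TOOLS = {
            "tokenizer":  ("plain_text", "tokens"),
            "tagger":     ("tokens", "pos_tags"),
            "lemmatizer": ("pos_tags", "lemmas"),
        }

        def devise_workflow(have, want):
            """Breadth-first search for the shortest tool chain from input
            feature `have` to goal feature `want`."""
            queue = deque([(have, [])])
            seen = {have}
            while queue:
                feature, chain = queue.popleft()
                if feature == want:
                    return chain
                for tool, (src, dst) in TOOLS.items():
                    if src == feature and dst not in seen:
                        seen.add(dst)
                        queue.append((dst, chain + [tool]))
            return None

        print(devise_workflow("plain_text", "lemmas"))
        # ['tokenizer', 'tagger', 'lemmatizer']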

  3. A Drupal-Based Collaborative Framework for Science Workflows

    Science.gov (United States)

    Pinheiro da Silva, P.; Gandara, A.

    2010-12-01

    Cyber-infrastructure is built by utilizing technical infrastructure to support organizational practices and social norms, providing support for scientific teams working together or dependent on each other to conduct scientific research. Such cyber-infrastructure enables the sharing of information and data so that scientists can leverage knowledge and expertise through automation. Scientific workflow systems have been used to build automated scientific systems used by scientists to conduct scientific research and, as a result, create artifacts in support of scientific discoveries. These complex systems are often developed by teams of scientists who are located in different places, e.g., scientists working in distinct buildings, and sometimes in different time zones, e.g., scientists working in distinct national laboratories. The sharing of these specifications is currently supported by the use of version control systems such as CVS or Subversion. Discussions about the design, improvement, and testing of these specifications, however, often happen elsewhere, e.g., through the exchange of email messages and IM chatting. Carrying on a discussion about these specifications is challenging because comments and specifications are not necessarily connected. For instance, the person reading a comment about a given workflow specification may not be able to see the workflow, and even if the person can see the workflow, the person may not know specifically to which part of the workflow a given comment applies. In this paper, we discuss the design, implementation and use of CI-Server, a Drupal-based infrastructure, to support the collaboration of both local and distributed teams of scientists using scientific workflows. CI-Server has three primary goals: to enable information sharing by providing tools that scientists can use within their scientific research to process data, publish and share artifacts; to build community by providing tools that support discussions between

  4. Managing Library IT Workflow with Bugzilla

    Directory of Open Access Journals (Sweden)

    Nina McHale

    2010-09-01

    Prior to September 2008, all technology issues at the University of Colorado Denver's Auraria Library were reported to a dedicated departmental phone line. A variety of staff changes necessitated a more formal means of tracking, delegating, and resolving reported issues, and the department turned to Bugzilla, an open source bug tracking application designed by Mozilla.org developers. While designed with software development bug tracking in mind, Bugzilla can be easily customized and modified to serve as an IT ticketing system. Twenty-three months and over 2300 trouble tickets later, Auraria's IT department workflow is much smoother and more efficient. This article includes two Perl Template Toolkit code samples for customized Bugzilla screens for its use in a library environment; readers will be able to easily replicate the project in their own environments.

  5. Implementation of Electronic Workflow Systems in Higher Education Institutions: Issues and Challenges

    Science.gov (United States)

    Cheung, K. S.

    To different extents, electronic workflow systems have been widely used in higher education institutions for administering daily and routine operations. Whilst workflow automation is advocated for streamlining business processes, there are technical limitations as well as management constraints, especially on process review and re-engineering. During the process review, a big challenge is to make sure that the system would not only meet the business requirements but also improve the process flow. It is important to retain the legacy structure while coping with changes in workflow, taking into consideration the need to accommodate managerial constraints. This paper investigates the issues and challenges in implementing electronic workflow systems in higher education institutions. Different approaches to process review, workflow design and re-design are discussed.

  6. Automation of Flexible Migration Workflows

    Directory of Open Access Journals (Sweden)

    Dirk von Suchodoletz

    2011-03-01

    Many digital preservation scenarios are based on the migration strategy, which itself is heavily tool-dependent. For popular, well-defined and often open file formats – e.g., digital images, such as PNG, GIF, JPEG – a wide range of tools exist. Migration workflows become more difficult with proprietary formats, as used by the many text processing applications that have become available over the last two decades. If a certain file format cannot be rendered with current software, emulation of the original environment remains a valid option. For instance, with the original Lotus AmiPro or WordPerfect, it is not a problem to save an object of this type as ASCII text or in Rich Text Format. In specific environments, it is even possible to send the file to a virtual printer, thereby producing a PDF as migration output. Such manual migration tasks typically involve human interaction, which may be feasible for a small number of objects, but not for larger batches of files. We propose a novel approach using a software-operated VNC abstraction layer in order to replace human interaction with machine interaction. Emulators or virtualization tools equipped with a VNC interface are very well suited for this approach. But screen, keyboard and mouse interaction is just part of the setup. Furthermore, digital objects need to be transferred into the original environment in order to be extracted after processing. Nevertheless, the complexity of the new generation of migration services is quickly rising; a preservation workflow is now comprised not only of the migration tool itself, but of a complete software and virtual hardware stack, with recorded workflows linked to every supported migration scenario. Thus the requirements of OAIS management must include proper software archiving, emulator selection, and system image and recording handling. The concept of view-paths could help either to automatically determine the proper pre-configured virtual environment or to set up system
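
    The replace-the-human idea is easiest to see as a recorded interaction script replayed against the emulated desktop. The sketch below uses a dry-run stand-in for a VNC client (a real setup might drive the emulator through a library such as vncdotool); every keystroke sequence is invented.

        class DryRunVNC:
            """Stand-in for a VNC client: logs the keystrokes a recorded
            migration workflow would replay against the emulated desktop."""
            def key(self, *keys):
                print("press:", "+".join(keys))
            def type(self, text):
                print("type: ", text)

        def migrate_to_rtf(client, source_doc):
            """Replay a recorded 'Save As RTF' interaction for a legacy
            word processor running inside an emulator."""
            client.key("ctrl", "o")          # open dialog
            client.type(source_doc)
            client.key("enter")
            client.key("alt", "f")           # File menu
            client.key("a")                  # Save As...
            client.type(source_doc.replace(".sam", ".rtf"))
            client.key("enter")

        migrate_to_rtf(DryRunVNC(), "letter.sam")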

  7. Swabs to genomes: a comprehensive workflow

    Directory of Open Access Journals (Sweden)

    Madison I. Dunitz

    2015-05-01

    The sequencing, assembly, and basic analysis of microbial genomes, once a painstaking and expensive undertaking, has become much easier for research labs with access to standard molecular biology and computational tools. However, there is a confusing variety of options available for DNA library preparation and sequencing, and inexperience with bioinformatics can pose a significant barrier to entry for many who may be interested in microbial genomics. The objective of the present study was to design, test, troubleshoot, and publish a simple, comprehensive workflow from the collection of an environmental sample (a swab) to a published microbial genome, empowering even a lab or classroom with limited resources and bioinformatics experience to perform it.

  8. Pro WF: Windows Workflow in .NET 4.0

    CERN Document Server

    Bukovics, Bruce

    2010-01-01

    Windows Workflow Foundation (WF) is a revolutionary part of the .NET 4 Framework that allows you to orchestrate human and system interactions as a series of workflows that can be easily mapped, analyzed, adjusted, and implemented. As business problems become more complex, the need for workflow-based solutions has never been more evident. WF provides a simple and consistent way to model and implement complex problems. As a developer, you focus on developing the business logic for individual workflow tasks. The runtime handles the execution of those tasks after they have been composed into a wor

  9. A query suggestion workflow for life science IR-systems.

    Science.gov (United States)

    Esch, Maria; Chen, Jinbo; Weise, Stephan; Hassani-Pak, Keywan; Scholz, Uwe; Lange, Matthias

    2014-06-13

    Information Retrieval (IR) plays a central role in the exploration and interpretation of integrated biological datasets that represent the heterogeneous ecosystem of the life sciences. Here, keyword-based query systems are popular user interfaces. In turn, to a large extent, the query phrases used determine the quality of the search result and the effort a scientist has to invest in query refinement. In this context, computer-aided query expansion and suggestion is one of the most challenging tasks for life science information systems. Existing query front-ends support aspects like spelling correction, query refinement or query expansion. However, the majority of the front-ends only make limited use of enhanced IR algorithms to implement comprehensive and computer-aided query refinement workflows. In this work, we present the design of a multi-stage query suggestion workflow and its implementation in the life science IR system LAILAPS. The presented workflow includes enhanced tokenisation, word breaking, spelling correction, query expansion and query suggestion ranking. A spelling correction benchmark with 5,401 queries and manually selected use cases for query expansion demonstrate the performance of the implemented workflow and its advantages compared with state-of-the-art systems.
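
    A toy version of such a multi-stage pipeline can be written with the standard library alone (difflib for the spelling stage); the vocabulary, synonym table, and length-based ranking are invented simplifications of the LAILAPS stages.

        import difflib

        VOCAB = ["arabidopsis", "drought", "tolerance", "transcription", "factor"]
        SYNONYMS = {"drought": ["water deficit"], "tolerance": ["resistance"]}

        def suggest(query, limit=5):
            # Stage 1: tokenise.
            tokens = query.lower().split()
            # Stage 2: spelling correction against the index vocabulary.
            corrected = []
            for t in tokens:
                match = difflib.get_close_matches(t, VOCAB, n=1, cutoff=0.75)
                corrected.append(match[0] if match else t)
            # Stage 3: query expansion with synonyms.
            expansions = [" ".join(corrected)]
            for i, t in enumerate(corrected):
                for syn in SYNONYMS.get(t, []):
                    expansions.append(" ".join(corrected[:i] + [syn] + corrected[i + 1:]))
            # Stage 4: rank suggestions (here: shortest first as a crude proxy).
            return sorted(expansions, key=len)[:limit]

        print(suggest("drougt tolernce"))
        # ['drought tolerance', 'drought resistance', 'water deficit tolerance']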

  10. Contracts for Cross-Organizational Workflow Management

    NARCIS (Netherlands)

    Koetsier, M.J.; Grefen, P.W.P.J.; Vonk, J.

    1999-01-01

    Nowadays, many organizations form dynamic partnerships to deal effectively with market requirements. As companies use automated workflow systems to control their processes, a way of linking workflow processes in different organizations is useful in turning the co-operating companies into a seamless

  11. Introduction to the Workflow Systems in Management

    Directory of Open Access Journals (Sweden)

    Aleksander Wocial

    2007-10-01

    The article concerns the ontology of workflow management systems. The fundamental diagrams and their constituent elements are presented, as well as the meaning of components and the relations or interactions among them. The first is the conceptual model of the flow process, followed by the meta-model of the process definition. An understanding of these terms is crucial for IT or management specialists involved in the area of workflow.

  12. Introduction to the Workflow Systems in Management

    OpenAIRE

    Aleksander Wocial

    2007-01-01

    The article concerns the ontology of workflow management systems. The fundamental diagrams and their constituent elements are presented, as well as the meaning of components and the relations or interactions among them. The first is the conceptual model of the flow process, followed by the meta-model of the process definition. An understanding of these terms is crucial for IT or management specialists involved in the area of workflow.

  13. Soundness of Timed-Arc Workflow Nets

    DEFF Research Database (Denmark)

    Mateo, Jose Antonio; Srba, Jiri; Sørensen, Mathias Grund

    2014-01-01

    Analysis of workflow processes with quantitative aspects like timing is of interest in numerous time-critical applications. We suggest a workflow model based on timed-arc Petri nets and study the foundational problems of soundness and strong (time-bounded) soundness. We explore the decidability o...

  14. Implementing bioinformatic workflows within the BioExtract Server

    Science.gov (United States)

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed servi...

  15. Conveyor: a workflow engine for bioinformatic analyses.

    Science.gov (United States)

    Linke, Burkhard; Giegerich, Robert; Goesmann, Alexander

    2011-04-01

    The rapidly increasing amounts of data available from new high-throughput methods have made data processing without automated pipelines infeasible. As was pointed out in several publications, integration of data and analytic resources into workflow systems provides a solution to this problem, simplifying the task of data analysis. Various applications for defining and running workflows in the field of bioinformatics have been proposed and published, e.g. Galaxy, Mobyle, Taverna, Pegasus or Kepler. One of the main aims of such workflow systems is to enable scientists to focus on analysing their datasets instead of taking care of data management, job management or monitoring the execution of computational tasks. The currently available workflow systems achieve this goal, but fundamentally differ in their way of executing workflows. We have developed the Conveyor software library, a multitiered generic workflow engine for composition, execution and monitoring of complex workflows. It features an open, extensible system architecture and concurrent program execution to exploit resources available on modern multicore CPU hardware. It offers the ability to build complex workflows with branches, loops and other control structures. Two example use cases illustrate the application of the versatile Conveyor engine to common bioinformatics problems. The Conveyor application, including client and server, is available at http://conveyor.cebitec.uni-bielefeld.de.

  16. Exploration of management workflow of cataract surgery in an impoverished population in urban China.

    Science.gov (United States)

    Jiang, Haofeng; Lin, Haotian; Qu, Bo; Chen, Weirong

    2014-06-01

    To explore and establish a rational management workflow for a free cataract surgery program for the poor population in urban China, aiming to improve surgical efficiency. Establishment of a management workflow mainly includes system design and an auxiliary facility. System design procedures consist of outpatient screening, outpatient physical examination, surgical procedures, and postoperative clinic visits. After establishing the management workflow of cataract surgery, a free cataract surgery program was conducted for 15 months. Based upon the established management mode, 9003 patients received preoperative screening and 2358 underwent cataract surgery. During the 15-month investigation, each procedure was successfully conducted, the efficiency of screening and operation attained the highest standards in China, and no surgical malpractice occurred intraoperatively. In this study, a management workflow for cataract surgery was designed for a poverty relief project in urban China. During the 15-month project, the degree of patient satisfaction was enhanced without disrupting the normal practice and safety of the sponsor hospital.

  17. WARP (workflow for automated and rapid production): a framework for end-to-end automated digital print workflows

    Science.gov (United States)

    Joshi, Parag

    2006-02-01

    The publishing industry is experiencing a major paradigm shift with the advent of digital publishing technologies. A large number of components in the publishing and print production workflow are transformed in this shift. However, the process as a whole requires a great deal of human intervention for decision making and for resolving exceptions during job execution. Furthermore, a majority of the best-of-breed applications for publishing and print production are intrinsically designed and developed to be driven by humans. Thus, the human-intensive nature of the current prepress process accounts for a very significant amount of the overhead costs in fulfillment of jobs on press. It is a challenge to automate the functionality of applications built around a model of human-driven execution. Another challenge is to orchestrate various components in the publishing and print production pipeline such that they work in a seamless manner to enable the system to perform automatic detection of potential failures and take corrective actions in a proactive manner. Thus, there is a great need for a coherent and unifying workflow architecture that streamlines the process and automates it as a whole in order to create an end-to-end digital automated print production workflow that does not involve any human intervention. This paper describes an architecture and building blocks that lay the foundation for a plurality of automated print production workflows.

  18. From the desktop to the grid: scalable bioinformatics via workflow conversion.

    Science.gov (United States)

    de la Garza, Luis; Veit, Johannes; Szolek, Andras; Röttig, Marc; Aiche, Stephan; Gesing, Sandra; Reinert, Knut; Kohlbacher, Oliver

    2016-03-12

    Reproducibility is one of the tenets of the scientific method. Scientific experiments often comprise complex data flows, selection of adequate parameters, and analysis and visualization of intermediate and end results. Breaking down the complexity of such experiments into the joint collaboration of small, repeatable, well-defined tasks, each with well-defined inputs, parameters, and outputs, offers immediate benefits such as identifying bottlenecks and pinpointing sections that could benefit from parallelization. Workflows rest upon the notion of splitting complex work into the joint effort of several manageable tasks. There are several engines that give users the ability to design and execute workflows. Each engine was created to address certain problems of a specific community, therefore each one has its advantages and shortcomings. Furthermore, not all features of all workflow engines are royalty-free, an aspect that could potentially drive away members of the scientific community. We have developed a set of tools that enables the scientific community to benefit from workflow interoperability. We developed a platform-free structured representation of parameters, inputs, outputs of command-line tools in so-called Common Tool Descriptor documents. We have also overcome the shortcomings and combined the features of two royalty-free workflow engines with a substantial user community: the Konstanz Information Miner, an engine which we see as a formidable workflow editor, and the Grid and User Support Environment, a web-based framework able to interact with several high-performance computing resources. We have thus created a free and highly accessible way to design workflows on a desktop computer and execute them on high-performance computing resources. Our work will not only reduce time spent on designing scientific workflows, but also make executing workflows on remote high-performance computing resources more accessible to technically inexperienced users. We
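
    The heart of a Common Tool Descriptor, a platform-free record of a tool's inputs, outputs, and parameters from which engine-specific representations can be generated, can be approximated with a small data class. This is a simplified stand-in for illustration, not the actual XML-based CTD schema.

        from dataclasses import dataclass, field

        @dataclass
        class ToolDescriptor:
            """Platform-free description of a command-line tool, in the
            spirit of a CTD document (the real format is XML)."""
            name: str
            binary: str
            inputs: dict = field(default_factory=dict)    # flag -> file
            outputs: dict = field(default_factory=dict)   # flag -> file
            params: dict = field(default_factory=dict)    # flag -> value

            def command_line(self):
                parts = [self.binary]
                for group in (self.inputs, self.outputs, self.params):
                    for flag, value in group.items():
                        parts += [flag, str(value)]
                return " ".join(parts)

        # Invented tool: the descriptor, not the tool, is the point here.
        aligner = ToolDescriptor(
            name="aligner",
            binary="align",
            inputs={"-i": "reads.fastq"},
            outputs={"-o": "hits.sam"},
            params={"--threads": 8},
        )
        print(aligner.command_line())  # align -i reads.fastq -o hits.sam --threads 8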

  19. Workflow continuity--moving beyond business continuity in a multisite 24-7 healthcare organization.

    Science.gov (United States)

    Kolowitz, Brian J; Lauro, Gonzalo Romero; Barkey, Charles; Black, Harry; Light, Karen; Deible, Christopher

    2012-12-01

    As hospitals move towards providing in-house 24 × 7 services, there is an increasing need for information systems to be available around the clock. This study investigates one organization's need for a workflow continuity solution that provides around the clock availability for information systems that do not provide highly available services. The organization investigated is a large multifacility healthcare organization that consists of 20 hospitals and more than 30 imaging centers. A case analysis approach was used to investigate the organization's efforts. The results show an overall reduction in downtimes where radiologists could not continue their normal workflow on the integrated Picture Archiving and Communications System (PACS) solution by 94 % from 2008 to 2011. The impact of unplanned downtimes was reduced by 72 % while the impact of planned downtimes was reduced by 99.66 % over the same period. Additionally more than 98 h of radiologist impact due to a PACS upgrade in 2008 was entirely eliminated in 2011 utilizing the system created by the workflow continuity approach. Workflow continuity differs from high availability and business continuity in its design process and available services. Workflow continuity only ensures that critical workflows are available when the production system is unavailable due to scheduled or unscheduled downtimes. Workflow continuity works in conjunction with business continuity and highly available system designs. The results of this investigation revealed that this approach can add significant value to organizations because impact on users is minimized if not eliminated entirely.

  20. Dynamic reusable workflows for ocean science

    Science.gov (United States)

    Signell, Richard; Fernandez, Filipe; Wilcox, Kyle

    2016-01-01

    Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog search and data access make it now possible to create catalog-driven workflows that automate — end-to-end — data search, analysis and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill-assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enters the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased use of dynamic

  1. Dynamic Reusable Workflows for Ocean Science

    Directory of Open Access Journals (Sweden)

    Richard P. Signell

    2016-10-01

    Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog searches and data access now make it possible to create catalog-driven workflows that automate—end-to-end—data search, analysis, and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused, and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enter the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased
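
    The catalog-search step of such a workflow is concise with the OWSLib Python package. The sketch below is an assumption-laden illustration: the endpoint URL and search phrase are placeholders, and error handling and pagination are omitted.

        from owslib.csw import CatalogueServiceWeb
        from owslib.fes import PropertyIsLike

        # Placeholder endpoint: substitute any OGC CSW catalog service.
        csw = CatalogueServiceWeb("https://catalog.example.org/csw")
        query = PropertyIsLike("csw:AnyText", "%sea_water_temperature%")
        csw.getrecords2(constraints=[query], maxrecords=10)

        # Each record carries service endpoints (SOS, OPeNDAP, ...) in its
        # references, which downstream workflow steps can access directly.
        for identifier, record in csw.records.items():
            print(record.title)
            for ref in record.references:
                print("  ", ref.get("scheme"), ref.get("url"))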

  2. Relationship between E-Prescriptions and Community Pharmacy Workflow

    Science.gov (United States)

    Odukoya, Olufunmilola K.; Chui, Michelle A.

    2013-01-01

    Objectives To understand how community pharmacists use electronic prescribing (e-prescribing) technology, and to describe the workflow challenges pharmacy personnel encounter as a result of using e-prescribing technology. Design Cross-sectional qualitative study. Setting Seven community pharmacies in Wisconsin from December 2010 to March 2011. Participants 16 pharmacists and 14 pharmacy technicians (in three chain and four independent pharmacies). Interventions Think-aloud protocol and pharmacy group interviews. Main outcome measures Pharmacy staff description of their use of e-prescribing technology and challenges encountered in their daily workflow related to this technology. Results Two contributing factors were perceived to influence e-prescribing workflow: issues stemming from prescribing or transmitting software, and issues from within the pharmacy. Pharmacies experienced both delays in receiving e-prescriptions and inaccurate e-prescriptions from physician offices. Receiving an overwhelming number of e-prescriptions with inaccurate or unclear information resulted in significant time delays for patients as pharmacists contacted physicians to clarify wrong information. In addition, pharmacy personnel reported that lack of formal training and the disconnect between the way pharmacists verify accuracy and conduct drug utilization review and the presentation of e-prescription information on the computer screen significantly influenced the speed of processing an e-prescription. Conclusion E-prescription processing can hinder pharmacy workflow. As the number of e-prescriptions transmitted to pharmacies increases due to legislative mandates, it is essential that the technology that supports e-prescriptions (both on the prescriber and pharmacy operating systems) be redesigned to facilitate pharmacy workflow processes and to prevent unintended consequences, such as increased medication errors, user frustration, and stress. PMID:23229979

  3. Redefining the sonography workflow through the application of a departmental computerized workflow management system.

    Science.gov (United States)

    Li, Ming-Feng; Tsai, Jerry Ch; Chen, Wei-Juhn; Lin, Huey-Shyan; Pan, Huay-Ben; Yang, Tsung-Lung

    2013-03-01

    The purpose of this study is to demonstrate and evaluate the effective application of a computerized workflow management system (WMS) into sonography workflow in order to reduce patient exam waiting time, number of waiting patients, sonographer stress level, and to improve patient satisfaction. A computerized WMS was built with seamless integration of an automated patient sorting algorithm, a real-time monitoring system, exam schedules fine-tuning, a tele-imaging support system, and a digital signage broadcasting system of patient education programs. The computerized WMS was designed to facilitate problem-solving through continuous customization and flexible adjustment capability. Its effects on operations, staff stress, and patient satisfaction were studied. After implementation of the computerized WMS, there is a significant decrease in patient exam waiting time and sonographer stress level, significant increase in patient satisfaction regarding exam waiting time and the number of examined patients, and marked decrease in the number of waiting patients at different time points in a day. Through multidisciplinary teamwork, the computerized WMS provides a simple and effective approach that can overcome problems associated with exam congestion, increase patient satisfaction, and decrease staff workload stress under limited resources, eventually creating a win-win situation for both the patients and radiology personnel. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  4. Facilitating hydrological data analysis workflows in R: the RHydro package

    Science.gov (United States)

    Buytaert, Wouter; Moulds, Simon; Skoien, Jon; Pebesma, Edzer; Reusser, Dominik

    2015-04-01

    The advent of new technologies such as web-services and big data analytics holds great promise for hydrological data analysis and simulation. Driven by the need for better water management tools, it allows for the construction of much more complex workflows, that integrate more and potentially more heterogeneous data sources with longer tool chains of algorithms and models. With the scientific challenge of designing the most adequate processing workflow comes the technical challenge of implementing the workflow with a minimal risk for errors. A wide variety of new workbench technologies and other data handling systems are being developed. At the same time, the functionality of available data processing languages such as R and Python is increasing at an accelerating pace. Because of the large diversity of scientific questions and simulation needs in hydrology, it is unlikely that one single optimal method for constructing hydrological data analysis workflows will emerge. Nevertheless, languages such as R and Python are quickly gaining popularity because they combine a wide array of functionality with high flexibility and versatility. The object-oriented nature of high-level data processing languages makes them particularly suited for the handling of complex and potentially large datasets. In this paper, we explore how handling and processing of hydrological data in R can be facilitated further by designing and implementing a set of relevant classes and methods in the experimental R package RHydro. We build upon existing efforts such as the sp and raster packages for spatial data and the spacetime package for spatiotemporal data to define classes for hydrological data (HydroST). In order to handle simulation data from hydrological models conveniently, a HM class is defined. Relevant methods are implemented to allow for an optimal integration of the HM class with existing model fitting and simulation functionality in R. Lastly, we discuss some of the design challenges

  5. Concept and application of a computational vaccinology workflow.

    Science.gov (United States)

    Söllner, Johannes; Heinzel, Andreas; Summer, Georg; Fechete, Raul; Stipkovits, Laszlo; Szathmary, Susan; Mayer, Bernd

    2010-11-03

    The last years have seen a renaissance of the vaccine area, driven by clinical needs in infectious diseases but also chronic diseases such as cancer and autoimmune disorders. Equally important are technological improvements involving nano-scale delivery platforms as well as third generation adjuvants. In parallel, immunoinformatics routines have reached essential maturity for supporting central aspects in vaccinology, going beyond prediction of antigenic determinants. On this basis, computational vaccinology has emerged as a discipline aimed at ab-initio rational vaccine design. Here we present a computational workflow for implementing computational vaccinology covering aspects from vaccine target identification to functional characterization and epitope selection, supported by a Systems Biology assessment of central aspects in host-pathogen interaction. We exemplify the procedures for Epstein Barr Virus (EBV), a clinically relevant pathogen causing chronic infection and suspected of triggering malignancies and autoimmune disorders. We introduce pBone/pView as a computational workflow supporting design and execution of immunoinformatics workflow modules, additionally involving aspects of results visualization, knowledge sharing and re-use. Specific elements of the workflow involve identification of vaccine targets in the realm of a Systems Biology assessment of host-pathogen interaction for identifying functionally relevant targets, as well as various methodologies for delineating B- and T-cell epitopes with particular emphasis on broad coverage of viral isolates as well as MHC alleles. Applying the workflow on EBV specifically proposes sequences from the viral proteins LMP2, EBNA2 and BALF4 as vaccine targets holding specific B- and T-cell epitopes promising broad strain and allele coverage. Based on advancements in the experimental assessment of genomes, transcriptomes and proteomes for both pathogen and (human) host, the foundations for rational design of vaccines

  6. A Community-Driven Workflow Recommendations and Reuse Infrastructure

    Science.gov (United States)

    Zhang, J.; Votava, P.; Lee, T. J.; Lee, C.; Xiao, S.; Nemani, R. R.; Foster, I.

    2013-12-01

    usage history to help Earth scientists better understand existing artifacts and how to use them in a proper manner? R2: Informed by insights derived from their computing contexts, how could such hidden knowledge be used to facilitate artifact reuse by Earth scientists? Our study of the two research questions will provide answers to three technical questions aiming to assist NEX users during workflow development: 1) How to determine what topics interest the researcher? 2) How to find appropriate artifacts? and 3) How to advise the researcher in artifact reuse? In this paper, we report our ongoing efforts of leveraging social networking theory and analysis techniques to provide dynamic advice on artifact reuse to NEX users based on their surrounding contexts. As a proof of concept, we have designed and developed a plug-in to the VisTrails workflow design tool. When users develop workflows using VisTrails, our plug-in will proactively recommend the most relevant sub-workflows to the users.

  7. Integrative workflows for metagenomic analysis.

    Science.gov (United States)

    Ladoukakis, Efthymios; Kolisis, Fragiskos N; Chatziioannou, Aristotelis A

    2014-01-01

    The rapid evolution of all sequencing technologies, described by the term Next Generation Sequencing (NGS), has revolutionized metagenomic analysis. They constitute a combination of high-throughput analytical protocols, coupled to delicate measuring techniques, in order to potentially discover, properly assemble and map allelic sequences to the correct genomes, achieving particularly high yields for only a fraction of the cost of traditional processes (i.e., Sanger). From a bioinformatic perspective, this boils down to many GB of data being generated from each single sequencing experiment, rendering the management, or even the storage, critical bottlenecks with respect to the overall analytical endeavor. The enormous complexity is even more aggravated by the versatility of the processing steps available, represented by the numerous bioinformatic tools that are essential, for each analytical task, in order to fully unveil the genetic content of a metagenomic dataset. These disparate tasks range from simple, nonetheless non-trivial, quality control of raw data to exceptionally complex protein annotation procedures, requiring a high level of expertise for their proper application or for the neat implementation of the whole workflow. Furthermore, a bioinformatic analysis of such scale requires grand computational resources, imposing, as the sole realistic solution, the utilization of cloud computing infrastructures. In this review article we discuss different, integrative, bioinformatic solutions available, which address the aforementioned issues, by performing a critical assessment of the available automated pipelines for data management, quality control, and annotation of metagenomic data, embracing various, major sequencing technologies and applications.

  8. The Diabetic Retinopathy Screening Workflow

    Science.gov (United States)

    Bolster, Nigel M.; Giardini, Mario E.; Bastawrous, Andrew

    2015-01-01

    Complications of diabetes mellitus, namely diabetic retinopathy and diabetic maculopathy, are the leading cause of blindness in working-aged people. Sufferers can avoid blindness if identified early via retinal imaging. Systematic screening of the diabetic population has been shown to greatly reduce the prevalence and incidence of blindness within the population. Many national screening programs have digital fundus photography as their basis. In the past 5 years several techniques and adapters have been developed that allow digital fundus photography to be performed using smartphones. We review recent progress in smartphone-based fundus imaging and discuss its potential for integration into national systematic diabetic retinopathy screening programs. Some systems have produced promising initial results with respect to their agreement with reference standards. However further multisite trialling of such systems’ use within implementable screening workflows is required if an evidence base strong enough to affect policy change is to be established. If this were to occur, national diabetic retinopathy screening would, for the first time, become possible in low- and middle-income settings where cost and availability of trained eye care personnel are currently key barriers to implementation. As diabetes prevalence and incidence is increasing sharply in these settings, the impact on global blindness could be profound. PMID:26596630

  9. Integrative Workflows for Metagenomic Analysis

    Directory of Open Access Journals (Sweden)

    Efthymios Ladoukakis

    2014-11-01

    The rapid evolution of all sequencing technologies, described by the term Next Generation Sequencing (NGS), has revolutionized metagenomic analysis. They constitute a combination of high-throughput analytical protocols, coupled to delicate measuring techniques, in order to potentially discover, properly assemble and map allelic sequences to the correct genomes, achieving particularly high yields for only a fraction of the cost of traditional processes (i.e., Sanger). From a bioinformatic perspective, this boils down to many gigabytes of data being generated from each single sequencing experiment, rendering the management, or even the storage, critical bottlenecks with respect to the overall analytical endeavor. The enormous complexity is even more aggravated by the versatility of the processing steps available, represented by the numerous bioinformatic tools that are essential, for each analytical task, in order to fully unveil the genetic content of a metagenomic dataset. These disparate tasks range from simple, nonetheless non-trivial, quality control of raw data to exceptionally complex protein annotation procedures, requiring a high level of expertise for their proper application or for the neat implementation of the whole workflow. Furthermore, a bioinformatic analysis of such scale requires grand computational resources, imposing, as the sole realistic solution, the utilization of cloud computing infrastructures. In this review article we discuss different, integrative, bioinformatic solutions available, which address the aforementioned issues, by performing a critical assessment of the available automated pipelines for data management, quality control and annotation of metagenomic data, embracing various, major sequencing technologies and applications.

  10. RABIX: AN OPEN-SOURCE WORKFLOW EXECUTOR SUPPORTING RECOMPUTABILITY AND INTEROPERABILITY OF WORKFLOW DESCRIPTIONS.

    Science.gov (United States)

    Kaushik, Gaurav; Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz

    2017-01-01

    As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions.

  11. The complete digital workflow in fixed prosthodontics: a systematic review.

    Science.gov (United States)

    Joda, Tim; Zarone, Fernando; Ferrari, Marco

    2017-09-19

    The continuous development in dental processing ensures new opportunities in the field of fixed prosthodontics in a complete virtual environment without any physical model situations. The aim was to compare fully digitalized workflows to conventional and/or mixed analog-digital workflows for the treatment with tooth-borne or implant-supported fixed reconstructions. A PICO strategy was executed using an electronic (MEDLINE, EMBASE, Google Scholar) plus manual search up to 2016-09-16 focusing on RCTs investigating complete digital workflows in fixed prosthodontics with regard to economics or esthetics or patient-centered outcomes with or without follow-up or survival/success rate analysis as well as complication assessment of at least 1 year under function. The search strategy was assembled from MeSH-Terms and unspecific free-text words: {(("Dental Prosthesis" [MeSH]) OR ("Crowns" [MeSH]) OR ("Dental Prosthesis, Implant-Supported" [MeSH])) OR ((crown) OR (fixed dental prosthesis) OR (fixed reconstruction) OR (dental bridge) OR (implant crown) OR (implant prosthesis) OR (implant restoration) OR (implant reconstruction))} AND {("Computer-Aided Design" [MeSH]) OR ((digital workflow) OR (digital technology) OR (computerized dentistry) OR (intraoral scan) OR (digital impression) OR (scanbody) OR (virtual design) OR (digital design) OR (cad/cam) OR (rapid prototyping) OR (monolithic) OR (full-contour))} AND {("Dental Technology" [MeSH) OR ((conventional workflow) OR (lost-wax-technique) OR (porcelain-fused-to-metal) OR (PFM) OR (implant impression) OR (hand-layering) OR (veneering) OR (framework))} AND {(("Study, Feasibility" [MeSH]) OR ("Survival" [MeSH]) OR ("Success" [MeSH]) OR ("Economics" [MeSH]) OR ("Costs, Cost Analysis" [MeSH]) OR ("Esthetics, Dental" [MeSH]) OR ("Patient Satisfaction" [MeSH])) OR ((feasibility) OR (efficiency) OR (patient-centered outcome))}. Assessment of risk of bias in selected studies was done at a 'trial level' including random sequence

  12. Introduction to the Workflow Systems in Management

    National Research Council Canada - National Science Library

    Aleksander Wocial

    2007-01-01

    The article concerns ontology of workflow management systems. The fundamental diagrams and their constituent elements are presented, the meaning of components and relation or interaction among them as well...

  13. COSMOS: Python library for massively parallel workflows.

    Science.gov (United States)

    Gafni, Erik; Luquette, Lovelace J; Lancaster, Alex K; Hawkins, Jared B; Jung, Jae-Yoon; Souilmi, Yassine; Wall, Dennis P; Tonellato, Peter J

    2014-10-15

    Efficient workflows to shepherd clinically generated genomic data through the multiple stages of a next-generation sequencing pipeline are of critical importance in translational biomedical science. Here we present COSMOS, a Python library for workflow management that allows formal description of pipelines and partitioning of jobs. In addition, it includes a user interface for tracking the progress of jobs, abstraction of the queuing system and fine-grained control over the workflow. Workflows can be created on traditional computing clusters as well as cloud-based services. Source code is available for academic non-commercial research purposes. Links to code and documentation are provided at http://lpm.hms.harvard.edu and http://wall-lab.stanford.edu. dpwall@stanford.edu or peter_tonellato@hms.harvard.edu. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
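
    To make the combination of formal pipeline description and parallel job execution concrete, here is a generic topological-order executor sketch in plain Python. It illustrates the concept only and is not COSMOS's actual API; the task graph is invented.

        from concurrent.futures import ThreadPoolExecutor

        # A workflow as a DAG: task -> (dependencies, callable). Invented tasks.
        WORKFLOW = {
            "fetch":  (set(),           lambda: print("fetch")),
            "align":  ({"fetch"},       lambda: print("align")),
            "qc":     ({"fetch"},       lambda: print("qc")),
            "report": ({"align", "qc"}, lambda: print("report")),
        }

        def run(workflow, workers=4):
            """Run each wave of ready tasks in parallel once all of their
            dependencies are done; finer-grained scheduling is possible."""
            done = set()
            with ThreadPoolExecutor(max_workers=workers) as pool:
                while len(done) < len(workflow):
                    ready = [t for t, (deps, _) in workflow.items()
                             if t not in done and deps <= done]
                    if not ready:
                        raise RuntimeError("cycle or unsatisfiable dependency")
                    for future in [pool.submit(workflow[t][1]) for t in ready]:
                        future.result()
                    done |= set(ready)

        run(WORKFLOW)  # fetch, then align and qc in parallel, then report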

  14. Workflow Based Software Development Environment Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed research is to investigate and develop a workflow based tool, the Software Developers Assistant, to facilitate the collaboration between...

  15. The MPO API: A tool for recording scientific workflows

    Energy Technology Data Exchange (ETDEWEB)

    Wright, John C., E-mail: jcwright@mit.edu [MIT Plasma Science and Fusion Center, Cambridge, MA (United States); Greenwald, Martin; Stillerman, Joshua [MIT Plasma Science and Fusion Center, Cambridge, MA (United States); Abla, Gheni; Chanthavong, Bobby; Flanagan, Sean; Schissel, David; Lee, Xia [General Atomics, San Diego, CA (United States); Romosan, Alex; Shoshani, Arie [Lawrence Berkeley Laboratory, Berkeley, CA (United States)

    2014-05-15

    Highlights: • A description of a new framework and tool for recording scientific workflows, especially those resulting from simulation and analysis. • An explanation of the underlying technologies used to implement this web based tool. • Several examples of using the tool. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for high-consequence applications. The Metadata, Provenance and Ontology (MPO) project builds on previous work [M. Greenwald, Fusion Eng. Des. 87 (2012) 2205–2208] and is focused on providing documentation of workflows, data provenance and the ability to data-mine large sets of results. While there are important design and development aspects to the data structures and user interfaces, we concern ourselves in this paper with the application programming interface (API) – the set of functions that interface with the data server. Our approach for the data server is to follow the Representational State Transfer (RESTful) software architecture style for client–server communication. At its core, the API uses the POST and GET methods of the HTTP protocol to transfer workflow information in message bodies to targets specified in the URL to and from the database via a web server. Higher level API calls are built upon this core API. This design facilitates implementation on different platforms and in different languages and is robust to changes in the underlying technologies used. The command line client implementation can communicate with the data server from any machine with HTTP access.
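
    The RESTful pattern described, POSTing workflow information in message bodies to URL targets, looks roughly like the following Python sketch. The server URL, routes, and payload fields are invented for illustration; the real MPO API defines its own schema.

        import requests

        BASE = "https://mpo.example.org/api"  # hypothetical server URL

        def record_workflow(name, steps):
            """POST a workflow and its chained steps to a RESTful provenance
            server; routes and payload fields are invented for illustration."""
            response = requests.post(f"{BASE}/workflow", json={"name": name})
            response.raise_for_status()
            workflow_uid = response.json()["uid"]
            parent = workflow_uid
            for step in steps:
                response = requests.post(
                    f"{BASE}/activity",
                    json={"workflow": workflow_uid, "parent": parent, "name": step})
                response.raise_for_status()
                parent = response.json()["uid"]

        record_workflow("equilibrium-run", ["fetch shot data", "solve", "plot"])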

  16. Temporal similarity measures for querying clinical workflows.

    Science.gov (United States)

    Combi, Carlo; Gozzi, Matteo; Oliboni, Barbara; Juarez, Jose M; Marin, Roque

    2009-05-01

    In this paper, we extend a preliminary proposal and discuss, in a deeper and more formal way, an approach to evaluating temporal similarity between clinical workflow cases (i.e., executions of clinical processes). More precisely, we focus on (i) the representation of clinical processes by using a temporal conceptual workflow model; (ii) the definition of ad hoc temporal constraint networks to formally represent clinical workflow cases; (iii) the definition of temporal similarity for clinical workflow cases based on the comparison of temporal constraint networks; (iv) the management of the similarity of clinical processes related to the Italian guideline for stroke prevention and management (SPREAD). Clinical processes are composed of clinical activities to be done by given actors in a given order satisfying given temporal constraints. This description means that clinical processes can be seen as organizational processes and modeled by workflow schemata. When a workflow schema represents a clinical process, its cases represent different instances derived from dealing with different patients in different situations. With respect to all the cases related to a workflow schema, each clinical case can differ with respect to its structure and its temporal aspects. Clinical cases can be stored in clinical databases, and information retrieval can be done by evaluating the similarity between workflow cases. We first describe a possible approach to the conceptual modeling of a clinical process, using a temporally extended workflow model. Then, we define how a workflow case can be represented as a set of activities, and show how to express them through temporal constraint networks. Once we have built the temporal constraint networks related to the cases to compare, we propose a similarity function able to evaluate the differences between the considered cases with respect to the order and duration of corresponding activities, and with respect to the presence/absence of some
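
    A drastically simplified version of a duration-based comparison scores two cases by penalising duration differences of shared activities while letting unshared activities count as zero; real temporal constraint networks compare full sets of interval relations. The activity names and durations below are invented.

        def case_similarity(case_a, case_b):
            """Cases map activity name -> duration (hours). Returns a score
            in [0, 1]; 1 means identical activities and durations."""
            shared = set(case_a) & set(case_b)
            all_activities = set(case_a) | set(case_b)
            if not all_activities:
                return 1.0
            score = 0.0
            for activity in shared:
                a, b = case_a[activity], case_b[activity]
                score += 1.0 - abs(a - b) / max(a, b)   # duration agreement
            return score / len(all_activities)          # unshared acts score 0

        stroke_a = {"ct_scan": 1.0, "thrombolysis": 2.0, "monitoring": 24.0}
        stroke_b = {"ct_scan": 1.5, "thrombolysis": 2.0, "monitoring": 36.0}
        print(round(case_similarity(stroke_a, stroke_b), 2))  # 0.78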

  17. Toward Common Components for Open Workflow Systems

    OpenAIRE

    Billings, Jay Jay; Jha, Shantenu

    2017-01-01

    The role of scalable high-performance workflows and flexible workflow management systems that can support multiple simulations will continue to increase in importance. For example, with the end of Dennard scaling, there is a need to substitute a single long running simulation with multiple repeats of shorter simulations, or concurrent replicas. Further, many scientific problems involve ensembles of simulations in order to solve a higher-level problem or produce statistically meaningful result...

  18. A workflow learning model to improve geovisual analytics utility.

    Science.gov (United States)

    Roth, Robert E; Maceachren, Alan M; McCabe, Craig A

    2009-01-01

    INTRODUCTION: This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. OBJECTIVES: The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important as, if not more important than, iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. METHODOLOGY: The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on

  19. Multilevel Workflow System in the ATLAS Experiment

    CERN Document Server

    Borodin, M; The ATLAS collaboration; De, K; Golubkov, D; Klimentov, A; Maeno, T; Vaniachine, A

    2014-01-01

    The ATLAS experiment is scaling up Big Data processing for the next LHC run using a multilevel workflow system comprised of many layers. In Big Data processing ATLAS deals with datasets, not individual files. Similarly a task (comprised of many jobs) has become a unit of the ATLAS workflow in distributed computing, with about 0.8M tasks processed per year. In order to manage the diversity of LHC physics (exceeding 35K physics samples per year), the individual data processing tasks are organized into workflows. For example, the Monte Carlo workflow is composed of many steps: generate or configure hard-processes, hadronize signal and minimum-bias (pileup) events, simulate energy deposition in the ATLAS detector, digitize electronics response, simulate triggers, reconstruct data, convert the reconstructed data into ROOT ntuples for physics analysis, etc. Outputs are merged and/or filtered as necessary to optimize the chain. The bi-level workflow manager - ProdSys2 - generates actual workflow tasks and their jobs...

  20. Multilevel Workflow System in the ATLAS Experiment

    CERN Document Server

    Borodin, M; The ATLAS collaboration; Golubkov, D; Klimentov, A; Maeno, T; Vaniachine, A

    2015-01-01

    The ATLAS experiment is scaling up Big Data processing for the next LHC run using a multilevel workflow system comprised of many layers. In Big Data processing ATLAS deals with datasets, not individual files. Similarly a task (comprised of many jobs) has become a unit of the ATLAS workflow in distributed computing, with about 0.8M tasks processed per year. In order to manage the diversity of LHC physics (exceeding 35K physics samples per year), the individual data processing tasks are organized into workflows. For example, the Monte Carlo workflow is composed of many steps: generate or configure hard-processes, hadronize signal and minimum-bias (pileup) events, simulate energy deposition in the ATLAS detector, digitize electronics response, simulate triggers, reconstruct data, convert the reconstructed data into ROOT ntuples for physics analysis, etc. Outputs are merged and/or filtered as necessary to optimize the chain. The bi-level workflow manager - ProdSys2 - generates actual workflow tasks and their jobs...

  1. SPATIAL DATA QUALITY AND A WORKFLOW TOOL

    Directory of Open Access Journals (Sweden)

    M. Meijer

    2015-08-01

    Full Text Available Although perceived by many as important, spatial data quality has hardly ever taken centre stage unless something went wrong due to bad quality. However, we think this is going to change soon. We are more and more relying on data-driven processes and, due to the increased availability of data, there is a choice in what data to use. How to make that choice? We think spatial data quality has potential as a selection criterion. In this paper we focus on how a workflow tool can help the consumer as well as the producer to get a better understanding about which product characteristics are important. For this purpose, we have developed a framework in which we define different roles (consumer, producer and intermediary) and differentiate between product specifications and quality specifications. A number of requirements are stated that can be translated into quality elements. We used case studies to validate our framework. This framework is designed following the fitness for use principle. Also part of this framework is software that in some cases can help ascertain the quality of datasets.

  2. Workflow to numerically reproduce laboratory ultrasonic datasets

    Directory of Open Access Journals (Sweden)

    A. Biryukov

    2014-12-01

    Full Text Available The risks and uncertainties related to the storage of high-level radioactive waste (HLRW can be reduced thanks to focused studies and investigations. HLRWs are going to be placed in deep geological repositories, enveloped in an engineered bentonite barrier, whose physical conditions are subjected to change throughout the lifespan of the infrastructure. Seismic tomography can be employed to monitor its physical state and integrity. The design of the seismic monitoring system can be optimized via conducting and analyzing numerical simulations of wave propagation in representative repository geometry. However, the quality of the numerical results relies on their initial calibration. The main aim of this paper is to provide a workflow to calibrate numerical tools employing laboratory ultrasonic datasets. The finite difference code SOFI2D was employed to model ultrasonic waves propagating through a laboratory sample. Specifically, the input velocity model was calibrated to achieve a best match between experimental and numerical ultrasonic traces. Likely due to the imperfections of the contact surfaces, the resultant velocities of P- and S-wave propagation tend to be noticeably lower than those a priori assigned. Then, the calibrated model was employed to estimate the attenuation in a montmorillonite sample. The obtained low quality factors (Q suggest that pronounced inelastic behavior of the clay has to be taken into account in geophysical modeling and analysis. Consequently, this contribution should be considered as a first step towards the creation of a numerical tool to evaluate wave propagation in nuclear waste repositories.

  3. Spatial Data Quality and a Workflow Tool

    Science.gov (United States)

    Meijer, M.; Vullings, L. A. E.; Bulens, J. D.; Rip, F. I.; Boss, M.; Hazeu, G.; Storm, M.

    2015-08-01

    Although perceived by many as important, spatial data quality has hardly ever taken centre stage unless something went wrong due to bad quality. However, we think this is going to change soon. We are more and more relying on data-driven processes and, due to the increased availability of data, there is a choice in what data to use. How to make that choice? We think spatial data quality has potential as a selection criterion. In this paper we focus on how a workflow tool can help the consumer as well as the producer to get a better understanding about which product characteristics are important. For this purpose, we have developed a framework in which we define different roles (consumer, producer and intermediary) and differentiate between product specifications and quality specifications. A number of requirements are stated that can be translated into quality elements. We used case studies to validate our framework. This framework is designed following the fitness for use principle. Also part of this framework is software that in some cases can help ascertain the quality of datasets.

  4. Electronic health information in use: Characteristics that support employee workflow and patient care.

    Science.gov (United States)

    Russ, Alissa L; Saleem, Jason J; Justice, Connie F; Woodward-Hagg, Heather; Woodbridge, Peter A; Doebbeling, Bradley N

    2010-12-01

    The aim of this investigation was to assess helpful and challenging aspects of electronic health information with respect to clinical workflow and identify a set of characteristics that support patient care processes. We conducted 20 semi-structured interviews at a Veterans Affairs Medical Center, with a fully implemented electronic health record (EHR), and elicited positive and negative examples of how information technology (IT) affects the work of healthcare employees. Responses naturally shed light on information characteristics that aid work processes. We performed a secondary analysis on interview data and inductively identified characteristics of electronic information that support healthcare workflow. Participants provided 199 examples of how electronic information affects workflow. Seventeen characteristics emerged along with four primary domains: trustworthy and reliable; ubiquitous; effectively displayed; and adaptable to work demands. Each characteristic may be used to help evaluate health information technology pre- and post-implementation. Results provide several strategies to improve EHR design and implementation to better support healthcare workflow.

  5. Re-engineering Workflows: Changing the Life Cycle of an Electronic Health Record System

    Directory of Open Access Journals (Sweden)

    Jane M. Brokel

    2011-01-01

    Full Text Available An existing electronic health record (EHR system was re-engineered with cross-functional workflows to enhance the efficiency and clinical utility of health information technology. The new designs were guided by a systematic review of clinicians' requests, which were garnered by direct interviews. To design cross-functional, patient-centered workflows, several multi-disciplinary teams from the health system of hospitals, clinics, and other services participated. We identified gaps and inconsistencies with current care processes and implemented changes that improved workflow for patients and clinicians. Our findings emphasize that, to coordinate care between many providers, process workflow must be standardized within and across settings and focus on patient care processes, not the technology. These new, comprehensive, admission-to-discharge workflows replaced the older, functional- and departmental-process flow charts that had fallen short. Our experience led to integrated redesign of the workflows, review prior to implementation and ongoing maintenance of this process knowledge across 37 hospital facilities.

  6. a Workflow for UAV's Integration Into a Geodesign Platform

    Science.gov (United States)

    Anca, P.; Calugaru, A.; Alixandroae, I.; Nazarie, R.

    2016-06-01

    This paper presents a workflow for the development of various Geodesign scenarios. The subject is important in the context of identifying patterns and designing solutions for a Smart City with optimized public transportation, efficient buildings, efficient utilities, recreational facilities, and so on. The workflow describes the procedures starting with acquiring data in the field, through data processing, orthophoto generation, DTM generation, and integration into a GIS platform, to analysis in better support of Geodesign. Esri's City Engine is used mostly for its 3D modeling capabilities, which enable the user to obtain realistic 3D models. The workflow uses as inputs information extracted from images acquired using UAV technologies, namely eBee, existing 2D GIS geodatabases, and a set of CGA rules. The method used further, called procedural modeling, uses rules to extrude buildings, the street network, parcel zoning and side details, based on the initial attributes from the geodatabase. The resulting products are various scenarios for redesign and for analyzing new exploitation sites. Finally, these scenarios can be published as interactive web scenes for internal, group, or public consultation. In this way, problems like the impact of new constructions being built, re-arranging green spaces or changing routes for public transportation, etc., are revealed through impact, visibility, or shadowing analysis and are brought to the citizens' attention. This leads to better decisions.

  7. How Workflow Documentation Facilitates Curation Planning

    Science.gov (United States)

    Wickett, K.; Thomer, A. K.; Baker, K. S.; DiLauro, T.; Asangba, A. E.

    2013-12-01

    The description of the specific processes and artifacts that led to the creation of a data product provide a detailed picture of data provenance in the form of a workflow. The Site-Based Data Curation project, hosted by the Center for Informatics Research in Science and Scholarship at the University of Illinois, has been investigating how workflows can be used in developing curation processes and policies that move curation "upstream" in the research process. The team has documented an individual workflow for geobiology data collected during a single field trip to Yellowstone National Park. This specific workflow suggests a generalized three-part process for field data collection that comprises three distinct elements: a Planning Stage, a Fieldwork Stage, and a Processing and Analysis Stage. Beyond supplying an account of data provenance, the workflow has allowed the team to identify 1) points of intervention for curation processes and 2) data products that are likely candidates for sharing or deposit. Although these objects may be viewed by individual researchers as 'intermediate' data products, discussions with geobiology researchers have suggested that with appropriate packaging and description they may serve as valuable observational data for other researchers. Curation interventions may include the introduction of regularized data formats during the planning process, data description procedures, the identification and use of established controlled vocabularies, and data quality and validation procedures. We propose a poster that shows the individual workflow and our generalization into a three-stage process. We plan to discuss with attendees how well the three-stage view applies to other types of field-based research, likely points of intervention, and what kinds of interventions are appropriate and feasible in the example workflow.

  8. SHIWA Services for Workflow Creation and Sharing in Hydrometeorology

    Science.gov (United States)

    Terstyanszky, Gabor; Kiss, Tamas; Kacsuk, Peter; Sipos, Gergely

    2014-05-01

    Researchers want to run scientific experiments on Distributed Computing Infrastructures (DCI) to access large pools of resources and services. Running these experiments requires specific expertise that they may not have. Workflows can hide resources and services as a virtualisation layer, providing a user interface that researchers can use. There are many scientific workflow systems, but they are not interoperable. Learning a workflow system and creating workflows may require significant effort; considering this effort, it is not reasonable to expect that researchers will learn new workflow systems if they want to run workflows developed in other workflow systems. Overcoming this requires workflow interoperability solutions that allow workflow sharing. The FP7 'Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs' (SHIWA) project developed the Coarse-Grained Interoperability concept (CGI). It enables recycling and sharing workflows of different workflow systems and executing them on different DCIs. SHIWA developed the SHIWA Simulation Platform (SSP) to implement the CGI concept, integrating three major components: the SHIWA Science Gateway, the workflow engines supported by the CGI concept, and the DCI resources where workflows are executed. The science gateway contains a portal, a submission service, a workflow repository and a proxy server to support the whole workflow life-cycle. The SHIWA Portal allows workflow creation, configuration, execution and monitoring through a Graphical User Interface, using the WS-PGRADE workflow system as the host workflow system. The SHIWA Repository stores the formal description of workflows and workflow engines plus the executables and data needed to execute them. It offers a wide range of browse and search operations. To support non-native workflow execution the SHIWA Submission Service imports the workflow and workflow engine from the SHIWA Repository. This service either invokes locally or remotely...

  9. Understanding the dispensary workflow at the Birmingham Free Clinic: a proposed framework for an informatics intervention.

    Science.gov (United States)

    Fisher, Arielle M; Herbert, Mary I; Douglas, Gerald P

    2016-02-19

    The Birmingham Free Clinic (BFC) in Pittsburgh, Pennsylvania, USA is a free, walk-in clinic that serves medically uninsured populations through the use of volunteer health care providers and an on-site medication dispensary. The introduction of an electronic medical record (EMR) has improved several aspects of clinic workflow. However, pharmacists' tasks involving medication management and dispensing have become more challenging since EMR implementation due to its inability to support workflows between the medical and pharmaceutical services. To inform the design of a systematic intervention, we conducted a needs assessment study to identify workflow challenges and process inefficiencies in the dispensary. We used contextual inquiry to document the dispensary workflow and facilitate identification of critical aspects of intervention design specific to the user. Pharmacists were observed according to contextual inquiry guidelines. Graphical models were produced to aid data and process visualization. We created a list of themes describing workflow challenges and asked the pharmacists to rank them in order of significance to narrow the scope of intervention design. Three pharmacists were observed at the BFC. Observer notes were documented and analyzed to produce 13 themes outlining the primary challenges pharmacists encounter during dispensation at the BFC. The dispensary workflow is labor intensive, redundant, and inefficient when integrated with the clinical service. Observations identified inefficiencies that may benefit from the introduction of informatics interventions including: medication labeling, insufficient process notification, triple documentation, and inventory control. We propose a system for Prescription Management and General Inventory Control (RxMAGIC). RxMAGIC is a framework designed to mitigate workflow challenges and improve the processes of medication management and inventory control. While RxMAGIC is described in the context of the BFC...

  10. Integrating configuration workflows with project management system

    Science.gov (United States)

    Nilsen, Dimitri; Weber, Pavel

    2014-06-01

    The complexity of the heterogeneous computing resources, services and recurring infrastructure changes at the GridKa WLCG Tier-1 computing center require a structured approach to configuration management and optimization of interplay between functional components of the whole system. A set of tools deployed at GridKa, including Puppet, Redmine, Foreman, SVN and Icinga, provides the administrative environment giving the possibility to define and develop configuration workflows, reduce the administrative effort and improve sustainable operation of the whole computing center. In this presentation we discuss the developed configuration scenarios implemented at GridKa, which we use for host installation, service deployment, change management procedures, service retirement etc. The integration of Puppet with a project management tool like Redmine provides us with the opportunity to track problem issues, organize tasks and automate these workflows. The interaction between Puppet and Redmine results in automatic updates of the issues related to the executed workflow performed by different system components. The extensive configuration workflows require collaboration and interaction between different departments like network, security, production etc. at GridKa. Redmine plugins developed at GridKa and integrated in its administrative environment provide an effective way of collaboration within the GridKa team. We present the structural overview of the software components, their connections, communication protocols and show a few working examples of the workflows and their automation.
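
    A minimal sketch of the kind of Puppet-to-Redmine coupling described above: a workflow step posts an automatic journal note to the Redmine issue that tracks it. The endpoint (PUT /issues/<id>.json with an X-Redmine-API-Key header) is part of Redmine's standard REST API; the host, API key, issue id, and step name below are hypothetical placeholders, and the GridKa plugins themselves are not reproduced here.

        # Post a journal note to the Redmine issue tracking this workflow step.
        # Host, key, and issue id are placeholders; PUT /issues/<id>.json is
        # Redmine's documented REST endpoint for updating an issue.
        import requests

        REDMINE_URL = "https://redmine.example.org"  # placeholder
        API_KEY = "secret-api-key"                   # placeholder

        def note_workflow_step(issue_id: int, step: str, status: str) -> None:
            payload = {"issue": {"notes": f"Workflow step '{step}' finished: {status}"}}
            resp = requests.put(
                f"{REDMINE_URL}/issues/{issue_id}.json",
                json=payload,
                headers={"X-Redmine-API-Key": API_KEY},
                timeout=10,
            )
            resp.raise_for_status()

        # Example (requires a reachable Redmine instance):
        # note_workflow_step(4711, "deploy dCache pool", "success")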

  11. The MPO system for automatic workflow documentation

    Energy Technology Data Exchange (ETDEWEB)

    Abla, G.; Coviello, E.N.; Flanagan, S.M. [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Greenwald, M. [Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Lee, X. [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Romosan, A. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Schissel, D.P., E-mail: schissel@fusion.gat.com [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Shoshani, A. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Stillerman, J.; Wright, J. [Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Wu, K.J. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)

    2016-11-15

    Highlights: • Data model, infrastructure, and tools for data tracking, cataloging, and integration. • Automatically document workflow and data provenance in the widest sense. • Fusion Science as test bed but the system’s framework and data model is quite general. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for critical applications. However, it is not the mere existence of data that is important, but our ability to make use of it. Experience has shown that when metadata is better organized and more complete, the underlying data becomes more useful. Traditionally, capturing the steps of scientific workflows and metadata was the role of the lab notebook, but the digital era has resulted instead in the fragmentation of data, processing, and annotation. This paper presents the Metadata, Provenance, and Ontology (MPO) System, the software that can automate the documentation of scientific workflows and associated information. Based on recorded metadata, it provides explicit information about the relationships among the elements of workflows in notebook form augmented with directed acyclic graphs. A set of web-based graphical navigation tools and Application Programming Interface (API) have been created for searching and browsing, as well as programmatically accessing the workflows and data. We describe the MPO concepts and its software architecture. We also report the current status of the software as well as the initial deployment experience.

  12. SwinDeW-C: A Peer-to-Peer Based Cloud Workflow System

    Science.gov (United States)

    Liu, Xiao; Yuan, Dong; Zhang, Gaofeng; Chen, Jinjun; Yang, Yun

    Workflow systems are designed to support the process automation of large scale business and scientific applications. In recent years, many workflow systems have been deployed on high performance computing infrastructures such as cluster, peer-to-peer (p2p), and grid computing (Moore, 2004; Wang, Jie, & Chen, 2009; Yang, Liu, Chen, Lignier, & Jin, 2007). One of the driving forces is the increasing demand for large scale instance and data/computation intensive workflow applications (large scale workflow applications for short) which are common in both eBusiness and eScience application areas. Typical examples (detailed in Section 13.2.1) include the transaction-intensive nation-wide insurance claim process and the data- and computation-intensive pulsar searching process in astrophysics. Generally speaking, instance intensive applications are those processes which need to be executed a large number of times sequentially within a very short period or concurrently with a large number of instances (Liu, Chen, Yang, & Jin, 2008; Liu et al., 2010; Yang et al., 2008). Therefore, large scale workflow applications normally require the support of high performance computing infrastructures (e.g. advanced CPU units, large memory space and high speed network), especially when workflow activities are data and computation intensive themselves. In the real world, to accommodate such requirements, expensive computing infrastructures including supercomputers and data servers are bought, installed, integrated and maintained at huge cost by system users...

  13. An analytical method for well-formed workflow/Petri net verification of classical soundness

    Directory of Open Access Journals (Sweden)

    Clempner Julio

    2014-12-01

    Full Text Available In this paper we consider workflow nets as dynamical systems governed by ordinary difference equations described by a particular class of Petri nets. Workflow nets are a formal model of business processes. Well-formed business processes correspond to sound workflow nets. Even if it seems necessary to require the soundness of workflow nets, there exist business processes with conditional behavior that will not necessarily satisfy the soundness property. In this sense, we propose an analytical method for showing that a workflow net satisfies the classical soundness property using a Petri net. To present our statement, we use Lyapunov stability theory to tackle the classical soundness verification problem for a class of dynamical systems described by Petri nets. This class of Petri nets allows a dynamical model representation that can be expressed in terms of difference equations. As a result, by applying Lyapunov theory, the classical soundness property for workflow nets is solved by proving that the Petri net representation is stable. We show that a finite and non-blocking workflow net satisfies the soundness property if and only if its corresponding PN is stable, i.e., given the incidence matrix A of the corresponding PN, there exists a strictly positive m-vector Φ such that AΦ ≤ 0. The key contribution of the paper is the analytical method itself, which satisfies part of the definition of the classical soundness requirements. The method is designed for practical applications, guarantees that anomalies can be detected without domain knowledge, and can be easily implemented into existing commercial systems that do not support the verification of workflows. The validity of the proposed method is successfully demonstrated by application examples.
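
    To make the stability criterion concrete, here is a minimal sketch (not from the paper) that searches for a strictly positive vector Φ with AΦ ≤ 0 via linear programming; since the condition is invariant under positive scaling, Φ ≥ 1 can be required without loss of generality. The toy incidence matrix is invented for illustration:

        # Feasibility test for the stability condition: does a strictly positive
        # vector phi with A @ phi <= 0 exist? (phi >= 1 w.l.o.g., scale-free.)
        import numpy as np
        from scipy.optimize import linprog

        def is_stable(A: np.ndarray) -> bool:
            m = A.shape[1]
            res = linprog(
                c=np.zeros(m),                 # pure feasibility problem
                A_ub=A, b_ub=np.zeros(A.shape[0]),
                bounds=[(1.0, None)] * m,      # strict positivity via phi >= 1
                method="highs",
            )
            return res.status == 0

        # Toy incidence matrix of a small cyclic net (illustrative only).
        A = np.array([[-1, 1, 0],
                      [0, -1, 1],
                      [1, 0, -1]])
        print("stable:", is_stable(A))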

  14. Workflows for intelligent monitoring using proxy services.

    Science.gov (United States)

    Rüping, Stefan; Wegener, Dennis; Sfakianakis, Stelios; Sengstag, Thierry

    2009-01-01

    Grid technologies have proven to be very successful in the area of eScience, and in particular in healthcare applications. But while the applicability of workflow enacting tools for biomedical research has long since been proven, practical adoption in regular clinical research faces some additional challenges in a grid context. In this paper, we investigate the case of data monitoring, and how to seamlessly implement the step between a one-time proof-of-concept workflow and high-performance on-line monitoring of data streams, as exemplified by the case of long-running clinical trials. We present an approach based on proxy services that allows executing single-run workflows repeatedly with little overhead.
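
    A minimal sketch of the proxy idea under stated assumptions: a one-shot workflow function is wrapped in a proxy loop that re-executes it for each new batch arriving on a stream. All names and the toy stream are illustrative; the paper's grid proxy services are not reproduced:

        # Wrap a one-shot workflow in a proxy loop that re-runs it per batch.
        import time
        from typing import Callable

        def make_monitoring_proxy(run_workflow: Callable[[list], dict],
                                  poll_stream: Callable[[], list],
                                  interval_s: float) -> Callable[[int], None]:
            def monitor(rounds: int) -> None:  # bounded here; unbounded in practice
                for _ in range(rounds):
                    batch = poll_stream()
                    if batch:
                        print("monitoring result:", run_workflow(batch))
                    time.sleep(interval_s)
            return monitor

        # Toy stand-ins for a clinical-trial data stream and analysis workflow.
        stream = iter([[1, 2], [], [3]])
        monitor = make_monitoring_proxy(lambda b: {"n": len(b)},
                                        lambda: next(stream, []),
                                        interval_s=0.01)
        monitor(rounds=3)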

  15. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    Rzehorz, Gerhard Ferdinand; The ATLAS collaboration

    2016-01-01

    This contribution reports on the feasibility of executing data intensive workflows on Cloud infrastructures. In order to assess this, the metric ETC = Events/Time/Cost is formed, which quantifies the different workflow and infrastructure configurations that are tested against each other. In these tests ATLAS reconstruction jobs are run, examining the effects of overcommitting (more parallel processes running than CPU cores available), scheduling (staggered execution) and scaling (number of cores). The desirability of commissioning storage in the cloud is evaluated, in conjunction with a simple analytical model of the system, and correlated with questions about the network bandwidth, caches and what kind of storage to utilise. In the end a cost/benefit evaluation of different infrastructure configurations and workflows is undertaken, with the goal of finding the maximum ETC value.
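
    Reading the metric as events per unit time per unit cost, a back-of-the-envelope comparison of two hypothetical configurations might look like the following sketch (names and numbers invented):

        # ETC = events / (time * cost); configurations and numbers are invented.
        from dataclasses import dataclass

        @dataclass
        class Config:
            name: str
            events: int        # reconstructed events
            wall_hours: float  # wall-clock time [h]
            cost: float        # infrastructure cost over that time [arbitrary units]

            @property
            def etc(self) -> float:
                return self.events / (self.wall_hours * self.cost)

        configs = [
            Config("8 cores, no overcommit", 40_000, 10.0, 4.0),
            Config("8 cores, 2x overcommit", 60_000, 10.0, 4.0),
        ]
        best = max(configs, key=lambda c: c.etc)
        print(f"best configuration: {best.name} (ETC = {best.etc:.0f})")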

  16. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00396985; The ATLAS collaboration; Keeble, Oliver; Quadt, Arnulf; Kawamura, Gen

    2017-01-01

    This contribution reports on the feasibility of executing data intensive workflows on Cloud infrastructures. In order to assess this, the metric ETC = Events/Time/Cost is formed, which quantifies the different workflow and infrastructure configurations that are tested against each other. In these tests ATLAS reconstruction jobs are run, examining the effects of overcommitting (more parallel processes running than CPU cores available), scheduling (staggered execution) and scaling (number of cores). The desirability of commissioning storage in the Cloud is evaluated, in conjunction with a simple analytical model of the system, and correlated with questions about the network bandwidth, caches and what kind of storage to utilise. In the end a cost/benefit evaluation of different infrastructure configurations and workflows is undertaken, with the goal of finding the maximum ETC value.

  17. Logical provenance in data-oriented workflows?

    KAUST Repository

    Ikeda, R.

    2013-04-01

    We consider the problem of defining, generating, and tracing provenance in data-oriented workflows, in which input data sets are processed by a graph of transformations to produce output results. We first give a new general definition of provenance for general transformations, introducing the notions of correctness, precision, and minimality. We then determine when properties such as correctness and minimality carry over from the individual transformations' provenance to the workflow provenance. We describe a simple logical-provenance specification language consisting of attribute mappings and filters. We provide an algorithm for provenance tracing in workflows where logical provenance for each transformation is specified using our language. We consider logical provenance in the relational setting, observing that for a class of Select-Project-Join (SPJ) transformations, logical provenance specifications encode minimal provenance. We have built a prototype system supporting the features and algorithms presented in the paper, and we report a few preliminary experimental results. © 2013 IEEE.
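
    As a rough illustration of tracing with attribute mappings and filters, the sketch below designates as provenance those input rows that agree with the output row on the mapped attributes and pass the filter; the mini-language here is our own simplification, not the paper's specification language:

        # Backward tracing: which input rows does the mapping + filter designate
        # as provenance of a given output row? (Illustrative mini-language.)
        from typing import Callable

        Row = dict

        def trace(out_row: Row, inputs: list[Row],
                  mapping: dict[str, str],
                  flt: Callable[[Row, Row], bool]) -> list[Row]:
            return [r for r in inputs
                    if all(out_row[o] == r[i] for o, i in mapping.items())
                    and flt(out_row, r)]

        orders = [{"id": 1, "cust": "ann"}, {"id": 2, "cust": "bob"}]
        out = {"order_id": 2, "cust": "bob"}
        # Attribute mapping: output 'order_id' comes from input 'id'; no filter.
        print(trace(out, orders, {"order_id": "id"}, lambda o, r: True))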

  18. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of real-world business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations ... of events, reward-based properties and best- and worst-case scenarios. We develop a simple example of medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...
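
    As a flavour of the reward-based queries such a framework answers, the following toy computation (model and numbers invented, not the paper's example) derives the expected accumulated reward, e.g. drug doses dispensed, before a small probabilistic workflow terminates:

        # Expected accumulated reward before absorption in a tiny probabilistic
        # workflow: x = r + Q x over the transient states, i.e. x = (I - Q)^-1 r.
        import numpy as np

        # States: 0 = assess, 1 = treat, 2 = done (absorbing).
        P = np.array([[0.0, 0.7, 0.3],
                      [0.4, 0.0, 0.6],
                      [0.0, 0.0, 1.0]])
        reward = np.array([0.0, 1.0, 0.0])  # one dose dispensed per 'treat' visit

        Q, r = P[:2, :2], reward[:2]        # restrict to transient states
        x = np.linalg.solve(np.eye(2) - Q, r)
        print(f"expected doses per case starting at 'assess': {x[0]:.2f}")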

  19. Comprehensive workflow for wireline fluid sampling in unconsolidated formations utilizing new large volume sampling equipment

    Energy Technology Data Exchange (ETDEWEB)

    Kvinnsland, S.; Brun, M. [TOTAL EandP Norge (Norway); Achourov, V.; Gisolf, A. [Schlumberger (Canada)

    2011-07-01

    Precise and accurate knowledge of fluid properties is essential in unconsolidated formations for the design of production facilities. Wireline formation testers (WFT) have a wide range of applications, and the latest WFT can be used to define fluid properties in wells drilled with oil based mud (OBM) by acquiring PVT and large volume samples. To use these technologies, a comprehensive workflow has to be implemented, and the aim of this paper is to present such a workflow. Sampling was conducted in highly unconsolidated sand saturated with biodegradable fluid in the Hild field in the North Sea. Results showed the use of the comprehensive workflow to be successful in obtaining large volume samples with a contamination level below 1%. These samples allowed the oil to be precisely characterized and made design updates to the project possible. This paper highlighted that the use of the latest WFT technologies can help better characterize fluids in unconsolidated formations and thus optimize production facilities design.

  20. Bioinformatics Analysis of Small RNA Transcriptomes: The Detailed Workflow.

    Science.gov (United States)

    Ilnytskyy, Slava; Bilichak, Andriy

    2017-01-01

    Next-generation sequencing has become a method of choice for the investigation of small RNA transcriptomes in plants and animals. Although the technical side of sequencing itself is becoming routine and experimental costs are affordable, data analysis still remains a challenge, especially for researchers with limited computational experience. Here, we present a detailed description of a computational workflow designed to take raw sequencing reads as input, to obtain small RNA predictions, and to detect the differentially expressed microRNAs as a result. The exact commands and pieces of code are provided and hopefully can be adapted and used by other researchers to facilitate the study of small RNA regulation.

  1. KDE Bioscience: platform for bioinformatics analysis workflows.

    Science.gov (United States)

    Lu, Qiang; Hao, Pei; Curcin, Vasa; He, Weizhong; Li, Yuan-Yuan; Luo, Qing-Ming; Guo, Yi-Ke; Li, Yi-Xue

    2006-08-01

    Bioinformatics is a dynamic research area in which a large number of algorithms and programs have been developed rapidly and independently without much consideration so far of the need for standardization. The lack of such common standards combined with unfriendly interfaces make it difficult for biologists to learn how to use these tools and to translate the data formats from one to another. Consequently, the construction of an integrative bioinformatics platform to facilitate biologists' research is an urgent and challenging task. KDE Bioscience is a java-based software platform that collects a variety of bioinformatics tools and provides a workflow mechanism to integrate them. Nucleotide and protein sequences from local flat files, web sites, and relational databases can be entered, annotated, and aligned. Several home-made or 3rd-party viewers are built-in to provide visualization of annotations or alignments. KDE Bioscience can also be deployed in client-server mode where simultaneous execution of the same workflow is supported for multiple users. Moreover, workflows can be published as web pages that can be executed from a web browser. The power of KDE Bioscience comes from the integrated algorithms and data sources. With its generic workflow mechanism other novel calculations and simulations can be integrated to augment the current sequence analysis functions. Because of this flexible and extensible architecture, KDE Bioscience makes an ideal integrated informatics environment for future bioinformatics or systems biology research.

  2. Distributed interoperable workflow support for electronic commerce

    NARCIS (Netherlands)

    Papazoglou, M.; Jeusfeld, M.A.; Weigand, H.; Jarke, M.

    1998-01-01

    Abstract. This paper describes a flexible distributed transactional workflow environment based on an extensible object-oriented framework built around class libraries, application programming interfaces, and shared services. The purpose of this environment is to support a range of EC-like business

  3. Adaptive workflow simulation of emergency response

    NARCIS (Netherlands)

    Bruinsma, Guido Wybe Jan

    2010-01-01

    Recent incidents and major training exercises in and outside the Netherlands have persistently shown that not having or not sharing information during emergency response are major sources of emergency response inefficiency and error, and affect incident mitigation outcomes through workflow planning

  4. Evolutionary optimization of production materials workflow processes

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    We present an evolutionary optimisation technique for stochastic production processes, which is able to find improved production materials workflow processes with respect to arbitrary combinations of numerical quantities associated with the production process. Working from a core fragment of the ... where a baked goods company seeks to improve production time while simultaneously minimising the cost and use of resources...

  5. Provenance for distributed biomedical workflow execution.

    Science.gov (United States)

    Madougou, Souley; Santcroos, Mark; Benabdelkader, Ammar; van Schaik, Barbera D C; Shahand, Shayan; Korkhov, Vladimir; van Kampen, Antoine H C; Olabarriaga, Sílvia D

    2012-01-01

    Scientific research has become very data and compute intensive because of the progress in data acquisition and measurement devices, which is particularly true in the Life Sciences. To cope with this deluge of data, scientists use distributed computing and storage infrastructures. The use of such infrastructures introduces new challenges for scientists in terms of proper and efficient use. Scientific workflow management systems play an important role in facilitating the use of the infrastructure by hiding some of its complexity. Although most scientific workflow management systems are provenance-aware, not all of them come with provenance functionality out of the box. In this paper we describe the improvement and integration of a provenance system into an e-infrastructure for biomedical research based on the MOTEUR workflow management system. The main contributions of the paper are: presenting an OPM implementation using a relational database backend for the provenance store; providing an e-infrastructure with a comprehensive provenance system; defining a generic approach to provenance implementation, potentially suitable for other workflow systems and application domains; and demonstrating the value of this system through use cases presenting the provenance data in a user-friendly web interface.
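
    A minimal sketch of an OPM-style provenance store on a relational backend, assuming a deliberately simplified schema (artifacts, processes, and used/wasGeneratedBy edges) rather than the project's actual one:

        # Toy relational provenance store: artifacts, processes, and edges.
        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
        CREATE TABLE artifact (id TEXT PRIMARY KEY, uri TEXT);
        CREATE TABLE process  (id TEXT PRIMARY KEY, name TEXT);
        CREATE TABLE used     (process TEXT, artifact TEXT);
        CREATE TABLE generated(artifact TEXT, process TEXT);
        """)
        con.executemany("INSERT INTO artifact VALUES (?,?)",
                        [("a1", "raw.nii"), ("a2", "aligned.nii")])
        con.execute("INSERT INTO process VALUES ('p1', 'align')")
        con.execute("INSERT INTO used VALUES ('p1', 'a1')")
        con.execute("INSERT INTO generated VALUES ('a2', 'p1')")

        # Which inputs does artifact a2 derive from (one hop back)?
        rows = con.execute("""
        SELECT u.artifact FROM generated g JOIN used u ON u.process = g.process
        WHERE g.artifact = ?""", ("a2",)).fetchall()
        print(rows)  # [('a1',)]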

  6. On Secure Workflow Decentralisation on the Internet

    Directory of Open Access Journals (Sweden)

    Petteri Kaskenpalo

    2010-06-01

    Full Text Available Decentralised workflow management systems are a new research area, where most work to date has focused on the system's overall architecture. As little attention has been given to the security aspects in such systems, we follow a security driven approach, and consider, from the perspective of available security building blocks, how security can be implemented and what new opportunities are presented when empowering the decentralised environment with modern distributed security protocols. Our research is motivated by a more general question of how to combine the positive enablers that email exchange enjoys with the general benefits of workflow systems, and more specifically with the benefits that can be introduced in a decentralised environment. This aims to equip email users with a set of tools to manage the semantics of a message exchange, contents, participants and their roles in the exchange in an environment that provides inherent assurances of security and privacy. This work is based on a survey of contemporary distributed security protocols, and considers how these protocols could be used in implementing a distributed workflow management system with decentralised control. We review a set of these protocols, focusing on the required message sequences, and discuss how these security protocols provide the foundations for implementing core control-flow, data, and resource patterns in a distributed workflow environment.
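
    As one example of the security building blocks surveyed, the sketch below signs and verifies a workflow hand-off message with an HMAC, so a participant can check the integrity and authenticity of a received task; key management is deliberately simplistic and all names are illustrative:

        # Sign and verify a workflow hand-off message with an HMAC.
        import hashlib
        import hmac
        import json

        SHARED_KEY = b"demo-key"  # in reality negotiated per participant pair

        def sign(task: dict) -> str:
            body = json.dumps(task, sort_keys=True).encode()
            return hmac.new(SHARED_KEY, body, hashlib.sha256).hexdigest()

        def verify(task: dict, signature: str) -> bool:
            return hmac.compare_digest(sign(task), signature)

        task = {"step": "approve_invoice", "actor": "alice@example.org"}
        sig = sign(task)
        print("accepted:", verify(task, sig))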

  7. Workflow Automation: A Collective Case Study

    Science.gov (United States)

    Harlan, Jennifer

    2013-01-01

    Knowledge management has proven to be a sustainable competitive advantage for many organizations. Knowledge management systems are abundant, with multiple functionalities. The literature reinforces the use of workflow automation with knowledge management systems to benefit organizations; however, it was not known if process automation yielded…

  8. Building Digital Audio Preservation Infrastructure and Workflows

    Science.gov (United States)

    Young, Anjanette; Olivieri, Blynne; Eckler, Karl; Gerontakos, Theodore

    2010-01-01

    In 2009 the University of Washington (UW) Libraries special collections received funding for the digital preservation of its audio indigenous language holdings. The university libraries, where the authors work in various capacities, had begun digitizing image and text collections in 1997. Because of this, at the onset of the project, workflows (a…

  9. Text mining for the biocuration workflow.

    Science.gov (United States)

    Hirschman, Lynette; Burns, Gully A P C; Krallinger, Martin; Arighi, Cecilia; Cohen, K Bretonnel; Valencia, Alfonso; Wu, Cathy H; Chatr-Aryamontri, Andrew; Dowell, Karen G; Huala, Eva; Lourenço, Anália; Nash, Robert; Veuthey, Anne-Lise; Wiegers, Thomas; Winter, Andrew G

    2012-01-01

    Molecular biology has become heavily dependent on biological knowledge encoded in expert curated biological databases. As the volume of biological literature increases, biocurators need help in keeping up with the literature; (semi-)automated aids for biocuration would seem to be an ideal application for natural language processing and text mining. However, to date, there have been few documented successes for improving biocuration throughput using text mining. Our initial investigations took place for the workshop on 'Text Mining for the BioCuration Workflow' at the third International Biocuration Conference (Berlin, 2009). We interviewed biocurators to obtain workflows from eight biological databases. This initial study revealed high-level commonalities, including (i) selection of documents for curation; (ii) indexing of documents with biologically relevant entities (e.g. genes); and (iii) detailed curation of specific relations (e.g. interactions); however, the detailed workflows also showed many variabilities. Following the workshop, we conducted a survey of biocurators. The survey identified biocurator priorities, including the handling of full text indexed with biological entities and support for the identification and prioritization of documents for curation. It also indicated that two-thirds of the biocuration teams had experimented with text mining and almost half were using text mining at that time. Analysis of our interviews and survey provides a set of requirements for the integration of text mining into the biocuration workflow. These can guide the identification of common needs across curated databases and encourage joint experimentation involving biocurators, text mining developers and the larger biomedical research community.

  10. Rapid Energy Modeling Workflow Demonstration Project

    Science.gov (United States)

    2014-01-01

    ... utilizes information on operations, geometry, orientation, weather, and materials, generating Three-Dimensional (3D) Building Information Models (BIM) ... executed a demonstration of Rapid Energy Modeling (REM) workflows that employed building information modeling (BIM) approaches and ...

  11. Workflow Management for Complex HEP Analyses

    Science.gov (United States)

    Erdmann, M.; Fischer, R.; Rieger, M.; von Cube, R. F.

    2017-10-01

    We present the novel Analysis Workflow Management (AWM) that provides users with the tools and competences of professional large scale workflow systems, e.g. Apache’s Airavata[1]. The approach presents a paradigm shift from executing parts of the analysis to defining the analysis. Within AWM an analysis consists of steps. For example, a step may define running a certain executable for multiple files of an input data collection. Each call to the executable for one of those input files can be submitted to the desired run location, which could be the local computer or a remote batch system. An integrated software manager enables automated user installation of dependencies in the working directory at the run location. Each execution of a step item creates one report for bookkeeping purposes containing error codes and output data or file references. Required files, e.g. created by previous steps, are retrieved automatically. Since data storage and run locations are exchangeable from the steps' perspective, computing resources can be used opportunistically. A visualization of the workflow as a graph of the steps in the web browser provides a high-level view of the analysis. The workflow system is developed and tested alongside a ttbb cross section measurement where, for instance, the event selection is represented by one step and a Bayesian statistical inference is performed by another. The clear interface and dependencies between steps enable a make-like execution of the whole analysis.
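
    A minimal sketch of the make-like behaviour described above, assuming hypothetical step names: steps run in dependency order, and a step whose report already exists is skipped (graphlib requires Python 3.9+):

        # Steps map to their dependencies; a step with an existing report is
        # skipped, giving make-like incremental execution.
        from graphlib import TopologicalSorter

        steps = {
            "select_events": [],
            "fit_inference": ["select_events"],
            "make_plots": ["fit_inference"],
        }
        reports = {"select_events": "ok"}  # bookkeeping from an earlier run

        def run(step: str) -> str:
            print(f"running {step}")
            return "ok"

        for step in TopologicalSorter(steps).static_order():
            if step in reports:
                print(f"skipping {step} (report exists)")
            else:
                reports[step] = run(step)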

  12. Rehab Games as Components of Workflow: A Case Study.

    Science.gov (United States)

    Ekbia, Hamid R; Lee, Joomi; Wiley, Steve

    2014-08-01

    This study explored and evaluated the integration of rehabilitation games into therapy workflow. A multistage and multimethod study was followed: (1) A formative study involving observations and interviews with a total of approximately 90 therapists across the rehabilitation continuum was conducted for the design and development of an interactive therapy platform using Microsoft® (Redmond, WA) Kinect® for Windows. (2) A pilot study was carried out in an inpatient facility, involving patients and therapists. (3) Heuristic evaluation was performed on the basis of the principles of universal design. The findings from the study show the overall appeal of the system to patients and therapists, but also the challenges of integrating the system into the workflow. The findings from the study are in line with other similar studies in terms of the appeal and potential impact of games for rehabilitation and the applicability of the principles of universal design, as well as the need for institutional and public policy changes in the field of medical rehabilitation.

  13. MBAT: A scalable informatics system for unifying digital atlasing workflows

    Directory of Open Access Journals (Sweden)

    Sane Nikhil

    2010-12-01

    Full Text Available Abstract Background Digital atlases provide a common semantic and spatial coordinate system that can be leveraged to compare, contrast, and correlate data from disparate sources. As the quality and amount of biological data continues to advance and grow, searching, referencing, and comparing this data with a researcher's own data is essential. However, the integration process is cumbersome and time-consuming due to misaligned data, implicitly defined associations, and incompatible data sources. This work addresses these challenges by providing a unified and adaptable environment to accelerate the workflow to gather, align, and analyze the data. Results The MouseBIRN Atlasing Toolkit (MBAT) project was developed as a cross-platform, free open-source application that unifies and accelerates the digital atlas workflow. A tiered, plug-in architecture was designed for the neuroinformatics and genomics goals of the project to provide a modular and extensible design. MBAT provides the ability to use a single query to search and retrieve data from multiple data sources, align image data using the user's preferred registration method, composite data from multiple sources in a common space, and link relevant informatics information to the current view of the data or atlas. The workspaces leverage tool plug-ins to extend and allow future extensions of the basic workspace functionality. A wide variety of tool plug-ins were developed that integrate pre-existing as well as newly created technology into each workspace. Novel atlasing features were also developed, such as supporting multiple label sets, dynamic selection and grouping of labels, and synchronized, context-driven display of ontological data. Conclusions MBAT empowers researchers to discover correlations among disparate data by providing a unified environment for bringing together distributed reference resources, a user's image data, and biological atlases into the same spatial or semantic context...

  14. Workflow ART: a framework for multidimensional workflow analysis

    Science.gov (United States)

    Monakova, Ganna; Leymann, Frank

    2013-02-01

    Business processes are obliged to follow numerous constraints, such as compliance regulations, service level agreements, security regulations and budget constraints. To be able to understand relations between different constraint types and their impact on the business, a common constraint specification framework is required. This work presents a framework that provides visual support for the analysis of the different constraints a business process has to adhere to by mapping the business process model to the cuboids in 3D space and business process constraints to the spatial restrictions of this space. This enables a business process designer to gain insight into how different constraints influence the process as a whole.

  15. Jflow: a workflow management system for web applications

    National Research Council Canada - National Science Library

    Mariette, Jérôme; Escudié, Frédéric; Bardou, Philippe; Nabihoudine, Ibouniyamine; Noirot, Céline; Trotard, Marie-Stéphane; Gaspin, Christine; Klopp, Christophe

    2016-01-01

    ... With Jflow, we introduce a Workflow Management System (WMS), composed of jQuery plug-ins which can easily be embedded in any web application, and a Python library providing all requested features to set up, run and monitor workflows...

  16. Workflows in bioinformatics: meta-analysis and prototype implementation of a workflow generator

    Directory of Open Access Journals (Sweden)

    Thoraval Samuel

    2005-04-01

    Full Text Available Abstract Background Computational methods for problem solving need to interleave information access and algorithm execution in a problem-specific workflow. The structures of these workflows are defined by a scaffold of syntactic, semantic and algebraic objects capable of representing them. Despite the proliferation of GUIs (Graphic User Interfaces) in bioinformatics, only some of them provide workflow capabilities; surprisingly, no meta-analysis of workflow operators and components in bioinformatics has been reported. Results We present a set of syntactic components and algebraic operators capable of representing analytical workflows in bioinformatics. Iteration, recursion, the use of conditional statements, and management of suspend/resume tasks have traditionally been implemented on an ad hoc basis and hard-coded; by having these operators properly defined it is possible to use and parameterize them as generic re-usable components. To illustrate how these operations can be orchestrated, we present GPIPE, a prototype graphic pipeline generator for PISE that allows the definition of a pipeline, parameterization of its component methods, and storage of metadata in XML formats. This implementation goes beyond the macro capacities currently in PISE. As the entire analysis protocol is defined in XML, a complete bioinformatic experiment (linked sets of methods, parameters and results) can be reproduced or shared among users. Availability: http://if-web1.imb.uq.edu.au/Pise/5.a/gpipe.html (interactive), ftp://ftp.pasteur.fr/pub/GenSoft/unix/misc/Pise/ (download). Conclusion From our meta-analysis we have identified syntactic structures and algebraic operators common to many workflows in bioinformatics. The workflow components and algebraic operators can be assimilated into re-usable software components. GPIPE, a prototype implementation of this framework, provides a GUI builder to facilitate the generation of workflows and integration of heterogeneous...

  17. Workflows in bioinformatics: meta-analysis and prototype implementation of a workflow generator.

    Science.gov (United States)

    Garcia Castro, Alexander; Thoraval, Samuel; Garcia, Leyla J; Ragan, Mark A

    2005-04-07

    Computational methods for problem solving need to interleave information access and algorithm execution in a problem-specific workflow. The structures of these workflows are defined by a scaffold of syntactic, semantic and algebraic objects capable of representing them. Despite the proliferation of GUIs (Graphic User Interfaces) in bioinformatics, only some of them provide workflow capabilities; surprisingly, no meta-analysis of workflow operators and components in bioinformatics has been reported. We present a set of syntactic components and algebraic operators capable of representing analytical workflows in bioinformatics. Iteration, recursion, the use of conditional statements, and management of suspend/resume tasks have traditionally been implemented on an ad hoc basis and hard-coded; by having these operators properly defined it is possible to use and parameterize them as generic re-usable components. To illustrate how these operations can be orchestrated, we present GPIPE, a prototype graphic pipeline generator for PISE that allows the definition of a pipeline, parameterization of its component methods, and storage of metadata in XML formats. This implementation goes beyond the macro capacities currently in PISE. As the entire analysis protocol is defined in XML, a complete bioinformatic experiment (linked sets of methods, parameters and results) can be reproduced or shared among users. http://if-web1.imb.uq.edu.au/Pise/5.a/gpipe.html (interactive), ftp://ftp.pasteur.fr/pub/GenSoft/unix/misc/Pise/ (download). From our meta-analysis we have identified syntactic structures and algebraic operators common to many workflows in bioinformatics. The workflow components and algebraic operators can be assimilated into re-usable software components. GPIPE, a prototype implementation of this framework, provides a GUI builder to facilitate the generation of workflows and integration of heterogeneous analytical tools.
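
    To illustrate the idea of an analysis protocol that lives entirely in XML and can be replayed, the sketch below parses a much-simplified, hypothetical pipeline description (our own dialect, not GPIPE's) and dispatches its methods in order:

        # Parse a toy XML pipeline description and execute its methods in order.
        import xml.etree.ElementTree as ET

        PIPELINE = """
        <pipeline name="demo">
          <method name="clustalw"><param name="gapopen">10</param></method>
          <method name="protpars"/>
        </pipeline>
        """

        def run_method(name: str, params: dict) -> None:
            print(f"would invoke {name} with {params}")  # real system calls the tool

        root = ET.fromstring(PIPELINE)
        for m in root.iter("method"):
            params = {p.get("name"): p.text for p in m.iter("param")}
            run_method(m.get("name"), params)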

  18. Investigating reproducibility and tracking provenance - A genomic workflow case study.

    Science.gov (United States)

    Kanwal, Sehrish; Khan, Farah Zaib; Lonie, Andrew; Sinnott, Richard O

    2017-07-12

    Computational bioinformatics workflows are extensively used to analyse genomics data, with different approaches available to support the implementation and execution of these workflows. Reproducibility is one of the core principles for any scientific workflow and remains a challenge that is not fully addressed. This is due to incomplete understanding of reproducibility requirements and the assumptions of workflow definition approaches. Provenance information should be tracked and used to capture all these requirements, supporting reusability of existing workflows. We have implemented a complex but widely deployed bioinformatics workflow using three representative approaches to workflow definition and execution. Through implementation, we identified assumptions implicit in these approaches that ultimately produce insufficient documentation of workflow requirements, resulting in failed execution of the workflow. This study proposes a set of recommendations that aims to mitigate these assumptions and guides the scientific community to accomplish reproducible science, hence addressing the reproducibility crisis. Reproducing, adapting or even repeating a bioinformatics workflow in any environment requires substantial technical knowledge of the workflow execution environment, resolving analysis assumptions and rigorous compliance with reproducibility requirements. Towards these goals, we propose conclusive recommendations that, along with an explicit declaration of workflow specification, would result in enhanced reproducibility of computational genomic analyses.

  19. Intelligent workflow driven processing for electronic mail management

    African Journals Online (AJOL)

    ... developed from different models to enable better workflow activities. The intelligent workflow activities from an email combine both dynamic and static workflow to carry out a response, implemented using the MATLAB fuzzy logic toolbox, with an overall system performance of 83.893%. Keywords: Email, fuzzy logic, ...

  20. Database Support for Workflow Management: The WIDE Project

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Pernici, B; Sánchez, G.; Unknown, [Unknown

    1999-01-01

    Database Support for Workflow Management: The WIDE Project presents the results of the ESPRIT WIDE project on advanced database support for workflow management. The book discusses the state of the art in combining database management and workflow management technology, especially in the areas of...

  1. Using telephony data to facilitate discovery of clinical workflows.

    Science.gov (United States)

    Rucker, Donald W

    2017-04-19

    Discovery of clinical workflows to target for redesign using methods such as Lean and Six Sigma is difficult. VoIP telephone call pattern analysis may complement direct observation and EMR-based tools in understanding clinical workflows at the enterprise level by allowing visualization of institutional telecommunications activity. The objective was to build an analytic framework mapping repetitive and high-volume telephone call patterns in a large medical center to their associated clinical units, using an enterprise unified communications server log file, and to support visualization of specific call patterns using graphical networks. Consecutive call detail records from the medical center's unified communications server were parsed to cross-correlate telephone call patterns and map associated phone numbers to a cost center dictionary. Hashed data structures were built to allow construction of edge and node files representing high-volume call patterns for display with an open source graph network tool. Summary statistics for an analysis of exactly one week's call detail records at a large academic medical center showed that 912,386 calls were placed with a total duration of 23,186 hours. Approximately half of all calling/called number pairs had an average call duration under 60 seconds, and of these the average call duration was 27 seconds. Cross-correlation of phone calls identified by clinical cost center can be used to generate graphical displays of clinical enterprise communications. Many calls are short. The compact data transfers within short calls may serve as automation or re-design targets. The large absolute amount of time medical center employees were engaged in VoIP telecommunications suggests that analysis of telephone call patterns may offer additional insights into core clinical workflows.
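
    The cross-correlation step lends itself to a short sketch. The following is a hypothetical reconstruction of how call detail records might be aggregated into an edge file for a graph tool; the CSV column names and the cost-center dictionary are invented, not taken from the paper.

```python
# Hypothetical sketch: aggregate call detail records into a graph edge file.
# The CSV column names and the cost-center dictionary are invented for
# illustration; they are not taken from the paper.
import csv
from collections import defaultdict

cost_center = {"5551": "ED", "5552": "Pharmacy", "5553": "Radiology"}

edges = defaultdict(lambda: {"calls": 0, "seconds": 0})
with open("call_detail_records.csv", newline="") as f:
    for row in csv.DictReader(f):  # expects caller, callee, duration_s columns
        src = cost_center.get(row["caller"][:4], "Unknown")
        dst = cost_center.get(row["callee"][:4], "Unknown")
        pair = edges[(src, dst)]
        pair["calls"] += 1
        pair["seconds"] += int(row["duration_s"])

# Emit an edge list suitable for an open source graph network tool.
with open("edges.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Source", "Target", "Weight", "AvgDurationS"])
    for (src, dst), agg in edges.items():
        writer.writerow([src, dst, agg["calls"], agg["seconds"] / agg["calls"]])
```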

  2. Bioprocess development workflow: Transferable physiological knowledge instead of technological correlations.

    Science.gov (United States)

    Reichelt, Wieland N; Haas, Florian; Sagmeister, Patrick; Herwig, Christoph

    2017-01-01

    Microbial bioprocesses need to be designed to be transferable from lab scale to production scale as well as between setups. Although substantial effort is invested to control technological parameters, usually the only true constant parameter is the actual producer of the product: the cell. Hence, instead of solely controlling technological process parameters, the focus should increasingly be placed on physiological parameters. This contribution aims at illustrating a workflow of data life cycle management with special focus on physiology. Information processing condenses the data into physiological variables, while information mining condenses the variables further into physiological descriptors. This basis facilitates data analysis for a physiological explanation of observed phenomena in productivity. Targeting transferability, we demonstrate this workflow using an industrially relevant Escherichia coli process for recombinant protein production and substantiate the following three points: (1) The postinduction phase is independent, in terms of productivity and physiology, from the preinduction variables specific growth rate and biomass at induction. (2) The specific substrate uptake rate during the induction phase was found to significantly impact the maximum specific product titer. (3) The time point of maximum specific titer can be predicted by an easily accessible physiological variable: while the maximum specific titers were reached at different time points (19.8 ± 7.6 h), those maxima were all reached within a very narrow window of cumulatively consumed substrate dSn (3.1 ± 0.3 g/g). Concluding, this contribution provides a workflow on how to gain a physiological view on the process and illustrates potential benefits. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 33:261-270, 2017. © 2016 American Institute of Chemical Engineers.
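
    Point (3) amounts to integrating the specific substrate uptake rate over time and locating where the cumulative value crosses the reported ~3.1 g/g window. A toy illustration with made-up numbers, not the study's data:

```python
# Illustrative only: find the time at which cumulatively consumed substrate per
# biomass (dSn) crosses the reported ~3.1 g/g window. All values are made up.
import numpy as np

t = np.array([0, 4, 8, 12, 16, 20, 24], dtype=float)       # h post-induction
qs = np.array([0.30, 0.28, 0.25, 0.22, 0.20, 0.18, 0.16])  # g/g/h specific uptake

# Trapezoidal integration of the specific uptake rate gives dSn over time.
dSn = np.concatenate([[0.0], np.cumsum(0.5 * (qs[1:] + qs[:-1]) * np.diff(t))])
t_max_titer = np.interp(3.1, dSn, t)  # valid because dSn increases monotonically
print(f"predicted time of maximum specific titer: {t_max_titer:.1f} h")  # ~11.7 h
```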

  3. Barriers to effective, safe communication and workflow between nurses and non-consultant hospital doctors during out-of-hours.

    Science.gov (United States)

    Brady, Anne-Marie; Byrne, Gobnait; Quirke, Mary Brigid; Lynch, Aine; Ennis, Shauna; Bhangu, Jaspreet; Prendergast, Meabh

    2017-11-01

    This study aimed to evaluate the nature and type of communication and workflow arrangements between nurses and doctors out-of-hours (OOH). Effective communication and workflow arrangements between nurses and doctors are essential to minimize risk in hospital settings, particularly in the out-of-hours period. Timely patient flow is a priority for all healthcare organizations, and the quality of communication and workflow arrangements influences patient safety. A qualitative descriptive design was used; data collection methods included focus groups and individual interviews. The setting was a 500-bed tertiary referral acute hospital in Ireland; participants were junior and senior Non-Consultant Hospital Doctors, staff nurses and nurse managers. Both nurses and doctors acknowledged the importance of good interdisciplinary communication and collaborative working in sustaining effective workflow and enabling a supportive working environment and patient safety. Indeed, issues of safety and missed care OOH were found to be primarily due to difficulties of communication and workflow. Medical workflow OOH is often dependent on cues and communication to/from nursing. However, communication systems and, in particular, the bleep system, considered central to the process of communication between doctors and nurses OOH, can contribute to workflow challenges and increased staff stress. It was reported as commonplace for routine work that should be completed during normal hours to fall into OOH when resources were most limited, further compounding risk to patient safety. Enhancement of communication strategies between nurses and doctors has the potential to remove barriers to effective decision-making and patient flow.

  4. Flexible workflow sharing and execution services for e-scientists

    Science.gov (United States)

    Kacsuk, Péter; Terstyanszky, Gábor; Kiss, Tamas; Sipos, Gergely

    2013-04-01

    The sequence of computational and data manipulation steps required to perform a specific scientific analysis is called a workflow. Workflows that orchestrate data and/or compute intensive applications on Distributed Computing Infrastructures (DCIs) recently became standard tools in e-science. At the same time, the broad and fragmented landscape of workflows and DCIs slows down the uptake of workflow-based work. The development, sharing, integration and execution of workflows are still a challenge for many scientists. The FP7 "Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs" (SHIWA) project significantly improved the situation with a simulation platform that connects different workflow systems, different workflow languages, different DCIs and workflows into a single, interoperable unit. The SHIWA Simulation Platform is a service package, already used by various scientific communities, and used as a tool by the recently started ER-flow FP7 project to expand the use of workflows among European scientists. The presentation will introduce the SHIWA Simulation Platform and the services that ER-flow provides, based on the platform, to space and earth science researchers. The SHIWA Simulation Platform includes: 1. SHIWA Repository: a database where workflows and metadata about workflows can be stored. The database is a central repository to discover and share workflows within and among communities. 2. SHIWA Portal: a web portal that is integrated with the SHIWA Repository and includes a workflow executor engine that can orchestrate various types of workflows on various grid and cloud platforms. 3. SHIWA Desktop: a desktop environment that provides access capabilities similar to those of the SHIWA Portal, but runs on the user's desktop or laptop instead of a portal server. 4. Workflow engines: the ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflow engines are already

  5. Workflow Fault Tree Generation Through Model Checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2014-01-01

    We present a framework for the automated generation of fault trees from models of real-world process workflows, expressed in a formalised subset of the popular Business Process Modelling and Notation (BPMN) language. To capture uncertainty and unreliability in workflows, we extend this formalism with probabilistic non-deterministic branching. We present an algorithm that allows for exhaustive generation of possible error states that could arise in execution of the model, where the generated error states allow for both fail-stop behaviour and continued system execution. We employ stochastic model checking ... of the system being modelled. From these calculations, a comprehensive fault tree is generated. Further, we show that annotating the model with rewards (data) allows the expected mean values of reward structures to be calculated at points of failure.
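
    To make the idea concrete, a toy enumeration of failure states over a linear workflow with per-task failure probabilities might look as follows. This illustrates exhaustive error-state generation in general, not the paper's algorithm or its BPMN formalism.

```python
# Toy illustration of exhaustive error-state generation (not the paper's
# algorithm): enumerate every point at which a linear workflow can fail and
# compute the probability of reaching that failure state.
workflow = [("register", 0.01), ("triage", 0.05), ("treat", 0.02)]  # (task, p_fail)

def enumerate_failures(tasks, prefix_p=1.0, done=()):
    if not tasks:
        return []
    task, p_fail = tasks[0]
    here = [(done + (task,), prefix_p * p_fail)]  # branch: this task fails
    rest = enumerate_failures(tasks[1:], prefix_p * (1 - p_fail), done + (task,))
    return here + rest                            # branch: it succeeds; continue

for path, p in enumerate_failures(workflow):
    print(f"failure at {path[-1]!r} after {list(path[:-1])}: p = {p:.4f}")
```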

  6. A Formal Model For Declarative Workflows

    DEFF Research Database (Denmark)

    Mukkamala, Raghava Rao

    in highly complex and dynamic environments like the healthcare and case management sectors, where the processes exhibit a lot of uncertainty and unexpected behavior and thereby require a high degree of flexibility. Several research groups have suggested declarative models as a good approach to handle such ad... Using DCR Graphs, and to make the formal model available to a wider audience, we have developed prototype tools for specification and a workflow engine for the execution of DCR Graphs. We have also developed tools interfacing the SPIN model checker to formally verify safety and liveness properties on DCR Graphs. Case studies from the healthcare and case management domains have been modeled in DCR Graphs to show that our formal model is suitable for modeling workflows from those dynamic sectors. This PhD project is funded by the Danish Strategic Research Council through the Trustworthy Pervasive...

  7. How to Take HRMS Process Management to the Next Level with Workflow Business Event System

    Science.gov (United States)

    Rajeshuni, Sarala; Yagubian, Aram; Kunamaneni, Krishna

    2006-01-01

    Oracle Workflow with the Business Event System offers a complete process management solution for enterprises to manage business processes cost-effectively. Using Workflow event messaging, event subscriptions, AQ Servlet and advanced queuing technologies, this presentation will demonstrate the step-by-step design and implementation of system solutions in order to integrate two dissimilar systems and establish communication remotely. As a case study, the presentation walks you through the process of propagating organization name changes in other applications that originated from the HRMS module without changing applications code. The solution can be applied to your particular business cases for streamlining or modifying business processes across Oracle and non-Oracle applications.

  8. Evaluation of Workflow Management Systems - A Meta Model Approach

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    1998-11-01

    The automated enactment of processes through the use of workflow management systems enables the outsourcing of the control flow from application systems. By now, a large number of systems that follow different workflow paradigms are available. This leads to the problem of selecting the appropriate workflow management system for a given situation. In this paper we outline the benefits of a meta model approach for the evaluation and comparison of different workflow management systems. After a general introduction on the topic of meta modeling, the meta models of the workflow management systems WorkParty (Siemens Nixdorf) and FlowMark (IBM) are compared as an example. These product-specific meta models can be generalized to meta reference models, which helps to specify a workflow methodology. As an example, an organisational reference meta model is presented, which helps users in specifying their requirements for a workflow management system.

  9. On the parameterized complexity of the workflow satisfiability problem

    DEFF Research Database (Denmark)

    Crampton, Jason; Gutin, Gregory; Yeo, Anders

    2012-01-01

    A workflow specification defines a set of steps and the order in which those steps must be executed. Security requirements may impose constraints on which groups of users are permitted to perform subsets of those steps. A workflow specification is said to be satisfiable if there exists an assignment of users to workflow steps that satisfies all the constraints. An algorithm for determining whether such an assignment exists is important, both as a static analysis tool for workflow specifications, and for the construction of run-time reference monitors for workflow management systems. Finding such an assignment is a hard problem in general, but work by Wang and Li in 2010 using the theory of parameterized complexity suggests that efficient algorithms exist under reasonable assumptions about workflow specifications. In this paper, we improve the complexity bounds for the workflow satisfiability problem...
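
    The problem statement itself is easy to sketch. A brute-force check over a toy workflow with separation-of-duty constraints is shown below; the paper's contribution is precisely the efficient algorithms that avoid this exponential search.

```python
# Brute-force statement of workflow satisfiability over a toy instance: steps,
# per-step authorized users, and separation-of-duty (SoD) constraints.
from itertools import product

steps = ["prepare", "approve", "sign"]
authorized = {
    "prepare": {"alice", "bob"},
    "approve": {"bob", "carol"},
    "sign": {"alice", "carol"},
}
different_users = [("prepare", "approve"), ("approve", "sign")]  # SoD pairs

def satisfying_assignment():
    for combo in product(*(sorted(authorized[s]) for s in steps)):
        plan = dict(zip(steps, combo))
        if all(plan[a] != plan[b] for a, b in different_users):
            return plan
    return None  # unsatisfiable

print(satisfying_assignment())  # a valid user-to-step assignment, or None
```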

  10. Computing Workflows for Biologists: A Roadmap.

    Directory of Open Access Journals (Sweden)

    Ashley Shade

    Extremely large datasets have become routine in biology. However, performing a computational analysis of a large dataset can be overwhelming, especially for novices. Here, we present a step-by-step guide to computing workflows with the biologist end-user in mind. Starting from a foundation of sound data management practices, we make specific recommendations on how to approach and perform computational analyses of large datasets, with a view to enabling sound, reproducible biological research.

  11. IDD Archival Hardware Architecture and Workflow

    Energy Technology Data Exchange (ETDEWEB)

    Mendonsa, D; Nekoogar, F; Martz, H

    2008-10-09

    This document describes the functionality of every component in the DHS/IDD archival and storage hardware system shown in Fig. 1. The document describes the step-by-step process of image data being received at LLNL, then being processed and made available to authorized personnel and collaborators. Throughout this document, references will be made to one of two figures: Fig. 1, describing the elements of the architecture, and Fig. 2, describing the workflow and how the project utilizes the available hardware.

  12. The Prosthetic Workflow in the Digital Era

    OpenAIRE

    Lidia Tordiglione; Michele De Franco; Giovanni Bosetti

    2016-01-01

    The purpose of this retrospective study was to clinically evaluate the benefits of adopting a full digital workflow for the implementation of fixed prosthetic restorations on natural teeth. To evaluate the effectiveness of these protocols, treatment plans were drawn up for 15 patients requiring rehabilitation of one or more natural teeth. All the dental impressions were taken using a Planmeca PlanScan® (Planmeca OY, Helsinki, Finland) intraoral scanner, which provided digital casts on which t...

  13. Implementation of a blockchain workflow management prototype

    OpenAIRE

    Fridgen, Gilbert; Sablowsky, Bernd; Urbach, Nils

    2017-01-01

    Blockchain technology offers huge potential to various industries and application areas. In a joint applied research project, the Fraunhofer Institute for Applied Information Technology (FIT), together with Norddeutsche Landesbank (NORD/LB), identified (inter-company) workflow management as a promising application area and developed a Blockchain prototype for a documentary letter of credit in the international shipping business. In addition to the project's explicit outcome, a Blockchain prototype, an...

  14. Ad-hoc Workflows for Higher Education

    OpenAIRE

    Martinho, David; Coelho, Samuel; Guerra e Silva, Luis; Carvalho, João Carlos; Severo, Rita; Cruz, Luis; Ventura, Artur; Santos, Pedro; Barata, Ricardo

    2015-01-01

    It is commonplace in higher education institutions for a large number of informal business processes to be handled through e-mail. However, e-mail fails to properly support traceability, and it fosters artifact duplication. This paper introduces GEARS: a solution to the e-mail overuse phenomenon. GEARS is a new approach to ad-hoc workflow systems that focuses on keeping the simplicity of the e-mail user experience while implementing a participation-driven process execution.

  15. Reduction of Hospital Physicians' Workflow Interruptions: A Controlled Unit-Based Intervention Study

    Directory of Open Access Journals (Sweden)

    Matthias Weigl

    2012-01-01

    Highly interruptive clinical environments may cause work stress and suboptimal clinical care. This study features an intervention to reduce workflow interruptions by re-designing work and organizational practices among hospital physicians providing ward coverage. A prospective, controlled intervention was conducted in two surgical and two internal wards. The intervention was based on physician quality circles, a participative technique to involve employees in the development of solutions to overcome work-related stressors. The outcome measure was the frequency of observed workflow interruptions. Workflow interruptions by fellow physicians and nursing staff were significantly lower after the intervention. However, a similar decrease was also observed in control units. Additional interviews to explore process-related factors suggested that there might have been spill-over effects in the sense that solutions were not strictly confined to the intervention group. Recommendations for further research on the effectiveness and consequences of such interventions for professional communication and patient safety are discussed.

  16. Parallel workflow tools to facilitate human brain MRI post-processing

    Directory of Open Access Journals (Sweden)

    Zaixu Cui

    2015-05-01

    Multi-modal magnetic resonance imaging (MRI) techniques are widely applied in human brain studies. To obtain specific brain measures of interest from MRI datasets, a number of complex image post-processing steps are typically required. Parallel workflow tools have recently been developed, concatenating individual processing steps and enabling fully automated processing of raw MRI data to obtain the final results. These workflow tools are also designed to make optimal use of available computational resources and to support the parallel processing of different subjects or of independent processing steps for a single subject. Automated, parallel MRI post-processing tools can greatly facilitate relevant brain investigations and are being increasingly applied. In this review, we briefly summarize these parallel workflow tools and discuss relevant issues.

  17. MOARF, an Integrated Workflow for Multiobjective Optimization: Implementation, Synthesis, and Biological Evaluation.

    Science.gov (United States)

    Firth, Nicholas C; Atrash, Butrus; Brown, Nathan; Blagg, Julian

    2015-06-22

    We describe the development and application of an integrated, multiobjective optimization workflow (MOARF) for directed medicinal chemistry design. This workflow couples a rule-based molecular fragmentation scheme (SynDiR) with a pharmacophore fingerprint-based fragment replacement algorithm (RATS) to broaden the scope of reconnection options considered in the generation of potential solution structures. Solutions are ranked by a multiobjective scoring algorithm comprising ligand-based (shape similarity) biochemical activity predictions as well as physicochemical property calculations. Application of this iterative workflow to optimization of the CDK2 inhibitor Seliciclib (CYC202, R-roscovitine) generated solution molecules in desired physicochemical property space. Synthesis and experimental evaluation of optimal solution molecules demonstrates CDK2 biochemical activity and improved human metabolic stability.
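
    The ranking step rests on standard multiobjective ideas. A generic Pareto-dominance filter over candidate scores, shown here with invented values rather than MOARF's actual scoring algorithm, illustrates the principle:

```python
# Generic Pareto-dominance filter over candidate scores (higher is better on
# every objective). Values are invented; this shows the general principle of
# multiobjective ranking, not MOARF's actual scoring.
candidates = {
    "mol_A": (0.9, 0.4),  # (predicted activity, metabolic stability)
    "mol_B": (0.7, 0.8),
    "mol_C": (0.6, 0.5),
}

def dominates(a, b):
    """a dominates b if it is at least as good everywhere and better somewhere."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

pareto_front = [
    name for name, score in candidates.items()
    if not any(dominates(other, score)
               for other_name, other in candidates.items() if other_name != name)
]
print(pareto_front)  # ['mol_A', 'mol_B']; mol_C is dominated by mol_B
```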

  18. Parametric Workflow (BIM) for the Repair Construction of Traditional Historic Architecture in Taiwan

    Science.gov (United States)

    Ma, Y.-P.; Hsu, C. C.; Lin, M.-C.; Tsai, Z.-W.; Chen, J.-Y.

    2015-08-01

    In Taiwan, numerous existing traditional buildings are constructed with wooden structures, brick structures, and stone structures. This paper focuses on traditional historic architecture in Taiwan, targeting traditional wooden-structure buildings as the design proposition, and develops a BIM workflow for modeling complex wooden combination geometry, integrating it with traditional 2D documents, and visualizing repair construction assumptions within the 3D model representation. The goal of this article is to explore the current problems to overcome in wooden historic building conservation, and to introduce BIM technology for conserving, documenting, managing, and creating full engineering drawings and information that effectively support historic conservation. Although BIM is mostly oriented to current construction praxis, there have been some attempts to investigate its applicability in historic conservation projects. This article also illustrates the importance and advantages of using a BIM workflow in the repair construction process, compared with a generic workflow.

  19. Kwf-Grid workflow management system for Earth science applications

    Science.gov (United States)

    Tran, V.; Hluchy, L.

    2009-04-01

    In this paper, we present a workflow management tool for Earth science applications in EGEE. The workflow management tool was originally developed within the K-wf Grid project for GT4 middleware and has many advanced features such as semi-automatic workflow composition, a user-friendly GUI for managing workflows, and knowledge management. In EGEE, we are porting the workflow management tool to gLite middleware for Earth science applications. The K-wf Grid workflow management system was developed within "Knowledge-based Workflow System for Grid Applications" under the 6th Framework Programme. The workflow management system is intended to: semi-automatically compose a workflow of Grid services; execute the composed workflow application in a Grid computing environment; monitor the performance of the Grid infrastructure and the Grid applications; analyze the resulting monitoring information; capture the knowledge that is contained in the information by means of intelligent agents; and finally reuse the joined knowledge gathered from all participating users in a collaborative way in order to efficiently construct workflows for new Grid applications. K-wf Grid workflow engines can support different types of jobs (e.g. GRAM jobs, web services) in a workflow. A new class of gLite job has been added to the system, allowing it to manage and execute gLite jobs in the EGEE infrastructure. The GUI has been adapted to the requirements of EGEE users, and a new credential management servlet has been added to the portal. Porting the K-wf Grid workflow management system to gLite allows EGEE users to use the system and benefit from its advanced features. The system is primarily tested and evaluated with applications from ES clusters.

  20. From Requirements via Colored Workflow Nets to an Implementation in Several Workflow Systems

    DEFF Research Database (Denmark)

    Mans, Ronnie S:; van der Aalst, Wil M.P.; Bakker, Piet J.M.

    2007-01-01

    Care organizations, such as hospitals, need to support complex and dynamic workflows. Moreover, many disciplines are involved. This makes it important to avoid the typical disconnect between requirements and the actual implementation of the system. This paper proposes an approach where an Executable Use Case (EUC) and Colored Workflow Net (CWN) are used to close the gap between the given requirements specification and the realization of these requirements with the help of a workflow system. This paper describes a large case study where the diagnostic trajectory of the gynaecological oncology care process of the Academic Medical Center (AMC) hospital is used as reference process. The process consists of hundreds of activities. These have been modeled and analyzed using an EUC and a CWN. Moreover, based on the CWN, the process has been implemented using four different workflow systems...

  1. From Requirements via Colored Workflow Nets to an Implementation in Several Workflow Systems

    DEFF Research Database (Denmark)

    Mans, Ronny S.; van der Aalst, Willibrordus Martinus Pancratius; Molemann, A.J.

    2007-01-01

    Care organizations, such as hospitals, need to support complex and dynamic workflows. Moreover, many disciplines are involved. This makes it important to avoid the typical disconnect between requirements and the actual implementation of the system. This paper proposes an approach where an Executable Use Case (EUC) and Colored Workflow Net (CWN) are used to close the gap between the given requirements specification and the realization of these requirements with the help of a workflow system. This paper describes a large case study where the diagnostic trajectory...

  2. An observation tool for studying patient-oriented workflow in hospital emergency departments.

    Science.gov (United States)

    Ozkaynak, M; Brennan, P

    2013-01-01

    Studying workflow is a critical step in designing, implementing and evaluating informatics interventions in complex sociotechnical settings, such as hospital emergency departments (EDs). Known approaches to studying workflow in clinical settings attend to the activities of individual clinicians, thus being inadequate to characterize patient care as cooperative work. The purpose of this paper is twofold. First, we introduce a novel, theory-driven patient-oriented workflow methodology, which better addresses the complex, multiple-provider nature of patient care. Second, we report the development of an observational tool and protocol for use in studies of this type, and the results of an evaluation study. We created a tablet computer implementation of an instrument to efficiently capture patient-oriented workflow, and evaluated it through a field study in three EDs. We focused on activities occurring over time during a single patient care episode as well as the roles of the ED staff members who conducted the activities. The evidence generated supports the validity, viability, and reliability of the tool. The coverage of the tool in terms of activities and roles was satisfactory. The tool was able to capture the sequence of activity-role pairs for 108 patient care episodes. The inter-rater reliability assessment yielded a high kappa value (0.79). The patient-oriented workflow methodology has the potential to facilitate modeling patient care in EDs by characterizing both roles and activities in sequence. The methodology also provides researchers and practitioners a more realistic and comprehensive workflow perspective that can inform the design, implementation and evaluation of health information technology interventions.

  3. Autonomic Management of Application Workflows on Hybrid Computing Infrastructure

    Directory of Open Access Journals (Sweden)

    Hyunjoo Kim

    2011-01-01

    In this paper, we present a programming and runtime framework that enables the autonomic management of complex application workflows on hybrid computing infrastructures. The framework is designed to address system and application heterogeneity and dynamics to ensure that application objectives and constraints are satisfied. The need for such autonomic system and application management is becoming critical as computing infrastructures become increasingly heterogeneous, integrating different classes of resources from high-end HPC systems to commodity clusters and clouds. For example, the framework presented in this paper can be used to provision the appropriate mix of resources based on application requirements and constraints. The framework also monitors the system/application state and adapts the application and/or resources to respond to changing requirements or environment. To demonstrate the operation of the framework and to evaluate its ability, we employ a workflow used to characterize an oil reservoir executing on a hybrid infrastructure composed of TeraGrid nodes and Amazon EC2 instances of various types. Specifically, we show how different application objectives such as acceleration, conservation and resilience can be effectively achieved while satisfying deadline and budget constraints, using an appropriate mix of dynamically provisioned resources. Our evaluations also demonstrate that public clouds can be used to complement and reinforce the scheduling and usage of traditional high performance computing infrastructure.
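
    As a toy version of the provisioning decision, one might pick the cheapest cloud top-up that finishes the remaining tasks within deadline and budget. The platform figures below are invented; this is not the paper's framework.

```python
# Toy provisioning decision (invented figures, not the paper's framework):
# use the free HPC allocation first, then buy the cheapest cloud top-up that
# still meets the deadline and stays within budget.
platforms = {  # name: (tasks per node-hour, $ per node-hour)
    "hpc_node": (40, 0.00),   # allocation-funded, limited node count
    "ec2_small": (10, 0.10),
    "ec2_large": (25, 0.34),
}
tasks, deadline_h, budget, hpc_nodes = 2000, 4, 30.0, 8

def cheapest_cloud_top_up():
    done_on_hpc = platforms["hpc_node"][0] * hpc_nodes * deadline_h
    remaining = max(0, tasks - done_on_hpc)
    best = None
    for name, (rate, price) in platforms.items():
        if name == "hpc_node":
            continue
        nodes = -(-remaining // int(rate * deadline_h))  # ceiling division
        cost = nodes * price * deadline_h
        if cost <= budget and (best is None or cost < best[2]):
            best = (name, nodes, cost)
    return best

print(cheapest_cloud_top_up())  # e.g. ('ec2_small', 18, ~7.2) with these numbers
```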

  4. Integration of implant planning workflows into the PACS infrastructure

    Science.gov (United States)

    Gessat, Michael; Strauß, Gero; Burgert, Oliver

    2008-03-01

    The integration of imaging devices, diagnostic workstations, and image servers into Picture Archiving and Communication Systems (PACS) has had an enormous effect on the efficiency of radiology workflows. The standardization of the information exchange between the devices with the DICOM standard has been an essential precondition for that development. For surgical procedures, no such infrastructure exists. With the increasingly important role computerized planning and assistance systems play in the surgical domain, an infrastructure that unifies the communication between devices becomes necessary. In recent publications, the need for a modularized system design has been established. A reference architecture for a Therapy Imaging and Model Management System (TIMMS) has been proposed. It was accepted by the DICOM Working Group 6 as the reference architecture for DICOM developments for surgery. In this paper we propose the inclusion of implant planning systems into the PACS infrastructure. We propose a generic information model for the patient specific selection and positioning of implants from a repository according to patient image data. The information models are based on clinical workflows from ENT, cardiac, and orthopedic surgery as well as technical requirements derived from different use cases and systems. We show an exemplary implementation of the model for application in ENT surgery: the selection and positioning of an ossicular implant in the middle ear. An implant repository is stored in the PACS. It makes use of an experimental implementation of the Surface Mesh Module that is currently being developed as extension to the DICOM standard.

  5. Kepler WebView: A Lightweight, Portable Framework for Constructing Real-time Web Interfaces of Scientific Workflows.

    Science.gov (United States)

    Crawl, Daniel; Singh, Alok; Altintas, Ilkay

    2016-01-01

    Modern web technologies facilitate the creation of high-quality data visualizations, and rich, interactive components across a wide variety of devices. Scientific workflow systems can greatly benefit from these technologies by giving scientists a better understanding of their data or model leading to new insights. While several projects have enabled web access to scientific workflow systems, they are primarily organized as a large portal server encapsulating the workflow engine. In this vision paper, we propose the design for Kepler WebView, a lightweight framework that integrates web technologies with the Kepler Scientific Workflow System. By embedding a web server in the Kepler process, Kepler WebView enables a wide variety of usage scenarios that would be difficult or impossible using the portal model.
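
    The core pattern, embedding a web server inside a running process so that clients can observe it in real time, is easy to sketch in plain Python. This is generic code, not Kepler WebView's actual API.

```python
# Generic pattern (not Kepler WebView's API): embed a small web server inside a
# running process so browsers can query its state while it keeps working.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class StatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Report some internal state of the host process as JSON.
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(b'{"workflow": "running", "step": 3}')

server = HTTPServer(("localhost", 8080), StatusHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
# ... the embedding process (e.g. a workflow engine) continues its work here ...
```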

  6. Physicians' and Nurses' Opinions about the Impact of a Computerized Provider Order Entry System on Their Workflow.

    Science.gov (United States)

    Ayatollahi, Haleh; Roozbehi, Masoud; Haghani, Hamid

    2015-01-01

    In clinical practices, the use of information technology, especially computerized provider order entry (CPOE) systems, has been found to be an effective strategy to improve patient care. This study aimed to compare physicians' and nurses' views about the impact of CPOE on their workflow. This case study was conducted in 2012. The potential participants included all physicians (n = 28) and nurses (n = 145) who worked in a teaching hospital. Data were collected using a five-point Likert-scale questionnaire and were analyzed using SPSS version 18.0. The results showed a significant difference between physicians' and nurses' views about the impact of the system on interorganizational workflow (p = .001) and working relationships between physicians and nurses (p = .017). Interorganizational workflow and working relationships between care providers are important issues that require more attention. Before a CPOE system is designed, it is necessary to identify workflow patterns and hidden structures to avoid compromising quality of care and patient safety.

  7. Modelo para el diseño de sistemas gestores de workflows con funcionalidades colaborativas, cloud y móviles

    OpenAIRE

    Castelán Maldonado, Edgar

    2014-01-01

    This research was developed in the context of WFMS (Workflow Management Systems), mobile applications, cloud computing, and collaborative systems. Currently the design of WFMS is based on the reference model proposed by the WfMC (Workflow Management Coalition). The problem that exists today in the design and development of WfMS is that the reference model proposed by the WfMC was designed many years before the rise of mobile technologies, cloud computing and collaborative systems. It is impor...

  8. chemalot and chemalot_knime: Command line programs as workflow tools for drug discovery.

    Science.gov (United States)

    Lee, Man-Ling; Aliagas, Ignacio; Feng, Jianwen A; Gabriel, Thomas; O'Donnell, T J; Sellers, Benjamin D; Wiswedel, Bernd; Gobbi, Alberto

    2017-06-12

    Analyzing files containing chemical information is at the core of cheminformatics. Each analysis may require a unique workflow. This paper describes the chemalot and chemalot_knime open source packages. Chemalot is a set of command line programs with a wide range of functionalities for cheminformatics. The chemalot_knime package allows command line programs that read and write SD files from stdin and to stdout to be wrapped into KNIME nodes. The combination of chemalot and chemalot_knime not only facilitates the compilation and maintenance of sequences of command line programs but also allows KNIME workflows to take advantage of the compute power of a LINUX cluster. Use of the command line programs is demonstrated in three different workflow examples: (1) a workflow to create a data file with project-relevant data for structure-activity or property analysis and other types of investigations, (2) the creation of a quantitative structure-property-relationship model using the command line programs via KNIME nodes, and (3) the analysis of strain energy in small molecule ligand conformations from the Protein Data Bank database. The chemalot and chemalot_knime packages provide lightweight and powerful tools for many tasks in cheminformatics. They are easily integrated with other open source and commercial command line tools and can be combined to build new and even more powerful tools. The chemalot_knime package facilitates the generation and maintenance of user-defined command line workflows, taking advantage of the graphical design capabilities in KNIME. Graphical abstract: Example KNIME workflow with chemalot nodes and the corresponding command line pipe.
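
    The stdin/stdout convention that makes such tools composable can be sketched with Python's subprocess module. The program names below are placeholders, not real chemalot commands.

```python
# The stdin-to-stdout convention that makes command line tools composable.
# Both program names below are placeholders, not real chemalot commands.
import subprocess

filter_proc = subprocess.Popen(
    ["sdf_filter", "--max-mw", "500"],    # hypothetical SD-file filter
    stdin=open("library.sdf", "rb"),
    stdout=subprocess.PIPE,
)
calc_proc = subprocess.Popen(
    ["sdf_calc_props", "--add", "logP"],  # hypothetical property annotator
    stdin=filter_proc.stdout,
    stdout=open("annotated.sdf", "wb"),
)
filter_proc.stdout.close()  # let the filter get SIGPIPE if the annotator exits
calc_proc.wait()
```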

  9. Text mining for the biocuration workflow

    Science.gov (United States)

    Hirschman, Lynette; Burns, Gully A. P. C; Krallinger, Martin; Arighi, Cecilia; Cohen, K. Bretonnel; Valencia, Alfonso; Wu, Cathy H.; Chatr-Aryamontri, Andrew; Dowell, Karen G.; Huala, Eva; Lourenço, Anália; Nash, Robert; Veuthey, Anne-Lise; Wiegers, Thomas; Winter, Andrew G.

    2012-01-01

    Molecular biology has become heavily dependent on biological knowledge encoded in expert curated biological databases. As the volume of biological literature increases, biocurators need help in keeping up with the literature; (semi-) automated aids for biocuration would seem to be an ideal application for natural language processing and text mining. However, to date, there have been few documented successes for improving biocuration throughput using text mining. Our initial investigations took place for the workshop on ‘Text Mining for the BioCuration Workflow’ at the third International Biocuration Conference (Berlin, 2009). We interviewed biocurators to obtain workflows from eight biological databases. This initial study revealed high-level commonalities, including (i) selection of documents for curation; (ii) indexing of documents with biologically relevant entities (e.g. genes); and (iii) detailed curation of specific relations (e.g. interactions); however, the detailed workflows also showed many variabilities. Following the workshop, we conducted a survey of biocurators. The survey identified biocurator priorities, including the handling of full text indexed with biological entities and support for the identification and prioritization of documents for curation. It also indicated that two-thirds of the biocuration teams had experimented with text mining and almost half were using text mining at that time. Analysis of our interviews and survey provide a set of requirements for the integration of text mining into the biocuration workflow. These can guide the identification of common needs across curated databases and encourage joint experimentation involving biocurators, text mining developers and the larger biomedical research community. PMID:22513129

  10. Workflow interruptions and failed action regulation in surgery personnel.

    Science.gov (United States)

    Elfering, Achim; Nützi, Marina; Koch, Patricia; Baur, Heiner

    2014-03-01

    Workflow interruptions during surgery may cause a threat to patient's safety. Workflow interruptions were tested to predict failure in action regulation that in turn predicts near-accidents in surgery and related health care. One-hundred-and-thirty-three theater nurses and physicians from eight Swiss hospitals participated in a cross-sectional questionnaire survey. The study participation rate was 43%. Structural equation modeling confirmed an indirect path from workflow interruptions through cognitive failure in action regulation on near-accidents (p < 0.05). The indirect path was stronger for workflow interruptions by malfunctions and task organizational blockages compared with workflow interruptions that were caused by persons. The indirect path remained meaningful when individual differences in conscientiousness and compliance with safety regulations were controlled. Task interruptions caused by malfunction and organizational constraints are likely to trigger errors in surgery. Work redesign is recommended to reduce workflow interruptions by malfunction and regulatory constraints.

  11. Wildfire: distributed, Grid-enabled workflow construction and execution

    Directory of Open Access Journals (Sweden)

    Issac Praveen

    2005-03-01

    Background: We observe two trends in bioinformatics: (i) analyses are increasing in complexity, often requiring several applications to be run as a workflow; and (ii) multiple CPU clusters and Grids are available to more scientists. The traditional solution to the problem of running workflows across multiple CPUs required programming, often in a scripting language such as perl. Programming places such solutions beyond the reach of many bioinformatics consumers. Results: We present Wildfire, a graphical user interface for constructing and running workflows. Wildfire borrows user interface features from Jemboss and adds a drag-and-drop interface allowing the user to compose EMBOSS (and other) programs into workflows. For execution, Wildfire uses GEL, the underlying workflow execution engine, which can exploit available parallelism on multiple CPU machines including Beowulf-class clusters and Grids. Conclusion: Wildfire simplifies the tasks of constructing and executing bioinformatics workflows.

  12. Workflow Interruptions and Failed Action Regulation in Surgery Personnel

    Science.gov (United States)

    Elfering, Achim; Nützi, Marina; Koch, Patricia; Baur, Heiner

    2013-01-01

    Background Workflow interruptions during surgery may cause a threat to patient's safety. Workflow interruptions were tested to predict failure in action regulation that in turn predicts near-accidents in surgery and related health care. Methods One-hundred-and-thirty-three theater nurses and physicians from eight Swiss hospitals participated in a cross-sectional questionnaire survey. The study participation rate was 43%. Results Structural equation modeling confirmed an indirect path from workflow interruptions through cognitive failure in action regulation on near-accidents (p < 0.05). The indirect path was stronger for workflow interruptions by malfunctions and task organizational blockages compared with workflow interruptions that were caused by persons. The indirect path remained meaningful when individual differences in conscientiousness and compliance with safety regulations were controlled. Conclusion Task interruptions caused by malfunction and organizational constraints are likely to trigger errors in surgery. Work redesign is recommended to reduce workflow interruptions by malfunction and regulatory constraints. PMID:24932412

  13. Integrated Cloud-Based Services for Medical Workflow Systems

    Directory of Open Access Journals (Sweden)

    Gharbi Nada

    2016-12-01

    Recent years have witnessed significant progress in workflow systems in different business areas. In the medical domain, however, workflow systems remain comparatively scarcely researched, even though workflows there are as important as in other areas. In fact, the flow of information in the healthcare industry is even more critical than it is in other industries. Workflow can provide a new way of looking at how processes and procedures are completed in particular medical systems, and it can help improve decision-making in these systems. Despite the potential capabilities of workflow systems, medical systems still often face critical challenges in maintaining patient medical information, which results in difficulties in accessing patient data across different systems. In this paper, a new cloud-based service-oriented architecture is proposed. This architecture will support a medical workflow system integrated with cloud services aligned with medical standards to improve the healthcare system.

  14. Electronic resource management systems a workflow approach

    CERN Document Server

    Anderson, Elsa K

    2014-01-01

    To get to the bottom of a successful approach to Electronic Resource Management (ERM), Anderson interviewed staff at 11 institutions about their ERM implementations. Among her conclusions, presented in this issue of Library Technology Reports, is that grasping the intricacies of your workflow (analyzing each step to reveal the gaps and problems) at the beginning is crucial to selecting and implementing an ERM. Whether the system will be used to fill a gap, aggregate critical data, or replace a tedious manual process, the best solution for your library depends on factors such as your current soft

  15. A Chemistry-Inspired Workflow Management System for Decentralizing Workflow Execution

    NARCIS (Netherlands)

    Fernandez, H.J.; Tedeschi, Cedric; Priol, Thierry

    2013-01-01

    With the recent widespread adoption of service-oriented architecture, the dynamic composition of services is now a crucial issue in the area of distributed computing. The coordination and execution of composite Web services are today typically conducted by heavyweight centralized workflow engines,

  16. RAMPART: a workflow management system for de novo genome assembly.

    Science.gov (United States)

    Mapleson, Daniel; Drou, Nizar; Swarbreck, David

    2015-06-01

    The de novo assembly of genomes from whole-genome shotgun sequence data is a computationally intensive, multi-stage task and it is not known a priori which methods and parameter settings will produce optimal results. In current de novo assembly projects, a popular strategy involves trying many approaches, using different tools and settings, and then comparing and contrasting the results in order to select a final assembly for publication. Herein, we present RAMPART, a configurable workflow management system for de novo genome assembly, which helps the user identify combinations of third-party tools and settings that provide good results for their particular genome and sequenced reads. RAMPART is designed to exploit high-performance computing environments, such as clusters and shared memory systems, where available. RAMPART is available under the GPLv3 license at: https://github.com/TGAC/RAMPART. © The Author 2015. Published by Oxford University Press.

  17. Beginning WF Windows Workflow in .NET 4.0

    CERN Document Server

    Collins, M

    2010-01-01

    Windows Workflow Foundation is a ground-breaking addition to the core of the .NET Framework that allows you to orchestrate human and system interactions as a series of workflows that can be easily mapped, analyzed, adjusted, and implemented. As business problems become more complex, the need for a workflow-based solution has never been more evident. WF provides a simple and consistent way to model and implement complex problems. As a developer, you focus on developing the business logic for individual workflow tasks. The runtime handles the execution of those tasks after they have been compose

  18. Comparison of Resource Platform Selection Approaches for Scientific Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Ramakrishnan, Lavanya

    2010-03-05

    Cloud computing is increasingly considered as an additional computational resource platform for scientific workflows. The cloud offers opportunity to scale out applications from desktops and local cluster resources. At the same time, it can eliminate the challenges of restricted software environments and queue delays in shared high performance computing environments. Choosing from these diverse resource platforms for a workflow execution poses a challenge for many scientists. Scientists are often faced with deciding resource platform selection trade-offs with limited information on the actual workflows. While many workflow planning methods have explored task scheduling onto different resources, these methods often require fine-scale characterization of the workflow that is onerous for a scientist. In this position paper, we describe our early exploratory work into using blackbox characteristics to do a cost-benefit analysis of using cloud platforms. We use only very limited high-level information on the workflow length, width, and data sizes. The length and width are indicative of the workflow duration and parallelism. The data size characterizes the IO requirements. We compare the effectiveness of this approach to other resource selection models using two exemplar scientific workflows scheduled on desktops, local clusters, HPC centers, and clouds. Early results suggest that the blackbox model often makes the same resource selections as a more fine-grained whitebox model. We believe the simplicity of the blackbox model can help inform a scientist on the applicability of cloud computing resources even before porting an existing workflow.
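
    A blackbox estimate of this kind needs only a few numbers. The sketch below derives a rough makespan and cost from workflow length, width, and data size; all platform figures are invented for illustration.

```python
# Blackbox platform estimate from only workflow length (critical-path hours),
# width (max parallel tasks) and data size. All platform figures are invented.
length_h, width, data_gb = 6.0, 32, 200

platforms = {
    # name: (cores, queue delay h, $ per core-hour, transfer GB/h)
    "local_cluster": (16, 0.5, 0.00, 1000),
    "hpc_center":    (256, 6.0, 0.00, 500),
    "cloud":         (64, 0.0, 0.09, 100),
}

def estimate(name):
    cores, queue_h, price, xfer = platforms[name]
    compute_h = length_h * max(1.0, width / cores)  # serialize if short on cores
    total_h = queue_h + compute_h + data_gb / xfer  # queue + compute + data in
    cost = price * compute_h * min(width, cores)
    return total_h, cost

for name in platforms:
    hours, cost = estimate(name)
    print(f"{name}: ~{hours:.1f} h, ~${cost:.2f}")
```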

  19. Chang'E-3 data pre-processing system based on scientific workflow

    Science.gov (United States)

    Tan, Xu; Liu, Jianjun; Wang, Yuanyuan; Yan, Wei; Zhang, Xiaoxia; Li, Chunlai

    2016-04-01

    The Chang'E-3 (CE3) mission has obtained a huge amount of lunar scientific data. Data pre-processing is an important segment of the CE3 ground research and application system. With a dramatic increase in the demand for data research and application, the Chang'E-3 data pre-processing system (CEDPS), based on scientific workflow, is proposed for the purpose of making scientists more flexible and productive by automating data-driven processing. The system should allow the planning, conduct and control of the data processing procedure with the following possibilities: • describe a data processing task, including: 1) define input/output data, 2) define the data relationship, 3) define the sequence of tasks, 4) define the communication between tasks, 5) define mathematical formulas, 6) define the relationship between tasks and data; • automatic processing of tasks. Accordingly, describing a task is the key point of whether the system is flexible. We design a workflow designer, a visual environment for capturing processes as workflows, and discuss its three-level model: 1) the data relationship is established through a product tree; 2) the process model is constructed based on a directed acyclic graph (DAG); in particular, a set of process workflow constructs, including Sequence, Loop, Merge and Fork, are compositional with one another; 3) to reduce the modeling complexity of the mathematical formulas using a DAG, semantic modeling based on MathML is adopted. On top of that, we present how the CE3 data are processed with CEDPS.
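
    The DAG process model at the heart of the designer can be sketched with Python's standard library; the task names below are illustrative, not CEDPS code.

```python
# Sketch of the DAG process model: tasks as nodes, prerequisites as edges,
# executed in topological order. Task names are illustrative, not CEDPS code.
from graphlib import TopologicalSorter  # Python 3.9+

dag = {  # task: set of prerequisite tasks
    "radiometric_cal": set(),
    "geometric_cal": set(),
    "merge": {"radiometric_cal", "geometric_cal"},  # a Merge construct
    "format_product": {"merge"},                    # a Sequence construct
}

for task in TopologicalSorter(dag).static_order():
    print(f"running {task}")  # a real engine would run independent tasks in parallel
```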

  20. Workflow-Based Software Development Environment

    Science.gov (United States)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment

  1. Deriving DICOM surgical extensions from surgical workflows

    Science.gov (United States)

    Burgert, O.; Neumuth, T.; Gessat, M.; Jacobs, S.; Lemke, H. U.

    2007-03-01

    The generation, storage, transfer, and representation of image data in radiology are standardized by DICOM. To cover the needs of image guided surgery or computer assisted surgery in general, one needs to handle patient information besides image data. A large number of objects must be defined in DICOM to address the needs of surgery. We propose an analysis process based on Surgical Workflows that helps to identify these objects together with use cases and requirements motivating their specification. As the first result, we confirmed the need for the specification of representation and transfer of geometric models. The analysis of Surgical Workflows has shown that geometric models are widely used to represent planned procedure steps, surgical tools, anatomical structures, or prostheses in the context of surgical planning, image guided surgery, augmented reality, and simulation. To date, the models are stored and transferred in several file formats devoid of contextual information. The standardization of data types including contextual information and specifications for handling of geometric models allows a broader usage of such models. This paper explains the specification process leading to Geometry Mesh Service Object Pair classes. This process can be a template for the definition of further DICOM classes.

  2. Delta: Data Reduction for Integrated Application Workflows.

    Energy Technology Data Exchange (ETDEWEB)

    Lofstead, Gerald Fredrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jean-Baptiste, Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Oldfield, Ron A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-06-01

    Integrated Application Workflows (IAWs) run multiple simulation workflow components concurrently on an HPC resource, connecting these components using compute area resources and compensating for any performance or data processing rate mismatches. These IAWs require high frequency and high volume data transfers between compute nodes and staging area nodes during the lifetime of a large parallel computation. The available network bandwidth between the two areas may not be enough to efficiently support the data movement. As the processing power available to compute resources increases, the requirements for this data transfer will become more difficult to satisfy and perhaps will not be satisfiable at all since network capabilities are not expanding at a comparable rate. Furthermore, energy consumption in HPC environments is expected to grow by an order of magnitude as exascale systems become a reality. The energy cost of moving large amounts of data frequently will contribute to this issue. It is necessary to reduce the volume of data without reducing the quality of data when it is being processed and analyzed. Delta resolves the issue by addressing the lifetime data transfer operations. Delta removes subsequent identical copies of already transmitted data during transfers and restores those copies once the data has reached the destination. Delta is able to identify duplicated information and determine the most space efficient way to represent it. Initial tests show about 50% reduction in data movement while maintaining the same data quality and transmission frequency.
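
    The lifetime deduplication idea reduces to: hash each chunk, transmit a reference when the hash has already been seen, and restore the bytes at the destination. A toy sketch, not Sandia's implementation:

```python
# Toy version of lifetime deduplication (not Sandia's implementation): send a
# chunk's bytes only the first time; afterwards send a short reference and
# restore the full copy at the destination.
import hashlib

def dedup_send(chunks, seen):
    wire = []
    for chunk in chunks:
        digest = hashlib.sha256(chunk).hexdigest()
        if digest in seen:
            wire.append(("ref", digest))   # identical copy: reference only
        else:
            seen.add(digest)
            wire.append(("data", chunk))   # first occurrence: full bytes
    return wire

def dedup_receive(wire, store):
    out = []
    for kind, payload in wire:
        if kind == "data":
            store[hashlib.sha256(payload).hexdigest()] = payload
            out.append(payload)
        else:
            out.append(store[payload])     # restore the elided copy
    return out

blocks = [b"timestep-0", b"mesh-header", b"timestep-0"]  # repeated block
wire = dedup_send(blocks, set())
assert dedup_receive(wire, {}) == blocks
```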

  3. Research electronic data capture (REDCap)—A metadata-driven methodology and workflow process for providing translational research informatics support

    National Research Council Canada - National Science Library

    Harris, Paul A; Taylor, Robert; Thielke, Robert; Payne, Jonathon; Gonzalez, Nathaniel; Conde, Jose G

    2009-01-01

    Research electronic data capture (REDCap) is a novel workflow methodology and software solution designed for rapid development and deployment of electronic data capture tools to support clinical and translational research. We present: (1...

  4. Von der Prozeßorientierung zum Workflow Management - Teil 2: Prozeßmanagement, Workflow Management, Workflow-Management-Systeme

    OpenAIRE

    Maurer, Gerd

    1996-01-01

    The terms process orientation, process management, workflow management, and workflow management systems are still not clearly defined and delineated from one another. Starting from a specific understanding of process orientation (Working Paper WI No. 9/1996), process management is defined as a comprehensive approach to the process-oriented design and management of enterprises. Workflow management represents the more formal, strongly IT-related component of process management and...

  5. Cognitive Learning, Monitoring and Assistance of Industrial Workflows Using Egocentric Sensor Networks.

    Directory of Open Access Journals (Sweden)

    Gabriele Bleser

    Full Text Available Today, the workflows that are involved in industrial assembly and production activities are becoming increasingly complex. To efficiently and safely perform these workflows is demanding on the workers, in particular when it comes to infrequent or repetitive tasks. This burden on the workers can be eased by introducing smart assistance systems. This article presents a scalable concept and an integrated system demonstrator designed for this purpose. The basic idea is to learn workflows from observing multiple expert operators and then transfer the learnt workflow models to novice users. Being entirely learning-based, the proposed system can be applied to various tasks and domains. The above idea has been realized in a prototype, which combines components pushing the state of the art of hardware and software designed with interoperability in mind. The emphasis of this article is on the algorithms developed for the prototype: (1) fusion of inertial and visual sensor information from an on-body sensor network (BSN) to robustly track the user's pose in magnetically polluted environments; (2) learning-based computer vision algorithms to map the workspace, localize the sensor with respect to the workspace and capture objects, even as they are carried; (3) domain-independent and robust workflow recovery and monitoring algorithms based on spatiotemporal pairwise relations deduced from object and user movement with respect to the scene; and (4) context-sensitive augmented reality (AR) user feedback using a head-mounted display (HMD). A distinguishing key feature of the developed algorithms is that they all operate solely on data from the on-body sensor network and that no external instrumentation is needed. The feasibility of the chosen approach for the complete action-perception-feedback loop is demonstrated on three increasingly complex datasets representing manual industrial tasks. These limited size datasets indicate and highlight the potential of the chosen

  6. Cognitive Learning, Monitoring and Assistance of Industrial Workflows Using Egocentric Sensor Networks.

    Science.gov (United States)

    Bleser, Gabriele; Damen, Dima; Behera, Ardhendu; Hendeby, Gustaf; Mura, Katharina; Miezal, Markus; Gee, Andrew; Petersen, Nils; Maçães, Gustavo; Domingues, Hugo; Gorecky, Dominic; Almeida, Luis; Mayol-Cuevas, Walterio; Calway, Andrew; Cohn, Anthony G; Hogg, David C; Stricker, Didier

    2015-01-01

    Today, the workflows that are involved in industrial assembly and production activities are becoming increasingly complex. To efficiently and safely perform these workflows is demanding on the workers, in particular when it comes to infrequent or repetitive tasks. This burden on the workers can be eased by introducing smart assistance systems. This article presents a scalable concept and an integrated system demonstrator designed for this purpose. The basic idea is to learn workflows from observing multiple expert operators and then transfer the learnt workflow models to novice users. Being entirely learning-based, the proposed system can be applied to various tasks and domains. The above idea has been realized in a prototype, which combines components pushing the state of the art of hardware and software designed with interoperability in mind. The emphasis of this article is on the algorithms developed for the prototype: 1) fusion of inertial and visual sensor information from an on-body sensor network (BSN) to robustly track the user's pose in magnetically polluted environments; 2) learning-based computer vision algorithms to map the workspace, localize the sensor with respect to the workspace and capture objects, even as they are carried; 3) domain-independent and robust workflow recovery and monitoring algorithms based on spatiotemporal pairwise relations deduced from object and user movement with respect to the scene; and 4) context-sensitive augmented reality (AR) user feedback using a head-mounted display (HMD). A distinguishing key feature of the developed algorithms is that they all operate solely on data from the on-body sensor network and that no external instrumentation is needed. The feasibility of the chosen approach for the complete action-perception-feedback loop is demonstrated on three increasingly complex datasets representing manual industrial tasks. These limited size datasets indicate and highlight the potential of the chosen technology as a
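
    As a conceptual aside on point (1), fusing a fast-but-drifting inertial rate with a slower, drift-free absolute estimate is often pictured as a complementary filter. The Python sketch below is a deliberately minimal stand-in for the paper's BSN fusion, with hypothetical numbers:

        def fuse_heading(heading, gyro_rate, visual_heading, dt, alpha=0.98):
            """Blend an inertial prediction (responsive, but drifts over time)
            with a visual heading fix (absolute, but low-rate and noisy)."""
            predicted = heading + gyro_rate * dt     # integrate the gyro
            return alpha * predicted + (1 - alpha) * visual_heading

        heading = 0.0
        # (gyro rate [rad/s], visual heading [rad]) samples at dt = 0.01 s
        for gyro_rate, visual in [(0.10, 0.001), (0.12, 0.002), (0.09, 0.003)]:
            heading = fuse_heading(heading, gyro_rate, visual, dt=0.01)
        print(heading)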

  7. Emergency medicine resident physicians' perceptions of electronic documentation and workflow: a mixed methods study.

    Science.gov (United States)

    Neri, P M; Redden, L; Poole, S; Pozner, C N; Horsky, J; Raja, A S; Poon, E; Schiff, G; Landman, A

    2015-01-01

    To understand emergency department (ED) physicians' use of electronic documentation in order to identify usability and workflow considerations for the design of future ED information system (EDIS) physician documentation modules. We invited emergency medicine resident physicians to participate in a mixed methods study using task analysis and qualitative interviews. Participants completed a simulated, standardized patient encounter in a medical simulation center while documenting in the test environment of a currently used EDIS. We recorded the time on task and the type and sequence of tasks performed by the participants (including tasks performed in parallel). We then conducted semi-structured interviews with each participant. We analyzed these qualitative data using the constant comparative method to generate themes. Eight resident physicians participated. The simulation session averaged 17 minutes, and participants spent 11 minutes on average on tasks that included electronic documentation. Participants performed tasks in parallel, such as history taking and electronic documentation. Five of the 8 participants performed a similar workflow sequence during the first part of the session, while the remaining three used different workflows. Three themes characterize electronic documentation: (1) physicians report that the location and timing of documentation vary based on patient acuity and workload; (2) physicians report a need for features that support improved efficiency; and (3) physicians like viewing available patient data but struggle with integration of the EDIS with other information sources. We confirmed that physicians spend much of their time on documentation (65%) during an ED patient visit. Further, we found that resident physicians did not all use the same workflow and approach even when presented with an identical standardized patient scenario. Future EHR design should consider these varied workflows while trying to optimize efficiency, such as improving

  8. Cognitive Learning, Monitoring and Assistance of Industrial Workflows Using Egocentric Sensor Networks

    Science.gov (United States)

    Bleser, Gabriele; Damen, Dima; Behera, Ardhendu; Hendeby, Gustaf; Mura, Katharina; Miezal, Markus; Gee, Andrew; Petersen, Nils; Maçães, Gustavo; Domingues, Hugo; Gorecky, Dominic; Almeida, Luis; Mayol-Cuevas, Walterio; Calway, Andrew; Cohn, Anthony G.; Hogg, David C.; Stricker, Didier

    2015-01-01

    Today, the workflows that are involved in industrial assembly and production activities are becoming increasingly complex. To efficiently and safely perform these workflows is demanding on the workers, in particular when it comes to infrequent or repetitive tasks. This burden on the workers can be eased by introducing smart assistance systems. This article presents a scalable concept and an integrated system demonstrator designed for this purpose. The basic idea is to learn workflows from observing multiple expert operators and then transfer the learnt workflow models to novice users. Being entirely learning-based, the proposed system can be applied to various tasks and domains. The above idea has been realized in a prototype, which combines components pushing the state of the art of hardware and software designed with interoperability in mind. The emphasis of this article is on the algorithms developed for the prototype: 1) fusion of inertial and visual sensor information from an on-body sensor network (BSN) to robustly track the user’s pose in magnetically polluted environments; 2) learning-based computer vision algorithms to map the workspace, localize the sensor with respect to the workspace and capture objects, even as they are carried; 3) domain-independent and robust workflow recovery and monitoring algorithms based on spatiotemporal pairwise relations deduced from object and user movement with respect to the scene; and 4) context-sensitive augmented reality (AR) user feedback using a head-mounted display (HMD). A distinguishing key feature of the developed algorithms is that they all operate solely on data from the on-body sensor network and that no external instrumentation is needed. The feasibility of the chosen approach for the complete action-perception-feedback loop is demonstrated on three increasingly complex datasets representing manual industrial tasks. These limited size datasets indicate and highlight the potential of the chosen technology as a

  9. Computational workflow for the fine-grained analysis of metagenomic samples.

    Science.gov (United States)

    Pérez-Wohlfeil, Esteban; Arjona-Medina, Jose A; Torreno, Oscar; Ulzurrun, Eugenia; Trelles, Oswaldo

    2016-10-25

    The field of metagenomics, defined as the direct genetic analysis of uncultured samples of genomes contained within an environmental sample, is gaining increasing popularity. The aim of metagenomic studies is to determine the species present in an environmental community and identify changes in the abundance of species under different conditions. Current metagenomic analysis software faces bottlenecks due to the high computational load required to analyze complex samples. A computational open-source workflow has been developed for the detailed analysis of metagenomes. This workflow provides new tools and datafile specifications that facilitate the identification of differences in the abundance of reads assigned to taxa (mapping), enable the detection of reads of low-abundance bacteria (producing evidence of their presence), and provide new concepts for filtering spurious matches. Innovative visualization ideas for improved display of metagenomic diversity are also proposed to better understand how reads are mapped to taxa. Illustrative examples are provided based on the study of two collections of metagenomes from faecal microbial communities of adult female monozygotic and dizygotic twin pairs concordant for leanness or obesity and their mothers. The proposed workflow provides an open environment that offers the opportunity to perform the mapping process using different reference databases. Additionally, this workflow shows the specifications of the mapping process and datafile formats to facilitate the development of new plugins for further post-processing. This open and extensible platform has been designed with the aim of enabling in-depth analysis of metagenomic samples and better understanding of the underlying biological processes.
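
    The abundance-comparison step can be pictured with a minimal Python sketch (hypothetical read assignments, not the workflow's actual code): reads mapped to taxa are tallied into relative abundances, and low-abundance taxa are retained rather than filtered away:

        from collections import Counter

        def taxon_abundance(assignments):
            """assignments: one taxon label per mapped read."""
            counts = Counter(assignments)
            total = sum(counts.values())
            return {taxon: n / total for taxon, n in counts.items()}

        def abundance_delta(sample_a, sample_b):
            """Per-taxon relative-abundance differences between two samples,
            keeping low-abundance taxa as evidence of presence."""
            taxa = set(sample_a) | set(sample_b)
            return {t: sample_a.get(t, 0.0) - sample_b.get(t, 0.0) for t in taxa}

        lean = taxon_abundance(["Bacteroides", "Bacteroides", "Akkermansia"])
        obese = taxon_abundance(["Firmicutes", "Bacteroides", "Firmicutes"])
        print(abundance_delta(lean, obese))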

  10. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience.

    Science.gov (United States)

    Stockton, David B; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.

  11. Workflow interruptions and mental workload in hospital pediatricians: an observational study.

    Science.gov (United States)

    Weigl, Matthias; Müller, Andreas; Angerer, Peter; Hoffmann, Florian

    2014-09-24

    Pediatricians' workload is increasingly thought to affect their quality of work life and patient safety. Workflow interruptions are a frequent stressor in clinical work, impeding clinicians' attention and contributing to clinical malpractice. We aimed to investigate prospective associations of workflow interruptions with multiple dimensions of mental workload in pediatricians during clinical day shifts. In an academic children's hospital, a prospective study of 28 full-shift observations was conducted among pediatricians providing ward coverage. The prevalence of workflow interruptions was determined by expert observation using a validated observation instrument. Concurrently, pediatricians' workload ratings were assessed with three workload dimensions of the well-validated NASA Task Load Index: mental demands, effort, and frustration. Observed pediatricians were, on average, interrupted 4.7 times per hour. Most frequent were interruptions by colleagues (30.2%), nursing staff (29.7%), and telephone/beeper calls (16.3%). Interruption measures were correlated with two workload outcomes of interest: frequent workflow interruptions were related to lower mental demands, but they were also associated with increased frustration. With regard to single sources, interruptions by colleagues showed the strongest associations with workload. The findings provide insights into specific pathways between different types of interruptions and pediatricians' mental workload. They suggest avenues for further research and yield a number of work- and organization-redesign suggestions for pediatric care.

  12. Development of a High-Throughput Ion-Exchange Resin Characterization Workflow.

    Science.gov (United States)

    Liu, Chun; Dermody, Daniel; Harris, Keith; Boomgaard, Thomas; Sweeney, Jeff; Gisch, Daryl; Goltz, Bob

    2017-06-12

    A novel high-throughput (HTR) ion-exchange (IEX) resin workflow has been developed for characterizing the ion-exchange equilibria of commercial and experimental IEX resins across a range of applications in which the water environment differs from site to site. Because of its much higher throughput, design of experiments (DOE) methodology can easily be applied to study the effects of multiple factors on resin performance. Two case studies are presented to illustrate the efficacy of the combined HTR workflow and DOE method. In case study one, a series of anion-exchange resins was screened for selective removal of NO3- and NO2- in water environments consisting of multiple other anions, varied pH, and ionic strength. A response surface model (RSM) is developed to statistically correlate resin performance with water composition and predict the best resin candidate. In case study two, the same HTR workflow and DOE method were applied to screen different cation-exchange resins for the selective removal of Mg2+, Ca2+, and Ba2+ from high total dissolved salt (TDS) water. A master DOE model including all of the cation-exchange resins is created to predict divalent cation removal by different IEX resins under specific conditions, from which the best resin candidates can be identified. The successful adoption of the HTR workflow and DOE method for studying the ion exchange of IEX resins can significantly reduce the resources and time needed to address industry and application needs.
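
    To picture how HTR screening data feed a response surface model, the sketch below fits a full quadratic surface by least squares and predicts performance at an untested condition. The factor levels and responses are invented for illustration; they are not the study's data:

        import numpy as np

        # Hypothetical screening runs: (pH, ionic strength) -> fraction of NO3- removed
        X = np.array([[5.0, 0.01], [5.0, 0.10], [7.0, 0.01],
                      [7.0, 0.10], [9.0, 0.01], [9.0, 0.10], [7.0, 0.05]])
        y = np.array([0.82, 0.61, 0.78, 0.55, 0.70, 0.48, 0.68])

        def quadratic_design(X):
            """Intercept, linear, interaction, and squared terms of two factors."""
            x1, x2 = X[:, 0], X[:, 1]
            return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

        coef, *_ = np.linalg.lstsq(quadratic_design(X), y, rcond=None)

        def predict(ph, ionic):
            basis = np.array([1.0, ph, ionic, ph * ionic, ph**2, ionic**2])
            return basis @ coef

        print(predict(6.0, 0.02))  # predicted removal at an untested condition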

  13. Interacting with the National Database for Autism Research (NDAR) via the LONI Pipeline workflow environment.

    Science.gov (United States)

    Torgerson, Carinna M; Quinn, Catherine; Dinov, Ivo; Liu, Zhizhong; Petrosyan, Petros; Pelphrey, Kevin; Haselgrove, Christian; Kennedy, David N; Toga, Arthur W; Van Horn, John Darrell

    2015-03-01

    Under the umbrella of the National Database for Clinical Trials (NDCT) related to mental illnesses, the National Database for Autism Research (NDAR) seeks to gather, curate, and make openly available neuroimaging data from NIH-funded studies of autism spectrum disorder (ASD). NDAR has recently made its database accessible through the LONI Pipeline workflow design and execution environment to enable large-scale analyses of cortical architecture and function via local, cluster, or "cloud"-based computing resources. This presents a unique opportunity to overcome many of the customary limitations to fostering biomedical neuroimaging as a science of discovery. Providing open access to primary neuroimaging data, workflow methods, and high-performance computing will increase uniformity in data collection protocols, encourage greater reliability of published data, results replication, and broaden the range of researchers now able to perform larger studies than ever before. To illustrate the use of NDAR and LONI Pipeline for performing several commonly performed neuroimaging processing steps and analyses, this paper presents example workflows useful for ASD neuroimaging researchers seeking to begin using this valuable combination of online data and computational resources. We discuss the utility of such database and workflow processing interactivity as a motivation for the sharing of additional primary data in ASD research and elsewhere.

  14. NeuroManager: A workflow analysis based simulation management engine for computational neuroscience

    Directory of Open Access Journals (Sweden)

    David Bruce Stockton

    2015-10-01

    Full Text Available We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in twenty-two stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.

  15. A Hybrid Task Graph Scheduler for High Performance Image Processing Workflows.

    Science.gov (United States)

    Blattner, Timothy; Keyrouz, Walid; Bhattacharyya, Shuvra S; Halem, Milton; Brady, Mary

    2017-12-01

    Designing applications for scalability is key to improving their performance in hybrid and cluster computing. Scheduling code to utilize parallelism is difficult, particularly when dealing with data dependencies, memory management, data motion, and processor occupancy. The Hybrid Task Graph Scheduler (HTGS) is an abstract execution model, framework, and API that improves programmer productivity when implementing hybrid workflows for multi-core and multi-GPU systems. HTGS manages dependencies between tasks, represents CPU and GPU memories independently, overlaps computations with disk I/O and memory transfers, keeps multiple GPUs occupied, and uses all available compute resources. Through these abstractions, data motion and memory are explicit; this makes data locality decisions more accessible. To demonstrate the HTGS application program interface (API), we present implementations of two example algorithms: (1) a matrix multiplication that shows how easily task graphs can be used; and (2) a hybrid implementation of microscopy image stitching that reduces code size by ≈ 43% compared to a manually coded hybrid workflow implementation and showcases the minimal overhead of task graphs in HTGS. Both of the HTGS-based implementations show good performance. In image stitching, the HTGS implementation achieves performance similar to the hybrid workflow implementation. Matrix multiplication with HTGS achieves 1.3× and 1.8× speedups over the multi-threaded OpenBLAS library for 16k × 16k and 32k × 32k matrices, respectively.
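
    HTGS itself is a C++ framework, but the underlying task-graph idea, stages connected by queues so that I/O, transfers, and compute can overlap, can be sketched in a few lines of Python; the stages and data here are purely illustrative:

        import threading, queue

        def stage(fn, inq, outq):
            """One task-graph node: consume items, apply fn, forward results.
            A None sentinel shuts the stage down and propagates downstream."""
            while True:
                item = inq.get()
                if item is None:
                    outq.put(None)
                    break
                outq.put(fn(item))

        q1, q2, q3 = queue.Queue(), queue.Queue(), queue.Queue()
        workers = [
            threading.Thread(target=stage, args=(lambda x: x * 2, q1, q2)),  # "GPU" kernel
            threading.Thread(target=stage, args=(lambda x: x + 1, q2, q3)),  # post-process
        ]
        for w in workers:
            w.start()

        for tile in range(5):   # producer, e.g. tiles streamed from disk
            q1.put(tile)
        q1.put(None)            # end of input

        for w in workers:
            w.join()
        results = []
        while not q3.empty():
            item = q3.get()
            if item is not None:
                results.append(item)
        print(results)          # [1, 3, 5, 7, 9]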

  16. Distributed Global Transaction Support for Workflow Management Applications

    NARCIS (Netherlands)

    Vonk, J.; Grefen, P.W.P.J.; Boertjes, E.M.; Apers, Peter M.G.

    Workflow management systems require advanced transaction support to cope with their inherently long-running processes. The recent trend to distribute workflow executions requires an even more advanced transaction support system that is able to handle distribution. This paper presents a model as well

  17. Distributed Global Transaction Support for Workflow Management Applications

    NARCIS (Netherlands)

    Vonk, J.; Grefen, P.W.P.J.; Boertjes, E.M.; Apers, Peter M.G.

    Workflow management systems require advanced transaction support to cope with their inherently long-running processes. The recent trend to distribute workflow executions requires an even more advanced transaction support system that is able to handle distribution. This report presents a model as

  18. Two-Layer Transaction Management for Workflow Management Applications

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Vonk, J.; Boertjes, E.M.; Apers, Peter M.G.

    Workflow management applications require advanced transaction management that is not offered by traditional database systems. For this reason, a number of extended transaction models has been proposed in the past. None of these models seems completely adequate, though, because workflow management

  19. Conceptual Framework and Architecture for Service Mediating Workflow Management

    NARCIS (Netherlands)

    Hu, Jinmin; Grefen, P.W.P.J.

    2003-01-01

    This paper proposes a three-layer conceptual workflow framework to realize workflow enactment flexibility by dynamically binding activities to their implementations at run time. A service mediating layer is added to bridge business process definition and its implementation. Based on this framework, we

  20. Piloting an empirical study on measures for workflow similarity

    NARCIS (Netherlands)

    Wombacher, Andreas; Rozie, M.

    Service discovery of state dependent services has to take workflow aspects into account. To increase the usability of a service discovery, the result list of services should be ordered with regard to the relevance of the services. Means of ordering a list of workflows according to their similarity with

  1. Modelling and analysis of workflow for lean supply chains

    Science.gov (United States)

    Ma, Jinping; Wang, Kanliang; Xu, Lida

    2011-11-01

    Cross-organisational workflow systems are a component of enterprise information systems that support collaborative business processes among organisations in a supply chain. Currently, the majority of workflow systems are developed from an information-modelling perspective, without considering the actual requirements of supply chain management. In this article, we focus on the modelling and analysis of cross-organisational workflow systems in the context of the lean supply chain (LSC) using Petri nets. First, the article describes the assumed conditions of the cross-organisational workflow net according to the idea of the LSC and then discusses the standardisation of collaborative business processes between organisations in the LSC context. Second, the concept of labelled time Petri nets (LTPNs) is defined by combining labelled Petri nets with time Petri nets, and the concept of labelled time workflow nets (LTWNs) is defined based on LTPNs. Cross-organisational labelled time workflow nets (CLTWNs) are then defined based on LTWNs. Third, the article proposes the notion of OR-silent CLTWNs and an approach to verifying the soundness of LTWNs and CLTWNs. Finally, the article illustrates the proposed method through a simple example. The purpose of this research is to establish a formal method for the modelling and analysis of workflow systems for the LSC. This study opens a new perspective on research into cross-organisational workflow management and promotes operations management of the LSC in real-world settings.
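
    The flavour of the workflow-net machinery can be conveyed with a minimal place/transition net in Python (untimed and unlabelled, so far simpler than the LTWNs and CLTWNs defined in the article):

        class WorkflowNet:
            """Minimal Petri net: a transition is enabled when every input
            place holds a token; firing moves tokens to the output places."""

            def __init__(self, marking, transitions):
                self.marking = dict(marking)    # place -> token count
                self.transitions = transitions  # name -> (input places, output places)

            def enabled(self, name):
                inputs, _ = self.transitions[name]
                return all(self.marking.get(p, 0) > 0 for p in inputs)

            def fire(self, name):
                if not self.enabled(name):
                    raise ValueError(f"{name} is not enabled")
                inputs, outputs = self.transitions[name]
                for p in inputs:
                    self.marking[p] -= 1
                for p in outputs:
                    self.marking[p] = self.marking.get(p, 0) + 1

        # A two-organisation hand-off: the supplier ships, the buyer receives.
        net = WorkflowNet(
            marking={"order_placed": 1},
            transitions={
                "ship":    (["order_placed"], ["in_transit"]),
                "receive": (["in_transit"], ["order_filled"]),
            },
        )
        net.fire("ship")
        net.fire("receive")
        print(net.marking)  # exactly one token ends in the sink place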

  2. A Formal Semantics for UML Activity Diagrams - Formalising Workflow Models

    NARCIS (Netherlands)

    Eshuis, H.; Wieringa, Roelf J.

    In this report we define a formal execution semantics for UML activity diagrams that is appropriate for workflow modelling. Our workflow models express software requirements and therefore assume a perfect implementation. In our semantics, software state changes do not take time. It is based upon the

  3. Building and documenting workflows with python-based snakemake

    NARCIS (Netherlands)

    J. Köster (Johannes); S. Rahmann (Sven)

    2012-01-01

    Snakemake is a novel workflow engine with a simple Python-derived workflow definition language and an optimizing execution environment. It is the first system that supports multiple named wildcards (or variables) in input and output filenames of each rule definition. It also allows to
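
    A minimal Snakefile illustrates the named-wildcard feature highlighted above; the file names and shell commands are illustrative only:

        # Snakefile: the wildcards {sample} and {ref} both appear in the
        # input and output filenames of a single rule.
        rule all:
            input:
                "aligned/patient1_hg19.bam"

        rule align:
            input:
                reads="data/{sample}.fastq",
                genome="refs/{ref}.fa"
            output:
                "aligned/{sample}_{ref}.bam"
            shell:
                "bwa mem {input.genome} {input.reads} | samtools view -Sb - > {output}"

    Requesting aligned/patient1_hg19.bam resolves sample=patient1 and ref=hg19, and Snakemake schedules the align rule accordingly.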

  4. A Collaborative Workflow for the Digitization of Unique Materials

    Science.gov (United States)

    Gueguen, Gretchen; Hanlon, Ann M.

    2009-01-01

    This paper examines the experience of one institution, the University of Maryland Libraries, as it made organizational efforts to harness existing workflows and to capture digitization done in the course of responding to patron requests. By examining the way this organization adjusted its existing workflows to put in place more systematic methods…

  5. Kronos: a workflow assembler for genome analytics and informatics.

    Science.gov (United States)

    Taghiyar, M Jafar; Rosner, Jamie; Grewal, Diljot; Grande, Bruno M; Aniba, Radhouane; Grewal, Jasleen; Boutros, Paul C; Morin, Ryan D; Bashashati, Ali; Shah, Sohrab P

    2017-07-01

    The field of next-generation sequencing informatics has matured to a point where algorithmic advances in sequence alignment and individual feature detection methods have stabilized. Practical and robust implementation of complex analytical workflows (where such tools are structured into "best practices" for automated analysis of next-generation sequencing datasets) still requires significant programming investment and expertise. We present Kronos, a software platform for facilitating the development and execution of modular, auditable, and distributable bioinformatics workflows. Kronos obviates the need for explicit coding of workflows by compiling a text configuration file into executable Python applications. Making analysis modules would still require programming. The framework of each workflow includes a run manager to execute the encoded workflows locally (or on a cluster or cloud), parallelize tasks, and log all runtime events. The resulting workflows are highly modular and configurable by construction, facilitating flexible and extensible meta-applications that can be modified easily through configuration file editing. The workflows are fully encoded for ease of distribution and can be instantiated on external systems, a step toward reproducible research and comparative analyses. We introduce a framework for building Kronos components that function as shareable, modular nodes in Kronos workflows. The Kronos platform provides a standard framework for developers to implement custom tools, reuse existing tools, and contribute to the community at large. Kronos is shipped with both Docker and Amazon Web Services Machine Images. It is free, open source, and available through the Python Package Index and at https://github.com/jtaghiyar/kronos.

  6. An architecture including network QoS in scientific workflows

    NARCIS (Netherlands)

    Zhao, Z.; Grosso, P.; Koning, R.; van der Ham, J.; de Laat, C.

    2010-01-01

    The quality of the network services has so far rarely been considered in composing and executing scientific workflows. Currently, scientific applications tune the execution quality of workflows neglecting network resources, and by selecting only optimal software services and computing resources. One

  7. Network resource selection for data transfer processes in scientific workflows

    NARCIS (Netherlands)

    Zhao, Z.; Grosso, P.; Koning, R.; van der Ham, J.; de Laat, C.

    2010-01-01

    Quality of the service (QoS) plays an important role in the life-cycle of scientific workflows for composing and executing applications. However, the quality of network services has so far rarely been considered in composing and executing scientific workflows. Currently, scientific applications tune

  8. Integrating accelerated tryptic digestion into proteomics workflows.

    Science.gov (United States)

    Slysz, Gordon W; Schriemer, David C

    2009-01-01

    An accelerated protein digestion procedure is described that features a microscale trypsin cartridge operated under aqueous-organic conditions. High sequence coverage digestions obtained in seconds with small amounts of enzyme are possible with the approach, which also supports online integration of digestion with reversed-phase protein separation. The construction and operation of effective digestor cartridges for rapid sample processing are described. For workflows involving chromatographic protein separation, an easily assembled fluidic system is presented that inserts the digestion step after column-based separation. Successful integration requires dynamic effluent titration immediately prior to transmission through the digestor. This is achieved through the coordination of the column gradient system with an inverse gradient system to produce steady pH and organic solvent levels. System assembly and operation sufficient for achieving digestion and identification of subnanogram levels of protein are described.

  9. Federated Database Services for Wind Tunnel Experiment Workflows

    Directory of Open Access Journals (Sweden)

    A. Paventhan

    2006-01-01

    Full Text Available Enabling the full life cycle of scientific and engineering workflows requires robust middleware and services that support effective data management, near-realtime data movement and custom data processing. Many existing solutions exploit the database as a passive metadata catalog. In this paper, we present an approach that makes use of federation of databases to host data-centric wind tunnel application workflows. The user is able to compose customized application workflows based on database services. We provide a reference implementation that leverages typical business tools and technologies: Microsoft SQL Server for database services and Windows Workflow Foundation for workflow services. The application data and user's code are both hosted in federated databases. With the growing interest in XML Web Services in scientific Grids, and with databases beginning to support native XML types and XML Web services, we can expect the role of databases in scientific computation to grow in importance.

  10. Possibilistic Information Flow Control for Workflow Management Systems

    Directory of Open Access Journals (Sweden)

    Thomas Bauereiss

    2014-04-01

    Full Text Available In workflows and business processes, there are often security requirements on both the data, i.e. confidentiality and integrity, and the process, e.g. separation of duty. Graphical notations exist for specifying both workflows and associated security requirements. We present an approach for formally verifying that a workflow satisfies such security requirements. For this purpose, we define the semantics of a workflow as a state-event system and formalise security properties in a trace-based way, i.e. on an abstract level without depending on details of enforcement mechanisms such as Role-Based Access Control (RBAC). This formal model then allows us to build upon well-known verification techniques for information flow control. We describe how a compositional verification methodology for possibilistic information flow can be adapted to verify that a specification of a distributed workflow management system satisfies security requirements on both data and processes.
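
    A toy rendering of the trace-based view (greatly simplified relative to the paper's state-event systems): a system is possibilistically noninterfering if purging confidential events from any trace leaves the low-level observations unchanged. All names here are hypothetical:

        def purge(trace, high):
            """Remove confidential (high) events from a trace."""
            return [e for e in trace if e not in high]

        def low_view(system, trace, low):
            """What a low-clearance observer sees after a run."""
            return [e for e in system(trace) if e in low]

        def noninterfering(system, traces, high, low):
            return all(
                low_view(system, t, low) == low_view(system, purge(t, high), low)
                for t in traces
            )

        system = lambda trace: trace  # toy system: output equals input trace
        high, low = {"approve_salary"}, {"file_report"}
        print(noninterfering(system, [["approve_salary", "file_report"]], high, low))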

  11. wft4galaxy: a workflow testing tool for galaxy.

    Science.gov (United States)

    Piras, Marco Enrico; Pireddu, Luca; Zanetti, Gianluigi

    2017-12-01

    Workflow managers for scientific analysis provide a high-level programming platform facilitating standardization, automation, collaboration and access to sophisticated computing resources. The Galaxy workflow manager provides a prime example of this type of platform. As compositions of simpler tools, workflows effectively comprise specialized computer programs implementing often very complex analysis procedures. To date, no simple way to automatically test Galaxy workflows and ensure their correctness has appeared in the literature. With wft4galaxy we offer a tool to bring automated testing to Galaxy workflows, making it feasible to bring continuous integration to their development and ensuring that defects are detected promptly. wft4galaxy can be easily installed as a regular Python program or launched directly as a Docker container, the latter reducing installation effort to a minimum. Available at https://github.com/phnmnl/wft4galaxy under the Academic Free License v3.0. marcoenrico.piras@crs4.it.

  12. A Multi-Dimensional Classification Model for Scientific Workflow Characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Ramakrishnan, Lavanya; Plale, Beth

    2010-04-05

    Workflows have been used to model repeatable tasks or operations in manufacturing, business processes, and software. In recent years, workflows have increasingly been used for the orchestration of science discovery tasks that use distributed resources and web services environments through resource models such as grid and cloud computing. Workflows have disparate requirements and constraints that affect how they might be managed in distributed environments. In this paper, we present a multi-dimensional classification model illustrated by workflow examples obtained through a survey of scientists from different domains, including bioinformatics and biomedicine, weather and ocean modeling, and astronomy, detailing their data and computational requirements. The survey results and classification model contribute to a high-level understanding of scientific workflows.

  13. Digital Workflow for Computer-Guided Implant Surgery in Edentulous Patients: A Case Report.

    Science.gov (United States)

    Oh, Ji-Hyeon; An, Xueyin; Jeong, Seung-Mi; Choi, Byung-Ho

    2017-12-01

    The purpose of this article was to describe a fully digital workflow used to perform computer-guided flapless implant placement in an edentulous patient without the use of conventional impressions, models, or a radiographic guide. Digital data for the workflow were acquired using an intraoral scanner and cone-beam computed tomography (CBCT). The image fusion of the intraoral scan data and CBCT data was performed by matching resin markers placed in the patient's mouth. The definitive digital data were used to design a prosthetically driven implant position, surgical template, and computer-aided design and computer-aided manufacturing fabricated fixed dental prosthesis. The authors believe this is the first published case describing such a technique in computer-guided flapless implant surgery for edentulous patients. Copyright © 2017 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  14. Teaching workflow analysis and lean thinking via simulation: a formative evaluation.

    Science.gov (United States)

    Campbell, Robert James; Gantt, Laura; Congdon, Tamara

    2009-01-01

    This article presents the rationale for the design and development of a video simulation used to teach lean thinking and workflow analysis to health services and health information management students enrolled in a course on the management of health information. The discussion includes a description of the design process, a brief history of the use of simulation in healthcare, and an explanation of how video simulation can be used to generate experiential learning environments. Based on the results of a survey given to 75 students as part of a formative evaluation, the video simulation was judged effective because it allowed students to visualize a real-world process (concrete experience), contemplate the scenes depicted in the video along with the concepts presented in class in a risk-free environment (reflection), develop hypotheses about why problems occurred in the workflow process (abstract conceptualization), and develop solutions to redesign a selected process (active experimentation).

  15. Create, run, share, publish, and reference your LC-MS, FIA-MS, GC-MS, and NMR data analysis workflows with the Workflow4Metabolomics 3.0 Galaxy online infrastructure for metabolomics.

    Science.gov (United States)

    Guitton, Yann; Tremblay-Franco, Marie; Le Corguillé, Gildas; Martin, Jean-François; Pétéra, Mélanie; Roger-Mele, Pierrick; Delabrière, Alexis; Goulitquer, Sophie; Monsoor, Misharl; Duperier, Christophe; Canlet, Cécile; Servien, Rémi; Tardivel, Patrick; Caron, Christophe; Giacomoni, Franck; Thévenot, Etienne A

    2017-12-01

    Metabolomics is a key approach in modern functional genomics and systems biology. Due to the complexity of metabolomics data, the variety of experimental designs, and the multiplicity of bioinformatics tools, providing experimenters with a simple and efficient resource to conduct comprehensive and rigorous analysis of their data is of utmost importance. In 2014, we launched the Workflow4Metabolomics (W4M; http://workflow4metabolomics.org) online infrastructure for metabolomics built on the Galaxy environment, which offers user-friendly features to build and run data analysis workflows including preprocessing, statistical analysis, and annotation steps. Here we present the new W4M 3.0 release, which contains twice as many tools as the first version, and provides two features which are, to our knowledge, unique among online resources. First, data from the four major metabolomics technologies (i.e., LC-MS, FIA-MS, GC-MS, and NMR) can be analyzed on a single platform. By using three studies in human physiology, alga evolution, and animal toxicology, we demonstrate how the 40 available tools can be easily combined to address biological issues. Second, the full analysis (including the workflow, the parameter values, the input data and output results) can be referenced with a permanent digital object identifier (DOI). Publication of data analyses is of major importance for robust and reproducible science. Furthermore, the publicly shared workflows are of high-value for e-learning and training. The Workflow4Metabolomics 3.0 e-infrastructure thus not only offers a unique online environment for analysis of data from the main metabolomics technologies, but it is also the first reference repository for metabolomics workflows. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Workflow Of Socialization Media Creation About Ad Aware For Teenagers In Junior High School

    OpenAIRE

    Ahmad Faiz Muntazori; Winny Gunarti W.W.; Rina Wahyu Winarni

    2015-01-01

    Ad consumption among teenagers may influence their consumer lifestyles. To anticipate this, a medium for socialization about ad awareness needs to be created in schools, so that teenagers can consume advertising selectively. This study used a qualitative approach and design sociology to describe the workflow of creating socialization media in a junior high school environment. As part of the solution to the problem among teenagers, this study also formulates attitudes for t...

  17. The Integration of Product Data with Workflow Management Systems Through a Common Data Model

    CERN Document Server

    Kovács, Z

    1999-01-01

    Traditionally, product models and their definitions have been handled separately from process models and their definitions. In industry, each has been managed by database systems defined for its specific domain, e.g. Product Data Management (PDM) for product definitions and Workflow Management (WfM) for process definitions. There is little or no overlap between these two views of systems, even though product and process information interact over the complete life cycle from design to production...

  18. Simplified Toolbar to Accelerate Repeated Tasks (START) for ArcGIS: Optimizing Workflows in Humanitarian Demining

    OpenAIRE

    Lacroix, Pierre Marcel Anselme; De Roulet, Pablo; Ray, Nicolas

    2014-01-01

    This paper presents START (Simplified Toolbar to Accelerate Repeated Tasks), a new, freely downloadable ArcGIS extension designed for non-expert GIS users. START was developed jointly by the Geneva International Centre for Humanitarian Demining (GICHD) and the University of Geneva to support frequent workflows relating to mine action. START brings together a series of basic ArcGIS tools in one toolbar and provides new geoprocessing, geometry and database management functions. The toolbar oper...

  19. BioMoby extensions to the Taverna workflow management and enactment software

    Directory of Open Access Journals (Sweden)

    Senger Martin

    2006-11-01

    Full Text Available Background As biology becomes an increasingly computational science, it is critical that we develop software tools that support not only bioinformaticians, but also bench biologists in their exploration of the vast and complex data-sets that continue to build from international genomic, proteomic, and systems-biology projects. The BioMoby interoperability system was created with the goal of facilitating the movement of data from one Web-based resource to another to fulfill the requirements of non-expert bioinformaticians. In parallel with the development of BioMoby, the European myGrid project was designing Taverna, a bioinformatics workflow design and enactment tool. Here we describe the marriage of these two projects in the form of a Taverna plug-in that provides access to many of BioMoby's features through the Taverna interface. Results The exposed BioMoby functionality aids in the design of "sensible" BioMoby workflows, aids in pipelining BioMoby and non-BioMoby-based resources, and ensures that end-users need only a minimal understanding of both BioMoby, and the Taverna interface itself. Users are guided through the construction of syntactically and semantically correct workflows through plug-in calls to the Moby Central registry. Moby Central provides a menu of only those BioMoby services capable of operating on the data-type(s) that exist at any given position in the workflow. Moreover, the plug-in automatically and correctly connects a selected service into the workflow such that users are not required to understand the nature of the inputs or outputs for any service, leaving them to focus on the biological meaning of the workflow they are constructing, rather than the technical details of how the services will interoperate. Conclusion With the availability of the BioMoby plug-in to Taverna, we believe that BioMoby-based Web Services are now significantly more useful and accessible to bench scientists than are more traditional

  20. Examining daily activity routines of older adults using workflow.

    Science.gov (United States)

    Chung, Jane; Ozkaynak, Mustafa; Demiris, George

    2017-07-01

    We evaluated the value of workflow analysis supported by a novel visualization technique to better understand the daily routines of older adults and highlight their patterns of daily activities and normal variability in physical functions. We used a self-reported activity diary to obtain data from six community-dwelling older adults for 14 consecutive days. Workflow for the daily routine was analyzed using the EventFlow tool, which aggregates workflow information to highlight patterns and variabilities. A total of 1453 events were included in the data analysis. To demonstrate the patterns and variability of each individual's daily activities, participant activity workflows were visualized and compared. The workflow analysis revealed great variability in activity types, regularity, frequency, duration, and timing of performing certain activities across individuals. Also, when the workflow approach was applied to spatial information about activities, the analysis provided meaningful data on individuals' mobility across different levels of life space, from home to community. Results suggest that using workflows to characterize the daily activities of older adults will be helpful for clinicians and researchers in understanding their daily routines and preparing education and prevention strategies tailored to each individual's activity level. This tool also has the potential to be integrated into consumer informatics technologies, such as patient portals or personal health records, so that consumers may be encouraged to become actively involved in monitoring and managing their health. Copyright © 2017 Elsevier Inc. All rights reserved.
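
    The kind of aggregation such an analysis performs can be pictured with a small Python sketch over a hypothetical activity diary; EventFlow itself offers far richer aggregation and visualization:

        from collections import defaultdict

        # Hypothetical diary entries: (day, hour, activity)
        diary = [
            (1, 7, "breakfast"), (1, 9, "walk"), (1, 13, "errands"),
            (2, 7, "breakfast"), (2, 10, "walk"), (2, 14, "nap"),
        ]

        # Collect the hours at which each activity occurs across days,
        # exposing regularity (breakfast) versus variability (walk).
        hours = defaultdict(list)
        for _day, hour, activity in diary:
            hours[activity].append(hour)

        for activity, hs in sorted(hours.items()):
            print(f"{activity}: hours {hs}, spread {max(hs) - min(hs)}h")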

  1. Deploying and sharing U-Compare workflows as web services.

    Science.gov (United States)

    Kontonatsios, Georgios; Korkontzelos, Ioannis; Kolluru, Balakrishna; Thompson, Paul; Ananiadou, Sophia

    2013-02-18

    U-Compare is a text mining platform that allows the construction, evaluation and comparison of text mining workflows. U-Compare contains a large library of components that are tuned to the biomedical domain. Users can rapidly develop biomedical text mining workflows by mixing and matching U-Compare's components. Workflows developed using U-Compare can be exported and sent to other users who, in turn, can import and re-use them. However, the resulting workflows are standalone applications, i.e., software tools that run and are accessible only via a local machine, and that can only be run with the U-Compare platform. We address the above issues by extending U-Compare to convert standalone workflows into web services automatically, via a two-click process. The resulting web services can be registered on a central server and made publicly available. Alternatively, users can make web services available on their own servers, after installing the web application framework, which is part of the extension to U-Compare. We have performed a user-oriented evaluation of the proposed extension, by asking users who have tested the enhanced functionality of U-Compare to complete questionnaires that assess its functionality, reliability, usability, efficiency and maintainability. The results obtained reveal that the new functionality is well received by users. The web services produced by U-Compare are built on top of open standards, i.e., REST and SOAP protocols, and therefore, they are decoupled from the underlying platform. Exported workflows can be integrated with any application that supports these open standards. We demonstrate how the newly extended U-Compare enhances the cross-platform interoperability of workflows, by seamlessly importing a number of text mining workflow web services exported from U-Compare into Taverna, i.e., a generic scientific workflow construction platform.
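
    Once a workflow is exposed as a web service, any REST-capable client can invoke it. The Python sketch below uses the third-party requests library against a hypothetical endpoint; the URL and JSON fields are illustrative, not U-Compare's actual interface:

        import requests  # pip install requests

        resp = requests.post(
            "https://example.org/services/ner-workflow",
            json={"text": "BRCA1 mutations are associated with breast cancer."},
            timeout=30,
        )
        resp.raise_for_status()
        print(resp.json())  # e.g. entity annotations produced by the workflow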

  2. Deploying and sharing U-Compare workflows as web services

    Science.gov (United States)

    2013-01-01

    Background U-Compare is a text mining platform that allows the construction, evaluation and comparison of text mining workflows. U-Compare contains a large library of components that are tuned to the biomedical domain. Users can rapidly develop biomedical text mining workflows by mixing and matching U-Compare’s components. Workflows developed using U-Compare can be exported and sent to other users who, in turn, can import and re-use them. However, the resulting workflows are standalone applications, i.e., software tools that run and are accessible only via a local machine, and that can only be run with the U-Compare platform. Results We address the above issues by extending U-Compare to convert standalone workflows into web services automatically, via a two-click process. The resulting web services can be registered on a central server and made publicly available. Alternatively, users can make web services available on their own servers, after installing the web application framework, which is part of the extension to U-Compare. We have performed a user-oriented evaluation of the proposed extension, by asking users who have tested the enhanced functionality of U-Compare to complete questionnaires that assess its functionality, reliability, usability, efficiency and maintainability. The results obtained reveal that the new functionality is well received by users. Conclusions The web services produced by U-Compare are built on top of open standards, i.e., REST and SOAP protocols, and therefore, they are decoupled from the underlying platform. Exported workflows can be integrated with any application that supports these open standards. We demonstrate how the newly extended U-Compare enhances the cross-platform interoperability of workflows, by seamlessly importing a number of text mining workflow web services exported from U-Compare into Taverna, i.e., a generic scientific workflow construction platform. PMID:23419017

  3. From Paper Based Clinical Practice Guidelines to Declarative Workflow Management

    DEFF Research Database (Denmark)

    Lyng, Karen Marie; Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2009-01-01

    We present a field study of oncology workflow, involving doctors, nurses and pharmacists at Danish hospitals, and discuss the obstacles, enablers and challenges for the use of computer-based clinical practice guidelines. Related to the CIGDec approach of Pesic and van der Aalst, we then describe how a sub-workflow can be described in a declarative workflow management system: the Resultmaker Online Consultant (ROC). The example demonstrates that declarative primitives allow the paper-based flowchart to be naturally extended to an executable model without introducing a complex cyclic control-flow graph.

  4. What is needed for effective open access workflows?

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Institutions and funders are pushing open access forward with ever new guidelines and policies. Since institutional repositories are important maintainers of green open access, they should support easy and fast workflows for researchers and libraries to release publications. Based on the requirements specifications of researchers, libraries and publishers, possible supporting software extensions are discussed. What does a typical workflow look like? What has to be considered by the researchers and by the editors in the library before releasing a green open access publication? Where and how can software support and improve existing workflows?

  5. Editorial and Technological Workflow Tools to Promote Website Quality

    Directory of Open Access Journals (Sweden)

    Emily G. Morton-Owens

    2011-09-01

    Full Text Available Library websites are an increasingly visible representation of the library as an institution, which makes website quality an important way to communicate competence and trustworthiness to users. A website editorial workflow is one way to enforce a process and ensure quality. In a workflow, users receive roles, like author or editor, and content travels through various stages in which grammar, spelling, tone, and format are checked. One library used a workflow system to involve librarians in the creation of content. This system, implemented in Drupal, an open-source content management system, solved problems of coordination, quality, and comprehensiveness that existed on the library’s earlier, static website.

  6. Distributed Workflow Service Composition Based on CTR Technology

    Science.gov (United States)

    Feng, Zhilin; Ye, Yanming

    Recently, WS-BPEL has gradually become the basis of a standard for web service description and composition. However, WS-BPEL cannot adequately describe distributed workflow services, because it lacks the necessary expressive power and formal semantics. This paper presents a novel method for modeling distributed workflow service composition with Concurrent TRansaction logic (CTR). The syntactic structures of WS-BPEL and CTR are analyzed, and new rules for mapping WS-BPEL into CTR are given. A case study is put forward to show that the proposed method is appropriate for modeling workflow business services in distributed environments.

  7. Leveraging process data from BPM cloud-based workflows

    Directory of Open Access Journals (Sweden)

    Deasún Ó Conchúir

    2016-11-01

    Full Text Available This paper shares the experience of a virtual team of knowledge workers which coordinates its repetitive production work across 12 time zones using cloud-based workflows. A sample of process data and the utility that can be obtained from it are discussed, and recommendations are provided for the management of virtual teams using cloud-based workflows. In addition, possible directions for future research relating to virtual knowledge work teams using cloud workflows are suggested. Some references to related experience for further reading are also given.

  8. Soarian--workflow management applied for health care.

    Science.gov (United States)

    Haux, R; Seggewies, C; Baldauf-Sobez, W; Kullmann, P; Reichert, H; Luedecke, L; Seibold, H

    2003-01-01

    To describe and comment on the functionality and architecture of the software product Soarian, developed by Siemens, to identify key differentiators with respect to related products, and to comment on predecessor systems and beta versions. This was done in the framework of a conference on health information systems of the IMIA, by analyzing the existing literature, a site visit to a predecessor system at Haukeland Sykehus, Bergen, and a pilot of a beta version at the Erlangen University Medical Center, elaborating on major characteristics in discussion rounds. Soarian is a functionally comprehensive, clinically oriented software product that supports health care processes and is intended for health care professional workstations. It is a new software product, designed and written from scratch. Three major key differentiators were identified in comparison to related software products: Soarian's workflow engine, its embedded analytics, and its 'smart' user interface. The targeted installation time is stated to be 12 months or less. Soarian has a good chance of becoming one of the major software products for health care professional workstations in the international market to support patient-centered, shared care. Its global design may help to better support and maintain national or language-specific versions. The first installations of Soarian will be critical, as they will show how the system is accepted. To use such software products efficiently, organizational aspects within hospitals as well as between health care institutions have to be considered, e.g. strategic IT planning.

  9. Integrating Behavioral Health in Primary Care Using Lean Workflow Analysis: A Case Study

    Science.gov (United States)

    van Eeghen, Constance; Littenberg, Benjamin; Holman, Melissa D.; Kessler, Rodger

    2016-01-01

    Background Primary care offices are integrating behavioral health (BH) clinicians into their practices. Implementing such a change is complex, difficult, and time consuming. Lean workflow analysis may be an efficient, effective, and acceptable method for integration. Objective Observe BH integration into primary care and measure its impact. Design Prospective, mixed methods case study in a primary care practice. Measurements Change in treatment initiation (referrals generating BH visits within the system). Secondary measures: primary care visits resulting in BH referrals, referrals resulting in scheduled appointments, time from referral to scheduled appointment, and time from referral to first visit. Providers and staff were surveyed on the Lean method. Results Referrals increased from 23 to 37/1000 visits. Reported strengths of the Lean approach included workflow improvement, system perspective, and project success. Further evaluation is indicated. PMID:27170796

  10. McRunjob: A High Energy Physics Workflow Planner for Grid Production Processing

    CERN Document Server

    Graham, G E; Bertram, I; Graham, Gregory E.; Evans, Dave; Bertram, Iain

    2003-01-01

    McRunjob is a powerful grid workflow manager used to manage the generation of large numbers of production processing jobs in High Energy Physics. In use at both the DZero and CMS experiments, McRunjob has been used to manage large Monte Carlo production processing since 1999 and is being extended to uses in regular production processing for analysis and reconstruction. Described at CHEP 2001, McRunjob converts core metadata into jobs submittable in a variety of environments. The powerful core metadata description language includes methods for converting the metadata into persistent forms, job descriptions, multi-step workflows, and data provenance information. The language features allow for structure in the metadata by including full expressions, namespaces, functional dependencies, site specific parameters in a grid environment, and ontological definitions. It also has simple control structures for parallelization of large jobs. McRunjob features a modular design which allows for easy expansion to new job d...

  11. Enhanced reproducibility of SADI web service workflows with Galaxy and Docker.

    Science.gov (United States)

    Aranguren, Mikel Egaña; Wilkinson, Mark D

    2015-01-01

    Semantic Web technologies have been widely applied in the life sciences, for example by data providers such as OpenLifeData and through web services frameworks such as SADI. The recently reported OpenLifeData2SADI project offers access to the vast OpenLifeData data store through SADI services. This article describes how to merge data retrieved from OpenLifeData2SADI with other SADI services using the Galaxy bioinformatics analysis platform, thus making this semantic data more amenable to complex analyses. This is demonstrated using a working example, which is made distributable and reproducible through a Docker image that includes SADI tools, along with the data and workflows that constitute the demonstration. The combination of Galaxy and Docker offers a solution for faithfully reproducing and sharing complex data retrieval and analysis workflows based on the SADI Semantic web service design patterns.

  12. Workflow Interruptions and Failed Action Regulation in Surgery Personnel

    Directory of Open Access Journals (Sweden)

    Achim Elfering

    2014-03-01

    Conclusion: Task interruptions caused by malfunction and organizational constraints are likely to trigger errors in surgery. Work redesign is recommended to reduce workflow interruptions by malfunction and regulatory constraints.

  13. RAMPART: a workflow management system for de novo genome assembly

    National Research Council Canada - National Science Library

    Mapleson, Daniel; Drou, Nizar; Swarbreck, David

    2015-01-01

    ... assembly for publication. Herein, we present RAMPART, a configurable workflow management system for de novo genome assembly, which helps the user identify combinations of third-party tools and settings that provide good...

  14. Common Workflow Service: Standards Based Solution for Managing Operational Processes

    Science.gov (United States)

    Tinio, A. W.; Hollins, G. A.

    2017-06-01

    The Common Workflow Service is a collaborative and standards-based solution for managing mission operations processes using techniques from the Business Process Management (BPM) discipline. This presentation describes the CWS and its benefits.

  15. FOSS geospatial libraries in scientific workflow environments: experiences and directions

    CSIR Research Space (South Africa)

    McFerren, G

    2011-07-01

    Full Text Available of experiments. In the context of three sets of research (wildfire research, flood modelling and the linking of disease outbreaks to multi-scale environmental conditions), we describe our efforts to provide geospatial capability for scientific workflow software...

  16. A Community-Driven Workflow Recommendation and Reuse Infrastructure Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Promote and encourage process and workflow reuse within NASA Earth eXchange (NEX) by developing a proactive recommendation technology based on collective NEX...

  17. Text mining meets workflow: linking U-Compare with Taverna

    Science.gov (United States)

    Kano, Yoshinobu; Dobson, Paul; Nakanishi, Mio; Tsujii, Jun'ichi; Ananiadou, Sophia

    2010-01-01

    Summary: Text mining from the biomedical literature is of increasing importance, yet it is not easy for the bioinformatics community to create and run text mining workflows due to the lack of accessibility and interoperability of the text mining resources. The U-Compare system provides a wide range of bio text mining resources in a highly interoperable workflow environment where workflows can very easily be created, executed, evaluated and visualized without coding. We have linked U-Compare to Taverna, a generic workflow system, to expose text mining functionality to the bioinformatics community. Availability: http://u-compare.org/taverna.html, http://u-compare.org Contact: kano@is.s.u-tokyo.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20709690

  18. Network resource control for grid workflow management systems

    NARCIS (Netherlands)

    Strijkers, R.J.; Cristea, M.; Korkhov, V.; Marchal, D.; Belloum, A.; Laat, C.de; Meijer, R.J.

    2010-01-01

    Grid workflow management systems automate the orchestration of scientific applications with large computational and data processing needs, but lack control over network resources. Consequently, the management system cannot prevent multiple communication-intensive applications from competing for network resources.

  19. Workflow-enabled distributed component-based information architecture for digital medical imaging enterprises.

    Science.gov (United States)

    Wong, Stephen T C; Tjandra, Donny; Wang, Huili; Shen, Weimin

    2003-09-01

    Few information systems today offer a flexible means to define and manage the automated part of radiology processes, which provide clinical imaging services for the entire healthcare organization. Even fewer provide a coherent architecture that can easily cope with heterogeneity and the inevitable local adaptation of applications, and that can integrate clinical and administrative information to aid better clinical, operational, and business decisions. We describe an innovative enterprise architecture of image information management systems to fill these needs. Such a system is based on the interplay of production workflow management, distributed object computing, Java and Web techniques, and in-depth domain knowledge of radiology operations. Our design adopts the "4+1" architectural view approach. In this new architecture, PACS and RIS become one, while user interaction can be automated by customized workflow processes. Clinical service applications are implemented as active components. They can be substituted by locally adapted applications and can be replicated for fault tolerance and load balancing. Furthermore, the workflow-enabled digital radiology system provides powerful query and statistical functions for managing resources and improving productivity. This paper will potentially lead to a new direction of image information management. We illustrate the innovative design with examples taken from an implemented system.

  20. [Guided and computer-assisted implant surgery and prosthetic: The continuous digital workflow].

    Science.gov (United States)

    Pascual, D; Vaysse, J

    2016-02-01

    New continuous digital workflow protocols of guided and computer-assisted implant surgery improve the accuracy of implant positioning. The design of the future prosthesis is based on the available prosthetic space, gingival height and occlusal relationship with the opposing and adjacent teeth. The implant position and length depend on bone volume, density and quality, gingival height, tooth-implant and implant-implant distances, implant parallelism, axis and type of the future prosthesis. The crown modeled in the software therefore serves as a guide to the future implant axis, and not the reverse. The guide is made by 3D printing. The software determines the surgical protocol with the drilling sequences. The unitary or plural prosthesis, modeled in the software and built before surgery, is loaded directly after implant placement, if needed. These protocols allow full continuity of the digital workflow. The software gives the surgeon and the dental technician total freedom in designing the prosthetic-surgical guide and positioning the implants. The prosthetic project, occlusal and aesthetic, taking the bony and surgical constraints into account, is optimized. The implant surgery is simplified and becomes less "stressful" for the patient and the surgeon. Guided and computer-assisted surgery with a continuous digital workflow is becoming the technique of choice for improving the accuracy and quality of implant rehabilitation. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  1. Reconstruction of Mandible: A Fully Digital Workflow From Visualized Iliac Bone Grafting to Implant Restoration.

    Science.gov (United States)

    Tian, Taoran; Zhang, Tao; Ma, Quanquan; Zhang, Qi; Cai, Xiaoxiao

    2017-07-01

    Although digital aids can help surgeons compensate for the shortcomings of traditional mandibular reconstruction techniques to perform surgery more precisely and effectively, the use of these digital techniques has often been fragmented, divided, and incomplete. This article describes the workflow of a fully digital mandibular reconstruction to explore the proper indications and discusses innovations based on the accuracy and effectiveness of digital techniques. A restoration-oriented mandibular reconstruction was performed by applying different digital techniques. Preoperative virtual surgery and rapid prototyping were used to aid the vascularized iliac bone graft surgery, which offered a solid basis for the ensuing treatment. Subsequently, implant rehabilitation was accomplished with the assistance of computer-assisted design and manufacture, laser treatment, and selective laser melting techniques. The workflow of the fully digital mandibular reconstruction successfully achieved a restoration-oriented treatment. These predictable, accurate, and effective digital techniques improved the consistency of pretreatment design and follow-up treatment. The treatment sequence achieved high predictability and reproducibility owing to the use of digital techniques. This study shows that a digital workflow can be predictable, accurate, and effective, which suggests that it could be a valid digital protocol for developing a treatment sequence for patients with jaw defects caused by trauma, congenital anomalies, or mandibular tumor resection. Copyright © 2017 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  2. Aligning HST Images to Gaia: A Faster Mosaicking Workflow

    Science.gov (United States)

    Bajaj, V.

    2017-11-01

    We present a fully programmatic workflow for aligning HST images using the high-quality astrometry provided by Gaia Data Release 1. Code provided in a Jupyter Notebook works through this procedure, including parsing the data to determine the query area parameters, querying Gaia for the coordinate catalog, and using the catalog with TweakReg as the reference catalog. This workflow greatly simplifies the normally time-consuming process of aligning HST images, especially those taken as part of mosaics.
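
    The procedure outlined above can be sketched in Python; the field coordinates, query radius, file pattern, and TweakReg settings below are illustrative assumptions rather than the notebook's actual values:

      # Sketch of the Gaia-based alignment procedure (illustrative, not the
      # original notebook): query Gaia DR1 around the image footprint, write
      # the coordinates to a flat catalog, and feed it to TweakReg.
      from astroquery.gaia import Gaia
      from drizzlepac import tweakreg

      ra0, dec0, radius = 150.1, 2.2, 0.1  # assumed field center (deg) and radius
      job = Gaia.launch_job_async(
          "SELECT ra, dec FROM gaiadr1.gaia_source "
          "WHERE CONTAINS(POINT('ICRS', ra, dec), "
          f"CIRCLE('ICRS', {ra0}, {dec0}, {radius})) = 1")
      catalog = job.get_results()
      catalog['ra', 'dec'].write('gaia_refcat.txt',
                                 format='ascii.commented_header')

      # Align all exposures to the Gaia catalog instead of to each other.
      tweakreg.TweakReg('*_flc.fits', refcat='gaia_refcat.txt',
                        updatehdr=True, interactive=False)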

  3. An interpretive investigation of trust and workflow in advertising communities

    OpenAIRE

    Chim, Jimmy Chi Lung

    2016-01-01

    Adopting a socio-economic perspective and a multimethod field research approach, this thesis investigates the correlation between trust and workflow in advertising communities of practice. Using a semiotic mode of analysis, a comparative examination of offline and online communities will be conducted to inspect the practices of trust when multiple stakeholders follow the creative workflow process to fulfil creative briefs. The motivation for the research stems from a current lack of underst...

  4. Knowledge based approach to flexible workflow management systems

    OpenAIRE

    Lee, Habin

    1999-01-01

    This thesis was submitted for the degree of Doctor of Philosophy and awarded by the Korea Advanced Institute of Science and Technology (KAIST). Today's business environments are dynamic and uncertain. To support business processes effectively in such contexts, workflow management systems must be able to adapt. In this dissertation, the workflow is redefined in concept and represented with a set of business rules. Business rules ...

  5. Enhanced Phosphoproteomic Profiling Workflow For Growth Factor Signaling Analysis

    DEFF Research Database (Denmark)

    Sylvester, Marc; Burbridge, Mike; Leclerc, Gregory

    2010-01-01

    A549 lung carcinoma cells were used as a model and stimulated with hepatocyte growth factor, epidermal growth factor or fibroblast growth factor. We employed a quick protein digestion workflow with spin filters without using urea. Phosphopeptides in general were enriched by sequential elution from...... transfer dissociation adds confidence in modification site assignment. The workflow is relatively simple but the integration of complementary techniques leads to a deeper insight into cellular signaling networks and the potential pharmacological intervention thereof....

  6. Kronos: a workflow assembler for genome analytics and informatics

    Science.gov (United States)

    Taghiyar, M. Jafar; Rosner, Jamie; Grewal, Diljot; Grande, Bruno M.; Aniba, Radhouane; Grewal, Jasleen; Boutros, Paul C.; Morin, Ryan D.

    2017-01-01

    Abstract Background: The field of next-generation sequencing informatics has matured to a point where algorithmic advances in sequence alignment and individual feature detection methods have stabilized. Practical and robust implementation of complex analytical workflows (where such tools are structured into “best practices” for automated analysis of next-generation sequencing datasets) still requires significant programming investment and expertise. Results: We present Kronos, a software platform for facilitating the development and execution of modular, auditable, and distributable bioinformatics workflows. Kronos obviates the need for explicit coding of workflows by compiling a text configuration file into executable Python applications. Creating new analysis modules, however, still requires programming. The framework of each workflow includes a run manager to execute the encoded workflows locally (or on a cluster or cloud), parallelize tasks, and log all runtime events. The resulting workflows are highly modular and configurable by construction, facilitating flexible and extensible meta-applications that can be modified easily through configuration file editing. The workflows are fully encoded for ease of distribution and can be instantiated on external systems, a step toward reproducible research and comparative analyses. We introduce a framework for building Kronos components that function as shareable, modular nodes in Kronos workflows. Conclusions: The Kronos platform provides a standard framework for developers to implement custom tools, reuse existing tools, and contribute to the community at large. Kronos is shipped with both Docker and Amazon Web Services Machine Images. It is free, open source, and available through the Python Package Index and at https://github.com/jtaghiyar/kronos. PMID:28655203
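
    The general idea of compiling a text configuration into an executable pipeline can be sketched in a few lines; the YAML schema and driver below are our own illustration, not Kronos's actual configuration format or code:

      # Conceptual sketch only -- NOT Kronos's real schema: a YAML document
      # declares tasks and their dependencies, and a tiny driver runs the
      # commands in dependency order.
      import subprocess
      import yaml  # third-party: pip install pyyaml

      CONFIG = """
      tasks:
        fetch:  {cmd: "echo fetch data",   after: []}
        align:  {cmd: "echo align reads",  after: [fetch]}
        report: {cmd: "echo write report", after: [align]}
      """

      def run(tasks):
          done = set()
          while len(done) < len(tasks):
              for name, spec in tasks.items():
                  if name not in done and all(d in done for d in spec["after"]):
                      subprocess.run(spec["cmd"], shell=True, check=True)
                      done.add(name)

      run(yaml.safe_load(CONFIG)["tasks"])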

  7. Automation of the digitization workflow of the NTK

    OpenAIRE

    Řihák, Jakub

    2013-01-01

    This diploma thesis focuses on the automation of the digitization workflow in the National Library of Technology, Prague, Czech Republic. It examines possibilities for automating digitization processes by means of scripts written in the Perl programming language and the Apache Ant build tool. The advantages and disadvantages of both solutions are analyzed, as well as their suitability for automating the digitization workflow. Based on the comparison of both solutions, the scripts in ...

  8. Dynamic Voltage Frequency Scaling Simulator for Real Workflows Energy-Aware Management in Green Cloud Computing.

    Directory of Open Access Journals (Sweden)

    Iván Tomás Cotes-Ruiz

    Full Text Available Nowadays, the growing computational capabilities of Cloud systems rely on the reduction of the consumed power of their data centers to make them sustainable and economically profitable. The efficient management of computing resources is at the heart of any energy-aware data center and of special relevance is the adaptation of its performance to workload. Intensive computing applications in diverse areas of science generate complex workload called workflows, whose successful management in terms of energy saving is still at its beginning. WorkflowSim is currently one of the most advanced simulators for research on workflows processing, offering advanced features such as task clustering and failure policies. In this work, an expected power-aware extension of WorkflowSim is presented. This new tool integrates a power model based on a computing-plus-communication design to allow the optimization of new management strategies in energy saving considering computing, reconfiguration and networks costs as well as quality of service, and it incorporates the preeminent strategy for on host energy saving: Dynamic Voltage Frequency Scaling (DVFS). The simulator is designed to be consistent in different real scenarios and to include a wide repertory of DVFS governors. Results showing the validity of the simulator in terms of resources utilization, frequency and voltage scaling, power, energy and time saving are presented. Also, results achieved by the intra-host DVFS strategy with different governors are compared to those of the data center using a recent and successful DVFS-based inter-host scheduling strategy as overlapped mechanism to the DVFS intra-host technique.

  9. Dynamic Voltage Frequency Scaling Simulator for Real Workflows Energy-Aware Management in Green Cloud Computing.

    Science.gov (United States)

    Cotes-Ruiz, Iván Tomás; Prado, Rocío P; García-Galán, Sebastián; Muñoz-Expósito, José Enrique; Ruiz-Reyes, Nicolás

    2017-01-01

    Nowadays, the growing computational capabilities of Cloud systems rely on the reduction of the consumed power of their data centers to make them sustainable and economically profitable. The efficient management of computing resources is at the heart of any energy-aware data center and of special relevance is the adaptation of its performance to workload. Intensive computing applications in diverse areas of science generate complex workload called workflows, whose successful management in terms of energy saving is still at its beginning. WorkflowSim is currently one of the most advanced simulators for research on workflows processing, offering advanced features such as task clustering and failure policies. In this work, an expected power-aware extension of WorkflowSim is presented. This new tool integrates a power model based on a computing-plus-communication design to allow the optimization of new management strategies in energy saving considering computing, reconfiguration and networks costs as well as quality of service, and it incorporates the preeminent strategy for on host energy saving: Dynamic Voltage Frequency Scaling (DVFS). The simulator is designed to be consistent in different real scenarios and to include a wide repertory of DVFS governors. Results showing the validity of the simulator in terms of resources utilization, frequency and voltage scaling, power, energy and time saving are presented. Also, results achieved by the intra-host DVFS strategy with different governors are compared to those of the data center using a recent and successful DVFS-based inter-host scheduling strategy as overlapped mechanism to the DVFS intra-host technique.
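
    The physical basis for DVFS savings is the standard CMOS dynamic-power relation P = C·V²·f; the sketch below (our own illustration, not WorkflowSim or its extension) shows why scaling voltage and frequency down trades time for energy:

      # Minimal illustration of why DVFS saves energy (not WorkflowSim code):
      # dynamic power scales as C * V^2 * f, so running a fixed number of
      # cycles slower at reduced voltage costs more time but less energy.
      def dynamic_energy(cycles, capacitance, voltage, freq_hz):
          power_w = capacitance * voltage**2 * freq_hz  # P = C * V^2 * f
          time_s = cycles / freq_hz
          return power_w * time_s                       # E = P * t

      C = 1e-9                                    # assumed switched capacitance (F)
      full = dynamic_energy(2e9, C, 1.2, 2.0e9)   # "performance" operating point
      slow = dynamic_energy(2e9, C, 0.9, 1.0e9)   # scaled-down operating point
      print(f"energy at full speed: {full:.2f} J, scaled down: {slow:.2f} J")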

  10. A Kepler Workflow Tool for Reproducible AMBER GPU Molecular Dynamics.

    Science.gov (United States)

    Purawat, Shweta; Ieong, Pek U; Malmstrom, Robert D; Chan, Garrett J; Yeung, Alan K; Walker, Ross C; Altintas, Ilkay; Amaro, Rommie E

    2017-06-20

    With the drive toward high throughput molecular dynamics (MD) simulations involving ever-greater numbers of simulation replicates run for longer, biologically relevant timescales (microseconds), the need for improved computational methods that facilitate fully automated MD workflows gains more importance. Here we report the development of an automated workflow tool to perform AMBER GPU MD simulations. Our workflow tool capitalizes on the capabilities of the Kepler platform to deliver a flexible, intuitive, and user-friendly environment and the AMBER GPU code for a robust and high-performance simulation engine. Additionally, the workflow tool reduces user input time by automating repetitive processes and facilitates access to GPU clusters, whose high-performance processing power makes simulations of large numerical scale possible. The presented workflow tool facilitates the management and deployment of large sets of MD simulations on heterogeneous computing resources. The workflow tool also performs systematic analysis on the simulation outputs and enhances simulation reproducibility, execution scalability, and MD method development including benchmarking and validation. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  11. A scientific workflow framework for (13)C metabolic flux analysis.

    Science.gov (United States)

    Dalman, Tolga; Wiechert, Wolfgang; Nöh, Katharina

    2016-08-20

    Metabolic flux analysis (MFA) with (13)C labeling data is a high-precision technique to quantify intracellular reaction rates (fluxes). One of the major challenges of (13)C MFA is the interactivity of the computational workflow according to which the fluxes are determined from the input data (metabolic network model, labeling data, and physiological rates). Here, the workflow assembly is inevitably determined by the scientist who has to consider interacting biological, experimental, and computational aspects. Decision-making is context dependent and requires expertise, rendering an automated evaluation process hardly possible. Here, we present a scientific workflow framework (SWF) for creating, executing, and controlling on demand (13)C MFA workflows. (13)C MFA-specific tools and libraries, such as the high-performance simulation toolbox 13CFLUX2, are wrapped as web services and thereby integrated into a service-oriented architecture. Besides workflow steering, the SWF features transparent provenance collection and enables full flexibility for ad hoc scripting solutions. To handle compute-intensive tasks, cloud computing is supported. We demonstrate how the challenges posed by (13)C MFA workflows can be solved with our approach on the basis of two proof-of-concept use cases. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Domain-Specific Languages for Composing Signature Discovery Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Jacob, Ferosh; Gray, Jeff; Wynne, Adam S.; Liu, Yan (Jenny); Baker, Nathan A.

    2012-10-23

    Domain-agnostic signature discovery entails investigation across multiple scientific disciplines. The breadth and cross-disciplinary nature of this work requires that existing executables be integrated with new capabilities into workflows, representing a wide range of user tasks. An algorithm may be written in multiple programming languages for various hardware platforms, and so workflow composition requires integrating executables from any number of remote hosts. This raises an engineering issue on how to generate web service wrappers for these heterogeneous executables and to compose them into a scientific workflow environment (e.g., Taverna). In this paper, we introduce two simple Domain-Specific Languages (DSLs) to automate these processes. Our Service Description Language (SDL) describes key elements of a signature discovery service and automatically generates its implementation code. The Workflow Description Language (WDL) describes the pipeline of services and generates deployable artifacts for the Taverna workflow management system. We demonstrate our approach with a real-world workflow composed of services wrapping remote executables.

  13. CDK-Taverna: an open workflow environment for cheminformatics

    Directory of Open Access Journals (Sweden)

    Zielesny Achim

    2010-03-01

    Full Text Available Abstract Background Small molecules are of increasing interest for bioinformatics in areas such as metabolomics and drug discovery. The recent release of large open access chemistry databases generates a demand for flexible tools to process them and discover new knowledge. To freely support open science based on these data resources, it is desirable for the processing tools to be open source and available for everyone. Results Here we describe a novel combination of the workflow engine Taverna and the cheminformatics library Chemistry Development Kit (CDK), resulting in an open source workflow solution for cheminformatics. We have implemented more than 160 different workers to handle specific cheminformatics tasks. We describe the applications of CDK-Taverna in various usage scenarios. Conclusions The combination of the workflow engine Taverna and the Chemistry Development Kit provides the first open source cheminformatics workflow solution for the biosciences. With the Taverna community working towards a more powerful workflow engine and a more user-friendly user interface, CDK-Taverna has the potential to become a free alternative to existing proprietary workflow tools.

  14. CO2 Storage Feasibility: A Workflow for Site Characterisation

    Directory of Open Access Journals (Sweden)

    Nepveu Manuel

    2015-04-01

    Full Text Available In this paper, we present an overview of the SiteChar workflow model for site characterisation and assessment for CO2 storage. Site characterisation and assessment is required when permits are requested from the legal authorities in the process of starting CO2 storage at a given site. The goal is to assess whether a proposed CO2 storage site can indeed be used for permanent storage while meeting the safety requirements demanded by the European Commission (EC) Storage Directive (Storage Directive 2009/31/EC). Many issues have to be scrutinised, and the workflow presented here is put forward to help organise this complex task efficiently. Three issues are highlighted: communication within the working team and with the authorities; interdependencies in the workflow and feedback loops; and the risk-based character of the workflow. A general overview (helicopter view) of the workflow is given; the issues involved in communication and the risk assessment process are described in more detail. The workflow as described has been tested within the SiteChar project on five potential storage sites throughout Europe. This resulted in a list of key aspects of site characterisation which can help prepare and focus new site characterisation studies.

  15. DNA qualification workflow for next generation sequencing of histopathological samples.

    Directory of Open Access Journals (Sweden)

    Michele Simbolo

    Full Text Available Histopathological samples are a treasure-trove of DNA for clinical research. However, the quality of DNA can vary depending on the source or extraction method applied. Thus a standardized and cost-effective workflow for the qualification of DNA preparations is essential to guarantee interlaboratory reproducible results. The qualification process consists of the quantification of double strand DNA (dsDNA) and the assessment of its suitability for downstream applications, such as high-throughput next-generation sequencing. We tested the two most frequently used instrumentations to define their role in this process: NanoDrop, based on UV spectroscopy, and Qubit 2.0, which uses fluorochromes specifically binding dsDNA. Quantitative PCR (qPCR) was used as the reference technique as it simultaneously assesses DNA concentration and suitability for PCR amplification. We used 17 genomic DNAs from 6 fresh-frozen (FF) tissues, 6 formalin-fixed paraffin-embedded (FFPE) tissues, 3 cell lines, and 2 commercial preparations. Intra- and inter-operator variability was negligible, and intra-methodology variability was minimal, while consistent inter-methodology divergences were observed. In fact, NanoDrop measured DNA concentrations higher than Qubit and its consistency with dsDNA quantification by qPCR was limited to high molecular weight DNA from FF samples and cell lines, where total DNA and dsDNA quantity virtually coincide. In partially degraded DNA from FFPE samples, only Qubit proved highly reproducible and consistent with qPCR measurements. Multiplex PCR amplifying 191 regions of 46 cancer-related genes was designated the downstream application, using 40 ng dsDNA from FFPE samples calculated by Qubit. All but one sample produced amplicon libraries suitable for next-generation sequencing. NanoDrop UV-spectrum verified contamination of the unsuccessful sample. In conclusion, as qPCR has high costs and is labor intensive, an alternative effective standard

  16. Anima: Modular workflow system for comprehensive image data analysis

    Directory of Open Access Journals (Sweden)

    Ville eRantanen

    2014-07-01

    Full Text Available Modern microscopes produce vast amounts of image data, and computational methods are needed to analyze and interpret these data. Furthermore, a single image analysis project may require tens or hundreds of analysis steps starting from data import and preprocessing to segmentation and statistical analysis; and ending with visualization and reporting. To manage such large-scale image data analysis projects, we present here a modular workflow system called Anima. Anima is designed for comprehensive and efficient image data analysis, and it contains several features that are crucial in high-throughput image data analysis: programming language independence, batch processing, easily customized data processing, interoperability with other software via application programming interfaces, and advanced multivariate statistical analysis. The utility of Anima is shown with two case studies focusing on testing different algorithms developed in different imaging platforms and an automated prediction of alive/dead C. elegans worms by integrating several analysis environments. Anima is fully open source and available with documentation at http://www.anduril.org/anima

  17. Anima: modular workflow system for comprehensive image data analysis.

    Science.gov (United States)

    Rantanen, Ville; Valori, Miko; Hautaniemi, Sampsa

    2014-01-01

    Modern microscopes produce vast amounts of image data, and computational methods are needed to analyze and interpret these data. Furthermore, a single image analysis project may require tens or hundreds of analysis steps starting from data import and pre-processing to segmentation and statistical analysis; and ending with visualization and reporting. To manage such large-scale image data analysis projects, we present here a modular workflow system called Anima. Anima is designed for comprehensive and efficient image data analysis development, and it contains several features that are crucial in high-throughput image data analysis: programming language independence, batch processing, easily customized data processing, interoperability with other software via application programming interfaces, and advanced multivariate statistical analysis. The utility of Anima is shown with two case studies focusing on testing different algorithms developed in different imaging platforms and an automated prediction of alive/dead C. elegans worms by integrating several analysis environments. Anima is fully open source and available with documentation at www.anduril.org/anima.

  18. BReW: Blackbox Resource Selection for e-Science Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh [Univ. of Southern California, Los Angeles, CA (United States); Soroush, Emad [Univ. of Washington, Seattle, WA (United States); Van Ingen, Catharine [Microsoft Research, San Francisco, CA (United States); Agarwal, Deb [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ramakrishnan, Lavanya [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2010-10-04

    Workflows are commonly used to model data intensive scientific analysis. As computational resource needs increase for eScience, emerging platforms like clouds present additional resource choices for scientists and policy makers. We introduce BReW, a tool that enables users to make rapid, high-level platform selections for their workflows using limited workflow knowledge. This helps make informed decisions on whether to port a workflow to a new platform. Our analysis of synthetic and real eScience workflows shows that using just total runtime length, maximum task fanout, and total data used and produced by the workflow, BReW can provide platform predictions comparable to whitebox models with detailed workflow knowledge.
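
    The blackbox idea, predicting platform behavior from only three coarse workflow features, can be sketched with a generic regression; the feature matrix, targets, and model below are invented stand-ins, not BReW's actual model:

      # Sketch of blackbox runtime prediction from the three coarse features
      # the abstract names (illustrative; BReW's actual model may differ).
      import numpy as np
      from sklearn.linear_model import LinearRegression

      # Columns: total runtime length (s), max task fanout, total data (GB);
      # target: observed makespan on the candidate platform (s). Toy numbers.
      X = np.array([[3600, 8, 50], [7200, 16, 120],
                    [1800, 4, 20], [9000, 32, 300]])
      y = np.array([4100, 8300, 2100, 11200])

      model = LinearRegression().fit(X, y)
      new_workflow = np.array([[5400, 12, 80]])
      print(f"predicted makespan: {model.predict(new_workflow)[0]:.0f} s")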

  19. BioWMS: a web-based Workflow Management System for bioinformatics

    OpenAIRE

    Bartocci Ezio; Corradini Flavio; Merelli Emanuela; Scortichini Lorenzo

    2007-01-01

    Abstract Background An in-silico experiment can be naturally specified as a workflow of activities implementing, in a standardized environment, the process of data and control analysis. A workflow has the advantage to be reproducible, traceable and compositional by reusing other workflows. In order to support the daily work of a bioscientist, several Workflow Management Systems (WMSs) have been proposed in bioinformatics. Generally, these systems centralize the workflow enactment and do not e...

  20. A WORKFLOW FOR UAV’s INTEGRATION INTO A GEODESIGN PLATFORM

    Directory of Open Access Journals (Sweden)

    P. Anca

    2016-06-01

    Full Text Available This paper presents a workflow for the development of various Geodesign scenarios. The subject is important in the context of identifying patterns and designing solutions for a Smart City with optimized public transportation, efficient buildings, efficient utilities, recreational facilities, and so on. The workflow describes the procedures starting with acquiring data in the field, data processing, orthophoto generation, DTM generation, integration into a GIS platform, and analysis for better Geodesign support. Esri’s City Engine is used mostly for its 3D modeling capabilities, which enable the user to obtain realistic 3D models. The workflow uses as inputs information extracted from images acquired using UAV technologies, namely eBee, existing 2D GIS geodatabases, and a set of CGA rules. The method used further, called procedural modeling, uses rules to extrude buildings, the street network, parcel zoning, and side details, based on the initial attributes from the geodatabase. The resulting products are various scenarios for redesigning and for analyzing new exploitation sites. Finally, these scenarios can be published as interactive web scenes for internal, group, or public consultation. In this way, problems like the impact of new constructions being built, re-arranging green spaces, or changing routes for public transportation are revealed through impact, visibility, or shadowing analyses and are brought to the citizens’ attention. This leads to better decisions.

  1. Understanding and Visualizing Multitasking and Task Switching Activities: A Time Motion Study to Capture Nursing Workflow.

    Science.gov (United States)

    Yen, Po-Yin; Kelley, Marjorie; Lopetegui, Marcelo; Rosado, Amber L; Migliore, Elaina M; Chipps, Esther M; Buck, Jacalyn

    2016-01-01

    A fundamental understanding of multitasking within nursing workflow is important in today's dynamic and complex healthcare environment. We conducted a time motion study to understand nursing workflow, specifically multitasking and task switching activities. We used TimeCaT, a comprehensive electronic time capture tool, to capture observational data, and established inter-observer reliability prior to data collection. We completed 56 hours of observation of 10 registered nurses. We found that, on average, nurses had 124 communications and 208 hands-on tasks per 4-hour block of time. They multitasked (having communication and hands-on tasks simultaneously) 131 times, representing 39.48% of all tasks; the total multitasking duration ranged from 14.6 to 109 minutes, averaging 44.98 minutes (18.63%). We also reviewed workflow visualizations to uncover the multitasking events. Our study design and methods provide a practical and reliable approach to conducting and analyzing time motion studies from both quantitative and qualitative perspectives.
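
    The multitasking metric itself is straightforward to compute from time-stamped interval data; the sketch below (with invented observation intervals, not the study's data) sums the overlap between communication and hands-on task intervals:

      # Sketch of how multitasking duration can be quantified from
      # time-stamped observations (our illustration, not TimeCaT code):
      # sum the overlap of communication and hands-on task intervals.
      def overlap_minutes(comms, tasks):
          """Each list holds (start, end) pairs in minutes from shift start."""
          total = 0.0
          for cs, ce in comms:
              for ts, te in tasks:
                  total += max(0.0, min(ce, te) - max(cs, ts))
          return total

      comms = [(0, 3), (10, 12), (30, 34)]   # assumed observation data
      tasks = [(2, 8), (11, 15), (33, 40)]
      print(f"multitasked for {overlap_minutes(comms, tasks):.1f} minutes")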

  2. Decaf: Decoupled Dataflows for In Situ High-Performance Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Dreher, M.; Peterka, T.

    2017-07-31

    Decaf is a dataflow system for the parallel communication of coupled tasks in an HPC workflow. The dataflow can perform arbitrary data transformations ranging from simply forwarding data to complex data redistribution. Decaf does this by allowing the user to allocate resources and execute custom code in the dataflow. All communication through the dataflow is efficient parallel message passing over MPI. The runtime for calling tasks is entirely message-driven; Decaf executes a task when all messages for the task have been received. Such a message-driven runtime allows cyclic task dependencies in the workflow graph, for example, to enact computational steering based on the result of downstream tasks. Decaf includes a simple Python API for describing the workflow graph. This allows Decaf to stand alone as a complete workflow system, but Decaf can also be used as the dataflow layer by one or more other workflow systems to form a heterogeneous task-based computing environment. In one experiment, we couple a molecular dynamics code with a visualization tool using the FlowVR and Damaris workflow systems and Decaf for the dataflow. In another experiment, we test the coupling of a cosmology code with Voronoi tessellation and density estimation codes using MPI for the simulation, the DIY programming model for the two analysis codes, and Decaf for the dataflow. Such workflows consisting of heterogeneous software infrastructures exist because components are developed separately with different programming models and runtimes, and this is the first time that such heterogeneous coupling of diverse components was demonstrated in situ on HPC systems.
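
    Decaf's actual Python graph API is not reproduced in this record; the sketch below uses hypothetical names purely to illustrate the shape of such a declarative graph description, including the cyclic steering edge the abstract mentions:

      # Hypothetical sketch of a dataflow-graph description in the spirit of
      # the abstract (NOT Decaf's real API): nodes are tasks with resources,
      # edges are dataflows, and a cycle feeds analysis results back.
      workflow = {
          "nodes": {
              "sim":      {"executable": "md_sim",    "procs": 512},
              "dataflow": {"executable": "redistrib", "procs": 32},
              "viz":      {"executable": "render",    "procs": 64},
          },
          "edges": [
              ("sim", "dataflow"),   # raw frames into the dataflow layer
              ("dataflow", "viz"),   # redistributed data to visualization
              ("viz", "sim"),        # cyclic edge: steering feedback
          ],
      }

      for src, dst in workflow["edges"]:
          print(f"{src} -> {dst}")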

  3. The myth of standardized workflow in primary care.

    Science.gov (United States)

    Holman, G Talley; Beasley, John W; Karsh, Ben-Tzion; Stone, Jamie A; Smith, Paul D; Wetterneck, Tosha B

    2016-01-01

    Primary care efficiency and quality are essential for the nation's health. The demands on primary care physicians (PCPs) are increasing as healthcare becomes more complex. A more complete understanding of PCP workflow variation is needed to guide future healthcare redesigns. This analysis evaluates workflow variation in terms of the sequence of tasks performed during patient visits. Two patient visits from 10 PCPs from 10 different United States Midwestern primary care clinics were analyzed to determine physician workflow. Tasks and the progressive sequence of those tasks were observed, documented, and coded by task category using a PCP task list. Variations in the sequence and prevalence of tasks at each stage of the primary care visit were assessed considering the physician, the patient, the visit's progression, and the presence of an electronic health record (EHR) at the clinic. PCP workflow during patient visits varies significantly, even for an individual physician, with no single or even common workflow pattern being present. The prevalence of specific tasks shifts significantly as primary care visits progress to their conclusion but, notably, PCPs collect patient information throughout the visit. PCP workflows were unpredictable during face-to-face patient visits. Workflow emerges as the result of a "dance" between physician and patient as their separate agendas are addressed, a side effect of patient-centered practice. Future healthcare redesigns should support a wide variety of task sequences to deliver high-quality primary care. The development of tools such as electronic health records must be based on the realities of primary care visits if they are to successfully support a PCP's mental and physical work, resulting in effective, safe, and efficient primary care. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
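
    One simple way to make "workflow variation as task sequences" concrete is an edit distance between coded visit sequences; the sketch below (our illustration, not the paper's analysis) compares two hypothetical visits:

      # Sketch of one way to quantify workflow variation between visits
      # (our illustration, not the authors' method): Levenshtein distance
      # between coded task sequences.
      def edit_distance(a, b):
          prev = list(range(len(b) + 1))
          for i, ca in enumerate(a, 1):
              cur = [i]
              for j, cb in enumerate(b, 1):
                  cur.append(min(prev[j] + 1, cur[j - 1] + 1,
                                 prev[j - 1] + (ca != cb)))
              prev = cur
          return prev[-1]

      visit1 = ["greet", "history", "exam", "order", "document"]
      visit2 = ["greet", "exam", "history", "document", "order", "educate"]
      print(edit_distance(visit1, visit2))  # larger = more divergent workflows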

  4. Web-video-mining-supported workflow modeling for laparoscopic surgeries.

    Science.gov (United States)

    Liu, Rui; Zhang, Xiaoli; Zhang, Hao

    2016-11-01

    As quality assurance is a strong concern in advanced surgeries, intelligent surgical systems are expected to have knowledge such as the knowledge of the surgical workflow model (SWM) to support their intuitive cooperation with surgeons. For generating a robust and reliable SWM, a large amount of training data is required. However, training data collected by physically recording surgery operations is often limited and data collection is time-consuming and labor-intensive, severely influencing knowledge scalability of the surgical systems. The objective of this research is to solve the knowledge scalability problem in surgical workflow modeling in a low-cost and labor-efficient way. A novel web-video-mining-supported surgical workflow modeling (webSWM) method is developed. A novel video quality analysis method based on topic analysis and sentiment analysis techniques is developed to select high-quality videos from abundant and noisy web videos. A statistical learning method is then used to build the workflow model based on the selected videos. To test the effectiveness of the webSWM method, 250 web videos were mined to generate a surgical workflow for the robotic cholecystectomy surgery. The generated workflow was evaluated by 4 web-retrieved videos and 4 operation-room-recorded videos, respectively. The evaluation results (video selection consistency n-index ≥0.60; surgical workflow matching degree ≥0.84) proved the effectiveness of the webSWM method in generating robust and reliable SWM knowledge by mining web videos. With the webSWM method, abundant web videos were selected and a reliable SWM was modeled in a short time with low labor cost. Satisfactory performance in mining web videos and learning surgery-related knowledge shows that the webSWM method is promising in scaling knowledge for intelligent surgical systems. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Workflow for multi-analyte bioprocess monitoring demonstrated on inline NIR spectroscopy of P. chrysogenum fermentation.

    Science.gov (United States)

    Luoma, Pekka; Golabgir, Aydin; Brandstetter, Markus; Kasberger, Jürgen; Herwig, Christoph

    2017-01-01

    Fourier transform near-infrared (FT-NIR) spectroscopy combined with multivariate analysis has been applied in bioprocesses for a couple of decades. Nevertheless, the papers published in this field are case-specific and do not focus on providing the community with generic workflows for conducting experiments, especially as a standard Design of Experiment (DoE) for a multi-analyte process might require an overwhelming amount of measurements. In this paper, a workflow for feasibility studies and inline implementation of an FT-NIR spectrometer in multi-analyte fermentation processes is presented. The workflow is applied to Penicillium chrysogenum fermentation, where the similarities in chemical structures and growth trends between the key analytes, together with the aeration and growing fungi, make the task challenging: first, the pure analytes are measured off-line with FT-NIR and clustered using principal component analysis. To study the separability of the resulting clusters, a DoE approach by spiking is applied. The multivariate modelling of the separable analytes is conducted using the off-line and inline data, followed by a comparison of the properties of the different models. Finally, the model output constraints are set by means of outlier diagnostics. As a result, biomass, penicillin (PEN), phenoxyacetic acid (POX) and ammonia were shown to be separable, with root mean square errors of prediction of 2.62 g/l, 0.34 g/l, 0.51 g/l and 18.3 mM, respectively. Graphical abstract Flowchart illustrating the workflow for feasibility studies and implementation of models for inline monitoring of Ammonia, Biomass, Phenoxyacetic acid and Penicillin.
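
    The two statistical steps named above (PCA clustering for separability, then multivariate regression judged by prediction error) can be sketched with generic scikit-learn stand-ins; the data, component counts, and model choice below are our own assumptions, not the authors' chemometrics setup:

      # Sketch of the workflow's two statistical steps with generic
      # stand-ins (illustrative; the authors' toolchain is not specified):
      # PCA for analyte separability, then PLS regression scored by RMSEP.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(0)
      spectra = rng.normal(size=(60, 500))    # assumed NIR absorbances
      conc = rng.uniform(0, 5, size=(60, 1))  # assumed analyte conc. (g/l)

      scores = PCA(n_components=2).fit_transform(spectra)  # cluster inspection

      pls = PLSRegression(n_components=5).fit(spectra[:40], conc[:40])
      pred = pls.predict(spectra[40:])
      rmsep = np.sqrt(np.mean((pred - conc[40:]) ** 2))
      print(f"RMSEP: {rmsep:.2f} g/l")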

  6. Visual Workflows for Oil and Gas Exploration

    KAUST Repository

    Hollt, Thomas

    2013-04-14

    The most important resources to fulfill today’s energy demands are fossil fuels, such as oil and natural gas. When exploiting hydrocarbon reservoirs, a detailed and credible model of the subsurface structures is crucial for planning the path of the borehole, in order to minimize economic and ecological risks. Before that, the placement as well as the operations of oil rigs need to be planned carefully, as off-shore oil exploration is vulnerable to hazards caused by strong currents. The oil and gas industry therefore relies on accurate ocean forecasting systems for planning their operations. This thesis presents visual workflows for creating subsurface models as well as for planning the placement and operations of off-shore structures. Creating a credible subsurface model poses two major challenges: first, the structures in highly ambiguous seismic data are interpreted in the time domain; second, a velocity model has to be built from this interpretation to match the model to depth measurements from wells. If it is not possible to obtain a match at all positions, the interpretation has to be updated, going back to the first step. This results in a lengthy back and forth between the different steps, or, in many cases, in an unphysical velocity model. We present a novel, integrated approach to interactively creating subsurface models from reflection seismics, which combines the interpretation of the seismic data, using an interactive horizon extraction technique based on piecewise global optimization, with velocity modeling. Computing and visualizing, on the fly, the effects of changes to the interpretation and velocity model on the depth-converted model enables an integrated feedback loop and a completely new connection of the seismic data in the time domain and well data in the depth domain. For planning the operations of off-shore structures we present a novel integrated visualization system that enables interactive visual analysis of ensemble simulations used in ocean

  7. Reproducible Large-Scale Neuroimaging Studies with the OpenMOLE Workflow Management System.

    Science.gov (United States)

    Passerat-Palmbach, Jonathan; Reuillon, Romain; Leclaire, Mathieu; Makropoulos, Antonios; Robinson, Emma C; Parisot, Sarah; Rueckert, Daniel

    2017-01-01

    OpenMOLE is a scientific workflow engine with a strong emphasis on workload distribution. Workflows are designed using a high level Domain Specific Language (DSL) built on top of Scala. It exposes natural parallelism constructs to easily delegate the workload resulting from a workflow to a wide range of distributed computing environments. OpenMOLE hides the complexity of designing complex experiments thanks to its DSL. Users can embed their own applications and scale their pipelines from a small prototype running on their desktop computer to a large-scale study harnessing distributed computing infrastructures, simply by changing a single line in the pipeline definition. The construction of the pipeline itself is decoupled from the execution context. The high-level DSL abstracts the underlying execution environment, contrary to classic shell-script based pipelines. These two aspects allow pipelines to be shared and studies to be replicated across different computing environments. Workflows can be run as traditional batch pipelines or coupled with OpenMOLE's advanced exploration methods in order to study the behavior of an application, or perform automatic parameter tuning. In this work, we briefly present the strong assets of OpenMOLE and detail recent improvements targeting re-executability of workflows across various Linux platforms. We have tightly coupled OpenMOLE with CARE, a standalone containerization solution that allows re-executing on a Linux host any application that has been packaged on another Linux host previously. The solution is evaluated against a Python-based pipeline involving packages such as scikit-learn as well as binary dependencies. All were packaged and re-executed successfully on various HPC environments, with identical numerical results (here prediction scores) obtained on each environment. Our results show that the pair formed by OpenMOLE and CARE is a reliable solution to generate reproducible results and re-executable pipelines. A

  8. Provenance-based refresh in data-oriented workflows

    KAUST Repository

    Ikeda, Robert

    2011-01-01

    We consider a general workflow setting in which input data sets are processed by a graph of transformations to produce output results. Our goal is to perform efficient selective refresh of elements in the output data, i.e., compute the latest values of specific output elements when the input data may have changed. We explore how data provenance can be used to enable efficient refresh. Our approach is based on capturing one-level data provenance at each transformation when the workflow is run initially. Then at refresh time provenance is used to determine (transitively) which input elements are responsible for given output elements, and the workflow is rerun only on that portion of the data needed for refresh. Our contributions are to formalize the problem setting and the problem itself, to specify properties of transformations and provenance that are required for efficient refresh, and to provide algorithms that apply to a wide class of transformations and workflows. We have built a prototype system supporting the features and algorithms presented in the paper. We report preliminary experimental results on the overhead of provenance capture, and on the crossover point between selective refresh and full workflow recomputation. © 2011 ACM.
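
    The refresh idea can be distilled into a few lines (our own illustration, not the prototype's code): capture, per transformation, which input elements produced each output element, then recompute an output only when its inputs have changed. In a multi-stage workflow the same lookup is chained transitively back through intermediate results:

      # Sketch of one-level provenance capture and selective refresh
      # (our distillation of the idea, not the prototype system).
      provenance = {                 # output element -> input elements
          "report_A": {"raw_1", "raw_2"},
          "report_B": {"raw_3"},
      }

      def refresh(output, changed_inputs, recompute):
          """Recompute `output` only if one of its inputs changed."""
          if provenance[output] & changed_inputs:
              return recompute(output, provenance[output])
          return "cached"

      recompute = lambda o, ins: f"recomputed {o} from {sorted(ins)}"
      print(refresh("report_A", {"raw_2"}, recompute))  # lineage touched: rerun
      print(refresh("report_B", {"raw_2"}, recompute))  # untouched: cached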

  9. Distributing Workflows over a Ubiquitous P2P Network

    Directory of Open Access Journals (Sweden)

    Eddie Al-Shakarchi

    2007-01-01

    Full Text Available This paper discusses issues in the distribution of bundled workflows across ubiquitous peer-to-peer networks for the application of music information retrieval. The underlying motivation for this work is provided by the DART project, which aims to develop a novel music recommendation system by gathering statistical data using collaborative filtering techniques and the analysis of the audio itself, in order to create a reliable and comprehensive database of the music that people own and listen to. To achieve this, the DART scientists creating the algorithms need the ability to distribute the Triana workflows they create, representing the analysis to be performed, across the network on a regular basis (perhaps even daily) in order to update the network as a whole with new workflows to be executed for the analysis. DART uses a similar approach to BOINC but differs in that the workers receive input data in the form of a bundled Triana workflow, which is executed in order to process any MP3 files that they own on their machine. Once analysed, the results are returned to DART's distributed database, which collects and aggregates the resulting information. DART employs the use of package repositories to decentralise the distribution of such workflow bundles, and this approach is validated in this paper through simulations that show that suitable scalability is maintained as the number of participants increases. The results clearly illustrate the effectiveness of the approach.

  10. Optimizing high performance computing workflow for protein functional annotation.

    Science.gov (United States)

    Stanberry, Larissa; Rekepalli, Bhanu; Liu, Yuan; Giblock, Paul; Higdon, Roger; Montague, Elizabeth; Broomall, William; Kolker, Natali; Kolker, Eugene

    2014-09-10

    Functional annotation of newly sequenced genomes is one of the major challenges in modern biology. With modern sequencing technologies, the protein sequence universe is rapidly expanding. Newly sequenced bacterial genomes alone contain over 7.5 million proteins. The rate of data generation has far surpassed that of protein annotation. The volume of protein data makes manual curation infeasible, whereas a high compute cost limits the utility of existing automated approaches. In this work, we present an improved and optimized automated workflow to enable large-scale protein annotation. The workflow uses high performance computing architectures and a low complexity classification algorithm to assign proteins into existing clusters of orthologous groups of proteins. On the basis of the Position-Specific Iterative Basic Local Alignment Search Tool, the algorithm ensures at least 80% specificity and sensitivity of the resulting classifications. The workflow utilizes highly scalable parallel applications for classification and sequence alignment. Using Extreme Science and Engineering Discovery Environment supercomputers, the workflow processed 1,200,000 newly sequenced bacterial proteins. With the rapid expansion of the protein sequence universe, the proposed workflow will enable scientists to annotate big genome data.

  11. Development of the workflow kine systems for support on KAIZEN.

    Science.gov (United States)

    Mizuno, Yuki; Ito, Toshihiko; Yoshikawa, Toru; Yomogida, Satoshi; Morio, Koji; Sakai, Kazuhiro

    2012-01-01

    In this paper, we introduce a new workflow line system consisting of location and image recording, which enables the acquisition of workflow information and its analysis and display. From the results of the workflow line investigation, we considered the anticipated effects on KAIZEN and the associated problems. Workflow line information includes location information and action content information. These technologies suggest viewpoints that help improvement, for example, the exclusion of useless movement, the redesign of layout, and the review of work procedures. In a manufacturing factory, it was clear that there was much movement away from the standard operation place and much accumulated residence time. As a concrete result of this investigation, a more efficient layout was suggested by this system. In the case of a hospital, similarly, it was pointed out that the workflow has problems of layout and setup operations, based on the effective movement patterns of experts. This system can adapt to routine as well as non-routine work. With the development of this system, which can fit and adapt to industrial diversification, more effective "visual management" (visualization of work) is expected in the future.

  12. Phase Segmentation Methods for an Automatic Surgical Workflow Analysis

    Directory of Open Access Journals (Sweden)

    Dinh Tuan Tran

    2017-01-01

    Full Text Available In this paper, we present robust methods for automatically segmenting phases in a specified surgical workflow by using latent Dirichlet allocation (LDA) and hidden Markov model (HMM) approaches. More specifically, our goal is to output an appropriate phase label for each given time point of a surgical workflow in an operating room. The fundamental idea behind our work lies in constructing an HMM based on observed values obtained via an LDA topic model covering optical flow motion features of general working contexts, including medical staff, equipment, and materials. We obtain awareness of such working contexts by using multiple synchronized cameras to capture the surgical workflow. Further, we validate the robustness of our methods by conducting experiments involving up to 12 phases of surgical workflows with the average length of each surgical workflow being 12.8 minutes. The maximum average accuracy achieved after applying leave-one-out cross-validation was 84.4%, which we found to be a very promising result.
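
    The HMM decoding step can be illustrated with a small Viterbi sketch; it assumes one observation symbol per time window (e.g. the dominant LDA topic of the motion features) has already been computed, and all probabilities below are illustrative, not the paper's trained values:

        # Given a sequence of per-window observation symbols, recover the most
        # likely phase sequence with Viterbi decoding over a toy HMM.
        import numpy as np

        def safelog(a):                                # avoid log(0) warnings
            return np.log(np.maximum(a, 1e-12))

        start = np.array([1 / 3, 1 / 3, 1 / 3])        # 3 phases
        trans = np.array([[0.9, 0.1, 0.0],             # phases tend to persist
                          [0.0, 0.9, 0.1],
                          [0.0, 0.0, 1.0]])
        emit = np.array([[0.7, 0.1, 0.1, 0.1],         # P(topic | phase), 4 topics
                         [0.1, 0.7, 0.1, 0.1],
                         [0.1, 0.1, 0.4, 0.4]])

        def viterbi(obs):
            logp = safelog(start) + safelog(emit)[:, obs[0]]
            back = []
            for o in obs[1:]:
                cand = logp[:, None] + safelog(trans)  # best predecessor per state
                back.append(cand.argmax(axis=0))
                logp = cand.max(axis=0) + safelog(emit)[:, o]
            path = [int(logp.argmax())]
            for bp in reversed(back):                  # backtrack
                path.append(int(bp[path[-1]]))
            return path[::-1]

        print(viterbi([0, 0, 1, 1, 2, 3, 3]))   # one phase label per time window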

  13. Assessment of the Nurse Medication Administration Workflow Process

    Directory of Open Access Journals (Sweden)

    Nathan Huynh

    2016-01-01

    Full Text Available This paper presents findings of an observational study of the Registered Nurse (RN) Medication Administration Process (MAP) conducted on two comparable medical units in a large urban tertiary care medical center in Columbia, South Carolina. A total of 305 individual MAP observations were recorded over a 6-week period with an average of 5 MAP observations per RN participant for both clinical units. A key MAP variation was identified in terms of unbundled versus bundled MAP performance. In the unbundled workflow, an RN engages in the MAP by performing only MAP tasks during a care episode. In the bundled workflow, an RN completes medication administration along with other patient care responsibilities during the care episode. Using a discrete-event simulation model, this paper addresses the differences between unbundled and bundled workflows and their effects on simulated redesign interventions.
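
    A toy discrete-event model of the bundled versus unbundled contrast might look as follows (it uses the simpy package; all durations and capacities are illustrative, not the study's parameters):

        # "unbundled" = medication tasks only per episode; "bundled" =
        # medication plus other care tasks in the same episode.
        import random
        import simpy

        def episode(env, nurse, bundled, log):
            with nurse.request() as req:
                yield req
                start = env.now
                yield env.timeout(random.uniform(4, 8))        # medication tasks
                if bundled:
                    yield env.timeout(random.uniform(5, 15))   # other care tasks
                log.append(env.now - start)

        def mean_episode(bundled, n=200, seed=1):
            random.seed(seed)
            env = simpy.Environment()
            nurse = simpy.Resource(env, capacity=1)
            log = []
            for _ in range(n):
                env.process(episode(env, nurse, bundled, log))
            env.run()
            return sum(log) / len(log)

        print("unbundled mean (min):", round(mean_episode(False), 1))
        print("bundled mean (min):  ", round(mean_episode(True), 1))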

  14. Evaluating plant immunity using mass spectrometry-based metabolomics workflows

    Directory of Open Access Journals (Sweden)

    Adam L Heuberger

    2014-06-01

    Full Text Available Metabolic processes in plants are key components of physiological and biochemical disease resistance. Metabolomics, the analysis of a broad range of small molecule compounds in a biological system, has been used to provide a systems-wide overview of plant metabolism associated with defense responses. Plant immunity has been examined using multiple metabolomics workflows that vary in methods of detection, annotation, and interpretation, and the choice of workflow can significantly impact the conclusions inferred from a metabolomics investigation. The broad range of metabolites involved in plant defense often supports the need for multiple chemical detection platforms and implementation of a non-targeted approach. A review of the current literature reveals a wide range of workflows that are currently used in plant metabolomics, and new methods for analyzing and reporting mass spectrometry data can improve the ability to translate investigative findings among different plant-pathogen systems.

  15. Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows

    Science.gov (United States)

    Wilson, B. D.; Ramachandran, R.; Lynnes, C.

    2009-05-01

    A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community an unprecedented opportunity to collaborate by sharing their workflows with one another, reproducing and analyzing research results, and leveraging colleagues' expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions have come together to improve community collaboration in science analysis by developing a customizable "software appliance" to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like Finnish "talkoot" (a barn raising), to build a shared research space. Talkoot extends a freely available, open source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a "science story" in shareable web notebooks, including plots or animations, backed up by an executable workflow that directly reproduces the science analysis. New services and workflows of interest will be

  16. Task Delegation Based Access Control Models for Workflow Systems

    Science.gov (United States)

    Gaaloul, Khaled; Charoy, François

    Business processes in e-government organisations are facilitated and conducted using workflow management systems. Role-based access control (RBAC) is recognised as an efficient access control model for large organisations. The application of RBAC in workflow systems cannot, however, grant permissions to users dynamically while business processes are being executed. We currently observe a move away from predefined strict workflow modelling towards approaches supporting flexibility at the organisational level. One specific approach is that of task delegation. Task delegation is a mechanism that supports organisational flexibility and ensures delegation of authority in access control systems. In this paper, we propose a Task-oriented Access Control (TAC) model based on RBAC to address these requirements. We aim to reason about tasks from both organisational and resource perspectives to analyse and specify authorisation constraints. Moreover, we present a fine-grained access control protocol to support delegation based on the TAC model.
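
    A minimal sketch of task-scoped delegation on top of RBAC, in the spirit of the TAC model described above (the roles, permissions and API are invented for illustration):

        ROLE_PERMS = {
            "clerk":   {"file_claim"},
            "manager": {"file_claim", "approve_claim"},
        }

        class User:
            def __init__(self, name, role):
                self.name, self.role = name, role
                self.delegated = set()            # (task, permission) pairs

            def can(self, task, permission):
                return (permission in ROLE_PERMS[self.role]
                        or (task, permission) in self.delegated)

        def delegate(delegator, delegatee, task, permission):
            """Grant a permission for one task only, checked against RBAC."""
            if permission not in ROLE_PERMS[delegator.role]:
                raise PermissionError("delegator lacks this permission")
            delegatee.delegated.add((task, permission))

        alice, bob = User("alice", "manager"), User("bob", "clerk")
        print(bob.can("claim-17", "approve_claim"))       # False
        delegate(alice, bob, "claim-17", "approve_claim")
        print(bob.can("claim-17", "approve_claim"))       # True
        print(bob.can("claim-18", "approve_claim"))       # False: task-scoped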

  17. Dynamic Service Selection in Workflows Using Performance Data

    Directory of Open Access Journals (Sweden)

    David W. Walker

    2007-01-01

    Full Text Available An approach to dynamic workflow management and optimisation using near-realtime performance data is presented. Strategies are discussed for choosing an optimal service (based on user-specified criteria from several semantically equivalent Web services. Such an approach may involve finding "similar" services, by first pruning the set of discovered services based on service metadata, and subsequently selecting an optimal service based on performance data. The current implementation of the prototype workflow framework is described, and demonstrated with a simple workflow. Performance results are presented that show the performance benefits of dynamic service selection. A statistical analysis based on the first order statistic is used to investigate the likely improvement in service response time arising from dynamic service selection.
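
    The two-stage selection strategy can be sketched as follows; the registry entries and performance metrics are illustrative stand-ins for discovered service metadata and near-realtime measurements:

        services = [
            {"name": "svcA", "op": "render", "version": 2, "mean_ms": 120},
            {"name": "svcB", "op": "render", "version": 2, "mean_ms": 85},
            {"name": "svcC", "op": "render", "version": 1, "mean_ms": 40},
        ]

        def select(op, min_version):
            # stage 1: prune on metadata to find "similar" services
            similar = [s for s in services
                       if s["op"] == op and s["version"] >= min_version]
            if not similar:
                raise LookupError("no matching service")
            # stage 2: optimise on measured response time
            return min(similar, key=lambda s: s["mean_ms"])

        print(select("render", min_version=2)["name"])   # -> svcB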

  18. Science Gateways, Scientific Workflows and Open Community Software

    Science.gov (United States)

    Pierce, M. E.; Marru, S.

    2014-12-01

    Science gateways and scientific workflows occupy different ends of the spectrum of user-focused cyberinfrastructure. Gateways, sometimes called science portals, provide a way for enabling large numbers of users to take advantage of advanced computing resources (supercomputers, advanced storage systems, science clouds) by providing Web and desktop interfaces and supporting services. Scientific workflows, at the other end of the spectrum, support advanced usage of cyberinfrastructure that enable "power users" to undertake computational experiments that are not easily done through the usual mechanisms (managing simulations across multiple sites, for example). Despite these different target communities, gateways and workflows share many similarities and can potentially be accommodated by the same software system. For example, pipelines to process InSAR imagery sets or to datamine GPS time series data are workflows. The results and the ability to make downstream products may be made available through a gateway, and power users may want to provide their own custom pipelines. In this abstract, we discuss our efforts to build an open source software system, Apache Airavata, that can accommodate both gateway and workflow use cases. Our approach is general, and we have applied the software to problems in a number of scientific domains. In this talk, we discuss our applications to usage scenarios specific to earth science, focusing on earthquake physics examples drawn from the QuakSim.org and GeoGateway.org efforts. We also examine the role of the Apache Software Foundation's open community model as a way to build up common commmunity codes that do not depend upon a single "owner" to sustain. Pushing beyond open source software, we also see the need to provide gateways and workflow systems as cloud services. These services centralize operations, provide well-defined programming interfaces, scale elastically, and have global-scale fault tolerance. We discuss our work providing

  19. Flexible Early Warning Systems with Workflows and Decision Tables

    Science.gov (United States)

    Riedel, F.; Chaves, F.; Zeiner, H.

    2012-04-01

    An essential part of early warning systems and systems for crisis management are decision support systems that facilitate communication and collaboration. Often official policies specify how different organizations collaborate and what information is communicated to whom. For early warning systems it is crucial that information is exchanged dynamically in a timely manner and all participants get exactly the information they need to fulfil their role in the crisis management process. Information technology obviously lends itself to automate parts of the process. We have experienced however that in current operational systems the information logistics processes are hard-coded, even though they are subject to change. In addition, systems are tailored to the policies and requirements of a certain organization and changes can require major software refactoring. We seek to develop a system that can be deployed and adapted to multiple organizations with different dynamic runtime policies. A major requirement for such a system is that changes can be applied locally without affecting larger parts of the system. In addition to the flexibility regarding changes in policies and processes, the system needs to be able to evolve; when new information sources become available, it should be possible to integrate and use these in the decision process. In general, this kind of flexibility comes with a significant increase in complexity. This implies that only IT professionals can maintain a system that can be reconfigured and adapted; end-users are unable to utilise the provided flexibility. In the business world similar problems arise and previous work suggested using business process management systems (BPMS) or workflow management systems (WfMS) to guide and automate early warning processes or crisis management plans. However, the usability and flexibility of current WfMS are limited, because current notations and user interfaces are still not suitable for end-users, and workflows
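
    A decision table of the kind named in the title can be kept as plain data, so that routing rules are changed locally without touching the rest of the system; a minimal sketch (conditions and recipients are invented):

        RULES = [
            (lambda e: e["severity"] >= 4,                     ["civil-protection", "mayor"]),
            (lambda e: e["severity"] >= 2 and e["flood_risk"], ["river-authority"]),
            (lambda e: True,                                   ["monitoring-centre"]),  # default row
        ]

        def recipients(event):
            out = []
            for condition, who in RULES:
                if condition(event):
                    out.extend(w for w in who if w not in out)
            return out

        print(recipients({"severity": 3, "flood_risk": True}))
        # -> ['river-authority', 'monitoring-centre']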

  20. Structured workflow for implementing digital archiving standards in an organisation

    CSIR Research Space (South Africa)

    Schmitz, Peter MU

    2009-05-01

    Full Text Available is a Web archive providing long-term access to online publications and web sites that have Australian content. The acquisition of archival material for PANDORA is via PANDAS (PANDORA Digital Archiving System) (NLA, 2008). PANDAS is a workflow... Archives and Records Service of South Africa, Private Bag X236, Pretoria, 0001, South Africa. Accessed 14 November 2008 at: http://www.national.archives.gov.za/aboutnasa_content.html NLA. (2007). PANDAS WORKFLOW. Accessed 21 November 2008 at: http://pandora.nla.gov.au/manual/pandas...

  1. Interventional radiology workflow management in the electronic medical record.

    Science.gov (United States)

    Gassert, Geralyn; Durham, Janette; Cain, Michael; Sachs, Peter B

    2014-06-01

    The electronic medical record (EMR) has significantly improved efficiency in many areas of radiology workflow. Following implementation of an electronic protocol selection process for cross-sectional imaging at the University of Colorado Hospital, the interventional radiology (IR) division desired to have a similar tool. Evaluation of the IR workflow demonstrated the need for a multilayered solution, which accounted for consultation, physician review, authorization and scheduling, pre-procedural nursing evaluation, physician rounding, and resource allocation and prioritization. This paper outlines the rationale for and components of this process.

  2. A Workflow Framework for Health Management in Daily Living Settings.

    Science.gov (United States)

    Ozkaynak, Mustafa; Jones, Jacqueline; Weiss, Jason; Klem, Patrick; Reeder, Blaine

    2016-01-01

    Daily-living settings are increasingly becoming care delivery settings, particularly for chronic conditions. Workflow studies can help understand care delivery in daily-living settings, but traditional frameworks originally developed for institutional settings may not be appropriate to study health management in daily-living settings. Based on a qualitative study of health management patterns among eight patients at an academic hospital anticoagulation clinic, we have developed a model for examining daily living setting-based workflow. This model can inform consumer informatics interventions.

  3. Contextual cloud-based service oriented architecture for clinical workflow.

    Science.gov (United States)

    Moreno-Conde, Jesús; Moreno-Conde, Alberto; Núñez-Benjumea, Francisco J; Parra-Calderón, Carlos

    2015-01-01

    Given that acceptance of systems within the healthcare domain multiple papers highlighted the importance of integrating tools with the clinical workflow. This paper analyse how clinical context management could be deployed in order to promote the adoption of cloud advanced services and within the clinical workflow. This deployment will be able to be integrated with the eHealth European Interoperability Framework promoted specifications. Throughout this paper, it is proposed a cloud-based service-oriented architecture. This architecture will implement a context management system aligned with the HL7 standard known as CCOW.

  4. Exformatics Declarative Case Management Workflows as DCR Graphs

    DEFF Research Database (Denmark)

    Slaats, Tijs; Mukkamala, Raghava Rao; Hildebrandt, Thomas

    2013-01-01

    Declarative workflow languages have been a growing research subject over the past ten years, but applications of the declarative approach in industry are still uncommon. Over the past two years Exformatics A/S, a Danish provider of Electronic Case Management systems, has been cooperating with researchers at IT University of Copenhagen (ITU) to create tools for the declarative workflow language Dynamic Condition Response Graphs (DCR Graphs) and incorporate them into their products and in teaching at ITU. In this paper we give a status report on the work. We start with an informal introduction...
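
    The run-time semantics of DCR Graphs can be captured compactly; the sketch below follows the published notion of a marking as (executed, included, pending) event sets, with the example process invented:

        class DCRGraph:
            def __init__(self, events, conditions=(), responses=(),
                         includes=(), excludes=()):
                self.executed, self.pending = set(), set()
                self.included = set(events)
                self.cond = list(conditions)   # (a, b): a must precede b
                self.resp = list(responses)    # (a, b): a makes b pending
                self.incl = list(includes)     # (a, b): a re-includes b
                self.excl = list(excludes)     # (a, b): a excludes b

            def enabled(self, e):
                return (e in self.included and
                        all(a in self.executed or a not in self.included
                            for a, b in self.cond if b == e))

            def execute(self, e):
                assert self.enabled(e), f"{e} is not enabled"
                self.executed.add(e)
                self.pending.discard(e)
                self.pending |= {b for a, b in self.resp if a == e}
                self.included -= {b for a, b in self.excl if a == e}
                self.included |= {b for a, b in self.incl if a == e}

            def accepting(self):      # no included event may still be pending
                return not (self.pending & self.included)

        g = DCRGraph({"propose", "review", "accept"},
                     conditions=[("propose", "review")],
                     responses=[("propose", "review")])
        g.execute("propose"); print(g.accepting())   # False: review is pending
        g.execute("review");  print(g.accepting())   # True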

  5. MO-D-BRB-00: SBRT Workflow Overview

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2016-06-15

    Increased use of SBRT and hypofractionation in radiation oncology practice has posed a number of challenges to medical physicists, ranging from planning, image-guided patient setup and on-treatment monitoring, to quality assurance (QA) and dose delivery. This symposium is designed to provide current knowledge necessary for the safe and efficient implementation of SBRT on various linac platforms, including the emerging digital linacs equipped with high-dose-rate FFF beams. Issues related to 4D CT, PET and MRI simulations, 3D/4D CBCT-guided patient setup, real-time image guidance during SBRT dose delivery using gated/un-gated VMAT/IMRT, and technical advancements in QA of SBRT (in particular, strategies dealing with high-dose-rate FFF beams) will be addressed. The symposium will help the attendees to gain a comprehensive understanding of the SBRT workflow and facilitate their clinical implementation of state-of-the-art imaging and planning techniques. Learning Objectives: Present background knowledge of SBRT, describe essential requirements for safe implementation of SBRT, and discuss issues specific to SBRT treatment planning and QA. Update on the use of multi-dimensional and multi-modality imaging for reliable guidance of SBRT. Discuss treatment planning and QA issues specific to SBRT. Provide a comprehensive overview of emerging digital linacs and summarize the key geometric and dosimetric features of the new generation of linacs for substantially improved SBRT. NIH/NCI; Varian Medical Systems; F. Yin, Duke University has a research agreement with Varian Medical Systems. In addition to the research grant, I had a technology license agreement with Varian Medical Systems.

  6. It's All About the Data: Workflow Systems and Weather

    Science.gov (United States)

    Plale, B.

    2009-05-01

    Digital data is fueling new advances in the computational sciences, particularly geospatial research as environmental sensing grows more practical through reduced technology costs, broader network coverage, and better instruments. e-Science research (i.e., cyberinfrastructure research) has responded to data intensive computing with tools, systems, and frameworks that support computationally oriented activities such as modeling, analysis, and data mining. Workflow systems support execution of sequences of tasks on behalf of a scientist. These systems, such as Taverna, Apache ODE, and Kepler, when built as part of a larger cyberinfrastructure framework, give the scientist tools to construct task graphs of execution sequences, often through a visual interface for connecting task boxes together with arcs representing control flow or data flow. Unlike business processing workflows, scientific workflows expose a high degree of detail and control during configuration and execution. Data-driven science imposes unique needs on workflow frameworks. Our research is focused on two issues. The first is the support for workflow-driven analysis over all kinds of data sets, including real time streaming data and locally owned and hosted data. The second is the essential role metadata/provenance collection plays in data driven science, for discovery, determining quality, for science reproducibility, and for long-term preservation. The research has been conducted over the last 6 years in the context of cyberinfrastructure for mesoscale weather research carried out as part of the Linked Environments for Atmospheric Discovery (LEAD) project. LEAD has pioneered new approaches for integrating complex weather data, assimilation, modeling, mining, and cyberinfrastructure systems. Workflow systems have the potential to generate huge volumes of data. Without some form of automated metadata capture, either metadata description becomes largely a manual task that is difficult if not impossible

  7. Workflow Of Socialization Media Creation About Ad Aware For Teenagers In Junior High School

    Directory of Open Access Journals (Sweden)

    Ahmad Faiz Muntazori

    2017-02-01

    Full Text Available Ad consumption among teenagers may have an impact on a consumerist lifestyle. To anticipate this, the creation of a socialization medium about ad awareness in schools is required, with the purpose of enabling teenagers to consume ads selectively. This study used a qualitative approach and design sociology to describe the workflow processes of socialization media creation in a junior high school environment. As part of the solution to this problem among teenagers, this study also formulates attitudes for teens towards consuming ads.

  8. A recommended workflow for DNase I footprinting using a capillary electrophoresis genetic analyzer.

    Science.gov (United States)

    Sivapragasam, Smitha; Pande, Anuja; Grove, Anne

    2015-07-15

    Fragment analysis was developed to determine the sizes of DNA fragments relative to size standards of known lengths using a capillary electrophoresis genetic analyzer. This approach has since been adapted for use in DNA footprinting. However, DNA footprinting requires accurate determination of both fragment length and intensity, imposing specific demands on the experimental design. Here we delineate essential considerations involved in optimizing the fragment analysis workflow for use in DNase I footprinting to ensure that changes in DNase I cleavage patterns may be reliably identified.

  9. REDACLE A Database for the Workflow Management of the CMS ECAL Construction

    CERN Document Server

    Barone, Luciano; Costantini, Silvia; Dafinei, Ioan; Diemoz, Marcella; Organtini, Giovanni; Paramatti, Riccardo; Pellegrino, Fabio; Moro, R; Cau, S

    2003-01-01

    The REDACLE Project aims at the realization of a simple, flexible and fast database to assist the construction of the CMS electromagnetic calorimeter. The project started in January 2003 as a backup solution for the previously used product: CRISTAL. The REDACLE database was designed to be flexible enough to be used for the construction of virtually any kind of product. One of the key elements of the project was the complete decoupling of the database structure from the workflow process software: rather than being a missing feature, this allows the database to be used for very different projects, ranging from very simple to much more complex systems.

  10. Flight and Operational Medicine Clinic (FOMC) Workflow Analysis

    Science.gov (United States)

    2014-03-14

    ...the subsequent HFACS root cause analysis. Other lean principles the study team considered included: smoothing the flow of the clinic staff and... an amalgamation of the site-specific workflows; and a pictorial representation of the Swiss cheese model annotated with the active and latent failures that were

  11. Portable Rapid Visual Workflow Simulation Tool for Human Robot Coproduction

    NARCIS (Netherlands)

    Dukalski, R.R.; Çençen, A.; Aschenbrenner, D.; Verlinden, J.C.

    2017-01-01

    Within the European Factory-in-a-day project, the aim is to improve communication between automation integrator and factory owner, in their analysis of feasibility and appropriateness of automating a manual task. A visualisation tool with preconfigured workflows and working principles, with

  12. Content and Workflow Management for Library Websites: Case Studies

    Science.gov (United States)

    Yu, Holly, Ed.

    2005-01-01

    Using database-driven web pages or web content management (WCM) systems to manage increasingly diverse web content and to streamline workflows is a commonly practiced solution recognized in libraries today. However, limited library web content management models and funding constraints prevent many libraries from purchasing commercially available…

  13. Casehandling of gewoon workflow? Situatiefactoren geven de doorslag

    NARCIS (Netherlands)

    Ramaekers, M.; Eertink, P.; Sikkel, Nicolaas; Limburg, D.

    Over the past three years, a new class of systems has emerged within the world of workflow management systems: the so-called case handling systems, also known as case management systems. These systems claim to offer support in those areas where the traditional

  14. Conceptual Framework and Architecture for Service Mediating Workflow Management

    NARCIS (Netherlands)

    Hu, Jinmin; Grefen, P.W.P.J.

    Electronic service outsourcing creates a new paradigm for automated enterprise collaboration. The service-oriented paradigm requires a high level of flexibility of current workflow management systems and support for Business-to-Business (B2B) collaboration to realize collaborative enterprises. This

  15. From Paper Based Clinical Practice Guidelines to Declarative Workflow Management

    DEFF Research Database (Denmark)

    Lyng, Karen Marie; Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2009-01-01

    We present a field study of oncology workflow, involving doctors, nurses and pharmacists at Danish hospitals, and discuss the obstacles, enablers and challenges for the use of computer-based clinical practice guidelines. Related to the CIGDec approach of Pesic and van der Aalst, we then describe how...

  16. Styx Grid Services: Lightweight Middleware for Efficient Scientific Workflows

    Directory of Open Access Journals (Sweden)

    J.D. Blower

    2006-01-01

    Full Text Available The service-oriented approach to performing distributed scientific research is potentially very powerful but is not yet widely used in many scientific fields. This is partly due to the technical difficulties involved in creating services and workflows and the inefficiency of many workflow systems with regard to handling large datasets. We present the Styx Grid Service, a simple system that wraps command-line programs and allows them to be run over the Internet exactly as if they were local programs. Styx Grid Services are very easy to create and use and can be composed into powerful workflows with simple shell scripts or more sophisticated graphical tools. An important feature of the system is that data can be streamed directly from service to service, significantly increasing the efficiency of workflows that use large data volumes. The status and progress of Styx Grid Services can be monitored asynchronously using a mechanism that places very few demands on firewalls. We show how Styx Grid Services can interoperate with Web Services and WS-Resources using suitable adapters.
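
    The wrapping idea can be illustrated locally in a few lines; this is not the Styx protocol itself, only the bytes-in/bytes-out view of a command-line program that it builds on (the examples use standard Unix utilities):

        import subprocess

        def run_wrapped(argv, input_bytes):
            """Run a CLI tool as if it were a service: bytes in, bytes out."""
            proc = subprocess.run(argv, input=input_bytes,
                                  capture_output=True, check=True)
            return proc.stdout

        # "Call" sort(1) as a service.
        print(run_wrapped(["sort"], b"pear\napple\nbanana\n").decode())

        # Compose a two-step workflow by streaming one service into another.
        print(run_wrapped(["uniq"], run_wrapped(["sort"], b"b\na\nb\n")).decode())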

  17. 3D workflows in orthodontics, maxillofacial surgery and prosthodontics

    NARCIS (Netherlands)

    van der Meer, Wicher Jurjen

    2016-01-01

    In this thesis different aspects of digital workflows in Orthodontics, Maxillofacial Surgery and Prosthodontics are discussed and, where possible, placed in a broader perspective thereby attempting to go both broader and deeper into the implications of the introduction of 3D digital technology in

  18. SHIWA workflow interoperability solutions for neuroimaging data analysis

    NARCIS (Netherlands)

    Korkhov, Vladimir; Krefting, Dagmar; Montagnat, Johan; Truong Huu, Tram; Kukla, Tamas; Terstyanszky, Gabor; Manset, David; Caan, Matthan; Olabarriaga, Silvia

    2012-01-01

    Neuroimaging is a field that benefits from distributed computing infrastructures (DCIs) to perform data- and compute-intensive processing and analysis. Using grid workflow systems not only automates the processing pipelines, but also enables domain researchers to implement their expertise on how to

  19. Electronic Health Record-Driven Workflow for Diagnostic Radiologists.

    Science.gov (United States)

    Geeslin, Matthew G; Gaskin, Cree M

    2016-01-01

    In most settings, radiologists maintain a high-throughput practice in which efficiency is crucial. The conversion from film-based to digital study interpretation and data storage launched the era of PACS-driven workflow, leading to significant gains in speed. The advent of electronic health records improved radiologists' access to patient data; however, many still find this aspect of workflow to be relatively cumbersome. Nevertheless, the ability to guide a diagnostic interpretation with clinical information, beyond that provided in the examination indication, can add significantly to the specificity of a radiologist's interpretation. Responsibilities of the radiologist include, but are not limited to, protocoling examinations, interpreting studies, chart review, peer review, writing notes, placing orders, and communicating with referring providers. Most of the aforementioned activities are not PACS-centric and require a login to one or more additional applications. Consolidation of these tasks for completion through a single interface can simplify workflow, save time, and potentially reduce the incidence of errors. Here, the authors describe diagnostic radiology workflow that leverages the electronic health record to significantly add to a radiologist's ability to be part of the health care team, provide relevant interpretations, and improve efficiency and quality.

  20. Workflow-based service selection under multi-constraints

    NARCIS (Netherlands)

    Xia, Chao; Chi, Chihung; Wong, Raymond; Wombacher, Andreas; Ferreira Pires, Luis; van Sinderen, Marten J.; Ding, Chen

    Despite the availability of services with similar functionality but from different providers in the cloud, using them in a workflow might be subject to constraints such as service QoS and service bundling. Service bundling refers to the situation where the subscription of two services have to be
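
    Under stated assumptions (a toy registry and one invented bundling rule), workflow-wide selection under such constraints can be sketched as a constrained search:

        from itertools import product

        candidates = {                      # task -> [(service, cost), ...]
            "store":   [("P1.store", 3), ("P2.store", 2)],
            "analyse": [("P1.analyse", 5), ("P2.analyse", 4)],
        }

        def bundling_ok(choice):
            # example rule: subscribing to P2.analyse requires P2.store
            return (not choice["analyse"].startswith("P2.")
                    or choice["store"].startswith("P2."))

        best = None
        for combo in product(*candidates.values()):
            choice = {task: svc for task, (svc, _) in zip(candidates, combo)}
            cost = sum(c for _, c in combo)
            if bundling_ok(choice) and (best is None or cost < best[1]):
                best = (choice, cost)

        print(best)   # ({'store': 'P2.store', 'analyse': 'P2.analyse'}, 6)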

  1. Experiences with Resource Provisioning for Scientific Workflows Using Corral

    Directory of Open Access Journals (Sweden)

    Gideon Juve

    2010-01-01

    Full Text Available The development of grid and workflow technologies has enabled complex, loosely coupled scientific applications to be executed on distributed resources. Many of these applications consist of large numbers of short-duration tasks whose runtimes are heavily influenced by delays in the execution environment. Such applications often perform poorly on the grid because of the large scheduling overheads commonly found in grids. In this paper we present a provisioning system based on multi-level scheduling that improves workflow runtime by reducing scheduling overheads. The system reserves resources for the exclusive use of the application, and gives applications control over scheduling policies. We describe our experiences with the system when running a suite of real workflow-based applications including in astronomy, earthquake science, and genomics. Provisioning resources with Corral ahead of the workflow execution, reduced the runtime of the astronomy application by up to 78% (45% on average and of a genome mapping application by an order of magnitude when compared to traditional methods. We also show how provisioning can benefit applications both on a small local cluster as well as a large-scale campus resource.

  2. Supporting Ad-hoc Changes in Distributed Workflow Management Systems.

    NARCIS (Netherlands)

    Reichert, M.U.; Bauer, T.

    Flexible support of distributed business processes is a characteristic challenge for any workflow management system (WfMS). Scalability at the presence of high loads as well as the capability to dynamically adapt running process instances are essential requirements. Should the latter one be not met,

  3. The effectiveness of workflow management systems: A longitudinal study

    NARCIS (Netherlands)

    Reijers, H.A.; Vanderfeesten, I.; van der Aalst, W.M.P.

    2016-01-01

    Workflow management systems coordinate and allocate work through the various stages of executing business processes. The benefits of such systems appear pervasive, but no hard data is available that confirms that their implementation improves organizational performance. In part, this is due to the

  4. You’ve Got Email: a Workflow Management Extraction System

    NARCIS (Netherlands)

    P. chaipornkaew (Piyanuch); T. Prexawanprasut (Takorn); M.J. McAleer (Michael)

    2017-01-01

    textabstractEmail is one of the most powerful tools for communication. Many businesses use email as the main channel for communication, so it is possible that substantial data are included in email content. In order to help businesses grow faster, a workflow management system may be required. The

  5. Semantics and Architecture of Global Transaction Support in Workflow Environments

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Vonk, J.; Boertjes, E.M.; Apers, Peter M.G.

    We present an approach to global transaction management in workflow environments. The transaction mechanism is based on the well-known notion of sagas, but extended to deal with arbitrary process structures including cycles and savepoints that allow partial compensation. We present a formal
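
    The saga idea, running compensating actions for completed steps when a later step fails, can be sketched as follows (savepoints and partial compensation are omitted for brevity; all step names are invented):

        def run_saga(steps):
            done = []
            try:
                for name, action, compensate in steps:
                    action()
                    done.append((name, compensate))
            except Exception as exc:
                print(f"step failed ({exc}); compensating")
                for name, compensate in reversed(done):
                    compensate()
                raise

        log = []
        def no_rooms():
            raise RuntimeError("no rooms")

        steps = [
            ("book-flight", lambda: log.append("flight+"), lambda: log.append("flight-")),
            ("book-hotel",  no_rooms,                      lambda: log.append("hotel-")),
        ]
        try:
            run_saga(steps)
        except RuntimeError:
            pass
        print(log)    # ['flight+', 'flight-']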

  6. A Generalized Email Classification System for Workflow Analysis

    NARCIS (Netherlands)

    P. chaipornkaew (Piyanuch); T. Prexawanprasut (Takorn); C-L. Chang (Chia-Lin); M.J. McAleer (Michael)

    2017-01-01

    textabstractOne of the most powerful internet communication channels is email. As employees and their clients communicate primarily via email, much crucial business data is conveyed via email content. Where businesses are understandably concerned, they need a sophisticated workflow management

  7. Automation of Global Adjoint Tomography Based on ASDF and Workflow Management Tools

    Science.gov (United States)

    Lei, W.; Ruan, Y.; Bozdag, E.; Smith, J. A.; Modrak, R. T.; Krischer, L.; Chen, Y.; Lefebvre, M. P.; Tromp, J.

    2016-12-01

    Global adjoint tomography is computationally expensive, requiring thousands of wavefield simulations and massive data processing. Though a collaboration with the Oak Ridge National Laboratory computing group and an allocation on the `Titan' GPU-accelerated supercomputer, we have begun to assimilate waveform data from more than 4,000 earthquakes, from 1995 to 2015, in our inversions. However, since conventional file formats and signal processing tools were not designed for parallel processing of massive data volumes, use of such tools in high-resolution global inversions leads to major bottlenecks. To overcome such problems and allow for continued scientific progress, we designed the Adaptive Seismic Data Format (ASDF) and developed a set of processing tools based on ASDF, covering from signal processing (pytomo3d), time window selection (pyflex) to adjoint source (pyadjoint). These new tools greatly enhance the reproducibility and accountability of our research while taking full advantage of parallel computing, showing superior scaling on modern computational platforms. The entire inversion workflow, intrinsically complex and sensitive to human errors, is carefully handled and automated by modern workflow management tools, preventing data contamination and saving a huge amount of time. Our starting model GLAD-M15 (Bozdag et al., 2016), an elastic model with transversely isotropic upper mantle, is based on 253 earthquakes and 15 nonlinear conjugate gradient iterations. We have now completed source inversions for more than 1,000 earthquakes and have started structural inversions using a quasi-Newton optimization algorithm. We will discuss the challenges of large-scale workflows on HPC systems, the solutions offered by our new adjoint tomography tools, and the initial tomographic results obtained using the new expanded dataset.

  8. VisTrails is an open-source scientific workflow and provenance management system

    CSIR Research Space (South Africa)

    Mthombeni, Thabo DM

    2011-12-01

    Full Text Available VisTrails is an open-source scientific workflow and provenance management system that provides support for simulations, data exploration and visualization. Whereas workflows have been traditionally used to automate repetitive tasks, for applications...

  9. Contract-Based Transaction Management in Cross-Organizational Workflow Management

    NARCIS (Netherlands)

    Grefen, P.W.P.J.

    Cross-organizational workflow management is an essential ingredient for process integration in virtual enterprises. To obtain cross-organizational workflow processes with robust semantics, these processes should be supported by highlevel cross-organizational transaction management. In this context,

  10. Pegasus Workflow Management System: Helping Applications From Earth and Space

    Science.gov (United States)

    Mehta, G.; Deelman, E.; Vahi, K.; Silva, F.

    2010-12-01

    Pegasus WMS is a Workflow Management System that can manage large-scale scientific workflows across Grid, local and Cloud resources simultaneously. Pegasus WMS provides a means for representing the workflow of an application in an abstract XML form, agnostic of the resources available to run it and the location of data and executables. It then compiles these workflows into concrete plans by querying catalogs and farming computations across local and distributed computing resources, as well as emerging commercial and community cloud environments, in an easy and reliable manner. Pegasus WMS optimizes the execution as well as data movement by leveraging existing Grid and cloud technologies via a flexible pluggable interface and provides advanced features like reusing existing data, automatic cleanup of generated data, and recursive workflows with deferred planning. It also captures all the provenance of the workflow from the planning stage to the execution of the generated data, helping scientists to accurately measure performance metrics of their workflow as well as data reproducibility issues. Pegasus WMS was initially developed as part of the GriPhyN project to support large-scale high-energy physics and astrophysics experiments. Direct funding from the NSF enabled support for a wide variety of applications from diverse domains including earthquake simulation, bacterial RNA studies, helioseismology and ocean modeling. Earthquake simulation: Pegasus WMS was recently used in a large-scale production run in 2009 by the Southern California Earthquake Center to run 192 million loosely coupled tasks and about 2,000 tightly coupled MPI-style tasks on national cyberinfrastructure to generate a probabilistic seismic hazard map of the Southern California region. SCEC ran 223 workflows over a period of eight weeks, using on average 4,420 cores, with a peak of 14,540 cores. A total of 192 million files were produced, totaling about 165 TB, out of which 11 TB of data was saved
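
    The abstract-to-concrete planning idea can be mimicked in a few lines; this is a conceptual toy, not the Pegasus API, and all catalogs, paths and URLs are invented:

        abstract_dag = [
            {"job": "extract",  "in": ["raw.dat"],   "out": ["clean.dat"]},
            {"job": "simulate", "in": ["clean.dat"], "out": ["result.dat"]},
        ]
        transformation_catalog = {"extract": "/opt/bin/extract",
                                  "simulate": "/cluster/bin/sim"}
        replica_catalog = {"raw.dat": "gsiftp://storage.example.org/raw.dat"}

        def plan(dag):
            """Compile the abstract workflow into a concrete, executable plan."""
            return [{"exec": transformation_catalog[node["job"]],
                     "inputs": [replica_catalog.get(f, f) for f in node["in"]],
                     "outputs": node["out"]}
                    for node in dag]

        for step in plan(abstract_dag):
            print(step)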

  11. Understanding collaborative studies through interoperable workflow provenance

    NARCIS (Netherlands)

    Altintas, I.; Anand, M.K.; Crawl, D.; Bowers, S.; Belloum, A.; Missier, P.; Ludäscher, B.; Goble, C.A.; Sloot, P.M.A.

    2010-01-01

    The provenance of a data product contains information about how the product was derived, and is crucial for enabling scientists to easily understand, reproduce, and verify scientific results. Currently, most provenance models are designed to capture the provenance related to a single run, and mostly

  12. What Not To Do: Anti-patterns for Developing Scientific Workflow Software Components

    Science.gov (United States)

    Futrelle, J.; Maffei, A. R.; Sosik, H. M.; Gallager, S. M.; York, A.

    2013-12-01

    Scientific workflows promise to enable efficient scaling-up of researcher code to handle large datasets and workloads, as well as documentation of scientific processing via standardized provenance records, etc. Workflow systems and related frameworks for coordinating the execution of otherwise separate components are limited, however, in their ability to overcome software engineering design problems commonly encountered in pre-existing components, such as scripts developed externally by scientists in their laboratories. In practice, this often means that components must be rewritten or replaced in a time-consuming, expensive process. In the course of an extensive workflow development project involving large-scale oceanographic image processing, we have begun to identify and codify 'anti-patterns'--problematic design characteristics of software--that make components fit poorly into complex automated workflows. We have gone on to develop and document low-effort solutions and best practices that efficiently address the anti-patterns we have identified. The issues, solutions, and best practices can be used to evaluate and improve existing code, as well as guiding the development of new components. For example, we have identified a common anti-pattern we call 'batch-itis' in which a script fails and then cannot perform more work, even if that work is not precluded by the failure. The solution we have identified--removing unnecessary looping over independent units of work--is often easier to code than the anti-pattern, as it eliminates the need for complex control flow logic in the component. Other anti-patterns we have identified are similarly easy to identify and often easy to fix. We have drawn upon experience working with three science teams at Woods Hole Oceanographic Institution, each of which has designed novel imaging instruments and associated image analysis code. By developing use cases and prototypes within these teams, we have undertaken formal evaluations of
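
    The 'batch-itis' anti-pattern and its fix can be shown side by side; process() is a stand-in for real image analysis:

        def process(img):
            if img == "corrupt.png":
                raise ValueError(img)
            return img.upper()

        # Anti-pattern: the component loops over every image itself, so one
        # bad image aborts all remaining, independent work.
        def batch_antipattern(images):
            return [process(img) for img in images]

        # Fix: the component handles exactly one independent unit of work;
        # looping, retries and bookkeeping move to the workflow engine.
        def process_one(img):
            return process(img)

        # Engine-side view: failures are now isolated per unit of work.
        for img in ["a.png", "corrupt.png", "b.png"]:
            try:
                print(process_one(img))
            except ValueError as exc:
                print("failed:", exc)    # the other units still complete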

  13. A Workflow-Oriented Approach To Propagation Models In Heliophysics

    Directory of Open Access Journals (Sweden)

    Gabriele Pierantoni

    2014-01-01

    Full Text Available The Sun is responsible for the eruption of billions of tons of plasma and the generation of near light-speed particles that propagate throughout the solar system and beyond. If directed towards Earth, these events can be damaging to our technological infrastructure. Hence there is an effort to understand the cause of the eruptive events and how they propagate from Sun to Earth. However, the physics governing their propagation is not well understood, so there is a need to develop a theoretical description of their propagation, known as a Propagation Model, in order to predict when they may impact Earth. It is often difficult to define a single propagation model that correctly describes the physics of solar eruptive events, and even more difficult to implement models capable of catering for all these complexities and to validate them using real observational data. In this paper, we envisage that workflows offer both a theoretical and practical framework for a novel approach to propagation models. We define a mathematical framework that aims at encompassing the different modalities with which workflows can be used, and provide a set of generic building blocks written in the TAVERNA workflow language that users can use to build their own propagation models. Finally we test both the theoretical model and the composite building blocks of the workflow with a real Science Use Case that was discussed during the 4th CDAW (Coordinated Data Analysis Workshop) event held by the HELIO project. We show that generic workflow building blocks can be used to construct a propagation model that successfully describes the transit of solar eruptive events toward Earth and predicts a correct Earth-impact time.

  14. Workflow modelling using a temporal object-oriented model with roles

    OpenAIRE

    Edelweiss, Nina; Nicolao, Mariano

    2004-01-01

    The representation of all the processes that compose a workflow, including all the constituent activities, their execution sequence and relationships, the agents responsible for their execution, and the resources that are used during execution, is known as Workflow Modelling. Several techniques are being proposed to model workflow. In this paper a workflow modelling technique is proposed, using a temporal object-oriented data model, the TF-ORM model. The TF-ORM model is presented, as are the ...

  15. Workflow Automation with Lotus Notes for the Governmental Administrative Information System

    OpenAIRE

    Maskeliunas, Saulius

    1999-01-01

    The paper presents an introductory overview of the workflow automation area, outlining the main types, basic technologies, the essential features of workflow applications. Two sorts of process models for the definition of workflows (according to the conversation-based and activity-based methodologies) are sketched. Later on, the nature of Lotus Notes and its capabilities (as an environment for workflow management systems development) are indicated. Concluding, the experience of automating adm...

  16. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee

    2016-01-01

    This article presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults...

  17. The Symbiotic Relationship between Scientific Workflow and Provenance (Invited)

    Science.gov (United States)

    Stephan, E.

    2010-12-01

    The purpose of this presentation is to describe the symbiotic nature of scientific workflows and provenance. We will also discuss the current trends and real-world challenges facing these two distinct research areas. Although motivated differently, the needs of the international science communities are the glue that binds this relationship together. Understanding and articulating the science drivers to these communities is paramount as these technologies evolve and mature. Originally conceived for managing business processes, workflows are now becoming invaluable assets in both computational and experimental sciences. These reconfigurable, automated systems provide essential technology to perform complex analyses by coupling together geographically distributed disparate data sources and applications. As a result, workflows are capable of higher throughput in a shorter amount of time than performing the steps manually. Today many different workflow products exist; these include Kepler and Taverna, or similar products like MeDICI, developed at PNNL, that are standardized on the Business Process Execution Language (BPEL). Provenance, originating from the French term "provenir" ("to come from"), is used to describe the curation process of artwork as art is passed from owner to owner. The concept of provenance was adopted by digital libraries as a means to track the lineage of documents while standards such as Dublin Core began to emerge. In recent years the systems science community has increasingly expressed the need to expand the concept of provenance to formally articulate the history of scientific data. Communities such as the International Provenance and Annotation Workshop (IPAW) have formalized a provenance data model, the Open Provenance Model, and the W3C is hosting a provenance incubator group featuring the Proof Markup Language. Although both workflows and provenance have risen from different communities and operate independently, their mutual

  18. CSPBuilder - CSP based Scientific Workflow Modelling

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard; Vinter, Brian

    2008-01-01

    This paper introduces a framework for building CSP-based applications, targeted at clusters and next-generation CPU designs. CPUs are produced with several cores today and every future CPU generation will feature increasingly more cores, resulting in a requirement for concurrency that has not previously been called for. The framework is CSP presented as a scientific workflow model, specialized for scientific computing applications. The purpose of the framework is to enable scientists to exploit large parallel computation resources, which has previously been hard due to the difficulty of concurrent...
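
    The CSP idea of independent processes communicating only over channels can be emulated with threads and queues; real CSP frameworks offer richer semantics (alternation, channel networks), so this is only a sketch of the programming model:

        import threading
        import queue

        def producer(out):
            for i in range(5):
                out.put(i * i)       # send on the channel
            out.put(None)            # poison pill shuts the network down

        def consumer(inp):
            while (v := inp.get()) is not None:    # receive on the channel
                print("got", v)

        chan = queue.Queue()         # the channel connecting the processes
        procs = [threading.Thread(target=producer, args=(chan,)),
                 threading.Thread(target=consumer, args=(chan,))]
        for p in procs: p.start()
        for p in procs: p.join()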

  19. Workflow of CAD / CAM Scoliosis Brace Adjustment in Preparation Using 3D Printing.

    Science.gov (United States)

    Weiss, Hans-Rudolf; Tournavitis, Nicos; Nan, Xiaofeng; Borysov, Maksym; Paul, Lothar

    2017-01-01

    High correction bracing is the most effective conservative treatment for patients with scoliosis during growth. Even today, braces for the treatment of scoliosis are made by casting patients, although computer-aided design (CAD) and computer-aided manufacturing (CAM) are available, with every possibility to standardize pattern-specific brace treatment and improve wearing comfort. CAD / CAM brace production mainly relies on carving a polyurethane foam model, which is the basis for vacuum-forming a polyethylene (PE) or polypropylene (PP) brace. The purpose of this short communication is to describe the workflow currently used and to outline future requirements with respect to 3D printing technology. This paper describes the steps of virtual brace adjustment as available today and outlines the great potential of 3D printing technology for the future. For 3D printing of scoliosis braces it is necessary to establish easy-to-use software plug-ins in order to allow adding 3D printing technology to the current workflow of virtual CAD / CAM brace adjustment. Textures and structures can be added to the brace models at certain well-defined locations, offering the potential of more wearing comfort without losing in-brace correction. Advances have to be made in the field of CAD / CAM software tools with respect to the design and generation of individually structured brace models based on currently well-established and standardized scoliosis brace libraries.

  20. Improvement of workflow and processes to ease and enrich meaningful use of health information technology.

    Science.gov (United States)

    Singh, Ranjit; Singh, Ashok; Singh, Devan R; Singh, Gurdev

    2013-01-01

    The introduction of health information technology (HIT) can have unexpected and unintended patient safety and/or quality consequences. This highly desirable but complex intervention requires workflow changes in order to be effective. Workflow is often cited by providers as the number one 'pain point'. Its redesign needs to be tailored to the organizational context, current workflow, the HIT system being introduced, and the resources available. Primary care practices lack the required expertise and need external assistance. Unfortunately, the current methods of using esoteric charts or software are alien to health care workers and are, therefore, perceived to be barriers. Most importantly and ironically, these do not readily educate or enable staff to inculcate a common vision, ownership, and empowerment among all stakeholders. These attributes are necessary for creating highly reliable organizations. We present a tool that addresses the US Accreditation Council for Graduate Medical Education (ACGME) competency requirements. Of the six competencies called for by the ACGME, the two that this tool particularly addresses are 'system-based practice' and 'practice-based learning and continuing improvement'. This toolkit is founded on a systems engineering approach. It includes a motivational and orientation presentation, 128 magnetic pictorial and write-erase icons of 40 designs, a dry-erase magnetic board, and five visual aids for reducing cognitive and emotive biases in staff. Pilot tests were carried out in practices in Western New York and Colorado, USA. In addition, the toolkit was presented at the 2011 North American Primary Care Research Group (NAPCRG) meeting and an Agency for Healthcare Research and Quality (AHRQ) meeting in 2013 to solicit responses from attendees. It was also presented to the officers of the Office of the National Coordinator (ONC) for HIT. All qualitative feedback was extremely positive and enthusiastic. The respondents recommended that the toolkit be disseminated

  1. Phxnlme: An R package that facilitates pharmacometric workflow of Phoenix NLME analyses.

    Science.gov (United States)

    Lim, Chay Ngee; Liang, Shuang; Feng, Kevin; Chittenden, Jason; Henry, Ana; Mouksassi, Samer; Birnbaum, Angela K

    2017-03-01

    Pharmacometric analyses are integral components of the drug development process, and Phoenix NLME is one of the popular software packages used to conduct such analyses. To address current limitations with model diagnostic graphics and the efficiency of the workflow for this software, we developed an R package, Phxnlme, to facilitate its workflow and provide improved graphical diagnostics. Phxnlme was designed to provide functionality for the major tasks that are usually performed in pharmacometric analyses (i.e. nonlinear mixed effects modeling, basic model diagnostics, visual predictive checks and bootstrap). Various estimation methods for modeling using the R package are made available through the Phoenix NLME engine. The Phxnlme R package utilizes other packages such as ggplot2 and lattice to produce the graphical output, and various features were included to allow customizability of the output. Interactive features for some plots were also added using the manipulate R package. Phxnlme provides enhanced capabilities for nonlinear mixed effects modeling that can be accessed using the phxnlme() command. Output from the model can be graphed to assess the adequacy of model fits and to further explore relationships in the data using various functions included in this R package, such as phxplot() and phxvpc.plot(). Bootstraps, stratified by up to three variables, can also be performed to obtain confidence intervals around the model estimates. With the use of an R interface, different R projects can be created to allow multi-tasking, which addresses a current limitation of the Phoenix NLME desktop software. In addition, there is a wide selection of diagnostic and exploratory plots in the Phxnlme package, with improvements in the customizability of plots compared to Phoenix NLME. The Phxnlme package is a flexible tool that allows implementation of the analytical workflow of Phoenix NLME with R, with features for greater overall efficiency and improved customizable graphics. Phxnlme is

  2. Networked Print Production: Does JDF Provide a Perfect Workflow?

    Directory of Open Access Journals (Sweden)

    Bernd Zipper

    2004-12-01

    Full Text Available The "networked printing works" is a well-worn slogan used by many providers in the graphics industry and for the past number of years printing-works manufacturers have been working on the goal of achieving the "networked printing works". A turning point from the concept to real implementation can now be expected at drupa 2004: JDF (Job Definition Format and thus "networked production" will form the center of interest here. The first approaches towards a complete, networked workflow between prepress, print and postpress in production are already available - the products and solutions will now be presented publicly at drupa 2004. So, drupa 2004 will undoubtedly be the "JDF-drupa" - the drupa where machines learn to communicate with each other digitally - the drupa, where the dream of general system and job communication in the printing industry can be first realized. CIP3, which has since been renamed CIP4, is an international consortium of leading manufacturers from the printing and media industry who have taken on the task of integrating processes for prepress, print and postpress. The association, to which nearly all manufacturers in the graphics industry belong, has succeeded with CIP3 in developing a first international standard for the transmission of control data in the print workflow.Further development of the CIP4 standard now includes a more extensive "system language" called JDF, which will guarantee workflow communication beyond manufacturer boundaries. However, not only data for actual print production will be communicated with JDF (Job Definition Format: planning and calculation data for MIS (Management Information systems and calculation systems will also be prepared. The German printing specialist Hans-Georg Wenke defines JDF as follows: "JDF takes over data from MIS for machines, aggregates and their control desks, data exchange within office applications, and finally ensures that data can be incorporated in the technical workflow

  3. Research and Implementation of Key Technologies in Multi-Agent System to Support Distributed Workflow

    Science.gov (United States)

    Pan, Tianheng

    2018-01-01

    In recent years, the combination of workflow management systems and multi-agent technology has become a hot research field. The lack of flexibility in workflow management systems can be mitigated by introducing multi-agent collaborative management. The workflow management system adopts a distributed structure, which addresses the fragility of the traditional centralized workflow architecture. In this paper, the agents of a distributed workflow management system are divided according to their functions, the execution process of each type of agent is analyzed, and key technologies such as process execution and resource management are examined.
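
    As a rough illustration of such function-divided agents, the sketch below lets a process agent decompose a workflow into tasks while resource agents claim and execute them; the agent roles and names are assumptions for illustration, not the paper's design.

        import queue
        import threading

        tasks = queue.Queue()

        def process_agent(workflow):
            """Decomposes a workflow into tasks and dispatches them."""
            for task in workflow:
                tasks.put(task)
            tasks.put(None)  # sentinel: no more work

        def resource_agent(name):
            """Claims tasks and 'executes' them with a local resource."""
            while True:
                task = tasks.get()
                if task is None:
                    tasks.put(None)   # let the other resource agents terminate too
                    break
                print(f"{name} executing {task}")

        threads = [
            threading.Thread(target=process_agent, args=(["review", "approve", "archive"],)),
            threading.Thread(target=resource_agent, args=("agent-A",)),
            threading.Thread(target=resource_agent, args=("agent-B",)),
        ]
        for t in threads: t.start()
        for t in threads: t.join()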

  4. Staffing and Workflow of a Maturing Institutional Repository

    Directory of Open Access Journals (Sweden)

    Debora L. Madsen

    2013-02-01

    Full Text Available Institutional repositories (IRs have become established components of many academic libraries. As an IR matures it will face the challenge of how to scale up its operations to increase the amount and types of content archived. These challenges involve staffing, systems, workflows, and promotion. In the past eight years, Kansas State University's IR (K-REx has grown from a platform for student theses, dissertations, and reports to also include faculty works. The initial workforce of a single faculty member was expanded as a part of a library-wide reorganization, resulting in a cross-departmental team that is better able to accommodate the expansion of the IR. The resultant need to define staff responsibilities and develop resources to manage the workflows has led to the innovations described here, which may prove useful to the greater library community as other IRs mature.

  5. Work Around Distributed Image Processing and Workflow Management

    Science.gov (United States)

    Schaaff, A.; Bonnarel, F.; Claudon, J.-J.; Louys, M.; Pestel, C.; David, R.; Genaud, S.; Louys, M.; Wolf, C.

    2006-07-01

    Many people develop tools for image processing in various languages (C, C++, FORTRAN, MATLAB, etc.) but do not distribute them. Among the reasons are portability and the difficulty of making them cooperate with other tools. We have developed an architecture in which such tools can be wrapped and accessed in a standardized way (CGI and Web Services to use them in other applications, a Java Applet to use them directly). We have also developed workflow libraries (client and server sides) to enable the easy creation and management of more complex tasks. The initially isolated tasks can now be combined into complex workflows. We are now working on the distribution of this architecture to enable the creation of image processing nodes.
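
    The wrapping idea can be sketched in a few lines: each external tool, whatever its implementation language, is hidden behind a uniform file-in/file-out call so that wrapped tools can be chained. The executables and file names below are hypothetical placeholders.

        import subprocess

        def wrap(executable):
            """Return a function that runs `executable infile outfile`."""
            def run(infile, outfile):
                subprocess.run([executable, infile, outfile], check=True)
                return outfile
            return run

        denoise = wrap("./denoise")      # e.g. a compiled C program
        detect  = wrap("./detect_src")   # e.g. a compiled FORTRAN program

        def workflow(image):
            """Chain the wrapped tools: the output of one feeds the next."""
            cleaned = denoise(image, "cleaned.fits")
            return detect(cleaned, "sources.cat")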

  6. A framework for streamlining research workflow in neuroscience and psychology.

    Science.gov (United States)

    Kubilius, Jonas

    2013-01-01

    Successful accumulation of knowledge is critically dependent on the ability to verify and replicate every part of scientific conduct. However, such principles are difficult to enact when researchers continue to rely on ad hoc workflows and poorly maintained code bases. In this paper I examine the needs of the neuroscience and psychology community and introduce psychopy_ext, a unifying framework that seamlessly integrates popular experiment building, analysis and manuscript preparation tools by choosing reasonable defaults and implementing relatively rigid patterns of workflow. This structure allows for automation of multiple tasks, such as generating user interfaces, unit testing, control analyses of stimuli, single-command access to descriptive statistics, and publication-quality plotting. Taken together, psychopy_ext opens an exciting possibility of faster, more robust code development and collaboration for researchers.

  7. Developing integrated workflows for the digitisation of herbarium specimens using a modular and scalable approach

    Directory of Open Access Journals (Sweden)

    Elspeth Haston

    2012-07-01

    Full Text Available Digitisation programmes in many institutes frequently involve disparate and irregular funding, diverse selection criteria and scope, with different members of staff managing and operating the processes. These factors have influenced the decision at the Royal Botanic Garden Edinburgh to develop an integrated workflow for the digitisation of herbarium specimens which is modular and scalable, enabling a single overall workflow to be used for all digitisation projects. This integrated workflow comprises three principal elements: a specimen workflow, a data workflow and an image workflow. The specimen workflow is strongly linked to curatorial processes which will impact on the prioritisation, selection and preparation of the specimens. The importance of including a conservation element within the digitisation workflow is highlighted. The data workflow includes the concept of three main categories of collection data: label data, curatorial data and supplementary data. It is shown that each category of data has its own properties which influence the timing of data capture within the workflow. Software has been developed for the rapid capture of curatorial data, and optical character recognition (OCR) software is being used to increase the efficiency of capturing label data and supplementary data. The large number and size of the images has necessitated the inclusion of automated systems within the image workflow.

  8. A Workflow to Improve the Alignment of Prostate Imaging with Whole-mount Histopathology.

    Science.gov (United States)

    Yamamoto, Hidekazu; Nir, Dror; Vyas, Lona; Chang, Richard T; Popert, Rick; Cahill, Declan; Challacombe, Ben; Dasgupta, Prokar; Chandra, Ashish

    2014-08-01

    Evaluation of prostate imaging tests against whole-mount histology specimens requires accurate alignment between radiologic and histologic data sets. Misalignment results in false-positive and -negative zones as assessed by imaging. We describe a workflow for three-dimensional alignment of prostate imaging data against whole-mount prostatectomy reference specimens and assess its performance against a standard workflow. Ethical approval was granted. Patients underwent motorized transrectal ultrasound (Prostate Histoscanning) to generate a three-dimensional image of the prostate before radical prostatectomy. The test workflow incorporated steps for axial alignment between imaging and histology, size adjustments following formalin fixation, and use of custom-made parallel cutters and digital caliper instruments. The control workflow comprised freehand cutting and assumed homogeneous block thicknesses at the same relative angles between pathology and imaging sections. Thirty radical prostatectomy specimens were histologically and radiologically processed, either by an alignment-optimized workflow (n = 20) or a control workflow (n = 10). The optimized workflow generated tissue blocks of heterogeneous thicknesses but with no significant drifting in the cutting plane. The control workflow resulted in significantly nonparallel blocks, accurately matching only one out of four histology blocks to their respective imaging data. The image-to-histology alignment accuracy was 20% greater in the optimized workflow than in the control workflow. Evaluation of prostate imaging biomarkers using whole-mount histology references should include a test-to-reference spatial alignment workflow. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.

  9. Quantitative workflow based on NN for weighting criteria in landfill suitability mapping

    Science.gov (United States)

    Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Alkhasawneh, Mutasem Sh.; Aziz, Hamidi Abdul

    2017-10-01

    Our study aims to introduce a new quantitative workflow that integrates neural networks (NNs) and multi-criteria decision analysis (MCDA). Existing MCDA workflows reveal a number of drawbacks because of their reliance on human knowledge in the weighting stage. Thus, a new workflow is presented that forms suitability maps at the regional scale for solid waste planning based on NNs. A feed-forward neural network is employed in the workflow. A total of 34 criteria were pre-processed to establish the input dataset for NN modelling. The final learned network is used to acquire the weights of the criteria. Accuracies of 95.2% and 93.2% were achieved for the training dataset and testing dataset, respectively. The workflow was found to be capable of reducing human interference and generating highly reliable maps. The proposed workflow demonstrates the applicability of NNs in generating landfill suitability maps and the feasibility of integrating them with existing MCDA workflows.
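
    A minimal sketch of the weighting idea, not the authors' implementation: train a small feed-forward network on labelled suitability samples (random toy data here), then derive one weight per criterion from the magnitudes of the learned input-layer weights.

        import numpy as np
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(0)
        X = rng.random((500, 34))                         # 500 samples, 34 criteria
        y = (X[:, 0] + 0.5 * X[:, 1] > 0.8).astype(int)   # toy suitability labels

        net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=1000, random_state=0)
        net.fit(X, y)

        # Mean absolute input-to-hidden weight per criterion, normalised to sum to 1.
        w = np.abs(net.coefs_[0]).mean(axis=1)
        criteria_weights = w / w.sum()
        print(criteria_weights.round(3))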

  10. When Workflow Management Systems and Logging Systems Meet: Analyzing Large-Scale Execution Traces

    Energy Technology Data Exchange (ETDEWEB)

    Gunter, Daniel

    2008-07-31

    This poster shows the benefits of integrating a workflow management system with logging and log mining capabilities. By combining two existing, mature technologies, Pegasus-WMS and NetLogger, we are able to efficiently process execution logs of earthquake science workflows consisting of hundreds of thousands to one million tasks. In particular, we show results of processing logs of CyberShake, a workflow application running on the TeraGrid. Client-side tools allow scientists to quickly gather statistics about a workflow run and find out which tasks executed, where they were executed, what their runtimes were, etc. These statistics can be used to understand the performance characteristics of a workflow and help tune the execution parameters of the workflow management system. This poster demonstrates the scalability of the system by presenting results of uploading task execution records into the system and of querying the system for overall workflow performance information.

  11. Workflow Modelling and Analysis Based on the Construction of Task Models

    Directory of Open Access Journals (Sweden)

    Glória Cravo

    2015-01-01

    Full Text Available In this paper we describe the structure of a workflow as a graph whose vertices represent tasks and whose arcs are associated with workflow transitions. To each task an input/output logic operator is associated. Furthermore, we associate a Boolean term with each transition present in the workflow. We then identify the structure of workflows and describe their dynamism through the construction of new task models. This construction is simple and intuitive, since it is based on the analysis of all tasks present in the workflow, which allows us to describe the dynamism of the workflow very easily; this intuitiveness is an important highlight of our work. We also introduce the concept of logical termination of workflows and provide conditions under which this property holds. Finally, we provide a counter-example which shows that a conjecture presented in a previous article is false.
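
    The graph representation can be sketched as follows; the Boolean terms are reduced to simple flags here, and the check shown is only a necessary condition for logical termination (every task must lie on a path from the input task to the output task), not the full criterion developed in the paper.

        transitions = {                  # arc -> Boolean term (here: simple flags)
            ("start", "t1"): True,
            ("t1", "t2"): True,
            ("t1", "t3"): True,
            ("t2", "end"): True,
            ("t3", "end"): True,
        }

        def reachable(src, graph):
            """Set of vertices reachable from src by depth-first search."""
            seen, stack = set(), [src]
            while stack:
                node = stack.pop()
                if node in seen:
                    continue
                seen.add(node)
                stack.extend(dst for (s, dst) in graph if s == node)
            return seen

        forward  = reachable("start", transitions)
        backward = reachable("end", {(b, a): t for (a, b), t in transitions.items()})
        tasks = {n for arc in transitions for n in arc}
        print("necessary condition holds:", tasks <= forward & backward)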

  12. A novel workflow for seismic net pay estimation with uncertainty

    OpenAIRE

    Glinsky, Michael E.; Baptiste, Dale; Unaldi, Muhlis; Nagassar, Vishal

    2016-01-01

    This paper presents a novel workflow for seismic net pay estimation with uncertainty. It is demonstrated on the Cassra/Iris Field. The theory for the stochastic wavelet derivation (which estimates the seismic noise level along with the wavelet, time-to-depth mapping, and their uncertainties), the stochastic sparse spike inversion, and the net pay estimation (using secant areas) along with its uncertainty will be outlined. This includes benchmarking of this methodology on a synthetic model.

  13. ESO Reflex: A Graphical Workflow Engine for Data Reduction

    Science.gov (United States)

    Hook, R.; Romaniello, M.; Péron, M.; Ballester, P.; Gabasch, A.; Izzo, C.; Ullgrén, M.; Maisala, S.; Oittinen, T.; Solin, O.; Savolainen, V.; Järveläinen, P.; Tyynelä, J.

    2008-08-01

    Sampo {http://www.eso.org/sampo} (Hook et al. 2005) is a project led by ESO and conducted by a software development team from Finland as an in-kind contribution to joining ESO. The goal is to assess the needs of the ESO community in the area of data reduction environments and to create pilot software products that illustrate critical steps along the road to a new system. Those prototypes will not only be used to validate concepts and understand requirements but will also be tools of immediate value for the community. Most of the raw data produced by ESO instruments can be reduced using CPL {http://www.eso.org/cpl} recipes: compiled C programs following an ESO standard and utilizing routines provided by the Common Pipeline Library. Currently reduction recipes are run in batch mode as part of the data flow system to generate the input to the ESO VLT/VLTI quality control process and are also made public for external users. Sampo has developed a prototype application called ESO Reflex {http://www.eso.org/sampo/reflex/} that integrates a graphical user interface and existing data reduction algorithms. ESO Reflex can invoke CPL-based recipes in a flexible way through a dedicated interface. ESO Reflex is based on the graphical workflow engine Taverna {http://taverna.sourceforge.net} that was originally developed by the UK eScience community, mostly for work in the life sciences. Workflows have been created so far for three VLT/VLTI instrument modes (VIMOS/IFU {http://www.eso.org/instruments/vimos/}, FORS spectroscopy {http://www.eso.org/instruments/fors/} and AMBER {http://www.eso.org/instruments/amber/}), and the easy-to-use GUI allows the user to make changes to these or create workflows of their own. Python scripts and IDL procedures can be easily brought into workflows and a variety of visualisation and display options, including custom product inspection and validation steps, are available.

  14. Knowledge Management and Information Systems based on Workflow Technology

    OpenAIRE

    Martínez Toro, Iván; Gallego Vico, Daniel; Salvachúa Rodríguez, Joaquín

    2011-01-01

    Knowledge management is critical for the success of virtual communities, especially in the case of distributed working groups. A representative example of this scenario is distributed software development, where optimal coordination is necessary to avoid common problems such as duplicated work. In this paper the feasibility of using workflow technology as a knowledge management system is discussed, and a practical use case is presented.

  15. Yadage and Packtivity – analysis preservation using parametrized workflows

    Science.gov (United States)

    Cranmer, Kyle; Heinrich, Lukas

    2017-10-01

    Preserving data analyses produced by the collaborations at the LHC in a parametrized fashion is crucial in order to maintain reproducibility and re-usability. We argue for a declarative description in terms of individual processing steps – “packtivities” – linked through a dynamic directed acyclic graph (DAG), and present an initial set of JSON schemas for such a description as well as an implementation – “yadage” – capable of executing workflows of analyses preserved via Linux containers.
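
    The flavour of such a declarative, parametrized description can be sketched as below; the dictionary layout only loosely imitates the idea of packtivity step templates chained in a DAG and is not the actual JSON schema, and the shell commands are placeholders.

        import graphlib  # standard library, Python >= 3.9

        steps = {
            "select": {"process": "select.sh {input} selected.root", "needs": []},
            "fit":    {"process": "fit.sh selected.root fit.json",   "needs": ["select"]},
            "plot":   {"process": "plot.sh fit.json limits.pdf",     "needs": ["fit"]},
        }

        # Resolve the DAG into an execution order, then instantiate the templates.
        order = graphlib.TopologicalSorter(
            {name: set(step["needs"]) for name, step in steps.items()}
        ).static_order()

        for name in order:
            cmd = steps[name]["process"].format(input="data.root")
            print(f"[{name}] would run in a container: {cmd}")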

  16. Robust Execution of Service Workflows Using Redundancy and Advance Reservations

    OpenAIRE

    Stein, S; Payne, T; Jennings, N

    2010-01-01

    In this paper, we develop a novel algorithm that allows service consumers to execute business processes (or workflows) of interdependent services in a dependable manner within tight time-constraints. In particular, we consider large inter-organisational service-oriented systems, where services are offered by external organisations that demand financial remuneration and where their use has to be negotiated in advance using explicit service-level agreements (as is common in Grids and cloud computing).

  17. Multi-perspective workflow modeling for online surgical situation models.

    Science.gov (United States)

    Franke, Stefan; Meixensberger, Jürgen; Neumuth, Thomas

    2015-04-01

    Surgical workflow management is expected to enable situation-aware adaptation and intelligent systems behavior in an integrated operating room (OR). The overall aim is to unburden the surgeon and OR staff from both manual maintenance and information seeking tasks. A major step toward intelligent systems behavior is a stable classification of the surgical situation from multiple perspectives based on performed low-level tasks. The present work proposes a method for the classification of surgical situations based on multi-perspective workflow modeling. A model network that interconnects different types of surgical process models is described. Various aspects of a surgical situation description were considered: low-level tasks, high-level tasks, patient status, and the use of medical devices. A study with sixty neurosurgical interventions was conducted to evaluate the performance of our approach and its robustness against incomplete workflow recognition input. A correct classification rate of over 90% was measured for high-level tasks and patient status. The device usage models for navigation and neurophysiology classified over 95% of the situations correctly, whereas the ultrasound usage was more difficult to predict. Overall, the classification rate decreased with an increasing level of input distortion. Autonomous adaptation of medical devices and intelligent systems behavior do not currently depend solely on low-level tasks. Instead, they require a more general type of understanding of the surgical condition. The integration of various surgical process models in a network provided a comprehensive representation of the interventions and allowed for the generation of extensive situation descriptions. Multi-perspective surgical workflow modeling and online situation models will be a significant pre-requisite for reliable and intelligent systems behavior. Hence, they will contribute to a cooperative OR environment. Copyright © 2015 Elsevier Inc. All rights reserved.

  18. Advanced Workflows for Fluid Transfer in Faulted Basins

    Directory of Open Access Journals (Sweden)

    Thibaut Muriel

    2014-07-01

    Full Text Available The traditional 3D basin modeling workflow is made of the following steps: construction of present-day basin architecture, reconstruction of the structural evolution through time, together with fluid flow simulation and heat transfers. In this case, the forward simulation is limited to basin architecture, mainly controlled by erosion, sedimentation and vertical compaction. The tectonic deformation is limited to vertical slip along faults. Fault properties are modeled as vertical shear zones along which rock permeability is adjusted to enhance fluid flow or prevent flow from escaping. For basins that have experienced a more complex tectonic history, this approach is over-simplified. It fails to capture and represent the fluid flow paths due to the structural evolution of the basin, which impacts overpressure build-up and the location of petroleum resources. Over the past years, a new 3D basin forward code has been developed at IFP Energies nouvelles that is based on a cell-centered finite volume discretization which preserves mass on an unstructured grid and describes the various changes in geometry and topology of a basin through time. At the same time, 3D restoration tools based on geomechanical principles of strain minimization were made available that offer a structural scenario at a discrete number of deformation stages of the basin. In this paper, we present workflows integrating these different innovative tools on complex faulted basin architectures, where complex means moderate lateral as well as vertical deformation coupled with dynamic fault property modeling. Two synthetic case studies inspired by real basins have been used to illustrate how to apply the workflow, where the difficulties in the workflow lie, and what the added value is compared with previous basin modeling approaches.

  19. A STRUCTURAL MODEL OF AN EXCAVATOR WORKFLOW CONTROL SYSTEM

    Directory of Open Access Journals (Sweden)

    A. Gurko

    2016-12-01

    Full Text Available Improving earthwork is connected with excavator automation. In this paper, on the basis of an analysis of the problems that a hydraulic excavator control system has to solve, a hierarchical structure for the control system is proposed. A decomposition of the control process was carried out, which allowed the development of a structural model reflecting the characteristics of a multilevel, spatially distributed control system of an excavator workflow.

  20. GO2OGS 1.0: a versatile workflow to integrate complex geological information with fault data into numerical simulation models

    Science.gov (United States)

    Fischer, T.; Naumov, D.; Sattler, S.; Kolditz, O.; Walther, M.

    2015-11-01

    We offer a versatile workflow to convert geological models built with the Paradigm™ GOCAD© (Geological Object Computer Aided Design) software into the open-source VTU (Visualization Toolkit unstructured grid) format for usage in numerical simulation models. Tackling relevant scientific questions or engineering tasks often involves multidisciplinary approaches. Conversion workflows are needed as a way of communication between the diverse tools of the various disciplines. Our approach offers an open-source, platform-independent, robust, and comprehensible method that is potentially useful for a multitude of environmental studies. With two application examples in the Thuringian Syncline, we show how a heterogeneous geological GOCAD model including multiple layers and faults can be used for numerical groundwater flow modeling, in our case employing the OpenGeoSys open-source numerical toolbox for groundwater flow simulations. The presented workflow offers the chance to incorporate increasingly detailed data, utilizing the growing availability of computational power to simulate numerical models.
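
    Only as an illustration of the target side of such a conversion (not the GO2OGS tool itself): once vertices and cells have been parsed from a GOCAD ASCII export, the open-source meshio library can write them to VTU, here with a hypothetical per-cell material ID of the kind groundwater models use.

        import meshio
        import numpy as np

        # A single tetrahedron standing in for cells parsed from a GOCAD file.
        points = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0],
                           [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
        cells = [("tetra", np.array([[0, 1, 2, 3]]))]

        # One material ID per cell, e.g. a geological layer or fault zone.
        mesh = meshio.Mesh(points, cells, cell_data={"MaterialIDs": [np.array([0])]})
        meshio.write("model.vtu", mesh)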

  1. Understanding the impact on intensive care staff workflow due to the introduction of a critical care information system: a mixed methods research methodology.

    Science.gov (United States)

    Shaw, N T; Mador, R L; Ho, S; Mayes, D; Westbrook, J I; Creswick, N; Thiru, K; Brown, M

    2009-01-01

    The Intensive Care Unit (ICU) is a complex and dynamic tertiary care environment that requires health care providers to balance many competing tasks and responsibilities. Inefficient and interruption-driven workflow is believed to increase the likelihood of medical errors and, therefore, to present a serious risk to patients in the ICU. The introduction of a Critical Care Information System (CCIS) is purported to result in fewer medical errors and better patient care by streamlining workflow. Little objective research, however, has investigated these assertions. This paper reports on the design of a research methodology to explore the impact of a CCIS on the workflow of Respiratory Therapists, Pediatric Intensivists, Nurses, and Unit Clerks in a Pediatric ICU (PICU) and a General Systems ICU (GSICU) in Northern Canada.

  2. Optimal Workflow Scheduling in Critical Infrastructure Systems with Neural Networks

    Directory of Open Access Journals (Sweden)

    S. Vukmirović

    2012-04-01

    Full Text Available Critical infrastructure systems (CISs), such as power grids, transportation systems, communication networks and water systems, are the backbone of a country’s national security and industrial prosperity. These CISs execute large numbers of workflows with very high resource requirements that can span different systems and last for a long time. The proper functioning and synchronization of these workflows is essential, since humanity’s well-being is connected to it. Because of this, the challenge of ensuring availability and reliability of these services in the face of a broad range of operating conditions is very complicated. This paper proposes an architecture which dynamically executes a scheduling algorithm using feedback about the current status of CIS nodes. Different artificial neural networks (ANNs) were created in order to solve the scheduling problem. Their performances were compared and, as the main result of this paper, an optimal ANN architecture for workflow scheduling in CISs is proposed. A case study is shown for a meter data management system with measurements from a power distribution management system in Serbia. Performance tests show that a significant improvement of the overall execution time can be achieved by ANNs.

  3. Reproducible Research Data Analyses using the Common Workflow Language standards

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    This talk will introduce the Common Workflow Language project. In July 2016 the project released standards that enable the portable, interoperable, and executable description of command line data analysis tools and workflows made from those tools. These descriptions are enhanced by CWL's first-class (but optional) support for Docker containers. CWL originated from the world of bioinformatics but is not discipline specific and is gaining interest and use in other fields. Attendees who want to play with CWL prior to attending the presentation are invited to go through the "Gentle Introduction to the Common Workflow Language" tutorial on any OS X or Linux machine on their own time. About the speaker: Michael R. Crusoe is one of the co-founders of the CWL project and is the CWL Community Engineer. His facilitation, technical contributions, and training on behalf of the project draw from his time as the former lead developer of C. Titus Brown's khmer project and from his previous career as a sysadmin and programmer.
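
    For readers who want a taste before trying the tutorial, the sketch below emits a minimal CWL CommandLineTool, essentially the "hello world" of the standard, from Python using PyYAML; it can then be run with the reference runner as `cwltool echo.cwl --message hello`.

        import yaml  # PyYAML

        # Minimal CommandLineTool: wrap `echo` with one positional string input.
        tool = {
            "cwlVersion": "v1.0",
            "class": "CommandLineTool",
            "baseCommand": "echo",
            "inputs": {
                "message": {"type": "string", "inputBinding": {"position": 1}},
            },
            "outputs": [],
        }

        with open("echo.cwl", "w") as f:
            yaml.safe_dump(tool, f, sort_keys=False)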

  4. Integrating Process Mining and Cognitive Analysis to Study EHR Workflow.

    Science.gov (United States)

    Furniss, Stephanie K; Burton, Matthew M; Grando, Adela; Larson, David W; Kaufman, David R

    2016-01-01

    There are numerous methods to study workflow. However, few produce the kinds of in-depth analyses needed to understand EHR-mediated workflow. Here we investigated variations in clinicians' EHR workflow by integrating quantitative analysis of patterns of users' EHR interactions with in-depth qualitative analysis of user performance. We characterized 6 clinicians' patterns of information-gathering using a sequential process-mining approach. The analysis revealed 519 different screen transition patterns performed across 1569 patient cases. No one pattern was followed for more than 10% of patient cases, the 15 most frequent patterns accounted for over half of patient cases (53%), and 27% of cases exhibited unique patterns. By triangulating quantitative and qualitative analyses, we found that participants' EHR-interactive behavior was associated with their routine processes, patient case complexity, and EHR default settings. The proposed approach has significant potential to inform resource allocation for observation and training. In-depth observations helped us to explain variation across users.
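
    The transition-pattern counting at the heart of such a sequential process-mining step can be sketched in a few lines; the screen names and event sequences below are made up for illustration.

        from collections import Counter

        cases = [                                # one screen sequence per patient case
            ["notes", "labs", "meds", "orders"],
            ["notes", "labs", "meds"],
            ["labs", "notes", "meds", "orders"],
        ]

        def ngrams(seq, n=2):
            """Sliding windows of length n over a sequence."""
            return zip(*(seq[i:] for i in range(n)))

        transition_counts = Counter(t for case in cases for t in ngrams(case))
        for pattern, count in transition_counts.most_common(3):
            print(" -> ".join(pattern), count)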

  5. THERMAL REMOTE SENSING WITH UAV-BASED WORKFLOWS

    Directory of Open Access Journals (Sweden)

    R. Boesch

    2017-08-01

    Full Text Available Climate change will have a significant influence on vegetation health and growth. Predictions of higher mean summer temperatures and prolonged summer droughts may pose a threat to agricultural areas and forest canopies. Rising canopy temperatures can be an indicator of plant stress because of the closure of stomata and a decrease in the transpiration rate. Thermal cameras have been available for decades, but they are still often used for single-image analysis, only in an oblique-view manner, or with visual evaluation of video sequences. Remote sensing with a thermal camera can therefore be an important data source for understanding transpiration processes. Photogrammetric workflows allow thermal images to be processed much like RGB data, but the low spatial resolution of thermal cameras, significant optical distortion and typically low contrast require an adapted workflow. Temperature distribution in forest canopies is typically completely unknown and less distinct than for urban or industrial areas, where metal constructions and surfaces yield high contrast and sharp edge information. The aim of this paper is to investigate the influence of interior camera orientation, tie point matching and ground control points on the resulting accuracy of bundle adjustment and dense cloud generation with a typically used photogrammetric workflow for UAV-based thermal imagery in natural environments.

  6. ATLAS Job Transforms: A Data Driven Workflow Engine

    Science.gov (United States)

    Stewart, G. A.; Breaden-Madden, W. B.; Maddocks, H. J.; Harenberg, T.; Sandhoff, M.; Sarrazin, B.

    2014-06-01

    The need to run complex workflows for a high energy physics experiment such as ATLAS has always been present. However, as computing resources have become even more constrained compared to the wealth of data generated by the LHC, the need to use resources efficiently and manage complex workflows within a single grid job has increased. In ATLAS, a new Job Transform framework has been developed, which we describe in this paper. This framework manages the multiple execution steps needed to 'transform' one data type into another (e.g., RAW data to ESD to AOD to final ntuple) and also provides a consistent interface for the ATLAS production system. The new framework uses a data-driven workflow definition which is both easy to manage and powerful. After a transform is defined, jobs are expressed simply by specifying the input data and the desired output data. The transform infrastructure then executes only the necessary substeps to produce the final data products. The global execution cost of running the job is minimised and the transform can adapt to scenarios where data can be produced along different execution paths. Transforms for specific physics tasks which support up to 60 individual substeps have been successfully run. As the new transform infrastructure has been deployed in production, many features have been added to the framework which improve reliability, quality of error reporting and also provide support for multi-process jobs.
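
    A toy sketch of the data-driven idea (not the ATLAS framework itself): given the available input type and the requested output, a breadth-first search over the registered substeps selects only the chain needed to connect them.

        from collections import deque

        substeps = {                 # (input type, output type) -> substep name
            ("RAW", "ESD"): "RAWtoESD",
            ("ESD", "AOD"): "ESDtoAOD",
            ("AOD", "NTUP"): "AODtoNTUP",
        }

        def plan(src, dst):
            """Shortest chain of substeps turning src data into dst data."""
            q, seen = deque([(src, [])]), {src}
            while q:
                fmt, path = q.popleft()
                if fmt == dst:
                    return path
                for (a, b), step in substeps.items():
                    if a == fmt and b not in seen:
                        seen.add(b)
                        q.append((b, path + [step]))
            raise ValueError(f"no execution path from {src} to {dst}")

        print(plan("RAW", "NTUP"))   # ['RAWtoESD', 'ESDtoAOD', 'AODtoNTUP']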

  7. Thermal Remote Sensing with Uav-Based Workflows

    Science.gov (United States)

    Boesch, R.

    2017-08-01

    Climate change will have a significant influence on vegetation health and growth. Predictions of higher mean summer temperatures and prolonged summer droughts may pose a threat to agricultural areas and forest canopies. Rising canopy temperatures can be an indicator of plant stress because of the closure of stomata and a decrease in the transpiration rate. Thermal cameras have been available for decades, but they are still often used for single-image analysis, only in an oblique-view manner, or with visual evaluation of video sequences. Remote sensing with a thermal camera can therefore be an important data source for understanding transpiration processes. Photogrammetric workflows allow thermal images to be processed much like RGB data, but the low spatial resolution of thermal cameras, significant optical distortion and typically low contrast require an adapted workflow. Temperature distribution in forest canopies is typically completely unknown and less distinct than for urban or industrial areas, where metal constructions and surfaces yield high contrast and sharp edge information. The aim of this paper is to investigate the influence of interior camera orientation, tie point matching and ground control points on the resulting accuracy of bundle adjustment and dense cloud generation with a typically used photogrammetric workflow for UAV-based thermal imagery in natural environments.

  8. The CMS tracker calibration workflow: experience with cosmic ray data.

    CERN Document Server

    Frosali, Simone

    2009-01-01

    During the second part of 2008, CMS commissioning was performed with the acquisition of cosmic events in global runs. Cosmic rays detected in the muon chambers were used to trigger the readout of all CMS subdetectors in the general data acquisition system. A total of about 300M tracks were collected by the CMS muon chambers with a 3.8 T magnetic field produced by the CMS superconducting solenoid, 6M of which pointed to the tracker region and were reconstructed by the Si-Strip Tracker (SST) detectors. Another 1M cosmic tracks were collected with the magnetic field off. Using the available cosmic data it was possible to validate the performance of the CMS tracker calibration workflows. In this paper the adopted calibration workflow is described. In particular, the three main calibration workflows required for the low-level reconstruction of the SST, i.e. gain calibration, Lorentz angle calibration and bad component identification, are described. The results obtained using cosmic tracks for these three calibration workflows are presented.

  9. AnalyzeThis: An Analysis Workflow-Aware Storage System

    Energy Technology Data Exchange (ETDEWEB)

    Sim, Hyogi [ORNL; Kim, Youngjae [ORNL; Vazhkudai, Sudharshan S [ORNL; Tiwari, Devesh [ORNL; Anwar, Ali [Virginia Tech, Blacksburg, VA; Butt, Ali R [Virginia Tech, Blacksburg, VA; Ramakrishnan, Lavanya [Lawrence Berkeley National Laboratory (LBNL)

    2015-01-01

    The need for novel data analysis is urgent in the face of a data deluge from modern applications. Traditional approaches to data analysis incur significant data movement costs, moving data back and forth between the storage system and the processor. Emerging Active Flash devices enable processing on the flash, where the data already resides. An array of such Active Flash devices allows us to revisit how analysis workflows interact with storage systems. By seamlessly blending together the flash storage and data analysis, we create an analysis workflow-aware storage system, AnalyzeThis. Our guiding principle is that analysis-awareness be deeply ingrained in each and every layer of the storage, elevating data analyses as first-class citizens, and transforming AnalyzeThis into a potent analytics-aware appliance. We implement the AnalyzeThis storage system atop an emulation platform of the Active Flash array. Our results indicate that AnalyzeThis is viable, expediting workflow execution and minimizing data movement.

  10. Improved compliance by BPM-driven workflow automation.

    Science.gov (United States)

    Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin

    2014-12-01

    Using methods and technologies of business process management (BPM) for laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of model-driven development is the improved transparency of processes and the alignment of process diagrams and technical code. First experiences of using the Business Process Model and Notation (BPMN) show that easy-to-read graphical process models can achieve and provide standardization of laboratory workflows. Model-based development allows processes to be changed quickly and adapted easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs for quality assurance. As an automation language, BPMN 2.0 can control every kind of activity or subprocess and covers complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (i.e., screening procedures or analytical processes). This means that, with the BPM standard, a method for sharing process knowledge between laboratories is also available. © 2014 Society for Laboratory Automation and Screening.

  11. The medical simulation markup language - simplifying the biomechanical modeling workflow.

    Science.gov (United States)

    Suwelack, Stefan; Stoll, Markus; Schalck, Sebastian; Schoch, Nicolai; Dillmann, Rüdiger; Bendl, Rolf; Heuveline, Vincent; Speidel, Stefanie

    2014-01-01

    Modeling and simulation of the human body by means of continuum mechanics has become an important tool in diagnostics, computer-assisted interventions and training. This modeling approach seeks to construct patient-specific biomechanical models from tomographic data. Usually many different tools, such as segmentation and meshing algorithms, are involved in this workflow. In this paper we present a generalized and flexible description for biomechanical models. The unique feature of the new modeling language is that it describes not only the final biomechanical simulation, but also the workflow by which the biomechanical model is constructed from tomographic data. In this way, the MSML can act as a middleware between all tools used in the modeling pipeline. The MSML thus greatly facilitates the prototyping of medical simulation workflows for clinical and research purposes. In this paper, we not only detail the XML-based modeling scheme, but also present a concrete implementation. Different examples highlight the flexibility, robustness and ease of use of the approach.

  12. DIGITAL WORKFLOWS FOR RESTORATION AND MANAGEMENT OF THE MUSEUM AFFANDI - A CASE STUDY IN CHALLENGING CIRCUMSTANCES

    Directory of Open Access Journals (Sweden)

    U. Herbig

    2017-08-01

    Full Text Available The appropriate restoration of architectural heritage needs careful and comprehensive documentation of the existing structures, which becomes even more elaborate if the function of the building needs special attention, as in museums. In a collaborative project between the Universitas Gadjah Mada, Yogyakarta, Indonesia and two universities in Austria (TU Wien and the Danube University Krems), a restoration and adaptation concept for the Affandi Museum in Yogyakarta is currently in progress. It provides a perfect case study for the development of a workflow to combine data from a building survey, architectural research, indoor climate measurements and the documentation of artwork in a challenging environment, from hot and humid tropical climate to continuous threats from natural hazards such as earthquakes or volcanic eruptions. The Affandi Museum houses the collection of Affandi, who is considered to be Indonesia's foremost Expressionist painter and partly designed and constructed the museum himself. With the spirit of the artist still perceptible in the complex, the Affandi Museum is an important part of the Indonesian cultural heritage. Its preservation therefore demands special attention and adds to the complexity of the development of a monitoring and maintenance concept. This paper describes the ongoing development of an approach to a workflow from the measurement and research of the objects, both architectural and artwork, to the semantically enriched BIM model as the basis for a sustainable monitoring tool for the Affandi Museum.

  13. Optimizing Multiple QoS for Workflow Applications using PSO and Min-Max Strategy

    Science.gov (United States)

    Umar Ambursa, Faruku; Latip, Rohaya; Abdullah, Azizol; Subramaniam, Shamala

    2017-08-01

    Workflow scheduling under multiple QoS constraints is a complicated optimization problem. Metaheuristic techniques are excellent approaches for dealing with such problems, and many metaheuristic-based algorithms have been proposed that consider various economic and trust-related QoS dimensions. However, most of these approaches lead to high violation of user-defined QoS requirements in tight situations. Recently, a new Particle Swarm Optimization (PSO)-based QoS-aware workflow scheduling strategy (LAPSO) was proposed to improve performance in such situations. The LAPSO algorithm is designed around the synergy between a violation handling method and a hybrid of PSO and a min-max heuristic. Simulation results showed a great potential of the LAPSO algorithm for honouring user requirements even in tight situations. In this paper, the performance of the algorithm is analysed further. Specifically, the impact of the min-max strategy on the performance of the algorithm is revealed. This is achieved by removing the violation handling from the operation of the algorithm. The results show that LAPSO based on only the min-max method still outperforms the benchmark, even though LAPSO with the violation handling performs significantly better.
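
    To make the search concrete, here is a bare-bones global-best PSO over a toy task-to-service assignment; the cost function, constants and encoding are made up, and the real LAPSO additionally applies the min-max heuristic and violation handling discussed above.

        import numpy as np

        rng = np.random.default_rng(1)
        n_tasks, n_services = 8, 4   # particle positions encode task -> service choices

        def cost(x):
            """Toy weighted QoS objective: faster services are assumed pricier."""
            assign = np.clip(np.rint(x), 0, n_services - 1)
            exec_time = (assign + 1.0).sum()       # higher service index = slower
            price = (n_services - assign).sum()    # faster service = higher price
            return 0.7 * exec_time + 0.3 * price

        pos = rng.uniform(0, n_services - 1, (30, n_tasks))
        vel = np.zeros_like(pos)
        pbest, pbest_f = pos.copy(), np.apply_along_axis(cost, 1, pos)
        gbest = pbest[pbest_f.argmin()].copy()

        for _ in range(100):
            r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
            vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
            pos = pos + vel
            f = np.apply_along_axis(cost, 1, pos)
            better = f < pbest_f
            pbest[better], pbest_f[better] = pos[better], f[better]
            gbest = pbest[pbest_f.argmin()].copy()

        print("best assignment:", np.clip(np.rint(gbest), 0, n_services - 1).astype(int))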

  14. MCRUNJOB: A High energy physics workflow planner for grid production processing

    Energy Technology Data Exchange (ETDEWEB)

    Graham, Gregory E.

    2004-08-26

    McRunjob is a powerful grid workflow manager used to generate large numbers of production processing jobs in High Energy Physics. In use at both the DZero and CMS experiments, McRunjob has managed large Monte Carlo production processing since 1999 and is being extended to regular production processing for analysis and reconstruction. Described at CHEP 2001, McRunjob converts core metadata into jobs submittable in a variety of environments. The powerful core metadata description language includes methods for converting the metadata into persistent forms, job descriptions, multi-step workflows, and data provenance information. The language features allow for structure in the metadata by including full expressions, namespaces, functional dependencies, site-specific parameters in a grid environment, and ontological definitions. It also has simple control structures for parallelization of large jobs. McRunjob features a modular design which allows for easy expansion to new job description languages or new application-level tasks.

  15. Workflow for Criticality Assessment Applied in Biopharmaceutical Process Validation Stage 1

    Directory of Open Access Journals (Sweden)

    Thomas Zahel

    2017-10-01

    Full Text Available Identification of critical process parameters that impact product quality is a central task during regulatory requested process validation. Commonly, this is done via design of experiments and identification of parameters significantly impacting product quality (rejection of the null hypothesis that the effect equals 0). However, parameters which show a large uncertainty, and might therefore push product quality beyond a limit critical to the product, may be missed. This can occur during the evaluation of experiments when the residual/un-modelled variance in the experiments is larger than expected a priori. Estimating such a risk is the task of the presented novel retrospective power analysis permutation test. This is evaluated using a data set for two unit operations established during characterization of a biopharmaceutical process in industry. The results show that, for one unit operation, the observed variance in the experiments is much larger than expected a priori, resulting in low power levels for all non-significant parameters. Moreover, we present a workflow for mitigating the risk associated with overlooked parameter effects. This enables a statistically sound identification of critical process parameters. The developed workflow will substantially support industry in delivering constant product quality, reducing process variance and increasing patient safety.
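
    The statistical core can be sketched as below: a permutation test for one parameter effect, followed by a retrospective power estimate asking how often an effect as large as an assumed criticality limit would be detected given the observed residual variance; all numbers are illustrative, and this is not the authors' exact procedure.

        import numpy as np

        rng = np.random.default_rng(42)
        low, high = rng.normal(0.0, 1.0, 8), rng.normal(0.4, 1.0, 8)  # toy DoE responses

        def perm_test(a, b, n=2000):
            """Two-sided permutation test for a difference in means."""
            obs = abs(b.mean() - a.mean())
            pooled, na = np.concatenate([a, b]), len(a)
            hits = 0
            for _ in range(n):
                rng.shuffle(pooled)
                hits += abs(pooled[na:].mean() - pooled[:na].mean()) >= obs
            return hits / n

        print("p-value:", perm_test(low, high))

        # Retrospective power: with the observed residual spread, how often would
        # an effect at the (assumed) criticality limit be declared significant?
        limit = 1.0
        sigma = np.concatenate([low - low.mean(), high - high.mean()]).std()
        power = np.mean([
            perm_test(rng.normal(0, sigma, 8), rng.normal(limit, sigma, 8), n=200) < 0.05
            for _ in range(200)
        ])
        print("estimated power:", power)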

  16. Digital Workflows for Restoration and Management of the Museum Affandi - a Case Study in Challenging Circumstances

    Science.gov (United States)

    Herbig, U.; Styhler-Aydın, G.; Grandits, D.; Stampfer, L.; Pont, U.; Mayer, I.

    2017-08-01

    The appropriate restoration of architectural heritage needs careful and comprehensive documentation of the existing structures, which becomes even more elaborate if the function of the building needs special attention, as in museums. In a collaborative project between the Universitas Gadjah Mada, Yogyakarta, Indonesia and two universities in Austria (TU Wien and the Danube University Krems), a restoration and adaptation concept for the Affandi Museum in Yogyakarta is currently in progress. It provides a perfect case study for the development of a workflow to combine data from a building survey, architectural research, indoor climate measurements and the documentation of artwork in a challenging environment, from hot and humid tropical climate to continuous threats from natural hazards such as earthquakes or volcanic eruptions. The Affandi Museum houses the collection of Affandi, who is considered to be Indonesia's foremost Expressionist painter and partly designed and constructed the museum himself. With the spirit of the artist still perceptible in the complex, the Affandi Museum is an important part of the Indonesian cultural heritage. Its preservation therefore demands special attention and adds to the complexity of the development of a monitoring and maintenance concept. This paper describes the ongoing development of an approach to a workflow from the measurement and research of the objects, both architectural and artwork, to the semantically enriched BIM model as the basis for a sustainable monitoring tool for the Affandi Museum.

  17. Optimized workflow and imaging protocols for whole-body oncologic PET/MRI.

    Science.gov (United States)

    Ishii, Shirou; Hara, Takamitsu; Nanbu, Takeyuki; Suenaga, Hiroki; Sugawara, Shigeyasu; Kuroiwa, Daichi; Sekino, Hirofumi; Miyajima, Masayuki; Kubo, Hitoshi; Oriuchi, Noboru; Ito, Hiroshi

    2016-11-01

    Although PET/MRI has the advantages of simultaneous acquisition of PET and MRI, high soft-tissue contrast of the MRI images, and reduced radiation exposure, its low profitability and long acquisition time are significant problems in clinical settings. Thus, MRI protocols that meet oncological purposes need to be used in order to reduce examination time while securing detectability. Currently, half-Fourier acquisition single-shot turbo spin echo and 3D-T1 volumetric interpolated breath-hold examination may be the most commonly used sequences for whole-body imaging due to their shorter acquisition time and higher diagnostic accuracy. Although there have been several reports to date that adding diffusion-weighted imaging (DWI) to a PET/MRI protocol has no effect on tumor detection, in cases of liver, kidney, bladder, and prostate cancer the use of DWI may be beneficial in detecting lesions. Another possible option is to scan each region with different MRI sequences instead of scanning the whole body with one sequence continuously. We herein report a workflow and imaging protocols for whole-body oncologic PET/MRI using an integrated system in the clinical routine, designed for the detection of metastatic lesions, for example in cancer screening, in order to help future users optimize their workflows and imaging protocols.

  18. Flexible End2End Workflow Automation of Hit-Discovery Research.

    Science.gov (United States)

    Holzmüller-Laue, Silke; Göde, Bernd; Thurow, Kerstin

    2014-08-01

    The article considers a new approach to more complex laboratory automation at the workflow layer. The authors propose the automation of end2end workflows. The combination of all relevant subprocesses (whether automated or manually performed, independently, and in whichever organizational unit) results in end2end processes that include all result dependencies. The end2end approach focuses not only on the classical experiments in synthesis or screening, but also on auxiliary processes such as the production and storage of chemicals, cell culturing, and maintenance, as well as preparatory activities and analyses of experiments. Furthermore, the connection of control flow and data flow in the same process model reduces the effort of data transfer between the involved systems, including the necessary data transformations. This end2end laboratory automation can be realized effectively with the modern methods of business process management (BPM). The approach is based on the recent standardization of the process-modeling notation Business Process Model and Notation 2.0. In drug discovery, several scientific disciplines act together, with manifold modern methods, technologies, and a wide range of automated instruments for the discovery and design of target-based drugs. The article discusses the novel BPM-based automation concept with an implemented example of a high-throughput screening of previously synthesized compound libraries. © 2014 Society for Laboratory Automation and Screening.

  19. Inter-laboratory evaluation of instrument platforms and experimental workflows for quantitative accuracy and reproducibility assessment

    Directory of Open Access Journals (Sweden)

    Andrew J. Percy

    2015-09-01

    Full Text Available The reproducibility of plasma protein quantitation between laboratories and between instrument types was examined in a large-scale international study involving 16 laboratories and 19 LC–MS/MS platforms, using two kits designed to evaluate instrument performance and one kit designed to evaluate the entire bottom-up workflow. There was little effect of instrument type on the quality of the results, demonstrating the robustness of LC/MRM-MS with isotopically labeled standards. Technician skill was a factor, as errors in sample preparation and sub-optimal LC–MS performance were evident. This highlights the importance of proper training and routine quality control before quantitation is done on patient samples.

  20. Workflow in Clinical Trial Sites & Its Association with Near Miss Events for Data Quality: Ethnographic, Workflow & Systems Simulation

    Science.gov (United States)

    Araujo de Carvalho, Elias Cesar; Batilana, Adelia Portero; Claudino, Wederson; Lima Reis, Luiz Fernando; Schmerling, Rafael A.; Shah, Jatin; Pietrobon, Ricardo

    2012-01-01

    Background: With the exponential expansion of clinical trials conducted in BRIC (Brazil, Russia, India, and China) and VISTA (Vietnam, Indonesia, South Africa, Turkey, and Argentina) countries, corresponding gains in cost and enrolment efficiency quickly outpace the consonant metrics in traditional countries in North America and the European Union. However, questions still remain regarding the quality of data being collected in these countries. We used ethnographic, mapping and computer simulation studies to identify and address areas of threat for near miss events for data quality in two cancer trial sites in Brazil. Methodology/Principal Findings: Two sites in Sao Paulo and Rio de Janeiro were evaluated using ethnographic observations of workflow during subject enrolment and data collection. Emerging themes related to threats of near miss events for data quality were derived from observations. They were then transformed into workflows using UML-AD and modeled using System Dynamics. 139 tasks were observed and mapped through the ethnographic study. The UML-AD detected four major activities in the workflow: evaluation of potential research subjects prior to signature of informed consent, visits to obtain the subject's informed consent, regular data collection sessions following the study protocol, and closure of the study protocol for a given project. Field observations pointed to three major emerging themes: (a) lack of a standardized process for data registration at the source document, (b) multiplicity of data repositories and (c) scarcity of decision support systems at the point of research intervention. Simulation with the policy model demonstrated a reduction of the rework problem. Conclusions/Significance: Patterns of threats to data quality at the two sites were similar to the threats reported in the literature for American sites. Clinical trial site managers need to reorganize staff workflow by using information technology more efficiently and establishing new standard procedures.

  1. WorkflowNet2BPEL4WS: A Tool for Translating Unstructured Workflow Processes to Readable BPEL

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; van der Aalst, Wil M. P.

    2007-01-01

    Tools that translate workflow models to BPEL typically generate unreadable code that is not easy to use by end-users. Therefore, we provide a mapping from WF-nets to BPEL. This mapping builds on the rich theory of Petri nets and can also be used to map other languages (e.g., UML, EPC, BPMN, etc.) onto BPEL. To evaluate WorkflowNet2BPEL4WS we used more than 100 processes modeled as WF-nets.

  2. Considering Time in Orthophotography Production: from a General Workflow to a Shortened Workflow for a Faster Disaster Response

    Science.gov (United States)

    Lucas, G.

    2015-08-01

    This article deals with the production time of orthophoto imagery acquired with a medium-size digital frame camera. The workflow examination follows two main parts: data acquisition and post-processing. The objectives of the research are fourfold: 1/ gathering time references for the most important steps of orthophoto production (it turned out that literature is missing on this topic); these figures are used later for total production time estimation; 2/ identifying levers for reducing orthophoto production time; 3/ building a simplified production workflow for emergency response, less demanding in accuracy and faster, and comparing it to a classical workflow; 4/ providing methodical elements for the estimation of production time for a custom project. In the data acquisition part, a comprehensive review lists and describes all the factors that may affect acquisition efficiency. Using a simulation with different variables (average line length, time of the turns, flight speed), their effect on acquisition efficiency is quantitatively examined, as sketched below. Regarding post-processing, the time reference figures were collected from the processing of a 1000-frame case study with 15 cm GSD covering a rectangular area of 447 km2; the time required to achieve each step of the production is written down. When several technical options are possible, each one is tested and its time documented so that all alternatives are available. Based on a technical choice within the workflow and using the compiled time references of the elementary steps, a total time is calculated for the post-processing of the 1000 frames. Two scenarios are compared with regard to time and accuracy. The first one follows the "normal" practices, comprising triangulation, orthorectification and advanced mosaicking methods (feature detection, seam line editing and seam applicator); the second is simplified and makes compromises on positional accuracy (using direct geo-referencing) and seam line preparation in order to achieve a shorter production time.
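
    A back-of-the-envelope version of the acquisition simulation mentioned above: total flight time from line count, average line length, flight speed and per-turn time. All numbers are illustrative, not the paper's figures.

        def acquisition_time_h(n_lines, line_length_m, speed_ms, turn_time_s):
            """Hours spent flying the lines plus turning between them."""
            flying = n_lines * line_length_m / speed_ms
            turning = (n_lines - 1) * turn_time_s
            return (flying + turning) / 3600.0

        # e.g. 40 lines of 15 km at 60 m/s with 120 s turns:
        print(round(acquisition_time_h(40, 15_000, 60, 120), 1), "h")

        # The share of time lost in turns grows as lines get shorter:
        for length in (5_000, 10_000, 20_000):
            t_fly, t_turn = 40 * length / 60, 39 * 120
            print(length, "m lines -> turn share =", round(t_turn / (t_fly + t_turn), 2))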

  3. CONSIDERING TIME IN ORTHOPHOTOGRAPHY PRODUCTION: FROM A GENERAL WORKFLOW TO A SHORTENED WORKFLOW FOR A FASTER DISASTER RESPONSE

    Directory of Open Access Journals (Sweden)

    G. Lucas

    2015-08-01

    Full Text Available This article deals with the production time of orthophoto imagery acquired with a medium-size digital frame camera. The workflow examination follows two main parts: data acquisition and post-processing. The objectives of the research are fourfold: 1/ gathering time references for the most important steps of orthophoto production (it turned out that literature is missing on this topic); these figures are used later for total production time estimation; 2/ identifying levers for reducing orthophoto production time; 3/ building a simplified production workflow for emergency response, less demanding in accuracy and faster, and comparing it to a classical workflow; 4/ providing methodical elements for the estimation of production time for a custom project. In the data acquisition part, a comprehensive review lists and describes all the factors that may affect acquisition efficiency. Using a simulation with different variables (average line length, time of the turns, flight speed), their effect on acquisition efficiency is quantitatively examined. Regarding post-processing, the time reference figures were collected from the processing of a 1000-frame case study with 15 cm GSD covering a rectangular area of 447 km2; the time required to achieve each step of the production is written down. When several technical options are possible, each one is tested and its time documented so that all alternatives are available. Based on a technical choice within the workflow and using the compiled time references of the elementary steps, a total time is calculated for the post-processing of the 1000 frames. Two scenarios are compared with regard to time and accuracy. The first one follows the “normal” practices, comprising triangulation, orthorectification and advanced mosaicking methods (feature detection, seam line editing and seam applicator); the second is simplified and makes compromises on positional accuracy (using direct geo-referencing) and seam line preparation in order to achieve a shorter production time.

  4. BioWMS: a web-based Workflow Management System for bioinformatics.

    Science.gov (United States)

    Bartocci, Ezio; Corradini, Flavio; Merelli, Emanuela; Scortichini, Lorenzo

    2007-03-08

    An in-silico experiment can be naturally specified as a workflow of activities implementing, in a standardized environment, the process of data and control analysis. A workflow has the advantage of being reproducible, traceable and compositional by reusing other workflows. In order to support the daily work of a bioscientist, several Workflow Management Systems (WMSs) have been proposed in bioinformatics. Generally, these systems centralize workflow enactment and do not exploit standard process definition languages to describe workflows in a reusable way. While almost all WMSs require heavy stand-alone applications to specify new workflows, only a few provide a web-based process definition tool. We have developed BioWMS, a Workflow Management System that supports, through a web-based interface, the definition, execution and results management of an in-silico experiment. BioWMS has been implemented over an agent-based middleware. It dynamically generates, from a user workflow specification, a domain-specific, agent-based workflow engine. Our approach exploits the proactiveness and mobility of agent-based technology to embed the application domain features inside agents' behaviour. Agents are workflow executors, and the resulting workflow engine is a multiagent system: a distributed, concurrent system, typically open, flexible and adaptive. A demo is available at http://litbio.unicam.it:8080/biowms. BioWMS, supported by the Hermes mobile computing middleware, guarantees the flexibility, scalability and fault tolerance required for workflow enactment over distributed and heterogeneous environments. BioWMS is funded by the FIRB project LITBIO (Laboratory for Interdisciplinary Technologies in Bioinformatics).
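
    The agent-generated engine itself is not shown in the abstract, but the core enactment idea (tasks dispatched to concurrent executors as soon as their dependencies have produced results) can be sketched in a few lines. This is a minimal Python analogy with invented task names, not BioWMS code.

    ```python
    from concurrent.futures import ThreadPoolExecutor

    # Toy workflow: name -> (dependencies, callable over earlier results).
    WORKFLOW = {
        "fetch":  ((),          lambda r: "raw sequences"),
        "align":  (("fetch",),  lambda r: f"aligned({r['fetch']})"),
        "report": (("align",),  lambda r: f"report({r['align']})"),
    }

    def enact(workflow):
        """Run every task as soon as all of its dependencies have results."""
        results, pending = {}, dict(workflow)
        with ThreadPoolExecutor() as pool:
            while pending:
                ready = [name for name, (deps, _) in pending.items()
                         if all(d in results for d in deps)]
                futures = {name: pool.submit(pending[name][1], results)
                           for name in ready}
                for name, future in futures.items():
                    results[name] = future.result()
                    del pending[name]
        return results

    print(enact(WORKFLOW)["report"])   # report(aligned(raw sequences))
    ```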

  5. TCGA Workflow: Analyze cancer genomics and epigenomics data using Bioconductor packages [version 2; referees: 1 approved, 2 approved with reservations]

    Directory of Open Access Journals (Sweden)

    Tiago C. Silva

    2016-12-01

    Full Text Available Biotechnological advances in sequencing have led to an explosion of publicly available data via large international consortia such as The Cancer Genome Atlas (TCGA), The Encyclopedia of DNA Elements (ENCODE), and The NIH Roadmap Epigenomics Mapping Consortium (Roadmap). These projects have provided unprecedented opportunities to interrogate the epigenome of cultured cancer cell lines as well as normal and tumor tissues with high genomic resolution. The Bioconductor project offers more than 1,000 open-source software and statistical packages to analyze high-throughput genomic data. However, most packages are designed for specific data types (e.g. expression, epigenetics, genomics), and there is no single comprehensive tool that provides a complete integrative analysis of the resources and data provided by all three public projects. A need to create an integration of these different analyses was recently proposed. In this workflow, we provide a series of biologically focused integrative analyses of different molecular data. We describe how to download, process and prepare TCGA data, and, by harnessing several key Bioconductor packages, how to extract biologically meaningful genomic and epigenomic data. Using Roadmap and ENCODE data, we provide a work plan to identify biologically relevant functional epigenomic elements associated with cancer. To illustrate our workflow, we analyzed two types of brain tumors: low-grade glioma (LGG) versus high-grade glioma (glioblastoma multiforme, or GBM). This workflow introduces the following Bioconductor packages: AnnotationHub, ChIPSeeker, ComplexHeatmap, pathview, ELMER, GAIA, MINET, RTCGAToolbox, TCGAbiolinks.

  6. Enhancing clinical effectiveness of pre-radiotherapy workflow by using multidisciplinary-cooperating e-control and e-alerts: A SQUIRE-compliant quality-improving study.

    Science.gov (United States)

    Lin, Yung-Hsiang; Hung, Shih-Kai; Lee, Moon-Sing; Chiou, Wen-Yen; Lai, Chun-Liang; Shih, Yi-Ting; Yeh, Pei-Han; Lin, Yi-An; Tsai, Wei-Ta; Hsieh, Hui-Ling; Chen, Liang-Cheng; Huang, Li-Wen; Lin, Po-Hao; Liu, Dai-Wei; Hsu, Feng-Chun; Tsai, Shiang-Jiun; Liu, Jia-Chi; Chung, En-Seu; Lin, Hon-Yi

    2017-06-01

    Radiotherapy (RT) is useful in managing cancer diseases. In clinical practice, early initiation of RT is crucial for enhancing tumor control, but delivering precise RT requires a series of pre-RT working processes performed in tight staff cooperation. In this regard, using an information system to conduct e-control and e-alerts has been suggested to improve practice effectiveness; however, this effect is not well defined in a real-world RT setting. We designed an information system to perform e-control and e-alerts for the whole pre-RT workflow, in order to shorten processing time, improve overall staff satisfaction, and enhance working confidence. This was a quality-improving study conducted in a large RT center. Externally validated data were retrospectively analyzed for comparison before (from Sep. 2012 to Dec. 2012, n = 223) and after (from Sep. 2013 to Dec. 2013, n = 240) implementation of pre-RT e-control and e-alerts. Applying the e-control with delay-working e-alerts in the pre-RT workflow was the main intervention. Nine workstations were identified in the pre-RT workflow. The primary outcome measure was the processing time at each pre-RT workstation before and after implementing the e-control and e-alerts. Secondary measures were staff working confidence and near-miss cases during the pre-RT workflow. After implementing e-control, the overall processing time of the pre-RT workflow was shortened from 12.2 days to 8.9 days (P workflow. Clinical effectiveness, staff satisfaction, and working confidence can thus be clearly enhanced.
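
    The delay-working e-alert mechanism lends itself to a simple rule: flag any workstation whose task has been open longer than its time budget. The sketch below assumes hypothetical workstation names and budgets, since the paper does not publish its thresholds here.

    ```python
    from datetime import datetime, timedelta

    # Hypothetical per-workstation time budgets (days); the study identified
    # nine pre-RT workstations, but its actual thresholds are not given here.
    BUDGET_DAYS = {"simulation": 2, "contouring": 3, "planning": 3, "QA": 1}

    def delay_alerts(open_tasks, now):
        """Yield an e-alert for every workstation still open past its budget."""
        for station, started in open_tasks.items():
            deadline = started + timedelta(days=BUDGET_DAYS[station])
            if now > deadline:
                yield f"ALERT: {station} overdue by {(now - deadline).days} day(s)"

    open_tasks = {"contouring": datetime(2013, 9, 2), "QA": datetime(2013, 9, 8)}
    for alert in delay_alerts(open_tasks, now=datetime(2013, 9, 9)):
        print(alert)   # ALERT: contouring overdue by 4 day(s)
    ```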

  7. The impact of missing sensor information on surgical workflow management.

    Science.gov (United States)

    Liebmann, Philipp; Meixensberger, Jürgen; Wiedemann, Peter; Neumuth, Thomas

    2013-09-01

    Sensor systems in the operating room may encounter intermittent data losses that reduce the performance of surgical workflow management systems (SWFMS). Sensor data loss could impact SWFMS-based decision support, device parameterization, and information presentation. The purpose of this study was to understand the robustness of surgical process models when sensor information is partially missing. We tested SWFMS changes caused by incorrect or missing data from the sensor system that tracks the progress of a surgical intervention. The individual surgical process models (iSPMs) from 100 cataract procedures performed by 3 ophthalmologic surgeons were used to select a randomized subset and create a generalized surgical process model (gSPM). A disjoint subset of the iSPMs was selected and used to simulate the surgical process against the gSPM. The loss of sensor data was simulated by removing some information from one task in the iSPM. The effect of missing sensor data was measured using several metrics: (a) successful relocation of the path in the gSPM, (b) the number of steps needed to find the converging point, and (c) the perspective with the highest occurrence of unsuccessful path findings. A gSPM built from 30% of the iSPMs successfully found the correct path in 90% of the cases. The most critical sensor data were the information regarding the instrument used by the surgeon. We found that using a gSPM to provide input data for a SWFMS is robust and can be accurate despite missing sensor data. A surgical workflow management system can provide the surgeon with workflow guidance in the OR in most cases. Sensor systems for surgical process tracking can be evaluated based on the stability and accuracy of functional and spatial operative results.
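
    To make the relocation metric concrete, here is a minimal sketch in which one observed sensor reading is dropped and the observed task stream is realigned against a single reference path. A real gSPM merges many iSPMs; the task names here are invented.

    ```python
    MODEL_PATH = ["incision", "phaco", "irrigation", "lens_insertion", "suture"]

    def relocate(observed, model):
        """Return the model step at which the streams re-converge, else None."""
        position = 0
        for task in observed:
            # Skip model tasks until the observed task is found again.
            while position < len(model) and model[position] != task:
                position += 1
            if position == len(model):
                return None          # relocation failed
            position += 1
        return position

    observed = ["incision", "irrigation", "lens_insertion"]  # "phaco" was lost
    print("re-converged at model step:", relocate(observed, MODEL_PATH))
    ```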

  8. Accelerating Medical Research using the Swift Workflow System

    Science.gov (United States)

    STEF-PRAUN, Tiberiu; CLIFFORD, Benjamin; FOSTER, Ian; HASSON, Uri; HATEGAN, Mihael; SMALL, Steven L.; WILDE, Michael; ZHAO, Yong

    2009-01-01

    Both medical research and clinical practice are starting to involve large quantities of data and to require large-scale computation, as a result of the digitization of many areas of medicine. For example, in brain research – the domain that we consider here – a single research study may require the repeated processing, using computationally demanding and complex applications, of thousands of files corresponding to hundreds of functional MRI studies. Execution efficiency demands the use of parallel or distributed computing, but few medical researchers have the time or expertise to write the necessary parallel programs. The Swift system addresses these concerns. A simple scripting language, SwiftScript, provides for the concise high-level specification of workflows that invoke various application programs on potentially large quantities of data. The Swift engine provides for the efficient execution of these workflows on sequential computers, parallel computers, and/or distributed grids that federate the computing resources of many sites. Last but not least, the Swift provenance catalog keeps track of all actions performed, addressing vital bookkeeping functions that so often cause difficulties in large computations. To illustrate the use of Swift for medical research, we describe its use for the analysis of functional MRI data as part of a research project examining the neurological mechanisms of recovery from aphasia after stroke. We show how SwiftScript is used to encode an application workflow, and present performance results that demonstrate our ability to achieve significant speedups on both a local parallel computing cluster and multiple parallel clusters at distributed sites. PMID:17476063
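
    The SwiftScript pattern of mapping one application over many input files can be imitated in plain Python; the sketch below uses a process pool in place of the Swift engine and a placeholder function where a real fMRI tool would be invoked, with a hypothetical directory layout.

    ```python
    from concurrent.futures import ProcessPoolExecutor
    from pathlib import Path

    def preprocess(path: Path) -> str:
        # Placeholder for invoking an external application on one scan,
        # e.g. subprocess.run(["align_tool", str(path)]).
        return f"{path.name}: processed"

    if __name__ == "__main__":
        scans = sorted(Path("fmri_runs").glob("*.nii"))   # hypothetical layout
        with ProcessPoolExecutor() as pool:               # parallel fan-out
            for line in pool.map(preprocess, scans):
                print(line)
    ```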

  9. A practical data processing workflow for multi-OMICS projects.

    Science.gov (United States)

    Kohl, Michael; Megger, Dominik A; Trippler, Martin; Meckel, Hagen; Ahrens, Maike; Bracht, Thilo; Weber, Frank; Hoffmann, Andreas-Claudius; Baba, Hideo A; Sitek, Barbara; Schlaak, Jörg F; Meyer, Helmut E; Stephan, Christian; Eisenacher, Martin

    2014-01-01

    Multi-OMICS approaches aim at the integration of quantitative data obtained for different biological molecules in order to understand their interrelation and the functioning of larger systems. This paper deals with several data integration and data processing issues that frequently occur within this context. To this end, the data processing workflow within the PROFILE project is presented, a multi-OMICS project that aims at the identification of novel biomarkers and the development of new therapeutic targets for seven important liver diseases. Furthermore, a software tool called CrossPlatformCommander is sketched, which facilitates several steps of the proposed workflow in a semi-automatic manner. Application of the software is presented for the detection of novel biomarkers, their ranking and annotation with existing knowledge, using the example of corresponding Transcriptomics and Proteomics data sets obtained from patients suffering from hepatocellular carcinoma. Additionally, a linear regression analysis of Transcriptomics vs. Proteomics data is presented and its performance assessed. It was shown that, for capturing profound relations between Transcriptomics and Proteomics data, a simple linear regression analysis is not sufficient, and implementation and evaluation of alternative statistical approaches are needed. Additionally, the integration of multivariate variable selection and classification approaches is intended for further development of the software. Although this paper focuses only on the combination of data obtained from quantitative Proteomics and Transcriptomics experiments, several approaches and data integration steps are also applicable to other OMICS technologies. Keeping specific restrictions in mind, the suggested workflow (or at least parts of it) may be used as a template for similar projects that make use of different high-throughput techniques. This article is part of a Special Issue entitled: Computational Proteomics in the Post
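
    As a toy version of the regression experiment described above, the following sketch fits an ordinary least-squares line to synthetic mRNA/protein abundances and reports the coefficient of determination; the weak simulated coupling yields the kind of modest R^2 that motivates the authors' call for richer models.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic matched abundances (log scale) standing in for the study's
    # real Transcriptomics / Proteomics measurements.
    mrna = rng.normal(8.0, 2.0, size=500)
    protein = 0.4 * mrna + rng.normal(0.0, 1.5, size=500)

    slope, intercept = np.polyfit(mrna, protein, deg=1)     # OLS fit
    pred = slope * mrna + intercept
    r2 = 1 - ((protein - pred) ** 2).sum() / ((protein - protein.mean()) ** 2).sum()
    print(f"slope={slope:.2f}  intercept={intercept:.2f}  R^2={r2:.2f}")
    ```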

  10. Automated manipulation of systems biology models using libSBML within Taverna workflows.

    Science.gov (United States)

    Li, Peter; Oinn, Tom; Soiland, Stian; Kell, Douglas B

    2008-01-15

    Many data manipulation processes involve the use of programming libraries. These processes may beneficially be automated due to their repeated use. A convenient type of automation is in the form of workflows that also allow such processes to be shared amongst the community. The Taverna workflow system has been extended to enable it to use and invoke Java classes and methods as tasks within Taverna workflows. These classes and methods are selected for use during workflow construction by a Java Doclet application called the API Consumer. This selection is stored as an XML file which enables Taverna to present the subset of the API for use in the composition of workflows. The ability of Taverna to invoke Java classes and methods is demonstrated by a workflow in which we use libSBML to map gene expression data onto a metabolic pathway represented as an SBML model. Taverna and the API Consumer application can be freely downloaded from http://taverna.sourceforge.net
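
    The API Consumer's role, exposing only a vetted subset of a library as workflow tasks, can be mimicked generically. In this Python sketch a dict stands in for the XML selection file, and the whitelisted names are resolved into callables at load time; all names are illustrative.

    ```python
    import importlib

    # Hypothetical stand-in for the API Consumer's XML selection file.
    SELECTED = {"math": ["sqrt", "log10"]}

    def load_tasks(selection):
        """Resolve whitelisted module members into invocable workflow tasks."""
        tasks = {}
        for module_name, names in selection.items():
            module = importlib.import_module(module_name)
            for name in names:
                tasks[f"{module_name}.{name}"] = getattr(module, name)
        return tasks

    tasks = load_tasks(SELECTED)
    print(tasks["math.sqrt"](256.0))   # a single task invocation: 16.0
    ```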

  11. Inter-observer reliability assessments in time motion studies: the foundation for meaningful clinical workflow analysis.

    Science.gov (United States)

    Lopetegui, Marcelo A; Bai, Shasha; Yen, Po-Yin; Lai, Albert; Embi, Peter; Payne, Philip R O

    2013-01-01

    Understanding clinical workflow is critical for researchers and healthcare decision makers. Current workflow studies tend to oversimplify and underrepresent the complexity of clinical workflow. Continuous observation time motion studies (TMS) could enhance clinical workflow studies by providing rich quantitative data required for in-depth workflow analyses. However, methodological inconsistencies have been reported in continuous observation TMS, potentially reducing the validity of TMS' data and limiting their contribution to the general state of knowledge. We believe that a cornerstone in standardizing TMS is to ensure the reliability of the human observers. In this manuscript we review the approaches for inter-observer reliability assessment (IORA) in a representative sample of TMS focusing on clinical workflow. We found that IORA is an uncommon practice, inconsistently reported, and often uses methods that provide partial and overestimated measures of agreement. Since a comprehensive approach to IORA is yet to be proposed and validated, we provide initial recommendations for IORA reporting in continuous observation TMS.
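
    One widely used chance-corrected agreement statistic for such assessments is Cohen's kappa. The sketch below computes it for two observers' codings of the same event stream (invented data) and shows how raw percent agreement overstates the corrected value, the overestimation problem the abstract mentions.

    ```python
    from collections import Counter

    def cohens_kappa(codes_a, codes_b):
        """Chance-corrected agreement between two observers' task codings."""
        n = len(codes_a)
        observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
        counts_a, counts_b = Counter(codes_a), Counter(codes_b)
        expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n ** 2
        return (observed - expected) / (1 - expected)

    obs1 = ["chart", "talk", "exam", "chart", "talk",
            "exam", "chart", "talk", "chart", "exam"]
    obs2 = ["chart", "talk", "exam", "talk", "talk",
            "exam", "chart", "chart", "chart", "exam"]

    agree = sum(a == b for a, b in zip(obs1, obs2)) / len(obs1)
    print(f"percent agreement = {agree:.2f}, kappa = {cohens_kappa(obs1, obs2):.2f}")
    ```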

  12. Observing health professionals' workflow patterns for diabetes care - First steps towards an ontology for EHR services.

    Science.gov (United States)

    Schweitzer, M; Lasierra, N; Hoerbst, A

    2015-01-01

    Increasing flexibility from a user perspective and enabling workflow-based interaction facilitate an easy, user-friendly utilization of EHRs in healthcare professionals' daily work. To offer such versatile EHR functionality, our approach is based on the execution of clinical workflows by means of a composition of semantic web services. The backbone of such an architecture is an ontology which enables the representation of clinical workflows and facilitates the selection of suitable services. In this paper we present the methods and results of observations of routine diabetes consultations, which were conducted in order to identify those workflows and the relations among the included tasks. The identified workflows were first modeled in BPMN and then generalized. As a next step in our study, interviews will be conducted with clinical personnel to validate the modeled workflows.

  13. CMS Data Processing Workflows during an Extended Cosmic Ray Run

    Energy Technology Data Exchange (ETDEWEB)

    2009-11-01

    The CMS Collaboration conducted a month-long data taking exercise, the Cosmic Run At Four Tesla, during October-November 2008, with the goal of commissioning the experiment for extended operation. With all installed detector systems participating, CMS recorded 270 million cosmic ray events with the solenoid at a magnetic field strength of 3.8 T. This paper describes the data flow from the detector through the various online and offline computing systems, as well as the workflows used for recording the data, for aligning and calibrating the detector, and for analysis of the data.

  14. CMS Data Processing Workflows during an Extended Cosmic Ray Run

    CERN Document Server

    Chatrchyan, S; Sirunyan, A M; Adam, W; Arnold, B; Bergauer, H; Bergauer, T; Dragicevic, M; Eichberger, M; Erö, J; Friedl, M; Frühwirth, R; Ghete, V M; Hammer, J; Hänsel, S; Hoch, M; Hörmann, N; Hrubec, J; Jeitler, M; Kasieczka, G; Kastner, K; Krammer, M; Liko, D; Magrans de Abril, I; Mikulec, I; Mittermayr, F; Neuherz, B; Oberegger, M; Padrta, M; Pernicka, M; Rohringer, H; Schmid, S; Schöfbeck, R; Schreiner, T; Stark, R; Steininger, H; Strauss, J; Taurok, A; Teischinger, F; Themel, T; Uhl, D; Wagner, P; Waltenberger, W; Walzel, G; Widl, E; Wulz, C E; Chekhovsky, V; Dvornikov, O; Emeliantchik, I; Litomin, A; Makarenko, V; Marfin, I; Mossolov, V; Shumeiko, N; Solin, A; Stefanovitch, R; Suarez Gonzalez, J; Tikhonov, A; Fedorov, A; Karneyeu, A; Korzhik, M; Panov, V; Zuyeuski, R; Kuchinsky, P; Beaumont, W; Benucci, L; Cardaci, M; De Wolf, E A; Delmeire, E; Druzhkin, D; Hashemi, M; Janssen, X; Maes, T; Mucibello, L; Ochesanu, S; Rougny, R; Selvaggi, M; Van Haevermaet, H; Van Mechelen, P; Van Remortel, N; Adler, V; Beauceron, S; Blyweert, S; D'Hondt, J; De Weirdt, S; Devroede, O; Heyninck, J; Kalogeropoulos, A; Maes, J; Maes, M; Mozer, M U; Tavernier, S; Van Doninck, W; Van Mulders, P; Villella, I; Bouhali, O; Chabert, E C; Charaf, O; Clerbaux, B; De Lentdecker, G; Dero, V; Elgammal, S; Gay, A P R; Hammad, G H; Marage, P E; Rugovac, S; Vander Velde, C; Vanlaer, P; Wickens, J; Grunewald, M; Klein, B; Marinov, A; Ryckbosch, D; Thyssen, F; Tytgat, M; Vanelderen, L; Verwilligen, P; Basegmez, S; Bruno, G; Caudron, J; Delaere, C; Demin, P; Favart, D; Giammanco, A; Grégoire, G; Lemaitre, V; Militaru, O; Ovyn, S; Piotrzkowski, K; Quertenmont, L; Schul, N; Beliy, N; Daubie, E; Alves, G A; Pol, M E; Souza, M H G; Carvalho, W; De Jesus Damiao, D; De Oliveira Martins, C; Fonseca De Souza, S; Mundim, L; Oguri, V; Santoro, A; Silva Do Amaral, S M; Sznajder, A; Fernandez Perez Tomei, T R; Ferreira Dias, M A; Gregores, E M; Novaes, S F; Abadjiev, K; Anguelov, T; Damgov, J; Darmenov, N; Dimitrov, L; Genchev, V; Iaydjiev, P; Piperov, S; Stoykova, S; Sultanov, G; Trayanov, R; Vankov, I; Dimitrov, A; Dyulendarova, M; Kozhuharov, V; Litov, L; Marinova, E; Mateev, M; Pavlov, B; Petkov, P; Toteva, Z; Chen, G M; Chen, H S; Guan, W; Jiang, C H; Liang, D; Liu, B; Meng, X; Tao, J; Wang, J; Wang, Z; Xue, Z; Zhang, Z; Ban, Y; Cai, J; Ge, Y; Guo, S; Hu, Z; Mao, Y; Qian, S J; Teng, H; Zhu, B; Avila, C; Baquero Ruiz, M; Carrillo Montoya, C A; Gomez, A; Gomez Moreno, B; Ocampo Rios, A A; Osorio Oliveros, A F; Reyes Romero, D; Sanabria, J C; Godinovic, N; Lelas, K; Plestina, R; Polic, D; Puljak, I; Antunovic, Z; Dzelalija, M; Brigljevic, V; Duric, S; Kadija, K; Morovic, S; Fereos, R; Galanti, M; Mousa, J; Papadakis, A; Ptochos, F; Razis, P A; Tsiakkouri, D; Zinonos, Z; Hektor, A; Kadastik, M; Kannike, K; Müntel, M; Raidal, M; Rebane, L; Anttila, E; Czellar, S; Härkönen, J; Heikkinen, A; Karimäki, V; Kinnunen, R; Klem, J; Kortelainen, M J; Lampén, T; Lassila-Perini, K; Lehti, S; Lindén, T; Luukka, P; Mäenpää, T; Nysten, J; Tuominen, E; Tuominiemi, J; Ungaro, D; Wendland, L; Banzuzi, K; Korpela, A; Tuuva, T; Nedelec, P; Sillou, D; Besancon, M; Chipaux, R; Dejardin, M; Denegri, D; Descamps, J; Fabbro, B; Faure, J L; Ferri, F; Ganjour, S; Gentit, F X; Givernaud, A; Gras, P; Hamel de Monchenault, G; Jarry, P; Lemaire, M C; Locci, E; Malcles, J; Marionneau, M; Millischer, L; Rander, J; Rosowsky, A; Rousseau, D; Titov, M; Verrecchia, P; Baffioni, S; Bianchini, L; Bluj, M; Busson, P; Charlot, C; Dobrzynski, L; Granier de Cassagnac, 
R; Haguenauer, M; Miné, P; Paganini, P; Sirois, Y; Thiebaux, C; Zabi, A; Agram, J L; Besson, A; Bloch, D; Bodin, D; Brom, J M; Conte, E; Drouhin, F; Fontaine, J C; Gelé, D; Goerlach, U; Gross, L; Juillot, P; Le Bihan, A C; Patois, Y; Speck, J; Van Hove, P; Baty, C; Bedjidian, M; Blaha, J; Boudoul, G; Brun, H; Chanon, N; Chierici, R; Contardo, D; Depasse, P; Dupasquier, T; El Mamouni, H; Fassi, F; Fay, J; Gascon, S; Ille, B; Kurca, T; Le Grand, T; Lethuillier, M; Lumb, N; Mirabito, L; Perries, S; Vander Donckt, M; Verdier, P; Djaoshvili, N; Roinishvili, N; Roinishvili, V; Amaglobeli, N; Adolphi, R; Anagnostou, G; Brauer, R; Braunschweig, W; Edelhoff, M; Esser, H; Feld, L; Karpinski, W; Khomich, A; Klein, K; Mohr, N; Ostaptchouk, A; Pandoulas, D; Pierschel, G; Raupach, F; Schael, S; Schultz von Dratzig, A; Schwering, G; Sprenger, D; Thomas, M; Weber, M; Wittmer, B; Wlochal, M; Actis, O; Altenhöfer, G; Bender, W; Biallass, P; Erdmann, M; Fetchenhauer, G; Frangenheim, J; Hebbeker, T; Hilgers, G; Hinzmann, A; Hoepfner, K; Hof, C; Kirsch, M; Klimkovich, T; Kreuzer, P; Lanske, D; Merschmeyer, M; Meyer, A; Philipps, B; Pieta, H; Reithler, H; Schmitz, S A; Sonnenschein, L; Sowa, M; Steggemann, J; Szczesny, H; Teyssier, D; Zeidler, C; Bontenackels, M; Davids, M; Duda, M; Flügge, G; Geenen, H; Giffels, M; Haj Ahmad, W; Hermanns, T; Heydhausen, D; Kalinin, S; Kress, T; Linn, A; Nowack, A; Perchalla, L; Poettgens, M; Pooth, O; Sauerland, P; Stahl, A; Tornier, D; Zoeller, M H; Aldaya Martin, M; Behrens, U; Borras, K; Campbell, A; Castro, E; Dammann, D; Eckerlin, G; Flossdorf, A; Flucke, G; Geiser, A; Hatton, D; Hauk, J; Jung, H; Kasemann, M; Katkov, I; Kleinwort, C; Kluge, H; Knutsson, A; Kuznetsova, E; Lange, W; Lohmann, W; Mankel, R; Marienfeld, M; Meyer, A B; Miglioranzi, S; Mnich, J; Ohlerich, M; Olzem, J; Parenti, A; Rosemann, C; Schmidt, R; Schoerner-Sadenius, T; Volyanskyy, D; Wissing, C; Zeuner, W D; Autermann, C; Bechtel, F; Draeger, J; Eckstein, D; Gebbert, U; Kaschube, K; Kaussen, G; Klanner, R; Mura, B; Naumann-Emme, S; Nowak, F; Pein, U; Sander, C; Schleper, P; Schum, T; Stadie, H; Steinbrück, G; Thomsen, J; Wolf, R; Bauer, J; Blüm, P; Buege, V; Cakir, A; Chwalek, T; De Boer, W; Dierlamm, A; Dirkes, G; Feindt, M; Felzmann, U; Frey, M; Furgeri, A; Gruschke, J; Hackstein, C; Hartmann, F; Heier, S; Heinrich, M; Held, H; Hirschbuehl, D; Hoffmann, K H; Honc, S; Jung, C; Kuhr, T; Liamsuwan, T; Martschei, D; Mueller, S; Müller, Th; Neuland, M B; Niegel, M; Oberst, O; Oehler, A; Ott, J; Peiffer, T; Piparo, D; Quast, G; Rabbertz, K; Ratnikov, F; Ratnikova, N; Renz, M; Saout, C; Sartisohn, G; Scheurer, A; Schieferdecker, P; Schilling, F P; Schott, G; Simonis, H J; Stober, F M; Sturm, P; Troendle, D; Trunov, A; Wagner, W; Wagner-Kuhr, J; Zeise, M; Zhukov, V; Ziebarth, E B; Daskalakis, G; Geralis, T; Karafasoulis, K; Kyriakis, A; Loukas, D; Markou, A; Markou, C; Mavrommatis, C; Petrakou, E; Zachariadou, A; Gouskos, L; Katsas, P; Panagiotou, A; Evangelou, I; Kokkas, P; Manthos, N; Papadopoulos, I; Patras, V; Triantis, F A; Bencze, G; Boldizsar, L; Debreczeni, G; Hajdu, C; Hernath, S; Hidas, P; Horvath, D; Krajczar, K; Laszlo, A; Patay, G; Sikler, F; Toth, N; Vesztergombi, G; Beni, N; Christian, G; Imrek, J; Molnar, J; Novak, D; Palinkas, J; Szekely, G; Szillasi, Z; Tokesi, K; Veszpremi, V; Kapusi, A; Marian, G; Raics, P; Szabo, Z; Trocsanyi, Z L; Ujvari, B; Zilizi, G; Bansal, S; Bawa, H S; Beri, S B; Bhatnagar, V; Jindal, M; Kaur, M; Kaur, R; Kohli, J M; Mehta, M Z; Nishu, N; Saini, L K; Sharma, A; 
Singh, A; Singh, J B; Singh, S P; Ahuja, S; Arora, S; Bhattacharya, S; Chauhan, S; Choudhary, B C; Gupta, P; Jain, S; Jha, M; Kumar, A; Ranjan, K; Shivpuri, R K; Srivastava, A K; Choudhury, R K; Dutta, D; Kailas, S; Kataria, S K; Mohanty, A K; Pant, L M; Shukla, P; Topkar, A; Aziz, T; Guchait, M; Gurtu, A; Maity, M; Majumder, D; Majumder, G; Mazumdar, K; Nayak, A; Saha, A; Sudhakar, K; Banerjee, S; Dugad, S; Mondal, N K; Arfaei, H; Bakhshiansohi, H; Fahim, A; Jafari, A; Mohammadi Najafabadi, M; Moshaii, A; Paktinat Mehdiabadi, S; Rouhani, S; Safarzadeh, B; Zeinali, M; Felcini, M; Abbrescia, M; Barbone, L; Chiumarulo, F; Clemente, A; Colaleo, A; Creanza, D; Cuscela, G; De Filippis, N; De Palma, M; De Robertis, G; Donvito, G; Fedele, F; Fiore, L; Franco, M; Iaselli, G; Lacalamita, N; Loddo, F; Lusito, L; Maggi, G; Maggi, M; Manna, N; Marangelli, B; My, S; Natali, S; Nuzzo, S; Papagni, G; Piccolomo, S; Pierro, G A; Pinto, C; Pompili, A; Pugliese, G; Rajan, R; Ranieri, A; Romano, F; Roselli, G; Selvaggi, G; Shinde, Y; Silvestris, L; Tupputi, S; Zito, G; Abbiendi, G; Bacchi, W; Benvenuti, A C; Boldini, M; Bonacorsi, D; Braibant-Giacomelli, S; Cafaro, V D; Caiazza, S S; Capiluppi, P; Castro, A; Cavallo, F R; Codispoti, G; Cuffiani, M; D'Antone, I; Dallavalle, G M; Fabbri, F; Fanfani, A; Fasanella, D; Giacomelli, P; Giordano, V; Giunta, M; Grandi, C; Guerzoni, M; Marcellini, S; Masetti, G; Montanari, A; Navarria, F L; Odorici, F; Pellegrini, G; Perrotta, A; Rossi, A M; Rovelli, T; Siroli, G; Torromeo, G; Travaglini, R; Albergo, S; Costa, S; Potenza, R; Tricomi, A; Tuve, C; Barbagli, G; Broccolo, G; Ciulli, V; Civinini, C; D'Alessandro, R; Focardi, E; Frosali, S; Gallo, E; Genta, C; Landi, G; Lenzi, P; Meschini, M; Paoletti, S; Sguazzoni, G; Tropiano, A; Benussi, L; Bertani, M; Bianco, S; Colafranceschi, S; Colonna, D; Fabbri, F; Giardoni, M; Passamonti, L; Piccolo, D; Pierluigi, D; Ponzio, B; Russo, A; Fabbricatore, P; Musenich, R; Benaglia, A; Calloni, M; Cerati, G B; D'Angelo, P; De Guio, F; Farina, F M; Ghezzi, A; Govoni, P; Malberti, M; Malvezzi, S; Martelli, A; Menasce, D; Miccio, V; Moroni, L; Negri, P; Paganoni, M; Pedrini, D; Pullia, A; Ragazzi, S; Redaelli, N; Sala, S; Salerno, R; Tabarelli de Fatis, T; Tancini, V; Taroni, S; Buontempo, S; Cavallo, N; Cimmino, A; De Gruttola, M; Fabozzi, F; Iorio, A O M; Lista, L; Lomidze, D; Noli, P; Paolucci, P; Sciacca, C; Azzi, P; Bacchetta, N; Barcellan, L; Bellan, P; Bellato, M; Benettoni, M; Biasotto, M; Bisello, D; Borsato, E; Branca, A; Carlin, R; Castellani, L; Checchia, P; Conti, E; Dal Corso, F; De Mattia, M; Dorigo, T; Dosselli, U; Fanzago, F; Gasparini, F; Gasparini, U; Giubilato, P; Gonella, F; Gresele, A; Gulmini, M; Kaminskiy, A; Lacaprara, S; Lazzizzera, I; Margoni, M; Maron, G; Mattiazzo, S; Mazzucato, M; Meneghelli, M; Meneguzzo, A T; Michelotto, M; Montecassiano, F; Nespolo, M; Passaseo, M; Pegoraro, M; Perrozzi, L; Pozzobon, N; Ronchese, P; Simonetto, F; Toniolo, N; Torassa, E; Tosi, M; Triossi, A; Vanini, S; Ventura, S; Zotto, P; Zumerle, G; Baesso, P; Berzano, U; Bricola, S; Necchi, M M; Pagano, D; Ratti, S P; Riccardi, C; Torre, P; Vicini, A; Vitulo, P; Viviani, C; Aisa, D; Aisa, S; Babucci, E; Biasini, M; Bilei, G M; Caponeri, B; Checcucci, B; Dinu, N; Fanò, L; Farnesini, L; Lariccia, P; Lucaroni, A; Mantovani, G; Nappi, A; Piluso, A; Postolache, V; Santocchia, A; Servoli, L; Tonoiu, D; Vedaee, A; Volpe, R; Azzurri, P; Bagliesi, G; Bernardini, J; Berretta, L; Boccali, T; Bocci, A; Borrello, L; Bosi, F; Calzolari, F; Castaldi, R; 
Dell'Orso, R; Fiori, F; Foà, L; Gennai, S; Giassi, A; Kraan, A; Ligabue, F; Lomtadze, T; Mariani, F; Martini, L; Massa, M; Messineo, A; Moggi, A; Palla, F; Palmonari, F; Petragnani, G; Petrucciani, G; Raffaelli, F; Sarkar, S; Segneri, G; Serban, A T; Spagnolo, P; Tenchini, R; Tolaini, S; Tonelli, G; Venturi, A; Verdini, P G; Baccaro, S; Barone, L; Bartoloni, A; Cavallari, F; Dafinei, I; Del Re, D; Di Marco, E; Diemoz, M; Franci, D; Longo, E; Organtini, G; Palma, A; Pandolfi, F; Paramatti, R; Pellegrino, F; Rahatlou, S; Rovelli, C; Alampi, G; Amapane, N; Arcidiacono, R; Argiro, S; Arneodo, M; Biino, C; Borgia, M A; Botta, C; Cartiglia, N; Castello, R; Cerminara, G; Costa, M; Dattola, D; Dellacasa, G; Demaria, N; Dughera, G; Dumitrache, F; Graziano, A; Mariotti, C; Marone, M; Maselli, S; Migliore, E; Mila, G; Monaco, V; Musich, M; Nervo, M; Obertino, M M; Oggero, S; Panero, R; Pastrone, N; Pelliccioni, M; Romero, A; Ruspa, M; Sacchi, R; Solano, A; Staiano, A; Trapani, P P; Trocino, D; Vilela Pereira, A; Visca, L; Zampieri, A; Ambroglini, F; Belforte, S; Cossutti, F; Della Ricca, G; Gobbo, B; Penzo, A; Chang, S; Chung, J; Kim, D H; Kim, G N; Kong, D J; Park, H; Son, D C; Bahk, S Y; Song, S; Jung, S Y; Hong, B; Kim, H; Kim, J H; Lee, K S; Moon, D H; Park, S K; Rhee, H B; Sim, K S; Kim, J; Choi, M; Hahn, G; Park, I C; Choi, S; Choi, Y; Goh, J; Jeong, H; Kim, T J; Lee, J; Lee, S; Janulis, M; Martisiute, D; Petrov, P; Sabonis, T; Castilla Valdez, H; Sánchez Hernández, A; Carrillo Moreno, S; Morelos Pineda, A; Allfrey, P; Gray, R N C; Krofcheck, D; Bernardino Rodrigues, N; Butler, P H; Signal, T; Williams, J C; Ahmad, M; Ahmed, I; Ahmed, W; Asghar, M I; Awan, M I M; Hoorani, H R; Hussain, I; Khan, W A; Khurshid, T; Muhammad, S; Qazi, S; Shahzad, H; Cwiok, M; Dabrowski, R; Dominik, W; Doroba, K; Konecki, M; Krolikowski, J; Pozniak, K; Romaniuk, Ryszard; Zabolotny, W; Zych, P; Frueboes, T; Gokieli, R; Goscilo, L; Górski, M; Kazana, M; Nawrocki, K; Szleper, M; Wrochna, G; Zalewski, P; Almeida, N; Antunes Pedro, L; Bargassa, P; David, A; Faccioli, P; Ferreira Parracho, P G; Freitas Ferreira, M; Gallinaro, M; Guerra Jordao, M; Martins, P; Mini, G; Musella, P; Pela, J; Raposo, L; Ribeiro, P Q; Sampaio, S; Seixas, J; Silva, J; Silva, P; Soares, D; Sousa, M; Varela, J; Wöhri, H K; Altsybeev, I; Belotelov, I; Bunin, P; Ershov, Y; Filozova, I; Finger, M; Finger, M., Jr.; Golunov, A; Golutvin, I; Gorbounov, N; Kalagin, V; Kamenev, A; Karjavin, V; Konoplyanikov, V; Korenkov, V; Kozlov, G; Kurenkov, A; Lanev, A; Makankin, A; Mitsyn, V V; Moisenz, P; Nikonov, E; Oleynik, D; Palichik, V; Perelygin, V; Petrosyan, A; Semenov, R; Shmatov, S; Smirnov, V; Smolin, D; Tikhonenko, E; Vasil'ev, S; Vishnevskiy, A; Volodko, A; Zarubin, A; Zhiltsov, V; Bondar, N; Chtchipounov, L; Denisov, A; Gavrikov, Y; Gavrilov, G; Golovtsov, V; Ivanov, Y; Kim, V; Kozlov, V; Levchenko, P; Obrant, G; Orishchin, E; Petrunin, A; Shcheglov, Y; Shchetkovskiy, A; Sknar, V; Smirnov, I; Sulimov, V; Tarakanov, V; Uvarov, L; Vavilov, S; Velichko, G; Volkov, S; Vorobyev, A; Andreev, Yu; Anisimov, A; Antipov, P; Dermenev, A; Gninenko, S; Golubev, N; Kirsanov, M; Krasnikov, N; Matveev, V; Pashenkov, A; Postoev, V E; Solovey, A; Toropin, A; Troitsky, S; Baud, A; Epshteyn, V; Gavrilov, V; Ilina, N; Kaftanov, V; Kolosov, V; Kossov, M; Krokhotin, A; Kuleshov, S; Oulianov, A; Safronov, G; Semenov, S; Shreyber, I; Stolin, V; Vlasov, E; Zhokin, A; Boos, E; Dubinin, M; Dudko, L; Ershov, A; Gribushin, A; Klyukhin, V; Kodolova, O; Lokhtin, I; Petrushanko, S; 
Sarycheva, L; Savrin, V; Snigirev, A; Vardanyan, I; Dremin, I; Kirakosyan, M; Konovalova, N; Rusakov, S V; Vinogradov, A; Akimenko, S; Artamonov, A; Azhgirey, I; Bitioukov, S; Burtovoy, V; Grishin, V; Kachanov, V; Konstantinov, D; Krychkine, V; Levine, A; Lobov, I; Lukanin, V; Mel'nik, Y; Petrov, V; Ryutin, R; Slabospitsky, S; Sobol, A; Sytine, A; Tourtchanovitch, L; Troshin, S; Tyurin, N; Uzunian, A; Volkov, A; Adzic, P; Djordjevic, M; Jovanovic, D; Krpic, D; Maletic, D; Puzovic, J; Smiljkovic, N; Aguilar-Benitez, M; Alberdi, J; Alcaraz Maestre, J; Arce, P; Barcala, J M; Battilana, C; Burgos Lazaro, C; Caballero Bejar, J; Calvo, E; Cardenas Montes, M; Cepeda, M; Cerrada, M; Chamizo Llatas, M; Clemente, F; Colino, N; Daniel, M; De La Cruz, B; Delgado Peris, A; Diez Pardos, C; Fernandez Bedoya, C; Fernández Ramos, J P; Ferrando, A; Flix, J; Fouz, M C; Garcia-Abia, P; Garcia-Bonilla, A C; Gonzalez Lopez, O; Goy Lopez, S; Hernandez, J M; Josa, M I; Marin, J; Merino, G; Molina, J; Molinero, A; Navarrete, J J; Oller, J C; Puerta Pelayo, J; Romero, L; Santaolalla, J; Villanueva Munoz, C; Willmott, C; Yuste, C; Albajar, C; Blanco Otano, M; de Trocóniz, J F; Garcia Raboso, A; Lopez Berengueres, J O; Cuevas, J; Fernandez Menendez, J; Gonzalez Caballero, I; Lloret Iglesias, L; Naves Sordo, H; Vizan Garcia, J M; Cabrillo, I J; Calderon, A; Chuang, S H; Diaz Merino, I; Diez Gonzalez, C; Duarte Campderros, J; Fernandez, M; Gomez, G; Gonzalez Sanchez, J; Gonzalez Suarez, R; Jorda, C; Lobelle Pardo, P; Lopez Virto, A; Marco, J; Marco, R; Martinez Rivero, C; Martinez Ruiz del Arbol, P; Matorras, F; Rodrigo, T; Ruiz Jimeno, A; Scodellaro, L; Sobron Sanudo, M; Vila, I; Vilar Cortabitarte, R; Abbaneo, D; Albert, E; Alidra, M; Ashby, S; Auffray, E; Baechler, J; Baillon, P; Ball, A H; Bally, S L; Barney, D; Beaudette, F; Bellan, R; Benedetti, D; Benelli, G; Bernet, C; Bloch, P; Bolognesi, S; Bona, M; Bos, J; Bourgeois, N; Bourrel, T; Breuker, H; Bunkowski, K; Campi, D; Camporesi, T; Cano, E; Cattai, A; Chatelain, J P; Chauvey, M; Christiansen, T; Coarasa Perez, J A; Conde Garcia, A; Covarelli, R; Curé, B; De Roeck, A; Delachenal, V; Deyrail, D; Di Vincenzo, S; Dos Santos, S; Dupont, T; Edera, L M; Elliott-Peisert, A; Eppard, M; Favre, M; Frank, N; Funk, W; Gaddi, A; Gastal, M; Gateau, M; Gerwig, H; Gigi, D; Gill, K; Giordano, D; Girod, J P; Glege, F; Gomez-Reino Garrido, R; Goudard, R; Gowdy, S; Guida, R; Guiducci, L; Gutleber, J; Hansen, M; Hartl, C; Harvey, J; Hegner, B; Hoffmann, H F; Holzner, A; Honma, A; Huhtinen, M; Innocente, V; Janot, P; Le Godec, G; Lecoq, P; Leonidopoulos, C; Loos, R; Lourenço, C; Lyonnet, A; Macpherson, A; Magini, N; Maillefaud, J D; Maire, G; Mäki, T; Malgeri, L; Mannelli, M; Masetti, L; Meijers, F; Meridiani, P; Mersi, S; Meschi, E; Meynet Cordonnier, A; Moser, R; Mulders, M; Mulon, J; Noy, M; Oh, A; Olesen, G; Onnela, A; Orimoto, T; Orsini, L; Perez, E; Perinic, G; Pernot, J F; Petagna, P; Petiot, P; Petrilli, A; Pfeiffer, A; Pierini, M; Pimiä, M; Pintus, R; Pirollet, B; Postema, H; Racz, A; Ravat, S; Rew, S B; Rodrigues Antunes, J; Rolandi, G; Rovere, M; Ryjov, V; Sakulin, H; Samyn, D; Sauce, H; Schäfer, C; Schlatter, W D; Schröder, M; Schwick, C; Sciaba, A; Segoni, I; Sharma, A; Siegrist, N; Siegrist, P; Sinanis, N; Sobrier, T; Sphicas, P; Spiga, D; Spiropulu, M; Stöckli, F; Traczyk, P; Tropea, P; Troska, J; Tsirou, A; Veillet, L; Veres, G I; Voutilainen, M; Wertelaers, P; Zanetti, M; Bertl, W; Deiters, K; Erdmann, W; Gabathuler, K; Horisberger, R; Ingram, Q; Kaestli, H C; 
König, S; Kotlinski, D; Langenegger, U; Meier, F; Renker, D; Rohe, T; Sibille, J; Starodumov, A; Betev, B; Caminada, L; Chen, Z; Cittolin, S; Da Silva Di Calafiori, D R; Dambach, S; Dissertori, G; Dittmar, M; Eggel, C; Eugster, J; Faber, G; Freudenreich, K; Grab, C; Hervé, A; Hintz, W; Lecomte, P; Luckey, P D; Lustermann, W; Marchica, C; Milenovic, P; Moortgat, F; Nardulli, A; Nessi-Tedaldi, F; Pape, L; Pauss, F; Punz, T; Rizzi, A; Ronga, F J; Sala, L; Sanchez, A K; Sawley, M C; Sordini, V; Stieger, B; Tauscher, L; Thea, A; Theofilatos, K; Treille, D; Trüb, P; Weber, M; Wehrli, L; Weng, J; Zelepoukine, S; Amsler, C; Chiochia, V; De Visscher, S; Regenfus, C; Robmann, P; Rommerskirchen, T; Schmidt, A; Tsirigkas, D; Wilke, L; Chang, Y H; Chen, E A; Chen, W T; Go, A; Kuo, C M; Li, S W; Lin, W; Bartalini, P; Chang, P; Chao, Y; Chen, K F; Hou, W S; Hsiung, Y; Lei, Y J; Lin, S W; Lu, R S; Schümann, J; Shiu, J G; Tzeng, Y M; Ueno, K; Velikzhanin, Y; Wang, C C; Wang, M; Adiguzel, A; Ayhan, A; Azman Gokce, A; Bakirci, M N; Cerci, S; Dumanoglu, I; Eskut, E; Girgis, S; Gurpinar, E; Hos, I; Karaman, T; Kayis Topaksu, A; Kurt, P; Önengüt, G; Önengüt Gökbulut, G; Ozdemir, K; Ozturk, S; Polatöz, A; Sogut, K; Tali, B; Topakli, H; Uzun, D; Vergili, L N; Vergili, M; Akin, I V; Aliev, T; Bilmis, S; Deniz, M; Gamsizkan, H; Guler, A M; Öcalan, K; Serin, M; Sever, R; Surat, U E; Zeyrek, M; Deliomeroglu, M; Demir, D; Gülmez, E; Halu, A; Isildak, B; Kaya, M; Kaya, O; Ozkorucuklu, S; Sonmez, N; Levchuk, L; Lukyanenko, S; Soroka, D; Zub, S; Bostock, F; Brooke, J J; Cheng, T L; Cussans, D; Frazier, R; Goldstein, J; Grant, N; Hansen, M; Heath, G P; Heath, H F; Hill, C; Huckvale, B; Jackson, J; Mackay, C K; Metson, S; Newbold, D M; Nirunpong, K; Smith, V J; Velthuis, J; Walton, R; Bell, K W; Brew, C; Brown, R M; Camanzi, B; Cockerill, D J A; Coughlan, J A; Geddes, N I; Harder, K; Harper, S; Kennedy, B W; Murray, P; Shepherd-Themistocleous, C H; Tomalin, I R; Williams, J H; Womersley, W J; Worm, S D; Bainbridge, R; Ball, G; Ballin, J; Beuselinck, R; Buchmuller, O; Colling, D; Cripps, N; Davies, G; Della Negra, M; Foudas, C; Fulcher, J; Futyan, D; Hall, G; Hays, J; Iles, G; Karapostoli, G; MacEvoy, B C; Magnan, A M; Marrouche, J; Nash, J; Nikitenko, A; Papageorgiou, A; Pesaresi, M; Petridis, K; Pioppi, M; Raymond, D M; Rompotis, N; Rose, A; Ryan, M J; Seez, C; Sharp, P; Sidiropoulos, G; Stettler, M; Stoye, M; Takahashi, M; Tapper, A; Timlin, C; Tourneur, S; Vazquez Acosta, M; Virdee, T; Wakefield, S; Wardrope, D; Whyntie, T; Wingham, M; Cole, J E; Goitom, I; Hobson, P R; Khan, A; Kyberd, P; Leslie, D; Munro, C; Reid, I D; Siamitros, C; Taylor, R; Teodorescu, L; Yaselli, I; Bose, T; Carleton, M; Hazen, E; Heering, A H; Heister, A; John, J St; Lawson, P; Lazic, D; Osborne, D; Rohlf, J; Sulak, L; Wu, S; Andrea, J; Avetisyan, A; Bhattacharya, S; Chou, J P; Cutts, D; Esen, S; Kukartsev, G; Landsberg, G; Narain, M; Nguyen, D; Speer, T; Tsang, K V; Breedon, R; Calderon De La Barca Sanchez, M; Case, M; Cebra, D; Chertok, M; Conway, J; Cox, P T; Dolen, J; Erbacher, R; Friis, E; Ko, W; Kopecky, A; Lander, R; Lister, A; Liu, H; Maruyama, S; Miceli, T; Nikolic, M; Pellett, D; Robles, J; Searle, M; Smith, J; Squires, M; Stilley, J; Tripathi, M; Vasquez Sierra, R; Veelken, C; Andreev, V; Arisaka, K; Cline, D; Cousins, R; Erhan, S; Hauser, J; Ignatenko, M; Jarvis, C; Mumford, J; Plager, C; Rakness, G; Schlein, P; Tucker, J; Valuev, V; Wallny, R; Yang, X; Babb, J; Bose, M; Chandra, A; Clare, R; Ellison, J A; Gary, J W; Hanson, G; Jeng, 
G Y; Kao, S C; Liu, F; Liu, H; Luthra, A; Nguyen, H; Pasztor, G; Satpathy, A; Shen, B C; Stringer, R; Sturdy, J; Sytnik, V; Wilken, R; Wimpenny, S; Branson, J G; Dusinberre, E; Evans, D; Golf, F; Kelley, R; Lebourgeois, M; Letts, J; Lipeles, E; Mangano, B; Muelmenstaedt, J; Norman, M; Padhi, S; Petrucci, A; Pi, H; Pieri, M; Ranieri, R; Sani, M; Sharma, V; Simon, S; Würthwein, F; Yagil, A; Campagnari, C; D'Alfonso, M; Danielson, T; Garberson, J; Incandela, J; Justus, C; Kalavase, P; Koay, S A; Kovalskyi, D; Krutelyov, V; Lamb, J; Lowette, S; Pavlunin, V; Rebassoo, F; Ribnik, J; Richman, J; Rossin, R; Stuart, D; To, W; Vlimant, J R; Witherell, M; Apresyan, A; Bornheim, A; Bunn, J; Chiorboli, M; Gataullin, M; Kcira, D; Litvine, V; Ma, Y; Newman, H B; Rogan, C; Timciuc, V; Veverka, J; Wilkinson, R; Yang, Y; Zhang, L; Zhu, K; Zhu, R Y; Akgun, B; Carroll, R; Ferguson, T; Jang, D W; Jun, S Y; Paulini, M; Russ, J; Terentyev, N; Vogel, H; Vorobiev, I; Cumalat, J P; Dinardo, M E; Drell, B R; Ford, W T; Heyburn, B; Luiggi Lopez, E; Nauenberg, U; Stenson, K; Ulmer, K; Wagner, S R; Zang, S L; Agostino, L; Alexander, J; Blekman, F; Cassel, D; Chatterjee, A; Das, S; Gibbons, L K; Heltsley, B; Hopkins, W; Khukhunaishvili, A; Kreis, B; Kuznetsov, V; Patterson, J R; Puigh, D; Ryd, A; Shi, X; Stroiney, S; Sun, W; Teo, W D; Thom, J; Vaughan, J; Weng, Y; Wittich, P; Beetz, C P; Cirino, G; Sanzeni, C; Winn, D; Abdullin, S; Afaq, M A; Albrow, M; Ananthan, B; Apollinari, G; Atac, M; Badgett, W; Bagby, L; Bakken, J A; Baldin, B; Banerjee, S; Banicz, K; Bauerdick, L A T; Beretvas, A; Berryhill, J; Bhat, P C; Biery, K; Binkley, M; Bloch, I; Borcherding, F; Brett, A M; Burkett, K; Butler, J N; Chetluru, V; Cheung, H W K; Chlebana, F; Churin, I; Cihangir, S; Crawford, M; Dagenhart, W; Demarteau, M; Derylo, G; Dykstra, D; Eartly, D P; Elias, J E; Elvira, V D; Evans, D; Feng, L; Fischler, M; Fisk, I; Foulkes, S; Freeman, J; Gartung, P; Gottschalk, E; Grassi, T; Green, D; Guo, Y; Gutsche, O; Hahn, A; Hanlon, J; Harris, R M; Holzman, B; Howell, J; Hufnagel, D; James, E; Jensen, H; Johnson, M; Jones, C D; Joshi, U; Juska, E; Kaiser, J; Klima, B; Kossiakov, S; Kousouris, K; Kwan, S; Lei, C M; Limon, P; Lopez Perez, J A; Los, S; Lueking, L; Lukhanin, G; Lusin, S; Lykken, J; Maeshima, K; Marraffino, J M; Mason, D; McBride, P; Miao, T; Mishra, K; Moccia, S; Mommsen, R; Mrenna, S; Muhammad, A S; Newman-Holmes, C; Noeding, C; O'Dell, V; Prokofyev, O; Rivera, R; Rivetta, C H; Ronzhin, A; Rossman, P; Ryu, S; Sekhri, V; Sexton-Kennedy, E; Sfiligoi, I; Sharma, S; Shaw, T M; Shpakov, D; Skup, E; Smith, R P; Soha, A; Spalding, W J; Spiegel, L; Suzuki, I; Tan, P; Tanenbaum, W; Tkaczyk, S; Trentadue, R; Uplegger, L; Vaandering, E W; Vidal, R; Whitmore, J; Wicklund, E; Wu, W; Yarba, J; Yumiceva, F; Yun, J C; Acosta, D; Avery, P; Barashko, V; Bourilkov, D; Chen, M; Di Giovanni, G P; Dobur, D; Drozdetskiy, A; Field, R D; Fu, Y; Furic, I K; Gartner, J; Holmes, D; Kim, B; Klimenko, S; Konigsberg, J; Korytov, A; Kotov, K; Kropivnitskaya, A; Kypreos, T; Madorsky, A; Matchev, K; Mitselmakher, G; Pakhotin, Y; Piedra Gomez, J; Prescott, C; Rapsevicius, V; Remington, R; Schmitt, M; Scurlock, B; Wang, D; Yelton, J; Ceron, C; Gaultney, V; Kramer, L; Lebolo, L M; Linn, S; Markowitz, P; Martinez, G; Rodriguez, J L; Adams, T; Askew, A; Baer, H; Bertoldi, M; Chen, J; Dharmaratna, W G D; Gleyzer, S V; Haas, J; Hagopian, S; Hagopian, V; Jenkins, M; Johnson, K F; Prettner, E; Prosper, H; Sekmen, S; Baarmand, M M; Guragain, S; Hohlmann, M; Kalakhety, H; 
Mermerkaya, H; Ralich, R; Vodopiyanov, I; Abelev, B; Adams, M R; Anghel, I M; Apanasevich, L; Bazterra, V E; Betts, R R; Callner, J; Castro, M A; Cavanaugh, R; Dragoiu, C; Garcia-Solis, E J; Gerber, C E; Hofman, D J; Khalatian, S; Mironov, C; Shabalina, E; Smoron, A; Varelas, N; Akgun, U; Albayrak, E A; Ayan, A S; Bilki, B; Briggs, R; Cankocak, K; Chung, K; Clarida, W; Debbins, P; Duru, F; Ingram, F D; Lae, C K; McCliment, E; Merlo, J P; Mestvirishvili, A; Miller, M J; Moeller, A; Nachtman, J; Newsom, C R; Norbeck, E; Olson, J; Onel, Y; Ozok, F; Parsons, J; Schmidt, I; Sen, S; Wetzel, J; Yetkin, T; Yi, K; Barnett, B A; Blumenfeld, B; Bonato, A; Chien, C Y; Fehling, D; Giurgiu, G; Gritsan, A V; Guo, Z J; Maksimovic, P; Rappoccio, S; Swartz, M; Tran, N V; Zhang, Y; Baringer, P; Bean, A; Grachov, O; Murray, M; Radicci, V; Sanders, S; Wood, J S; Zhukova, V; Bandurin, D; Bolton, T; Kaadze, K; Liu, A; Maravin, Y; Onoprienko, D; Svintradze, I; Wan, Z; Gronberg, J; Hollar, J; Lange, D; Wright, D; Baden, D; Bard, R; Boutemeur, M; Eno, S C; Ferencek, D; Hadley, N J; Kellogg, R G; Kirn, M; Kunori, S; Rossato, K; Rumerio, P; Santanastasio, F; Skuja, A; Temple, J; Tonjes, M B; Tonwar, S C; Toole, T; Twedt, E; Alver, B; Bauer, G; Bendavid, J; Busza, W; Butz, E; Cali, I A; Chan, M; D'Enterria, D; Everaerts, P; Gomez Ceballos, G; Hahn, K A; Harris, P; Jaditz, S; Kim, Y; Klute, M; Lee, Y J; Li, W; Loizides, C; Ma, T; Miller, M; Nahn, S; Paus, C; Roland, C; Roland, G; Rudolph, M; Stephans, G; Sumorok, K; Sung, K; Vaurynovich, S; Wenger, E A; Wyslouch, B; Xie, S; Yilmaz, Y; Yoon, A S; Bailleux, D; Cooper, S I; Cushman, P; Dahmes, B; De Benedetti, A; Dolgopolov, A; Dudero, P R; Egeland, R; Franzoni, G; Haupt, J; Inyakin, A; Klapoetke, K; Kubota, Y; Mans, J; Mirman, N; Petyt, D; Rekovic, V; Rusack, R; Schroeder, M; Singovsky, A; Zhang, J; Cremaldi, L M; Godang, R; Kroeger, R; Perera, L; Rahmat, R; Sanders, D A; Sonnek, P; Summers, D; Bloom, K; Bockelman, B; Bose, S; Butt, J; Claes, D R; Dominguez, A; Eads, M; Keller, J; Kelly, T; Kravchenko, I; Lazo-Flores, J; Lundstedt, C; Malbouisson, H; Malik, S; Snow, G R; Baur, U; Iashvili, I; Kharchilava, A; Kumar, A; Smith, K; Strang, M; Alverson, G; Barberis, E; Boeriu, O; Eulisse, G; Govi, G; McCauley, T; Musienko, Y; Muzaffar, S; Osborne, I; Paul, T; Reucroft, S; Swain, J; Taylor, L; Tuura, L; Anastassov, A; Gobbi, B; Kubik, A; Ofierzynski, R A; Pozdnyakov, A; Schmitt, M; Stoynev, S; Velasco, M; Won, S; Antonelli, L; Berry, D; Hildreth, M; Jessop, C; Karmgard, D J; Kolberg, T; Lannon, K; Lynch, S; Marinelli, N; Morse, D M; Ruchti, R; Slaunwhite, J; Warchol, J; Wayne, M; Bylsma, B; Durkin, L S; Gilmore, J; Gu, J; Killewald, P; Ling, T Y; Williams, G; Adam, N; Berry, E; Elmer, P; Garmash, A; Gerbaudo, D; Halyo, V; Hunt, A; Jones, J; Laird, E; Marlow, D; Medvedeva, T; Mooney, M; Olsen, J; Piroué, P; Stickland, D; Tully, C; Werner, J S; Wildish, T; Xie, Z; Zuranski, A; Acosta, J G; Bonnett Del Alamo, M; Huang, X T; Lopez, A; Mendez, H; Oliveros, S; Ramirez Vargas, J E; Santacruz, N; Zatzerklyany, A; Alagoz, E; Antillon, E; Barnes, V E; Bolla, G; Bortoletto, D; Everett, A; Garfinkel, A F; Gecse, Z; Gutay, L; Ippolito, N; Jones, M; Koybasi, O; Laasanen, A T; Leonardo, N; Liu, C; Maroussov, V; Merkel, P; Miller, D H; Neumeister, N; Sedov, A; Shipsey, I; Yoo, H D; Zheng, Y; Jindal, P; Parashar, N; Cuplov, V; Ecklund, K M; Geurts, F J M; Liu, J H; Maronde, D; Matveev, M; Padley, B P; Redjimi, R; Roberts, J; Sabbatini, L; Tumanov, A; Betchart, B; Bodek, A; Budd, H; Chung, Y S; 
de Barbaro, P; Demina, R; Flacher, H; Gotra, Y; Harel, A; Korjenevski, S; Miner, D C; Orbaker, D; Petrillo, G; Vishnevskiy, D; Zielinski, M; Bhatti, A; Demortier, L; Goulianos, K; Hatakeyama, K; Lungu, G; Mesropian, C; Yan, M; Atramentov, O; Bartz, E; Gershtein, Y; Halkiadakis, E; Hits, D; Lath, A; Rose, K; Schnetzer, S; Somalwar, S; Stone, R; Thomas, S; Watts, T L; Cerizza, G; Hollingsworth, M; Spanier, S; Yang, Z C; York, A; Asaadi, J; Aurisano, A; Eusebi, R; Golyash, A; Gurrola, A; Kamon, T; Nguyen, C N; Pivarski, J; Safonov, A; Sengupta, S; Toback, D; Weinberger, M; Akchurin, N; Berntzon, L; Gumus, K; Jeong, C; Kim, H; Lee, S W; Popescu, S; Roh, Y; Sill, A; Volobouev, I; Washington, E; Wigmans, R; Yazgan, E; Engh, D; Florez, C; Johns, W; Pathak, S; Sheldon, P; Andelin, D; Arenton, M W; Balazs, M; Boutle, S; Buehler, M; Conetti, S; Cox, B; Hirosky, R; Ledovskoy, A; Neu, C; Phillips II, D; Ronquest, M; Yohay, R; Gollapinni, S; Gunthoti, K; Harr, R; Karchin, P E; Mattson, M; Sakharov, A; Anderson, M; Bachtis, M; Bellinger, J N; Carlsmith, D; Crotty, I; Dasu, S; Dutta, S; Efron, J; Feyzi, F; Flood, K; Gray, L; Grogg, K S; Grothe, M; Hall-Wilton, R; Jaworski, M; Klabbers, P; Klukas, J; Lanaro, A; Lazaridis, C; Leonard, J; Loveless, R; Magrans de Abril, M; Mohapatra, A; Ott, G; Polese, G; Reeder, D; Savin, A; Smith, W H; Sourkov, A; Swanson, J; Weinberg, M; Wenman, D; Wensveen, M; White, A

    2010-01-01

    The CMS Collaboration conducted a month-long data taking exercise, the Cosmic Run At Four Tesla, during October-November 2008, with the goal of commissioning the experiment for extended operation. With all installed detector systems participating, CMS recorded 270 million cosmic ray events with the solenoid at a magnetic field strength of 3.8 T. This paper describes the data flow from the detector through the various online and offline computing systems, as well as the workflows used for recording the data, for aligning and calibrating the detector, and for analysis of the data.

  15. Workflow for large-scale analysis of melanoma tissue samples

    Directory of Open Access Journals (Sweden)

    Maria E. Yakovleva

    2015-09-01

    Full Text Available The aim of the present study was to create an optimal workflow for analysing a large cohort of malignant melanoma tissue samples. Samples were lysed with urea and enzymatically digested with trypsin or trypsin/Lys C. Buffer exchange or dilution was used to reduce the urea concentration prior to digestion. The tissue digests were analysed directly or following strong cation exchange (SCX) fractionation by nano LC–MS/MS. The approach that resulted in the largest number of protein IDs involved a buffer exchange step before enzymatic digestion with trypsin and chromatographic separation with a 120 min gradient, followed by SCX–RP separation of peptides.
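
    The dilution step reduces to one line of arithmetic: how much buffer takes a concentrated urea lysate below the level commonly tolerated by trypsin. The concentrations below are typical literature values, not figures reported in this study.

    ```python
    def buffer_to_add(v_sample_ul, c_start_m, c_target_m):
        """Volume of buffer so that c_start * v / (v + added) equals c_target."""
        return v_sample_ul * (c_start_m / c_target_m - 1)

    added = buffer_to_add(v_sample_ul=50, c_start_m=8.0, c_target_m=1.6)
    print(f"add {added:.0f} uL buffer to 50 uL of 8 M urea lysate -> 1.6 M urea")
    ```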

  16. Big data analytics workflow management for eScience

    Science.gov (United States)

    Fiore, Sandro; D'Anca, Alessandro; Palazzo, Cosimo; Elia, Donatello; Mariello, Andrea; Nassisi, Paola; Aloisio, Giovanni

    2015-04-01

    In many domains such as climate science and astrophysics, scientific data are often n-dimensional and require tools that support specialized data types and primitives if they are to be properly stored, accessed, analysed and visualized. Currently, scientific data analytics relies on domain-specific software and libraries providing a huge set of operators and functionalities. However, most of these software tools fail at large scale since they: (i) are desktop based, rely on local computing capabilities and need the data locally; (ii) cannot benefit from available multicore/parallel machines since they are based on sequential codes; (iii) do not provide declarative languages to express scientific data analysis tasks, and (iv) do not provide newer or more scalable storage models to better support data multidimensionality. Additionally, most of them: (v) are domain-specific, which also means they support a limited set of data formats, and (vi) do not provide workflow support to enable the construction, execution and monitoring of more complex "experiments". The Ophidia project aims at facing most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides several parallel operators to manipulate large datasets. Some relevant examples include: (i) data sub-setting (slicing and dicing), (ii) data aggregation, (iii) array-based primitives (the same operator applies to all the implemented UDF extensions), (iv) data cube duplication, (v) data cube pivoting, (vi) NetCDF import and export. Metadata operators are available too. Additionally, the Ophidia framework provides array-based primitives to perform data sub-setting, data aggregation (i.e. max, min, avg), array concatenation, algebraic expressions and predicate evaluation on large arrays of scientific data. Bit-oriented plugins have also been implemented to manage binary data cubes. Defining processing chains and workflows with tens or hundreds of data analytics operators is the
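
    The first group of operators maps directly onto array slicing and reduction. The numpy sketch below mimics sub-setting (slicing and dicing) and aggregation on a small synthetic time x lat x lon cube; Ophidia executes the equivalent operators in parallel on much larger data.

    ```python
    import numpy as np

    cube = np.arange(4 * 3 * 5, dtype=float).reshape(4, 3, 5)  # time x lat x lon

    slice_t0 = cube[0]                   # slicing: fix one index, drop the axis
    dice = cube[1:3, :, 2:4]             # dicing: keep a rectangular sub-block
    time_avg = cube.mean(axis=0)         # aggregation over the time axis (avg)
    spatial_max = cube.max(axis=(1, 2))  # max over both spatial axes

    print(slice_t0.shape, dice.shape, time_avg.shape, spatial_max.shape)
    # (3, 5) (2, 3, 2) (3, 5) (4,)
    ```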

  17. Inpatient nursing care and early warning scores: a workflow mismatch.

    Science.gov (United States)

    Watson, Anne; Skipper, Chantel; Steury, Rachel; Walsh, Heather; Levin, Amanda

    2014-01-01

    Early warning scores calculated by registered nurses (RNs) are used in hospitals to enhance the recognition of and communication about patient deterioration. This study evaluated workflow variables surrounding the calculation and documentation of one pediatric hospital's early warning score. Results indicated that there were significant delays in RNs' documentation of early warning scores and inconsistencies between the early warning scores and the vital signs collected and documented by non-RN personnel. These findings reflected information obtained from the RNs about how they prioritize tasks and use work-arounds for specific systems issues regarding assessment and documentation in the electronic medical record.
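
    A minimal sketch of the two quantities at issue, a score derived from vitals and the lag before it is charted, is shown below. The scoring bands are invented placeholders; real pediatric early warning systems use validated, age-specific thresholds.

    ```python
    from datetime import datetime

    def warning_score(heart_rate, resp_rate):
        """Toy score: 0-2 points per vital sign using hypothetical bands."""
        score = 2 if heart_rate > 160 else 1 if heart_rate > 140 else 0
        score += 2 if resp_rate > 40 else 1 if resp_rate > 30 else 0
        return score

    def charting_lag_minutes(vitals_taken, score_documented):
        """The workflow-mismatch metric: delay between measurement and charting."""
        return (score_documented - vitals_taken).total_seconds() / 60

    score = warning_score(heart_rate=152, resp_rate=34)
    lag = charting_lag_minutes(datetime(2014, 1, 6, 8, 0),
                               datetime(2014, 1, 6, 9, 25))
    print(f"score={score}, documented {lag:.0f} min after vitals were taken")
    ```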

  18. Integrating workflow and project management systems for PLM applications

    Directory of Open Access Journals (Sweden)

    Fabio Fonseca Pereira de Paula

    2008-07-01

    Full Text Available The adoption of the Product Life-cycle Management Systems (PLMs) concept is fundamental to improving product development, especially for small and medium enterprises (SMEs). One of the challenges is the integration between project management and product data management functions. The paper presents an analysis of the potential integration strategies for a specific product data management system (SMARTEAM) and a project management system (Microsoft Project), which are commonly used by SMEs. Finally, the article presents some considerations about the study of project management solutions in SMEs, considering the PLM approach. Key-words: integration, project management (PM), workflow, PDM, PLM.

  19. Reference and PDF-manager software: complexities, support and workflow.

    Science.gov (United States)

    Mead, Thomas L; Berryman, Donna R

    2010-10-01

    In the past, librarians taught reference management by training library users to use established software programs such as RefWorks or EndNote. In today's environment, there is a proliferation of Web-based programs that are being used by library clientele that offer a new twist on the well-known reference management programs. Basically, these new programs are PDF-manager software (e.g., Mendeley or Papers). Librarians are faced with new questions, issues, and concerns, given the new workflows and pathways that these PDF-manager programs present. This article takes a look at some of those.

  20. Workflow Characterization in a Busy Urban Primary Care Clinic

    OpenAIRE

    Louthan, Michelle; Carrington, Scott; Bahamon, Nicholas; Bauer, Justin; Zafar, Atif; Lehto, Mark

    2006-01-01

    We are examining the workflow processes within a large, urban general internal medicine practice in order to understand task inefficiencies that can lead to medical errors. We are performing a time-motion study looking at task management of check-in, check-out clerks, nurses, nurse’s aides and physicians. Our pilot data suggests that there is significant variability in the task burden at different times of the day due to several factors, including patient-show rates, time allotted to late arr...

  1. Software Design for Empowering Scientists

    OpenAIRE

    De Roure, David; Goble, Carole

    2009-01-01

    Scientific research is increasingly digital. Some activities, such as data analysis, search, and simulation, can be accelerated by letting scientists write workflows and scripts that automate routine activities. These capture pieces of the scientific method that scientists can share. The Taverna Workbench, a widely deployed scientific-workflow-management system, together with the myExperiment social Web site for sharing scientific experiments, follows six principles of designing software for ad...

  2. Semantic Document Library: A Virtual Research Environment for Documents, Data and Workflows Sharing

    Science.gov (United States)

    Kotwani, K.; Liu, Y.; Myers, J.; Futrelle, J.

    2008-12-01

    The Semantic Document Library (SDL) was driven by use cases from the environmental observatory communities and is designed to provide conventional document repository features of uploading, downloading, editing and versioning of documents, as well as value-adding features of tagging, querying, sharing, annotating, ranking, provenance, social networking and geo-spatial mapping services. It allows users to organize a catalogue of watershed observation data, model output and workflows, as well as publications and documents related to the same watershed study, through the tagging capability. Users can tag all relevant materials using the same watershed name and easily find all of them later using this tag. The underpinning semantic content repository can store materials from other cyberenvironments such as workflow or simulation tools, and SDL provides an effective interface to query and organize materials from various sources. Advanced features of the SDL allow users to visualize the provenance of the materials, such as the source and how the output data were derived. Other novel features include visualizing all geo-referenced materials on a geospatial map. SDL, as a component of a cyberenvironment portal (the NCSA Cybercollaboratory), has the goal of efficient management of information and relationships between published artifacts (validated models, vetted data, workflows, annotations, best practices, reviews and papers) produced from raw research artifacts (data, notes, plans etc.) through agents (people, sensors etc.). The tremendous scientific potential of artifacts is achieved through mechanisms of sharing, reuse and collaboration - empowering scientists to spread their knowledge and protocols and to benefit from the knowledge of others. SDL successfully implements web 2.0 technologies and design patterns along with a semantic content management approach that enables the use of multiple ontologies and dynamic evolution (e.g. folksonomies) of terminology. Scientific documents involved with
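
    The tagging capability reduces to a simple inverted index from tag to artifacts. The sketch below illustrates the watershed use case with invented artifact and watershed names.

    ```python
    from collections import defaultdict

    index = defaultdict(set)          # tag -> set of artifact identifiers

    def tag(artifact, *tags):
        for t in tags:
            index[t].add(artifact)

    tag("flow_2008.csv", "Clear Creek", "sensor-data")
    tag("runoff_model.wf", "Clear Creek", "workflow")
    tag("results_paper.pdf", "Clear Creek", "publication")

    # One query retrieves every material for the watershed study.
    print(sorted(index["Clear Creek"]))
    ```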

  3. Leveraging an existing data warehouse to annotate workflow models for operations research and optimization.

    Science.gov (United States)

    Borlawsky, Tara; LaFountain, Jeanne; Petty, Lynda; Saltz, Joel H; Payne, Philip R O

    2008-11-06

    Workflow analysis is frequently performed in the context of operations research and process optimization. In order to develop a data-driven workflow model that can be employed to assess opportunities to improve the efficiency of perioperative care teams at The Ohio State University Medical Center (OSUMC), we have developed a method for integrating standard workflow modeling formalisms, such as UML activity diagrams with data-centric annotations derived from our existing data warehouse.
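
    The annotation idea can be sketched as a join between activity nodes and timing rows pulled from the warehouse. The rows and activity names below are invented, and the real method annotates UML activity diagrams rather than a dict.

    ```python
    from statistics import mean

    warehouse_rows = [                        # (activity, observed minutes)
        ("patient check-in", 12), ("patient check-in", 9),
        ("anesthesia prep", 22), ("anesthesia prep", 31),
    ]

    model = {"patient check-in": {}, "anesthesia prep": {}}  # activity nodes

    for activity, annotations in model.items():
        durations = [d for a, d in warehouse_rows if a == activity]
        annotations["mean_minutes"] = round(mean(durations), 1)

    print(model)
    ```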

  4. Standardizing clinical trials workflow representation in UML for international site comparison.

    Science.gov (United States)

    de Carvalho, Elias Cesar Araujo; Jayanti, Madhav Kishore; Batilana, Adelia Portero; Kozan, Andreia M O; Rodrigues, Maria J; Shah, Jatin; Loures, Marco R; Patil, Sunita; Payne, Philip; Pietrobon, Ricardo

    2010-11-09

    With the globalization of clinical trials, a growing emphasis has been placed on the standardization of the workflow in order to ensure the reproducibility and reliability of the overall trial. Despite the importance of workflow evaluation, to our knowledge no previous studies have attempted to adapt existing modeling languages to standardize the representation of clinical trials. Unified Modeling Language (UML) is a computational language that can be used to model operational workflow, and a UML profile can be developed to standardize UML models within a given domain. This paper's objective is to develop a UML profile to extend the UML Activity Diagram schema into the clinical trials domain, defining a standard representation for clinical trial workflow diagrams in UML. Two Brazilian clinical trial sites in rheumatology and oncology were examined to model their workflow and collect time-motion data. UML modeling was conducted in Eclipse, and a UML profile was developed to incorporate information used in discrete event simulation software. Ethnographic observation revealed bottlenecks in workflow: these included tasks requiring the full commitment of clinical research coordinators (CRCs), transferring notes from paper to computers, deviations from standard operating procedures, and conflicts between different IT systems. Time-motion analysis revealed that nurses' activities took up the most time in the workflow and contained a high frequency of shorter-duration activities. Administrative assistants performed more activities near the beginning and end of the workflow. Overall, clinical trial tasks had a greater frequency than clinic routines or other general activities. This paper describes a method for modeling clinical trial workflow in UML and standardizing these workflow diagrams through a UML profile. In the increasingly global environment of clinical trials, the standardization of workflow modeling is a necessary precursor to conducting a comparative analysis of international clinical trials

  5. A Framework for Modeling Workflow Execution by an Interdisciplinary Healthcare Team.

    Science.gov (United States)

    Kezadri-Hamiaz, Mounira; Rosu, Daniela; Wilk, Szymon; Kuziemsky, Craig; Michalowski, Wojtek; Carrier, Marc

    2015-01-01

    The use of business workflow models in healthcare is limited because of the insufficient capture of the complexities associated with the behavior of interdisciplinary healthcare teams that execute healthcare workflows. In this paper we present a novel framework that builds on the well-founded business workflow model formalism and related infrastructures, and introduces a formal semantic layer that describes selected aspects of team dynamics and supports their real-time operationalization.

  6. Using SensorML to describe scientific workflows in distributed web service environments

    CSIR Research Space (South Africa)

    Van Zyl, TL

    2009-07-01

    Full Text Available. Compared with business workflows, scientific workflows have special features such as computation, data or transaction intensity, less human interaction, and a large number of activities [3]. Some emerging computing infrastructures... such as grid computing, with powerful computing and resource-sharing capabilities, present the potential for accommodating those special features. The workflows aim to provide a simple, concise notation that allows easy parallelization...

  7. Allocation optimale multicontraintes des workflows aux ressources d’un environnement Cloud Computing [Multi-constraint optimal allocation of workflows to the resources of a Cloud Computing environment]

    OpenAIRE

    Yassa, Sonia

    2014-01-01

    Cloud Computing is increasingly recognized as a new way to use on-demand computing, storage and network services in a transparent and efficient way. In this thesis, we address the problem of workflow scheduling on the distributed, heterogeneous infrastructure of Cloud Computing. Existing workflow scheduling approaches mainly focus on the bi-objective optimization of makespan and cost. In this thesis, we propose new workflow scheduling algorithms based on metaheuristics. Our algori...

  8. A software-aided workflow for precinct-scale residential redevelopment

    Energy Technology Data Exchange (ETDEWEB)

    Glackin, Stephen, E-mail: sglackin@swin.edu.au [Swinburne University of Technology, Melbourne, Victoria (Australia); Trubka, Roman, E-mail: r.trubka@gmail.com [Curtin University, Perth, Western Australia (Australia); Dionisio, Maria Rita, E-mail: rita.dionisio@canterbury.ac.nz [University of Canterbury (New Zealand)

    2016-09-15

    Growing urban populations, combined with environmental challenges, have placed significant pressure on urban planning to supply housing while addressing policy issues such as sustainability, affordability, and liveability. The interrelated nature of these issues, combined with the requirement of evidence-based planning, has made decision-making so complex that urban planners need to combine expertise on energy, water, carbon emissions, transport and economic development with other bodies of knowledge necessary to make well-informed decisions. This paper presents two geospatial software systems that can assist in mediating this complexity by allowing users to assess a variety of planning metrics without expert knowledge in those disciplines. Using Envision and Envision Scenario Planner (ESP), both products of the Greening the Greyfields research project funded by the Cooperative Research Centre for Spatial Information (CRCSI) in Australia, we demonstrate a workflow for identifying potential redevelopment precincts and for designing and assessing possible redevelopment scenarios to optimise planning outcomes.

  9. Cluster Flow: A user-friendly bioinformatics workflow tool [version 1; referees: 3 approved]

    Directory of Open Access Journals (Sweden)

    Philip Ewels

    2016-12-01

    Full Text Available Pipeline tools are becoming increasingly important within the field of bioinformatics. Using a pipeline manager to manage and run workflows comprising multiple tools reduces workload and makes analysis results more reproducible. Existing tools require significant work to install and get running, typically needing pipeline scripts to be written from scratch before running any analysis. We present Cluster Flow, a simple and flexible bioinformatics pipeline tool designed to be quick and easy to install. Cluster Flow comes with 40 modules for common NGS processing steps, ready to work out of the box. Pipelines are assembled using these modules with a simple syntax that can be easily modified as required. Core helper functions automate many common NGS procedures, making running pipelines simple. Cluster Flow is available under the GNU GPLv3 license on GitHub. Documentation, examples and an online demo are available at http://clusterflow.io.

  10. Time-Efficiency Analysis Comparing Digital and Conventional Workflows for Implant Crowns: A Prospective Clinical Crossover Trial.

    Science.gov (United States)

    Joda, Tim; Brägger, Urs

    2015-01-01

    To compare time-efficiency in the production of implant crowns using a digital workflow versus the conventional pathway. This prospective clinical study used a crossover design that included 20 study participants receiving single-tooth replacements in posterior sites. Each patient received a customized titanium abutment plus a computer-aided design/computer-assisted manufacture (CAD/CAM) zirconia suprastructure (for those in the test group, using the digital workflow) and a standardized titanium abutment plus a porcelain-fused-to-metal crown (for those in the control group, using the conventional pathway). The start of the implant prosthetic treatment was established as the baseline. Time-efficiency analysis was defined as the primary outcome, and was measured for every single clinical and laboratory work step in minutes. Statistical analysis was performed with the Wilcoxon rank sum test. All crowns could be provided within two clinical appointments, independent of the manufacturing process. The mean total production time, as the sum of clinical plus laboratory work steps, was significantly different: the mean ± standard deviation (SD) time was 185.4 ± 17.9 minutes for the digital workflow process and 223.0 ± 26.2 minutes for the conventional pathway (P = .0001). Therefore, digital processing for overall treatment was 16% faster. Detailed analysis of the clinical treatment revealed a significantly reduced mean ± SD chair time of 27.3 ± 3.4 minutes for the test group compared with 33.2 ± 4.9 minutes for the control group (P = .0001). Similar results were found for the mean laboratory work time, which was significantly lower at 158.1 ± 17.2 minutes for the test group vs 189.8 ± 25.3 minutes for the control group (P = .0001). Only a few studies have investigated efficiency parameters of digital workflows compared with conventional pathways in implant dental medicine. This investigation shows that the digital workflow seems to be more time-efficient than the

  11. Hermes: Seamless delivery of containerized bioinformatics workflows in hybrid cloud (HTC) environments

    Directory of Open Access Journals (Sweden)

    Athanassios M. Kintsakis

    2017-01-01

    Full Text Available Hermes introduces a new “describe once, run anywhere” paradigm for the execution of bioinformatics workflows in hybrid cloud environments. It combines the traditional features of parallelization-enabled workflow management systems and of distributed computing platforms in a container-based approach. It offers seamless deployment, overcoming the burden of setting up and configuring the software and network requirements. Most importantly, Hermes fosters the reproducibility of scientific workflows by supporting standardization of the software execution environment, thus leading to consistent scientific workflow results and accelerating scientific output.

  12. Detecting dissonance in clinical and research workflow for translational psychiatric registries.

    Science.gov (United States)

    Cofiel, Luciana; Bassi, Débora U; Ray, Ryan Kumar; Pietrobon, Ricardo; Brentani, Helena

    2013-01-01

    The interplay between the workflow for clinical tasks and research data collection is often overlooked, ultimately making it ineffective. To the best of our knowledge, no previous studies have developed standards that allow for the comparison of workflow models derived from clinical and research tasks toward the improvement of data collection processes. In this study we used the term dissonance for occurrences where there was a discord between clinical and research workflows. We developed workflow models for a translational research study in psychiatry and the clinic where its data collection was carried out. After identifying points of dissonance between the clinical and research models, we derived a corresponding classification system that ultimately enabled us to re-engineer the data collection workflow. We considered (1) the number of patients approached for enrollment and (2) the number of patients enrolled in the study as indicators of efficiency in the research workflow. We also recorded the number of dissonances before and after the workflow modification. We identified 22 episodes of dissonance across 6 dissonance categories: actor, communication, information, artifact, time, and space. We were able to eliminate 18 episodes of dissonance and increase the number of patients approached and enrolled in the research study through workflow modification. The classification developed in this study is useful for guiding the identification of dissonances and revealing the modifications required to align the workflow of data collection with the clinical setting. The methodology described in this study can be used by researchers to standardize the data collection process.

  13. How to plan workflow changes: a practical quality improvement tool used in an outpatient hospital pharmacy.

    Science.gov (United States)

    Aguilar, Christine; Chau, Connie; Giridharan, Neha; Huh, Youchin; Cooley, Janet; Warholak, Terri L

    2013-06-01

    A quality improvement tool is provided to improve pharmacy workflow with the goal of minimizing errors caused by workflow issues. This study involved workflow evaluation and reorganization, and staff opinions of these proposed changes. The study pharmacy was an outpatient pharmacy in the Tucson area; however, the quality improvement tool may be applied in all pharmacy settings, including but not limited to community, hospital, and independent pharmacies. This tool can help the user to identify potential workflow problem spots, such as high-traffic areas, through the creation of current and proposed workflow diagrams. Creating a visual representation can help the user to identify problem spots and to propose changes to optimize workflow. It may also be helpful to assess employees' opinions of these changes. The workflow improvement tool can be used to assess where improvements are needed in a pharmacy's floor plan and workflow. Suggestions for improvements in the study pharmacy included increasing the number of verification points and decreasing high-traffic areas in the workflow. The employees of the study pharmacy felt that the proposed changes displayed greater continuity, sufficiency, accessibility, and space within the pharmacy.

  14. Geometric processing workflow for vertical and oblique hyperspectral frame images collected using UAV

    Science.gov (United States)

    Markelin, L.; Honkavaara, E.; Näsi, R.; Nurminen, K.; Hakala, T.

    2014-08-01

    Remote sensing based on unmanned airborne vehicles (UAVs) is a rapidly developing field of technology. UAVs enable accurate, flexible, low-cost and multiangular measurements of 3D geometric, radiometric, and temporal properties of land and vegetation using various sensors. In this paper we present a geometric processing chain for a multiangular measurement system that is designed for measuring object directional reflectance characteristics in a wavelength range of 400-900 nm. The technique is based on a novel, lightweight spectral camera designed for UAV use. The multiangular measurement is conducted by collecting vertical and oblique area-format spectral images. End products of the geometric processing are image exterior orientations, 3D point clouds and digital surface models (DSMs). These data are needed for the radiometric processing chain that produces reflectance image mosaics and multiangular bidirectional reflectance factor (BRF) observations. The geometric processing workflow consists of the following three steps: (1) determining approximate image orientations using Visual Structure from Motion (VisualSFM) software, (2) calculating improved orientations and sensor calibration using a method based on self-calibrating bundle block adjustment (standard photogrammetric software; this step is optional), and finally (3) creating dense 3D point clouds and DSMs using Photogrammetric Surface Reconstruction from Imagery (SURE) software, which is based on a semi-global matching algorithm and is capable of providing a point density corresponding to the pixel size of the image. We have tested the geometric processing workflow over various targets, including test fields, agricultural fields, lakes and complex 3D structures like forests.

  15. A comprehensive workflow for general-purpose neural modeling with highly configurable neuromorphic hardware systems.

    Science.gov (United States)

    Brüderle, Daniel; Petrovici, Mihai A; Vogginger, Bernhard; Ehrlich, Matthias; Pfeil, Thomas; Millner, Sebastian; Grübl, Andreas; Wendt, Karsten; Müller, Eric; Schwartz, Marc-Olivier; de Oliveira, Dan Husmann; Jeltsch, Sebastian; Fieres, Johannes; Schilling, Moritz; Müller, Paul; Breitwieser, Oliver; Petkov, Venelin; Muller, Lyle; Davison, Andrew P; Krishnamurthy, Pradeep; Kremkow, Jens; Lundqvist, Mikael; Muller, Eilif; Partzsch, Johannes; Scholze, Stefan; Zühl, Lukas; Mayr, Christian; Destexhe, Alain; Diesmann, Markus; Potjans, Tobias C; Lansner, Anders; Schüffny, René; Schemmel, Johannes; Meier, Karlheinz

    2011-05-01

    In this article, we present a methodological framework that meets novel requirements emerging from upcoming types of accelerated and highly configurable neuromorphic hardware systems. We describe in detail a device with 45 million programmable and dynamic synapses that is currently under development, and we sketch the conceptual challenges that arise from taking this platform into operation. More specifically, we aim at the establishment of this neuromorphic system as a flexible and neuroscientifically valuable modeling tool that can be used by non-hardware experts. We consider various functional aspects to be crucial for this purpose, and we introduce a consistent workflow with detailed descriptions of all involved modules that implement the suggested steps: the integration of the hardware interface into the simulator-independent model description language PyNN; a fully automated translation between the PyNN domain and appropriate hardware configurations; an executable specification of the future neuromorphic system that can be seamlessly integrated into this biology-to-hardware mapping process as a test bench for all software layers and possible hardware design modifications; and an evaluation scheme that deploys models from a dedicated benchmark library, compares the results generated by virtual or prototype hardware devices with reference software simulations, and analyzes the differences. The integration of these components into one hardware-software workflow provides an ecosystem for ongoing preparative studies that support the hardware design process and represents the basis for the maturity of the model-to-hardware mapping software. The functionality and flexibility of the latter are demonstrated with a variety of experimental results.
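
    The biology-to-hardware mapping described above starts from a simulator-independent PyNN model description. The following minimal sketch (hypothetical network parameters, standard PyNN API; not the paper's benchmark library) shows the kind of script such a workflow translates into a hardware configuration; retargeting it means swapping the imported backend module for the hardware backend.

      import pyNN.nest as sim  # backend module is interchangeable across simulators/hardware

      sim.setup(timestep=0.1)  # ms
      stimulus = sim.Population(1, sim.SpikeSourcePoisson(rate=50.0))
      neurons = sim.Population(10, sim.IF_cond_exp())  # default cell parameters
      sim.Projection(stimulus, neurons, sim.AllToAllConnector(),
                     synapse_type=sim.StaticSynapse(weight=0.01, delay=1.0))
      neurons.record('spikes')
      sim.run(1000.0)  # ms
      spiketrains = neurons.get_data().segments[0].spiketrains  # compare across backends
      sim.end()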

  16. Successful Completion of FY18/Q1 ASC L2 Milestone 6355: Electrical Analysis Calibration Workflow Capability Demonstration.

    Energy Technology Data Exchange (ETDEWEB)

    Copps, Kevin D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-12-01

    The Sandia Analysis Workbench (SAW) project has developed and deployed a production capability for SIERRA computational mechanics analysis workflows. However, the electrical analysis workflow capability requirements have only been demonstrated in early prototype states, with no real capability deployed for analysts’ use. This milestone aims to improve the electrical analysis workflow capability (via SAW and related tools) and deploy it for ongoing use. We propose to focus on a QASPR electrical analysis calibration workflow use case. We will include a number of new capabilities (versus today’s SAW), such as: 1) support for the XYCE code workflow component, 2) data management coupled to the electrical workflow, 3) human-in-the-loop workflow capability, and 4) electrical analysis workflow capability deployed on the restricted (and possibly classified) network at Sandia. While far from the complete set of capabilities required for electrical analysis workflow over the long term, this is a substantial first step toward full production support for the electrical analysts.

  17. A Semi-Automated Workflow Solution for Data Set Publication

    Directory of Open Access Journals (Sweden)

    Suresh Vannan

    2016-03-01

    Full Text Available To address the need for published data, considerable effort has gone into formalizing the process of data publication. From funding agencies to publishers, data publication has rapidly become a requirement. Digital Object Identifiers (DOIs) and data citations have enhanced the integration and availability of data. The challenge facing data publishers now is to deal with the increased number of publishable data products and, most importantly, the difficulties of publishing diverse data products into an online archive. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC), a NASA-funded data center, faces these challenges as it deals with data products created by individual investigators. This paper summarizes the challenges of curating data and provides a summary of a workflow solution that ORNL DAAC researchers and technical staff have created to deal with the publication of the diverse data products. The workflow solution presented here is generic and can be applied to data from any scientific domain and data located at any data center.

  18. Magallanes: a web services discovery and automatic workflow composition tool

    Directory of Open Access Journals (Sweden)

    Trelles Oswaldo

    2009-10-01

    Full Text Available Abstract Background To aid in bioinformatics data processing and analysis, an increasing number of web-based applications are being deployed. Although this is a positive circumstance in general, the proliferation of tools makes it difficult to find the right tool, or more importantly, the right set of tools that can work together to solve real, complex problems. Results Magallanes (Magellan) is a versatile, platform-independent Java library of algorithms aimed at discovering bioinformatics web services and associated data types. A second important feature of Magallanes is its ability to connect available and compatible web services into workflows that can process data sequentially to reach a desired output given a particular input. Magallanes' capabilities can be exploited both as an API or directly accessed through a graphical user interface. The Magallanes API is freely available for academic use, and together with the Magallanes application has been tested on MS-Windows™ XP and Unix-like operating systems. Detailed implementation information, including user manuals and tutorials, is available at http://www.bitlab-es.com/magallanes. Conclusion Different implementations of the same client (web page, desktop applications, web services, etc.) have been deployed and are currently in use in real installations such as the National Institute of Bioinformatics (Spain) and the ACGT-EU project. This demonstrates the potential utility and versatility of the software library, including the integration of novel tools in the domain, and provides strong evidence of its ability to facilitate the automatic discovery and composition of workflows.
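
    The automatic composition feature described above can be understood as a search over a registry of services keyed by their input and output data types. The sketch below (hypothetical registry and type names, not the Magallanes API) chains compatible services from a given input type to a desired output type with a breadth-first search, which also yields the shortest workflow.

      from collections import deque

      # Hypothetical registry: service name -> (input type, output type)
      services = {
          "blast": ("fasta", "blast_report"),
          "parse_hits": ("blast_report", "id_list"),
          "fetch_seqs": ("id_list", "fasta"),
          "align": ("fasta", "alignment"),
      }

      def compose(start_type, goal_type):
          """Return the shortest chain of services turning start_type into goal_type."""
          queue = deque([(start_type, [])])
          seen = {start_type}
          while queue:
              dtype, path = queue.popleft()
              if dtype == goal_type:
                  return path
              for name, (inp, out) in services.items():
                  if inp == dtype and out not in seen:
                      seen.add(out)
                      queue.append((out, path + [name]))
          return None  # no compatible chain exists

      print(compose("fasta", "id_list"))  # ['blast', 'parse_hits']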

  19. Workflow of the Grover algorithm simulation incorporating CUDA and GPGPU

    Science.gov (United States)

    Lu, Xiangwen; Yuan, Jiabin; Zhang, Weiwei

    2013-09-01

    The Grover quantum search algorithm, one of only a few representative quantum algorithms, can speed up many classical algorithms that use search heuristics. No true quantum computer has yet been developed. For the present, simulation is one effective means of verifying the search algorithm. In this work, we focus on the simulation workflow using a compute unified device architecture (CUDA). Two simulation workflow schemes are proposed. These schemes combine the characteristics of the Grover algorithm and the parallelism of general-purpose computing on graphics processing units (GPGPU). We also analyzed the optimization of memory space and memory access from this perspective. We implemented four programs on CUDA to evaluate the performance of the schemes and the optimization. Through experimentation, we analyzed the organization of threads suited to Grover algorithm simulations, compared the storage costs of the four programs, and validated the effectiveness of the optimization. Experimental results also showed that the best-performing CUDA program outperformed the serial libquantum program on a CPU with a speedup of up to 23 times (12 times on average), depending on the scale of the simulation.
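
    For readers unfamiliar with the algorithm being parallelized, the state-vector computation at the heart of such a simulation fits in a few lines of NumPy (a serial CPU analogue for illustration, not the paper's CUDA implementation): each Grover iteration phase-flips the marked amplitude and then inverts all amplitudes about their mean.

      import numpy as np

      def grover_search(n_qubits, marked):
          """Simulate Grover's algorithm on a dense state vector."""
          N = 2 ** n_qubits
          state = np.full(N, 1.0 / np.sqrt(N))      # uniform superposition
          iterations = int(np.pi / 4 * np.sqrt(N))  # near-optimal iteration count
          for _ in range(iterations):
              state[marked] *= -1.0                 # oracle: phase-flip the marked item
              state = 2.0 * state.mean() - state    # diffusion: inversion about the mean
          return int(np.argmax(state ** 2)), iterations

      print(grover_search(10, marked=417))  # (417, 25): found with high probability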

  20. Metagenomics workflow analysis of endophytic bacteria from oil palm fruits

    Science.gov (United States)

    Tanjung, Z. A.; Aditama, R.; Sudania, W. M.; Utomo, C.; Liwang, T.

    2017-05-01

    Next-Generation Sequencing (NGS) has become a powerful sequencing tool for microbial studies, especially in establishing the field of metagenomics. This study describes a workflow to analyze the metagenomics data of a Sequence Read Archive (SRA) file under accession ERP004286, deposited by the University of Sao Paulo. It is direct sequencing data generated by the 454 pyrosequencing platform from oil palm fruit endophytic bacteria which were cultured using an oil-palm-enriched medium. The workflow used SortMeRNA to separate ribosomal read sequences, Newbler (GS Assembler and GS Mapper) to assemble reads and map them to a reference genome, the BLAST package to identify and annotate contig sequences, and QualiMap for statistical analysis. Eight bacterial species were identified in this study. Enterobacter cloacae was the most abundant species, followed by Citrobacter koseri, Serratia marcescens, Lactococcus lactis subsp. lactis, Klebsiella pneumoniae, Citrobacter amalonaticus, Achromobacter xylosoxidans, and Pseudomonas sp., respectively. All of these species have been reported as endophytic bacteria in various plant species, and each has potential as a plant-growth-promoting bacterium or for other applications in agricultural industries.

  1. Automation and workflow considerations for embedding Digimarc Barcodes at scale

    Science.gov (United States)

    Rodriguez, Tony; Haaga, Don; Calhoon, Sean

    2015-03-01

    The Digimarc® Barcode is a digital watermark applied to packages and variable data labels that carries GS1 standard GTIN-14 data traditionally carried by a 1-D barcode. The Digimarc Barcode can be read with smartphones and imaging-based barcode readers commonly used in grocery and retail environments. Using smartphones, consumers can engage with products, and retailers can materially increase the speed of check-out, increasing store margins and providing a better experience for shoppers. Internal testing has shown an average 53% increase in scanning throughput, enabling hundreds of millions of dollars in cost savings [1] for retailers when deployed at scale. To get to scale, the process of embedding a digital watermark must be automated and integrated within existing workflows. Creating the tools and processes to do so represents a new challenge for the watermarking community. This paper presents a description and an analysis of the workflow implemented by Digimarc to deploy the Digimarc Barcode at scale. An overview of the tools created and lessons learned during the introduction of the technology to the market are provided.

  2. Value-Oriented Design of Service Coordination Processes: Correctness and Trust

    NARCIS (Netherlands)

    Wieringa, Roelf J.; Gordijn, Jaap

    The rapid growth of service coordination languages creates a need for methodological support for coordination design. Coordination design differs from workflow design because a coordination process connects different businesses that can each make design decisions independently from the others, and

  3. Improvement of workflow and processes to ease and enrich meaningful use of health information technology

    Directory of Open Access Journals (Sweden)

    Singh R

    2013-11-01

    Full Text Available Ranjit Singh,1 Ashok Singh,2 Devan R Singh,3 Gurdev Singh1 1Department of Family Medicine, UB Patient Safety Research Center, School of Medicine and Management, State University of NY at Buffalo, NY, USA; 2Niagara Family Medicine Associates, Niagara Falls, NY, USA; 3SaferPatients LLC, Lewiston, NY, USA Abstract: The introduction of health information technology (HIT) can have unexpected and unintended patient safety and/or quality consequences. This highly desirable but complex intervention requires workflow changes in order to be effective. Workflow is often cited by providers as the number one 'pain point'. Its redesign needs to be tailored to the organizational context, the current workflow, the HIT system being introduced, and the resources available. Primary care practices lack the required expertise and need external assistance. Unfortunately, the current methods of using esoteric charts or software are alien to health care workers and are, therefore, perceived to be barriers. Most importantly and ironically, these do not readily educate or enable staff to inculcate a common vision, ownership, and empowerment among all stakeholders. These attributes are necessary for creating highly reliable organizations. We present a tool that addresses US Accreditation Council for Graduate Medical Education (ACGME) competency requirements. Of the six competencies called for by the ACGME, the two that this tool particularly addresses are 'system-based practice' and 'practice-based learning and continuing improvement'. This toolkit is founded on a systems engineering approach. It includes a motivational and orientation presentation, 128 magnetic pictorial and write-erase icons of 40 designs, a dry-erase magnetic board, and five visual aids for reducing cognitive and emotive biases in staff. Pilot tests were carried out in practices in Western New York and Colorado, USA. In addition, the toolkit was presented at the 2011 North American Primary Care Research Group (NAPCRG

  4. An open source workflow for 3D printouts of scientific data volumes

    Science.gov (United States)

    Loewe, P.; Klump, J. F.; Wickert, J.; Ludwig, M.; Frigeri, A.

    2013-12-01

    As the amount of scientific data continues to grow, researchers need new tools to help them visualize complex data. Immersive data visualisations are helpful, yet fail to provide tactile feedback and sensory feedback on spatial orientation, as provided by tangible objects. The gap in sensory feedback from virtual objects has led to the development of tangible representations of geospatial information to solve real-world problems. Examples are animated globes [1], interactive environments like tangible GIS [2], and on-demand 3D prints. The production of a tangible representation of a scientific data set is one step in a line of scientific thinking, leading from the physical world into scientific reasoning and back: the process starts with a physical observation, or from a data stream generated by an environmental sensor. This data stream is turned into a geo-referenced data set. This data is turned into a volume representation which is converted into command sequences for the printing device, leading to the creation of a 3D printout. As a last, but crucial step, this new object has to be documented and linked to the associated metadata, and curated in long-term repositories to preserve its scientific meaning and context. The workflow to produce tangible 3D data prints from science data at the German Research Centre for Geosciences (GFZ) was implemented as software based on the Free and Open Source geoinformatics tools GRASS GIS and Paraview. The workflow was successfully validated in various application scenarios at GFZ using a RapMan printer to create 3D specimens of elevation models, geological underground models, ice-penetrating radar soundings for planetology, and space-time stacks for Tsunami model quality assessment. While these first pilot applications have demonstrated the feasibility of the overall approach [3], current research focuses on the provision of the workflow as Software as a Service (SAAS), thematic generalisation of information content and

  5. Benchmarking quantitative label-free LC-MS data processing workflows using a complex spiked proteomic standard dataset.

    Science.gov (United States)

    Ramus, Claire; Hovasse, Agnès; Marcellin, Marlène; Hesse, Anne-Marie; Mouton-Barbosa, Emmanuelle; Bouyssié, David; Vaca, Sebastian; Carapito, Christine; Chaoui, Karima; Bruley, Christophe; Garin, Jérôme; Cianférani, Sarah; Ferro, Myriam; Van Dorssaeler, Alain; Burlet-Schiltz, Odile; Schaeffer, Christine; Couté, Yohann; Gonzalez de Peredo, Anne

    2016-01-30

    Proteomic workflows based on nanoLC-MS/MS data-dependent-acquisition analysis have progressed tremendously in recent years. High-resolution and fast sequencing instruments have enabled the use of label-free quantitative methods, based either on spectral counting or on MS signal analysis, which appear as an attractive way to analyze differential protein expression in complex biological samples. However, the computational processing of the data for label-free quantification still remains a challenge. Here, we used a proteomic standard composed of an equimolar mixture of 48 human proteins (Sigma UPS1) spiked at different concentrations into a background of yeast cell lysate to benchmark several label-free quantitative workflows, involving different software packages developed in recent years. This experimental design allowed us to finely assess their performance in terms of sensitivity and false discovery rate, by measuring the numbers of true and false positives (UPS1 and yeast background proteins, respectively, found as differential). The spiked standard dataset has been deposited to the ProteomeXchange repository with the identifier PXD001819 and can be used to benchmark other label-free workflows, adjust software parameter settings, improve algorithms for extraction of the quantitative metrics from raw MS data, or evaluate downstream statistical methods. Bioinformatic pipelines for label-free quantitative analysis must be objectively evaluated in their ability to detect variant proteins with good sensitivity and low false discovery rate in large-scale proteomic studies. This can be done through the use of complex spiked samples, for which the "ground truth" of variant proteins is known, allowing a statistical evaluation of the performances of the data processing workflow. We provide here such a controlled standard dataset and used it to evaluate the performances of several label-free bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold) in
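
    The evaluation that such a spiked standard enables is simple to state: every protein reported as differential is either a true positive (a UPS1 protein) or a false positive (a yeast background protein). A minimal sketch with hypothetical identifiers:

      def benchmark(reported_differential, ups1_proteins):
          """Score a label-free workflow's output against the spiked ground truth."""
          tp = sum(1 for p in reported_differential if p in ups1_proteins)
          fp = len(reported_differential) - tp
          sensitivity = tp / len(ups1_proteins)
          fdr = fp / len(reported_differential) if reported_differential else 0.0
          return sensitivity, fdr

      ups1 = {f"UPS1_{i}" for i in range(48)}  # ground truth: 48 spiked human proteins
      reported = {f"UPS1_{i}" for i in range(40)} | {"YEAST_A", "YEAST_B"}
      print(benchmark(reported, ups1))  # (0.833..., 0.047...): sensitivity and FDR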

  6. Workflow with pitfalls to derive a regional airborne magnetic compilation

    Science.gov (United States)

    Brönner, Marco; Baykiev, Eldar; Ebbing, Jörg

    2017-04-01

    Today, large-scale magnetic maps are usually a patchwork of different airborne surveys of different sizes, different resolutions and different years. Airborne magnetic acquisition is a fast and economic method to map and gain geological and tectonic information for large areas, onshore and offshore. Depending on the aim of a survey, acquisition parameters like altitude and profile distance are usually adjusted to match the purpose of investigation. The subsequent data processing commonly follows a standardized workflow comprising core-field subtraction and line leveling to yield a coherent crustal-field magnetic grid for a survey area. The resulting data make it possible to correlate with geological and tectonic features in the subsurface, which is of importance for e.g. oil and mineral exploration. Crustal-scale magnetic interpretation and modeling demand regional compilation of magnetic data and the merger of adjacent magnetic surveys. These studies not only focus on shallower sources, reflected by short- to intermediate-wavelength magnetic anomalies, but also have a particular interest in the long wavelengths deriving from deep-seated sources. However, whilst the workflow to produce such a merger is supported by quite a few powerful routines, the resulting compilation contains several pitfalls and limitations, which have been discussed before but are still little recognized. The maximum wavelength that can be resolved by each individual survey is directly related to the survey size, and consequently a merger will introduce erroneous long-wavelength components into the magnetic data compilation. To minimize this problem and to homogenize the longer wavelengths, a first-order approach is to combine the airborne data compilation with satellite magnetic data, which is sufficient only under particular preconditions. A more advanced approach considers the gap in frequencies between airborne and satellite data, which motivated

  7. Provenance for Runtime Workflow Steering and Validation in Computational Seismology

    Science.gov (United States)

    Spinuso, A.; Krischer, L.; Krause, A.; Filgueira, R.; Magnoni, F.; Muraleedharan, V.; David, M.

    2014-12-01

    Provenance systems may be offered by modern workflow engines to collect metadata about the data transformations at runtime. If combined with effective visualisation and monitoring interfaces, these provenance recordings can speed up the validation process of an experiment, suggesting interactive or automated interventions with immediate effects on the lifecycle of a workflow run. For instance, in the field of computational seismology, if we consider research applications performing long-lasting cross-correlation analyses and high-resolution simulations, the immediate notification of logical errors and rapid access to intermediate results can produce reactions that foster more efficient progress of the research. These applications are often executed in secured and sophisticated HPC and HTC infrastructures, highlighting the need for a comprehensive framework that facilitates the extraction of fine-grained provenance and the development of provenance-aware components, leveraging the scalability characteristics of the adopted workflow engines, whose enactment can be mapped to different technologies (MPI, Storm clusters, etc.). This work looks at the adoption of W3C-PROV concepts and data model within a user-driven processing and validation framework for seismic data, supporting also computational and data management steering. Validation needs to balance automation with user intervention, considering the scientist as part of the archiving process. Therefore, the provenance data is enriched with community-specific metadata vocabularies and control messages, making an experiment reproducible and its description consistent with the community understandings. Moreover, it can contain user-defined terms and annotations. The current implementation of the system is supported by the EU-funded VERCE project (http://verce.eu). In addition to the provenance generation mechanisms, it provides a prototype browser-based user interface and a web API built on top of a NoSQL storage
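
    The W3C-PROV data model mentioned above can be illustrated with the Python prov package (a generic sketch with hypothetical names, not the VERCE framework itself): a data transformation is recorded as an activity that used one entity and generated another.

      from prov.model import ProvDocument

      doc = ProvDocument()
      doc.add_namespace('seis', 'http://example.org/seismology/')  # hypothetical namespace

      raw = doc.entity('seis:raw-traces')
      result = doc.entity('seis:cross-correlation-output')
      run = doc.activity('seis:cross-correlate-run-42')

      doc.used(run, raw)               # the run consumed the raw traces
      doc.wasGeneratedBy(result, run)  # ...and produced the correlation output

      print(doc.get_provn())           # PROV-N serialization of the provenance trace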

  8. Next Generation Sequence Analysis and Computational Genomics Using Graphical Pipeline Workflows

    Directory of Open Access Journals (Sweden)

    Marquis P. Vawter

    2012-08-01

    Full Text Available Whole-genome and exome sequencing have already proven to be essential and powerful methods to identify genes responsible for simple Mendelian inherited disorders. These methods can be applied to complex disorders as well, and have been adopted as one of the current mainstream approaches in population genetics. These achievements have been made possible by next-generation sequencing (NGS) technologies, which require substantial bioinformatics resources to analyze the dense and complex sequence data. The huge analytical burden of data from genome sequencing might be seen as a bottleneck slowing the publication of NGS papers at this time, especially in psychiatric genetics. We review the existing methods for processing NGS data, to place into context the rationale for the design of a computational resource. We describe our method, the Graphical Pipeline for Computational Genomics (GPCG), to perform the computational steps required to analyze NGS data. The GPCG implements flexible workflows for basic sequence alignment, sequence data quality control, single nucleotide polymorphism analysis, copy number variant identification, annotation, and visualization of results. These workflows cover all the analytical steps required for NGS data, from processing the raw reads to variant calling and annotation. The current version of the pipeline is freely available at http://pipeline.loni.ucla.edu. These applications of NGS analysis may gain clinical utility in the near future (e.g., identifying miRNA signatures in diseases) when the bioinformatics approach is made feasible. Taken together, the annotation tools and strategies that have been developed to retrieve information and test hypotheses about the functional role of variants present in the human genome will help to pinpoint the genetic risk factors for psychiatric disorders.

  9. A Workflow Environment for Reactive Transport Modeling with Application to a Mixing- Controlled Precipitation Experiment

    Science.gov (United States)

    Schuchardt, K. L.; Sun, L.; Chase, J. M.; Elsethagen, T. O.; Freedman, V. L.; Redden, G. D.; Scheibe, T. D.

    2007-12-01

    Advances in subsurface modeling techniques such as multi-scale methods, hybrid models, and inverse modeling, combined with petascale computing capabilities, will result in simulations that run over longer time scales, cover larger geographic regions, and model increasingly detailed physical processes. This will lead to significantly more data of increased complexity, creating challenges to already strained processes for parameterizing and running models, organizing and tracking data, and visualizing outputs. To support effective development and utilization of next-generation simulators, we are developing a process integration framework that combines and extends leading edge technologies for process automation, data and metadata management, and large-scale data visualization. Our process integration framework applies workflow techniques to integrate components for accessing and preparing inputs, running simulations, and analyzing results. Data management and provenance middleware enables sharing and community development of data sources and stores full information about data and processes. In the hands of modelers, experimentalists, and developers, the process integration framework will improve efficiency, accuracy and confidence in results, and broaden the array of theories available. In this poster (which will include a live computer demo of the workflow environment) we will present a prototype of the process integration framework, developed to address a selected benchmark problem. The prototype is being used to perform simulations of an intermediate-scale experiment in which a solid mineral is precipitated from the reaction of two mixing solutes. A range of possible experimental configurations are being explored to support design of a planned set of experiments incorporating heterogeneous media. The prototype provides a user interface to specify parameter ranges, runs the required simulations on a user specified machine, automatically manages the input and output

  10. Ceramic molar crown reproducibility by digital workflow manufacturing: An in vitro study.

    Science.gov (United States)

    Jeong, Ii-Do; Kim, Woong-Chul; Park, Jinyoung; Kim, Chong-Myeong; Kim, Ji-Hwan

    2017-08-01

    This in vitro study aimed to analyze and compare the reproducibility of zirconia and lithium disilicate crowns manufactured by a digital workflow. A typodont model with a prepped upper first molar was set in a phantom head, and a digital impression was obtained with a video intraoral scanner (CEREC Omnicam; Sirona GmbH), from which a single crown was designed and manufactured with CAD/CAM into a zirconia crown and a lithium disilicate crown (n=12). The reproducibility of each crown was quantitatively retrieved by superimposing the digitized data of the crown in 3D inspection software, and differences were graphically mapped in color. Areas with large differences were analyzed with digital microscopy. Mean quadratic deviation (RMS) values quantitatively obtained from each ceramic group were statistically analyzed with Student's t-test (α=.05). The RMS value of the lithium disilicate crown was 29.2 (4.1) µm and 17.6 (5.5) µm on the outer and inner surfaces, respectively, whereas these values were 18.6 (2.0) µm and 20.6 (5.1) µm for the zirconia crown. The reproducibility of zirconia and lithium disilicate crowns differed significantly only on the outer surface (P<.001). The outer surface of the lithium disilicate crown showed over-contouring on the buccal surface and under-contouring on the inner occlusal surface. The outer surface of the zirconia crown showed both over- and under-contouring on the buccal surface, and the inner surface showed under-contouring in the marginal areas. Restoration manufacturing by digital workflow will enhance the reproducibility of zirconia single crowns more than that of lithium disilicate single crowns.
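
    The reproducibility metric used here, the root of the mean squared point-wise deviation between a crown's scan and its reference after superimposition, can be sketched as follows (synthetic data for illustration; real comparisons run inside dedicated 3D inspection software):

      import numpy as np

      def rms_deviation(scan_points, reference_points):
          """RMS of distances between corresponding surface points (µm)."""
          distances = np.linalg.norm(scan_points - reference_points, axis=1)
          return np.sqrt(np.mean(distances ** 2))

      rng = np.random.default_rng(0)
      reference = rng.uniform(0, 10_000, size=(5_000, 3))          # surface points, µm
      scan = reference + rng.normal(0, 20, size=reference.shape)   # 20 µm noise per axis
      print(round(rms_deviation(scan, reference), 1))              # ~34.6 (= 20·√3)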

  11. Launching an EarthCube Interoperability Workbench for Constructing Workflows and Employing Service Interfaces

    Science.gov (United States)

    Fulker, D. W.; Pearlman, F.; Pearlman, J.; Arctur, D. K.; Signell, R. P.

    2016-12-01

    A major challenge for geoscientists—and a key motivation for the National Science Foundation's EarthCube initiative—is to integrate data across disciplines, as is necessary for complex Earth-system studies such as climate change. The attendant technical and social complexities have led EarthCube participants to devise a system-of-systems architectural concept. Its centerpiece is a (virtual) interoperability workbench, around which a learning community can coalesce, supported in their evolving quests to join data from diverse sources, to synthesize new forms of data depicting Earth phenomena, and to overcome immense obstacles that arise, for example, from mismatched nomenclatures, projections, mesh geometries and spatial-temporal scales. The full architectural concept will require significant time and resources to implement, but this presentation describes a (minimal) starter kit. With a keep-it-simple mantra this workbench starter kit can fulfill the following four objectives: 1) demonstrate the feasibility of an interoperability workbench by mid-2017; 2) showcase scientifically useful examples of cross-domain interoperability, drawn, e.g., from funded EarthCube projects; 3) highlight selected aspects of EarthCube's architectural concept, such as a system of systems (SoS) linked via service interfaces; 4) demonstrate how workflows can be designed and used in a manner that enables sharing, promotes collaboration and fosters learning. The outcome, despite its simplicity, will embody service interfaces sufficient to construct—from extant components—data-integration and data-synthesis workflows involving multiple geoscience domains. Tentatively, the starter kit will build on the Jupyter Notebook web application, augmented with libraries for interfacing current services (e.g., at data centers involved in EarthCube's Council of Data Facilities) and services developed specifically for EarthCube and spanning most geoscience domains.

  12. Scheduling Multilevel Deadline-Constrained Scientific Workflows on Clouds Based on Cost Optimization

    Directory of Open Access Journals (Sweden)

    Maciej Malawski

    2015-01-01

    Full Text Available This paper presents a cost optimization model for scheduling scientific workflows on IaaS clouds such as Amazon EC2 or RackSpace. We assume multiple IaaS clouds with heterogeneous virtual machine instances, with a limited number of instances per cloud and hourly billing. Input and output data are stored on a cloud object store such as Amazon S3. Applications are scientific workflows modeled as DAGs, as in the Pegasus Workflow Management System. We assume that tasks in the workflows are grouped into levels of identical tasks. Our model is specified using mathematical programming languages (AMPL and CMPL) and allows us to minimize the cost of workflow execution under deadline constraints. We present results obtained using our model and the benchmark workflows representing real scientific applications in a variety of domains. The data used for evaluation come from synthetic workflows and from general-purpose cloud benchmarks, as well as from data measured in our own experiments with Montage, an astronomical application, executed on the Amazon EC2 cloud. We indicate how this model can be used for scenarios that require resource planning for scientific workflows and their ensembles.
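
    The flavor of the model can be conveyed with a deliberately simplified sketch in PuLP rather than AMPL/CMPL (hypothetical prices and task rates; the full model adds per-cloud instance limits, data transfer and integer hourly billing): minimize the cost of VM hours while one level of identical tasks finishes before the deadline.

      import pulp

      task_count, deadline_h = 100, 4.0
      vm_types = {"small": {"price_per_h": 0.10, "tasks_per_h": 5},
                  "large": {"price_per_h": 0.40, "tasks_per_h": 25}}

      prob = pulp.LpProblem("level_cost_min", pulp.LpMinimize)
      hours = {v: pulp.LpVariable(f"hours_{v}", lowBound=0, upBound=deadline_h)
               for v in vm_types}
      # objective: total cost of leased VM hours
      prob += pulp.lpSum(vm_types[v]["price_per_h"] * hours[v] for v in vm_types)
      # constraint: all tasks of the level must be processed within the leased hours
      prob += pulp.lpSum(vm_types[v]["tasks_per_h"] * hours[v] for v in vm_types) >= task_count
      prob.solve()
      print({v: h.value() for v, h in hours.items()}, pulp.value(prob.objective))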

  13. Elastic Scheduling of Scientific Workflows under Deadline Constraints in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Nazia Anwar

    2018-01-01

    Full Text Available Scientific workflow applications are collections of several structured activities and fine-grained computational tasks. Scientific workflow scheduling in cloud computing is a challenging research topic due to its distinctive features. In cloud environments, it has become critical to perform efficient task scheduling resulting in reduced scheduling overhead, minimized cost and maximized resource utilization while still meeting the user-specified overall deadline. This paper proposes a strategy, Dynamic Scheduling of Bag of Tasks based workflows (DSB), for scheduling scientific workflows with the aim of minimizing the financial cost of leasing Virtual Machines (VMs) under a user-defined deadline constraint. The proposed model groups the workflow into Bags of Tasks (BoTs) based on data dependency and priority constraints and thereafter optimizes the allocation and scheduling of BoTs on elastic, heterogeneous and dynamically provisioned cloud resources called VMs in order to attain the proposed method’s objectives. The proposed approach considers pay-as-you-go Infrastructure as a Service (IaaS) clouds having inherent features such as elasticity, abundance, heterogeneity and VM provisioning delays. A trace-based simulation using benchmark scientific workflows representing real-world applications demonstrates a significant reduction in workflow computation cost while the workflow deadline is met. The results validate that the proposed model produces higher success rates in meeting deadlines and better cost efficiency in comparison to adapted state-of-the-art algorithms for similar problems.
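
    The first step of the proposed strategy, grouping the workflow into Bags of Tasks, can be approximated by partitioning the DAG into topological levels, as in this sketch (hypothetical task names; the paper's grouping additionally weighs priority constraints):

      from collections import defaultdict, deque

      def group_into_bots(tasks, parents):
          """Partition a DAG into Bags of Tasks by dependency depth (Kahn-style)."""
          indegree = {t: len(parents.get(t, [])) for t in tasks}
          children = defaultdict(list)
          for child, parent_list in parents.items():
              for p in parent_list:
                  children[p].append(child)
          level = {t: 0 for t in tasks}
          queue = deque(t for t in tasks if indegree[t] == 0)
          while queue:
              t = queue.popleft()
              for c in children[t]:
                  level[c] = max(level[c], level[t] + 1)
                  indegree[c] -= 1
                  if indegree[c] == 0:
                      queue.append(c)
          bots = defaultdict(list)
          for t, lvl in level.items():
              bots[lvl].append(t)
          return [sorted(bots[l]) for l in sorted(bots)]

      print(group_into_bots(["a", "b", "c", "d"], {"c": ["a", "b"], "d": ["c"]}))
      # [['a', 'b'], ['c'], ['d']] -- each inner list can be scheduled as one bag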

  14. VLAM-G: Interactive Data Driven Workflow Engine for Grid-Enabled Resources

    Directory of Open Access Journals (Sweden)

    Vladimir Korkhov

    2007-01-01

    Full Text Available Grid brings the power of many computers to scientists. However, the development of Grid-enabled applications requires knowledge about the Grid infrastructure and low-level APIs to Grid services. In turn, workflow management systems provide a high-level environment for rapid prototyping of experimental computing systems. Coupling the Grid and workflow paradigms is important for the scientific community: it makes the power of the Grid easily available to the end user. The paradigm of data-driven workflow execution is one of the ways to enable distributed workflows on the Grid. The work presented in this paper is carried out in the context of the Virtual Laboratory for e-Science project. We present the VLAM-G workflow management system and its core component: the Run-Time System (RTS). The RTS is a dataflow-driven workflow engine which utilizes Grid resources, hiding the complexity of the Grid from the scientist. Special attention is paid to the concept of dataflow and direct data streaming between distributed workflow components. We present the architecture and components of the RTS, describe the features of VLAM-G workflow execution, and evaluate the system with performance measurements and a real-life use case.
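
    The dataflow-driven execution model, in which a component starts processing as soon as data appears on its input stream instead of waiting for upstream components to finish, can be illustrated with Python generators (a single-process sketch; the RTS streams data directly between distributed Grid components):

      def source():
          for sample in range(5):   # e.g., readings arriving from an instrument
              yield sample

      def transform(stream):
          for x in stream:
              yield x * x           # emit each result as soon as it is ready

      def sink(stream):
          for x in stream:
              print("received", x)

      sink(transform(source()))     # items flow through the chain one at a time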

  15. A history-tracing XML-based provenance framework for workflows

    NARCIS (Netherlands)

    Gerhards, M; Belloum, A.; Berretz, F.; Sander, V.; Skorupa, S.

    2010-01-01

    The importance of validating and reproducing the outcome of computational processes is fundamental to many application domains. Assuring the provenance of workflows will likely become even more important with respect to the incorporation of human tasks into standard workflows by emerging standards

  16. Soundness of Timed-Arc Workflow Nets in Discrete and Continuous-Time Semantics

    DEFF Research Database (Denmark)

    Mateo, Jose Antonio; Srba, Jiri; Sørensen, Mathias Grund

    2015-01-01

    Analysis of workflow processes with quantitative aspects like timing is of interest in numerous time-critical applications. We suggest a workflow model based on timed-arc Petri nets and study the foundational problems of soundness and strong (time-bounded) soundness. We first consider the discrete-t...

  17. CrossFlow: Cross-Organizational Workflow Management in Dynamic Virtual Enterprises

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Aberer, Karl; Hoffner, Yigal; Ludwig, Heiko

    This paper gives a detailed overview of the approach to cross-organizational workflow management developed in the CrossFlow project. CrossFlow is a European research project aiming at the support of cross-organizational workflows in dynamic virtual enterprises. The cooperation in these virtual

  18. CrossFlow: Cross-Organizational Workflow Management for Service Outsourcing in Dynamic Virtual Enterprises

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Aberer, Karl; Ludwig, Heiko; Hoffner, Yigal

    2001-01-01

    In this report, we present the approach to cross-organizational workflow management of the CrossFlow project. CrossFlow is a European research project aiming at the support of cross-organizational workflows in dynamic virtual enterprises. The cooperation in these virtual enterprises is based on

  19. CrossFlow: Cross-Organizational Workflow Management in Dynamic Virtual Enterprises

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Aberer, Karl; Hoffner, Yigal; Ludwig, Heiko

    In this report, we present the approach to cross-organizational workflow management of the CrossFlow project. CrossFlow is a European research project aiming at the support of cross-organizational workflows in dynamic virtual enterprises. The cooperation in these virtual enterprises is based on

  20. Integrated Automatic Workflow for Phylogenetic Tree Analysis Using Public Access and Local Web Services.

    Science.gov (United States)

    Damkliang, Kasikrit; Tandayya, Pichaya; Sangket, Unitsa; Pasomsub, Ekawat

    2016-11-28

    At present, coding sequences (CDS) are being discovered, and ever larger CDS datasets are being released frequently. Approaches and related tools have also been developed and upgraded concurrently, especially for phylogenetic tree analysis. This paper proposes an integrated automatic Taverna workflow for phylogenetic tree inference analysis using public-access web services at the European Bioinformatics Institute (EMBL-EBI) and the Swiss Institute of Bioinformatics (SIB), and our own deployed local web services. The workflow input is a set of CDS in the Fasta format. The workflow supports 1,000 to 20,000 bootstrap replicates. The workflow performs tree inference using the Parsimony (PARS), Distance Matrix - Neighbor Joining (DIST-NJ), and Maximum Likelihood (ML) algorithms of the EMBOSS PHYLIPNEW package, based on our proposed Multiple Sequence Alignment (MSA) similarity score. The local web services are implemented and deployed in two ways, using Soaplab2 and Apache Axis2 deployment. SOAP and Java Web Service (JWS) WSDL endpoints are provided to Taverna Workbench, a workflow manager. The workflow has been validated, its performance has been measured, and its results have been verified. The workflow's execution time is less than ten minutes for inferring a tree with 10,000 bootstrap replicates. This paper proposes a new integrated automatic workflow which will be beneficial to bioinformaticians with an intermediate level of knowledge and experience. All local services have been deployed at our portal http://bioservices.sci.psu.ac.th.
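
    As a local, self-contained analogue of the workflow's DIST-NJ step (using Biopython instead of the EMBOSS PHYLIPNEW web services; the input file name is hypothetical and must contain pre-aligned CDS):

      from Bio import AlignIO, Phylo
      from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

      alignment = AlignIO.read("aligned_cds.fasta", "fasta")   # pre-aligned CDS set
      calculator = DistanceCalculator("identity")              # pairwise distance matrix
      constructor = DistanceTreeConstructor(calculator, "nj")  # Neighbor Joining
      tree = constructor.build_tree(alignment)
      Phylo.draw_ascii(tree)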