WorldWideScience

Sample records for automatic workflow composition

  1. Magallanes: a web services discovery and automatic workflow composition tool

    Directory of Open Access Journals (Sweden)

    Trelles Oswaldo

    2009-10-01

    Full Text Available Abstract Background To aid in bioinformatics data processing and analysis, an increasing number of web-based applications are being deployed. Although this is a positive circumstance in general, the proliferation of tools makes it difficult to find the right tool, or more importantly, the right set of tools that can work together to solve real complex problems. Results Magallanes (Magellan) is a versatile, platform-independent Java library of algorithms aimed at discovering bioinformatics web services and associated data types. A second important feature of Magallanes is its ability to connect available and compatible web services into workflows that can process data sequentially to reach a desired output given a particular input. Magallanes' capabilities can be exploited both through an API and directly through a graphical user interface. The Magallanes API is freely available for academic use, and together with the Magallanes application has been tested on MS-Windows™ XP and Unix-like operating systems. Detailed implementation information, including user manuals and tutorials, is available at http://www.bitlab-es.com/magallanes. Conclusion Different implementations of the same client (web page, desktop applications, web services, etc.) have been deployed and are currently in use in real installations such as the National Institute of Bioinformatics (Spain) and the ACGT-EU project. This shows the potential utility and versatility of the software library, including the integration of novel tools in the domain, and provides strong evidence of its ability to facilitate the automatic discovery and composition of workflows.
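
    To make the chaining idea concrete, here is a minimal Python sketch in the spirit of Magallanes: each service is treated as an edge from its input data type to its output data type, and a breadth-first search finds the shortest compatible chain from the user's input to the desired output. The service names and data types are hypothetical examples, not Magallanes' actual registry.

        from collections import deque

        # Hypothetical service registry: name -> (input type, output type)
        SERVICES = {
            "blast_search":   ("dna_sequence", "blast_report"),
            "parse_report":   ("blast_report", "sequence_ids"),
            "fetch_proteins": ("sequence_ids", "protein_set"),
            "align_proteins": ("protein_set", "alignment"),
        }

        def compose(input_type, output_type):
            """Breadth-first search for the shortest compatible service chain."""
            queue, seen = deque([(input_type, [])]), {input_type}
            while queue:
                current, chain = queue.popleft()
                if current == output_type:
                    return chain
                for name, (inp, out) in SERVICES.items():
                    if inp == current and out not in seen:
                        seen.add(out)
                        queue.append((out, chain + [name]))
            return None  # no compatible chain exists

        print(compose("dna_sequence", "alignment"))
        # -> ['blast_search', 'parse_report', 'fetch_proteins', 'align_proteins']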

  2. Provenance-Powered Automatic Workflow Generation and Composition

    Science.gov (United States)

    Zhang, J.; Lee, S.; Pan, L.; Lee, T. J.

    2015-12-01

    In recent years, scientists have learned how to codify tools into reusable software modules that can be chained into multi-step executable workflows. Existing scientific workflow tools, created by computer scientists, require domain scientists to meticulously design their multi-step experiments before analyzing data. However, this is oftentimes contradictory to a domain scientist's daily routine of conducting research and exploration. We hope to resolve this dispute. Imagine this: An Earth scientist starts her day applying NASA Jet Propulsion Laboratory (JPL) published climate data processing algorithms over ARGO deep ocean temperature and AMSRE sea surface temperature datasets. Throughout the day, she tunes the algorithm parameters to study various aspects of the data. Suddenly, she notices some interesting results. She then turns to a computer scientist and asks, "can you reproduce my results?" By tracking and reverse engineering her activities, the computer scientist creates a workflow. The Earth scientist can now rerun the workflow to validate her findings, modify the workflow to discover further variations, or publish the workflow to share the knowledge. In this way, we aim to revolutionize computer-supported Earth science. We have developed a prototyping system to realize the aforementioned vision, in the context of service-oriented science. We have studied how Earth scientists conduct service-oriented data analytics research in their daily work, developed a provenance model to record their activities, and developed a technology to automatically generate workflows from user behavior, supporting the adaptation and reuse of these workflows for replicating and improving scientific studies. A data-centric repository infrastructure is established to capture richer provenance and further facilitate collaboration in the science community. We have also established a Petri-net-based verification instrument for provenance-based automatic workflow generation and recommendation.

  3. Context-aware Workflow Model for Supporting Composite Workflows

    Institute of Scientific and Technical Information of China (English)

    Jong-sun CHOI; Jae-young CHOI; Yong-yun CHO

    2010-01-01

    In recent years, several researchers have applied workflow technologies for service automation in ubiquitous computing environments. However, most context-aware workflows do not offer a method to compose several workflows into a larger-scale or more complicated workflow; they provide only a simple workflow model, not a composite workflow model. In this paper, the authors propose a context-aware workflow model to support composite workflows by expanding the patterns of existing context-aware workflows, which support only the basic workflow patterns. The suggested workflow model offers composite workflow patterns for a context-aware workflow consisting of various flow patterns, such as simple, split, and parallel flows, and subflows. With the suggested model, existing workflows can easily be reused to make a new workflow. As a result, it can save the development effort and time of context-aware workflows and increase workflow reusability. Therefore, the suggested model is expected to make it easy to develop applications related to context-aware workflow services in ubiquitous computing environments.

  4. Constraint-Guided Workflow Composition Based on the EDAM Ontology

    CERN Document Server

    Lamprecht, Anna-Lena; Steffen, Bernhard; Margaria, Tiziana

    2010-01-01

    Methods for the automatic composition of services into executable workflows need detailed knowledge about the application domain, in particular about the available services and their behavior in terms of input/output data descriptions. In this paper we discuss how the EMBRACE data and methods ontology (EDAM) can be used as background knowledge for the composition of bioinformatics workflows. We show by means of a small example domain that the EDAM knowledge facilitates finding possible workflows, but that additional knowledge is required to guide the search towards actually adequate solutions. We illustrate how the ability to flexibly formulate domain-specific and problem-specific constraints supports the workflow development process.
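
    The constraint-guided idea can be sketched in a few lines of Python: candidate workflows found by an input/output-driven search are filtered by domain- and problem-specific predicates until only adequate solutions remain. The candidate chains and constraints below are invented for illustration and are not drawn from EDAM itself.

        # Candidate workflows, e.g. produced by a type-driven search.
        CANDIDATES = [
            ["fetch_sequences", "align_clustal", "render_tree"],
            ["fetch_sequences", "align_muscle", "render_tree"],
            ["fetch_sequences", "render_tree"],
        ]

        # Domain- and problem-specific constraints as predicates over workflows.
        constraints = [
            lambda wf: any(s.startswith("align_") for s in wf),  # must align
            lambda wf: "align_clustal" not in wf,                # tool excluded by user
        ]

        adequate = [wf for wf in CANDIDATES if all(c(wf) for c in constraints)]
        print(adequate)  # [['fetch_sequences', 'align_muscle', 'render_tree']]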

  5. Software workflow for the automatic tagging of medieval manuscript images (SWATI)

    Science.gov (United States)

    Chandna, Swati; Tonne, Danah; Jejkal, Thomas; Stotzka, Rainer; Krause, Celia; Vanscheidt, Philipp; Busch, Hannah; Prabhune, Ajinkya

    2015-01-01

    Digital methods, tools and algorithms are gaining in importance for the analysis of digitized manuscript collections in the arts and humanities. One example is the BMBF-funded research project "eCodicology", which aims to design, evaluate and optimize algorithms for the automatic identification of macro- and micro-structural layout features of medieval manuscripts. The main goal of this research project is to provide better insights into high-dimensional datasets of medieval manuscripts for humanities scholars. The heterogeneous nature and size of the humanities data and the need to create a database of automatically extracted reproducible features for better statistical and visual analysis are the main challenges in designing a workflow for the arts and humanities. This paper presents a concept of a workflow for the automatic tagging of medieval manuscripts. As a starting point, the workflow uses medieval manuscripts digitized within the scope of the project "Virtual Scriptorium St. Matthias". Firstly, these digitized manuscripts are ingested into a data repository. Secondly, specific algorithms are adapted or designed for the identification of macro- and micro-structural layout elements like page size, writing space, number of lines, etc. And lastly, a statistical analysis and scientific evaluation of the manuscript groups are performed. The workflow is designed generically to process large amounts of data automatically with any desired algorithm for feature extraction. As a result, a database of objectified and reproducible features is created which helps to analyze and visualize hidden relationships of around 170,000 pages. The workflow shows the potential of automatic image analysis by enabling the processing of a single page in less than a minute. Furthermore, the accuracy tests of the workflow on a small set of manuscripts with respect to features like page size and text areas show that automatic and manual analysis are comparable. The usage of a computer
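
    The macro-structural feature extraction described above can be pictured with a short Python sketch: given a binarized page image, the writing space is the bounding box of ink pixels and text lines show up as runs in the row-density profile. The arrays and the density threshold are illustrative assumptions, not the project's actual algorithms.

        import numpy as np

        def layout_features(ink):  # ink: 2D boolean array, True = dark pixel
            rows = np.where(ink.any(axis=1))[0]
            cols = np.where(ink.any(axis=0))[0]
            top, bottom = rows.min(), rows.max()
            left, right = cols.min(), cols.max()
            # count text lines as rising edges in the row ink-density profile
            profile = ink[:, left:right + 1].mean(axis=1) > 0.05
            n_lines = int(np.diff(profile.astype(int)).clip(min=0).sum())
            return {"page_size": ink.shape,
                    "writing_space": (bottom - top + 1, right - left + 1),
                    "line_count": n_lines}

        page = np.zeros((120, 80), dtype=bool)
        page[20:22, 10:70] = page[30:32, 10:70] = True  # two synthetic text lines
        print(layout_features(page))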

  6. Semantics and planning based workflow composition and execution for video processing

    OpenAIRE

    Nadarajan, Gayathri

    2011-01-01

    Traditional workflow systems have several drawbacks, e.g. in their inability to react rapidly to changes, to construct workflows automatically (or with user involvement) and to improve performance autonomously (or with user involvement) in an incremental manner according to specified goals. Overcoming these limitations would be highly beneficial for complex domains where such adversities are exhibited. Video processing is one such domain that increasingly requires attention as...

  7. An automatic protocol composition checker

    OpenAIRE

    Kojovic, Ivana

    2012-01-01

    Formal analysis is widely used to prove security properties of protocols. There are tools to check protocols in isolation, but in practice we use many protocols in parallel or even vertically stacked, e.g. running an application protocol (like login) over a secure channel (like TLS), and in general it is unclear whether that is safe. There are several works that give sufficient conditions for parallel and vertical composition, but there exists no program to check whether these conditions are actual...

  8. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML

    OpenAIRE

    Rhodri Cusack; Alejandro Vicente-Grabovetsky; Daniel J Mitchell; Peelle, Jonathan E.

    2015-01-01

    Recent years have seen neuroimaging data sets becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complicated to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by m...

  9. Workflow-centred evaluation of an automatic lesion tracking software for chemotherapy monitoring by CT

    International Nuclear Information System (INIS)

    In chemotherapy monitoring, an estimation of the change in tumour size is an important criterion for the assessment of treatment success. This requires a comparison between corresponding lesions in the baseline and follow-up computed tomography (CT) examinations. We evaluate the clinical benefits of an automatic lesion tracking tool that identifies the target lesions in the follow-up CT study and pre-computes the lesion volumes. Four radiologists performed volumetric follow-up examinations for 52 patients with and without lesion tracking. In total, 139 lung nodules, liver metastases and lymph nodes were given as target lesions. We measured reading time, inter-reader variability in lesion identification and volume measurements, and the amount of manual adjustments of the segmentation results. With lesion tracking, target lesion assessment time decreased by 38 % or 22 s per lesion. Relative volume difference between readers was reduced from 0.171 to 0.1. Segmentation quality was comparable with and without lesion tracking. Our automatic lesion tracking tool can make interpretation of follow-up CT examinations quicker and provide results that are less reader-dependent. (orig.)

  10. A method for automatic matching of multi-timepoint findings for enhanced clinical workflow

    Science.gov (United States)

    Raghupathi, Laks; Dinesh, MS; Devarakota, Pandu R.; Valadez, Gerardo Hermosillo; Wolf, Matthias

    2013-03-01

    Non-interventional diagnostics (CT or MR) enables early identification of diseases like cancer. Often, an assessment of lesion growth during follow-up is used to distinguish between benign and malignant lesions. Thus, correspondences need to be found for lesions localized at each time point. Manually matching the radiological findings can be time consuming as well as tedious due to possible differences in orientation and position between scans. Also, the complicated nature of the disease makes physicians rely on multiple modalities (PET-CT, PET-MR), where matching is even more challenging. Here, we propose an automatic feature-based matching method that is robust to changes in organ volume and to subpar or missing registration, and that requires very little computation. Traditional matching methods rely mostly on accurate image registration and apply the resulting deformation map to the finding coordinates. This has disadvantages when accurate registration is time-consuming or impossible due to vast organ volume differences between scans. Our novel method instead applies supervised learning, taking advantage of the underlying CAD features that are already present and treating the matching as a classification problem. In addition, the matching can be done extremely fast and at reasonable accuracy even when the image registration fails for some reason. Experimental results on real-world multi-time-point thoracic CT data showed an accuracy above 90% with negligible false positives on a variety of registration scenarios.
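
    A rough Python sketch of this classification view of matching: build pairwise feature-difference vectors between baseline and follow-up findings, score them with a supervised classifier, and keep the highest-scoring one-to-one assignments greedily. The features, labels and classifier choice are synthetic stand-ins for the CAD features the paper relies on.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        # pairwise features: |size diff|, |density diff|, spatial distance
        X_train = rng.random((200, 3))
        y_train = (X_train.sum(axis=1) < 0.8).astype(int)  # toy "same lesion" label
        clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

        def match(baseline, followup):
            """Greedy one-to-one matching by descending classifier score."""
            pairs = [(i, j, np.abs(b - f)) for i, b in enumerate(baseline)
                                           for j, f in enumerate(followup)]
            scores = clf.predict_proba(np.array([p[2] for p in pairs]))[:, 1]
            matches, used = {}, set()
            for (i, j, _), s in sorted(zip(pairs, scores), key=lambda t: -t[1]):
                if i not in matches and j not in used:
                    matches[i] = (j, float(s))
                    used.add(j)
            return matches

        print(match(rng.random((3, 3)), rng.random((3, 3))))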

  11. Characterizing chaotic melodies in automatic music composition

    Science.gov (United States)

    Coca, Andrés E.; Tost, Gerard O.; Zhao, Liang

    2010-09-01

    In this paper, we initially present an algorithm for automatic composition of melodies using chaotic dynamical systems. Afterward, we characterize chaotic music in a comprehensive way as comprising three perspectives: musical discrimination, dynamical influence on musical features, and musical perception. With respect to the first perspective, the coherence between generated chaotic melodies (continuous as well as discrete chaotic melodies) and a set of classical reference melodies is characterized by statistical descriptors and melodic measures. The significant differences among the three types of melodies are determined by discriminant analysis. Regarding the second perspective, the influence of dynamical features of chaotic attractors, e.g., Lyapunov exponent, Hurst coefficient, and correlation dimension, on melodic features is determined by canonical correlation analysis. The last perspective is related to perception of originality, complexity, and degree of melodiousness (Euler's gradus suavitatis) of chaotic and classical melodies by nonparametric statistical tests.
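
    As a toy illustration of melody generation from a chaotic dynamical system, the Python sketch below iterates the logistic map in its chaotic regime and quantizes each state onto a diatonic scale. The mapping to notes is a minimal assumption of mine, not the authors' algorithm.

        C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]  # one octave, MIDI numbers

        def chaotic_melody(length=16, x=0.3, r=3.99):
            """Logistic map x -> r*x*(1-x); r = 3.99 lies in the chaotic regime."""
            notes = []
            for _ in range(length):
                x = r * x * (1 - x)
                notes.append(C_MAJOR[int(x * len(C_MAJOR))])  # quantize state to scale
            return notes

        print(chaotic_melody())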

  12. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML

    Directory of Open Access Journals (Sweden)

    Rhodri eCusack

    2015-01-01

    Full Text Available Recent years have seen neuroimaging data becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complex to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast and efficient, for simple single subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address.

  13. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML.

    Science.gov (United States)

    Cusack, Rhodri; Vicente-Grabovetsky, Alejandro; Mitchell, Daniel J; Wild, Conor J; Auer, Tibor; Linke, Annika C; Peelle, Jonathan E

    2014-01-01

    Recent years have seen neuroimaging data sets becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complicated to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast, and efficient, for simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address. PMID:25642185
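
    The dependency-tracking behaviour described here can be sketched with a topological ordering over module dependencies, running only stages that are not already done. The module names are hypothetical and aa itself is Matlab-based; this Python fragment only illustrates the engine's idea.

        from graphlib import TopologicalSorter  # Python 3.9+

        MODULES = {  # module -> upstream dependencies
            "realign": set(),
            "coregister": {"realign"},
            "normalise": {"coregister"},
            "smooth": {"normalise"},
            "firstlevel_stats": {"smooth"},
        }
        done = {"realign", "coregister"}  # e.g. restored from a previous run

        for module in TopologicalSorter(MODULES).static_order():
            if module in done:
                print(f"skip {module} (already completed)")
            else:
                print(f"run  {module}")
                done.add(module)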

  14. A Fluent Calculus Approach to Automatic Web Service Composition

    OpenAIRE

    CHIFU, V.; SALOMIE, I.

    2009-01-01

    Web service composition is mandatory when complex functional requirements cannot be satisfied by a single Web service. Because of the exponential growth of available Web services, their automatic discovery and composition are highly desirable tasks. This paper presents a new approach for automatic Web service composition based on the formalism of Fluent Calculus using semantic service descriptions. In our approach, the Web service composition process is viewed as an AI planning problem in the...

  15. Automatic composition of MRI and SPECT images

    International Nuclear Information System (INIS)

    A new method to automatically compose MRI and SPECT images was devised to compensate for the limited morphological information of SPECT images. The method is a coordinate transformation that maximizes the agreement between the images, using the cross correlation of the MRI and SPECT images as the evaluation function for the degree of agreement. For the calculation of the cross correlation, the MRI T1-weighted image and the morphological information of the SPECT image processed by spatial quadratic differentiation (Laplacian) were used. The method does not require fixing control points during tomographic imaging, and it can also be applied to PET as well as SPECT. It is also useful for following chronological changes in a patient by composing SPECT images with each other, or PET images with each other. Since the method focuses on the internal structure of the brain, it is also useful for cases such as cerebral infarction in which the brain structure changes little. However, the method is still at a trial stage, and an examination of its accuracy remains to be done. (K.H.)
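
    A minimal sketch of the registration idea in Python: Laplacian-filter both slices, then search integer translations for the one that maximizes their cross correlation. Restricting the search to brute-force translations is a simplifying assumption; the method's coordinate transformation is more general.

        import numpy as np
        from scipy.ndimage import laplace, shift as nd_shift

        def align(mri, spect, max_shift=5):
            a, b = laplace(mri.astype(float)), laplace(spect.astype(float))
            best, best_score = (0, 0), -np.inf
            for dy in range(-max_shift, max_shift + 1):
                for dx in range(-max_shift, max_shift + 1):
                    score = float((a * nd_shift(b, (dy, dx), order=0)).sum())
                    if score > best_score:  # cross correlation as agreement measure
                        best, best_score = (dy, dx), score
            return best

        mri = np.random.rand(64, 64)
        spect = np.roll(mri, (2, -1), axis=(0, 1))  # synthetic misaligned pair
        print(align(mri, spect))  # expected to recover roughly (-2, 1)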

  16. Assessing the Performance of Automatic Speech Recognition Systems When Used by Native and Non-Native Speakers of Three Major Languages in Dictation Workflows

    DEFF Research Database (Denmark)

    Zapata, Julián; Kirkedal, Andreas Søeborg

    In this paper, we report on a two-part experiment aiming to assess and compare the performance of two types of automatic speech recognition (ASR) systems on two different computational platforms when used to augment dictation workflows. The experiment was performed with a sample of speakers of...... three major languages and with different linguistic profiles: non-native English speakers; non-native French speakers; and native Spanish speakers. The main objective of this experiment is to examine ASR performance in translation dictation (TD) and medical dictation (MD) workflows without manual...... transcription vs. with transcription. We discuss the advantages and drawbacks of a particular ASR approach in different computational platforms when used by various speakers of a given language, who may have different accents and levels of proficiency in that language, and who may have different levels of...

  17. QoS measurement of workflow-based web service compositions using Colored Petri net.

    Science.gov (United States)

    Nematzadeh, Hossein; Motameni, Homayun; Mohamad, Radziah; Nematzadeh, Zahra

    2014-01-01

    Workflow-based web service compositions (WB-WSCs) are one of the main composition categories in service oriented architecture (SOA). Eflow, the polymorphic process model (PPM), and the business process execution language (BPEL) are the main techniques in the WB-WSC category. With the maturity of web services, measuring the quality of composite web services developed with different techniques has become one of the most important challenges in today's web environments. Businesses should try to provide good quality, with respect to customers' requirements, for a composed web service. Thus, quality of service (QoS), which refers to nonfunctional parameters, is important to measure so that the quality degree of a given web service composition can be determined. This paper tries to find a deterministic analytical method for dependability and performance measurement using Colored Petri nets (CPN) with explicit routing constructs and the application of probability theory. A computer tool called WSET was also developed for modeling and supporting QoS measurement through simulation. PMID:25110748
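
    The analytical flavour of such QoS measurement can be sketched in Python with simple aggregation rules: response times add along a sequence, parallel branches take the maximum, probabilistic choice takes the expectation, and reliabilities multiply. These rules and numbers are a common textbook simplification, not the paper's CPN model.

        from math import prod

        def sequence(*parts):
            return {"time": sum(p["time"] for p in parts),
                    "reliability": prod(p["reliability"] for p in parts)}

        def parallel(*parts):
            return {"time": max(p["time"] for p in parts),
                    "reliability": prod(p["reliability"] for p in parts)}

        def choice(branches):  # branches: [(probability, part), ...]
            return {"time": sum(p * b["time"] for p, b in branches),
                    "reliability": sum(p * b["reliability"] for p, b in branches)}

        a = {"time": 120, "reliability": 0.99}
        b = {"time": 200, "reliability": 0.95}
        c = {"time": 80, "reliability": 0.999}
        print(sequence(a, parallel(b, c)))
        # {'time': 320, 'reliability': 0.9395595...}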

  18. QoS Measurement of Workflow-Based Web Service Compositions Using Colored Petri Net

    Directory of Open Access Journals (Sweden)

    Hossein Nematzadeh

    2014-01-01

    Full Text Available Workflow-based web service compositions (WB-WSCs) are one of the main composition categories in service oriented architecture (SOA). Eflow, the polymorphic process model (PPM), and the business process execution language (BPEL) are the main techniques in the WB-WSC category. With the maturity of web services, measuring the quality of composite web services developed with different techniques has become one of the most important challenges in today's web environments. Businesses should try to provide good quality, with respect to customers' requirements, for a composed web service. Thus, quality of service (QoS), which refers to nonfunctional parameters, is important to measure so that the quality degree of a given web service composition can be determined. This paper tries to find a deterministic analytical method for dependability and performance measurement using Colored Petri nets (CPN) with explicit routing constructs and the application of probability theory. A computer tool called WSET was also developed for modeling and supporting QoS measurement through simulation.

  19. Aspect-Oriented Workflow Languages

    OpenAIRE

    Charfi, Anis

    2007-01-01

    This thesis focuses on the modularity of workflow process specifications. In particular, it studies the expression support for crosscutting concerns and workflow changes in current workflow languages and workflow management systems. To illustrate the issues, two workflow languages are considered: a visual graph-based language and the Web Service composition language BPEL. This thesis starts by describing the implementation of several crosscutting concerns such as data collection for billing, ...

  20. Automatic page composition with combined image crop and layout metrics

    Science.gov (United States)

    Hunter, Andrew; Greig, Darryl

    2012-03-01

    Automatic layout algorithms simplify the composition of image-rich documents, but they still require users to have sufficient artistry to supply well cropped and composed imagery. Combining an automatic cropping technology with a document layout system enables better results to be produced faster by less-skilled users. This paper reviews prior work in automatic image cropping and automatic page layout and presents a case for a combined crop and layout technology. We describe one such technology in a system for interactive publication design by amateur self-publishers and show that providing an automatic cropping system with additional information about the layout context can enable it to generate a more appropriate set of ranked crop options for a given image. Furthermore, we show that providing an automatic layout system with sets of ranked crop options for images can enable it to compose more appropriate page layouts.
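
    The benefit of coupling cropping and layout can be sketched as joint optimization in Python: rather than committing to the top-ranked crop per image, enumerate each image's ranked crop options and score every combination together with a layout metric. All option names and scores below are made up for illustration.

        from itertools import product

        # per image: ranked crop options as (crop_id, aspect_ratio, crop_score)
        crops_per_image = [
            [("tight", 1.0, 0.9), ("wide", 1.5, 0.7)],
            [("tight", 0.8, 0.8), ("wide", 1.6, 0.6)],
        ]

        def layout_score(aspects):
            # toy metric: this layout prefers similar aspect ratios side by side
            return 1.0 - abs(aspects[0] - aspects[1]) / 2

        best = max(product(*crops_per_image),
                   key=lambda combo: sum(c[2] for c in combo)
                                     + layout_score([c[1] for c in combo]))
        print([c[0] for c in best])  # -> ['tight', 'tight']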

  1. Managing and Documenting Legacy Scientific Workflows.

    Science.gov (United States)

    Acuña, Ruben; Chomilier, Jacques; Lacroix, Zoé

    2015-01-01

    Scientific legacy workflows are often developed over many years, poorly documented and implemented with scripting languages. In the context of our cross-disciplinary projects we face the problem of maintaining such scientific workflows. This paper presents the Workflow Instrumentation for Structure Extraction (WISE) method used to process several ad-hoc legacy workflows written in Python and automatically produce their workflow structural skeleton. Unlike many existing methods, WISE does not assume input workflows to be preprocessed in a known workflow formalism. It is also able to identify and analyze calls to external tools. We present the method and report its results on several scientific workflows. PMID:26673793
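
    One way to picture skeleton extraction from a script-based legacy workflow is to walk the script's abstract syntax tree and record calls to external tools. Using Python's standard ast module and subprocess-style calls here is my own assumption for illustration, not necessarily how WISE itself is implemented.

        import ast
        import textwrap

        SCRIPT = textwrap.dedent("""
            import subprocess
            subprocess.run(["blastp", "-query", "in.fa"])
            subprocess.run(["muscle", "-in", "hits.fa"])
        """)

        calls = []
        for node in ast.walk(ast.parse(SCRIPT)):
            if (isinstance(node, ast.Call) and isinstance(node.func, ast.Attribute)
                    and node.func.attr == "run"):
                first = node.args[0]
                if isinstance(first, ast.List) and first.elts:
                    calls.append(ast.literal_eval(first.elts[0]))

        print(calls)  # -> ['blastp', 'muscle'], the workflow's external-tool steps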

  2. Automatic Music Composition using Answer Set Programming

    CERN Document Server

    Boenn, Georg; De Vos, Marina; ffitch, John

    2010-01-01

    Music composition used to be a pen-and-paper activity. These days music is often composed with the aid of computer software, even to the point where the computer composes parts of the score autonomously. The composition of most styles of music is governed by rules. We show that by approaching the automation, analysis and verification of composition as a knowledge representation task and formalising these rules in a suitable logical language, powerful and expressive intelligent composition tools can be easily built. This application paper describes the use of answer set programming to construct an automated system, named ANTON, that can compose melodic, harmonic and rhythmic music, diagnose errors in human compositions and serve as a computer-aided composition tool. The combination of harmonic, rhythmic and melodic composition in a single framework makes ANTON unique in the growing area of algorithmic composition. With near real-time composition, ANTON reaches the point where it can not only be used as a ...

  3. Structured Composition of Dataflow and Control-Flow for Reusable and Robust Scientific Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Bowers, S; Ludaescher, B; Ngu, A; Critchlow, T

    2005-09-07

    Data-centric scientific workflows are often modeled as dataflow process networks. The simplicity of the dataflow framework facilitates workflow design, analysis, and optimization. However, some workflow tasks are particularly "control-flow intensive", e.g., procedures to make workflows more fault-tolerant and adaptive in an unreliable, distributed computing environment. Modeling complex control-flow directly within a dataflow framework often leads to overly complicated workflows that are hard to comprehend, reuse, schedule, and maintain. In this paper, we develop a framework that allows a structured embedding of control-flow intensive subtasks within dataflow process networks. In this way, we can seamlessly handle complex control-flows without sacrificing the benefits of dataflow. We build upon a flexible actor-oriented modeling and design approach and extend it with (actor) frames and (workflow) templates. A frame is a placeholder for an (existing or planned) collection of components with similar function and signature. A template partially specifies the behavior of a subworkflow by leaving "holes" (i.e., frames) in the subworkflow definition. Taken together, these abstraction mechanisms facilitate the separation and structured re-combination of control-flow and dataflow in scientific workflow applications. We illustrate our approach with a real-world scientific workflow from the astrophysics domain. This data-intensive workflow requires remote execution and file transfer in a semi-reliable environment. For such workflows, we propose a 3-layered architecture: The top level, typically a dataflow process network, includes Generic Data Transfer (GDT) frames and Generic remote eXecution (GX) frames. At the second level, the user can specialize the behavior of these generic components by embedding a suitable template (here: transducer templates for control-flow intensive tasks). At the third level, frames inside the
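
    The frame and template abstractions can be sketched as plain Python classes: a frame is a named placeholder for a family of components with similar function, and a template is a subworkflow with frame-shaped holes filled at specialization time. The class and component names are mine, not the paper's Kepler-based implementation.

        class Frame:
            """Placeholder for a family of components with similar signature."""
            def __init__(self, role):
                self.role = role

        class Template:
            """Partially specified subworkflow: ordered steps, some left as holes."""
            def __init__(self, steps):
                self.steps = steps

            def instantiate(self, bindings):
                return [bindings[s.role] if isinstance(s, Frame) else s
                        for s in self.steps]

        # a transfer subworkflow with a Generic Data Transfer (GDT) hole
        transfer = Template(["open_session", Frame("GDT"), "verify", "close_session"])
        print(transfer.instantiate({"GDT": "gridftp_transfer"}))
        # -> ['open_session', 'gridftp_transfer', 'verify', 'close_session']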

  4. Ontology-Based Workflow Validation

    OpenAIRE

    Pham, Tuan Anh; Nguyen, Thi Hoa Hue; Le Thanh, Nhan

    2015-01-01

    To ensure that a workflow is executed correctly, many approaches have been introduced. However, few of them consider the semantic correctness of the workflow at design time and run time. In this paper, a solution to check the semantic correctness of a workflow automatically is presented. To do that, the workflow must be represented in a machine-understandable form, so an ontology-based approach to representing workflows is proposed. In addition, we also provide a set of changed operati...

  5. Concentrate composition for Automatic Milking Systems - Effect on milking frequency

    DEFF Research Database (Denmark)

    Madsen, J; Weisbjerg, Martin Riis; Hvelplund, Torben

    2010-01-01

    The purpose of this study was to investigate the potential of affecting milking frequency in an Automatic Milking System (AMS) by changing ingredient composition of the concentrate fed in the AMS. In six experiments, six experimental concentrates were tested against a Standard concentrate all...... the Standard concentrate. A marked effect was found on the number of visits of the cows in the AMS and the subsequent milk production in relation to composition of the concentrate. The composition of the concentrates also influenced the composition of the milk and the MR intake. Based on the overall...

  6. Distributed Behavioural Adaptation for the Automatic Composition of Semantic Services

    OpenAIRE

    Melliti, Tarek; Poizat, Pascal; Ben Mokhtar, Sonia

    2008-01-01

    Services are developed separately and without knowledge of all possible use contexts. They often mismatch or do not correspond exactly to the end-user needs, making direct composition without mediation impossible. In such a case, software adaptation can support composition by semi-automatically producing new software pieces called adaptors. Adaptation proposals have addressed the signature and behavioural service interface levels. Yet, taking also into account the se...

  7. Text-based LSTM networks for Automatic Music Composition

    OpenAIRE

    Choi, Keunwoo; Fazekas, George; Sandler, Mark

    2016-01-01

    In this paper, we introduce new methods and discuss results of text-based LSTM (Long Short-Term Memory) networks for automatic music composition. The proposed network is designed to learn relationships within text documents that represent chord progressions and drum tracks in two case studies. In the experiments, word-RNNs (Recurrent Neural Networks) show good results for both cases, while character-based RNNs (char-RNNs) only succeed in learning chord progressions. The proposed system can be us...
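
    A generic sketch of a character-level model over chord-symbol text, in the spirit of the char-RNN case study; the PyTorch architecture, hyperparameters and toy corpus are assumptions of mine, not the authors' exact setup.

        import torch
        import torch.nn as nn

        class CharLSTM(nn.Module):
            def __init__(self, vocab_size, embed=32, hidden=128):
                super().__init__()
                self.embed = nn.Embedding(vocab_size, embed)
                self.lstm = nn.LSTM(embed, hidden, batch_first=True)
                self.head = nn.Linear(hidden, vocab_size)

            def forward(self, x, state=None):
                out, state = self.lstm(self.embed(x), state)
                return self.head(out), state  # logits for the next character

        text = "C:maj G:maj A:min F:maj "  # toy chord-progression corpus
        vocab = sorted(set(text))
        idx = {ch: i for i, ch in enumerate(vocab)}
        data = torch.tensor([[idx[c] for c in text]])

        model = CharLSTM(len(vocab))
        logits, _ = model(data[:, :-1])  # predict each next character
        loss = nn.functional.cross_entropy(
            logits.reshape(-1, len(vocab)), data[:, 1:].reshape(-1))
        print(float(loss))  # one untrained forward/loss step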

  8. A prototype of workflow management system for construction design projects

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    A great deal of benefit can be achieved if information and processes are integrated within a building design project. This paper aims to establish a prototype workflow management system for construction design projects through the application of workflow technology. The composition and function of the prototype are presented to satisfy the needs of information sharing and process integration. By integrating all subsystems and modules of the prototype, the whole system can deal with design information-flow modeling, simulation and optimization, task planning and distribution, automatic tracking and monitoring, as well as network services. In this way, a collaborative design environment for building design projects is brought into being.

  9. Automatic Discovery of Non-Compositional Compounds in Parallel Data

    CERN Document Server

    Melamed, I D

    1997-01-01

    Automatic segmentation of text into minimal content-bearing units is an unsolved problem even for languages like English. Spaces between words offer an easy first approximation, but this approximation is not good enough for machine translation (MT), where many word sequences are not translated word-for-word. This paper presents an efficient automatic method for discovering sequences of words that are translated as a unit. The method proceeds by comparing pairs of statistical translation models induced from parallel texts in two languages. It can discover hundreds of non-compositional compounds on each iteration, and constructs longer compounds out of shorter ones. Objective evaluation on a simple machine translation task has shown the method's potential to improve the quality of MT output. The method makes few assumptions about the data, so it can be applied to parallel data other than parallel texts, such as word spellings and pronunciations.

  10. Workflow Management in CLARIN-DK

    DEFF Research Database (Denmark)

    Jongejan, Bart

    2013-01-01

    with the features that describe her goal, because the workflow manager not only executes chains of tools in a workflow, but also takes care of autonomously devising workflows that serve the user’s intention, given the tools that currently are integrated in the infrastructure as web services. To do this......, the workflow manager needs stringent and complete information about each integrated tool. We discuss how such information is structured in CLARIN-DK. Provided that many tools are made available to and through the CLARIN-DK infrastructure, the automatically created workflows, although simple linear programs...

  11. Data Exchange in Grid Workflow

    Institute of Scientific and Technical Information of China (English)

    ZENG Hongwei; MIAO Huaikou

    2006-01-01

    In existing web services-based workflows, data exchange across the web services is centralized: the workflow engine mediates at each step of the application sequence. However, many grid applications, especially data-intensive scientific applications, require exchanging large amounts of data across the grid services. Having a central workflow engine relay the data between the services would result in a bottleneck in these cases. This paper proposes a data exchange model for individual grid workflows and for multi-workflow compositions, respectively. The model enables direct communication of large amounts of data between two grid services. To enable data to be exchanged among multiple workflows, a bridge data service is used.

  12. Automatic page composition with nested sub-layouts

    Science.gov (United States)

    Hunter, Andrew

    2013-03-01

    This paper provides an overview of a system for the automatic composition of publications. The system first composes nested hierarchies of contents, then applies layout engines at branch points in the hierarchies to explore layout options, and finally selects the best overall options for the finished publications. Although the system has been developed as a general platform for automated publishing, this paper describes its application to the composition and layout of a magazine-like publication for social content from Facebook. The composition process works by assembling design fragments that have been populated with text and images from the Facebook social network. The fragments constitute a design language for a publication. Each design fragment is a nested mutable sub-layout that has no specific size or shape until after it has been laid out. The layout process balances the space requirements of the fragment's internal contents with its external context in the publication. The mutability of sub-layouts requires that their layout options must be kept open until all the other contents that share the same space have been considered. Coping with large numbers of options is one of the greatest challenges in layout automation. Most existing layout methods work by rapidly eliminating design options rather than by keeping options open. A further goal of this publishing system is to confirm that a custom publication can be generated quickly by the described methods. In general, the faster that publications can be created, the greater the opportunities for the technology.

  13. Linked-OWL: A new approach for dynamic linked data service workflow composition

    Directory of Open Access Journals (Sweden)

    Hussien Ahmad

    2013-06-01

    Full Text Available The shift from the Web of Documents to the Web of Data, based on the Linked Data principles defined by Tim Berners-Lee, posed a big challenge for building and developing applications that work in the Web of Data environment. There have been several attempts to build service and application models for the Linked Data Cloud. In this paper, we propose a new service model for linked data, "Linked-OWL", which is based on RESTful services and OWL-S and conforms to linked data principles. This new model shifts the service concept from functions to linked data things and opens the road for Linked Oriented Architecture (LOA) and a Web of Services as part of, and on top of, the Web of Data. The model also provides a high level of dynamic service composition capability for more accurate dynamic composition and execution of complex business processes in the Web of Data environment.

  14. Workflow Mining of More Perspectives of Workflow

    OpenAIRE

    Peng Liu; Bosheng Zhou

    2008-01-01

    The goal of workflow mining is to obtain objective and valuable information from event logs. The research of workflow mining is of great significance for deploying new business processes as well as analyzing and improving already deployed ones. Many information systems log event data about executed tasks. Workflow mining is concerned with the derivation of a graphical process model out of this data. Currently, workflow mining research is narrowly focused on the rediscovery of control flow m...

  15. Automated data reduction workflows for astronomy

    CERN Document Server

    Freudling, W; Bramich, D M; Ballester, P; Forchi, V; Garcia-Dablo, C E; Moehler, S; Neeser, M J

    2013-01-01

    Data from complex modern astronomical instruments often consist of a large number of different science and calibration files, and their reduction requires a variety of software tools. The execution chain of the tools represents a complex workflow that needs to be tuned and supervised, often by individual researchers who are not necessarily experts in any specific instrument. The efficiency of data reduction can be improved by using automatic workflows to organise data and execute the sequence of data reduction steps. To realize such efficiency gains, we designed a system that allows intuitive representation, execution and modification of the data reduction workflow, and has facilities for inspection and interaction with the data. The European Southern Observatory (ESO) has developed Reflex, an environment to automate data reduction workflows. Reflex is implemented as a package of customized components for the Kepler workflow engine. Kepler provides the graphical user interface to create an executable flowch...

  16. Lattice QCD workflows

    Energy Technology Data Exchange (ETDEWEB)

    Piccoli, Luciano; /Fermilab /IIT, Chicago; Kowalkowski, James B.; Simone, James N.; /Fermilab; Sun, Xian-He; Jin, Hui; /IIT, Chicago; Holmgren, Donald J.; Seenu, Nirmal; Singh, Amitoj G.; /Fermilab

    2008-12-01

    This paper discusses the application of existing workflow management systems to a real world science application (LQCD). Typical workflows and execution environment used in production are described. Requirements for the LQCD production system are discussed. The workflow management systems Askalon and Swift were tested by implementing the LQCD workflows and evaluated against the requirements. We report our findings and future work.

  17. A Novel Approach for Bioinformatics Workflow Discovery

    OpenAIRE

    Walaa Nagy; Hoda M.O. Mokhtar

    2014-01-01

    Workflow systems are a typical fit for the explorative research of bioinformaticians. These systems can help bioinformaticians to design and run their experiments and to automatically capture and store the data generated at runtime. On the other hand, Web services are increasingly used as the preferred method for accessing and processing the information coming from the diverse life science sources. In this work we provide an efficient approach for creating bioinformatic workflow for all-serv...

  18. Workflow automation architecture standard

    Energy Technology Data Exchange (ETDEWEB)

    Moshofsky, R.P.; Rohen, W.T. [Boeing Computer Services Co., Richland, WA (United States)

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  19. A Novel Approach for Bioinformatics Workflow Discovery

    Directory of Open Access Journals (Sweden)

    Walaa Nagy

    2014-11-01

    Full Text Available Workflow systems are a typical fit for the explorative research of bioinformaticians. These systems can help bioinformaticians to design and run their experiments and to automatically capture and store the data generated at runtime. On the other hand, Web services are increasingly used as the preferred method for accessing and processing the information coming from the diverse life science sources. In this work we provide an efficient approach for creating bioinformatics workflows for all-service architecture systems (i.e., systems in which all components are services). This architecture style simplifies the user interaction with workflow systems and facilitates both the replacement of individual components and the addition of new components to adapt to other workflow tasks if required. We finally present a case study for the bioinformatics domain to demonstrate the applicability of our proposed approach.

  20. Professional Windows Workflow Foundation

    CERN Document Server

    Kitta, Todd

    2007-01-01

    If you want to gain the skills to build Windows Workflow Foundation solutions, then this is the book for you. It provides you with a clear, practical guide on how to develop workflow-based software and integrate it into existing technology landscapes. Throughout the pages, you'll also find numerous real-world examples and sample code that will help you to get started quickly. Each major area of Windows Workflow Foundation is explored in depth along with some of the fundamental operations related to generic workflow applications. You'll also find detailed coverage on how to develop workflow in

  1. Optimize Internal Workflow Management

    Directory of Open Access Journals (Sweden)

    Lucia RUSU

    2010-01-01

    Full Text Available Workflow management has the role of creating and maintaining an efficient flow of information and tasks inside an organization. The major benefit of workflows is the solutions they offer to the growing needs of organizations. The external and internal processes associated with a business need to be carefully organized in order to provide a strong foundation for the daily work. This paper focuses on internal workflow within a company, attempts to provide some basic principles related to workflows, and presents a workflow solution for modeling and deployment using Visual Studio and SharePoint Server.

  2. AUTOMATIC WEB SERVICE SELECTION BY OPTIMIZING COST OF COMPOSITION IN SLAKY COMPOSER USING ASSIGNMENT MINIMIZATION APPROACH

    Directory of Open Access Journals (Sweden)

    P. Sandhya

    2012-12-01

    Full Text Available Web service composition is a means of building enterprises virtually by knitting relevant web services together on the fly. Automatic web service composition is done dynamically at runtime. Extensive research has been done in the field of automatic web service composition. However, these works focus on providing client-oriented results, and hence there is little industry adoption of composition technology. In this paper we propose a new service collaboration stack that composes with realistic business metrics of a provider in addition to client metrics. Some of the service provider metrics include time planning, profit management, native intelligence, user adoption, environment, market scenario, vision and industry adoption. In this paper we focus on enhancing industry adoption by optimizing the cost of service composition. We propose the SLAKY composer, which treats the assignment of appropriate services during composition as an assignment minimization problem in order to reduce the cost of composition. We also extend the OWL-S profile sub-ontology to include cost as a service parameter.
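
    Casting service selection as an assignment minimization problem can be sketched directly in Python with the Hungarian algorithm: choose one concrete provider per abstract task so that the total cost is minimal. The cost matrix below is invented for illustration; SLAKY's actual cost model is richer.

        import numpy as np
        from scipy.optimize import linear_sum_assignment

        # rows: abstract tasks in the composition; columns: candidate providers
        cost = np.array([[4.0, 2.5, 9.0],
                         [3.0, 7.0, 1.5],
                         [8.0, 4.0, 6.0]])

        tasks, providers = linear_sum_assignment(cost)  # minimizes total cost
        for t, p in zip(tasks, providers):
            print(f"task {t} -> provider {p} (cost {cost[t, p]})")
        print("total composition cost:", cost[tasks, providers].sum())  # 9.5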

  3. Agreement Workflow Tool (AWT)

    Data.gov (United States)

    Social Security Administration — The Agreement Workflow Tool (AWT) is a role-based Intranet application used for processing SSA's Reimbursable Agreements according to SSA's standards. AWT provides...

  4. Unidirectional high fiber content composites: Automatic 3D FE model generation and damage simulation

    DEFF Research Database (Denmark)

    Qing, Hai; Mishnaevsky, Leon

    2009-01-01

    A new method and a software code for the automatic generation of 3D micromechanical FE models of unidirectional long-fiber-reinforced composites (LFRC) with high fiber volume fraction and random fiber arrangement are presented. The fiber arrangement in the cross-section is generated through random... movements of fibers from their initial regular hexagonal arrangement. Damageable layers are introduced into the fibers to take into account the random distribution of the fiber strengths. A series of computational experiments on a glass fiber reinforced polymer epoxy matrix composite is performed to...
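
    The arrangement-generation step can be sketched in Python as overlap-checked random perturbation of an initial hexagonal packing. The radius, spacing and move size below are illustrative parameters, not the paper's values.

        import numpy as np

        rng = np.random.default_rng(1)
        r, step = 0.5, 1.6  # fiber radius and hexagonal spacing
        centers = np.array([(x * step + (y % 2) * step / 2, y * step * 0.866)
                            for y in range(6) for x in range(6)])

        def no_overlap(pts, i, candidate):
            others = np.delete(pts, i, axis=0)
            return (np.linalg.norm(others - candidate, axis=1) >= 2 * r).all()

        for _ in range(2000):  # random, overlap-checked fiber movements
            i = rng.integers(len(centers))
            candidate = centers[i] + rng.normal(0, 0.1, 2)
            if no_overlap(centers, i, candidate):
                centers[i] = candidate

        print(centers[:3])  # perturbed, non-overlapping fiber centers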

  5. Implementing Workflow Reconfiguration in WS-BPEL

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Dragoni, Nicola; Zhou, Mu

    This paper investigates the problem of dynamic reconfiguration by means of a workflow-based case study used for discussion. We state the requirements on a system implementing the workflow and its reconfiguration, and we describe the system’s design in BPMN. WS-BPEL, a language that would not...... naturally support dynamic change, is used as a target for implementation. The WS-BPEL recovery framework is here exploited to implement the reconfiguration using principles derived from previous research in process algebra and two mappings from BPMN to WS-BPEL are presented, one automatic and only mostly...

  6. Integrated workflows for spiking neuronal network simulations

    Directory of Open Access Journals (Sweden)

    Andrew P Davison

    2013-12-01

    Full Text Available The increasing availability of computational resources is enabling more detailed, realistic modelling in computational neuroscience, resulting in a shift towards more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeller's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modellers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualisation into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organised configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualisation stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modelling studies by relieving the user from manual handling of the flow of metadata between the individual

  7. Integrated workflows for spiking neuronal network simulations.

    Science.gov (United States)

    Antolík, Ján; Davison, Andrew P

    2013-01-01

    The increasing availability of computational resources is enabling more detailed, realistic modeling in computational neuroscience, resulting in a shift toward more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeler's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modelers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualization into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo, and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organized configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualization stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modeling studies by relieving the user from manual handling of the flow of metadata between the individual workflow stages. PMID

  8. From Workflow to Interworkflow

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Workflow management systems are being introduced in many organizations to automate the business process. The initial emphasis of introducing a workflow management system is on its application to the workflow in a given organization. The next step is to interconnect the workflow across organizations. We call this interworkflow, and the total support technologies necessary for its realization the interworkflow management mechanism. Interworkflow is expected to be a supporting mechanism for business-to-business electronic commerce. We have proposed this management mechanism and confirmed its realization with a prototype. At the same time, the interface and the protocol for interconnecting heterogeneous workflow management systems have been standardized by the WfMC. We are therefore advancing a project on the implementation of an interworkflow management system for practical use and its experimental proof.

  9. AN AI PLANNING APPROACH FOR GENERATING BIG DATA WORKFLOWS

    Directory of Open Access Journals (Sweden)

    Wesley Deneke

    2015-09-01

    Full Text Available The scale of big data causes the compositions of extract-transform-load (ETL) workflows to grow increasingly complex. With the turnaround time for delivering solutions becoming a greater emphasis, stakeholders cannot continue to afford to wait the hundreds of hours it takes for domain experts to manually compose a workflow solution. This paper describes a novel AI planning approach that facilitates rapid composition and maintenance of ETL workflows. The workflow engine is evaluated on real-world scenarios from an industrial partner and results gathered from a prototype are reported to demonstrate the validity of the approach.

  10. Covariance among milking frequency, milk yield, and milk composition from automatically milked cows

    DEFF Research Database (Denmark)

    Løvendahl, Peter; Chagunda, G G

    2011-01-01

    Automatic milking systems allow cows voluntary access to milking and concentrates within set limits. This leads to large variation in milking intervals, both within and between cows, which further affects yield per milking and composition of milk. This study aimed to describe the degree to which...... differences in milking interval were attributable to individual cows, and how this correlated to individual differences in yield and composition of milk throughout lactation. Data from 288,366 milkings from 664 cow-lactations were used, of which 229,020 milkings had milk composition results. Cows were...... variance was generally greatest in early lactation and declined thereafter. Accordingly, animal-related variance tended to increase with progression of lactation. Milking frequency (the reverse of milking interval) was found to be moderately repeatable throughout lactation. Daily milk yield expressed per...

  11. Integrating configuration workflows with project management system

    International Nuclear Information System (INIS)

    The complexity of the heterogeneous computing resources, services and recurring infrastructure changes at the GridKa WLCG Tier-1 computing center requires a structured approach to configuration management and to optimizing the interplay between functional components of the whole system. A set of tools deployed at GridKa, including Puppet, Redmine, Foreman, SVN and Icinga, provides the administrative environment that makes it possible to define and develop configuration workflows, reduce the administrative effort and improve sustainable operation of the whole computing center. In this presentation we discuss the configuration scenarios developed and implemented at GridKa, which we use for host installation, service deployment, change management procedures, service retirement, etc. The integration of Puppet with a project management tool like Redmine provides us with the opportunity to track problem issues, organize tasks and automate these workflows. The interaction between Puppet and Redmine results in automatic updates of the issues related to the executed workflow performed by different system components. The extensive configuration workflows require collaboration and interaction between different departments, such as network, security and production, at GridKa. Redmine plugins developed at GridKa and integrated into its administrative environment provide an effective way of collaboration within the GridKa team. We present a structural overview of the software components, their connections and communication protocols, and show a few working examples of the workflows and their automation.

  12. Deductive Synthesis of Workflows for E-Science

    OpenAIRE

    Yang, B.; Bundy, Alan; Smaill, A.; Dixon, L

    2005-01-01

    In this paper we show that the automated reasoning technique of deductive synthesis can be applied to address the problem of machine-assisted composition of e-Science workflows according to users' specifications. We encode formal specifications of e-Science data, services and workflows, constructed from their descriptions, in the generic theorem prover Isabelle. Workflows meeting this specification are then synthesised as a side-effect of proving that these specifications can be met.

  13. A framework for interoperability of BPEL-based workflows

    Institute of Scientific and Technical Information of China (English)

    Li Xitong; Fan Yushun; Huang Shuangxi

    2008-01-01

    With the prevalence of service-oriented architecture (SOA), web services have become the dominant technology for constructing workflow systems. As a workflow is the composition of a series of interrelated web services that realize its activities, the interoperability of workflows can be treated as the composition of web services. To address this, a framework for the interoperability of business process execution language (BPEL)-based workflows is presented, which operates in three phases: transformation, conformance testing and execution. The core components of the framework are proposed, with emphasis on how they promote interoperability. In particular, dynamic binding and re-composition of workflows based on web service testing are presented. An example of business-to-business (B2B) collaboration is provided to illustrate how composition and conformance testing are performed.

  14. Automated data reduction workflows for astronomy. The ESO Reflex environment

    Science.gov (United States)

    Freudling, W.; Romaniello, M.; Bramich, D. M.; Ballester, P.; Forchi, V.; García-Dabló, C. E.; Moehler, S.; Neeser, M. J.

    2013-11-01

    Context. Data from complex modern astronomical instruments often consist of a large number of different science and calibration files, and their reduction requires a variety of software tools. The execution chain of the tools represents a complex workflow that needs to be tuned and supervised, often by individual researchers who are not necessarily experts in any specific instrument. Aims: The efficiency of data reduction can be improved by using automatic workflows to organise data and execute a sequence of data reduction steps. To realize such efficiency gains, we designed a system that allows intuitive representation, execution and modification of the data reduction workflow, and has facilities for inspection and interaction with the data. Methods: The European Southern Observatory (ESO) has developed Reflex, an environment to automate data reduction workflows. Reflex is implemented as a package of customized components for the Kepler workflow engine. Kepler provides the graphical user interface to create an executable flowchart-like representation of the data reduction process. Key features of Reflex are a rule-based data organiser, infrastructure to re-use results, thorough book-keeping, data progeny tracking, interactive user interfaces, and a novel concept to exploit information created during data organisation for the workflow execution. Results: Automated workflows can greatly increase the efficiency of astronomical data reduction. In Reflex, workflows can be run non-interactively as a first step. Subsequent optimization can then be carried out while transparently re-using all unchanged intermediate products. We found that such workflows enable the reduction of complex data by non-expert users and minimize mistakes due to book-keeping errors. Conclusions: Reflex includes novel concepts to increase the efficiency of astronomical data processing. While Reflex is a specific implementation of astronomical scientific workflows within the Kepler workflow

  15. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present an algorithm for the translation of such models into Markov decision processes expressed in the syntax of the PRISM model checker. This enables analysis of business processes for the following properties: transient and steady-state probabilities, and the timing, occurrence and ordering of events. … The approach can be extended to allow for more complex annotations and ultimately to automatically synthesise workflows by composing predefined sub-processes, in order to achieve a configuration that is optimal for parameters of interest.
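
    As a toy illustration of the translation step (the paper's algorithm covers full BPMN constructs and richer annotations), a probabilistically annotated transition list can be emitted as PRISM mdp source; the states, labels and probabilities below are invented.

      # Toy emitter of PRISM 'mdp' source from a probabilistically annotated
      # process graph. States and probabilities are invented for illustration.
      TRANSITIONS = [  # (action, from_state, [(probability, to_state), ...])
          ("review",  0, [(0.9, 1), (0.1, 2)]),   # 10% chance of rework
          ("rework",  2, [(1.0, 0)]),
          ("archive", 1, [(1.0, 3)]),
      ]

      def to_prism(transitions, n_states):
          lines = ["mdp", "module process",
                   f"  s : [0..{n_states - 1}] init 0;"]
          for action, src, branches in transitions:
              rhs = " + ".join(f"{p}:(s'={t})" for p, t in branches)
              lines.append(f"  [{action}] s={src} -> {rhs};")
          lines.append("endmodule")
          return "\n".join(lines)

      print(to_prism(TRANSITIONS, 4))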

  16. A Model for Semi-Automatic Composition of Educational Content from Open Repositories of Learning Objects

    Directory of Open Access Journals (Sweden)

    Paula Andrea Rodríguez Marín

    2014-04-01

    Full Text Available Learning object (LO) repositories are important in building educational content and should allow search, retrieval and composition processes to be carried out successfully in order to reach educational goals. However, such processes are time-consuming and do not always provide the desired results. Thus, the aim of this paper is to propose a model for the semi-automatic composition of LOs, which are automatically retrieved from open repositories. For the development of the model, various text similarity measures are discussed, while for calibration and validation comparison experiments were performed against results obtained by teachers. Experimental results show that when using a value of k (the number of LOs selected) of at least 3, the percentage of agreement between the model's selections and those made by experts exceeds 75%. In conclusion, the proposed model allows teachers to save time and effort in LO selection by performing a pre-filtering process.
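
    A minimal sketch of the similarity-based pre-filtering such a model relies on is given below; bag-of-words cosine similarity and the example texts are our assumptions, not necessarily the measures the paper settled on (it reports agreement above 75% with experts for k >= 3).

      # Rank learning objects (LOs) by cosine similarity of bag-of-words
      # vectors against the educational goal, then keep the top k.
      import math
      from collections import Counter

      def cosine(a: str, b: str) -> float:
          va, vb = Counter(a.lower().split()), Counter(b.lower().split())
          dot = sum(va[w] * vb[w] for w in va)
          norm = math.sqrt(sum(c * c for c in va.values())) * \
                 math.sqrt(sum(c * c for c in vb.values()))
          return dot / norm if norm else 0.0

      def top_k(goal: str, candidates: list[str], k: int = 3) -> list[str]:
          return sorted(candidates, key=lambda lo: cosine(goal, lo),
                        reverse=True)[:k]

      los = ["intro to fractions with exercises",
             "fractions and decimals explained",
             "history of algebra"]
      print(top_k("teaching fractions", los))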

  17. Mining workflow processes from distributed workflow enactment event logs

    Directory of Open Access Journals (Sweden)

    Kwanghoon Pio Kim

    2012-12-01

    Full Text Available Workflow management systems help to execute, monitor and manage work process flow and execution. As they execute, these systems keep a record of who does what and when (e.g. a log of events). The activity of using computer software to examine these records and derive structural results from them is called workflow mining. In general, workflow mining needs to encompass behavioral (process/control-flow), social, informational (data-flow) and organizational perspectives, among others, because workflow systems are "people systems" that must be designed, deployed, and understood within their social and organizational contexts. This paper focuses on mining the behavioral aspect of workflows from XML-based workflow enactment event logs, which are distributed vertically (semantics-driven distribution) or horizontally (syntax-driven distribution) over networked workflow enactment components. That is, this paper proposes distributed workflow mining approaches that are able to rediscover ICN-based structured workflow process models by incrementally amalgamating a series of vertically or horizontally fragmented temporal workcases. Each approach consists of a temporal fragment discovery algorithm, which discovers a set of temporal fragment models from the fragmented workflow enactment event logs, and a workflow process mining algorithm, which rediscovers a structured workflow process model from the discovered temporal fragment models. Here, the temporal fragment model represents the concrete model of the XML-based distributed workflow fragment event logs.
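
    Setting the ICN-specific machinery aside, the control-flow core of such mining can be illustrated by extracting the directly-follows relation from merged log fragments; the events below are invented, and the paper's algorithms add fragment discovery and model rediscovery on top of this.

      # Amalgamate event-log fragments per workcase, order by timestamp,
      # and collect the directly-follows relation between activities.
      from collections import defaultdict

      fragments = [  # (workcase id, timestamp, activity) from distributed logs
          ("case1", 1, "receive"), ("case1", 3, "approve"),
          ("case1", 2, "check"),   ("case2", 1, "receive"),
          ("case2", 2, "check"),   ("case2", 3, "reject"),
      ]

      traces = defaultdict(list)
      for case, ts, act in sorted(fragments):
          traces[case].append(act)

      follows = set()
      for acts in traces.values():
          follows.update(zip(acts, acts[1:]))

      print(sorted(follows))
      # [('check', 'approve'), ('check', 'reject'), ('receive', 'check')]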

  18. Automatic 1H-NMR Screening of Fatty Acid Composition in Edible Oils

    Directory of Open Access Journals (Sweden)

    David Castejón

    2016-02-01

    Full Text Available In this work, we introduce an NMR-based screening method for the fatty acid composition analysis of edible oils. We describe the evaluation and optimization needed for the automated analysis of vegetable oils by low-field NMR to obtain the fatty acid composition (FAC). To achieve this, two scripts, which automatically analyze and interpret the spectral data, were developed. The objective of this work was to drive forward the automated analysis of the FAC by NMR. Because this protocol can be carried out at low field and the complete process from sample preparation to printing the report takes only about 3 min, this approach is promising to become a fundamental technique for high-throughput screening. To demonstrate the applicability of this method, the fatty acid composition of extra virgin olive oils from various Spanish olive varieties (arbequina, cornicabra, hojiblanca, manzanilla, and picual) was determined by 1H-NMR spectroscopy according to this protocol.
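
    The calibrated formulas implemented by the paper's scripts are not reproduced here; the sketch below only illustrates the general pattern of FAC calculations from 1H-NMR integrals, normalising marker-signal integrals by proton counts. The regions, proton assignments and integral values are illustrative assumptions.

      # Illustrative only: integrals and assignments are invented.
      I_methyl = 300.0  # ~0.88 ppm: terminal CH3 (3H) of all chains but linolenic
      I_me_ln  =   6.0  # ~0.97 ppm: terminal CH3 (3H) of linolenic chains
      I_bisall =  46.0  # ~2.77 ppm: bis-allylic CH2 (2H linoleic, 4H linolenic)

      n_ln    = I_me_ln / 3                  # linolenic chains (arbitrary units)
      n_total = I_methyl / 3 + n_ln          # all acyl chains
      n_lo    = (I_bisall - 4 * n_ln) / 2    # linoleic chains

      for name, n in (("linolenic", n_ln), ("linoleic", n_lo)):
          print(f"{name}: {100 * n / n_total:.1f} mol%")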

  19. Automatic 1H-NMR Screening of Fatty Acid Composition in Edible Oils

    Science.gov (United States)

    Castejón, David; Fricke, Pascal; Cambero, María Isabel; Herrera, Antonio

    2016-01-01

    In this work, we introduce an NMR-based screening method for the fatty acid composition analysis of edible oils. We describe the evaluation and optimization needed for the automated analysis of vegetable oils by low-field NMR to obtain the fatty acid composition (FAC). To achieve this, two scripts, which automatically analyze and interpret the spectral data, were developed. The objective of this work was to drive forward the automated analysis of the FAC by NMR. Due to the fact that this protocol can be carried out at low field and that the complete process from sample preparation to printing the report only takes about 3 min, this approach is promising to become a fundamental technique for high-throughput screening. To demonstrate the applicability of this method, the fatty acid composition of extra virgin olive oils from various Spanish olive varieties (arbequina, cornicabra, hojiblanca, manzanilla, and picual) was determined by 1H-NMR spectroscopy according to this protocol. PMID:26891323

  20. Automatic ¹H-NMR Screening of Fatty Acid Composition in Edible Oils.

    Science.gov (United States)

    Castejón, David; Fricke, Pascal; Cambero, María Isabel; Herrera, Antonio

    2016-02-01

    In this work, we introduce an NMR-based screening method for the fatty acid composition analysis of edible oils. We describe the evaluation and optimization needed for the automated analysis of vegetable oils by low-field NMR to obtain the fatty acid composition (FAC). To achieve this, two scripts, which automatically analyze and interpret the spectral data, were developed. The objective of this work was to drive forward the automated analysis of the FAC by NMR. Due to the fact that this protocol can be carried out at low field and that the complete process from sample preparation to printing the report only takes about 3 min, this approach is promising to become a fundamental technique for high-throughput screening. To demonstrate the applicability of this method, the fatty acid composition of extra virgin olive oils from various Spanish olive varieties (arbequina, cornicabra, hojiblanca, manzanilla, and picual) was determined by ¹H-NMR spectroscopy according to this protocol. PMID:26891323

  1. Chang'E-3 data pre-processing system based on scientific workflow

    Science.gov (United States)

    Tan, Xu; Liu, Jianjun; Wang, Yuanyuan; Yan, Wei; Zhang, Xiaoxia; Li, Chunlai

    2016-04-01

    The Chang'E-3 (CE3) mission has obtained a huge amount of lunar scientific data. Data pre-processing is an important segment of the CE3 ground research and application system. With a dramatic increase in the demand for data research and application, a Chang'E-3 data pre-processing system (CEDPS) based on scientific workflow is proposed, with the aim of making scientists more flexible and productive by automating data-driven processing. The system should allow the planning, conduct and control of the data processing procedure, with the following capabilities: describing a data processing task, including 1) defining input and output data, 2) defining data relationships, 3) defining the sequence of tasks, 4) defining the communication between tasks, 5) defining mathematical formulas, and 6) defining the relationship between tasks and data; and automatic processing of tasks. Accordingly, how tasks are described is the key to whether the system is flexible. We design a workflow designer, a visual environment for capturing processes as workflows, and discuss its three-level model: 1) data relationships are established through a product tree; 2) the process model is constructed as a directed acyclic graph (DAG), in which a set of workflow constructs, including Sequence, Loop, Merge and Fork, are compositional with one another; 3) to reduce the modeling complexity of the mathematical formulas in the DAG, semantic modeling based on MathML is adopted. On top of that, we present how the CE3 data are processed with CEDPS.
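
    A toy sketch of the DAG-ordered execution underlying such a process model is below; the task names are invented, not CE3 pipeline products, and Python's standard graphlib (3.9+) is used for the topological ordering.

      # Tasks declare data dependencies; a topological sort yields a valid
      # processing sequence for the workflow engine to execute.
      from graphlib import TopologicalSorter

      deps = {                      # task -> tasks it depends on
          "radiometric_cal": {"unpack_raw"},
          "geometric_cal":   {"radiometric_cal"},
          "mosaic":          {"geometric_cal"},
          "unpack_raw":      set(),
      }

      print(list(TopologicalSorter(deps).static_order()))
      # ['unpack_raw', 'radiometric_cal', 'geometric_cal', 'mosaic']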

  2. Partitioning Uncertain Workflows

    CERN Document Server

    Huberman, Bernardo A

    2015-01-01

    It is common practice to partition complex workflows into separate channels in order to speed up their completion times. When this is done within a distributed environment, unavoidable fluctuations make individual realizations depart from the expected average gains. We present a method for breaking any complex workflow into several workloads in such a way that once their outputs are joined, their full completion takes less time and exhibits a smaller variance than when running in only one channel. We demonstrate the effectiveness of this method in two different scenarios: the optimization of a convex function and the transmission of a large computer file over the Internet.
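
    The claim can be checked numerically with a small Monte-Carlo sketch; the lognormal per-unit latency model below is our assumption for illustration, not the paper's model.

      # Compare total completion time for one channel versus k parallel
      # workloads whose outputs are joined at the end.
      import random, statistics

      def completion(k: int, work: int = 100, trials: int = 5000) -> list:
          times = []
          for _ in range(trials):
              # each channel processes work/k units; per-unit latency fluctuates
              chans = [sum(random.lognormvariate(0.0, 0.5)
                           for _ in range(work // k))
                       for _ in range(k)]
              times.append(max(chans))  # the join waits for the slowest channel
          return times

      for k in (1, 4):
          t = completion(k)
          print(f"k={k}: mean={statistics.mean(t):.1f} "
                f"stdev={statistics.stdev(t):.1f}")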

  3. Workflow Management in Electronic Commerce

    OpenAIRE

    Grefen, P.W.P.J.; Spaccapietra, S.; March, S.T.; Kambayashi, Y.

    2002-01-01

    This tutorial addresses the application of workflow management (WFM) for process support in both these cases. The tutorial is organized into three parts. In the first part, we pay attention to (classical) workflow management in the context of a single organization. In the second part, we extend this to workflow management across the boundaries of organizations. In the third part, we further extend this model by making service processes - implemented as workflows - the objects traded in ecomme...

  4. A quantitative fitness analysis workflow.

    Science.gov (United States)

    Banks, A P; Lawless, C; Lydall, D A

    2012-01-01

    Quantitative Fitness Analysis (QFA) is an experimental and computational workflow for comparing fitnesses of microbial cultures grown in parallel(1,2,3,4). QFA can be applied to focused observations of single cultures but is most useful for genome-wide genetic interaction or drug screens investigating up to thousands of independent cultures. The central experimental method is the inoculation of independent, dilute liquid microbial cultures onto solid agar plates which are incubated and regularly photographed. Photographs from each time-point are analyzed, producing quantitative cell density estimates, which are used to construct growth curves, allowing quantitative fitness measures to be derived. Culture fitnesses can be compared to quantify and rank genetic interaction strengths or drug sensitivities. The effect on culture fitness of any treatments added into substrate agar (e.g. small molecules, antibiotics or nutrients) or applied to plates externally (e.g. UV irradiation, temperature) can be quantified by QFA. The QFA workflow produces growth rate estimates analogous to those obtained by spectrophotometric measurement of parallel liquid cultures in 96-well or 200-well plate readers. Importantly, QFA has significantly higher throughput compared with such methods. QFA cultures grow on a solid agar surface and are therefore well aerated during growth without the need for stirring or shaking. QFA throughput is not as high as that of some Synthetic Genetic Array (SGA) screening methods(5,6). However, since QFA cultures are heavily diluted before being inoculated onto agar, QFA can capture more complete growth curves, including exponential and saturation phases(3). For example, growth curve observations allow culture doubling times to be estimated directly with high precision, as discussed previously(1). Here we present a specific QFA protocol applied to thousands of S. cerevisiae cultures which are automatically handled by robots during inoculation, incubation and
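
    As a sketch of the growth-curve step (not the QFA package itself, which fits full logistic models to photograph-derived densities), a doubling time can be estimated by log-linear regression over the exponential phase; the densities below are invented.

      # Fit log-linear growth and derive doubling time as ln(2)/rate.
      import math

      hours   = [2, 4, 6, 8, 10]
      density = [0.012, 0.025, 0.048, 0.101, 0.198]   # exponential phase only

      logs = [math.log(d) for d in density]
      n = len(hours)
      mx, my = sum(hours) / n, sum(logs) / n
      rate = sum((x - mx) * (y - my) for x, y in zip(hours, logs)) / \
             sum((x - mx) ** 2 for x in hours)
      print(f"doubling time: {math.log(2) / rate:.2f} h")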

  5. Insightful Workflow For Grid Computing

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Charles Earl

    2008-10-09

    We developed a workflow adaptation and scheduling system for Grid workflows. The system currently interfaces with and uses the Karajan workflow system. We developed machine-learning agents that provide the planner/scheduler with the information needed to decide when and how to replan. Kubrick restructures workflows at runtime, making it unique among workflow scheduling systems. The existing Kubrick system provides a platform on which to integrate additional quality-of-service constraints and in which to explore the use of an ensemble of scheduling and planning algorithms. This will be the principal thrust of our Phase II work.

  6. Enabling Smart Workflows over Heterogeneous ID-SensingTechnologies

    OpenAIRE

    Guillermo Palacios; Carlos Cetina; Raquel Lacuesta; Pau Giner

    2012-01-01

    Sensing technologies in mobile devices play a key role in reducing the gap between the physical and the digital world. The use of automatic identification capabilities can improve user participation in business processes where physical elements are involved (Smart Workflows). However, identifying all objects in the user surroundings does not automatically translate into meaningful services to the user. This work introduces Parkour, an architecture that allows the development of services that ...

  7. DMS systems and workflow

    OpenAIRE

    Jakeš, Jiří

    2008-01-01

    This work deals with document management systems (DMS) and the support of internal processes by integrated workflow modules. It covers the main reasons for implementing a DMS and its benefits, describes the functionality of typical DMS systems, defines the components of such a system, surveys the relevant market, and outlines future trends. The work is focused on practical use and tries to find a path from technology to business.

  8. Make Your Workflows Smarter

    Science.gov (United States)

    Jones, Corey; Kapatos, Dennis; Skradski, Cory

    2012-01-01

    Do you have workflows with many manual tasks that slow down your business? Or do you scale back workflows because there are simply too many manual tasks? Basic workflow robots can automate some common tasks, but not everything. This presentation will show how advanced robots called "expression robots" can be set up to perform everything from simple tasks, such as moving, creating folders, renaming, changing or creating an attribute, and revising, to more complex tasks, like creating a PDF or even launching a session of Creo Parametric and performing a specific modeling task. Expression robots are able to utilize the Java API and Info*Engine to do almost anything you can imagine! Best of all, these tools are supported by PTC and will work with later releases of Windchill. Limited knowledge of Java, Info*Engine, and XML is required. The attendee will learn what tasks expression robots are capable of performing, what is involved in setting up an expression robot, and will gain a basic understanding of simple Info*Engine tasks.

  9. PRODUCT-ORIENTED WORKFLOW MANAGEMENT IN CAPP

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    A product-oriented process workflow management model is proposed based on multi-agent technology. The autonomy, interoperability, scalability and flexibility of agents are used to coordinate the whole process planning and achieve full sharing of resources and information. Thus, unnecessary waste of human labor, time and work is reduced, and the adaptability and stability of the computer-aided process planning (CAPP) system are improved. In the detailed implementation, according to the product's BOM (bill of materials) from structural design, task assignment, management control, automatic process making, process examination and process sanction are combined into a unified management scheme that makes adjustment, control and management convenient.

  10. Towards workflow ecosystems through standard representations

    OpenAIRE

    Garijo Verdejo, Daniel; Gil, Yolanda; Corcho, Oscar

    2014-01-01

    Workflows are increasingly used to manage and share scientific computations and methods. Workflow tools can be used to design, validate, execute and visualize scientific workflows and their execution results. Other tools manage workflow libraries or mine their contents. There has been a lot of recent work on workflow system integration as well as common workflow interlinguas, but the interoperability among workflow systems remains a challenge. Ideally, these tools would f...

  11. Analysis of Enterprise Workflow Solutions

    Science.gov (United States)

    Chen, Cui-E.; Wang, Shulin; Chen, Ying; Meng, Yang; Ma, Hua

    Since the 1990s, workflow technology has been widely applied in various industries, such as office automation (OA), manufacturing, telecommunications services, banking, securities, insurance and other financial services, and research and education services, to improve business process automation and integration capabilities. In this paper, based on workflow theory, the authors propose a policy-based workflow approach in order to support dynamic workflow patterns. By extending the functions of Shark, a workflow engine component, OAShark, was implemented that supports retrieval/rollback functions. The related classes were programmed, and the technology was applied to the OA system of an enterprise project. The realization of the enterprise workflow solution greatly improved the efficiency of office automation.

  12. Tvorba workflow aplikací

    OpenAIRE

    Hanák, Tomáš

    2012-01-01

    Analysis, design and implementation of a workflow application for an auto service using the Bonita open-source process engine. The thesis introduces the main terminology of process applications, workflow management systems and BPMS. The ISAC, PDIT and BORM process analysis methods are examined. The graphic notation BPMN 2.0 for process modeling is briefly described. Finally, the workflow application "IT System for Auto Service" (ISA) is designed and implemented on Bonita Open Solution - Commu...

  13. CSP for Executable Scientific Workflows

    OpenAIRE

    Friborg, Rune Møllegaard

    2011-01-01

    This thesis presents CSP as a means of orchestrating the execution of tasks in a scientific workflow. Scientific workflow systems are popular in a wide range of scientific areas, where tasks are organised in directed graphs. Execution of such graphs is handled by the scientific workflow systems and can usually benefit performance-wise from both multiprocessing, cluster and grid environments.PyCSP is an implementation of Communicating Sequential Processes (CSP) for the Python programming langu...

  14. New Interactions with Workflow Systems

    OpenAIRE

    Wassink, I.; Vet, de, H.C.W.; Veer, van der, P.T.; Roos, M.; Dijk, van, G.; Norros, L.; Koskinen, H; Salo, L; Savioja, P.

    2009-01-01

    This paper describes the evaluation of our early design ideas of an ad-hoc of workflow system. Using the teach-back technique, we have performed a hermeneutic analysis of the mockup implementation named NIWS to get corrective and creative feedback at the functional, dialogue and representation level of the new workflow system.

  15. Workflow Management in Electronic Commerce

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Spaccapietra, S.; March, S.T.; Kambayashi, Y.

    2002-01-01

    This tutorial addresses the application of workflow management (WFM) for process support in both these cases. The tutorial is organized into three parts. In the first part, we pay attention to (classical) workflow management in the context of a single organization. In the second part, we extend this

  16. Constructing Workflows from Script Applications

    Directory of Open Access Journals (Sweden)

    Mikołaj Baranowski

    2012-01-01

    Full Text Available For programming and executing complex applications on grid infrastructures, scientific workflows have been proposed as a convenient high-level alternative to solutions based on general-purpose programming languages, APIs and scripts. GridSpace is a collaborative programming and execution environment which is based on a scripting approach and extends the Ruby language with a high-level API for invoking operations on remote resources. In this paper we describe a tool which enables conversion of GridSpace application source code into a workflow representation which, in turn, may be used for scheduling, provenance, or visualization. We describe how we addressed the issues of analyzing Ruby source code, resolving variable and method dependencies, and building the workflow representation. The solutions to these problems were developed and evaluated by testing them on complex grid application workflows such as CyberShake, Epigenomics and Montage. The evaluation is enriched by representing typical workflow control-flow patterns.

  17. Implementing Oracle Workflow

    CERN Document Server

    Mathieson, D W

    1999-01-01

    CERN (see [CERN]) is the world's largest physics research centre. Currently there are around 5,000 people working at the CERN site located on the border of France and Switzerland near Geneva along with another 4,000 working remotely at institutes situated all around the globe. CERN is currently working on the construction of our newest scientific instrument called the Large Hadron Collider (LHC); the construction alone of this 27-kilometre particle accelerator will not complete until 2005. Like many businesses in the current economic climate CERN is expected to continue growing, yet staff numbers are planned to fall in the coming years. In essence, do more with less. In an environment such as this, it is critical that the administration is as efficient as possible. One of the ways that administrative procedures are streamlined is by the use of an organisation-wide workflow system.

  18. Faces from the web: automatic selection and composition of media for casual screen consumption and printed artwork

    Science.gov (United States)

    Cheatle, Phil; Greig, Darryl; Slatter, David

    2010-02-01

    Web image search engines facilitate the production of image sets in which faces appear. Many people enjoy producing and sharing media collections of this type and generating new images or video experiences. Skilled practitioners produce visually appealing artifacts from such collections but few users have the time or creative ability to do so. The problem is to automatically create an image or ambient experience which sustains interest. A full solution requires agreements with copyright holders and input from graphics designers. We address the underlying technical problems of extraction and composition. We describe an automatic system that identifies regions containing human faces in each image of an image set resulting from a web search. The face regions are composed into dynamically synthesized multilayer graphical backgrounds. The aesthetic aspects of the composition are controlled by active templates. These aspects include face size and positioning but also face identity and number of faces in a group. The output structure is multi layer supporting both the generation of static images and video consisting of transitions between the compositions.

  19. Enabling smart workflows over heterogeneous ID-sensing technologies.

    Science.gov (United States)

    Giner, Pau; Cetina, Carlos; Lacuesta, Raquel; Palacios, Guillermo

    2012-01-01

    Sensing technologies in mobile devices play a key role in reducing the gap between the physical and the digital world. The use of automatic identification capabilities can improve user participation in business processes where physical elements are involved (Smart Workflows). However, identifying all objects in the user surroundings does not automatically translate into meaningful services to the user. This work introduces Parkour, an architecture that allows the development of services that match the goals of each of the participants in a smart workflow. Parkour is based on a pluggable architecture that can be extended to provide support for new tasks and technologies. In order to facilitate the development of these plug-ins, tools that automate the development process are also provided. Several Parkour-based systems have been developed in order to validate the applicability of the proposal. PMID:23202193

  20. Summer Student Report - AV Workflow

    CERN Document Server

    Abramson, Jessie

    2014-01-01

    The AV Workflow is a web application which allows CERN users to publish, update and delete videos from CDS. During my summer internship I implemented the backend of the new version of the AV Workflow in Python using the Django framework.

  1. Agile parallel bioinformatics workflow management using Pwrake

    Directory of Open Access Journals (Sweden)

    Tanaka Masahiro

    2011-09-01

    Full Text Available Abstract Background In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method through iterative development phases after trial and error. Here, we show the application of a scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. Findings We implemented the Pwrake workflows to process next generation sequencing data using the Genomic Analysis Toolkit (GATK and Dindel. GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that in practice, actual scientific workflow development iterates over two phases, the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate modularity of the GATK and Dindel workflows. Conclusions Pwrake enables agile management of scientific workflows in the bioinformatics domain. The internal domain specific language design built on Ruby gives the flexibility of rakefiles for writing scientific workflows

  2. Customizable Isolation in Transactional Workflow

    OpenAIRE

    Guabtni, Adnene; Charoy, François; Godart, Claude

    2005-01-01

    In Workflow Management Systems (WFMSs), safety of execution is a major need of more and more business processes, and transactional workflows are a real need inside enterprises. In previous works, transactional models consider mainly atomicity as the main issue regarding long-term transactions. They rarely consider the fact that many processes may run concurrently and thus access and update the same data. Usually, the main isolation item is the data, on which we apply locking approaches, and this atti...

  3. Within day variation in fatty acid composition of milk from cows in an automatic milking system

    DEFF Research Database (Denmark)

    Larsen, Mette Krogh; Weisbjerg, Martin Riis; Kristensen, Camilla Bjerg;

    2012-01-01

    Milk fatty acid composition is influenced by a range of conditions such as breed, feeding, and stage of lactation. Knowledge of milk fatty acid composition of individual cows would make it possible to sort milk at farm level according to certain fatty acid specifications. In the present study, 22...

  4. Automatic Determination of Fiber-Length Distribution in Composite Material Using 3D CT Data

    Directory of Open Access Journals (Sweden)

    Günther Greiner

    2010-01-01

    Full Text Available Determining fiber length distribution in fiber reinforced polymer components is a crucial step in quality assurance, since fiber length has a strong influence on the overall strength, stiffness, and stability of the material. The approximate fiber length distribution is usually determined early in the development process, as conventional methods require destruction of the sample component. In this paper, a novel, automatic, and nondestructive approach for the determination of fiber length distribution in fiber reinforced polymers is presented. For this purpose, high-resolution computed tomography is used as the imaging method, together with subsequent image analysis for evaluation. The image analysis consists of an iterative process in which single fibers are detected automatically in each iteration step after image enhancement algorithms have been applied. Subsequently, a model-based approach is used together with a priori information in order to guide a fiber tracing and segmentation process. Thereby, the length of the segmented fibers can be calculated and a length distribution can be deduced. The performance and the robustness of the segmentation method are demonstrated by applying it to artificially generated test data and selected real components.

  5. La vérification de patrons de workflow métier basés sur les flux de contrôle : une approche utilisant les systèmes à base de connaissances

    OpenAIRE

    Nguyen, Thi Hoa Hue

    2015-01-01

    This thesis tackles the problem of modelling semantically rich business workflow templates and proposes a process for developing workflow templates. The objective of the thesis is to transform a business process into a control flow-based business workflow template that guarantees syntactic and semantic validity. The main challenges are: (i) to define a formalism for representing business processes; (ii) to establish automatic control mechanisms to ensure the correctness of a business workflow t...

  6. Concurrency & Asynchrony in Declarative Workflows

    DEFF Research Database (Denmark)

    Debois, Søren; Hildebrandt, Thomas; Slaats, Tijs

    2015-01-01

    Declarative or constraint-based business process and workflow notations have received increasing interest in the last decade as possible means of addressing the challenge of supporting, at the same time, flexibility in execution, adaptability and compliance. However, the definition of concurrent semantics, which is a necessary foundation for asynchronously executing distributed processes, is not obvious for declarative formalisms and is so far virtually unexplored. This is in stark contrast to the very successful Petri-net-based process languages, which have an inherent notion of concurrency. We show that concurrency in DCR Graphs admits asynchronous execution of declarative workflows, both conceptually and by reporting on a prototype implementation of a distributed declarative workflow engine. Both the theoretical development and the implementation are supported by an extended example.

  7. Office 2010 Workflow Developing Collaborative Solutions

    CERN Document Server

    Mann, David; Enterprises, Creative

    2010-01-01

    Workflow is the glue that binds information worker processes, users, and artifacts. Without workflow, information workers are just islands of data and potential. Office 2010 Workflow details how to implement workflow in SharePoint 2010 and the client Microsoft Office 2010 suite to help information workers share data, enforce processes and business rules, and work more efficiently together or solo. This book covers everything you need to know-from what workflow is all about to creating new activities; from the SharePoint Designer to Visual Studio 2010; from out-of-the-box workflows to state mac

  8. Procesos workflow en la nube

    OpenAIRE

    Peralta, Mario; Salgado, Carlos Humberto; Baigorria, Lorena; Montejano, Germán Antonio; Riesco, Daniel Eduardo

    2014-01-01

    Given the globalization of information, organizations tend to virtualize their businesses: to move their business to the Cloud. From the perspective of the complexity of business processes, one of the most significant technologies for supporting their automation is Workflow Management Systems, which give computational support for defining, synchronizing and executing process activities using workflows. To favor and give flexibility to such systems, it is fundamental to have tool...

  9. Similarity measures for scientific workflows

    OpenAIRE

    Starlinger, Johannes

    2016-01-01

    Over the course of the last ten years, scientific workflows have gained attention as a tool for creating reproducible, data-processing in-silico experiments, into which local scripts and applications as well as web services can be integrated. Such workflows can be published and reused via specialized online libraries, so-called repositories. As these repositories grow in size, similarity measures for scientific workfl...

  10. Workflow in Astronomy : the VO France Workflow Working Group experience

    Science.gov (United States)

    Schaaff, A.; Petit, F. L.; Prugniel, P.; Slezak, E.; Surace, C.

    2008-08-01

    The French Action Spécifique Observatoires Virtuels created the Workflow Working Group in 2005. Its aim is to explore the use of the workflow paradigm in the astronomical domain. The first consensus was the definition of a workflow as a sequence of tasks realized in a controlled context (at various levels: intelligence in the choice of the algorithms, flow control, etc.), based on use-case studies, in an architecture which takes VO standards into account. The current roadmap is to provide scientific use cases in several domains (image, spectrum, simulation, data mining, etc.) and to improve them mainly with existing VO tools. Another important point is to develop collaborations with the IT community (links to EGEE, ...). Use cases are useful to compare the pertinence of the possible workflow models and to understand how to implement them as efficiently as possible with existing tools (e.g. AstroGrid, AÏDA, WebCom-G). Execution (on a local machine, cluster or grid) through these kinds of tools, and the use of VO functionalities (Web Services, Grid, VOSpace, etc.), becomes almost transparent.

  11. WARP (workflow for automated and rapid production): a framework for end-to-end automated digital print workflows

    Science.gov (United States)

    Joshi, Parag

    2006-02-01

    The publishing industry is experiencing a major paradigm shift with the advent of digital publishing technologies. A large number of components in the publishing and print production workflow are transformed in this shift. However, the process as a whole requires a great deal of human intervention for decision making and for resolving exceptions during job execution. Furthermore, a majority of the best-of-breed applications for publishing and print production are intrinsically designed and developed to be driven by humans. Thus, the human-intensive nature of the current prepress process accounts for a very significant amount of the overhead costs in fulfillment of jobs on press. It is a challenge to automate the functionality of applications built with the model of human-driven execution. Another challenge is to orchestrate various components in the publishing and print production pipeline such that they work in a seamless manner to enable the system to perform automatic detection of potential failures and take corrective actions in a proactive manner. Thus, there is a great need for a coherent and unifying workflow architecture that streamlines the process and automates it as a whole in order to create an end-to-end digital automated print production workflow that does not involve any human intervention. This paper describes an architecture and building blocks that lay the foundation for a plurality of automated print production workflows.

  12. Constructing workflows from script applications

    NARCIS (Netherlands)

    M. Baranowski; A. Belloum; M. Bubak; M. Malawski

    2013-01-01

    For programming and executing complex applications on grid infrastructures, scientific workflows have been proposed as convenient high-level alternative to solutions based on general-purpose programming languages, APIs and scripts. GridSpace is a collaborative programming and execution environment,

  13. Web-based Collaborative Workflow Design

    OpenAIRE

    Held, Markus

    2010-01-01

    In recent years, scientific workflows have gained enormous importance, while workflow management has been a major aspect of enterprise systems since the 1990s. Business processes as well as queries of biological databases are modelled as workflows. In comparison to other programs, workflows contain a very high degree of domain-specific logic, thus rendering close cooperation between subject matter experts and software engineers inevitable. This dissertation presents concepts and processes for co...

  14. Approaches to Workflow Analysis in Healthcare Settings

    OpenAIRE

    Sheehan, Barbara; Bakken, Suzanne

    2012-01-01

    Attention to workflow is an important component of a comprehensive approach to designing usable information systems. In healthcare, inattention to workflow is associated with poorly accepted systems and unforeseen effects of use. How best to examine workflow for the purpose of system design is in itself the subject of scientific inquiry. Several disciplines offer approaches to the study of workflow that can be tailored to meet the needs of systems designers in healthcare settings. This paper ...

  15. Continuous digital workflows for earth science research

    OpenAIRE

    Klump, J.; Löwe, P.

    2007-01-01

    The wealth of data available in the earth sciences is underutilised due to the absence of continuous digital workflows. The emergence of standardised web services for geospatial data, sensor network integration and grid technology now offer tools for the creation of such workflows, orchestrated by workflow engines. The creation of continuous digital workflows enables us to create new tools for global collaboration in the earth sciences by integrating the acquisition of data and metadata, and ...

  16. Delegation Protocols in Human-Centric Workflows

    OpenAIRE

    Gaaloul, Khaled; Proper, Erik; Charoy, François

    2011-01-01

    Organisational processes are facilitated and conducted using workflow management systems. Currently, we observe a tendency to move away from strict workflow modelling towards dynamic approaches that support human interactions when deploying a workflow. One specific approach ensuring human-centric workflows is task delegation. Delegating a task may require access to specific and potentially sensitive data that have to be secured and specified in authorisation policies. In this...

  17. Semi-automatic determination of the carbon and oxygen stable isotope compositions of calcite and dolomite in natural mixtures

    International Nuclear Information System (INIS)

    A semi-automatic, on-line method was developed to determine the δ13C and δ18O values of coexisting calcite and dolomite. An isotopic mass balance is used to calculate the composition of dolomite after having measured that of calcite and that of the “bulk” sample. The limit of validity of this method is established by performing isotopic measurements of artificial mixtures made of precisely weighed and isotopically-characterised dolomite and calcite. The accuracy and repeatability of the calculation of dolomite δ13C and δ18O are statistically determined with a Monte-Carlo procedure of error propagation. Stable isotope ratios are determined by using an automated MultiPrep™ system on-line with an isotope-ratio mass-spectrometer (IRMS). The reaction time and the temperature of reaction were optimised by comparing the results with the isotopic composition of known mixtures. The best results were obtained by phosphoric acid digestion after 20 min at 40 °C for calcite and 45 min at 90 °C for dolomite. This procedure allows an accurate determination of the isotopic ratios from small samples (300 μg). Application of this protocol to natural mixtures of calcite and dolomite requires the accurate determination of the relative abundance of calcite and dolomite by combining Mélières manocalcimetry (MMC) and X-ray diffractometry (XRD).
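
    Stated explicitly, the mass balance behind the calculation is linear in the calcite fraction. Writing f_cc for the calcite fraction of the carbonate (obtained from MMC and XRD), a first-order form applied separately to δ13C and δ18O is (the symbols here are ours, and this form neglects differences in acid-digestion CO2 yield):

      \delta_{\mathrm{bulk}} = f_{cc}\,\delta_{cc} + (1 - f_{cc})\,\delta_{dol}
      \qquad\Longrightarrow\qquad
      \delta_{dol} = \frac{\delta_{\mathrm{bulk}} - f_{cc}\,\delta_{cc}}{1 - f_{cc}}

    The Monte-Carlo error propagation mentioned above then amounts to resampling \delta_{\mathrm{bulk}}, \delta_{cc} and f_{cc} within their uncertainties and recomputing \delta_{dol}.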

  18. Enabling Smart Workflows over Heterogeneous ID-SensingTechnologies

    Directory of Open Access Journals (Sweden)

    Guillermo Palacios

    2012-11-01

    Full Text Available Sensing technologies in mobile devices play a key role in reducing the gap between the physical and the digital world. The use of automatic identification capabilities can improve user participation in business processes where physical elements are involved (Smart Workflows). However, identifying all objects in the user surroundings does not automatically translate into meaningful services to the user. This work introduces Parkour, an architecture that allows the development of services that match the goals of each of the participants in a smart workflow. Parkour is based on a pluggable architecture that can be extended to provide support for new tasks and technologies. In order to facilitate the development of these plug-ins, tools that automate the development process are also provided. Several Parkour-based systems have been developed in order to validate the applicability of the proposal.

  19. Towards an Intelligent Workflow Designer based on the Reuse of Workflow Patterns

    OpenAIRE

    Iochpe, C.; Chiao, C.; Hess, G; Nascimento, G.S.; Thom, L.H.; Reichert, M.U.

    2007-01-01

    In order to realize process-aware information systems, we need sophisticated methods and concepts for designing and modeling processes. Recently, research on workflow patterns has emerged in order to increase the reuse of recurring workflow structures. However, current workflow modeling tools do not provide functionalities that enable users to define, query, and reuse workflow patterns properly. In this paper we present a suite for both process modeling and normalization based on workflow patte...

  20. Modeling Workflow Using UML Activity Diagram

    Institute of Scientific and Technical Information of China (English)

    Wei Yinxing(韦银星); Zhang Shensheng

    2004-01-01

    An enterprise can improve its adaptability in a changing market by means of workflow technologies. At build time, the main function of a Workflow Management System (WFMS) is to model the business process. A workflow model is an abstract representation of a real-world business process. The Unified Modeling Language (UML) activity diagram is an important visual process modeling language proposed by the Object Management Group (OMG). The novelty of this paper is representing workflow models by means of UML activity diagrams. A translation from UML activity diagrams to π-calculus is established. Using π-calculus, the deadlock property of workflows is analyzed.

  1. In vivo semi-automatic segmentation of multicontrast cardiovascular magnetic resonance for prospective cohort studies on plaque tissue composition: initial experience.

    Science.gov (United States)

    Yoneyama, Taku; Sun, Jie; Hippe, Daniel S; Balu, Niranjan; Xu, Dongxiang; Kerwin, William S; Hatsukami, Thomas S; Yuan, Chun

    2016-01-01

    Automatic in vivo segmentation of multicontrast (multisequence) carotid magnetic resonance for plaque composition has been proposed as a substitute for manual review to save time and reduce inter-reader variability in large-scale or multicenter studies. Using serial images from a prospective longitudinal study, we sought to compare a semi-automatic approach versus expert human reading in analyzing carotid atherosclerosis progression. Baseline and 6-month follow-up multicontrast carotid images from 59 asymptomatic subjects with 16-79 % carotid stenosis were reviewed by both trained radiologists with 2-4 years of specialized experience in carotid plaque characterization with MRI and a previously reported automatic atherosclerotic plaque segmentation algorithm, referred to as morphology-enhanced probabilistic plaque segmentation (MEPPS). Agreement on measurements from individual time points, as well as on compositional changes, was assessed using the intraclass correlation coefficient (ICC). There was good agreement between manual and MEPPS reviews on individual time points for calcification (CA) (area: ICC; 0.85-0.91; volume: ICC; 0.92-0.95) and lipid-rich necrotic core (LRNC) (area: ICC; 0.78-0.82; volume: ICC; 0.84-0.86). For compositional changes, agreement was good for CA volume change (ICC; 0.78) and moderate for LRNC volume change (ICC; 0.49). Factors associated with LRNC progression as detected by MEPPS review included intraplaque hemorrhage (positive association) and reduction in low-density lipoprotein cholesterol (negative association), which were consistent with previous findings from manual review. Automatic classifier for plaque composition produced results similar to expert manual review in a prospective serial MRI study of carotid atherosclerosis progression. Such automatic classification tools may be beneficial in large-scale multicenter studies by reducing image analysis time and avoiding bias between human reviewers. PMID:26169389

  2. Automatic workflow for the classification of local DNA conformations

    Czech Academy of Sciences Publication Activity Database

    Čech, P.; Kukal, J.; Černý, Jiří; Schneider, Bohdan; Svozil, D.

    2013-01-01

    Vol. 14, No. 205 (2013). ISSN 1471-2105. R&D Projects: GA ČR GAP305/12/1801. Institutional research plan: CEZ:AV0Z50520701. Keywords: DNA; dinucleotide conformation; classification; machine learning; neural network; k-NN; cluster analysis. Subject RIV: EB - Genetics; Molecular Biology. Impact factor: 2.672, year: 2013

  3. Towards Automated Design, Analysis and Optimization of Declarative Curation Workflows

    Directory of Open Access Journals (Sweden)

    Tianhong Song

    2014-10-01

    Full Text Available Data curation is increasingly important. Our previous work on a Kepler curation package has demonstrated advantages that come from automating data curation pipelines by using workflow systems. However, manually designed curation workflows can be error-prone and inefficient due to a lack of user understanding of the workflow system, misuse of actors, or human error. Correcting problematic workflows is often very time-consuming. A more proactive workflow system can help users avoid such pitfalls. For example, static analysis before execution can be used to detect the potential problems in a workflow and help the user to improve workflow design. In this paper, we propose a declarative workflow approach that supports semi-automated workflow design, analysis and optimization. We show how the workflow design engine helps users to construct data curation workflows, how the workflow analysis engine detects different design problems of workflows and how workflows can be optimized by exploiting parallelism.
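
    A minimal sketch of the kind of static analysis such a design engine can run before execution is below; the actor names and types are invented, and real analyses would also cover actor misuse and richer dataflow constraints.

      # Verify that each actor's declared input type matches the upstream
      # actor's output type, reporting problems before any data is touched.
      PIPELINE = [  # (actor, input type, output type)
          ("read_darwin_core", "path",    "records"),
          ("validate_dates",   "records", "records"),
          ("plot_summary",     "table",   "figure"),   # mismatch: wants 'table'
      ]

      def analyze(pipeline):
          problems = []
          for (up, _, out), (down, inp, _) in zip(pipeline, pipeline[1:]):
              if out != inp:
                  problems.append(f"{up} emits '{out}' but {down} expects '{inp}'")
          return problems

      print(analyze(PIPELINE))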

  4. Automatic composition of semantic Web services based on planning graph

    Institute of Scientific and Technical Information of China (English)

    付鹏斌; 李利波; 杨惠荣

    2011-01-01

    由于单个的Web服务功能有限,难以满足日益增长和不断变化的用户需求.如何根据服务请求者的特定需求进行服务的自动组合来满足用户的需要,就成为了一个迫切需要解决的问题.提出了一种基于图规划的语义Web服务自动组合方法,该方法在建立前驱与后继执行关系知识库的基础上,利用图规划的前向扩张思想和图规划解搜索思想,可实现从服务库中自动地找到满足用户需求的服务组合方案.该方法综合考虑了服务语义和服务组合的效率等因素,在保证Web服务组合质量的前提下,可根据服务请求实现服务的自动组合.最后用仿真实验从服务组合的成功率、效率和组合解质量三方面验证了该方法的有效性和可行性.%Nowadays, the limited function of single Web service can not satisfy the growing and varying custom' s require-ment. Therefore, how to compose Web services automatically according to service request becomes a problem urgent to be solved. This paper proposed an automatic Web service composition method based on planning graph technique, and by estab-lishing precursor-subsequence execution relationship knowledge base to support automatic Web service composition. The meth-od built on automated planning theory and used planning-graph' s expand and search techniques, and could find the service composition solution quickly and automatically. Compared with existing methods, the method took the semantics of Web serv-ices into consideration together with efficiency of the service composition. So the method can ensure quality and efficiency while composing services automatically according to user's service request. Finally using simulation experiment from services composition succeed rate,services composition efficiency and the quality of services composition solution three aspects to vali-date the effectiveness and feasibility of the method.

  5. Common motifs in scientific workflows: An empirical analysis

    OpenAIRE

    Garijo Verdejo, Daniel; Alper, P.; Belhajjame, K.; Corcho, Oscar; Gil, Yolanda; Goble, C.

    2013-01-01

    Workflow technology continues to play an important role as a means for specifying and enacting computational experiments in modern science. Reusing and re-purposing workflows allow scientists to do new experiments faster, since the workflows capture useful expertise from others. As workflow libraries grow, scientists face the challenge of finding workflows appropriate for their task, understanding what each workflow does, and reusing relevant portions of a given workflow. We believe that workf...

  6. Operational Semantic of Workflow Engine and the Realizing Technique

    Institute of Scientific and Technical Information of China (English)

    FU Yan-ning; LIU Lei; ZHAO Dong-fan; JIN Long-fei

    2005-01-01

    At present, there is no formalized description of the executing procedure of workflow models. In this paper, the procedure of workflow models executing in a workflow engine is described using operational semantics. The formalized description of process instances and activity instances leads to a very clear structure for the workflow engine, eases cooperation between heterogeneous workflow engines and guides the realization of the workflow engine's functions. Meanwhile, the software of the workflow engine has been completed by means of this formalized description.

  7. Layered Workflow Process Model Based on Extended Synchronizer

    Directory of Open Access Journals (Sweden)

    Gang Ni

    2014-07-01

    Full Text Available The layered workflow process model provides a modeling approach and analysis for key processes with Petri nets. It not only describes the relation between business flow processes and transition nodes clearly, but also limits the rapid increase in the number of places, transitions and directed arcs. This paper studies processes like reservation and complaint handling in information management systems, especially the multi-merge and discriminator patterns, which cannot be directly modeled with existing synchronizers. Petri nets are adopted to provide a formal description of the workflow patterns, and the relation between arcs and weight classes is also analyzed. We use the number of incoming and outgoing arcs to generalize workflows into three synchronization modes: fully synchronous mode, competition synchronous mode and asynchronous mode. Types and parameters for synchronization are added to extend the modeling ability of the synchronizers, and the synchronous distance is also extended. The extended synchronizers have the ability to terminate branches automatically or to activate the next link many times, in addition to the abilities of the original synchronizers. Analyses of key business cases verify that the original synchronizers cannot model these patterns directly, while the extended Petri-net-based synchronizers can model the multi-merge and discriminator patterns.
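
    The three synchronization modes can be paraphrased as a simple firing rule over a join's input branches; the sketch below is our toy rendering, not the paper's Petri-net formalism.

      # Whether a join fires depends on how many input branches have
      # delivered a token under each synchronization mode.
      def can_fire(mode: str, arrived: int, n_branches: int) -> bool:
          if mode == "full":         # fully synchronous: wait for every branch
              return arrived == n_branches
          if mode == "competition":  # first arrival fires the join, once
              return arrived >= 1
          if mode == "async":        # asynchronous: each arrival may fire it
              return arrived >= 1    # (multi-merge: the join activates repeatedly)
          raise ValueError(mode)

      for mode in ("full", "competition"):
          print(mode, [can_fire(mode, k, 3) for k in (1, 2, 3)])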

  8. Quality of Data Driven Simulation Workflows

    Directory of Open Access Journals (Sweden)

    Michael Reiter

    2014-01-01

    Full Text Available Simulations are long-running computations driven by non-trivial data dependencies. Workflow technology helps to automate these simulations and enables using Quality of Data (QoD) frameworks to determine the goodness of simulation data. However, existing frameworks are specific to scientific domains, individual applications, or proprietary workflow engine extensions. In this paper, we propose a generic approach to use QoD as a uniform means to steer complex interdisciplinary simulations implemented as workflows. The approach enables scientists to specify abstract QoD requirements that are considered to steer the workflow for ensuring a precise final result. To realize these Quality-of-Data-driven workflows, we present a middleware architecture and a WS-Policy-based language to describe QoD requirements and capabilities. To prove technical feasibility, we present a prototype for controlling and steering simulation workflows and a real-world simulation scenario.

  9. Enabling adaptive scientific workflows via trigger detection

    OpenAIRE

    Salloum, Maher; Bennett, Janine C.; PINAR, Ali; Bhagatwala, Ankit; Chen, Jacqueline H.

    2015-01-01

    Next generation architectures necessitate a shift away from traditional workflows in which the simulation state is saved at prescribed frequencies for post-processing analysis. While the need to shift to in situ workflows has been acknowledged for some time, much of the current research is focused on static workflows, where the analysis that would have been done as a post-process is performed concurrently with the simulation at user-prescribed frequencies. Recently, research efforts are striv...

  10. Agile parallel bioinformatics workflow management using Pwrake

    OpenAIRE

    Tanaka Masahiro; Sasaki Kensaku; Mishima Hiroyuki; Tatebe Osamu; Yoshiura Koh-ichiro

    2011-01-01

    Abstract Background In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environm...

  11. VO-compliant workflows and science gateways

    Science.gov (United States)

    Castelli, G.; Taffoni, G.; Sciacca, E.; Becciani, U.; Costa, A.; Krokos, M.; Pasian, F.; Vuerli, C.

    2015-06-01

    Workflow and science gateway technologies have been adopted by scientific communities as a valuable tool to carry out complex experiments. They offer the possibility to perform computations for data analysis and simulations, while hiding details of the complex infrastructures underneath. There are many workflow management systems covering a large variety of generic services coordinating execution of workflows. In this paper we describe our experiences in creating workflow-oriented science gateways based on gUSE/WS-PGRADE technology, and in particular we discuss the efforts devoted to developing a VO-compliant web environment.

  12. Pro WF: Windows Workflow in .NET 4.0

    CERN Document Server

    Bukovics, Bruce

    2010-01-01

    Windows Workflow Foundation (WF) is a revolutionary part of the .NET 4 Framework that allows you to orchestrate human and system interactions as a series of workflows that can be easily mapped, analyzed, adjusted, and implemented. As business problems become more complex, the need for workflow-based solutions has never been more evident. WF provides a simple and consistent way to model and implement complex problems. As a developer, you focus on developing the business logic for individual workflow tasks. The runtime handles the execution of those tasks after they have been composed into a wor

  13. The standard-based open workflow system in GeoBrain (Invited)

    Science.gov (United States)

    Di, L.; Yu, G.; Zhao, P.; Deng, M.

    2013-12-01

    GeoBrain is an Earth science Web-service system developed and operated by the Center for Spatial Information Science and Systems, George Mason University. In GeoBrain, a standard-based open workflow system has been implemented to accommodate the automated processing of geospatial data through a set of complex geo-processing functions for advanced product generation. GeoBrain models complex geoprocessing at two levels, the conceptual and the concrete. At the conceptual level, the workflows exist in the form of data and service types defined by ontologies. The workflows at the conceptual level are called geo-processing models and are cataloged in GeoBrain as virtual product types. A conceptual workflow is instantiated into a concrete, executable workflow when a user requests a product that matches a virtual product type. Both conceptual and concrete workflows are encoded in the Business Process Execution Language (BPEL). A BPEL workflow engine, called BPELPower, has been implemented to execute the workflows for product generation. A provenance-capturing service has been implemented to generate ISO 19115-compliant complete product provenance metadata before and after the workflow execution. Generating provenance metadata before the workflow execution allows users to examine the usability of the final product before the lengthy and expensive execution takes place. The three modes of workflow execution defined in ISO 19119 (transparent, translucent, and opaque) are available in GeoBrain. A geoprocessing modeling portal has been developed to allow domain experts to develop geoprocessing models at the type level with the support of both data and service/processing ontologies. The geoprocessing models capture the knowledge of the domain experts and become the operational offerings of the products after a proper peer review of the models is conducted. An automated workflow composition has been experimented with successfully based on ontologies and artificial

  14. Workflow Planning and Execution - Final Results

    Directory of Open Access Journals (Sweden)

    Ravikant Dewangan

    2016-01-01

    Full Text Available Abstract workflow generation consists of choosing and configuring application components to form an abstract workflow. The application components are chosen by examining the specification of their capabilities and checking to see if they can generate the desired data products. They are configured by assigning input files that exist or that may be generated by other application components. The abstract workflow specifies the order in which the components must be executed. Concrete workflow generation consists of selecting specific resources, files, and additional jobs required to form a concrete workflow that can be executed in a Grid environment. In order to generate a concrete workflow, each component in the abstract workflow is turned into an executable job by specifying the locations of the physical files of the component and data, as well as the resources assigned to the component in the execution environment. Additional jobs may be included in the concrete workflow, for example, jobs that transfer files to the appropriate locations where resources are available to execute the application components.
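
    The abstract-to-concrete mapping described above can be sketched in a few lines; the catalogues, job names, and site names below are hypothetical, and real systems would also handle scheduling and failure recovery.

```python
# Turning an abstract workflow into a concrete, executable one (sketch):
# bind each component to a resource, resolve input file locations, and add
# transfer jobs when a file is not already on the chosen resource.
def concretize(abstract_steps, replica_catalog, site_catalog):
    concrete = []
    for step in abstract_steps:                       # order is preserved
        site = site_catalog[step["component"]]        # resource selection
        for f in step["inputs"]:
            location = replica_catalog[f]
            if location != site:                      # add a transfer job
                concrete.append({"job": "transfer", "file": f,
                                 "from": location, "to": site})
        concrete.append({"job": step["component"], "site": site,
                         "inputs": step["inputs"]})
    return concrete

abstract = [{"component": "extract", "inputs": ["raw.dat"]},
            {"component": "analyze", "inputs": ["extracted.dat"]}]
replicas = {"raw.dat": "storage", "extracted.dat": "clusterA"}
sites = {"extract": "clusterA", "analyze": "clusterA"}
for job in concretize(abstract, replicas, sites):
    print(job)
```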

  15. Introduction to the Workflow Systems in Management

    OpenAIRE

    Aleksander Wocial

    2007-01-01

    The article concerns the ontology of workflow management systems. The fundamental diagrams and their constituent elements are presented, as well as the meaning of the components and the relations and interactions among them. The first is the conceptual model of the flow process, followed by the meta-model of the process definition. Understanding these terms is crucial for IT or management specialists involved in the area of workflow.

  16. Implementing bioinformatic workflows within the bioextract server

    Science.gov (United States)

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed servi...

  17. WIDE - A Distributed Architecture for Workflow Management

    OpenAIRE

    S. Ceri; Grefen, P.W.P.J.; G. Sánchez

    1997-01-01

    This paper presents the distributed architecture of the WIDE workflow management system. We show how distribution and scalability are obtained by the use of a distributed object model, a client/server architecture, and a distributed workflow server architecture. Specific attention is paid to the extended transaction support and active rule support subarchitectures.

  18. Metaworkflows and Workflow Interoperability for Heliophysics

    Science.gov (United States)

    Pierantoni, Gabriele; Carley, Eoin P.

    2014-06-01

    Heliophysics is a relatively new branch of physics that investigates the relationship between the Sun and the other bodies of the solar system. To investigate such relationships, heliophysicists can rely on various tools developed by the community. Some of these tools are on-line catalogues that list events (such as Coronal Mass Ejections, CMEs) and their characteristics as they were observed on the surface of the Sun or on the other bodies of the Solar System. Other tools offer on-line data analysis and access to images and data catalogues. During their research, heliophysicists often perform investigations that need to coordinate several of these services and to repeat these complex operations until the phenomena under investigation are fully analyzed. Heliophysicists combine the results of these services; this service orchestration is best suited for workflows. This approach has been investigated in the HELIO project. The HELIO project developed an infrastructure for a Virtual Observatory for Heliophysics and implemented service orchestration using TAVERNA workflows. HELIO developed a set of workflows that proved to be useful but lacked flexibility and re-usability. The TAVERNA workflows also needed to be executed directly in TAVERNA workbench, and this forced all users to learn how to use the workbench. Within the SCI-BUS and ER-FLOW projects, we have started an effort to re-think and re-design the heliophysics workflows with the aim of fostering re-usability and ease of use. We base our approach on two key concepts, that of meta-workflows and that of workflow interoperability. We have divided the produced workflows in three different layers. The first layer is Basic Workflows, developed both in the TAVERNA and WS-PGRADE languages. They are building blocks that users compose to address their scientific challenges. They implement well-defined Use Cases that usually involve only one service. The second layer is Science Workflows usually developed in TAVERNA. They

  19. Implementation of WPDL Conforming Workflow Model

    Institute of Scientific and Technical Information of China (English)

    张志君; 范玉顺

    2003-01-01

    Workflow process definition language (WPDL) facilitates the transfer of workflow process definitions between separate workflow products. However, much work is still needed to transfer the specific workflow model to a WPDL conforming model. CIMFlow is a workflow management system developed by the National CIMS Engineering Research Center. This paper discusses the methods by which the CIMFlow model conforms to the WPDL meta-model and the differences between the WPDL meta-model and the CIMFlow model. Some improvements are proposed for the WPDL specification. Finally, the mapping and translating methods between the entities and attributes are given for the two models. The proposed methods and improvements are valuable as a reference for other mapping applications and the WPDL specification.

  20. Using Mobile Agents to Implement Workflow System

    Institute of Scientific and Technical Information of China (English)

    LI Jie; LIU Xian-xing; GUO Zheng-wei

    2004-01-01

    Current workflow management systems usually adopt existing technologies, such as TCP/IP-based Web technologies and CORBA, for the underlying communications. Very often these have been considered only from a theoretical point of view, mainly for the lack of concrete possibilities to execute with elasticity. MAT (Mobile Agent Technology) represents a very attractive approach to the distributed control of computer networks and a valid alternative for implementing workflow systems. This paper mainly focuses on improving the performance of workflow systems by using MAT. First, the performance of workflow systems based on CORBA and on mobile agents is summarized and analyzed; second, a performance comparison is presented by introducing a mathematical model of each kind of data-interaction process. Last, a mobile-agent-based workflow system named MAWMS is presented and described in detail.

  1. Integration of the result visualization into a workflow modeling tool

    OpenAIRE

    Xuejing, Chen

    2010-01-01

    Scientific workflows are frequently used for simulations, experiment analyses, or the design and execution of experiments. The operation of the workflow management system must be self-describing, intuitive, and abstracted from the underlying technology, because many non-computer-scientists want to use workflow technology. Visualization workflows are scientific workflows for visualization; for different visualization methods, there should be different visualization workflows....

  2. DEWEY: The DICOM-Enabled Workflow Engine System

    OpenAIRE

    Erickson, Bradley J.; Langer, Steve G.; Blezek, Daniel J.; Ryan, William J.; French, Todd L.

    2014-01-01

    Workflow is a widely used term to describe the sequence of steps to accomplish a task. The use of workflow technology in medicine and medical imaging in particular is limited. In this article, we describe the application of a workflow engine to improve workflow in a radiology department. We implemented a DICOM-enabled workflow engine system in our department. We designed it in a way to allow for scalability, reliability, and flexibility. We implemented several workflows, including one that re...

  3. CSP for Executable Scientific Workflows

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard

    can usually benefit performance-wise from both multiprocessing, cluster and grid environments. PyCSP is an implementation of Communicating Sequential Processes (CSP) for the Python programming language and takes advantage of CSP's formal and verifiable approach to controlling concurrency and the...... readability of Python source code. Python is a popular programming language in the scientific community, with many scientific libraries (modules) and simple integration to external languages. This thesis presents a PyCSP extended with many new features and a more robust implementation to allow scientific...... is demonstrated through examples. By providing a robust library for organising scientific workflows in a Python application I hope to inspire scientific users to adopt PyCSP. As a proof-of-concept this thesis demonstrates three scientific applications: kNN, stochastic minimum search and McStas to...
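
    A CSP-flavoured sketch of the producer/consumer style the thesis builds on is given below; to avoid guessing PyCSP's exact API, it uses only the Python standard library (threads and a bounded queue standing in for PyCSP's processes and channels).

```python
# CSP-style communicating processes with the standard library (sketch).
import threading, queue

def producer(chan):
    for i in range(5):
        chan.put(i * i)          # "write" on the channel
    chan.put(None)               # poison pill ends the consumer

def consumer(chan, results):
    while (item := chan.get()) is not None:
        results.append(item)

chan, results = queue.Queue(maxsize=1), []   # maxsize=1 ~ synchronous channel
threads = [threading.Thread(target=producer, args=(chan,)),
           threading.Thread(target=consumer, args=(chan, results))]
for t in threads: t.start()
for t in threads: t.join()
print(results)                   # [0, 1, 4, 9, 16]
```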

  4. Provenance-Based Debugging and Drill-Down in Data-Oriented Workflows

    KAUST Repository

    Ikeda, Robert

    2012-04-01

    Panda (for Provenance and Data) is a system that supports the creation and execution of data-oriented workflows, with automatic provenance generation and built-in provenance tracing operations. Workflows in Panda are arbitrary acyclic graphs containing both relational (SQL) processing nodes and opaque processing nodes programmed in Python. For both types of nodes, Panda generates logical provenance - provenance information stored at the processing-node level - and uses the generated provenance to support record-level backward tracing and forward tracing operations. In our demonstration we use Panda to integrate, process, and analyze actual education data from multiple sources. We specifically demonstrate how Panda's provenance generation and tracing capabilities can be very useful for workflow debugging, and for drilling down on specific results of interest. © 2012 IEEE.
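
    The record-level backward tracing mentioned here can be illustrated with a tiny two-node workflow; this is a sketch of the general idea, not Panda's API, and the node names and record identifiers are invented.

```python
# Backward tracing: each node records, for every output record, the input
# records it was derived from; tracing walks these mappings to the sources.
provenance = {
    "filter": {"out1": ["in1"], "out2": ["in3"]},   # node-level provenance
    "enrich": {"final1": ["out1", "out2"]},
}
order = ["filter", "enrich"]                         # topological order

def trace_back(record, node_index):
    """Return the source records that a given record derives from."""
    if node_index < 0:
        return {record}
    mapping = provenance[order[node_index]]
    sources = set()
    for parent in mapping.get(record, [record]):     # pass-through default
        sources |= trace_back(parent, node_index - 1)
    return sources

print(trace_back("final1", len(order) - 1))          # {'in1', 'in3'}
```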

  5. Taverna Workflows in the Virtual Observatory

    Science.gov (United States)

    Benson, K.; Cecconi, B.

    2015-12-01

    Taverna workflows used in the Virtual Observatory. Planetary and Solar applications developed over the last decade generate data at a previously unimaginable scale. One of these programmes, which builds on the strengths of IDIS of Europlanet FP7, is the Virtual European Solar and Planetary Access (VESPA). With VESPA, more data will be distributed and the connectivity of tools and infrastructure will improve. VESPA enables growth of the user and provider community. However, the challenge of connectivity persists throughout applications and data services. VESPA calls are formed in part by tools and interaction services. One such tool and interaction service is the Taverna workflow management system. Workflows address the challenges of data interconnectivity by establishing pipelines to services offered by other data streaming services. Workflows offer the capability to cross domains and overcome interoperability issues. Furthermore, Taverna offers sharing of workflows; the academic community 'myExperiment', a social site for scientists, supports search and opens access to pre-existing workflows. This presentation focuses on cross-domain workflows, including use of the infrastructure set up with the HELIO, EUROPLANET and VAMDC projects. A hands-on demonstration and an opportunity to join the community discussion will make the presentation more interactive.

  6. The design of cloud workflow systems

    CERN Document Server

    Liu, Xiao; Zhang, Gaofeng

    2011-01-01

    Cloud computing is the latest market-oriented computing paradigm which brings software design and development into a new era characterized by "XaaS", i.e. everything as a service. Cloud workflows, as typical software applications in the cloud, are composed of a set of partially ordered cloud software services to achieve specific goals. However, due to the low QoS (quality of service) nature of the cloud environment, the design of workflow systems in the cloud becomes a challenging issue for the delivery of high quality cloud workflow applications. To address such an issue, this book presents

  7. Application of workflow technology for workshop scheduling

    Institute of Scientific and Technical Information of China (English)

    ZHOU Wan-kun; ZHU Jian-ying

    2005-01-01

    This paper attempts to address the complexity of scheduling problems and meet the requirements of the ever-changing manufacturing environment. A new Workflow-Based Scheduling System (WBSS) is proposed. The integration of a Workflow Management System (WfMS) and a rule-based scheduler provides an effective way of generating a task-sheet according to the states of the system and the scheduled objects. First, the definition of the workflow model for scheduling is proposed, followed by the architecture and mechanism of the proposed WBSS. At last, an application is given to show how the established system works.

  8. P-Graph-based Workflow Modelling

    Directory of Open Access Journals (Sweden)

    József Tick

    2007-03-01

    Full Text Available Workflow modelling has been successfully introduced and implemented in several application fields. Therefore, its significance has increased dramatically. Several workflow modelling techniques have been published so far, out of which quite a number are widespread applications. For instance, Petri-net-based modelling has become popular partly due to its graphical design and partly due to its correct mathematical background. Workflow modelling based on the Unified Modelling Language is important because of its practical usage. This paper introduces and examines the workflow modelling technique based on the Process-graph as a possible new solution next to the already existing modelling techniques.

  9. Towards a Pattern-based Automatic Generation of Logical Specifications for Software Models

    OpenAIRE

    Klimek, Radoslaw

    2014-01-01

    The work relates to the automatic generation of logical specifications, considered as sets of temporal logic formulas, extracted directly from developed software models. The extraction process is based on the assumption that the whole developed model is structured using only predefined workflow patterns. A method of automatic transformation of workflow patterns to logical specifications is proposed. Applying the presented concepts enables bridging the gap between the benefits of deductive rea...

  10. Reactive workflows for visual analytics

    OpenAIRE

    Manolescu, Ioana; Khemiri, Wael; Benzaken, Veronique; Fekete, Jean-Daniel

    2009-01-01

    The increasing amounts of electronic data of all forms, produced by humans (e.g. Web pages, structured content such as Wikipedia or the blogosphere etc.) and/or automatic tools (loggers, sensors, Web services, scientific tools etc.) leads to a situation of unprecedented potential for extracting new knowledge, finding new correlations etc. Typically, such analysis is performed by using data visualization techniques, and data analysis programs, which perform potentially complex and/or time-cons...

  11. A practical data processing workflow for multi-OMICS projects.

    Science.gov (United States)

    Kohl, Michael; Megger, Dominik A; Trippler, Martin; Meckel, Hagen; Ahrens, Maike; Bracht, Thilo; Weber, Frank; Hoffmann, Andreas-Claudius; Baba, Hideo A; Sitek, Barbara; Schlaak, Jörg F; Meyer, Helmut E; Stephan, Christian; Eisenacher, Martin

    2014-01-01

    Multi-OMICS approaches aim at the integration of quantitative data obtained for different biological molecules in order to understand their interrelation and the functioning of larger systems. This paper deals with several data integration and data processing issues that frequently occur within this context. To this end, the data processing workflow within the PROFILE project is presented, a multi-OMICS project that aims at the identification of novel biomarkers and the development of new therapeutic targets for seven important liver diseases. Furthermore, a software tool called CrossPlatformCommander is sketched, which facilitates several steps of the proposed workflow in a semi-automatic manner. Application of the software is presented for the detection of novel biomarkers, their ranking and annotation with existing knowledge, using the example of corresponding Transcriptomics and Proteomics data sets obtained from patients suffering from hepatocellular carcinoma. Additionally, a linear regression analysis of Transcriptomics vs. Proteomics data is presented and its performance assessed. It was shown that a simple linear regression analysis is not sufficient for capturing profound relations between Transcriptomics and Proteomics data, and that implementation and evaluation of alternative statistical approaches are needed. Additionally, the integration of multivariate variable selection and classification approaches is intended for further development of the software. Although this paper focuses only on the combination of data obtained from quantitative Proteomics and Transcriptomics experiments, several approaches and data integration steps are also applicable to other OMICS technologies. Keeping specific restrictions in mind, the suggested workflow (or at least parts of it) may be used as a template for similar projects that make use of different high-throughput techniques. This article is part of a Special Issue entitled: Computational Proteomics in the Post

  12. Security aspects in teleradiology workflow

    Science.gov (United States)

    Soegner, Peter I.; Helweg, Gernot; Holzer, Heimo; zur Nedden, Dieter

    2000-05-01

    The medicolegal necessity of privacy, security and confidentiality was the aim of the attempt to develop a secure teleradiology workflow between the telepartners -- the radiologist and the referring physician. To avoid a lack of data protection and data security, we introduced biometric fingerprint scanners in combination with smart cards to identify the teleradiology partners, and communicated over an encrypted TCP/IP satellite link between Innsbruck and Reutte. We used an asymmetric cryptography method to guarantee authentication, integrity of the data packages and confidentiality of the medical data. It was necessary to use a biometric feature to avoid cases of mistaken identity of persons who wanted access to the system. Only an invariable electronic identification allowed legal liability for the final report, and only a secure data connection allowed the exchange of sensitive medical data between different partners of Health Care Networks. In our study we selected the user-friendly combination of a smart card and a biometric fingerprint technique, called Skymed™ Double Guard Secure Keyboard (Agfa-Gevaert), to confirm identities and log into the imaging workstations and the electronic patient record. We examined the interoperability of the software used with the existing platforms. Only the WIN-XX operating systems could be protected at the time of our study.
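
    The roles named above (authentication, integrity, confidentiality via asymmetric cryptography) can be illustrated with a minimal sign-then-encrypt sketch using the Python `cryptography` package; this is not the Skymed/teleradiology stack, and a production system would wrap a symmetric session key rather than encrypt the payload directly.

```python
# Sign for authentication/integrity, encrypt for confidentiality (sketch).
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

sender = rsa.generate_private_key(public_exponent=65537, key_size=2048)
receiver = rsa.generate_private_key(public_exponent=65537, key_size=2048)

report = b"final radiology report"

# Authentication + integrity: the sender signs the report.
signature = sender.sign(
    report,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256())

# Confidentiality: encrypt to the receiver's public key (RSA-OAEP).
ciphertext = receiver.public_key().encrypt(
    report,
    padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None))

plaintext = receiver.decrypt(ciphertext, padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(), label=None))
sender.public_key().verify(signature, plaintext,    # raises if tampered
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256())
print("verified:", plaintext == report)
```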

  13. Workflow Optimization in Vertebrobasilar Occlusion

    International Nuclear Information System (INIS)

    Objective: In vertebrobasilar occlusion, rapid recanalization is the only substantial means to improve the prognosis. We introduced a standard operating procedure (SOP) for interventional therapy to analyze the effects on interdisciplinary time management. Methods: Intrahospital time periods between hospital admission and neuroradiological intervention were retrospectively analyzed, together with the patients’ outcome, before (n = 18) and after (n = 20) implementation of the SOP. Results: After implementation of the SOP, we observed statistically significant improvement of postinterventional patient neurological status (p = 0.017). In addition, we found a decrease of 5:33 h for the mean time period from hospital admission until neuroradiological intervention. The recanalization rate increased from 72.2% to 80% after implementation of the SOP. Conclusion: Our results underscore the relevance of SOP implementation and analysis of time management for clinical workflow optimization. Both may trigger awareness for the need of efficient interdisciplinary time management. This could be an explanation for the decreased time periods and improved postinterventional patient status after SOP implementation.

  14. Fluent Logic Workflow Analyser: A Tool for The Verification of Workflow Properties

    OpenAIRE

    Regis, Germán; Villar, Fernando; Ricci, Nicolás

    2014-01-01

    In this paper we present the design and implementation, as well as a use case, of a tool for workflow analysis. The tool provides an assistant for the specification of properties of a workflow model. The specification language for property description is Fluent Linear Time Temporal Logic. Fluents provide an adequate flexibility for capturing properties of workflows. Both the model and the properties are encoded, in an automated way, as Labelled Transition Systems, and the analysis is reduced ...

  15. Workflow Management in Occupational Medicine Using the Simple Workflow Access Protocol (SWAP)

    OpenAIRE

    McClay, James

    2001-01-01

    There are over nine million reported work related injuries a year administered through the workers compensation system. Workers compensation requires extensive communication with employers and payers. Workflow automation tools exist in segments of the industry but there isn't a common communication system. The Internet Engineering Task Force (IETF) Working Group on Simple Workflow Access Protocol (SWAP) is addressing the specifications for workflow across the Internet. We are adapting these p...

  16. Workflow Based Software Development Environment Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed research is to investigate and develop a workflow based tool, the Software Developers Assistant, to facilitate the collaboration between...

  17. Using Technology to Facilitate Technical Services Workflows

    OpenAIRE

    Getz, Kelli; Castro, Jeanne M

    2013-01-01

    Managing workflows in a complex and evolving environment is a challenge for technical services librarians. By taking advantage of technology, technical services librarians at the University of Houston Libraries currently develop and revise workflows using tools such as Google Docs, Microsoft Outlook Tasks, and Drupal-based forms. By embracing technology and harnessing the power of these tools, the UH librarians are able to successfully pair effective communication with a high level of transpa...

  18. P-Graph-based Workflow Modelling

    OpenAIRE

    József Tick

    2007-01-01

    Workflow modelling has been successfully introduced and implemented in several application fields. Therefore, its significance has increased dramatically. Several workflow modelling techniques have been published so far, out of which quite a number are widespread applications. For instance, Petri-net-based modelling has become popular partly due to its graphical design and partly due to its correct mathematical background. Workflow modelling based on the Unified Modelling Language is important...

  19. Computer-Assisted Scientific Workflow Design

    OpenAIRE

    Cerezo N.; Montagnat J.; Blay-Fornarino M.

    2013-01-01

    Workflows are increasingly adopted to describe large-scale data- and compute-intensive processes that can take advantage of today's Distributed Computing Infrastructures. Still, most Scientific Workflow formalisms are notoriously difficult to fully exploit, as they entangle the description of scientific processes and their implementation, blurring the lines between what is done and how it is done as well as between what is and what is not infrastructure-dependent. This work addresses the prob...

  20. Proof-of-concept engineering workflow demonstrator

    OpenAIRE

    Molinari, M; Cox, SJ; Takeda, K.

    2006-01-01

    When Microsoft needed a proof-of-concept implementation of bespoke engineering workflow software for their customer, BAE Systems, it called on the software engineering skills and experience of the Microsoft Institute for High Performance Computing. BAE Systems was looking into converting their in-house SOLAR software suite to run on the MS Compute Cluster Server product with 64-bit MPI support in conjunction with an extended Windows Workflow environment for use by their engineers

  1. OBJECTFLOW: a modular workflow management system

    OpenAIRE

    Camilo, Ocampo; Botella López, Pere

    1997-01-01

    Workflow Management (WM) is an emerging area that involves cross-disciplinary fields such as databases, software engineering, business management, and human coordination. A Workflow Management System (WMS) is a software tool to automate Business Processes (BPs) and coordinate the people of an organization. BPs are sets of linked procedures focused on reaching a business goal, normally following a set of procedural rules. This work presents the OBJECTFLOW(2) project, result of ...

  2. Multilevel Workflow System in the ATLAS Experiment

    CERN Document Server

    Borodin, M; The ATLAS collaboration; Golubkov, D; Klimentov, A; Maeno, T; Vaniachine, A

    2015-01-01

    The ATLAS experiment is scaling up Big Data processing for the next LHC run using a multilevel workflow system comprised of many layers. In Big Data processing ATLAS deals with datasets, not individual files. Similarly a task (comprised of many jobs) has become a unit of the ATLAS workflow in distributed computing, with about 0.8M tasks processed per year. In order to manage the diversity of LHC physics (exceeding 35K physics samples per year), the individual data processing tasks are organized into workflows. For example, the Monte Carlo workflow is composed of many steps: generate or configure hard-processes, hadronize signal and minimum-bias (pileup) events, simulate energy deposition in the ATLAS detector, digitize electronics response, simulate triggers, reconstruct data, convert the reconstructed data into ROOT ntuples for physics analysis, etc. Outputs are merged and/or filtered as necessary to optimize the chain. The bi-level workflow manager - ProdSys2 - generates actual workflow tasks and their jobs...

  3. Multilevel Workflow System in the ATLAS Experiment

    CERN Document Server

    Borodin, M; The ATLAS collaboration; De, K; Golubkov, D; Klimentov, A; Maeno, T; Vaniachine, A

    2014-01-01

    The ATLAS experiment is scaling up Big Data processing for the next LHC run using a multilevel workflow system comprised of many layers. In Big Data processing ATLAS deals with datasets, not individual files. Similarly a task (comprised of many jobs) has become a unit of the ATLAS workflow in distributed computing, with about 0.8M tasks processed per year. In order to manage the diversity of LHC physics (exceeding 35K physics samples per year), the individual data processing tasks are organized into workflows. For example, the Monte Carlo workflow is composed of many steps: generate or configure hard-processes, hadronize signal and minimum-bias (pileup) events, simulate energy deposition in the ATLAS detector, digitize electronics response, simulate triggers, reconstruct data, convert the reconstructed data into ROOT ntuples for physics analysis, etc. Outputs are merged and/or filtered as necessary to optimize the chain. The bi-level workflow manager - ProdSys2 - generates actual workflow tasks and their jobs...

  4. Automatic sequences

    CERN Document Server

    Haeseler, Friedrich

    2003-01-01

    Automatic sequences are sequences which are produced by a finite automaton. Although they are not random, they may look random. They are complicated, in the sense of not being ultimately periodic; they may also look rather complicated, in the sense that it may not be easy to name the rule by which the sequence is generated, although such a rule exists. The concept of automatic sequences has applications in algebra, number theory, finite automata and formal languages, and combinatorics on words. The text deals with different aspects of automatic sequences, in particular: a general introduction to automatic sequences; the basic (combinatorial) properties of automatic sequences; the algebraic approach to automatic sequences; and geometric objects related to automatic sequences.
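
    A standard concrete example is the Thue-Morse sequence, a classic 2-automatic sequence: t(n) is the output of a two-state automaton reading the binary digits of n, equivalently the parity of the number of 1-bits in n. A minimal sketch:

```python
# The Thue-Morse sequence via its generating finite automaton.
def thue_morse(n):
    state = 0
    for bit in bin(n)[2:]:           # feed binary digits to the automaton
        if bit == "1":
            state ^= 1               # toggle state on every 1-digit
    return state

print([thue_morse(n) for n in range(16)])
# [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0]
```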

  5. E-BioFlow: Different Perspectives on Scientific Workflows

    OpenAIRE

    Wassink, I.; Rauwerda, H.; Vet, van der, Paul E.; Breit, T.; Nijholt, A.; Elloumi, M.; Küng, J.; Linial, M.; Murphy, R F; Schneider, K.; Toma, C

    2008-01-01

    We introduce a new type of workflow design system called e-BioFlow and illustrate it by means of a simple sequence alignment workflow. E-BioFlow, intended to model advanced scientific workflows, enables the user to model a workflow from three different but strongly coupled perspectives: the control flow perspective, the data flow perspective, and the resource perspective. All three perspectives are of equal importance, but workflow designers from different domains prefer different perspective...

  6. Towards Automated Design, Analysis and Optimization of Declarative Curation Workflows

    OpenAIRE

    Tianhong Song; Sven Köhler; Bertram Ludäscher; James Hanken; Maureen Kelly; David Lowery; Macklin, James A.; Morris, Paul J.; Morris, Robert A.

    2014-01-01

    Data curation is increasingly important. Our previous work on a Kepler curation package has demonstrated advantages that come from automating data curation pipelines by using workflow systems. However, manually designed curation workflows can be error-prone and inefficient due to a lack of user understanding of the workflow system, misuse of actors, or human error. Correcting problematic workflows is often very time-consuming. A more proactive workflow system can help users avoid such pitfal...

  7. The demand for consistent web-based workflow editors

    OpenAIRE

    Gesing, Sandra; Atkinson, Malcolm; Klampanos, Iraklis; Galea, Michelle; Berthold, Michael; Barbera, Robert; Scardaci, Diego; Terstyanszky, Gabor; Kiss, Tamas; Kacsuk, Peter

    2016-01-01

    This paper identifies the high value to researchers in many disciplines of having web-based graphical editors for scientific workflows and draws attention to two technological transitions: good quality editors can now run in a browser and workflow enactment systems are emerging that manage multiple workflow languages and support multi-lingual workflows. We contend that this provides a unique opportunity to introduce multi-lingual graphical workflow editors which in turn would yield substantia...

  8. SHIWA Services for Workflow Creation and Sharing in Hydrometeorolog

    Science.gov (United States)

    Terstyanszky, Gabor; Kiss, Tamas; Kacsuk, Peter; Sipos, Gergely

    2014-05-01

    Researchers want to run scientific experiments on Distributed Computing Infrastructures (DCIs) to access large pools of resources and services. Running these experiments requires specific expertise that they may not have. Workflows can hide resources and services behind a virtualisation layer, providing a user interface that researchers can use. There are many scientific workflow systems, but they are not interoperable. Learning a workflow system and creating workflows may require significant effort; given this effort, it is not reasonable to expect researchers to learn a new workflow system whenever they want to run workflows developed in another one. Overcoming this requires workflow interoperability solutions that allow workflow sharing. The FP7 'Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs' (SHIWA) project developed the Coarse-Grained Interoperability concept (CGI). It enables recycling and sharing workflows of different workflow systems and executing them on different DCIs. SHIWA developed the SHIWA Simulation Platform (SSP) to implement the CGI concept, integrating three major components: the SHIWA Science Gateway, the workflow engines supported by the CGI concept, and the DCI resources where workflows are executed. The science gateway contains a portal, a submission service, a workflow repository and a proxy server to support the whole workflow life-cycle. The SHIWA Portal allows workflow creation, configuration, execution and monitoring through a Graphical User Interface, using the WS-PGRADE workflow system as the host workflow system. The SHIWA Repository stores the formal description of workflows and workflow engines plus the executables and data needed to execute them. It offers a wide range of browse and search operations. To support non-native workflow execution, the SHIWA Submission Service imports the workflow and workflow engine from the SHIWA Repository. This service either invokes locally or remotely

  9. YesWorkflow: A User-Oriented, Language-Independent Tool for Recovering Workflow Information from Scripts

    OpenAIRE

    Timothy McPhillips; Tianhong Song; Tyler Kolisnik; Steve Aulenbach; Khalid Belhajjame; R Kyle Bocinsky; Yang Cao; James Cheney; Fernando Chirigati; Saumen Dey; Juliana Freire; Christopher Jones; James Hanken; Kintigh, Keith W.; Kohler, Timothy A.

    2015-01-01

    Scientific workflow management systems offer features for composing complex computational pipelines from modular building blocks, for executing the resulting automated workflows, and for recording the provenance of data products resulting from workflow runs. Despite the advantages such features provide, many automated workflows continue to be implemented and executed outside of scientific workflow systems due to the convenience and familiarity of scripting languages (such as Perl, Python, R, ...

  10. The PBase Scientific Workflow Provenance Repository

    Directory of Open Access Journals (Sweden)

    Víctor Cuevas-Vicenttín

    2014-10-01

    Full Text Available Scientific workflows and their supporting systems are becoming increasingly popular for compute-intensive and data-intensive scientific experiments. The advantages scientific workflows offer include rapid and easy workflow design, software and data reuse, scalable execution, sharing and collaboration, and other advantages that altogether facilitate “reproducible science”. In this context, provenance – information about the origin, context, derivation, ownership, or history of some artifact – plays a key role, since scientists are interested in examining and auditing the results of scientific experiments. However, in order to perform such analyses on scientific results as part of extended research collaborations, an adequate environment and tools are required. Concretely, the need arises for a repository that will facilitate the sharing of scientific workflows and their associated execution traces in an interoperable manner, also enabling querying and visualization. Furthermore, such functionality should be supported while taking performance and scalability into account. With this purpose in mind, we introduce PBase: a scientific workflow provenance repository implementing the ProvONE proposed standard, which extends the emerging W3C PROV standard for provenance data with workflow specific concepts. PBase is built on the Neo4j graph database, thus offering capabilities such as declarative and efficient querying. Our experiences demonstrate the power gained by supporting various types of queries for provenance data. In addition, PBase is equipped with a user friendly interface tailored for the visualization of scientific workflow provenance data, making the specification of queries and the interpretation of their results easier and more effective.
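

    Since PBase is built on Neo4j, provenance questions can be posed as declarative graph queries. The sketch below uses the official Neo4j Python driver; the node labels, relationship types, credentials, and identifiers are assumptions for illustration, not PBase's actual ProvONE schema.

```python
# Querying workflow provenance stored in Neo4j (sketch).
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687",
                              auth=("neo4j", "secret"))

# Which execution step generated a given data item, and what did it consume?
query = """
MATCH (out:Data {id: $data_id})<-[:GENERATED]-(p:ProcessExec)-[:USED]->(src:Data)
RETURN p.name AS step, src.id AS input
"""

with driver.session() as session:
    for record in session.run(query, data_id="result-42"):
        print(record["step"], "consumed", record["input"])
driver.close()
```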

  11. Workflow reengineering: a methodology for business process reengineering with workflow management technology

    OpenAIRE

    Bitzer, Sharon Marie.

    1995-01-01

    All organizations, both private and public, must improve their business practices to survive in today's volatile and highly competitive marketplace. This thesis overviews business process reengineering principles, and examines four methodologies for its accomplishment. Based on existing approaches, the thesis develops a new reengineering procedure, called the Workflow Reengineering Methodology. This methodology uses workflow automation as an enabler for efficiently and eff...

  12. Progress in digital color workflow understanding in the International Color Consortium (ICC) Workflow WG

    Science.gov (United States)

    McCarthy, Ann

    2006-01-01

    The ICC Workflow WG serves as the bridge between ICC color management technologies and use of those technologies in real world color production applications. ICC color management is applicable to and is used in a wide range of color systems, from highly specialized digital cinema color special effects to high volume publications printing to home photography. The ICC Workflow WG works to align ICC technologies so that the color management needs of these diverse use case systems are addressed in an open, platform independent manner. This report provides a high level summary of the ICC Workflow WG objectives and work to date, focusing on the ways in which workflow can impact image quality and color systems performance. The 'ICC Workflow Primitives' and 'ICC Workflow Patterns and Dimensions' workflow models are covered in some detail. Consider the questions, "How much of dissatisfaction with color management today is the result of 'the wrong color transformation at the wrong time' and 'I can't get to the right conversion at the right point in my work process'?" Put another way, consider how image quality through a workflow can be negatively affected when the coordination and control level of the color management system is not sufficient.

  13. Automatic development of normal zone in composite MgB2/CuNi wires with different diameters

    International Nuclear Information System (INIS)

    One of the promising applications with superconducting technology for hydrogen utilization is a sensor with a magnesium-diboride (MgB2) superconductor to detect the position of boundary between the liquid hydrogen and the evaporated gas stored in a Dewar vessel. In our previous experiment for the level sensor, the normal zone has been automatically developed and therefore any energy input with the heater has not been required for normal operation. Although the physical mechanism for such a property of the MgB2 wire has not been clarified yet, the deliberate application might lead to the realization of a simpler superconducting level sensor without heater system. In the present study, the automatic development of normal zone with increasing a transport current is evaluated for samples consisting of three kinds of MgB2 wires with CuNi sheath and different diameters immersed in liquid helium. The influences of the repeats of current excitation and heat cycle on the normal zone development are discussed experimentally. The aim of this paper is to confirm the suitability of MgB2 wire in a heater free level sensor application. This could lead to even more optimized design of the liquid hydrogen level sensor and the removal of extra heater input.

  14. YesWorkflow: A User-Oriented, Language-Independent Tool for Recovering Workflow Information from Scripts

    Directory of Open Access Journals (Sweden)

    Timothy McPhillips

    2015-02-01

    Full Text Available Scientific workflow management systems offer features for composing complex computational pipelines from modular building blocks, executing the resulting automated workflows, and recording the provenance of data products resulting from workflow runs. Despite the advantages such features provide, many automated workflows continue to be implemented and executed outside of scientific workflow systems due to the convenience and familiarity of scripting languages (such as Perl, Python, R, and MATLAB), and to the high productivity many scientists experience when using these languages. YesWorkflow is a set of software tools that aim to provide such users of scripting languages with many of the benefits of scientific workflow systems. YesWorkflow requires neither the use of a workflow engine nor the overhead of adapting code to run effectively in such a system. Instead, YesWorkflow enables scientists to annotate existing scripts with special comments that reveal the computational modules and dataflows otherwise implicit in these scripts. YesWorkflow tools extract and analyze these comments, represent the scripts in terms of entities based on the typical scientific workflow model, and provide graphical renderings of this workflow-like view of the scripts. Future versions of YesWorkflow will also allow the prospective provenance of the data products of these scripts to be queried in ways similar to those available to users of scientific workflow systems.
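
    The annotation style described above looks roughly like the following sketch; the block and variable names are invented for illustration, and YesWorkflow-style tools parse the @begin/@in/@out/@end comment tags to recover the implicit workflow graph.

```python
# A script annotated with YesWorkflow-style comments (sketch).

# @begin clean_data @in raw_csv @out clean_csv
def clean_data(raw_csv, clean_csv):
    with open(raw_csv) as src, open(clean_csv, "w") as dst:
        for line in src:
            if line.strip():            # drop blank lines
                dst.write(line)
# @end clean_data

# @begin summarize @in clean_csv @out report
def summarize(clean_csv, report):
    with open(clean_csv) as src, open(report, "w") as dst:
        dst.write(f"rows: {sum(1 for _ in src)}\n")
# @end summarize
```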

  15. Wireless remote control clinical image workflow: utilizing a PDA for offsite distribution

    Science.gov (United States)

    Liu, Brent J.; Documet, Luis; Documet, Jorge; Huang, H. K.; Muldoon, Jean

    2004-04-01

    Last year we presented at RSNA an application to perform wireless remote control of PACS image distribution utilizing a handheld device such as a Personal Digital Assistant (PDA). This paper describes the clinical experiences, including workflow scenarios, of implementing the PDA application to route exams from the clinical PACS archive server to various locations for offsite distribution of clinical PACS exams. By utilizing this remote-control application, radiologists can manage image workflow distribution with a single wireless handheld device without impacting their clinical workflow on diagnostic PACS workstations. A PDA application was designed and developed to perform DICOM Query and C-Move requests by a physician from a clinical PACS archive to a CD-burning device for automatic burning of PACS data for distribution offsite. In addition, it was also used for convenient routing of historical PACS exams to the local web server, local workstations, and teleradiology systems. The application was evaluated by radiologists as well as other clinical staff who need to distribute PACS exams to offsite referring physicians' offices and offsite radiologists. An application for image workflow management utilizing wireless technology was implemented in a clinical environment and evaluated. A PDA application was successfully utilized to perform DICOM Query and C-Move requests from the clinical PACS archive to various offsite exam-distribution devices. Clinical staff can utilize the PDA to manage image workflow and PACS exam distribution conveniently for offsite consultations by referring physicians and radiologists. This solution allows radiologists to expand their effectiveness in health-care delivery both within the radiology department and offsite by improving their clinical workflow.
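
    The DICOM retrieve step at the heart of this workflow can be sketched with the modern pynetdicom library (an assumption: the original PDA application predates it); host names, ports, and AE titles below are placeholders.

```python
# Asking a PACS archive to C-MOVE a study to a CD-burning station (sketch).
from pydicom.dataset import Dataset
from pynetdicom import AE
from pynetdicom.sop_class import StudyRootQueryRetrieveInformationModelMove

ae = AE(ae_title="PDA_APP")
ae.add_requested_context(StudyRootQueryRetrieveInformationModelMove)

ds = Dataset()
ds.QueryRetrieveLevel = "STUDY"
ds.StudyInstanceUID = "1.2.840.99999.1"      # study chosen on the handheld

assoc = ae.associate("pacs-archive.example.org", 104)
if assoc.is_established:
    # Route the study to the CD burner's application entity.
    for status, _ in assoc.send_c_move(
            ds, "CDBURNER", StudyRootQueryRetrieveInformationModelMove):
        if status:
            print("C-MOVE status: 0x{0:04X}".format(status.Status))
    assoc.release()
```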

  16. Verification of Timed Healthcare Workflows Using Component Timed-Arc Petri Nets

    DEFF Research Database (Denmark)

    Bertolini, Cristiano; Liu, Zhiming; Srba, Jiri

    support for compositional reasoning. We use the formalism of component-based timed-arc Petri Nets (CTAPN) for modular modelling of collaborative healthcare workflows and demonstrate how the model checker TAPAAL supports the verification of their functional and non-functional requirements. To this end, we...

  17. An Automatic Deployment Scheme for Service Composition Based on ODE

    Institute of Scientific and Technical Information of China (English)

    金仙力; 杨庚

    2012-01-01

    Based on an analysis of the structure of Apache ODE, its deployment process, and the principle of BPEL process execution, this paper proposes an automatic deployment scheme for service compositions in the Apache ODE engine environment. The results of an example test show that this scheme is feasible and effective.
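
    One simple way to automate deployment to Apache ODE is hot deployment: the engine picks up new process bundles (a deploy.xml plus the .bpel and .wsdl files) dropped into its processes directory. The sketch below assumes a default Tomcat layout; the paths and bundle name are placeholders, not from the paper.

```python
# Hot-deploying a BPEL process bundle to Apache ODE (sketch).
import shutil
from pathlib import Path

ODE_PROCESSES = Path("/opt/tomcat/webapps/ode/WEB-INF/processes")

def deploy(bundle_dir: str) -> None:
    bundle = Path(bundle_dir)
    required = ["deploy.xml"]                 # plus the .bpel and .wsdl files
    missing = [f for f in required if not (bundle / f).exists()]
    if missing:
        raise FileNotFoundError(f"bundle incomplete, missing: {missing}")
    target = ODE_PROCESSES / bundle.name
    shutil.copytree(bundle, target)           # ODE detects and deploys it
    print(f"deployed {bundle.name} -> {target}")

deploy("./HelloWorldProcess")
```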

  18. Development and implementation of an automatic integration system for fibre optic sensors in the braiding process with the objective of online-monitoring of composite structures

    Science.gov (United States)

    Hufenbach, W.; Gude, M.; Czulak, A.; Kretschmann, Martin

    2014-04-01

    Increasing economic, political and ecological pressure leads to a steadily rising share of modern processing and manufacturing processes for fibre-reinforced polymers in industrial batch production. Component weights below a level achievable with classic construction materials, which lead to a better energy and cost balance during the product lifetime, justify the higher fabrication costs. However, complex quality control and failure prediction slow down the substitution by composite materials. High-resolution fibre-optic sensors (FOS), due to their small diameter, high measuring-point density and simple handling, show high potential for automated sensor integration in manufacturing processes, and therefore for the online monitoring of composite products manufactured at industrial scale. Integrated sensors can be used to monitor manufacturing processes, part tests, and the component structure during the product life cycle, which allows quality control during production and the optimization of individual manufacturing processes [1;2]. Furthermore, detailed failure analyses lead to an enhanced understanding of the failure processes appearing in composite materials. This leads to a lower number of rejects and to products of higher value and longer product life cycle, whereby costs, material and energy are saved. This work shows an automation approach for FOS integration in the braiding process. For that purpose a braiding wheel has been supplemented with an appliance for automatic sensor application, which has been used to manufacture preforms of high-pressure composite vessels with FOS networks integrated between the fibre layers. All subsequent manufacturing processes (vacuum infiltration, curing) and component tests (quasi-static pressure test, programmed delamination) were monitored with the help of the integrated sensor networks. Keywords: SHM, high-pressure composite vessel, braiding, automated sensor integration, pressure test, quality control, optic

  19. Scientific Workflow Applications on Amazon EC2

    CERN Document Server

    Juve, Gideon; Vahi, Karan; Mehta, Gaurang; Berriman, Bruce; Berman, Benjamin P; Maechling, Phil

    2010-01-01

    The proliferation of commercial cloud computing providers has generated significant interest in the scientific computing community. Much recent research has attempted to determine the benefits and drawbacks of cloud computing for scientific applications. Although clouds have many attractive features, such as virtualization, on-demand provisioning, and "pay as you go" usage-based pricing, it is not clear whether they are able to deliver the performance required for scientific applications at a reasonable price. In this paper we examine the performance and cost of clouds from the perspective of scientific workflow applications. We use three characteristic workflows to compare the performance of a commercial cloud with that of a typical HPC system, and we analyze the various costs associated with running those workflows in the cloud. We find that the performance of clouds is not unreasonable given the hardware resources provided, and that performance comparable to HPC systems can be achieved given similar resour...

  20. Logical provenance in data-oriented workflows?

    KAUST Repository

    Ikeda, R.

    2013-04-01

    We consider the problem of defining, generating, and tracing provenance in data-oriented workflows, in which input data sets are processed by a graph of transformations to produce output results. We first give a new general definition of provenance for general transformations, introducing the notions of correctness, precision, and minimality. We then determine when properties such as correctness and minimality carry over from the individual transformations' provenance to the workflow provenance. We describe a simple logical-provenance specification language consisting of attribute mappings and filters. We provide an algorithm for provenance tracing in workflows where logical provenance for each transformation is specified using our language. We consider logical provenance in the relational setting, observing that for a class of Select-Project-Join (SPJ) transformations, logical provenance specifications encode minimal provenance. We have built a prototype system supporting the features and algorithms presented in the paper, and we report a few preliminary experimental results. © 2013 IEEE.
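
    A tiny encoding of the attribute-mappings-plus-filters idea follows; the concrete syntax (a dict per transformation with a mapping table and a predicate) is invented for illustration and is not the paper's language.

```python
# Spec-driven provenance: which inputs does the spec designate as sources?
spec = {
    "normalize": {
        "mappings": {"gene": "gene"},               # output attr -> input attr
        "filter": lambda in_rec: in_rec["raw"] > 0, # restrict eligible inputs
    }
}

def trace(out_rec, inputs, transformation):
    """Input records designated as provenance of `out_rec` by the spec."""
    s = spec[transformation]
    return [r for r in inputs
            if s["filter"](r)
            and all(out_rec[o] == r[i] for o, i in s["mappings"].items())]

inputs = [{"gene": "BRCA1", "raw": 3.1}, {"gene": "TP53", "raw": 0.4}]
print(trace({"gene": "BRCA1", "score": 0.97}, inputs, "normalize"))
# -> [{'gene': 'BRCA1', 'raw': 3.1}]
```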

  1. NeuroManager: A workflow analysis based simulation management engine for computational neuroscience

    Directory of Open Access Journals (Sweden)

    David Bruce Stockton

    2015-10-01

    Full Text Available We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in Matlab, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in twenty-two stages of simulation submission workflow. The software incorporates progress notification; automatic organization, labeling, and time-stamping of data and results; and integrated access to Matlab's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.

  2. A system for deduction-based formal verification of workflow-oriented software models

    Directory of Open Access Journals (Sweden)

    Klimek Radosław

    2014-12-01

    Full Text Available The work concerns formal verification of workflow-oriented software models using the deductive approach. The formal correctness of a model’s behaviour is considered. Manually building logical specifications, which are regarded as a set of temporal logic formulas, seems to be a significant obstacle for an inexperienced user when applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The process inference is based on the semantic tableaux method, which has some advantages when compared with traditional deduction strategies. The algorithm for automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is a standard and dominant notation for the modeling of business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as a kind of (logical) primitives which enable the transformation of models to temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach has gone some way towards supporting, hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.
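
    To make the pattern-to-formula idea concrete, here is a minimal sketch of how predefined workflow patterns can be treated as logical primitives that emit temporal logic formulas. The pattern encodings below are illustrative only; the paper defines its own pattern semantics for BPMN:

      # Sketch: generating a logical specification from predefined workflow
      # patterns. The formulas are illustrative LTL-style encodings, not
      # the paper's exact pattern definitions.

      def seq(a, b):
          # "a is eventually followed by b"
          return f"G({a} -> F {b})"

      def xor_split(cond, a, b):
          # exclusive choice after cond: some branch eventually runs, never both
          return f"G({cond} -> (F {a} | F {b})) & G(!({a} & {b}))"

      def specification(patterns):
          """Conjoin pattern instances into one logical specification."""
          return " & ".join(patterns)

      spec = specification([
          seq("receive_order", "check_stock"),
          xor_split("check_stock", "ship", "reject"),
      ])
      print(spec)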

  3. Workflow-Based Dynamic Enterprise Modeling

    Institute of Scientific and Technical Information of China (English)

    黄双喜; 范玉顺; 罗海滨; 林慧萍

    2002-01-01

    Traditional systems for enterprise modeling and business process control are often static and cannot adapt to the changing environment. This paper presents a workflow-based method to dynamically execute the enterprise model. This method gives an explicit representation of the business process logic and the relationships between the elements involved in the process. An execution-oriented integrated enterprise modeling system is proposed in combination with other enterprise views. The enterprise model can be established and executed dynamically in the actual environment due to the dynamic properties of the workflow model.

  4. Designing Flexible E-Business Workflow Systems

    Directory of Open Access Journals (Sweden)

    Cătălin Silvestru

    2010-01-01

    Full Text Available In today’s business environment organizations must cope with complex interactions between actors, adapt fast to frequent market changes and be innovative. In this context, integrating knowledge with processes and Business Intelligence is a major step towards improving organization agility. Therefore, traditional environments for workflow design have been adapted to answer the new business models and current requirements in the field of collaborative processes. This paper approaches the design of flexible and dynamic workflow management systems for electronic businesses that can lead to agility.

  5. From Requirements via Colored Workflow Nets to an Implementation in Several Workflow Systems

    DEFF Research Database (Denmark)

    Mans, Ronny S.; van der Aalst, Wil M.P.; Bakker, Piet J.M.;

    2007-01-01

    Care organizations, such as hospitals, need to support complex and dynamic workflows. Moreover, many disciplines are involved. This makes it important to avoid the typical disconnect between requirements and the actual implementation of the system. This paper proposes an approach where an Executable Use Case (EUC) and Colored Workflow Net (CWN) are used to close the gap between the given requirements specification and the realization of these requirements with the help of a workflow system. This paper describes a large case study where the diagnostic trajectory of the gynaecological oncology...

  6. MBAT: A scalable informatics system for unifying digital atlasing workflows

    Directory of Open Access Journals (Sweden)

    Sane Nikhil

    2010-12-01

    Full Text Available Abstract Background Digital atlases provide a common semantic and spatial coordinate system that can be leveraged to compare, contrast, and correlate data from disparate sources. As the quality and amount of biological data continues to advance and grow, searching, referencing, and comparing this data with a researcher's own data is essential. However, the integration process is cumbersome and time-consuming due to misaligned data, implicitly defined associations, and incompatible data sources. This work addresses these challenges by providing a unified and adaptable environment that accelerates the workflow of gathering, aligning, and analyzing the data. Results The MouseBIRN Atlasing Toolkit (MBAT) project was developed as a cross-platform, free open-source application that unifies and accelerates the digital atlas workflow. A tiered, plug-in architecture was designed for the neuroinformatics and genomics goals of the project to provide a modular and extensible design. MBAT provides the ability to use a single query to search and retrieve data from multiple data sources, align image data using the user's preferred registration method, composite data from multiple sources in a common space, and link relevant informatics information to the current view of the data or atlas. The workspaces leverage tool plug-ins to extend and allow future extensions of the basic workspace functionality. A wide variety of tool plug-ins were developed that integrate pre-existing as well as newly created technology into each workspace. Novel atlasing features were also developed, such as supporting multiple label sets, dynamic selection and grouping of labels, and synchronized, context-driven display of ontological data. Conclusions MBAT empowers researchers to discover correlations among disparate data by providing a unified environment for bringing together distributed reference resources, a user's image data, and biological atlases into the same spatial or semantic context

  7. RESTFul based heterogeneous Geoprocessing workflow interoperation for Sensor Web Service

    Science.gov (United States)

    Yang, Chao; Chen, Nengcheng; Di, Liping

    2012-10-01

    Advanced sensors on board satellites offer detailed Earth observations. A workflow is one approach for designing, implementing and constructing a flexible and live link between these sensors' resources and users. It can coordinate, organize and aggregate the distributed sensor Web services to meet the requirement of a complex Earth observation scenario. A RESTFul based workflow interoperation method is proposed to integrate heterogeneous workflows into an interoperable unit. The Atom protocols are applied to describe and manage workflow resources. The XML Process Definition Language (XPDL) and Business Process Execution Language (BPEL) workflow standards are applied to structure a workflow that accesses sensor information and one that processes it separately. Then, a scenario for nitrogen dioxide (NO2) from a volcanic eruption is used to investigate the feasibility of the proposed method. The RESTFul based workflows interoperation system can describe, publish, discover, access and coordinate heterogeneous Geoprocessing workflows.

  8. Creating Bioinformatic Workflows within the BioExtract Server

    Science.gov (United States)

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows generally require access to multiple, distributed data sources and analytic tools. The requisite data sources may include large public data repositories, community...

  9. Evolutionary optimization of production materials workflow processes

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter;

    2014-01-01

    We present an evolutionary optimisation technique for stochastic production processes, which is able to find improved production materials workflow processes with respect to arbitrary combinations of numerical quantities associated with the production process. Working from a core fragment of the ...... where a baked goods company seeks to improve production time while simultaneously minimising the cost and use of resources....

  10. Workflow Automation: A Collective Case Study

    Science.gov (United States)

    Harlan, Jennifer

    2013-01-01

    Knowledge management has proven to be a sustainable competitive advantage for many organizations. Knowledge management systems are abundant, with multiple functionalities. The literature reinforces the use of workflow automation with knowledge management systems to benefit organizations; however, it was not known if process automation yielded…

  11. KDE Bioscience: platform for bioinformatics analysis workflows.

    Science.gov (United States)

    Lu, Qiang; Hao, Pei; Curcin, Vasa; He, Weizhong; Li, Yuan-Yuan; Luo, Qing-Ming; Guo, Yi-Ke; Li, Yi-Xue

    2006-08-01

    Bioinformatics is a dynamic research area in which a large number of algorithms and programs have been developed rapidly and independently without much consideration so far of the need for standardization. The lack of such common standards combined with unfriendly interfaces make it difficult for biologists to learn how to use these tools and to translate the data formats from one to another. Consequently, the construction of an integrative bioinformatics platform to facilitate biologists' research is an urgent and challenging task. KDE Bioscience is a Java-based software platform that collects a variety of bioinformatics tools and provides a workflow mechanism to integrate them. Nucleotide and protein sequences from local flat files, web sites, and relational databases can be entered, annotated, and aligned. Several home-made or third-party viewers are built in to provide visualization of annotations or alignments. KDE Bioscience can also be deployed in client-server mode where simultaneous execution of the same workflow is supported for multiple users. Moreover, workflows can be published as web pages that can be executed from a web browser. The power of KDE Bioscience comes from the integrated algorithms and data sources. With its generic workflow mechanism other novel calculations and simulations can be integrated to augment the current sequence analysis functions. Because of this flexible and extensible architecture, KDE Bioscience makes an ideal integrated informatics environment for future bioinformatics or systems biology research. PMID:16260186

  12. Adaptive workflow simulation of emergency response

    NARCIS (Netherlands)

    Bruinsma, Guido Wybe Jan

    2010-01-01

    Recent incidents and major training exercises in and outside the Netherlands have persistently shown that not having or not sharing information during emergency response are major sources of emergency response inefficiency and error, and affect incident mitigation outcomes through workflow planning

  13. On Secure Workflow Decentralisation on the Internet

    Directory of Open Access Journals (Sweden)

    Petteri Kaskenpalo

    2010-06-01

    Full Text Available Decentralised workflow management systems are a new research area, where most work to-date has focused on the system's overall architecture. As little attention has been given to the security aspects in such systems, we follow a security driven approach, and consider, from the perspective of available security building blocks, how security can be implemented and what new opportunities are presented when empowering the decentralised environment with modern distributed security protocols. Our research is motivated by a more general question of how to combine the positive enablers that email exchange enjoys, with the general benefits of workflow systems, and more specifically with the benefits that can be introduced in a decentralised environment. This aims to equip email users with a set of tools to manage the semantics of a message exchange, contents, participants and their roles in the exchange in an environment that provides inherent assurances of security and privacy. This work is based on a survey of contemporary distributed security protocols, and considers how these protocols could be used in implementing a distributed workflow management system with decentralised control. We review a set of these protocols, focusing on their required message sequences, and discuss how these security protocols provide the foundations for implementing core control-flow, data, and resource patterns in a distributed workflow environment.

  14. Conventions and workflows for using Situs

    International Nuclear Information System (INIS)

    Recent developments of the Situs software suite for multi-scale modeling are reviewed. Typical workflows and conventions encountered during processing of biophysical data from electron microscopy, tomography or small-angle X-ray scattering are described. Situs is a modular program package for the multi-scale modeling of atomic resolution structures and low-resolution biophysical data from electron microscopy, tomography or small-angle X-ray scattering. This article provides an overview of recent developments in the Situs package, with an emphasis on workflows and conventions that are important for practical applications. The modular design of the programs facilitates scripting in the bash shell that allows specific programs to be combined in creative ways that go beyond the original intent of the developers. Several scripting-enabled functionalities, such as flexible transformations of data type, the use of symmetry constraints or the creation of two-dimensional projection images, are described. The processing of low-resolution biophysical maps in such workflows follows not only first principles but often relies on implicit conventions. Situs conventions related to map formats, resolution, correlation functions and feature detection are reviewed and summarized. The compatibility of the Situs workflow with CCP4 conventions and programs is discussed

  15. Soundness of Timed-Arc Workflow Nets

    DEFF Research Database (Denmark)

    Mateo, Jose Antonio; Srba, Jiri; Sørensen, Mathias Grund

    2014-01-01

    . Finally, we demonstrate the usability of our theory on the case studies of a Brake System Control Unit used in aircraft certification, the MPEG2 encoding algorithm, and a blood transfusion workflow. The implementation of the algorithms is freely available as a part of the model checker TAPAAL....

  16. Building Digital Audio Preservation Infrastructure and Workflows

    Science.gov (United States)

    Young, Anjanette; Olivieri, Blynne; Eckler, Karl; Gerontakos, Theodore

    2010-01-01

    In 2009 the University of Washington (UW) Libraries special collections received funding for the digital preservation of its audio indigenous language holdings. The university libraries, where the authors work in various capacities, had begun digitizing image and text collections in 1997. Because of this, at the onset of the project, workflows (a…

  17. Evaluation of Workflow Management Systems - A Meta Model Approach

    OpenAIRE

    Michael Rosemann; Michael zur Muehlen

    1998-01-01

    The automated enactment of processes through the use of workflow management systems enables the outsourcing of the control flow from application systems. By now a large number of systems that follow different workflow paradigms are available. This leads to the problem of selecting the appropriate workflow management system for a given situation. In this paper we outline the benefits of a meta model approach for the evaluation and comparison of different workflow management systems. Af...

  18. A Middleware Independent Grid Workflow Builder for Scientific Applications

    OpenAIRE

    Johnson, David; Meacham, Ken. E; Kornmayer, H.

    2009-01-01

    particular workflow engines built into Grid middleware, or are application specific and are designed to interact with specific software implementations. g-Eclipse is a middleware independent Grid workbench that aims to provide a unified abstraction of the Grid and includes a Grid workflow builder to allow users to author and deploy workflows to the Grid. This paper describes the g-Eclipse Workflow Builder and its implementations for two Grid middlewares, gLite and GRIA, and a case study utili...

  19. From Requirements via Colored Workflow Nets to an Implementation in Several Workflow Systems

    DEFF Research Database (Denmark)

    Mans, Ronny S.; van der Aalst, Willibrordus Martinus Pancratius; Molemann, A.J.;

    2007-01-01

    Care organizations, such as hospitals, need to support complex and dynamic workflows. Moreover, many disciplines are involved. This makes it important to avoid the typical disconnect between requirements and the actual implementation of the system. This paper proposes an approach where an Executable Use Case (EUC) and Colored Workflow Net (CWN) are used to close the gap between the given requirements specification and the realization of these requirements with the help of a workflow system. This paper describes a large case study where the diagnostic trajectory of the...

  20. Understanding dental CAD/CAM for restorations--the digital workflow from a mechanical engineering viewpoint.

    Science.gov (United States)

    Tapie, L; Lebon, N; Mawussi, B; Fron Chabouis, H; Duret, F; Attal, J-P

    2015-01-01

    As digital technology infiltrates every area of daily life, including the field of medicine, so it is increasingly being introduced into dental practice. Apart from chairside practice, computer-aided design/computer-aided manufacturing (CAD/CAM) solutions are available for creating inlays, crowns, fixed partial dentures (FPDs), implant abutments, and other dental prostheses. CAD/CAM dental solutions can be considered a chain of digital devices and software for the almost automatic design and creation of dental restorations. However, dentists who want to use the technology often do not have the time or knowledge to understand it. A basic knowledge of the CAD/CAM digital workflow for dental restorations can help dentists to grasp the technology and purchase a CAD/CAM system that meets the needs of their office. This article provides a computer-science and mechanical-engineering approach to the CAD/CAM digital workflow to help dentists understand the technology. PMID:25911827

  1. Workflows in bioinformatics: meta-analysis and prototype implementation of a workflow generator

    Directory of Open Access Journals (Sweden)

    Thoraval Samuel

    2005-04-01

    Full Text Available Abstract Background Computational methods for problem solving need to interleave information access and algorithm execution in a problem-specific workflow. The structures of these workflows are defined by a scaffold of syntactic, semantic and algebraic objects capable of representing them. Despite the proliferation of GUIs (Graphic User Interfaces) in bioinformatics, only some of them provide workflow capabilities; surprisingly, no meta-analysis of workflow operators and components in bioinformatics has been reported. Results We present a set of syntactic components and algebraic operators capable of representing analytical workflows in bioinformatics. Iteration, recursion, the use of conditional statements, and management of suspend/resume tasks have traditionally been implemented on an ad hoc basis and hard-coded; by having these operators properly defined it is possible to use and parameterize them as generic re-usable components. To illustrate how these operations can be orchestrated, we present GPIPE, a prototype graphic pipeline generator for PISE that allows the definition of a pipeline, parameterization of its component methods, and storage of metadata in XML formats. This implementation goes beyond the macro capacities currently in PISE. As the entire analysis protocol is defined in XML, a complete bioinformatic experiment (linked sets of methods, parameters and results) can be reproduced or shared among users. Availability: http://if-web1.imb.uq.edu.au/Pise/5.a/gpipe.html (interactive, ftp://ftp.pasteur.fr/pub/GenSoft/unix/misc/Pise/ (download. Conclusion From our meta-analysis we have identified syntactic structures and algebraic operators common to many workflows in bioinformatics. The workflow components and algebraic operators can be assimilated into re-usable software components. GPIPE, a prototype implementation of this framework, provides a GUI builder to facilitate the generation of workflows and integration of heterogeneous
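
    The operators identified by the meta-analysis (sequencing, iteration, conditionals) can be modelled as re-usable higher-order components. A minimal sketch under that reading, with invented names rather than GPIPE's actual components:

      # Minimal sketch of re-usable workflow operators (sequence,
      # conditional, iteration) as higher-order functions. Illustrative only.

      def sequence(*steps):
          def run(data):
              for step in steps:
                  data = step(data)
              return data
          return run

      def conditional(predicate, if_true, if_false):
          def run(data):
              return if_true(data) if predicate(data) else if_false(data)
          return run

      def iterate(step, until):
          def run(data):
              while not until(data):
                  data = step(data)
              return data
          return run

      # Toy example: trim a sequence until it is short enough, then label it.
      workflow = sequence(
          iterate(lambda s: s[:-1], until=lambda s: len(s) <= 5),
          conditional(lambda s: "N" in s,
                      lambda s: ("ambiguous", s),
                      lambda s: ("clean", s)),
      )
      print(workflow("ACGTNNAC"))   # -> ('ambiguous', 'ACGTN')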

  2. Sharing of Cluster Resources among Multiple Workflow Applications

    Directory of Open Access Journals (Sweden)

    Uma Boregowda

    2014-04-01

    Full Text Available Many computational solutions can be expressed as workflows. A cluster of processors is a shared resource among several users and hence the need for a scheduler which deals with multi-user jobs presented as workflows. The scheduler must find the number of processors to be allotted for each workflow and schedule tasks on allotted processors. In this work, a new method to find the optimal and maximum number of processors that can be allotted for a workflow is proposed. Regression analysis is used to find the best possible way to share available processors among a suitable number of submitted workflows. An instance of a scheduler is created for each workflow, which schedules tasks on the allotted processors. Towards this end, a new framework to receive online submission of workflows, to allot processors to each workflow and schedule tasks, is proposed and experimented with using a discrete-event based simulator. This space-sharing of processors among multiple workflows shows better performance than the other methods found in the literature. Because of space-sharing, an instance of a scheduler must be used for each workflow within the allotted processors. Since the number of processors for each workflow is known only during runtime, a static schedule cannot be used. Hence a hybrid scheduler which tries to combine the advantages of static and dynamic schedulers is proposed. Thus the proposed framework is a promising solution to multiple-workflow scheduling on clusters
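
    As a toy illustration of space-sharing (not the paper's regression-based method; here the optimal and maximum processor counts are simply given rather than derived), a greedy allotment might look like:

      # Toy sketch of space-sharing a cluster among submitted workflows.
      # Each workflow declares an "optimal" processor count beyond which
      # speedup flattens and a hard maximum; both are invented inputs here.

      def allot(total_procs, workflows):
          """workflows: list of (name, optimal, maximum). Greedy allotment:
          give everyone its optimal count if possible, then spread the rest."""
          allotment = {name: 0 for name, _, _ in workflows}
          remaining = total_procs
          for name, optimal, _ in sorted(workflows, key=lambda w: w[1]):
              share = min(optimal, remaining)
              allotment[name] = share
              remaining -= share
          for name, _, maximum in workflows:   # hand out leftovers up to maximum
              extra = min(maximum - allotment[name], remaining)
              allotment[name] += extra
              remaining -= extra
          return allotment

      print(allot(32, [("wf1", 8, 16), ("wf2", 16, 24), ("wf3", 4, 8)]))
      # -> {'wf1': 12, 'wf2': 16, 'wf3': 4} (one possible space-sharing)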

  3. Agent-Based Workflow Systems in Electronic Distance Education.

    Science.gov (United States)

    Dlodlo, Nomusa; Dlodlo, Joseph B.; Masiye, Bighton S.

    Current workflow systems largely assume a closed network where all the software is available on a homogenous platform and all participants are locally linked together at the same time. The field of Electronic Distance Education (EDE) on the other hand, requires the next-generation workflow that will integrate workflows from a distributed…

  4. An Approach for Automatic Web Services Composition Based on Fluent Calculus

    Institute of Scientific and Technical Information of China (English)

    陈志勇; 李庆忠; 王文明; 崔立真; 丛国进

    2013-01-01

    In recent years, semantics-based Web service composition, especially automated composition methods, has become a popular topic in the research area of Service Computing. This paper identifies a mapping between an OWL-S process ontology and fluent calculus concepts, and presents an algorithm to translate OWL-S service descriptions into an equivalent fluent calculus service specification. On this basis, a novel approach for automatic Web service composition based on the formalism of the fluent calculus is proposed. In this approach, the Web service composition process is viewed as an AI planning problem in the fluent calculus formalism, and the planning capabilities of the fluent calculus are used to automatically generate an abstract composition model from a user's personalized request. The method applies the principle of progression for reasoning about states and actions, which effectively overcomes the low execution efficiency of traditional AI planning algorithms typified by the situation calculus. An experimental prototype system was designed and implemented, and its effectiveness is demonstrated with a travel itinerary planning example. A performance comparison of the proposed BCABFC (Backward-Chaining Algorithm Based on Fluent Calculus) algorithm with similar situation-calculus-based algorithms shows that it performs better.
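
    A rough propositional analogue of the backward-chaining idea can be sketched in a few lines: each service maps a set of input concepts to a set of output concepts, and the planner chains backwards from the goal. This simplification drops the fluent-calculus state reasoning entirely; the service names and concepts are invented:

      # Simplified backward-chaining composition over service signatures,
      # a propositional analogue of planning-based composition. Not the
      # paper's BCABFC algorithm; all names are illustrative.

      SERVICES = {
          "flight_search": ({"city_from", "city_to", "date"}, {"flight"}),
          "hotel_search":  ({"city_to", "date"}, {"hotel"}),
          "trip_planner":  ({"flight", "hotel"}, {"itinerary"}),
      }

      def backward_chain(goal, known, services, plan=None):
          """Return a service sequence producing `goal` from `known`, or None."""
          plan = plan or []
          if goal <= known:
              return plan
          for name, (inputs, outputs) in services.items():
              if name in plan or not (outputs & goal):
                  continue
              subgoal = (goal - outputs) | (inputs - known)
              sub = backward_chain(subgoal, known, services, [name] + plan)
              if sub is not None:
                  return sub
          return None

      print(backward_chain({"itinerary"},
                           {"city_from", "city_to", "date"}, SERVICES))
      # -> ['hotel_search', 'flight_search', 'trip_planner']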

  5. Automatically varying the composition of a mixed refrigerant solution for single mixed refrigerant LNG (liquefied natural gas) process at changing working conditions

    International Nuclear Information System (INIS)

    The SMR (single mixed refrigerant) process is widely used in the small- and medium-scale liquefaction of NG (natural gas). Operating the MR (mixed-refrigerant) process outside of the design specifications is difficult but essential to save energy. Nevertheless, it is difficult to realize because the process needs to alter the working refrigerant composition. To address this challenge, this study investigated the performance diagnosis mechanism for the SMR process. A control strategy was then proposed to control the changes in working refrigerant composition under different working conditions. This strategy separates the working refrigerant flow in the SMR process into three flows through two phase separators before it flows into the cold box. The first liquid flow is rich in the high-temperature component (isopentane). The second liquid flow is rich in the middle-temperature components (ethylene and propane), and the gas flow is rich in the low-temperature components (nitrogen and methane). By adjusting the flow rates, it is easy to decouple the control variables and automate the system. Finally, this approach was validated by process simulation and shown to be highly adaptive and exergy efficient in response to changing working conditions. - Highlights: • The performance diagnosis mechanism of the SMR LNG process is studied. • A measure to automatically change the operation composition as per the working conditions is proposed for the SMR process. • SMR process simulation is performed to verify the validity of the control solution. • The control solution notably improves the energy efficiency of the SMR process at changing working conditions

  6. Directed level graph-based approach to automatic Web services composition

    Institute of Scientific and Technical Information of China (English)

    冯兴杰; 王辉; 许亚娟

    2011-01-01

    To solve the problem of automatic Web services composition with multiple inputs/outputs, an approach based on a directed level graph was proposed. It provides an optimal composition sequence through the following steps: 1) build a directed level graph from the inputs/outputs of the user request; 2) build a complete reduction graph of the directed level graph; 3) search all reachable paths for every node of the complete reduction graph; 4) convert the optimal path for the user request into a services composition sequence. This approach can generate all composition sequences within the fewest steps and an optimal composition sequence according to the quality of service (QoS), thereby satisfying user requests with multiple inputs/outputs. Compared with traditional graph-based approaches, it reduces the search space, avoids cyclic searching, and can be applied to large-scale Web service repositories.
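
    A simplified reading of the level-wise construction can be sketched as forward expansion: each level collects the services whose inputs are already known and which produce something new, until the requested outputs are covered. This is our own toy version, not the paper's complete-reduction-graph algorithm; the services and concepts are invented:

      # Sketch of level-wise composition: expand layer by layer from the
      # request's inputs until its outputs are covered, recording which
      # service first produced each concept.

      def compose(inputs, outputs, services):
          known, producer, levels = set(inputs), {}, []
          while not set(outputs) <= known:
              level = [name for name, (ins, outs) in services.items()
                       if set(ins) <= known and not set(outs) <= known]
              if not level:
                  return None                    # request is unsatisfiable
              for name in level:
                  for concept in services[name][1]:
                      producer.setdefault(concept, name)
                  known |= set(services[name][1])
              levels.append(level)
          return levels, {o: producer[o] for o in outputs}

      services = {
          "geocode":  (["address"], ["lat", "lon"]),
          "weather":  (["lat", "lon"], ["forecast"]),
          "timezone": (["lat", "lon"], ["tz"]),
      }
      print(compose(["address"], ["forecast", "tz"], services))
      # -> ([['geocode'], ['weather', 'timezone']],
      #     {'forecast': 'weather', 'tz': 'timezone'})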

  7. Flexible workflow sharing and execution services for e-scientists

    Science.gov (United States)

    Kacsuk, Péter; Terstyanszky, Gábor; Kiss, Tamas; Sipos, Gergely

    2013-04-01

    The sequence of computational and data manipulation steps required to perform a specific scientific analysis is called a workflow. Workflows that orchestrate data and/or compute intensive applications on Distributed Computing Infrastructures (DCIs) recently became standard tools in e-science. At the same time the broad and fragmented landscape of workflows and DCIs slows down the uptake of workflow-based work. The development, sharing, integration and execution of workflows is still a challenge for many scientists. The FP7 "Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs" (SHIWA) project significantly improved the situation, with a simulation platform that connects different workflow systems, different workflow languages, different DCIs and workflows into a single, interoperable unit. The SHIWA Simulation Platform is a service package, already used by various scientific communities, and used as a tool by the recently started ER-flow FP7 project to expand the use of workflows among European scientists. The presentation will introduce the SHIWA Simulation Platform and the services that ER-flow provides based on the platform to space and earth science researchers. The SHIWA Simulation Platform includes: 1. SHIWA Repository: A database where workflows and meta-data about workflows can be stored. The database is a central repository to discover and share workflows within and among communities. 2. SHIWA Portal: A web portal that is integrated with the SHIWA Repository and includes a workflow executor engine that can orchestrate various types of workflows on various grid and cloud platforms. 3. SHIWA Desktop: A desktop environment that provides similar access capabilities to the SHIWA Portal; however, it runs on the users' desktops/laptops instead of a portal server. 4. Workflow engines: the ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflow engines are already

  8. Solutions for complex, multi data type and multi tool analysis: principles and applications of using workflow and pipelining methods.

    Science.gov (United States)

    Munro, Robin E J; Guo, Yike

    2009-01-01

    Analytical workflow technology, sometimes also called data pipelining, is the fundamental component that provides the scalable analytical middleware that can be used to enable the rapid building and deployment of an analytical application. Analytical workflows enable researchers, analysts and informaticians to integrate and access data and tools from structured and non-structured data sources so that analytics can bridge different silos of information; compose multiple analytical methods and data transformations without coding; rapidly develop applications and solutions by visually constructing analytical workflows that are easy to revise should the requirements change; access domain-specific extensions for specific projects or areas, for example, text extraction, visualisation, reporting, genetics, cheminformatics, bioinformatics and patient-based analytics; automatically deploy workflows directly into web portals and as web services to be part of a service-oriented architecture (SOA). By performing workflow building, using a middleware layer for data integration, it is a relatively simple exercise to visually design an analytical process for data analysis and then publish this as a service to a web browser. All this is encapsulated into what can be referred to as an 'Embedded Analytics' methodology which will be described here with examples covering different scientifically focused data analysis problems. PMID:19597790

  9. geoKepler Workflow Module for Computationally Scalable and Reproducible Geoprocessing and Modeling

    Science.gov (United States)

    Cowart, C.; Block, J.; Crawl, D.; Graham, J.; Gupta, A.; Nguyen, M.; de Callafon, R.; Smarr, L.; Altintas, I.

    2015-12-01

    The NSF-funded WIFIRE project has developed an open-source, online geospatial workflow platform for unifying geoprocessing tools and models for fire and other geospatially dependent modeling applications. It is a product of WIFIRE's objective to build an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. geoKepler includes a set of reusable GIS components, or actors, for the Kepler Scientific Workflow System (https://kepler-project.org). Actors exist for reading and writing GIS data in formats such as Shapefile, GeoJSON, KML, and using OGC web services such as WFS. The actors also allow for calling geoprocessing tools in other packages such as GDAL and GRASS. Kepler integrates functions from multiple platforms and file formats into one framework, thus enabling optimal GIS interoperability, model coupling, and scalability. Products of the GIS actors can be fed directly to models such as FARSITE and WRF. Kepler's ability to schedule and scale processes using Hadoop and Spark also makes geoprocessing ultimately extensible and computationally scalable. The reusable workflows in geoKepler can be made to run automatically when alerted by real-time environmental conditions. Here, we show breakthroughs in the speed of creating complex data for hazard assessments with this platform. We also demonstrate geoKepler workflows that use Data Assimilation to ingest real-time weather data into wildfire simulations, and for data mining techniques to gain insight into environmental conditions affecting fire behavior. Existing machine learning tools and libraries such as R and MLlib are being leveraged for this purpose in Kepler, as well as Kepler's Distributed Data Parallel (DDP) capability to provide a framework for scalable processing. geoKepler workflows can be executed via an iPython notebook as a part of a Jupyter hub at UC San Diego for sharing and reporting of the scientific analysis and results from

  10. SegMine workflows for semantic microarray data analysis in Orange4WS

    Directory of Open Access Journals (Sweden)

    Kulovesi Kimmo

    2011-10-01

    Full Text Available Abstract Background In experimental data analysis, bioinformatics researchers increasingly rely on tools that enable the composition and reuse of scientific workflows. The utility of current bioinformatics workflow environments can be significantly increased by offering advanced data mining services as workflow components. Such services can support, for instance, knowledge discovery from diverse distributed data and knowledge sources (such as GO, KEGG, PubMed, and experimental databases). Specifically, cutting-edge data analysis approaches, such as semantic data mining, link discovery, and visualization, have not yet been made available to researchers investigating complex biological datasets. Results We present a new methodology, SegMine, for semantic analysis of microarray data by exploiting general biological knowledge, and a new workflow environment, Orange4WS, with integrated support for web services in which the SegMine methodology is implemented. The SegMine methodology consists of two main steps. First, the semantic subgroup discovery algorithm is used to construct elaborate rules that identify enriched gene sets. Then, a link discovery service is used for the creation and visualization of new biological hypotheses. The utility of SegMine, implemented as a set of workflows in Orange4WS, is demonstrated in two microarray data analysis applications. In the analysis of senescence in human stem cells, the use of SegMine resulted in three novel research hypotheses that could improve understanding of the underlying mechanisms of senescence and identification of candidate marker genes. Conclusions Compared to the available data analysis systems, SegMine offers improved hypothesis generation and data interpretation for bioinformatics in an easy-to-use integrated workflow environment.

  11. Evaluation of Workflow Management Systems - A Meta Model Approach

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    1998-11-01

    Full Text Available The automated enactment of processes through the use of workflow management systems enables the outsourcing of the control flow from application systems. By now a large number of systems that follow different workflow paradigms are available. This leads to the problem of selecting the appropriate workflow management system for a given situation. In this paper we outline the benefits of a meta model approach for the evaluation and comparison of different workflow management systems. After a general introduction on the topic of meta modeling, the meta models of the workflow management systems WorkParty (Siemens Nixdorf) and FlowMark (IBM) are compared as an example. These product specific meta models can be generalized to meta reference models, which helps to specify a workflow methodology. As an example, an organisational reference meta model is presented, which helps users in specifying their requirements for a workflow management system.

  12. IDD Archival Hardware Architecture and Workflow

    Energy Technology Data Exchange (ETDEWEB)

    Mendonsa, D; Nekoogar, F; Martz, H

    2008-10-09

    This document describes the functionality of every component in the DHS/IDD archival and storage hardware system shown in Fig. 1. The document describes the step-by-step process by which image data is received at LLNL, then processed and made available to authorized personnel and collaborators. Throughout this document, references are made to one of two figures: Fig. 1, describing the elements of the architecture, and Fig. 2, describing the workflow and how the project utilizes the available hardware.

  13. Distributed interoperable workflow support for electronic commerce

    OpenAIRE

    Papazoglou, M.; Jeusfeld, M.A.; Weigand, H.; Jarke, M.

    1998-01-01

    Abstract. This paper describes a flexible distributed transactional workflow environment based on an extensible object-oriented framework built around class libraries, application programming interfaces, and shared services. The purpose of this environment is to support a range of EC-like business activities including the support of financial transactions and electronic contracts. This environment has as its aim to provide key infrastructure services for mediating and monitoring electronic co...

  14. Computing Workflows for Biologists: A Roadmap

    OpenAIRE

    Shade, Ashley; Teal, Tracy K.

    2015-01-01

    Extremely large datasets have become routine in biology. However, performing a computational analysis of a large dataset can be overwhelming, especially for novices. Here, we present a step-by-step guide to computing workflows with the biologist end-user in mind. Starting from a foundation of sound data management practices, we make specific recommendations on how to approach and perform computational analyses of large datasets, with a view to enabling sound, reproducible biological research.

  15. SPECT/CT workflow and imaging protocols

    International Nuclear Information System (INIS)

    Introducing a hybrid imaging method such as single photon emission computed tomography (SPECT)/CT greatly alters the routine in the nuclear medicine department. It requires designing new workflow processes and the revision of the original scheduling process and imaging protocols. In addition, the imaging protocol should be adapted for each individual patient, so that performing CT is fully justified and the CT procedure is fully tailored to address the clinical issue. Such refinements often occur before the procedure is started but may be required at some intermediate stage of the procedure. Furthermore, SPECT/CT leads in many instances to a new partnership with the radiology department. This article presents practical advice and highlights the key clinical elements which need to be considered to help understand the workflow process of SPECT/CT and optimise imaging protocols. The workflow process using SPECT/CT is complex in particular because of its bimodal character, the large spectrum of stakeholders, the multiplicity of their activities at various time points and the need for real-time decision-making. With help from analytical tools developed for quality assessment, the workflow process using SPECT/CT may be separated into related, but independent steps, each with its specific human and material resources to use as inputs or outputs. This helps identify factors that could contribute to failure in routine clinical practice. At each step of the process, practical aspects to optimise imaging procedure and protocols are developed. A decision-making algorithm for justifying each CT indication as well as the appropriateness of each CT protocol is the cornerstone of routine clinical practice using SPECT/CT. In conclusion, implementing hybrid SPECT/CT imaging requires new ways of working. It is highly rewarding from a clinical perspective, but it also proves to be a daily challenge in terms of management. (orig.)

  16. Text mining for the biocuration workflow.

    Science.gov (United States)

    Hirschman, Lynette; Burns, Gully A P C; Krallinger, Martin; Arighi, Cecilia; Cohen, K Bretonnel; Valencia, Alfonso; Wu, Cathy H; Chatr-Aryamontri, Andrew; Dowell, Karen G; Huala, Eva; Lourenço, Anália; Nash, Robert; Veuthey, Anne-Lise; Wiegers, Thomas; Winter, Andrew G

    2012-01-01

    Molecular biology has become heavily dependent on biological knowledge encoded in expert curated biological databases. As the volume of biological literature increases, biocurators need help in keeping up with the literature; (semi-) automated aids for biocuration would seem to be an ideal application for natural language processing and text mining. However, to date, there have been few documented successes for improving biocuration throughput using text mining. Our initial investigations took place for the workshop on 'Text Mining for the BioCuration Workflow' at the third International Biocuration Conference (Berlin, 2009). We interviewed biocurators to obtain workflows from eight biological databases. This initial study revealed high-level commonalities, including (i) selection of documents for curation; (ii) indexing of documents with biologically relevant entities (e.g. genes); and (iii) detailed curation of specific relations (e.g. interactions); however, the detailed workflows also showed many variabilities. Following the workshop, we conducted a survey of biocurators. The survey identified biocurator priorities, including the handling of full text indexed with biological entities and support for the identification and prioritization of documents for curation. It also indicated that two-thirds of the biocuration teams had experimented with text mining and almost half were using text mining at that time. Analysis of our interviews and survey provide a set of requirements for the integration of text mining into the biocuration workflow. These can guide the identification of common needs across curated databases and encourage joint experimentation involving biocurators, text mining developers and the larger biomedical research community. PMID:22513129

  17. ASP, Amalgamation and the Conceptual Blending Workflow

    OpenAIRE

    Eppe, Manfred; Maclean, Ewen; Confalonieri, Roberto; Kutz, Oliver; Schorlemmer, Marco; Plaza, Enric

    2015-01-01

    We present an amalgamation technique used for conceptual blending – a concept invention method that is advocated in cognitive science as a fundamental, and uniquely human engine for creative thinking. Herein, we employ the search capabilities of ASP to find commonalities among input concepts as part of the blending process, and we show how our approach fits within a generalised conceptual blending workflow. Specifically, we orchestrate ASP with imperative programming languages like Python, to...

  18. Case Study: Using Perl and CGI Scripts to Automate a Quality Control Workflow for Scanned Congressional Documents

    OpenAIRE

    Doreva Belfiore

    2012-01-01

    The Law Library Digitization Project of the Rutgers University School of Law in Camden, New Jersey, developed a series of scripts in Perl and CGI that take advantage of the open-source module PerlMagick to automatically review the image quality of scanned government documents. By implementing these procedures, Rutgers was able to reduce staff working hours for document quality control by an estimated 25% compared with the previous manual-only workflow. These scripts can be adapted by novice Perl...
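
    The Rutgers scripts are written in Perl with PerlMagick; purely as an analogous illustration in Python (with Pillow standing in for PerlMagick, and with invented brightness thresholds and paths), an automated blank/dark-page check might look like:

      # Analogous quality-control check in Python/Pillow rather than
      # PerlMagick (illustrative only, not the Rutgers scripts). Flags
      # pages that are nearly blank or unusually dark for human review.
      from PIL import Image, ImageStat   # pip install Pillow

      def qc_report(path, blank_threshold=250, dark_threshold=60):
          img = Image.open(path).convert("L")      # grayscale
          mean = ImageStat.Stat(img).mean[0]       # average pixel brightness
          if mean >= blank_threshold:
              return path, "suspect: nearly blank"
          if mean <= dark_threshold:
              return path, "suspect: too dark"
          return path, "ok"

      if __name__ == "__main__":
          import glob
          for page in sorted(glob.glob("scans/*.tif")):
              print(*qc_report(page))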

  19. Workflow - circulation and processing of accounting documents

    OpenAIRE

    Kovářová, Tereza

    2014-01-01

    The bachelor thesis is focused on the circulation and processing of accounting documents. The first part covers business processes: their definitions, classification, and ways of improving them. The second part describes the issues of workflow, from systems for the automation of business processes to the process of circulation and processing of accounting documents. The main aim of this thesis is to analyze the circulation and processing of accounting documents in the chosen com...

  20. Schedule-Aware Workflow Management Systems

    Science.gov (United States)

    Mans, Ronny S.; Russell, Nick C.; van der Aalst, Wil M. P.; Moleman, Arnold J.; Bakker, Piet J. M.

    Contemporary workflow management systems offer work-items to users through specific work-lists. Users select the work-items they will perform without having a specific schedule in mind. However, in many environments work needs to be scheduled and performed at particular times. For example, in hospitals many work-items are linked to appointments, e.g., a doctor cannot perform surgery without reserving an operating theater and making sure that the patient is present. One of the problems when applying workflow technology in such domains is the lack of calendar-based scheduling support. In this paper, we present an approach that supports the seamless integration of unscheduled (flow) and scheduled (schedule) tasks. Using CPN Tools we have developed a specification and simulation model for schedule-aware workflow management systems. Based on this a system has been realized that uses YAWL, Microsoft Exchange Server 2007, Outlook, and a dedicated scheduling service. The approach is illustrated using a real-life case study at the AMC hospital in the Netherlands. In addition, we elaborate on the experiences obtained when developing and implementing a system of this scale using formal techniques.
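
    The core data-structure idea, a single work-list mixing unscheduled (flow) and scheduled (schedule) work-items, can be sketched minimally. The class below is our own toy, far removed from the YAWL/Exchange realization described above; the item names and dates are invented:

      # Toy sketch of a work-list mixing flow and schedule work-items.
      # Scheduled items sort by appointment time; flow items sort last.
      from dataclasses import dataclass, field
      from datetime import datetime
      from typing import Optional

      @dataclass(order=True)
      class WorkItem:
          sort_key: datetime = field(init=False, repr=False)
          name: str = field(compare=False)
          scheduled_at: Optional[datetime] = field(default=None, compare=False)

          def __post_init__(self):
              self.sort_key = self.scheduled_at or datetime.max

      worklist = [
          WorkItem("review lab results"),                              # flow
          WorkItem("surgery, theatre 2", datetime(2024, 5, 7, 9, 0)),  # schedule
          WorkItem("patient intake", datetime(2024, 5, 7, 8, 15)),     # schedule
      ]
      for item in sorted(worklist):
          when = item.scheduled_at.isoformat(" ") if item.scheduled_at else "anytime"
          print(f"{when:>20}  {item.name}")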

  1. Semi-automatic identification of counterfeit offers in online shopping platforms

    OpenAIRE

    Wartner, Christian; Arnold, Patrick; Rahm, Erhard

    2015-01-01

    Product counterfeiting is a serious problem causing the industry estimated losses of billions of dollars every year. With the increasing spread of e-commerce, the number of counterfeit products sold online has increased substantially. We propose the adoption of a semi-automatic workflow to identify likely counterfeit offers in online platforms and to present these offers to a domain expert for manual verification. The workflow includes steps to generate search queries for relevant product offers,...

  2. Tavaxy: Integrating Taverna and Galaxy workflows with cloud computing support

    Directory of Open Access Journals (Sweden)

    Abouelhoda Mohamed

    2012-05-01

    Full Text Available Abstract Background Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. Results In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Conclusions Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-)workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and

  3. Research on an Intelligent Automatic Turning System

    Directory of Open Access Journals (Sweden)

    Lichong Huang

    2012-12-01

    Full Text Available The equipment manufacturing industry is a strategic industry for a country, and its core is the CNC machine tool. Therefore, enhancing independent research on the relevant technology of CNC machines, especially open CNC systems, is of great significance. This paper presented some key techniques of an Intelligent Automatic Turning System and gave a viable solution for system integration. First of all, the integrated system architecture and the flexible and efficient workflow for performing the intelligent automatic turning process is illustrated. Secondly, innovative methods for workpiece feature recognition and expression and for process planning of the NC machining are put forward. Thirdly, the cutting tool auto-selection and the cutting parameter optimization solution are generated with an integrated inference combining rule-based reasoning and case-based reasoning. Finally, an actual machining case based on the developed intelligent automatic turning system proved that the presented solutions are valid, practical and efficient.
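
    As an illustration of the integrated rule-based plus case-based inference for cutting parameters, consider the following toy sketch (the rules, cases, and numbers are invented; the system's actual knowledge base is far richer):

      # Sketch of integrated rule-based + case-based reasoning for cutting
      # parameter selection. Hard rules take priority; otherwise fall back
      # to the nearest stored case. All values are illustrative.

      CASES = [  # (material, diameter_mm) -> (speed m/min, feed mm/rev)
          (("steel", 40), (180, 0.25)),
          (("aluminium", 40), (400, 0.30)),
          (("steel", 10), (150, 0.10)),
      ]

      def rule_based(material, diameter):
          """Hard rules, e.g. cap the feed on very slender parts."""
          if diameter < 8:
              return 120, 0.05
          return None

      def case_based(material, diameter):
          """Nearest stored case for the same material, by diameter."""
          same = [(abs(d - diameter), p) for (m, d), p in CASES if m == material]
          return min(same)[1] if same else None

      def select_parameters(material, diameter):
          return rule_based(material, diameter) or case_based(material, diameter)

      print(select_parameters("steel", 35))   # -> (180, 0.25), nearest steel case
      print(select_parameters("steel", 6))    # -> (120, 0.05), rule fires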

  4. Analysis of the use of a workflow engine for OTRUM system software

    OpenAIRE

    Altamimi, Mohamed

    2007-01-01

    Workflow engines are attracting more and more attention. Applications based on workflow engine technology are currently developed and deployed by many companies, such as OTRUM Company. In this project, we focus on the analysis and development of an efficient workflow engine for interactive TV. The research project will realize workflow engine solutions based on three choices: a commercial workflow engine, an open-source workflow engine, and a workflow engine implemented ...

  5. A Network Inference Workflow Applied to Virulence-Related Processes in Salmonella typhimurium

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, Ronald C.; Singhal, Mudita; Weller, Jennifer B.; Khoshnevis, Saeed; Shi, Liang; McDermott, Jason E.

    2009-04-20

    Inference of the structure of mRNA transcriptional regulatory networks, protein regulatory or interaction networks, and protein activation/inactivation-based signal transduction networks is a critical task in systems biology. In this article we discuss a workflow for the reconstruction of parts of the transcriptional regulatory network of the pathogenic bacterium Salmonella typhimurium based on the information contained in sets of microarray gene expression data now available for that organism, and describe our results obtained by following this workflow. The primary tool is one of the network inference algorithms deployed in the Software Environment for BIological Network Inference (SEBINI). Specifically, we selected the algorithm called Context Likelihood of Relatedness (CLR), which uses the mutual information contained in the gene expression data to infer regulatory connections. The associated analysis pipeline automatically stores the inferred edges from the CLR runs within SEBINI and, upon request, transfers the inferred edges into either Cytoscape or the plug-in Collective Analysis of Biological Interaction Networks (CABIN) tool for further post-analysis of the inferred regulatory edges. The following article presents the outcome of this workflow, as well as the protocols followed for microarray data collection, data cleansing, and network inference. Our analysis revealed several interesting interactions, functional groups, metabolic pathways, and regulons in S. typhimurium.
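
    The CLR scoring step itself is compact: each pairwise mutual-information value is z-scored against the backgrounds of both genes and the two scores are combined. A minimal numpy sketch, assuming the mutual-information matrix has already been estimated (the estimation step, which SEBINI handles, is where most of the real work lies):

      # Sketch of the CLR scoring step: given a symmetric mutual-information
      # matrix, z-score each pair against both genes' backgrounds and combine.
      import numpy as np

      def clr_scores(mi):
          """mi: symmetric (n x n) mutual-information matrix."""
          mean = mi.mean(axis=1, keepdims=True)
          std = mi.std(axis=1, keepdims=True)
          z = np.maximum((mi - mean) / std, 0)    # per-gene z-scores, floored at 0
          return np.sqrt(z ** 2 + z.T ** 2)       # combine the two contexts

      rng = np.random.default_rng(0)
      toy = rng.random((5, 5))
      toy = (toy + toy.T) / 2                     # symmetric toy MI matrix
      np.fill_diagonal(toy, 0)
      scores = clr_scores(toy)

      # Rank candidate edges by CLR score, highest first:
      i, j = np.triu_indices(5, k=1)
      for score, edge in sorted(zip(scores[i, j], zip(i, j)), reverse=True)[:3]:
          print(edge, round(float(score), 3))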

  6. Wildfire: distributed, Grid-enabled workflow construction and execution

    Directory of Open Access Journals (Sweden)

    Issac Praveen

    2005-03-01

    Full Text Available Abstract Background We observe two trends in bioinformatics: (i) analyses are increasing in complexity, often requiring several applications to be run as a workflow; and (ii) multiple CPU clusters and Grids are available to more scientists. The traditional solution to the problem of running workflows across multiple CPUs required programming, often in a scripting language such as perl. Programming places such solutions beyond the reach of many bioinformatics consumers. Results We present Wildfire, a graphical user interface for constructing and running workflows. Wildfire borrows user interface features from Jemboss and adds a drag-and-drop interface allowing the user to compose EMBOSS (and other) programs into workflows. For execution, Wildfire uses GEL, the underlying workflow execution engine, which can exploit available parallelism on multiple CPU machines including Beowulf-class clusters and Grids. Conclusion Wildfire simplifies the tasks of constructing and executing bioinformatics workflows.

  7. Declarative Modelling and Safe Distribution of Healthcare Workflows

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao; Slaats, Tijs

    2012-01-01

    We present a formal technique for safe distribution of workflow processes described declaratively as Nested Condition Response (NCR) Graphs and apply the technique to a distributed healthcare workflow. Concretely, we provide a method to synthesize from a NCR Graph and any distribution of its even......-organizational case management. The contributions of this paper are to adapt the technique to allow for nested processes and milestones and to apply it to a healthcare workflow identified in a previous field study at Danish hospitals.

  8. UML based modeling of medical applications workflow in maxillofacial surgery

    OpenAIRE

    Toma, M; Busam, A; Ortmaier, T; Raczkowsky, J.; Höpner, C; Marmulla, R.

    2007-01-01

    This paper presents our research in medical workflow modeling for computer- and robot-based surgical intervention in maxillofacial surgery. Our goal is to provide a method for clinical workflow modeling including workflow definition for pre- and intra-operative steps, analysis of new methods for combining conventional surgical procedures with robot- and computer-assisted procedures and facilitate an easy implementation of hard- and software systems.

  9. Text mining meets workflow: linking U-Compare with Taverna

    OpenAIRE

    Kano, Yoshinobu; Dobson, Paul; Nakanishi, Mio; Tsujii, Jun'ichi; Ananiadou, Sophia

    2010-01-01

    Summary: Text mining from the biomedical literature is of increasing importance, yet it is not easy for the bioinformatics community to create and run text mining workflows due to the lack of accessibility and interoperability of the text mining resources. The U-Compare system provides a wide range of bio text mining resources in a highly interoperable workflow environment where workflows can very easily be created, executed, evaluated and visualized without coding. We have linked U-Compare t...

  10. CamBAfx: workflow design, implementation and application for neuroimaging

    OpenAIRE

    Cinly Ooi; Bullmore, Edward T; Alle-Meije Wink; Levent Sendur; Anna Barnes; Sophie Achard; John Aspden; Sanja Abbott; Shigang Yue; Manfred Kitzbichler; David Meunier; Voichita Maxim; Raymond Salvador; Julian Henty; Roger Tait

    2009-01-01

    CamBAfx is a workflow application designed for both researchers who use workflows to process data (consumers) and those who design them (designers). It provides a front-end (user interface) optimized for data processing designed in a way familiar to consumers. The back-end uses a pipeline model to represent workflows since this is a common and useful metaphor used by designers and is easy to manipulate compared to other representations like programming scripts. As an Eclipse Rich Client Platf...

  11. CamBAfx: Workflow Design, Implementation and Application for Neuroimaging

    OpenAIRE

    Ooi, Cinly; Bullmore, Edward T; Wink, Alle-Meije; Sendur, Levent; Barnes, Anna; Achard, Sophie; Aspden, John; Abbott, Sanja; Yue, Shigang; Kitzbichler, Manfred; Meunier, David; Maxim, Voichita; Salvador, Raymond; Henty, Julian; Tait, Roger

    2009-01-01

    CamBAfx is a workflow application designed for both researchers who use workflows to process data (consumers) and those who design them (designers). It provides a front-end (user interface) optimized for data processing designed in a way familiar to consumers. The back-end uses a pipeline model to represent workflows since this is a common and useful metaphor used by designers and is easy to manipulate compared to other representations like programming scripts. As an Eclipse Rich Client Platf...

  12. CPOE/EHR-Driven Healthcare Workflow Generation and Scheduling

    OpenAIRE

    Han, Minmin; Song, Xiping; DeHaan, Jan; Cao, Hui; Kennedy, Rosemary; Gugerty, Brian

    2006-01-01

    Automated healthcare workflow generation and scheduling is an approach to ensure the use of evidence-based protocols. Generating efficient and practical workflows is challenging due to the dynamic nature of healthcare practice and operations. We propose to use Computerized Physician Order Entry (CPOE) and Electronic Health Record (EHR) components to generate workflows (consisting of scheduled work items) to aid healthcare (nursing) operations. Currently, we are prototyping and developing ...

  13. Dynamic Workflows and Advanced Data Management for Problem Solving Environments

    OpenAIRE

    Moisa, Dan

    2004-01-01

    Workflow management in problem solving environments (PSEs) is an emerging topic that aims to combine both data-oriented and execution-oriented views of scientific experiments, and closely integrate the processes underlying the practice of computational science with the software artifacts constituted by the PSE. This thesis presents a workflow management solution called BREW (BetteR Experiments through Workflow management) that provides functionality along four dimensions: components and insta...

  14. Workflow Partitioning and Deployment on the Cloud using Orchestra

    OpenAIRE

    Jaradat, Ward; Dearle, Alan; Barker, Adam

    2014-01-01

    Orchestrating service-oriented workflows is typically based on a design model that routes both data and control through a single point - the centralised workflow engine. This causes scalability problems that include the unnecessary consumption of the network bandwidth, high latency in transmitting data between the services, and performance bottlenecks. These problems are highly prominent when orchestrating workflows that are composed from services dispersed across distant geographical locatio...

  15. A Specification Language for the WIDE Workflow Model

    OpenAIRE

    Chan, Daniel K.C.; Vonk, Jochem; Sánchez, Gabriel; Paul W. P. J. Grefen; Apers, Peter M.G.

    1998-01-01

    This paper presents a workflow specification language developed in the WIDE project. The language provides a rich organisation model, an information model including presentation details, and a sophisticated process model. Workflow application developers should find the language a useful and compact means to capture and investigate design details. Workflow system developers would find the language a good vehicle to study the interaction between different features, as well as to facilitate the ...

  16. Building and Documenting Workflows with Python-Based Snakemake

    OpenAIRE

    Köster, Johannes; Rahmann, Sven

    2012-01-01

    Snakemake is a novel workflow engine with a simple Python-derived workflow definition language and an optimizing execution environment. It is the first system that supports multiple named wildcards (or variables) in the input and output filenames of each rule definition. It also allows one to write human-readable workflows that document themselves. We have found Snakemake especially useful for building high-throughput sequencing data analysis pipelines and present examples from this area. Snakemake e...
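
    To make the named-wildcard feature concrete, here is a minimal rule in Snakemake's Python-derived syntax (the rule name, file paths, and seqtk command are illustrative, not taken from the paper):

        # Snakefile: {sample} and {lane} are two named wildcards; Snakemake binds them
        # by matching a requested output file against the output pattern.
        rule trim_reads:
            input:
                "raw/{sample}_{lane}.fastq"
            output:
                "trimmed/{sample}_{lane}.fastq"
            shell:
                "seqtk trimfq {input} > {output}"

    Requesting trimmed/liver_L1.fastq binds sample=liver and lane=L1, so the rule simultaneously declares its dependencies and documents the processing step.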

  17. Effective and Efficient Similarity Search in Scientific Workflow Repositories

    OpenAIRE

    Starlinger, Johannes; Cohen-Boulakia, Sarah; Khanna, Sanjeev; Davidson, Susan; Leser, Ulf

    2015-01-01

    Scientific workflows have become a valuable tool for large-scale data processing and analysis. This has led to the creation of specialized online repositories to facilitate workflow sharing and reuse. Over time, these repositories have grown to sizes that call for advanced methods to support workflow discovery, in particular for similarity search. Effective similarity search requires both high quality algorithms for the comparison of scientific workflows and efficient strategies for indexing,...

  18. Database management issues in workflow systems: a summary

    OpenAIRE

    Put, Ferdinand

    1996-01-01

    Currently most workflow systems use a database management system as supporting technology. But very little can be found about the actual database modeling issues in the context of workflow management. Also in the recent reference model of the Workflow Management Coalition (WfMC) nothing is mentioned about the position of a database management system, nor about modeling. This paper tries to indicate where databases come into action, and what specific problems are encountered. Discussed are...

  19. Research on an Integrated Enterprise Workflow Model

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    An integrated enterprise workflow model called PPROCE is first presented. Then, an enterprise ontology established with TOVE and the Process Specification Language (PSL) is studied. Combined with TOVE's partition idea, PSL is extended and new PSL extensions are created to define the ontology of process, organization, resource and product in the PPROCE model. As a result, the PPROCE model can be defined by a set of corresponding formal languages. This facilitates future work not only on model verification, optimization and simulation, but also on model translation.

  20. Reengineering of the i4 workflow engine

    OpenAIRE

    Likar, Tilen

    2013-01-01

    i4 is an enterprise resource planning system which allows you to manage business processes. Due to increasing demands for managing complex processes and adjusting those processes to global standards, a renewal of a part of the system was required. In this thesis we faced the reengineering of the workflow engine and the corresponding data model. We designed a business process diagram in Bizagi Process Modeler. The import to i4 and the export from i4 were developed on the XPDL file exported from the mo...

  1. A Distributed Workflow Platform for Simulation

    OpenAIRE

    Nguyen, Toan; Trifan, Laurentiu; Desideri, Jean-Antoine

    2010-01-01

    This article presents an approach for designing, building, and deploying a simulation platform based on distributed workflows. It supports the integration of existing software, for example Matlab, Scilab, Python, OpenFOAM and Paraview, as well as user-defined programs. The contribution here is application-level support for fault tolerance and exception handling, i.e. resilience.

  2. Evolving Workflow Graphs Using Typed Genetic Programming

    Czech Academy of Sciences Publication Activity Database

    Křen, T.; Pilát, M.; Neruda, Roman

    Los Alamitos: IEEE, 2015, pp. 1407-1414. ISBN 978-1-4799-7560-0. [SSCI 2015. Symposium Series on Computational Intelligence. Cape Town (ZA), 08.12.2015-10.12.2015] R&D Projects: GA ČR GA15-19877S; GA MŠk ED1.1.00/02.0070 Other grants: GA UK(CZ) 187115; GA UK(CZ) SVV 260224; GA MŠk(CZ) LM2011033 Institutional support: RVO:67985807 Keywords: typed genetic programming * meta-learning * workflow graphs Subject RIV: IN - Informatics, Computer Science

  3. SPATIAL DATA QUALITY AND A WORKFLOW TOOL

    OpenAIRE

    Meijer, M; Vullings, L.A.E.; J. D. Bulens; F. I. Rip; M. Boss; Hazeu, G.; Storm, M.

    2015-01-01

    Although perceived by many as important, spatial data quality has hardly ever taken centre stage unless something went wrong due to bad quality. However, we think this is going to change soon. We are relying more and more on data-driven processes, and due to the increased availability of data there is a choice in what data to use. How to make that choice? We think spatial data quality has potential as a selection criterion. In this paper we focus on how a workflow tool can help th...

  4. CMS data and workflow management system

    CERN Document Server

    Fanfani, A; Bacchi, W; Codispoti, G; De Filippis, N; Pompili, A; My, S; Abbrescia, M; Maggi, G; Donvito, G; Silvestris, L; Calzolari, F; Sarkar, S; Spiga, D; Cinquili, M; Lacaprara, S; Biasotto, M; Farina, F; Merlo, M; Belforte, S; Kavka, C; Sala, L; Harvey, J; Hufnagel, D; Fanzago, F; Corvo, M; Magini, N; Rehn, J; Toteva, Z; Feichtinger, D; Tuura, L; Eulisse, G; Bockelman, B; Lundstedt, C; Egeland, R; Evans, D; Mason, D; Gutsche, O; Sexton-Kennedy, L; Dagenhart, D W; Afaq, A; Guo, Y; Kosyakov, S; Lueking, L; Sekhri, V; Fisk, I; McBride, P; Bauerdick, L; Bakken, J; Rossman, P; Wicklund, E; Wu, Y; Jones, C; Kuznetsov, V; Riley, D; Dolgert, A; van Lingen, F; Narsky, I; Paus, C; Klute, M; Gomez-Ceballos, G; Piedra-Gomez, J; Miller, M; Mohapatra, A; Lazaridis, C; Bradley, D; Elmer, P; Wildish, T; Wuerthwein, F; Letts, J; Bourilkov, D; Kim, B; Smith, P; Hernandez, J M; Caballero, J; Delgado, A; Flix, J; Cabrillo-Bartolome, I; Kasemann, M; Flossdorf, A; Stadie, H; Kreuzer, P; Khomitch, A; Hof, C; Zeidler, C; Kalini, S; Trunov, A; Saout, C; Felzmann, U; Metson, S; Newbold, D; Geddes, N; Brew, C; Jackson, J; Wakefield, S; De Weirdt, S; Adler, V; Maes, J; Van Mulders, P; Villella, I; Hammad, G; Pukhaeva, N; Kurca, T; Semneniouk, I; Guan, W; Lajas, J A; Teodoro, D; Gregores, E; Baquero, M; Shehzad, A; Kadastik, M; Kodolova, O; Chao, Y; Ming Kuo, C; Filippidis, C; Walzel, G; Han, D; Kalinowski, A; Giro de Almeida, N M; Panyam, N

    CMS expects to manage many tens of petabytes of data to be distributed over several computing centers around the world. The CMS distributed computing and analysis model is designed to serve, process and archive the large number of events that will be generated when the CMS detector starts taking data. The underlying concepts and the overall architecture of the CMS data and workflow management system will be presented. In addition, the experience in using the system for MC production, initial detector commissioning activities and data analysis will be summarized.

  5. A Formal Model For Declarative Workflows

    DEFF Research Database (Denmark)

    Mukkamala, Raghava Rao

    Current business process technology is quite good at supporting well-structured business processes that aim at achieving a fixed goal by carrying out an exact set of operations. In contrast, the exact operations needed to fulfill a business process/workflow may not always be possible to foresee...... have proved that it is sufficiently expressive to model ω-regular languages for infinite runs. The model has been extended with nested sub-graphs to express hierarchy, multi-instance sub-processes to model replicated behavior, and support for data. The second contribution of the thesis is to provide

  6. Electronic resource management systems a workflow approach

    CERN Document Server

    Anderson, Elsa K

    2014-01-01

    To get to the bottom of a successful approach to Electronic Resource Management (ERM), Anderson interviewed staff at 11 institutions about their ERM implementations. Among her conclusions, presented in this issue of Library Technology Reports, is that grasping the intricacies of your workflow-analyzing each step to reveal the gaps and problems-at the beginning is crucial to selecting and implementing an ERM. Whether the system will be used to fill a gap, aggregate critical data, or replace a tedious manual process, the best solution for your library depends on factors such as your current soft

  7. Business and scientific workflows a web service-oriented approach

    CERN Document Server

    Tan, Wei

    2013-01-01

    Focuses on how to use web service computing and service-based workflow technologies to develop timely, effective workflows for both business and scientific fields Utilizing web computing and Service-Oriented Architecture (SOA), Business and Scientific Workflows: A Web Service-Oriented Approach focuses on how to design, analyze, and deploy web service-based workflows for both business and scientific applications in many areas of healthcare and biomedicine. It also discusses and presents the recent research and development results. This informative reference features app

  8. A Safety Analysis Approach to Clinical Workflows: Application and Evaluation

    Directory of Open Access Journals (Sweden)

    Lamis Al-Qora’n

    2014-11-01

    Full Text Available Clinical workflows are safety-critical workflows, as they have the potential to cause harm or death to patients. Their safety needs to be considered as early as possible in the development process. Effective safety analysis methods are required to ensure the safety of these high-risk workflows, because errors that may happen through routine workflow could propagate within the workflow to result in harmful failures of the system's output. This paper shows how to apply an approach for safety analysis of clinical workflows to analyse the safety of the workflow within a radiology department, and evaluates the approach in terms of usability and benefits. The outcomes of using this approach include identification of the root causes of hazardous workflow failures that may put patients' lives at risk. We show that the approach is applicable to this area of healthcare and is able to present added value through the detailed information on possible failures, of both their causes and effects; therefore, it has the potential to improve the safety of radiology and other clinical workflows.

  9. Process Makna - A Semantic Wiki for Scientific Workflows

    CERN Document Server

    Paschke, Adrian

    2010-01-01

    Virtual e-Science infrastructures supporting Web-based scientific workflows are an example of knowledge-intensive collaborative and weakly-structured processes where the interaction with the human scientists during process execution plays a central role. In this paper we propose lightweight, dynamic, user-friendly interaction with humans during the execution of scientific workflows via the low-barrier approach of Semantic Wikis as an intuitive interface for non-technical scientists. Our Process Makna Semantic Wiki system is a novel combination of a business process management system adapted for scientific workflows with a Corporate Semantic Web Wiki user interface supporting knowledge-intensive human interaction tasks during scientific workflow execution.

  10. Deploying and sharing U-Compare workflows as web services

    OpenAIRE

    Kontonatsios, Georgios; Korkontzelos, Ioannis; Kolluru, BalaKrishna; Thompson, Paul; Ananiadou, Sophia

    2013-01-01

    Background U-Compare is a text mining platform that allows the construction, evaluation and comparison of text mining workflows. U-Compare contains a large library of components that are tuned to the biomedical domain. Users can rapidly develop biomedical text mining workflows by mixing and matching U-Compare’s components. Workflows developed using U-Compare can be exported and sent to other users who, in turn, can import and re-use them. However, the resulting workflows are standalone applic...

  11. Comparison of Resource Platform Selection Approaches for Scientific Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Ramakrishnan, Lavanya

    2010-03-05

    Cloud computing is increasingly considered as an additional computational resource platform for scientific workflows. The cloud offers the opportunity to scale out applications from desktops and local cluster resources. At the same time, it can eliminate the challenges of restricted software environments and queue delays in shared high performance computing environments. Choosing from these diverse resource platforms for a workflow execution poses a challenge for many scientists. Scientists are often faced with deciding resource platform selection trade-offs with limited information on the actual workflows. While many workflow planning methods have explored task scheduling onto different resources, these methods often require fine-scale characterization of the workflow that is onerous for a scientist. In this position paper, we describe our early exploratory work into using blackbox characteristics to do a cost-benefit analysis of using cloud platforms. We use only very limited high-level information on the workflow length, width, and data sizes. The length and width are indicative of the workflow duration and parallelism. The data size characterizes the I/O requirements. We compare the effectiveness of this approach to other resource selection models using two exemplar scientific workflows scheduled on desktops, local clusters, HPC centers, and clouds. Early results suggest that the blackbox model often makes the same resource selections as a more fine-grained whitebox model. We believe the simplicity of the blackbox model can help inform a scientist on the applicability of cloud computing resources even before porting an existing workflow.
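
    A toy rendering of such a blackbox comparison, using only workflow length, width, and data size (all platform figures below are invented for illustration and are not from the paper):

        # Each platform: (usable cores, queue delay in s, $ per core-hour, MB/s to storage)
        platforms = {
            "desktop": (4,    0, 0.00,  50),
            "cluster": (64, 600, 0.00, 200),
            "cloud":   (64,  60, 0.09, 100),
        }

        def estimate(length, width, data_mb, task_s, platform):
            cores, queue_s, price, mbps = platforms[platform]
            used = min(width, cores)
            compute_s = length * width * task_s / used  # total work spread over usable cores
            wall_s = queue_s + compute_s + data_mb / mbps
            cost = price * (compute_s / 3600) * used
            return wall_s, cost

        for p in platforms:
            wall, cost = estimate(length=10, width=32, data_mb=5000, task_s=120, platform=p)
            print(f"{p:8s} wall={wall / 60:7.1f} min  cost=${cost:6.2f}")

    Even this crude model reproduces the qualitative trade-off described above: queue delay dominates short workflows, while core count and price dominate long, wide ones.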

  12. Advanced Architectures for Transactional Workflows or Advanced Transactions in Workflow Architectures

    NARCIS (Netherlands)

    Grefen, Paul

    1999-01-01

    In this short paper, we outline the workflow management systems research in the Information Systems division at the University of Twente. We discuss the two main themes in this research: architecture design and advanced transaction management. Attention is paid to the coverage of these themes in the context of the completed Mercurius and WIDE projects and in the new CrossFlow project.

  13. Facilitating Stewardship of scientific data through standards based workflows

    Science.gov (United States)

    Bastrakova, I.; Kemp, C.; Potter, A. K.

    2013-12-01

    scientific data acquisition and analysis requirements and effective interoperable data management and delivery. This includes participating in national and international dialogue on the development of standards, embedding data management activities in business processes, and developing scientific staff as effective data stewards. A similar approach is applied to the geophysical data. By ensuring that the geophysical datasets at GA strictly follow metadata and industry standards, we are able to implement a provenance-based workflow where the data is easily discoverable, geophysical processing can be applied to it, and results can be stored. The provenance-based workflow enables metadata records for the results to be produced automatically from the input dataset metadata.

  14. Lowering the Barriers to Integrative Aquatic Ecosystem Science: Semantic Provenance, Open Linked Data, and Workflows

    Science.gov (United States)

    Harmon, T.; Hofmann, A. F.; Utz, R.; Deelman, E.; Hanson, P. C.; Szekely, P.; Villamizar, S. R.; Knoblock, C.; Guo, Q.; Crichton, D. J.; McCann, M. P.; Gil, Y.

    2011-12-01

    Environmental cyber-observatory (ECO) planning and implementation has been ongoing for more than a decade now, and several major efforts have recently come online or will soon. Some investigators in the relevant research communities will use ECO data, traditionally by developing their own client-side services to acquire data and then manually create custom tools to integrate and analyze it. However, a significant portion of the aquatic ecosystem science community will need more custom services to manage locally collected data. The latter group represents enormous intellectual capacity when one envisions thousands of ecosystems scientists supplementing ECO baseline data by sharing their own locally intensive observational efforts. This poster summarizes the outcomes of the June 2011 Workshop for Aquatic Ecosystem Sustainability (WAES) which focused on the needs of aquatic ecosystem research on inland waters and oceans. Here we advocate new approaches to support scientists to model, integrate, and analyze data based on: 1) a new breed of software tools in which semantic provenance is automatically created and used by the system, 2) the use of open standards based on RDF and Linked Data Principles to facilitate sharing of data and provenance annotations, 3) the use of workflows to represent explicitly all data preparation, integration, and processing steps in a way that is automatically repeatable. Aquatic ecosystems workflow exemplars are provided and discussed in terms of their potential to broaden data sharing, analysis and synthesis, thereby increasing the impact of aquatic ecosystem research.

  15. Workflow-Based Software Development Environment

    Science.gov (United States)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment

  16. Deriving DICOM surgical extensions from surgical workflows

    Science.gov (United States)

    Burgert, O.; Neumuth, T.; Gessat, M.; Jacobs, S.; Lemke, H. U.

    2007-03-01

    The generation, storage, transfer, and representation of image data in radiology are standardized by DICOM. To cover the needs of image guided surgery, or computer assisted surgery in general, one needs to handle patient information besides image data. A large number of objects must be defined in DICOM to address the needs of surgery. We propose an analysis process based on Surgical Workflows that helps to identify these objects together with use cases and requirements motivating their specification. As the first result we confirmed the need for the specification of representation and transfer of geometric models. The analysis of Surgical Workflows has shown that geometric models are widely used to represent planned procedure steps, surgical tools, anatomical structures, or prostheses in the context of surgical planning, image guided surgery, augmented reality, and simulation. Until now, the models have been stored and transferred in several file formats, bare of contextual information. The standardization of data types including contextual information and specifications for handling of geometric models allows a broader usage of such models. This paper explains the specification process leading to Geometry Mesh Service Object Pair classes. This process can be a template for the definition of further DICOM classes.

  17. Workflow management for a cosmology collaboratory

    International Nuclear Information System (INIS)

    The Nearby Supernova Factory Project will provide a unique opportunity to bring together simulation and observation to address crucial problems in particle and nuclear physics. Its goal is to significantly enhance our understanding of the nuclear processes in supernovae and to improve our ability to use both Type Ia and Type II supernovae as reference light sources (standard candles) in precision measurements of cosmological parameters. Over the past several years, astronomers and astrophysicists have been conducting in-depth sky searches with the goal of identifying supernovae in their earliest evolutionary stages and, during the 4 to 8 weeks of their most 'explosive' activity, measure their changing magnitude and spectra. The search program currently under development at LBNL is an earth-based observation program utilizing observational instruments at Haleakala and Mauna Kea, Hawaii, and Mt. Palomar, California. This new program provides a demanding testbed for the integration of computational, data management and collaboratory technologies. A critical element of this effort is the use of emerging workflow management tools to permit collaborating scientists to manage data processing and storage and to integrate advanced supernova simulation into the real-time control of the experiments. The authors describe the workflow management framework for the project, discuss security and resource allocation requirements and review emerging tools to support this important aspect of collaborative work.

  18. Workflow Management for a Cosmology Collaboratory

    Institute of Scientific and Technical Information of China (English)

    Stewart C. Loken; Charles McParland

    2001-01-01

    The Nearby Supernova Factory Project will provide a unique opportunity to bring together simulation and observation to address crucial problems in particle and nuclear physics. Its goal is to significantly enhance our understanding of the nuclear processes in supernovae and to improve our ability to use both Type Ia and Type II supernovae as reference light sources (standard candles) in precision measurements of cosmological parameters. Over the past several years, astronomers and astrophysicists have been conducting in-depth sky searches with the goal of identifying supernovae in their earliest evolutionary stages and, during the 4 to 8 weeks of their most "explosive" activity, measure their changing magnitude and spectra. The search program currently under development at LBNL is an earth-based observation program utilizing observational instruments at Haleakala and Mauna Kea, Hawaii, and Mt. Palomar, California. This new program provides a demanding testbed for the integration of computational, data management and collaboratory technologies. A critical element of this effort is the use of emerging workflow management tools to permit collaborating scientists to manage data processing and storage and to integrate advanced supernova simulation into the real-time control of the experiments. This paper describes the workflow management framework for the project, discusses security and resource allocation requirements and reviews emerging tools to support this important aspect of collaborative work.

  19. Delta: Data Reduction for Integrated Application Workflows.

    Energy Technology Data Exchange (ETDEWEB)

    Lofstead, Gerald Fredrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jean-Baptiste, Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Oldfield, Ron A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-06-01

    Integrated Application Workflows (IAWs) run multiple simulation workflow components concurrently on an HPC resource, connecting these components using compute area resources and compensating for any performance or data processing rate mismatches. These IAWs require high frequency and high volume data transfers between compute nodes and staging area nodes during the lifetime of a large parallel computation. The available network bandwidth between the two areas may not be enough to efficiently support the data movement. As the processing power available to compute resources increases, the requirements for this data transfer will become more difficult to satisfy and perhaps will not be satisfiable at all, since network capabilities are not expanding at a comparable rate. Furthermore, energy consumption in HPC environments is expected to grow by an order of magnitude as exascale systems become a reality. The energy cost of moving large amounts of data frequently will contribute to this issue. It is necessary to reduce the volume of data without reducing the quality of data when it is being processed and analyzed. Delta resolves the issue by addressing the lifetime data transfer operations. Delta removes subsequent identical copies of already transmitted data during transfers and restores those copies once the data has reached the destination. Delta is able to identify duplicated information and determine the most space efficient way to represent it. Initial tests show about 50% reduction in data movement while maintaining the same data quality and transmission frequency.
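
    A minimal sketch of that duplicate-suppression idea (fixed-size blocks and SHA-1 digests are assumptions made for illustration; Delta's actual chunking and restore protocol are not specified here):

        import hashlib

        def dedup_stream(payload, block=4096, seen=None):
            """Split a buffer into blocks; send each distinct block once, references after that."""
            seen = {} if seen is None else seen
            out = []
            for off in range(0, len(payload), block):
                chunk = payload[off:off + block]
                digest = hashlib.sha1(chunk).digest()
                if digest in seen:
                    out.append(("ref", digest))   # already transmitted: send a short reference
                else:
                    seen[digest] = True
                    out.append(("data", chunk))   # first occurrence: send the bytes
            return out, seen

        msgs, cache = dedup_stream(b"abcd" * 4096)  # a highly repetitive 16 KiB payload
        sent = sum(len(c) for kind, c in msgs if kind == "data")
        print(f"{4 * 4096} bytes reduced to {sent} bytes of unique data")

    The receiver keeps the same digest table and expands each reference back into the original block, so data quality is unchanged while transfer volume drops.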

  20. Accelerating the scientific exploration process with scientific workflows

    International Nuclear Information System (INIS)

    Although an increasing amount of middleware has emerged in the last few years to achieve remote data access, distributed job execution, and data management, orchestrating these technologies with minimal overhead still remains a difficult task for scientists. Scientific workflow systems improve this situation by creating interfaces to a variety of technologies and automating the execution and monitoring of the workflows. Workflow systems provide domain-independent customizable interfaces and tools that combine different tools and technologies along with efficient methods for using them. As simulations and experiments move into the petascale regime, the orchestration of long running data and compute intensive tasks is becoming a major requirement for the successful steering and completion of scientific investigations. A scientific workflow is the process of combining data and processes into a configurable, structured set of steps that implement semi-automated computational solutions of a scientific problem. Kepler is a cross-project collaboration, co-founded by the SciDAC Scientific Data Management (SDM) Center, whose purpose is to develop a domain-independent scientific workflow system. It provides a workflow environment in which scientists design and execute scientific workflows by specifying the desired sequence of computational actions and the appropriate data flow, including required data transformations, between these steps. Currently deployed workflows range from local analytical pipelines to distributed, high-performance and high-throughput applications, which can be both data- and compute-intensive. The scientific workflow approach offers a number of advantages over traditional scripting-based approaches, including ease of configuration, improved reusability and maintenance of workflows and components (called actors), automated provenance management, 'smart' re-running of different versions of workflow instances, on-the-fly updateable parameters, monitoring

  1. Von der Prozeßorientierung zum Workflow Management - Teil 2: Prozeßmanagement, Workflow Management, Workflow-Management-Systeme

    OpenAIRE

    Maurer, Gerd

    1996-01-01

    The terms process orientation, process management, workflow management, and workflow management systems are still not clearly defined and delineated from one another. Starting from a particular understanding of process orientation (working paper WI No. 9/1996), process management is defined as a comprehensive approach to the process-oriented design and management of enterprises. Workflow management constitutes the more formal, strongly IT-oriented component of process management and...

  2. DOCFLOW: AN INTEGRATED DOCUMENT WORKFLOW FOR BUSINESS PROCESS MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Boonsit Yimwadsana

    2011-01-01

    Full Text Available Document management and workflow management systems have been widely used in large business enterprises to improve productivity. However, they still have not gained wide acceptance in small and medium-sized businesses due to their cost and complexity. In addition, document management and workflow management systems are often used separately because they solve different problems. Only certain parts of document management systems need to be tied to workflow management systems. In most business environments, however, documents actually flow according to workflow definitions. Our work thus combines the two concepts and simplifies the management of both documents and workflows to fit business users. Our application, DocFlow, is designed with simplicity in mind while still maintaining the necessary workflow and document management features, with security. An approval mechanism is naturally included in the workflow, and approval can be performed by a group of actors such that a decision by any one team member suffices as the group's decision, as sketched below. A case study of a news publishing process is shown to demonstrate how DocFlow can be used to create a workflow that fits the news publishing process.
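
    A minimal rendering of that group-approval rule (the class and member names are illustrative, not DocFlow's API):

        class ApprovalStep:
            """A workflow step assigned to a group; the first decision settles it."""

            def __init__(self, group):
                self.group = set(group)
                self.decision = None
                self.decided_by = None

            def decide(self, member, approve):
                if member not in self.group:
                    raise ValueError(f"{member} is not in the approval group")
                if self.decision is None:          # first decision wins for the whole group
                    self.decision, self.decided_by = approve, member
                return self.decision

        step = ApprovalStep({"editor_a", "editor_b", "editor_c"})
        step.decide("editor_b", approve=True)
        print(step.decision, step.decided_by)      # True editor_b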

  3. Conceptual Framework and Architecture for Service Mediating Workflow Management

    NARCIS (Netherlands)

    Hu, Jinmin; Grefen, Paul

    2003-01-01

    This paper proposes a three-layer workflow concept framework to realize workflow enactment flexibility by dynamically binding activities to their implementations at run time. A service mediating layer is added to bridge business process definition and its implementation. Based on this framework, we

  4. Research of Web-based Workflow Management System

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The state of the art of workflow management techniques in research is introduced. The research and development trends of Workflow Management Systems (WFMS) are presented. On the basis of analysis and comparison of various WFMSs, a WFMS based on Web technology and distributed object management is proposed. Finally, the application of the WFMS in supply chain management is described in detail.

  5. A Collaborative Workflow for the Digitization of Unique Materials

    Science.gov (United States)

    Gueguen, Gretchen; Hanlon, Ann M.

    2009-01-01

    This paper examines the experience of one institution, the University of Maryland Libraries, as it made organizational efforts to harness existing workflows and to capture digitization done in the course of responding to patron requests. By examining the way this organization adjusted its existing workflows to put in place more systematic methods…

  6. Design, Modelling and Analysis of a Workflow Reconfiguration

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Abouzaid, Faisal; Dragoni, Nicola; Bhattacharyya, Anirban

    This paper describes a case study involving the reconfiguration of an office workflow. We state the requirements on a system implementing the workflow and its reconfiguration, and describe the system’s design in BPMN. We then use an asynchronous pi-calculus and Web.1 to model the design and to...

  7. Modelling and analysis of workflow for lean supply chains

    Science.gov (United States)

    Ma, Jinping; Wang, Kanliang; Xu, Lida

    2011-11-01

    Cross-organisational workflow systems are a component of enterprise information systems which support collaborative business processes among organisations in a supply chain. Currently, the majority of workflow systems are developed from the perspective of information modelling, without considering the actual requirements of supply chain management. In this article, we focus on the modelling and analysis of cross-organisational workflow systems in the context of the lean supply chain (LSC) using Petri nets. First, the article describes the assumed conditions of cross-organisational workflow nets according to the idea of the LSC and then discusses the standardisation of the collaborating business process between organisations in the context of the LSC. Second, the concept of labelled time Petri nets (LTPNs) is defined by combining labelled Petri nets with time Petri nets, and the concept of labelled time workflow nets (LTWNs) is also defined based on LTPNs. Cross-organisational labelled time workflow nets (CLTWNs) are then defined based on LTWNs. Third, the article proposes the notion of OR-silent CLTWNs and an approach to verifying the soundness of LTWNs and CLTWNs. Finally, the article illustrates how to use the proposed method with a simple example. The purpose of this research is to establish a formal method for the modelling and analysis of workflow systems for the LSC. This study initiates a new perspective on research into cross-organisational workflow management and promotes the operations management of LSCs in real-world settings.
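
    For readers unfamiliar with the underlying formalism, a minimal untimed, unlabelled Petri-net firing rule in code (a far simpler object than the LTWNs and CLTWNs defined in the article; the places and transitions are invented):

        # A net: places with token counts, transitions with input/output places.
        marking = {"received": 1, "approved": 0, "shipped": 0}
        transitions = {
            "approve": ({"received": 1}, {"approved": 1}),
            "ship":    ({"approved": 1}, {"shipped": 1}),
        }

        def enabled(name):
            pre, _ = transitions[name]
            return all(marking[p] >= n for p, n in pre.items())

        def fire(name):
            assert enabled(name), f"{name} is not enabled"
            pre, post = transitions[name]
            for p, n in pre.items():
                marking[p] -= n    # consume input tokens
            for p, n in post.items():
                marking[p] += n    # produce output tokens

        fire("approve"); fire("ship")
        print(marking)  # {'received': 0, 'approved': 0, 'shipped': 1}

    The article's LTWNs and CLTWNs layer transition labels and timing on top of this same firing discipline.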

  8. The influence of workflow systems on team learning

    NARCIS (Netherlands)

    Offenbeek, Marjolein A.G. van

    1999-01-01

    The question is raised what influence a team's use of a workflow system will have on team learning. In office environments where the work is organised in semi-autonomous teams that are responsible for whole processes, workflow systems are being implemented to effectively and efficiently realise the con

  9. Grid-enabled Workflows for Industrial Product Design

    OpenAIRE

    Boniface, M.J.; Ferris, J.; Ghanem, M; Azam, N

    2006-01-01

    This paper presents a generic approach for developing and using Grid-based workflow technology for enabling cross-organizational engineering applications. Using industrial product design examples from the automotive and aerospace industries we highlight the main requirements and challenges addressed by our approach and describe how it can be used for enabling interoperability between heterogeneous workflow engines.

  10. Towards an actor-driven workflow management system for Grids

    NARCIS (Netherlands)

    F. Berretz; S. Skorupa; V. Sander; A. Belloum

    2010-01-01

    Currently, most workflow management systems in Grid environments provide push-oriented job distribution strategies, where jobs are explicitly delegated to resources. In those scenarios the dedicated resources execute submitted jobs according to the request of a workflow engine or Grid wide scheduler

  11. Argo: enabling the development of bespoke workflows and services for disease annotation.

    Science.gov (United States)

    Batista-Navarro, Riza; Carter, Jacob; Ananiadou, Sophia

    2016-01-01

    Argo (http://argo.nactem.ac.uk) is a generic text mining workbench that can cater to a variety of use cases, including the semi-automatic annotation of literature. It enables its technical users to build their own customised text mining solutions by providing a wide array of interoperable and configurable elementary components that can be seamlessly integrated into processing workflows. With Argo's graphical annotation interface, domain experts can then make use of the workflows' automatically generated output to curate information of interest. With the continuously rising need to understand the aetiology of diseases as well as the demand for their informed diagnosis and personalised treatment, the curation of disease-relevant information from medical and clinical documents has become an indispensable scientific activity. In the Fifth BioCreative Challenge Evaluation Workshop (BioCreative V), there was substantial interest in the mining of literature for disease-relevant information. Apart from a panel discussion focussed on disease annotations, the chemical-disease relations (CDR) track was also organised to foster the sharing and advancement of disease annotation tools and resources. This article presents the application of Argo's capabilities to the literature-based annotation of diseases. As part of our participation in BioCreative V's User Interactive Track (IAT), we demonstrated and evaluated Argo's suitability to the semi-automatic curation of chronic obstructive pulmonary disease (COPD) phenotypes. Furthermore, the workbench facilitated the development of some of the CDR track's top-performing web services for normalising disease mentions against the Medical Subject Headings (MeSH) database. In this work, we highlight Argo's support for developing various types of bespoke workflows ranging from ones which enabled us to easily incorporate information from various databases, to those which train and apply machine learning-based concept recognition models

  12. Web service automatic composition based on semantic relationship graph

    Institute of Scientific and Technical Information of China (English)

    冯建周; 孔令富; 王晓寰

    2012-01-01

    The method based on graph search is a simple and direct way to realize automatic Web service composition, but the search space is large and it is difficult to express the various combination structures among services. To solve this problem, a method that determines the combination structure from semantic matching relationships is presented. The Web services are first given a formal semantic description; then, based on the semantic matching relationships, a semantic relationship graph is built from only those services in the repository that relate to the user-provided inputs and expected outputs. On this basis, combination structure models are defined over the semantic matching relationships; taking the integrated semantic matching degree as the optimization goal, the breadth-first search algorithm is improved, the calculation of the semantic matching degree for each combination structure is defined, and a Web service composition path with the optimal integrated semantic matching degree is generated. The feasibility of the proposed algorithm is verified through an example.
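
    A stripped-down sketch of the core idea, chaining services by input/output matching with breadth-first search (exact set matching stands in here for the paper's graded semantic matching degree, and the services are hypothetical):

        from collections import deque

        # Each service: name -> (required input concepts, produced output concepts)
        services = {
            "geocode": ({"address"}, {"coords"}),
            "weather": ({"coords"}, {"forecast"}),
            "traffic": ({"coords"}, {"congestion"}),
        }

        def compose(have, want):
            """BFS over service chains until every wanted concept is producible."""
            start = frozenset(have)
            queue, visited = deque([(start, [])]), {start}
            while queue:
                known, path = queue.popleft()
                if want <= known:
                    return path
                for name, (ins, outs) in services.items():
                    if ins <= known and not outs <= known:
                        nxt = frozenset(known | outs)
                        if nxt not in visited:
                            visited.add(nxt)
                            queue.append((nxt, path + [name]))
            return None

        print(compose({"address"}, {"forecast"}))  # ['geocode', 'weather']

    The paper's contribution is to replace the exact matches with degrees of semantic match and to score whole combination structures, so the search returns the path with the best integrated matching degree rather than merely the shortest one.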

  13. Phonon Gas Model (PGM) workflow in the VLab Science Gateway

    Science.gov (United States)

    da Silveira, P.; Zhang, D.; Wentzcovitch, R. M.

    2013-12-01

    This contribution describes a scientific workflow for first-principles computation of the free energy of crystalline solids using the phonon gas model (PGM). This model was recently implemented as a hybrid method combining molecular dynamics and phonon normal mode analysis to extract temperature-dependent phonon frequencies and lifetimes beyond perturbation theory. This is a demanding high-throughput workflow and is currently being implemented in the VLab Cyberinfrastructure [da Silveira et al., 2008], which has recently been integrated with XSEDE. First we review the underlying PGM, its practical implementation, and its calculation requirements. We then describe the workflow management and its general method for handling actions. We illustrate the PGM application with a calculation of MgSiO3-perovskite's anharmonic phonons. We conclude with an outlook on workflows to compute other material properties that will use the PGM workflow. Research supported by NSF award EAR-1019853.
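
    The central quantity such a workflow computes is the vibrational free energy. In the phonon gas picture it takes the standard quasi-harmonic form, except that the mode frequencies are the temperature-dependent ones extracted from molecular dynamics (U_0 is the static lattice energy and the sum runs over phonon modes q; this is the textbook expression, not necessarily VLab's exact implementation):

        F(V, T) = U_0(V) + \sum_q \left[ \frac{\hbar \omega_q(V, T)}{2}
                  + k_B T \, \ln\!\left( 1 - e^{-\hbar \omega_q(V, T) / k_B T} \right) \right]

    The workflow's job is then bookkeeping at scale: run the dynamics, extract each ω_q(V, T), evaluate the sum on a grid of volumes and temperatures, and store the results.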

  14. Federated Database Services for Wind Tunnel Experiment Workflows

    Directory of Open Access Journals (Sweden)

    A. Paventhan

    2006-01-01

    Full Text Available Enabling the full life cycle of scientific and engineering workflows requires robust middleware and services that support effective data management, near-realtime data movement and custom data processing. Many existing solutions exploit the database as a passive metadata catalog. In this paper, we present an approach that makes use of federation of databases to host data-centric wind tunnel application workflows. The user is able to compose customized application workflows based on database services. We provide a reference implementation that leverages typical business tools and technologies: Microsoft SQL Server for database services and Windows Workflow Foundation for workflow services. The application data and user's code are both hosted in federated databases. With the growing interest in XML Web Services in scientific Grids, and with databases beginning to support native XML types and XML Web services, we can expect the role of databases in scientific computation to grow in importance.

  15. A Multi-Dimensional Classification Model for Scientific Workflow Characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Ramakrishnan, Lavanya; Plale, Beth

    2010-04-05

    Workflows have been used to model repeatable tasks or operations in manufacturing, business processes, and software. In recent years, workflows have increasingly been used for the orchestration of science discovery tasks that use distributed resources and web services environments through resource models such as grid and cloud computing. Workflows have disparate requirements and constraints that affect how they might be managed in distributed environments. In this paper, we present a multi-dimensional classification model illustrated by workflow examples obtained through a survey of scientists from different domains, including bioinformatics and biomedical, weather and ocean modeling, and astronomy, detailing their data and computational requirements. The survey results and classification model contribute to the high-level understanding of scientific workflows.

  16. Clothes Dryer Automatic Termination Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    TeGrotenhuis, Ward E.

    2014-10-01

    Volume 2: Improved Sensor and Control Designs. Many residential clothes dryers on the market today provide automatic cycles that are intended to stop when the clothes are dry, as determined by the final remaining moisture content (RMC). However, testing of automatic termination cycles has shown that many dryers are susceptible to over-drying of loads, leading to excess energy consumption. In particular, tests performed using the DOE Test Procedure in Appendix D2 of 10 CFR 430 subpart B have shown that as much as 62% of the energy used in a cycle may be from over-drying. Volume 1 of this report shows an average of 20% excess energy from over-drying when running automatic cycles with various load compositions and dryer settings. Consequently, improving automatic termination sensors and algorithms has the potential for substantial energy savings in the U.S.

  17. A systematic workflow process for heavy oil characterization : experimental techniques and challenges

    Energy Technology Data Exchange (ETDEWEB)

    Memon, A.I.; Gao, J.; Taylor, S.D.; Davies, T.L.; Jia, N. [Schlumberger, Edmonton, AB (Canada)

    2010-07-01

    Fluid characterization of heavy oil and bitumen is necessary when choosing the best extraction, production, and processing methods. High viscosity, low API, low saturation pressure, and low gas to oil ratios impose challenges in measuring fluid properties of heavy oil. This paper summarized the heavy oil fluid characterization technique that includes fluid sample handling, PVT analysis, fluid viscosity, emulsion and rheology, slow kinetics of gas evolution during constant composition expansion (CCE) experiments, solvent solubility, steam stripping, and high temperature vapor-liquid equilibrium of oil-solvent-steam systems. The paper also proposed a heavy oil characterization workflow to address a broad range of heavy oil production scenarios, including cold depletion, steam flow and heavy oil flow assurance. The workflow was based on various experimental techniques and challenges facing heavy oil sample preparation. Fluid property measurement requirements for each production technique were also compiled. 16 refs., 13 figs.

  18. Using Cloud-Aware Provenance to Reproduce Scientific Workflow Execution on Cloud

    OpenAIRE

    Hasham, Khawar; Munir, Kamran; McClatchey, Richard

    2015-01-01

    Provenance has been thought of as a mechanism to verify a workflow and to provide workflow reproducibility. Such provenance of scientific workflows has been effectively captured in Grid-based scientific workflow systems. However, the recent adoption of Cloud-based scientific workflows presents an opportunity to investigate the suitability of existing approaches, or to propose new approaches, to collect provenance information from the Cloud and to utilize it for workflow repeatability in the Cloud infra...

  19. Radiology information system: a workflow-based approach

    International Nuclear Information System (INIS)

    Introducing workflow management technology in healthcare seems promising for dealing with the problem that current healthcare information systems cannot provide sufficient support for process management, although several challenges still exist. The purpose of this paper is to study the method of developing a workflow-based information system in a radiology department as a use case. First, a workflow model of the typical radiology process was established. Second, based on the model, the system could be designed and implemented as a group of loosely coupled components. Each component corresponded to one task in the process and could be assembled by the workflow management system. The legacy systems could be taken as special components, which also corresponded to tasks and were integrated by converting non-workflow-aware interfaces to the standard ones. Finally, a workflow dashboard was designed and implemented to provide an integral view of radiology processes. The workflow-based Radiology Information System was deployed in the radiology department of Zhejiang Chinese Medicine Hospital in China. The results showed that it could be adjusted flexibly in response to the needs of changing processes, and that it enhanced process management in the department. It can also provide a more workflow-aware integration method compared with other methods such as IHE-based ones. The workflow-based approach is a new method of developing radiology information systems with more flexibility, more process management functionality and more workflow-aware integration. The work of this paper is an initial endeavor at introducing workflow management technology in healthcare. (orig.)

  20. An Integrated Workflow for DNA Methylation Analysis

    Institute of Scientific and Technical Information of China (English)

    Pingchuan Li; Feray Demirci; Gayathri Mahalingam; Caghan Demirci; Mayumi Nakano; Blake C.Meyers

    2013-01-01

    The analysis of cytosine methylation provides a new way to assess and describe epigenetic regulation at a whole-genome level in many eukaryotes. DNA methylation has a demonstrated role in genome stability and protection, regulation of gene expression and many other aspects of genome function and maintenance. BS-seq is a relatively unbiased method for profiling DNA methylation, with a resolution capable of measuring methylation at individual cytosines. Here we describe, as an example, a workflow to handle DNA methylation analysis, from BS-seq library preparation to data visualization. We describe some applications for the analysis and interpretation of these data. Our laboratory provides public access to plant DNA methylation data via visualization tools available at our "Next-Gen Sequence" websites (http://mpss.udel.edu), along with small RNA, RNA-seq and other data types.
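
    The per-cytosine quantity such a workflow ultimately reports is simple to state: from the aligned bisulfite reads, the methylation level of a cytosine is the fraction of reads still showing C (protected by methylation) rather than T (converted). A toy version with invented read counts:

        def methylation_level(c_reads, t_reads, min_cover=5):
            """Fraction of bisulfite reads supporting methylation at one cytosine.
            An unconverted C implies methylated; conversion to T implies unmethylated."""
            cover = c_reads + t_reads
            if cover < min_cover:
                return None               # too little coverage to make a call
            return c_reads / cover

        # Three cytosines with (C, T) read counts from a BS-seq alignment
        for pos, (c, t) in {101: (9, 1), 102: (0, 12), 103: (2, 1)}.items():
            print(pos, methylation_level(c, t))
        # 101 -> 0.9 (highly methylated), 102 -> 0.0, 103 -> None (low coverage)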

  1. Swabs to genomes: a comprehensive workflow

    Directory of Open Access Journals (Sweden)

    Madison I. Dunitz

    2015-05-01

    Full Text Available The sequencing, assembly, and basic analysis of microbial genomes, once a painstaking and expensive undertaking, has become much easier for research labs with access to standard molecular biology and computational tools. However, there is a confusing variety of options available for DNA library preparation and sequencing, and inexperience with bioinformatics can pose a significant barrier to entry for many who may be interested in microbial genomics. The objective of the present study was to design, test, troubleshoot, and publish a simple, comprehensive workflow from the collection of an environmental sample (a swab) to a published microbial genome, empowering even a lab or classroom with limited resources and bioinformatics experience to perform it.

  2. Managing Library IT Workflow with Bugzilla

    Directory of Open Access Journals (Sweden)

    Nina McHale

    2010-09-01

    Full Text Available Prior to September 2008, all technology issues at the University of Colorado Denver's Auraria Library were reported to a dedicated departmental phone line. A variety of staff changes necessitated a more formal means of tracking, delegating, and resolving reported issues, and the department turned to Bugzilla, an open source bug tracking application designed by Mozilla.org developers. While designed with software development bug tracking in mind, Bugzilla can be easily customized and modified to serve as an IT ticketing system. Twenty-three months and over 2300 trouble tickets later, Auraria's IT department workflow is much smoother and more efficient. This article includes two Perl Template Toolkit code samples for customized Bugzilla screens for use in a library environment; readers will be able to easily replicate the project in their own environments.

  3. A Framework for Distributed Preservation Workflows

    Directory of Open Access Journals (Sweden)

    Rainer Schmidt

    2010-07-01

    Full Text Available The Planets Project is developing a service-oriented environment for the definition and evaluation of preservation strategies for human-centric data. It focuses on the question of logically preserving digital materials, as opposed to the physical preservation of content bit-streams. This includes the development of preservation tools for the automated characterisation, migration, and comparison of different types of Digital Objects, as well as the emulation of their original runtime environment in order to ensure long-term access and interpretability. The Planets integrated environment provides a number of end-user applications that allow data curators to execute and scientifically evaluate preservation experiments based on composable preservation services. In this paper, we focus on the middleware and programming model and show how it can be utilised in order to create complex preservation workflows.

  4. Advanced Architectures for Transactional Workflows or Advanced Transactions in Workflow Architectures

    OpenAIRE

    Grefen, Paul

    1999-01-01

    In this short paper, we outline the workflow management systems research in the Information Systems division at the University of Twente. We discuss the two main themes in this research: architecture design and advanced transaction management. Attention is paid to the coverage of these themes in the context of the completed Mercurius and WIDE projects and in the new CrossFlow project. In the latter project, contracts are introduced as a new theme to support electronic commerce aspects in work...

  5. Automatic Building Process of Self-Closed Modified N-tree

    Directory of Open Access Journals (Sweden)

    Yu Li

    2013-07-01

    Full Text Available Some features of prevailing workflow models, such as Petri nets and Grid workflows, prevent them from adapting to dynamic operation. We therefore propose a modified N-tree model to control a workflow. The modified N-tree model can remedy some of the problems that exist in these prevailing workflow models. First, we prove that the proposed modified N-tree model is self-closed. This feature ensures that the workflow can still accomplish its tasks when nodes of a well-running modified N-tree workflow are changed before or during execution; it is the prerequisite for the dynamic characteristics of the modified N-tree model. We then give a method to change the tree dynamically based on this self-closure property. Finally, based on the dynamic characteristics of the model, we give a method to build the N-tree workflow model automatically by using the LR analysis method proposed by D. Knuth. This automatic construction is the model's most important capability.

  6. CA-PLAN, a Service-Oriented Workflow

    Institute of Scientific and Technical Information of China (English)

    Shung-Bin Yan; Feng-Jian Wang

    2005-01-01

    Workflow management systems (WfMSs) are accepted worldwide due to their ability to model and control business processes. Previously, we defined an intra-organizational workflow specification model, Process LANguage (PLAN). PLAN, with associated tools, allowed a user to describe a graph specification for processes, artifacts, and participants in an organization. PLAN has been successfully implemented in Agentflow to support workflow (Agentflow) applications. PLAN, like most current WfMSs, adopts a centralized architecture, so it can be applied to a single organization. In such a structure, however, participants in Agentflow applications in different organizations cannot serve each other with workflows. In this paper, a service-oriented cooperative workflow model, Cooperative Agentflow Process LANguage (CA-PLAN), is presented. CA-PLAN proposes a workflow component model to model inter-organizational processes. In CA-PLAN, an inter-organizational process is partitioned into several intra-organizational processes. Each workflow system inside an organization is modeled as an Integrated Workflow Component (IWC). Each IWC contains a process service interface, which specifies the process services provided by an organization, together with a remote process interface, which specifies the remote process services provided by other organizations that are referenced, and the intra-organizational processes. An IWC is both a workflow node and a participant. An inter-organizational process is made up of connections among these process services and remote processes with respect to different IWCs. In this paper, the related service techniques and supporting tools provided in Agentflow systems are presented.

  7. Robust methods for automatic image-to-world registration in cone-beam CT interventional guidance

    OpenAIRE

    Dang, H; Otake, Y.; Schafer, S.; Stayman, J. W.; Kleinszig, G.; Siewerdsen, J. H.

    2012-01-01

    Purpose: Real-time surgical navigation relies on accurate image-to-world registration to align the coordinate systems of the image and patient. Conventional manual registration can present a workflow bottleneck and is prone to manual error and intraoperator variability. This work reports alternative means of automatic image-to-world registration, each method involving an automatic registration marker (ARM) used in conjunction with C-arm cone-beam CT (CBCT). The first involves a Known-Model re...

  8. CMS Alignment and Calibration workflows: lessons learned and future plans

    CERN Document Server

    De Guio, Federico

    2014-01-01

    We review the online and offline workflows designed to align and calibrate the CMS detector. Starting from the experience gained during the first LHC run, we discuss the expected developments for Run II. In particular, we describe the envisioned stages, from alignment using cosmic-ray data to detector alignment and calibration using the first proton-proton collision data (O(100 pb-1)) and a larger dataset (O(1 fb-1)) to reach the target precision. The automation of the workflow and its integration into the online and offline activity (dedicated triggers and datasets, data skims, workflows to compute the calibration and alignment constants) are discussed.

  9. A Workflow Process Mining Algorithm Based on Synchro-Net

    Institute of Scientific and Technical Information of China (English)

    Xing-Qi Huang; Li-Fu Wang; Wen Zhao; Shi-Kun Zhang; Chong-Yi Yuan

    2006-01-01

    Historical information about workflow execution is sometimes needed to analyze business processes. Process mining aims at extracting information from event logs in order to capture a business process as it is executed. In this paper a process mining algorithm is proposed based on Synchro-Net, a synchronization-based model of workflow logic and workflow semantics. With this mining algorithm based on the model, problems such as invisible tasks and short loops can be handled with ease. A process mining example is presented to illustrate the algorithm, and an evaluation is also given.
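
    The Synchro-Net algorithm itself is not given in the abstract, but the raw material of most process mining can be illustrated compactly: extracting the "directly follows" relation from an event log. The log below is invented, and this is a hedged illustration of the general technique rather than the paper's method.

```python
# Minimal process-mining ingredient: count how often task b directly
# follows task a within the same case of an event log.
from collections import defaultdict

event_log = [                      # hypothetical workflow log, time-ordered
    ("case1", "register"), ("case1", "check"), ("case1", "approve"),
    ("case2", "register"), ("case2", "check"), ("case2", "reject"),
]

def directly_follows(log):
    traces = defaultdict(list)
    for case, task in log:               # group events by case, keeping order
        traces[case].append(task)
    relation = defaultdict(int)
    for tasks in traces.values():
        for a, b in zip(tasks, tasks[1:]):
            relation[(a, b)] += 1        # b directly follows a
    return dict(relation)

print(directly_follows(event_log))
# {('register', 'check'): 2, ('check', 'approve'): 1, ('check', 'reject'): 1}
```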

  10. Workflow logs analysis system for enterprise performance measurement

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Workflow logs that record the execution of business processes offer a very valuable data resource for real-time enterprise performance measurement. In this paper, a novel scheme that uses data warehouse and OLAP technology to explore workflow logs and create complex analysis reports for enterprise performance measurement is proposed. Three key points of this scheme are studied: 1) the measure set; 2) an open and flexible architecture for the workflow log analysis system; 3) the data models in the WFMS and the data warehouse. A case study that shows the validity of the scheme is also provided.
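
    As a minimal illustration of one measure such a system might compute, the sketch below derives the average cycle time per process definition from start/end events in a workflow log. The field names and records are hypothetical; a real implementation would load the log into a data warehouse and aggregate via OLAP, as the paper proposes.

```python
# Toy performance measure: average cycle time (hours) per process,
# computed from paired start/end events in a workflow log.
from datetime import datetime
from collections import defaultdict

log = [
    {"process": "claim", "instance": 1, "event": "start", "ts": "2005-01-03 09:00"},
    {"process": "claim", "instance": 1, "event": "end",   "ts": "2005-01-03 17:30"},
    {"process": "claim", "instance": 2, "event": "start", "ts": "2005-01-04 10:00"},
    {"process": "claim", "instance": 2, "event": "end",   "ts": "2005-01-04 14:00"},
]

def avg_cycle_time_hours(log):
    starts, ends = {}, {}
    for rec in log:
        key = (rec["process"], rec["instance"])
        t = datetime.strptime(rec["ts"], "%Y-%m-%d %H:%M")
        (starts if rec["event"] == "start" else ends)[key] = t
    per_process = defaultdict(list)
    for key, t0 in starts.items():
        if key in ends:
            per_process[key[0]].append((ends[key] - t0).total_seconds() / 3600)
    return {p: sum(v) / len(v) for p, v in per_process.items()}

print(avg_cycle_time_hours(log))   # {'claim': 6.25}
```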

  11. Mapping open access to e-resources workflows

    OpenAIRE

    Stone, Graham; Awre, Chris; Stainthorp, Paul

    2016-01-01

    Open Access workflows are often seen as a separate add-on set of processes. However, libraries already have processes in place to manage the e-resource life cycle. Therefore, as part of work package 8 (Library processes and open access), the HHuLOA team decided to investigate how open access workflows could be embedded into e-resource management. This poster accompanies the blog post at: https://library3.hud.ac.uk/blogs/hhuloa/2016/05/11/mapping-open-access-to-e-resources-workflows/

  12. Service-based flexible workflow system for virtual enterprise

    Institute of Scientific and Technical Information of China (English)

    WU Shao-fei

    2008-01-01

    Using the services provided by virtual enterprises, we present a solution for implementing flexible inter-enterprise workflow management. Services are responses to events that can be accessed programmatically on the Internet via the HTTP protocol, and they are obtained according to standardized service templates. For flexible control, the workflow engine binds a request to appropriate services and their providers using a constraint-based, dynamic binding mechanism. Hence, a flexible and collaborative business is achieved. The workflow management system supports virtual enterprises, and the style of a virtual enterprise can be adjusted readily to adapt to various situations.
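
    The constraint-based dynamic binding can be sketched as follows. The registry contents and constraint fields are invented; a real engine would query live service templates rather than a static list.

```python
# Hedged sketch of dynamic binding: at run time, select a provider whose
# advertised properties satisfy every constraint attached to the request.
registry = [
    {"provider": "A", "service": "shipping", "max_cost": 100, "avg_hours": 48},
    {"provider": "B", "service": "shipping", "max_cost": 60,  "avg_hours": 72},
]

def bind(service, constraints):
    candidates = [
        e for e in registry
        if e["service"] == service
        and all(e.get(k, float("inf")) <= v for k, v in constraints.items())
    ]
    # pick the cheapest provider that satisfies every constraint
    return min(candidates, key=lambda e: e["max_cost"], default=None)

print(bind("shipping", {"max_cost": 80, "avg_hours": 96}))   # provider B
```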

  13. Workflow for the use of a high-resolution image detector in endovascular interventional procedures

    Science.gov (United States)

    Rana, R.; Loughran, B.; Swetadri Vasan, S. N.; Pope, L.; Ionita, C. N.; Siddiqui, A.; Lin, N.; Bednarek, D. R.; Rudin, S.

    2014-03-01

    Endovascular image-guided intervention (EIGI) has become the primary interventional therapy for the most widespread vascular diseases. These procedures involve the insertion of a catheter into the femoral artery, which is then threaded under fluoroscopic guidance to the site of the pathology to be treated. Flat Panel Detectors (FPDs) are normally used for EIGIs; however, once the catheter is guided to the pathological site, high-resolution imaging capabilities can be used to accurately guide a successful endovascular treatment. The Micro-Angiographic Fluoroscope (MAF) detector provides the needed high-resolution, high-sensitivity, and real-time imaging capabilities. An experimental MAF enabled with a Control, Acquisition, Processing, Image Display and Storage (CAPIDS) system was installed and aligned on a detector changer attached to the C-arm of a clinical angiographic unit. The CAPIDS system was developed and implemented using LabVIEW software and provides a user-friendly interface that enables control of several clinical radiographic imaging modes of the MAF, including fluoroscopy, roadmap, radiography, and digital subtraction angiography (DSA). Using the automatic controls, the MAF detector can be moved to the deployed position, in front of a standard FPD, whenever higher resolution is needed during angiographic or interventional vascular imaging procedures. To minimize any possible negative impact on image guidance with the two detector systems, it is essential to have a well-designed workflow that enables smooth deployment of the MAF at critical stages of clinical procedures. For the ultimate success of this new imaging capability, a clear understanding of the workflow design is essential. This presentation provides a detailed description and demonstration of such a workflow design.

  14. Workflow oriented hanging protocols for radiology workstation

    Science.gov (United States)

    Moise, Adrian; Atkins, M. Stella

    2002-05-01

    The goal is to provide a smooth, efficient and automatic display for the interpretation of medical images by using a new generation of hanging protocols (HPs). HPs refer to a set of rules defining the way images are arranged on the computer screen immediately after opening a case. HPs usually include information regarding placement of the sequences, viewing mode, layout, window width and level (W/L) settings, zoom and pan. We present the results of a survey of 8 radiologists on (1) the necessity of using HPs, (2) the applicability of a hierarchical organization of HPs and (3) the number of HPs required for interpretation. We discuss some limitations and challenges associated with HPs, including automatic placement of the series on the screen despite non-standard series labeling, generation of pseudo-series, creation of the 'study context' and identification of relevant priors, and image display standardization with automatic orientation and shuttering. The paper also addresses HP selection based on the workstation's hardware, such as the number and type of monitors, the size of the study, and the presence of image processing routines tailored to the information needs and level of expertise of particular users. Our 'heads-up' approach is meant to free the user's conscious processing for reasoning such as the detection of patterns, thus allowing for the execution of tasks in an efficient yet highly adaptive manner that is sensitive to shifting concepts. Automation of routine tasks is maximized through the creation of shortcuts and macros embedded in features like multi-stage HPs.
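
    A hierarchical organization of HPs, as surveyed above, can be illustrated with a small selection routine: given study attributes, pick the most specific protocol whose conditions all match. The attribute names and rules below are hypothetical, not taken from the paper.

```python
# Hedged sketch of hierarchical hanging-protocol selection: the most
# specific HP (largest number of matched conditions) wins.
hanging_protocols = [
    {"name": "default",     "match": {},                                       "layout": "1x1"},
    {"name": "chest",       "match": {"modality": "CR", "body_part": "CHEST"}, "layout": "2x1"},
    {"name": "chest+prior", "match": {"modality": "CR", "body_part": "CHEST",
                                      "has_prior": True},                      "layout": "2x2"},
]

def select_hp(study):
    applicable = [hp for hp in hanging_protocols
                  if all(study.get(k) == v for k, v in hp["match"].items())]
    return max(applicable, key=lambda hp: len(hp["match"]))

study = {"modality": "CR", "body_part": "CHEST", "has_prior": True}
print(select_hp(study)["name"])   # chest+prior
```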

  15. DMS systems and workflow

    OpenAIRE

    Jakeš, Jiří

    2008-01-01

    This thesis deals with document management systems (DMS) and the support of intra-company processes through integrated workflow modules. It covers the main reasons for introducing a DMS, the benefits resulting from its introduction, and the functionality of a typical DMS; it identifies the individual components of such a system, maps the state of the relevant market, and outlines the trends toward which the development of these systems is heading. The thesis is based primarily on practical experience and attempts to trace the transition from technology to business...

  16. Resilient workflows for computational mechanics platforms

    International Nuclear Information System (INIS)

    Workflow management systems have recently been the focus of much interest and of many research and deployment efforts for scientific applications worldwide. Their ability to abstract applications by wrapping application codes has also stressed the usefulness of such systems for multidisciplinary applications. When complex applications need to provide seamless interfaces that hide the technicalities of the computing infrastructures, their high-level modeling, monitoring and execution functionalities give production teams seamless and effective facilities. Software integration infrastructures based on programming paradigms such as Python, Matlab and Scilab have also provided evidence of the usefulness of such approaches for the tight coupling of multidisciplinary application codes. In addition, high-performance computing based on multi-core, multi-cluster infrastructures opens new opportunities for more accurate, more extensive and more robust multidisciplinary simulations in the decades to come. This supports the goal of full flight-dynamics simulation of 3D aircraft models within the next decade, opening the way to virtual flight tests and certification of aircraft in the future.

  17. SPATIAL DATA QUALITY AND A WORKFLOW TOOL

    Directory of Open Access Journals (Sweden)

    M. Meijer

    2015-08-01

    Full Text Available Although perceived by many as important, spatial data quality has hardly ever taken centre stage unless something went wrong due to bad quality. However, we think this is going to change soon. We rely more and more on data-driven processes, and due to the increased availability of data there is a choice in what data to use. How to make that choice? We think spatial data quality has potential as a selection criterion. In this paper we focus on how a workflow tool can help the consumer as well as the producer get a better understanding of which product characteristics are important. For this purpose, we have developed a framework in which we define different roles (consumer, producer and intermediary) and differentiate between product specifications and quality specifications. A number of requirements are stated that can be translated into quality elements. We used case studies to validate our framework, which is designed following the fitness-for-use principle. Part of this framework is software that in some cases can help ascertain the quality of datasets.

  18. Confusion Analysis and Detection for Workflow Nets

    Directory of Open Access Journals (Sweden)

    Xiao-liang Chen

    2014-01-01

    Full Text Available Option processes often occur in a business procedure as a result of resource competition. In a business procedure modeled with a workflow net (WF-net), all decision behavior and option operations for business tasks are modeled and performed by the conflicts in the corresponding WF-net. Concurrency in WF-nets is used to keep business procedures operating at high performance. However, the firing of concurrent transitions in a WF-net may lead to the disappearance of conflicts in the WF-net. This phenomenon, usually called confusion, creates difficulties for the resolution of conflicts. This paper investigates confusion detection problems in WF-nets. First, confusions are formalized as a class of marked subnets with special conflicting and concurrent features. Second, a detection approach based on the characteristics of confusion subnets and integer linear programming (ILP) is developed, which does not require computing the reachability graph of a WF-net. Examples of confusion detection in WF-nets are presented. Finally, the impact of confusions on the properties of WF-nets is specified.

  19. Delegation in Role Based Access Control Model for Workflow Systems

    Directory of Open Access Journals (Sweden)

    Prasanna H Bammigatti

    2008-03-01

    Full Text Available Role-based access control (RBAC) has been introduced in the last few years and offers a powerful means of specifying access control decisions. The RBAC model usually assumes that, if there is a role hierarchy, then access rights are inherited upwards through the hierarchy. In organizational workflow, the main threat is to access control, and role-based access control is one of the most suitable access control models one can think of. It is not only the role hierarchies but also other control factors that affect access control in the workflow. The paper discusses the control factors and role hierarchies in workflow and brings a new model of RBAC. It also overcomes the conflicts and proves that the system is safe by applying the new model to the workflow.
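
    A minimal sketch of the two ingredients discussed above, upward permission inheritance through a role hierarchy and a delegation check gated by an extra control factor, might look as follows. The roles, permissions and the delegation rule are invented for illustration.

```python
# Hedged RBAC sketch: permissions are inherited upwards through the role
# hierarchy; delegation is additionally gated by a workflow control factor.
role_parents = {"clerk": [], "reviewer": ["clerk"], "manager": ["reviewer"]}
role_perms   = {"clerk": {"submit"}, "reviewer": {"review"}, "manager": {"approve"}}

def permissions(role):
    perms = set(role_perms.get(role, set()))
    for parent in role_parents.get(role, []):
        perms |= permissions(parent)          # inherit upwards
    return perms

def can_delegate(giver_role, perm, task_active=True):
    # hypothetical control factor: delegation only while the task is active,
    # and only of permissions the giver itself holds
    return task_active and perm in permissions(giver_role)

print(permissions("manager"))                  # {'submit', 'review', 'approve'}
print(can_delegate("reviewer", "approve"))     # False: reviewer lacks it
```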

  20. Approach for workflow modeling using π-calculus

    Institute of Scientific and Technical Information of China (English)

    Yang Dong; Zhang Shensheng

    2003-01-01

    As a variant of process algebra, π-calculus can describe the interactions between evolving processes. By modeling an activity as a process interacting with other processes through ports, this paper presents a new approach: representing workflow models using π-calculus. As a result, the model can characterize the dynamic behaviors of the workflow process in terms of the labeled transition system (LTS) semantics of π-calculus. The main advantage of the workflow model's formal semantics is that it allows for verification of the model's properties, such as deadlock freedom and normal termination. Moreover, the equivalence of workflow models can be checked through the weak bisimulation theorem of the π-calculus, thus facilitating the optimization of business processes.

  1. A Community-Driven Workflow Recommendation and Reuse Infrastructure Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Promote and encourage process and workflow reuse  within NASA Earth eXchange (NEX) by developing a proactive recommendation technology based on collective NEX...

  2. Network resource control for grid workflow management systems

    NARCIS (Netherlands)

    R. Strijkers; M. Cristea; V. Korkhov; D. Marchal; A. Belloum; C. de Laat; R. Meijer

    2010-01-01

    Grid workflow management systems automate the orchestration of scientific applications with large computational and data processing needs, but lack control over network resources. Consequently, the management system cannot prevent multiple communication-intensive applications from competing for network

  3. Supporting exploration and collaboration in scientific workflow systems

    Science.gov (United States)

    Marini, L.; Kooper, R.; Bajcsy, P.; Myers, J.

    2007-12-01

    As the amount of observation data captured every day increases, running scientific workflows will soon become a fundamental step of scientific inquiry. Current scientific workflow systems offer ways to link together data, software and computational resources, but often accomplish this by requiring a deep understanding of the system, with a steep learning curve. Thus, there is a need to lower user adoption barriers for workflow systems and improve their plug-and-play functionality. We created a system that allows the user to easily create and share workflows, data and algorithms. Our goal in lowering user adoption barriers is to support discoveries and to provide means for conducting research more efficiently. Current paradigms for workflow creation focus on visual programming using a graph-based metaphor. This can be a powerful metaphor in the hands of expert users, but can become daunting when graphs become large, when the steps in the graph include engineering-level steps such as loading and visualizing data, and when the users are not very familiar with all the possible tools available. We present a different method of workflow creation that co-exists with the standard graph-based editors. The method builds on an exploratory interface using a macro-recording style, and focuses on the data being analyzed during the step-by-step creation of the workflow. Instead of storing data in system-specific data structures, the use of more flexible, platform-independent open standards would create systems that are easier to extend and that provide a simple interface for external applications to query and analyze the data and metadata produced. We have explored and implemented a system that stores workflows and related metadata using the Resource Description Framework (RDF) metadata model and that is built on top of the Tupelo data and metadata archiving system. The scientific workflow system connects to shared content repositories, where users can easily share

  4. Workflows for microarray data processing in the Kepler environment

    Directory of Open Access Journals (Sweden)

    Stropp Thomas

    2012-05-01

    Full Text Available Abstract Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data and therefore are close to

  5. Optimization of data-intensive workflows: concepts and realization of a heuristic, rule-based optimizer

    OpenAIRE

    Vrhovnik, Marko

    2011-01-01

    To simplify the modeling of data-intensive workflows that process large amounts of relational data, workflow description languages such as BPEL have been extended with SQL functionality by leading vendors of workflow and database management systems. As a result, data processing operations such as SQL statements or calls to user-defined procedures no longer have to be wrapped in web services, but can be defined directly at the workflow level. This results...

  6. An Architecture for Decentralised Orchestration of Web Service Workflows

    OpenAIRE

    Jaradat, Ward; Dearle, Alan; Barker, Adam

    2013-01-01

    Service-oriented workflows are typically executed using a centralised orchestration approach that presents significant scalability challenges. These challenges include the consumption of network bandwidth, degradation of performance, and single-points of failure. We provide a decentralised orchestration architecture that attempts to address these challenges. Our architecture adopts a design model that permits the computation to be moved "closer" to services in a workflow. This is achieved by ...

  7. Organizational flexibility through workflow management systems?

    OpenAIRE

    Kirn, Stefan

    2008-01-01

    The use of workflow management systems is generally associated with an improvement in organizational flexibility. This is of major importance when, as in the service sector, customers are to be offered tailor-made products. Starting from the theoretical foundations of the flexibility of process-oriented organizations, this contribution examines, on the basis of empirical data, the flexibility-relevant properties of workflow management systems. These depend...

  8. Workflow management systems, their security and access control mechanisms

    OpenAIRE

    Chehrazi, Golriz

    2007-01-01

    This paper gives an overview of workflow management systems (WfMSs) and their security requirements, with a focus on access mechanisms. It is a descriptive paper in which we examine the state of the art of workflow systems, describe which security risks affect WfMSs in particular, and discuss how these can be diminished. WfMSs manage, illustrate and support business processes. They contribute to the performance, automation and optimization of processes, which is important in today's global economy. ...

  9. SURVEY OF WORKFLOW ANALYSIS IN PAST AND PRESENT ISSUES

    OpenAIRE

    SARAVANAN, M.S.; RAMA SREE, R.J.

    2011-01-01

    This paper surveys workflow analysis from a business process perspective for all kinds of organizations. A business can be defined as an organization that provides goods and services to others who want or need them. The concept of managing business processes is referred to as Business Process Management (BPM). A workflow is the automation of a business process, in whole or in part, during which documents, information or tasks are passed from one participant to another for action, according to a set of...

  10. Optimization of tomographic reconstruction workflows on geographically distributed resources.

    Science.gov (United States)

    Bicer, Tekin; Gürsoy, Doğa; Kettimuthu, Rajkumar; De Carlo, Francesco; Foster, Ian T

    2016-07-01

    New technological advancements in synchrotron light sources enable data acquisition at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, we focus on time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources. Two main challenges are considered: (i) modeling the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can
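
    The three-stage model can be illustrated with a toy cost function: estimated workflow time is the sum of data transfer, queue wait and computation. All rates and site parameters below are hypothetical placeholders, not measurements from the paper.

```python
# Hedged sketch of the three-stage performance model:
# estimated time = data transfer + queue wait + computation.
def estimate_time(data_gb, bandwidth_gbps, queue_wait_s, n_slices, s_per_slice, n_cores):
    transfer = data_gb * 8 / bandwidth_gbps         # seconds to move the data
    compute = n_slices * s_per_slice / n_cores      # embarrassingly parallel slices
    return transfer + queue_wait_s + compute

sites = {
    "local_cluster": dict(bandwidth_gbps=10, queue_wait_s=60,   n_cores=128),
    "hpc_center":    dict(bandwidth_gbps=1,  queue_wait_s=1800, n_cores=4096),
}
for name, p in sites.items():
    t = estimate_time(data_gb=200, n_slices=2048, s_per_slice=30, **p)
    print(f"{name}: {t / 60:.1f} min")
# The best site depends on the balance of transfer, wait and compute time.
```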

  11. Scheduling Computational Workflows on Failure-Prone Platforms

    OpenAIRE

    Aupy, Guillaume; Benoit, Anne; Casanova, Henri; Robert, Yves

    2015-01-01

    We study the scheduling of computational workflows on compute resources that experience exponentially distributed failures. When a failure occurs, roll-back and recovery are used to resume the execution from the last checkpointed state. The scheduling problem is to minimize the expected execution time by deciding in which order to execute the tasks in the workflow and whether or not to checkpoint a task after it completes. We give a polynomial-time algorithm for fork graphs and show...
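
    The paper's algorithms are not reproduced in the abstract, but the underlying trade-off can be shown with the classical first-order model for exponential failures: executing w seconds of work followed by a checkpoint of cost C, with failure rate λ and a failure-free recovery of cost R, has expected duration E(w) = (1/λ + R)(e^{λ(w+C)} - 1). The sketch below (assumptions: memoryless failures, failure-free recovery, not the paper's model) splits the work into k checkpointed segments and shows the usual U-shaped curve.

```python
from math import exp

def expected_time(w, lam, C, R, segments=1):
    """Expected time to finish w seconds of work split into `segments`
    checkpointed chunks, under Poisson failures of rate lam, checkpoint
    cost C and failure-free recovery cost R (classical first-order model)."""
    seg = w / segments
    return segments * (1 / lam + R) * (exp(lam * (seg + C)) - 1)

lam = 1 / 3600          # one failure per hour on average
C, R = 60, 30           # checkpoint and recovery costs in seconds
w = 8 * 3600            # eight hours of work

for k in (1, 4, 16, 64, 256):
    print(f"{k:3d} segments: {expected_time(w, lam, C, R, k) / 3600:8.1f} h expected")
# Few segments lose too much work per failure; too many pay checkpoint
# overhead; the expected time is minimized in between.
```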

  12. INTRODUCTION OF WINDOWS WORKFLOW FOUNDATION INTO EXISTING APPLICATIONS

    OpenAIRE

    Kržič, Jernej

    2008-01-01

    In this dissertation we discuss the process of introducing the Windows Workflow Foundation model into existing applications. We present several possible approaches to solving this problem, together with their advantages and limitations. First, we describe the main concepts behind Business Process Management and the Windows Workflow Foundation programming model. Other related technologies are also described, including the .NET Framework, Windows Communication Foundation, ASP.NET, etc. The practical p...

  13. Process Makna - A Semantic Wiki for Scientific Workflows

    OpenAIRE

    Paschke, Adrian; Zhao, Zhili

    2010-01-01

    Virtual e-Science infrastructures supporting Web-based scientific workflows are an example of knowledge-intensive, collaborative and weakly structured processes in which interaction with the human scientists during process execution plays a central role. In this paper we propose lightweight, dynamic and user-friendly interaction with humans during the execution of scientific workflows via the low-barrier approach of Semantic Wikis as an intuitive interface for non-technical scientists. Our Proces...

  14. Spheres of isolation: adaptation of isolation levels to transactional workflow

    OpenAIRE

    Guabtni, Adnene; Charoy, François; Godart, Claude

    2005-01-01

    In Workflow Management Systems (WFMSs), transaction isolation is most of the time managed by the underlying database system using ANSI SQL strategies. These strategies do not sufficiently take process aspects into account. Our work consists of studying in more depth the relation between the isolation strategy and the process dimension, as well as the real isolation needs in workflow environments. To address these needs, we define 'spheres of isolation', inspired by the 'spheres of control' proposed b...

  15. A Taxonomy of Workflow Management Systems for Grid Computing

    OpenAIRE

    Yu, Jia; Buyya, Rajkumar

    2005-01-01

    With the advent of Grid and application technologies, scientists and engineers are building more and more complex applications to manage and process large data sets, and execute scientific experiments on distributed resources. Such application scenarios require means for composing and executing complex workflows. Therefore, many efforts have been made towards the development of workflow management systems for Grid computing. In this paper, we propose a taxonomy that characterizes and classifi...

  16. Workflow-based semantics for peer-to-peer specifications

    Institute of Scientific and Technical Information of China (English)

    Antonio BROGI; Razvan POPESCU

    2008-01-01

    In this paper we introduce SMoL, a simplified BPEL-like language for specifying peer and service behaviour in P2P systems. We then define a transformational semantics of SMoL in terms of Yet Another Workflow Language (YAWL) workflows, which enables the simulation (e.g., testing possible execution scenarios) and analysis (e.g., verifying reachability or lock freedom) of the behaviour of P2P peers and services.

  17. How Workflow Systems Facilitate Business Process Reengineering and Improvement

    OpenAIRE

    Mohamed El Khadiri; Abdelaziz El Fazziki

    2012-01-01

    This paper investigates the relationship between workflow systems and business process reengineering and improvement. The study is based on a real case study at the Centre Régional d'Investissement (CRI) of Marrakech, Morocco. The CRI is entrusted with coordinating various investment projects at the regional level. Our previous work has shown that a workflow system can be a basis for business process reengineering. However, for continuous process improvement, the system has shown to be insufficient...

  18. Implementation and evaluation of a new workflow for registration and segmentation of pulmonary MRI data for regional lung perfusion assessment

    International Nuclear Information System (INIS)

    Recently it has been shown that regional lung perfusion can be assessed using time-resolved contrast-enhanced magnetic resonance (MR) imaging. Quantification of the perfusion images has been attempted based on the definition of small regions of interest (ROIs). Use of complete lung segmentations instead of ROIs could possibly increase quantification accuracy. Due to the low signal-to-noise ratio, automatic segmentation algorithms cannot be applied. On the other hand, manual segmentation of the lung tissue is very time consuming and can become inaccurate, as the borders of the lung to adjacent tissues are not always clearly visible. We propose a new workflow for semi-automatic segmentation of the lung from additionally acquired morphological HASTE MR images. First the lung is delineated semi-automatically in the HASTE image. Next the HASTE image is automatically registered with the perfusion images. Finally, the transformation resulting from the registration is used to align the lung segmentation from the morphological dataset with the perfusion images. We evaluated rigid, affine and locally elastic transformations, suitable optimizers and different implementations of mutual information (MI) metrics to determine the best possible registration algorithm. We identified the shortcomings of the registration procedure and the conditions under which automatic registration will succeed or fail. Segmentation results were evaluated using overlap and distance measures. Integration of the new workflow reduces the time needed for post-processing of the data, simplifies the perfusion quantification and reduces interobserver variability in the segmentation process. In addition, the matched morphological dataset can be used to identify morphologic changes as the source of the perfusion abnormalities
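
    The registration step described above can be sketched with an off-the-shelf toolkit. The following is a hypothetical illustration using SimpleITK, not the authors' implementation: a rigid, mutual-information-driven registration of a morphological image to a perfusion frame, with the resulting transform reused to map the lung mask. File names are placeholders.

```python
# Hedged sketch with SimpleITK: rigid MI registration of a morphological
# (HASTE-like) image onto a perfusion frame, then transfer of the lung mask.
import SimpleITK as sitk

fixed = sitk.ReadImage("perfusion_frame.nii", sitk.sitkFloat32)
moving = sitk.ReadImage("morphological.nii", sitk.sitkFloat32)
lung_mask = sitk.ReadImage("lung_mask.nii")   # drawn on the morphological image

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetOptimizerAsRegularStepGradientDescent(learningRate=1.0,
                                             minStep=1e-4,
                                             numberOfIterations=200)
reg.SetInterpolator(sitk.sitkLinear)
reg.SetInitialTransform(sitk.CenteredTransformInitializer(
    fixed, moving, sitk.Euler3DTransform(),
    sitk.CenteredTransformInitializerFilter.GEOMETRY))

transform = reg.Execute(fixed, moving)

# Apply the same transform to the segmentation (nearest neighbour keeps labels)
aligned_mask = sitk.Resample(lung_mask, fixed, transform,
                             sitk.sitkNearestNeighbor, 0)
sitk.WriteImage(aligned_mask, "lung_mask_on_perfusion.nii")
```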

  19. BReW: Blackbox Resource Selection for e-Science Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh [Univ. of Southern California, Los Angeles, CA (United States); Soroush, Emad [Univ. of Washington, Seattle, WA (United States); Van Ingen, Catharine [Microsoft Research, San Francisco, CA (United States); Agarwal, Deb [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ramakrishnan, Lavanya [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2010-10-04

    Workflows are commonly used to model data-intensive scientific analysis. As computational resource needs increase for eScience, emerging platforms like clouds present additional resource choices for scientists and policy makers. We introduce BReW, a tool that enables users to make rapid, high-level platform selections for their workflows using limited workflow knowledge. This helps make informed decisions on whether to port a workflow to a new platform. Our analysis of synthetic and real eScience workflows shows that, using just the total runtime length, maximum task fanout, and total data used and produced by the workflow, BReW can provide platform predictions comparable to whitebox models with detailed workflow knowledge.
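
    The "blackbox" idea, predicting platform behaviour from only three coarse workflow features, can be illustrated with an ordinary-least-squares sketch. The training data below are invented, and the paper's actual model may differ.

```python
# Hedged sketch: fit runtime on a platform from three coarse features
# (total task runtime, maximum fanout, data volume) using historical runs.
import numpy as np

# columns: total_task_runtime_s, max_fanout, data_gb -> observed runtime_s
history = np.array([
    [3600,  10,  5, 1500],
    [7200,  50, 20, 2800],
    [1800,   4,  1,  900],
    [9000, 100, 40, 3900],
], dtype=float)

X = np.c_[history[:, :3], np.ones(len(history))]   # add an intercept column
y = history[:, 3]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)       # ordinary least squares

new_workflow = np.array([5000, 30, 10, 1])
print(f"predicted runtime: {new_workflow @ coef:.0f} s")
```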

  20. CO2 Storage Feasibility: A Workflow for Site Characterisation

    Directory of Open Access Journals (Sweden)

    Nepveu Manuel

    2015-04-01

    Full Text Available In this paper, we present an overview of the SiteChar workflow model for site characterisation and assessment for CO2 storage. Site characterisation and assessment is required when permits are requested from the legal authorities in the process of starting CO2 storage at a given site. The goal is to assess whether a proposed CO2 storage site can indeed be used for permanent storage while meeting the safety requirements demanded by the European Commission (EC) Storage Directive (2009/31/EC). Many issues have to be scrutinised, and the workflow presented here is put forward to help organise this complex task efficiently. Three issues are highlighted: communication within the working team and with the authorities; interdependencies in the workflow and feedback loops; and the risk-based character of the workflow. A general overview (helicopter view) of the workflow is given; the issues involved in communication and the risk assessment process are described in more detail. The workflow as described has been tested within the SiteChar project on five potential storage sites throughout Europe. This resulted in a list of key aspects of site characterisation which can help prepare and focus new site characterisation studies.

  1. Collaboration Policies: Access Control Management in Decentralized Heterogeneous Workflows

    Directory of Open Access Journals (Sweden)

    Mine Altunay

    2006-07-01

    Full Text Available Service-oriented computing promotes collaboration by defining the standards layer that allows compatibility between disparate domains. Workflows, by taking advantage of the service-oriented framework, provide the necessary tools to harness services in order to tackle complicated problems. As a result, a service is no longer exposed to a small pre-determined homogeneous pool of users; instead it has a large, undefined, and heterogeneous pool of users. This paradigm shift in computing results in increased service exposure. The interactions among the services of a workflow must be carefully evaluated against the security risks associated with them. Classical security problems, such as delegation of rights, conflict of interest, and access control in general, become more complicated due to multiple autonomous security domains and the absence of pre-established trust relationships among the domains. Our work tackles these problems in two aspects: it provides a service owner with the necessary means to express and evaluate its trust requirements from a workflow (collaboration policies), and it incorporates these trust requirements into the workflow-planning framework (workflow authorization framework). Our policy-based framework allows bilateral peer-level trust evaluations that are based on each peer's collaboration policies, and incorporates the outcome of these evaluations into the workflow planning logic. As a result, our work provides the necessary tools for promoting multi-party ad-hoc collaborations, and aims to reduce the reluctance and hesitation towards these collaborations by attacking the security risks associated with them.
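
    The bilateral evaluation can be sketched as two directed policy checks that must both succeed before the planner admits a pair of services into the same workflow. The policy fields below are invented for illustration.

```python
# Hedged sketch of bilateral, peer-level trust evaluation: each peer checks
# the other's attributes against its own collaboration policy.
def satisfies(policy, peer_attrs):
    return all(peer_attrs.get(k) in allowed for k, allowed in policy.items())

service_a = {"attrs": {"domain": "lab-x", "assurance": "high"},
             "policy": {"domain": {"lab-x", "lab-y"}}}
service_b = {"attrs": {"domain": "lab-y", "assurance": "medium"},
             "policy": {"assurance": {"high"}}}

def can_collaborate(s1, s2):
    # both directions must hold; trust is evaluated bilaterally
    return satisfies(s1["policy"], s2["attrs"]) and satisfies(s2["policy"], s1["attrs"])

print(can_collaborate(service_a, service_b))   # True
```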

  2. LQCD workflow execution framework: Models, provenance and fault-tolerance

    International Nuclear Information System (INIS)

    Large computing clusters used for scientific processing suffer from systemic failures when operated over long continuous periods for executing workflows. Diagnosing job problems and faults leading to eventual failures in this complex environment is difficult, specifically when the success of an entire workflow might be affected by a single job failure. In this paper, we introduce a model-based, hierarchical, reliable execution framework that encompasses workflow specification, data provenance, execution tracking and online monitoring of each workflow task, also referred to as participants. The sequence of participants is described in an abstract parameterized view, which is translated into a concrete data-dependency-based sequence of participants with defined arguments. As participants belonging to a workflow are mapped onto machines and executed, periodic and on-demand monitoring of vital health parameters on allocated nodes is enabled according to pre-specified rules. These rules specify conditions that must be true pre-execution, during execution and post-execution. Monitoring information for each participant is propagated upwards through the reflex and healing architecture, which consists of a hierarchical network of decentralized fault management entities, called reflex engines. They are instantiated as state machines or timed automata that change state and initiate reflexive mitigation action(s) upon the occurrence of certain faults. We describe how this cluster reliability framework is combined with the workflow execution framework using formal rules and actions specified within a structure of first-order predicate logic, enabling a dynamic management design that reduces manual administrative workload and increases cluster productivity.

  3. LQCD workflow execution framework: Models, provenance and fault-tolerance

    Science.gov (United States)

    Piccoli, Luciano; Dubey, Abhishek; Simone, James N.; Kowalkowlski, James B.

    2010-04-01

    Large computing clusters used for scientific processing suffer from systemic failures when operated over long continuous periods for executing workflows. Diagnosing job problems and faults leading to eventual failures in this complex environment is difficult, specifically when the success of an entire workflow might be affected by a single job failure. In this paper, we introduce a model-based, hierarchical, reliable execution framework that encompasses workflow specification, data provenance, execution tracking and online monitoring of each workflow task, also referred to as participants. The sequence of participants is described in an abstract parameterized view, which is translated into a concrete data-dependency-based sequence of participants with defined arguments. As participants belonging to a workflow are mapped onto machines and executed, periodic and on-demand monitoring of vital health parameters on allocated nodes is enabled according to pre-specified rules. These rules specify conditions that must be true pre-execution, during execution and post-execution. Monitoring information for each participant is propagated upwards through the reflex and healing architecture, which consists of a hierarchical network of decentralized fault management entities, called reflex engines. They are instantiated as state machines or timed automata that change state and initiate reflexive mitigation action(s) upon the occurrence of certain faults. We describe how this cluster reliability framework is combined with the workflow execution framework using formal rules and actions specified within a structure of first-order predicate logic, enabling a dynamic management design that reduces manual administrative workload and increases cluster productivity.
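
    The pre-/during-/post-execution rules described above can be illustrated with a small monitor that evaluates phase-specific conditions and escalates violations. The concrete conditions (disk space, heartbeat age, exit status) are invented placeholders, not the paper's rule set.

```python
# Hedged sketch of phase-gated monitoring rules for one workflow participant.
rules = {
    "pre":    [lambda s: s["free_disk_gb"] > 10],
    "during": [lambda s: s["heartbeat_age_s"] < 30],
    "post":   [lambda s: s["exit_code"] == 0 and s["output_exists"]],
}

def check(phase, state):
    failed = [i for i, rule in enumerate(rules[phase]) if not rule(state)]
    if failed:
        # in the paper's architecture this would be escalated to a reflex
        # engine, which may trigger a mitigation action
        raise RuntimeError(f"{phase}-condition(s) {failed} violated")

check("pre", {"free_disk_gb": 42})
check("during", {"heartbeat_age_s": 5})
check("post", {"exit_code": 0, "output_exists": True})
print("participant passed all phases")
```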

  4. A scientific workflow framework for (13)C metabolic flux analysis.

    Science.gov (United States)

    Dalman, Tolga; Wiechert, Wolfgang; Nöh, Katharina

    2016-08-20

    Metabolic flux analysis (MFA) with (13)C labeling data is a high-precision technique to quantify intracellular reaction rates (fluxes). One of the major challenges of (13)C MFA is the interactivity of the computational workflow according to which the fluxes are determined from the input data (metabolic network model, labeling data, and physiological rates). Here, the workflow assembly is inevitably determined by the scientist who has to consider interacting biological, experimental, and computational aspects. Decision-making is context dependent and requires expertise, rendering an automated evaluation process hardly possible. Here, we present a scientific workflow framework (SWF) for creating, executing, and controlling on demand (13)C MFA workflows. (13)C MFA-specific tools and libraries, such as the high-performance simulation toolbox 13CFLUX2, are wrapped as web services and thereby integrated into a service-oriented architecture. Besides workflow steering, the SWF features transparent provenance collection and enables full flexibility for ad hoc scripting solutions. To handle compute-intensive tasks, cloud computing is supported. We demonstrate how the challenges posed by (13)C MFA workflows can be solved with our approach on the basis of two proof-of-concept use cases. PMID:26721184

  5. Applying Business Process Re-engineering Patterns to optimize WS-BPEL Workflows

    Science.gov (United States)

    Buys, Jonas; de Florio, Vincenzo; Blondia, Chris

    With the advent of XML-based SOA, WS-BPEL shortly turned out to become a widely accepted standard for modeling business processes. Though SOA is said to embrace the principle of business agility, BPEL process definitions are still manually crafted into their final executable version. While SOA has proven to be a giant leap forward in building flexible IT systems, this static BPEL workflow model is somewhat paradoxical to the need for real business agility and should be enhanced to better sustain continual process evolution. In this paper, we point out the potential of adding business intelligence with respect to business process re-engineering patterns to the system to allow for automatic business process optimization. Furthermore, we point out that BPR macro-rules could be implemented leveraging micro-techniques from computer science. We present some practical examples that illustrate the benefit of such adaptive process models and our preliminary findings.

  6. The Taverna workflow suite: designing and executing workflows of Web Services on the desktop, web or in the cloud.

    Science.gov (United States)

    Wolstencroft, Katherine; Haines, Robert; Fellows, Donal; Williams, Alan; Withers, David; Owen, Stuart; Soiland-Reyes, Stian; Dunlop, Ian; Nenadic, Aleksandra; Fisher, Paul; Bhagat, Jiten; Belhajjame, Khalid; Bacall, Finn; Hardisty, Alex; Nieva de la Hidalga, Abraham; Balcazar Vargas, Maria P; Sufi, Shoaib; Goble, Carole

    2013-07-01

    The Taverna workflow tool suite (http://www.taverna.org.uk) is designed to combine distributed Web Services and/or local tools into complex analysis pipelines. These pipelines can be executed on local desktop machines or through larger infrastructure (such as supercomputers, Grids or cloud environments), using the Taverna Server. In bioinformatics, Taverna workflows are typically used in the areas of high-throughput omics analyses (for example, proteomics or transcriptomics), or for evidence gathering methods involving text mining or data mining. Through Taverna, scientists have access to several thousand different tools and resources that are freely available from a large range of life science institutions. Once constructed, the workflows are reusable, executable bioinformatics protocols that can be shared, reused and repurposed. A repository of public workflows is available at http://www.myexperiment.org. This article provides an update to the Taverna tool suite, highlighting new features and developments in the workbench and the Taverna Server. PMID:23640334

  7. Run-time revenue maximization for composite web services with response time commitments

    NARCIS (Netherlands)

    M. Živković; J.W. Bosman; H. van den Berg; R. van der Mei; H.B. Meeuwissen; R. Núñez-Queija

    2012-01-01

    We investigate dynamic decision mechanisms for composite web services maximizing the expected revenue for the providers of composite services. A composite web service is represented by a (sequential) workflow, and for each task within this workflow, a number of service alternatives may be available.

  8. The CESM Workflow Re-Engineering Project

    Science.gov (United States)

    Strand, G.

    2015-12-01

    The Community Earth System Model (CESM) Workflow Re-Engineering Project is a collaborative project between the CESM Software Engineering Group (CSEG) and the NCAR Computation and Information Systems Lab (CISL) Application Scalability and Performance (ASAP) Group to revamp how CESM saves its output. The CMIP3 and particularly CMIP5 experiences in submitting CESM data to those intercomparison projects revealed that the output format of the CESM is not well suited to the data requirements common to model intercomparison projects. CESM, for efficiency reasons, creates output files containing all fields for each model time sampling, but MIPs require individual files for each field comprising all model time samples. This transposition of model output can be very time-consuming; depending on the volume of data written by the specific simulation, the time to re-orient the data can be comparable to the time required for the simulation to complete. Previous strategies included using serial tools to perform this transposition, but they are now far too inefficient to deal with the many terabytes of output a single simulation can generate. A new set of Python tools, using data parallelism, has been written to enable this re-orientation, and has achieved markedly improved I/O performance. The perspective of a data manager/data producer on the use of these new tools is presented, and likely future work on their development and use will be shown. These tools are a critical part of the NCAR CESM submission to the upcoming CMIP6, with the intention that a much more timely and efficient submission of the expected petabytes of data will be accomplished in the given time frame.
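
    The re-orientation itself is easy to state in miniature: model output arrives as one record per time step containing all fields, while MIPs want one series per field. The toy sketch below shows only that transposition; the real tools operate on netCDF files with data parallelism, and the field names here are invented.

```python
# Toy version of the CESM output transposition: time-slice records
# (all fields, one time) -> per-field time series (one field, all times).
from collections import defaultdict

time_slice_files = [
    {"time": 0, "TS": 288.1, "PRECT": 2e-8},
    {"time": 1, "TS": 288.3, "PRECT": 3e-8},
    {"time": 2, "TS": 288.2, "PRECT": 1e-8},
]

def transpose(slices):
    series = defaultdict(list)
    for snap in slices:
        t = snap["time"]
        for field, value in snap.items():
            if field != "time":
                series[field].append((t, value))
    return dict(series)

print(transpose(time_slice_files)["TS"])   # [(0, 288.1), (1, 288.3), (2, 288.2)]
```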

  9. Inverse IMRT workflow process at Austin health

    International Nuclear Information System (INIS)

    Full text: The work presented here will review the strategies adopted at Austin Health to bring IMRT into clinical use. IMRT is delivered using step-and-shoot mode on an Elekta Precise machine with 40 pairs of 1 cm wide MLC leaves. Planning is done using CMS Focus/XiO. A collaborative approach for ROs, Physicists and RTs from concept to implementation was adopted. An overview will be given of the workflow for the clinic, the equipment used, tolerance levels and the lessons learned.
    1. Strategic planning for IMRT
    2. Training: a. MSKCC (New York); b. ESTRO (Amsterdam); c. Elekta (US and UK)
    3. Linac testing and data acquisition: a. equipment and software review and selection; b. linac reliability, geometric and mechanical checks; c. draft patient QA procedure; d. EPI image matching checks and procedures
    4. Planning system checks: a. export of dose matrix (options); b. dose calculation choices
    5. IMRT research initiatives: a. IMRT planning studies, stabilisation, on-line imaging
    6. Equipment procurement and testing: a. physics and linac equipment, hardware, software/licences, stabilisation
    7. Establishing a DICOM environment: a. prescription sending, image transfer for EPI checks; b. QA files
    8. Physics QA (pre-treatment): a. clinical plan review, DVH checks; b. geometry, dosimetry and DICOM checks; c. 2D distance-to-agreement, mm difference reports, gamma function index
    9. Documentation: a. protocol development (ICRU 50/62 reporting and prescribing); b. QA for physics; c. QA for RTs; d. generation of a report for the RO/patient history.
    Copyright (2004) Australasian College of Physical Scientists and Engineers in Medicine

  10. Composites

    OpenAIRE

    Zhao, Hanqing; Guo, Yuanzheng

    2014-01-01

    This thesis was a literature study concerning composites. With composites becoming increasingly popular in areas such as the aerospace industry and construction, research about composites has significant meaning accordingly. This thesis aimed at introducing some basic information about polymer matrix composites, including raw material, processing, testing, applications and recycling, to give readers a rough understanding of this kind of material. Polymeric matrices, fillers,...

  11. Composition

    DEFF Research Database (Denmark)

    2014-01-01

    Memory Pieces are open compositions to be realised solo by an improvising musician. See more about my composition practise in the entry "Composition - General Introduction". Caution: streaming the sound files will in some cases only provide a few minutes' sample. Please DOWNLOAD them to hear them...

  12. Composition

    DEFF Research Database (Denmark)

    Bergstrøm-Nielsen, Carl

    2010-01-01

    New Year is an open composition to be realised by improvising musicians. It is included in "From the Danish Seasons" (see under this title). See more about my composition practise in the entry "Composition - General Introduction". This work is licensed under a Creative Commons "by-nc" License. You...

  13. Composition

    DEFF Research Database (Denmark)

    Bergstrøm-Nielsen, Carl

    2011-01-01

    Strategies are open compositions to be realised by improvising musicians. See more about my composition practise in the entry "Composition - General Introduction". Caution: streaming the sound files will in some cases only provide a few minutes' sample. Please DOWNLOAD them to hear them in full...

  14. Composition

    DEFF Research Database (Denmark)

    Bergstrøm-Nielsen, Carl

    2014-01-01

    Cue Rondo is an open composition to be realised by improvising musicians. See more about my composition practise in the entry "Composition - General Introduction". Caution: streaming the sound/video files will in some cases only provide a few minutes' sample, or the visuals will not appear at all...

  15. Visual Workflows for Oil and Gas Exploration

    KAUST Repository

    Hollt, Thomas

    2013-04-14

    The most important resources for fulfilling today's energy demands are fossil fuels, such as oil and natural gas. When exploiting hydrocarbon reservoirs, a detailed and credible model of the subsurface structures is crucial for planning the path of the borehole and minimizing economic and ecological risks. Before that, the placement as well as the operations of oil rigs need to be planned carefully, as off-shore oil exploration is vulnerable to hazards caused by strong currents. The oil and gas industry therefore relies on accurate ocean forecasting systems for planning its operations. This thesis presents visual workflows for creating subsurface models as well as planning the placement and operations of off-shore structures. Creating a credible subsurface model poses two major challenges: first, the structures in highly ambiguous seismic data are interpreted in the time domain; second, a velocity model has to be built from this interpretation to match the model to depth measurements from wells. If it is not possible to obtain a match at all positions, the interpretation has to be updated, going back to the first step. This results in a lengthy back and forth between the different steps, or, in many cases, in an unphysical velocity model. We present a novel, integrated approach to interactively creating subsurface models from reflection seismics that integrates the interpretation of the seismic data, using an interactive horizon extraction technique based on piecewise global optimization, with velocity modeling. Computing and visualizing, on the fly, the effects of changes to the interpretation and velocity model on the depth-converted model enables an integrated feedback loop and a completely new connection between the seismic data in the time domain and the well data in the depth domain. For planning the operations of off-shore structures we present a novel integrated visualization system that enables interactive visual analysis of ensemble simulations used in ocean

  16. Provenance-based refresh in data-oriented workflows

    KAUST Repository

    Ikeda, Robert

    2011-01-01

    We consider a general workflow setting in which input data sets are processed by a graph of transformations to produce output results. Our goal is to perform efficient selective refresh of elements in the output data, i.e., compute the latest values of specific output elements when the input data may have changed. We explore how data provenance can be used to enable efficient refresh. Our approach is based on capturing one-level data provenance at each transformation when the workflow is run initially. Then at refresh time provenance is used to determine (transitively) which input elements are responsible for given output elements, and the workflow is rerun only on that portion of the data needed for refresh. Our contributions are to formalize the problem setting and the problem itself, to specify properties of transformations and provenance that are required for efficient refresh, and to provide algorithms that apply to a wide class of transformations and workflows. We have built a prototype system supporting the features and algorithms presented in the paper. We report preliminary experimental results on the overhead of provenance capture, and on the crossover point between selective refresh and full workflow recomputation. © 2011 ACM.
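
    The core mechanism can be sketched for a single map-style transformation: capture one-level provenance at the initial run, then, at refresh time, trace changed inputs to their outputs and rerun only those. This hypothetical sketch ignores multi-step graphs and many-to-one transformations, which the paper's algorithms handle.

```python
# Hedged sketch of one-level provenance capture and selective refresh.
def run(transform, inputs):
    outputs, provenance = {}, {}
    for key, value in inputs.items():
        outputs[key] = transform(value)
        provenance[key] = key              # one-level: output key <- input key
    return outputs, provenance

def refresh(transform, inputs, outputs, provenance, changed_keys):
    for out_key in list(outputs):
        src = provenance[out_key]
        if src in changed_keys:            # rerun only what is stale
            outputs[out_key] = transform(inputs[src])
    return outputs

inputs = {"a": 1, "b": 2, "c": 3}
outputs, prov = run(lambda x: x * 10, inputs)
inputs["b"] = 5
print(refresh(lambda x: x * 10, inputs, outputs, prov, {"b"}))
# {'a': 10, 'b': 50, 'c': 30}
```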

  17. Distributing Workflows over a Ubiquitous P2P Network

    Directory of Open Access Journals (Sweden)

    Eddie Al-Shakarchi

    2007-01-01

    Full Text Available This paper discusses issues in the distribution of bundled workflows across ubiquitous peer-to-peer networks for the application of music information retrieval. The underlying motivation for this work is provided by the DART project, which aims to develop a novel music recommendation system by gathering statistical data using collaborative filtering techniques and the analysis of the audio itself, in order to create a reliable and comprehensive database of the music that people own and which they listen to. To achieve this, the DART scientists creating the algorithms need the ability to distribute the Triana workflows they create, representing the analysis to be performed, across the network on a regular basis (perhaps even daily) in order to update the network as a whole with new workflows to be executed for the analysis. DART uses a similar approach to BOINC but differs in that the workers receive input data in the form of a bundled Triana workflow, which is executed in order to process any MP3 files that they own on their machine. Once analysed, the results are returned to DART's distributed database that collects and aggregates the resulting information. DART employs the use of package repositories to decentralise the distribution of such workflow bundles, and this approach is validated in this paper through simulations that show that suitable scalability is maintained as the number of participants increases. The results clearly illustrate the effectiveness of the approach.

  18. CamBAfx: workflow design, implementation and application for neuroimaging

    Directory of Open Access Journals (Sweden)

    Cinly Ooi

    2009-08-01

    Full Text Available CamBAfx is a workflow application designed for both researchers who use workflows to process data (consumers) and those who design them (designers). It provides a front-end (user interface) optimized for data processing, designed in a way familiar to consumers. The back-end uses a pipeline model to represent workflows, since this is a common and useful metaphor used by designers and is easy to manipulate compared to other representations like programming scripts. As an Eclipse Rich Client Platform application, CamBAfx's pipelines and functions can be bundled with the software or downloaded post-installation. The user interface contains all the workflow facilities expected by consumers. Using the Eclipse Extension Mechanism, designers are encouraged to customize CamBAfx for their own pipelines. CamBAfx wraps a workflow facility around neuroinformatics software without modification. CamBAfx's design, licensing and Eclipse Branding Mechanism allow it to be used as the user interface for other software, facilitating the exchange of innovative computational tools between originating labs.

  19. Development of the workflow kine systems for support on KAIZEN.

    Science.gov (United States)

    Mizuno, Yuki; Ito, Toshihiko; Yoshikawa, Toru; Yomogida, Satoshi; Morio, Koji; Sakai, Kazuhiro

    2012-01-01

    In this paper, we introduce a new workflow line system consisting of location and image recording, which enables the acquisition of workflow information and its analysis and display. From the results of a workflow line investigation, we considered the anticipated effects and the problems for KAIZEN. Workflow line information included location information and action content information. These technologies suggest viewpoints to help improvement, for example, the exclusion of useless movement, the redesign of layout and the review of work procedures. In a manufacturing factory, it was clear that there was much movement away from the standard operation place and accumulated residence time. As a concrete result of this investigation, a more efficient layout was suggested by the system. In the case of the hospital, similarly, it was pointed out that the workflow has problems of layout and setup operations, based on the effective movement patterns of the experts. This system could adapt to routine as well as non-routine work. Through the development of this system, which can fit and adapt to industrial diversification, more effective "visual management" (visualization of work) is expected in the future. PMID:22317594

  20. Integration of the radiotherapy irradiation planning in the digital workflow

    International Nuclear Information System (INIS)

    Background and purpose: At the Clinic of Radiotherapy at the University Hospital Freiburg, all relevant workflow is paperless. After implementing the Operating Schedule System (OSS) as a framework, all processes are being implemented into the departmental system MOSAIQ. Designing a digital workflow for radiotherapy irradiation planning is a large challenge; it requires interdisciplinary expertise, and therefore the interfaces between the professions also have to be interdisciplinary. For every single step of radiotherapy irradiation planning, distinct responsibilities have to be defined and documented. All aspects of digital storage, backup and long-term availability of data were considered and have already been realized during the OSS project. Method: After an analysis of the complete workflow and the statutory requirements, a detailed project plan was designed. In an interdisciplinary workgroup, problems were discussed and a detailed flowchart was developed. The new functionalities were implemented in a testing environment by the Clinical and Administrative IT Department (CAI). After extensive tests they were integrated into the new modular department system. Results and conclusion: The Clinic of Radiotherapy succeeded in realizing a completely digital workflow for radiotherapy irradiation planning. During the testing phase, our digital workflow was examined and afterwards approved by the responsible authority. (orig.)

  1. Trends in Use of Scientific Workflows: Insights from a Public Repository and Recommendations for Best Practice

    Directory of Open Access Journals (Sweden)

    Richard Littauer

    2012-12-01

    Full Text Available Scientific workflows are typically used to automate the processing, analysis and management of scientific data. Most scientific workflow programs provide a user-friendly graphical user interface that enables scientists to more easily create and visualize complex workflows that may comprise dozens of processing and analytical steps. Furthermore, many workflows provide mechanisms for tracing provenance and methodologies that foster reproducible science. Despite their potential for enabling science, few studies have examined how the process of creating, executing, and sharing workflows can be improved. In order to promote open discourse and access to scientific methods as well as data, we analyzed a wide variety of workflow systems and publicly available workflows on the public repository myExperiment. It is hoped that understanding the usage of workflows and developing a set of recommended best practices will lead to increased contribution of workflows to the public domain.

  2. CloudWF: A Computational Workflow System for Clouds Based on Hadoop

    Science.gov (United States)

    Zhang, Chen; de Sterck, Hans

    This paper describes CloudWF, a scalable and lightweight computational workflow system for clouds on top of Hadoop. CloudWF can run workflow jobs composed of multiple Hadoop MapReduce or legacy programs. Its novelty lies in several aspects: a simple workflow description language that encodes workflow blocks and block-to-block dependencies separately as standalone executable components; a new workflow storage method that uses Hadoop HBase sparse tables to store workflow information internally and reconstruct workflow block dependencies implicitly for efficient workflow execution; transparent file staging with Hadoop DFS; and decentralized workflow execution management relying on the MapReduce framework for task scheduling and fault tolerance. This paper describes the design and implementation of CloudWF.
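
    The separation of workflow blocks and block-to-block dependencies into standalone components can be pictured with a short sketch (hypothetical structures; CloudWF itself stores their equivalents in HBase sparse tables and delegates execution to MapReduce):

      # Hypothetical encoding of workflow blocks and dependencies as standalone
      # components; an execution order is reconstructed implicitly from them.
      from dataclasses import dataclass

      @dataclass
      class Block:
          name: str
          command: str                 # a MapReduce job or legacy program

      @dataclass
      class Dependency:
          src: str                     # upstream block
          dst: str                     # downstream block

      blocks = [Block("extract", "hadoop jar extract.jar"),
                Block("analyze", "hadoop jar analyze.jar"),
                Block("report", "legacy_report.sh")]
      deps = [Dependency("extract", "analyze"), Dependency("analyze", "report")]

      def execution_order(blocks, deps):
          """Topologically order blocks from the stored dependency components."""
          pending = {b.name: {d.src for d in deps if d.dst == b.name} for b in blocks}
          order = []
          while pending:
              ready = [n for n, preds in pending.items() if not preds]
              if not ready:
                  raise ValueError("dependency cycle")
              for n in ready:
                  order.append(n)
                  del pending[n]
                  for preds in pending.values():
                      preds.discard(n)
          return order

      print(execution_order(blocks, deps))   # ['extract', 'analyze', 'report']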

  3. Handwriting Automaticity: The Search for Performance Thresholds

    Science.gov (United States)

    Medwell, Jane; Wray, David

    2014-01-01

    Evidence is accumulating that handwriting has an important role in written composition. In particular, handwriting automaticity appears to relate to success in composition. This relationship has been little explored in British contexts and we currently have little idea of what threshold performance levels might be. In this paper, we report on two…

  4. Brain-Wide Mapping of Axonal Connections: Workflow for Automated Detection and Spatial Analysis of Labeling in Microscopic Sections.

    Science.gov (United States)

    Papp, Eszter A; Leergaard, Trygve B; Csucs, Gergely; Bjaalie, Jan G

    2016-01-01

    Axonal tracing techniques are powerful tools for exploring the structural organization of neuronal connections. Tracers such as biotinylated dextran amine (BDA) and Phaseolus vulgaris leucoagglutinin (Pha-L) allow brain-wide mapping of connections through analysis of large series of histological section images. We present a workflow for efficient collection and analysis of tract-tracing datasets with a focus on newly developed modules for image processing and assignment of anatomical location to tracing data. New functionality includes automatic detection of neuronal labeling in large image series, alignment of images to a volumetric brain atlas, and analytical tools for measuring the position and extent of labeling. To evaluate the workflow, we used high-resolution microscopic images from axonal tracing experiments in which different parts of the rat primary somatosensory cortex had been injected with BDA or Pha-L. Parameters from a set of representative images were used to automate detection of labeling in image series covering the entire brain, resulting in binary maps of the distribution of labeling. For high to medium labeling densities, automatic detection was found to provide reliable results when compared to manual analysis, whereas weak labeling required manual curation for optimal detection. To identify brain regions corresponding to labeled areas, section images were aligned to the Waxholm Space (WHS) atlas of the Sprague Dawley rat brain (v2) by custom-angle slicing of the MRI template to match individual sections. Based on the alignment, WHS coordinates were obtained for labeled elements and transformed to stereotaxic coordinates. The new workflow modules increase the efficiency and reliability of labeling detection in large series of images from histological sections, and enable anchoring to anatomical atlases for further spatial analysis and comparison with other data. PMID:27148038

  5. Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows

    Science.gov (United States)

    Wilson, B. D.; Ramachandran, R.; Lynnes, C.

    2009-05-01

    A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community an unprecedented opportunity to collaborate by sharing their workflows with one another, reproducing and analyzing research results, and leveraging colleagues' expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions have come together to improve community collaboration in science analysis by developing a customizable "software appliance" to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like Finnish "talkoot" (a barn raising), to build a shared research space. Talkoot extends a freely available, open source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a "science story" in shareable web notebooks, including plots or animations, backed up by an executable workflow that directly reproduces the science analysis. New services and workflows of interest will be

  6. Hybrid Workflow Policy Management for Heart Disease Identification

    CERN Document Server

    Kim, Dong-Hyun; Youn, Chan-Hyun

    2010-01-01

    As science technology grows, medical applications are becoming more complex to solve physiological problems within the expected time. Workflow management systems (WMS) in Grid computing are a promising solution to solve sophisticated problems such as genomic analysis, drug discovery, disease identification, etc. Although existing WMS can provide basic management functionality in a Grid environment, consideration of user requirements such as performance, reliability and interaction with the user is missing. In this paper, we propose a hybrid workflow management system for heart disease identification and discuss how to guarantee different user requirements according to the user SLA. The proposed system is applied to the Physio-Grid e-health platform to identify human heart disease with ECG analysis and Virtual Heart Simulation (VHS) workflow applications.

  7. A Hybrid Authorization Model For Project-Oriented Workflow

    Institute of Scientific and Technical Information of China (English)

    Zhang Xiaoguang(张晓光); Cao Jian; Zhang Shensheng

    2003-01-01

    In the context of workflow systems, the security-relevant aspect is the assignment of activities to (human or automated) agents. This paper casts light on the management of project-oriented workflow. A comprehensive authorization model is proposed from the perspective of project management. In this model, the concepts of activity decomposition and teams are introduced, which improves the security of conventional role-based access control. Furthermore, policies are provided to define static and dynamic constraints such as Separation of Duty (SoD). Validity of constraints is proposed to provide fine-grained assignment, which improves the performance of policy management. The model is applicable not only to project-oriented workflow applications but also to other teamwork environments such as the virtual enterprise.
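
    The static Separation of Duty constraint mentioned above has a direct reading: no single agent may be assigned two mutually exclusive activities. A minimal sketch of such a check (activity and agent names are hypothetical):

      # Static Separation of Duty (SoD) check; names are illustrative only.
      sod_pairs = {("prepare_payment", "approve_payment")}   # mutually exclusive
      assignments = {}                                       # activity -> agent

      def assign(activity, agent):
          """Assign an activity to an agent unless an SoD constraint is violated."""
          for a, b in sod_pairs:
              other = b if activity == a else a if activity == b else None
              if other is not None and assignments.get(other) == agent:
                  raise PermissionError(f"SoD violation: {agent} already holds '{other}'")
          assignments[activity] = agent

      assign("prepare_payment", "alice")
      assign("approve_payment", "bob")      # allowed
      # assign("approve_payment", "alice")  # would raise PermissionError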

  8. Implementation of the electronic DDA workflow for NSSS system design

    International Nuclear Information System (INIS)

    For improving NSSS design quality and productivity, several cases of nuclear-developed nations' integrated management systems, such as Mitsubishi's NUWINGS (Japan), AECL's CANDID (Canada) and Duke Power's (USA), were investigated, and the system implementation of NSSS design document computerization and the major workflow process of the DDA (Document Distribution for Agreement) were studied in this report. On the basis of the requirements of design document computerization, which covered preparation, review, approval and distribution of the engineering documents, the KAERI Engineering Information Management System (KEIMS) was implemented. The major results of this report are the implementation of a GUI panel for input and retrieval of document index information, the setup of an electronic document workflow, and the provision of quality assurance verification by tracing the workflow history. The major effects of NSSS design document computerization are improved efficiency and reliability and reduced engineering cost by means of fast document verification capability and an electronic document transfer system. 2 tabs., 16 figs., 9 refs. (Author)

  9. Hybrid Workflow Policy Management for Heart Disease Identification

    Directory of Open Access Journals (Sweden)

    Dong-Hyun Kim

    2009-12-01

    Full Text Available As science technology grows, medical applications are becoming more complex to solve physiological problems within the expected time. Workflow management systems (WMS) in Grid computing are a promising solution to solve sophisticated problems such as genomic analysis, drug discovery, disease identification, etc. Although existing WMS can provide basic management functionality in a Grid environment, consideration of user requirements such as performance, reliability and interaction with the user is missing. In this paper, we propose a hybrid workflow management system for heart disease identification and discuss how to guarantee different user requirements according to the user SLA. The proposed system is applied to the Physio-Grid e-health platform to identify human heart disease with ECG analysis and Virtual Heart Simulation (VHS) workflow applications.

  10. Implementation of Workflow Management System for Collaborative Process Planning

    Directory of Open Access Journals (Sweden)

    Su Ying-Ying

    2013-01-01

    Full Text Available Workflow management systems have generally been accepted as a paradigm for supporting processes in complex organizations. Since process planning is a huge and complex task, several process planners should execute the planning together. Collaborative process planning is essential for saving time and cost in process planning through concurrent and collaborative engineering. Workflow technology, as an important branch of computer-supported cooperative work, has strong advantages in organization management and flow optimization. In this research, the structure and business flow of collaborative process planning are analyzed. The functions of a workflow management system for collaborative process planning are illustrated, and the system is implemented to effectively control and manage the flow of process planning.

  11. Dynamic Service Selection in Workflows Using Performance Data

    Directory of Open Access Journals (Sweden)

    David W. Walker

    2007-01-01

    Full Text Available An approach to dynamic workflow management and optimisation using near-realtime performance data is presented. Strategies are discussed for choosing an optimal service (based on user-specified criteria from several semantically equivalent Web services. Such an approach may involve finding "similar" services, by first pruning the set of discovered services based on service metadata, and subsequently selecting an optimal service based on performance data. The current implementation of the prototype workflow framework is described, and demonstrated with a simple workflow. Performance results are presented that show the performance benefits of dynamic service selection. A statistical analysis based on the first order statistic is used to investigate the likely improvement in service response time arising from dynamic service selection.
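
    The two-stage selection described above (prune by metadata, then pick the best performer) can be sketched compactly; the services and timings below are hypothetical stand-ins for the monitoring data the framework gathers:

      # Sketch of dynamic service selection among semantically equivalent services.
      # Choosing the fastest of n candidates is the first order statistic analyzed
      # in the paper; the expected improvement grows with the number of candidates.
      services = [
          {"url": "http://a.example/align", "operation": "align", "mean_response_s": 2.1},
          {"url": "http://b.example/align", "operation": "align", "mean_response_s": 1.4},
          {"url": "http://c.example/blast", "operation": "blast", "mean_response_s": 0.9},
      ]

      def select_service(operation, candidates):
          """Prune by service metadata, then pick the best observed performer."""
          similar = [s for s in candidates if s["operation"] == operation]
          return min(similar, key=lambda s: s["mean_response_s"])

      print(select_service("align", services)["url"])   # http://b.example/align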

  12. A HYBRID PETRI-NET MODEL OF GRID WORKFLOW

    Institute of Scientific and Technical Information of China (English)

    Ji Yimu; Wang Ruchuan; Ren Xunyi

    2008-01-01

    In order to effectively control the random tasks submitted and executed in a grid workflow, a grid workflow model based on a hybrid Petri net is presented. This model is composed of a random Petri net, a colored Petri net and a general Petri net. The random Petri net describes the relationship between the number of grid users' random tasks and the size of the service window, and computes the service intensity of the grid system. The colored Petri net assigns a different color to each place holding grid services and provides valid interfaces for grid resource allocation and task scheduling. The experiment indicated that the model presented in this letter could compute the relationship between the number of users' random tasks and the size of the grid service window in a grid workflow management system.
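
    For readers unfamiliar with the formalism, the basic firing rule that all three net variants build on can be stated in a few lines (plain Petri net only; the random and colored extensions add stochastic arrivals and typed tokens on top):

      # Basic Petri net firing rule (the paper's hybrid model layers random and
      # colored nets over this mechanism).
      marking = {"submitted": 2, "service_window": 1, "running": 0}
      transitions = {"start_task": {"in": ["submitted", "service_window"],
                                    "out": ["running"]}}

      def fire(name):
          t = transitions[name]
          if all(marking[p] > 0 for p in t["in"]):   # enabled only if all inputs hold tokens
              for p in t["in"]:
                  marking[p] -= 1                    # consume input tokens
              for p in t["out"]:
                  marking[p] += 1                    # produce output tokens
              return True
          return False

      fire("start_task")
      print(marking)   # {'submitted': 1, 'service_window': 0, 'running': 1}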

  13. Nexus: A modular workflow management system for quantum simulation codes

    Science.gov (United States)

    Krogel, Jaron T.

    2016-01-01

    The management of simulation workflows represents a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.
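
    The flavor of a transparent, text-based workflow description can be conveyed with a small sketch. This is a hypothetical format and parser, not the actual Nexus interface; block names and fields are illustrative:

      # Hypothetical text-style workflow description and parser (not Nexus's API).
      workflow_text = """
      relax: code=quantum_espresso input=relax.in depends=
      scf:   code=quantum_espresso input=scf.in   depends=relax
      qmc:   code=qmcpack          input=qmc.in   depends=scf
      """

      def parse_workflow(text):
          """Parse 'name: key=value ...' lines into job records with dependencies."""
          jobs = {}
          for line in filter(None, map(str.strip, text.splitlines())):
              name, rest = line.split(":", 1)
              fields = dict(kv.split("=", 1) for kv in rest.split())
              fields["depends"] = [d for d in fields["depends"].split(",") if d]
              jobs[name] = fields
          return jobs

      for name, job in parse_workflow(workflow_text).items():
          print(name, "depends on", job["depends"])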

  14. ESO Reflex: A Graphical Workflow Engine for Astronomical Data Reduction

    Science.gov (United States)

    Hook, Richard; Romaniello, Martino; Ullgrén, Marko; Maisala, Sami; Solin, Otto; Oittinen, Tero; Savolainen, Villa; Järveläinen, Pekka; Tyynelä, Jani; Péron, Michèle; Izzo, Carlo; Ballester, Pascal; Gabasch, Armin

    2008-03-01

    ESO Reflex is a software tool that provides a novel approach to astronomical data reduction. The reduction sequence is rendered and controlled as a graphical workflow. Users can follow and interact with the processing in an intuitive manner, without the need for complex scripting. The graphical interface also allows the modification of existing workflows and the creation of new ones. ESO Reflex can invoke standard ESO data reduction recipes in a flexible way. Python scripts, IDL procedures and shell commands can also be easily brought into workflows and a variety of visualisation and display options, including custom product inspection and validation steps, are available. ESO Reflex was developed in the context of the Sampo project, a three-year effort led by ESO and conducted by a software development team from Finland as an in-kind contribution to joining ESO. It is planned that the software will be released to the community in late 2008.

  15. Scientific Workflow Systems for 21st Century e-Science, New Bottle or New Wine?

    CERN Document Server

    Zhao, Yong; Foster, Ian

    2008-01-01

    With the advances in e-Sciences and the growing complexity of scientific analyses, more and more scientists and researchers are relying on workflow systems for process coordination, derivation automation, provenance tracking, and bookkeeping. While workflow systems have been in use for decades, it is unclear whether scientific workflows can or even should build on existing workflow technologies, or whether they require fundamentally new approaches. In this paper, we analyze the status and challenges of scientific workflows, investigate both existing technologies and emerging languages, platforms and systems, and identify the key challenges that must be addressed by workflow systems for e-science in the 21st century.

  16. Science Gateways, Scientific Workflows and Open Community Software

    Science.gov (United States)

    Pierce, M. E.; Marru, S.

    2014-12-01

    Science gateways and scientific workflows occupy different ends of the spectrum of user-focused cyberinfrastructure. Gateways, sometimes called science portals, provide a way for enabling large numbers of users to take advantage of advanced computing resources (supercomputers, advanced storage systems, science clouds) by providing Web and desktop interfaces and supporting services. Scientific workflows, at the other end of the spectrum, support advanced usage of cyberinfrastructure that enables "power users" to undertake computational experiments that are not easily done through the usual mechanisms (managing simulations across multiple sites, for example). Despite these different target communities, gateways and workflows share many similarities and can potentially be accommodated by the same software system. For example, pipelines to process InSAR imagery sets or to datamine GPS time series data are workflows. The results and the ability to make downstream products may be made available through a gateway, and power users may want to provide their own custom pipelines. In this abstract, we discuss our efforts to build an open source software system, Apache Airavata, that can accommodate both gateway and workflow use cases. Our approach is general, and we have applied the software to problems in a number of scientific domains. In this talk, we discuss our applications to usage scenarios specific to earth science, focusing on earthquake physics examples drawn from the QuakSim.org and GeoGateway.org efforts. We also examine the role of the Apache Software Foundation's open community model as a way to build up common community codes that do not depend upon a single "owner" to sustain them. Pushing beyond open source software, we also see the need to provide gateways and workflow systems as cloud services. These services centralize operations, provide well-defined programming interfaces, scale elastically, and have global-scale fault tolerance. We discuss our work providing

  17. Exformatics Declarative Case Management Workflows as DCR Graphs

    DEFF Research Database (Denmark)

    Slaats, Tijs; Mukkamala, Raghava Rao; Hildebrandt, Thomas;

    2013-01-01

    Declarative workflow languages have been a growing research subject over the past ten years, but applications of the declarative approach in industry are still uncommon. Over the past two years Exformatics A/S, a Danish provider of Electronic Case Management systems, has been cooperating with researchers at the IT University of Copenhagen (ITU) to create tools for the declarative workflow language Dynamic Condition Response Graphs (DCR Graphs) and incorporate them into their products and into teaching at ITU. In this paper we give a status report on this work. We start with an informal introduction to

  18. Contextual cloud-based service oriented architecture for clinical workflow.

    Science.gov (United States)

    Moreno-Conde, Jesús; Moreno-Conde, Alberto; Núñez-Benjumea, Francisco J; Parra-Calderón, Carlos

    2015-01-01

    Regarding the acceptance of systems within the healthcare domain, multiple papers have highlighted the importance of integrating tools with the clinical workflow. This paper analyses how clinical context management could be deployed in order to promote the adoption of advanced cloud services within the clinical workflow. This deployment can be integrated with the specifications promoted by the eHealth European Interoperability Framework. This paper proposes a cloud-based service-oriented architecture. This architecture implements a context management system aligned with the HL7 standard known as CCOW. PMID:25991217

  19. Biologically Inspired Execution Framework for Vulnerable Workflow Systems

    CERN Document Server

    Safdar, Sohail; Qureshi, Muhammad Aasim; Akbar, Rehan

    2009-01-01

    The main objective of this research is to introduce a biologically inspired execution framework for workflow systems under threat from an intrusion attack. Usually, vulnerable systems need to be stopped and put into a wait state to ensure data security and privacy while they are recovered. This research ensures the availability of services and data to the end user while keeping data security, privacy and integrity intact. To achieve these goals, the behavior of chameleons and the concept of hibernation have been considered in combination. The workflow systems thus become more robust using biologically inspired methods and remain safely available to business consumers even in a vulnerable state.

  20. Design decisions in workflow management and quality of work.

    OpenAIRE

    Waal, B.M.E. de; Batenburg, R.

    2009-01-01

    In this paper, the design and implementation of a workflow management (WFM) system in a large Dutch social insurance organisation is described. The effect of workflow design decisions on the quality of work is explored theoretically and empirically, using the model of Zur Mühlen as a frame of reference. It was found among a total sample of 66 employees that there was no change in the experience of work quality before and after the introduction of the WFM system. There are, however, significant...

  1. Putting Lipstick on Pig: Enabling Database-style Workflow Provenance

    OpenAIRE

    Amsterdamer, Yael; Davidson, Susan B.; Deutch, Daniel; Milo, Tova; Stoyanovich, Julia; Tannen, Val

    2011-01-01

    Workflow provenance typically assumes that each module is a "black-box", so that each output depends on all inputs (coarse-grained dependencies). Furthermore, it does not model the internal state of a module, which can change between repeated executions. In practice, however, an output may depend on only a small subset of the inputs (fine-grained dependencies) as well as on the internal state of the module. We present a novel provenance framework that marries database-style and workflow-style...

  2. Evaluating data caching techniques in DMCF workflows using Hercules

    OpenAIRE

    Rodrigo Duro, Francisco José; Marozzo, Fabrizio; García Blas, Javier; Carretero Pérez, Jesús; Talia, Domenico; Trunfio, Paolo

    2015-01-01

    The Data Mining Cloud Framework (DMCF) is an environment for designing and executing data analysis workflows in cloud platforms. Currently, DMCF relies on the default storage of the public cloud provider for any I/O related operation. This implies that the I/O performance of DMCF is limited by the performance of the default storage. In this work we propose the usage of the Hercules system within DMCF as an ad-hoc storage system for temporary data produced inside workflow-based applications. H...

  3. Public Clouds Work Sharing Exact Workflows in Time Limits

    Directory of Open Access Journals (Sweden)

    C.Thamizhannai

    2014-12-01

    Full Text Available We present an algorithm that uses the idle time of provisioned resources and the budget surplus to replicate tasks, so that deadlines are met and the total execution time of applications is reduced as the budget available for replication increases. A workflow is described by its tasks, the data transfer times between tasks running in different VMs, and the execution times of the tasks on three different VM types (labeled S1, S2, and S3); in the example considered, the deadline for execution of the workflow is 30 time units and the allocation interval is 10 time units. Previous work proposed two algorithms for cost-optimized, deadline-constrained execution of workflows in Clouds; in these settings, the IC-PCP algorithm is the state-of-the-art algorithm for provisioning and scheduling of workflows in Clouds. The goal of the proposed Enhanced IC-PCP with Replication (EIPR) algorithm is to increase the likelihood of completing the execution of a scientific workflow application within a user-defined deadline in a public Cloud environment. Task scheduling determines the type of VMs to be used for workflow execution as well as the start and finish time of each VM (provisioning); placement of tasks covers not only the data transfer start and end times of scheduled tasks, but also the data transfers to the first scheduled task and from the last scheduled task; task replication prepares virtual machines to receive data and tasks at the moments estimated during the scheduling process. We introduce new criteria for ranking candidate tasks for replication, as well as workflow structure-aware scheduling of replicas, where the structure of the workflow application is considered not only during the selection of candidates for replication but also during the replicas' scheduling. We will also investigate how the replication-based approach can be used when the provisioning and scheduling process is
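
    The core replication decision (spend the budget surplus on duplicates of the most deadline-critical tasks) can be sketched as follows; the task data are hypothetical, and the real EIPR ranking and replica scheduling are considerably richer:

      # Budget-driven task replication in the spirit of EIPR (illustrative only).
      tasks = [
          {"name": "t1", "slack_s": 2.0, "replica_cost": 1.0},
          {"name": "t2", "slack_s": 0.5, "replica_cost": 2.0},  # tightest deadline slack
          {"name": "t3", "slack_s": 5.0, "replica_cost": 1.5},
      ]

      def plan_replicas(tasks, budget_surplus):
          """Replicate the tightest-slack tasks first while budget surplus remains."""
          replicas = []
          for task in sorted(tasks, key=lambda t: t["slack_s"]):
              if task["replica_cost"] <= budget_surplus:
                  replicas.append(task["name"])
                  budget_surplus -= task["replica_cost"]
          return replicas

      print(plan_replicas(tasks, budget_surplus=3.0))   # ['t2', 't1']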

  4. Verification of Timed Healthcare Workflows Using Component Timed-Arc Petri Nets

    DEFF Research Database (Denmark)

    Bertolini, Cristiano; Liu, Zhiming; Srba, Jiri

    2013-01-01

    Workflows in modern healthcare systems are becoming increasingly complex and their execution involves concurrency and sharing of resources. The definition, analysis and management of collaborative healthcare workflows requires abstract model notations with a precisely defined semantics and a...

  5. Automatic generation of 2D micromechanical finite element model of silicon–carbide/aluminum metal matrix composites: Effects of the boundary conditions

    DEFF Research Database (Denmark)

    Qing, Hai

    2013-01-01

    brittle damage model are developed within Abaqus/Standard Subroutine USDFLD, respectively. An Abaqus/Standard Subroutine MPC, which allows defining multi-point constraints, is developed to realize the symmetric boundary condition (SBC) and periodic boundary condition (PBC). A series of computational experiments are performed to study the influence of boundary condition, particle number and volume fraction of the representative volume element (RVE) on composite stiffness and strength properties.

  6. Styx Grid Services: Lightweight Middleware for Efficient Scientific Workflows

    Directory of Open Access Journals (Sweden)

    J.D. Blower

    2006-01-01

    Full Text Available The service-oriented approach to performing distributed scientific research is potentially very powerful but is not yet widely used in many scientific fields. This is partly due to the technical difficulties involved in creating services and workflows and the inefficiency of many workflow systems with regard to handling large datasets. We present the Styx Grid Service, a simple system that wraps command-line programs and allows them to be run over the Internet exactly as if they were local programs. Styx Grid Services are very easy to create and use and can be composed into powerful workflows with simple shell scripts or more sophisticated graphical tools. An important feature of the system is that data can be streamed directly from service to service, significantly increasing the efficiency of workflows that use large data volumes. The status and progress of Styx Grid Services can be monitored asynchronously using a mechanism that places very few demands on firewalls. We show how Styx Grid Services can interoperate with Web Services and WS-Resources using suitable adapters.
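
    The central idea of wrapping a command-line program so that its output streams directly into the next service can be mimicked locally with pipes (a Python stand-in using Unix tools; the actual system exposes programs over the network via the Styx protocol):

      # Local stand-in for service-to-service streaming (Unix 'seq' and 'sort').
      import subprocess

      producer = subprocess.Popen(["seq", "1", "5"], stdout=subprocess.PIPE)
      consumer = subprocess.Popen(["sort", "-rn"], stdin=producer.stdout,
                                  stdout=subprocess.PIPE, text=True)
      producer.stdout.close()           # allow the producer to receive SIGPIPE
      print(consumer.communicate()[0])  # 5..1, streamed without a temporary file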

  7. SURVEY OF WORKFLOW ANALYSIS IN PAST AND PRESENT ISSUES

    Directory of Open Access Journals (Sweden)

    SARAVANAN .M.S,

    2011-06-01

    Full Text Available This paper surveys workflow analysis from the business process view for all organizations. A business can be defined as an organization that provides goods and services to others who want or need them. The concept of managing business processes is referred to as Business Process Management (BPM). A workflow is the automation of a business process, in whole or part, during which documents, information or tasks are passed from one participant to another for action, according to a set of procedural rules. Process mining aims at extracting useful and meaningful information from event logs, which are sets of real executions of business processes in an organization. This paper briefly reviews the state-of-the-art of business processes developed so far and the techniques adopted. The survey of workflow analysis from the business process view can be broadly classified into four major categories: Business Process Modeling, Ontology-based Business Process Management, Workflow-based Business Process Controlling and Business Process Mining.

  8. CrossFlow: Integrating Workflow Management and Electronic Commerce

    NARCIS (Netherlands)

    Hoffner, Y.; Ludwig, H.; Grefen, P.; Aberer, K.

    2001-01-01

    The CrossFlow architecture provides support for cross-organisational workflow management in dynamically established virtual enterprises. The creation of a business relationship between a service provider organisation performing a service on behalf of a consumer organisation can be made dynamic when

  9. Conceptual framework and architecture for service mediating workflow management

    NARCIS (Netherlands)

    Hu, Jinmin; Grefen, Paul

    2002-01-01

    Electronic service outsourcing creates a new paradigm for automated enterprise collaboration. The service-oriented paradigm requires a high level of flexibility of current workflow management systems and support for Business-to-Business (B2B) collaboration to realize collaborative enterprises. This

  10. Putting Lipstick on Pig: Enabling Database-style Workflow Provenance

    CERN Document Server

    Amsterdamer, Yael; Deutch, Daniel; Milo, Tova; Stoyanovich, Julia; Tannen, Val

    2012-01-01

    Workflow provenance typically assumes that each module is a "black-box", so that each output depends on all inputs (coarse-grained dependencies). Furthermore, it does not model the internal state of a module, which can change between repeated executions. In practice, however, an output may depend on only a small subset of the inputs (fine-grained dependencies) as well as on the internal state of the module. We present a novel provenance framework that marries database-style and workflow-style provenance, by using Pig Latin to expose the functionality of modules, thus capturing internal state and fine-grained dependencies. A critical ingredient in our solution is the use of a novel form of provenance graph that models module invocations and yields a compact representation of fine-grained workflow provenance. It also enables a number of novel graph transformation operations, allowing to choose the desired level of granularity in provenance querying (ZoomIn and ZoomOut), and supporting "what-if" workflow analyti...

  11. Content and Workflow Management for Library Websites: Case Studies

    Science.gov (United States)

    Yu, Holly, Ed.

    2005-01-01

    Using database-driven web pages or web content management (WCM) systems to manage increasingly diverse web content and to streamline workflows is a commonly practiced solution recognized in libraries today. However, limited library web content management models and funding constraints prevent many libraries from purchasing commercially available…

  12. From Paper Based Clinical Practice Guidelines to Declarative Workflow Management

    DEFF Research Database (Denmark)

    Lyng, Karen Marie; Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2009-01-01

    We present a field study of oncology workflow, involving doctors, nurses and pharmacists at Danish hospitals and discuss the obstacles, enablers and challenges for the use of computer based clinical practice guidelines. Related to the CIGDec approach of Pesic and van der Aalst we then describe how...

  13. Electronic Health Record-Driven Workflow for Diagnostic Radiologists.

    Science.gov (United States)

    Geeslin, Matthew G; Gaskin, Cree M

    2016-01-01

    In most settings, radiologists maintain a high-throughput practice in which efficiency is crucial. The conversion from film-based to digital study interpretation and data storage launched the era of PACS-driven workflow, leading to significant gains in speed. The advent of electronic health records improved radiologists' access to patient data; however, many still find this aspect of workflow to be relatively cumbersome. Nevertheless, the ability to guide a diagnostic interpretation with clinical information, beyond that provided in the examination indication, can add significantly to the specificity of a radiologist's interpretation. Responsibilities of the radiologist include, but are not limited to, protocoling examinations, interpreting studies, chart review, peer review, writing notes, placing orders, and communicating with referring providers. Most of the aforementioned activities are not PACS-centric and require a login to one or more additional applications. Consolidation of these tasks for completion through a single interface can simplify workflow, save time, and potentially reduce the incidence of errors. Here, the authors describe diagnostic radiology workflow that leverages the electronic health record to significantly add to a radiologist's ability to be part of the health care team, provide relevant interpretations, and improve efficiency and quality. PMID:26603098

  14. AN APPROACH TO E-WORKFLOW SYSTEMS WITH THE USE OF PATTERNS

    OpenAIRE

    John Ndeta; Stamatia A. Katriou; Siakas, Kerstin V.

    2015-01-01

    In today’s highly competitive and rapidly changing environment, e-businesses constantly have to modify their business processes, i.e. the flow of documents and tasks in a business also known as workflow. More flexible Workflow Management Systems are required to support these constantly changing processes. In this research a platform independent architecture for the design of e-workflow systems is illustrated. The architecture includes an information pool, namely a Workflow Pattern Repository,...

  15. Patient-centered care requires a patient-oriented workflow model

    OpenAIRE

    Ozkaynak, Mustafa; Flatley Brennan, Patricia; Hanauer, David A.; Johnson, Sharon; Aarts, Jos; Zheng, Kai; Haque, Saira N.

    2013-01-01

    Effective design of health information technology (HIT) for patient-centered care requires consideration of workflow from the patient's perspective, termed ‘patient-oriented workflow.’ This approach organizes the building blocks of work around the patients who are moving through the care system. Patient-oriented workflow complements the more familiar clinician-oriented workflow approaches, and offers several advantages, including the ability to capture simultaneous, cooperative work, which is...

  16. Modeling, Design, and Implementation of a Cloud Workflow Engine Based on Aneka

    OpenAIRE

    2014-01-01

    This paper presents a Petri net-based model for cloud workflow which plays a key role in industry. Three kinds of parallelisms in cloud workflow are characterized and modeled. Based on the analysis of the modeling, a cloud workflow engine is designed and implemented in Aneka cloud environment. The experimental results validate the effectiveness of our approach of modeling, design, and implementation of cloud workflow.

  17. Design of an Integrated Role-Based Access Control Infrastructure for Adaptive Workflow Systems

    OpenAIRE

    C Narendra, Nanjangud

    2003-01-01

    With increasing numbers of organizations automating their business processes by using workflow systems, security aspects of workflow systems has become a heavily researched area. Also, most workflow processes nowadays need to be adaptive, i.e., constantly changing, to meet changing business conditions. However, little attention has been paid to integrating Security and Adaptive Workflow. In this paper, we investigate this important research topic, with emphasis on Role Based Access Control (R...

  18. Resource scheduling of workflow multi-instance migration based on the shuffled leapfrog algorithm

    OpenAIRE

    Yang Mingshun; Gao Xinqin; Cao Yuan; Liu Yong; Li Yan

    2015-01-01

    Purpose: When a workflow changes, resource scheduling optimization during the migration of currently running instances has become a hot issue in research on workflow flexibility; the purpose of this article is to investigate the resource scheduling problem of workflow multi-instance migration. Design/methodology/approach: The time and cost relationships between activities and resources in the workflow instance migration process are analyzed and a resource scheduling optimiza...

  19. An integrated development workflow for community-driven FOSS-projects using continuous integration tools

    Science.gov (United States)

    Bilke, Lars; Watanabe, Norihiro; Naumov, Dmitri; Kolditz, Olaf

    2016-04-01

    A complex software project with high standards regarding code quality generally requires automated tools to help developers with repetitive and tedious tasks such as compilation on different platforms and configurations, unit testing as well as end-to-end tests, and the generation of distributable binaries and documentation. This is known as continuous integration (CI). A community-driven FOSS project within the Earth Sciences benefits even more from CI, as time and resources for software development are often limited. Testing developed code on more than the developer's PC is therefore a task which is often neglected and where CI can be the solution. We developed an integrated workflow based on GitHub, Travis and Jenkins for the community project OpenGeoSys - a coupled multiphysics modeling and simulation package - allowing developers to concentrate on implementing new features in a tight feedback loop. Every interested developer/user can create a pull request containing source code modifications on the online collaboration platform GitHub. The modifications are checked by the CI system (compilation, compiler warnings, memory leaks, undefined behavior, unit tests, end-to-end tests, analysis of differences in simulation results between changes, etc.), which automatically responds to the pull request, or by email, on success or failure with detailed reports, eventually requesting improvements to the modifications. Core team developers review the modifications and merge them into the main development line once they satisfy agreed standards. We aim for efficient data structures and algorithms, self-explaining code, comprehensive documentation and high test code coverage. This workflow keeps entry barriers to getting involved in the project low and permits an agile development process concentrating on feature additions rather than software maintenance procedures.
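
    The per-pull-request check sequence (configure, compile, test, report) is typically driven by a small script on the CI server; a minimal sketch with hypothetical commands might look like this:

      # Minimal CI check runner (hypothetical commands; real setups encode this
      # in Jenkins/Travis job definitions).
      import subprocess
      import sys

      CHECKS = [
          ("configure", ["cmake", "-B", "build", "-S", "."]),
          ("compile", ["cmake", "--build", "build"]),
          ("unit tests", ["ctest", "--test-dir", "build"]),
      ]

      for name, cmd in CHECKS:
          if subprocess.run(cmd).returncode != 0:
              print(f"FAILED: {name} -> report failure on the pull request")
              sys.exit(1)
      print("all checks passed -> report success on the pull request")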

  20. A Workflow-Oriented Approach To Propagation Models In Heliophysics

    Directory of Open Access Journals (Sweden)

    Gabriele Pierantoni

    2014-01-01

    Full Text Available The Sun is responsible for the eruption of billions of tons of plasma and the generation of near light-speed particles that propagate throughout the solar system and beyond. If directed towards Earth, these events can be damaging to our technological infrastructure. Hence there is an effort to understand the cause of the eruptive events and how they propagate from Sun to Earth. However, the physics governing their propagation is not well understood, so there is a need to develop a theoretical description of their propagation, known as a Propagation Model, in order to predict when they may impact Earth. It is often difficult to define a single propagation model that correctly describes the physics of solar eruptive events, and even more difficult to implement models capable of catering for all these complexities and to validate them using real observational data. In this paper, we envisage that workflows offer both a theoretical and practical framework for a novel approach to propagation models. We define a mathematical framework that aims at encompassing the different modalities with which workflows can be used, and provide a set of generic building blocks written in the TAVERNA workflow language that users can use to build their own propagation models. Finally we test both the theoretical model and the composite building blocks of the workflow with a real Science Use Case that was discussed during the 4th CDAW (Coordinated Data Analysis Workshop) event held by the HELIO project. We show that generic workflow building blocks can be used to construct a propagation model that successfully describes the transit of solar eruptive events toward Earth and predicts a correct Earth-impact time

  1. Workflows for ingest of research data into digital archives - tests with Archivematica

    Science.gov (United States)

    Kirchner, I.; Bertelmann, R.; Gebauer, P.; Hasler, T.; Hirt, M.; Klump, J. F.; Peters-Kotting, W.; Rusch, B.; Ulbricht, D.

    2013-12-01

    Publication of research data and future re-use of measured data require the long-term preservation of digital objects. The ISO OAIS reference model defines responsibilities for long-term preservation of digital objects, and although there is software available to support preservation of digital data, there are still problems remaining to be solved. A key task in preservation is to make the datasets ready for ingest into the archive, which is called the creation of Submission Information Packages (SIPs) in the OAIS model. This includes the creation of appropriate preservation metadata. Scientists need to be trained to deal with different types of data and to heighten their awareness for quality metadata. Other problems arise during the assembly of SIPs and during ingest into the archive because file format validators may produce conflicting output for identical data files, and these conflicts are difficult to resolve automatically. Also, validation and identification tools are notorious for their poor performance. In the project EWIG, Zuse Institute Berlin acts as an infrastructure facility, while the Institute for Meteorology at FU Berlin and the German Research Centre for Geosciences GFZ act as two different data producers. The aim of the project is to develop workflows for the transfer of research data into digital archives and the future re-use of data from long-term archives, with emphasis on data from the geosciences. The technical work is supplemented by interviews with data practitioners at several institutions to identify problems in digital preservation workflows and by the development of university teaching materials to train students in the curation of research data and metadata. The free and open-source software Archivematica [1] is used as the digital preservation system. The creation and ingest of SIPs has to meet several archival standards and be compatible with the Metadata Encoding and Transmission Standard (METS). The two data producers use different

  2. Automatic multimodal real-time tracking for image plane alignment in interventional Magnetic Resonance Imaging

    International Nuclear Information System (INIS)

    Interventional magnetic resonance imaging (MRI) aims at performing minimally invasive percutaneous interventions, such as tumor ablations and biopsies, under MRI guidance. During such interventions, the acquired MR image planes are typically aligned to the surgical instrument (needle) axis and to surrounding anatomical structures of interest in order to efficiently monitor the advancement of the instrument inside the patient's body in real time. Object tracking inside the MRI is expected to facilitate and accelerate MR-guided interventions by allowing the image planes to be aligned automatically to the surgical instrument. In this PhD thesis, an image-based workflow is proposed and refined for automatic image plane alignment. An automatic tracking workflow was developed, performing detection and tracking of a passive marker directly in clinical real-time images. This tracking workflow is designed for fully automated image plane alignment, with minimization of tracking-dedicated time. Its main drawback is its inherent dependence on the slow clinical MRI update rate. First, the addition of motion estimation and prediction with a Kalman filter was investigated and improved the workflow tracking performance. Second, a complementary optical sensor was used for multi-sensor tracking in order to decouple the tracking update rate from the MR image acquisition rate. Performance of the workflow was evaluated with both computer simulations and experiments using an MR-compatible test bed. Results show a high robustness of the multi-sensor tracking approach for dynamic image plane alignment, due to the combination of the individual strengths of each sensor. (author)
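
    The motion estimation and prediction step can be illustrated with a constant-velocity Kalman filter for a single marker coordinate (a toy model bridging slow image updates, not the thesis implementation):

      # Constant-velocity Kalman filter for one marker coordinate (toy example).
      import numpy as np

      dt = 0.5                              # assumed MR image update period (s)
      F = np.array([[1, dt], [0, 1]])       # state transition: [position, velocity]
      H = np.array([[1, 0]])                # only position is measured
      Q = 0.01 * np.eye(2)                  # process noise covariance
      R = np.array([[0.25]])                # measurement noise covariance

      x = np.zeros((2, 1))                  # state estimate
      P = np.eye(2)                         # estimate covariance

      for z in [1.0, 1.6, 2.1, 2.7]:        # noisy marker positions (mm)
          x, P = F @ x, F @ P @ F.T + Q     # predict between acquisitions
          y = np.array([[z]]) - H @ x       # innovation
          K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
          x = x + K @ y
          P = (np.eye(2) - K @ H) @ P

      print(x[0, 0], x[1, 0])               # fused position and velocity estimates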

  3. The Symbiotic Relationship between Scientific Workflow and Provenance (Invited)

    Science.gov (United States)

    Stephan, E.

    2010-12-01

    The purpose of this presentation is to describe the symbiotic nature of scientific workflows and provenance. We will also discuss the current trends and real world challenges facing these two distinct research areas. Although motivated differently, the needs of the international science communities are the glue that binds this relationship together. Understanding and articulating the science drivers to these communities is paramount as these technologies evolve and mature. Originally conceived for managing business processes, workflows are now becoming invaluable assets in both computational and experimental sciences. These reconfigurable, automated systems provide essential technology to perform complex analyses by coupling together geographically distributed disparate data sources and applications. As a result, workflows are capable of higher throughput in a shorter amount of time than performing the steps manually. Today many different workflow products exist; these could include Kepler and Taverna or similar products like MeDICI, developed at PNNL, that are standardized on the Business Process Execution Language (BPEL). Provenance, originating from the French term "provenir" ("to come from"), is used to describe the curation process of artwork as art is passed from owner to owner. The concept of provenance was adopted by digital libraries as a means to track the lineage of documents while standards such as the DublinCore began to emerge. In recent years the systems science community has increasingly expressed the need to expand the concept of provenance to formally articulate the history of scientific data. Communities such as the International Provenance and Annotation Workshop (IPAW) have formalized a provenance data model, the Open Provenance Model, and the W3C is hosting a provenance incubator group featuring the Proof Markup Language. Although both workflows and provenance have risen from different communities and operate independently, their mutual

  4. A Community-Driven Workflow Recommendations and Reuse Infrastructure

    Science.gov (United States)

    Zhang, J.; Votava, P.; Lee, T. J.; Lee, C.; Xiao, S.; Nemani, R. R.; Foster, I.

    2013-12-01

    Aiming to connect the Earth science community to accelerate the rate of discovery, NASA Earth Exchange (NEX) has established an online repository and platform, so that researchers can publish and share their tools and models with colleagues. In recent years, workflow has become a popular technique at NEX for Earth scientists to define executable multi-step procedures for data processing and analysis. The ability to discover and reuse knowledge (sharable workflows or workflow) is critical to the future advancement of science. However, as reported in our earlier study, the reusability of scientific artifacts at current time is very low. Scientists often do not feel confident in using other researchers' tools and utilities. One major reason is that researchers are often unaware of the existence of others' data preprocessing processes. Meanwhile, researchers often do not have time to fully document the processes and expose them to others in a standard way. These issues cannot be overcome by the existing workflow search technologies used in NEX and other data projects. Therefore, this project aims to develop a proactive recommendation technology based on collective NEX user behaviors. In this way, we aim to promote and encourage process and workflow reuse within NEX. Particularly, we focus on leveraging peer scientists' best practices to support the recommendation of artifacts developed by others. Our underlying theoretical foundation is rooted in the social cognitive theory, which declares people learn by watching what others do. Our fundamental hypothesis is that sharable artifacts have network properties, much like humans in social networks. More generally, reusable artifacts form various types of social relationships (ties), and may be viewed as forming what organizational sociologists who use network analysis to study human interactions call a 'knowledge network.' In particular, we will tackle two research questions: R1: What hidden knowledge may be extracted from

  5. Automatic input rectification

    OpenAIRE

    Long, Fan; Ganesh, Vijay; Carbin, Michael James; Sidiroglou, Stelios; Rinard, Martin

    2012-01-01

    We present a novel technique, automatic input rectification, and a prototype implementation, SOAP. SOAP learns a set of constraints characterizing typical inputs that an application is highly likely to process correctly. When given an atypical input that does not satisfy these constraints, SOAP automatically rectifies the input (i.e., changes the input so that it satisfies the learned constraints). The goal is to automatically convert potentially dangerous inputs into typical inputs that the ...

  6. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu

    2013-11-01

    Full Text Available Policies or institutions (built into an economic system) that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, progressive income tax automatically reduces the money supply as incomes and spending rise. Similarly, in recessionary times, payment of unemployment benefits injects more money into the system and stimulates demand. Also called automatic stabilizers or built-in stabilizers.

  7. A Workflow-based RBAC Model for Web Services in Multiple Autonomous Domains

    Directory of Open Access Journals (Sweden)

    Zhenwu WANG

    2013-03-01

    Full Text Available A workflow-based RBAC model for web services (WFRBAC4WS) is proposed in this paper. The model organizes web services in different autonomous domains through a workflow mechanism, and maps the RBAC model to the tasks of the workflow model. The paper details the authorization procedure of the WFRBAC4WS model, its lifetime management, its extension of authorization constraints, and formal descriptions of the proposed model. Compared with other RBAC models for web services, this model not only combines the RBAC model with workflow, but also describes the interactions between the workflow mechanism and the RBAC model in a web services environment; the authorization work of this model is dynamic and comprehensive.
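
    The mapping of RBAC roles onto workflow tasks can be shown compactly (role, user and task names are hypothetical; the model adds lifetime management and authorization constraints on top):

      # Sketch of RBAC roles mapped to workflow tasks (illustrative names).
      role_permissions = {
          "clerk": {"submit_claim"},
          "adjuster": {"assess_claim"},
          "manager": {"assess_claim", "approve_claim"},
      }
      user_roles = {"alice": {"clerk"}, "bob": {"manager"}}

      def authorized(user, task):
          """A user may perform a workflow task if any of their roles permits it."""
          return any(task in role_permissions[r] for r in user_roles.get(user, ()))

      print(authorized("alice", "submit_claim"))   # True
      print(authorized("alice", "approve_claim"))  # False
      print(authorized("bob", "approve_claim"))    # True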

  8. Using SharePoint server for managing workflow in complex projects

    OpenAIRE

    ORAČ, ROMAN

    2011-01-01

    The aim of the thesis is to find efficient ways of using SharePoint server in an organization and to study in detail the use of workflows for project management. For this purpose, we set up a SharePoint server, which serves as a test environment, and design a workflow. SharePoint Server uses the concept of a workflow for project management. Workflows can consistently manage common business processes within an organization. A workflow can be described as a series of tasks that produces a result. Wor...

  9. Networked Print Production: Does JDF Provide a Perfect Workflow?

    Directory of Open Access Journals (Sweden)

    Bernd Zipper

    2004-12-01

    Full Text Available The "networked printing works" is a well-worn slogan used by many providers in the graphics industry, and for the past number of years printing-works manufacturers have been working on the goal of achieving the "networked printing works". A turning point from the concept to real implementation can now be expected at drupa 2004: JDF (Job Definition Format) and thus "networked production" will form the center of interest here. The first approaches towards a complete, networked workflow between prepress, print and postpress in production are already available - the products and solutions will now be presented publicly at drupa 2004. So, drupa 2004 will undoubtedly be the "JDF-drupa" - the drupa where machines learn to communicate with each other digitally - the drupa, where the dream of general system and job communication in the printing industry can be first realized. CIP3, which has since been renamed CIP4, is an international consortium of leading manufacturers from the printing and media industry who have taken on the task of integrating processes for prepress, print and postpress. The association, to which nearly all manufacturers in the graphics industry belong, has succeeded with CIP3 in developing a first international standard for the transmission of control data in the print workflow. Further development of the CIP4 standard now includes a more extensive "system language" called JDF, which will guarantee workflow communication beyond manufacturer boundaries. However, not only data for actual print production will be communicated with JDF (Job Definition Format): planning and calculation data for MIS (Management Information Systems) and calculation systems will also be prepared. The German printing specialist Hans-Georg Wenke defines JDF as follows: "JDF takes over data from MIS for machines, aggregates and their control desks, data exchange within office applications, and finally ensures that data can be incorporated in the technical workflow

  10. Review of Automatic Feature Extraction from High-Resolution Optical Sensor Data for UAV-Based Cadastral Mapping

    Directory of Open Access Journals (Sweden)

    Sophie Crommelinck

    2016-08-01

    Full Text Available Unmanned Aerial Vehicles (UAVs) have emerged as a rapid, low-cost and flexible acquisition system that appears feasible for application in cadastral mapping: high-resolution imagery, acquired using UAVs, enables a new approach for defining property boundaries. However, UAV-derived data are arguably not exploited to their full potential: based on UAV data, cadastral boundaries are visually detected and manually digitized. A workflow that automatically extracts boundary features from UAV data could increase the pace of current mapping procedures. This review introduces a workflow considered applicable for automated boundary delineation from UAV data. This is done by reviewing approaches for feature extraction from various application fields and synthesizing these into a hypothetical generalized cadastral workflow. The workflow consists of preprocessing, image segmentation, line extraction, contour generation and postprocessing. The review lists example methods per workflow step—including a description, trialed implementation, and a list of case studies applying individual methods. Furthermore, accuracy assessment methods are outlined. Advantages and drawbacks of each approach are discussed in terms of their applicability to UAV data. This review can serve as a basis for future work on the implementation of the most suitable methods in a UAV-based cadastral mapping workflow.
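
    As an illustration of the segmentation and line-extraction steps, a minimal OpenCV sketch is given below; the file names are hypothetical, and a production cadastral workflow would add the review's preprocessing, contour-generation and postprocessing stages.

        # Candidate boundary lines from a UAV orthoimage (hypothetical file).
        import cv2
        import numpy as np

        img = cv2.imread("uav_ortho.png")
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        blur = cv2.GaussianBlur(gray, (5, 5), 0)      # light preprocessing
        edges = cv2.Canny(blur, 50, 150)              # edge detection
        # Probabilistic Hough transform extracts candidate boundary segments.
        lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                                minLineLength=40, maxLineGap=10)
        for x1, y1, x2, y2 in (lines[:, 0] if lines is not None else []):
            cv2.line(img, (x1, y1), (x2, y2), (0, 0, 255), 2)
        cv2.imwrite("boundary_candidates.png", img)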

  11. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. (comp.)

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.
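
    A minimal forward-mode sketch using dual numbers illustrates the chain-rule propagation that makes automatic differentiation neither symbolic nor merely numeric:

        # Forward-mode automatic differentiation with dual numbers.
        class Dual:
            def __init__(self, val, der=0.0):
                self.val, self.der = val, der
            def __add__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                return Dual(self.val + o.val, self.der + o.der)
            __radd__ = __add__
            def __mul__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                return Dual(self.val * o.val,
                            self.der * o.val + self.val * o.der)  # product rule
            __rmul__ = __mul__

        def f(x):
            return 3 * x * x + 2 * x + 1

        y = f(Dual(2.0, 1.0))      # seed derivative dx/dx = 1
        print(y.val, y.der)        # 17.0 and f'(2) = 6*2 + 2 = 14.0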

  12. Towards fully automatic object detection and segmentation

    Science.gov (United States)

    Schramm, Hauke; Ecabert, Olivier; Peters, Jochen; Philomin, Vasanth; Weese, Juergen

    2006-03-01

    An automatic procedure for detecting and segmenting anatomical objects in 3-D images is necessary for achieving a high level of automation in many medical applications. Since today's segmentation techniques typically rely on user input for initialization, they do not allow for a fully automatic workflow. In this work, the generalized Hough transform is used for detecting anatomical objects with well-defined shape in 3-D medical images. This well-known technique has frequently been used for object detection in 2-D images and is known to be robust and reliable. However, its computational and memory requirements are generally huge, especially when considering 3-D images and several free transformation parameters. Our approach limits the complexity of the generalized Hough transform to a reasonable amount by (1) using object prior knowledge during the preprocessing in order to suppress unlikely regions in the image, (2) restricting the flexibility of the applied transformation to only scaling and translation, and (3) using a simple shape model which does not cover any inter-individual shape variability. Despite these limitations, the approach is demonstrated to allow for a coarse 3-D delineation of the femur, vertebra and heart in a number of experiments. Additionally it is shown that the quality of the object localization is in nearly all cases sufficient to initialize a successful segmentation using shape constrained deformable models.
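
    For intuition, a translation-only 2-D sketch of generalized-Hough voting follows (the paper works on 3-D images and also allows scaling): each observed edge point votes for the reference-point locations consistent with the shape template.

        # Translation-only generalized Hough transform (toy 2-D example).
        import numpy as np

        def ght_detect(edge_points, template_vectors, shape):
            """Each edge point e votes for reference locations r = e - v."""
            acc = np.zeros(shape, dtype=int)
            for (x, y) in edge_points:
                for (vx, vy) in template_vectors:  # reference -> edge vectors
                    rx, ry = x - vx, y - vy
                    if 0 <= rx < shape[0] and 0 <= ry < shape[1]:
                        acc[rx, ry] += 1
            return np.unravel_index(acc.argmax(), shape)

        vectors = [(-2, 0), (2, 0), (0, 2)]          # template around (0, 0)
        edges = [(8, 10), (12, 10), (10, 12)]        # same shape at (10, 10)
        print(ght_detect(edges, vectors, (20, 20)))  # -> (10, 10)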

  13. Developing integrated workflows for the digitisation of herbarium specimens using a modular and scalable approach

    Directory of Open Access Journals (Sweden)

    Elspeth Haston

    2012-07-01

    Full Text Available Digitisation programmes in many institutes frequently involve disparate and irregular funding, diverse selection criteria and scope, with different members of staff managing and operating the processes. These factors have influenced the decision at the Royal Botanic Garden Edinburgh to develop an integrated workflow for the digitisation of herbarium specimens which is modular and scalable to enable a single overall workflow to be used for all digitisation projects. This integrated workflow is comprised of three principal elements: a specimen workflow, a data workflow and an image workflow. The specimen workflow is strongly linked to curatorial processes which will impact on the prioritisation, selection and preparation of the specimens. The importance of including a conservation element within the digitisation workflow is highlighted. The data workflow includes the concept of three main categories of collection data: label data, curatorial data and supplementary data. It is shown that each category of data has its own properties which influence the timing of data capture within the workflow. Development of software has been carried out for the rapid capture of curatorial data, and optical character recognition (OCR) software is being used to increase the efficiency of capturing label data and supplementary data. The large number and size of the images has necessitated the inclusion of automated systems within the image workflow.
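
    The label-data capture step can be illustrated with a minimal OCR sketch, assuming the pytesseract binding and a hypothetical scan; the Edinburgh workflow embeds OCR in a larger data workflow alongside curatorial and supplementary data capture.

        # OCR over a herbarium sheet scan (hypothetical file path).
        from PIL import Image
        import pytesseract

        def capture_label(image_path: str) -> str:
            """Return raw OCR text from a specimen image for later review."""
            return pytesseract.image_to_string(Image.open(image_path))

        raw = capture_label("herbarium_sheet_0001.tif")
        print(raw.splitlines()[:5])   # first lines of label text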

  14. Developing integrated workflows for the digitisation of herbarium specimens using a modular and scalable approach.

    Science.gov (United States)

    Haston, Elspeth; Cubey, Robert; Pullan, Martin; Atkins, Hannah; Harris, David J

    2012-01-01

    Digitisation programmes in many institutes frequently involve disparate and irregular funding, diverse selection criteria and scope, with different members of staff managing and operating the processes. These factors have influenced the decision at the Royal Botanic Garden Edinburgh to develop an integrated workflow for the digitisation of herbarium specimens which is modular and scalable to enable a single overall workflow to be used for all digitisation projects. This integrated workflow is comprised of three principal elements: a specimen workflow, a data workflow and an image workflow. The specimen workflow is strongly linked to curatorial processes which will impact on the prioritisation, selection and preparation of the specimens. The importance of including a conservation element within the digitisation workflow is highlighted. The data workflow includes the concept of three main categories of collection data: label data, curatorial data and supplementary data. It is shown that each category of data has its own properties which influence the timing of data capture within the workflow. Development of software has been carried out for the rapid capture of curatorial data, and optical character recognition (OCR) software is being used to increase the efficiency of capturing label data and supplementary data. The large number and size of the images has necessitated the inclusion of automated systems within the image workflow. PMID:22859881

  15. Network Design and Quality Checks in Automatic Orientation of Close-Range Photogrammetric Blocks

    OpenAIRE

    Elisa Dall'Asta; Klaus Thoeni; Marina Santise; Gianfranco Forlani; Anna Giacomini; Riccardo Roncella

    2015-01-01

    Due to the recent improvements of automatic measurement procedures in photogrammetry, multi-view 3D reconstruction technologies are becoming a favourite survey tool. A rapidly widening range of structure-from-motion (SfM) software packages offers significantly easier image processing workflows than traditional photogrammetry packages. However, while most orientation and surface reconstruction strategies will almost always succeed in any given task, estimating the quality of the result is, to some extent,...

  16. Composites

    Science.gov (United States)

    Taylor, John G.

    The Composites market is arguably the most challenging and profitable market for phenolic resins aside from electronics. The variety of products and processes encountered creates the challenges, and the demand for high performance in critical operations brings value. Phenolic composite materials are rendered into a wide range of components to supply a diverse and fragmented commercial base that includes customers in aerospace (Space Shuttle), aircraft (interiors and brakes), mass transit (interiors), defense (blast protection), marine, mine ducting, off-shore (ducts and grating) and infrastructure (architectural) to name a few. For example, phenolic resin is a critical adhesive in the manufacture of honeycomb sandwich panels. Various solvent and water based resins are described along with resin characteristics and the role of metal ions for enhanced thermal stability of the resin used to coat the honeycomb. Featured new developments include pultrusion of phenolic grating, success in RTM/VARTM fabricated parts, new ballistic developments for military vehicles and high char yield carbon-carbon composites along with many others. Additionally, global regional market resin volumes and sales are presented and compared with other thermosetting resin systems.

  17. Staffing and Workflow of a Maturing Institutional Repository

    Directory of Open Access Journals (Sweden)

    Debora L. Madsen

    2013-02-01

    Full Text Available Institutional repositories (IRs have become established components of many academic libraries. As an IR matures it will face the challenge of how to scale up its operations to increase the amount and types of content archived. These challenges involve staffing, systems, workflows, and promotion. In the past eight years, Kansas State University's IR (K-REx has grown from a platform for student theses, dissertations, and reports to also include faculty works. The initial workforce of a single faculty member was expanded as a part of a library-wide reorganization, resulting in a cross-departmental team that is better able to accommodate the expansion of the IR. The resultant need to define staff responsibilities and develop resources to manage the workflows has led to the innovations described here, which may prove useful to the greater library community as other IRs mature.

  18. Building Scientific Workflows for the Geosciences with Open Community Software

    Science.gov (United States)

    Pierce, M. E.; Marru, S.; Weerawarana, S. M.

    2012-12-01

    We describe the design and development of the Apache Airavata scientific workflow software and its application to problems in geosciences. Airavata is based on Service Oriented Architecture principles and is developed as general purpose software for managing large-scale science applications on supercomputing resources such as the NSF's XSEDE. Based on the NSF-funded EarthCube Workflow Working Group activities, we discuss the application of this software relative to specific requirements (such as data stream data processing, event triggering, dealing with large data sets, and advanced distributed execution patterns involved in data mining). We also consider the role of governance in EarthCube software development and present the development of Airavata software through the Apache Software Foundation's community development model. We discuss the potential impacts on software accountability and sustainability using this model.

  19. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction

    International Nuclear Information System (INIS)

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration and education, supports parameterization and non-uniformly sampled data sets, and integrates processing with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  20. Modular Workflow Engine for Distributed Services using Lightweight Java Clients

    CERN Document Server

    Vetter, R -M; Peetz, J -V

    2009-01-01

    In this article we introduce the concept and the first implementation of a lightweight client-server-framework as middleware for distributed computing. On the client side an installation without administrative rights or privileged ports can turn any computer into a worker node. Only a Java runtime environment and the JAR files comprising the workflow client are needed. To connect all clients to the engine one open server port is sufficient. The engine submits data to the clients and orchestrates their work by workflow descriptions from a central database. Clients request new task descriptions periodically, thus the system is robust against network failures. In the basic set-up, data up- and downloads are handled via HTTP communication with the server. The performance of the modular system could additionally be improved using dedicated file servers or distributed network file systems. We demonstrate the design features of the proposed engine in real-world applications from mechanical engineering. We have used ...
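
    The client behaviour described above can be sketched as a simple polling loop; the endpoint names and payloads here are hypothetical (the actual client is Java-based and driven by workflow descriptions from a central database).

        # Minimal polling worker: fetch a task, compute, upload the result.
        import time
        import requests

        ENGINE = "http://engine.example.org:8080"

        while True:
            r = requests.get(f"{ENGINE}/task")     # ask for a task description
            if r.status_code == 204:               # no work available
                time.sleep(10)                     # periodic polling makes the
                continue                           # loop robust to outages
            task = r.json()
            data = requests.get(task["input_url"]).content   # HTTP download
            result = data.upper()                  # stand-in for real work
            requests.post(f"{ENGINE}/result/{task['id']}", data=result)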

  1. Research on Architecture of Enterprise Modeling in Workflow System

    Institute of Scientific and Technical Information of China (English)

    李伟平; 齐慧彬; 薛劲松; 朱云龙

    2002-01-01

    The market that an enterprise faces is changing and cannot be forecast accurately in the information age. To find opportunities in the market, practitioners have focused on business processes through re-engineering programmes to improve enterprise efficiency. It is necessary to manage an enterprise with process-based methods to meet the requirements of enhanced work efficiency and competitiveness in the market. Information system developers have also emphasized the use of standard models to accelerate the configuration and implementation of integrated systems for enterprises. We therefore have to model an enterprise with a process-based modeling method. An architecture for enterprise modeling is presented in this paper. The architecture is composed of four views and supports the whole lifecycle of the enterprise model. Because workflow management systems are based on process definitions, the architecture can be used directly in a workflow management system. The implementation of the model is described in detail, and the workflow management software supporting the building and running of the model is also presented.

  2. Technical Perspectives on Knowledge Management in Bioinformatics Workflow Systems

    Directory of Open Access Journals (Sweden)

    Walaa N. Ismail

    2015-01-01

    Full Text Available Workflow systems by their nature can help bioinformaticians plan their experiments and store, capture and analyse the data generated at runtime. On the other hand, life science research usually produces new knowledge at an increasing speed; dealing with knowledge such as papers, databases and other systems is a complex task that demands much effort and time from a researcher. The management of knowledge is therefore an important issue for life scientists. Approaches are being developed to organize biological knowledge sources and to record the provenance knowledge of an experiment in a readily usable resource. This article focuses on the knowledge management of in silico experimentation in bioinformatics workflow systems.

  3. A framework for streamlining research workflow in neuroscience and psychology

    Directory of Open Access Journals (Sweden)

    Jonas Kubilius

    2014-01-01

    Full Text Available Successful accumulation of knowledge is critically dependent on the ability to verify and replicate every part of scientific conduct. However, such principles are difficult to enact when researchers continue to rely on ad hoc workflows and poorly maintained code bases. In this paper I examine the needs of the neuroscience and psychology community, and introduce psychopy_ext, a unifying framework that seamlessly integrates popular experiment building, analysis and manuscript preparation tools by choosing reasonable defaults and implementing relatively rigid patterns of workflow. This structure allows for the automation of multiple tasks, such as generating user interfaces, unit testing, control analyses of stimuli, single-command access to descriptive statistics, and publication-quality plotting. Taken together, psychopy_ext opens an exciting possibility for faster, more robust code development and collaboration for researchers.

  4. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert [UConn Health, Department of Molecular Biology and Biophysics (United States); Martyn, Timothy O. [Rensselaer at Hartford, Department of Engineering and Science (United States); Ellis, Heidi J. C. [Western New England College, Department of Computer Science and Information Technology (United States); Gryk, Michael R., E-mail: gryk@uchc.edu [UConn Health, Department of Molecular Biology and Biophysics (United States)

    2015-07-15

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration and education, supports parameterization and non-uniformly sampled data sets, and integrates processing with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  5. How Workflow Systems Facilitate Business Process Reengineering and Improvement

    Directory of Open Access Journals (Sweden)

    Mohamed El Khadiri

    2012-03-01

    Full Text Available This paper investigates the relationship between workflow systems and business process reengineering and improvement. The study is based on a real case study at the Centre Régional d'Investissement (CRI) of Marrakech, Morocco. The CRI is entrusted with coordinating various investment projects at the regional level. Our previous work has shown that a workflow system can be a basis for business process reengineering. However, for continuous process improvement, the system has proven insufficient, as it fails to deal with the exceptions and problem resolutions that informal communications provide. When this system is augmented with an expanded corporate memory system that includes social tools, to capture informal communication and data, we are closer to a more complete system that facilitates business process reengineering and improvement.

  6. A Bayesian Approach to the Partitioning of Workflows

    CERN Document Server

    Chua, Freddy C

    2015-01-01

    When partitioning workflows in realistic scenarios, the knowledge of the processing units is often vague or unknown. A naive approach to addressing this issue is to perform many controlled experiments for different workloads, each consisting of multiple trials, in order to estimate the mean and variance of the specific workload. Since this controlled experimental approach can be quite costly in terms of time and resources, we propose a variant of the Gibbs Sampling algorithm that uses a sequence of Bayesian inference updates to estimate the processing characteristics of the processing units. Using the inferred characteristics of the processing units, we are able to determine the best way to split a workflow for processing it in parallel with the lowest expected completion time and least variance.
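
    A minimal sketch of sequential Bayesian updating for a single processing unit, assuming a normal prior and known observation variance (the paper's Gibbs-sampling variant jointly infers richer characteristics across units):

        # Conjugate normal update of a unit's mean task time.
        def update(mu0, var0, obs, obs_var):
            """Posterior mean/variance after observing one runtime."""
            prec = 1 / var0 + 1 / obs_var
            mu = (mu0 / var0 + obs / obs_var) / prec
            return mu, 1 / prec

        mu, var = 10.0, 25.0            # vague prior belief (seconds)
        for runtime in [12.1, 11.4, 12.8, 11.9]:
            mu, var = update(mu, var, runtime, obs_var=4.0)
        print(mu, var)                  # belief tightens toward ~12 s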

  7. Sufficient and Necessary Condition to Decide Compatibility for a Class of Interorganizational Workflow Nets

    Directory of Open Access Journals (Sweden)

    Guanjun Liu

    2015-01-01

    Full Text Available Interorganizational Workflow nets (IWF-nets) can model many concurrent systems, such as web service composition, in which multiple processes interact via sending/receiving messages. Compatibility of IWF-nets is a crucial criterion for the correctness of these systems: it guarantees that a system has no deadlocks, livelocks, or dead tasks. In our previous work we proved that the compatibility problem is PSPACE-complete for safe IWF-nets. This paper defines a subclass of IWF-nets that can model many interaction scenarios. A necessary and sufficient condition is presented to decide their compatibility, and it depends only on the net structure. Finally, an algorithm is developed based on the condition.
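
    For intuition, a minimal token-game sketch of two processes interacting through a message place is shown below; deciding compatibility for the paper's subclass relies on structural conditions rather than this kind of exhaustive simulation.

        # Toy Petri-net firing: two processes coupled by a message place.
        def enabled(marking, pre):
            return all(marking.get(p, 0) >= n for p, n in pre.items())

        def fire(marking, pre, post):
            m = dict(marking)
            for p, n in pre.items():
                m[p] -= n
            for p, n in post.items():
                m[p] = m.get(p, 0) + n
            return m

        transitions = {
            "send":    ({"a0": 1}, {"a1": 1, "msg": 1}),
            "receive": ({"b0": 1, "msg": 1}, {"b1": 1}),
        }
        m = {"a0": 1, "b0": 1}
        for name, (pre, post) in transitions.items():
            if enabled(m, pre):
                m = fire(m, pre, post)
        print(m)   # both processes reach their end places; no deadlock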

  8. Bioinformatics Workflow for Clinical Whole Genome Sequencing at Partners HealthCare Personalized Medicine

    Directory of Open Access Journals (Sweden)

    Ellen A. Tsai

    2016-02-01

    Full Text Available Effective implementation of precision medicine will be enhanced by a thorough understanding of each patient's genetic composition to better treat his or her presenting symptoms or mitigate the onset of disease. This ideally includes the sequence information of a complete genome for each individual. At Partners HealthCare Personalized Medicine, we have developed a clinical process for whole genome sequencing (WGS) with application in both healthy individuals and those with disease. In this manuscript, we describe our bioinformatics strategy to efficiently process and deliver genomic data to geneticists for clinical interpretation. We describe the handling of data from FASTQ to the final variant list for clinical review and the final report. We also discuss our methodology for validating this workflow and the cost implications of running WGS.

  9. CrossFlow: Integrating Workflow Management and Electronic Commerce

    OpenAIRE

    Hoffner, Y.; Ludwig, H; Grefen, P.; Aberer, K.

    2001-01-01

    The CrossFlow architecture provides support for cross-organisational workflow management in dynamically established virtual enterprises. The creation of a business relationship between a service provider organisation, performing a service on behalf of a consumer organisation, can be made dynamic when augmented by virtual market technology, the dynamic configuration of the contract enactment infrastructures, and the provision of fine-grained service monitoring and control. Standard ways of desc...

  10. Evolutionary multi-objective workflow scheduling in Cloud

    OpenAIRE

    Z. Zhu; Zhang, G.; M. Li; Liu, X.

    2015-01-01

    Cloud computing provides promising platforms for executing large applications with enormous computational resources offered on demand. In a Cloud model, users are charged based on their usage of resources and the required quality of service (QoS) specifications. Although there are many existing workflow scheduling algorithms in traditional distributed or heterogeneous computing environments, they have difficulties in being directly applied to Cloud environments since Cloud differs from t...

  11. Hybrid Workflow Policy Management for Heart Disease Identification

    OpenAIRE

    Dong-Hyun Kim; Woo-Ram Jung; Chan-Hyun Youn.

    2010-01-01

    As science and technology grow, medical applications are becoming more complex, needing to solve physiological problems within an expected time. Workflow management systems (WMS) in Grid computing are a promising solution to sophisticated problems such as genomic analysis, drug discovery, disease identification, etc. Although existing WMS can provide basic management functionality in a Grid environment, consideration of user requirements such as performance, reliability and interaction with user is m...

  12. Facilitating hydrological data analysis workflows in R: the RHydro package

    Science.gov (United States)

    Buytaert, Wouter; Moulds, Simon; Skoien, Jon; Pebesma, Edzer; Reusser, Dominik

    2015-04-01

    The advent of new technologies such as web-services and big data analytics holds great promise for hydrological data analysis and simulation. Driven by the need for better water management tools, it allows for the construction of much more complex workflows, that integrate more and potentially more heterogeneous data sources with longer tool chains of algorithms and models. With the scientific challenge of designing the most adequate processing workflow comes the technical challenge of implementing the workflow with a minimal risk for errors. A wide variety of new workbench technologies and other data handling systems are being developed. At the same time, the functionality of available data processing languages such as R and Python is increasing at an accelerating pace. Because of the large diversity of scientific questions and simulation needs in hydrology, it is unlikely that one single optimal method for constructing hydrological data analysis workflows will emerge. Nevertheless, languages such as R and Python are quickly gaining popularity because they combine a wide array of functionality with high flexibility and versatility. The object-oriented nature of high-level data processing languages makes them particularly suited for the handling of complex and potentially large datasets. In this paper, we explore how handling and processing of hydrological data in R can be facilitated further by designing and implementing a set of relevant classes and methods in the experimental R package RHydro. We build upon existing efforts such as the sp and raster packages for spatial data and the spacetime package for spatiotemporal data to define classes for hydrological data (HydroST). In order to handle simulation data from hydrological models conveniently, a HM class is defined. Relevant methods are implemented to allow for an optimal integration of the HM class with existing model fitting and simulation functionality in R. Lastly, we discuss some of the design challenges

  13. Analysis of Whole Transcriptome Sequencing Data: Workflow and Software.

    Science.gov (United States)

    Yang, In Seok; Kim, Sangwoo

    2015-12-01

    RNA is a polymeric molecule implicated in various biological processes, such as the coding, decoding, regulation, and expression of genes. Numerous studies have examined RNA features using whole transcriptome sequencing (RNA-seq) approaches. RNA-seq is a powerful technique for characterizing and quantifying the transcriptome and accelerates the development of bioinformatics software. In this review, we introduce routine RNA-seq workflow together with related software, focusing particularly on transcriptome reconstruction and expression quantification. PMID:26865842

  14. A Component Based Approach to Scientific Workflow Management

    OpenAIRE

    Goff, J. -M. Le; Kovacs, Z; Baker, N.; Brooks, P; R. McClatchey

    2001-01-01

    CRISTAL is a distributed scientific workflow system used in the manufacturing and production phases of HEP experiment construction at CERN. The CRISTAL project has studied the use of a description driven approach, using meta-modelling techniques, to manage the evolving needs of a large physics community. Interest from such diverse communities as bio-informatics and manufacturing has motivated the CRISTAL team to re-engineer the system to customize functionality according to end user requirem...

  15. A framework for streamlining research workflow in neuroscience and psychology

    OpenAIRE

    Jonas Kubilius

    2014-01-01

    Successful accumulation of knowledge is critically dependent on the ability to verify and replicate every part of scientific conduct. However, such principles are difficult to enact when researchers continue to rely on ad hoc workflows and poorly maintained code bases. In this paper I examine the needs of the neuroscience and psychology community, and introduce psychopy_ext, a unifying framework that seamlessly integrates popular experiment building, analysis and manuscript preparation tool...

  16. Dynamic Workflow in Grid-MAS Integration Context

    OpenAIRE

    Salle, Paola; Duvert, Frédéric; Hérin, Danièle; Stefano A. CERRI

    2007-01-01

    This paper addresses the architectural foundations of dynamic workflows in distributed multi-agent systems (MAS) integrated in a Grid context. The purpose is to design an architecture that simultaneously takes into consideration task dependencies among agents, adaptation with respect to historic lessons learnt from past behaviour (memory), and autonomous decisions when an unpredicted event occurs. In order to do this, given one ontology, called AGIO, which describes Agent-Grid Integration, we...

  17. A framework for streamlining research workflow in neuroscience and psychology

    OpenAIRE

    Kubilius, Jonas

    2014-01-01

    Successful accumulation of knowledge is critically dependent on the ability to verify and replicate every part of scientific conduct. However, such principles are difficult to enact when researchers continue to rely on ad hoc workflows and poorly maintained code bases. In this paper I examine the needs of the neuroscience and psychology community, and introduce psychopy_ext, a unifying framework that seamlessly integrates popular experiment building, analysis and manuscript preparation tool...

  18. Advanced Workflows for Fluid Transfer in Faulted Basins

    Directory of Open Access Journals (Sweden)

    Thibaut Muriel

    2014-07-01

    Full Text Available The traditional 3D basin modeling workflow is made of the following steps: construction of present-day basin architecture, reconstruction of the structural evolution through time, together with fluid flow simulation and heat transfers. In this case, the forward simulation is limited to basin architecture, mainly controlled by erosion, sedimentation and vertical compaction. The tectonic deformation is limited to vertical slip along faults. Fault properties are modeled as vertical shear zones along which rock permeability is adjusted to enhance fluid flow or prevent flow from escaping. For basins that have experienced a more complex tectonic history, this approach is over-simplified: it fails to understand and represent fluid flow paths arising from the structural evolution of the basin, which impacts overpressure build-up and the location of petroleum resources. Over the past years, a new 3D basin forward code has been developed at IFP Energies nouvelles that is based on a cell-centered finite volume discretization which preserves mass on an unstructured grid and describes the various changes in geometry and topology of a basin through time. At the same time, 3D restoration tools based on geomechanical principles of strain minimization were made available that offer a structural scenario at a discrete number of deformation stages of the basin. In this paper, we present workflows integrating these different innovative tools on complex faulted basin architectures, where complex means moderate lateral as well as vertical deformation coupled with dynamic fault property modeling. Two synthetic case studies inspired by real basins are used to illustrate how to apply the workflow, where the difficulties in the workflows lie, and what the added value is compared with previous basin modeling approaches.

  19. Advanced Workflows for Fluid Transfer in Faulted Basins.

    OpenAIRE

    Thibaut Muriel; Jardin Anne; Faille Isabelle; Willien Françoise; Guichet Xavier

    2014-01-01

    The traditional 3D basin modeling workflow is made of the following steps: construction of present day basin architecture, reconstruction of the structural evolution through time, together with fluid flow simulation and heat transfers. In this case, the forward simulation is limited to basin architecture, mainly controlled by erosion, sedimentation and vertical compaction. The tectonic deformation is limited to vertical slip along faults. Fault properties are modeled as vertical shear zones a...

  20. A graph model of data and workflow provenance

    OpenAIRE

    Acar, U.; Buneman, P.; J. Cheney; Van den Bussche, Jan; Kwasnikowska, Natalia; Vansummeren, Stijn

    2010-01-01

    Provenance has been studied extensively in both database and workflow management systems, so far with little convergence of definitions or models. Provenance in databases has generally been defined for relational or complex object data, by propagating fine-grained annotations or algebraic expressions from the input to the output. This kind of provenance has been found useful in other areas of computer science: annotation databases, probabilistic databases, schema and data integration, etc. In...

  1. A Complete Workflow for Development of Bangla OCR

    OpenAIRE

    Omee, Farjana Yeasmin; Himel, Shiam Shabbir; Bikas, Md. Abu Naser

    2012-01-01

    Developing a Bangla OCR requires a set of algorithms and methods. There have been many efforts to develop a Bangla OCR, but all of them have failed to provide an error-free Bangla OCR, and each has its shortcomings. We discuss the problem scope of the currently existing Bangla OCRs. In this paper, we present the basic steps required for developing a Bangla OCR and a complete workflow for its development, mentioning all the possible algorithms required.

  2. The MPO API: A tool for recording scientific workflows

    Energy Technology Data Exchange (ETDEWEB)

    Wright, John C., E-mail: jcwright@mit.edu [MIT Plasma Science and Fusion Center, Cambridge, MA (United States); Greenwald, Martin; Stillerman, Joshua [MIT Plasma Science and Fusion Center, Cambridge, MA (United States); Abla, Gheni; Chanthavong, Bobby; Flanagan, Sean; Schissel, David; Lee, Xia [General Atomics, San Diego, CA (United States); Romosan, Alex; Shoshani, Arie [Lawrence Berkeley Laboratory, Berkeley, CA (United States)

    2014-05-15

    Highlights: • A description of a new framework and tool for recording scientific workflows, especially those resulting from simulation and analysis. • An explanation of the underlying technologies used to implement this web based tool. • Several examples of using the tool. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for high-consequence applications. The Metadata, Provenance and Ontology (MPO) project builds on previous work [M. Greenwald, Fusion Eng. Des. 87 (2012) 2205–2208] and is focused on providing documentation of workflows, data provenance and the ability to data-mine large sets of results. While there are important design and development aspects to the data structures and user interfaces, we concern ourselves in this paper with the application programming interface (API) – the set of functions that interface with the data server. Our approach for the data server is to follow the Representational State Transfer (RESTful) software architecture style for client–server communication. At its core, the API uses the POST and GET methods of the HTTP protocol to transfer workflow information in message bodies to targets specified in the URL to and from the database via a web server. Higher level API calls are built upon this core API. This design facilitates implementation on different platforms and in different languages and is robust to changes in the underlying technologies used. The command line client implementation can communicate with the data server from any machine with HTTP access.
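
    A hedged sketch of the POST/GET interaction style the abstract describes is given below; the endpoint paths and payload fields are hypothetical, not the documented MPO routes.

        # Recording a workflow and a data object over a RESTful HTTP API.
        import requests

        SERVER = "https://mpo.example.org/api"   # hypothetical data server

        # Start a workflow record; keep the server-assigned identifier.
        wf = requests.post(f"{SERVER}/workflow",
                           json={"name": "equilibrium_reconstruction"}).json()

        # Attach a data object to the workflow with a further POST.
        requests.post(f"{SERVER}/dataobject",
                      json={"workflow": wf["uid"],
                            "uri": "file:///shots/1234.h5"})

        # Read the recorded provenance back with GET.
        print(requests.get(f"{SERVER}/workflow/{wf['uid']}").json())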

  3. Color variance in PDF-based production workflow environments

    Science.gov (United States)

    Riordan, Michael

    2006-02-01

    Based on the production practices of a representative sampling of graphic arts professionals, a series of tests were conducted to determine the potential color variance incurred during specific production-based PDF workflows. The impact of key production variables--including the use of ICC profiles, methods and settings used for PDF distillation, and printer/RIP color management handling for PDF rendering--were examined for RGB, CMYK and select spot colors to determine the potential magnitude of color variation under normal production conditions. The results of the study, quantified via paired comparison and delta E, showed that, while color variance could be kept to a minimum using very specific workflow configurations, significant color variation was incurred in many of the common workflow configurations representative of the production environments observed from the sample population. Further, even compliance to PDF-X1a and PDF-X3 specifications allowed for unwanted variation depending on specific production activities that preceded or followed the creation of the PDF-X file.

  4. IT-benchmarking of clinical workflows: concept, implementation, and evaluation.

    Science.gov (United States)

    Thye, Johannes; Straede, Matthias-Christopher; Liebe, Jan-David; Hübner, Ursula

    2014-01-01

    Due to the emerging evidence of health IT as an opportunity and a risk for clinical workflows, health IT must undergo continuous measurement of its efficacy and efficiency. IT-benchmarks are a proven means of providing this information. The aim of this study was to enhance the methodology of an existing benchmarking procedure by including, in particular, new indicators of clinical workflows and by proposing new types of visualisation. Drawing on the concept of information logistics, we propose four workflow descriptors that were applied to four clinical processes. General and specific indicators were derived from these descriptors and processes. 199 chief information officers (CIOs) took part in the benchmarking. These hospitals were assigned to reference groups of a similar size and ownership from a total of 259 hospitals. Stepwise and comprehensive feedback was given to the CIOs. Most participants who evaluated the benchmark rated the procedure as very good, good, or rather good (98.4%). Benchmark information was used by CIOs for getting a general overview, advancing IT, preparing negotiations with board members, and arguing for a new IT project. PMID:24825693

  5. Research on a dynamic workflow access control model

    Science.gov (United States)

    Liu, Yiliang; Deng, Jinxia

    2007-12-01

    In recent years, access control technology has been researched widely in workflow systems; two typical technologies are the RBAC (Role-Based Access Control) and TBAC (Task-Based Access Control) models, which have been used successfully for role authorization and assignment to a certain extent. However, as a system's structure becomes more complicated, these two technologies cannot minimize privileges and separate duties, and they are inapplicable when users need to change the workflow's process frequently. To avoid these weaknesses, a variable-flow dynamic role_task_view model (DRTVBAC for short), a fine-grained access control model, is constructed on the basis of the existing models. For this model, an algorithm is constructed to satisfy users' application requirements and security needs under the fine-grained principles of least privilege and dynamic separation of duties. The DRTVBAC model has been implemented in an actual system. The results show that the dynamic management of roles and role assignment associated with tasks is more flexible in granting and revoking authority; the principle of least privilege is met when the permissions of a specific task are activated for a role; authority is separated from the completion of duties in the workflow; sensitive information is prevented from being disclosed through the concise and dynamic view interface; and the requirement of frequently varying task flows is satisfied.

  6. The MPO API: A tool for recording scientific workflows

    International Nuclear Information System (INIS)

    Highlights: • A description of a new framework and tool for recording scientific workflows, especially those resulting from simulation and analysis. • An explanation of the underlying technologies used to implement this web based tool. • Several examples of using the tool. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for high-consequence applications. The Metadata, Provenance and Ontology (MPO) project builds on previous work [M. Greenwald, Fusion Eng. Des. 87 (2012) 2205–2208] and is focused on providing documentation of workflows, data provenance and the ability to data-mine large sets of results. While there are important design and development aspects to the data structures and user interfaces, we concern ourselves in this paper with the application programming interface (API) – the set of functions that interface with the data server. Our approach for the data server is to follow the Representational State Transfer (RESTful) software architecture style for client–server communication. At its core, the API uses the POST and GET methods of the HTTP protocol to transfer workflow information in message bodies to targets specified in the URL to and from the database via a web server. Higher level API calls are built upon this core API. This design facilitates implementation on different platforms and in different languages and is robust to changes in the underlying technologies used. The command line client implementation can communicate with the data server from any machine with HTTP access

  7. A Workflow for UAV's Integration into a Geodesign Platform

    Science.gov (United States)

    Anca, P.; Calugaru, A.; Alixandroae, I.; Nazarie, R.

    2016-06-01

    This paper presents a workflow for the development of various Geodesign scenarios. The subject is important in the context of identifying patterns and designing solutions for a Smart City with optimized public transportation, efficient buildings, efficient utilities, recreational facilities, and so on. The workflow describes the procedures, starting with acquiring data in the field, data processing, orthophoto generation, DTM generation, integration into a GIS platform, and analysis for better Geodesign support. Esri's City Engine is used mostly for its 3D modeling capabilities, which enable the user to obtain realistic 3D models. The workflow uses as inputs information extracted from images acquired using UAV technologies, namely eBee, existing 2D GIS geodatabases, and a set of CGA rules. The method we use, called procedural modeling, uses rules to extrude buildings, the street network, parcel zoning and side details, based on the initial attributes from the geodatabase. The resulting products are various scenarios for redesigning and for analyzing new exploitation sites. Finally, these scenarios can be published as interactive web scenes for internal, group or public consultation. In this way, problems like the impact of new constructions being built, the rearrangement of green spaces or changing routes for public transportation are revealed through impact, visibility and shadowing analyses and are brought to the citizens' attention. This leads to better decisions.

  8. Optimal Workflow Scheduling in Critical Infrastructure Systems with Neural Networks

    Directory of Open Access Journals (Sweden)

    S. Vukmirović

    2012-04-01

    Full Text Available Critical infrastructure systems (CISs), such as power grids, transportation systems, communication networks and water systems, are the backbone of a country's national security and industrial prosperity. These CISs execute large numbers of workflows with very high resource requirements that can span through different systems and last for a long time. The proper functioning and synchronization of these workflows is essential since humanity's well-being is connected to it. Because of this, the challenge of ensuring availability and reliability of these services in the face of a broad range of operating conditions is very complicated. This paper proposes an architecture which dynamically executes a scheduling algorithm using feedback about the current status of CIS nodes. Different artificial neural networks (ANNs) were created in order to solve the scheduling problem. Their performances were compared and, as the main result of this paper, an optimal ANN architecture for workflow scheduling in CISs is proposed. A case study is shown for a meter data management system with measurements from a power distribution management system in Serbia. Performance tests show that significant improvement of the overall execution time can be achieved by ANNs.
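
    A minimal sketch of the idea (learn a neural predictor of completion time from node-status feedback, then route each workflow to the node with the lowest prediction) is given below with synthetic data; it is not the paper's ANN architecture.

        # Neural completion-time prediction for workflow routing (toy data).
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        X = rng.uniform(0, 1, (500, 3))   # cpu load, queue length, io wait
        y = 2.0 * X[:, 0] + 1.5 * X[:, 1] + 0.5 * X[:, 2] \
            + rng.normal(0, 0.05, 500)    # synthetic completion times

        model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000,
                             random_state=0).fit(X, y)

        nodes = {"node_a": [0.9, 0.8, 0.2], "node_b": [0.2, 0.1, 0.3]}
        best = min(nodes, key=lambda n: model.predict([nodes[n]])[0])
        print(best)                        # -> node_b (least loaded)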

  9. AnalyzeThis: An Analysis Workflow-Aware Storage System

    Energy Technology Data Exchange (ETDEWEB)

    Sim, Hyogi [ORNL; Kim, Youngjae [ORNL; Vazhkudai, Sudharshan S [ORNL; Tiwari, Devesh [ORNL; Anwar, Ali [Virginia Tech, Blacksburg, VA; Butt, Ali R [Virginia Tech, Blacksburg, VA; Ramakrishnan, Lavanya [Lawrence Berkeley National Laboratory (LBNL)

    2015-01-01

    The need for novel data analysis is urgent in the face of a data deluge from modern applications. Traditional approaches to data analysis incur significant data movement costs, moving data back and forth between the storage system and the processor. Emerging Active Flash devices enable processing on the flash, where the data already resides. An array of such Active Flash devices allows us to revisit how analysis workflows interact with storage systems. By seamlessly blending together the flash storage and data analysis, we create an analysis workflow-aware storage system, AnalyzeThis. Our guiding principle is that analysis-awareness be deeply ingrained in each and every layer of the storage, elevating data analyses as first-class citizens, and transforming AnalyzeThis into a potent analytics-aware appliance. We implement the AnalyzeThis storage system atop an emulation platform of the Active Flash array. Our results indicate that AnalyzeThis is viable, expediting workflow execution and minimizing data movement.

  10. The Automatic Statistician: A Relational Perspective

    OpenAIRE

    Hwang, Yunseong; Tong, Anh; Choi, Jaesik

    2015-01-01

    Gaussian Processes (GPs) provide a general and analytically tractable way of modeling complex time-varying, nonparametric functions. The Automatic Bayesian Covariance Discovery (ABCD) system constructs natural-language descriptions of time-series data by treating unknown time-series data nonparametrically, using a GP with a composite covariance kernel function. Unfortunately, learning a composite covariance kernel with a single time-series data set often results in a less informative kernel that ma...
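
    A minimal sketch of composite-kernel GP modelling in the ABCD spirit, using scikit-learn kernels (ABCD itself searches over kernel structures and renders the chosen structure in natural language):

        # Fit a GP whose kernel is a sum of trend, periodic and noise parts.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import (RBF, ExpSineSquared,
                                                      WhiteKernel)

        rng = np.random.default_rng(0)
        X = np.linspace(0, 10, 120)[:, None]
        y = np.sin(2 * X[:, 0]) + 0.1 * X[:, 0] + rng.normal(0, 0.1, 120)

        kernel = (RBF(length_scale=5.0)                      # smooth trend
                  + ExpSineSquared(length_scale=1.0,
                                   periodicity=3.1)          # periodic part
                  + WhiteKernel(noise_level=0.01))           # noise
        gp = GaussianProcessRegressor(kernel=kernel).fit(X, y)
        print(gp.kernel_)   # fitted hyperparameters reveal the structure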

  11. Impact of survey workflow on precision and accuracy of terrestrial LiDAR datasets

    Science.gov (United States)

    Gold, P. O.; Cowgill, E.; Kreylos, O.

    2009-12-01

    Ground-based LiDAR (Light Detection and Ranging) survey techniques are enabling remote visualization and quantitative analysis of geologic features at unprecedented levels of detail. For example, digital terrain models computed from LiDAR data have been used to measure displaced landforms along active faults and to quantify fault-surface roughness. But how accurately do terrestrial LiDAR data represent the true ground surface, and in particular, how internally consistent and precise are the mosaiced LiDAR datasets from which surface models are constructed? Addressing this question is essential for designing survey workflows that capture the necessary level of accuracy for a given project while minimizing survey time and equipment, which is essential for effective surveying of remote sites. To address this problem, we seek to define a metric that quantifies how scan registration error changes as a function of survey workflow. Specifically, we are using a Trimble GX3D laser scanner to conduct a series of experimental surveys to quantify how common variables in field workflows impact the precision of scan registration. Primary variables we are testing include 1) use of an independently measured network of control points to locate scanner and target positions, 2) the number of known-point locations used to place the scanner and point clouds in 3-D space, 3) the type of target used to measure distances between the scanner and the known points, and 4) setting up the scanner over a known point as opposed to resectioning of known points. Precision of the registered point cloud is quantified using Trimble Realworks software by automatic calculation of registration errors (errors between locations of the same known points in different scans). Accuracy of the registered cloud (i.e., its ground-truth) will be measured in subsequent experiments. To obtain an independent measure of scan-registration errors and to better visualize the effects of these errors on a registered point

  12. Semi-automatic object geometry estimation for image personalization

    Science.gov (United States)

    Ding, Hengzhou; Bala, Raja; Fan, Zhigang; Eschbach, Reiner; Bouman, Charles A.; Allebach, Jan P.

    2010-01-01

    Digital printing brings about a host of benefits, one of which is the ability to create short runs of variable, customized content. One form of customization that is receiving much attention lately is in photofinishing applications, whereby personalized calendars, greeting cards, and photo books are created by inserting text strings into images. It is particularly interesting to estimate the underlying geometry of the surface and incorporate the text into the image content in an intelligent and natural way. Current solutions either allow fixed text insertion schemes into preprocessed images, or provide manual text insertion tools that are time consuming and aimed only at the high-end graphic designer. It would thus be desirable to provide some level of automation in the image personalization process. We propose a semi-automatic image personalization workflow which includes two scenarios: text insertion and text replacement. In both scenarios, the underlying surfaces are assumed to be planar. A 3-D pinhole camera model is used for rendering text, whose parameters are estimated by analyzing existing structures in the image. Techniques in image processing and computer vision such as the Hough transform, the bilateral filter, and connected component analysis are combined, along with necessary user inputs. In particular, the semi-automatic workflow is implemented as an image personalization tool, which is presented in our companion paper [1]. Experimental results including personalized images for both scenarios are shown, which demonstrate the effectiveness of our algorithms.

  13. Real-time dataflow and workflow with the CMS tracker data

    International Nuclear Information System (INIS)

    The Tracker detector took data with cosmic rays at the Tracker Integration Facility (TIF) at CERN. First on-line monitoring tasks were executed at the Tracker Analysis Centre (TAC), a dedicated control room at the TIF with limited computing resources. A set of software agents were developed to perform real-time data conversion into a standard format, to archive data on tape at CERN and to publish them in the official CMS data bookkeeping systems. According to the CMS computing and analysis model, most of the subsequent data processing has to be done in remote Tier-1 and Tier-2 sites, so data were automatically transferred from CERN to the sites interested in analyzing them, currently Fermilab, Bari and Pisa. Official reconstruction in the distributed environment was triggered in real time by using the tool currently used for the processing of simulated events. Automatic end-user analysis of the data was performed in a distributed environment in order to derive the distributions of important physics variables. The tracker data processing is currently migrating to the Tier-0 at CERN as a prototype for the global data-taking chain. Tracker data were also registered in the most recent version of the data bookkeeping system, DBS-2, taking advantage of the new features for handling real data. A description of the dataflow/workflow and of the tools developed is given, together with results on the performance of the real-time chain. Almost 7.2 million events were officially registered, moved, reconstructed and analyzed at remote sites by using the distributed environment

  14. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    Science.gov (United States)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish workflow for and demonstrate a unique set of web application tools for linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  15. Considering Time in Orthophotography Production: from a General Workflow to a Shortened Workflow for a Faster Disaster Response

    Science.gov (United States)

    Lucas, G.

    2015-08-01

    This article deals with the production time of orthophoto imagery from a medium-size digital frame camera. The workflow examination follows two main parts: data acquisition and post-processing. The objectives of the research are fourfold: 1/ gathering time references for the most important steps of orthophoto production (it turned out that the literature is missing on this topic); these figures are used later for total production time estimation; 2/ identifying levers for reducing orthophoto production time; 3/ building a simplified production workflow for emergency response, less demanding in accuracy and faster, and comparing it to a classical workflow; 4/ providing methodical elements for the estimation of production time for a custom project. In the data acquisition part, a comprehensive review lists and describes all the factors that may affect acquisition efficiency. Using a simulation with different variables (average line length, time of the turns, flight speed), their effect on acquisition efficiency is quantitatively examined. Regarding post-processing, the time reference figures were collected from the processing of a 1000-frame case study with 15 cm GSD covering a rectangular area of 447 km2; the time required to achieve each step during production is written down. When several technical options are possible, each one is tested and its time documented so that all alternatives are available. Based on a technical choice within the workflow and using the compiled time references of the elementary steps, a total time is calculated for the post-processing of the 1000 frames. Two scenarios are compared with regard to time and accuracy. The first one follows "normal" practices, comprising triangulation, orthorectification and advanced mosaicking methods (feature detection, seam line editing and seam applicator); the second is simplified and makes compromises on positional accuracy (using direct geo-referencing) and seamline preparation in order to achieve
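
    In its simplest form, the acquisition-efficiency simulation mentioned above reduces to flight-time arithmetic over the stated variables (average line length, turn time, flight speed). The sketch below is a hypothetical reconstruction for orientation; the numbers are illustrative, not taken from the paper.

```python
def acquisition_time_hours(n_lines, line_length_km, turn_time_s, speed_kmh):
    """Total flight time: straight-line photography plus turns between lines."""
    flying_h = n_lines * line_length_km / speed_kmh
    turning_h = (n_lines - 1) * turn_time_s / 3600.0
    return flying_h + turning_h

# Illustrative values: 30 lines of 15 km flown at 220 km/h with 150 s turns.
print(round(acquisition_time_hours(30, 15.0, 150, 220.0), 2), "h")  # ~3.25 h
```

    Varying one parameter at a time in such a model is what exposes the levers: with short lines, turn time dominates the budget; with long lines, flight speed does.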

  16. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee

    2016-01-01

    This article presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults and...... model resources, associated with a workflow. The approach is fully automated and only the modelling of the production workflows, potential faults and the expression of the goals require manual input. We present the design of a software tool implementing this framework and explore the practical utility...... of this approach through an industrial case study in which the risk of production failures and their impact are reduced by restructuring the workflow....
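
    The restructuring loop itself can be pictured as a simple evolutionary search. The sketch below is schematic and rests on stated assumptions: a toy, order-dependent fault-impact score stands in for the stochastic model checker, and a swap mutation stands in for the BPMN restructuring moves; it is not the paper's framework.

```python
import random

def fault_impact(workflow):
    # Stub for stochastic model checking: a toy, order-dependent score
    # standing in for the expected impact of faults on the workflow.
    return sum(pos * risk for pos, risk in enumerate(workflow, start=1))

def mutate(workflow):
    # Stub restructuring move: swap the order of two activities.
    w = list(workflow)
    i, j = random.sample(range(len(w)), 2)
    w[i], w[j] = w[j], w[i]
    return w

def evolve(workflow, generations=100, offspring=20):
    """Keep whichever restructuring has the lowest fault impact."""
    best, best_score = workflow, fault_impact(workflow)
    for _ in range(generations):
        for cand in (mutate(best) for _ in range(offspring)):
            score = fault_impact(cand)
            if score < best_score:
                best, best_score = cand, score
    return best, best_score

# Activities carry toy fault risks; minimizing the score moves risky ones earlier.
print(evolve([3, 1, 4, 1, 5]))
```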

  17. Automatic Implantable Cardiac Defibrillator

    Medline Plus

    Full Text Available Automatic Implantable Cardiac Defibrillator February 19, 2009 Halifax Health Medical Center, Daytona Beach, FL Welcome to Halifax Health Daytona Beach, Florida. Over the next hour you' ...

  18. Automatic Payroll Deposit System.

    Science.gov (United States)

    Davidson, D. B.

    1979-01-01

    The Automatic Payroll Deposit System in Yakima, Washington's Public School District No. 7, directly transmits each employee's salary amount for each pay period to a bank or other financial institution. (Author/MLF)

  19. Automatic Arabic Text Classification

    OpenAIRE

    Al-harbi, S; Almuhareb, A.; Al-Thubaity , A; Khorsheed, M. S.; Al-Rajeh, A.

    2008-01-01

    Automated document classification is an important text mining task especially with the rapid growth of the number of online documents present in Arabic language. Text classification aims to automatically assign the text to a predefined category based on linguistic features. Such a process has different useful applications including, but not restricted to, e-mail spam detection, web page content filtering, and automatic message routing. This paper presents the results of experiments on documen...

  20. An iterative workflow for mining the human intestinal metaproteome

    Directory of Open Access Journals (Sweden)

    Beauvallet Christian

    2011-01-01

    Full Text Available Abstract Background Peptide spectrum matching (PSM) is the standard method in shotgun proteomics data analysis. It relies on the availability of an accurate and complete sample proteome that is used to make interpretation of the spectra feasible. Although this procedure has proven to be effective in many proteomics studies, the approach has limitations when applied to complex samples of microbial communities, such as those found in the human intestinal tract. Metagenome studies have indicated that the human intestinal microbiome contains over 100 times more genes than the human genome, and it has been estimated that this ecosystem contains over 5000 bacterial species. The genomes of the vast majority of these species have not yet been sequenced and hence their proteomes remain unknown. To enable data analysis of shotgun proteomics data using PSM, and to circumvent the lack of a defined matched metaproteome, an iterative workflow was developed that is based on a synthetic metaproteome and on developing metagenomic databases, both representative of, but not necessarily originating from, the sample of interest. Results Two human fecal samples, for which metagenomic data had been collected, were analyzed for their metaproteome using liquid chromatography-mass spectrometry and used to benchmark the developed iterative workflow against other methods. The results show that the developed method is able to detect over 3,000 peptides per fecal sample from the spectral data by circumventing the lack of a defined proteome, without naive translation of matched metagenomes or cross-species peptide identification. Conclusions The developed iterative workflow achieved an approximately two-fold increase in the amount of identified spectra at a false discovery rate of 1% and can be applied in metaproteomic studies of the human intestinal tract or other complex ecosystems.
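
    A skeleton of such an iterative workflow, with the search engine and the metagenome-backed database expansion left as stubs (toy stand-ins are supplied below so the loop executes); this sketches only the iteration logic, not the benchmarked pipeline.

```python
def iterative_search(spectra, seed_db, search, expand_db, fdr=0.01, max_rounds=5):
    """Search against a seed (synthetic) metaproteome, grow the database
    from what was matched, and repeat until no new peptides appear."""
    db, identified = seed_db, set()
    for _ in range(max_rounds):
        hits = search(spectra, db, fdr)   # peptide-spectrum matches at the FDR
        new = set(hits) - identified
        if not new:
            break                         # converged: nothing new identified
        identified |= new
        db = expand_db(db, new)           # refine the database around the hits
    return identified

# Toy stand-ins so the skeleton runs end to end.
toy_search = lambda spectra, db, fdr: {s for s in spectra if s in db}
toy_expand = lambda db, new: db | {p + "*" for p in new}
print(iterative_search({"PEPTIDEA", "PEPTIDEB"}, {"PEPTIDEA"}, toy_search, toy_expand))
```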

  1. Leveraging an existing data warehouse to annotate workflow models for operations research and optimization.

    Science.gov (United States)

    Borlawsky, Tara; LaFountain, Jeanne; Petty, Lynda; Saltz, Joel H; Payne, Philip R O

    2008-01-01

    Workflow analysis is frequently performed in the context of operations research and process optimization. In order to develop a data-driven workflow model that can be employed to assess opportunities to improve the efficiency of perioperative care teams at The Ohio State University Medical Center (OSUMC), we have developed a method for integrating standard workflow modeling formalisms, such as UML activity diagrams, with data-centric annotations derived from our existing data warehouse. PMID:18999220

  2. Introducing OAWAL: crowdsourcing best practices for open access workflows in academic libraries

    OpenAIRE

    Emery, Jill; Stone, Graham

    2014-01-01

    OAWAL, currently in its formative stage, is intended to create an openly accessible wiki for librarians working on the management of open access workflow within their given institutions. At this point, the team has developed significant areas of focus for workflow management and will be building upon the current structure as informed through in-person and online crowdsourcing. The current sections to be developed are: advocacy, Creative Commons, the Library as Publisher, standards, workflow...

  3. Context-oriented scientific workflow system and its application in virtual screening

    OpenAIRE

    Fan, Xiaoliang; Brézillon, Patrick; Zhang, Ruisheng; Li, Lian

    2010-01-01

    Scientific workflow (SWF) systems are gradually liberating computational scientists from the burden of data-centric operations so that they can concentrate on decision making. However, contemporary SWF systems fail to address the variables that arise when scientists want to deliver new outcomes through reproduction of a workflow, including not only the workflow representation but also its "context" of use. This failure is mainly due to the lack of means for representing and managing that "context". We propose a context-or...

  4. An event- and repository-based component framework for workflow system architecture

    OpenAIRE

    Tombros, Dimitrios

    1999-01-01

    During the past decade a new class of systems has emerged, which plays an important role in the support of efficient business process implementation: workflow systems. Despite their proliferation, however, workflow systems are still being developed in an ad hoc way without making use of advanced software engineering technologies such as component-based system development and reuse of architecture artifacts. This work proposes a modern approach to workflow system construction. The approach is ce...

  5. The Workflow Specification of Process Definition

    Institute of Scientific and Technical Information of China (English)

    缪晓阳; 石文俊; 吴朝晖

    2000-01-01

    This paper discusses the representation of a business process in a formal way. There are three basic aspects: the concept of workflow process definition, from which the idea of process definition interchange arises; the meta-model of workflow, which is used to describe the entities, and the attributes of entities, within the process definition; and the workflow process definition language (WPDL), which is used to implement the process definition.
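
    As a rough illustration of what such a meta-model describes, here is a minimal sketch with the usual activity/transition entities; the class and field names are illustrative assumptions, not the WPDL schema.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Activity:
    name: str
    performer: str = ""          # participant responsible for the activity

@dataclass
class Transition:
    source: str                  # activity the flow leaves
    target: str                  # activity the flow enters
    condition: str = "TRUE"      # guard expression on the transition

@dataclass
class ProcessDefinition:
    name: str
    activities: List[Activity] = field(default_factory=list)
    transitions: List[Transition] = field(default_factory=list)

# A two-step review process expressed against the toy meta-model.
review = ProcessDefinition(
    "document-review",
    activities=[Activity("draft", "author"), Activity("approve", "editor")],
    transitions=[Transition("draft", "approve")],
)
```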

  6. Pegasus: A Framework for Mapping Complex Scientific Workflows onto Distributed Systems

    OpenAIRE

    Ewa Deelman; Gurmeet Singh; Mei-Hui Su; James Blythe; Yolanda Gil; Carl Kesselman; Gaurang Mehta; Karan Vahi; G. Bruce Berriman; John Good; Anastasia Laity; Jacob, Joseph C.; Katz, Daniel S.

    2005-01-01

    This paper describes the Pegasus framework that can be used to map complex scientific workflows onto distributed resources. Pegasus enables users to represent the workflows at an abstract level without needing to worry about the particulars of the target execution systems. The paper describes general issues in mapping applications and the functionality of Pegasus. We present the results of improving application performance through workflow restructuring which clusters multiple tasks in a work...

  7. Bandwidth-Aware Scheduling of Workflow Application on Multiple Grid Sites

    OpenAIRE

    Harshadkumar B. Prajapati; Shah, Vipul A.

    2014-01-01

    Bandwidth-aware workflow scheduling is required to improve the performance of a workflow application in a multisite Grid environment, as the data movement cost between two low-bandwidth sites can adversely affect the makespan of the application. Pegasus WMS, an open-source and freely available WMS, cannot fully utilize its workflow mapping capability due to the lack of any integrated bandwidth monitoring infrastructure. This paper develops the integration of Network Weather Se...
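
    The underlying idea reduces to folding measured bandwidth into the site-selection objective. A minimal sketch with illustrative numbers follows; the function and site names are assumptions, not Pegasus WMS code.

```python
def pick_site(task_runtime_s, data_gb, sites):
    """Choose the site minimizing runtime plus transfer time,
    where transfer time = data volume / measured bandwidth."""
    def makespan(site):
        name, bandwidth_gbps = site
        transfer_s = data_gb * 8.0 / bandwidth_gbps   # GB -> gigabits
        return task_runtime_s[name] + transfer_s
    return min(sites, key=makespan)

runtimes = {"siteA": 600.0, "siteB": 450.0}
# (site, usable bandwidth in Gbit/s), as a monitor such as NWS might report.
print(pick_site(runtimes, data_gb=100.0, sites=[("siteA", 10.0), ("siteB", 0.5)]))
# siteB computes faster, but its slow link makes siteA the better choice.
```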

  8. High performance workflow implementation for protein surface characterization using grid technology

    OpenAIRE

    Clematis Andrea; D'Agostino Daniele; Morra Giulia; Merelli Ivan; Milanesi Luciano

    2005-01-01

    Abstract Background This study concerns the development of a high performance workflow that, using grid technology, correlates different kinds of Bioinformatics data, starting from the base pairs of the nucleotide sequence to the exposed residues of the protein surface. The implementation of this workflow is based on the Italian Grid.it project infrastructure, that is a network of several computational resources and storage facilities distributed at different grid sites. Methods Workflows are...

  9. Development of a Workflow Integration Survey (WIS) for Implementing Computerized Clinical Decision Support

    OpenAIRE

    Flanagan, Mindy; Arbuckle, Nicole; Saleem, Jason J; Militello, Laura G.; Haggstrom, David A.; Doebbeling, Bradley N

    2011-01-01

    Interventions that focus on improving computerized clinical decision support (CDS) demonstrate that successful workflow integration can increase the adoption and use of CDS. However, metrics for assessing workflow integration in clinical settings are not well established. The goal of this study was to develop and validate a survey to assess the extent to which CDS is integrated into workflow. Qualitative data on CDS design, usability, and integration from four sites was collected by direct ob...

  10. Automated attribute inference in complex service workflows based on sharing analysis

    OpenAIRE

    Ivanovic, Dragan; Carro Liñares, Manuel; Hermenegildo, Manuel V.

    2011-01-01

    The properties of data and activities in business processes can be used to greatly facilitate several relevant tasks performed at design- and run-time, such as fragmentation, compliance checking, or top-down design. Business processes are often described using workflows. We present an approach for mechanically inferring business domain-specific attributes of workflow components (including data items, activities, and elements of sub-workflows), taking as starting point known attrib...

  11. A component-based product line architecture for workflow management systems

    OpenAIRE

    Lazilha, Fabrício Ricardo; Barroca, Leonor; de Oliveira Junior, Edson Alves; de Souza Gimenes, Itana Maria

    2004-01-01

    This paper presents a component-based product line for workflow management systems. The process followed to design the product line was based on the Catalysis method. Extensions were made to represent variability across the process. The domain of workflow management systems has been shown to be appropriate to the application of the product line approach as there are a standard architecture and models established by a regulatory board, the Workflow Management Coalition. In addition, there is a...

  12. Incorporating Workflow Interference in Facility Layout Design: The Quartic Assignment Problem

    OpenAIRE

    Wen-Chyuan Chiang; Panagiotis Kouvelis; Timothy L. Urban

    2002-01-01

    Although many authors have noted the importance of minimizing workflow interference in facility layout design, traditional layout research tends to focus on minimizing the distance-based transportation cost. This paper formalizes the concept of workflow interference from a facility layout perspective. A model, formulated as a quartic assignment problem, is developed that explicitly considers the interference of workflow. Optimal and heuristic solution methodologies are developed and evaluated.
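
    For orientation, a quartic assignment objective couples two flows at once, so each cost term involves four assignment variables; a generic form (notation assumed here, not copied from the paper) is:

```latex
\min_{x \in \{0,1\}^{n \times n}}
  \sum_{i,j,k,l} \sum_{p,q,r,s} c^{pqrs}_{ijkl}\, x_{ip}\, x_{jq}\, x_{kr}\, x_{ls}
\quad \text{s.t.} \quad
  \sum_{p} x_{ip} = 1 \;\; \forall i, \qquad
  \sum_{i} x_{ip} = 1 \;\; \forall p,
```

    where x_ip = 1 assigns department i to location p and c^pqrs_ijkl measures the interference between the flow i -> j and the flow k -> l given their four locations; with only pairwise distance costs the model collapses to the familiar quadratic assignment problem.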

  13. A MVC framework for policy-based adaptation of workflow processes: A case study on confidentiality

    OpenAIRE

    Geebelen, Kristof; Kulikowski, Eryk; Truyen, Eddy; Joosen, Wouter

    2010-01-01

    Most work on adaptive workflows offers insufficient flexibility to enforce complex policies regarding dynamic, evolvable and robust workflows. In addition, many proposed approaches require customized workflow engines. This paper presents a portable framework for realistic enforcement of dynamic adaptation policies in business processes. The framework is based on the Model-View-Controller (MVC) pattern, commonly used for adding dynamism to web pages. To enhance reusability, our approach suppor...

  14. Effects of the Interactions Between LPS and BIM on Workflow in Two Building Design Projects

    OpenAIRE

    Khan, Sheriz; Tzortzopoulos, Patricia

    2014-01-01

    Variability in design workflow causes delays and undermines the performance of building projects. As lean processes, the Last Planner System (LPS) and Building Information Modeling (BIM) can improve workflow in building projects through features that reduce waste. Since its introduction, BIM has had significant positive influence on workflow in building design projects, but these have been rarely considered in combination with LPS. This paper is part of a postgraduate research focusing on the...

  15. Using a suite of ontologies for preserving workflow-centric research objects

    OpenAIRE

    Belhajjame, Khalid; Zhao, Jun; Klyne, Graham; Goble, Carole; Garijo, Daniel; Gamble, Matthew; Hettne, Kristina; Palma, Raul; Mina, Eleni; Corcho, Oscar; Gómez-Pérez, José Manuel; Bechhofer, Sean

    2015-01-01

    Scientific workflows are a popular mechanism for specifying and automating data-driven in silico experiments. A significant aspect of their value lies in their potential to be reused. Once shared, workflows become useful building blocks that can be combined or modified for developing new experiments. However, previous studies have shown that storing workflow specifications alone is not sufficient to ensure that they can be successfully reused, without being able to understand what the workflo...

  16. CMS Data Processing Workflows during an Extended Cosmic Ray Run

    CERN Document Server

    Chatrchyan, S; Sirunyan, A M; Adam, W; Arnold, B; Bergauer, H; Bergauer, T; Dragicevic, M; Eichberger, M; Erö, J; Friedl, M; Frühwirth, R; Ghete, V M; Hammer, J; Hänsel, S; Hoch, M; Hörmann, N; Hrubec, J; Jeitler, M; Kasieczka, G; Kastner, K; Krammer, M; Liko, D; Magrans de Abril, I; Mikulec, I; Mittermayr, F; Neuherz, B; Oberegger, M; Padrta, M; Pernicka, M; Rohringer, H; Schmid, S; Schöfbeck, R; Schreiner, T; Stark, R; Steininger, H; Strauss, J; Taurok, A; Teischinger, F; Themel, T; Uhl, D; Wagner, P; Waltenberger, W; Walzel, G; Widl, E; Wulz, C E; Chekhovsky, V; Dvornikov, O; Emeliantchik, I; Litomin, A; Makarenko, V; Marfin, I; Mossolov, V; Shumeiko, N; Solin, A; Stefanovitch, R; Suarez Gonzalez, J; Tikhonov, A; Fedorov, A; Karneyeu, A; Korzhik, M; Panov, V; Zuyeuski, R; Kuchinsky, P; Beaumont, W; Benucci, L; Cardaci, M; De Wolf, E A; Delmeire, E; Druzhkin, D; Hashemi, M; Janssen, X; Maes, T; Mucibello, L; Ochesanu, S; Rougny, R; Selvaggi, M; Van Haevermaet, H; Van Mechelen, P; Van Remortel, N; Adler, V; Beauceron, S; Blyweert, S; D'Hondt, J; De Weirdt, S; Devroede, O; Heyninck, J; Kalogeropoulos, A; Maes, J; Maes, M; Mozer, M U; Tavernier, S; Van Doninck, W; Van Mulders, P; Villella, I; Bouhali, O; Chabert, E C; Charaf, O; Clerbaux, B; De Lentdecker, G; Dero, V; Elgammal, S; Gay, A P R; Hammad, G H; Marage, P E; Rugovac, S; Vander Velde, C; Vanlaer, P; Wickens, J; Grunewald, M; Klein, B; Marinov, A; Ryckbosch, D; Thyssen, F; Tytgat, M; Vanelderen, L; Verwilligen, P; Basegmez, S; Bruno, G; Caudron, J; Delaere, C; Demin, P; Favart, D; Giammanco, A; Grégoire, G; Lemaitre, V; Militaru, O; Ovyn, S; Piotrzkowski, K; Quertenmont, L; Schul, N; Beliy, N; Daubie, E; Alves, G A; Pol, M E; Souza, M H G; Carvalho, W; De Jesus Damiao, D; De Oliveira Martins, C; Fonseca De Souza, S; Mundim, L; Oguri, V; Santoro, A; Silva Do Amaral, S M; Sznajder, A; Fernandez Perez Tomei, T R; Ferreira Dias, M A; Gregores, E M; Novaes, S F; Abadjiev, K; Anguelov, T; Damgov, J; Darmenov, N; Dimitrov, L; Genchev, V; Iaydjiev, P; Piperov, S; Stoykova, S; Sultanov, G; Trayanov, R; Vankov, I; Dimitrov, A; Dyulendarova, M; Kozhuharov, V; Litov, L; Marinova, E; Mateev, M; Pavlov, B; Petkov, P; Toteva, Z; Chen, G M; Chen, H S; Guan, W; Jiang, C H; Liang, D; Liu, B; Meng, X; Tao, J; Wang, J; Wang, Z; Xue, Z; Zhang, Z; Ban, Y; Cai, J; Ge, Y; Guo, S; Hu, Z; Mao, Y; Qian, S J; Teng, H; Zhu, B; Avila, C; Baquero Ruiz, M; Carrillo Montoya, C A; Gomez, A; Gomez Moreno, B; Ocampo Rios, A A; Osorio Oliveros, A F; Reyes Romero, D; Sanabria, J C; Godinovic, N; Lelas, K; Plestina, R; Polic, D; Puljak, I; Antunovic, Z; Dzelalija, M; Brigljevic, V; Duric, S; Kadija, K; Morovic, S; Fereos, R; Galanti, M; Mousa, J; Papadakis, A; Ptochos, F; Razis, P A; Tsiakkouri, D; Zinonos, Z; Hektor, A; Kadastik, M; Kannike, K; Müntel, M; Raidal, M; Rebane, L; Anttila, E; Czellar, S; Härkönen, J; Heikkinen, A; Karimäki, V; Kinnunen, R; Klem, J; Kortelainen, M J; Lampén, T; Lassila-Perini, K; Lehti, S; Lindén, T; Luukka, P; Mäenpää, T; Nysten, J; Tuominen, E; Tuominiemi, J; Ungaro, D; Wendland, L; Banzuzi, K; Korpela, A; Tuuva, T; Nedelec, P; Sillou, D; Besancon, M; Chipaux, R; Dejardin, M; Denegri, D; Descamps, J; Fabbro, B; Faure, J L; Ferri, F; Ganjour, S; Gentit, F X; Givernaud, A; Gras, P; Hamel de Monchenault, G; Jarry, P; Lemaire, M C; Locci, E; Malcles, J; Marionneau, M; Millischer, L; Rander, J; Rosowsky, A; Rousseau, D; Titov, M; Verrecchia, P; Baffioni, S; Bianchini, L; Bluj, M; Busson, P; Charlot, C; Dobrzynski, L; Granier de Cassagnac, 
R; Haguenauer, M; Miné, P; Paganini, P; Sirois, Y; Thiebaux, C; Zabi, A; Agram, J L; Besson, A; Bloch, D; Bodin, D; Brom, J M; Conte, E; Drouhin, F; Fontaine, J C; Gelé, D; Goerlach, U; Gross, L; Juillot, P; Le Bihan, A C; Patois, Y; Speck, J; Van Hove, P; Baty, C; Bedjidian, M; Blaha, J; Boudoul, G; Brun, H; Chanon, N; Chierici, R; Contardo, D; Depasse, P; Dupasquier, T; El Mamouni, H; Fassi, F; Fay, J; Gascon, S; Ille, B; Kurca, T; Le Grand, T; Lethuillier, M; Lumb, N; Mirabito, L; Perries, S; Vander Donckt, M; Verdier, P; Djaoshvili, N; Roinishvili, N; Roinishvili, V; Amaglobeli, N; Adolphi, R; Anagnostou, G; Brauer, R; Braunschweig, W; Edelhoff, M; Esser, H; Feld, L; Karpinski, W; Khomich, A; Klein, K; Mohr, N; Ostaptchouk, A; Pandoulas, D; Pierschel, G; Raupach, F; Schael, S; Schultz von Dratzig, A; Schwering, G; Sprenger, D; Thomas, M; Weber, M; Wittmer, B; Wlochal, M; Actis, O; Altenhöfer, G; Bender, W; Biallass, P; Erdmann, M; Fetchenhauer, G; Frangenheim, J; Hebbeker, T; Hilgers, G; Hinzmann, A; Hoepfner, K; Hof, C; Kirsch, M; Klimkovich, T; Kreuzer, P; Lanske, D; Merschmeyer, M; Meyer, A; Philipps, B; Pieta, H; Reithler, H; Schmitz, S A; Sonnenschein, L; Sowa, M; Steggemann, J; Szczesny, H; Teyssier, D; Zeidler, C; Bontenackels, M; Davids, M; Duda, M; Flügge, G; Geenen, H; Giffels, M; Haj Ahmad, W; Hermanns, T; Heydhausen, D; Kalinin, S; Kress, T; Linn, A; Nowack, A; Perchalla, L; Poettgens, M; Pooth, O; Sauerland, P; Stahl, A; Tornier, D; Zoeller, M H; Aldaya Martin, M; Behrens, U; Borras, K; Campbell, A; Castro, E; Dammann, D; Eckerlin, G; Flossdorf, A; Flucke, G; Geiser, A; Hatton, D; Hauk, J; Jung, H; Kasemann, M; Katkov, I; Kleinwort, C; Kluge, H; Knutsson, A; Kuznetsova, E; Lange, W; Lohmann, W; Mankel, R; Marienfeld, M; Meyer, A B; Miglioranzi, S; Mnich, J; Ohlerich, M; Olzem, J; Parenti, A; Rosemann, C; Schmidt, R; Schoerner-Sadenius, T; Volyanskyy, D; Wissing, C; Zeuner, W D; Autermann, C; Bechtel, F; Draeger, J; Eckstein, D; Gebbert, U; Kaschube, K; Kaussen, G; Klanner, R; Mura, B; Naumann-Emme, S; Nowak, F; Pein, U; Sander, C; Schleper, P; Schum, T; Stadie, H; Steinbrück, G; Thomsen, J; Wolf, R; Bauer, J; Blüm, P; Buege, V; Cakir, A; Chwalek, T; De Boer, W; Dierlamm, A; Dirkes, G; Feindt, M; Felzmann, U; Frey, M; Furgeri, A; Gruschke, J; Hackstein, C; Hartmann, F; Heier, S; Heinrich, M; Held, H; Hirschbuehl, D; Hoffmann, K H; Honc, S; Jung, C; Kuhr, T; Liamsuwan, T; Martschei, D; Mueller, S; Müller, Th; Neuland, M B; Niegel, M; Oberst, O; Oehler, A; Ott, J; Peiffer, T; Piparo, D; Quast, G; Rabbertz, K; Ratnikov, F; Ratnikova, N; Renz, M; Saout, C; Sartisohn, G; Scheurer, A; Schieferdecker, P; Schilling, F P; Schott, G; Simonis, H J; Stober, F M; Sturm, P; Troendle, D; Trunov, A; Wagner, W; Wagner-Kuhr, J; Zeise, M; Zhukov, V; Ziebarth, E B; Daskalakis, G; Geralis, T; Karafasoulis, K; Kyriakis, A; Loukas, D; Markou, A; Markou, C; Mavrommatis, C; Petrakou, E; Zachariadou, A; Gouskos, L; Katsas, P; Panagiotou, A; Evangelou, I; Kokkas, P; Manthos, N; Papadopoulos, I; Patras, V; Triantis, F A; Bencze, G; Boldizsar, L; Debreczeni, G; Hajdu, C; Hernath, S; Hidas, P; Horvath, D; Krajczar, K; Laszlo, A; Patay, G; Sikler, F; Toth, N; Vesztergombi, G; Beni, N; Christian, G; Imrek, J; Molnar, J; Novak, D; Palinkas, J; Szekely, G; Szillasi, Z; Tokesi, K; Veszpremi, V; Kapusi, A; Marian, G; Raics, P; Szabo, Z; Trocsanyi, Z L; Ujvari, B; Zilizi, G; Bansal, S; Bawa, H S; Beri, S B; Bhatnagar, V; Jindal, M; Kaur, M; Kaur, R; Kohli, J M; Mehta, M Z; Nishu, N; Saini, L K; Sharma, A; 
Singh, A; Singh, J B; Singh, S P; Ahuja, S; Arora, S; Bhattacharya, S; Chauhan, S; Choudhary, B C; Gupta, P; Jain, S; Jain, S; Jha, M; Kumar, A; Ranjan, K; Shivpuri, R K; Srivastava, A K; Choudhury, R K; Dutta, D; Kailas, S; Kataria, S K; Mohanty, A K; Pant, L M; Shukla, P; Topkar, A; Aziz, T; Guchait, M; Gurtu, A; Maity, M; Majumder, D; Majumder, G; Mazumdar, K; Nayak, A; Saha, A; Sudhakar, K; Banerjee, S; Dugad, S; Mondal, N K; Arfaei, H; Bakhshiansohi, H; Fahim, A; Jafari, A; Mohammadi Najafabadi, M; Moshaii, A; Paktinat Mehdiabadi, S; Rouhani, S; Safarzadeh, B; Zeinali, M; Felcini, M; Abbrescia, M; Barbone, L; Chiumarulo, F; Clemente, A; Colaleo, A; Creanza, D; Cuscela, G; De Filippis, N; De Palma, M; De Robertis, G; Donvito, G; Fedele, F; Fiore, L; Franco, M; Iaselli, G; Lacalamita, N; Loddo, F; Lusito, L; Maggi, G; Maggi, M; Manna, N; Marangelli, B; My, S; Natali, S; Nuzzo, S; Papagni, G; Piccolomo, S; Pierro, G A; Pinto, C; Pompili, A; Pugliese, G; Rajan, R; Ranieri, A; Romano, F; Roselli, G; Selvaggi, G; Shinde, Y; Silvestris, L; Tupputi, S; Zito, G; Abbiendi, G; Bacchi, W; Benvenuti, A C; Boldini, M; Bonacorsi, D; Braibant-Giacomelli, S; Cafaro, V D; Caiazza, S S; Capiluppi, P; Castro, A; Cavallo, F R; Codispoti, G; Cuffiani, M; D'Antone, I; Dallavalle, G M; Fabbri, F; Fanfani, A; Fasanella, D; Giacomelli, P; Giordano, V; Giunta, M; Grandi, C; Guerzoni, M; Marcellini, S; Masetti, G; Montanari, A; Navarria, F L; Odorici, F; Pellegrini, G; Perrotta, A; Rossi, A M; Rovelli, T; Siroli, G; Torromeo, G; Travaglini, R; Albergo, S; Costa, S; Potenza, R; Tricomi, A; Tuve, C; Barbagli, G; Broccolo, G; Ciulli, V; Civinini, C; D'Alessandro, R; Focardi, E; Frosali, S; Gallo, E; Genta, C; Landi, G; Lenzi, P; Meschini, M; Paoletti, S; Sguazzoni, G; Tropiano, A; Benussi, L; Bertani, M; Bianco, S; Colafranceschi, S; Colonna, D; Fabbri, F; Giardoni, M; Passamonti, L; Piccolo, D; Pierluigi, D; Ponzio, B; Russo, A; Fabbricatore, P; Musenich, R; Benaglia, A; Calloni, M; Cerati, G B; D'Angelo, P; De Guio, F; Farina, F M; Ghezzi, A; Govoni, P; Malberti, M; Malvezzi, S; Martelli, A; Menasce, D; Miccio, V; Moroni, L; Negri, P; Paganoni, M; Pedrini, D; Pullia, A; Ragazzi, S; Redaelli, N; Sala, S; Salerno, R; Tabarelli de Fatis, T; Tancini, V; Taroni, S; Buontempo, S; Cavallo, N; Cimmino, A; De Gruttola, M; Fabozzi, F; Iorio, A O M; Lista, L; Lomidze, D; Noli, P; Paolucci, P; Sciacca, C; Azzi, P; Bacchetta, N; Barcellan, L; Bellan, P; Bellato, M; Benettoni, M; Biasotto, M; Bisello, D; Borsato, E; Branca, A; Carlin, R; Castellani, L; Checchia, P; Conti, E; Dal Corso, F; De Mattia, M; Dorigo, T; Dosselli, U; Fanzago, F; Gasparini, F; Gasparini, U; Giubilato, P; Gonella, F; Gresele, A; Gulmini, M; Kaminskiy, A; Lacaprara, S; Lazzizzera, I; Margoni, M; Maron, G; Mattiazzo, S; Mazzucato, M; Meneghelli, M; Meneguzzo, A T; Michelotto, M; Montecassiano, F; Nespolo, M; Passaseo, M; Pegoraro, M; Perrozzi, L; Pozzobon, N; Ronchese, P; Simonetto, F; Toniolo, N; Torassa, E; Tosi, M; Triossi, A; Vanini, S; Ventura, S; Zotto, P; Zumerle, G; Baesso, P; Berzano, U; Bricola, S; Necchi, M M; Pagano, D; Ratti, S P; Riccardi, C; Torre, P; Vicini, A; Vitulo, P; Viviani, C; Aisa, D; Aisa, S; Babucci, E; Biasini, M; Bilei, G M; Caponeri, B; Checcucci, B; Dinu, N; Fanò, L; Farnesini, L; Lariccia, P; Lucaroni, A; Mantovani, G; Nappi, A; Piluso, A; Postolache, V; Santocchia, A; Servoli, L; Tonoiu, D; Vedaee, A; Volpe, R; Azzurri, P; Bagliesi, G; Bernardini, J; Berretta, L; Boccali, T; Bocci, A; Borrello, L; Bosi, F; Calzolari, F; 
Castaldi, R; Dell'Orso, R; Fiori, F; Foà, L; Gennai, S; Giassi, A; Kraan, A; Ligabue, F; Lomtadze, T; Mariani, F; Martini, L; Massa, M; Messineo, A; Moggi, A; Palla, F; Palmonari, F; Petragnani, G; Petrucciani, G; Raffaelli, F; Sarkar, S; Segneri, G; Serban, A T; Spagnolo, P; Tenchini, R; Tolaini, S; Tonelli, G; Venturi, A; Verdini, P G; Baccaro, S; Barone, L; Bartoloni, A; Cavallari, F; Dafinei, I; Del Re, D; Di Marco, E; Diemoz, M; Franci, D; Longo, E; Organtini, G; Palma, A; Pandolfi, F; Paramatti, R; Pellegrino, F; Rahatlou, S; Rovelli, C; Alampi, G; Amapane, N; Arcidiacono, R; Argiro, S; Arneodo, M; Biino, C; Borgia, M A; Botta, C; Cartiglia, N; Castello, R; Cerminara, G; Costa, M; Dattola, D; Dellacasa, G; Demaria, N; Dughera, G; Dumitrache, F; Graziano, A; Mariotti, C; Marone, M; Maselli, S; Migliore, E; Mila, G; Monaco, V; Musich, M; Nervo, M; Obertino, M M; Oggero, S; Panero, R; Pastrone, N; Pelliccioni, M; Romero, A; Ruspa, M; Sacchi, R; Solano, A; Staiano, A; Trapani, P P; Trocino, D; Vilela Pereira, A; Visca, L; Zampieri, A; Ambroglini, F; Belforte, S; Cossutti, F; Della Ricca, G; Gobbo, B; Penzo, A; Chang, S; Chung, J; Kim, D H; Kim, G N; Kong, D J; Park, H; Son, D C; Bahk, S Y; Song, S; Jung, S Y; Hong, B; Kim, H; Kim, J H; Lee, K S; Moon, D H; Park, S K; Rhee, H B; Sim, K S; Kim, J; Choi, M; Hahn, G; Park, I C; Choi, S; Choi, Y; Goh, J; Jeong, H; Kim, T J; Lee, J; Lee, S; Janulis, M; Martisiute, D; Petrov, P; Sabonis, T; Castilla Valdez, H; Sánchez Hernández, A; Carrillo Moreno, S; Morelos Pineda, A; Allfrey, P; Gray, R N C; Krofcheck, D; Bernardino Rodrigues, N; Butler, P H; Signal, T; Williams, J C; Ahmad, M; Ahmed, I; Ahmed, W; Asghar, M I; Awan, M I M; Hoorani, H R; Hussain, I; Khan, W A; Khurshid, T; Muhammad, S; Qazi, S; Shahzad, H; Cwiok, M; Dabrowski, R; Dominik, W; Doroba, K; Konecki, M; Krolikowski, J; Pozniak, K; Romaniuk, Ryszard; Zabolotny, W; Zych, P; Frueboes, T; Gokieli, R; Goscilo, L; Górski, M; Kazana, M; Nawrocki, K; Szleper, M; Wrochna, G; Zalewski, P; Almeida, N; Antunes Pedro, L; Bargassa, P; David, A; Faccioli, P; Ferreira Parracho, P G; Freitas Ferreira, M; Gallinaro, M; Guerra Jordao, M; Martins, P; Mini, G; Musella, P; Pela, J; Raposo, L; Ribeiro, P Q; Sampaio, S; Seixas, J; Silva, J; Silva, P; Soares, D; Sousa, M; Varela, J; Wöhri, H K; Altsybeev, I; Belotelov, I; Bunin, P; Ershov, Y; Filozova, I; Finger, M; Finger, M Jr; Golunov, A; Golutvin, I; Gorbounov, N; Kalagin, V; Kamenev, A; Karjavin, V; Konoplyanikov, V; Korenkov, V; Kozlov, G; Kurenkov, A; Lanev, A; Makankin, A; Mitsyn, V V; Moisenz, P; Nikonov, E; Oleynik, D; Palichik, V; Perelygin, V; Petrosyan, A; Semenov, R; Shmatov, S; Smirnov, V; Smolin, D; Tikhonenko, E; Vasil'ev, S; Vishnevskiy, A; Volodko, A; Zarubin, A; Zhiltsov, V; Bondar, N; Chtchipounov, L; Denisov, A; Gavrikov, Y; Gavrilov, G; Golovtsov, V; Ivanov, Y; Kim, V; Kozlov, V; Levchenko, P; Obrant, G; Orishchin, E; Petrunin, A; Shcheglov, Y; Shchetkovskiy, A; Sknar, V; Smirnov, I; Sulimov, V; Tarakanov, V; Uvarov, L; Vavilov, S; Velichko, G; Volkov, S; Vorobyev, A; Andreev, Yu; Anisimov, A; Antipov, P; Dermenev, A; Gninenko, S; Golubev, N; Kirsanov, M; Krasnikov, N; Matveev, V; Pashenkov, A; Postoev, V E; Solovey, A; Solovey, A; Toropin, A; Troitsky, S; Baud, A; Epshteyn, V; Gavrilov, V; Ilina, N; Kaftanov, V; Kolosov, V; Kossov, M; Krokhotin, A; Kuleshov, S; Oulianov, A; Safronov, G; Semenov, S; Shreyber, I; Stolin, V; Vlasov, E; Zhokin, A; Boos, E; Dubinin, M; Dudko, L; Ershov, A; Gribushin, A; Klyukhin, V; Kodolova, O; Lokhtin, 
I; Petrushanko, S; Sarycheva, L; Savrin, V; Snigirev, A; Vardanyan, I; Dremin, I; Kirakosyan, M; Konovalova, N; Rusakov, S V; Vinogradov, A; Akimenko, S; Artamonov, A; Azhgirey, I; Bitioukov, S; Burtovoy, V; Grishin, V; Kachanov, V; Konstantinov, D; Krychkine, V; Levine, A; Lobov, I; Lukanin, V; Mel'nik, Y; Petrov, V; Ryutin, R; Slabospitsky, S; Sobol, A; Sytine, A; Tourtchanovitch, L; Troshin, S; Tyurin, N; Uzunian, A; Volkov, A; Adzic, P; Djordjevic, M; Jovanovic, D; Krpic, D; Maletic, D; Puzovic, J; Smiljkovic, N; Aguilar-Benitez, M; Alberdi, J; Alcaraz Maestre, J; Arce, P; Barcala, J M; Battilana, C; Burgos Lazaro, C; Caballero Bejar, J; Calvo, E; Cardenas Montes, M; Cepeda, M; Cerrada, M; Chamizo Llatas, M; Clemente, F; Colino, N; Daniel, M; De La Cruz, B; Delgado Peris, A; Diez Pardos, C; Fernandez Bedoya, C; Fernández Ramos, J P; Ferrando, A; Flix, J; Fouz, M C; Garcia-Abia, P; Garcia-Bonilla, A C; Gonzalez Lopez, O; Goy Lopez, S; Hernandez, J M; Josa, M I; Marin, J; Merino, G; Molina, J; Molinero, A; Navarrete, J J; Oller, J C; Puerta Pelayo, J; Romero, L; Santaolalla, J; Villanueva Munoz, C; Willmott, C; Yuste, C; Albajar, C; Blanco Otano, M; de Trocóniz, J F; Garcia Raboso, A; Lopez Berengueres, J O; Cuevas, J; Fernandez Menendez, J; Gonzalez Caballero, I; Lloret Iglesias, L; Naves Sordo, H; Vizan Garcia, J M; Cabrillo, I J; Calderon, A; Chuang, S H; Diaz Merino, I; Diez Gonzalez, C; Duarte Campderros, J; Fernandez, M; Gomez, G; Gonzalez Sanchez, J; Gonzalez Suarez, R; Jorda, C; Lobelle Pardo, P; Lopez Virto, A; Marco, J; Marco, R; Martinez Rivero, C; Martinez Ruiz del Arbol, P; Matorras, F; Rodrigo, T; Ruiz Jimeno, A; Scodellaro, L; Sobron Sanudo, M; Vila, I; Vilar Cortabitarte, R; Abbaneo, D; Albert, E; Alidra, M; Ashby, S; Auffray, E; Baechler, J; Baillon, P; Ball, A H; Bally, S L; Barney, D; Beaudette, F; Bellan, R; Benedetti, D; Benelli, G; Bernet, C; Bloch, P; Bolognesi, S; Bona, M; Bos, J; Bourgeois, N; Bourrel, T; Breuker, H; Bunkowski, K; Campi, D; Camporesi, T; Cano, E; Cattai, A; Chatelain, J P; Chauvey, M; Christiansen, T; Coarasa Perez, J A; Conde Garcia, A; Covarelli, R; Curé, B; De Roeck, A; Delachenal, V; Deyrail, D; Di Vincenzo, S; Dos Santos, S; Dupont, T; Edera, L M; Elliott-Peisert, A; Eppard, M; Favre, M; Frank, N; Funk, W; Gaddi, A; Gastal, M; Gateau, M; Gerwig, H; Gigi, D; Gill, K; Giordano, D; Girod, J P; Glege, F; Gomez-Reino Garrido, R; Goudard, R; Gowdy, S; Guida, R; Guiducci, L; Gutleber, J; Hansen, M; Hartl, C; Harvey, J; Hegner, B; Hoffmann, H F; Holzner, A; Honma, A; Huhtinen, M; Innocente, V; Janot, P; Le Godec, G; Lecoq, P; Leonidopoulos, C; Loos, R; Lourenço, C; Lyonnet, A; Macpherson, A; Magini, N; Maillefaud, J D; Maire, G; Mäki, T; Malgeri, L; Mannelli, M; Masetti, L; Meijers, F; Meridiani, P; Mersi, S; Meschi, E; Meynet Cordonnier, A; Moser, R; Mulders, M; Mulon, J; Noy, M; Oh, A; Olesen, G; Onnela, A; Orimoto, T; Orsini, L; Perez, E; Perinic, G; Pernot, J F; Petagna, P; Petiot, P; Petrilli, A; Pfeiffer, A; Pierini, M; Pimiä, M; Pintus, R; Pirollet, B; Postema, H; Racz, A; Ravat, S; Rew, S B; Rodrigues Antunes, J; Rolandi, G; Rovere, M; Ryjov, V; Sakulin, H; Samyn, D; Sauce, H; Schäfer, C; Schlatter, W D; Schröder, M; Schwick, C; Sciaba, A; Segoni, I; Sharma, A; Siegrist, N; Siegrist, P; Sinanis, N; Sobrier, T; Sphicas, P; Spiga, D; Spiropulu, M; Stöckli, F; Traczyk, P; Tropea, P; Troska, J; Tsirou, A; Veillet, L; Veres, G I; Voutilainen, M; Wertelaers, P; Zanetti, M; Bertl, W; Deiters, K; Erdmann, W; Gabathuler, K; Horisberger, R; Ingram, Q; 
Kaestli, H C; König, S; Kotlinski, D; Langenegger, U; Meier, F; Renker, D; Rohe, T; Sibille, J; Starodumov, A; Betev, B; Caminada, L; Chen, Z; Cittolin, S; Da Silva Di Calafiori, D R; Dambach, S; Dissertori, G; Dittmar, M; Eggel, C; Eugster, J; Faber, G; Freudenreich, K; Grab, C; Hervé, A; Hintz, W; Lecomte, P; Luckey, P D; Lustermann, W; Marchica, C; Milenovic, P; Moortgat, F; Nardulli, A; Nessi-Tedaldi, F; Pape, L; Pauss, F; Punz, T; Rizzi, A; Ronga, F J; Sala, L; Sanchez, A K; Sawley, M C; Sordini, V; Stieger, B; Tauscher, L; Thea, A; Theofilatos, K; Treille, D; Trüb, P; Weber, M; Wehrli, L; Weng, J; Zelepoukine, S; Amsler, C; Chiochia, V; De Visscher, S; Regenfus, C; Robmann, P; Rommerskirchen, T; Schmidt, A; Tsirigkas, D; Wilke, L; Chang, Y H; Chen, E A; Chen, W T; Go, A; Kuo, C M; Li, S W; Lin, W; Bartalini, P; Chang, P; Chao, Y; Chen, K F; Hou, W S; Hsiung, Y; Lei, Y J; Lin, S W; Lu, R S; Schümann, J; Shiu, J G; Tzeng, Y M; Ueno, K; Velikzhanin, Y; Wang, C C; Wang, M; Adiguzel, A; Ayhan, A; Azman Gokce, A; Bakirci, M N; Cerci, S; Dumanoglu, I; Eskut, E; Girgis, S; Gurpinar, E; Hos, I; Karaman, T; Karaman, T; Kayis Topaksu, A; Kurt, P; Önengüt, G; Önengüt Gökbulut, G; Ozdemir, K; Ozturk, S; Polatöz, A; Sogut, K; Tali, B; Topakli, H; Uzun, D; Vergili, L N; Vergili, M; Akin, I V; Aliev, T; Bilmis, S; Deniz, M; Gamsizkan, H; Guler, A M; Öcalan, K; Serin, M; Sever, R; Surat, U E; Zeyrek, M; Deliomeroglu, M; Demir, D; Gülmez, E; Halu, A; Isildak, B; Kaya, M; Kaya, O; Ozkorucuklu, S; Sonmez, N; Levchuk, L; Lukyanenko, S; Soroka, D; Zub, S; Bostock, F; Brooke, J J; Cheng, T L; Cussans, D; Frazier, R; Goldstein, J; Grant, N; Hansen, M; Heath, G P; Heath, H F; Hill, C; Huckvale, B; Jackson, J; Mackay, C K; Metson, S; Newbold, D M; Nirunpong, K; Smith, V J; Velthuis, J; Walton, R; Bell, K W; Brew, C; Brown, R M; Camanzi, B; Cockerill, D J A; Coughlan, J A; Geddes, N I; Harder, K; Harper, S; Kennedy, B W; Murray, P; Shepherd-Themistocleous, C H; Tomalin, I R; Williams, J H; Womersley, W J; Worm, S D; Bainbridge, R; Ball, G; Ballin, J; Beuselinck, R; Buchmuller, O; Colling, D; Cripps, N; Davies, G; Della Negra, M; Foudas, C; Fulcher, J; Futyan, D; Hall, G; Hays, J; Iles, G; Karapostoli, G; MacEvoy, B C; Magnan, A M; Marrouche, J; Nash, J; Nikitenko, A; Papageorgiou, A; Pesaresi, M; Petridis, K; Pioppi, M; Raymond, D M; Rompotis, N; Rose, A; Ryan, M J; Seez, C; Sharp, P; Sidiropoulos, G; Stettler, M; Stoye, M; Takahashi, M; Tapper, A; Timlin, C; Tourneur, S; Vazquez Acosta, M; Virdee, T; Wakefield, S; Wardrope, D; Whyntie, T; Wingham, M; Cole, J E; Goitom, I; Hobson, P R; Khan, A; Kyberd, P; Leslie, D; Munro, C; Reid, I D; Siamitros, C; Taylor, R; Teodorescu, L; Yaselli, I; Bose, T; Carleton, M; Hazen, E; Heering, A H; Heister, A; John, J St; Lawson, P; Lazic, D; Osborne, D; Rohlf, J; Sulak, L; Wu, S; Andrea, J; Avetisyan, A; Bhattacharya, S; Chou, J P; Cutts, D; Esen, S; Kukartsev, G; Landsberg, G; Narain, M; Nguyen, D; Speer, T; Tsang, K V; Breedon, R; Calderon De La Barca Sanchez, M; Case, M; Cebra, D; Chertok, M; Conway, J; Cox, P T; Dolen, J; Erbacher, R; Friis, E; Ko, W; Kopecky, A; Lander, R; Lister, A; Liu, H; Maruyama, S; Miceli, T; Nikolic, M; Pellett, D; Robles, J; Searle, M; Smith, J; Squires, M; Stilley, J; Tripathi, M; Vasquez Sierra, R; Veelken, C; Andreev, V; Arisaka, K; Cline, D; Cousins, R; Erhan, S; Hauser, J; Ignatenko, M; Jarvis, C; Mumford, J; Plager, C; Rakness, G; Schlein, P; Tucker, J; Valuev, V; Wallny, R; Yang, X; Babb, J; Bose, M; Chandra, A; Clare, R; Ellison, J A; 
Gary, J W; Hanson, G; Jeng, G Y; Kao, S C; Liu, F; Liu, H; Luthra, A; Nguyen, H; Pasztor, G; Satpathy, A; Shen, B C; Stringer, R; Sturdy, J; Sytnik, V; Wilken, R; Wimpenny, S; Branson, J G; Dusinberre, E; Evans, D; Golf, F; Kelley, R; Lebourgeois, M; Letts, J; Lipeles, E; Mangano, B; Muelmenstaedt, J; Norman, M; Padhi, S; Petrucci, A; Pi, H; Pieri, M; Ranieri, R; Sani, M; Sharma, V; Simon, S; Würthwein, F; Yagil, A; Campagnari, C; D'Alfonso, M; Danielson, T; Garberson, J; Incandela, J; Justus, C; Kalavase, P; Koay, S A; Kovalskyi, D; Krutelyov, V; Lamb, J; Lowette, S; Pavlunin, V; Rebassoo, F; Ribnik, J; Richman, J; Rossin, R; Stuart, D; To, W; Vlimant, J R; Witherell, M; Apresyan, A; Bornheim, A; Bunn, J; Chiorboli, M; Gataullin, M; Kcira, D; Litvine, V; Ma, Y; Newman, H B; Rogan, C; Timciuc, V; Veverka, J; Wilkinson, R; Yang, Y; Zhang, L; Zhu, K; Zhu, R Y; Akgun, B; Carroll, R; Ferguson, T; Jang, D W; Jun, S Y; Paulini, M; Russ, J; Terentyev, N; Vogel, H; Vorobiev, I; Cumalat, J P; Dinardo, M E; Drell, B R; Ford, W T; Heyburn, B; Luiggi Lopez, E; Nauenberg, U; Stenson, K; Ulmer, K; Wagner, S R; Zang, S L; Agostino, L; Alexander, J; Blekman, F; Cassel, D; Chatterjee, A; Das, S; Gibbons, L K; Heltsley, B; Hopkins, W; Khukhunaishvili, A; Kreis, B; Kuznetsov, V; Patterson, J R; Puigh, D; Ryd, A; Shi, X; Stroiney, S; Sun, W; Teo, W D; Thom, J; Vaughan, J; Weng, Y; Wittich, P; Beetz, C P; Cirino, G; Sanzeni, C; Winn, D; Abdullin, S; Afaq, M A; Albrow, M; Ananthan, B; Apollinari, G; Atac, M; Badgett, W; Bagby, L; Bakken, J A; Baldin, B; Banerjee, S; Banicz, K; Bauerdick, L A T; Beretvas, A; Berryhill, J; Bhat, P C; Biery, K; Binkley, M; Bloch, I; Borcherding, F; Brett, A M; Burkett, K; Butler, J N; Chetluru, V; Cheung, H W K; Chlebana, F; Churin, I; Cihangir, S; Crawford, M; Dagenhart, W; Demarteau, M; Derylo, G; Dykstra, D; Eartly, D P; Elias, J E; Elvira, V D; Evans, D; Feng, L; Fischler, M; Fisk, I; Foulkes, S; Freeman, J; Gartung, P; Gottschalk, E; Grassi, T; Green, D; Guo, Y; Gutsche, O; Hahn, A; Hanlon, J; Harris, R M; Holzman, B; Howell, J; Hufnagel, D; James, E; Jensen, H; Johnson, M; Jones, C D; Joshi, U; Juska, E; Kaiser, J; Klima, B; Kossiakov, S; Kousouris, K; Kwan, S; Lei, C M; Limon, P; Lopez Perez, J A; Los, S; Lueking, L; Lukhanin, G; Lusin, S; Lykken, J; Maeshima, K; Marraffino, J M; Mason, D; McBride, P; Miao, T; Mishra, K; Moccia, S; Mommsen, R; Mrenna, S; Muhammad, A S; Newman-Holmes, C; Noeding, C; O'Dell, V; Prokofyev, O; Rivera, R; Rivetta, C H; Ronzhin, A; Rossman, P; Ryu, S; Sekhri, V; Sexton-Kennedy, E; Sfiligoi, I; Sharma, S; Shaw, T M; Shpakov, D; Skup, E; Smith, R P; Soha, A; Spalding, W J; Spiegel, L; Suzuki, I; Tan, P; Tanenbaum, W; Tkaczyk, S; Trentadue, R; Uplegger, L; Vaandering, E W; Vidal, R; Whitmore, J; Wicklund, E; Wu, W; Yarba, J; Yumiceva, F; Yun, J C; Acosta, D; Avery, P; Barashko, V; Bourilkov, D; Chen, M; Di Giovanni, G P; Dobur, D; Drozdetskiy, A; Field, R D; Fu, Y; Furic, I K; Gartner, J; Holmes, D; Kim, B; Klimenko, S; Konigsberg, J; Korytov, A; Kotov, K; Kropivnitskaya, A; Kypreos, T; Madorsky, A; Matchev, K; Mitselmakher, G; Pakhotin, Y; Piedra Gomez, J; Prescott, C; Rapsevicius, V; Remington, R; Schmitt, M; Scurlock, B; Wang, D; Yelton, J; Ceron, C; Gaultney, V; Kramer, L; Lebolo, L M; Linn, S; Markowitz, P; Martinez, G; Rodriguez, J L; Adams, T; Askew, A; Baer, H; Bertoldi, M; Chen, J; Dharmaratna, W G D; Gleyzer, S V; Haas, J; Hagopian, S; Hagopian, V; Jenkins, M; Johnson, K F; Prettner, E; Prosper, H; Sekmen, S; Baarmand, M M; Guragain, S; 
Hohlmann, M; Kalakhety, H; Mermerkaya, H; Ralich, R; Vodopiyanov, I; Abelev, B; Adams, M R; Anghel, I M; Apanasevich, L; Bazterra, V E; Betts, R R; Callner, J; Castro, M A; Cavanaugh, R; Dragoiu, C; Garcia-Solis, E J; Gerber, C E; Hofman, D J; Khalatian, S; Mironov, C; Shabalina, E; Smoron, A; Varelas, N; Akgun, U; Albayrak, E A; Ayan, A S; Bilki, B; Briggs, R; Cankocak, K; Chung, K; Clarida, W; Debbins, P; Duru, F; Ingram, F D; Lae, C K; McCliment, E; Merlo, J P; Mestvirishvili, A; Miller, M J; Moeller, A; Nachtman, J; Newsom, C R; Norbeck, E; Olson, J; Onel, Y; Ozok, F; Parsons, J; Schmidt, I; Sen, S; Wetzel, J; Yetkin, T; Yi, K; Barnett, B A; Blumenfeld, B; Bonato, A; Chien, C Y; Fehling, D; Giurgiu, G; Gritsan, A V; Guo, Z J; Maksimovic, P; Rappoccio, S; Swartz, M; Tran, N V; Zhang, Y; Baringer, P; Bean, A; Grachov, O; Murray, M; Radicci, V; Sanders, S; Wood, J S; Zhukova, V; Bandurin, D; Bolton, T; Kaadze, K; Liu, A; Maravin, Y; Onoprienko, D; Svintradze, I; Wan, Z; Gronberg, J; Hollar, J; Lange, D; Wright, D; Baden, D; Bard, R; Boutemeur, M; Eno, S C; Ferencek, D; Hadley, N J; Kellogg, R G; Kirn, M; Kunori, S; Rossato, K; Rumerio, P; Santanastasio, F; Skuja, A; Temple, J; Tonjes, M B; Tonwar, S C; Toole, T; Twedt, E; Alver, B; Bauer, G; Bendavid, J; Busza, W; Butz, E; Cali, I A; Chan, M; D'Enterria, D; Everaerts, P; Gomez Ceballos, G; Hahn, K A; Harris, P; Jaditz, S; Kim, Y; Klute, M; Lee, Y J; Li, W; Loizides, C; Ma, T; Miller, M; Nahn, S; Paus, C; Roland, C; Roland, G; Rudolph, M; Stephans, G; Sumorok, K; Sung, K; Vaurynovich, S; Wenger, E A; Wyslouch, B; Xie, S; Yilmaz, Y; Yoon, A S; Bailleux, D; Cooper, S I; Cushman, P; Dahmes, B; De Benedetti, A; Dolgopolov, A; Dudero, P R; Egeland, R; Franzoni, G; Haupt, J; Inyakin, A; Klapoetke, K; Kubota, Y; Mans, J; Mirman, N; Petyt, D; Rekovic, V; Rusack, R; Schroeder, M; Singovsky, A; Zhang, J; Cremaldi, L M; Godang, R; Kroeger, R; Perera, L; Rahmat, R; Sanders, D A; Sonnek, P; Summers, D; Bloom, K; Bockelman, B; Bose, S; Butt, J; Claes, D R; Dominguez, A; Eads, M; Keller, J; Kelly, T; Kravchenko, I; Lazo-Flores, J; Lundstedt, C; Malbouisson, H; Malik, S; Snow, G R; Baur, U; Iashvili, I; Kharchilava, A; Kumar, A; Smith, K; Strang, M; Alverson, G; Barberis, E; Boeriu, O; Eulisse, G; Govi, G; McCauley, T; Musienko, Y; Muzaffar, S; Osborne, I; Paul, T; Reucroft, S; Swain, J; Taylor, L; Tuura, L; Anastassov, A; Gobbi, B; Kubik, A; Ofierzynski, R A; Pozdnyakov, A; Schmitt, M; Stoynev, S; Velasco, M; Won, S; Antonelli, L; Berry, D; Hildreth, M; Jessop, C; Karmgard, D J; Kolberg, T; Lannon, K; Lynch, S; Marinelli, N; Morse, D M; Ruchti, R; Slaunwhite, J; Warchol, J; Wayne, M; Bylsma, B; Durkin, L S; Gilmore, J; Gu, J; Killewald, P; Ling, T Y; Williams, G; Adam, N; Berry, E; Elmer, P; Garmash, A; Gerbaudo, D; Halyo, V; Hunt, A; Jones, J; Laird, E; Marlow, D; Medvedeva, T; Mooney, M; Olsen, J; Piroué, P; Stickland, D; Tully, C; Werner, J S; Wildish, T; Xie, Z; Zuranski, A; Acosta, J G; Bonnett Del Alamo, M; Huang, X T; Lopez, A; Mendez, H; Oliveros, S; Ramirez Vargas, J E; Santacruz, N; Zatzerklyany, A; Alagoz, E; Antillon, E; Barnes, V E; Bolla, G; Bortoletto, D; Everett, A; Garfinkel, A F; Gecse, Z; Gutay, L; Ippolito, N; Jones, M; Koybasi, O; Laasanen, A T; Leonardo, N; Liu, C; Maroussov, V; Merkel, P; Miller, D H; Neumeister, N; Sedov, A; Shipsey, I; Yoo, H D; Zheng, Y; Jindal, P; Parashar, N; Cuplov, V; Ecklund, K M; Geurts, F J M; Liu, J H; Maronde, D; Matveev, M; Padley, B P; Redjimi, R; Roberts, J; Sabbatini, L; Tumanov, A; Betchart, B; 
Bodek, A; Budd, H; Chung, Y S; de Barbaro, P; Demina, R; Flacher, H; Gotra, Y; Harel, A; Korjenevski, S; Miner, D C; Orbaker, D; Petrillo, G; Vishnevskiy, D; Zielinski, M; Bhatti, A; Demortier, L; Goulianos, K; Hatakeyama, K; Lungu, G; Mesropian, C; Yan, M; Atramentov, O; Bartz, E; Gershtein, Y; Halkiadakis, E; Hits, D; Lath, A; Rose, K; Schnetzer, S; Somalwar, S; Stone, R; Thomas, S; Watts, T L; Cerizza, G; Hollingsworth, M; Spanier, S; Yang, Z C; York, A; Asaadi, J; Aurisano, A; Eusebi, R; Golyash, A; Gurrola, A; Kamon, T; Nguyen, C N; Pivarski, J; Safonov, A; Sengupta, S; Toback, D; Weinberger, M; Akchurin, N; Berntzon, L; Gumus, K; Jeong, C; Kim, H; Lee, S W; Popescu, S; Roh, Y; Sill, A; Volobouev, I; Washington, E; Wigmans, R; Yazgan, E; Engh, D; Florez, C; Johns, W; Pathak, S; Sheldon, P; Andelin, D; Arenton, M W; Balazs, M; Boutle, S; Buehler, M; Conetti, S; Cox, B; Hirosky, R; Ledovskoy, A; Neu, C; Phillips II, D; Ronquest, M; Yohay, R; Gollapinni, S; Gunthoti, K; Harr, R; Karchin, P E; Mattson, M; Sakharov, A; Anderson, M; Bachtis, M; Bellinger, J N; Carlsmith, D; Crotty, I; Dasu, S; Dutta, S; Efron, J; Feyzi, F; Flood, K; Gray, L; Grogg, K S; Grothe, M; Hall-Wilton, R; Jaworski, M; Klabbers, P; Klukas, J; Lanaro, A; Lazaridis, C; Leonard, J; Loveless, R; Magrans de Abril, M; Mohapatra, A; Ott, G; Polese, G; Reeder, D; Savin, A; Smith, W H; Sourkov, A; Swanson, J; Weinberg, M; Wenman, D; Wensveen, M; White, A

    2010-01-01

    The CMS Collaboration conducted a month-long data taking exercise, the Cosmic Run At Four Tesla, during October-November 2008, with the goal of commissioning the experiment for extended operation. With all installed detector systems participating, CMS recorded 270 million cosmic ray events with the solenoid at a magnetic field strength of 3.8 T. This paper describes the data flow from the detector through the various online and offline computing systems, as well as the workflows used for recording the data, for aligning and calibrating the detector, and for analysis of the data.

  17. Reference and PDF-manager software: complexities, support and workflow.

    Science.gov (United States)

    Mead, Thomas L; Berryman, Donna R

    2010-10-01

    In the past, librarians taught reference management by training library users to use established software programs such as RefWorks or EndNote. In today's environment, there is a proliferation of Web-based programs that are being used by library clientele that offer a new twist on the well-known reference management programs. Basically, these new programs are PDF-manager software (e.g., Mendeley or Papers). Librarians are faced with new questions, issues, and concerns, given the new workflows and pathways that these PDF-manager programs present. This article takes a look at some of those. PMID:21058181

  18. PhyloGrid: a development for a workflow in Phylogeny

    CERN Document Server

    Montes, Esther; Mayo, Rafael

    2010-01-01

    In this work we present the development of a workflow based on Taverna which is to be implemented for calculations in phylogeny by means of the MrBayes tool. It has a friendly interface developed with the GridSphere framework. The user is able to define the parameters for the Bayesian calculation, determine the model of evolution, and check the accuracy of the results in the intermediate stages, as well as perform a multiple alignment of the sequences prior to the final result. To do this, no knowledge about the computational procedure is required on the user's side.

  19. Transaktionale Datei- und Dokumentenverwaltung in Workflow-Management-Systemen

    OpenAIRE

    Täuber, Wolfgang

    1996-01-01

    This work is being carried out within the Software-Labor of the University of Stuttgart. The Software-Labor is an institution whose goal is the development of marketable software through close cooperation between the University of Stuttgart and industry. One subproject of the Software-Labor has as its topic the further development of IBM's workflow management system 'FlowMark'. Among other things, transactional concepts are to be worked out for 'FlowMark'. The goal ...

  20. Dokumentenmanagement in einem WWW-basierten Workflow-System

    OpenAIRE

    Horster, Oliver

    1998-01-01

    Workflow management systems support the computer-assisted processing of business procedures. Based on a process description, a WFMS determines the next activity to be executed and presents it to the responsible workers, who then carry it out using application programs, for example by editing documents. To this end, documents are managed in so-called document management systems (DMS). Within the PoliFlow project, the WWW-based system SWATS is being developed, which a co...

  1. WooW-II: Workshop on open workflows

    Directory of Open Access Journals (Sweden)

    Daniel Arribas-Bel

    2015-07-01

    Full Text Available This resource describes WooW-II, a two-day workshop on open workflows for quantitative social scientists. The workshop is broken down into five main parts, each of which typically consists of an introductory tutorial and a hands-on assignment. The specific tools discussed in this workshop are Markdown, Pandoc, Git, GitHub, R, and RStudio, but the theoretical approach applies to a wider range of tools (e.g., LaTeX and Python). By the end of the workshop, participants should be able to reproduce a paper of their own and make it available in an open form, applying the concepts and tools introduced.

  2. Workflow for large-scale analysis of melanoma tissue samples

    Directory of Open Access Journals (Sweden)

    Maria E. Yakovleva

    2015-09-01

    Full Text Available The aim of the present study was to create an optimal workflow for analysing a large cohort of malignant melanoma tissue samples. Samples were lysed with urea and enzymatically digested with trypsin or trypsin/Lys-C. Buffer exchange or dilution was used to reduce the urea concentration prior to digestion. The tissue digests were analysed directly or following strong cation exchange (SCX) fractionation by nano LC–MS/MS. The approach which resulted in the largest number of protein IDs involved a buffer exchange step before enzymatic digestion with trypsin and chromatographic separation over a 120 min gradient, followed by SCX–RP separation of peptides.

  3. Understanding the Potential of Digital Intraoral and Benchtop Scanning Workflows.

    Science.gov (United States)

    Jansen, Curtis E

    2015-01-01

    Although the overwhelming majority of dental offices now use digital radiography and patient records, relatively few yet use either stand-alone intraoral scanning systems (6%) or complete systems that combine intraoral scanning with computer-aided design and computer-aided manufacturing (12%). This should change as dentists become more aware of the numerous advantages scanning systems offer in terms of patient care and communication of patient information, particularly with the dental laboratory. This article reviews the various types of scanner architecture as well as potential workflow models. PMID:26625165

  4. Integration of Motion Capture into 3D Animation Workflows

    OpenAIRE

    Unver, Ertu; Hughes, Daniel; Walker, Bernard; Blackburn, Ryan; Chien, Lin

    2011-01-01

    The research aims to test and evaluate Motion Capture (MoCap) technology on a live CG animation project and discover how it can actually contribute to the animation production workflow. MoCap is a technique for gathering data on the movements of the human body, with the intention of using this information to drive the movements of 3D models in computer-generated animation. MoCap offers significant advantages for producing natural and believable movement in 3D animation and opens up the pos...

  5. CMS Data Processing Workflows during an Extended Cosmic Ray Run

    Energy Technology Data Exchange (ETDEWEB)

    2009-11-01

    The CMS Collaboration conducted a month-long data taking exercise, the Cosmic Run At Four Tesla, during October-November 2008, with the goal of commissioning the experiment for extended operation. With all installed detector systems participating, CMS recorded 270 million cosmic ray events with the solenoid at a magnetic field strength of 3.8 T. This paper describes the data flow from the detector through the various online and offline computing systems, as well as the workflows used for recording the data, for aligning and calibrating the detector, and for analysis of the data.

  6. PhyloGrid: a development for a workflow in Phylogeny

    OpenAIRE

    Montes, Esther; Isea, Raul; Mayo, Rafael

    2010-01-01

    In this work we present the development of a workflow based on Taverna which is to be implemented for calculations in phylogeny by means of the MrBayes tool. It has a friendly interface developed with the GridSphere framework. The user is able to define the parameters for the Bayesian calculation, determine the model of evolution, and check the accuracy of the results in the intermediate stages, as well as perform a multiple alignment of the sequences prior to the final result. To do t...

  7. Big data analytics workflow management for eScience

    Science.gov (United States)

    Fiore, Sandro; D'Anca, Alessandro; Palazzo, Cosimo; Elia, Donatello; Mariello, Andrea; Nassisi, Paola; Aloisio, Giovanni

    2015-04-01

    In many domains such as climate and astrophysics, scientific data is often n-dimensional and requires tools that support specialized data types and primitives if it is to be properly stored, accessed, analysed and visualized. Currently, scientific data analytics relies on domain-specific software and libraries providing a huge set of operators and functionalities. However, most of these tools fail at large scale since they: (i) are desktop based, rely on local computing capabilities and need the data locally; (ii) cannot benefit from available multicore/parallel machines since they are based on sequential codes; (iii) do not provide declarative languages to express scientific data analysis tasks, and (iv) do not provide newer or more scalable storage models to better support the data multidimensionality. Additionally, most of them: (v) are domain-specific, which also means they support a limited set of data formats, and (vi) do not provide workflow support to enable the construction, execution and monitoring of more complex "experiments". The Ophidia project aims at facing most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides several parallel operators to manipulate large datasets. Some relevant examples include: (i) data sub-setting (slicing and dicing), (ii) data aggregation, (iii) array-based primitives (the same operator applies to all the implemented UDF extensions), (iv) data cube duplication, (v) data cube pivoting, (vi) NetCDF import and export. Metadata operators are available too. Additionally, the Ophidia framework provides array-based primitives to perform data sub-setting, data aggregation (i.e. max, min, avg), array concatenation, algebraic expressions and predicate evaluation on large arrays of scientific data. Bit-oriented plugins have also been implemented to manage binary data cubes. Defining processing chains and workflows with tens, hundreds of data analytics operators is the
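
    As a rough illustration of the array-based primitives listed above (sub-setting, aggregation, concatenation), here is what they look like on a plain NumPy cube; this is generic array code for orientation only, not the Ophidia API.

```python
import numpy as np

# A toy 3-D data cube: time x lat x lon (e.g., daily temperature fields).
cube = np.random.rand(365, 180, 360)

subset = cube[0:90, 45:90, 0:180]               # sub-setting: slice a period/region
monthly_max = cube[:30].max(axis=0)             # aggregation: max over the first 30 days
zonal_mean = cube.mean(axis=2)                  # reduction along the longitude axis
stacked = np.concatenate([cube, cube], axis=0)  # array concatenation along time
```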

  8. The robust schedule - a link to improved workflow

    DEFF Research Database (Denmark)

    Lindhard, Søren Munch; Wandahl, Søren

    2012-01-01

    In today’s construction, there is a paramount focus on time, and on the scheduling and control of time. Everything is organized with respect to time. The construction project has to be completed within a fixed and often tight deadline. Otherwise a daily penalty often has to be paid. This pins down...... result is a chaotic, complex and uncontrolled construction site. Furthermore, strict time limits force the workflow to be optimized under sub-optimal conditions. Even though productivity overall seems to be increasing, productivity per man-hour is decreasing, resulting in increased cost. To increase...

  9. The robust schedule - A link to improved workflow

    DEFF Research Database (Denmark)

    Lindhard, Søren; Wandahl, Søren

    2012-01-01

    In today’s construction, there is a paramount focus on time, and on the scheduling and control of time. Everything is organized with respect to time. The construction project has to be completed within a fixed and often tight deadline. Otherwise a daily penalty often has to be paid. This pins down...... The result is a chaotic, complex and uncontrolled construction site. Furthermore, strict time limits force the workflow to be optimized under non-optimal conditions. Even though productivity seems to be increasing, productivity per man-hour is decreasing, resulting in increased cost. To increase...

  10. Workflow for High Throughput Screening of Gas Sensing Materials

    Directory of Open Access Journals (Sweden)

    Ulrich Simon

    2006-04-01

    Full Text Available The workflow of a high throughput screening setup for the rapid identification of new and improved sensor materials is presented. The polyol method was applied to prepare nanoparticulate metal oxides as base materials, which were functionalised by surface doping. Using multi-electrode substrates and high throughput impedance spectroscopy (HT-IS), a wide range of materials could be screened in a short time. Applying HT-IS in search of new selective gas sensing materials, a NO2-tolerant NO sensing material with reduced sensitivities towards other test gases was identified, based on iridium-doped zinc oxide. Analogous behaviour was observed for iridium-doped indium oxide.

  11. Integrating workflow and project management systems for PLM applications

    Directory of Open Access Journals (Sweden)

    Fabio Fonseca Pereira de Paula

    2008-07-01

    Full Text Available The adoption of the Product Life-cycle Management (PLM) systems concept is fundamental to improving product development, especially for small and medium enterprises (SMEs). One of the challenges is the integration between project management and product data management functions. The paper presents an analysis of the potential integration strategies for a specific product data management system (SMARTEAM) and a project management system (Microsoft Project), which are commonly used by SMEs. Finally the article presents some considerations about the study of project management solutions in SMB companies, considering the PLM approach. Key-words: integration, project management (PM), workflow, PDM, PLM.

  12. Time-efficient CT colonography interpretation using an advanced image-gallery-based, computer-aided "first-reader" workflow for the detection of colorectal adenomas

    International Nuclear Information System (INIS)

    To assess the performance of an advanced "first-reader" workflow for computer-aided detection (CAD) of colorectal adenomas ≥ 6 mm at computed tomographic colonography (CTC) in a low-prevalence cohort. A total of 616 colonoscopy-validated CTC patient datasets were retrospectively reviewed by a radiologist using a "first-reader" CAD workflow. CAD detections were presented as galleries of six automatically generated two-dimensional (2D) and three-dimensional (3D) images together with interactive 3D target views and 2D multiplanar views of the complete dataset. Each patient dataset was interpreted by initially using CAD image galleries followed by a fast 2D review to address unprompted colonic areas. Per-patient, per-polyp, and per-adenoma sensitivities were calculated for lesions ≥ 6 mm. Statistical testing employed Fisher's exact and McNemar tests. In 91/616 patients, 131 polyps (92 adenomas, 39 non-adenomas) ≥ 6 mm and two cancers were identified by reference standard. Using the CAD gallery-based first-reader workflow, the radiologist detected all adenomas ≥ 10 mm (34/34) and both cancers. Per-patient and per-polyp sensitivities for lesions ≥ 6 mm were 84.3 % (75/89) and 83.2 % (109/131), respectively, with 89.1 % (57/64) and 85.9 % (79/92) for adenomas. Overall specificity was 95.6 % (504/527). Mean interpretation time was 3.1 min per patient. A CAD algorithm, applied in an image-gallery-based first-reader workflow, can substantially decrease reading times while enabling accurate detection of colorectal adenomas in a low-prevalence population. (orig.)
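
    The quoted figures are simple proportions, so they can be re-derived directly from the reported counts; the following minimal Python check only reproduces the abstract's arithmetic:

      def rate(hits, total):
          return 100.0 * hits / total

      print(round(rate(75, 89), 1))    # 84.3  per-patient sensitivity, lesions >= 6 mm
      print(round(rate(109, 131), 1))  # 83.2  per-polyp sensitivity, lesions >= 6 mm
      print(round(rate(57, 64), 1))    # 89.1  per-patient sensitivity, adenomas
      print(round(rate(79, 92), 1))    # 85.9  per-adenoma sensitivity
      print(round(rate(504, 527), 1))  # 95.6  specificity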

  13. Design and implementation of a secure workflow system based on PKI/PMI

    Science.gov (United States)

    Yan, Kai; Jiang, Chao-hui

    2013-03-01

    Traditional workflow systems have several weaknesses in privilege management: low efficiency, overburdened administrators, lack of a trust authority, etc. After an in-depth study of the security requirements of workflow systems, a secure workflow model based on PKI/PMI is proposed. This model can achieve static and dynamic authorization by verifying a user's identity through a public key certificate (PKC) and validating the user's privilege information through an attribute certificate (AC) in the workflow system. Practice shows that this system can meet the security requirements of a WfMS. Moreover, it not only improves system security but also ensures integrity, confidentiality, availability and non-repudiation of the data in the system.

  14. Enhancing and Customizing Laboratory Information Systems to Improve/Enhance Pathologist Workflow.

    Science.gov (United States)

    Hartman, Douglas J

    2016-03-01

    Optimizing pathologist workflow can be difficult because it is affected by many variables. Surgical pathologists must complete many tasks that culminate in a final pathology report. Several software systems can be used to improve pathologist workflow. These include voice recognition software, pre-sign-out quality assurance, image utilization, and computerized provider order entry. Recent changes in diagnostic coding and the more prominent role of centralized electronic health records represent further opportunities to improve the workflow of surgical pathologists. Additional unforeseen changes to pathologist workflow may accompany the introduction of whole-slide imaging technology into routine diagnostic work. PMID:26851662

  15. A Semi-Automated Workflow Solution for Data Set Publication

    Directory of Open Access Journals (Sweden)

    Suresh Vannan

    2016-03-01

    Full Text Available To address the need for published data, considerable effort has gone into formalizing the process of data publication. From funding agencies to publishers, data publication has rapidly become a requirement. Digital Object Identifiers (DOIs) and data citations have enhanced the integration and availability of data. The challenge facing data publishers now is to deal with the increased number of publishable data products and, most importantly, the difficulties of publishing diverse data products into an online archive. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC), a NASA-funded data center, faces these challenges as it deals with data products created by individual investigators. This paper summarizes the challenges of curating data and provides a summary of a workflow solution that ORNL DAAC research and technical staff have created to deal with publication of the diverse data products. The workflow solution presented here is generic and can be applied to data from any scientific domain and data located at any data center.

  16. Accelerated partial breast irradiation utilizing brachytherapy: patient selection and workflow

    Science.gov (United States)

    Wobb, Jessica; Manyam, Bindu; Khan, Atif; Vicini, Frank

    2016-01-01

    Accelerated partial breast irradiation (APBI) represents an evolving technique that is a standard-of-care option in appropriately selected women following breast conserving surgery. While multiple techniques now exist to deliver APBI, interstitial brachytherapy is the technique used in several randomized trials (National Institute of Oncology, GEC-ESTRO). More recently, many centers have adopted applicator-based brachytherapy to deliver APBI due to the technical complexities of interstitial brachytherapy. The purpose of this article is to review methods to evaluate and select patients for APBI, as well as to define potential workflow mechanisms that allow for the safe and effective delivery of APBI. Multiple consensus statements have been developed to guide clinicians in determining appropriate candidates for APBI. However, recent studies have demonstrated that these guidelines fail to stratify patients according to the risk of local recurrence, and updated guidelines are expected in the years to come. Critical elements of workflow to ensure safe and effective delivery of APBI include a multidisciplinary approach and evaluation, optimization of target coverage and adherence to normal tissue guideline constraints, and proper quality assurance methods. PMID:26985202

  17. A High Throughput Workflow Environment for Cosmological Simulations

    CERN Document Server

    Erickson, Brandon M S; Evrard, August E; Becker, Matthew R; Busha, Michael T; Kravtsov, Andrey V; Marru, Suresh; Pierce, Marlon; Wechsler, Risa H

    2012-01-01

    The next generation of wide-area sky surveys offer the power to place extremely precise constraints on cosmological parameters and to test the source of cosmic acceleration. These observational programs will employ multiple techniques based on a variety of statistical signatures of galaxies and large-scale structure. These techniques have sources of systematic error that need to be understood at the percent-level in order to fully leverage the power of next-generation catalogs. Simulations of large-scale structure provide the means to characterize these uncertainties. We are using XSEDE resources to produce multiple synthetic sky surveys of galaxies and large-scale structure in support of science analysis for the Dark Energy Survey. In order to scale up our production to the level of fifty 10^10-particle simulations, we are working to embed production control within the Apache Airavata workflow environment. We explain our methods and report how the workflow has reduced production time by 40% compared to manua...

  18. Automation and workflow considerations for embedding Digimarc Barcodes at scale

    Science.gov (United States)

    Rodriguez, Tony; Haaga, Don; Calhoon, Sean

    2015-03-01

    The Digimarc® Barcode is a digital watermark applied to packages and variable data labels that carries GS1 standard GTIN-14 data traditionally carried by a 1-D barcode. The Digimarc Barcode can be read with smartphones and imaging-based barcode readers commonly used in grocery and retail environments. Using smartphones, consumers can engage with products, and retailers can materially increase the speed of check-out, increasing store margins and providing a better experience for shoppers. Internal testing has shown an average 53% increase in scanning throughput, enabling hundreds of millions of dollars in cost savings [1] for retailers when deployed at scale. To get to scale, the process of embedding a digital watermark must be automated and integrated within existing workflows. Creating the tools and processes to do so represents a new challenge for the watermarking community. This paper presents a description and an analysis of the workflow implemented by Digimarc to deploy the Digimarc Barcode at scale. An overview of the tools created and lessons learned during the introduction of the technology to the market are provided.

  19. Designing Collaborative Healthcare Technology for the Acute Care Workflow

    Directory of Open Access Journals (Sweden)

    Michael Gonzales

    2015-10-01

    Full Text Available Preventable medical errors in hospitals are the third leading cause of death in the United States. Many of these are caused by poor situational awareness, especially in acute care resuscitation scenarios. While a number of checklists and technological interventions have been developed to reduce cognitive load and improve situational awareness, these tools often do not fit the clinical workflow. To better understand the challenges faced by clinicians in acute care codes, we conducted a qualitative study with interprofessional clinicians at three regional hospitals. Our key findings are: current documentation processes are inadequate (with information sometimes recorded on paper towels); reference guides can serve as fixation points, reducing rather than enhancing situational awareness; the physical environment imposes significant constraints on workflow; homegrown solutions are often used to work around unstandardized processes; and simulation scenarios do not match real-world practice. We present a number of considerations for collaborative healthcare technology design and discuss the implications of our findings on current work for the development of more effective interventions for acute care resuscitation scenarios.

  20. Autonomic Management of Application Workflows on Hybrid Computing Infrastructure

    Directory of Open Access Journals (Sweden)

    Hyunjoo Kim

    2011-01-01

    Full Text Available In this paper, we present a programming and runtime framework that enables the autonomic management of complex application workflows on hybrid computing infrastructures. The framework is designed to address system and application heterogeneity and dynamics to ensure that application objectives and constraints are satisfied. The need for such autonomic system and application management is becoming critical as computing infrastructures become increasingly heterogeneous, integrating different classes of resources from high-end HPC systems to commodity clusters and clouds. For example, the framework presented in this paper can be used to provision the appropriate mix of resources based on application requirements and constraints. The framework also monitors the system/application state and adapts the application and/or resources to respond to changing requirements or environment. To demonstrate the operation of the framework and to evaluate its capabilities, we employ a workflow used to characterize an oil reservoir executing on a hybrid infrastructure composed of TeraGrid nodes and Amazon EC2 instances of various types. Specifically, we show how different application objectives such as acceleration, conservation and resilience can be effectively achieved while satisfying deadline and budget constraints, using an appropriate mix of dynamically provisioned resources. Our evaluations also demonstrate that public clouds can be used to complement and reinforce the scheduling and usage of traditional high performance computing infrastructure.

  1. Scientific Workflows and the Sensor Web for Virtual Environmental Observatories

    Science.gov (United States)

    Simonis, I.; Vahed, A.

    2008-12-01

    interfaces. All data sets and sensor communication follow well-defined abstract models and corresponding encodings, mostly developed by the OGC Sensor Web Enablement initiative. Scientific progress is currently accelerated by an emerging new concept called scientific workflows, which organize and manage complex distributed computations. A scientific workflow represents and records the highly complex processes that a domain scientist typically would follow in exploration, discovery and, ultimately, transformation of raw data to publishable results. The challenge is now to integrate the benefits of scientific workflows with those provided by the Sensor Web in order to leverage all resources for scientific exploration, problem solving, and knowledge generation. Scientific workflows for the Sensor Web represent the next evolutionary step towards efficient, powerful, and flexible earth observation frameworks and platforms. Those platforms support the entire process from capturing data, sharing and integrating it, to requesting additional observations. Multiple sites and organizations will participate on single platforms, and scientists from different countries and organizations will interact and contribute to large-scale research projects. Simultaneously, the data and information overload becomes manageable, as multiple layers of abstraction free scientists from dealing with underlying data, processing, or storage peculiarities. The vision is automated investigation and discovery mechanisms that allow scientists to pose queries to the system, which in turn identifies potentially related resources, schedules processing tasks, and assembles all parts into workflows that may satisfy the query.

  2. Automatic extraction of highway light poles and towers from mobile LiDAR data

    Science.gov (United States)

    Yan, Wai Yeung; Morsy, Salem; Shaker, Ahmed; Tulloch, Mark

    2016-03-01

    Mobile LiDAR has recently been demonstrated as a viable technique for pole-like object detection and classification. Although a desirable accuracy (around 80%) has been reported in existing studies, the majority of them were conducted at street level with relatively flat ground, and very few addressed how to extract the entire pole structure from the ground or curb surface. This paper therefore attempts to fill the research gap by presenting a workflow for automatic extraction of light poles and towers from mobile LiDAR point clouds, with a particular focus on municipal highways. The data processing workflow includes (1) an automatic ground filtering mechanism to separate aboveground and ground features, (2) an unsupervised clustering algorithm to cluster the aboveground point cloud, (3) a set of decision rules to identify and classify potential light poles and towers, and (4) a least-squares circle fitting algorithm to fit the circular pole structure so as to remove the ground points. The workflow was tested on a set of mobile LiDAR data collected for a section of Highway 401 located in Toronto, Ontario, Canada. The results showed that the proposed method achieves a detection rate of over 91% for five types of light poles and towers along the study area.
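
    Step (4) is a standard least-squares problem. A minimal sketch of an algebraic (Kasa) circle fit with NumPy, on synthetic data rather than the authors' point clouds:

      import numpy as np

      def fit_circle(x, y):
          # Solve a*x + b*y + c = -(x^2 + y^2) in the least-squares sense,
          # then recover centre (cx, cy) and radius r.
          A = np.column_stack([x, y, np.ones_like(x)])
          rhs = -(x**2 + y**2)
          (a, b, c), *_ = np.linalg.lstsq(A, rhs, rcond=None)
          cx, cy = -a / 2.0, -b / 2.0
          return cx, cy, np.sqrt(cx**2 + cy**2 - c)

      # Noisy cross-section of a pole shaft with a 10 cm radius.
      t = np.linspace(0, 2 * np.pi, 50)
      x = 1.0 + 0.10 * np.cos(t) + np.random.normal(0, 0.005, 50)
      y = 2.0 + 0.10 * np.sin(t) + np.random.normal(0, 0.005, 50)
      print(fit_circle(x, y))   # approximately (1.0, 2.0, 0.10)

    Points falling outside the fitted circle (plus a tolerance) can then be discarded as ground returns, which is the role the abstract assigns to this step.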

  3. Automatic Program Development

    DEFF Research Database (Denmark)

    Automatic Program Development is a tribute to Robert Paige (1947-1999), our accomplished and respected colleague, and moreover our good friend, whose untimely passing was a loss to our academic and research community. We have collected the revised, updated versions of the papers published in his honor in the Higher-Order and Symbolic Computation Journal in the years 2003 and 2005. Among them there are two papers by Bob: (i) a retrospective view of his research lines, and (ii) a proposal for future studies in the area of automatic program derivation. The book also includes some papers by members of the IFIP Working Group 2.1 of which Bob was an active member. All papers are related to some of the research interests of Bob and, in particular, to the transformational development of programs and their algorithmic derivation from formal specifications. Automatic Program Development offers a...

  4. Integrated Enterprise Modeling Method Based on Workflow Model and Multiviews

    Institute of Scientific and Technical Information of China (English)

    林慧苹; 范玉顺; 吴澄

    2001-01-01

    Many enterprise modeling methods are proposed to model the business process of enterprises and to implement CIM systems. But difficulties are still encountered when these methods are applied to the CIM system design and implementation. This paper proposes a new integrated enterprise modeling methodology based on the workflow model. The system architecture and the integrated modeling environment are described with a new simulation strategy. The modeling process and the relationship between the workflow model and the views are discussed.

  5. Automatic tracking of neuro vascular tree paths

    Science.gov (United States)

    Suryanarayanan, S.; Gopinath, A.; Mallya, Y.; Shriram, K. S.; Joshi, M.

    2006-03-01

    3-D analysis of blood vessels from volumetric CT and MR datasets has many applications, ranging from examination of pathologies such as aneurysm and calcification to measurement of cross-sections for therapy planning. Segmentation of the vascular structures followed by tracking is an important processing step towards automating the 3-D vessel analysis workflow. This paper demonstrates a fast and automated algorithm for tracking the major arterial structures that have been previously segmented. Our algorithm uses anatomical knowledge to identify the start and end points in the vessel structure, which allows automation. A voxel coding scheme is used to code every voxel in the vessel based on its geodesic distance from the start point. A shortest-path-based iterative region growing is used to extract the vessel tracks, which are subsequently smoothed using an active contour method. The algorithm also has the ability to automatically detect bifurcation points of major arteries. Results are shown for tracking the major arteries such as the common carotid, internal carotid, vertebrals, and arteries coming off the Circle of Willis across multiple cases, with various data-related and pathological challenges, from 7 CTA cases and 2 MR Time of Flight (TOF) cases.
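
    The voxel coding step is essentially a breadth-first search over the segmented mask. A minimal sketch, assuming a 3-D boolean array and 6-connectivity (the paper does not state its exact connectivity):

      from collections import deque
      import numpy as np

      def voxel_code(mask, start):
          # Geodesic (BFS) distance from the start voxel, restricted to the mask.
          dist = np.full(mask.shape, -1, dtype=np.int32)
          dist[start] = 0
          queue = deque([start])
          while queue:
              z, y, x = queue.popleft()
              for dz, dy, dx in ((1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)):
                  n = (z + dz, y + dy, x + dx)
                  if all(0 <= n[i] < mask.shape[i] for i in range(3)) and mask[n] and dist[n] < 0:
                      dist[n] = dist[z, y, x] + 1
                      queue.append(n)
          return dist

      mask = np.zeros((1, 1, 3), dtype=bool)   # toy 3-voxel "vessel"
      mask[0, 0, :] = True
      print(voxel_code(mask, (0, 0, 0))[0, 0])   # [0 1 2]

    Walking from an end point back along strictly decreasing codes then yields the shortest path that the iterative region growing extracts and the active contour smooths.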

  6. Project Report: Automatic Sequence Processor Software Analysis

    Science.gov (United States)

    Benjamin, Brandon

    2011-01-01

    The Mission Planning and Sequencing (MPS) element of Multi-Mission Ground System and Services (MGSS) provides space missions with multi-purpose software to plan spacecraft activities, sequence spacecraft commands, and then integrate these products and execute them on spacecraft. Jet Propulsion Laboratory (JPL) is currently flying many missions. The processes for building, integrating, and testing the multi-mission uplink software need to be improved to meet the needs of the missions and the operations teams that command the spacecraft. The Multi-Mission Sequencing Team is responsible for collecting and processing the observations, experiments and engineering activities that are to be performed on a selected spacecraft. The collection of these activities is called a sequence, and ultimately a sequence becomes a sequence of spacecraft commands. The operations teams check the sequence to make sure that no constraints are violated. The workflow process involves sending a program start command, which activates the Automatic Sequence Processor (ASP). The ASP is currently a file-based system composed of scripts written in Perl, C shell and awk. Once this start process is complete, the system checks for errors and aborts if there are any; otherwise the system converts the commands to binary and then sends the resultant information to be radiated to the spacecraft.

  7. CrossFlow: Cross-Organizational Workflow Management in Dynamic Virtual Enterprises

    NARCIS (Netherlands)

    Grefen, Paul; Aberer, Karl; Hoffner, Yigal; Ludwig, Heiko

    2000-01-01

    In this report, we present the approach to cross-organizational workflow management of the CrossFlow project. CrossFlow is a European research project aiming at the support of cross-organizational workflows in dynamic virtual enterprises. The cooperation in these virtual enterprises is based on dyna

  8. Performance engineering method for workflow systems : an integrated view of human and computerised work processes

    OpenAIRE

    Brataas, Gunnar

    1996-01-01

    A method for designing workflow systems which satisfy performance requirements is proposed in this thesis. Integration of human and computerised performance is particularly useful for workflow systems where human and computerised processes are intertwined. The proposed framework encompasses human and computerised resources. Even though systematic performance engineering is not common practice in information system development, current best practice shows that performance engineering of softw...

  9. Fault tolerant workflow scheduling based on replication and resubmission of tasks in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Jayadivya S K

    2012-06-01

    Full Text Available The aim of a workflow scheduling system is to schedule workflows within the user-given deadline to achieve a good success rate. A workflow is a set of tasks processed in a predefined order based on data and control dependencies. Scheduling these workflows in a computing environment, such as a cloud environment, is an NP-complete problem, and it becomes more challenging when failures of tasks are considered. To overcome these failures, the workflow scheduling system should be fault tolerant. In this paper, the proposed Fault Tolerant Workflow Scheduling algorithm (FTWS) provides fault tolerance by using replication and resubmission of tasks based on the priority of the tasks. The replication of tasks depends on a heuristic metric which is calculated by finding the tradeoff between the replication factor and the resubmission factor. The heuristic metric is considered because replication alone may lead to resource wastage and resubmission alone may increase makespan. Tasks are prioritized based on their criticality, which is calculated by using parameters like out-degree, earliest deadline and resubmission impact. Priority helps in meeting the deadline of a task and thereby reducing wastage of resources. FTWS schedules workflows within a deadline even in the presence of failures without using any historical information. The experiments were conducted in a simulated cloud environment by scheduling workflows in the presence of failures which were generated randomly. The experimental results of the proposed work demonstrate an effective success rate in spite of various failures.
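
    A hedged sketch of the idea: score each task's criticality, replicate the critical ones up front, and fall back to resubmission for the rest. The weighting and threshold below are invented placeholders, not the paper's calibrated metric:

      def criticality(out_degree, deadline_slack, resubmission_impact,
                      w=(0.5, 0.3, 0.2)):
          # More successors, less slack and a higher resubmission impact
          # all make a task more critical (illustrative weighting only).
          return (w[0] * out_degree
                  + w[1] / (1.0 + deadline_slack)
                  + w[2] * resubmission_impact)

      def strategy(score, threshold=1.0):
          # Replication spends resources to protect the makespan;
          # resubmission saves resources but risks the deadline.
          return "replicate" if score >= threshold else "resubmit"

      print(strategy(criticality(4, 0.5, 0.8)))   # replicate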

  10. Soundness of Timed-Arc Workflow Nets in Discrete and Continuous-Time Semantics

    DEFF Research Database (Denmark)

    Mateo, Jose Antonio; Srba, Jiri; Sørensen, Mathias Grund

    2015-01-01

    Analysis of workflow processes with quantitative aspects like timing is of interest in numerous time-critical applications. We suggest a workflow model based on timed-arc Petri nets and study the foundational problems of soundness and strong (time-bounded) soundness. We first consider the discrete-t...

  11. A Multi-Fidelity Workflow to Derive Physics-Based Conceptual Design Methods

    OpenAIRE

    Böhnke, Daniel

    2015-01-01

    The present study developed a multi-fidelity workflow to derive physics-based conceptual design methods from higher-fidelity models usually employed during preliminary aircraft design. The multi-fidelity workflow consists of a design of experiments, a multi-fidelity loop, and symbolic regression as the surrogate modeling technique. Results are presented for conventional and unconventional aircraft configurations.

  12. Scheduling Multilevel Deadline-Constrained Scientific Workflows on Clouds Based on Cost Optimization

    Directory of Open Access Journals (Sweden)

    Maciej Malawski

    2015-01-01

    Full Text Available This paper presents a cost optimization model for scheduling scientific workflows on IaaS clouds such as Amazon EC2 or RackSpace. We assume multiple IaaS clouds with heterogeneous virtual machine instances, with a limited number of instances per cloud and hourly billing. Input and output data are stored on a cloud object store such as Amazon S3. Applications are scientific workflows modeled as DAGs, as in the Pegasus Workflow Management System. We assume that tasks in the workflows are grouped into levels of identical tasks. Our model is specified using mathematical programming languages (AMPL and CMPL) and allows us to minimize the cost of workflow execution under deadline constraints. We present results obtained using our model and benchmark workflows representing real scientific applications in a variety of domains. The data used for evaluation come from synthetic workflows and from general-purpose cloud benchmarks, as well as from data measured in our own experiments with Montage, an astronomical application, executed on the Amazon EC2 cloud. We indicate how this model can be used for scenarios that require resource planning for scientific workflows and their ensembles.
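
    The AMPL/CMPL models themselves are not reproduced in the abstract, but the flavour of the optimization can be sketched as a small linear program: buy instance-hours per VM type so that all tasks finish within the deadline at minimum cost. Prices, throughputs and limits below are invented, and the LP relaxes the integer hourly billing of the real model:

      from scipy.optimize import linprog

      # Hypothetical types: (name, price $/h, throughput tasks/h, max instances)
      types = [("small", 0.10, 20, 10), ("large", 0.40, 100, 5)]
      tasks, deadline_h = 4000, 10.0

      c = [price for _, price, _, _ in types]          # minimise total cost
      A_ub = [[-thr for _, _, thr, _ in types]]        # complete all tasks
      b_ub = [-tasks]
      bounds = [(0, limit * deadline_h) for _, _, _, limit in types]

      res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
      for (name, *_), hours in zip(types, res.x):
          print(f"{name}: {hours:.1f} instance-hours")
      print(f"minimum cost: ${res.fun:.2f}")           # the cheaper-per-task type wins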

  13. Automatic utilities auditing

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Colin Boughton [Energy Metering Technology (United Kingdom)

    2000-08-01

    At present, energy audits represent only snapshot situations of the flow of energy. The normal pattern of energy audits as seen through the eyes of an experienced energy auditor is described. A brief history of energy auditing is given. It is claimed that the future of energy auditing lies in automatic meter reading with expert data analysis providing continuous automatic auditing, thereby reducing the skill element. Ultimately, it will be feasible to carry out auditing at intervals of, say, 30 minutes rather than five years.

  14. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations — i.e. automatically controlling the virtual camera. We approach this problem by modelling it as a dynamic multi-objective optimisation problem and show how this metaphor allows a much richer expressiveness than a classical single-objective approach. Finally, we showcase the application of a multi-objective evolutionary algorithm to generate a shot...

  15. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel

    2014-01-01

    This new textbook examines the motivations and the different algorithms for automatic document summarization (ADS). We survey the recent state of the art. The book shows the main problems of ADS, the difficulties involved, and the solutions provided by the community. It presents recent advances in ADS, as well as current applications and trends. The approaches are statistical, linguistic and symbolic. Several examples are included in order to clarify the theoretical concepts. The books currently available in the area of Automatic Document Summarization are not recent. Powerful algorithms have been develop

  16. Automatic atlas based electron density and structure contouring for MRI-based prostate radiation therapy on the cloud

    International Nuclear Information System (INIS)

    Our group have been developing methods for MRI-alone prostate cancer radiation therapy treatment planning. To assist with clinical validation of the workflow we are investigating a cloud platform solution for research purposes. Benefits of cloud computing can include increased scalability, performance and extensibility while reducing total cost of ownership. In this paper we demonstrate the generation of DICOM-RT directories containing an automatic average atlas based electron density image and fast pelvic organ contouring from whole pelvis MR scans.

  17. Automatic Atlas Based Electron Density and Structure Contouring for MRI-based Prostate Radiation Therapy on the Cloud

    Science.gov (United States)

    Dowling, J. A.; Burdett, N.; Greer, P. B.; Sun, J.; Parker, J.; Pichler, P.; Stanwell, P.; Chandra, S.; Rivest-Hénault, D.; Ghose, S.; Salvado, O.; Fripp, J.

    2014-03-01

    Our group have been developing methods for MRI-alone prostate cancer radiation therapy treatment planning. To assist with clinical validation of the workflow we are investigating a cloud platform solution for research purposes. Benefits of cloud computing can include increased scalability, performance and extensibility while reducing total cost of ownership. In this paper we demonstrate the generation of DICOM-RT directories containing an automatic average atlas based electron density image and fast pelvic organ contouring from whole pelvis MR scans.

  18. Automated evolutionary restructuring of workflows to minimise errors via stochastic model checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents a framework for the automated restructuring of workflows that allows one to minimise the impact of errors on a production workflow. The framework allows for the modelling of workflows by means of a formalised subset of the Business Process Modelling and Notation (BPMN) language, a well-established visual language for modelling workflows in a business context. The framework’s modelling language is extended to include the tracking of real-valued quantities associated with the process (such as time, cost, temperature). In addition, this language also allows for an intention...... by means of a case study from the food industry. Through this case study we explore the extent to which the risk of production faults can be reduced and the impact of these can be minimised, primarily through restructuring of the production workflows. This approach is fully automated and only the...

  19. Domain-Specific Languages For Developing and Deploying Signature Discovery Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Jacob, Ferosh; Wynne, Adam S.; Liu, Yan; Gray, Jeff

    2013-12-02

    Domain-agnostic Signature Discovery entails scientific investigation across multiple domains through the re-use of existing algorithms in workflows. The existing algorithms may be written in any programming language for various hardware architectures (e.g., desktops, commodity clusters, and specialized parallel hardware platforms). This raises an engineering issue in generating Web services for heterogeneous algorithms so that they can be composed into a scientific workflow environment (e.g., Taverna). In this paper, we present our software tool that defines two simple Domain-Specific Languages (DSLs) to automate these processes: SDL and WDL. Our Service Description Language (SDL) describes key elements of a signature discovery algorithm and generates the service code. The Workflow Description Language (WDL) describes the pipeline of services and generates deployable artifacts for the Taverna workflow management system. We demonstrate our tool with a landscape classification example that is represented by BLAST workflows composed of services that wrap original scripts.
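
    As a toy illustration of the approach (the field names and grammar below are invented, not the paper's actual SDL), a service description can be reduced to a small mapping from which wrapper code is generated:

      # Invented, SDL-like description of one signature-discovery algorithm.
      service = {
          "name": "blast_align",
          "executable": "blast.sh",
          "inputs": ["sequence_file"],
          "outputs": ["alignment_file"],
      }

      def generate_stub(desc):
          # Emit a minimal Python wrapper exposing the executable as a callable.
          args = ", ".join(desc["inputs"])
          return (f"def {desc['name']}({args}):\n"
                  f"    import subprocess\n"
                  f"    subprocess.run(['{desc['executable']}', {args}], check=True)\n"
                  f"    return '{desc['outputs'][0]}'\n")

      print(generate_stub(service))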

  20. The LabFlow system for workflow management in large scale biology research laboratories.

    Science.gov (United States)

    Goodman, N; Rozen, S; Stein, L D

    1998-01-01

    LabFlow is a workflow management system designed for large-scale biology research laboratories. It provides a workflow model in which objects flow from task to task under programmatic control. The model supports parallelism, meaning that an object can flow down several paths simultaneously, and sub-workflows, which can be invoked subroutine-style from a task. The system allocates tasks to Unix processes to achieve the requisite levels of multiprocessing. The system uses the LabBase data management system to store workflow state and laboratory results. LabFlow provides a Perl5 object-oriented framework for defining workflows, and an engine for executing these. The software is freely available. PMID:9783211
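
    A minimal sketch of that object-flow model (LabFlow itself is a Perl5 framework; Python and all names here are purely illustrative): objects flow from task to task, and a task with several successors sends the object down each path:

      class Task:
          def __init__(self, name, action, successors=None):
              self.name, self.action = name, action
              self.successors = successors or []

          def run(self, obj):
              obj = self.action(obj)
              for nxt in self.successors:   # fan-out: obj flows down each path
                  nxt.run(obj)

      report  = Task("report",  lambda o: print("stored:", o) or o)
      qc      = Task("qc",      lambda o: {**o, "qc": "pass"}, [report])
      archive = Task("archive", lambda o: {**o, "archived": True}, [report])
      prep    = Task("prep",    lambda o: {**o, "prepped": True}, [qc, archive])

      prep.run({"sample": "S-001"})   # the object reaches "report" via both paths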

  1. Optimizing perioperative decision making: improved information for clinical workflow planning.

    Science.gov (United States)

    Doebbeling, Bradley N; Burton, Matthew M; Wiebke, Eric A; Miller, Spencer; Baxter, Laurence; Miller, Donald; Alvarez, Jorge; Pekny, Joseph

    2012-01-01

    Perioperative care is complex and involves multiple interconnected subsystems. Delayed starts, prolonged cases and overtime are common. Surgical procedures account for 40-70% of hospital revenues and 30-40% of total costs. Most planning and scheduling in healthcare is done without modern planning tools, which could improve access by supporting operations planning. We identified key planning scenarios of interest to perioperative leaders in order to examine the feasibility of applying combinatorial optimization software to solve some of those planning issues in the operative setting. Perioperative leaders desire a broad range of tools for planning and assessing alternate solutions. Our models generated feasible solutions that varied as expected based on resource and policy assumptions, and found better utilization of scarce resources. Combinatorial optimization modeling can effectively evaluate alternatives to support key decisions for planning clinical workflow and improving care efficiency and satisfaction. PMID:23304284

  2. Approach for workflow modeling using π-calculus

    Institute of Scientific and Technical Information of China (English)

    杨东; 张申生

    2003-01-01

    As a variant of process algebra, π-calculus can describe the interactions between evolving processes. By modeling an activity as a process interacting with other processes through ports, this paper presents a new approach: representing workflow models using π-calculus. As a result, the model can characterize the dynamic behaviors of the workflow process in terms of the LTS (Labeled Transition Semantics) semantics of π-calculus. The main advantage of the workflow model's formal semantics is that it allows for verification of the model's properties, such as deadlock-freedom and normal termination. Moreover, the equivalence of workflow models can be checked through the weak bisimulation theorem in the π-calculus, thus facilitating the optimization of business processes.
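
    The LTS semantics makes such verification concrete: once the workflow's behaviour is a labelled transition system, normal termination is a reachability question. A toy sketch with an invented order-handling workflow (not from the paper):

      # state -> list of (label, next_state)
      lts = {
          "start": [("receive_order", "check")],
          "check": [("approve", "ship"), ("reject", "done")],
          "ship":  [("deliver", "done")],
          "done":  [],
      }

      def reaches(state, target, seen=None):
          seen = set() if seen is None else seen
          if state == target:
              return True
          seen.add(state)
          return any(reaches(nxt, target, seen)
                     for _, nxt in lts[state] if nxt not in seen)

      # Normal termination: every reachable state can still reach "done".
      print(all(reaches(s, "done") for s in lts))   # True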

  3. The Distributed Workflow Management System--FlowAgent

    Institute of Scientific and Technical Information of China (English)

    王文军; 仲萃豪

    2000-01-01

    While mainframe or 2-tier client/server systems have serious problems in flexibility and scalability for large-scale business processes, the 3-tier client/server architecture and object-oriented system modeling, which construct business processes on service components, seem to bring software systems some scalability. As an enabling infrastructure for the object-oriented methodology, a distributed WFMS (Workflow Management System) can flexibly describe business rules among autonomous 'service tasks', and support the scalability of large-scale business processes. But current distributed WFMSs still have difficulty managing a large number of distributed tasks; the 'multi-TaskDomain' architecture of FlowAgent tries to solve this problem and bring a dynamic, distributed environment for task scheduling.

  4. The impact of medical technology on office workflow.

    Science.gov (United States)

    McEvoy, S P

    2003-01-01

    Digital technologies are gaining wider acceptance within the medical and dental professions. The lure of increased productivity and improved quality entices practices to adapt. These systems are beginning to have a profound impact on the workflows within the practice, as well as putting new demands on existing resources. To successfully implement a new technology within your practice, you must look beyond advertising and discover the real requirements of the system. Vendors rarely try to help beyond the sale and installation of their equipment, nor do they consider how their product might require you to modify the way you and your staff work. Acquiring the necessary knowledge through self-education, a consultant, or (preferably) a combination of the two is the best way to integrate a new technology into your practice. PMID:14606549

  5. A Model of Workflow-oriented Attributed Based Access Control

    Directory of Open Access Journals (Sweden)

    Guoping Zhang

    2011-02-01

    Full Text Available The emergence of the “Internet of Things” (IoT) breaks with traditional thinking by integrating physical infrastructure and network infrastructure into a unified infrastructure. There will be a lot of resources and information in the IoT, so the computing and processing of information is its core support. In this paper, we introduce “Service-Oriented Computing” (SOC), in which each device can offer its functionality as a standard service. Here we mainly discuss the access control issue of service-oriented computing in the Internet of Things. This paper puts forward a model of Workflow-oriented Attribute Based Access Control (WABAC) and designs an access control framework based on the WABAC model. The model grants permissions to subjects according to subject attributes, resource attributes, environment attributes and the current task, meeting the access control requirements of SOC. Using the presented approach can effectively enhance access control security for SOC applications and prevent the abuse of subject permissions.
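
    A minimal sketch of such a decision function: a permission is granted only when the subject, resource and environment attributes all match a rule for the current task. The rule contents and field names are invented for illustration:

      RULES = [{
          "task": "review_vitals",
          "subject": {"role": "nurse"},
          "resource": {"type": "vitals"},
          "environment": {"on_shift": True},
          "permission": "read",
      }]

      def decide(subject, resource, environment, task, permission):
          def matches(required, actual):
              return all(actual.get(k) == v for k, v in required.items())
          return any(r["task"] == task and r["permission"] == permission
                     and matches(r["subject"], subject)
                     and matches(r["resource"], resource)
                     and matches(r["environment"], environment)
                     for r in RULES)

      print(decide({"role": "nurse"}, {"type": "vitals"},
                   {"on_shift": True}, "review_vitals", "read"))   # True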

  6. Workflow: a new modeling concept in critical care units.

    Science.gov (United States)

    Yousfi, F; Beuscart, R; Geib, J M

    1995-01-01

    The term Groupware concerns computer-based systems that support groups of people engaged in a common task (goal) and that provide an interface to a shared environment [1]. The absence of a common tool for exchanges between physicians and nurses causes a proliferation of paper forms for the recording of information. Our objective is to study software architectures for particular medical units that allow task coordination and the management of conflicts between participants within a distributed environment. The final goal of this research is to propose a computer solution that answers the user requirements in Critical Care Units (CCUs). This paper describes the workflow management approach [5] for supporting group work in the health care field. The emphasis is especially on asynchronous cooperation. This approach was applied to CCUs through the analysis and the proposal of a new architecture [6]. We shall limit ourselves to explaining the control board and analyzing the message management we support. PMID:8591248

  7. Using Simulations to Integrate Technology into Health Care Aides' Workflow

    Directory of Open Access Journals (Sweden)

    Sharla King

    2013-07-01

    Full Text Available Health care aides (HCAs) are critical to home care, providing a range of services to people who have chronic conditions, are aging, or are unable to care for themselves independently. The current HCA supply will not keep up with increasing demand without fundamental changes in their work environment. One possible solution to some of the workflow challenges and workplace stress of HCAs is hand-held tablet technology. In order to introduce the use of tablets with HCAs, simulations were developed. Once an HCA was comfortable with the tablet, a simulated client was introduced. The HCA interacted with the simulated client and used the tablet applications to assist with providing care. After the simulations, the HCAs participated in a focus group. HCAs completed a survey before and after the tablet training and simulation to determine their perception and acceptance of the tablet. Future deployment and implementation of technologies in home care should be further evaluated for outcomes.

  8. Experiment planning and execution workflow at ASDEX Upgrade

    International Nuclear Information System (INIS)

    We present the current workflow from experiment proposals to the actual execution and evaluation of discharges at the ASDEX Upgrade tokamak. Requests for experiments are solicited from both within the IPP and from external collaborators in the yearly call-for-proposals, checked for feasibility and compliance with the project's research goals, and collected in a proposal database. During the campaign, shot requests are derived from the proposals, and in weekly operation meetings the requests are mapped to a schedule (shot list). Before the execution of discharges a complete set of configuration data needs to be assembled. After the execution follows the analysis (including the evaluation of the discharge as to its usefulness for the underlying proposal) and logging of the attained parameters in a physics logbook. The paper describes processes, software tools, and information management, showing how they ultimately lead to improved scientific productivity.

  9. Managing Evolving Business Workflows through the Capture of Descriptive Information

    CERN Document Server

    Gaspard, S; Dindeleux, R; McClatchey, R; Gaspard, Sebastien; Estrella, Florida

    2003-01-01

    Business systems these days need to be agile to address the needs of a changing world. In particular the discipline of Enterprise Application Integration requires business process management to be highly reconfigurable with the ability to support dynamic workflows, inter-application integration and process reconfiguration. Basing EAI systems on model-resident or on a so-called description-driven approach enables aspects of flexibility, distribution, system evolution and integration to be addressed in a domain-independent manner. Such a system called CRISTAL is described in this paper with particular emphasis on its application to EAI problem domains. A practical example of the CRISTAL technology in the domain of manufacturing systems, called Agilium, is described to demonstrate the principles of model-driven system evolution and integration. The approach is compared to other model-driven development approaches such as the Model-Driven Architecture of the OMG and so-called Adaptive Object Models.

  10. A component based approach to scientific workflow management

    International Nuclear Information System (INIS)

    CRISTAL is a distributed scientific workflow system used in the manufacturing and production phases of HEP experiment construction at CERN. The CRISTAL project has studied the use of a description driven approach, using meta-modelling techniques, to manage the evolving needs of a large physics community. Interest from such diverse communities as bio-informatics and manufacturing has motivated the CRISTAL team to re-engineer the system to customize functionality according to end user requirements while maximizing software reuse in the process. The next generation CRISTAL vision is to build a generic component architecture from which a complete software product line can be generated according to the particular needs of the target enterprise. This paper discusses the issues of adopting a component product line based approach and our experiences of software reuse

  11. Automatic Dance Lesson Generation

    Science.gov (United States)

    Yang, Yang; Leung, H.; Yue, Lihua; Deng, LiQun

    2012-01-01

    In this paper, an automatic lesson generation system is presented which is suitable in a learning-by-mimicking scenario where the learning objects can be represented as multiattribute time series data. The dance is used as an example in this paper to illustrate the idea. Given a dance motion sequence as the input, the proposed lesson generation…

  12. Automatic Complexity Analysis

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1989-01-01

    One way to analyse programs is to derive expressions for their computational behaviour. A time bound function (or worst-case complexity) gives an upper bound for the computation time as a function of the size of input. We describe a system to derive such time bounds automatically using abstract...

  13. A Chaotic Particle Swarm Optimization-Based Heuristic for Market-Oriented Task-Level Scheduling in Cloud Workflow Systems

    OpenAIRE

    Xuejun Li; Jia Xu; Yun Yang

    2015-01-01

    Cloud workflow systems are a kind of platform service based on cloud computing. They facilitate the automation of workflow applications. Among the factors distinguishing cloud workflow systems from their counterparts, the market-oriented business model is one of the most prominent. The optimization of task-level scheduling in cloud workflow systems is a hot topic. As the scheduling is an NP problem, Ant Colony Optimization (ACO) and Particle Swarm Optimization (PSO) have been proposed to optimize the cost. However, they h...

  14. Grid-Workflow-Management-Systeme für die Ausführung wissenschaftlicher Prozessabläufe (Grid Workflow Management Systems for Executing Scientific Workflows)

    OpenAIRE

    Ekaterina Elts; Hans-Joachim Bungartz

    2016-01-01

    Executing complex scientific processes (workflows) in a distributed and heterogeneous computing and software environment requires workflow management systems designed specifically for that purpose. In this report, several internationally recognized workflow management systems are examined and compared, taking into account the particular requirements of scientific workflows (as opposed to business processes) and the specific characteristics of the ...

  15. Automatic Texture Mapping of Architectural and Archaeological 3d Models

    Science.gov (United States)

    Kersten, T. P.; Stallmann, D.

    2012-07-01

    Today, detailed, complete and exact 3D models with photo-realistic textures are increasingly demanded for numerous applications in architecture and archaeology. Manual texture mapping of 3D models from digital photographs with software packages such as Maxon Cinema 4D, Autodesk 3ds Max or Maya still requires a complex and time-consuming workflow. Procedures for automatic texture mapping of 3D models are therefore in demand. In this paper two automatic procedures are presented. The first procedure generates 3D surface models with textures via web services, while the second procedure textures already existing 3D models with the software tmapper. The program tmapper is based on the Multi Layer 3D image (ML3DImage) algorithm and is developed in the programming language C++. The studies show that visibility analysis using the ML3DImage algorithm alone is not sufficient to obtain acceptable results of automatic texture mapping. To overcome the visibility problem, the Point Cloud Painter algorithm in combination with the Z-buffer procedure will be applied in the future.

  16. Towards better digital pathology workflows: programming libraries for high-speed sharpness assessment of Whole Slide Images

    Science.gov (United States)

    2014-01-01

    Background Since microscopic slides can now be automatically digitized and integrated in the clinical workflow, quality assessment of Whole Slide Images (WSI) has become a crucial issue. We present a no-reference quality assessment method that has been thoroughly tested since 2010 and is under implementation in multiple sites, both public university-hospitals and private entities. It is part of the FlexMIm R&D project which aims to improve the global workflow of digital pathology. For these uses, we have developed two programming libraries, in Java and Python, which can be integrated in various types of WSI acquisition systems, viewers and image analysis tools. Methods Development and testing have been carried out on a MacBook Pro i7 and on a bi-Xeon 2.7GHz server. Libraries implementing the blur assessment method have been developed in Java, Python, PHP5 and MySQL5. For web applications, JavaScript, Ajax, JSON and Sockets were also used, as well as the Google Maps API. Aperio SVS files were converted into the Google Maps format using VIPS and Openslide libraries. Results We designed the Java library as a Service Provider Interface (SPI), extendable by third parties. Analysis is computed in real-time (3 billion pixels per minute). Tests were made on 5000 single images, 200 NDPI WSI, 100 Aperio SVS WSI converted to the Google Maps format. Conclusions Applications based on our method and libraries can be used upstream, as calibration and quality control tool for the WSI acquisition systems, or as tools to reacquire tiles while the WSI is being scanned. They can also be used downstream to reacquire the complete slides that are below the quality threshold for surgical pathology analysis. WSI may also be displayed in a smarter way by sending and displaying the regions of highest quality before other regions. Such quality assessment scores could be integrated as WSI's metadata shared in clinical, research or teaching contexts, for a more efficient medical informatics
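
    The paper's specific metric is not reproduced in the abstract; as a generic stand-in (not the FlexMIm method), a common no-reference sharpness score for a tile is the variance of its Laplacian, which drops as blur increases:

      import numpy as np
      from scipy import ndimage

      def sharpness_score(tile):
          # Variance of the Laplacian: higher = sharper (generic measure).
          return ndimage.laplace(tile.astype(np.float64)).var()

      tile = np.random.rand(256, 256)
      blurred = ndimage.gaussian_filter(tile, sigma=3)
      print(sharpness_score(tile) > sharpness_score(blurred))   # True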

  17. Workflow Management Application Programming Interface Specification

    Institute of Scientific and Technical Information of China (English)

    刘华伟; 吴朝晖

    2000-01-01

    The document 'Workflow Management Application Programming Interface Specification' is distributed by the Workflow Management Coalition to specify standard APIs that can be supported by workflow management products. In this paper, we first introduce the two parts of this interface, then discuss the standardized data structures and function definitions, and finally address future work.

  18. A Review on Scientific Workflows

    Institute of Scientific and Technical Information of China (English)

    张卫民; 刘灿灿; 骆志刚

    2011-01-01

    Scientific Workflow Management Systems (SWfMS) have become an effective platform for representing and managing large-scale complex scientific computing processes: they glue together a large number of complex applications for data management, analysis, simulation, and visualization, and thereby assist scientists in scientific discovery. In this paper, the current studies of Scientific Workflows (SWF) are first reviewed, including SWF models, representations, languages, composition, validation, scheduling and provenance, as well as fault tolerance and system security. Then, the latest studies in recent years are discussed. Based on the analysis of the key technologies and the latest developments, the drawbacks in this domain are pointed out and some suggestions towards future development are made. Finally, the current research status in China is presented and some suggestions are given.

  19. Automatic texture segmentation for content-based image retrieval application

    OpenAIRE

    Fauzi, M.F.A.; Lewis, P. H.

    2006-01-01

    In this article, a brief review on texture segmentation is presented, before a novel automatic texture segmentation algorithm is developed. The algorithm is based on a modified discrete wavelet frames and the mean shift algorithm. The proposed technique is tested on a range of textured images including composite texture images, synthetic texture images, real scene images as well as our main source of images, the museum images of various kinds. An extension to the automatic texture segmentatio...

  20. Automatic breast cancer risk assessment from digital mammograms

    DEFF Research Database (Denmark)

    Karemore, Gopal Raghunath; Brandt, Sami; Karssemeijer, N;

    assessment tools such as Wolfe Patterns (Wolfe et al 1997), Tabar Patterns (Tabar et al 1982), radiologists' categorical scorings with the Breast Imaging Reporting and Data System® (BIRADS) (ACR 2003), and computer-assisted planimetric measures of area percentage of dense tissue (Byng et al 1994). Moreover, these tools...... rely on a radiologist's assessment of mammographic appearance, which makes them more vulnerable to false negatives due to observational oversights and varying experience. In addition, these tools are not fully automatic, which reduces workflow efficiency in large screening studies. In this work, we...... described in detail by Brandt et al (submitted) and Raundahl et al (2008). Result: The performance of the proposed imaging marker was compared with various radiologist-assisted scorings by the area under the ROC curve, as shown in Table 1 and Fig. 1. Our proposed BC measure showed the maximum area under the ROC curve