WorldWideScience

Sample records for automatic workflow composition

  1. Magallanes: a web services discovery and automatic workflow composition tool

    Directory of Open Access Journals (Sweden)

    Trelles Oswaldo

    2009-10-01

    Full Text Available Abstract Background To aid in bioinformatics data processing and analysis, an increasing number of web-based applications are being deployed. Although this is a positive circumstance in general, the proliferation of tools makes it difficult to find the right tool, or more importantly, the right set of tools that can work together to solve real complex problems. Results Magallanes (Magellan) is a versatile, platform-independent Java library of algorithms aimed at discovering bioinformatics web services and associated data types. A second important feature of Magallanes is its ability to connect available and compatible web services into workflows that can process data sequentially to reach a desired output given a particular input. Magallanes' capabilities can be exploited both through an API and directly through a graphical user interface. The Magallanes API is freely available for academic use and, together with the Magallanes application, has been tested on MS-Windows™ XP and Unix-like operating systems. Detailed implementation information, including user manuals and tutorials, is available at http://www.bitlab-es.com/magallanes. Conclusion Different implementations of the same client (web page, desktop applications, web services, etc.) have been deployed and are currently in use in real installations such as the National Institute of Bioinformatics (Spain) and the ACGT-EU project. This demonstrates the utility and versatility of the software library, including the integration of novel tools in the domain, and provides strong evidence of its ability to facilitate the automatic discovery and composition of workflows.
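
    As a rough illustration of the composition idea above (this is not Magallanes' actual algorithm or API; the service catalogue and data-type names are hypothetical), each service can be viewed as an edge from its input type to its output type, and a breadth-first search then yields the shortest chain of compatible services:

      from collections import deque

      # Hypothetical service catalogue: name -> (input type, output type).
      SERVICES = {
          "blast":      ("FASTA", "BlastReport"),
          "parse_hits": ("BlastReport", "IDList"),
          "fetch_seqs": ("IDList", "FASTA"),
          "align":      ("FASTA", "Alignment"),
      }

      def compose(source_type, target_type):
          """Breadth-first search for the shortest service chain that
          transforms source_type into target_type."""
          queue = deque([(source_type, [])])
          seen = {source_type}
          while queue:
              dtype, path = queue.popleft()
              if dtype == target_type:
                  return path
              for name, (tin, tout) in SERVICES.items():
                  if tin == dtype and tout not in seen:
                      seen.add(tout)
                      queue.append((tout, path + [name]))
          return None  # no composition reaches the target type

      print(compose("FASTA", "IDList"))  # ['blast', 'parse_hits']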

  2. Provenance-Powered Automatic Workflow Generation and Composition

    Science.gov (United States)

    Zhang, J.; Lee, S.; Pan, L.; Lee, T. J.

    2015-12-01

    In recent years, scientists have learned how to codify tools into reusable software modules that can be chained into multi-step executable workflows. Existing scientific workflow tools, created by computer scientists, require domain scientists to meticulously design their multi-step experiments before analyzing data. However, this is oftentimes contradictory to a domain scientist's daily routine of conducting research and exploration. We hope to resolve this mismatch. Imagine this: An Earth scientist starts her day applying NASA Jet Propulsion Laboratory (JPL) published climate data processing algorithms over ARGO deep ocean temperature and AMSRE sea surface temperature datasets. Throughout the day, she tunes the algorithm parameters to study various aspects of the data. Suddenly, she notices some interesting results. She then turns to a computer scientist and asks, "can you reproduce my results?" By tracking and reverse engineering her activities, the computer scientist creates a workflow. The Earth scientist can now rerun the workflow to validate her findings, modify the workflow to discover further variations, or publish the workflow to share the knowledge. In this way, we aim to revolutionize computer-supported Earth science. We have developed a prototyping system to realize the aforementioned vision, in the context of service-oriented science. We have studied how Earth scientists conduct service-oriented data analytics research in their daily work, developed a provenance model to record their activities, and developed a technology to automatically generate workflows from user behavior and to support the adaptability and reuse of these workflows for replicating and improving scientific studies. A data-centric repository infrastructure is established to capture richer provenance to further facilitate collaboration in the science community. We have also established a Petri nets-based verification instrument for provenance-based automatic workflow generation and recommendation.
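
    The core step described here, turning a provenance log into a replayable workflow, can be sketched as follows (a minimal illustration, not the authors' system; the log entries and dataset names are made up): derive a dependency edge whenever one recorded step consumed what another produced, then order the steps topologically for replay.

      # Provenance log: one entry per recorded tool invocation.
      provenance = [
          {"step": "regrid",  "inputs": ["argo_temp"],               "outputs": ["argo_grid"]},
          {"step": "mask",    "inputs": ["amsre_sst"],               "outputs": ["sst_masked"]},
          {"step": "compare", "inputs": ["argo_grid", "sst_masked"], "outputs": ["anomaly_map"]},
      ]

      def to_workflow(log):
          """Recover a replay order from the log (assumes the log is acyclic)."""
          producer = {o: e["step"] for e in log for o in e["outputs"]}
          deps = {e["step"]: {producer[i] for i in e["inputs"] if i in producer}
                  for e in log}
          order, done = [], set()
          while len(order) < len(deps):
              for step, d in deps.items():
                  if step not in done and d <= done:
                      order.append(step)
                      done.add(step)
          return order

      print(to_workflow(provenance))  # ['regrid', 'mask', 'compare']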

  3. Context-aware Workflow Model for Supporting Composite Workflows

    Institute of Scientific and Technical Information of China (English)

    Jong-sun CHOI; Jae-young CHOI; Yong-yun CHO

    2010-01-01

    In recent years, several researchers have applied workflow technologies for service automation in ubiquitous computing environments. However, most context-aware workflows do not offer a method to compose several workflows into a larger or more complicated workflow; they provide only a simple workflow model, not a composite workflow model. In this paper, the authors propose a context-aware workflow model to support composite workflows by expanding the patterns of the existing context-aware workflows, which support the basic workflow patterns. The suggested workflow model offers composite workflow patterns for a context-aware workflow, which consists of various flow patterns, such as simple, split, and parallel flows, and subflows. With the suggested model, existing workflows can easily be reused to make a new workflow. As a result, it can save the development effort and time of context-aware workflows and increase workflow reusability. Therefore, the suggested model is expected to make it easy to develop applications related to context-aware workflow services in ubiquitous computing environments.
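
    A minimal sketch of what such a composite model buys (not the authors' formalism; the activity names are invented): if flows are first-class values built from sequence and parallel patterns, an existing workflow can be embedded unchanged as a subflow of a new one.

      def activity(name):
          return lambda ctx: print(f"{name} (context: {ctx})")

      def sequence(*flows):
          return lambda ctx: [f(ctx) for f in flows]

      def parallel(*flows):          # simplified: branches run in turn here
          return lambda ctx: [f(ctx) for f in flows]

      # An existing context-aware workflow...
      lighting = sequence(activity("sense presence"), activity("adjust lights"))

      # ...reused verbatim as a subflow of a larger composite workflow.
      morning = sequence(activity("wake alarm"),
                         parallel(lighting, activity("brew coffee")))
      morning("bedroom")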

  4. Software workflow for the automatic tagging of medieval manuscript images (SWATI)

    Science.gov (United States)

    Chandna, Swati; Tonne, Danah; Jejkal, Thomas; Stotzka, Rainer; Krause, Celia; Vanscheidt, Philipp; Busch, Hannah; Prabhune, Ajinkya

    2015-01-01

    Digital methods, tools and algorithms are gaining in importance for the analysis of digitized manuscript collections in the arts and humanities. One example is the BMBF-funded research project "eCodicology", which aims to design, evaluate and optimize algorithms for the automatic identification of macro- and micro-structural layout features of medieval manuscripts. The main goal of this research project is to provide better insights into high-dimensional datasets of medieval manuscripts for humanities scholars. The heterogeneous nature and size of the humanities data, and the need to create a database of automatically extracted, reproducible features for better statistical and visual analysis, are the main challenges in designing a workflow for the arts and humanities. This paper presents a concept for a workflow for the automatic tagging of medieval manuscripts. As a starting point, the workflow uses medieval manuscripts digitized within the scope of the project "Virtual Scriptorium St. Matthias". Firstly, these digitized manuscripts are ingested into a data repository. Secondly, specific algorithms are adapted or designed for the identification of macro- and micro-structural layout elements like page size, writing space, number of lines, etc. Lastly, a statistical analysis and scientific evaluation of the manuscript groups are performed. The workflow is designed generically to process large amounts of data automatically with any desired algorithm for feature extraction. As a result, a database of objectified and reproducible features is created which helps to analyze and visualize hidden relationships across around 170,000 pages. The workflow shows the potential of automatic image analysis by enabling the processing of a single page in less than a minute. Furthermore, accuracy tests of the workflow on a small set of manuscripts, with respect to features like page size and text areas, show that automatic and manual analysis are comparable. The usage of a computer...

  5. NGS-Trex: an automatic analysis workflow for RNA-Seq data.

    Science.gov (United States)

    Boria, Ilenia; Boatti, Lara; Saggese, Igor; Mignone, Flavio

    2015-01-01

    RNA-Seq technology allows the rapid analysis of whole transcriptomes, taking advantage of next-generation sequencing platforms. Moreover, with the constant decrease in the cost of NGS analysis, RNA-Seq is becoming very popular and widespread. Unfortunately, data analysis is quite demanding in terms of the bioinformatic skills and infrastructure required, thus limiting the potential users of this method. Here we describe the complete analysis of sample data from raw sequences to data mining of results using the NGS-Trex platform, a low-user-interaction, fully automatic analysis workflow. Used through a web interface, NGS-Trex processes data and profiles the transcriptome of the samples, identifying expressed genes, transcripts, and new and known splice variants. It also detects differentially expressed genes and transcripts across different experiments.

  6. An Automatic Image Processing Workflow for Daily Magnetic Resonance Imaging Quality Assurance.

    Science.gov (United States)

    Peltonen, Juha I; Mäkelä, Teemu; Sofiev, Alexey; Salli, Eero

    2017-04-01

    The performance of magnetic resonance imaging (MRI) equipment is typically monitored with a quality assurance (QA) program. The QA program includes various tests performed at regular intervals. Users may execute specific tests, e.g., daily, weekly, or monthly. The exact interval of these measurements varies according to the department policies, machine setup and usage, manufacturer's recommendations, and available resources. In our experience, a single image acquired before the first patient of the day offers a low-effort and effective system check. When this daily QA check is repeated with identical imaging parameters and phantom setup, the data can be used to derive various time series of the scanner performance. However, daily QA with manual processing can quickly become laborious in a multi-scanner environment. Fully automated image analysis and results output can positively impact the QA process by decreasing reaction time, improving repeatability, and by offering novel performance evaluation methods. In this study, we have developed a daily MRI QA workflow that can measure multiple scanner performance parameters with minimal manual labor required. The daily QA system is built around a phantom image taken by the radiographers at the beginning of the day. The image is acquired with a consistent phantom setup and standardized imaging parameters. Recorded parameters are processed into graphs available to everyone involved in the MRI QA process via a web-based interface. The presented automatic MRI QA system provides an efficient tool for following the short- and long-term stability of MRI scanners.
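
    One daily measurement in such a system might look like the following sketch (assumptions: a simple mean-over-ROI SNR estimate and a CSV time series; the authors' actual parameters and pipeline are not specified here):

      import csv, datetime
      import numpy as np

      def daily_snr(image: np.ndarray, roi: int = 32) -> float:
          """Estimate SNR as mean signal in a central ROI over the noise
          standard deviation in an air-only corner of the phantom image."""
          cy, cx = image.shape[0] // 2, image.shape[1] // 2
          signal = image[cy - roi:cy + roi, cx - roi:cx + roi].mean()
          noise = image[:roi, :roi].std()
          return float(signal / noise)

      def append_measurement(scanner, snr, path="qa_series.csv"):
          with open(path, "a", newline="") as f:
              csv.writer(f).writerow([datetime.date.today(), scanner, f"{snr:.2f}"])

      # Synthetic stand-in for the day's phantom acquisition.
      phantom = np.random.default_rng(0).normal(100, 5, (256, 256))
      phantom[96:160, 96:160] += 900     # bright phantom region in the center
      append_measurement("MRI-1", daily_snr(phantom))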

  7. Evaluation of an image-based tracking workflow with Kalman filtering for automatic image plane alignment in interventional MRI.

    Science.gov (United States)

    Neumann, M; Cuvillon, L; Breton, E; de Mathelin, M

    2013-01-01

    Recently, a workflow for magnetic resonance (MR) image plane alignment based on tracking in real-time MR images was introduced. The workflow is based on a tracking device composed of two resonant micro-coils and a passive marker, and allows for tracking of the passive marker in clinical real-time images and automatic (re-)initialization using the micro-coils. As the Kalman filter has proven its benefit as an estimator and predictor, it is well suited for use in tracking applications. In this paper, a Kalman filter is integrated into the previously developed workflow in order to predict the position and orientation of the tracking device. The measurement noise covariances of the Kalman filter are changed dynamically in order to take into account that, depending on the image plane orientation, only a subset of the 3D pose components is available. The improved tracking performance of the Kalman-extended workflow was quantified in simulation results. A first experiment in the MRI scanner was also performed, but without quantitative results yet.
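
    The Kalman filtering idea used here, predict the pose and distrust measurements of components the current image plane cannot show, can be sketched in one dimension (illustrative matrices and noise levels, not the paper's values):

      import numpy as np

      dt = 0.1
      F = np.array([[1, dt], [0, 1]])   # constant-velocity state [pos, vel]
      H = np.array([[1, 0]])            # only position is measured
      Q = 1e-3 * np.eye(2)              # process noise
      x, P = np.zeros((2, 1)), np.eye(2)

      def step(z, measurable=True):
          global x, P
          x, P = F @ x, F @ P @ F.T + Q                  # predict
          R = np.array([[1e-2 if measurable else 1e6]])  # inflate R if unseen
          K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman gain
          x = x + K @ (z - H @ x)                        # update
          P = (np.eye(2) - K @ H) @ P
          return float(x[0, 0])

      for t, z in enumerate([0.1, 0.22, 0.28, 0.4]):
          print(step(np.array([[z]]), measurable=(t != 2)))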

  8. Assessing the Performance of Automatic Speech Recognition Systems When Used by Native and Non-Native Speakers of Three Major Languages in Dictation Workflows

    DEFF Research Database (Denmark)

    Zapata, Julián; Kirkedal, Andreas Søeborg

    2015-01-01

    In this paper, we report on a two-part experiment aiming to assess and compare the performance of two types of automatic speech recognition (ASR) systems on two different computational platforms when used to augment dictation workflows. The experiment was performed with a sample of speakers of three major languages and with different linguistic profiles: non-native English speakers; non-native French speakers; and native Spanish speakers. The main objective of this experiment is to examine ASR performance in translation dictation (TD) and medical dictation (MD) workflows without manual...

  9. A workflow for the automatic segmentation of organelles in electron microscopy image stacks.

    Science.gov (United States)

    Perez, Alex J; Seyedhosseini, Mojtaba; Deerinck, Thomas J; Bushong, Eric A; Panda, Satchidananda; Tasdizen, Tolga; Ellisman, Mark H

    2014-01-01

    Electron microscopy (EM) facilitates analysis of the form, distribution, and functional status of key organelle systems in various pathological processes, including those associated with neurodegenerative disease. Such EM data often provide important new insights into the underlying disease mechanisms. The development of more accurate and efficient methods to quantify changes in subcellular microanatomy has already proven key to understanding the pathogenesis of Parkinson's and Alzheimer's diseases, as well as glaucoma. While our ability to acquire large volumes of 3D EM data is progressing rapidly, more advanced analysis tools are needed to assist in measuring precise three-dimensional morphologies of organelles within data sets that can include hundreds to thousands of whole cells. Although new imaging instrument throughputs can exceed teravoxels of data per day, image segmentation and analysis remain significant bottlenecks to achieving quantitative descriptions of whole cell structural organellomes. Here, we present a novel method for the automatic segmentation of organelles in 3D EM image stacks. Segmentations are generated using only 2D image information, making the method suitable for anisotropic imaging techniques such as serial block-face scanning electron microscopy (SBEM). Additionally, no assumptions about 3D organelle morphology are made, ensuring the method can be easily expanded to any number of structurally and functionally diverse organelles. Following the presentation of our algorithm, we validate its performance by assessing the segmentation accuracy of different organelle targets in an example SBEM dataset and demonstrate that it can be efficiently parallelized on supercomputing resources, resulting in a dramatic reduction in runtime.

  10. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML

    Directory of Open Access Journals (Sweden)

    Rhodri eCusack

    2015-01-01

    Full Text Available Recent years have seen neuroimaging data becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complex to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast and efficient, for simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address.

  11. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML.

    Science.gov (United States)

    Cusack, Rhodri; Vicente-Grabovetsky, Alejandro; Mitchell, Daniel J; Wild, Conor J; Auer, Tibor; Linke, Annika C; Peelle, Jonathan E

    2014-01-01

    Recent years have seen neuroimaging data sets becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complicated to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast, and efficient, for simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address.
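
    The engine behavior described, derive dependencies and redo only what is missing, can be sketched in a few lines (a toy illustration, not aa's Matlab API; module and data names are invented):

      done = set()

      MODULES = {
          "realign":  {"needs": [],          "makes": ["epi_r"]},
          "coreg":    {"needs": ["epi_r"],   "makes": ["epi_rc"]},
          "smooth":   {"needs": ["epi_rc"],  "makes": ["epi_rcs"]},
          "firstlvl": {"needs": ["epi_rcs"], "makes": ["stats"]},
      }

      def ensure(data):
          """Run (once) whichever module produces `data`, after its inputs."""
          mod = next(m for m, s in MODULES.items() if data in s["makes"])
          if mod in done:
              return
          for needed in MODULES[mod]["needs"]:
              ensure(needed)
          print("running", mod)   # a real engine would execute the task here
          done.add(mod)

      ensure("stats")   # runs realign, coreg, smooth, firstlvl in order
      ensure("stats")   # second call: nothing re-runs, all stages complete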

  12. A workflow for the automatic segmentation of organelles in electron microscopy image stacks

    Directory of Open Access Journals (Sweden)

    Alex Joseph Perez

    2014-11-01

    Full Text Available Electron microscopy (EM) facilitates analysis of the form, distribution, and functional status of key organelle systems in various pathological processes, including those associated with neurodegenerative disease. Such EM data often provide important new insights into the underlying disease mechanisms. The development of more accurate and efficient methods to quantify changes in subcellular microanatomy has already proven key to understanding the pathogenesis of Parkinson’s and Alzheimer’s diseases, as well as glaucoma. While our ability to acquire large volumes of 3D EM data is progressing rapidly, more advanced analysis tools are needed to assist in measuring precise three-dimensional morphologies of organelles within data sets that can include hundreds to thousands of whole cells. Although new imaging instrument throughputs can exceed teravoxels of data per day, image segmentation and analysis remain significant bottlenecks to achieving quantitative descriptions of whole cell structural organellomes. Here, we present a novel method for the automatic segmentation of organelles in 3D EM image stacks. Segmentations are generated using only 2D image information, making the method suitable for anisotropic imaging techniques such as serial block-face scanning electron microscopy (SBEM). Additionally, no assumptions about 3D organelle morphology are made, ensuring the method can be easily expanded to any number of structurally and functionally diverse organelles. Following the presentation of our algorithm, we validate its performance by assessing the segmentation accuracy of different organelle targets in an example SBEM dataset and demonstrate that it can be efficiently parallelized on supercomputing resources, resulting in a dramatic reduction in runtime.
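
    Because segmentation uses only 2D information, each slice is an independent task, which is what makes the parallelization straightforward. A minimal sketch of that structure (a fixed intensity threshold stands in for the paper's trained classifiers):

      import numpy as np
      from multiprocessing import Pool

      def segment_slice(img2d):
          """Toy per-slice segmentation: threshold at mean + 2*std."""
          return (img2d > img2d.mean() + 2 * img2d.std()).astype(np.uint8)

      if __name__ == "__main__":
          stack = np.random.default_rng(1).normal(size=(8, 512, 512))  # z, y, x
          with Pool() as pool:
              masks = pool.map(segment_slice, list(stack))  # one task per slice
          print(np.stack(masks).shape)  # (8, 512, 512) binary volume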

  13. Characterizing chaotic melodies in automatic music composition.

    Science.gov (United States)

    Coca, Andrés E; Tost, Gerard O; Zhao, Liang

    2010-09-01

    In this paper, we initially present an algorithm for automatic composition of melodies using chaotic dynamical systems. Afterward, we characterize chaotic music in a comprehensive way as comprising three perspectives: musical discrimination, dynamical influence on musical features, and musical perception. With respect to the first perspective, the coherence between generated chaotic melodies (continuous as well as discrete chaotic melodies) and a set of classical reference melodies is characterized by statistical descriptors and melodic measures. The significant differences among the three types of melodies are determined by discriminant analysis. Regarding the second perspective, the influence of dynamical features of chaotic attractors, e.g., Lyapunov exponent, Hurst coefficient, and correlation dimension, on melodic features is determined by canonical correlation analysis. The last perspective is related to perception of originality, complexity, and degree of melodiousness (Euler's gradus suavitatis) of chaotic and classical melodies by nonparametric statistical tests.

  14. Characterizing chaotic melodies in automatic music composition

    Science.gov (United States)

    Coca, Andrés E.; Tost, Gerard O.; Zhao, Liang

    2010-09-01

    In this paper, we initially present an algorithm for automatic composition of melodies using chaotic dynamical systems. Afterward, we characterize chaotic music in a comprehensive way as comprising three perspectives: musical discrimination, dynamical influence on musical features, and musical perception. With respect to the first perspective, the coherence between generated chaotic melodies (continuous as well as discrete chaotic melodies) and a set of classical reference melodies is characterized by statistical descriptors and melodic measures. The significant differences among the three types of melodies are determined by discriminant analysis. Regarding the second perspective, the influence of dynamical features of chaotic attractors, e.g., Lyapunov exponent, Hurst coefficient, and correlation dimension, on melodic features is determined by canonical correlation analysis. The last perspective is related to perception of originality, complexity, and degree of melodiousness (Euler's gradus suavitatis) of chaotic and classical melodies by nonparametric statistical tests.
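
    A toy version of the generative step (one common construction, not the paper's exact generators): iterate a map in its chaotic regime and quantize each value onto a musical scale.

      SCALE = [60, 62, 64, 65, 67, 69, 71, 72, 74, 76, 77, 79, 81, 83]  # MIDI, C major

      def chaotic_melody(length=16, r=3.9, x=0.5):
          """Logistic map x -> r*x*(1-x), chaotic near r = 3.9."""
          notes = []
          for _ in range(length):
              x = r * x * (1 - x)
              notes.append(SCALE[int(x * len(SCALE))])  # quantize to the scale
          return notes

      print(chaotic_melody())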

  15. QoS measurement of workflow-based web service compositions using Colored Petri net.

    Science.gov (United States)

    Nematzadeh, Hossein; Motameni, Homayun; Mohamad, Radziah; Nematzadeh, Zahra

    2014-01-01

    Workflow-based web service compositions (WB-WSCs) are one of the main composition categories in service-oriented architecture (SOA). Eflow, the polymorphic process model (PPM), and the business process execution language (BPEL) are the main techniques in the category of WB-WSCs. With the maturing of web services, measuring the quality of composite web services developed with different techniques has become one of the most important challenges in today's web environments. Businesses should try to provide good quality, with respect to customers' requirements, in a composed web service. Thus, quality of service (QoS), which refers to nonfunctional parameters, is important to measure so that the quality degree of a given web service composition can be determined. This paper seeks a deterministic analytical method for dependability and performance measurement using Colored Petri nets (CPNs) with explicit routing constructs and the application of probability theory. A computer tool called WSET was also developed for modeling and supporting QoS measurement through simulation.

  16. QoS Measurement of Workflow-Based Web Service Compositions Using Colored Petri Net

    Science.gov (United States)

    Nematzadeh, Hossein; Motameni, Homayun; Nematzadeh, Zahra

    2014-01-01

    Workflow-based web service compositions (WB-WSCs) are one of the main composition categories in service-oriented architecture (SOA). Eflow, the polymorphic process model (PPM), and the business process execution language (BPEL) are the main techniques in the category of WB-WSCs. With the maturing of web services, measuring the quality of composite web services developed with different techniques has become one of the most important challenges in today's web environments. Businesses should try to provide good quality, with respect to customers' requirements, in a composed web service. Thus, quality of service (QoS), which refers to nonfunctional parameters, is important to measure so that the quality degree of a given web service composition can be determined. This paper seeks a deterministic analytical method for dependability and performance measurement using Colored Petri nets (CPNs) with explicit routing constructs and the application of probability theory. A computer tool called WSET was also developed for modeling and supporting QoS measurement through simulation. PMID:25110748

  17. QoS Measurement of Workflow-Based Web Service Compositions Using Colored Petri Net

    Directory of Open Access Journals (Sweden)

    Hossein Nematzadeh

    2014-01-01

    Full Text Available Workflow-based web service compositions (WB-WSCs) are one of the main composition categories in service-oriented architecture (SOA). Eflow, the polymorphic process model (PPM), and the business process execution language (BPEL) are the main techniques in the category of WB-WSCs. With the maturing of web services, measuring the quality of composite web services developed with different techniques has become one of the most important challenges in today’s web environments. Businesses should try to provide good quality, with respect to customers’ requirements, in a composed web service. Thus, quality of service (QoS), which refers to nonfunctional parameters, is important to measure so that the quality degree of a given web service composition can be determined. This paper seeks a deterministic analytical method for dependability and performance measurement using Colored Petri nets (CPNs) with explicit routing constructs and the application of probability theory. A computer tool called WSET was also developed for modeling and supporting QoS measurement through simulation.
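
    The probabilistic aggregation underlying such dependability and performance estimates can be sketched without the Petri net machinery (illustrative numbers only; the paper's CPN models handle routing far more generally): reliabilities multiply along a sequence, and a parallel split/join waits for its slowest branch.

      import math

      def sequence(*services):
          return {"time": sum(s["time"] for s in services),
                  "rel":  math.prod(s["rel"] for s in services)}

      def parallel(*branches):
          return {"time": max(b["time"] for b in branches),
                  "rel":  math.prod(b["rel"] for b in branches)}

      check_stock = {"time": 120, "rel": 0.99}   # ms, success probability
      bill        = {"time": 200, "rel": 0.98}
      ship        = {"time": 300, "rel": 0.97}

      print(sequence(check_stock, parallel(bill, ship)))
      # {'time': 420, 'rel': 0.9410...}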

  18. An automatic composition model of Chinese folk music

    Science.gov (United States)

    Zheng, Xiaomei; Li, Dongyang; Wang, Lei; Shen, Lin; Gao, Yanyuan; Zhu, Yuanyuan

    2017-03-01

    Automatic composition has achieved rich results in recent decades for Western and some other musical traditions. However, the automatic composition of Chinese music has received less attention. After thousands of years of development, Chinese folk music offers a wealth of resources. Designing an automatic composition model that learns the characteristics of Chinese folk melodies and imitates the creative process of music is therefore of some significance. According to the melodic features of Chinese folk music, a composition model based on a Markov model is proposed to analyze Chinese traditional music. Folk songs with typical Chinese national characteristics are selected for analysis. In this paper, an example of automatic composition is given. The experimental results show that this composition model can produce music with the characteristics of Chinese folk music.
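
    A first-order version of such a Markov composer fits in a few lines (a minimal sketch of the approach; the training phrase below is invented, not actual folk-song data):

      import random
      from collections import defaultdict

      training = ["C", "D", "E", "G", "A", "G", "E", "D", "C", "D", "E", "D", "C"]

      transitions = defaultdict(list)
      for a, b in zip(training, training[1:]):
          transitions[a].append(b)      # empirical next-note distribution

      def compose(start="C", length=12, seed=7):
          rng = random.Random(seed)
          melody = [start]
          for _ in range(length - 1):
              melody.append(rng.choice(transitions[melody[-1]]))
          return melody

      print(" ".join(compose()))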

  19. Managing and Documenting Legacy Scientific Workflows.

    Science.gov (United States)

    Acuña, Ruben; Chomilier, Jacques; Lacroix, Zoé

    2015-10-06

    Scientific legacy workflows are often developed over many years, poorly documented and implemented with scripting languages. In the context of our cross-disciplinary projects we face the problem of maintaining such scientific workflows. This paper presents the Workflow Instrumentation for Structure Extraction (WISE) method used to process several ad-hoc legacy workflows written in Python and automatically produce their workflow structural skeleton. Unlike many existing methods, WISE does not assume input workflows to be preprocessed in a known workflow formalism. It is also able to identify and analyze calls to external tools. We present the method and report its results on several scientific workflows.
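
    One concrete subproblem in such structure extraction, identifying calls to external tools inside legacy Python, can be approximated with static analysis (a naive sketch, not the WISE method; it only spots subprocess invocations in a made-up script):

      import ast, textwrap

      script = textwrap.dedent("""
          import subprocess
          subprocess.run(["blastp", "-query", "in.fa"])
          subprocess.run(["muscle", "-in", "hits.fa"])
      """)

      for node in ast.walk(ast.parse(script)):
          if (isinstance(node, ast.Call)
                  and isinstance(node.func, ast.Attribute)
                  and node.func.attr in {"run", "call", "Popen"}):
              print("external step:", ast.literal_eval(node.args[0])[0])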

  20. Structured Composition of Dataflow and Control-Flow for Reusable and Robust Scientific Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Bowers, S; Ludaescher, B; Ngu, A; Critchlow, T

    2005-09-07

    Data-centric scientific workflows are often modeled as dataflow process networks. The simplicity of the dataflow framework facilitates workflow design, analysis, and optimization. However, some workflow tasks are particularly "control-flow intensive", e.g., procedures to make workflows more fault-tolerant and adaptive in an unreliable, distributed computing environment. Modeling complex control-flow directly within a dataflow framework often leads to overly complicated workflows that are hard to comprehend, reuse, schedule, and maintain. In this paper, we develop a framework that allows a structured embedding of control-flow intensive subtasks within dataflow process networks. In this way, we can seamlessly handle complex control-flows without sacrificing the benefits of dataflow. We build upon a flexible actor-oriented modeling and design approach and extend it with (actor) frames and (workflow) templates. A frame is a placeholder for an (existing or planned) collection of components with similar function and signature. A template partially specifies the behavior of a subworkflow by leaving "holes" (i.e., frames) in the subworkflow definition. Taken together, these abstraction mechanisms facilitate the separation and structured re-combination of control-flow and dataflow in scientific workflow applications. We illustrate our approach with a real-world scientific workflow from the astrophysics domain. This data-intensive workflow requires remote execution and file transfer in a semi-reliable environment. For such workflows, we propose a 3-layered architecture: the top level, typically a dataflow process network, includes Generic Data Transfer (GDT) frames and Generic remote eXecution (GX) frames. At the second level, the user can specialize the behavior of these generic components by embedding a suitable template (here: transducer templates for control-flow intensive tasks). At the third level, frames inside the...

  1. Automatic Music Composition using Answer Set Programming

    CERN Document Server

    Boenn, Georg; De Vos, Marina; ffitch, John

    2010-01-01

    Music composition used to be a pen and paper activity. These days music is often composed with the aid of computer software, even to the point where the computer composes parts of the score autonomously. The composition of most styles of music is governed by rules. We show that by approaching the automation, analysis and verification of composition as a knowledge representation task and formalising these rules in a suitable logical language, powerful and expressive intelligent composition tools can be easily built. This application paper describes the use of answer set programming to construct an automated system, named ANTON, that can compose melodic, harmonic and rhythmic music, diagnose errors in human compositions and serve as a computer-aided composition tool. The combination of harmonic, rhythmic and melodic composition in a single framework makes ANTON unique in the growing area of algorithmic composition. With near real-time composition, ANTON reaches the point where it can not only be used as a...

  2. QoS-Aware Automatic Service Composition: A Graph View

    Institute of Scientific and Technical Information of China (English)

    Wei Jiang; Tian Wu; Song-Lin Hu; Zhi-Yong Liu

    2011-01-01

    In the research of service composition, efficient algorithms are demanded that not only retrieve correct service compositions automatically from thousands of services but also satisfy the quality requirements of different service users. However, most approaches treat these two aspects as two separate problems: automatic service composition and service selection. Although the latest research recognizes the restriction of this separate view, and some specific methods have been proposed, they still suffer from serious limitations in scalability and accuracy when addressing both requirements simultaneously. In order to cope with these limitations and efficiently solve the combined problem, known as the QoS-aware or QoS-driven automatic service composition problem, we propose a new graph search problem, single-source optimal directed acyclic graphs (DAGs), for the first time. This novel single-source optimal DAGs (SSOD) problem is similar to, but more general than, the classical single-source shortest paths (SSSP) problem. In this paper, a new graph model of the SSOD problem is proposed and a Sim-Dijkstra algorithm is presented to address it with a time complexity of O(n log n + m) (n and m are the numbers of nodes and edges in the graph, respectively), together with proofs of its soundness. It is also directly applied to solve the QoS-aware automatic service composition problem, and a service composition tool named QSynth is implemented. Evaluations show that the Sim-Dijkstra algorithm achieves superior scalability and efficiency with respect to a large variety of composition scenarios, even more efficient than our worklist algorithm that won the performance championship of the Web Services Challenge 2009.
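
    The special case of SSOD that coincides with shortest paths is easy to visualize (a minimal sketch only; SSOD itself is more general, and the graph below is invented): nodes are data types, edges are services weighted by a QoS value such as latency, and Dijkstra's algorithm picks the cheapest chain.

      import heapq

      # source type -> [(target type, service name, latency in ms), ...]
      EDGES = {
          "A": [("B", "s1", 50), ("C", "s2", 30)],
          "B": [("D", "s3", 10)],
          "C": [("D", "s4", 60)],
          "D": [],
      }

      def cheapest_composition(src, dst):
          pq, best = [(0, src, [])], {src: 0}
          while pq:
              cost, node, path = heapq.heappop(pq)
              if node == dst:
                  return cost, path
              for nxt, svc, w in EDGES[node]:
                  if cost + w < best.get(nxt, float("inf")):
                      best[nxt] = cost + w
                      heapq.heappush(pq, (cost + w, nxt, path + [svc]))
          return None

      print(cheapest_composition("A", "D"))   # (60, ['s1', 's3'])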

  3. Scientific Process Automation and Workflow Management

    Energy Technology Data Exchange (ETDEWEB)

    Ludaescher, Bertram T.; Altintas, Ilkay; Bowers, Shawn; Cummings, J.; Critchlow, Terence J.; Deelman, Ewa; De Roure, D.; Freire, Juliana; Goble, Carole; Jones, Matt; Klasky, S.; McPhillips, Timothy; Podhorszki, Norbert; Silva, C.; Taylor, I.; Vouk, M.

    2010-01-01

    We introduce and describe scientific workflows, i.e., executable descriptions of automatable scientific processes such as computational science simulations and data analyses. Scientific workflows are often expressed in terms of tasks and their (dataflow) dependencies. This chapter first provides an overview of the characteristic features of scientific workflows and outlines their life cycle. A detailed case study highlights workflow challenges and solutions in simulation management. We then provide a brief overview of how some concrete systems support the various phases of the workflow life cycle, i.e., design, resource management, execution, and provenance management. We conclude with a discussion on community-based workflow sharing.

  4. Child vocalization composition as discriminant information for automatic autism detection.

    Science.gov (United States)

    Xu, Dongxin; Gilkerson, Jill; Richards, Jeffrey; Yapanel, Umit; Gray, Sharmi

    2009-01-01

    Early identification is crucial for young children with autism to access early intervention. The existing screens require either a parent-report questionnaire and/or direct observation by a trained practitioner. Although an automatic tool would benefit parents, clinicians and children, there is no automatic screening tool in clinical use. This study reports a fully automatic mechanism for autism detection/screening for young children. This is a direct extension of the LENA (Language ENvironment Analysis) system, which utilizes speech signal processing technology to analyze and monitor a child's natural language environment and the vocalizations/speech of the child. It is discovered that child vocalization composition contains rich discriminant information for autism detection. By applying pattern recognition and machine learning approaches to child vocalization composition data, accuracy rates of 85% to 90% in cross-validation tests for autism detection have been achieved at the equal-error-rate (EER) point on a data set with 34 children with autism, 30 language delayed children and 76 typically developing children. Due to its easy and automatic procedure, it is believed that this new tool can serve a significant role in childhood autism screening, especially in regards to population-based or universal screening.

  5. A prototype of workflow management system for construction design projects

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    A great deal of benefit can be achieved if information and process are integrated within a building design project. This paper aims to establish a prototype of a workflow management system for construction design projects through the application of workflow technology. The composition and function of the prototype are presented to satisfy the needs of information sharing and process integration. By integrating all subsystems and modules of the prototype, the whole system can deal with design information-flow modeling, emulation and optimization, task planning and distribution, automatic tracking and monitoring, as well as network services. In this way, the collaborative design environment for building design projects is brought into being.

  6. Concentrate composition for Automatic Milking Systems - Effect on milking frequency

    DEFF Research Database (Denmark)

    Madsen, J; Weisbjerg, Martin Riis; Hvelplund, Torben

    2010-01-01

    The purpose of this study was to investigate the potential of affecting milking frequency in an Automatic Milking System (AMS) by changing the ingredient composition of the concentrate fed in the AMS. In six experiments, six experimental concentrates were tested against a Standard concentrate, all... the Standard concentrate. A marked effect was found on the number of visits of the cows to the AMS and the subsequent milk production in relation to the composition of the concentrate. The composition of the concentrates also influenced the composition of the milk and the MR intake. Based on the overall responses... the cows preferred a mixture containing Barley and Oats. Wheat-based concentrate also appeared to be preferred to concentrate based on Maize or Barley, and the cows did not like the Fat-rich or the pure Artificially dried grass concentrate used in the experiment.

  7. Automatic Discovery of Non-Compositional Compounds in Parallel Data

    CERN Document Server

    Melamed, I D

    1997-01-01

    Automatic segmentation of text into minimal content-bearing units is an unsolved problem even for languages like English. Spaces between words offer an easy first approximation, but this approximation is not good enough for machine translation (MT), where many word sequences are not translated word-for-word. This paper presents an efficient automatic method for discovering sequences of words that are translated as a unit. The method proceeds by comparing pairs of statistical translation models induced from parallel texts in two languages. It can discover hundreds of non-compositional compounds on each iteration, and constructs longer compounds out of shorter ones. Objective evaluation on a simple machine translation task has shown the method's potential to improve the quality of MT output. The method makes few assumptions about the data, so it can be applied to parallel data other than parallel texts, such as word spellings and pronunciations.

  8. Modeling workflow using XML and Petri net

    Institute of Scientific and Technical Information of China (English)

    杨东; 温泉; 张申生

    2004-01-01

    Nowadays an increasing number of workflow products and research prototypes are adopting XML for representing workflow models, owing to its ease of use and clear interpretability for both people and machines. However, most workflow products and research prototypes provide little support for the verification of XML-based workflow models, such as deadlock-freedom, which is essential to the successful application of workflow technology. In this paper, we tackle this problem by mapping the XML-based workflow model into a Petri net, a well-known formalism for modeling, analyzing and verifying systems. As a result, the XML-based workflow model can be automatically verified with the help of general Petri net tools, such as DANAMICS. The presented approach not only enables end users to represent workflow models with an XML-based modeling language, but also ensures the correctness of the model, thus satisfying the needs of business processes.
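
    The essence of the mapping can be sketched for a purely sequential model (a toy illustration, not the paper's full translation or the DANAMICS tool; the task names are invented): each XML activity becomes a transition between consecutive places, and firing the chain from the initial marking checks that the net terminates.

      import xml.etree.ElementTree as ET

      xml_model = """
      <workflow>
        <task name="receive_order"/>
        <task name="check_credit"/>
        <task name="ship_goods"/>
      </workflow>
      """

      tasks = [t.get("name") for t in ET.fromstring(xml_model)]
      marking = {f"p{i}": 0 for i in range(len(tasks) + 1)}
      marking["p0"] = 1                      # token in the start place

      for i, task in enumerate(tasks):       # fire each transition in order
          assert marking[f"p{i}"] > 0, f"deadlock before {task}"
          marking[f"p{i}"] -= 1
          marking[f"p{i + 1}"] += 1
          print("fired", task)

      print("terminated:", marking[f"p{len(tasks)}"] == 1)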

  9. Data Exchange in Grid Workflow

    Institute of Scientific and Technical Information of China (English)

    ZENG Hongwei; MIAO Huaikou

    2006-01-01

    In existing web services-based workflows, data exchange across the web services is centralized: the workflow engine mediates at each step of the application sequence. However, many grid applications, especially data-intensive scientific applications, require exchanging large amounts of data across grid services. Having a central workflow engine relay the data between the services would result in a bottleneck in these cases. This paper proposes a data exchange model for individual grid workflows and multi-workflow compositions, respectively. The model enables direct communication of large amounts of data between two grid services. To enable data exchange among multiple workflows, a bridge data service is used.

  10. Research on Web Services Composition Model Based on Workflow Template

    Institute of Scientific and Technical Information of China (English)

    李顺新; 凌海洋; 江南

    2009-01-01

    Web services composition is an important research field of service applications. Exploiting the similarity between workflows and Web services composition, a new Web services composition model based on workflow templates is proposed. In this model, workflows and Web services can be found more accurately by using the advantages of functional semantics in service matching. An Agent-based method is used to execute the composition flow. Finally, using a publishing algorithm, the template flow and the composed Web service are published to the registry.

  11. Linked-OWL: A new approach for dynamic linked data service workflow composition

    Directory of Open Access Journals (Sweden)

    Hussien Ahmad

    2013-06-01

    Full Text Available The shift from the Web of Documents to the Web of Data, based on the Linked Data principles defined by Tim Berners-Lee, posed a big challenge to building and developing applications that work in the Web of Data environment. There have been several attempts to build service and application models for the Linked Data Cloud. In this paper, we propose a new service model for linked data, "Linked-OWL", which is based on RESTful services and OWL-S and copes with linked data principles. This new model shifts the service concept from functions to linked data things, and opens the road for Linked Oriented Architecture (LOA) and a Web of Services as part of, and on top of, the Web of Data. The model also provides a high level of dynamic service composition capability for more accurate dynamic composition and execution of complex business processes in the Web of Data environment.

  12. On A Semi-Automatic Method for Generating Composition Tables

    CERN Document Server

    Liu, Weiming

    2011-01-01

    Originating from Allen's Interval Algebra, composition-based reasoning has been widely acknowledged as the most popular reasoning technique in qualitative spatial and temporal reasoning. Given a qualitative calculus (i.e., a relation model), the first thing we should do is to establish its composition table (CT). In the past three decades, such work has usually been done manually. This is undesirable and error-prone, given that the calculus may contain tens or hundreds of basic relations. Computing the correct CT was identified by Tony Cohn as a challenge for computer scientists in 1995. This paper addresses this problem and introduces a semi-automatic method to compute the CT by randomly generating triples of elements. For several important qualitative calculi, our method can establish the correct CT in a reasonably short time. This is illustrated by applications to the Interval Algebra, the Region Connection Calculus RCC-8, the INDU calculus, and the Oriented Point Relation Algebras. Our method can also be us...
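
    For the simplest calculus, the point algebra over {<, =, >}, the sampling idea looks like this (a minimal sketch of the random-triple approach, not the authors' implementation):

      import random
      from collections import defaultdict

      def rel(a, b):
          return "<" if a < b else ">" if a > b else "="

      ct = defaultdict(set)
      rng = random.Random(0)
      for _ in range(20000):                      # random triples (x, y, z)
          x, y, z = (rng.randint(0, 9) for _ in range(3))
          ct[(rel(x, y), rel(y, z))].add(rel(x, z))

      print(sorted(ct[("<", "<")]))  # ['<']           : composition is determined
      print(sorted(ct[("<", ">")]))  # ['<', '=', '>'] : the universal relation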

  13. Automated data reduction workflows for astronomy

    CERN Document Server

    Freudling, W; Bramich, D M; Ballester, P; Forchi, V; Garcia-Dabo, C E; Moehler, S; Neeser, M J

    2013-01-01

    Data from complex modern astronomical instruments often consist of a large number of different science and calibration files, and their reduction requires a variety of software tools. The execution chain of the tools represents a complex workflow that needs to be tuned and supervised, often by individual researchers who are not necessarily experts in any specific instrument. The efficiency of data reduction can be improved by using automatic workflows to organise data and execute the sequence of data reduction steps. To realize such efficiency gains, we designed a system that allows intuitive representation, execution and modification of the data reduction workflow, and has facilities for inspection of and interaction with the data. The European Southern Observatory (ESO) has developed Reflex, an environment to automate data reduction workflows. Reflex is implemented as a package of customized components for the Kepler workflow engine. Kepler provides the graphical user interface to create an executable flowch...

  14. Workflow automation architecture standard

    Energy Technology Data Exchange (ETDEWEB)

    Moshofsky, R.P.; Rohen, W.T. [Boeing Computer Services Co., Richland, WA (United States)

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  15. Implementing Workflow Reconfiguration in WS-BPEL

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Dragoni, Nicola; Zhou, Mu

    2012-01-01

    This paper investigates the problem of dynamic reconfiguration by means of a workflow-based case study used for discussion. We state the requirements on a system implementing the workflow and its reconfiguration, and we describe the system’s design in BPMN. WS-BPEL, a language that would not naturally support dynamic change, is used as a target for implementation. The WS-BPEL recovery framework is here exploited to implement the reconfiguration using principles derived from previous research in process algebra, and two mappings from BPMN to WS-BPEL are presented, one automatic and only mostly...

  16. Professional Windows Workflow Foundation

    CERN Document Server

    Kitta, Todd

    2007-01-01

    If you want to gain the skills to build Windows Workflow Foundation solutions, then this is the book for you. It provides you with a clear, practical guide on how to develop workflow-based software and integrate it into existing technology landscapes. Throughout the pages, you'll also find numerous real-world examples and sample code that will help you to get started quickly. Each major area of Windows Workflow Foundation is explored in depth, along with some of the fundamental operations related to generic workflow applications. You'll also find detailed coverage on how to develop workflow in...

  17. Optimize Internal Workflow Management

    Directory of Open Access Journals (Sweden)

    Lucia RUSU

    2010-01-01

    Full Text Available Workflow Management has the role of creating and maintaining an efficient flow of information and tasks inside an organization. The major benefit of workflows is that they provide solutions to the growing needs of organizations. The external and the internal processes associated with a business need to be carefully organized in order to provide a strong foundation for the daily work. This paper focuses on internal workflow within a company, attempts to provide some basic principles related to workflows, and presents a workflow solution for modeling and deployment using Visual Studio and SharePoint Server.

  18. Unidirectional high fiber content composites: Automatic 3D FE model generation and damage simulation

    DEFF Research Database (Denmark)

    Qing, Hai; Mishnaevsky, Leon

    2009-01-01

    A new method and a software code for the automatic generation of 3D micromechanical FE models of unidirectional long-fiber-reinforced composites (LFRC) with high fiber volume fraction and random fiber arrangement are presented. The fiber arrangement in the cross-section is generated through random...

  19. A P2P approach to resource discovery in on-line monitoring of Grid workflows

    NARCIS (Netherlands)

    Łabno, B.; Bubak, M.; Baliś, B.

    2008-01-01

    On-line monitoring of Grid workflows is challenging since workflows are loosely coupled and highly dynamic. An efficient mechanism of automatic resource discovery is needed in order to quickly discover new producers of workflow monitoring data. However, currently used Grid information systems are not s...

  20. AWSCS-A System to Evaluate Different Approaches for the Automatic Composition and Execution of Web Services Flows.

    Science.gov (United States)

    Tardiole Kuehne, Bruno; Estrella, Julio Cezar; Nunes, Luiz Henrique; Martins de Oliveira, Edvard; Hideo Nakamura, Luis; Gomes Ferreira, Carlos Henrique; Carlucci Santana, Regina Helena; Reiff-Marganiec, Stephan; Santana, Marcos José

    2015-01-01

    This paper proposes a system named AWSCS (Automatic Web Service Composition System) to evaluate different approaches for the automatic composition of Web services, based on QoS parameters that are measured at execution time. AWSCS is a system both to implement different approaches for the automatic composition of Web services and to execute the resulting flows from these approaches. To demonstrate the results of this paper, a scenario was developed in which empirical flows were built to demonstrate the operation of AWSCS, since algorithms for automatic composition are not readily available to test. The results allow us to study the behaviour of running composite Web services when flows with the same functionality but different problem-solving strategies are compared. Furthermore, we observed that both the load applied to the running system and the type of load submitted to the system are important factors in determining which approach to Web service composition can achieve the best performance in production.

  1. AUTOMATIC WEB SERVICE SELECTION BY OPTIMIZING COST OF COMPOSITION IN SLAKY COMPOSER USING ASSIGNMENT MINIMIZATION APPROACH

    Directory of Open Access Journals (Sweden)

    P. Sandhya

    2012-12-01

    Full Text Available Web service composition is a means of building enterprises virtually by knitting relevant web services together on the fly. Automatic web service composition is done dynamically at runtime. Extensive research has been done in the field of automatic web service composition; however, most works focus on providing client-oriented results, and hence there is little industry adoption of composition technology. In this paper we propose a new service collaboration stack that composes with realistic business metrics of a provider in addition to client metrics. Service provider metrics include time planning, profit management, native intelligence, user adoption, environment, market scenario, vision and industry adoption. In this paper we focus on enhancing industry adoption by optimizing the cost of service composition. We propose the SLAKY composer, which treats the assignment of appropriate services during composition as an assignment minimization problem in order to reduce the cost of composition. We also extend the OWL-S profile sub-ontology to augment cost as a service parameter.

  2. Agreement Workflow Tool (AWT)

    Data.gov (United States)

    Social Security Administration — The Agreement Workflow Tool (AWT) is a role-based Intranet application used for processing SSA's Reimbursable Agreements according to SSA's standards. AWT provides...

  3. Scientific workflows for bibliometrics

    NARCIS (Netherlands)

    Guler, A.T.; Waaijer, C.J.; Palmblad, M.

    2016-01-01

    Scientific workflows organize the assembly of specialized software into an overall data flow and are particularly well suited for multi-step analyses using different types of software tools. They are also favorable in terms of reusability, as previously designed workflows could be made publicly avai...

  4. Optimizing Workflow Data Footprint

    Directory of Open Access Journals (Sweden)

    Gurmeet Singh

    2007-01-01

    Full Text Available In this paper we examine the issue of optimizing disk usage and scheduling large-scale scientific workflows onto distributed resources where the workflows are data-intensive, requiring large amounts of data storage, and the resources have limited storage resources. Our approach is two-fold: we minimize the amount of space a workflow requires during execution by removing data files at runtime when they are no longer needed, and we demonstrate that workflows may have to be restructured to reduce the overall data footprint of the workflow. We show the results of our data management and workflow restructuring solutions using a Laser Interferometer Gravitational-Wave Observatory (LIGO) application and an astronomy application, Montage, running on a large-scale production grid, the Open Science Grid. We show that although reducing the data footprint of Montage by 48% can be achieved with dynamic data cleanup techniques, LIGO Scientific Collaboration workflows require additional restructuring to achieve a 56% reduction in data space usage. We also examine the cost of the workflow restructuring in terms of the application's runtime.
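
    The dynamic cleanup technique can be sketched with reference counting (an illustration of the idea, not the paper's exact algorithm; task and file names are made up): a file is deleted as soon as no remaining task lists it as an input.

      from collections import Counter

      tasks = [  # (task, input files, output files), in execution order
          ("t1", [],                       ["raw.fits"]),
          ("t2", ["raw.fits"],             ["cal.fits"]),
          ("t3", ["raw.fits"],             ["bg.fits"]),
          ("t4", ["cal.fits", "bg.fits"],  ["mosaic.fits"]),
      ]

      pending = Counter(f for _, ins, _ in tasks for f in ins)
      storage = set()

      for task, ins, outs in tasks:
          storage.update(outs)              # the task runs, producing outputs
          for f in ins:
              pending[f] -= 1
              if pending[f] == 0:           # no later task needs f
                  storage.discard(f)
          print(task, "footprint:", sorted(storage))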

  5. From Workflow to Interworkflow

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Workflow management systems are being introduced in many organizations to automate the business process. The initial emphasis of introducing a workflow management system is on its application to the workflow in a given organization. The next step is to interconnect the workflow across organizations. We call this interworkflow, and the total support technologies necessary for its realization the interworkflow management mechanism. Interworkflow is expected to serve as a supporting mechanism for Business-to-Business Electronic Commerce. We have proposed this management mechanism and confirmed its realization with a prototype. At the same time, the interface and the protocol for interconnecting heterogeneous workflow management systems have been standardized by the WfMC. So, we advance the project of the implementation of an interworkflow management system for practical use and its experimental proof.

  6. Benchmarking ETL Workflows

    Science.gov (United States)

    Simitsis, Alkis; Vassiliadis, Panos; Dayal, Umeshwar; Karagiannis, Anastasios; Tziovara, Vasiliki

    Extraction-Transform-Load (ETL) processes comprise complex data workflows, which are responsible for the maintenance of a Data Warehouse. A plethora of ETL tools is currently available constituting a multi-million dollar market. Each ETL tool uses its own technique for the design and implementation of an ETL workflow, making the task of assessing ETL tools extremely difficult. In this paper, we identify common characteristics of ETL workflows in an effort of proposing a unified evaluation method for ETL. We also identify the main points of interest in designing, implementing, and maintaining ETL workflows. Finally, we propose a principled organization of test suites based on the TPC-H schema for the problem of experimenting with ETL workflows.

  7. Coordination Research for Service Composition Based on Flexible Workflow

    Institute of Scientific and Technical Information of China (English)

    张静乐; 杨扬; 卡利勒; 王元卓; 赵晓永

    2011-01-01

    In this paper, we propose an approach to analyzing the coordinability of service compositions in flexible workflows. First, definitions of services and of service-composition coordinability are given. Then, a decision algorithm for coordinability is presented on the basis of the service-composition definition. Finally, taking the logistics subsystem of an e-commerce system as an example, and building on a Stochastic Petri Net model, we describe a method for automatic service composition based on service-coordinability decisions.

  8. Integrating configuration workflows with project management system

    Science.gov (United States)

    Nilsen, Dimitri; Weber, Pavel

    2014-06-01

    The complexity of the heterogeneous computing resources, services and recurring infrastructure changes at the GridKa WLCG Tier-1 computing center require a structured approach to configuration management and optimization of interplay between functional components of the whole system. A set of tools deployed at GridKa, including Puppet, Redmine, Foreman, SVN and Icinga, provides the administrative environment giving the possibility to define and develop configuration workflows, reduce the administrative effort and improve sustainable operation of the whole computing center. In this presentation we discuss the developed configuration scenarios implemented at GridKa, which we use for host installation, service deployment, change management procedures, service retirement etc. The integration of Puppet with a project management tool like Redmine provides us with the opportunity to track problem issues, organize tasks and automate these workflows. The interaction between Puppet and Redmine results in automatic updates of the issues related to the executed workflow performed by different system components. The extensive configuration workflows require collaboration and interaction between different departments like network, security, production etc. at GridKa. Redmine plugins developed at GridKa and integrated in its administrative environment provide an effective way of collaboration within the GridKa team. We present the structural overview of the software components, their connections, communication protocols and show a few working examples of the workflows and their automation.

  9. A framework for interoperability of BPEL-based workflows

    Institute of Scientific and Technical Information of China (English)

    Li Xitong; Fan Yushun; Huang Shuangxi

    2008-01-01

    With the prevalence of service-oriented architecture (SOA), web services have become the dominating technology for constructing workflow systems. As a workflow is the composition of a series of interrelated web services which realize its activities, the interoperability of workflows can be treated as the composition of web services. To address this, a framework for the interoperability of business process execution language (BPEL)-based workflows is presented, covering three phases: transformation, conformance testing and execution. The core components of the framework are proposed, with emphasis on how these components promote interoperability. In particular, dynamic binding and re-composition of workflows in terms of web service testing are presented. Finally, an example of business-to-business (B2B) collaboration is provided to illustrate how to perform composition and conformance testing.

  10. Scientific workflows for bibliometrics.

    Science.gov (United States)

    Guler, Arzu Tugce; Waaijer, Cathelijn J F; Palmblad, Magnus

    Scientific workflows organize the assembly of specialized software into an overall data flow and are particularly well suited for multi-step analyses using different types of software tools. They are also favorable in terms of reusability, as previously designed workflows could be made publicly available through the myExperiment community and then used in other workflows. We here illustrate how scientific workflows and the Taverna workbench in particular can be used in bibliometrics. We discuss the specific capabilities of Taverna that make this software a powerful tool in this field, such as automated data import via Web services, data extraction from XML by XPaths, and statistical analysis and visualization with R. The support of the latter is particularly relevant, as it allows integration of a number of recently developed R packages specifically for bibliometrics. Examples are used to illustrate the possibilities of Taverna in the fields of bibliometrics and scientometrics.
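
    One of the Taverna capabilities named above, extracting values from Web-service XML via XPath-style queries, is easy to picture outside Taverna. The following sketch uses Python's standard library on a made-up bibliometric XML payload; the element names are illustrative only.

    ```python
    # Sketch of XPath-style extraction from Web-service XML, here with the
    # Python standard library for illustration (hypothetical payload).
    import xml.etree.ElementTree as ET

    xml = """<records>
      <record><title>Paper A</title><citations>12</citations></record>
      <record><title>Paper B</title><citations>7</citations></record>
    </records>"""
    root = ET.fromstring(xml)
    counts = [int(r.findtext("citations")) for r in root.findall("./record")]
    print("total citations:", sum(counts))
    ```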

  11. An Automatic Web Service Composition Framework Using QoS-Based Web Service Ranking Algorithm.

    Science.gov (United States)

    Mallayya, Deivamani; Ramachandran, Baskaran; Viswanathan, Suganya

    2015-01-01

    Web services have become the technology of choice for service-oriented computing to meet the interoperability demands in web applications. In the Internet era, the exponential addition of web services nominates "quality of service" as an essential parameter for discriminating between web services. In this paper, a user preference based web service ranking (UPWSR) algorithm is proposed to rank web services based on user preferences and the QoS aspects of the web service. When the user's request cannot be fulfilled by a single atomic service, several existing services should be composed and delivered as a composition. The proposed framework allows the user to specify local and global constraints for composite web services, which improves flexibility. The UPWSR algorithm identifies best-fit services for each task in the user request and, by limiting the number of candidate services for each task, reduces the time to generate the composition plans. To tackle the problem of web service composition, the QoS aware automatic web service composition (QAWSC) algorithm proposed in this paper is based on the QoS aspects of the web services and user preferences. The proposed framework also allows the user to provide feedback about the composite service, which improves the reputation of the services.
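
    The ranking step of such a framework can be pictured as a weighted-sum scorer over QoS attributes. The sketch below is a generic illustration under assumed attributes and weights, not the UPWSR algorithm itself.

    ```python
    # Sketch of QoS-based service ranking (hypothetical attributes and
    # weights; the paper's exact scoring model may differ).
    def rank_services(candidates, prefs):
        """candidates: {name: {attribute: value}}; prefs: {attribute: weight}.
        Higher-is-better attributes score directly; latency is inverted."""
        def score(qos):
            s = 0.0
            for attr, w in prefs.items():
                v = qos[attr]
                s += w * (1.0 / v if attr == "latency_ms" else v)
            return s
        return sorted(candidates, key=lambda n: score(candidates[n]), reverse=True)

    candidates = {
        "svcA": {"availability": 0.99, "reputation": 4.2, "latency_ms": 120},
        "svcB": {"availability": 0.95, "reputation": 4.8, "latency_ms": 80},
    }
    print(rank_services(candidates,
                        {"availability": 0.5, "reputation": 0.3, "latency_ms": 0.2}))
    ```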

  12. A Model for Semi-Automatic Composition of Educational Content from Open Repositories of Learning Objects

    Directory of Open Access Journals (Sweden)

    Paula Andrea Rodríguez Marín

    2014-04-01

    Full Text Available Learning object (LO) repositories are important in building educational content and should allow search, retrieval and composition processes to be successfully developed to reach educational goals. However, such processes are very time-consuming and do not always provide the desired results. Thus, the aim of this paper is to propose a model for the semiautomatic composition of LOs, which are automatically recovered from open repositories. For the development of the model, various text similarity measures are discussed, while for calibration and validation some comparison experiments were performed using the results obtained by teachers. Experimental results show that when using a value of k (the number of LOs selected) of at least 3, the percentage of similarity between the compositions produced by the model and those made by experts exceeds 75%. To conclude, it can be established that the proposed model allows teachers to save time and effort in LO selection by performing a pre-filtering process.
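
    A minimal example of one text-similarity measure of the kind the model compares is cosine similarity over term counts. The sketch below is illustrative; the paper evaluates several measures, and the query and LO strings are invented.

    ```python
    # Sketch of cosine similarity over term counts, one of the simplest
    # text-similarity measures of the kind compared in the paper.
    import math
    from collections import Counter

    def cosine(a: str, b: str) -> float:
        va, vb = Counter(a.lower().split()), Counter(b.lower().split())
        dot = sum(va[t] * vb[t] for t in set(va) & set(vb))
        na = math.sqrt(sum(c * c for c in va.values()))
        nb = math.sqrt(sum(c * c for c in vb.values()))
        return dot / (na * nb) if na and nb else 0.0

    query = "introduction to recursion in programming"
    lo = "learning object: recursion basics for programming courses"
    print(f"similarity = {cosine(query, lo):.3f}")
    ```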

  13. All-automatic swimmer tracking system based on an optimized scaled composite JTC technique

    Science.gov (United States)

    Benarab, D.; Napoléon, T.; Alfalou, A.; Verney, A.; Hellard, P.

    2016-04-01

    In this paper, an all-automatic optimized JTC-based swimmer tracking system is proposed and evaluated on a real video database drawn from national and international swimming competitions (French National Championships, Limoges 2015; FINA World Championships, Barcelona 2013 and Kazan 2015). First, we propose to calibrate the swimming pool using the DLT algorithm (Direct Linear Transformation). DLT calculates the homography matrix given a sufficient set of correspondence points between pixel and metric coordinates: i.e. DLT takes into account the dimensions of the swimming pool and the type of the swim. Once the swimming pool is calibrated, we extract the lane. Then we apply a motion detection approach to detect the swimmer globally in this lane. Next, we apply our optimized Scaled Composite JTC, which consists of creating an adapted input plane that contains the predicted region and the head reference image. The latter is generated using a composite filter of fin images chosen from the database. The dimension of this reference is scaled according to the ratio between the head's dimension and the width of the swimming lane. Finally, the proposed approach improves the performance of our previous tracking method by adding a detection module, thereby achieving an all-automatic swimmer tracking system.
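
    The DLT calibration step mentioned here has a standard linear-algebra core: from at least four pixel-to-metric point correspondences, the homography is recovered as the null space of a stacked linear system. A NumPy sketch follows; the correspondence points are invented, and the full tracker additionally performs JTC correlation.

    ```python
    # Sketch of DLT homography estimation: map pixel coordinates to pool
    # (metric) coordinates from >= 4 correspondences. Points are invented.
    import numpy as np

    def dlt_homography(px, metric):
        """px, metric: (N, 2) arrays of corresponding points, N >= 4."""
        rows = []
        for (x, y), (X, Y) in zip(px, metric):
            rows.append([-x, -y, -1, 0, 0, 0, X * x, X * y, X])
            rows.append([0, 0, 0, -x, -y, -1, Y * x, Y * y, Y])
        _, _, vt = np.linalg.svd(np.asarray(rows, float))
        return vt[-1].reshape(3, 3)    # null-space vector -> H (up to scale)

    px = np.array([[100, 50], [900, 60], [880, 500], [120, 480]])
    metric = np.array([[0, 0], [50, 0], [50, 25], [0, 25]])  # 50 m x 25 m pool
    H = dlt_homography(px, metric)
    p = H @ np.array([500, 270, 1.0])
    print("pixel (500,270) ->", p[:2] / p[2])                # metric coords
    ```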

  14. CaGrid Workflow Toolkit: A taverna based workflow tool for cancer grid

    Directory of Open Access Journals (Sweden)

    Sulakhe Dinanath

    2010-11-01

    Full Text Available Abstract Background In the biological and medical domains, the use of web services has made data and computational functionality accessible in a unified manner, which has helped automate data pipelines that were previously operated manually. Workflow technology is widely used in the orchestration of multiple services to facilitate in-silico research. The Cancer Biomedical Informatics Grid (caBIG) is an information network enabling the sharing of cancer research related resources, and caGrid is its underlying service-based computation infrastructure. caBIG requires that services are composed and orchestrated in a given sequence to realize data pipelines, which are often called scientific workflows. Results caGrid selected Taverna as its workflow execution system of choice due to its integration with web service technology, its support for a wide range of web services, and its plug-in architecture catering for easy integration of third-party extensions. The caGrid Workflow Toolkit (or the toolkit for short), an extension to the Taverna workflow system, is designed and implemented to ease building and running caGrid workflows. It provides users with support for various phases in using workflows: service discovery, composition and orchestration, data access, and secure service invocation, which have been identified by the caGrid community as challenging in a multi-institutional and cross-discipline domain. Conclusions By extending the Taverna Workbench, the caGrid Workflow Toolkit provides a comprehensive solution to compose and coordinate services in caGrid, which would otherwise remain isolated and disconnected from each other. Using it, users can access more than 140 services and are offered a rich set of features including discovery of data and analytical services, query and transfer of data, security protections for service invocations, state management in service interactions, and sharing of workflows, experiences and best practices. The proposed solution is

  15. Mining workflow processes from distributed workflow enactment event logs

    Directory of Open Access Journals (Sweden)

    Kwanghoon Pio Kim

    2012-12-01

    Full Text Available Workflow management systems help to execute, monitor and manage work process flow and execution. These systems, as they execute, keep a record of who does what and when (e.g. a log of events). The activity of using computer software to examine these records, and deriving various structural data results from them, is called workflow mining. The workflow mining activity, in general, needs to encompass behavioral (process/control-flow), social, informational (data-flow), and organizational perspectives, as well as others, because workflow systems are "people systems" that must be designed, deployed, and understood within their social and organizational contexts. This paper particularly focuses on mining the behavioral aspect of workflows from XML-based workflow enactment event logs, which are vertically (semantic-driven distribution) or horizontally (syntactic-driven distribution) distributed over the networked workflow enactment components. That is, this paper proposes distributed workflow mining approaches that are able to rediscover ICN-based structured workflow process models by incrementally amalgamating a series of vertically or horizontally fragmented temporal workcases. Each of the approaches consists of a temporal fragment discovery algorithm, which is able to discover a set of temporal fragment models from the fragmented workflow enactment event logs, and a workflow process mining algorithm, which rediscovers a structured workflow process model from the discovered temporal fragment models. Here, the temporal fragment model represents the concrete model of the XML-based distributed workflow fragment event log.
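
    The control-flow side of such mining starts from a simple building block: counting direct-follows relations between activities across workcases. A minimal sketch follows, over an invented in-memory log rather than the paper's XML-based, fragmented enactment logs.

    ```python
    # Sketch of the first step of control-flow mining: deriving direct-follows
    # relations from enactment logs. Log contents are illustrative.
    from collections import defaultdict

    # Each workcase (process instance) as an ordered list of completed activities.
    workcases = [
        ["register", "check", "approve", "archive"],
        ["register", "check", "reject", "archive"],
    ]
    follows = defaultdict(int)
    for trace in workcases:
        for a, b in zip(trace, trace[1:]):
            follows[(a, b)] += 1

    for (a, b), n in sorted(follows.items()):
        print(f"{a} -> {b}: {n}")
    ```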

  16. Chang'E-3 data pre-processing system based on scientific workflow

    Science.gov (United States)

    tan, xu; liu, jianjun; wang, yuanyuan; yan, wei; zhang, xiaoxia; li, chunlai

    2016-04-01

    The Chang'E-3 (CE3) mission has obtained a huge amount of lunar scientific data. Data pre-processing is an important segment of the CE3 ground research and application system. With a dramatic increase in the demand for data research and application, a Chang'E-3 data pre-processing system (CEDPS) based on scientific workflow is proposed for the purpose of making scientists more flexible and productive through data-driven automation. The system should allow the planning, conduct and control of the data processing procedure with the following capabilities: • describe a data processing task, including: 1) define input and output data, 2) define the data relationships, 3) define the sequence of tasks, 4) define the communication between tasks, 5) define mathematical formulas, 6) define the relationship between tasks and data; • automatic processing of tasks. Accordingly, describing a task is the key point of whether the system is flexible. We design a workflow designer, a visual environment for capturing processes as workflows, and discuss its three-level model: 1) the data relationships are established through a product tree; 2) the process model is constructed based on a directed acyclic graph (DAG); in particular, a set of process workflow constructs, including Sequence, Loop, Merge and Fork, are compositional with one another; 3) to reduce the modeling complexity of the mathematical formulas using a DAG, semantic modeling based on MathML is approached. On top of that, we present how the CE3 data are processed with CEDPS.
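
    The DAG-based process model described here implies a standard execution discipline: order tasks so that every task runs after its prerequisites, and reject cyclic definitions. A small sketch using Python's standard library follows; the task names are illustrative, not the actual CE3 pre-processing steps.

    ```python
    # Sketch of DAG-based task ordering for a data-processing workflow
    # (illustrative task names, not the actual CE3 pipeline).
    from graphlib import TopologicalSorter   # Python 3.9+

    deps = {                                 # task -> set of prerequisite tasks
        "ingest": set(),
        "radiometric_correction": {"ingest"},
        "geometric_correction": {"radiometric_correction"},
        "mosaic": {"geometric_correction"},
    }
    order = list(TopologicalSorter(deps).static_order())
    print("execution order:", order)         # raises CycleError if not a DAG
    ```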

  17. Automatic 1H-NMR Screening of Fatty Acid Composition in Edible Oils

    Directory of Open Access Journals (Sweden)

    David Castejón

    2016-02-01

    Full Text Available In this work, we introduce an NMR-based screening method for the fatty acid composition analysis of edible oils. We describe the evaluation and optimization needed for the automated analysis of vegetable oils by low-field NMR to obtain the fatty acid composition (FAC). To achieve this, two scripts, which automatically analyze and interpret the spectral data, were developed. The objective of this work was to drive forward the automated analysis of the FAC by NMR. Due to the fact that this protocol can be carried out at low field and that the complete process from sample preparation to printing the report only takes about 3 min, this approach is promising to become a fundamental technique for high-throughput screening. To demonstrate the applicability of this method, the fatty acid composition of extra virgin olive oils from various Spanish olive varieties (arbequina, cornicabra, hojiblanca, manzanilla, and picual) was determined by 1H-NMR spectroscopy according to this protocol.
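
    The final arithmetic of such screening scripts, turning integrated signal areas into a fatty acid composition, can be sketched as below. The signal choices and equations are illustrative assumptions for the sketch, not the calibrated equations of the cited protocol.

    ```python
    # Hedged sketch of converting integrated 1H-NMR signal areas into a fatty
    # acid composition. Signals and equations are illustrative assumptions,
    # not the calibrated equations of the cited protocol.
    def fatty_acid_composition(area_linolenic, area_linoleic, area_olefinic,
                               area_methyl):
        """All areas assumed normalized to the same proton count per signal."""
        total = area_methyl                      # all acyl chains contribute
        ln = area_linolenic / total              # linolenic fraction
        l = area_linoleic / total                # linoleic fraction
        unsat = area_olefinic / total
        oleic = max(unsat - l - ln, 0.0)         # remaining unsaturation
        saturated = max(1.0 - oleic - l - ln, 0.0)
        return {"linolenic": ln, "linoleic": l, "oleic": oleic,
                "saturated": saturated}

    print(fatty_acid_composition(0.01, 0.08, 0.85, 1.00))
    ```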

  18. Automatic 1H-NMR Screening of Fatty Acid Composition in Edible Oils

    Science.gov (United States)

    Castejón, David; Fricke, Pascal; Cambero, María Isabel; Herrera, Antonio

    2016-01-01

    In this work, we introduce an NMR-based screening method for the fatty acid composition analysis of edible oils. We describe the evaluation and optimization needed for the automated analysis of vegetable oils by low-field NMR to obtain the fatty acid composition (FAC). To achieve this, two scripts, which automatically analyze and interpret the spectral data, were developed. The objective of this work was to drive forward the automated analysis of the FAC by NMR. Due to the fact that this protocol can be carried out at low field and that the complete process from sample preparation to printing the report only takes about 3 min, this approach is promising to become a fundamental technique for high-throughput screening. To demonstrate the applicability of this method, the fatty acid composition of extra virgin olive oils from various Spanish olive varieties (arbequina, cornicabra, hojiblanca, manzanilla, and picual) was determined by 1H-NMR spectroscopy according to this protocol. PMID:26891323

  19. Feature-point-extracting-based automatically mosaic for composite microscopic images

    Institute of Scientific and Technical Information of China (English)

    YIN YanSheng; ZHAO XiuYang; TIAN XiaoFeng; LI Jia

    2007-01-01

    Image mosaic is a crucial step in the three-dimensional reconstruction of composite materials, aligning the serial images. A novel method is adopted to mosaic two SiC/Al microscopic images with an amplification coefficient of 1000. The two images are denoised with a Gaussian model, and feature points are then extracted using the Harris corner detector. The feature points are filtered through a Canny edge detector. A 40x40 feature template is chosen by sowing a seed in an overlapped area of the reference image, and the homologous region in the floating image is acquired automatically by means of correlation analysis. The feature points in the matched templates are used as feature point-sets. Using the transformation parameters acquired by the SVD-ICP method, the two images are transformed into universal coordinates and merged into the final mosaic image.
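
    The matching pipeline described here (Gaussian denoising, Harris corners, template correlation) maps directly onto common OpenCV calls. The sketch below runs on synthetic images and uses illustrative parameters; it omits the Canny filtering and SVD-ICP steps of the full method.

    ```python
    # Sketch of the feature/template matching step with OpenCV counterparts:
    # Harris corners plus normalized cross-correlation. Parameters illustrative.
    import cv2
    import numpy as np

    # Synthetic stand-ins for the two micrographs (a real run would load the
    # SiC/Al images instead).
    rng = np.random.default_rng(1)
    ref = (rng.random((200, 200)) * 255).astype(np.uint8)
    flt = np.roll(ref, (12, 7), axis=(0, 1))          # shifted floating image

    ref_s = cv2.GaussianBlur(ref, (5, 5), 1.0)        # denoise (Gaussian model)
    corners = cv2.cornerHarris(np.float32(ref_s), 2, 3, 0.04)
    ys, xs = np.where(corners > 0.01 * corners.max()) # candidate feature points

    y, x = int(ys[0]), int(xs[0])                     # seed one 40x40 template
    tmpl = ref_s[y:y + 40, x:x + 40]
    res = cv2.matchTemplate(cv2.GaussianBlur(flt, (5, 5), 1.0), tmpl,
                            cv2.TM_CCORR_NORMED)
    _, score, _, loc = cv2.minMaxLoc(res)
    print(f"template from ({x},{y}) matches at {loc}, correlation {score:.3f}")
    ```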

  20. Semi-automatic web service composition for the life sciences using the BioMoby semantic web framework.

    Science.gov (United States)

    DiBernardo, Michael; Pottinger, Rachel; Wilkinson, Mark

    2008-10-01

    Researchers in the life-sciences are currently limited to small-scale informatics experiments and analyses because of the lack of interoperability among life-sciences web services. This limitation can be addressed by annotating services and their interfaces with semantic information, so that interoperability problems can be reasoned about programmatically. The Moby semantic web framework is a popular and mature platform that is used for this purpose. However, the number of services that are available to select from when building a workflow is becoming unmanageable for users. As such, attempts have been made to assist with service selection and composition. These tasks fall under the general label of automated service composition. We present a prototype workflow assembly client that reduces the number of choices that users have to make by (1) restricting the overall set of services presented to them and (2) ranking services so that the most desirable ones are presented first. We demonstrate via an evaluation of this prototype that a unification of relatively simple techniques can rank desirable services highly while maintaining interactive response times.

  1. Insightful Workflow For Grid Computing

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Charles Earl

    2008-10-09

    We developed a workflow adaptation and scheduling system for Grid workflows. The system currently interfaces with and uses the Karajan workflow system. We developed machine learning agents that provide the planner/scheduler with information needed to make decisions about when and how to replan. The Kubrick system restructures workflows at runtime, making it unique among workflow scheduling systems. The existing Kubrick system provides a platform on which to integrate additional quality-of-service constraints and in which to explore the use of an ensemble of scheduling and planning algorithms. This will be the principal thrust of our Phase II work.

  2. PRODUCT-ORIENTED WORKFLOW MANAGEMENT IN CAPP

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    A product-oriented process workflow management model is proposed based on multi-agent technology. The autonomy, inter-operability, scalability and flexibility of agents are used to coordinate the whole process planning and achieve full sharing of resources and information. Thus, unnecessary waste of human labor, time and work is reduced, and the computer-aided process planning (CAPP) system's adaptability and stability are improved. In the detailed implementation, according to the product's BOM (bill of materials) in structural design, task assignment, management control, automatic process making, process examination and process sanction are combined into a unified management to make adjustment, control and management convenient.

  3. From chart tracking to workflow management.

    Science.gov (United States)

    Srinivasan, P; Vignes, G; Venable, C; Hazelwood, A; Cade, T

    1994-01-01

    The current interest in system-wide integration appears to be based on the assumption that an organization, by digitizing information and accepting a common standard for the exchange of such information, will improve the accessibility of this information and automatically experience benefits resulting from its more productive use. We do not dispute this reasoning, but assert that an organization's capacity for effective change is proportional to the understanding of the current structure among its personnel. Our workflow manager is based on the use of a Parameterized Petri Net (PPN) model which can be configured to represent an arbitrarily detailed picture of an organization. The PPN model can be animated to observe the model organization in action, and the results of the animation analyzed. This simulation is a dynamic ongoing process which changes with the system and allows members of the organization to pose "what if" questions as a means of exploring opportunities for change. We present the "workflow management system" as the natural successor to the tracking program, incorporating modeling, scheduling, reactive planning, performance evaluation, and simulation. This workflow management system is more than adequate for meeting the needs of a paper chart tracking system and, as the patient record is computerized, will serve as a planning and evaluation tool in converting the paper-based health information system into a computer-based system.

  4. Web API for biology with a workflow navigation system.

    Science.gov (United States)

    Kwon, Yeondae; Shigemoto, Yasumasa; Kuwana, Yoshikazu; Sugawara, Hideaki

    2009-07-01

    DNA Data Bank of Japan (DDBJ) provides Web-based systems for biological analysis, called Web APIs for biology (WABI). So far, we have developed over 20 SOAP services and several workflows that consist of a series of method invocations. In this article, we present newly developed services of WABI, that is, REST-based Web services, additional workflows and a workflow navigation system. Each Web service and workflow can be used as a complete service or a building block for programmers to construct more complex information processing systems. The workflow navigation system aims to help non-programming biologists perform analysis tasks by providing next applicable services on Web browsers according to the output of a previously selected service. With this function, users can apply multiple services consecutively only by following links without any programming or manual copy-and-paste operations on Web browsers. The listed services are determined automatically by the system referring to the dictionaries of service categories, the input/output types of services and HTML tags. WABI and the workflow navigation system are freely accessible at http://www.xml.nig.ac.jp/index.html and http://cyclamen.ddbj.nig.ac.jp/, respectively.

  5. Approach to multi-granularity resource composition based on workflow in cloud manufacturing

    Institute of Scientific and Technical Information of China (English)

    李海波

    2013-01-01

    Integrating physical manufacturing resources and transforming them into logical manufacturing resources is the main task of resource virtualization in cloud manufacturing, and at present there is no practical solution to this problem. Based on the mining of workflow logs and the definition of activities with resource configurations in the workflow model, the service frequencies of physical manufacturing resource compositions are calculated at two levels, the business process and the activity instance, and the two results are merged to obtain a multi-granularity composition set. This approach can provide reliable support for the mapping from physical to virtual manufacturing resources. Finally, an experiment is presented to verify the effectiveness of the method.
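
    The frequency computation described can be pictured as counting how often the same set of physical resources appears together, per activity, across log entries. A toy sketch follows; the log format and names are invented.

    ```python
    # Sketch of counting resource-composition frequencies per activity from a
    # workflow log (invented format and names).
    from collections import Counter

    # (process instance, activity, resources used together)
    log = [
        ("case1", "milling", frozenset({"machineA", "fixture1"})),
        ("case2", "milling", frozenset({"machineA", "fixture1"})),
        ("case3", "milling", frozenset({"machineB", "fixture1"})),
    ]
    freq = Counter((act, res) for _, act, res in log)
    for (act, res), n in freq.most_common():
        print(act, sorted(res), n)   # most frequent compositions first
    ```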

  6. Workflow Management in Electronic Commerce

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Spaccapietra, S.; March, S.T.; Kambayashi, Y.

    2002-01-01

    This tutorial addresses the application of workflow management (WFM) for process support in both these cases. The tutorial is organized into three parts. In the first part, we pay attention to (classical) workflow management in the context of a single organization. In the second part, we extend this

  7. A workflow for digitalization projects

    OpenAIRE

    De Mulder, Tom

    2005-01-01

    More and more institutions want to convert their traditional content to digital formats. In such projects the digitalization and metadata stages often happen asynchronously. This paper identifies the importance of frequent cross-verification of both. We suggest a workflow to formalise this process, and a possible technical implementation to automate this workflow.

  8. Automating Workflow using Dialectical Argumentation

    NARCIS (Netherlands)

    Urovi, Visara; Bromuri, Stefano; McGinnis, Jarred; Stathis, Kostas; Omicini, Andrea

    2008-01-01

    This paper presents a multi-agent framework based on argumentative agent technology for the automation of workflow selection and execution. In this framework, workflow selection is coordinated by agent interactions governed by the rules of a dialogue game whose purpose is to evaluate the workflow.

  9. Monitoring of Grid scientific workflows

    NARCIS (Netherlands)

    Balis, B.; Bubak, M.; Łabno, B.

    2008-01-01

    Scientific workflows are a means of conducting in silico experiments in modern computing infrastructures for e-Science, often built on top of Grids. Monitoring of Grid scientific workflows is essential not only for performance analysis but also to collect provenance data and gather feedback useful i

  10. Constructing Workflows from Script Applications

    Directory of Open Access Journals (Sweden)

    Mikołaj Baranowski

    2012-01-01

    Full Text Available For programming and executing complex applications on grid infrastructures, scientific workflows have been proposed as a convenient high-level alternative to solutions based on general-purpose programming languages, APIs and scripts. GridSpace is a collaborative programming and execution environment based on a scripting approach; it extends the Ruby language with a high-level API for invoking operations on remote resources. In this paper we describe a tool which converts GridSpace application source code into a workflow representation which, in turn, may be used for scheduling, provenance, or visualization. We describe how we addressed the issues of analyzing Ruby source code, resolving variable and method dependencies, and building the workflow representation. The solutions to these problems have been developed and evaluated by testing them on complex grid application workflows such as CyberShake, Epigenomics and Montage. The evaluation is enriched by representing typical workflow control-flow patterns.

  11. Concurrency & Asynchrony in Declarative Workflows

    DEFF Research Database (Denmark)

    Debois, Søren; Hildebrandt, Thomas; Slaats, Tijs

    2015-01-01

    Declarative or constraint-based business process and workflow notations have received increasing interest in the last decade as possible means of addressing the challenge of supporting at the same time flexibility in execution, adaptability and compliance. However, the definition of concurrent... of concurrency in DCR Graphs admits asynchronous execution of declarative workflows both conceptually and by reporting on a prototype implementation of a distributed declarative workflow engine. Both the theoretical development and the implementation are supported by an extended example; moreover, the theoretical...

  12. Faces from the web: automatic selection and composition of media for casual screen consumption and printed artwork

    Science.gov (United States)

    Cheatle, Phil; Greig, Darryl; Slatter, David

    2010-02-01

    Web image search engines facilitate the production of image sets in which faces appear. Many people enjoy producing and sharing media collections of this type and generating new images or video experiences. Skilled practitioners produce visually appealing artifacts from such collections but few users have the time or creative ability to do so. The problem is to automatically create an image or ambient experience which sustains interest. A full solution requires agreements with copyright holders and input from graphics designers. We address the underlying technical problems of extraction and composition. We describe an automatic system that identifies regions containing human faces in each image of an image set resulting from a web search. The face regions are composed into dynamically synthesized multilayer graphical backgrounds. The aesthetic aspects of the composition are controlled by active templates. These aspects include face size and positioning but also face identity and number of faces in a group. The output structure is multi layer supporting both the generation of static images and video consisting of transitions between the compositions.

  13. Summer Student Report - AV Workflow

    CERN Document Server

    Abramson, Jessie

    2014-01-01

    The AV Workflow is a web application which allows CERN users to publish, update and delete videos from CDS. During my summer internship I implemented the backend of the new version of the AV Workflow in Python using the Django framework.

  14. Agile parallel bioinformatics workflow management using Pwrake

    Directory of Open Access Journals (Sweden)

    Tanaka Masahiro

    2011-09-01

    Full Text Available Abstract Background In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environment are often prioritized in scientific workflow management. These features have a greater affinity with the agile software development method through iterative development phases after trial and error. Here, we show the application of a scientific workflow system Pwrake to bioinformatics workflows. Pwrake is a parallel workflow extension of Ruby's standard build tool Rake, the flexibility of which has been demonstrated in the astronomy domain. Therefore, we hypothesize that Pwrake also has advantages in actual bioinformatics workflows. Findings We implemented the Pwrake workflows to process next generation sequencing data using the Genomic Analysis Toolkit (GATK and Dindel. GATK and Dindel workflows are typical examples of sequential and parallel workflows, respectively. We found that in practice, actual scientific workflow development iterates over two phases, the workflow definition phase and the parameter adjustment phase. We introduced separate workflow definitions to help focus on each of the two developmental phases, as well as helper methods to simplify the descriptions. This approach increased iterative development efficiency. Moreover, we implemented combined workflows to demonstrate modularity of the GATK and Dindel workflows. Conclusions Pwrake enables agile management of scientific workflows in the bioinformatics domain. The internal domain specific language design built on Ruby gives the flexibility of rakefiles for writing scientific workflows

  15. Standards for business analytics and departmental workflow.

    Science.gov (United States)

    Erickson, Bradley J; Meenan, Christopher; Langer, Steve

    2013-02-01

    Efficient workflow is essential for a successful business. However, there is relatively little literature on analytical tools and standards for defining workflow and measuring workflow efficiency. Here, we describe an effort to define a workflow lexicon for medical imaging departments, including the rationale, the process, and the resulting lexicon.

  16. Precise Quantitative Analysis of Probabilistic Business Process Model and Notation Workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2013-01-01

    We present a framework for modeling and analysis of real-world business workflows. We present a formalized core subset of the business process modeling and notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... the entire BPMN language, allow for more complex annotations and ultimately to automatically synthesize workflows by composing predefined subprocesses, in order to achieve a configuration that is optimal for parameters of interest....

  17. Scientific Workflows + Provenance = Better (Meta-)Data Management

    Science.gov (United States)

    Ludaescher, B.; Cuevas-Vicenttín, V.; Missier, P.; Dey, S.; Kianmajd, P.; Wei, Y.; Koop, D.; Chirigati, F.; Altintas, I.; Belhajjame, K.; Bowers, S.

    2013-12-01

    The origin and processing history of an artifact is known as its provenance. Data provenance is an important form of metadata that explains how a particular data product came about, e.g., how and when it was derived in a computational process, which parameter settings and input data were used, etc. Provenance information provides transparency and helps to explain and interpret data products. Other common uses and applications of provenance include quality control, data curation, result debugging, and more generally, 'reproducible science'. Scientific workflow systems (e.g. Kepler, Taverna, VisTrails, and others) provide controlled environments for developing computational pipelines with built-in provenance support. Workflow results can then be explained in terms of workflow steps, parameter settings, input data, etc. using provenance that is automatically captured by the system. Scientific workflows themselves provide a user-friendly abstraction of the computational process and are thus a form of ('prospective') provenance in their own right. The full potential of provenance information is realized when combining workflow-level information (prospective provenance) with trace-level information (retrospective provenance). To this end, the DataONE Provenance Working Group (ProvWG) has developed an extension of the W3C PROV standard, called D-PROV. Whereas PROV provides a 'least common denominator' for exchanging and integrating provenance information, D-PROV adds new 'observables' that describe workflow-level information (e.g., the functional steps in a pipeline), as well as workflow-specific trace-level information (timestamps for each workflow step executed, the inputs and outputs used, etc.). Using examples, we will demonstrate how the combination of prospective and retrospective provenance provides added value in managing scientific data. The DataONE ProvWG is also developing tools based on D-PROV that allow scientists to get more mileage from provenance metadata
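
    The prospective/retrospective distinction can be made concrete in a few lines of code: the declared pipeline is the prospective side, and a trace recorded during execution is the retrospective side. The sketch below uses an invented record structure, not the D-PROV vocabulary itself.

    ```python
    # Sketch: prospective provenance (declared steps) plus retrospective
    # provenance (what actually ran). Record structure is illustrative.
    import time

    pipeline = [("clean", lambda x: x.strip()), ("upper", lambda x: x.upper())]
    trace = []                                   # retrospective provenance

    data = "  hello provenance  "
    for step_name, fn in pipeline:               # prospective: declared steps
        t0 = time.time()
        out = fn(data)
        trace.append({"step": step_name, "input": data, "output": out,
                      "started": t0, "ended": time.time()})
        data = out

    for rec in trace:
        print(rec["step"], "->", rec["output"])
    ```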

  18. Observing health professionals' workflow patterns for diabetes care - First steps towards an ontology for EHR services.

    Science.gov (United States)

    Schweitzer, M; Lasierra, N; Hoerbst, A

    2015-01-01

    Increasing flexibility from a user perspective and enabling workflow-based interaction facilitates an easy, user-friendly utilization of EHRs in healthcare professionals' daily work. To offer such versatile EHR functionality, our approach is based on the execution of clinical workflows by means of a composition of semantic web services. The backbone of such an architecture is an ontology which enables the representation of clinical workflows and facilitates the selection of suitable services. In this paper we present the methods and results of observations of routine diabetes consultations, which were conducted in order to identify those workflows and the relations among the included tasks. The workflows were first modeled with BPMN and then generalized. As a following step in our study, interviews will be conducted with clinical personnel to validate the modeled workflows.

  19. Development of a beam builder for automatic fabrication of large composite space structures

    Science.gov (United States)

    Bodle, J. G.

    1979-01-01

    The composite material beam builder, which will produce triangular beams from pre-consolidated graphite/glass/thermoplastic composite material through automated mechanical processes, is presented; side-member storage, feed and positioning, ultrasonic welding, and beam cutoff are performed. Each process lends itself to modular subsystem development. Initial development is concentrated on the key processes of roll forming and ultrasonic welding of composite thermoplastic materials. The construction and test of an experimental roll forming machine and ultrasonic welding process control techniques are described.

  20. Within day variation in fatty acid composition of milk from cows in an automatic milking system

    DEFF Research Database (Denmark)

    Larsen, Mette Krogh; Weisbjerg, Martin Riis; Kristensen, Camilla Bjerg;

    2012-01-01

    Milk fatty acid composition is influenced by a range of conditions such as breed, feeding, and stage of lactation. Knowledge of milk fatty acid composition of individual cows would make it possible to sort milk at farm level according to certain fatty acid specifications. In the present study, 22...

  1. Office 2010 Workflow Developing Collaborative Solutions

    CERN Document Server

    Mann, David; Enterprises, Creative

    2010-01-01

    Workflow is the glue that binds information worker processes, users, and artifacts. Without workflow, information workers are just islands of data and potential. Office 2010 Workflow details how to implement workflow in SharePoint 2010 and the client Microsoft Office 2010 suite to help information workers share data, enforce processes and business rules, and work more efficiently together or solo. This book covers everything you need to know, from what workflow is all about to creating new activities; from the SharePoint Designer to Visual Studio 2010; from out-of-the-box workflows to state machine workflows.

  2. A Novel Verification Approach of Workflow Schema

    Institute of Scientific and Technical Information of China (English)

    WANG Guangqi; WANG Juying; WANG Yan; SONG Baoyan; YU Ge

    2006-01-01

    A workflow schema is an abstract description of the business processed by a workflow model, and plays a critical role in analyzing, executing and reorganizing business processes. Verifying the correctness of complicated workflow schemas is a difficult issue in the workflow field, and we study it intensively in this paper. We describe local errors and schema logic errors (global errors) in workflow schemas in detail, and offer some constraint rules that help avoid schema errors during modeling. In addition, we propose a verification approach based on graph reduction and graph spread, and give the algorithm. The algorithm is implemented in a workflow prototype system, e-ScopeWork.

  3. Automatic Determination of Fiber-Length Distribution in Composite Material Using 3D CT Data

    Directory of Open Access Journals (Sweden)

    Günther Greiner

    2010-01-01

    Full Text Available Determining fiber length distribution in fiber reinforced polymer components is a crucial step in quality assurance, since fiber length has a strong influence on the overall strength, stiffness, and stability of the material. The approximate fiber length distribution is usually determined early in the development process, as conventional methods require destruction of the sample component. In this paper, a novel, automatic, and nondestructive approach for the determination of fiber length distribution in fiber reinforced polymers is presented. For this purpose, high-resolution computed tomography is used as the imaging method, together with subsequent image analysis for evaluation. The image analysis consists of an iterative process where single fibers are detected automatically in each iteration step after image enhancement algorithms have been applied. Subsequently, a model-based approach is used together with a priori information in order to guide a fiber tracing and segmentation process. Thereby, the length of the segmented fibers can be calculated and a length distribution can be deduced. The performance and robustness of the segmentation method are demonstrated by applying it to artificially generated test data and selected real components.

  4. Flexible Early Warning Systems with Workflows and Decision Tables

    Science.gov (United States)

    Riedel, F.; Chaves, F.; Zeiner, H.

    2012-04-01

    are usually only suited for rigid processes. We show how improvements can be achieved by using decision tables and rule-based adaptive workflows. Decision tables have been shown to be an intuitive tool that can be used by domain experts to express rule sets that can be interpreted automatically at runtime. Adaptive workflows use a rule-based approach to increase the flexibility of workflows by providing mechanisms to adapt workflows based on context changes, human intervention and availability of services. The combination of workflows, decision tables and rule-based adaptation creates a framework that opens up new possibilities for flexible and adaptable workflows, especially for use in early warning and crisis management systems.
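
    A decision table of the kind advocated here reduces, at runtime, to an ordered list of condition/action rows evaluated until one matches. The sketch below is a minimal illustration with invented conditions and actions.

    ```python
    # Minimal sketch of a decision table evaluated at runtime; conditions and
    # actions are invented for illustration.
    RULES = [
        # (condition over sensor reading, action to trigger)
        (lambda r: r["water_level"] > 7.0 and r["rising"], "issue_evacuation"),
        (lambda r: r["water_level"] > 5.0, "notify_authorities"),
        (lambda r: True, "keep_monitoring"),               # default row
    ]

    def decide(reading):
        for condition, action in RULES:
            if condition(reading):                         # first matching row wins
                return action

    print(decide({"water_level": 6.2, "rising": False}))   # notify_authorities
    ```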

  5. WARP (workflow for automated and rapid production): a framework for end-to-end automated digital print workflows

    Science.gov (United States)

    Joshi, Parag

    2006-02-01

    The publishing industry is experiencing a major paradigm shift with the advent of digital publishing technologies. A large number of components in the publishing and print production workflow are transformed in this shift. However, the process as a whole requires a great deal of human intervention for decision making and for resolving exceptions during job execution. Furthermore, a majority of the best-of-breed applications for publishing and print production are intrinsically designed and developed to be driven by humans. Thus, the human-intensive nature of the current prepress process accounts for a very significant amount of the overhead costs in the fulfillment of jobs on press. It is a challenge to automate the functionality of applications built on the model of human-driven execution. Another challenge is to orchestrate the various components in the publishing and print production pipeline such that they work in a seamless manner, enabling the system to detect potential failures automatically and take corrective actions proactively. Thus, there is a great need for a coherent and unifying workflow architecture that streamlines the process and automates it as a whole in order to create an end-to-end digital automated print production workflow that does not involve any human intervention. This paper describes an architecture and building blocks that lay the foundation for a plurality of automated print production workflows.

  6. New Interactions with Workflow Systems

    NARCIS (Netherlands)

    Wassink, I.; Vet, van der P.E.; Veer, van der G.C.; Roos, M.; Dijk, van E.M.A.G.; Norros, L.; Koskinen, H.; Salo, L.; Savioja, P.

    2009-01-01

    This paper describes the evaluation of our early design ideas for an ad-hoc workflow system. Using the teach-back technique, we have performed a hermeneutic analysis of the mockup implementation named NIWS to get corrective and creative feedback at the functional, dialogue and representation levels.

  7. Rapid Energy Modeling Workflow Demonstration

    Science.gov (United States)

    2013-10-31

    BIM: Building Information Modeling; BPA: Building Performance Analysis; BTU: British Thermal Unit; CBECS: Commercial Buildings Energy Consumption Survey. The demonstrated Rapid Energy Modeling (REM) workflows employed building information modeling (BIM) approaches and conceptual energy analysis: from geometry, orientation, weather, and materials, they generate 3D Building Information Models (BIM) guided by satellite views of building footprints.

  8. Concurrency & Asynchrony in Declarative Workflows

    DEFF Research Database (Denmark)

    Debois, Søren; Hildebrandt, Thomas; Slaats, Tijs

    2015-01-01

    Declarative or constraint-based business process and workflow notations have received increasing interest in the last decade as possible means of addressing the challenge of supporting at the same time flexibility in execution, adaptability and compliance. However, the definition of concurrent se...... development has been verified correct in the Isabelle-HOL interactive theorem prover....

  9. Constructing workflows from script applications

    NARCIS (Netherlands)

    Baranowski, M.; Belloum, A.; Bubak, M.; Malawski, M.

    2012-01-01

    For programming and executing complex applications on grid infrastructures, scientific workflows have been proposed as convenient high-level alternative to solutions based on general-purpose programming languages, APIs and scripts. GridSpace is a collaborative programming and execution environment,

  10. Enabling Smart Workflows over Heterogeneous ID-SensingTechnologies

    Directory of Open Access Journals (Sweden)

    Guillermo Palacios

    2012-11-01

    Full Text Available Sensing technologies in mobile devices play a key role in reducing the gap between the physical and the digital world. The use of automatic identification capabilities can improve user participation in business processes where physical elements are involved (Smart Workflows). However, identifying all objects in the user's surroundings does not automatically translate into meaningful services to the user. This work introduces Parkour, an architecture that allows the development of services that match the goals of each of the participants in a smart workflow. Parkour is based on a pluggable architecture that can be extended to provide support for new tasks and technologies. In order to facilitate the development of these plug-ins, tools that automate the development process are also provided. Several Parkour-based systems have been developed in order to validate the applicability of the proposal.
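
    The pluggable pattern described can be sketched as a registry that workflow code queries by technology name, keeping the workflow itself technology-agnostic. The plug-in names and the identify contract below are invented for illustration.

    ```python
    # Sketch of a pluggable architecture for ID-sensing technologies:
    # interchangeable plug-ins behind a common registry (names invented).
    PLUGINS = {}

    def plugin(tech):
        def register(cls):
            PLUGINS[tech] = cls()
            return cls
        return register

    @plugin("rfid")
    class RfidReader:
        def identify(self, raw): return f"rfid:{raw.hex()}"

    @plugin("qr")
    class QrReader:
        def identify(self, raw): return f"qr:{raw.decode()}"

    def identify(tech, raw):           # workflow code stays technology-agnostic
        return PLUGINS[tech].identify(raw)

    print(identify("qr", b"shelf-42"))
    ```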

  11. Execution Time Estimation for Workflow Scheduling

    NARCIS (Netherlands)

    Chirkin, A.M.; Belloum, A..S.Z.; Kovalchuk, S.V.; Makkes, M.X.

    2014-01-01

    Estimation of the execution time is an important part of the workflow scheduling problem. The aim of this paper is to highlight common problems in estimating the workflow execution time and propose a solution that takes into account the complexity and the randomness of the workflow components and th

  12. Automatic phases recognition in pituitary surgeries by microscope images classification

    OpenAIRE

    Lalys, Florent; Riffaud, Laurent; Morandi, Xavier; Jannin, Pierre

    2010-01-01

    The segmentation of the surgical workflow might be helpful for providing context-sensitive user interfaces or generating automatic reports. Our approach focuses on the automatic recognition of surgical phases by microscope image classification. Our workflow, including image feature extraction, image database labelisation, Principal Component Analysis (PCA) transformation and 10-fold cross-validation studies, was performed on a specific type of neurosurgical interventi...

  13. Toward automatic evaluation of defect detectability in infrared images of composites and honeycomb structures

    Science.gov (United States)

    Florez-Ospina, Juan F.; Benitez-Restrepo, H. D.

    2015-07-01

    Non-destructive testing (NDT) refers to inspection methods employed to assess a material specimen without impairing its future usefulness. An important type of these methods is infrared NDT (IRNDT), which employs the heat emitted by bodies/objects to rapidly and noninvasively inspect wide surfaces and to find specific defects such as delaminations, cracks, voids, and discontinuities in materials. Current advancements in sensor technology for IRNDT generate great amounts of image sequences. These data require further processing to determine the integrity of objects. Processing techniques for IRNDT data implicitly look for defect visibility enhancement. Commonly, the IRNDT community employs the signal-to-noise ratio (SNR) to measure defect visibility. Nonetheless, current applications of SNR are local, thereby overlooking spatial information, and depend on a-priori knowledge of the defect's location. In this paper, we present a general framework to assess defect detectability based on SNR maps derived from processed IR images. The joint use of image segmentation procedures along with algorithms for filling regions of interest (ROI) estimates a reference background to compute SNR maps. Our main contributions are: (i) a method to compute SNR maps that takes into account spatial variation and is independent of a-priori knowledge of the defect location in the sample, (ii) spatial background analysis in processed images, and (iii) semi-automatic calculation of segmentation algorithm parameters. We test our approach on carbon fiber and honeycomb samples with complex geometries and defects of different sizes and depths.
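
    The SNR-map idea can be illustrated per pixel as contrast against an estimated defect-free background, expressed in decibels. The sketch below substitutes a trivial background and synthetic data for the paper's ROI-filling background estimate.

    ```python
    # Sketch of a per-pixel SNR map in dB against an estimated background.
    # Synthetic data and a trivial background stand in for the paper's
    # segmentation + ROI-filling background estimate.
    import numpy as np

    def snr_map(image, background, noise_std):
        return 20.0 * np.log10(np.abs(image - background) / noise_std + 1e-12)

    rng = np.random.default_rng(0)
    img = rng.normal(0.0, 1.0, (64, 64))
    img[20:30, 20:30] += 5.0                       # synthetic "defect"
    bg = np.zeros_like(img)                        # assumed defect-free reference
    print("peak SNR (dB):", snr_map(img, bg, 1.0).max().round(1))
    ```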

  14. Modeling Workflow Using UML Activity Diagram

    Institute of Scientific and Technical Information of China (English)

    Wei Yinxing(韦银星); Zhang Shensheng

    2004-01-01

    An enterprise can improve its adaptability in a changing market by means of workflow technologies. At build time, the main function of a Workflow Management System (WFMS) is to model the business process. A workflow model is an abstract representation of the real-world business process. The Unified Modeling Language (UML) activity diagram is an important visual process modeling language proposed by the Object Management Group (OMG). The novelty of this paper is representing workflow models by means of UML activity diagrams. A translation from UML activity diagrams to π-calculus is established. Using π-calculus, the deadlock property of workflows is analyzed.

  15. Workflow-based approaches to neuroimaging analysis.

    Science.gov (United States)

    Fissell, Kate

    2007-01-01

    Analysis of functional and structural magnetic resonance imaging (MRI) brain images requires a complex sequence of data processing steps to proceed from raw image data to the final statistical tests. Neuroimaging researchers have begun to apply workflow-based computing techniques to automate data analysis tasks. This chapter discusses eight major components of workflow management systems (WFMSs): the workflow description language, editor, task modules, data access, verification, client, engine, and provenance, and their implementation in the Fiswidgets neuroimaging workflow system. Neuroinformatics challenges involved in applying workflow techniques in the domain of neuroimaging are discussed.

  16. Pegasus Workflow Management System: Helping Applications From Earth and Space

    Science.gov (United States)

    Mehta, G.; Deelman, E.; Vahi, K.; Silva, F.

    2010-12-01

    Pegasus WMS is a Workflow Management System that can manage large-scale scientific workflows across Grid, local and Cloud resources simultaneously. Pegasus WMS provides a means for representing the workflow of an application in an abstract XML form, agnostic of the resources available to run it and the location of data and executables. It then compiles these workflows into concrete plans by querying catalogs and farming computations across local and distributed computing resources, as well as emerging commercial and community cloud environments, in an easy and reliable manner. Pegasus WMS optimizes the execution as well as data movement by leveraging existing Grid and cloud technologies via a flexible pluggable interface, and provides advanced features like reusing existing data, automatic cleanup of generated data, and recursive workflows with deferred planning. It also captures all the provenance of the workflow from the planning stage to the execution of the generated data, helping scientists to accurately measure performance metrics of their workflow as well as data reproducibility issues. Pegasus WMS was initially developed as part of the GriPhyN project to support large-scale high-energy physics and astrophysics experiments. Direct funding from the NSF enabled support for a wide variety of applications from diverse domains including earthquake simulation, bacterial RNA studies, helioseismology and ocean modeling. Earthquake Simulation: Pegasus WMS was recently used in a large-scale production run in 2009 by the Southern California Earthquake Center to run 192 million loosely coupled tasks and about 2000 tightly coupled MPI-style tasks on national cyberinfrastructure for generating a probabilistic seismic hazard map of the Southern California region. SCEC ran 223 workflows over a period of eight weeks, using on average 4,420 cores, with a peak of 14,540 cores. A total of 192 million files were produced totaling about 165TB, out of which 11TB of data was saved

  17. Towards Automated Design, Analysis and Optimization of Declarative Curation Workflows

    Directory of Open Access Journals (Sweden)

    Tianhong Song

    2014-10-01

    Full Text Available Data curation is increasingly important. Our previous work on a Kepler curation package has demonstrated advantages that come from automating data curation pipelines by using workflow systems. However, manually designed curation workflows can be error-prone and inefficient due to a lack of user understanding of the workflow system, misuse of actors, or human error. Correcting problematic workflows is often very time-consuming. A more proactive workflow system can help users avoid such pitfalls. For example, static analysis before execution can be used to detect the potential problems in a workflow and help the user to improve workflow design. In this paper, we propose a declarative workflow approach that supports semi-automated workflow design, analysis and optimization. We show how the workflow design engine helps users to construct data curation workflows, how the workflow analysis engine detects different design problems of workflows and how workflows can be optimized by exploiting parallelism.

  18. Layered Workflow Process Model Based on Extended Synchronizer

    Directory of Open Access Journals (Sweden)

    Gang Ni

    2014-07-01

    Full Text Available The layered workflow process model provides a modeling approach and analysis for key processes with Petri Nets. It not only describes the relation between the business flow process and transition nodes clearly, but also limits the rapid increase in the scale of places, transitions and directed arcs. This paper studies processes such as those in reservation and complaint-handling information management systems, especially the multi-merge and discriminator patterns, which cannot be directly modeled with existing synchronizers. Petri Nets are adopted to provide a formal description of the workflow patterns, and the relation between arcs and weight classes is also analyzed. We use the number of in and out arcs to generalize the workflow into three synchronous modes: fully synchronous mode, competition synchronous mode and asynchronous mode. Types and parameters for synchronization are added to extend the modeling ability of the synchronizers, and the synchronous distance is also expanded. The extended synchronizers have the ability to terminate branches automatically or to activate the next link many times, in addition to the abilities of the original synchronizers. Analyses of key business cases verify that the original synchronizers cannot model these patterns directly, while the extended synchronizers based on Petri Nets can model the multi-merge and discriminator modes.

  19. Operational Semantic of Workflow Engine and the Realizing Technique

    Institute of Scientific and Technical Information of China (English)

    FU Yan-ning; LIU Lei; ZHAO Dong-fan; JIN Long-fei

    2005-01-01

    At present, there is no formalized description of the executing procedure of workflow models. This paper describes, using operational semantics, the procedure by which workflow models are executed in a workflow engine. The formalized description of process instances and activity instances leads to a very clear structure for the workflow engine, eases cooperation between heterogeneous workflow engines, and guides the realization of the workflow engine's functions. The workflow engine software has been completed by means of this formalized description.
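
    One way to picture an operational semantics for activity instances is as explicit states with small-step transition rules; the sketch below assumes a conventional activity life cycle rather than the paper's exact rules:

        # Small-step transition rules for an activity instance.
        TRANSITIONS = {
            ("created", "schedule"): "ready",
            ("ready",   "start"):    "running",
            ("running", "complete"): "completed",
            ("running", "fail"):     "failed",
        }

        class ActivityInstance:
            def __init__(self, name: str):
                self.name, self.state = name, "created"

            def step(self, event: str) -> None:
                nxt = TRANSITIONS.get((self.state, event))
                if nxt is None:
                    raise ValueError(f"no rule for ({self.state}, {event})")
                self.state = nxt   # one small step of the semantics

        a = ActivityInstance("review-document")
        for event in ("schedule", "start", "complete"):
            a.step(event)
        print(a.name, a.state)  # review-document completed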

  20. A Comparison of Using Taverna and BPEL in Building Scientific Workflows: the case of caGrid.

    Science.gov (United States)

    Tan, Wei; Missier, Paolo; Foster, Ian; Madduri, Ravi; Goble, Carole

    2010-06-25

    With the emergence of "service oriented science," the need arises to orchestrate multiple services to facilitate scientific investigation-that is, to create "science workflows." We present here our findings in providing a workflow solution for the caGrid service-based grid infrastructure. We choose BPEL and Taverna as candidates, and compare their usability in the lifecycle of a scientific workflow, including workflow composition, execution, and result analysis. Our experience shows that BPEL as an imperative language offers a comprehensive set of modeling primitives for workflows of all flavors; while Taverna offers a dataflow model and a more compact set of primitives that facilitates dataflow modeling and pipelined execution. We hope that this comparison study not only helps researchers select a language or tool that meets their specific needs, but also offers some insight on how a workflow language and tool can fulfill the requirement of the scientific community.

  1. Designing a road map for geoscience workflows

    Science.gov (United States)

    Duffy, Christopher; Gil, Yolanda; Deelman, Ewa; Marru, Suresh; Pierce, Marlon; Demir, Ibrahim; Wiener, Gerry

    2012-06-01

    Advances in geoscience research and discovery are fundamentally tied to data and computation, but formal strategies for managing the diversity of models and data resources in the Earth sciences have not yet been resolved or fully appreciated. The U.S. National Science Foundation (NSF) EarthCube initiative (http://earthcube.ning.com), which aims to support community-guided cyberinfrastructure to integrate data and information across the geosciences, recently funded four community development activities: Geoscience Workflows; Semantics and Ontologies; Data Discovery, Mining, and Integration; and Governance. The Geoscience Workflows working group, with broad participation from the geosciences, cyberinfrastructure, and other relevant communities, is formulating a workflows road map (http://sites.google.com/site/earthcubeworkflow/). The Geoscience Workflows team coordinates with each of the other community development groups given their direct relevance to workflows. Semantics and ontologies are mechanisms for describing workflows and the data they process.

  2. Agile parallel bioinformatics workflow management using Pwrake.

    OpenAIRE

    2011-01-01

    Abstract Background In bioinformatics projects, scientific workflow systems are widely used to manage computational procedures. Full-featured workflow systems have been proposed to fulfil the demand for workflow management. However, such systems tend to be over-weighted for actual bioinformatics practices. We realize that quick deployment of cutting-edge software implementing advanced algorithms and data formats, and continuous adaptation to changes in computational resources and the environm...

  3. CSP for Executable Scientific Workflows

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard

    is demonstrated through examples. By providing a robust library for organising scientific workflows in a Python application I hope to inspire scientific users to adopt PyCSP. As a proof-of-concept this thesis demonstrates three scientific applications: kNN, stochastic minimum search and McStas to scale well...... on multi-processing and cluster computing using PyCSP. Additionally, McStas is demonstrated to utilise grid computing resources using PyCSP. Finally, this thesis presents a new dynamic channel model, which has not yet been implemented for PyCSP. The dynamic channel is able to change the internal...

  4. Integration of services into workflow applications

    CERN Document Server

    Czarnul, Pawel

    2015-01-01

    Describing state-of-the-art solutions in distributed system architectures, Integration of Services into Workflow Applications presents a concise approach to the integration of loosely coupled services into workflow applications. It discusses key challenges related to the integration of distributed systems and proposes solutions, both in terms of theoretical aspects such as models and workflow scheduling algorithms, and technical solutions such as software tools and APIs.The book provides an in-depth look at workflow scheduling and proposes a way to integrate several different types of services

  5. Pro WF Windows Workflow in NET 40

    CERN Document Server

    Bukovics, Bruce

    2010-01-01

    Windows Workflow Foundation (WF) is a revolutionary part of the .NET 4 Framework that allows you to orchestrate human and system interactions as a series of workflows that can be easily mapped, analyzed, adjusted, and implemented. As business problems become more complex, the need for workflow-based solutions has never been more evident. WF provides a simple and consistent way to model and implement complex problems. As a developer, you focus on developing the business logic for individual workflow tasks. The runtime handles the execution of those tasks after they have been composed into a wor

  6. Survey of Automatic Model Composition Based on Semantic Web Service

    Institute of Scientific and Technical Information of China (English)

    黄辉; 陈学广; 王志武

    2013-01-01

    This paper reviews the proposal, development and implementation of the model composition function in decision support system (DSS) model management. It surveys the current state of research on distributed model management and model composition, and compares the advantages and disadvantages of the main approaches to automatic model composition. The emergence of Web Service technology, together with innovative applications in the semantic Web and AI planning fields, has made the automatic composition of DSS models possible, but difficulties remain: current distributed model composition mainly borrows automatic Web service composition techniques, encapsulating models as model services so that the model composition problem is converted into one of Web service composition. Model composition, however, has its own characteristics, which make traditional automatic Web service composition methods not fully applicable; for example, qualitative and quantitative models cannot be composed directly.
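
    The composition style the survey describes can be sketched as a simple forward-chaining search over declared input and output types (service names and types here are invented; real systems use semantic matchmaking rather than string equality):

        # Forward-chaining composition over declared input/output types.
        services = [
            {"name": "QualModel", "in": {"survey"},       "out": {"qualitative"}},
            {"name": "Quantify",  "in": {"qualitative"},  "out": {"quantitative"}},
            {"name": "Forecast",  "in": {"quantitative"}, "out": {"forecast"}},
        ]

        def compose(have: set, want: set):
            chain, have = [], set(have)
            progress = True
            while want - have and progress:
                progress = False
                for s in services:
                    if s["name"] not in chain and s["in"] <= have:
                        chain.append(s["name"])   # service is applicable: add it
                        have |= s["out"]          # its outputs become available
                        progress = True
            return chain if want <= have else None

        print(compose({"survey"}, {"forecast"}))
        # ['QualModel', 'Quantify', 'Forecast']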

  7. Workflow Fault Tree Generation Through Model Checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2014-01-01

    We present a framework for the automated generation of fault trees from models of real-world process workflows, expressed in a formalised subset of the popular Business Process Modelling and Notation (BPMN) language. To capture uncertainty and unreliability in workflows, we extend this formalism

  8. Soundness of Timed-Arc Workflow Nets

    DEFF Research Database (Denmark)

    Mateo, Jose Antonio; Srba, Jiri; Sørensen, Mathias Grund

    2014-01-01

    Analysis of workflow processes with quantitative aspects like timing is of interest in numerous time-critical applications. We suggest a workflow model based on timed-arc Petri nets and study the foundational problems of soundness and strong (time-bounded) soundness. We explore the decidability o...

  9. Implementing bioinformatic workflows within the bioextract server

    Science.gov (United States)

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed servi...

  10. Contracts for Cross-Organizational Workflow Management

    NARCIS (Netherlands)

    Koetsier, M.; Grefen, P.; Vonk, J.

    1999-01-01

    Nowadays, many organizations form dynamic partnerships to deal effectively with market requirements. As companies use automated workflow systems to control their processes, a way of linking workflow processes in different organizations is useful in turning the co-operating companies into a seamless

  11. The standard-based open workflow system in GeoBrain (Invited)

    Science.gov (United States)

    Di, L.; Yu, G.; Zhao, P.; Deng, M.

    2013-12-01

    GeoBrain is an Earth science Web-service system developed and operated by the Center for Spatial Information Science and Systems, George Mason University. In GeoBrain, a standard-based open workflow system has been implemented to accommodate the automated processing of geospatial data through a set of complex geo-processing functions for advanced product generation. GeoBrain models complex geoprocessing at two levels, the conceptual and the concrete. At the conceptual level, workflows exist in the form of data and service types defined by ontologies. Workflows at the conceptual level are called geo-processing models and are cataloged in GeoBrain as virtual product types. A conceptual workflow is instantiated into a concrete, executable workflow when a user requests a product that matches a virtual product type. Both conceptual and concrete workflows are encoded in the Business Process Execution Language (BPEL). A BPEL workflow engine, called BPELPower, has been implemented to execute the workflows for product generation. A provenance capturing service has been implemented to generate ISO 19115-compliant complete product provenance metadata before and after workflow execution. Generating provenance metadata before the workflow execution allows users to examine the usability of the final product before the lengthy and expensive execution takes place. The three modes of workflow execution defined in ISO 19119, namely transparent, translucent, and opaque, are available in GeoBrain. A geoprocessing modeling portal has been developed to allow domain experts to develop geoprocessing models at the type level with the support of both data and service/processing ontologies. The geoprocessing models capture the knowledge of the domain experts and become the operational offerings of the products after a proper peer review of the models is conducted. Automated workflow composition has been experimented with successfully, based on ontologies and artificial

  12. Metaworkflows and Workflow Interoperability for Heliophysics

    Science.gov (United States)

    Pierantoni, Gabriele; Carley, Eoin P.

    2014-06-01

    Heliophysics is a relatively new branch of physics that investigates the relationship between the Sun and the other bodies of the solar system. To investigate such relationships, heliophysicists can rely on various tools developed by the community. Some of these tools are on-line catalogues that list events (such as Coronal Mass Ejections, CMEs) and their characteristics as they were observed on the surface of the Sun or on the other bodies of the Solar System. Other tools offer on-line data analysis and access to images and data catalogues. During their research, heliophysicists often perform investigations that need to coordinate several of these services and to repeat these complex operations until the phenomena under investigation are fully analyzed. Heliophysicists combine the results of these services; this service orchestration is best suited to workflows. This approach has been investigated in the HELIO project. The HELIO project developed an infrastructure for a Virtual Observatory for Heliophysics and implemented service orchestration using TAVERNA workflows. HELIO developed a set of workflows that proved to be useful but lacked flexibility and re-usability. The TAVERNA workflows also needed to be executed directly in the TAVERNA workbench, and this forced all users to learn how to use the workbench. Within the SCI-BUS and ER-FLOW projects, we have started an effort to re-think and re-design the heliophysics workflows with the aim of fostering re-usability and ease of use. We base our approach on two key concepts, that of meta-workflows and that of workflow interoperability. We have divided the produced workflows into three different layers. The first layer is Basic Workflows, developed both in the TAVERNA and WS-PGRADE languages. They are building blocks that users compose to address their scientific challenges. They implement well-defined Use Cases that usually involve only one service. The second layer is Science Workflows usually developed in TAVERNA. They

  13. Workflow Management in CLARIN-DK

    DEFF Research Database (Denmark)

    Jongejan, Bart

    2013-01-01

    The CLARIN-DK infrastructure is not only a repository of resources, but also a place where users can analyse, annotate, reformat and potentially even translate resources, using tools that are integrated in the infrastructure as web services. In many cases a single tool does not produce the desired output, given the input resource at hand. Still, in such cases it may be possible to reach the set goal by chaining a number of tools. The approach presented here frees the user of having to meddle with tools and the construction of workflows. Instead, the user only needs to supply the workflow manager with the features that describe her goal, because the workflow manager not only executes chains of tools in a workflow, but also takes care of autonomously devising workflows that serve the user's intention, given the tools that currently are integrated in the infrastructure as web services.
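
    A minimal sketch of such goal-driven workflow devising, assuming a toy registry in which each tool maps one input feature to one output feature (CLARIN-DK's real feature vocabulary and matching are richer):

        # Backward search from the goal feature to the input resource's
        # feature. No cycle handling; purely illustrative tool names.
        tools = {
            "tokenizer":  ("plain-text", "tokens"),
            "pos-tagger": ("tokens",     "pos-tagged"),
            "lemmatizer": ("pos-tagged", "lemmas"),
        }

        def devise(input_feature, goal_feature, suffix=()):
            if goal_feature == input_feature:
                return list(suffix)
            for name, (needs, yields) in tools.items():
                if yields == goal_feature:
                    plan = devise(input_feature, needs, (name,) + suffix)
                    if plan is not None:
                        return plan
            return None

        # The user states only the goal; the manager devises the chain.
        print(devise("plain-text", "lemmas"))
        # ['tokenizer', 'pos-tagger', 'lemmatizer']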

  14. Implementation of WPDL Conforming Workflow Model

    Institute of Scientific and Technical Information of China (English)

    张志君; 范玉顺

    2003-01-01

    Workflow process definition language (WPDL) facilitates the transfer of workflow process definitions between separate workflow products. However, much work is still needed to transfer the specific workflow model to a WPDL conforming model. CIMFlow is a workflow management system developed by the National CIMS Engineering Research Center. This paper discusses the methods by which the CIMFlow model conforms to the WPDL meta-model and the differences between the WPDL meta-model and the CIMFlow model. Some improvements are proposed for the WPDL specification. Finally, the mapping and translating methods between the entities and attributes are given for the two models. The proposed methods and improvements are valuable as a reference for other mapping applications and the WPDL specification.

  15. CSP for Executable Scientific Workflows

    DEFF Research Database (Denmark)

    Friborg, Rune Møllegaard

    and can usually benefit performance-wise from both multiprocessing, cluster and grid environments. PyCSP is an implementation of Communicating Sequential Processes (CSP) for the Python programming language and takes advantage of CSP's formal and verifiable approach to controlling concurrency...... and the readability of Python source code. Python is a popular programming language in the scientific community, with many scientific libraries (modules) and simple integration to external languages. This thesis presents a PyCSP extended with many new features and a more robust implementation to allow scientific...... is demonstrated through examples. By providing a robust library for organising scientific workflows in a Python application I hope to inspire scientific users to adopt PyCSP. As a proof-of-concept this thesis demonstrates three scientific applications: kNN, stochastic minimum search and McStas to scale well...

  16. Dynamic reusable workflows for ocean science

    Science.gov (United States)

    Signell, Richard; Fernandez, Filipe; Wilcox, Kyle

    2016-01-01

    Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog search and data access make it now possible to create catalog-driven workflows that automate — end-to-end — data search, analysis and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill-assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enters the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased use of dynamic
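
    The catalog-search step of such a workflow can be sketched with the OWSLib CSW client, a Python tool of the kind the abstract mentions; the endpoint URL below is a placeholder and the query term is illustrative:

        from owslib.csw import CatalogueServiceWeb
        from owslib.fes import PropertyIsLike

        csw = CatalogueServiceWeb("https://catalog.example.org/csw")
        query = PropertyIsLike("apiso:AnyText", "%water temperature%")
        csw.getrecords2(constraints=[query], maxrecords=10)

        for rec_id, rec in csw.records.items():
            print(rec.title)
            # each record carries the service endpoints (SOS, OPeNDAP, ...)
            for ref in rec.references:
                print("  ", ref.get("scheme"), ref.get("url"))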

  17. Dynamic Reusable Workflows for Ocean Science

    Directory of Open Access Journals (Sweden)

    Richard P. Signell

    2016-10-01

    Full Text Available Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog searches and data access now make it possible to create catalog-driven workflows that automate—end-to-end—data search, analysis, and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused, and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enter the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased

  18. Provenance-Based Debugging and Drill-Down in Data-Oriented Workflows

    KAUST Repository

    Ikeda, Robert

    2012-04-01

    Panda (for Provenance and Data) is a system that supports the creation and execution of data-oriented workflows, with automatic provenance generation and built-in provenance tracing operations. Workflows in Panda are arbitrary acyclic graphs containing both relational (SQL) processing nodes and opaque processing nodes programmed in Python. For both types of nodes, Panda generates logical provenance - provenance information stored at the processing-node level - and uses the generated provenance to support record-level backward tracing and forward tracing operations. In our demonstration we use Panda to integrate, process, and analyze actual education data from multiple sources. We specifically demonstrate how Panda's provenance generation and tracing capabilities can be very useful for workflow debugging, and for drilling down on specific results of interest. © 2012 IEEE.
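
    Record-level backward tracing of this kind can be sketched as follows (a toy model, not Panda's implementation: each node records a mapping from output records to the input records they derive from):

        # Toy logical provenance: node -> {output record: [input records]}.
        provenance = {
            "join":      {"out1": ["srcA:r1", "srcB:r7"]},
            "aggregate": {"agg1": ["out1", "out2"]},
        }
        order = ["join", "aggregate"]  # topological order of the workflow

        def trace_back(record, node_index):
            """Trace one output record back through all upstream nodes."""
            frontier = {record}
            for node in reversed(order[:node_index + 1]):
                mapping = provenance[node]
                expanded = set()
                for r in frontier:
                    expanded |= set(mapping.get(r, [r]))  # pass through if untouched
                frontier = expanded
            return frontier

        print(trace_back("agg1", node_index=1))
        # {'srcA:r1', 'srcB:r7', 'out2'}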

  19. The design of cloud workflow systems

    CERN Document Server

    Liu, Xiao; Zhang, Gaofeng

    2011-01-01

    Cloud computing is the latest market-oriented computing paradigm which brings software design and development into a new era characterized by "XaaS", i.e. everything as a service. Cloud workflows, as typical software applications in the cloud, are composed of a set of partially ordered cloud software services to achieve specific goals. However, due to the low QoS (quality of service) nature of the cloud environment, the design of workflow systems in the cloud becomes a challenging issue for the delivery of high quality cloud workflow applications. To address such an issue, this book presents

  20. Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses.

    Science.gov (United States)

    Liu, Bo; Madduri, Ravi K; Sotomayor, Borja; Chard, Kyle; Lacinski, Lukasz; Dave, Utpal J; Li, Jianqiang; Liu, Chunchen; Foster, Ian T

    2014-06-01

    Due to the upcoming deluge of genome data, the need for storing and processing large-scale genome data, easy access to biomedical analysis tools, and efficient data sharing and retrieval has presented significant challenges. The variability in data volume results in variable computing and storage requirements; therefore, biomedical researchers are pursuing more reliable, dynamic and convenient methods for conducting sequencing analyses. This paper proposes a Cloud-based bioinformatics workflow platform for large-scale next-generation sequencing analyses, which enables reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Our platform extends the existing Galaxy workflow system by adding data management capabilities for transferring large quantities of data efficiently and reliably (via Globus Transfer), domain-specific analysis tools preconfigured for immediate use by researchers (via user-specific tool integration), automatic deployment on the Cloud for on-demand resource allocation and pay-as-you-go pricing (via Globus Provision), a Cloud provisioning tool for auto-scaling (via the HTCondor scheduler), and support for validating the correctness of workflows (via semantic verification tools). Two bioinformatics workflow use cases as well as a performance evaluation are presented to validate the feasibility of the proposed approach.

  1. Security aspects in teleradiology workflow

    Science.gov (United States)

    Soegner, Peter I.; Helweg, Gernot; Holzer, Heimo; zur Nedden, Dieter

    2000-05-01

    The medicolegal necessity of privacy, security and confidentiality was the aim of the attempt to develop a secure teleradiology workflow between the telepartners -- the radiologist and the referring physician. To avoid a lack of data protection and data security, we introduced biometric fingerprint scanners in combination with smart cards to identify the teleradiology partners, and communicated over an encrypted TCP/IP satellite link between Innsbruck and Reutte. We used an asymmetric cryptography method to guarantee authentication, the integrity of the data packages and the confidentiality of the medical data. It was necessary to use a biometric feature to avoid cases of mistaken identity among persons who wanted access to the system. Only invariable electronic identification allows legal liability for the final report, and only a secure data connection allows the exchange of sensitive medical data between different partners in health care networks. In our study we selected the user-friendly combination of a smart card and a biometric fingerprint technique, called SkymedTM Double Guard Secure Keyboard (Agfa-Gevaert), to confirm identities and log into the imaging workstations and the electronic patient record. We examined the interoperability of the software used with the existing platforms. Only the WIN-XX operating systems could be protected at the time of our study.

  2. Workflow Based Software Development Environment Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed research is to investigate and develop a workflow based tool, the Software Developers Assistant, to facilitate the collaboration between...

  3. E-BioFlow: Different perspectives on scientific workflows

    NARCIS (Netherlands)

    Wassink, I.; Rauwerda, H.; van der Vet, P.; Breit, T.; Nijholt, A.

    2008-01-01

    We introduce a new type of workflow design system called e-BioFlow and illustrate it by means of a simple sequence alignment workflow. E-BioFlow, intended to model advanced scientific workflows, enables the user to model a workflow from three different but strongly coupled perspectives: the control

  4. E-BioFlow: Different Perspectives on Scientific Workflows

    NARCIS (Netherlands)

    Wassink, I.; Rauwerda, H.; Vet, van der P.E.; Breit, T.; Nijholt, A.; Elloumi, M.; Küng, J.; Linial, M.; Murphy, R.F.; Schneider, K.; Toma, C.

    2008-01-01

    We introduce a new type of workflow design system called e-BioFlow and illustrate it by means of a simple sequence alignment workflow. E-BioFlow, intended to model advanced scientific workflows, enables the user to model a workflow from three different but strongly coupled perspectives: the control

  5. SHIWA Services for Workflow Creation and Sharing in Hydrometeorolog

    Science.gov (United States)

    Terstyanszky, Gabor; Kiss, Tamas; Kacsuk, Peter; Sipos, Gergely

    2014-05-01

    Researchers want to run scientific experiments on Distributed Computing Infrastructures (DCIs) to access large pools of resources and services. Running these experiments requires specific expertise that researchers may not have. Workflows can hide resources and services behind a virtualisation layer, providing a user interface that researchers can use. There are many scientific workflow systems, but they are not interoperable. Learning a workflow system and creating workflows may require significant effort. Considering this effort, it is not reasonable to expect that researchers will learn new workflow systems if they want to run workflows developed in other workflow systems. Overcoming this requires workflow interoperability solutions that allow workflow sharing. The FP7 'Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs' (SHIWA) project developed the Coarse-Grained Interoperability (CGI) concept. It enables recycling and sharing workflows of different workflow systems and executing them on different DCIs. SHIWA developed the SHIWA Simulation Platform (SSP) to implement the CGI concept, integrating three major components: the SHIWA Science Gateway, the workflow engines supported by the CGI concept, and the DCI resources where workflows are executed. The science gateway contains a portal, a submission service, a workflow repository and a proxy server to support the whole workflow life-cycle. The SHIWA Portal allows workflow creation, configuration, execution and monitoring through a Graphical User Interface, using the WS-PGRADE workflow system as the host workflow system. The SHIWA Repository stores the formal descriptions of workflows and workflow engines, plus the executables and data needed to execute them. It offers a wide range of browse and search operations. To support non-native workflow execution, the SHIWA Submission Service imports the workflow and workflow engine from the SHIWA Repository. This service either invokes locally or remotely

  6. The PBase Scientific Workflow Provenance Repository

    Directory of Open Access Journals (Sweden)

    Víctor Cuevas-Vicenttín

    2014-10-01

    Full Text Available Scientific workflows and their supporting systems are becoming increasingly popular for compute-intensive and data-intensive scientific experiments. The advantages scientific workflows offer include rapid and easy workflow design, software and data reuse, scalable execution, sharing and collaboration, and other advantages that altogether facilitate “reproducible science”. In this context, provenance – information about the origin, context, derivation, ownership, or history of some artifact – plays a key role, since scientists are interested in examining and auditing the results of scientific experiments. However, in order to perform such analyses on scientific results as part of extended research collaborations, an adequate environment and tools are required. Concretely, the need arises for a repository that will facilitate the sharing of scientific workflows and their associated execution traces in an interoperable manner, also enabling querying and visualization. Furthermore, such functionality should be supported while taking performance and scalability into account. With this purpose in mind, we introduce PBase: a scientific workflow provenance repository implementing the ProvONE proposed standard, which extends the emerging W3C PROV standard for provenance data with workflow specific concepts. PBase is built on the Neo4j graph database, thus offering capabilities such as declarative and efficient querying. Our experiences demonstrate the power gained by supporting various types of queries for provenance data. In addition, PBase is equipped with a user friendly interface tailored for the visualization of scientific workflow provenance data, making the specification of queries and the interpretation of their results easier and more effective.
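
    Since PBase is built on Neo4j, provenance questions become declarative graph queries. The sketch below uses the official Neo4j Python driver; the connection details and the node labels and relationship names are assumptions for illustration, not PBase's actual schema:

        from neo4j import GraphDatabase

        driver = GraphDatabase.driver("bolt://localhost:7687",
                                      auth=("neo4j", "secret"))

        # "Which executions read this dataset?" as a graph query.
        CYPHER = """
        MATCH (e:Execution)-[:USED]->(d:Data {id: $data_id})
        RETURN e.id AS execution
        """

        with driver.session() as session:
            for row in session.run(CYPHER, data_id="dataset-42"):
                print(row["execution"])
        driver.close()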

  7. How Workflow Documentation Facilitates Curation Planning

    Science.gov (United States)

    Wickett, K.; Thomer, A. K.; Baker, K. S.; DiLauro, T.; Asangba, A. E.

    2013-12-01

    The description of the specific processes and artifacts that led to the creation of a data product provide a detailed picture of data provenance in the form of a workflow. The Site-Based Data Curation project, hosted by the Center for Informatics Research in Science and Scholarship at the University of Illinois, has been investigating how workflows can be used in developing curation processes and policies that move curation "upstream" in the research process. The team has documented an individual workflow for geobiology data collected during a single field trip to Yellowstone National Park. This specific workflow suggests a generalized three-part process for field data collection that comprises three distinct elements: a Planning Stage, a Fieldwork Stage, and a Processing and Analysis Stage. Beyond supplying an account of data provenance, the workflow has allowed the team to identify 1) points of intervention for curation processes and 2) data products that are likely candidates for sharing or deposit. Although these objects may be viewed by individual researchers as 'intermediate' data products, discussions with geobiology researchers have suggested that with appropriate packaging and description they may serve as valuable observational data for other researchers. Curation interventions may include the introduction of regularized data formats during the planning process, data description procedures, the identification and use of established controlled vocabularies, and data quality and validation procedures. We propose a poster that shows the individual workflow and our generalization into a three-stage process. We plan to discuss with attendees how well the three-stage view applies to other types of field-based research, likely points of intervention, and what kinds of interventions are appropriate and feasible in the example workflow.

  8. Automatic sequences

    CERN Document Server

    Haeseler, Friedrich

    2003-01-01

    Automatic sequences are sequences which are produced by a finite automaton. Although they are not random, they may look random. They are complicated in the sense of not being ultimately periodic, and they may look rather complicated in the sense that it may not be easy to name the rule by which the sequence is generated; however, such a rule always exists. The concept of automatic sequences has special applications in algebra, number theory, finite automata and formal languages, and combinatorics on words. The text deals with different aspects of automatic sequences, in particular: a general introduction to automatic sequences; the basic (combinatorial) properties of automatic sequences; the algebraic approach to automatic sequences; and geometric objects related to automatic sequences.
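
    A standard example is the Thue-Morse sequence, where t(n) is the parity of the number of 1s in the binary expansion of n, computable by a two-state automaton reading the binary digits of n:

        def thue_morse(n: int) -> int:
            state = 0                  # automaton state = parity so far
            for bit in bin(n)[2:]:     # feed the binary expansion of n
                if bit == "1":
                    state ^= 1         # flip state on every 1
            return state

        print([thue_morse(n) for n in range(16)])
        # [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0] -- not ultimately periodic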

  9. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    Rzehorz, Gerhard Ferdinand; The ATLAS collaboration

    2016-01-01

    This contribution reports on the feasibility of executing data intensive workflows on Cloud infrastructures. In order to assess this, the metric ETC = Events/Time/Cost is formed, which quantifies the different workflow and infrastructure configurations that are tested against each other. In these tests ATLAS reconstruction Jobs are run, examining the effects of overcommitting (more parallel processes running than CPU cores available), scheduling (staggered execution) and scaling (number of cores). The desirability of commissioning storage in the cloud is evaluated, in conjunction with a simple analytical model of the system, and correlated with questions about the network bandwidth, caches and what kind of storage to utilise. In the end a cost/benefit evaluation of different infrastructure configurations and workflows is undertaken, with the goal to find the maximum of the ETC value

  10. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    Rzehorz, Gerhard Ferdinand; The ATLAS collaboration

    2017-01-01

    This contribution reports on the feasibility of executing data intensive workflows on Cloud infrastructures. In order to assess this, the metric ETC = Events/Time/Cost is formed, which quantifies the different workflow and infrastructure configurations that are tested against each other. In these tests ATLAS reconstruction Jobs are run, examining the effects of overcommitting (more parallel processes running than CPU cores available), scheduling (staggered execution) and scaling (number of cores). The desirability of commissioning storage in the Cloud is evaluated, in conjunction with a simple analytical model of the system, and correlated with questions about the network bandwidth, caches and what kind of storage to utilise. In the end a cost/benefit evaluation of different infrastructure configurations and workflows is undertaken, with the goal to find the maximum of the ETC value.
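
    The ETC metric itself is simple arithmetic; the sketch below compares two configurations with made-up numbers purely to show how the cost/benefit comparison works:

        # ETC = Events / Time / Cost; the larger value wins the comparison.
        def etc(events: int, hours: float, cost: float) -> float:
            return events / hours / cost

        grid_site = etc(events=1_000_000, hours=10.0, cost=80.0)   # 1250.0
        cloud     = etc(events=1_000_000, hours=8.0,  cost=120.0)  # ~1041.7

        print(f"grid  ETC: {grid_site:.1f}")
        print(f"cloud ETC: {cloud:.1f}")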

  11. Logical provenance in data-oriented workflows

    KAUST Repository

    Ikeda, R.

    2013-04-01

    We consider the problem of defining, generating, and tracing provenance in data-oriented workflows, in which input data sets are processed by a graph of transformations to produce output results. We first give a new general definition of provenance for general transformations, introducing the notions of correctness, precision, and minimality. We then determine when properties such as correctness and minimality carry over from the individual transformations' provenance to the workflow provenance. We describe a simple logical-provenance specification language consisting of attribute mappings and filters. We provide an algorithm for provenance tracing in workflows where logical provenance for each transformation is specified using our language. We consider logical provenance in the relational setting, observing that for a class of Select-Project-Join (SPJ) transformations, logical provenance specifications encode minimal provenance. We have built a prototype system supporting the features and algorithms presented in the paper, and we report a few preliminary experimental results. © 2013 IEEE.

  12. Digital workflow management for quality assessment in pathology.

    Science.gov (United States)

    Kalinski, Thomas; Sel, Saadettin; Hofmann, Harald; Zwönitzer, Ralf; Bernarding, Johannes; Roessner, Albert

    2008-01-01

    Information systems (IS) are well established in the many departments and practices of pathology. Apart from being collections of doctors' reports, IS can be used to organize and evaluate workflow processes. We report on such digital workflow management using IS at the Department of Pathology, University Hospital Magdeburg, Germany, and present an evaluation of workflow data collected over a whole year. This allows us to measure workflow processes and to distinguish the effects of alterations in the workflow for quality assessment. Moreover, digital workflow management provides the basis for the integration of diagnostic virtual microscopy.

  13. Petri Net-Based Workflow Access Control Model

    Institute of Scientific and Technical Information of China (English)

    陈卓; 骆婷; 石磊; 洪帆

    2004-01-01

    Access control is an important protection mechanism for information systems. This paper shows how to implement access control in a workflow system. We give a workflow access control model (WACM) based on several current access control models. The model supports role assignment and dynamic authorization. The paper defines the workflow using Petri nets. It first gives the definition and description of the workflow, and then analyzes the architecture of the workflow access control model (WACM). Finally, an example of an e-commerce workflow access control model is discussed in detail.
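
    The static part of such role-based authorization reduces to checking a user's roles against a task's required roles, as in this minimal sketch with invented names (the model's Petri-net machinery for dynamic authorization is not reproduced here):

        role_assignments = {"alice": {"clerk", "approver"}, "bob": {"clerk"}}
        task_requires    = {"enter-order": {"clerk"}, "approve-order": {"approver"}}

        def may_execute(user: str, task: str) -> bool:
            # permitted iff the user holds at least one required role
            return bool(role_assignments.get(user, set()) & task_requires[task])

        print(may_execute("alice", "approve-order"))  # True
        print(may_execute("bob",   "approve-order"))  # False: 'approver' missing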

  14. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of real-world business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations ... of events, reward-based properties and best- and worst-case scenarios. We develop a simple example of a medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...

  15. Workflow-Based Dynamic Enterprise Modeling

    Institute of Scientific and Technical Information of China (English)

    黄双喜; 范玉顺; 罗海滨; 林慧萍

    2002-01-01

    Traditional systems for enterprise modeling and business process control are often static and cannot adapt to the changing environment. This paper presents a workflow-based method to dynamically execute the enterprise model. This method gives an explicit representation of the business process logic and the relationships between the elements involved in the process. An execution-oriented integrated enterprise modeling system is proposed in combination with other enterprise views. The enterprise model can be established and executed dynamically in the actual environment due to the dynamic properties of the workflow model.

  16. Designing Flexible E-Business Workflow Systems

    Directory of Open Access Journals (Sweden)

    Cătălin Silvestru

    2010-01-01

    Full Text Available In today's business environment, organizations must cope with complex interactions between actors, adapt quickly to frequent market changes and be innovative. In this context, integrating knowledge with processes and Business Intelligence is a major step towards improving organization agility. Therefore, traditional environments for workflow design have been adapted to answer the new business models and current requirements in the field of collaborative processes. This paper approaches the design of flexible and dynamic workflow management systems for electronic businesses that can lead to agility.

  17. A system for deduction-based formal verification of workflow-oriented software models

    Directory of Open Access Journals (Sweden)

    Klimek Radosław

    2014-12-01

    Full Text Available This work concerns the formal verification of workflow-oriented software models using the deductive approach, considering the formal correctness of a model's behaviour. Manually building logical specifications, regarded as sets of temporal logic formulas, is a significant obstacle for an inexperienced user applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages over traditional deduction strategies. An algorithm for the automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is the standard and dominant notation for modeling business processes. The main idea behind the approach is to treat patterns, defined in terms of temporal logic, as a kind of (logical) primitives which enable the transformation of models into the temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach goes some way towards supporting, and hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.
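
    The pattern-to-formula idea can be sketched by mapping each workflow pattern to a temporal-logic template and instantiating it with activity names; the templates below are simplified LTL-style strings with invented activity names, not the paper's exact formulas:

        TEMPLATES = {
            "sequence":  "G({a} -> F({b}))",   # after a, eventually b
            "exclusive": "G(!({a} & {b}))",    # a and b never hold together
        }

        def formula(pattern: str, a: str, b: str) -> str:
            return TEMPLATES[pattern].format(a=a, b=b)

        spec = [
            formula("sequence",  "receive_claim", "assess_claim"),
            formula("sequence",  "assess_claim",  "notify_customer"),
            formula("exclusive", "approve",       "reject"),
        ]
        print("\n".join(spec))  # the generated logical specification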

  18. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience.

    Science.gov (United States)

    Stockton, David B; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to supercomputer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.

  19. Automatic Block Decomposition of Parametrically Changing Volumes

    Directory of Open Access Journals (Sweden)

    Taghavi R.

    2006-12-01

    Full Text Available A method is introduced for the automatic decomposition of time-varying volumes such as those encountered in engine FEA and CFD. Examples of the application of this method to all-hexahedral mesh generation are also presented.

  20. An Automatic Deployment Scheme for Service Composition Based on ODE

    Institute of Scientific and Technical Information of China (English)

    金仙力; 杨庚

    2012-01-01

    Based on an analysis of the structure of Apache ODE and of the principles by which it deploys and executes BPEL processes, this paper proposes an automatic deployment scheme for service composition in the Apache ODE engine environment. Test results on examples show that the scheme is feasible and effective.
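
    One concrete mechanism such a scheme can build on is Apache ODE's file-based hot deployment: the WAR distribution scans its WEB-INF/processes directory for bundle folders containing a deploy.xml plus the .bpel and .wsdl artifacts. The sketch below automates that copy step; the paths are assumptions for a local Tomcat install:

        import shutil
        from pathlib import Path

        bundle = Path("./MyProcess")  # must contain deploy.xml, *.bpel, *.wsdl
        ode_processes = Path("/opt/tomcat/webapps/ode/WEB-INF/processes")

        if not (bundle / "deploy.xml").exists():
            raise SystemExit("bundle incomplete: deploy.xml is missing")

        # the copy triggers ODE's deployment scan of the processes directory
        shutil.copytree(bundle, ode_processes / bundle.name, dirs_exist_ok=True)
        print(f"deployed {bundle.name}")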

  1. Development and implementation of an automatic integration system for fibre optic sensors in the braiding process with the objective of online-monitoring of composite structures

    Science.gov (United States)

    Hufenbach, W.; Gude, M.; Czulak, A.; Kretschmann, Martin

    2014-04-01

    Increasing economic, political and ecological pressure leads to a steadily rising share of modern processing and manufacturing processes for fibre-reinforced polymers in industrial batch production. Component weights below the level achievable with classic construction materials, which lead to a better energy and cost balance during the product lifetime, justify the higher fabrication costs. However, complex quality control and failure prediction slow down the substitution by composite materials. High-resolution fibre-optic sensors (FOS), due to their small diameter, high measuring-point density and simple handling, show high potential for automated sensor integration in manufacturing processes, and therefore for the online monitoring of composite products manufactured on an industrial scale. Integrated sensors can be used to monitor manufacturing processes and part tests as well as the component structure during the product life cycle, which simplifies quality control during production and allows the optimization of single manufacturing processes [1;2]. Furthermore, detailed failure analyses lead to an enhanced understanding of the failure processes appearing in composite materials. This leads to fewer rejects and to products of higher value and longer product life cycle, whereby costs, material and energy are saved. This work shows an automation approach for FOS integration in the braiding process. For that purpose a braiding wheel has been supplemented with an appliance for automatic sensor application, which has been used to manufacture preforms of high-pressure composite vessels with FOS networks integrated between the fibre layers. All subsequent manufacturing processes (vacuum infiltration, curing) and component tests (quasi-static pressure test, programmed delamination) were monitored with the help of the integrated sensor networks. Keywords: SHM, high-pressure composite vessel, braiding, automated sensor integration, pressure test, quality control, optic

  2. MBAT: A scalable informatics system for unifying digital atlasing workflows

    Directory of Open Access Journals (Sweden)

    Sane Nikhil

    2010-12-01

    Full Text Available Abstract Background Digital atlases provide a common semantic and spatial coordinate system that can be leveraged to compare, contrast, and correlate data from disparate sources. As the quality and amount of biological data continues to advance and grow, searching, referencing, and comparing this data with a researcher's own data is essential. However, the integration process is cumbersome and time-consuming due to misaligned data, implicitly defined associations, and incompatible data sources. This work addresses these challenges by providing a unified and adaptable environment to accelerate the workflow to gather, align, and analyze the data. Results The MouseBIRN Atlasing Toolkit (MBAT) project was developed as a cross-platform, free open-source application that unifies and accelerates the digital atlas workflow. A tiered, plug-in architecture was designed for the neuroinformatics and genomics goals of the project to provide a modular and extensible design. MBAT provides the ability to use a single query to search and retrieve data from multiple data sources, align image data using the user's preferred registration method, composite data from multiple sources in a common space, and link relevant informatics information to the current view of the data or atlas. The workspaces leverage tool plug-ins to extend and allow future extensions of the basic workspace functionality. A wide variety of tool plug-ins were developed that integrate pre-existing as well as newly created technology into each workspace. Novel atlasing features were also developed, such as supporting multiple label sets, dynamic selection and grouping of labels, and synchronized, context-driven display of ontological data. Conclusions MBAT empowers researchers to discover correlations among disparate data by providing a unified environment for bringing together distributed reference resources, a user's image data, and biological atlases into the same spatial or semantic context

  3. RESTFul based heterogeneous Geoprocessing workflow interoperation for Sensor Web Service

    Science.gov (United States)

    Yang, Chao; Chen, Nengcheng; Di, Liping

    2012-10-01

    Advanced sensors on board satellites offer detailed Earth observations. A workflow is one approach for designing, implementing and constructing a flexible and live link between these sensors' resources and users. It can coordinate, organize and aggregate the distributed sensor Web services to meet the requirement of a complex Earth observation scenario. A RESTFul based workflow interoperation method is proposed to integrate heterogeneous workflows into an interoperable unit. The Atom protocols are applied to describe and manage workflow resources. The XML Process Definition Language (XPDL) and Business Process Execution Language (BPEL) workflow standards are applied to structure a workflow that accesses sensor information and one that processes it separately. Then, a scenario for nitrogen dioxide (NO2) from a volcanic eruption is used to investigate the feasibility of the proposed method. The RESTFul based workflows interoperation system can describe, publish, discover, access and coordinate heterogeneous Geoprocessing workflows.

  4. On Secure Workflow Decentralisation on the Internet

    Directory of Open Access Journals (Sweden)

    Petteri Kaskenpalo

    2010-06-01

    Full Text Available Decentralised workflow management systems are a new research area, where most work to date has focused on the systems' overall architecture. As little attention has been given to the security aspects of such systems, we follow a security-driven approach and consider, from the perspective of available security building blocks, how security can be implemented and what new opportunities are presented when empowering the decentralised environment with modern distributed security protocols. Our research is motivated by the more general question of how to combine the positive enablers that email exchange enjoys with the general benefits of workflow systems, and more specifically with the benefits that can be introduced in a decentralised environment. This aims to equip email users with a set of tools to manage the semantics of a message exchange, its contents, participants and their roles in the exchange, in an environment that provides inherent assurances of security and privacy. This work is based on a survey of contemporary distributed security protocols and considers how these protocols could be used in implementing a distributed workflow management system with decentralised control. We review a set of these protocols, focusing on the required message sequences, and discuss how these security protocols provide the foundations for implementing core control-flow, data, and resource patterns in a distributed workflow environment.

  5. Rapid Energy Modeling Workflow Demonstration Project

    Science.gov (United States)

    2014-01-01

    This project executed a demonstration of Rapid Energy Modeling (REM) workflows that employed Building Information Modeling (BIM) approaches, utilizing information on operations, geometry, orientation, weather, and materials to generate three-dimensional (3D) Building Information Models.

  6. Beyond Scientific Workflows: Networked Open Processes

    NARCIS (Netherlands)

    Cushing, R.; Bubak, M.; Belloum, A.; de Laat, C.

    2013-01-01

    The multitude of scientific services and processes being developed brings about challenges for future in silico distributed experiments. Choosing the correct service from an expanding body of processes means that the task of manually building workflows is becoming untenable. In this paper we pro

  7. Building Digital Audio Preservation Infrastructure and Workflows

    Science.gov (United States)

    Young, Anjanette; Olivieri, Blynne; Eckler, Karl; Gerontakos, Theodore

    2010-01-01

    In 2009 the University of Washington (UW) Libraries special collections received funding for the digital preservation of its audio indigenous language holdings. The university libraries, where the authors work in various capacities, had begun digitizing image and text collections in 1997. Because of this, at the onset of the project, workflows (a…

  8. A Reference Architecture for Workflow Management Systems

    NARCIS (Netherlands)

    Grefen, Paul; Remmerts de Vries, Remmert

    1998-01-01

    In the workflow management field, fast developments are taking place. A growing number of systems is currently under development, both in academic and commercial environments. Consequently, a wide variety of ad hoc architectures has come into existence. Reference models are necessary, however, to al

  9. Workflow Automation: A Collective Case Study

    Science.gov (United States)

    Harlan, Jennifer

    2013-01-01

    Knowledge management has proven to be a sustainable competitive advantage for many organizations. Knowledge management systems are abundant, with multiple functionalities. The literature reinforces the use of workflow automation with knowledge management systems to benefit organizations; however, it was not known if process automation yielded…

  10. Adaptive workflow simulation of emergency response

    NARCIS (Netherlands)

    Bruinsma, Guido Wybe Jan

    2010-01-01

    Recent incidents and major training exercises in and outside the Netherlands have persistently shown that not having or not sharing information during emergency response are major sources of emergency response inefficiency and error, and affect incident mitigation outcomes through workflow planning

  11. Text mining for the biocuration workflow.

    Science.gov (United States)

    Hirschman, Lynette; Burns, Gully A P C; Krallinger, Martin; Arighi, Cecilia; Cohen, K Bretonnel; Valencia, Alfonso; Wu, Cathy H; Chatr-Aryamontri, Andrew; Dowell, Karen G; Huala, Eva; Lourenço, Anália; Nash, Robert; Veuthey, Anne-Lise; Wiegers, Thomas; Winter, Andrew G

    2012-01-01

    Molecular biology has become heavily dependent on biological knowledge encoded in expert curated biological databases. As the volume of biological literature increases, biocurators need help in keeping up with the literature; (semi-) automated aids for biocuration would seem to be an ideal application for natural language processing and text mining. However, to date, there have been few documented successes for improving biocuration throughput using text mining. Our initial investigations took place for the workshop on 'Text Mining for the BioCuration Workflow' at the third International Biocuration Conference (Berlin, 2009). We interviewed biocurators to obtain workflows from eight biological databases. This initial study revealed high-level commonalities, including (i) selection of documents for curation; (ii) indexing of documents with biologically relevant entities (e.g. genes); and (iii) detailed curation of specific relations (e.g. interactions); however, the detailed workflows also showed many variabilities. Following the workshop, we conducted a survey of biocurators. The survey identified biocurator priorities, including the handling of full text indexed with biological entities and support for the identification and prioritization of documents for curation. It also indicated that two-thirds of the biocuration teams had experimented with text mining and almost half were using text mining at that time. Analysis of our interviews and survey provide a set of requirements for the integration of text mining into the biocuration workflow. These can guide the identification of common needs across curated databases and encourage joint experimentation involving biocurators, text mining developers and the larger biomedical research community.

  12. From Requirements via Colored Workflow Nets to an Implementation in Several Workflow Systems

    DEFF Research Database (Denmark)

    Mans, Ronny S.; van der Aalst, Willibrordus Martinus Pancratius; Molemann, A.J.;

    2007-01-01

    Care organizations, such as hospitals, need to support complex and dynamic workflows. Moreover, many disciplines are involved. This makes it important to avoid the typical disconnect between requirements and the actual implementation of the system. This paper proposes an approach where an Executable Use Case (EUC) and Colored Workflow Net (CWN) are used to close the gap between the given requirements specification and the realization of these requirements with the help of a workflow system. This paper describes a large case study where the diagnostic trajectory…

  13. The application of workflows to digital heritage systems

    OpenAIRE

    Al-Barakati, Abdullah

    2012-01-01

    Digital heritage systems usually handle a rich and varied mix of digital objects, accompanied by complex and intersecting workflows and processes. However, they usually lack effective workflow management within their components as evident in the lack of integrated solutions that include workflow components. There are a number of reasons for this limitation in workflow management utilization including some technical challenges, the unique nature of each digital resource and the challenges impo...

  14. Understanding dental CAD/CAM for restorations--the digital workflow from a mechanical engineering viewpoint.

    Science.gov (United States)

    Tapie, L; Lebon, N; Mawussi, B; Fron Chabouis, H; Duret, F; Attal, J-P

    2015-01-01

    As digital technology infiltrates every area of daily life, including the field of medicine, so it is increasingly being introduced into dental practice. Apart from chairside practice, computer-aided design/computer-aided manufacturing (CAD/CAM) solutions are available for creating inlays, crowns, fixed partial dentures (FPDs), implant abutments, and other dental prostheses. CAD/CAM dental solutions can be considered a chain of digital devices and software for the almost automatic design and creation of dental restorations. However, dentists who want to use the technology often do not have the time or knowledge to understand it. A basic knowledge of the CAD/CAM digital workflow for dental restorations can help dentists to grasp the technology and purchase a CAD/CAM system that meets the needs of their office. This article provides a computer-science and mechanical-engineering approach to the CAD/CAM digital workflow to help dentists understand the technology.

  15. From Data to Knowledge to Discoveries: Artificial Intelligence and Scientific Workflows

    Directory of Open Access Journals (Sweden)

    Yolanda Gil

    2009-01-01

    Full Text Available Scientific computing has entered a new era of scale and sharing with the arrival of cyberinfrastructure facilities for computational experimentation. A key emerging concept is scientific workflows, which provide a declarative representation of complex scientific applications that can be automatically managed and executed in distributed shared resources. In the coming decades, computational experimentation will push the boundaries of current cyberinfrastructure in terms of inter-disciplinary scope and integrative models of scientific phenomena under study. This paper argues that knowledge-rich workflow environments will provide necessary capabilities for that vision by assisting scientists to validate and vet complex analysis processes and by automating important aspects of scientific exploration and discovery.

  16. Flexible workflow sharing and execution services for e-scientists

    Science.gov (United States)

    Kacsuk, Péter; Terstyanszky, Gábor; Kiss, Tamas; Sipos, Gergely

    2013-04-01

    The sequence of computational and data manipulation steps required to perform a specific scientific analysis is called a workflow. Workflows that orchestrate data and/or compute intensive applications on Distributed Computing Infrastructures (DCIs) recently became standard tools in e-science. At the same time, the broad and fragmented landscape of workflows and DCIs slows down the uptake of workflow-based work. The development, sharing, integration and execution of workflows are still a challenge for many scientists. The FP7 "Sharing Interoperable Workflow for Large-Scale Scientific Simulation on Available DCIs" (SHIWA) project significantly improved the situation, with a simulation platform that connects different workflow systems, different workflow languages, different DCIs and workflows into a single, interoperable unit. The SHIWA Simulation Platform is a service package, already used by various scientific communities, and used as a tool by the recently started ER-flow FP7 project to expand the use of workflows among European scientists. The presentation will introduce the SHIWA Simulation Platform and the services that ER-flow provides based on the platform to space and earth science researchers. The SHIWA Simulation Platform includes: 1. SHIWA Repository: a database where workflows and meta-data about workflows can be stored; it serves as a central repository for discovering and sharing workflows within and among communities. 2. SHIWA Portal: a web portal that is integrated with the SHIWA Repository and includes a workflow executor engine that can orchestrate various types of workflows on various grid and cloud platforms. 3. SHIWA Desktop: a desktop environment that provides similar access capabilities to the SHIWA Portal, but runs on the users' desktops/laptops instead of a portal server. 4. Workflow engines: the ASKALON, Galaxy, GWES, Kepler, LONI Pipeline, MOTEUR, Pegasus, P-GRADE, ProActive, Triana, Taverna and WS-PGRADE workflow engines are already…

  17. Solutions for complex, multi data type and multi tool analysis: principles and applications of using workflow and pipelining methods.

    Science.gov (United States)

    Munro, Robin E J; Guo, Yike

    2009-01-01

    Analytical workflow technology, sometimes also called data pipelining, is the fundamental component that provides the scalable analytical middleware that can be used to enable the rapid building and deployment of an analytical application. Analytical workflows enable researchers, analysts and informaticians to integrate and access data and tools from structured and non-structured data sources so that analytics can bridge different silos of information; compose multiple analytical methods and data transformations without coding; rapidly develop applications and solutions by visually constructing analytical workflows that are easy to revise should the requirements change; access domain-specific extensions for specific projects or areas, for example, text extraction, visualisation, reporting, genetics, cheminformatics, bioinformatics and patient-based analytics; automatically deploy workflows directly into web portals and as web services to be part of a service-oriented architecture (SOA). By performing workflow building, using a middleware layer for data integration, it is a relatively simple exercise to visually design an analytical process for data analysis and then publish this as a service to a web browser. All this is encapsulated into what can be referred to as an 'Embedded Analytics' methodology which will be described here with examples covering different scientifically focused data analysis problems.
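
    A minimal sketch of the data-pipelining idea described above (composing analytical steps without custom glue code), chaining plain transformation functions into a reusable workflow. The step names and data are invented for illustration and are not taken from any product mentioned in the record.

        # Data-pipelining sketch (hypothetical steps)
        from functools import reduce

        def pipeline(*steps):
            """Compose processing steps into a single callable workflow."""
            return lambda data: reduce(lambda acc, step: step(acc), steps, data)

        def parse_records(text):      # step 1: raw text -> rows
            return [line.split(",") for line in text.strip().splitlines()]

        def keep_complete(rows):      # step 2: drop malformed rows
            return [r for r in rows if len(r) == 3]

        def summarize(rows):          # step 3: rows -> report
            return {"row_count": len(rows)}

        workflow = pipeline(parse_records, keep_complete, summarize)
        print(workflow("a,b,c\nx,y\nd,e,f"))  # {'row_count': 2}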

  18. An Approach for Automatic Web Services Composition Based on Fluent Calculus

    Institute of Scientific and Technical Information of China (English)

    陈志勇; 李庆忠; 王文明; 崔立真; 丛国进

    2013-01-01

    In recent years, semantics-based Web services composition, especially automated composition methods, has become a popular topic in the research area of Service Computing. This paper identifies a mapping between an OWL-S process ontology and fluent calculus concepts, and presents an algorithm to translate OWL-S service descriptions into an equivalent fluent calculus service specification. On this basis, it proposes a novel approach for automatic Web service composition based on the fluent calculus formalism, in which the composition process is viewed as an AI planning problem. We show how the planning capabilities of the fluent calculus can be used to automatically generate an abstract composition model from user personalized requests. The method applies the principle of progression for reasoning about states and actions, yielding higher efficiency than traditional AI planning algorithms based on the situation calculus. To test the composition method, we designed and implemented an experimental prototype and demonstrated its effectiveness with a travel itinerary planning example. A performance comparison of the proposed BCABFC (Backward-Chaining Algorithm Based on Fluent Calculus) algorithm against similar situation-calculus-based algorithms shows that it performs favourably.
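
    To make the backward-chaining idea concrete, here is a minimal sketch of goal-directed service composition: starting from the desired output type, it looks for services whose outputs satisfy the goal and recurses on their inputs. This is an illustrative simplification of planning-based composition, not the BCABFC algorithm itself; the service catalogue and type names are invented.

        # Hypothetical service catalogue: name -> (input types, output type)
        SERVICES = {
            "geocode":   ({"city_name"}, "coordinates"),
            "weather":   ({"coordinates"}, "forecast"),
            "plan_trip": ({"forecast", "budget"}, "itinerary"),
        }

        def compose(goal, available, chain=None):
            """Backward-chain from the goal type to the types already available."""
            chain = chain or []
            if goal in available:
                return chain
            for name, (inputs, output) in SERVICES.items():
                if output == goal and name not in chain:
                    plan = chain + [name]
                    for needed in inputs:
                        sub = compose(needed, available, plan)
                        if sub is None:
                            break
                        plan = sub
                    else:
                        return plan
            return None

        print(compose("itinerary", {"city_name", "budget"}))
        # ['plan_trip', 'weather', 'geocode'] -- execute in reverse order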

  19. geoKepler Workflow Module for Computationally Scalable and Reproducible Geoprocessing and Modeling

    Science.gov (United States)

    Cowart, C.; Block, J.; Crawl, D.; Graham, J.; Gupta, A.; Nguyen, M.; de Callafon, R.; Smarr, L.; Altintas, I.

    2015-12-01

    The NSF-funded WIFIRE project has developed an open-source, online geospatial workflow platform for unifying geoprocessing tools and models for fire and other geospatially dependent modeling applications. It is a product of WIFIRE's objective to build an end-to-end cyberinfrastructure for real-time and data-driven simulation, prediction and visualization of wildfire behavior. geoKepler includes a set of reusable GIS components, or actors, for the Kepler Scientific Workflow System (https://kepler-project.org). Actors exist for reading and writing GIS data in formats such as Shapefile, GeoJSON, and KML, and for using OGC web services such as WFS. The actors also allow for calling geoprocessing tools in other packages such as GDAL and GRASS. Kepler integrates functions from multiple platforms and file formats into one framework, thus enabling optimal GIS interoperability, model coupling, and scalability. Products of the GIS actors can be fed directly to models such as FARSITE and WRF. Kepler's ability to schedule and scale processes using Hadoop and Spark also makes geoprocessing extensible and computationally scalable. The reusable workflows in geoKepler can be made to run automatically when alerted by real-time environmental conditions. Here, we show breakthroughs in the speed of creating complex data for hazard assessments with this platform. We also demonstrate geoKepler workflows that use Data Assimilation to ingest real-time weather data into wildfire simulations, and data mining techniques to gain insight into environmental conditions affecting fire behavior. Existing machine learning tools and libraries such as R and MLlib are being leveraged for this purpose in Kepler, as well as Kepler's Distributed Data Parallel (DDP) capability to provide a framework for scalable processing. geoKepler workflows can be executed via an iPython notebook as part of a Jupyter hub at UC San Diego for sharing and reporting of the scientific analysis and results from…

  20. Optimizing CyberShake Seismic Hazard Workflows for Large HPC Resources

    Science.gov (United States)

    Callaghan, S.; Maechling, P. J.; Juve, G.; Vahi, K.; Deelman, E.; Jordan, T. H.

    2014-12-01

    The CyberShake computational platform is a well-integrated collection of scientific software and middleware that calculates 3D simulation-based probabilistic seismic hazard curves and hazard maps for the Los Angeles region. Currently each CyberShake model comprises about 235 million synthetic seismograms from about 415,000 rupture variations computed at 286 sites. CyberShake integrates large-scale parallel and high-throughput serial seismological research codes into a processing framework in which early stages produce files used as inputs by later stages. Scientific workflow tools are used to manage the jobs, data, and metadata. The Southern California Earthquake Center (SCEC) developed the CyberShake platform using USC High Performance Computing and Communications systems and open-science NSF resources. CyberShake calculations were migrated to the NSF Track 1 system NCSA Blue Waters when it became operational in 2013, via an interdisciplinary team approach including domain scientists, computer scientists, and middleware developers. Due to the excellent performance of Blue Waters and CyberShake software optimizations, we reduced the makespan (a measure of wallclock time-to-solution) of a CyberShake study from 1467 to 342 hours, more than a fourfold improvement. We will describe the technical enhancements behind this improvement, including judicious introduction of new GPU software, improved scientific software components, increased workflow-based automation, and Blue Waters-specific workflow optimizations. Our CyberShake performance improvements highlight the benefits of scientific workflow tools. The CyberShake workflow software stack includes the Pegasus Workflow Management System (Pegasus-WMS, which includes Condor DAGMan), HTCondor, and Globus GRAM, with Pegasus-mpi-cluster managing the high-throughput tasks on the HPC resources. The workflow tools handle data management, automatically transferring about 13 TB back to SCEC storage. We will present performance metrics from the most recent Cyber…

  1. Applying direct observation to model workflow and assess adoption.

    Science.gov (United States)

    Unertl, Kim M; Weinger, Matthew B; Johnson, Kevin B

    2006-01-01

    Lack of understanding about workflow can impair health IT system adoption. Observational techniques can provide valuable information about clinical workflow. A pilot study using direct observation was conducted in an outpatient chronic disease clinic. The goals of the study were to assess workflow and information flow and to develop a general model of workflow and information behavior. Over 55 hours of direct observation showed that the pilot site utilized many of the features of the informatics systems available to them, but also employed multiple non-electronic artifacts and workarounds. Gaps existed between clinic workflow and informatics tool workflow, as well as between institutional expectations of informatics tool use and actual use. Concurrent use of both paper-based and electronic systems resulted in duplication of effort and inefficiencies. A relatively short period of direct observation revealed important information about workflow and informatics tool adoption.

  2. Evaluation of Workflow Management Systems - A Meta Model Approach

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    1998-11-01

    Full Text Available The automated enactment of processes through the use of workflow management systems enables the outsourcing of the control flow from application systems. By now, a large number of systems that follow different workflow paradigms are available. This leads to the problem of selecting the appropriate workflow management system for a given situation. In this paper we outline the benefits of a meta model approach for the evaluation and comparison of different workflow management systems. After a general introduction on the topic of meta modeling, the meta models of the workflow management systems WorkParty (Siemens Nixdorf) and FlowMark (IBM) are compared as an example. These product-specific meta models can be generalized to meta reference models, which helps to specify a workflow methodology. As an example, an organisational reference meta model is presented, which helps users in specifying their requirements for a workflow management system.

  3. SegMine workflows for semantic microarray data analysis in Orange4WS

    Directory of Open Access Journals (Sweden)

    Kulovesi Kimmo

    2011-10-01

    Full Text Available Abstract Background In experimental data analysis, bioinformatics researchers increasingly rely on tools that enable the composition and reuse of scientific workflows. The utility of current bioinformatics workflow environments can be significantly increased by offering advanced data mining services as workflow components. Such services can support, for instance, knowledge discovery from diverse distributed data and knowledge sources (such as GO, KEGG, PubMed, and experimental databases). Specifically, cutting-edge data analysis approaches, such as semantic data mining, link discovery, and visualization, have not yet been made available to researchers investigating complex biological datasets. Results We present a new methodology, SegMine, for semantic analysis of microarray data by exploiting general biological knowledge, and a new workflow environment, Orange4WS, with integrated support for web services in which the SegMine methodology is implemented. The SegMine methodology consists of two main steps. First, the semantic subgroup discovery algorithm is used to construct elaborate rules that identify enriched gene sets. Then, a link discovery service is used for the creation and visualization of new biological hypotheses. The utility of SegMine, implemented as a set of workflows in Orange4WS, is demonstrated in two microarray data analysis applications. In the analysis of senescence in human stem cells, the use of SegMine resulted in three novel research hypotheses that could improve understanding of the underlying mechanisms of senescence and identification of candidate marker genes. Conclusions Compared to the available data analysis systems, SegMine offers improved hypothesis generation and data interpretation for bioinformatics in an easy-to-use integrated workflow environment.

  4. Statistical modeling and recognition of surgical workflow.

    Science.gov (United States)

    Padoy, Nicolas; Blum, Tobias; Ahmadi, Seyed-Ahmad; Feussner, Hubertus; Berger, Marie-Odile; Navab, Nassir

    2012-04-01

    In this paper, we contribute to the development of context-aware operating rooms by introducing a novel approach to modeling and monitoring the workflow of surgical interventions. We first propose a new representation of interventions in terms of multidimensional time-series formed by synchronized signals acquired over time. We then introduce methods based on Dynamic Time Warping and Hidden Markov Models to analyze and process this data. This results in workflow models combining low-level signals with high-level information such as predefined phases, which can be used to detect actions and trigger an event. Two methods are presented to train these models, using either fully or partially labeled training surgeries. Results are given based on tool usage recordings from sixteen laparoscopic cholecystectomies performed by several surgeons.
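
    As a rough illustration of one technique named above, the sketch below computes a Dynamic Time Warping distance between two synchronized one-dimensional signals, the kind of alignment that allows surgeries performed at different speeds to be compared. This is generic textbook DTW, not the authors' implementation.

        def dtw_distance(a, b):
            """Classic O(len(a)*len(b)) dynamic time warping distance."""
            n, m = len(a), len(b)
            inf = float("inf")
            cost = [[inf] * (m + 1) for _ in range(n + 1)]
            cost[0][0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    d = abs(a[i - 1] - b[j - 1])
                    cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                         cost[i][j - 1],      # deletion
                                         cost[i - 1][j - 1])  # match
            return cost[n][m]

        # Two tool-usage signals recorded at different speeds still align closely:
        print(dtw_distance([0, 1, 1, 2, 3], [0, 1, 2, 2, 3]))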

  5. Grid workflow job execution service 'Pilot'

    Science.gov (United States)

    Shamardin, Lev; Kryukov, Alexander; Demichev, Andrey; Ilyin, Vyacheslav

    2011-12-01

    'Pilot' is a grid job execution service for workflow jobs. The main goal for the service is to automate computations with multiple stages since they can be expressed as simple workflows. Each job is a directed acyclic graph of tasks and each task is an execution of something on a grid resource (or 'computing element'). Tasks may be submitted to any WS-GRAM (Globus Toolkit 4) service. The target resources for the tasks execution are selected by the Pilot service from the set of available resources which match the specific requirements from the task and/or job definition. Some simple conditional execution logic is also provided. The 'Pilot' service is built on the REST concepts and provides a simple API through authenticated HTTPS. This service is deployed and used in production in a Russian national grid project GridNNN.
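
    The record describes each job as a directed acyclic graph of tasks. A minimal sketch of that core idea (running tasks only after their prerequisites complete) follows; the task graph and runner are invented for illustration and are not the actual Pilot API.

        from graphlib import TopologicalSorter  # standard library, Python 3.9+

        # Hypothetical workflow job: task -> set of prerequisite tasks
        job = {
            "stage_in":  set(),
            "simulate":  {"stage_in"},
            "analyze":   {"simulate"},
            "stage_out": {"analyze"},
        }

        def run_task(name):
            print(f"submitting {name} to a computing element")

        # Execute tasks in an order compatible with the DAG's dependencies.
        for task in TopologicalSorter(job).static_order():
            run_task(task)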

  6. IDD Archival Hardware Architecture and Workflow

    Energy Technology Data Exchange (ETDEWEB)

    Mendonsa, D; Nekoogar, F; Martz, H

    2008-10-09

    This document describes the functionality of every component in the DHS/IDD archival and storage hardware system shown in Fig. 1. The document describes the step-by-step process of image data being received at LLNL, then processed and made available to authorized personnel and collaborators. Throughout this document, references will be made to one of two figures: Fig. 1, describing the elements of the architecture, and Fig. 2, describing the workflow and how the project utilizes the available hardware.

  7. Quantifying nursing workflow in medication administration.

    Science.gov (United States)

    Keohane, Carol A; Bane, Anne D; Featherstone, Erica; Hayes, Judy; Woolf, Seth; Hurley, Ann; Bates, David W; Gandhi, Tejal K; Poon, Eric G

    2008-01-01

    New medication administration systems are showing promise in improving patient safety at the point of care, but adoption of these systems requires significant changes in nursing workflow. To prepare for these changes, the authors report on a time-motion study that measured the proportion of time that nurses spend on various patient care activities, focusing on medication administration-related activities. Implications of their findings are discussed.

  8. Tailored business solutions by workflow technologies

    Directory of Open Access Journals (Sweden)

    Alexandra Fortiş

    2006-01-01

    Full Text Available VISP (Virtual Internet Service Provider) is an IST-STREP project conducting research in the field of workflow technologies, targeted at telecom/ISP companies. One of the first tasks of the VISP project is to identify the most appropriate technologies in order to construct the VISP platform. This paper presents the most significant results in the field of choreography and orchestration, two key domains that must accompany process modeling in the construction of a workflow environment.

  9. SPECT/CT workflow and imaging protocols

    Energy Technology Data Exchange (ETDEWEB)

    Beckers, Catherine [University Hospital of Liege, Division of Nuclear Medicine and Oncological Imaging, Department of Medical Physics, Liege (Belgium); Hustinx, Roland [University Hospital of Liege, Division of Nuclear Medicine and Oncological Imaging, Department of Medical Physics, Liege (Belgium); Domaine Universitaire du Sart Tilman, Service de Medecine Nucleaire et Imagerie Oncologique, CHU de Liege, Liege (Belgium)

    2014-05-15

    Introducing a hybrid imaging method such as single photon emission computed tomography (SPECT)/CT greatly alters the routine in the nuclear medicine department. It requires designing new workflow processes and the revision of original scheduling process and imaging protocols. In addition, the imaging protocol should be adapted for each individual patient, so that performing CT is fully justified and the CT procedure is fully tailored to address the clinical issue. Such refinements often occur before the procedure is started but may be required at some intermediate stage of the procedure. Furthermore, SPECT/CT leads in many instances to a new partnership with the radiology department. This article presents practical advice and highlights the key clinical elements which need to be considered to help understand the workflow process of SPECT/CT and optimise imaging protocols. The workflow process using SPECT/CT is complex in particular because of its bimodal character, the large spectrum of stakeholders, the multiplicity of their activities at various time points and the need for real-time decision-making. With help from analytical tools developed for quality assessment, the workflow process using SPECT/CT may be separated into related, but independent steps, each with its specific human and material resources to use as inputs or outputs. This helps identify factors that could contribute to failure in routine clinical practice. At each step of the process, practical aspects to optimise imaging procedure and protocols are developed. A decision-making algorithm for justifying each CT indication as well as the appropriateness of each CT protocol is the cornerstone of routine clinical practice using SPECT/CT. In conclusion, implementing hybrid SPECT/CT imaging requires new ways of working. It is highly rewarding from a clinical perspective, but it also proves to be a daily challenge in terms of management. (orig.)

  10. EDMS based workflow for Printing Industry

    OpenAIRE

    Prathap Nayak; Anuradha Rao; Ramakrishna Nayak

    2013-01-01

    Information is an indispensable factor in any enterprise. It can be a record or a document generated for every transaction made, kept either on paper or in electronic format for future reference. The printing industry is one in which managing information of various formats, with the latest workflows and technologies, can be a nightmare and a challenge for any operator or user, when each process, from the least bit of information to a printed product, is always depende...

  11. Text-mining-assisted biocuration workflows in Argo.

    Science.gov (United States)

    Rak, Rafal; Batista-Navarro, Riza Theresa; Rowley, Andrew; Carter, Jacob; Ananiadou, Sophia

    2014-01-01

    Biocuration activities have been broadly categorized into the selection of relevant documents, the annotation of biological concepts of interest and identification of interactions between the concepts. Text mining has been shown to have a potential to significantly reduce the effort of biocurators in all the three activities, and various semi-automatic methodologies have been integrated into curation pipelines to support them. We investigate the suitability of Argo, a workbench for building text-mining solutions with the use of a rich graphical user interface, for the process of biocuration. Central to Argo are customizable workflows that users compose by arranging available elementary analytics to form task-specific processing units. A built-in manual annotation editor is the single most used biocuration tool of the workbench, as it allows users to create annotations directly in text, as well as modify or delete annotations created by automatic processing components. Apart from syntactic and semantic analytics, the ever-growing library of components includes several data readers and consumers that support well-established as well as emerging data interchange formats such as XMI, RDF and BioC, which facilitate the interoperability of Argo with other platforms or resources. To validate the suitability of Argo for curation activities, we participated in the BioCreative IV challenge whose purpose was to evaluate Web-based systems addressing user-defined biocuration tasks. Argo proved to have the edge over other systems in terms of flexibility of defining biocuration tasks. As expected, the versatility of the workbench inevitably lengthened the time the curators spent on learning the system before taking on the task, which may have affected the usability of Argo. The participation in the challenge gave us an opportunity to gather valuable feedback and identify areas of improvement, some of which have already been introduced. Database URL: http://argo.nactem.ac.uk.

  12. Tavaxy: Integrating Taverna and Galaxy workflows with cloud computing support

    Directory of Open Access Journals (Sweden)

    Abouelhoda Mohamed

    2012-05-01

    Full Text Available Abstract Background Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. Results In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on using an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Conclusions Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-)workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and

  13. Semi-automatic identification of counterfeit offers in online shopping platforms

    OpenAIRE

    Wartner, Christian; Arnold, Patrick; Rahm, Erhard

    2015-01-01

    Product counterfeiting is a serious problem causing the industry estimated losses of billions of dollars every year. With the increasing spread of e-commerce, the number of counterfeit products sold online increased substantially. We propose the adoption of a semi-automatic workflow to identify likely counterfeit offers in online platforms and to present these offers to a domain expert for manual verification. The workflow includes steps to generate search queries for relevant product offers,...

  14. From Requirements via Colored Workflow Nets to an Implementation in Several Workflow Systems

    DEFF Research Database (Denmark)

    Mans, Ronnie S:; van der Aalst, Wil M.P.; Bakker, Piet J.M.;

    2007-01-01

    Care organizations, such as hospitals, need to support complex and dynamic workflows. Moreover, many disciplines are involved. This makes it important to avoid the typical disconnect between requirements and the actual implementation of the system. This paper proposes an approach where an Executable Use Case (EUC) and a Colored Workflow Net (CWN) are used to close this gap. The diagnostic care process of the Academic Medical Center (AMC) hospital is used as reference process. The process consists of hundreds of activities. These have been modeled and analyzed using an EUC and a CWN. Moreover, based on the CWN, the process has been implemented using four different workflow systems

  15. Research on an Intelligent Automatic Turning System

    Directory of Open Access Journals (Sweden)

    Lichong Huang

    2012-12-01

    Full Text Available The equipment manufacturing industry is a strategic industry for a country, and its core component is the CNC machine tool. Enhancing independent research on CNC machine technology, especially open CNC systems, is therefore of great significance. This paper presents some key techniques of an Intelligent Automatic Turning System and gives a viable solution for system integration. First, the integrated system architecture and the flexible, efficient workflow for performing the intelligent automatic turning process are illustrated. Second, innovative methods for workpiece feature recognition and expression and for process planning of NC machining are put forward. Third, a cutting-tool auto-selection and cutting-parameter optimization solution is generated by integrated inference combining rule-based and case-based reasoning. Finally, an actual machining case based on the developed intelligent automatic turning system proves that the presented solutions are valid, practical and efficient.

  16. A Web-Based Rapid Prototyping Workflow Management Information System for Computer Repair and Maintenance

    Directory of Open Access Journals (Sweden)

    A. H. El-Mousa

    2008-01-01

    Full Text Available Problem statement: Response to paper-based requests for computer and peripheral repair and maintenance has become very troublesome and slow due to the large demand and expansion at the University of Jordan. The objectives of this study were to: (i) investigate the current system processes associated with the paper-based workflow system; (ii) design and implement, using a suitable workflow management approach, a totally electronic alternative to improve performance. Approach: The methodology followed in transforming the business processes from paper-based to electronic was the rapid prototyping workflow management approach. This approach is seen to be especially effective in cases where there is direct, continuous interaction and involvement between the different stakeholders during the lifecycle of the project. Results: The system has been implemented and tested, with the result that efficiency, accountability and response time have greatly improved in handling repair and maintenance orders. The transformation from paper to electronic resulted in greatly enhanced user satisfaction, especially since the developed system provided automatic feedback regarding order status. Conclusion: The design process and results provide a working blueprint for quickly and easily transforming various similar university-based business processes to electronic form. This should highly motivate internal in-house restructuring of business processes to utilize technology and IT to enhance performance and accountability.

  17. Integrated Cloud-Based Services for Medical Workflow Systems

    Directory of Open Access Journals (Sweden)

    Gharbi Nada

    2016-12-01

    Full Text Available Recent years have witnessed significant progress of workflow systems in different business areas. However, in the medical domain, workflow systems have been comparatively scarcely researched, although workflows there are as important as in other areas. In fact, the flow of information in the healthcare industry is even more critical than in other industries. Workflow can provide a new way of looking at how processes and procedures are completed in particular medical systems, and it can help improve decision-making in these systems. Despite the potential capabilities of workflow systems, medical systems still often face critical challenges in maintaining patient medical information, resulting in difficulties in accessing patient data from different systems. In this paper, a new cloud-based service-oriented architecture is proposed. This architecture will support a medical workflow system integrated with cloud services aligned with medical standards to improve the healthcare system.

  18. Wildfire: distributed, Grid-enabled workflow construction and execution

    Directory of Open Access Journals (Sweden)

    Issac Praveen

    2005-03-01

    Full Text Available Abstract Background We observe two trends in bioinformatics: (i) analyses are increasing in complexity, often requiring several applications to be run as a workflow; and (ii) multiple CPU clusters and Grids are available to more scientists. The traditional solution to the problem of running workflows across multiple CPUs required programming, often in a scripting language such as Perl. Programming places such solutions beyond the reach of many bioinformatics consumers. Results We present Wildfire, a graphical user interface for constructing and running workflows. Wildfire borrows user interface features from Jemboss and adds a drag-and-drop interface allowing the user to compose EMBOSS (and other) programs into workflows. For execution, Wildfire uses GEL, the underlying workflow execution engine, which can exploit available parallelism on multiple-CPU machines, including Beowulf-class clusters and Grids. Conclusion Wildfire simplifies the tasks of constructing and executing bioinformatics workflows.

  19. Workflow-based Context-aware Control of Surgical Robots

    OpenAIRE

    Beyl, Tim

    2015-01-01

    Surgical assistance systems such as medical robots have enhanced the capabilities of medical procedures in the last decades. This work presents a new perspective on the use of workflows with surgical robots in order to improve the technical capabilities and the ease of use of such systems. This is accomplished by a 3D perception system for the supervision of the surgical operating room and a workflow-based controller that allows the surgical process to be monitored using workflow-tracking techniques.

  20. Electronic resource management systems a workflow approach

    CERN Document Server

    Anderson, Elsa K

    2014-01-01

    To get to the bottom of a successful approach to Electronic Resource Management (ERM), Anderson interviewed staff at 11 institutions about their ERM implementations. Among her conclusions, presented in this issue of Library Technology Reports, is that grasping the intricacies of your workflow (analyzing each step to reveal the gaps and problems) at the beginning is crucial to selecting and implementing an ERM. Whether the system will be used to fill a gap, aggregate critical data, or replace a tedious manual process, the best solution for your library depends on factors such as your current soft

  1. Research on an Integrated Enterprise Workflow Model

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    An integrated enterprise workflow model called PPROCE is presented first. Then, an enterprise ontology established with TOVE and the Process Specification Language (PSL) is studied. Combined with TOVE's partition idea, PSL is extended and new PSL Extensions are created to define the ontology of process, organization, resource and product in the PPROCE model. As a result, the PPROCE model can be defined by a set of corresponding formal languages. This facilitates future work not only in model verification, optimization and simulation, but also in model translation.

  2. CMS data and workflow management system

    CERN Document Server

    Fanfani, A; Bacchi, W; Codispoti, G; De Filippis, N; Pompili, A; My, S; Abbrescia, M; Maggi, G; Donvito, G; Silvestris, L; Calzolari, F; Sarkar, S; Spiga, D; Cinquili, M; Lacaprara, S; Biasotto, M; Farina, F; Merlo, M; Belforte, S; Kavka, C; Sala, L; Harvey, J; Hufnagel, D; Fanzago, F; Corvo, M; Magini, N; Rehn, J; Toteva, Z; Feichtinger, D; Tuura, L; Eulisse, G; Bockelman, B; Lundstedt, C; Egeland, R; Evans, D; Mason, D; Gutsche, O; Sexton-Kennedy, L; Dagenhart, D W; Afaq, A; Guo, Y; Kosyakov, S; Lueking, L; Sekhri, V; Fisk, I; McBride, P; Bauerdick, L; Bakken, J; Rossman, P; Wicklund, E; Wu, Y; Jones, C; Kuznetsov, V; Riley, D; Dolgert, A; van Lingen, F; Narsky, I; Paus, C; Klute, M; Gomez-Ceballos, G; Piedra-Gomez, J; Miller, M; Mohapatra, A; Lazaridis, C; Bradley, D; Elmer, P; Wildish, T; Wuerthwein, F; Letts, J; Bourilkov, D; Kim, B; Smith, P; Hernandez, J M; Caballero, J; Delgado, A; Flix, J; Cabrillo-Bartolome, I; Kasemann, M; Flossdorf, A; Stadie, H; Kreuzer, P; Khomitch, A; Hof, C; Zeidler, C; Kalini, S; Trunov, A; Saout, C; Felzmann, U; Metson, S; Newbold, D; Geddes, N; Brew, C; Jackson, J; Wakefield, S; De Weirdt, S; Adler, V; Maes, J; Van Mulders, P; Villella, I; Hammad, G; Pukhaeva, N; Kurca, T; Semneniouk, I; Guan, W; Lajas, J A; Teodoro, D; Gregores, E; Baquero, M; Shehzad, A; Kadastik, M; Kodolova, O; Chao, Y; Ming Kuo, C; Filippidis, C; Walzel, G; Han, D; Kalinowski, A; Giro de Almeida, N M; Panyam, N

    CMS expects to manage many tens of petabytes of data distributed over several computing centers around the world. The CMS distributed computing and analysis model is designed to serve, process and archive the large number of events that will be generated when the CMS detector starts taking data. The underlying concepts and the overall architecture of the CMS data and workflow management system will be presented. In addition, the experience in using the system for MC production, initial detector commissioning activities and data analysis will be summarized.

  3. Design, Modelling and Analysis of a Workflow Reconfiguration

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Abouzaid, Faisal; Dragoni, Nicola

    2011-01-01

    This paper describes a case study involving the reconfiguration of an office workflow. We state the requirements on a system implementing the workflow and its reconfiguration, and describe the system’s design in BPMN. We then use an asynchronous pi-calculus and Web.1 to model the design and to ve…

  4. Comparison of Resource Platform Selection Approaches for Scientific Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Ramakrishnan, Lavanya

    2010-03-05

    Cloud computing is increasingly considered as an additional computational resource platform for scientific workflows. The cloud offers opportunity to scale-out applications from desktops and local cluster resources. At the same time, it can eliminate the challenges of restricted software environments and queue delays in shared high performance computing environments. Choosing from these diverse resource platforms for a workflow execution poses a challenge for many scientists. Scientists are often faced with deciding resource platform selection trade-offs with limited information on the actual workflows. While many workflow planning methods have explored task scheduling onto different resources, these methods often require fine-scale characterization of the workflow that is onerous for a scientist. In this position paper, we describe our early exploratory work into using blackbox characteristics to do a cost-benefit analysis of using cloud platforms. We use only very limited high-level information on the workflow length, width, and data sizes. The length and width are indicative of the workflow duration and parallelism. The data size characterizes the IO requirements. We compare the effectiveness of this approach to other resource selection models using two exemplar scientific workflows scheduled on desktops, local clusters, HPC centers, and clouds. Early results suggest that the blackbox model often makes the same resource selections as a more fine-grained whitebox model. We believe the simplicity of the blackbox model can help inform a scientist on the applicability of cloud computing resources even before porting an existing workflow.
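
    A toy version of such a blackbox cost-benefit comparison, using only the coarse characteristics the paper mentions (workflow length, width, and data size), might look like the following. The platform parameters are invented placeholders, not measured values from the study.

        # Hypothetical platform parameters:
        # (cost per core-hour in $, cores available, $ per GB transferred, queue wait in hours)
        PLATFORMS = {
            "local_cluster": (0.00, 64, 0.00, 4.0),
            "hpc_center":    (0.00, 4096, 0.00, 12.0),
            "cloud":         (0.09, 1024, 0.10, 0.0),
        }

        def estimate(platform, length_hours, width_tasks, data_gb):
            """Blackbox estimate: makespan from length/width, dollars from usage."""
            cost_core_hr, cores, cost_gb, wait = PLATFORMS[platform]
            parallel = min(width_tasks, cores)
            makespan = wait + length_hours * (width_tasks / parallel)
            dollars = length_hours * width_tasks * cost_core_hr + data_gb * cost_gb
            return makespan, dollars

        for p in PLATFORMS:
            t, c = estimate(p, length_hours=2.0, width_tasks=512, data_gb=100.0)
            print(f"{p}: ~{t:.1f} h, ~${c:.2f}")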

  5. Business and scientific workflows a web service-oriented approach

    CERN Document Server

    Tan, Wei

    2013-01-01

    Focuses on how to use web service computing and service-based workflow technologies to develop timely, effective workflows for both business and scientific fields Utilizing web computing and Service-Oriented Architecture (SOA), Business and Scientific Workflows: A Web Service-Oriented Approach focuses on how to design, analyze, and deploy web service-based workflows for both business and scientific applications in many areas of healthcare and biomedicine. It also discusses and presents the recent research and development results. This informative reference features app

  6. Process Makna - A Semantic Wiki for Scientific Workflows

    CERN Document Server

    Paschke, Adrian

    2010-01-01

    Virtual e-Science infrastructures supporting Web-based scientific workflows are an example of knowledge-intensive collaborative and weakly-structured processes where the interaction with the human scientists during process execution plays a central role. In this paper we propose lightweight, dynamic, user-friendly interaction with humans during the execution of scientific workflows via the low-barrier approach of Semantic Wikis as an intuitive interface for non-technical scientists. Our Process Makna Semantic Wiki system is a novel combination of a business process management system adapted for scientific workflows with a Corporate Semantic Web Wiki user interface supporting knowledge-intensive human interaction tasks during scientific workflow execution.

  7. Beginning WF Windows Workflow in .NET 4.0

    CERN Document Server

    Collins, M

    2010-01-01

    Windows Workflow Foundation is a ground-breaking addition to the core of the .NET Framework that allows you to orchestrate human and system interactions as a series of workflows that can be easily mapped, analyzed, adjusted, and implemented. As business problems become more complex, the need for a workflow-based solution has never been more evident. WF provides a simple and consistent way to model and implement complex problems. As a developer, you focus on developing the business logic for individual workflow tasks. The runtime handles the execution of those tasks after they have been compose

  8. Model Checking Workflow Net Based on Petri Net

    Institute of Scientific and Technical Information of China (English)

    ZHOU Conghua; CHEN Zhenyu

    2006-01-01

    Soundness is a very important criterion for the correctness of a workflow. Specifying soundness in Computation Tree Logic (CTL) allows us to verify it with symbolic model checkers, so the state explosion problem in verifying soundness can be mitigated efficiently. When the property is not satisfied by the system, model checking can give a counter-example, which can guide us in correcting the workflow. In addition, relaxed soundness is another important criterion for workflows. We also prove that Computation Tree Logic* (CTL*) can be used to characterize the relaxed soundness of a workflow.
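
    As a small illustration of what checking workflow-net soundness involves, the sketch below does an explicit-state exploration of a toy workflow net (treating markings as sets, i.e. a 1-safe net) and verifies the core soundness condition that every reachable marking can still reach the final marking. The paper's point is that symbolic CTL model checking scales better than exactly this kind of brute-force search; the net itself is invented.

        # Toy workflow net: transition -> (places consumed, places produced)
        TRANSITIONS = {
            "register": ({"start"}, {"a"}),
            "approve":  ({"a"}, {"end"}),
            "reject":   ({"a"}, {"end"}),
        }

        def successors(marking):
            for pre, post in TRANSITIONS.values():
                if pre <= marking:  # transition is enabled
                    yield frozenset((marking - pre) | post)

        def reachable(start):
            seen, stack = {start}, [start]
            while stack:
                for n in successors(stack.pop()):
                    if n not in seen:
                        seen.add(n)
                        stack.append(n)
            return seen

        initial, final = frozenset({"start"}), frozenset({"end"})
        markings = reachable(initial)
        # Core soundness condition: the final marking stays reachable everywhere.
        sound = all(final in reachable(m) for m in markings)
        print(f"{len(markings)} reachable markings; sound: {sound}")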

  9. Facilitating Stewardship of scientific data through standards based workflows

    Science.gov (United States)

    Bastrakova, I.; Kemp, C.; Potter, A. K.

    2013-12-01

    scientific data acquisition and analysis requirements and effective interoperable data management and delivery. This includes participating in national and international dialogue on development of standards, embedding data management activities in business processes, and developing scientific staff as effective data stewards. A similar approach is applied to the geophysical data. By ensuring that the geophysical datasets at GA strictly follow metadata and industry standards, we are able to implement a provenance-based workflow where the data is easily discoverable, geophysical processing can be applied to it, and results can be stored. The provenance-based workflow enables metadata records for the results to be produced automatically from the input dataset metadata.

  10. The Prosthetic Workflow in the Digital Era

    Directory of Open Access Journals (Sweden)

    Lidia Tordiglione

    2016-01-01

    Full Text Available The purpose of this retrospective study was to clinically evaluate the benefits of adopting a full digital workflow for the implementation of fixed prosthetic restorations on natural teeth. To evaluate the effectiveness of these protocols, treatment plans were drawn up for 15 patients requiring rehabilitation of one or more natural teeth. All the dental impressions were taken using a Planmeca PlanScan® (Planmeca OY, Helsinki, Finland) intraoral scanner, which provided digital casts on which the restorations were digitally designed using Exocad® (Exocad GmbH, Germany, 2010) software and fabricated by CAM processing on 5-axis milling machines. A total of 28 single crowns were made from monolithic zirconia, 12 vestibular veneers from lithium disilicate, and 4 three-quarter vestibular veneers with palatal extension. While the restorations were applied, the authors could clinically appreciate the excellent match between the digitally produced prosthetic design and the cemented prostheses, which never required any occlusal or proximal adjustment. Out of all the restorations applied, only one exhibited premature failure and was replaced with no other complications or need for further scanning. From the clinical experience gained using a full digital workflow, the authors can confirm that these work processes enable the fabrication of clinically reliable restorations, with all the benefits that digital methods bring to the dentist, the dental laboratory, and the patient.

  11. The Prosthetic Workflow in the Digital Era

    Science.gov (United States)

    De Franco, Michele; Bosetti, Giovanni

    2016-01-01

    The purpose of this retrospective study was to clinically evaluate the benefits of adopting a full digital workflow for the implementation of fixed prosthetic restorations on natural teeth. To evaluate the effectiveness of these protocols, treatment plans were drawn up for 15 patients requiring rehabilitation of one or more natural teeth. All the dental impressions were taken using a Planmeca PlanScan® (Planmeca OY, Helsinki, Finland) intraoral scanner, which provided digital casts on which the restorations were digitally designed using Exocad® (Exocad GmbH, Germany, 2010) software and fabricated by CAM processing on 5-axis milling machines. A total of 28 single crowns were made from monolithic zirconia, 12 vestibular veneers from lithium disilicate, and 4 three-quarter vestibular veneers with palatal extension. While the restorations were applied, the authors could clinically appreciate the excellent match between the digitally produced prosthetic design and the cemented prostheses, which never required any occlusal or proximal adjustment. Out of all the restorations applied, only one exhibited premature failure and was replaced with no other complications or need for further scanning. From the clinical experience gained using a full digital workflow, the authors can confirm that these work processes enable the fabrication of clinically reliable restorations, with all the benefits that digital methods bring to the dentist, the dental laboratory, and the patient. PMID:27829834

  12. Workflow Management for a Cosmology Collaboratory

    Institute of Scientific and Technical Information of China (English)

    Stewart C. Loken; Charles McParland

    2001-01-01

    The Nearby Supernova Factory Project will provide a unique opportunity to bring together simulation and observation to address crucial problems in particle and nuclear physics. Its goal is to significantly enhance our understanding of the nuclear processes in supernovae and to improve our ability to use both Type Ia and Type II supernovae as reference light sources (standard candles) in precision measurements of cosmological parameters. Over the past several years, astronomers and astrophysicists have been conducting in-depth sky searches with the goal of identifying supernovae in their earliest evolutionary stages and, during the 4 to 8 weeks of their most "explosive" activity, measuring their changing magnitude and spectra. The search program currently under development at LBNL is an earth-based observation program utilizing observational instruments at Haleakala and Mauna Kea, Hawaii and Mt. Palomar, California. This new program provides a demanding testbed for the integration of computational, data management and collaboratory technologies. A critical element of this effort is the use of emerging workflow management tools to permit collaborating scientists to manage data processing and storage and to integrate advanced supernova simulation into the real-time control of the experiments. This paper describes the workflow management framework for the project, discusses security and resource allocation requirements and reviews emerging tools to support this important aspect of collaborative work.

  13. Delta: Data Reduction for Integrated Application Workflows.

    Energy Technology Data Exchange (ETDEWEB)

    Lofstead, Gerald Fredrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jean-Baptiste, Gregory [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Oldfield, Ron A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-06-01

    Integrated Application Workflows (IAWs) run multiple simulation workflow components concurrently on an HPC resource, connecting these components using compute area resources and compensating for any performance or data processing rate mismatches. These IAWs require high frequency and high volume data transfers between compute nodes and staging area nodes during the lifetime of a large parallel computation. The available network bandwidth between the two areas may not be enough to efficiently support the data movement. As the processing power available to compute resources increases, the requirements for this data transfer will become more difficult to satisfy, and perhaps will not be satisfiable at all, since network capabilities are not expanding at a comparable rate. Furthermore, energy consumption in HPC environments is expected to grow by an order of magnitude as exascale systems become a reality. The energy cost of moving large amounts of data frequently will contribute to this issue. It is necessary to reduce the volume of data without reducing the quality of data when it is being processed and analyzed. Delta resolves the issue by addressing the lifetime data transfer operations. Delta removes subsequent identical copies of already transmitted data during transfers and restores those copies once the data has reached the destination. Delta is able to identify duplicated information and determine the most space efficient way to represent it. Initial tests show about 50% reduction in data movement while maintaining the same data quality and transmission frequency.
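
    A minimal sketch of the core Delta idea (suppressing retransmission of blocks already sent, then restoring them at the destination) using content hashing. This is a generic deduplication illustration under assumed block semantics, not Sandia's implementation.

        import hashlib

        def send(blocks, sent_hashes):
            """Replace blocks the receiver already holds with short hash references."""
            out = []
            for b in blocks:
                h = hashlib.sha256(b).hexdigest()
                if h in sent_hashes:
                    out.append(("ref", h))    # duplicate: send reference only
                else:
                    sent_hashes.add(h)
                    out.append(("data", b))   # first occurrence: send payload
            return out

        def receive(messages, store):
            """Restore full blocks from payloads or previously stored copies."""
            result = []
            for kind, payload in messages:
                if kind == "data":
                    store[hashlib.sha256(payload).hexdigest()] = payload
                    result.append(payload)
                else:
                    result.append(store[payload])
            return result

        blocks = [b"mesh-step-1", b"constants", b"mesh-step-2", b"constants"]
        wire = send(blocks, set())
        print(sum(len(p) for k, p in wire if k == "data"), "payload bytes sent")
        assert receive(wire, {}) == blocks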

  14. Scalable Scientific Workflows Management System SWFMS

    Directory of Open Access Journals (Sweden)

    M. Abdul Rahman

    2016-11-01

    Full Text Available In today's electronic world, conducting scientific experiments, especially in the natural sciences, has become more and more challenging for domain scientists, since "science" today is more complex owing to a two-dimensional intricacy: first, assorted and complex computational (analytical) applications; and second, the increasingly large volume and heterogeneity of the scientific data products processed by these applications. Furthermore, the involvement of an increasingly large number of scientific instruments, such as sensors and machines, makes scientific data management even more challenging, since the data generated by such instruments are highly complex. To reduce the complexity of conducting scientific experiments as far as possible, an integrated framework that transparently implements the conceptual separation between the two dimensions is direly needed. To facilitate scientific experiments, "workflow" technology has in recent years emerged in scientific disciplines such as biology, bioinformatics, geology, environmental science, and eco-informatics. Much research work has been done to develop scientific workflow systems. However, our analysis of these existing systems shows that they lack a well-structured conceptual modeling methodology for dealing with the two complex dimensions in a transparent manner. This paper presents a scientific workflow framework that properly addresses both dimensional complexities.

  15. Workflow-Based Software Development Environment

    Science.gov (United States)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment).

  16. Workflow management for a cosmology collaboratory

    Energy Technology Data Exchange (ETDEWEB)

    Loken, Stewart C.; McParland, Charles

    2001-07-20

    The Nearby Supernova Factory Project will provide a unique opportunity to bring together simulation and observation to address crucial problems in particle and nuclear physics. Its goal is to significantly enhance our understanding of the nuclear processes in supernovae and to improve our ability to use both Type Ia and Type II supernovae as reference light sources (standard candles) in precision measurements of cosmological parameters. Over the past several years, astronomers and astrophysicists have been conducting in-depth sky searches with the goal of identifying supernovae in their earliest evolutionary stages and, during the 4 to 8 weeks of their most ''explosive'' activity, measure their changing magnitude and spectra. The search program currently under development at LBNL is an earth-based observation program utilizing observational instruments at Haleakala and Mauna Kea, Hawaii and Mt. Palomar, California. This new program provides a demanding testbed for the integration of computational, data management and collaboratory technologies. A critical element of this effort is the use of emerging workflow management tools to permit collaborating scientists to manage data processing and storage and to integrate advanced supernova simulation into the real-time control of the experiments. This paper describes the workflow management framework for the project, discusses security and resource allocation requirements and reviews emerging tools to support this important aspect of collaborative work.

  17. Automatic Reading

    Institute of Scientific and Technical Information of China (English)

    胡迪

    2007-01-01

    Reading is the key to school success and, like any skill, it takes practice. A child learns to walk by practising until he no longer has to think about how to put one foot in front of the other. The great athlete practises until he can play quickly, accurately and without thinking. Educators call it automaticity.

  18. From Process Orientation to Workflow Management - Part 2: Process Management, Workflow Management, Workflow Management Systems

    OpenAIRE

    Maurer, Gerd

    1996-01-01

    The terms process orientation, process management, workflow management, and workflow management systems are still not clearly defined and delimited from one another. Starting from a specific understanding of process orientation (Arbeitspapier WI Nr. 9/1996), process management is defined as a comprehensive approach to the process-oriented design and management of enterprises. Workflow management constitutes the more formal, strongly IT-related component of process management and...

  19. Modelling and analysis of workflow for lean supply chains

    Science.gov (United States)

    Ma, Jinping; Wang, Kanliang; Xu, Lida

    2011-11-01

    Cross-organisational workflow systems are a component of enterprise information systems which support collaborative business processes among organisations in a supply chain. Currently, the majority of workflow systems are developed from the perspective of information modelling, without considering the actual requirements of supply chain management. In this article, we focus on the modelling and analysis of cross-organisational workflow systems in the context of the lean supply chain (LSC) using Petri nets. First, the article describes the assumed conditions of a cross-organisational workflow net according to the idea of the LSC and then discusses the standardisation of the collaborative business process between organisations in the context of the LSC. Second, the concept of labelled time Petri nets (LTPNs) is defined by combining labelled Petri nets with time Petri nets, and the concept of labelled time workflow nets (LTWNs) is also defined based on LTPNs. Cross-organisational labelled time workflow nets (CLTWNs) are then defined based on LTWNs. Third, the article proposes the notion of OR-silent CLTWNs and an approach to verifying the soundness of LTWNs and CLTWNs. Finally, this article illustrates how to use the proposed method with a simple example. The purpose of this research is to establish a formal method for the modelling and analysis of workflow systems for the LSC. This study initiates a new perspective of research on cross-organisational workflow management and promotes operation management of the LSC in real-world settings.
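
    The LTWN machinery builds on ordinary workflow nets. As a minimal sketch of that substrate (a hypothetical three-step order process; the labels and timing of the paper's LTPN formalism are omitted), the following Python fragment enumerates reachable markings by breadth-first search, the basic step behind soundness checks:

```python
from collections import deque

# A minimal workflow net: transitions consume tokens from input places and
# produce tokens on output places. Labels and timing are omitted for brevity.
TRANSITIONS = {
    "receive_order": ({"start": 1}, {"check": 1}),
    "check_stock":   ({"check": 1}, {"ship": 1}),
    "ship_goods":    ({"ship": 1},  {"end": 1}),
}

def enabled(marking, pre):
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

def reachable(initial):
    """Breadth-first search over the markings (states) of the net."""
    seen, queue = set(), deque([initial])
    while queue:
        m = queue.popleft()
        key = tuple(sorted(m.items()))
        if key in seen:
            continue
        seen.add(key)
        for pre, post in TRANSITIONS.values():
            if enabled(m, pre):
                queue.append(fire(m, pre, post))
    return seen

marks = reachable({"start": 1})
print(f"{len(marks)} reachable markings; final marking reached:",
      any(dict(m).get("end", 0) == 1 for m in marks))
```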

  20. Research of Web-based Workflow Management System

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The state of the art of workflow management techniques in research is introduced. The research and development trends of Workflow Management Systems (WFMS) are presented. On the basis of analysis and comparison of various kinds of WFMSs, a WFMS based on Web technology and distributed object management is proposed. Finally, the application of the WFMS in supply chain management is described in detail.

  1. Workflow automation based on OSI job transfer and manipulation

    NARCIS (Netherlands)

    Sinderen, van Marten J.; Joosten, Stef M.M.; Guareis de Farias, Clever R.

    1999-01-01

    This paper shows that Workflow Management Systems (WFMS) and a data communication standard called Job Transfer and Manipulation (JTM) are built on the same concepts, even though different words are used. The paper analyses the correspondence of workflow concepts and JTM concepts. Besides, the corres

  2. An architecture including network QoS in scientific workflows

    NARCIS (Netherlands)

    Zhao, Z.; Grosso, P.; Koning, R.; van der Ham, J.; de Laat, C.

    2010-01-01

    The quality of the network services has so far rarely been considered in composing and executing scientific workflows. Currently, scientific applications tune the execution quality of workflows neglecting network resources, and by selecting only optimal software services and computing resources. One

  3. Design decisions in workflow management and quality of work.

    NARCIS (Netherlands)

    Waal, B.M.E. de; Batenburg, R.

    2009-01-01

    In this paper, the design and implementation of a workflow management (WFM) system in a large Dutch social insurance organisation is described. The effect of workflow design decisions on the quality of work is explored theoretically and empirically, using the model of Zur Mühlen as a frame of refere

  4. Towards an actor-driven workflow management system for Grids

    NARCIS (Netherlands)

    F. Berretz; S. Skorupa; V. Sander; A. Belloum

    2010-01-01

    Currently, most workflow management systems in Grid environments provide push-oriented job distribution strategies, where jobs are explicitly delegated to resources. In those scenarios the dedicated resources execute submitted jobs according to the request of a workflow engine or Grid wide scheduler

  5. From Paper Based Clinical Practice Guidelines to Declarative Workflow Management

    DEFF Research Database (Denmark)

    Lyng, Karen Marie; Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2009-01-01

    a sub workflow can be described in a declarative workflow management system: the Resultmaker Online Consultant (ROC). The example demonstrates that declarative primitives allow to naturally extend the paper based flowchart to an executable model without introducing a complex cyclic control flow graph....

  6. Conceptual Framework and Architecture for Service Mediating Workflow Management

    NARCIS (Netherlands)

    Hu, Jinmin; Grefen, Paul

    2003-01-01

    This paper proposes a three-layer workflow concept framework to realize workflow enactment flexibility by dynamically binding activities to their implementations at run time. A service mediating layer is added to bridge business process definition and its implementation. Based on this framework, we

  7. Distributed execution of aggregated multi domain workflows using an agent framework

    NARCIS (Netherlands)

    Z. Zhao; A. Belloum; C. de Laat; P. Adriaans; B. Hertzberger

    2007-01-01

    In e-Science, meaningful experiment processes and workflow engines emerge as important scientific resources. A complex experiment often involves services and processes developed in different scientific domains. Aggregating different workflows into one meta workflow avoids unnecessary rewriting of ex

  8. A Multi-Dimensional Classification Model for Scientific Workflow Characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Ramakrishnan, Lavanya; Plale, Beth

    2010-04-05

    Workflows have been used to model repeatable tasks or operations in manufacturing, business process, and software. In recent years, workflows are increasingly used for orchestration of science discovery tasks that use distributed resources and web services environments through resource models such as grid and cloud computing. Workflows have disparate requirements and constraints that affect how they might be managed in distributed environments. In this paper, we present a multi-dimensional classification model illustrated by workflow examples obtained through a survey of scientists from different domains, including bioinformatics and biomedical, weather and ocean modeling, and astronomy, detailing their data and computational requirements. The survey results and classification model contribute to a high-level understanding of scientific workflows.

  9. Federated Database Services for Wind Tunnel Experiment Workflows

    Directory of Open Access Journals (Sweden)

    A. Paventhan

    2006-01-01

    Full Text Available Enabling the full life cycle of scientific and engineering workflows requires robust middleware and services that support effective data management, near-realtime data movement and custom data processing. Many existing solutions exploit the database as a passive metadata catalog. In this paper, we present an approach that makes use of federation of databases to host data-centric wind tunnel application workflows. The user is able to compose customized application workflows based on database services. We provide a reference implementation that leverages typical business tools and technologies: Microsoft SQL Server for database services and Windows Workflow Foundation for workflow services. The application data and user's code are both hosted in federated databases. With the growing interest in XML Web Services in scientific Grids, and with databases beginning to support native XML types and XML Web services, we can expect the role of databases in scientific computation to grow in importance.

  10. Implementing and Running a Workflow Application on Cloud Resources

    Directory of Open Access Journals (Sweden)

    Gabriela Andreea MORAR

    2011-01-01

    Full Text Available Scientists need to run applications that are time- and resource-consuming, but not all of them have the required knowledge to run these applications in a parallel manner using grid, cluster or cloud resources. In the past few years many workflow-building frameworks were developed in order to help scientists take better advantage of computing resources, by designing workflows based on their applications and executing them on heterogeneous resources. This paper presents a case study of implementing and running a workflow for an eBay data retrieval application. The workflow was designed using the Askalon framework and executed on cloud resources. The purpose of this paper is to demonstrate how workflows and cloud resources can be used by scientists in order to achieve speedup for their applications without the need to spend large amounts of money on computational resources.

  11. Managing Library IT Workflow with Bugzilla

    Directory of Open Access Journals (Sweden)

    Nina McHale

    2010-09-01

    Full Text Available Prior to September 2008, all technology issues at the University of Colorado Denver's Auraria Library were reported to a dedicated departmental phone line. A variety of staff changes necessitated a more formal means of tracking, delegating, and resolving reported issues, and the department turned to Bugzilla, an open source bug tracking application designed by Mozilla.org developers. While designed with software development bug tracking in mind, Bugzilla can be easily customized and modified to serve as an IT ticketing system. Twenty-three months and over 2300 trouble tickets later, Auraria's IT department workflow is much smoother and more efficient. This article includes two Perl Template Toolkit code samples for customized Bugzilla screens for its use in a library environment; readers will be able to easily replicate the project in their own environments.

  12. An Integrated Workflow for DNA Methylation Analysis

    Institute of Scientific and Technical Information of China (English)

    Pingchuan Li; Feray Demirci; Gayathri Mahalingam; Caghan Demirci; Mayumi Nakano; Blake C.Meyers

    2013-01-01

    The analysis of cytosine methylation provides a new way to assess and describe epigenetic regulation at a whole-genome level in many eukaryotes. DNA methylation has a demonstrated role in genome stability and protection, regulation of gene expression and many other aspects of genome function and maintenance. BS-seq is a relatively unbiased method for profiling DNA methylation, with a resolution capable of measuring methylation at individual cytosines. Here we describe, as an example, a workflow to handle DNA methylation analysis, from BS-seq library preparation to data visualization. We describe some applications for the analysis and interpretation of these data. Our laboratory provides public access to plant DNA methylation data via visualization tools available at our "Next-Gen Sequence" websites (http://mpss.udel.edu), along with small RNA, RNA-seq and other data types.
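
    The central quantification step of such a workflow is easy to sketch. The fragment below is a hedged illustration, not the authors' pipeline: given per-cytosine counts of methylated and total reads (a typical output of BS-seq alignment tools), it computes methylation levels for adequately covered positions. Function and field names are hypothetical.

```python
def methylation_levels(counts, min_coverage=5):
    """Per-cytosine methylation ratio from bisulfite-sequencing read counts.

    counts: dict mapping (chromosome, position) -> (methylated_reads, total_reads)
    Returns positions with coverage >= min_coverage and their methylation level.
    """
    levels = {}
    for (chrom, pos), (meth, total) in counts.items():
        if total >= min_coverage:           # skip poorly covered cytosines
            levels[(chrom, pos)] = meth / total
    return levels

# Toy example: two well-covered cytosines and one below the coverage cutoff.
toy = {("chr1", 100): (8, 10), ("chr1", 250): (1, 20), ("chr2", 50): (2, 3)}
for site, level in sorted(methylation_levels(toy).items()):
    print(site, f"{level:.2f}")
```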

  13. A Framework for Distributed Preservation Workflows

    Directory of Open Access Journals (Sweden)

    Rainer Schmidt

    2010-07-01

    Full Text Available The Planets Project is developing a service-oriented environment for the definition and evaluation of preservation strategies for human-centric data. It focuses on the question of logically preserving digital materials, as opposed to the physical preservation of content bit-streams. This includes the development of preservation tools for the automated characterisation, migration, and comparison of different types of Digital Objects as well as the emulation of their original runtime environment in order to ensure long-term access and interpretability. The Planets integrated environment provides a number of end-user applications that allow data curators to execute and scientifically evaluate preservation experiments based on composable preservation services. In this paper, we focus on the middleware and programming model and show how it can be utilised in order to create complex preservation workflows.

  14. A Formal Model For Declarative Workflows

    DEFF Research Database (Denmark)

    Mukkamala, Raghava Rao

    Current business process technology is pretty good at supporting well-structured business processes that aim at achieving a fixed goal by carrying out an exact set of operations. In contrast, the exact operations needed to fulfill a business process/workflow may not always be possible to foresee. We present a general formal model for the specification and execution of declarative, event-based business processes, as a generalization of a concurrency model, the classic event structures. The model allows for an intuitive operational semantics and a mapping of execution state by a notion of markings of the graphs, and the projected graphs retain the declarative nature of the original (they are also DCR graphs). We have also provided semantics for distributed executions based on synchronous communication among a network of projected graphs and proved that global and distributed executions are equivalent. Further, to support modeling of processes...

  15. Clothes Dryer Automatic Termination Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    TeGrotenhuis, Ward E.

    2014-10-01

    Volume 2: Improved Sensor and Control Designs. Many residential clothes dryers on the market today provide automatic cycles that are intended to stop when the clothes are dry, as determined by the final remaining moisture content (RMC). However, testing of automatic termination cycles has shown that many dryers are susceptible to over-drying of loads, leading to excess energy consumption. In particular, tests performed using the DOE Test Procedure in Appendix D2 of 10 CFR 430 subpart B have shown that as much as 62% of the energy used in a cycle may be from over-drying. Volume 1 of this report shows an average of 20% excess energy from over-drying when running automatic cycles with various load compositions and dryer settings. Consequently, improving automatic termination sensors and algorithms has the potential for substantial energy savings in the U.S.
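
    As a hedged sketch of what an RMC-based automatic termination loop might look like (hypothetical sensor interface and thresholds, not the report's design), the cycle ends as soon as the estimated remaining moisture content reaches the target instead of running for a fixed time:

```python
def run_dryer(rmc_readings, target_rmc=0.05, max_minutes=90):
    """Stop the cycle as soon as the estimated RMC reaches the target.

    rmc_readings: iterable of per-minute remaining-moisture-content estimates
    Returns the number of minutes the cycle ran.
    """
    for minute, rmc in enumerate(rmc_readings, start=1):
        if rmc <= target_rmc or minute >= max_minutes:
            return minute
    return max_minutes

# Toy moisture trajectory: exponential drying from 60% RMC.
readings = [0.60 * (0.93 ** t) for t in range(120)]
print("cycle length:", run_dryer(readings), "minutes")
```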

  16. Web service automatic composition based on semantic relationship graph

    Institute of Scientific and Technical Information of China (English)

    冯建周; 孔令富; 王晓寰

    2012-01-01

    Graph search is a simple and direct way to realize automatic Web service composition, but the search space is large and it is difficult to express the various combination structures among services. To solve this problem, a method that determines the combination structure based on semantic matching relationships is presented. First, a formal semantic description of Web services is given; then, based on the semantic matching relationships, a semantic relationship graph is built from only those services in the repository that relate to the user-supplied inputs and expected outputs. On this basis, the various combination structure models are defined in terms of semantic matching relationships; taking the integrated semantic matching degree as the optimization goal, the breadth-first search algorithm is improved, the calculation of the semantic matching degree for each combination structure is defined, and a Web service composition path with the optimal integrated semantic matching degree is generated. The feasibility of the proposed algorithm is verified through an example.
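
    The essence of the search can be sketched compactly. The fragment below is a hypothetical illustration (a toy service repository, and no semantic matching degrees): breadth-first search over the set of known data types, applying any service whose inputs are already satisfied, until the requested outputs are reachable.

```python
from collections import deque

# Hypothetical service repository: name -> (required inputs, produced outputs)
SERVICES = {
    "geocode":   ({"address"}, {"lat", "lon"}),
    "weather":   ({"lat", "lon"}, {"forecast"}),
    "translate": ({"forecast", "language"}, {"localized_forecast"}),
}

def compose(available, goal):
    """BFS over sets of known data types; returns a service chain or None."""
    start = frozenset(available)
    queue, seen = deque([(start, [])]), {start}
    while queue:
        known, path = queue.popleft()
        if goal <= known:
            return path
        for name, (inputs, outputs) in SERVICES.items():
            if inputs <= known and not outputs <= known:
                nxt = frozenset(known | outputs)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, path + [name]))
    return None

print(compose({"address", "language"}, {"localized_forecast"}))
# -> ['geocode', 'weather', 'translate']
```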

  17. CA-PLAN, a Service-Oriented Workflow

    Institute of Scientific and Technical Information of China (English)

    Shung-Bin Yan; Feng-Jian Wang

    2005-01-01

    Workflow management systems (WfMSs) are accepted worldwide due to their ability to model and control business processes. Previously, we defined an intra-organizational workflow specification model, Process LANguage (PLAN). PLAN, with associated tools, allowed a user to describe a graph specification for processes, artifacts, and participants in an organization. PLAN has been successfully implemented in Agentflow to support workflow (Agentflow) applications. PLAN, and most current WfMSs, are designed to adopt a centralized architecture so that they can be applied to a single organization. However, in such a structure, participants in Agentflow applications in different organizations cannot serve each other with workflows. In this paper, a service-oriented cooperative workflow model, Cooperative Agentflow Process LANguage (CA-PLAN), is presented. CA-PLAN proposes a workflow component model to model inter-organizational processes. In CA-PLAN, an inter-organizational process is partitioned into several intra-organizational processes. Each workflow system inside an organization is modeled as an Integrated Workflow Component (IWC). Each IWC contains a process service interface, specifying process services provided by an organization, in conjunction with a remote process interface specifying what remote processes are used to refer to remote process services provided by other organizations, and intra-organizational processes. An IWC is a workflow node and participant. An inter-organizational process is made up of connections among these process services and remote processes with respect to different IWCs. In this paper, the related service techniques and supporting tools provided in Agentflow systems are presented.

  18. Service-based flexible workflow system for virtual enterprise

    Institute of Scientific and Technical Information of China (English)

    WU Shao-fei

    2008-01-01

    Using the services provided by virtual enterprises, we present a solution to implement flexible inter-enterprise workflow management. Services are responses to events that can be accessed programmatically on the Internet via the HTTP protocol, and they are obtained according to standardized service templates. The workflow engine flexibly binds a request to appropriate services and their providers through a constraint-based, dynamic binding mechanism. Hence, a flexible and collaborative business is achieved. The workflow management system supports virtual enterprises, and the styles of virtual enterprises can be adjusted readily to adapt to various situations.

  19. Editorial and Technological Workflow Tools to Promote Website Quality

    Directory of Open Access Journals (Sweden)

    Emily G. Morton-Owens

    2011-09-01

    Full Text Available Library websites are an increasingly visible representation of the library as an institution, which makes website quality an important way to communicate competence and trustworthiness to users. A website editorial workflow is one way to enforce a process and ensure quality. In a workflow, users receive roles, like author or editor, and content travels through various stages in which grammar, spelling, tone, and format are checked. One library used a workflow system to involve librarians in the creation of content. This system, implemented in Drupal, an open-source content management system, solved problems of coordination, quality, and comprehensiveness that existed on the library's earlier, static website.

  20. A Workflow Process Mining Algorithm Based on Synchro-Net

    Institute of Scientific and Technical Information of China (English)

    Xing-Qi Huang; Li-Fu Wang; Wen Zhao; Shi-Kun Zhang; Chong-Yi Yuan

    2006-01-01

    Sometimes historic information about workflow execution is needed to analyze business processes. Process mining aims at extracting information from event logs for capturing a business process in execution. In this paper a process mining algorithm is proposed based on Synchro-Net, which is a synchronization-based model of workflow logic and workflow semantics. With this mining algorithm based on the model, problems such as invisible tasks and short loops can be handled with ease. A process mining example is presented to illustrate the algorithm, and an evaluation is also given.
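
    The flavor of mining a process model from logs can be shown with a toy sketch (not the Synchro-Net algorithm): derive the direct-follows relation from an event log and keep only one-directional pairs as causal edges, roughly in the spirit of classic control-flow discovery.

```python
# Toy event log: each trace lists the task names observed for one case.
LOG = [
    ["register", "check", "decide", "notify"],
    ["register", "check", "recheck", "check", "decide", "notify"],
]

# Direct-follows relation: every pair (a, b) where b immediately follows a.
follows = {(a, b) for trace in LOG for a, b in zip(trace, trace[1:])}

# Treat one-directional follows as causal edges; two-way pairs (here
# check/recheck) indicate a loop and are excluded.
causal = sorted((a, b) for (a, b) in follows if (b, a) not in follows)

for a, b in causal:
    print(f"{a} -> {b}")
```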

  1. Workflow logs analysis system for enterprise performance measurement

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Workflow logs that record the execution of business processes offer a very valuable data resource for real-time enterprise performance measurement. In this paper, a novel scheme that uses data warehouse and OLAP technology to explore workflow logs and create complex analysis reports for enterprise performance measurement is proposed. Three key points of this scheme are studied: 1) the measure set; 2) the open and flexible architecture of the workflow logs analysis system; 3) the data models in the WFMS and the data warehouse. A case study that shows the validity of the scheme is also provided.
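
    As a hedged illustration of the kind of measure such a scheme could compute (a hypothetical log layout, not the paper's warehouse schema), the fragment below aggregates average cycle time per process from start/end events:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical log rows: (case id, process name, event, ISO timestamp)
LOG = [
    ("c1", "claim", "start", "2024-01-01T09:00"),
    ("c1", "claim", "end",   "2024-01-01T17:00"),
    ("c2", "claim", "start", "2024-01-02T09:00"),
    ("c2", "claim", "end",   "2024-01-02T13:00"),
]

# Collect the start/end timestamps of every case.
bounds = defaultdict(dict)
for case, process, event, ts in LOG:
    bounds[(case, process)][event] = datetime.fromisoformat(ts)

# Aggregate per-process cycle times in hours.
totals = defaultdict(list)
for (case, process), ev in bounds.items():
    totals[process].append((ev["end"] - ev["start"]).total_seconds() / 3600)

for process, hours in totals.items():
    print(f"{process}: average cycle time {sum(hours) / len(hours):.1f} h")
```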

  2. What is needed for effective open access workflows?

    CERN Document Server

    CERN. Geneva

    2017-01-01

    Institutions and funders are pushing forward open access with ever new guidelines and policies. Since institutional repositories are important maintainers of green open access, they should support easy and fast workflows for researchers and libraries to release publications. Based on the requirements specification of researchers, libraries and publishers, possible supporting software extensions are discussed. What does a typical workflow look like? What has to be considered by the researchers and by the editors in the library before releasing a green open access publication? Where and how can software support and improve existing workflows?

  3. CMS Alignment and Calibration workflows: lessons learned and future plans

    CERN Document Server

    De Guio, Federico

    2014-01-01

    We review the online and offline workflows designed to align and calibrate the CMS detector. Starting from the experience gained during the first LHC run, we discuss the expected developments for Run II. In particular, we describe the envisioned stages, from alignment using cosmic-ray data to detector alignment and calibration using the first proton-proton collision data ( O(100 pb-1) ) and a larger dataset ( O(1 fb-1) ) to reach the target precision. The automation of the workflow and its integration in the online and offline activity (dedicated triggers and datasets, data skims, workflows to compute the calibration and alignment constants) are discussed.

  4. Reduction techniques of workflow verification and its implementation

    Institute of Scientific and Technical Information of China (English)

    李沛武; 卢正鼎; 付湘林

    2004-01-01

    Many workflow management systems have emerged in recent years, but few of them provide any form of support for verification. This frequently results in runtime errors that need to be corrected at prohibitive cost. In Ref. [1], a few reduction rules for verifying workflow graphs are given. After analyzing these reduction rules, the overlapped reduction rule is found to be inaccurate. In this paper, improved reduction rules are presented and a matrix-based implementation algorithm is given, so that the scope of workflow verification is expanded and the efficiency of the algorithm is enhanced. The method is simple and natural, and its implementation is easy too.
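
    The reduction idea can be illustrated with a toy sketch (a single hypothetical rule, not the paper's full rule set or its matrix-based algorithm): repeatedly collapse purely sequential nodes, so a well-formed sequential fragment shrinks to a single edge before any further checking.

```python
def reduce_sequential(edges):
    """Collapse nodes with exactly one predecessor and one successor.

    edges: set of (u, v) pairs describing a workflow graph.
    Returns the reduced edge set; a correct sequential fragment shrinks
    to a single start->end edge.
    """
    edges = set(edges)
    changed = True
    while changed:
        changed = False
        nodes = {n for e in edges for n in e}
        for n in nodes:
            preds = [u for (u, v) in edges if v == n]
            succs = [v for (u, v) in edges if u == n]
            if len(preds) == 1 and len(succs) == 1:
                edges -= {(preds[0], n), (n, succs[0])}
                edges.add((preds[0], succs[0]))
                changed = True
                break
    return edges

print(reduce_sequential({("start", "a"), ("a", "b"), ("b", "end")}))
# -> {('start', 'end')}
```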

  5. Research into an automatic scoring method for English compositions and its feasibility

    Institute of Scientific and Technical Information of China (English)

    杨有统

    2011-01-01

    In order to test the feasibility of an automatic scoring model for English compositions written by Chinese students, this paper adopts 35 compositions written by senior English majors as a sample and explores the relation between automatically extracted language-use characteristics and manual scoring. The results show that the number of chunks used in a composition, the number of types, and the fourth root of composition length are significantly correlated with the writing score.

  6. Workflow to numerically reproduce laboratory ultrasonic datasets

    Institute of Scientific and Technical Information of China (English)

    A. Biryukov; N. Tisato; G. Grasselli

    2014-01-01

    The risks and uncertainties related to the storage of high-level radioactive waste (HLRW) can be reduced thanks to focused studies and investigations. HLRWs are going to be placed in deep geological repositories, enveloped in an engineered bentonite barrier, whose physical conditions are subjected to change throughout the lifespan of the infrastructure. Seismic tomography can be employed to monitor its physical state and integrity. The design of the seismic monitoring system can be optimized via conducting and analyzing numerical simulations of wave propagation in representative repository geometry. However, the quality of the numerical results relies on their initial calibration. The main aim of this paper is to provide a workflow to calibrate numerical tools employing laboratory ultrasonic datasets. The finite difference code SOFI2D was employed to model ultrasonic waves propagating through a laboratory sample. Specifically, the input velocity model was calibrated to achieve a best match between experimental and numerical ultrasonic traces. Likely due to the imperfections of the contact surfaces, the resultant velocities of P- and S-wave propagation tend to be noticeably lower than those a priori assigned. Then, the calibrated model was employed to estimate the attenuation in a montmorillonite sample. The obtained low quality factors (Q) suggest that pronounced inelastic behavior of the clay has to be taken into account in geophysical modeling and analysis. Consequently, this contribution should be considered as a first step towards the creation of a numerical tool to evaluate wave propagation in nuclear waste repositories.
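
    The calibration loop itself reduces to a search over velocity models. The sketch below is a hedged stand-in (a toy Gaussian-pulse forward model instead of the SOFI2D finite-difference code): scan candidate velocities and keep the one that minimizes the misfit between the observed and simulated traces.

```python
import numpy as np

T = np.linspace(0.0, 1e-4, 500)        # common time axis, seconds

def synthetic_trace(velocity, distance=0.1):
    """Toy forward model: a Gaussian pulse arriving at distance/velocity."""
    arrival = distance / velocity
    return np.exp(-((T - arrival) / 2e-6) ** 2)

observed = synthetic_trace(2450.0)     # stand-in for a laboratory measurement

# Grid search: keep the velocity whose simulated trace best matches the data.
candidates = np.arange(2000.0, 3000.0, 10.0)
misfits = [float(np.sum((synthetic_trace(v) - observed) ** 2)) for v in candidates]
best = candidates[int(np.argmin(misfits))]
print(f"calibrated velocity: {best:.0f} m/s")
```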

  7. Workflow management in large distributed systems

    Science.gov (United States)

    Legrand, I.; Newman, H.; Voicu, R.; Dobre, C.; Grigoras, C.

    2011-12-01

    The MonALISA (Monitoring Agents using a Large Integrated Services Architecture) framework provides a distributed service system capable of controlling and optimizing large-scale, data-intensive applications. An essential part of managing large-scale, distributed data-processing facilities is a monitoring system for computing facilities, storage, networks, and the very large number of applications running on these systems in near real time. All this monitoring information gathered for all the subsystems is essential for developing the required higher-level services—the components that provide decision support and some degree of automated decisions—and for maintaining and optimizing workflow in large-scale distributed systems. These management and global optimization functions are performed by higher-level agent-based services. We present several applications of MonALISA's higher-level services including optimized dynamic routing, control, data-transfer scheduling, distributed job scheduling, dynamic allocation of storage resources to running jobs and automated management of remote services among a large set of grid facilities.

  8. EDMS based workflow for Printing Industry

    Directory of Open Access Journals (Sweden)

    Prathap Nayak

    2013-04-01

    Full Text Available Information is an indispensable factor of any enterprise. It can be a record or a document generated for every transaction that is made, either paper-based or in electronic format, kept for future reference. The printing industry is one such industry in which managing information of various formats, with the latest workflows and technologies, can be a nightmare and a challenge for any operator or user, since every process, from the smallest piece of information to the printed product, depends on the others. Hence the information has to be harmonized carefully in order to avoid production downtime or employees pointing fingers at each other. This paper analyses how the implementation of an Electronic Document Management System (EDMS) could contribute to the printing industry by providing immediate access to stored documents within and across departments irrespective of geographical boundaries. The paper opens with a brief history and a look at contemporary EDMS systems, with illustrated examples from a study that chose the library as a pilot area for evaluating EDMS. The paper ends with a proposal that maps several document-management-based activities for the implementation of EDMS in a printing industry.

  9. Workflow for the use of a high-resolution image detector in endovascular interventional procedures

    Science.gov (United States)

    Rana, R.; Loughran, B.; Swetadri Vasan, S. N.; Pope, L.; Ionita, C. N.; Siddiqui, A.; Lin, N.; Bednarek, D. R.; Rudin, S.

    2014-03-01

    Endovascular image-guided intervention (EIGI) has become the primary interventional therapy for the most widespread vascular diseases. These procedures involve the insertion of a catheter into the femoral artery, which is then threaded under fluoroscopic guidance to the site of the pathology to be treated. Flat Panel Detectors (FPDs) are normally used for EIGIs; however, once the catheter is guided to the pathological site, high-resolution imaging capabilities can be used for accurately guiding a successful endovascular treatment. The Micro-Angiographic Fluoroscope (MAF) detector provides the needed high-resolution, high-sensitivity, and real-time imaging capabilities. An experimental MAF enabled with a Control, Acquisition, Processing, Image Display and Storage (CAPIDS) system was installed and aligned on a detector changer attached to the C-arm of a clinical angiographic unit. The CAPIDS system was developed and implemented using LabVIEW software and provides a user-friendly interface that enables control of several clinical radiographic imaging modes of the MAF including: fluoroscopy, roadmap, radiography, and digital-subtraction-angiography (DSA). Using the automatic controls, the MAF detector can be moved to the deployed position, in front of a standard FPD, whenever higher resolution is needed during angiographic or interventional vascular imaging procedures. To minimize any possible negative impact on image guidance with the two detector systems, it is essential to have a well-designed workflow that enables smooth deployment of the MAF at critical stages of clinical procedures. For the ultimate success of this new imaging capability, a clear understanding of the workflow design is essential. This presentation provides a detailed description and demonstration of such a workflow design.

  10. Approach for workflow modeling using π-calculus

    Institute of Scientific and Technical Information of China (English)

    杨东; 张申生

    2003-01-01

    As a variant of process algebra, π-calculus can describe the interactions between evolving processes. By modeling an activity as a process interacting with other processes through ports, this paper presents a new approach: representing workflow models using π-calculus. As a result, the model can characterize the dynamic behaviors of the workflow process in terms of the LTS (Labeled Transition Semantics) of π-calculus. The main advantage of the workflow model's formal semantics is that it allows for verification of the model's properties, such as deadlock freedom and normal termination. Moreover, the equivalence of workflow models can be checked through the weak bisimulation theorem in the π-calculus, thus facilitating the optimization of business processes.
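
    The kind of verification the LTS semantics enables can be sketched with a toy labelled transition system (hypothetical states and labels, not π-calculus itself): enumerate reachable states and flag non-final states with no outgoing transitions as deadlocks.

```python
# Hypothetical labelled transition system for a two-activity workflow.
TRANSITIONS = {
    "init":    [("start", "running")],
    "running": [("approve", "done"), ("reject", "revise")],
    "revise":  [],            # no way forward -> deadlock
    "done":    [],
}
FINAL = {"done"}

def deadlocks(initial="init"):
    """Reachable non-final states without outgoing transitions."""
    seen, stack, stuck = set(), [initial], []
    while stack:
        s = stack.pop()
        if s in seen:
            continue
        seen.add(s)
        succs = TRANSITIONS.get(s, [])
        if not succs and s not in FINAL:
            stuck.append(s)
        stack.extend(target for _, target in succs)
    return stuck

print("deadlocked states:", deadlocks())   # -> ['revise']
```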

  11. Network resource control for grid workflow management systems

    NARCIS (Netherlands)

    Strijkers, R.J.; Cristea, M.; Korkhov, V.; Marchal, D.; Belloum, A.; Laat, C.de; Meijer, R.J.

    2010-01-01

    Grid workflow management systems automate the orchestration of scientific applications with large computational and data processing needs, but lack control over network resources. Consequently, the management system cannot prevent multiple communication intensive applications to compete for network

  12. A Community-Driven Workflow Recommendation and Reuse Infrastructure Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Promote and encourage process and workflow reuse within NASA Earth eXchange (NEX) by developing a proactive recommendation technology based on collective NEX...

  13. Workflows for microarray data processing in the Kepler environment

    Directory of Open Access Journals (Sweden)

    Stropp Thomas

    2012-05-01

    Full Text Available Abstract Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to

  14. Optimization of tomographic reconstruction workflows on geographically distributed resources.

    Science.gov (United States)

    Bicer, Tekin; Gürsoy, Doğa; Kettimuthu, Rajkumar; De Carlo, Francesco; Foster, Ian T

    2016-07-01

    New technological advancements in synchrotron light sources enable data acquisitions at unprecedented levels. This emergent trend affects not only the size of the generated data but also the need for larger computational resources. Although beamline scientists and users have access to local computational resources, these are typically limited and can result in extended execution times. Applications that are based on iterative processing, as in tomographic reconstruction methods, require high-performance compute clusters for timely analysis of data. Here, time-sensitive analysis and processing of Advanced Photon Source data on geographically distributed resources are focused on. Two main challenges are considered: (i) modeling of the performance of tomographic reconstruction workflows and (ii) transparent execution of these workflows on distributed resources. For the former, three main stages are considered: (i) data transfer between storage and computational resources, (ii) wait/queue time of reconstruction jobs at compute resources, and (iii) computation of reconstruction tasks. These performance models allow evaluation and estimation of the execution time of any given iterative tomographic reconstruction workflow that runs on geographically distributed resources. For the latter challenge, a workflow management system is built, which can automate the execution of workflows and minimize the user interaction with the underlying infrastructure. The system utilizes Globus to perform secure and efficient data transfer operations. The proposed models and the workflow management system are evaluated by using three high-performance computing and two storage resources, all of which are geographically distributed. Workflows were created with different computational requirements using two compute-intensive tomographic reconstruction algorithms. Experimental evaluation shows that the proposed models and system can be used for selecting the optimum resources, which in turn can
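
    The three-stage performance model lends itself to a compact sketch. The fragment below is a hypothetical illustration with invented parameters, not the paper's fitted models: estimated workflow time is the sum of the transfer, wait/queue, and reconstruction stages, which already suffices to rank candidate sites.

```python
def estimate_runtime(data_gb, bandwidth_gbps, queue_wait_s, voxel_count,
                     iterations, voxels_per_second):
    """Sum of the three stages: transfer + wait/queue + reconstruction."""
    transfer = data_gb * 8 / bandwidth_gbps          # seconds to move the data
    compute = voxel_count * iterations / voxels_per_second
    return transfer + queue_wait_s + compute

# Compare two hypothetical sites for the same reconstruction job.
sites = {
    "local_cluster": dict(bandwidth_gbps=10, queue_wait_s=1800, voxels_per_second=2e8),
    "remote_hpc":    dict(bandwidth_gbps=1,  queue_wait_s=300,  voxels_per_second=2e9),
}
for name, params in sites.items():
    t = estimate_runtime(data_gb=500, voxel_count=2048**3, iterations=100, **params)
    print(f"{name}: {t / 3600:.1f} h")
```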

  15. Taverna: a tool for building and running workflows of services

    Science.gov (United States)

    Hull, Duncan; Wolstencroft, Katy; Stevens, Robert; Goble, Carole; Pocock, Mathew R.; Li, Peter; Oinn, Tom

    2006-01-01

    Taverna is an application that eases the use and integration of the growing number of molecular biology tools and databases available on the web, especially web services. It allows bioinformaticians to construct workflows or pipelines of services to perform a range of different analyses, such as sequence analysis and genome annotation. These high-level workflows can integrate many different resources into a single analysis. Taverna is available freely under the terms of the GNU Lesser General Public License (LGPL) from . PMID:16845108

  16. A scheduling framework applied to digital publishing workflows

    Science.gov (United States)

    Lozano, Wilson; Rivera, Wilson

    2006-02-01

    This paper presents the advances in developing a dynamic scheduling technique suitable for automating digital publishing workflows. Traditionally, scheduling in digital publishing has been limited to timing criteria. The proposed scheduling strategy takes into account contingency and priority fluctuations. The new scheduling algorithm, referred to as QB-MUF, gives high priority to jobs with a low probability of failing according to artifact recognition and workflow modeling criteria. The experimental results show the suitability and efficiency of the scheduling strategy.
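
    The prioritization idea can be sketched as follows (hypothetical job attributes; QB-MUF's actual scoring based on artifact recognition is not reproduced): pending jobs are dispatched in order of increasing failure probability.

```python
import heapq

# Hypothetical pending jobs: (name, estimated probability of failing)
jobs = [("preflight", 0.40), ("render", 0.05), ("impose", 0.10), ("proof", 0.02)]

# Min-heap keyed on failure probability: low-risk jobs acquire high priority.
heap = [(p_fail, name) for name, p_fail in jobs]
heapq.heapify(heap)

while heap:
    p_fail, name = heapq.heappop(heap)
    print(f"dispatch {name} (P[fail] = {p_fail:.2f})")
```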

  17. Implementation and evaluation of a new workflow for registration and segmentation of pulmonary MRI data for regional lung perfusion assessment

    Science.gov (United States)

    Böttger, T.; Grunewald, K.; Schöbinger, M.; Fink, C.; Risse, F.; Kauczor, H. U.; Meinzer, H. P.; Wolf, Ivo

    2007-03-01

    Recently it has been shown that regional lung perfusion can be assessed using time-resolved contrast-enhanced magnetic resonance (MR) imaging. Quantification of the perfusion images has been attempted, based on definition of small regions of interest (ROIs). Use of complete lung segmentations instead of ROIs could possibly increase quantification accuracy. Due to the low signal-to-noise ratio, automatic segmentation algorithms cannot be applied. On the other hand, manual segmentation of the lung tissue is very time consuming and can become inaccurate, as the borders of the lung to adjacent tissues are not always clearly visible. We propose a new workflow for semi-automatic segmentation of the lung from additionally acquired morphological HASTE MR images. First the lung is delineated semi-automatically in the HASTE image. Next the HASTE image is automatically registered with the perfusion images. Finally, the transformation resulting from the registration is used to align the lung segmentation from the morphological dataset with the perfusion images. We evaluated rigid, affine and locally elastic transformations, suitable optimizers and different implementations of mutual information (MI) metrics to determine the best possible registration algorithm. We identified the shortcomings of the registration procedure and the conditions under which automatic registration will succeed or fail. Segmentation results were evaluated using overlap and distance measures. Integration of the new workflow reduces the time needed for post-processing of the data, simplifies the perfusion quantification and reduces interobserver variability in the segmentation process. In addition, the matched morphological data set can be used to identify morphologic changes as the source for the perfusion abnormalities.
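
    The final step of the proposed workflow, propagating the lung segmentation onto the perfusion series via the registration result, can be sketched as follows (a toy 2-D mask and an invented rigid transform; the study evaluates rigid, affine and elastic variants with mutual-information metrics):

```python
import numpy as np
from scipy.ndimage import affine_transform

# Toy binary lung mask delineated on the morphological (HASTE) image.
mask = np.zeros((64, 64))
mask[20:40, 15:30] = 1.0

# Hypothetical rigid transform from registration: small rotation plus shift.
theta = np.deg2rad(3.0)
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
shift = np.array([2.0, -1.5])

# Map the mask into perfusion-image space; nearest-neighbour keeps it binary.
aligned = affine_transform(mask, rot, offset=shift, order=0)
print("mask voxels before/after:", int(mask.sum()), int(aligned.sum()))
```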

  18. CDK-Taverna: an open workflow environment for cheminformatics

    Directory of Open Access Journals (Sweden)

    Zielesny Achim

    2010-03-01

    Full Text Available Abstract Background Small molecules are of increasing interest for bioinformatics in areas such as metabolomics and drug discovery. The recent release of large open access chemistry databases generates a demand for flexible tools to process them and discover new knowledge. To freely support open science based on these data resources, it is desirable for the processing tools to be open source and available for everyone. Results Here we describe a novel combination of the workflow engine Taverna and the cheminformatics library Chemistry Development Kit (CDK), resulting in an open source workflow solution for cheminformatics. We have implemented more than 160 different workers to handle specific cheminformatics tasks. We describe the applications of CDK-Taverna in various usage scenarios. Conclusions The combination of the workflow engine Taverna and the Chemistry Development Kit provides the first open source cheminformatics workflow solution for the biosciences. With the Taverna community working towards a more powerful workflow engine and a more user-friendly user interface, CDK-Taverna has the potential to become a free alternative to existing proprietary workflow tools.

  19. CO2 Storage Feasibility: A Workflow for Site Characterisation

    Directory of Open Access Journals (Sweden)

    Nepveu Manuel

    2015-04-01

    Full Text Available In this paper, we present an overview of the SiteChar workflow model for site characterisation and assessment for CO2 storage. Site characterisation and assessment is required when permits are requested from the legal authorities in the process of starting a CO2 storage process at a given site. The goal is to assess whether a proposed CO2 storage site can indeed be used for permanent storage while meeting the safety requirements demanded by the European Commission (EC) Storage Directive (Storage Directive 2009/31/EC). Many issues have to be scrutinised, and the workflow presented here is put forward to help efficiently organise this complex task. Three issues are highlighted: communication within the working team and with the authorities; interdependencies in the workflow and feedback loops; and the risk-based character of the workflow. A general overview (helicopter view) of the workflow is given; the issues involved in communication and the risk assessment process are described in more detail. The workflow as described has been tested within the SiteChar project on five potential storage sites throughout Europe. This resulted in a list of key aspects of site characterisation which can help prepare and focus new site characterisation studies.

  20. A scientific workflow framework for (13)C metabolic flux analysis.

    Science.gov (United States)

    Dalman, Tolga; Wiechert, Wolfgang; Nöh, Katharina

    2016-08-20

    Metabolic flux analysis (MFA) with (13)C labeling data is a high-precision technique to quantify intracellular reaction rates (fluxes). One of the major challenges of (13)C MFA is the interactivity of the computational workflow according to which the fluxes are determined from the input data (metabolic network model, labeling data, and physiological rates). Here, the workflow assembly is inevitably determined by the scientist who has to consider interacting biological, experimental, and computational aspects. Decision-making is context dependent and requires expertise, rendering an automated evaluation process hardly possible. Here, we present a scientific workflow framework (SWF) for creating, executing, and controlling on-demand (13)C MFA workflows. (13)C MFA-specific tools and libraries, such as the high-performance simulation toolbox 13CFLUX2, are wrapped as web services and thereby integrated into a service-oriented architecture. Besides workflow steering, the SWF features transparent provenance collection and enables full flexibility for ad hoc scripting solutions. To handle compute-intensive tasks, cloud computing is supported. We demonstrate how the challenges posed by (13)C MFA workflows can be solved with our approach on the basis of two proof-of-concept use cases.

  1. An ultrasound image-guided surgical workflow model

    Science.gov (United States)

    Guo, Bing; Lemke, Heinz; Liu, Brent; Huang, H. K.; Grant, Edward G.

    2006-03-01

    A 2003 report in the Annals of Surgery predicted an increase in demand for surgical services of as much as 14 to 47% in the workload of all surgical fields by 2020. Difficulties that are already apparent in the surgical OR (Operating Room) will be amplified in the near future, and it is necessary to address this problem and develop strategies to handle the workload. Workflow issues are central to the efficiency of the OR, particularly in response to today's continuing workforce shortages and escalating costs. These issues include inefficient and redundant processes, system inflexibility, ergonomic deficiencies, scattered data, and a lack of guidelines, standards, and organization. The objective of this research is to validate the hypothesis that a workflow model does improve the efficiency and quality of surgical procedures. We chose to study the image-guided surgical workflow for US as a first proof of concept by minimizing the OR workflow issues. We developed and implemented deformable workflow models using existing and projected future clinical environment data, as well as a customized ICT system with seamless integration and real-time availability. An ultrasound (US) image-guided surgical workflow (IG SWF) for a specific surgical procedure, the US IG Liver Biopsy, was researched to find the inefficient and redundant processes and scattered data in clinical systems, and to improve the overall quality of surgical procedures for the patient.

  2. BReW: Blackbox Resource Selection for e-Science Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh [Univ. of Southern California, Los Angeles, CA (United States); Soroush, Emad [Univ. of Washington, Seattle, WA (United States); Van Ingen, Catharine [Microsoft Research, San Francisco, CA (United States); Agarwal, Deb [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ramakrishnan, Lavanya [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2010-10-04

    Workflows are commonly used to model data-intensive scientific analysis. As computational resource needs increase for eScience, emerging platforms like clouds present additional resource choices for scientists and policy makers. We introduce BReW, a tool that enables users to make rapid, high-level platform selections for their workflows using limited workflow knowledge. This helps make informed decisions on whether to port a workflow to a new platform. Our analysis of synthetic and real eScience workflows shows that using just the total runtime length, maximum task fanout, and total data used and produced by the workflow, BReW can provide platform predictions comparable to whitebox models with detailed workflow knowledge.
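
    Blackbox selection in this spirit can be sketched in a few lines (invented thresholds and platform names, not BReW's trained models): score platforms from just the three coarse workflow features named in the abstract.

```python
# Coarse workflow features, as in the abstract: total runtime length (hours),
# maximum task fanout, and total data touched (GB). Thresholds are invented.
def pick_platform(runtime_h, max_fanout, data_gb):
    if data_gb > 1000:
        return "hpc_cluster"       # data-heavy: stay close to parallel storage
    if max_fanout > 64:
        return "cloud"             # wide fanout: elastic scale-out pays off
    if runtime_h < 1:
        return "workstation"       # short jobs: queue wait would dominate
    return "campus_grid"

workflows = [("montage", 0.5, 8, 20), ("cybershake", 40, 512, 50),
             ("genomics", 12, 16, 4000)]
for name, runtime_h, fanout, data_gb in workflows:
    print(f"{name}: {pick_platform(runtime_h, fanout, data_gb)}")
```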

  3. Enabling Structured Exploration of Workflow Performance Variability in Extreme-Scale Environments

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin; Stephan, Eric G.; Raju, Bibi; Altintas, Ilkay; Elsethagen, Todd O.; Krishnamoorthy, Sriram

    2015-11-15

    Workflows are taking an increasingly important role in orchestrating complex scientific processes in extreme scale and highly heterogeneous environments. However, to date we cannot reliably predict, understand, and optimize workflow performance. Sources of performance variability and in particular the interdependencies of workflow design, execution environment and system architecture are not well understood. While there is a rich portfolio of tools for performance analysis, modeling and prediction for single applications in homogenous computing environments, these are not applicable to workflows, due to the number and heterogeneity of the involved workflow and system components and their strong interdependencies. In this paper, we investigate workflow performance goals and identify factors that could have a relevant impact. Based on our analysis, we propose a new workflow performance provenance ontology, the Open Provenance Model-based WorkFlow Performance Provenance, or OPM-WFPP, that will enable the empirical study of workflow performance characteristics and variability including complex source attribution.

  4. The CESM Workflow Re-Engineering Project

    Science.gov (United States)

    Strand, G.

    2015-12-01

    The Community Earth System Model (CESM) Workflow Re-Engineering Project is a collaborative project between the CESM Software Engineering Group (CSEG) and the NCAR Computation and Information Systems Lab (CISL) Application Scalability and Performance (ASAP) Group to revamp how CESM saves its output. The CMIP3 and particularly CMIP5 experiences in submitting CESM data to those intercomparison projects revealed that the output format of the CESM is not well-suited for the data requirements common to model intercomparison projects. CESM, for efficiency reasons, creates output files containing all fields for each model time sampling, but MIPs require individual files for each field comprising all model time samples. This transposition of model output can be very time-consuming; depending on the volume of data written by the specific simulation, the time to re-orient the data can be comparable to the time required for the simulation to complete. Previous strategies included using serial tools to perform this transposition, but they are now far too inefficient to deal with the many terabytes of output a single simulation can generate. A new set of Python tools, using data parallelism, has been written to enable this re-orientation and has achieved markedly improved I/O performance. The perspective of a data manager/data producer on the use of these new tools is presented, and likely future work on their development and use will be shown. These tools are a critical part of the NCAR CESM submission to the upcoming CMIP6, with the intention that a much more timely and efficient submission of the expected petabytes of data will be accomplished in the given time frame.
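
    The re-orientation itself is simple to sketch in miniature (an in-memory toy; the real tools operate in parallel on netCDF history files): output stored as one record of all fields per time step is transposed into one time series per field.

```python
from collections import defaultdict

# Toy "history files": one dict of all fields per model time step.
history = [
    {"T": [15.1, 15.3], "PRECT": [1.0e-8, 2.0e-8]},   # t = 0
    {"T": [15.2, 15.4], "PRECT": [1.5e-8, 2.5e-8]},   # t = 1
    {"T": [15.0, 15.5], "PRECT": [1.2e-8, 2.2e-8]},   # t = 2
]

# Transpose: gather every time sample of each field into its own series,
# which is the per-variable layout that MIPs such as CMIP require.
timeseries = defaultdict(list)
for step in history:
    for field, values in step.items():
        timeseries[field].append(values)

for field, series in timeseries.items():
    print(field, "->", len(series), "time samples")
```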

  5. Automatic chemical monitoring in the composition of functions performed by the unit level control system in the new projects of nuclear power plant units

    Science.gov (United States)

    Denisova, L. G.; Khrennikov, N. N.

    2014-08-01

    The article presents information on the state of the regulatory framework and on the development of a subsystem for automated chemical monitoring of water chemistries in the primary and secondary coolant circuits, used as part of the automatic process control system in new projects of VVER reactor-based nuclear power plant units. To implement the strategy of developing and putting into use the water chemistry-related part of the automated process control system within the standard AES-2006 nuclear power plant project, regulatory documents must be developed that set out the requirements imposed on automatic water chemistry monitoring systems, in accordance with the federal codes and regulations in the field of atomic energy use.

  6. The Workflow Management Specification Overview

    Institute of Scientific and Technical Information of China (English)

    陈畅; 吴朝晖

    2000-01-01

    Workflow management is a fast-evolving technology, and many software vendors have WFM products available in the market today. To enable interoperability between heterogeneous workflow products and to improve the integration of workflow applications with other IT services, it is necessary to work out common specifications. The purpose of this paper is to provide a framework for the specifications implemented in workflow products developed by the WFM Coalition. It provides a common "Reference Model" for workflow management systems.

  7. Run-time revenue maximization for composite web services with response time commitments

    NARCIS (Netherlands)

    Živković, M.; Bosman, J.W.; Berg, H. van den; Mei, R. van der; Meeuwissen, H.B.; Núñez-Queija, R.

    2012-01-01

    We investigate dynamic decision mechanisms for composite web services maximizing the expected revenue for the providers of composite services. A composite web service is represented by a (sequential) workflow, and for each task within this workflow, a number of service alternatives may be available.
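
    A minimal sketch of such a run-time decision follows; it is an illustration, not the paper's actual policy, and the reward/penalty model and all numbers are assumptions:

        def should_continue(elapsed, deadline, remaining_mean_time, reward, penalty):
            # Crude rule: continue only if the workflow is still expected to meet
            # its response-time commitment, i.e. expected revenue is positive.
            expected_finish = elapsed + remaining_mean_time
            expected_revenue = reward if expected_finish <= deadline else -penalty
            return expected_revenue > 0

        # After a sub-service finishes, decide whether to run the next one.
        print(should_continue(elapsed=1.2, deadline=2.0,
                              remaining_mean_time=0.5, reward=10.0, penalty=4.0))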

  8. Automatically updating predictive modeling workflows support decision-making in drug design.

    Science.gov (United States)

    Muegge, Ingo; Bentzien, Jörg; Mukherjee, Prasenjit; Hughes, Robert O

    2016-09-01

    Using predictive models for early decision-making in drug discovery has become standard practice. We suggest that model building needs to be automated, with minimum input and low technical maintenance requirements. Models perform best when tailored to answering specific compound-optimization-related questions. If qualitative answers are required, 2-bin classification models are preferred. Integrating predictive modeling results with structural information stimulates better decision making. For in silico models supporting rapid structure-activity relationship cycles, performance deteriorates within weeks. Frequent automated updates of predictive models ensure the best predictions. Consensus between multiple modeling approaches increases prediction confidence. Combining qualified and nonqualified data makes optimal use of all available information. Dose predictions provide a holistic alternative to multiple individual property predictions for reaching complex decisions.
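
    The retrain-and-vote idea can be sketched as follows; the model types, the update cadence, and the scikit-learn dependency are assumptions for illustration, not the authors' system:

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.linear_model import LogisticRegression

        def automated_update(X, y):
            # Rebuild the 2-bin classifiers from scratch on the freshest data.
            models = [RandomForestClassifier(n_estimators=100),
                      LogisticRegression(max_iter=1000)]
            for m in models:
                m.fit(X, y)
            return models

        def consensus_predict(models, X):
            # Average class probabilities across modeling approaches.
            probs = np.mean([m.predict_proba(X)[:, 1] for m in models], axis=0)
            return probs > 0.5   # 2-bin decision: active / inactive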

  9. Visual Workflows for Oil and Gas Exploration

    KAUST Repository

    Hollt, Thomas

    2013-04-14

    The most important resources to fulfill today's energy demands are fossil fuels, such as oil and natural gas. When exploiting hydrocarbon reservoirs, a detailed and credible model of the subsurface structures is crucial for planning the path of the borehole, in order to minimize economic and ecological risks. Before that, the placement as well as the operations of oil rigs need to be planned carefully, as off-shore oil exploration is vulnerable to hazards caused by strong currents. The oil and gas industry therefore relies on accurate ocean forecasting systems for planning its operations. This thesis presents visual workflows for creating subsurface models as well as for planning the placement and operations of off-shore structures. Creating a credible subsurface model poses two major challenges: first, the structures in highly ambiguous seismic data are interpreted in the time domain; second, a velocity model has to be built from this interpretation to match the model to depth measurements from wells. If it is not possible to obtain a match at all positions, the interpretation has to be updated, going back to the first step. This results in a lengthy back and forth between the different steps, or in many cases in an unphysical velocity model. We present a novel, integrated approach to interactively creating subsurface models from reflection seismics, by combining the interpretation of the seismic data, using an interactive horizon extraction technique based on piecewise global optimization, with velocity modeling. Computing and visualizing, on the fly, the effects of changes to the interpretation and velocity model on the depth-converted model enables an integrated feedback loop and a completely new connection between the seismic data in the time domain and well data in the depth domain. For planning the operations of off-shore structures we present a novel integrated visualization system that enables interactive visual analysis of ensemble simulations used in ocean

  10. Provenance-based refresh in data-oriented workflows

    KAUST Repository

    Ikeda, Robert

    2011-01-01

    We consider a general workflow setting in which input data sets are processed by a graph of transformations to produce output results. Our goal is to perform efficient selective refresh of elements in the output data, i.e., compute the latest values of specific output elements when the input data may have changed. We explore how data provenance can be used to enable efficient refresh. Our approach is based on capturing one-level data provenance at each transformation when the workflow is run initially. Then at refresh time provenance is used to determine (transitively) which input elements are responsible for given output elements, and the workflow is rerun only on that portion of the data needed for refresh. Our contributions are to formalize the problem setting and the problem itself, to specify properties of transformations and provenance that are required for efficient refresh, and to provide algorithms that apply to a wide class of transformations and workflows. We have built a prototype system supporting the features and algorithms presented in the paper. We report preliminary experimental results on the overhead of provenance capture, and on the crossover point between selective refresh and full workflow recomputation. © 2011 ACM.
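
    The refresh idea in miniature (a toy illustration under assumed data; the word-count transformation stands in for an arbitrary module):

        provenance = {}   # output element -> set of input elements (one level)

        def transform(inputs):
            outputs = {}
            for key, text in inputs.items():
                out_key = "count:" + key
                outputs[out_key] = len(text.split())
                provenance[out_key] = {key}   # capture provenance while running
            return outputs

        inputs = {"a": "one two", "b": "three"}
        outputs = transform(inputs)

        # Input "a" changed: trace affected outputs, rerun only their slice.
        inputs["a"] = "one two three"
        affected = {o for o, deps in provenance.items() if "a" in deps}
        needed_inputs = set().union(*(provenance[o] for o in affected))
        outputs.update(transform({k: inputs[k] for k in needed_inputs}))
        print(outputs["count:a"])   # 3, computed without touching "b"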

  11. CamBAfx: Workflow Design, Implementation and Application for Neuroimaging.

    Science.gov (United States)

    Ooi, Cinly; Bullmore, Edward T; Wink, Alle-Meije; Sendur, Levent; Barnes, Anna; Achard, Sophie; Aspden, John; Abbott, Sanja; Yue, Shigang; Kitzbichler, Manfred; Meunier, David; Maxim, Voichita; Salvador, Raymond; Henty, Julian; Tait, Roger; Subramaniam, Naresh; Suckling, John

    2009-01-01

    CamBAfx is a workflow application designed for both researchers who use workflows to process data (consumers) and those who design them (designers). It provides a front-end (user interface) optimized for data processing designed in a way familiar to consumers. The back-end uses a pipeline model to represent workflows since this is a common and useful metaphor used by designers and is easy to manipulate compared to other representations like programming scripts. As an Eclipse Rich Client Platform application, CamBAfx's pipelines and functions can be bundled with the software or downloaded post-installation. The user interface contains all the workflow facilities expected by consumers. Using the Eclipse Extension Mechanism designers are encouraged to customize CamBAfx for their own pipelines. CamBAfx wraps a workflow facility around neuroinformatics software without modification. CamBAfx's design, licensing and Eclipse Branding Mechanism allow it to be used as the user interface for other software, facilitating exchange of innovative computational tools between originating labs.

  12. CamBAfx: workflow design, implementation and application for neuroimaging

    Directory of Open Access Journals (Sweden)

    Cinly Ooi

    2009-08-01

    CamBAfx is a workflow application designed for both researchers who use workflows to process data (consumers) and those who design them (designers). It provides a front-end (user interface) optimized for data processing, designed in a way familiar to consumers. The back-end uses a pipeline model to represent workflows, since this is a common and useful metaphor used by designers and is easy to manipulate compared to other representations like programming scripts. As an Eclipse Rich Client Platform application, CamBAfx's pipelines and functions can be bundled with the software or downloaded post-installation. The user interface contains all the workflow facilities expected by consumers. Using the Eclipse Extension Mechanism, designers are encouraged to customize CamBAfx for their own pipelines. CamBAfx wraps a workflow facility around neuroinformatics software without modification. CamBAfx's design, licensing and Eclipse Branding Mechanism allow it to be used as the user interface for other software, facilitating the exchange of innovative computational tools between originating labs.

  13. Nursing medication administration and workflow using computerized physician order entry.

    Science.gov (United States)

    Tschannen, Dana; Talsma, Akkeneel; Reinemeyer, Nicholas; Belt, Christine; Schoville, Rhonda

    2011-07-01

    The benefits of computerized physician order entry systems have been described widely; however, the impact of computerized physician order entry on nursing workflow and its potential for error are unclear. The purpose of this study was to determine the impact of a computerized physician order entry system on nursing workflow. Using an exploratory design, nurses employed on an adult ICU (n = 36) and a general pediatric unit (n = 50) involved in computerized physician order entry-based medication delivery were observed. Nurses were also asked questions regarding the impact of computerized physician order entry on nursing workflow. Observations revealed that the total time required for administering medications averaged 8.45 minutes in the ICU and 9.93 minutes in the pediatric unit. Several additional steps were required in the process for pediatric patients, including preparing the medications and communicating with patients and family, which resulted in greater time associated with the delivery of medications. Frequent barriers to workflow were noted by nurses across settings, including system issues (i.e., inefficient medication reconciliation processes, long order sets requiring more time to determine medication dosage), less frequent interaction between the healthcare team, and greater use of informal communication modes. Areas for nursing workflow improvement include (1) medication reconciliation/order duplication, (2) strategies to improve communication, and (3) evaluation of the impact of computerized physician order entry on practice standards.

  14. Development of the workflow kine systems for support on KAIZEN.

    Science.gov (United States)

    Mizuno, Yuki; Ito, Toshihiko; Yoshikawa, Toru; Yomogida, Satoshi; Morio, Koji; Sakai, Kazuhiro

    2012-01-01

    In this paper, we introduce a new workflow line system consisting of location and image recording, which enables the acquisition of workflow information and its analysis and display. From the results of the workflow line investigation, we considered the anticipated effects and the problems for KAIZEN. Workflow line information included location information and action content information. These technologies suggest viewpoints to help improvement, for example, the exclusion of useless movement, the redesign of layout and the review of work procedures. In the manufacturing factory, it was clear that there was much movement away from the standard operation place and accumulated residence time. Concretely, as a result of this investigation, a more efficient layout was suggested by this system. In the case of the hospital, similarly, it was pointed out that the workflow had problems of layout and setup operations, based on the effective movement patterns of the experts. This system can adapt to routine work as well as non-routine work. Through the development of this system, which can fit and adapt to industrial diversification, more effective "visual management" (visualization of work) is expected in the future.

  15. Distributing Workflows over a Ubiquitous P2P Network

    Directory of Open Access Journals (Sweden)

    Eddie Al-Shakarchi

    2007-01-01

    This paper discusses issues in the distribution of bundled workflows across ubiquitous peer-to-peer networks for the application of music information retrieval. The underlying motivation for this work is provided by the DART project, which aims to develop a novel music recommendation system by gathering statistical data using collaborative filtering techniques and the analysis of the audio itself, in order to create a reliable and comprehensive database of the music that people own and listen to. To achieve this, the DART scientists creating the algorithms need the ability to distribute the Triana workflows they create, representing the analysis to be performed, across the network on a regular basis (perhaps even daily) in order to update the network as a whole with new workflows to be executed for the analysis. DART uses a similar approach to BOINC, but differs in that the workers receive input data in the form of a bundled Triana workflow, which is executed in order to process any MP3 files that they own on their machine. Once analysed, the results are returned to DART's distributed database, which collects and aggregates the resulting information. DART employs package repositories to decentralise the distribution of such workflow bundles, and this approach is validated in this paper through simulations showing that suitable scalability is maintained as the number of participants increases. The results clearly illustrate the effectiveness of the approach.

  16. Optimizing high performance computing workflow for protein functional annotation.

    Science.gov (United States)

    Stanberry, Larissa; Rekepalli, Bhanu; Liu, Yuan; Giblock, Paul; Higdon, Roger; Montague, Elizabeth; Broomall, William; Kolker, Natali; Kolker, Eugene

    2014-09-10

    Functional annotation of newly sequenced genomes is one of the major challenges in modern biology. With modern sequencing technologies, the protein sequence universe is rapidly expanding. Newly sequenced bacterial genomes alone contain over 7.5 million proteins. The rate of data generation has far surpassed that of protein annotation. The volume of protein data makes manual curation infeasible, whereas a high compute cost limits the utility of existing automated approaches. In this work, we present an improved and optimized automated workflow to enable large-scale protein annotation. The workflow uses high performance computing architectures and a low-complexity classification algorithm to assign proteins to existing clusters of orthologous groups of proteins. Based on the Position-Specific Iterative Basic Local Alignment Search Tool (PSI-BLAST), the algorithm ensures at least 80% specificity and sensitivity of the resulting classifications. The workflow utilizes highly scalable parallel applications for classification and sequence alignment. Using Extreme Science and Engineering Discovery Environment supercomputers, the workflow processed 1,200,000 newly sequenced bacterial proteins. With the rapid expansion of the protein sequence universe, the proposed workflow will enable scientists to annotate big genome data.
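
    A much-simplified sketch of the classification step follows; the scores, threshold, and COG identifiers are invented, and the mock scoring stands in for the PSI-BLAST search the real workflow distributes across supercomputer nodes:

        from multiprocessing import Pool

        THRESHOLD = 80.0   # illustrative cutoff tuned for specificity/sensitivity

        def assign_to_cog(protein):
            # Stand-in for a PSI-BLAST search against COG representatives.
            hits = {"COG0001": 91.5, "COG0123": 77.0}
            group, score = max(hits.items(), key=lambda kv: kv[1])
            return (protein, group if score >= THRESHOLD else None)

        if __name__ == "__main__":
            with Pool() as pool:   # the real pipeline parallelises this at scale
                print(dict(pool.map(assign_to_cog, ["protA", "protB"])))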

  17. Automatic generation of 2D micromechanical finite element model of silicon–carbide/aluminum metal matrix composites: Effects of the boundary conditions

    DEFF Research Database (Denmark)

    Qing, Hai

    2013-01-01

    Two-dimensional finite element (FE) simulations of the deformation and damage evolution of Silicon–Carbide (SiC) particle reinforced aluminum alloy composite including interphase are carried out for different microstructures and particle volume fractions of the composites. A program is developed...

  18. compMS2Miner: An Automatable Metabolite Identification, Visualization, and Data-Sharing R Package for High-Resolution LC-MS Data Sets.

    Science.gov (United States)

    Edmands, William M B; Petrick, Lauren; Barupal, Dinesh K; Scalbert, Augustin; Wilson, Mark J; Wickliffe, Jeffrey K; Rappaport, Stephen M

    2017-04-04

    A long-standing challenge of untargeted metabolomic profiling by ultrahigh-performance liquid chromatography-high-resolution mass spectrometry (UHPLC-HRMS) is the efficient transition from unknown mass spectral features to confident metabolite annotations. The compMS(2)Miner (Comprehensive MS(2) Miner) package was developed in the R language to facilitate rapid, comprehensive feature annotation using a peak-picker output and MS(2) data files as inputs. The number of MS(2) spectra that can be collected during a metabolomic profiling experiment far outweighs what painstaking manual interpretation can process in the time available; therefore, a degree of software workflow autonomy is required for broad-scale metabolite annotation. CompMS(2)Miner integrates many useful tools in a single workflow for metabolite annotation and also provides a means to overview the MS(2) data with a Web application GUI, compMS(2)Explorer (Comprehensive MS(2) Explorer), which also facilitates data sharing and transparency. The automatable compMS(2)Miner workflow consists of the following steps: (i) matching unknown MS(1) features to precursor MS(2) scans, (ii) filtration of spectral noise (dynamic noise filter), (iii) generation of composite mass spectra by summation of multiple similar spectrum signals and removal of redundant/contaminant spectra, (iv) interpretation of possible fragment ion substructure using an internal database, (v) annotation of unknowns with chemical and spectral databases, with prediction of mammalian biotransformation metabolites, wrapper functions for in silico fragmentation software, nearest-neighbor chemical similarity scoring, random-forest-based retention time prediction, text-mining-based false positive removal/true positive ranking, chemical taxonomic prediction, and differential-evolution-based global annotation score optimization, and (vi) network graph visualizations, data curation, and sharing via the compMS(2)Explorer application. Metabolite identities and
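
    Step (iii) can be illustrated with a toy composite-spectrum routine; the package itself is written in R, and the tolerance, input format, and normalisation below are assumptions:

        from collections import defaultdict

        def composite_spectrum(spectra, mz_tol=0.01):
            # Bin fragment m/z values of similar spectra and sum their intensities.
            binned = defaultdict(float)
            for spectrum in spectra:                  # spectrum: [(mz, intensity), ...]
                for mz, intensity in spectrum:
                    binned[round(mz / mz_tol) * mz_tol] += intensity
            top = max(binned.values())
            # Normalise to the base peak, as is conventional for MS/MS spectra.
            return sorted((mz, i / top * 100) for mz, i in binned.items())

        # Two nearly identical fragments merge into one composite peak.
        print(composite_spectrum([[(85.028, 1200.0)], [(85.029, 900.0)]]))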

  19. Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows

    Science.gov (United States)

    Wilson, B. D.; Ramachandran, R.; Lynnes, C.

    2009-05-01

    A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community an unprecedented opportunity to collaborate by sharing their workflows with one another, reproducing and analyzing research results, and leveraging colleagues' expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions have come together to improve community collaboration in science analysis by developing a customizable "software appliance" to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like Finnish "talkoot" (a barn raising), to build a shared research space. Talkoot extends a freely available, open source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a "science story" in shareable web notebooks, including plots or animations, backed up by an executable workflow that directly reproduces the science analysis. New services and workflows of interest will be

  20. An Overview of Workflow Management on Mobile Agent Technology

    Directory of Open Access Journals (Sweden)

    Anup Patnaik

    2014-07-01

    Mobile agent workflow management/plugins are well suited to handling control flows in open distributed systems; this emerging technology can bring process-oriented tasks from diverse frameworks to run as a single unit. This workflow technology offers organizations the opportunity to reshape business processes beyond the boundaries of their own organizations, so that instead of static models, the modern era sees dynamic workflows which can respond to changes during execution, provide necessary security measures, offer a great degree of adaptivity, troubleshoot running processes, and recover lost states through fault tolerance. The prototype that we are planning to design is intended to ensure reliability, security, robustness and scalability without being forced to trade off performance. This paper is concerned with the design, implementation and evaluation of the performance of the proposed prototype models, based on current research in this domain.

  1. A HYBRID PETRI-NET MODEL OF GRID WORKFLOW

    Institute of Scientific and Technical Information of China (English)

    Ji Yimu; Wang Ruchuan; Ren Xunyi

    2008-01-01

    In order to effectively control the random tasks submitted and executed in a grid workflow, a grid workflow model based on a hybrid Petri net is presented. This model is composed of a random Petri net, a colored Petri net and a general Petri net. Therein, the random Petri net declares the relationship between the number of grid users' random tasks and the size of the service window, and computes the server intensity of the grid system. The colored Petri net sets a different color for places with grid services and provides valid interfaces for grid resource allocation and task scheduling. The experiment indicated that the model presented in this letter can determine, for a given number of users' random tasks, the appropriate size of the grid service window in a grid workflow management system.
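
    In the spirit of the server-intensity computation, a toy calculation follows; the queueing form rho = lambda / (c * mu) and the rates are illustrative assumptions, not the paper's Petri-net formulation:

        def server_intensity(arrival_rate, window_size, service_rate):
            # rho = lambda / (c * mu): load per server in the service window.
            return arrival_rate / (window_size * service_rate)

        def min_window(arrival_rate, service_rate, target_rho=0.8):
            c = 1
            while server_intensity(arrival_rate, c, service_rate) > target_rho:
                c += 1
            return c   # smallest window keeping the system below the target load

        print(min_window(arrival_rate=12.0, service_rate=2.0))   # -> 8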

  2. Dynamic Service Selection in Workflows Using Performance Data

    Directory of Open Access Journals (Sweden)

    David W. Walker

    2007-01-01

    An approach to dynamic workflow management and optimisation using near-real-time performance data is presented. Strategies are discussed for choosing an optimal service (based on user-specified criteria) from several semantically equivalent Web services. Such an approach may involve finding "similar" services by first pruning the set of discovered services based on service metadata, and subsequently selecting an optimal service based on performance data. The current implementation of the prototype workflow framework is described and demonstrated with a simple workflow. Performance results are presented that show the benefits of dynamic service selection. A statistical analysis based on the first order statistic is used to investigate the likely improvement in service response time arising from dynamic service selection.
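
    A minimal sketch of performance-based selection (the endpoints and response times are invented; the paper's framework additionally prunes on service metadata first):

        from statistics import mean

        history = {   # recent response times in seconds, per equivalent service
            "http://svc-a.example.org/align": [2.1, 1.9, 2.4],
            "http://svc-b.example.org/align": [1.2, 1.5, 1.3],
        }

        def select_service(candidates):
            # Pick the semantically equivalent service with the best mean time.
            return min(candidates,
                       key=lambda url: mean(history.get(url, [float("inf")])))

        print(select_service(history))   # -> the faster svc-b endpoint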

  3. A Hybrid Authorization Model For Project-Oriented Workflow

    Institute of Scientific and Technical Information of China (English)

    Zhang Xiaoguang(张晓光); Cao Jian; Zhang Shensheng

    2003-01-01

    In the context of workflow systems, a security-relevant aspect is the assignment of activities to (human or automated) agents. This paper intends to cast light on the management of project-oriented workflows. A comprehensive authorization model is proposed from the perspective of project management. In this model, the concepts of activity decomposition and teams are introduced, which improves the security of conventional role-based access control. Furthermore, policies are provided to define static and dynamic constraints such as Separation of Duty (SoD). Validity of constraints is proposed to provide fine-grained assignment, which improves the performance of policy management. The model is applicable not only to project-oriented workflow applications but also to other teamwork environments such as virtual enterprises.

  4. Nexus: A modular workflow management system for quantum simulation codes

    Science.gov (United States)

    Krogel, Jaron T.

    2016-01-01

    The management of simulation workflows represents a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.

  5. Hybrid Workflow Policy Management for Heart Disease Identification

    Directory of Open Access Journals (Sweden)

    Dong-Hyun Kim

    2009-12-01

    As science and technology grow, medical applications are becoming more complex, needing to solve physiological problems within an expected time. Workflow management systems (WMS) in Grid computing are a promising solution for sophisticated problems such as genomic analysis, drug discovery, disease identification, etc. Although existing WMS can provide basic management functionality in Grid environments, consideration of user requirements such as performance, reliability and interaction with the user is missing. In this paper, we propose a hybrid workflow management system for heart disease identification and discuss how to guarantee different user requirements according to the user SLA. The proposed system is applied to the Physio-Grid e-health platform to identify human heart disease with ECG analysis and Virtual Heart Simulation (VHS) workflow applications.

  6. Hybrid Workflow Policy Management for Heart Disease Identification

    CERN Document Server

    Kim, Dong-Hyun; Youn, Chan-Hyun

    2010-01-01

    As science and technology grow, medical applications are becoming more complex, needing to solve physiological problems within an expected time. Workflow management systems (WMS) in Grid computing are a promising solution for sophisticated problems such as genomic analysis, drug discovery, disease identification, etc. Although existing WMS can provide basic management functionality in Grid environments, consideration of user requirements such as performance, reliability and interaction with the user is missing. In this paper, we propose a hybrid workflow management system for heart disease identification and discuss how to guarantee different user requirements according to the user SLA. The proposed system is applied to the Physio-Grid e-health platform to identify human heart disease with ECG analysis and Virtual Heart Simulation (VHS) workflow applications.

  7. Task Delegation Based Access Control Models for Workflow Systems

    Science.gov (United States)

    Gaaloul, Khaled; Charoy, François

    e-Government organisations are facilitated and conducted using workflow management systems. Role-based access control (RBAC) is recognised as an efficient access control model for large organisations. The application of RBAC in workflow systems cannot, however, grant permissions to users dynamically while business processes are being executed. We currently observe a move away from predefined strict workflow modelling towards approaches supporting flexibility on the organisational level. One specific approach is that of task delegation. Task delegation is a mechanism that supports organisational flexibility and ensures delegation of authority in access control systems. In this paper, we propose a Task-oriented Access Control (TAC) model based on RBAC to address these requirements. We aim to reason about tasks from organisational and resource perspectives to analyse and specify authorisation constraints. Moreover, we present a fine-grained access control protocol to support delegation based on the TAC model.

  8. Integration of the radiotherapy irradiation planning in the fully digital workflow

    Energy Technology Data Exchange (ETDEWEB)

    Roehner, F.; Schmucker, M.; Henne, K.; Bruggmoser, G.; Grosu, A.L.; Frommhold, H.; Heinemann, F.E. [Universitaetsklinikum Freiburg (Germany). Klinik fuer Strahlenheilkunde; Momm, F. [Ortenau Klinikum, Offenburg-Gengenbach (Germany). Radio-Onkologie

    2013-02-15

    Background and purpose: At the Clinic of Radiotherapy at the University Hospital Freiburg, all relevant workflows are paperless. After implementing the Operating Schedule System (OSS) as a framework, all processes are being implemented into the departmental system MOSAIQ. Designing a digital workflow for radiotherapy irradiation planning is a large challenge: it requires interdisciplinary expertise, and the interfaces between the professions therefore also have to be interdisciplinary. For every single step of radiotherapy irradiation planning, distinct responsibilities have to be defined and documented. All aspects of digital storage, backup and long-term availability of data were considered and have already been realized during the OSS project. Method: After an analysis of the complete workflow and the statutory requirements, a detailed project plan was designed. In an interdisciplinary workgroup, problems were discussed and a detailed flowchart was developed. The new functionalities were implemented in a testing environment by the Clinical and Administrative IT Department (CAI). After extensive tests they were integrated into the new modular department system. Results and conclusion: The Clinic of Radiotherapy succeeded in realizing a completely digital workflow for radiotherapy irradiation planning. During the testing phase, our digital workflow was examined and afterwards approved by the responsible authority. (orig.)

  9. Composition

    DEFF Research Database (Denmark)

    Bergstrøm-Nielsen, Carl

    2014-01-01

    Cue Rondo is an open composition to be realised by improvising musicians. See more about my composition practise in the entry "Composition - General Introduction". Caution: streaming the sound/video files will in some cases only provide a few minutes' sample, or the visuals will not appear at all...

  10. Composition

    DEFF Research Database (Denmark)

    Bergstrøm-Nielsen, Carl

    2011-01-01

    Strategies are open compositions to be realised by improvising musicians. See more about my composition practise in the entry "Composition - General Introduction". Caution: streaming the sound files will in some cases only provide a few minutes' sample. Please DOWNLOAD them to hear them in full...

  11. Composition

    DEFF Research Database (Denmark)

    Bergstrøm-Nielsen, Carl

    2010-01-01

    New Year is an open composition to be realised by improvising musicians. It is included in "From the Danish Seasons" (see under this title). See more about my composition practise in the entry "Composition - General Introduction". This work is licensed under a Creative Commons "by-nc" License. You...

  12. Scientific Workflow Systems for 21st Century e-Science, New Bottle or New Wine?

    CERN Document Server

    Zhao, Yong; Foster, Ian

    2008-01-01

    With the advances in e-Science and the growing complexity of scientific analyses, more and more scientists and researchers are relying on workflow systems for process coordination, derivation automation, provenance tracking, and bookkeeping. While workflow systems have been in use for decades, it is unclear whether scientific workflows can or even should build on existing workflow technologies, or whether they require fundamentally new approaches. In this paper, we analyze the status and challenges of scientific workflows, investigate both existing technologies and emerging languages, platforms and systems, and identify the key challenges that must be addressed by workflow systems for e-Science in the 21st century.

  13. A Drupal-Based Collaborative Framework for Science Workflows

    Science.gov (United States)

    Pinheiro da Silva, P.; Gandara, A.

    2010-12-01

    Cyber-infrastructure is built by utilizing technical infrastructure to support organizational practices and social norms, providing support for scientific teams working together, or dependent on each other, to conduct scientific research. Such cyber-infrastructure enables the sharing of information and data so that scientists can leverage knowledge and expertise through automation. Scientific workflow systems have been used to build automated scientific systems used by scientists to conduct scientific research and, as a result, create artifacts in support of scientific discoveries. These complex systems are often developed by teams of scientists who are located in different places, e.g., scientists working in distinct buildings, and sometimes in different time zones, e.g., scientists working in distinct national laboratories. The sharing of these specifications is currently supported by the use of version control systems such as CVS or Subversion. Discussions about the design, improvement, and testing of these specifications, however, often happen elsewhere, e.g., through the exchange of email messages and IM chatting. Carrying on a discussion about these specifications is challenging because comments and specifications are not necessarily connected. For instance, the person reading a comment about a given workflow specification may not be able to see the workflow, and even if they can, they may not know to which part of the workflow a given comment applies. In this paper, we discuss the design, implementation and use of CI-Server, a Drupal-based infrastructure, to support the collaboration of both local and distributed teams of scientists using scientific workflows. CI-Server has three primary goals: to enable information sharing by providing tools that scientists can use within their scientific research to process data, publish and share artifacts; to build community by providing tools that support discussions between

  14. Science Gateways, Scientific Workflows and Open Community Software

    Science.gov (United States)

    Pierce, M. E.; Marru, S.

    2014-12-01

    Science gateways and scientific workflows occupy different ends of the spectrum of user-focused cyberinfrastructure. Gateways, sometimes called science portals, provide a way for enabling large numbers of users to take advantage of advanced computing resources (supercomputers, advanced storage systems, science clouds) by providing Web and desktop interfaces and supporting services. Scientific workflows, at the other end of the spectrum, support advanced usage of cyberinfrastructure that enables "power users" to undertake computational experiments that are not easily done through the usual mechanisms (managing simulations across multiple sites, for example). Despite these different target communities, gateways and workflows share many similarities and can potentially be accommodated by the same software system. For example, pipelines to process InSAR imagery sets or to datamine GPS time series data are workflows. The results, and the ability to make downstream products, may be made available through a gateway, and power users may want to provide their own custom pipelines. In this abstract, we discuss our efforts to build an open source software system, Apache Airavata, that can accommodate both gateway and workflow use cases. Our approach is general, and we have applied the software to problems in a number of scientific domains. In this talk, we discuss our applications to usage scenarios specific to earth science, focusing on earthquake physics examples drawn from the QuakSim.org and GeoGateway.org efforts. We also examine the role of the Apache Software Foundation's open community model as a way to build up common community codes that do not depend upon a single "owner" to sustain them. Pushing beyond open source software, we also see the need to provide gateways and workflow systems as cloud services. These services centralize operations, provide well-defined programming interfaces, scale elastically, and have global-scale fault tolerance. We discuss our work providing

  15. Flash Builder and Flash Catalyst The New Workflow

    CERN Document Server

    Peeters, Steven

    2010-01-01

    The Flash Platform is changing. Flash Builder and Flash Catalyst have brought a new separation of design and coding to web development that enables a much more efficient and streamlined workflow. For designers and developers used to the close confines of Flash, this is a hugely liberating but at first alien concept. This book teaches you the new workflow for the Flash platform. It gives an overview of the technologies involved and provides you with real-world project examples and best-practice guidelines to get from design to implementation with the tools at hand. * Includes many examples * Foc

  16. Linking Geobiology Fieldwork and Data Curation Through Workflow Documentation

    Science.gov (United States)

    Thomer, A.; Baker, K. S.; Jett, J. G.; Gordon, S.; Palmer, C. L.

    2014-12-01

    Describing the specific processes and artifacts that lead to the creation of data products provides a detailed picture of data provenance in the form of a high-level workflow. The resulting diagram identifies: 1. "points of intervention" at which curation processes can be moved upstream, and 2. data products that may be important for sharing and preservation. The Site-Based Data Curation project, an Institute of Museum and Library Services-funded project hosted by the Center for Informatics Research in Science and Scholarship at the University of Illinois, previously inferred a geobiologist's planning, field and laboratory workflows through close study of the data products produced during a single field trip to Yellowstone National Park (Wickett et al., 2013). We have since built on this work by documenting post hoc curation processes and integrating them with the existing workflow. By holistically considering both data collection and curation, we are able to identify concrete steps that scientists can take to begin curating data in the field. This field-to-repository workflow represents a first step toward a more comprehensive and nuanced model of the research data lifecycle. Using our initial three-phase workflow, we identify key data products to prioritize for curation, and the points at which data curation best practices integrate with research processes with minimal interruption. We then document the processes that make key data products sharable and ready for preservation. We append the resulting curatorial phases to the field data collection workflow: Data Staging, Data Standardizing and Data Packaging. These refinements demonstrate: 1) the interdependence of research and curatorial phases; 2) the links between specific research products, research phases and curatorial processes; 3) the interdependence of laboratory-specific standards and community-wide best practices. We propose a poster that shows the six-phase workflow described above. We plan to discuss

  17. A Family of RBAC- Based Workflow Authorization Models

    Institute of Scientific and Technical Information of China (English)

    HONG Fan; XING Guang-lin

    2005-01-01

    A family of RBAC-based workflow authorization models, called RWAM, is proposed. RWAM consists of a basic model and other models constructed from the basic model. The basic model provides the notion of temporal permission, which means that a user can perform a certain operation on a task only for a time interval; this not only ensures that only authorized users can execute a task but also ensures that the authorization flow is synchronized with the workflow. The two advanced models of RWAM deal with role hierarchy and constraints, respectively. RWAM ranges from simple to complex and provides a general reference model for further research and development in this area.
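
    The notion of temporal permission can be sketched as a simple time-interval check; the data layout below is an assumption for illustration, not RWAM's formal definition:

        from datetime import datetime

        # (user, operation, task) -> (not_before, not_after); illustrative grant
        grants = {
            ("alice", "approve", "task42"):
                (datetime(2024, 1, 1, 9, 0), datetime(2024, 1, 1, 17, 0)),
        }

        def permitted(user, operation, task, now):
            interval = grants.get((user, operation, task))
            if interval is None:
                return False
            start, end = interval
            return start <= now <= end   # authorization synchronized with workflow time

        print(permitted("alice", "approve", "task42",
                        datetime(2024, 1, 1, 12, 0)))   # True: inside the interval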

  18. Declarative Modelling and Safe Distribution of Healthcare Workflows

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao; Slaats, Tijs

    2012-01-01

    We present a formal technique for safe distribution of workflow processes described declaratively as Nested Condition Response (NCR) Graphs and apply the technique to a distributed healthcare workflow. Concretely, we provide a method to synthesize from an NCR Graph and any distribution of its events a set of local process graphs communicating by shared events, such that the distributed execution of the local processes is equivalent to executing the original process. The technique is based on our recent similar work on safe distribution of Dynamic Condition Response (DCR) Graphs applied to cross...

  19. Visualized Workflow System Based on XML and Relational Database

    Institute of Scientific and Technical Information of China (English)

    陈谊; 侯堃; 新吉乐; 陈红倩

    2012-01-01

    In order to implement a workflow management system that supports the visual design and execution of project management workflows, a four-layer workflow management system framework based on XML and a relational database is proposed. The paper introduces the concepts, history and current state of workflow management systems together with the dominant standards and technologies, presents the layered structure and composition of the system, and describes in detail the data storage structure, the hand-over processing between workflow steps, the XML parsing method, and the implementation techniques for visual workflow design using client-side scripting. The system was implemented on the ASP.NET platform. Practical application shows that the system can effectively support the workflow management of enterprise project management.

  20. Verification of Timed Healthcare Workflows Using Component Timed-Arc Petri Nets

    DEFF Research Database (Denmark)

    Bertolini, Cristiano; Liu, Zhiming; Srba, Jiri

    2013-01-01

    Workflows in modern healthcare systems are becoming increasingly complex and their execution involves concurrency and sharing of resources. The definition, analysis and management of collaborative healthcare workflows requires abstract model notations with a precisely defined semantics and a supp...

  1. Translating Unstructured Workflow Processes to Readable BPEL: Theory and Implementation

    DEFF Research Database (Denmark)

    van der Aalst, Willibrordus Martinus Pancratius; Lassen, Kristian Bisgaard

    2008-01-01

    ...and not easy to use by end-users. Therefore, we provide a mapping from Workflow Nets (WF-nets) to BPEL. This mapping builds on the rich theory of Petri nets and can also be used to map other languages (e.g., UML, EPC, BPMN, etc.) onto BPEL. In addition to this, we have implemented the algorithm in a tool called...

  2. Putting Lipstick on Pig: Enabling Database-style Workflow Provenance

    CERN Document Server

    Amsterdamer, Yael; Deutch, Daniel; Milo, Tova; Stoyanovich, Julia; Tannen, Val

    2012-01-01

    Workflow provenance typically assumes that each module is a "black-box", so that each output depends on all inputs (coarse-grained dependencies). Furthermore, it does not model the internal state of a module, which can change between repeated executions. In practice, however, an output may depend on only a small subset of the inputs (fine-grained dependencies) as well as on the internal state of the module. We present a novel provenance framework that marries database-style and workflow-style provenance, by using Pig Latin to expose the functionality of modules, thus capturing internal state and fine-grained dependencies. A critical ingredient in our solution is the use of a novel form of provenance graph that models module invocations and yields a compact representation of fine-grained workflow provenance. It also enables a number of novel graph transformation operations, allowing to choose the desired level of granularity in provenance querying (ZoomIn and ZoomOut), and supporting "what-if" workflow analyti...

  3. Content and Workflow Management for Library Websites: Case Studies

    Science.gov (United States)

    Yu, Holly, Ed.

    2005-01-01

    Using database-driven web pages or web content management (WCM) systems to manage increasingly diverse web content and to streamline workflows is a commonly practiced solution recognized in libraries today. However, limited library web content management models and funding constraints prevent many libraries from purchasing commercially available…

  4. SURVEY OF WORKFLOW ANALYSIS IN PAST AND PRESENT ISSUES

    Directory of Open Access Journals (Sweden)

    SARAVANAN .M.S,

    2011-06-01

    This paper surveys workflow analysis from the business process view for all organizations. A business can be defined as an organization that provides goods and services to others who want or need them. The concept of managing business processes is referred to as Business Process Management (BPM). A workflow is the automation of a business process, in whole or in part, during which documents, information or tasks are passed from one participant to another for action, according to a set of procedural rules. Process mining aims at extracting useful and meaningful information from event logs, which are sets of real executions of business processes in organizations. This paper briefly reviews the state-of-the-art of business processes developed so far and the techniques adopted. It also presents a survey of workflow analysis from the business process view, which can be broadly classified into four major categories: Business Process Modeling, Ontology-based Business Process Management, Workflow-based Business Process Controlling, and Business Process Mining.
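
    As a taste of the Business Process Mining category, here is a tiny directly-follows extraction from an event log; the log is fabricated for illustration:

        from collections import Counter

        log = [   # one trace of activities per case
            ["receive", "check", "approve", "archive"],
            ["receive", "check", "reject"],
        ]

        def directly_follows(traces):
            pairs = Counter()
            for trace in traces:
                pairs.update(zip(trace, trace[1:]))   # count a -> b successions
            return pairs

        for (a, b), n in directly_follows(log).items():
            print(a, "->", b, ":", n)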

  5. Actor-driven workflow execution in distributed environments

    NARCIS (Netherlands)

    F. Berretz; S. Skorupa; V. Sander; A. Belloum; M. Bubak

    2010-01-01

    Currently, most workflow management systems (WfMS) in Grid environments provide push-oriented task distribution strategies, where tasks are directly bound to suitable resources. In those scenarios the dedicated resources execute the submitted tasks according to the request of a WfMS or sometimes by

  6. Exformatics Declarative Case Management Workflows as DCR Graphs

    DEFF Research Database (Denmark)

    Slaats, Tijs; Mukkamala, Raghava Rao; Hildebrandt, Thomas

    2013-01-01

    ...with researchers at the IT University of Copenhagen (ITU) to create tools for the declarative workflow language Dynamic Condition Response Graphs (DCR Graphs) and incorporate them into their products and into teaching at ITU. In this paper we give a status report on the work. We start with an informal introduction...

  7. Images crossing borders: image and workflow sharing on multiple levels.

    Science.gov (United States)

    Ross, Peeter; Pohjonen, Hanna

    2011-04-01

    Digitalisation of medical data makes it possible to share images and workflows between related parties. In addition to linear data flow where healthcare professionals or patients are the information carriers, a new type of matrix of many-to-many connections is emerging. Implementation of shared workflow brings challenges of interoperability and legal clarity. Sharing images or workflows can be implemented on different levels with different challenges: inside the organisation, between organisations, across country borders, or between healthcare institutions and citizens. Interoperability issues vary according to the level of sharing and are either technical or semantic, including language. Legal uncertainty increases when crossing national borders. Teleradiology is regulated by multiple European Union (EU) directives and legal documents, which makes interpretation of the legal system complex. To achieve wider use of eHealth and teleradiology several strategic documents were published recently by the EU. Despite EU activities, responsibility for organising, providing and funding healthcare systems remains with the Member States. Therefore, the implementation of new solutions requires strong co-operation between radiologists, societies of radiology, healthcare administrators, politicians and relevant EU authorities. The aim of this article is to describe different dimensions of image and workflow sharing and to analyse legal acts concerning teleradiology in the EU.

  8. Conceptual framework and architecture for service mediating workflow management

    NARCIS (Netherlands)

    Hu, Jinmin; Grefen, Paul

    2002-01-01

    Electronic service outsourcing creates a new paradigm for automated enterprise collaboration. The service-oriented paradigm requires a high level of flexibility of current workflow management systems and support for Business-to-Business (B2B) collaboration to realize collaborative enterprises. This

  9. Styx Grid Services: Lightweight Middleware for Efficient Scientific Workflows

    Directory of Open Access Journals (Sweden)

    J.D. Blower

    2006-01-01

    The service-oriented approach to performing distributed scientific research is potentially very powerful but is not yet widely used in many scientific fields. This is partly due to the technical difficulties involved in creating services and workflows and the inefficiency of many workflow systems with regard to handling large datasets. We present the Styx Grid Service, a simple system that wraps command-line programs and allows them to be run over the Internet exactly as if they were local programs. Styx Grid Services are very easy to create and use and can be composed into powerful workflows with simple shell scripts or more sophisticated graphical tools. An important feature of the system is that data can be streamed directly from service to service, significantly increasing the efficiency of workflows that use large data volumes. The status and progress of Styx Grid Services can be monitored asynchronously using a mechanism that places very few demands on firewalls. We show how Styx Grid Services can interoperate with Web Services and WS-Resources using suitable adapters.
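
    A rough analogue of wrapping a command-line program as a remotely callable, streaming service is sketched below; it uses plain HTTP, an invented port, and `sort` as the wrapped program, and is not the actual Styx protocol:

        import subprocess
        from http.server import BaseHTTPRequestHandler, HTTPServer

        class WrapCLI(BaseHTTPRequestHandler):
            def do_POST(self):
                body = self.rfile.read(int(self.headers["Content-Length"]))
                # Pipe the request body through the wrapped program (here: sort).
                result = subprocess.run(["sort"], input=body, capture_output=True)
                self.send_response(200)
                self.end_headers()
                self.wfile.write(result.stdout)   # send results back to the caller

        if __name__ == "__main__":
            HTTPServer(("", 8080), WrapCLI).serve_forever()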

  10. CrossFlow: Integrating Workflow Management and Electronic Commerce

    NARCIS (Netherlands)

    Hoffner, Y.; Ludwig, H.; Grefen, P.; Aberer, K.

    2001-01-01

    The CrossFlow architecture provides support for cross-organisational workflow management in dynamically established virtual enterprises. The creation of a business relationship between a service provider organisation performing a service on behalf of a consumer organisation can be made dynamic when

  11. Dynamic profit optimization of composite web services with SLAs

    NARCIS (Netherlands)

    Živković, M.; Bosman, J.W.; van den Berg, J.L.; van der Mei, R.D.; Meeuwissen, H.B.; Núñez-Queija, R.

    2011-01-01

    In this paper we investigate sequential decision mechanisms for composite web services. After executing each sub-service within a sequential workflow, decisions are made whether to terminate or continue the execution of the workflow. These decisions are based on observed response times, expected rew

  12. Dynamic Profit Optimization of Composite Web Services with SLAs

    NARCIS (Netherlands)

    Zivkovic, M.; Bosman, J.W.; Berg, J.L. van den; Mei, R.D. van der; Meeuwissen, H.B.; Nunez-Queija, R.

    2011-01-01

    In this paper we investigate sequential decision mechanisms for composite web services. After executing each sub-service within a sequential workflow, decisions are made whether to terminate or continue the execution of the workflow. These decisions are based on observed response times, expected rew

  13. An integrated development workflow for community-driven FOSS-projects using continuous integration tools

    Science.gov (United States)

    Bilke, Lars; Watanabe, Norihiro; Naumov, Dmitri; Kolditz, Olaf

    2016-04-01

    A complex software project with high standards regarding code quality generally requires automated tools to help developers with repetitive and tedious tasks, such as compilation on different platforms and configurations, unit testing as well as end-to-end tests, and generating distributable binaries and documentation. This is known as continuous integration (CI). A community-driven FOSS project within the Earth Sciences benefits even more from CI, as time and resources for software development are often limited; testing developed code on more than the developer's PC is therefore a task which is often neglected and where CI can be the solution. We developed an integrated workflow based on GitHub, Travis and Jenkins for the community project OpenGeoSys - a coupled multiphysics modeling and simulation package - allowing developers to concentrate on implementing new features in a tight feedback loop. Every interested developer/user can create a pull request containing source code modifications on the online collaboration platform GitHub. The modifications are checked (compilation, compiler warnings, memory leaks, undefined behavior, unit tests, end-to-end tests, analyzing differences in simulation results between changes, etc.) by the CI system, which automatically responds to the pull request, or by email, on success or failure, with detailed reports, eventually requesting improvements to the modifications. Core team developers review the modifications and merge them into the main development line once they satisfy agreed standards. We aim for efficient data structures and algorithms, self-explaining code, comprehensive documentation and high test code coverage. This workflow keeps the entry barriers to getting involved in the project low and permits an agile development process concentrating on feature additions rather than software maintenance procedures.

  14. A Merit-Based Architecture for the Automatic Selection and Composition of Services in SOA-Based C4ISR Systems

    Science.gov (United States)

    2008-06-01

    ...extends WSDL and BPEL syntax to include timing deadlines, QoS, MoP, and MOE for the basic services, their choreography and alternative exception handlers... Choreography is the composition of web services in a manner where each WS knows exactly when it executes its operations and who or what it

  15. A Workflow-Oriented Approach To Propagation Models In Heliophysics

    Directory of Open Access Journals (Sweden)

    Gabriele Pierantoni

    2014-01-01

    The Sun is responsible for the eruption of billions of tons of plasma and the generation of near light-speed particles that propagate throughout the solar system and beyond. If directed towards Earth, these events can be damaging to our technological infrastructure. Hence there is an effort to understand the cause of the eruptive events and how they propagate from Sun to Earth. However, the physics governing their propagation is not well understood, so there is a need to develop a theoretical description of their propagation, known as a Propagation Model, in order to predict when they may impact Earth. It is often difficult to define a single propagation model that correctly describes the physics of solar eruptive events, and even more difficult to implement models capable of catering for all these complexities and to validate them using real observational data. In this paper, we envisage that workflows offer both a theoretical and practical framework for a novel approach to propagation models. We define a mathematical framework that aims at encompassing the different modalities with which workflows can be used, and provide a set of generic building blocks written in the TAVERNA workflow language that users can use to build their own propagation models. Finally, we test both the theoretical model and the composite building blocks of the workflow with a real Science Use Case that was discussed during the 4th CDAW (Coordinated Data Analysis Workshop) event held by the HELIO project. We show that generic workflow building blocks can be used to construct a propagation model that successfully describes the transit of solar eruptive events toward Earth and predicts a correct Earth-impact time.

  16. Workflows for ingest of research data into digital archives - tests with Archivematica

    Science.gov (United States)

    Kirchner, I.; Bertelmann, R.; Gebauer, P.; Hasler, T.; Hirt, M.; Klump, J. F.; Peters-Kotting, W.; Rusch, B.; Ulbricht, D.

    2013-12-01

    Publication of research data and future re-use of measured data require the long-term preservation of digital objects. The ISO OAIS reference model defines responsibilities for the long-term preservation of digital objects, and although there is software available to support the preservation of digital data, problems remain to be solved. A key task in preservation is to make the datasets ready for ingest into the archive, which is called the creation of Submission Information Packages (SIPs) in the OAIS model. This includes the creation of appropriate preservation metadata. Scientists need to be trained to deal with different types of data and to heighten their awareness of quality metadata. Other problems arise during the assembly of SIPs and during ingest into the archive, because file format validators may produce conflicting output for identical data files and these conflicts are difficult to resolve automatically. Also, validation and identification tools are notorious for their poor performance. In the project EWIG, Zuse-Institute Berlin acts as an infrastructure facility, while the Institute for Meteorology at FU Berlin and the German Research Centre for Geosciences GFZ act as two different data producers. The aim of the project is to develop workflows for the transfer of research data into digital archives and the future re-use of data from long-term archives, with emphasis on data from the geosciences. The technical work is supplemented by interviews with data practitioners at several institutions to identify problems in digital preservation workflows, and by the development of university teaching materials to train students in the curation of research data and metadata. The free and open-source software Archivematica [1] is used as the digital preservation system. The creation and ingest of SIPs has to meet several archival standards and be compatible with the Metadata Encoding and Transmission Standard (METS). The two data producers use different

  17. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee

    2016-01-01

    This article presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults...

  18. Automated evolutionary restructuring of workflows to minimise errors via stochastic model checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents a framework for the automated restructuring of workflows that allows one to minimise the impact of errors on a production workflow. The framework allows for the modelling of workflows by means of a formalised subset of the Business Process Modelling and Notation (BPMN) language...

  19. Characterizing workflow-based activity on a production e-infrastructure using provenance data.

    NARCIS (Netherlands)

    Madougou, S.; Shahand, S.; Santcroos, M.; van Schaik, B.; Benabdelkader, A.; van Kampen, A.; Olabarriaga, S.

    2013-01-01

    Grid computing and workflow management systems emerged as solutions to the challenges arising from the processing and storage of sheer volumes of data generated by modern simulations and data acquisition devices. Workflow management systems usually document the process of the workflow execution eith

  20. Performance prediction for Grid workflow activities based on features-ranked RBF network

    Institute of Scientific and Technical Information of China (English)

    Wang Jie; Duan Rubing; Farrukh Nadeem

    2009-01-01

    Accurate performance prediction of Grid workflow activities can help Grid schedulers map activities to appropriate Grid sites. This paper describes an approach based on a features-ranked RBF neural network to predict the performance of Grid workflow activities. Experimental results for two kinds of real-world Grid workflow activities are presented to show the effectiveness of our approach.
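
    The abstract does not give the network's exact architecture, so the sketch below illustrates one plausible reading of a "features-ranked" RBF predictor: input features are scaled by importance weights before the Gaussian layer, so highly ranked features dominate the distance computation. The ranking weights, centers, and data are invented assumptions, not the paper's model.

        import numpy as np

        # Minimal sketch of a features-ranked RBF predictor: feature importance
        # weights rescale the inputs before Gaussian basis activations are
        # computed, then a linear layer produces the predicted runtime.

        def rbf_predict(X, centers, widths, out_weights, feature_rank):
            """Predict activity runtimes for feature vectors X (n_samples, n_features)."""
            Xw = X * feature_rank                # emphasize highly ranked features
            Cw = centers * feature_rank
            # squared distances between every sample and every RBF center
            d2 = ((Xw[:, None, :] - Cw[None, :, :]) ** 2).sum(axis=2)
            phi = np.exp(-d2 / (2.0 * widths ** 2))   # Gaussian activations
            return phi @ out_weights                  # linear output layer

        rng = np.random.default_rng(0)
        X = rng.random((5, 3))                    # e.g. input size, site load, queue length
        feature_rank = np.array([1.0, 0.6, 0.2])  # hypothetical importance ranking
        centers = rng.random((4, 3))
        widths = np.full(4, 0.5)
        out_weights = rng.random(4)
        print(rbf_predict(X, centers, widths, out_weights, feature_rank))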

  1. Automatic Web services composition based on improved HTN planning

    Institute of Scientific and Technical Information of China (English)

    萧毅鸿; 周献中; 朱亮; 苏正炼; 陆晓明

    2012-01-01

    In order to solve complex problems by automatically composing several Web services when no single service suffices, the service ontology language OWL-S was used to semantically encapsulate conventional Web services, with hierarchical task network (HTN) planning as the technical means. After analyzing the similarities between OWL-S and HTN, the operator and method definitions of HTN were rewritten, and the conventional HTN planning algorithm was extended into an HTN planner equipped with domain knowledge, making it better suited to domain-specific service composition problems. Finally, a planner framework for automatic service composition based on the improved HTN planning was proposed and an experimental system was built. The results of a case study show that the improved HTN planning algorithm, supported by a domain ontology, can effectively realize automatic Web services composition.
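
    The decomposition step at the heart of HTN planning can be sketched in a few lines: methods rewrite compound tasks into subtasks until only primitive operators (here, individual service invocations) remain. The toy domain below is invented for illustration and is not the paper's planner or ontology.

        # Minimal HTN decomposition sketch for service composition. Compound
        # tasks are rewritten by methods (the domain knowledge); primitive
        # operators stand in for concrete web service calls.

        operators = {"geocode_address", "query_weather", "format_report"}

        methods = {
            # compound task -> ordered list of subtasks
            "weather_report": ["resolve_location", "query_weather", "format_report"],
            "resolve_location": ["geocode_address"],
        }

        def htn_plan(tasks):
            """Return a flat sequence of operators for the task list, or None."""
            if not tasks:
                return []
            head, rest = tasks[0], tasks[1:]
            if head in operators:                 # primitive: keep as-is
                tail = htn_plan(rest)
                return None if tail is None else [head] + tail
            if head in methods:                   # compound: decompose and recurse
                return htn_plan(methods[head] + rest)
            return None                           # no applicable method -> failure

        print(htn_plan(["weather_report"]))
        # ['geocode_address', 'query_weather', 'format_report']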

  2. A Community-Driven Workflow Recommendations and Reuse Infrastructure

    Science.gov (United States)

    Zhang, J.; Votava, P.; Lee, T. J.; Lee, C.; Xiao, S.; Nemani, R. R.; Foster, I.

    2013-12-01

    Aiming to connect the Earth science community to accelerate the rate of discovery, NASA Earth Exchange (NEX) has established an online repository and platform so that researchers can publish and share their tools and models with colleagues. In recent years, workflow has become a popular technique at NEX for Earth scientists to define executable multi-step procedures for data processing and analysis. The ability to discover and reuse knowledge (sharable workflows) is critical to the future advancement of science. However, as reported in our earlier study, the reusability of scientific artifacts is at present very low. Scientists often do not feel confident in using other researchers' tools and utilities. One major reason is that researchers are often unaware of the existence of others' data preprocessing processes. Meanwhile, researchers often do not have time to fully document the processes and expose them to others in a standard way. These issues cannot be overcome by the existing workflow search technologies used in NEX and other data projects. Therefore, this project aims to develop a proactive recommendation technology based on collective NEX user behaviors. In this way, we aim to promote and encourage process and workflow reuse within NEX. In particular, we focus on leveraging peer scientists' best practices to support the recommendation of artifacts developed by others. Our underlying theoretical foundation is rooted in social cognitive theory, which holds that people learn by watching what others do. Our fundamental hypothesis is that sharable artifacts have network properties, much like humans in social networks. More generally, reusable artifacts form various types of social relationships (ties), and may be viewed as forming what organizational sociologists who use network analysis to study human interactions call a 'knowledge network.' In particular, we will tackle two research questions: R1: What hidden knowledge may be extracted from

  3. A workflow learning model to improve geovisual analytics utility.

    Science.gov (United States)

    Roth, Robert E; Maceachren, Alan M; McCabe, Craig A

    2009-01-01

    the concept of scientific workflows. Second, we implemented an interface in the G-EX Portal Learn Module to demonstrate the workflow learning model. The workflow interface allows users to drag learning artifacts uploaded to the G-EX Portal onto a central whiteboard and then annotate the workflow using text and drawing tools. Once completed, users can visit the assembled workflow to get an idea of the kind, number, and scale of analysis steps, view individual learning artifacts associated with each node in the workflow, and ask questions about the overall workflow or individual learning artifacts through the associated forums. An example learning workflow in the domain of epidemiology is provided to demonstrate the effectiveness of the approach. RESULTS/CONCLUSIONS: In the context of geovisual analytics, GIScientists are not only responsible for developing software to facilitate visually-mediated reasoning about large and complex spatiotemporal information, but also for ensuring that this software works. The workflow learning model discussed in this paper and demonstrated in the G-EX Portal Learn Module is one approach to improving the utility of geovisual analytics software. While development of the G-EX Portal Learn Module is ongoing, we expect to release the G-EX Portal Learn Module by Summer 2009.

  4. Networked Print Production: Does JDF Provide a Perfect Workflow?

    Directory of Open Access Journals (Sweden)

    Bernd Zipper

    2004-12-01

    Full Text Available The "networked printing works" is a well-worn slogan used by many providers in the graphics industry, and for a number of years printing-works manufacturers have been working towards the goal of achieving the "networked printing works". A turning point from concept to real implementation can now be expected at drupa 2004: JDF (Job Definition Format) and thus "networked production" will form the center of interest here. The first approaches towards a complete, networked workflow between prepress, print and postpress in production are already available - the products and solutions will now be presented publicly at drupa 2004. So, drupa 2004 will undoubtedly be the "JDF-drupa" - the drupa where machines learn to communicate with each other digitally - the drupa where the dream of general system and job communication in the printing industry can first be realized. CIP3, which has since been renamed CIP4, is an international consortium of leading manufacturers from the printing and media industry who have taken on the task of integrating processes for prepress, print and postpress. The association, to which nearly all manufacturers in the graphics industry belong, succeeded with CIP3 in developing a first international standard for the transmission of control data in the print workflow. Further development of the CIP4 standard now includes a more extensive "system language" called JDF, which will guarantee workflow communication beyond manufacturer boundaries. However, JDF (Job Definition Format) will communicate not only data for actual print production: planning and calculation data for MIS (Management Information Systems) and calculation systems will also be prepared. The German printing specialist Hans-Georg Wenke defines JDF as follows: "JDF takes over data from MIS for machines, aggregates and their control desks, data exchange within office applications, and finally ensures that data can be incorporated in the technical workflow

  5. Workflow Modelling and Analysis Based on the Construction of Task Models

    Directory of Open Access Journals (Sweden)

    Glória Cravo

    2015-01-01

    Full Text Available In this paper we describe the structure of a workflow as a graph whose vertices represent tasks and whose arcs are associated with workflow transitions. To each task an input/output logic operator is associated. Furthermore, we associate a Boolean term with each transition present in the workflow. We then identify the structure of workflows and describe their dynamic behaviour through the construction of new task models. This construction is simple and intuitive, since it is based on an analysis of all tasks present in the workflow, which allows us to describe the dynamics of the workflow very easily; this intuitiveness is an important highlight of our work. We also introduce the concept of logical termination of workflows and provide conditions under which this property is valid. Finally, we provide a counter-example which shows that a conjecture presented in a previous article is false.
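
    To make the graph representation concrete, the sketch below encodes a tiny workflow as a directed graph of tasks and runs a basic reachability sanity check. This is an illustration of the representation only; the check is not the paper's actual logical-termination condition, and all task names are invented.

        from collections import defaultdict

        # vertices = tasks, arcs = workflow transitions (toy example)
        arcs = [("start", "A"), ("A", "B"), ("A", "C"), ("B", "end"), ("C", "end")]

        forward_adj = defaultdict(list)
        reverse_adj = defaultdict(list)
        for src, dst in arcs:
            forward_adj[src].append(dst)
            reverse_adj[dst].append(src)

        def reachable(adj, source):
            """All vertices reachable from `source` by depth-first search."""
            seen, stack = set(), [source]
            while stack:
                v = stack.pop()
                if v not in seen:
                    seen.add(v)
                    stack.extend(adj[v])
            return seen

        # Every task must be reachable from the input task and must reach the
        # output task; otherwise the workflow can never terminate.
        tasks = {"A", "B", "C"}
        print(tasks <= reachable(forward_adj, "start") and
              tasks <= reachable(reverse_adj, "end"))       # True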

  6. When Workflow Management Systems and Logging Systems Meet: Analyzing Large-Scale Execution Traces

    Energy Technology Data Exchange (ETDEWEB)

    Gunter, Daniel

    2008-07-31

    This poster shows the benefits of integrating a workflow management system with logging and log mining capabilities. By combining two existing, mature technologies, Pegasus-WMS and NetLogger, we are able to efficiently process execution logs of earthquake science workflows consisting of hundreds of thousands to one million tasks. In particular we show results of processing logs of CyberShake, a workflow application running on the TeraGrid. Client-side tools allow scientists to quickly gather statistics about a workflow run and find out which tasks executed, where they were executed, what their runtimes were, etc. These statistics can be used to understand the performance characteristics of a workflow and help tune the execution parameters of the workflow management system. This poster shows the scalability of the system by presenting results of uploading task execution records into the system and of querying the system for overall workflow performance information.

  7. Staffing and Workflow of a Maturing Institutional Repository

    Directory of Open Access Journals (Sweden)

    Debora L. Madsen

    2013-02-01

    Full Text Available Institutional repositories (IRs have become established components of many academic libraries. As an IR matures it will face the challenge of how to scale up its operations to increase the amount and types of content archived. These challenges involve staffing, systems, workflows, and promotion. In the past eight years, Kansas State University's IR (K-REx has grown from a platform for student theses, dissertations, and reports to also include faculty works. The initial workforce of a single faculty member was expanded as a part of a library-wide reorganization, resulting in a cross-departmental team that is better able to accommodate the expansion of the IR. The resultant need to define staff responsibilities and develop resources to manage the workflows has led to the innovations described here, which may prove useful to the greater library community as other IRs mature.

  8. Iterative Workflows for Numerical Simulations in Subsurface Sciences

    Energy Technology Data Exchange (ETDEWEB)

    Chase, Jared M.; Schuchardt, Karen L.; Chin, George; Daily, Jeffrey A.; Scheibe, Timothy D.

    2008-07-08

    Numerical simulators are frequently used to assess future risks, support remediation and monitoring program decisions, and assist in the design of specific remedial actions with respect to groundwater contaminants. Due to the complexity of the subsurface environment and uncertainty in the models, many alternative simulations must be performed, each producing data that is typically post-processed and analyzed before deciding on the next set of simulations. Though parts of the process are readily amenable to automation through scientific workflow tools, the larger "research workflow" is not supported by current tools. We present a detailed use case for subsurface modeling, describe the use case in terms of workflow structure, briefly summarize a prototype that seeks to facilitate the overall modeling process, and discuss the many challenges of building such a comprehensive environment.

  9. Research on Architecture of Enterprise Modeling in Workflow System

    Institute of Scientific and Technical Information of China (English)

    李伟平; 齐慧彬; 薛劲松; 朱云龙

    2002-01-01

    The market an enterprise faces is changing and cannot be forecast accurately in the information age. In order to find opportunities in the market, practitioners have focused on business processes through re-engineering programmes to improve enterprise efficiency. Managing an enterprise with process-based methods is necessary to enhance work efficiency and competitiveness in the market. Information system developers have likewise emphasized the use of standard models to accelerate the configuration and implementation of integrated systems for enterprises, so an enterprise has to be modeled with process-based modeling methods. An architecture of enterprise modeling is presented in this paper. The architecture is composed of four views and supports the whole lifecycle of the enterprise model. Because workflow management systems are based on process definitions, this architecture can be used directly in a workflow management system. The implementation method of the model is described in detail, and workflow management software supporting the building and running of the model is also presented.

  10. Enhanced Phosphoproteomic Profiling Workflow For Growth Factor Signaling Analysis

    DEFF Research Database (Denmark)

    Sylvester, Marc; Burbridge, Mike; Leclerc, Gregory;

    2010-01-01

    Background Our understanding of complex signaling networks is still fragmentary. Isolated processes have been studied extensively but cross-talk is omnipresent and precludes intuitive predictions of signaling outcomes. The need for quantitative data on dynamic systems is apparent especially for our...... A549 lung carcinoma cells were used as a model and stimulated with hepatocyte growth factor, epidermal growth factor or fibroblast growth factor. We employed a quick protein digestion workflow with spin filters without using urea. Phosphopeptides in general were enriched by sequential elution from...... transfer dissociation adds confidence in modification site assignment. The workflow is relatively simple but the integration of complementary techniques leads to a deeper insight into cellular signaling networks and the potential pharmacological intervention thereof....

  11. How Workflow Systems Facilitate Business Process Reengineering and Improvement

    Directory of Open Access Journals (Sweden)

    Mohamed El Khadiri

    2012-03-01

    Full Text Available This paper investigates the relationship between workflow systems and business process reengineering and improvement. The study is based on a real case study at the "Centre Régional d'Investissement" (CRI) of Marrakech, Morocco. The CRI is entrusted with coordinating various investment projects at the regional level. Our previous work has shown that a workflow system can be a basis for business process reengineering. However, for continuous process improvement, the system has been shown to be insufficient, as it fails to deal with the exceptions and problem resolutions that informal communications provide. When this system is augmented with an expanded corporate memory system that includes social tools to capture informal communication and data, we are closer to a more complete system that facilitates business process reengineering and improvement.

  12. Building Scientific Workflows for the Geosciences with Open Community Software

    Science.gov (United States)

    Pierce, M. E.; Marru, S.; Weerawarana, S. M.

    2012-12-01

    We describe the design and development of the Apache Airavata scientific workflow software and its application to problems in the geosciences. Airavata is based on Service Oriented Architecture principles and is developed as general-purpose software for managing large-scale science applications on supercomputing resources such as the NSF's XSEDE. Based on the NSF-funded EarthCube Workflow Working Group activities, we discuss the application of this software relative to specific requirements (such as data stream processing, event triggering, dealing with large data sets, and advanced distributed execution patterns involved in data mining). We also consider the role of governance in EarthCube software development and present the development of Airavata software through the Apache Software Foundation's community development model. We discuss the potential impacts on software accountability and sustainability of using this model.

  13. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert [UConn Health, Department of Molecular Biology and Biophysics (United States); Martyn, Timothy O. [Rensselaer at Hartford, Department of Engineering and Science (United States); Ellis, Heidi J. C. [Western New England College, Department of Computer Science and Information Technology (United States); Gryk, Michael R., E-mail: gryk@uchc.edu [UConn Health, Department of Molecular Biology and Biophysics (United States)

    2015-07-15

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  14. A computational workflow for designing silicon donor qubits

    Science.gov (United States)

    Humble, Travis S.; Ericson, M. Nance; Jakowski, Jacek; Huang, Jingsong; Britton, Charles; Curtis, Franklin G.; Dumitrescu, Eugene F.; Mohiyaddin, Fahd A.; Sumpter, Bobby G.

    2016-10-01

    Developing devices that can reliably and accurately demonstrate the principles of superposition and entanglement is an on-going challenge for the quantum computing community. Modeling and simulation offer attractive means of testing early device designs and establishing expectations for operational performance. However, the complex integrated material systems required by quantum device designs are not captured by any single existing computational modeling method. We examine the development and analysis of a multi-staged computational workflow that can be used to design and characterize silicon donor qubit systems with modeling and simulation. Our approach integrates quantum chemistry calculations with electrostatic field solvers to perform detailed simulations of a phosphorus dopant in silicon. We show how atomistic details can be synthesized into an operational model for the logical gates that define quantum computation in this particular technology. The resulting computational workflow realizes a design tool for silicon donor qubits that can help verify and validate current and near-term experimental devices.

  15. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu

    2013-11-01

    Full Text Available Policies or institutions (built into an economic system) that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, progressive income tax automatically reduces the money supply as incomes and spending rise. Similarly, in recessionary times, payment of unemployment benefits injects more money into the system and stimulates demand. Also called automatic stabilizers or built-in stabilizers.

  16. Networked Print Production: Does JDF Provide a Perfect Workflow?

    OpenAIRE

    Bernd Zipper

    2004-01-01

    The "networked printing works" is a well-worn slogan used by many providers in the graphics industry and for the past number of years printing-works manufacturers have been working on the goal of achieving the "networked printing works". A turning point from the concept to real implementation can now be expected at drupa 2004: JDF (Job Definition Format) and thus "networked production" will form the center of interest here. The first approaches towards a complete, networked workflow between p...

  17. Advanced Workflows for Fluid Transfer in Faulted Basins

    Directory of Open Access Journals (Sweden)

    Thibaut Muriel

    2014-07-01

    Full Text Available The traditional 3D basin modeling workflow is made of the following steps: construction of present-day basin architecture, reconstruction of the structural evolution through time, together with fluid flow simulation and heat transfers. In this case, the forward simulation is limited to basin architecture, mainly controlled by erosion, sedimentation and vertical compaction. The tectonic deformation is limited to vertical slip along faults. Fault properties are modeled as vertical shear zones along which rock permeability is adjusted to enhance fluid flow or prevent flow from escaping. For basins having experienced a more complex tectonic history, this approach is over-simplified. It fails to understand and represent fluid flow paths due to the structural evolution of the basin, which impacts overpressure build-up and the location of petroleum resources. Over the past years, a new 3D basin forward code has been developed at IFP Energies nouvelles that is based on a cell-centered finite volume discretization, which preserves mass on an unstructured grid and describes the various changes in geometry and topology of a basin through time. At the same time, 3D restoration tools based on geomechanical principles of strain minimization were made available that offer a structural scenario at a discrete number of deformation stages of the basin. In this paper, we present workflows integrating these different innovative tools on complex faulted basin architectures, where complex means moderate lateral as well as vertical deformation coupled with dynamic fault property modeling. Two synthetic case studies inspired by real basins have been used to illustrate how to apply the workflow, where the difficulties in the workflows are, and what the added value is compared with previous basin modeling approaches.

  18. Facilitating hydrological data analysis workflows in R: the RHydro package

    Science.gov (United States)

    Buytaert, Wouter; Moulds, Simon; Skoien, Jon; Pebesma, Edzer; Reusser, Dominik

    2015-04-01

    The advent of new technologies such as web-services and big data analytics holds great promise for hydrological data analysis and simulation. Driven by the need for better water management tools, it allows for the construction of much more complex workflows that integrate more, and potentially more heterogeneous, data sources with longer tool chains of algorithms and models. With the scientific challenge of designing the most adequate processing workflow comes the technical challenge of implementing the workflow with a minimal risk of errors. A wide variety of new workbench technologies and other data handling systems are being developed. At the same time, the functionality of available data processing languages such as R and Python is increasing at an accelerating pace. Because of the large diversity of scientific questions and simulation needs in hydrology, it is unlikely that one single optimal method for constructing hydrological data analysis workflows will emerge. Nevertheless, languages such as R and Python are quickly gaining popularity because they combine a wide array of functionality with high flexibility and versatility. The object-oriented nature of high-level data processing languages makes them particularly suited for the handling of complex and potentially large datasets. In this paper, we explore how handling and processing of hydrological data in R can be facilitated further by designing and implementing a set of relevant classes and methods in the experimental R package RHydro. We build upon existing efforts such as the sp and raster packages for spatial data and the spacetime package for spatiotemporal data to define classes for hydrological data (HydroST). In order to handle simulation data from hydrological models conveniently, an HM class is defined. Relevant methods are implemented to allow for an optimal integration of the HM class with existing model fitting and simulation functionality in R. Lastly, we discuss some of the design challenges

  19. Advanced Workflows for Fluid Transfer in Faulted Basins.

    OpenAIRE

    Thibaut Muriel; Jardin Anne; Faille Isabelle; Willien Françoise; Guichet Xavier

    2014-01-01

    Keywords: basin modeling; faults; software. International audience. The traditional 3D basin modeling workflow is made of the following steps: construction of present-day basin architecture, reconstruction of the structural evolution through time, together with fluid flow simulation and heat transfers. In this case, the forward simulation is limited to basin architecture, mainly controlled by erosion, sedimentation and vertical compaction. The tectonic deformation is limited to vertical sli...

  20. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. (comp.)

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.
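
    Since the one-line definition above is dense, here is a compact illustration of the technique the bibliography covers: forward-mode automatic differentiation with dual numbers, which propagates derivative values alongside function values via the chain rule, using neither symbolic manipulation nor numeric differencing. The example function is arbitrary.

        import math

        # Each Dual carries a value and its derivative; arithmetic applies the
        # chain rule automatically as the computation proceeds.
        class Dual:
            def __init__(self, value, deriv=0.0):
                self.value, self.deriv = value, deriv

            def __add__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                return Dual(self.value + other.value, self.deriv + other.deriv)

            __radd__ = __add__

            def __mul__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                return Dual(self.value * other.value,
                            self.value * other.deriv + self.deriv * other.value)

            __rmul__ = __mul__

        def sin(x):
            # chain rule for an elementary function
            return Dual(math.sin(x.value), math.cos(x.value) * x.deriv)

        # d/dx [x*sin(x) + 3x] at x = 2, seeded with derivative 1
        x = Dual(2.0, 1.0)
        y = x * sin(x) + 3 * x
        print(y.value, y.deriv)   # derivative equals sin(2) + 2*cos(2) + 3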

  1. Bioinformatics Workflow for Clinical Whole Genome Sequencing at Partners HealthCare Personalized Medicine

    Directory of Open Access Journals (Sweden)

    Ellen A. Tsai

    2016-02-01

    Full Text Available Effective implementation of precision medicine will be enhanced by a thorough understanding of each patient's genetic composition to better treat his or her presenting symptoms or mitigate the onset of disease. This ideally includes the sequence information of a complete genome for each individual. At Partners HealthCare Personalized Medicine, we have developed a clinical process for whole genome sequencing (WGS) with application in both healthy individuals and those with disease. In this manuscript, we describe our bioinformatics strategy to efficiently process and deliver genomic data to geneticists for clinical interpretation. We describe the handling of data from FASTQ to the final variant list for clinical review and the final report. We also discuss our methodology for validating this workflow and the cost implications of running WGS.

  2. Sufficient and Necessary Condition to Decide Compatibility for a Class of Interorganizational Workflow Nets

    Directory of Open Access Journals (Sweden)

    Guanjun Liu

    2015-01-01

    Full Text Available Interorganizational Workflow nets (IWF-nets) can model many concurrent systems well, such as web service composition, in which multiple processes interact via sending/receiving messages. Compatibility of IWF-nets is a crucial criterion for the correctness of these systems: it guarantees that a system has no deadlock, livelock, or dead tasks. In our previous work we proved that the compatibility problem is PSPACE-complete for safe IWF-nets. This paper defines a subclass of IWF-nets that can model many cases of interaction. A necessary and sufficient condition is presented to decide their compatibility, and it depends on the net structure only. Finally, an algorithm is developed based on this condition.
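
    As a concrete illustration of what a compatibility check guards against, the sketch below exhaustively explores the reachable markings of a toy 1-safe Petri net (two processes exchanging one message) and reports deadlocks. The net and the brute-force search are invented for illustration; they are not the paper's structural condition or algorithm.

        # transitions: name -> (set of places consumed, set of places produced)
        transitions = {
            "send":    ({"p1"}, {"p2", "msg"}),
            "receive": ({"q1", "msg"}, {"q2"}),
        }
        initial = frozenset({"p1", "q1"})
        final = frozenset({"p2", "q2"})

        def enabled(marking):
            """Transitions whose preset is contained in the current marking."""
            return [t for t, (pre, _) in transitions.items() if pre <= marking]

        def find_deadlocks(start):
            """Explore all reachable markings; collect non-final dead markings."""
            seen, stack, deadlocks = set(), [start], []
            while stack:
                m = stack.pop()
                if m in seen:
                    continue
                seen.add(m)
                moves = enabled(m)
                if not moves and m != final:
                    deadlocks.append(set(m))
                for t in moves:
                    pre, post = transitions[t]
                    stack.append(frozenset((m - pre) | post))
            return deadlocks

        print(find_deadlocks(initial))   # [] -> deadlock-free, i.e. compatible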

  3. a Workflow for UAV's Integration Into a Geodesign Platform

    Science.gov (United States)

    Anca, P.; Calugaru, A.; Alixandroae, I.; Nazarie, R.

    2016-06-01

    This paper presents a workflow for the development of various Geodesign scenarios. The subject is important in the context of identifying patterns and designing solutions for a Smart City with optimized public transportation, efficient buildings, efficient utilities, recreational facilities, and so on. The workflow describes the procedures starting with acquiring data in the field, data processing, orthophoto generation, DTM generation, integration into a GIS platform, and analysis for better Geodesign support. Esri's CityEngine is used mostly for its 3D modeling capabilities, which enable the user to obtain realistic 3D models. The workflow uses as inputs information extracted from images acquired using UAV technologies, namely eBee, existing 2D GIS geodatabases, and a set of CGA rules. The method we then use, called procedural modeling, applies rules in order to extrude buildings, the street network, parcel zoning and side details, based on the initial attributes from the geodatabase. The resulting products are various scenarios for redesigning and for analyzing new exploitation sites. Finally, these scenarios can be published as interactive web scenes for internal, group, or public consultation. In this way, problems like the impact of new constructions being built, re-arranging green spaces, or changing routes for public transportation are revealed through impact, visibility, or shadowing analysis and are brought to the citizens' attention. This leads to better decisions.

  4. Improved compliance by BPM-driven workflow automation.

    Science.gov (United States)

    Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin

    2014-12-01

    Using methods and technologies of business process management (BPM) for laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of model-driven development is the improved transparency of processes and the alignment of process diagrams and technical code. First experiences of using the business process model and notation (BPMN) show that easy-to-read graphical process models can provide standardization of laboratory workflows. Model-based development allows one to change processes quickly and adapt easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs for quality assurance. As an automation language able to control every kind of activity or subprocess, BPMN 2.0 is directed at complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (i.e., screening procedures or analytical processes). That means that with the BPM standard, a method of sharing process knowledge between laboratories is also available.

  5. AnalyzeThis: An Analysis Workflow-Aware Storage System

    Energy Technology Data Exchange (ETDEWEB)

    Sim, Hyogi [ORNL; Kim, Youngjae [ORNL; Vazhkudai, Sudharshan S [ORNL; Tiwari, Devesh [ORNL; Anwar, Ali [Virginia Tech, Blacksburg, VA; Butt, Ali R [Virginia Tech, Blacksburg, VA; Ramakrishnan, Lavanya [Lawrence Berkeley National Laboratory (LBNL)

    2015-01-01

    The need for novel data analysis is urgent in the face of a data deluge from modern applications. Traditional approaches to data analysis incur significant data movement costs, moving data back and forth between the storage system and the processor. Emerging Active Flash devices enable processing on the flash, where the data already resides. An array of such Active Flash devices allows us to revisit how analysis workflows interact with storage systems. By seamlessly blending together flash storage and data analysis, we create an analysis workflow-aware storage system, AnalyzeThis. Our guiding principle is that analysis-awareness be deeply ingrained in each and every layer of the storage, elevating data analyses to first-class citizens and transforming AnalyzeThis into a potent analytics-aware appliance. We implement the AnalyzeThis storage system atop an emulation platform of the Active Flash array. Our results indicate that AnalyzeThis is viable, expediting workflow execution and minimizing data movement.

  6. Automated quality control in a file-based broadcasting workflow

    Science.gov (United States)

    Zhang, Lina

    2014-04-01

    Benefiting from the development of information and internet technologies, television broadcasting is transforming from inefficient tape-based production and distribution to integrated file-based workflows. However, no matter how many changes have taken place, successful broadcasting still depends on the ability to deliver a consistent, high-quality signal to audiences. After the transition from tape to file, traditional methods of manual quality control (QC) became inadequate, subjective, and inefficient. Based on China Central Television's fully file-based workflow at its new site, this paper introduces an automated quality-control test system for accurate detection of hidden defects in media content. It discusses the system framework and workflow control when automated QC is added, puts forward a QC criterion, and presents QC software that follows this criterion. It also reports experiments on QC speed using parallel processing and distributed computing. The performance of the test system shows that adopting automated QC can make production effective and efficient, and help the station achieve a competitive advantage in the media market.
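
    The parallel-processing experiment can be pictured with a minimal sketch: independent media files are checked concurrently across worker processes. The file names and the defect rule below are invented placeholders, not the station's actual QC software.

        from multiprocessing import Pool

        def qc_check(filename):
            """Placeholder content check; a real QC pass would decode and inspect."""
            ok = not filename.endswith("_corrupt.mxf")   # stand-in defect rule
            return filename, "pass" if ok else "fail"

        files = ["news_01.mxf", "drama_02.mxf", "ad_03_corrupt.mxf", "doc_04.mxf"]

        if __name__ == "__main__":
            # fan the independent checks out over a pool of worker processes
            with Pool(processes=4) as pool:
                for name, verdict in pool.map(qc_check, files):
                    print(name, verdict)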

  7. The MPO API: A tool for recording scientific workflows

    Energy Technology Data Exchange (ETDEWEB)

    Wright, John C., E-mail: jcwright@mit.edu [MIT Plasma Science and Fusion Center, Cambridge, MA (United States); Greenwald, Martin; Stillerman, Joshua [MIT Plasma Science and Fusion Center, Cambridge, MA (United States); Abla, Gheni; Chanthavong, Bobby; Flanagan, Sean; Schissel, David; Lee, Xia [General Atomics, San Diego, CA (United States); Romosan, Alex; Shoshani, Arie [Lawrence Berkeley Laboratory, Berkeley, CA (United States)

    2014-05-15

    Highlights: • A description of a new framework and tool for recording scientific workflows, especially those resulting from simulation and analysis. • An explanation of the underlying technologies used to implement this web based tool. • Several examples of using the tool. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for high-consequence applications. The Metadata, Provenance and Ontology (MPO) project builds on previous work [M. Greenwald, Fusion Eng. Des. 87 (2012) 2205–2208] and is focused on providing documentation of workflows, data provenance and the ability to data-mine large sets of results. While there are important design and development aspects to the data structures and user interfaces, we concern ourselves in this paper with the application programming interface (API) – the set of functions that interface with the data server. Our approach for the data server is to follow the Representational State Transfer (RESTful) software architecture style for client–server communication. At its core, the API uses the POST and GET methods of the HTTP protocol to transfer workflow information in message bodies to targets specified in the URL to and from the database via a web server. Higher level API calls are built upon this core API. This design facilitates implementation on different platforms and in different languages and is robust to changes in the underlying technologies used. The command line client implementation can communicate with the data server from any machine with HTTP access.
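
    The core POST/GET pattern the abstract describes can be sketched in a few lines of client code; the server URL, routes, and payload fields below are placeholders rather than the actual MPO endpoints.

        import requests

        # Sketch of the RESTful client-server pattern: workflow records are
        # POSTed to, and queried from, a data server over HTTP.
        SERVER = "https://mpo.example.org/api"   # hypothetical server URL

        def record_workflow(name, description):
            """Create a workflow record and return its server-assigned identifier."""
            r = requests.post(f"{SERVER}/workflow",
                              json={"name": name, "description": description})
            r.raise_for_status()
            return r.json()["uid"]

        def get_workflow(uid):
            """Fetch a previously recorded workflow by identifier."""
            r = requests.get(f"{SERVER}/workflow/{uid}")
            r.raise_for_status()
            return r.json()

        uid = record_workflow("equilibrium-reconstruction", "nightly analysis run")
        print(get_workflow(uid))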

  8. Comparison of Workflow Scheduling Algorithms in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Navjot Kaur

    2011-10-01

    Full Text Available Cloud computing has gained popularity in recent times. Cloud computing is internet-based computing whereby shared resources, software, and information are provided to computers and other devices on demand, like a public utility. Cloud computing is a technology that uses the internet and central remote servers to maintain data and applications. This technology allows consumers and businesses to use applications without installation and to access their personal files from any computer with internet access. The main aim of my work is to study various problems, issues, and types of scheduling algorithms for cloud workflows, as well as to design new workflow algorithms for a cloud workflow management system. The proposed algorithms are implemented on a real-time cloud developed using Microsoft .NET technologies. The algorithms are compared with each other on the basis of parameters such as total execution time, execution time of the algorithm, and estimated execution time. Experimental results generated via simulation show that Algorithm 2 is much better than Algorithm 1, as it reduces makespan time.

  9. Optimal Workflow Scheduling in Critical Infrastructure Systems with Neural Networks

    Directory of Open Access Journals (Sweden)

    S. Vukmirović

    2012-04-01

    Full Text Available Critical infrastructure systems (CISs), such as power grids, transportation systems, communication networks and water systems, are the backbone of a country's national security and industrial prosperity. These CISs execute large numbers of workflows with very high resource requirements that can span different systems and last for a long time. The proper functioning and synchronization of these workflows is essential, since humanity's well-being depends on it. Because of this, the challenge of ensuring availability and reliability of these services over a broad range of operating conditions is very complicated. This paper proposes an architecture which dynamically executes a scheduling algorithm using feedback about the current status of the CIS nodes. Different artificial neural networks (ANNs) were created in order to solve the scheduling problem. Their performances were compared and, as the main result of this paper, an optimal ANN architecture for workflow scheduling in CISs is proposed. A case study is shown for a meter data management system with measurements from a power distribution management system in Serbia. Performance tests show that a significant improvement in the overall execution time can be achieved by ANNs.
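
    The dispatch idea (score each candidate node from its live status feedback and send the workflow task to the best-scoring node) can be sketched as below. The network weights are untrained random placeholders and the feature choices are assumptions, standing in for the trained ANNs the paper compares.

        import numpy as np

        # A tiny feed-forward network scores nodes from status features
        # (cpu load, queue length, availability); the scheduler dispatches
        # each task to the highest-scoring node.
        rng = np.random.default_rng(1)
        W1, b1 = rng.normal(size=(3, 8)), np.zeros(8)   # 3 features -> 8 hidden units
        W2, b2 = rng.normal(size=8), 0.0                # hidden -> suitability score

        def score(node_features):
            h = np.tanh(node_features @ W1 + b1)
            return float(h @ W2 + b2)

        # feedback about current CIS node status: [cpu load, queue length, availability]
        nodes = {
            "node-a": np.array([0.2, 3.0, 1.0]),
            "node-b": np.array([0.9, 10.0, 1.0]),
            "node-c": np.array([0.4, 1.0, 0.5]),
        }
        best = max(nodes, key=lambda n: score(nodes[n]))
        print("dispatch task to", best)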

  10. Impact of survey workflow on precision and accuracy of terrestrial LiDAR datasets

    Science.gov (United States)

    Gold, P. O.; Cowgill, E.; Kreylos, O.

    2009-12-01

    Ground-based LiDAR (Light Detection and Ranging) survey techniques are enabling remote visualization and quantitative analysis of geologic features at unprecedented levels of detail. For example, digital terrain models computed from LiDAR data have been used to measure displaced landforms along active faults and to quantify fault-surface roughness. But how accurately do terrestrial LiDAR data represent the true ground surface, and in particular, how internally consistent and precise are the mosaicked LiDAR datasets from which surface models are constructed? Addressing this question is essential for designing survey workflows that capture the necessary level of accuracy for a given project while minimizing survey time and equipment, which is critical for effective surveying of remote sites. To address this problem, we seek to define a metric that quantifies how scan registration error changes as a function of survey workflow. Specifically, we are using a Trimble GX3D laser scanner to conduct a series of experimental surveys to quantify how common variables in field workflows impact the precision of scan registration. The primary variables we are testing include 1) use of an independently measured network of control points to locate scanner and target positions, 2) the number of known-point locations used to place the scanner and point clouds in 3-D space, 3) the type of target used to measure distances between the scanner and the known points, and 4) setting up the scanner over a known point as opposed to resectioning of known points. Precision of the registered point cloud is quantified using Trimble RealWorks software by automatic calculation of registration errors (errors between locations of the same known points in different scans). Accuracy of the registered cloud (i.e., its ground-truth) will be measured in subsequent experiments. To obtain an independent measure of scan-registration errors and to better visualize the effects of these errors on a registered point

  11. WorkflowNet2BPEL4WS: A Tool for Translating Unstructured Workflow Processes to Readable BPEL

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; van der Aalst, Wil M. P.

    2007-01-01

    code and not easy to use by end-users. Therefore, we provide a mapping from WF-nets to BPEL. This mapping builds on the rich theory of Petri nets and can also be used to map other languages (e.g., UML, EPC, BPMN, etc.) onto BPEL. To evaluate WorkflowNet2BPEL4WS we used more than 100 processes modeled...

  12. Workflow in clinical trial sites & its association with near miss events for data quality: ethnographic, workflow & systems simulation.

    Directory of Open Access Journals (Sweden)

    Elias Cesar Araujo de Carvalho

    Full Text Available BACKGROUND: With the exponential expansion of clinical trials conducted in BRIC (Brazil, Russia, India, and China) and VISTA (Vietnam, Indonesia, South Africa, Turkey, and Argentina) countries, corresponding gains in cost and enrolment efficiency quickly outpace the consonant metrics in traditional countries in North America and the European Union. However, questions still remain regarding the quality of data being collected in these countries. We used ethnographic, mapping and computer simulation studies to identify and address areas of threat to near miss events for data quality in two cancer trial sites in Brazil. METHODOLOGY/PRINCIPAL FINDINGS: Two sites in São Paulo and Rio de Janeiro were evaluated using ethnographic observations of workflow during subject enrolment and data collection. Emerging themes related to threats to near miss events for data quality were derived from observations. They were then transformed into workflows using UML-AD and modeled using System Dynamics. 139 tasks were observed and mapped through the ethnographic study. The UML-AD detected four major activities in the workflow: evaluation of potential research subjects prior to signature of informed consent, the visit to obtain the subject's informed consent, regular data collection sessions following the study protocol, and closure of the study protocol for a given project. Field observations pointed to three major emerging themes: (a) lack of a standardized process for data registration at the source document, (b) multiplicity of data repositories, and (c) scarcity of decision support systems at the point of research intervention. Simulation with a policy model demonstrates a reduction of the rework problem. CONCLUSIONS/SIGNIFICANCE: Patterns of threats to data quality at the two sites were similar to the threats reported in the literature for American sites. Clinical trial site managers need to reorganize staff workflow by using information technology more efficiently, establish new standard procedures and

  13. Mediation and Automatization.

    Science.gov (United States)

    Hutchins, Edwin

    This paper discusses the relationship between the mediation of task performance by some structure that is not inherent in the task domain itself and the phenomenon of automatization, in which skilled performance becomes effortless or phenomenologically "automatic" after extensive practice. The use of a common simple explicit mediating…

  14. Digital automatic gain control

    Science.gov (United States)

    Uzdy, Z.

    1980-01-01

    Performance analysis, used to evaluate the fitness of several circuits for digital automatic gain control (AGC), indicates that a digital integrator employing a coherent amplitude detector (CAD) is the device best suited for the application. The circuit reduces gain error to half that of conventional analog AGC while making it possible to automatically modify the response of the receiver to match incoming signal conditions.
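
    A digital-integrator AGC loop of the kind described can be sketched as follows; the set point, integrator constant, and synthetic input amplitude are all invented for illustration and do not reproduce the reported circuit.

        import math

        # Toy digital AGC loop: the detected output amplitude is compared with a
        # set point, the error is accumulated by a digital integrator, and the
        # integrator output sets the gain so the output amplitude converges.
        setpoint = 1.0      # desired output amplitude
        gain = 0.1          # initial gain
        k_i = 0.05          # integrator constant
        error_acc = 0.0

        for step in range(200):
            # incoming signal with slowly varying amplitude (stand-in input)
            amplitude_in = 2.0 + math.sin(step / 20.0)
            amplitude_out = gain * amplitude_in      # apply current gain
            error_acc += setpoint - amplitude_out    # digital integration of error
            gain = k_i * error_acc                   # integrator output sets gain

        # once the loop locks, gain * amplitude_in tracks the set point (~1.0)
        print(round(gain * (2.0 + math.sin(199 / 20.0)), 3))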

  15. Automatic Differentiation Package

    Energy Technology Data Exchange (ETDEWEB)

    2007-03-01

    Sacado is an automatic differentiation package for C++ codes using operator overloading and C++ templating. Sacado provides forward, reverse, and Taylor polynomial automatic differentiation classes and utilities for incorporating these classes into C++ codes. Users can compute derivatives of computations arising in engineering and scientific applications, including nonlinear equation solving, time integration, sensitivity analysis, stability analysis, optimization, and uncertainty quantification.

  16. Information Management Workflow and Tools Enabling Multiscale Modeling Within ICME Paradigm

    Science.gov (United States)

    Arnold, Steven M.; Bednarcyk, Brett A.; Austin, Nic; Terentjev, Igor; Cebon, Dave; Marsden, Will

    2016-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for analytical tools that enable the virtual design and optimization of materials throughout their processing - internal structure - property - performance envelope, along with the capturing and storing of the associated material and model information across its lifecycle, has become critical. This need is also fueled by the demands for higher efficiency in material testing; consistency, quality and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Fortunately, material information management systems and physics-based multiscale modeling methods have kept pace with the growing user demands. Herein, recent efforts to establish a workflow for, and demonstrate, a unique set of web application tools linking NASA GRC's Integrated Computational Materials Engineering (ICME) Granta MI database schema and NASA GRC's Integrated multiscale Micromechanics Analysis Code (ImMAC) software toolset are presented. The goal is to enable seamless coupling between both test data and simulation data, which is captured and tracked automatically within Granta MI®, with full model pedigree information. These tools, and this type of linkage, are foundational to realizing the full potential of ICME, in which materials processing, microstructure, properties, and performance are coupled to enable application-driven design and optimization of materials and structures.

  17. An automated and reproducible workflow for running and analyzing neural simulations using Lancet and IPython Notebook.

    Science.gov (United States)

    Stevens, Jean-Luc R; Elver, Marco; Bednar, James A

    2013-01-01

    Lancet is a new, simulator-independent Python utility for succinctly specifying, launching, and collating results from large batches of interrelated computationally demanding program runs. This paper demonstrates how to combine Lancet with IPython Notebook to provide a flexible, lightweight, and agile workflow for fully reproducible scientific research. This informal and pragmatic approach uses IPython Notebook to capture the steps in a scientific computation as it is gradually automated and made ready for publication, without mandating the use of any separate application that can constrain scientific exploration and innovation. The resulting notebook concisely records each step involved in even very complex computational processes that led to a particular figure or numerical result, allowing the complete chain of events to be replicated automatically. Lancet was originally designed to help solve problems in computational neuroscience, such as analyzing the sensitivity of a complex simulation to various parameters, or collecting the results from multiple runs with different random starting points. However, because it is never possible to know in advance what tools might be required in future tasks, Lancet has been designed to be completely general, supporting any type of program as long as it can be launched as a process and can return output in the form of files. For instance, Lancet is also heavily used by one of the authors in a separate research group for launching batches of microprocessor simulations. This general design will allow Lancet to continue supporting a given research project even as the underlying approaches and tools change.
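
    The batch-launch-and-collate pattern that Lancet automates can be pictured with plain subprocesses; the sketch below illustrates the idea under invented parameter names and is not Lancet's actual API.

        import itertools
        import subprocess
        from pathlib import Path

        # One program run per parameter combination; each run writes its output
        # to its own file, and the files are collated afterwards.
        outdir = Path("runs")
        outdir.mkdir(exist_ok=True)

        for seed, rate in itertools.product([0, 1, 2], [0.1, 0.5]):
            outfile = outdir / f"seed{seed}_rate{rate}.txt"
            with outfile.open("w") as fh:
                # stand-in command; a real study would launch the simulator here
                subprocess.run(["python", "-c", f"print({seed} * {rate})"],
                               stdout=fh, check=True)

        # collate the results of all runs back into one table
        results = {f.stem: f.read_text().strip() for f in sorted(outdir.glob("*.txt"))}
        print(results)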

  18. The BEL information extraction workflow (BELIEF): evaluation in the BioCreative V BEL and IAT track

    Science.gov (United States)

    Madan, Sumit; Hodapp, Sven; Senger, Philipp; Ansari, Sam; Szostak, Justyna; Hoeng, Julia; Peitsch, Manuel; Fluck, Juliane

    2016-01-01

    Network-based approaches have become extremely important in systems biology to achieve a better understanding of biological mechanisms. For network representation, the Biological Expression Language (BEL) is well designed to collate findings from the scientific literature into biological network models. To facilitate encoding and biocuration of such findings in BEL, a BEL Information Extraction Workflow (BELIEF) was developed. BELIEF provides a web-based curation interface, the BELIEF Dashboard, that incorporates text mining techniques to support the biocurator in the generation of BEL networks. The underlying UIMA-based text mining pipeline (BELIEF Pipeline) uses several named entity recognition processes and relationship extraction methods to detect concepts and BEL relationships in literature. The BELIEF Dashboard allows easy curation of the automatically generated BEL statements and their context annotations. Resulting BEL statements and their context annotations can be syntactically and semantically verified to ensure consistency in the BEL network. In summary, the workflow supports experts in different stages of systems biology network building. Based on the BioCreative V BEL track evaluation, we show that the BELIEF Pipeline automatically extracts relationships with an F-score of 36.4% and fully correct statements can be obtained with an F-score of 30.8%. Participation in the BioCreative V Interactive task (IAT) track with BELIEF revealed a systems usability scale (SUS) of 67. Considering the complexity of the task for new users—learning BEL, working with a completely new interface, and performing complex curation—a score so close to the overall SUS average highlights the usability of BELIEF. Database URL: BELIEF is available at http://www.scaiview.com/belief/ PMID:27694210

  19. Composites

    Science.gov (United States)

    Taylor, John G.

    The Composites market is arguably the most challenging and profitable market for phenolic resins aside from electronics. The variety of products and processes encountered creates the challenges, and the demand for high performance in critical operations brings value. Phenolic composite materials are rendered into a wide range of components to supply a diverse and fragmented commercial base that includes customers in aerospace (Space Shuttle), aircraft (interiors and brakes), mass transit (interiors), defense (blast protection), marine, mine ducting, off-shore (ducts and grating) and infrastructure (architectural) to name a few. For example, phenolic resin is a critical adhesive in the manufacture of honeycomb sandwich panels. Various solvent and water based resins are described along with resin characteristics and the role of metal ions for enhanced thermal stability of the resin used to coat the honeycomb. Featured new developments include pultrusion of phenolic grating, success in RTM/VARTM fabricated parts, new ballistic developments for military vehicles and high char yield carbon-carbon composites along with many others. Additionally, global regional market resin volumes and sales are presented and compared with other thermosetting resin systems.

  20. Information Issues and Contexts that Impair Team Based Communication Workflow: A Palliative Sedation Case Study.

    Science.gov (United States)

    Cornett, Alex; Kuziemsky, Craig

    2015-01-01

    Implementing team based workflows can be complex because of the scope of providers involved and the extent of information exchange and communication that needs to occur. While a workflow may represent the ideal structure of communication that needs to occur, information issues and contextual factors may impact how the workflow is implemented in practice. Understanding these issues will help us better design systems to support team based workflows. In this paper we use a case study of palliative sedation therapy (PST) to model a PST workflow and then use it to identify purposes of communication, information issues and contextual factors that impact them. We then suggest how our findings could inform health information technology (HIT) design to support team based communication workflows.

  1. Learning Clinical Workflows to Identify Subgroups of Heart Failure Patients

    Science.gov (United States)

    Yan, Chao; Chen, You; Li, Bo; Liebovitz, David; Malin, Bradley

    2016-01-01

    Heart Failure (HF) is one of the most common indications for readmission to the hospital among elderly patients. This is due to the progressive nature of the disease, as well as its association with complex comorbidities (e.g., anemia, chronic kidney disease, chronic obstructive pulmonary disease, hyper- and hypothyroidism), which contribute to increased morbidity and mortality, as well as a reduced quality of life. Healthcare organizations (HCOs) have established diverse treatment plans for HF patients, but such routines are not always formalized and may, in fact, arise organically as a patient’s management evolves over time. This investigation was motivated by the hypothesis that patients associated with a certain subgroup of HF should follow a similar workflow that, once made explicit, could be leveraged by an HCO to more effectively allocate resources and manage HF patients. Thus, in this paper, we introduce a method to identify subgroups of HF through a similarity analysis of event sequences documented in the clinical setting. Specifically, we 1) structure event sequences for HF patients based on the patterns of electronic medical record (EMR) system utilization, 2) identify subgroups of HF patients by applying a k-means clustering algorithm on utilization patterns, 3) learn clinical workflows for each subgroup, and 4) label each subgroup with diagnosis and procedure codes that are distinguishing in the set of all subgroups. To demonstrate its potential, we applied our method to EMR event logs for 785 HF inpatient stays over a 4 month period at a large academic medical center. Our method identified 8 subgroups of HF, each of which was found to associate with a canonical workflow inferred through an inductive mining algorithm. Each subgroup was further confirmed to be affiliated with specific comorbidities, such as hyperthyroidism and hypothyroidism. PMID:28269922
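
    A minimal sketch of the clustering step (step 2) is shown below, assuming the event sequences have already been reduced to per-patient utilization feature vectors; the data here is synthetic, standing in for features derived from EMR event logs.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        # Rows: inpatient stays; columns: counts of each EMR event type.
        utilization = rng.poisson(lam=3.0, size=(785, 20))

        kmeans = KMeans(n_clusters=8, n_init=10, random_state=0)
        labels = kmeans.fit_predict(utilization)

        # Each label identifies a candidate HF subgroup whose event logs
        # can then be fed to a process-mining algorithm to learn its
        # canonical workflow.
        for k in range(8):
            print(f"subgroup {k}: {np.sum(labels == k)} stays")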

  2. An Adaptable Seismic Data Format for Modern Scientific Workflows

    Science.gov (United States)

    Smith, J. A.; Bozdag, E.; Krischer, L.; Lefebvre, M.; Lei, W.; Podhorszki, N.; Tromp, J.

    2013-12-01

    Data storage, exchange, and access play a critical role in modern seismology. Current seismic data formats, such as SEED, SAC, and SEG-Y, were designed with specific applications in mind and are frequently a major bottleneck in implementing efficient workflows. We propose a new modern parallel format that can be adapted for a variety of seismic workflows. The Adaptable Seismic Data Format (ASDF) features high-performance parallel read and write support and the ability to store an arbitrary number of traces of varying sizes. Provenance information is stored inside the file so that users know the origin of the data as well as the precise operations that have been applied to the waveforms. The design of the new format is based on several real-world use cases, including earthquake seismology and seismic interferometry. The metadata is based on the proven XML schemas StationXML and QuakeML. Existing time-series analysis tool-kits are easily interfaced with this new format so that seismologists can use robust, previously developed software packages, such as ObsPy and the SAC library. ADIOS, netCDF4, and HDF5 can be used as the underlying container format. At Princeton University, we have chosen to use ADIOS as the container format because it has shown superior scalability for certain applications, such as dealing with big data on HPC systems. In the context of high-performance computing, we have implemented ASDF into the global adjoint tomography workflow on Oak Ridge National Laboratory's supercomputer Titan.
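
    The core idea, storing many variable-length traces together with their provenance in one hierarchical container, can be sketched with plain HDF5. The layout below is illustrative only and does not follow the actual ASDF schema.

        import numpy as np
        import h5py

        with h5py.File("example_waveforms.h5", "w") as f:
            grp = f.create_group("Waveforms/XX.STA01")
            for i, npts in enumerate([1000, 1500, 800]):  # varying sizes
                trace = np.random.randn(npts).astype("float32")
                ds = grp.create_dataset(f"trace_{i}", data=trace)
                ds.attrs["sampling_rate_hz"] = 40.0
                ds.attrs["provenance"] = "synthetic; bandpass 0.01-1 Hz (example)"

        with h5py.File("example_waveforms.h5", "r") as f:
            for name, ds in f["Waveforms/XX.STA01"].items():
                print(name, ds.shape, dict(ds.attrs))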

  3. The Workflow Specification of Process Definition

    Institute of Scientific and Technical Information of China (English)

    缪晓阳; 石文俊; 吴朝晖

    2000-01-01

    This paper discusses how a business process can be represented in an interchangeable form. Three basic aspects are covered: the concept of workflow process definition, from which the idea of process definition interchange arises; the workflow meta-model, which describes the entities within a process definition and their attributes; and the Workflow Process Definition Language (WPDL), which is used to implement the process definition.
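
    A minimal sketch of the meta-model idea is given below: a process definition holds activities (entities with attributes) and the transitions between them, which is what an interchange language such as WPDL serializes. The class and field names are illustrative, not taken from the WPDL specification.

        from dataclasses import dataclass, field

        @dataclass
        class Activity:
            name: str
            performer: str = "any"   # example attribute of the entity

        @dataclass
        class Transition:
            source: str
            target: str
            condition: str = "true"  # guard evaluated at run time

        @dataclass
        class ProcessDefinition:
            name: str
            activities: list[Activity] = field(default_factory=list)
            transitions: list[Transition] = field(default_factory=list)

        proc = ProcessDefinition(
            name="order_handling",
            activities=[Activity("receive"), Activity("approve", performer="manager")],
            transitions=[Transition("receive", "approve")],
        )
        print(proc)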

  4. Design and Implementation of Visualized Workflow Modeling System Based on B/S Structure

    Institute of Scientific and Technical Information of China (English)

    WANG Jian; LI wei-li

    2007-01-01

    To meet the need for flexible workflow management systems, a solution for building a visualized workflow modeling system based on the B/S (browser/server) structure is put forward, conforming to the relevant WfMC specifications and the workflow process definition meta-model. The design of the system structure is presented in detail, and the key technologies for system implementation are also introduced. Additionally, an example is given to demonstrate the validity of the system.

  5. Leveraging an existing data warehouse to annotate workflow models for operations research and optimization.

    Science.gov (United States)

    Borlawsky, Tara; LaFountain, Jeanne; Petty, Lynda; Saltz, Joel H; Payne, Philip R O

    2008-11-06

    Workflow analysis is frequently performed in the context of operations research and process optimization. In order to develop a data-driven workflow model that can be employed to assess opportunities to improve the efficiency of perioperative care teams at The Ohio State University Medical Center (OSUMC), we have developed a method for integrating standard workflow modeling formalisms, such as UML activity diagrams, with data-centric annotations derived from our existing data warehouse.

  6. Multi-constraint optimal allocation of workflows to the resources of a Cloud Computing environment

    OpenAIRE

    Yassa, Sonia

    2014-01-01

    Cloud Computing is increasingly recognized as a new way to use on-demand computing, storage and network services in a transparent and efficient way. In this thesis, we address the problem of workflow scheduling on the distributed, heterogeneous infrastructure of Cloud Computing. Existing workflow scheduling approaches mainly focus on bi-objective optimization of makespan and cost. In this thesis, we propose new workflow scheduling algorithms based on metaheuristics. Our algori...

  7. Robust Workflow Systems + Flexible Geoprocessing Services = Geo-enabled Model Web?

    OpenAIRE

    GRANELL CANUT CARLOS

    2013-01-01

    The chapter begins by briefly exploring the concept of modeling in the geosciences, which benefits notably from advances in the integration of geoprocessing services and workflow systems. In Section 3, we provide a comprehensive background on the technology trends treated in the chapter. On the one hand, we deal with workflow systems, normally categorized in the literature as scientific and business workflow systems (Barga and Gannon 2007). In particular, we introduce some prominent examples of scient...

  8. Big data analytics workflow management for eScience

    Science.gov (United States)

    Fiore, Sandro; D'Anca, Alessandro; Palazzo, Cosimo; Elia, Donatello; Mariello, Andrea; Nassisi, Paola; Aloisio, Giovanni

    2015-04-01

    In many domains, such as climate science and astrophysics, scientific data is often n-dimensional and requires tools that support specialized data types and primitives if it is to be properly stored, accessed, analysed and visualized. Currently, scientific data analytics relies on domain-specific software and libraries providing a huge set of operators and functionalities. However, most of these tools fail at large scale since they: (i) are desktop based, rely on local computing capabilities and need the data locally; (ii) cannot benefit from available multicore/parallel machines since they are based on sequential codes; (iii) do not provide declarative languages to express scientific data analysis tasks; and (iv) do not provide newer or more scalable storage models to better support data multidimensionality. Additionally, most of them (v) are domain-specific, which also means they support a limited set of data formats, and (vi) do not provide workflow support to enable the construction, execution and monitoring of more complex "experiments". The Ophidia project aims at facing most of the challenges highlighted above by providing a big data analytics framework for eScience. Ophidia provides several parallel operators to manipulate large datasets. Some relevant examples include: (i) data sub-setting (slicing and dicing), (ii) data aggregation, (iii) array-based primitives (the same operator applies to all the implemented UDF extensions), (iv) data cube duplication, (v) data cube pivoting, and (vi) NetCDF import and export. Metadata operators are available too. Additionally, the Ophidia framework provides array-based primitives to perform data sub-setting, data aggregation (e.g., max, min, avg), array concatenation, algebraic expressions and predicate evaluation on large arrays of scientific data. Bit-oriented plugins have also been implemented to manage binary data cubes. Defining processing chains and workflows with tens, hundreds of data analytics operators is the
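
    The flavor of the array-based primitives can be conveyed with a plain NumPy datacube, as sketched below; Ophidia's own operators execute such primitives in parallel over distributed data fragments, which this single-machine sketch does not attempt to show.

        import numpy as np

        # Synthetic 3-D datacube: time x lat x lon.
        cube = np.random.default_rng(1).random((12, 90, 180))

        sliced = cube[0:6, 30:60, :]          # sub-setting: slice a region
        monthly_max = cube.max(axis=(1, 2))   # aggregation: max over space
        annual_avg = cube.mean(axis=0)        # aggregation: mean over time

        print(sliced.shape, monthly_max.shape, annual_avg.shape)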

  9. WooW-II: Workshop on open workflows

    Directory of Open Access Journals (Sweden)

    Daniel Arribas-Bel

    2015-07-01

    This resource describes WooW-II, a two-day workshop on open workflows for quantitative social scientists. The workshop is broken down into five main parts, each of which typically consists of an introductory tutorial and a hands-on assignment. The specific tools discussed in this workshop are Markdown, Pandoc, Git, GitHub, R, and RStudio, but the theoretical approach applies to a wider range of tools (e.g., LaTeX and Python). By the end of the workshop, participants should be able to reproduce a paper of their own and make it available in an open form, applying the concepts and tools introduced.

  10. Reference and PDF-manager software: complexities, support and workflow.

    Science.gov (United States)

    Mead, Thomas L; Berryman, Donna R

    2010-10-01

    In the past, librarians taught reference management by training library users to use established software programs such as RefWorks or EndNote. In today's environment, there is a proliferation of Web-based programs that are being used by library clientele that offer a new twist on the well-known reference management programs. Basically, these new programs are PDF-manager software (e.g., Mendeley or Papers). Librarians are faced with new questions, issues, and concerns, given the new workflows and pathways that these PDF-manager programs present. This article takes a look at some of those.

  11. PhyloGrid: a development for a workflow in Phylogeny

    CERN Document Server

    Montes, Esther; Mayo, Rafael

    2010-01-01

    In this work we present the development of a Taverna-based workflow to be used for phylogenetic calculations by means of the MrBayes tool. It has a friendly interface developed with the GridSphere framework. The user is able to define the parameters for the Bayesian calculation, determine the model of evolution, and check the accuracy of the results at intermediate stages, as well as perform a multiple alignment of the sequences prior to the final result. No knowledge of the underlying computational procedure is required on the user's part.

  12. Workflow for large-scale analysis of melanoma tissue samples

    Directory of Open Access Journals (Sweden)

    Maria E. Yakovleva

    2015-09-01

    The aim of the present study was to create an optimal workflow for analysing a large cohort of malignant melanoma tissue samples. Samples were lysed with urea and enzymatically digested with trypsin or trypsin/Lys-C. Buffer exchange or dilution was used to reduce the urea concentration prior to digestion. The tissue digests were analysed directly, or following strong cation exchange (SCX) fractionation, by nano LC–MS/MS. The approach that resulted in the largest number of protein IDs involved a buffer exchange step before enzymatic digestion with trypsin and chromatographic separation with a 120 min gradient, followed by SCX–RP separation of peptides.

  13. CMS Data Processing Workflows during an Extended Cosmic Ray Run

    Energy Technology Data Exchange (ETDEWEB)

    2009-11-01

    The CMS Collaboration conducted a month-long data taking exercise, the Cosmic Run At Four Tesla, during October-November 2008, with the goal of commissioning the experiment for extended operation. With all installed detector systems participating, CMS recorded 270 million cosmic ray events with the solenoid at a magnetic field strength of 3.8 T. This paper describes the data flow from the detector through the various online and offline computing systems, as well as the workflows used for recording the data, for aligning and calibrating the detector, and for analysis of the data.

  15. Integrating workflow and project management systems for PLM applications

    Directory of Open Access Journals (Sweden)

    Fabio Fonseca Pereira de Paula

    2008-07-01

    The adoption of the Product Life-cycle Management (PLM) systems concept is fundamental to improving product development, especially for small and medium enterprises (SMEs). One of the challenges is the integration between project management and product data management functions. The paper presents an analysis of the potential integration strategies for a specific product data management system (SMARTEAM) and a project management system (Microsoft Project), both commonly used by SMEs. Finally, the article presents some considerations on project management solutions in SMEs from a PLM perspective. Keywords: integration, project management (PM), workflow, PDM, PLM.

  16. Bandwidth-Aware Scheduling of Workflow Application on Multiple Grid Sites

    Directory of Open Access Journals (Sweden)

    Harshadkumar B. Prajapati

    2014-01-01

    Bandwidth-aware workflow scheduling is required to improve the performance of a workflow application in a multisite Grid environment, as the data movement cost between two low-bandwidth sites can adversely affect the makespan of the application. Pegasus WMS, an open-source and freely available WMS, cannot fully utilize its workflow mapping capability because no bandwidth monitoring infrastructure is integrated into it. This paper develops an integration of the Network Weather Service (NWS) into Pegasus WMS to enable bandwidth-aware mapping of scientific workflows. Our work demonstrates the applicability of the NWS integration by making the existing Heft site-selector of Pegasus WMS bandwidth aware. Furthermore, this paper proposes and implements a new workflow scheduling algorithm, Level-based Highest Input and Processing Weight First. The results of the performed experiments indicate that the bandwidth-aware workflow scheduling algorithms perform better than the bandwidth-unaware algorithms, Random and Heft, of Pegasus WMS. Moreover, our proposed workflow scheduling algorithm performs better than the bandwidth-aware Heft algorithms. Thus, the proposed bandwidth-aware workflow scheduling enhances the capability of Pegasus WMS and can increase the performance of workflow applications.
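
    The essence of bandwidth-aware site selection is to add a transfer term to the estimated completion time, as in the following sketch; the numbers are illustrative and the function is not the paper's actual algorithm.

        # Estimated completion time = compute term + transfer term, so a
        # slow link can disqualify an otherwise fast site.
        def completion_time(task_flops, data_bytes, site):
            compute = task_flops / site["flops_per_s"]
            transfer = data_bytes / site["bandwidth_Bps"]  # e.g. from NWS probes
            return compute + transfer

        sites = {
            "A": {"flops_per_s": 2e12, "bandwidth_Bps": 1e6},  # fast CPU, slow link
            "B": {"flops_per_s": 5e11, "bandwidth_Bps": 1e8},  # slower CPU, fast link
        }

        task_flops, data_bytes = 1e13, 5e9
        for name, site in sites.items():
            print(name, f"{completion_time(task_flops, data_bytes, site):.1f} s")
        best = min(sites, key=lambda s: completion_time(task_flops, data_bytes, sites[s]))
        print("selected site:", best)  # B: the fast link wins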

  17. Pegasus: A Framework for Mapping Complex Scientific Workflows onto Distributed Systems

    Directory of Open Access Journals (Sweden)

    Ewa Deelman

    2005-01-01

    This paper describes the Pegasus framework, which can be used to map complex scientific workflows onto distributed resources. Pegasus enables users to represent workflows at an abstract level without needing to worry about the particulars of the target execution systems. The paper describes general issues in mapping applications and the functionality of Pegasus. We present the results of improving application performance through workflow restructuring, which clusters multiple tasks in a workflow into single entities. A real-life astronomy application is used as the basis for the study.
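
    The task-clustering restructuring evaluated in the paper can be sketched as follows, representing the workflow as tasks grouped by level; the structure and grouping policy are simplifications for illustration, not Pegasus's implementation.

        # level -> tasks; tasks on one level are mutually independent, so
        # merging them into one schedulable unit preserves the DAG's
        # dependencies while cutting per-task scheduling overhead.
        levels = {0: ["extract"],
                  1: ["proj_1", "proj_2", "proj_3", "proj_4"],
                  2: ["merge"]}

        def cluster(levels, max_per_cluster=2):
            clustered = {}
            for lvl, tasks in levels.items():
                groups = [tasks[i:i + max_per_cluster]
                          for i in range(0, len(tasks), max_per_cluster)]
                clustered[lvl] = ["+".join(g) for g in groups]
            return clustered

        print(cluster(levels))
        # {0: ['extract'], 1: ['proj_1+proj_2', 'proj_3+proj_4'], 2: ['merge']}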

  18. Word Automaticity of Tree Automatic Scattered Linear Orderings Is Decidable

    CERN Document Server

    Huschenbett, Martin

    2012-01-01

    A tree automatic structure is a structure whose domain can be encoded by a regular tree language such that each relation is recognisable by a finite automaton processing tuples of trees synchronously. Words can be regarded as specific simple trees and a structure is word automatic if it is encodable using only these trees. The question naturally arises whether a given tree automatic structure is already word automatic. We prove that this problem is decidable for tree automatic scattered linear orderings. Moreover, we show that in case of a positive answer a word automatic presentation is computable from the tree automatic presentation.

  19. A High Throughput Workflow Environment for Cosmological Simulations

    CERN Document Server

    Erickson, Brandon M S; Evrard, August E; Becker, Matthew R; Busha, Michael T; Kravtsov, Andrey V; Marru, Suresh; Pierce, Marlon; Wechsler, Risa H

    2012-01-01

    The next generation of wide-area sky surveys offer the power to place extremely precise constraints on cosmological parameters and to test the source of cosmic acceleration. These observational programs will employ multiple techniques based on a variety of statistical signatures of galaxies and large-scale structure. These techniques have sources of systematic error that need to be understood at the percent-level in order to fully leverage the power of next-generation catalogs. Simulations of large-scale structure provide the means to characterize these uncertainties. We are using XSEDE resources to produce multiple synthetic sky surveys of galaxies and large-scale structure in support of science analysis for the Dark Energy Survey. In order to scale up our production to the level of fifty 10^10-particle simulations, we are working to embed production control within the Apache Airavata workflow environment. We explain our methods and report how the workflow has reduced production time by 40% compared to manua...

  20. Designing Collaborative Healthcare Technology for the Acute Care Workflow

    Directory of Open Access Journals (Sweden)

    Michael Gonzales

    2015-10-01

    Preventable medical errors in hospitals are the third leading cause of death in the United States. Many of these are caused by poor situational awareness, especially in acute care resuscitation scenarios. While a number of checklists and technological interventions have been developed to reduce cognitive load and improve situational awareness, these tools often do not fit the clinical workflow. To better understand the challenges faced by clinicians in acute care codes, we conducted a qualitative study with interprofessional clinicians at three regional hospitals. Our key findings are that current documentation processes are inadequate (with information recorded on paper towels); reference guides can serve as fixation points, reducing rather than enhancing situational awareness; the physical environment imposes significant constraints on workflow; homegrown solutions are often used to work around unstandardized processes; and simulation scenarios do not match real-world practice. We present a number of considerations for collaborative healthcare technology design and discuss the implications of our findings for current work on the development of more effective interventions for acute care resuscitation scenarios.

  1. Autonomic Management of Application Workflows on Hybrid Computing Infrastructure

    Directory of Open Access Journals (Sweden)

    Hyunjoo Kim

    2011-01-01

    In this paper, we present a programming and runtime framework that enables the autonomic management of complex application workflows on hybrid computing infrastructures. The framework is designed to address system and application heterogeneity and dynamics to ensure that application objectives and constraints are satisfied. The need for such autonomic system and application management is becoming critical as computing infrastructures become increasingly heterogeneous, integrating different classes of resources from high-end HPC systems to commodity clusters and clouds. For example, the framework presented in this paper can be used to provision the appropriate mix of resources based on application requirements and constraints. The framework also monitors the system/application state and adapts the application and/or resources to respond to changing requirements or environment. To demonstrate the operation of the framework and to evaluate its capabilities, we employ a workflow used to characterize an oil reservoir, executing on a hybrid infrastructure composed of TeraGrid nodes and Amazon EC2 instances of various types. Specifically, we show how different application objectives, such as acceleration, conservation and resilience, can be effectively achieved while satisfying deadline and budget constraints, using an appropriate mix of dynamically provisioned resources. Our evaluations also demonstrate that public clouds can be used to complement and reinforce the scheduling and usage of traditional high-performance computing infrastructure.
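
    The kind of objective-driven provisioning decision the framework automates might look, in highly simplified form, like the following sketch; the three objectives are taken from the paper, but the policy logic and the numbers are invented for illustration.

        # Toy policy: pick a mix of HPC and cloud resources per objective.
        def provision(objective, hpc_nodes_free, budget_usd):
            if objective == "acceleration" and budget_usd > 0:
                # Burst to cloud on top of all free HPC nodes.
                return {"hpc": hpc_nodes_free, "cloud": int(budget_usd // 2)}
            if objective == "conservation":
                # Spend no budget; use only what the HPC allocation offers.
                return {"hpc": hpc_nodes_free, "cloud": 0}
            if objective == "resilience":
                # Keep a cloud standby replica of part of the workload.
                return {"hpc": hpc_nodes_free, "cloud": max(1, hpc_nodes_free // 4)}
            raise ValueError(objective)

        print(provision("acceleration", hpc_nodes_free=8, budget_usd=10))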

  2. Automation and workflow considerations for embedding Digimarc Barcodes at scale

    Science.gov (United States)

    Rodriguez, Tony; Haaga, Don; Calhoon, Sean

    2015-03-01

    The Digimarc® Barcode is a digital watermark applied to packages and variable-data labels that carries the GS1-standard GTIN-14 data traditionally carried by a 1-D barcode. The Digimarc Barcode can be read with smartphones and imaging-based barcode readers commonly used in grocery and retail environments. Using smartphones, consumers can engage with products, and retailers can materially increase the speed of check-out, increasing store margins and providing a better experience for shoppers. Internal testing has shown an average 53% increase in scanning throughput, enabling hundreds of millions of dollars in cost savings [1] for retailers when deployed at scale. To get to scale, the process of embedding a digital watermark must be automated and integrated within existing workflows. Creating the tools and processes to do so represents a new challenge for the watermarking community. This paper presents a description and an analysis of the workflow implemented by Digimarc to deploy the Digimarc Barcode at scale. An overview of the tools created and lessons learned during the introduction of the technology to the market are provided.

  3. A novel spectral library workflow to enhance protein identifications.

    Science.gov (United States)

    Li, Haomin; Zong, Nobel C; Liang, Xiangbo; Kim, Allen K; Choi, Jeong Ho; Deng, Ning; Zelaya, Ivette; Lam, Maggie; Duan, Huilong; Ping, Peipei

    2013-04-09

    The innovations in mass spectrometry-based investigations in proteome biology enable systematic characterization of molecular details in pathophysiological phenotypes. However, the process of delineating large-scale raw proteomic datasets into a biological context requires high-throughput data acquisition and processing. A spectral library search engine makes use of previously annotated experimental spectra as references for subsequent spectral analyses. This workflow delivers many advantages, including elevated analytical efficiency and specificity as well as reduced demands on computational capacity. In this study, we created a spectral matching engine to address challenges commonly associated with a library search workflow. In particular, an improved sliding dot product algorithm that is robust to systematic drifts in spectral mass measurement is introduced. Furthermore, a noise management protocol distinguishes spectral correlation attributable to noise from that attributable to peptide fragments. It enables elevated separation between target spectral matches and false matches, thereby suppressing the possibility of propagating inaccurate peptide annotations from library spectra to query spectra. Moreover, preservation of the original spectra also accommodates user contributions to further enhance the quality of the library. Collectively, this search engine supports reproducible data analyses using curated references, thereby broadening the accessibility of proteomics resources to biomedical investigators. This article is part of a Special Issue entitled: From protein structures to clinical applications.
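
    The idea behind a sliding dot product can be sketched in a few lines: compute a normalized dot product at several mass-bin offsets and keep the best-aligned score, which makes the similarity robust to a systematic calibration shift. This is a conceptual sketch, not the authors' implementation.

        import numpy as np

        def sliding_dot(query, library, max_shift=2):
            # Try small shifts of the binned library spectrum and keep the
            # best cosine similarity, tolerating systematic m/z drift.
            best = 0.0
            for shift in range(-max_shift, max_shift + 1):
                shifted = np.roll(library, shift)
                denom = np.linalg.norm(query) * np.linalg.norm(shifted)
                if denom > 0:
                    best = max(best, float(query @ shifted) / denom)
            return best

        # Binned intensities; the query is the library pattern drifted by one bin.
        library = np.array([0.0, 5.0, 1.0, 0.0, 3.0, 0.0])
        query = np.roll(library, 1)
        print(f"aligned similarity: {sliding_dot(query, library):.2f}")  # 1.00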

  4. Scientific Workflows and the Sensor Web for Virtual Environmental Observatories

    Science.gov (United States)

    Simonis, I.; Vahed, A.

    2008-12-01

    All data sets and sensor communication follow well-defined abstract models and corresponding encodings, mostly developed by the OGC Sensor Web Enablement initiative. Scientific progress is currently accelerated by an emerging concept called scientific workflows, which organize and manage complex distributed computations. A scientific workflow represents and records the highly complex processes that a domain scientist typically follows in the exploration, discovery and, ultimately, transformation of raw data into publishable results. The challenge is now to integrate the benefits of scientific workflows with those provided by the Sensor Web in order to leverage all resources for scientific exploration, problem solving, and knowledge generation. Scientific workflows for the Sensor Web represent the next evolutionary step towards efficient, powerful, and flexible earth observation frameworks and platforms. Such platforms support the entire process from capturing data, through sharing and integration, to requesting additional observations. Multiple sites and organizations participate on single platforms, and scientists from different countries and organizations interact and contribute to large-scale research projects. Simultaneously, the data and information overload becomes manageable, as multiple layers of abstraction free scientists from having to deal with underlying data, processing, or storage peculiarities. The vision is automated investigation and discovery mechanisms that allow scientists to pose queries to the system, which in turn identifies potentially related resources, schedules processing tasks, and assembles all parts into workflows that may satisfy the query.

  5. Integrated Enterprise Modeling Method Based on Workflow Model and Multiviews

    Institute of Scientific and Technical Information of China (English)

    林慧苹; 范玉顺; 吴澄

    2001-01-01

    Many enterprise modeling methods have been proposed to model the business processes of enterprises and to implement CIM systems. But difficulties are still encountered when these methods are applied to CIM system design and implementation. This paper proposes a new integrated enterprise modeling methodology based on the workflow model. The system architecture and the integrated modeling environment are described together with a new simulation strategy. The modeling process and the relationship between the workflow model and the views are discussed.

  6. CrossFlow: Cross-Organizational Workflow Management in Dynamic Virtual Enterprises

    NARCIS (Netherlands)

    Grefen, Paul; Aberer, Karl; Hoffner, Yigal; Ludwig, Heiko

    2000-01-01

    In this report, we present the approach to cross-organizational workflow management of the CrossFlow project. CrossFlow is a European research project aiming at the support of cross-organizational workflows in dynamic virtual enterprises. The cooperation in these virtual enterprises is based on dynamic …

  7. CrossFlow: cross-organizational workflow management in dynamic virtual enterprises

    NARCIS (Netherlands)

    Grefen, Paul; Aberer, Karl; Hoffner, Yigal; Ludwig, Heiko

    2000-01-01

    This paper gives a detailed overview of the approach to cross-organizational workflow management developed in the CrossFlow project. CrossFlow is a European research project aiming at the support of cross-organizational workflows in dynamic virtual enterprises. The cooperation in these virtual enterprises …

  8. Scheduling Multilevel Deadline-Constrained Scientific Workflows on Clouds Based on Cost Optimization

    Directory of Open Access Journals (Sweden)

    Maciej Malawski

    2015-01-01

    This paper presents a cost optimization model for scheduling scientific workflows on IaaS clouds such as Amazon EC2 or RackSpace. We assume multiple IaaS clouds with heterogeneous virtual machine instances, with a limited number of instances per cloud and hourly billing. Input and output data are stored on a cloud object store such as Amazon S3. Applications are scientific workflows modeled as DAGs, as in the Pegasus Workflow Management System. We assume that tasks in the workflows are grouped into levels of identical tasks. Our model is specified using mathematical programming languages (AMPL and CMPL) and allows us to minimize the cost of workflow execution under deadline constraints. We present results obtained using our model and benchmark workflows representing real scientific applications in a variety of domains. The data used for evaluation come from synthetic workflows and from general-purpose cloud benchmarks, as well as from data measured in our own experiments with Montage, an astronomical application, executed on the Amazon EC2 cloud. We indicate how this model can be used for scenarios that require resource planning for scientific workflows and their ensembles.
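
    A toy, brute-force version of the optimization can make the model concrete (the paper itself formulates it in AMPL/CMPL): choose one instance type per level of identical tasks so that cost under hourly billing is minimal while the deadline holds. Prices, speeds and level sizes below are hypothetical.

        import math
        from itertools import product

        vm_types = {"small": (0.10, 1.0), "large": (0.40, 3.0)}  # ($/hour, speedup)
        levels = [{"tasks": 10, "hours_per_task": 1.0},  # levels of identical tasks
                  {"tasks": 4, "hours_per_task": 2.0}]
        deadline = 12.0

        def plan(assignment):
            # Cost and makespan of running each level on its assigned VM type;
            # levels run sequentially, and hourly billing rounds hours up.
            cost = runtime = 0.0
            for level, vm in zip(levels, assignment):
                price, speed = vm_types[vm]
                hours = level["tasks"] * level["hours_per_task"] / speed
                cost += price * math.ceil(hours)
                runtime += hours
            return cost, runtime

        feasible = []
        for assignment in product(vm_types, repeat=len(levels)):
            cost, runtime = plan(assignment)
            if runtime <= deadline:
                feasible.append((cost, runtime, assignment))
        print(min(feasible))  # cheapest plan that meets the deadline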

  9. MapReduce Operations with WS-VLAM Workflow Management System

    NARCIS (Netherlands)

    Baranowski, M.; Belloum, A.; Bubak, M.

    2013-01-01

    Workflow management systems are widely used to solve scientific problems as they enable orchestration of remote and local services such as database queries, job submission and running an application. To extend the role that workflow systems play in data-intensive science, we propose a solution that …

  10. Soundness of Timed-Arc Workflow Nets in Discrete and Continuous-Time Semantics

    DEFF Research Database (Denmark)

    Mateo, Jose Antonio; Srba, Jiri; Sørensen, Mathias Grund

    2015-01-01

    Analysis of workflow processes with quantitative aspects like timing is of interest in numerous time-critical applications. We suggest a workflow model based on timed-arc Petri nets and study the foundational problems of soundness and strong (time-bounded) soundness. We first consider the discrete-time …

  11. VLAM-G: Interactive Data Driven Workflow Engine for Grid-Enabled Resources

    Directory of Open Access Journals (Sweden)

    Vladimir Korkhov

    2007-01-01

    Grid brings the power of many computers to scientists. However, the development of Grid-enabled applications requires knowledge about Grid infrastructure and low-level APIs to Grid services. In turn, workflow management systems provide a high-level environment for rapid prototyping of experimental computing systems. Coupling Grid and workflow paradigms is important for the scientific community: it makes the power of the Grid easily available to the end user. The paradigm of data-driven workflow execution is one of the ways to enable distributed workflow on the Grid. The work presented in this paper is carried out in the context of the Virtual Laboratory for e-Science project. We present the VLAM-G workflow management system and its core component: the Run-Time System (RTS). The RTS is a dataflow-driven workflow engine which utilizes Grid resources, hiding the complexity of the Grid from the scientist. Special attention is paid to the concept of dataflow and direct data streaming between distributed workflow components. We present the architecture and components of the RTS, describe the features of VLAM-G workflow execution, and evaluate the system by performance measurements and a real-life use case.

  12. The Impact of Computerized Provider Order Entry Systems on Inpatient Clinical Workflow: A Literature Review

    NARCIS (Netherlands)

    Z. Niazkhani (Zahra); H. Pirnejad (Habibollah); M. Berg (Marc); J.E.C.M. Aarts (Jos)

    2009-01-01

    Previous studies have shown the importance of workflow issues in the implementation of CPOE systems and patient safety practices. To understand the impact of CPOE on clinical workflow, we developed a conceptual framework and conducted a literature search for CPOE evaluations between 1990 …

  13. CrossFlow: Cross-Organizational Workflow Management for Service Outsourcing in Dynamic Virtual Enterprises

    NARCIS (Netherlands)

    Grefen, Paul; Aberer, Karl; Ludwig, Heiko; Hoffner, Yigal

    2001-01-01

    In this report, we present the approach to cross-organizational workflow management of the CrossFlow project. CrossFlow is a European research project aiming at the support of cross-organizational workflows in dynamic virtual enterprises. The cooperation in these virtual enterprises is based on dynamic …

  14. A Six‐Stage Workflow for Robust Application of Systems Pharmacology

    Science.gov (United States)

    Gadkar, K; Kirouac, DC; Mager, DE; van der Graaf, PH

    2016-01-01

    Quantitative and systems pharmacology (QSP) is increasingly being applied in pharmaceutical research and development. One factor critical to the ultimate success of QSP is the establishment of commonly accepted language, technical criteria, and workflows. We propose an integrated workflow that bridges conceptual objectives with underlying technical detail to support the execution, communication, and evaluation of QSP projects. PMID:27299936

  15. Provenance for Runtime Workflow Steering and Validation in Computational Seismology

    Science.gov (United States)

    Spinuso, A.; Krischer, L.; Krause, A.; Filgueira, R.; Magnoni, F.; Muraleedharan, V.; David, M.

    2014-12-01

    Provenance systems may be offered by modern workflow engines to collect metadata about the data transformations at runtime. If combined with effective visualisation and monitoring interfaces, these provenance recordings can speed up the validation process of an experiment, suggesting interactive or automated interventions with immediate effects on the lifecycle of a workflow run. For instance, in the field of computational seismology, if we consider research applications performing long-lasting cross-correlation analysis and high-resolution simulations, the immediate notification of logical errors and rapid access to intermediate results can produce reactions which foster more efficient progress of the research. These applications are often executed in secured and sophisticated HPC and HTC infrastructures, highlighting the need for a comprehensive framework that facilitates the extraction of fine-grained provenance and the development of provenance-aware components, leveraging the scalability characteristics of the adopted workflow engines, whose enactment can be mapped to different technologies (MPI, Storm clusters, etc.). This work looks at the adoption of W3C-PROV concepts and data model within a user-driven processing and validation framework for seismic data, supporting also computational and data management steering. Validation needs to balance automation with user intervention, considering the scientist as part of the archiving process. Therefore, the provenance data is enriched with community-specific metadata vocabularies and control messages, making an experiment reproducible and its description consistent with the community understandings. Moreover, it can contain user-defined terms and annotations. The current implementation of the system is supported by the EU-funded VERCE project (http://verce.eu). It provides, as well as the provenance generation mechanisms, a prototypal browser-based user interface and a web API built on top of a NoSQL storage …

  16. An Improvement on Algorithm of Grid-Workflow Based on QoS

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yun-feng; GE Wei

    2004-01-01

    With the emergence of grid computing, new challenges have arisen in workflow task scheduling. The goal of grid-workflow task scheduling is to achieve high system throughput and to match the application needs with the available computing resources. This matching of resources in a non-deterministically shared heterogeneous environment leads to concerns about quality of service (QoS). This paper presents the grid concept, the QoS requirements of workflow tasks, and an improved scheduling algorithm, the ILGSS algorithm. The complexity of the improved scheduling algorithm is analyzed. The experimental results show that the improved algorithm can lead to significant performance gains in various applications. An important research domain, adaptive workflow transactions in grid computing environments, has been explored, and a new solution for the scheduling of distributed workflows in grid environments has been put forward.

  17. SemanticSCo: A platform to support the semantic composition of services for gene expression analysis.

    Science.gov (United States)

    Guardia, Gabriela D A; Ferreira Pires, Luís; da Silva, Eduardo G; de Farias, Cléver R G

    2017-02-01

    Gene expression studies often require the combined use of a number of analysis tools. However, manual integration of analysis tools can be cumbersome and error prone. To support a higher level of automation in the integration process, efforts have been made in the biomedical domain towards the development of semantic web services and supporting composition environments. Yet, most environments consider only the execution of simple service behaviours and require users to focus on technical details of the composition process. We propose a novel approach to the semantic composition of gene expression analysis services that addresses the shortcomings of the existing solutions. Our approach includes an architecture designed to support the service composition process for gene expression analysis, and a flexible strategy for the (semi-)automatic composition of semantic web services. Finally, we implement a supporting platform called SemanticSCo to realize the proposed composition approach and demonstrate its functionality by successfully reproducing a microarray study documented in the literature. The SemanticSCo platform provides support for the composition of RESTful web services semantically annotated using SAWSDL. Our platform also supports the definition of constraints/conditions regarding the order in which service operations should be invoked, thus enabling the definition of complex service behaviours. Our proposed solution for semantic web service composition takes into account the requirements of different stakeholders and addresses all phases of the service composition process. It also provides support for the definition of analysis workflows at a high level of abstraction, thus enabling users to focus on biological research issues rather than on the technical details of the composition process. The SemanticSCo source code is available at https://github.com/usplssb/SemanticSCo.

  18. Automatic Program Development

    DEFF Research Database (Denmark)

    Automatic Program Development is a tribute to Robert Paige (1947-1999), our accomplished and respected colleague, and moreover our good friend, whose untimely passing was a loss to our academic and research community. We have collected the revised, updated versions of the papers published in his honor in the Higher-Order and Symbolic Computation Journal in the years 2003 and 2005, written by members of the IFIP Working Group 2.1, of which Bob was an active member. All papers are related to some of the research interests of Bob and, in particular, to the transformational development of programs and their algorithmic derivation from formal specifications. Among them there are two papers by Bob: (i) a retrospective view of his research lines, and (ii) a proposal for future studies in the area of automatic program derivation. The book also includes some papers …

  19. The Distributed Workflow Management System--FlowAgent

    Institute of Scientific and Technical Information of China (English)

    王文军; 仲萃豪

    2000-01-01

    While mainframe and 2-tier client/server systems have serious problems in flexibility and scalability for large-scale business processes, 3-tier client/server architecture and object-oriented system modeling, which construct business processes from service components, seem to bring software systems some scalability. As an enabling infrastructure for object-oriented methodology, a distributed WFMS (Workflow Management System) can flexibly describe business rules among autonomous 'service tasks' and support the scalability of large-scale business processes. But current distributed WFMSs still have difficulty managing a large number of distributed tasks; the 'multi-TaskDomain' architecture of FlowAgent tries to solve this problem and to provide a dynamic and distributed environment for task scheduling.

  20. Approach for workflow modeling using π-calculus

    Institute of Scientific and Technical Information of China (English)

    杨东; 张申生

    2003-01-01

    As a variant of process algebra, π-calculus can describe the interactions between evolving processes. By modeling an activity as a process interacting with other processes through ports, this paper presents a new approach: representing workflow models using π-calculus. As a result, the model can characterize the dynamic behaviors of the workflow process in terms of the LTS (Labeled Transition Semantics) of π-calculus. The main advantage of the workflow model's formal semantics is that it allows for verification of the model's properties, such as deadlock-freedom and normal termination. Moreover, the equivalence of workflow models can be checked through the weak bisimulation theorem in the π-calculus, thus facilitating the optimization of business processes.
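
    To make the verification claim concrete, the sketch below encodes a workflow as a finite labelled transition system and checks deadlock freedom: every reachable state either terminates normally or has an outgoing transition. This finite-state check only stands in for what the π-calculus LTS semantics provides; the state and label names are invented.

        # A toy LTS: state -> {action label: successor state}.
        lts = {
            "start":  {"doA": "afterA"},
            "afterA": {"doB": "done"},
            "done":   {},               # terminal state: normal termination
        }
        TERMINAL = {"done"}

        def deadlock_free(lts, initial="start"):
            # Every reachable non-terminal state must have an outgoing transition.
            seen, stack = set(), [initial]
            while stack:
                state = stack.pop()
                if state in seen:
                    continue
                seen.add(state)
                if not lts[state] and state not in TERMINAL:
                    return False        # stuck state that is not termination
                stack.extend(lts[state].values())
            return True

        print(deadlock_free(lts))  # True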

  1. A Model of Workflow-oriented Attributed Based Access Control

    Directory of Open Access Journals (Sweden)

    Guoping Zhang

    2011-02-01

    The emergence of the "Internet of Things" breaks previous traditional thinking by integrating physical infrastructure and network infrastructure into a unified infrastructure. There will be a lot of resources and information in the IoT, so the computing and processing of information is the core support of the IoT. In this paper, we introduce "Service-Oriented Computing" so that each device can offer its functionality as standard services. Here we mainly discuss the access control issue of service-oriented computing in the Internet of Things. This paper puts forward a model of Workflow-oriented Attribute Based Access Control (WABAC) and designs an access control framework based on the WABAC model. The model grants permissions to subjects according to subject attributes, resource attributes, environment attributes and the current task, meeting the access control requirements of SOC. The presented approach can effectively enhance access control security for SOC applications and prevent the abuse of subject permissions.
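
    A minimal sketch of the WABAC-style decision: a request is permitted only when subject, resource and environment attributes plus the current task match some policy rule. The attribute names and the sample rule are hypothetical.

        def permit(subject, resource, environment, task, policies):
            # Grant iff some rule matches every attribute it constrains.
            request = {**subject, **resource, **environment, "task": task}
            return any(all(request.get(k) == v for k, v in rule.items())
                       for rule in policies)

        policies = [  # hypothetical rule for a sensor-reading task
            {"role": "operator", "type": "sensor-data",
             "network": "internal", "task": "read"},
        ]
        print(permit({"role": "operator"}, {"type": "sensor-data"},
                     {"network": "internal"}, "read", policies))   # True
        print(permit({"role": "operator"}, {"type": "sensor-data"},
                     {"network": "public"}, "read", policies))     # False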

  2. A Workflow for Differentially-Private Graph Synthesis

    CERN Document Server

    Proserpio, Davide; McSherry, Frank

    2012-01-01

    We present a new workflow for differentially-private publication of graph topologies. First, we produce differentially private measurements of interesting graph statistics using our new version of the PINQ programming language, Weighted PINQ, which is based on a generalization of differential privacy to weighted sets. Next, we show how to generate graphs that fit any set of measured graph statistics, even if they are inconsistent (due to noise), or if they are only indirectly related to the actual statistics that we want our synthetic graph to preserve. We do this by combining the answers to Weighted PINQ queries with an incremental evaluator (Markov Chain Monte Carlo (MCMC)) that allows us to synthesize graphs where the statistic of interest aligns with that of the protected graph. This paper presents our preliminary results; we show how to cast a few graph statistics (degree distribution, edge multiplicity, joint degree distribution) as queries in Weighted PINQ, and then present experimental results of synthetic …
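
    The measurement step can be illustrated with the plain Laplace mechanism on a degree histogram. Weighted PINQ generalizes differential privacy to weighted sets; the code below is a generic epsilon-DP illustration, not the Weighted PINQ API.

        import numpy as np

        def dp_degree_histogram(degrees, max_degree, epsilon):
            # One edge changes the degrees of its two endpoints, each moving a
            # unit of count between two buckets, so the histogram's L1
            # sensitivity is 4; Laplace noise of scale 4/epsilon gives eps-DP.
            hist = np.bincount(degrees, minlength=max_degree + 1).astype(float)
            return hist + np.random.laplace(scale=4.0 / epsilon, size=hist.shape)

        degrees = np.array([1, 2, 2, 3, 1, 4])
        noisy = dp_degree_histogram(degrees, max_degree=4, epsilon=0.5)
        print(noisy)  # noisy, possibly inconsistent counts -- exactly what the
                      # MCMC fitting step reconciles when synthesizing graphs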

  3. A Component Based Approach to Scientific Workflow Management

    CERN Document Server

    Le Goff, Jean-Marie; Baker, Nigel; Brooks, Peter; McClatchey, Richard

    2001-01-01

    CRISTAL is a distributed scientific workflow system used in the manufacturing and production phases of HEP experiment construction at CERN. The CRISTAL project has studied the use of a description-driven approach, using meta-modelling techniques, to manage the evolving needs of a large physics community. Interest from such diverse communities as bio-informatics and manufacturing has motivated the CRISTAL team to re-engineer the system to customize functionality according to end user requirements but maximize software reuse in the process. The next generation CRISTAL vision is to build a generic component architecture from which a complete software product line can be generated according to the particular needs of the target enterprise. This paper discusses the issues of adopting a component product line based approach and our experiences of software reuse.

  4. Managing Evolving Business Workflows through the Capture of Descriptive Information

    CERN Document Server

    Gaspard, S; Dindeleux, R; McClatchey, R; Gaspard, Sebastien; Estrella, Florida

    2003-01-01

    Business systems these days need to be agile to address the needs of a changing world. In particular the discipline of Enterprise Application Integration requires business process management to be highly reconfigurable with the ability to support dynamic workflows, inter-application integration and process reconfiguration. Basing EAI systems on model-resident or on a so-called description-driven approach enables aspects of flexibility, distribution, system evolution and integration to be addressed in a domain-independent manner. Such a system called CRISTAL is described in this paper with particular emphasis on its application to EAI problem domains. A practical example of the CRISTAL technology in the domain of manufacturing systems, called Agilium, is described to demonstrate the principles of model-driven system evolution and integration. The approach is compared to other model-driven development approaches such as the Model-Driven Architecture of the OMG and so-called Adaptive Object Models.

  5. Using Simulations to Integrate Technology into Health Care Aides' Workflow

    Directory of Open Access Journals (Sweden)

    Sharla King

    2013-07-01

    Health care aides (HCAs) are critical to home care, providing a range of services to people who have chronic conditions, are aging, or are unable to care for themselves independently. The current HCA supply will not keep up with this increasing demand without fundamental changes in their work environment. One possible solution to some of the workflow challenges and workplace stress of HCAs is hand-held tablet technology. In order to introduce the use of tablets with HCAs, simulations were developed. Once an HCA was comfortable with the tablet, a simulated client was introduced. The HCA interacted with the simulated client and used the tablet applications to assist with providing care. After the simulations, the HCAs participated in a focus group. HCAs completed a survey before and after the tablet training and simulation to determine their perception and acceptance of the tablet. Future deployment and implementation of technologies in home care should be further evaluated for outcomes.

  6. Automatic Atlas Based Electron Density and Structure Contouring for MRI-based Prostate Radiation Therapy on the Cloud

    Science.gov (United States)

    Dowling, J. A.; Burdett, N.; Greer, P. B.; Sun, J.; Parker, J.; Pichler, P.; Stanwell, P.; Chandra, S.; Rivest-Hénault, D.; Ghose, S.; Salvado, O.; Fripp, J.

    2014-03-01

    Our group have been developing methods for MRI-alone prostate cancer radiation therapy treatment planning. To assist with clinical validation of the workflow we are investigating a cloud platform solution for research purposes. Benefits of cloud computing can include increased scalability, performance and extensibility while reducing total cost of ownership. In this paper we demonstrate the generation of DICOM-RT directories containing an automatic average atlas based electron density image and fast pelvic organ contouring from whole pelvis MR scans.

  7. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel

    2014-01-01

    This new textbook examines the motivations and the different algorithms for automatic document summarization (ADS), presenting a recent state of the art. The book shows the main problems of ADS, the difficulties involved, and the solutions provided by the community. It presents recent advances in ADS, as well as current applications and trends. The approaches are statistical, linguistic and symbolic. Several examples are included in order to clarify the theoretical concepts. The books currently available in the area of automatic document summarization are not recent. Powerful algorithms have been developed …

  8. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations — i.e. automatically controlling the virtual camera. We approach this problem by modelling it as a dynamic multi-objective optimisation problem and show how this metaphor allows a much richer expressiveness than a classical single-objective approach. Finally, we showcase the application of a multi-objective evolutionary algorithm to generate a shot …

  9. Automatic Generation of Building Models with Levels of Detail 1-3

    Science.gov (United States)

    Nguatem, W.; Drauschke, M.; Mayer, H.

    2016-06-01

    We present a workflow for the automatic generation of building models with levels of detail (LOD) 1 to 3 according to the CityGML standard (Gröger et al., 2012). We start by orienting unsorted image sets (Mayer et al., 2012), computing depth maps using semi-global matching (SGM) (Hirschmüller, 2008), and fusing these depth maps to reconstruct dense 3D point clouds (Kuhn et al., 2014). Based on planes segmented from these point clouds, we have developed a stochastic method for roof model selection (Nguatem et al., 2013) and window model selection (Nguatem et al., 2014). We demonstrate our workflow up to the export into CityGML.

  10. Workflow Management Application Programming Interface Specification

    Institute of Scientific and Technical Information of China (English)

    刘华伟; 吴朝晖

    2000-01-01

    The document 'Workflow Management Application Programming Interface Specification' is distributed by the Workflow Management Coalition to specify standard APIs which can be supported by workflow management products. In this paper, we first introduce the two parts of this interface, then discuss the standardized data structures and function definitions, and finally address future work.

  11. Automatic Complexity Analysis

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1989-01-01

    One way to analyse programs is to derive expressions for their computational behaviour. A time bound function (or worst-case complexity) gives an upper bound for the computation time as a function of the size of the input. We describe a system to derive such time bounds automatically using abstract …

  12. Exploring Automatization Processes.

    Science.gov (United States)

    DeKeyser, Robert M.

    1996-01-01

    Presents the rationale for and the results of a pilot study attempting to document in detail how automatization takes place as the result of different kinds of intensive practice. Results show that reaction times and error rates gradually decline with practice, and the practice effect is skill-specific. (36 references) (CK)

  13. Automatic Texture Mapping of Architectural and Archaeological 3d Models

    Science.gov (United States)

    Kersten, T. P.; Stallmann, D.

    2012-07-01

    Today, detailed, complete and exact 3D models with photo-realistic textures are increasingly demanded for numerous applications in architecture and archaeology. Manual texture mapping of 3D models from digital photographs with software packages such as Maxon Cinema 4D, Autodesk 3ds Max or Maya still requires a complex and time-consuming workflow. Hence, procedures for automatic texture mapping of 3D models are in demand. In this paper two automatic procedures are presented. The first procedure generates 3D surface models with textures via web services, while the second procedure textures already existing 3D models with the software tmapper. The program tmapper is based on the Multi Layer 3D image (ML3DImage) algorithm and is developed in the programming language C++. Our studies show that visibility analysis using the ML3DImage algorithm alone is not sufficient to obtain acceptable results for automatic texture mapping. To overcome the visibility problem, the Point Cloud Painter algorithm in combination with the Z-buffer procedure will be applied in the future.
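
    The visibility computation at the heart of such a procedure can be sketched with a Z-buffer: project each point into the image and keep only the nearest point per pixel. The pinhole camera parameters and the toy points below are placeholders, not the tmapper implementation.

        import numpy as np

        def z_buffer_visibility(points_cam, width=64, height=64, focal=50.0):
            # points_cam: (N, 3) points in camera coordinates with z > 0.
            # Project with a pinhole model and keep the nearest point per pixel.
            zbuf = np.full((height, width), np.inf)
            owner = -np.ones((height, width), dtype=int)
            for i, (x, y, z) in enumerate(points_cam):
                u = int(focal * x / z + width / 2)
                v = int(focal * y / z + height / 2)
                if 0 <= u < width and 0 <= v < height and z < zbuf[v, u]:
                    zbuf[v, u] = z
                    owner[v, u] = i
            return {int(i) for i in owner[owner >= 0].ravel()}

        pts = np.array([[0.0, 0.0, 2.0], [0.01, 0.0, 5.0]])  # 2nd point occluded
        print(z_buffer_visibility(pts))  # {0}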

  14. Automaticity and Reading: Perspectives from the Instance Theory of Automatization.

    Science.gov (United States)

    Logan, Gordon D.

    1997-01-01

    Reviews recent literature on automaticity, defining the criteria that distinguish automatic processing from non-automatic processing, and describing modern theories of the underlying mechanisms. Focuses on evidence from studies of reading and draws implications from theory and data for practical issues in teaching reading. Suggests that…

  15. Workflow continuity--moving beyond business continuity in a multisite 24-7 healthcare organization.

    Science.gov (United States)

    Kolowitz, Brian J; Lauro, Gonzalo Romero; Barkey, Charles; Black, Harry; Light, Karen; Deible, Christopher

    2012-12-01

    As hospitals move towards providing in-house 24 × 7 services, there is an increasing need for information systems to be available around the clock. This study investigates one organization's need for a workflow continuity solution that provides around the clock availability for information systems that do not provide highly available services. The organization investigated is a large multifacility healthcare organization that consists of 20 hospitals and more than 30 imaging centers. A case analysis approach was used to investigate the organization's efforts. The results show an overall reduction in downtimes where radiologists could not continue their normal workflow on the integrated Picture Archiving and Communications System (PACS) solution by 94 % from 2008 to 2011. The impact of unplanned downtimes was reduced by 72 % while the impact of planned downtimes was reduced by 99.66 % over the same period. Additionally more than 98 h of radiologist impact due to a PACS upgrade in 2008 was entirely eliminated in 2011 utilizing the system created by the workflow continuity approach. Workflow continuity differs from high availability and business continuity in its design process and available services. Workflow continuity only ensures that critical workflows are available when the production system is unavailable due to scheduled or unscheduled downtimes. Workflow continuity works in conjunction with business continuity and highly available system designs. The results of this investigation revealed that this approach can add significant value to organizations because impact on users is minimized if not eliminated entirely.

  16. High performance workflow implementation for protein surface characterization using grid technology

    Directory of Open Access Journals (Sweden)

    Clematis Andrea

    2005-12-01

    Background: This study concerns the development of a high performance workflow that, using grid technology, correlates different kinds of Bioinformatics data, starting from the base pairs of the nucleotide sequence to the exposed residues of the protein surface. The implementation of this workflow is based on the Italian Grid.it project infrastructure, a network of several computational resources and storage facilities distributed at different grid sites. Methods: Workflows are very common in Bioinformatics because they allow the processing of large quantities of data by delegating the management of resources to the information streaming. Grid technology optimizes the computational load during the different workflow steps, dividing the more expensive tasks into a set of small jobs. Results: Grid technology allows efficient database management, a crucial problem for obtaining good results in Bioinformatics applications. The proposed workflow is implemented to integrate huge amounts of data, and the results themselves are stored in a relational database, which represents the added value to the global knowledge. Conclusion: A web interface has been developed to make this technology accessible to grid users. Once the workflow has started by means of the simplified interface, it is possible to follow all the different steps of the data processing. Finally, when the workflow has terminated, the different features of the protein, such as the amino acids exposed on the protein surface, can be compared with the data present in the output database.

  17. myExperiment: a repository and social network for the sharing of bioinformatics workflows.

    Science.gov (United States)

    Goble, Carole A; Bhagat, Jiten; Aleksejevs, Sergejs; Cruickshank, Don; Michaelides, Danius; Newman, David; Borkum, Mark; Bechhofer, Sean; Roos, Marco; Li, Peter; De Roure, David

    2010-07-01

    myExperiment (http://www.myexperiment.org) is an online research environment that supports the social sharing of bioinformatics workflows. These workflows are procedures consisting of a series of computational tasks using web services, which may be performed on data from its retrieval, integration and analysis, to the visualization of the results. As a public repository of workflows, myExperiment allows anybody to discover those that are relevant to their research, which can then be reused and repurposed to their specific requirements. Conversely, developers can submit their workflows to myExperiment and enable them to be shared in a secure manner. Since its release in 2007, myExperiment has grown to over 3500 registered users and contains more than 1000 workflows. The social aspect of the sharing of these workflows is facilitated by registered users forming virtual communities bound together by a common interest or research project. Contributors of workflows can build their reputation within these communities by receiving feedback and credit from individuals who reuse their work. Further documentation about myExperiment, including its REST web service, is available from http://wiki.myexperiment.org. Feedback and requests for support can be sent to bugs@myexperiment.org.

  18. Measuring Semantic and Structural Information for Data Oriented Workflow Retrieval with Cost Constraints

    Directory of Open Access Journals (Sweden)

    Yinglong Ma

    2014-01-01

    The reuse of data oriented workflows (DOWs) can reduce the cost of workflow system development and control the risk of project failure, and is therefore crucial for accelerating the automation of business processes. Reusing workflows can be achieved by measuring the similarity among candidate workflows and selecting the workflow satisfying the requirements of users from them. However, because DOWs are often developed in an open, distributed, and heterogeneous environment, different users can often impose diverse cost constraints on data oriented workflows. This makes the reuse of DOWs challenging. There is no clear solution for retrieving DOWs with cost constraints. In this paper, we present a novel graph based model of DOWs with cost constraints, called constrained data oriented workflow (CDW), which can express the cost constraints that users are often concerned about. An approach is proposed for retrieving CDWs, which seamlessly combines semantic and structural information of CDWs. A distance measure based on matrix theory is adopted to seamlessly combine semantic and structural similarities of CDWs for selecting and reusing them. Finally, experiments are made to show the effectiveness and efficiency of our approach.
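
    The core of the retrieval step, combining semantic and structural similarity, can be sketched as a weighted mix of a label-overlap score and an adjacency-matrix distance. The weighting scheme and the toy workflows are illustrative, not the paper's exact measure.

        import numpy as np

        def semantic_sim(labels_a, labels_b):
            # Jaccard overlap of node labels, a stand-in for semantic similarity.
            a, b = set(labels_a), set(labels_b)
            return len(a & b) / len(a | b)

        def structural_sim(adj_a, adj_b):
            # Similarity from the Frobenius distance between adjacency matrices.
            return 1.0 / (1.0 + np.linalg.norm(adj_a - adj_b))

        def combined_sim(wf_a, wf_b, alpha=0.5):
            return (alpha * semantic_sim(wf_a["labels"], wf_b["labels"])
                    + (1 - alpha) * structural_sim(wf_a["adj"], wf_b["adj"]))

        chain = np.array([[0, 1, 0], [0, 0, 1], [0, 0, 0]])  # 3-step chain
        wf1 = {"labels": ["fetch", "clean", "load"], "adj": chain}
        wf2 = {"labels": ["fetch", "clean", "store"], "adj": chain}
        print(combined_sim(wf1, wf2))  # 0.75: identical structure, 2/4 labels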

  19. Design and Implementation of a Data-Driven Workflow Engine

    Institute of Scientific and Technical Information of China (English)

    陈义松; 汪芸

    2012-01-01

    Workflow is a computerized business process. A workflow engine provides the runtime environment for workflow processes, playing a key role in their execution. Traditional workflow technology shows a number of deficiencies in dealing with complicated and changing processes, lacking good modeling and adaptation mechanisms. This paper proposes a data-driven execution mode to achieve loose coupling between activities, and designs and implements a workflow engine supporting this data-driven mode. The engine copes with complicated and changing business processes with strong processing capabilities and can model processes automatically.
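
    The data-driven execution mode described above, with activities coupled only through the data they produce and consume, can be sketched as an engine that fires any activity whose inputs are available. The activity names and data items are illustrative, not the paper's engine.

        def run_data_driven(activities, data):
            # Repeatedly fire any activity whose inputs are all present in
            # `data`; activities are coupled only via data items, not links.
            done = set()
            progressed = True
            while progressed:
                progressed = False
                for name, (inputs, output, fn) in activities.items():
                    if name not in done and all(i in data for i in inputs):
                        data[output] = fn(*(data[i] for i in inputs))
                        done.add(name)
                        progressed = True
            return data

        activities = {  # hypothetical three-step process
            "validate": (["order"], "valid_order", lambda o: o.strip()),
            "price":    (["valid_order"], "quote", lambda o: f"quote({o})"),
            "notify":   (["quote"], "sent", lambda q: f"mailed {q}"),
        }
        print(run_data_driven(activities, {"order": " book "}))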

  20. Automatic brush-plating technology for component remanufacturing

    Institute of Scientific and Technical Information of China (English)

    WU Bin; XU Bin-shi; JING Xue-dong; LIU Cun-long; ZHANG Bin

    2005-01-01

    An automatic brush-plating system was developed for component remanufacturing. With this system, Ni/nano-alumina composite coatings were prepared from an electrolyte containing 20 g/L nano-alumina particles. The microstructure, surface morphology, microhardness and wear resistance of automatically plated and manually plated coatings were investigated comparatively. The results show that the automatically plated coatings are relatively dense and uniform and have a lower friction coefficient of 0.089 under lubricated conditions, compared with manually plated coatings with a friction coefficient of 0.14.

  1. Bioconductor workflow for microbiome data analysis: from raw reads to community analyses.

    Science.gov (United States)

    Callahan, Ben J; Sankaran, Kris; Fukuyama, Julia A; McMurdie, Paul J; Holmes, Susan P

    2016-01-01

    High-throughput sequencing of PCR-amplified taxonomic markers (like the 16S rRNA gene) has enabled a new level of analysis of complex bacterial communities known as microbiomes. Many tools exist to quantify and compare abundance levels or microbial composition of communities in different conditions. The sequencing reads have to be denoised and assigned to the closest taxa from a reference database. Common approaches use a notion of 97% similarity and normalize the data by subsampling to equalize library sizes. In this paper, we show that statistical models allow more accurate abundance estimates. By providing a complete workflow in R, we enable the user to do sophisticated downstream statistical analyses, including both parametric and nonparametric methods. We provide examples of using the R packages dada2, phyloseq, DESeq2, ggplot2 and vegan to filter, visualize and test microbiome data. We also provide examples of supervised analyses using random forests, partial least squares and linear models as well as nonparametric testing using community networks and the ggnetwork package.

  2. Workflow in the operating room: review of Arrowhead 2004 seminar on imaging and informatics (Invited Paper)

    Science.gov (United States)

    Lemke, Heinz U.; Ratib, Osman M.; Horii, Steven C.

    2005-04-01

    This review paper is based on the 2004 UCLA Seminar on Imaging and Informatics (http://www.radnet.ucla.edu/Arrowhead2004/), a joint endeavour between UCLA and the CARS organization focussing on workflow analysis tools and the digital operating room. Eleven presentations from the Arrowhead Seminar are summarized in this review, covering redesigning perioperative care for a high-velocity OR, the intraoperative ultrasound process and model, surgical workflow and surgical PACS (an integrated view), interactions in the surgical OR, workflow automation strategies and target applications, visualisation solutions for the operating room, navigating the fifth dimension, and the design of digital operating rooms and interventional suites.

  3. Flexible Data-Aware Scheduling for Workflows over an In-Memory Object Store

    Energy Technology Data Exchange (ETDEWEB)

    Duro, Francisco Rodrigo; Garcia Blas, Javier; Isaila, Florin; Wozniak, Justin M.; Carretero, Jesus; Ross, Rob

    2016-01-01

    This paper explores novel techniques for improving the performance of many-task workflows based on the Swift scripting language. We propose novel programmer options for automated distributed data placement and task scheduling. These options trigger a data placement mechanism used for distributing intermediate workflow data over the servers of Hercules, a distributed key-value store that can be used to cache file system data. We demonstrate that these new mechanisms can significantly improve the aggregated throughput of many-task workflows by up to 86x, reduce the contention on the shared file system, exploit data locality, and trade off locality and load balance.
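
    The locality/load-balance trade-off the paper describes can be illustrated by a toy placement policy: send each task to the server already holding most of its input bytes, unless that server is overloaded. The threshold, server names and sizes below are invented, not the Swift/Hercules mechanism itself.

        def place_task(task_inputs, locations, load, max_load=4):
            # Pick the server holding the most input bytes, falling back to
            # the least-loaded server when the locality choice is saturated.
            bytes_on = {}
            for item, size in task_inputs.items():
                srv = locations[item]
                bytes_on[srv] = bytes_on.get(srv, 0) + size
            best = max(bytes_on, key=bytes_on.get)
            if load[best] < max_load:
                return best                    # exploit data locality
            return min(load, key=load.get)     # trade locality for balance

        locations = {"a.dat": "s1", "b.dat": "s1", "c.dat": "s2"}
        load = {"s1": 4, "s2": 1}
        print(place_task({"a.dat": 100, "b.dat": 50, "c.dat": 10},
                         locations, load))     # 's2': s1 is full, so rebalance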

  4. Implementation of a Workflow Management System for Non-Expert Users

    DEFF Research Database (Denmark)

    Jongejan, Bart

    2016-01-01

    In the Danish CLARIN-DK infrastructure, chaining language technology (LT) tools into a workflow is easy even for a non-expert user, because she only needs to specify the input and the desired output of the workflow. With this information and the registered input and output profiles of the available tools, the CLARIN-DK workflow management system (WMS) computes combinations of tools that will give the desired result. This advanced functionality was originally not envisaged, but came within reach by writing the WMS partly in Java and partly in a programming language for symbolic computation, Bracmat …
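
    A sketch of the chain computation such a WMS performs from registered input/output profiles: breadth-first search from the user's input format to the desired output format. The tool registry below is hypothetical, not the actual CLARIN-DK catalogue.

        from collections import deque

        # hypothetical registry: tool -> (input profile, output profile)
        tools = {
            "ocr":       ("image", "text"),
            "tokenizer": ("text", "tokens"),
            "tagger":    ("tokens", "tagged-text"),
        }

        def find_workflow(src, dst):
            # Shortest chain of tools converting profile `src` into `dst`.
            queue, seen = deque([(src, [])]), {src}
            while queue:
                fmt, chain = queue.popleft()
                if fmt == dst:
                    return chain
                for name, (i, o) in tools.items():
                    if i == fmt and o not in seen:
                        seen.add(o)
                        queue.append((o, chain + [name]))
            return None

        print(find_workflow("image", "tagged-text"))
        # ['ocr', 'tokenizer', 'tagger']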

  5. An Inter-enterprise Workflow Model for Supply Chain and B2B E-commerce

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    The goals of B2B electronic commerce and supply chain management systems are to implement the interoperability of independent enterprises, to smooth the information flow between them, and to deploy business processes over multiple enterprises. The inherent characteristics of workflow systems make them suitable for implementing cross-organization management. This paper first proposes an inter-enterprise workflow model based on agreements to support the construction of supply chain management and B2B electronic commerce systems. This model extends the standard workflow model. An architecture that supports the model is then introduced, detailing in particular the structure and implementation of the interfaces between enterprises.

  6. Inheritance Optimization of Extended Case Transfer Model of Inter-organization Workflow Management

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    The extended case transfer architecture for inter-organizational workflow management fits the needs of collaborative commerce. However, during the third step of the extended case transfer architecture, modifications of private workflows might cause fatal problems, such as deadlocks, livelocks and dead tasks. These problems can compromise the soundness and efficiency of the overall workflow. This paper presents a Petri net based approach to protect the inheritance of public workflows in private domains, and discusses an implementation of our collaborative commerce workflow model.

  7. Managing Written Directives: A Software Solution to Streamline Workflow.

    Science.gov (United States)

    Wagner, Robert H; Savir-Baruch, Bital; Gabriel, Medhat Sam; Halama, James; Bova, Davide

    2017-03-09

    A written directive (WD) is a requirement of the United States Nuclear Regulatory Commission (USNRC) regulations and is required for all uses of I-131 above 1.11 MBq (30 microcuries) and for patients receiving therapy with radiopharmaceuticals. These regulations have also been adopted and are required to be enforced by the agreement states. A paper trail method of WD management is inefficient and prone to error, loss, and duplication. As the options for therapy in Nuclear Medicine increase with the introduction of new radiopharmaceuticals, the time spent on the regulatory burden and paperwork has also increased. The management of regulatory requirements has a significant impact on physician and technologist time utilization, and these pressures may increase the potential for inaccurate or incomplete WD data and subsequent regulatory violations. A software tool for the management of WDs using a HIPAA compliant database has been created. This WD software allows for the secure sharing of data among physicians, technologists and managers while saving time, reducing errors and eliminating the possibility of loss and duplication. Methods: Software development was performed using Microsoft Visual Basic® (Microsoft Corporation, Redmond, WA), which is part of the Microsoft Visual Studio® development environment for the Microsoft Windows® platform. The database repository for patient data is Microsoft Access®, stored locally on a HIPAA secure server or hard disk. Once a working version was developed, it was installed and used at our institution for the management of WDs. Updates and modifications were released regularly until no significant problems were found with the operation of the software. Results: The software has been in use at our institution for over two years and has reliably kept track of all directives during that time. All physicians and technologists use the software as part of their daily workflow and find it superior to paper directives. We are able to …

  8. The Live Access Server Scientific Product Generation Through Workflow Orchestration

    Science.gov (United States)

    Hankin, S.; Calahan, J.; Li, J.; Manke, A.; O'Brien, K.; Schweitzer, R.

    2006-12-01

    The Live Access Server (LAS) is a well-established Web application for display and analysis of geo-science data sets. The software, which can be downloaded and installed by anyone, gives data providers an easy way to establish services for their on-line data holdings, so their users can make plots; create and download data sub-sets; compare (difference) fields; and perform simple analyses. Now at version 7.0, LAS has been in operation since 1994. The current "Armstrong" release of LAS V7 consists of three components in a tiered architecture: user interface, workflow orchestration and Web Services. The LAS user interface (UI) communicates with the LAS Product Server via an XML protocol embedded in an HTTP "get" URL. Libraries (APIs) have been developed in Java, JavaScript and perl that can readily generate this URL. As a result of this flexibility it is common to find LAS user interfaces of radically different character, tailored to the nature of specific datasets or the mindset of specific users. When a request is received by the LAS Product Server (LPS -- the workflow orchestration component), business logic converts this request into a series of Web Service requests invoked via SOAP. These "back-end" Web services perform data access and generate products (visualizations, data subsets, analyses, etc.). LPS then packages these outputs into final products (typically HTML pages) via Jakarta Velocity templates for delivery to the end user. "Fine grained" data access is performed by back-end services that may utilize JDBC for data base access; the OPeNDAP "DAPPER" protocol; or (in principle) the OGC WFS protocol. Back-end visualization services are commonly legacy science applications wrapped in Java or Python (or perl) classes and deployed as Web Services accessible via SOAP. Ferret is the default visualization application used by LAS, though other applications such as Matlab, CDAT, and GrADS can also be used. Other back-end services may include generation of Google …

  9. DNA qualification workflow for next generation sequencing of histopathological samples.

    Directory of Open Access Journals (Sweden)

    Michele Simbolo

    Histopathological samples are a treasure-trove of DNA for clinical research. However, the quality of DNA can vary depending on the source or extraction method applied. Thus a standardized and cost-effective workflow for the qualification of DNA preparations is essential to guarantee interlaboratory reproducible results. The qualification process consists of the quantification of double strand DNA (dsDNA) and the assessment of its suitability for downstream applications, such as high-throughput next-generation sequencing. We tested the two most frequently used instrumentations to define their role in this process: NanoDrop, based on UV spectroscopy, and Qubit 2.0, which uses fluorochromes specifically binding dsDNA. Quantitative PCR (qPCR) was used as the reference technique as it simultaneously assesses DNA concentration and suitability for PCR amplification. We used 17 genomic DNAs from 6 fresh-frozen (FF) tissues, 6 formalin-fixed paraffin-embedded (FFPE) tissues, 3 cell lines, and 2 commercial preparations. Intra- and inter-operator variability was negligible, and intra-methodology variability was minimal, while consistent inter-methodology divergences were observed. In fact, NanoDrop measured DNA concentrations higher than Qubit and its consistency with dsDNA quantification by qPCR was limited to high molecular weight DNA from FF samples and cell lines, where total DNA and dsDNA quantity virtually coincide. In partially degraded DNA from FFPE samples, only Qubit proved highly reproducible and consistent with qPCR measurements. Multiplex PCR amplifying 191 regions of 46 cancer-related genes was designated the downstream application, using 40 ng dsDNA from FFPE samples calculated by Qubit. All but one sample produced amplicon libraries suitable for next-generation sequencing. NanoDrop UV-spectrum verified contamination of the unsuccessful sample. In conclusion, as qPCR has high costs and is labor intensive, an alternative effective standard …

  10. DNA qualification workflow for next generation sequencing of histopathological samples.

    Science.gov (United States)

    Simbolo, Michele; Gottardi, Marisa; Corbo, Vincenzo; Fassan, Matteo; Mafficini, Andrea; Malpeli, Giorgio; Lawlor, Rita T; Scarpa, Aldo

    2013-01-01

    Histopathological samples are a treasure-trove of DNA for clinical research. However, the quality of DNA can vary depending on the source or extraction method applied. Thus a standardized and cost-effective workflow for the qualification of DNA preparations is essential to guarantee interlaboratory reproducible results. The qualification process consists of the quantification of double strand DNA (dsDNA) and the assessment of its suitability for downstream applications, such as high-throughput next-generation sequencing. We tested the two most frequently used instrumentations to define their role in this process: NanoDrop, based on UV spectroscopy, and Qubit 2.0, which uses fluorochromes specifically binding dsDNA. Quantitative PCR (qPCR) was used as the reference technique as it simultaneously assesses DNA concentration and suitability for PCR amplification. We used 17 genomic DNAs from 6 fresh-frozen (FF) tissues, 6 formalin-fixed paraffin-embedded (FFPE) tissues, 3 cell lines, and 2 commercial preparations. Intra- and inter-operator variability was negligible, and intra-methodology variability was minimal, while consistent inter-methodology divergences were observed. In fact, NanoDrop measured DNA concentrations higher than Qubit and its consistency with dsDNA quantification by qPCR was limited to high molecular weight DNA from FF samples and cell lines, where total DNA and dsDNA quantity virtually coincide. In partially degraded DNA from FFPE samples, only Qubit proved highly reproducible and consistent with qPCR measurements. Multiplex PCR amplifying 191 regions of 46 cancer-related genes was designated the downstream application, using 40 ng dsDNA from FFPE samples calculated by Qubit. All but one sample produced amplicon libraries suitable for next-generation sequencing. NanoDrop UV-spectrum verified contamination of the unsuccessful sample. In conclusion, as qPCR has high costs and is labor intensive, an alternative effective standard workflow for …

  11. DOMstudio: an integrated workflow for Digital Outcrop Model reconstruction and interpretation

    Science.gov (United States)

    Bistacchi, Andrea

    2015-04-01

    … This dataset can be used as-is (PC-DOM), or a 3D triangulated surface can be interpolated from the point cloud, and images can be used to associate a texture to this surface (TS-DOM). In the DOMstudio workflow we use both PC-DOMs and TS-DOMs. In particular, the latter are obtained by projecting the original images onto the triangulated surface, without any downsampling, thus retaining the original resolution and quality of images collected in the field. In the DOMstudio interpretation step, PC-DOM is considered the best option for fracture analysis in outcrops where facets corresponding to fractures are present. This allows obtaining orientation statistics (e.g. stereoplots, Fisher statistics, etc.) directly from a point cloud where, for each point, the unit vector normal to the outcrop surface has been calculated. A recent development in this kind of processing is the possibility to automatically select (segment) subset point clouds representing single fracture surfaces, which can be used for studies on fracture length, spacing, etc., allowing parameters like the frequency-length distribution, P21, etc. to be obtained. PC-DOM interpretation can be combined or complemented, depending on the outcrop morphology, with an interpretation carried out on a TS-DOM in terms of traces, which are the linear intersections of "geological" surfaces (fractures, faults, bedding, etc.) with the outcrop surface. This kind of interpretation is very well suited for outcrops with smooth surfaces, and can be performed either by manual picking or by applying image analysis techniques to the images associated with the DOM. In this case, a huge mass of data, with very high resolution, can be collected very effectively. If we consider applications like lithological or mineral mapping, TS-DOM datasets are the only suitable support. Finally, the DOMstudio workflow produces output in formats that are compatible with all common geomodelling packages (e.g. Gocad/Skua, Petrel, Move), allowing …

  12. Automaticity or active control

    DEFF Research Database (Denmark)

    Tudoran, Ana Alina; Olsen, Svein Ottar

    This study addresses the quasi-moderating role of habit strength in explaining action loyalty. A model of loyalty behaviour is proposed that extends the traditional satisfaction–intention–action loyalty network. Habit strength is conceptualised as a cognitive construct referring to psychological aspects such as routine, inertia, automaticity, or very little conscious deliberation. The data consist of 2962 consumers participating in a large European survey. The results show that habit strength significantly moderates the association between satisfaction and action loyalty and, respectively, between intended loyalty and action loyalty. At high levels of habit strength, consumers are more likely to free up cognitive resources and incline the balance from controlled to routine and automatic-like responses.

  13. Automatic Ultrasound Scanning

    DEFF Research Database (Denmark)

    Moshavegh, Ramin

    Medical ultrasound has been a widely used imaging modality in healthcare platforms for examination, diagnostic purposes, and for real-time guidance during surgery. However, despite the recent advances, medical ultrasound remains the most operator-dependent imaging modality, as it heavily relies on the user adjustments on the scanner interface to optimize the scan settings. This explains the huge interest in the subject of this PhD project entitled “AUTOMATIC ULTRASOUND SCANNING”. The key goals of the project have been to develop automated techniques to minimize the unnecessary settings on the scanners, and to improve the computer-aided diagnosis (CAD) in ultrasound by introducing new quantitative measures. Thus, four major issues concerning automation of medical ultrasound are addressed in this PhD project. They touch upon gain adjustments in ultrasound, automatic synthetic aperture image …

  14. Automatic trend estimation

    CERN Document Server

    Vamoș, Călin

    2013-01-01

    Our book introduces a method to evaluate the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. This method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book contains several automatic algorithms for trend estimation and time series partitioning. The source codes of the computer programs implementing these original automatic algorithms are given in the appendix and will be freely available on the web. The book contains a clear statement of the conditions and the approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method extensively used in our book is already in common use in computational and statistical physics.
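
    The numerical-experiment method can be illustrated in a few lines: generate artificial series with a known trend, run an estimator, and measure its accuracy over many realizations. The least-squares slope estimator below is a generic stand-in, not one of the book's algorithms, and all parameters are illustrative.

        import numpy as np

        rng = np.random.default_rng(0)

        def monte_carlo_trend_error(n=200, slope=0.05, noise=1.0, trials=500):
            # RMS error of a least-squares slope estimate on noisy series
            # whose true (linear) trend is known by construction.
            t = np.arange(n)
            errors = []
            for _ in range(trials):
                series = slope * t + rng.normal(0.0, noise, n)
                est_slope = np.polyfit(t, series, 1)[0]
                errors.append(est_slope - slope)
            return float(np.sqrt(np.mean(np.square(errors))))

        print(monte_carlo_trend_error())  # accuracy under controlled conditions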

  15. Automatic food decisions

    DEFF Research Database (Denmark)

    Mueller Loose, Simone

    Consumers' food decisions are to a large extent shaped by automatic processes, which are either internally directed through learned habits and routines or externally influenced by context factors and visual information triggers. Innovative research methods such as eye tracking, choice experiments...... and food diaries allow us to better understand the impact of unconscious processes on consumers' food choices. Simone Mueller Loose will provide an overview of recent research insights into the effects of habit and context on consumers' food choices....

  16. Automatization of lexicographic work

    Directory of Open Access Journals (Sweden)

    Iztok Kosem

    2013-12-01

    Full Text Available A new approach to lexicographic work, in which the lexicographer is seen more as a validator of the choices made by computer, was recently envisaged by Rundell and Kilgarriff (2011). In this paper, we describe an experiment using such an approach during the creation of the Slovene Lexical Database (Gantar, Krek, 2011). The corpus data, i.e. grammatical relations, collocations, examples, and grammatical labels, were automatically extracted from the 1.18-billion-word Gigafida corpus of Slovene. The evaluation of the extracted data consisted of making a comparison between the time spent writing a manual entry and a (semi-)automatic entry, and identifying potential improvements in the extraction algorithm and in the presentation of data. An important finding was that the automatic approach was far more effective than the manual approach, without any significant loss of information. Based on our experience, we would propose a slightly revised version of the approach envisaged by Rundell and Kilgarriff, in which the validation of data is left to lower-level linguists or crowd-sourcing, whereas high-level tasks such as meaning description remain the domain of lexicographers. Such an approach indeed reduces the scope of the lexicographer's work; however, it also makes it possible to bring the content to the users more quickly.

  17. Accelerating Science Impact through Big Data Workflow Management and Supercomputing

    Science.gov (United States)

    De, K.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Ryabinkin, E.; Wenaus, T.

    2016-02-01

    The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. ATLAS, one of the largest collaborations ever assembled in the history of science, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. To manage the workflow for all data processing on hundreds of data centers, the PanDA (Production and Distributed Analysis) Workload Management System is used. An ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF), is being realized within the BigPanDA and megaPanDA projects. These projects are now exploring how PanDA might be used for managing computing jobs that run on supercomputers including OLCF's Titan and NRC-KI HPC2. The main idea is to reuse, as much as possible, existing components of the PanDA system that are already deployed on the LHC Grid for analysis of physics data. The next generation of PanDA will allow many data-intensive sciences employing a variety of computing platforms to benefit from ATLAS experience and proven tools in highly scalable processing.

  18. A novel workflow for seismic net pay estimation with uncertainty

    CERN Document Server

    Glinsky, Michael E; Unaldi, Muhlis; Nagassar, Vishal

    2016-01-01

    This paper presents a novel workflow for seismic net pay estimation with uncertainty. It is demonstrated on the Cassra/Iris Field. The theory for the stochastic wavelet derivation (which estimates the seismic noise level along with the wavelet, time-to-depth mapping, and their uncertainties), the stochastic sparse spike inversion, and the net pay estimation (using secant areas) along with its uncertainty will be outlined. This includes benchmarking of this methodology on a synthetic model. A critical part of this process is the calibration of the secant areas. This is done in a two-step process. First, a preliminary calibration is done with the stochastic reflection response modeling using rock physics relationships derived from the well logs. Second, a refinement is made to the calibration to account for the encountered net pay at the wells. Finally, a variogram structure is estimated from the extracted secant area map, then used to build in the lateral correlation to the ensemble of net pay maps while matc...

  19. Enabling Efficient Climate Science Workflows in High Performance Computing Environments

    Science.gov (United States)

    Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.

    2015-12-01

    A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks presents a myriad of challenges when running on a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require a lot of forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project we highlight a stack of tools our team utilizes and has developed to ensure that large-scale simulation and analysis work is commonplace, and that provide operations assisting in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth Systems Grid Federation (ESGF), all while executing everything in between in a scalable environment in a task-parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases they have been applied to.

  20. Automated Finite State Workflow for Distributed Data Production

    Science.gov (United States)

    Hajdu, L.; Didenko, L.; Lauret, J.; Amol, J.; Betts, W.; Jang, H. J.; Noh, S. Y.

    2016-10-01

    In statistically hungry science domains, data deluges can be both a blessing and a curse. They allow the narrowing of statistical errors from known measurements, and open the door to new scientific opportunities as research programs mature. They are also a testament to the efficiency of experimental operations. However, growing data samples may need to be processed with little or no opportunity for huge increases in computing capacity. A standard strategy has thus been to share resources across multiple experiments at a given facility. Another has been to use middleware that “glues” resources across the world so they are able to locally run the experimental software stack (either natively or virtually). We describe a framework STAR has successfully used to reconstruct a ~400 TB dataset consisting of over 100,000 jobs submitted to a remote site in Korea from STAR's Tier 0 facility at the Brookhaven National Laboratory. The framework automates the full workflow, taking raw data files from tape and writing Physics-ready output back to tape without operator or remote site intervention. Through hardening we have demonstrated 97(±2)% efficiency, over a period of 7 months of operation. The high efficiency is attributed to finite state checking with retries to encourage resilience in the system over capricious and fallible infrastructure.
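
    The finite-state pattern with bounded retries that the framework credits for its high efficiency can be illustrated with a short sketch. Everything below (state names, the retry limit, the failure model) is an illustrative assumption, not code from the STAR framework.

        # Minimal sketch of a finite-state job workflow with retries.
        # Each job advances through fixed states; transient failures are
        # retried up to MAX_RETRIES before the job is parked for review.
        import random

        MAX_RETRIES = 3
        STATES = ("staged", "submitted", "running", "archived")

        def run_step(state):
            """Stand-in for a real action (tape staging, submission, ...);
            fails randomly to mimic capricious infrastructure."""
            return random.random() > 0.2

        def process_job(job_id):
            for state in STATES:
                for attempt in range(1, MAX_RETRIES + 1):
                    if run_step(state):
                        break  # this state succeeded; move on
                    print(f"job {job_id}: {state} failed (attempt {attempt})")
                else:
                    return f"job {job_id}: parked in '{state}' for operator review"
            return f"job {job_id}: archived"

        print(process_job(42))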

  1. Automating radiologist workflow part 1: the digital consultation.

    Science.gov (United States)

    Reiner, Bruce

    2008-10-01

    With the widespread adoption of picture archiving and communication systems and filmless imaging, ubiquitous and instantaneous access to imaging data has resulted in decreased radiologist-clinician consultations. It is therefore imperative that the radiology community develop new communication strategies to improve both the timeliness and the perceived value of the radiology report. One strategy to accomplish this goal is the creation of electronic consultation tools, which can effectively recreate radiologist workflow and the identification of key pathologic findings in an easy-to-use, well-organized, and timely fashion. This would be accomplished by recording radiologist-computer interactions using an electronic auditing tool and storing these interactions in an Extensible Markup Language (XML) schema, which can subsequently be played back at a later point in time to recreate the radiologist's consultation. This approach has the added benefits of allowing the radiologist to selectively edit content to the needs of different clinician users, index the comprehensive consultation into pathology-specific components, and perform asynchronous bidirectional consultations. This electronic consultation tool would result in the creation of context- and user-specific consultation files, which can in turn be integrated with clinical data from electronic medical records.

  2. A workflow model to analyse pediatric emergency overcrowding.

    Science.gov (United States)

    Zgaya, Hayfa; Ajmi, Ines; Gammoudi, Lotfi; Hammadi, Slim; Martinot, Alain; Beuscart, Régis; Renard, Jean-Marie

    2014-01-01

    The greatest source of delay in patient flow is the waiting time from the health care request, and especially from the bed request, to exit from the Pediatric Emergency Department (PED) for hospital admission. It represents 70% of the time that these patients spend in the PED waiting rooms. Our objective in this study is to identify tension indicators and bottlenecks that contribute to overcrowding. Patient flow mapping through the PED was carried out over a continuous 2-year period from January 2011 to December 2012. Our method uses real data collected from actual visits to the PED of the Regional University Hospital Center (CHRU) of Lille (France) in order to construct an accurate and complete representation of the PED processes. The result of this representation is a workflow model of the patient journey in the PED, representing as faithfully as possible the reality of the PED of the CHRU of Lille. This model allowed us to identify sources of delay in patient flow and aspects of the PED activity that could be improved. It must be detailed enough to produce an analysis that identifies the dysfunctions of the PED and to propose and estimate indicators for preventing strain. Our study is part of the French National Research Agency project titled "Hospital: optimization, simulation and avoidance of strain" (ANR HOST).

  3. Accelerating Science Impact through Big Data Workflow Management and Supercomputing

    Directory of Open Access Journals (Sweden)

    De K.

    2016-01-01

    Full Text Available The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. ATLAS, one of the largest collaborations ever assembled in the history of science, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. To manage the workflow for all data processing on hundreds of data centers, the PanDA (Production and Distributed Analysis) Workload Management System is used. An ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF), is being realized within the BigPanDA and megaPanDA projects. These projects are now exploring how PanDA might be used for managing computing jobs that run on supercomputers including OLCF's Titan and NRC-KI HPC2. The main idea is to reuse, as much as possible, existing components of the PanDA system that are already deployed on the LHC Grid for analysis of physics data. The next generation of PanDA will allow many data-intensive sciences employing a variety of computing platforms to benefit from ATLAS experience and proven tools in highly scalable processing.

  4. A cross-package Bioconductor workflow for analysing methylation array data [version 2; referees: 3 approved, 1 approved with reservations

    Directory of Open Access Journals (Sweden)

    Jovana Maksimovic

    2016-07-01

    Full Text Available Methylation in the human genome is known to be associated with development and disease. The Illumina Infinium methylation arrays are by far the most common way to interrogate methylation across the human genome. This paper provides a Bioconductor workflow using multiple packages for the analysis of methylation array data. Specifically, we demonstrate the steps involved in a typical differential methylation analysis pipeline including: quality control, filtering, normalization, data exploration and statistical testing for probe-wise differential methylation. We further outline other analyses such as differential methylation of regions, differential variability analysis, estimating cell type composition and gene ontology testing. Finally, we provide some examples of how to visualise methylation array data.

  5. A cross-package Bioconductor workflow for analysing methylation array data [version 1; referees: 3 approved, 1 approved with reservations

    Directory of Open Access Journals (Sweden)

    Jovana Maksimovic

    2016-06-01

    Full Text Available Methylation in the human genome is known to be associated with development and disease. The Illumina Infinium methylation arrays are by far the most common way to interrogate methylation across the human genome. This paper provides a Bioconductor workflow using multiple packages for the analysis of methylation array data. Specifically, we demonstrate the steps involved in a typical differential methylation analysis pipeline including: quality control, filtering, normalization, data exploration and statistical testing for probe-wise differential methylation. We further outline other analyses such as differential methylation of regions, differential variability analysis, estimating cell type composition and gene ontology testing. Finally, we provide some examples of how to visualise methylation array data.

  6. Formal Verification of Temporal Properties for Reduced Overhead in Grid Scientific Workflows

    Institute of Scientific and Technical Information of China (English)

    Jun-Wei Cao; Fan Zhang; Ke Xu; Lian-Chen Liu; Cheng Wu

    2011-01-01

    With the quick development of grid techniques and the growing complexity of grid applications, it is becoming critical to reason about temporal properties of grid workflows in order to probe potential pitfalls and errors and to ensure reliability and trustworthiness at the initial design phase. A state Pi calculus is proposed and implemented in this work, which not only enables flexible abstraction and management of historical grid system events, but also facilitates modeling and temporal verification of grid workflows. Furthermore, a relaxed region analysis (RRA) approach is proposed to decompose large-scale grid workflows into sequentially composed regions with relaxation of parallel workflow branches, and corresponding verification strategies are also decomposed following modular verification principles. Performance evaluation results show that the RRA approach can dramatically reduce the CPU time and memory usage of formal verification.

  7. A framework for service enterprise workflow simulation with multi-agents cooperation

    Science.gov (United States)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Process dynamic modelling for service business is a key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach for analysing service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service-workflow-oriented framework for the process simulation of service businesses using multi-agent cooperation to address the above issues. Social rationality of agents is introduced into the proposed framework. Adopting rationality as a social factor in decision-making strategies, flexible scheduling of activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.

  8. Visual compression of workflow visualizations with automated detection of macro motifs.

    Science.gov (United States)

    Maguire, Eamonn; Rocca-Serra, Philippe; Sansone, Susanna-Assunta; Davies, Jim; Chen, Min

    2013-12-01

    This paper is concerned with the creation of 'macros' in workflow visualization as a support tool to increase the efficiency of data curation tasks. We propose computation of candidate macros based on their usage in large collections of workflows in data repositories. We describe an efficient algorithm for extracting macro motifs from workflow graphs. We discovered that the state transition information, used to identify macro candidates, characterizes the structural pattern of the macro and can be harnessed as part of the visual design of the corresponding macro glyph. This facilitates partial automation and consistency in glyph design applicable to a large set of macro glyphs. We tested this approach against a repository of biological data holding some 9,670 workflows and found that the algorithmically generated candidate macros are in keeping with domain expert expectations.

  9. Task-driven equipment inspection system based on safe workflow model

    Science.gov (United States)

    Guo, Xinyou; Liu, Yangguang

    2010-12-01

    An equipment inspection system is one that contains a number of equipment queues served in cyclic order. In order to satisfy the multi-task scheduling and multi-task combination requirements of equipment inspection systems, we propose a model based on inspection workflow in this paper. On the one hand, the model organizes all kinds of equipment according to the inspection workflow, elemental work units according to inspection tasks, and combination elements according to the tasks defined by users. We propose a 3-dimensional workflow model for equipment inspection systems, including an organization sub-model, a process sub-model and a data sub-model. On the other hand, the model is based on security authorization, which is defined by the relations between roles, tasks, pre-defined business workflows and inspection data. The system based on the proposed framework is safe and efficient. Our implementation shows that the system is easy to operate and manage, with good basic performance.

  10. Database supported electronic retrospective analyses in radiation oncology. Establishing a workflow using the example of pancreatic cancer; Datenbankbasierte digitale retrospektive Auswertung von Patientenkollektiven in der Radioonkologie. Etablierung eines Workflows am Beispiel des Pankreaskarzinoms

    Energy Technology Data Exchange (ETDEWEB)

    Kessel, K.A.; Habermehl, D.; Bougatf, N.; Debus, J.; Combs, S.E. [Universitaetsklinikum Heidelberg (Germany). Abt. fuer Radioonkologie und Strahlentherapie; Bohn, C. [CHILI GmbH, Dossenheim (Germany); Jaeger, A.; Floca, R.O.; Zhang, L. [Deutsches Krebsforschungszentrum (DKFZ), Heidelberg (Germany); Bendl, R. [Hochschule Heilbronn (Germany). Fakultaet fuer Medizinische Informatik

    2012-12-15

    Purpose: Especially in the field of radiation oncology, handling a large variety of voluminous datasets from various information systems in different documentation styles efficiently is crucial for patient care and research. To date, conducting retrospective clinical analyses is rather difficult and time consuming. With the example of patients with pancreatic cancer treated with radio-chemotherapy, we performed a therapy evaluation by using an analysis system connected with a documentation system. Materials and methods: A total number of 783 patients have been documented into a professional, database-based documentation system. Information about radiation therapy, diagnostic images and dose distributions have been imported into the web-based system. Results: For 36 patients with disease progression after neoadjuvant chemoradiation, we designed and established an analysis workflow. After an automatic registration of the radiation plans with the follow-up images, the recurrence volumes are segmented manually. Based on these volumes the DVH (dose volume histogram) statistic is calculated, followed by the determination of the dose applied to the region of recurrence. All results are saved in the database and included in statistical calculations. Conclusion: The main goal of using an automatic analysis tool is to reduce time and effort conducting clinical analyses, especially with large patient groups. We showed a first approach and use of some existing tools, however manual interaction is still necessary. Further steps need to be taken to enhance automation. Already, it has become apparent that the benefits of digital data management and analysis lie in the central storage of data and reusability of the results. Therefore, we intend to adapt the analysis system to other types of tumors in radiation oncology. (orig.)
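
    The DVH statistic mentioned above can be computed from a dose grid and a segmented recurrence volume. The sketch below is a generic illustration with synthetic array inputs assumed for demonstration; it is not code from the described system.

        # Generic cumulative dose-volume histogram (DVH) computation.
        # `dose` is a 3D dose grid in Gy; `mask` marks the voxels of the
        # segmented recurrence volume (both synthetic here).
        import numpy as np

        def cumulative_dvh(dose, mask, bin_width=0.1):
            voxel_doses = dose[mask]  # doses inside the structure
            edges = np.arange(0, voxel_doses.max() + bin_width, bin_width)
            # Fraction of the structure receiving at least each dose level
            volume_fraction = np.array([(voxel_doses >= d).mean() for d in edges])
            return edges, volume_fraction

        rng = np.random.default_rng(0)
        dose = rng.gamma(shape=20.0, scale=2.5, size=(50, 50, 50))
        mask = np.zeros_like(dose, dtype=bool)
        mask[20:30, 20:30, 20:30] = True
        d, v = cumulative_dvh(dose, mask)
        print(f"median dose to volume: {d[np.searchsorted(-v, -0.5)]:.1f} Gy")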

  11. Data Intensive Scientific Workflows on a Federated Cloud: CRADA Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Garzoglio, Gabriele [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2015-10-31

    The Fermilab Scientific Computing Division and the KISTI Global Science Experimental Data Hub Center have built a prototypical large-scale infrastructure to handle scientific workflows of stakeholders to run on multiple cloud resources. The demonstrations have been in the areas of (a) Data-Intensive Scientific Workflows on Federated Clouds, (b) Interoperability and Federation of Cloud Resources, and (c) Virtual Infrastructure Automation to enable On-Demand Services.

  12. Workflows for fluxomics in the framework of PhenoMeNal project

    OpenAIRE

    2016-01-01

    In the framework of the PhenoMeNal project and e-infrastructures (www.phenomenal-h2020.eu/home/), two workflows were prepared using previously developed programs for the analysis of intracellular fluxes from isotopologue distributions. Isotopologue distributions will be available in data repositories in MetaboLights (www.ebi.ac.uk/metabolights). With this aim, the programs were adapted to the workflows; Dockerfiles for each program's Docker container were uploaded to GitHub repositories, added to Jenki...

  13. An analytical method for well-formed workflow/Petri net verification of classical soundness

    Directory of Open Access Journals (Sweden)

    Clempner Julio

    2014-12-01

    Full Text Available In this paper we consider workflow nets as dynamical systems governed by ordinary difference equations described by a particular class of Petri nets. Workflow nets are a formal model of business processes. Well-formed business processes correspond to sound workflow nets. Even if it seems necessary to require the soundness of workflow nets, there exist business processes with conditional behavior that will not necessarily satisfy the soundness property. In this sense, we propose an analytical method for showing that a workflow net satisfies the classical soundness property using a Petri net. To present our statement, we use Lyapunov stability theory to tackle the classical soundness verification problem for a class of dynamical systems described by Petri nets. This class of Petri nets allows a dynamical model representation that can be expressed in terms of difference equations. As a result, by applying Lyapunov theory, the classical soundness property for workflow nets is solved by proving that the Petri net representation is stable. We show that a finite and non-blocking workflow net satisfies the soundness property if and only if its corresponding PN is stable, i.e., given the incidence matrix A of the corresponding PN, there exists a strictly positive m-vector Φ such that AΦ ≤ 0. The key contribution of the paper is the analytical method itself, which satisfies part of the definition of the classical soundness requirements. The method is designed for practical applications, guarantees that anomalies can be detected without domain knowledge, and can be easily implemented into existing commercial systems that do not support the verification of workflows. The validity of the proposed method is successfully demonstrated by application examples.
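
    The stability condition quoted above (a strictly positive vector Φ with AΦ ≤ 0) can be checked mechanically by linear programming. The following is a minimal sketch assuming SciPy and a toy incidence matrix; it is not the authors' implementation.

        # Check for a strictly positive vector phi with A @ phi <= 0, the
        # cited stability condition. Requiring phi >= 1 componentwise is
        # enough: any strictly positive solution can be rescaled.
        import numpy as np
        from scipy.optimize import linprog

        def is_stable(A):
            m = A.shape[1]
            res = linprog(c=np.zeros(m),             # pure feasibility problem
                          A_ub=A, b_ub=np.zeros(A.shape[0]),
                          bounds=[(1, None)] * m,    # phi_i >= 1 > 0
                          method="highs")
            return res.success

        # Toy incidence matrix of a small Petri net (illustrative only).
        A = np.array([[-1,  1,  0],
                      [ 1, -1,  0],
                      [ 0,  1, -1]])
        print("stable:", is_stable(A))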

  14. An Optimization Algorithm for Multipath Parallel Allocation for Service Resource in the Simulation Task Workflow

    OpenAIRE

    Zhiteng Wang; Hongjun Zhang; Rui Zhang; Yong Li; Xuliang Zhang

    2014-01-01

    Service oriented modeling and simulation are hot issues in the field of modeling and simulation, and there is need to call service resources when simulation task workflow is running. How to optimize the service resource allocation to ensure that the task is complete effectively is an important issue in this area. In military modeling and simulation field, it is important to improve the probability of success and timeliness in simulation task workflow. Therefore, this paper proposes an optimiz...

  15. Data Processing Workflows to Support Reproducible Data-driven Research in Hydrology

    Science.gov (United States)

    Goodall, J. L.; Essawy, B.; Xu, H.; Rajasekar, A.; Moore, R. W.

    2015-12-01

    Geoscience analyses often require the use of existing data sets that are large, heterogeneous, and maintained by different organizations. A particular challenge in creating reproducible analyses using these data sets is automating the workflows required to transform raw datasets into model specific input files and finally into publication ready visualizations. Data grids, such as the Integrated Rule-Oriented Data System (iRODS), are architectures that allow scientists to access and share large data sets that are geographically distributed on the Internet, but appear to the scientist as a single file management system. The DataNet Federation Consortium (DFC) project is built on iRODS and aims to demonstrate data and computational interoperability across scientific communities. This paper leverages iRODS and the DFC to demonstrate how hydrological modeling workflows can be encapsulated as workflows using the iRODS concept of Workflow Structured Objects (WSO). An example use case is presented for automating hydrologic model post-processing routines that demonstrates how WSOs can be created and used within the DFC to automate the creation of data visualizations from large model output collections. By co-locating the workflow used to create the visualization with the data collection, the use case demonstrates how data grid technology aids in reuse, reproducibility, and sharing of workflows within scientific communities.

  16. An iterative expanding and shrinking process for processor allocation in mixed-parallel workflow scheduling.

    Science.gov (United States)

    Huang, Kuo-Chan; Wu, Wei-Ya; Wang, Feng-Jian; Liu, Hsiao-Ching; Hung, Chun-Hao

    2016-01-01

    Parallel computation has been widely applied in a variety of large-scale scientific and engineering applications. Many studies indicate that exploiting both task and data parallelism, i.e. mixed-parallel workflows, to solve large computational problems can achieve better efficacy than either pure task parallelism or pure data parallelism. Scheduling traditional workflows of pure task parallelism on parallel systems has long been known to be an NP-complete problem. Mixed-parallel workflow scheduling has to deal with the additional challenging issue of processor allocation. In this paper, we explore the processor allocation issue in scheduling mixed-parallel workflows of moldable tasks, called M-tasks, and propose an Iterative Allocation Expanding and Shrinking (IAES) approach. Compared to previous approaches, our IAES has two distinguishing features. The first is allocating more processors to the tasks on allocated critical paths for effectively reducing the makespan of workflow execution. The second is allowing the processor allocation of an M-task to shrink during the iterative procedure, resulting in a more flexible and effective process for finding better allocations. The proposed IAES approach has been evaluated with a series of simulation experiments and compared to several well-known previous methods, including CPR, CPA, MCPA, and MCPA2. The experimental results indicate that our IAES approach outperforms those previous methods significantly in most situations, especially when nodes of the same layer in a workflow might have unequal workloads.
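
    The expand-and-shrink iteration can be pictured with a toy moldable-task model. The speedup law, the shrink rule and the termination test below are assumptions made for illustration; they are not the published IAES algorithm.

        # Toy sketch of iterative expanding/shrinking processor allocation
        # for moldable tasks running side by side. works[i] is task i's
        # sequential work; p processors give runtime work / p**0.8
        # (an assumed, not published, speedup model).
        P_TOTAL = 16

        def runtime(work, p):
            return work / p ** 0.8

        def expand_and_shrink(works, iters=50):
            alloc = [1] * len(works)
            for _ in range(iters):
                # Expand: give a processor to the current critical (slowest) task,
                crit = max(range(len(works)), key=lambda i: runtime(works[i], alloc[i]))
                # shrinking the least loaded task first if no processor is free.
                if sum(alloc) >= P_TOTAL:
                    slack = min(range(len(works)), key=lambda i: runtime(works[i], alloc[i]))
                    if slack == crit or alloc[slack] == 1:
                        break
                    alloc[slack] -= 1
                alloc[crit] += 1
            return alloc

        works = [40.0, 25.0, 10.0, 5.0]
        alloc = expand_and_shrink(works)
        span = max(runtime(w, p) for w, p in zip(works, alloc))
        print(alloc, f"makespan={span:.2f}")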

  17. Quantitative ethnographic study of physician workflow and interactions with electronic health record systems.

    Science.gov (United States)

    Asan, Onur; Chiou, Erin; Montague, Enid

    2015-09-01

    This study explores the relationship between primary care physicians' interactions with health information technology and primary care workflow. Clinical encounters were recorded with high-resolution video cameras to capture physicians' workflow and interaction with two objects of interest, the electronic health record (EHR) system, and their patient. To analyze the data, a coding scheme was developed based on a validated list of primary care tasks to define the presence or absence of a task, the time spent on each task, and the sequence of tasks. Results revealed divergent workflows and significant differences between physicians' EHR use surrounding common workflow tasks: gathering information, documenting information, and recommend/discuss treatment options. These differences suggest impacts of EHR use on primary care workflow, and capture types of workflows that can be used to inform future studies with larger sample sizes for more effective designs of EHR systems in primary care clinics. Future research on this topic and design strategies for effective health information technology in primary care are discussed.

  18. Automatic differentiation using vectorized hyper dual numbers

    Science.gov (United States)

    Swaroop, Kshitiz

    Sensitivity analysis is a method to measure the change in a dependent variable with respect to one or more independent variables, with uses including optimization, design analysis and risk modeling. Conventional methods like finite differences suffer from both truncation and subtraction errors and cannot be used to simultaneously calculate derivatives of an output with respect to multiple inputs (as commonly seen in optimization problems). Automatic differentiation tackles all these issues successfully, allowing us to calculate derivatives of any variable with respect to the independent variables in a computer program up to machine precision without any significant user input. Vectorized hyper dual numbers, an extension of hyper dual numbers, which allow the user to automatically calculate both the Hessian and the derivative along with the function evaluation, are developed for this thesis. The method is then used for the sizing and layup of a composite wind turbine blade as a proof of concept.
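
    The idea behind hyper-dual numbers is that arithmetic carries first- and second-derivative parts alongside the value, so derivatives fall out of an ordinary function evaluation. The minimal one-variable class below illustrates the principle only; it is not the vectorized implementation developed in the thesis.

        # Minimal one-variable hyper-dual number: a value, two first-derivative
        # parts (e1, e2) and a cross term (e1e2) carrying the second derivative.
        # Evaluating f(HyperDual(x, 1, 1, 0)) yields f(x), f'(x) and f''(x).
        class HyperDual:
            def __init__(self, f, e1=0.0, e2=0.0, e1e2=0.0):
                self.f, self.e1, self.e2, self.e1e2 = f, e1, e2, e1e2

            def __add__(self, o):
                o = o if isinstance(o, HyperDual) else HyperDual(o)
                return HyperDual(self.f + o.f, self.e1 + o.e1,
                                 self.e2 + o.e2, self.e1e2 + o.e1e2)
            __radd__ = __add__

            def __mul__(self, o):
                o = o if isinstance(o, HyperDual) else HyperDual(o)
                return HyperDual(self.f * o.f,
                                 self.f * o.e1 + self.e1 * o.f,
                                 self.f * o.e2 + self.e2 * o.f,
                                 self.f * o.e1e2 + self.e1 * o.e2
                                 + self.e2 * o.e1 + self.e1e2 * o.f)
            __rmul__ = __mul__

        def f(x):
            return x * x * x + 2 * x  # f(x) = x^3 + 2x

        y = f(HyperDual(2.0, 1.0, 1.0, 0.0))
        print(y.f, y.e1, y.e1e2)      # 12.0  14.0  12.0  (f, f', f'')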

  19. Design and realization of a semi-automatic English composition correction component based on B/S structure

    Institute of Scientific and Technical Information of China (English)

    王志瑞; 黄慧; 刘正涛

    2015-01-01

    To meet the need for English composition correction in online English learning, and building on a study of LanguageTool, JavaScript and jQuery, this paper designs and implements a semi-automatic English composition correction component. Using LanguageTool, the component performs automatic spelling and grammar checking of English compositions and, on the basis of the checking results, allows the reviewer to continue the correction on the browser side. This makes it convenient for teachers to correct English compositions online and for students to view the correction results. The component has been applied to the Easy Cube intelligent foreign language learning platform and can serve as a reference for other online language learning platforms.
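
    The kind of automatic check the component delegates to LanguageTool can be sketched as follows. The sketch assumes the third-party language_tool_python wrapper (and a local Java runtime); the paper's component instead invokes the LanguageTool library on the server side.

        # Sketch: spelling/grammar checking with LanguageTool via the
        # third-party language_tool_python wrapper (an assumption; not
        # the component described in the paper).
        import language_tool_python

        tool = language_tool_python.LanguageTool("en-US")
        essay = "He go to school yesterday and buyed a new book."

        for match in tool.check(essay):
            # Each match carries an offset, the flagged span and suggestions,
            # which a browser-side reviewer could accept, edit or discard.
            flagged = essay[match.offset:match.offset + match.errorLength]
            print(f"{match.ruleId}: '{flagged}' -> {match.replacements[:3]}")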

  20. Molecular simulation workflows as parallel algorithms: the execution engine of Copernicus, a distributed high-performance computing platform.

    Science.gov (United States)

    Pronk, Sander; Pouya, Iman; Lundborg, Magnus; Rotskoff, Grant; Wesén, Björn; Kasson, Peter M; Lindahl, Erik

    2015-06-01

    Computational chemistry and other simulation fields are critically dependent on computing resources, but few problems scale efficiently to the hundreds of thousands of processors available in current supercomputers-particularly for molecular dynamics. This has turned into a bottleneck as new hardware generations primarily provide more processing units rather than making individual units much faster, which simulation applications are addressing by increasingly focusing on sampling with algorithms such as free-energy perturbation, Markov state modeling, metadynamics, or milestoning. All these rely on combining results from multiple simulations into a single observation. They are potentially powerful approaches that aim to predict experimental observables directly, but this comes at the expense of added complexity in selecting sampling strategies and keeping track of dozens to thousands of simulations and their dependencies. Here, we describe how the distributed execution framework Copernicus allows the expression of such algorithms in generic workflows: dataflow programs. Because dataflow algorithms explicitly state dependencies of each constituent part, algorithms only need to be described on conceptual level, after which the execution is maximally parallel. The fully automated execution facilitates the optimization of these algorithms with adaptive sampling, where undersampled regions are automatically detected and targeted without user intervention. We show how several such algorithms can be formulated for computational chemistry problems, and how they are executed efficiently with many loosely coupled simulations using either distributed or parallel resources with Copernicus.
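
    The dataflow idea, where each step declares its inputs and anything whose dependencies are satisfied may run in parallel, can be sketched with standard-library futures. This is a generic illustration of the pattern, not the Copernicus execution engine.

        # Generic dataflow sketch: independent steps run in parallel, and a
        # combining step starts once all of its declared inputs are ready
        # (cf. the dataflow programs Copernicus executes; not Copernicus itself).
        from concurrent.futures import ThreadPoolExecutor

        def simulate(seed):
            return [seed * 0.1 * i for i in range(5)]   # stand-in for an MD run

        def combine(trajectories):
            return sum(map(sum, trajectories))          # stand-in for an estimator

        with ThreadPoolExecutor() as pool:
            # Eight independent simulations: no ordering stated, maximal parallelism.
            sims = [pool.submit(simulate, seed) for seed in range(8)]
            # The estimator depends on all simulations; it runs when they finish.
            observable = combine([s.result() for s in sims])

        print(f"combined observable: {observable:.2f}")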

  1. Atarrabi - A Workflow System for the Publication of Environmental Data

    Directory of Open Access Journals (Sweden)

    Florian Quadt

    2012-11-01

    Full Text Available In a research project funded by the German Research Foundation, meteorologists, data publication experts, and computer scientists optimised the publication process of meteorological data and developed software that supports metadata review. The project group placed particular emphasis on scientific and technical quality assurance of primary data and metadata. At the end, the software automatically registers a Digital Object Identifier at DataCite. The software has been successfully integrated into the infrastructure of the World Data Center for Climate, but a key objective was to make the results applicable to data publication processes in other sciences as well.

  2. ANALYSIS OF WORKFLOW ON DESIGN PROJECTS IN INDIA

    Directory of Open Access Journals (Sweden)

    Senthilkumar Venkatachalam

    2010-12-01

    Full Text Available Proposal: The increase in privately funded infrastructure construction in India has compelled project owners to demand highly compressed project schedules due to political risks and early revenue generation. As a result, many of the contracts are based on the EPC (Engineering, Procurement and Construction) contract form, enabling the contractor to plan and control the EPC phases. Sole responsibility for the three phases has facilitated the use of innovative approaches such as fast-track construction and concurrent engineering in order to minimize project duration. As part of a research study to improve design processes, the first author spent a year as an observer on two design projects carried out by a leading EPC contractor in India. Both projects required accelerated design and fast-track construction. The first project involved the detailed design of a coal handling unit for a power plant, and the second the preliminary phase of a large airport design project. The research team had the mandate to analyze the design process and suggest changes to make it more efficient. On the first project, detailed data on the design/drawing workflow was collected and analyzed. The paper presents the analysis of the data, identifying the bottlenecks in the process, and compares the analysis results with the perceptions of the design team. On the second project, the overall organizational structure for coordinating the interfaces between the design processes was evaluated. The paper presents a structured method to organize the interfaces and interactions between the various design disciplines. The details of the proposed method, implementation issues and outcomes of implementation are also discussed.

  3. Integrated exploration workflow in the south Middle Magdalena Valley (Colombia)

    Science.gov (United States)

    Moretti, Isabelle; Charry, German Rodriguez; Morales, Marcela Mayorga; Mondragon, Juan Carlos

    2010-03-01

    The HC exploration is presently active in the southern part of the Middle Magdalena Valley, but only moderate-size discoveries have been made to date. The majority of these discoveries are at shallow depth in the Tertiary section. The structures located in the Valley are faulted anticlines charged by lateral migration from the Cretaceous source rocks that are assumed to be present and mature eastward below the main thrusts and the Guaduas Syncline. Upper Cretaceous reservoirs have also been positively tested. To reduce the risks linked to the exploration of deeper structures below the western thrusts of the Eastern Cordillera, an integrated study was carried out. It includes the acquisition of new seismic data, the integration of all surface and subsurface data within a 3D geomodel, a quality control of the structural model by restoration, and a modeling of the petroleum system (presence and maturity of the Cretaceous source rocks, potential migration pathways). The various steps of this workflow will be presented, as well as the main conclusions in terms of source rock, deformation phases and timing of thrust emplacement versus oil maturation and migration. Our data suggest (or confirm): the good potential of the Umir Fm as a source rock; the early (Paleogene) deformation of the Bituima Trigo fault area; the maturity gap within the Cretaceous source rock between the hangingwall and footwall of the Bituima fault, which proves an initial offset of Cretaceous burial in the range of 4.5 km between the Upper Cretaceous series westward and the Lower Cretaceous ones eastward of this fault zone; and the post-Miocene weak reactivation as dextral strike-slip of Cretaceous faults such as the San Juan de Rio Seco fault, which corresponds to changes in the Cretaceous thickness and therefore in the depth of the thrust decollement.

  4. Automatic Configuration in NTP

    Institute of Scientific and Technical Information of China (English)

    Jiang Zongli(蒋宗礼); Xu Binbin

    2003-01-01

    NTP is nowadays the most widely used distributed network time protocol, which aims at synchronizing the clocks of computers in a network and keeping the accuracy and validity of the time information transmitted in the network. Without an automatic configuration mechanism, the stability and flexibility of a synchronization network built upon the NTP protocol are not satisfying. P2P's resource discovery mechanism is used to look for time sources in a synchronization network, and according to the network environment and node quality, the synchronization network is constructed dynamically.

  5. Automatic image cropping for republishing

    Science.gov (United States)

    Cheatle, Phil

    2010-02-01

    Image cropping is an important aspect of creating aesthetically pleasing web pages and repurposing content for different web or printed output layouts. Cropping provides both the possibility of improving the composition of the image, and also the ability to change the aspect ratio of the image to suit the layout design needs of different document or web page formats. This paper presents a method for aesthetically cropping images on the basis of their content. Underlying the approach is a novel segmentation-based saliency method which identifies some regions as "distractions", as an alternative to the conventional "foreground" and "background" classifications. Distractions are a particular problem with typical consumer photos found on social networking websites such as Facebook, Flickr, etc. Automatic cropping is achieved by identifying the main subject area of the image and then using an optimization search to expand this to form an aesthetically pleasing crop. Evaluation of aesthetic functions like auto-crop is difficult as there is no single correct solution. A further contribution of this paper is an automated evaluation method which goes some way towards handling the complexity of aesthetic assessment. This allows crop algorithms to be easily evaluated against a large test set.
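
    The two-stage idea, locating the main subject region and then growing it into a crop with the desired aspect ratio, can be sketched as follows. The saliency input and the greedy growth rule are assumptions for illustration, not the algorithm of the paper.

        # Sketch of crop-by-expansion: start from the bounding box of the
        # most salient region and grow it toward a target aspect ratio.
        import numpy as np

        def crop_box(saliency, aspect=4 / 3, thresh=0.8):
            ys, xs = np.where(saliency >= thresh * saliency.max())
            top, bottom = ys.min(), ys.max() + 1      # main-subject bounding box
            left, right = xs.min(), xs.max() + 1
            h, w = saliency.shape
            # Grow whichever direction is too short for the target ratio.
            while (right - left) / (bottom - top) < aspect and (left > 0 or right < w):
                left, right = max(0, left - 1), min(w, right + 1)
            while (right - left) / (bottom - top) > aspect and (top > 0 or bottom < h):
                top, bottom = max(0, top - 1), min(h, bottom + 1)
            return top, bottom, left, right

        saliency = np.zeros((120, 160))
        saliency[40:70, 90:110] = 1.0                 # synthetic subject region
        print(crop_box(saliency))                     # e.g. (40, 70, 80, 120)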

  6. Comparison of automatic control systems

    Science.gov (United States)

    Oppelt, W

    1941-01-01

    This report deals with a reciprocal comparison of an automatic pressure control, an automatic rpm control, an automatic temperature control, and an automatic directional control. It shows the difference between the "faultproof" regulator and the actual regulator which is subject to faults, and develops this difference as far as possible in a parallel manner with regard to the control systems under consideration. Such an analysis affords, particularly in its extension to the faults of the actual regulator, a deep insight into the mechanism of the regulator process.

  7. A workflow example of PBPK modeling to support pediatric research and development: case study with lorazepam.

    Science.gov (United States)

    Maharaj, A R; Barrett, J S; Edginton, A N

    2013-04-01

    The use of physiologically based pharmacokinetic (PBPK) models in the field of pediatric drug development has garnered much interest of late due to a recent Food and Drug Administration recommendation. The purpose of this study is to illustrate the developmental process involved in the creation of a pediatric PBPK model incorporating existing adult drug data. Lorazepam, a benzodiazepine utilized in both adults and children, was used as an example. A population PBPK model was developed in PK-Sim v4.2® and scaled to account for age-related changes in the size and composition of tissue compartments, protein binding, and growth/maturation of elimination processes. Dose (milligrams per kilogram) requirements for children aged 0-18 years were calculated based on simulations that achieved targeted exposures based on adult references. The predictive accuracy of the PBPK model for producing comparable plasma concentrations among 63 pediatric subjects was assessed using the average fold error (AFE). Estimates of clearance (CL) and volume of distribution (Vss) were compared with observed values for a subset of 15 children using the fold error (FE). Pediatric dose requirements in young children (1-3 years) exceeded adult levels on a linear weight-adjusted (milligrams per kilogram) basis. AFE values for model-derived concentration estimates were within 1.5- and 2-fold deviation from observed values for 73% and 92% of patients, respectively. For CL, 60% and 80% of predictions were within 1.5 and 2 FE, respectively. Comparatively, predictions of Vss were more accurate, with 80% and 100% of estimates within 1.5 and 2 FE, respectively. Using the presented workflow, the developed pediatric model estimated lorazepam pharmacokinetics in children as a function of age.
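
    For reference, a common convention for these fold-error metrics, which may or may not match the authors' exact variant, is, for predicted value \hat{y}_i and observed value y_i:

        \mathrm{FE}_i = \max\!\left(\frac{\hat{y}_i}{y_i},\,\frac{y_i}{\hat{y}_i}\right),
        \qquad
        \mathrm{AFE} = 10^{\frac{1}{n}\sum_{i=1}^{n}\left|\log_{10}\frac{\hat{y}_i}{y_i}\right|}

    Under this convention, a prediction "within 2 FE" means FE_i ≤ 2, and an AFE of 1 would indicate perfect average agreement.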

  8. Assessing color reproduction tolerances in commercial print workflow

    Science.gov (United States)

    Beretta, Giordano B.; Hoarau, Eric; Kothari, Sunil; Lin, I.-Jong; Zeng, Jun

    2012-01-01

    Except for linear devices like CRTs, color transformations from colorimetric specifications to device coordinates are mostly obtained by measuring a set of samples, inverting the table, looking up values in the table (including interpolation), and mapping the gamut from the input to the output device. The accuracy of a transformation is determined by reproducing a second set of samples and measuring the reproduction errors. Accuracy as the average predicted perceptual error is then used as a metric for quality. Accuracy and precision are important metrics in commercial print because a print service provider can charge a higher price for more accurate color, or can widen their tolerances when customers prefer cheap prints. The disadvantage of determining tolerances through averaging perceptual errors is that the colors in the sample sets are independent, and this is not necessarily a good correlate of print quality as determined through psychophysics studies. Indeed, images consist of color palettes, and the main quality factor is not color fidelity but color integrity. For example, if the divergence of the field of error vectors is zero, color constancy is likely to take over and humans will perceive the color reproduction as being of good quality, even if the average error is relatively large. However, if the errors are small but in random directions, the perceived image quality is poor because the relation among colors is altered. We propose a standard practice to determine tolerances based on the Farnsworth-Munsell 100-hue test (FM-100) for the second set and to evaluate the color transpositions (a metric for color integrity) instead of the color differences. The quality metric is then the FM-100 score. There are industry standards for the tolerances of color judges, and the same tolerances and classification can be used for print workflows or their components (e.g., presses, proofers, displays). We generalize this practice to arbitrary perceptually uniform scales tailored to

  9. A workflow for the 3D visualization of meteorological data

    Science.gov (United States)

    Helbig, Carolin; Rink, Karsten

    2014-05-01

    In the future, climate change will strongly influence our environment and living conditions. To predict possible changes, climate models that include basic and process conditions have been developed, and big data sets are produced as a result of simulations. The combination of various variables of climate models with spatial data from different sources helps to identify correlations and to study key processes. For our case study we use results of the Weather Research and Forecasting (WRF) model for two regions at different scales that include various landscapes in Northern Central Europe and Baden-Württemberg. We visualize these simulation results in combination with observation data and geographic data, such as river networks, to evaluate processes and analyze whether the model represents the atmospheric system sufficiently. For this purpose, a continuous workflow that leads from the integration of heterogeneous raw data to visualization using open source software (e.g. OpenGeoSys Data Explorer, ParaView) is developed. These visualizations can be displayed on a desktop computer or in an interactive virtual reality environment. We established a concept that includes recommended 3D representations and a color scheme for the variables of the data based on existing guidelines and established traditions in the specific domain. To examine changes over time in observation and simulation data, we added the temporal dimension to the visualization. In a first step of the analysis, the visualizations are used to get an overview of the data and detect areas of interest such as regions of convection or wind turbulence. Then, subsets of data sets are extracted and the included variables can be examined in detail. An evaluation by experts from the domains of visualization and atmospheric sciences establishes whether they are self-explanatory and clearly arranged. These easy-to-understand visualizations of complex data sets are the basis for scientific communication. In addition, they have

  10. An Exercise in Invariant-based Programming with Interactive and Automatic Theorem Prover Support

    CERN Document Server

    Back, Ralph-Johan; 10.4204/EPTCS.79.2

    2012-01-01

    Invariant-Based Programming (IBP) is a diagram-based correct-by-construction programming methodology in which the program is structured around the invariants, which are additionally formulated before the actual code. Socos is a program construction and verification environment built specifically to support IBP. The front-end to Socos is a graphical diagram editor, allowing the programmer to construct invariant-based programs and check their correctness. The back-end component of Socos, the program checker, computes the verification conditions of the program and tries to prove them automatically. It uses the theorem prover PVS and the SMT solver Yices to discharge as many of the verification conditions as possible without user interaction. In this paper, we first describe the Socos environment from a user and systems level perspective; we then exemplify the IBP workflow by building a verified implementation of heapsort in Socos. The case study highlights the role of both automatic and interactive theorem provi...

  11. Copying Drilling and Dry Ice Blasting Composite Automatic Production Line for Emptying Projectile Charges

    Institute of Scientific and Technical Information of China (English)

    罗同杰; 张保良

    2016-01-01

    In view of the shortcomings of existing processes for removing shell charges with regard to the variety of explosives handled and the treatment of the "three wastes" (waste gas, waste water and waste residue), this paper proposes an automatic production line that can remove various shell charges by means of profiling drilling and dry ice jetting. It explains the concept and the major structural components of the system, which is automatically controlled by a Siemens PLC S7-200. By changing the machinery and tools, the line can flexibly handle different types of projectiles. Analysis shows that the automatic production line can empty various types of projectiles filled with different kinds of explosives efficiently and without pollution, and that the recovered explosives and emptied shell bodies are of good quality.

  12. Incorporating Semantics into Data Driven Workflows for Content Based Analysis

    Science.gov (United States)

    Argüello, M.; Fernandez-Prieto, M. J.

    Finding meaningful associations between text elements and knowledge structures within clinical narratives in a highly verbal domain, such as psychiatry, is a challenging goal. The research presented here uses a small corpus of case histories and brings into play pre-existing knowledge, and therefore complements other approaches that use large corpora (millions of words) and no pre-existing knowledge. The paper describes a variety of experiments for content-based analysis: Linguistic Analysis using NLP-oriented approaches, Sentiment Analysis, and Semantically Meaningful Analysis. Although it is not standard practice, the paper advocates providing automatic support to annotate the functionality as well as the data for each experiment by performing semantic annotation using OWL and OWL-S. Lessons learnt can be transferred to legacy clinical databases facing the conversion of clinical narratives according to prominent Electronic Health Records standards.

  13. Seamless online science workflow development and collaboration using IDL and the ENVI Services Engine

    Science.gov (United States)

    Harris, A. T.; Ramachandran, R.; Maskey, M.

    2013-12-01

    The Exelis-developed IDL and ENVI software are ubiquitous tools in Earth science research environments. The IDL Workbench is used by the Earth science community for programming custom data analysis and visualization modules. ENVI is a software solution for processing and analyzing geospatial imagery that combines support for multiple Earth observation scientific data types (optical, thermal, multi-spectral, hyperspectral, SAR, LiDAR) with advanced image processing and analysis algorithms. The ENVI & IDL Services Engine (ESE) is an Earth science data processing engine that allows researchers to use open standards to rapidly create, publish and deploy advanced Earth science data analytics within any existing enterprise infrastructure. Although powerful in many ways, the tools lack collaborative features out of the box. Thus, as part of the NASA-funded project Collaborative Workbench to Accelerate Science Algorithm Development, researchers at the University of Alabama in Huntsville and Exelis have developed plugins that allow seamless research collaboration from within the IDL Workbench. Such additional features within the IDL Workbench are possible because the IDL Workbench is built on the Eclipse Rich Client Platform (RCP). RCP applications allow custom plugins to be dropped in for extended functionality. Specific functionalities of the plugins include creating complex workflows based on IDL application source code, submitting workflows to be executed by ESE in the cloud, and sharing and cloning of workflows among collaborators. All these functionalities are available to scientists without leaving their IDL Workbench. Because ESE can interoperate with any middleware, scientific programmers can readily string together IDL processing tasks (or tasks written in other languages like C++, Java or Python) to create complex workflows for deployment within their current enterprise architecture (e.g. ArcGIS Server, GeoServer, Apache ODE or SciFlo from JPL). Using the collaborative IDL

  14. Automatic Fixture Planning

    Institute of Scientific and Technical Information of China (English)

    1999-01-01

    Fixture planning is a crucial problem in the field of fixture design. In this paper, the research scope and research methods of computer-aided fixture planning are presented. Based on positioning principles of typical workparts, an ANN algorithm, namely the Hopfield algorithm, is adopted for automatic fixture planning. The paper also conducts in-depth research into the selection of positioning and clamping surfaces (or points) on workparts, using positioning-clamping-surface-selection rules and matrix evaluation of deterministic workpart positioning. Finally, methods to select positioning and clamping elements from a database, and a layout algorithm to assemble the selected fixture elements into a tangible fixture, are developed.

  15. Exploring Two Approaches for an End-to-End Scientific Analysis Workflow

    Science.gov (United States)

    Dodelson, Scott; Kent, Steve; Kowalkowski, Jim; Paterno, Marc; Sehrish, Saba

    2015-12-01

    The scientific discovery process can be advanced by the integration of independently-developed programs run on disparate computing facilities into coherent workflows usable by scientists who are not experts in computing. For such advancement, we need a system which scientists can use to formulate analysis workflows, to integrate new components to these workflows, and to execute different components on resources that are best suited to run those components. In addition, we need to monitor the status of the workflow as components get scheduled and executed, and to access the intermediate and final output for visual exploration and analysis. Finally, it is important for scientists to be able to share their workflows with collaborators. We have explored two approaches for such an analysis framework for the Large Synoptic Survey Telescope (LSST) Dark Energy Science Collaboration (DESC); the first one is based on the use and extension of Galaxy, a web-based portal for biomedical research, and the second one is based on a programming language, Python. In this paper, we present a brief description of the two approaches, describe the kinds of extensions to the Galaxy system we have found necessary in order to support the wide variety of scientific analysis in the cosmology community, and discuss how similar efforts might be of benefit to the HEP community.

  16. Time-efficient CT colonography interpretation using an advanced image-gallery-based, computer-aided "first-reader" workflow for the detection of colorectal adenomas

    Energy Technology Data Exchange (ETDEWEB)

    Mang, Thomas; Ringl, Helmut; Weber, Michael; Mueller-Mang, Christina [Medical University of Vienna, Department of Radiology, Vienna (Austria); Hermosillo, Gerardo; Wolf, Matthias; Bogoni, Luca; Salganicoff, Marcos; Raykar, Vikas [Siemens Healthcare, Siemens Medical Solutions, H IM SY CAD R and D, Malvern, PA (United States); Graser, Anno [University of Munich - Grosshadern Campus, Department of Clinical Radiology, Munich (Germany)

    2012-12-15

    To assess the performance of an advanced "first-reader" workflow for computer-aided detection (CAD) of colorectal adenomas ≥ 6 mm at computed tomographic colonography (CTC) in a low-prevalence cohort. A total of 616 colonoscopy-validated CTC patient datasets were retrospectively reviewed by a radiologist using a "first-reader" CAD workflow. CAD detections were presented as galleries of six automatically generated two-dimensional (2D) and three-dimensional (3D) images together with interactive 3D target views and 2D multiplanar views of the complete dataset. Each patient dataset was interpreted by initially using CAD image galleries followed by a fast 2D review to address unprompted colonic areas. Per-patient, per-polyp, and per-adenoma sensitivities were calculated for lesions ≥ 6 mm. Statistical testing employed Fisher's exact and McNemar tests. In 91/616 patients, 131 polyps (92 adenomas, 39 non-adenomas) ≥ 6 mm and two cancers were identified by reference standard. Using the CAD gallery-based first-reader workflow, the radiologist detected all adenomas ≥ 10 mm (34/34) and both cancers. Per-patient and per-polyp sensitivities for lesions ≥ 6 mm were 84.3% (75/89) and 83.2% (109/131), respectively, with 89.1% (57/64) and 85.9% (79/92) for adenomas. Overall specificity was 95.6% (504/527). Mean interpretation time was 3.1 min per patient. A CAD algorithm, applied in an image-gallery-based first-reader workflow, can substantially decrease reading times while enabling accurate detection of colorectal adenomas in a low-prevalence population.

  17. Management of natural resources through automatic cartographic inventory

    Science.gov (United States)

    Rey, P. A.; Gourinard, Y.; Cambou, F. (Principal Investigator)

    1974-01-01

    The author has identified the following significant results. Correspondence codes relating ERTS imagery to ground truth from vegetation and geology maps were established. The use of color equidensity and color composite methods for selecting zones of equal densitometric value on ERTS imagery was perfected. The primary interest of the temporal color composite is stressed. A chain of transfer operations from ERTS imagery to the automatic mapping of natural resources was developed.

  18. RECOMMENDATION FOR WEB SERVICE COMPOSITION BY MINING USAGE LOGS

    Directory of Open Access Journals (Sweden)

    Vivek R

    2016-03-01

    Web service composition has been one of the most researched topics of the past decade. Novel methods of web service composition proposed in the literature include semantics-based and WSDL-based composition. Although these methods provide promising results for composition, search and discovery of web services based on network QoS parameters and on the semantics or ontology associated with WSDL, they do not address composition based on the usage of web services. Web service usage logs capture time-series data of web service invocations by business objects, which innately capture the patterns and workflows associated with business operations. Web service composition based on such patterns and workflows can greatly streamline business operations. In this research work, we explore and implement methods of mining web service usage logs. The main objectives include identifying usage associations between services, linking one service invocation with another, and evaluating the causal relationships between associations of services.
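
    A minimal sketch of this kind of usage-log mining, assuming a toy log in which each business transaction lists the services it invoked in order (the log layout, service names, and thresholds are hypothetical, not the paper's method):

      from collections import Counter
      from itertools import combinations

      sessions = [
          ["CheckStock", "ReserveItem", "ChargeCard", "ShipOrder"],
          ["CheckStock", "ChargeCard", "ShipOrder"],
          ["CheckStock", "ReserveItem", "CancelOrder"],
      ]

      pair_counts = Counter()
      service_counts = Counter()
      for invoked in sessions:
          service_counts.update(set(invoked))
          for a, b in combinations(invoked, 2):   # ordered pairs: a before b
              pair_counts[(a, b)] += 1

      # Confidence of "a is later followed by b", a simple association measure.
      for (a, b), n in pair_counts.items():
          conf = n / service_counts[a]
          if conf >= 0.6:
              print(f"{a} -> {b}  support={n}  confidence={conf:.2f}")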

  19. An Integrated Workflow For Secondary Use of Patient Data for Clinical Research.

    Science.gov (United States)

    Bouzillé, Guillaume; Sylvestre, Emmanuelle; Campillo-Gimenez, Boris; Renault, Eric; Ledieu, Thibault; Delamarre, Denis; Cuggia, Marc

    2015-01-01

    This work proposes an integrated workflow for the secondary use of medical data to serve feasibility studies and the prescreening and monitoring of research studies. All research issues are initially addressed by the Clinical Research Office through a research portal and subsequently redirected to relevant experts in the determined field of concentration. For secondary use of data, the workflow is then based on the clinical data warehouse of the hospital. A datamart with potentially eligible research candidates is constructed. Datamarts can produce aggregated data, de-identified data, or identified data, according to the kind of study being treated. In conclusion, integrating the secondary use of data process into a general research workflow gives visibility to information technologies and improves the accessibility of clinical data.

  20. Talkoot Portals: Discover, Tag, Share, and Reuse Collaborative Science Workflows (Invited)

    Science.gov (United States)

    Wilson, B. D.; Ramachandran, R.; Lynnes, C.

    2009-12-01

    A small but growing number of scientists are beginning to harness Web 2.0 technologies, such as wikis, blogs, and social tagging, as a transformative way of doing science. These technologies provide researchers with easy mechanisms to critique, suggest and share ideas, data and algorithms. At the same time, large suites of algorithms for science analysis are being made available as remotely-invokable Web Services, which can be chained together to create analysis workflows. This provides the research community an unprecedented opportunity to collaborate by sharing their workflows with one another, reproducing and analyzing research results, and leveraging colleagues’ expertise to expedite the process of scientific discovery. However, wikis and similar technologies are limited to text, static images and hyperlinks, providing little support for collaborative data analysis. A team of information technology and Earth science researchers from multiple institutions has come together to improve community collaboration in science analysis by developing a customizable “software appliance” to build collaborative portals for Earth Science services and analysis workflows. The critical requirement is that researchers (not just information technologists) be able to build collaborative sites around service workflows within a few hours. We envision online communities coming together, much like a Finnish “talkoot” (a barn raising), to build a shared research space. Talkoot extends a freely available, open source content management framework with a series of modules specific to Earth Science for registering, creating, managing, discovering, tagging and sharing Earth Science web services and workflows for science data processing, analysis and visualization. Users will be able to author a “science story” in shareable web notebooks, including plots or animations, backed up by an executable workflow that directly reproduces the science analysis. New services and workflows of …

  1. The EOS imaging system: Workflow and radiation dose in scoliosis examinations

    DEFF Research Database (Denmark)

    Mussmann, Bo; Torfing, Trine; Jespersen, Stig;

    Introduction: The EOS imaging system is a biplane slot-beam scanner capable of full-body scans at low radiation dose and without geometrical distortion. It was implemented in our department in early 2012, and all scoliosis examinations are now performed in EOS. The system offers improved possibilities … The purpose of the study was to evaluate workflow, defined as scheduled time per examination, and radiation dose in scoliosis examinations in EOS compared to conventional X-ray evaluation. Materials and Methods: The Dose Area Product (DAP) was measured with a dosimeter, and a comparison between conventional X-ray and EOS was made. The workflow in 2011 was compared to the workflow in 2013 with regard to the total number of examinations and the scheduled examination time for scoliosis examinations. Results: DAP for a scoliosis examination was 185 mGy·cm² in conventional X-ray and 60.36 mGy·cm² in EOS …

  2. Parametric Workflow (BIM) for the Repair Construction of Traditional Historic Architecture in Taiwan

    Science.gov (United States)

    Ma, Y.-P.; Hsu, C. C.; Lin, M.-C.; Tsai, Z.-W.; Chen, J.-Y.

    2015-08-01

    In Taiwan, numerous existing traditional buildings are constructed with wooden, brick, and stone structures. This paper focuses on Taiwanese traditional historic architecture, targeting traditional wooden-structure buildings as the design proposition, and develops a BIM workflow for modeling complex wooden combination geometry, integrating the model with traditional 2D documents, and visualizing repair-construction assumptions within the 3D model representation. The goal of this article is to explore the current problems to overcome in wooden historic building conservation, and to introduce BIM technology for conserving, documenting, managing, and creating full engineering drawings and information to effectively support historic conservation. Although BIM is mostly oriented to current construction praxis, there have been some attempts to investigate its applicability in historic conservation projects. This article also illustrates the importance and advantages of using a BIM workflow in the repair construction process, compared with a generic workflow.

  3. Reduction of Hospital Physicians' Workflow Interruptions: A Controlled Unit-Based Intervention Study

    Directory of Open Access Journals (Sweden)

    Matthias Weigl

    2012-01-01

    Highly interruptive clinical environments may cause work stress and suboptimal clinical care. This study features an intervention to reduce workflow interruptions by re-designing work and organizational practices in hospital physicians providing ward coverage. A prospective, controlled intervention was conducted in two surgical and two internal wards. The intervention was based on physician quality circles - a participative technique to involve employees in the development of solutions to overcome work-related stressors. Outcome measures were the frequency of observed workflow interruptions. Workflow interruptions by fellow physicians and nursing staff were significantly lower after the intervention. However, a similar decrease was also observed in control units. Additional interviews to explore process-related factors suggested that there might have been spill-over effects in the sense that solutions were not strictly confined to the intervention group. Recommendations for further research on the effectiveness and consequences of such interventions for professional communication and patient safety are discussed.

  4. Loosen Couple Workflow Mode of Lean Operator Improvement Based on Positive Feedback

    Directory of Open Access Journals (Sweden)

    Yao Li

    2013-04-01

    In order to strengthen the core competitiveness of telecom operators facing fine-grained market operation, this article compares the ECTA mode (Extension Case Transmission Mode) and the LCA mode (Loosen Couple Mode), both promoted by the WfMC. By comparing these two modes, the situations each mode suits are determined. We also carry out an empirical analysis based on the mobile phone customization arrangement between China Telecom and handset manufacturers, expounding the improvement that an agile, loosely coupled telecom workflow with positive feedback brings to telecom enterprises. Finally, on the basis of the positive-feedback system, the task complexity and information transparency of the LCA mode are improved, so that the semantics of the public flow mode is kept unchanged while the sub-workflow is optimized when it is modified.

  5. Declarative Event-Based Workflow as Distributed Dynamic Condition Response Graphs

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2010-01-01

    We present Dynamic Condition Response Graphs (DCR Graphs) as a declarative, event-based process model inspired by the workflow language employed by our industrial partner and conservatively generalizing prime event structures. A dynamic condition response graph is a directed graph with nodes representing the events that can happen and arrows representing four relations between events: condition, response, include, and exclude. Distributed DCR Graphs are then obtained by assigning roles to events and principals. We give a graphical notation inspired by related work by van der Aalst et al. We exemplify the use of distributed DCR Graphs on a simple workflow taken from a field study at a Danish hospital, pointing out their flexibility compared to imperative workflow models. Finally we provide a mapping from DCR Graphs to Büchi automata.
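
    As a minimal execution sketch under the standard DCR semantics (marking = sets of executed, pending, and included events; an event is enabled when it is included and each of its still-included conditions has been executed), the clinical event names below are illustrative, not the paper's field-study workflow:

      class DCRGraph:
          def __init__(self, events, condition=(), response=(),
                       include=(), exclude=()):
              self.condition = set(condition)   # (a, b): a must precede b
              self.response = set(response)     # (a, b): a makes b pending
              self.include = set(include)       # (a, b): a includes b
              self.exclude = set(exclude)       # (a, b): a excludes b
              self.executed, self.pending = set(), set()
              self.included = set(events)

          def enabled(self, e):
              return e in self.included and all(
                  a in self.executed or a not in self.included
                  for (a, b) in self.condition if b == e)

          def execute(self, e):
              assert self.enabled(e), f"{e} is not enabled"
              self.executed.add(e)
              self.pending.discard(e)
              self.pending |= {b for (a, b) in self.response if a == e}
              self.included |= {b for (a, b) in self.include if a == e}
              self.included -= {b for (a, b) in self.exclude if a == e}

      g = DCRGraph({"prescribe", "sign", "give medicine"},
                   condition={("sign", "give medicine")},
                   response={("prescribe", "sign")})
      g.execute("prescribe")              # makes "sign" pending
      g.execute("sign")                   # satisfies the condition
      print(g.enabled("give medicine"))   # True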

  6. Armadillo 1.1: an original workflow platform for designing and conducting phylogenetic analysis and simulations.

    Directory of Open Access Journals (Sweden)

    Etienne Lord

    In this paper we introduce Armadillo v1.1, a novel workflow platform dedicated to designing and conducting phylogenetic studies, including comprehensive simulations. A number of important phylogenetic and general bioinformatics tools have been included in the first software release. As Armadillo is an open-source project, it allows scientists to develop their own modules as well as to integrate existing computer applications. Using our workflow platform, different complex phylogenetic tasks can be modeled and presented in a single workflow without any prior knowledge of programming techniques. The first version of Armadillo was successfully used by professors of bioinformatics at the Université du Québec à Montréal during graduate computational biology courses taught in 2010-11. The program and its source code are freely available at: .

  7. New strategies for medical data mining, part 3: automated workflow analysis and optimization.

    Science.gov (United States)

    Reiner, Bruce

    2011-02-01

    The practice of evidence-based medicine calls for the creation of "best practice" guidelines, leading to improved clinical outcomes. One of the primary factors limiting evidence-based medicine in radiology today is the relative paucity of standardized databases. The creation of standardized medical imaging databases offers the potential to enhance radiologist workflow and diagnostic accuracy through objective data-driven analytics, which can be categorized in accordance with specific variables relating to the individual examination, patient, provider, and technology being used. In addition to this "global" database analysis, "individual" radiologist workflow can be analyzed through the integration of electronic auditing tools into the PACS. The combination of these individual and global analyses can ultimately identify best-practice patterns, which can be adapted to the individual attributes of end users and ultimately used in the creation of automated evidence-based medicine workflow templates.

  8. Differentiated protection services with failure probability guarantee for workflow-based applications

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Jin, Yaohui; Sun, Weiqiang; Hu, Weisheng

    2010-12-01

    A cost-effective and service-differentiated provisioning strategy is very desirable to service providers so that they can offer users satisfactory services while optimizing network resource allocation. Providing differentiated protection services to connections for surviving link failure has been extensively studied in recent years. However, differentiated protection services for workflow-based applications, which consist of many interdependent tasks, have scarcely been studied. This paper investigates the problem of providing differentiated services for workflow-based applications in optical grids. We develop three differentiated protection service provisioning strategies which can provide security-level guarantees and network-resource optimization for workflow-based applications. The simulation demonstrates that these heuristic algorithms provide protection cost-effectively while satisfying the applications' failure-probability requirements.

  9. An Extended Policy Language for Role Resolution in Project-Oriented Workflow

    Institute of Scientific and Technical Information of China (English)

    张晓光; 曹健; 张申生; 牟玉洁

    2004-01-01

    HP defines an SQL-like language to specify organizational policies (or constraints) in workflow systems. Three types of policies were studied, including qualification, requirement and substitution policies, which cannot handle complex role resolutions such as Separation of Roles and Binding of Roles, or several exception situations such as Role Delegation and Role Unavailability. From the perspective of project-oriented workflow, a project and its sub-projects can be under the charge of teams (or virtual teams), and the teams should satisfy the role resolution of the projects managed by the team. To support the above requirements, based on a team-enabled organization model, this paper extends HP's policy language to support role resolution in project-oriented workflow, and provides its modeling and enforcement mechanism.

  10. EPUB as publication format in Open Access journals: Tools and workflow

    Directory of Open Access Journals (Sweden)

    Trude Eikebrokk

    2014-04-01

    In this article, we present a case study of how the main publishing format of an Open Access journal was changed from PDF to EPUB by designing a new workflow using JATS as the basic XML source format. We state the reasons and discuss advantages for doing this, how we did it, and the costs of changing an established Microsoft Word workflow. As an example, we use one typical sociology article with tables, illustrations and references. We then follow the article from JATS markup through different transformations resulting in XHTML, EPUB and MOBI versions. In the end, we put everything together in an automated XProc pipeline. The process has been developed on free and open source tools, and we describe and evaluate these tools in the article. The workflow is suitable for non-professional publishers, and all code is attached and free for reuse by others.
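
    One transformation step of such a pipeline can be sketched with lxml in Python ("article-jats.xml" and "jats-to-xhtml.xsl" are hypothetical file names; the article's actual pipeline chains several XSLT steps inside XProc):

      from lxml import etree

      source = etree.parse("article-jats.xml")            # JATS-tagged article
      transform = etree.XSLT(etree.parse("jats-to-xhtml.xsl"))
      xhtml = transform(source)                           # XHTML for the EPUB spine

      with open("article.xhtml", "wb") as out:
          out.write(etree.tostring(xhtml, pretty_print=True,
                                   xml_declaration=True, encoding="UTF-8"))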

  11. KNOWLEDGE MANAGEMENT DRIVEN BUSINESS PROCESS AND WORKFLOW MODELING WITHIN AN ORGANIZATION FOR CUSTOMER SATISFACTION

    Directory of Open Access Journals (Sweden)

    Atsa Etoundi roger,

    2010-12-01

    To deal with the competitive pressure of the network economy, enterprises have to design their business processes and workflow systems around the satisfaction of customers. Mass production therefore has to be abandoned in favor of individual and customized products. Enterprises that fail to meet this challenge will be obliged to step down. Those which tackle this problem need to manage the knowledge of various customers in order to come out with a set of criteria for the delivery of services or the production of products. These criteria should then be used to reengineer the business processes and workflows accordingly for the satisfaction of each customer. In this paper, based on the knowledge management approach, we define an enterprise business process and workflow model for the delivery of services and the production of goods based on the satisfaction of customers.

  12. Comprehensive workflow for wireline fluid sampling in unconsolidated formations utilizing new large-volume sampling equipment

    Energy Technology Data Exchange (ETDEWEB)

    Kvinnsland, S.; Brun, M. [TOTAL E&P Norge (Norway)]; Achourov, V.; Gisolf, A. [Schlumberger (Canada)]

    2011-07-01

    Precise and accurate knowledge of fluid properties in unconsolidated formations is essential to the design of production facilities. Wireline formation testers (WFT) have a wide range of applications, and the latest WFTs can be used to define fluid properties in wells drilled with oil-based mud (OBM) by acquiring PVT and large-volume samples. To use these technologies, a comprehensive workflow has to be implemented, and the aim of this paper is to present such a workflow. Sampling was conducted in highly unconsolidated sand saturated with biodegradable fluid in the Hild field in the North Sea. Results showed the use of the comprehensive workflow to be successful in obtaining large-volume samples with a contamination level below 1%. The oil was precisely characterized thanks to these samples, and design updates to the project were made possible. This paper highlights that the use of the latest WFT technologies can help better characterize fluids in unconsolidated formations and thus optimize production facility design.

  13. Automatic Integration Testbeds validation on Open Science Grid

    Science.gov (United States)

    Caballero, J.; Thapa, S.; Gardner, R.; Potekhin, M.

    2011-12-01

    A recurring challenge in deploying high-quality production middleware is the extent to which realistic testing occurs before release of the software into the production environment. We describe here an automated system for validating releases of the Open Science Grid software stack that leverages the (pilot-based) PanDA job management system developed and used by the ATLAS experiment. The system was motivated by a desire to subject the OSG Integration Testbed to more realistic validation tests, in particular tests which resemble, to every extent possible, the actual job workflows used by the experiments, thus exercising job scheduling at the compute element (CE), use of the worker-node execution environment, transfer of data to/from the local storage element (SE), etc. The context is that candidate releases of OSG compute and storage elements can be tested by injecting large numbers of synthetic jobs varying in complexity and in the coverage of services tested. The native capabilities of the PanDA system can thus be used to define jobs, monitor their execution, and archive the resulting run statistics, including success and failure modes. A repository of generic workflows and job types to measure various metrics of interest has been created. A command-line toolset has been developed so that testbed managers can quickly submit "VO-like" jobs into the system when newly deployed services are ready for testing. A system for automatic submission has been crafted to send jobs to integration testbed sites, collecting the results in a central service and generating regular reports on performance and reliability.

  14. TLSpy: An Open-Source Addition to Terrestrial Lidar Workflows

    Science.gov (United States)

    Frechette, J. D.; Weissmann, G. S.; Wawrzyniec, T. F.

    2008-12-01

    … points, and deleting points below that plane. This plugin simplifies the process by automatically identifying waterline points using characteristic changes in geometry and intensity. Automatic identification is often faster and more reliable than manual identification; however, manual control is retained as a fallback for degenerate cases.

  15. Petri Net-Based Workflow Access Control Model

    Institute of Scientific and Technical Information of China (English)

    陈卓; 骆婷; 石磊; 洪帆

    2004-01-01

    Access control is an important protection mechanism for information systems. This paper shows how to implement access control in workflow systems. We give a workflow access control model (WACM) based on several current access control models. The model supports role assignment and dynamic authorization. The paper defines the workflow using Petri nets: it first gives the definition and description of the workflow, and then analyzes the architecture of the workflow access control model (WACM). Finally, an example of an e-commerce workflow access control model is discussed in detail.
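
    A minimal place/transition net in Python illustrates the firing rule underlying such a workflow definition (the e-commerce task names are hypothetical stand-ins, not the paper's example):

      class PetriNet:
          def __init__(self, marking):
              self.marking = dict(marking)      # place -> token count
              self.transitions = {}             # name -> (inputs, outputs)

          def add_transition(self, name, inputs, outputs):
              self.transitions[name] = (inputs, outputs)

          def enabled(self, name):
              inputs, _ = self.transitions[name]
              return all(self.marking.get(p, 0) > 0 for p in inputs)

          def fire(self, name):
              assert self.enabled(name), f"{name} is not enabled"
              inputs, outputs = self.transitions[name]
              for p in inputs:                  # consume input tokens
                  self.marking[p] -= 1
              for p in outputs:                 # produce output tokens
                  self.marking[p] = self.marking.get(p, 0) + 1

      net = PetriNet({"order_received": 1})
      net.add_transition("check_credit", ["order_received"], ["credit_ok"])
      net.add_transition("ship", ["credit_ok"], ["done"])
      net.fire("check_credit")
      net.fire("ship")
      print(net.marking)    # {'order_received': 0, 'credit_ok': 0, 'done': 1}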

  16. Creating OGC Web Processing Service workflows using a web-based editor

    Science.gov (United States)

    de Jesus, J.; Walker, P.; Grant, M.

    2012-04-01

    The OGC WPS (Web Processing Service) specifies how geospatial algorithms may be accessed in an SOA (Service Oriented Architecture). Service providers can encode both simple and sophisticated algorithms as WPS processes and publish them as web services. These services are not only useful individually but may be built into complex processing chains (workflows) that can solve complex data analysis and/or scientific problems. The NETMAR project has extended the Web Processing Service (WPS) framework to provide transparent integration between it and the commonly used WSDL (Web Service Description Language) that describes the web services, and its default SOAP (Simple Object Access Protocol) binding. The extensions allow WPS services to be orchestrated using commonly used tools (in this case Taverna Workbench, but BPEL-based systems would also be an option). We have also developed a WebGUI service editor, based on HTML5 and the WireIt! JavaScript API, that allows users to create these workflows using only a web browser. The editor is coded entirely in JavaScript and performs all XSLT transformations needed to produce a Taverna-compatible (T2FLOW) workflow description, which can be exported and run on a local Taverna Workbench or uploaded to a web-based orchestration server and run there. Here we present the NETMAR WebGUI service chain editor and discuss the problems associated with the development of a WebGUI for scientific workflow editing; content transformation into the Taverna orchestration language (T2FLOW/SCUFL); final orchestration in the Taverna engine; and how to deal with the large volumes of data being transferred between different WPS services (possibly running on different servers) during workflow orchestration. We will also demonstrate use of the WebGUI to create a simple workflow making use of published web processing services, showing how simple services may be chained together to produce outputs that would previously have required a GIS (Geographic …
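
    Individually, a WPS process can be invoked programmatically, for example with OWSLib in Python (a sketch under assumptions: the service URL and process identifier are hypothetical, and orchestration tools such as Taverna chain many such calls into a workflow):

      from owslib.wps import WebProcessingService

      # Connect and list the processes the server advertises.
      wps = WebProcessingService("https://example.org/wps")
      print([p.identifier for p in wps.processes])

      # Execute one process and poll until it finishes.
      execution = wps.execute("example.process.id",
                              inputs=[("bbox", "-10,40,5,60")])
      while not execution.isComplete():
          execution.checkStatus(sleepSecs=5)     # poll the status document
      for output in execution.processOutputs:
          print(output.identifier, output.reference)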

  17. A semi-automated workflow for biodiversity data retrieval, cleaning, and quality control

    Directory of Open Access Journals (Sweden)

    Cherian Mathew

    2014-12-01

    The compilation and cleaning of data needed for analyses and prediction of species distributions is a time-consuming process requiring a solid understanding of data formats and the service APIs provided by biodiversity informatics infrastructures. We designed and implemented a Taverna-based Data Refinement Workflow which integrates taxonomic data retrieval, data cleaning, and data selection into a consistent, standards-based, and effective system hiding the complexity of the underlying service infrastructures. The workflow can be freely used both locally and through a web portal which does not require additional software installations by users.

  18. Implementation of workflow engine technology to deliver basic clinical decision support functionality

    Directory of Open Access Journals (Sweden)

    Oberg Ryan

    2011-04-01

    Background: Workflow engine technology represents a new class of software with the ability to graphically model step-based knowledge. We present an application of this novel technology to the domain of clinical decision support. Successful implementation of decision support within an electronic health record (EHR) remains an unsolved research challenge. Previous research efforts were mostly based on healthcare-specific representation standards and execution engines and did not reach wide adoption. We focus on two challenges in decision support systems: the ability to test decision logic on retrospective data prior to prospective deployment, and the challenge of user-friendly representation of clinical logic. Results: We present our implementation of a workflow engine technology that addresses these two challenges in delivering clinical decision support. Our system is based on a cross-industry standard, the XML (Extensible Markup Language) Process Definition Language (XPDL). The core components of the system are a workflow editor for modeling clinical scenarios and a workflow engine for execution of those scenarios. We demonstrate, with an open-source and publicly available workflow suite, that clinical decision support logic can be executed on retrospective data. The same flowchart-based representation can also function in a prospective mode where the system can be integrated with an EHR system and respond to real-time clinical events. We limit the scope of our implementation to decision support content generation (which can be EHR system vendor independent). We do not focus on supporting complex decision support content delivery mechanisms due to the lack of standardization of EHR systems in this area. We present the results of our evaluation of the flowchart-based graphical notation as well as an architectural evaluation of our implementation using an established evaluation framework for clinical decision support architecture. Conclusions: We …

  19. Magnetic resonance only workflow and validation of dose calculations for radiotherapy of prostate cancer

    DEFF Research Database (Denmark)

    Christiansen, Rasmus Lübeck; Jensen, Henrik R.; Brink, Carsten

    2017-01-01

    Background: Current state-of-the-art radiotherapy planning of prostate cancer utilises magnetic resonance (MR) for soft-tissue delineation and computed tomography (CT) to provide an electron density map for dose calculation. This dual-scan workflow is prone to setup and registration error. … This study evaluates the feasibility of an MR-only workflow and the validity of dose calculation from an MR-derived pseudo-CT. Material and methods: Thirty prostate cancer patients were CT and MR scanned. Clinical treatment plans were generated on CT using a single 18 MV arc volumetric modulated arc therapy …

  20. How to Take HRMS Process Management to the Next Level with Workflow Business Event System

    Science.gov (United States)

    Rajeshuni, Sarala; Yagubian, Aram; Kunamaneni, Krishna

    2006-01-01

    Oracle Workflow with the Business Event System offers a complete process management solution for enterprises to manage business processes cost-effectively. Using Workflow event messaging, event subscriptions, AQ Servlet and advanced queuing technologies, this presentation will demonstrate the step-by-step design and implementation of system solutions in order to integrate two dissimilar systems and establish communication remotely. As a case study, the presentation walks you through the process of propagating organization name changes in other applications that originated from the HRMS module without changing applications code. The solution can be applied to your particular business cases for streamlining or modifying business processes across Oracle and non-Oracle applications.

  1. Automatic categorization of diverse experimental information in the bioscience literature

    Directory of Open Access Journals (Sweden)

    Fang Ruihua

    2012-01-01

    … the curation workflow at WormBase for automatic association of newly published papers with ten data types including RNAi, antibody, phenotype, gene regulation, mutant allele sequence, gene expression, gene product interaction, overexpression phenotype, gene interaction, and gene structure correction. Conclusions: Our methods are applicable to a variety of data types with training sets containing several hundred to a few thousand documents. The approach is completely automatic and can thus be readily incorporated into the different workflows of different literature-based databases. We believe that the work presented here can contribute greatly to the tremendous task of automating the important yet labor-intensive biocuration effort.
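
    The paper-to-data-type association described here is a multi-label text classification task; a hedged sketch with scikit-learn follows (toy texts and labels only; the actual WormBase features and training corpora are not reproduced):

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.multiclass import OneVsRestClassifier
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import MultiLabelBinarizer

      docs = ["dsRNA injection produced embryonic lethality ...",
              "the antibody stained body-wall muscle ...",
              "GFP reporter expression was observed in neurons ..."]
      labels = [["RNAi", "phenotype"], ["antibody"], ["gene expression"]]

      mlb = MultiLabelBinarizer()
      Y = mlb.fit_transform(labels)        # one binary column per data type

      clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                          OneVsRestClassifier(LogisticRegression(max_iter=1000)))
      clf.fit(docs, Y)

      pred = clf.predict(["worms were injected with dsRNA ..."])
      print(mlb.inverse_transform(pred))   # e.g. [('RNAi',)]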

  2. Automatic aircraft recognition

    Science.gov (United States)

    Hmam, Hatem; Kim, Jijoong

    2002-08-01

    Automatic aircraft recognition is very complex because of clutter, shadows, clouds, self-occlusion and degraded imaging conditions. This paper presents an aircraft recognition system which assumes from the start that the image is possibly degraded, and implements a number of strategies to overcome edge fragmentation and distortion. The current vision system employs a bottom-up approach, where recognition begins by locating image primitives (e.g., lines and corners), which are then combined in an incremental fashion into larger sets of line groupings using knowledge about aircraft, as viewed from a generic viewpoint. Knowledge about aircraft is represented in the form of whole/part shape descriptions and the connectedness property, and is embedded in production rules, which primarily aim at finding instances of the aircraft parts in the image and checking the connectedness property between the parts. Once a match is found, a confidence score is assigned, and as evidence in support of an aircraft interpretation is accumulated, the score is increased proportionally. Finally, a selection of the resulting image interpretations with the highest scores is subjected to competition tests, and only non-ambiguous interpretations are allowed to survive. Experimental results demonstrating the effectiveness of the current recognition system are given.

  3. Automatic Kurdish Dialects Identification

    Directory of Open Access Journals (Sweden)

    Hossein Hassani

    2016-02-01

    Automatic dialect identification is a necessary language technology for processing multi-dialect languages in which the dialects are linguistically far from each other. It becomes particularly crucial where the dialects are mutually unintelligible. Therefore, to perform computational activities on these languages, the system needs to identify the dialect that is the subject of the process. The Kurdish language encompasses various dialects, is written using several different scripts, and lacks a standard orthography. This situation makes Kurdish dialect identification all the more interesting and necessary, from both the research and the application perspectives. In this research, we have applied a classification method based on supervised machine learning to identify the dialects of Kurdish texts. The research has focused on the two most widely spoken and dominant Kurdish dialects, namely Kurmanji and Sorani. The approach could be applied to the other Kurdish dialects as well. The method is also applicable to languages which are similar to Kurdish in their dialectal diversity and differences.
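
    A hedged sketch of such supervised dialect classification (character n-grams are a natural feature here because Kurmanji and Sorani are typically written in different scripts; the training lines below are toy illustrations, not the paper's corpus or feature set):

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.pipeline import make_pipeline
      from sklearn.svm import LinearSVC

      texts = ["ez diçim malê",            # Kurmanji (Latin script)
               "tu çawa yî",               # Kurmanji
               "من دەچمە ماڵەوە",          # Sorani (Arabic-based script)
               "تۆ چۆنی"]                  # Sorani
      dialects = ["kurmanji", "kurmanji", "sorani", "sorani"]

      model = make_pipeline(
          TfidfVectorizer(analyzer="char_wb", ngram_range=(1, 3)),
          LinearSVC())
      model.fit(texts, dialects)
      print(model.predict(["em diçin bajêr"]))   # expected: ['kurmanji']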

  4. Electronic amplifiers for automatic compensators

    CERN Document Server

    Polonnikov, D Ye

    1965-01-01

    Electronic Amplifiers for Automatic Compensators presents the design and operation of electronic amplifiers for use in automatic control and measuring systems. This book is composed of eight chapters that consider the problems of constructing input and output circuits of amplifiers, suppression of interference and ensuring high sensitivity.This work begins with a survey of the operating principles of electronic amplifiers in automatic compensator systems. The succeeding chapters deal with circuit selection and the calculation and determination of the principal characteristics of amplifiers, as

  5. The Automatic Telescope Network (ATN)

    CERN Document Server

    Mattox, J R

    1999-01-01

    Because of the scheduled GLAST mission by NASA, there is strong scientific justification for preparation for very extensive blazar monitoring in the optical bands to exploit the opportunity to learn about blazars through the correlation of variability of the gamma-ray flux with flux at lower frequencies. Current optical facilities do not provide the required capability. Developments in technology have enabled astronomers to readily deploy automatic telescopes. The effort to create an Automatic Telescope Network (ATN) for blazar monitoring in the GLAST era is described. Other scientific applications of networks of automatic telescopes are discussed. The potential of the ATN for science education is also discussed.

  6. Characterizing Strain Variation in Engineered E. coli Using a Multi-Omics-Based Workflow

    DEFF Research Database (Denmark)

    Brunk, Elizabeth; George, Kevin W.; Alonso-Gutierrez, Jorge;

    2016-01-01

    … Application of this workflow identified the roles of candidate genes, pathways, and biochemical reactions in observed experimental phenomena and facilitated the construction of a mutant strain with improved productivity. The contributed workflow is available as an open-source tool in the form of iPython notebooks.

  7. Automatic Evaluation of Machine Translation

    DEFF Research Database (Denmark)

    Martinez, Mercedes Garcia; Koglin, Arlene; Mesa-Lao, Bartolomé

    2015-01-01

    The availability of systems capable of producing fairly accurate translations has increased the popularity of machine translation (MT). The translation industry is steadily incorporating MT in its workflows, engaging the human translator to post-edit the raw MT output in order to comply with a set … -DB). On the one hand, post-editing temporal effort was measured using FDur values (duration of segment production time excluding keystroke pauses ≥ 200 seconds) and KDur values (duration of coherent keyboard activity excluding keystroke pauses ≥ 5 seconds). On the other hand, post-editing technical effort …
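
    Both measures are durations with long pauses excluded; a simplified sketch of that computation from keystroke timestamps follows (the actual TPR-DB definitions involve segment boundaries not modelled here):

      def active_duration(timestamps, pause_cutoff):
          """Production time in seconds, ignoring pauses >= pause_cutoff."""
          gaps = (b - a for a, b in zip(timestamps, timestamps[1:]))
          return sum(g for g in gaps if g < pause_cutoff)

      keystrokes = [0.0, 1.0, 2.0, 9.0, 10.0]    # keystroke times in seconds
      print(active_duration(keystrokes, 5.0))    # KDur-like: 3.0
      print(active_duration(keystrokes, 200.0))  # FDur-like: 10.0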

  8. DiscopFlow: A new Tool for Discovering Organizational Structures and Interaction Protocols in WorkFlow

    CERN Document Server

    Abdelkafi, Mahdi; Gargouri, Faiez

    2012-01-01

    This work deals with Workflow Mining (WM), a very active and promising research area. First, we give a critical and comparative study of three representative WM systems: ProM, InWolve and WorkflowMiner. The comparison is made according to quality criteria that we have defined, such as the capacity to filter and convert a workflow log, the capacity to discover workflow perspectives and the capacity to support multi-analysis of processes. The major drawback of these systems is that they cannot discover the organizational perspective. By organizational perspective, we mean the organizational structures (federation, coalition, market or hierarchy) and interaction protocols (contract net, auction or vote). This paper defends the idea that the organizational dimension in Multi-Agent Systems is an appropriate approach to support the discovery of this organizational perspective. Second, the paper proposes a workflow log meta-model which extends the classical one ...

  9. Automatic Coarse Graining of Polymers

    OpenAIRE

    Faller, Roland

    2003-01-01

    Several recently proposed semi-automatic and fully automatic coarse-graining schemes for polymer simulations are discussed. All these techniques derive effective potentials for multi-atom units or super-atoms from atomistic simulations. These include techniques relying on single-chain simulations in vacuum and self-consistent optimizations from the melt, like the simplex method and the inverted Boltzmann method. The focus is on matching the polymer structure on different scales. Several …
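
    The inverted Boltzmann method mentioned here is usually iterated with the standard correction V_(i+1)(r) = V_i(r) + kT ln(g_i(r) / g_target(r)); a numpy sketch of one update step follows (the RDF arrays are placeholders; in practice g_current would come from a coarse-grained simulation run with the current potential):

      import numpy as np

      kT = 2.494   # kJ/mol at 300 K

      def ibi_step(V, g_current, g_target, damping=0.2, eps=1e-12):
          # Damped iterative Boltzmann inversion update of the pair potential.
          return V + damping * kT * np.log((g_current + eps) / (g_target + eps))

      r = np.linspace(0.3, 1.5, 121)                           # nm
      g_target = 1.0 + 0.5 * np.exp(-((r - 0.5) / 0.1) ** 2)   # toy target RDF
      V = -kT * np.log(g_target + 1e-12)                       # Boltzmann-inverted start
      g_current = np.ones_like(r)                              # pretend CG result
      V = ibi_step(V, g_current, g_target)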

  10. Automatic Sarcasm Detection: A Survey

    OpenAIRE

    Joshi, Aditya; Bhattacharyya, Pushpak; Carman, Mark James

    2016-01-01

    Automatic sarcasm detection is the task of predicting sarcasm in text. This is a crucial step for sentiment analysis, considering the prevalence and challenges of sarcasm in sentiment-bearing text. Beginning with an approach that used speech-based features, sarcasm detection has witnessed great interest from the sentiment analysis community. This paper is the first known compilation of past work in automatic sarcasm detection. We observe three milestones in the research so far: semi-supervised pat…

  11. Prospects for de-automatization.

    Science.gov (United States)

    Kihlstrom, John F

    2011-06-01

    Research by Raz and his associates has repeatedly found that suggestions for hypnotic agnosia, administered to highly hypnotizable subjects, reduce or even eliminate Stroop interference. The present paper sought unsuccessfully to extend these findings to negative priming in the Stroop task. Nevertheless, the reduction of Stroop interference has broad theoretical implications, both for our understanding of automaticity and for the prospect of de-automatizing cognition in meditation and other altered states of consciousness.

  12. The automatization of journalistic narrative

    Directory of Open Access Journals (Sweden)

    Naara Normande

    2013-06-01

    Full Text Available This paper proposes an initial discussion about the production of automatized journalistic narratives. Despite being a topic discussed in specialized sites and international conferences in communication area, the concepts are still deficient in academic research. For this article, we studied the concepts of narrative, databases and algorithms, indicating a theoretical trend that explains this automatized journalistic narratives. As characterization, we use the cases of Los Angeles Times, Narrative Science and Automated Insights.

  13. Process automatization in system administration

    OpenAIRE

    Petauer, Janja

    2013-01-01

    The aim of the thesis is to present the automatization of user management in the company Studio Moderna. The company has grown exponentially in recent years, which is why we needed to find a faster, easier and cheaper way of managing user accounts. We automatized the processes of creating, changing and removing user accounts within Active Directory. We prepared a user interface inside an existing application, used JavaScript for drop-down menus, and wrote a script in a scripting programming langu...

  14. A Geometric Processing Workflow for Transforming Reality-Based 3D Models in Volumetric Meshes Suitable for FEA

    Science.gov (United States)

    Gonizzi Barsanti, S.; Guidi, G.

    2017-02-01

    Conservation of Cultural Heritage is a key issue, and structural changes and damage can influence the mechanical behaviour of artefacts and buildings. The use of Finite Element Methods (FEM) for mechanical analysis is widely employed in modelling stress behaviour. The typical workflow involves the use of CAD 3D models made of Non-Uniform Rational B-Spline (NURBS) surfaces, representing the ideal shape of the object to be simulated. Nowadays, 3D documentation of CH has been widely developed through reality-based approaches, but the resulting models are not suitable for direct use in FEA: the mesh has to be converted to a volumetric one, and its density has to be reduced, since the computational complexity of a FEA grows exponentially with the number of nodes. The focus of this paper is to present a new method that aims to generate the most accurate 3D representation of a real artefact from highly accurate 3D digital models derived from reality-based techniques, maintaining the accuracy of the high-resolution polygonal models in the solid ones. The proposed approach is based on a wise use of retopology procedures and on the transformation of this model into a mathematical one made of NURBS surfaces, suitable for being processed by the volumetric meshers typically embedded in standard FEM packages. The strong simplification with little loss of consistency made possible by the retopology step is used to maintain as much coherence as possible between the original acquired mesh and the simplified model, creating in the meantime a topology that is more favourable for the automatic NURBS conversion.
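
    The controlled-simplification step can be sketched with Open3D's quadric decimation (only an illustration of reducing node count before volumetric meshing; the paper's retopology step is a more careful procedure, and "artefact_scan.ply" is a hypothetical input file):

      import open3d as o3d

      mesh = o3d.io.read_triangle_mesh("artefact_scan.ply")
      mesh.remove_duplicated_vertices()        # clean typical scan defects
      mesh.remove_degenerate_triangles()

      # Reduce the triangle count by a factor of ~20 before solid meshing.
      simplified = mesh.simplify_quadric_decimation(
          target_number_of_triangles=len(mesh.triangles) // 20)
      simplified.compute_vertex_normals()
      o3d.io.write_triangle_mesh("artefact_simplified.ply", simplified)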

  15. Refinement of arrival-time picks using a cross-correlation based workflow

    Science.gov (United States)

    Akram, Jubran; Eaton, David W.

    2016-12-01

    We propose a new iterative workflow based on cross-correlation for improved arrival-time picking on microseismic data. In this workflow, signal-to-noise ratio (S/N) and polarity-weighted stacking are used to minimize the effect of S/N and polarity fluctuations on the pilot waveform computation. We use an exhaustive search technique for polarity estimation through stack-power maximization. We use pseudo-synthetic and real microseismic data from western Canada to demonstrate the effectiveness of the proposed workflow relative to the Akaike information criterion (AIC) and a previously published cross-correlation based method. The pseudo-synthetic microseismic waveforms are obtained by introducing Gaussian noise and polarity fluctuations into waveforms from a high-S/N microseismic event. We find that the cross-correlation based approaches yield more accurate arrival-time picks than AIC for low-S/N waveforms. AIC is not affected by waveform polarities, as it works at individual receiver levels, whereas the accuracy of the existing cross-correlation method decreases in spite of using envelope correlation. We show that our proposed workflow yields better and more consistent arrival-time picks regardless of waveform amplitude and polarity variations within the receiver array. After refinement, the initial arrival-time picks are located closer to the best estimated manual picks.
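
    The core refinement step can be sketched in numpy: stack windows around the current picks into a pilot waveform, weighting by S/N and polarity as the abstract describes, then move each pick to the lag that best correlates with the pilot (a sketch under assumptions: inputs are synthetic arrays, and a real workflow would iterate this while re-estimating polarities by stack-power maximization):

      import numpy as np

      def refine_picks(traces, picks, snr, polarity, win=50, max_lag=20):
          # Pilot: S/N-weighted, polarity-corrected stack around current picks.
          windows = np.array([tr[p - win:p + win]
                              for tr, p in zip(traces, picks)])
          pilot = np.average(windows * polarity[:, None], axis=0, weights=snr)

          refined = []
          for tr, p, pol in zip(traces, picks, polarity):
              best_lag, best_cc = 0, -np.inf
              for lag in range(-max_lag, max_lag + 1):
                  seg = tr[p + lag - win:p + lag + win] * pol
                  # Normalized cross-correlation against the pilot.
                  cc = np.dot(seg, pilot) / (np.linalg.norm(seg)
                                             * np.linalg.norm(pilot) + 1e-12)
                  if cc > best_cc:
                      best_cc, best_lag = cc, lag
              refined.append(p + best_lag)
          return np.array(refined)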

  16. Effects of Lean Work Organization and Industrialization on Workflow and Productive Time in Housing Renovation Projects

    NARCIS (Netherlands)

    Vrijhoef, Ruben

    2016-01-01

    This paper presents work aimed at improved organization and performance of production in housing renovation projects. The purpose is to explore and demonstrate the potential of lean work organization and industrialized product technology to improve workflow and productive time. The research included

  17. An optimization algorithm for multipath parallel allocation for service resource in the simulation task workflow.

    Science.gov (United States)

    Wang, Zhiteng; Zhang, Hongjun; Zhang, Rui; Li, Yong; Zhang, Xuliang

    2014-01-01

    Service-oriented modeling and simulation are hot issues in the field of modeling and simulation, and service resources need to be invoked when a simulation task workflow is running. How to optimize the allocation of service resources to ensure that tasks complete effectively is an important issue in this area. In the military modeling and simulation field, it is important to improve the probability of success and the timeliness of simulation task workflows. Therefore, this paper proposes an optimization algorithm for multipath parallel allocation of service resources, in which a multipath parallel service resource allocation model is built and a multiple-chain coding scheme quantum optimization algorithm is used for optimization and solution. The multiple-chain coding scheme extends the parallel search space to improve search efficiency. Through simulation experiments, this paper investigates the effect of different optimization algorithms, service allocation strategies, and path numbers on the probability of success of the simulation task workflow, and the results show that the proposed multipath parallel allocation algorithm is an effective method to improve the probability of success and timeliness of simulation task workflows.

  18. Asking for Permission: A Survey of Copyright Workflows for Institutional Repositories

    Science.gov (United States)

    Hanlon, Ann; Ramirez, Marisa

    2011-01-01

    An online survey of institutional repository (IR) managers identified copyright clearance trends in staffing and workflows. The majority of respondents followed a mediated deposit model, and reported that library personnel, instead of authors, engaged in copyright clearance activities for IRs. The most common "information gaps" pertained to the…

  19. Automation of lidar-based hydrologic feature extraction workflows using GIS

    Science.gov (United States)

    Borlongan, Noel Jerome B.; de la Cruz, Roel M.; Olfindo, Nestor T.; Perez, Anjillyn Mae C.

    2016-10-01

    With the advent of LiDAR technology, higher-resolution datasets have become available for use in different remote sensing and GIS applications. One significant application of LiDAR datasets in the Philippines is in resource feature extraction. Feature extraction using LiDAR datasets requires complex and repetitive workflows which can take a lot of researchers' time through manual execution and supervision. The Development of the Philippine Hydrologic Dataset for Watersheds from LiDAR Surveys (PHD), a project under the Nationwide Detailed Resources Assessment Using LiDAR (Phil-LiDAR 2) program, created a set of scripts, the PHD Toolkit, to automate the processes and workflows necessary for hydrologic feature extraction, specifically streams and drainages, irrigation networks, and inland wetlands, using LiDAR datasets. These scripts are written in Python and can be added to the ArcGIS® environment as a toolbox. The toolkit is currently used as an aid for researchers in hydrologic feature extraction by simplifying the workflows, eliminating human error when providing the inputs, and providing quick and easy-to-use tools for repetitive tasks. This paper discusses the actual implementation of the different workflows developed by Phil-LiDAR 2 Project 4 in streams, irrigation network and inland wetland extraction.

  20. Design and Evaluation of Data Annotation Workflows for CAVE-like Virtual Environments.

    Science.gov (United States)

    Pick, Sebastian; Weyers, Benjamin; Hentschel, Bernd; Kuhlen, Torsten W

    2016-04-01

    Data annotation finds increasing use in Virtual Reality applications with the goal of supporting the data analysis process, for example in architectural reviews. In this context, a variety of annotation systems for immersive virtual environments have been presented. While many interesting interaction designs for the data annotation workflow have emerged from them, important details and evaluations are often omitted. In particular, we observe that the process of handling metadata to interactively create and manage complex annotations is often not covered in detail. In this paper, we strive to improve this situation by focusing on the design of data annotation workflows and their evaluation. We propose a workflow design that facilitates the most important annotation operations, i.e., annotation creation, review, and modification. Our workflow design is easily extensible in terms of supported annotation and metadata types as well as interaction techniques, which makes it suitable for a variety of application scenarios. To evaluate it, we conducted a user study in a CAVE-like virtual environment in which we compared our design to two alternatives in terms of a realistic annotation creation task. Our design obtained good results in terms of task performance and user experience.
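
    The three operations the workflow covers map naturally onto a small annotation record; a minimal sketch (the field names are hypothetical, and a VR system would attach richer metadata such as viewpoints or voice notes):

      from dataclasses import dataclass, field
      from datetime import datetime

      @dataclass
      class Annotation:
          anchor: tuple                  # 3D position the note is attached to
          text: str
          author: str
          created: datetime = field(default_factory=datetime.utcnow)
          reviews: list = field(default_factory=list)    # (who, verdict, when)
          history: list = field(default_factory=list)    # prior versions

          def modify(self, new_text, editor):
              self.history.append((self.text, editor, datetime.utcnow()))
              self.text = new_text

          def review(self, reviewer, verdict):
              self.reviews.append((reviewer, verdict, datetime.utcnow()))

      note = Annotation((1.2, 0.8, 2.4), "Check clearance of this beam", "alice")
      note.review("bob", "needs rework")
      note.modify("Beam clearance is below 2 m here", "alice")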