WorldWideScience

Sample records for normal acquisitions workflow

  1. iLAP: a workflow-driven software for experimental protocol development, data acquisition and analysis

    Directory of Open Access Journals (Sweden)

    McNally James

    2009-01-01

    Background: In recent years, the genome biology community has expended considerable effort to confront the challenges of managing heterogeneous data in a structured and organized way, and has developed laboratory information management systems (LIMS) for both raw and processed data. On the other hand, electronic notebooks were developed to record and manage scientific data and to facilitate data sharing. Software which enables both management of large datasets and digital recording of laboratory procedures would serve a real need in laboratories using medium- and high-throughput techniques. Results: We have developed iLAP (Laboratory data management, Analysis, and Protocol development), a workflow-driven information management system specifically designed to create and manage experimental protocols, and to analyze and share laboratory data. The system combines experimental protocol development, wizard-based data acquisition, and high-throughput data analysis into a single, integrated system. We demonstrate the power and the flexibility of the platform using a microscopy case study based on a combinatorial multiple fluorescence in situ hybridization (m-FISH) protocol and 3D-image reconstruction. iLAP is freely available under the open source license AGPL from http://genome.tugraz.at/iLAP/. Conclusion: iLAP is a flexible and versatile information management system which has the potential to close the gap between electronic notebooks and LIMS and can therefore be of great value for a broad scientific community.

  2. Development of a data independent acquisition mass spectrometry workflow to enable glycopeptide analysis without predefined glycan compositional knowledge.

    Science.gov (United States)

    Lin, Chi-Hung; Krisp, Christoph; Packer, Nicolle H; Molloy, Mark P

    2018-02-10

    Glycoproteomics investigates glycan moieties in a site-specific manner to reveal the functional roles of protein glycosylation. Identification of glycopeptides from data-dependent acquisition (DDA) relies on high-quality MS/MS spectra of glycopeptide precursors and often requires manual validation to ensure confident assignments. In this study, we investigated pseudo-MRM (MRM-HR) and data-independent acquisition (DIA) as alternative acquisition strategies for glycopeptide analysis. These approaches acquire data over the full MS/MS scan range, allowing data to be re-analyzed post-acquisition without re-acquisition. The advantage of MRM-HR over DDA for N-glycopeptide detection was demonstrated by targeted analysis of bovine fetuin, where all three N-glycosylation sites were detected, which was not the case with DDA. To overcome the duty-cycle limitation of MRM-HR acquisition needed for analysis of complex samples such as plasma, we trialed DIA. This allowed development of a targeted DIA method to identify N-glycopeptides without pre-defined knowledge of the glycan composition, thus providing the potential to identify N-glycopeptides with unexpected structures. This workflow was demonstrated by detection of 59 N-glycosylation sites from 41 glycoproteins in a HILIC-enriched human plasma tryptic digest. Twenty-one glycoforms of IgG1 glycopeptides were identified, including two truncated structures that are rarely reported. We developed a data-independent mass spectrometry workflow to identify specific glycopeptides from complex biological mixtures. The novelty is that this approach does not require the glycan composition to be pre-defined, thereby allowing glycopeptides carrying unexpected glycans to be identified. This is demonstrated through the analysis of immunoglobulins in human plasma, where we detected two IgG1 glycoforms that are rarely observed. Copyright © 2017 Elsevier B.V. All rights reserved.
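
    The record above describes re-interrogating full-scan-range DIA (and MRM-HR) MS/MS data for glycopeptides without a predefined glycan list. The abstract gives no implementation details; the sketch below only illustrates one common screening idea, flagging MS/MS spectra that contain glycan oxonium fragment ions. It is a hypothetical illustration, with the m/z list, tolerance, thresholds and names invented here, and is not the published workflow.

        # Hypothetical sketch: flag MS/MS spectra likely to contain glycopeptides
        # by looking for glycan oxonium ions (e.g. HexNAc ~204.087, Hex+HexNAc ~366.140).
        # Illustrative only; not the method described in the record above.
        OXONIUM_MZ = [204.087, 366.140, 274.092, 292.103]  # approximate monoisotopic m/z

        def has_oxonium_ions(peaks, tol_mz=0.02, min_hits=2):
            """peaks: list of (mz, intensity) pairs from one MS/MS spectrum."""
            hits = 0
            for target in OXONIUM_MZ:
                if any(abs(mz - target) <= tol_mz and inten > 0 for mz, inten in peaks):
                    hits += 1
            return hits >= min_hits

        # Example spectrum (synthetic data, for demonstration only)
        spectrum = [(204.086, 1.2e4), (366.139, 8.0e3), (500.25, 2.0e3)]
        print(has_oxonium_ions(spectrum))  # True -> candidate glycopeptide spectrum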

  3. Evaluation of MRI acquisition workflow with lean six sigma method: case study of liver and knee examinations.

    Science.gov (United States)

    Roth, Christopher J; Boll, Daniel T; Wall, Lisa K; Merkle, Elmar M

    2010-08-01

    The purpose of this investigation was to assess workflow for medical imaging studies, specifically comparing liver and knee MRI examinations by use of the Lean Six Sigma methodologic framework. The hypothesis tested was that the Lean Six Sigma framework can be used to quantify MRI workflow and to identify sources of inefficiency to target for sequence and protocol improvement. Audio-video interleave streams representing individual acquisitions were obtained with graphical user interface screen capture software in the examinations of 10 outpatients undergoing MRI of the liver and 10 outpatients undergoing MRI of the knee. With Lean Six Sigma methods, the audio-video streams were dissected into value-added time (true image data acquisition periods), business value-added time (time spent that provides no direct patient benefit but is requisite in the current system), and non-value-added time (scanner inactivity while awaiting manual input). For overall MRI table time, value-added time was 43.5% (range, 39.7-48.3%) of the time for liver examinations and 89.9% (range, 87.4-93.6%) for knee examinations. Business value-added time was 16.3% of the table time for the liver and 4.3% of the table time for the knee examinations. Non-value-added time was 40.2% of the overall table time for the liver and 5.8% for the knee examinations. Liver MRI examinations consume statistically significantly more non-value-added and business value-added time than do knee examinations, primarily because of respiratory command management and contrast administration. Workflow analyses and accepted inefficiency-reduction frameworks can be applied with use of a graphical user interface screen capture program.
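
    The classification above (value-added, business value-added, non-value-added time) reduces to timing each segment of the captured audio-video stream and summing durations by category. The snippet below is a minimal sketch of that arithmetic; the segment labels and durations are invented for illustration and are not taken from the study.

        # Minimal sketch: compute VA/BVA/NVA shares of MRI table time from
        # labelled time segments (durations in seconds; values are made up).
        segments = [
            ("acquisition", 310, "VA"),               # true image data acquisition
            ("breath-hold coaching", 95, "BVA"),
            ("contrast injection", 60, "BVA"),
            ("scanner idle, awaiting input", 240, "NVA"),
        ]

        total = sum(duration for _, duration, _ in segments)
        for cat in ("VA", "BVA", "NVA"):
            share = 100.0 * sum(d for _, d, c in segments if c == cat) / total
            print(f"{cat}: {share:.1f}% of table time")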

  4. Evaluation of a pre-surgical functional MRI workflow: From data acquisition to reporting.

    Science.gov (United States)

    Pernet, Cyril R; Gorgolewski, Krzysztof J; Job, Dominic; Rodriguez, David; Storkey, Amos; Whittle, Ian; Wardlaw, Joanna

    2016-02-01

    To present and assess clinical protocols and an associated automated workflow for pre-surgical functional magnetic resonance imaging in brain tumor patients. Protocols were validated using a single-subject reliability approach based on 10 healthy control subjects. Results from the automated workflow were evaluated in 9 patients with brain tumors, comparing fMRI results to direct electrical stimulation (DES) of the cortex. Using a new approach to compute single-subject fMRI reliability in controls, we show that not all tasks are suitable in the clinical context, even if they show meaningful results at the group level. Comparison of the fMRI results from patients to DES showed good correspondence between techniques (odds ratio 36). Provided that validated and reliable fMRI protocols are used, fMRI can accurately delineate eloquent areas, thus providing an aid to medical decision-making regarding brain tumor surgery. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  5. MimiLook: A Phylogenetic Workflow for Detection of Gene Acquisition in Major Orthologous Groups of Megavirales.

    Science.gov (United States)

    Jain, Sourabh; Panda, Arup; Colson, Philippe; Raoult, Didier; Pontarotti, Pierre

    2017-04-07

    With the inclusion of new members, understanding the evolutionary mechanisms and processes by which members of the proposed order Megavirales have evolved has become a key area of interest. The central role of gene acquisition has been shown in previous studies. However, the major drawback in gene acquisition studies is the focus on a few Megavirales families or putative families with large variation in their genetic structure. Thus, here we have tried to develop a methodology by which we can detect horizontal gene transfers (HGTs), taking into consideration orthologous groups of distantly related Megavirales families. Here, we report an automated workflow, MimiLook, prepared as a Perl command-line program, that deduces orthologous groups (OGs) from ORFomes of Megavirales and constructs phylogenetic trees by performing alignment generation, alignment editing and protein-protein BLAST (BLASTP) searching across the National Center for Biotechnology Information (NCBI) non-redundant (nr) protein sequence database. Finally, this tool detects statistically validated events of gene acquisition with the help of the T-REX algorithm by comparing individual gene trees with the NCBI species tree. In between the steps, the workflow decides about handling paralogs, filtering outputs, identifying Megavirales-specific OGs and detecting HGTs, along with retrieval of information about those OGs that are monophyletic with organisms from cellular domains of life. By implementing MimiLook, we noticed that nine percent of Megavirales gene families (i.e., OGs) have been acquired by HGT, 80% of OGs were Megavirales-specific, eight percent were found to share common ancestry with members of cellular domains (Eukaryota, Bacteria, Archaea, phages or other viruses) and three percent were ambivalent. The results are briefly discussed to emphasize the methodology. MimiLook is also relevant for detecting evolutionary scenarios in other targeted phyla with user-defined modifications. It can be accessed at

  6. Identifying Urinary and Serum Exosome Biomarkers for Radiation Exposure Using a Data Dependent Acquisition and SWATH-MS Combined Workflow

    International Nuclear Information System (INIS)

    Kulkarni, Shilpa; Koller, Antonius; Mani, Kartik M.; Wen, Ruofeng; Alfieri, Alan; Saha, Subhrajit; Wang, Jian; Patel, Purvi; Bandeira, Nuno; Guha, Chandan

    2016-01-01

    Purpose: Early and accurate assessment of radiation injury by radiation-responsive biomarkers is critical for triage and early intervention. Biofluids such as urine and serum are convenient for such analysis. Recent research has also suggested that exosomes are a reliable source of biomarkers in disease progression. In the present study, we analyzed the total urine proteome and exosomes isolated from urine or serum for potential biomarkers of acute and persistent radiation injury in mice exposed to lethal whole body irradiation (WBI). Methods and Materials: For feasibility studies, the mice were irradiated at 10.4 Gy WBI, and urine and serum samples were collected 24 and 72 hours after irradiation. Exosomes were isolated and analyzed using a liquid chromatography-tandem mass spectrometry (LC-MS/MS) based workflow for radiation exposure signatures. A data dependent acquisition and SWATH-MS combined workflow approach was used to identify significant exosome biomarkers indicative of acute or persistent radiation-induced responses. For the validation studies, mice were exposed to 3, 6, 8, or 10 Gy WBI, and samples were analyzed for comparison. Results: A comparison between total urine proteomics and urine exosome proteomics demonstrated that exosome proteomic analysis was superior in identifying radiation signatures. Feasibility studies identified 23 biomarkers from urine and 24 biomarkers from serum exosomes after WBI. Urinary exosome signatures identified different physiological parameters than the ones obtained in serum exosomes. Exosome signatures from urine indicated injury to the liver, gastrointestinal, and genitourinary tracts. In contrast, serum showed vascular injuries and acute inflammation in response to radiation. Selected urinary exosomal biomarkers also showed changes at lower radiation doses in validation studies. Conclusions: Exosome proteomics revealed radiation- and time-dependent protein signatures after WBI. A total of 47 differentially secreted

  7. Identifying Urinary and Serum Exosome Biomarkers for Radiation Exposure Using a Data Dependent Acquisition and SWATH-MS Combined Workflow

    Energy Technology Data Exchange (ETDEWEB)

    Kulkarni, Shilpa [Department of Radiation Oncology, Albert Einstein College of Medicine, Bronx, New York (United States); Koller, Antonius [Proteomics Center, Stony Brook University School of Medicine, Stony Brook, New York (United States); Proteomics Shared Resource, Herbert Irving Comprehensive Cancer Center, New York, New York (United States); Mani, Kartik M. [Department of Radiation Oncology, Albert Einstein College of Medicine, Bronx, New York (United States); Wen, Ruofeng [Department of Applied Mathematics and Statistics, Stony Brook University, Stony Brook, New York (United States); Alfieri, Alan; Saha, Subhrajit [Department of Radiation Oncology, Albert Einstein College of Medicine, Bronx, New York (United States); Wang, Jian [Center for Computational Mass Spectrometry, University of California, San Diego, California (United States); Department of Computer Science and Engineering, University of California, San Diego, California (United States); Patel, Purvi [Proteomics Shared Resource, Herbert Irving Comprehensive Cancer Center, New York, New York (United States); Department of Pharmacological Sciences, Stony Brook University, Stony Brook, New York (United States); Bandeira, Nuno [Center for Computational Mass Spectrometry, University of California, San Diego, California (United States); Department of Computer Science and Engineering, University of California, San Diego, California (United States); Skaggs School of Pharmacy and Pharmaceutical Sciences, University of California, San Diego, California (United States); Guha, Chandan, E-mail: cguha@montefiore.org [Department of Radiation Oncology, Albert Einstein College of Medicine, Bronx, New York (United States); and others

    2016-11-01

    Purpose: Early and accurate assessment of radiation injury by radiation-responsive biomarkers is critical for triage and early intervention. Biofluids such as urine and serum are convenient for such analysis. Recent research has also suggested that exosomes are a reliable source of biomarkers in disease progression. In the present study, we analyzed the total urine proteome and exosomes isolated from urine or serum for potential biomarkers of acute and persistent radiation injury in mice exposed to lethal whole body irradiation (WBI). Methods and Materials: For feasibility studies, the mice were irradiated at 10.4 Gy WBI, and urine and serum samples were collected 24 and 72 hours after irradiation. Exosomes were isolated and analyzed using a liquid chromatography-tandem mass spectrometry (LC-MS/MS) based workflow for radiation exposure signatures. A data dependent acquisition and SWATH-MS combined workflow approach was used to identify significant exosome biomarkers indicative of acute or persistent radiation-induced responses. For the validation studies, mice were exposed to 3, 6, 8, or 10 Gy WBI, and samples were analyzed for comparison. Results: A comparison between total urine proteomics and urine exosome proteomics demonstrated that exosome proteomic analysis was superior in identifying radiation signatures. Feasibility studies identified 23 biomarkers from urine and 24 biomarkers from serum exosomes after WBI. Urinary exosome signatures identified different physiological parameters than the ones obtained in serum exosomes. Exosome signatures from urine indicated injury to the liver, gastrointestinal, and genitourinary tracts. In contrast, serum showed vascular injuries and acute inflammation in response to radiation. Selected urinary exosomal biomarkers also showed changes at lower radiation doses in validation studies. Conclusions: Exosome proteomics revealed radiation- and time-dependent protein signatures after WBI. A total of 47 differentially secreted

  8. Customized Consensus Spectral Library Building for Untargeted Quantitative Metabolomics Analysis with Data Independent Acquisition Mass Spectrometry and MetaboDIA Workflow.

    Science.gov (United States)

    Chen, Gengbo; Walmsley, Scott; Cheung, Gemmy C M; Chen, Liyan; Cheng, Ching-Yu; Beuerman, Roger W; Wong, Tien Yin; Zhou, Lei; Choi, Hyungwon

    2017-05-02

    Data independent acquisition-mass spectrometry (DIA-MS) coupled with liquid chromatography is a promising approach for rapid, automatic sampling of MS/MS data in untargeted metabolomics. However, wide isolation windows in DIA-MS generate MS/MS spectra containing a mixed population of fragment ions together with their precursor ions. A comprehensive MS/MS spectral library providing the precursor-fragment ion map is therefore crucial for relative quantification of fragment ions uniquely representative of each precursor ion. However, existing reference libraries are not sufficient for this purpose, since the fragmentation patterns of small molecules can vary in different instrument setups. Here we developed a bioinformatics workflow called MetaboDIA to build customized MS/MS spectral libraries using a user's own data dependent acquisition (DDA) data and to perform MS/MS-based quantification with DIA data, thus complementing conventional MS1-based quantification. MetaboDIA also allows users to build a spectral library directly from DIA data in studies with a large sample size. Using a marine algae data set, we show that quantification of fragment ions extracted with a customized MS/MS library can provide quantitative data as reliable as direct quantification of precursor ions based on MS1 data. To test its applicability to complex samples, we applied MetaboDIA to a clinical serum metabolomics data set, where we built a DDA-based spectral library containing consensus spectra for 1829 compounds. Using this library, we performed fragment ion quantification on the DIA data, yielding sensitive differential expression analysis.
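
    MetaboDIA builds consensus MS/MS spectra per compound from repeated DDA identifications and then uses the library's precursor-fragment map to extract fragment-ion quantities from DIA data. The snippet below sketches only the consensus-building idea, binning fragment m/z values across replicate spectra of one precursor and averaging their intensities; it is an assumption-laden simplification, not the MetaboDIA algorithm.

        # Rough sketch of consensus spectrum building (not the MetaboDIA implementation):
        # group fragment peaks from replicate DDA spectra of one precursor into m/z bins,
        # keep bins seen in enough replicates, and average their m/z and intensities.
        from collections import defaultdict

        def consensus_spectrum(replicate_spectra, tol_mz=0.01, min_fraction=0.5):
            bins = defaultdict(list)  # rounded m/z -> list of (mz, intensity)
            for spectrum in replicate_spectra:
                for mz, inten in spectrum:
                    bins[round(mz / tol_mz)].append((mz, inten))
            n = len(replicate_spectra)
            consensus = []
            for peaks in bins.values():
                if len(peaks) / n >= min_fraction:
                    avg_mz = sum(p[0] for p in peaks) / len(peaks)
                    avg_int = sum(p[1] for p in peaks) / len(peaks)
                    consensus.append((avg_mz, avg_int))
            return sorted(consensus)

        # Synthetic replicates of the same precursor, for demonstration only;
        # the one-off 201.1 peak is dropped by the replicate-fraction filter.
        reps = [[(85.029, 100.0), (129.052, 40.0)],
                [(85.031, 90.0), (129.053, 35.0), (201.1, 5.0)]]
        print(consensus_spectrum(reps, min_fraction=0.6))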

  9. Histogram-based normalization technique on human brain magnetic resonance images from different acquisitions.

    Science.gov (United States)

    Sun, Xiaofei; Shi, Lin; Luo, Yishan; Yang, Wei; Li, Hongpeng; Liang, Peipeng; Li, Kuncheng; Mok, Vincent C T; Chu, Winnie C W; Wang, Defeng

    2015-07-28

    Intensity normalization is an important preprocessing step in brain magnetic resonance image (MRI) analysis. During MR image acquisition, different scanners or parameters may be used for scanning different subjects or the same subject at different times, which may result in large intensity variations. This intensity variation greatly undermines the performance of subsequent MRI processing and population analysis, such as image registration, segmentation, and tissue volume measurement. In this work, we proposed a new histogram normalization method to reduce the intensity variation between MRIs obtained from different acquisitions. In our experiment, we scanned each subject twice on two different scanners using different imaging parameters. With noise estimation, the image with the lower noise level was determined and treated as the high-quality reference image. The histogram of the low-quality image was then normalized to the histogram of the high-quality image. The normalization algorithm includes two main steps: (1) intensity scaling (IS), where, for the high-quality reference image, the intensities of the image are first rescaled to a range between the low intensity region (LIR) value and the high intensity region (HIR) value; and (2) histogram normalization (HN), where the histogram of the low-quality image as input image is stretched to match the histogram of the reference image, so that the intensity range in the normalized image also lies between LIR and HIR. We performed three sets of experiments to evaluate the proposed method, i.e., image registration, segmentation, and tissue volume measurement, and compared it with an existing intensity normalization method. The results validate that our histogram normalization framework achieves better results in all the experiments. It is also demonstrated that the brain template with normalization preprocessing is of higher quality than the template with no normalization processing. We have proposed
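
    The two-step procedure described above (intensity scaling of the reference to a [LIR, HIR] range, then histogram matching of the low-quality image to the reference) can be sketched with standard array operations. The code below is a generic histogram-matching illustration written under those assumptions, not the authors' implementation; the LIR/HIR values and test images are placeholders.

        # Generic sketch of the two-step normalization (assumed, not the paper's code).
        import numpy as np

        def intensity_scale(img, lir, hir):
            """Step 1: linearly rescale reference intensities to [LIR, HIR]."""
            lo, hi = img.min(), img.max()
            return (img - lo) / (hi - lo) * (hir - lir) + lir

        def histogram_match(source, reference):
            """Step 2: map source intensities so its histogram matches the reference."""
            s_values, s_counts = np.unique(source.ravel(), return_counts=True)
            r_values, r_counts = np.unique(reference.ravel(), return_counts=True)
            s_cdf = np.cumsum(s_counts) / source.size
            r_cdf = np.cumsum(r_counts) / reference.size
            mapped = np.interp(s_cdf, r_cdf, r_values)   # CDF-to-CDF lookup table
            return mapped[np.searchsorted(s_values, source.ravel())].reshape(source.shape)

        rng = np.random.default_rng(0)
        reference = intensity_scale(rng.normal(100, 20, (64, 64)), lir=10, hir=200)
        low_quality = rng.normal(80, 30, (64, 64))
        normalized = histogram_match(low_quality, reference)
        print(normalized.min(), normalized.max())  # now spans roughly [LIR, HIR]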

  10. Digital Workflow for the Acquisition and Elaboration of 3d Data in a Monumental Complex: the Fortress of Saint John the Baptist in Florence

    Science.gov (United States)

    Tucci, G.; Bonora, V.; Conti, A.; Fiorini, L.

    2017-08-01

    In recent years, the GeCo Laboratory has undertaken numerous projects to digitize vast and complex buildings; the specific nature of the different projects has resulted in a case-by-case approach, each time building on past experiences and updating not only the hardware and software tools but also the management and processing methods. This paper presents the workflow followed for the survey of the Fortress of Saint John the Baptist in Florence, an on-going interdisciplinary project. The fortress is presently Florence's main trade fair and congress centre, while at the same time hosting various buildings that bear witness to its life-history, combining constructions from the Medici and Lorraine eras with recently built exhibition facilities. New research has now been required owing to the construction of new pavilions and the regeneration of the whole complex. This has included a critical survey, material testing, diagnostic investigations and stratigraphic analyses to define the building's state of preservation. The working group comprises specialists from different institutions, amongst which the Italian Military Geographic Institute, the University of Florence, the National Research Council Institute for the Preservation and Enhancement of the Cultural Heritage, and the Florence City Council.

  11. Bioconductor workflow for single-cell RNA sequencing: Normalization, dimensionality reduction, clustering, and lineage inference [version 1; referees: 1 approved, 2 approved with reservations

    Directory of Open Access Journals (Sweden)

    Fanny Perraudeau

    2017-07-01

    Novel single-cell transcriptome sequencing assays allow researchers to measure gene expression levels at the resolution of single cells and offer the unprecedented opportunity to investigate fundamental biological questions at the molecular level, such as stem cell differentiation or the discovery and characterization of rare cell types. However, such assays raise challenging statistical and computational questions and require the development of novel methodology and software. Using stem cell differentiation in the mouse olfactory epithelium as a case study, this integrated workflow provides a step-by-step tutorial to the methodology and associated software for the following four main tasks: (1) dimensionality reduction accounting for zero inflation and overdispersion and adjusting for gene- and cell-level covariates; (2) cell clustering using resampling-based sequential ensemble clustering; (3) inference of cell lineages and pseudotimes; and (4) differential expression analysis along lineages.

  12. Considering Time in Orthophotography Production: from a General Workflow to a Shortened Workflow for a Faster Disaster Response

    Science.gov (United States)

    Lucas, G.

    2015-08-01

    This article deals with the production time of orthophoto imagery acquired with a medium-size digital frame camera. The workflow examination follows two main parts: data acquisition and post-processing. The objectives of the research are fourfold: 1/ gathering time references for the most important steps of orthophoto production (it turned out that the literature is missing on this topic); these figures are used later for total production time estimation; 2/ identifying levers for reducing orthophoto production time; 3/ building a simplified production workflow for emergency response, less exigent with accuracy and faster, and comparing it to a classical workflow; 4/ providing methodical elements for the estimation of production time for a custom project. In the data acquisition part, a comprehensive review lists and describes all the factors that may affect acquisition efficiency. Using a simulation with different variables (average line length, time of the turns, flight speed), their effect on acquisition efficiency is quantitatively examined. Regarding post-processing, the time reference figures were collected from the processing of a 1000-frame case study with 15 cm GSD covering a rectangular area of 447 km2; the time required to achieve each step of the production is written down. When several technical options are possible, each one is tested and its time documented so that all alternatives are available. Based on a technical choice within the workflow and using the compiled time references of the elementary steps, a total time is calculated for the post-processing of the 1000 frames. Two scenarios are compared with regard to time and accuracy. The first one follows the "normal" practices, comprising triangulation, orthorectification and advanced mosaicking methods (feature detection, seam line editing and seam applicator); the second is simplified and makes compromises on positional accuracy (using direct geo-referencing) and seam line preparation in order to achieve
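
    The acquisition part of the study simulates how average line length, turnaround time and flight speed drive acquisition efficiency. The function below is a minimal re-creation of that kind of estimate under simple assumptions (constant speed, fixed turn time, parallel flight lines); the numbers are illustrative and are not taken from the article.

        # Simple flight-time estimate for a block of parallel lines (illustrative assumptions).
        def acquisition_time_hours(n_lines, line_length_km, speed_kmh, turn_time_min):
            flying = n_lines * line_length_km / speed_kmh            # hours on the lines
            turning = (n_lines - 1) * turn_time_min / 60.0           # hours lost in turns
            return flying + turning

        def efficiency(n_lines, line_length_km, speed_kmh, turn_time_min):
            """Share of airborne time spent actually acquiring imagery."""
            total = acquisition_time_hours(n_lines, line_length_km, speed_kmh, turn_time_min)
            productive = n_lines * line_length_km / speed_kmh
            return productive / total

        # Longer lines waste proportionally less time in turns:
        for length in (5, 20, 50):  # km, made-up values
            print(length, round(efficiency(n_lines=20, line_length_km=length,
                                           speed_kmh=220, turn_time_min=3), 2))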

  13. Evaluation of Normalization Methods on GeLC-MS/MS Label-Free Spectral Counting Data to Correct for Variation during Proteomic Workflows

    Science.gov (United States)

    Gokce, Emine; Shuford, Christopher M.; Franck, William L.; Dean, Ralph A.; Muddiman, David C.

    2011-12-01

    Normalization of spectral counts (SpCs) in label-free shotgun proteomic approaches is important to achieve reliable relative quantification. Three different SpC normalization methods, total spectral count (TSpC) normalization, normalized spectral abundance factor (NSAF) normalization, and normalization to selected proteins (NSP), were evaluated based on their ability to correct for day-to-day variation between gel-based sample preparation and chromatographic performance. Three spectral counting data sets obtained from the same biological conidia sample of the rice blast fungus Magnaporthe oryzae were analyzed by 1D gel and liquid chromatography-tandem mass spectrometry (GeLC-MS/MS). Equine myoglobin and chicken ovalbumin were spiked into the protein extracts prior to 1D-SDS-PAGE as internal protein standards for NSP. The correlation between SpCs of the same proteins across the different data sets was investigated. We report that TSpC normalization and NSAF normalization yielded almost ideal slopes of unity for normalized SpC versus average normalized SpC plots, while NSP did not afford effective correction of the unnormalized data. Furthermore, when utilizing TSpC normalization prior to relative protein quantification, t-testing and fold-change analysis revealed the cutoff limits for determining real biological change to be a function of the absolute number of SpCs. For instance, we observed that the variance decreased as the number of SpCs increased, which resulted in a higher propensity for detecting statistically significant, yet artificial, change for highly abundant proteins. Thus, we suggest applying higher confidence levels and lower fold-change cutoffs for proteins with higher SpCs, rather than using a single criterion for the entire data set. By choosing appropriate cutoff values to maintain a constant false positive rate across different protein levels (i.e., SpC levels), it is expected this will reduce the overall false negative rate, particularly for proteins with
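
    Of the normalizations compared above, total spectral count (TSpC) scaling and NSAF have simple closed forms: TSpC rescales each run so its summed spectral counts match a common total, and NSAF divides each protein's SpC by its length and then by the sum of SpC/length over all proteins in the run. The snippet below is a small worked illustration of both formulas with invented counts, not the study's data.

        # Worked illustration of TSpC and NSAF normalization (invented example data).
        # Each run maps protein -> (spectral count, protein length in residues).
        run = {"protA": (50, 400), "protB": (10, 100), "protC": (40, 800)}
        reference_total = 120  # target total SpC shared by all runs (TSpC)

        total_spc = sum(spc for spc, _ in run.values())
        tspc = {p: spc * reference_total / total_spc for p, (spc, _) in run.items()}

        saf = {p: spc / length for p, (spc, length) in run.items()}  # SpC per residue
        nsaf = {p: v / sum(saf.values()) for p, v in saf.items()}    # normalized SAF

        print("TSpC:", tspc)
        print("NSAF:", {p: round(v, 3) for p, v in nsaf.items()})    # values sum to 1.0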

  14. Workflow in Almaraz NPP

    International Nuclear Information System (INIS)

    Gonzalez Crego, E.; Martin Lopez-Suevos, C.

    2000-01-01

    Almaraz NPP decided to incorporate Workflow into its information system in response to the need to provide exhaustive follow-up and monitoring of each phase of the different procedures it manages. Oracle's Workflow was chosen for this purpose and it was integrated with previously developed applications. The objectives to be met in the incorporation of Workflow were as follows: Strict monitoring of procedures and processes. Detection of bottlenecks in the flow of information. Notification of those affected by pending tasks. Flexible allocation of tasks to user groups. Improved monitoring of management procedures. Improved communication. Similarly, special care was taken to: Integrate workflow processes with existing control panels. Synchronize workflow with installation procedures. Ensure that the system reflects use of paper forms. At present the Corrective Maintenance Request module is being operated using Workflow and the Work Orders and Notice of Order modules are about to follow suit. (Author)

  15. Agreement Workflow Tool (AWT)

    Data.gov (United States)

    Social Security Administration — The Agreement Workflow Tool (AWT) is a role-based Intranet application used for processing SSA's Reimbursable Agreements according to SSA's standards. AWT provides...

  16. Cue acquisition: A feature of Malawian midwives decision making process to support normality during the first stage of labour.

    Science.gov (United States)

    Chodzaza, Elizabeth; Haycock-Stuart, Elaine; Holloway, Aisha; Mander, Rosemary

    2018-03-01

    to explore Malawian midwives' decision making when caring for women during the first stage of labour in the hospital setting. This focused ethnographic study examined the decision making process of 9 nurse-midwives with varying years of clinical experience in the real-world setting of an urban and a semi-urban hospital from October 2013 to May 2014. This was done using 27 participant observations and 27 post-observation in-depth interviews over a period of six months. Qualitative data analysis software, NVivo 10, was used to assist with data management for the analysis. All data were analysed using the principle of theme and category formation. Analysis revealed a six-stage process of decision making that includes establishing a baseline for labour, deciding to admit a woman to the labour ward, ascertaining the normal physiological progress of labour, supporting the normal physiological progress of labour, embracing uncertainty (the midwives' construction of unusual labour as normal), dealing with uncertainty, and deciding to intervene in unusual labour. This six-stage process of decision making is conceptualised as the 'role of cue acquisition', illustrating the ways in which midwives utilise their assessment of labouring women to reason and make decisions on how to care for them in labour. Cue acquisition involved the midwives piecing together segments of information they obtained from the women to formulate an understanding of the woman's birthing progress and inform the midwives' decision making process. This understanding of cue acquisition by midwives is significant for supporting safe care in the labour setting. When there was uncertainty in a woman's progress of labour, midwives used deductive reasoning, for example, by cross-checking and analysing the information obtained during the span of labour. Supporting normal labour physiological processes was identified as an underlying principle that shaped the midwives' clinical judgement and decision making when they cared for women in

  17. Querying Workflow Logs

    Directory of Open Access Journals (Sweden)

    Yan Tang

    2018-01-01

    A business process or workflow is an assembly of tasks that accomplishes a business goal. Business process management is the study of the design, configuration/implementation, enactment and monitoring, analysis, and re-design of workflows. The traditional methodology for the re-design and improvement of workflows relies on the well-known sequence of extract, transform, and load (ETL), data/process warehousing, and online analytical processing (OLAP) tools. In this paper, we study the ad hoc querying of process enactments for (data-centric) business processes, bypassing the traditional methodology for more flexibility in querying. We develop an algebraic query language based on “incident patterns” with four operators inspired by the Business Process Model and Notation (BPMN) representation, allowing the user to formulate ad hoc queries directly over workflow logs. A formal semantics of this query language, a preliminary query evaluation algorithm, and a group of elementary properties of the operators are provided.
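
    The query language above matches "incident patterns" directly against workflow logs. Without the paper's operator definitions at hand, the sketch below shows only the simplest case one might build on: testing whether a given sequence of task names occurs, in order, within one case's event log. All task names are invented for the example and this is not the paper's algebra.

        # Minimal sketch: does a case's event log contain the tasks of a pattern in order?
        # (Illustrative only; not the operators defined in the paper.)
        def matches_sequence(event_log, pattern):
            """event_log: task names in execution order; pattern: task names to find in order."""
            it = iter(event_log)
            return all(task in it for task in pattern)  # 'in' consumes the iterator as it scans

        log = ["receive_order", "check_credit", "approve", "ship", "invoice"]
        print(matches_sequence(log, ["check_credit", "ship"]))   # True
        print(matches_sequence(log, ["ship", "check_credit"]))   # False (wrong order)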

  18. Responsive web design workflow

    OpenAIRE

    LAAK, TIMO

    2013-01-01

    Responsive Web Design Workflow is a literature review about Responsive Web Design, a web standards based modern web design paradigm. The goals of this research were to define what responsive web design is, determine its importance in building modern websites and describe a workflow for responsive web design projects. Responsive web design is a paradigm to create adaptive websites, which respond to the properties of the media that is used to render them. The three key elements of responsi...

  19. Workflows for Full Waveform Inversions

    Science.gov (United States)

    Boehm, Christian; Krischer, Lion; Afanasiev, Michael; van Driel, Martin; May, Dave A.; Rietmann, Max; Fichtner, Andreas

    2017-04-01

    Despite many theoretical advances and the increasing availability of high-performance computing clusters, full seismic waveform inversions still face considerable challenges regarding data and workflow management. While the community has access to solvers which can harness modern heterogeneous computing architectures, the computational bottleneck has fallen to these often manpower-bounded issues that need to be overcome to facilitate further progress. Modern inversions involve huge amounts of data and require a tight integration between numerical PDE solvers, data acquisition and processing systems, nonlinear optimization libraries, and job orchestration frameworks. To this end we created a set of libraries and applications revolving around Salvus (http://salvus.io), a novel software package designed to solve large-scale full waveform inverse problems. This presentation focuses on solving passive source seismic full waveform inversions from local to global scales with Salvus. We discuss (i) design choices for the aforementioned components required for full waveform modeling and inversion, (ii) their implementation in the Salvus framework, and (iii) how it is all tied together by a usable workflow system. We combine state-of-the-art algorithms ranging from high-order finite-element solutions of the wave equation to quasi-Newton optimization algorithms using trust-region methods that can handle inexact derivatives. All is steered by an automated interactive graph-based workflow framework capable of orchestrating all necessary pieces. This naturally facilitates the creation of new Earth models and hopefully sparks new scientific insights. Additionally, and even more importantly, it enhances reproducibility and reliability of the final results.

  20. Development of normal fetal brain by MRI with a half-Fourier rapid acquisition with relaxation enhancement sequence

    International Nuclear Information System (INIS)

    Li Meilan; Liu Xuejun; Wang Jianhong; Zhao Cheng; Li Xiang

    2006-01-01

    Objective: To evaluate normal maturation of the fetal brain with half-Fourier rapid acquisition with relaxation enhancement (RARE) MRI. Methods: The normal brains of 25 fetuses of 12-38 weeks gestational age were examined in utero with half-Fourier RARE imaging. Gyrus maturation, gray and white matter differentiation, ventricle-to-brain diameter ratio, and subarachnoid space size were evaluated with respect to gestational age. Results: At 12-23 weeks, the brain had a smooth surface, and two or three layers were differentiated in the cerebral cortex. At 24-26 weeks, only a few shallow grooves were seen in the central sulcus, and three layers, including the immature cortex, intermediate zone, and germinal matrix, were differentiated in all fetuses. At 27-29 weeks, sulcus formation was observed in various regions of the brain parenchyma, and the germinal matrix became invisible. Sulcation was seen in the whole cerebral cortex from 30 weeks on. However, the cortex did not undergo infolding, and opercular formation was not seen before 33 weeks. At 23 weeks and earlier, the cerebral ventricles were large; thereafter, they gradually became smaller. The subarachnoid space overlying the cortical convexities was slightly dilated at all gestational ages, most markedly at 21-26 weeks. Conclusion: Changes in brain maturation proceed through stages in an orderly and predictable fashion and can be evaluated reliably with half-Fourier RARE MRI. (authors)

  1. Workflow management: an overview

    NARCIS (Netherlands)

    Ouyang, C.; Adams, M.; Wynn, M.T.; Hofstede, ter A.H.M.; Brocke, vom J.; Rosemann, M.

    2010-01-01

    Workflow management has its origin in the office automation systems of the seventies, but it is not until fairly recently that conceptual and technological breakthroughs have led to its widespread adoption. In fact, nowadays, process-awareness has become an accepted and integral part of various types

  2. Ferret Workflow Anomaly Detection System

    National Research Council Canada - National Science Library

    Smith, Timothy J; Bryant, Stephany

    2005-01-01

    The Ferret workflow anomaly detection system project 2003-2004 has provided validation and anomaly detection in accredited workflows in secure knowledge management systems through the use of continuous, automated audits...

  3. Language Acquisition and Assessment in Normal and Handicapped Preschool Children: A Review of the Literature. Final Report. Volume II.

    Science.gov (United States)

    Longhurst, Thomas M.

    The second of four documents provides a summary of the scientific literature pertaining to spontaneous language acquisition in handicapped preschool children, and reviews and evaluates procedures for assessing language acquisition in these children. Chapter 1 focuses on language development in nonhandicapped children after they have acquired their…

  4. Insightful Workflow For Grid Computing

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Charles Earl

    2008-10-09

    We developed a workflow adaptation and scheduling system for Grid workflows. The system currently interfaces with and uses the Karajan workflow system. We developed machine learning agents that provide the planner/scheduler with information needed to make decisions about when and how to replan. The Kubrick system restructures workflows at runtime, making it unique among workflow scheduling systems. The existing Kubrick system provides a platform on which to integrate additional quality-of-service constraints and in which to explore the use of an ensemble of scheduling and planning algorithms. This will be the principal thrust of our Phase II work.

  5. Workflow User Interfaces Patterns

    Directory of Open Access Journals (Sweden)

    Jean Vanderdonckt

    2012-03-01

    A collection of user interface design patterns for workflow information systems is presented that contains forty-three resource patterns classified in seven categories. These categories and their corresponding patterns have been logically identified from the task life cycle based on offering and allocation operations. Each Workflow User Interface Pattern (WUIP) is characterized by properties expressed in the PLML markup language for expressing patterns and augmented by additional attributes and models attached to the pattern: the abstract user interface and the corresponding task model. These models are specified in a User Interface Description Language. All WUIPs are stored in a library and can be retrieved within a workflow editor that links each workflow pattern to its corresponding WUIP, thus giving rise to a user interface for each workflow pattern.

  6. The usefulness and the problems of attenuation correction using simultaneous transmission and emission data acquisition method. Studies on normal volunteers and phantom

    International Nuclear Information System (INIS)

    Kijima, Tetsuji; Kumita, Shin-ichiro; Mizumura, Sunao; Cho, Keiichi; Ishihara, Makiko; Toba, Masahiro; Kumazaki, Tatsuo; Takahashi, Munehiro.

    1997-01-01

    Attenuation correction using a simultaneous transmission data (TCT) and emission data (ECT) acquisition method was applied to 201Tl myocardial SPECT in ten normal adults and a phantom in order to validate the efficacy of attenuation correction using this method. The normal adult studies demonstrated improved 201Tl accumulation in the septal wall and the posterior wall of the left ventricle and relatively decreased activities in the lateral wall with attenuation correction. Organs with high 201Tl uptake, such as the liver and the stomach, pushed up the activities in the septal wall and the posterior wall. Cardiac dynamic phantom studies showed that the partial volume effect due to cardiac motion contributed to under-correction of the apex, which might be overcome using gated SPECT. Although simultaneous TCT and ECT acquisition was considered an advantageous method for attenuation correction, mis-correction of specific myocardial segments should be taken into account when assessing attenuation-corrected images. (author)

  7. Data Workflow - A Workflow Model for Continuous Data Processing

    NARCIS (Netherlands)

    Wombacher, Andreas

    2010-01-01

    Online data or streaming data are getting more and more important for enterprise information systems, e.g. by integrating sensor data and workflows. The continuous flow of data provided e.g. by sensors requires new workflow models addressing the data perspective of these applications, since

  8. Parametric Room Acoustic Workflows

    DEFF Research Database (Denmark)

    Parigi, Dario; Svidt, Kjeld; Molin, Erik

    2017-01-01

    The paper investigates and assesses different room acoustics software and the opportunities they offer to engage in a parametric acoustics workflow and to influence architectural designs. The first step consists in the testing and benchmarking of different tools on the basis of accuracy, speed and interoperability with Grasshopper 3d. The focus is placed on the benchmarking of three different acoustic analysis tools based on ray tracing. To compare the accuracy and speed of the acoustic evaluation across different tools, a homogeneous set of acoustic parameters is chosen. The room acoustics parameters included in the set are reverberation time (EDT, RT30), clarity (C50), loudness (G), and definition (D50). Scenarios are discussed for determining the most suitable acoustic tool at different design stages. Those scenarios are characterized by the use of less accurate but fast evaluation tools to be used

  9. Implementing Oracle Workflow

    CERN Document Server

    Mathieson, D W

    1999-01-01

    CERN (see [CERN]) is the world's largest physics research centre. Currently there are around 5,000 people working at the CERN site located on the border of France and Switzerland near Geneva along with another 4,000 working remotely at institutes situated all around the globe. CERN is currently working on the construction of our newest scientific instrument called the Large Hadron Collider (LHC); the construction alone of this 27-kilometre particle accelerator will not complete until 2005. Like many businesses in the current economic climate CERN is expected to continue growing, yet staff numbers are planned to fall in the coming years. In essence, do more with less. In an environment such as this, it is critical that the administration is as efficient as possible. One of the ways that administrative procedures are streamlined is by the use of an organisation-wide workflow system.

  10. Digital workflows in contemporary orthodontics

    Directory of Open Access Journals (Sweden)

    Lars R Christensen

    2017-01-01

    Digital workflows are now increasingly possible in orthodontic practice. Workflows designed to improve the customization of orthodontic appliances are now available through laboratories and orthodontic manufacturing facilities in many parts of the world. These now have the potential to improve certain aspects of patient care.

  11. Constructing Workflows from Script Applications

    Directory of Open Access Journals (Sweden)

    Mikołaj Baranowski

    2012-01-01

    For programming and executing complex applications on grid infrastructures, scientific workflows have been proposed as a convenient high-level alternative to solutions based on general-purpose programming languages, APIs and scripts. GridSpace is a collaborative programming and execution environment which is based on a scripting approach and extends the Ruby language with a high-level API for invoking operations on remote resources. In this paper we describe a tool which converts GridSpace application source code into a workflow representation which, in turn, may be used for scheduling, provenance, or visualization. We describe how we addressed the issues of analyzing Ruby source code, resolving variable and method dependencies, and building the workflow representation. The solutions to these problems have been developed and evaluated by testing them on complex grid application workflows such as CyberShake, Epigenomics and Montage. The evaluation is enriched by representing typical workflow control-flow patterns.
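
    The tool described above converts GridSpace Ruby scripts into a workflow representation by resolving variable and method dependencies. As a language-neutral illustration of the underlying idea, the sketch below derives dependency edges from a list of statements annotated with the variables they read and write; it is an invented toy model, not the authors' Ruby analysis.

        # Toy model: build data-dependency edges from statements with known
        # read/write sets (a stand-in for real source-code analysis).
        def build_dependency_edges(statements):
            """statements: list of (name, reads, writes); returns (producer, consumer) edges."""
            last_writer = {}
            edges = []
            for name, reads, writes in statements:
                for var in reads:
                    if var in last_writer:
                        edges.append((last_writer[var], name))
                for var in writes:
                    last_writer[var] = name
            return edges

        script = [
            ("load_data",  [],        ["raw"]),
            ("clean",      ["raw"],   ["table"]),
            ("plot",       ["table"], ["figure"]),
            ("summarize",  ["table"], ["report"]),
        ]
        print(build_dependency_edges(script))
        # [('load_data', 'clean'), ('clean', 'plot'), ('clean', 'summarize')]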

  12. Examining daily activity routines of older adults using workflow.

    Science.gov (United States)

    Chung, Jane; Ozkaynak, Mustafa; Demiris, George

    2017-07-01

    We evaluated the value of workflow analysis supported by a novel visualization technique to better understand the daily routines of older adults and highlight their patterns of daily activities and normal variability in physical functions. We used a self-reported activity diary to obtain data from six community-dwelling older adults for 14 consecutive days. Workflow for daily routine was analyzed using the EventFlow tool, which aggregates workflow information to highlight patterns and variabilities. A total of 1453 events were included in the data analysis. To demonstrate the patterns and variability of each individual's daily activities, participant activity workflows were visualized and compared. The workflow analysis revealed great variability in activity types, regularity, frequency, duration, and timing of performing certain activities across individuals. Also, when workflow approach was applied to spatial information of activities, the analysis revealed the ability to provide meaningful data on individuals' mobility in different levels of life spaces from home to community. Results suggest that using workflows to characterize the daily activities of older adults will be helpful for clinicians and researchers in understanding their daily routines and preparing education and prevention strategies tailored to each individual's activity level. This tool also has the potential to be integrated into consumer informatics technologies, such as patient portals or personal health records, so that consumers may be encouraged to become actively involved in monitoring and managing their health. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. ATLAS Grid Workflow Performance Optimization

    CERN Document Server

    Elmsheuser, Johannes; The ATLAS collaboration

    2018-01-01

    The CERN ATLAS experiment grid workflow system manages routinely 250 to 500 thousand concurrently running production and analysis jobs to process simulation and detector data. In total more than 300 PB of data is distributed over more than 150 sites in the WLCG. At this scale small improvements in the software and computing performance and workflows can lead to significant resource usage gains. ATLAS is reviewing together with CERN IT experts several typical simulation and data processing workloads for potential performance improvements in terms of memory and CPU usage, disk and network I/O. All ATLAS production and analysis grid jobs are instrumented to collect many performance metrics for detailed statistical studies using modern data analytics tools like ElasticSearch and Kibana. This presentation will review and explain the performance gains of several ATLAS simulation and data processing workflows and present analytics studies of the ATLAS grid workflows.

  14. Privacy-aware workflow management

    NARCIS (Netherlands)

    Alhaqbani, B.; Adams, M.; Fidge, C.J.; Hofstede, ter A.H.M.; Glykas, M.

    2013-01-01

    Information security policies play an important role in achieving information security. Confidentiality, Integrity, and Availability are classic information security goals attained by enforcing appropriate security policies. Workflow Management Systems (WfMSs) also benefit from inclusion of these

  15. Summer Student Report - AV Workflow

    CERN Document Server

    Abramson, Jessie

    2014-01-01

    The AV Workflow is a web application which allows CERN users to publish, update and delete videos from CDS. During my summer internship I implemented the backend of the new version of the AV Workflow in Python using the Django framework.

  16. ERROR HANDLING IN INTEGRATION WORKFLOWS

    Directory of Open Access Journals (Sweden)

    Alexey M. Nazarenko

    2017-01-01

    Simulation experiments performed while solving multidisciplinary engineering and scientific problems require joint usage of multiple software tools. Further, when following a preset plan of experiment or searching for optimum solutions, the same sequence of calculations is run multiple times with various simulation parameters, input data, or conditions while the overall workflow does not change. Automation of simulations like these requires implementing a workflow where tool execution and data exchange are usually controlled by a special type of software, an integration environment or platform. The result is an integration workflow (a platform-dependent implementation of some computing workflow) which, in the context of automation, is a composition of weakly coupled (in terms of communication intensity) typical subtasks. These compositions can then be decomposed back into a few workflow patterns (types of subtask interaction). The patterns, in their turn, can be interpreted as higher-level subtasks. This paper considers the execution control and data exchange rules that should be imposed by the integration environment in the case of an error encountered by some integrated software tool. An error is defined as any abnormal behavior of a tool that invalidates its result data, thus disrupting the data flow within the integration workflow. The main requirement to the error handling mechanism implemented by the integration environment is to prevent abnormal termination of the entire workflow in case of missing intermediate results data. Error handling rules are formulated on the basic pattern level and on the level of a composite task that can combine several basic patterns as next-level subtasks. The cases where workflow behavior may differ depending on the user's purposes when an error takes place, and possible error handling options that can be specified by the user, are also noted in the work.

  17. Development of the workflow kine systems for support on KAIZEN.

    Science.gov (United States)

    Mizuno, Yuki; Ito, Toshihiko; Yoshikawa, Toru; Yomogida, Satoshi; Morio, Koji; Sakai, Kazuhiro

    2012-01-01

    In this paper, we introduce a new workflow line system consisting of location and image recording, which enables the acquisition of workflow information and its analysis and display. From the results of a workflow line investigation, we considered the anticipated effects and the problems for KAIZEN. Workflow line information includes location information and action content information. These technologies suggest viewpoints to help improvement, for example, exclusion of useless movement, the redesign of layout, and the review of work procedures. In a manufacturing factory, it was clear that there was much movement away from the standard operation place and accumulated residence time. As a result of this investigation, a more efficient layout was suggested by this system. In the case of the hospital, similarly, it was pointed out that the workflow has problems of layout and setup operations based on the effective movement patterns of the experts. This system could adapt to routine as well as non-routine work. By the development of this system, which can fit and adapt to industrial diversification, more effective "visual management" (visualization of work) is expected in the future.

  18. The CMS tracker calibration workflow: Experience with cosmic ray data

    International Nuclear Information System (INIS)

    Frosali, Simone

    2010-01-01

    During the second part of 2008, CMS commissioning was performed with the acquisition of cosmic events in global runs. Cosmic rays detected in the muon chambers were used to trigger the readout of all CMS subdetectors in the general data acquisition system. A total of about 300M tracks was collected by the CMS muon chambers with the 3.8 T magnetic field produced by the CMS superconducting solenoid, 6M of which pointed to the tracker region and were reconstructed by the Si-Strip Tracker (SST) detectors. Another 1M cosmic tracks were collected with the magnetic field off. Using the available cosmic data it was possible to validate the performance of the CMS tracker calibration workflows. In this paper the adopted calibration workflow is described. In particular, the three main calibration workflows required for the low-level reconstruction of the SST, i.e. gain calibration, Lorentz angle calibration and bad components identification, are described. The results obtained using cosmic tracks for these three calibration workflows are also presented.

  19. A new Approach to the Study of Russian Language Acquisition in Preschool Children with Normal and Abnormal Development

    Directory of Open Access Journals (Sweden)

    Lebedeva T.V

    2014-11-01

    We discuss the possibilities of using a standardized method for the psychological evaluation of Russian language development in preschool children. We provide a rationale for the relevance of timely differentiation of children with language and speech difficulties in modern educational practice. We present the results of a comparative analysis of language and speech development in two groups of children 5-6 years old: normally developing (N=92) and with language and speech disorders (N=59). We describe the diagnostic potential of this research tool for a clinical sample of children with speech and language disorders and reveal differences in the development of Russian language between the two groups of children. The data obtained can be used in solving the problems of differentiated correctional help to pre-school children with impaired language and speech development.

  20. Developing 3D Imaging Programmes-Workflow and Quality Control

    OpenAIRE

    Hess, M.; Robson, S.; Serpico, M.; Amati, G.; Pridden, I.; Nelson, T.

    2016-01-01

    This article reports on a successful project for 3D imaging research, digital applications, and use of new technologies in the museum. The article will focus on the development and implementation of a viable workflow for the production of high-quality 3D models of museum objects, based on the 3D laser scanning and photogrammetry of selected ancient Egyptian artefacts. The development of a robust protocol for the complete process chain for imaging cultural heritage artefacts, from the acquisit...

  1. Concurrency & Asynchrony in Declarative Workflows

    DEFF Research Database (Denmark)

    Debois, Søren; Hildebrandt, Thomas; Slaats, Tijs

    2015-01-01

    In this paper, we propose a notion of concurrency for declarative process models, formulated in the context of Dynamic Condition Response (DCR) graphs, and exploiting the so-called “true concurrency” semantics of Labelled Asynchronous Transition Systems. We demonstrate how this semantic underpinning of concurrency in DCR Graphs admits asynchronous execution of declarative workflows, both conceptually and by reporting on a prototype implementation of a distributed declarative workflow engine. Both the theoretical development and the implementation are supported by an extended example.

  2. Similarity measures for scientific workflows

    OpenAIRE

    Starlinger, Johannes

    2016-01-01

    Over the course of the last ten years, scientific workflows have gained attention as a tool for creating reproducible, data-processing in-silico experiments, into which both local scripts and applications as well as web services can be integrated. Such workflows can be published and reused via specialized online libraries, so-called repositories. With the increasing size of these repositories, similarity measures for scientific workfl...

  3. Scientific Workflow Management in Proteomics

    Science.gov (United States)

    de Bruin, Jeroen S.; Deelder, André M.; Palmblad, Magnus

    2012-01-01

    Data processing in proteomics can be a challenging endeavor, requiring extensive knowledge of many different software packages, all with different algorithms, data format requirements, and user interfaces. In this article we describe the integration of a number of existing programs and tools in Taverna Workbench, a scientific workflow manager currently being developed in the bioinformatics community. We demonstrate how a workflow manager provides a single, visually clear and intuitive interface to complex data analysis tasks in proteomics, from raw mass spectrometry data to protein identifications and beyond. PMID:22411703

  4. Analysing scientific workflows: Why workflows not only connect web services

    NARCIS (Netherlands)

    Wassink, I.; van der Vet, P.E.; Wolstencroft, K.; Neerincx, P.B.T.; Roos, M.; Rauwerda, H.; Breit, T.M.; Zhang, L.J.

    2009-01-01

    Life science workflow systems are developed to help life scientists to conveniently connect various programs and web services. In practice however, much time is spent on data conversion, because web services provided by different organisations use different data formats. We have analysed all the

  5. Analysing scientific workflows: why workflows not only connect web services

    NARCIS (Netherlands)

    Wassink, I.; van der Vet, P.E.; Wolstencroft, K.; Neerincx, P.B.T.; Roos, M.; Rauwerda, H.; Breit, T.M.; Zhang, LJ.

    2009-01-01

    Life science workflow systems are developed to help life scientists to conveniently connect various programs and web services. In practice however, much time is spent on data conversion, because web services provided by different organisations use different data formats. We have analysed all the

  6. Office 2010 Workflow Developing Collaborative Solutions

    CERN Document Server

    Mann, David; Enterprises, Creative

    2010-01-01

    Workflow is the glue that binds information worker processes, users, and artifacts. Without workflow, information workers are just islands of data and potential. Office 2010 Workflow details how to implement workflow in SharePoint 2010 and the client Microsoft Office 2010 suite to help information workers share data, enforce processes and business rules, and work more efficiently together or solo. This book covers everything you need to know-from what workflow is all about to creating new activities; from the SharePoint Designer to Visual Studio 2010; from out-of-the-box workflows to state mac

  7. CMS Distributed Computing Workflow Experience

    CERN Document Server

    Haas, Jeffrey David

    2010-01-01

    The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level, which contain re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail. The operational optimization of resource usage is described. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies as well as their impact on the delivery of physics results is discussed and lessons are drawn from this experience. The simul...

  8. New Interactions with Workflow Systems

    NARCIS (Netherlands)

    Wassink, I.; van der Vet, P.E.; van der Veer, Gerrit C.; Roos, M.; van Dijk, Elisabeth M.A.G.; Norros, L.; Koskinen, H.; Salo, L.; Savioja, P.

    2009-01-01

    This paper describes the evaluation of our early design ideas for an ad-hoc workflow system. Using the teach-back technique, we have performed a hermeneutic analysis of the mockup implementation named NIWS to get corrective and creative feedback at the functional, dialogue and representation level

  9. CMS distributed computing workflow experience

    Science.gov (United States)

    Adelman-McCarthy, Jennifer; Gutsche, Oliver; Haas, Jeffrey D.; Prosper, Harrison B.; Dutta, Valentina; Gomez-Ceballos, Guillelmo; Hahn, Kristian; Klute, Markus; Mohapatra, Ajit; Spinoso, Vincenzo; Kcira, Dorian; Caudron, Julien; Liao, Junhui; Pin, Arnaud; Schul, Nicolas; De Lentdecker, Gilles; McCartin, Joseph; Vanelderen, Lukas; Janssen, Xavier; Tsyganov, Andrey; Barge, Derek; Lahiff, Andrew

    2011-12-01

    The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level, which contain re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail. The operational optimization of resource usage is described. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies as well as their impact on the delivery of physics results is discussed and lessons are drawn from this experience. The simulation of proton-proton collisions for the CMS experiment is primarily carried out at the second tier of the CMS computing infrastructure. Half of the Tier-2 sites of CMS are reserved for central Monte Carlo (MC) production while the other half is available for user analysis. This paper summarizes the large throughput of the MC production operation during the data taking period of 2010 and discusses the latencies and efficiencies of the various types of MC production workflows. We present the operational procedures to optimize the usage of available resources and we present the operational model of CMS for including opportunistic resources, such as the larger Tier-3 sites, into the central production operation.

  10. CMS distributed computing workflow experience

    International Nuclear Information System (INIS)

    Adelman-McCarthy, Jennifer; Gutsche, Oliver; Haas, Jeffrey D; Prosper, Harrison B; Dutta, Valentina; Gomez-Ceballos, Guillelmo; Hahn, Kristian; Klute, Markus; Mohapatra, Ajit; Spinoso, Vincenzo; Kcira, Dorian; Caudron, Julien; Liao Junhui; Pin, Arnaud; Schul, Nicolas; Lentdecker, Gilles De; McCartin, Joseph; Vanelderen, Lukas; Janssen, Xavier; Tsyganov, Andrey

    2011-01-01

    The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level, which contain re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail. The operational optimization of resource usage is described. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies as well as their impact on the delivery of physics results is discussed and lessons are drawn from this experience. The simulation of proton-proton collisions for the CMS experiment is primarily carried out at the second tier of the CMS computing infrastructure. Half of the Tier-2 sites of CMS are reserved for central Monte Carlo (MC) production while the other half is available for user analysis. This paper summarizes the large throughput of the MC production operation during the data taking period of 2010 and discusses the latencies and efficiencies of the various types of MC production workflows. We present the operational procedures to optimize the usage of available resources and we present the operational model of CMS for including opportunistic resources, such as the larger Tier-3 sites, into the central production operation.

  11. Design Tools and Workflows for Braided Structures

    DEFF Research Database (Denmark)

    Vestartas, Petras; Heinrich, Mary Katherine; Zwierzycki, Mateusz

    2017-01-01

    and merits of our method, demonstrated through four example design and analysis workflows. The workflows frame specific aspects of enquiry for the ongoing research project flora robotica. These include modelling target geometries, automatically producing instructions for fabrication, conducting structural...

  12. The equivalency between logic Petri workflow nets and workflow nets.

    Science.gov (United States)

    Wang, Jing; Yu, ShuXia; Du, YuYue

    2015-01-01

    Logic Petri nets (LPNs) can describe and analyze batch processing functions and passing-value indeterminacy in cooperative systems. Logic Petri workflow nets (LPWNs) are proposed in this paper based on LPNs. Process mining is regarded as an important bridge between the modeling and analysis of data mining and business processes. Workflow nets (WF-nets) are an extension of Petri nets (PNs) and have successfully been used in process mining. Some shortcomings cannot be avoided in process mining, such as duplicate tasks, invisible tasks, and the noise of logs. An online shop in electronic commerce is modeled in this paper to prove the equivalence between LPWNs and WF-nets, and the advantages of LPWNs are presented.
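
    For readers less familiar with WF-nets, the following minimal Python sketch shows the classical Petri-net firing rule that both WF-nets and their logic extensions build on; the two-transition net is an illustrative toy, not the online-shop model analysed in the paper.

        # Standard Petri-net semantics: a transition is enabled when every input
        # place holds enough tokens; firing consumes input tokens and produces
        # output tokens. The toy net below moves one token from place i to place o.
        from collections import Counter

        transitions = {
            "register_order": {"pre": Counter({"i": 1}), "post": Counter({"p1": 1})},
            "ship_order": {"pre": Counter({"p1": 1}), "post": Counter({"o": 1})},
        }

        def enabled(marking, t):
            return all(marking[p] >= n for p, n in transitions[t]["pre"].items())

        def fire(marking, t):
            assert enabled(marking, t), f"{t} is not enabled"
            new = Counter(marking)
            new.subtract(transitions[t]["pre"])
            new.update(transitions[t]["post"])
            return +new  # drop places whose token count reached zero

        m = Counter({"i": 1})          # initial marking: one token on the source place
        m = fire(m, "register_order")
        m = fire(m, "ship_order")
        print(m)                       # Counter({'o': 1}): the net terminates properly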

  13. The Equivalency between Logic Petri Workflow Nets and Workflow Nets

    Science.gov (United States)

    Wang, Jing; Yu, ShuXia; Du, YuYue

    2015-01-01

    Logic Petri nets (LPNs) can describe and analyze batch processing functions and passing-value indeterminacy in cooperative systems. Logic Petri workflow nets (LPWNs) are proposed in this paper based on LPNs. Process mining is regarded as an important bridge between the modeling and analysis of data mining and business processes. Workflow nets (WF-nets) are an extension of Petri nets (PNs) and have successfully been used in process mining. Some shortcomings cannot be avoided in process mining, such as duplicate tasks, invisible tasks, and the noise of logs. An online shop in electronic commerce is modeled in this paper to prove the equivalence between LPWNs and WF-nets, and the advantages of LPWNs are presented. PMID:25821845

  14. Snakemake-a scalable bioinformatics workflow engine

    NARCIS (Netherlands)

    J. Köster (Johannes); S. Rahmann (Sven)

    2012-01-01

    textabstractSnakemake is a workflow engine that provides a readable Python-based workflow definition language and a powerful execution environment that scales from single-core workstations to compute clusters without modifying the workflow. It is the first system to support the use of automatically
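
    As an illustration of the Python-based workflow definition language mentioned above, the following is a minimal Snakemake-style sketch; the sample names, file layout and the mapping command are hypothetical placeholders rather than material from the paper.

        # Minimal Snakefile sketch (hypothetical sample names and commands).
        # Each rule declares its inputs and outputs; Snakemake infers execution
        # order and parallelism from these file dependencies.
        SAMPLES = ["a", "b"]

        rule all:
            # Pseudo-target requesting the final outputs of the workflow.
            input:
                expand("sorted/{sample}.bam", sample=SAMPLES)

        rule map_reads:
            input:
                "reads/{sample}.fastq"
            output:
                "mapped/{sample}.bam"
            shell:
                "hypothetical_mapper {input} > {output}"

        rule sort_bam:
            input:
                "mapped/{sample}.bam"
            output:
                "sorted/{sample}.bam"
            shell:
                "samtools sort -o {output} {input}"

    Invoking snakemake (optionally with a cores or cluster option) then executes only the rules needed to bring the requested outputs up to date, which is how the same definition scales from a single workstation to a compute cluster.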

  15. Behavioral technique for workflow abstraction and matching

    NARCIS (Netherlands)

    Klai, K.; Ould Ahmed M'bareck, N.; Tata, S.; Dustdar, S.; Fiadeiro, J.L.; Sheth, A.

    2006-01-01

    This work is in line with the CoopFlow approach dedicated for workflow advertisement, interconnection, and cooperation in virtual organizations. In order to advertise workflows into a registry, we present in this paper a novel method to abstract behaviors of workflows into symbolic observation

  16. A performance study of grid workflow engines

    NARCIS (Netherlands)

    Stratan, C.; Iosup, A.; Epema, D.H.J.

    2008-01-01

    To benefit from grids, scientists require grid workflow engines that automatically manage the execution of inter-related jobs on the grid infrastructure. So far, the workflows community has focused on scheduling algorithms and on interface tools. Thus, while several grid workflow engines have been

  17. Climate Data Analytics Workflow Management

    Science.gov (United States)

    Zhang, J.; Lee, S.; Pan, L.; Mattmann, C. A.; Lee, T. J.

    2016-12-01

    In this project we aim to pave a novel path to create a sustainable building block toward Earth science big data analytics and knowledge sharing. Closely studying how Earth scientists conduct data analytics research in their daily work, we have developed a provenance model to record their activities, and a technology to automatically generate workflows for scientists from the provenance. On top of this, we have built the prototype of a data-centric provenance repository, and established a PDSW (People, Data, Service, Workflow) knowledge network to support workflow recommendation. To ensure the scalability and performance of the expected recommendation system, we have leveraged the Apache OODT system technology. The community-approved, metrics-based performance evaluation web-service will allow a user to select a metric from the list of several community-approved metrics and to evaluate model performance using the metric as well as the reference dataset. This service will facilitate the use of reference datasets that are generated in support of the model-data intercomparison projects such as Obs4MIPs and Ana4MIPs. The data-centric repository infrastructure will allow us to capture richer provenance to further facilitate knowledge sharing and scientific collaboration in the Earth science community. This project is part of the Apache incubator CMDA project.

  18. Dual-Energy Computed Tomography: Image Acquisition, Processing, and Workflow.

    Science.gov (United States)

    Megibow, Alec J; Kambadakone, Avinash; Ananthakrishnan, Lakshmi

    2018-07-01

    Dual energy computed tomography has been available for more than 10 years; however, it is currently on the cusp of widespread clinical use. The way dual energy data are acquired and assembled must be appreciated at the clinical level so that the various reconstruction types can extend its diagnostic power. The type of scanner that is present in a given practice dictates the way in which the dual energy data can be presented and used. This article compares and contrasts how dual source, rapid kV switching, and spectral technologies acquire and present dual energy reconstructions to practicing radiologists. Copyright © 2018 Elsevier Inc. All rights reserved.

  19. Mergers and Acquisitions

    OpenAIRE

    Frasch, Manfred; Leptin, Maria

    2000-01-01

    Mergers and acquisitions (M&As) are booming as a strategy of choice for organizations attempting to maintain a competitive advantage. Previous research on mergers and acquisitions declares that acquirers do not normally benefit from acquisitions. Targets, on the other hand, tend to gain positive returns in the few days surrounding merger announcements due to several characteristics of the acquisition deal. The announcement period wealth effect on acquiring firms, however, is as cle...

  20. Automation of Flexible Migration Workflows

    Directory of Open Access Journals (Sweden)

    Dirk von Suchodoletz

    2011-03-01

    Full Text Available Many digital preservation scenarios are based on the migration strategy, which itself is heavily tool-dependent. For popular, well-defined and often open file formats – e.g., digital images, such as PNG, GIF, JPEG – a wide range of tools exist. Migration workflows become more difficult with proprietary formats, such as those used by the many text processing applications that have become available over the last two decades. If a certain file format cannot be rendered with current software, emulation of the original environment remains a valid option. For instance, with the original Lotus AmiPro or Word Perfect, it is not a problem to save an object of this type in ASCII text or Rich Text Format. In specific environments, it is even possible to send the file to a virtual printer, thereby producing a PDF as a migration output. Such manual migration tasks typically involve human interaction, which may be feasible for a small number of objects, but not for larger batches of files. We propose a novel approach using a software-operated VNC abstraction layer in order to replace humans with machine interaction. Emulators or virtualization tools equipped with a VNC interface are very well suited for this approach. But screen, keyboard and mouse interaction is just part of the setup. Furthermore, digital objects need to be transferred into the original environment in order to be extracted after processing. Nevertheless, the complexity of the new generation of migration services is quickly rising; a preservation workflow now comprises not only the migration tool itself, but a complete software and virtual hardware stack with recorded workflows linked to every supported migration scenario. Thus the requirements of OAIS management must include proper software archiving, emulator selection, system image and recording handling. The concept of view-paths could help either to automatically determine the proper pre-configured virtual environment or to set up system
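
    The record above describes replacing human interaction with machine interaction through a software-operated VNC layer but does not name a concrete client library; the sketch below therefore uses the vncdotool Python package purely as an illustrative stand-in, and the host, password, key sequence, timing and file names are all assumptions.

        # Hypothetical sketch: drive an emulated legacy desktop over VNC to
        # perform a manual "print to virtual PDF printer" migration step.
        from vncdotool import api

        # Connect to the emulator's VNC display (host, port and password assumed).
        client = api.connect("emulator-host::5900", password="secret")

        # Give the emulated desktop time to boot, then dismiss the startup dialog.
        client.pause(30.0)
        client.keyPress("enter")

        # Open the print dialog of the legacy word processor and confirm printing
        # to the virtual PDF printer (Ctrl+P followed by Enter in this assumed setup).
        client.keyDown("ctrl")
        client.keyPress("p")
        client.keyUp("ctrl")
        client.pause(2.0)
        client.keyPress("enter")

        # Capture the screen so the recorded workflow step can be quality-checked.
        client.captureScreen("migration_step.png")
        client.disconnect()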

  1. Pro WF: Windows Workflow in .NET 4.0

    CERN Document Server

    Bukovics, Bruce

    2010-01-01

    Windows Workflow Foundation (WF) is a revolutionary part of the .NET 4 Framework that allows you to orchestrate human and system interactions as a series of workflows that can be easily mapped, analyzed, adjusted, and implemented. As business problems become more complex, the need for workflow-based solutions has never been more evident. WF provides a simple and consistent way to model and implement complex problems. As a developer, you focus on developing the business logic for individual workflow tasks. The runtime handles the execution of those tasks after they have been composed into a wor

  2. Integration of services into workflow applications

    CERN Document Server

    Czarnul, Pawel

    2015-01-01

    Describing state-of-the-art solutions in distributed system architectures, Integration of Services into Workflow Applications presents a concise approach to the integration of loosely coupled services into workflow applications. It discusses key challenges related to the integration of distributed systems and proposes solutions, both in terms of theoretical aspects such as models and workflow scheduling algorithms, and technical solutions such as software tools and APIs. The book provides an in-depth look at workflow scheduling and proposes a way to integrate several different types of services

  3. Multidetector-row CT: economics and workflow

    International Nuclear Information System (INIS)

    Pottala, K.M.; Kalra, M.K.; Saini, S.; Ouellette, K.; Sahani, D.; Thrall, J.H.

    2005-01-01

    With the rapid evolution of multidetector-row CT (MDCT) technology and applications, several factors such as technology upgrades and turf battles for sharing cost and profitability affect MDCT workflow and economics. MDCT workflow optimization can enhance productivity and reduce unit costs as well as increase profitability, in spite of decreases in reimbursement rates. Strategies for workflow management include standardization, automation, and constant assessment of the various steps involved in MDCT operations. In this review article, we describe issues related to MDCT economics and workflow. (orig.)

  4. Towards an Intelligent Workflow Designer based on the Reuse of Workflow Patterns

    NARCIS (Netherlands)

    Iochpe, Cirano; Chiao, Carolina; Hess, Guillermo; Nascimento, Gleison; Thom, Lucinéia; Reichert, Manfred

    2007-01-01

    In order to build process-aware information systems, we need sophisticated methods and concepts for designing and modeling processes. Recently, research on workflow patterns has emerged in order to increase the reuse of recurring workflow structures. However, current workflow modeling tools do not

  5. Workflow patterns the definitive guide

    CERN Document Server

    Russell, Nick; ter Hofstede, Arthur H M

    2016-01-01

    The study of business processes has emerged as a highly effective approach to coordinating an organization's complex service- and knowledge-based activities. The growing field of business process management (BPM) focuses on methods and tools for designing, enacting, and analyzing business processes. This volume offers a definitive guide to the use of patterns, which synthesize the wide range of approaches to modeling business processes. It provides a unique and comprehensive introduction to the well-known workflow patterns collection -- recurrent, generic constructs describing common business process modeling and execution scenarios, presented in the form of problem-solution dialectics. The underlying principles of the patterns approach ensure that they are independent of any specific enabling technology, representational formalism, or modeling approach, and thus broadly applicable across the business process modeling and business process technology domains. The authors, drawing on extensive research done by...

  6. Complexity Metrics for Workflow Nets

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; van der Aalst, Wil M.P.

    2009-01-01

    analysts have difficulties grasping the dynamics implied by a process model. Recent empirical studies show that people make numerous errors when modeling complex business processes, e.g., about 20 percent of the EPCs in the SAP reference model have design flaws resulting in potential deadlocks, livelocks......, etc. It seems obvious that the complexity of the model contributes to design errors and a lack of understanding. It is not easy to measure complexity, however. This paper presents three complexity metrics that have been implemented in the process analysis tool ProM. The metrics are defined...... for a subclass of Petri nets named Workflow nets, but the results can easily be applied to other languages. To demonstrate the applicability of these metrics, we have applied our approach and tool to 262 relatively complex Protos models made in the context of various student projects. This allows us to validate...

  7. Contracts for Cross-Organizational Workflow Management

    NARCIS (Netherlands)

    Koetsier, M.J.; Grefen, P.W.P.J.; Vonk, J.

    1999-01-01

    Nowadays, many organizations form dynamic partnerships to deal effectively with market requirements. As companies use automated workflow systems to control their processes, a way of linking workflow processes in different organizations is useful in turning the co-operating companies into a seamless

  8. Verifying generalized soundness for workflow nets

    NARCIS (Netherlands)

    Hee, van K.M.; Oanea, O.I.; Sidorova, N.; Voorhoeve, M.; Virbitskaite, I.; Voronkov, A.

    2007-01-01

    We improve the decision procedure from [10] for the problem of generalized soundness of workflow nets. A workflow net is generalized sound iff every marking reachable from an initial marking with k tokens on the initial place terminates properly, i.e. it can reach a marking with k tokens on the
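
    For readers unfamiliar with the property, generalized soundness can be stated compactly as follows; this is the standard formulation (with the final place o supplied where the record above is truncated), given here only as a reading aid:

        \forall k \geq 1 \;\; \forall M : \quad k \cdot [i] \xrightarrow{\,*\,} M \;\Longrightarrow\; M \xrightarrow{\,*\,} k \cdot [o]

    where i and o are the initial and final places of the workflow net, k·[i] denotes the marking with k tokens on place i, and the starred arrow denotes reachability by firing transitions.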

  9. Implementing bioinformatic workflows within the bioextract server

    Science.gov (United States)

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed servi...

  10. Workflow Fault Tree Generation Through Model Checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2014-01-01

    We present a framework for the automated generation of fault trees from models of realworld process workflows, expressed in a formalised subset of the popular Business Process Modelling and Notation (BPMN) language. To capture uncertainty and unreliability in workflows, we extend this formalism...

  11. Workflow Patterns for Business Process Modeling

    NARCIS (Netherlands)

    Thom, Lucineia Heloisa; Lochpe, Cirano; Reichert, M.U.

    For their reuse advantages, workflow patterns (e.g., control flow patterns, data patterns, resource patterns) are increasingly attracting the interest of both researchers and vendors. Frequently, business process or workflow models can be assembled out of a set of recurrent process fragments (or

  12. A Formal Framework for Workflow Analysis

    Science.gov (United States)

    Cravo, Glória

    2010-09-01

    In this paper we provide a new formal framework to model and analyse workflows. A workflow is the formal definition of a business process that consists of the execution of tasks in order to achieve a certain objective. In our work we describe a workflow as a graph whose vertices represent tasks and whose arcs are associated with workflow transitions. Each task has an associated input/output logic operator. This logic operator can be the logical AND (•), the OR (⊗), or the XOR, i.e. exclusive-or (⊕). Moreover, we introduce algebraic concepts in order to completely describe the structure of workflows. We also introduce the concept of logical termination. Finally, we provide a necessary and sufficient condition for this property to hold.
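
    To make the graph-based description concrete, the following minimal Python sketch encodes a task together with its input and output logic operators as described above; the class and field names are illustrative choices, not taken from the paper.

        from dataclasses import dataclass, field
        from enum import Enum
        from typing import List

        class LogicOp(Enum):
            AND = "AND"  # all incoming/outgoing transitions (the paper's •)
            OR = "OR"    # one or more transitions (the paper's ⊗)
            XOR = "XOR"  # exactly one transition (the paper's ⊕)

        @dataclass
        class Task:
            name: str
            input_op: LogicOp    # joins incoming transitions
            output_op: LogicOp   # splits outgoing transitions
            successors: List["Task"] = field(default_factory=list)

        # A tiny workflow: t1 splits exclusively to t2 or t3, which join with OR at t4.
        t4 = Task("archive", LogicOp.OR, LogicOp.AND)
        t2 = Task("manual_review", LogicOp.AND, LogicOp.AND, [t4])
        t3 = Task("auto_review", LogicOp.AND, LogicOp.AND, [t4])
        t1 = Task("receive_claim", LogicOp.AND, LogicOp.XOR, [t2, t3])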

  13. Using Mobile Agents to Implement Workflow System

    Institute of Scientific and Technical Information of China (English)

    LI Jie; LIU Xian-xing; GUO Zheng-wei

    2004-01-01

    Current workflow management systems usually adopt existing technologies such as TCP/IP-based Web technologies and CORBA to implement the underlying communications. Very often this has been considered only from a theoretical point of view, mainly because of the lack of concrete possibilities for flexible execution. MAT (Mobile Agent Technology) represents a very attractive approach to the distributed control of computer networks and a valid alternative for implementing workflow system strategies. This paper mainly focuses on improving the performance of workflow systems by using MAT. Firstly, the performance of workflow systems based on both CORBA and mobile agents is summarized and analyzed; secondly, a performance comparison is presented by introducing a mathematical model of each kind of data interaction process. Finally, a mobile agent-based workflow system named MAWMS is presented and described in detail.

  14. Dynamic Reusable Workflows for Ocean Science

    Directory of Open Access Journals (Sweden)

    Richard P. Signell

    2016-10-01

    Full Text Available Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog searches and data access now make it possible to create catalog-driven workflows that automate—end-to-end—data search, analysis, and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused, and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS which automates the skill assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC Catalog Service for the Web (CSW, then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enter the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased

  15. Dynamic reusable workflows for ocean science

    Science.gov (United States)

    Signell, Richard; Fernandez, Filipe; Wilcox, Kyle

    2016-01-01

    Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog search and data access make it now possible to create catalog-driven workflows that automate — end-to-end — data search, analysis and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill-assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enters the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased use of dynamic
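
    As a hedged illustration of the catalog-driven search step described above, the following Python sketch queries an OGC CSW endpoint with OWSLib, one of the standard Python tools for OGC services; the endpoint URL and the search term are assumptions, not values taken from the article.

        # Search a CSW catalog for records matching a free-text constraint and list
        # the service endpoints (e.g. SOS, OPeNDAP) advertised in their metadata.
        from owslib.csw import CatalogueServiceWeb
        from owslib.fes import PropertyIsLike

        csw = CatalogueServiceWeb("https://example.org/csw")  # hypothetical endpoint

        constraint = PropertyIsLike("apiso:AnyText", "%sea_water_temperature%")
        csw.getrecords2(constraints=[constraint], maxrecords=10, esn="full")

        for record_id, record in csw.records.items():
            print(record_id, record.title)
            for ref in record.references:
                # Later workflow steps would pick SOS/OPeNDAP URLs out of these references.
                print("   ", ref.get("scheme"), ref.get("url"))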

  16. Acquisitions Everywhere: Modeling an Acquisitions Data Standard to Connect a Distributed Environment

    OpenAIRE

    Hanson, Eric M.; Lightcap, Paul W.; Miguez, Matthew R.

    2016-01-01

    Acquisitions functions remain operationally crucial in providing access to paid information resources, but data formats and workflows utilized within library acquisitions remain primarily within the traditional integrated library system (ILS). As libraries have evolved to use distributed systems to manage information resources, so too must acquisitions functions adapt to an environment that may include the ILS, e‐resource management systems (ERMS), institutional repositories (IR), and other d...

  17. Inverse IMRT workflow process at Austin health

    International Nuclear Information System (INIS)

    Rykers, K.; Fernando, W.; Grace, M.; Liu, G.; Rolfo, A.; Viotto, A.; Mantle, C.; Lawlor, M.; Au-Yeung, D.; Quong, G.; Feigen, M.; Lim-Joon, D.; Wada, M.

    2004-01-01

    Full text: The work presented here will review the strategies adopted at Austin Health to bring IMRT into clinical use. IMRT is delivered using step-and-shoot mode on an Elekta Precise machine with 40 pairs of 1 cm wide MLC leaves. Planning is done using CMS Focus/XiO. A collaborative approach for ROs, Physicists and RTs from concept to implementation was adopted. An overview will be given of the workflow for the clinic, the equipment used, tolerance levels and the lessons learned: 1. Strategic planning for IMRT. 2. Training: a. MSKCC (New York); b. ESTRO (Amsterdam); c. Elekta (US and UK). 3. Linac testing and data acquisition: a. equipment and software review and selection; b. linac reliability, geometric and mechanical checks; c. draft patient QA procedure; d. EPI image matching checks and procedures. 4. Planning system checks: a. export of dose matrix (options); b. dose calculation choices. 5. IMRT research initiatives: a. IMRT planning studies, stabilisation, on-line imaging. 6. Equipment procurement and testing: a. physics and linac equipment, hardware, software/licences, stabilisation. 7. Establishing a DICOM environment: a. prescription sending, image transfer for EPI checks; b. QA files. 8. Physics QA (pre-treatment): a. clinical plan review, DVH checks; b. geometry, dosimetry and DICOM checks; c. 2D distance-to-agreement, mm difference reports, gamma function index. 9. Documentation: a. protocol development (ICRU 50/62 reporting and prescribing); b. QA for Physics; c. QA for RTs; d. generation of a report for the RO/patient history. Copyright (2004) Australasian College of Physical Scientists and Engineers in Medicine

  18. ADVANCED APPROACH TO PRODUCTION WORKFLOW COMPOSITION ON ENGINEERING KNOWLEDGE PORTALS

    OpenAIRE

    Novogrudska, Rina; Kot, Tatyana; Globa, Larisa; Schill, Alexander

    2016-01-01

    Background. In the environment of engineering knowledge portals, a great number of partial workflows is concentrated. Such workflows are composed into a general workflow aimed at performing a real, complex production task. The characteristics of partial workflows and the structure of the general workflow have not been studied sufficiently, which makes dynamic composition of the general production workflow impossible. Objective. To create an approach to dynamic composition of the general production workflow based on the partial wor...

  19. Workflow Management in CLARIN-DK

    DEFF Research Database (Denmark)

    Jongejan, Bart

    2013-01-01

    The CLARIN-DK infrastructure is not only a repository of resources, but also a place where users can analyse, annotate, reformat and potentially even translate resources, using tools that are integrated in the infrastructure as web services. In many cases a single tool does not produce the desired...... with the features that describe her goal, because the workflow manager not only executes chains of tools in a workflow, but also takes care of autonomously devising workflows that serve the user’s intention, given the tools that currently are integrated in the infrastructure as web services. To do this...

  20. The Diabetic Retinopathy Screening Workflow

    Science.gov (United States)

    Bolster, Nigel M.; Giardini, Mario E.; Bastawrous, Andrew

    2015-01-01

    Complications of diabetes mellitus, namely diabetic retinopathy and diabetic maculopathy, are the leading cause of blindness in working-age people. Sufferers can avoid blindness if identified early via retinal imaging. Systematic screening of the diabetic population has been shown to greatly reduce the prevalence and incidence of blindness within the population. Many national screening programs have digital fundus photography as their basis. In the past 5 years several techniques and adapters have been developed that allow digital fundus photography to be performed using smartphones. We review recent progress in smartphone-based fundus imaging and discuss its potential for integration into national systematic diabetic retinopathy screening programs. Some systems have produced promising initial results with respect to their agreement with reference standards. However, further multisite trialling of such systems’ use within implementable screening workflows is required if an evidence base strong enough to affect policy change is to be established. If this were to occur, national diabetic retinopathy screening would, for the first time, become possible in low- and middle-income settings where cost and availability of trained eye care personnel are currently key barriers to implementation. As diabetes prevalence and incidence are increasing sharply in these settings, the impact on global blindness could be profound. PMID:26596630

  1. Workflow Optimization in Vertebrobasilar Occlusion

    International Nuclear Information System (INIS)

    Kamper, Lars; Meyn, Hannes; Rybacki, Konrad; Nordmeyer, Simone; Kempkes, Udo; Piroth, Werner; Isenmann, Stefan; Haage, Patrick

    2012-01-01

    Objective: In vertebrobasilar occlusion, rapid recanalization is the only substantial means to improve the prognosis. We introduced a standard operating procedure (SOP) for interventional therapy to analyze the effects on interdisciplinary time management. Methods: Intrahospital time periods between hospital admission and neuroradiological intervention were retrospectively analyzed, together with the patients’ outcome, before (n = 18) and after (n = 20) implementation of the SOP. Results: After implementation of the SOP, we observed statistically significant improvement of postinterventional patient neurological status (p = 0.017). In addition, we found a decrease of 5:33 h for the mean time period from hospital admission until neuroradiological intervention. The recanalization rate increased from 72.2% to 80% after implementation of the SOP. Conclusion: Our results underscore the relevance of SOP implementation and analysis of time management for clinical workflow optimization. Both may trigger awareness for the need of efficient interdisciplinary time management. This could be an explanation for the decreased time periods and improved postinterventional patient status after SOP implementation.

  2. Security aspects in teleradiology workflow

    Science.gov (United States)

    Soegner, Peter I.; Helweg, Gernot; Holzer, Heimo; zur Nedden, Dieter

    2000-05-01

    The medicolegal necessity of privacy, security and confidentiality motivated the attempt to develop a secure teleradiology workflow between the telepartners, i.e. the radiologist and the referring physician. To address the lack of data protection and data security, we introduced biometric fingerprint scanners in combination with smart cards to identify the teleradiology partners and communicated over an encrypted TCP/IP satellite link between Innsbruck and Reutte. We used an asymmetric cryptography method to guarantee authentication, integrity of the data packages and confidentiality of the medical data. It was necessary to use a biometric feature to avoid cases of mistaken identity among persons who wanted access to the system. Only an invariable electronic identification allowed legal liability for the final report, and only a secure data connection allowed the exchange of sensitive medical data between different partners of Health Care Networks. In our study we selected the user-friendly combination of a smart card and a biometric fingerprint technique, called SkymedTM Double Guard Secure Keyboard (Agfa-Gevaert), to confirm identities and log into the imaging workstations and the electronic patient record. We examined the interoperability of the software used with the existing platforms. Only the WIN-XX operating systems could be protected at the time of our study.
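
    The asymmetric scheme described above (authentication and integrity of the transmitted data packages) can be illustrated with a short, generic digital-signature sketch using Python's cryptography package; the key parameters and report payload are assumptions, and this is not the Skymed/Agfa implementation used in the study.

        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.asymmetric import rsa, padding

        # Key pair of the reporting radiologist (key parameters are assumptions).
        private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
        public_key = private_key.public_key()

        report = b"Final teleradiology report for study 12345"

        # Signing proves authorship (authentication) and makes tampering detectable (integrity).
        signature = private_key.sign(
            report,
            padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
            hashes.SHA256(),
        )

        # The referring physician verifies with the radiologist's public key;
        # verify() raises InvalidSignature if the report or the signature was altered.
        public_key.verify(
            signature,
            report,
            padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
            hashes.SHA256(),
        )

    Confidentiality of the channel itself, the third requirement, would additionally be provided by link or transport encryption, as the study does with its encrypted TCP/IP satellite link.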

  3. Integrative Workflows for Metagenomic Analysis

    Directory of Open Access Journals (Sweden)

    Efthymios eLadoukakis

    2014-11-01

    Full Text Available The rapid evolution of all sequencing technologies, described by the term Next Generation Sequencing (NGS), has revolutionized metagenomic analysis. They constitute a combination of high-throughput analytical protocols, coupled to delicate measuring techniques, in order to potentially discover, properly assemble and map allelic sequences to the correct genomes, achieving particularly high yields for only a fraction of the cost of traditional processes (e.g. Sanger). From a bioinformatic perspective, this boils down to many gigabytes of data being generated from each single sequencing experiment, rendering data management, or even storage, a critical bottleneck in the overall analytical endeavor. This enormous complexity is further aggravated by the versatility of the processing steps available, represented by the numerous bioinformatic tools that are essential, for each analytical task, in order to fully unveil the genetic content of a metagenomic dataset. These disparate tasks range from simple, nonetheless non-trivial, quality control of raw data to exceptionally complex protein annotation procedures, requiring a high level of expertise for their proper application or the neat implementation of the whole workflow. Furthermore, a bioinformatic analysis of such scale requires substantial computational resources, imposing the utilization of cloud computing infrastructures as the sole realistic solution. In this review article we discuss the different integrative bioinformatic solutions available, which address the aforementioned issues by performing a critical assessment of the available automated pipelines for data management, quality control and annotation of metagenomic data, embracing various major sequencing technologies and applications.

  4. RABIX: AN OPEN-SOURCE WORKFLOW EXECUTOR SUPPORTING RECOMPUTABILITY AND INTEROPERABILITY OF WORKFLOW DESCRIPTIONS.

    Science.gov (United States)

    Kaushik, Gaurav; Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz

    2017-01-01

    As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions.

  5. Workflow Based Software Development Environment, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed research is to investigate and develop a workflow based tool, the Software Developers Assistant, to facilitate the collaboration between...

  6. COSMOS: Python library for massively parallel workflows.

    Science.gov (United States)

    Gafni, Erik; Luquette, Lovelace J; Lancaster, Alex K; Hawkins, Jared B; Jung, Jae-Yoon; Souilmi, Yassine; Wall, Dennis P; Tonellato, Peter J

    2014-10-15

    Efficient workflows to shepherd clinically generated genomic data through the multiple stages of a next-generation sequencing pipeline are of critical importance in translational biomedical science. Here we present COSMOS, a Python library for workflow management that allows formal description of pipelines and partitioning of jobs. In addition, it includes a user interface for tracking the progress of jobs, abstraction of the queuing system and fine-grained control over the workflow. Workflows can be created on traditional computing clusters as well as cloud-based services. Source code is available for academic non-commercial research purposes. Links to code and documentation are provided at http://lpm.hms.harvard.edu and http://wall-lab.stanford.edu. dpwall@stanford.edu or peter_tonellato@hms.harvard.edu. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.

  7. Implementing Workflow Reconfiguration in WS-BPEL

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Dragoni, Nicola; Zhou, Mu

    2012-01-01

    This paper investigates the problem of dynamic reconfiguration by means of a workflow-based case study used for discussion. We state the requirements on a system implementing the workflow and its reconfiguration, and we describe the system’s design in BPMN. WS-BPEL, a language that would not naturally support dynamic change, is used as a target for implementation. The WS-BPEL recovery framework is here exploited to implement the reconfiguration using principles derived from previous research in process algebra, and two mappings from BPMN to WS-BPEL are presented, one automatic and only mostly...

  8. Logical provenance in data-oriented workflows?

    KAUST Repository

    Ikeda, R.; Das Sarma, Akash; Widom, J.

    2013-01-01

    for general transformations, introducing the notions of correctness, precision, and minimality. We then determine when properties such as correctness and minimality carry over from the individual transformations' provenance to the workflow provenance. We

  9. A Multilevel Secure Workflow Management System

    National Research Council Canada - National Science Library

    Kang, Myong H; Froscher, Judith N; Sheth, Amit P; Kochut, Krys J; Miller, John A

    1999-01-01

    The Department of Defense (DoD) needs multilevel secure (MLS) workflow management systems to enable globally distributed users and applications to cooperate across classification levels to achieve mission critical goals...

  10. Workflow Based Software Development Environment, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed research is to investigate and develop a workflow based tool, the Software Developers Assistant, to facilitate the collaboration between...

  11. Biphasic solid and liquid gastric emptying in normal control subjects and diabetic patients with continuous acquisition in the left anterior oblique view

    International Nuclear Information System (INIS)

    Ziessman, H.A.; Fahey, F.H.; Herring, C.D.; Deschner, W.K.; Collen, M.J.; Vigersky, R.A.

    1989-01-01

    This paper reports solid and liquid gastric emptying (GE) studied in 10 normal controls and 20 diabetics with symptoms of gastroparesis. After the ingestion of a Tc-99m SC egg sandwich and In-111 DTPA in water, 90 1-minute frames were acquired in the left anterior oblique view. Solid GE had a lag phase in all cases and then emptied linearly. Compared with normal controls, diabetics had delayed GE and a delayed lag phase (P < .05). Liquid GE was exponential with no lag phase. Biexponential liquid emptying with an early fast component followed by a second slower one was seen in 60% of normal controls and 70% of diabetics. The slower component of liquid GE correlated with the solid GE rate (normal controls, r = .826; diabetics, r = .885).
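
    The biexponential liquid emptying described above is conventionally modelled as the sum of two exponentials; this is the standard form of such a fit, not a formula quoted from the paper:

        R(t) = A\,e^{-\lambda_{1} t} + B\,e^{-\lambda_{2} t}, \qquad \lambda_{1} > \lambda_{2}

    where R(t) is the retained fraction of the liquid marker at time t, the first term captures the early fast component, and the second term captures the slower component whose rate was found to correlate with the solid emptying rate.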

  12. Integrated workflows for spiking neuronal network simulations

    Directory of Open Access Journals (Sweden)

    Ján eAntolík

    2013-12-01

    Full Text Available The increasing availability of computational resources is enabling more detailed, realistic modelling in computational neuroscience, resulting in a shift towards more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeller's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modellers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualisation into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organised configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualisation stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modelling studies by relieving the user from manual handling of the flow of metadata between the individual

  13. Multilevel Workflow System in the ATLAS Experiment

    CERN Document Server

    Borodin, M; The ATLAS collaboration; Golubkov, D; Klimentov, A; Maeno, T; Vaniachine, A

    2015-01-01

    The ATLAS experiment is scaling up Big Data processing for the next LHC run using a multilevel workflow system comprised of many layers. In Big Data processing ATLAS deals with datasets, not individual files. Similarly a task (comprised of many jobs) has become a unit of the ATLAS workflow in distributed computing, with about 0.8M tasks processed per year. In order to manage the diversity of LHC physics (exceeding 35K physics samples per year), the individual data processing tasks are organized into workflows. For example, the Monte Carlo workflow is composed of many steps: generate or configure hard-processes, hadronize signal and minimum-bias (pileup) events, simulate energy deposition in the ATLAS detector, digitize electronics response, simulate triggers, reconstruct data, convert the reconstructed data into ROOT ntuples for physics analysis, etc. Outputs are merged and/or filtered as necessary to optimize the chain. The bi-level workflow manager - ProdSys2 - generates actual workflow tasks and their jobs...

  14. SwinDeW-C: A Peer-to-Peer Based Cloud Workflow System

    Science.gov (United States)

    Liu, Xiao; Yuan, Dong; Zhang, Gaofeng; Chen, Jinjun; Yang, Yun

    Workflow systems are designed to support the process automation of large scale business and scientific applications. In recent years, many workflow systems have been deployed on high performance computing infrastructures such as cluster, peer-to-peer (p2p), and grid computing (Moore, 2004; Wang, Jie, & Chen, 2009; Yang, Liu, Chen, Lignier, & Jin, 2007). One of the driving forces is the increasing demand for large-scale instance and data/computation intensive workflow applications (large scale workflow applications for short) which are common in both eBusiness and eScience application areas. Typical examples (detailed in Section 13.2.1) include the transaction-intensive nation-wide insurance claim application process and the data- and computation-intensive pulsar searching process in Astrophysics. Generally speaking, instance intensive applications are those processes which need to be executed a large number of times sequentially within a very short period or concurrently with a large number of instances (Liu, Chen, Yang, & Jin, 2008; Liu et al., 2010; Yang et al., 2008). Therefore, large scale workflow applications normally require the support of high performance computing infrastructures (e.g. advanced CPU units, large memory space and high speed network), especially when the workflow activities are themselves data and computation intensive. In the real world, to accommodate such a request, expensive computing infrastructures including supercomputers and data servers are bought, installed, integrated and maintained at huge cost by system users

  15. DEWEY: the DICOM-enabled workflow engine system.

    Science.gov (United States)

    Erickson, Bradley J; Langer, Steve G; Blezek, Daniel J; Ryan, William J; French, Todd L

    2014-06-01

    Workflow is a widely used term to describe the sequence of steps to accomplish a task. The use of workflow technology in medicine and medical imaging in particular is limited. In this article, we describe the application of a workflow engine to improve workflow in a radiology department. We implemented a DICOM-enabled workflow engine system in our department. We designed it in a way to allow for scalability, reliability, and flexibility. We implemented several workflows, including one that replaced an existing manual workflow and measured the number of examinations prepared in time without and with the workflow system. The system significantly increased the number of examinations prepared in time for clinical review compared to human effort. It also met the design goals defined at its outset. Workflow engines appear to have value as ways to efficiently assure that complex workflows are completed in a timely fashion.

  16. Data-Independent Acquisition-Based Quantitative Proteomic Analysis Reveals Potential Biomarkers of Kidney Cancer.

    Science.gov (United States)

    Song, Yimeng; Zhong, Lijun; Zhou, Juntuo; Lu, Min; Xing, Tianying; Ma, Lulin; Shen, Jing

    2017-12-01

    Renal cell carcinoma (RCC) is a malignant and metastatic cancer with 95% mortality, and clear cell RCC (ccRCC) is the most observed among the five major subtypes of RCC. Specific biomarkers that can distinguish cancer tissues from adjacent normal tissues should be developed to diagnose this disease in early stages and conduct a reliable prognostic evaluation. Data-independent acquisition (DIA) strategy has been widely employed in proteomic analysis because of various advantages, including enhanced protein coverage and reliable data acquisition. In this study, a DIA workflow is constructed on a quadrupole-Orbitrap LC-MS platform to reveal dysregulated proteins between ccRCC and adjacent normal tissues. More than 4000 proteins are identified, 436 of these proteins are dysregulated in ccRCC tissues. Bioinformatic analysis reveals that multiple pathways and Gene Ontology items are strongly associated with ccRCC. The expression levels of L-lactate dehydrogenase A chain, annexin A4, nicotinamide N-methyltransferase, and perilipin-2 examined through RT-qPCR, Western blot, and immunohistochemistry confirm the validity of the proteomic analysis results. The proposed DIA workflow yields optimum time efficiency and data reliability and provides a good choice for proteomic analysis in biological and clinical studies, and these dysregulated proteins might be potential biomarkers for ccRCC diagnosis. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. The PBase Scientific Workflow Provenance Repository

    Directory of Open Access Journals (Sweden)

    Víctor Cuevas-Vicenttín

    2014-10-01

    Full Text Available Scientific workflows and their supporting systems are becoming increasingly popular for compute-intensive and data-intensive scientific experiments. The advantages scientific workflows offer include rapid and easy workflow design, software and data reuse, scalable execution, sharing and collaboration, and other advantages that altogether facilitate “reproducible science”. In this context, provenance – information about the origin, context, derivation, ownership, or history of some artifact – plays a key role, since scientists are interested in examining and auditing the results of scientific experiments. However, in order to perform such analyses on scientific results as part of extended research collaborations, an adequate environment and tools are required. Concretely, the need arises for a repository that will facilitate the sharing of scientific workflows and their associated execution traces in an interoperable manner, also enabling querying and visualization. Furthermore, such functionality should be supported while taking performance and scalability into account. With this purpose in mind, we introduce PBase: a scientific workflow provenance repository implementing the ProvONE proposed standard, which extends the emerging W3C PROV standard for provenance data with workflow specific concepts. PBase is built on the Neo4j graph database, thus offering capabilities such as declarative and efficient querying. Our experiences demonstrate the power gained by supporting various types of queries for provenance data. In addition, PBase is equipped with a user friendly interface tailored for the visualization of scientific workflow provenance data, making the specification of queries and the interpretation of their results easier and more effective.

  18. Multilevel Workflow System in the ATLAS Experiment

    International Nuclear Information System (INIS)

    Borodin, M; De, K; Navarro, J Garcia; Golubkov, D; Klimentov, A; Maeno, T; Vaniachine, A

    2015-01-01

    The ATLAS experiment is scaling up Big Data processing for the next LHC run using a multilevel workflow system comprised of many layers. In Big Data processing, ATLAS deals with datasets, not individual files. Similarly, a task (comprised of many jobs) has become a unit of the ATLAS workflow in distributed computing, with about 0.8M tasks processed per year. In order to manage the diversity of LHC physics (exceeding 35K physics samples per year), the individual data processing tasks are organized into workflows. For example, the Monte Carlo workflow is composed of many steps: generate or configure hard-processes, hadronize signal and minimum-bias (pileup) events, simulate energy deposition in the ATLAS detector, digitize electronics response, simulate triggers, reconstruct data, convert the reconstructed data into ROOT ntuples for physics analysis, etc. Outputs are merged and/or filtered as necessary to optimize the chain. The bi-level workflow manager - ProdSys2 - generates actual workflow tasks and their jobs are executed across more than a hundred distributed computing sites by PanDA - the ATLAS job-level workload management system. On the outer level, the Database Engine for Tasks (DEfT) empowers production managers with templated workflow definitions. On the next level, the Job Execution and Definition Interface (JEDI) is integrated with PanDA to provide dynamic job definition tailored to the sites' capabilities. We report on scaling up the production system to accommodate a growing number of requirements from main ATLAS areas: Trigger, Physics and Data Preparation. (paper)

  19. Progress in digital color workflow understanding in the International Color Consortium (ICC) Workflow WG

    Science.gov (United States)

    McCarthy, Ann

    2006-01-01

    The ICC Workflow WG serves as the bridge between ICC color management technologies and use of those technologies in real world color production applications. ICC color management is applicable to and is used in a wide range of color systems, from highly specialized digital cinema color special effects to high volume publications printing to home photography. The ICC Workflow WG works to align ICC technologies so that the color management needs of these diverse use case systems are addressed in an open, platform independent manner. This report provides a high level summary of the ICC Workflow WG objectives and work to date, focusing on the ways in which workflow can impact image quality and color systems performance. The 'ICC Workflow Primitives' and 'ICC Workflow Patterns and Dimensions' workflow models are covered in some detail. Consider the questions, "How much of dissatisfaction with color management today is the result of 'the wrong color transformation at the wrong time' and 'I can't get to the right conversion at the right point in my work process'?" Put another way, consider how image quality through a workflow can be negatively affected when the coordination and control level of the color management system is not sufficient.

  20. A method to mine workflows from provenance for assisting scientific workflow composition

    NARCIS (Netherlands)

    Zeng, R.; He, X.; Aalst, van der W.M.P.

    2011-01-01

    Scientific workflows have recently emerged as a new paradigm for representing and managing complex distributed scientific computations and are used to accelerate the pace of scientific discovery. In many disciplines, individual workflows are large and complicated due to the large quantities of data

  1. The MPO system for automatic workflow documentation

    Energy Technology Data Exchange (ETDEWEB)

    Abla, G.; Coviello, E.N.; Flanagan, S.M. [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Greenwald, M. [Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Lee, X. [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Romosan, A. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Schissel, D.P., E-mail: schissel@fusion.gat.com [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Shoshani, A. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Stillerman, J.; Wright, J. [Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Wu, K.J. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)

    2016-11-15

    Highlights: • Data model, infrastructure, and tools for data tracking, cataloging, and integration. • Automatically document workflow and data provenance in the widest sense. • Fusion Science as test bed but the system’s framework and data model is quite general. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for critical applications. However, it is not the mere existence of data that is important, but our ability to make use of it. Experience has shown that when metadata is better organized and more complete, the underlying data becomes more useful. Traditionally, capturing the steps of scientific workflows and metadata was the role of the lab notebook, but the digital era has resulted instead in the fragmentation of data, processing, and annotation. This paper presents the Metadata, Provenance, and Ontology (MPO) System, the software that can automate the documentation of scientific workflows and associated information. Based on recorded metadata, it provides explicit information about the relationships among the elements of workflows in notebook form augmented with directed acyclic graphs. A set of web-based graphical navigation tools and Application Programming Interface (API) have been created for searching and browsing, as well as programmatically accessing the workflows and data. We describe the MPO concepts and its software architecture. We also report the current status of the software as well as the initial deployment experience.
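
    The abstract above describes capturing the relationships among workflow elements as a directed acyclic graph with attached metadata. As a rough, hedged illustration of that idea (this is not the MPO API; the class, activity names and metadata fields are invented), a provenance record could be kept along these lines:

```python
# Hedged sketch: recording workflow steps and their data dependencies as a
# directed acyclic graph with attached metadata, loosely in the spirit of the
# automatic workflow documentation described above. All names and fields are
# invented; this is not the MPO API.

class WorkflowRecord:
    def __init__(self):
        self.nodes = {}   # node id -> metadata dictionary
        self.edges = []   # (parent id, child id) provenance links

    def add_activity(self, node_id, parents=(), **metadata):
        self.nodes[node_id] = metadata
        self.edges.extend((parent, node_id) for parent in parents)
        return node_id

record = WorkflowRecord()
raw = record.add_activity("shot_1234_raw", instrument="thomson_scattering")
fit = record.add_activity("te_profile_fit", parents=[raw], code="fit_profiles.py")
record.add_activity("transport_run", parents=[fit], code="transp", version="18.2")

for parent, child in record.edges:
    print(parent, "->", child)
```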

  2. Integrating configuration workflows with project management system

    International Nuclear Information System (INIS)

    Nilsen, Dimitri; Weber, Pavel

    2014-01-01

    The complexity of the heterogeneous computing resources, services and recurring infrastructure changes at the GridKa WLCG Tier-1 computing center require a structured approach to configuration management and optimization of interplay between functional components of the whole system. A set of tools deployed at GridKa, including Puppet, Redmine, Foreman, SVN and Icinga, provides the administrative environment giving the possibility to define and develop configuration workflows, reduce the administrative effort and improve sustainable operation of the whole computing center. In this presentation we discuss the developed configuration scenarios implemented at GridKa, which we use for host installation, service deployment, change management procedures, service retirement etc. The integration of Puppet with a project management tool like Redmine provides us with the opportunity to track problem issues, organize tasks and automate these workflows. The interaction between Puppet and Redmine results in automatic updates of the issues related to the executed workflow performed by different system components. The extensive configuration workflows require collaboration and interaction between different departments like network, security, production etc. at GridKa. Redmine plugins developed at GridKa and integrated in its administrative environment provide an effective way of collaboration within the GridKa team. We present the structural overview of the software components, their connections, communication protocols and show a few working examples of the workflows and their automation.

  3. The MPO system for automatic workflow documentation

    International Nuclear Information System (INIS)

    Abla, G.; Coviello, E.N.; Flanagan, S.M.; Greenwald, M.; Lee, X.; Romosan, A.; Schissel, D.P.; Shoshani, A.; Stillerman, J.; Wright, J.; Wu, K.J.

    2016-01-01

    Highlights: • Data model, infrastructure, and tools for data tracking, cataloging, and integration. • Automatically document workflow and data provenance in the widest sense. • Fusion Science as test bed but the system’s framework and data model is quite general. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for critical applications. However, it is not the mere existence of data that is important, but our ability to make use of it. Experience has shown that when metadata is better organized and more complete, the underlying data becomes more useful. Traditionally, capturing the steps of scientific workflows and metadata was the role of the lab notebook, but the digital era has resulted instead in the fragmentation of data, processing, and annotation. This paper presents the Metadata, Provenance, and Ontology (MPO) System, the software that can automate the documentation of scientific workflows and associated information. Based on recorded metadata, it provides explicit information about the relationships among the elements of workflows in notebook form augmented with directed acyclic graphs. A set of web-based graphical navigation tools and Application Programming Interface (API) have been created for searching and browsing, as well as programmatically accessing the workflows and data. We describe the MPO concepts and its software architecture. We also report the current status of the software as well as the initial deployment experience.

  4. Workflow with pitfalls to derive a regional airborne magnetic compilation

    Science.gov (United States)

    Brönner, Marco; Baykiev, Eldar; Ebbing, Jörg

    2017-04-01

    Today, large-scale magnetic maps are usually a patchwork of airborne surveys of different sizes, resolutions and vintages. Airborne magnetic acquisition is a fast and economic method to map and gain geological and tectonic information over large areas, onshore and offshore. Depending on the aim of a survey, acquisition parameters such as altitude and profile spacing are usually adjusted to match the purpose of the investigation. The subsequent data processing commonly follows a standardized workflow comprising core-field subtraction and line leveling to yield a coherent crustal-field magnetic grid for a survey area. The resulting data make it possible to correlate with geological and tectonic features in the subsurface, which is of importance for e.g. oil and mineral exploration. Crustal-scale magnetic interpretation and modeling demand regional compilations of magnetic data and the merging of adjacent magnetic surveys. These studies not only focus on shallower sources, reflected by short- to intermediate-wavelength magnetic anomalies, but also have a particular interest in the long wavelengths deriving from deep-seated sources. However, whilst the workflow to produce such a merger is supported by quite a few powerful routines, the resulting compilation contains several pitfalls and limitations which have been discussed before but are still little recognized. The maximum wavelength that can be resolved in each individual survey is directly related to the survey size, and consequently a merger will contribute erroneous long-wavelength components to the magnetic data compilation. To minimize this problem and to homogenize the longer wavelengths, a first-order approach is to combine the airborne compilation with satellite magnetic data, which is sufficient only under particular preconditions. A more advanced approach considers the gap in frequencies between airborne and satellite data, which motivated

  5. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    Rzehorz, Gerhard Ferdinand; The ATLAS collaboration

    2016-01-01

    This contribution reports on the feasibility of executing data intensive workflows on Cloud infrastructures. In order to assess this, the metric ETC = Events/Time/Cost is formed, which quantifies the different workflow and infrastructure configurations that are tested against each other. In these tests ATLAS reconstruction Jobs are run, examining the effects of overcommitting (more parallel processes running than CPU cores available), scheduling (staggered execution) and scaling (number of cores). The desirability of commissioning storage in the cloud is evaluated, in conjunction with a simple analytical model of the system, and correlated with questions about the network bandwidth, caches and what kind of storage to utilise. In the end a cost/benefit evaluation of different infrastructure configurations and workflows is undertaken, with the goal to find the maximum of the ETC value
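
    The ETC figure of merit lends itself to a quick back-of-the-envelope comparison. The sketch below only illustrates how such a metric could be computed for two hypothetical infrastructure configurations; the event counts, run times and costs are invented, not results from the study.

```python
# Hedged sketch: comparing two hypothetical Cloud configurations with the
# ETC = Events/Time/Cost figure of merit mentioned above (higher is better).
# All numbers are invented for illustration only.

def etc(events, wall_time_hours, total_cost):
    return events / wall_time_hours / total_cost

configs = {
    "8 cores, no overcommit": {"events": 10_000, "wall_time_hours": 5.0, "total_cost": 12.0},
    "8 cores, 2x overcommit": {"events": 16_000, "wall_time_hours": 7.0, "total_cost": 12.0},
}

for name, c in configs.items():
    print(f"{name}: ETC = {etc(**c):.1f}")
```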

  6. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00396985; The ATLAS collaboration; Keeble, Oliver; Quadt, Arnulf; Kawamura, Gen

    2017-01-01

    This contribution reports on the feasibility of executing data intensive workflows on Cloud infrastructures. In order to assess this, the metric ETC = Events/Time/Cost is formed, which quantifies the different workflow and infrastructure configurations that are tested against each other. In these tests ATLAS reconstruction Jobs are run, examining the effects of overcommitting (more parallel processes running than CPU cores available), scheduling (staggered execution) and scaling (number of cores). The desirability of commissioning storage in the Cloud is evaluated, in conjunction with a simple analytical model of the system, and correlated with questions about the network bandwidth, caches and what kind of storage to utilise. In the end a cost/benefit evaluation of different infrastructure configurations and workflows is undertaken, with the goal to find the maximum of the ETC value.

  7. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    Rzehorz, Gerhard Ferdinand; The ATLAS collaboration

    2018-01-01

    From 2025 onwards, the ATLAS collaboration at the Large Hadron Collider (LHC) at CERN will experience a massive increase in data quantity as well as complexity. Including mitigating factors, the prevalent computing power by that time will only fulfil one tenth of the requirement. This contribution will focus on Cloud computing as an approach to help overcome this challenge by providing flexible hardware that can be configured to the specific needs of a workflow. Experience with Cloud computing exists, but there is large uncertainty about whether, and to what degree, it will be able to reduce the burden by 2025. In order to understand and quantify the benefits of Cloud computing, the "Workflow and Infrastructure Model" was created. It estimates the viability of Cloud computing by combining different inputs from the workflow side with infrastructure specifications. The model delivers metrics that enable the comparison of different Cloud configurations as well as different Cloud offerings with each other. A wide range of r...

  8. Logical provenance in data-oriented workflows?

    KAUST Repository

    Ikeda, R.

    2013-04-01

    We consider the problem of defining, generating, and tracing provenance in data-oriented workflows, in which input data sets are processed by a graph of transformations to produce output results. We first give a new general definition of provenance for general transformations, introducing the notions of correctness, precision, and minimality. We then determine when properties such as correctness and minimality carry over from the individual transformations' provenance to the workflow provenance. We describe a simple logical-provenance specification language consisting of attribute mappings and filters. We provide an algorithm for provenance tracing in workflows where logical provenance for each transformation is specified using our language. We consider logical provenance in the relational setting, observing that for a class of Select-Project-Join (SPJ) transformations, logical provenance specifications encode minimal provenance. We have built a prototype system supporting the features and algorithms presented in the paper, and we report a few preliminary experimental results. © 2013 IEEE.
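
    As a rough illustration of tracing provenance with attribute mappings and filters (a sketch only, not the specification language defined in the paper; the relation, attributes and transformation are invented), the input tuples behind an output tuple of a Select-Project transformation can be recovered like this:

```python
# Hedged sketch: backward provenance tracing through one transformation using an
# attribute mapping and a row filter. The relation, attribute names and mapping
# format are illustrative assumptions, not the paper's specification language.

def trace_provenance(output_tuple, input_relation, attribute_mapping, row_filter):
    """Return input tuples that could have contributed to output_tuple."""
    provenance = []
    for row in input_relation:
        if not row_filter(row):
            continue  # the transformation never kept this row
        if all(output_tuple[out] == row[src] for out, src in attribute_mapping.items()):
            provenance.append(row)
    return provenance

orders = [
    {"order_id": 1, "customer": "acme", "total": 90},
    {"order_id": 2, "customer": "acme", "total": 250},
]
# A Select-Project transformation: keep orders over 100, project customer and total.
output = {"customer": "acme", "total": 250}
print(trace_provenance(output, orders,
                       attribute_mapping={"customer": "customer", "total": "total"},
                       row_filter=lambda r: r["total"] > 100))
```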

  9. Impact of CGNS on CFD Workflow

    Science.gov (United States)

    Poinot, M.; Rumsey, C. L.; Mani, M.

    2004-01-01

    CFD tools are an integral part of industrial and research processes, for which the amount of data is increasing at a high rate. These data are used in a multi-disciplinary fluid dynamics environment, including structural, thermal, chemical or even electrical topics. We show that the data specification is an important challenge that must be tackled to achieve an efficient workflow for use in this environment. We compare the process with other software techniques, such as network or database type, where past experiences showed how difficult it was to bridge the gap between completely general specifications and dedicated specific applications. We show two aspects of the use of CFD General Notation System (CGNS) that impact CFD workflow: as a data specification framework and as a data storage means. Then, we give examples of projects involving CFD workflows where the use of the CGNS standard leads to a useful method either for data specification, exchange, or storage.

  10. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of realworld business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... of events, reward-based properties and best- and worst- case scenarios. We develop a simple example of medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...... for more complex annotations and ultimately to automatically synthesise workflows by composing predefined sub-processes, in order to achieve a configuration that is optimal for parameters of interest....
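
    The flavour of the analysis can be conveyed with a toy calculation: given branch probabilities and per-branch drug consumption, the expected reward determines how much stock to provision. The probabilities and doses below are invented for illustration and are unrelated to the models in the paper.

```python
# Hedged sketch: expected reward of a tiny probabilistic workflow, in the spirit
# of the reward-annotated BPMN analysis described above. Branch probabilities
# and drug-dose "rewards" are invented.

branches = [
    # (probability of the branch, drug units consumed along it)
    (0.70, 1),  # standard treatment path
    (0.25, 3),  # complication path, higher dose
    (0.05, 0),  # discharged without treatment
]

expected_dose = sum(p * reward for p, reward in branches)
patients_per_week = 120
print("expected dose per patient:", expected_dose)
print("weekly stock to provision:", expected_dose * patients_per_week)
```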

  11. A virtual radiation therapy workflow training simulation

    International Nuclear Information System (INIS)

    Bridge, P.; Crowe, S.B.; Gibson, G.; Ellemor, N.J.; Hargrave, C.; Carmichael, M.

    2016-01-01

    Aim: Simulation forms an increasingly vital component of clinical skills development in a wide range of professional disciplines. Simulation of clinical techniques and equipment is designed to better prepare students for placement by providing an opportunity to learn technical skills in a “safe” academic environment. In radiotherapy training over the last decade or so this has predominantly comprised treatment planning software and small ancillary equipment such as mould room apparatus. Recent virtual reality developments have dramatically changed this approach. Innovative new simulation applications and file processing and interrogation software have helped to fill in the gaps to provide a streamlined virtual workflow solution. This paper outlines the innovations that have enabled this, along with an evaluation of the impact on students and educators. Method: Virtual reality software and workflow applications have been developed to enable the following steps of radiation therapy to be simulated in an academic environment: CT scanning using a 3D virtual CT scanner simulation; batch CT duplication; treatment planning; 3D plan evaluation using a virtual linear accelerator; quantitative plan assessment, patient setup with lasers; and image guided radiotherapy software. Results: Evaluation of the impact of the virtual reality workflow system highlighted substantial time saving for academic staff as well as positive feedback from students relating to preparation for clinical placements. Students valued practice in the “safe” environment and the opportunity to understand the clinical workflow ahead of clinical department experience. Conclusion: Simulation of most of the radiation therapy workflow and tasks is feasible using a raft of virtual reality simulation applications and supporting software. Benefits of this approach include time-saving, embedding of a case-study based approach, increased student confidence, and optimal use of the clinical environment

  12. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of realworld business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... of events, reward-based properties and best- and worst- case scenarios. We develop a simple example of medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...

  13. A Prudent Approach to Fair Use Workflow

    Directory of Open Access Journals (Sweden)

    Karey Patterson

    2018-02-01

    Full Text Available This poster will outline a new highly efficient workflow for the management of copyright materials that is prudent and accommodates generally and legally accepted Fair Use limits. The workflow allows library or copyright staff an easy means to keep on top of their copyright obligations, manage licenses and review and adjust schedules but is still a highly efficient means to cope with large numbers of requests to use materials. The poster details speed and efficiency gains for professors and library staff while reducing legal exposure.

  14. Petri Net-Based Workflow Access Control Model

    Institute of Scientific and Technical Information of China (English)

    陈卓; 骆婷; 石磊; 洪帆

    2004-01-01

    Access control is an important protection mechanism for information systems. This paper shows how to implement access control in a workflow system. We give a workflow access control model (WACM) based on several current access control models. The model supports role assignment and dynamic authorization. The paper defines the workflow using Petri nets. It first gives the definition and description of the workflow, and then analyzes the architecture of the workflow access control model (WACM). Finally, an example of an e-commerce workflow access control model is discussed in detail.
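
    As a rough illustration of the idea (a sketch only, not the WACM formalism; the places, transitions and roles are invented), a Petri-net workflow can guard every transition with a role so that it fires only when the marking enables it and the requesting user holds that role:

```python
# Hedged sketch: a toy Petri-net workflow in which each transition is guarded by
# a role, loosely in the spirit of the access control model described above.
# Place, transition and role names are illustrative assumptions.

marking = {"order_received": 1, "approved": 0, "shipped": 0}

transitions = {
    # name      consumed places         produced places   required role
    "approve": ({"order_received": 1}, {"approved": 1},   "manager"),
    "ship":    ({"approved": 1},       {"shipped": 1},    "clerk"),
}

user_roles = {"alice": {"manager"}, "bob": {"clerk"}}

def fire(name, user):
    consume, produce, role = transitions[name]
    if role not in user_roles.get(user, set()):
        return f"denied: {user} lacks role {role}"
    if any(marking[place] < tokens for place, tokens in consume.items()):
        return f"blocked: {name} is not enabled"
    for place, tokens in consume.items():
        marking[place] -= tokens
    for place, tokens in produce.items():
        marking[place] += tokens
    return f"{name} fired by {user}"

print(fire("ship", "bob"))       # blocked: no approval yet
print(fire("approve", "bob"))    # denied: bob is not a manager
print(fire("approve", "alice"))  # fires
print(fire("ship", "bob"))       # now fires
```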

  15. Soundness of Timed-Arc Workflow Nets

    DEFF Research Database (Denmark)

    Mateo, Jose Antonio; Srba, Jiri; Sørensen, Mathias Grund

    2014-01-01

    , we demonstrate the usability of our theory on the case studies of a Brake System Control Unit used in aircraft certification, the MPEG2 encoding algorithm, and a blood transfusion workflow. The implementation of the algorithms is freely available as a part of the model checker TAPAAL....

  16. Distributed interoperable workflow support for electronic commerce

    NARCIS (Netherlands)

    Papazoglou, M.; Jeusfeld, M.A.; Weigand, H.; Jarke, M.

    1998-01-01

    Abstract. This paper describes a flexible distributed transactional workflow environment based on an extensible object-oriented framework built around class libraries, application programming interfaces, and shared services. The purpose of this environment is to support a range of EC-like business

  17. Using workflow for projects in higher education

    NARCIS (Netherlands)

    van der Veen, Johan (CTIT); Jones, Valerie M.; Collis, Betty

    2000-01-01

    The WWW is increasingly used as a medium to support education and training. A course at the University of Twente in which groups of students collaborate in the design and production of multimedia instructional materials has now been supported by a website since 1995. Workflow was integrated with

  18. Workflow Automation: A Collective Case Study

    Science.gov (United States)

    Harlan, Jennifer

    2013-01-01

    Knowledge management has proven to be a sustainable competitive advantage for many organizations. Knowledge management systems are abundant, with multiple functionalities. The literature reinforces the use of workflow automation with knowledge management systems to benefit organizations; however, it was not known if process automation yielded…

  19. On Secure Workflow Decentralisation on the Internet

    Directory of Open Access Journals (Sweden)

    Petteri Kaskenpalo

    2010-06-01

    Full Text Available Decentralised workflow management systems are a new research area, where most work to date has focused on the system's overall architecture. As little attention has been given to the security aspects of such systems, we follow a security-driven approach and consider, from the perspective of available security building blocks, how security can be implemented and what new opportunities are presented when empowering the decentralised environment with modern distributed security protocols. Our research is motivated by a more general question of how to combine the positive enablers that email exchange enjoys with the general benefits of workflow systems, and more specifically with the benefits that can be introduced in a decentralised environment. This aims to equip email users with a set of tools to manage the semantics of a message exchange, its contents, participants and their roles in the exchange, in an environment that provides inherent assurances of security and privacy. This work is based on a survey of contemporary distributed security protocols and considers how these protocols could be used in implementing a distributed workflow management system with decentralised control. We review a set of these protocols, focusing on the required message sequences, and discuss how these security protocols provide the foundations for implementing core control-flow, data, and resource patterns in a distributed workflow environment.

  20. Text mining for the biocuration workflow.

    Science.gov (United States)

    Hirschman, Lynette; Burns, Gully A P C; Krallinger, Martin; Arighi, Cecilia; Cohen, K Bretonnel; Valencia, Alfonso; Wu, Cathy H; Chatr-Aryamontri, Andrew; Dowell, Karen G; Huala, Eva; Lourenço, Anália; Nash, Robert; Veuthey, Anne-Lise; Wiegers, Thomas; Winter, Andrew G

    2012-01-01

    Molecular biology has become heavily dependent on biological knowledge encoded in expert curated biological databases. As the volume of biological literature increases, biocurators need help in keeping up with the literature; (semi-) automated aids for biocuration would seem to be an ideal application for natural language processing and text mining. However, to date, there have been few documented successes for improving biocuration throughput using text mining. Our initial investigations took place for the workshop on 'Text Mining for the BioCuration Workflow' at the third International Biocuration Conference (Berlin, 2009). We interviewed biocurators to obtain workflows from eight biological databases. This initial study revealed high-level commonalities, including (i) selection of documents for curation; (ii) indexing of documents with biologically relevant entities (e.g. genes); and (iii) detailed curation of specific relations (e.g. interactions); however, the detailed workflows also showed many variabilities. Following the workshop, we conducted a survey of biocurators. The survey identified biocurator priorities, including the handling of full text indexed with biological entities and support for the identification and prioritization of documents for curation. It also indicated that two-thirds of the biocuration teams had experimented with text mining and almost half were using text mining at that time. Analysis of our interviews and survey provide a set of requirements for the integration of text mining into the biocuration workflow. These can guide the identification of common needs across curated databases and encourage joint experimentation involving biocurators, text mining developers and the larger biomedical research community.

  1. Conventions and workflows for using Situs

    International Nuclear Information System (INIS)

    Wriggers, Willy

    2012-01-01

    Recent developments of the Situs software suite for multi-scale modeling are reviewed. Typical workflows and conventions encountered during processing of biophysical data from electron microscopy, tomography or small-angle X-ray scattering are described. Situs is a modular program package for the multi-scale modeling of atomic resolution structures and low-resolution biophysical data from electron microscopy, tomography or small-angle X-ray scattering. This article provides an overview of recent developments in the Situs package, with an emphasis on workflows and conventions that are important for practical applications. The modular design of the programs facilitates scripting in the bash shell that allows specific programs to be combined in creative ways that go beyond the original intent of the developers. Several scripting-enabled functionalities, such as flexible transformations of data type, the use of symmetry constraints or the creation of two-dimensional projection images, are described. The processing of low-resolution biophysical maps in such workflows follows not only first principles but often relies on implicit conventions. Situs conventions related to map formats, resolution, correlation functions and feature detection are reviewed and summarized. The compatibility of the Situs workflow with CCP4 conventions and programs is discussed

  2. Adaptive workflow simulation of emergency response

    NARCIS (Netherlands)

    Bruinsma, Guido Wybe Jan

    2010-01-01

    Recent incidents and major training exercises in and outside the Netherlands have persistently shown that not having or not sharing information during emergency response are major sources of emergency response inefficiency and error, and affect incident mitigation outcomes through workflow planning

  3. From Requirements via Colored Workflow Nets to an Implementation in Several Workflow Systems

    DEFF Research Database (Denmark)

    Mans, Ronnie S:; van der Aalst, Wil M.P.; Bakker, Piet J.M.

    2007-01-01

    Care organizations, such as hospitals, need to support complex and dynamic workflows. Moreover, many disciplines are involved. This makes it important to avoid the typical disconnect between requirements and the actual implementation of the system. This paper proposes an approach where an Executable Use Case (EUC) and Colored Workflow Net (CWN) are used to close the gap between the given requirements specification and the realization of these requirements with the help of a workflow system. This paper describes a large case study where the diagnostic trajectory of the gynaecological oncology care process of the Academic Medical Center (AMC) hospital is used as the reference process. The process consists of hundreds of activities. These have been modeled and analyzed using an EUC and a CWN. Moreover, based on the CWN, the process has been implemented using four different workflow systems.

  4. Improving Radiology Workflow with Automated Examination Tracking and Alerts.

    Science.gov (United States)

    Pianykh, Oleg S; Jaworsky, Christina; Shore, M T; Rosenthal, Daniel I

    2017-07-01

    The modern radiology workflow is a production line where imaging examinations pass in sequence through many steps. In busy clinical environments, even a minor delay in any step can propagate through the system and significantly lengthen the examination process. This is particularly true for the tasks delegated to the human operators, who may be distracted or stressed. We have developed an application to track examinations through a critical part of the workflow, from the image-acquisition scanners to the PACS archive. Our application identifies outliers and actively alerts radiology managers about the need to resolve these problems as soon as they happen. In this study, we investigate how this real-time tracking and alerting affected the speed of examination delivery to the radiologist. We demonstrate that active alerting produced a 3-fold reduction of examination-to-PACS delays. Additionally, we discover an overall improvement in examination-to-PACS delivery, evidence that the tracking and alerts instill a culture where timely processing is essential. By providing supervisors with information about exactly where delays emerge in their workflow and alerting the correct staff to take action, applications like ours create more robust radiology workflow with predictable, timely outcomes. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.
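
    A minimal sketch of this kind of outlier detection and alerting is given below; the delay threshold, examination records and notification hook are invented for illustration and are not the application described in the paper.

```python
# Hedged sketch: flag examinations whose scanner-to-PACS delay exceeds a
# threshold and alert a manager. Threshold, records and the notification
# function are illustrative assumptions.
from datetime import datetime, timedelta

DELAY_THRESHOLD = timedelta(minutes=30)  # assumed service target

examinations = [
    {"accession": "A1001", "acquired": datetime(2017, 3, 1, 9, 0),
     "archived": datetime(2017, 3, 1, 9, 12)},
    {"accession": "A1002", "acquired": datetime(2017, 3, 1, 9, 5),
     "archived": None},  # still not in the PACS archive
]

def notify_manager(message):
    print("ALERT:", message)  # stand-in for e-mail or paging integration

def check_delays(now):
    for exam in examinations:
        end = exam["archived"] or now
        delay = end - exam["acquired"]
        if delay > DELAY_THRESHOLD:
            notify_manager(f"{exam['accession']} delayed {delay} between scanner and PACS")

check_delays(now=datetime(2017, 3, 1, 9, 40))
```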

  5. Workflows in bioinformatics: meta-analysis and prototype implementation of a workflow generator

    Directory of Open Access Journals (Sweden)

    Thoraval Samuel

    2005-04-01

    Full Text Available Abstract Background Computational methods for problem solving need to interleave information access and algorithm execution in a problem-specific workflow. The structures of these workflows are defined by a scaffold of syntactic, semantic and algebraic objects capable of representing them. Despite the proliferation of GUIs (Graphic User Interfaces) in bioinformatics, only some of them provide workflow capabilities; surprisingly, no meta-analysis of workflow operators and components in bioinformatics has been reported. Results We present a set of syntactic components and algebraic operators capable of representing analytical workflows in bioinformatics. Iteration, recursion, the use of conditional statements, and management of suspend/resume tasks have traditionally been implemented on an ad hoc basis and hard-coded; by having these operators properly defined it is possible to use and parameterize them as generic re-usable components. To illustrate how these operations can be orchestrated, we present GPIPE, a prototype graphic pipeline generator for PISE that allows the definition of a pipeline, parameterization of its component methods, and storage of metadata in XML formats. This implementation goes beyond the macro capacities currently in PISE. As the entire analysis protocol is defined in XML, a complete bioinformatic experiment (linked sets of methods, parameters and results) can be reproduced or shared among users. Availability: http://if-web1.imb.uq.edu.au/Pise/5.a/gpipe.html (interactive), ftp://ftp.pasteur.fr/pub/GenSoft/unix/misc/Pise/ (download). Conclusion From our meta-analysis we have identified syntactic structures and algebraic operators common to many workflows in bioinformatics. The workflow components and algebraic operators can be assimilated into re-usable software components. GPIPE, a prototype implementation of this framework, provides a GUI builder to facilitate the generation of workflows and integration of heterogeneous

  6. Performing Workflows in Pervasive Environments Based on Context Specifications

    OpenAIRE

    Xiping Liu; Jianxin Chen

    2010-01-01

    Workflow performance consists of the performance of activities and of the transitions between activities. With the fast development of varied computing devices, activities in workflows and the transitions between them can be performed in pervasive ways, which means that workflow performance needs to migrate from traditional computing environments to pervasive environments. Performing workflows in pervasive environments requires taking account of the context information which affects b...

  7. Workflow Support for Advanced Grid-Enabled Computing

    OpenAIRE

    Xu, Fenglian; Eres, M.H.; Tao, Feng; Cox, Simon J.

    2004-01-01

    The Geodise project brings the skills of computer scientists and engineers together to build a service-oriented computing environment in which engineers can perform complicated computations on a distributed system. The workflow tool is a front-end GUI providing the full life cycle of workflow functions for Grid-enabled computing. These workflow functions have been enhanced based on our initial research and development. The life cycle starts with the composition of a workflow, followed by an ins...

  8. Optimized workflow and imaging protocols for whole-body oncologic PET/MRI.

    Science.gov (United States)

    Ishii, Shirou; Hara, Takamitsu; Nanbu, Takeyuki; Suenaga, Hiroki; Sugawara, Shigeyasu; Kuroiwa, Daichi; Sekino, Hirofumi; Miyajima, Masayuki; Kubo, Hitoshi; Oriuchi, Noboru; Ito, Hiroshi

    2016-11-01

    Although PET/MRI has the advantages of simultaneous acquisition of PET and MRI, high soft-tissue contrast of the MRI images, and reduced radiation exposure, its low profitability and long acquisition time are significant problems in clinical settings. Thus, MRI protocols that meet oncological purposes need to be used in order to reduce examination time while securing detectability. Currently, half-Fourier acquisition single-shot turbo spin echo and 3D-T1 volumetric interpolated breath-hold examination may be the most commonly used sequences for whole-body imaging due to their shorter acquisition time and higher diagnostic accuracy. Although several reports to date have found that adding diffusion-weighted imaging (DWI) to the PET/MRI protocol has no effect on tumor detection, the use of DWI may be beneficial in detecting lesions in cases of liver, kidney, bladder, and prostate cancer. Another possible option is to scan each region with different MRI sequences instead of scanning the whole body using one sequence continuously. We herein report a workflow and imaging protocols for whole-body oncologic PET/MRI using an integrated system in the clinical routine, designed for the detection, for example by cancer screening, of metastatic lesions, in order to help future users optimize their workflow and imaging protocols.

  9. On Lifecycle Constraints of Artifact-Centric Workflows

    Science.gov (United States)

    Kucukoguz, Esra; Su, Jianwen

    Data plays a fundamental role in modeling and management of business processes and workflows. Among the recent "data-aware" workflow models, artifact-centric models are particularly interesting. (Business) artifacts are the key data entities that are used in workflows and can reflect both the business logic and the execution states of a running workflow. The notion of artifacts succinctly captures the fluidity aspect of data during workflow executions. However, much of the technical dimension concerning artifacts in workflows is not well understood. In this paper, we study a key concept of an artifact "lifecycle". In particular, we allow declarative specifications/constraints of artifact lifecycle in the spirit of DecSerFlow, and formulate the notion of lifecycle as the set of all possible paths an artifact can navigate through. We investigate two technical problems: (Compliance) does a given workflow (schema) contain only lifecycle allowed by a constraint? And (automated construction) from a given lifecycle specification (constraint), is it possible to construct a "compliant" workflow? The study is based on a new formal variant of artifact-centric workflow model called "ArtiNets" and two classes of lifecycle constraints named "regular" and "counting" constraints. We present a range of technical results concerning compliance and automated construction, including: (1) compliance is decidable when workflow is atomic or constraints are regular, (2) for each constraint, we can always construct a workflow that satisfies the constraint, and (3) sufficient conditions where atomic workflows can be constructed.
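
    One way to picture the compliance question is to express a "regular" lifecycle constraint as a regular expression over lifecycle states and check every path the workflow allows against it. The sketch below does this for a toy order artifact; the states, paths and constraint are invented and this is not the ArtiNet formalism.

```python
# Hedged sketch: checking a toy workflow's artifact paths against a "regular"
# lifecycle constraint written as a regular expression over state names.
# States, paths and the constraint are illustrative assumptions.
import re

# Constraint: an order is created, may be revised any number of times, and must
# end either paid or cancelled.
constraint = re.compile(r"created(,revised)*,(paid|cancelled)")

# All lifecycle paths the (toy) workflow schema can generate for the artifact.
workflow_paths = [
    ["created", "paid"],
    ["created", "revised", "revised", "paid"],
    ["created", "cancelled"],
    ["created", "shipped"],  # violates the constraint
]

def compliant(paths):
    return all(constraint.fullmatch(",".join(path)) for path in paths)

for path in workflow_paths:
    print(",".join(path), "->", bool(constraint.fullmatch(",".join(path))))
print("workflow compliant:", compliant(workflow_paths))
```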

  10. WS-VLAM: A GT4 based workflow management system

    NARCIS (Netherlands)

    Wibisono, A.; Vasyunin, D.; Korkhov, V.; Zhao, Z.; Belloum, A.; de Laat, C.; Adriaans, P.; Hertzberger, B.

    2007-01-01

    Generic Grid middleware, e.g., Globus Toolkit 4 (GT4), provides basic services for scientific workflow management systems to discover, store and integrate workflow components. Using the state of the art Grid services can advance the functionality of workflow engine in orchestrating distributed Grid

  11. Optimal resource assignment in workflows for maximizing cooperation

    NARCIS (Netherlands)

    Kumar, Akhil; Dijkman, R.M.; Song, Minseok; Daniel, Fl.; Wang, J.; Weber, B.

    2013-01-01

    A workflow is a team process since many actors work on various tasks to complete an instance. Resource management in such workflows deals with assignment of tasks to workers or actors. In team formation, it is necessary to ensure that members of a team are compatible with each other. When a workflow

  12. Database Support for Workflow Management: The WIDE Project

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Pernici, B; Sánchez, G.; Unknown, [Unknown

    1999-01-01

    Database Support for Workflow Management: The WIDE Project presents the results of the ESPRIT WIDE project on advanced database support for workflow management. The book discusses the state of the art in combining database management and workflow management technology, especially in the areas of

  13. Workflow-Oriented Cyberinfrastructure for Sensor Data Analytics

    Science.gov (United States)

    Orcutt, J. A.; Rajasekar, A.; Moore, R. W.; Vernon, F.

    2015-12-01

    Sensor streams comprise an increasingly large part of Earth Science data. Analytics based on sensor data require an easy way to perform operations such as acquisition, conversion to physical units, metadata linking, sensor fusion, analysis and visualization on distributed sensor streams. Furthermore, embedding real-time sensor data into scientific workflows is of growing interest. We have implemented a scalable networked architecture that can be used to dynamically access packets of data in a stream from multiple sensors, and perform synthesis and analysis across a distributed network. Our system is based on the integrated Rule Oriented Data System (irods.org), which accesses sensor data from the Antelope Real Time Data System (brtt.com), and provides virtualized access to collections of data streams. We integrate real-time data streaming from different sources, collected for different purposes, on different time and spatial scales, and sensed by different methods. iRODS, noted for its policy-oriented data management, brings to sensor processing features and facilities such as single sign-on, third-party access control lists (ACLs), location transparency, logical resource naming, and server-side modeling capabilities while reducing the burden on sensor network operators. Rich integrated metadata support also makes it straightforward to discover data streams of interest and maintain data provenance. The workflow support in iRODS readily integrates sensor processing into any analytical pipeline. The system is developed as part of the NSF-funded Datanet Federation Consortium (datafed.org). APIs for selecting, opening, reaping and closing sensor streams are provided, along with other helper functions to associate metadata and convert sensor packets into NetCDF and JSON formats. Near real-time sensor data including seismic sensors, environmental sensors, LIDAR and video streams are available through this interface. A system for archiving sensor data and metadata in Net

  14. Grid workflow job execution service 'Pilot'

    Science.gov (United States)

    Shamardin, Lev; Kryukov, Alexander; Demichev, Andrey; Ilyin, Vyacheslav

    2011-12-01

    'Pilot' is a grid job execution service for workflow jobs. The main goal for the service is to automate computations with multiple stages since they can be expressed as simple workflows. Each job is a directed acyclic graph of tasks and each task is an execution of something on a grid resource (or 'computing element'). Tasks may be submitted to any WS-GRAM (Globus Toolkit 4) service. The target resources for the tasks execution are selected by the Pilot service from the set of available resources which match the specific requirements from the task and/or job definition. Some simple conditional execution logic is also provided. The 'Pilot' service is built on the REST concepts and provides a simple API through authenticated HTTPS. This service is deployed and used in production in a Russian national grid project GridNNN.
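
    Since the service exposes a REST API over authenticated HTTPS, a client could in principle describe a job as a small DAG of tasks and POST it. The sketch below shows such a client; the endpoint URL, JSON payload schema and token-based authentication are purely hypothetical and are not the documented Pilot API.

```python
# Hedged sketch: submitting a two-stage workflow job, expressed as a DAG of
# tasks, to a REST service over authenticated HTTPS. The endpoint, payload
# schema and authentication header are hypothetical, not the real Pilot API.
import json
import urllib.request

job = {
    "name": "reco-chain",
    "tasks": [
        {"id": "stage-in",    "executable": "/bin/copy-input", "depends_on": []},
        {"id": "reconstruct", "executable": "/bin/reco",       "depends_on": ["stage-in"]},
    ],
}

def submit(job, url="https://pilot.example.org/jobs", token="SECRET"):
    request = urllib.request.Request(
        url,
        data=json.dumps(job).encode(),
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {token}"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

# submit(job)  # would return something like {"job_id": "..."} from the hypothetical service
```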

  15. Grid workflow job execution service 'Pilot'

    International Nuclear Information System (INIS)

    Shamardin, Lev; Kryukov, Alexander; Demichev, Andrey; Ilyin, Vyacheslav

    2011-01-01

    'Pilot' is a grid job execution service for workflow jobs. The main goal for the service is to automate computations with multiple stages since they can be expressed as simple workflows. Each job is a directed acyclic graph of tasks and each task is an execution of something on a grid resource (or 'computing element'). Tasks may be submitted to any WS-GRAM (Globus Toolkit 4) service. The target resources for the tasks execution are selected by the Pilot service from the set of available resources which match the specific requirements from the task and/or job definition. Some simple conditional execution logic is also provided. The 'Pilot' service is built on the REST concepts and provides a simple API through authenticated HTTPS. This service is deployed and used in production in a Russian national grid project GridNNN.

  16. Workflow optimization beyond RIS and PACS

    International Nuclear Information System (INIS)

    Treitl, M.; Wirth, S.; Lucke, A.; Nissen-Meyer, S.; Trumm, C.; Rieger, J.; Pfeifer, K.-J.; Reiser, M.; Villain, S.

    2005-01-01

    Technological progress and rising cost pressure on the healthcare system have led to a drastic change in the work environment of radiologists. The pervasive demand for workflow optimization and increased efficiency raises the question of whether the employment of electronic systems such as RIS and PACS exploits the potential of digital technology sufficiently to fulfil this demand. This report describes tasks and structures in radiology departments which are substantial but so far only insufficiently supported by commercially available electronic systems. We developed and employed a web-based, integrated workplace system which, beyond the well-established tasks of documentation, simplifies many daily tasks of departmental organization and administration. Furthermore, we analyzed the effects of employing this system on departmental workflow over 3 years. (orig.) [de]

  17. Designing Flexible E-Business Workflow Systems

    OpenAIRE

    Cătălin Silvestru; Codrin Nisioiu; Marinela Mircea; Bogdan Ghilic-Micu; Marian Stoica

    2010-01-01

    In today’s business environment, organizations must cope with complex interactions between actors, adapt quickly to frequent market changes and be innovative. In this context, integrating knowledge with processes and Business Intelligence is a major step towards improving organizational agility. Therefore, traditional environments for workflow design have been adapted to answer the new business models and current requirements in the field of collaborative processes. This paper approaches the design o...

  18. Planning bioinformatics workflows using an expert system

    Science.gov (United States)

    Chen, Xiaoling; Chang, Jeffrey T.

    2017-01-01

    Abstract Motivation: Bioinformatic analyses are becoming formidably more complex due to the increasing number of steps required to process the data, as well as the proliferation of methods that can be used in each step. To alleviate this difficulty, pipelines are commonly employed. However, pipelines are typically implemented to automate a specific analysis, and thus are difficult to use for exploratory analyses requiring systematic changes to the software or parameters used. Results: To automate the development of pipelines, we have investigated expert systems. We created the Bioinformatics ExperT SYstem (BETSY) that includes a knowledge base where the capabilities of bioinformatics software are explicitly and formally encoded. BETSY is a backwards-chaining rule-based expert system comprising a data model that can capture the richness of biological data, and an inference engine that reasons on the knowledge base to produce workflows. Currently, the knowledge base is populated with rules to analyze microarray and next generation sequencing data. We evaluated BETSY and found that it could generate workflows that reproduce and go beyond previously published bioinformatics results. Finally, a meta-investigation of the workflows generated from the knowledge base produced a quantitative measure of the technical burden imposed by each step of bioinformatics analyses, revealing the large number of steps devoted to the pre-processing of data. In sum, an expert system approach can facilitate exploratory bioinformatic analysis by automating the development of workflows, a task that requires significant domain expertise. Availability and Implementation: https://github.com/jefftc/changlab Contact: jeffrey.t.chang@uth.tmc.edu PMID:28052928
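
    A backwards-chaining planner of this kind can be pictured with a few rules stating which data type each tool produces and what it consumes; the engine then reasons from the requested result back to the available inputs. The sketch below is a heavily simplified illustration with invented tool names and data types, not BETSY's actual knowledge base or inference engine.

```python
# Hedged sketch: a tiny backwards-chaining planner that assembles a workflow by
# reasoning from the requested data type back to the available inputs. Rules,
# tool names and data types are invented.

# Each rule: produced data type -> (tool name, required input types).
rules = {
    "aligned_reads":    ("bowtie_align", ["fastq", "reference_genome"]),
    "expression_table": ("count_reads",  ["aligned_reads", "gene_annotation"]),
    "de_genes":         ("run_deseq",    ["expression_table"]),
}
available = {"fastq", "reference_genome", "gene_annotation"}

def plan(goal, steps=None):
    """Return an ordered list of (tool, output) steps that yields `goal`."""
    steps = [] if steps is None else steps
    if goal in available:
        return steps
    tool, inputs = rules[goal]
    for needed in inputs:
        plan(needed, steps)
    if (tool, goal) not in steps:
        steps.append((tool, goal))
    return steps

print(plan("de_genes"))
# [('bowtie_align', 'aligned_reads'), ('count_reads', 'expression_table'),
#  ('run_deseq', 'de_genes')]
```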

  19. SPECT/CT workflow and imaging protocols

    Energy Technology Data Exchange (ETDEWEB)

    Beckers, Catherine [University Hospital of Liege, Division of Nuclear Medicine and Oncological Imaging, Department of Medical Physics, Liege (Belgium); Hustinx, Roland [University Hospital of Liege, Division of Nuclear Medicine and Oncological Imaging, Department of Medical Physics, Liege (Belgium); Domaine Universitaire du Sart Tilman, Service de Medecine Nucleaire et Imagerie Oncologique, CHU de Liege, Liege (Belgium)

    2014-05-15

    Introducing a hybrid imaging method such as single photon emission computed tomography (SPECT)/CT greatly alters the routine in the nuclear medicine department. It requires designing new workflow processes and the revision of original scheduling process and imaging protocols. In addition, the imaging protocol should be adapted for each individual patient, so that performing CT is fully justified and the CT procedure is fully tailored to address the clinical issue. Such refinements often occur before the procedure is started but may be required at some intermediate stage of the procedure. Furthermore, SPECT/CT leads in many instances to a new partnership with the radiology department. This article presents practical advice and highlights the key clinical elements which need to be considered to help understand the workflow process of SPECT/CT and optimise imaging protocols. The workflow process using SPECT/CT is complex in particular because of its bimodal character, the large spectrum of stakeholders, the multiplicity of their activities at various time points and the need for real-time decision-making. With help from analytical tools developed for quality assessment, the workflow process using SPECT/CT may be separated into related, but independent steps, each with its specific human and material resources to use as inputs or outputs. This helps identify factors that could contribute to failure in routine clinical practice. At each step of the process, practical aspects to optimise imaging procedure and protocols are developed. A decision-making algorithm for justifying each CT indication as well as the appropriateness of each CT protocol is the cornerstone of routine clinical practice using SPECT/CT. In conclusion, implementing hybrid SPECT/CT imaging requires new ways of working. It is highly rewarding from a clinical perspective, but it also proves to be a daily challenge in terms of management. (orig.)

  20. IDD Archival Hardware Architecture and Workflow

    Energy Technology Data Exchange (ETDEWEB)

    Mendonsa, D; Nekoogar, F; Martz, H

    2008-10-09

    This document describes the functionality of every component in the DHS/IDD archival and storage hardware system shown in Fig. 1. It describes the step-by-step process by which image data is received at LLNL and then processed and made available to authorized personnel and collaborators. Throughout this document, references are made to one of two figures: Fig. 1, describing the elements of the architecture, and Fig. 2, describing the workflow and how the project utilizes the available hardware.

  1. SPECT/CT workflow and imaging protocols

    International Nuclear Information System (INIS)

    Beckers, Catherine; Hustinx, Roland

    2014-01-01

    Introducing a hybrid imaging method such as single photon emission computed tomography (SPECT)/CT greatly alters the routine in the nuclear medicine department. It requires designing new workflow processes and the revision of original scheduling process and imaging protocols. In addition, the imaging protocol should be adapted for each individual patient, so that performing CT is fully justified and the CT procedure is fully tailored to address the clinical issue. Such refinements often occur before the procedure is started but may be required at some intermediate stage of the procedure. Furthermore, SPECT/CT leads in many instances to a new partnership with the radiology department. This article presents practical advice and highlights the key clinical elements which need to be considered to help understand the workflow process of SPECT/CT and optimise imaging protocols. The workflow process using SPECT/CT is complex in particular because of its bimodal character, the large spectrum of stakeholders, the multiplicity of their activities at various time points and the need for real-time decision-making. With help from analytical tools developed for quality assessment, the workflow process using SPECT/CT may be separated into related, but independent steps, each with its specific human and material resources to use as inputs or outputs. This helps identify factors that could contribute to failure in routine clinical practice. At each step of the process, practical aspects to optimise imaging procedure and protocols are developed. A decision-making algorithm for justifying each CT indication as well as the appropriateness of each CT protocol is the cornerstone of routine clinical practice using SPECT/CT. In conclusion, implementing hybrid SPECT/CT imaging requires new ways of working. It is highly rewarding from a clinical perspective, but it also proves to be a daily challenge in terms of management. (orig.)

  2. Schedule-Aware Workflow Management Systems

    Science.gov (United States)

    Mans, Ronny S.; Russell, Nick C.; van der Aalst, Wil M. P.; Moleman, Arnold J.; Bakker, Piet J. M.

    Contemporary workflow management systems offer work-items to users through specific work-lists. Users select the work-items they will perform without having a specific schedule in mind. However, in many environments work needs to be scheduled and performed at particular times. For example, in hospitals many work-items are linked to appointments, e.g., a doctor cannot perform surgery without reserving an operating theater and making sure that the patient is present. One of the problems when applying workflow technology in such domains is the lack of calendar-based scheduling support. In this paper, we present an approach that supports the seamless integration of unscheduled (flow) and scheduled (schedule) tasks. Using CPN Tools we have developed a specification and simulation model for schedule-aware workflow management systems. Based on this a system has been realized that uses YAWL, Microsoft Exchange Server 2007, Outlook, and a dedicated scheduling service. The approach is illustrated using a real-life case study at the AMC hospital in the Netherlands. In addition, we elaborate on the experiences obtained when developing and implementing a system of this scale using formal techniques.
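
    A minimal sketch of the core idea in Python, with hypothetical names (this is not the YAWL/Exchange integration described above): a schedule task is only offered to a work-list once its calendar reservations succeed, while ordinary flow tasks are offered immediately.

      from dataclasses import dataclass, field
      from datetime import datetime
      from typing import Optional

      @dataclass
      class Calendar:
          """Tiny stand-in for a calendar service such as an Exchange server."""
          booked: set = field(default_factory=set)

          def reserve(self, resource: str, slot: datetime) -> bool:
              key = (resource, slot)
              if key in self.booked:
                  return False                      # slot already taken
              self.booked.add(key)
              return True

      @dataclass
      class WorkItem:
          name: str
          scheduled: bool = False                   # schedule task vs. ordinary flow task
          resources: tuple = ()                     # e.g. ("operating_theatre", "patient")
          slot: Optional[datetime] = None

      def offer(item: WorkItem, calendar: Calendar) -> bool:
          """Offer a work item to a work-list only once its appointments are secured."""
          if not item.scheduled:
              return True                           # flow tasks need no calendar support
          return all(calendar.reserve(r, item.slot) for r in item.resources)

      cal = Calendar()
      surgery = WorkItem("perform surgery", scheduled=True,
                         resources=("operating_theatre", "patient"),
                         slot=datetime(2024, 5, 6, 9, 0))
      print(offer(surgery, cal))                    # True: both reservations succeed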

  3. Routine digital pathology workflow: The Catania experience

    Directory of Open Access Journals (Sweden)

    Filippo Fraggetta

    2017-01-01

    Full Text Available Introduction: Successful implementation of whole slide imaging (WSI) for routine clinical practice has been accomplished in only a few pathology laboratories worldwide. We report the transition to an effective and complete digital surgical pathology workflow in the pathology laboratory at Cannizzaro Hospital in Catania, Italy. Methods: All (100%) permanent histopathology glass slides were digitized at ×20 using Aperio AT2 scanners. Compatible stain and scanning slide racks were employed to streamline operations. eSlide Manager software was bidirectionally interfaced with the anatomic pathology laboratory information system. Virtual slide trays connected to the two-dimensional (2D) barcode tracking system allowed pathologists to confirm that they were correctly assigned slides and that all tissues on these glass slides were scanned. Results: Over 115,000 glass slides were digitized with a scan fail rate of around 1%. Drying glass slides before scanning minimized their sticking to scanner racks. Implementation required introduction of a 2D barcode tracking system and modification of histology workflow processes. Conclusion: Our experience indicates that effective adoption of WSI for primary diagnostic use was more dependent on optimizing preimaging variables and integration with the laboratory information system than on information technology infrastructure and ensuring pathologist buy-in. Implementation of digital pathology for routine practice not only leveraged the benefits of digital imaging but also created an opportunity for establishing standardization of workflow processes in the pathology laboratory.

  4. Evaluation of Workflow Management Systems - A Meta Model Approach

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    1998-11-01

    Full Text Available The automated enactment of processes through the use of workflow management systems enables the outsourcing of the control flow from application systems. By now, a large number of systems that follow different workflow paradigms are available. This leads to the problem of selecting the appropriate workflow management system for a given situation. In this paper we outline the benefits of a meta model approach for the evaluation and comparison of different workflow management systems. After a general introduction to the topic of meta modeling, the meta models of the workflow management systems WorkParty (Siemens Nixdorf) and FlowMark (IBM) are compared as an example. These product-specific meta models can be generalized to meta reference models, which help to specify a workflow methodology. As an example, an organisational reference meta model is presented, which helps users specify their requirements for a workflow management system.

  5. Facilitating Stewardship of scientific data through standards based workflows

    Science.gov (United States)

    Bastrakova, I.; Kemp, C.; Potter, A. K.

    2013-12-01

    scientific data acquisition and analysis requirements and effective interoperable data management and delivery. This includes participating in national and international dialogue on development of standards, embedding data management activities in business processes, and developing scientific staff as effective data stewards. A similar approach is applied to the geophysical data. By ensuring that the geophysical datasets at GA strictly follow metadata and industry standards, we are able to implement a provenance-based workflow where the data is easily discoverable, geophysical processing can be applied to it, and results can be stored. The provenance-based workflow enables metadata records for the results to be produced automatically from the input dataset metadata.
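
    A minimal Python sketch of the provenance idea above: the metadata record for a processing result is derived automatically from the input dataset metadata. The field names and the example survey are illustrative assumptions, not GA's actual metadata schema.

      import datetime

      def derive_metadata(input_records, process_name, parameters):
          """Build a metadata record for a processing result from its inputs.

          Field names are illustrative only; a real implementation would follow
          the metadata profile used for the source datasets.
          """
          return {
              "title": process_name + " of " + ", ".join(r["title"] for r in input_records),
              "date_created": datetime.date.today().isoformat(),
              "lineage": {
                  "process": process_name,
                  "parameters": parameters,
                  "source_datasets": [r["identifier"] for r in input_records],
              },
              # The spatial extent of the product is inherited from its inputs.
              "extent": [r["extent"] for r in input_records],
          }

      survey = {"identifier": "GA-MAG-001", "title": "Magnetic survey",
                "extent": {"west": 120.0, "east": 122.0, "south": -30.0, "north": -28.0}}
      print(derive_metadata([survey], "Reduction to the pole", {"inclination": -60}))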

  6. From Requirements via Colored Workflow Nets to an Implementation in Several Workflow Systems

    DEFF Research Database (Denmark)

    Mans, Ronny S.; van der Aalst, Willibrordus Martinus Pancratius; Molemann, A.J.

    2007-01-01

    Care organizations, such as hospitals, need to support complex and dynamic workflows. Moreover, many disciplines are involved. This makes it important to avoid the typical disconnect between requirements and the actual implementation of the system. This paper proposes an approach where an Executable Use Case (EUC) and Colored Workflow Nets...

  7. Tavaxy: integrating Taverna and Galaxy workflows with cloud computing support.

    Science.gov (United States)

    Abouelhoda, Mohamed; Issa, Shadi Alaa; Ghanem, Moustafa

    2012-05-04

    Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on using an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-) workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and complexity of analysis.The system can be accessed either through a
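
    A minimal Python sketch of hierarchical composition from workflow patterns, the concept Tavaxy builds on. The functions below are illustrative stand-ins rather than Tavaxy's API; a real sub-workflow here could wrap an imported Taverna or Galaxy workflow.

      from concurrent.futures import ThreadPoolExecutor

      def sub_workflow(name):
          """Stand-in for an imported Taverna or Galaxy (sub-)workflow."""
          def run(data):
              return name + "(" + str(data) + ")"
          return run

      def sequence(*steps):
          """Sequence pattern: feed each step's output into the next."""
          def run(data):
              for step in steps:
                  data = step(data)
              return data
          return run

      def parallel(*branches):
          """Parallel split/synchronise pattern: run branches concurrently."""
          def run(data):
              with ThreadPoolExecutor() as pool:
                  futures = [pool.submit(branch, data) for branch in branches]
                  return [f.result() for f in futures]
          return run

      pipeline = sequence(sub_workflow("quality_control"),
                          parallel(sub_workflow("blast_search"),
                                   sub_workflow("hmmer_scan")))
      print(pipeline("reads.fasta"))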

  8. Tavaxy: Integrating Taverna and Galaxy workflows with cloud computing support

    Directory of Open Access Journals (Sweden)

    Abouelhoda Mohamed

    2012-05-01

    Full Text Available Abstract Background Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. Results In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on using an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Conclusions Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-)workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and

  9. Tavaxy: Integrating Taverna and Galaxy workflows with cloud computing support

    Science.gov (United States)

    2012-01-01

    Background Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. Results In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on using an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Conclusions Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-) workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and complexity of analysis. The system

  10. Trends on lexical acquisition in children within normal development and children with developmental language disorder

    Directory of Open Access Journals (Sweden)

    Juliana Perina Gândara

    2010-01-01

    Full Text Available The aim of the present study was to describe the similarities and differences found throughout lexical acquisition by normally developing children and children with developmental language disorder (AEDL in the Portuguese abbreviation), through an extensive literature review in the databases SciELO, Lilacs, PubMed, Web of Science and Dedalus, covering the last decades of studies in the area. The selected studies, of observational or experimental nature, showed a great variety of findings related to vocabulary development, covering tendencies and variations as well as other abilities involved in the lexical acquisition process. In general, the results suggest that the lexical deficits that constitute one of the earliest observed markers in children with developmental language disorder are explained by difficulties in abilities and/or characteristics influenced by, or directly related to, the mechanisms involved in information processing, which compromise the quality and the retrieval of the phonological and semantic representations corresponding to a new lexical item. However, several studies suggest that ostensive situations with strong contextual support, focusing on few new words, favor lexical acquisition in children with developmental language disorder.

  11. Data acquisition

    International Nuclear Information System (INIS)

    Clout, P.N.

    1982-01-01

    Data acquisition systems are discussed for molecular biology experiments using synchrotron radiation sources. The data acquisition system requirements are considered. The components of the solution are described including hardwired solutions and computer-based solutions. Finally, the considerations for the choice of the computer-based solution are outlined. (U.K.)

  12. Evaluation of an image-based tracking workflow using a passive marker and resonant micro-coil fiducials for automatic image plane alignment in interventional MRI.

    Science.gov (United States)

    Neumann, M; Breton, E; Cuvillon, L; Pan, L; Lorenz, C H; de Mathelin, M

    2012-01-01

    In this paper, an original workflow is presented for MR image plane alignment based on tracking in real-time MR images. A test device consisting of two resonant micro-coils and a passive marker is proposed for detection using image-based algorithms. Micro-coils allow for automated initialization of the object detection in dedicated low flip angle projection images; then the passive marker is tracked in clinical real-time MR images, with alternation between two oblique orthogonal image planes along the test device axis; in case the passive marker is lost in real-time images, the workflow is reinitialized. The proposed workflow was designed to minimize dedicated acquisition time to a single dedicated acquisition in the ideal case (no reinitialization required). First experiments have shown promising results for test-device tracking precision, with a mean position error of 0.79 mm and a mean orientation error of 0.24°.
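
    A minimal Python sketch of the tracking loop described above: dedicated projection images are used only for (re)initialisation, real-time imaging alternates between two orthogonal oblique planes, and losing the passive marker triggers reinitialisation. All acquisition and detection callables are placeholders, not a scanner API.

      def track_marker(acquire_projection_image, acquire_realtime_image,
                       detect_microcoils, detect_marker, max_frames=1000):
          """Alternate two orthogonal oblique planes along the device axis and
          fall back to the dedicated low flip angle projection acquisition
          whenever the passive marker is lost in the real-time images.
          """
          pose = None
          for frame in range(max_frames):
              if pose is None:
                  # (Re)initialisation: locate the resonant micro-coils in a
                  # dedicated low flip angle projection image.
                  pose = detect_microcoils(acquire_projection_image())
                  if pose is None:
                      continue                      # try again on the next frame
              plane = "oblique_A" if frame % 2 == 0 else "oblique_B"
              marker = detect_marker(acquire_realtime_image(plane, pose))
              if marker is None:
                  pose = None                       # marker lost: reinitialise
              else:
                  pose = marker                     # realign the two image planes
                  yield plane, pose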

  13. Enabling Efficient Climate Science Workflows in High Performance Computing Environments

    Science.gov (United States)

    Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.

    2015-12-01

    A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks provides a myriad of challenges when running in a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require a lot of forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well-tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project, we highlight a stack of tools our team utilizes and has developed to ensure that large-scale simulation and analysis work is commonplace, and to provide operations that assist in everything from generation/procurement of data (HTAR/Globus) to automated publication of results to portals like the Earth Systems Grid Federation (ESGF), all while executing everything in between in a scalable environment in a task-parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases they have been applied to.
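
    One way to run per-file analysis in a task-parallel way with MPI, as mentioned above, sketched with mpi4py; the file names and the analysis function are placeholders, not the CASCADE tooling.

      from mpi4py import MPI

      def analyse(path):
          """Placeholder per-file analysis (e.g. an extremes statistic)."""
          return path, len(path)

      comm = MPI.COMM_WORLD
      rank, size = comm.Get_rank(), comm.Get_size()

      if rank == 0:
          files = ["tasmax_day_%d.nc" % year for year in range(1950, 2006)]
          chunks = [files[i::size] for i in range(size)]   # round-robin split
      else:
          chunks = None

      local_files = comm.scatter(chunks, root=0)           # each rank gets its share
      local_results = [analyse(f) for f in local_files]
      results = comm.gather(local_results, root=0)         # collect on rank 0

      if rank == 0:
          print(sum(len(r) for r in results), "files analysed")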

  14. Text mining for the biocuration workflow

    Science.gov (United States)

    Hirschman, Lynette; Burns, Gully A. P. C; Krallinger, Martin; Arighi, Cecilia; Cohen, K. Bretonnel; Valencia, Alfonso; Wu, Cathy H.; Chatr-Aryamontri, Andrew; Dowell, Karen G.; Huala, Eva; Lourenço, Anália; Nash, Robert; Veuthey, Anne-Lise; Wiegers, Thomas; Winter, Andrew G.

    2012-01-01

    Molecular biology has become heavily dependent on biological knowledge encoded in expert curated biological databases. As the volume of biological literature increases, biocurators need help in keeping up with the literature; (semi-) automated aids for biocuration would seem to be an ideal application for natural language processing and text mining. However, to date, there have been few documented successes for improving biocuration throughput using text mining. Our initial investigations took place for the workshop on ‘Text Mining for the BioCuration Workflow’ at the third International Biocuration Conference (Berlin, 2009). We interviewed biocurators to obtain workflows from eight biological databases. This initial study revealed high-level commonalities, including (i) selection of documents for curation; (ii) indexing of documents with biologically relevant entities (e.g. genes); and (iii) detailed curation of specific relations (e.g. interactions); however, the detailed workflows also showed many variabilities. Following the workshop, we conducted a survey of biocurators. The survey identified biocurator priorities, including the handling of full text indexed with biological entities and support for the identification and prioritization of documents for curation. It also indicated that two-thirds of the biocuration teams had experimented with text mining and almost half were using text mining at that time. Analysis of our interviews and survey provide a set of requirements for the integration of text mining into the biocuration workflow. These can guide the identification of common needs across curated databases and encourage joint experimentation involving biocurators, text mining developers and the larger biomedical research community. PMID:22513129
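
    A minimal Python sketch of two of the commonalities listed above, document selection and entity indexing, as a toy curation triage step. The gene lexicon and ranking are illustrative assumptions, not any specific database's pipeline.

      import re

      GENES = {"BRCA1", "TP53", "EGFR"}          # toy entity lexicon

      def index_entities(text):
          """Tag a document with the biologically relevant entities it mentions."""
          tokens = set(re.findall(r"[A-Za-z0-9]+", text))
          return tokens & GENES

      def triage(documents, min_entities=1):
          """Rank documents for curation by the number of distinct entities found."""
          indexed = [(doc_id, index_entities(text)) for doc_id, text in documents]
          selected = [(doc_id, ents) for doc_id, ents in indexed if len(ents) >= min_entities]
          return sorted(selected, key=lambda pair: len(pair[1]), reverse=True)

      docs = [("PMID:1", "EGFR and TP53 interact in lung tumours"),
              ("PMID:2", "A survey of curation workflows"),
              ("PMID:3", "BRCA1 variants of unknown significance")]
      print(triage(docs))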

  15. Electronic resource management systems a workflow approach

    CERN Document Server

    Anderson, Elsa K

    2014-01-01

    To get to the bottom of a successful approach to Electronic Resource Management (ERM), Anderson interviewed staff at 11 institutions about their ERM implementations. Among her conclusions, presented in this issue of Library Technology Reports, is that grasping the intricacies of your workflow-analyzing each step to reveal the gaps and problems-at the beginning is crucial to selecting and implementing an ERM. Whether the system will be used to fill a gap, aggregate critical data, or replace a tedious manual process, the best solution for your library depends on factors such as your current soft

  16. Evolutionary optimization of production materials workflow processes

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    We present an evolutionary optimisation technique for stochastic production processes, which is able to find improved production materials workflow processes with respect to arbitrary combinations of numerical quantities associated with the production process. Working from a core fragment of the BPMN language, we employ an evolutionary algorithm where stochastic model checking is used as a fitness function to determine the degree of improvement of candidate processes derived from the original process through mutation and cross-over operations. We illustrate this technique using a case study...
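
    A minimal Python sketch of the evolutionary loop described above. The fitness function stands in for stochastic model checking of a candidate process, and representing a process as a list of step costs is a toy assumption, not the BPMN fragment used in the paper.

      import random

      def fitness(process):
          """Stand-in for stochastic model checking of a candidate process
          (e.g. an expected cost or throughput returned by a model checker)."""
          return -sum(process)                 # toy objective: minimise total step cost

      def mutate(process):
          p = process[:]
          p[random.randrange(len(p))] += random.choice([-1, 1])
          return p

      def crossover(a, b):
          cut = random.randrange(1, len(a))
          return a[:cut] + b[cut:]

      def evolve(seed, generations=50, population_size=20):
          population = [mutate(seed) for _ in range(population_size)]
          for _ in range(generations):
              population.sort(key=fitness, reverse=True)
              parents = population[:population_size // 2]
              children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                          for _ in range(population_size - len(parents))]
              population = parents + children
          return max(population, key=fitness)

      print(evolve([5, 3, 7, 2]))              # a cheaper variant of the seed process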

  17. Reengineering of the i4 workflow engine

    OpenAIRE

    Likar, Tilen

    2013-01-01

    I4 is an enterprise resource planning system which allows you to manage business processes. Due to increasing demands for managing complex processes and adjusting those processes to global standards, a renewal of part of the system was required. In this thesis we address the reengineering of the workflow engine and the corresponding data model. We designed a business process diagram in Bizagi Process Modeler. The import to i4 and the export from i4 were developed based on the XPDL file exported from the mo...

  18. CMS data and workflow management system

    CERN Document Server

    Fanfani, A; Bacchi, W; Codispoti, G; De Filippis, N; Pompili, A; My, S; Abbrescia, M; Maggi, G; Donvito, G; Silvestris, L; Calzolari, F; Sarkar, S; Spiga, D; Cinquili, M; Lacaprara, S; Biasotto, M; Farina, F; Merlo, M; Belforte, S; Kavka, C; Sala, L; Harvey, J; Hufnagel, D; Fanzago, F; Corvo, M; Magini, N; Rehn, J; Toteva, Z; Feichtinger, D; Tuura, L; Eulisse, G; Bockelman, B; Lundstedt, C; Egeland, R; Evans, D; Mason, D; Gutsche, O; Sexton-Kennedy, L; Dagenhart, D W; Afaq, A; Guo, Y; Kosyakov, S; Lueking, L; Sekhri, V; Fisk, I; McBride, P; Bauerdick, L; Bakken, J; Rossman, P; Wicklund, E; Wu, Y; Jones, C; Kuznetsov, V; Riley, D; Dolgert, A; van Lingen, F; Narsky, I; Paus, C; Klute, M; Gomez-Ceballos, G; Piedra-Gomez, J; Miller, M; Mohapatra, A; Lazaridis, C; Bradley, D; Elmer, P; Wildish, T; Wuerthwein, F; Letts, J; Bourilkov, D; Kim, B; Smith, P; Hernandez, J M; Caballero, J; Delgado, A; Flix, J; Cabrillo-Bartolome, I; Kasemann, M; Flossdorf, A; Stadie, H; Kreuzer, P; Khomitch, A; Hof, C; Zeidler, C; Kalini, S; Trunov, A; Saout, C; Felzmann, U; Metson, S; Newbold, D; Geddes, N; Brew, C; Jackson, J; Wakefield, S; De Weirdt, S; Adler, V; Maes, J; Van Mulders, P; Villella, I; Hammad, G; Pukhaeva, N; Kurca, T; Semneniouk, I; Guan, W; Lajas, J A; Teodoro, D; Gregores, E; Baquero, M; Shehzad, A; Kadastik, M; Kodolova, O; Chao, Y; Ming Kuo, C; Filippidis, C; Walzel, G; Han, D; Kalinowski, A; Giro de Almeida, N M; Panyam, N

    2008-01-01

    CMS expects to manage many tens of petabytes of data to be distributed over several computing centers around the world. The CMS distributed computing and analysis model is designed to serve, process and archive the large number of events that will be generated when the CMS detector starts taking data. The underlying concepts and the overall architecture of the CMS data and workflow management system will be presented. In addition, the experience in using the system for MC production, initial detector commissioning activities and data analysis will be summarized.

  19. Wildfire: distributed, Grid-enabled workflow construction and execution

    Directory of Open Access Journals (Sweden)

    Issac Praveen

    2005-03-01

    Full Text Available Abstract Background We observe two trends in bioinformatics: (i) analyses are increasing in complexity, often requiring several applications to be run as a workflow; and (ii) multiple CPU clusters and Grids are available to more scientists. The traditional solution to the problem of running workflows across multiple CPUs required programming, often in a scripting language such as Perl. Programming places such solutions beyond the reach of many bioinformatics consumers. Results We present Wildfire, a graphical user interface for constructing and running workflows. Wildfire borrows user interface features from Jemboss and adds a drag-and-drop interface allowing the user to compose EMBOSS (and other) programs into workflows. For execution, Wildfire uses GEL, the underlying workflow execution engine, which can exploit available parallelism on multiple CPU machines including Beowulf-class clusters and Grids. Conclusion Wildfire simplifies the tasks of constructing and executing bioinformatics workflows.

  20. Acquisition Footprint Attenuation Driven by Seismic Attributes

    Directory of Open Access Journals (Sweden)

    Cuellar-Urbano Mayra

    2014-04-01

    Full Text Available Acquisition footprint, one of the major problems that PEMEX faces in seismic imaging, is noise highly correlated to the geometric array of sources and receivers used for onshore and offshore seismic acquisitions. It prevails in spite of measures taken during acquisition and data processing. This pattern, throughout the image, is easily confused with geological features and misguides seismic attribute computation. In this work, we use seismic data from PEMEX Exploración y Producción to show the conditioning process for removing random and coherent noise using linear filters. Geometric attributes used in a workflow were computed to obtain an acquisition footprint noise model and adaptively subtract it from the seismic data.

  1. Workflow efficiency of two 1.5 T MR scanners with and without an automated user interface for head examinations.

    Science.gov (United States)

    Moenninghoff, Christoph; Umutlu, Lale; Kloeters, Christian; Ringelstein, Adrian; Ladd, Mark E; Sombetzki, Antje; Lauenstein, Thomas C; Forsting, Michael; Schlamann, Marc

    2013-06-01

    Workflow efficiency and workload of radiological technologists (RTs) were compared in head examinations performed with two 1.5 T magnetic resonance (MR) scanners equipped with or without an automated user interface called "day optimizing throughput" (Dot) workflow engine. Thirty-four patients with known intracranial pathology were examined with a 1.5 T MR scanner with Dot workflow engine (Siemens MAGNETOM Aera) and with a 1.5 T MR scanner with conventional user interface (Siemens MAGNETOM Avanto) using four standardized examination protocols. The elapsed time for all necessary work steps, which were performed by 11 RTs within the total examination time, was compared for each examination at both MR scanners. The RTs evaluated the user-friendliness of both scanners by a questionnaire. Normality of distribution was checked for all continuous variables by use of the Shapiro-Wilk test. Normally distributed variables were analyzed by Student's paired t-test, otherwise Wilcoxon signed-rank test was used to compare means. Total examination time of MR examinations performed with Dot engine was reduced from 24:53 to 20:01 minutes (P user interface (P = .001). According to this preliminary study, the Dot workflow engine is a time-saving user assistance software, which decreases the RTs' effort significantly and may help to automate neuroradiological examinations for a higher workflow efficiency. Copyright © 2013 AUR. Published by Elsevier Inc. All rights reserved.
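
    The statistical comparison described above (a Shapiro-Wilk normality check followed by a paired t-test or a Wilcoxon signed-rank test) can be sketched with SciPy as follows; the timing values are invented for illustration and are not the study data.

      import numpy as np
      from scipy import stats

      # Total examination times in minutes for the same protocols on both scanners
      # (illustrative numbers only).
      dot_engine   = np.array([20.0, 21.5, 19.8, 22.1, 20.4, 21.0])
      conventional = np.array([24.9, 25.7, 23.9, 26.3, 24.2, 25.1])

      differences = conventional - dot_engine

      # Check normality of the paired differences first.
      _, p_normal = stats.shapiro(differences)

      if p_normal > 0.05:
          stat, p_value = stats.ttest_rel(conventional, dot_engine)   # paired t-test
      else:
          stat, p_value = stats.wilcoxon(conventional, dot_engine)    # signed-rank test

      print("p = %.4f" % p_value)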

  2. Declarative Modelling and Safe Distribution of Healthcare Workflows

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao; Slaats, Tijs

    2012-01-01

    We present a formal technique for safe distribution of workflow processes described declaratively as Nested Condition Response (NCR) Graphs and apply the technique to a distributed healthcare workflow. Concretely, we provide a method to synthesize from an NCR Graph and any distribution of its events ... -organizational case management. The contributions of this paper are to adapt the technique to allow for nested processes and milestones and to apply it to a healthcare workflow identified in a previous field study at Danish hospitals.

  3. Building and documenting workflows with python-based snakemake

    OpenAIRE

    Köster, Johannes; Rahmann, Sven

    2012-01-01

    Snakemake is a novel workflow engine with a simple Python-derived workflow definition language and an optimizing execution environment. It is the first system that supports multiple named wildcards (or variables) in input and output filenames of each rule definition. It also allows users to write human-readable workflows that document themselves. We have found Snakemake especially useful for building high-throughput sequencing data analysis pipelines and present examples from this area....
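
    A minimal Snakefile sketch (Snakemake's workflow definition language is Python-derived) illustrating multiple named wildcards in the input and output of a rule. The file names are hypothetical and the shell step assumes bwa and samtools are available.

      # Snakefile -- hypothetical file names; shows multiple named wildcards per rule
      rule all:
          input:
              expand("mapped/{sample}_{lane}.bam", sample=["A", "B"], lane=["1", "2"])

      rule bwa_map:
          input:
              ref="genome.fa",
              reads="reads/{sample}_{lane}.fastq"
          output:
              "mapped/{sample}_{lane}.bam"
          shell:
              "bwa mem {input.ref} {input.reads} | samtools view -Sb - > {output}"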

  4. Automated data reduction workflows for astronomy. The ESO Reflex environment

    Science.gov (United States)

    Freudling, W.; Romaniello, M.; Bramich, D. M.; Ballester, P.; Forchi, V.; García-Dabló, C. E.; Moehler, S.; Neeser, M. J.

    2013-11-01

    Context. Data from complex modern astronomical instruments often consist of a large number of different science and calibration files, and their reduction requires a variety of software tools. The execution chain of the tools represents a complex workflow that needs to be tuned and supervised, often by individual researchers who are not necessarily experts in any specific instrument. Aims: The efficiency of data reduction can be improved by using automatic workflows to organise data and execute a sequence of data reduction steps. To realize such efficiency gains, we designed a system that allows intuitive representation, execution and modification of the data reduction workflow, and has facilities for inspection and interaction with the data. Methods: The European Southern Observatory (ESO) has developed Reflex, an environment to automate data reduction workflows. Reflex is implemented as a package of customized components for the Kepler workflow engine. Kepler provides the graphical user interface to create an executable flowchart-like representation of the data reduction process. Key features of Reflex are a rule-based data organiser, infrastructure to re-use results, thorough book-keeping, data progeny tracking, interactive user interfaces, and a novel concept to exploit information created during data organisation for the workflow execution. Results: Automated workflows can greatly increase the efficiency of astronomical data reduction. In Reflex, workflows can be run non-interactively as a first step. Subsequent optimization can then be carried out while transparently re-using all unchanged intermediate products. We found that such workflows enable the reduction of complex data by non-expert users and minimize mistakes due to book-keeping errors. Conclusions: Reflex includes novel concepts to increase the efficiency of astronomical data processing. While Reflex is a specific implementation of astronomical scientific workflows within the Kepler workflow

  5. Integrating prediction, provenance, and optimization into high energy workflows

    Energy Technology Data Exchange (ETDEWEB)

    Schram, M.; Bansal, V.; Friese, R. D.; Tallent, N. R.; Yin, J.; Barker, K. J.; Stephan, E.; Halappanavar, M.; Kerbyson, D. J.

    2017-10-01

    We propose a novel approach for efficient execution of workflows on distributed resources. The key components of this framework include: performance modeling to quantitatively predict workflow component behavior; optimization-based scheduling such as choosing an optimal subset of resources to meet demand and assignment of tasks to resources; distributed I/O optimizations such as prefetching; and provenance methods for collecting performance data. In preliminary results, these techniques improve throughput on a small Belle II workflow by 20%.

  6. A Model of Workflow Composition for Emergency Management

    Science.gov (United States)

    Xin, Chen; Bin-ge, Cui; Feng, Zhang; Xue-hui, Xu; Shan-shan, Fu

    The commonly used workflow technology is not flexible enough to deal with concurrent emergency situations. The paper proposes a novel model for defining emergency plans, in which workflow segments appear as a constituent part. A formal abstraction, which contains four operations, is defined to compose workflow segments under constraint rules. A software system for constructing and composing business process resources is implemented and integrated into the Emergency Plan Management Application System.

  7. Business and scientific workflows a web service-oriented approach

    CERN Document Server

    Tan, Wei

    2013-01-01

    Focuses on how to use web service computing and service-based workflow technologies to develop timely, effective workflows for both business and scientific fields. Utilizing web computing and Service-Oriented Architecture (SOA), Business and Scientific Workflows: A Web Service-Oriented Approach focuses on how to design, analyze, and deploy web service-based workflows for both business and scientific applications in many areas of healthcare and biomedicine. It also discusses and presents the recent research and development results. This informative reference features app

  8. Comparison of Resource Platform Selection Approaches for Scientific Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Ramakrishnan, Lavanya

    2010-03-05

    Cloud computing is increasingly considered as an additional computational resource platform for scientific workflows. The cloud offers the opportunity to scale out applications from desktops and local cluster resources. At the same time, it can eliminate the challenges of restricted software environments and queue delays in shared high performance computing environments. Choosing from these diverse resource platforms for a workflow execution poses a challenge for many scientists. Scientists are often faced with deciding resource platform selection trade-offs with limited information on the actual workflows. While many workflow planning methods have explored task scheduling onto different resources, these methods often require fine-scale characterization of the workflow that is onerous for a scientist. In this position paper, we describe our early exploratory work into using blackbox characteristics to do a cost-benefit analysis of using cloud platforms. We use only very limited high-level information on the workflow length, width, and data sizes. The length and width are indicative of the workflow duration and parallelism. The data size characterizes the IO requirements. We compare the effectiveness of this approach to other resource selection models using two exemplar scientific workflows scheduled on desktops, local clusters, HPC centers, and clouds. Early results suggest that the blackbox model often makes the same resource selections as a more fine-grained whitebox model. We believe the simplicity of the blackbox model can help inform a scientist on the applicability of cloud computing resources even before porting an existing workflow.
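
    A minimal Python sketch of the blackbox idea: estimate runtime and cost for each platform from only the workflow length, width, and data size. All platform coefficients are invented for illustration and are not values from the paper.

      def blackbox_estimate(length, width, data_gb, platform):
          """Rough runtime/cost estimate from coarse workflow characteristics.

          length  -- number of sequential stages (duration proxy)
          width   -- maximum number of parallel tasks (parallelism proxy)
          data_gb -- total data moved (I/O proxy)
          """
          platforms = {  # invented coefficients, for illustration only
              "desktop":       {"cores": 4,    "stage_s": 600, "usd_core_hr": 0.0,  "io_s_per_gb": 0},
              "local_cluster": {"cores": 64,   "stage_s": 300, "usd_core_hr": 0.0,  "io_s_per_gb": 2},
              "hpc_center":    {"cores": 1024, "stage_s": 200, "usd_core_hr": 0.0,  "io_s_per_gb": 5},
              "cloud":         {"cores": 256,  "stage_s": 250, "usd_core_hr": 0.09, "io_s_per_gb": 10},
          }
          p = platforms[platform]
          waves = -(-width // p["cores"])               # ceiling: task waves per stage
          runtime_s = length * waves * p["stage_s"] + data_gb * p["io_s_per_gb"]
          cost_usd = runtime_s / 3600 * min(width, p["cores"]) * p["usd_core_hr"]
          return runtime_s, cost_usd

      for name in ("desktop", "local_cluster", "hpc_center", "cloud"):
          print(name, blackbox_estimate(length=5, width=200, data_gb=50, platform=name))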

  9. Design, Modelling and Analysis of a Workflow Reconfiguration

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Abouzaid, Faisal; Dragoni, Nicola

    2011-01-01

    This paper describes a case study involving the reconfiguration of an office workflow. We state the requirements on a system implementing the workflow and its reconfiguration, and describe the system's design in BPMN. We then use an asynchronous pi-calculus and Web.1 to model the design and to verify...

  10. A Strategy for an MLS Workflow Management System

    National Research Council Canada - National Science Library

    Kang, Myong H; Froscher, Judith N; Eppinger, Brian J; Moskowitz, Ira S

    1999-01-01

    .... Therefore, DoD needs MLS workflow management systems (WFMS) to enable globally distributed users and existing applications to cooperate across classification domains to achieve mission critical goals...

  11. Modeling Complex Workflow in Molecular Diagnostics

    Science.gov (United States)

    Gomah, Mohamed E.; Turley, James P.; Lu, Huimin; Jones, Dan

    2010-01-01

    One of the hurdles to achieving personalized medicine has been implementing the laboratory processes for performing and reporting complex molecular tests. The rapidly changing test rosters and complex analysis platforms in molecular diagnostics have meant that many clinical laboratories still use labor-intensive manual processing and testing without the level of automation seen in high-volume chemistry and hematology testing. We provide here a discussion of design requirements and the results of implementation of a suite of lab management tools that incorporate the many elements required for use of molecular diagnostics in personalized medicine, particularly in cancer. These applications provide the functionality required for sample accessioning and tracking, material generation, and testing that are particular to the evolving needs of individualized molecular diagnostics. On implementation, the applications described here resulted in improvements in the turn-around time for reporting of more complex molecular test sets, and significant changes in the workflow. Therefore, careful mapping of workflow can permit design of software applications that simplify even the complex demands of specialized molecular testing. By incorporating design features for order review, software tools can permit a more personalized approach to sample handling and test selection without compromising efficiency. PMID:20007844

  12. Deriving DICOM surgical extensions from surgical workflows

    Science.gov (United States)

    Burgert, O.; Neumuth, T.; Gessat, M.; Jacobs, S.; Lemke, H. U.

    2007-03-01

    The generation, storage, transfer, and representation of image data in radiology are standardized by DICOM. To cover the needs of image guided surgery or computer assisted surgery in general, one needs to handle patient information besides image data. A large number of objects must be defined in DICOM to address the needs of surgery. We propose an analysis process based on Surgical Workflows that helps to identify these objects together with use cases and requirements motivating their specification. As a first result, we confirmed the need for the specification of representation and transfer of geometric models. The analysis of Surgical Workflows has shown that geometric models are widely used to represent planned procedure steps, surgical tools, anatomical structures, or prostheses in the context of surgical planning, image guided surgery, augmented reality, and simulation. By now, the models are stored and transferred in several file formats bare of contextual information. The standardization of data types including contextual information and specifications for handling of geometric models allows a broader usage of such models. This paper explains the specification process leading to Geometry Mesh Service Object Pair classes. This process can be a template for the definition of further DICOM classes.

  13. Workflow-Based Software Development Environment

    Science.gov (United States)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment

  14. Workflow management for a cosmology collaboratory

    International Nuclear Information System (INIS)

    Loken, Stewart C.; McParland, Charles

    2001-01-01

    The Nearby Supernova Factory Project will provide a unique opportunity to bring together simulation and observation to address crucial problems in particle and nuclear physics. Its goal is to significantly enhance our understanding of the nuclear processes in supernovae and to improve our ability to use both Type Ia and Type II supernovae as reference light sources (standard candles) in precision measurements of cosmological parameters. Over the past several years, astronomers and astrophysicists have been conducting in-depth sky searches with the goal of identifying supernovae in their earliest evolutionary stages and, during the 4 to 8 weeks of their most 'explosive' activity, measuring their changing magnitude and spectra. The search program currently under development at LBNL is an earth-based observation program utilizing observational instruments at Haleakala and Mauna Kea, Hawaii and Mt. Palomar, California. This new program provides a demanding testbed for the integration of computational, data management and collaboratory technologies. A critical element of this effort is the use of emerging workflow management tools to permit collaborating scientists to manage data processing and storage and to integrate advanced supernova simulation into the real-time control of the experiments. This paper describes the workflow management framework for the project, discusses security and resource allocation requirements and reviews emerging tools to support this important aspect of collaborative work.

  15. The Prosthetic Workflow in the Digital Era

    Directory of Open Access Journals (Sweden)

    Lidia Tordiglione

    2016-01-01

    Full Text Available The purpose of this retrospective study was to clinically evaluate the benefits of adopting a full digital workflow for the implementation of fixed prosthetic restorations on natural teeth. To evaluate the effectiveness of these protocols, treatment plans were drawn up for 15 patients requiring rehabilitation of one or more natural teeth. All the dental impressions were taken using a Planmeca PlanScan® (Planmeca OY, Helsinki, Finland) intraoral scanner, which provided digital casts on which the restorations were digitally designed using Exocad® (Exocad GmbH, Germany, 2010) software and fabricated by CAM processing on 5-axis milling machines. A total of 28 single crowns were made from monolithic zirconia, 12 vestibular veneers from lithium disilicate, and 4 three-quarter vestibular veneers with palatal extension. While the restorations were applied, the authors could clinically appreciate the excellent match between the digitally produced prosthetic design and the cemented prostheses, which never required any occlusal or proximal adjustment. Out of all the restorations applied, only one exhibited premature failure and was replaced with no other complications or need for further scanning. From the clinical experience gained using a full digital workflow, the authors can confirm that these work processes enable the fabrication of clinically reliable restorations, with all the benefits that digital methods bring to the dentist, the dental laboratory, and the patient.

  16. Multi-level meta-workflows: new concept for regularly occurring tasks in quantum chemistry.

    Science.gov (United States)

    Arshad, Junaid; Hoffmann, Alexander; Gesing, Sandra; Grunzke, Richard; Krüger, Jens; Kiss, Tamas; Herres-Pawlis, Sonja; Terstyanszky, Gabor

    2016-01-01

    In Quantum Chemistry, many tasks recur frequently, e.g. geometry optimizations, benchmarking series etc. Here, workflows can help to reduce the time of manual job definition and output extraction. These workflows are executed on computing infrastructures and may require large computing and data resources. Scientific workflows hide these infrastructures and the resources needed to run them. It requires significant efforts and specific expertise to design, implement and test these workflows. Many of these workflows are complex and monolithic entities that can be used for particular scientific experiments. Hence, their modification is not straightforward, which makes it almost impossible to share them. To address these issues we propose developing atomic workflows and embedding them in meta-workflows. Atomic workflows deliver a well-defined research domain specific function. Publishing workflows in repositories enables workflow sharing inside and/or among scientific communities. We formally specify atomic and meta-workflows in order to define data structures to be used in repositories for uploading and sharing them. Additionally, we present a formal description focused on the orchestration of atomic workflows into meta-workflows. We investigated the operations that represent basic functionalities in Quantum Chemistry, developed the relevant atomic workflows and combined them into meta-workflows. Having these workflows we defined the structure of the Quantum Chemistry workflow library and uploaded these workflows to the SHIWA Workflow Repository. Graphical abstract: meta-workflows and embedded workflows in the template representation.
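
    A minimal Python sketch of atomic workflows registered in a repository and orchestrated into a meta-workflow. The registry, the step names and the chemistry functions are illustrative assumptions, not the SHIWA repository's data structures.

      ATOMIC = {}                                # stand-in for a workflow repository

      def atomic(name):
          """Register a function as an atomic workflow with a well-defined role."""
          def register(func):
              ATOMIC[name] = func
              return func
          return register

      @atomic("geometry_optimisation")
      def optimise(molecule):
          return "opt(" + molecule + ")"

      @atomic("single_point_energy")
      def energy(structure):
          return "E[" + structure + "]"

      def meta_workflow(steps):
          """Orchestrate atomic workflows by name: each output feeds the next step."""
          def run(data):
              for step in steps:
                  data = ATOMIC[step](data)
              return data
          return run

      benchmark = meta_workflow(["geometry_optimisation", "single_point_energy"])
      print(benchmark("Cu-guanidine complex"))   # E[opt(Cu-guanidine complex)]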

  17. Automatic Image Processing Workflow for the Keck/NIRC2 Vortex Coronagraph

    Science.gov (United States)

    Xuan, Wenhao; Cook, Therese; Ngo, Henry; Zawol, Zoe; Ruane, Garreth; Mawet, Dimitri

    2018-01-01

    The Keck/NIRC2 camera, equipped with the vortex coronagraph, is an instrument targeted at the high contrast imaging of extrasolar planets. To uncover a faint planet signal from the overwhelming starlight, we utilize the Vortex Image Processing (VIP) library, which carries out principal component analysis to model and remove the stellar point spread function. To bridge the gap between data acquisition and data reduction, we implement a workflow that 1) downloads, sorts, and processes data with VIP, 2) stores the analysis products into a database, and 3) displays the reduced images, contrast curves, and auxiliary information on a web interface. Both angular differential imaging and reference star differential imaging are implemented in the analysis module. A real-time version of the workflow runs during observations, allowing observers to make educated decisions about time distribution on different targets, hence optimizing science yield. The post-night version performs a standardized reduction after the observation, building up a valuable database that not only helps uncover new discoveries, but also enables a statistical study of the instrument itself. We present the workflow, and an examination of the contrast performance of the NIRC2 vortex with respect to factors including target star properties and observing conditions.

  18. SYRMEP Tomo Project: a graphical user interface for customizing CT reconstruction workflows.

    Science.gov (United States)

    Brun, Francesco; Massimi, Lorenzo; Fratini, Michela; Dreossi, Diego; Billé, Fulvio; Accardo, Agostino; Pugliese, Roberto; Cedola, Alessia

    2017-01-01

    When considering the acquisition of experimental synchrotron radiation (SR) X-ray CT data, the reconstruction workflow cannot be limited to the essential computational steps of flat fielding and filtered back projection (FBP). More refined image processing is often required, usually to compensate for artifacts and enhance the quality of the reconstructed images. In principle, it would be desirable to optimize the reconstruction workflow at the facility during the experiment (beamtime). However, several practical factors affect the image reconstruction part of the experiment and users are likely to conclude the beamtime with sub-optimal reconstructed images. Through an example of application, this article presents SYRMEP Tomo Project (STP), an open-source software tool conceived to let users design custom CT reconstruction workflows. STP has been designed for post-beamtime (off-line use) and for a new reconstruction of past archived data at the user's home institution, where simple computing resources are available. Releases of the software can be downloaded at the Elettra Scientific Computing group GitHub repository https://github.com/ElettraSciComp/STP-Gui.
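
    The two essential steps named above, flat fielding and filtered back projection, can be sketched with NumPy and recent scikit-image as follows; the phantom, flat and dark fields are synthetic, and this is not STP's implementation.

      import numpy as np
      from skimage.transform import iradon, radon

      # Synthetic stand-in for an acquisition: a phantom, its ideal sinogram, and
      # raw/flat/dark projections following the usual transmission model.
      phantom = np.zeros((128, 128))
      phantom[40:90, 50:80] = 0.02
      theta = np.linspace(0.0, 180.0, 180, endpoint=False)
      ideal = radon(phantom, theta=theta)

      flat = np.full_like(ideal, 1000.0)             # flat field (beam, no sample)
      dark = np.full_like(ideal, 20.0)               # dark current
      raw = dark + (flat - dark) * np.exp(-ideal)    # "measured" projections

      # Step 1: flat fielding gives the transmission sinogram.
      transmission = (raw - dark) / (flat - dark)
      sinogram = -np.log(np.clip(transmission, 1e-6, None))

      # Step 2: filtered back projection.
      reconstruction = iradon(sinogram, theta=theta, filter_name="ramp")
      print(reconstruction.shape)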

  19. Syntax acquisition.

    Science.gov (United States)

    Crain, Stephen; Thornton, Rosalind

    2012-03-01

    Every normal child acquires a language in just a few years. By 3- or 4-years-old, children have effectively become adults in their abilities to produce and understand endlessly many sentences in a variety of conversational contexts. There are two alternative accounts of the course of children's language development. These different perspectives can be traced back to the nature versus nurture debate about how knowledge is acquired in any cognitive domain. One perspective dates back to Plato's dialog 'The Meno'. In this dialog, the protagonist, Socrates, demonstrates to Meno, an aristocrat in Ancient Greece, that a young slave knows more about geometry than he could have learned from experience. By extension, Plato's Problem refers to any gap between experience and knowledge. How children fill in the gap in the case of language continues to be the subject of much controversy in cognitive science. Any model of language acquisition must address three factors, inter alia: 1. The knowledge children accrue; 2. The input children receive (often called the primary linguistic data); 3. The nonlinguistic capacities of children to form and test generalizations based on the input. According to the famous linguist Noam Chomsky, the main task of linguistics is to explain how children bridge the gap-Chomsky calls it a 'chasm'-between what they come to know about language, and what they could have learned from experience, even given optimistic assumptions about their cognitive abilities. Proponents of the alternative 'nurture' approach accuse nativists like Chomsky of overestimating the complexity of what children learn, underestimating the data children have to work with, and manifesting undue pessimism about children's abilities to extract information based on the input. The modern 'nurture' approach is often referred to as the usage-based account. We discuss the usage-based account first, and then the nativist account. After that, we report and discuss the findings of several

  20. Accelerating the scientific exploration process with scientific workflows

    International Nuclear Information System (INIS)

    Altintas, Ilkay; Barney, Oscar; Cheng, Zhengang; Critchlow, Terence; Ludaescher, Bertram; Parker, Steve; Shoshani, Arie; Vouk, Mladen

    2006-01-01

    Although an increasing amount of middleware has emerged in the last few years to achieve remote data access, distributed job execution, and data management, orchestrating these technologies with minimal overhead still remains a difficult task for scientists. Scientific workflow systems improve this situation by creating interfaces to a variety of technologies and automating the execution and monitoring of the workflows. Workflow systems provide domain-independent customizable interfaces and tools that combine different tools and technologies along with efficient methods for using them. As simulations and experiments move into the petascale regime, the orchestration of long running data and compute intensive tasks is becoming a major requirement for the successful steering and completion of scientific investigations. A scientific workflow is the process of combining data and processes into a configurable, structured set of steps that implement semi-automated computational solutions of a scientific problem. Kepler is a cross-project collaboration, co-founded by the SciDAC Scientific Data Management (SDM) Center, whose purpose is to develop a domain-independent scientific workflow system. It provides a workflow environment in which scientists design and execute scientific workflows by specifying the desired sequence of computational actions and the appropriate data flow, including required data transformations, between these steps. Currently deployed workflows range from local analytical pipelines to distributed, high-performance and high-throughput applications, which can be both data- and compute-intensive. The scientific workflow approach offers a number of advantages over traditional scripting-based approaches, including ease of configuration, improved reusability and maintenance of workflows and components (called actors), automated provenance management, 'smart' re-running of different versions of workflow instances, on-the-fly updateable parameters, monitoring

  1. Biowep: a workflow enactment portal for bioinformatics applications.

    Science.gov (United States)

    Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano

    2007-03-08

    The huge amount of biological information, its distribution over the Internet and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable to the majority of unskilled researchers. A portal enabling these to take profit from new technologies is still missing. We designed biowep, a web based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. Biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. Main workflows' processing steps are annotated on the basis of their input and output, elaboration type and application domain by using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software and the creation of

  2. Biowep: a workflow enactment portal for bioinformatics applications

    Directory of Open Access Journals (Sweden)

    Romano Paolo

    2007-03-01

    Full Text Available Abstract Background The huge amount of biological information, its distribution over the Internet and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable for the majority of researchers without these skills. A portal enabling such researchers to benefit from new technologies is still missing. Results We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. The biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. The main processing steps of workflows are annotated on the basis of their input and output, elaboration type and application domain, using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. Conclusion We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical

  3. Towards seamless workflows in agile data science

    Science.gov (United States)

    Klump, J. F.; Robertson, J.

    2017-12-01

    Agile workflows are a response to projects with requirements that may change over time. They prioritise rapid and flexible responses to change, preferring to adapt to changes in requirements rather than predict them before a project starts. This suits the needs of research very well because research is inherently agile in its methodology. The adoption of agile methods has made collaborative data analysis much easier in a research environment fragmented across institutional data stores, HPC, personal and lab computers and, more recently, cloud environments. Agile workflows use tools that share a common worldview: in an agile environment, there may be more than one valid version of data, code or environment in play at any given time. All of these versions need references and identifiers. For example, a team of developers following the git-flow conventions (github.com/nvie/gitflow) may have several active branches, one for each strand of development. These workflows allow rapid and parallel iteration while maintaining identifiers pointing to individual snapshots of data and code and allowing rapid switching between strands. In contrast, the current focus of versioning in research data management is geared towards managing data for reproducibility and long-term preservation of the record of science. While both are important goals in the persistent curation domain of the institutional research data infrastructure, current tools emphasise planning over adaptation and can introduce unwanted rigidity by insisting on a single valid version or point of truth. In the collaborative curation domain of a research project, things are more fluid. However, there is no equivalent to the "versioning iso-surface" of the git protocol for the management and versioning of research data. At CSIRO we are developing concepts and tools for the agile management of software code and research data for virtual research environments, based on our experiences of actual data analytics projects in the

  4. Barriers to effective, safe communication and workflow between nurses and non-consultant hospital doctors during out-of-hours.

    Science.gov (United States)

    Brady, Anne-Marie; Byrne, Gobnait; Quirke, Mary Brigid; Lynch, Aine; Ennis, Shauna; Bhangu, Jaspreet; Prendergast, Meabh

    2017-11-01

    This study aimed to evaluate the nature and type of communication and workflow arrangements between nurses and doctors out-of-hours (OOH). Effective communication and workflow arrangements between nurses and doctors are essential to minimize risk in hospital settings, particularly in the out-of-hours period. Timely patient flow is a priority for all healthcare organizations and the quality of communication and workflow arrangements influences patient safety. Qualitative descriptive design; data collection methods included focus groups and individual interviews. A 500-bed tertiary referral acute hospital in Ireland. Junior and senior Non-Consultant Hospital Doctors, staff nurses and nurse managers. Both nurses and doctors acknowledged the importance of good interdisciplinary communication and collaborative working in sustaining effective workflow and enabling a supportive working environment and patient safety. Indeed, issues of safety and missed care OOH were found to be primarily due to difficulties of communication and workflow. Medical workflow OOH is often dependent on cues and communication to/from nursing. However, communication systems, and in particular the bleep system, considered central to the process of communication between doctors and nurses OOH, can contribute to workflow challenges and increased staff stress. It was reported as commonplace for routine work that should be completed during normal hours to fall into OOH when resources were most limited, further compounding risk to patient safety. Enhancement of communication strategies between nurses and doctors has the potential to remove barriers to effective decision-making and patient flow. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  5. Managing Library IT Workflow with Bugzilla

    Directory of Open Access Journals (Sweden)

    Nina McHale

    2010-09-01

    Full Text Available Prior to September 2008, all technology issues at the University of Colorado Denver's Auraria Library were reported to a dedicated departmental phone line. A variety of staff changes necessitated a more formal means of tracking, delegating, and resolving reported issues, and the department turned to Bugzilla, an open source bug tracking application designed by Mozilla.org developers. While designed with software development bug tracking in mind, Bugzilla can be easily customized and modified to serve as an IT ticketing system. Twenty-three months and over 2300 trouble tickets later, Auraria's IT department workflow is much smoother and more efficient. This article includes two Perl Template Toolkit code samples for customized Bugzilla screens for its use in a library environment; readers will be able to easily replicate the project in their own environments.
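
    The article itself provides Perl Template Toolkit samples for customizing Bugzilla's screens; as a complementary, purely illustrative angle, tickets can also be filed from scripts. The sketch below assumes a Bugzilla 5.x instance with its REST API enabled; the URL, product and component names, and the API key are placeholders, not values from the article, and required fields may vary with instance configuration.

        # Sketch: filing a trouble ticket programmatically against a hypothetical
        # Bugzilla 5.x instance via its REST API (POST /rest/bug).
        import requests

        BUGZILLA_URL = "https://bugzilla.example.edu/rest/bug"   # hypothetical instance

        ticket = {
            "product": "Library IT",                 # hypothetical product used as a queue
            "component": "Public Workstations",      # hypothetical component
            "version": "unspecified",
            "summary": "Catalog terminal on 2nd floor will not boot",
            "description": "Reported at the reference desk; screen stays black.",
            # depending on instance configuration, fields such as op_sys and
            # platform may also be required
        }

        response = requests.post(
            BUGZILLA_URL,
            json=ticket,
            params={"api_key": "REPLACE_WITH_API_KEY"},
            timeout=30,
        )
        response.raise_for_status()
        print("Created ticket id:", response.json().get("id"))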

  6. Swabs to genomes: a comprehensive workflow

    Directory of Open Access Journals (Sweden)

    Madison I. Dunitz

    2015-05-01

    Full Text Available The sequencing, assembly, and basic analysis of microbial genomes, once a painstaking and expensive undertaking, has become much easier for research labs with access to standard molecular biology and computational tools. However, there is a confusing variety of options available for DNA library preparation and sequencing, and inexperience with bioinformatics can pose a significant barrier to entry for many who may be interested in microbial genomics. The objective of the present study was to design, test, troubleshoot, and publish a simple, comprehensive workflow from the collection of an environmental sample (a swab) to a published microbial genome, empowering even a lab or classroom with limited resources and bioinformatics experience to perform it.

  7. The P2P approach to interorganizational workflows

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Weske, M.H.; Dittrich, K.R.; Geppert, A.; Norrie, M.C.

    2001-01-01

    This paper describes in an informal way the Public-To-Private (P2P) approach to interorganizational workflows, which is based on a notion of inheritance. The approach consists of three steps: (1) create a common understanding of the interorganizational workflow by specifying a shared public

  8. Reasoning about repairability of workflows at design time

    NARCIS (Netherlands)

    Tagni, Gaston; Ten Teije, Annette; Van Harmelen, Frank

    2009-01-01

    This paper describes an approach for reasoning about the repairability of workflows at design time. We propose a heuristic-based analysis of a workflow that aims at evaluating its definition, considering different design aspects and characteristics that affect its repairability (called repairability

  9. Design decisions in workflow management and quality of work.

    NARCIS (Netherlands)

    Waal, B.M.E. de; Batenburg, R.

    2009-01-01

    In this paper, the design and implementation of a workflow management (WFM) system in a large Dutch social insurance organisation is described. The effect of workflow design decisions on the quality of work is explored theoretically and empirically, using the model of Zur Mühlen as a frame of

  10. Conceptual framework and architecture for service mediating workflow management

    NARCIS (Netherlands)

    Hu, Jinmin; Grefen, P.W.P.J.

    2003-01-01

    This paper proposes a three-layer workflow concept framework to realize workflow enactment flexibility by dynamically binding activities to their implementations at run time. A service mediating layer is added to bridge business process definition and its implementation. Based on this framework, we

  11. Building and documenting workflows with python-based snakemake

    NARCIS (Netherlands)

    J. Köster (Johannes); S. Rahmann (Sven)

    2012-01-01

    Snakemake is a novel workflow engine with a simple Python-derived workflow definition language and an optimizing execution environment. It is the first system that supports multiple named wildcards (or variables) in input and output filenames of each rule definition. It also allows to
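
    As a concrete illustration of the named-wildcard mechanism mentioned above, a minimal Snakefile written in Snakemake's Python-derived syntax might look like the sketch below; the sample names, lane identifiers and the trimming command are invented for illustration, not taken from the paper.

        # Minimal Snakefile sketch: the {sample} and {lane} wildcards appear in both
        # input and output, so Snakemake infers the dependency graph from file names.
        SAMPLES = ["liver", "kidney"]
        LANES = ["L001", "L002"]

        rule all:
            input:
                expand("trimmed/{sample}_{lane}.fastq", sample=SAMPLES, lane=LANES)

        rule trim:
            input:
                "raw/{sample}_{lane}.fastq"
            output:
                "trimmed/{sample}_{lane}.fastq"
            shell:
                # hypothetical trimming command; any shell command can be used here
                "seqtk trimfq {input} > {output}"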

  12. Analyzing the Gap between Workflows and their Natural Language Descriptions

    NARCIS (Netherlands)

    Groth, P.T.; Gil, Y

    2009-01-01

    Scientists increasingly use workflows to represent and share their computational experiments. Because of their declarative nature, focus on pre-existing component composition and the availability of visual editors, workflows provide a valuable start for creating user-friendly environments for end

  13. Modelling and analysis of workflow for lean supply chains

    Science.gov (United States)

    Ma, Jinping; Wang, Kanliang; Xu, Lida

    2011-11-01

    Cross-organisational workflow systems are a component of enterprise information systems which support collaborative business processes among organisations in a supply chain. Currently, the majority of workflow systems are developed from the perspective of information modelling, without considering the actual requirements of supply chain management. In this article, we focus on the modelling and analysis of cross-organisational workflow systems in the context of the lean supply chain (LSC) using Petri nets. First, the article describes the assumed conditions of a cross-organisational workflow net according to the idea of the LSC and then discusses the standardisation of collaborative business processes between organisations in the context of the LSC. Second, the concept of labelled time Petri nets (LTPNs) is defined by combining labelled Petri nets with time Petri nets, and the concept of labelled time workflow nets (LTWNs) is also defined based on LTPNs. Cross-organisational labelled time workflow nets (CLTWNs) are then defined based on LTWNs. Third, the article proposes the notion of OR-silent CLTWNs and a verifying approach to the soundness of LTWNs and CLTWNs. Finally, this article illustrates how to use the proposed method with a simple example. The purpose of this research is to establish a formal method for the modelling and analysis of workflow systems for LSCs. This study initiates a new perspective of research on cross-organisational workflow management and promotes the operation management of LSCs in real-world settings.
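
    The labelled time workflow nets used in the paper are beyond a short example, but the basic token game on which all such workflow nets rest can be sketched in a few lines of Python. The two-step net below is a hypothetical ordering of tasks, not the lean-supply-chain model from the article.

        # Minimal Petri-net sketch: places hold tokens, a transition fires when every
        # input place has a token, consuming them and producing tokens on its outputs.
        from collections import Counter

        marking = Counter({"start": 1})        # initial marking of a tiny workflow net

        transitions = {
            "receive_order": {"in": ["start"], "out": ["order_received"]},
            "ship_goods":    {"in": ["order_received"], "out": ["end"]},
        }

        def enabled(t, m):
            return all(m[p] >= 1 for p in transitions[t]["in"])

        def fire(t, m):
            if not enabled(t, m):
                raise ValueError(f"transition {t} is not enabled")
            for p in transitions[t]["in"]:
                m[p] -= 1
            for p in transitions[t]["out"]:
                m[p] += 1

        fire("receive_order", marking)
        fire("ship_goods", marking)
        print(dict(marking))   # {'start': 0, 'order_received': 0, 'end': 1}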

  14. Two-Layer Transaction Management for Workflow Management Applications

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Vonk, J.; Boertjes, E.M.; Apers, Peter M.G.

    Workflow management applications require advanced transaction management that is not offered by traditional database systems. For this reason, a number of extended transaction models have been proposed in the past. None of these models seems completely adequate, though, because workflow management

  15. Parametric Room Acoustic workflows with real-time acoustic simulation

    DEFF Research Database (Denmark)

    Parigi, Dario

    2017-01-01

    The paper investigates and assesses the opportunities that real-time acoustic simulation offers to engage in parametric acoustic workflows and to influence architectural designs from early design stages.

  16. Open source workflow : a viable direction for BPM?

    NARCIS (Netherlands)

    Wohed, P.; Russell, N.C.; Hofstede, ter A.H.M.; Andersson, B.; Aalst, van der W.M.P.; Bellahsène, Z.; Léonard, M.

    2008-01-01

    With the growing interest in open source software in general and business process management and workflow systems in particular, it is worthwhile investigating the state of open source workflow management. The plethora of these offerings (recent surveys such as [4,6], each contain more than 30 such

  17. Distributed Global Transaction Support for Workflow Management Applications

    NARCIS (Netherlands)

    Vonk, J.; Grefen, P.W.P.J.; Boertjes, E.M.; Apers, Peter M.G.

    Workflow management systems require advanced transaction support to cope with their inherently long-running processes. The recent trend to distribute workflow executions requires an even more advanced transaction support system that is able to handle distribution. This paper presents a model as well

  18. A Collaborative Workflow for the Digitization of Unique Materials

    Science.gov (United States)

    Gueguen, Gretchen; Hanlon, Ann M.

    2009-01-01

    This paper examines the experience of one institution, the University of Maryland Libraries, as it made organizational efforts to harness existing workflows and to capture digitization done in the course of responding to patron requests. By examining the way this organization adjusted its existing workflows to put in place more systematic methods…

  19. "Intelligent" tools for workflow process redesign : a research agenda

    NARCIS (Netherlands)

    Netjes, M.; Vanderfeesten, I.T.P.; Reijers, H.A.; Bussler, C.; Haller, A.

    2006-01-01

    Although much attention has been paid to business processes during the past decades, the design of business processes, and particularly workflow processes, is still more art than science. In this workshop paper, we present our view on modeling methods for workflow processes and introduce our research

  20. Workflow automation based on OSI job transfer and manipulation

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Joosten, Stef M.M.; Guareis de farias, Cléver

    1999-01-01

    This paper shows that Workflow Management Systems (WFMS) and a data communication standard called Job Transfer and Manipulation (JTM) are built on the same concepts, even though different words are used. The paper analyses the correspondence of workflow concepts and JTM concepts. Besides, the

  1. From Paper Based Clinical Practice Guidelines to Declarative Workflow Management

    DEFF Research Database (Denmark)

    Lyng, Karen Marie; Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2009-01-01

    a sub workflow can be described in a declarative workflow management system: the Resultmaker Online Consultant (ROC). The example demonstrates that declarative primitives allow the paper-based flowchart to be naturally extended to an executable model without introducing a complex cyclic control flow graph.

  2. A Multi-Dimensional Classification Model for Scientific Workflow Characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Ramakrishnan, Lavanya; Plale, Beth

    2010-04-05

    Workflows have been used to model repeatable tasks or operations in manufacturing, business processes, and software. In recent years, workflows have increasingly been used for the orchestration of science discovery tasks that use distributed resources and web services environments through resource models such as grid and cloud computing. Workflows have disparate requirements and constraints that affect how they might be managed in distributed environments. In this paper, we present a multi-dimensional classification model illustrated by workflow examples obtained through a survey of scientists from different domains, including bioinformatics and biomedical, weather and ocean modeling, and astronomy, detailing their data and computational requirements. The survey results and classification model contribute to a high-level understanding of scientific workflows.

  3. A practical workflow for making anatomical atlases for biological research.

    Science.gov (United States)

    Wan, Yong; Lewis, A Kelsey; Colasanto, Mary; van Langeveld, Mark; Kardon, Gabrielle; Hansen, Charles

    2012-01-01

    The anatomical atlas has been at the intersection of science and art for centuries. These atlases are essential to biological research, but high-quality atlases are often scarce. Recent advances in imaging technology have made high-quality 3D atlases possible. However, until now there has been a lack of practical workflows using standard tools to generate atlases from images of biological samples. With certain adaptations, CG artists' workflow and tools, traditionally used in the film industry, are practical for building high-quality biological atlases. Researchers have developed a workflow for generating a 3D anatomical atlas using accessible artists' tools. They used this workflow to build a mouse limb atlas for studying the musculoskeletal system's development. This research aims to raise the awareness of using artists' tools in scientific research and promote interdisciplinary collaborations between artists and scientists. This video (http://youtu.be/g61C-nia9ms) demonstrates a workflow for creating an anatomical atlas.

  4. Federated Database Services for Wind Tunnel Experiment Workflows

    Directory of Open Access Journals (Sweden)

    A. Paventhan

    2006-01-01

    Full Text Available Enabling the full life cycle of scientific and engineering workflows requires robust middleware and services that support effective data management, near-realtime data movement and custom data processing. Many existing solutions exploit the database as a passive metadata catalog. In this paper, we present an approach that makes use of federation of databases to host data-centric wind tunnel application workflows. The user is able to compose customized application workflows based on database services. We provide a reference implementation that leverages typical business tools and technologies: Microsoft SQL Server for database services and Windows Workflow Foundation for workflow services. The application data and user's code are both hosted in federated databases. With the growing interest in XML Web Services in scientific Grids, and with databases beginning to support native XML types and XML Web services, we can expect the role of databases in scientific computation to grow in importance.

  5. Radiology information system: a workflow-based approach

    International Nuclear Information System (INIS)

    Zhang, Jinyan; Lu, Xudong; Nie, Hongchao; Huang, Zhengxing; Aalst, W.M.P. van der

    2009-01-01

    Introducing workflow management technology in healthcare seems promising for dealing with the problem that current healthcare information systems cannot provide sufficient support for process management, although several challenges still exist. The purpose of this paper is to study the method of developing a workflow-based information system in a radiology department as a use case. First, a workflow model of the typical radiology process was established. Second, based on the model, the system could be designed and implemented as a group of loosely coupled components. Each component corresponded to one task in the process and could be assembled by the workflow management system. The legacy systems could be taken as special components, which also corresponded to tasks and were integrated by transferring non-workflow-aware interfaces to the standard ones. Finally, a workflow dashboard was designed and implemented to provide an integral view of radiology processes. The workflow-based Radiology Information System was deployed in the radiology department of Zhejiang Chinese Medicine Hospital in China. The results showed that it could be adjusted flexibly in response to the needs of changing processes, and enhance process management in the department. It can also provide a more workflow-aware integration method, compared with other methods such as IHE-based ones. The workflow-based approach is a new method of developing radiology information systems with more flexibility, more functionality for process management and more workflow-aware integration. The work of this paper is an initial endeavor in introducing workflow management technology in healthcare. (orig.)
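
    The component-per-task design with legacy systems wrapped behind standard interfaces resembles a simple adapter pattern. The sketch below is a generic Python illustration of that idea; the task names and the legacy call are hypothetical and do not reproduce the system described in the paper.

        # Sketch of task components assembled by a workflow engine. Each component
        # exposes the same execute() interface; a legacy system is wrapped by an
        # adapter so the engine can treat it like any other task.
        class TaskComponent:
            name = "task"
            def execute(self, context: dict) -> dict:
                raise NotImplementedError

        class RegisterPatient(TaskComponent):
            name = "register_patient"
            def execute(self, context):
                context["registered"] = True
                return context

        def legacy_ris_write_report(patient_id):
            # stand-in for a non-workflow-aware legacy call
            return f"report for {patient_id}"

        class LegacyReportingAdapter(TaskComponent):
            """Wraps the legacy call behind the standard component interface."""
            name = "write_report"
            def execute(self, context):
                context["report"] = legacy_ris_write_report(context["patient_id"])
                return context

        def run_process(tasks, context):
            for task in tasks:   # the engine assembles components per process model
                context = task.execute(context)
            return context

        print(run_process([RegisterPatient(), LegacyReportingAdapter()],
                          {"patient_id": "P-001"}))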

  6. DOMstudio: an integrated workflow for Digital Outcrop Model reconstruction and interpretation

    Science.gov (United States)

    Bistacchi, Andrea

    2015-04-01

    Different remote sensing technologies, including photogrammetry and LIDAR, allow collecting 3D datasets that can be used to create 3D digital representations of outcrop surfaces, called Digital Outcrop Models (DOM), or sometimes Virtual Outcrop Models (VOM). Irrespective of the remote sensing technique used, DOMs can be represented either by photorealistic point clouds (PC-DOM) or textured surfaces (TS-DOM). The former are datasets composed of millions of points with XYZ coordinates and RGB colour, whilst the latter are triangulated surfaces onto which images of the outcrop have been mapped or "textured" (applying a technology originally developed for movies and videogames). Here we present a workflow that allows exploiting both kinds of dataset, PC-DOMs and TS-DOMs, in an integrated and efficient, yet flexible way. The workflow is composed of three main steps: (1) data collection and processing, (2) interpretation, and (3) modelling. Data collection can be performed with photogrammetry, LIDAR, or other techniques. The quality of photogrammetric datasets obtained with Structure From Motion (SFM) techniques has shown a tremendous improvement over the past few years, and this is becoming the most effective way to collect DOM datasets. The main advantages of photogrammetry over LIDAR are the very simple and lightweight field equipment (a digital camera) and the arbitrary spatial resolution, which can be increased simply by getting closer to the outcrop or by using a different lens. It must be noted that concerns about the precision of close-range photogrammetric surveys, which were justified in the past, are no longer a problem if modern software and acquisition schemes are applied. In any case, LIDAR is a well-tested technology and it is still very common. Irrespective of the data collection technology, the output will be a photorealistic point cloud and a collection of oriented photos, plus additional imagery in special projects (e.g. infrared images

  7. Distributed execution of aggregated multi domain workflows using an agent framework

    NARCIS (Netherlands)

    Zhao, Z.; Belloum, A.; de Laat, C.; Adriaans, P.; Hertzberger, B.; Zhang, L.J.; Watson, T.J.; Yang, J.; Hung, P.C.K.

    2007-01-01

    In e-Science, meaningful experiment processes and workflow engines emerge as important scientific resources. A complex experiment often involves services and processes developed in different scientific domains. Aggregating different workflows into one meta workflow avoids unnecessary rewriting of

  8. Provenance for distributed biomedical workflow execution

    NARCIS (Netherlands)

    Madougou, S.; Santcroos, M.; Benabdelkader, A.; van Schaik, B.D.; Shahand, S.; Korkhov, V.; van Kampen, A.H.C.; Olabarriaga, S.D.

    2012-01-01

    Scientific research has become very data and compute intensive because of the progress in data acquisition and measurement devices, which is particularly true in Life Sciences. To cope with this deluge of data, scientists use distributed computing and storage infrastructures. The use of such

  9. Mergers + acquisitions.

    Science.gov (United States)

    Hoppszallern, Suzanna

    2002-05-01

    The hospital sector in 2001 led the health care field in mergers and acquisitions. Most deals involved a network augmenting its presence within a specific region or in a market adjacent to its primary service area. Analysts expect M&A activity to increase in 2002.

  10. Exploring Dental Providers' Workflow in an Electronic Dental Record Environment.

    Science.gov (United States)

    Schwei, Kelsey M; Cooper, Ryan; Mahnke, Andrea N; Ye, Zhan; Acharya, Amit

    2016-01-01

    A workflow is defined as a predefined set of work steps, and a partial ordering of these steps, in any environment to achieve the expected outcome. Few studies have investigated the workflow of providers in a dental office. It is important to understand the interaction of dental providers with the existing technologies at the point of care to assess breakdowns in the workflow that could inform better technology designs. The study objective was to assess electronic dental record (EDR) workflows using a time and motion methodology in order to identify breakdowns and opportunities for process improvement. A time and motion methodology was used to study the human-computer interaction and workflow of dental providers with an EDR in four dental centers at a large healthcare organization. A data collection tool was developed to capture the workflow of dental providers and staff while they interacted with the EDR during initial, planned, and emergency patient visits, and at the front desk. Qualitative and quantitative analyses were conducted on the observational data. Breakdowns in workflow were identified while posting charges, viewing radiographs, e-prescribing, and interacting with the patient scheduler. EDR interaction time was significantly different between dentists and dental assistants (6:20 min vs. 10:57 min, p = 0.013) and between dentists and dental hygienists (6:20 min vs. 9:36 min, p = 0.003). On average, a dentist spent far less time than dental assistants and dental hygienists in data recording within the EDR.
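
    The group comparisons reported above (for example 6:20 min versus 10:57 min, p = 0.013) are the kind of result an independent-samples t-test produces. The snippet below is a generic illustration on made-up interaction times, not the study's data or analysis script.

        # Illustrative only: comparing per-visit EDR interaction times (in minutes)
        # between two provider groups with an independent-samples t-test.
        from scipy import stats

        dentists   = [5.9, 6.4, 6.1, 6.8, 6.0, 6.5]        # made-up example values
        assistants = [10.2, 11.4, 10.9, 11.1, 10.5, 11.6]  # made-up example values

        t_stat, p_value = stats.ttest_ind(dentists, assistants)
        print(f"t = {t_stat:.2f}, p = {p_value:.4f}")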

  11. Workflow Lexicons in Healthcare: Validation of the SWIM Lexicon.

    Science.gov (United States)

    Meenan, Chris; Erickson, Bradley; Knight, Nancy; Fossett, Jewel; Olsen, Elizabeth; Mohod, Prerna; Chen, Joseph; Langer, Steve G

    2017-06-01

    For clinical departments seeking to successfully navigate the challenges of modern health reform, obtaining access to operational and clinical data to establish and sustain goals for improving quality is essential. More broadly, health delivery organizations are also seeking to understand performance across multiple facilities and often across multiple electronic medical record (EMR) systems. Interpreting operational data across multiple vendor systems can be challenging, as different manufacturers may describe departmental workflow steps in different ways, and descriptions sometimes vary even within a single vendor's installed customer base. In 2012, the Society for Imaging Informatics in Medicine (SIIM) recognized the need for better quality and performance data standards and formed SIIM's Workflow Initiative for Medicine (SWIM), an initiative designed to consistently describe workflow steps in radiology departments as well as to define operational quality metrics. The SWIM lexicon was published as a working model to describe operational workflow steps and quality measures. We measured the prevalence of the SWIM lexicon workflow steps in both academic and community radiology environments using real-world patient observations and correlated that information with automatically captured workflow steps from our clinical information systems. Our goal was to measure the frequency of occurrence of workflow steps identified by the SWIM lexicon in a real-world clinical setting, as well as to correlate how accurately departmental information systems captured patient flow through our health facility.

  12. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    Science.gov (United States)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models emerging in Earth science. Existing data-oriented workflow tools, however, are not well suited to supporting collaborative design of such workflows: for example, they do not support real-time co-design, they do not track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists, and they do not capture and retrieve collaboration knowledge on workflow design (the discussions that lead to a design). To address the aforementioned challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool that supports an elastic number of groups of Earth scientists collaboratively designing and composing data analytics workflows through the Internet. Instead of reinventing the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  13. Deploying and sharing U-Compare workflows as web services.

    Science.gov (United States)

    Kontonatsios, Georgios; Korkontzelos, Ioannis; Kolluru, Balakrishna; Thompson, Paul; Ananiadou, Sophia

    2013-02-18

    U-Compare is a text mining platform that allows the construction, evaluation and comparison of text mining workflows. U-Compare contains a large library of components that are tuned to the biomedical domain. Users can rapidly develop biomedical text mining workflows by mixing and matching U-Compare's components. Workflows developed using U-Compare can be exported and sent to other users who, in turn, can import and re-use them. However, the resulting workflows are standalone applications, i.e., software tools that run and are accessible only via a local machine, and that can only be run with the U-Compare platform. We address the above issues by extending U-Compare to convert standalone workflows into web services automatically, via a two-click process. The resulting web services can be registered on a central server and made publicly available. Alternatively, users can make web services available on their own servers, after installing the web application framework, which is part of the extension to U-Compare. We have performed a user-oriented evaluation of the proposed extension, by asking users who have tested the enhanced functionality of U-Compare to complete questionnaires that assess its functionality, reliability, usability, efficiency and maintainability. The results obtained reveal that the new functionality is well received by users. The web services produced by U-Compare are built on top of open standards, i.e., REST and SOAP protocols, and therefore, they are decoupled from the underlying platform. Exported workflows can be integrated with any application that supports these open standards. We demonstrate how the newly extended U-Compare enhances the cross-platform interoperability of workflows, by seamlessly importing a number of text mining workflow web services exported from U-Compare into Taverna, i.e., a generic scientific workflow construction platform.
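
    Because the exported services are built on plain REST and SOAP, any HTTP client can invoke them. The sketch below shows a generic REST call in Python; the endpoint URL and the payload shape are hypothetical placeholders and do not reproduce the actual U-Compare service interfaces.

        # Generic sketch of invoking a text mining workflow exposed as a REST web
        # service. Endpoint and payload structure are invented for illustration.
        import requests

        SERVICE_URL = "https://text-mining.example.org/services/ner-workflow"  # hypothetical

        payload = {"text": "BRCA1 mutations are associated with breast cancer."}
        response = requests.post(SERVICE_URL, json=payload, timeout=60)
        response.raise_for_status()

        annotations = response.json()
        for ann in annotations.get("entities", []):
            print(ann)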

  14. Distributed Workflow Service Composition Based on CTR Technology

    Science.gov (United States)

    Feng, Zhilin; Ye, Yanming

    Recently, WS-BPEL has gradually become the basis of a standard for web service description and composition. However, WS-BPEL cannot efficiently describe distributed workflow services because it lacks the necessary expressive power and formal semantics. This paper presents a novel method for modeling distributed workflow service composition with Concurrent TRansaction logic (CTR). The syntactic structures of WS-BPEL and CTR are analyzed, and new rules for mapping WS-BPEL into CTR are given. A case study is presented to show that the proposed method is appropriate for modeling workflow business services in distributed environments.

  15. CMS Alignment and Calibration workflows: lessons learned and future plans

    CERN Document Server

    AUTHOR|(CDS)2069172

    2014-01-01

    We review the online and offline workflows designed to align and calibrate the CMS detector. Starting from the experience gained during the first LHC run, we discuss the expected developments for Run II. In particular, we describe the envisioned stages, from alignment using cosmic-ray data to detector alignment and calibration using the first proton-proton collision data (O(100 pb-1)) and a larger dataset (O(1 fb-1)) to reach the target precision. The automation of the workflows and their integration in the online and offline activity (dedicated triggers and datasets, data skims, workflows to compute the calibration and alignment constants) are discussed.

  16. What is needed for effective open access workflows?

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Institutions and funders are pushing open access forward with ever new guidelines and policies. Since institutional repositories are important maintainers of green open access, they should support easy and fast workflows for researchers and libraries to release publications. Based on the requirements specifications of researchers, libraries and publishers, possible supporting software extensions are discussed. What does a typical workflow look like? What has to be considered by the researchers and by the editors in the library before releasing a green open access publication? Where and how can software support and improve existing workflows?

  17. EDMS based workflow for Printing Industry

    Directory of Open Access Journals (Sweden)

    Prathap Nayak

    2013-04-01

    Full Text Available Information is an indispensable asset of any enterprise. Every transaction generates a record or document, either paper-based or in electronic format, that is kept for future reference. Printing is one such industry in which managing information of various formats, with the latest workflows and technologies, can be a nightmare and a challenge for any operator or user, since each process, from the smallest piece of information to the printed product, depends on the others. Hence the information has to be harmonized carefully in order to avoid production downtime or employees pointing fingers at each other. This paper analyses how the implementation of an Electronic Document Management System (EDMS) could contribute to the printing industry by providing immediate access to stored documents within and across departments, irrespective of geographical boundaries. The paper opens with a brief history, an overview of contemporary EDMS systems and some illustrated examples from a study in which the library was chosen as a pilot area for evaluating EDMS. The paper ends with a proposal that maps several document-management-based activities for the implementation of EDMS in a printing industry.

  18. Workflow management in large distributed systems

    International Nuclear Information System (INIS)

    Legrand, I; Newman, H; Voicu, R; Dobre, C; Grigoras, C

    2011-01-01

    The MonALISA (Monitoring Agents using a Large Integrated Services Architecture) framework provides a distributed service system capable of controlling and optimizing large-scale, data-intensive applications. An essential part of managing large-scale, distributed data-processing facilities is a monitoring system for computing facilities, storage, networks, and the very large number of applications running on these systems in near realtime. All this monitoring information gathered for all the subsystems is essential for developing the required higher-level services—the components that provide decision support and some degree of automated decisions—and for maintaining and optimizing workflow in large-scale distributed systems. These management and global optimization functions are performed by higher-level agent-based services. We present several applications of MonALISA's higher-level services including optimized dynamic routing, control, data-transfer scheduling, distributed job scheduling, dynamic allocation of storage resource to running jobs and automated management of remote services among a large set of grid facilities.

  19. Resilient workflows for computational mechanics platforms

    International Nuclear Information System (INIS)

    Nguyen, Toan; Trifan, Laurentiu; Desideri, Jean-Antoine

    2010-01-01

    Workflow management systems have recently been the focus of much interest and of many research and deployment efforts for scientific applications worldwide. Their ability to abstract applications by wrapping application codes has also underscored the usefulness of such systems for multidiscipline applications. When complex applications need to provide seamless interfaces that hide the technicalities of the computing infrastructures, their high-level modeling, monitoring and execution functionalities help give production teams seamless and effective facilities. Software integration infrastructures based on programming paradigms such as Python, Matlab and Scilab have also provided evidence of the usefulness of such approaches for the tight coupling of multidiscipline application codes. Also, high-performance computing based on multi-core, multi-cluster infrastructures opens new opportunities for more accurate, more extensive and more effective robust multi-discipline simulations for the decades to come. This supports the goal of full flight-dynamics simulation for 3D aircraft models within the next decade, opening the way to virtual flight tests and certification of aircraft in the future.

  20. Resilient workflows for computational mechanics platforms

    Science.gov (United States)

    Nguyên, Toàn; Trifan, Laurentiu; Désidéri, Jean-Antoine

    2010-06-01

    Workflow management systems have recently been the focus of much interest and of many research and deployment efforts for scientific applications worldwide [26, 27]. Their ability to abstract applications by wrapping application codes has also underscored the usefulness of such systems for multidiscipline applications [23, 24]. When complex applications need to provide seamless interfaces that hide the technicalities of the computing infrastructures, their high-level modeling, monitoring and execution functionalities help give production teams seamless and effective facilities [25, 31, 33]. Software integration infrastructures based on programming paradigms such as Python, Matlab and Scilab have also provided evidence of the usefulness of such approaches for the tight coupling of multidiscipline application codes [22, 24]. Also, high-performance computing based on multi-core, multi-cluster infrastructures opens new opportunities for more accurate, more extensive and more effective robust multi-discipline simulations for the decades to come [28]. This supports the goal of full flight-dynamics simulation for 3D aircraft models within the next decade, opening the way to virtual flight tests and certification of aircraft in the future [23, 24, 29].

  1. Workflow management in large distributed systems

    Science.gov (United States)

    Legrand, I.; Newman, H.; Voicu, R.; Dobre, C.; Grigoras, C.

    2011-12-01

    The MonALISA (Monitoring Agents using a Large Integrated Services Architecture) framework provides a distributed service system capable of controlling and optimizing large-scale, data-intensive applications. An essential part of managing large-scale, distributed data-processing facilities is a monitoring system for computing facilities, storage, networks, and the very large number of applications running on these systems in near realtime. All this monitoring information gathered for all the subsystems is essential for developing the required higher-level services—the components that provide decision support and some degree of automated decisions—and for maintaining and optimizing workflow in large-scale distributed systems. These management and global optimization functions are performed by higher-level agent-based services. We present several applications of MonALISA's higher-level services including optimized dynamic routing, control, data-transfer scheduling, distributed job scheduling, dynamic allocation of storage resource to running jobs and automated management of remote services among a large set of grid facilities.

  2. Mergers & Acquisitions

    DEFF Research Database (Denmark)

    Fomcenco, Alex

    This dissertation is a legal dogmatic thesis, the goal of which is to describe and analyze the current state of law in Europe in regard to some relevant selected elements related to mergers and acquisitions, and the adviser’s counsel in this regard. Having regard to the topic of the dissertation...... and fiscal neutrality, group-related issues, holding-structure issues, employees, stock exchange listing issues, and corporate nationality....

  3. GeNNet: an integrated platform for unifying scientific workflows and graph databases for transcriptome data analysis

    Directory of Open Access Journals (Sweden)

    Raquel L. Costa

    2017-07-01

    Full Text Available There are many steps in analyzing transcriptome data, from the acquisition of raw data to the selection of a subset of representative genes that explain a scientific hypothesis. The data produced can be represented as networks of interactions among genes, and these may additionally be integrated with other biological databases, such as Protein-Protein Interactions, transcription factors and gene annotation. However, the results of these analyses remain fragmented, imposing difficulties either for later inspection of results or for meta-analysis through the incorporation of new related data. Integrating databases and tools into scientific workflows, orchestrating their execution, and managing the resulting data and its respective metadata are challenging tasks. Additionally, a great amount of effort is equally required to run in-silico experiments to structure and compose the information as needed for analysis. Different programs may need to be applied and different files are produced during the experiment cycle. In this context, the availability of a platform supporting experiment execution is paramount. We present GeNNet, an integrated transcriptome analysis platform that unifies scientific workflows with graph databases for selecting relevant genes according to the evaluated biological systems. It includes GeNNet-Wf, a scientific workflow that pre-loads biological data, pre-processes raw microarray data and conducts a series of analyses including normalization, differential expression inference, clustering and gene set enrichment analysis. A user-friendly web interface, GeNNet-Web, allows for setting parameters, executing, and visualizing the results of GeNNet-Wf executions. To demonstrate the features of GeNNet, we performed case studies with data retrieved from GEO, particularly using a single-factor experiment in different analysis scenarios. As a result, we obtained differentially expressed genes for which biological functions were
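
    The differential-expression step in such a pipeline reduces to a per-gene test between conditions followed by multiple-testing correction. The snippet below is a deliberately simplified Python illustration on a randomly generated expression matrix; it is not the GeNNet-Wf implementation, and the array sizes and cutoffs are arbitrary.

        # Simplified per-gene differential expression on a toy expression matrix:
        # t-test between two conditions, then a Benjamini-Hochberg style correction.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n_genes = 500
        control   = rng.normal(8.0, 1.0, size=(n_genes, 4))   # 4 control arrays
        treatment = rng.normal(8.0, 1.0, size=(n_genes, 4))   # 4 treated arrays
        treatment[:25] += 2.0                                  # spike in 25 "regulated" genes

        t_stats, p_values = stats.ttest_ind(treatment, control, axis=1)

        # Raw Benjamini-Hochberg q-values (monotonicity enforcement omitted for brevity)
        order = np.argsort(p_values)
        ranks = np.empty_like(order)
        ranks[order] = np.arange(1, n_genes + 1)
        q_values = np.minimum(1.0, p_values * n_genes / ranks)

        print("genes with q < 0.05:", int(np.sum(q_values < 0.05)))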

  4. A standard-enabled workflow for synthetic biology

    KAUST Repository

    Myers, Chris J.

    2017-06-15

    A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools to be connected from a diversity of sources. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce these types of design information, with multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in its publications.

  5. A standard-enabled workflow for synthetic biology

    KAUST Repository

    Myers, Chris J.; Beal, Jacob; Gorochowski, Thomas E.; Kuwahara, Hiroyuki; Madsen, Curtis; McLaughlin, James Alastair; Mısırlı, Gö ksel; Nguyen, Tramy; Oberortner, Ernst; Samineni, Meher; Wipat, Anil; Zhang, Michael; Zundel, Zach

    2017-01-01

    A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select

  6. Text mining meets workflow: linking U-Compare with Taverna

    Science.gov (United States)

    Kano, Yoshinobu; Dobson, Paul; Nakanishi, Mio; Tsujii, Jun'ichi; Ananiadou, Sophia

    2010-01-01

    Summary: Text mining from the biomedical literature is of increasing importance, yet it is not easy for the bioinformatics community to create and run text mining workflows due to the lack of accessibility and interoperability of the text mining resources. The U-Compare system provides a wide range of bio text mining resources in a highly interoperable workflow environment where workflows can very easily be created, executed, evaluated and visualized without coding. We have linked U-Compare to Taverna, a generic workflow system, to expose text mining functionality to the bioinformatics community. Availability: http://u-compare.org/taverna.html, http://u-compare.org Contact: kano@is.s.u-tokyo.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20709690

  7. Job life cycle management libraries for CMS workflow management projects

    International Nuclear Information System (INIS)

    Lingen, Frank van; Wilkinson, Rick; Evans, Dave; Foulkes, Stephen; Afaq, Anzar; Vaandering, Eric; Ryu, Seangchan

    2010-01-01

    Scientific analysis and simulation require the processing and generation of millions of data samples. These tasks are often composed of multiple smaller tasks divided over multiple (computing) sites. This paper discusses the Compact Muon Solenoid (CMS) workflow infrastructure, and specifically the Python-based workflow library which is used for so-called task lifecycle management. The CMS workflow infrastructure consists of three layers: high-level specification of the various tasks based on input/output data sets, lifecycle management of task instances derived from the high-level specification, and execution management. The workflow library is the result of a convergence of three CMS sub-projects that respectively deal with scientific analysis, simulation and real-time data aggregation from the experiment. This will reduce duplication and hence development and maintenance costs.
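
    A central idea in such a library is tracking each task instance through a fixed set of lifecycle states with only certain transitions allowed. The sketch below is a generic Python state machine with invented state names; it is not the actual CMS workflow library.

        # Generic task-lifecycle sketch: each task instance moves through a fixed set
        # of states, and only the listed transitions are permitted.
        ALLOWED = {
            "created":   {"queued"},
            "queued":    {"running", "cancelled"},
            "running":   {"succeeded", "failed"},
            "failed":    {"queued"},          # e.g. automatic resubmission
            "succeeded": set(),
            "cancelled": set(),
        }

        class TaskInstance:
            def __init__(self, spec_name: str):
                self.spec_name = spec_name
                self.state = "created"
                self.history = [self.state]

            def advance(self, new_state: str) -> None:
                if new_state not in ALLOWED[self.state]:
                    raise ValueError(f"{self.state} -> {new_state} is not a valid transition")
                self.state = new_state
                self.history.append(new_state)

        task = TaskInstance("merge_dataset_42")
        for step in ("queued", "running", "failed", "queued", "running", "succeeded"):
            task.advance(step)
        print(task.history)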

  8. A Community-Driven Workflow Recommendation and Reuse Infrastructure

    Data.gov (United States)

    National Aeronautics and Space Administration — Promote and encourage process and workflow reuse  within NASA Earth eXchange (NEX) by developing a proactive recommendation technology based on collective NEX user...

  9. modeling workflow management in a distributed computing system

    African Journals Online (AJOL)

    Dr Obe

    communication system, which allows for computerized support. ... Keywords: Distributed computing system; Petri nets; Workflow management.

  10. Scientific workflows as productivity tools for drug discovery.

    Science.gov (United States)

    Shon, John; Ohkawa, Hitomi; Hammer, Juergen

    2008-05-01

    Large pharmaceutical companies annually invest tens to hundreds of millions of US dollars in research informatics to support their early drug discovery processes. Traditionally, most of these investments are designed to increase the efficiency of drug discovery. The introduction of do-it-yourself scientific workflow platforms has enabled research informatics organizations to shift their efforts toward scientific innovation, ultimately resulting in a possible increase in return on their investments. Unlike the handling of most scientific data and application integration approaches, researchers apply scientific workflows to in silico experimentation and exploration, leading to scientific discoveries that lie beyond automation and integration. This review highlights some key requirements for scientific workflow environments in the pharmaceutical industry that are necessary for increasing research productivity. Examples of the application of scientific workflows in research and a summary of recent platform advances are also provided.

  11. FOSS geospatial libraries in scientific workflow environments: experiences and directions

    CSIR Research Space (South Africa)

    McFerren, G

    2011-07-01

    Full Text Available of experiments. In the context of three sets of research (wildfire research, flood modelling and the linking of disease outbreaks to multi-scale environmental conditions), we describe our efforts to provide geospatial capability for scientific workflow software...

  12. Workflows for microarray data processing in the Kepler environment

    Directory of Open Access Journals (Sweden)

    Stropp Thomas

    2012-05-01

    Full Text Available Abstract Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data and therefore are close to
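
    GFF files are plain tab-separated text with nine columns, so the kind of GFF pre-processing step mentioned above can be scripted directly. The snippet below is a generic Python illustration of filtering features by chromosome; it is not one of the Kepler actors from the article, and the input file name is a placeholder.

        # Minimal GFF filtering sketch: keep features on a given chromosome and
        # report how many of each feature type were retained. GFF columns are:
        # seqid, source, type, start, end, score, strand, frame, attributes.
        import csv
        from collections import Counter

        def filter_gff(path: str, seqid: str):
            kept, counts = [], Counter()
            with open(path, newline="") as handle:
                for row in csv.reader(handle, delimiter="\t"):
                    if not row or row[0].startswith("#") or len(row) < 9:
                        continue              # skip headers, comments, malformed lines
                    if row[0] == seqid:
                        kept.append(row)
                        counts[row[2]] += 1
            return kept, counts

        if __name__ == "__main__":
            features, by_type = filter_gff("example.gff", "chr1")  # hypothetical input
            print(len(features), dict(by_type))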

  13. Workflows for microarray data processing in the Kepler environment.

    Science.gov (United States)

    Stropp, Thomas; McPhillips, Timothy; Ludäscher, Bertram; Bieda, Mark

    2012-05-17

    Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or R

  14. Workflows for microarray data processing in the Kepler environment

    Science.gov (United States)

    2012-01-01

    Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or

  15. Kronos: a workflow assembler for genome analytics and informatics

    Science.gov (United States)

    Taghiyar, M. Jafar; Rosner, Jamie; Grewal, Diljot; Grande, Bruno M.; Aniba, Radhouane; Grewal, Jasleen; Boutros, Paul C.; Morin, Ryan D.

    2017-01-01

    Abstract Background: The field of next-generation sequencing informatics has matured to a point where algorithmic advances in sequence alignment and individual feature detection methods have stabilized. Practical and robust implementation of complex analytical workflows (where such tools are structured into “best practices” for automated analysis of next-generation sequencing datasets) still requires significant programming investment and expertise. Results: We present Kronos, a software platform for facilitating the development and execution of modular, auditable, and distributable bioinformatics workflows. Kronos obviates the need for explicit coding of workflows by compiling a text configuration file into executable Python applications. Making analysis modules would still require programming. The framework of each workflow includes a run manager to execute the encoded workflows locally (or on a cluster or cloud), parallelize tasks, and log all runtime events. The resulting workflows are highly modular and configurable by construction, facilitating flexible and extensible meta-applications that can be modified easily through configuration file editing. The workflows are fully encoded for ease of distribution and can be instantiated on external systems, a step toward reproducible research and comparative analyses. We introduce a framework for building Kronos components that function as shareable, modular nodes in Kronos workflows. Conclusions: The Kronos platform provides a standard framework for developers to implement custom tools, reuse existing tools, and contribute to the community at large. Kronos is shipped with both Docker and Amazon Web Services Machine Images. It is free, open source, and available through the Python Package Index and at https://github.com/jtaghiyar/kronos. PMID:28655203
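
    The idea of compiling a plain-text configuration into an executable pipeline can be illustrated with a small sketch. The following Python fragment is purely illustrative: the config format, task names and commands are invented for this example and are not Kronos's actual schema or API.

        # Toy "configuration-to-pipeline" compiler, illustrating the general idea only.
        # The config syntax and the external commands are hypothetical, not Kronos's format.
        import textwrap

        CONFIG = textwrap.dedent("""\
            align:    bwa mem ref.fa {sample}.fq > {sample}.sam
            sort:     samtools sort {sample}.sam -o {sample}.bam
            flagstat: samtools flagstat {sample}.bam > {sample}.stats
        """)

        def compile_pipeline(config_text):
            """Turn each 'name: command template' line into an ordered list of steps."""
            steps = []
            for line in config_text.strip().splitlines():
                name, template = line.split(":", 1)
                steps.append((name.strip(), template.strip()))
            return steps

        def run_pipeline(steps, **params):
            for name, template in steps:
                cmd = template.format(**params)
                print(f"[{name}] {cmd}")
                # subprocess.run(cmd, shell=True, check=True)  # enable where the tools exist

        run_pipeline(compile_pipeline(CONFIG), sample="patient01")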

  16. Translating Unstructured Workflow Processes to Readable BPEL: Theory and Implementation

    DEFF Research Database (Denmark)

    van der Aalst, Willibrordus Martinus Pancratius; Lassen, Kristian Bisgaard

    2008-01-01

    and not easy to use by end-users. Therefore, we provide a mapping from Workflow Nets (WF-nets) to BPEL. This mapping builds on the rich theory of Petri nets and can also be used to map other languages (e.g., UML, EPC, BPMN, etc.) onto BPEL. In addition to this, we have implemented the algorithm in a tool called WorkflowNet2BPEL4WS.

  17. Worklist handling in workflow-enabled radiological application systems

    Science.gov (United States)

    Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim; von Berg, Jens

    2000-05-01

    For the next generation of integrated information systems for health care applications, more emphasis has to be put on systems which, by design, support the reduction of cost, the increase in efficiency and the improvement of the quality of services. A substantial contribution to this will be the modeling, optimization, automation and enactment of processes in health care institutions. One of the perceived key success factors for the system integration of processes will be the application of workflow management, with workflow management systems as key technology components. In this paper we address workflow management in radiology. We focus on an important aspect of workflow management, the generation and handling of worklists, which provide workflow participants automatically with work items that reflect tasks to be performed. The display of worklists and the functions associated with work items are the visible part for the end-users of an information system using a workflow management approach. Appropriate worklist design and implementation will influence the user friendliness of a system and will largely influence work efficiency. Technically, in current imaging department information system environments (modality-PACS-RIS installations), a data-driven approach has been taken: worklists -- if present at all -- are generated from filtered views on application databases. In a future workflow-based approach, worklists will be generated by autonomous workflow services based on explicit process models and organizational models. This process-oriented approach will provide us with an integral view of entire health care processes or sub-processes. The paper describes the basic mechanisms of this approach and summarizes its benefits.

  18. CO2 Storage Feasibility: A Workflow for Site Characterisation

    Directory of Open Access Journals (Sweden)

    Nepveu Manuel

    2015-04-01

    Full Text Available In this paper, we present an overview of the SiteChar workflow model for site characterisation and assessment for CO2 storage. Site characterisation and assessment is required when permits are requested from the legal authorities in the process of starting a CO2 storage process at a given site. The goal is to assess whether a proposed CO2 storage site can indeed be used for permanent storage while meeting the safety requirements demanded by the European Commission (EC) Storage Directive (Directive 2009/31/EC). Many issues have to be scrutinised, and the workflow presented here is put forward to help efficiently organise this complex task. Three issues are highlighted: communication within the working team and with the authorities; interdependencies in the workflow and feedback loops; and the risk-based character of the workflow. A general overview (helicopter view) of the workflow is given; the issues involved in communication and the risk assessment process are described in more detail. The workflow as described has been tested within the SiteChar project on five potential storage sites throughout Europe. This resulted in a list of key aspects of site characterisation which can help prepare and focus new site characterisation studies.

  19. LQCD workflow execution framework: Models, provenance and fault-tolerance

    International Nuclear Information System (INIS)

    Piccoli, Luciano; Simone, James N; Kowalkowlski, James B; Dubey, Abhishek

    2010-01-01

    Large computing clusters used for scientific processing suffer from systemic failures when operated over long continuous periods for executing workflows. Diagnosing job problems and faults leading to eventual failures in this complex environment is difficult, specifically when the success of an entire workflow might be affected by a single job failure. In this paper, we introduce a model-based, hierarchical, reliable execution framework that encompasses workflow specification, data provenance, execution tracking and online monitoring of each workflow task, also referred to as participants. The sequence of participants is described in an abstract parameterized view, which is translated into a concrete data-dependency-based sequence of participants with defined arguments. As participants belonging to a workflow are mapped onto machines and executed, periodic and on-demand monitoring of vital health parameters on allocated nodes is enabled according to pre-specified rules. These rules specify conditions that must be true pre-execution, during execution and post-execution. Monitoring information for each participant is propagated upwards through the reflex and healing architecture, which consists of a hierarchical network of decentralized fault management entities, called reflex engines. They are instantiated as state machines or timed automata that change state and initiate reflexive mitigation action(s) upon occurrence of certain faults. We describe how this cluster reliability framework is combined with the workflow execution framework using formal rules and actions specified within a structure of first order predicate logic that enables a dynamic management design that reduces manual administrative workload and increases cluster productivity.
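
    As a rough illustration of the reflex-engine idea, the sketch below encodes a few pre-, during- and post-execution health rules and triggers a mitigation hook when one is violated. The rule names, thresholds and actions are assumptions made for this example, not the framework described in the paper.

        # Minimal sketch of a reflex-engine-style monitor (illustrative only): health rules
        # per execution phase, plus a mitigation hook that fires when a rule is violated.
        RULES = {
            "pre":    [("free_disk_gb", lambda v: v > 10)],
            "during": [("node_temp_c",  lambda v: v < 85),
                       ("mem_used_pct", lambda v: v < 95)],
            "post":   [("exit_code",    lambda v: v == 0)],
        }

        class ReflexEngine:
            def __init__(self):
                self.state = "healthy"

            def check(self, phase, metrics):
                for name, ok in RULES[phase]:
                    if name in metrics and not ok(metrics[name]):
                        self.state = "faulted"
                        self.mitigate(phase, name, metrics[name])
                return self.state

            def mitigate(self, phase, rule, value):
                # A real engine might requeue the participant or fence the node here.
                print(f"fault in {phase} phase: rule '{rule}' violated (value={value})")

        engine = ReflexEngine()
        print(engine.check("during", {"node_temp_c": 91, "mem_used_pct": 70}))  # -> faulted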

  20. LabelFlow Framework for Annotating Workflow Provenance

    Directory of Open Access Journals (Sweden)

    Pinar Alper

    2018-02-01

    Full Text Available Scientists routinely analyse and share data for others to use. Successful data (re)use relies on having metadata describing the context of analysis of data. In many disciplines the creation of contextual metadata is referred to as reporting. One method of implementing analyses is with workflows. A stand-out feature of workflows is their ability to record provenance from executions. Provenance is useful when analyses are executed with changing parameters (changing contexts) and results need to be traced to respective parameters. In this paper we investigate whether provenance can be exploited to support reporting. Specifically, we outline a case-study based on a real-world workflow and set of reporting queries. We observe that provenance, as collected from workflow executions, is of limited use for reporting, as it supports queries only partially. We identify that this is due to the generic nature of provenance and its lack of domain-specific contextual metadata. We observe that the required information is available in implicit form, embedded in data. We describe LabelFlow, a framework comprised of four Labelling Operators for decorating provenance with domain-specific Labels. LabelFlow can be instantiated for a domain by plugging it with domain-specific metadata extractors. We provide a tool that takes as input a workflow, and produces as output a Labelling Pipeline for that workflow, comprised of Labelling Operators. We revisit the case-study and show how Labels provide a more complete implementation of reporting queries.
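
    The labelling idea can be sketched as follows: a generic labelling operator walks provenance records and attaches domain labels produced by a pluggable metadata extractor. The function names and record layout below are hypothetical and are not LabelFlow's actual interfaces.

        # Hypothetical sketch of the labelling idea (not LabelFlow's actual API): a labelling
        # operator decorates generic provenance records with labels produced by a plug-in
        # domain-specific metadata extractor.
        def extract_labels_from_fasta_header(artifact):
            """Example domain extractor: pull an organism tag out of embedded data."""
            header = artifact.get("first_line", "")
            return {"organism": header.lstrip(">").split()[0]} if header.startswith(">") else {}

        def label_provenance(records, extractor):
            for rec in records:
                rec.setdefault("labels", {}).update(extractor(rec["artifact"]))
            return records

        provenance = [
            {"activity": "align", "artifact": {"path": "reads.fa", "first_line": ">Homo_sapiens chr1"}},
        ]
        print(label_provenance(provenance, extract_labels_from_fasta_header))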

  1. A Kepler Workflow Tool for Reproducible AMBER GPU Molecular Dynamics.

    Science.gov (United States)

    Purawat, Shweta; Ieong, Pek U; Malmstrom, Robert D; Chan, Garrett J; Yeung, Alan K; Walker, Ross C; Altintas, Ilkay; Amaro, Rommie E

    2017-06-20

    With the drive toward high throughput molecular dynamics (MD) simulations involving ever-greater numbers of simulation replicates run for longer, biologically relevant timescales (microseconds), the need for improved computational methods that facilitate fully automated MD workflows gains more importance. Here we report the development of an automated workflow tool to perform AMBER GPU MD simulations. Our workflow tool capitalizes on the capabilities of the Kepler platform to deliver a flexible, intuitive, and user-friendly environment and the AMBER GPU code for a robust and high-performance simulation engine. Additionally, the workflow tool reduces user input time by automating repetitive processes and facilitates access to GPU clusters, whose high-performance processing power makes simulations of large numerical scale possible. The presented workflow tool facilitates the management and deployment of large sets of MD simulations on heterogeneous computing resources. The workflow tool also performs systematic analysis on the simulation outputs and enhances simulation reproducibility, execution scalability, and MD method development including benchmarking and validation. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  2. Enabling Structured Exploration of Workflow Performance Variability in Extreme-Scale Environments

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin; Stephan, Eric G.; Raju, Bibi; Altintas, Ilkay; Elsethagen, Todd O.; Krishnamoorthy, Sriram

    2015-11-15

    Workflows are taking an increasingly important role in orchestrating complex scientific processes in extreme scale and highly heterogeneous environments. However, to date we cannot reliably predict, understand, and optimize workflow performance. Sources of performance variability and in particular the interdependencies of workflow design, execution environment and system architecture are not well understood. While there is a rich portfolio of tools for performance analysis, modeling and prediction for single applications in homogeneous computing environments, these are not applicable to workflows, due to the number and heterogeneity of the involved workflow and system components and their strong interdependencies. In this paper, we investigate workflow performance goals and identify factors that could have a relevant impact. Based on our analysis, we propose a new workflow performance provenance ontology, the Open Provenance Model-based WorkFlow Performance Provenance, or OPM-WFPP, that will enable the empirical study of workflow performance characteristics and variability including complex source attribution.

  3. MSM, an Efficient Workflow for Metabolite Identification Using Hybrid Linear Ion Trap Orbitrap Mass Spectrometer

    Science.gov (United States)

    Cho, Robert; Huang, Yingying; Schwartz, Jae C.; Chen, Yan; Carlson, Timothy J.; Ma, Ji

    2012-05-01

    Identification of drug metabolites can often yield important information regarding clearance mechanism, pharmacologic activity, or toxicity for drug candidate molecules. Additionally, the identification of metabolites can provide beneficial structure-activity insight to help guide lead optimization efforts towards molecules with optimal metabolic profiles. There are challenges associated with detecting and identifying metabolites in the presence of complex biological matrices, and new LC-MS technologies have been developed to meet these challenges. In this report, we describe the development of an experimental approach that applies unique features of the hybrid linear ion trap Orbitrap mass spectrometer to streamline in vitro and in vivo metabolite identification experiments. The approach, referred to as MSM, utilizes multiple collision cells, dissociation methods, mass analyzers, and detectors. With multiple scan types and different dissociation modes built into one experimental method, along with flexible post-acquisition analysis options, the MSM workflow offers an attractive option for fast and reliable identification of metabolites in different kinds of in vitro and in vivo samples. The MSM workflow was successfully applied to metabolite identification analysis of verapamil in both in vitro rat hepatocyte incubations and in vivo rat bile samples.

  4. The impact of using an intravenous workflow management system (IVWMS) on cost and patient safety.

    Science.gov (United States)

    Lin, Alex C; Deng, Yihong; Thaibah, Hilal; Hingl, John; Penm, Jonathan; Ivey, Marianne F; Thomas, Mark

    2018-07-01

    The aim of this study was to determine the financial costs associated with wasted and missing doses before and after the implementation of an intravenous workflow management system (IVWMS) and to quantify the number and the rate of detected intravenous (IV) preparation errors. A retrospective analysis of the sample hospital information system database was conducted using three months of data before and after the implementation of an IVWMS (DoseEdge®), which uses barcode scanning and photographic technologies to track and verify each step of the preparation process. The financial impact associated with wasted and missing IV doses was determined by combining drug acquisition, labor, accessory, and disposal costs. The intercepted error reports and pharmacist-detected error reports were drawn from the IVWMS to quantify the number of errors by defined error categories. The total numbers of IV doses prepared before and after the implementation of the IVWMS were 110,963 and 101,765 doses, respectively. The adoption of the IVWMS significantly reduced the amount of wasted and missing IV doses by 14,176 and 2268 doses, respectively; the corresponding financial impact was $144,019 over 3 months. The total number of errors detected was 1160 (1.14%) after using the IVWMS. The implementation of the IVWMS facilitated workflow changes that led to a positive impact on cost and patient safety. The implementation of the IVWMS increased patient safety by enforcing standard operating procedures and bar code verifications. Published by Elsevier B.V.

  5. Visual Workflows for Oil and Gas Exploration

    KAUST Repository

    Hollt, Thomas

    2013-04-14

    The most important resources to fulfill today’s energy demands are fossil fuels, such as oil and natural gas. When exploiting hydrocarbon reservoirs, a detailed and credible model of the subsurface structures is crucial for planning the path of the borehole and minimizing economic and ecological risks. Before that, the placement as well as the operations of oil rigs need to be planned carefully, as off-shore oil exploration is vulnerable to hazards caused by strong currents. The oil and gas industry therefore relies on accurate ocean forecasting systems for planning their operations. This thesis presents visual workflows for creating subsurface models as well as planning the placement and operations of off-shore structures. Creating a credible subsurface model poses two major challenges: First, the structures in highly ambiguous seismic data are interpreted in the time domain. Second, a velocity model has to be built from this interpretation to match the model to depth measurements from wells. If it is not possible to obtain a match at all positions, the interpretation has to be updated, going back to the first step. This results in a lengthy back and forth between the different steps, or in an unphysical velocity model in many cases. We present a novel, integrated approach to interactively creating subsurface models from reflection seismics, by integrating the interpretation of the seismic data using an interactive horizon extraction technique based on piecewise global optimization with velocity modeling. Computing and visualizing, on the fly, the effects of changes to the interpretation and velocity model on the depth-converted model enables an integrated feedback loop and a completely new connection between the seismic data in the time domain and the well data in the depth domain. For planning the operations of off-shore structures we present a novel integrated visualization system that enables interactive visual analysis of ensemble simulations used in ocean

  6. Automated Processing Workflow for Ambient Seismic Recordings

    Science.gov (United States)

    Girard, A. J.; Shragge, J.

    2017-12-01

    Structural imaging using body-wave energy present in ambient seismic data remains a challenging task, largely because these wave modes are commonly much weaker than surface wave energy. In a number of situations body-wave energy has been extracted successfully; however, (nearly) all successful body-wave extraction and imaging approaches have focused on cross-correlation processing. While this is useful for interferometric purposes, it can also lead to the inclusion of unwanted noise events that dominate the resulting stack, leaving body-wave energy overpowered by the coherent noise. Conversely, wave-equation imaging can be applied directly on non-correlated ambient data that has been preprocessed to mitigate unwanted energy (i.e., surface waves, burst-like and electromechanical noise) to enhance body-wave arrivals. Following this approach, though, requires a significant preprocessing effort on often Terabytes of ambient seismic data, which is expensive and requires automation to be a feasible approach. In this work we outline an automated processing workflow designed to optimize body wave energy from an ambient seismic data set acquired on a large-N array at a mine site near Lalor Lake, Manitoba, Canada. We show that processing ambient seismic data in the recording domain, rather than the cross-correlation domain, allows us to mitigate energy that is inappropriate for body-wave imaging. We first develop a method for window selection that automatically identifies and removes data contaminated by coherent high-energy bursts. We then apply time- and frequency-domain debursting techniques to mitigate the effects of remaining strong amplitude and/or monochromatic energy without severely degrading the overall waveforms. After each processing step we implement a QC check to investigate improvements in the convergence rates - and the emergence of reflection events - in the cross-correlation plus stack waveforms over hour-long windows. Overall, the QC analyses suggest that
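
    A minimal sketch of the window-selection step, assuming a simple rule of thumb: windows whose RMS amplitude greatly exceeds the median window RMS are treated as burst-contaminated and dropped. The factor k and the toy data are assumptions for illustration, not the authors' exact criteria.

        # Illustrative sketch of automated burst-window rejection (not the authors' exact
        # algorithm): drop windows whose RMS amplitude exceeds k times the median window RMS.
        import math
        import statistics

        def window_rms(samples):
            return math.sqrt(sum(s * s for s in samples) / len(samples))

        def select_quiet_windows(windows, k=3.0):
            rms = [window_rms(w) for w in windows]
            threshold = k * statistics.median(rms)
            return [w for w, r in zip(windows, rms) if r <= threshold]

        ambient = [[0.1, -0.2, 0.15], [0.05, 0.1, -0.08], [5.0, -4.2, 6.1]]  # last window is a burst
        print(len(select_quiet_windows(ambient)))  # -> 2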

  7. Improving adherence to the Epic Beacon ambulatory workflow.

    Science.gov (United States)

    Chackunkal, Ellen; Dhanapal Vogel, Vishnuprabha; Grycki, Meredith; Kostoff, Diana

    2017-06-01

    Computerized physician order entry has been shown to significantly improve chemotherapy safety by reducing the number of prescribing errors. Epic's Beacon Oncology Information System of computerized physician order entry and electronic medication administration was implemented in Henry Ford Health System's ambulatory oncology infusion centers on 9 November 2013. Since that time, compliance with the infusion workflow had not been assessed. The objective of this study was to optimize the current workflow and improve compliance with this workflow in the ambulatory oncology setting. This study was a retrospective, quasi-experimental study which analyzed the composite workflow compliance rate of patient encounters from 9 to 23 November 2014. Based on this analysis, an intervention was identified and implemented in February 2015 to improve workflow compliance. The primary endpoint was to compare the composite compliance rate with the Beacon workflow before and after a pharmacy-initiated intervention. The intervention, which was education of infusion center staff, was initiated by ambulatory-based oncology pharmacists and implemented by a multi-disciplinary team of pharmacists and nurses. The composite compliance rate was then reassessed for patient encounters from 2 to 13 March 2015 in order to analyze the effects of the determined intervention on compliance. The initial analysis in November 2014 revealed a composite compliance rate of 38%, and data analysis after the intervention revealed a statistically significant increase in the composite compliance rate to 83% (p < 0.001). This study supports that a pharmacist-initiated educational intervention can improve compliance with an ambulatory oncology infusion workflow.

  8. Decaf: Decoupled Dataflows for In Situ High-Performance Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Dreher, M.; Peterka, T.

    2017-07-31

    Decaf is a dataflow system for the parallel communication of coupled tasks in an HPC workflow. The dataflow can perform arbitrary data transformations ranging from simply forwarding data to complex data redistribution. Decaf does this by allowing the user to allocate resources and execute custom code in the dataflow. All communication through the dataflow is efficient parallel message passing over MPI. The runtime for calling tasks is entirely message-driven; Decaf executes a task when all messages for the task have been received. Such a message-driven runtime allows cyclic task dependencies in the workflow graph, for example, to enact computational steering based on the result of downstream tasks. Decaf includes a simple Python API for describing the workflow graph. This allows Decaf to stand alone as a complete workflow system, but Decaf can also be used as the dataflow layer by one or more other workflow systems to form a heterogeneous task-based computing environment. In one experiment, we couple a molecular dynamics code with a visualization tool using the FlowVR and Damaris workflow systems and Decaf for the dataflow. In another experiment, we test the coupling of a cosmology code with Voronoi tessellation and density estimation codes using MPI for the simulation, the DIY programming model for the two analysis codes, and Decaf for the dataflow. Such workflows consisting of heterogeneous software infrastructures exist because components are developed separately with different programming models and runtimes, and this is the first time that such heterogeneous coupling of diverse components was demonstrated in situ on HPC systems.

  9. Web-video-mining-supported workflow modeling for laparoscopic surgeries.

    Science.gov (United States)

    Liu, Rui; Zhang, Xiaoli; Zhang, Hao

    2016-11-01

    As quality assurance is of strong concern in advanced surgeries, intelligent surgical systems are expected to have knowledge such as the knowledge of the surgical workflow model (SWM) to support their intuitive cooperation with surgeons. For generating a robust and reliable SWM, a large amount of training data is required. However, training data collected by physically recording surgery operations is often limited and data collection is time-consuming and labor-intensive, severely influencing knowledge scalability of the surgical systems. The objective of this research is to solve the knowledge scalability problem in surgical workflow modeling in a low-cost and labor-efficient way. A novel web-video-mining-supported surgical workflow modeling (webSWM) method is developed. A novel video quality analysis method based on topic analysis and sentiment analysis techniques is developed to select high-quality videos from abundant and noisy web videos. A statistical learning method is then used to build the workflow model based on the selected videos. To test the effectiveness of the webSWM method, 250 web videos were mined to generate a surgical workflow for the robotic cholecystectomy surgery. The generated workflow was evaluated by 4 web-retrieved videos and 4 operation-room-recorded videos, respectively. The evaluation results (video selection consistency n-index ≥0.60; surgical workflow matching degree ≥0.84) proved the effectiveness of the webSWM method in generating robust and reliable SWM knowledge by mining web videos. With the webSWM method, abundant web videos were selected and a reliable SWM was modeled in a short time with low labor cost. Satisfactory performance in mining web videos and learning surgery-related knowledge shows that the webSWM method is promising in scaling knowledge for intelligent surgical systems. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. The myth of standardized workflow in primary care.

    Science.gov (United States)

    Holman, G Talley; Beasley, John W; Karsh, Ben-Tzion; Stone, Jamie A; Smith, Paul D; Wetterneck, Tosha B

    2016-01-01

    Primary care efficiency and quality are essential for the nation's health. The demands on primary care physicians (PCPs) are increasing as healthcare becomes more complex. A more complete understanding of PCP workflow variation is needed to guide future healthcare redesigns. This analysis evaluates workflow variation in terms of the sequence of tasks performed during patient visits. Two patient visits from 10 PCPs from 10 different United States Midwestern primary care clinics were analyzed to determine physician workflow. Tasks and the progressive sequence of those tasks were observed, documented, and coded by task category using a PCP task list. Variations in the sequence and prevalence of tasks at each stage of the primary care visit were assessed considering the physician, the patient, the visit's progression, and the presence of an electronic health record (EHR) at the clinic. PCP workflow during patient visits varies significantly, even for an individual physician, with no single or even common workflow pattern being present. The prevalence of specific tasks shifts significantly as primary care visits progress to their conclusion but, notably, PCPs collect patient information throughout the visit. PCP workflows were unpredictable during face-to-face patient visits. Workflow emerges as the result of a "dance" between physician and patient as their separate agendas are addressed, a side effect of patient-centered practice. Future healthcare redesigns should support a wide variety of task sequences to deliver high-quality primary care. The development of tools such as electronic health records must be based on the realities of primary care visits if they are to successfully support a PCP's mental and physical work, resulting in effective, safe, and efficient primary care. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  11. Exploring Dental Providers’ Workflow in an Electronic Dental Record Environment

    Science.gov (United States)

    Schwei, Kelsey M; Cooper, Ryan; Mahnke, Andrea N.; Ye, Zhan

    2016-01-01

    Summary Background A workflow is defined as a predefined set of work steps and partial ordering of these steps in any environment to achieve the expected outcome. Few studies have investigated the workflow of providers in a dental office. It is important to understand the interaction of dental providers with the existing technologies at point of care to assess breakdown in the workflow which could contribute to better technology designs. Objective The study objective was to assess electronic dental record (EDR) workflows using time and motion methodology in order to identify breakdowns and opportunities for process improvement. Methods A time and motion methodology was used to study the human-computer interaction and workflow of dental providers with an EDR in four dental centers at a large healthcare organization. A data collection tool was developed to capture the workflow of dental providers and staff while they interacted with an EDR during initial, planned, and emergency patient visits, and at the front desk. Qualitative and quantitative analysis was conducted on the observational data. Results Breakdowns in workflow were identified while posting charges, viewing radiographs, e-prescribing, and interacting with patient scheduler. EDR interaction time was significantly different between dentists and dental assistants (6:20 min vs. 10:57 min, p = 0.013) and between dentists and dental hygienists (6:20 min vs. 9:36 min, p = 0.003). Conclusions On average, a dentist spent far less time than dental assistants and dental hygienists in data recording within the EDR. PMID:27437058

  12. Clarifying Normalization

    Science.gov (United States)

    Carpenter, Donald A.

    2008-01-01

    Confusion exists among database textbooks as to the goal of normalization as well as to which normal form a designer should aspire. This article discusses such discrepancies with the intention of simplifying normalization for both teacher and student. This author's industry and classroom experiences indicate such simplification yields quicker…
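
    A compact worked example of what normalization removes: in the denormalized rows below, the advisor's office depends on the advisor, which in turn depends on the student, so the office is repeated for every student of the same advisor. Splitting the relation along its functional dependencies (the essence of third normal form) eliminates that redundancy. The data are invented for illustration and expressed as Python structures for concreteness.

        # Illustrative normalization example: student -> advisor and advisor -> advisor_office
        # are separate functional dependencies, so the flat rows contain a transitive dependency.
        denormalized = [
            {"student": "Ann", "advisor": "Smith", "advisor_office": "B12"},
            {"student": "Bob", "advisor": "Smith", "advisor_office": "B12"},
            {"student": "Cal", "advisor": "Jones", "advisor_office": "C07"},
        ]

        # 3NF decomposition: one relation per dependency, keyed on its determinant.
        students = {row["student"]: row["advisor"] for row in denormalized}         # student -> advisor
        advisors = {row["advisor"]: row["advisor_office"] for row in denormalized}  # advisor -> office

        print(students)  # {'Ann': 'Smith', 'Bob': 'Smith', 'Cal': 'Jones'}
        print(advisors)  # {'Smith': 'B12', 'Jones': 'C07'}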

  13. A versatile mathematical work-flow to explore how Cancer Stem Cell fate influences tumor progression.

    Science.gov (United States)

    Fornari, Chiara; Balbo, Gianfranco; Halawani, Sami M; Ba-Rukab, Omar; Ahmad, Ab Rahman; Calogero, Raffaele A; Cordero, Francesca; Beccuti, Marco

    2015-01-01

    Nowadays multidisciplinary approaches combining mathematical models with experimental assays are becoming relevant for the study of biological systems. Indeed, in cancer research multidisciplinary approaches are successfully used to understand the crucial aspects implicated in tumor growth. In particular, the Cancer Stem Cell (CSC) biology represents an area particularly suited to be studied through multidisciplinary approaches, and modeling has significantly contributed to pinpoint the crucial aspects implicated in this theory. More generally, to acquire new insights on a biological system it is necessary to have an accurate description of the phenomenon, such that making accurate predictions on its future behaviors becomes more likely. In this context, the identification of the parameters influencing model dynamics can be advantageous to increase model accuracy and to provide hints in designing wet experiments. Different techniques, ranging from statistical methods to analytical studies, have been developed. Their applications depend on case-specific aspects, such as the availability and quality of experimental data, and the dimension of the parameter space. The study of a new model on the CSC-based tumor progression has been the motivation to design a new work-flow that helps to characterize possible system dynamics and to identify those parameters influencing such behaviors. In detail, we extended our recent model on CSC-dynamics creating a new system capable of describing tumor growth during the different stages of cancer progression. Indeed, tumor cells appear to progress through lineage stages like those of normal tissues, being their division auto-regulated by internal feedback mechanisms. These new features have introduced some non-linearities in the model, making it more difficult to be studied by solely analytical techniques. Our new work-flow, based on statistical methods, was used to identify the parameters which influence the tumor growth. The

  14. Multidetector-row helical CT: analysis of time management and workflow

    Energy Technology Data Exchange (ETDEWEB)

    Roos, Justus E.; Desbiolles, Lotus M.; Willmann, Juergen K.; Weishaupt, Dominik; Marincek, Borut; Hilfiker, Paul R. [Institute of Diagnostic Radiology, University Hospital Zurich (Switzerland)

    2002-03-01

    The purpose of this study was to evaluate time management and workflow for multidetector-row helical CT (MDCT). Times for patient and data handling for a total of 580 patients were evaluated at two different time periods (December 1999, August 2000), each for the following baseline measurements: (a) change of clothes/instruction; (b) patient placement on the CT table/i.v. catheter; (c) CT planning and programming; (d) CT data acquisition; (e) CT data reconstruction; (f) CT data storage/printing. All imaging was performed on a Somatom Volume Zoom (Siemens, Erlangen, Germany). Time measurements summarized for different CT protocols revealed the following: (a) 5:01 min (±2.06 min); (b) 4:36 min (±2.43 min); (c) 4:11 min (±2.55 min); (d) 0:43 min (±0.15 min); (e) 6:59 min (±2.39 min); (f) 09:51 min (±3.51 min). Planning and programming was most time-consuming for CT angiography, whereas chest and abdominal CT needed only 3:26 and 3:30 min, respectively. Reconstruction time was highest for HRCT (9:22 min) and CTA (9:03 min). Data storage/printing was most time-consuming for HRCT (13:02 min), followed by combined neck-chest-abdomen examinations (12:19 min). Comparing the two time periods, during which a software update was performed, a mean time reduction of 4:31 min per patient (15%, p<0.001) was achieved. Whereas CT data acquisition time is no longer a problem with MDCT, patient management, data reconstruction, and data storage are the most time-consuming parts. Well-trained technicians, state-of-the-art workstations, and fast networking are the most important factors to improve workflow. (orig.)

  15. Introducing students to digital geological mapping: A workflow based on cheap hardware and free software

    Science.gov (United States)

    Vrabec, Marko; Dolžan, Erazem

    2016-04-01

    The undergraduate field course in Geological Mapping at the University of Ljubljana involves 20-40 students per year, which precludes the use of specialized rugged digital field equipment as the costs would be way beyond the capabilities of the Department. A different mapping area is selected each year with the aim to provide typical conditions that a professional geologist might encounter when doing fieldwork in Slovenia, which includes rugged relief, dense tree cover, and moderately-well- to poorly-exposed bedrock due to vegetation and urbanization. It is therefore mandatory that the digital tools and workflows are combined with classical methods of fieldwork, since, for example, full-time precise GNSS positioning is not viable under such circumstances. Additionally, due to the prevailing combination of complex geological structure with generally poor exposure, students cannot be expected to produce line (vector) maps of geological contacts on the go, so there is no need for such functionality in hardware and software that we use in the field. Our workflow therefore still relies on paper base maps, but is strongly complemented with digital tools to provide robust positioning, track recording, and acquisition of various point-based data. Primary field hardware are students' Android-based smartphones and optionally tablets. For our purposes, the built-in GNSS chips provide adequate positioning precision most of the time, particularly if they are GLONASS-capable. We use Oruxmaps, a powerful free offline map viewer for the Android platform, which facilitates the use of custom-made geopositioned maps. For digital base maps, which we prepare in free Windows QGIS software, we use scanned topographic maps provided by the National Geodetic Authority, but also other maps such as aerial imagery, processed Digital Elevation Models, scans of existing geological maps, etc. Point data, like important outcrop locations or structural measurements, are entered into Oruxmaps as

  16. wft4galaxy: a workflow testing tool for galaxy.

    Science.gov (United States)

    Piras, Marco Enrico; Pireddu, Luca; Zanetti, Gianluigi

    2017-12-01

    Workflow managers for scientific analysis provide a high-level programming platform facilitating standardization, automation, collaboration and access to sophisticated computing resources. The Galaxy workflow manager provides a prime example of this type of platform. As compositions of simpler tools, workflows effectively comprise specialized computer programs implementing often very complex analysis procedures. To date, no simple way to automatically test Galaxy workflows and ensure their correctness has appeared in the literature. With wft4galaxy we offer a tool to bring automated testing to Galaxy workflows, making it feasible to bring continuous integration to their development and ensuring that defects are detected promptly. wft4galaxy can be easily installed as a regular Python program or launched directly as a Docker container, the latter reducing installation effort to a minimum. Available at https://github.com/phnmnl/wft4galaxy under the Academic Free License v3.0. marcoenrico.piras@crs4.it. © The Author 2017. Published by Oxford University Press.

  17. Distributing Workflows over a Ubiquitous P2P Network

    Directory of Open Access Journals (Sweden)

    Eddie Al-Shakarchi

    2007-01-01

    Full Text Available This paper discusses issues in the distribution of bundled workflows across ubiquitous peer-to-peer networks for the application of music information retrieval. The underlying motivation for this work is provided by the DART project, which aims to develop a novel music recommendation system by gathering statistical data using collaborative filtering techniques and the analysis of the audio itself, in order to create a reliable and comprehensive database of the music that people own and which they listen to. To achieve this, the DART scientists creating the algorithms need the ability to distribute the Triana workflows they create, representing the analysis to be performed, across the network on a regular basis (perhaps even daily) in order to update the network as a whole with new workflows to be executed for the analysis. DART uses a similar approach to BOINC but differs in that the workers receive input data in the form of a bundled Triana workflow, which is executed in order to process any MP3 files that they own on their machine. Once analysed, the results are returned to DART's distributed database that collects and aggregates the resulting information. DART employs the use of package repositories to decentralise the distribution of such workflow bundles and this approach is validated in this paper through simulations that show that suitable scalability is maintained through the system as the number of participants increases. The results clearly illustrate the effectiveness of the approach.

  18. Provenance-based refresh in data-oriented workflows

    KAUST Repository

    Ikeda, Robert; Salihoglu, Semih; Widom, Jennifer

    2011-01-01

    We consider a general workflow setting in which input data sets are processed by a graph of transformations to produce output results. Our goal is to perform efficient selective refresh of elements in the output data, i.e., compute the latest values of specific output elements when the input data may have changed. We explore how data provenance can be used to enable efficient refresh. Our approach is based on capturing one-level data provenance at each transformation when the workflow is run initially. Then at refresh time provenance is used to determine (transitively) which input elements are responsible for given output elements, and the workflow is rerun only on that portion of the data needed for refresh. Our contributions are to formalize the problem setting and the problem itself, to specify properties of transformations and provenance that are required for efficient refresh, and to provide algorithms that apply to a wide class of transformations and workflows. We have built a prototype system supporting the features and algorithms presented in the paper. We report preliminary experimental results on the overhead of provenance capture, and on the crossover point between selective refresh and full workflow recomputation. © 2011 ACM.
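
    The core mechanism can be sketched for a single map-like transformation: capture, at initial run time, which input element each output element came from, and at refresh time recompute only the outputs whose recorded inputs have changed. The sketch below is illustrative and is not the prototype system described in the paper.

        # Minimal sketch of one-level provenance capture and selective refresh for a
        # single map-like transformation (illustrative only).
        def transform(record):
            return record["value"] * 2  # stand-in transformation

        def run(inputs):
            outputs, provenance = {}, {}
            for key, record in inputs.items():
                outputs[key] = transform(record)
                provenance[key] = {key}          # one-level provenance: output key -> input keys
            return outputs, provenance

        def refresh(outputs, provenance, inputs, changed_inputs):
            # Recompute only outputs whose provenance intersects the changed inputs.
            for out_key, src_keys in provenance.items():
                if src_keys & changed_inputs:
                    outputs[out_key] = transform(inputs[out_key])
            return outputs

        inputs = {"a": {"value": 1}, "b": {"value": 2}}
        outputs, prov = run(inputs)
        inputs["a"]["value"] = 10
        print(refresh(outputs, prov, inputs, {"a"}))  # only 'a' is recomputed -> {'a': 20, 'b': 4}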

  19. Application of Workflow Technology for Big Data Analysis Service

    Directory of Open Access Journals (Sweden)

    Bin Zhang

    2018-04-01

    Full Text Available This study presents a lightweight representational state transfer-based cloud workflow system to construct a big data intelligent software-as-a-service (SaaS) platform. The system supports the dynamic construction and operation of an intelligent data analysis application, and realizes rapid development and flexible deployment of the business analysis process that can improve the interaction and response time of the process. The proposed system integrates offline-batch and online-streaming analysis models that allow users to conduct batch and streaming computing simultaneously. Users can rent cloud capabilities and customize a set of big data analysis applications in the form of workflow processes. This study elucidates the architecture and application modeling, customization, dynamic construction, and scheduling of a cloud workflow system. A chain workflow foundation mechanism is proposed to combine several analysis components into a chain component that can promote efficiency. Four practical application cases are provided to verify the analysis capability of the system. Experimental results show that the proposed system can support multiple users in accessing the system concurrently and effectively uses data analysis algorithms. The proposed SaaS workflow system has been used in network operators and has achieved good results.

  20. Integration of the radiotherapy irradiation planning in the digital workflow

    International Nuclear Information System (INIS)

    Roehner, F.; Schmucker, M.; Henne, K.; Bruggmoser, G.; Grosu, A.L.; Frommhold, H.; Heinemann, F.E.; Momm, F.

    2013-01-01

    Background and purpose: At the Clinic of Radiotherapy at the University Hospital Freiburg, all relevant workflow is paperless. After implementing the Operating Schedule System (OSS) as a framework, all processes are being implemented into the departmental system MOSAIQ. Designing a digital workflow for radiotherapy irradiation planning is a large challenge, it requires interdisciplinary expertise and therefore the interfaces between the professions also have to be interdisciplinary. For every single step of radiotherapy irradiation planning, distinct responsibilities have to be defined and documented. All aspects of digital storage, backup and long-term availability of data were considered and have already been realized during the OSS project. Method: After an analysis of the complete workflow and the statutory requirements, a detailed project plan was designed. In an interdisciplinary workgroup, problems were discussed and a detailed flowchart was developed. The new functionalities were implemented in a testing environment by the Clinical and Administrative IT Department (CAI). After extensive tests they were integrated into the new modular department system. Results and conclusion: The Clinic of Radiotherapy succeeded in realizing a completely digital workflow for radiotherapy irradiation planning. During the testing phase, our digital workflow was examined and afterwards was approved by the responsible authority. (orig.)

  1. Going with the flow: Implementing a workflow system to streamline acquisitions in a special library

    CSIR Research Space (South Africa)

    Halland, Y

    2004-05-01

    Full Text Available and used CSIRIS staff within the business units as “guinea pigs”. Some refinement was done on the basis of the feedback received from them before a few end-users were asked to test the system as well. The testing period was limited as CS had other...

  2. A Comprehensive Workflow of Mass Spectrometry-Based Untargeted Metabolomics in Cancer Metabolic Biomarker Discovery Using Human Plasma and Urine

    Directory of Open Access Journals (Sweden)

    Jianwen She

    2013-09-01

    Full Text Available Currently available biomarkers lack sensitivity and/or specificity for early detection of cancer. To address this challenge, a robust and complete workflow for metabolic profiling and data mining is described in detail. Three independent and complementary analytical techniques for metabolic profiling are applied: hydrophilic interaction liquid chromatography (HILIC–LC), reversed-phase liquid chromatography (RP–LC), and gas chromatography (GC). All three techniques are coupled to a mass spectrometer (MS) in the full-scan acquisition mode, and both unsupervised and supervised methods are used for data mining. Univariate and multivariate feature selection are used to determine subsets of potentially discriminative predictors. These predictors are further identified by obtaining accurate masses and isotopic ratios using selected ion monitoring (SIM) and data-dependent MS/MS and/or accurate mass MSn ion tree scans utilizing high resolution MS. A list combining all of the identified potential biomarkers generated from different platforms and algorithms is used for pathway analysis. Such a workflow combining comprehensive metabolic profiling and advanced data mining techniques may provide a powerful approach for metabolic pathway analysis and biomarker discovery in cancer research. Two case studies with previously published data are adapted and included in the context to elucidate the application of the workflow.
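
    The univariate filtering step can be illustrated with a small sketch that ranks metabolite features by a Welch-style t statistic between case and control intensities. The feature names, intensity values and the choice of statistic are assumptions for this example, not the authors' exact pipeline.

        # Illustrative sketch of univariate feature selection: rank features by a
        # Welch-style t statistic between case and control intensities (stdlib only).
        import statistics

        def t_statistic(case, control):
            m1, m2 = statistics.mean(case), statistics.mean(control)
            v1, v2 = statistics.variance(case), statistics.variance(control)
            return (m1 - m2) / ((v1 / len(case) + v2 / len(control)) ** 0.5)

        features = {
            "mz_180.06": ([9.1, 8.7, 9.4], [5.2, 5.5, 5.0]),   # intensities: (cases, controls)
            "mz_132.10": ([6.0, 6.2, 5.9], [6.1, 6.0, 6.2]),
        }

        ranked = sorted(features, key=lambda f: abs(t_statistic(*features[f])), reverse=True)
        print(ranked)  # most discriminative feature first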

  3. Quality control of structural MRI images applied using FreeSurfer - a hands-on workflow to rate motion artifacts

    Directory of Open Access Journals (Sweden)

    Lea Luise Backhausen

    2016-12-01

    Full Text Available In structural magnetic resonance imaging, motion artifacts are common, especially when not scanning healthy young adults. It has been shown that motion affects the analysis with automated image-processing techniques (e.g., FreeSurfer). This can bias results. Several developmental and adult studies have found reduced volume and thickness of gray matter due to motion artifacts. Thus, quality control is necessary in order to ensure an acceptable level of quality and to define exclusion criteria of images (i.e., to determine the participants with the most severe artifacts). However, information about the quality control workflow and image exclusion procedure is largely lacking in the current literature and the existing rating systems differ. Here we propose a stringent workflow of quality control steps during and after acquisition of T1-weighted images, which enables researchers dealing with populations that are typically affected by motion artifacts to enhance data quality and maximize sample sizes. As an underlying aim we established a thorough quality control rating system for T1-weighted images and applied it to the analysis of developmental clinical data using the automated processing pipeline FreeSurfer. This hands-on workflow and quality control rating system will aid researchers in minimizing motion artifacts in the final data set, and therefore enhance the quality of structural magnetic resonance imaging studies.

  4. Improved workflow modelling using role activity diagram-based modelling with application to a radiology service case study.

    Science.gov (United States)

    Shukla, Nagesh; Keast, John E; Ceglarek, Darek

    2014-10-01

    The modelling of complex workflows is an important problem-solving technique within healthcare settings. However, currently most of the workflow models use a simplified flow chart of patient flow obtained using on-site observations, group-based debates and brainstorming sessions, together with historic patient data. This paper presents a systematic and semi-automatic methodology for knowledge acquisition with detailed process representation using sequential interviews of people in the key roles involved in the service delivery process. The proposed methodology allows the modelling of roles, interactions, actions, and decisions involved in the service delivery process. This approach is based on protocol generation and analysis techniques such as: (i) initial protocol generation based on qualitative interviews of radiology staff, (ii) extraction of key features of the service delivery process, (iii) discovering the relationships among the key features extracted, and, (iv) a graphical representation of the final structured model of the service delivery process. The methodology is demonstrated through a case study of a magnetic resonance (MR) scanning service-delivery process in the radiology department of a large hospital. A set of guidelines is also presented in this paper to visually analyze the resulting process model for identifying process vulnerabilities. A comparative analysis of different workflow models is also conducted. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  5. Analog to digital workflow improvement: a quantitative study.

    Science.gov (United States)

    Wideman, Catherine; Gallet, Jacqueline

    2006-01-01

    This study tracked a radiology department's conversion from utilization of a Kodak Amber analog system to a Kodak DirectView DR 5100 digital system. Through the use of ProModel Optimization Suite, a workflow simulation software package, significant quantitative information was derived from workflow process data measured before and after the change to a digital system. Once the digital room was fully operational and the radiology staff comfortable with the new system, average patient examination time was reduced from 9.24 to 5.28 min, indicating that a higher patient throughput could be achieved. Compared to the analog system, chest examination time for modality specific activities was reduced by 43%. The percentage of repeat examinations experienced with the digital system also decreased to 8% vs. the level of 9.5% experienced with the analog system. The study indicated that it is possible to quantitatively study clinical workflow and productivity by using commercially available software.

  6. Dynamic Service Selection in Workflows Using Performance Data

    Directory of Open Access Journals (Sweden)

    David W. Walker

    2007-01-01

    Full Text Available An approach to dynamic workflow management and optimisation using near-real-time performance data is presented. Strategies are discussed for choosing an optimal service (based on user-specified criteria) from several semantically equivalent Web services. Such an approach may involve finding "similar" services, by first pruning the set of discovered services based on service metadata, and subsequently selecting an optimal service based on performance data. The current implementation of the prototype workflow framework is described, and demonstrated with a simple workflow. Performance results are presented that show the performance benefits of dynamic service selection. A statistical analysis based on the first order statistic is used to investigate the likely improvement in service response time arising from dynamic service selection.
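
    A minimal sketch of the selection step, assuming mean response time is the user-specified criterion: the invocation simply takes the first order statistic (the minimum) over the candidates' recent performance data. The endpoints and timings are invented for illustration.

        # Illustrative sketch (not the paper's framework): pick the semantically equivalent
        # endpoint with the lowest recent mean response time.
        recent_response_ms = {
            "http://serviceA.example/blast": [420, 390, 450],
            "http://serviceB.example/blast": [310, 295, 330],
            "http://serviceC.example/blast": [510, 480, 500],
        }

        def select_service(perf):
            return min(perf, key=lambda url: sum(perf[url]) / len(perf[url]))

        print(select_service(recent_response_ms))  # -> the serviceB endpoint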

  7. Task Delegation Based Access Control Models for Workflow Systems

    Science.gov (United States)

    Gaaloul, Khaled; Charoy, François

    e-Government organisations are facilitated and conducted using workflow management systems. Role-based access control (RBAC) is recognised as an efficient access control model for large organisations. The application of RBAC in workflow systems cannot, however, grant permissions to users dynamically while business processes are being executed. We currently observe a move away from predefined strict workflow modelling towards approaches supporting flexibility on the organisational level. One specific approach is that of task delegation. Task delegation is a mechanism that supports organisational flexibility, and ensures delegation of authority in access control systems. In this paper, we propose a Task-oriented Access Control (TAC) model based on RBAC to address these requirements. We aim to reason about tasks from organisational and resource perspectives to analyse and specify authorisation constraints. Moreover, we present a fine-grained access control protocol to support delegation based on the TAC model.
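
    The flavour of combining RBAC with task delegation can be sketched as a simple authorisation check: a user holds a permission either through a role or through an active delegation of a task's permission. The roles, permissions and data structures below are illustrative assumptions, not the formal TAC model.

        # Hedged sketch (not the formal TAC model): plain RBAC extended with task delegation,
        # so a delegatee temporarily holds the permission attached to a delegated task.
        role_permissions = {"clerk": {"register_claim"}, "manager": {"approve_claim"}}
        user_roles = {"alice": {"manager"}, "bob": {"clerk"}}
        delegations = [("alice", "bob", "approve_claim")]  # (delegator, delegatee, task permission)

        def authorised(user, permission):
            via_role = any(permission in role_permissions[r] for r in user_roles.get(user, set()))
            via_delegation = any(d_to == user and p == permission for _, d_to, p in delegations)
            return via_role or via_delegation

        print(authorised("bob", "approve_claim"))   # True, via delegation from alice
        print(authorised("bob", "register_claim"))  # True, via the clerk role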

  8. Implementation of the electronic DDA workflow for NSSS system design

    International Nuclear Information System (INIS)

    Eom, Young Sam; Kim, Yeon Sung; Lee, Suk Hee; Kim, Mi Kyung

    1996-06-01

    To improve NSSS design quality and productivity, several integrated management systems from nations with developed nuclear industries, such as Mitsubishi's NUWINGS (Japan), AECL's CANDID (Canada) and Duke Power's (USA), were investigated, and this report describes the system implementation of NSSS design document computerization and the major workflow process of the DDA (Document Distribution for Agreement). On the basis of the requirements of design document computerization, which covered preparation, review, approval and distribution of the engineering documents, the KAERI Engineering Information Management System (KEIMS) was implemented. The main results of this work are the implementation of a GUI panel for input and retrieval of document index information, the setup of an electronic document workflow, and the provision of quality assurance verification by tracing the workflow history. The major effects of NSSS design document computerization are improved efficiency and reliability and reduced engineering cost by means of fast document verification and electronic document transfer. 2 tabs., 16 figs., 9 refs. (Author)

  9. Nexus: A modular workflow management system for quantum simulation codes

    Science.gov (United States)

    Krogel, Jaron T.

    2016-01-01

    The management of simulation workflows represents a significant task for the individual computational researcher. Automation of the required tasks involved in simulation work can decrease the overall time to solution and reduce sources of human error. A new simulation workflow management system, Nexus, is presented to address these issues. Nexus is capable of automated job management on workstations and resources at several major supercomputing centers. Its modular design allows many quantum simulation codes to be supported within the same framework. Current support includes quantum Monte Carlo calculations with QMCPACK, density functional theory calculations with Quantum Espresso or VASP, and quantum chemical calculations with GAMESS. Users can compose workflows through a transparent, text-based interface, resembling the input file of a typical simulation code. A usage example is provided to illustrate the process.

  10. Assessment of the Nurse Medication Administration Workflow Process

    Directory of Open Access Journals (Sweden)

    Nathan Huynh

    2016-01-01

    Full Text Available This paper presents findings of an observational study of the Registered Nurse (RN Medication Administration Process (MAP conducted on two comparable medical units in a large urban tertiary care medical center in Columbia, South Carolina. A total of 305 individual MAP observations were recorded over a 6-week period with an average of 5 MAP observations per RN participant for both clinical units. A key MAP variation was identified in terms of unbundled versus bundled MAP performance. In the unbundled workflow, an RN engages in the MAP by performing only MAP tasks during a care episode. In the bundled workflow, an RN completes medication administration along with other patient care responsibilities during the care episode. Using a discrete-event simulation model, this paper addresses the difference between unbundled and bundled workflow and their effects on simulated redesign interventions.

  11. From shared data to sharing workflow: Merging PACS and teleradiology

    International Nuclear Information System (INIS)

    Benjamin, Menashe; Aradi, Yinon; Shreiber, Reuven

    2010-01-01

    Due to a host of technological, interface, operational and workflow limitations, teleradiology and PACS/RIS were historically developed as separate systems serving different purposes. PACS/RIS handled local radiology storage and workflow management while teleradiology addressed remote access to images. Today advanced PACS/RIS support complete site radiology workflow for attending physicians, whether on-site or remote. In parallel, teleradiology has emerged into a service providing remote, off-hours coverage for emergency radiology and, to a lesser extent, subspecialty reading to subscribing sites and radiology groups. When attending radiologists use teleradiology for remote access to a site, they may share all relevant patient data and participate in the site's workflow like their on-site peers. The operation becomes cumbersome and time-consuming when these radiologists serve multiple sites, each requiring a different remote access, or when the sites do not employ the same PACS/RIS/reporting systems and do not share the same ownership. The least efficient operation is that of teleradiology companies engaged in reading for multiple facilities. As these services typically employ non-local radiologists, they are allowed to share some of the available patient data necessary to provide an emergency report but, by and large, they do not share the workflow of the sites they serve. Radiology stakeholders usually prefer to have their own radiologists perform all radiology tasks including interpretation of off-hour examinations. It is possible with current technology to create a system that combines the benefits of local radiology services to multiple sites with the advantages offered by adding subspecialty and off-hours emergency services through teleradiology. Such a system increases efficiency for the radiology groups by enabling all users, regardless of location, to work 'local' and fully participate in the workflow of every site. We refer to such a system as SuperPACS.

  12. Exformatics Declarative Case Management Workflows as DCR Graphs

    DEFF Research Database (Denmark)

    Slaats, Tijs; Mukkamala, Raghava Rao; Hildebrandt, Thomas

    2013-01-01

    Declarative workflow languages have been a growing research subject over the past ten years, but applications of the declarative approach in industry are still uncommon. Over the past two years Exformatics A/S, a Danish provider of Electronic Case Management systems, has been cooperating with researchers at IT University of Copenhagen (ITU) to create tools for the declarative workflow language Dynamic Condition Response Graphs (DCR Graphs) and to incorporate them into their products and into teaching at ITU. In this paper we give a status report on the work, starting with an informal introduction...

  13. Addressing informatics challenges in Translational Research with workflow technology.

    Science.gov (United States)

    Beaulah, Simon A; Correll, Mick A; Munro, Robin E J; Sheldon, Jonathan G

    2008-09-01

    Interest in Translational Research has been growing rapidly in recent years. In this collision of different data, technologies and cultures lie tremendous opportunities for the advancement of science and business for organisations that are able to integrate, analyse and deliver this information effectively to users. Workflow-based integration and analysis systems are becoming recognised as a fast and flexible way to build applications that are tailored to scientific areas, yet are built on a common platform. Workflow systems are allowing organisations to meet the key informatics challenges in Translational Research and improve disease understanding and patient care.

  14. Flexible Early Warning Systems with Workflows and Decision Tables

    Science.gov (United States)

    Riedel, F.; Chaves, F.; Zeiner, H.

    2012-04-01

    An essential part of early warning systems and systems for crisis management are decision support systems that facilitate communication and collaboration. Often official policies specify how different organizations collaborate and what information is communicated to whom. For early warning systems it is crucial that information is exchanged dynamically in a timely manner and that all participants get exactly the information they need to fulfil their role in the crisis management process. Information technology obviously lends itself to automating parts of the process. We have found, however, that in current operational systems the information logistics processes are hard-coded, even though they are subject to change. In addition, systems are tailored to the policies and requirements of a certain organization, and changes can require major software refactoring. We seek to develop a system that can be deployed and adapted to multiple organizations with different dynamic runtime policies. A major requirement for such a system is that changes can be applied locally without affecting larger parts of the system. In addition to the flexibility regarding changes in policies and processes, the system needs to be able to evolve; when new information sources become available, it should be possible to integrate and use these in the decision process. In general, this kind of flexibility comes with a significant increase in complexity. This implies that only IT professionals can maintain a system that can be reconfigured and adapted; end-users are unable to utilise the provided flexibility. In the business world similar problems arise, and previous work suggested using business process management systems (BPMS) or workflow management systems (WfMS) to guide and automate early warning processes or crisis management plans. However, the usability and flexibility of current WfMS are limited, because current notations and user interfaces are still not suitable for end-users, and workflows
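
    The decision-table idea can be illustrated with a few lines of code: the routing policy lives in a data structure that non-programmers could edit, while the workflow engine only evaluates it. The roles, channels and alert levels below are invented for illustration.

```python
# Sketch of a decision table for alert routing: each row maps an alert level
# and a role to a notification action, so the policy can change without
# touching the workflow code.

DECISION_TABLE = [
    # condition: (min_level, role)         action: (channel, template)
    {"min_level": 1, "role": "operator",  "channel": "dashboard", "template": "info"},
    {"min_level": 2, "role": "operator",  "channel": "sms",       "template": "warning"},
    {"min_level": 2, "role": "mayor",     "channel": "email",     "template": "summary"},
    {"min_level": 3, "role": "fire_dept", "channel": "sms",       "template": "evacuation"},
]

def route_alert(level, recipients):
    """Return the notifications implied by the table for this alert level."""
    actions = []
    for person, role in recipients:
        for rule in DECISION_TABLE:
            if level >= rule["min_level"] and role == rule["role"]:
                actions.append((person, rule["channel"], rule["template"]))
    return actions

recipients = [("anna", "operator"), ("ben", "mayor"), ("station 7", "fire_dept")]
for person, channel, template in route_alert(3, recipients):
    print(f"notify {person} via {channel} using '{template}' message")
```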

  15. Birkhoff normalization

    NARCIS (Netherlands)

    Broer, H.; Hoveijn, I.; Lunter, G.; Vegter, G.

    2003-01-01

    The Birkhoff normal form procedure is a widely used tool for approximating a Hamiltonian system by a simpler one. This chapter starts out with an introduction to Hamiltonian mechanics, followed by an explanation of the Birkhoff normal form procedure. Finally we discuss several algorithms for

  16. 2017 NAIP Acquisition Map

    Data.gov (United States)

    Farm Service Agency, Department of Agriculture — Planned States for 2017 NAIP acquisition and acquisition status layer (updated daily). Updates to the acquisition seasons may be made during the season to...

  17. It's All About the Data: Workflow Systems and Weather

    Science.gov (United States)

    Plale, B.

    2009-05-01

    Digital data is fueling new advances in the computational sciences, particularly geospatial research as environmental sensing grows more practical through reduced technology costs, broader network coverage, and better instruments. e-Science research (i.e., cyberinfrastructure research) has responded to data intensive computing with tools, systems, and frameworks that support computationally oriented activities such as modeling, analysis, and data mining. Workflow systems support execution of sequences of tasks on behalf of a scientist. These systems, such as Taverna, Apache ODE, and Kepler, when built as part of a larger cyberinfrastructure framework, give the scientist tools to construct task graphs of execution sequences, often through a visual interface for connecting task boxes together with arcs representing control flow or data flow. Unlike business processing workflows, scientific workflows expose a high degree of detail and control during configuration and execution. Data-driven science imposes unique needs on workflow frameworks. Our research is focused on two issues. The first is the support for workflow-driven analysis over all kinds of data sets, including real time streaming data and locally owned and hosted data. The second is the essential role metadata/provenance collection plays in data driven science, for discovery, determining quality, for science reproducibility, and for long-term preservation. The research has been conducted over the last 6 years in the context of cyberinfrastructure for mesoscale weather research carried out as part of the Linked Environments for Atmospheric Discovery (LEAD) project. LEAD has pioneered new approaches for integrating complex weather data, assimilation, modeling, mining, and cyberinfrastructure systems. Workflow systems have the potential to generate huge volumes of data. Without some form of automated metadata capture, either metadata description becomes largely a manual task that is difficult if not impossible

  18. SU-F-T-252: An Investigation of Gamma Knife Frame Definition Error When Using a Pre-Planning Workflow

    International Nuclear Information System (INIS)

    Johnson, P

    2016-01-01

    Purpose: To determine causal factors related to high frame definition error when treating GK patients using a pre-planning workflow. Methods: 160 cases were retrospectively reviewed. All patients received treatment using a pre-planning workflow whereby stereotactic coordinates are determined from a CT scan acquired after framing using a fiducial box. The planning software automatically detects the fiducials and compares their location to expected values based on the rigid design of the fiducial system. Any difference is reported as mean and maximum frame definition error. The manufacturer recommends these values be less than 1.0 mm and 1.5 mm. In this study, frame definition error was analyzed in comparison with a variety of factors including which neurosurgeon/oncologist/physicist was involved with the procedure, the number of posts used during framing (3 or 4), the type of lesion, and which CT scanner was utilized for acquisition. An analysis of variance (ANOVA) approach was used to statistically evaluate the data and determine causal factors related to instances of high frame definition error. Results: Two factors were identified as significant: number of posts (p=0.0003) and CT scanner (p=0.0001). Further analysis showed that one of the four scanners was significantly different than the others. This diagnostic scanner was identified as an older model whose localization lasers were not tightly calibrated. The average value for maximum frame definition error using this scanner was 1.48 mm (4 posts) and 1.75 mm (3 posts). For the other scanners this value was 1.13 mm (4 posts) and 1.40 mm (3 posts). Conclusion: In utilizing a pre-planning workflow the choice of CT scanner matters. Any scanner utilized for GK should undergo routine QA at a level appropriate for radiation oncology. In terms of 3 vs 4 posts, it is hypothesized that three posts provide less stability during CT acquisition. This will be tested in future work.
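
    For readers who want to reproduce this style of analysis, a minimal sketch of a two-factor ANOVA with pandas and statsmodels is shown below. The data are simulated with made-up effect sizes; they are not the study's measurements.

```python
# Two-factor ANOVA sketch: does frame definition error depend on the number
# of posts and on the CT scanner? The data are synthetic, for illustration.

import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

rng = np.random.default_rng(0)
n = 160
df = pd.DataFrame({
    "posts":   rng.choice([3, 4], size=n),
    "scanner": rng.choice(["A", "B", "C", "D"], size=n),
})
# Assume (for the simulation) that scanner D and 3-post frames inflate the error.
df["max_error"] = (1.1
                   + 0.25 * (df["posts"] == 3)
                   + 0.35 * (df["scanner"] == "D")
                   + rng.normal(0, 0.15, size=n))

model = ols("max_error ~ C(posts) + C(scanner)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # F-tests for each factor
```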

  19. SU-F-T-252: An Investigation of Gamma Knife Frame Definition Error When Using a Pre-Planning Workflow

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, P [University of Miami, Miami, FL (United States)

    2016-06-15

    Purpose: To determine causal factors related to high frame definition error when treating GK patients using a pre-planning workflow. Methods: 160 cases were retrospectively reviewed. All patients received treatment using a pre-planning workflow whereby stereotactic coordinates are determined from a CT scan acquired after framing using a fiducial box. The planning software automatically detects the fiducials and compares their location to expected values based on the rigid design of the fiducial system. Any difference is reported as mean and maximum frame definition error. The manufacturer recommends these values be less than 1.0 mm and 1.5 mm. In this study, frame definition error was analyzed in comparison with a variety of factors including which neurosurgeon/oncologist/physicist was involved with the procedure, the number of posts used during framing (3 or 4), the type of lesion, and which CT scanner was utilized for acquisition. An analysis of variance (ANOVA) approach was used to statistically evaluate the data and determine causal factors related to instances of high frame definition error. Results: Two factors were identified as significant: number of posts (p=0.0003) and CT scanner (p=0.0001). Further analysis showed that one of the four scanners was significantly different than the others. This diagnostic scanner was identified as an older model whose localization lasers were not tightly calibrated. The average value for maximum frame definition error using this scanner was 1.48 mm (4 posts) and 1.75 mm (3 posts). For the other scanners this value was 1.13 mm (4 posts) and 1.40 mm (3 posts). Conclusion: In utilizing a pre-planning workflow the choice of CT scanner matters. Any scanner utilized for GK should undergo routine QA at a level appropriate for radiation oncology. In terms of 3 vs 4 posts, it is hypothesized that three posts provide less stability during CT acquisition. This will be tested in future work.

  20. Images crossing borders: image and workflow sharing on multiple levels.

    Science.gov (United States)

    Ross, Peeter; Pohjonen, Hanna

    2011-04-01

    Digitalisation of medical data makes it possible to share images and workflows between related parties. In addition to linear data flow where healthcare professionals or patients are the information carriers, a new type of matrix of many-to-many connections is emerging. Implementation of shared workflow brings challenges of interoperability and legal clarity. Sharing images or workflows can be implemented on different levels with different challenges: inside the organisation, between organisations, across country borders, or between healthcare institutions and citizens. Interoperability issues vary according to the level of sharing and are either technical or semantic, including language. Legal uncertainty increases when crossing national borders. Teleradiology is regulated by multiple European Union (EU) directives and legal documents, which makes interpretation of the legal system complex. To achieve wider use of eHealth and teleradiology several strategic documents were published recently by the EU. Despite EU activities, responsibility for organising, providing and funding healthcare systems remains with the Member States. Therefore, the implementation of new solutions requires strong co-operation between radiologists, societies of radiology, healthcare administrators, politicians and relevant EU authorities. The aim of this article is to describe different dimensions of image and workflow sharing and to analyse legal acts concerning teleradiology in the EU.

  1. Implementation of a healthcare process in four different workflow systems

    NARCIS (Netherlands)

    Mans, R.S.; Aalst, van der W.M.P.; Russell, N.C.; Bakker, P.J.M.

    2009-01-01

    Currently, many hospitals are investigating the use of a workflow management system in order to provide support for care processes. However, today's workflow management systems fall short in supporting care processes, as flexibility is required for their execution. In this paper, we investigate the

  2. A Generalized Email Classification System for Workflow Analysis

    NARCIS (Netherlands)

    P. Chaipornkaew (Piyanuch); T. Prexawanprasut (Takorn); C-L. Chang (Chia-Lin); M.J. McAleer (Michael)

    2017-01-01

    textabstractOne of the most powerful internet communication channels is email. As employees and their clients communicate primarily via email, much crucial business data is conveyed via email content. Where businesses are understandably concerned, they need a sophisticated workflow management

  3. 3D workflows in orthodontics, maxillofacial surgery and prosthodontics

    NARCIS (Netherlands)

    van der Meer, Wicher Jurjen

    2016-01-01

    In this thesis different aspects of digital workflows in Orthodontics, Maxillofacial Surgery and Prosthodontics are discussed and, where possible, placed in a broader perspective thereby attempting to go both broader and deeper into the implications of the introduction of 3D digital technology in

  4. Collaborative e-Science Experiments and Scientific Workflows

    NARCIS (Netherlands)

    Belloum, A.; Inda, M.A.; Vasunin, D.; Korkhov, V.; Zhao, Z.; Rauwerda, H.; Breit, T.M.; Bubak, M.; Hertzberger, L.O.

    2011-01-01

    Recent advances in Internet and grid technologies have greatly enhanced scientific experiments' life cycle. In addition to compute- and data-intensive tasks, large-scale collaborations involving geographically distributed scientists and e-infrastructure are now possible. Scientific workflows, which

  5. SHIWA workflow interoperability solutions for neuroimaging data analysis

    NARCIS (Netherlands)

    Korkhov, Vladimir; Krefting, Dagmar; Montagnat, Johan; Truong Huu, Tram; Kukla, Tamas; Terstyanszky, Gabor; Manset, David; Caan, Matthan; Olabarriaga, Silvia

    2012-01-01

    Neuroimaging is a field that benefits from distributed computing infrastructures (DCIs) to perform data- and compute-intensive processing and analysis. Using grid workflow systems not only automates the processing pipelines, but also enables domain researchers to implement their expertise on how to

  6. Supporting flexible processes with adaptive workflow and case handling

    NARCIS (Netherlands)

    Günther, C.W.; Reichert, M.; Aalst, van der W.M.P.

    2008-01-01

    Workflow management technology has profoundly transformed the way complex tasks are being handled in modern, large-scale organizations. However, it is mostly those systems' inherent lack of flexibility that hinders their broad acceptance, and that is perceived as annoyance by users. In this context,

  7. Styx Grid Services: Lightweight Middleware for Efficient Scientific Workflows

    Directory of Open Access Journals (Sweden)

    J.D. Blower

    2006-01-01

    Full Text Available The service-oriented approach to performing distributed scientific research is potentially very powerful but is not yet widely used in many scientific fields. This is partly due to the technical difficulties involved in creating services and workflows and the inefficiency of many workflow systems with regard to handling large datasets. We present the Styx Grid Service, a simple system that wraps command-line programs and allows them to be run over the Internet exactly as if they were local programs. Styx Grid Services are very easy to create and use and can be composed into powerful workflows with simple shell scripts or more sophisticated graphical tools. An important feature of the system is that data can be streamed directly from service to service, significantly increasing the efficiency of workflows that use large data volumes. The status and progress of Styx Grid Services can be monitored asynchronously using a mechanism that places very few demands on firewalls. We show how Styx Grid Services can interoperate with Web Services and WS-Resources using suitable adapters.
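
    The wrapping idea can be sketched in a few lines: expose a command-line program behind a function so that the output of one wrapped program feeds the next, much as Styx Grid Services are chained into workflows. This local sketch does not implement the Styx protocol, and it assumes the standard Unix sort and uniq programs are on the PATH.

```python
# Wrap a command-line program so callers can pass data through it and chain
# such wrappers together. Local illustration only; no remote protocol.

import subprocess

def wrapped_service(command, input_stream):
    """Run `command`, feed it `input_stream`, and yield its output lines."""
    proc = subprocess.Popen(command, stdin=subprocess.PIPE,
                            stdout=subprocess.PIPE, text=True)
    out, _ = proc.communicate("".join(input_stream))
    yield from out.splitlines(keepends=True)

# Chain two "services": sort the input, then keep unique lines.
data = ["pear\n", "apple\n", "pear\n", "banana\n"]
stage1 = wrapped_service(["sort"], data)
stage2 = wrapped_service(["uniq"], stage1)
print("".join(stage2), end="")
```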

  8. CrossFlow: integrating workflow management and electronic commerce

    NARCIS (Netherlands)

    Hoffner, Y.; Ludwig, H.; Grefen, P.W.P.J.; Aberer, K.

    2001-01-01

    The CrossFlow architecture provides support for cross-organisational workflow management in dynamically established virtual enterprises. The creation of a business relationship between a service provider organisation performing a service on behalf of a consumer organisation can be made dynamic when

  9. CrossFlow: Integrating Workflow Management and Electronic Commerce

    NARCIS (Netherlands)

    Hoffner, Y.; Ludwig, H.; Grefen, P.W.P.J.; Aberer, K.

    2001-01-01

    The CrossFlow architecture provides support for cross-organisational workflow management in dynamically established virtual enterprises. The creation of a business relationship between a service provider organisation performing a service on behalf of a consumer organisation can be made dynamic when

  10. How to increase work autonomy in workflow management systems?

    NARCIS (Netherlands)

    Vanderfeesten, I.T.P.; Reijers, H.A.

    2006-01-01

    Abstract: Purpose – Current workflow management systems (WfMS's) are often too rigid and lead to "chain production" in the office. The paper proposes a number of "tuning measures" to reconfigure an implemented WfMS in such a way that it is more agreeable to the needs of its users.

  11. Electronic Health Record-Driven Workflow for Diagnostic Radiologists.

    Science.gov (United States)

    Geeslin, Matthew G; Gaskin, Cree M

    2016-01-01

    In most settings, radiologists maintain a high-throughput practice in which efficiency is crucial. The conversion from film-based to digital study interpretation and data storage launched the era of PACS-driven workflow, leading to significant gains in speed. The advent of electronic health records improved radiologists' access to patient data; however, many still find this aspect of workflow to be relatively cumbersome. Nevertheless, the ability to guide a diagnostic interpretation with clinical information, beyond that provided in the examination indication, can add significantly to the specificity of a radiologist's interpretation. Responsibilities of the radiologist include, but are not limited to, protocoling examinations, interpreting studies, chart review, peer review, writing notes, placing orders, and communicating with referring providers. Most of the aforementioned activities are not PACS-centric and require a login to one or more additional applications. Consolidation of these tasks for completion through a single interface can simplify workflow, save time, and potentially reduce the incidence of errors. Here, the authors describe diagnostic radiology workflow that leverages the electronic health record to significantly add to a radiologist's ability to be part of the health care team, provide relevant interpretations, and improve efficiency and quality. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  12. A standard-enabled workflow for synthetic biology.

    Science.gov (United States)

    Myers, Chris J; Beal, Jacob; Gorochowski, Thomas E; Kuwahara, Hiroyuki; Madsen, Curtis; McLaughlin, James Alastair; Mısırlı, Göksel; Nguyen, Tramy; Oberortner, Ernst; Samineni, Meher; Wipat, Anil; Zhang, Michael; Zundel, Zach

    2017-06-15

    A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools to be connected from a diversity of sources. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce different types of design information, with multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in their publications. © 2017 The Author(s); published by Portland Press Limited on behalf of the Biochemical Society.

  13. From Paper Based Clinical Practice Guidelines to Declarative Workflow Management

    DEFF Research Database (Denmark)

    Lyng, Karen Marie; Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2009-01-01

    We present a field study of oncology workflow, involving doctors, nurses and pharmacists at Danish hospitals and discuss the obstacles, enablers and challenges for the use of computer based clinical practice guidelines. Related to the CIGDec approach of Pesic and van der Aalst we then describe how...

  14. You’ve Got Email: a Workflow Management Extraction System

    NARCIS (Netherlands)

    P. Chaipornkaew (Piyanuch); T. Prexawanprasut (Takorn); M.J. McAleer (Michael)

    2017-01-01

    textabstractEmail is one of the most powerful tools for communication. Many businesses use email as the main channel for communication, so it is possible that substantial data are included in email content. In order to help businesses grow faster, a workflow management system may be required. The

  15. A Workflow For Computer-Aided Cytology In Whole Slide Images: Application In Fine-Needle Aspiration Thyroid Cytology

    Directory of Open Access Journals (Sweden)

    R. Marée

    2016-06-01

    Using Cytomine web annotation tools, experts first built an unprecedented ground-truth dataset of various types of normal and abnormal cells and clusters (>6000 objects from 60 FNA whole-slide images) to train recognition models. Once all cells of new slides are classified by our workflow, predictions are uploaded to the Cytomine-Core server through HTTP requests and can be displayed in the Cytomine-WebUI as sorted galleries of the most suspicious objects. At the conference we will present our qualitative and quantitative evaluation of the different steps of the workflow and discuss limitations and perspectives. This novel Cytomine module will be released as open-source in the near future so that other research groups will be able to train, apply, and extend it on their own data.
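
    The upload step amounts to POSTing each prediction to a REST endpoint. The sketch below shows the general pattern with the requests library; the host, endpoint path, field names and credentials are placeholders, not Cytomine's actual API.

```python
# Push classifier predictions back to a server over HTTP. Endpoint, payload
# fields and credentials are placeholders for illustration only.

import requests

def upload_predictions(server, predictions, keys):
    """POST each prediction as an annotation; schema is a placeholder."""
    for p in predictions:
        resp = requests.post(f"{server}/api/annotations",   # hypothetical endpoint
                             json=p, auth=keys, timeout=30)
        resp.raise_for_status()

predictions = [
    {"image_id": 101, "x": 1520, "y": 880, "label": "abnormal_cluster", "score": 0.94},
    {"image_id": 101, "x": 660,  "y": 412, "label": "normal_cell",      "score": 0.98},
]
# Example call with placeholder host and credentials:
# upload_predictions("https://cytomine.example.org", predictions,
#                    keys=("public_key", "private_key"))
```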

  16. Digital Workflow for the Conservation of Bahrain Built Heritage: the Sheik Isa Bin ALI House

    Science.gov (United States)

    Barazzetti, L.; Mezzino, D.; Santana Quintero, M.

    2017-08-01

    Currently, the commercial market offers several tools for digital documentation of historic sites and buildings. Photogrammetry and laser scanning play a fundamental role in the acquisition of metric information, which is then processed to generate reliable records that are particularly useful in the built heritage conservation field. Although potentially very fast and accurate, such techniques require expert operators to produce reliable results, especially in the case of complex and large sites. The aim of this paper is to present the digital workflow developed for data acquisition and processing of the Shaikh Isa Bin Ali house in Muharraq, Bahrain. This historic structure is an outstanding example of Bahrain architecture as well as a tangible memory of the country's history, with strong connotations in Bahraini cultural identity. The building has been documented employing several digital techniques, including: aerial (drone) and terrestrial photogrammetry, rectifying photography, total station and laser scanning. The documentation project has been developed for the Bahrain Authority for Culture and Antiquities (BACA) by a multidisciplinary team of experts from Carleton Immersive Media Studio (CIMS, Carleton University, Canada) and Gicarus Lab (Politecnico di Milano, Italy).

  17. Pegasus Workflow Management System: Helping Applications From Earth and Space

    Science.gov (United States)

    Mehta, G.; Deelman, E.; Vahi, K.; Silva, F.

    2010-12-01

    Pegasus WMS is a Workflow Management System that can manage large-scale scientific workflows across Grid, local and Cloud resources simultaneously. Pegasus WMS provides a means for representing the workflow of an application in an abstract XML form, agnostic of the resources available to run it and the location of data and executables. It then compiles these workflows into concrete plans by querying catalogs and farming computations across local and distributed computing resources, as well as emerging commercial and community cloud environments, in an easy and reliable manner. Pegasus WMS optimizes the execution as well as data movement by leveraging existing Grid and cloud technologies via a flexible pluggable interface and provides advanced features like reusing existing data, automatic cleanup of generated data, and recursive workflows with deferred planning. It also captures all the provenance of the workflow from the planning stage to the execution of the generated data, helping scientists to accurately measure performance metrics of their workflow as well as data reproducibility issues. Pegasus WMS was initially developed as part of the GriPhyN project to support large-scale high-energy physics and astrophysics experiments. Direct funding from the NSF enabled support for a wide variety of applications from diverse domains including earthquake simulation, bacterial RNA studies, helioseismology and ocean modeling. Earthquake Simulation: Pegasus WMS was recently used in a large-scale production run in 2009 by the Southern California Earthquake Center to run 192 million loosely coupled tasks and about 2000 tightly coupled MPI-style tasks on national cyberinfrastructure for generating a probabilistic seismic hazard map of the Southern California region. SCEC ran 223 workflows over a period of eight weeks, using on average 4,420 cores, with a peak of 14,540 cores. A total of 192 million files were produced totaling about 165TB out of which 11TB of data was saved
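
    The abstract-to-concrete planning step described above can be illustrated with a small sketch: an abstract job names only a logical transformation and logical files, and the planner binds it to a site-specific executable and physical data locations by consulting catalogs. The catalog structures below are simplified illustrations, not Pegasus's actual catalog formats or API.

```python
# Toy planner: bind an abstract job (logical transformation + logical files)
# to a concrete invocation using transformation and replica catalogs.

TRANSFORMATION_CATALOG = {          # logical transformation -> site executable
    "hazard_curve": {"site": "hpc", "path": "/opt/scec/bin/hazard_curve"},
}
REPLICA_CATALOG = {                 # logical file -> physical location
    "rupture_set.dat": "gsiftp://data.example.org/scec/rupture_set.dat",
}

def plan(abstract_job):
    """Bind one abstract job to a concrete invocation on a specific site."""
    tc = TRANSFORMATION_CATALOG[abstract_job["transformation"]]
    inputs = [REPLICA_CATALOG[f] for f in abstract_job["inputs"]]
    return {"site": tc["site"], "argv": [tc["path"], *inputs, abstract_job["output"]]}

abstract_job = {"transformation": "hazard_curve",
                "inputs": ["rupture_set.dat"],
                "output": "site_0231.curve"}
print(plan(abstract_job))
```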

  18. A Workflow-Oriented Approach To Propagation Models In Heliophysics

    Directory of Open Access Journals (Sweden)

    Gabriele Pierantoni

    2014-01-01

    Full Text Available The Sun is responsible for the eruption of billions of tons of plasma and the generation of near light-speed particles that propagate throughout the solar system and beyond. If directed towards Earth, these events can be damaging to our technological infrastructure. Hence there is an effort to understand the cause of the eruptive events and how they propagate from Sun to Earth. However, the physics governing their propagation is not well understood, so there is a need to develop a theoretical description of their propagation, known as a Propagation Model, in order to predict when they may impact Earth. It is often difficult to define a single propagation model that correctly describes the physics of solar eruptive events, and even more difficult to implement models capable of catering for all these complexities and to validate them using real observational data. In this paper, we envisage that workflows offer both a theoretical and practical framework for a novel approach to propagation models. We define a mathematical framework that aims at encompassing the different modalities with which workflows can be used, and provide a set of generic building blocks written in the TAVERNA workflow language that users can use to build their own propagation models. Finally we test both the theoretical model and the composite building blocks of the workflow with a real Science Use Case that was discussed during the 4th CDAW (Coordinated Data Analysis Workshop) event held by the HELIO project. We show that generic workflow building blocks can be used to construct a propagation model that successfully describes the transit of solar eruptive events toward Earth and predicts a correct Earth-impact time.

  19. Provenance-Based Debugging and Drill-Down in Data-Oriented Workflows

    KAUST Repository

    Ikeda, Robert; Cho, Junsang; Fang, Charlie; Salihoglu, Semih; Torikai, Satoshi; Widom, Jennifer

    2012-01-01

    Panda (for Provenance and Data) is a system that supports the creation and execution of data-oriented workflows, with automatic provenance generation and built-in provenance tracing operations. Workflows in Panda are arbitrary acyclic graphs

  20. Contract-Based Transaction Management in Cross-Organizational Workflow Management

    NARCIS (Netherlands)

    Grefen, P.W.P.J.

    Cross-organizational workflow management is an essential ingredient for process integration in virtual enterprises. To obtain cross-organizational workflow processes with robust semantics, these processes should be supported by high-level cross-organizational transaction management. In this context,

  1. VisTrails is an open-source scientific workflow and provenance management system

    CSIR Research Space (South Africa)

    Mthombeni, Thabo DM

    2011-12-01

    Full Text Available VisTrails is an open-source scientific workflow and provenance management system that provides support for simulations, data exploration and visualization. Whereas workflows have been traditionally used to automate repetitive tasks, for applications...

  2. Patient-centered care requires a patient-oriented workflow model.

    Science.gov (United States)

    Ozkaynak, Mustafa; Brennan, Patricia Flatley; Hanauer, David A; Johnson, Sharon; Aarts, Jos; Zheng, Kai; Haque, Saira N

    2013-06-01

    Effective design of health information technology (HIT) for patient-centered care requires consideration of workflow from the patient's perspective, termed 'patient-oriented workflow.' This approach organizes the building blocks of work around the patients who are moving through the care system. Patient-oriented workflow complements the more familiar clinician-oriented workflow approaches, and offers several advantages, including the ability to capture simultaneous, cooperative work, which is essential in care delivery. Patient-oriented workflow models can also provide an understanding of healthcare work taking place in various formal and informal health settings in an integrated manner. We present two cases demonstrating the potential value of patient-oriented workflow models. Significant theoretical, methodological, and practical challenges must be met to ensure adoption of patient-oriented workflow models. Patient-oriented workflow models define meaningful system boundaries and can lead to HIT implementations that are more consistent with cooperative work and its emergent features.

  3. The Taverna workflow suite: designing and executing workflows of Web Services on the desktop, web or in the cloud

    NARCIS (Netherlands)

    Wolstencroft, K.; Haines, R.; Fellows, D.; Williams, A.; Withers, D.; Owen, S.; Soiland-Reyes, S.; Dunlop, I.; Nenadic, A.; Fisher, P.; Bhagat, J.; Belhajjame, K.; Bacall, F.; Hardisty, A.; Nieva de la Hidalga, A.; Balcazar Vargas, M.P.; Sufi, S.; Goble, C.

    2013-01-01

    The Taverna workflow tool suite (http://www.taverna.org.uk) is designed to combine distributed Web Services and/or local tools into complex analysis pipelines. These pipelines can be executed on local desktop machines or through larger infrastructure (such as supercomputers, Grids or cloud

  4. Closha: bioinformatics workflow system for the analysis of massive sequencing data.

    Science.gov (United States)

    Ko, GunHwan; Kim, Pan-Gyu; Yoon, Jongcheol; Han, Gukhee; Park, Seong-Jin; Song, Wangho; Lee, Byungwook

    2018-02-19

    While next-generation sequencing (NGS) costs have fallen in recent years, the cost and complexity of computation remain substantial obstacles to the use of NGS in bio-medical care and genomic research. The rapidly increasing amounts of data available from the new high-throughput methods have made data processing infeasible without automated pipelines. The integration of data and analytic resources into workflow systems provides a solution to the problem by simplifying the task of data analysis. To address this challenge, we developed a cloud-based workflow management system, Closha, to provide fast and cost-effective analysis of massive genomic data. We implemented complex workflows making optimal use of high-performance computing clusters. Closha allows users to create multi-step analyses using drag and drop functionality and to modify the parameters of pipeline tools. Users can also import the Galaxy pipelines into Closha. Closha is a hybrid system that enables users to use both analysis programs providing traditional tools and MapReduce-based big data analysis programs simultaneously in a single pipeline. Thus, the execution of analytics algorithms can be parallelized, speeding up the whole process. We also developed a high-speed data transmission solution, KoDS, to transmit a large amount of data at a fast rate. KoDS has a file transfer speed of up to 10 times that of normal FTP and HTTP. The computer hardware for Closha is 660 CPU cores and 800 TB of disk storage, enabling 500 jobs to run at the same time. Closha is a scalable, cost-effective, and publicly available web service for large-scale genomic data analysis. Closha supports the reliable and highly scalable execution of sequencing analysis workflows in a fully automated manner. Closha provides a user-friendly interface to all genomic scientists to try to derive accurate results from NGS platform data. The Closha cloud server is freely available for use from http://closha.kobic.re.kr/ .

  5. Verification of Timed Healthcare Workflows Using Component Timed-Arc Petri Nets

    DEFF Research Database (Denmark)

    Bertolini, Cristiano; Liu, Zhiming; Srba, Jiri

    2013-01-01

    Workflows in modern healthcare systems are becoming increasingly complex and their execution involves concurrency and sharing of resources. The definition, analysis and management of collaborative healthcare workflows requires abstract model notations with a precisely defined semantics and a supp...

  6. Modeling, Design, and Implementation of a Cloud Workflow Engine Based on Aneka

    OpenAIRE

    Zhou, Jiantao; Sun, Chaoxin; Fu, Weina; Liu, Jing; Jia, Lei; Tan, Hongyan

    2014-01-01

    This paper presents a Petri net-based model for cloud workflow which plays a key role in industry. Three kinds of parallelisms in cloud workflow are characterized and modeled. Based on the analysis of the modeling, a cloud workflow engine is designed and implemented in Aneka cloud environment. The experimental results validate the effectiveness of our approach of modeling, design, and implementation of cloud workflow.
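
    A minimal rendering of the Petri-net formalism behind such a model is sketched below: places hold tokens, a transition is enabled when all of its input places are marked, and firing moves tokens to the output places. The example models two tasks running in parallel followed by a join; it illustrates the formalism only and is not the Aneka engine.

```python
# Minimal Petri net: marking maps places to token counts; a transition fires
# when every input place holds a token, consuming and producing tokens.

class PetriNet:
    def __init__(self, marking, transitions):
        self.marking = dict(marking)            # place -> token count
        self.transitions = transitions          # name -> (inputs, outputs)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking.get(p, 0) > 0 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"transition {name} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] = self.marking.get(p, 0) + 1

# Two parallel tasks followed by a join before the end place is marked.
net = PetriNet(
    marking={"start": 1},
    transitions={
        "split":  (["start"], ["a_ready", "b_ready"]),
        "task_a": (["a_ready"], ["a_done"]),
        "task_b": (["b_ready"], ["b_done"]),
        "join":   (["a_done", "b_done"], ["end"]),
    },
)
for t in ("split", "task_a", "task_b", "join"):
    net.fire(t)
print(net.marking)   # the 'end' place now holds the single token
```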

  7. A-Posteriori Detection of Sensor Infrastructure Errors in Correlated Sensor Data and Business Workflows

    NARCIS (Netherlands)

    Wombacher, Andreas; Rinderle-Ma, Stefanie; Toumani, Farouk; Wolf, Karsten

    Some physical objects are influenced by business workflows and are observed by sensors. Since both sensor infrastructures and business workflows must deal with imprecise information, the correlation of sensor data and business workflow data related to physical objects might be used a-posteriori to

  8. Automatic support for product based workflow design : generation of process models from a product data model

    NARCIS (Netherlands)

    Vanderfeesten, I.T.P.; Reijers, H.A.; Aalst, van der W.M.P.; Vogelaar, J.J.C.L.; Meersman, R.; Dillon, T.; Herrero, P.

    2010-01-01

    Product Based Workflow Design (PBWD) is one of the few scientific methodologies for the (re)design of workflow processes. It is based on an analysis of the product that is produced in the workflow process and derives a process model from the product structure. Until now this derivation has been a

  9. Automated evolutionary restructuring of workflows to minimise errors via stochastic model checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    This paper presents a framework for the automated restructuring of workflows that allows one to minimise the impact of errors on a production workflow. The framework allows for the modelling of workflows by means of a formalised subset of the Business Process Modelling and Notation (BPMN) language...

  10. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee

    2016-01-01

    This article presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults...

  11. Workflow Dynamics and the Imaging Value Chain: Quantifying the Effect of Designating a Nonimage-Interpretive Task Workflow.

    Science.gov (United States)

    Lee, Matthew H; Schemmel, Andrew J; Pooler, B Dustin; Hanley, Taylor; Kennedy, Tabassum A; Field, Aaron S; Wiegmann, Douglas; Yu, John-Paul J

    To assess the impact of separate non-image interpretive task (NIT) and image-interpretive task (IIT) workflows in an academic neuroradiology practice. A prospective, randomized, observational investigation of a centralized academic neuroradiology reading room was performed. The primary reading room fellow was observed over a one-month period using a time-and-motion methodology, recording frequency and duration of tasks performed. Tasks were categorized into separate IIT and NIT workflows. Post-intervention observation of the primary fellow was repeated following the implementation of a consult assistant (CA) responsible for non-image interpretive tasks. Pre- and post-intervention data were compared. Following separation of the IIT and NIT workflows, time spent on image-interpretive tasks by the primary fellow increased from 53.8% to 73.2% while non-image interpretive tasks decreased from 20.4% to 4.4%. Mean time duration of image interpretation nearly doubled, from 05:44 to 11:01 (p = 0.002). Decreases in specific non-image interpretive tasks, including phone calls/paging (2.86/hr versus 0.80/hr), in-room consultations (1.36/hr versus 0.80/hr), and protocoling (0.99/hr versus 0.10/hr), were observed. The consult assistant experienced 29.4 task switching events (TSEs) per hour. Rates of specific non-image interpretive tasks for the CA were 6.41/hr for phone calls/paging, 3.60/hr for in-room consultations, and 3.83/hr for protocoling. Separating responsibilities into NIT and IIT workflows substantially increased image interpretation time and decreased TSEs for the primary fellow. Consolidation of NITs into a separate workflow may allow for more efficient task completion. Copyright © 2017 Elsevier Inc. All rights reserved.
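
    The arithmetic behind figures such as "2.86 phone calls per hour" or "53.8% of time on image interpretation" is simple bookkeeping over the time-and-motion log: sum durations and counts per task category and divide by the observed period. The sketch below uses invented log entries for illustration, not the study's observations.

```python
# Aggregate a time-and-motion log into percent-of-time per category and
# events-per-hour per task. Sample entries are invented for illustration.

from collections import defaultdict

# (task, category, duration in minutes) for one observed hour
observations = [
    ("read CT head",     "image-interpretive", 11.0),
    ("phone call",       "non-image",           2.5),
    ("read MR spine",    "image-interpretive",  9.5),
    ("protocoling",      "non-image",           3.0),
    ("in-room consult",  "non-image",           4.0),
    ("read CT chest",    "image-interpretive", 12.0),
]

observed_minutes = 60.0
time_by_cat = defaultdict(float)
count_by_task = defaultdict(int)
for task, cat, minutes in observations:
    time_by_cat[cat] += minutes
    count_by_task[task] += 1

for cat, minutes in time_by_cat.items():
    print(f"{cat}: {100 * minutes / observed_minutes:.1f}% of observed time")
for task, n in count_by_task.items():
    print(f"{task}: {n / (observed_minutes / 60):.2f} events per hour")
```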

  12. The Symbiotic Relationship between Scientific Workflow and Provenance (Invited)

    Science.gov (United States)

    Stephan, E.

    2010-12-01

    The purpose of this presentation is to describe the symbiotic nature of scientific workflows and provenance. We will also discuss the current trends and real-world challenges facing these two distinct research areas. Although motivated differently, the needs of the international science communities are the glue that binds this relationship together. Understanding and articulating the science drivers to these communities is paramount as these technologies evolve and mature. Originally conceived for managing business processes, workflows are now becoming invaluable assets in both computational and experimental sciences. These reconfigurable, automated systems provide essential technology to perform complex analyses by coupling together geographically distributed disparate data sources and applications. As a result, workflows are capable of higher throughput in a shorter amount of time than performing the steps manually. Today many different workflow products exist; these could include Kepler and Taverna or similar products like MeDICI, developed at PNNL, that are standardized on the Business Process Execution Language (BPEL). Provenance, originating from the French term Provenir "to come from", is used to describe the curation process of artwork as art is passed from owner to owner. The concept of provenance was adopted by digital libraries as a means to track the lineage of documents while standards such as the DublinCore began to emerge. In recent years the systems science community has increasingly expressed the need to expand the concept of provenance to formally articulate the history of scientific data. Communities such as the International Provenance and Annotation Workshop (IPAW) have formalized a provenance data model, the Open Provenance Model, and the W3C is hosting a provenance incubator group featuring the Proof Markup Language. Although both workflows and provenance have risen from different communities and operate independently, their mutual

  13. A workflow learning model to improve geovisual analytics utility.

    Science.gov (United States)

    Roth, Robert E; Maceachren, Alan M; McCabe, Craig A

    2009-01-01

    the concept of scientific workflows. Second, we implemented an interface in the G-EX Portal Learn Module to demonstrate the workflow learning model. The workflow interface allows users to drag learning artifacts uploaded to the G-EX Portal onto a central whiteboard and then annotate the workflow using text and drawing tools. Once completed, users can visit the assembled workflow to get an idea of the kind, number, and scale of analysis steps, view individual learning artifacts associated with each node in the workflow, and ask questions about the overall workflow or individual learning artifacts through the associated forums. An example learning workflow in the domain of epidemiology is provided to demonstrate the effectiveness of the approach. RESULTS/CONCLUSIONS: In the context of geovisual analytics, GIScientists are not only responsible for developing software to facilitate visually-mediated reasoning about large and complex spatiotemporal information, but also for ensuring that this software works. The workflow learning model discussed in this paper and demonstrated in the G-EX Portal Learn Module is one approach to improving the utility of geovisual analytics software. While development of the G-EX Portal Learn Module is ongoing, we expect to release the G-EX Portal Learn Module by Summer 2009.

  14. Digital workflow for virtually designing and milling ceramic lithium disilicate veneers: a clinical report.

    Science.gov (United States)

    Zandinejad, A; Lin, W S; Atarodi, M; Abdel-Azim, T; Metz, M J; Morton, D

    2015-01-01

    Laminate veneers have been routinely used to restore and enhance the appearance of natural dentition. The traditional pathway for fabricating veneers consisted of making conventional polyvinyl siloxane impressions, producing stone casts, and fabricating final porcelain prostheses on stone dies. Pressed ceramics have successfully been used for laminate veneer fabrication for several years. Recently, digital computer-aided design/computer-aided manufacturing scanning has become commercially available to make a digital impression that is sent electronically to a dental laboratory or a chairside milling machine. However, technology has been developed to allow digital data acquisition in conjunction with electronically transmitted data that enables virtual design of restorations and milling at a remote production center. Following the aforementioned workflow will provide the opportunity to fabricate a physical cast-free restoration. This new technique has been reported recently for all-ceramic IPS e.max full-coverage pressed-ceramic restorations. However, laminate veneers are very delicate and technique-sensitive restorations when compared with all-ceramic full-coverage ones made from the same material. Complete digital design and fabrication of multiple consecutive laminate veneers seems to be very challenging. This clinical report presents the digital workflow for the virtual design and fabrication of multiple laminate veneers in a patient for enhancing the esthetics of his maxillary anterior teeth. A step-by-step process is presented with a discussion of the advantages and disadvantages of this novel technique. Additionally, the use of lithium disilicate ceramic as the material of choice and the rationale for such a decision is discussed.

  15. A web accessible scientific workflow system for vadoze zone performance monitoring: design and implementation examples

    Science.gov (United States)

    Mattson, E.; Versteeg, R.; Ankeny, M.; Stormberg, G.

    2005-12-01

    Long-term performance monitoring has been identified by DOE, DOD and EPA as one of the most challenging and costly elements of contaminated site remedial efforts. Such monitoring should provide timely and actionable information relevant to a multitude of stakeholder needs. This information should be obtained in a manner which is auditable, cost-effective and transparent. Over the last several years INL staff has designed and implemented a web-accessible scientific workflow system for environmental monitoring. This workflow environment integrates distributed, automated data acquisition from diverse sensors (geophysical, geochemical and hydrological) with server-side data management and information visualization through flexible browser-based data access tools. Component technologies include a rich browser-based client (using dynamic javascript and html/css) for data selection, a back-end server which uses PHP for data processing, user management, and result delivery, and third-party applications which are invoked by the back-end using webservices. This system has been implemented and is operational for several sites, including the Ruby Gulch Waste Rock Repository (a capped mine waste rock dump on the Gilt Edge Mine Superfund Site), the INL Vadose Zone Research Park and an alternative cover landfill. Implementations for other vadose zone sites are currently in progress. These systems allow for autonomous performance monitoring through automated data analysis and report generation. This performance monitoring has allowed users to obtain insights into system dynamics, regulatory compliance and residence times of water. Our system uses modular components for data selection and graphing and WSDL-compliant webservices for external functions such as statistical analyses and model invocations. Thus, implementing this system for novel sites and extending functionality (e.g. adding novel models) is relatively straightforward. As system access requires a standard web browser

  16. Networked Print Production: Does JDF Provide a Perfect Workflow?

    Directory of Open Access Journals (Sweden)

    Bernd Zipper

    2004-12-01

    Full Text Available The "networked printing works" is a well-worn slogan used by many providers in the graphics industry and for the past number of years printing-works manufacturers have been working on the goal of achieving the "networked printing works". A turning point from the concept to real implementation can now be expected at drupa 2004: JDF (Job Definition Format and thus "networked production" will form the center of interest here. The first approaches towards a complete, networked workflow between prepress, print and postpress in production are already available - the products and solutions will now be presented publicly at drupa 2004. So, drupa 2004 will undoubtedly be the "JDF-drupa" - the drupa where machines learn to communicate with each other digitally - the drupa, where the dream of general system and job communication in the printing industry can be first realized. CIP3, which has since been renamed CIP4, is an international consortium of leading manufacturers from the printing and media industry who have taken on the task of integrating processes for prepress, print and postpress. The association, to which nearly all manufacturers in the graphics industry belong, has succeeded with CIP3 in developing a first international standard for the transmission of control data in the print workflow.Further development of the CIP4 standard now includes a more extensive "system language" called JDF, which will guarantee workflow communication beyond manufacturer boundaries. However, not only data for actual print production will be communicated with JDF (Job Definition Format: planning and calculation data for MIS (Management Information systems and calculation systems will also be prepared. The German printing specialist Hans-Georg Wenke defines JDF as follows: "JDF takes over data from MIS for machines, aggregates and their control desks, data exchange within office applications, and finally ensures that data can be incorporated in the technical workflow

  17. Research and Implementation of Key Technologies in Multi-Agent System to Support Distributed Workflow

    Science.gov (United States)

    Pan, Tianheng

    2018-01-01

    In recent years, the combination of workflow management systems and multi-agent technology has become a hot research field. The lack of flexibility in workflow management systems can be addressed by introducing multi-agent collaborative management. The workflow management system adopts a distributed structure, which avoids the fragility of the traditional centralized workflow architecture. In this paper, the agents of a distributed workflow management system are divided according to their functions, the execution process of each type of agent is analyzed, and key technologies such as process execution and resource management are discussed.

  18. A framework for streamlining research workflow in neuroscience and psychology

    Directory of Open Access Journals (Sweden)

    Jonas eKubilius

    2014-01-01

    Full Text Available Successful accumulation of knowledge is critically dependent on the ability to verify and replicate every part of scientific conduct. However, such principles are difficult to enact when researchers continue to rely on ad hoc workflows and poorly maintained code bases. In this paper I examine the needs of the neuroscience and psychology community, and introduce psychopy_ext, a unifying framework that seamlessly integrates popular experiment-building, analysis and manuscript-preparation tools by choosing reasonable defaults and implementing relatively rigid patterns of workflow. This structure allows for the automation of multiple tasks, such as generating user interfaces, unit testing, control analyses of stimuli, single-command access to descriptive statistics, and publication-quality plotting. Taken together, psychopy_ext opens an exciting possibility for faster, more robust code development and collaboration for researchers.

  19. The complete digital workflow in fixed prosthodontics: a systematic review.

    Science.gov (United States)

    Joda, Tim; Zarone, Fernando; Ferrari, Marco

    2017-09-19

    The continuous development in dental processing ensures new opportunities in the field of fixed prosthodontics in a complete virtual environment without any physical model situations. The aim was to compare fully digitalized workflows to conventional and/or mixed analog-digital workflows for the treatment with tooth-borne or implant-supported fixed reconstructions. A PICO strategy was executed using an electronic (MEDLINE, EMBASE, Google Scholar) plus manual search up to 2016-09-16 focusing on RCTs investigating complete digital workflows in fixed prosthodontics with regard to economics or esthetics or patient-centered outcomes with or without follow-up or survival/success rate analysis as well as complication assessment of at least 1 year under function. The search strategy was assembled from MeSH-Terms and unspecific free-text words: {(("Dental Prosthesis" [MeSH]) OR ("Crowns" [MeSH]) OR ("Dental Prosthesis, Implant-Supported" [MeSH])) OR ((crown) OR (fixed dental prosthesis) OR (fixed reconstruction) OR (dental bridge) OR (implant crown) OR (implant prosthesis) OR (implant restoration) OR (implant reconstruction))} AND {("Computer-Aided Design" [MeSH]) OR ((digital workflow) OR (digital technology) OR (computerized dentistry) OR (intraoral scan) OR (digital impression) OR (scanbody) OR (virtual design) OR (digital design) OR (cad/cam) OR (rapid prototyping) OR (monolithic) OR (full-contour))} AND {("Dental Technology" [MeSH) OR ((conventional workflow) OR (lost-wax-technique) OR (porcelain-fused-to-metal) OR (PFM) OR (implant impression) OR (hand-layering) OR (veneering) OR (framework))} AND {(("Study, Feasibility" [MeSH]) OR ("Survival" [MeSH]) OR ("Success" [MeSH]) OR ("Economics" [MeSH]) OR ("Costs, Cost Analysis" [MeSH]) OR ("Esthetics, Dental" [MeSH]) OR ("Patient Satisfaction" [MeSH])) OR ((feasibility) OR (efficiency) OR (patient-centered outcome))}. Assessment of risk of bias in selected studies was done at a 'trial level' including random sequence

  20. Staffing and Workflow of a Maturing Institutional Repository

    Directory of Open Access Journals (Sweden)

    Debora L. Madsen

    2013-02-01

    Full Text Available Institutional repositories (IRs have become established components of many academic libraries. As an IR matures it will face the challenge of how to scale up its operations to increase the amount and types of content archived. These challenges involve staffing, systems, workflows, and promotion. In the past eight years, Kansas State University's IR (K-REx has grown from a platform for student theses, dissertations, and reports to also include faculty works. The initial workforce of a single faculty member was expanded as a part of a library-wide reorganization, resulting in a cross-departmental team that is better able to accommodate the expansion of the IR. The resultant need to define staff responsibilities and develop resources to manage the workflows has led to the innovations described here, which may prove useful to the greater library community as other IRs mature.

  1. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction.

    Science.gov (United States)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O; Ellis, Heidi J C; Gryk, Michael R

    2015-07-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  2. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert [UConn Health, Department of Molecular Biology and Biophysics (United States); Martyn, Timothy O. [Rensselaer at Hartford, Department of Engineering and Science (United States); Ellis, Heidi J. C. [Western New England College, Department of Computer Science and Information Technology (United States); Gryk, Michael R., E-mail: gryk@uchc.edu [UConn Health, Department of Molecular Biology and Biophysics (United States)

    2015-07-15

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses.

  3. CONNJUR Workflow Builder: a software integration environment for spectral reconstruction

    International Nuclear Information System (INIS)

    Fenwick, Matthew; Weatherby, Gerard; Vyas, Jay; Sesanker, Colbert; Martyn, Timothy O.; Ellis, Heidi J. C.; Gryk, Michael R.

    2015-01-01

    CONNJUR Workflow Builder (WB) is an open-source software integration environment that leverages existing spectral reconstruction tools to create a synergistic, coherent platform for converting biomolecular NMR data from the time domain to the frequency domain. WB provides data integration of primary data and metadata using a relational database, and includes a library of pre-built workflows for processing time domain data. WB simplifies maximum entropy reconstruction, facilitating the processing of non-uniformly sampled time domain data. As will be shown in the paper, the unique features of WB provide it with novel abilities to enhance the quality, accuracy, and fidelity of the spectral reconstruction process. WB also provides features which promote collaboration, education, parameterization, and non-uniform data sets along with processing integrated with the Rowland NMR Toolkit (RNMRTK) and NMRPipe software packages. WB is available free of charge in perpetuity, dual-licensed under the MIT and GPL open source licenses

  4. In-depth evaluation of software tools for data-independent acquisition based label-free quantification.

    Science.gov (United States)

    Kuharev, Jörg; Navarro, Pedro; Distler, Ute; Jahn, Olaf; Tenzer, Stefan

    2015-09-01

    Label-free quantification (LFQ) based on data-independent acquisition workflows currently experiences increasing popularity. Several software tools have been recently published or are commercially available. The present study focuses on the evaluation of three different software packages (Progenesis, synapter, and ISOQuant) supporting ion mobility enhanced data-independent acquisition data. In order to benchmark the LFQ performance of the different tools, we generated two hybrid proteome samples of defined quantitative composition containing tryptically digested proteomes of three different species (mouse, yeast, Escherichia coli). This model dataset simulates complex biological samples containing large numbers of both unregulated (background) proteins as well as up- and downregulated proteins with exactly known ratios between samples. We determined the number and dynamic range of quantifiable proteins and analyzed the influence of applied algorithms (retention time alignment, clustering, normalization, etc.) on quantification results. Analysis of technical reproducibility revealed median coefficients of variation of reported protein abundances below 5% for MS(E) data for Progenesis and ISOQuant. Regarding accuracy of LFQ, evaluation with synapter and ISOQuant yielded superior results compared to Progenesis. In addition, we discuss reporting formats and user friendliness of the software packages. The data generated in this study have been deposited to the ProteomeXchange Consortium with identifier PXD001240 (http://proteomecentral.proteomexchange.org/dataset/PXD001240). © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
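
    The reproducibility metric quoted above, the median coefficient of variation of protein abundances across technical replicates, can be computed in a few lines; the abundance values below are invented purely for illustration.

        import numpy as np

        # rows = proteins, columns = technical replicates (values are illustrative only)
        abundances = np.array([
            [1.00e6, 1.03e6, 0.98e6],
            [2.50e5, 2.40e5, 2.65e5],
            [7.10e4, 6.90e4, 7.30e4],
        ])

        # per-protein coefficient of variation across replicates
        cv = abundances.std(axis=1, ddof=1) / abundances.mean(axis=1)
        print(f"median CV across proteins: {np.median(cv) * 100:.1f}%")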

  5. Speed in Acquisitions

    DEFF Research Database (Denmark)

    Meglio, Olimpia; King, David R.; Risberg, Annette

    2017-01-01

    The advantage of speed is often invoked by academics and practitioners as an essential condition during post-acquisition integration, frequently without consideration of the impact earlier decisions have on acquisition speed. In this article, we examine the role speed plays in acquisitions across the acquisition process using research organized around characteristics that display complexity with respect to acquisition speed. We incorporate existing research with a process perspective of acquisitions in order to present trade-offs, and consider the influence of both stakeholders and the pre-deal-completion context on acquisition speed, as well as the organization's capabilities to facilitate that speed. Observed trade-offs suggest both that acquisition speed often requires longer planning time before an acquisition and that associated decisions require managerial judgement. A framework for improving...

  6. ESO Reflex: a graphical workflow engine for data reduction

    Science.gov (United States)

    Hook, Richard; Ullgrén, Marko; Romaniello, Martino; Maisala, Sami; Oittinen, Tero; Solin, Otto; Savolainen, Ville; Järveläinen, Pekka; Tyynelä, Jani; Péron, Michèle; Ballester, Pascal; Gabasch, Armin; Izzo, Carlo

    ESO Reflex is a prototype software tool that provides a novel approach to astronomical data reduction by integrating a modern graphical workflow system (Taverna) with existing legacy data reduction algorithms. Most of the raw data produced by instruments at the ESO Very Large Telescope (VLT) in Chile are reduced using recipes. These are compiled C applications following an ESO standard and utilising routines provided by the Common Pipeline Library (CPL). Currently these are run in batch mode as part of the data flow system to generate the input to the ESO/VLT quality control process and are also exported for use offline. ESO Reflex can invoke CPL-based recipes in a flexible way through a general purpose graphical interface. ESO Reflex is based on the Taverna system that was originally developed within the UK life-sciences community. Workflows have been created so far for three VLT/VLTI instruments, and the GUI allows the user to make changes to these or create workflows of their own. Python scripts or IDL procedures can be easily brought into workflows and a variety of visualisation and display options, including custom product inspection and validation steps, are available. Taverna is intended for use with web services and experiments using ESO Reflex to access Virtual Observatory web services have been successfully performed. ESO Reflex is the main product developed by Sampo, a project led by ESO and conducted by a software development team from Finland as an in-kind contribution to joining ESO. The goal was to look into the needs of the ESO community in the area of data reduction environments and to create pilot software products that illustrate critical steps along the road to a new system. Sampo concluded early in 2008. This contribution will describe ESO Reflex and show several examples of its use both locally and using Virtual Observatory remote web services. ESO Reflex is expected to be released to the community in early 2009.

  7. a Standardized Approach to Topographic Data Processing and Workflow Management

    Science.gov (United States)

    Wheaton, J. M.; Bailey, P.; Glenn, N. F.; Hensleigh, J.; Hudak, A. T.; Shrestha, R.; Spaete, L.

    2013-12-01

    An ever-increasing list of options exist for collecting high resolution topographic data, including airborne LIDAR, terrestrial laser scanners, bathymetric SONAR and structure-from-motion. An equally rich, arguably overwhelming, variety of tools exists with which to organize, quality control, filter, analyze and summarize these data. However, scientists are often left to cobble together their analysis as a series of ad hoc steps, often using custom scripts and one-time processes that are poorly documented and rarely shared with the community. Even when literature-cited software tools are used, the input and output parameters differ from tool to tool. These parameters are rarely archived and the steps performed lost, making the analysis virtually impossible to replicate precisely. What is missing is a coherent, robust, framework for combining reliable, well-documented topographic data-processing steps into a workflow that can be repeated and even shared with others. We have taken several popular topographic data processing tools - including point cloud filtering and decimation as well as DEM differencing - and defined a common protocol for passing inputs and outputs between them. This presentation describes a free, public online portal that enables scientists to create custom workflows for processing topographic data using a number of popular topographic processing tools. Users provide the inputs required for each tool and in what sequence they want to combine them. This information is then stored for future reuse (and optionally sharing with others) before the user then downloads a single package that contains all the input and output specifications together with the software tools themselves. The user then launches the included batch file that executes the workflow on their local computer against their topographic data. This ZCloudTools architecture helps standardize, automate and archive topographic data processing. It also represents a forum for discovering and

  8. Optimising metadata workflows in a distributed information environment

    OpenAIRE

    Robertson, R. John; Barton, Jane

    2005-01-01

    The different purposes present within a distributed information environment create the potential for repositories to enhance their metadata by capitalising on the diversity of metadata available for any given object. This paper presents three conceptual reference models required to achieve this optimisation of metadata workflow: the ecology of repositories, the object lifecycle model, and the metadata lifecycle model. It suggests a methodology for developing the metadata lifecycle model, and ...

  9. Workflow Optimization for Tuning Prostheses with High Input Channel

    Science.gov (United States)

    2017-10-01

    AWARD NUMBER: W81XWH-16-1-0767. TITLE: Workflow Optimization for Tuning Prostheses with High Input Channel. PRINCIPAL INVESTIGATOR: Daniel Merrill. ... of Specific Aim 1 by driving a commercially available two-DoF wrist and single-DoF hand. The high-level control system will provide analog signals... The views, opinions and/or findings contained in this report are those of the author(s) and should not be construed as an official Department

  10. A STRUCTURAL MODEL OF AN EXCAVATOR WORKFLOW CONTROL SYSTEM

    Directory of Open Access Journals (Sweden)

    A. Gurko

    2016-12-01

    Full Text Available Improving earthwork operations is closely connected with excavator automation. In this paper, based on an analysis of the problems that a hydraulic excavator control system has to solve, a hierarchical structure for the control system is proposed. The control process was decomposed, which made it possible to develop a structural model reflecting the characteristics of a multilevel, spatially distributed control system for an excavator workflow.

  11. ESO Reflex: A Graphical Workflow Engine for Data Reduction

    Science.gov (United States)

    Hook, R.; Romaniello, M.; Péron, M.; Ballester, P.; Gabasch, A.; Izzo, C.; Ullgrén, M.; Maisala, S.; Oittinen, T.; Solin, O.; Savolainen, V.; Järveläinen, P.; Tyynelä, J.

    2008-08-01

    Sampo {http://www.eso.org/sampo} (Hook et al. 2005) is a project led by ESO and conducted by a software development team from Finland as an in-kind contribution to joining ESO. The goal is to assess the needs of the ESO community in the area of data reduction environments and to create pilot software products that illustrate critical steps along the road to a new system. Those prototypes will not only be used to validate concepts and understand requirements but will also be tools of immediate value for the community. Most of the raw data produced by ESO instruments can be reduced using CPL {http://www.eso.org/cpl} recipes: compiled C programs following an ESO standard and utilizing routines provided by the Common Pipeline Library. Currently reduction recipes are run in batch mode as part of the data flow system to generate the input to the ESO VLT/VLTI quality control process and are also made public for external users. Sampo has developed a prototype application called ESO Reflex {http://www.eso.org/sampo/reflex/} that integrates a graphical user interface and existing data reduction algorithms. ESO Reflex can invoke CPL-based recipes in a flexible way through a dedicated interface. ESO Reflex is based on the graphical workflow engine Taverna {http://taverna.sourceforge.net} that was originally developed by the UK eScience community, mostly for work in the life sciences. Workflows have been created so far for three VLT/VLTI instrument modes ( VIMOS/IFU {http://www.eso.org/instruments/vimos/}, FORS spectroscopy {http://www.eso.org/instruments/fors/} and AMBER {http://www.eso.org/instruments/amber/}), and the easy-to-use GUI allows the user to make changes to these or create workflows of their own. Python scripts and IDL procedures can be easily brought into workflows and a variety of visualisation and display options, including custom product inspection and validation steps, are available.

  12. Multi-perspective workflow modeling for online surgical situation models.

    Science.gov (United States)

    Franke, Stefan; Meixensberger, Jürgen; Neumuth, Thomas

    2015-04-01

    Surgical workflow management is expected to enable situation-aware adaptation and intelligent systems behavior in an integrated operating room (OR). The overall aim is to unburden the surgeon and OR staff from both manual maintenance and information seeking tasks. A major step toward intelligent systems behavior is a stable classification of the surgical situation from multiple perspectives based on performed low-level tasks. The present work proposes a method for the classification of surgical situations based on multi-perspective workflow modeling. A model network that interconnects different types of surgical process models is described. Various aspects of a surgical situation description were considered: low-level tasks, high-level tasks, patient status, and the use of medical devices. A study with sixty neurosurgical interventions was conducted to evaluate the performance of our approach and its robustness against incomplete workflow recognition input. A correct classification rate of over 90% was measured for high-level tasks and patient status. The device usage models for navigation and neurophysiology classified over 95% of the situations correctly, whereas the ultrasound usage was more difficult to predict. Overall, the classification rate decreased with an increasing level of input distortion. Autonomous adaptation of medical devices and intelligent systems behavior do not currently depend solely on low-level tasks. Instead, they require a more general type of understanding of the surgical condition. The integration of various surgical process models in a network provided a comprehensive representation of the interventions and allowed for the generation of extensive situation descriptions. Multi-perspective surgical workflow modeling and online situation models will be a significant pre-requisite for reliable and intelligent systems behavior. Hence, they will contribute to a cooperative OR environment. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. Impact of Diabetes E-Consults on Outpatient Clinic Workflow.

    Science.gov (United States)

    Zoll, Brian; Parikh, Pratik J; Gallimore, Jennie; Harrell, Stephen; Burke, Brian

    2015-08-01

    An e-consult is an electronic communication system between clinicians, usually a primary care physician (PCP) and a medical or surgical specialist, regarding general or patient-specific, low complexity questions that would not need an in-person consultation. The objectives of this study were to understand and quantify the impact of the e-consult initiative on outpatient clinic workflow and outcomes. We collected data from 5 different Veterans Affairs (VA) outpatient clinics and interviewed several physicians and staff members. We then developed a simulation model for a primary care team at an outpatient clinic. A detailed experimental study was conducted to determine the effects of factors, such as e-consult demand, view-alert notification arrivals, walk-in patient arrivals, and PCP unavailability, on e-consult cycle time. Statistical tests indicated that 4 factors related to outpatient clinic workflow were significant, and levels within each of the 4 significant factors resulted in statistically different e-consult cycle times. The arrival rate of electronic notifications, along with patient walk-ins, had a considerable effect on cycle time. Splitting the workload of an unavailable PCP among the other PCPs, instead of the current practice of allocating it to a single PCP, increases the system's ability to handle a much larger e-consult demand. The full potential of e-consults can only be realized if the workflow at the outpatient clinics is designed or modified to support this initiative. This study furthers our understanding of how e-consult systems can be analyzed and alternative workflows tested using statistical and simulation modeling to improve care delivery and outcomes. © The Author(s) 2014.
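
    A minimal discrete-event sketch of the kind of outpatient-clinic simulation described above, written with the SimPy library. The arrival rate, service time, and single-PCP resource are illustrative assumptions, not the parameters used in the study.

        import random
        import simpy  # pip install simpy

        RNG = random.Random(42)
        cycle_times = []


        def econsult(env, pcp):
            arrival = env.now
            with pcp.request() as req:                       # wait until the PCP is free
                yield req
                yield env.timeout(RNG.expovariate(1 / 15))   # ~15 min to answer (assumed)
            cycle_times.append(env.now - arrival)


        def arrivals(env, pcp):
            while True:
                yield env.timeout(RNG.expovariate(1 / 30))   # ~one e-consult every 30 min
                env.process(econsult(env, pcp))


        env = simpy.Environment()
        pcp = simpy.Resource(env, capacity=1)                # a single primary care physician
        env.process(arrivals(env, pcp))
        env.run(until=8 * 60)                                # one 8-hour clinic day, in minutes

        print(f"{len(cycle_times)} e-consults completed, "
              f"mean cycle time {sum(cycle_times) / len(cycle_times):.1f} min")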

  14. Yadage and Packtivity - analysis preservation using parametrized workflows

    Science.gov (United States)

    Cranmer, Kyle; Heinrich, Lukas

    2017-10-01

    Preserving data analyses produced by the collaborations at LHC in a parametrized fashion is crucial in order to maintain reproducibility and re-usability. We argue for a declarative description in terms of individual processing steps - “packtivities” - linked through a dynamic directed acyclic graph (DAG) and present an initial set of JSON schemas for such a description and an implementation - “yadage” - capable of executing workflows of analysis preserved via Linux containers.
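
    The declarative idea can be sketched as plain data plus a topological walk over the DAG. The step format below is a simplified illustration, not the actual yadage/packtivity JSON schema, and the commands are placeholders.

        from graphlib import TopologicalSorter  # Python 3.9+

        steps = {  # each "packtivity": a parametrized process plus its dependencies
            "select_events": {"process": "python select.py --input {data} --out selected.root",
                              "depends_on": []},
            "fit":           {"process": "python fit.py --events selected.root --out fit.json",
                              "depends_on": ["select_events"]},
            "limits":        {"process": "python limits.py --fit fit.json",
                              "depends_on": ["fit"]},
        }

        dag = TopologicalSorter({name: set(s["depends_on"]) for name, s in steps.items()})
        for name in dag.static_order():
            # in a preserved analysis each command would run inside its Linux container
            print(f"running step '{name}': {steps[name]['process']}")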

  15. Malware Normalization

    OpenAIRE

    Christodorescu, Mihai; Kinder, Johannes; Jha, Somesh; Katzenbeisser, Stefan; Veith, Helmut

    2005-01-01

    Malware is code designed for a malicious purpose, such as obtaining root privilege on a host. A malware detector identifies malware and thus prevents it from adversely affecting a host. In order to evade detection by malware detectors, malware writers use various obfuscation techniques to transform their malware. There is strong evidence that commercial malware detectors are susceptible to these evasion tactics. In this paper, we describe the design and implementation of a malware normalizer ...

  16. Thermal Remote Sensing with Uav-Based Workflows

    Science.gov (United States)

    Boesch, R.

    2017-08-01

    Climate change will have a significant influence on vegetation health and growth. Predictions of higher mean summer temperatures and prolonged summer droughts may pose a threat to agricultural areas and forest canopies. Rising canopy temperatures can be an indicator of plant stress because of the closure of stomata and a decrease in the transpiration rate. Thermal cameras have been available for decades, but they are still often used for single-image analysis, only in an oblique-view manner, or with visual evaluations of video sequences. Remote sensing using a thermal camera can therefore be an important data source for understanding transpiration processes. Photogrammetric workflows allow thermal images to be processed similarly to RGB data. However, the low spatial resolution of thermal cameras, significant optical distortion, and typically low contrast require an adapted workflow. The temperature distribution in forest canopies is typically completely unknown and less distinct than for urban or industrial areas, where metal constructions and surfaces yield high contrast and sharp edge information. The aim of this paper is to investigate the influence of interior camera orientation, tie point matching, and ground control points on the resulting accuracy of bundle adjustment and dense cloud generation with a typically used photogrammetric workflow for UAV-based thermal imagery in natural environments.

  17. a Workflow for UAV's Integration Into a Geodesign Platform

    Science.gov (United States)

    Anca, P.; Calugaru, A.; Alixandroae, I.; Nazarie, R.

    2016-06-01

    This paper presents a workflow for the development of various Geodesign scenarios. The subject is important in the context of identifying patterns and designing solutions for a Smart City with optimized public transportation, efficient buildings, efficient utilities, recreational facilities, and so on. The workflow describes the procedures starting with acquiring data in the field, data processing, orthophoto generation, DTM generation, integration into a GIS platform, and analysis for better Geodesign support. Esri's City Engine is used mostly for its 3D modeling capabilities, which enable the user to obtain realistic 3D models. The workflow uses as inputs information extracted from images acquired using UAV technologies, namely eBee, existing 2D GIS geodatabases, and a set of CGA rules. The method used next, called procedural modeling, uses rules to extrude buildings, the street network, parcel zoning, and side details based on the initial attributes from the geodatabase. The resulting products are various scenarios for redesigning and for analyzing new exploitation sites. Finally, these scenarios can be published as interactive web scenes for internal, group, or public consultation. In this way, problems such as the impact of new constructions being built, re-arranging green spaces, or changing routes for public transportation are revealed through impact, visibility, or shadowing analysis and are brought to the citizens' attention. This leads to better decisions.

  18. AnalyzeThis: An Analysis Workflow-Aware Storage System

    Energy Technology Data Exchange (ETDEWEB)

    Sim, Hyogi [ORNL; Kim, Youngjae [ORNL; Vazhkudai, Sudharshan S [ORNL; Tiwari, Devesh [ORNL; Anwar, Ali [Virginia Tech, Blacksburg, VA; Butt, Ali R [Virginia Tech, Blacksburg, VA; Ramakrishnan, Lavanya [Lawrence Berkeley National Laboratory (LBNL)

    2015-01-01

    The need for novel data analysis is urgent in the face of a data deluge from modern applications. Traditional approaches to data analysis incur significant data movement costs, moving data back and forth between the storage system and the processor. Emerging Active Flash devices enable processing on the flash, where the data already resides. An array of such Active Flash devices allows us to revisit how analysis workflows interact with storage systems. By seamlessly blending together the flash storage and data analysis, we create an analysis workflow-aware storage system, AnalyzeThis. Our guiding principle is that analysis-awareness be deeply ingrained in each and every layer of the storage, elevating data analyses as first-class citizens, and transforming AnalyzeThis into a potent analytics-aware appliance. We implement the AnalyzeThis storage system atop an emulation platform of the Active Flash array. Our results indicate that AnalyzeThis is viable, expediting workflow execution and minimizing data movement.

  19. Reproducible Research Data Analyses using the Common Workflow Language standards

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    This talk will introduce the Common Workflow Language project. In July 2016 they released standards that enable the portable, interoperable, and executable description of command line data analysis tools and workflows made from those tools. These descriptions are enhanced by CWL's first-class (but optional) support for Docker containers. CWL originated from the world of bioinformatics but is not discipline specific and is gaining interest and use in other fields. Attendees who want to play with CWL prior to attending the presentation are invited to go through the "Gentle Introduction to the Common Workflow Language" tutorial on any OS X or Linux machine on their own time. About the speaker Michael R. Crusoe is one of the co-founders of the CWL project and is the CWL Community Engineer. His facilitation, technical contributions, and training on behalf of the project draw from his time as the former lead developer of C. Titus Brown's khmer project, his previous career as a sysadmin and programmer, and his ex...
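
    For orientation, this is roughly what a minimal CWL tool description looks like (the "echo" example from the public CWL user guide), emitted here from Python; field names follow CWL v1.0, though current releases may differ in detail.

        import yaml  # pip install pyyaml

        echo_tool = {
            "cwlVersion": "v1.0",
            "class": "CommandLineTool",
            "baseCommand": "echo",
            "inputs": {"message": {"type": "string", "inputBinding": {"position": 1}}},
            "outputs": [],
        }

        with open("echo-tool.cwl", "w") as fh:
            yaml.safe_dump(echo_tool, fh, sort_keys=False)

        # a CWL runner can then execute it, e.g.:  cwltool echo-tool.cwl --message "Hello CWL"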

  20. Formalizing an integrative, multidisciplinary cancer therapy discovery workflow

    Science.gov (United States)

    McGuire, Mary F.; Enderling, Heiko; Wallace, Dorothy I.; Batra, Jaspreet; Jordan, Marie; Kumar, Sushil; Panetta, John C.; Pasquier, Eddy

    2014-01-01

    Although many clinicians and researchers work to understand cancer, there has been limited success in effectively combining forces and collaborating across constraints of time, distance, data, and budget. Here we present a workflow template for multidisciplinary cancer therapy that was developed during the 2nd Annual Workshop on Cancer Systems Biology sponsored by Tufts University, Boston, MA in July 2012. The template was applied to the development of a metronomic therapy backbone for neuroblastoma. Three primary groups were identified: clinicians, biologists, and scientists (mathematicians, computer scientists, physicists and engineers). The workflow described their integrative interactions; parallel or sequential processes; data sources and computational tools at different stages as well as the iterative nature of therapeutic development from clinical observations to in vitro, in vivo, and clinical trials. We found that theoreticians in dialog with experimentalists could develop calibrated and parameterized predictive models that inform and formalize sets of testable hypotheses, thus speeding up discovery and validation while reducing laboratory resources and costs. The developed template outlines an interdisciplinary collaboration workflow designed to systematically investigate the mechanistic underpinnings of a new therapy and validate that therapy to advance development and clinical acceptance. PMID:23955390

  1. Automated quality control in a file-based broadcasting workflow

    Science.gov (United States)

    Zhang, Lina

    2014-04-01

    Benefiting from the development of information and Internet technologies, television broadcasting is transforming from inefficient tape-based production and distribution to integrated file-based workflows. However, no matter how many changes have taken place, successful broadcasting still depends on the ability to deliver a consistent, high-quality signal to the audience. After the transition from tape to file, traditional methods of manual quality control (QC) become inadequate, subjective, and inefficient. Based on China Central Television's fully file-based workflow at its new site, this paper introduces an automated quality control test system for accurate detection of hidden defects in media content. It discusses the system framework and workflow control when automated QC is added. It puts forward a QC criterion and presents QC software that follows this criterion. It also reports experiments on QC speed using parallel processing and distributed computing. The performance of the test system shows that the adoption of automated QC can make production effective and efficient, and help the station achieve a competitive advantage in the media market.

  2. ATLAS job transforms: a data driven workflow engine

    International Nuclear Information System (INIS)

    Stewart, G A; Breaden-Madden, W B; Maddocks, H J; Harenberg, T; Sandhoff, M; Sarrazin, B

    2014-01-01

    The need to run complex workflows for a high energy physics experiment such as ATLAS has always been present. However, as computing resources have become even more constrained, compared to the wealth of data generated by the LHC, the need to use resources efficiently and manage complex workflows within a single grid job have increased. In ATLAS, a new Job Transform framework has been developed that we describe in this paper. This framework manages the multiple execution steps needed to 'transform' one data type into another (e.g., RAW data to ESD to AOD to final ntuple) and also provides a consistent interface for the ATLAS production system. The new framework uses a data driven workflow definition which is both easy to manage and powerful. After a transform is defined, jobs are expressed simply by specifying the input data and the desired output data. The transform infrastructure then executes only the necessary substeps to produce the final data products. The global execution cost of running the job is minimised and the transform can adapt to scenarios where data can be produced along different execution paths. Transforms for specific physics tasks which support up to 60 individual substeps have been successfully run. As the new transforms infrastructure has been deployed in production many features have been added to the framework which improve reliability, quality of error reporting and also provide support for multi-process jobs.
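
    The data-driven behaviour described above (name the input and the desired output, and only the necessary substeps run) can be sketched as follows; the substep table and resolution logic are illustrative, not the actual ATLAS Job Transform code.

        substeps = {  # output data type -> (input data type, substep that produces it)
            "ESD":  ("RAW", "RAWtoESD"),
            "AOD":  ("ESD", "ESDtoAOD"),
            "NTUP": ("AOD", "AODtoNTUP"),
        }


        def plan(input_type: str, output_type: str) -> list:
            """Resolve the minimal chain of substeps turning input_type into output_type."""
            chain, current = [], output_type
            while current != input_type:
                if current not in substeps:
                    raise ValueError(f"no substep produces {current}")
                current, step = substeps[current]
                chain.append(step)
            return list(reversed(chain))


        print(plan("RAW", "NTUP"))  # ['RAWtoESD', 'ESDtoAOD', 'AODtoNTUP']
        print(plan("ESD", "AOD"))   # ['ESDtoAOD']: only the necessary substep is run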

  3. Improved compliance by BPM-driven workflow automation.

    Science.gov (United States)

    Holzmüller-Laue, Silke; Göde, Bernd; Fleischer, Heidi; Thurow, Kerstin

    2014-12-01

    Using methods and technologies of business process management (BPM) for laboratory automation has important benefits (i.e., the agility of high-level automation processes, rapid interdisciplinary prototyping and implementation of laboratory tasks and procedures, and efficient real-time process documentation). A principal goal of model-driven development is improved transparency of processes and the alignment of process diagrams and technical code. First experiences of using the business process model and notation (BPMN) show that easy-to-read graphical process models can provide standardization of laboratory workflows. Model-based development allows processes to be changed quickly and adapted easily to changing requirements. The process models are able to host work procedures and their scheduling in compliance with predefined guidelines and policies. Finally, the process-controlled documentation of complex workflow results addresses modern laboratory needs for quality assurance. BPMN 2.0, as an automation language able to control every kind of activity or subprocess, is aimed at complete workflows in end-to-end relationships. BPMN is applicable as a system-independent and cross-disciplinary graphical language to document all methods in laboratories (e.g., screening procedures or analytical processes). That means that, with the BPM standard, a communication method for sharing laboratory process knowledge is also available. © 2014 Society for Laboratory Automation and Screening.

  4. An integrated billing application to streamline clinician workflow.

    Science.gov (United States)

    Vawdrey, David K; Walsh, Colin; Stetson, Peter D

    2014-01-01

    Between 2008 and 2010, our academic medical center transitioned to electronic provider documentation using a commercial electronic health record system. For attending physicians, one of the most frustrating aspects of this experience was the system's failure to support their existing electronic billing workflow. Because of poor system integration, it was difficult to verify the supporting documentation for each bill and impractical to track whether billable notes had corresponding charges. We developed and deployed in 2011 an integrated billing application called "iCharge" that streamlines clinicians' documentation and billing workflow, and simultaneously populates the inpatient problem list using billing diagnosis codes. Each month, over 550 physicians use iCharge to submit approximately 23,000 professional service charges for over 4,200 patients. On average, about 2.5 new problems are added to each patient's problem list. This paper describes the challenges and benefits of workflow integration across disparate applications and presents an example of innovative software development within a commercial EHR framework.

  5. IT-benchmarking of clinical workflows: concept, implementation, and evaluation.

    Science.gov (United States)

    Thye, Johannes; Straede, Matthias-Christopher; Liebe, Jan-David; Hübner, Ursula

    2014-01-01

    Due to the emerging evidence of health IT as opportunity and risk for clinical workflows, health IT must undergo a continuous measurement of its efficacy and efficiency. IT-benchmarks are a proven means for providing this information. The aim of this study was to enhance the methodology of an existing benchmarking procedure by including, in particular, new indicators of clinical workflows and by proposing new types of visualisation. Drawing on the concept of information logistics, we propose four workflow descriptors that were applied to four clinical processes. General and specific indicators were derived from these descriptors and processes. 199 chief information officers (CIOs) took part in the benchmarking. These hospitals were assigned to reference groups of a similar size and ownership from a total of 259 hospitals. Stepwise and comprehensive feedback was given to the CIOs. Most participants who evaluated the benchmark rated the procedure as very good, good, or rather good (98.4%). Benchmark information was used by CIOs for getting a general overview, advancing IT, preparing negotiations with board members, and arguing for a new IT project.

  6. The MPO API: A tool for recording scientific workflows

    Energy Technology Data Exchange (ETDEWEB)

    Wright, John C., E-mail: jcwright@mit.edu [MIT Plasma Science and Fusion Center, Cambridge, MA (United States); Greenwald, Martin; Stillerman, Joshua [MIT Plasma Science and Fusion Center, Cambridge, MA (United States); Abla, Gheni; Chanthavong, Bobby; Flanagan, Sean; Schissel, David; Lee, Xia [General Atomics, San Diego, CA (United States); Romosan, Alex; Shoshani, Arie [Lawrence Berkeley Laboratory, Berkeley, CA (United States)

    2014-05-15

    Highlights: • A description of a new framework and tool for recording scientific workflows, especially those resulting from simulation and analysis. • An explanation of the underlying technologies used to implement this web based tool. • Several examples of using the tool. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for high-consequence applications. The Metadata, Provenance and Ontology (MPO) project builds on previous work [M. Greenwald, Fusion Eng. Des. 87 (2012) 2205–2208] and is focused on providing documentation of workflows, data provenance and the ability to data-mine large sets of results. While there are important design and development aspects to the data structures and user interfaces, we concern ourselves in this paper with the application programming interface (API) – the set of functions that interface with the data server. Our approach for the data server is to follow the Representational State Transfer (RESTful) software architecture style for client–server communication. At its core, the API uses the POST and GET methods of the HTTP protocol to transfer workflow information in message bodies to targets specified in the URL to and from the database via a web server. Higher level API calls are built upon this core API. This design facilitates implementation on different platforms and in different languages and is robust to changes in the underlying technologies used. The command line client implementation can communicate with the data server from any machine with HTTP access.
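
    A client in the RESTful style described above can be sketched with the requests library; the server URL, endpoints, and payload fields here are hypothetical, not the actual MPO API.

        import requests  # pip install requests

        BASE = "https://mpo.example.org/api"  # hypothetical data server

        # record a new workflow and one activity within it (POST carries the metadata)
        wf = requests.post(f"{BASE}/workflow",
                           json={"name": "equilibrium-reconstruction", "user": "jdoe"}).json()
        requests.post(f"{BASE}/activity",
                      json={"workflow": wf["uid"], "name": "fit-profiles",
                            "inputs": ["shot:123456"], "outputs": ["profiles.h5"]})

        # later, retrieve the stored provenance for that workflow (GET with query params)
        activities = requests.get(f"{BASE}/activity", params={"workflow": wf["uid"]}).json()
        print(activities)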

  7. The medical simulation markup language - simplifying the biomechanical modeling workflow.

    Science.gov (United States)

    Suwelack, Stefan; Stoll, Markus; Schalck, Sebastian; Schoch, Nicolai; Dillmann, Rüdiger; Bendl, Rolf; Heuveline, Vincent; Speidel, Stefanie

    2014-01-01

    Modeling and simulation of the human body by means of continuum mechanics has become an important tool in diagnostics, computer-assisted interventions and training. This modeling approach seeks to construct patient-specific biomechanical models from tomographic data. Usually many different tools such as segmentation and meshing algorithms are involved in this workflow. In this paper we present a generalized and flexible description for biomechanical models. The unique feature of the new modeling language is that it not only describes the final biomechanical simulation, but also the workflow by which the biomechanical model is constructed from tomographic data. In this way, the MSML can act as a middleware between all tools used in the modeling pipeline. The MSML thus greatly facilitates the prototyping of medical simulation workflows for clinical and research purposes. In this paper, we not only detail the XML-based modeling scheme, but also present a concrete implementation. Different examples highlight the flexibility, robustness and ease-of-use of the approach.

  8. Integrating Process Mining and Cognitive Analysis to Study EHR Workflow.

    Science.gov (United States)

    Furniss, Stephanie K; Burton, Matthew M; Grando, Adela; Larson, David W; Kaufman, David R

    2016-01-01

    There are numerous methods to study workflow. However, few produce the kinds of in-depth analyses needed to understand EHR-mediated workflow. Here we investigated variations in clinicians' EHR workflow by integrating quantitative analysis of patterns of users' EHR-interactions with in-depth qualitative analysis of user performance. We characterized 6 clinicians' patterns of information-gathering using a sequential process-mining approach. The analysis revealed 519 different screen transition patterns performed across 1569 patient cases. No one pattern was followed for more than 10% of patient cases, the 15 most frequent patterns accounted for over half of patient cases (53%), and 27% of cases exhibited unique patterns. By triangulating quantitative and qualitative analyses, we found that participants' EHR-interactive behavior was associated with their routine processes, patient case complexity, and EHR default settings. The proposed approach has significant potential to inform resource allocation for observation and training. In-depth observations helped us to explain variation across users.
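
    The pattern-counting half of such an analysis reduces to grouping identical screen-transition sequences and ranking them by frequency, as in this small sketch; the screen names and case data are invented.

        from collections import Counter

        # each patient case is the ordered sequence of EHR screens visited (illustrative)
        cases = [
            ("notes", "labs", "meds"),
            ("notes", "labs", "meds"),
            ("labs", "notes", "imaging", "meds"),
            ("notes", "meds"),
        ]

        pattern_counts = Counter(cases)                   # identical sequences are grouped
        total = sum(pattern_counts.values())
        for pattern, n in pattern_counts.most_common(3):  # the most frequent patterns
            print(f"{n / total:5.1%}  {' -> '.join(pattern)}")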

  9. The MPO API: A tool for recording scientific workflows

    International Nuclear Information System (INIS)

    Wright, John C.; Greenwald, Martin; Stillerman, Joshua; Abla, Gheni; Chanthavong, Bobby; Flanagan, Sean; Schissel, David; Lee, Xia; Romosan, Alex; Shoshani, Arie

    2014-01-01

    Highlights: • A description of a new framework and tool for recording scientific workflows, especially those resulting from simulation and analysis. • An explanation of the underlying technologies used to implement this web based tool. • Several examples of using the tool. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for high-consequence applications. The Metadata, Provenance and Ontology (MPO) project builds on previous work [M. Greenwald, Fusion Eng. Des. 87 (2012) 2205–2208] and is focused on providing documentation of workflows, data provenance and the ability to data-mine large sets of results. While there are important design and development aspects to the data structures and user interfaces, we concern ourselves in this paper with the application programming interface (API) – the set of functions that interface with the data server. Our approach for the data server is to follow the Representational State Transfer (RESTful) software architecture style for client–server communication. At its core, the API uses the POST and GET methods of the HTTP protocol to transfer workflow information in message bodies to targets specified in the URL to and from the database via a web server. Higher level API calls are built upon this core API. This design facilitates implementation on different platforms and in different languages and is robust to changes in the underlying technologies used. The command line client implementation can communicate with the data server from any machine with HTTP access

  10. Developing integrated workflows for the digitisation of herbarium specimens using a modular and scalable approach.

    Science.gov (United States)

    Haston, Elspeth; Cubey, Robert; Pullan, Martin; Atkins, Hannah; Harris, David J

    2012-01-01

    Digitisation programmes in many institutes frequently involve disparate and irregular funding, diverse selection criteria and scope, with different members of staff managing and operating the processes. These factors have influenced the decision at the Royal Botanic Garden Edinburgh to develop an integrated workflow for the digitisation of herbarium specimens which is modular and scalable to enable a single overall workflow to be used for all digitisation projects. This integrated workflow comprises three principal elements: a specimen workflow, a data workflow and an image workflow. The specimen workflow is strongly linked to curatorial processes which will impact on the prioritisation, selection and preparation of the specimens. The importance of including a conservation element within the digitisation workflow is highlighted. The data workflow includes the concept of three main categories of collection data: label data, curatorial data and supplementary data. It is shown that each category of data has its own properties which influence the timing of data capture within the workflow. Development of software has been carried out for the rapid capture of curatorial data, and optical character recognition (OCR) software is being used to increase the efficiency of capturing label data and supplementary data. The large number and size of the images has necessitated the inclusion of automated systems within the image workflow.

  11. Developing integrated workflows for the digitisation of herbarium specimens using a modular and scalable approach

    Directory of Open Access Journals (Sweden)

    Elspeth Haston

    2012-07-01

    Full Text Available Digitisation programmes in many institutes frequently involve disparate and irregular funding, diverse selection criteria and scope, with different members of staff managing and operating the processes. These factors have influenced the decision at the Royal Botanic Garden Edinburgh to develop an integrated workflow for the digitisation of herbarium specimens which is modular and scalable to enable a single overall workflow to be used for all digitisation projects. This integrated workflow comprises three principal elements: a specimen workflow, a data workflow and an image workflow. The specimen workflow is strongly linked to curatorial processes which will impact on the prioritisation, selection and preparation of the specimens. The importance of including a conservation element within the digitisation workflow is highlighted. The data workflow includes the concept of three main categories of collection data: label data, curatorial data and supplementary data. It is shown that each category of data has its own properties which influence the timing of data capture within the workflow. Development of software has been carried out for the rapid capture of curatorial data, and optical character recognition (OCR) software is being used to increase the efficiency of capturing label data and supplementary data. The large number and size of the images has necessitated the inclusion of automated systems within the image workflow.

  12. When Workflow Management Systems and Logging Systems Meet: Analyzing Large-Scale Execution Traces

    Energy Technology Data Exchange (ETDEWEB)

    Gunter, Daniel

    2008-07-31

    This poster shows the benefits of integrating a workflow management system with logging and log mining capabilities. By combining two existing, mature technologies, Pegasus-WMS and Netlogger, we are able to efficiently process execution logs of earthquake science workflows consisting of hundreds of thousands to one million tasks. In particular we show results of processing logs of CyberShake, a workflow application running on the TeraGrid. Client-side tools allow scientists to quickly gather statistics about a workflow run and find out which tasks executed, where they were executed, what was their runtime, etc. These statistics can be used to understand the performance characteristics of a workflow and help tune the execution parameters of the workflow management system. This poster shows the scalability of the system by presenting results of uploading task execution records into the system and of querying the system for overall workflow performance information.
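
    The client-side statistics described above amount to pairing task start/end events from the logs and aggregating runtimes; the key=value record format below is a simplified stand-in, not NetLogger's exact schema.

        from collections import defaultdict
        from statistics import mean

        log_lines = [  # illustrative records only
            "ts=100.0 event=task.start task=seismogram_42 host=node17",
            "ts=163.5 event=task.end task=seismogram_42 host=node17",
            "ts=101.2 event=task.start task=seismogram_43 host=node03",
            "ts=170.0 event=task.end task=seismogram_43 host=node03",
        ]

        starts, runtimes = {}, defaultdict(list)
        for line in log_lines:
            rec = dict(field.split("=", 1) for field in line.split())
            if rec["event"] == "task.start":
                starts[rec["task"]] = float(rec["ts"])
            elif rec["event"] == "task.end":
                runtimes[rec["host"]].append(float(rec["ts"]) - starts.pop(rec["task"]))

        for host, times in runtimes.items():
            print(f"{host}: {len(times)} tasks, mean runtime {mean(times):.1f}s")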

  13. Quantitative workflow based on NN for weighting criteria in landfill suitability mapping

    Science.gov (United States)

    Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Alkhasawneh, Mutasem Sh.; Aziz, Hamidi Abdul

    2017-10-01

    Our study aims to introduce a new quantitative workflow that integrates neural networks (NNs) and multi-criteria decision analysis (MCDA). Existing MCDA workflows reveal a number of drawbacks because of their reliance on human knowledge in the weighting stage. Thus, a new NN-based workflow is presented to produce suitability maps at the regional scale for solid waste planning. A feed-forward neural network is employed in the workflow. A total of 34 criteria were pre-processed to establish the input dataset for NN modelling. The final learned network is used to acquire the weights of the criteria. Accuracies of 95.2% and 93.2% were achieved for the training dataset and testing dataset, respectively. The workflow was found to be capable of reducing human interference and generating highly reliable maps. The proposed workflow demonstrates the applicability of NNs to generating landfill suitability maps and the feasibility of integrating them with existing MCDA workflows.
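
    One way to realize the weighting step is to train a small feed-forward network and read criterion weights off a post-hoc importance measure, as in this scikit-learn sketch. The synthetic data stand in for the 34 pre-processed criteria, and permutation importance is an assumption here, not necessarily the weighting scheme used in the paper.

        import numpy as np
        from sklearn.neural_network import MLPClassifier
        from sklearn.inspection import permutation_importance
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 34))                 # 500 candidate sites x 34 criteria
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # synthetic suitability label

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
        net.fit(X_tr, y_tr)
        print(f"test accuracy: {net.score(X_te, y_te):.2f}")

        imp = permutation_importance(net, X_te, y_te, n_repeats=10, random_state=0)
        weights = imp.importances_mean / imp.importances_mean.sum()  # normalized weights
        print("five most influential criteria:", np.argsort(weights)[::-1][:5])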

  14. Language Acquisition without an Acquisition Device

    Science.gov (United States)

    O'Grady, William

    2012-01-01

    Most explanatory work on first and second language learning assumes the primacy of the acquisition phenomenon itself, and a good deal of work has been devoted to the search for an "acquisition device" that is specific to humans, and perhaps even to language. I will consider the possibility that this strategy is misguided and that language…

  15. The impact of electronic medical record systems on outpatient workflows: a longitudinal evaluation of its workflow effects.

    Science.gov (United States)

    Vishwanath, Arun; Singh, Sandeep Rajan; Winkelstein, Peter

    2010-11-01

    The promise of the electronic medical record (EMR) lies in its ability to reduce the costs of health care delivery and improve the overall quality of care--a promise that is realized through major changes in workflows within the health care organization. Yet little systematic information exists about the workflow effects of EMRs. Moreover, some of the research to-date points to reduced satisfaction among physicians after implementation of the EMR and increased time, i.e., negative workflow effects. A better understanding of the impact of the EMR on workflows is, hence, vital to understanding what the technology really does offer that is new and unique. (i) To empirically develop a physician centric conceptual model of the workflow effects of EMRs; (ii) To use the model to understand the antecedents to the physicians' workflow expectation from the new EMR; (iii) To track physicians' satisfaction overtime, 3 months and 20 months after implementation of the EMR; (iv) To explore the impact of technology learning curves on physicians' reported satisfaction levels. The current research uses the mixed-method technique of concept mapping to empirically develop the conceptual model of an EMR's workflow effects. The model is then used within a controlled study to track physician expectations from a new EMR system as well as their assessments of the EMR's performance 3 months and 20 months after implementation. The research tracks the actual implementation of a new EMR within the outpatient clinics of a large northeastern research hospital. The pre-implementation survey netted 20 physician responses; post-implementation Time 1 survey netted 22 responses, and Time 2 survey netted 26 physician responses. The implementation of the actual EMR served as the intervention. Since the study was conducted within the same setting and tracked a homogenous group of respondents, the overall study design ensured against extraneous influences on the results. Outcome measures were derived

  16. Transacsys PLC - acquisition

    CERN Multimedia

    2001-01-01

    Transacsys Workflow Ltd has agreed to pay 1 million Sfr to acquire the rights to the EDH suite of computer programmes developed at CERN for the administration of internal transactions. The company intends to market and sell-on EDH (1/2 page).

  17. Workflow in clinical trial sites & its association with near miss events for data quality: ethnographic, workflow & systems simulation.

    Science.gov (United States)

    de Carvalho, Elias Cesar Araujo; Batilana, Adelia Portero; Claudino, Wederson; Reis, Luiz Fernando Lima; Schmerling, Rafael A; Shah, Jatin; Pietrobon, Ricardo

    2012-01-01

    With the exponential expansion of clinical trials conducted in BRIC (Brazil, Russia, India, and China) and VISTA (Vietnam, Indonesia, South Africa, Turkey, and Argentina) countries, corresponding gains in cost and enrolment efficiency quickly outpace the consonant metrics in traditional countries in North America and the European Union. However, questions still remain regarding the quality of data being collected in these countries. We used ethnographic, mapping, and computer simulation studies to identify and address areas of threat to near miss events for data quality in two cancer trial sites in Brazil. Two sites in São Paulo and Rio de Janeiro were evaluated using ethnographic observations of workflow during subject enrolment and data collection. Emerging themes related to threats to near miss events for data quality were derived from observations. They were then transformed into workflows using UML-AD and modeled using System Dynamics. 139 tasks were observed and mapped through the ethnographic study. The UML-AD detected four major activities in the workflow: evaluation of potential research subjects prior to signature of informed consent, the visit to obtain the subject's informed consent, regular data collection sessions following the study protocol, and closure of the study protocol for a given project. Field observations pointed to three major emerging themes: (a) lack of a standardized process for data registration at the source document, (b) multiplicity of data repositories, and (c) scarcity of decision support systems at the point of research intervention. Simulation with the policy model demonstrates a reduction of the rework problem. Patterns of threats to data quality at the two sites were similar to the threats reported in the literature for American sites. Clinical trial site managers need to reorganize staff workflow by using information technology more efficiently, establish new standard procedures, and manage professionals to reduce near miss events and save time and cost. Clinical trial
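
    The rework dynamic can be illustrated with a toy stock-and-flow loop in the spirit of System Dynamics; the rates below are invented assumptions, not the study's calibrated policy model.

        def rework_backlog(defect_rate, weeks=26, records_per_week=100.0, fix_capacity=20.0):
            """Accumulate defective records (near misses) and drain them at a fixed fix rate."""
            backlog = 0.0
            for _ in range(weeks):
                backlog += records_per_week * defect_rate  # inflow: records needing rework
                backlog -= min(backlog, fix_capacity)      # outflow: records corrected
            return backlog


        print("backlog, baseline workflow:   ", rework_backlog(defect_rate=0.30))
        print("backlog, with improved policy:", rework_backlog(defect_rate=0.12))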

  18. WorkflowNet2BPEL4WS: A Tool for Translating Unstructured Workflow Processes to Readable BPEL

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; van der Aalst, Wil M. P.

    2007-01-01

    code and not easy to use by end-users. Therefore, we provide a mapping from WF-nets to BPEL. This mapping builds on the rich theory of Petri nets and can also be used to map other languages (e.g., UML, EPC, BPMN, etc.) onto BPEL. To evaluate WorkflowNet2BPEL4WS we used more than 100 processes modeled...

  19. Normal accidents

    International Nuclear Information System (INIS)

    Perrow, C.

    1989-01-01

    The author has chosen numerous concrete examples to illustrate the hazardousness inherent in high-risk technologies. Starting with the TMI reactor accident in 1979, he shows that it is not only the nuclear energy sector that bears the risk of 'normal accidents', but also quite a number of other technologies and industrial sectors, or research fields. The author refers to the petrochemical industry, shipping, air traffic, large dams, mining activities, and genetic engineering, showing that due to the complexity of the systems and their manifold, rapidly interacting processes, accidents happen that cannot be thoroughly calculated, and hence are unavoidable. (orig./HP) [de

  20. Calo trigger acquisition system

    CERN Multimedia

    Franchini, Matteo

    2016-01-01

    Calo trigger acquisition system - Evolution of the acquisition system from a multiple-board system (upper, orange cables) to a single-board one (below, light blue cables), where all the channels are collected on a single board.

  1. Modelling live forensic acquisition

    CSIR Research Space (South Africa)

    Grobler, MM

    2009-06-01

    Full Text Available This paper discusses the development of a South African model for Live Forensic Acquisition - Liforac. Liforac is a comprehensive model that presents a range of aspects related to Live Forensic Acquisition. The model provides forensic...

  2. Playing at Serial Acquisitions

    NARCIS (Netherlands)

    J.T.J. Smit (Han); T. Moraitis (Thras)

    2010-01-01

    Behavioral biases can result in suboptimal acquisition decisions, with the potential for errors exacerbated in consolidating industries, where consolidators design serial acquisition strategies and fight escalating takeover battles for platform companies that may determine their future

  3. Reconstructing Normality

    DEFF Research Database (Denmark)

    Gildberg, Frederik Alkier; Bradley, Stephen K.; Fristed, Peter Billeskov

    2012-01-01

    Forensic psychiatry is an area of priority for the Danish Government. As the field expands, this calls for increased knowledge about mental health nursing practice, as this is part of the forensic psychiatry treatment offered. However, only sparse research exists in this area. The aim of this study...... was to investigate the characteristics of forensic mental health nursing staff interaction with forensic mental health inpatients and to explore how staff give meaning to these interactions. The project included 32 forensic mental health staff members, with over 307 hours of participant observations, 48 informal....... The intention is to establish a trusting relationship to form behaviour and perceptual-corrective care, which is characterized by staff's endeavours to change, halt, or support the patient's behaviour or perception in relation to staff's perception of normality. The intention is to support and teach the patient...

  4. Pursuing Normality

    DEFF Research Database (Denmark)

    Madsen, Louise Sofia; Handberg, Charlotte

    2018-01-01

    implying an influence on whether to participate in cancer survivorship care programs. Because of "pursuing normality," 8 of 9 participants opted out of cancer survivorship care programming due to prospects of "being cured" and perceptions of cancer survivorship care as "a continuation of the disease......BACKGROUND: The present study explored the reflections on cancer survivorship care of lymphoma survivors in active treatment. Lymphoma survivors have survivorship care needs, yet their participation in cancer survivorship care programs is still reported as low. OBJECTIVE: The aim of this study...... was to understand the reflections on cancer survivorship care of lymphoma survivors to aid the future planning of cancer survivorship care and overcome barriers to participation. METHODS: Data were generated in a hematological ward during 4 months of ethnographic fieldwork, including participant observation and 46...

  5. A workflow to process 3D+time microscopy images of developing organisms and reconstruct their cell lineage

    Science.gov (United States)

    Faure, Emmanuel; Savy, Thierry; Rizzi, Barbara; Melani, Camilo; Stašová, Olga; Fabrèges, Dimitri; Špir, Róbert; Hammons, Mark; Čúnderlík, Róbert; Recher, Gaëlle; Lombardot, Benoît; Duloquin, Louise; Colin, Ingrid; Kollár, Jozef; Desnoulez, Sophie; Affaticati, Pierre; Maury, Benoît; Boyreau, Adeline; Nief, Jean-Yves; Calvat, Pascal; Vernier, Philippe; Frain, Monique; Lutfalla, Georges; Kergosien, Yannick; Suret, Pierre; Remešíková, Mariana; Doursat, René; Sarti, Alessandro; Mikula, Karol; Peyriéras, Nadine; Bourgine, Paul

    2016-01-01

    The quantitative and systematic analysis of embryonic cell dynamics from in vivo 3D+time image data sets is a major challenge at the forefront of developmental biology. Despite recent breakthroughs in the microscopy imaging of living systems, producing an accurate cell lineage tree for any developing organism remains a difficult task. We present here the BioEmergences workflow integrating all reconstruction steps from image acquisition and processing to the interactive visualization of reconstructed data. Original mathematical methods and algorithms underlie image filtering, nucleus centre detection, nucleus and membrane segmentation, and cell tracking. They are demonstrated on zebrafish, ascidian and sea urchin embryos with stained nuclei and membranes. Subsequent validation and annotations are carried out using Mov-IT, a custom-made graphical interface. Compared with eight other software tools, our workflow achieved the best lineage score. Delivered in standalone or web service mode, BioEmergences and Mov-IT offer a unique set of tools for in silico experimental embryology. PMID:26912388
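
    The BioEmergences algorithms themselves are not reproduced in this record; as a minimal toy sketch of the generic pipeline named above (image filtering, nucleus centre detection, frame-to-frame tracking), the following uses scikit-image and SciPy on synthetic 2D frames. Every function and parameter here is an illustrative assumption, not the published method.

      # Toy lineage-style pipeline: smooth each frame, detect nucleus-like blobs,
      # then link detections between consecutive frames by nearest neighbour.
      import numpy as np
      from scipy.spatial import cKDTree
      from skimage.feature import blob_log
      from skimage.filters import gaussian

      def detect(frame):
          """Filter the frame, then return nucleus-like blob centres (row, col)."""
          smoothed = gaussian(frame, sigma=1.0)
          return blob_log(smoothed, min_sigma=2, max_sigma=6, threshold=0.1)[:, :2]

      def link(prev_pts, next_pts, max_dist=5.0):
          """Nearest-neighbour linking of detections between consecutive frames."""
          dists, idx = cKDTree(next_pts).query(prev_pts, distance_upper_bound=max_dist)
          return [(i, int(j)) for i, (d, j) in enumerate(zip(dists, idx)) if np.isfinite(d)]

      yy, xx = np.mgrid[0:64, 0:64]
      def spot(cy, cx):                        # synthetic Gaussian "nucleus"
          return np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 3.0 ** 2))

      frame0 = spot(20, 20) + spot(40, 45)
      frame1 = spot(20, 21) + spot(40, 46)     # both nuclei drift one pixel
      print(link(detect(frame0), detect(frame1)))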

  6. Web-based execution of graphical work-flows: a modular platform for multifunctional scientific process automation

    International Nuclear Information System (INIS)

    De Ley, E.; Jacobs, D.; Ounsy, M.

    2012-01-01

    The Passerelle process automation suite offers a fundamentally modular solution platform, based on a layered integration of several best-of-breed technologies. It has been successfully applied by Synchrotron Soleil as the sequencer for data acquisition and control processes on its beamlines, integrated with TANGO as a control bus and GlobalScreen™ as the SCADA package. Since last year, it has been used as the graphical work-flow component for the development of an Eclipse-based Data Analysis Workbench at ESRF. The top layer of Passerelle exposes an actor-based development paradigm, based on the Ptolemy framework (UC Berkeley). Actors provide explicit reusability and strong decoupling, combined with an inherently concurrent execution model. Actor libraries exist for TANGO integration, web services, database operations, flow control, rules-based analysis, mathematical calculations, launching external scripts, etc. Passerelle's internal architecture is based on OSGi, the major Java framework for modular service-based applications. A large set of modules exists that can be recombined as desired to obtain different features and deployment models. Besides desktop versions of the Passerelle work-flow workbench, there is also the Passerelle Manager. It is a secured web application including a graphical editor, for centralized design, execution, management and monitoring of process flows, integrating standard Java Enterprise services with OSGi. We will present the internal technical architecture, some interesting application cases and the lessons learnt. (authors)
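
    Passerelle's actor model is built on Ptolemy II and OSGi, which are not shown in this record; purely as a hedged illustration of the actor idea it describes (strongly decoupled processing stages that run concurrently and exchange messages over channels), the sketch below wires two Python threads through queues. Names and stage functions are invented for the example and do not reflect the Passerelle API.

      # Minimal actor-style pipeline: each "actor" consumes from an input queue
      # and produces to an output queue, so stages are decoupled and concurrent.
      import threading, queue

      STOP = object()

      def actor(fn, inbox, outbox):
          def run():
              while True:
                  item = inbox.get()
                  if item is STOP:
                      outbox.put(STOP)
                      break
                  outbox.put(fn(item))
          t = threading.Thread(target=run, daemon=True)
          t.start()
          return t

      q_in, q_mid, q_out = queue.Queue(), queue.Queue(), queue.Queue()
      actor(lambda x: x * 2, q_in, q_mid)          # stand-in "acquisition" stage
      actor(lambda x: f"value={x}", q_mid, q_out)  # stand-in "analysis" stage

      for i in range(3):
          q_in.put(i)
      q_in.put(STOP)
      while (item := q_out.get()) is not STOP:
          print(item)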

  7. [First language acquisition research and theories of language acquisition].

    Science.gov (United States)

    Miller, S; Jungheim, M; Ptok, M

    2014-04-01

    In principle, a child can seemingly easily acquire any given language. First language acquisition follows a certain pattern which to some extent is found to be language independent. Since time immemorial, it has been of interest why children are able to acquire language so easily. Different disciplinary and methodological orientations addressing this question can be identified. A selective literature search in PubMed and Scopus was carried out and relevant monographs were considered. Different, partially overlapping phases can be distinguished in language acquisition research: whereas in ancient times, deprivation experiments were carried out to discover the "original human language", the era of diary studies began in the mid-19th century. From the mid-1920s onwards, behaviouristic paradigms dominated this field of research; interests were focussed on the determination of normal, average language acquisition. The subsequent linguistic period was strongly influenced by the nativist view of Chomsky and the constructivist concepts of Piaget. Speech comprehension, the role of speech input and the relevance of genetic disposition became the centre of attention. The interactionist concept led to a revival of the convergence theory according to Stern. Each of these four major theories--behaviourism, cognitivism, interactionism and nativism--has given valuable and unique impulses, but no single theory is universally accepted to provide an explanation of all aspects of language acquisition. Moreover, it can be critically questioned whether clinicians consciously refer to one of these theories in daily routine work and whether therapies are then based on this concept. It remains to be seen whether or not new theories of grammar, such as the so-called construction grammar (CxG), will eventually change the general concept of language acquisition.

  8. Workflow for the Targeted and Untargeted Detection of Small Metabolites in Fish Skin Mucus

    Directory of Open Access Journals (Sweden)

    Lada Ivanova

    2018-06-01

    Full Text Available The skin mucus of fish is in permanent contact with the aquatic environment. Data from the analysis of the chemical composition of skin mucus could potentially be used for monitoring the health status of the fish. Knowledge about mucus composition or change in composition over time could also contribute to understanding the aetiology of certain diseases. The objective of the present study was the development of a workflow for non-invasive sampling of skin mucus from farmed salmon (Salmo salar) for the targeted and untargeted detection of small metabolites. Skin mucus was either scraped off, wiped off using medical wipes, or the mucus' water phase was absorbed using the same type of medical wipes that were used for the wiping method. Following a simple filtration step, the obtained mucus samples were subjected to hydrophilic interaction chromatography coupled to high-resolution mass spectrometry. Post-acquisition processing included the targeted analysis of 86 small metabolites, of which up to 60 were detected in absorbed mucus. Untargeted analysis of the mucus samples from equally treated salmon revealed that the total variation of the metabolome was lowest in absorbed mucus and highest in the scraped mucus. Thus, future studies including small-molecule metabolomics of skin mucus in fish would benefit from a sampling regime employing absorption of the water phase in order to minimize the bias related to the sampling step. Furthermore, the absorption method is also a less invasive approach allowing for repetitive sampling within short time intervals.
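
    The study's target list and matching parameters are not reproduced in this record; as a small, assumption-based sketch of what post-acquisition targeted analysis of small metabolites can look like, the code below matches measured high-resolution m/z values against a target table within a ppm tolerance. The metabolites, adduct masses and tolerance are illustrative only.

      # Hypothetical post-acquisition targeted matching: compare measured m/z
      # values against a small target list within a ppm mass tolerance.
      TARGETS = {            # illustrative [M+H]+ m/z values, not the study's list
          "creatinine": 114.0662,
          "taurine": 126.0219,
          "hypoxanthine": 137.0458,
      }

      def match_peaks(measured_mz, tolerance_ppm=5.0):
          hits = []
          for mz in measured_mz:
              for name, target in TARGETS.items():
                  if abs(mz - target) / target * 1e6 <= tolerance_ppm:
                      hits.append((name, mz))
          return hits

      print(match_peaks([114.0660, 137.0461, 200.1234]))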

  9. BEAM: A computational workflow system for managing and modeling material characterization data in HPC environments

    Energy Technology Data Exchange (ETDEWEB)

    Lingerfelt, Eric J [ORNL; Endeve, Eirik [ORNL; Ovchinnikov, Oleg S [ORNL; Borreguero Calvo, Jose M [ORNL; Park, Byung H [ORNL; Archibald, Richard K [ORNL; Symons, Christopher T [ORNL; Kalinin, Sergei V [ORNL; Messer, Bronson [ORNL; Shankar, Mallikarjun [ORNL; Jesse, Stephen [ORNL

    2016-01-01

    Improvements in scientific instrumentation allow imaging at mesoscopic to atomic length scales and in many spectroscopic modes, and now, with the rise of multimodal acquisition systems and the associated processing capability, the era of multidimensional, informationally dense data sets has arrived. Technical issues in these combinatorial scientific fields are exacerbated by computational challenges best summarized as a necessity for drastic improvement in the capability to transfer, store, and analyze large volumes of data. The Bellerophon Environment for Analysis of Materials (BEAM) platform provides materials scientists the capability to directly leverage the integrated computational and analytical power of High Performance Computing (HPC) to perform scalable data analysis and simulation via an intuitive, cross-platform client user interface. This framework delivers authenticated, push-button execution of complex user workflows that deploy data analysis algorithms and computational simulations utilizing the converged compute-and-data infrastructure at Oak Ridge National Laboratory's (ORNL) Compute and Data Environment for Science (CADES) and HPC environments like Titan at the Oak Ridge Leadership Computing Facility (OLCF). In this work we address the underlying HPC needs for characterization in the materials science community, elaborate how BEAM's design and infrastructure tackle those needs, and present a small subset of use cases where scientists utilized BEAM across a broad range of analytical techniques and analysis modes.

  10. Smart acquisition EELS

    International Nuclear Information System (INIS)

    Sader, Kasim; Schaffer, Bernhard; Vaughan, Gareth; Brydson, Rik; Brown, Andy; Bleloch, Andrew

    2010-01-01

    We have developed a novel acquisition methodology for the recording of electron energy loss spectra (EELS) using a scanning transmission electron microscope (STEM): 'Smart Acquisition'. Smart Acquisition allows the independent control of probe scanning procedures and the simultaneous acquisition of analytical signals such as EELS. The original motivation for this work arose from the need to control the electron dose experienced by beam-sensitive specimens whilst maintaining a sufficiently high signal-to-noise ratio in the EEL signal for the extraction of useful analytical information (such as energy loss near edge spectral features) from relatively undamaged areas. We have developed a flexible acquisition framework which separates beam position data input, beam positioning, and EELS acquisition. In this paper we demonstrate the effectiveness of this technique on beam-sensitive thin films of amorphous aluminium trifluoride. Smart Acquisition has been used to expose lines to the electron beam, followed by analysis of the structures created by line-integrating EELS acquisitions, and the results are compared to those derived from a standard EELS linescan. High angle annular dark-field images show clear reductions in damage for the Smart Acquisition areas compared to the conventional linescan, and the Smart Acquisition low loss EEL spectra are more representative of the undamaged material than those derived using a conventional linescan. Atomically resolved EELS of all four elements of CaNdTiO show the high resolution capabilities of Smart Acquisition.
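
    The acquisition framework described separates beam-position input, beam positioning and EELS acquisition; the record gives no API, so the following is only a hypothetical sketch of that separation: a position generator yields scan coordinates (for example, points along a line), and an independent acquisition loop visits each point and records a spectrum. The function names are placeholders, not the instrument's control interface.

      # Hypothetical decoupling of scan-position generation from acquisition.
      # move_probe_to() and read_spectrum() stand in for real instrument calls.
      import numpy as np

      def line_positions(start, end, n_points):
          """Generate probe positions along a line (the 'smart' scan pattern)."""
          xs = np.linspace(start[0], end[0], n_points)
          ys = np.linspace(start[1], end[1], n_points)
          return list(zip(xs, ys))

      def move_probe_to(pos):
          pass                                      # placeholder for beam positioning

      def read_spectrum(dwell_time_s):
          return np.random.poisson(5, size=2048)    # fake EEL spectrum

      def acquire(positions, dwell_time_s=0.01):
          spectra = []
          for pos in positions:
              move_probe_to(pos)
              spectra.append(read_spectrum(dwell_time_s))
          return np.sum(spectra, axis=0)            # line-integrated spectrum

      summed = acquire(line_positions((0.0, 0.0), (10.0, 0.0), n_points=50))
      print(summed.shape)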

  11. Smart acquisition EELS

    Energy Technology Data Exchange (ETDEWEB)

    Sader, Kasim, E-mail: k.sader@leeds.ac.uk [SuperSTEM, J block, Daresbury Laboratory, Warrington, Cheshire, WA4 4AD (United Kingdom); Institute for Materials Research, University of Leeds, LS2 9JT (United Kingdom); Schaffer, Bernhard [SuperSTEM, J block, Daresbury Laboratory, Warrington, Cheshire, WA4 4AD (United Kingdom); Department of Physics and Astronomy, University of Glasgow (United Kingdom); Vaughan, Gareth [Institute for Materials Research, University of Leeds, LS2 9JT (United Kingdom); Brydson, Rik [SuperSTEM, J block, Daresbury Laboratory, Warrington, Cheshire, WA4 4AD (United Kingdom); Institute for Materials Research, University of Leeds, LS2 9JT (United Kingdom); Brown, Andy [Institute for Materials Research, University of Leeds, LS2 9JT (United Kingdom); Bleloch, Andrew [SuperSTEM, J block, Daresbury Laboratory, Warrington, Cheshire, WA4 4AD (United Kingdom); Department of Engineering, University of Liverpool, Liverpool (United Kingdom)

    2010-07-15

    We have developed a novel acquisition methodology for the recording of electron energy loss spectra (EELS) using a scanning transmission electron microscope (STEM): 'Smart Acquisition'. Smart Acquisition allows the independent control of probe scanning procedures and the simultaneous acquisition of analytical signals such as EELS. The original motivation for this work arose from the need to control the electron dose experienced by beam-sensitive specimens whilst maintaining a sufficiently high signal-to-noise ratio in the EEL signal for the extraction of useful analytical information (such as energy loss near edge spectral features) from relatively undamaged areas. We have developed a flexible acquisition framework which separates beam position data input, beam positioning, and EELS acquisition. In this paper we demonstrate the effectiveness of this technique on beam-sensitive thin films of amorphous aluminium trifluoride. Smart Acquisition has been used to expose lines to the electron beam, followed by analysis of the structures created by line-integrating EELS acquisitions, and the results are compared to those derived from a standard EELS linescan. High angle annular dark-field images show clear reductions in damage for the Smart Acquisition areas compared to the conventional linescan, and the Smart Acquisition low loss EEL spectra are more representative of the undamaged material than those derived using a conventional linescan. Atomically resolved EELS of all four elements of CaNdTiO show the high resolution capabilities of Smart Acquisition.

  12. An Adaptable Seismic Data Format for Modern Scientific Workflows

    Science.gov (United States)

    Smith, J. A.; Bozdag, E.; Krischer, L.; Lefebvre, M.; Lei, W.; Podhorszki, N.; Tromp, J.

    2013-12-01

    Data storage, exchange, and access play a critical role in modern seismology. Current seismic data formats, such as SEED, SAC, and SEG-Y, were designed with specific applications in mind and are frequently a major bottleneck in implementing efficient workflows. We propose a new modern parallel format that can be adapted for a variety of seismic workflows. The Adaptable Seismic Data Format (ASDF) features high-performance parallel read and write support and the ability to store an arbitrary number of traces of varying sizes. Provenance information is stored inside the file so that users know the origin of the data as well as the precise operations that have been applied to the waveforms. The design of the new format is based on several real-world use cases, including earthquake seismology and seismic interferometry. The metadata is based on the proven XML schemas StationXML and QuakeML. Existing time-series analysis toolkits are easily interfaced with this new format so that seismologists can use robust, previously developed software packages, such as ObsPy and the SAC library. ADIOS, netCDF4, and HDF5 can be used as the underlying container format. At Princeton University, we have chosen to use ADIOS as the container format because it has shown superior scalability for certain applications, such as dealing with big data on HPC systems. In the context of high-performance computing, we have integrated ASDF into the global adjoint tomography workflow on Oak Ridge National Laboratory's supercomputer Titan.
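
    ASDF itself builds on ADIOS or HDF5 containers with StationXML/QuakeML metadata; the record gives no file layout, so the sketch below only illustrates the general idea, using h5py to store a trace alongside provenance attributes. The group names, attribute names and processing string are assumptions and do not follow the actual ASDF specification.

      # Illustrative (non-ASDF) container: one HDF5 dataset per trace, each with
      # provenance recorded as attributes so later processing steps stay traceable.
      import numpy as np
      import h5py

      with h5py.File("waveforms_demo.h5", "w") as f:
          grp = f.create_group("Waveforms/XX.STA01")
          trace = np.random.randn(3000).astype("float32")   # fake 30 s at 100 Hz
          ds = grp.create_dataset("BHZ_2008-10-01", data=trace)
          ds.attrs["sampling_rate_hz"] = 100.0
          ds.attrs["starttime"] = "2008-10-01T00:00:00"
          ds.attrs["provenance"] = "detrend -> bandpass 0.01-1 Hz (illustrative)"

      with h5py.File("waveforms_demo.h5", "r") as f:
          ds = f["Waveforms/XX.STA01/BHZ_2008-10-01"]
          print(ds.shape, dict(ds.attrs))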

  13. Order Entry Protocols Are an Amenable Target for Workflow Automation.

    Science.gov (United States)

    Tudor, James; Klochko, Chad; Patel, Milind; Siegal, Daniel

    2018-04-21

    Order entry protocol selection of advanced imaging studies is labor-intensive, can disrupt workflow, and may displace staff from more valuable tasks. The aim of this study was to explore and compare the behaviors of radiologic technologists and radiologists when determining protocols, in order to identify opportunities for workflow automation. A data set of over 273,000 cross-sectional examination orders from four hospitals within our health system was created. From this data set, we isolated the 12 most frequently requested examinations, which represent almost 50% of the total advanced imaging volume. Intergroup comparisons were made between the behavior of radiologic technologists and radiologists or residents when determining protocols. Frequencies of changes were calculated. Common parameters of changed examinations were identified. The overall change rate for both radiologists and residents (4%) is very low and comparable to the overall change rate of radiologic technologists (1%). The change rates for the 12 most ordered examinations were calculated and compared individually. Most examinations that underwent change involved a patient with a low estimated glomerular filtration rate, a patient with a contrast allergy, or a provider ordering a general examination but in fact wanting an organ-specific protocol or an angiographic study. Order entry protocol selection of the most frequently ordered advanced imaging examinations was rarely a value-added activity because these examinations are rarely changed. Changes follow predictable patterns that make order entry protocol selection of most radiology orders for advanced imaging amenable to workflow automation. Copyright © 2018 American College of Radiology. Published by Elsevier Inc. All rights reserved.
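
    The change-rate statistics above come from the authors' order data, which is not reproduced here; a minimal sketch of the kind of per-examination change-rate tabulation the abstract describes might look like the following pandas snippet, with invented column names and toy data.

      # Toy tabulation of protocol change rates per examination type.
      # Column names and data are invented for illustration.
      import pandas as pd

      orders = pd.DataFrame({
          "exam": ["CT head", "CT head", "MRI brain", "MRI brain", "CT abdomen"],
          "changed_by": [None, "radiologist", None, None, "technologist"],
      })
      orders["changed"] = orders["changed_by"].notna()

      rates = orders.groupby("exam")["changed"].agg(["count", "mean"])
      rates = rates.rename(columns={"count": "orders", "mean": "change_rate"})
      print(rates)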

  14. MBAT: A scalable informatics system for unifying digital atlasing workflows

    Directory of Open Access Journals (Sweden)

    Sane Nikhil

    2010-12-01

    Full Text Available Abstract Background Digital atlases provide a common semantic and spatial coordinate system that can be leveraged to compare, contrast, and correlate data from disparate sources. As the quality and amount of biological data continue to advance and grow, searching, referencing, and comparing this data with a researcher's own data is essential. However, the integration process is cumbersome and time-consuming due to misaligned data, implicitly defined associations, and incompatible data sources. This work addresses these challenges by providing a unified and adaptable environment to accelerate the workflow to gather, align, and analyze the data. Results The MouseBIRN Atlasing Toolkit (MBAT) project was developed as a cross-platform, free open-source application that unifies and accelerates the digital atlas workflow. A tiered, plug-in architecture was designed for the neuroinformatics and genomics goals of the project to provide a modular and extensible design. MBAT provides the ability to use a single query to search and retrieve data from multiple data sources, align image data using the user's preferred registration method, composite data from multiple sources in a common space, and link relevant informatics information to the current view of the data or atlas. The workspaces leverage tool plug-ins to extend and allow future extensions of the basic workspace functionality. A wide variety of tool plug-ins were developed that integrate pre-existing as well as newly created technology into each workspace. Novel atlasing features were also developed, such as supporting multiple label sets, dynamic selection and grouping of labels, and synchronized, context-driven display of ontological data. Conclusions MBAT empowers researchers to discover correlations among disparate data by providing a unified environment for bringing together distributed reference resources, a user's image data, and biological atlases into the same spatial or semantic context
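
    MBAT's tiered plug-in architecture is Java-based and not detailed in this record; as a loose, language-agnostic illustration of the registry idea it describes (workspaces discover registered tool plug-ins by capability and invoke them), here is a tiny Python sketch with invented names that do not reflect MBAT's API.

      # Minimal plug-in registry: workspaces look up registered tools by the
      # capability they provide. All names are illustrative assumptions.
      REGISTRY = {}

      def register(capability):
          def wrap(cls):
              REGISTRY.setdefault(capability, []).append(cls)
              return cls
          return wrap

      @register("alignment")
      class AffineAligner:
          def run(self, image, atlas):
              return f"aligned {image} to {atlas} (affine)"

      @register("search")
      class GeneExpressionSearch:
          def run(self, query):
              return f"results for '{query}' from remote sources"

      for tool_cls in REGISTRY["alignment"]:
          print(tool_cls().run("my_brain_stack.tif", "reference_atlas"))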

  15. A practical data processing workflow for multi-OMICS projects.

    Science.gov (United States)

    Kohl, Michael; Megger, Dominik A; Trippler, Martin; Meckel, Hagen; Ahrens, Maike; Bracht, Thilo; Weber, Frank; Hoffmann, Andreas-Claudius; Baba, Hideo A; Sitek, Barbara; Schlaak, Jörg F; Meyer, Helmut E; Stephan, Christian; Eisenacher, Martin

    2014-01-01

    Multi-OMICS approaches aim at the integration of quantitative data obtained for different biological molecules in order to understand their interrelation and the functioning of larger systems. This paper deals with several data integration and data processing issues that frequently occur within this context. To this end, the data processing workflow within the PROFILE project is presented, a multi-OMICS project that aims at the identification of novel biomarkers and the development of new therapeutic targets for seven important liver diseases. Furthermore, a software tool called CrossPlatformCommander is sketched, which facilitates several steps of the proposed workflow in a semi-automatic manner. Application of the software is presented for the detection of novel biomarkers, their ranking and annotation with existing knowledge using the example of corresponding Transcriptomics and Proteomics data sets obtained from patients suffering from hepatocellular carcinoma. Additionally, a linear regression analysis of Transcriptomics vs. Proteomics data is presented and its performance assessed. It was shown that, for capturing profound relations between Transcriptomics and Proteomics data, a simple linear regression analysis is not sufficient, and that implementation and evaluation of alternative statistical approaches are needed. Additionally, the integration of multivariate variable selection and classification approaches is intended for further development of the software. Although this paper focuses only on the combination of data obtained from quantitative Proteomics and Transcriptomics experiments, several approaches and data integration steps are also applicable for other OMICS technologies. Keeping specific restrictions in mind, the suggested workflow (or at least parts of it) may be used as a template for similar projects that make use of different high-throughput techniques. This article is part of a Special Issue entitled: Computational Proteomics in the Post
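
    The paper's regression analysis and its performance assessment are only summarised here; a bare-bones sketch of regressing protein abundances on transcript levels, gene by gene, could look like the following NumPy snippet with simulated data, bearing in mind that the authors conclude a simple linear regression is not sufficient for capturing the deeper relations.

      # Per-gene linear fit of (simulated) protein abundance vs transcript level.
      # Data are random; this only illustrates the analysis shape, not the result.
      import numpy as np

      rng = np.random.default_rng(42)
      n_genes, n_samples = 5, 20
      transcripts = rng.normal(10, 2, size=(n_genes, n_samples))
      proteins = 0.6 * transcripts + rng.normal(0, 1.5, size=(n_genes, n_samples))

      for g in range(n_genes):
          slope, intercept = np.polyfit(transcripts[g], proteins[g], deg=1)
          r = np.corrcoef(transcripts[g], proteins[g])[0, 1]
          print(f"gene {g}: slope={slope:.2f}, intercept={intercept:.2f}, r={r:.2f}")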

  16. The impact of missing sensor information on surgical workflow management.

    Science.gov (United States)

    Liebmann, Philipp; Meixensberger, Jürgen; Wiedemann, Peter; Neumuth, Thomas

    2013-09-01

    Sensor systems in the operating room may encounter intermittent data losses that reduce the performance of surgical workflow management systems (SWFMS). Sensor data loss could impact SWFMS-based decision support, device parameterization, and information presentation. The purpose of this study was to understand the robustness of surgical process models when sensor information is partially missing. SWFMS changes caused by incorrect or missing data from the sensor system that tracks the progress of a surgical intervention were tested. The individual surgical process models (iSPMs) from 100 different cataract procedures of 3 ophthalmologic surgeons were used to select a randomized subset and create a generalized surgical process model (gSPM). A disjoint subset was selected from the iSPMs and used to simulate the surgical process against the gSPM. The loss of sensor data was simulated by removing some information from one task in the iSPM. The effect of missing sensor data was measured using several metrics: (a) successful relocation of the path in the gSPM, (b) the number of steps to find the converging point, and (c) the perspective with the highest occurrence of unsuccessful path findings. A gSPM built using 30% of the iSPMs successfully found the correct path in 90% of the cases. The most critical sensor data were the information regarding the instrument used by the surgeon. We found that use of a gSPM to provide input data for a SWFMS is robust and can be accurate despite missing sensor data. A surgical workflow management system can provide the surgeon with workflow guidance in the OR for most cases. Sensor systems for surgical process tracking can be evaluated based on the stability and accuracy of functional and spatial operative results.
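
    The surgical process models themselves are not included in this record; the toy sketch below only mimics the simulation idea on sequences of task labels: a "generalized" model is a set of observed task sequences, one task is dropped from a test sequence to emulate lost sensor data, and the code checks whether the degraded sequence can still be located as a subsequence of some model sequence. This is a crude, invented stand-in for relocating the path and finding the converging point, not the study's method.

      # Crude stand-in for testing robustness of a generalized process model
      # against a missing task caused by sensor data loss.
      def is_subsequence(short, full):
          it = iter(full)
          return all(task in it for task in short)

      gspm = [                        # "generalized" model: observed task sequences
          ["incision", "phaco", "irrigation", "lens_insert", "closing"],
          ["incision", "phaco", "lens_insert", "closing"],
      ]

      observed = ["incision", "phaco", "irrigation", "lens_insert", "closing"]
      for missing_idx in range(len(observed)):
          degraded = observed[:missing_idx] + observed[missing_idx + 1:]
          ok = any(is_subsequence(degraded, seq) for seq in gspm)
          print(f"drop '{observed[missing_idx]}': relocated={ok}")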

  17. CMS Data Processing Workflows during an Extended Cosmic Ray Run

    Energy Technology Data Exchange (ETDEWEB)

    2009-11-01

    The CMS Collaboration conducted a month-long data taking exercise, the Cosmic Run At Four Tesla, during October-November 2008, with the goal of commissioning the experiment for extended operation. With all installed detector systems participating, CMS recorded 270 million cosmic ray events with the solenoid at a magnetic field strength of 3.8 T. This paper describes the data flow from the detector through the various online and offline computing systems, as well as the workflows used for recording the data, for aligning and calibrating the detector, and for analysis of the data.

  18. CMS Data Processing Workflows during an Extended Cosmic Ray Run

    CERN Document Server

    Chatrchyan, S; Sirunyan, A M; Adam, W; Arnold, B; Bergauer, H; Bergauer, T; Dragicevic, M; Eichberger, M; Erö, J; Friedl, M; Frühwirth, R; Ghete, V M; Hammer, J; Hänsel, S; Hoch, M; Hörmann, N; Hrubec, J; Jeitler, M; Kasieczka, G; Kastner, K; Krammer, M; Liko, D; Magrans de Abril, I; Mikulec, I; Mittermayr, F; Neuherz, B; Oberegger, M; Padrta, M; Pernicka, M; Rohringer, H; Schmid, S; Schöfbeck, R; Schreiner, T; Stark, R; Steininger, H; Strauss, J; Taurok, A; Teischinger, F; Themel, T; Uhl, D; Wagner, P; Waltenberger, W; Walzel, G; Widl, E; Wulz, C E; Chekhovsky, V; Dvornikov, O; Emeliantchik, I; Litomin, A; Makarenko, V; Marfin, I; Mossolov, V; Shumeiko, N; Solin, A; Stefanovitch, R; Suarez Gonzalez, J; Tikhonov, A; Fedorov, A; Karneyeu, A; Korzhik, M; Panov, V; Zuyeuski, R; Kuchinsky, P; Beaumont, W; Benucci, L; Cardaci, M; De Wolf, E A; Delmeire, E; Druzhkin, D; Hashemi, M; Janssen, X; Maes, T; Mucibello, L; Ochesanu, S; Rougny, R; Selvaggi, M; Van Haevermaet, H; Van Mechelen, P; Van Remortel, N; Adler, V; Beauceron, S; Blyweert, S; D'Hondt, J; De Weirdt, S; Devroede, O; Heyninck, J; Kalogeropoulos, A; Maes, J; Maes, M; Mozer, M U; Tavernier, S; Van Doninck, W; Van Mulders, P; Villella, I; Bouhali, O; Chabert, E C; Charaf, O; Clerbaux, B; De Lentdecker, G; Dero, V; Elgammal, S; Gay, A P R; Hammad, G H; Marage, P E; Rugovac, S; Vander Velde, C; Vanlaer, P; Wickens, J; Grunewald, M; Klein, B; Marinov, A; Ryckbosch, D; Thyssen, F; Tytgat, M; Vanelderen, L; Verwilligen, P; Basegmez, S; Bruno, G; Caudron, J; Delaere, C; Demin, P; Favart, D; Giammanco, A; Grégoire, G; Lemaitre, V; Militaru, O; Ovyn, S; Piotrzkowski, K; Quertenmont, L; Schul, N; Beliy, N; Daubie, E; Alves, G A; Pol, M E; Souza, M H G; Carvalho, W; De Jesus Damiao, D; De Oliveira Martins, C; Fonseca De Souza, S; Mundim, L; Oguri, V; Santoro, A; Silva Do Amaral, S M; Sznajder, A; Fernandez Perez Tomei, T R; Ferreira Dias, M A; Gregores, E M; Novaes, S F; Abadjiev, K; Anguelov, T; Damgov, J; Darmenov, N; Dimitrov, L; Genchev, V; Iaydjiev, P; Piperov, S; Stoykova, S; Sultanov, G; Trayanov, R; Vankov, I; Dimitrov, A; Dyulendarova, M; Kozhuharov, V; Litov, L; Marinova, E; Mateev, M; Pavlov, B; Petkov, P; Toteva, Z; Chen, G M; Chen, H S; Guan, W; Jiang, C H; Liang, D; Liu, B; Meng, X; Tao, J; Wang, J; Wang, Z; Xue, Z; Zhang, Z; Ban, Y; Cai, J; Ge, Y; Guo, S; Hu, Z; Mao, Y; Qian, S J; Teng, H; Zhu, B; Avila, C; Baquero Ruiz, M; Carrillo Montoya, C A; Gomez, A; Gomez Moreno, B; Ocampo Rios, A A; Osorio Oliveros, A F; Reyes Romero, D; Sanabria, J C; Godinovic, N; Lelas, K; Plestina, R; Polic, D; Puljak, I; Antunovic, Z; Dzelalija, M; Brigljevic, V; Duric, S; Kadija, K; Morovic, S; Fereos, R; Galanti, M; Mousa, J; Papadakis, A; Ptochos, F; Razis, P A; Tsiakkouri, D; Zinonos, Z; Hektor, A; Kadastik, M; Kannike, K; Müntel, M; Raidal, M; Rebane, L; Anttila, E; Czellar, S; Härkönen, J; Heikkinen, A; Karimäki, V; Kinnunen, R; Klem, J; Kortelainen, M J; Lampén, T; Lassila-Perini, K; Lehti, S; Lindén, T; Luukka, P; Mäenpää, T; Nysten, J; Tuominen, E; Tuominiemi, J; Ungaro, D; Wendland, L; Banzuzi, K; Korpela, A; Tuuva, T; Nedelec, P; Sillou, D; Besancon, M; Chipaux, R; Dejardin, M; Denegri, D; Descamps, J; Fabbro, B; Faure, J L; Ferri, F; Ganjour, S; Gentit, F X; Givernaud, A; Gras, P; Hamel de Monchenault, G; Jarry, P; Lemaire, M C; Locci, E; Malcles, J; Marionneau, M; Millischer, L; Rander, J; Rosowsky, A; Rousseau, D; Titov, M; Verrecchia, P; Baffioni, S; Bianchini, L; Bluj, M; Busson, P; Charlot, C; Dobrzynski, L; Granier de Cassagnac, 
R; Haguenauer, M; Miné, P; Paganini, P; Sirois, Y; Thiebaux, C; Zabi, A; Agram, J L; Besson, A; Bloch, D; Bodin, D; Brom, J M; Conte, E; Drouhin, F; Fontaine, J C; Gelé, D; Goerlach, U; Gross, L; Juillot, P; Le Bihan, A C; Patois, Y; Speck, J; Van Hove, P; Baty, C; Bedjidian, M; Blaha, J; Boudoul, G; Brun, H; Chanon, N; Chierici, R; Contardo, D; Depasse, P; Dupasquier, T; El Mamouni, H; Fassi, F; Fay, J; Gascon, S; Ille, B; Kurca, T; Le Grand, T; Lethuillier, M; Lumb, N; Mirabito, L; Perries, S; Vander Donckt, M; Verdier, P; Djaoshvili, N; Roinishvili, N; Roinishvili, V; Amaglobeli, N; Adolphi, R; Anagnostou, G; Brauer, R; Braunschweig, W; Edelhoff, M; Esser, H; Feld, L; Karpinski, W; Khomich, A; Klein, K; Mohr, N; Ostaptchouk, A; Pandoulas, D; Pierschel, G; Raupach, F; Schael, S; Schultz von Dratzig, A; Schwering, G; Sprenger, D; Thomas, M; Weber, M; Wittmer, B; Wlochal, M; Actis, O; Altenhöfer, G; Bender, W; Biallass, P; Erdmann, M; Fetchenhauer, G; Frangenheim, J; Hebbeker, T; Hilgers, G; Hinzmann, A; Hoepfner, K; Hof, C; Kirsch, M; Klimkovich, T; Kreuzer, P; Lanske, D; Merschmeyer, M; Meyer, A; Philipps, B; Pieta, H; Reithler, H; Schmitz, S A; Sonnenschein, L; Sowa, M; Steggemann, J; Szczesny, H; Teyssier, D; Zeidler, C; Bontenackels, M; Davids, M; Duda, M; Flügge, G; Geenen, H; Giffels, M; Haj Ahmad, W; Hermanns, T; Heydhausen, D; Kalinin, S; Kress, T; Linn, A; Nowack, A; Perchalla, L; Poettgens, M; Pooth, O; Sauerland, P; Stahl, A; Tornier, D; Zoeller, M H; Aldaya Martin, M; Behrens, U; Borras, K; Campbell, A; Castro, E; Dammann, D; Eckerlin, G; Flossdorf, A; Flucke, G; Geiser, A; Hatton, D; Hauk, J; Jung, H; Kasemann, M; Katkov, I; Kleinwort, C; Kluge, H; Knutsson, A; Kuznetsova, E; Lange, W; Lohmann, W; Mankel, R; Marienfeld, M; Meyer, A B; Miglioranzi, S; Mnich, J; Ohlerich, M; Olzem, J; Parenti, A; Rosemann, C; Schmidt, R; Schoerner-Sadenius, T; Volyanskyy, D; Wissing, C; Zeuner, W D; Autermann, C; Bechtel, F; Draeger, J; Eckstein, D; Gebbert, U; Kaschube, K; Kaussen, G; Klanner, R; Mura, B; Naumann-Emme, S; Nowak, F; Pein, U; Sander, C; Schleper, P; Schum, T; Stadie, H; Steinbrück, G; Thomsen, J; Wolf, R; Bauer, J; Blüm, P; Buege, V; Cakir, A; Chwalek, T; De Boer, W; Dierlamm, A; Dirkes, G; Feindt, M; Felzmann, U; Frey, M; Furgeri, A; Gruschke, J; Hackstein, C; Hartmann, F; Heier, S; Heinrich, M; Held, H; Hirschbuehl, D; Hoffmann, K H; Honc, S; Jung, C; Kuhr, T; Liamsuwan, T; Martschei, D; Mueller, S; Müller, Th; Neuland, M B; Niegel, M; Oberst, O; Oehler, A; Ott, J; Peiffer, T; Piparo, D; Quast, G; Rabbertz, K; Ratnikov, F; Ratnikova, N; Renz, M; Saout, C; Sartisohn, G; Scheurer, A; Schieferdecker, P; Schilling, F P; Schott, G; Simonis, H J; Stober, F M; Sturm, P; Troendle, D; Trunov, A; Wagner, W; Wagner-Kuhr, J; Zeise, M; Zhukov, V; Ziebarth, E B; Daskalakis, G; Geralis, T; Karafasoulis, K; Kyriakis, A; Loukas, D; Markou, A; Markou, C; Mavrommatis, C; Petrakou, E; Zachariadou, A; Gouskos, L; Katsas, P; Panagiotou, A; Evangelou, I; Kokkas, P; Manthos, N; Papadopoulos, I; Patras, V; Triantis, F A; Bencze, G; Boldizsar, L; Debreczeni, G; Hajdu, C; Hernath, S; Hidas, P; Horvath, D; Krajczar, K; Laszlo, A; Patay, G; Sikler, F; Toth, N; Vesztergombi, G; Beni, N; Christian, G; Imrek, J; Molnar, J; Novak, D; Palinkas, J; Szekely, G; Szillasi, Z; Tokesi, K; Veszpremi, V; Kapusi, A; Marian, G; Raics, P; Szabo, Z; Trocsanyi, Z L; Ujvari, B; Zilizi, G; Bansal, S; Bawa, H S; Beri, S B; Bhatnagar, V; Jindal, M; Kaur, M; Kaur, R; Kohli, J M; Mehta, M Z; Nishu, N; Saini, L K; Sharma, A; 
Singh, A; Singh, J B; Singh, S P; Ahuja, S; Arora, S; Bhattacharya, S; Chauhan, S; Choudhary, B C; Gupta, P; Jain, S; Jha, M; Kumar, A; Ranjan, K; Shivpuri, R K; Srivastava, A K; Choudhury, R K; Dutta, D; Kailas, S; Kataria, S K; Mohanty, A K; Pant, L M; Shukla, P; Topkar, A; Aziz, T; Guchait, M; Gurtu, A; Maity, M; Majumder, D; Majumder, G; Mazumdar, K; Nayak, A; Saha, A; Sudhakar, K; Banerjee, S; Dugad, S; Mondal, N K; Arfaei, H; Bakhshiansohi, H; Fahim, A; Jafari, A; Mohammadi Najafabadi, M; Moshaii, A; Paktinat Mehdiabadi, S; Rouhani, S; Safarzadeh, B; Zeinali, M; Felcini, M; Abbrescia, M; Barbone, L; Chiumarulo, F; Clemente, A; Colaleo, A; Creanza, D; Cuscela, G; De Filippis, N; De Palma, M; De Robertis, G; Donvito, G; Fedele, F; Fiore, L; Franco, M; Iaselli, G; Lacalamita, N; Loddo, F; Lusito, L; Maggi, G; Maggi, M; Manna, N; Marangelli, B; My, S; Natali, S; Nuzzo, S; Papagni, G; Piccolomo, S; Pierro, G A; Pinto, C; Pompili, A; Pugliese, G; Rajan, R; Ranieri, A; Romano, F; Roselli, G; Selvaggi, G; Shinde, Y; Silvestris, L; Tupputi, S; Zito, G; Abbiendi, G; Bacchi, W; Benvenuti, A C; Boldini, M; Bonacorsi, D; Braibant-Giacomelli, S; Cafaro, V D; Caiazza, S S; Capiluppi, P; Castro, A; Cavallo, F R; Codispoti, G; Cuffiani, M; D'Antone, I; Dallavalle, G M; Fabbri, F; Fanfani, A; Fasanella, D; Giacomelli, P; Giordano, V; Giunta, M; Grandi, C; Guerzoni, M; Marcellini, S; Masetti, G; Montanari, A; Navarria, F L; Odorici, F; Pellegrini, G; Perrotta, A; Rossi, A M; Rovelli, T; Siroli, G; Torromeo, G; Travaglini, R; Albergo, S; Costa, S; Potenza, R; Tricomi, A; Tuve, C; Barbagli, G; Broccolo, G; Ciulli, V; Civinini, C; D'Alessandro, R; Focardi, E; Frosali, S; Gallo, E; Genta, C; Landi, G; Lenzi, P; Meschini, M; Paoletti, S; Sguazzoni, G; Tropiano, A; Benussi, L; Bertani, M; Bianco, S; Colafranceschi, S; Colonna, D; Fabbri, F; Giardoni, M; Passamonti, L; Piccolo, D; Pierluigi, D; Ponzio, B; Russo, A; Fabbricatore, P; Musenich, R; Benaglia, A; Calloni, M; Cerati, G B; D'Angelo, P; De Guio, F; Farina, F M; Ghezzi, A; Govoni, P; Malberti, M; Malvezzi, S; Martelli, A; Menasce, D; Miccio, V; Moroni, L; Negri, P; Paganoni, M; Pedrini, D; Pullia, A; Ragazzi, S; Redaelli, N; Sala, S; Salerno, R; Tabarelli de Fatis, T; Tancini, V; Taroni, S; Buontempo, S; Cavallo, N; Cimmino, A; De Gruttola, M; Fabozzi, F; Iorio, A O M; Lista, L; Lomidze, D; Noli, P; Paolucci, P; Sciacca, C; Azzi, P; Bacchetta, N; Barcellan, L; Bellan, P; Bellato, M; Benettoni, M; Biasotto, M; Bisello, D; Borsato, E; Branca, A; Carlin, R; Castellani, L; Checchia, P; Conti, E; Dal Corso, F; De Mattia, M; Dorigo, T; Dosselli, U; Fanzago, F; Gasparini, F; Gasparini, U; Giubilato, P; Gonella, F; Gresele, A; Gulmini, M; Kaminskiy, A; Lacaprara, S; Lazzizzera, I; Margoni, M; Maron, G; Mattiazzo, S; Mazzucato, M; Meneghelli, M; Meneguzzo, A T; Michelotto, M; Montecassiano, F; Nespolo, M; Passaseo, M; Pegoraro, M; Perrozzi, L; Pozzobon, N; Ronchese, P; Simonetto, F; Toniolo, N; Torassa, E; Tosi, M; Triossi, A; Vanini, S; Ventura, S; Zotto, P; Zumerle, G; Baesso, P; Berzano, U; Bricola, S; Necchi, M M; Pagano, D; Ratti, S P; Riccardi, C; Torre, P; Vicini, A; Vitulo, P; Viviani, C; Aisa, D; Aisa, S; Babucci, E; Biasini, M; Bilei, G M; Caponeri, B; Checcucci, B; Dinu, N; Fanò, L; Farnesini, L; Lariccia, P; Lucaroni, A; Mantovani, G; Nappi, A; Piluso, A; Postolache, V; Santocchia, A; Servoli, L; Tonoiu, D; Vedaee, A; Volpe, R; Azzurri, P; Bagliesi, G; Bernardini, J; Berretta, L; Boccali, T; Bocci, A; Borrello, L; Bosi, F; Calzolari, F; Castaldi, R; 
Dell'Orso, R; Fiori, F; Foà, L; Gennai, S; Giassi, A; Kraan, A; Ligabue, F; Lomtadze, T; Mariani, F; Martini, L; Massa, M; Messineo, A; Moggi, A; Palla, F; Palmonari, F; Petragnani, G; Petrucciani, G; Raffaelli, F; Sarkar, S; Segneri, G; Serban, A T; Spagnolo, P; Tenchini, R; Tolaini, S; Tonelli, G; Venturi, A; Verdini, P G; Baccaro, S; Barone, L; Bartoloni, A; Cavallari, F; Dafinei, I; Del Re, D; Di Marco, E; Diemoz, M; Franci, D; Longo, E; Organtini, G; Palma, A; Pandolfi, F; Paramatti, R; Pellegrino, F; Rahatlou, S; Rovelli, C; Alampi, G; Amapane, N; Arcidiacono, R; Argiro, S; Arneodo, M; Biino, C; Borgia, M A; Botta, C; Cartiglia, N; Castello, R; Cerminara, G; Costa, M; Dattola, D; Dellacasa, G; Demaria, N; Dughera, G; Dumitrache, F; Graziano, A; Mariotti, C; Marone, M; Maselli, S; Migliore, E; Mila, G; Monaco, V; Musich, M; Nervo, M; Obertino, M M; Oggero, S; Panero, R; Pastrone, N; Pelliccioni, M; Romero, A; Ruspa, M; Sacchi, R; Solano, A; Staiano, A; Trapani, P P; Trocino, D; Vilela Pereira, A; Visca, L; Zampieri, A; Ambroglini, F; Belforte, S; Cossutti, F; Della Ricca, G; Gobbo, B; Penzo, A; Chang, S; Chung, J; Kim, D H; Kim, G N; Kong, D J; Park, H; Son, D C; Bahk, S Y; Song, S; Jung, S Y; Hong, B; Kim, H; Kim, J H; Lee, K S; Moon, D H; Park, S K; Rhee, H B; Sim, K S; Kim, J; Choi, M; Hahn, G; Park, I C; Choi, S; Choi, Y; Goh, J; Jeong, H; Kim, T J; Lee, J; Lee, S; Janulis, M; Martisiute, D; Petrov, P; Sabonis, T; Castilla Valdez, H; Sánchez Hernández, A; Carrillo Moreno, S; Morelos Pineda, A; Allfrey, P; Gray, R N C; Krofcheck, D; Bernardino Rodrigues, N; Butler, P H; Signal, T; Williams, J C; Ahmad, M; Ahmed, I; Ahmed, W; Asghar, M I; Awan, M I M; Hoorani, H R; Hussain, I; Khan, W A; Khurshid, T; Muhammad, S; Qazi, S; Shahzad, H; Cwiok, M; Dabrowski, R; Dominik, W; Doroba, K; Konecki, M; Krolikowski, J; Pozniak, K; Romaniuk, Ryszard; Zabolotny, W; Zych, P; Frueboes, T; Gokieli, R; Goscilo, L; Górski, M; Kazana, M; Nawrocki, K; Szleper, M; Wrochna, G; Zalewski, P; Almeida, N; Antunes Pedro, L; Bargassa, P; David, A; Faccioli, P; Ferreira Parracho, P G; Freitas Ferreira, M; Gallinaro, M; Guerra Jordao, M; Martins, P; Mini, G; Musella, P; Pela, J; Raposo, L; Ribeiro, P Q; Sampaio, S; Seixas, J; Silva, J; Silva, P; Soares, D; Sousa, M; Varela, J; Wöhri, H K; Altsybeev, I; Belotelov, I; Bunin, P; Ershov, Y; Filozova, I; Finger, M; Finger, M., Jr.; Golunov, A; Golutvin, I; Gorbounov, N; Kalagin, V; Kamenev, A; Karjavin, V; Konoplyanikov, V; Korenkov, V; Kozlov, G; Kurenkov, A; Lanev, A; Makankin, A; Mitsyn, V V; Moisenz, P; Nikonov, E; Oleynik, D; Palichik, V; Perelygin, V; Petrosyan, A; Semenov, R; Shmatov, S; Smirnov, V; Smolin, D; Tikhonenko, E; Vasil'ev, S; Vishnevskiy, A; Volodko, A; Zarubin, A; Zhiltsov, V; Bondar, N; Chtchipounov, L; Denisov, A; Gavrikov, Y; Gavrilov, G; Golovtsov, V; Ivanov, Y; Kim, V; Kozlov, V; Levchenko, P; Obrant, G; Orishchin, E; Petrunin, A; Shcheglov, Y; Shchetkovskiy, A; Sknar, V; Smirnov, I; Sulimov, V; Tarakanov, V; Uvarov, L; Vavilov, S; Velichko, G; Volkov, S; Vorobyev, A; Andreev, Yu; Anisimov, A; Antipov, P; Dermenev, A; Gninenko, S; Golubev, N; Kirsanov, M; Krasnikov, N; Matveev, V; Pashenkov, A; Postoev, V E; Solovey, A; Toropin, A; Troitsky, S; Baud, A; Epshteyn, V; Gavrilov, V; Ilina, N; Kaftanov, V; Kolosov, V; Kossov, M; Krokhotin, A; Kuleshov, S; Oulianov, A; Safronov, G; Semenov, S; Shreyber, I; Stolin, V; Vlasov, E; Zhokin, A; Boos, E; Dubinin, M; Dudko, L; Ershov, A; Gribushin, A; Klyukhin, V; Kodolova, O; Lokhtin, I; Petrushanko, S; 
Sarycheva, L; Savrin, V; Snigirev, A; Vardanyan, I; Dremin, I; Kirakosyan, M; Konovalova, N; Rusakov, S V; Vinogradov, A; Akimenko, S; Artamonov, A; Azhgirey, I; Bitioukov, S; Burtovoy, V; Grishin, V; Kachanov, V; Konstantinov, D; Krychkine, V; Levine, A; Lobov, I; Lukanin, V; Mel'nik, Y; Petrov, V; Ryutin, R; Slabospitsky, S; Sobol, A; Sytine, A; Tourtchanovitch, L; Troshin, S; Tyurin, N; Uzunian, A; Volkov, A; Adzic, P; Djordjevic, M; Jovanovic, D; Krpic, D; Maletic, D; Puzovic, J; Smiljkovic, N; Aguilar-Benitez, M; Alberdi, J; Alcaraz Maestre, J; Arce, P; Barcala, J M; Battilana, C; Burgos Lazaro, C; Caballero Bejar, J; Calvo, E; Cardenas Montes, M; Cepeda, M; Cerrada, M; Chamizo Llatas, M; Clemente, F; Colino, N; Daniel, M; De La Cruz, B; Delgado Peris, A; Diez Pardos, C; Fernandez Bedoya, C; Fernández Ramos, J P; Ferrando, A; Flix, J; Fouz, M C; Garcia-Abia, P; Garcia-Bonilla, A C; Gonzalez Lopez, O; Goy Lopez, S; Hernandez, J M; Josa, M I; Marin, J; Merino, G; Molina, J; Molinero, A; Navarrete, J J; Oller, J C; Puerta Pelayo, J; Romero, L; Santaolalla, J; Villanueva Munoz, C; Willmott, C; Yuste, C; Albajar, C; Blanco Otano, M; de Trocóniz, J F; Garcia Raboso, A; Lopez Berengueres, J O; Cuevas, J; Fernandez Menendez, J; Gonzalez Caballero, I; Lloret Iglesias, L; Naves Sordo, H; Vizan Garcia, J M; Cabrillo, I J; Calderon, A; Chuang, S H; Diaz Merino, I; Diez Gonzalez, C; Duarte Campderros, J; Fernandez, M; Gomez, G; Gonzalez Sanchez, J; Gonzalez Suarez, R; Jorda, C; Lobelle Pardo, P; Lopez Virto, A; Marco, J; Marco, R; Martinez Rivero, C; Martinez Ruiz del Arbol, P; Matorras, F; Rodrigo, T; Ruiz Jimeno, A; Scodellaro, L; Sobron Sanudo, M; Vila, I; Vilar Cortabitarte, R; Abbaneo, D; Albert, E; Alidra, M; Ashby, S; Auffray, E; Baechler, J; Baillon, P; Ball, A H; Bally, S L; Barney, D; Beaudette, F; Bellan, R; Benedetti, D; Benelli, G; Bernet, C; Bloch, P; Bolognesi, S; Bona, M; Bos, J; Bourgeois, N; Bourrel, T; Breuker, H; Bunkowski, K; Campi, D; Camporesi, T; Cano, E; Cattai, A; Chatelain, J P; Chauvey, M; Christiansen, T; Coarasa Perez, J A; Conde Garcia, A; Covarelli, R; Curé, B; De Roeck, A; Delachenal, V; Deyrail, D; Di Vincenzo, S; Dos Santos, S; Dupont, T; Edera, L M; Elliott-Peisert, A; Eppard, M; Favre, M; Frank, N; Funk, W; Gaddi, A; Gastal, M; Gateau, M; Gerwig, H; Gigi, D; Gill, K; Giordano, D; Girod, J P; Glege, F; Gomez-Reino Garrido, R; Goudard, R; Gowdy, S; Guida, R; Guiducci, L; Gutleber, J; Hansen, M; Hartl, C; Harvey, J; Hegner, B; Hoffmann, H F; Holzner, A; Honma, A; Huhtinen, M; Innocente, V; Janot, P; Le Godec, G; Lecoq, P; Leonidopoulos, C; Loos, R; Lourenço, C; Lyonnet, A; Macpherson, A; Magini, N; Maillefaud, J D; Maire, G; Mäki, T; Malgeri, L; Mannelli, M; Masetti, L; Meijers, F; Meridiani, P; Mersi, S; Meschi, E; Meynet Cordonnier, A; Moser, R; Mulders, M; Mulon, J; Noy, M; Oh, A; Olesen, G; Onnela, A; Orimoto, T; Orsini, L; Perez, E; Perinic, G; Pernot, J F; Petagna, P; Petiot, P; Petrilli, A; Pfeiffer, A; Pierini, M; Pimiä, M; Pintus, R; Pirollet, B; Postema, H; Racz, A; Ravat, S; Rew, S B; Rodrigues Antunes, J; Rolandi, G; Rovere, M; Ryjov, V; Sakulin, H; Samyn, D; Sauce, H; Schäfer, C; Schlatter, W D; Schröder, M; Schwick, C; Sciaba, A; Segoni, I; Sharma, A; Siegrist, N; Siegrist, P; Sinanis, N; Sobrier, T; Sphicas, P; Spiga, D; Spiropulu, M; Stöckli, F; Traczyk, P; Tropea, P; Troska, J; Tsirou, A; Veillet, L; Veres, G I; Voutilainen, M; Wertelaers, P; Zanetti, M; Bertl, W; Deiters, K; Erdmann, W; Gabathuler, K; Horisberger, R; Ingram, Q; Kaestli, H C; 
König, S; Kotlinski, D; Langenegger, U; Meier, F; Renker, D; Rohe, T; Sibille, J; Starodumov, A; Betev, B; Caminada, L; Chen, Z; Cittolin, S; Da Silva Di Calafiori, D R; Dambach, S; Dissertori, G; Dittmar, M; Eggel, C; Eugster, J; Faber, G; Freudenreich, K; Grab, C; Hervé, A; Hintz, W; Lecomte, P; Luckey, P D; Lustermann, W; Marchica, C; Milenovic, P; Moortgat, F; Nardulli, A; Nessi-Tedaldi, F; Pape, L; Pauss, F; Punz, T; Rizzi, A; Ronga, F J; Sala, L; Sanchez, A K; Sawley, M C; Sordini, V; Stieger, B; Tauscher, L; Thea, A; Theofilatos, K; Treille, D; Trüb, P; Weber, M; Wehrli, L; Weng, J; Zelepoukine, S; Amsler, C; Chiochia, V; De Visscher, S; Regenfus, C; Robmann, P; Rommerskirchen, T; Schmidt, A; Tsirigkas, D; Wilke, L; Chang, Y H; Chen, E A; Chen, W T; Go, A; Kuo, C M; Li, S W; Lin, W; Bartalini, P; Chang, P; Chao, Y; Chen, K F; Hou, W S; Hsiung, Y; Lei, Y J; Lin, S W; Lu, R S; Schümann, J; Shiu, J G; Tzeng, Y M; Ueno, K; Velikzhanin, Y; Wang, C C; Wang, M; Adiguzel, A; Ayhan, A; Azman Gokce, A; Bakirci, M N; Cerci, S; Dumanoglu, I; Eskut, E; Girgis, S; Gurpinar, E; Hos, I; Karaman, T; Kayis Topaksu, A; Kurt, P; Önengüt, G; Önengüt Gökbulut, G; Ozdemir, K; Ozturk, S; Polatöz, A; Sogut, K; Tali, B; Topakli, H; Uzun, D; Vergili, L N; Vergili, M; Akin, I V; Aliev, T; Bilmis, S; Deniz, M; Gamsizkan, H; Guler, A M; Öcalan, K; Serin, M; Sever, R; Surat, U E; Zeyrek, M; Deliomeroglu, M; Demir, D; Gülmez, E; Halu, A; Isildak, B; Kaya, M; Kaya, O; Ozkorucuklu, S; Sonmez, N; Levchuk, L; Lukyanenko, S; Soroka, D; Zub, S; Bostock, F; Brooke, J J; Cheng, T L; Cussans, D; Frazier, R; Goldstein, J; Grant, N; Hansen, M; Heath, G P; Heath, H F; Hill, C; Huckvale, B; Jackson, J; Mackay, C K; Metson, S; Newbold, D M; Nirunpong, K; Smith, V J; Velthuis, J; Walton, R; Bell, K W; Brew, C; Brown, R M; Camanzi, B; Cockerill, D J A; Coughlan, J A; Geddes, N I; Harder, K; Harper, S; Kennedy, B W; Murray, P; Shepherd-Themistocleous, C H; Tomalin, I R; Williams, J H; Womersley, W J; Worm, S D; Bainbridge, R; Ball, G; Ballin, J; Beuselinck, R; Buchmuller, O; Colling, D; Cripps, N; Davies, G; Della Negra, M; Foudas, C; Fulcher, J; Futyan, D; Hall, G; Hays, J; Iles, G; Karapostoli, G; MacEvoy, B C; Magnan, A M; Marrouche, J; Nash, J; Nikitenko, A; Papageorgiou, A; Pesaresi, M; Petridis, K; Pioppi, M; Raymond, D M; Rompotis, N; Rose, A; Ryan, M J; Seez, C; Sharp, P; Sidiropoulos, G; Stettler, M; Stoye, M; Takahashi, M; Tapper, A; Timlin, C; Tourneur, S; Vazquez Acosta, M; Virdee, T; Wakefield, S; Wardrope, D; Whyntie, T; Wingham, M; Cole, J E; Goitom, I; Hobson, P R; Khan, A; Kyberd, P; Leslie, D; Munro, C; Reid, I D; Siamitros, C; Taylor, R; Teodorescu, L; Yaselli, I; Bose, T; Carleton, M; Hazen, E; Heering, A H; Heister, A; John, J St; Lawson, P; Lazic, D; Osborne, D; Rohlf, J; Sulak, L; Wu, S; Andrea, J; Avetisyan, A; Bhattacharya, S; Chou, J P; Cutts, D; Esen, S; Kukartsev, G; Landsberg, G; Narain, M; Nguyen, D; Speer, T; Tsang, K V; Breedon, R; Calderon De La Barca Sanchez, M; Case, M; Cebra, D; Chertok, M; Conway, J; Cox, P T; Dolen, J; Erbacher, R; Friis, E; Ko, W; Kopecky, A; Lander, R; Lister, A; Liu, H; Maruyama, S; Miceli, T; Nikolic, M; Pellett, D; Robles, J; Searle, M; Smith, J; Squires, M; Stilley, J; Tripathi, M; Vasquez Sierra, R; Veelken, C; Andreev, V; Arisaka, K; Cline, D; Cousins, R; Erhan, S; Hauser, J; Ignatenko, M; Jarvis, C; Mumford, J; Plager, C; Rakness, G; Schlein, P; Tucker, J; Valuev, V; Wallny, R; Yang, X; Babb, J; Bose, M; Chandra, A; Clare, R; Ellison, J A; Gary, J W; Hanson, G; Jeng, 
G Y; Kao, S C; Liu, F; Liu, H; Luthra, A; Nguyen, H; Pasztor, G; Satpathy, A; Shen, B C; Stringer, R; Sturdy, J; Sytnik, V; Wilken, R; Wimpenny, S; Branson, J G; Dusinberre, E; Evans, D; Golf, F; Kelley, R; Lebourgeois, M; Letts, J; Lipeles, E; Mangano, B; Muelmenstaedt, J; Norman, M; Padhi, S; Petrucci, A; Pi, H; Pieri, M; Ranieri, R; Sani, M; Sharma, V; Simon, S; Würthwein, F; Yagil, A; Campagnari, C; D'Alfonso, M; Danielson, T; Garberson, J; Incandela, J; Justus, C; Kalavase, P; Koay, S A; Kovalskyi, D; Krutelyov, V; Lamb, J; Lowette, S; Pavlunin, V; Rebassoo, F; Ribnik, J; Richman, J; Rossin, R; Stuart, D; To, W; Vlimant, J R; Witherell, M; Apresyan, A; Bornheim, A; Bunn, J; Chiorboli, M; Gataullin, M; Kcira, D; Litvine, V; Ma, Y; Newman, H B; Rogan, C; Timciuc, V; Veverka, J; Wilkinson, R; Yang, Y; Zhang, L; Zhu, K; Zhu, R Y; Akgun, B; Carroll, R; Ferguson, T; Jang, D W; Jun, S Y; Paulini, M; Russ, J; Terentyev, N; Vogel, H; Vorobiev, I; Cumalat, J P; Dinardo, M E; Drell, B R; Ford, W T; Heyburn, B; Luiggi Lopez, E; Nauenberg, U; Stenson, K; Ulmer, K; Wagner, S R; Zang, S L; Agostino, L; Alexander, J; Blekman, F; Cassel, D; Chatterjee, A; Das, S; Gibbons, L K; Heltsley, B; Hopkins, W; Khukhunaishvili, A; Kreis, B; Kuznetsov, V; Patterson, J R; Puigh, D; Ryd, A; Shi, X; Stroiney, S; Sun, W; Teo, W D; Thom, J; Vaughan, J; Weng, Y; Wittich, P; Beetz, C P; Cirino, G; Sanzeni, C; Winn, D; Abdullin, S; Afaq, M A; Albrow, M; Ananthan, B; Apollinari, G; Atac, M; Badgett, W; Bagby, L; Bakken, J A; Baldin, B; Banerjee, S; Banicz, K; Bauerdick, L A T; Beretvas, A; Berryhill, J; Bhat, P C; Biery, K; Binkley, M; Bloch, I; Borcherding, F; Brett, A M; Burkett, K; Butler, J N; Chetluru, V; Cheung, H W K; Chlebana, F; Churin, I; Cihangir, S; Crawford, M; Dagenhart, W; Demarteau, M; Derylo, G; Dykstra, D; Eartly, D P; Elias, J E; Elvira, V D; Evans, D; Feng, L; Fischler, M; Fisk, I; Foulkes, S; Freeman, J; Gartung, P; Gottschalk, E; Grassi, T; Green, D; Guo, Y; Gutsche, O; Hahn, A; Hanlon, J; Harris, R M; Holzman, B; Howell, J; Hufnagel, D; James, E; Jensen, H; Johnson, M; Jones, C D; Joshi, U; Juska, E; Kaiser, J; Klima, B; Kossiakov, S; Kousouris, K; Kwan, S; Lei, C M; Limon, P; Lopez Perez, J A; Los, S; Lueking, L; Lukhanin, G; Lusin, S; Lykken, J; Maeshima, K; Marraffino, J M; Mason, D; McBride, P; Miao, T; Mishra, K; Moccia, S; Mommsen, R; Mrenna, S; Muhammad, A S; Newman-Holmes, C; Noeding, C; O'Dell, V; Prokofyev, O; Rivera, R; Rivetta, C H; Ronzhin, A; Rossman, P; Ryu, S; Sekhri, V; Sexton-Kennedy, E; Sfiligoi, I; Sharma, S; Shaw, T M; Shpakov, D; Skup, E; Smith, R P; Soha, A; Spalding, W J; Spiegel, L; Suzuki, I; Tan, P; Tanenbaum, W; Tkaczyk, S; Trentadue, R; Uplegger, L; Vaandering, E W; Vidal, R; Whitmore, J; Wicklund, E; Wu, W; Yarba, J; Yumiceva, F; Yun, J C; Acosta, D; Avery, P; Barashko, V; Bourilkov, D; Chen, M; Di Giovanni, G P; Dobur, D; Drozdetskiy, A; Field, R D; Fu, Y; Furic, I K; Gartner, J; Holmes, D; Kim, B; Klimenko, S; Konigsberg, J; Korytov, A; Kotov, K; Kropivnitskaya, A; Kypreos, T; Madorsky, A; Matchev, K; Mitselmakher, G; Pakhotin, Y; Piedra Gomez, J; Prescott, C; Rapsevicius, V; Remington, R; Schmitt, M; Scurlock, B; Wang, D; Yelton, J; Ceron, C; Gaultney, V; Kramer, L; Lebolo, L M; Linn, S; Markowitz, P; Martinez, G; Rodriguez, J L; Adams, T; Askew, A; Baer, H; Bertoldi, M; Chen, J; Dharmaratna, W G D; Gleyzer, S V; Haas, J; Hagopian, S; Hagopian, V; Jenkins, M; Johnson, K F; Prettner, E; Prosper, H; Sekmen, S; Baarmand, M M; Guragain, S; Hohlmann, M; Kalakhety, H; 
Mermerkaya, H; Ralich, R; Vodopiyanov, I; Abelev, B; Adams, M R; Anghel, I M; Apanasevich, L; Bazterra, V E; Betts, R R; Callner, J; Castro, M A; Cavanaugh, R; Dragoiu, C; Garcia-Solis, E J; Gerber, C E; Hofman, D J; Khalatian, S; Mironov, C; Shabalina, E; Smoron, A; Varelas, N; Akgun, U; Albayrak, E A; Ayan, A S; Bilki, B; Briggs, R; Cankocak, K; Chung, K; Clarida, W; Debbins, P; Duru, F; Ingram, F D; Lae, C K; McCliment, E; Merlo, J P; Mestvirishvili, A; Miller, M J; Moeller, A; Nachtman, J; Newsom, C R; Norbeck, E; Olson, J; Onel, Y; Ozok, F; Parsons, J; Schmidt, I; Sen, S; Wetzel, J; Yetkin, T; Yi, K; Barnett, B A; Blumenfeld, B; Bonato, A; Chien, C Y; Fehling, D; Giurgiu, G; Gritsan, A V; Guo, Z J; Maksimovic, P; Rappoccio, S; Swartz, M; Tran, N V; Zhang, Y; Baringer, P; Bean, A; Grachov, O; Murray, M; Radicci, V; Sanders, S; Wood, J S; Zhukova, V; Bandurin, D; Bolton, T; Kaadze, K; Liu, A; Maravin, Y; Onoprienko, D; Svintradze, I; Wan, Z; Gronberg, J; Hollar, J; Lange, D; Wright, D; Baden, D; Bard, R; Boutemeur, M; Eno, S C; Ferencek, D; Hadley, N J; Kellogg, R G; Kirn, M; Kunori, S; Rossato, K; Rumerio, P; Santanastasio, F; Skuja, A; Temple, J; Tonjes, M B; Tonwar, S C; Toole, T; Twedt, E; Alver, B; Bauer, G; Bendavid, J; Busza, W; Butz, E; Cali, I A; Chan, M; D'Enterria, D; Everaerts, P; Gomez Ceballos, G; Hahn, K A; Harris, P; Jaditz, S; Kim, Y; Klute, M; Lee, Y J; Li, W; Loizides, C; Ma, T; Miller, M; Nahn, S; Paus, C; Roland, C; Roland, G; Rudolph, M; Stephans, G; Sumorok, K; Sung, K; Vaurynovich, S; Wenger, E A; Wyslouch, B; Xie, S; Yilmaz, Y; Yoon, A S; Bailleux, D; Cooper, S I; Cushman, P; Dahmes, B; De Benedetti, A; Dolgopolov, A; Dudero, P R; Egeland, R; Franzoni, G; Haupt, J; Inyakin, A; Klapoetke, K; Kubota, Y; Mans, J; Mirman, N; Petyt, D; Rekovic, V; Rusack, R; Schroeder, M; Singovsky, A; Zhang, J; Cremaldi, L M; Godang, R; Kroeger, R; Perera, L; Rahmat, R; Sanders, D A; Sonnek, P; Summers, D; Bloom, K; Bockelman, B; Bose, S; Butt, J; Claes, D R; Dominguez, A; Eads, M; Keller, J; Kelly, T; Kravchenko, I; Lazo-Flores, J; Lundstedt, C; Malbouisson, H; Malik, S; Snow, G R; Baur, U; Iashvili, I; Kharchilava, A; Kumar, A; Smith, K; Strang, M; Alverson, G; Barberis, E; Boeriu, O; Eulisse, G; Govi, G; McCauley, T; Musienko, Y; Muzaffar, S; Osborne, I; Paul, T; Reucroft, S; Swain, J; Taylor, L; Tuura, L; Anastassov, A; Gobbi, B; Kubik, A; Ofierzynski, R A; Pozdnyakov, A; Schmitt, M; Stoynev, S; Velasco, M; Won, S; Antonelli, L; Berry, D; Hildreth, M; Jessop, C; Karmgard, D J; Kolberg, T; Lannon, K; Lynch, S; Marinelli, N; Morse, D M; Ruchti, R; Slaunwhite, J; Warchol, J; Wayne, M; Bylsma, B; Durkin, L S; Gilmore, J; Gu, J; Killewald, P; Ling, T Y; Williams, G; Adam, N; Berry, E; Elmer, P; Garmash, A; Gerbaudo, D; Halyo, V; Hunt, A; Jones, J; Laird, E; Marlow, D; Medvedeva, T; Mooney, M; Olsen, J; Piroué, P; Stickland, D; Tully, C; Werner, J S; Wildish, T; Xie, Z; Zuranski, A; Acosta, J G; Bonnett Del Alamo, M; Huang, X T; Lopez, A; Mendez, H; Oliveros, S; Ramirez Vargas, J E; Santacruz, N; Zatzerklyany, A; Alagoz, E; Antillon, E; Barnes, V E; Bolla, G; Bortoletto, D; Everett, A; Garfinkel, A F; Gecse, Z; Gutay, L; Ippolito, N; Jones, M; Koybasi, O; Laasanen, A T; Leonardo, N; Liu, C; Maroussov, V; Merkel, P; Miller, D H; Neumeister, N; Sedov, A; Shipsey, I; Yoo, H D; Zheng, Y; Jindal, P; Parashar, N; Cuplov, V; Ecklund, K M; Geurts, F J M; Liu, J H; Maronde, D; Matveev, M; Padley, B P; Redjimi, R; Roberts, J; Sabbatini, L; Tumanov, A; Betchart, B; Bodek, A; Budd, H; Chung, Y S; 
de Barbaro, P; Demina, R; Flacher, H; Gotra, Y; Harel, A; Korjenevski, S; Miner, D C; Orbaker, D; Petrillo, G; Vishnevskiy, D; Zielinski, M; Bhatti, A; Demortier, L; Goulianos, K; Hatakeyama, K; Lungu, G; Mesropian, C; Yan, M; Atramentov, O; Bartz, E; Gershtein, Y; Halkiadakis, E; Hits, D; Lath, A; Rose, K; Schnetzer, S; Somalwar, S; Stone, R; Thomas, S; Watts, T L; Cerizza, G; Hollingsworth, M; Spanier, S; Yang, Z C; York, A; Asaadi, J; Aurisano, A; Eusebi, R; Golyash, A; Gurrola, A; Kamon, T; Nguyen, C N; Pivarski, J; Safonov, A; Sengupta, S; Toback, D; Weinberger, M; Akchurin, N; Berntzon, L; Gumus, K; Jeong, C; Kim, H; Lee, S W; Popescu, S; Roh, Y; Sill, A; Volobouev, I; Washington, E; Wigmans, R; Yazgan, E; Engh, D; Florez, C; Johns, W; Pathak, S; Sheldon, P; Andelin, D; Arenton, M W; Balazs, M; Boutle, S; Buehler, M; Conetti, S; Cox, B; Hirosky, R; Ledovskoy, A; Neu, C; Phillips II, D; Ronquest, M; Yohay, R; Gollapinni, S; Gunthoti, K; Harr, R; Karchin, P E; Mattson, M; Sakharov, A; Anderson, M; Bachtis, M; Bellinger, J N; Carlsmith, D; Crotty, I; Dasu, S; Dutta, S; Efron, J; Feyzi, F; Flood, K; Gray, L; Grogg, K S; Grothe, M; Hall-Wilton, R; Jaworski, M; Klabbers, P; Klukas, J; Lanaro, A; Lazaridis, C; Leonard, J; Loveless, R; Magrans de Abril, M; Mohapatra, A; Ott, G; Polese, G; Reeder, D; Savin, A; Smith, W H; Sourkov, A; Swanson, J; Weinberg, M; Wenman, D; Wensveen, M; White, A

    2010-01-01

    The CMS Collaboration conducted a month-long data taking exercise, the Cosmic Run At Four Tesla, during October-November 2008, with the goal of commissioning the experiment for extended operation. With all installed detector systems participating, CMS recorded 270 million cosmic ray events with the solenoid at a magnetic field strength of 3.8 T. This paper describes the data flow from the detector through the various online and offline computing systems, as well as the workflows used for recording the data, for aligning and calibrating the detector, and for analysis of the data.

  19. Integrating workflow and project management systems for PLM applications

    Directory of Open Access Journals (Sweden)

    Fabio Fonseca Pereira de Paula

    2008-07-01

    Full Text Available The adoption of the Product Life-cycle Management Systems (PLM) concept is fundamental to improving product development, mainly for small and medium enterprises (SMEs). One of the challenges is the integration between project management and product data management functions. The paper presents an analysis of the potential integration strategies for a specific product data management system (SMARTEAM) and a project management system (Microsoft Project), which are commonly used by SMEs. Finally, the article presents some considerations about the study of project management solutions in SME companies, considering the PLM approach. Key-words: integration, project management (PM), workflow, PDM, PLM.

  20. Reference and PDF-manager software: complexities, support and workflow.

    Science.gov (United States)

    Mead, Thomas L; Berryman, Donna R

    2010-10-01

    In the past, librarians taught reference management by training library users to use established software programs such as RefWorks or EndNote. In today's environment, there is a proliferation of Web-based programs that are being used by library clientele that offer a new twist on the well-known reference management programs. Basically, these new programs are PDF-manager software (e.g., Mendeley or Papers). Librarians are faced with new questions, issues, and concerns, given the new workflows and pathways that these PDF-manager programs present. This article takes a look at some of those.

  1. SMITH: a LIMS for handling next-generation sequencing workflows.

    Science.gov (United States)

    Venco, Francesco; Vaskin, Yuriy; Ceol, Arnaud; Muller, Heiko

    2014-01-01

    Life-science laboratories make increasing use of Next Generation Sequencing (NGS) for studying bio-macromolecules and their interactions. Array-based methods for measuring gene expression or protein-DNA interactions are being replaced by RNA-Seq and ChIP-Seq. Sequencing is generally performed by specialized facilities that have to keep track of sequencing requests, trace samples, ensure quality and make data available according to predefined privileges. An integrated tool helps to troubleshoot problems, maintain a high quality standard, and reduce time and costs. Commercial and non-commercial tools called LIMS (Laboratory Information Management Systems) are available for this purpose. However, they often come at prohibitive cost and/or lack the flexibility and scalability needed to adjust seamlessly to the frequently changing protocols employed. In order to manage the flow of sequencing data produced at the Genomic Unit of the Italian Institute of Technology (IIT), we developed SMITH (Sequencing Machine Information Tracking and Handling). SMITH is a web application with a MySQL server at the backend. It was developed by wet-lab scientists of the Centre for Genomic Science and database experts from the Politecnico of Milan in the context of a Genomic Data Model Project. The database schema stores all the information of an NGS experiment, including the descriptions of all protocols and algorithms used in the process. Notably, an attribute-value table allows an unconstrained textual description to be associated with each sample and with all the data produced afterwards. This method permits the creation of metadata that can be used to search the database for specific files as well as for statistical analyses. SMITH runs automatically and limits direct human interaction mainly to administrative tasks. SMITH data-delivery procedures were standardized, making it easier for biologists and analysts to navigate the data. Automation also helps save time. The workflows are available

  2. Translating Unstructured Workflow Processes to Readable BPEL: Theory and Implementation

    DEFF Research Database (Denmark)

    van der Aalst, Willibrordus Martinus Pancratius; Lassen, Kristian Bisgaard

    2008-01-01

    The Business Process Execution Language for Web Services (BPEL) has emerged as the de-facto standard for implementing processes. Although intended as a language for connecting web services, its application is not limited to cross-organizational processes. It is expected that in the near future...... and not easy to use by end-users. Therefore, we provide a mapping from Workflow Nets (WF-nets) to BPEL. This mapping builds on the rich theory of Petri nets and can also be used to map other languages (e.g., UML, EPC, BPMN, etc.) onto BPEL. In addition to this we have implemented the algorithm in a tool called...

  3. SMITH: a LIMS for handling next-generation sequencing workflows

    Science.gov (United States)

    2014-01-01

    workflows are available through an API provided by the workflow management system. The parameters and input data are passed to the workflow engine that performs de-multiplexing, quality control, alignments, etc. Conclusions SMITH standardizes, automates, and speeds up sequencing workflows. Annotation of data with key-value pairs facilitates meta-analysis. PMID:25471934

  4. A three-level atomicity model for decentralized workflow management systems

    Science.gov (United States)

    Ben-Shaul, Israel Z.; Heineman, George T.

    1996-12-01

    A workflow management system (WFMS) employs a workflow manager (WM) to execute and automate the various activities within a workflow. To protect the consistency of data, the WM encapsulates each activity with a transaction; a transaction manager (TM) then guarantees the atomicity of activities. Since workflows often group several activities together, the TM is responsible for guaranteeing the atomicity of these units. There are scalability issues, however, with centralized WFMSs. Decentralized WFMSs provide an architecture for multiple autonomous WFMSs to interoperate, thus accommodating multiple workflows and geographically-dispersed teams. When atomic units are composed of activities spread across multiple WFMSs, however, there is a conflict between global atomicity and local autonomy of each WFMS. This paper describes a decentralized atomicity model that enables workflow administrators to specify the scope of multi-site atomicity based upon the desired semantics of multi-site tasks in the decentralized WFMS. We describe an architecture that realizes our model and execution paradigm.

  5. Observing health professionals' workflow patterns for diabetes care - First steps towards an ontology for EHR services.

    Science.gov (United States)

    Schweitzer, M; Lasierra, N; Hoerbst, A

    2015-01-01

    Increasing flexibility from a user perspective and enabling workflow-based interaction facilitate an easy, user-friendly utilization of EHRs in healthcare professionals' daily work. To offer such versatile EHR functionality, our approach is based on the execution of clinical workflows by means of a composition of semantic web services. The backbone of such an architecture is an ontology which makes it possible to represent clinical workflows and facilitates the selection of suitable services. In this paper we present the methods and results of observations of routine diabetes consultations, conducted in order to identify those workflows and the relations among the included tasks. These workflows were first modeled in BPMN and then generalized. As a following step in our study, interviews will be conducted with clinical personnel to validate the modeled workflows.

  6. Workflow as a Service in the Cloud: Architecture and Scheduling Algorithms.

    Science.gov (United States)

    Wang, Jianwu; Korambath, Prakashan; Altintas, Ilkay; Davis, Jim; Crawl, Daniel

    2014-01-01

    With more and more workflow systems adopting the cloud as their execution environment, it becomes increasingly challenging to efficiently manage various workflows, virtual machines (VMs) and workflow execution on VM instances. To make the system scalable and easy to extend, we design a Workflow as a Service (WFaaS) architecture with independent services. A core part of the architecture is how to efficiently respond to continuous workflow requests from users and schedule their executions in the cloud. Based on different targets, we propose four heuristic workflow scheduling algorithms for the WFaaS architecture, and analyze the differences and best usages of the algorithms in terms of performance, cost and the price/performance ratio via experimental studies.
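
    As a rough illustration of the kind of heuristic the abstract describes, the sketch below assigns an incoming workflow request to the cheapest VM type that still meets its deadline. The VM catalogue, prices and speed-ups are invented for this example and are not taken from the paper.

# Illustrative sketch only: a greedy deadline-aware heuristic for assigning a
# workflow request to a VM type, in the spirit of the WFaaS scheduling problem.
# VM names, prices and speed-ups are invented for this example.
from dataclasses import dataclass

@dataclass
class VmType:
    name: str
    price_per_hour: float   # cost of one instance hour
    speedup: float          # relative execution speed (1.0 = baseline)

def choose_vm(vm_types, baseline_hours, deadline_hours):
    """Among VM types that meet the deadline, pick the one with the lowest
    total cost (a crude price/performance criterion)."""
    feasible = [vm for vm in vm_types if baseline_hours / vm.speedup <= deadline_hours]
    if not feasible:
        return None  # no single VM type can meet the deadline
    return min(feasible, key=lambda vm: (baseline_hours / vm.speedup) * vm.price_per_hour)

if __name__ == "__main__":
    catalogue = [VmType("small", 0.05, 1.0), VmType("medium", 0.12, 2.2), VmType("large", 0.30, 4.0)]
    print(choose_vm(catalogue, baseline_hours=10.0, deadline_hours=3.0))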

  7. Inter-observer reliability assessments in time motion studies: the foundation for meaningful clinical workflow analysis.

    Science.gov (United States)

    Lopetegui, Marcelo A; Bai, Shasha; Yen, Po-Yin; Lai, Albert; Embi, Peter; Payne, Philip R O

    2013-01-01

    Understanding clinical workflow is critical for researchers and healthcare decision makers. Current workflow studies tend to oversimplify and underrepresent the complexity of clinical workflow. Continuous observation time motion studies (TMS) could enhance clinical workflow studies by providing rich quantitative data required for in-depth workflow analyses. However, methodological inconsistencies have been reported in continuous observation TMS, potentially reducing the validity of TMS' data and limiting their contribution to the general state of knowledge. We believe that a cornerstone in standardizing TMS is to ensure the reliability of the human observers. In this manuscript we review the approaches for inter-observer reliability assessment (IORA) in a representative sample of TMS focusing on clinical workflow. We found that IORA is an uncommon practice, inconsistently reported, and often uses methods that provide partial and overestimated measures of agreement. Since a comprehensive approach to IORA is yet to be proposed and validated, we provide initial recommendations for IORA reporting in continuous observation TMS.

  8. Workflow as a Service in the Cloud: Architecture and Scheduling Algorithms

    Science.gov (United States)

    Wang, Jianwu; Korambath, Prakashan; Altintas, Ilkay; Davis, Jim; Crawl, Daniel

    2017-01-01

    With more and more workflow systems adopting the cloud as their execution environment, it becomes increasingly challenging to efficiently manage various workflows, virtual machines (VMs) and workflow execution on VM instances. To make the system scalable and easy to extend, we design a Workflow as a Service (WFaaS) architecture with independent services. A core part of the architecture is how to efficiently respond to continuous workflow requests from users and schedule their executions in the cloud. Based on different targets, we propose four heuristic workflow scheduling algorithms for the WFaaS architecture, and analyze the differences and best usages of the algorithms in terms of performance, cost and the price/performance ratio via experimental studies. PMID:29399237

  9. On the Support of Scientific Workflows over Pub/Sub Brokers

    Directory of Open Access Journals (Sweden)

    Edwin Cedeño

    2013-08-01

    Full Text Available The execution of scientific workflows is gaining importance as more computing resources are available in the form of grid environments. The Publish/Subscribe paradigm offers well-proven solutions for sustaining distributed scenarios while maintaining the high level of task decoupling required by scientific workflows. In this paper, we propose a new model for supporting scientific workflows that improves the dissemination of control events. The proposed solution is based on the mapping of workflow tasks to the underlying Pub/Sub event layer, and the definition of interfaces and procedures for execution on brokers. In this paper we also analyze the strengths and weaknesses of current solutions that are based on existing message exchange models for scientific workflows. Finally, we explain how our model improves the information dissemination, event filtering, task decoupling and the monitoring of scientific workflows.

  10. On the support of scientific workflows over Pub/Sub brokers.

    Science.gov (United States)

    Morales, Augusto; Robles, Tomas; Alcarria, Ramon; Cedeño, Edwin

    2013-08-20

    The execution of scientific workflows is gaining importance as more computing resources are available in the form of grid environments. The Publish/Subscribe paradigm offers well-proven solutions for sustaining distributed scenarios while maintaining the high level of task decoupling required by scientific workflows. In this paper, we propose a new model for supporting scientific workflows that improves the dissemination of control events. The proposed solution is based on the mapping of workflow tasks to the underlying Pub/Sub event layer, and the definition of interfaces and procedures for execution on brokers. In this paper we also analyze the strengths and weaknesses of current solutions that are based on existing message exchange models for scientific workflows. Finally, we explain how our model improves the information dissemination, event filtering, task decoupling and the monitoring of scientific workflows.
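
    A minimal sketch of the mapping idea, assuming a simple in-memory broker (the papers target distributed Pub/Sub brokers): each workflow task subscribes to the completion topics of its predecessors and publishes its own completion event, so control flow is carried entirely by the event layer. The Broker and Task classes and the topic naming are hypothetical, not the model proposed in the papers.

# Minimal in-memory illustration of mapping workflow control flow onto a
# Publish/Subscribe layer; classes and topic names are hypothetical.
from collections import defaultdict

class Broker:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self.subscribers[topic]:
            callback(message)

class Task:
    def __init__(self, name, broker, depends_on=()):
        self.name, self.broker = name, broker
        self.pending = set(depends_on)
        for dep in depends_on:
            broker.subscribe(f"done/{dep}", self.on_dependency_done)

    def on_dependency_done(self, message):
        self.pending.discard(message["task"])
        if not self.pending:          # all predecessors have published completion
            self.run()

    def run(self):
        print(f"running {self.name}")
        self.broker.publish(f"done/{self.name}", {"task": self.name})

if __name__ == "__main__":
    b = Broker()
    extract = Task("extract", b)
    Task("transform", b, depends_on=["extract"])
    Task("load", b, depends_on=["transform"])
    extract.run()   # the source task is triggered explicitly; the rest cascade via events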

  11. IPNS data acquisition system

    International Nuclear Information System (INIS)

    Worlton, T.G.; Crawford, R.K.; Haumann, J.R.; Daly, R.

    1983-01-01

    The IPNS Data Acquisition System (DAS) was designed to be reliable, flexible, and easy to use. It provides unique methods of acquiring Time-of-Flight neutron scattering data and allows collection, storage, display, and analysis of very large data arrays with a minimum of user input. Data can be collected from normal detectors, linear position-sensitive detectors, and/or area detectors. The data can be corrected for time-delays and can be time-focussed before being binned. Corrections to be made to the data and selection of inputs to be summed are entirely software controlled, as are the time ranges and resolutions for each detector element. Each system can be configured to collect data into millions of channels. Maximum continuous data rates are greater than 2000 counts/sec with full corrections, or 16,000 counts/sec for the simpler binning scheme used with area detectors. Live displays of the data may be made as a function of time, wavevector, wavelength, lattice spacing, or energy. In most cases the complete data analysis can be done on the DAS host computer. The IPNS DAS became operational for four neutron scattering instruments in 1981 and has since been expanded to seven instruments
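
    To make the per-detector correction and binning concrete, the following sketch applies a time-delay correction and histograms time-of-flight events into channels. It is an illustration only; the detector count, delays and bin settings are invented, and the real IPNS DAS also supports time focusing, position-sensitive detectors and live displays.

# Illustrative time-of-flight binning with a per-detector time-delay correction.
# Constants and channel layouts are invented for the example.
import numpy as np

def bin_tof_events(detector_ids, tof_us, delays_us, t_min=0.0, t_max=20000.0, n_bins=1000):
    """Return a (n_detectors, n_bins) histogram of corrected time-of-flight."""
    n_detectors = len(delays_us)
    hist = np.zeros((n_detectors, n_bins), dtype=np.int64)
    corrected = tof_us - delays_us[detector_ids]                     # per-detector delay correction
    channels = ((corrected - t_min) / (t_max - t_min) * n_bins).astype(int)
    valid = (channels >= 0) & (channels < n_bins)
    np.add.at(hist, (detector_ids[valid], channels[valid]), 1)       # accumulate counts per channel
    return hist

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    det = rng.integers(0, 4, size=100_000)                           # events from 4 detectors
    tof = rng.uniform(0.0, 20000.0, size=100_000)                    # time of flight in microseconds
    delays = np.array([12.0, 15.5, 9.8, 11.2])                       # invented per-detector delays
    counts = bin_tof_events(det, tof, delays)
    print(counts.sum(), counts.shape)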

  12. Profiling of lipid species by normal-phase liquid chromatography, nanoelectrospray ionization, and ion trap-orbitrap mass spectrometry

    DEFF Research Database (Denmark)

    Sokol, Elena; Almeida, Reinaldo; Hannibal-Bach, Hans Kristian

    2013-01-01

    Detailed analysis of lipid species can be challenging due to their structural diversity and wide concentration range in cells, tissues, and biofluids. To address these analytical challenges, we devised a reproducible, sensitive, and integrated lipidomics workflow based on normal-phase liquid......) routine for characterizing the fatty acid moieties of identified lipid species. We benchmarked the performance of the workflow by characterizing the chromatographic properties of the LC-MS system for general lipid analysis. In addition, we demonstrate the efficacy of the workflow by reporting a study...

  13. Incorporating Workflow Interference in Facility Layout Design: The Quartic Assignment Problem

    OpenAIRE

    Wen-Chyuan Chiang; Panagiotis Kouvelis; Timothy L. Urban

    2002-01-01

    Although many authors have noted the importance of minimizing workflow interference in facility layout design, traditional layout research tends to focus on minimizing the distance-based transportation cost. This paper formalizes the concept of workflow interference from a facility layout perspective. A model, formulated as a quartic assignment problem, is developed that explicitly considers the interference of workflow. Optimal and heuristic solution methodologies are developed and evaluated.
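
    For orientation, the sketch below evaluates a quartic-assignment-style objective for a candidate layout: interference is accumulated over pairs of flows whose assigned straight-line paths cross, and each such pair contributes the product of the two flow volumes, which is what makes the objective quartic in the assignment. The crossing-path interference term is a stand-in for illustration, not the paper's actual interference model.

# Sketch of evaluating a quartic-assignment-style objective for workflow
# interference in a facility layout; the interference term is a placeholder.
from itertools import combinations

def segments_intersect(p1, p2, p3, p4):
    # strict segment-crossing test via orientation (ignores collinear touching)
    def ccw(a, b, c):
        return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
    return (ccw(p1, p3, p4) * ccw(p2, p3, p4) < 0) and (ccw(p3, p1, p2) * ccw(p4, p1, p2) < 0)

def interference_cost(flow, assignment, coords):
    """flow[i][j]  : material/work flow between departments i and j
       assignment  : assignment[i] = location index of department i
       coords      : (x, y) coordinates of each location"""
    flows = [(i, j, flow[i][j]) for i in range(len(flow)) for j in range(len(flow))
             if i != j and flow[i][j] > 0]
    total = 0.0
    for (i, j, fij), (k, l, fkl) in combinations(flows, 2):
        a, b = coords[assignment[i]], coords[assignment[j]]
        c, d = coords[assignment[k]], coords[assignment[l]]
        if segments_intersect(a, b, c, d):
            total += fij * fkl          # quartic term: pairwise product of flows
    return total

if __name__ == "__main__":
    flow = [[0, 5, 0, 0], [0, 0, 0, 0], [0, 0, 0, 4], [0, 0, 0, 0]]
    coords = [(0, 0), (2, 2), (0, 2), (2, 0)]
    print(interference_cost(flow, assignment=[0, 1, 2, 3], coords=coords))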

  14. Data mining workflow templates for intelligent discovery assistance in RapidMiner

    OpenAIRE

    Kietz, J U; Serban, F; Bernstein, A; Fischer, S

    2010-01-01

    Knowledge Discovery in Databases (KDD) has evolved during the last years and reached a mature stage offering plenty of operators to solve complex tasks. User support for building workflows, in contrast, has not increased proportionally. The large number of operators available in current KDD systems make it difficult for users to successfully analyze data. Moreover, workflows easily contain a large number of operators and parts of the workflows are applied several times, thus it is hard for us...

  15. Restructuring of workflows to minimise errors via stochastic model checking: An automated evolutionary approach

    International Nuclear Information System (INIS)

    Herbert, L.T.; Hansen, Z.N.L.

    2016-01-01

    This paper presents a framework for the automated restructuring of stochastic workflows to reduce the impact of faults. The framework allows for the modelling of workflows by means of a formalised subset of the BPMN workflow language. We extend this modelling formalism to describe faults and incorporate an intention preserving stochastic semantics able to model both probabilistic- and non-deterministic behaviour. Stochastic model checking techniques are employed to generate the state-space of a given workflow. Possible improvements obtained by restructuring are measured by employing the framework's capacity for tracking real-valued quantities associated with states and transitions of the workflow. The space of possible restructurings of a workflow is explored by means of an evolutionary algorithm, where the goals for improvement are defined in terms of optimising quantities, typically employed to model resources, associated with a workflow. The approach is fully automated and only the modelling of the production workflows, potential faults and the expression of the goals require manual input. We present the design of a software tool implementing this framework and explore the practical utility of this approach through an industrial case study in which the risk of production failures and their impact are reduced by restructuring the workflow. - Highlights: • We present a framework which allows for the automated restructuring of workflows. • This framework seeks to minimise the impact of errors on the workflow. • We illustrate a scalable software implementation of this framework. • We explore the practical utility of this approach through an industry case. • The impact of errors can be substantially reduced by restructuring the workflow.
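
    A heavily simplified sketch of the search loop follows: in the real framework each candidate restructuring is evaluated by stochastic model checking, whereas here a plain cost function over task orderings stands in for that evaluation, purely for illustration.

# Toy evolutionary loop over workflow orderings, standing in for the automated
# restructuring described above; the model-checking evaluation is replaced by a
# simple cost function, and all tasks and probabilities are invented.
import random

def expected_impact(ordering, fault_prob, rework_cost):
    """Penalise placing failure-prone tasks late: a fault forces rework of
    everything already done (a crude stand-in for the stochastic semantics)."""
    return sum(fault_prob[t] * rework_cost * position
               for position, t in enumerate(ordering))

def evolve(tasks, fault_prob, rework_cost=1.0, population=30, generations=200, seed=1):
    rng = random.Random(seed)
    pop = [rng.sample(tasks, len(tasks)) for _ in range(population)]
    for _ in range(generations):
        pop.sort(key=lambda o: expected_impact(o, fault_prob, rework_cost))
        survivors = pop[: population // 2]
        children = []
        for parent in survivors:
            child = parent[:]
            i, j = rng.sample(range(len(child)), 2)   # mutation: swap two tasks
            child[i], child[j] = child[j], child[i]
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda o: expected_impact(o, fault_prob, rework_cost))

if __name__ == "__main__":
    tasks = ["cut", "weld", "paint", "inspect"]
    fault_prob = {"cut": 0.01, "weld": 0.15, "paint": 0.05, "inspect": 0.02}
    print(evolve(tasks, fault_prob))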

  16. BIM Workflow for Mechanical Ventilation Design : Object-Based Modeling with Autodesk Revit®

    OpenAIRE

    Bonduel, Mathias

    2016-01-01

    This study is conducted for the Belgian engineering firm CENERGIE, whose main business activities are within the fields of building systems and sustainable buildings. The company wants to change its current design workflows to adopt the use of Building Information Modeling (BIM) with Autodesk Revit. The research focused on the development of a BIM workflow where no models are exchanged between building partners. The aim of this study was to develop such a Revit BIM workflow for the desig...

  17. Design and implementation of a secure workflow system based on PKI/PMI

    Science.gov (United States)

    Yan, Kai; Jiang, Chao-hui

    2013-03-01

    Traditional workflow systems have the following weaknesses in privilege management: low privilege management efficiency, an overburdened administrator, and a lack of a trusted authority. A secure workflow model based on PKI/PMI is therefore proposed after studying the security requirements of workflow systems in depth. This model can achieve static and dynamic authorization by verifying a user's identity through a public key certificate (PKC) and validating the user's privilege information by means of an attribute certificate (AC) in the workflow system. Practice shows that this system can meet the security requirements of a WfMS. Moreover, it not only improves system security, but also ensures integrity, confidentiality, availability and non-repudiation of the data in the system.

  18. Hermes: Seamless delivery of containerized bioinformatics workflows in hybrid cloud (HTC environments

    Directory of Open Access Journals (Sweden)

    Athanassios M. Kintsakis

    2017-01-01

    Full Text Available Hermes introduces a new “describe once, run anywhere” paradigm for the execution of bioinformatics workflows in hybrid cloud environments. It combines the traditional features of parallelization-enabled workflow management systems and of distributed computing platforms in a container-based approach. It offers seamless deployment, overcoming the burden of setting up and configuring the software and network requirements. Most importantly, Hermes fosters the reproducibility of scientific workflows by supporting standardization of the software execution environment, thus leading to consistent scientific workflow results and accelerating scientific output.

  19. Hermes: Seamless delivery of containerized bioinformatics workflows in hybrid cloud (HTC) environments

    Science.gov (United States)

    Kintsakis, Athanassios M.; Psomopoulos, Fotis E.; Symeonidis, Andreas L.; Mitkas, Pericles A.

    Hermes introduces a new "describe once, run anywhere" paradigm for the execution of bioinformatics workflows in hybrid cloud environments. It combines the traditional features of parallelization-enabled workflow management systems and of distributed computing platforms in a container-based approach. It offers seamless deployment, overcoming the burden of setting up and configuring the software and network requirements. Most importantly, Hermes fosters the reproducibility of scientific workflows by supporting standardization of the software execution environment, thus leading to consistent scientific workflow results and accelerating scientific output.

  20. How to plan workflow changes: a practical quality improvement tool used in an outpatient hospital pharmacy.

    Science.gov (United States)

    Aguilar, Christine; Chau, Connie; Giridharan, Neha; Huh, Youchin; Cooley, Janet; Warholak, Terri L

    2013-06-01

    A quality improvement tool is provided to improve pharmacy workflow with the goal of minimizing errors caused by workflow issues. This study involved workflow evaluation and reorganization, and staff opinions of these proposed changes. The study pharmacy was an outpatient pharmacy in the Tucson area. However, the quality improvement tool may be applied in all pharmacy settings, including but not limited to community, hospital, and independent pharmacies. This tool can help the user to identify potential workflow problem spots, such as high-traffic areas through the creation of current and proposed workflow diagrams. Creating a visual representation can help the user to identify problem spots and to propose changes to optimize workflow. It may also be helpful to assess employees' opinions of these changes. The workflow improvement tool can be used to assess where improvements are needed in a pharmacy's floor plan and workflow. Suggestions for improvements in the study pharmacy included increasing the number of verification points and decreasing high traffic areas in the workflow. The employees of the study pharmacy felt that the proposed changes displayed greater continuity, sufficiency, accessibility, and space within the pharmacy.

  1. Semi-Automatic Science Workflow Synthesis for High-End Computing on the NASA Earth Exchange

    Data.gov (United States)

    National Aeronautics and Space Administration — Enhance capabilities for collaborative data analysis and modeling in Earth sciences. Develop components for automatic workflow capture, archiving and management....

  2. Detecting dissonance in clinical and research workflow for translational psychiatric registries.

    Science.gov (United States)

    Cofiel, Luciana; Bassi, Débora U; Ray, Ryan Kumar; Pietrobon, Ricardo; Brentani, Helena

    2013-01-01

    The interplay between the workflow for clinical tasks and research data collection is often overlooked, ultimately making it ineffective. To the best of our knowledge, no previous studies have developed standards that allow for the comparison of workflow models derived from clinical and research tasks toward the improvement of data collection processes. In this study we used the term dissonance for occurrences where there was discord between clinical and research workflows. We developed workflow models for a translational research study in psychiatry and the clinic where its data collection was carried out. After identifying points of dissonance between clinical and research models we derived a corresponding classification system that ultimately enabled us to re-engineer the data collection workflow. We considered (1) the number of patients approached for enrollment and (2) the number of patients enrolled in the study as indicators of efficiency in research workflow. We also recorded the number of dissonances before and after the workflow modification. We identified 22 episodes of dissonance across 6 dissonance categories: actor, communication, information, artifact, time, and space. We were able to eliminate 18 episodes of dissonance and increase the number of patients approached and enrolled in the research study through workflow modification. The classification developed in this study is useful for guiding the identification of dissonances and revealing the modifications required to align the workflow of data collection with the clinical setting. The methodology described in this study can be used by researchers to standardize the data collection process.

  3. A Semi-Automated Workflow Solution for Data Set Publication

    Directory of Open Access Journals (Sweden)

    Suresh Vannan

    2016-03-01

    Full Text Available To address the need for published data, considerable effort has gone into formalizing the process of data publication. From funding agencies to publishers, data publication has rapidly become a requirement. Digital Object Identifiers (DOIs) and data citations have enhanced the integration and availability of data. The challenge facing data publishers now is to deal with the increased number of publishable data products and, most importantly, the difficulties of publishing diverse data products into an online archive. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC), a NASA-funded data center, faces these challenges as it deals with data products created by individual investigators. This paper summarizes the challenges of curating data and provides a summary of a workflow solution that ORNL DAAC research and technical staff have created to deal with the publication of these diverse data products. The workflow solution presented here is generic and can be applied to data from any scientific domain and data located at any data center.

  4. Workflow of the Grover algorithm simulation incorporating CUDA and GPGPU

    Science.gov (United States)

    Lu, Xiangwen; Yuan, Jiabin; Zhang, Weiwei

    2013-09-01

    The Grover quantum search algorithm, one of only a few representative quantum algorithms, can speed up many classical algorithms that use search heuristics. No true quantum computer has yet been developed. For the present, simulation is one effective means of verifying the search algorithm. In this work, we focus on the simulation workflow using a compute unified device architecture (CUDA). Two simulation workflow schemes are proposed. These schemes combine the characteristics of the Grover algorithm and the parallelism of general-purpose computing on graphics processing units (GPGPU). We also analyzed the optimization of memory space and memory access from this perspective. We implemented four programs on CUDA to evaluate the performance of schemes and optimization. Through experimentation, we analyzed the organization of threads suited to Grover algorithm simulations, compared the storage costs of the four programs, and validated the effectiveness of optimization. Experimental results also showed that the distinguished program on CUDA outperformed the serial program of libquantum on a CPU with a speedup of up to 23 times (12 times on average), depending on the scale of the simulation.
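
    For orientation, the core of such a simulation is the repeated application of the oracle and diffusion operators to a state vector; the serial NumPy sketch below shows the logic that a CUDA implementation would parallelise across the 2^n amplitude array. This is not the authors' code.

# Serial NumPy sketch of a Grover search simulation; a CUDA/GPGPU version
# would parallelise the element-wise oracle and diffusion steps over the
# 2**n amplitude array.
import numpy as np

def grover_search(n_qubits, marked_index):
    dim = 2 ** n_qubits
    state = np.full(dim, 1.0 / np.sqrt(dim))          # uniform superposition
    iterations = int(np.floor(np.pi / 4 * np.sqrt(dim)))
    for _ in range(iterations):
        state[marked_index] *= -1.0                   # oracle: flip the marked amplitude
        mean = state.mean()
        state = 2.0 * mean - state                    # diffusion: inversion about the mean
    return int(np.argmax(np.abs(state) ** 2)), float(np.abs(state[marked_index]) ** 2)

if __name__ == "__main__":
    found, probability = grover_search(n_qubits=10, marked_index=427)
    print(found, round(probability, 4))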

  5. An ontological knowledge framework for adaptive medical workflow.

    Science.gov (United States)

    Dang, Jiangbo; Hedayati, Amir; Hampel, Ken; Toklu, Candemir

    2008-10-01

    As emerging technologies, the semantic Web and SOA (Service-Oriented Architecture) allow a BPMS (Business Process Management System) to automate business processes that can be described as services, which in turn can be used to wrap existing enterprise applications. A BPMS provides tools and methodologies to compose Web services that can be executed as business processes and monitored by BPM (Business Process Management) consoles. An ontology is a formal declarative knowledge representation model. It provides a foundation upon which machine-understandable knowledge can be obtained and, as a result, makes machine intelligence possible. Healthcare systems can adopt these technologies to become ubiquitous, adaptive, and intelligent, and thereby serve patients better. This paper presents an ontological knowledge framework that covers the healthcare domains that a hospital encompasses, from medical and administrative tasks to hospital assets, medical insurance, patient records, drugs, and regulations. Our ontology therefore makes our vision of personalized healthcare possible by capturing all the knowledge necessary for a complex personalized healthcare scenario involving patient care, insurance policies, drug prescriptions, and compliance. For example, our ontology enables a workflow management system to allow users, from physicians to administrative assistants, to manage and even create new context-aware medical workflows and execute them on the fly.

  6. Integration of implant planning workflows into the PACS infrastructure

    Science.gov (United States)

    Gessat, Michael; Strauß, Gero; Burgert, Oliver

    2008-03-01

    The integration of imaging devices, diagnostic workstations, and image servers into Picture Archiving and Communication Systems (PACS) has had an enormous effect on the efficiency of radiology workflows. The standardization of the information exchange between the devices with the DICOM standard has been an essential precondition for that development. For surgical procedures, no such infrastructure exists. With the increasingly important role computerized planning and assistance systems play in the surgical domain, an infrastructure that unifies the communication between devices becomes necessary. In recent publications, the need for a modularized system design has been established. A reference architecture for a Therapy Imaging and Model Management System (TIMMS) has been proposed. It was accepted by the DICOM Working Group 6 as the reference architecture for DICOM developments for surgery. In this paper we propose the inclusion of implant planning systems into the PACS infrastructure. We propose a generic information model for the patient specific selection and positioning of implants from a repository according to patient image data. The information models are based on clinical workflows from ENT, cardiac, and orthopedic surgery as well as technical requirements derived from different use cases and systems. We show an exemplary implementation of the model for application in ENT surgery: the selection and positioning of an ossicular implant in the middle ear. An implant repository is stored in the PACS. It makes use of an experimental implementation of the Surface Mesh Module that is currently being developed as extension to the DICOM standard.

  7. Automation and workflow considerations for embedding Digimarc Barcodes at scale

    Science.gov (United States)

    Rodriguez, Tony; Haaga, Don; Calhoon, Sean

    2015-03-01

    The Digimarc® Barcode is a digital watermark applied to packages and variable data labels that carries GS1 standard GTIN-14 data traditionally carried by a 1-D barcode. The Digimarc Barcode can be read with smartphones and imaging-based barcode readers commonly used in grocery and retail environments. Using smartphones, consumers can engage with products and retailers can materially increase the speed of check-out, increasing store margins and providing a better experience for shoppers. Internal testing has shown an average of 53% increase in scanning throughput, enabling 100's of millions of dollars in cost savings [1] for retailers when deployed at scale. To get to scale, the process of embedding a digital watermark must be automated and integrated within existing workflows. Creating the tools and processes to do so represents a new challenge for the watermarking community. This paper presents a description and an analysis of the workflow implemented by Digimarc to deploy the Digimarc Barcode at scale. An overview of the tools created and lessons learned during the introduction of technology to the market are provided.

  8. Autonomic Management of Application Workflows on Hybrid Computing Infrastructure

    Directory of Open Access Journals (Sweden)

    Hyunjoo Kim

    2011-01-01

    Full Text Available In this paper, we present a programming and runtime framework that enables the autonomic management of complex application workflows on hybrid computing infrastructures. The framework is designed to address system and application heterogeneity and dynamics to ensure that application objectives and constraints are satisfied. The need for such autonomic system and application management is becoming critical as computing infrastructures become increasingly heterogeneous, integrating different classes of resources from high-end HPC systems to commodity clusters and clouds. For example, the framework presented in this paper can be used to provision the appropriate mix of resources based on application requirements and constraints. The framework also monitors the system/application state and adapts the application and/or resources to respond to changing requirements or environment. To demonstrate the operation of the framework and to evaluate its ability, we employ a workflow used to characterize an oil reservoir executing on a hybrid infrastructure composed of TeraGrid nodes and Amazon EC2 instances of various types. Specifically, we show how different application objectives such as acceleration, conservation and resilience can be effectively achieved while satisfying deadline and budget constraints, using an appropriate mix of dynamically provisioned resources. Our evaluations also demonstrate that public clouds can be used to complement and reinforce the scheduling and usage of traditional high performance computing infrastructure.

  9. Towards better digital pathology workflows: programming libraries for high-speed sharpness assessment of Whole Slide Images.

    Science.gov (United States)

    Ameisen, David; Deroulers, Christophe; Perrier, Valérie; Bouhidel, Fatiha; Battistella, Maxime; Legrès, Luc; Janin, Anne; Bertheau, Philippe; Yunès, Jean-Baptiste

    2014-01-01

    Since microscopic slides can now be automatically digitized and integrated in the clinical workflow, quality assessment of Whole Slide Images (WSI) has become a crucial issue. We present a no-reference quality assessment method that has been thoroughly tested since 2010 and is under implementation in multiple sites, both public university-hospitals and private entities. It is part of the FlexMIm R&D project which aims to improve the global workflow of digital pathology. For these uses, we have developed two programming libraries, in Java and Python, which can be integrated in various types of WSI acquisition systems, viewers and image analysis tools. Development and testing have been carried out on a MacBook Pro i7 and on a bi-Xeon 2.7GHz server. Libraries implementing the blur assessment method have been developed in Java, Python, PHP5 and MySQL5. For web applications, JavaScript, Ajax, JSON and Sockets were also used, as well as the Google Maps API. Aperio SVS files were converted into the Google Maps format using VIPS and Openslide libraries. We designed the Java library as a Service Provider Interface (SPI), extendable by third parties. Analysis is computed in real-time (3 billion pixels per minute). Tests were made on 5000 single images, 200 NDPI WSI, 100 Aperio SVS WSI converted to the Google Maps format. Applications based on our method and libraries can be used upstream, as calibration and quality control tool for the WSI acquisition systems, or as tools to reacquire tiles while the WSI is being scanned. They can also be used downstream to reacquire the complete slides that are below the quality threshold for surgical pathology analysis. WSI may also be displayed in a smarter way by sending and displaying the regions of highest quality before other regions. Such quality assessment scores could be integrated as WSI's metadata shared in clinical, research or teaching contexts, for a more efficient medical informatics workflow.
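
    The paper's blur metric is not reproduced here; as a generic illustration of tile-wise, no-reference sharpness scoring, the sketch below uses the variance of a Laplacian response per tile, a common focus measure. The tile size and the synthetic input are assumptions of the example.

# Generic illustration of tile-wise no-reference sharpness scoring for a large
# image, using variance of a Laplacian response; not the FlexMIm libraries.
import numpy as np

LAPLACIAN = np.array([[0, 1, 0],
                      [1, -4, 1],
                      [0, 1, 0]], dtype=float)

def laplacian_variance(tile):
    # correlate the tile with the 3x3 Laplacian kernel and take the variance
    padded = np.pad(tile.astype(float), 1, mode="edge")
    response = sum(LAPLACIAN[i, j] * padded[i:i + tile.shape[0], j:j + tile.shape[1]]
                   for i in range(3) for j in range(3))
    return response.var()

def sharpness_map(image, tile_size=256):
    h, w = image.shape
    scores = {}
    for y in range(0, h - tile_size + 1, tile_size):
        for x in range(0, w - tile_size + 1, tile_size):
            scores[(y, x)] = laplacian_variance(image[y:y + tile_size, x:x + tile_size])
    return scores  # tiles with low scores are candidates for re-acquisition

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    fake_slide = rng.integers(0, 255, size=(1024, 1024)).astype(np.uint8)
    print(min(sharpness_map(fake_slide).values()))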

  10. Towards better digital pathology workflows: programming libraries for high-speed sharpness assessment of Whole Slide Images

    Science.gov (United States)

    2014-01-01

    Background Since microscopic slides can now be automatically digitized and integrated in the clinical workflow, quality assessment of Whole Slide Images (WSI) has become a crucial issue. We present a no-reference quality assessment method that has been thoroughly tested since 2010 and is under implementation in multiple sites, both public university-hospitals and private entities. It is part of the FlexMIm R&D project which aims to improve the global workflow of digital pathology. For these uses, we have developed two programming libraries, in Java and Python, which can be integrated in various types of WSI acquisition systems, viewers and image analysis tools. Methods Development and testing have been carried out on a MacBook Pro i7 and on a bi-Xeon 2.7GHz server. Libraries implementing the blur assessment method have been developed in Java, Python, PHP5 and MySQL5. For web applications, JavaScript, Ajax, JSON and Sockets were also used, as well as the Google Maps API. Aperio SVS files were converted into the Google Maps format using VIPS and Openslide libraries. Results We designed the Java library as a Service Provider Interface (SPI), extendable by third parties. Analysis is computed in real-time (3 billion pixels per minute). Tests were made on 5000 single images, 200 NDPI WSI, 100 Aperio SVS WSI converted to the Google Maps format. Conclusions Applications based on our method and libraries can be used upstream, as calibration and quality control tool for the WSI acquisition systems, or as tools to reacquire tiles while the WSI is being scanned. They can also be used downstream to reacquire the complete slides that are below the quality threshold for surgical pathology analysis. WSI may also be displayed in a smarter way by sending and displaying the regions of highest quality before other regions. Such quality assessment scores could be integrated as WSI's metadata shared in clinical, research or teaching contexts, for a more efficient medical informatics

  11. Scientific Workflows and the Sensor Web for Virtual Environmental Observatories

    Science.gov (United States)

    Simonis, I.; Vahed, A.

    2008-12-01

    interfaces. All data sets and sensor communication follow well-defined abstract models and corresponding encodings, mostly developed by the OGC Sensor Web Enablement initiative. Scientific progress is currently accelerated by an emerging new concept called scientific workflows, which organize and manage complex distributed computations. A scientific workflow represents and records the highly complex processes that a domain scientist typically would follow in exploration, discovery and, ultimately, transformation of raw data into publishable results. The challenge is now to integrate the benefits of scientific workflows with those provided by the Sensor Web in order to leverage all resources for scientific exploration, problem solving, and knowledge generation. Scientific workflows for the Sensor Web represent the next evolutionary step towards efficient, powerful, and flexible earth observation frameworks and platforms. Those platforms support the entire process from capturing data, sharing and integrating, to requesting additional observations. Multiple sites and organizations will participate on single platforms, and scientists from different countries and organizations will interact and contribute to large-scale research projects. Simultaneously, the data and information overload becomes manageable, as multiple layers of abstraction will free scientists from dealing with underlying data, processing or storage peculiarities. The vision is automated investigation and discovery mechanisms that allow scientists to pose queries to the system, which in turn would identify potentially related resources, schedule processing tasks and assemble all parts into workflows that may satisfy the query.

  12. Successful Completion of FY18/Q1 ASC L2 Milestone 6355: Electrical Analysis Calibration Workflow Capability Demonstration.

    Energy Technology Data Exchange (ETDEWEB)

    Copps, Kevin D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-12-01

    The Sandia Analysis Workbench (SAW) project has developed and deployed a production capability for SIERRA computational mechanics analysis workflows. However, the electrical analysis workflow capability requirements have only been demonstrated in early prototype states, with no real capability deployed for analysts’ use. This milestone aims to improve the electrical analysis workflow capability (via SAW and related tools) and deploy it for ongoing use. We propose to focus on a QASPR electrical analysis calibration workflow use case. We will include a number of new capabilities (versus today’s SAW), such as: 1) support for the XYCE code workflow component, 2) data management coupled to electrical workflow, 3) human-in-the-loop workflow capability, and 4) electrical analysis workflow capability deployed on the restricted (and possibly classified) network at Sandia. While far from the complete set of capabilities required for electrical analysis workflow over the long term, this is a substantial first step toward full production support for the electrical analysts.

  13. Low Latency Workflow Scheduling and an Application of Hyperspectral Brightness Temperatures

    Science.gov (United States)

    Nguyen, P. T.; Chapman, D. R.; Halem, M.

    2012-12-01

    the Nino 4 region, as well as a 1.9 Kelvin decadal Arctic warming in the 4u and 12u spectral regions. Additionally, we will present the frequency of extreme global warming events by the use of a normalized maximum BT in a grid cell relative to its local standard deviation. A low-latency Hadoop scheduling environment maintains data integrity and fault tolerance in a MapReduce data intensive Cloud environment while improving the "time to solution" metric by 35% when compared to a more traditional parallel processing system for the same dataset. Our next step will be to improve the usability of our Hadoop task scheduling system, to enable rapid prototyping of data intensive experiments by means of processing "kernels". We will report on the performance and experience of implementing these experiments on the NEX testbed, and propose the use of a graphical directed acyclic graph (DAG) interface to help us develop on-demand scientific experiments. Our workflow system works within Hadoop infrastructure as a replacement for the FIFO or FairScheduler, thus the use of Apache "Pig" latin or other Apache tools may also be worth investigating on the NEX system to improve the usability of our workflow scheduling infrastructure for rapid experimentation.
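
    The normalization mentioned above amounts to a per-grid-cell standardised anomaly; a minimal sketch of that computation is shown below. The array shapes, threshold and synthetic data are illustrative assumptions, not values from the study.

# Minimal sketch of flagging extreme brightness-temperature (BT) events:
# for each grid cell, normalise the maximum BT by that cell's own mean and
# standard deviation over the record.
import numpy as np

def extreme_event_frequency(bt, threshold=3.0):
    """bt: array of shape (time, lat, lon) of brightness temperatures."""
    cell_mean = bt.mean(axis=0)
    cell_std = bt.std(axis=0)
    normalized_max = (bt.max(axis=0) - cell_mean) / cell_std   # per-cell standardised maximum
    return (normalized_max > threshold).mean()                 # fraction of cells flagged extreme

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    bt = 250.0 + 10.0 * rng.standard_normal((120, 90, 180))    # synthetic decade of monthly fields
    print(extreme_event_frequency(bt))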

  14. Galaxy-M: a Galaxy workflow for processing and analyzing direct infusion and liquid chromatography mass spectrometry-based metabolomics data.

    Science.gov (United States)

    Davidson, Robert L; Weber, Ralf J M; Liu, Haoyu; Sharma-Oates, Archana; Viant, Mark R

    2016-01-01

    Metabolomics is increasingly recognized as an invaluable tool in the biological, medical and environmental sciences yet lags behind the methodological maturity of other omics fields. To achieve its full potential, including the integration of multiple omics modalities, the accessibility, standardization and reproducibility of computational metabolomics tools must be improved significantly. Here we present our end-to-end mass spectrometry metabolomics workflow in the widely used platform, Galaxy. Named Galaxy-M, our workflow has been developed for both direct infusion mass spectrometry (DIMS) and liquid chromatography mass spectrometry (LC-MS) metabolomics. The range of tools presented spans from processing of raw data, e.g. peak picking and alignment, through data cleansing, e.g. missing value imputation, to preparation for statistical analysis, e.g. normalization and scaling, and principal components analysis (PCA) with associated statistical evaluation. We demonstrate the ease of using these Galaxy workflows via the analysis of DIMS and LC-MS datasets, and provide PCA scores and associated statistics to help other users to ensure that they can accurately repeat the processing and analysis of these two datasets. Galaxy and data are all provided pre-installed in a virtual machine (VM) that can be downloaded from the GigaDB repository. Additionally, source code, executables and installation instructions are available from GitHub. The Galaxy platform has enabled us to produce an easily accessible and reproducible computational metabolomics workflow. More tools could be added by the community to expand its functionality. We recommend that Galaxy-M workflow files are included within the supplementary information of publications, enabling metabolomics studies to achieve greater reproducibility.
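
    To give a rough sense of the final preparation-and-analysis stage, the sketch below applies probabilistic quotient normalisation, Pareto scaling and PCA to a peak-intensity matrix. It is a generic illustration, not the Galaxy-M tool code, and the synthetic data are assumptions of the example.

# Rough sketch of the tail end of a metabolomics workflow: probabilistic
# quotient normalisation, Pareto scaling and PCA on a peak-intensity matrix.
import numpy as np

def pqn_normalise(x):
    """Probabilistic quotient normalisation against the median spectrum."""
    reference = np.median(x, axis=0)
    quotients = np.median(x / reference, axis=1, keepdims=True)
    return x / quotients

def pareto_scale(x):
    centred = x - x.mean(axis=0)
    return centred / np.sqrt(x.std(axis=0))

def pca_scores(x, n_components=2):
    # PCA via SVD of the already-centred, scaled matrix
    u, s, _ = np.linalg.svd(x, full_matrices=False)
    return u[:, :n_components] * s[:n_components]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    intensities = rng.lognormal(mean=5.0, sigma=1.0, size=(20, 500))   # samples x features
    scores = pca_scores(pareto_scale(pqn_normalise(intensities)))
    print(scores.shape)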

  15. Conceptual-level workflow modeling of scientific experiments using NMR as a case study

    Directory of Open Access Journals (Sweden)

    Gryk Michael R

    2007-01-01

    Full Text Available Abstract Background Scientific workflows improve the process of scientific experiments by making computations explicit, underscoring data flow, and emphasizing the participation of humans in the process when intuition and human reasoning are required. Workflows for experiments also highlight transitions among experimental phases, allowing intermediate results to be verified and supporting the proper handling of semantic mismatches and different file formats among the various tools used in the scientific process. Thus, scientific workflows are important for the modeling and subsequent capture of bioinformatics-related data. While much research has been conducted on the implementation of scientific workflows, the initial process of actually designing and generating the workflow at the conceptual level has received little consideration. Results We propose a structured process to capture scientific workflows at the conceptual level that allows workflows to be documented efficiently, results in concise models of the workflow and more-correct workflow implementations, and provides insight into the scientific process itself. The approach uses three modeling techniques to model the structural, data flow, and control flow aspects of the workflow. The domain of biomolecular structure determination using Nuclear Magnetic Resonance spectroscopy is used to demonstrate the process. Specifically, we show the application of the approach to capture the workflow for the process of conducting biomolecular analysis using Nuclear Magnetic Resonance (NMR spectroscopy. Conclusion Using the approach, we were able to accurately document, in a short amount of time, numerous steps in the process of conducting an experiment using NMR spectroscopy. The resulting models are correct and precise, as outside validation of the models identified only minor omissions in the models. In addition, the models provide an accurate visual description of the control flow for conducting

  16. A Workflow to Improve the Alignment of Prostate Imaging with Whole-mount Histopathology.

    Science.gov (United States)

    Yamamoto, Hidekazu; Nir, Dror; Vyas, Lona; Chang, Richard T; Popert, Rick; Cahill, Declan; Challacombe, Ben; Dasgupta, Prokar; Chandra, Ashish

    2014-08-01

    Evaluation of prostate imaging tests against whole-mount histology specimens requires accurate alignment between radiologic and histologic data sets. Misalignment results in false-positive and -negative zones as assessed by imaging. We describe a workflow for three-dimensional alignment of prostate imaging data against whole-mount prostatectomy reference specimens and assess its performance against a standard workflow. Ethical approval was granted. Patients underwent motorized transrectal ultrasound (Prostate Histoscanning) to generate a three-dimensional image of the prostate before radical prostatectomy. The test workflow incorporated steps for axial alignment between imaging and histology, size adjustments following formalin fixation, and use of custom-made parallel cutters and digital caliper instruments. The control workflow comprised freehand cutting and assumed homogeneous block thicknesses at the same relative angles between pathology and imaging sections. Thirty radical prostatectomy specimens were histologically and radiologically processed, either by an alignment-optimized workflow (n = 20) or a control workflow (n = 10). The optimized workflow generated tissue blocks of heterogeneous thicknesses but with no significant drifting in the cutting plane. The control workflow resulted in significantly nonparallel blocks, accurately matching only one out of four histology blocks to their respective imaging data. The image-to-histology alignment accuracy was 20% greater in the optimized workflow (P alignment was observed in the optimized workflow. Evaluation of prostate imaging biomarkers using whole-mount histology references should include a test-to-reference spatial alignment workflow. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.

  17. Provenance for Runtime Workflow Steering and Validation in Computational Seismology

    Science.gov (United States)

    Spinuso, A.; Krischer, L.; Krause, A.; Filgueira, R.; Magnoni, F.; Muraleedharan, V.; David, M.

    2014-12-01

    Provenance systems may be offered by modern workflow engines to collect metadata about the data transformations at runtime. If combined with effective visualisation and monitoring interfaces, these provenance recordings can speed up the validation process of an experiment, suggesting interactive or automated interventions with immediate effects on the lifecycle of a workflow run. For instance, in the field of computational seismology, if we consider research applications performing long lasting cross correlation analysis and high resolution simulations, the immediate notification of logical errors and the rapid access to intermediate results, can produce reactions which foster a more efficient progress of the research. These applications are often executed in secured and sophisticated HPC and HTC infrastructures, highlighting the need for a comprehensive framework that facilitates the extraction of fine grained provenance and the development of provenance aware components, leveraging the scalability characteristics of the adopted workflow engines, whose enactment can be mapped to different technologies (MPI, Storm clusters, etc). This work looks at the adoption of W3C-PROV concepts and data model within a user driven processing and validation framework for seismic data, supporting also computational and data management steering. Validation needs to balance automation with user intervention, considering the scientist as part of the archiving process. Therefore, the provenance data is enriched with community-specific metadata vocabularies and control messages, making an experiment reproducible and its description consistent with the community understandings. Moreover, it can contain user defined terms and annotations. The current implementation of the system is supported by the EU-Funded VERCE (http://verce.eu). It provides, as well as the provenance generation mechanisms, a prototypal browser-based user interface and a web API built on top of a NoSQL storage

  18. Patron-driven acquisitions history and best practices

    CERN Document Server

    2011-01-01

    About 40 percent of the books academic libraries purchase in traditional ways never circulate and another 40 percent circulate fewer than three times. By contrast, patron-driven acquisition allows a library to borrow or buy books only when a patron needs them. In a typical workflow, the library imports bibliographic records into its catalogue at no cost. When a patron finds a patron-driven record in the course of research, a short-term loan can allow him to borrow the book, and the transaction charge to the library will be a small percentage of the list price. Typically, a library will automatically buy a book on a third or fourth use. The contributions in this volume, written by experts, describe the genesis and brief history of patron-driven acquisitions, its current status, and its promise.

  19. Experience from Tore Supra acquisition system and evolutions

    International Nuclear Information System (INIS)

    Guillerminet, B.; Buravand, Y.; Chatelier, E.; Leroux, F.

    2004-01-01

    The Tore Supra tokamak has been upgraded to explore long-duration plasma discharges of up to 1000 s. Since summer 2001, the acquisition system has operated in continuous mode, apart from the data processing, which is still done after the pulse. In the first part, we explore a few solutions for processing the data continuously during the pulse, based on parallel processing on a Linux farm and then on a CONDOR system. The second part is devoted to the Web service exposing the Tore Supra operation. In the last part, the VME acquisition system has been redesigned to keep up with the high data rates required by a few diagnostics. The workflow is now distributed among a few computers. At the end, we give the current status of the realisation and the future planning

  20. Managing Evolving Business Workflows through the Capture of Descriptive Information

    CERN Document Server

    Gaspard, S; Dindeleux, R; McClatchey, R; Gaspard, Sebastien; Estrella, Florida

    2003-01-01

    Business systems these days need to be agile to address the needs of a changing world. In particular the discipline of Enterprise Application Integration requires business process management to be highly reconfigurable with the ability to support dynamic workflows, inter-application integration and process reconfiguration. Basing EAI systems on model-resident or on a so-called description-driven approach enables aspects of flexibility, distribution, system evolution and integration to be addressed in a domain-independent manner. Such a system called CRISTAL is described in this paper with particular emphasis on its application to EAI problem domains. A practical example of the CRISTAL technology in the domain of manufacturing systems, called Agilium, is described to demonstrate the principles of model-driven system evolution and integration. The approach is compared to other model-driven development approaches such as the Model-Driven Architecture of the OMG and so-called Adaptive Object Models.

  1. Data distribution method of workflow in the cloud environment

    Science.gov (United States)

    Wang, Yong; Wu, Junjuan; Wang, Ying

    2017-08-01

    Cloud computing provides workflow applications with high-efficiency computation and large storage capacity, but it also challenges the protection of trade secrets and other private data. Because handling privacy data increases the data transmission time, this paper presents a new data allocation algorithm, based on the degree of collaborative data damage, that improves the existing data allocation strategy: the safety of the public-cloud computation depends on the private cloud, the static allocation method in the initial stage partitions only the non-confidential data (improving on the original approach), and in the operational phase the data distribution scheme is adjusted dynamically as new data is generated. The experimental results show that the improved method is effective in reducing the data transmission time.

  2. The robust schedule - A link to improved workflow

    DEFF Research Database (Denmark)

    Lindhard, Søren; Wandahl, Søren

    2012-01-01

    …-down the contractors and force them to rigorously adhere to the initial schedule. If delayed, the work pace or manpower has to be increased to keep to the schedule. In an attempt to improve productivity, three independent site managers have been interviewed about time scheduling. Their experiences and opinions have been … analyzed, and weaknesses in existing time scheduling have been found. The findings showed a negative side effect of keeping the schedule too tight. A too-tight schedule is inflexible and cannot absorb variability in production. Flexibility is necessary because the contractors are interacting and dependent …. The result is a chaotic, complex and uncontrolled construction site. Furthermore, strict time limits force the workflow to be optimized under non-optimal conditions. Even though productivity seems to be increasing, productivity per man-hour is decreasing, resulting in increased cost. To increase productivity …

  3. Performance studies of CMS workflows using Big Data technologies

    CERN Document Server

    Ambroz, Luca; Grandi, Claudio

    At the Large Hadron Collider (LHC), more than 30 petabytes of data are produced from particle collisions every year of data taking. The data processing requires large volumes of simulated events through Monte Carlo techniques. Furthermore, physics analysis implies daily access to derived data formats by hundreds of users. The Worldwide LHC Computing Grid (WLCG) - an international collaboration involving personnel and computing centers worldwide - is successfully coping with these challenges, enabling the LHC physics program. With the continuation of LHC data taking and the approval of ambitious projects such as the High-Luminosity LHC, such challenges will reach the edge of current computing capacity and performance. One of the keys to success in the next decades - also under severe financial resource constraints - is to optimize the efficiency with which the computing resources are exploited. This thesis focuses on performance studies of CMS workflows, namely centrally scheduled production activities and unpredictable d...

  4. Tools and Workflows for Collaborating on Static Website Projects

    Directory of Open Access Journals (Sweden)

    Kaitlin Newson

    2017-10-01

    Full Text Available Static website generators have seen a significant increase in popularity in recent years, offering many advantages over their dynamic counterparts. While these generators were typically used for blogs, they have grown in usage for other web-based projects, including documentation, conference websites, and image collections. However, because of their technical complexity, these tools can be inaccessible to content creators depending on their level of technical skill and comfort with web development technologies. Drawing from experience with a collaborative static website project, this article will provide an overview of static website generators, review different tools available for managing content, and explore workflows and best practices for collaborating with teams on static website projects.

  5. A Component Based Approach to Scientific Workflow Management

    CERN Document Server

    Le Goff, Jean-Marie; Baker, Nigel; Brooks, Peter; McClatchey, Richard

    2001-01-01

    CRISTAL is a distributed scientific workflow system used in the manufacturing and production phases of HEP experiment construction at CERN. The CRISTAL project has studied the use of a description driven approach, using meta- modelling techniques, to manage the evolving needs of a large physics community. Interest from such diverse communities as bio-informatics and manufacturing has motivated the CRISTAL team to re-engineer the system to customize functionality according to end user requirements but maximize software reuse in the process. The next generation CRISTAL vision is to build a generic component architecture from which a complete software product line can be generated according to the particular needs of the target enterprise. This paper discusses the issues of adopting a component product line based approach and our experiences of software reuse.

  6. A component based approach to scientific workflow management

    International Nuclear Information System (INIS)

    Baker, N.; Brooks, P.; McClatchey, R.; Kovacs, Z.; LeGoff, J.-M.

    2001-01-01

    CRISTAL is a distributed scientific workflow system used in the manufacturing and production phases of HEP experiment construction at CERN. The CRISTAL project has studied the use of a description driven approach, using meta-modelling techniques, to manage the evolving needs of a large physics community. Interest from such diverse communities as bio-informatics and manufacturing has motivated the CRISTAL team to re-engineer the system to customize functionality according to end user requirements but maximize software reuse in the process. The next generation CRISTAL vision is to build a generic component architecture from which a complete software product line can be generated according to the particular needs of the target enterprise. This paper discusses the issues of adopting a component product line based approach and our experiences of software reuse

  7. Optimizing perioperative decision making: improved information for clinical workflow planning.

    Science.gov (United States)

    Doebbeling, Bradley N; Burton, Matthew M; Wiebke, Eric A; Miller, Spencer; Baxter, Laurence; Miller, Donald; Alvarez, Jorge; Pekny, Joseph

    2012-01-01

    Perioperative care is complex and involves multiple interconnected subsystems. Delayed starts, prolonged cases and overtime are common. Surgical procedures account for 40-70% of hospital revenues and 30-40% of total costs. Most planning and scheduling in healthcare is done without modern planning tools, which have the potential to improve access by supporting operations planning. We identified key planning scenarios of interest to perioperative leaders in order to examine the feasibility of applying combinatorial optimization software to some of those planning issues in the operative setting. Perioperative leaders desire a broad range of tools for planning and assessing alternate solutions. Our models generated feasible solutions that varied as expected, based on resource and policy assumptions, and found better utilization of scarce resources. Combinatorial optimization modeling can effectively evaluate alternatives to support key decisions for planning clinical workflow and improving care efficiency and satisfaction.
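
    The study itself used combinatorial optimization software; as a far simpler, purely illustrative sketch of the kind of trade-off involved, the following greedy heuristic assigns invented case durations to operating-room blocks and reports the resulting overtime.

        # Hypothetical illustration only: greedily place the longest cases first,
        # each into the currently least-loaded room, then measure overtime.
        def assign_cases(case_durations_min, n_rooms, block_length_min=480):
            rooms = [[] for _ in range(n_rooms)]
            loads = [0] * n_rooms
            for duration in sorted(case_durations_min, reverse=True):
                room = loads.index(min(loads))
                rooms[room].append(duration)
                loads[room] += duration
            overtime = sum(max(0, load - block_length_min) for load in loads)
            return rooms, loads, overtime

        cases = [120, 90, 240, 60, 180, 45, 150, 75, 210, 30]   # invented durations
        rooms, loads, overtime = assign_cases(cases, n_rooms=3)
        print("room loads (min):", loads, "total overtime (min):", overtime)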

  8. Sequanix: a dynamic graphical interface for Snakemake workflows.

    Science.gov (United States)

    Desvillechabrol, Dimitri; Legendre, Rachel; Rioualen, Claire; Bouchier, Christiane; van Helden, Jacques; Kennedy, Sean; Cokelaer, Thomas

    2018-06-01

    We designed a PyQt graphical user interface-Sequanix-aimed at democratizing the use of Snakemake pipelines in the NGS space and beyond. By default, Sequanix includes Sequana NGS pipelines (Snakemake format) (http://sequana.readthedocs.io), and is also capable of loading any external Snakemake pipeline. New users can easily, visually, edit configuration files of expert-validated pipelines and can interactively execute these production-ready workflows. Sequanix will be useful to both Snakemake developers in exposing their pipelines and to a wide audience of users. Source on http://github.com/sequana/sequana, bio-containers on http://bioconda.github.io and Singularity hub (http://singularity-hub.org). dimitri.desvillechabrol@pasteur.fr or thomas.cokelaer@pasteur.fr. Supplementary data are available at Bioinformatics online.

  9. Acquisition Research Program Homepage

    OpenAIRE

    2015-01-01

    Includes an image of the main page on this date and compressed file containing additional web pages. Established in 2003, Naval Postgraduate School’s (NPS) Acquisition Research Program provides leadership in innovation, creative problem solving and an ongoing dialogue, contributing to the evolution of Department of Defense acquisition strategies.

  10. Making Acquisition Measurable

    Science.gov (United States)

    2011-04-30

    [Fragmentary text extracted from presentation slides] Recoverable topics include end-user roles (administrator/maintainer (A/M), subject matter expert (SME), trainer/instructor, manager/evaluator/supervisor), CMMI for Acquisition (AQ), CMMI for Development with incremental, iterative development (planning and execution), and constructing games highlighting particular aspects of the proposed CCOD® acquisition, with exercises conducted with SMEs.

  11. Soundness of Timed-Arc Workflow Nets in Discrete and Continuous-Time Semantics

    DEFF Research Database (Denmark)

    Mateo, Jose Antonio; Srba, Jiri; Sørensen, Mathias Grund

    2015-01-01

    Analysis of workflow processes with quantitative aspects like timing is of interest in numerous time-critical applications. We suggest a workflow model based on timed-arc Petri nets and study the foundational problems of soundness and strong (time-bounded) soundness. We first consider the discrete-t...

  12. Scheduling Multilevel Deadline-Constrained Scientific Workflows on Clouds Based on Cost Optimization

    Directory of Open Access Journals (Sweden)

    Maciej Malawski

    2015-01-01

    Full Text Available This paper presents a cost optimization model for scheduling scientific workflows on IaaS clouds such as Amazon EC2 or RackSpace. We assume multiple IaaS clouds with heterogeneous virtual machine instances, with a limited number of instances per cloud and hourly billing. Input and output data are stored on a cloud object store such as Amazon S3. Applications are scientific workflows modeled as DAGs, as in the Pegasus Workflow Management System. We assume that tasks in the workflows are grouped into levels of identical tasks. Our model is specified using mathematical programming languages (AMPL and CMPL) and allows us to minimize the cost of workflow execution under deadline constraints. We present results obtained using our model and benchmark workflows representing real scientific applications in a variety of domains. The data used for evaluation come from synthetic workflows and from general-purpose cloud benchmarks, as well as from data measured in our own experiments with Montage, an astronomical application, executed on the Amazon EC2 cloud. We indicate how this model can be used for scenarios that require resource planning for scientific workflows and their ensembles.
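
    The paper's actual models are written in AMPL and CMPL and are not reproduced here; the sketch below is a deliberately simplified Python illustration of the underlying idea: for one level of identical tasks, pick the cheapest instance type and count that still meet the level's deadline under hourly billing. Instance types, prices and runtimes are invented.

        import math

        # name: (price per hour, runtime of one task in hours) - invented values
        INSTANCE_TYPES = {
            "small":  (0.10, 2.0),
            "medium": (0.20, 1.0),
            "large":  (0.40, 0.5),
        }

        def cheapest_feasible(n_tasks, level_deadline_h, max_instances=20):
            # Return (type, count, makespan_h, cost) of the cheapest configuration
            # finishing a level of identical tasks within the deadline.
            best = None
            for name, (price, runtime) in INSTANCE_TYPES.items():
                for count in range(1, max_instances + 1):
                    makespan = math.ceil(n_tasks / count) * runtime
                    if makespan > level_deadline_h:
                        continue
                    cost = count * math.ceil(makespan) * price   # hourly billing
                    if best is None or cost < best[3]:
                        best = (name, count, makespan, cost)
            return best

        print(cheapest_feasible(n_tasks=50, level_deadline_h=6.0))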

  13. e-BioFlow: improving practical use of workflow systems in bioinformatics

    NARCIS (Netherlands)

    Wassink, I.; Ooms, M.; Neerincx, P.; Rauwerda, H.; Leunissen, J.A.M.; Breit, T.M.; Nijholt, A.; Vet, van der P.

    2010-01-01

    Workflow management systems (WfMSs) are useful tools for bioinformaticians. As experiences with using WfMSs accumulate, shortcomings of current systems become apparent. In this paper, we focus on practical issues that hinder WfMS users and that arise in the design and execution of workflows, and in

  14. CrossFlow: Cross-Organizational Workflow Management in Dynamic Virtual Enterprises

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Aberer, Karl; Hoffner, Yigal; Ludwig, Heiko

    In this report, we present the approach to cross-organizational workflow management of the CrossFlow project. CrossFlow is a European research project aiming at the support of cross-organizational workflows in dynamic virtual enterprises. The cooperation in these virtual enterprises is based on

  15. Deadline-constrained workflow scheduling algorithms for Infrastructure as a Service Clouds

    NARCIS (Netherlands)

    Abrishami, S.; Naghibzadeh, M.; Epema, D.H.J.

    2013-01-01

    The advent of Cloud computing as a new model of service provisioning in distributed systems encourages researchers to investigate its benefits and drawbacks on executing scientific applications such as workflows. One of the most challenging problems in Clouds is workflow scheduling, i.e., the

  16. Exploring the impact of an automated prescription-filling device on community pharmacy technician workflow

    Science.gov (United States)

    Walsh, Kristin E.; Chui, Michelle Anne; Kieser, Mara A.; Williams, Staci M.; Sutter, Susan L.; Sutter, John G.

    2012-01-01

    Objective To explore community pharmacy technician workflow change after implementation of an automated robotic prescription-filling device. Methods At an independent community pharmacy in rural Mayville, WI, pharmacy technicians were observed before and 3 months after installation of an automated robotic prescription-filling device. The main outcome measures were sequences and timing of technician workflow steps, workflow interruptions, automation surprises, and workarounds. Results Of the 77 and 80 observations made before and 3 months after robot installation, respectively, 17 different workflow sequences were observed before installation and 38 after installation. Average prescription filling time was reduced by 40 seconds per prescription with use of the robot. Workflow interruptions per observation increased from 1.49 to 1.79 (P = 0.11), and workarounds increased from 10% to 36% after robot use. Conclusion Although automated prescription-filling devices can increase efficiency, workflow interruptions and workarounds may negate that efficiency. Assessing changes in workflow and sequencing of tasks that may result from the use of automation can help uncover opportunities for workflow policy and procedure redesign. PMID:21896459

  17. A data model for analyzing user collaborations in workflow-driven e-Science

    NARCIS (Netherlands)

    Altintas, I.; Anand, M.K.; Vuong, T.N.; Bowers, S.; Ludäscher, B.; Sloot, P.M.A.

    2011-01-01

    Scientific discoveries are often the result of methodical execution of many interrelated scientific workflows, where workflows and datasets published by one set of users can be used by other users to perform subsequent analyses, leading to implicit or explicit collaboration. In this paper, we

  18. CrossFlow : cross-organizational workflow management in dynamic virtual enterprises

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Aberer, K.; Hoffner, Y.

    2000-01-01

    This paper gives a detailed overview of the approach to cross-organizational workflow management developed in the CrossFlow project. CrossFlow is a European research project aiming at the support of cross-organizational workflows in dynamic virtual enterprises. The cooperation in these virtual

  19. A-Posteriori Detection of Sensor Infrastructure Errors in Correlated Sensor Data and Business Workflows

    NARCIS (Netherlands)

    Wombacher, Andreas

    2011-01-01

    Sensor data can be interpreted as a view on physical objects effected by business processes. Since both sensor infrastructures and business workflows must deal with imprecise information, the correlation of sensor data and business workflow data might be used a-posteriori to determine the source of

  20. Integrated Automatic Workflow for Phylogenetic Tree Analysis Using Public Access and Local Web Services.

    Science.gov (United States)

    Damkliang, Kasikrit; Tandayya, Pichaya; Sangket, Unitsa; Pasomsub, Ekawat

    2016-11-28

    At present, coding sequences (CDS) are being discovered and ever larger CDS are being revealed frequently. Approaches and related tools have also been developed and upgraded concurrently, especially for phylogenetic tree analysis. This paper proposes an integrated, automatic Taverna workflow for phylogenetic tree inference using public-access web services at the European Bioinformatics Institute (EMBL-EBI) and the Swiss Institute of Bioinformatics (SIB), and our own deployed local web services. The workflow input is a set of CDS in the Fasta format. The workflow supports 1,000 to 20,000 bootstrap replicates. The workflow performs tree inference using the Parsimony (PARS), Distance Matrix - Neighbor Joining (DIST-NJ), and Maximum Likelihood (ML) algorithms of the EMBOSS PHYLIPNEW package, based on our proposed Multiple Sequence Alignment (MSA) similarity score. The local web services are implemented and deployed in two forms, using Soaplab2 and Apache Axis2; SOAP and Java Web Services (JWS) provide WSDL endpoints to Taverna Workbench, a workflow manager. The workflow has been validated, its performance has been measured, and its results have been verified. Our workflow's execution time is less than ten minutes for inferring a tree with 10,000 bootstrap replicates. This paper proposes a new integrated automatic workflow which will be beneficial to bioinformaticians with an intermediate level of knowledge and experience. All local services have been deployed at our portal http://bioservices.sci.psu.ac.th.

  1. Requirements for Secure Logging of Decentralized Cross-Organizational Workflow Executions

    NARCIS (Netherlands)

    Wombacher, Andreas; Wieringa, Roelf J.; Jonker, Willem; Knezevic, P.; Pokraev, S.; meersman, R; Tari, Z; herrero, p; Méndez, G.; Cavedon, L.; Martin, D.; Hinze, A.; Buchanan, G.

    2005-01-01

    The control of actions performed by parties involved in a decentralized cross-organizational workflow is done by several independent workflow engines. Due to the lack of a centralized coordination control, an auditing is required which supports a reliable and secure detection of malicious actions

  2. A method to build and analyze scientific workflows from provenance through process mining

    NARCIS (Netherlands)

    Zeng, R.; He, X.; Li, Jiafei; Liu, Zheng; Aalst, van der W.M.P.

    2011-01-01

    Scientific workflows have recently emerged as a new paradigm for representing and managing complex distributed scientific computations and are used to accelerate the pace of scientific discovery. In many disciplines, individual workflows are large due to the large quantities of data used. As

  3. A history-tracing XML-based provenance framework for workflows

    NARCIS (Netherlands)

    Gerhards, M; Belloum, A.; Berretz, F.; Sander, V.; Skorupa, S.

    2010-01-01

    The importance of validating and reproducing the outcome of computational processes is fundamental to many application domains. Assuring the provenance of workflows will likely become even more important with respect to the incorporation of human tasks to standard workflows by emerging standards

  4. Extending a Petri-net based workflow description language for e-business atomicity support

    NARCIS (Netherlands)

    Norta, A.H.; Artishchev, S.

    2004-01-01

    In this paper an extension of XRL is presented for supporting Webbased and inter-organizational e-business atomicity spheres in workflow applications. XRL (eXchangable Routing Language), is an extensible, instance-based language that is intended for inter-organizational workflow processes having an

  5. VLAM-G: Interactive Data Driven Workflow Engine for Grid-Enabled Resources

    Directory of Open Access Journals (Sweden)

    Vladimir Korkhov

    2007-01-01

    Full Text Available Grid brings the power of many computers to scientists. However, the development of Grid-enabled applications requires knowledge about Grid infrastructure and low-level APIs to Grid services. In turn, workflow management systems provide a high-level environment for rapid prototyping of experimental computing systems. Coupling the Grid and workflow paradigms is important for the scientific community: it makes the power of the Grid easily available to the end user. The paradigm of data-driven workflow execution is one of the ways to enable distributed workflows on the Grid. The work presented in this paper is carried out in the context of the Virtual Laboratory for e-Science project. We present the VLAM-G workflow management system and its core component: the Run-Time System (RTS). The RTS is a dataflow-driven workflow engine which utilizes Grid resources, hiding the complexity of the Grid from the scientist. Special attention is paid to the concept of dataflow and direct data streaming between distributed workflow components. We present the architecture and components of the RTS, describe the features of VLAM-G workflow execution, and evaluate the system by performance measurements and a real-life use case.
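
    As a single-process analogue of the dataflow-driven execution described above (not the Grid-based RTS itself), Python generators can chain components so that each data item streams through the pipeline as soon as it is produced; the component names are illustrative only.

        # Each stage pulls items from the previous one, so data "streams" through
        # the chain instead of being materialised between steps.
        def source(n):
            for i in range(n):
                yield {"sample_id": i, "value": float(i)}

        def filter_stage(items, threshold):
            for item in items:
                if item["value"] >= threshold:
                    yield item

        def transform_stage(items):
            for item in items:
                item["value"] = item["value"] ** 2
                yield item

        def sink(items):
            for item in items:
                print("result:", item)

        # Wiring the components together defines the (here purely linear) dataflow graph.
        sink(transform_stage(filter_stage(source(10), threshold=5.0)))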

  6. Dynamic work distribution in workflow management systems : how to balance quality and performance

    NARCIS (Netherlands)

    Kumar, Akhil; Aalst, van der W.M.P.; Verbeek, H.M.W.

    2002-01-01

    Today's workflow management systems offer work items to workers using rather primitive mechanisms. Although most workflow systems support a role-based distribution of work, they have problems dealing with unavailability of workers as a result of vacation or illness, overloading, context-dependent

  7. Elastic Scheduling of Scientific Workflows under Deadline Constraints in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Nazia Anwar

    2018-01-01

    Full Text Available Scientific workflow applications are collections of several structured activities and fine-grained computational tasks. Scientific workflow scheduling in cloud computing is a challenging research topic due to its distinctive features. In cloud environments, it has become critical to perform efficient task scheduling resulting in reduced scheduling overhead, minimized cost and maximized resource utilization while still meeting the user-specified overall deadline. This paper proposes a strategy, Dynamic Scheduling of Bag of Tasks based workflows (DSB), for scheduling scientific workflows with the aim to minimize the financial cost of leasing Virtual Machines (VMs) under a user-defined deadline constraint. The proposed model groups the workflow into Bags of Tasks (BoTs) based on data dependency and priority constraints and thereafter optimizes the allocation and scheduling of BoTs on elastic, heterogeneous and dynamically provisioned cloud resources called VMs in order to attain the proposed method's objectives. The proposed approach considers pay-as-you-go Infrastructure as a Service (IaaS) clouds having inherent features such as elasticity, abundance, heterogeneity and VM provisioning delays. A trace-based simulation using benchmark scientific workflows representing real-world applications demonstrates a significant reduction in workflow computation cost while the workflow deadline is met. The results validate that the proposed model produces better success rates to meet deadlines and cost efficiencies in comparison to adapted state-of-the-art algorithms for similar problems.

  8. Beyond GIS with EO4VisTrails: a geospatio-temporal scientific workflow environment

    CSIR Research Space (South Africa)

    Van Zyl, T

    2012-10-01

    Full Text Available be accommodated at once. The scientific workflows approach has other advantages too, such as provenance, repeatability and collaboration. The paper presents EO4VisTrails as an example of such a scientific workflows approach to integration and discusses the benefit...

  9. Open innovation: Towards sharing of data, models and workflows.

    Science.gov (United States)

    Conrado, Daniela J; Karlsson, Mats O; Romero, Klaus; Sarr, Céline; Wilkins, Justin J

    2017-11-15

    Sharing of resources across organisations to support open innovation is an old idea, but one that is being taken up by the scientific community at increasing speed, particularly where public sharing is concerned. The ability to address new questions, or to provide more precise answers to old questions, through merged information is among the attractive features of sharing. Increased efficiency through reuse, and increased reliability of scientific findings through enhanced transparency, are expected outcomes from sharing. In the field of pharmacometrics, efforts to publicly share data, models and workflows have recently started. Sharing of individual-level longitudinal data for modelling requires solving legal, ethical and proprietary issues similar to many other fields, but there are also pharmacometric-specific aspects regarding data formats, exchange standards, and database properties. Several organisations (CDISC, C-Path, IMI, ISoP) are working to solve these issues and propose standards. There are also a number of initiatives aimed at collecting disease-specific databases - Alzheimer's Disease (ADNI, CAMD), malaria (WWARN), oncology (PDS), Parkinson's Disease (PPMI), tuberculosis (CPTR, TB-PACTS, ReSeqTB) - suitable for drug-disease modelling. Organized sharing of pharmacometric executable model code and associated information has in the past been sparse, but a model repository (the DDMoRe Model Repository) intended for this purpose has recently been launched. In addition, several other services can facilitate model sharing more generally. Pharmacometric workflows have matured over the last decades, and initiatives to more fully capture those applied to analyses are ongoing. In order to maximize both the impact of pharmacometrics and the knowledge extracted from clinical data, the scientific community needs to take ownership of and create opportunities for open innovation. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Optimization of business processes in banks through flexible workflow

    Science.gov (United States)

    Postolache, V.

    2017-08-01

    This article describes an integrated business model of a commercial bank and gives examples of the components that go into its composition: wooden models and business processes, strategic goals, organizational structure, system architecture, operational and marketing risk models, etc. Practice has shown that developing and implementing the bank's integrated business model significantly increases operating efficiency and the quality of its management, and ensures stable organizational and technological development. Considering the evolution of business processes in the banking sector, their common characteristics should be analysed. From the author's point of view, a business process is a set of activities of a commercial bank in which the "input" is one or more financial and material resources and the "output" created by this activity is a banking product that holds some value for the consumer. With workflow technology, managing business-process efficiency is a matter of managing the integration of resources and the sequence of actions aimed at achieving this goal. In turn, this implies managing the interaction of jobs and functions, synchronizing assignment periods, reducing delays in the transmission of results, and so on. Workflow technology is very important for managers at all levels, as they can use it to easily strengthen control over what is happening in a particular unit, and in the bank as a whole. The manager is able to plan, implement rules, and interact within the framework of the company's procedures and tasks entrusted to the system's distribution and execution-control functions, receive alerts on implementation, and obtain statistical data on the effectiveness of operating procedures. Development and active use of the integrated bank business model is one of the key success factors that contribute to the long-term and stable development of the bank, increase the efficiency of employees and business processes, implement the

  11. Mergers and Acquisitions

    DEFF Research Database (Denmark)

    Risberg, Annette

    Introduction to the study of mergers and acquisitions. This book provides an understanding of the mergers and acquisitions process, how and why they occur, and also the broader implications for organizations. It presents issues including motives and planning, partner selection, integration …, employee experiences and communication. Mergers and acquisitions remain one of the most common forms of growth, yet they present considerable challenges for the companies and management involved. The effects on stakeholders, including shareholders, managers and employees, must be considered as well … by editorial commentaries and reflects the important organizational and behavioural aspects which have often been ignored in the past. By providing this in-depth understanding of the mergers and acquisitions process, the reader understands not only how and why mergers and acquisitions occur, but also …

  12. Data Acquisition System

    International Nuclear Information System (INIS)

    Cirstea, C.D.; Buda, S.I.; Constantin, F.

    2005-01-01

    This paper deals with a multi-parametric acquisition system developed for a four-input Analog-to-Digital Converter working in the CAMAC standard. The acquisition software is built in MS Visual C++ on a standard PC with a USB interface. It has a visual interface which permits Start/Stop of the acquisition, setting the type of acquisition (True/Live time) and the time, and various menus for primary data acquisition. The spectrum is dynamically visualized, with a moving cursor indicating the content and position. The microcontroller PIC16C765 is used for data transfer from the ADC to the PC; the microcontroller and the software form an embedded system which emulates the CAMAC protocol, programming the 4-input ADC for its operating modes ('zero suppression', 'addressed' and 'sequential') and handling the data transfers from the ADC to its internal memory. From its memory the data is transferred to the PC over the USB interface. The work is in progress. (authors)

  13. Data acquisition system

    International Nuclear Information System (INIS)

    Cirstea, D.C.; Buda, S.I.; Constantin, F.

    2005-01-01

    The topic of this paper is a multi-parametric acquisition system developed around a four-input Analog-to-Digital Converter working in the CAMAC standard. The acquisition software is built in MS Visual C++ on a standard PC with a USB interface. It has a visual interface which permits Start/Stop of the acquisition, setting the type of acquisition (True/Live time) and the time, and various menus for primary data acquisition. The spectrum is dynamically visualized, with a moving cursor indicating the content and position. The microcontroller PIC16C765 is used for data transfer from the ADC to the PC; the microcontroller and the software form an embedded system which emulates the CAMAC protocol, programming the 4-input ADC for its operating modes ('zero suppression', 'addressed' and 'sequential') and handling the data transfers from the ADC to its internal memory. From its memory the data is transferred to the PC over the USB interface. The work is in progress. (authors)
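
    As a host-side sketch of the spectrum-building step described in the two records above (the USB/CAMAC transfer itself is hardware-specific and not shown; the little-endian 16-bit word layout is an assumption made for illustration):

        import struct

        # Turn a buffer of little-endian 16-bit ADC words into a pulse-height spectrum.
        def update_spectrum(spectrum, raw_bytes):
            n_words = len(raw_bytes) // 2
            for (channel,) in struct.iter_unpack("<H", raw_bytes[: 2 * n_words]):
                if channel < len(spectrum):
                    spectrum[channel] += 1
            return spectrum

        spectrum = [0] * 4096    # e.g. a 12-bit ADC -> 4096 channels
        fake_buffer = struct.pack("<8H", 100, 100, 101, 2048, 2048, 2048, 4095, 7)
        update_spectrum(spectrum, fake_buffer)
        print("counts in channel 2048:", spectrum[2048])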

  14. Integration Of Externalized Decision Models In The Definition Of Workflows For Digital Pathology

    Directory of Open Access Journals (Sweden)

    J. van Leeuwen

    2016-06-01

    We proposed a workflow solution enabling the representation of decision models as externalized executable tasks in the process definition. Our approach separates the task implementations from the workflow model, ensuring scalability and allowing for the inclusion of complex decision logic in the workflow execution. We depict a simplified model of a pathology diagnosis workflow (starting with the digitization of the slides), represented according to the BPMN modeling conventions. The example shows a workflow sequence that automatically orders a HER2 FISH test when IHC is borderline according to defined customizable thresholds. The process model integrates an image analysis algorithm that scores images. Based on the score and the thresholds, the decision model evaluates the condition and recommends the pre-ordering of an additional test when the score falls between the two thresholds. This leads to faster diagnosis and allows balancing the cost of an additional test against the pathologist's overhead by choosing the values of the thresholds.
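
    A minimal sketch of the externalized decision rule in the example above: recommend pre-ordering a HER2 FISH test when the IHC image-analysis score is borderline. The threshold values are placeholders, not clinically validated numbers from the paper.

        def her2_decision(ihc_score, negative_below=1.5, positive_above=2.5):
            # Placeholder thresholds; in the described workflow they are customizable.
            if ihc_score < negative_below:
                return "IHC negative - no reflex test"
            if ihc_score > positive_above:
                return "IHC positive - no reflex test"
            return "Borderline IHC score - pre-order HER2 FISH"

        for score in (0.8, 2.0, 3.1):
            print(score, "->", her2_decision(score))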

  15. Management and presentation of grouped procedures: has the IHE integration profile cracked the toughest radiology workflow nut?

    Science.gov (United States)

    Parisot, Charles R.; Channin, David S.; Avrin, David E.; Lindop, Christopher

    2001-08-01

    In a simple, typical radiology workflow process, an order generates a single procedure, which in turn generates a single data set, from which one radiology report is generated. There are, however, occasions when a single order consists of more than one procedure, each with a separate report, yet the procedures are accomplished by one physical acquisition of data. The prototypical example of this is the request for computed tomographic evaluation of the chest, abdomen and pelvis. The study is accomplished, with modern-day scanners, by a single helical acquisition, yet there are typically three codable and billable procedures involved, and these may be reported independently either for administrative or academic reasons. Until now, this grouping of procedures has remained a challenge to automate across integrated modalities, PACS and RIS. This paper discusses a number of other practical cases where this situation occurs and reviews the capabilities of the Presentation of Grouped Procedures IHE Integration Profile in solving this problem. The DICOM services used are evaluated, as well as the strengths and weaknesses of this IHE Integration Profile. The implementation experience gained on both a CT and an MR scanner for the IHE Demonstration at RSNA 2000 and HIMSS 2001 is also reviewed. In conclusion, the resulting clinical and operational benefits are discussed.

  16. Watchdog - a workflow management system for the distributed analysis of large-scale experimental data.

    Science.gov (United States)

    Kluge, Michael; Friedel, Caroline C

    2018-03-13

    The development of high-throughput experimental technologies, such as next-generation sequencing, have led to new challenges for handling, analyzing and integrating the resulting large and diverse datasets. Bioinformatical analysis of these data commonly requires a number of mutually dependent steps applied to numerous samples for multiple conditions and replicates. To support these analyses, a number of workflow management systems (WMSs) have been developed to allow automated execution of corresponding analysis workflows. Major advantages of WMSs are the easy reproducibility of results as well as the reusability of workflows or their components. In this article, we present Watchdog, a WMS for the automated analysis of large-scale experimental data. Main features include straightforward processing of replicate data, support for distributed computer systems, customizable error detection and manual intervention into workflow execution. Watchdog is implemented in Java and thus platform-independent and allows easy sharing of workflows and corresponding program modules. It provides a graphical user interface (GUI) for workflow construction using pre-defined modules as well as a helper script for creating new module definitions. Execution of workflows is possible using either the GUI or a command-line interface and a web-interface is provided for monitoring the execution status and intervening in case of errors. To illustrate its potentials on a real-life example, a comprehensive workflow and modules for the analysis of RNA-seq experiments were implemented and are provided with the software in addition to simple test examples. Watchdog is a powerful and flexible WMS for the analysis of large-scale high-throughput experiments. We believe it will greatly benefit both users with and without programming skills who want to develop and apply bioinformatical workflows with reasonable overhead. The software, example workflows and a comprehensive documentation are freely

  17. Integration of 3D photogrammetric outcrop models in the reservoir modelling workflow

    Science.gov (United States)

    Deschamps, Remy; Joseph, Philippe; Lerat, Olivier; Schmitz, Julien; Doligez, Brigitte; Jardin, Anne

    2014-05-01

    3D technologies are now widely used in geosciences to reconstruct outcrops in 3D. The technology used for the 3D reconstruction is usually based on Lidar, which provides very precise models. Such datasets offer the possibility of building well-constrained outcrop analogue models for reservoir study purposes. Photogrammetry is an alternative methodology whose principles are based on determining the geometric properties of an object from photographs taken from different angles. Outcrop data acquisition is easy, and this methodology allows 3D outcrop models to be constructed with many advantages, such as light and fast acquisition, moderate processing time (depending on the size of the area of interest), and integration of field data and 3D outcrops into the reservoir modelling tools. Whatever the method, the advantages of digital outcrop models are numerous, as already highlighted by Hodgetts (2013), McCaffrey et al. (2005) and Pringle et al. (2006): collection of data from otherwise inaccessible areas, access to different angles of view, increase of the possible measurements, attribute analysis, fast rate of data collection, and of course training and communication. This paper proposes a workflow where 3D geocellular models are built by integrating all sources of information from outcrops (surface picking, sedimentological sections, structural and sedimentary dips…). The 3D geomodels that are reconstructed can be used at the reservoir scale, in order to compare the outcrop information with subsurface models: the detailed facies models of the outcrops are transferred into petrophysical and acoustic models, which are used to test different scenarios of seismic and fluid flow modelling. The detailed 3D models are also used to test new techniques of static reservoir modelling, based either on geostatistical approaches or on deterministic (process-based) simulation techniques. A modelling workflow has been designed to model reservoir geometries and properties from

  18. Workflow for near-surface velocity automatic estimation: Source-domain full-traveltime inversion followed by waveform inversion

    KAUST Repository

    Liu, Lu; Fei, Tong; Luo, Yi; Guo, Bowen

    2017-01-01

    This paper presents a workflow for near-surface velocity automatic estimation using the early arrivals of seismic data. This workflow comprises two methods, source-domain full traveltime inversion (FTI) and early-arrival waveform inversion. Source

  19. A structured workflow for mapping human Sin3 histone deacetylase complex interactions using Halo-MudPIT AP-MS.

    Science.gov (United States)

    Banks, Charles A S; Thornton, Janet L; Eubanks, Cassandra G; Adams, Mark K; Miah, Sayem; Boanca, Gina; Liu, Xingyu; Katt, Maria; Parmely, Tari; Florens, Laurence A; Washburn, Michael P

    2018-03-29

    Although a variety of affinity purification mass spectrometry (AP-MS) strategies have been used to investigate complex interactions, many of these are susceptible to artifacts due to substantial overexpression of the exogenously expressed bait protein. Here we present a logical and systematic workflow that uses the multifunctional Halo tag to assess the correct localization and behavior of tagged subunits of the Sin3 histone deacetylase complex prior to further AP-MS analysis. Using this workflow, we modified our tagging/expression strategy with 21.7% of the tagged bait proteins that we constructed, allowing us to quickly develop validated reagents. Specifically, we apply the workflow to map interactions between stably expressed versions of the Sin3 subunits SUDS3, SAP30 or SAP30L and other cellular proteins.  Here we show that the SAP30 and SAP30L paralogues strongly associate with the core Sin3 complex, but SAP30L has unique associations with the proteasome and the myelin sheath.  Next, we demonstrate an advancement of the complex NSAF (cNSAF) approach, in which normalization to the scaffold protein SIN3A accounts for variations in the proportion of each bait capturing Sin3 complexes and allows a comparison between different baits capturing the same protein complex. This analysis reveals that although the Sin3 subunit SUDS3 appears to be used in both SIN3A and SIN3B based complexes, the SAP30 subunit is not used in SIN3B based complexes. Intriguingly, we do not detect the Sin3 subunits SAP18 and SAP25 among the 128 high-confidence interactions identified, suggesting that these subunits may not be common to all versions of the Sin3 complex in human cells. This workflow provides the framework for building validated reagents to assemble quantitative interaction networks for chromatin remodeling complexes and provides novel insights into focused protein interaction networks. Published under license by The American Society for Biochemistry and Molecular Biology, Inc.
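
    The abstract does not spell out the cNSAF formula; the sketch below shows the standard NSAF calculation (NSAF_i = (SpC_i / L_i) / sum over j of (SpC_j / L_j)) and, as an assumption made purely for illustration, rescales each value by the scaffold SIN3A so that baits capturing different proportions of the complex become comparable. Spectral counts are invented.

        # NSAF with an assumed scaffold-based rescaling; counts are illustrative only.
        def nsaf(spectral_counts, lengths):
            saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}
            total = sum(saf.values())
            return {p: v / total for p, v in saf.items()}

        counts  = {"SIN3A": 120, "SUDS3": 45, "SAP30": 30, "HDAC1": 60}   # invented
        lengths = {"SIN3A": 1273, "SUDS3": 328, "SAP30": 220, "HDAC1": 482}

        nsaf_values = nsaf(counts, lengths)
        scaled = {p: v / nsaf_values["SIN3A"] for p, v in nsaf_values.items()}
        for protein, value in sorted(scaled.items()):
            print(protein, round(value, 3))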

  20. Indexing mergers and acquisitions

    OpenAIRE

    Gang, Jianhua; Guo, Jie (Michael); Hu, Nan; Li, Xi

    2017-01-01

    We measure the efficiency of mergers and acquisitions by putting forward an index (the ‘M&A Index’) based on stochastic frontier analysis. The M&A Index is calculated for each takeover deal and is standardized between 0 and 1. An acquisition with a higher index encompasses higher efficiency. We find that takeover bids with higher M&A Indices are more likely to succeed. Moreover, the M&A Index shows a strong and positive relation with the acquirers’ post-acquisition stock perfo...

  1. The standard-based open workflow system in GeoBrain (Invited)

    Science.gov (United States)

    Di, L.; Yu, G.; Zhao, P.; Deng, M.

    2013-12-01

    GeoBrain is an Earth science Web-service system developed and operated by the Center for Spatial Information Science and Systems, George Mason University. In GeoBrain, a standard-based open workflow system has been implemented to accommodate the automated processing of geospatial data through a set of complex geo-processing functions for advanced product generation. GeoBrain models the complex geoprocessing at two levels, the conceptual and the concrete. At the conceptual level, the workflows exist in the form of data and service types defined by ontologies. The workflows at the conceptual level are called geo-processing models and are cataloged in GeoBrain as virtual product types. A conceptual workflow is instantiated into a concrete, executable workflow when a user requests a product that matches a virtual product type. Both conceptual and concrete workflows are encoded in the Business Process Execution Language (BPEL). A BPEL workflow engine, called BPELPower, has been implemented to execute the workflows for product generation. A provenance capturing service has been implemented to generate the ISO 19115-compliant complete product provenance metadata before and after the workflow execution. The generation of provenance metadata before the workflow execution allows users to examine the usability of the final product before the lengthy and expensive execution takes place. The three modes of workflow execution defined in ISO 19119 - transparent, translucent, and opaque - are available in GeoBrain. A geoprocessing modeling portal has been developed to allow domain experts to develop geoprocessing models at the type level with the support of both data and service/processing ontologies. The geoprocessing models capture the knowledge of the domain experts and become the operational offering of the products after a proper peer review of the models is conducted. An automated workflow composition has been experimented with successfully based on ontologies and artificial

  2. Acquisition Workforce Annual Report 2006

    Data.gov (United States)

    General Services Administration — This is the Federal Acquisition Institute's (FAI's) Annual demographic report on the Federal acquisition workforce, showing trends by occupational series, employment...

  3. Acquisition Workforce Annual Report 2008

    Data.gov (United States)

    General Services Administration — This is the Federal Acquisition Institute's (FAI's) Annual demographic report on the Federal acquisition workforce, showing trends by occupational series, employment...

  4. Automation in an Addiction Treatment Research Clinic: Computerized Contingency Management, Ecological Momentary Assessment, and a Protocol Workflow System

    Science.gov (United States)

    Vahabzadeh, Massoud; Lin, Jia-Ling; Mezghanni, Mustapha; Epstein, David H.; Preston, Kenzie L.

    2009-01-01

    Issues A challenge in treatment research is the necessity of adhering to protocol and regulatory strictures while maintaining flexibility to meet patients’ treatment needs and accommodate variations among protocols. Another challenge is the acquisition of large amounts of data in an occasionally hectic environment, along with provision of seamless methods for exporting, mining, and querying the data. Approach We have automated several major functions of our outpatient treatment research clinic for studies in drug abuse and dependence. Here we describe three such specialized applications: the Automated Contingency Management (ACM) system for delivery of behavioral interventions, the Transactional Electronic Diary (TED) system for management of behavioral assessments, and the Protocol Workflow System (PWS) for computerized workflow automation and guidance of each participant’s daily clinic activities. These modules are integrated into our larger information system to enable data sharing in real time among authorized staff. Key Findings ACM and TED have each permitted us to conduct research that was not previously possible. In addition, the time to data analysis at the end of each study is substantially shorter. With the implementation of the PWS, we have been able to manage a research clinic with an 80-patient capacity having an annual average of 18,000 patient-visits and 7,300 urine collections with a research staff of five. Finally, automated data management has considerably enhanced our ability to monitor and summarize participant-safety data for research oversight. Implications and conclusion When developed in consultation with end users, automation in treatment-research clinics can enable more efficient operations, better communication among staff, and expansions in research methods. PMID:19320669

  5. Automation in an addiction treatment research clinic: computerised contingency management, ecological momentary assessment and a protocol workflow system.

    Science.gov (United States)

    Vahabzadeh, Massoud; Lin, Jia-Ling; Mezghanni, Mustapha; Epstein, David H; Preston, Kenzie L

    2009-01-01

    A challenge in treatment research is the necessity of adhering to protocol and regulatory strictures while maintaining flexibility to meet patients' treatment needs and to accommodate variations among protocols. Another challenge is the acquisition of large amounts of data in an occasionally hectic environment, along with the provision of seamless methods for exporting, mining and querying the data. We have automated several major functions of our outpatient treatment research clinic for studies in drug abuse and dependence. Here we describe three such specialised applications: the Automated Contingency Management (ACM) system for the delivery of behavioural interventions, the transactional electronic diary (TED) system for the management of behavioural assessments and the Protocol Workflow System (PWS) for computerised workflow automation and guidance of each participant's daily clinic activities. These modules are integrated into our larger information system to enable data sharing in real time among authorised staff. ACM and the TED have each permitted us to conduct research that was not previously possible. In addition, the time to data analysis at the end of each study is substantially shorter. With the implementation of the PWS, we have been able to manage a research clinic with an 80 patient capacity, having an annual average of 18,000 patient visits and 7300 urine collections with a research staff of five. Finally, automated data management has considerably enhanced our ability to monitor and summarise participant safety data for research oversight. When developed in consultation with end users, automation in treatment research clinics can enable more efficient operations, better communication among staff and expansions in research methods.

  6. The Acquisition of Particles

    African Journals Online (AJOL)

    process of language acquisition on the basis of linguistic evidence the child is exposed to. … particle verbs are recognized in language processing differs from the way morphologically … In Natural Language and Linguistic Theory 11.

  7. High speed data acquisition

    International Nuclear Information System (INIS)

    Cooper, P.S.

    1997-07-01

    A general introduction to high-speed data acquisition system techniques in modern particle physics experiments is given. Examples are drawn from the SELEX (E781) high-statistics charmed baryon production and decay experiment now taking data at Fermilab.

  8. High performance workflow implementation for protein surface characterization using grid technology

    Directory of Open Access Journals (Sweden)

    Clematis Andrea

    2005-12-01

    Full Text Available Abstract Background This study concerns the development of a high performance workflow that, using grid technology, correlates different kinds of Bioinformatics data, starting from the base pairs of the nucleotide sequence to the exposed residues of the protein surface. The implementation of this workflow is based on the Italian Grid.it project infrastructure, which is a network of several computational resources and storage facilities distributed at different grid sites. Methods Workflows are very common in Bioinformatics because they allow large quantities of data to be processed by delegating the management of resources to the information streaming. Grid technology optimizes the computational load during the different workflow steps, dividing the more expensive tasks into a set of small jobs. Results Grid technology allows efficient database management, a crucial problem for obtaining good results in Bioinformatics applications. The proposed workflow is implemented to integrate huge amounts of data, and the results themselves must be stored into a relational database, which constitutes the added value to the global knowledge. Conclusion A web interface has been developed to make this technology accessible to grid users. Once the workflow has started, by means of the simplified interface, it is possible to follow all the different steps throughout the data processing. Eventually, when the workflow has terminated, the different features of the protein, like the amino acids exposed on the protein surface, can be compared with the data present in the output database.

  9. Experiences and lessons learned from creating a generalized workflow for data publication of field campaign datasets

    Science.gov (United States)

    Santhana Vannan, S. K.; Ramachandran, R.; Deb, D.; Beaty, T.; Wright, D.

    2017-12-01

    This paper summarizes the workflow challenges of curating and publishing data produced from disparate data sources and provides a generalized workflow solution to efficiently archive data generated by researchers. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC) for biogeochemical dynamics and the Global Hydrology Resource Center (GHRC) DAAC have been collaborating on the development of a generalized workflow solution to efficiently manage the data publication process. The generalized workflow presented here is built on lessons learned from implementations of the workflow system. Data publication consists of the following steps: accepting the data package from the data providers and ensuring the full integrity of the data files; identifying and addressing data quality issues; assembling standardized, detailed metadata and documentation, including file-level details, processing methodology, and characteristics of data files; setting up data access mechanisms; setting up the data in data tools and services for improved data dissemination and user experience; registering the dataset in online search and discovery catalogues; and preserving the data location through Digital Object Identifiers (DOIs). We will describe the steps taken to automate and realize efficiencies in the above process. The goals of the workflow system are to reduce the time taken to publish a dataset, to increase the quality of documentation and metadata, and to track individual datasets through the data curation process. Utilities developed to achieve these goals will be described. We will also share the metrics-driven value of the workflow system and discuss future steps towards the creation of a common software framework.
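
    The first step above, accepting a data package and ensuring the integrity of its files, can be illustrated with a small checksum-verification sketch; the manifest format used here ("<sha256>  <relative path>" per line) is an assumption, not the ORNL DAAC convention.

        import hashlib
        from pathlib import Path

        def sha256_of(path, chunk_size=1 << 20):
            # Stream the file so arbitrarily large granules can be checked.
            digest = hashlib.sha256()
            with open(path, "rb") as handle:
                for chunk in iter(lambda: handle.read(chunk_size), b""):
                    digest.update(chunk)
            return digest.hexdigest()

        def verify_package(package_dir, manifest_name="manifest.sha256"):
            # Compare each file against a provider-supplied checksum manifest.
            package_dir = Path(package_dir)
            failures = []
            for line in (package_dir / manifest_name).read_text().splitlines():
                expected, relative_path = line.split(maxsplit=1)
                if sha256_of(package_dir / relative_path) != expected:
                    failures.append(relative_path)
            return failures

        # Example use: failures = verify_package("incoming/campaign_package")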

  10. Measuring Semantic and Structural Information for Data Oriented Workflow Retrieval with Cost Constraints

    Directory of Open Access Journals (Sweden)

    Yinglong Ma

    2014-01-01

    Full Text Available The reuse of data oriented workflows (DOWs) can reduce the cost of workflow system development and control the risk of project failure, and is therefore crucial for accelerating the automation of business processes. Reusing workflows can be achieved by measuring the similarity among candidate workflows and selecting from them the workflow that satisfies the requirements of users. However, because DOWs are often developed in an open, distributed, and heterogeneous environment, different users often impose diverse cost constraints on data oriented workflows. This makes the reuse of DOWs challenging. There is no clear solution for retrieving DOWs with cost constraints. In this paper, we present a novel graph-based model of DOWs with cost constraints, called constrained data oriented workflow (CDW), which can express the cost constraints that users are often concerned about. An approach is proposed for retrieving CDWs which seamlessly combines semantic and structural information of CDWs. A distance measure based on matrix theory is adopted to seamlessly combine semantic and structural similarities of CDWs for selecting and reusing them. Finally, related experiments are made to show the effectiveness and efficiency of our approach.

  11. Workflows of the mass deacidification project at the German Literature Archive

    Directory of Open Access Journals (Sweden)

    Melanie Kubitza

    2018-03-01

    Full Text Available The German Literature Archive Marbach (DLA) began evaluating suitable deacidification processes for its unique archive and library holdings in 1998. Since 2013, the book holdings have undergone continuous deacidification treatment using the Papersave process of Nitrochemie AG, with an annual deacidification output totalling around 4.0 t in eight batches. In addition to the stack holdings, which range from first editions to secondary literature on the individual authors, volumes from the DLA's special collections, including authors' personal libraries, have now also been successfully deacidified for the first time. Besides the collaboration of the conservation department with the library and archive departments, the article explains aspects of the workflows from a conservator's perspective. The focus is on the internally developed, barcode-supported documentation of the treatment at item level in the holdings catalogue Kallías. In addition to the process-relevant data, this documentation contains information on exclusion criteria and on alternative or complementary measures such as packaging, restoration or digitisation. The Marbach mass-deacidification project carries out quality control on the originals with regard to side effects on the library material and monitors the long-term effect of the deacidification on reference volumes similar to the originals, using measurements that are not non-destructive.

  12. Data and Workflow Management Challenges in Global Adjoint Tomography

    Science.gov (United States)

    Lei, W.; Ruan, Y.; Smith, J. A.; Modrak, R. T.; Orsvuran, R.; Krischer, L.; Chen, Y.; Balasubramanian, V.; Hill, J.; Turilli, M.; Bozdag, E.; Lefebvre, M. P.; Jha, S.; Tromp, J.

    2017-12-01

    It is crucial to take the complete physics of wave propagation into account in seismic tomography to further improve the resolution of tomographic images. The adjoint method is an efficient way of incorporating 3D wave simulations in seismic tomography. However, global adjoint tomography is computationally expensive, requiring thousands of wavefield simulations and massive data processing. Through our collaboration with the Oak Ridge National Laboratory (ORNL) computing group and an allocation on Titan, ORNL's GPU-accelerated supercomputer, we are now performing our global inversions by assimilating waveform data from over 1,000 earthquakes. The first challenge we encountered was dealing with the sheer amount of seismic data. Data processing based on conventional data formats and processing tools (such as SAC), which are not designed for parallel systems, becomes our major bottleneck. To facilitate the data processing procedures, we designed the Adaptive Seismic Data Format (ASDF) and developed a set of Python-based processing tools to replace legacy FORTRAN-based software. These tools greatly enhance reproducibility and accountability while taking full advantage of highly parallel systems and showing superior scaling on modern computational platforms. The second challenge is that the data processing workflow contains more than 10 sub-procedures, making it difficult to manage and prone to human error. To reduce human intervention as much as possible, we are developing a framework specifically designed for seismic inversion based on state-of-the-art workflow management research, specifically the Ensemble Toolkit (EnTK), in collaboration with the RADICAL team from Rutgers University. Using the initial developments of the EnTK, we are able to utilize the full computing power of the data processing cluster RHEA at ORNL while keeping human interaction to a minimum and greatly reducing the data processing time. Thanks to all the improvements, we are now able to
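
    As a rough illustration of how one per-earthquake processing stage in such a workflow can be parallelised, here is a generic Python sketch using only the standard library. It is not the ASDF tooling or the Ensemble Toolkit API; the function and event names are placeholders.

```python
# Generic sketch of parallelising one per-earthquake processing stage; this is
# not the ASDF or Ensemble Toolkit API, just an illustration of the pattern.
from concurrent.futures import ProcessPoolExecutor

def process_event(event_id):
    """Placeholder for one sub-procedure (e.g. window selection or misfit measurement)."""
    # ... read waveforms for event_id, process them, write results ...
    return event_id, "ok"

def run_stage(event_ids, max_workers=8):
    """Run the stage over all events, collecting per-event status for bookkeeping."""
    with ProcessPoolExecutor(max_workers=max_workers) as pool:
        return dict(pool.map(process_event, event_ids))

if __name__ == "__main__":
    status = run_stage([f"event_{i:04d}" for i in range(16)])
    failed = [e for e, s in status.items() if s != "ok"]
    print(f"{len(status) - len(failed)} events processed, {len(failed)} failed")
```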

  13. DNA Qualification Workflow for Next Generation Sequencing of Histopathological Samples

    Science.gov (United States)

    Simbolo, Michele; Gottardi, Marisa; Corbo, Vincenzo; Fassan, Matteo; Mafficini, Andrea; Malpeli, Giorgio; Lawlor, Rita T.; Scarpa, Aldo

    2013-01-01

    Histopathological samples are a treasure-trove of DNA for clinical research. However, the quality of DNA can vary depending on the source or extraction method applied. Thus a standardized and cost-effective workflow for the qualification of DNA preparations is essential to guarantee interlaboratory reproducible results. The qualification process consists of the quantification of double strand DNA (dsDNA) and the assessment of its suitability for downstream applications, such as high-throughput next-generation sequencing. We tested the two most frequently used instrumentations to define their role in this process: NanoDrop, based on UV spectroscopy, and Qubit 2.0, which uses fluorochromes specifically binding dsDNA. Quantitative PCR (qPCR) was used as the reference technique as it simultaneously assesses DNA concentration and suitability for PCR amplification. We used 17 genomic DNAs from 6 fresh-frozen (FF) tissues, 6 formalin-fixed paraffin-embedded (FFPE) tissues, 3 cell lines, and 2 commercial preparations. Intra- and inter-operator variability was negligible, and intra-methodology variability was minimal, while consistent inter-methodology divergences were observed. In fact, NanoDrop measured DNA concentrations higher than Qubit and its consistency with dsDNA quantification by qPCR was limited to high molecular weight DNA from FF samples and cell lines, where total DNA and dsDNA quantity virtually coincide. In partially degraded DNA from FFPE samples, only Qubit proved highly reproducible and consistent with qPCR measurements. Multiplex PCR amplifying 191 regions of 46 cancer-related genes was designated the downstream application, using 40 ng dsDNA from FFPE samples calculated by Qubit. All but one sample produced amplicon libraries suitable for next-generation sequencing. NanoDrop UV-spectrum verified contamination of the unsuccessful sample. In conclusion, as qPCR has high costs and is labor intensive, an alternative effective standard workflow for
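
    A small arithmetic sketch of the qualification logic discussed above: flagging samples whose NanoDrop reading greatly exceeds the Qubit dsDNA reading, and computing the volume needed for a 40 ng dsDNA input. The concentrations and the 1.5 flagging threshold are invented examples, not values from the study.

```python
# Minimal sketch: compare NanoDrop and Qubit readings and compute the volume
# needed to load 40 ng of dsDNA based on the Qubit concentration.
# All values and the 1.5 ratio threshold are made-up examples for illustration.
samples = {
    "FFPE_01": {"nanodrop_ng_ul": 85.0, "qubit_ng_ul": 32.0},
    "FF_01":   {"nanodrop_ng_ul": 61.0, "qubit_ng_ul": 58.0},
}

TARGET_NG = 40.0

for name, s in samples.items():
    ratio = s["nanodrop_ng_ul"] / s["qubit_ng_ul"]       # >1 suggests non-dsDNA signal
    volume_ul = TARGET_NG / s["qubit_ng_ul"]              # volume for the 40 ng input
    flag = "degradation/ssDNA suspected" if ratio > 1.5 else "consistent"
    print(f"{name}: NanoDrop/Qubit ratio = {ratio:.2f} ({flag}), "
          f"load {volume_ul:.2f} uL for {TARGET_NG:.0f} ng input")
```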

  14. The Live Access Server Scientific Product Generation Through Workflow Orchestration

    Science.gov (United States)

    Hankin, S.; Calahan, J.; Li, J.; Manke, A.; O'Brien, K.; Schweitzer, R.

    2006-12-01

    The Live Access Server (LAS) is a well-established Web-application for display and analysis of geo-science data sets. The software, which can be downloaded and installed by anyone, gives data providers an easy way to establish services for their on-line data holdings, so their users can make plots; create and download data sub-sets; compare (difference) fields; and perform simple analyses. Now at version 7.0, LAS has been in operation since 1994. The current "Armstrong" release of LAS V7 consists of three components in a tiered architecture: user interface, workflow orchestration and Web Services. The LAS user interface (UI) communicates with the LAS Product Server via an XML protocol embedded in an HTTP "get" URL. Libraries (APIs) have been developed in Java, JavaScript and perl that can readily generate this URL. As a result of this flexibility it is common to find LAS user interfaces of radically different character, tailored to the nature of specific datasets or the mindset of specific users. When a request is received by the LAS Product Server (LPS -- the workflow orchestration component), business logic converts this request into a series of Web Service requests invoked via SOAP. These "back- end" Web services perform data access and generate products (visualizations, data subsets, analyses, etc.). LPS then packages these outputs into final products (typically HTML pages) via Jakarta Velocity templates for delivery to the end user. "Fine grained" data access is performed by back-end services that may utilize JDBC for data base access; the OPeNDAP "DAPPER" protocol; or (in principle) the OGC WFS protocol. Back-end visualization services are commonly legacy science applications wrapped in Java or Python (or perl) classes and deployed as Web Services accessible via SOAP. Ferret is the default visualization application used by LAS, though other applications such as Matlab, CDAT, and GrADS can also be used. Other back-end services may include generation of Google
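
    To illustrate the request style described above (an XML document embedded in an HTTP GET URL), here is a minimal sketch using the Python standard library. The element names, operation path, and server URL are hypothetical and do not reflect the actual LAS request schema.

```python
# Sketch of embedding an XML request in an HTTP GET URL, in the spirit of the
# LAS UI-to-Product-Server protocol; element and parameter names are hypothetical.
from urllib.parse import urlencode
import xml.etree.ElementTree as ET

req = ET.Element("lasRequest")
link = ET.SubElement(req, "link")
link.set("match", "/lasdata/operations/operation[@name='plot']")
props = ET.SubElement(req, "properties")
ET.SubElement(props, "variable").text = "sst"
ET.SubElement(props, "region").text = "global"

xml_text = ET.tostring(req, encoding="unicode")
url = "https://example.org/ProductServer.do?" + urlencode({"xml": xml_text})
print(url)   # the XML document travels percent-encoded inside the GET URL
```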

  15. DNA qualification workflow for next generation sequencing of histopathological samples.

    Directory of Open Access Journals (Sweden)

    Michele Simbolo

    Full Text Available Histopathological samples are a treasure-trove of DNA for clinical research. However, the quality of DNA can vary depending on the source or extraction method applied. Thus a standardized and cost-effective workflow for the qualification of DNA preparations is essential to guarantee interlaboratory reproducible results. The qualification process consists of the quantification of double strand DNA (dsDNA) and the assessment of its suitability for downstream applications, such as high-throughput next-generation sequencing. We tested the two most frequently used instrumentations to define their role in this process: NanoDrop, based on UV spectroscopy, and Qubit 2.0, which uses fluorochromes specifically binding dsDNA. Quantitative PCR (qPCR) was used as the reference technique as it simultaneously assesses DNA concentration and suitability for PCR amplification. We used 17 genomic DNAs from 6 fresh-frozen (FF) tissues, 6 formalin-fixed paraffin-embedded (FFPE) tissues, 3 cell lines, and 2 commercial preparations. Intra- and inter-operator variability was negligible, and intra-methodology variability was minimal, while consistent inter-methodology divergences were observed. In fact, NanoDrop measured DNA concentrations higher than Qubit and its consistency with dsDNA quantification by qPCR was limited to high molecular weight DNA from FF samples and cell lines, where total DNA and dsDNA quantity virtually coincide. In partially degraded DNA from FFPE samples, only Qubit proved highly reproducible and consistent with qPCR measurements. Multiplex PCR amplifying 191 regions of 46 cancer-related genes was designated the downstream application, using 40 ng dsDNA from FFPE samples calculated by Qubit. All but one sample produced amplicon libraries suitable for next-generation sequencing. NanoDrop UV-spectrum verified contamination of the unsuccessful sample. In conclusion, as qPCR has high costs and is labor intensive, an alternative effective standard

  16. An integrated workflow for robust alignment and simplified quantitative analysis of NMR spectrometry data.

    Science.gov (United States)

    Vu, Trung N; Valkenborg, Dirk; Smets, Koen; Verwaest, Kim A; Dommisse, Roger; Lemière, Filip; Verschoren, Alain; Goethals, Bart; Laukens, Kris

    2011-10-20

    Nuclear magnetic resonance spectroscopy (NMR) is a powerful technique to reveal and compare quantitative metabolic profiles of biological tissues. However, chemical and physical sample variations make the analysis of the data challenging, and typically require the application of a number of preprocessing steps prior to data interpretation. For example, noise reduction, normalization, baseline correction, peak picking, spectrum alignment and statistical analysis are indispensable components in any NMR analysis pipeline. We introduce a novel suite of informatics tools for the quantitative analysis of NMR metabolomic profile data. The core of the processing cascade is a novel peak alignment algorithm, called hierarchical Cluster-based Peak Alignment (CluPA). The algorithm aligns a target spectrum to the reference spectrum in a top-down fashion by building a hierarchical cluster tree from peak lists of reference and target spectra and then dividing the spectra into smaller segments based on the most distant clusters of the tree. To reduce the computational time to estimate the spectral misalignment, the method makes use of Fast Fourier Transformation (FFT) cross-correlation. Since the method returns a high-quality alignment, we can propose a simple methodology to study the variability of the NMR spectra. For each aligned NMR data point the ratio of the between-group and within-group sum of squares (BW-ratio) is calculated to quantify the difference in variability between and within predefined groups of NMR spectra. This differential analysis is related to the calculation of the F-statistic or a one-way ANOVA, but without distributional assumptions. Statistical inference based on the BW-ratio is achieved by bootstrapping the null distribution from the experimental data. The workflow performance was evaluated using a previously published dataset. Correlation maps, spectral and grey scale plots show clear improvements in comparison to other methods, and the down
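
    Two of the ingredients above lend themselves to a compact sketch: estimating the shift between two spectral segments via FFT cross-correlation, and computing the BW-ratio at each aligned data point. This is an illustration of the ideas only, not the CluPA implementation; the synthetic spectra and group labels are invented.

```python
# Illustrative numpy sketch: (1) estimate a segment shift via FFT cross-correlation,
# (2) compute the BW-ratio (between-group vs. within-group sum of squares) per point.
# Not the CluPA implementation; data are synthetic.
import numpy as np

def fft_shift_estimate(reference, target):
    """Signed circular shift to apply to `target` so it best aligns with `reference`."""
    corr = np.fft.ifft(np.fft.fft(reference) * np.conj(np.fft.fft(target))).real
    lag = int(np.argmax(corr))
    n = len(reference)
    return lag if lag <= n // 2 else lag - n   # map to a signed shift

def bw_ratio(spectra, groups):
    """BW-ratio per data point for spectra (n_samples x n_points) and group labels."""
    spectra = np.asarray(spectra, float)
    grand = spectra.mean(axis=0)
    between = np.zeros(spectra.shape[1])
    within = np.zeros(spectra.shape[1])
    for g in np.unique(groups):
        block = spectra[np.asarray(groups) == g]
        between += len(block) * (block.mean(axis=0) - grand) ** 2
        within += ((block - block.mean(axis=0)) ** 2).sum(axis=0)
    return between / np.maximum(within, 1e-12)

ref = np.sin(np.linspace(0, 6 * np.pi, 256))
shifted = np.roll(ref, 5)
print("estimated shift:", fft_shift_estimate(ref, shifted))   # -5: shift target back by 5 points

rng = np.random.default_rng(0)
spectra = np.vstack([ref, ref, ref + 0.5, ref + 0.5]) + rng.normal(0, 0.05, (4, 256))
print("max BW-ratio:", float(bw_ratio(spectra, ["a", "a", "b", "b"]).max()))
```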

  17. An integrated workflow for robust alignment and simplified quantitative analysis of NMR spectrometry data

    Directory of Open Access Journals (Sweden)

    Dommisse Roger

    2011-10-01

    Full Text Available Abstract Background Nuclear magnetic resonance spectroscopy (NMR) is a powerful technique to reveal and compare quantitative metabolic profiles of biological tissues. However, chemical and physical sample variations make the analysis of the data challenging, and typically require the application of a number of preprocessing steps prior to data interpretation. For example, noise reduction, normalization, baseline correction, peak picking, spectrum alignment and statistical analysis are indispensable components in any NMR analysis pipeline. Results We introduce a novel suite of informatics tools for the quantitative analysis of NMR metabolomic profile data. The core of the processing cascade is a novel peak alignment algorithm, called hierarchical Cluster-based Peak Alignment (CluPA). The algorithm aligns a target spectrum to the reference spectrum in a top-down fashion by building a hierarchical cluster tree from peak lists of reference and target spectra and then dividing the spectra into smaller segments based on the most distant clusters of the tree. To reduce the computational time to estimate the spectral misalignment, the method makes use of Fast Fourier Transformation (FFT) cross-correlation. Since the method returns a high-quality alignment, we can propose a simple methodology to study the variability of the NMR spectra. For each aligned NMR data point the ratio of the between-group and within-group sum of squares (BW-ratio) is calculated to quantify the difference in variability between and within predefined groups of NMR spectra. This differential analysis is related to the calculation of the F-statistic or a one-way ANOVA, but without distributional assumptions. Statistical inference based on the BW-ratio is achieved by bootstrapping the null distribution from the experimental data. Conclusions The workflow performance was evaluated using a previously published dataset. Correlation maps, spectral and grey scale plots show clear

  18. From the desktop to the grid: scalable bioinformatics via workflow conversion.

    Science.gov (United States)

    de la Garza, Luis; Veit, Johannes; Szolek, Andras; Röttig, Marc; Aiche, Stephan; Gesing, Sandra; Reinert, Knut; Kohlbacher, Oliver

    2016-03-12

    Reproducibility is one of the tenets of the scientific method. Scientific experiments often comprise complex data flows, selection of adequate parameters, and analysis and visualization of intermediate and end results. Breaking down the complexity of such experiments into the joint collaboration of small, repeatable, well defined tasks, each with well defined inputs, parameters, and outputs, offers the immediate benefit of identifying bottlenecks, pinpoint sections which could benefit from parallelization, among others. Workflows rest upon the notion of splitting complex work into the joint effort of several manageable tasks. There are several engines that give users the ability to design and execute workflows. Each engine was created to address certain problems of a specific community, therefore each one has its advantages and shortcomings. Furthermore, not all features of all workflow engines are royalty-free -an aspect that could potentially drive away members of the scientific community. We have developed a set of tools that enables the scientific community to benefit from workflow interoperability. We developed a platform-free structured representation of parameters, inputs, outputs of command-line tools in so-called Common Tool Descriptor documents. We have also overcome the shortcomings and combined the features of two royalty-free workflow engines with a substantial user community: the Konstanz Information Miner, an engine which we see as a formidable workflow editor, and the Grid and User Support Environment, a web-based framework able to interact with several high-performance computing resources. We have thus created a free and highly accessible way to design workflows on a desktop computer and execute them on high-performance computing resources. Our work will not only reduce time spent on designing scientific workflows, but also make executing workflows on remote high-performance computing resources more accessible to technically inexperienced users. We
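
    The sketch below shows the general idea of a platform-free, structured tool description and how a converter might read it. The XML element names are illustrative only and are not the actual Common Tool Descriptor schema.

```python
# Minimal sketch of a platform-free command-line tool description and a reader for it;
# the element names are illustrative, not the actual Common Tool Descriptor schema.
import xml.etree.ElementTree as ET

ctd_text = """
<tool name="PeptideFilter" version="1.0">
  <description>Filters peptide identifications by score.</description>
  <parameters>
    <param name="in" type="input-file" required="true"/>
    <param name="out" type="output-file" required="true"/>
    <param name="min_score" type="double" value="0.95"/>
  </parameters>
</tool>
"""

tool = ET.fromstring(ctd_text)
print("tool:", tool.get("name"), tool.get("version"))
for p in tool.find("parameters"):
    print(f"  param {p.get('name')}: type={p.get('type')}, "
          f"required={p.get('required', 'false')}, default={p.get('value')}")
```

    A converter targeting a specific engine would map each parameter entry onto that engine's node or job description, which is what makes the shared description the point of interoperability.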

  19. Support for Taverna workflows in the VPH-Share cloud platform.

    Science.gov (United States)

    Kasztelnik, Marek; Coto, Ernesto; Bubak, Marian; Malawski, Maciej; Nowakowski, Piotr; Arenas, Juan; Saglimbeni, Alfredo; Testi, Debora; Frangi, Alejandro F

    2017-07-01

    To address the increasing need for collaborative endeavours within the Virtual Physiological Human (VPH) community, the VPH-Share collaborative cloud platform allows researchers to expose and share sequences of complex biomedical processing tasks in the form of computational workflows. The Taverna Workflow System is a very popular tool for orchestrating complex biomedical & bioinformatics processing tasks in the VPH community. This paper describes the VPH-Share components that support the building and execution of Taverna workflows, and explains how they interact with other VPH-Share components to improve the capabilities of the VPH-Share platform. Taverna workflow support is delivered by the Atmosphere cloud management platform and the VPH-Share Taverna plugin. These components are explained in detail, along with the two main procedures that were developed to enable this seamless integration: workflow composition and execution. 1) Seamless integration of VPH-Share with other components and systems. 2) Extended range of different tools for workflows. 3) Successful integration of scientific workflows from other VPH projects. 4) Execution speed improvement for medical applications. The presented workflow integration provides VPH-Share users with a wide range of different possibilities to compose and execute workflows, such as desktop or online composition, online batch execution, multithreading, remote execution, etc. The specific advantages of each supported tool are presented, as are the roles of Atmosphere and the VPH-Share plugin within the VPH-Share project. The combination of the VPH-Share plugin and Atmosphere endows the VPH-Share infrastructure with far more flexible, powerful and usable capabilities for the VPH-Share community. As both components can continue to evolve and improve independently, we acknowledge that further improvements are still to be developed and will be described. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Structuring research methods and data with the research object model: genomics workflows as a case study.

    Science.gov (United States)

    Hettne, Kristina M; Dharuri, Harish; Zhao, Jun; Wolstencroft, Katherine; Belhajjame, Khalid; Soiland-Reyes, Stian; Mina, Eleni; Thompson, Mark; Cruickshank, Don; Verdes-Montenegro, Lourdes; Garrido, Julian; de Roure, David; Corcho, Oscar; Klyne, Graham; van Schouwen, Reinout; 't Hoen, Peter A C; Bechhofer, Sean; Goble, Carole; Roos, Marco

    2014-01-01

    One of the main challenges for biomedical research lies in the computer-assisted integrative study of large and increasingly complex combinations of data in order to understand molecular mechanisms. The preservation of the materials and methods of such computational experiments with clear annotations is essential for understanding an experiment, and this is increasingly recognized in the bioinformatics community. Our assumption is that offering means of digital, structured aggregation and annotation of the objects of an experiment will provide necessary meta-data for a scientist to understand and recreate the results of an experiment. To support this we explored a model for the semantic description of a workflow-centric Research Object (RO), where an RO is defined as a resource that aggregates other resources, e.g., datasets, software, spreadsheets, text, etc. We applied this model to a case study where we analysed human metabolite variation by workflows. We present the application of the workflow-centric RO model for our bioinformatics case study. Three workflows were produced following recently defined Best Practices for workflow design. By modelling the experiment as an RO, we were able to automatically query the experiment and answer questions such as "which particular data was input to a particular workflow to test a particular hypothesis?", and "which particular conclusions were drawn from a particular workflow?". Applying a workflow-centric RO model to aggregate and annotate the resources used in a bioinformatics experiment, allowed us to retrieve the conclusions of the experiment in the context of the driving hypothesis, the executed workflows and their input data. The RO model is an extendable reference model that can be used by other systems as well. The Research Object is available at http://www.myexperiment.org/packs/428 The Wf4Ever Research Object Model is available at http://wf4ever.github.io/ro.
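
    As a toy illustration of the kind of query made possible by modelling an experiment as an aggregating Research Object, the following sketch builds a tiny RDF graph and asks which data was input to which workflow and which conclusion was drawn from it. It assumes the third-party rdflib package; the ex: vocabulary is invented for the example, whereas the actual RO model uses ORE aggregation and workflow provenance vocabularies.

```python
# Toy sketch of querying a Research-Object-like aggregation with SPARQL (rdflib).
# The ex: vocabulary is purely illustrative; the real RO model uses ORE plus
# workflow provenance vocabularies.
from rdflib import Graph

ttl = """
@prefix ex: <http://example.org/ro#> .

ex:ro1         ex:aggregates ex:workflow1, ex:dataset1, ex:conclusion1 .
ex:workflow1   ex:usedInput ex:dataset1 ;
               ex:testsHypothesis ex:hypothesis1 .
ex:conclusion1 ex:drawnFrom ex:workflow1 .
"""

g = Graph()
g.parse(data=ttl, format="turtle")

q = """
PREFIX ex: <http://example.org/ro#>
SELECT ?workflow ?input ?conclusion WHERE {
  ?workflow ex:usedInput ?input .
  ?conclusion ex:drawnFrom ?workflow .
}
"""
for row in g.query(q):
    print(f"{row.workflow} consumed {row.input}; conclusion: {row.conclusion}")
```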

  1. Characterizing workflow for pediatric asthma patients in emergency departments using electronic health records.

    Science.gov (United States)

    Ozkaynak, Mustafa; Dziadkowiec, Oliwier; Mistry, Rakesh; Callahan, Tiffany; He, Ze; Deakyne, Sara; Tham, Eric

    2015-10-01

    The purpose of this study was to describe a workflow analysis approach and apply it in emergency departments (EDs) using data extracted from the electronic health record (EHR) system. We used data that were obtained during 2013 from the ED of a children's hospital and its four satellite EDs. Workflow-related data were extracted for all patient visits with either a primary or secondary diagnosis on discharge of asthma (ICD-9 code=493). For each patient visit, eight different a priori time-stamped events were identified. Data were also collected on mode of arrival, patient demographics, triage score (i.e. acuity level), and primary/secondary diagnosis. Comparison groups were by acuity levels 2 and 3 with 2 being more acute than 3, arrival mode (ambulance versus walk-in), and site. Data were analyzed using a visualization method and Markov Chains. To demonstrate the viability and benefit of the approach, patient care workflows were visually and quantitatively compared. The analysis of the EHR data allowed for exploration of workflow patterns and variation across groups. Results suggest that workflow was different for different arrival modes, settings and acuity levels. EHRs can be used to explore workflow with statistical and visual analytics techniques novel to the health care setting. The results generated by the proposed approach could be utilized to help institutions identify workflow issues, plan for varied workflows and ultimately improve efficiency in caring for diverse patient groups. EHR data and novel analytic techniques in health care can expand our understanding of workflow in both large and small ED units. Copyright © 2015 Elsevier Inc. All rights reserved.
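
    The sketch below illustrates the Markov-chain part of such a workflow analysis: building first-order transition probabilities from ordered, time-stamped events per visit. The event names and sequences are invented, not data from the study.

```python
# Sketch of the workflow characterization described above: build a first-order
# Markov transition matrix from ordered ED events (event names are made up).
from collections import Counter, defaultdict

visits = [
    ["arrival", "triage", "room", "md_eval", "discharge"],
    ["arrival", "triage", "md_eval", "room", "discharge"],
    ["arrival", "triage", "room", "md_eval", "med_admin", "discharge"],
]

counts = defaultdict(Counter)
for events in visits:
    for src, dst in zip(events, events[1:]):   # consecutive event pairs
        counts[src][dst] += 1

for src, nxt in counts.items():
    total = sum(nxt.values())
    probs = ", ".join(f"{dst}: {c / total:.2f}" for dst, c in nxt.most_common())
    print(f"{src} -> {probs}")
```

    Comparing such transition matrices across acuity levels, arrival modes, or sites is one way the visual and quantitative comparisons described above can be made concrete.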

  2. Health information exchange technology on the front lines of healthcare: workflow factors and patterns of use

    Science.gov (United States)

    Johnson, Kevin B; Lorenzi, Nancy M

    2011-01-01

    Objective The goal of this study was to develop an in-depth understanding of how a health information exchange (HIE) fits into clinical workflow at multiple clinical sites. Materials and Methods The ethnographic qualitative study was conducted over a 9-month period in six emergency departments (ED) and eight ambulatory clinics in Memphis, Tennessee, USA. Data were collected using direct observation, informal interviews during observation, and formal semi-structured interviews. The authors observed for over 180 h, during which providers used the exchange 130 times. Results HIE-related workflow was modeled for each ED site and ambulatory clinic group and substantial site-to-site workflow differences were identified. Common patterns in HIE-related workflow were also identified across all sites, leading to the development of two role-based workflow models: nurse based and physician based. The workflow elements framework was applied to the two role-based patterns. An in-depth description was developed of how providers integrated HIE into existing clinical workflow, including prompts for HIE use. Discussion Workflow differed substantially among sites, but two general role-based HIE usage models were identified. Although providers used HIE to improve continuity of patient care, patient–provider trust played a significant role. Types of information retrieved related to roles, with nurses seeking to retrieve recent hospitalization data and more open-ended usage by nurse practitioners and physicians. User and role-specific customization to accommodate differences in workflow and information needs may increase the adoption and use of HIE. Conclusion Understanding end users' perspectives towards HIE technology is crucial to the long-term success of HIE. By applying qualitative methods, an in-depth understanding of HIE usage was developed. PMID:22003156

  3. Normal Pressure Hydrocephalus (NPH)

    Science.gov (United States)

    Normal pressure hydrocephalus (NPH) is a brain disorder that occurs when excess cerebrospinal fluid ... (consumer health page covering symptoms, diagnosis, causes and risks, and treatments; full text not captured).

  4. Health information technology: integration of clinical workflow into meaningful use of electronic health records.

    Science.gov (United States)

    Bowens, Felicia M; Frye, Patricia A; Jones, Warren A

    2010-10-01

    This article examines the role that clinical workflow plays in successful implementation and meaningful use of electronic health record (EHR) technology in ambulatory care. The benefits and barriers of implementing EHRs in ambulatory care settings are discussed. The researchers conclude that widespread adoption and meaningful use of EHR technology rely on the successful integration of health information technology (HIT) into clinical workflow. Without successful integration of HIT into clinical workflow, clinicians in today's ambulatory care settings will continue to resist adoption and implementation of EHR technology.

  5. Flexible Data-Aware Scheduling for Workflows over an In-Memory Object Store

    Energy Technology Data Exchange (ETDEWEB)

    Duro, Francisco Rodrigo; Garcia Blas, Javier; Isaila, Florin; Wozniak, Justin M.; Carretero, Jesus; Ross, Rob

    2016-01-01

    This paper explores novel techniques for improving the performance of many-task workflows based on the Swift scripting language. We propose novel programmer options for automated distributed data placement and task scheduling. These options trigger a data placement mechanism used for distributing intermediate workflow data over the servers of Hercules, a distributed key-value store that can be used to cache file system data. We demonstrate that these new mechanisms can significantly improve the aggregated throughput of many-task workflows by up to 86x, reduce the contention on the shared file system, exploit data locality, and trade off locality and load balance.

  6. Declarative Event-Based Workflow as Distributed Dynamic Condition Response Graphs

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2010-01-01

    We present Dynamic Condition Response Graphs (DCR Graphs) as a declarative, event-based process model inspired by the workflow language employed by our industrial partner and conservatively generalizing prime event structures. A dynamic condition response graph is a directed graph with nodes repr...... exemplify the use of distributed DCR Graphs on a simple workflow taken from a field study at a Danish hospital, pointing out their flexibility compared to imperative workflow models. Finally we provide a mapping from DCR Graphs to Buchi-automata....
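
    A simplified executable sketch of DCR Graph semantics may help make the model concrete. It covers only the condition, response, include, and exclude relations (no milestones, nesting, or distribution), and the example events are invented.

```python
# Simplified sketch of DCR Graph execution semantics (conditions, responses,
# include/exclude only; milestones, nesting and distribution are omitted).
class DCRGraph:
    def __init__(self, events, conditions=(), responses=(), includes=(), excludes=()):
        self.events = set(events)
        self.conditions = set(conditions)   # (a, b): a must have been executed before b
        self.responses = set(responses)     # (a, b): executing a makes b pending
        self.includes = set(includes)       # (a, b): executing a includes b
        self.excludes = set(excludes)       # (a, b): executing a excludes b
        self.executed, self.pending = set(), set()
        self.included = set(events)

    def enabled(self, e):
        if e not in self.included:
            return False
        # only currently included condition events constrain e
        return all(a in self.executed
                   for a, b in self.conditions if b == e and a in self.included)

    def execute(self, e):
        if not self.enabled(e):
            raise ValueError(f"event {e!r} is not enabled")
        self.executed.add(e)
        self.pending.discard(e)
        self.pending |= {b for a, b in self.responses if a == e}
        self.included -= {b for a, b in self.excludes if a == e}
        self.included |= {b for a, b in self.includes if a == e}

    def accepting(self):
        # a run is accepting when no included event is still pending
        return not (self.pending & self.included)

# Tiny hospital-style example: a prescription requires a later sign-off.
g = DCRGraph(events={"prescribe", "sign"},
             conditions={("prescribe", "sign")},
             responses={("prescribe", "sign")})
g.execute("prescribe")
print(g.enabled("sign"), g.accepting())   # True, False (sign-off still pending)
g.execute("sign")
print(g.accepting())                      # True
```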

  7. mapDIA: Preprocessing and statistical analysis of quantitative proteomics data from data independent acquisition mass spectrometry.

    Science.gov (United States)

    Teo, Guoshou; Kim, Sinae; Tsou, Chih-Chiang; Collins, Ben; Gingras, Anne-Claude; Nesvizhskii, Alexey I; Choi, Hyungwon

    2015-11-03

    Data independent acquisition (DIA) mass spectrometry is an emerging technique that offers more complete detection and quantification of peptides and proteins across multiple samples. DIA allows fragment-level quantification, which can be considered as repeated measurements of the abundance of the corresponding peptides and proteins in the downstream statistical analysis. However, few statistical approaches are available for aggregating these complex fragment-level data into peptide- or protein-level statistical summaries. In this work, we describe a software package, mapDIA, for statistical analysis of differential protein expression using DIA fragment-level intensities. The workflow consists of three major steps: intensity normalization, peptide/fragment selection, and statistical analysis. First, mapDIA offers normalization of fragment-level intensities by total intensity sums as well as a novel alternative normalization by local intensity sums in retention time space. Second, mapDIA removes outlier observations and selects peptides/fragments that preserve the major quantitative patterns across all samples for each protein. Last, using the selected fragments and peptides, mapDIA performs model-based statistical significance analysis of protein-level differential expression between specified groups of samples. Using a comprehensive set of simulation datasets, we show that mapDIA detects differentially expressed proteins with accurate control of the false discovery rates. We also describe the analysis procedure in detail using two recently published DIA datasets generated for the 14-3-3β dynamic interaction network and the prostate cancer glycoproteome. The software was written in C++ and the source code is freely available through the SourceForge website http://sourceforge.net/projects/mapdia/. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
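
    A minimal sketch of the first step named above, total-intensity-sum normalization of a fragment intensity matrix, is shown below. It does not cover mapDIA's retention-time-local normalization, fragment selection, or significance analysis; the matrix values are arbitrary.

```python
# Minimal sketch of total-intensity-sum normalization of a fragment intensity
# matrix (fragments x samples); illustrates only the first mapDIA step.
import numpy as np

def normalize_total_intensity(intensities):
    """Scale each sample (column) so all samples share the same total intensity."""
    X = np.asarray(intensities, dtype=float)
    totals = X.sum(axis=0)
    target = totals.mean()
    return X * (target / totals)

X = np.array([[1200.0, 1500.0, 900.0],
              [ 300.0,  450.0, 250.0],
              [  80.0,  110.0,  60.0]])
print(normalize_total_intensity(X).sum(axis=0))   # identical column totals
```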

  8. Accelerating Science Impact through Big Data Workflow Management and Supercomputing

    Directory of Open Access Journals (Sweden)

    De K.

    2016-01-01

    Full Text Available The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. ATLAS, one of the largest collaborations ever assembled in the history of science, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment is relying on a heterogeneous distributed computational infrastructure. To manage the workflow for all data processing on hundreds of data centers, the PanDA (Production and Distributed Analysis) Workload Management System is used. An ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF), is being realized within the BigPanDA and megaPanDA projects. These projects are now exploring how PanDA might be used for managing computing jobs that run on supercomputers including OLCF's Titan and NRC-KI HPC2. The main idea is to reuse, as much as possible, existing components of the PanDA system that are already deployed on the LHC Grid for analysis of physics data. The next generation of PanDA will allow many data-intensive sciences employing a variety of computing platforms to benefit from ATLAS experience and proven tools in highly scalable processing.

  9. Soarian--workflow management applied for health care.

    Science.gov (United States)

    Haux, R; Seggewies, C; Baldauf-Sobez, W; Kullmann, P; Reichert, H; Luedecke, L; Seibold, H

    2003-01-01

    To describe and comment on the functionality and architecture of the software product Soarian developed by Siemens, to identify key differentiators from related products, and to comment on predecessor systems and beta versions. This has been done in the framework of a conference on health information systems of the IMIA. Analyzing existing literature. Site visit of a predecessor system at Haukeland Sykehus, Bergen. Pilot of a beta version at the Erlangen University Medical Center, elaborating on major characteristics in discussion rounds. Soarian is a functionally comprehensive, clinically oriented software product to support health care processes and to be used for health care professional workstations. It is a software product that has been designed and written completely from scratch. Three key differentiators were identified in comparison to related software products: Soarian's workflow engine, its embedded analytics, and its 'smart' user interface. The targeted reduced installation time is stated to be 12 months or less. Soarian has good chances to become one of the major software products for health care professional workstations in the international market to support patient-centered, shared care. Its global design may help to better support and maintain national or language-specific versions. The first installations of Soarian will be critical, as they will show how the system will be accepted. To use such software products efficiently, organizational aspects within hospitals as well as between health care institutions have to be considered, e.g. strategic IT planning.

  10. The Diabetic Retinopathy Screening Workflow: Potential for Smartphone Imaging.

    Science.gov (United States)

    Bolster, Nigel M; Giardini, Mario E; Bastawrous, Andrew

    2015-11-23

    Complications of diabetes mellitus, namely diabetic retinopathy and diabetic maculopathy, are the leading cause of blindness in working aged people. Sufferers can avoid blindness if identified early via retinal imaging. Systematic screening of the diabetic population has been shown to greatly reduce the prevalence and incidence of blindness within the population. Many national screening programs have digital fundus photography as their basis. In the past 5 years several techniques and adapters have been developed that allow digital fundus photography to be performed using smartphones. We review recent progress in smartphone-based fundus imaging and discuss its potential for integration into national systematic diabetic retinopathy screening programs. Some systems have produced promising initial results with respect to their agreement with reference standards. However further multisite trialling of such systems' use within implementable screening workflows is required if an evidence base strong enough to affect policy change is to be established. If this were to occur national diabetic retinopathy screening would, for the first time, become possible in low- and middle-income settings where cost and availability of trained eye care personnel are currently key barriers to implementation. As diabetes prevalence and incidence is increasing sharply in these settings, the impact on global blindness could be profound. © 2015 Diabetes Technology Society.

  11. Aligning observed and modelled behaviour based on workflow decomposition

    Science.gov (United States)

    Wang, Lu; Du, YuYue; Liu, Wei

    2017-09-01

    When business processes are mostly supported by information systems, both the availability of event logs generated from these systems and the demand for appropriate process models are increasing. Business processes can be discovered, monitored and enhanced by extracting process-related information. However, some events cannot be correctly identified because of the explosion in the volume of event logs. Therefore, a new process mining technique is proposed based on a workflow decomposition method in this paper. Petri nets (PNs) are used to describe business processes, and then conformance checking of event logs and process models is investigated. A decomposition approach is proposed to divide large process models and event logs into several separate parts that can be analysed independently, while an alignment approach based on a state equation method in PN theory enhances the performance of conformance checking. Both approaches are implemented in the ProM process mining framework. The correctness and effectiveness of the proposed methods are illustrated through experiments.
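
    The state (marking) equation that underlies such alignment approaches can be sketched in a few lines: given an incidence matrix C, an initial marking m0, and a firing-count vector x, the reached marking is m' = m0 + Cx, and any negative entry rules the vector out. The toy net below is an invented example for illustration.

```python
# Sketch of the Petri net state (marking) equation m' = m0 + C.x.
# Non-negativity of m' is a necessary (not sufficient) condition for the
# firing-count vector to be realizable from m0. Net and counts are toy examples.
import numpy as np

# Places p0..p2 and transitions t0, t1 for the tiny sequence p0 -t0-> p1 -t1-> p2.
C = np.array([[-1,  0],    # p0
              [ 1, -1],    # p1
              [ 0,  1]])   # p2
m0 = np.array([1, 0, 0])

def marking_after(m0, C, firing_counts):
    m = m0 + C @ np.asarray(firing_counts)
    if (m < 0).any():
        raise ValueError("negative marking: this firing-count vector is not realizable from m0")
    return m

print(marking_after(m0, C, [1, 1]))   # -> [0 0 1]
```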

  12. Perceptions of Risk Stratification Workflows in Primary Care

    Directory of Open Access Journals (Sweden)

    Rachel L. Ross

    2017-10-01

    Full Text Available Risk stratification (RS) in primary care is frequently used by policy-makers, payers, and health systems; the process requires risk assessment for adverse health outcomes across a population to assign patients into risk tiers and allow care management (CM) resources to be targeted effectively. Our objective was to understand the approach to and perception of RS in primary care practices. An online survey was developed, tested, and administered to 148 representatives of 37 primary care practices engaged in RS varying in size, location and ownership. The survey assessed practices’ approach to, perception of, and confidence in RS, and its effect on subsequent CM activities. We examined psychometric properties of the survey to determine validity and conducted chi-square analyses to determine the association between practice characteristics and confidence and agreement with risk scores. The survey yielded a 68% response rate (100 respondents). Overall, participants felt moderately confident in their risk scores (range 41–53.8%), and moderately to highly confident in their subsequent CM workflows (range 46–68%). Respondents from small and independent practices were more likely to have higher confidence and agreement with their RS approaches and scores (p < 0.01). Confidence levels were highest, however, when practices incorporated human review into their RS processes (p < 0.05). This trend was not affected by respondents’ professional roles. Additional work from a broad mixed-methods effort will add to our understanding of RS implementation processes and outcomes.

  13. Automated Finite State Workflow for Distributed Data Production

    Science.gov (United States)

    Hajdu, L.; Didenko, L.; Lauret, J.; Amol, J.; Betts, W.; Jang, H. J.; Noh, S. Y.

    2016-10-01

    In statistically hungry science domains, data deluges can be both a blessing and a curse. They allow the narrowing of statistical errors from known measurements, and open the door to new scientific opportunities as research programs mature. They are also a testament to the efficiency of experimental operations. However, growing data samples may need to be processed with little or no opportunity for huge increases in computing capacity. A standard strategy has thus been to share resources across multiple experiments at a given facility. Another has been to use middleware that “glues” resources across the world so they are able to locally run the experimental software stack (either natively or virtually). We describe a framework STAR has successfully used to reconstruct a ~400 TB dataset consisting of over 100,000 jobs submitted to a remote site in Korea from STAR's Tier 0 facility at the Brookhaven National Laboratory. The framework automates the full workflow, taking raw data files from tape and writing Physics-ready output back to tape without operator or remote site intervention. Through hardening we have demonstrated 97(±2)% efficiency, over a period of 7 months of operation. The high efficiency is attributed to finite state checking with retries to encourage resilience in the system over capricious and fallible infrastructure.
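
    A toy sketch of the finite-state-with-retries idea credited for the high efficiency: each job moves through a small set of states and failed jobs are re-queued up to a bounded number of times. The states, retry limit, and random failure model are illustrative, not STAR's framework.

```python
# Toy sketch of finite-state checking with bounded retries for per-file production
# jobs; states, retry limit and the random failure model are illustrative only.
import random

MAX_RETRIES = 3

def attempt(job):
    """Pretend to run one job; fail randomly to exercise the retry path."""
    return "done" if random.random() > 0.3 else "failed"

def run(jobs):
    status = {j: "staged" for j in jobs}
    retries = {j: 0 for j in jobs}
    pending = list(jobs)
    while pending:
        job = pending.pop(0)
        status[job] = attempt(job)
        if status[job] == "failed" and retries[job] < MAX_RETRIES:
            retries[job] += 1
            pending.append(job)          # re-queue instead of giving up
    return status

print(run([f"raw_file_{i}.daq" for i in range(5)]))
```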

  14. Automated Finite State Workflow for Distributed Data Production

    International Nuclear Information System (INIS)

    Hajdu, L; Didenko, L; Lauret, J; Betts, W; Amol, J; Jang, H J; Noh, S Y

    2016-01-01

    In statistically hungry science domains, data deluges can be both a blessing and a curse. They allow the narrowing of statistical errors from known measurements, and open the door to new scientific opportunities as research programs mature. They are also a testament to the efficiency of experimental operations. However, growing data samples may need to be processed with little or no opportunity for huge increases in computing capacity. A standard strategy has thus been to share resources across multiple experiments at a given facility. Another has been to use middleware that “glues” resources across the world so they are able to locally run the experimental software stack (either natively or virtually). We describe a framework STAR has successfully used to reconstruct a ∼400 TB dataset consisting of over 100,000 jobs submitted to a remote site in Korea from STAR's Tier 0 facility at the Brookhaven National Laboratory. The framework automates the full workflow, taking raw data files from tape and writing Physics-ready output back to tape without operator or remote site intervention. Through hardening we have demonstrated 97(±2)% efficiency, over a period of 7 months of operation. The high efficiency is attributed to finite state checking with retries to encourage resilience in the system over capricious and fallible infrastructure. (paper)

  15. Clinical workflow for personalized foot pressure ulcer prevention.

    Science.gov (United States)

    Bucki, M; Luboz, V; Perrier, A; Champion, E; Diot, B; Vuillerme, N; Payan, Y

    2016-09-01

    Foot pressure ulcers are a common complication of diabetes because of patient's lack of sensitivity due to neuropathy. Deep pressure ulcers appear internally when pressures applied on the foot create high internal strains nearby bony structures. Monitoring tissue strains in persons with diabetes is therefore important for an efficient prevention. We propose to use personalized biomechanical foot models to assess strains within the foot and to determine the risk of ulcer formation. Our workflow generates a foot model adapted to a patient's morphology by deforming an atlas model to conform it to the contours of segmented medical images of the patient's foot. Our biomechanical model is composed of rigid bodies for the bones, joined by ligaments and muscles, and a finite element mesh representing the soft tissues. Using our registration algorithm to conform three datasets, three new patient models were created. After applying a pressure load below these foot models, the Von Mises equivalent strains and "cluster volumes" (i.e. volumes of contiguous elements with strains above a given threshold) were measured within eight functionally meaningful foot regions. The results show the variability of both location and strain values among the three considered patients. This study also confirms that the anatomy of the foot has an influence on the risk of pressure ulcer. Copyright © 2016. Published by Elsevier Ltd.
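
    The "cluster volume" measure can be sketched as a connected-component search over the element adjacency graph, summing element volumes where the strain exceeds a threshold. The adjacency, strain, and volume values below are toy inputs, not the personalized foot model.

```python
# Illustrative sketch of the "cluster volume" measure: total volume of each
# connected group of elements whose strain exceeds a threshold. Toy inputs only.
from collections import deque

def cluster_volumes(strain, volume, adjacency, threshold):
    """Return the total volume of each connected cluster of over-threshold elements."""
    hot = {e for e, s in strain.items() if s >= threshold}
    seen, clusters = set(), []
    for start in hot:
        if start in seen:
            continue
        queue, members = deque([start]), set()
        seen.add(start)
        while queue:                       # breadth-first search within the hot set
            e = queue.popleft()
            members.add(e)
            for nb in adjacency.get(e, ()):
                if nb in hot and nb not in seen:
                    seen.add(nb)
                    queue.append(nb)
        clusters.append(sum(volume[e] for e in members))
    return sorted(clusters, reverse=True)

strain = {0: 0.6, 1: 0.55, 2: 0.1, 3: 0.7, 4: 0.2}
volume = {e: 1.0 for e in strain}
adjacency = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4], 4: [3]}
print(cluster_volumes(strain, volume, adjacency, threshold=0.5))  # [2.0, 1.0]
```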

  16. 76 FR 71928 - Defense Federal Acquisition Regulation Supplement; Updates to Wide Area WorkFlow (DFARS Case 2011...

    Science.gov (United States)

    2011-11-21

    Only fragments of the notice text were captured: an amendment replacing "232.7004" with "232.7004(a)" in introductory text, and references to submitting payment requests through WAWF and to the associated "Web Based Training" link.

  17. Data acquisition upgrade in the RFX experiment

    International Nuclear Information System (INIS)

    Barana, O.; Luchetta, A.; Manduchi, G.; Taliercio, C.

    2005-01-01

    The control and data acquisition system of RFX has been completely renewed and is currently under re-commissioning. Most data acquisition is now performed by means of CompactPCI (CPCI) devices supervised by Pentium PCs running Linux. Real-time control systems, implemented using VME and PowerPCs running VxWorks, also produce data that are then acquired by the data acquisition system. The older CAMAC systems have only been retained for existing diagnostics. New diagnostic systems will employ either CompactPCI data acquisition or custom solutions, usually running under Windows, because drivers for the devices used are normally available for that platform. Despite the variety of hardware and software platforms involved in data acquisition, the same software package is used for all components, thus providing a uniform view of the system. Such functionality is provided by the MDSplus software. MDSplus is now available for a variety of platforms, and includes several Java components that are platform-independent.

  18. Post-Acquisition IT Integration

    DEFF Research Database (Denmark)

    Henningsson, Stefan; Yetton, Philip

    2013-01-01

    The extant research on post-acquisition IT integration analyzes how acquirers realize IT-based value in individual acquisitions. However, serial acquirers make 60% of acquisitions. These acquisitions are not isolated events, but are components in growth-by-acquisition programs. To explain how...... serial acquirers realize IT-based value, we develop three propositions on the sequential effects on post-acquisition IT integration in acquisition programs. Their combined explanation is that serial acquirers must have a growth-by-acquisition strategy that includes the capability to improve...... IT integration capabilities, to sustain high alignment across acquisitions and to maintain a scalable IT infrastructure with a flat or decreasing cost structure. We begin the process of validating the three propositions by investigating a longitudinal case study of a growth-by-acquisition program....

  19. A framework for service enterprise workflow simulation with multi-agents cooperation

    Science.gov (United States)

    Tan, Wenan; Xu, Wei; Yang, Fujun; Xu, Lida; Jiang, Chuanqun

    2013-11-01

    Dynamic process modelling for service businesses is a key technique for service-oriented information systems and service business management, and the workflow model of business processes is the core part of service systems. Service business workflow simulation is the prevalent approach for analysing service business processes dynamically. The generic method for service business workflow simulation is based on discrete-event queuing theory, which lacks flexibility and scalability. In this paper, we propose a service workflow-oriented framework for the process simulation of service businesses using multi-agent cooperation to address the above issues. The social rationality of agents is introduced into the proposed framework. Adopting rationality as a social factor in decision-making strategies, flexible scheduling of activity instances has been implemented. A system prototype has been developed to validate the proposed simulation framework through a business case study.

  20. The OCoN approach to workflow modeling in object-oriented systems

    NARCIS (Netherlands)

    Wirtz, G.; Weske, M.H.; Giese, H.

    2001-01-01

    Workflow management aims at modeling and executing application processes in complex technical and organizational environments. Modern information systems are often based on object-oriented design techniques, for instance, the Unified Modeling Language (UML). These systems consist of application