WorldWideScience

Sample records for spectrometry workflow combining

  1. Highly Reproducible Automated Proteomics Sample Preparation Workflow for Quantitative Mass Spectrometry.

    Science.gov (United States)

    Fu, Qin; Kowalski, Michael P; Mastali, Mitra; Parker, Sarah J; Sobhani, Kimia; van den Broek, Irene; Hunter, Christie L; Van Eyk, Jennifer E

    2018-01-05

    Sample preparation for protein quantification by mass spectrometry requires multiple processing steps, including denaturation, reduction, alkylation, protease digestion, and peptide cleanup. Scaling these procedures for the analysis of numerous complex biological samples can be tedious and time-consuming, as there are many liquid transfer steps and timed reactions in which technical variation can be introduced and propagated. We established an automated sample preparation workflow with a total processing time for 96 samples of 5 h, including a 2 h incubation with trypsin. Peptide cleanup is accomplished by online diversion during the LC/MS/MS analysis. In a selected reaction monitoring (SRM) assay targeting 6 plasma biomarkers and spiked β-galactosidase, mean intraday and interday coefficients of variation (CVs) for 5 serum and 5 plasma samples over 5 days, as well as total CVs for samples repeated on 3 separate days, were below 20%. Similar results were obtained when the workflow was transferred to a second site: 93% of peptides had CVs below 20%. An automated trypsin digestion workflow yields uniformly processed samples in less than 5 h. Reproducible quantification of peptides was observed across replicates, days, instruments, and laboratory sites, demonstrating the broad applicability of this approach.
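
    As an aside on the reproducibility metric used above, the sketch below (Python, not taken from the paper) shows how intraday, interday, and total coefficients of variation can be computed from replicate peak areas; the replicate layout, example areas, and the 20% acceptance threshold are illustrative assumptions.

```python
# Illustrative sketch (not from the paper): computing intraday, interday, and
# total coefficients of variation (CV) for peptide peak areas measured by SRM.
# Replicate structure, example areas, and thresholds are assumptions.
import numpy as np

def cv_percent(values):
    """CV (%) = sample standard deviation / mean * 100."""
    values = np.asarray(values, dtype=float)
    return values.std(ddof=1) / values.mean() * 100.0

# peak areas for one peptide: rows = 3 days, columns = 4 within-day replicates
areas = np.array([
    [1.02e6, 0.98e6, 1.05e6, 1.00e6],
    [1.10e6, 1.07e6, 1.12e6, 1.08e6],
    [0.95e6, 0.97e6, 0.93e6, 0.99e6],
])

intraday_cvs = [cv_percent(day) for day in areas]   # variation within each day
interday_cv = cv_percent(areas.mean(axis=1))        # variation of the daily means
total_cv = cv_percent(areas.ravel())                # all replicates pooled

print(f"intraday CVs: {[round(c, 1) for c in intraday_cvs]} %")
print(f"interday CV: {interday_cv:.1f} %, total CV: {total_cv:.1f} %")
print("acceptable" if total_cv < 20 else "exceeds 20% threshold")
```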

  2. An evolving computational platform for biological mass spectrometry: workflows, statistics and data mining with MASSyPup64.

    Science.gov (United States)

    Winkler, Robert

    2015-01-01

    In biological mass spectrometry, crude instrumental data need to be converted into meaningful theoretical models. Several data processing and data evaluation steps are required to arrive at the final results. These operations are often difficult to reproduce because they depend on overly specific computing platforms. This effect, known as 'workflow decay', can be diminished by using a standardized informatic infrastructure. Thus, we compiled an integrated platform, which contains ready-to-use tools and workflows for mass spectrometry data analysis. Apart from general unit operations, such as peak picking and identification of proteins and metabolites, we put a strong emphasis on the statistical validation of results and Data Mining. MASSyPup64 includes, e.g., the OpenMS/TOPPAS framework, the Trans-Proteomic-Pipeline programs, the ProteoWizard tools, X!Tandem, Comet and SpiderMass. The statistical computing language R is installed with packages for MS data analyses, such as XCMS/metaXCMS and MetabR. The R package Rattle provides user-friendly access to multiple Data Mining methods. Further, we added the non-conventional spreadsheet program teapot for editing large data sets and a command line tool for transposing large matrices. Individual programs, console commands and modules can be integrated using the Workflow Management System (WMS) Taverna. We illustrate useful combinations of the tools with practical examples: (1) a workflow for protein identification and validation, with subsequent Association Analysis of peptides, (2) cluster analysis and Data Mining in targeted Metabolomics, and (3) raw data processing, Data Mining and identification of metabolites in untargeted Metabolomics. Association Analyses reveal relationships between variables across different sample sets. We present its application for finding co-occurring peptides, which can be used for targeted proteomics, the discovery of alternative biomarkers and protein-protein interactions. Data Mining derived models

  3. Pre-analytic evaluation of volumetric absorptive microsampling and integration in a mass spectrometry-based metabolomics workflow.

    Science.gov (United States)

    Volani, Chiara; Caprioli, Giulia; Calderisi, Giovanni; Sigurdsson, Baldur B; Rainer, Johannes; Gentilini, Ivo; Hicks, Andrew A; Pramstaller, Peter P; Weiss, Guenter; Smarason, Sigurdur V; Paglia, Giuseppe

    2017-10-01

    Volumetric absorptive microsampling (VAMS) is a novel approach that allows single-drop (10 μL) blood collection. Integration of VAMS with mass spectrometry (MS)-based untargeted metabolomics is an attractive solution for both human and animal studies. However, to boost the use of VAMS in metabolomics, key pre-analytical questions need to be addressed. Therefore, in this work, we integrated VAMS in an MS-based untargeted metabolomics workflow and investigated pre-analytical strategies such as sample extraction procedures and metabolome stability at different storage conditions. We first evaluated the best extraction procedure for the polar metabolome and found that the highest number and amount of metabolites were recovered upon extraction with acetonitrile/water (70:30). In contrast, basic conditions (pH 9) resulted in divergent metabolite profiles mainly resulting from the extraction of intracellular metabolites originating from red blood cells. In addition, the prolonged storage of blood samples at room temperature caused significant changes in metabolome composition, but once the VAMS devices were stored at -80 °C, the metabolome remained stable for up to 6 months. The time used for drying the sample also affected the metabolome. In fact, some metabolites were rapidly degraded or accumulated in the sample during the first 48 h at room temperature, indicating that a longer drying step will significantly change the concentration in the sample. Graphical abstract Volumetric absorptive microsampling (VAMS) is a novel technology that allows single-drop blood collection and, in combination with mass spectrometry (MS)-based untargeted metabolomics, represents an attractive solution for both human and animal studies. In this work, we integrated VAMS in an MS-based untargeted metabolomics workflow and investigated pre-analytical strategies such as sample extraction procedures and metabolome stability at different storage conditions. The latter revealed that

  4. Selecting Sample Preparation Workflows for Mass Spectrometry-Based Proteomic and Phosphoproteomic Analysis of Patient Samples with Acute Myeloid Leukemia.

    Science.gov (United States)

    Hernandez-Valladares, Maria; Aasebø, Elise; Selheim, Frode; Berven, Frode S; Bruserud, Øystein

    2016-08-22

    Global mass spectrometry (MS)-based proteomic and phosphoproteomic studies of acute myeloid leukemia (AML) biomarkers represent a powerful strategy to identify and confirm proteins and their phosphorylated modifications that could be applied in diagnosis and prognosis, as a support for individual treatment regimens and selection of patients for bone marrow transplant. MS-based studies require optimal and reproducible workflows that allow a satisfactory coverage of the proteome and its modifications. Preparation of samples for global MS analysis is a crucial step and it usually requires method testing, tuning and optimization. Different proteomic workflows that have been used to prepare AML patient samples for global MS analysis usually include a standard protein in-solution digestion procedure with a urea-based lysis buffer. The enrichment of phosphopeptides from AML patient samples has previously been carried out either with immobilized metal affinity chromatography (IMAC) or metal oxide affinity chromatography (MOAC). We have recently tested several methods of sample preparation for MS analysis of the AML proteome and phosphoproteome and introduced filter-aided sample preparation (FASP) as a superior methodology for the sensitive and reproducible generation of peptides from patient samples. FASP-prepared peptides can be further fractionated or IMAC-enriched for proteome or phosphoproteome analyses. Herein, we will review both in-solution and FASP-based sample preparation workflows and encourage the use of the latter for the highest protein and phosphorylation coverage and reproducibility.

  5. Selecting Sample Preparation Workflows for Mass Spectrometry-Based Proteomic and Phosphoproteomic Analysis of Patient Samples with Acute Myeloid Leukemia

    Directory of Open Access Journals (Sweden)

    Maria Hernandez-Valladares

    2016-08-01

    Full Text Available Global mass spectrometry (MS)-based proteomic and phosphoproteomic studies of acute myeloid leukemia (AML) biomarkers represent a powerful strategy to identify and confirm proteins and their phosphorylated modifications that could be applied in diagnosis and prognosis, as a support for individual treatment regimens and selection of patients for bone marrow transplant. MS-based studies require optimal and reproducible workflows that allow a satisfactory coverage of the proteome and its modifications. Preparation of samples for global MS analysis is a crucial step and it usually requires method testing, tuning and optimization. Different proteomic workflows that have been used to prepare AML patient samples for global MS analysis usually include a standard protein in-solution digestion procedure with a urea-based lysis buffer. The enrichment of phosphopeptides from AML patient samples has previously been carried out either with immobilized metal affinity chromatography (IMAC) or metal oxide affinity chromatography (MOAC). We have recently tested several methods of sample preparation for MS analysis of the AML proteome and phosphoproteome and introduced filter-aided sample preparation (FASP) as a superior methodology for the sensitive and reproducible generation of peptides from patient samples. FASP-prepared peptides can be further fractionated or IMAC-enriched for proteome or phosphoproteome analyses. Herein, we will review both in-solution and FASP-based sample preparation workflows and encourage the use of the latter for the highest protein and phosphorylation coverage and reproducibility.

  6. A Comprehensive Workflow of Mass Spectrometry-Based Untargeted Metabolomics in Cancer Metabolic Biomarker Discovery Using Human Plasma and Urine

    Directory of Open Access Journals (Sweden)

    Jianwen She

    2013-09-01

    Full Text Available Currently available biomarkers lack sensitivity and/or specificity for early detection of cancer. To address this challenge, a robust and complete workflow for metabolic profiling and data mining is described in detail. Three independent and complementary analytical techniques for metabolic profiling are applied: hydrophilic interaction liquid chromatography (HILIC–LC), reversed-phase liquid chromatography (RP–LC), and gas chromatography (GC). All three techniques are coupled to a mass spectrometer (MS) in the full scan acquisition mode, and both unsupervised and supervised methods are used for data mining. Univariate and multivariate feature selection are used to determine subsets of potentially discriminative predictors. These predictors are further identified by obtaining accurate masses and isotopic ratios using selected ion monitoring (SIM) and data-dependent MS/MS and/or accurate mass MSn ion tree scans utilizing high resolution MS. A list combining all of the identified potential biomarkers generated from different platforms and algorithms is used for pathway analysis. Such a workflow combining comprehensive metabolic profiling and advanced data mining techniques may provide a powerful approach for metabolic pathway analysis and biomarker discovery in cancer research. Two case studies with previously published data are adapted and included in the context to elucidate the application of the workflow.
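
    The univariate feature-selection step described above can be illustrated with a minimal sketch: a per-feature Welch's t-test with Benjamini-Hochberg FDR control. The simulated data matrix, group sizes, and 5% FDR cutoff are assumptions for the example and are not taken from the study.

```python
# Hedged sketch of univariate feature selection for metabolic profiling:
# Welch's t-test per feature followed by Benjamini-Hochberg adjustment.
# All data are simulated; the spiked features are only there to show the idea.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_case, n_ctrl, n_features = 20, 20, 500
cases = rng.lognormal(mean=0.0, sigma=0.5, size=(n_case, n_features))
controls = rng.lognormal(mean=0.0, sigma=0.5, size=(n_ctrl, n_features))
cases[:, :10] *= 2.0   # spike in 10 "discriminative" features

# per-feature Welch's t-test on log-transformed intensities
_, pvals = stats.ttest_ind(np.log(cases), np.log(controls), axis=0, equal_var=False)

# Benjamini-Hochberg adjustment
order = np.argsort(pvals)
ranked = pvals[order] * n_features / (np.arange(n_features) + 1)
qvals = np.empty_like(pvals)
qvals[order] = np.minimum.accumulate(ranked[::-1])[::-1]

selected = np.where(qvals < 0.05)[0]
print(f"{selected.size} candidate features at FDR < 5%: {selected[:10]} ...")
```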

  7. Identifying Urinary and Serum Exosome Biomarkers for Radiation Exposure Using a Data Dependent Acquisition and SWATH-MS Combined Workflow

    International Nuclear Information System (INIS)

    Kulkarni, Shilpa; Koller, Antonius; Mani, Kartik M.; Wen, Ruofeng; Alfieri, Alan; Saha, Subhrajit; Wang, Jian; Patel, Purvi; Bandeira, Nuno; Guha, Chandan

    2016-01-01

    Purpose: Early and accurate assessment of radiation injury by radiation-responsive biomarkers is critical for triage and early intervention. Biofluids such as urine and serum are convenient for such analysis. Recent research has also suggested that exosomes are a reliable source of biomarkers in disease progression. In the present study, we analyzed the total urine proteome and exosomes isolated from urine or serum for potential biomarkers of acute and persistent radiation injury in mice exposed to lethal whole body irradiation (WBI). Methods and Materials: For feasibility studies, the mice were irradiated at 10.4 Gy WBI, and urine and serum samples were collected 24 and 72 hours after irradiation. Exosomes were isolated and analyzed using a liquid chromatography mass spectrometry/mass spectrometry-based workflow for radiation exposure signatures. A data dependent acquisition and SWATH-MS combined workflow approach was used to identify significant exosome biomarkers indicative of acute or persistent radiation-induced responses. For the validation studies, mice were exposed to 3, 6, 8, or 10 Gy WBI, and samples were analyzed for comparison. Results: A comparison between total urine proteomics and urine exosome proteomics demonstrated that exosome proteomic analysis was superior in identifying radiation signatures. Feasibility studies identified 23 biomarkers from urine and 24 biomarkers from serum exosomes after WBI. Urinary exosome signatures identified different physiological parameters than the ones obtained in serum exosomes. Exosome signatures from urine indicated injury to the liver, gastrointestinal, and genitourinary tracts. In contrast, serum showed vascular injuries and acute inflammation in response to radiation. Selected urinary exosomal biomarkers also showed changes at lower radiation doses in validation studies. Conclusions: Exosome proteomics revealed radiation- and time-dependent protein signatures after WBI. A total of 47 differentially secreted

  8. Identifying Urinary and Serum Exosome Biomarkers for Radiation Exposure Using a Data Dependent Acquisition and SWATH-MS Combined Workflow

    Energy Technology Data Exchange (ETDEWEB)

    Kulkarni, Shilpa [Department of Radiation Oncology, Albert Einstein College of Medicine, Bronx, New York (United States); Koller, Antonius [Proteomics Center, Stony Brook University School of Medicine, Stony Brook, New York (United States); Proteomics Shared Resource, Herbert Irving Comprehensive Cancer Center, New York, New York (United States); Mani, Kartik M. [Department of Radiation Oncology, Albert Einstein College of Medicine, Bronx, New York (United States); Wen, Ruofeng [Department of Applied Mathematics and Statistics, Stony Brook University, Stony Brook, New York (United States); Alfieri, Alan; Saha, Subhrajit [Department of Radiation Oncology, Albert Einstein College of Medicine, Bronx, New York (United States); Wang, Jian [Center for Computational Mass Spectrometry, University of California, San Diego, California (United States); Department of Computer Science and Engineering, University of California, San Diego, California (United States); Patel, Purvi [Proteomics Shared Resource, Herbert Irving Comprehensive Cancer Center, New York, New York (United States); Department of Pharmacological Sciences, Stony Brook University, Stony Brook, New York (United States); Bandeira, Nuno [Center for Computational Mass Spectrometry, University of California, San Diego, California (United States); Department of Computer Science and Engineering, University of California, San Diego, California (United States); Skaggs School of Pharmacy and Pharmaceutical Sciences, University of California, San Diego, California (United States); Guha, Chandan, E-mail: cguha@montefiore.org [Department of Radiation Oncology, Albert Einstein College of Medicine, Bronx, New York (United States); and others

    2016-11-01

    Purpose: Early and accurate assessment of radiation injury by radiation-responsive biomarkers is critical for triage and early intervention. Biofluids such as urine and serum are convenient for such analysis. Recent research has also suggested that exosomes are a reliable source of biomarkers in disease progression. In the present study, we analyzed the total urine proteome and exosomes isolated from urine or serum for potential biomarkers of acute and persistent radiation injury in mice exposed to lethal whole body irradiation (WBI). Methods and Materials: For feasibility studies, the mice were irradiated at 10.4 Gy WBI, and urine and serum samples were collected 24 and 72 hours after irradiation. Exosomes were isolated and analyzed using a liquid chromatography mass spectrometry/mass spectrometry-based workflow for radiation exposure signatures. A data dependent acquisition and SWATH-MS combined workflow approach was used to identify significant exosome biomarkers indicative of acute or persistent radiation-induced responses. For the validation studies, mice were exposed to 3, 6, 8, or 10 Gy WBI, and samples were analyzed for comparison. Results: A comparison between total urine proteomics and urine exosome proteomics demonstrated that exosome proteomic analysis was superior in identifying radiation signatures. Feasibility studies identified 23 biomarkers from urine and 24 biomarkers from serum exosomes after WBI. Urinary exosome signatures identified different physiological parameters than the ones obtained in serum exosomes. Exosome signatures from urine indicated injury to the liver, gastrointestinal, and genitourinary tracts. In contrast, serum showed vascular injuries and acute inflammation in response to radiation. Selected urinary exosomal biomarkers also showed changes at lower radiation doses in validation studies. Conclusions: Exosome proteomics revealed radiation- and time-dependent protein signatures after WBI. A total of 47 differentially secreted

  9. An integrated workflow for robust alignment and simplified quantitative analysis of NMR spectrometry data.

    Science.gov (United States)

    Vu, Trung N; Valkenborg, Dirk; Smets, Koen; Verwaest, Kim A; Dommisse, Roger; Lemière, Filip; Verschoren, Alain; Goethals, Bart; Laukens, Kris

    2011-10-20

    Nuclear magnetic resonance spectroscopy (NMR) is a powerful technique to reveal and compare quantitative metabolic profiles of biological tissues. However, chemical and physical sample variations make the analysis of the data challenging, and typically require the application of a number of preprocessing steps prior to data interpretation. For example, noise reduction, normalization, baseline correction, peak picking, spectrum alignment and statistical analysis are indispensable components in any NMR analysis pipeline. We introduce a novel suite of informatics tools for the quantitative analysis of NMR metabolomic profile data. The core of the processing cascade is a novel peak alignment algorithm, called hierarchical Cluster-based Peak Alignment (CluPA). The algorithm aligns a target spectrum to the reference spectrum in a top-down fashion by building a hierarchical cluster tree from peak lists of reference and target spectra and then dividing the spectra into smaller segments based on the most distant clusters of the tree. To reduce the computational time to estimate the spectral misalignment, the method makes use of Fast Fourier Transformation (FFT) cross-correlation. Since the method returns a high-quality alignment, we can propose a simple methodology to study the variability of the NMR spectra. For each aligned NMR data point the ratio of the between-group and within-group sum of squares (BW-ratio) is calculated to quantify the difference in variability between and within predefined groups of NMR spectra. This differential analysis is related to the calculation of the F-statistic or a one-way ANOVA, but without distributional assumptions. Statistical inference based on the BW-ratio is achieved by bootstrapping the null distribution from the experimental data. The workflow performance was evaluated using a previously published dataset. Correlation maps, spectral and grey scale plots show clear improvements in comparison to other methods, and the down
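
    A minimal sketch of the BW-ratio idea described above: for each aligned data point, the between-group sum of squares is divided by the within-group sum of squares, and significance is assessed against a resampled null distribution. A label permutation stands in here for the paper's bootstrap, and the array sizes, group labels, and spiked region are invented for illustration.

```python
# Sketch (assumptions noted above): per-point BW-ratio for aligned NMR spectra,
# with an empirical null built by shuffling the group labels.
import numpy as np

def bw_ratio(spectra, groups):
    """spectra: (n_spectra, n_points); groups: integer label per spectrum."""
    grand_mean = spectra.mean(axis=0)
    between = np.zeros(spectra.shape[1])
    within = np.zeros(spectra.shape[1])
    for g in np.unique(groups):
        block = spectra[groups == g]
        between += block.shape[0] * (block.mean(axis=0) - grand_mean) ** 2
        within += ((block - block.mean(axis=0)) ** 2).sum(axis=0)
    return between / within

rng = np.random.default_rng(1)
spectra = rng.normal(size=(12, 2000))        # 12 aligned spectra, 2000 data points
spectra[:6, 500:520] += 1.5                  # simulated group difference in one region
groups = np.array([0] * 6 + [1] * 6)

observed = bw_ratio(spectra, groups)

# empirical null: recompute the ratio with group labels shuffled
null = np.array([bw_ratio(spectra, rng.permutation(groups)) for _ in range(200)])
pointwise_p = (null >= observed).mean(axis=0)
print("points with p < 0.01:", np.where(pointwise_p < 0.01)[0][:10], "...")
```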

  10. An integrated workflow for robust alignment and simplified quantitative analysis of NMR spectrometry data

    Directory of Open Access Journals (Sweden)

    Dommisse Roger

    2011-10-01

    Full Text Available Abstract Background Nuclear magnetic resonance spectroscopy (NMR) is a powerful technique to reveal and compare quantitative metabolic profiles of biological tissues. However, chemical and physical sample variations make the analysis of the data challenging, and typically require the application of a number of preprocessing steps prior to data interpretation. For example, noise reduction, normalization, baseline correction, peak picking, spectrum alignment and statistical analysis are indispensable components in any NMR analysis pipeline. Results We introduce a novel suite of informatics tools for the quantitative analysis of NMR metabolomic profile data. The core of the processing cascade is a novel peak alignment algorithm, called hierarchical Cluster-based Peak Alignment (CluPA). The algorithm aligns a target spectrum to the reference spectrum in a top-down fashion by building a hierarchical cluster tree from peak lists of reference and target spectra and then dividing the spectra into smaller segments based on the most distant clusters of the tree. To reduce the computational time to estimate the spectral misalignment, the method makes use of Fast Fourier Transformation (FFT) cross-correlation. Since the method returns a high-quality alignment, we can propose a simple methodology to study the variability of the NMR spectra. For each aligned NMR data point the ratio of the between-group and within-group sum of squares (BW-ratio) is calculated to quantify the difference in variability between and within predefined groups of NMR spectra. This differential analysis is related to the calculation of the F-statistic or a one-way ANOVA, but without distributional assumptions. Statistical inference based on the BW-ratio is achieved by bootstrapping the null distribution from the experimental data. Conclusions The workflow performance was evaluated using a previously published dataset. Correlation maps, spectral and grey scale plots show clear

  11. Development of a data independent acquisition mass spectrometry workflow to enable glycopeptide analysis without predefined glycan compositional knowledge.

    Science.gov (United States)

    Lin, Chi-Hung; Krisp, Christoph; Packer, Nicolle H; Molloy, Mark P

    2018-02-10

    Glycoproteomics investigates glycan moieties in a site specific manner to reveal the functional roles of protein glycosylation. Identification of glycopeptides from data-dependent acquisition (DDA) relies on high quality MS/MS spectra of glycopeptide precursors and often requires manual validation to ensure confident assignments. In this study, we investigated pseudo-MRM (MRM-HR) and data-independent acquisition (DIA) as alternative acquisition strategies for glycopeptide analysis. These approaches allow data acquisition over the full MS/MS scan range allowing data re-analysis post-acquisition, without data re-acquisition. The advantage of MRM-HR over DDA for N-glycopeptide detection was demonstrated from targeted analysis of bovine fetuin where all three N-glycosylation sites were detected, which was not the case with DDA. To overcome the duty cycle limitation of MRM-HR acquisition needed for analysis of complex samples such as plasma we trialed DIA. This allowed development of a targeted DIA method to identify N-glycopeptides without pre-defined knowledge of the glycan composition, thus providing the potential to identify N-glycopeptides with unexpected structures. This workflow was demonstrated by detection of 59 N-glycosylation sites from 41 glycoproteins from a HILIC enriched human plasma tryptic digest. 21 glycoforms of IgG1 glycopeptides were identified including two truncated structures that are rarely reported. We developed a data-independent mass spectrometry workflow to identify specific glycopeptides from complex biological mixtures. The novelty is that this approach does not require glycan composition to be pre-defined, thereby allowing glycopeptides carrying unexpected glycans to be identified. This is demonstrated through the analysis of immunoglobulins in human plasma where we detected two IgG1 glycoforms that are rarely observed. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Galaxy-M: a Galaxy workflow for processing and analyzing direct infusion and liquid chromatography mass spectrometry-based metabolomics data.

    Science.gov (United States)

    Davidson, Robert L; Weber, Ralf J M; Liu, Haoyu; Sharma-Oates, Archana; Viant, Mark R

    2016-01-01

    Metabolomics is increasingly recognized as an invaluable tool in the biological, medical and environmental sciences yet lags behind the methodological maturity of other omics fields. To achieve its full potential, including the integration of multiple omics modalities, the accessibility, standardization and reproducibility of computational metabolomics tools must be improved significantly. Here we present our end-to-end mass spectrometry metabolomics workflow in the widely used platform, Galaxy. Named Galaxy-M, our workflow has been developed for both direct infusion mass spectrometry (DIMS) and liquid chromatography mass spectrometry (LC-MS) metabolomics. The range of tools presented spans from processing of raw data, e.g. peak picking and alignment, through data cleansing, e.g. missing value imputation, to preparation for statistical analysis, e.g. normalization and scaling, and principal components analysis (PCA) with associated statistical evaluation. We demonstrate the ease of using these Galaxy workflows via the analysis of DIMS and LC-MS datasets, and provide PCA scores and associated statistics to help other users to ensure that they can accurately repeat the processing and analysis of these two datasets. Galaxy and data are all provided pre-installed in a virtual machine (VM) that can be downloaded from the GigaDB repository. Additionally, source code, executables and installation instructions are available from GitHub. The Galaxy platform has enabled us to produce an easily accessible and reproducible computational metabolomics workflow. More tools could be added by the community to expand its functionality. We recommend that Galaxy-M workflow files are included within the supplementary information of publications, enabling metabolomics studies to achieve greater reproducibility.
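
    A compact sketch of the "data cleansing to PCA" stage of such a workflow, assuming a samples-by-features intensity matrix with missing values. The imputation rule (half of the per-feature minimum), the autoscaling choice, and the simulated matrix are illustrative assumptions, not Galaxy-M defaults.

```python
# Sketch of missing-value imputation, scaling, and PCA on a metabolomics matrix.
# All data are simulated; parameter choices are examples, not tool defaults.
import numpy as np

rng = np.random.default_rng(2)
X = rng.lognormal(sigma=1.0, size=(30, 400))       # 30 samples x 400 features
X[rng.random(X.shape) < 0.05] = np.nan              # 5% missing values

# missing value imputation: half of the per-feature minimum
col_min = np.nanmin(X, axis=0)
X = np.where(np.isnan(X), col_min / 2.0, X)

# log transform, then mean-centre and scale to unit variance (autoscaling)
Xs = np.log(X)
Xs = (Xs - Xs.mean(axis=0)) / Xs.std(axis=0, ddof=1)

# PCA via singular value decomposition
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
scores = U * S                                       # sample scores
explained = S ** 2 / (S ** 2).sum()
print("PC1/PC2 explained variance:", np.round(explained[:2], 3))
print("first sample scores:", np.round(scores[0, :2], 2))
```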

  13. Quantitative, multiplexed workflow for deep analysis of human blood plasma and biomarker discovery by mass spectrometry.

    Science.gov (United States)

    Keshishian, Hasmik; Burgess, Michael W; Specht, Harrison; Wallace, Luke; Clauser, Karl R; Gillette, Michael A; Carr, Steven A

    2017-08-01

    Proteomic characterization of blood plasma is of central importance to clinical proteomics and particularly to biomarker discovery studies. The vast dynamic range and high complexity of the plasma proteome have, however, proven to be serious challenges and have often led to unacceptable tradeoffs between depth of coverage and sample throughput. We present an optimized sample-processing pipeline for analysis of the human plasma proteome that provides greatly increased depth of detection, improved quantitative precision and much higher sample analysis throughput as compared with prior methods. The process includes abundant protein depletion, isobaric labeling at the peptide level for multiplexed relative quantification and ultra-high-performance liquid chromatography coupled to accurate-mass, high-resolution tandem mass spectrometry analysis of peptides fractionated off-line by basic pH reversed-phase (bRP) chromatography. The overall reproducibility of the process, including immunoaffinity depletion, is high, with a process replicate coefficient of variation (CV) of 4,500 proteins are detected and quantified per patient sample on average, with two or more peptides per protein and starting from as little as 200 μl of plasma. The approach can be multiplexed up to 10-plex using tandem mass tags (TMT) reagents, further increasing throughput, albeit with some decrease in the number of proteins quantified. In addition, we provide a rapid protocol for analysis of nonfractionated depleted plasma samples analyzed in 10-plex. This provides ∼600 quantified proteins for each of the ten samples in ∼5 h of instrument time.

  14. Workflow optimisation for multimodal imaging procedures: a case of combined X-ray and MRI-guided TACE.

    Science.gov (United States)

    Fernández-Gutiérrez, Fabiola; Wolska-Krawczyk, Malgorzata; Buecker, Arno; Houston, J Graeme; Melzer, Andreas

    2017-02-01

    This study presents a framework for workflow optimisation of multimodal image-guided procedures (MIGP) based on discrete event simulation (DES). A case of a combined X-Ray and magnetic resonance image-guided transarterial chemoembolisation (TACE) is presented to illustrate the application of this method. We used a ranking and selection optimisation algorithm to measure the performance of a number of proposed alternatives to improve a current scenario. A DES model was implemented with detailed data collected from 59 TACE procedures and durations of magnetic resonance imaging (MRI) diagnostic procedures usually performed in a common MRI suite. Fourteen alternatives were proposed and assessed to minimise the waiting times and improve workflow. Data analysis revealed an average of 20.68 (7.68) min of waiting between angiography and MRI for TACE patients in 71.19% of the cases. Following the optimisation analysis, an alternative was identified to reduce waiting times in the angiography suite by up to 48.74%. The model helped to understand and detect 'bottlenecks' during multimodal TACE procedures, identifying a better alternative to the current workflow and reducing waiting times. Simulation-based workflow analysis provides a cost-effective way to face some of the challenges of introducing MIGP in clinical radiology, highlighted in this study.
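
    A toy discrete event simulation in the spirit of the DES model described above: TACE patients compete with diagnostic scans for a single shared MRI scanner, and their waits between becoming ready and scan start are recorded. All arrival times and durations are invented and do not reproduce the study's data or its ranking-and-selection algorithm.

```python
# Toy single-server discrete-event simulation (assumed parameters, see note above).
import heapq, random

random.seed(3)
mri_free_at = 0.0
waits = []
events = []   # (ready time in minutes, patient type)

# TACE patients become ready for MRI at random times; diagnostic scans share the MRI
for _ in range(30):
    heapq.heappush(events, (random.uniform(0, 480), "tace"))
for _ in range(20):
    heapq.heappush(events, (random.uniform(0, 480), "diagnostic"))

while events:
    t, kind = heapq.heappop(events)       # serve in order of ready time (FIFO)
    start = max(t, mri_free_at)           # wait if the scanner is still busy
    duration = 35 if kind == "tace" else 25
    mri_free_at = start + duration
    if kind == "tace":
        waits.append(start - t)

print(f"mean TACE wait for MRI: {sum(waits)/len(waits):.1f} min "
      f"(max {max(waits):.1f} min) over {len(waits)} patients")
```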

  15. Secretome Analysis of Lipid-Induced Insulin Resistance in Skeletal Muscle Cells by a Combined Experimental and Bioinformatics Workflow

    DEFF Research Database (Denmark)

    Deshmukh, Atul S; Cox, Juergen; Jensen, Lars Juhl

    2015-01-01

    , in principle, allows an unbiased and comprehensive analysis of cellular secretomes; however, the distinction of bona fide secreted proteins from proteins released upon lysis of a small fraction of dying cells remains challenging. Here we applied highly sensitive MS and streamlined bioinformatics to analyze......-resistant conditions. Our study demonstrates an efficient combined experimental and bioinformatics workflow to identify putative secreted proteins from insulin-resistant skeletal muscle cells, which could easily be adapted to other cellular models....

  16. Combining Cloud-based Workflow Management System with SOA and CEP to Create Agility in Collaborative Environment

    Directory of Open Access Journals (Sweden)

    Marian STOICA

    2017-01-01

    Full Text Available In the current economy, technological solutions like cloud computing, service-oriented architecture (SOA) and complex event processing (CEP) are recognized as modern approaches used for increasing business agility and achieving innovation. The complexity of the collaborative business environment increasingly raises the need for performant workflow management systems (WfMS) that meet current requirements. Each approach has advantages, but also faces challenges. In this paper we propose a solution for the integration of cloud computing with WfMS, SOA and CEP that allows these technologies to complement each other and build on their benefits to increase agility and reduce the challenges/problems. The paper presents a short introduction to the subject, followed by an analysis of the combination of cloud computing and WfMS and the benefits of a cloud-based workflow management system. The paper ends with a solution for combining cloud WfMS with SOA and CEP in order to gain business agility and real-time collaboration, followed by conclusions and research directions.

  17. Influence of a combined CT/C-arm system on periprocedural workflow and procedure times in mechanical thrombectomy

    Energy Technology Data Exchange (ETDEWEB)

    Pfaff, Johannes; Herweh, Christian; Pham, Mirko; Heiland, Sabine; Bendszus, Martin; Moehlenbruch, Markus Alfred [University of Heidelberg, Department of Neuroradiology, Heidelberg (Germany); Schoenenberger, Silvia; Nagel, Simon; Ringleb, Peter Arthur [University of Heidelberg, Department of Neurology, Heidelberg (Germany)

    2017-09-15

    To achieve the fastest possible workflow in ischaemic stroke, we developed a CT/C-arm system, which allows imaging and endovascular treatment on the same patient table. This prospective, monocentric trial was conducted between October 2014 and August 2016. Patients received stroke imaging and mechanical thrombectomy under general anaesthesia (GA) or conscious sedation (CS) using our combined setup comprising a CT-scanner and a mobile C-arm X-ray device. Primary endpoint was time between stroke imaging and groin puncture. We compared periprocedural workflow and procedure times with the literature and a matched patient cohort treated with a biplane angiographic system before installation of the CT/C-arm system. In 50 patients with acute ischaemic stroke due to large-vessel occlusion in the anterior circulation, comparable recanalization rates were achieved by using the CT/C-arm setup (TICI2b-3:CT/C-arm-GA: 85.7%; CT/C-arm-CS: 90.9%; Angiosuite: 78.6%; p = 0.269) without increasing periprocedural complications. Elimination of patient transport resulted in a significant reduction of the time between stroke imaging and groin puncture: median, min (IQR): CT/C-arm-GA: 43 (35-52); CT/C-arm-CS: 39 (28-49); Angiosuite: 64 (48-74); p < 0.0001. The combined CT/C-arm system allows comparable recanalization rates as a biplane angiographic system and accelerates the start of the endovascular stroke treatment. (orig.)

  18. Influence of a combined CT/C-arm system on periprocedural workflow and procedure times in mechanical thrombectomy

    International Nuclear Information System (INIS)

    Pfaff, Johannes; Herweh, Christian; Pham, Mirko; Heiland, Sabine; Bendszus, Martin; Moehlenbruch, Markus Alfred; Schoenenberger, Silvia; Nagel, Simon; Ringleb, Peter Arthur

    2017-01-01

    To achieve the fastest possible workflow in ischaemic stroke, we developed a CT/C-arm system, which allows imaging and endovascular treatment on the same patient table. This prospective, monocentric trial was conducted between October 2014 and August 2016. Patients received stroke imaging and mechanical thrombectomy under general anaesthesia (GA) or conscious sedation (CS) using our combined setup comprising a CT-scanner and a mobile C-arm X-ray device. Primary endpoint was time between stroke imaging and groin puncture. We compared periprocedural workflow and procedure times with the literature and a matched patient cohort treated with a biplane angiographic system before installation of the CT/C-arm system. In 50 patients with acute ischaemic stroke due to large-vessel occlusion in the anterior circulation, comparable recanalization rates were achieved by using the CT/C-arm setup (TICI2b-3:CT/C-arm-GA: 85.7%; CT/C-arm-CS: 90.9%; Angiosuite: 78.6%; p = 0.269) without increasing periprocedural complications. Elimination of patient transport resulted in a significant reduction of the time between stroke imaging and groin puncture: median, min (IQR): CT/C-arm-GA: 43 (35-52); CT/C-arm-CS: 39 (28-49); Angiosuite: 64 (48-74); p < 0.0001. The combined CT/C-arm system allows comparable recanalization rates as a biplane angiographic system and accelerates the start of the endovascular stroke treatment. (orig.)

  19. Influence of a combined CT/C-arm system on periprocedural workflow and procedure times in mechanical thrombectomy.

    Science.gov (United States)

    Pfaff, Johannes; Schönenberger, Silvia; Herweh, Christian; Pham, Mirko; Nagel, Simon; Ringleb, Peter Arthur; Heiland, Sabine; Bendszus, Martin; Möhlenbruch, Markus Alfred

    2017-09-01

    To achieve the fastest possible workflow in ischaemic stroke, we developed a CT/C-arm system, which allows imaging and endovascular treatment on the same patient table. This prospective, monocentric trial was conducted between October 2014 and August 2016. Patients received stroke imaging and mechanical thrombectomy under general anaesthesia (GA) or conscious sedation (CS) using our combined setup comprising a CT-scanner and a mobile C-arm X-ray device. Primary endpoint was time between stroke imaging and groin puncture. We compared periprocedural workflow and procedure times with the literature and a matched patient cohort treated with a biplane angiographic system before installation of the CT/C-arm system. In 50 patients with acute ischaemic stroke due to large-vessel occlusion in the anterior circulation, comparable recanalization rates were achieved by using the CT/C-arm setup (TICI2b-3:CT/C-arm-GA: 85.7%; CT/C-arm-CS: 90.9%; Angiosuite: 78.6%; p = 0.269) without increasing periprocedural complications. Elimination of patient transport resulted in a significant reduction of the time between stroke imaging and groin puncture: median, min (IQR): CT/C-arm-GA: 43 (35-52); CT/C-arm-CS: 39 (28-49); Angiosuite: 64 (48-74); p < 0.0001. The combined CT/C-arm system allows comparable recanalization rates as a biplane angiographic system and accelerates the start of the endovascular stroke treatment. • The CT/C-arm setup reduces median time from stroke imaging to groin puncture. • Mechanical thrombectomy using a C-arm device is feasible without increasing peri-interventional complications. • The CT/C-arm setup might be a valuable fallback solution for emergency procedures. • The CT/C-arm setup allows immediate control CT images during and after treatment.

  20. Anatomy and evolution of database search engines-a central component of mass spectrometry based proteomic workflows.

    Science.gov (United States)

    Verheggen, Kenneth; Raeder, Helge; Berven, Frode S; Martens, Lennart; Barsnes, Harald; Vaudel, Marc

    2017-09-13

    Sequence database search engines are bioinformatics algorithms that identify peptides from tandem mass spectra using a reference protein sequence database. Two decades of development, notably driven by advances in mass spectrometry, have provided scientists with more than 30 published search engines, each with its own properties. In this review, we present the common paradigm behind the different implementations, and its limitations for modern mass spectrometry datasets. We also detail how the search engines attempt to alleviate these limitations, and provide an overview of the different software frameworks available to the researcher. Finally, we highlight alternative approaches for the identification of proteomic mass spectrometry datasets, either as a replacement for, or as a complement to, sequence database search engines. © 2017 Wiley Periodicals, Inc.
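
    The shared paradigm mentioned above can be made concrete with a small sketch: theoretical b- and y-ion m/z values are generated for a candidate peptide and matched against an observed peak list within a tolerance. The scoring is deliberately naive and the peak list is made up; real search engines use far more elaborate scoring models.

```python
# Minimal sketch of the database search paradigm: theoretical b/y ions for a
# candidate peptide matched against an observed spectrum. Peak list is invented.
# Monoisotopic residue masses (Da); proton and water masses are standard values.
RESIDUE = {"G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
           "V": 99.06841, "T": 101.04768, "L": 113.08406, "N": 114.04293,
           "D": 115.02694, "Q": 128.05858, "K": 128.09496, "E": 129.04259,
           "M": 131.04049, "H": 137.05891, "F": 147.06841, "R": 156.10111,
           "Y": 163.06333, "W": 186.07931, "C": 103.00919, "I": 113.08406}
PROTON, WATER = 1.007276, 18.010565

def by_ions(peptide):
    """Singly charged b- and y-ion m/z values for a peptide sequence."""
    masses = [RESIDUE[aa] for aa in peptide]
    b = [sum(masses[:i]) + PROTON for i in range(1, len(masses))]
    y = [sum(masses[i:]) + WATER + PROTON for i in range(1, len(masses))]
    return b + y

def naive_score(peptide, observed_mz, tol=0.02):
    """Number of theoretical fragments matched within +/- tol (Da)."""
    theo = by_ions(peptide)
    return sum(any(abs(t - o) <= tol for o in observed_mz) for t in theo)

spectrum = [175.119, 262.151, 333.188, 404.225, 475.262]   # made-up peak list
for candidate in ("PEPTIDE", "SAMPLER"):
    print(candidate, naive_score(candidate, spectrum))
```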

  1. A rapid diagnostic workflow for cefotaxime-resistant Escherichia coli and Klebsiella pneumoniae detection from blood cultures by MALDI-TOF mass spectrometry.

    Directory of Open Access Journals (Sweden)

    Elena De Carolis

    Full Text Available Nowadays, the global spread of resistance to oxyimino-cephalosporins in Enterobacteriaceae implies the need for novel diagnostics that can rapidly target resistant organisms from these bacterial species. In this study, we developed and evaluated a Direct Mass Spectrometry assay for Beta-Lactamase (D-MSBL) that allows direct identification of (oxyimino)cephalosporin-resistant Escherichia coli or Klebsiella pneumoniae from positive blood cultures (BCs), by using the matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) technology. The D-MSBL assay was performed on 93 E. coli or K. pneumoniae growing BC samples that were shortly co-incubated with cefotaxime (CTX) as the indicator cephalosporin. Susceptibility and resistance defining peaks from the samples' mass spectra were analyzed by a novel algorithm for bacterial organism classification. The D-MSBL assay allowed discrimination between E. coli and K. pneumoniae that were resistant or susceptible to CTX with a sensitivity of 86.8% and a specificity of 98.2%. The proposed algorithm-based D-MSBL assay, if integrated in the routine laboratory diagnostic workflow, may be useful to enhance the establishment of appropriate antibiotic therapy and to control the threat of oxyimino-cephalosporin resistance in hospital.

  2. A rapid diagnostic workflow for cefotaxime-resistant Escherichia coli and Klebsiella pneumoniae detection from blood cultures by MALDI-TOF mass spectrometry.

    Science.gov (United States)

    De Carolis, Elena; Paoletti, Silvia; Nagel, Domenico; Vella, Antonietta; Mello, Enrica; Palucci, Ivana; De Angelis, Giulia; D'Inzeo, Tiziana; Sanguinetti, Maurizio; Posteraro, Brunella; Spanu, Teresa

    2017-01-01

    Nowadays, the global spread of resistance to oxyimino-cephalosporins in Enterobacteriaceae implies the need for novel diagnostics that can rapidly target resistant organisms from these bacterial species. In this study, we developed and evaluated a Direct Mass Spectrometry assay for Beta-Lactamase (D-MSBL) that allows direct identification of (oxyimino)cephalosporin-resistant Escherichia coli or Klebsiella pneumoniae from positive blood cultures (BCs), by using the matrix-assisted laser desorption ionization-time of flight mass spectrometry (MALDI-TOF MS) technology. The D-MSBL assay was performed on 93 E. coli or K. pneumoniae growing BC samples that were shortly co-incubated with cefotaxime (CTX) as the indicator cephalosporin. Susceptibility and resistance defining peaks from the samples' mass spectra were analyzed by a novel algorithm for bacterial organism classification. The D-MSBL assay allowed discrimination between E. coli and K. pneumoniae that were resistant or susceptible to CTX with a sensitivity of 86.8% and a specificity of 98.2%. The proposed algorithm-based D-MSBL assay, if integrated in the routine laboratory diagnostic workflow, may be useful to enhance the establishment of appropriate antibiotic therapy and to control the threat of oxyimino-cephalosporin resistance in hospital.

  3. LipidMatch: an automated workflow for rule-based lipid identification using untargeted high-resolution tandem mass spectrometry data.

    Science.gov (United States)

    Koelmel, Jeremy P; Kroeger, Nicholas M; Ulmer, Candice Z; Bowden, John A; Patterson, Rainey E; Cochran, Jason A; Beecher, Christopher W W; Garrett, Timothy J; Yost, Richard A

    2017-07-10

    Lipids are ubiquitous and serve numerous biological functions; thus lipids have been shown to have great potential as candidates for elucidating biomarkers and pathway perturbations associated with disease. Methods expanding coverage of the lipidome increase the likelihood of biomarker discovery and could lead to more comprehensive understanding of disease etiology. We introduce LipidMatch, an R-based tool for lipid identification for liquid chromatography tandem mass spectrometry workflows. LipidMatch currently has over 250,000 lipid species spanning 56 lipid types contained in in silico fragmentation libraries. Unique fragmentation libraries, compared to other open source software, include oxidized lipids, bile acids, sphingosines, and previously uncharacterized adducts, including ammoniated cardiolipins. LipidMatch uses rule-based identification. For each lipid type, the user can select which fragments must be observed for identification. Rule-based identification allows for correct annotation of lipids based on the fragments observed, unlike typical identification based solely on spectral similarity scores, where over-reporting structural details that are not conferred by fragmentation data is common. Another unique feature of LipidMatch is ranking lipid identifications for a given feature by the sum of fragment intensities. For each lipid candidate, the intensities of experimental fragments with exact mass matches to expected in silico fragments are summed. The lipid identifications with the greatest summed intensity using this ranking algorithm were comparable to other lipid identification software annotations, MS-DIAL and Greazy. For example, for features with identifications from all 3 software, 92% of LipidMatch identifications by fatty acyl constituents were corroborated by at least one other software in positive mode and 98% in negative ion mode. LipidMatch allows users to annotate lipids across a wide range of high resolution tandem mass spectrometry
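
    A hedged sketch of the two ideas highlighted above: rule-based identification (all required fragments must be observed) and ranking of candidates by summed matched fragment intensity. The fragment m/z values, rules, and candidate names are placeholders, not entries from the LipidMatch in silico libraries.

```python
# Sketch of rule-based lipid annotation and intensity-sum ranking.
# Fragment masses, rules, and candidate names below are illustrative only.
TOL = 0.01   # m/z matching tolerance (Da), assumed

def matched_intensity(required, optional, spectrum, tol=TOL):
    """Return summed intensity of matched fragments, or None if a rule fails."""
    def find(mz):
        hits = [inten for peak_mz, inten in spectrum if abs(peak_mz - mz) <= tol]
        return max(hits) if hits else None

    total = 0.0
    for mz in required:                 # every required fragment must be observed
        hit = find(mz)
        if hit is None:
            return None
        total += hit
    for mz in optional:                 # optional fragments only add to the score
        hit = find(mz)
        if hit is not None:
            total += hit
    return total

spectrum = [(184.073, 8.0e4), (506.361, 2.1e5), (478.330, 9.5e4)]  # (m/z, intensity)
candidates = {
    "lipid candidate A": {"required": [184.073], "optional": [506.361, 478.330]},
    "lipid candidate B": {"required": [184.073, 264.268], "optional": []},
}

scored = {name: matched_intensity(c["required"], c["optional"], spectrum)
          for name, c in candidates.items()}
ranked = sorted((score, name) for name, score in scored.items() if score is not None)
print("passing identifications, best last:", ranked)
```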

  4. Analysis of wastewater samples by direct combination of thin-film microextraction and desorption electrospray ionization mass spectrometry.

    Science.gov (United States)

    Strittmatter, Nicole; Düring, Rolf-Alexander; Takáts, Zoltán

    2012-09-07

    An analysis method for aqueous samples by the direct combination of C18/SCX mixed mode thin-film microextraction (TFME) and desorption electrospray ionization mass spectrometry (DESI-MS) was developed. Both techniques make the analytical workflow simpler and faster; hence, their combination enables a considerably shorter analysis time compared to the traditional liquid chromatography mass spectrometry (LC-MS) approach. The method was characterized using carbamazepine and triclosan as typical examples of pharmaceuticals and personal care product (PPCP) components, which draw increasing attention as wastewater-derived environmental contaminants. Both model compounds were successfully detected in real wastewater samples and their concentrations determined using external calibration with isotope labeled standards. Effects of temperature, agitation, sample volume, and exposure time were investigated in the case of spiked aqueous samples. Results were compared to those of parallel HPLC-MS determinations and good agreement was found across a concentration range spanning three orders of magnitude. Serious matrix effects were observed in treated wastewater, but lower limits of detection were still found to be in the low ng L⁻¹ range. Using an Orbitrap mass spectrometer, the technique was found to be ideal for screening purposes and led to the detection of various different PPCP components in wastewater treatment plant effluents, including beta-blockers, nonsteroidal anti-inflammatory drugs, and UV filters.
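
    The quantitation strategy mentioned above (external calibration with isotope-labelled standards) can be sketched as a linear fit of the analyte/labelled-standard area ratio against calibrator concentration, then read back for unknowns. All areas and concentrations below are invented for illustration.

```python
# Sketch of external calibration with an isotope-labelled internal standard.
# Calibrator concentrations and peak areas are assumed example values.
import numpy as np

calib_conc = np.array([10, 50, 100, 500, 1000])        # ng/L spiked calibrators
analyte_area = np.array([1.9e3, 9.7e3, 2.0e4, 1.0e5, 2.1e5])
label_area = np.full(5, 5.0e4)                          # isotope-labelled standard

ratio = analyte_area / label_area
slope, intercept = np.polyfit(calib_conc, ratio, 1)     # linear calibration curve

def quantify(sample_analyte_area, sample_label_area):
    """Back-calculate concentration (ng/L) from the analyte/standard area ratio."""
    r = sample_analyte_area / sample_label_area
    return (r - intercept) / slope

print(f"wastewater sample: {quantify(3.4e4, 5.2e4):.0f} ng/L")
```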

  5. Customized Consensus Spectral Library Building for Untargeted Quantitative Metabolomics Analysis with Data Independent Acquisition Mass Spectrometry and MetaboDIA Workflow.

    Science.gov (United States)

    Chen, Gengbo; Walmsley, Scott; Cheung, Gemmy C M; Chen, Liyan; Cheng, Ching-Yu; Beuerman, Roger W; Wong, Tien Yin; Zhou, Lei; Choi, Hyungwon

    2017-05-02

    Data independent acquisition-mass spectrometry (DIA-MS) coupled with liquid chromatography is a promising approach for rapid, automatic sampling of MS/MS data in untargeted metabolomics. However, wide isolation windows in DIA-MS generate MS/MS spectra containing a mixed population of fragment ions together with their precursor ions. This precursor-fragment ion map in a comprehensive MS/MS spectral library is crucial for relative quantification of fragment ions uniquely representative of each precursor ion. However, existing reference libraries are not sufficient for this purpose since the fragmentation patterns of small molecules can vary in different instrument setups. Here we developed a bioinformatics workflow called MetaboDIA to build customized MS/MS spectral libraries using a user's own data dependent acquisition (DDA) data and to perform MS/MS-based quantification with DIA data, thus complementing conventional MS1-based quantification. MetaboDIA also allows users to build a spectral library directly from DIA data in studies of a large sample size. Using a marine algae data set, we show that quantification of fragment ions extracted with a customized MS/MS library can provide as reliable quantitative data as the direct quantification of precursor ions based on MS1 data. To test its applicability in complex samples, we applied MetaboDIA to a clinical serum metabolomics data set, where we built a DDA-based spectral library containing consensus spectra for 1829 compounds. We performed fragment ion quantification using DIA data using this library, yielding sensitive differential expression analysis.
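
    A minimal sketch of consensus MS/MS spectrum building in the spirit described above: fragments from replicate DDA spectra of the same precursor are grouped within an m/z tolerance and retained only if they recur in at least half of the spectra. The tolerance, recurrence cutoff, and spectra are assumptions, not MetaboDIA's actual algorithm or parameters.

```python
# Sketch of consensus spectrum construction from replicate DDA spectra
# of one precursor (assumed tolerance and recurrence cutoff).
def consensus(spectra, tol=0.01, min_fraction=0.5):
    """spectra: list of [(mz, intensity), ...]; returns a consensus peak list."""
    pool = sorted((mz, inten, i) for i, s in enumerate(spectra) for mz, inten in s)
    groups, current = [], [pool[0]]
    for entry in pool[1:]:
        if entry[0] - current[-1][0] <= tol:   # same fragment within tolerance
            current.append(entry)
        else:
            groups.append(current)
            current = [entry]
    groups.append(current)

    result = []
    for group in groups:
        # keep fragments observed in enough distinct replicate spectra
        if len({i for _, _, i in group}) / len(spectra) >= min_fraction:
            mzs = [mz for mz, _, _ in group]
            intens = [inten for _, inten, _ in group]
            result.append((sum(mzs) / len(mzs), sum(intens) / len(intens)))
    return result

replicates = [
    [(85.028, 1200.0), (127.039, 800.0), (171.065, 300.0)],
    [(85.029, 1100.0), (127.040, 900.0)],
    [(85.028, 1300.0), (127.038, 750.0), (199.060, 150.0)],
]
print(consensus(replicates))   # recurring fragments survive; singletons are dropped
```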

  6. Middle-down hybrid chromatography/tandem mass spectrometry workflow for characterization of combinatorial post-translational modifications in histones.

    Science.gov (United States)

    Sidoli, Simone; Schwämmle, Veit; Ruminowicz, Chrystian; Hansen, Thomas A; Wu, Xudong; Helin, Kristian; Jensen, Ole N

    2014-10-01

    We present an integrated middle-down proteomics platform for sensitive mapping and quantification of coexisting PTMs in large polypeptides (5-7 kDa). We combined an RP trap column with subsequent weak cation exchange-hydrophilic interaction LC interfaced directly to high mass accuracy ESI MS/MS using electron transfer dissociation. This enabled automated and efficient separation and sequencing of hypermodified histone N-terminal tails for unambiguous localization of combinatorial PTMs. We present Histone Coder and IsoScale software to extract, filter, and analyze MS/MS data, including quantification of cofragmenting isobaric polypeptide species. We characterized histone tails derived from murine embryonic stem cells with knockout of suppressor of zeste 12 (Suz12(-/-)) and quantified 256 combinatorial histone marks in histones H3, H4, and H2A. Furthermore, a total of 713 different combinatorial histone marks were identified in purified histone H3. We measured a seven-fold reduction of H3K27me2/me3 (where me2 and me3 are dimethylation and trimethylation, respectively) in Suz12(-/-) cells and detected significant changes in the relative abundance of 16 other single PTMs of histone H3 and other combinatorial marks. We conclude that the inactivation of Suz12 is associated with changes in the abundance of not only H3K27 methylation but also multiple other PTMs in histone H3 tails. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Online Ozonolysis Combined with Ion Mobility-Mass Spectrometry Provides a New Platform for Lipid Isomer Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Poad, Berwyck L.; Zheng, Xueyun; Mitchell, Todd A.; Smith, Richard D.; Baker, Erin M.; Blanksby, Stephen J.

    2017-12-21

    One of the most significant challenges in contemporary lipidomics lies in the separation and identification of lipid isomers that differ only in site(s) of unsaturation or geometric configuration of the carbon-carbon double bonds. While analytical separation techniques including ion mobility spectrometry (IMS) and liquid chromatography (LC) can separate isomeric lipids under appropriate conditions, conventional tandem mass spectrometry cannot provide unequivocal identification. To address this challenge, we have implemented ozone-induced dissociation (OzID) in-line with LC, IMS and high resolution mass spectrometry. Modification of an IMS-capable quadrupole time-of-flight mass spectrometer was undertaken to allow the introduction of ozone into the high-pressure trapping ion funnel region preceding the IMS cell. This enabled the novel LC-OzID-IMS-MS configuration where ozonolysis of ionized lipids occurred rapidly (10 ms) without prior mass-selection. LC-elution time alignment combined with accurate mass and arrival time extraction of ozonolysis products facilitated correlation of precursor and product ions without mass-selection (and associated reductions in duty cycle). Unsaturated lipids across 11 classes were examined using this workflow in both positive and negative ion modalities, and in all cases the positions of carbon-carbon double bonds were unequivocally assigned based on predictable OzID transitions. Under these conditions geometric isomers exhibited different IMS arrival time distributions and distinct OzID product ion ratios providing a means for discrimination of cis/trans double bonds in complex lipids. The combination of OzID with multidimensional separations shows significant promise for facile profiling of unsaturation patterns within complex lipidomes.

  8. The combined measurement of uranium by alpha spectrometry and secondary ion mass spectrometry (SIMS)

    International Nuclear Information System (INIS)

    Harvan, D.

    2009-01-01

    The aim of this thesis was to establish the relationship between a radiometric method, alpha spectrometry, and a surface-sensitive method, secondary ion mass spectrometry (SIMS). Uranium, specifically its naturally occurring isotopes, was studied. Samples (highly polished stainless steel discs) bearing uranium isotopes were prepared by electrodeposition. After electrodeposition and treatment, the samples were measured by alpha spectrometry, which yields surface activities. The masses, as well as the surface densities, of the uranium isotopes were calculated from their activities. After alpha spectrometry, the samples were analyzed with a TOF-SIMS IV instrument at the International Laser Centre in Bratislava. The SIMS analysis provided intensities of uranium-238. The relationship between SIMS intensities and the surface activity, or surface density, of the uranium isotopes indicates that SIMS can be used for quantitative analysis of surface contamination by uranium isotopes, especially 238U. (author)
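
    The relationship underlying the comparison above is the standard activity-mass conversion for a single radionuclide, A = λN with λ = ln 2 / T½, so a measured alpha activity can be converted into a deposited mass of 238U. The activity value in the example below is assumed.

```python
# Worked example: converting a measured alpha activity of U-238 into a mass.
# The 0.5 Bq activity is an assumed illustration, not a value from the thesis.
import math

AVOGADRO = 6.022e23                           # atoms per mole
T_HALF_U238 = 4.468e9 * 365.25 * 24 * 3600    # half-life of U-238 in seconds
MOLAR_MASS_U238 = 238.05                      # g/mol

def mass_from_activity(activity_bq, t_half_s, molar_mass):
    """Mass (g) of a pure radionuclide producing the measured activity (Bq)."""
    decay_const = math.log(2) / t_half_s      # lambda = ln 2 / T_1/2
    atoms = activity_bq / decay_const         # N = A / lambda
    return atoms * molar_mass / AVOGADRO

activity = 0.5                                # Bq measured on the disc (assumed)
m = mass_from_activity(activity, T_HALF_U238, MOLAR_MASS_U238)
print(f"{activity} Bq of U-238 corresponds to {m * 1e6:.1f} micrograms")
```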

  9. Scientific Workflow Management in Proteomics

    Science.gov (United States)

    de Bruin, Jeroen S.; Deelder, André M.; Palmblad, Magnus

    2012-01-01

    Data processing in proteomics can be a challenging endeavor, requiring extensive knowledge of many different software packages, all with different algorithms, data format requirements, and user interfaces. In this article we describe the integration of a number of existing programs and tools in Taverna Workbench, a scientific workflow manager currently being developed in the bioinformatics community. We demonstrate how a workflow manager provides a single, visually clear and intuitive interface to complex data analysis tasks in proteomics, from raw mass spectrometry data to protein identifications and beyond. PMID:22411703

  10. Large-Scale Compute-Intensive Analysis via a Combined In-situ and Co-scheduling Workflow Approach

    Energy Technology Data Exchange (ETDEWEB)

    Messer, Bronson [ORNL; Sewell, Christopher [Los Alamos National Laboratory (LANL); Heitmann, Katrin [ORNL; Finkel, Dr. Hal J [Argonne National Laboratory (ANL); Fasel, Patricia [Los Alamos National Laboratory (LANL); Zagaris, George [Lawrence Livermore National Laboratory (LLNL); Pope, Adrian [Los Alamos National Laboratory (LANL); Habib, Salman [ORNL; Parete-Koon, Suzanne T [ORNL

    2015-01-01

    Large-scale simulations can produce tens of terabytes of data per analysis cycle, complicating and limiting the efficiency of workflows. Traditionally, outputs are stored on the file system and analyzed in post-processing. With the rapidly increasing size and complexity of simulations, this approach faces an uncertain future. Trending techniques consist of performing the analysis in situ, utilizing the same resources as the simulation, and/or off-loading subsets of the data to a compute-intensive analysis system. We introduce an analysis framework developed for HACC, a cosmological N-body code, that uses both in situ and co-scheduling approaches for handling Petabyte-size outputs. An initial in situ step is used to reduce the amount of data to be analyzed, and to separate out the data-intensive tasks handled off-line. The analysis routines are implemented using the PISTON/VTK-m framework, allowing a single implementation of an algorithm that simultaneously targets a variety of GPU, multi-core, and many-core architectures.

  11. An introduction to the technique of combined ion mobility spectrometry-mass spectrometry for the analysis of complex biological samples

    International Nuclear Information System (INIS)

    McDowall, Mark A.; Bateman, Robert H.; Bajic, Steve; Giles, Kevin; Langridge, Jim; McKenna, Therese; Pringle, Steven D.; Wildgoose, Jason L.

    2008-01-01

    Full Text: Ultra Performance Liquid Chromatography (UPLC) offers several advantages compared with conventional High Performance Liquid Chromatography (HPLC) as an 'inlet system' for mass spectrometry. UPLC provides improved chromatographic resolution, increased sensitivity and reduced analysis time. This is achieved through the use of sub-2 μm particles (stationary phase) combined with high-pressure solvent delivery (up to 15,000 psi). When coupled with orthogonal acceleration time-of-flight (oa-TOF) mass spectrometry (MS), UPLC presents a means to achieve high sample throughput with reduced spectral overlap, increased sensitivity, and exact mass measurement capabilities with high mass spectral resolution (ca. 20,000 FWHM). Dispersive ion mobility spectrometry (IMS) implemented within a traveling-wave ion guide provides an orthogonal separation strategy for ions in the gas phase that can resolve isobaric ions formed by either electrospray or MALDI ionization, typically in ca. 20 milliseconds. All three techniques have the potential to be combined on-line (e.g. UPLC-IMS-MS/MS) in real time to maximize peak capacity and resolving power for the analysis of complex biological mixtures including intact proteins, modified peptides and endogenous/exogenous metabolites.

  12. Qualitative and quantitative characterization of plasma proteins when incorporating traveling wave ion mobility into a liquid chromatography-mass spectrometry workflow for biomarker discovery: use of product ion quantitation as an alternative data analysis tool for label free quantitation.

    Science.gov (United States)

    Daly, Charlotte E; Ng, Leong L; Hakimi, Amirmansoor; Willingale, Richard; Jones, Donald J L

    2014-02-18

    Discovery of protein biomarkers in clinical samples necessitates significant prefractionation prior to liquid chromatography-mass spectrometry (LC-MS) analysis. Integrating traveling wave ion mobility spectrometry (TWIMS) enables in-line gas phase separation which when coupled with nanoflow liquid chromatography and data independent acquisition tandem mass spectrometry, confers significant advantages to the discovery of protein biomarkers by improving separation and inherent sensitivity. Incorporation of TWIMS leads to a packet of concentrated ions which ultimately provides a significant improvement in sensitivity. As a consequence of ion packeting, when present at high concentrations, accurate quantitation of proteins can be affected due to detector saturation effects. Human plasma was analyzed in triplicate using liquid-chromatography data independent acquisition mass spectrometry (LC-DIA-MS) and using liquid-chromatography ion-mobility data independent acquisition mass spectrometry (LC-IM-DIA-MS). The inclusion of TWIMS was assessed for the effect on sample throughput, data integrity, confidence of protein and peptide identification, and dynamic range. The number of identified proteins is significantly increased by an average of 84% while both the precursor and product mass accuracies are maintained between the modalities. Sample dynamic range is also maintained while quantitation is achieved for all but the most abundant proteins by incorporating a novel data interpretation method that allows accurate quantitation to occur. This additional separation is all achieved within a workflow with no discernible deleterious effect on throughput. Consequently, TWIMS greatly enhances proteome coverage and can be reliably used for quantification when using an alternative product ion quantification strategy. Using TWIMS in biomarker discovery in human plasma is thus recommended.

  13. Combination of Flow Injection and Electrothermal Atomic Absorption Spectrometry

    DEFF Research Database (Denmark)

    Hansen, Elo Harald; Nielsen, Steffen

    1999-01-01

    The paper discusses the advantages gained by exploiting this combination, FI-ETAAS. Emphasis is placed on illustrating various avenues to perform on-line preconcentration of metal ions in order to obtain very low limits of detection of the measurand, and ways and means to enhance the selectivity...

  14. ERROR HANDLING IN INTEGRATION WORKFLOWS

    Directory of Open Access Journals (Sweden)

    Alexey M. Nazarenko

    2017-01-01

    Full Text Available Simulation experiments performed while solving multidisciplinary engineering and scientific problems require joint usage of multiple software tools. Further, when following a preset plan of experiment or searching for optimum solutions, the same sequence of calculations is run multiple times with various simulation parameters, input data, or conditions, while the overall workflow does not change. Automation of simulations like these requires implementing a workflow where tool execution and data exchange are usually controlled by a special type of software, an integration environment or platform. The result is an integration workflow (a platform-dependent implementation of some computing workflow) which, in the context of automation, is a composition of weakly coupled (in terms of communication intensity) typical subtasks. These compositions can then be decomposed back into a few workflow patterns (types of subtask interaction). The patterns, in their turn, can be interpreted as higher level subtasks. This paper considers the execution control and data exchange rules that should be imposed by the integration environment in the case of an error encountered by some integrated software tool. An error is defined as any abnormal behavior of a tool that invalidates its result data, thus disrupting the data flow within the integration workflow. The main requirement for the error handling mechanism implemented by the integration environment is to prevent abnormal termination of the entire workflow in case of missing intermediate results data. Error handling rules are formulated on the basic pattern level and on the level of a composite task that can combine several basic patterns as next level subtasks. The cases where workflow behavior may differ, depending on the user's purposes, when an error takes place, and the possible error handling options that can be specified by the user, are also noted in the work.
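
    A minimal sketch of the per-subtask error handling idea described above: each subtask carries an error policy (abort, retry or skip), and the execution loop applies it so that a single tool failure does not silently corrupt the data flow. The task names, policies and API below are illustrative, not the formalism of the article.

```python
# Minimal sketch of per-subtask error handling in an integration workflow.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Task:
    name: str
    run: Callable[[dict], dict]    # consumes and produces workflow data
    on_error: str = "abort"        # "abort" | "retry" | "skip"
    max_retries: int = 2

def execute(tasks, data):
    for task in tasks:
        attempts = 0
        while True:
            try:
                data = task.run(data)
                break
            except Exception as exc:
                attempts += 1
                if task.on_error == "retry" and attempts <= task.max_retries:
                    continue                       # re-run the failing tool
                if task.on_error == "skip":
                    print(f"{task.name}: skipped after error: {exc}")
                    break                          # downstream tasks must tolerate missing data
                raise RuntimeError(f"workflow aborted in {task.name}") from exc
    return data

# usage (hypothetical tools): execute([Task("mesh", lambda d: d), Task("solve", lambda d: d, on_error="retry")], {"n": 10})
```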

  15. Absolute quantitation of proteins by acid hydrolysis combined with amino acid detection by mass spectrometry

    DEFF Research Database (Denmark)

    Mirgorodskaya, Olga A; Körner, Roman; Kozmin, Yuri P

    2012-01-01

    Amino acid analysis is among the most accurate methods for absolute quantification of proteins and peptides. Here, we combine acid hydrolysis with the addition of isotopically labeled standard amino acids and analysis by mass spectrometry for accurate and sensitive protein quantitation...
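
    The quantitation principle reduces to simple arithmetic: the ratio of the endogenous to the isotopically labeled amino acid signal, multiplied by the known spiked amount, gives the amount of free amino acid, which is then divided by the number of residues in the protein. The sketch below uses hypothetical peak areas and residue counts, not data from the paper.

```python
# Minimal sketch of absolute quantitation from acid hydrolysis with isotopically
# labeled amino acid standards. All values are hypothetical.
spiked_standard_pmol = {"Leu": 500.0, "Phe": 500.0}   # labeled standard added before analysis
peak_area_light = {"Leu": 8.4e6, "Phe": 3.1e6}        # endogenous amino acid from the hydrolysate
peak_area_heavy = {"Leu": 4.2e6, "Phe": 3.0e6}        # labeled internal standard

residues_per_protein = {"Leu": 21, "Phe": 8}          # from the protein sequence

for aa in spiked_standard_pmol:
    amount_light = peak_area_light[aa] / peak_area_heavy[aa] * spiked_standard_pmol[aa]
    protein_pmol = amount_light / residues_per_protein[aa]
    print(f"{aa}: {amount_light:.0f} pmol free amino acid -> {protein_pmol:.0f} pmol protein")
# Averaging the protein estimate over several amino acids improves accuracy.
```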

  16. Conformational analysis of large and highly disulfide-stabilized proteins by integrating online electrochemical reduction into an optimized H/D exchange mass spectrometry workflow

    DEFF Research Database (Denmark)

    Trabjerg, Esben; Jakobsen, Rasmus Uffe; Mysling, Simon

    2015-01-01

    Analysis of disulfide-bonded proteins by HDX-MS requires effective and rapid reduction of disulfide bonds before enzymatic digestion in order to increase sequence coverage. In a conventional HDX-MS workflow, disulfide bonds are reduced chemically by addition of a reducing agent to the quench......-antibody, respectively. The presented results demonstrate the successful electrochemical reduction during HDX-MS analysis of both a small, exceptionally tightly disulfide-bonded protein (NGF) as well as the largest protein attempted to date (IgG1-antibody). We envision that online electrochemical reduction...... the electrochemical reduction efficiency during HDX-MS analysis of two particularly challenging disulfide stabilized proteins: a therapeutic IgG1-antibody and Nerve Growth Factor-β (NGF). Several different parameters (flow rate, applied square wave potential as well as the type of labeling- and quench buffer) were...

  17. A comprehensive high-resolution mass spectrometry approach for characterization of metabolites by combination of ambient ionization, chromatography and imaging methods.

    Science.gov (United States)

    Berisha, Arton; Dold, Sebastian; Guenther, Sabine; Desbenoit, Nicolas; Takats, Zoltan; Spengler, Bernhard; Römpp, Andreas

    2014-08-30

    An ideal method for bioanalytical applications would deliver spatially resolved quantitative information in real time and without sample preparation. In reality these requirements can typically not be met by a single analytical technique. Therefore, we combine different mass spectrometry approaches: chromatographic separation, ambient ionization and imaging techniques, in order to obtain comprehensive information about metabolites in complex biological samples. Samples were analyzed by laser desorption followed by electrospray ionization (LD-ESI) as an ambient ionization technique, by matrix-assisted laser desorption/ionization (MALDI) mass spectrometry imaging for spatial distribution analysis and by high-performance liquid chromatography/electrospray ionization mass spectrometry (HPLC/ESI-MS) for quantitation and validation of compound identification. All MS data were acquired with high mass resolution and accurate mass (using orbital trapping and ion cyclotron resonance mass spectrometers). Grape berries were analyzed and evaluated in detail, whereas wheat seeds and mouse brain tissue were analyzed in proof-of-concept experiments. In situ measurements by LD-ESI without any sample preparation allowed for fast screening of plant metabolites on the grape surface. MALDI imaging of grape cross sections at 20 µm pixel size revealed the detailed distribution of metabolites which were in accordance with their biological function. HPLC/ESI-MS was used to quantify 13 anthocyanin species as well as to separate and identify isomeric compounds. A total of 41 metabolites (amino acids, carbohydrates, anthocyanins) were identified with all three approaches. Mass accuracy for all MS measurements was better than 2 ppm (root mean square error). The combined approach provides fast screening capabilities, spatial distribution information and the possibility to quantify metabolites. Accurate mass measurements proved to be critical in order to reliably combine data from different MS
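
    A key step when combining LD-ESI, MALDI imaging and HPLC/ESI-MS data is matching features across platforms by accurate mass within a tight ppm tolerance (the study reports better than 2 ppm). The sketch below shows such a match with hypothetical m/z values and placeholder metabolite names.

```python
# Minimal sketch: matching features from different MS platforms by accurate mass
# within a ppm tolerance. All m/z values and names are hypothetical.
def ppm_error(measured, theoretical):
    return (measured - theoretical) / theoretical * 1e6

theoretical_mz = {"metabolite_A": 493.1341, "metabolite_B": 203.0526}

platform_features = {
    "LD-ESI":  [493.1338, 203.0530],
    "MALDI":   [493.1345, 203.0522],
    "HPLC-MS": [493.1340, 203.0528],
}

tolerance_ppm = 2.0
for name, mz_theo in theoretical_mz.items():
    hits = {p: mz for p, mzs in platform_features.items()
            for mz in mzs if abs(ppm_error(mz, mz_theo)) <= tolerance_ppm}
    print(name, "matched on:", sorted(hits))
```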

  18. Mass spectrometry of solid samples in open air using combined laser ionization and ambient metastable ionization

    International Nuclear Information System (INIS)

    He, X.N.; Xie, Z.Q.; Gao, Y.; Hu, W.; Guo, L.B.; Jiang, L.; Lu, Y.F.

    2012-01-01

    Mass spectrometry of solid samples in open air was carried out using combined laser ionization and metastable ionization time-of-flight mass spectrometry (LI-MI-TOFMS) in an ambient environment for qualitative and semiquantitative (relative analyte information, not absolute information) analysis. Ambient metastable ionization using a direct analysis in real time (DART) ion source was combined with laser ionization time-of-flight mass spectrometry (LI-TOFMS) to study the effects of combining metastable and laser ionization. A series of metallic samples from the National Institute of Standards and Technology (NIST 494, 495, 498, 499, and 500) and a pure carbon target were characterized using LI-TOFMS in open air. LI-MI-TOFMS was found to be superior to laser-induced breakdown spectroscopy (LIBS). Laser pulse energies between 10 and 200 mJ at the second harmonic (532 nm) of an Nd:YAG laser were applied in the experiment to obtain a high degree of ionization in plasmas. Higher laser pulse energy improves signal intensities of trace elements (such as Fe, Cr, Mn, Ni, Ca, Al, and Ag). Data were analyzed by numerically calculating relative sensitivity coefficients (RSCs) and limits of detection (LODs) from mass spectrometry (MS) and LIBS spectra. Different parameters, such as boiling point, ionization potential, RSC, LOD, and atomic weight, were used to analyze the ionization and MS detection processes in open air.
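
    The two figures of merit used above follow from simple relations: a common definition of an RSC compares the measured to the certified mass fraction, and an LOD can be estimated as three times the blank standard deviation divided by the calibration sensitivity. The sketch below uses hypothetical numbers, not values from the paper.

```python
# Minimal sketch of RSC and LOD calculations with hypothetical values.
import math

certified_fraction = {"Mn": 0.0085, "Ni": 0.012}   # certified mass fractions of a reference standard
measured_fraction  = {"Mn": 0.0079, "Ni": 0.015}   # derived from normalized ion intensities

for el in certified_fraction:
    rsc = measured_fraction[el] / certified_fraction[el]
    print(f"RSC({el}) = {rsc:.2f}")

blank_counts = [102, 95, 110, 99, 104, 97, 101, 108, 96, 103]   # repeated blank measurements
mean_blank = sum(blank_counts) / len(blank_counts)
sigma_blank = math.sqrt(sum((c - mean_blank) ** 2 for c in blank_counts) / (len(blank_counts) - 1))
sensitivity = 2.4e3              # counts per (mg/kg), hypothetical calibration slope
lod = 3 * sigma_blank / sensitivity
print(f"LOD = {lod * 1e3:.2f} ug/kg")
```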

  19. Metabolomics by Gas Chromatography-Mass Spectrometry: the combination of targeted and untargeted profiling

    Science.gov (United States)

    Fiehn, Oliver

    2016-01-01

    Gas chromatography-mass spectrometry (GC-MS)-based metabolomics is ideal for identifying and quantitating small molecular metabolites. GC-MS metabolomics easily allows integrating targeted assays for absolute quantification of specific metabolites with untargeted metabolomics to discover novel compounds. Complemented by database annotations using large spectral libraries and validated, standardized operating procedures, GC-MS can identify and semi-quantify over 200 compounds per study in human body fluid samples (e.g., plasma, urine or stool). Deconvolution software enables detection of more than 300 additional unidentified signals that can be annotated through accurate mass instruments with appropriate data processing workflows, similar to liquid chromatography-MS untargeted profiling (LC-MS). Hence, GC-MS is a mature technology that not only uses classic detectors (‘quadrupole’) but also target mass spectrometers (‘triple quadrupole’) and accurate mass instruments (‘quadrupole-time of flight’). This unit covers the following aspects of GC-MS-based metabolomics: (i) sample preparation from mammalian samples, (ii) acquisition of data, (iii) quality control, and (iv) data processing. PMID:27038389

  20. Multi-level meta-workflows: new concept for regularly occurring tasks in quantum chemistry.

    Science.gov (United States)

    Arshad, Junaid; Hoffmann, Alexander; Gesing, Sandra; Grunzke, Richard; Krüger, Jens; Kiss, Tamas; Herres-Pawlis, Sonja; Terstyanszky, Gabor

    2016-01-01

    In Quantum Chemistry, many tasks reoccur frequently, e.g. geometry optimizations, benchmarking series etc. Here, workflows can help to reduce the time of manual job definition and output extraction. These workflows are executed on computing infrastructures and may require large computing and data resources. Scientific workflows hide these infrastructures and the resources needed to run them. It requires significant effort and specific expertise to design, implement and test these workflows. Many of these workflows are complex and monolithic entities that can be used for particular scientific experiments. Hence, their modification is not straightforward, which makes it almost impossible to share them. To address these issues we propose developing atomic workflows and embedding them in meta-workflows. Atomic workflows deliver a well-defined research domain specific function. Publishing workflows in repositories enables workflow sharing inside and/or among scientific communities. We formally specify atomic and meta-workflows in order to define data structures to be used in repositories for uploading and sharing them. Additionally, we present a formal description focused on the orchestration of atomic workflows into meta-workflows. We investigated the operations that represent basic functionalities in Quantum Chemistry, developed the relevant atomic workflows and combined them into meta-workflows. Having these workflows, we defined the structure of the Quantum Chemistry workflow library and uploaded these workflows to the SHIWA Workflow Repository. Graphical Abstract: Meta-workflows and embedded workflows in the template representation.
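
    One way to picture the atomic-workflow/meta-workflow distinction is as a composition of well-defined functions orchestrated by a container object. The sketch below is a schematic illustration with invented quantum chemistry step names; it is not the formal specification or the SHIWA repository structures from the article.

```python
# Minimal sketch: atomic workflows expose a well-defined function and are
# orchestrated sequentially by a meta-workflow. Names are illustrative.
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class AtomicWorkflow:
    name: str
    function: Callable[[dict], dict]     # well-defined domain-specific step

@dataclass
class MetaWorkflow:
    name: str
    steps: List[AtomicWorkflow] = field(default_factory=list)

    def run(self, data: dict) -> dict:
        for step in self.steps:          # simple sequential orchestration
            data = step.function(data)
        return data

geometry_opt = AtomicWorkflow("geometry_optimization", lambda d: {**d, "geometry": "optimized"})
frequency    = AtomicWorkflow("frequency_analysis",    lambda d: {**d, "frequencies": [510.2, 1632.8]})

benchmark = MetaWorkflow("benchmark_series", [geometry_opt, frequency])
print(benchmark.run({"molecule": "Cu complex"}))
```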

  1. Database Support for Workflow Management: The WIDE Project

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Pernici, B; Sánchez, G.; Unknown, [Unknown

    1999-01-01

    Database Support for Workflow Management: The WIDE Project presents the results of the ESPRIT WIDE project on advanced database support for workflow management. The book discusses the state of the art in combining database management and workflow management technology, especially in the areas of

  2. Workflow in Almaraz NPP

    International Nuclear Information System (INIS)

    Gonzalez Crego, E.; Martin Lopez-Suevos, C.

    2000-01-01

    Almaraz NPP decided to incorporate Workflow into its information system in response to the need to provide exhaustive follow-up and monitoring of each phase of the different procedures it manages. Oracle's Workflow was chosen for this purpose and it was integrated with previously developed applications. The objectives to be met in the incorporation of Workflow were as follows: Strict monitoring of procedures and processes. Detection of bottlenecks in the flow of information. Notification of those affected by pending tasks. Flexible allocation of tasks to user groups. Improved monitoring of management procedures. Improved communication. Similarly, special care was taken to: Integrate workflow processes with existing control panels. Synchronize workflow with installation procedures. Ensure that the system reflects use of paper forms. At present the Corrective Maintenance Request module is being operated using Workflow and the Work Orders and Notice of Order modules are about to follow suit. (Author)

  3. Agreement Workflow Tool (AWT)

    Data.gov (United States)

    Social Security Administration — The Agreement Workflow Tool (AWT) is a role-based Intranet application used for processing SSA's Reimbursable Agreements according to SSA's standards. AWT provides...

  4. Liquid-phase microextraction combined with graphite furnace atomic absorption spectrometry: A review.

    Science.gov (United States)

    de la Calle, Inmaculada; Pena-Pereira, Francisco; Lavilla, Isela; Bendicho, Carlos

    2016-09-14

    An overview of the combination of liquid-phase microextraction (LPME) techniques with graphite furnace atomic absorption spectrometry (GFAAS) is reported herein. The high sensitivity of GFAAS is significantly enhanced by its association with a variety of miniaturized solvent extraction approaches. LPME-GFAAS thus represents a powerful combination for determination of metals, metalloids and organometallic compounds at (ultra)trace level. Different LPME modes used with GFAAS are briefly described, and the experimental parameters that show an impact in those microextraction processes are discussed. Special attention is paid to those parameters affecting GFAAS analysis. Main issues found when coupling LPME and GFAAS, as well as those strategies reported in the literature to solve them, are summarized. Relevant applications published on the topic so far are included. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Querying Workflow Logs

    Directory of Open Access Journals (Sweden)

    Yan Tang

    2018-01-01

    Full Text Available A business process or workflow is an assembly of tasks that accomplishes a business goal. Business process management is the study of the design, configuration/implementation, enactment and monitoring, analysis, and re-design of workflows. The traditional methodology for the re-design and improvement of workflows relies on the well-known sequence of extract, transform, and load (ETL), data/process warehousing, and online analytical processing (OLAP) tools. In this paper, we study the ad hoc querying of process enactments for (data-centric) business processes, bypassing the traditional methodology for more flexibility in querying. We develop an algebraic query language based on “incident patterns” with four operators inspired by the Business Process Model and Notation (BPMN) representation, allowing the user to formulate ad hoc queries directly over workflow logs. A formal semantics of this query language, a preliminary query evaluation algorithm, and a group of elementary properties of the operators are provided.
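
    An example of the kind of ad hoc question such a query language targets: which enactments (cases) retried a given task before completing? The sketch below answers it directly over a toy workflow log; the log schema and the query are illustrative and are not the algebraic operators defined in the paper.

```python
# Minimal sketch of an ad hoc query over a workflow log: find cases in which a
# task appears more than once (e.g. it failed and was retried). Records are hypothetical.
log = [
    {"case": 1, "task": "review",  "event": "complete"},
    {"case": 2, "task": "review",  "event": "fail"},
    {"case": 2, "task": "review",  "event": "complete"},
    {"case": 2, "task": "approve", "event": "complete"},
]

def retried_cases(records, task_name):
    counts = {}
    for r in records:
        if r["task"] == task_name:
            counts[r["case"]] = counts.get(r["case"], 0) + 1
    return [case for case, n in counts.items() if n > 1]

print(retried_cases(log, "review"))   # -> [2]
```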

  6. Improving mass measurement accuracy in mass spectrometry based proteomics by combining open source tools for chromatographic alignment and internal calibration.

    Science.gov (United States)

    Palmblad, Magnus; van der Burgt, Yuri E M; Dalebout, Hans; Derks, Rico J E; Schoenmaker, Bart; Deelder, André M

    2009-05-02

    Accurate mass determination enhances peptide identification in mass spectrometry based proteomics. We here describe the combination of two previously published open source software tools to improve mass measurement accuracy in Fourier transform ion cyclotron resonance mass spectrometry (FTICRMS). The first program, msalign, aligns one MS/MS dataset with one FTICRMS dataset. The second software, recal2, uses peptides identified from the MS/MS data for automated internal calibration of the FTICR spectra, resulting in sub-ppm mass measurement errors.
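
    The internal calibration idea is straightforward: peptides identified from the MS/MS data provide theoretical m/z values, a correction function is fitted against the observed FTICR m/z values, and the whole spectrum is recalibrated with it. The sketch below shows a linear version with hypothetical masses; it illustrates the principle only and is not the msalign/recal2 code.

```python
# Minimal sketch of internal recalibration using identified peptides as calibrants.
import numpy as np

observed_mz    = np.array([652.3412, 785.8421, 1021.5034])   # from the FTICR spectrum
theoretical_mz = np.array([652.3390, 785.8396, 1021.5001])   # from identified peptide sequences

# Fit observed = a * theoretical + b, then invert it to correct the whole spectrum
a, b = np.polyfit(theoretical_mz, observed_mz, 1)

def recalibrate(mz):
    return (mz - b) / a

errors_ppm_before = (observed_mz - theoretical_mz) / theoretical_mz * 1e6
errors_ppm_after  = (recalibrate(observed_mz) - theoretical_mz) / theoretical_mz * 1e6
print("ppm error before:", errors_ppm_before.round(2), "after:", errors_ppm_after.round(2))
```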

  7. Responsive web design workflow

    OpenAIRE

    LAAK, TIMO

    2013-01-01

    Responsive Web Design Workflow is a literature review about Responsive Web Design, a web standards based modern web design paradigm. The goals of this research were to define what responsive web design is, determine its importance in building modern websites and describe a workflow for responsive web design projects. Responsive web design is a paradigm to create adaptive websites, which respond to the properties of the media that is used to render them. The three key elements of responsi...

  8. Combined discrete nebulization and microextraction process for molybdenum determination by flame atomic absorption spectrometry (FAAS)

    International Nuclear Information System (INIS)

    Oviedo, Jenny A.; Jesus, Amanda M.D. de; Fialho, Lucimar L.; Pereira-Filho, Edenir R.

    2014-01-01

    Simple and sensitive procedures for the extraction/preconcentration of molybdenum based on vortex-assisted solidified floating organic drop microextraction (VA-SFODME) and cloud point combined with flame absorption atomic spectrometry (FAAS) and discrete nebulization were developed. The influence of the discrete nebulization on the sensitivity of the molybdenum preconcentration processes was studied. An injection volume of 200 μ resulted in a lower relative standard deviation with both preconcentration procedures. Enrichment factors of 31 and 67 and limits of detection of 25 and 5 μ L -1 were obtained for cloud point and VA-SFODME, respectively. The developed procedures were applied to the determination of Mo in mineral water and multivitamin samples. (author)

  9. Combined X-ray CT and mass spectrometry for biomedical imaging applications

    Science.gov (United States)

    Schioppa, E., Jr.; Ellis, S.; Bruinen, A. L.; Visser, J.; Heeren, R. M. A.; Uher, J.; Koffeman, E.

    2014-04-01

    Imaging technologies play a key role in many branches of science, especially in biology and medicine. They provide an invaluable insight into both internal structure and processes within a broad range of samples. There are many techniques that allow one to obtain images of an object. Different techniques are based on the analysis of a particular sample property by means of a dedicated imaging system, and as such, each imaging modality provides the researcher with different information. The use of multimodal imaging (imaging with several different techniques) can provide additional and complementary information that is not possible when employing a single imaging technique alone. In this study, we present for the first time a multi-modal imaging technique where X-ray computerized tomography (CT) is combined with mass spectrometry imaging (MSI). While X-ray CT provides 3-dimensional information regarding the internal structure of the sample based on X-ray absorption coefficients, MSI of thin sections acquired from the same sample allows the spatial distribution of many elements/molecules, each distinguished by its unique mass-to-charge ratio (m/z), to be determined within a single measurement and with a spatial resolution as low as 1 μm or even less. The aim of the work is to demonstrate how molecular information from MSI can be spatially correlated with 3D structural information acquired from X-ray CT. In these experiments, frozen samples are imaged in an X-ray CT setup using Medipix based detectors equipped with a CO2 cooled sample holder. Single projections are pre-processed before tomographic reconstruction using a signal-to-thickness calibration. In the second step, the object is sliced into thin sections (circa 20 μm) that are then imaged using both matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) and secondary ion (SIMS) mass spectrometry, where the spatial distribution of specific molecules within the sample is determined. The

  10. Liquid-phase microextraction combined with graphite furnace atomic absorption spectrometry: A review

    Energy Technology Data Exchange (ETDEWEB)

    Calle, Inmaculada de la; Pena-Pereira, Francisco; Lavilla, Isela; Bendicho, Carlos, E-mail: bendicho@uvigo.es

    2016-09-14

    An overview of the combination of liquid-phase microextraction (LPME) techniques with graphite furnace atomic absorption spectrometry (GFAAS) is reported herein. The high sensitivity of GFAAS is significantly enhanced by its association with a variety of miniaturized solvent extraction approaches. LPME-GFAAS thus represents a powerful combination for determination of metals, metalloids and organometallic compounds at (ultra)trace level. Different LPME modes used with GFAAS are briefly described, and the experimental parameters that show an impact in those microextraction processes are discussed. Special attention is paid to those parameters affecting GFAAS analysis. Main issues found when coupling LPME and GFAAS, as well as those strategies reported in the literature to solve them, are summarized. Relevant applications published on the topic so far are included. - Highlights: • We review the LPME-GFAAS combination in a comprehensive way. • A brief description of main LPME modes is included. • Effect of experimental parameters in the performance of LPME-GFAAS is discussed. • Main applications for trace element analysis and speciation are reviewed.

  11. Liquid-phase microextraction combined with graphite furnace atomic absorption spectrometry: A review

    International Nuclear Information System (INIS)

    Calle, Inmaculada de la; Pena-Pereira, Francisco; Lavilla, Isela; Bendicho, Carlos

    2016-01-01

    An overview of the combination of liquid-phase microextraction (LPME) techniques with graphite furnace atomic absorption spectrometry (GFAAS) is reported herein. The high sensitivity of GFAAS is significantly enhanced by its association with a variety of miniaturized solvent extraction approaches. LPME-GFAAS thus represents a powerful combination for determination of metals, metalloids and organometallic compounds at (ultra)trace level. Different LPME modes used with GFAAS are briefly described, and the experimental parameters that show an impact in those microextraction processes are discussed. Special attention is paid to those parameters affecting GFAAS analysis. Main issues found when coupling LPME and GFAAS, as well as those strategies reported in the literature to solve them, are summarized. Relevant applications published on the topic so far are included. - Highlights: • We review the LPME-GFAAS combination in a comprehensive way. • A brief description of main LPME modes is included. • Effect of experimental parameters in the performance of LPME-GFAAS is discussed. • Main applications for trace element analysis and speciation are reviewed.

  12. ANALYSIS OF ARTEMISININ AND RELATED SESQUITERPENOIDS FROM ARTEMISIA-ANNUA L BY COMBINED GAS-CHROMATOGRAPHY MASS-SPECTROMETRY

    NARCIS (Netherlands)

    WOERDENBAG, HJ; PRAS, N; BOS, R; VISSER, JF; HENDRIKS, H; MALINGRE, TM

    1991-01-01

    The sesquiterpenoid artemisinin (3) and its biosynthetic precursors arteannuic acid (1), arteannuin B (2) and artemisitene (4) can be separated and identified by combined gas chromatography/mass spectrometry both as a mixture of reference standards as well as in extracts of Artemisia annua L. From

  13. Workflow management: an overview

    NARCIS (Netherlands)

    Ouyang, C.; Adams, M.; Wynn, M.T.; Hofstede, ter A.H.M.; Brocke, vom J.; Rosemann, M.

    2010-01-01

    Workflow management has its origin in the office automation systems of the seventies, but it is not until fairly recently that conceptual and technological breakthroughs have led to its widespread adoption. In fact, nowadays, process awareness has become an accepted and integral part of various types

  14. Ferret Workflow Anomaly Detection System

    National Research Council Canada - National Science Library

    Smith, Timothy J; Bryant, Stephany

    2005-01-01

    The Ferret workflow anomaly detection system project 2003-2004 has provided validation and anomaly detection in accredited workflows in secure knowledge management systems through the use of continuous, automated audits...

  15. Identification and monitoring of host cell proteins by mass spectrometry combined with high performance immunochemistry testing.

    Directory of Open Access Journals (Sweden)

    Katrin Bomans

    Full Text Available Biotherapeutics are often produced in non-human host cells like Escherichia coli, yeast, and various mammalian cell lines. A major focus of any therapeutic protein purification process is to reduce host cell proteins to an acceptably low level. In this study, various E. coli host cell proteins were identified at different purification steps by HPLC fractionation, SDS-PAGE analysis, and tryptic peptide mapping combined with online liquid chromatography mass spectrometry (LC-MS). However, no host cell proteins could be verified by direct LC-MS analysis of final drug substance material. In contrast, the application of affinity enrichment chromatography prior to comprehensive LC-MS was adequate to identify several low abundant host cell proteins at the final drug substance level. Bacterial alkaline phosphatase (BAP) was identified as the most abundant host cell protein at several purification steps. Thus, we first established two different assays for enzymatic and immunological BAP monitoring using the cobas® technology. By using this strategy we were able to demonstrate an almost complete removal of BAP enzymatic activity by the established therapeutic protein purification process. In summary, the impact of fermentation, purification, and formulation conditions on host cell protein removal and biological activity can be assessed by monitoring process-specific host cell proteins in a GMP-compatible and high-throughput (> 1000 samples/day) manner.

  16. Optimization of the combination micro-high-performance liquid-chromatography/mass spectrometry

    International Nuclear Information System (INIS)

    Haider, K.

    1997-03-01

    The coupling of liquid chromatography and mass spectrometry is still growing in significance. In this thesis, a particle beam interface has been investigated for combining ion chromatography with mass spectrometric detection. To introduce the eluent directly (without membrane suppressor) into the spectrometer, only methods with low flow rates like microcolumn chromatography can be used. For the preparation of the columns, reversed-phase and silica-based anion exchange materials were packed into PEEK, steel and fused-silica capillaries with i.d. from 130 to 1000 μm using different methods. The performance of the particle beam interface (modified with a new miniaturized aerosol generator) and the mass spectrometric detection has been studied for a series of inorganic anions as well as aminopolycarboxylic acids and the metal-EDTA complexes. Detection limits between 10 and 100 ng injected could be achieved in the multiple ion detection mode of the mass spectrometer for the investigated solutes. A second type of interface, the direct liquid introduction (DLI) has been used to analyze the priority pollutant phenols. This interface is based on a modified GC-interface into the MS. Separation columns used so far include packed fused-silica capillaries with inner diameter of 75 μm and polystyrene-divinylbenzene (functionalized with tert. butyl groups) as stationary phase. Aspects of instrumentation and effects of chemical ionization in the direct liquid introduction mode are discussed. (author)

  17. X-ray fluorescence and gamma-ray spectrometry combined with multivariate analysis for topographic studies in agricultural soil

    International Nuclear Information System (INIS)

    Castilhos, Natara D.B. de; Melquiades, Fábio L.; Thomaz, Edivaldo L.; Bastos, Rodrigo Oliveira

    2015-01-01

    Physical and chemical properties of soils play a major role in the evaluation of different geochemical signature, soil quality, discrimination of land use type, soil provenance and soil degradation. The objectives of the present study are the soil elemental characterization and soil differentiation in topographic sequence and depth, using Energy Dispersive X-Ray Fluorescence (EDXRF) as well as gamma-ray spectrometry data combined with Principal Component Analysis (PCA). The study area is an agricultural region of Boa Vista catchment which is located at Guamiranga municipality, Brazil. PCA analysis was performed with four different data sets: spectral data from EDXRF, spectral data from gamma-ray spectrometry, concentration values from EDXRF measurements and concentration values from gamma-ray spectrometry. All PCAs showed similar results, confirmed by hierarchical cluster analysis, allowing the data grouping into top, bottom and riparian zone samples, i.e. the samples were separated due to its landscape position. The two hillslopes present the same behavior independent of the land use history. There are distinctive and characteristic patterns in the analyzed soil. The methodologies presented are promising and could be used to infer significant information about the region to be studied. - Highlights: • Characterization of topographic sequence of two hillslopes from agricultural soil. • Employment of EDXRF and gamma-ray spectrometry data combined with PCA. • The combination of green analytical methodologies with chemometric studies allowed soil differentiation. • The innovative methodology is promising for direct characterization of agricultural catchments

  18. Workflows for Full Waveform Inversions

    Science.gov (United States)

    Boehm, Christian; Krischer, Lion; Afanasiev, Michael; van Driel, Martin; May, Dave A.; Rietmann, Max; Fichtner, Andreas

    2017-04-01

    Despite many theoretical advances and the increasing availability of high-performance computing clusters, full seismic waveform inversions still face considerable challenges regarding data and workflow management. While the community has access to solvers which can harness modern heterogeneous computing architectures, the computational bottleneck has fallen to these often manpower-bounded issues that need to be overcome to facilitate further progress. Modern inversions involve huge amounts of data and require a tight integration between numerical PDE solvers, data acquisition and processing systems, nonlinear optimization libraries, and job orchestration frameworks. To this end we created a set of libraries and applications revolving around Salvus (http://salvus.io), a novel software package designed to solve large-scale full waveform inverse problems. This presentation focuses on solving passive source seismic full waveform inversions from local to global scales with Salvus. We discuss (i) design choices for the aforementioned components required for full waveform modeling and inversion, (ii) their implementation in the Salvus framework, and (iii) how it is all tied together by a usable workflow system. We combine state-of-the-art algorithms ranging from high-order finite-element solutions of the wave equation to quasi-Newton optimization algorithms using trust-region methods that can handle inexact derivatives. All is steered by an automated interactive graph-based workflow framework capable of orchestrating all necessary pieces. This naturally facilitates the creation of new Earth models and hopefully sparks new scientific insights. Additionally, and even more importantly, it enhances reproducibility and reliability of the final results.

  19. Effect-directed fingerprints of 77 botanical extracts via a generic high-performance thin-layer chromatography method combined with assays and mass spectrometry.

    Science.gov (United States)

    Krüger, S; Hüsken, L; Fornasari, R; Scainelli, I; Morlock, G E

    2017-12-22

    Quantitative effect-directed profiles of 77 industrially and freshly extracted botanicals like herbs, spices, vegetables and fruits, widely used as food ingredients, dietary supplements or traditional medicine, gave relevant information on their quality. This allows the assessment of food, dietary supplements and phytomedicines with regard to potential health-promoting activities. In contrast to sum parameter assays and targeted analysis, chromatography combined with effect-directed analysis allows fast assignment of single active compounds and evaluation of their contribution to the overall activity, originating from a food or botanical sample. High-performance thin-layer chromatography was hyphenated with UV/Vis/FLD detection and effect-directed analysis, using the 2,2-diphenyl-1-picrylhydrazyl radical, Gram-negative Aliivibrio fischeri, Gram-positive Bacillus subtilis, acetylcholinesterase and tyrosinase assays. Bioactive compounds of interest were eluted using an elution head-based interface and further characterized by electrospray ionization (high-resolution) mass spectrometry. This highly streamlined workflow resulted in a hyphenated HPTLC-UV/Vis/FLD-EDA-ESI + /ESI - -(HR)MS method. The excellent quantification power of the method was shown on three compounds. For rosmarinic acid, contents ranged from 4.5 mg/g (rooibos) to 32.6 mg/g (rosemary), for kaempferol-3-glucoside from 0.6 mg/g (caraway) to 4.4 mg/g (wine leaves), and for quercetin-3-glucoside from 1.1 mg/g (hawthorn leaves) to 17.7 mg/g (thyme). Three mean repeatabilities (%RSD) over 18 quantifications for the three compounds were ≤2.2% and the mean intermediate precision over three different days (%RSD, n=3) was 5.2%. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Rapid identification and susceptibility testing of Candida spp. from positive blood cultures by combination of direct MALDI-TOF mass spectrometry and direct inoculation of Vitek 2.

    Science.gov (United States)

    Idelevich, Evgeny A; Grunewald, Camilla M; Wüllenweber, Jörg; Becker, Karsten

    2014-01-01

    Fungaemia is associated with high mortality rates and early appropriate antifungal therapy is essential for patient management. However, classical diagnostic workflow takes up to several days due to the slow growth of yeasts. Therefore, an approach for direct species identification and direct antifungal susceptibility testing (AFST) without prior time-consuming sub-culturing of yeasts from positive blood cultures (BCs) is urgently needed. Yeast cell pellets prepared using Sepsityper kit were used for direct identification by MALDI-TOF mass spectrometry (MS) and for direct inoculation of Vitek 2 AST-YS07 card for AFST. For comparison, MALDI-TOF MS and Vitek 2 testing were performed from yeast subculture. A total of twenty four positive BCs including twelve C. glabrata, nine C. albicans, two C. dubliniensis and one C. krusei isolate were processed. Applying modified thresholds for species identification (score ≥ 1.5 with two identical consecutive propositions), 62.5% of BCs were identified by direct MALDI-TOF MS. AFST results were generated for 72.7% of BCs directly tested by Vitek 2 and for 100% of standardized suspensions from 24 h cultures. Thus, AFST comparison was possible for 70 isolate-antifungal combinations. Essential agreement (minimum inhibitory concentration difference ≤ 1 double dilution step) was 88.6%. Very major errors (VMEs) (false-susceptibility), major errors (false-resistance) and minor errors (false categorization involving intermediate result) amounted to 33.3% (of resistant isolates), 1.9% (of susceptible isolates) and 1.4% providing 90.0% categorical agreement. All VMEs were due to fluconazole or voriconazole. This direct method saved on average 23.5 h for identification and 15.1 h for AFST, compared to routine procedures. However, performance for azole susceptibility testing was suboptimal and testing from subculture remains indispensable to validate the direct finding.
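
    The agreement statistics quoted above follow simple definitions: essential agreement counts MIC pairs that differ by at most one doubling dilution, and categorical errors are classified by the direction of the susceptible/resistant mismatch. The sketch below computes them for a handful of hypothetical isolates; none of the values are from the study.

```python
# Minimal sketch of essential and categorical agreement calculations for AFST,
# with hypothetical MIC pairs and category calls.
import math

# (direct MIC, reference MIC) in mg/L for one antifungal across isolates
mic_pairs = [(0.25, 0.5), (1.0, 1.0), (0.125, 0.5), (2.0, 1.0)]

def within_one_dilution(direct, reference):
    return abs(math.log2(direct) - math.log2(reference)) <= 1

essential_agreement = sum(within_one_dilution(d, r) for d, r in mic_pairs) / len(mic_pairs)

# Categorical comparison: S = susceptible, R = resistant, pairs are (direct, reference)
categories = [("S", "S"), ("S", "R"), ("R", "R"), ("S", "S")]
very_major = sum(1 for d, r in categories if d == "S" and r == "R")   # false susceptibility
major      = sum(1 for d, r in categories if d == "R" and r == "S")   # false resistance
categorical_agreement = sum(d == r for d, r in categories) / len(categories)

print(f"EA = {essential_agreement:.0%}, CA = {categorical_agreement:.0%}, VME = {very_major}, ME = {major}")
```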

  1. Rapid identification and susceptibility testing of Candida spp. from positive blood cultures by combination of direct MALDI-TOF mass spectrometry and direct inoculation of Vitek 2.

    Directory of Open Access Journals (Sweden)

    Evgeny A Idelevich

    Full Text Available Fungaemia is associated with high mortality rates and early appropriate antifungal therapy is essential for patient management. However, classical diagnostic workflow takes up to several days due to the slow growth of yeasts. Therefore, an approach for direct species identification and direct antifungal susceptibility testing (AFST) without prior time-consuming sub-culturing of yeasts from positive blood cultures (BCs) is urgently needed. Yeast cell pellets prepared using Sepsityper kit were used for direct identification by MALDI-TOF mass spectrometry (MS) and for direct inoculation of Vitek 2 AST-YS07 card for AFST. For comparison, MALDI-TOF MS and Vitek 2 testing were performed from yeast subculture. A total of twenty four positive BCs including twelve C. glabrata, nine C. albicans, two C. dubliniensis and one C. krusei isolate were processed. Applying modified thresholds for species identification (score ≥ 1.5 with two identical consecutive propositions), 62.5% of BCs were identified by direct MALDI-TOF MS. AFST results were generated for 72.7% of BCs directly tested by Vitek 2 and for 100% of standardized suspensions from 24 h cultures. Thus, AFST comparison was possible for 70 isolate-antifungal combinations. Essential agreement (minimum inhibitory concentration difference ≤ 1 double dilution step) was 88.6%. Very major errors (VMEs) (false-susceptibility), major errors (false-resistance) and minor errors (false categorization involving intermediate result) amounted to 33.3% (of resistant isolates), 1.9% (of susceptible isolates) and 1.4% providing 90.0% categorical agreement. All VMEs were due to fluconazole or voriconazole. This direct method saved on average 23.5 h for identification and 15.1 h for AFST, compared to routine procedures. However, performance for azole susceptibility testing was suboptimal and testing from subculture remains indispensable to validate the direct finding.

  2. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    Rzehorz, Gerhard Ferdinand; The ATLAS collaboration

    2018-01-01

    From 2025 onwards, the ATLAS collaboration at the Large Hadron Collider (LHC) at CERN will experience a massive increase in data quantity as well as complexity. Even with mitigating factors, the computing power available by that time will fulfil only one tenth of the requirement. This contribution will focus on Cloud computing as an approach to help overcome this challenge by providing flexible hardware that can be configured to the specific needs of a workflow. Experience with Cloud computing exists, but there is large uncertainty whether, and to what degree, it will be able to reduce the burden by 2025. In order to understand and quantify the benefits of Cloud computing, the "Workflow and Infrastructure Model" was created. It estimates the viability of Cloud computing by combining different inputs from the workflow side with infrastructure specifications. The model delivers metrics that enable the comparison of different Cloud configurations as well as different Cloud offerings with each other. A wide range of r...

  3. Insightful Workflow For Grid Computing

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Charles Earl

    2008-10-09

    We developed a workflow adaptation and scheduling system for Grid workflows. The system currently interfaces with and uses the Karajan workflow system. We developed machine learning agents that provide the planner/scheduler with the information needed to make decisions about when and how to replan. The Kubrick system restructures workflows at runtime, making it unique among workflow scheduling systems. The existing Kubrick system provides a platform on which to integrate additional quality-of-service constraints and in which to explore the use of an ensemble of scheduling and planning algorithms. This will be the principal thrust of our Phase II work.

  4. Intact cell MALDI-TOF mass spectrometry on single bovine oocyte and follicular cells combined with top-down proteomics: A novel approach to characterise markers of oocyte maturation.

    Science.gov (United States)

    Labas, Valérie; Teixeira-Gomes, Ana-Paula; Bouguereau, Laura; Gargaros, Audrey; Spina, Lucie; Marestaing, Aurélie; Uzbekova, Svetlana

    2018-03-20

    Intact cell MALDI-TOF mass spectrometry (ICM-MS) was adapted to bovine follicular cells from individual ovarian follicles to obtain protein/peptide signatures. A top-down workflow using high resolution MS/MS (TD HR-MS) was then performed on the protein extracts from oocytes, cumulus cells (CC) and granulosa cells (GC). The TD HR-MS proteomic approach allowed for: (1) identification of 386 peptide/proteoforms encoded by 194 genes; and (2) characterisation of proteolysis products likely resulting from the action of kallikreins and caspases. In total, 136 peaks observed by ICM-MS were annotated by TD HR-MS (ProteomeXchange PXD004892). Among these, 16 markers of maturation were identified, including IGF2 binding protein 3 and hemoglobin B in the oocyte, and thymosins beta-4 and beta-10, histone H2B and ubiquitin in CC. The combination of ICM-MS and TD HR-MS proved to be a suitable strategy to identify non-invasive markers of oocyte quality using limited biological samples. Intact cell MALDI-TOF mass spectrometry on single oocytes and their surrounding cumulus cells, coupled to an optimised top-down HR-MS proteomic approach on ovarian follicular cells, was used to identify specific markers of oocyte meiotic maturation represented by whole low molecular weight proteins or products of degradation by specific proteases. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Recent advances in combination of capillary electrophoresis with mass spectrometry: Methodology and theory

    OpenAIRE

    Klepárník, K. (Karel)

    2015-01-01

    This review focuses on the latest development of microseparation electromigration methods in capillaries and microfluidic devices with mass spectrometry detection and identification. A wide selection of 183 relevant articles covers the literature published from June 2012 till May 2014.

  6. Workflow User Interfaces Patterns

    Directory of Open Access Journals (Sweden)

    Jean Vanderdonckt

    2012-03-01

    Full Text Available A collection of user interface design patterns for workflow information systems is presented that contains forty-three resource patterns classified in seven categories. These categories and their corresponding patterns have been logically identified from the task life cycle based on offering and allocation operations. Each Workflow User Interface Pattern (WUIP) is characterized by properties expressed in the PLML markup language for expressing patterns and augmented by additional attributes and models attached to the pattern: the abstract user interface and the corresponding task model. These models are specified in a User Interface Description Language. All WUIPs are stored in a library and can be retrieved within a workflow editor that links each workflow pattern to its corresponding WUIP, thus giving rise to a user interface for each workflow pattern.

  7. Accelerating the scientific exploration process with scientific workflows

    International Nuclear Information System (INIS)

    Altintas, Ilkay; Barney, Oscar; Cheng, Zhengang; Critchlow, Terence; Ludaescher, Bertram; Parker, Steve; Shoshani, Arie; Vouk, Mladen

    2006-01-01

    Although an increasing amount of middleware has emerged in the last few years to achieve remote data access, distributed job execution, and data management, orchestrating these technologies with minimal overhead still remains a difficult task for scientists. Scientific workflow systems improve this situation by creating interfaces to a variety of technologies and automating the execution and monitoring of the workflows. Workflow systems provide domain-independent customizable interfaces and tools that combine different tools and technologies along with efficient methods for using them. As simulations and experiments move into the petascale regime, the orchestration of long running data and compute intensive tasks is becoming a major requirement for the successful steering and completion of scientific investigations. A scientific workflow is the process of combining data and processes into a configurable, structured set of steps that implement semi-automated computational solutions of a scientific problem. Kepler is a cross-project collaboration, co-founded by the SciDAC Scientific Data Management (SDM) Center, whose purpose is to develop a domain-independent scientific workflow system. It provides a workflow environment in which scientists design and execute scientific workflows by specifying the desired sequence of computational actions and the appropriate data flow, including required data transformations, between these steps. Currently deployed workflows range from local analytical pipelines to distributed, high-performance and high-throughput applications, which can be both data- and compute-intensive. The scientific workflow approach offers a number of advantages over traditional scripting-based approaches, including ease of configuration, improved reusability and maintenance of workflows and components (called actors), automated provenance management, 'smart' re-running of different versions of workflow instances, on-the-fly updateable parameters, monitoring

  8. Dispersive liquid-liquid microextraction combined with graphite furnace atomic absorption spectrometry

    International Nuclear Information System (INIS)

    Zeini Jahromi, Elham; Bidari, Araz; Assadi, Yaghoub; Milani Hosseini, Mohammad Reza; Jamali, Mohammad Reza

    2007-01-01

    The dispersive liquid-liquid microextraction (DLLME) technique was successfully used as a sample preparation method for graphite furnace atomic absorption spectrometry (GF AAS). In this extraction method, 500 μL methanol (disperser solvent) containing 34 μL carbon tetrachloride (extraction solvent) and 0.00010 g ammonium pyrrolidine dithiocarbamate (chelating agent) was rapidly injected by syringe into the water sample containing cadmium ions (the analyte of interest). Thereby, a cloudy solution formed. The cloudy state resulted from the formation of fine droplets of carbon tetrachloride dispersed in the bulk aqueous sample. At this stage, cadmium reacts with ammonium pyrrolidine dithiocarbamate, forming a hydrophobic complex that is extracted into the fine droplets of carbon tetrachloride. After centrifugation (2 min at 5000 rpm), these droplets were sedimented at the bottom of the conical test tube (25 ± 1 μL). Then 20 μL of the sedimented phase containing the enriched analyte was analyzed by GF AAS. Parameters affecting extraction and complex formation, such as extraction and disperser solvent type and volume, extraction time, salt effect, pH and concentration of the chelating agent, were optimized. Under the optimum conditions, an enrichment factor of 125 was obtained from only 5.00 mL of water sample. The calibration graph was linear in the range of 2-20 ng L -1 with a detection limit of 0.6 ng L -1 . The relative standard deviation (R.S.D.) for ten replicate measurements of 20 ng L -1 of cadmium was 3.5%. The relative recoveries of cadmium in tap, sea and river water samples at spiking levels of 5 and 10 ng L -1 were 108, 95, 87 and 98%, respectively. The characteristics of the proposed method were compared with cloud point extraction (CPE), on-line liquid-liquid extraction, single drop microextraction (SDME), on-line solid phase extraction (SPE) and co-precipitation based on bibliographic data. Therefore, DLLME combined with
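
    The enrichment factor and relative recovery reported above follow from simple ratios: the factor is commonly taken as the ratio of calibration slopes with and without preconcentration, and the recovery compares the recovered spike to the added amount. The sketch below uses hypothetical numbers chosen only to mirror the order of magnitude of the reported values.

```python
# Minimal sketch of two figures of merit for DLLME-GF AAS, with hypothetical numbers.
slope_with_dllme = 2.5e-3   # absorbance per (ng/L), after preconcentration
slope_direct     = 2.0e-5   # absorbance per (ng/L), direct measurement
enrichment_factor = slope_with_dllme / slope_direct
print(f"enrichment factor ~ {enrichment_factor:.0f}")

found_spiked   = 14.8   # ng/L measured in the spiked sample
found_unspiked =  5.1   # ng/L measured in the unspiked sample
spike_added    = 10.0   # ng/L added
relative_recovery = (found_spiked - found_unspiked) / spike_added * 100
print(f"relative recovery = {relative_recovery:.0f}%")
```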

  9. Affinity purification combined with mass spectrometry to identify herpes simplex virus protein-protein interactions.

    Science.gov (United States)

    Meckes, David G

    2014-01-01

    The identification and characterization of herpes simplex virus protein interaction complexes are fundamental to understanding the molecular mechanisms governing the replication and pathogenesis of the virus. Recent advances in affinity-based methods, mass spectrometry configurations, and bioinformatics tools have greatly increased the quantity and quality of protein-protein interaction datasets. In this chapter, detailed and reliable methods that can easily be implemented are presented for the identification of protein-protein interactions using cryogenic cell lysis, affinity purification, trypsin digestion, and mass spectrometry.

  10. Modelling and analysis of workflow for lean supply chains

    Science.gov (United States)

    Ma, Jinping; Wang, Kanliang; Xu, Lida

    2011-11-01

    Cross-organisational workflow systems are a component of enterprise information systems which support collaborative business processes among organisations in a supply chain. Currently, the majority of workflow systems are developed from the perspective of information modelling without considering the actual requirements of supply chain management. In this article, we focus on the modelling and analysis of cross-organisational workflow systems in the context of the lean supply chain (LSC) using Petri nets. First, the article describes the assumed conditions of a cross-organisational workflow net according to the idea of LSC and then discusses the standardisation of collaborative business processes between organisations in the context of LSC. Second, the concept of labelled time Petri nets (LTPNs) is defined by combining labelled Petri nets with time Petri nets, and the concept of labelled time workflow nets (LTWNs) is also defined based on LTPNs. Cross-organisational labelled time workflow nets (CLTWNs) are then defined based on LTWNs. Third, the article proposes the notion of OR-silent CLTWNs and an approach for verifying the soundness of LTWNs and CLTWNs. Finally, this article illustrates how to use the proposed method by a simple example. The purpose of this research is to establish a formal method for the modelling and analysis of workflow systems for LSC. This study initiates a new perspective of research on cross-organisational workflow management and promotes operation management of LSC in real world settings.
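
    The labelled (time) Petri-net concepts can be made concrete with a small token-game data structure: transitions carry a label and a firing-time interval, and firing moves tokens between places. The sketch below is an illustration of the idea only; it is not the LTWN/CLTWN formalism or the OR-silent soundness check from the article.

```python
# Minimal sketch of a labelled time workflow net: places, labelled transitions with
# firing-time intervals, and a simple token game. Names and values are illustrative.
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class Transition:
    label: str
    inputs: List[str]                    # input places
    outputs: List[str]                   # output places
    time_interval: Tuple[float, float]   # earliest/latest firing time

def enabled(marking: Dict[str, int], t: Transition) -> bool:
    return all(marking.get(p, 0) > 0 for p in t.inputs)

def fire(marking: Dict[str, int], t: Transition) -> Dict[str, int]:
    m = dict(marking)
    for p in t.inputs:
        m[p] -= 1
    for p in t.outputs:
        m[p] = m.get(p, 0) + 1
    return m

order   = Transition("place_order", ["start"],   ["ordered"], (0.0, 1.0))
deliver = Transition("deliver",     ["ordered"], ["end"],     (2.0, 5.0))

marking = {"start": 1}
for t in (order, deliver):
    if enabled(marking, t):
        marking = fire(marking, t)
print(marking)   # a sound workflow net terminates with a single token in the sink place
```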

  11. Recent advances in combination of capillary electrophoresis with mass spectrometry: Methodology and theory

    Czech Academy of Sciences Publication Activity Database

    Klepárník, Karel

    2015-01-01

    Roč. 36, č. 1 (2015), s. 159-179 ISSN 0173-0835 R&D Projects: GA ČR(CZ) GA14-28254S Institutional support: RVO:68081715 Keywords : capillary electrophoresis * electrospray * mass spectrometry * Microfluidic devices Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 2.482, year: 2015

  12. Native State Mass Spectrometry, Surface Plasmon Resonance, and X-ray Crystallography Correlate Strongly as a Fragment Screening Combination.

    Science.gov (United States)

    Woods, Lucy A; Dolezal, Olan; Ren, Bin; Ryan, John H; Peat, Thomas S; Poulsen, Sally-Ann

    2016-03-10

    Fragment-based drug discovery (FBDD) is contingent on the development of analytical methods to identify weak protein-fragment noncovalent interactions. Herein we have combined an underutilized fragment screening method, native state mass spectrometry, together with two proven and popular fragment screening methods, surface plasmon resonance and X-ray crystallography, in a fragment screening campaign against human carbonic anhydrase II (CA II). In an initial fragment screen against a 720-member fragment library (the "CSIRO Fragment Library") seven CA II binding fragments, including a selection of nonclassical CA II binding chemotypes, were identified. A further 70 compounds that comprised the initial hit chemotypes were subsequently sourced from the full CSIRO compound collection and screened. The fragment results were extremely well correlated across the three methods. Our findings demonstrate that there is a tremendous opportunity to apply native state mass spectrometry as a complementary fragment screening method to accelerate drug discovery.

  13. Data Workflow - A Workflow Model for Continuous Data Processing

    NARCIS (Netherlands)

    Wombacher, Andreas

    2010-01-01

    Online data or streaming data are getting more and more important for enterprise information systems, e.g. by integrating sensor data and workflows. The continuous flow of data provided e.g. by sensors requires new workflow models addressing the data perspective of these applications, since

  14. On Secure Workflow Decentralisation on the Internet

    Directory of Open Access Journals (Sweden)

    Petteri Kaskenpalo

    2010-06-01

    Full Text Available Decentralised workflow management systems are a new research area, where most work to date has focused on the system's overall architecture. As little attention has been given to the security aspects of such systems, we follow a security-driven approach and consider, from the perspective of available security building blocks, how security can be implemented and what new opportunities are presented when empowering the decentralised environment with modern distributed security protocols. Our research is motivated by the more general question of how to combine the positive enablers that email exchange enjoys with the general benefits of workflow systems, and more specifically with the benefits that can be introduced in a decentralised environment. This aims to equip email users with a set of tools to manage the semantics of a message exchange, its contents, participants and their roles in the exchange, in an environment that provides inherent assurances of security and privacy. This work is based on a survey of contemporary distributed security protocols and considers how these protocols could be used in implementing a distributed workflow management system with decentralised control. We review a set of these protocols, focusing on the required message sequences, and discuss how these security protocols provide the foundations for implementing core control-flow, data, and resource patterns in a distributed workflow environment.

  15. Conventions and workflows for using Situs

    International Nuclear Information System (INIS)

    Wriggers, Willy

    2012-01-01

    Recent developments of the Situs software suite for multi-scale modeling are reviewed. Typical workflows and conventions encountered during processing of biophysical data from electron microscopy, tomography or small-angle X-ray scattering are described. Situs is a modular program package for the multi-scale modeling of atomic resolution structures and low-resolution biophysical data from electron microscopy, tomography or small-angle X-ray scattering. This article provides an overview of recent developments in the Situs package, with an emphasis on workflows and conventions that are important for practical applications. The modular design of the programs facilitates scripting in the bash shell that allows specific programs to be combined in creative ways that go beyond the original intent of the developers. Several scripting-enabled functionalities, such as flexible transformations of data type, the use of symmetry constraints or the creation of two-dimensional projection images, are described. The processing of low-resolution biophysical maps in such workflows follows not only first principles but often relies on implicit conventions. Situs conventions related to map formats, resolution, correlation functions and feature detection are reviewed and summarized. The compatibility of the Situs workflow with CCP4 conventions and programs is discussed.
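
    The scripting-enabled combination of modular programs described here follows a simple chain-of-tools pattern. The sketch below shows that pattern with Python's subprocess module; the program names and options are placeholders only and are not the actual Situs commands.

```python
import subprocess
from pathlib import Path

DRY_RUN = True   # set to False once the placeholder commands are replaced with real tools

def run(cmd):
    """Run one command-line step of the workflow, stopping on failure."""
    print("step:", " ".join(cmd))
    if not DRY_RUN:
        subprocess.run(cmd, check=True)

# placeholder steps for a map-processing workflow; the program names and
# options below are hypothetical and must be replaced by the actual
# Situs (or other) command-line tools and their documented flags
workdir = Path("run01")
workdir.mkdir(exist_ok=True)
run(["convert_map", "input.mrc", str(workdir / "map.situs")])
run(["lowpass_filter", str(workdir / "map.situs"), "--resolution", "15"])
run(["fit_structure", "model.pdb", str(workdir / "map.situs")])
```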

  16. Identification of chemical components in Baidianling Capsule based on gas chromatography-mass spectrometry and high-performance liquid chromatography combined with Fourier transform ion cyclotron resonance mass spectrometry.

    Science.gov (United States)

    Wu, Wenying; Chen, Yu; Wang, Binjie; Sun, Xiaoyang; Guo, Ping; Chen, Xiaohui

    2017-08-01

    Baidianling Capsule, which is made from 16 Chinese herbs, has been widely used for treating vitiligo clinically. In this study, a sensitive and rapid method was developed for the analysis of chemical components in Baidianling Capsule by gas chromatography-mass spectrometry in combination with retention indices and by high-performance liquid chromatography combined with Fourier transform ion cyclotron resonance mass spectrometry. First, a total of 110 potential volatile compounds obtained from different extraction procedures, including alkanes, alkenes, alkynes, ketones, ethers, aldehydes, alcohols, phenols, organic acids, esters, furans, pyrroles, acid amides, heterocycles, and oxides, were detected in Baidianling Capsule by gas chromatography-mass spectrometry, of which 75 were identified by mass spectrometry in combination with the retention index. Then, a total of 124 components were tentatively identified by high-performance liquid chromatography combined with Fourier transform ion cyclotron resonance mass spectrometry. Fifteen constituents of Baidianling Capsule were accurately identified by comparing their retention times with those of reference compounds; the others were identified by comparing retention times and mass spectrometry data, as well as by consulting the reference literature. This study provides a practical strategy for rapidly screening and identifying the multiple constituents of a complex traditional Chinese medicine. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
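
    Identification "in combination with retention indices" relies on interpolating an analyte's retention time between the bracketing n-alkanes. Below is a minimal sketch of the linear (van den Dool and Kratz) retention index, with invented retention times.

```python
def retention_index(t_x, alkane_times):
    """Linear (van den Dool-Kratz) retention index.

    t_x           -- retention time of the analyte (min)
    alkane_times  -- {carbon_number: retention_time} for the n-alkane ladder
    """
    carbons = sorted(alkane_times)
    for n, n_next in zip(carbons, carbons[1:]):
        t_n, t_next = alkane_times[n], alkane_times[n_next]
        if t_n <= t_x <= t_next:
            return 100 * (n + (t_x - t_n) / (t_next - t_n))
    raise ValueError("analyte elutes outside the alkane ladder")

# hypothetical alkane ladder (C10-C13) and analyte retention time
ladder = {10: 8.10, 11: 10.35, 12: 12.48, 13: 14.50}
print(round(retention_index(11.2, ladder)))   # ~1140
```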

  17. Parametric Room Acoustic Workflows

    DEFF Research Database (Denmark)

    Parigi, Dario; Svidt, Kjeld; Molin, Erik

    2017-01-01

    The paper investigates and assesses different room acoustics software and the opportunities they offer to engage in a parametric acoustics workflow and to influence architectural designs. The first step consists of testing and benchmarking different tools on the basis of accuracy, speed...... and interoperability with Grasshopper 3D. The focus is placed on the benchmarking of three different acoustic analysis tools based on ray tracing. To compare the accuracy and speed of the acoustic evaluation across different tools, a homogeneous set of acoustic parameters is chosen. The room acoustics parameters...... included in the set are reverberation time (EDT, RT30), clarity (C50), loudness (G), and definition (D50). Scenarios are discussed for determining the most suitable acoustic tool at different design stages. Those scenarios are characterized by the use of less accurate but fast evaluation tools to be used...

  18. Implementing Oracle Workflow

    CERN Document Server

    Mathieson, D W

    1999-01-01

    CERN (see [CERN]) is the world's largest physics research centre. Currently there are around 5,000 people working at the CERN site, located on the border of France and Switzerland near Geneva, along with another 4,000 working remotely at institutes situated all around the globe. CERN is currently working on the construction of our newest scientific instrument, the Large Hadron Collider (LHC); the construction alone of this 27-kilometre particle accelerator will not be complete until 2005. Like many businesses in the current economic climate, CERN is expected to continue growing, yet staff numbers are planned to fall in the coming years. In essence: do more with less. In an environment such as this, it is critical that the administration is as efficient as possible. One of the ways that administrative procedures are streamlined is by the use of an organisation-wide workflow system.

  19. Immunoaffinity chromatography of abscisic acid combined with electrospray liquid chromatography–mass spectrometry

    Czech Academy of Sciences Publication Activity Database

    Hradecká, Veronika; Novák, Ondřej; Havlíček, Libor; Strnad, Miroslav

    2007-01-01

    Roč. 847, č. 2 (2007), s. 162-173 ISSN 1570-0232 R&D Projects: GA MŠk(CZ) LC06034; GA AV ČR IBS5038351 Institutional research plan: CEZ:AV0Z50380511 Source of funding: V - iné verejné zdroje ; V - iné verejné zdroje Keywords : abscisic acid * immunoaffinity chromatography * liquid chromatography-mass spectrometry Subject RIV: ED - Physiology Impact factor: 2.935, year: 2007

  20. Digital workflows in contemporary orthodontics

    Directory of Open Access Journals (Sweden)

    Lars R Christensen

    2017-01-01

    Full Text Available Digital workflows are now increasingly possible in orthodontic practice. Workflows designed to improve the customization of orthodontic appliances are now available through laboratories and orthodontic manufacturing facilities in many parts of the world. These now have the potential to improve certain aspects of patient care.

  1. Constructing Workflows from Script Applications

    Directory of Open Access Journals (Sweden)

    Mikołaj Baranowski

    2012-01-01

    Full Text Available For programming and executing complex applications on grid infrastructures, scientific workflows have been proposed as a convenient high-level alternative to solutions based on general-purpose programming languages, APIs and scripts. GridSpace is a collaborative programming and execution environment which is based on a scripting approach and extends the Ruby language with a high-level API for invoking operations on remote resources. In this paper we describe a tool which converts GridSpace application source code into a workflow representation which, in turn, may be used for scheduling, provenance, or visualization. We describe how we addressed the issues of analyzing Ruby source code, resolving variable and method dependencies, and building the workflow representation. The solutions to these problems have been developed and evaluated by testing them on complex grid application workflows such as CyberShake, Epigenomics and Montage. The evaluation is enriched by representing typical workflow control-flow patterns.

  2. Interrogating the Venom of the Viperid Snake Sistrurus catenatus edwardsii by a Combined Approach of Electrospray and MALDI Mass Spectrometry.

    Directory of Open Access Journals (Sweden)

    Alex Chapeaurouge

    Full Text Available The complete sequence characterization of snake venom proteins by mass spectrometry is rather challenging due to the presence of multiple isoforms from different protein families. In the present study, we investigated the tryptic digest of the venom of the viperid snake Sistrurus catenatus edwardsii by a combined approach of liquid chromatography coupled to either electrospray (online) or MALDI (offline) mass spectrometry. These different ionization techniques proved to be complementary, allowing the identification of a great variety of isoforms of diverse snake venom protein families, as evidenced by the detection of the corresponding unique peptides. For example, ten out of eleven predicted isoforms of serine proteinases of the venom of S. c. edwardsii were distinguished using this approach. Moreover, snake venom protein families not encountered in a previous transcriptome study of the venom gland of this snake were identified. In essence, our results support the notion that complementary ionization techniques of mass spectrometry allow for the detection of even subtle sequence differences of snake venom proteins, which is fundamental for future structure-function relationship and possible drug design studies.

  3. Elemental labelling combined with liquid chromatography inductively coupled plasma mass spectrometry for quantification of biomolecules: A review

    International Nuclear Information System (INIS)

    Kretschy, Daniela; Koellensperger, Gunda; Hann, Stephan

    2012-01-01

    Highlights: ► Survey of bio-analytical approaches utilizing biomolecule labelling. ► Detailed discussion of the methodology and chemistry of elemental labelling. ► Biomedical and bio-analytical applications of elemental labelling. ► FI-ICP-MS and LC–ICP-MS for quantification of elementally labelled biomolecules. ► Review of selected applications. - Abstract: This article reviews novel quantification concepts where elemental labelling is combined with flow injection inductively coupled plasma mass spectrometry (FI-ICP-MS) or liquid chromatography inductively coupled plasma mass spectrometry (LC–ICP-MS) and employed for the quantification of biomolecules such as proteins, peptides and related molecules in challenging sample matrices. In the first sections, an overview of general aspects of biomolecule quantification, as well as of labelling, is presented, emphasizing the potential of such methodological approaches. In this context, ICP-MS as a detector provides high sensitivity, selectivity and robustness in biological samples and offers the capability for multiplexing and isotope dilution mass spectrometry (IDMS). The fundamental methodology of elemental labelling is highlighted, and analytical as well as biomedical applications are presented. A special focus lies on established applications, underlining the benefits and bottlenecks of such approaches for implementation in real-life analysis. Key research in this field is summarized and a perspective for future developments, including sophisticated and innovative applications, is given.
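
    The isotope dilution mass spectrometry (IDMS) capability mentioned above rests on a single, well-known equation relating the measured isotope ratio of a spiked sample to the analyte amount. The sketch below implements that generic single-IDMS equation; the variable names and the example abundances are illustrative, not taken from the review.

```python
def idms_amount(n_spike, r_measured, ab_a_sample, ab_b_sample, ab_a_spike, ab_b_spike):
    """Amount of analyte element (mol) from a single isotope dilution experiment.

    n_spike      -- amount of spike added (mol)
    r_measured   -- measured isotope amount ratio A/B in the sample-spike blend
    ab_*_sample  -- isotopic abundances of isotopes A and B in the sample
    ab_*_spike   -- isotopic abundances of isotopes A and B in the spike
    """
    return n_spike * (ab_a_spike - r_measured * ab_b_spike) / (
        r_measured * ab_b_sample - ab_a_sample)

# illustrative numbers: near-natural sample, spike enriched in isotope A
n_x = idms_amount(n_spike=1.0e-9, r_measured=1.20,
                  ab_a_sample=0.10, ab_b_sample=0.90,
                  ab_a_spike=0.95, ab_b_spike=0.05)
print(f"analyte amount: {n_x:.2e} mol")
```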

  4. Elemental labelling combined with liquid chromatography inductively coupled plasma mass spectrometry for quantification of biomolecules: A review

    Science.gov (United States)

    Kretschy, Daniela; Koellensperger, Gunda; Hann, Stephan

    2012-01-01

    This article reviews novel quantification concepts where elemental labelling is combined with flow injection inductively coupled plasma mass spectrometry (FI-ICP-MS) or liquid chromatography inductively coupled plasma mass spectrometry (LC–ICP-MS) and employed for the quantification of biomolecules such as proteins, peptides and related molecules in challenging sample matrices. In the first sections, an overview of general aspects of biomolecule quantification, as well as of labelling, is presented, emphasizing the potential of such methodological approaches. In this context, ICP-MS as a detector provides high sensitivity, selectivity and robustness in biological samples and offers the capability for multiplexing and isotope dilution mass spectrometry (IDMS). The fundamental methodology of elemental labelling is highlighted, and analytical as well as biomedical applications are presented. A special focus lies on established applications, underlining the benefits and bottlenecks of such approaches for implementation in real-life analysis. Key research in this field is summarized and a perspective for future developments, including sophisticated and innovative applications, is given. PMID:23062431

  5. Security aspects in teleradiology workflow

    Science.gov (United States)

    Soegner, Peter I.; Helweg, Gernot; Holzer, Heimo; zur Nedden, Dieter

    2000-05-01

    The medicolegal necessity of privacy, security and confidentiality motivated the attempt to develop a secure teleradiology workflow between the telepartners -- the radiologist and the referring physician. To address the lack of data protection and data security, we introduced biometric fingerprint scanners in combination with smart cards to identify the teleradiology partners and communicated over an encrypted TCP/IP satellite link between Innsbruck and Reutte. We used an asymmetric cryptography method to guarantee authentication, integrity of the data packages and confidentiality of the medical data. It was necessary to use a biometric feature to avoid mistaken identity of persons who wanted access to the system. Only an invariable electronic identification allowed legal liability for the final report, and only a secure data connection allowed the exchange of sensitive medical data between different partners of health care networks. In our study we selected the user-friendly combination of a smart card and a biometric fingerprint technique, called Skymed™ Double Guard Secure Keyboard (Agfa-Gevaert), to confirm identities and log into the imaging workstations and the electronic patient record. We examined the interoperability of the software used with the existing platforms. Only the WIN-XX operating systems could be protected at the time of our study.
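
    The asymmetric cryptography used for authentication and integrity of the reports can be illustrated with a sign-and-verify round trip; the sketch below uses the Python cryptography package as a stand-in and does not model the smart card, fingerprint scanner or satellite link of the original system.

```python
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# in the real workflow the private key would live on the radiologist's smart card
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

report = b"Teleradiology report: no pathological findings."
signature = private_key.sign(
    report,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# the referring physician verifies authenticity and integrity of the report;
# verify() raises InvalidSignature if the report or signature was altered
public_key.verify(
    signature, report,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)
print("signature valid; report unaltered")
```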

  6. Research and Implementation of Key Technologies in Multi-Agent System to Support Distributed Workflow

    Science.gov (United States)

    Pan, Tianheng

    2018-01-01

    In recent years, the combination of workflow management systems and multi-agent technology has been an active research field. The lack of flexibility in workflow management systems can be addressed by introducing multi-agent collaborative management. The workflow management system adopts a distributed structure, which solves the fragility of the traditional centralized workflow structure. In this paper, the agents of a distributed workflow management system are divided according to their functions, and the execution process of each type of agent is analyzed. Key technologies such as process execution and resource management are also analyzed.

  7. Combination of solid phase extraction and flame atomic absorption spectrometry for trace analysis of cadmium

    OpenAIRE

    Ensafi, Ali A.; Shiraz, Ameneh Zendegi

    2008-01-01

    A new selective method was developed for the separation and preconcentration of Cd(II) ions based on complex formation with Xylenol orange loaded on activated carbon as a solid support in a mini-column. The preconcentrated ions were eluted by passing 5.0 mL of 0.5 mol L⁻¹ HNO₃ solution through the solid support, and the Cd(II) content was then measured by flame atomic absorption spectrometry. Conditions for preparation of the modified activated carbon, pH and flow variables were studied, as ...

  8. Determination of rare earth elements in high purity rare earth oxides by liquid chromatography, thermionic mass spectrometry and combined liquid chromatography/thermionic mass spectrometry

    International Nuclear Information System (INIS)

    Stijfhoorn, D.E.; Stray, H.; Hjelmseth, H.

    1993-01-01

    A high-performance liquid chromatographic (HPLC) method for the determination of rare earth elements in rocks has been modified and used for the determination of rare earth elements (REE) in high-purity rare earth oxides. The detection limit was 1-1.5 ng, or 2-3 mg/kg when a solution corresponding to 0.5 mg of the rare earth oxide was injected. The REE determination was also carried out by adding a mixture of selected REE isotopes to the sample and analysing the collected HPLC fractions by mass spectrometry (MS) using a thermionic source. Since the matrix element was not collected, interference from this element during the mass spectrometric analysis was avoided. Detection limits as low as 0.5 mg/kg could then be obtained. Detection limits as low as 0.05 mg/kg were possible by MS without HPLC pre-separation, but this approach could only be used for those elements that were not affected by the matrix. Commercial samples of high-purity Nd₂O₃, Gd₂O₃ and Dy₂O₃ were analysed in this study, and a comparison of results obtained by HPLC, combined HPLC/MS and direct MS is presented. (Author)

  9. Integrative Workflows for Metagenomic Analysis

    Directory of Open Access Journals (Sweden)

    Efthymios eLadoukakis

    2014-11-01

    Full Text Available The rapid evolution of all sequencing technologies, described by the term Next Generation Sequencing (NGS), has revolutionized metagenomic analysis. They constitute a combination of high-throughput analytical protocols coupled to delicate measuring techniques, in order to potentially discover, properly assemble and map allelic sequences to the correct genomes, achieving particularly high yields for only a fraction of the cost of traditional processes (i.e. Sanger). From a bioinformatic perspective, this boils down to many gigabytes of data being generated from each single sequencing experiment, rendering the management and even the storage critical bottlenecks with respect to the overall analytical endeavor. The enormous complexity is aggravated further by the variety of the processing steps available, represented by the numerous bioinformatic tools that are essential for each analytical task in order to fully unveil the genetic content of a metagenomic dataset. These disparate tasks range from simple, nonetheless non-trivial, quality control of raw data to exceptionally complex protein annotation procedures, requiring a high level of expertise for their proper application or the neat implementation of the whole workflow. Furthermore, a bioinformatic analysis of such scale requires substantial computational resources, making the utilization of cloud computing infrastructures the only realistic solution. In this review article we discuss the different integrative bioinformatic solutions available that address the aforementioned issues, by performing a critical assessment of the available automated pipelines for data management, quality control and annotation of metagenomic data, embracing various major sequencing technologies and applications.

  10. ATLAS Grid Workflow Performance Optimization

    CERN Document Server

    Elmsheuser, Johannes; The ATLAS collaboration

    2018-01-01

    The CERN ATLAS experiment grid workflow system manages routinely 250 to 500 thousand concurrently running production and analysis jobs to process simulation and detector data. In total more than 300 PB of data is distributed over more than 150 sites in the WLCG. At this scale small improvements in the software and computing performance and workflows can lead to significant resource usage gains. ATLAS is reviewing together with CERN IT experts several typical simulation and data processing workloads for potential performance improvements in terms of memory and CPU usage, disk and network I/O. All ATLAS production and analysis grid jobs are instrumented to collect many performance metrics for detailed statistical studies using modern data analytics tools like ElasticSearch and Kibana. This presentation will review and explain the performance gains of several ATLAS simulation and data processing workflows and present analytics studies of the ATLAS grid workflows.

  11. Privacy-aware workflow management

    NARCIS (Netherlands)

    Alhaqbani, B.; Adams, M.; Fidge, C.J.; Hofstede, ter A.H.M.; Glykas, M.

    2013-01-01

    Information security policies play an important role in achieving information security. Confidentiality, Integrity, and Availability are classic information security goals attained by enforcing appropriate security policies. Workflow Management Systems (WfMSs) also benefit from inclusion of these

  12. Summer Student Report - AV Workflow

    CERN Document Server

    Abramson, Jessie

    2014-01-01

    The AV Workflow is a web application which allows CERN users to publish, update and delete videos from CDS. During my summer internship I implemented the backend of the new version of the AV Workflow in Python using the Django framework.

  13. Discovery of safety biomarkers for atorvastatin in rat urine using mass spectrometry based metabolomics combined with global and targeted approach

    International Nuclear Information System (INIS)

    Kumar, Bhowmik Salil; Lee, Young-Joo; Yi, Hong Jae; Chung, Bong Chul; Jung, Byung Hwa

    2010-01-01

    In order to develop a safety biomarker for atorvastatin, this drug was orally administered to hyperlipidemic rats, and a metabolomic study was performed. Atorvastatin was given in doses of either 70 mg kg⁻¹ day⁻¹ or 250 mg kg⁻¹ day⁻¹ for a period of 7 days (n = 4 for each group). To evaluate any abnormal effects of the drug, physiological and plasma biochemical parameters were measured and histopathological tests were carried out. Safety biomarkers were derived by comparing these parameters and using both global and targeted metabolic profiling. Global metabolic profiling was performed using liquid chromatography/time of flight/mass spectrometry (LC/TOF/MS) with multivariate data analysis. Several safety biomarker candidates that included various steroids and amino acids were discovered as a result of global metabolic profiling, and they were also confirmed by targeted metabolic profiling using gas chromatography/mass spectrometry (GC/MS) and capillary electrophoresis/mass spectrometry (CE/MS). Serum biochemical and histopathological tests were used to detect abnormal drug reactions in the liver after repeated oral administration of atorvastatin. The metabolic differences between the control and drug-treated groups were compared using PLS-DA score plots. These results were compared with the physiological and plasma biochemical parameters and the results of a histopathological test. Estrone, cortisone, proline, cystine, 3-ureidopropionic acid and histidine were proposed as potential safety biomarkers related to the liver toxicity of atorvastatin. These results indicate that the combined application of global and targeted metabolic profiling could be a useful tool for the discovery of drug safety biomarkers.
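
    The PLS-DA score plots used to compare control and drug-treated groups can be sketched with scikit-learn by regressing the feature matrix on a dummy-coded class vector and inspecting the latent-variable scores; the data below are random placeholders, not the study's LC/TOF/MS measurements.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
# placeholder LC/TOF/MS feature matrix: 8 rats x 200 metabolite features
X_control = rng.normal(0.0, 1.0, size=(4, 200))
X_treated = rng.normal(0.5, 1.0, size=(4, 200))   # shifted to mimic a drug effect
X = np.vstack([X_control, X_treated])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])            # 0 = control, 1 = atorvastatin

# PLS-DA: partial least squares regression against the class-membership vector
plsda = PLSRegression(n_components=2)
plsda.fit(X, y)
scores = plsda.transform(X)                       # latent-variable (score) coordinates

for label, name in [(0, "control"), (1, "treated")]:
    print(name, np.round(scores[y == label, 0], 2))   # LV1 scores per group
```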

  14. Discovery of safety biomarkers for atorvastatin in rat urine using mass spectrometry based metabolomics combined with global and targeted approach

    Energy Technology Data Exchange (ETDEWEB)

    Kumar, Bhowmik Salil [Bioanalysis and Biotransformation Research Center, Korea Institute of Science and Technology, P.O. Box 131, Cheongryang, Seoul 130-650 (Korea, Republic of); University of Science and Technology, (305-333) 113 Gwahangno, Yuseong-gu, Daejeon (Korea, Republic of); Lee, Young-Joo; Yi, Hong Jae [College of Pharmacy, Kyung Hee University, Hoegi-dong, Dongdaemun-gu, Seoul 130-791 (Korea, Republic of); Chung, Bong Chul [Bioanalysis and Biotransformation Research Center, Korea Institute of Science and Technology, P.O. Box 131, Cheongryang, Seoul 130-650 (Korea, Republic of); Jung, Byung Hwa, E-mail: jbhluck@kist.re.kr [Bioanalysis and Biotransformation Research Center, Korea Institute of Science and Technology, P.O. Box 131, Cheongryang, Seoul 130-650 (Korea, Republic of); University of Science and Technology, (305-333) 113 Gwahangno, Yuseong-gu, Daejeon (Korea, Republic of)

    2010-02-19

    In order to develop a safety biomarker for atorvastatin, this drug was orally administered to hyperlipidemic rats, and a metabolomic study was performed. Atorvastatin was given in doses of either 70 mg kg⁻¹ day⁻¹ or 250 mg kg⁻¹ day⁻¹ for a period of 7 days (n = 4 for each group). To evaluate any abnormal effects of the drug, physiological and plasma biochemical parameters were measured and histopathological tests were carried out. Safety biomarkers were derived by comparing these parameters and using both global and targeted metabolic profiling. Global metabolic profiling was performed using liquid chromatography/time of flight/mass spectrometry (LC/TOF/MS) with multivariate data analysis. Several safety biomarker candidates that included various steroids and amino acids were discovered as a result of global metabolic profiling, and they were also confirmed by targeted metabolic profiling using gas chromatography/mass spectrometry (GC/MS) and capillary electrophoresis/mass spectrometry (CE/MS). Serum biochemical and histopathological tests were used to detect abnormal drug reactions in the liver after repeated oral administration of atorvastatin. The metabolic differences between the control and drug-treated groups were compared using PLS-DA score plots. These results were compared with the physiological and plasma biochemical parameters and the results of a histopathological test. Estrone, cortisone, proline, cystine, 3-ureidopropionic acid and histidine were proposed as potential safety biomarkers related to the liver toxicity of atorvastatin. These results indicate that the combined application of global and targeted metabolic profiling could be a useful tool for the discovery of drug safety biomarkers.

  15. Combined analysis of 1,3-benzodioxoles by crystalline sponge X-ray crystallography and laser desorption ionization mass spectrometry.

    Science.gov (United States)

    Hayashi, Yukako; Ohara, Kazuaki; Taki, Rika; Saeki, Tomomi; Yamaguchi, Kentaro

    2018-03-12

    The crystalline sponge (CS) method, which employs single-crystal X-ray diffraction to determine the structure of an analyte present as a liquid or an oil and having a low melting point, was used in combination with laser desorption ionization mass spectrometry (LDI-MS). 1,3-Benzodioxole derivatives were encapsulated in the CS and their structures were determined by combining X-ray crystallography and MS. After the X-ray analysis, the CS was subjected to imaging mass spectrometry (IMS) with an LDI spiral time-of-flight mass spectrometer (TOF-MS). The ion detection area matched the microscopic image of the encapsulated CS. In addition, the accumulated 1D mass spectra showed that fragmentation of the guest molecule (hereafter, guest) can be easily visualized without any interference from the fragment ions of the CS, except for two strong ion peaks derived from the tridentate ligand TPT (2,4,6-tris(4-pyridyl)-1,3,5-triazine) of the CS and its fragment. X-ray analysis clearly showed the presence of the guest as well as the π-π, CH-halogen, and CH-O interactions between the guest and the CS framework. However, some guests remained randomly diffused in the nanopores of the CS. In addition, the detection limit was below the sub-picomole level, based on the weight and density of the CS determined by X-ray analysis. Spectroscopic data, such as UV-vis and NMR, also supported the encapsulation of the guest through the interaction between the guest and the CS components. The results indicate that the CS-LDI-MS method, which combines the CS, X-ray analysis and LDI-MS, is effective for structure determination.

  16. Combination of atomic force microscopy and mass spectrometry for the detection of target protein in the serum samples of children with autism spectrum disorders

    Science.gov (United States)

    Kaysheva, A. L.; Pleshakova, T. O.; Kopylov, A. T.; Shumov, I. D.; Iourov, I. Y.; Vorsanova, S. G.; Yurov, Y. B.; Ziborov, V. S.; Archakov, A. I.; Ivanov, Y. D.

    2017-10-01

    The possibility of detecting target proteins associated with the development of autistic disorders in children using a combined atomic force microscopy and mass spectrometry (AFM/MS) method is demonstrated. The proposed method is based on the combination of affinity enrichment of proteins from biological samples with visualization of these proteins by AFM and MS analysis for quantitative detection of the target proteins.

  17. [Screening differentially expressed plasma proteins in cold stress rats based on iTRAQ combined with mass spectrometry technology].

    Science.gov (United States)

    Liu, Yan-zhi; Guo, Jing-ru; Peng, Meng-ling; Ma, Li; Zhen, Li; Ji, Hong; Yang, Huan-min

    2015-09-01

    Isobaric tags for relative and absolute quantitation (iTRAQ) combined with mass spectrometry were used to screen differentially expressed plasma proteins in cold-stressed rats. Thirty healthy SPF Wistar rats were randomly divided into cold stress group A and control group B; A and B were then each randomly divided into 3 groups (n = 5): A1, A2, A3 and B1, B2, B3. The room-rearing temperature was (24.0 +/- 0.1) degrees C, and the cold stress temperature was (4.0 +/- 0.1) degrees C. The rats were kept at the respective temperatures for 12 h. Abdominal aortic blood was collected with a heparin anticoagulation tube, and the plasma was then separated for protein extraction, quantification, enzymolysis, iTRAQ labeling, SCX fractionation and mass spectrometry analysis. In total, 1085 proteins were identified and 39 differentially expressed proteins were screened, including 29 up-regulated and 10 down-regulated proteins. Three important differentially expressed proteins related to cold stress were identified by bioinformatics analysis (minor histocompatibility protein HA-1, Ras-related protein Rap-1b, integrin beta-1). In this experiment, differentially expressed plasma proteins were successfully screened in cold-stressed rats. iTRAQ technology provides a good platform for screening protein diagnostic markers in cold-stressed rats and lays a good foundation for further study of the cold stress mechanism in animals.
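
    Screening for differentially expressed proteins in an iTRAQ experiment typically combines a reporter-ion fold-change cut-off with a significance test; the thresholds and intensities in the sketch below are generic conventions and placeholders, not the values used in this study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
proteins = [f"P{i:04d}" for i in range(5)]
# placeholder iTRAQ reporter intensities: 3 cold-stress vs 3 control channels per protein
cold = rng.lognormal(mean=0.2, sigma=0.3, size=(5, 3))
ctrl = rng.lognormal(mean=0.0, sigma=0.3, size=(5, 3))

for name, a, b in zip(proteins, cold, ctrl):
    ratio = a.mean() / b.mean()                 # cold-stress / control reporter ratio
    p = stats.ttest_ind(a, b).pvalue
    if p < 0.05 and (ratio > 1.2 or ratio < 1 / 1.2):   # common, illustrative cut-offs
        direction = "up" if ratio > 1 else "down"
        print(f"{name}: ratio {ratio:.2f}, p {p:.3f} -> {direction}-regulated")
```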

  18. The mass spectrometry technology MALDI-TOF (Matrix-Assisted Laser Desorption/Ionization Time- Of-Flight for a more rapid and economic workflow in the clinical microbiology laboratory

    Directory of Open Access Journals (Sweden)

    Simona Barnini

    2012-12-01

    Full Text Available Introduction: In order to improve patient outcomes and reduce length of stay, costs and resources engaged in diagnostics, more rapid reports are requested of clinical microbiologists. The purpose of this study is to assess the impact on workflow of MALDI-TOF technology, recently made available for use in routine diagnostics. Methods: The work list from the laboratory information system is sent to the MALDI-TOF instrument, where at least three successive analytical sessions are held: the first includes bacteria isolated from CSF, blood cultures, and cases already reported as serious/urgent; the second includes all other isolated organisms; the third, microorganisms that require extraction with trifluoroacetic acid (TFA) or formic acid (FA) for identification. The results of each session direct the execution of different types of susceptibility testing. Results: The times to microbial identification are reduced by 24 or 48 hours and made available to the clinician for rational empirical therapy. Reagent costs are reduced by 40%. Subcultures were reduced by 80%, and microscopic examinations by 50%. Antibiotic susceptibility tests were immediately performed with the most appropriate method, based on knowledge of local epidemiology and the microbial species. Conclusion: Bacteriology is the least automated discipline among clinical laboratory activities, and the results of diagnostic tests are often poorly timed. The new interpretative algorithms for MALDI-TOF spectra now available allow the correct identification of bacteria in near real time, completely eliminating the wait necessary for biochemical identification and guiding the operator in selecting the most appropriate antibiotic susceptibility tests. This technology makes the work more rapid, economic and efficient, eliminating errors and, together with effective computerization of data, transforms the information content of the microbiological report, making it much more effective.

  19. Discrimination of geographical origin of lentils (Lens culinaris Medik.) using isotope ratio mass spectrometry combined with chemometrics.

    Science.gov (United States)

    Longobardi, F; Casiello, G; Cortese, M; Perini, M; Camin, F; Catucci, L; Agostiano, A

    2015-12-01

    The aim of this study was to predict the geographic origin of lentils by using isotope ratio mass spectrometry (IRMS) in combination with chemometrics. Lentil samples from two origins, i.e. Italy and Canada, were analysed to obtain the stable isotope ratios δ(13)C, δ(15)N, δ(2)H, δ(18)O, and δ(34)S. A comparison between median values (U-test) highlighted statistically significant differences between the two origins, but with overlapping zones; consequently, two supervised discriminant techniques, i.e. partial least squares discriminant analysis and the k-nearest neighbours algorithm, were used. Both models showed good performance, with external prediction abilities of about 93%, demonstrating the suitability of the methods developed. Subsequently, isotopic determinations were also performed on the protein and starch fractions and the relevant results are reported. Copyright © 2015 Elsevier Ltd. All rights reserved.
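
    The k-nearest neighbours step with external validation can be sketched with scikit-learn on a five-variable isotope matrix; the isotope values below are synthetic placeholders, so the published ~93% prediction ability is not expected to be reproduced.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(42)
# placeholder isotope ratios (d13C, d15N, d2H, d18O, d34S) for two origins
italy = rng.normal([-26, 2, -60, 22, 6], 1.0, size=(40, 5))
canada = rng.normal([-24, 4, -90, 15, 3], 1.0, size=(40, 5))
X = np.vstack([italy, canada])
y = np.array(["Italy"] * 40 + ["Canada"] * 40)

# hold out an external validation set, train kNN on the rest
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)
knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
print("external prediction ability:", knn.score(X_test, y_test))
```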

  20. Combined experimental and statistical strategy for mass spectrometry based serum protein profiling for diagnosis of breast cancer

    DEFF Research Database (Denmark)

    Callesen, Anne Kjærgaard; Vach, Werner; Jørgensen, Per E

    2008-01-01

    it in a well-described breast cancer case-control study. A rigorous sample collection protocol ensured high quality specimen and reduced bias from preanalytical factors. Preoperative serum samples obtained from 48 breast cancer patients and 28 controls were used to generate MALDI MS protein profiles. A total...... and controls. A diagnostic rule based on these 72 mass values was constructed and exhibited a cross-validated sensitivity and specificity of approximately 85% for the detection of breast cancer. With this method, it was possible to distinguish early stage cancers from controls without major loss of sensitivity...... and specificity. We conclude that optimized serum sample handling and mass spectrometry data acquisition strategies in combination with statistical analysis provide a viable platform for serum protein profiling in cancer diagnosis....
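
    A cross-validated diagnostic rule of this kind can be evaluated by predicting each sample's class out-of-fold and tabulating sensitivity and specificity; the sketch below uses synthetic profile data and a plain logistic-regression rule, not the study's 72 selected mass values.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(7)
# placeholder peak-intensity matrix: 48 cancer + 28 control profiles, 72 mass values
X = np.vstack([rng.normal(0.4, 1.0, (48, 72)), rng.normal(0.0, 1.0, (28, 72))])
y = np.array([1] * 48 + [0] * 28)          # 1 = breast cancer, 0 = control

# cross-validated class predictions from a simple linear diagnostic rule
pred = cross_val_predict(LogisticRegression(max_iter=1000), X, y, cv=5)
tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")
```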

  1. Combining metal oxide affinity chromatography (MOAC) and selective mass spectrometry for robust identification of in vivo protein phosphorylation sites

    Directory of Open Access Journals (Sweden)

    Weckwerth Wolfram

    2005-11-01

    Full Text Available Abstract Background Protein phosphorylation is accepted as a major regulatory pathway in plants. More than 1000 protein kinases are predicted in the Arabidopsis proteome, however, only a few studies look systematically for in vivo protein phosphorylation sites. Owing to the low stoichiometry and low abundance of phosphorylated proteins, phosphorylation site identification using mass spectrometry imposes difficulties. Moreover, the often observed poor quality of mass spectra derived from phosphopeptides results frequently in uncertain database hits. Thus, several lines of evidence have to be combined for a precise phosphorylation site identification strategy. Results Here, a strategy is presented that combines enrichment of phosphoproteins using a technique termed metal oxide affinity chromatography (MOAC) and selective ion trap mass spectrometry. The complete approach involves (i) enrichment of proteins with low phosphorylation stoichiometry out of complex mixtures using MOAC, (ii) gel separation and detection of phosphorylation using specific fluorescence staining (confirmation of enrichment), (iii) identification of phosphoprotein candidates out of the SDS-PAGE using liquid chromatography coupled to mass spectrometry, and (iv) identification of phosphorylation sites of these enriched proteins using automatic detection of H3PO4 neutral loss peaks and data-dependent MS3-fragmentation of the corresponding MS2-fragment. The utility of this approach is demonstrated by the identification of phosphorylation sites in Arabidopsis thaliana seed proteins. Regulatory importance of the identified sites is indicated by conservation of the detected sites in gene families such as ribosomal proteins and sterol dehydrogenases. To demonstrate further the wide applicability of MOAC, phosphoproteins were enriched from Chlamydomonas reinhardtii cell cultures. Conclusion A novel phosphoprotein enrichment procedure MOAC was applied to seed proteins of A. thaliana and to
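
    The automatic detection of H3PO4 neutral-loss peaks that triggers the data-dependent MS3 scans can be illustrated by checking each MS2 spectrum for an intense peak at the precursor m/z minus 97.977/z; the tolerance, intensity threshold and toy spectrum below are placeholders.

```python
H3PO4 = 97.9769   # monoisotopic mass of the H3PO4 neutral loss (Da)

def has_phospho_neutral_loss(precursor_mz, charge, peaks, tol_mz=0.5, min_rel_int=0.2):
    """Return True if an MS2 spectrum shows a dominant H3PO4 neutral-loss peak.

    peaks -- list of (m/z, intensity) pairs of the MS2 spectrum
    """
    target = precursor_mz - H3PO4 / charge
    base = max(inten for _, inten in peaks)
    return any(abs(mz - target) <= tol_mz and inten >= min_rel_int * base
               for mz, inten in peaks)

# toy doubly charged precursor at m/z 700.30 with a strong loss peak near 651.3
spectrum = [(300.1, 150.0), (651.32, 900.0), (700.30, 1000.0), (880.4, 120.0)]
if has_phospho_neutral_loss(700.30, 2, spectrum):
    print("neutral loss detected -> would trigger MS3 on the loss peak")
```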

  2. LQCD workflow execution framework: Models, provenance and fault-tolerance

    International Nuclear Information System (INIS)

    Piccoli, Luciano; Simone, James N; Kowalkowlski, James B; Dubey, Abhishek

    2010-01-01

    Large computing clusters used for scientific processing suffer from systemic failures when operated over long continuous periods for executing workflows. Diagnosing job problems and faults leading to eventual failures in this complex environment is difficult, specifically when the success of an entire workflow might be affected by a single job failure. In this paper, we introduce a model-based, hierarchical, reliable execution framework that encompasses workflow specification, data provenance, execution tracking and online monitoring of each workflow task, also referred to as participants. The sequence of participants is described in an abstract parameterized view, which is translated into a concrete data-dependency-based sequence of participants with defined arguments. As participants belonging to a workflow are mapped onto machines and executed, periodic and on-demand monitoring of vital health parameters on allocated nodes is enabled according to pre-specified rules. These rules specify conditions that must be true pre-execution, during execution and post-execution. Monitoring information for each participant is propagated upwards through the reflex and healing architecture, which consists of a hierarchical network of decentralized fault management entities, called reflex engines. They are instantiated as state machines or timed automata that change state and initiate reflexive mitigation action(s) upon occurrence of certain faults. We describe how this cluster reliability framework is combined with the workflow execution framework using formal rules and actions specified within a structure of first-order predicate logic, enabling a dynamic management design that reduces manual administrative workload and increases cluster productivity.
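
    The reflex engines are described as state machines that change state and initiate mitigation when monitored health parameters violate pre-specified rules; below is a minimal, hypothetical sketch of one such engine (states, thresholds and the mitigation action are invented for illustration).

```python
class ReflexEngine:
    """Tiny state machine: OK -> DEGRADED -> FAILED, with a mitigation hook."""

    def __init__(self, name, max_load=0.9, max_strikes=3):
        self.name, self.max_load, self.max_strikes = name, max_load, max_strikes
        self.state, self.strikes = "OK", 0

    def observe(self, load):
        # rule: the monitored health parameter must stay below the threshold
        if load > self.max_load:
            self.strikes += 1
            self.state = "DEGRADED"
            if self.strikes >= self.max_strikes:
                self.state = "FAILED"
                self.mitigate()
        else:
            self.state, self.strikes = "OK", 0
        return self.state

    def mitigate(self):
        # placeholder reflexive action, e.g. rescheduling the workflow participant
        print(f"[{self.name}] fault confirmed -> rescheduling participant")

engine = ReflexEngine("node042")
for load in [0.5, 0.95, 0.97, 0.99]:
    print(load, "->", engine.observe(load))
```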

  3. Evolutionary optimization of production materials workflow processes

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter

    2014-01-01

    We present an evolutionary optimisation technique for stochastic production processes, which is able to find improved production materials workflow processes with respect to arbitrary combinations of numerical quantities associated with the production process. Working from a core fragment...... of the BPMN language, we employ an evolutionary algorithm where stochastic model checking is used as a fitness function to determine the degree of improvement of candidate processes derived from the original process through mutation and cross-over operations. We illustrate this technique using a case study...
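
    The evolutionary search can be outlined as a standard mutate/cross-over/select loop in which the fitness function stands in for stochastic model checking of candidate processes; the representation below (a list of step durations) is deliberately simplified and is not the paper's BPMN encoding.

```python
import random

random.seed(3)

def fitness(process):
    """Stand-in for stochastic model checking of a candidate process.

    Here a 'process' is just a list of step durations; the real system would
    instead verify quantitative properties of a mutated BPMN fragment.
    """
    return -sum(process)                      # shorter total duration is better

def mutate(process):
    p = process[:]
    i = random.randrange(len(p))
    p[i] = max(1, p[i] + random.choice([-1, 1]))
    return p

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

# evolve a small population of candidate processes
population = [[random.randint(1, 9) for _ in range(5)] for _ in range(20)]
for _ in range(50):
    population.sort(key=fitness, reverse=True)
    parents = population[:10]
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(10)]
    population = parents + children

print("best candidate:", max(population, key=fitness))
```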

  4. Redox speciation of final repository relevant elements using separation methods in combination with ICP mass spectrometry

    International Nuclear Information System (INIS)

    Graser, Carl-Heinrich

    2015-01-01

    The long-term safety assessment for nuclear waste repositories requires a detailed understanding of the chemistry of actinide elements in the geosphere. The development of advanced analytical tools is required to gain detailed insights into actinide redox speciation in a given system. The mobility of radionuclides is mostly determined by the geochemical conditions, which control the redox state of the radionuclides. Besides the long-lived radionuclides plutonium (Pu) and neptunium (Np), which are key elements in high-level nuclear waste, iron (Fe) represents a main component in natural systems controlling redox-related geochemical processes. Analytical techniques for determining the oxidation state distribution of redox-sensitive radionuclides and other metal ions often lack sensitivity. The detection limits of these methods (i.e. UV/vis, TRLFS, XANES) are in general in the range of ≥ 10⁻⁶ mol L⁻¹. As a consequence, ultrasensitive new analytical techniques are required. Capillary electrophoresis (CE) and ion chromatography (IC) are powerful separation methods for metal ions. In the course of this thesis, different speciation methods for iron, neptunium and plutonium were optimized. With the optimized setup, redox speciation analyses of these elements in different samples were carried out. Furthermore, CE hyphenated to inductively coupled plasma sector field mass spectrometry (CE-ICP-SF-MS) was used to measure the redox speciation of Pu (III, IV, V, VI), Np (IV, V, VI) and Fe (II, III) at concentrations lower than 10⁻⁷ mol L⁻¹. CE coupling and separation parameters such as sample gas pressure, make-up flow rate, capillary position, auxiliary gas flow, as well as the electrolyte system, were optimized to obtain the maximum sensitivity. The method's detection limits are 10⁻¹² mol L⁻¹ for Np and Pu. The various oxidation state species of Pu and Np in different samples were separated by application of an acetate-based electrolyte system. The separation of Fe (II

  5. Combined liquid chromatography-mass spectrometry for trace analysis of pharmaceuticals

    International Nuclear Information System (INIS)

    Schmidt, L.; Danigel, H.; Jungclas, H.

    1982-01-01

    A ²⁵²Cf plasma desorption mass spectrometer (PDMS) for the analysis of thin layers from nonvolatile organic samples has been set up to be combined with a liquid chromatograph. A novel interface performs the direct inlet of the liquid sample through a capillary into the vacuum system of the spectrometer. Samples of drugs are periodically collected, transferred to the ion source and analysed using a rotating disk. This on-line sample preparation has been tested for three antiarrhythmic drugs using various solvents and mixtures. (orig.)

  6. Concurrency & Asynchrony in Declarative Workflows

    DEFF Research Database (Denmark)

    Debois, Søren; Hildebrandt, Thomas; Slaats, Tijs

    2015-01-01

    of concurrency in DCR Graphs admits asynchronous execution of declarative workflows both conceptually and by reporting on a prototype implementation of a distributed declarative workflow engine. Both the theoretical development and the implementation are supported by an extended example; moreover, the theoretical...... In this paper, we propose a notion of concurrency for declarative process models, formulated in the context of Dynamic Condition Response (DCR) graphs, and exploiting the so-called "true concurrency" semantics of Labelled Asynchronous Transition Systems. We demonstrate how this semantic underpinning

  7. Similarity measures for scientific workflows

    OpenAIRE

    Starlinger, Johannes

    2016-01-01

    Over the last ten years, scientific workflows have gained attention as a tool for creating reproducible, data-processing in-silico experiments, into which both local scripts and applications as well as web services can be integrated. Such workflows can be published and reused through specialized online libraries, so-called repositories. With the increasing size of these repositories, similarity measures for scientific workfl...

  8. Analysing scientific workflows: Why workflows not only connect web services

    NARCIS (Netherlands)

    Wassink, I.; van der Vet, P.E.; Wolstencroft, K.; Neerincx, P.B.T.; Roos, M.; Rauwerda, H.; Breit, T.M.; Zhang, L.J.

    2009-01-01

    Life science workflow systems are developed to help life scientists to conveniently connect various programs and web services. In practice however, much time is spent on data conversion, because web services provided by different organisations use different data formats. We have analysed all the

  9. Analysing scientific workflows: why workflows not only connect web services

    NARCIS (Netherlands)

    Wassink, I.; van der Vet, P.E.; Wolstencroft, K.; Neerincx, P.B.T.; Roos, M.; Rauwerda, H.; Breit, T.M.; Zhang, LJ.

    2009-01-01

    Life science workflow systems are developed to help life scientists to conveniently connect various programs and web services. In practice however, much time is spent on data conversion, because web services provided by different organisations use different data formats. We have analysed all the

  10. 13C- and 15N-Labeling Strategies Combined with Mass Spectrometry Comprehensively Quantify Phospholipid Dynamics in C. elegans.

    Directory of Open Access Journals (Sweden)

    Blair C R Dancy

    Full Text Available Membranes define cellular and organelle boundaries, a function that is critical to all living systems. Like other biomolecules, membrane lipids are dynamically maintained, but current methods are extremely limited for monitoring lipid dynamics in living animals. We developed novel strategies in C. elegans combining 13C and 15N stable isotopes with mass spectrometry to directly quantify the replenishment rates of the individual fatty acids and intact phospholipids of the membrane. Using multiple measurements of phospholipid dynamics, we found that the phospholipid pools are replaced rapidly and at rates nearly double the turnover measured for neutral lipid populations. In fact, our analysis shows that the majority of membrane lipids are replaced each day. Furthermore, we found that stearoyl-CoA desaturases (SCDs), critical enzymes in polyunsaturated fatty acid production, play an unexpected role in influencing the overall rates of membrane maintenance as SCD depletion affected the turnover of nearly all membrane lipids. Additionally, the compromised membrane maintenance as defined by LC-MS/MS with SCD RNAi resulted in active phospholipid remodeling that we predict is critical to alleviate the impact of reduced membrane maintenance in these animals. Not only have these combined methodologies identified new facets of the impact of SCDs on the membrane, but they also have great potential to reveal many undiscovered regulators of phospholipid metabolism.
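
    Replenishment rates from stable-isotope labelling are commonly obtained by fitting the labelled fraction of a lipid pool to a first-order incorporation curve, f(t) = 1 - exp(-kt); the sketch below fits synthetic time-course data, not the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def incorporation(t, k):
    """Fraction of a lipid pool replaced by newly made (labelled) molecules."""
    return 1.0 - np.exp(-k * t)

# synthetic time course (days) of the labelled fraction of one phospholipid
t = np.array([0.0, 0.5, 1.0, 2.0, 3.0])
frac_labelled = np.array([0.00, 0.42, 0.65, 0.88, 0.95])

(k_fit,), _ = curve_fit(incorporation, t, frac_labelled, p0=[1.0])
half_life = np.log(2) / k_fit
print(f"turnover rate k = {k_fit:.2f} per day, half-life = {half_life:.2f} days")
```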

  11. Office 2010 Workflow Developing Collaborative Solutions

    CERN Document Server

    Mann, David; Enterprises, Creative

    2010-01-01

    Workflow is the glue that binds information worker processes, users, and artifacts. Without workflow, information workers are just islands of data and potential. Office 2010 Workflow details how to implement workflow in SharePoint 2010 and the client Microsoft Office 2010 suite to help information workers share data, enforce processes and business rules, and work more efficiently together or solo. This book covers everything you need to know-from what workflow is all about to creating new activities; from the SharePoint Designer to Visual Studio 2010; from out-of-the-box workflows to state mac

  12. A comprehensive evaluation of popular proteomics software workflows for label-free proteome quantification and imputation.

    Science.gov (United States)

    Välikangas, Tommi; Suomi, Tomi; Elo, Laura L

    2017-05-31

    Label-free mass spectrometry (MS) has developed into an important tool applied in various fields of biological and life sciences. Several software packages exist to process the raw MS data into quantified protein abundances, including open source and commercial solutions. Each software package includes a set of unique algorithms for different tasks of the MS data processing workflow. While many of these algorithms have been compared separately, a thorough and systematic evaluation of their overall performance is missing. Moreover, systematic information is lacking about the amount of missing values produced by the different proteomics software and the capabilities of different data imputation methods to account for them. In this study, we evaluated the performance of five popular quantitative label-free proteomics software workflows using four different spike-in data sets. Our extensive testing included the number of proteins quantified and the number of missing values produced by each workflow, the accuracy of detecting differential expression and logarithmic fold change, and the effect of different imputation and filtering methods on the differential expression results. We found that the Progenesis software performed consistently well in the differential expression analysis and produced few missing values. The missing values produced by the other software decreased their performance, but this difference could be mitigated using proper data filtering or imputation methods. Among the imputation methods, we found that the local least squares (lls) regression imputation consistently increased the performance of the software in the differential expression analysis, and a combination of both data filtering and local least squares imputation increased performance the most in the tested data sets. © The Author 2017. Published by Oxford University Press.
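
    The local least squares idea, imputing a protein's missing values from a regression on its most correlated fully observed proteins, can be sketched in a few lines; this simplified version with synthetic data illustrates the principle only and is not the published llsImpute algorithm.

```python
import numpy as np

def lls_impute(X, k=5):
    """Very small local least squares imputation sketch.

    X -- samples x proteins matrix with np.nan for missing values.
    Each protein with missing entries is regressed (over its observed samples)
    against the k fully observed proteins most correlated with it.
    """
    X = X.copy()
    complete = np.where(~np.isnan(X).any(axis=0))[0]          # fully observed proteins
    for j in np.where(np.isnan(X).any(axis=0))[0]:
        obs = ~np.isnan(X[:, j])
        # pick the k complete proteins most correlated with protein j
        corrs = [abs(np.corrcoef(X[obs, j], X[obs, c])[0, 1]) for c in complete]
        nbrs = complete[np.argsort(corrs)[-k:]]
        A = np.column_stack([X[obs][:, nbrs], np.ones(obs.sum())])
        coef, *_ = np.linalg.lstsq(A, X[obs, j], rcond=None)
        miss = ~obs
        X[miss, j] = np.column_stack([X[miss][:, nbrs], np.ones(miss.sum())]) @ coef
    return X

rng = np.random.default_rng(0)
data = rng.normal(size=(20, 10))
data[:, 3] = data[:, 0] * 0.8 + rng.normal(scale=0.1, size=20)   # correlated protein pair
data[2, 3] = np.nan                                              # introduce a missing value
print("imputed:", round(lls_impute(data)[2, 3], 2), "expected near:", round(data[2, 0] * 0.8, 2))
```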

  13. Alkaloid profiling of the Chinese herbal medicine Fuzi by combination of matrix-assisted laser desorption ionization mass spectrometry with liquid chromatography-mass spectrometry

    NARCIS (Netherlands)

    Wang, J.; Heijden, R. van der; Spijksma, G.; Reijmers, T.; Wang, M.; Xu, G.; Hankemeier, T.; Greef, J. van der

    2009-01-01

    A matrix-assisted laser desorption ionization mass spectrometry (MALDI-MS) method was developed for the high throughput and robust qualitative profiling of alkaloids in Fuzi-the processed lateral roots of the Chinese herbal medicine Aconitum carmichaeli Debx (A. carmichaeli). After optimization,

  14. 'Combined reflectance stratigraphy' - subdivision of loess successions by diffuse reflectance spectrometry (DRS)

    Science.gov (United States)

    Szeberényi, Jozsef; Bradak-Hayashi, Balázs; Kiss, Klaudia; Kovács, József; Varga, György; Balázs, Réka; Szalai, Zoltán; Viczián, István

    2016-04-01

    The different varieties of loess (and intercalated paleosol layers) together constitute one of the most widespread terrestrial sediments, which was deposited, altered, and redeposited in the course of the changing climatic conditions of the Pleistocene. To reveal more information about Pleistocene climate cycles and/or environments, a detailed lithostratigraphical subdivision and classification of the loess variations and paleosols is necessary. Besides numerous methods such as various field measurements, semi-quantitative tests and laboratory investigations, diffuse reflectance spectroscopy (DRS) is one of the methods widely applied to loess/paleosol sequences. Generally, DRS has been used to separate the detrital and pedogenic mineral components of loess sections by the hematite/goethite ratio. DRS has also been applied jointly with various environmental magnetic investigations such as magnetic susceptibility and isothermal remanent magnetization measurements. In our study, a so-called "combined reflectance stratigraphy" method was developed. First, a multivariate mathematical method was applied to compare the results of the spectral reflectance measurements. One of the most widely used multivariate methods is cluster analysis; its purpose here is to group and compare the loess variations and paleosols based on the similarity and common properties of their reflectance curves. Second, besides the basic subdivision of the profiles by the different reflectance curves of the layers, the most characteristic wavelength section of the reflectance curve was determined. These sections played the most important role in the classification of the different materials of the section. The reflectance values of individual samples at the characteristic wavelengths were plotted as a function of depth and correlated well with other proxies such as grain size distribution and magnetic susceptibility data. The results of the correlation showed the significance of
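
    The cluster analysis of reflectance curves can be sketched with SciPy's hierarchical clustering; the spectra below are synthetic stand-ins for measured DRS curves, and the two-group cut simply mimics a loess versus paleosol split.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
wavelengths = np.arange(400, 701, 10)                 # nm, visible range
# synthetic reflectance curves: 6 loess-like and 6 paleosol-like samples
loess = 0.45 + 0.0005 * (wavelengths - 400) + rng.normal(0, 0.01, (6, wavelengths.size))
paleosol = 0.25 + 0.0002 * (wavelengths - 400) + rng.normal(0, 0.01, (6, wavelengths.size))
spectra = np.vstack([loess, paleosol])

# Ward linkage on the reflectance curves, cut into two groups
Z = linkage(spectra, method="ward")
groups = fcluster(Z, t=2, criterion="maxclust")
print("cluster labels:", groups)                      # separates loess from paleosol samples
```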

  15. APPLICATION OF LIQUID-CHROMATOGRAPHY COMBINED WITH MASS-SPECTROMETRY (LC-MS) TO ESTABLISH IDENTITY AND PURITY OF PET-RADIOPHARMACEUTICALS

    NARCIS (Netherlands)

    FRANSSEN, EJF; LUURTSEMA, G; MEDEMA, J; VISSER, GM; JERONISMUSSHALINGH, CM; BRUINS, AP; VAALBURG, W

    This article describes the application of liquid chromatography combined with mass-spectrometry (LC-MS) as a new quality control tool for PET-radiopharmaceuticals. The final step in the production of 2-[F-18]fluoro-2-deoxy-D-glucose (F-18-FDG) is a purification by HPLC. This procedure was validated

  16. Hermes: Seamless delivery of containerized bioinformatics workflows in hybrid cloud (HTC environments

    Directory of Open Access Journals (Sweden)

    Athanassios M. Kintsakis

    2017-01-01

    Full Text Available Hermes introduces a new “describe once, run anywhere” paradigm for the execution of bioinformatics workflows in hybrid cloud environments. It combines the traditional features of parallelization-enabled workflow management systems and of distributed computing platforms in a container-based approach. It offers seamless deployment, overcoming the burden of setting up and configuring the software and network requirements. Most importantly, Hermes fosters the reproducibility of scientific workflows by supporting standardization of the software execution environment, thus leading to consistent scientific workflow results and accelerating scientific output.

  17. Hermes: Seamless delivery of containerized bioinformatics workflows in hybrid cloud (HTC) environments

    Science.gov (United States)

    Kintsakis, Athanassios M.; Psomopoulos, Fotis E.; Symeonidis, Andreas L.; Mitkas, Pericles A.

    Hermes introduces a new "describe once, run anywhere" paradigm for the execution of bioinformatics workflows in hybrid cloud environments. It combines the traditional features of parallelization-enabled workflow management systems and of distributed computing platforms in a container-based approach. It offers seamless deployment, overcoming the burden of setting up and configuring the software and network requirements. Most importantly, Hermes fosters the reproducibility of scientific workflows by supporting standardization of the software execution environment, thus leading to consistent scientific workflow results and accelerating scientific output.

  18. CMS Distributed Computing Workflow Experience

    CERN Document Server

    Haas, Jeffrey David

    2010-01-01

    The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level, which include re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail. The operational optimization of resource usage is described. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies as well as their impact on the delivery of physics results is discussed and lessons are drawn from this experience. The simul...

  19. New Interactions with Workflow Systems

    NARCIS (Netherlands)

    Wassink, I.; van der Vet, P.E.; van der Veer, Gerrit C.; Roos, M.; van Dijk, Elisabeth M.A.G.; Norros, L.; Koskinen, H.; Salo, L.; Savioja, P.

    2009-01-01

    This paper describes the evaluation of our early design ideas for an ad-hoc workflow system. Using the teach-back technique, we have performed a hermeneutic analysis of the mockup implementation named NIWS to get corrective and creative feedback at the functional, dialogue and representation level

  20. CMS distributed computing workflow experience

    Science.gov (United States)

    Adelman-McCarthy, Jennifer; Gutsche, Oliver; Haas, Jeffrey D.; Prosper, Harrison B.; Dutta, Valentina; Gomez-Ceballos, Guillelmo; Hahn, Kristian; Klute, Markus; Mohapatra, Ajit; Spinoso, Vincenzo; Kcira, Dorian; Caudron, Julien; Liao, Junhui; Pin, Arnaud; Schul, Nicolas; De Lentdecker, Gilles; McCartin, Joseph; Vanelderen, Lukas; Janssen, Xavier; Tsyganov, Andrey; Barge, Derek; Lahiff, Andrew

    2011-12-01

    The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level, which include re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail. The operational optimization of resource usage is described. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies as well as their impact on the delivery of physics results is discussed and lessons are drawn from this experience. The simulation of proton-proton collisions for the CMS experiment is primarily carried out at the second tier of the CMS computing infrastructure. Half of the Tier-2 sites of CMS are reserved for central Monte Carlo (MC) production while the other half is available for user analysis. This paper summarizes the large throughput of the MC production operation during the data taking period of 2010 and discusses the latencies and efficiencies of the various types of MC production workflows. We present the operational procedures to optimize the usage of available resources and we present the operational model of CMS for including opportunistic resources, such as the larger Tier-3 sites, into the central production operation.

  1. CMS distributed computing workflow experience

    International Nuclear Information System (INIS)

    Adelman-McCarthy, Jennifer; Gutsche, Oliver; Haas, Jeffrey D; Prosper, Harrison B; Dutta, Valentina; Gomez-Ceballos, Guillelmo; Hahn, Kristian; Klute, Markus; Mohapatra, Ajit; Spinoso, Vincenzo; Kcira, Dorian; Caudron, Julien; Liao Junhui; Pin, Arnaud; Schul, Nicolas; Lentdecker, Gilles De; McCartin, Joseph; Vanelderen, Lukas; Janssen, Xavier; Tsyganov, Andrey

    2011-01-01

    The vast majority of the CMS Computing capacity, which is organized in a tiered hierarchy, is located away from CERN. The 7 Tier-1 sites archive the LHC proton-proton collision data that is initially processed at CERN. These sites provide access to all recorded and simulated data for the Tier-2 sites, via wide-area network (WAN) transfers. All central data processing workflows are executed at the Tier-1 level, which include re-reconstruction and skimming workflows of collision data as well as reprocessing of simulated data to adapt to changing detector conditions. This paper describes the operation of the CMS processing infrastructure at the Tier-1 level. The Tier-1 workflows are described in detail. The operational optimization of resource usage is described. In particular, the variation of different workflows during the data taking period of 2010, their efficiencies and latencies as well as their impact on the delivery of physics results is discussed and lessons are drawn from this experience. The simulation of proton-proton collisions for the CMS experiment is primarily carried out at the second tier of the CMS computing infrastructure. Half of the Tier-2 sites of CMS are reserved for central Monte Carlo (MC) production while the other half is available for user analysis. This paper summarizes the large throughput of the MC production operation during the data taking period of 2010 and discusses the latencies and efficiencies of the various types of MC production workflows. We present the operational procedures to optimize the usage of available resources and we present the operational model of CMS for including opportunistic resources, such as the larger Tier-3 sites, into the central production operation.

  2. Design Tools and Workflows for Braided Structures

    DEFF Research Database (Denmark)

    Vestartas, Petras; Heinrich, Mary Katherine; Zwierzycki, Mateusz

    2017-01-01

    and merits of our method, demonstrated through four example design and analysis workflows. The workflows frame specific aspects of enquiry for the ongoing research project flora robotica. These include modelling target geometries, automatically producing instructions for fabrication, conducting structural...

  3. Gas-phase fragmentation of peptides to increase the spatial resolution of the Hydrogen Exchange Mass Spectrometry experiment

    DEFF Research Database (Denmark)

    Jensen, Pernille Foged; Rand, Kasper Dyrberg

    2016-01-01

    are produced after precursor ion selection and thus do not add complexity to the LC-MS analysis. The key to obtaining optimal spatial resolution in a hydrogen exchange mass spectrometry (HX-MS) experiment is the fragmentation efficiency. This chapter discusses common fragmentation techniques like collision....../D scrambling, thus making them suitable for HX applications. By combining the classic bottom-up HX-MS workflow with gas-phase fragmentation by ETD, detailed information on protein HX can be obtained....

  4. Gamma-ray spectrometry combined with acceptable knowledge (GSAK). A technique for characterization of certain remote-handled transuranic (RH-TRU) wastes. Part 1. Methodology and techniques

    International Nuclear Information System (INIS)

    Hartwell, J.K.; McIlwain, M.E.

    2005-01-01

    Gamma-ray spectrometry combined with acceptable knowledge (GSAK) is a technique for the characterization of certain remote-handled transuranic (RH-TRU) wastes. GSAK uses gamma-ray spectrometry to quantify a portion of the fission product inventory of RH-TRU wastes. These fission product results are then coupled with calculated inventories derived from acceptable process knowledge to characterize the radionuclide content of the assayed wastes. GSAK has been evaluated and tested through several test exercises. The GSAK approach is described here, while test results are presented in Part II. (author)

  5. Gamma-ray spectrometry combined with acceptable knowledge (GSAK). A technique for characterization of certain remote-handled transuranic (RH-TRU) wastes. Part 2. Testing and results

    International Nuclear Information System (INIS)

    Hartwell, J.K.; McIlwain, M.E.

    2005-01-01

    Gamma-ray spectrometry combined with acceptable knowledge (GSAK) is a technique for the characterization of certain remote-handled transuranic (RH-TRU) wastes. GSAK uses gamma-ray spectrometry to quantify a portion of the fission product inventory of RH-TRU wastes. These fission product results are then coupled with calculated inventories derived from acceptable process knowledge to characterize the radionuclide content of the assayed wastes. GSAK has been evaluated and tested through several test exercises. These tests and their results are described here, while the former paper in this issue presents the methodology, equipment and techniques. (author)

  6. Endogenous Plasma Peptide Detection and Identification in the Rat by a Combination of Fractionation Methods and Mass Spectrometry

    Directory of Open Access Journals (Sweden)

    Fabrice Bertile

    2007-01-01

    Full Text Available Mass spectrometry-based analyses are essential tools in the field of biomarker research. However, detection and characterization of plasma low abundance and/or low molecular weight peptides is challenged by the presence of highly abundant proteins, salts and lipids. Numerous strategies have already been tested to reduce the complexity of plasma samples. The aim of this study was to enrich the low molecular weight fraction of rat plasma. To this end, we developed and compared simple protocols based on membrane filtration, solid phase extraction, and a combination of both. As assessed by UV absorbance, an albumin depletion of 99% was obtained. The multistep fractionation strategy (including reverse phase HPLC) allowed detection, in a reproducible manner (CV 30%–35%), of more than 450 peaks below 3000 Da by MALDI-TOF/MS. A MALDI-TOF/MS-determined LOD as low as 1 fmol/μL was obtained, thus allowing nanoLC-Chip/MS/MS identification of spiked peptides representing ∼10⁻⁶% of total proteins, by weight. Recovery of the spiked peptides ranged between 5% and 100%, according to the peptide considered. Tens of peptide sequence tags from endogenous plasma peptides were also obtained and high confidence identifications of low abundance fibrinopeptide A and B are reported here to show the efficiency of the protocol. It is concluded that the fractionation protocol presented would be of particular interest for future differential (high throughput) analyses of the plasma low molecular weight fraction.

  7. Classification of Tempranillo wines according to geographic origin: Combination of mass spectrometry based electronic nose and chemometrics

    Energy Technology Data Exchange (ETDEWEB)

    Cynkar, Wies, E-mail: wies.cynkar@awri.com.au [Australian Wine Research Institute, PO Box 197, Glen Osmond, SA 5064 (Australia); Dambergs, Robert [Australian Wine Research Institute, Tasmanian Institute of Agricultural Research, University of Tasmania, Private Bag 98, Hobart Tasmania 7001 (Australia); Smith, Paul; Cozzolino, Daniel [Australian Wine Research Institute, PO Box 197, Glen Osmond, SA 5064 (Australia)

    2010-02-15

    Rapid methods employing instruments such as electronic noses (EN) or gas sensors are used in the food and beverage industries to monitor and assess the composition and quality of products. Similar to other food industries, the wine industry has a clear need for simple, rapid and cost effective techniques for objectively evaluating the quality of grapes, wine and spirits. In this study a mass spectrometry based electronic nose (MS-EN) instrument combined with chemometrics was used to predict the geographical origin of Tempranillo wines produced in Australia and Spain. The MS-EN data generated were analyzed using principal components analysis (PCA), partial least squares discriminant analysis (PLS-DA) and stepwise linear discriminant analysis (SLDA) with full cross validation (leave-one-out method). SLDA correctly classified 86% of the samples, while PLS-DA correctly classified 85% of the Tempranillo wines according to their geographical origin. The relative benefit of MS-EN is that it provides a capability for rapid screening of wines. However, this technique does not provide the identification and quantitative determination of individual compounds responsible for the different aroma notes in the wine.

  8. Comprehensive two-dimensional gas chromatography in combination with rapid scanning quadrupole mass spectrometry in perfume analysis.

    Science.gov (United States)

    Mondello, Luigi; Casillia, Alessandro; Tranchida, Peter Quinto; Dugo, Giovanni; Dugo, Paola

    2005-03-04

    Single column gas chromatography (GC) in combination with a flame ionization detector (FID) and/or a mass spectrometer is routinely employed in the determination of perfume profiles. The latter are to be considered medium to highly complex matrices and, as such, can only be partially separated even on long capillaries. Inevitably, several monodimensional peaks are the result of two or more overlapping components, often hindering reliable identification and quantitation. The present investigation is based on the use of a comprehensive GC (GC x GC) method, in vacuum outlet conditions, for the near to complete resolution of a complex perfume sample. A rapid scanning quadrupole mass spectrometry (qMS) system, employed for the assignment of GC x GC peaks, supplied high quality mass spectra. The validity of the three-dimensional (3D) GC x GC-qMS application was measured and compared to that of GC-qMS analysis on the same matrix. Peak identification, in all applications, was achieved through MS spectra library matching and the interactive use of linear retention indices (LRI).

  9. Real time analysis of brain tissue by direct combination of ultrasonic surgical aspiration and sonic spray mass spectrometry.

    Science.gov (United States)

    Schäfer, Karl-Christian; Balog, Júlia; Szaniszló, Tamás; Szalay, Dániel; Mezey, Géza; Dénes, Júlia; Bognár, László; Oertel, Matthias; Takáts, Zoltán

    2011-10-15

    Direct combination of cavitron ultrasonic surgical aspirator (CUSA) and sonic spray ionization mass spectrometry is presented. A commercially available ultrasonic surgical device was coupled to a Venturi easy ambient sonic-spray ionization (V-EASI) source by directly introducing liquified tissue debris into the Venturi air jet pump. The Venturi air jet pump was found to efficiently nebulize the suspended tissue material for gas phase ion production. The ionization mechanism involving solely pneumatic spraying was associated with that of sonic spray ionization. Positive and negative ionization spectra were obtained from brain and liver samples reflecting the primary application areas of the surgical device. Mass spectra were found to feature predominantly complex lipid-type constituents of tissues in both ion polarity modes. Multiply charged peptide anions were also detected. The influence of instrumental settings was characterized in detail. Venturi pump geometry and flow parameters were found to be critically important in ionization efficiency. Standard solutions of phospholipids and peptides were analyzed in order to test the dynamic range, sensitivity, and suppression effects. The spectra of the intact tissue specimens were found to be highly specific to the histological tissue type. The principal component analysis (PCA) and linear discriminant analysis (LDA) based data analysis method was developed for real-time tissue identification in a surgical environment. The method has been successfully tested on post-mortem and ex vivo human samples including astrocytomas, meningeomas, metastatic brain tumors, and healthy brain tissue. © 2011 American Chemical Society

  10. Classification of Tempranillo wines according to geographic origin: Combination of mass spectrometry based electronic nose and chemometrics

    International Nuclear Information System (INIS)

    Cynkar, Wies; Dambergs, Robert; Smith, Paul; Cozzolino, Daniel

    2010-01-01

    Rapid methods employing instruments such as electronic noses (EN) or gas sensors are used in the food and beverage industries to monitor and assess the composition and quality of products. Similar to other food industries, the wine industry has a clear need for simple, rapid and cost effective techniques for objectively evaluating the quality of grapes, wine and spirits. In this study a mass spectrometry based electronic nose (MS-EN) instrument combined with chemometrics was used to predict the geographical origin of Tempranillo wines produced in Australia and Spain. The MS-EN data generated were analyzed using principal components analysis (PCA), partial least squares discriminant analysis (PLS-DA) and stepwise linear discriminant analysis (SLDA) with full cross validation (leave-one-out method). SLDA correctly classified 86% of the samples, while PLS-DA correctly classified 85% of the Tempranillo wines according to their geographical origin. The relative benefit of MS-EN is that it provides a capability for rapid screening of wines. However, this technique does not provide the identification and quantitative determination of individual compounds responsible for the different aroma notes in the wine.

  11. The equivalency between logic Petri workflow nets and workflow nets.

    Science.gov (United States)

    Wang, Jing; Yu, ShuXia; Du, YuYue

    2015-01-01

    Logic Petri nets (LPNs) can describe and analyze batch processing functions and passing value indeterminacy in cooperative systems. Logic Petri workflow nets (LPWNs) are proposed based on LPNs in this paper. Process mining is regarded as an important bridge between modeling and analysis of data mining and business process. Workflow nets (WF-nets) are the extension to Petri nets (PNs), and have successfully been used to process mining. Some shortcomings cannot be avoided in process mining, such as duplicate tasks, invisible tasks, and the noise of logs. The online shop in electronic commerce in this paper is modeled to prove the equivalence between LPWNs and WF-nets, and advantages of LPWNs are presented.
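
    As background for the workflow-net formalism the abstract builds on, the sketch below shows plain Petri-net token-firing semantics for a tiny order-handling net; it illustrates what proper termination of a case looks like, but it is generic textbook behaviour, not the logic Petri workflow net (LPWN) extension introduced in the paper, and the example net is invented.

```python
# Minimal Petri-net firing semantics as used by workflow nets (WF-nets):
# a transition may fire when every input place holds a token; firing moves
# tokens from input to output places. Names and the tiny net are illustrative.
from collections import Counter

transitions = {
    "register": ({"start": 1}, {"pay": 1, "pick": 1}),   # AND-split
    "pay":      ({"pay": 1},   {"paid": 1}),
    "pick":     ({"pick": 1},  {"picked": 1}),
    "ship":     ({"paid": 1, "picked": 1}, {"end": 1}),   # AND-join
}

def enabled(marking, pre):
    return all(marking[p] >= n for p, n in pre.items())

def fire(marking, name):
    pre, post = transitions[name]
    if not enabled(marking, pre):
        raise ValueError(f"{name} is not enabled")
    m = Counter(marking)
    m.subtract(pre)
    m.update(post)
    return +m   # drop places whose token count fell to zero

marking = Counter({"start": 1})
for step in ["register", "pay", "pick", "ship"]:
    marking = fire(marking, step)
print(marking)   # Counter({'end': 1}): the case terminates properly
```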

  12. The Equivalency between Logic Petri Workflow Nets and Workflow Nets

    Science.gov (United States)

    Wang, Jing; Yu, ShuXia; Du, YuYue

    2015-01-01

    Logic Petri nets (LPNs) can describe and analyze batch processing functions and passing value indeterminacy in cooperative systems. Logic Petri workflow nets (LPWNs) are proposed based on LPNs in this paper. Process mining is regarded as an important bridge between modeling and analysis of data mining and business process. Workflow nets (WF-nets) are the extension to Petri nets (PNs), and have successfully been used to process mining. Some shortcomings cannot be avoided in process mining, such as duplicate tasks, invisible tasks, and the noise of logs. The online shop in electronic commerce in this paper is modeled to prove the equivalence between LPWNs and WF-nets, and advantages of LPWNs are presented. PMID:25821845

  13. Snakemake-a scalable bioinformatics workflow engine

    NARCIS (Netherlands)

    J. Köster (Johannes); S. Rahmann (Sven)

    2012-01-01

    textabstractSnakemake is a workflow engine that provides a readable Python-based workflow definition language and a powerful execution environment that scales from single-core workstations to compute clusters without modifying the workflow. It is the first system to support the use of automatically

  14. Behavioral technique for workflow abstraction and matching

    NARCIS (Netherlands)

    Klai, K.; Ould Ahmed M'bareck, N.; Tata, S.; Dustdar, S.; Fiadeiro, J.L.; Sheth, A.

    2006-01-01

    This work is in line with the CoopFlow approach dedicated for workflow advertisement, interconnection, and cooperation in virtual organizations. In order to advertise workflows into a registry, we present in this paper a novel method to abstract behaviors of workflows into symbolic observation

  15. A performance study of grid workflow engines

    NARCIS (Netherlands)

    Stratan, C.; Iosup, A.; Epema, D.H.J.

    2008-01-01

    To benefit from grids, scientists require grid workflow engines that automatically manage the execution of inter-related jobs on the grid infrastructure. So far, the workflows community has focused on scheduling algorithms and on interface tools. Thus, while several grid workflow engines have been

  16. Analysis of the differentially expressed low molecular weight peptides in human serum via an N-terminal isotope labeling technique combining nano-liquid chromatography/matrix-assisted laser desorption/ionization mass spectrometry.

    Science.gov (United States)

    Leng, Jiapeng; Zhu, Dong; Wu, Duojiao; Zhu, Tongyu; Zhao, Ningwei; Guo, Yinlong

    2012-11-15

    Peptidomics analysis of human serum is challenging due to the low abundance of serum peptides and interference from the complex matrix. This study analyzed the differentially expressed (DE) low molecular weight peptides in human serum integrating a DMPITC-based N-terminal isotope labeling technique with nano-liquid chromatography and matrix-assisted laser desorption/ionization mass spectrometry (nano-LC/MALDI-MS). The workflow introduced a [d(6)]-4,6-dimethoxypyrimidine-2-isothiocyanate (DMPITC)-labeled mixture of aliquots from test samples as the internal standard. The spiked [d(0)]-DMPITC-labeled samples were separated by nano-LC then spotted on the MALDI target. Both quantitative and qualitative studies for serum peptides were achieved based on the isotope-labeled peaks. The DMPITC labeling technique combined with nano-LC/MALDI-MS not only minimized the errors in peptide quantitation, but also allowed convenient recognition of the labeled peptides due to the 6 Da mass difference. The data showed that the entire research procedure as well as the subsequent data analysis method were effective, reproducible, and sensitive for the analysis of DE serum peptides. This study successfully established a research model for DE serum peptides using DMPITC-based N-terminal isotope labeling and nano-LC/MALDI-MS. Application of the DMPITC-based N-terminal labeling technique is expected to provide a promising tool for the investigation of peptides in vivo, especially for the analysis of DE peptides under different biological conditions. Copyright © 2012 John Wiley & Sons, Ltd.
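
    The quantitation step described above hinges on the 6 Da spacing between d(0)- and d(6)-DMPITC-labeled forms of the same peptide. The sketch below shows that pairing and ratio calculation in schematic form; the peak list, mass tolerance and single-charge assumption are illustrative only, not data from the study.

```python
# Pair d0- and d6-DMPITC-labeled peptide peaks that differ by ~6 Da and
# express each test-sample (light) peak relative to the spiked internal
# standard (heavy). All values below are placeholders.
DELTA_DA = 6.0      # d6 minus d0 label mass difference (singly charged MALDI ions)
TOL_DA = 0.02       # hypothetical matching tolerance

# placeholder (m/z, intensity) pairs from a MALDI-TOF spectrum
peaks = [(1045.52, 8.1e3), (1051.52, 7.9e3),
         (1398.70, 2.4e4), (1404.70, 1.1e4)]

pairs = []
for mz_light, i_light in peaks:
    for mz_heavy, i_heavy in peaks:
        if abs((mz_heavy - mz_light) - DELTA_DA) <= TOL_DA:
            pairs.append((mz_light, i_light / i_heavy))

for mz, ratio in pairs:
    print(f"peptide at m/z {mz:.2f}: d0/d6 ratio = {ratio:.2f}")
```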

  17. Molecular imaging of myocardial infarction with Gadofluorine P – A combined magnetic resonance and mass spectrometry imaging approach

    Directory of Open Access Journals (Sweden)

    Fabian Lohöfer

    2018-04-01

    Full Text Available Background: Molecular MRI is becoming increasingly important for preclinical research. Validation of targeted gadolinium probes in tissue however has been cumbersome up to now. Novel methodology to assess gadolinium distribution in tissue after in vivo application is therefore needed. Purpose: To establish combined Magnetic Resonance Imaging (MRI) and Mass Spectrometry Imaging (MSI) for improved detection and quantification of Gadofluorine P deposition in scar formation and myocardial remodeling. Materials and methods: Animal studies were performed according to institutionally approved protocols. Myocardial infarction was induced by permanent ligation of the left anterior descending artery (LAD) in C57BL/6J mice. MRI was performed at 7T at 1 week and 6 weeks after myocardial infarction. Gadofluorine P was used for dynamic T1 mapping of extracellular matrix synthesis during myocardial healing and compared to Gd-DTPA. After in vivo imaging, contrast agent concentration as well as distribution in tissue were validated and quantified by spatially resolved Matrix-Assisted Laser Desorption Ionization (MALDI) MSI and Laser Ablation – Inductively Coupled Plasma – Mass Spectrometry (LA-ICP-MS) imaging. Results: Both Gadofluorine P enhancement as well as local tissue content in the myocardial scar were highest at 15 minutes post injection. R1 values increased from 1 to 6 weeks after MI (1.62 s−1 vs 2.68 s−1, p = 0.059), paralleled by an increase in Gadofluorine P concentration in the infarct from 0.019 mM at 1 week to 0.028 mM at 6 weeks (p = 0.048), whereas Gd-DTPA enhancement showed no differences (3.95 s−1 vs 3.47 s−1, p = 0.701). MALDI-MSI results were corroborated by elemental LA-ICP-MS of Gadolinium in healthy and infarcted myocardium. Histology confirmed increased extracellular matrix synthesis at 6 weeks compared to 1 week. Conclusion: Adding quantitative MSI to MR imaging enables a quantitative validation of Gadofluorine P distribution in the heart

  18. Application of Workflow Technology for Big Data Analysis Service

    Directory of Open Access Journals (Sweden)

    Bin Zhang

    2018-04-01

    Full Text Available This study presents a lightweight representational state transfer-based cloud workflow system to construct a big data intelligent software-as-a-service (SaaS) platform. The system supports the dynamic construction and operation of an intelligent data analysis application, and realizes rapid development and flexible deployment of the business analysis process that can improve the interaction and response time of the process. The proposed system integrates offline-batch and online-streaming analysis models that allow users to conduct batch and streaming computing simultaneously. Users can rent cloud capabilities and customize a set of big data analysis applications in the form of workflow processes. This study elucidates the architecture and application modeling, customization, dynamic construction, and scheduling of a cloud workflow system. A chain workflow foundation mechanism is proposed to combine several analysis components into a chain component that can promote efficiency. Four practical application cases are provided to verify the analysis capability of the system. Experimental results show that the proposed system can support multiple users in accessing the system concurrently and effectively uses data analysis algorithms. The proposed SaaS workflow system has been used by network operators and has achieved good results.
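
    The "chain workflow" mechanism mentioned above amounts to wrapping several analysis components into one composite component that the engine can schedule as a single unit. A minimal sketch of that idea follows; the step functions are invented placeholders, not components of the described platform.

```python
# Sketch of a "chain workflow" component: several analysis steps are wrapped
# into a single callable that a workflow engine can schedule as one unit.
# The step functions are placeholders, not the platform's actual components.
from functools import reduce

def clean(records):      return [r for r in records if r is not None]
def enrich(records):     return [{"value": r, "flag": r > 0} for r in records]
def aggregate(records):  return sum(r["value"] for r in records)

def chain(*steps):
    """Combine analysis components into one chain component."""
    return lambda data: reduce(lambda acc, step: step(acc), steps, data)

batch_job = chain(clean, enrich, aggregate)
print(batch_job([3, None, -1, 5]))   # -> 7
```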

  19. The Taverna workflow suite: designing and executing workflows of Web Services on the desktop, web or in the cloud

    NARCIS (Netherlands)

    Wolstencroft, K.; Haines, R.; Fellows, D.; Williams, A.; Withers, D.; Owen, S.; Soiland-Reyes, S.; Dunlop, I.; Nenadic, A.; Fisher, P.; Bhagat, J.; Belhajjame, K.; Bacall, F.; Hardisty, A.; Nieva de la Hidalga, A.; Balcazar Vargas, M.P.; Sufi, S.; Goble, C.

    2013-01-01

    The Taverna workflow tool suite (http://www.taverna.org.uk) is designed to combine distributed Web Services and/or local tools into complex analysis pipelines. These pipelines can be executed on local desktop machines or through larger infrastructure (such as supercomputers, Grids or cloud

  20. Workflow-Based Software Development Environment

    Science.gov (United States)

    Izygon, Michel E.

    2013-01-01

    The Software Developer's Assistant (SDA) helps software teams more efficiently and accurately conduct or execute software processes associated with NASA mission-critical software. SDA is a process enactment platform that guides software teams through project-specific standards, processes, and procedures. Software projects are decomposed into all of their required process steps or tasks, and each task is assigned to project personnel. SDA orchestrates the performance of work required to complete all process tasks in the correct sequence. The software then notifies team members when they may begin work on their assigned tasks and provides the tools, instructions, reference materials, and supportive artifacts that allow users to compliantly perform the work. A combination of technology components captures and enacts any software process used to support the software lifecycle. It creates an adaptive workflow environment that can be modified as needed. SDA achieves software process automation through a Business Process Management (BPM) approach to managing the software lifecycle for mission-critical projects. It contains five main parts: TieFlow (workflow engine), Business Rules (rules to alter process flow), Common Repository (storage for project artifacts, versions, history, schedules, etc.), SOA (interface to allow internal, GFE, or COTS tools integration), and the Web Portal Interface (collaborative web environment

  1. Climate Data Analytics Workflow Management

    Science.gov (United States)

    Zhang, J.; Lee, S.; Pan, L.; Mattmann, C. A.; Lee, T. J.

    2016-12-01

    In this project we aim to pave a novel path to create a sustainable building block toward Earth science big data analytics and knowledge sharing. Closely studying how Earth scientists conduct data analytics research in their daily work, we have developed a provenance model to record their activities, and a technology to automatically generate workflows for scientists from the provenance. On top of it, we have built the prototype of a data-centric provenance repository, and established a PDSW (People, Data, Service, Workflow) knowledge network to support workflow recommendation. To ensure the scalability and performance of the expected recommendation system, we have leveraged the Apache OODT system technology. The community-approved, metrics-based performance evaluation web-service will allow a user to select a metric from the list of several community-approved metrics and to evaluate model performance using the metric as well as the reference dataset. This service will facilitate the use of reference datasets that are generated in support of the model-data intercomparison projects such as Obs4MIPs and Ana4MIPs. The data-centric repository infrastructure will allow us to capture richer provenance to further facilitate knowledge sharing and scientific collaboration in the Earth science community. This project is part of the Apache incubator CMDA project.

  2. Nanocoating cellulose paper based microextraction combined with nanospray mass spectrometry for rapid and facile quantitation of ribonucleosides in human urine.

    Science.gov (United States)

    Wan, Lingzhong; Zhu, Haijing; Guan, Yafeng; Huang, Guangming

    2017-07-01

    A rapid and facile analytical method for quantification of ribonucleosides in human urine was developed by the combination of nanocoating cellulose paper based microextraction and nanoelectrospray ionization-tandem mass spectrometry (nESI-MS/MS). Cellulose paper used for microextraction was modified by nano-precision deposition of a uniform ultrathin zirconia gel film using a sol-gel process. Due to the large surface area of the cellulose paper and the strong affinity between zirconia and the cis-diol compounds, the target analytes were selectively extracted from the complex matrix. Thus, the detection sensitivity was greatly improved. Typically, the nanocoating cellulose paper was immersed into the diluted urine for selective extraction of target analytes, then the extracted analytes were subjected to nESI-MS/MS detection. The whole analytical procedure could be completed within 10 min. The method was evaluated by the determination of ribonucleosides (adenosine, cytidine, uridine, guanosine) in urine samples. The signal intensities of the ribonucleosides extracted by the nanocoating cellulose paper were enhanced 136–459-fold compared with those obtained by unmodified cellulose paper based microextraction. The limits of detection (LODs) and the limits of quantification (LOQs) of the four ribonucleosides were in the range of 0.0136–1.258 μg L−1 and 0.0454–4.194 μg L−1, respectively. The recoveries of the target nucleosides from spiked human urine were in the range of 75.64–103.49% with relative standard deviations (RSDs) less than 9.36%. The results demonstrate the potential of the proposed method for rapid and facile determination of endogenous ribonucleosides in urine samples. Copyright © 2017. Published by Elsevier B.V.

  3. Biomarker discovery in high grade sarcomas by mass spectrometry imaging

    OpenAIRE

    Lou, S.

    2017-01-01

    This thesis demonstrates a detailed biomarker discovery Mass Spectrometry Imaging workflow for histologically heterogeneous high grade sarcomas. Panels of protein and metabolite signatures were discovered either distinguishing different histological subtypes or stratifying high risk patients with poor survival.

  4. From shared data to sharing workflow: Merging PACS and teleradiology

    International Nuclear Information System (INIS)

    Benjamin, Menashe; Aradi, Yinon; Shreiber, Reuven

    2010-01-01

    Due to a host of technological, interface, operational and workflow limitations, teleradiology and PACS/RIS were historically developed as separate systems serving different purposes. PACS/RIS handled local radiology storage and workflow management while teleradiology addressed remote access to images. Today advanced PACS/RIS support complete site radiology workflow for attending physicians, whether on-site or remote. In parallel, teleradiology has emerged as a service providing remote, off-hours coverage for emergency radiology and, to a lesser extent, subspecialty reading to subscribing sites and radiology groups. When attending radiologists use teleradiology for remote access to a site, they may share all relevant patient data and participate in the site's workflow like their on-site peers. The operation gets cumbersome and time consuming when these radiologists serve multiple sites, each requiring a different remote access, or when the sites do not employ the same PACS/RIS/Reporting Systems and do not share the same ownership. The least efficient operation is that of teleradiology companies engaged in reading for multiple facilities. As these services typically employ non-local radiologists, they are allowed to share some of the available patient data necessary to provide an emergency report but, by and large, they do not share the workflow of the sites they serve. Radiology stakeholders usually prefer to have their own radiologists perform all radiology tasks including interpretation of off-hour examinations. It is possible with current technology to create a system that combines the benefits of local radiology services to multiple sites with the advantages offered by adding subspecialty and off-hours emergency services through teleradiology. Such a system increases efficiency for the radiology groups by enabling all users, regardless of location, to work 'local' and fully participate in the workflow of every site. We refer to such a system as SuperPACS.

  5. Automation of Flexible Migration Workflows

    Directory of Open Access Journals (Sweden)

    Dirk von Suchodoletz

    2011-03-01

    Full Text Available Many digital preservation scenarios are based on the migration strategy, which itself is heavily tool-dependent. For popular, well-defined and often open file formats – e.g., digital images, such as PNG, GIF, JPEG – a wide range of tools exist. Migration workflows become more difficult with proprietary formats, as used by the many text processing applications that have become available over the last two decades. If a certain file format cannot be rendered with current software, emulation of the original environment remains a valid option. For instance, with the original Lotus AmiPro or Word Perfect, it is not a problem to save an object of this type in ASCII text or Rich Text Format. In specific environments, it is even possible to send the file to a virtual printer, thereby producing a PDF as a migration output. Such manual migration tasks typically involve human interaction, which may be feasible for a small number of objects, but not for larger batches of files. We propose a novel approach using a software-operated VNC abstraction layer in order to replace humans with machine interaction. Emulators or virtualization tools equipped with a VNC interface are very well suited for this approach. But screen, keyboard and mouse interaction is just part of the setup. Furthermore, digital objects need to be transferred into the original environment in order to be extracted after processing. Nevertheless, the complexity of the new generation of migration services is quickly rising; a preservation workflow now comprises not only the migration tool itself, but a complete software and virtual hardware stack with recorded workflows linked to every supported migration scenario. Thus the requirements of OAIS management must include proper software archiving, emulator selection, system image and recording handling. The concept of view-paths could help either to automatically determine the proper pre-configured virtual environment or to set up system

  6. PhosProtect - a novel and superior compound to tag and protect phospho-groups during mass spectrometry based phospho-proteomics

    DEFF Research Database (Denmark)

    2012-01-01

    Value Proposition The PhosProtect compound alone (developed and tested, IP secured)* • protects phospho-groups during tandem mass spectrometry, thus reducing problematic neutral loss of phosphate. • provides unique phospho-tag by causing isotopic distribution patterns in MS and MS/MS data. The PhosProtect...... compound covalently bound to column material (in progress, IP secured)** • Combines enrichment, protection and tagging of phospho-peptides and phospho-lipids in one easy workflow....

  7. Flexible Early Warning Systems with Workflows and Decision Tables

    Science.gov (United States)

    Riedel, F.; Chaves, F.; Zeiner, H.

    2012-04-01

    are usually only suited for rigid processes. We show how improvements can be achieved by using decision tables and rule-based adaptive workflows. Decision tables have been shown to be an intuitive tool that can be used by domain experts to express rule sets that can be interpreted automatically at runtime. Adaptive workflows use a rule-based approach to increase the flexibility of workflows by providing mechanisms to adapt workflows based on context changes, human intervention and availability of services. The combination of workflows, decision tables and rule-based adaption creates a framework that opens up new possibilities for flexible and adaptable workflows, especially, for use in early warning and crisis management systems.
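
    A decision table of the kind described is simply a set of condition rows mapped to actions, interpreted at runtime so that domain experts can edit the rules without changing workflow code. The sketch below illustrates the idea; the columns, rules and action names are invented for illustration, not part of the described framework.

```python
# Minimal decision-table sketch: each row maps condition values to an action,
# and the table is interpreted at runtime so domain experts can edit it
# without touching workflow code. Columns and rules are illustrative only.
decision_table = [
    # (sensor,       threshold exceeded, confirmed by 2nd source) -> action
    (("water_level", True,  True),  "trigger_evacuation_workflow"),
    (("water_level", True,  False), "request_confirmation"),
    (("water_level", False, None),  "keep_monitoring"),   # None acts as a wildcard
]

def decide(sensor, exceeded, confirmed):
    for (s, e, c), action in decision_table:
        if s == sensor and e == exceeded and c in (confirmed, None):
            return action
    return "no_rule_matched"

print(decide("water_level", True, False))   # -> request_confirmation
```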

  8. An MRM-based workflow for absolute quantitation of lysine-acetylated metabolic enzymes in mouse liver.

    Science.gov (United States)

    Xu, Leilei; Wang, Fang; Xu, Ying; Wang, Yi; Zhang, Cuiping; Qin, Xue; Yu, Hongxiu; Yang, Pengyuan

    2015-12-07

    As a key post-translational modification mechanism, protein acetylation plays critical roles in regulating and/or coordinating cell metabolism. Acetylation is a prevalent modification process in enzymes. Protein acetylation modification occurs in sub-stoichiometric amounts; therefore extracting biologically meaningful information from these acetylation sites requires an adaptable, sensitive, specific, and robust method for their quantification. In this work, we combine immunoassays and multiple reaction monitoring-mass spectrometry (MRM-MS) technology to develop an absolute quantification method for acetylation modification. With this hybrid method, we quantified the acetylation level of metabolic enzymes, which can help elucidate the regulatory mechanisms of the studied enzymes. The development of this quantitative workflow is a pivotal step for advancing our knowledge and understanding of the regulatory effects of protein acetylation in physiology and pathophysiology.
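
    The absolute quantification described above ultimately rests on standard stable-isotope-dilution arithmetic: the light/heavy peak-area ratio measured by MRM is scaled by the known amount of spiked labeled standard. A schematic of that calculation follows; the numbers are invented for illustration.

```python
# Schematic absolute quantification for one acetylated peptide: the measured
# light/heavy MRM peak-area ratio is scaled by the known amount of spiked
# heavy (stable-isotope-labeled) standard. All numbers are illustrative.
spiked_heavy_fmol = 50.0     # known amount of labeled standard added
area_light = 3.2e5           # endogenous (light) transition peak area
area_heavy = 4.0e5           # labeled standard (heavy) transition peak area

endogenous_fmol = (area_light / area_heavy) * spiked_heavy_fmol
print(f"endogenous peptide: {endogenous_fmol:.1f} fmol on column")
```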

  9. Pro WF Windows Workflow in NET 40

    CERN Document Server

    Bukovics, Bruce

    2010-01-01

    Windows Workflow Foundation (WF) is a revolutionary part of the .NET 4 Framework that allows you to orchestrate human and system interactions as a series of workflows that can be easily mapped, analyzed, adjusted, and implemented. As business problems become more complex, the need for workflow-based solutions has never been more evident. WF provides a simple and consistent way to model and implement complex problems. As a developer, you focus on developing the business logic for individual workflow tasks. The runtime handles the execution of those tasks after they have been composed into a wor

  10. Integration of services into workflow applications

    CERN Document Server

    Czarnul, Pawel

    2015-01-01

    Describing state-of-the-art solutions in distributed system architectures, Integration of Services into Workflow Applications presents a concise approach to the integration of loosely coupled services into workflow applications. It discusses key challenges related to the integration of distributed systems and proposes solutions, both in terms of theoretical aspects such as models and workflow scheduling algorithms, and technical solutions such as software tools and APIs.The book provides an in-depth look at workflow scheduling and proposes a way to integrate several different types of services

  11. Multidetector-row CT: economics and workflow

    International Nuclear Information System (INIS)

    Pottala, K.M.; Kalra, M.K.; Saini, S.; Ouellette, K.; Sahani, D.; Thrall, J.H.

    2005-01-01

    With rapid evolution of multidetector-row CT (MDCT) technology and applications, several factors such as technology upgrades and turf battles over sharing cost and profitability affect MDCT workflow and economics. MDCT workflow optimization can enhance productivity and reduce unit costs as well as increase profitability, in spite of decreasing reimbursement rates. Strategies for workflow management include standardization, automation, and constant assessment of various steps involved in MDCT operations. In this review article, we describe issues related to MDCT economics and workflow. (orig.)

  12. Towards an Intelligent Workflow Designer based on the Reuse of Workflow Patterns

    NARCIS (Netherlands)

    Iochpe, Cirano; Chiao, Carolina; Hess, Guillermo; Nascimento, Gleison; Thom, Lucinéia; Reichert, Manfred

    2007-01-01

    In order to build process-aware information systems, we need sophisticated methods and concepts for designing and modeling processes. Recently, research on workflow patterns has emerged in order to increase the reuse of recurring workflow structures. However, current workflow modeling tools do not

  13. Measuring Semantic and Structural Information for Data Oriented Workflow Retrieval with Cost Constraints

    Directory of Open Access Journals (Sweden)

    Yinglong Ma

    2014-01-01

    Full Text Available The reuse of data oriented workflows (DOWs) can reduce the cost of workflow system development and control the risk of project failure, and therefore is crucial for accelerating the automation of business processes. Reusing workflows can be achieved by measuring the similarity among candidate workflows and selecting from them the workflow that satisfies the users' requirements. However, because DOWs are often developed in an open, distributed, and heterogeneous environment, different users can impose diverse cost constraints on data oriented workflows. This makes the reuse of DOWs challenging. There is no clear solution for retrieving DOWs with cost constraints. In this paper, we present a novel graph based model of DOWs with cost constraints, called constrained data oriented workflow (CDW), which can express cost constraints that users are often concerned about. An approach is proposed for retrieving CDWs, which seamlessly combines semantic and structural information of CDWs. A distance measure based on matrix theory is adopted to seamlessly combine semantic and structural similarities of CDWs for selecting and reusing them. Finally, experiments are conducted to show the effectiveness and efficiency of our approach.
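
    The retrieval approach described combines a semantic similarity (how alike the task vocabularies are) with a structural similarity (how alike the workflow graphs are) into one score. The sketch below illustrates such a weighted combination using simple Jaccard-style scores; the scoring functions and weight are placeholders, not the matrix-based distance measure defined in the paper.

```python
# Sketch of retrieval by combined similarity: a weighted mix of a semantic
# score over task labels and a structural score over workflow edges.
def semantic_similarity(wf_a, wf_b):
    labels_a, labels_b = set(wf_a["tasks"]), set(wf_b["tasks"])
    return len(labels_a & labels_b) / len(labels_a | labels_b)   # Jaccard index

def structural_similarity(wf_a, wf_b):
    edges_a, edges_b = set(wf_a["edges"]), set(wf_b["edges"])
    return len(edges_a & edges_b) / max(len(edges_a | edges_b), 1)

def combined_similarity(wf_a, wf_b, alpha=0.6):
    # alpha weights the semantic part against the structural part
    return (alpha * semantic_similarity(wf_a, wf_b)
            + (1 - alpha) * structural_similarity(wf_a, wf_b))

query = {"tasks": ["load", "clean", "report"],
         "edges": [("load", "clean"), ("clean", "report")]}
candidate = {"tasks": ["load", "clean", "archive"],
             "edges": [("load", "clean"), ("clean", "archive")]}
print(f"similarity = {combined_similarity(query, candidate):.2f}")
```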

  14. Workflow patterns the definitive guide

    CERN Document Server

    Russell, Nick; ter Hofstede, Arthur H M

    2016-01-01

    The study of business processes has emerged as a highly effective approach to coordinating an organization's complex service- and knowledge-based activities. The growing field of business process management (BPM) focuses on methods and tools for designing, enacting, and analyzing business processes. This volume offers a definitive guide to the use of patterns, which synthesize the wide range of approaches to modeling business processes. It provides a unique and comprehensive introduction to the well-known workflow patterns collection -- recurrent, generic constructs describing common business process modeling and execution scenarios, presented in the form of problem-solution dialectics. The underlying principles of the patterns approach ensure that they are independent of any specific enabling technology, representational formalism, or modeling approach, and thus broadly applicable across the business process modeling and business process technology domains. The authors, drawing on extensive research done by...

  15. Complexity Metrics for Workflow Nets

    DEFF Research Database (Denmark)

    Lassen, Kristian Bisgaard; van der Aalst, Wil M.P.

    2009-01-01

    analysts have difficulties grasping the dynamics implied by a process model. Recent empirical studies show that people make numerous errors when modeling complex business processes, e.g., about 20 percent of the EPCs in the SAP reference model have design flaws resulting in potential deadlocks, livelocks......, etc. It seems obvious that the complexity of the model contributes to design errors and a lack of understanding. It is not easy to measure complexity, however. This paper presents three complexity metrics that have been implemented in the process analysis tool ProM. The metrics are defined...... for a subclass of Petri nets named Workflow nets, but the results can easily be applied to other languages. To demonstrate the applicability of these metrics, we have applied our approach and tool to 262 relatively complex Protos models made in the context of various student projects. This allows us to validate...

  16. Contracts for Cross-Organizational Workflow Management

    NARCIS (Netherlands)

    Koetsier, M.J.; Grefen, P.W.P.J.; Vonk, J.

    1999-01-01

    Nowadays, many organizations form dynamic partnerships to deal effectively with market requirements. As companies use automated workflow systems to control their processes, a way of linking workflow processes in different organizations is useful in turning the co-operating companies into a seamless

  17. Verifying generalized soundness for workflow nets

    NARCIS (Netherlands)

    Hee, van K.M.; Oanea, O.I.; Sidorova, N.; Voorhoeve, M.; Virbitskaite, I.; Voronkov, A.

    2007-01-01

    We improve the decision procedure from [10] for the problem of generalized soundness of workflow nets. A workflow net is generalized sound iff every marking reachable from an initial marking with k tokens on the initial place terminates properly, i.e. it can reach a marking with k tokens on the

  18. Implementing bioinformatic workflows within the bioextract server

    Science.gov (United States)

    Computational workflows in bioinformatics are becoming increasingly important in the achievement of scientific advances. These workflows typically require the integrated use of multiple, distributed data sources and analytic tools. The BioExtract Server (http://bioextract.org) is a distributed servi...

  19. Workflow Fault Tree Generation Through Model Checking

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2014-01-01

    We present a framework for the automated generation of fault trees from models of realworld process workflows, expressed in a formalised subset of the popular Business Process Modelling and Notation (BPMN) language. To capture uncertainty and unreliability in workflows, we extend this formalism...

  20. Workflow Patterns for Business Process Modeling

    NARCIS (Netherlands)

    Thom, Lucineia Heloisa; Lochpe, Cirano; Reichert, M.U.

    Because of their reuse advantages, workflow patterns (e.g., control flow patterns, data patterns, resource patterns) are increasingly attracting the interest of both researchers and vendors. Frequently, business process or workflow models can be assembled out of a set of recurrent process fragments (or

  1. Ion Mobility Separations of Isomers based upon Long Path Length Structures for Lossless Ion Manipulations Combined with Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Deng, Liulin [Biological Sciences Division and Environmental Molecular Sciences Laboratory, Pacific Northwest National Laboratory, 902 Battelle Blvd Richland, WA 99352 USA; Ibrahim, Yehia M. [Biological Sciences Division and Environmental Molecular Sciences Laboratory, Pacific Northwest National Laboratory, 902 Battelle Blvd Richland, WA 99352 USA; Baker, Erin S. [Biological Sciences Division and Environmental Molecular Sciences Laboratory, Pacific Northwest National Laboratory, 902 Battelle Blvd Richland, WA 99352 USA; Aly, Noor A. [Biological Sciences Division and Environmental Molecular Sciences Laboratory, Pacific Northwest National Laboratory, 902 Battelle Blvd Richland, WA 99352 USA; Hamid, Ahmed M. [Biological Sciences Division and Environmental Molecular Sciences Laboratory, Pacific Northwest National Laboratory, 902 Battelle Blvd Richland, WA 99352 USA; Zhang, Xing [Biological Sciences Division and Environmental Molecular Sciences Laboratory, Pacific Northwest National Laboratory, 902 Battelle Blvd Richland, WA 99352 USA; Zheng, Xueyun [Biological Sciences Division and Environmental Molecular Sciences Laboratory, Pacific Northwest National Laboratory, 902 Battelle Blvd Richland, WA 99352 USA; Garimella, Sandilya V. B. [Biological Sciences Division and Environmental Molecular Sciences Laboratory, Pacific Northwest National Laboratory, 902 Battelle Blvd Richland, WA 99352 USA; Webb, Ian K. [Biological Sciences Division and Environmental Molecular Sciences Laboratory, Pacific Northwest National Laboratory, 902 Battelle Blvd Richland, WA 99352 USA; Prost, Spencer A. [Biological Sciences Division and Environmental Molecular Sciences Laboratory, Pacific Northwest National Laboratory, 902 Battelle Blvd Richland, WA 99352 USA; Sandoval, Jeremy A. [Biological Sciences Division and Environmental Molecular Sciences Laboratory, Pacific Northwest National Laboratory, 902 Battelle Blvd Richland, WA 99352 USA; Norheim, Randolph V. [Biological Sciences Division and Environmental Molecular Sciences Laboratory, Pacific Northwest National Laboratory, 902 Battelle Blvd Richland, WA 99352 USA; Anderson, Gordon A. [Biological Sciences Division and Environmental Molecular Sciences Laboratory, Pacific Northwest National Laboratory, 902 Battelle Blvd Richland, WA 99352 USA; Tolmachev, Aleksey V. [Biological Sciences Division and Environmental Molecular Sciences Laboratory, Pacific Northwest National Laboratory, 902 Battelle Blvd Richland, WA 99352 USA; Smith, Richard D. [Biological Sciences Division and Environmental Molecular Sciences Laboratory, Pacific Northwest National Laboratory, 902 Battelle Blvd Richland, WA 99352 USA

    2016-07-01

    Mass spectrometry (MS)-based multi-omic measurements, including proteomics, metabolomics, lipidomics, and glycomics, are increasingly transforming our ability to characterize and understand biological systems, but presently have limitations due to the chemical diversity and range of abundances of biomolecules in complex samples. Advances addressing these challenges increasingly are based upon the ability to quickly separate, react and otherwise manipulate sample components for analysis by MS. Here we report on a new approach using Structures for Lossless Ion Manipulations (SLIM) to enable long serpentine path ion mobility spectrometry (IMS) separations followed by MS analyses. This approach provides previously unachieved ion mobility separations of biomolecular isomers, in conjunction with more effective ion utilization, providing a basis for the improved characterization of very small samples.

  2. X-ray dosimetry in mammography for W/Mo and Mo/Mo combinations utilizing Compton spectrometry

    International Nuclear Information System (INIS)

    Almeida Junior, Jose N.; Terini, Ricardo A.; Herdade, Silvio B.; Furquim, Tania A.C.

    2009-01-01

    Mean Glandular Dose (MGD) cannot be measured directly in mammography equipment. Therefore, methods based on Compton spectrometry are alternatives to evaluate dose distributions in a standard breast phantom, as well as mean glandular dose. In this work, a CdTe detector was used for spectrometry measurements of radiation scattered by the Compton effect at nearly 90° by a PMMA cylinder. For this, the reconstruction of primary beam spectra from the scattered ones has been made using Klein-Nishina theory and the Compton formalism, followed by a determination of incident air kerma, absorbed dose values in the breast phantom and, finally, MGD. Incident and attenuated X-ray spectra and depth-dose distributions in a BR-12 phantom have been determined and are presented for the mammography range (28 to 35 kV), showing good agreement with previous literature data obtained with TLD. (author)
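
    The reconstruction of primary spectra from 90° scattered spectra relies on the standard Compton energy-shift relation; the sketch below writes out that mapping and its inverse for a mammography-range photon. It is a simplified illustration of the scattering kinematics only, not the full Klein-Nishina-weighted spectral unfolding used in the work.

```python
# Standard Compton relation used when reconstructing the primary spectrum
# from photons scattered at ~90 degrees. Energies are in keV.
import math

M_E_C2 = 511.0   # electron rest energy in keV

def scattered_energy(e_primary_kev, theta_deg=90.0):
    """Energy of a photon after Compton scattering through angle theta."""
    k = e_primary_kev / M_E_C2
    return e_primary_kev / (1.0 + k * (1.0 - math.cos(math.radians(theta_deg))))

def primary_energy(e_scattered_kev, theta_deg=90.0):
    """Invert the Compton relation to recover the primary photon energy."""
    k = e_scattered_kev / M_E_C2
    return e_scattered_kev / (1.0 - k * (1.0 - math.cos(math.radians(theta_deg))))

# A 30 keV mammography photon loses only a little energy at 90 degrees:
e_sc = scattered_energy(30.0)                            # ~28.3 keV
print(round(e_sc, 1), round(primary_energy(e_sc), 1))    # 28.3 30.0
```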

  3. Automated selected reaction monitoring data analysis workflow for large-scale targeted proteomic studies.

    Science.gov (United States)

    Surinova, Silvia; Hüttenhain, Ruth; Chang, Ching-Yun; Espona, Lucia; Vitek, Olga; Aebersold, Ruedi

    2013-08-01

    Targeted proteomics based on selected reaction monitoring (SRM) mass spectrometry is commonly used for accurate and reproducible quantification of protein analytes in complex biological mixtures. Strictly hypothesis-driven, SRM assays quantify each targeted protein by collecting measurements on its peptide fragment ions, called transitions. To achieve sensitive and accurate quantitative results, experimental design and data analysis must consistently account for the variability of the quantified transitions. This consistency is especially important in large experiments, which increasingly require profiling up to hundreds of proteins over hundreds of samples. Here we describe a robust and automated workflow for the analysis of large quantitative SRM data sets that integrates data processing, statistical protein identification and quantification, and dissemination of the results. The integrated workflow combines three software tools: mProphet for peptide identification via probabilistic scoring; SRMstats for protein significance analysis with linear mixed-effect models; and PASSEL, a public repository for storage, retrieval and query of SRM data. The input requirements for the protocol are files with SRM traces in mzXML format, and a file with a list of transitions in a text tab-separated format. The protocol is especially suited for data with heavy isotope-labeled peptide internal standards. We demonstrate the protocol on a clinical data set in which the abundances of 35 biomarker candidates were profiled in 83 blood plasma samples of subjects with ovarian cancer or benign ovarian tumors. The time frame to realize the protocol is 1-2 weeks, depending on the number of replicates used in the experiment.

  4. A Formal Framework for Workflow Analysis

    Science.gov (United States)

    Cravo, Glória

    2010-09-01

    In this paper we provide a new formal framework to model and analyse workflows. A workflow is the formal definition of a business process that consists of the execution of tasks in order to achieve a certain objective. In our work we describe a workflow as a graph whose vertices represent tasks and whose arcs are associated with workflow transitions. Each task has an associated input/output logic operator. This logic operator can be the logical AND (•), the OR (⊗), or the XOR, i.e., exclusive-or (⊕). Moreover, we introduce algebraic concepts in order to describe completely the structure of workflows. We also introduce the concept of logical termination. Finally, we provide a necessary and sufficient condition for this property to hold.
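
    The graph model described above attaches an AND (•), OR (⊗) or XOR (⊕) logic operator to each task to govern when it may execute. The sketch below shows one way such a model can be represented and evaluated; the example workflow and the evaluation rule are simplifications for illustration, not the paper's full logical-termination analysis.

```python
# Sketch of the described graph model: vertices are tasks, arcs are workflow
# transitions, and each task carries an input logic operator (AND, OR, XOR)
# deciding when it may execute. The example net is invented.
workflow = {
    # task: (input operator, predecessor tasks)
    "collect":  ("AND", []),
    "review_a": ("AND", ["collect"]),
    "review_b": ("AND", ["collect"]),
    "approve":  ("AND", ["review_a", "review_b"]),   # join: needs both reviews
    "escalate": ("XOR", ["review_a", "review_b"]),   # exactly one review triggers it
}

def may_execute(task, completed):
    op, preds = workflow[task]
    if not preds:
        return True
    done = [p in completed for p in preds]
    if op == "AND":
        return all(done)
    if op == "OR":
        return any(done)
    if op == "XOR":
        return sum(done) == 1
    raise ValueError(op)

print(may_execute("approve", {"collect", "review_a"}))    # False
print(may_execute("escalate", {"collect", "review_a"}))   # True
```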

  5. Using Mobile Agents to Implement Workflow System

    Institute of Scientific and Technical Information of China (English)

    LI Jie; LIU Xian-xing; GUO Zheng-wei

    2004-01-01

    Current workflow management systems usually adopt existing technologies, such as TCP/IP-based Web technologies and CORBA, to handle the underlying communications. Very often this has been considered only from a theoretical point of view, mainly because of the lack of concrete possibilities to execute with elasticity. MAT (Mobile Agent Technology) represents a very attractive approach to the distributed control of computer networks and a valid alternative for implementing workflow system strategies. This paper mainly focuses on improving the performance of workflow systems by using MAT. First, the performance of workflow systems based on both CORBA and mobile agents is summarized and analyzed; second, a performance comparison is presented by introducing the mathematical model of each kind of data interaction process. Last, a mobile agent-based workflow system named MAWMS is presented and described in detail.

  6. Dynamic Reusable Workflows for Ocean Science

    Directory of Open Access Journals (Sweden)

    Richard P. Signell

    2016-10-01

    Full Text Available Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog searches and data access now make it possible to create catalog-driven workflows that automate—end-to-end—data search, analysis, and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused, and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enter the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased

  7. Dynamic reusable workflows for ocean science

    Science.gov (United States)

    Signell, Richard; Fernandez, Filipe; Wilcox, Kyle

    2016-01-01

    Digital catalogs of ocean data have been available for decades, but advances in standardized services and software for catalog search and data access now make it possible to create catalog-driven workflows that automate — end-to-end — data search, analysis and visualization of data from multiple distributed sources. Further, these workflows may be shared, reused and adapted with ease. Here we describe a workflow developed within the US Integrated Ocean Observing System (IOOS) which automates the skill-assessment of water temperature forecasts from multiple ocean forecast models, allowing improved forecast products to be delivered for an open water swim event. A series of Jupyter Notebooks are used to capture and document the end-to-end workflow using a collection of Python tools that facilitate working with standardized catalog and data services. The workflow first searches a catalog of metadata using the Open Geospatial Consortium (OGC) Catalog Service for the Web (CSW), then accesses data service endpoints found in the metadata records using the OGC Sensor Observation Service (SOS) for in situ sensor data and OPeNDAP services for remotely-sensed and model data. Skill metrics are computed and time series comparisons of forecast model and observed data are displayed interactively, leveraging the capabilities of modern web browsers. The resulting workflow not only solves a challenging specific problem, but highlights the benefits of dynamic, reusable workflows in general. These workflows adapt as new data enters the data system, facilitate reproducible science, provide templates from which new scientific workflows can be developed, and encourage data providers to use standardized services. As applied to the ocean swim event, the workflow exposed problems with two of the ocean forecast products which led to improved regional forecasts once errors were corrected. While the example is specific, the approach is general, and we hope to see increased use of dynamic
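
    As a hedged illustration of the catalog-search step described in both versions of this record, the sketch below assumes the OWSLib package, a common Python client for OGC services; the CSW endpoint URL and the search phrase are placeholders, not the services or queries used in the published IOOS notebooks.

        # Sketch of a catalog-driven search with OWSLib. Endpoint and search term are
        # hypothetical; downstream steps would open the SOS or OPeNDAP endpoints found
        # in the returned metadata records.
        from owslib.csw import CatalogueServiceWeb
        from owslib.fes import PropertyIsLike

        csw = CatalogueServiceWeb("https://example.org/csw")  # placeholder endpoint
        query = PropertyIsLike("apiso:AnyText", "%sea_water_temperature%")
        csw.getrecords2(constraints=[query], maxrecords=10, esn="full")

        for record_id, record in csw.records.items():
            # Each record carries a title and data-access references (e.g. SOS/OPeNDAP URLs).
            print(record_id, record.title)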

  8. Simultaneous determination of phenolic compounds in Equisetum palustre L. by ultra high performance liquid chromatography with tandem mass spectrometry combined with matrix solid-phase dispersion extraction.

    Science.gov (United States)

    Wei, Zuofu; Pan, Youzhi; Li, Lu; Huang, Yuyang; Qi, Xiaolin; Luo, Meng; Zu, Yuangang; Fu, Yujie

    2014-11-01

    A method based on matrix solid-phase dispersion extraction followed by ultra high performance liquid chromatography with tandem mass spectrometry is presented for the extraction and determination of phenolic compounds in Equisetum palustre. This method combines the high efficiency of matrix solid-phase dispersion extraction and the rapidity, sensitivity, and accuracy of ultra high performance liquid chromatography with tandem mass spectrometry. The influential parameters of the matrix solid-phase dispersion extraction were investigated and optimized. The optimized conditions were as follows: silica gel was selected as dispersing sorbent, the ratio of silica gel to sample was selected to be 2:1 (400/200 mg), and 8 mL of 80% methanol was used as elution solvent. Furthermore, a fast and sensitive ultra high performance liquid chromatography with tandem mass spectrometry method was developed for the determination of nine phenolic compounds in E. palustre. This method was carried out within <6 min, and exhibited satisfactory linearity, precision, and recovery. Compared with ultrasound-assisted extraction, the proposed matrix solid-phase dispersion procedure possessed higher extraction efficiency, and was more convenient and time saving with reduced requirements on sample and solvent amounts. All these results suggest that the developed method represents an excellent alternative for the extraction and determination of active components in plant matrices. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Precise Temporal Profiling of Signaling Complexes in Primary Cells Using SWATH Mass Spectrometry

    Directory of Open Access Journals (Sweden)

    Etienne Caron

    2017-03-01

    Full Text Available Spatiotemporal organization of protein interactions in cell signaling is a fundamental process that drives cellular functions. Given differential protein expression across tissues and developmental stages, the architecture and dynamics of signaling interaction proteomes are likely highly context dependent. However, current interaction information has been almost exclusively obtained from transformed cells. In this study, we applied an advanced and robust workflow combining mouse genetics and affinity purification (AP)-SWATH mass spectrometry to profile the dynamics of 53 high-confidence protein interactions in primary T cells, using the scaffold protein GRB2 as a model. The workflow also provided a sufficient level of robustness to pinpoint differential interaction dynamics between two similar, but functionally distinct, primary T cell populations. Altogether, we demonstrated that precise and reproducible quantitative measurements of protein interaction dynamics can be achieved in primary cells isolated from mammalian tissues, allowing resolution of the tissue-specific context of cell-signaling events.

  10. Investigation of Figopitant and Its Metabolites in Rat Tissue by Combining Whole-Body Autoradiography with Liquid Extraction Surface Analysis Mass Spectrometry

    DEFF Research Database (Denmark)

    Schadt, S.; Kallbach, S.; Almeida, R.

    2012-01-01

    This article describes the combination of whole-body autoradiography with liquid extraction surface analysis (LESA) and mass spectrometry (MS) to study the distribution of the tachykinin neurokinin-1 antagonist figopitant and its metabolites in tissue sections of rats after intravenous... ...tissue extraction, sample cleanup, and high-performance liquid chromatography analysis. The parent drug and the N-dealkylated metabolite M474(1) (BIIF 1148) in varying ratios were the predominant compounds in all tissues investigated. In addition, several metabolites formed by oxygenation, dealkylation...

  11. ADVANCED APPROACH TO PRODUCTION WORKFLOW COMPOSITION ON ENGINEERING KNOWLEDGE PORTALS

    OpenAIRE

    Novogrudska, Rina; Kot, Tatyana; Globa, Larisa; Schill, Alexander

    2016-01-01

    Background. Engineering knowledge portals host a great number of partial workflows. Such partial workflows are composed into a general workflow in order to perform a real, complex production task. The characteristics of partial workflows and the structure of the general workflow have not been studied sufficiently, which makes dynamic composition of the general production workflow impossible. Objective. Creating an approach to the dynamic composition of the general production workflow based on the partial wor...

  12. Implementation of a Workflow Management System for Non-Expert Users

    DEFF Research Database (Denmark)

    Jongejan, Bart

    2016-01-01

    tools, the CLARIN-DK workflow management system (WMS) computes combinations of tools that will give the desired result. This advanced functionality was originally not envisaged, but came within reach by writing the WMS partly in Java and partly in a programming language for symbolic computation, Bracmat....... Handling LT tool profiles, including the computation of workflows, is easier with Bracmat's language constructs for tree pattern matching and tree construction than with the language constructs offered by mainstream programming languages....

  13. Immunoaffinity chromatography combined with tandem mass spectrometry: A new tool for the selective capture and analysis of brassinosteroid plant hormones

    Czech Academy of Sciences Publication Activity Database

    Oklešťková, Jana; Tarkowská, Danuše; Eyer, L.; Elbert, Tomáš; Marek, Aleš; Smržová, Z.; Novák, Ondřej; Fránek, M.; Zhabinskii, V.N.; Strnad, Miroslav

    2017-01-01

    Roč. 170, AUG 1 (2017), s. 432-440 ISSN 0039-9140 R&D Projects: GA MŠk(CZ) LO1204; GA ČR GA14-34792S; GA ČR GJ15-08202Y Institutional support: RVO:61389030 ; RVO:61388963 Keywords : Brassica napus * Brassinosteroids * Enzyme immunoassay * Immunoaffinity chromatography * Liquid chromatography-tandem mass spectrometry * Monoclonal antibodies Subject RIV: EB - Genetics ; Molecular Biology OBOR OECD: Analytical chemistry; Biochemical research methods (UOCHB-X) Impact factor: 4.162, year: 2016

  14. Screening anti-tumor compounds from Ligusticum wallichii using cell membrane chromatography combined with high-performance liquid chromatography and mass spectrometry.

    Science.gov (United States)

    Zhang, Tao; Ding, Yuanyuan; An, Hongli; Feng, Liuxin; Wang, Sicen

    2015-07-14

    Tyrosine 367 Cysteine-fibroblast growth factor receptor 4 cell membrane chromatography combined with high-performance liquid chromatography and mass spectrometry was developed. Tyrosine 367 Cysteine-HEK293 cells were used as the cell membrane stationary phase. The specificity and reproducibility of the cell membrane chromatography were evaluated using 1-tert-butyl-3-{2-[4-(diethylamino)butylamino]-6-(3,5-dimethoxyphenyl)pyrido[2,3-d]pyrimidin-7-yl}urea, nimodipine and dexamethasone acetate. Then, anti-tumor components acting on Tyrosine 367 Cysteine-fibroblast growth factor receptor 4 were screened and identified from extracts of Ligusticum wallichii. Components from the extract were retained on the cell membrane chromatographic column. The retained fraction was directly eluted into the high-performance liquid chromatography with mass spectrometry system for separation and identification. Finally, Levistolide A was identified as an active component of Ligusticum wallichii extracts. The 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyl-tetrazolium bromide-formazan colorimetric assay revealed that Levistolide A inhibits the proliferation of cells overexpressing the mutated receptor in a dose-dependent manner. Phosphorylation of fibroblast growth factor receptor 4 also decreased under Levistolide A treatment. Flex dock simulation verified that Levistolide A could bind to the tyrosine kinase domain of fibroblast growth factor receptor 4. Therefore, Levistolide A, screened by cell membrane chromatography combined with high-performance liquid chromatography and mass spectrometry, can arrest cell growth. In conclusion, the two-dimensional high-performance liquid chromatography method can screen and identify potential anti-tumor ingredients that specifically act on the tyrosine kinase domain of the mutated fibroblast growth factor receptor 4. This article is protected by copyright. All rights reserved.

  15. Combined Mass Spectrometry Imaging and Top-down Microproteomics Reveals Evidence of a Hidden Proteome in Ovarian Cancer.

    Science.gov (United States)

    Delcourt, Vivian; Franck, Julien; Leblanc, Eric; Narducci, Fabrice; Robin, Yves-Marie; Gimeno, Jean-Pascal; Quanico, Jusal; Wisztorski, Maxence; Kobeissy, Firas; Jacques, Jean-François; Roucou, Xavier; Salzet, Michel; Fournier, Isabelle

    2017-07-01

    Recently, it was demonstrated that proteins can be translated from alternative open reading frames (altORFs), increasing the size of the actual proteome. Top-down mass spectrometry-based proteomics allows the identification of intact proteins containing post-translational modifications (PTMs) as well as truncated forms translated from reference ORFs or altORFs. Top-down tissue microproteomics was applied on benign, tumor and necrotic-fibrotic regions of serous ovarian cancer biopsies, identifying proteins exhibiting region-specific cellular localization and PTMs. The regions of interest (ROIs) were determined by MALDI mass spectrometry imaging and spatial segmentation. Analysis with a customized protein sequence database containing reference and alternative proteins (altprots) identified 15 altprots, including alternative G protein nucleolar 1 (AltGNL1) found in the tumor, and translated from an altORF nested within the GNL1 canonical coding sequence. Co-expression of GNL1 and altGNL1 was validated by transfection in HEK293 and HeLa cells with an expression plasmid containing a GNL1-FLAG (V5) construct. Western blot and immunofluorescence experiments confirmed constitutive co-expression of altGNL1-V5 with GNL1-FLAG. Taken together, our approach provides means to evaluate protein changes in the case of serous ovarian cancer, allowing the detection of potential markers that have never been considered. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.

  16. Workflow Management in CLARIN-DK

    DEFF Research Database (Denmark)

    Jongejan, Bart

    2013-01-01

    The CLARIN-DK infrastructure is not only a repository of resources, but also a place where users can analyse, annotate, reformat and potentially even translate resources, using tools that are integrated in the infrastructure as web services. In many cases a single tool does not produce the desired...... with the features that describe her goal, because the workflow manager not only executes chains of tools in a workflow, but also takes care of autonomously devising workflows that serve the user’s intention, given the tools that currently are integrated in the infrastructure as web services. To do this...

  17. The Diabetic Retinopathy Screening Workflow

    Science.gov (United States)

    Bolster, Nigel M.; Giardini, Mario E.; Bastawrous, Andrew

    2015-01-01

    Complications of diabetes mellitus, namely diabetic retinopathy and diabetic maculopathy, are the leading cause of blindness in working-aged people. Sufferers can avoid blindness if identified early via retinal imaging. Systematic screening of the diabetic population has been shown to greatly reduce the prevalence and incidence of blindness within the population. Many national screening programs have digital fundus photography as their basis. In the past 5 years several techniques and adapters have been developed that allow digital fundus photography to be performed using smartphones. We review recent progress in smartphone-based fundus imaging and discuss its potential for integration into national systematic diabetic retinopathy screening programs. Some systems have produced promising initial results with respect to their agreement with reference standards. However, further multisite trialling of such systems’ use within implementable screening workflows is required if an evidence base strong enough to affect policy change is to be established. If this were to occur, national diabetic retinopathy screening would, for the first time, become possible in low- and middle-income settings where cost and availability of trained eye care personnel are currently key barriers to implementation. As diabetes prevalence and incidence are increasing sharply in these settings, the impact on global blindness could be profound. PMID:26596630

  18. Workflow Optimization in Vertebrobasilar Occlusion

    International Nuclear Information System (INIS)

    Kamper, Lars; Meyn, Hannes; Rybacki, Konrad; Nordmeyer, Simone; Kempkes, Udo; Piroth, Werner; Isenmann, Stefan; Haage, Patrick

    2012-01-01

    Objective: In vertebrobasilar occlusion, rapid recanalization is the only substantial means to improve the prognosis. We introduced a standard operating procedure (SOP) for interventional therapy to analyze the effects on interdisciplinary time management. Methods: Intrahospital time periods between hospital admission and neuroradiological intervention were retrospectively analyzed, together with the patients’ outcome, before (n = 18) and after (n = 20) implementation of the SOP. Results: After implementation of the SOP, we observed statistically significant improvement of postinterventional patient neurological status (p = 0.017). In addition, we found a decrease of 5:33 h for the mean time period from hospital admission until neuroradiological intervention. The recanalization rate increased from 72.2% to 80% after implementation of the SOP. Conclusion: Our results underscore the relevance of SOP implementation and analysis of time management for clinical workflow optimization. Both may trigger awareness for the need of efficient interdisciplinary time management. This could be an explanation for the decreased time periods and improved postinterventional patient status after SOP implementation.

  19. A comparative study of three tissue-cultured Dendrobium species and their wild correspondences by headspace gas chromatography-mass spectrometry combined with chemometric methods.

    Science.gov (United States)

    Chen, Nai-Dong; You, Tao; Li, Jun; Bai, Li-Tao; Hao, Jing-Wen; Xu, Xiao-Yuan

    2016-10-01

    Plant tissue culture techniques are widely used in the conservation and utilization of rare and endangered medicinal plants, and it is crucial for tissue-culture stocks to retain the ability to produce bioactive components similar to those of their wild counterparts. In this paper, a headspace gas chromatography-mass spectrometry method combined with chemometric methods was applied to analyze and evaluate the volatile compounds in tissue-cultured and wild Dendrobium huoshanense Cheng and Tang, Dendrobium officinale Kimura et Migo and Dendrobium moniliforme (Linn.) Sw. In total, 63 volatile compounds were separated, with 53 being identified from the three Dendrobium spp. Dendrobiums of different provenances contained characteristic chemicals and showed remarkable quantitative discrepancies in their common constituents. The similarity evaluation disclosed that the accumulation of volatile compounds in Dendrobium samples might be affected by their provenance. Principal component analysis showed that the first three components explained 85.9% of the data variance, demonstrating a good discrimination between samples. Gas chromatography-mass spectrometry, combined with chemometrics, might be an effective strategy for identifying the species and their provenance, especially in the assessment of tissue-cultured Dendrobium quality for use in raw herbal medicines. Copyright © 2016. Published by Elsevier B.V.

  20. a Standardized Approach to Topographic Data Processing and Workflow Management

    Science.gov (United States)

    Wheaton, J. M.; Bailey, P.; Glenn, N. F.; Hensleigh, J.; Hudak, A. T.; Shrestha, R.; Spaete, L.

    2013-12-01

    An ever-increasing list of options exists for collecting high-resolution topographic data, including airborne LIDAR, terrestrial laser scanners, bathymetric SONAR and structure-from-motion. An equally rich, arguably overwhelming, variety of tools exists with which to organize, quality control, filter, analyze and summarize these data. However, scientists are often left to cobble together their analysis as a series of ad hoc steps, often using custom scripts and one-time processes that are poorly documented and rarely shared with the community. Even when literature-cited software tools are used, the input and output parameters differ from tool to tool. These parameters are rarely archived and the steps performed are lost, making the analysis virtually impossible to replicate precisely. What is missing is a coherent, robust framework for combining reliable, well-documented topographic data-processing steps into a workflow that can be repeated and even shared with others. We have taken several popular topographic data-processing tools - including point cloud filtering and decimation as well as DEM differencing - and defined a common protocol for passing inputs and outputs between them. This presentation describes a free, public online portal that enables scientists to create custom workflows for processing topographic data using a number of popular topographic processing tools. Users provide the inputs required for each tool and the sequence in which they want to combine them. This information is then stored for future reuse (and optionally shared with others) before the user downloads a single package that contains all the input and output specifications together with the software tools themselves. The user then launches the included batch file that executes the workflow on their local computer against their topographic data. This ZCloudTools architecture helps standardize, automate and archive topographic data processing. It also represents a forum for discovering and
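
    A minimal sketch of the underlying idea, archiving an ordered list of tool invocations so that a processing workflow can be re-run or shared, is given below; the tool names and parameters are invented for illustration and are not the actual ZCloudTools components.

        # Illustrative only: store the exact sequence of processing steps as JSON so the
        # workflow can be archived, shared and replayed. Tool names are placeholders.
        import json

        workflow = [
            {"tool": "filter_points", "args": ["--min-elevation", "0"]},
            {"tool": "decimate", "args": ["--grid-size", "0.5"]},
            {"tool": "dem_difference", "args": ["--reference", "dem_2012.tif"]},
        ]

        with open("workflow.json", "w") as fh:
            json.dump(workflow, fh, indent=2)   # archive the steps and their parameters

        for step in json.load(open("workflow.json")):
            command = [step["tool"], *step["args"]]
            # Replace the print with subprocess.run(command, check=True) once the
            # placeholder tools exist on the local machine.
            print("would run:", " ".join(command))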

  1. From the desktop to the grid: scalable bioinformatics via workflow conversion.

    Science.gov (United States)

    de la Garza, Luis; Veit, Johannes; Szolek, Andras; Röttig, Marc; Aiche, Stephan; Gesing, Sandra; Reinert, Knut; Kohlbacher, Oliver

    2016-03-12

    Reproducibility is one of the tenets of the scientific method. Scientific experiments often comprise complex data flows, selection of adequate parameters, and analysis and visualization of intermediate and end results. Breaking down the complexity of such experiments into the joint collaboration of small, repeatable, well-defined tasks, each with well-defined inputs, parameters, and outputs, offers immediate benefits such as identifying bottlenecks and pinpointing sections that could benefit from parallelization. Workflows rest upon the notion of splitting complex work into the joint effort of several manageable tasks. There are several engines that give users the ability to design and execute workflows. Each engine was created to address certain problems of a specific community, therefore each one has its advantages and shortcomings. Furthermore, not all features of all workflow engines are royalty-free - an aspect that could potentially drive away members of the scientific community. We have developed a set of tools that enables the scientific community to benefit from workflow interoperability. We developed a platform-free structured representation of the parameters, inputs, and outputs of command-line tools in so-called Common Tool Descriptor documents. We have also overcome the shortcomings and combined the features of two royalty-free workflow engines with a substantial user community: the Konstanz Information Miner, an engine which we see as a formidable workflow editor, and the Grid and User Support Environment, a web-based framework able to interact with several high-performance computing resources. We have thus created a free and highly accessible way to design workflows on a desktop computer and execute them on high-performance computing resources. Our work will not only reduce time spent on designing scientific workflows, but also make executing workflows on remote high-performance computing resources more accessible to technically inexperienced users. We

  2. Support for Taverna workflows in the VPH-Share cloud platform.

    Science.gov (United States)

    Kasztelnik, Marek; Coto, Ernesto; Bubak, Marian; Malawski, Maciej; Nowakowski, Piotr; Arenas, Juan; Saglimbeni, Alfredo; Testi, Debora; Frangi, Alejandro F

    2017-07-01

    To address the increasing need for collaborative endeavours within the Virtual Physiological Human (VPH) community, the VPH-Share collaborative cloud platform allows researchers to expose and share sequences of complex biomedical processing tasks in the form of computational workflows. The Taverna Workflow System is a very popular tool for orchestrating complex biomedical & bioinformatics processing tasks in the VPH community. This paper describes the VPH-Share components that support the building and execution of Taverna workflows, and explains how they interact with other VPH-Share components to improve the capabilities of the VPH-Share platform. Taverna workflow support is delivered by the Atmosphere cloud management platform and the VPH-Share Taverna plugin. These components are explained in detail, along with the two main procedures that were developed to enable this seamless integration: workflow composition and execution. The outcomes are: 1) seamless integration of VPH-Share with other components and systems; 2) an extended range of different tools for workflows; 3) successful integration of scientific workflows from other VPH projects; and 4) execution speed improvement for medical applications. The presented workflow integration provides VPH-Share users with a wide range of different possibilities to compose and execute workflows, such as desktop or online composition, online batch execution, multithreading, remote execution, etc. The specific advantages of each supported tool are presented, as are the roles of Atmosphere and the VPH-Share plugin within the VPH-Share project. The combination of the VPH-Share plugin and Atmosphere endows the VPH-Share infrastructure with far more flexible, powerful and usable capabilities for the VPH-Share community. As both components can continue to evolve and improve independently, we acknowledge that further improvements are still to be developed and will be described. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. Structuring research methods and data with the research object model: genomics workflows as a case study.

    Science.gov (United States)

    Hettne, Kristina M; Dharuri, Harish; Zhao, Jun; Wolstencroft, Katherine; Belhajjame, Khalid; Soiland-Reyes, Stian; Mina, Eleni; Thompson, Mark; Cruickshank, Don; Verdes-Montenegro, Lourdes; Garrido, Julian; de Roure, David; Corcho, Oscar; Klyne, Graham; van Schouwen, Reinout; 't Hoen, Peter A C; Bechhofer, Sean; Goble, Carole; Roos, Marco

    2014-01-01

    One of the main challenges for biomedical research lies in the computer-assisted integrative study of large and increasingly complex combinations of data in order to understand molecular mechanisms. The preservation of the materials and methods of such computational experiments with clear annotations is essential for understanding an experiment, and this is increasingly recognized in the bioinformatics community. Our assumption is that offering means of digital, structured aggregation and annotation of the objects of an experiment will provide necessary meta-data for a scientist to understand and recreate the results of an experiment. To support this, we explored a model for the semantic description of a workflow-centric Research Object (RO), where an RO is defined as a resource that aggregates other resources, e.g., datasets, software, spreadsheets, text, etc. We applied this model to a case study where we analysed human metabolite variation by workflows. We present the application of the workflow-centric RO model for our bioinformatics case study. Three workflows were produced following recently defined Best Practices for workflow design. By modelling the experiment as an RO, we were able to automatically query the experiment and answer questions such as "which particular data was input to a particular workflow to test a particular hypothesis?", and "which particular conclusions were drawn from a particular workflow?". Applying a workflow-centric RO model to aggregate and annotate the resources used in a bioinformatics experiment allowed us to retrieve the conclusions of the experiment in the context of the driving hypothesis, the executed workflows and their input data. The RO model is an extendable reference model that can be used by other systems as well. The Research Object is available at http://www.myexperiment.org/packs/428. The Wf4Ever Research Object Model is available at http://wf4ever.github.io/ro.
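
    To illustrate the kind of question quoted above, the sketch below loads a hypothetical RO provenance graph with rdflib and asks which inputs each workflow run used; the file name and the reliance on the prov/wfprov vocabularies are assumptions made for the example, not a specification of the RO serialization.

        # Sketch of querying workflow-run provenance in an RO with rdflib and SPARQL.
        # "ro/provenance.ttl" is a hypothetical file; the vocabularies are assumed.
        import rdflib

        g = rdflib.Graph()
        g.parse("ro/provenance.ttl", format="turtle")

        QUERY = """
        PREFIX prov:   <http://www.w3.org/ns/prov#>
        PREFIX wfprov: <http://purl.org/wf4ever/wfprov#>
        SELECT ?run ?input WHERE {
            ?run a wfprov:WorkflowRun ;
                 prov:used ?input .
        }
        """
        for run, data_input in g.query(QUERY):
            print(run, "used", data_input)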

  4. RABIX: AN OPEN-SOURCE WORKFLOW EXECUTOR SUPPORTING RECOMPUTABILITY AND INTEROPERABILITY OF WORKFLOW DESCRIPTIONS.

    Science.gov (United States)

    Kaushik, Gaurav; Ivkovic, Sinisa; Simonovic, Janko; Tijanic, Nebojsa; Davis-Dusenbery, Brandi; Kural, Deniz

    2017-01-01

    As biomedical data has become increasingly easy to generate in large quantities, the methods used to analyze it have proliferated rapidly. Reproducible and reusable methods are required to learn from large volumes of data reliably. To address this issue, numerous groups have developed workflow specifications or execution engines, which provide a framework with which to perform a sequence of analyses. One such specification is the Common Workflow Language, an emerging standard which provides a robust and flexible framework for describing data analysis tools and workflows. In addition, reproducibility can be furthered by executors or workflow engines which interpret the specification and enable additional features, such as error logging, file organization, optimizations to computation and job scheduling, and allow for easy computing on large volumes of data. To this end, we have developed the Rabix Executor, an open-source workflow engine for the purposes of improving reproducibility through reusability and interoperability of workflow descriptions.

  5. Stable isotope dilution quantification of mutagens in cooked foods by combined liquid chromatography-thermospray mass spectrometry

    International Nuclear Information System (INIS)

    Yamaizumi, Ziro; Kasai, Hiroshi; Nishimura, Susumu; Edmonds, C.G.; McCloskey, J.A.

    1986-01-01

    A method of general applicability for the detection and quantification of mutagens in cooked foods at the ppb level is presented. A minimal sample prefractionation is employed and [Me-²H₃]-labeled analogs of the compounds of interest are added for identification and quantification of mutagens by accurate measurement of chromatographic retention (K') in reverse-phase high-performance liquid chromatography (HPLC), and by measurement of the ratio of response of the protonated molecular ions of analyte and internal standard by directly coupled liquid chromatography-mass spectrometry (LC/MS). Initial application is demonstrated in the analysis of 2-amino-3-methylimidazo[4,5-f]quinoline (IQ) and 2-amino-3,4-dimethylimidazo[4,5-f]quinoline (MeIQ) in broiled salmon. (Auth.)

  6. Solid phase microextraction capillary gas chromatography combined with furnace atomization plasma emission spectrometry for speciation of mercury in fish tissues

    International Nuclear Information System (INIS)

    Grinberg, Patricia; Campos, Reinaldo C.; Mester, Zoltan; Sturgeon, Ralph E.

    2003-01-01

    The use of solid phase microextraction in tandem with gas chromatography-furnace atomization plasma emission spectrometry (SPME-GC-FAPES) was evaluated for the determination of methylmercury and inorganic mercury in fish tissue. Samples were digested with methanolic potassium hydroxide, derivatized with sodium tetraethylborate and extracted by SPME. After the SPME extraction, species were separated by GC and detected by FAPES. All experimental parameters were optimized for best separation and analytical response. A repeatability precision of typically 2% can be achieved, with long-term (3 months) reproducibility precision of 4.3%. Certified Reference Materials DORM-2, DOLT-2 and TORT-2 from the National Research Council of Canada were analyzed to verify the accuracy of this technique. Detection limits of 1.5 ng g⁻¹ for methylmercury and 0.7 ng g⁻¹ for inorganic mercury in biological tissues were obtained.

  7. A combined quantitative mass spectrometry and electron microscopy analysis of ribosomal 30S subunit assembly in E. coli.

    Science.gov (United States)

    Sashital, Dipali G; Greeman, Candacia A; Lyumkis, Dmitry; Potter, Clinton S; Carragher, Bridget; Williamson, James R

    2014-10-14

    Ribosome assembly is a complex process involving the folding and processing of ribosomal RNAs (rRNAs), concomitant binding of ribosomal proteins (r-proteins), and participation of numerous accessory cofactors. Here, we use a quantitative mass spectrometry/electron microscopy hybrid approach to determine the r-protein composition and conformation of 30S ribosome assembly intermediates in Escherichia coli. The relative timing of assembly of the 3' domain and the formation of the central pseudoknot (PK) structure depends on the presence of the assembly factor RimP. The central PK is unstable in the absence of RimP, resulting in the accumulation of intermediates in which the 3'-domain is unanchored and the 5'-domain is depleted for r-proteins S5 and S12 that contact the central PK. Our results reveal the importance of the cofactor RimP in central PK formation, and introduce a broadly applicable method for characterizing macromolecular assembly in cells.

  8. Combined use of atomic force microscopy, X-ray photoelectron spectroscopy, and secondary ion mass spectrometry for cell surface analysis.

    Science.gov (United States)

    Dague, Etienne; Delcorte, Arnaud; Latgé, Jean-Paul; Dufrêne, Yves F

    2008-04-01

    Understanding the surface properties of microbial cells is a major challenge of current microbiological research and a key to efficiently exploit them in biotechnology. Here, we used three advanced surface analysis techniques with different sensitivity, probing depth, and lateral resolution, that is, in situ atomic force microscopy, X-ray photoelectron spectroscopy, and secondary ion mass spectrometry, to gain insight into the surface properties of the conidia of the human fungal pathogen Aspergillus fumigatus. We show that the native ultrastructure, surface protein and polysaccharide concentrations, and amino acid composition of three mutants affected in hydrophobin production are markedly different from those of the wild-type, thereby providing novel insight into the cell wall architecture of A. fumigatus. The results demonstrate the power of using multiple complementary techniques for probing microbial cell surfaces.

  9. Workflow Based Software Development Environment, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed research is to investigate and develop a workflow based tool, the Software Developers Assistant, to facilitate the collaboration between...

  10. COSMOS: Python library for massively parallel workflows.

    Science.gov (United States)

    Gafni, Erik; Luquette, Lovelace J; Lancaster, Alex K; Hawkins, Jared B; Jung, Jae-Yoon; Souilmi, Yassine; Wall, Dennis P; Tonellato, Peter J

    2014-10-15

    Efficient workflows to shepherd clinically generated genomic data through the multiple stages of a next-generation sequencing pipeline are of critical importance in translational biomedical science. Here we present COSMOS, a Python library for workflow management that allows formal description of pipelines and partitioning of jobs. In addition, it includes a user interface for tracking the progress of jobs, abstraction of the queuing system and fine-grained control over the workflow. Workflows can be created on traditional computing clusters as well as cloud-based services. Source code is available for academic non-commercial research purposes. Links to code and documentation are provided at http://lpm.hms.harvard.edu and http://wall-lab.stanford.edu. dpwall@stanford.edu or peter_tonellato@hms.harvard.edu. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press.
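
    The sketch below is not the COSMOS API; it only illustrates, with the Python standard library, the core notion such libraries formalise: a pipeline declared as jobs plus dependencies and executed in dependency order.

        # Generic pipeline-as-DAG sketch (placeholder echo commands, not real tools).
        from graphlib import TopologicalSorter  # Python 3.9+
        import subprocess

        jobs = {
            "align":     {"cmd": "echo align sample.fastq", "needs": []},
            "sort":      {"cmd": "echo sort aligned.bam",   "needs": ["align"]},
            "call_snps": {"cmd": "echo call variants",      "needs": ["sort"]},
        }

        order = TopologicalSorter({name: spec["needs"] for name, spec in jobs.items()})
        for name in order.static_order():
            print(f"running {name}")
            subprocess.run(jobs[name]["cmd"], shell=True, check=True)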

  11. Implementing Workflow Reconfiguration in WS-BPEL

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Dragoni, Nicola; Zhou, Mu

    2012-01-01

    This paper investigates the problem of dynamic reconfiguration by means of a workflow-based case study used for discussion. We state the requirements on a system implementing the workflow and its reconfiguration, and we describe the system’s design in BPMN. WS-BPEL, a language that would not naturally support dynamic change, is used as a target for implementation. The WS-BPEL recovery framework is here exploited to implement the reconfiguration using principles derived from previous research in process algebra, and two mappings from BPMN to WS-BPEL are presented, one automatic and only mostly...

  12. Logical provenance in data-oriented workflows?

    KAUST Repository

    Ikeda, R.; Das Sarma, Akash; Widom, J.

    2013-01-01

    for general transformations, introducing the notions of correctness, precision, and minimality. We then determine when properties such as correctness and minimality carry over from the individual transformations' provenance to the workflow provenance. We

  13. A Multilevel Secure Workflow Management System

    National Research Council Canada - National Science Library

    Kang, Myong H; Froscher, Judith N; Sheth, Amit P; Kochut, Krys J; Miller, John A

    1999-01-01

    The Department of Defense (DoD) needs multilevel secure (MLS) workflow management systems to enable globally distributed users and applications to cooperate across classification levels to achieve mission critical goals...

  14. Workflow Based Software Development Environment, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this proposed research is to investigate and develop a workflow based tool, the Software Developers Assistant, to facilitate the collaboration between...

  15. Liquid electrode plasma-optical emission spectrometry combined with solid-phase preconcentration for on-site analysis of lead.

    Science.gov (United States)

    Barua, Suman; Rahman, Ismail M M; Alam, Iftakharul; Miyaguchi, Maho; Sawai, Hikaru; Maki, Teruya; Hasegawa, Hiroshi

    2017-08-15

    A relatively rapid and precise method is presented for the determination of lead in aqueous matrix. The method consists of analyte quantitation using the liquid electrode plasma-optical emission spectrometry (LEP-OES) coupled with selective separation/preconcentration by solid-phase extraction (SPE). The impact of operating variables on the retention of lead in SPEs such as pH, flow rate of the sample solution; type, volume, flow rate of the eluent; and matrix effects were investigated. Selective SPE-separation/preconcentration minimized the interfering effect due to manganese in solution and limitations in lead-detection in low-concentration samples by LEP-OES. The LEP-OES operating parameters such as the electrical conductivity of sample solution; applied voltage; on-time, off-time, pulse count for applied voltage; number of measurements; and matrix effects have also been optimized to obtain a distinct peak for the lead at λmax = 405.8 nm. The limit of detection (3σ) and the limit of quantification (10σ) for lead determination using the technique were found as 1.9 and 6.5 ng mL⁻¹, respectively. The precision, as relative standard deviation, was lower than 5% at 0.1 μg mL⁻¹ Pb, and the preconcentration factor was found to be 187. The proposed method was applied to the analysis of lead contents in the natural aqueous matrix (recovery rate: >95%). The method accuracy was verified using certified reference material of wastewaters: SPS-WW1 and ERM-CA713. The results from LEP-OES were in good agreement with inductively coupled plasma optical emission spectrometry measurements of the same samples. The application of the method is rapid (≤5 min, without preconcentration) with a reliable detection limit at trace levels. Copyright © 2017 Elsevier B.V. All rights reserved.
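
    For readers unfamiliar with the 3σ/10σ convention cited above, the short sketch below computes a detection and a quantification limit from blank-signal noise and a calibration slope; all numbers are invented and unrelated to the values reported in the paper.

        # Illustration of the 3-sigma / 10-sigma convention for LOD and LOQ.
        import statistics

        blank_signals = [0.0102, 0.0098, 0.0105, 0.0101, 0.0097, 0.0103, 0.0100]
        sigma = statistics.stdev(blank_signals)   # noise of the blank
        slope = 0.0052                            # calibration slope, signal per (ng/mL)

        lod = 3 * sigma / slope    # limit of detection
        loq = 10 * sigma / slope   # limit of quantification
        print(f"LOD = {lod:.2f} ng/mL, LOQ = {loq:.2f} ng/mL")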

  16. Integrated workflows for spiking neuronal network simulations

    Directory of Open Access Journals (Sweden)

    Ján eAntolík

    2013-12-01

    Full Text Available The increasing availability of computational resources is enabling more detailed, realistic modelling in computational neuroscience, resulting in a shift towards more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeller's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modellers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualisation into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organised configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualisation stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort. Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modelling studies by relieving the user from manual handling of the flow of metadata between the individual

  17. Multilevel Workflow System in the ATLAS Experiment

    CERN Document Server

    Borodin, M; The ATLAS collaboration; Golubkov, D; Klimentov, A; Maeno, T; Vaniachine, A

    2015-01-01

    The ATLAS experiment is scaling up Big Data processing for the next LHC run using a multilevel workflow system comprised of many layers. In Big Data processing ATLAS deals with datasets, not individual files. Similarly a task (comprised of many jobs) has become a unit of the ATLAS workflow in distributed computing, with about 0.8M tasks processed per year. In order to manage the diversity of LHC physics (exceeding 35K physics samples per year), the individual data processing tasks are organized into workflows. For example, the Monte Carlo workflow is composed of many steps: generate or configure hard-processes, hadronize signal and minimum-bias (pileup) events, simulate energy deposition in the ATLAS detector, digitize electronics response, simulate triggers, reconstruct data, convert the reconstructed data into ROOT ntuples for physics analysis, etc. Outputs are merged and/or filtered as necessary to optimize the chain. The bi-level workflow manager - ProdSys2 - generates actual workflow tasks and their jobs...

  18. Haptoglobin is a serological biomarker for adenocarcinoma lung cancer by using the ProteomeLab PF2D combined with mass spectrometry.

    Science.gov (United States)

    Chang, You-Kang; Lai, Yu-Heng; Chu, Yen; Lee, Ming-Cheng; Huang, Chun-Yao; Wu, Semon

    2016-01-01

    Identification of serological biomarkers is urgently needed for cancer screening, monitoring cancer progression and treatment response, and surveillance for recurrence in lung cancer. Therefore, we tried to find new serological biomarkers with greater specificity and sensitivity for lung cancer diagnostics. In this study, the 2-D liquid phase fractionation system (PF2D) and a mass spectrometry approach were used to compare the serum profiles of lung cancer patients and healthy individuals. Eight proteins were identified from PF2D and subsequently by mass spectrometry. Among these proteins, haptoglobin (HP) and apolipoprotein AI (APOA1) were chosen and validated with a turbidimetric assay. We found that HP levels were significantly higher and APOA1 levels were significantly lower in lung cancer patients. However, after the participants were stratified by gender, the expression trends of HP and APOA1 in lung cancer patients existed only in men, a gender-specific phenomenon. HP, APOA1 and carcinoembryonic antigen (CEA), used for distinguishing lung adenocarcinoma, had sensitivities of 64%, 64% and 79%, respectively. The areas under the ROC curve (AUC) of HP, APOA1 and CEA were 0.768, 0.761 and 0.884, respectively. When restricted to male subjects, HP, APOA1 and CEA showed sensitivities of 89%, 73% and 100%, respectively, and AUCs of 0.929, 0.840 and 0.877, respectively. Therefore, our results show that the PF2D system combined with mass spectrometry is a promising novel approach to identify new serological biomarkers for lung cancer research. In addition, HP may be a potential serological biomarker for lung adenocarcinoma diagnostics, especially in male subjects.
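
    The sensitivity and AUC figures above are standard ROC quantities; the sketch below shows how such values can be computed with scikit-learn on toy data (the labels and marker values are invented, not the study's measurements).

        # Illustrative ROC analysis for a single serum marker.
        import numpy as np
        from sklearn.metrics import roc_auc_score, roc_curve

        labels = np.array([1, 1, 1, 1, 0, 0, 0, 0])                   # 1 = adenocarcinoma
        marker = np.array([2.4, 1.9, 2.8, 1.2, 0.9, 1.1, 0.7, 1.4])   # toy marker levels

        auc = roc_auc_score(labels, marker)
        fpr, tpr, thresholds = roc_curve(labels, marker)
        print(f"AUC = {auc:.3f}")
        for threshold, sensitivity in zip(thresholds, tpr):
            print(f"cut-off {threshold:.2f}: sensitivity {sensitivity:.2f}")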

  19. Formalizing an integrative, multidisciplinary cancer therapy discovery workflow

    Science.gov (United States)

    McGuire, Mary F.; Enderling, Heiko; Wallace, Dorothy I.; Batra, Jaspreet; Jordan, Marie; Kumar, Sushil; Panetta, John C.; Pasquier, Eddy

    2014-01-01

    Although many clinicians and researchers work to understand cancer, there has been limited success in effectively combining forces and collaborating across time, distance, data and budget constraints. Here we present a workflow template for multidisciplinary cancer therapy that was developed during the 2nd Annual Workshop on Cancer Systems Biology sponsored by Tufts University, Boston, MA in July 2012. The template was applied to the development of a metronomic therapy backbone for neuroblastoma. Three primary groups were identified: clinicians, biologists, and scientists (mathematicians, computer scientists, physicists and engineers). The workflow described their integrative interactions; parallel or sequential processes; data sources and computational tools at different stages; as well as the iterative nature of therapeutic development from clinical observations to in vitro, in vivo, and clinical trials. We found that theoreticians in dialog with experimentalists could develop calibrated and parameterized predictive models that inform and formalize sets of testable hypotheses, thus speeding up discovery and validation while reducing laboratory resources and costs. The developed template outlines an interdisciplinary collaboration workflow designed to systematically investigate the mechanistic underpinnings of a new therapy and validate that therapy to advance development and clinical acceptance. PMID:23955390

  20. Functionalized graphene quantum dots loaded with free radicals combined with liquid chromatography and tandem mass spectrometry to screen radical scavenging natural antioxidants from Licorice and Scutellariae.

    Science.gov (United States)

    Wang, Guoying; Niu, XiuLi; Shi, Gaofeng; Chen, Xuefu; Yao, Ruixing; Chen, Fuwen

    2014-12-01

    A novel screening method was developed for the detection and identification of radical-scavenging natural antioxidants based on a free radical reaction combined with liquid chromatography with tandem mass spectrometry. Functionalized graphene quantum dots were prepared for loading free radicals in the complex screening system. The detection was performed with and without a preliminary exposure of the samples to specific free radicals on the functionalized graphene quantum dots, which can facilitate charge transfer between free radicals and antioxidants. The difference in chromatographic peak areas was used to identify potential antioxidants. This is a novel approach to simultaneously evaluate the antioxidant power of a component versus a free radical, and to identify it in a vegetal matrix. The structures of the antioxidants in the samples were identified using tandem mass spectrometry and comparison with standards. Fourteen compounds were found to possess potential antioxidant activity, and their free radical scavenging capacities were investigated and ranked according to their free radical scavenging rates. 4',5,6,7-Tetrahydroxyflavone (radical scavenging rate: 0.05253 mL mg⁻¹ s⁻¹) showed the strongest capability for scavenging free radicals. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Assessment of a new method for the analysis of decomposition gases of polymers by a combining thermogravimetric solid-phase extraction and thermal desorption gas chromatography mass spectrometry.

    Science.gov (United States)

    Duemichen, E; Braun, U; Senz, R; Fabian, G; Sturm, H

    2014-08-08

    For analysis of the gaseous thermal decomposition products of polymers, the common techniques are thermogravimetry combined with Fourier transform infrared spectroscopy (TGA-FTIR) or with mass spectrometry (TGA-MS). These methods offer a simple approach to the decomposition mechanism, especially for small decomposition molecules. Complex spectra of gaseous mixtures are very often hard to identify because of overlapping signals. In this paper a new method is described in which the decomposition products are adsorbed under controlled conditions in TGA onto solid-phase extraction (SPE) material: twisters. Subsequently, the twisters were analysed with thermal desorption gas chromatography mass spectrometry (TDS-GC-MS), which allows the decomposition products to be separated and identified using an MS library. The thermoplastics polyamide 66 (PA 66) and polybutylene terephthalate (PBT) were used as example polymers. The influence of the sample mass and of the purge gas flow during the decomposition process was investigated in TGA. The advantages and limitations of the method are presented in comparison to the common analysis techniques, TGA-FTIR and TGA-MS. Copyright © 2014 Elsevier B.V. All rights reserved.

  2. QuEChERS Purification Combined with Ultrahigh-Performance Liquid Chromatography Tandem Mass Spectrometry for Simultaneous Quantification of 25 Mycotoxins in Cereals

    Science.gov (United States)

    Sun, Juan; Li, Weixi; Zhang, Yan; Hu, Xuexu; Wu, Li; Wang, Bujun

    2016-01-01

    A method based on the QuEChERS (quick, easy, cheap, effective, rugged, and safe) purification combined with ultrahigh performance liquid chromatography tandem mass spectrometry (UPLC–MS/MS), was optimized for the simultaneous quantification of 25 mycotoxins in cereals. Samples were extracted with a solution containing 80% acetonitrile and 0.1% formic acid, and purified with QuEChERS before being separated by a C18 column. The mass spectrometry was conducted by using positive electrospray ionization (ESI+) and multiple reaction monitoring (MRM) models. The method gave good linear relations with regression coefficients ranging from 0.9950 to 0.9999. The detection limits ranged from 0.03 to 15.0 µg·kg−1, and the average recovery at three different concentrations ranged from 60.2% to 115.8%, with relative standard deviations (RSD%) varying from 0.7% to 19.6% for the 25 mycotoxins. The method is simple, rapid, accurate, and an improvement compared with the existing methods published so far. PMID:27983693
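
    As a small illustration of the two figures of merit reported above, calibration linearity and spike recovery, the sketch below uses SciPy's linear regression on invented calibration data.

        # Linearity from a calibration curve and recovery from one spiking experiment.
        from scipy.stats import linregress

        conc = [1, 5, 10, 50, 100]                  # spiked concentration, ug/kg (invented)
        area = [980, 5100, 10250, 50800, 101500]    # measured peak areas (invented)
        fit = linregress(conc, area)
        print(f"correlation coefficient r = {fit.rvalue:.4f}")

        added, measured = 10.0, 8.7                 # ug/kg added vs. found (invented)
        print(f"recovery = {measured / added * 100:.1f}%")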

  3. Elucidation of oxidation and degradation products of oxygen containing fuel components by combined use of a stable isotopic tracer and mass spectrometry.

    Science.gov (United States)

    Frauscher, Marcella; Besser, Charlotte; Allmaier, Günter; Dörr, Nicole

    2017-11-15

    In order to reveal the degradation products of oxygen-containing fuel components, in particular fatty acid methyl esters, a novel approach was developed to characterize the oxidation behaviour. The combination of artificial alteration under a pressurized oxygen atmosphere, a stable isotopic tracer, and gas chromatography electron impact mass spectrometry (GC-EI-MS) was used to obtain detailed information on the formation of oxidation products of (9Z,12Z)-octadecadienoic acid methyl ester (C18:2 ME). Thereby, the biodiesel-simulating model compound C18:2 ME was oxidized, i.e., artificially altered, in a rotating pressurized vessel standardized for lubricant oxidation tests (RPVOT) under ¹⁶O₂ as well as ¹⁸O₂ atmospheres. Identification of the formed degradation products, mainly carboxylic acids of various chain lengths, alcohols, ketones, and esters, was performed by means of GC-EI-MS. Comparison of the mass spectra of compounds obtained under both atmospheres revealed not only the degree of oxidation and the origin of the oxygen atoms, but also the sites of oxidative attack and bond cleavage. Hence, the developed and outlined strategy based on a gas-phase stable isotopic tracer and mass spectrometry provides insight into the degradation of oxygen-containing fuels and fuel components by means of the accurate differentiation of oxygen origin in a degradation product. Copyright © 2017 Elsevier B.V. All rights reserved.
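
    The value of the ¹⁸O₂ atmosphere comes from the predictable mass shift it imposes: every oxygen picked up from the labelled gas moves a product's mass by the ¹⁸O/¹⁶O difference. The back-of-the-envelope sketch below maps the number of gas-derived oxygens to an expected mass offset; the product mass is a placeholder, while the isotope masses are standard monoisotopic values.

        # Expected mass shift per oxygen atom incorporated from the 18O2 atmosphere.
        MASS_16O = 15.9949
        MASS_18O = 17.9992
        SHIFT_PER_O = MASS_18O - MASS_16O   # about 2.004 Da per incorporated oxygen

        product_mass_16o_run = 172.146      # hypothetical degradation product, 16O2 run
        for n_from_gas in range(4):
            expected = product_mass_16o_run + n_from_gas * SHIFT_PER_O
            print(f"{n_from_gas} oxygen(s) from the gas phase -> mass approximately {expected:.3f}")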

  4. DEWEY: the DICOM-enabled workflow engine system.

    Science.gov (United States)

    Erickson, Bradley J; Langer, Steve G; Blezek, Daniel J; Ryan, William J; French, Todd L

    2014-06-01

    Workflow is a widely used term to describe the sequence of steps to accomplish a task. The use of workflow technology in medicine and medical imaging in particular is limited. In this article, we describe the application of a workflow engine to improve workflow in a radiology department. We implemented a DICOM-enabled workflow engine system in our department. We designed it in a way to allow for scalability, reliability, and flexibility. We implemented several workflows, including one that replaced an existing manual workflow and measured the number of examinations prepared in time without and with the workflow system. The system significantly increased the number of examinations prepared in time for clinical review compared to human effort. It also met the design goals defined at its outset. Workflow engines appear to have value as ways to efficiently assure that complex workflows are completed in a timely fashion.

  5. Direct solid phase microextraction combined with gas chromatography - Mass spectrometry for the determination of biogenic amines in wine.

    Science.gov (United States)

    Papageorgiou, Myrsini; Lambropoulou, Dimitra; Morrison, Calum; Namieśnik, Jacek; Płotka-Wasylka, Justyna

    2018-06-01

    A direct method based on immersion solid phase microextraction (DI-SPME) gas chromatography mass-spectrometry (GC-MS) was optimized and validated for the determination of 16 biogenic amines in Polish wines. In the analysis two internal standards were used: 1,7-diaminoheptane and bis-3-aminopropylamine. The method allows for simultaneous extraction and derivatization, providing a simple and fast mode of extraction and enrichment. Different parameters which affect the extraction procedure were studied and optimized including ionic strength (0-25%), fiber materials (PDMS/DVB, PDMS/DVD + OC, Polyacrylate, Carboxen/PDMS and DVB/CAR/PDMS) and timings of the extraction, derivatization and desorption processes. Validation studies confirmed the linearity, sensitivity, precision and accuracy of the method. The method was successfully applied to the analysis of 44 wine samples originating from several regions of Poland and 3 wine samples from other countries. Analysis showed that many of the samples contained all examined biogenic amines. The method, assessed using an Eco-Scale tool with satisfactory results, was found to be green in terms of hazardous chemicals and solvents usage, energy consumption and production of waste. Therefore the proposed method can be safely used in the wine industry for routine analysis of BAs in wine samples with a minimal detrimental impact on human health and the environment. Copyright © 2018 Elsevier B.V. All rights reserved.

  6. Measurement of opioid peptides with combinations of reversed phase high performance liquid chromatography, radioimmunoassay, radioreceptorassay, and mass spectrometry

    International Nuclear Information System (INIS)

    Fridland, G.H.; Desiderio, D.M.

    1987-01-01

    As the first step, RP-HPLC gradient elution of a Sep-Pak-treated peptide-rich fraction from a tissue extract is performed, and the eluent is monitored by a variety of post-HPLC detectors. To maximize the structural information obtained from the analysis, UV provides the analog absorption trace; radioreceptorassay (RRA) data for all collected fractions are used to construct the profile of opioid-receptor-active peptides; radioimmunoassay (RIA) of selected HPLC fractions at retention times corresponding to those of standards, or in some special cases of all 90 fractions, provides immunoreactivity information; and fast atom bombardment mass spectrometry (FAB-MS) in two modes - corroboration of the (M + H)+ of the expected peptide, or MS/MS to monitor an amino acid sequence-determining fragment ion unique to that peptide in the selected ion monitoring (SIM) mode - provides structural information. As a demonstration of the level of quantification sensitivity attainable by these novel MS methods, FAB-MS-MS-SIM of solutions of synthetic leucine enkephalin was sensitive to the 70 femtomole level. This paper discusses RIA versus RRA data, and recent MS measurements of peptides in human tissues. 4 references, 1 figure

  7. Combined Mass Spectrometry-Based Metabolite Profiling of Different Pigmented Rice (Oryza sativa L.) Seeds and Correlation with Antioxidant Activities

    Directory of Open Access Journals (Sweden)

    Ga Ryun Kim

    2014-09-01

    Full Text Available Nine varieties of pigmented rice (Oryza sativa L.) seeds that were black, red, or white were used for metabolite profiling by ultra-performance liquid chromatography-quadrupole-time-of-flight mass spectrometry (UPLC-Q-TOF-MS) and gas chromatography (GC) TOF-MS, and for measurement of antioxidant activities. Clear grouping patterns determined by the color of the rice seeds were identified in principal component analysis (PCA) derived from UPLC-Q-TOF-MS. Cyanidin-3-glucoside, peonidin-3-glucoside, proanthocyanidin dimer, proanthocyanidin trimer, apigenin-6-C-glucosyl-8-C-arabinoside, tricin-O-rhamnoside-O-hexoside, and lipids were identified as significantly different secondary metabolites. In PCA score plots derived from GC-TOF-MS, Jakwangdo (JKD) and Ilpoom (IP) species were discriminated from the other rice seeds by PC1 and PC2. Valine, phenylalanine, adenosine, pyruvate, nicotinic acid, succinic acid, maleic acid, malonic acid, gluconic acid, xylose, fructose, glucose, maltose, and myo-inositol were significantly different primary metabolites in the JKD species, while GABA, asparagine, xylitol, and sucrose were significantly distributed in the IP species. Analysis of antioxidant activities revealed that black and red rice seeds had higher activity than white rice seeds. Cyanidin-3-glucoside, peonidin-3-glucoside, proanthocyanidin dimers, proanthocyanidin trimers, and catechin were highly correlated with antioxidant activities, and were more plentiful in black and red rice seeds. These results are expected to provide valuable information that could help improve and develop rice-breeding techniques.

  8. Determining the Effect of Catechins on SOD1 Conformation and Aggregation by Ion Mobility Mass Spectrometry Combined with Optical Spectroscopy

    Science.gov (United States)

    Zhao, Bing; Zhuang, Xiaoyu; Pi, Zifeng; Liu, Shu; Liu, Zhiqiang; Song, Fengrui

    2018-02-01

    The aggregation of Cu,Zn-superoxide dismutase (SOD1) plays an important role in the etiology of amyotrophic lateral sclerosis (ALS). To disrupt ALS progression, discovering new drugs or compounds that can prevent SOD1 aggregation is important. In this study, ESI-MS was used to investigate the interaction of catechins with SOD1. Noncovalent complexes of catechins with SOD1 were detected and retained in the gas phase under native ESI-MS conditions. The conformational changes of SOD1 after binding with catechins were also explored via traveling wave ion mobility (IM) spectrometry. Among the three catechins studied, epigallocatechin gallate (EGCG) was able to stabilize the SOD1 conformation against unfolding. To further evaluate the efficacy of EGCG, we monitored the fluorescence changes of the dimeric E,E-SOD1 (apo-SOD1; E: empty) with and without ligands under denaturing conditions, and found that EGCG can inhibit apo-SOD1 aggregation. In addition, circular dichroism spectra of the samples showed that EGCG can decrease the β-sheet content of SOD1, which would otherwise produce aggregates. These results indicate that the orthogonal gas-phase IM separation dimension coupled with ESI-MS (ESI-IM-MS) can provide insight into the interaction between SOD1 and small molecules, with the advantage of dramatically decreasing the analysis time. Meanwhile, optical spectroscopy techniques can be used to confirm the ESI-IM-MS results.

  9. Preconcentration and determination of zinc and lead ions by a combination of cloud point extraction and flame atomic absorption spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Tavallali, H. [Chemistry Department, Payamenore University, Shiraz (Iran); Shokrollahi, A.; Zahedi, M. [Chemistry Department, Yasouj University, Yasouj (Iran); Niknam, K. [Chemistry Department, Persian Gulf University, Bushehr (Iran); Soylak, M. [Chemistry Department, University of Erciyes, Kayseri (Turkey); Ghaedi, M.

    2009-04-15

    The phase-separation phenomenon of non-ionic surfactants in aqueous solution was used for the extraction of lead(II) and zinc(II). After complexation with 3-[(4-bromophenyl)(1H-inden-3-yl)methyl]-1H-indene (BPIMI), the analytes were quantitatively extracted into the Triton X-114-rich phase after centrifugation. Methanol acidified with 1 mol/L HNO{sub 3} was added to the surfactant-rich phase prior to its analysis by flame atomic absorption spectrometry (FAAS). The chelating agent concentration, the amount of Triton X-114 surfactant, and the pH were all optimized. Detection limits (3 SDb/m) of 2.5 and 1.6 ng/mL, preconcentration factors of 30, and enrichment factors of 32 and 48 were obtained for Pb{sup 2+} and Zn{sup 2+} ions, respectively. The proposed cloud point extraction has been successfully applied to the determination of these ions in real samples with complicated matrices, such as food and soil samples, with high efficiency. (Abstract Copyright [2009], Wiley Periodicals, Inc.)

  10. [Determination of lidocaine and its metabolites in human plasma by liquid chromatography in combination with tandem mass spectrometry].

    Science.gov (United States)

    Xiang, Jin; Zhang, Cheng; Yu, Qin; Liang, Mao-Zhi; Qin, Yong-Ping; Nan, Feng

    2010-07-01

    To establish a liquid chromatography tandem mass spectrometry (HPLC-MS/MS) method for the determination of lidocaine (LDC) and its metabolites, monoethylglycinexylidide (MEGX) and glycinexylidide (GX), in human plasma. METHODS: The assay was conducted with an API 3000 HPLC-MS/MS system equipped with an Ultimate C18 column (50 x 4.6 mm, 5 μm). The mobile phase consisted of methanol-5 mmol/L ammonium acetate (50:50, pH adjusted to 5.0 with formic acid) and the flow rate was set at 0.2 mL/min. The alkalinized sample was extracted with ethyl acetate. After evaporation of the organic layer, the residue was dissolved in mobile phase and the drug was determined by HPLC-MS/MS using electrospray ionization. The calibration curve was linear in the range from 15.625 to 2000 ng/mL for LDC, and linear calibration curves were obtained in the range of 1.5625 to 200 ng/mL for both MEGX and GX. The limits of quantification for LDC, MEGX and GX were 15.625, 1.5625 and 1.5625 ng/mL, respectively. This method for the quantitative determination of lidocaine and its metabolites in human plasma is simple, rapid, sensitive and accurate, and can therefore be used for the determination of lidocaine and its metabolites in clinical practice.

  11. The PBase Scientific Workflow Provenance Repository

    Directory of Open Access Journals (Sweden)

    Víctor Cuevas-Vicenttín

    2014-10-01

    Full Text Available Scientific workflows and their supporting systems are becoming increasingly popular for compute-intensive and data-intensive scientific experiments. The advantages scientific workflows offer include rapid and easy workflow design, software and data reuse, scalable execution, sharing and collaboration, and other advantages that altogether facilitate “reproducible science”. In this context, provenance – information about the origin, context, derivation, ownership, or history of some artifact – plays a key role, since scientists are interested in examining and auditing the results of scientific experiments. However, in order to perform such analyses on scientific results as part of extended research collaborations, an adequate environment and tools are required. Concretely, the need arises for a repository that will facilitate the sharing of scientific workflows and their associated execution traces in an interoperable manner, also enabling querying and visualization. Furthermore, such functionality should be supported while taking performance and scalability into account. With this purpose in mind, we introduce PBase: a scientific workflow provenance repository implementing the ProvONE proposed standard, which extends the emerging W3C PROV standard for provenance data with workflow specific concepts. PBase is built on the Neo4j graph database, thus offering capabilities such as declarative and efficient querying. Our experiences demonstrate the power gained by supporting various types of queries for provenance data. In addition, PBase is equipped with a user friendly interface tailored for the visualization of scientific workflow provenance data, making the specification of queries and the interpretation of their results easier and more effective.
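
    As an illustration of the kind of declarative lineage query such a graph-backed repository supports, the sketch below runs a Cypher query through the Neo4j Python driver. The node labels, relationship types, and connection details are invented for illustration and are not the actual PBase/ProvONE schema.

```python
# Hypothetical sketch of a backward-lineage query against a Neo4j-backed
# provenance store; labels and relationship names are assumptions, not PBase's schema.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

LINEAGE_QUERY = """
MATCH (out:Data {id: $output_id})<-[:GENERATED]-(exec:Execution)-[:USED]->(inp:Data)
RETURN exec.program AS step, collect(inp.id) AS inputs
"""

def trace_inputs(output_id):
    """Print, for each execution that generated the output, the data it consumed."""
    with driver.session() as session:
        for record in session.run(LINEAGE_QUERY, output_id=output_id):
            print(record["step"], "used", record["inputs"])

trace_inputs("run42/result.csv")
```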

  12. Multilevel Workflow System in the ATLAS Experiment

    International Nuclear Information System (INIS)

    Borodin, M; De, K; Navarro, J Garcia; Golubkov, D; Klimentov, A; Maeno, T; Vaniachine, A

    2015-01-01

    The ATLAS experiment is scaling up Big Data processing for the next LHC run using a multilevel workflow system comprised of many layers. In Big Data processing, ATLAS deals with datasets, not individual files. Similarly, a task (comprised of many jobs) has become a unit of the ATLAS workflow in distributed computing, with about 0.8M tasks processed per year. In order to manage the diversity of LHC physics (exceeding 35K physics samples per year), the individual data processing tasks are organized into workflows. For example, the Monte Carlo workflow is composed of many steps: generate or configure hard processes, hadronize signal and minimum-bias (pileup) events, simulate energy deposition in the ATLAS detector, digitize the electronics response, simulate triggers, reconstruct data, convert the reconstructed data into ROOT ntuples for physics analysis, etc. Outputs are merged and/or filtered as necessary to optimize the chain. The bi-level workflow manager - ProdSys2 - generates the actual workflow tasks, and their jobs are executed across more than a hundred distributed computing sites by PanDA - the ATLAS job-level workload management system. On the outer level, the Database Engine for Tasks (DEfT) empowers production managers with templated workflow definitions. On the next level, the Job Execution and Definition Interface (JEDI) is integrated with PanDA to provide dynamic job definition tailored to the sites' capabilities. We report on scaling up the production system to accommodate a growing number of requirements from the main ATLAS areas: Trigger, Physics and Data Preparation. (paper)

  13. Use of combined alpha-spectrometry and fission track analysis for the determination of 240Pu/239Pu ratios in human tissue

    International Nuclear Information System (INIS)

    Love, S.F.; Filby, R.H.; Glover, S.E.; Stuit, D.B.; Kathren, R.L.

    1998-01-01

    Plutonium and other actinides were determined in human autopsy tissues of occupationally exposed workers who were registrants of the United States Transuranium and Uranium Registries (USTUR). In this study, Pu was purified and isolated from Am, U and Th, after drying and wet-ashing of the tissues and the addition of 238Pu as a radiotracer. After electrodeposition onto vanadium planchets, the 239+240Pu activity was determined by alpha-spectrometry. A fission track method was developed to determine 239Pu in the presence of 238Pu and 240Pu, using Lexan(TM) polycarbonate detectors. Combining the two techniques allowed the determination of the 240Pu/239Pu activity and atom ratios. Data from selected USTUR cases are presented. (author)
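
    The arithmetic behind combining the two measurements is straightforward: the fission track result for 239Pu is subtracted from the alpha-spectrometry result for 239+240Pu to obtain the 240Pu activity, and the half-lives convert the activity ratio into an atom ratio. The sketch below illustrates this with invented activities; the half-life values are approximate literature figures and do not come from this record.

```python
# Illustrative arithmetic only: input activities are made up, and the half-lives
# are approximate literature values (~24,110 y for 239Pu, ~6,561 y for 240Pu).
T_HALF_PU239_Y = 24_110.0
T_HALF_PU240_Y = 6_561.0

def pu_ratios(activity_239_240, activity_239):
    """Combine alpha-spectrometry (239+240Pu) and fission-track (239Pu) results."""
    activity_240 = activity_239_240 - activity_239        # same activity units
    activity_ratio = activity_240 / activity_239          # A(240Pu)/A(239Pu)
    # N = A / lambda = A * T_half / ln(2), so the atom ratio rescales by the half-lives.
    atom_ratio = activity_ratio * (T_HALF_PU240_Y / T_HALF_PU239_Y)
    return activity_ratio, atom_ratio

act_ratio, atm_ratio = pu_ratios(activity_239_240=12.0, activity_239=8.0)  # e.g. mBq
print(f"240Pu/239Pu activity ratio = {act_ratio:.2f}, atom ratio = {atm_ratio:.3f}")
```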

  14. Rapid determination of trace nitrophenolic organics in water by combining solid-phase extraction with surface-assisted laser desorption/ionization time-of-flight mass spectrometry.

    Science.gov (United States)

    Chen, Y C; Shiea, J; Sunner, J

    2000-01-01

    A rapid technique for the screening of trace compounds in water by combining solid-phase extraction (SPE) with activated carbon surface-assisted laser desorption/ionization (SALDI) time-of-flight mass spectrometry is demonstrated. Activated carbon is used both as the sorbent in SPE and as the solid in the SALDI matrix system. This eliminates the need for an SPE elution process. After the analytes have been adsorbed on the surfaces of the activated carbon during SPE extraction, the activated carbon is directly mixed with the SALDI liquid and mass spectrometric analysis is performed. Trace phenolic compounds in water were used to demonstrate the effectiveness of the method. The detection limit for these compounds is in the ppb to ppt range. Copyright 2000 John Wiley & Sons, Ltd.

  15. Determination of trace quaternary ammonium surfactants in water by combining solid-phase extraction with surface-assisted laser desorption/ionization mass spectrometry.

    Science.gov (United States)

    Chen, Y C; Sun, M C

    2001-01-01

    This study demonstrates the feasibility of combining solid-phase extraction (SPE) with surface-assisted laser desorption/ionization (SALDI) mass spectrometry to determine trace quaternary ammonium surfactants in water. The trace surfactants in water were directly concentrated on the surface of activated carbon sorbent in SPE. The activated carbon sorbent was then mixed with the SALDI liquid for SALDI analysis. No SPE elution procedure was necessary. Experimental results indicate that the surfactants with longer chain alkyl groups exhibit higher sensitivities than those with shorter chain alkyl groups in SPE-SALDI analysis. The detection limit for hexadecyltrimethylammonium bromide is around 10 ppt in SPE-SALDI analysis by sampling 100 mL of aqueous solution, while that of tetradecyltrimethylammonium bromide is about 100 ppt. The detection limit for decyltrimethylammonium bromide and dodecyltrimethylammonium bromide is in the low-ppb range. Copyright 2001 John Wiley & Sons, Ltd.

  16. Dispersive liquid-liquid microextraction (DLLME) combined with graphite furnace atomic absorption spectrometry (GFAAS) for determination of trace Cu and Zn in water samples

    Directory of Open Access Journals (Sweden)

    Ghorbani A.

    2014-07-01

    Full Text Available Dispersive liquid-liquid microextraction (DLLME) combined with graphite furnace atomic absorption spectrometry (GFAAS) was proposed for the determination of trace amounts of copper and zinc ions using 8-hydroxyquinoline (8-HQ) as the chelating agent. Several factors influencing the microextraction efficiency of Cu and Zn and their subsequent determination, such as pH, the type and volume of the extraction and disperser solvents, the concentration of the chelating agent and the extraction time, were studied, and optimized experimental conditions were established. After extraction, the enrichment factors were 25 and 26 for Cu and Zn, respectively. The detection limits of the method were 0.025 and 0.0033 μg/L for Cu and Zn, and the relative standard deviations (RSD) for five determinations of 1 ng/mL Cu and Zn were 8.51% and 7.41%, respectively.
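
    The figures of merit quoted above (enrichment factor, detection limit, RSD) follow standard definitions: the enrichment factor is usually taken as the ratio of calibration slopes with and without the microextraction step, the detection limit as three times the blank standard deviation divided by the slope, and the RSD from replicate determinations. The sketch below restates these conventions with made-up numbers; it is not the authors' data processing code.

```python
# Conventional figure-of-merit calculations with hypothetical numbers.
import statistics

def enrichment_factor(slope_with_dllme, slope_direct):
    """Ratio of calibration slopes with and without preconcentration."""
    return slope_with_dllme / slope_direct

def detection_limit(blank_signals, slope):
    """Common 3*sigma(blank)/slope convention."""
    return 3 * statistics.stdev(blank_signals) / slope

def rsd_percent(replicates):
    return 100 * statistics.stdev(replicates) / statistics.mean(replicates)

print(enrichment_factor(slope_with_dllme=0.30, slope_direct=0.012))     # ~25
print(detection_limit([0.0010, 0.0012, 0.0011, 0.0009], slope=0.30))    # in signal units per µg/L
print(rsd_percent([0.98, 1.02, 1.05, 0.95, 1.01]))                      # %
```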

  17. Progress in digital color workflow understanding in the International Color Consortium (ICC) Workflow WG

    Science.gov (United States)

    McCarthy, Ann

    2006-01-01

    The ICC Workflow WG serves as the bridge between ICC color management technologies and use of those technologies in real world color production applications. ICC color management is applicable to and is used in a wide range of color systems, from highly specialized digital cinema color special effects to high volume publications printing to home photography. The ICC Workflow WG works to align ICC technologies so that the color management needs of these diverse use case systems are addressed in an open, platform independent manner. This report provides a high level summary of the ICC Workflow WG objectives and work to date, focusing on the ways in which workflow can impact image quality and color systems performance. The 'ICC Workflow Primitives' and 'ICC Workflow Patterns and Dimensions' workflow models are covered in some detail. Consider the questions, "How much of dissatisfaction with color management today is the result of 'the wrong color transformation at the wrong time' and 'I can't get to the right conversion at the right point in my work process'?" Put another way, consider how image quality through a workflow can be negatively affected when the coordination and control level of the color management system is not sufficient.

  18. A method to mine workflows from provenance for assisting scientific workflow composition

    NARCIS (Netherlands)

    Zeng, R.; He, X.; Aalst, van der W.M.P.

    2011-01-01

    Scientific workflows have recently emerged as a new paradigm for representing and managing complex distributed scientific computations and are used to accelerate the pace of scientific discovery. In many disciplines, individual workflows are large and complicated due to the large quantities of data

  19. Spin trapping combined with quantitative mass spectrometry defines free radical redistribution within the oxidized hemoglobin:haptoglobin complex.

    Science.gov (United States)

    Vallelian, Florence; Garcia-Rubio, Ines; Puglia, Michele; Kahraman, Abdullah; Deuel, Jeremy W; Engelsberger, Wolfgang R; Mason, Ronald P; Buehler, Paul W; Schaer, Dominik J

    2015-08-01

    Extracellular or free hemoglobin (Hb) accumulates during hemolysis, tissue damage, and inflammation. Heme-triggered oxidative reactions can lead to diverse structural modifications of lipids and proteins, which contribute to the propagation of tissue damage. One important target of Hb's peroxidase reactivity is its own globin structure. Amino acid oxidation and crosslinking events destabilize the protein and ultimately cause accumulation of proinflammatory and cytotoxic Hb degradation products. The Hb scavenger haptoglobin (Hp) attenuates oxidation-induced Hb degradation. In this study we show that in the presence of hydrogen peroxide (H2O2), Hb and the Hb:Hp complex share comparable peroxidative reactivity and free radical generation. While oxidation of both free Hb and the Hb:Hp complex generates a common tyrosine-based free radical, the spin-trapping reaction with 5,5-dimethyl-1-pyrroline N-oxide (DMPO) yields dissimilar paramagnetic products in Hb and Hb:Hp, suggesting that radicals are differently redistributed within the complex before reacting with the spin trap. With LC-MS(2) mass spectrometry we assigned multiple known and novel DMPO adduct sites. Quantification of these adducts suggested that Hb:Hp complex formation causes extensive delocalization of accessible free radicals, with a drastic reduction of the major tryptophan and cysteine modifications in the β-globin chain of the Hb:Hp complex, including decreased βCys93 DMPO adduction. The quantitative changes in DMPO adduct formation on Hb:Hp complex formation were less pronounced in the Hb α-globin chain. In contrast to earlier speculations, we found no evidence that free Hb radicals are delocalized to the Hp chain of the complex. The observation that Hb:Hp complex formation alters free radical distribution in Hb may help to better understand the structural basis for Hp as an antioxidant protein. Copyright © 2015 Elsevier Inc. All rights reserved.

  20. Differential isotope dansylation labeling combined with liquid chromatography mass spectrometry for quantification of intact and N-terminal truncated proteins

    International Nuclear Information System (INIS)

    Tang, Yanan; Li, Liang

    2013-01-01

    Graphical abstract: -- Highlights: •LC–MS was developed for quantifying protein mixtures containing both intact and N-terminal truncated proteins. •12C2-Dansylation of the N-terminal amino acid of proteins was done first, followed by microwave-assisted acid hydrolysis. •The released 12C2-dansyl labeled N-terminal amino acid was quantified using 13C2-dansyl labeled amino acid standards. •The method provided accurate and precise results for quantifying intact and N-terminal truncated proteins within 8 h. -- Abstract: The N-terminal amino acids of proteins are important structure units for maintaining the biological function, localization, and interaction networks of proteins. Under different biological conditions, one or several N-terminal amino acids could be cleaved from an intact protein due to processes, such as proteolysis, resulting in the change of protein properties. Thus, the ability to quantify the N-terminal truncated forms of proteins is of great importance, particularly in the area of development and production of protein-based drugs where the relative quantity of the intact protein and its truncated form needs to be monitored. In this work, we describe a rapid method for absolute quantification of protein mixtures containing intact and N-terminal truncated proteins. This method is based on dansylation labeling of the N-terminal amino acids of proteins, followed by microwave-assisted acid hydrolysis of the proteins into amino acids. It is shown that dansyl labeled amino acids are stable in acidic conditions and can be quantified by liquid chromatography mass spectrometry (LC–MS) with the use of isotope analog standards.
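
    The quantification step implied by this scheme is a simple isotope-dilution calculation: the released 12C2-dansyl labeled N-terminal amino acid is measured against a spiked 13C2-dansyl labeled standard of known concentration via the light/heavy peak-area ratio. The sketch below shows that arithmetic with hypothetical peak areas; it is not the authors' software.

```python
# Hypothetical light/heavy ratio quantification for dansyl-labeled N-terminal residues.
def quantify_light(area_12C2, area_13C2, standard_conc_uM):
    """Concentration of the 12C2-labeled analyte from its area ratio to the
    13C2-labeled internal standard of known concentration."""
    return (area_12C2 / area_13C2) * standard_conc_uM

intact = quantify_light(area_12C2=1.8e6, area_13C2=2.0e6, standard_conc_uM=5.0)
truncated = quantify_light(area_12C2=4.5e5, area_13C2=2.0e6, standard_conc_uM=5.0)
print(f"intact ~ {intact:.2f} uM, N-terminal truncated ~ {truncated:.2f} uM")
```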

  1. Parallel path nebulizer: Critical parameters for use with microseparation techniques combined with inductively coupled plasma mass spectrometry

    International Nuclear Information System (INIS)

    Yanes, Enrique G.; Miller-Ihli, Nancy J.

    2005-01-01

    Four different low-flow parallel path Mira Mist CE nebulizers were evaluated and compared in support of an ongoing project on the use of microseparation techniques interfaced to inductively coupled plasma mass spectrometry for the quantification of cobalamin species (vitamin B12). For the characterization of the different Mira Mist CE nebulizers, the study focused on the nebulizer orientation and on the effect of methanol on the analytical response. The gas outlet position that consistently provided the maximum signal was the 11 o'clock position, with the nebulizer viewed end-on. With this orientation the increased signal may be explained by the fact that the cone angle of the aerosol directs the largest fraction of the aerosol to the center of the spray chamber and consequently into the plasma. To characterize the nebulizer's performance, the signal response of a multielement solution containing elements with a variety of ionization potentials was used. The selection of elements with varying ionization energies and degrees of ionization was essential for a better understanding of the observed signal enhancement when methanol was used. Two different phenomena contribute to signal enhancement with methanol: improved transport efficiency and the 'carbon enhancement effect'. The net result was that as much as a 30-fold increase in signal was observed for As and Mg when using a make-up solution of 20% methanol at a 15 μL/min flow rate, which is equivalent to a net volume of 3 μL/min of pure methanol.

  2. Differential isotope dansylation labeling combined with liquid chromatography mass spectrometry for quantification of intact and N-terminal truncated proteins

    Energy Technology Data Exchange (ETDEWEB)

    Tang, Yanan; Li, Liang, E-mail: Liang.Li@ualberta.ca

    2013-08-20

    Graphical abstract: -- Highlights: •LC–MS was developed for quantifying protein mixtures containing both intact and N-terminal truncated proteins. •{sup 12}C{sub 2}-Dansylation of the N-terminal amino acid of proteins was done first, followed by microwave-assisted acid hydrolysis. •The released {sup 12}C{sub 2}-dansyl labeled N-terminal amino acid was quantified using {sup 13}C{sub 2}-dansyl labeled amino acid standards. •The method provided accurate and precise results for quantifying intact and N-terminal truncated proteins within 8 h. -- Abstract: The N-terminal amino acids of proteins are important structure units for maintaining the biological function, localization, and interaction networks of proteins. Under different biological conditions, one or several N-terminal amino acids could be cleaved from an intact protein due to processes, such as proteolysis, resulting in the change of protein properties. Thus, the ability to quantify the N-terminal truncated forms of proteins is of great importance, particularly in the area of development and production of protein-based drugs where the relative quantity of the intact protein and its truncated form needs to be monitored. In this work, we describe a rapid method for absolute quantification of protein mixtures containing intact and N-terminal truncated proteins. This method is based on dansylation labeling of the N-terminal amino acids of proteins, followed by microwave-assisted acid hydrolysis of the proteins into amino acids. It is shown that dansyl labeled amino acids are stable in acidic conditions and can be quantified by liquid chromatography mass spectrometry (LC–MS) with the use of isotope analog standards.

  3. Combined solid-phase extraction and gas chromatography-mass spectrometry used for determination of chloropropanols in water.

    Science.gov (United States)

    González, Paula; Racamonde, Inés; Carro, Antonia M; Lorenzo, Rosa A

    2011-10-01

    A sensitive and rapid derivatization method for the simultaneous determination of 1,3-dichloro-2-propanol (1,3-DCP) and 3-chloropropane-1,2-diol (3-MCPD) in water samples has been developed. The aim was to establish the optimal conditions of the derivatization process for the two selected reagents. A central composite design was used to determine the influence of derivatization time, derivatization temperature and reagent volume, and a global desirability function was applied for multi-response optimization. The analysis was performed by gas chromatography-mass spectrometry. During the optimization of the extraction procedure, four different types of solid-phase extraction (SPE) columns were tested; the Oasis HLB cartridge produced the best recoveries of the target analytes. The pH value and the salinity were investigated using a Doehlert design, and the best results for the SPE of both analytes were obtained with 1.5 g of NaCl at pH 6. The proposed method provides high sensitivity, good linearity (R(2)≥0.999) and repeatability (relative standard deviations between 2.9 and 3.4%). Limits of detection and quantification were in the ranges of 1.4-11.2 ng/mL and 4.8-34.5 ng/mL, respectively. Recoveries obtained for water samples were ca. 100% for 1,3-DCP and 3-MCPD. The method has been successfully applied to the analysis of different samples, including commercially bottled water and influent and effluent sewage. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
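
    For readers unfamiliar with the global desirability approach used for the multi-response optimization, the sketch below shows the usual Derringer-type construction: each response is rescaled to an individual desirability between 0 and 1 and the global value is their geometric mean. The response values and limits are invented; they are not the paper's data.

```python
# Derringer-style desirability, larger-is-better form, with hypothetical responses.
import math

def desirability_max(y, low, target, weight=1.0):
    """0 below 'low', 1 above 'target', power-scaled in between."""
    if y <= low:
        return 0.0
    if y >= target:
        return 1.0
    return ((y - low) / (target - low)) ** weight

def global_desirability(values):
    """Geometric mean of the individual desirabilities."""
    return math.prod(values) ** (1.0 / len(values))

d_13dcp = desirability_max(y=8.2e5, low=1e5, target=1e6)   # e.g. derivative peak area
d_3mcpd = desirability_max(y=6.4e5, low=1e5, target=1e6)
print(f"D = {global_desirability([d_13dcp, d_3mcpd]):.2f}")
```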

  4. Combining Mass Spectrometry and Toxicology for a Multi-Country European Epidemiologic Study on Drinking Water Disinfection By-Products

    Science.gov (United States)

    The HiWATE (Health Impacts of long-term exposure to disinfection by-products in drinking WATEr) project is the first systematic analysis that combines the epidemiology on adverse pregnancy outcomes with analytical chemistry and analytical biology in the European Union. This study...

  5. The MPO system for automatic workflow documentation

    Energy Technology Data Exchange (ETDEWEB)

    Abla, G.; Coviello, E.N.; Flanagan, S.M. [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Greenwald, M. [Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Lee, X. [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Romosan, A. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Schissel, D.P., E-mail: schissel@fusion.gat.com [General Atomics, P.O. Box 85608, San Diego, CA 92186-5608 (United States); Shoshani, A. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Stillerman, J.; Wright, J. [Massachusetts Institute of Technology, Cambridge, MA 02139 (United States); Wu, K.J. [Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States)

    2016-11-15

    Highlights: • Data model, infrastructure, and tools for data tracking, cataloging, and integration. • Automatically document workflow and data provenance in the widest sense. • Fusion Science as test bed but the system’s framework and data model is quite general. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for critical applications. However, it is not the mere existence of data that is important, but our ability to make use of it. Experience has shown that when metadata is better organized and more complete, the underlying data becomes more useful. Traditionally, capturing the steps of scientific workflows and metadata was the role of the lab notebook, but the digital era has resulted instead in the fragmentation of data, processing, and annotation. This paper presents the Metadata, Provenance, and Ontology (MPO) System, the software that can automate the documentation of scientific workflows and associated information. Based on recorded metadata, it provides explicit information about the relationships among the elements of workflows in notebook form augmented with directed acyclic graphs. A set of web-based graphical navigation tools and Application Programming Interface (API) have been created for searching and browsing, as well as programmatically accessing the workflows and data. We describe the MPO concepts and its software architecture. We also report the current status of the software as well as the initial deployment experience.
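
    The core idea of documenting a workflow as a directed acyclic graph of activities and data objects can be sketched in a few lines; the example below uses networkx and invented fusion-flavored names, and is only an illustration of the data model, not the MPO API.

```python
# Minimal sketch: record workflow activities and their dataflow as a DAG.
import networkx as nx

g = nx.DiGraph()

def record_step(name, inputs, outputs, **metadata):
    """Register one activity plus dataset -> activity -> dataset edges."""
    g.add_node(name, kind="activity", **metadata)
    for item in inputs:
        g.add_edge(item, name)
    for item in outputs:
        g.add_edge(name, item)

record_step("equilibrium_reconstruction",
            inputs=["shot_1234/magnetics"], outputs=["shot_1234/efit"],
            code_version="v2.1")
record_step("transport_analysis",
            inputs=["shot_1234/efit"], outputs=["shot_1234/transport"])

assert nx.is_directed_acyclic_graph(g)
print(list(nx.topological_sort(g)))
```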

  6. Integrating configuration workflows with project management system

    International Nuclear Information System (INIS)

    Nilsen, Dimitri; Weber, Pavel

    2014-01-01

    The complexity of the heterogeneous computing resources, services and recurring infrastructure changes at the GridKa WLCG Tier-1 computing center require a structured approach to configuration management and optimization of interplay between functional components of the whole system. A set of tools deployed at GridKa, including Puppet, Redmine, Foreman, SVN and Icinga, provides the administrative environment giving the possibility to define and develop configuration workflows, reduce the administrative effort and improve sustainable operation of the whole computing center. In this presentation we discuss the developed configuration scenarios implemented at GridKa, which we use for host installation, service deployment, change management procedures, service retirement etc. The integration of Puppet with a project management tool like Redmine provides us with the opportunity to track problem issues, organize tasks and automate these workflows. The interaction between Puppet and Redmine results in automatic updates of the issues related to the executed workflow performed by different system components. The extensive configuration workflows require collaboration and interaction between different departments like network, security, production etc. at GridKa. Redmine plugins developed at GridKa and integrated in its administrative environment provide an effective way of collaboration within the GridKa team. We present the structural overview of the software components, their connections, communication protocols and show a few working examples of the workflows and their automation.

  7. The MPO system for automatic workflow documentation

    International Nuclear Information System (INIS)

    Abla, G.; Coviello, E.N.; Flanagan, S.M.; Greenwald, M.; Lee, X.; Romosan, A.; Schissel, D.P.; Shoshani, A.; Stillerman, J.; Wright, J.; Wu, K.J.

    2016-01-01

    Highlights: • Data model, infrastructure, and tools for data tracking, cataloging, and integration. • Automatically document workflow and data provenance in the widest sense. • Fusion Science as test bed but the system’s framework and data model is quite general. - Abstract: Data from large-scale experiments and extreme-scale computing is expensive to produce and may be used for critical applications. However, it is not the mere existence of data that is important, but our ability to make use of it. Experience has shown that when metadata is better organized and more complete, the underlying data becomes more useful. Traditionally, capturing the steps of scientific workflows and metadata was the role of the lab notebook, but the digital era has resulted instead in the fragmentation of data, processing, and annotation. This paper presents the Metadata, Provenance, and Ontology (MPO) System, the software that can automate the documentation of scientific workflows and associated information. Based on recorded metadata, it provides explicit information about the relationships among the elements of workflows in notebook form augmented with directed acyclic graphs. A set of web-based graphical navigation tools and Application Programming Interface (API) have been created for searching and browsing, as well as programmatically accessing the workflows and data. We describe the MPO concepts and its software architecture. We also report the current status of the software as well as the initial deployment experience.

  8. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    Rzehorz, Gerhard Ferdinand; The ATLAS collaboration

    2016-01-01

    This contribution reports on the feasibility of executing data intensive workflows on Cloud infrastructures. In order to assess this, the metric ETC = Events/Time/Cost is formed, which quantifies the different workflow and infrastructure configurations that are tested against each other. In these tests ATLAS reconstruction jobs are run, examining the effects of overcommitting (more parallel processes running than CPU cores available), scheduling (staggered execution) and scaling (number of cores). The desirability of commissioning storage in the cloud is evaluated, in conjunction with a simple analytical model of the system, and correlated with questions about the network bandwidth, caches and what kind of storage to utilise. In the end, a cost/benefit evaluation of different infrastructure configurations and workflows is undertaken, with the goal of finding the maximum of the ETC value.
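
    The ETC metric itself is elementary to compute once event throughput, wall time and cost are known; the sketch below compares two invented configurations, purely to make the definition concrete.

```python
# ETC = Events / Time / Cost; all numbers below are hypothetical.
def etc(events, wall_time_hours, cost):
    return events / wall_time_hours / cost

configs = {
    "8 cores, no overcommit": etc(events=20_000, wall_time_hours=10.0, cost=4.0),
    "8 cores, 2x overcommit": etc(events=26_000, wall_time_hours=12.0, cost=4.0),
}
best = max(configs, key=configs.get)
print(best, round(configs[best], 1))
```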

  9. Data intensive ATLAS workflows in the Cloud

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00396985; The ATLAS collaboration; Keeble, Oliver; Quadt, Arnulf; Kawamura, Gen

    2017-01-01

    This contribution reports on the feasibility of executing data intensive workflows on Cloud infrastructures. In order to assess this, the metric ETC = Events/Time/Cost is formed, which quantifies the different workflow and infrastructure configurations that are tested against each other. In these tests ATLAS reconstruction Jobs are run, examining the effects of overcommitting (more parallel processes running than CPU cores available), scheduling (staggered execution) and scaling (number of cores). The desirability of commissioning storage in the Cloud is evaluated, in conjunction with a simple analytical model of the system, and correlated with questions about the network bandwidth, caches and what kind of storage to utilise. In the end a cost/benefit evaluation of different infrastructure configurations and workflows is undertaken, with the goal to find the maximum of the ETC value.

  10. Logical provenance in data-oriented workflows?

    KAUST Repository

    Ikeda, R.

    2013-04-01

    We consider the problem of defining, generating, and tracing provenance in data-oriented workflows, in which input data sets are processed by a graph of transformations to produce output results. We first give a new general definition of provenance for general transformations, introducing the notions of correctness, precision, and minimality. We then determine when properties such as correctness and minimality carry over from the individual transformations' provenance to the workflow provenance. We describe a simple logical-provenance specification language consisting of attribute mappings and filters. We provide an algorithm for provenance tracing in workflows where logical provenance for each transformation is specified using our language. We consider logical provenance in the relational setting, observing that for a class of Select-Project-Join (SPJ) transformations, logical provenance specifications encode minimal provenance. We have built a prototype system supporting the features and algorithms presented in the paper, and we report a few preliminary experimental results. © 2013 IEEE.
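
    To make the idea of tracing through attribute mappings and filters concrete, the sketch below performs a naive backward trace for a single transformation whose logical provenance says which output attribute comes from which input attribute and which input rows pass the filter. The mini-format and data are invented and are not the paper's specification language.

```python
# Naive backward provenance trace for one transformation (illustrative only).
def trace(output_row, input_rows, attribute_map, row_filter):
    """Return input rows that may have contributed to output_row, given a
    {output_attr: input_attr} mapping and a filter predicate on input rows."""
    contributors = []
    for row in input_rows:
        if not row_filter(row):
            continue
        if all(row[src] == output_row[dst] for dst, src in attribute_map.items()):
            contributors.append(row)
    return contributors

inputs = [{"id": 1, "amount": 5}, {"id": 2, "amount": -3}, {"id": 1, "amount": 7}]
out = {"key": 1, "total": 12}
print(trace(out, inputs, {"key": "id"}, lambda r: r["amount"] > 0))
```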

  11. Impact of CGNS on CFD Workflow

    Science.gov (United States)

    Poinot, M.; Rumsey, C. L.; Mani, M.

    2004-01-01

    CFD tools are an integral part of industrial and research processes, for which the amount of data is increasing at a high rate. These data are used in a multi-disciplinary fluid dynamics environment, including structural, thermal, chemical or even electrical topics. We show that the data specification is an important challenge that must be tackled to achieve an efficient workflow for use in this environment. We compare the process with other software techniques, such as network or database type, where past experiences showed how difficult it was to bridge the gap between completely general specifications and dedicated specific applications. We show two aspects of the use of CFD General Notation System (CGNS) that impact CFD workflow: as a data specification framework and as a data storage means. Then, we give examples of projects involving CFD workflows where the use of the CGNS standard leads to a useful method either for data specification, exchange, or storage.

  12. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of realworld business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... of events, reward-based properties and best- and worst- case scenarios. We develop a simple example of medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...... for more complex annotations and ultimately to automatically synthesise workflows by composing predefined sub-processes, in order to achieve a configuration that is optimal for parameters of interest....
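
    As a toy illustration of the kind of reward-based property such an analysis computes (for example, expected drug consumption per patient), the sketch below evaluates the expected accumulated reward of a small invented branching process; it is not the BPMN subset or the tooling described in the paper.

```python
# Expected accumulated reward over probabilistic branches of an invented workflow.
workflow = {
    "triage":        {"reward": 0, "branches": [(0.7, "standard_dose"), (0.3, "escalate")]},
    "standard_dose": {"reward": 1, "branches": []},
    "escalate":      {"reward": 0, "branches": [(0.5, "double_dose"), (0.5, "standard_dose")]},
    "double_dose":   {"reward": 2, "branches": []},
}

def expected_reward(node):
    step = workflow[node]
    return step["reward"] + sum(p * expected_reward(nxt) for p, nxt in step["branches"])

print(expected_reward("triage"))   # expected drug units per patient for this toy model
```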

  13. A virtual radiation therapy workflow training simulation

    International Nuclear Information System (INIS)

    Bridge, P.; Crowe, S.B.; Gibson, G.; Ellemor, N.J.; Hargrave, C.; Carmichael, M.

    2016-01-01

    Aim: Simulation forms an increasingly vital component of clinical skills development in a wide range of professional disciplines. Simulation of clinical techniques and equipment is designed to better prepare students for placement by providing an opportunity to learn technical skills in a “safe” academic environment. In radiotherapy training over the last decade or so this has predominantly comprised treatment planning software and small ancillary equipment such as mould room apparatus. Recent virtual reality developments have dramatically changed this approach. Innovative new simulation applications and file processing and interrogation software have helped to fill in the gaps to provide a streamlined virtual workflow solution. This paper outlines the innovations that have enabled this, along with an evaluation of the impact on students and educators. Method: Virtual reality software and workflow applications have been developed to enable the following steps of radiation therapy to be simulated in an academic environment: CT scanning using a 3D virtual CT scanner simulation; batch CT duplication; treatment planning; 3D plan evaluation using a virtual linear accelerator; quantitative plan assessment, patient setup with lasers; and image guided radiotherapy software. Results: Evaluation of the impact of the virtual reality workflow system highlighted substantial time saving for academic staff as well as positive feedback from students relating to preparation for clinical placements. Students valued practice in the “safe” environment and the opportunity to understand the clinical workflow ahead of clinical department experience. Conclusion: Simulation of most of the radiation therapy workflow and tasks is feasible using a raft of virtual reality simulation applications and supporting software. Benefits of this approach include time-saving, embedding of a case-study based approach, increased student confidence, and optimal use of the clinical environment

  14. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of realworld business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... of events, reward-based properties and best- and worst- case scenarios. We develop a simple example of medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...

  15. A Prudent Approach to Fair Use Workflow

    Directory of Open Access Journals (Sweden)

    Karey Patterson

    2018-02-01

    Full Text Available This poster will outline a new, highly efficient workflow for the management of copyright materials that is prudent and accommodates generally and legally accepted Fair Use limits. The workflow gives library or copyright staff an easy way to keep on top of their copyright obligations, manage licenses, and review and adjust schedules, while remaining efficient enough to cope with large numbers of requests to use materials. The poster details speed and efficiency gains for professors and library staff while reducing legal exposure.

  16. Determination of mercury species in biological samples by inductively coupled plasma mass spectrometry combined with solvent extraction and ultrasonication

    International Nuclear Information System (INIS)

    Sun, J.; Li, Y.F.; Wang, J.X.; Chen, C.Y.; Li, B.; Gao, Y.X.; Chai, Z.F.

    2005-01-01

    Mercury (Hg) is a well-known toxic element whose toxic effects depend on its chemical forms. The most important chemical forms are elemental Hg (Hg(0)), inorganic Hg (Hg2+) and methylmercury (CH3Hg+). In the biogeochemical cycle of Hg, these species may interchange in atmospheric, aquatic and terrestrial environments. Among them, methylmercury is considerably more toxic than elemental and inorganic mercury, and is recognized as a major health hazard for humans owing to its teratogenic, immunotoxic, and neurotoxic effects. Therefore, determination of not only total mercury but also methylmercury content in biological samples is necessary. Among the many analytical methods available, inductively coupled plasma mass spectrometry (ICP-MS) with conventional sample introduction via a peristaltic pump is widely used for the determination of trace metals in a wide variety of sample matrices. ICP-MS offers high sensitivity, low detection limits, reasonable accuracy and precision, and can easily be automated. However, mercury is considered an element with analytical problems. A well-known problem in Hg analysis is the memory effect, which increases blank counts and worsens the analytical performance of ICP-MS. The possibility of Hg losses during the sample decomposition procedure, due to its volatility, is another important issue. Additionally, its high first ionization potential and numerous isotopes limit its sensitivity in ICP-MS analysis. To address these issues, the present work was carried out to develop a method based on ICP-MS coupled with solvent extraction for the determination of mercury species in biological samples. As a first step, we investigated different solvent extraction methods, including acid leaching, CuSO4 extraction, alkaline-methanol extraction, and surfactant extraction with ultrasonication, for methylmercury determination using the certified reference material GBW07601 (Human Hair). Next, we

  17. Structural characterization of new defective molecules in poly(amidoamide) dendrimers by combining mass spectrometry and nuclear magnetic resonance

    Energy Technology Data Exchange (ETDEWEB)

    Tintaru, Aura; Ungaro, Rémi [Aix-Marseille Université – CNRS, UMR 7273, Institut de Chimie Radicalaire, Marseille (France); Liu, Xiaoxiuan; Chen, Chao [Aix-Marseille Université – CNRS, UMR 6114, Centre Interdisciplinaire de Nanosciences de Marseille, Marseille (France); Giordano, Laurent [Aix-Marseille Université – CNRS, UMR 7313, Institut des Sciences Moléculaires de Marseille ISM2 and Ecole Centrale de Marseille, Marseille (France); Peng, Ling [Aix-Marseille Université – CNRS, UMR 6114, Centre Interdisciplinaire de Nanosciences de Marseille, Marseille (France); Charles, Laurence, E-mail: laurence.charles@univ-amu.fr [Aix-Marseille Université – CNRS, UMR 7273, Institut de Chimie Radicalaire, Marseille (France)

    2015-01-01

    Highlights: • ESI-MS/MS and NMR were combined to elucidate a new side-reaction during divergent synthesis of PAMAM dendrimers. • These new impurities exhibit a net gain of a single carbon atom as compared to expected molecules. • The side-reaction is due to formaldehyde, contained as trace level impurity in methanol used as the synthesis medium. - Abstract: A new side-reaction occurring during divergent synthesis of PAMAM dendrimers (generations G{sub 0}–G{sub 2}) was revealed by mass spectrometric detection of defective molecules with a net gain of a single carbon atom as compared to expected compounds. Combining MS/MS experiments performed on different electrosprayed precursor ions (protonated molecules and lithiated adducts) with NMR analyses allowed the origin of these by-products to be elucidated. Modification of one ethylenediamine end-group of perfect dendrimers into a cyclic imidazolidine moiety was induced by formaldehyde present at trace level in the methanol solvent used as the synthesis medium. Dendrimers studied here were purposely constructed from a triethanolamine core to make them more flexible, as compared to NH{sub 3}- or ethylenediamine-core PAMAM, and hence improve their interaction with DNA. Occurrence of this side-reaction would be favored by the particular flexibility of the dendrimer branches.

  18. Increased Expression of Simple Ganglioside Species GM2 and GM3 Detected by MALDI Imaging Mass Spectrometry in a Combined Rat Model of Aβ Toxicity and Stroke.

    Directory of Open Access Journals (Sweden)

    Sarah Caughlin

    Full Text Available The aging brain is often characterized by the presence of multiple comorbidities resulting in synergistic damaging effects in the brain, as demonstrated through the interaction of Alzheimer's disease (AD) and stroke. Gangliosides, a family of membrane lipids enriched in the central nervous system, may have a mechanistic role in mediating the brain's response to injury, as their expression is altered in a number of disease and injury states. Matrix-Assisted Laser Desorption Ionization (MALDI) Imaging Mass Spectrometry (IMS) was used to study the expression of the A-series ganglioside species GD1a, GM1, GM2, and GM3 to determine alteration of their expression profiles in the presence of beta-amyloid (Aβ) toxicity in addition to ischemic injury. To model a stroke, rats received a unilateral striatal injection of endothelin-1 (ET-1) (stroke alone group). To model Aβ toxicity, rats received intracerebroventricular (i.c.v.) injections of the toxic 25-35 fragment of the Aβ peptide (Aβ alone group). To model the combination of Aβ toxicity with stroke, rats received both the unilateral ET-1 injection and the bilateral i.c.v. injections of Aβ25-35 (combined Aβ/ET-1 group). By 3 d, a significant increase in the simple ganglioside species GM2 was observed in the ischemic brain region of rats that received a stroke (ET-1, with or without Aβ). By 21 d, GM2 levels remained elevated only in the combined Aβ/ET-1 group. GM3 levels, however, demonstrated a different pattern of expression: by 3 d GM3 was elevated in the ischemic brain region only in the combined Aβ/ET-1 group, and by 21 d GM3 was elevated in the ischemic brain region in both the stroke alone and Aβ/ET-1 groups. Overall, the results indicate that the accumulation of the simple ganglioside species GM2 and GM3 may be indicative of a mechanism of interaction between AD and stroke.

  19. Rapid determination of six carcinogenic primary aromatic amines in mainstream cigarette smoke by two-dimensional online solid phase extraction combined with liquid chromatography tandem mass spectrometry.

    Science.gov (United States)

    Bie, Zhenying; Lu, Wei; Zhu, You; Chen, Yusong; Ren, Hubo; Ji, Lishun

    2017-01-27

    A fully automated, rapid, and reliable method for the simultaneous determination of six carcinogenic primary aromatic amines (AAs), including o-toluidine (o-TOL), 2,6-dimethylaniline (2,6-DMA), o-anisidine (o-ASD), 1-naphthylamine (1-ANP), 2-naphthylamine (2-ANP), and 4-aminobiphenyl (4-ABP), in mainstream cigarette smoke was established. The proposed method was based on two-dimensional online solid phase extraction combined with liquid chromatography tandem mass spectrometry (SPE/LC-MS/MS). The particulate phase of the mainstream cigarette smoke was collected on a Cambridge filter pad and pretreated via ultrasonic extraction with 2% formic acid (FA), while the gas phase was trapped in 2% FA and analyzed without further pretreatment. The two-dimensional online SPE, comprising two cartridges with different absorption characteristics, was applied for sample pretreatment. Analysis was performed by liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS) under multiple reaction monitoring mode. Each sample required about 0.5 h for solid phase extraction and analysis. The limits of detection (LODs) for the six AAs ranged from 0.04 to 0.58 ng/cig and recoveries were within 84.5%-122.9%. The relative standard deviations of intra- and inter-day tests were less than 6% and 7%, respectively, for the 3R4F reference cigarette, and no more than 7% and 8%, respectively, for a Virginia-type cigarette. The proposed method enables minimal sample pretreatment, full automation, and high throughput with high selectivity, sensitivity, and accuracy. As part of the validation procedure, fifteen brands of cigarettes were tested with the method. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Petri Net-Based Workflow Access Control Model

    Institute of Scientific and Technical Information of China (English)

    陈卓; 骆婷; 石磊; 洪帆

    2004-01-01

    Access control is an important protection mechanism for information systems. This paper shows how to implement access control in a workflow system. We give a workflow access control model (WACM) based on several current access control models. The model supports role assignment and dynamic authorization. The paper defines the workflow using Petri nets: it first gives the definition and description of the workflow, and then analyzes the architecture of the workflow access control model (WACM). Finally, an example of an e-commerce workflow access control model is discussed in detail.
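
    A minimal sketch of the combination the model describes - a Petri-net-style enabling check plus role-based authorization on transitions - is given below. The transitions, roles and marking are invented for illustration and do not reproduce the WACM definitions.

```python
# Fire a workflow transition only if the marking enables it and the user is authorized.
transitions = {
    "approve_order": {"inputs": ["order_submitted"], "outputs": ["order_approved"],
                      "roles": {"manager"}},
}
user_roles = {"alice": {"clerk"}, "bob": {"manager"}}

def fire(name, marking, user):
    t = transitions[name]
    if not user_roles[user] & t["roles"]:
        raise PermissionError(f"{user} lacks a role required for {name}")
    if not all(marking.get(place, 0) > 0 for place in t["inputs"]):
        raise RuntimeError(f"{name} is not enabled by the current marking")
    for place in t["inputs"]:
        marking[place] -= 1
    for place in t["outputs"]:
        marking[place] = marking.get(place, 0) + 1
    return marking

print(fire("approve_order", {"order_submitted": 1}, "bob"))
```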

  1. Analysis of biodiesel and biodiesel-petrodiesel blends by high performance thin layer chromatography combined with easy ambient sonic-spray ionization mass spectrometry.

    Science.gov (United States)

    Eberlin, Livia S; Abdelnur, Patricia V; Passero, Alan; de Sa, Gilberto F; Daroda, Romeu J; de Souza, Vanderlea; Eberlin, Marcos N

    2009-08-01

    High performance thin layer chromatography (HPTLC) combined with on-spot detection and characterization via easy ambient sonic-spray ionization mass spectrometry (EASI-MS) is applied to the analysis of biodiesel (B100) and biodiesel-petrodiesel blends (BX). HPTLC provides chromatographic resolution of major components whereas EASI-MS allows on-spot characterization performed directly on the HPTLC surface at ambient conditions. Constituents (M) are detected by EASI-MS in a one component-one ion fashion as either [M + Na](+) or [M + H](+). For both B100 and BX samples, typical profiles of fatty acid methyl esters (FAME) detected as [FAME + Na](+) ions allow biodiesel typification. The spectrum of the petrodiesel spot displays a homologous series of protonated alkyl pyridines which are characteristic for petrofuels (natural markers). The spectrum for residual or admixture oil spots is characterized by sodiated triglycerides [TAG + Na](+). The application of HPTLC to analyze B100 and BX samples and its combination with EASI-MS for on-spot characterization and quality control is demonstrated.

  2. Competing intermolecular interactions of artemisinin-type agents and aspirin with membrane phospholipids: Combined model mass spectrometry and quantum-chemical study

    Energy Technology Data Exchange (ETDEWEB)

    Pashynska, Vlada, E-mail: vlada@vl.kharkov.ua [B.Verkin Institute for Low Temperature Physics and Engineering of the National Academy of Sciences of Ukraine, Lenin Ave., 47, 61103 Kharkov (Ukraine); Stepanian, Stepan [B.Verkin Institute for Low Temperature Physics and Engineering of the National Academy of Sciences of Ukraine, Lenin Ave., 47, 61103 Kharkov (Ukraine); Gömöry, Agnes; Vekey, Karoly [Institute of Organic Chemistry of Research Centre for Natural Sciences of the Hungarian Academy of Sciences, Magyar tudosok korutja, 2, Budapest H-1117 (Hungary); Adamowicz, Ludwik [University of Arizona, Department of Chemistry and Biochemistry, Tucson, AZ 85721 (United States)

    2015-07-09

    Highlights: • Competitive binding of artemisinin agents and aspirin with phospholipids is shown. • Complexation between the antimalarial drugs and aspirin molecules is also found. • Energetically favorable structures of the model complexes are identified by DFT. • Membranotropic activity of the studied drugs can be modified under joint usage. - Abstract: Studying the intermolecular interactions of antimalarial artemisinin-type drugs and aspirin with membrane phospholipids is important for elucidating how the drugs' activity is modified when they are used together. A combined experimental and computational study of the interaction of dihydroartemisinin, α-artemether, and artesunate with aspirin (ASP) and dipalmitoylphosphatidylcholine (DPPC) was performed by electrospray ionization (ESI) mass spectrometry and by DFT B3LYP/aug-cc-pVDZ methods. The results of the ESI investigation of systems containing an artemisinin-type agent, ASP and DPPC reveal a competition between the antimalarial agents and ASP for binding with DPPC molecules. Complexation between the antimalarial drugs and ASP is also found. The observed phenomena suggest that the membranotropic activity of artemisinin-type agents and aspirin is modified under their combined usage. To elucidate the structure-energy characteristics of the non-covalent complexes studied, model DFT calculations were performed for the dihydroartemisinin·ASP complex and for complexes of each drug with the phosphatidylcholine head group of DPPC in neutral and cationized forms.

  3. Detailed polyphenolic profiling of Annurca apple (M. pumila Miller cv Annurca) by a combination of RP-UHPLC and HILIC, both hyphenated to IT-TOF mass spectrometry.

    Science.gov (United States)

    Sommella, Eduardo; Pepe, Giacomo; Pagano, Francesco; Ostacolo, Carmine; Tenore, Gian Carlo; Russo, Maria Teresa; Novellino, Ettore; Manfra, Michele; Campiglia, Pietro

    2015-10-01

    Annurca apple, a Southern Italian cultivar, possesses not only a particular taste and flavor, different from other types of apple, but also several health-promoting properties. With the aim of thoroughly elucidating the polyphenolic profile of this variety, listed as a Protected Geographical Indication product, an extensive qualitative profiling of Annurca apple polyphenolic peel extract was carried out by employing a combination of ultra-high-performance reversed-phase liquid chromatography (RP-UHPLC) and hydrophilic interaction liquid chromatography (HILIC), both coupled to ion trap-time of flight (IT-TOF) mass spectrometry. A total of 63 compounds were tentatively identified, 25 of which had not previously been reported in Annurca apple extract. Furthermore, thanks to the different selectivity of HILIC, in combination with accurate mass measurements, an improved separation and detection of procyanidins was obtained. Moreover, the obtained profiles were compared with those of a conventional variety, Red Delicious (RD), highlighting their differences. This work contributes to increasing the knowledge of the polyphenolic fingerprint of this typical apple variety. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Competing intermolecular interactions of artemisinin-type agents and aspirin with membrane phospholipids: Combined model mass spectrometry and quantum-chemical study

    International Nuclear Information System (INIS)

    Pashynska, Vlada; Stepanian, Stepan; Gömöry, Agnes; Vekey, Karoly; Adamowicz, Ludwik

    2015-01-01

    Highlights: • Competitive binding of artemisinin agents and aspirin with phospholipids is shown. • Complexation between the antimalarial drugs and aspirin molecules is also found. • Energetically favorable structures of the model complexes are identified by DFT. • Membranotropic activity of the studied drugs can be modified under joint usage. - Abstract: Studying the intermolecular interactions of antimalarial artemisinin-type drugs and aspirin with membrane phospholipids is important for elucidating how the activity of these drugs is modified when they are used jointly. A combined experimental and computational study of the interaction of dihydroartemisinin, α-artemether, and artesunate with aspirin (ASP) and dipalmitoylphosphatidylcholine (DPPC) is performed by electrospray ionization (ESI) mass spectrometry and by DFT B3LYP/aug-cc-pVDZ methods. The ESI investigation of systems containing an artemisinin-type agent, ASP and DPPC reveals a competition between the antimalarial agents and ASP for binding with DPPC molecules. Complexation between the antimalarial drugs and ASP is also found. The observed phenomena suggest that the membranotropic activity of artemisinin-type agents and aspirin is modified under their combined usage. To elucidate the structure-energy characteristics of the non-covalent complexes studied, model DFT calculations are performed for the dihydroartemisinin · ASP complex and for complexes of each drug with the phosphatidylcholine head group of DPPC in neutral and cationized forms.

  5. A high-throughput solid-phase extraction microchip combined with inductively coupled plasma-mass spectrometry for rapid determination of trace heavy metals in natural water.

    Science.gov (United States)

    Shih, Tsung-Ting; Hsieh, Cheng-Chuan; Luo, Yu-Ting; Su, Yi-An; Chen, Ping-Hung; Chuang, Yu-Chen; Sun, Yuh-Chang

    2016-04-15

    Herein, a hyphenated system combining a high-throughput solid-phase extraction (htSPE) microchip with inductively coupled plasma-mass spectrometry (ICP-MS) for rapid determination of trace heavy metals was developed. Rather than performing multiple analyses in parallel for the enhancement of analytical throughput, we improved the processing speed for individual samples by increasing the operation flow rate during SPE procedures. To this end, an innovative device combining a micromixer and a multi-channeled extraction unit was designed. Furthermore, a programmable valve manifold was used to interface the developed microchip and ICP-MS instrumentation in order to fully automate the system, leading to a dramatic reduction in operation time and human error. Under the optimized operation conditions for the established system, detection limits of 1.64-42.54 ng L⁻¹ for the analyte ions were achieved. Validation procedures demonstrated that the developed method could be satisfactorily applied to the determination of trace heavy metals in natural water. Each analysis could be readily accomplished within just 186 s using the established system. This represents, to the best of our knowledge, an unprecedented speed for the analysis of trace heavy metal ions. Copyright © 2016 Elsevier B.V. All rights reserved.

  6. Soundness of Timed-Arc Workflow Nets

    DEFF Research Database (Denmark)

    Mateo, Jose Antonio; Srba, Jiri; Sørensen, Mathias Grund

    2014-01-01

    We demonstrate the usability of our theory on case studies of a Brake System Control Unit used in aircraft certification, the MPEG2 encoding algorithm, and a blood transfusion workflow. The implementation of the algorithms is freely available as part of the model checker TAPAAL.

  7. Distributed interoperable workflow support for electronic commerce

    NARCIS (Netherlands)

    Papazoglou, M.; Jeusfeld, M.A.; Weigand, H.; Jarke, M.

    1998-01-01

    This paper describes a flexible distributed transactional workflow environment based on an extensible object-oriented framework built around class libraries, application programming interfaces, and shared services. The purpose of this environment is to support a range of EC-like business

  8. Using workflow for projects in higher education

    NARCIS (Netherlands)

    van der Veen, Johan (CTIT); Jones, Valerie M.; Collis, Betty

    2000-01-01

    The WWW is increasingly used as a medium to support education and training. A course at the University of Twente in which groups of students collaborate in the design and production of multimedia instructional materials has now been supported by a website since 1995. Workflow was integrated with

  9. Workflow Automation: A Collective Case Study

    Science.gov (United States)

    Harlan, Jennifer

    2013-01-01

    Knowledge management has proven to be a sustainable competitive advantage for many organizations. Knowledge management systems are abundant, with multiple functionalities. The literature reinforces the use of workflow automation with knowledge management systems to benefit organizations; however, it was not known if process automation yielded…

  10. Text mining for the biocuration workflow.

    Science.gov (United States)

    Hirschman, Lynette; Burns, Gully A P C; Krallinger, Martin; Arighi, Cecilia; Cohen, K Bretonnel; Valencia, Alfonso; Wu, Cathy H; Chatr-Aryamontri, Andrew; Dowell, Karen G; Huala, Eva; Lourenço, Anália; Nash, Robert; Veuthey, Anne-Lise; Wiegers, Thomas; Winter, Andrew G

    2012-01-01

    Molecular biology has become heavily dependent on biological knowledge encoded in expert curated biological databases. As the volume of biological literature increases, biocurators need help in keeping up with the literature; (semi-)automated aids for biocuration would seem to be an ideal application for natural language processing and text mining. However, to date, there have been few documented successes for improving biocuration throughput using text mining. Our initial investigations took place for the workshop on 'Text Mining for the BioCuration Workflow' at the third International Biocuration Conference (Berlin, 2009). We interviewed biocurators to obtain workflows from eight biological databases. This initial study revealed high-level commonalities, including (i) selection of documents for curation; (ii) indexing of documents with biologically relevant entities (e.g. genes); and (iii) detailed curation of specific relations (e.g. interactions); however, the detailed workflows also showed many variabilities. Following the workshop, we conducted a survey of biocurators. The survey identified biocurator priorities, including the handling of full text indexed with biological entities and support for the identification and prioritization of documents for curation. It also indicated that two-thirds of the biocuration teams had experimented with text mining and almost half were using text mining at that time. Analysis of our interviews and survey provides a set of requirements for the integration of text mining into the biocuration workflow. These can guide the identification of common needs across curated databases and encourage joint experimentation involving biocurators, text mining developers and the larger biomedical research community.

  11. Adaptive workflow simulation of emergency response

    NARCIS (Netherlands)

    Bruinsma, Guido Wybe Jan

    2010-01-01

    Recent incidents and major training exercises in and outside the Netherlands have persistently shown that not having or not sharing information during emergency response are major sources of emergency response inefficiency and error, and affect incident mitigation outcomes through workflow planning

  12. Combination of mass spectrometry-based targeted lipidomics and supervised machine learning algorithms in detecting adulterated admixtures of white rice.

    Science.gov (United States)

    Lim, Dong Kyu; Long, Nguyen Phuoc; Mo, Changyeun; Dong, Ziyuan; Cui, Lingmei; Kim, Giyoung; Kwon, Sung Won

    2017-10-01

    The mixing of extraneous ingredients with original products is a common adulteration practice in food and herbal medicines. In particular, the authenticity of white rice and its corresponding blended products has become a key issue in the food industry. Accordingly, our current study aimed to develop and evaluate a novel discrimination method by combining targeted lipidomics with powerful supervised learning methods, and eventually to introduce a platform to verify the authenticity of white rice. A total of 30 cultivars were collected, and 330 representative samples of white rice from Korea and China as well as seven mixing ratios were examined. Random forest (RF), support vector machine (SVM) with a radial basis function kernel, C5.0, model-averaged neural network, and k-nearest neighbor classifiers were used for the classification. We achieved the desired results: the classifiers effectively differentiated white rice from Korea from blended samples, with high prediction accuracy for contamination ratios as low as five percent. In addition, the RF and SVM classifiers were generally superior to, and more robust than, the other techniques. Our approach demonstrated that the relative differences in lysoGPLs can be successfully utilized to detect the adulterated mixing of white rice originating from different countries. In conclusion, the present study introduces a novel and high-throughput platform that can be applied to authenticate adulterated admixtures from original white rice samples. Copyright © 2017 Elsevier Ltd. All rights reserved.
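
    The record itself contains no code; as a rough illustration of the supervised classification step it describes (random forest and RBF-kernel SVM on targeted lipid features), a minimal scikit-learn sketch might look as follows. The file name, the "label" column and the feature layout are hypothetical and not taken from the study.

        # Hedged sketch: cross-validated RF and RBF-SVM classification of
        # authentic vs. blended rice samples from targeted lipid intensities.
        import pandas as pd
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.svm import SVC
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.model_selection import cross_val_score

        data = pd.read_csv("rice_lipidomics.csv")      # hypothetical table of lysoGPL intensities
        X = data.drop(columns=["label"]).values        # lipid features
        y = data["label"].values                       # e.g. "authentic" vs. "blended"

        models = {
            "random forest": RandomForestClassifier(n_estimators=500, random_state=0),
            "RBF-SVM": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale")),
        }
        for name, model in models.items():
            acc = cross_val_score(model, X, y, cv=5, scoring="accuracy")
            print(f"{name}: mean 5-fold CV accuracy = {acc.mean():.3f}")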

  13. Isotope ratio mass spectrometry in combination with chemometrics for characterization of geographical origin and agronomic practices of table grape.

    Science.gov (United States)

    Longobardi, Francesco; Casiello, Grazia; Centonze, Valentina; Catucci, Lucia; Agostiano, Angela

    2017-08-01

    Although table grape is one of the most cultivated and consumed fruits worldwide, no study has been reported on its geographical origin or agronomic practice based on stable isotope ratios. This study aimed to evaluate the usefulness of isotopic ratios (i.e. ²H/¹H, ¹³C/¹²C, ¹⁵N/¹⁴N and ¹⁸O/¹⁶O) as possible markers to discriminate the agronomic practice (conventional versus organic farming) and provenance of table grape. In order to quantitatively evaluate which of the isotopic variables were more discriminating, a t test was carried out, in light of which only δ¹³C and δ¹⁸O provided statistically significant differences (P ≤ 0.05) for the discrimination of geographical origin and farming method. Principal component analysis (PCA) showed no good separation of samples differing in geographical area and agronomic practice; thus, for classification purposes, supervised approaches were carried out. In particular, general discriminant analysis (GDA) was used, resulting in prediction abilities of 75.0 and 92.2% for the discrimination of farming method and origin, respectively. The present findings suggest that stable isotopes (i.e. δ¹⁸O, δ²H and δ¹³C) combined with chemometrics can be successfully applied to discriminate the provenance of table grape. However, the use of bulk nitrogen isotopes was not effective for farming method discrimination. © 2016 Society of Chemical Industry.
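
    As a small, hedged illustration of the univariate screening mentioned above (a t test per isotope ratio to judge whether it separates conventional from organic samples), the following SciPy sketch uses invented example δ13C values rather than data from the paper.

        # Two-sample Welch t-test on hypothetical delta 13C values.
        from scipy import stats

        d13C_conventional = [-26.5, -27.1, -26.9, -27.4, -26.8]   # invented values
        d13C_organic      = [-25.9, -26.2, -26.0, -26.4, -25.8]   # invented values

        t_stat, p_value = stats.ttest_ind(d13C_conventional, d13C_organic, equal_var=False)
        print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
        if p_value <= 0.05:
            print("delta 13C differs significantly between the two farming methods")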

  14. Characterization of diesel fuel by chemical separation combined with capillary gas chromatography (GC) isotope ratio mass spectrometry (IRMS).

    Science.gov (United States)

    Harvey, Scott D; Jarman, Kristin H; Moran, James J; Sorensen, Christina M; Wright, Bob W

    2012-09-15

    The purpose of this study was to perform a preliminary investigation of compound-specific isotope analysis (CSIA) of diesel fuels to evaluate whether the technique could distinguish diesel samples from different sources/locations. The ability to differentiate or correlate diesel samples could be valuable for discovering fuel tax evasion schemes or for environmental forensic studies. Two urea adduction-based techniques were used to isolate the n-alkanes from the fuel. Both carbon isotope ratio (δ¹³C) and hydrogen isotope ratio (δD) values for the n-alkanes were then determined by CSIA in each sample. The samples investigated had δ¹³C values that ranged from -30.1‰ to -26.8‰, whereas δD values ranged from -83‰ to -156‰. Plots of δD versus δ¹³C with sample n-alkane points connected in order of increasing carbon number gave well-separated clusters with characteristic shapes for each sample. Principal components analysis (PCA) with δ¹³C, δD, or combined δ¹³C and δD data was applied to extract the maximum information content. PCA scores plots could clearly differentiate the samples, thereby demonstrating the potential of this approach for distinguishing (e.g., fingerprinting) fuel samples using δ¹³C and δD values. Copyright © 2012 Elsevier B.V. All rights reserved.

  15. Combining information from headspace mass spectrometry and visible spectroscopy in the classification of the Ligurian olive oils

    Energy Technology Data Exchange (ETDEWEB)

    Casale, Monica [Dipartimento di Chimica e Tecnologie Farmaceutiche ed Alimentari, Universita di Genova, Via Brigata Salerno (ponte), I-16147 Genova (Italy)]. E-mail: monica@dictfa.unige.it; Armanino, Carla [Dipartimento di Chimica e Tecnologie Farmaceutiche ed Alimentari, Universita di Genova, Via Brigata Salerno (ponte), I-16147 Genova (Italy); Casolino, Chiara [Dipartimento di Chimica e Tecnologie Farmaceutiche ed Alimentari, Universita di Genova, Via Brigata Salerno (ponte), I-16147 Genova (Italy); Forina, Michele [Dipartimento di Chimica e Tecnologie Farmaceutiche ed Alimentari, Universita di Genova, Via Brigata Salerno (ponte), I-16147 Genova (Italy)

    2007-04-18

    An electronic nose and a UV-Vis spectrophotometer, in combination with multivariate analysis, have been used to verify the geographical origin of extra virgin olive oils. Forty-six oil samples from three different areas of Liguria were included in this analysis. Initially, the data obtained from the two instruments were analysed separately. Then, the potential of the synergy between these two technologies for testing food authenticity and quality was investigated. Application of Linear Discriminant Analysis, after feature selection, was sufficient to differentiate the three geographical denominations of Liguria ('Riviera dei Fiori', 'Riviera del Ponente Savonese' and 'Riviera di Levante'), obtaining 100% success in classification and close to 100% in prediction. The models built using SIMCA as a class-modelling tool were not as effective, but confirmed that the results improve when the synergy between different analytical techniques is exploited. This paper shows that objective instrumental data related to two important organoleptic features, oil colour and aroma, supply complementary information.
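
    The abstract does not state exactly how the two data blocks were fused; one common and simple option, shown here purely as a hedged sketch, is low-level fusion (autoscale each block, concatenate, then run Linear Discriminant Analysis). The array files, their shapes and the omission of feature selection are assumptions for illustration only.

        # Low-level data fusion of headspace-MS and visible spectra before LDA.
        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.model_selection import cross_val_score

        X_ms  = np.load("enose_ms.npy")       # hypothetical: 46 samples x m/z abundances
        X_vis = np.load("vis_spectra.npy")    # hypothetical: 46 samples x wavelengths
        y     = np.load("origin_labels.npy")  # three Ligurian denominations

        # Autoscale each block separately so neither instrument dominates, then concatenate.
        X_fused = np.hstack([StandardScaler().fit_transform(X_ms),
                             StandardScaler().fit_transform(X_vis)])

        acc = cross_val_score(LinearDiscriminantAnalysis(), X_fused, y, cv=5)
        print(f"fused-block LDA, mean CV accuracy: {acc.mean():.2f}")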

  16. Final Technical Report for DE-FG02-06ER15835: Chemical Imaging with 100 nm Spatial Resolution: Combining High Resolution Fluorescence Microscopy and Ion Mobility Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Buratto, Steven K. [UC Santa Barbara]

    2013-09-03

    We have combined, in a single instrument, high spatial resolution optical microscopy with the chemical specificity and conformational selectivity of ion mobility mass spectrometry. We discuss the design and construction of this apparatus as well as our efforts in applying this technique to thin films of molecular semiconductor materials.

  17. Solid phase extraction in combination with comprehensive two-dimensional gas chromatography coupled to time-of-flight mass spectrometry for the detailed investigation of volatiles in South African red wines

    NARCIS (Netherlands)

    Weldegergis, B.T.; Crouch, A.M.; Górecki, T.; Villiers, de A.

    2011-01-01

    Comprehensive two-dimensional gas chromatography in combination with time-of-flight mass spectrometry (GC × GC–TOFMS) has been applied for the analysis of volatile compounds in three young South African red wines. In spite of the significant benefits offered by GC × GC–TOFMS for the separation and

  18. Principal component directed partial least squares analysis for combining nuclear magnetic resonance and mass spectrometry data in metabolomics: Application to the detection of breast cancer

    International Nuclear Information System (INIS)

    Gu Haiwei; Pan Zhengzheng; Xi Bowei; Asiago, Vincent; Musselman, Brian; Raftery, Daniel

    2011-01-01

    Nuclear magnetic resonance (NMR) spectroscopy and mass spectrometry (MS) are the two most commonly used analytical tools in metabolomics, and their complementary nature makes the combination particularly attractive. A combined analytical approach can improve the potential for providing reliable methods to detect metabolic profile alterations in biofluids or tissues caused by disease, toxicity, etc. In this paper, ¹H NMR spectroscopy and direct analysis in real time (DART)-MS were used for the metabolomics analysis of serum samples from breast cancer patients and healthy controls. Principal component analysis (PCA) of the NMR data showed that the first principal component (PC1) scores could be used to separate cancer from normal samples. However, no such obvious clustering could be observed in the PCA score plot of DART-MS data, even though DART-MS can provide a rich and informative metabolic profile. Using a modified multivariate statistical approach, the DART-MS data were then reevaluated by orthogonal signal correction (OSC) pretreated partial least squares (PLS), in which the Y matrix in the regression was set to the PC1 score values from the NMR data analysis. This approach, and a similar one using the first latent variable from PLS-DA of the NMR data, resulted in a significant improvement of the separation between the disease samples and normal samples, and a metabolic profile related to breast cancer could be extracted from DART-MS. The new approach allows the disease classification to be expressed on a continuum as opposed to a binary scale and thus better represents the disease and healthy classifications. An improved metabolic profile obtained by combining MS and NMR by this approach may be useful to achieve more accurate disease detection and gain more insight regarding disease mechanisms and biology.
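
    A compact way to convey the core idea (PC1 scores from the NMR block acting as the response for a PLS regression on the MS block) is the following hedged scikit-learn sketch; the OSC pretreatment is omitted and all data files are hypothetical stand-ins, not data from the study.

        # PC1-directed PLS: regress DART-MS features on the NMR PC1 scores.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.preprocessing import StandardScaler

        X_nmr = np.load("nmr_bins.npy")      # hypothetical: samples x NMR spectral bins
        X_ms  = np.load("dartms_peaks.npy")  # hypothetical: samples x DART-MS features

        # PC1 of the NMR data carries the cancer/normal separation (per the abstract).
        pc1_scores = PCA(n_components=1).fit_transform(
            StandardScaler().fit_transform(X_nmr)).ravel()

        X_ms_scaled = StandardScaler().fit_transform(X_ms)
        pls = PLSRegression(n_components=2).fit(X_ms_scaled, pc1_scores)
        ms_latent = pls.transform(X_ms_scaled)[:, 0]   # MS samples on the NMR-directed axis
        print("continuous disease score per sample:", ms_latent.round(2))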

  19. Principal component directed partial least squares analysis for combining nuclear magnetic resonance and mass spectrometry data in metabolomics: application to the detection of breast cancer.

    Science.gov (United States)

    Gu, Haiwei; Pan, Zhengzheng; Xi, Bowei; Asiago, Vincent; Musselman, Brian; Raftery, Daniel

    2011-02-07

    Nuclear magnetic resonance (NMR) spectroscopy and mass spectrometry (MS) are the two most commonly used analytical tools in metabolomics, and their complementary nature makes the combination particularly attractive. A combined analytical approach can improve the potential for providing reliable methods to detect metabolic profile alterations in biofluids or tissues caused by disease, toxicity, etc. In this paper, ¹H NMR spectroscopy and direct analysis in real time (DART)-MS were used for the metabolomics analysis of serum samples from breast cancer patients and healthy controls. Principal component analysis (PCA) of the NMR data showed that the first principal component (PC1) scores could be used to separate cancer from normal samples. However, no such obvious clustering could be observed in the PCA score plot of DART-MS data, even though DART-MS can provide a rich and informative metabolic profile. Using a modified multivariate statistical approach, the DART-MS data were then reevaluated by orthogonal signal correction (OSC) pretreated partial least squares (PLS), in which the Y matrix in the regression was set to the PC1 score values from the NMR data analysis. This approach, and a similar one using the first latent variable from PLS-DA of the NMR data, resulted in a significant improvement of the separation between the disease samples and normal samples, and a metabolic profile related to breast cancer could be extracted from DART-MS. The new approach allows the disease classification to be expressed on a continuum as opposed to a binary scale and thus better represents the disease and healthy classifications. An improved metabolic profile obtained by combining MS and NMR by this approach may be useful to achieve more accurate disease detection and gain more insight regarding disease mechanisms and biology. Copyright © 2010 Elsevier B.V. All rights reserved.

  20. From Requirements via Colored Workflow Nets to an Implementation in Several Workflow Systems

    DEFF Research Database (Denmark)

    Mans, Ronnie S.; van der Aalst, Wil M.P.; Bakker, Piet J.M.

    2007-01-01

    Care organizations, such as hospitals, need to support complex and dynamic workflows. Moreover, many disciplines are involved. This makes it important to avoid the typical disconnect between requirements and the actual implementation of the system. This paper proposes an approach where an Executable Use Case (EUC) and a Colored Workflow Net (CWN) are used to close the gap between the given requirements specification and the realization of these requirements with the help of a workflow system. This paper describes a large case study where the diagnostic trajectory of the gynaecological oncology care process of the Academic Medical Center (AMC) hospital is used as the reference process. The process consists of hundreds of activities. These have been modeled and analyzed using an EUC and a CWN. Moreover, based on the CWN, the process has been implemented using four different workflow systems.

  1. From remote sensing data via information extraction to 3D geovisualization - Development of a workflow

    International Nuclear Information System (INIS)

    Tiede, D.

    2010-01-01

    With the increased availability of high (spatial) resolution remote sensing imagery since the late nineties, the need to develop operative workflows for the automated extraction, provision and communication of information from such data has grown. Monitoring requirements, aimed at the implementation of environmental or conservation targets, the management of (environmental) resources, and regional planning, as well as international initiatives, especially the joint initiative of the European Commission and ESA (European Space Agency) for Global Monitoring for Environment and Security (GMES), also play a major part. This thesis addresses the development of an integrated workflow for the automated provision of information derived from remote sensing data. With regard to the data used and the fields of application considered, this work aims to design the workflow as generically as possible. The following research questions are discussed: What are the requirements of a workflow architecture that seamlessly links the individual workflow elements in a timely manner and effectively secures the accuracy of the extracted information? How can the workflow retain its efficiency when large amounts of data are processed? How can the workflow be improved with regard to automated object-based image analysis (OBIA)? Which recent developments could be of use? What are the limitations, or which workarounds could be applied, in order to generate relevant results? How can relevant information be prepared in a target-oriented way and communicated effectively? How can the more recently developed, freely available virtual globes be used for the delivery of conditioned information, taking into consideration the third dimension as an additional, explicit carrier of information? Based on case studies comprising different data sets and fields of application, it is demonstrated how methods to extract and process information, as well as to effectively communicate results, can be improved and successfully combined within one workflow. It is shown that (1

  2. Workflows in bioinformatics: meta-analysis and prototype implementation of a workflow generator

    Directory of Open Access Journals (Sweden)

    Thoraval Samuel

    2005-04-01

    Background: Computational methods for problem solving need to interleave information access and algorithm execution in a problem-specific workflow. The structures of these workflows are defined by a scaffold of syntactic, semantic and algebraic objects capable of representing them. Despite the proliferation of GUIs (Graphic User Interfaces) in bioinformatics, only some of them provide workflow capabilities; surprisingly, no meta-analysis of workflow operators and components in bioinformatics has been reported. Results: We present a set of syntactic components and algebraic operators capable of representing analytical workflows in bioinformatics. Iteration, recursion, the use of conditional statements, and management of suspend/resume tasks have traditionally been implemented on an ad hoc basis and hard-coded; by having these operators properly defined it is possible to use and parameterize them as generic re-usable components. To illustrate how these operations can be orchestrated, we present GPIPE, a prototype graphic pipeline generator for PISE that allows the definition of a pipeline, parameterization of its component methods, and storage of metadata in XML formats. This implementation goes beyond the macro capacities currently in PISE. As the entire analysis protocol is defined in XML, a complete bioinformatic experiment (linked sets of methods, parameters and results) can be reproduced or shared among users. Availability: http://if-web1.imb.uq.edu.au/Pise/5.a/gpipe.html (interactive), ftp://ftp.pasteur.fr/pub/GenSoft/unix/misc/Pise/ (download). Conclusion: From our meta-analysis we have identified syntactic structures and algebraic operators common to many workflows in bioinformatics. The workflow components and algebraic operators can be assimilated into re-usable software components. GPIPE, a prototype implementation of this framework, provides a GUI builder to facilitate the generation of workflows and integration of heterogeneous
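
    To make the algebraic operators discussed above concrete (sequence, conditional and iteration as parameterizable, re-usable components), here is a toy Python sketch; it is not GPIPE or PISE code, and the example tasks are invented.

        # Toy composable workflow operators in the spirit of the meta-analysis.
        from typing import Callable

        def sequence(*steps: Callable) -> Callable:
            def run(data):
                for step in steps:          # execute steps in order
                    data = step(data)
                return data
            return run

        def conditional(pred: Callable, if_true: Callable, if_false: Callable) -> Callable:
            return lambda data: if_true(data) if pred(data) else if_false(data)

        def iterate(step: Callable, times: int) -> Callable:
            def run(data):
                for _ in range(times):      # bounded iteration
                    data = step(data)
                return data
            return run

        # Hypothetical three-step analysis pipeline built from the operators.
        pipeline = sequence(
            lambda seqs: [s.upper() for s in seqs],                 # normalize input
            conditional(lambda seqs: len(seqs) > 2,
                        lambda seqs: seqs[:2],                      # trim large inputs
                        lambda seqs: seqs),
            iterate(lambda seqs: seqs + ["REFINED"], times=1),      # refinement pass
        )
        print(pipeline(["acgt", "ggct", "ttaa"]))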

  3. Performing Workflows in Pervasive Environments Based on Context Specifications

    OpenAIRE

    Xiping Liu; Jianxin Chen

    2010-01-01

    Workflow performance consists of the performance of activities and of the transitions between activities. Along with the fast development of varied computing devices, activities in workflows and transitions between activities can be performed in pervasive ways, which means that workflow performance needs to migrate from traditional computing environments to pervasive environments. Performing workflows in pervasive environments requires taking into account the context information which affects b...

  4. Workflow Support for Advanced Grid-Enabled Computing

    OpenAIRE

    Xu, Fenglian; Eres, M.H.; Tao, Feng; Cox, Simon J.

    2004-01-01

    The Geodise project brings computer scientists' and engineers' skills together to build a service-oriented computing environment for engineers to perform complicated computations in a distributed system. The workflow tool is a front-end GUI that provides a full life cycle of workflow functions for Grid-enabled computing. The full life cycle of workflow functions has been enhanced based on our initial research and development. The life cycle starts with the composition of a workflow, followed by an ins...

  5. BIFI: a Taverna plugin for a simplified and user-friendly workflow platform.

    Science.gov (United States)

    Yildiz, Ahmet; Dilaveroglu, Erkan; Visne, Ilhami; Günay, Bilal; Sefer, Emrah; Weinhausel, Andreas; Rattay, Frank; Goble, Carole A; Pandey, Ram Vinay; Kriegner, Albert

    2014-10-20

    Heterogeneity in the features, input-output behaviour and user interfaces of available bioinformatics tools and services is still a bottleneck for both expert and non-expert users. Advances in providing common interfaces over such tools and services are gaining interest among researchers. However, the lack of (meta-)information about input-output data and parameters prevents the provision of automated and standardized solutions that can assist users in setting the appropriate parameters. These limitations must be resolved, especially in workflow-based solutions, in order to ease the integration of software. We report a Taverna Workbench plugin, XworX BIFI (Beautiful Interfaces for Inputs), implemented as a solution to the aforementioned issues. BIFI provides a Graphical User Interface (GUI) definition language used to lay out the user interface and to define parameter options for Taverna workflows. BIFI is also able to submit GUI Definition Files (GDF) directly or discover appropriate instances from a configured repository. In the absence of a GDF, BIFI generates a default interface. The Taverna Workbench is open source software providing the ability to combine various services within a workflow. Without BIFI, however, users can supply input data to the workflow only via a simple user interface providing a text area to enter the input in text form. The workflow may contain meta-information in human-readable form, such as description text for a port and an example value. However, not all workflow ports are documented so well or have all the required information. BIFI uses custom user interface components for ports, which give users feedback on the parameter data type or structure to be used for service execution and enable client-side data validation. Moreover, BIFI offers user interfaces that allow users to interactively construct workflow views and share them with the community, thus significantly increasing the usability of heterogeneous, distributed service

  6. Widening the adoption of workflows to include human and human-machine scientific processes

    Science.gov (United States)

    Salayandia, L.; Pinheiro da Silva, P.; Gates, A. Q.

    2010-12-01

    Scientific workflows capture knowledge in the form of technical recipes to access and manipulate data, helping scientists manage and reuse established expertise to conduct their work. Libraries of scientific workflows are being created in particular fields, e.g., Bioinformatics, which, combined with cyber-infrastructure environments that provide on-demand access to data and tools, result in powerful workbenches for scientists of those communities. The focus in these particular fields, however, has been more on automating than on documenting scientific processes. As a result, technical barriers have impeded a wider adoption of scientific workflows by scientific communities that do not rely as heavily on cyber-infrastructure and computing environments. Semantic Abstract Workflows (SAWs) are introduced to widen the applicability of workflows as a tool to document scientific recipes or processes. SAWs are intended to capture a scientist's perspective on how he or she would collect, filter, curate, and manipulate data to create the artifacts that are relevant to his or her work. In contrast, scientific workflows describe the process from the point of view of how technical methods and tools are used to conduct the work. By focusing on a higher level of abstraction that is closer to a scientist's understanding, SAWs effectively capture the controlled vocabularies that reflect a particular scientific community, as well as the types of datasets and methods used in a particular domain. From there on, SAWs provide the flexibility to adapt to different environments to carry out the recipes or processes. These environments range from manual fieldwork to highly technical cyber-infrastructure environments, such as those already supported by scientific workflows. Two cases, one from Environmental Science and another from Geophysics, are presented as illustrative examples.

  7. A practical data processing workflow for multi-OMICS projects.

    Science.gov (United States)

    Kohl, Michael; Megger, Dominik A; Trippler, Martin; Meckel, Hagen; Ahrens, Maike; Bracht, Thilo; Weber, Frank; Hoffmann, Andreas-Claudius; Baba, Hideo A; Sitek, Barbara; Schlaak, Jörg F; Meyer, Helmut E; Stephan, Christian; Eisenacher, Martin

    2014-01-01

    Multi-OMICS approaches aim at the integration of quantitative data obtained for different biological molecules in order to understand their interrelation and the functioning of larger systems. This paper deals with several data integration and data processing issues that frequently occur within this context. To this end, the data processing workflow within the PROFILE project is presented, a multi-OMICS project that aims at the identification of novel biomarkers and the development of new therapeutic targets for seven important liver diseases. Furthermore, a software tool called CrossPlatformCommander is sketched, which facilitates several steps of the proposed workflow in a semi-automatic manner. Application of the software is presented for the detection of novel biomarkers, their ranking and annotation with existing knowledge, using the example of corresponding Transcriptomics and Proteomics data sets obtained from patients suffering from hepatocellular carcinoma. Additionally, a linear regression analysis of Transcriptomics vs. Proteomics data is presented and its performance assessed. It was shown that, for capturing profound relations between Transcriptomics and Proteomics data, a simple linear regression analysis is not sufficient and that implementation and evaluation of alternative statistical approaches are needed. Additionally, the integration of multivariate variable selection and classification approaches is intended for further development of the software. Although this paper focuses only on the combination of data obtained from quantitative Proteomics and Transcriptomics experiments, several approaches and data integration steps are also applicable to other OMICS technologies. Keeping specific restrictions in mind, the suggested workflow (or at least parts of it) may be used as a template for similar projects that make use of different high-throughput techniques. This article is part of a Special Issue entitled: Computational Proteomics in the Post
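
    As a hedged, minimal sketch of the Transcriptomics-vs-Proteomics regression step mentioned above (not the CrossPlatformCommander implementation), one could fit a simple linear model between matched mRNA and protein fold changes and inspect the goodness of fit; the numbers below are invented.

        # Simple linear regression of protein vs. transcript log2 fold changes.
        import numpy as np
        from scipy import stats

        log2_mrna    = np.array([1.2, 2.5, 0.3, 3.1, 1.8, 2.2])   # invented values
        log2_protein = np.array([0.9, 2.1, 0.5, 2.6, 1.4, 2.0])   # invented values

        fit = stats.linregress(log2_mrna, log2_protein)
        print(f"slope = {fit.slope:.2f}, R^2 = {fit.rvalue**2:.2f}, p = {fit.pvalue:.3g}")
        # A modest R^2 here would echo the paper's point that a simple linear model
        # is not sufficient to capture the transcript/protein relationship.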

  8. On Lifecycle Constraints of Artifact-Centric Workflows

    Science.gov (United States)

    Kucukoguz, Esra; Su, Jianwen

    Data plays a fundamental role in modeling and management of business processes and workflows. Among the recent "data-aware" workflow models, artifact-centric models are particularly interesting. (Business) artifacts are the key data entities that are used in workflows and can reflect both the business logic and the execution states of a running workflow. The notion of artifacts succinctly captures the fluidity aspect of data during workflow executions. However, much of the technical dimension concerning artifacts in workflows is not well understood. In this paper, we study the key concept of an artifact "lifecycle". In particular, we allow declarative specifications/constraints of artifact lifecycles in the spirit of DecSerFlow, and formulate the notion of lifecycle as the set of all possible paths an artifact can navigate through. We investigate two technical problems: (Compliance) does a given workflow (schema) contain only lifecycles allowed by a constraint? And (automated construction) from a given lifecycle specification (constraint), is it possible to construct a "compliant" workflow? The study is based on a new formal variant of the artifact-centric workflow model called "ArtiNets" and two classes of lifecycle constraints named "regular" and "counting" constraints. We present a range of technical results concerning compliance and automated construction, including: (1) compliance is decidable when the workflow is atomic or the constraints are regular, (2) for each constraint, we can always construct a workflow that satisfies the constraint, and (3) sufficient conditions under which atomic workflows can be constructed.
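
    For a concrete (if greatly simplified) feel for the compliance question, the sketch below encodes a "regular" lifecycle constraint as a regular expression over state names and checks every path a hypothetical workflow can generate against it; the states and paths are invented and the sketch is not the ArtiNets formalism itself.

        # Toy lifecycle-compliance check against a regular constraint.
        import re

        # Constraint: an artifact is created, may be revised any number of times,
        # and must end either shipped or cancelled.
        constraint = re.compile(r"created(,revised)*,(shipped|cancelled)")

        # All lifecycle paths the (hypothetical) workflow schema can produce.
        workflow_paths = [
            "created,revised,shipped",
            "created,cancelled",
            "created,shipped,revised",   # violates the constraint
        ]

        for path in workflow_paths:
            ok = constraint.fullmatch(path) is not None
            print(path, "->", "ok" if ok else "violation")
        print("workflow compliant:", all(constraint.fullmatch(p) for p in workflow_paths))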

  9. WS-VLAM: A GT4 based workflow management system

    NARCIS (Netherlands)

    Wibisono, A.; Vasyunin, D.; Korkhov, V.; Zhao, Z.; Belloum, A.; de Laat, C.; Adriaans, P.; Hertzberger, B.

    2007-01-01

    Generic Grid middleware, e.g., Globus Toolkit 4 (GT4), provides basic services for scientific workflow management systems to discover, store and integrate workflow components. Using state-of-the-art Grid services can advance the functionality of the workflow engine in orchestrating distributed Grid

  10. Optimal resource assignment in workflows for maximizing cooperation

    NARCIS (Netherlands)

    Kumar, Akhil; Dijkman, R.M.; Song, Minseok; Daniel, Fl.; Wang, J.; Weber, B.

    2013-01-01

    A workflow is a team process since many actors work on various tasks to complete an instance. Resource management in such workflows deals with assignment of tasks to workers or actors. In team formation, it is necessary to ensure that members of a team are compatible with each other. When a workflow

  11. Combined Determination of Poly-β-Hydroxyalkanoic and Cellular Fatty Acids in Starved Marine Bacteria and Sewage Sludge by Gas Chromatography with Flame Ionization or Mass Spectrometry Detection

    Science.gov (United States)

    Odham, Göran; Tunlid, Anders; Westerdahl, Gunilla; Mårdén, Per

    1986-01-01

    Extraction of lipids from bacterial cells or sewage sludge samples, followed by simple and rapid extraction procedures and room temperature esterification with pentafluorobenzyl bromide, allowed combined determinations of poly-β-hydroxyalkanoate constituents and fatty acids. Capillary gas chromatography with flame ionization or mass spectrometric detection was used. Flame ionization permitted determination with a coefficient of variation ranging from 10 to 27% at the picomolar level, whereas quantitative chemical ionization mass spectrometry afforded sensitivities for poly-β-hydroxyalkanoate constituents in the attomolar range. The latter technique suggests the possibility of measuring such components in bacterial assemblies with as few as 10² cells. With the described technique using flame ionization detection, it was possible to study the rapid formation of poly-β-hydroxyalkanoate during feeding of a starved marine bacterium isolate with a complex medium or glucose and to correlate the findings to changes in cell volumes. Mass spectrometric detection of short β-hydroxy acids in activated sewage sludge revealed the presence of 3-hydroxybutyric, 3-hydroxyhexanoic, and 3-hydroxyoctanoic acids in the relative proportions of 56, 5 and 39%, respectively. No odd-chain β-hydroxy acids were found. PMID:16347181

  12. Cloud point extraction combined with electrothermal atomic absorption spectrometry for the speciation of antimony(III) and antimony(V) in food packaging materials

    International Nuclear Information System (INIS)

    Jiang Xiuming; Wen Shengping; Xiang Guoqiang

    2010-01-01

    A simple, sensitive method for the speciation of inorganic antimony by cloud point extraction combined with electrothermal atomic absorption spectrometry (ETAAS) is presented and evaluated. The method is based on the formation of a hydrophobic complex of antimony(III) with ammonium pyrrolidine dithiocarbamate (APDC) at pH 5.0; the hydrophobic complex subsequently enters the surfactant-rich phase, whereas antimony(V) remains in aqueous solution. Antimony(III) in the surfactant-rich phase was analyzed by ETAAS after dilution with 0.2 mL of nitric acid in methanol (0.1 M), and antimony(V) was calculated by subtracting antimony(III) from the total antimony obtained after reducing antimony(V) to antimony(III) with L-cysteine. The main factors affecting the cloud point extraction, such as pH, concentration of APDC and Triton X-114, equilibrium temperature, incubation time and sample volume, were investigated in detail. Under the optimum conditions, the detection limit (3σ) of the proposed method was 0.02 ng mL⁻¹ for antimony(III), and the relative standard deviation was 7.8% (c = 1.0 ng mL⁻¹, n = 7). The proposed method was successfully applied to the speciation of inorganic antimony in the leaching solutions of different food packaging materials with satisfactory results.

  13. Speciation of mercury in water samples by dispersive liquid-liquid microextraction combined with high performance liquid chromatography-inductively coupled plasma mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Jia Xiaoyu; Han Yi; Liu Xinli [State Key Laboratory of Electroanalytical Chemistry, Changchun Institute of Applied Chemistry, Chinese Academy of Science, Changchun 130022 (China); Graduate School of Chinese Academy of Sciences, Beijing 100039 (China); Duan Taicheng, E-mail: tcduan@ciac.jl.cn [State Key Laboratory of Electroanalytical Chemistry, Changchun Institute of Applied Chemistry, Chinese Academy of Science, Changchun 130022 (China); Chen Hangting, E-mail: htchen@ciac.jl.cn [State Key Laboratory of Electroanalytical Chemistry, Changchun Institute of Applied Chemistry, Chinese Academy of Science, Changchun 130022 (China)

    2011-01-15

    The dispersive liquid-liquid microextraction (DLLME) combined with high performance liquid chromatography-inductively coupled plasma mass spectrometry for the speciation of mercury in water samples was described. Firstly methylmercury (MeHg⁺) and mercury (Hg²⁺) were complexed with sodium diethyldithiocarbamate, and then the complexes were extracted into carbon tetrachloride by using DLLME. Under the optimized conditions, enrichment factors of 138 and 350 for MeHg⁺ and Hg²⁺ were obtained from only 5.00 mL of sample solution. The detection limits of the analytes (as Hg) were 0.0076 ng mL⁻¹ for MeHg⁺ and 0.0014 ng mL⁻¹ for Hg²⁺, respectively. The relative standard deviations for ten replicate measurements of 0.5 ng mL⁻¹ MeHg⁺ and Hg²⁺ were 6.9% and 4.4%, respectively. Standard reference material of seawater (GBW(E)080042) was analyzed to verify the accuracy of the method and the results were in good agreement with the certified values. Finally, the developed method was successfully applied for the speciation of mercury in three environmental water samples.

  14. Using Light Microscopy and Liquid Chromatography Tandem Mass Spectrometry for Qualitative and Quantitative Control of a Combined Three-Herb Formulation in Different Preparations

    Directory of Open Access Journals (Sweden)

    Tun-Pin Hsueh

    2016-12-01

    Artemisia capillaris Thunb., Gardenia jasminoides Ellis, and Rheum officinale Baill have been combined to treat jaundice for thousands of years. Studies have revealed that these herbs exert anti-hepatic-fibrosis and anti-hepatic-apoptosis effects and alleviate hepatic oxidative stress. This study aims to determine the quality and quantity of an herbal formulation (Chinese name: Yin-Chen-Hao-Tang) using physical and chemical examinations. Physical examination of Yin-Chen-Hao-Tang in pharmaceutical herbal products, raw fiber powders, and decoction preparations was performed using Congo red and iodine-potassium staining. A sensitive and validated method employing ultra-high-performance liquid chromatography tandem mass spectrometry (UHPLC-MS/MS) was developed to simultaneously quantify the bioactive compounds scoparone, geniposide, and rhein in the Yin-Chen-Hao-Tang formulation in different preparations. Physical examination indicated that cellulose fibers with irregular round shapes were present in the pharmaceutical herbal products. The developed UHPLC-MS/MS method showed good linearity and was well validated. The quantification results revealed that the decoction preparations had the highest amounts of geniposide and rhein. Scoparone appeared in pharmaceutical herbal products from two manufacturers. This experiment provides a qualitative and quantitative method using physical and chemical examinations to test different preparations of herbal products. The results provide a reference for clinical herbal product preparations and further pharmacokinetic research.

  15. Selection of the optimal combination of water vapor absorption lines for detection of temperature in combustion zones of mixing supersonic gas flows by diode laser absorption spectrometry

    International Nuclear Information System (INIS)

    Mironenko, V.R.; Kuritsyn, Yu.A.; Bolshov, M.A.; Liger, V.V.

    2017-01-01

    Determination of the temperature of a gas medium by diode laser absorption spectrometry (DLAS) is based on the measurement of the integral intensities of the absorption lines of a test molecule (generally the water vapor molecule). In the case of local thermodynamic equilibrium, the temperature is inferred from the ratio of the integral intensities of two lines with different lower energy levels. For total gas pressures above 1 atm the absorption lines are broadened, and one cannot find isolated, well-resolved water vapor absorption lines within the relatively narrow spectral interval of a fast diode laser (DL) tuning range (about 3 cm⁻¹). For diagnostics of a gas object at high temperature and pressure, the DLAS technique can be realized with two diode lasers working in different spectral regions with strong absorption lines. In such a situation the criteria for optimal line selection differ significantly from the case of narrow lines. These criteria are discussed in our work. Software for selecting the optimal spectral regions using the HITRAN-2012 and HITEMP databases was developed. The program selects spectral regions of DL tuning that minimize the error of temperature determination δT/T, based on the attainable experimental error of line intensity measurement δS. Two combinations of optimal spectral regions were selected – (1.392 & 1.343 μm) and (1.392 & 1.339 μm). Different algorithms of experimental data processing are discussed.
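
    For readers unfamiliar with two-line ratio thermometry, the textbook relation behind this record can be written as follows (a generic form, not necessarily the exact expressions used by the authors; S_i denote integrated line intensities and E''_i the lower-state energies):

        \frac{S_1(T)}{S_2(T)} = \frac{S_1(T_0)}{S_2(T_0)}
            \exp\!\left[-\frac{hc}{k}\,\bigl(E''_1 - E''_2\bigr)
            \left(\frac{1}{T} - \frac{1}{T_0}\right)\right],
        \qquad
        T = \frac{\dfrac{hc}{k}\,\bigl(E''_2 - E''_1\bigr)}
                 {\ln\dfrac{S_1(T)}{S_2(T)} - \ln\dfrac{S_1(T_0)}{S_2(T_0)}
                  + \dfrac{hc}{k}\,\dfrac{E''_2 - E''_1}{T_0}}

    Propagating a relative error δR/R of the measured intensity ratio through this relation gives δT/T ≈ [kT/(hc·|E''_1 − E''_2|)]·δR/R, which is why line pairs with a large lower-state energy difference over the expected temperature range are preferred.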

  16. Determination of six polyether antibiotic residues in foods of animal origin by solid phase extraction combined with liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Ha, Jing; Song, Ge; Ai, Lian-Feng; Li, Jian-Chen

    2016-04-01

    A new method using solid phase extraction (SPE) combined with liquid chromatography-tandem mass spectrometry (LC-MS/MS) has been developed for the determination of residues of six polyether antibiotics, namely lasalocid, salinomycin, monensin, narasin, maduramicin and nigericin, in foods of animal origin. The samples were extracted with acetonitrile and purified on ENVI-Carb SPE columns after comparing the impurity effect and ease of handling of several SPE cartridges. Subsequently, the analytes were separated on a Hypersil Gold column (2.1 × 150 mm, 5 μm) and analyzed by MS/MS detection. The limit of quantitation (LOQ) for milk and chicken was 0.4 μg/kg, and for chicken livers and eggs it was 1 μg/kg. The linearity was satisfactory, with a correlation coefficient of >0.9995 at concentrations ranging from 2 to 100 μg/L. The average recoveries of the analytes fortified at three levels ranged from 68.2 to 114.3%, and the relative standard deviations ranged from 4.5 to 12.1%. The method was suitable for quantitative analysis and confirmation of polyether antibiotic residues in foods of animal origin. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Separation and determination of copper in bottled water samples by combination of dispersive liquid-liquid microextraction and microsample introduction flame atomic absorption spectrometry.

    Science.gov (United States)

    Citak, Demirhan; Tuzen, Mustafa

    2013-01-01

    A new and simple method for the determination of trace amounts of Cu(II) was developed by combining dispersive liquid-liquid microextraction (DLLME) preconcentration with microsample introduction flame atomic absorption spectrometry. In this method, ethanol and chloroform were chosen as the disperser and extraction solvents, respectively, and 1-nitroso-2-naphthol was used as the complexing agent. The factors affecting the extraction efficiency and determination of Cu(II), including the nature and volume of the extraction and disperser solvents, the concentration of the complexing agent, the pH of the solution, the extraction time, and matrix ions, were investigated. Under optimal conditions, the LOD for Cu(II) was 0.95 μg/L with a preconcentration factor of 70. The RSD was 1.9%. The accuracy of the developed DLLME method was verified by determination of Cu(II) in a certified reference material (NRCC-SLRS-4 river water); the relative error was -3.31%. The developed preconcentration procedure was successfully applied to the analysis of bottled drinking water samples.

  18. Dispersive Liquid-Liquid Microextraction Combined with Ultrahigh Performance Liquid Chromatography/Tandem Mass Spectrometry for Determination of Organophosphate Esters in Aqueous Samples

    Directory of Open Access Journals (Sweden)

    Haiying Luo

    2014-01-01

    A new technique was established in this work to identify eight organophosphate esters (OPEs). It utilised dispersive liquid-liquid microextraction in combination with ultrahigh performance liquid chromatography/tandem mass spectrometry. The type and volume of the extraction solvent, the dispersion agent, and the amount of NaCl were optimized. The target analytes were detected in the range of 1.0-200 µg/L with correlation coefficients ranging from 0.9982 to 0.9998, and the detection limits of the analytes ranged from 0.02 to 0.07 µg/L (S/N = 3). The feasibility of this method was demonstrated by identifying OPEs in aqueous samples, with spiked recoveries ranging between 48.7% and 58.3% for triethyl phosphate (TEP) and between 85.9% and 113% for the other OPEs. The precision ranged from 3.2% to 9.3% (n = 6), and the interprecision ranged from 2.6% to 12.3% (n = 5). Only 2 of the 12 selected samples tested positive for OPEs, and the total concentrations of OPEs in them were 1.1 and 1.6 µg/L, respectively. This method was confirmed to be simple, fast, and accurate for identifying OPEs in aqueous samples.

  19. Analysis of Bioactive Components of Oilseed Cakes by High-Performance Thin-Layer Chromatography-(Bio)assay Combined with Mass Spectrometry

    Directory of Open Access Journals (Sweden)

    Sue-Siang Teh

    2015-03-01

    Hemp, flax and canola seed cakes are byproducts of the plant oil extraction industry that have not received much attention in terms of their potential use for human food instead of animal feed. Thus, the bioactivity profiling of these oilseed cakes is of interest. For their effect-directed analysis, planar chromatography was combined with several (bio)assays, namely 2,2-diphenyl-1-picrylhydrazyl scavenging, acetylcholine esterase inhibition, planar yeast estrogen screen, and antimicrobial Bacillus subtilis and Aliivibrio fischeri assays. The streamlined high-performance thin-layer chromatography (HPTLC)-bioassay method allowed the discovery of previously unknown bioactive compounds present in these oilseed cake extracts. In contrast to target analysis, the direct link to the effective compounds provided comprehensive information with regard to selected effects. HPTLC-electrospray ionization-mass spectrometry via the elution-head based TLC-MS Interface was used for a first characterization of the unknown effective compounds. The demonstrated bioactivity profiling on the feed/food intake side may guide the isolation of active compounds for production of functional food or for justified motivation of functional feed/food supplements.

  20. Molecular-level characterization of crude oil compounds combining reversed-phase high-performance liquid chromatography with off-line high-resolution mass spectrometry

    Science.gov (United States)

    Sim, Arum; Cho, Yunju; Kim, Daae; Witt, Matthias; Birdwell, Justin E.; Kim, Byung Ju; Kim, Sunghwan

    2014-01-01

    A reversed-phase separation technique was developed in a previous study (Loegel et al., 2012) and successfully applied to the de-asphalted fraction of crude oil. However, to the best of our knowledge, the molecular-level characterization of oil fractions obtained by reversed-phase high-performance liquid chromatography (HPLC) coupled with high-resolution mass spectrometry (MS) has not yet been reported. A detailed characterization of the oil fractions prepared by reversed-phase HPLC was performed in this study. HPLC fractionation was carried out on conventional crude oil and an oil shale pyrolysate. The analyses of the fractions showed that the carbon number of alkyl chains and the double bond equivalent (DBE) value were the major factors determining elution order. Compounds with larger DBE values (presumably more condensed aromatic structures) and smaller carbon numbers (presumably compounds with short side chains) eluted earlier, whereas compounds with lower DBE values (presumably less aromatic structures) and higher carbon numbers (presumably compounds with longer alkyl chains) eluted later in the chromatograms. This separation behavior is in good agreement with that expected from the principles of reversed-phase separation. The data presented in this study show that reversed-phase chromatography is effective in separating crude oil compounds and can be combined with ultrahigh-resolution MS data to better understand natural oils and oil shale pyrolysates.
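
    The double bond equivalent values referred to above follow the standard ring-plus-double-bond count for an assigned elemental composition; a short sketch (illustrative only, not code from the study):

        # DBE = C - H/2 + N/2 + 1 for a CcHhNn(OoSs) formula (O and S do not contribute).
        def dbe(c: int, h: int, n: int = 0) -> float:
            return c - h / 2.0 + n / 2.0 + 1.0

        # Example: C13H11N (a methylcarbazole-type composition) -> DBE = 9.0,
        # i.e. a condensed aromatic core that would elute early under the trend described.
        print(dbe(13, 11, 1))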

  1. Speciation of mercury in water samples by dispersive liquid-liquid microextraction combined with high performance liquid chromatography-inductively coupled plasma mass spectrometry

    International Nuclear Information System (INIS)

    Jia Xiaoyu; Han Yi; Liu Xinli; Duan Taicheng; Chen Hangting

    2011-01-01

    The dispersive liquid-liquid microextraction (DLLME) combined with high performance liquid chromatography-inductively coupled plasma mass spectrometry for the speciation of mercury in water samples was described. Firstly methylmercury (MeHg⁺) and mercury (Hg²⁺) were complexed with sodium diethyldithiocarbamate, and then the complexes were extracted into carbon tetrachloride by using DLLME. Under the optimized conditions, enrichment factors of 138 and 350 for MeHg⁺ and Hg²⁺ were obtained from only 5.00 mL of sample solution. The detection limits of the analytes (as Hg) were 0.0076 ng mL⁻¹ for MeHg⁺ and 0.0014 ng mL⁻¹ for Hg²⁺, respectively. The relative standard deviations for ten replicate measurements of 0.5 ng mL⁻¹ MeHg⁺ and Hg²⁺ were 6.9% and 4.4%, respectively. Standard reference material of seawater (GBW(E)080042) was analyzed to verify the accuracy of the method and the results were in good agreement with the certified values. Finally, the developed method was successfully applied for the speciation of mercury in three environmental water samples.

  2. Illustration of compositional variations over time of Chinese porcelain glazes combining micro-X-ray Fluorescence spectrometry, multivariate data analysis and Seger formulas

    Science.gov (United States)

    Van Pevenage, J.; Verhaeven, E.; Vekemans, B.; Lauwers, D.; Herremans, D.; De Clercq, W.; Vincze, L.; Moens, L.; Vandenabeele, P.

    2015-01-01

    In this research, the transparent glaze layers of Chinese porcelain samples were investigated. Depending on the production period, these samples can be divided into two groups: the samples of group A dating from the Kangxi period (1661-1722), and the samples of group B produced under emperor Qianlong (1735-1795). The specific sample preparation method and the small spot size of the X-ray beam made investigation of the transparent glaze layers possible. Despite the many existing research papers about glaze investigations of ceramics and/or porcelain ware, this research reveals new insights into the glaze composition and structure of Chinese porcelain samples. In this paper it is demonstrated, using micro-X-ray Fluorescence (μ-XRF) spectrometry, multivariate data analysis and statistical analysis (Hotelling's T-Square test), that the transparent glaze layers of the samples of groups A and B are significantly different (95% confidence level). Calculation of the Seger formulas enabled classification of the glazes. Combining all the information, the difference in composition of the Chinese porcelain glazes of the Kangxi period and the Qianlong period can be demonstrated.
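
    The significance claim above rests on a two-sample Hotelling's T-squared test. The following minimal sketch shows the standard two-sample form with a pooled covariance matrix, applied to hypothetical glaze compositions; it is not the authors' code, and the data are invented.

```python
# Minimal sketch of a two-sample Hotelling's T-squared test (standard form,
# pooled covariance); not the authors' implementation, data are invented.
import numpy as np
from scipy.stats import f

def hotelling_t2(x: np.ndarray, y: np.ndarray) -> tuple[float, float]:
    """x, y: (n_samples, n_variables) arrays, e.g. element concentrations."""
    n1, n2, p = len(x), len(y), x.shape[1]
    d = x.mean(axis=0) - y.mean(axis=0)
    s_pooled = ((n1 - 1) * np.cov(x, rowvar=False) +
                (n2 - 1) * np.cov(y, rowvar=False)) / (n1 + n2 - 2)
    t2 = (n1 * n2) / (n1 + n2) * d @ np.linalg.solve(s_pooled, d)
    f_stat = (n1 + n2 - p - 1) / (p * (n1 + n2 - 2)) * t2
    p_value = f.sf(f_stat, p, n1 + n2 - p - 1)
    return t2, p_value

# Hypothetical glaze compositions (rows = samples, columns = oxide contents):
rng = np.random.default_rng(0)
group_a = rng.normal([70, 15, 5], 1.0, size=(8, 3))
group_b = rng.normal([68, 17, 5], 1.0, size=(7, 3))
t2, p = hotelling_t2(group_a, group_b)
print(f"T2 = {t2:.1f}, p = {p:.4f}")   # p < 0.05 -> groups differ at the 95% level
```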

  3. Sensitive Determination of Onco-metabolites of D- and L-2-hydroxyglutarate Enantiomers by Chiral Derivatization Combined with Liquid Chromatography/Mass Spectrometry Analysis

    Science.gov (United States)

    Cheng, Qing-Yun; Xiong, Jun; Huang, Wei; Ma, Qin; Ci, Weimin; Feng, Yu-Qi; Yuan, Bi-Feng

    2015-01-01

    2-hydroxyglutarate (2HG) is a potent competitor of α-ketoglutarate (α-KG) and can inhibit multiple α-KG-dependent dioxygenases that act on epigenetic modifications. The accumulation of 2HG contributes to an elevated risk of malignant tumors. 2HG carries an asymmetric carbon atom in its carbon backbone, and differentiation between D-2-hydroxyglutarate (D-2HG) and L-2-hydroxyglutarate (L-2HG) is crucially important for accurate diagnosis of 2HG-related diseases. Here we developed a strategy of chiral derivatization combined with liquid chromatography-electrospray ionization-tandem mass spectrometry (LC-ESI-MS/MS) analysis for highly sensitive determination of the D-2HG and L-2HG enantiomers. N-(p-toluenesulfonyl)-L-phenylalanyl chloride (TSPC) was used to derivatize 2HG. The diastereomers formed by TSPC labeling efficiently improve the chromatographic separation of D-2HG and L-2HG. Derivatization with TSPC also markedly increased the detection sensitivities, by 291- and 346-fold for D-2HG and L-2HG, respectively. Using the developed method, we measured the contents of D-2HG and L-2HG in clear cell renal cell carcinoma (ccRCC) tissues. We observed 12.9- and 29.8-fold increases of D-2HG and L-2HG, respectively, in human ccRCC tissues compared to adjacent normal tissues. The developed chiral derivatization combined with LC-ESI-MS/MS analysis offers sensitive determination of the D-2HG and L-2HG enantiomers, which benefits the precise diagnosis of 2HG-related metabolic diseases. PMID:26458332

  4. Illustration of compositional variations over time of Chinese porcelain glazes combining micro-X-ray Fluorescence spectrometry, multivariate data analysis and Seger formulas

    International Nuclear Information System (INIS)

    Van Pevenage, J.; Verhaeven, E.; Vekemans, B.; Lauwers, D.; Herremans, D.; De Clercq, W.; Vincze, L.; Moens, L.; Vandenabeele, P.

    2015-01-01

    In this research, the transparent glaze layers of Chinese porcelain samples were investigated. Depending on the production period, these samples can be divided into two groups: the samples of group A dating from the Kangxi period (1661–1722), and the samples of group B produced under emperor Qianlong (1735–1795). The specific sample preparation method and the small spot size of the X-ray beam made investigation of the transparent glaze layers possible. Despite the many existing research papers about glaze investigations of ceramics and/or porcelain ware, this research reveals new insights into the glaze composition and structure of Chinese porcelain samples. In this paper it is demonstrated, using micro-X-ray Fluorescence (μ-XRF) spectrometry, multivariate data analysis and statistical analysis (Hotelling's T-Square test), that the transparent glaze layers of the samples of groups A and B are significantly different (95% confidence level). Calculation of the Seger formulas enabled classification of the glazes. Combining all the information, the difference in composition of the Chinese porcelain glazes of the Kangxi period and the Qianlong period can be demonstrated. - Highlights: • Fully described methodology for the analysis of silicate glazes of Chinese porcelain samples • The combination of a semi-quantitative analysis of silicate glazes, multi-variate data and statistical analysis. • The use of Seger formulas to better understand the composition of the glazes. • New insights into the glaze composition and structure of Chinese porcelain glazes of different time periods

  5. Method development and validation of liquid chromatography-tandem mass spectrometry for aldosterone in human plasma: Application to drug interaction study of atorvastatin and olmesartan combination

    Directory of Open Access Journals (Sweden)

    Rakesh Das

    2014-01-01

    Full Text Available In the present investigation, a simple and sensitive liquid chromatography-tandem mass spectrometry (LC/MS/MS) method was developed for the quantification of aldosterone (ALD), a hormone involved in blood pressure regulation, in human plasma. The developed method was validated and extended for application to human subjects to study the drug interaction of atorvastatin (ATSV) and olmesartan (OLM) on levels of ALD. ALD in plasma was extracted by liquid-liquid extraction with 5 mL dichloromethane/ethyl ether (60/40%, v/v). The chromatographic separation of ALD was carried out on an Xterra RP C18 column (150 mm × 4.6 mm, 3.5 μm) at 30°C with a four-step gradient program composed of methanol and water. Step 1 started with 35% methanol for the first 1 min; in Step 2 the methanol content changed linearly to 90% over the next 1.5 min; Step 3 held 90% methanol for the next 2 min; and Step 4 returned to the initial methanol concentration of 35%, giving a total run time of 17.5 min. The flow rate was 0.25 mL/min throughout. The developed method was validated for specificity, accuracy, precision, stability, linearity, sensitivity, and recovery. The method was linear and acceptable over the range of 50-800 ng/mL. The method was successfully applied to the drug interaction study of ATSV + OLM in combination versus OLM treatment alone on blood pressure by quantifying changes in levels of ALD in hypertensive patients. The study revealed that ALD levels were significantly higher under the ATSV + OLM treatment condition than under OLM alone, which may explain the lower effectiveness of the ATSV + OLM combination instead of the expected synergistic activity.
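
    For clarity, the four-step methanol/water gradient described above can be written as a simple time table. In the sketch below, the duration of Step 4 is an assumption chosen so that the run totals the stated 17.5 min (the abstract gives only the overall run time); everything else follows the description.

```python
# Sketch of the reported four-step methanol/water gradient as a time table.
# The duration of Step 4 is an assumption chosen so that the run totals
# 17.5 min (only the overall run time is stated); flow rate 0.25 mL/min.
gradient = [
    # (step, duration_min, methanol_percent)
    (1, 1.0,  35),   # hold 35% methanol
    (2, 1.5,  90),   # linear ramp to 90% methanol
    (3, 2.0,  90),   # hold 90% methanol
    (4, 13.0, 35),   # return to 35% and re-equilibrate (assumed duration)
]

total = sum(duration for _, duration, _ in gradient)
print(f"total run time: {total} min")   # 17.5 min
```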

  6. A dipole-assisted solid-phase extraction microchip combined with inductively coupled plasma-mass spectrometry for online determination of trace heavy metals in natural water.

    Science.gov (United States)

    Shih, Tsung-Ting; Hsu, I-Hsiang; Chen, Shun-Niang; Chen, Ping-Hung; Deng, Ming-Jay; Chen, Yu; Lin, Yang-Wei; Sun, Yuh-Chang

    2015-01-21

    We employed a polymeric material, poly(methyl methacrylate) (PMMA), for fabricating a microdevice and then implanted the chlorine (Cl)-containing solid-phase extraction (SPE) functionality into the PMMA chip to develop an innovative on-chip dipole-assisted SPE technique. Instead of the ion-ion interactions utilized in on-chip SPE techniques, the dipole-ion interactions between the highly electronegative C-Cl moieties in the channel interior and the positively charged metal ions were employed to facilitate the on-chip SPE procedures. Furthermore, to avoid labor-intensive manual manipulation, a programmable valve manifold was designed as an interface combining the dipole-assisted SPE microchip and inductively coupled plasma-mass spectrometry (ICP-MS) to achieve the fully automated operation. Under the optimized operation conditions for the established system, the detection limits for each analyte ion were obtained based on three times the standard deviation of seven measurements of the blank eluent solution. The limits ranged from 3.48 to 20.68 ng L(-1), suggesting that this technique appears uniquely suited for determining the levels of heavy metal ions in natural water. Indeed, a series of validation procedures demonstrated that the developed method could be satisfactorily applied to the determination of trace heavy metals in natural water. Remarkably, the developed device was durable enough to be reused more than 160 times without any loss in its analytical performance. To the best of our knowledge, this is the first study reporting on the combination of a dipole-assisted SPE microchip and elemental analysis instrument for the online determination of trace heavy metal ions.
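
    The detection limits above were derived as three times the standard deviation of seven blank measurements. A minimal sketch of that calculation follows; the blank readings are hypothetical values expressed directly in concentration units.

```python
# Minimal sketch: limit of detection taken as three times the standard
# deviation of replicate blank measurements (here seven hypothetical blank
# readings, already expressed in concentration units, ng/L).
import statistics

blank_readings_ng_per_l = [1.2, 0.9, 1.5, 1.1, 1.3, 0.8, 1.4]   # hypothetical
lod = 3 * statistics.stdev(blank_readings_ng_per_l)
print(f"LOD = {lod:.2f} ng/L")
```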

  7. Illustration of compositional variations over time of Chinese porcelain glazes combining micro-X-ray Fluorescence spectrometry, multivariate data analysis and Seger formulas

    Energy Technology Data Exchange (ETDEWEB)

    Van Pevenage, J., E-mail: Raman@UGent.be [Department of Analytical Chemistry, Raman Spectroscopy Research Group, Ghent University, Krijgslaan 281, S12, B-9000 Ghent (Belgium); Verhaeven, E. [Department of Conservation and Restoration, University College Antwerp, Blindestraat 9, B-2000 Antwerp (Belgium); Vekemans, B. [Department of Analytical Chemistry, Ghent University, Krijgslaan 281, S12, B-9000 Ghent (Belgium); Lauwers, D., E-mail: Raman@UGent.be [Department of Analytical Chemistry, Raman Spectroscopy Research Group, Ghent University, Krijgslaan 281, S12, B-9000 Ghent (Belgium); Herremans, D.; De Clercq, W. [Department of Archaeology, Ghent University, Sint-Pietersnieuwstraat 35, B-9000 Ghent (Belgium); Vincze, L. [Department of Analytical Chemistry, Ghent University, Krijgslaan 281, S12, B-9000 Ghent (Belgium); Moens, L., E-mail: Raman@UGent.be [Department of Analytical Chemistry, Raman Spectroscopy Research Group, Ghent University, Krijgslaan 281, S12, B-9000 Ghent (Belgium); Vandenabeele, P. [Department of Archaeology, Ghent University, Sint-Pietersnieuwstraat 35, B-9000 Ghent (Belgium)

    2015-01-01

    In this research, the transparent glaze layers of Chinese porcelain samples were investigated. Depending on the production period, these samples can be divided into two groups: the samples of group A dating from the Kangxi period (1661–1722), and the samples of group B produced under emperor Qianlong (1735–1795). The specific sample preparation method and the small spot size of the X-ray beam made investigation of the transparent glaze layers possible. Despite the many existing research papers about glaze investigations of ceramics and/or porcelain ware, this research reveals new insights into the glaze composition and structure of Chinese porcelain samples. In this paper it is demonstrated, using micro-X-ray Fluorescence (μ-XRF) spectrometry, multivariate data analysis and statistical analysis (Hotelling's T-Square test), that the transparent glaze layers of the samples of groups A and B are significantly different (95% confidence level). Calculation of the Seger formulas enabled classification of the glazes. Combining all the information, the difference in composition of the Chinese porcelain glazes of the Kangxi period and the Qianlong period can be demonstrated. - Highlights: • Fully described methodology for the analysis of silicate glazes of Chinese porcelain samples • The combination of a semi-quantitative analysis of silicate glazes, multi-variate data and statistical analysis. • The use of Seger formulas to better understand the composition of the glazes. • New insights into the glaze composition and structure of Chinese porcelain glazes of different time periods.

  8. Analysis of neutral volatile aroma components in Tilsit cheese using a combination of dynamic headspace technique, capillary gas chromatography and mass spectrometry

    International Nuclear Information System (INIS)

    Dillinger, K.H.

    2000-03-01

    Tilsit cheese is made by the action of rennet and starter cultures on milk. Ripening is carried out by repeated inoculation of the surface of the Tilsit cheese with yeasts and red smear cultures. This surface flora forms the typical aroma of the Tilsit cheese during the ripening process. The aim of the work was to obtain general knowledge about the kind and amount of the neutral volatile aroma components of Tilsit cheese. Beyond this, the ability of red smear cultures to form aroma components and the distribution of these components in the cheese were to be examined. The results were intended to evaluate the formation of aroma components in Tilsit cheese. The semi-quantitative analyses of the aroma components of all samples were done by combining dynamic headspace extraction, gas chromatography and mass spectrometry. In this process the neutral volatile aroma components were extracted by the dynamic headspace technique, adsorbed on a trap, thermally desorbed, separated by gas chromatography, and detected and identified by mass spectrometry. 63 components belonging to the chemical classes of esters, ketones, aldehydes, alcohols and sulfur-containing substances as well as aromatic hydrocarbons, chlorinated hydrocarbons and hydrocarbons were found in the analysed cheese samples from different Austrian Tilsit manufacturing plants. All cheese samples showed a qualitatively equal but quantitatively varied spectrum of aroma components. The cultivation of pure cultures on a cheese agar medium showed that all analysed aroma components are involved in the biochemical metabolism of these cultures. The ability to produce aroma components differed greatly between the strains, and it was not possible to correlate this ability with the taxonomic classification of the strains. The majority of the components had a non-homogeneous concentration profile in the cheese body. This was explained by diffusion effects and by temporally and spatially different formation of components by the metabolism of the

  9. Determination of chlorophenols in landfill leachate using headspace sampling with ionic liquid-coated solid-phase microextraction fibers combined with gas chromatography–mass spectrometry

    International Nuclear Information System (INIS)

    Ho, Tse-Tsung; Chen, Chung-Yu; Li Zuguang; Yang, Thomas Ching-Cherng; Lee, Maw-Rong

    2012-01-01

    Highlights: ► The ionic liquid (IL) [C₄MIM][PF₆] was rapidly synthesized by microwave radiation. ► Trace chlorophenols in landfill leachate were extracted by IL-coated SPME. ► The IL-coated SPME-GC/MS method is low-cost, solvent-free and sensitive. - Abstract: A new microextraction technique based on ionic liquid solid-phase microextraction (IL-SPME) was developed for the determination of trace chlorophenols (CPs) in landfill leachate. The synthesized ionic liquid, 1-butyl-3-methylimidazolium hexafluorophosphate ([C₄MIM][PF₆]), was coated onto a spent SPME fiber for extraction of trace CPs. After extraction, the absorbed analytes were desorbed and quantified using gas chromatography–mass spectrometry (GC/MS). The proposed method is termed ionic liquid-coated solid-phase microextraction combined with gas chromatography–mass spectrometry (IL-SPME-GC/MS). No carryover effect was found, and every laboratory-made ionic liquid-coated fiber could be used for at least eighty extractions without degradation of efficiency. The chlorophenols studied were 2,4-dichlorophenol (2,4-DP), 2,4,6-trichlorophenol (2,4,6-TCP), 2,3,4,6-tetrachlorophenol (2,3,4,6-TeCP), and pentachlorophenol (PCP). The best results for chlorophenol analysis were obtained with landfill leachate at pH 2, headspace extraction for 4 min, and thermal desorption in the gas chromatograph injector at 240 °C for 4 min. Linearity was observed from 0.1 to 1000 μg L⁻¹ with relative standard deviations (RSD) less than 7%, and recoveries were over 87%. The limit of detection (LOD) for pentachlorophenol was 0.008 μg L⁻¹. The proposed method was tested by analyzing landfill leachate from a sewage farm. The concentrations of chlorophenols were found to range from 1.1 to 1.4 μg L⁻¹. The results demonstrate that the IL-SPME-GC/MS method is highly effective in analyzing trace chlorophenols in landfill leachate.

  10. Determination of chlorophenols in landfill leachate using headspace sampling with ionic liquid-coated solid-phase microextraction fibers combined with gas chromatography-mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Ho, Tse-Tsung; Chen, Chung-Yu [Department of Chemistry, National Chung Hsing University, Taichung 40227, Taiwan (China); Li Zuguang [Department of Chemistry, National Chung Hsing University, Taichung 40227, Taiwan (China); College of Chemical Engineering and Materials Science, Zhejiang University of Technology, Hangzhou 310014, Zhejiang (China); Yang, Thomas Ching-Cherng [Department of Chemistry, National Kaohsiung Normal University, Kaohsiung 82444, Taiwan (China); Lee, Maw-Rong, E-mail: mrlee@dragon.nchu.edu.tw [Department of Chemistry, National Chung Hsing University, Taichung 40227, Taiwan (China)

    2012-01-27

    Highlights: ► The ionic liquid (IL) [C₄MIM][PF₆] was rapidly synthesized by microwave radiation. ► Trace chlorophenols in landfill leachate were extracted by IL-coated SPME. ► The IL-coated SPME-GC/MS method is low-cost, solvent-free and sensitive. - Abstract: A new microextraction technique based on ionic liquid solid-phase microextraction (IL-SPME) was developed for the determination of trace chlorophenols (CPs) in landfill leachate. The synthesized ionic liquid, 1-butyl-3-methylimidazolium hexafluorophosphate ([C₄MIM][PF₆]), was coated onto a spent SPME fiber for extraction of trace CPs. After extraction, the absorbed analytes were desorbed and quantified using gas chromatography-mass spectrometry (GC/MS). The proposed method is termed ionic liquid-coated solid-phase microextraction combined with gas chromatography-mass spectrometry (IL-SPME-GC/MS). No carryover effect was found, and every laboratory-made ionic liquid-coated fiber could be used for at least eighty extractions without degradation of efficiency. The chlorophenols studied were 2,4-dichlorophenol (2,4-DP), 2,4,6-trichlorophenol (2,4,6-TCP), 2,3,4,6-tetrachlorophenol (2,3,4,6-TeCP), and pentachlorophenol (PCP). The best results for chlorophenol analysis were obtained with landfill leachate at pH 2, headspace extraction for 4 min, and thermal desorption in the gas chromatograph injector at 240 °C for 4 min. Linearity was observed from 0.1 to 1000 μg L⁻¹ with relative standard deviations (RSD) less than 7%, and recoveries were over 87%. The limit of detection (LOD) for pentachlorophenol was 0.008 μg L⁻¹. The proposed method was tested by analyzing landfill leachate from a sewage farm. The concentrations of chlorophenols were found to range from 1.1 to 1.4 μg L⁻¹. The results demonstrate that the IL-SPME-GC/MS method is highly effective in

  11. A solid-phase microextraction-gas chromatographic approach combined with triple quadrupole mass spectrometry for the assay of carbamate pesticides in water samples.

    Science.gov (United States)

    Cavaliere, Brunella; Monteleone, Marcello; Naccarato, Attilio; Sindona, Giovanni; Tagarelli, Antonio

    2012-09-28

    A simple and sensitive method was developed for the quantification of five carbamate pesticides in water samples using solid phase microextraction (SPME) combined with gas chromatography-triple quadrupole mass spectrometry (GC-QqQ-MS). The performance of five SPME fibers was tested in univariate mode, whereas the other variables affecting the efficiency of SPME analysis were optimized by the multivariate approach of design of experiments (DoE); in particular, a central composite design (CCD) was applied. The optimum working conditions in terms of response values were achieved by performing analysis with a polydimethylsiloxane/divinylbenzene (PDMS/DVB) fiber in immersion mode for 45 min at room temperature with addition of NaCl (10%). The multivariate chemometric approach was also used to explore the chromatographic behavior of the carbamates and to evaluate the importance of each variable investigated. An overall appraisal of the results shows that the only factor with a statistically significant effect on the response was the injection temperature. Identification and quantification of the carbamates were performed using a gas chromatography-triple quadrupole mass spectrometry (GC-QqQ-MS) system in multiple reaction monitoring (MRM) acquisition. Since the choice of internal standard represented a crucial step in the development of the method to achieve good reproducibility and robustness for the entire analytical protocol, three compounds (2,3,5-trimethacarb, 4-bromo-3,5-dimethylphenyl-n-methylcarbamate (BDMC) and carbaryl-d7) were evaluated as internal standards. The accuracy and precision of the proposed protocol, tested at concentrations of 0.08, 5 and 3 μg l⁻¹, ranged from 70.8% to 115.7% (except for carbaryl at 3 μg l⁻¹) and from 1.0% to 9.0%, respectively. Moreover, LOD and LOQ values ranging from 0.04 to 1.7 ng l⁻¹ and from 0.64 to 2.9 ng l⁻¹, respectively, can be considered very satisfactory. Copyright

  12. Grid workflow job execution service 'Pilot'

    Science.gov (United States)

    Shamardin, Lev; Kryukov, Alexander; Demichev, Andrey; Ilyin, Vyacheslav

    2011-12-01

    'Pilot' is a grid job execution service for workflow jobs. The main goal for the service is to automate computations with multiple stages since they can be expressed as simple workflows. Each job is a directed acyclic graph of tasks and each task is an execution of something on a grid resource (or 'computing element'). Tasks may be submitted to any WS-GRAM (Globus Toolkit 4) service. The target resources for the tasks execution are selected by the Pilot service from the set of available resources which match the specific requirements from the task and/or job definition. Some simple conditional execution logic is also provided. The 'Pilot' service is built on the REST concepts and provides a simple API through authenticated HTTPS. This service is deployed and used in production in a Russian national grid project GridNNN.
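
    The abstract describes each job as a directed acyclic graph of tasks. The following minimal sketch (not the Pilot service itself) represents such a job as a dependency map and walks its tasks in topological order; the task names and the submit stub are purely illustrative.

```python
# Minimal sketch (not the Pilot service): a workflow job as a directed
# acyclic graph of tasks, executed in topological (dependency) order.
from graphlib import TopologicalSorter

# task name -> set of tasks it depends on (illustrative names)
job = {
    "stage_in":  set(),
    "simulate":  {"stage_in"},
    "analyse":   {"simulate"},
    "stage_out": {"analyse"},
}

def submit(task: str) -> None:
    # placeholder for submitting the task to a grid resource (e.g. a WS-GRAM service)
    print(f"submitting {task}")

for task in TopologicalSorter(job).static_order():
    submit(task)
```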

  13. Grid workflow job execution service 'Pilot'

    International Nuclear Information System (INIS)

    Shamardin, Lev; Kryukov, Alexander; Demichev, Andrey; Ilyin, Vyacheslav

    2011-01-01

    'Pilot' is a grid job execution service for workflow jobs. The main goal for the service is to automate computations with multiple stages since they can be expressed as simple workflows. Each job is a directed acyclic graph of tasks and each task is an execution of something on a grid resource (or 'computing element'). Tasks may be submitted to any WS-GRAM (Globus Toolkit 4) service. The target resources for the tasks execution are selected by the Pilot service from the set of available resources which match the specific requirements from the task and/or job definition. Some simple conditional execution logic is also provided. The 'Pilot' service is built on the REST concepts and provides a simple API through authenticated HTTPS. This service is deployed and used in production in a Russian national grid project GridNNN.

  14. Workflow optimization beyond RIS and PACS

    International Nuclear Information System (INIS)

    Treitl, M.; Wirth, S.; Lucke, A.; Nissen-Meyer, S.; Trumm, C.; Rieger, J.; Pfeifer, K.-J.; Reiser, M.; Villain, S.

    2005-01-01

    Technological progress and the rising cost pressure on the healthcare system have led to a drastic change in the work environment of radiologists today. The pervasive demand for workflow optimization and increased efficiency raises the question of whether, by employing electronic systems such as RIS and PACS, the potential of digital technology is sufficiently exploited to fulfil this demand. This report describes the tasks and structures in radiology departments which so far are only insufficiently supported by commercially available electronic systems but are nevertheless substantial. We developed and employed a web-based, integrated workplace system, which simplifies many daily tasks of departmental organization and administration apart from well-established tasks of documentation. Furthermore, we analyzed the effects exerted on departmental workflow by employment of this system for 3 years. (orig.) [de]

  15. Workflow of the Grover algorithm simulation incorporating CUDA and GPGPU

    Science.gov (United States)

    Lu, Xiangwen; Yuan, Jiabin; Zhang, Weiwei

    2013-09-01

    The Grover quantum search algorithm, one of only a few representative quantum algorithms, can speed up many classical algorithms that use search heuristics. No true quantum computer has yet been developed. For the present, simulation is one effective means of verifying the search algorithm. In this work, we focus on the simulation workflow using a compute unified device architecture (CUDA). Two simulation workflow schemes are proposed. These schemes combine the characteristics of the Grover algorithm and the parallelism of general-purpose computing on graphics processing units (GPGPU). We also analyzed the optimization of memory space and memory access from this perspective. We implemented four programs on CUDA to evaluate the performance of schemes and optimization. Through experimentation, we analyzed the organization of threads suited to Grover algorithm simulations, compared the storage costs of the four programs, and validated the effectiveness of optimization. Experimental results also showed that the distinguished program on CUDA outperformed the serial program of libquantum on a CPU with a speedup of up to 23 times (12 times on average), depending on the scale of the simulation.
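
    As a point of reference for the GPU schemes described above, the sketch below is a minimal serial state-vector simulation of Grover's search in plain numpy (CPU only); it illustrates the oracle and diffusion steps that the CUDA programs parallelise, and is not the authors' code.

```python
# Minimal serial state-vector simulation of Grover's search (numpy, CPU);
# the CUDA schemes in the paper parallelise exactly these two steps.
import numpy as np

def grover(n_qubits: int, marked: int) -> int:
    n = 2 ** n_qubits
    state = np.full(n, 1 / np.sqrt(n))            # uniform superposition
    iterations = int(np.floor(np.pi / 4 * np.sqrt(n)))
    for _ in range(iterations):
        state[marked] *= -1                        # oracle: flip marked amplitude
        state = 2 * state.mean() - state           # diffusion about the mean
    return int(np.argmax(np.abs(state) ** 2))      # most probable outcome

print(grover(10, marked=427))   # returns 427 with high probability
```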

  16. Designing Flexible E-Business Workflow Systems

    OpenAIRE

    Cătălin Silvestru; Codrin Nisioiu; Marinela Mircea; Bogdan Ghilic-Micu; Marian Stoica

    2010-01-01

    In today’s business environment, organizations must cope with complex interactions between actors, adapt fast to frequent market changes and be innovative. In this context, integrating knowledge with processes and Business Intelligence is a major step towards improving organizational agility. Therefore, traditional environments for workflow design have been adapted to answer the new business models and current requirements in the field of collaborative processes. This paper approaches the design o...

  17. Planning bioinformatics workflows using an expert system

    Science.gov (United States)

    Chen, Xiaoling; Chang, Jeffrey T.

    2017-01-01

    Abstract Motivation: Bioinformatic analyses are becoming formidably more complex due to the increasing number of steps required to process the data, as well as the proliferation of methods that can be used in each step. To alleviate this difficulty, pipelines are commonly employed. However, pipelines are typically implemented to automate a specific analysis, and thus are difficult to use for exploratory analyses requiring systematic changes to the software or parameters used. Results: To automate the development of pipelines, we have investigated expert systems. We created the Bioinformatics ExperT SYstem (BETSY) that includes a knowledge base where the capabilities of bioinformatics software are explicitly and formally encoded. BETSY is a backwards-chaining rule-based expert system comprising a data model that can capture the richness of biological data, and an inference engine that reasons on the knowledge base to produce workflows. Currently, the knowledge base is populated with rules to analyze microarray and next generation sequencing data. We evaluated BETSY and found that it could generate workflows that reproduce and go beyond previously published bioinformatics results. Finally, a meta-investigation of the workflows generated from the knowledge base produced a quantitative measure of the technical burden imposed by each step of bioinformatics analyses, revealing the large number of steps devoted to the pre-processing of data. In sum, an expert system approach can facilitate exploratory bioinformatic analysis by automating the development of workflows, a task that requires significant domain expertise. Availability and Implementation: https://github.com/jefftc/changlab Contact: jeffrey.t.chang@uth.tmc.edu PMID:28052928
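
    To make the backwards-chaining idea concrete, here is a toy sketch (not BETSY's actual rule language or knowledge base) in which each rule records which input data type a tool needs to produce an output type, and planning works backwards from the requested result to the data the user already has; all names are hypothetical.

```python
# Toy sketch of backward chaining over data-type rules (not BETSY's actual
# knowledge base): each rule says which input data type a tool consumes to
# produce an output type; plan() chains backwards from the goal to raw data.
RULES = {
    # output type: (tool, input type)  -- hypothetical names
    "aligned_reads":     ("aligner",         "fastq_reads"),
    "expression_matrix": ("quantifier",      "aligned_reads"),
    "de_genes":          ("diff_expression", "expression_matrix"),
}
AVAILABLE = {"fastq_reads"}          # data the user already has

def plan(goal: str) -> list[str]:
    """Return an ordered list of tool invocations that produces `goal`."""
    if goal in AVAILABLE:
        return []
    tool, needed = RULES[goal]
    return plan(needed) + [tool]

print(plan("de_genes"))   # ['aligner', 'quantifier', 'diff_expression']
```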

  18. SPECT/CT workflow and imaging protocols

    Energy Technology Data Exchange (ETDEWEB)

    Beckers, Catherine [University Hospital of Liege, Division of Nuclear Medicine and Oncological Imaging, Department of Medical Physics, Liege (Belgium); Hustinx, Roland [University Hospital of Liege, Division of Nuclear Medicine and Oncological Imaging, Department of Medical Physics, Liege (Belgium); Domaine Universitaire du Sart Tilman, Service de Medecine Nucleaire et Imagerie Oncologique, CHU de Liege, Liege (Belgium)

    2014-05-15

    Introducing a hybrid imaging method such as single photon emission computed tomography (SPECT)/CT greatly alters the routine in the nuclear medicine department. It requires designing new workflow processes and the revision of original scheduling process and imaging protocols. In addition, the imaging protocol should be adapted for each individual patient, so that performing CT is fully justified and the CT procedure is fully tailored to address the clinical issue. Such refinements often occur before the procedure is started but may be required at some intermediate stage of the procedure. Furthermore, SPECT/CT leads in many instances to a new partnership with the radiology department. This article presents practical advice and highlights the key clinical elements which need to be considered to help understand the workflow process of SPECT/CT and optimise imaging protocols. The workflow process using SPECT/CT is complex in particular because of its bimodal character, the large spectrum of stakeholders, the multiplicity of their activities at various time points and the need for real-time decision-making. With help from analytical tools developed for quality assessment, the workflow process using SPECT/CT may be separated into related, but independent steps, each with its specific human and material resources to use as inputs or outputs. This helps identify factors that could contribute to failure in routine clinical practice. At each step of the process, practical aspects to optimise imaging procedure and protocols are developed. A decision-making algorithm for justifying each CT indication as well as the appropriateness of each CT protocol is the cornerstone of routine clinical practice using SPECT/CT. In conclusion, implementing hybrid SPECT/CT imaging requires new ways of working. It is highly rewarding from a clinical perspective, but it also proves to be a daily challenge in terms of management. (orig.)

  19. IDD Archival Hardware Architecture and Workflow

    Energy Technology Data Exchange (ETDEWEB)

    Mendonsa, D; Nekoogar, F; Martz, H

    2008-10-09

    This document describes the functionality of every component in the DHS/IDD archival and storage hardware system shown in Fig. 1. The document describes the step-by-step process of image data being received at LLNL, then being processed and made available to authorized personnel and collaborators. Throughout this document, references will be made to one of two figures: Fig. 1, describing the elements of the architecture, and Fig. 2, describing the workflow and how the project utilizes the available hardware.

  20. SPECT/CT workflow and imaging protocols

    International Nuclear Information System (INIS)

    Beckers, Catherine; Hustinx, Roland

    2014-01-01

    Introducing a hybrid imaging method such as single photon emission computed tomography (SPECT)/CT greatly alters the routine in the nuclear medicine department. It requires designing new workflow processes and the revision of original scheduling process and imaging protocols. In addition, the imaging protocol should be adapted for each individual patient, so that performing CT is fully justified and the CT procedure is fully tailored to address the clinical issue. Such refinements often occur before the procedure is started but may be required at some intermediate stage of the procedure. Furthermore, SPECT/CT leads in many instances to a new partnership with the radiology department. This article presents practical advice and highlights the key clinical elements which need to be considered to help understand the workflow process of SPECT/CT and optimise imaging protocols. The workflow process using SPECT/CT is complex in particular because of its bimodal character, the large spectrum of stakeholders, the multiplicity of their activities at various time points and the need for real-time decision-making. With help from analytical tools developed for quality assessment, the workflow process using SPECT/CT may be separated into related, but independent steps, each with its specific human and material resources to use as inputs or outputs. This helps identify factors that could contribute to failure in routine clinical practice. At each step of the process, practical aspects to optimise imaging procedure and protocols are developed. A decision-making algorithm for justifying each CT indication as well as the appropriateness of each CT protocol is the cornerstone of routine clinical practice using SPECT/CT. In conclusion, implementing hybrid SPECT/CT imaging requires new ways of working. It is highly rewarding from a clinical perspective, but it also proves to be a daily challenge in terms of management. (orig.)

  1. BioInfra.Prot: A comprehensive proteomics workflow including data standardization, protein inference, expression analysis and data publication.

    Science.gov (United States)

    Turewicz, Michael; Kohl, Michael; Ahrens, Maike; Mayer, Gerhard; Uszkoreit, Julian; Naboulsi, Wael; Bracht, Thilo; Megger, Dominik A; Sitek, Barbara; Marcus, Katrin; Eisenacher, Martin

    2017-11-10

    The analysis of high-throughput mass spectrometry-based proteomics data must address the specific challenges of this technology. To this end, the comprehensive proteomics workflow offered by the de.NBI service center BioInfra.Prot provides indispensable components for the computational and statistical analysis of this kind of data. These components include tools and methods for spectrum identification and protein inference, protein quantification, expression analysis, as well as data standardization and data publication. All of the workflow's methods that address these tasks are state-of-the-art or cutting edge. As has been shown in previous publications, each of these methods is adequate to solve its specific task and gives competitive results. However, the methods included in the workflow are continuously reviewed, updated and improved to adapt to new scientific developments. All of these components and methods are available as stand-alone BioInfra.Prot services or as a complete workflow. Since BioInfra.Prot provides manifold fast communication channels to access all components of the workflow (e.g., via the BioInfra.Prot ticket system: bioinfraprot@rub.de), users can easily benefit from this service and get support from experts. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  2. Schedule-Aware Workflow Management Systems

    Science.gov (United States)

    Mans, Ronny S.; Russell, Nick C.; van der Aalst, Wil M. P.; Moleman, Arnold J.; Bakker, Piet J. M.

    Contemporary workflow management systems offer work-items to users through specific work-lists. Users select the work-items they will perform without having a specific schedule in mind. However, in many environments work needs to be scheduled and performed at particular times. For example, in hospitals many work-items are linked to appointments, e.g., a doctor cannot perform surgery without reserving an operating theater and making sure that the patient is present. One of the problems when applying workflow technology in such domains is the lack of calendar-based scheduling support. In this paper, we present an approach that supports the seamless integration of unscheduled (flow) and scheduled (schedule) tasks. Using CPN Tools we have developed a specification and simulation model for schedule-aware workflow management systems. Based on this a system has been realized that uses YAWL, Microsoft Exchange Server 2007, Outlook, and a dedicated scheduling service. The approach is illustrated using a real-life case study at the AMC hospital in the Netherlands. In addition, we elaborate on the experiences obtained when developing and implementing a system of this scale using formal techniques.
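
    The flow/schedule distinction described above can be illustrated with a small data-model sketch: work items carry an optional appointment, and the work-list only offers a scheduled item once its planned time has been reached. This is illustrative only, not the YAWL/Exchange implementation discussed in the paper, and all names are hypothetical.

```python
# Illustrative sketch (not the YAWL/Exchange implementation): work items may
# carry an optional appointment; the work-list only offers a scheduled item
# once its planned start time has been reached.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class WorkItem:
    task: str
    appointment: Optional[datetime] = None   # None -> unscheduled "flow" task

def offerable(items: list[WorkItem], now: datetime) -> list[WorkItem]:
    return [w for w in items if w.appointment is None or w.appointment <= now]

items = [
    WorkItem("review lab results"),
    WorkItem("surgery", appointment=datetime(2024, 5, 6, 9, 0)),
]
print([w.task for w in offerable(items, datetime(2024, 5, 6, 8, 0))])
# ['review lab results']  (the surgery item is only offered from 09:00)
```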

  3. Routine digital pathology workflow: The Catania experience

    Directory of Open Access Journals (Sweden)

    Filippo Fraggetta

    2017-01-01

    Full Text Available Introduction: Successful implementation of whole slide imaging (WSI) for routine clinical practice has been accomplished in only a few pathology laboratories worldwide. We report the transition to an effective and complete digital surgical pathology workflow in the pathology laboratory at Cannizzaro Hospital in Catania, Italy. Methods: All (100%) permanent histopathology glass slides were digitized at ×20 using Aperio AT2 scanners. Compatible stain and scanning slide racks were employed to streamline operations. eSlide Manager software was bidirectionally interfaced with the anatomic pathology laboratory information system. Virtual slide trays connected to the two-dimensional (2D) barcode tracking system allowed pathologists to confirm that they were correctly assigned slides and that all tissues on these glass slides were scanned. Results: Over 115,000 glass slides were digitized with a scan fail rate of around 1%. Drying glass slides before scanning minimized their sticking to scanner racks. Implementation required introduction of a 2D barcode tracking system and modification of histology workflow processes. Conclusion: Our experience indicates that effective adoption of WSI for primary diagnostic use was more dependent on optimizing preimaging variables and integration with the laboratory information system than on information technology infrastructure and ensuring pathologist buy-in. Implementation of digital pathology for routine practice not only leveraged the benefits of digital imaging but also created an opportunity for establishing standardization of workflow processes in the pathology laboratory.

  4. Evaluation of Workflow Management Systems - A Meta Model Approach

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    1998-11-01

    Full Text Available The automated enactment of processes through the use of workflow management systems enables the outsourcing of the control flow from application systems. By now, a large number of systems that follow different workflow paradigms are available. This leads to the problem of selecting the appropriate workflow management system for a given situation. In this paper we outline the benefits of a meta model approach for the evaluation and comparison of different workflow management systems. After a general introduction to the topic of meta modeling, the meta models of the workflow management systems WorkParty (Siemens Nixdorf) and FlowMark (IBM) are compared as an example. These product-specific meta models can be generalized to meta reference models, which help to specify a workflow methodology. As an example, an organisational reference meta model is presented, which helps users specify their requirements for a workflow management system.

  5. Analysis of plutonium isotope ratios including 238Pu/239Pu in individual U-Pu mixed oxide particles by means of a combination of alpha spectrometry and ICP-MS.

    Science.gov (United States)

    Esaka, Fumitaka; Yasuda, Kenichiro; Suzuki, Daisuke; Miyamoto, Yutaka; Magara, Masaaki

    2017-04-01

    Isotope ratio analysis of individual uranium-plutonium (U-Pu) mixed oxide particles contained within environmental samples taken from nuclear facilities is proving to be increasingly important in the field of nuclear safeguards. However, isobaric interferences, such as 238U with 238Pu and 241Am with 241Pu, make it difficult to determine plutonium isotope ratios in mass spectrometric measurements. In the present study, the isotope ratios of 238Pu/239Pu, 240Pu/239Pu, 241Pu/239Pu, and 242Pu/239Pu were measured for individual Pu and U-Pu mixed oxide particles by a combination of alpha spectrometry and inductively coupled plasma mass spectrometry (ICP-MS). As a consequence, we were able to determine the 240Pu/239Pu, 241Pu/239Pu, and 242Pu/239Pu isotope ratios with ICP-MS after particle dissolution and chemical separation of plutonium with UTEVA resins. Furthermore, 238Pu/239Pu isotope ratios were able to be calculated by using both the 238Pu/(239Pu+240Pu) activity ratios that had been measured through alpha spectrometry and the 240Pu/239Pu isotope ratios determined through ICP-MS. Therefore, the combined use of alpha spectrometry and ICP-MS is useful in determining plutonium isotope ratios, including 238Pu/239Pu, in individual U-Pu mixed oxide particles. Copyright © 2016 Elsevier B.V. All rights reserved.
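
    The combination described above amounts to converting the alpha-spectrometric 238Pu/(239Pu+240Pu) activity ratio into a 238Pu/239Pu atom ratio with the help of the ICP-MS 240Pu/239Pu atom ratio. The sketch below shows that conversion using approximate literature half-lives; the half-life values and the example inputs are assumptions for illustration, not data from the paper.

```python
# Sketch of the activity-to-atom-ratio conversion implied above; half-lives
# are approximate literature values and the example inputs are hypothetical,
# not data from the paper.
import math

T_HALF_YEARS = {"Pu238": 87.7, "Pu239": 24_110.0, "Pu240": 6_561.0}   # approx.
LAMBDA = {k: math.log(2) / t for k, t in T_HALF_YEARS.items()}        # per year

def pu238_pu239_atom_ratio(activity_ratio_238_to_239_240: float,
                           atom_ratio_240_to_239: float) -> float:
    """Combine alpha spectrometry (activity ratio) with ICP-MS (240Pu/239Pu)."""
    # A238 / (A239 + A240) = lam238*N238 / (lam239*N239 + lam240*N240)
    # => N238/N239 = R_act * (lam239 + lam240 * N240/N239) / lam238
    return (activity_ratio_238_to_239_240
            * (LAMBDA["Pu239"] + LAMBDA["Pu240"] * atom_ratio_240_to_239)
            / LAMBDA["Pu238"])

print(pu238_pu239_atom_ratio(0.5, 0.18))   # hypothetical example inputs
```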

  6. Salting-out assisted liquid-liquid extraction combined with gas chromatography-mass spectrometry for the determination of pyrethroid insecticides in high salinity and biological samples.

    Science.gov (United States)

    Niu, Zongliang; Yu, Chunwei; He, Xiaowen; Zhang, Jun; Wen, Yingying

    2017-09-05

    A salting-out assisted liquid-liquid extraction (SALLE) combined with gas chromatography-mass spectrometry (GC-MS) method was developed for the determination of four pyrethroid insecticides (PYRs) in high salinity and biological samples. Several parameters influencing the extraction efficiency, including sample pH, salting-out solution volume and salting-out solution pH, were systematically investigated with the aid of an orthogonal design. The optimal extraction conditions of SALLE were 4 mL of salting-out solution at pH 4 and a sample pH of 3. Under the optimum extraction and determination conditions, good responses for the four PYRs were obtained in the range of 5-5000 ng/mL, with linear coefficients greater than 0.998. The recoveries of the four PYRs ranged from 74% to 110%, with standard deviations ranging from 1.8% to 9.8%. The limits of detection, based on a signal-to-noise ratio of 3, were between 1.5 and 60.6 ng/mL. The method was applied to the determination of PYRs in urine, seawater and wastewater samples with satisfactory results. The results demonstrated that this SALLE-GC-MS method can be successfully applied to determine PYRs in high salinity and biological samples. SALLE avoided the need for the elimination of salinity and protein in the sample matrix, as well as clean-up of the extractant. Most of all, no centrifugation or any special apparatus is required, making this a promising method for rapid sample preparation. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Accurate quantification of polycyclic aromatic hydrocarbons in dust samples using microwave-assisted solvent extraction combined with isotope-dilution mass spectrometry

    International Nuclear Information System (INIS)

    Itoh, Nobuyasu; Fushimi, Akihiro; Yarita, Takashi; Aoyagi, Yoshie; Numata, Masahiko

    2011-01-01

    Highlights: → We applied MAE-IDMS for accurate quantification of PAHs in dust samples. → Both partitioning and isotopic equilibria can be achieved using this technique. → MAE-IDMS can provide accurate concentrations even if extraction efficiencies are low. → Characteristics of the samples strongly affected the low extraction efficiencies of PAHs. - Abstract: For accurate quantification of polycyclic aromatic hydrocarbons (PAHs) in dust samples, we investigated the use of microwave-assisted solvent extraction (MAE) combined with isotope-dilution mass spectrometry (IDMS) using deuterium-labelled PAHs (D-PAHs). Although MAE with a methanol/toluene mixture (1:3 by volume) at 160 °C for 40 min was best among those examined for extracting PAHs from tunnel dust, the recovery yields of D-PAHs decreased with increasing molecular weight (<40% for MW ≥ 264; that of deuterium-labelled indeno[123-cd]pyrene (D-IcdP) was only 7.1%). Although the residues were extracted a second time, the observed concentrations did not change dramatically (<5%), and the recovery yields of heavier D-PAHs (i.e., MW ≥ 264) were approximately half of those of the first extract, including D-IcdP (3.4%). These results suggest that both partitioning and isotopic equilibria of PAHs and D-PAHs between sample and solvent were achieved for the extractable heavier PAHs under these conditions. Thus, the observed concentrations of PAHs obtained by MAE-IDMS were reasonable, even though recovery yields of D-PAHs were <50%. From the results of carbon analyses and extractable contents, the lower recovery yields of D-PAHs from the tunnel dust were attributed to a large content of char with low extractable content.
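
    The robustness of the MAE-IDMS concentrations despite recoveries below 50% follows from the isotope-dilution principle: the analyte and its deuterium-labelled analog are lost in the same proportion, so their signal ratio, and hence the computed concentration, is unaffected by the extraction yield. A toy numerical sketch of this reasoning follows (all values invented).

```python
# Toy numerical sketch of the isotope-dilution principle (values invented):
# the native PAH and its deuterium-labelled spike are lost in the same
# proportion, so the measured ratio, and the concentration derived from it,
# is independent of the extraction recovery.
true_analyte_ng = 100.0    # amount of native PAH in the sample
spike_ng        = 80.0     # known amount of D-labelled PAH added

for recovery in (1.0, 0.5, 0.07):          # 100%, 50%, 7% extraction yield
    analyte_signal = true_analyte_ng * recovery
    spike_signal   = spike_ng * recovery
    estimated = (analyte_signal / spike_signal) * spike_ng
    print(f"recovery {recovery:>4.0%}: estimated {estimated:.1f} ng")
# every recovery level returns the same 100.0 ng estimate
```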

  8. Room temperature ionic liquids enhanced the speciation of Cr(VI) and Cr(III) by hollow fiber liquid phase microextraction combined with flame atomic absorption spectrometry

    International Nuclear Information System (INIS)

    Zeng, Chujie; Lin, Yao; Zhou, Neng; Zheng, Jiaoting; Zhang, Wei

    2012-01-01

    Highlights: ► First reported enhancement effect of RTILs in HF-LPME for the speciation of chromium. ► The addition of RTILs led to a 3.5-fold improvement in the sensitivity for Cr(VI). ► The proposed method is simple, sensitive, low-cost and green. - Abstract: A new method for the speciation of Cr(VI) and Cr(III), based on the enhancement effect of room temperature ionic liquids (RTILs) in hollow fiber liquid phase microextraction (HF-LPME) combined with flame atomic absorption spectrometry (FAAS), was developed. Room temperature ionic liquids (RTILs) and diethyldithiocarbamate (DDTC) were used as enhancement reagent and chelating reagent, respectively. The addition of room temperature ionic liquids led to a 3.5-fold improvement in the determination of Cr(VI). In this method, Cr(VI) reacts with DDTC to yield a hydrophobic complex, which is subsequently extracted into the lumen of the hollow fiber, whereas Cr(III) remains in the aqueous solution. The organic extraction phase was injected into the FAAS for the determination of Cr(VI). The total Cr concentration was determined after oxidizing Cr(III) to Cr(VI) in the presence of KMnO₄ and using the extraction procedure mentioned above. Cr(III) was calculated by subtracting Cr(VI) from the total Cr. Under optimized conditions, a detection limit of 0.7 ng mL⁻¹ and an enrichment factor of 175 were achieved. The relative standard deviation (RSD) was 4.9% for Cr(VI) (40 ng mL⁻¹, n = 5). The proposed method was successfully applied to the speciation of chromium in natural water samples with satisfactory results.

  9. N-glycoproteome analysis of the secretome of human metastatic hepatocellular carcinoma cell lines combining hydrazide chemistry, HILIC enrichment and mass spectrometry.

    Directory of Open Access Journals (Sweden)

    Xianyu Li

    Full Text Available Cancer cell metastasis is a major cause of cancer death. Unfortunately, the underlying molecular mechanisms remain unknown, which results in the lack of efficient diagnosis, therapy and prevention approaches. Nevertheless, the dysregulation of the cancer cell secretome is known to play key roles in tumor transformation and progression. The majority of proteins in the secretome are secretory proteins and membrane-released proteins, and, mostly, glycosylated proteins. Until recently, few studies have explored protein N-glycosylation changes in the secretome, although protein glycosylation has received increasing attention in the study of tumor development processes. Here, the N-glycoproteins in the secretome of two human hepatocellular carcinoma (HCC) cell lines with low (MHCC97L) or high (HCCLM3) metastatic potential were investigated with an in-depth characterization of the N-glycosites by combining two general glycopeptide enrichment approaches, hydrazide chemistry and zwitterionic hydrophilic interaction chromatography (zic-HILIC), with mass spectrometry analysis. A total of 1,213 unique N-glycosites from 611 N-glycoproteins were confidently identified. These N-glycoproteins were primarily localized to the extracellular space and plasma membrane, supporting the important role of N-glycosylation in the secretory pathway. Coupling label-free quantification with a hierarchical clustering strategy, we determined the differential regulation of several N-glycoproteins that are related to metastasis, among which AFP, DKK1, FN1, CD151 and TGFβ2 were up-regulated in HCCLM3 cells. The inclusion of the well-known metastasis-related proteins AFP and DKK1 in this list provides solid support for our study. Further western blotting experiments detecting FN1 and FAT1 confirmed our discovery. The glycoproteome strategy in this study provides an effective means to explore potential cancer biomarkers.
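
    The label-free quantification results above were grouped with a hierarchical clustering strategy. As an illustration of that kind of step, the minimal sketch below clusters a glycoprotein quantification matrix with scipy; the matrix is random placeholder data, and this is not the authors' pipeline.

```python
# Minimal sketch of hierarchical clustering of a label-free quantification
# matrix (rows = N-glycoproteins, columns = replicate runs); the data here
# are random placeholders, and this is not the authors' pipeline.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(1)
quant = rng.lognormal(mean=10, sigma=1, size=(50, 6))     # fake intensities

log_quant = np.log2(quant)                                # stabilise variance
tree = linkage(log_quant, method="average", metric="correlation")
clusters = fcluster(tree, t=4, criterion="maxclust")      # cut into 4 groups
print(np.bincount(clusters)[1:])                          # proteins per cluster
```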

  10. Room temperature ionic liquids enhanced the speciation of Cr(VI) and Cr(III) by hollow fiber liquid phase microextraction combined with flame atomic absorption spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Zeng, Chujie, E-mail: cjzeng@126.com [Department of Chemistry and Material, Yulin Normal College, Yulin, Guangxi 537000 (China); Lin, Yao; Zhou, Neng; Zheng, Jiaoting; Zhang, Wei [Department of Chemistry and Material, Yulin Normal College, Yulin, Guangxi 537000 (China)

    2012-10-30

    Highlights: ► First reported enhancement effect of RTILs in HF-LPME for the speciation of chromium. ► The addition of RTILs led to a 3.5-fold improvement in the sensitivity for Cr(VI). ► The proposed method is simple, sensitive, low-cost and green. - Abstract: A new method for the speciation of Cr(VI) and Cr(III), based on the enhancement effect of room temperature ionic liquids (RTILs) in hollow fiber liquid phase microextraction (HF-LPME) combined with flame atomic absorption spectrometry (FAAS), was developed. Room temperature ionic liquids (RTILs) and diethyldithiocarbamate (DDTC) were used as enhancement reagent and chelating reagent, respectively. The addition of room temperature ionic liquids led to a 3.5-fold improvement in the determination of Cr(VI). In this method, Cr(VI) reacts with DDTC to yield a hydrophobic complex, which is subsequently extracted into the lumen of the hollow fiber, whereas Cr(III) remains in the aqueous solution. The organic extraction phase was injected into the FAAS for the determination of Cr(VI). The total Cr concentration was determined after oxidizing Cr(III) to Cr(VI) in the presence of KMnO₄ and using the extraction procedure mentioned above. Cr(III) was calculated by subtracting Cr(VI) from the total Cr. Under optimized conditions, a detection limit of 0.7 ng mL⁻¹ and an enrichment factor of 175 were achieved. The relative standard deviation (RSD) was 4.9% for Cr(VI) (40 ng mL⁻¹, n = 5). The proposed method was successfully applied to the speciation of chromium in natural water samples with satisfactory results.

  11. Ion-exchange solid-phase extraction combined with liquid chromatography-tandem mass spectrometry for the determination of veterinary drugs in organic fertilizers.

    Science.gov (United States)

    Zhao, Zhiyong; Zhang, Yanmei; Xuan, Yanfang; Song, Wei; Si, Wenshuai; Zhao, Zhihui; Rao, Qinxiong

    2016-06-01

    The analysis of veterinary drugs in organic fertilizers is crucial for an assessment of potential risks to soil microbial communities and human health. We developed a robust and sensitive method to quantitatively determine 19 veterinary drugs (amantadine, sulfonamides and fluoroquinolones) in organic fertilizers. The method involved a simple solid-liquid extraction step using a combination of acetonitrile and McIlvaine buffer as extraction solvent, followed by cleanup with a solid-phase extraction cartridge containing polymeric mixed-mode anion-exchange sorbents. Ultra-high performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) was used to separate and detect the target analytes. We particularly focused on the optimization of the sample clean-up step: different diluents and dilution factors were tested. The developed method was validated in terms of linearity, recovery, precision, sensitivity and specificity. The recoveries of all the drugs ranged from 70.9% to 112.7% at three concentration levels, with intra-day and inter-day relative standard deviations lower than 15.7%. The limits of quantification were between 1.0 and 10.0 μg/kg for all the drugs. Matrix effects were minimized by matrix-matched calibration curves. The analytical method was successfully applied to a survey of veterinary drug contamination in 20 compost samples. The results indicated that fluoroquinolones had a higher incidence rate and higher mean concentrations (ranging from 31.9 to 308.7 μg/kg) compared with the other drugs. We expect the method will provide the basis for risk assessment of veterinary drugs in organic fertilizers. Copyright © 2016 Elsevier B.V. All rights reserved.

  12. Glyoxal and methylglyoxal as urinary markers of diabetes. Determination using a dispersive liquid-liquid microextraction procedure combined with gas chromatography-mass spectrometry.

    Science.gov (United States)

    Pastor-Belda, M; Fernández-García, A J; Campillo, N; Pérez-Cárceles, M D; Motas, M; Hernández-Córdoba, M; Viñas, P

    2017-08-04

    Glyoxal (GO) and methylglyoxal (MGO) are α-oxoaldehydes that can be used as urinary diabetes markers. In this study, their levels were measured using a sample preparation procedure based on salting-out assisted liquid-liquid extraction (SALLE) and dispersive liquid-liquid microextraction (DLLME) combined with gas chromatography-mass spectrometry (GC-MS). The derivatization reaction with 2,3-diaminonaphthalene, the addition of acetonitrile and sodium chloride to urine, and the DLLME step using the acetonitrile extract as dispersant solvent and carbon tetrachloride as extractant solvent were carefully optimized. Quantification was performed by the internal standard method, using 5-bromo-2-chloroanisole. The intraday and interday precisions were lower than 6%. Limits of detection were 0.12 and 0.06 ng mL⁻¹, and enrichment factors were 140 and 130 for GO and MGO, respectively. The concentrations of these α-oxoaldehydes in urine were between 0.9 and 35.8 ng g⁻¹ (creatinine adjusted). A statistical comparison of the analyte contents of urine samples from non-diabetic and diabetic patients pointed to significant differences (P=0.046, 24 subjects investigated), particularly regarding MGO, which was higher in diabetic patients. The novelty of this study compared with previous procedures lies in the treatment of the urine sample by SALLE based on the addition of acetonitrile and sodium chloride to the urine. The DLLME procedure is performed with a sedimented drop of the extractant solvent, without a surfactant reagent, and using acetonitrile as dispersant solvent. Separation of the analytes was performed using GC-MS detection, allowing the analytes to be unequivocally identified. The proposed procedure is the first microextraction method applied to the analysis of urine samples from diabetic and non-diabetic patients that allows a clear differentiation between both groups using a simple analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
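
    The reported P = 0.046 comes from a statistical comparison of urinary levels between the diabetic and non-diabetic groups; the abstract does not state which test was used, so the sketch below simply illustrates one common non-parametric choice (a Mann-Whitney U test) on invented creatinine-adjusted MGO values.

```python
# Illustrative only: the abstract does not name the test behind P = 0.046,
# so this sketch uses a common non-parametric choice (Mann-Whitney U) on
# invented creatinine-adjusted MGO values (ng/g).
from scipy.stats import mannwhitneyu

mgo_non_diabetic = [0.9, 1.4, 2.1, 3.0, 2.5, 1.8, 2.2, 1.1, 2.8, 1.6, 2.0, 1.3]
mgo_diabetic     = [2.4, 5.1, 8.9, 3.7, 12.2, 6.4, 4.8, 9.5, 7.1, 3.3, 15.0, 5.6]

stat, p = mannwhitneyu(mgo_diabetic, mgo_non_diabetic, alternative="greater")
print(f"U = {stat}, p = {p:.3f}")
```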

  13. From Requirements via Colored Workflow Nets to an Implementation in Several Workflow Systems

    DEFF Research Database (Denmark)

    Mans, Ronny S.; van der Aalst, Willibrordus Martinus Pancratius; Molemann, A.J.

    2007-01-01

    Care organizations, such as hospitals, need to support complex and dynamic workflows. Moreover, many disciplines are involved. This makes it important to avoid the typical disconnect between requirements and the actual implementation of the system. This paper proposes an approach where an Executable Use Case (EUC) and Colored Workflow Nets are used to move from requirements to an implementation in several workflow systems.

  14. Tavaxy: integrating Taverna and Galaxy workflows with cloud computing support.

    Science.gov (United States)

    Abouelhoda, Mohamed; Issa, Shadi Alaa; Ghanem, Moustafa

    2012-05-04

    Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-) workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and complexity of analysis. The system can be accessed either through a

  15. Tavaxy: Integrating Taverna and Galaxy workflows with cloud computing support

    Directory of Open Access Journals (Sweden)

    Abouelhoda Mohamed

    2012-05-01

    Full Text Available Abstract Background Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. Results In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Conclusions Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-)workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and

  16. Tavaxy: Integrating Taverna and Galaxy workflows with cloud computing support

    Science.gov (United States)

    2012-01-01

    Background Over the past decade the workflow system paradigm has evolved as an efficient and user-friendly approach for developing complex bioinformatics applications. Two popular workflow systems that have gained acceptance by the bioinformatics community are Taverna and Galaxy. Each system has a large user-base and supports an ever-growing repository of application workflows. However, workflows developed for one system cannot be imported and executed easily on the other. The lack of interoperability is due to differences in the models of computation, workflow languages, and architectures of both systems. This lack of interoperability limits sharing of workflows between the user communities and leads to duplication of development efforts. Results In this paper, we present Tavaxy, a stand-alone system for creating and executing workflows based on using an extensible set of re-usable workflow patterns. Tavaxy offers a set of new features that simplify and enhance the development of sequence analysis applications: It allows the integration of existing Taverna and Galaxy workflows in a single environment, and supports the use of cloud computing capabilities. The integration of existing Taverna and Galaxy workflows is supported seamlessly at both run-time and design-time levels, based on the concepts of hierarchical workflows and workflow patterns. The use of cloud computing in Tavaxy is flexible, where the users can either instantiate the whole system on the cloud, or delegate the execution of certain sub-workflows to the cloud infrastructure. Conclusions Tavaxy reduces the workflow development cycle by introducing the use of workflow patterns to simplify workflow creation. It enables the re-use and integration of existing (sub-) workflows from Taverna and Galaxy, and allows the creation of hybrid workflows. Its additional features exploit recent advances in high performance cloud computing to cope with the increasing data size and complexity of analysis. The system
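
    The pattern-based, hierarchical composition described in the Tavaxy records above can be made concrete with a small sketch. This is not Tavaxy's actual API; it is a hypothetical Python model in which a workflow is assembled from sequence and parallel patterns and a nested sub-workflow can be flagged for delegation to cloud resources.

    ```python
    # Hypothetical sketch of pattern-based, hierarchical workflow composition.
    # Class and task names are illustrative and are not Tavaxy's API.
    from concurrent.futures import ThreadPoolExecutor

    class Task:
        def __init__(self, name, func):
            self.name, self.func = name, func
        def run(self, data):
            return self.func(data)

    class Sequence:
        """Sequence pattern: the output of each step feeds the next one."""
        def __init__(self, *steps):
            self.steps = steps
        def run(self, data):
            for step in self.steps:
                data = step.run(data)
            return data

    class Parallel:
        """Parallel split/join pattern: all branches receive the same input."""
        def __init__(self, *branches):
            self.branches = branches
        def run(self, data):
            with ThreadPoolExecutor() as pool:
                futures = [pool.submit(b.run, data) for b in self.branches]
                return [f.result() for f in futures]

    class SubWorkflow:
        """A nested (hierarchical) workflow; on_cloud marks it for remote execution."""
        def __init__(self, inner, on_cloud=False):
            self.inner, self.on_cloud = inner, on_cloud
        def run(self, data):
            # A real system would dispatch to cloud infrastructure when on_cloud is True.
            return self.inner.run(data)

    # A hybrid workflow mixing a local step and a "cloud" sub-workflow.
    clean = Task("clean", lambda seqs: [s.strip() for s in seqs])
    align = SubWorkflow(Task("align", lambda seqs: sorted(seqs)), on_cloud=True)
    count = Task("count", lambda seqs: len(seqs))
    print(Sequence(clean, align, count).run([" acgt ", "ttga "]))  # -> 2
    ```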

  17. Mass Spectrometry-Based Biomarker Discovery.

    Science.gov (United States)

    Zhou, Weidong; Petricoin, Emanuel F; Longo, Caterina

    2017-01-01

    The discovery of candidate biomarkers within the entire proteome is one of the most important and challenging goals in proteomic research. Mass spectrometry-based proteomics is a modern and promising technology for semiquantitative and qualitative assessment of proteins, enabling protein sequencing and identification with exquisite accuracy and sensitivity. For mass spectrometry analysis, protein extraction from tissues or body fluids and subsequent protein fractionation are important and unavoidable steps in the workflow for biomarker discovery. Following extraction of proteins, the protein mixture must be digested, reduced, alkylated, and cleaned up prior to mass spectrometry. The aim of our chapter is to provide comprehensible and practical lab procedures for sample digestion, protein fractionation, and subsequent mass spectrometry analysis.

  18. SU-E-T-419: Workflow and FMEA in a New Proton Therapy (PT) Facility

    International Nuclear Information System (INIS)

    Cheng, C; Wessels, B; Hamilton, H; Difranco, T; Mansur, D

    2014-01-01

    Purpose: Workflow is an important component in the operational planning of a new proton facility. By integrating the concept of failure mode and effect analysis (FMEA) and traditional QA requirements, a workflow for a proton therapy treatment course is set up. This workflow serves as the blueprint for the planning of computer hardware/software requirements and network flow. A slight modification of the workflow generates a process map (PM) for FMEA and the planning of the QA program in PT. Methods: A flowchart is first developed outlining the sequence of processes involved in a PT treatment course. Each process consists of a number of sub-processes to encompass a broad scope of treatment and QA procedures. For each sub-process, the personnel involved, the equipment needed and the computer hardware/software as well as network requirements are defined by a team of clinical staff, administrators and IT personnel. Results: Eleven intermediate processes with a total of 70 sub-processes involved in a PT treatment course are identified. The number of sub-processes varies, ranging from 2-12. The sub-processes within each process are used for the operational planning. For example, in the CT-Sim process, there are 12 sub-processes: three involve data entry/retrieval from a record-and-verify system, two are controlled by the CT computer, two require the department/hospital network, and the other five are setup procedures. IT then decides the number of computers needed and the software and network requirements. By removing the traditional QA procedures from the workflow, a PM is generated for FMEA analysis to design a QA program for PT. Conclusion: Significant effort is involved in the development of the workflow for a PT treatment course. Our hybrid model of combining FMEA and a traditional QA program serves the dual purpose of efficient operational planning and design of a QA program in PT
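
    To illustrate how an FMEA layered on such a process map is typically scored, the sketch below computes risk priority numbers (RPN = occurrence x severity x detectability) for a few failure modes; the sub-processes, failure modes and scores are invented for illustration and are not taken from this abstract.

    ```python
    # Minimal FMEA scoring sketch; all values below are hypothetical.
    # RPN = occurrence (O) x severity (S) x lack of detectability (D); higher means riskier.
    failure_modes = [
        # (sub-process, failure mode, O, S, D) on 1-10 scales
        ("CT-Sim data entry", "wrong patient record retrieved", 2, 9, 4),
        ("CT-Sim setup", "incorrect immobilization device", 4, 6, 3),
        ("Network transfer", "treatment plan not synchronized", 3, 7, 6),
    ]

    scored = sorted(
        ((proc, mode, o * s * d) for proc, mode, o, s, d in failure_modes),
        key=lambda row: row[2],
        reverse=True,
    )
    for proc, mode, rpn in scored:
        print(f"RPN {rpn:4d}  {proc}: {mode}")
    # The highest-RPN items are the ones a QA program would target first.
    ```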

  19. Provenance for Runtime Workflow Steering and Validation in Computational Seismology

    Science.gov (United States)

    Spinuso, A.; Krischer, L.; Krause, A.; Filgueira, R.; Magnoni, F.; Muraleedharan, V.; David, M.

    2014-12-01

    Provenance systems may be offered by modern workflow engines to collect metadata about the data transformations at runtime. If combined with effective visualisation and monitoring interfaces, these provenance recordings can speed up the validation process of an experiment, suggesting interactive or automated interventions with immediate effects on the lifecycle of a workflow run. For instance, in the field of computational seismology, if we consider research applications performing long lasting cross correlation analysis and high resolution simulations, the immediate notification of logical errors and the rapid access to intermediate results, can produce reactions which foster a more efficient progress of the research. These applications are often executed in secured and sophisticated HPC and HTC infrastructures, highlighting the need for a comprehensive framework that facilitates the extraction of fine grained provenance and the development of provenance aware components, leveraging the scalability characteristics of the adopted workflow engines, whose enactment can be mapped to different technologies (MPI, Storm clusters, etc). This work looks at the adoption of W3C-PROV concepts and data model within a user driven processing and validation framework for seismic data, supporting also computational and data management steering. Validation needs to balance automation with user intervention, considering the scientist as part of the archiving process. Therefore, the provenance data is enriched with community-specific metadata vocabularies and control messages, making an experiment reproducible and its description consistent with the community understandings. Moreover, it can contain user defined terms and annotations. The current implementation of the system is supported by the EU-Funded VERCE (http://verce.eu). It provides, as well as the provenance generation mechanisms, a prototypal browser-based user interface and a web API built on top of a NoSQL storage
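
    To make the W3C-PROV usage concrete, here is a minimal sketch of how a workflow component might record that a run generated an intermediate product, assuming the Python prov package; the namespace, entity and activity names are made up, and the VERCE system's own vocabulary and control messages would be richer.

    ```python
    # Sketch of runtime provenance capture with the W3C PROV data model,
    # assuming the Python "prov" package; names and attributes are illustrative.
    from prov.model import ProvDocument

    doc = ProvDocument()
    doc.add_namespace("seis", "http://example.org/seismology#")  # hypothetical vocabulary

    run = doc.activity("seis:crossCorrelationRun")                # one workflow invocation
    trace = doc.entity("seis:correlogram_0001",
                       {"seis:stationPair": "ST01-ST02"})         # intermediate product
    user = doc.agent("seis:researcher")

    doc.wasGeneratedBy(trace, run)        # data lineage: product <- activity
    doc.wasAssociatedWith(run, user)      # steering: who is responsible for the run

    # The document can be serialized and shipped to a store (e.g. a document database)
    # for monitoring, validation, and reproducibility.
    print(doc.get_provn())
    ```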

  20. Workflow with pitfalls to derive a regional airborne magnetic compilation

    Science.gov (United States)

    Brönner, Marco; Baykiev, Eldar; Ebbing, Jörg

    2017-04-01

    Today, large-scale magnetic maps are usually a patchwork of airborne surveys of different size, resolution and acquisition year. Airborne magnetic acquisition is a fast and economic method to map and gain geological and tectonic information for large areas, onshore and offshore. Depending on the aim of a survey, acquisition parameters like altitude and profile distance are usually adjusted to match the purpose of investigation. The subsequent data processing commonly follows a standardized workflow comprising core-field subtraction and line leveling to yield a coherent crustal field magnetic grid for a survey area. The resulting data make it possible to correlate magnetic anomalies with geological and tectonic features in the subsurface, which is of importance for e.g. oil and mineral exploration. Crustal-scale magnetic interpretation and modeling demand regional compilations of magnetic data and the merger of adjacent magnetic surveys. These studies not only focus on shallower sources, reflected by short to intermediate magnetic wavelength anomalies, but also have a particular interest in the long wavelengths deriving from deep-seated sources. However, whilst the workflow to produce such a merger is supported by quite a few powerful routines, the resulting compilation contains several pitfalls and limitations, which have been discussed before but are still little recognized. The maximum wavelength that can be resolved by each individual survey is directly related to the survey size; consequently, a merger will contribute erroneous long-wavelength components to the magnetic data compilation. To minimize this problem and to homogenize the longer wavelengths, a first-order approach is to combine the airborne compilation with satellite magnetic data, which is sufficient only under particular preconditions. A more advanced approach considers the gap in frequencies between airborne and satellite data, which motivated
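
    The first-order approach mentioned above (replacing the unreliable long wavelengths of an airborne compilation with satellite-derived data) can be sketched as a simple spectral merge. The grids, grid spacing and cut-off wavelength below are placeholders; a production merger would also handle tapering, projections and the frequency gap between the two data sets.

    ```python
    import numpy as np

    def spectral_merge(airborne, satellite, dx_km, cutoff_km):
        """Illustrative merge: keep wavelengths shorter than cutoff_km from the
        airborne grid and longer wavelengths from the satellite grid.
        Both grids must share the same shape and spacing dx_km."""
        ky = np.fft.fftfreq(airborne.shape[0], d=dx_km)
        kx = np.fft.fftfreq(airborne.shape[1], d=dx_km)
        k = np.hypot(*np.meshgrid(ky, kx, indexing="ij"))  # radial wavenumber (1/km)
        # Crude box filter selecting long wavelengths; real workflows taper this edge.
        low_pass = (k < 1.0 / cutoff_km).astype(float)
        A, S = np.fft.fft2(airborne), np.fft.fft2(satellite)
        merged = np.fft.ifft2(A * (1 - low_pass) + S * low_pass)
        return merged.real

    # Placeholder 256 x 256 grids at 2 km spacing; replace with real compilations.
    rng = np.random.default_rng(0)
    airborne = rng.normal(size=(256, 256))
    satellite = rng.normal(size=(256, 256))
    print(spectral_merge(airborne, satellite, dx_km=2.0, cutoff_km=400.0).shape)
    ```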

  1. Text mining for the biocuration workflow

    Science.gov (United States)

    Hirschman, Lynette; Burns, Gully A. P. C; Krallinger, Martin; Arighi, Cecilia; Cohen, K. Bretonnel; Valencia, Alfonso; Wu, Cathy H.; Chatr-Aryamontri, Andrew; Dowell, Karen G.; Huala, Eva; Lourenço, Anália; Nash, Robert; Veuthey, Anne-Lise; Wiegers, Thomas; Winter, Andrew G.

    2012-01-01

    Molecular biology has become heavily dependent on biological knowledge encoded in expert curated biological databases. As the volume of biological literature increases, biocurators need help in keeping up with the literature; (semi-) automated aids for biocuration would seem to be an ideal application for natural language processing and text mining. However, to date, there have been few documented successes for improving biocuration throughput using text mining. Our initial investigations took place for the workshop on ‘Text Mining for the BioCuration Workflow’ at the third International Biocuration Conference (Berlin, 2009). We interviewed biocurators to obtain workflows from eight biological databases. This initial study revealed high-level commonalities, including (i) selection of documents for curation; (ii) indexing of documents with biologically relevant entities (e.g. genes); and (iii) detailed curation of specific relations (e.g. interactions); however, the detailed workflows also showed many variabilities. Following the workshop, we conducted a survey of biocurators. The survey identified biocurator priorities, including the handling of full text indexed with biological entities and support for the identification and prioritization of documents for curation. It also indicated that two-thirds of the biocuration teams had experimented with text mining and almost half were using text mining at that time. Analysis of our interviews and survey provide a set of requirements for the integration of text mining into the biocuration workflow. These can guide the identification of common needs across curated databases and encourage joint experimentation involving biocurators, text mining developers and the larger biomedical research community. PMID:22513129

  2. Mass Spectrometry-Based Proteomic Profiling of Thrombotic Material Obtained by Endovascular Thrombectomy in Patients with Ischemic Stroke

    Directory of Open Access Journals (Sweden)

    Roberto Muñoz

    2018-02-01

    Full Text Available Thrombotic material retrieved from acute ischemic stroke (AIS patients represents a valuable source of biological information. In this study, we have developed a clinical proteomics workflow to characterize the protein cargo of thrombi derived from AIS patients. To analyze the thrombus proteome in a large-scale format, we developed a workflow that combines the isolation of thrombus by endovascular thrombectomy and peptide chromatographic fractionation coupled to mass-spectrometry. Using this workflow, we have characterized a specific proteomic expression profile derived from four AIS patients included in this study. Around 1600 protein species were unambiguously identified in the analyzed material. Functional bioinformatics analyses were performed, emphasizing a clustering of proteins with immunological functions as well as cardiopathy-related proteins with blood-cell dependent functions and peripheral vascular processes. In addition, we established a reference proteomic fingerprint of 341 proteins commonly detected in all patients. Protein interactome network of this subproteome revealed protein clusters involved in the interaction of fibronectin with 14-3-3 proteins, TGFβ signaling, and TCP complex network. Taken together, our data contributes to the repertoire of the human thrombus proteome, serving as a reference library to increase our knowledge about the molecular basis of thrombus derived from AIS patients, paving the way toward the establishment of a quantitative approach necessary to detect and characterize potential novel biomarkers in the stroke field.

  3. Electronic resource management systems a workflow approach

    CERN Document Server

    Anderson, Elsa K

    2014-01-01

    To get to the bottom of a successful approach to Electronic Resource Management (ERM), Anderson interviewed staff at 11 institutions about their ERM implementations. Among her conclusions, presented in this issue of Library Technology Reports, is that grasping the intricacies of your workflow (analyzing each step to reveal the gaps and problems) at the beginning is crucial to selecting and implementing an ERM. Whether the system will be used to fill a gap, aggregate critical data, or replace a tedious manual process, the best solution for your library depends on factors such as your current soft

  4. Reengineering of the i4 workflow engine

    OpenAIRE

    Likar, Tilen

    2013-01-01

    I4 is an enterprise resource planning system which allows you to manage business processes. Due to increasing demands for managing complex processes and adjusting those processes to global standards, a renewal of part of the system was required. In this thesis we undertook the reengineering of the workflow engine and the corresponding data model. We designed a business process diagram in Bizagi Process Modeler. The import to i4 and the export from i4 were developed on the XPDL file exported from the mo...

  5. CMS data and workflow management system

    CERN Document Server

    Fanfani, A; Bacchi, W; Codispoti, G; De Filippis, N; Pompili, A; My, S; Abbrescia, M; Maggi, G; Donvito, G; Silvestris, L; Calzolari, F; Sarkar, S; Spiga, D; Cinquili, M; Lacaprara, S; Biasotto, M; Farina, F; Merlo, M; Belforte, S; Kavka, C; Sala, L; Harvey, J; Hufnagel, D; Fanzago, F; Corvo, M; Magini, N; Rehn, J; Toteva, Z; Feichtinger, D; Tuura, L; Eulisse, G; Bockelman, B; Lundstedt, C; Egeland, R; Evans, D; Mason, D; Gutsche, O; Sexton-Kennedy, L; Dagenhart, D W; Afaq, A; Guo, Y; Kosyakov, S; Lueking, L; Sekhri, V; Fisk, I; McBride, P; Bauerdick, L; Bakken, J; Rossman, P; Wicklund, E; Wu, Y; Jones, C; Kuznetsov, V; Riley, D; Dolgert, A; van Lingen, F; Narsky, I; Paus, C; Klute, M; Gomez-Ceballos, G; Piedra-Gomez, J; Miller, M; Mohapatra, A; Lazaridis, C; Bradley, D; Elmer, P; Wildish, T; Wuerthwein, F; Letts, J; Bourilkov, D; Kim, B; Smith, P; Hernandez, J M; Caballero, J; Delgado, A; Flix, J; Cabrillo-Bartolome, I; Kasemann, M; Flossdorf, A; Stadie, H; Kreuzer, P; Khomitch, A; Hof, C; Zeidler, C; Kalini, S; Trunov, A; Saout, C; Felzmann, U; Metson, S; Newbold, D; Geddes, N; Brew, C; Jackson, J; Wakefield, S; De Weirdt, S; Adler, V; Maes, J; Van Mulders, P; Villella, I; Hammad, G; Pukhaeva, N; Kurca, T; Semneniouk, I; Guan, W; Lajas, J A; Teodoro, D; Gregores, E; Baquero, M; Shehzad, A; Kadastik, M; Kodolova, O; Chao, Y; Ming Kuo, C; Filippidis, C; Walzel, G; Han, D; Kalinowski, A; Giro de Almeida, N M; Panyam, N

    2008-01-01

    CMS expects to manage many tens of petabytes of data to be distributed over several computing centers around the world. The CMS distributed computing and analysis model is designed to serve, process and archive the large number of events that will be generated when the CMS detector starts taking data. The underlying concepts and the overall architecture of the CMS data and workflow management system will be presented. In addition, the experience in using the system for MC production, initial detector commissioning activities and data analysis will be summarized.

  6. Wildfire: distributed, Grid-enabled workflow construction and execution

    Directory of Open Access Journals (Sweden)

    Issac Praveen

    2005-03-01

    Full Text Available Abstract Background We observe two trends in bioinformatics: (i) analyses are increasing in complexity, often requiring several applications to be run as a workflow; and (ii) multiple CPU clusters and Grids are available to more scientists. The traditional solution to the problem of running workflows across multiple CPUs required programming, often in a scripting language such as Perl. Programming places such solutions beyond the reach of many bioinformatics consumers. Results We present Wildfire, a graphical user interface for constructing and running workflows. Wildfire borrows user interface features from Jemboss and adds a drag-and-drop interface allowing the user to compose EMBOSS (and other) programs into workflows. For execution, Wildfire uses GEL, the underlying workflow execution engine, which can exploit available parallelism on multiple-CPU machines including Beowulf-class clusters and Grids. Conclusion Wildfire simplifies the tasks of constructing and executing bioinformatics workflows.

  7. Declarative Modelling and Safe Distribution of Healthcare Workflows

    DEFF Research Database (Denmark)

    Hildebrandt, Thomas; Mukkamala, Raghava Rao; Slaats, Tijs

    2012-01-01

    We present a formal technique for safe distribution of workflow processes described declaratively as Nested Condition Response (NCR) Graphs and apply the technique to a distributed healthcare workflow. Concretely, we provide a method to synthesize from an NCR Graph and any distribution of its events......-organizational case management. The contributions of this paper are to adapt the technique to allow for nested processes and milestones and to apply it to a healthcare workflow identified in a previous field study at Danish hospitals.

  8. Building and documenting workflows with python-based snakemake

    OpenAIRE

    Köster, Johannes; Rahmann, Sven

    2012-01-01

    Snakemake is a novel workflow engine with a simple Python-derived workflow definition language and an optimizing execution environment. It is the first system that supports multiple named wildcards (or variables) in the input and output filenames of each rule definition. It also allows users to write human-readable workflows that document themselves. We have found Snakemake especially useful for building high-throughput sequencing data analysis pipelines and present examples from this area.
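
    A minimal, illustrative Snakefile in the spirit of this description; the sample names, file paths and the wc-based counting step are placeholders rather than an example from the paper.

    ```
    SAMPLES = ["a", "b"]

    # The {sample} wildcard appears in both input and output of the rule below,
    # and the "all" rule declares the final targets of the workflow.
    rule all:
        input:
            expand("results/{sample}.count", sample=SAMPLES)

    rule count_reads:
        input:
            "data/{sample}.fastq"
        output:
            "results/{sample}.count"
        shell:
            "wc -l {input} > {output}"
    ```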

  9. Automated data reduction workflows for astronomy. The ESO Reflex environment

    Science.gov (United States)

    Freudling, W.; Romaniello, M.; Bramich, D. M.; Ballester, P.; Forchi, V.; García-Dabló, C. E.; Moehler, S.; Neeser, M. J.

    2013-11-01

    Context. Data from complex modern astronomical instruments often consist of a large number of different science and calibration files, and their reduction requires a variety of software tools. The execution chain of the tools represents a complex workflow that needs to be tuned and supervised, often by individual researchers who are not necessarily experts in any specific instrument. Aims: The efficiency of data reduction can be improved by using automatic workflows to organise data and execute a sequence of data reduction steps. To realize such efficiency gains, we designed a system that allows intuitive representation, execution and modification of the data reduction workflow, and has facilities for inspection and interaction with the data. Methods: The European Southern Observatory (ESO) has developed Reflex, an environment to automate data reduction workflows. Reflex is implemented as a package of customized components for the Kepler workflow engine. Kepler provides the graphical user interface to create an executable flowchart-like representation of the data reduction process. Key features of Reflex are a rule-based data organiser, infrastructure to re-use results, thorough book-keeping, data progeny tracking, interactive user interfaces, and a novel concept to exploit information created during data organisation for the workflow execution. Results: Automated workflows can greatly increase the efficiency of astronomical data reduction. In Reflex, workflows can be run non-interactively as a first step. Subsequent optimization can then be carried out while transparently re-using all unchanged intermediate products. We found that such workflows enable the reduction of complex data by non-expert users and minimize mistakes due to book-keeping errors. Conclusions: Reflex includes novel concepts to increase the efficiency of astronomical data processing. While Reflex is a specific implementation of astronomical scientific workflows within the Kepler workflow

  10. Integrating prediction, provenance, and optimization into high energy workflows

    Energy Technology Data Exchange (ETDEWEB)

    Schram, M.; Bansal, V.; Friese, R. D.; Tallent, N. R.; Yin, J.; Barker, K. J.; Stephan, E.; Halappanavar, M.; Kerbyson, D. J.

    2017-10-01

    We propose a novel approach for efficient execution of workflows on distributed resources. The key components of this framework include: performance modeling to quantitatively predict workflow component behavior; optimization-based scheduling such as choosing an optimal subset of resources to meet demand and assignment of tasks to resources; distributed I/O optimizations such as prefetching; and provenance methods for collecting performance data. In preliminary results, these techniques improve throughput on a small Belle II workflow by 20%.
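
    A toy version of the optimization-based scheduling idea, pairing predicted task runtimes with a greedy earliest-finish assignment; the runtime predictions and resource names are invented, and the framework's actual performance models and schedulers are far more detailed.

    ```python
    # Greedy earliest-finish scheduling driven by (hypothetical) predicted runtimes.
    predicted = {  # seconds per task on each resource class
        "simulate":    {"cluster": 120.0, "cloud": 90.0},
        "reconstruct": {"cluster": 60.0,  "cloud": 80.0},
        "analyze":     {"cluster": 30.0,  "cloud": 45.0},
    }

    def greedy_schedule(tasks, resources):
        """Assign each task to the resource giving the earliest predicted finish time."""
        free_at = {r: 0.0 for r in resources}
        plan = []
        for task in tasks:
            res = min(resources, key=lambda r: free_at[r] + predicted[task][r])
            finish = free_at[res] + predicted[task][res]
            free_at[res] = finish
            plan.append((task, res, finish))
        return plan

    for task, res, finish in greedy_schedule(list(predicted), ["cluster", "cloud"]):
        print(f"{task:12s} -> {res:8s} finishes at t={finish:.0f}s")
    ```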

  11. A Model of Workflow Composition for Emergency Management

    Science.gov (United States)

    Xin, Chen; Bin-ge, Cui; Feng, Zhang; Xue-hui, Xu; Shan-shan, Fu

    The commonly used workflow technology is not flexible enough to deal with concurrent emergency situations. The paper proposes a novel model for defining emergency plans, in which workflow segments appear as a constituent part. A formal abstraction, which contains four operations, is defined to compose workflow segments under constraint rules. The software system for the construction and composition of business process resources is implemented and integrated into the Emergency Plan Management Application System.

  12. MCX based solid phase extraction combined with liquid chromatography tandem mass spectrometry for the simultaneous determination of 31 endocrine-disrupting compounds in surface water of Shanghai.

    Science.gov (United States)

    Zhang, Hong-Chang; Yu, Xue-jun; Yang, Wen-chao; Peng, Jin-feng; Xu, Ting; Yin, Da-Qiang; Hu, Xia-lin

    2011-10-15

    A novel analytical method employing MCX (mixed-mode cationic exchange) based solid phase extraction (SPE) coupled with liquid chromatography tandem mass spectrometry (LC-MS/MS) was developed to detect 31 endocrine-disrupting compounds (EDCs) in surface water samples simultaneously. The target EDCs belong to five classes, including seven estrogens, eight androgens, six progesterones, five adrenocortical hormones and five industrial compounds. In order to simultaneously concentrate the target EDCs and eliminate matrix interferences in the water samples, MCX SPE cartridges were employed, followed by a simple and highly efficient three-step sequential elution procedure. Two electrospray ionization (ESI) detection modes, positive (ESI+) and negative (ESI-), were optimized for HPLC-MS/MS analysis to obtain the highest sensitivity for all the EDCs. The limits of detection (LODs) were 0.02-1.9 ng L(-1), which are lower than or comparable to those reported in the literature. Wide linear ranges (LOD-100 ng L(-1) for ESI+ mode, and LOD-200 ng L(-1) for ESI- mode) were obtained with determination coefficients (R(2)) higher than 0.99 for all the compounds. With five internal standards, good recoveries (84.4-103.0%) of all the target compounds were obtained in selected surface water samples. The developed method was successfully applied to investigate EDC occurrence in the surface water of Shanghai by analyzing surface water samples from 11 sites. The results showed that nearly all the target compounds (30 of 31) were present in the surface water samples of Shanghai, of which three industrial compounds (4-t-OP, BPA, and BPF) showed the highest concentrations (median concentrations were 11.88-23.50 ng L(-1)), suggesting that industrial compounds were the dominant EDCs in the surface water of Shanghai, and much more attention should be paid to these compounds. Our present research demonstrated that SPE with MCX cartridges combined with HPLC-MS/MS was convenient

  13. Loading of free radicals on the functional graphene combined with liquid chromatography-tandem mass spectrometry screening method for the detection of radical-scavenging natural antioxidants.

    Science.gov (United States)

    Wang, Guoying; Shi, Gaofeng; Chen, Xuefu; Chen, Fuwen; Yao, Ruixing; Wang, Zhenju

    2013-11-13

    A novel free radical reaction combined with liquid chromatography electrospray ionization tandem mass spectrometry (FRR-LC-PDA-ESI/APCI-MS/MS) screening method was developed for the detection and identification of radical-scavenging natural antioxidants. Functionalized graphene was prepared by chemical method for loading free radicals (superoxide radical, peroxyl radical and PAHs free radical). Separation was performed with and without a preliminary exposure of the sample to specific free radicals on the functionalized graphene, which can facilitate reaction kinetics (charge transfers) between free radicals and potential antioxidants. The difference in chromatographic peak areas is used to identify potential antioxidants. The structure of the antioxidants in one sample (Swertia chirayita) is identified using MS/MS and comparison with standards. Thirteen compounds were found to possess potential antioxidant activity, and their free radical-scavenging capacities were investigated. The thirteen compounds were identified as 1,3,5-trihydroxyxanthone-8-O-β-D-glucopyranoside (PD1), norswertianin (PD2), 1,3,5,8-tetrahydroxyxanthone (PD3), 3, 3', 4', 5, 8-penta hydroxyflavone-6-β-D-glucopyranosiduronic acid-6'-pentopyranose-7-O-glucopyranoside (PD4), 1,5,8-trihydroxy-3-methoxyxanthone (PD5), swertiamarin (PS1), 2-C-β-D-glucopyranosyl-1,3,7-trihydroxylxanthone (PS2), 1,3,7-trihydroxylxanthone-8-O-β-D-glucopyranoside (PL1), 1,3,8-trihydroxyl xanthone-5-O-β-D-glucopyranoside (PL2), 1,3,7-trihydroxy-8-methoxyxanthone (PL3), 1,2,3-trihydroxy-7,8-dimethoxyxanthone (PL4), 1,8-dihydroxy-2,6-dimethoxy xanthone (PL5) and 1,3,5,8-tetramethoxydecussatin (PL6). The reactivity and SC50 values of those compounds were investigated, respectively. PD4 showed the strongest capability for scavenging PAHs free radical; PL4 showed prominent scavenging capacities in the lipid peroxidation processes; it was found that all components in S. chirayita exhibited weak reactivity in the superoxide

  14. Combination of dispersive liquid-liquid microextraction with flame atomic absorption spectrometry using microsample introduction for determination of lead in water samples.

    Science.gov (United States)

    Naseri, Mohammad Taghi; Hemmatkhah, Payam; Hosseini, Mohammad Reza Milani; Assadi, Yaghoub

    2008-03-03

    Dispersive liquid-liquid microextraction (DLLME) was combined with flame atomic absorption spectrometry (FAAS) for the determination of lead in water samples. Diethyldithiophosphoric acid (DDTP), carbon tetrachloride and methanol were used as chelating agent, extraction solvent and disperser solvent, respectively. A new FAAS sample introduction system was employed for the microvolume nebulization of the non-flammable chlorinated organic extracts. Injection of 20 μL volumes of the organic extract into an air-acetylene flame provided very sensitive spike-like and reproducible signals. Parameters affecting the microextraction and the complex formation were selected and optimized. These parameters include extraction and disperser solvent type as well as their volume, extraction time, salt effect, pH and amount of the chelating agent. Under the optimized conditions, an enrichment factor of 450 was obtained from a sample volume of 25.0 mL. The enhancement factor, calculated as the ratio of the slopes of the calibration graphs with and without preconcentration, was about 1000. The calibration graph was linear in the range of 1-70 μg L(-1) with a detection limit of 0.5 μg L(-1). The relative standard deviations (R.S.D.) for seven replicate measurements of 5.0 and 50 μg L(-1) of lead were 3.8 and 2.0%, respectively. The relative recoveries of lead in tap, well, river and seawater samples at the spiking level of 20 μg L(-1) ranged from 93.8 to 106.2%. The characteristics of the proposed method were compared with those of liquid-liquid extraction (LLE), cloud point extraction (CPE), on-line and off-line solid-phase extraction (SPE) as well as co-precipitation, based on bibliographic data. Operation simplicity, rapidity, low cost, high enrichment factor, good repeatability, and low consumption of the extraction solvent at a microliter level are the main advantages of the proposed method.
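
    For readers unfamiliar with how such an enhancement factor is obtained, the short sketch below reproduces the slope-ratio calculation from two calibration series; the concentrations and signals are made-up numbers chosen to give a ratio near 1000, not data from the paper.

    ```python
    import numpy as np

    # Hypothetical calibration data: concentration (ug/L) vs. instrument signal.
    conc = np.array([1.0, 5.0, 10.0, 30.0, 50.0, 70.0])
    signal_dllme = np.array([0.020, 0.100, 0.205, 0.610, 1.010, 1.400])          # with preconcentration
    signal_direct = np.array([0.00002, 0.0001, 0.0002, 0.0006, 0.0010, 0.0014])  # direct FAAS

    slope_dllme = np.polyfit(conc, signal_dllme, 1)[0]
    slope_direct = np.polyfit(conc, signal_direct, 1)[0]

    # Enhancement factor = ratio of the calibration slopes with and without preconcentration.
    print(f"enhancement factor ~ {slope_dllme / slope_direct:.0f}")
    ```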

  15. Combination of dispersive liquid-liquid microextraction with flame atomic absorption spectrometry using microsample introduction for determination of lead in water samples

    International Nuclear Information System (INIS)

    Naseri, Mohammad Taghi; Hemmatkhah, Payam; Hosseini, Mohammad Reza Milani; Assadi, Yaghoub

    2008-01-01

    Dispersive liquid-liquid microextraction (DLLME) was combined with flame atomic absorption spectrometry (FAAS) for the determination of lead in water samples. Diethyldithiophosphoric acid (DDTP), carbon tetrachloride and methanol were used as chelating agent, extraction solvent and disperser solvent, respectively. A new FAAS sample introduction system was employed for the microvolume nebulization of the non-flammable chlorinated organic extracts. Injection of 20 μL volumes of the organic extract into an air-acetylene flame provided very sensitive spike-like and reproducible signals. Parameters affecting the microextraction and the complex formation were selected and optimized. These parameters include extraction and disperser solvent type as well as their volume, extraction time, salt effect, pH and amount of the chelating agent. Under the optimized conditions, an enrichment factor of 450 was obtained from a sample volume of 25.0 mL. The enhancement factor, calculated as the ratio of the slopes of the calibration graphs with and without preconcentration, was about 1000. The calibration graph was linear in the range of 1-70 μg L-1 with a detection limit of 0.5 μg L-1. The relative standard deviations (R.S.D.) for seven replicate measurements of 5.0 and 50 μg L-1 of lead were 3.8 and 2.0%, respectively. The relative recoveries of lead in tap, well, river and seawater samples at the spiking level of 20 μg L-1 ranged from 93.8 to 106.2%. The characteristics of the proposed method were compared with those of liquid-liquid extraction (LLE), cloud point extraction (CPE), on-line and off-line solid-phase extraction (SPE) as well as co-precipitation, based on bibliographic data. Operation simplicity, rapidity, low cost, high enrichment factor, good repeatability, and low consumption of the extraction solvent at a microliter level are the main advantages of the proposed method.

  16. DIaaS: Data-Intensive workflows as a service - Enabling easy composition and deployment of data-intensive workflows on Virtual Research Environments

    Science.gov (United States)

    Filgueira, R.; Ferreira da Silva, R.; Deelman, E.; Atkinson, M.

    2016-12-01

    We present the Data-Intensive workflows as a Service (DIaaS) model for enabling easy data-intensive workflow composition and deployment on clouds using containers. DIaaS model backbone is Asterism, an integrated solution for running data-intensive stream-based applications on heterogeneous systems, which combines the benefits of dispel4py with Pegasus workflow systems. The stream-based executions of an Asterism workflow are managed by dispel4py, while the data movement between different e-Infrastructures, and the coordination of the application execution are automatically managed by Pegasus. DIaaS combines Asterism framework with Docker containers to provide an integrated, complete, easy-to-use, portable approach to run data-intensive workflows on distributed platforms. Three containers integrate the DIaaS model: a Pegasus node, and an MPI and an Apache Storm clusters. Container images are described as Dockerfiles (available online at http://github.com/dispel4py/pegasus_dispel4py), linked to Docker Hub for providing continuous integration (automated image builds), and image storing and sharing. In this model, all required software (workflow systems and execution engines) for running scientific applications are packed into the containers, which significantly reduces the effort (and possible human errors) required by scientists or VRE administrators to build such systems. The most common use of DIaaS will be to act as a backend of VREs or Scientific Gateways to run data-intensive applications, deploying cloud resources upon request. We have demonstrated the feasibility of DIaaS using the data-intensive seismic ambient noise cross-correlation application (Figure 1). The application preprocesses (Phase1) and cross-correlates (Phase2) traces from several seismic stations. The application is submitted via Pegasus (Container1), and Phase1 and Phase2 are executed in the MPI (Container2) and Storm (Container3) clusters respectively. Although both phases could be executed

  17. Business and scientific workflows a web service-oriented approach

    CERN Document Server

    Tan, Wei

    2013-01-01

    Focuses on how to use web service computing and service-based workflow technologies to develop timely, effective workflows for both business and scientific fields. Utilizing web computing and Service-Oriented Architecture (SOA), Business and Scientific Workflows: A Web Service-Oriented Approach focuses on how to design, analyze, and deploy web service-based workflows for both business and scientific applications in many areas of healthcare and biomedicine. It also discusses and presents the recent research and development results. This informative reference features app

  18. Comparison of Resource Platform Selection Approaches for Scientific Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Simmhan, Yogesh; Ramakrishnan, Lavanya

    2010-03-05

    Cloud computing is increasingly considered as an additional computational resource platform for scientific workflows. The cloud offers the opportunity to scale out applications from desktops and local cluster resources. At the same time, it can eliminate the challenges of restricted software environments and queue delays in shared high performance computing environments. Choosing from these diverse resource platforms for a workflow execution poses a challenge for many scientists. Scientists are often faced with deciding resource platform selection trade-offs with limited information on the actual workflows. While many workflow planning methods have explored task scheduling onto different resources, these methods often require fine-scale characterization of the workflow that is onerous for a scientist. In this position paper, we describe our early exploratory work into using blackbox characteristics to do a cost-benefit analysis of using cloud platforms. We use only very limited high-level information on the workflow length, width, and data sizes. The length and width are indicative of the workflow duration and parallelism. The data size characterizes the IO requirements. We compare the effectiveness of this approach to other resource selection models using two exemplar scientific workflows scheduled on desktops, local clusters, HPC centers, and clouds. Early results suggest that the blackbox model often makes the same resource selections as a more fine-grained whitebox model. We believe the simplicity of the blackbox model can help inform a scientist on the applicability of cloud computing resources even before porting an existing workflow.
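
    A stripped-down sketch of the blackbox idea: estimate makespan and cost per platform from only the workflow length, width and data size. The platform parameters (core counts, prices, bandwidths) and the example workflow are placeholders, not values from the study.

    ```python
    # Blackbox resource-selection sketch: only length (sequential stages), width
    # (parallel tasks per stage), and data size are known about the workflow.
    # Platform parameters below are invented for illustration.
    platforms = {
        "desktop": {"cores": 4,    "usd_per_core_hour": 0.00, "mb_per_s": 50},
        "cluster": {"cores": 256,  "usd_per_core_hour": 0.00, "mb_per_s": 200},
        "cloud":   {"cores": 1024, "usd_per_core_hour": 0.10, "mb_per_s": 20},
    }

    def estimate(length, width, task_hours, data_gb, p):
        waves = -(-width // p["cores"])                  # ceiling division: task waves per stage
        compute_h = length * waves * task_hours
        transfer_h = data_gb * 1024 / p["mb_per_s"] / 3600
        cost = compute_h * min(width, p["cores"]) * p["usd_per_core_hour"]
        return compute_h + transfer_h, cost

    for name, p in platforms.items():
        hours, cost = estimate(length=3, width=500, task_hours=0.5, data_gb=100, p=p)
        print(f"{name:8s} ~{hours:6.1f} h, ~${cost:7.2f}")
    ```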

  19. Design, Modelling and Analysis of a Workflow Reconfiguration

    DEFF Research Database (Denmark)

    Mazzara, Manuel; Abouzaid, Faisal; Dragoni, Nicola

    2011-01-01

    This paper describes a case study involving the reconfiguration of an office workflow. We state the requirements on a system implementing the workflow and its reconfiguration, and describe the system’s design in BPMN. We then use an asynchronous pi-calculus and Web.1 to model the design and to verify it.

  20. A Strategy for an MLS Workflow Management System

    National Research Council Canada - National Science Library

    Kang, Myong H; Froscher, Judith N; Eppinger, Brian J; Moskowitz, Ira S

    1999-01-01

    .... Therefore, DoD needs MLS workflow management systems (WFMS) to enable globally distributed users and existing applications to cooperate across classification domains to achieve mission critical goals...

  1. Modeling Complex Workflow in Molecular Diagnostics

    Science.gov (United States)

    Gomah, Mohamed E.; Turley, James P.; Lu, Huimin; Jones, Dan

    2010-01-01

    One of the hurdles to achieving personalized medicine has been implementing the laboratory processes for performing and reporting complex molecular tests. The rapidly changing test rosters and complex analysis platforms in molecular diagnostics have meant that many clinical laboratories still use labor-intensive manual processing and testing without the level of automation seen in high-volume chemistry and hematology testing. We provide here a discussion of design requirements and the results of implementation of a suite of lab management tools that incorporate the many elements required for use of molecular diagnostics in personalized medicine, particularly in cancer. These applications provide the functionality required for sample accessioning and tracking, material generation, and testing that are particular to the evolving needs of individualized molecular diagnostics. On implementation, the applications described here resulted in improvements in the turn-around time for reporting of more complex molecular test sets, and significant changes in the workflow. Therefore, careful mapping of workflow can permit design of software applications that simplify even the complex demands of specialized molecular testing. By incorporating design features for order review, software tools can permit a more personalized approach to sample handling and test selection without compromising efficiency. PMID:20007844

  2. Deriving DICOM surgical extensions from surgical workflows

    Science.gov (United States)

    Burgert, O.; Neumuth, T.; Gessat, M.; Jacobs, S.; Lemke, H. U.

    2007-03-01

    The generation, storage, transfer, and representation of image data in radiology are standardized by DICOM. To cover the needs of image-guided surgery, or computer-assisted surgery in general, one needs to handle patient information besides image data. A large number of objects must be defined in DICOM to address the needs of surgery. We propose an analysis process based on Surgical Workflows that helps to identify these objects together with use cases and requirements motivating their specification. As a first result, we confirmed the need for the specification of representation and transfer of geometric models. The analysis of Surgical Workflows has shown that geometric models are widely used to represent planned procedure steps, surgical tools, anatomical structures, or prostheses in the context of surgical planning, image-guided surgery, augmented reality, and simulation. To date, the models have been stored and transferred in several file formats devoid of contextual information. The standardization of data types including contextual information and specifications for the handling of geometric models allows a broader usage of such models. This paper explains the specification process leading to Geometry Mesh Service Object Pair classes. This process can be a template for the definition of further DICOM classes.

  3. Workflow management for a cosmology collaboratory

    International Nuclear Information System (INIS)

    Loken, Stewart C.; McParland, Charles

    2001-01-01

    The Nearby Supernova Factory Project will provide a unique opportunity to bring together simulation and observation to address crucial problems in particle and nuclear physics. Its goal is to significantly enhance our understanding of the nuclear processes in supernovae and to improve our ability to use both Type Ia and Type II supernovae as reference light sources (standard candles) in precision measurements of cosmological parameters. Over the past several years, astronomers and astrophysicists have been conducting in-depth sky searches with the goal of identifying supernovae in their earliest evolutionary stages and, during the 4 to 8 weeks of their most ''explosive'' activity, measure their changing magnitude and spectra. The search program currently under development at LBNL is an earth-based observation program utilizing observational instruments at Haleakala and Mauna Kea, Hawaii and Mt. Palomar, California. This new program provides a demanding testbed for the integration of computational, data management and collaboratory technologies. A critical element of this effort is the use of emerging workflow management tools to permit collaborating scientists to manage data processing and storage and to integrate advanced supernova simulation into the real-time control of the experiments. This paper describes the workflow management framework for the project, discusses security and resource allocation requirements and reviews emerging tools to support this important aspect of collaborative work

  4. The Prosthetic Workflow in the Digital Era

    Directory of Open Access Journals (Sweden)

    Lidia Tordiglione

    2016-01-01

    Full Text Available The purpose of this retrospective study was to clinically evaluate the benefits of adopting a full digital workflow for the implementation of fixed prosthetic restorations on natural teeth. To evaluate the effectiveness of these protocols, treatment plans were drawn up for 15 patients requiring rehabilitation of one or more natural teeth. All the dental impressions were taken using a Planmeca PlanScan® (Planmeca OY, Helsinki, Finland) intraoral scanner, which provided digital casts on which the restorations were digitally designed using Exocad® (Exocad GmbH, Germany, 2010) software and fabricated by CAM processing on 5-axis milling machines. A total of 28 single crowns were made from monolithic zirconia, 12 vestibular veneers from lithium disilicate, and 4 three-quarter vestibular veneers with palatal extension. While the restorations were applied, the authors could clinically appreciate the excellent match between the digitally produced prosthetic design and the cemented prostheses, which never required any occlusal or proximal adjustment. Out of all the restorations applied, only one exhibited premature failure and was replaced with no other complications or need for further scanning. From the clinical experience gained using a full digital workflow, the authors can confirm that these work processes enable the fabrication of clinically reliable restorations, with all the benefits that digital methods bring to the dentist, the dental laboratory, and the patient.

  5. Mass spectrometry in clinical chemistry

    International Nuclear Information System (INIS)

    Pettersen, J.E.

    1977-01-01

    A brief description is given of the functional elements of a mass spectrometer and of some currently employed mass spectrometric techniques, such as combined gas chromatography-mass spectrometry, mass chromatography, and selected ion monitoring. Various areas of application of mass spectrometry in clinical chemistry are discussed, such as inborn errors of metabolism and other metabolic disorders, intoxications, quantitative determinations of drugs, hormones, gases, and trace elements, and the use of isotope dilution mass spectrometry as a definitive method for the establishment of true values for concentrations of various compounds in reference sera. It is concluded that mass spectrometry is of great value in clinical chemistry. (Auth.)

  6. Application of matrix-assisted laser desorption/ionization mass spectrometry imaging in combination with LC-MS in pharmacokinetic study of metformin

    Czech Academy of Sciences Publication Activity Database

    Strnad, Štěpán; Vrkoslav, Vladimír; Klimšová, Z.; Zemenová, J.; Cvačka, Josef; Maletínská, Lenka; Sýkora, D.

    2018-01-01

    Vol. 10, No. 2 (2018), pp. 71-81, ISSN 1757-6180 Institutional support: RVO:61388963 Keywords: dried blood spots * mass spectrometry imaging * metformin Subject RIV: CB - Analytical Chemistry, Separation OECD field: Analytical chemistry Impact factor: 2.673, year: 2016

  7. Combined high-pressure cell-ultrahigh vacuum system for fast testing of model metal alloy catalysts using scanning mass spectrometry

    DEFF Research Database (Denmark)

    Johansson, Martin; Jørgensen, Jan Hoffmann; Chorkendorff, Ib

    2004-01-01

    and gas sampling device over the sample surface. The gas sampled is analyzed with mass spectrometry. Experiments can be made at pressures up to 1 bar and temperatures up to 500 °C. It is shown that the lateral resolution is better than 0.2 mm and that up to 20 circular spots, 1 mm in diameter, can...

  8. Dispersive solid phase extraction combined with ion-pair ultra high-performance liquid chromatography tandem mass spectrometry for quantification of nucleotides in Lactococcus lactis

    DEFF Research Database (Denmark)

    Magdenoska, Olivera; Martinussen, Jan; Thykær, Jette

    2013-01-01

    solid phase extraction with charcoal and subsequent analysis with ion-pair liquid chromatography coupled with electrospray ionization tandem mass spectrometry was established for quantification of intracellular pools of the 28 most important nucleotides. The method can handle extracts where cells leak...

  9. A combined accelerator mass spectrometry-positron emission tomography human microdose study with 14C- and 11C-labelled verapamil.

    Science.gov (United States)

    Wagner, Claudia C; Simpson, Marie; Zeitlinger, Markus; Bauer, Martin; Karch, Rudolf; Abrahim, Aiman; Feurstein, Thomas; Schütz, Matthias; Kletter, Kurt; Müller, Markus; Lappin, Graham; Langer, Oliver

    2011-02-01

    In microdose studies, the pharmacokinetic profile of a drug in blood after administration of a dose up to 100 μg is measured with sensitive analytical techniques, such as accelerator mass spectrometry (AMS). As most drugs exert their effect in tissue rather than blood, methodology is needed for extending pharmacokinetic analysis to different tissue compartments. In the present study, we combined, for the first time, AMS analysis with positron emission tomography (PET) in order to determine the pharmacokinetic profile of the model drug verapamil in plasma and brain of humans. In order to assess pharmacokinetic dose linearity of verapamil, data were acquired and compared after administration of an intravenous microdose and after an intravenous microdose administered concomitantly with an oral therapeutic dose. Six healthy male subjects received an intravenous microdose [0.05 mg] (period 1) and an intravenous microdose administered concomitantly with an oral therapeutic dose [80 mg] of verapamil (period 2) in a randomized, crossover, two-period study design. The intravenous dose was a mixture of (R/S)-[14C]verapamil and (R)-[11C]verapamil and the oral dose was unlabelled racaemic verapamil. Brain distribution of radioactivity was measured with PET whereas plasma pharmacokinetics of (R)- and (S)-verapamil were determined with AMS. PET data were analysed by pharmacokinetic modelling to estimate the rate constants for transfer (k) of radioactivity across the blood-brain barrier. Most pharmacokinetic parameters of (R)- and (S)-verapamil as well as parameters describing exchange of radioactivity between plasma and brain (influx rate constant [K(1)] = 0.030 ± 0.003 and 0.031 ± 0.005 mL/mL/min and efflux rate constant [k(2)] = 0.099 ± 0.006 and 0.095 ± 0.008 min-1 for period 1 and 2, respectively) were not statistically different between the two periods although there was a trend for nonlinear pharmacokinetics for the (R)-enantiomer. On the other hand, all

  10. Hydroponic isotope labeling of entire plants and high-performance mass spectrometry for quantitative plant proteomics.

    Science.gov (United States)

    Bindschedler, Laurence V; Mills, Davinia J S; Cramer, Rainer

    2012-01-01

    Hydroponic isotope labeling of entire plants (HILEP) combines hydroponic plant cultivation and metabolic labeling with stable isotopes using (15)N-containing inorganic salts to label whole and mature plants. Employing (15)N salts as the sole nitrogen source for HILEP leads to the production of healthy-looking plants which contain (15)N proteins labeled to nearly 100%. Therefore, HILEP is suitable for quantitative plant proteomic analysis, where plants are grown in either (14)N- or (15)N-hydroponic media and pooled when the biological samples are collected for relative proteome quantitation. The pooled (14)N-/(15)N-protein extracts can be fractionated in any suitable way and digested with a protease for shotgun proteomics, using typically reverse phase liquid chromatography nanoelectrospray ionization tandem mass spectrometry (RPLC-nESI-MS/MS). Best results were obtained with a hybrid ion trap/FT-MS mass spectrometer, combining high mass accuracy and sensitivity for the MS data acquisition with speed and high-throughput MS/MS data acquisition, increasing the number of proteins identified and quantified and improving protein quantitation. Peak processing and picking from raw MS data files, protein identification, and quantitation were performed in a highly automated way using integrated MS data analysis software with minimum manual intervention, thus easing the analytical workflow. In this methodology paper, we describe how to grow Arabidopsis plants hydroponically for isotope labeling using (15)N salts and how to quantitate the resulting proteomes using a convenient workflow that does not require extensive bioinformatics skills.
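
    The relative quantitation step described here boils down to comparing light (14)N and heavy (15)N peak intensities for each peptide and rolling those ratios up to proteins; the toy intensities below are invented to show the arithmetic and are not real HILEP data.

    ```python
    from statistics import median

    # Hypothetical light/heavy ((14)N/(15)N) peak intensities per peptide, grouped by protein.
    peptide_intensities = {
        "AT1G01010": [(1.2e6, 0.6e6), (8.0e5, 4.1e5), (2.2e6, 1.0e6)],
        "AT3G12580": [(3.0e5, 3.2e5), (5.5e5, 5.0e5)],
    }

    for protein, pairs in peptide_intensities.items():
        ratios = [light / heavy for light, heavy in pairs]   # per-peptide 14N/15N ratio
        print(f"{protein}: median 14N/15N = {median(ratios):.2f} ({len(ratios)} peptides)")
    # A ratio near 1 means unchanged abundance between the pooled conditions;
    # deviations indicate relative up- or down-regulation.
    ```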

  11. Fourier Transform Mass Spectrometry.

    Science.gov (United States)

    Gross, Michael L.; Rempel, Don L.

    1984-01-01

    Discusses the nature of Fourier transform mass spectrometry and its unique combination of high mass resolution, high upper mass limit, and multichannel advantage. Examines its operation, capabilities and limitations, applications (ion storage, ion manipulation, ion chemistry), and future applications and developments. (JN)
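
    The resolving power of Fourier transform (ion cyclotron resonance) instruments comes from measuring ion cyclotron frequencies, which relate to m/z through f = zeB/(2πm). A minimal sketch of this standard relation, assuming a 7 T magnet purely for illustration:

```python
# Cyclotron relation underlying FT-ICR MS: f = z*e*B / (2*pi*m).
import math

E_CHARGE = 1.602176634e-19    # elementary charge, C
DALTON = 1.66053906660e-27    # unified atomic mass unit, kg

def cyclotron_frequency_hz(mz, b_field_tesla):
    """Cyclotron frequency for an ion of the given m/z (in Da per charge)."""
    mass_per_charge = mz * DALTON / E_CHARGE
    return b_field_tesla / (2.0 * math.pi * mass_per_charge)

for mz in (200.0, 500.0, 1000.0):
    f = cyclotron_frequency_hz(mz, b_field_tesla=7.0)   # 7 T assumed for illustration
    print(f"m/z {mz:6.1f} -> {f / 1e3:7.1f} kHz")
```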

  12. Galaxy tools and workflows for sequence analysis with applications in molecular plant pathology.

    Science.gov (United States)

    Cock, Peter J A; Grüning, Björn A; Paszkiewicz, Konrad; Pritchard, Leighton

    2013-01-01

    The Galaxy Project offers the popular web browser-based platform Galaxy for running bioinformatics tools and constructing simple workflows. Here, we present a broad collection of additional Galaxy tools for large scale analysis of gene and protein sequences. The motivating research theme is the identification of specific genes of interest in a range of non-model organisms, and our central example is the identification and prediction of "effector" proteins produced by plant pathogens in order to manipulate their host plant. This functional annotation of a pathogen's predicted capacity for virulence is a key step in translating sequence data into potential applications in plant pathology. This collection includes novel tools, and widely-used third-party tools such as NCBI BLAST+ wrapped for use within Galaxy. Individual bioinformatics software tools are typically available separately as standalone packages, or in online browser-based form. The Galaxy framework enables the user to combine these and other tools to automate organism scale analyses as workflows, without demanding familiarity with command line tools and scripting. Workflows created using Galaxy can be saved and are reusable, so may be distributed within and between research groups, facilitating the construction of a set of standardised, reusable bioinformatic protocols. The Galaxy tools and workflows described in this manuscript are open source and freely available from the Galaxy Tool Shed (http://usegalaxy.org/toolshed or http://toolshed.g2.bx.psu.edu).

  13. Image data compression in diagnostic imaging. International literature review and workflow recommendation

    International Nuclear Information System (INIS)

    Braunschweig, R.; Kaden, Ingmar; Schwarzer, J.; Sprengel, C.; Klose, K.

    2009-01-01

    Purpose: Today healthcare policy is based on effectiveness. Diagnostic imaging became a "pace-setter" due to amazing technical developments (e.g. multislice CT), extensive data volumes, and especially the well-defined workflow-orientated scenarios on a local and (inter)national level. To make centralized networks sufficient, image data compression has been regarded as the key to a simple and secure solution. In February 2008 specialized working groups of the DRG held a consensus conference. They designed recommended data compression techniques and ratios. Materials and methods: The purpose of our paper is an international review of the literature of compression technologies, different imaging procedures (e.g. DR, CT etc.), and targets (abdomen, etc.) and to combine recommendations for compression ratios and techniques with different workflows. The studies were assigned to 4 different levels (0-3) according to the evidence. 51 studies were assigned to the highest level 3. Results: We recommend a compression factor of 1:8 (excluding cranial scans: 1:5). For workflow reasons data compression should be based on the modalities (CT, etc.). PACS-based compression is currently possible but fails to maximize workflow benefits. Only the modality-based scenarios achieve all benefits. (orig.)

  14. Development of a High-Throughput Ion-Exchange Resin Characterization Workflow.

    Science.gov (United States)

    Liu, Chun; Dermody, Daniel; Harris, Keith; Boomgaard, Thomas; Sweeney, Jeff; Gisch, Daryl; Goltz, Bob

    2017-06-12

    A novel high-throughput (HTR) ion-exchange (IEX) resin workflow has been developed for characterizing ion exchange equilibrium of commercial and experimental IEX resins against a range of different applications where the water environment differs from site to site. Because of its much higher throughput, design of experiment (DOE) methodology can be easily applied for studying the effects of multiple factors on resin performance. Two case studies will be presented to illustrate the efficacy of the combined HTR workflow and DOE method. In case study one, a series of anion exchange resins have been screened for selective removal of NO3- and NO2- in water environments consisting of multiple other anions, varied pH, and ionic strength. A response surface model (RSM) is developed to statistically correlate the resin performance with the water composition and predict the best resin candidate. In case study two, the same HTR workflow and DOE method have been applied for screening different cation exchange resins in terms of the selective removal of Mg2+, Ca2+, and Ba2+ from high total dissolved salt (TDS) water. A master DOE model including all of the cation exchange resins is created to predict divalent cation removal by different IEX resins under specific conditions, from which the best resin candidates can be identified. The successful adoption of the HTR workflow and DOE method for studying the ion exchange of IEX resins can significantly reduce the resources and time to address industry and application needs.
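
    As an illustration of the response surface modelling step, the sketch below fits a full quadratic surface to a small set of hypothetical screening results (pH and ionic strength as factors, nitrate uptake as the response) using ordinary least squares; the design points and values are invented and do not come from the study.

```python
# Fit a quadratic response surface y = b0 + b1*x1 + b2*x2 + b3*x1*x2 + b4*x1^2 + b5*x2^2
# to hypothetical high-throughput screening results (x1 = pH, x2 = ionic strength).
import numpy as np

x1 = np.array([5.0, 5.0, 7.0, 7.0, 6.0, 6.0, 6.0])          # pH
x2 = np.array([0.01, 0.10, 0.01, 0.10, 0.05, 0.05, 0.05])   # ionic strength (M)
y = np.array([1.8, 1.2, 1.6, 1.0, 1.5, 1.4, 1.5])           # uptake (meq/g), invented

X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict(ph, ionic_strength):
    terms = [1.0, ph, ionic_strength, ph * ionic_strength, ph ** 2, ionic_strength ** 2]
    return float(np.dot(coef, terms))

print("fitted coefficients:", np.round(coef, 3))
print("predicted uptake at pH 6.5, 0.02 M:", round(predict(6.5, 0.02), 3))
```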

  15. Interacting with the National Database for Autism Research (NDAR) via the LONI Pipeline workflow environment.

    Science.gov (United States)

    Torgerson, Carinna M; Quinn, Catherine; Dinov, Ivo; Liu, Zhizhong; Petrosyan, Petros; Pelphrey, Kevin; Haselgrove, Christian; Kennedy, David N; Toga, Arthur W; Van Horn, John Darrell

    2015-03-01

    Under the umbrella of the National Database for Clinical Trials (NDCT) related to mental illnesses, the National Database for Autism Research (NDAR) seeks to gather, curate, and make openly available neuroimaging data from NIH-funded studies of autism spectrum disorder (ASD). NDAR has recently made its database accessible through the LONI Pipeline workflow design and execution environment to enable large-scale analyses of cortical architecture and function via local, cluster, or "cloud"-based computing resources. This presents a unique opportunity to overcome many of the customary limitations to fostering biomedical neuroimaging as a science of discovery. Providing open access to primary neuroimaging data, workflow methods, and high-performance computing will increase uniformity in data collection protocols, encourage greater reliability of published data, results replication, and broaden the range of researchers now able to perform larger studies than ever before. To illustrate the use of NDAR and LONI Pipeline for performing several commonly performed neuroimaging processing steps and analyses, this paper presents example workflows useful for ASD neuroimaging researchers seeking to begin using this valuable combination of online data and computational resources. We discuss the utility of such database and workflow processing interactivity as a motivation for the sharing of additional primary data in ASD research and elsewhere.

  16. Image data compression in diagnostic imaging. International literature review and workflow recommendation

    Energy Technology Data Exchange (ETDEWEB)

    Braunschweig, R.; Kaden, Ingmar [Klinik fuer Bildgebende Diagnostik und Interventionsradiologie, BG-Kliniken Bergmannstrost Halle (Germany); Schwarzer, J.; Sprengel, C. [Dept. of Management Information System and Operations Research, Martin-Luther-Univ. Halle Wittenberg (Germany); Klose, K. [Medizinisches Zentrum fuer Radiologie, Philips-Univ. Marburg (Germany)

    2009-07-15

    Purpose: Today healthcare policy is based on effectiveness. Diagnostic imaging became a "pace-setter" due to amazing technical developments (e.g. multislice CT), extensive data volumes, and especially the well-defined workflow-orientated scenarios on a local and (inter)national level. To make centralized networks sufficient, image data compression has been regarded as the key to a simple and secure solution. In February 2008 specialized working groups of the DRG held a consensus conference. They designed recommended data compression techniques and ratios. Materials and methods: The purpose of our paper is an international review of the literature of compression technologies, different imaging procedures (e.g. DR, CT etc.), and targets (abdomen, etc.) and to combine recommendations for compression ratios and techniques with different workflows. The studies were assigned to 4 different levels (0-3) according to the evidence. 51 studies were assigned to the highest level 3. Results: We recommend a compression factor of 1:8 (excluding cranial scans: 1:5). For workflow reasons data compression should be based on the modalities (CT, etc.). PACS-based compression is currently possible but fails to maximize workflow benefits. Only the modality-based scenarios achieve all benefits. (orig.)

  17. Biowep: a workflow enactment portal for bioinformatics applications.

    Science.gov (United States)

    Romano, Paolo; Bartocci, Ezio; Bertolini, Guglielmo; De Paoli, Flavio; Marra, Domenico; Mauri, Giancarlo; Merelli, Emanuela; Milanesi, Luciano

    2007-03-08

    The huge amount of biological information, its distribution over the Internet and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable for the majority of researchers who lack such skills. A portal enabling these researchers to benefit from the new technologies is still missing. We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. Biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. Main workflows' processing steps are annotated on the basis of their input and output, elaboration type and application domain by using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical databases and analysis software and the creation of

  18. Biowep: a workflow enactment portal for bioinformatics applications

    Directory of Open Access Journals (Sweden)

    Romano Paolo

    2007-03-01

    Full Text Available Abstract Background The huge amount of biological information, its distribution over the Internet and the heterogeneity of available software tools make the adoption of new data integration and analysis network tools a necessity in bioinformatics. ICT standards and tools, like Web Services and Workflow Management Systems (WMS), can support the creation and deployment of such systems. Many Web Services are already available and some WMS have been proposed. They assume that researchers know which bioinformatics resources can be reached through a programmatic interface and that they are skilled in programming and building workflows. Therefore, they are not viable for the majority of researchers who lack such skills. A portal enabling these researchers to benefit from the new technologies is still missing. Results We designed biowep, a web-based client application that allows for the selection and execution of a set of predefined workflows. The system is available on-line. Biowep architecture includes a Workflow Manager, a User Interface and a Workflow Executor. The task of the Workflow Manager is the creation and annotation of workflows. These can be created by using either the Taverna Workbench or BioWMS. Enactment of workflows is carried out by FreeFluo for Taverna workflows and by BioAgent/Hermes, a mobile agent-based middleware, for BioWMS ones. Main workflows' processing steps are annotated on the basis of their input and output, elaboration type and application domain by using a classification of bioinformatics data and tasks. The interface supports user authentication and profiling. Workflows can be selected on the basis of users' profiles and can be searched through their annotations. Results can be saved. Conclusion We developed a web system that supports the selection and execution of predefined workflows, thus simplifying access for all researchers. The implementation of Web Services allowing specialized software to interact with an exhaustive set of biomedical

  19. Towards seamless workflows in agile data science

    Science.gov (United States)

    Klump, J. F.; Robertson, J.

    2017-12-01

    Agile workflows are a response to projects with requirements that may change over time. They prioritise rapid and flexible responses to change, preferring to adapt to changes in requirements rather than predict them before a project starts. This suits the needs of research very well because research is inherently agile in its methodology. The adoption of agile methods has made collaborative data analysis much easier in a research environment fragmented across institutional data stores, HPC, personal and lab computers and, more recently, cloud environments. Agile workflows use tools that share a common worldview: in an agile environment, there may be more than one valid version of data, code or environment in play at any given time. All of these versions need references and identifiers. For example, a team of developers following the git-flow conventions (github.com/nvie/gitflow) may have several active branches, one for each strand of development. These workflows allow rapid and parallel iteration while maintaining identifiers pointing to individual snapshots of data and code and allowing rapid switching between strands. In contrast, the current focus of versioning in research data management is geared towards managing data for reproducibility and long-term preservation of the record of science. While both are important goals in the persistent curation domain of the institutional research data infrastructure, current tools emphasise planning over adaptation and can introduce unwanted rigidity by insisting on a single valid version or point of truth. In the collaborative curation domain of a research project, things are more fluid. However, there is no equivalent to the "versioning iso-surface" of the git protocol for the management and versioning of research data. At CSIRO we are developing concepts and tools for the agile management of software code and research data for virtual research environments, based on our experiences of actual data analytics projects in the

  20. Managing Library IT Workflow with Bugzilla

    Directory of Open Access Journals (Sweden)

    Nina McHale

    2010-09-01

    Full Text Available Prior to September 2008, all technology issues at the University of Colorado Denver's Auraria Library were reported to a dedicated departmental phone line. A variety of staff changes necessitated a more formal means of tracking, delegating, and resolving reported issues, and the department turned to Bugzilla, an open source bug tracking application designed by Mozilla.org developers. While designed with software development bug tracking in mind, Bugzilla can be easily customized and modified to serve as an IT ticketing system. Twenty-three months and over 2300 trouble tickets later, Auraria's IT department workflow is much smoother and more efficient. This article includes two Perl Template Toolkit code samples for customized Bugzilla screens for its use in a library environment; readers will be able to easily replicate the project in their own environments.

  1. Swabs to genomes: a comprehensive workflow

    Directory of Open Access Journals (Sweden)

    Madison I. Dunitz

    2015-05-01

    Full Text Available The sequencing, assembly, and basic analysis of microbial genomes, once a painstaking and expensive undertaking, have become much easier for research labs with access to standard molecular biology and computational tools. However, there is a confusing variety of options available for DNA library preparation and sequencing, and inexperience with bioinformatics can pose a significant barrier to entry for many who may be interested in microbial genomics. The objective of the present study was to design, test, troubleshoot, and publish a simple, comprehensive workflow from the collection of an environmental sample (a swab) to a published microbial genome, empowering even a lab or classroom with limited resources and bioinformatics experience to perform it.

  2. The P2P approach to interorganizational workflows

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Weske, M.H.; Dittrich, K.R.; Geppert, A.; Norrie, M.C.

    2001-01-01

    This paper describes in an informal way the Public-To-Private (P2P) approach to interorganizational workflows, which is based on a notion of inheritance. The approach consists of three steps: (1) create a common understanding of the interorganizational workflow by specifying a shared public

  3. Reasoning about repairability of workflows at design time

    NARCIS (Netherlands)

    Tagni, Gaston; Ten Teije, Annette; Van Harmelen, Frank

    2009-01-01

    This paper describes an approach for reasoning about the repairability of workflows at design time. We propose a heuristic-based analysis of a workflow that aims at evaluating its definition, considering different design aspects and characteristics that affect its repairability (called repairability

  4. Design decisions in workflow management and quality of work.

    NARCIS (Netherlands)

    Waal, B.M.E. de; Batenburg, R.

    2009-01-01

    In this paper, the design and implementation of a workflow management (WFM) system in a large Dutch social insurance organisation is described. The effect of workflow design decisions on the quality of work is explored theoretically and empirically, using the model of Zur Mühlen as a frame of

  5. Conceptual framework and architecture for service mediating workflow management

    NARCIS (Netherlands)

    Hu, Jinmin; Grefen, P.W.P.J.

    2003-01-01

    This paper proposes a three-layer workflow concept framework to realize workflow enactment flexibility by dynamically binding activities to their implementations at run time. A service mediating layer is added to bridge business process definition and its implementation. Based on this framework, we

  6. Building and documenting workflows with python-based snakemake

    NARCIS (Netherlands)

    J. Köster (Johannes); S. Rahmann (Sven)

    2012-01-01

    Snakemake is a novel workflow engine with a simple Python-derived workflow definition language and an optimizing execution environment. It is the first system that supports multiple named wildcards (or variables) in input and output filenames of each rule definition. It also allows to
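
    As a minimal, hypothetical illustration of the named-wildcard feature mentioned above, a Snakefile (in Snakemake's Python-derived syntax) can combine two wildcards in the input and output names of a single rule; the file layout and the map_reads command are placeholders, not part of the cited work.

```
# Hypothetical Snakefile: the rule "map" uses two named wildcards, {sample} and {lane}.
SAMPLES = ["A", "B"]
LANES = ["L001", "L002"]

rule all:
    input:
        expand("mapped/{sample}_{lane}.bam", sample=SAMPLES, lane=LANES)

rule map:
    input:
        reads="reads/{sample}_{lane}.fastq",
        ref="reference/genome.fa"
    output:
        "mapped/{sample}_{lane}.bam"
    shell:
        "map_reads {input.ref} {input.reads} > {output}"
```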

  7. Analyzing the Gap between Workflows and their Natural Language Descriptions

    NARCIS (Netherlands)

    Groth, P.T.; Gil, Y

    2009-01-01

    Scientists increasingly use workflows to represent and share their computational experiments. Because of their declarative nature, focus on pre-existing component composition and the availability of visual editors, workflows provide a valuable start for creating user-friendly environments for end

  8. Two-Layer Transaction Management for Workflow Management Applications

    NARCIS (Netherlands)

    Grefen, P.W.P.J.; Vonk, J.; Boertjes, E.M.; Apers, Peter M.G.

    Workflow management applications require advanced transaction management that is not offered by traditional database systems. For this reason, a number of extended transaction models has been proposed in the past. None of these models seems completely adequate, though, because workflow management

  9. Parametric Room Acoustic workflows with real-time acoustic simulation

    DEFF Research Database (Denmark)

    Parigi, Dario

    2017-01-01

    The paper investigates and assesses the opportunities that real-time acoustic simulation offers to engage in a parametric acoustics workflow and to influence architectural designs from early design stages.

  10. Open source workflow : a viable direction for BPM?

    NARCIS (Netherlands)

    Wohed, P.; Russell, N.C.; Hofstede, ter A.H.M.; Andersson, B.; Aalst, van der W.M.P.; Bellahsène, Z.; Léonard, M.

    2008-01-01

    With the growing interest in open source software in general and business process management and workflow systems in particular, it is worthwhile investigating the state of open source workflow management. The plethora of these offerings (recent surveys such as [4,6], each contain more than 30 such

  11. Distributed Global Transaction Support for Workflow Management Applications

    NARCIS (Netherlands)

    Vonk, J.; Grefen, P.W.P.J.; Boertjes, E.M.; Apers, Peter M.G.

    Workflow management systems require advanced transaction support to cope with their inherently long-running processes. The recent trend to distribute workflow executions requires an even more advanced transaction support system that is able to handle distribution. This paper presents a model as well

  12. A Collaborative Workflow for the Digitization of Unique Materials

    Science.gov (United States)

    Gueguen, Gretchen; Hanlon, Ann M.

    2009-01-01

    This paper examines the experience of one institution, the University of Maryland Libraries, as it made organizational efforts to harness existing workflows and to capture digitization done in the course of responding to patron requests. By examining the way this organization adjusted its existing workflows to put in place more systematic methods…

  13. "Intelligent" tools for workflow process redesign : a research agenda

    NARCIS (Netherlands)

    Netjes, M.; Vanderfeesten, I.T.P.; Reijers, H.A.; Bussler, C.; Haller, A.

    2006-01-01

    Although much attention is being paid to business processes during the past decades, the design of business processes and particularly workflow processes is still more art than science. In this workshop paper, we present our view on modeling methods for workflow processes and introduce our research

  14. Workflow automation based on OSI job transfer and manipulation

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Joosten, Stef M.M.; Guareis de farias, Cléver

    1999-01-01

    This paper shows that Workflow Management Systems (WFMS) and a data communication standard called Job Transfer and Manipulation (JTM) are built on the same concepts, even though different words are used. The paper analyses the correspondence of workflow concepts and JTM concepts. Besides, the

  15. From Paper Based Clinical Practice Guidelines to Declarative Workflow Management

    DEFF Research Database (Denmark)

    Lyng, Karen Marie; Hildebrandt, Thomas; Mukkamala, Raghava Rao

    2009-01-01

    a sub workflow can be described in a declarative workflow management system: the Resultmaker Online Consultant (ROC). The example demonstrates that declarative primitives allow to naturally extend the paper based flowchart to an executable model without introducing a complex cyclic control flow graph....

  16. Combined quantification of faecal sterols, stanols, stanones and bile acids in soils and terrestrial sediments by gas chromatography-mass spectrometry.

    Science.gov (United States)

    Birk, Jago Jonathan; Dippold, Michaela; Wiesenberg, Guido L B; Glaser, Bruno

    2012-06-15

    Faeces incorporation can alter the concentration patterns of stanols, stanones, Δ(5)-sterols and bile acids in soils and terrestrial sediments. A joint quantification of these substances would give robust and specific information about the faecal input. Therefore, a method was developed for their purification and determination via gas chromatography-mass spectrometry (GC-MS) based on a total lipid extract (TLE) of soils and terrestrial sediments. Stanols, stanones, Δ(5)-sterols and bile acids were extracted by a single Soxhlet extraction yielding a TLE. The TLE was saponified with KOH in methanol. Sequential liquid-liquid extraction was applied to recover the biomarkers from the saponified extract and to separate the bile acids from the neutral stanols, stanones and Δ(5)-sterols. The neutral fraction was directly purified using solid phase extraction (SPE) columns packed with 5% deactivated silica gel. The bile acids were methylated in dry HCl in methanol and purified on SPE columns packed with activated silica gel. A mixture of hexamethyldisilazane (HMDS), trimethylchlorosilane (TMCS) and pyridine was used to silylate the hydroxyl groups of the stanols and Δ(5)-sterols, avoiding silylation of the keto groups of the stanones in their enol form. Silylation of the bile acids was carried out with N,O-bis(trimethylsilyl)trifluoroacetamide (BSTFA) containing N-trimethylsilylimidazole (TSIM). TLEs from a set of soils with different physico-chemical properties were used for method evaluation and for comparison of amounts of faecal biomarkers analysed with saponification and without saponification of the TLE. For this purpose, a Regosol, a Podzol and a Ferralsol were sampled. To prove the applicability of the method for faecal biomarker analyses in archaeological soils and sediments, additional samples were taken from pre-Columbian Anthrosols in Amazonia and an Anthrosol from a site in central Europe settled since the Neolithic. The comparison of the amounts of steroids

  17. A Multi-Dimensional Classification Model for Scientific Workflow Characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Ramakrishnan, Lavanya; Plale, Beth

    2010-04-05

    Workflows have been used to model repeatable tasks or operations in manufacturing, business processes, and software. In recent years, workflows are increasingly used for orchestration of science discovery tasks that use distributed resources and web services environments through resource models such as grid and cloud computing. Workflows have disparate requirements and constraints that affect how they might be managed in distributed environments. In this paper, we present a multi-dimensional classification model illustrated by workflow examples obtained through a survey of scientists from different domains including bioinformatics and biomedicine, weather and ocean modeling, and astronomy, detailing their data and computational requirements. The survey results and classification model contribute to the high-level understanding of scientific workflows.

  18. A practical workflow for making anatomical atlases for biological research.

    Science.gov (United States)

    Wan, Yong; Lewis, A Kelsey; Colasanto, Mary; van Langeveld, Mark; Kardon, Gabrielle; Hansen, Charles

    2012-01-01

    The anatomical atlas has been at the intersection of science and art for centuries. These atlases are essential to biological research, but high-quality atlases are often scarce. Recent advances in imaging technology have made high-quality 3D atlases possible. However, until now there has been a lack of practical workflows using standard tools to generate atlases from images of biological samples. With certain adaptations, CG artists' workflow and tools, traditionally used in the film industry, are practical for building high-quality biological atlases. Researchers have developed a workflow for generating a 3D anatomical atlas using accessible artists' tools. They used this workflow to build a mouse limb atlas for studying the musculoskeletal system's development. This research aims to raise the awareness of using artists' tools in scientific research and promote interdisciplinary collaborations between artists and scientists. This video (http://youtu.be/g61C-nia9ms) demonstrates a workflow for creating an anatomical atlas.

  19. Federated Database Services for Wind Tunnel Experiment Workflows

    Directory of Open Access Journals (Sweden)

    A. Paventhan

    2006-01-01

    Full Text Available Enabling the full life cycle of scientific and engineering workflows requires robust middleware and services that support effective data management, near-realtime data movement and custom data processing. Many existing solutions exploit the database as a passive metadata catalog. In this paper, we present an approach that makes use of federation of databases to host data-centric wind tunnel application workflows. The user is able to compose customized application workflows based on database services. We provide a reference implementation that leverages typical business tools and technologies: Microsoft SQL Server for database services and Windows Workflow Foundation for workflow services. The application data and user's code are both hosted in federated databases. With the growing interest in XML Web Services in scientific Grids, and with databases beginning to support native XML types and XML Web services, we can expect the role of databases in scientific computation to grow in importance.

  20. A new approach in proteomics of wheat gluten: combining chymotrypsin cleavage and matrix-assisted laser desorption/ionization quadrupole ion trap reflectron tandem mass spectrometry

    Czech Academy of Sciences Publication Activity Database

    Šalplachta, Jiří; Marchetti, M.; Chmelík, Josef; Allmaier, G.

    2005-01-01

    Vol. 19, No. 18 (2005), pp. 2725-2728 ISSN 0951-4198 R&D Projects: GA MZe QD1023 Grant - others: Austrian Science Foundation (AT) P14181; Austrian Science Foundation (AT) P15008 Institutional research plan: CEZ:AV0Z40310501 Keywords: wheat gluten * mass spectrometry * chymotrypsin Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 3.087, year: 2005

  1. Radiology information system: a workflow-based approach

    International Nuclear Information System (INIS)

    Zhang, Jinyan; Lu, Xudong; Nie, Hongchao; Huang, Zhengxing; Aalst, W.M.P. van der

    2009-01-01

    Introducing workflow management technology in healthcare seems promising for dealing with the problem that current healthcare information systems cannot provide sufficient support for process management, although several challenges still exist. The purpose of this paper is to study the method of developing a workflow-based information system in a radiology department as a use case. First, a workflow model of a typical radiology process was established. Second, based on the model, the system could be designed and implemented as a group of loosely coupled components. Each component corresponded to one task in the process and could be assembled by the workflow management system. The legacy systems could be taken as special components, which also corresponded to the tasks and were integrated through transferring non-workflow-aware interfaces to the standard ones. Finally, a workflow dashboard was designed and implemented to provide an integral view of radiology processes. The workflow-based Radiology Information System was deployed in the radiology department of Zhejiang Chinese Medicine Hospital in China. The results showed that it could be adjusted flexibly in response to the needs of changing processes, and enhance the process management in the department. It can also provide a more workflow-aware integration method, compared with other methods such as IHE-based ones. The workflow-based approach is a new method of developing a radiology information system with more flexibility, more functionalities of process management and more workflow-aware integration. The work of this paper is an initial endeavor for introducing workflow management technology in healthcare. (orig.)

  2. Identification of Tyrosine Phosphorylated Proteins by SH2 Domain Affinity Purification and Mass Spectrometry.

    Science.gov (United States)

    Buhs, Sophia; Gerull, Helwe; Nollau, Peter

    2017-01-01

    Phosphotyrosine signaling plays a major role in the control of many important biological functions such as cell proliferation and apoptosis. Deciphering phosphotyrosine-dependent signaling is therefore of great interest, paving the way for the understanding of physiological and pathological processes of signal transduction. On the basis of the specific binding of SH2 domains to phosphotyrosine residues, we here present an experimental workflow for affinity purification and subsequent identification of tyrosine-phosphorylated proteins by mass spectrometry. In combination with SH2 profiling, a broadly applicable platform for the characterization of phosphotyrosine profiles in cell extracts, our pull-down strategy now enables researchers to identify proteins in signaling cascades which are differentially phosphorylated and selectively recognized by distinct SH2 domains.

  3. Scrambled eggs: A highly sensitive molecular diagnostic workflow for Fasciola species specific detection from faecal samples

    Science.gov (United States)

    Calvani, Nichola Eliza Davies; Windsor, Peter Andrew; Bush, Russell David

    2017-01-01

    Background Fasciolosis, due to Fasciola hepatica and Fasciola gigantica, is a re-emerging zoonotic parasitic disease of worldwide importance. Human and animal infections are commonly diagnosed by the traditional sedimentation and faecal egg-counting technique. However, this technique is time-consuming and prone to sensitivity errors when a large number of samples must be processed or if the operator lacks sufficient experience. Additionally, diagnosis can only be made once the 12-week pre-patent period has passed. Recently, a commercially available coprological antigen ELISA has enabled detection of F. hepatica prior to the completion of the pre-patent period, providing earlier diagnosis and increased throughput, although species differentiation is not possible in areas of parasite sympatry. Real-time PCR offers the combined benefits of highly sensitive species differentiation for medium to large sample sizes. However, no molecular diagnostic workflow currently exists for the identification of Fasciola spp. in faecal samples. Methodology/Principal findings A new molecular diagnostic workflow for the highly-sensitive detection and quantification of Fasciola spp. in faecal samples was developed. The technique involves sedimenting and pelleting the samples prior to DNA isolation in order to concentrate the eggs, followed by disruption by bead-beating in a benchtop homogeniser to ensure access to DNA. Although both the new molecular workflow and the traditional sedimentation technique were sensitive and specific, the new molecular workflow enabled faster sample throughput in medium to large epidemiological studies, and provided the additional benefit of speciation. Further, good correlation (R2 = 0.74–0.76) was observed between the real-time PCR values and the faecal egg count (FEC) using the new molecular workflow for all herds and sampling periods. Finally, no effect of storage in 70% ethanol was detected on sedimentation and DNA isolation outcomes; enabling

  4. Scrambled eggs: A highly sensitive molecular diagnostic workflow for Fasciola species specific detection from faecal samples.

    Directory of Open Access Journals (Sweden)

    Nichola Eliza Davies Calvani

    2017-09-01

    Full Text Available Fasciolosis, due to Fasciola hepatica and Fasciola gigantica, is a re-emerging zoonotic parasitic disease of worldwide importance. Human and animal infections are commonly diagnosed by the traditional sedimentation and faecal egg-counting technique. However, this technique is time-consuming and prone to sensitivity errors when a large number of samples must be processed or if the operator lacks sufficient experience. Additionally, diagnosis can only be made once the 12-week pre-patent period has passed. Recently, a commercially available coprological antigen ELISA has enabled detection of F. hepatica prior to the completion of the pre-patent period, providing earlier diagnosis and increased throughput, although species differentiation is not possible in areas of parasite sympatry. Real-time PCR offers the combined benefits of highly sensitive species differentiation for medium to large sample sizes. However, no molecular diagnostic workflow currently exists for the identification of Fasciola spp. in faecal samples. A new molecular diagnostic workflow for the highly-sensitive detection and quantification of Fasciola spp. in faecal samples was developed. The technique involves sedimenting and pelleting the samples prior to DNA isolation in order to concentrate the eggs, followed by disruption by bead-beating in a benchtop homogeniser to ensure access to DNA. Although both the new molecular workflow and the traditional sedimentation technique were sensitive and specific, the new molecular workflow enabled faster sample throughput in medium to large epidemiological studies, and provided the additional benefit of speciation. Further, good correlation (R2 = 0.74-0.76) was observed between the real-time PCR values and the faecal egg count (FEC) using the new molecular workflow for all herds and sampling periods. Finally, no effect of storage in 70% ethanol was detected on sedimentation and DNA isolation outcomes; enabling transport of samples from endemic

  5. A reliable computational workflow for the selection of optimal screening libraries.

    Science.gov (United States)

    Gilad, Yocheved; Nadassy, Katalin; Senderowitz, Hanoch

    2015-01-01

    The experimental screening of compound collections is a common starting point in many drug discovery projects. Successes of such screening campaigns critically depend on the quality of the screened library. Many libraries are currently available from different vendors, yet the selection of the optimal screening library for a specific project is challenging. We have devised a novel workflow for the rational selection of project-specific screening libraries. The workflow accepts as input a set of virtual candidate libraries and applies the following steps to each library: (1) data curation; (2) assessment of ADME/T profile; (3) assessment of the number of promiscuous binders/frequent HTS hitters; (4) assessment of internal diversity; (5) assessment of similarity to known active compound(s) (optional); (6) assessment of similarity to in-house or otherwise accessible compound collections (optional). For ADME/T profiling, Lipinski's and Veber's rule-based filters were implemented and a new blood-brain barrier permeation model was developed and validated (85 and 74% success rates for the training and test sets, respectively). Diversity and similarity descriptors that demonstrated the best performance in terms of their ability to select either diverse or focused sets of compounds from three databases (Drug Bank, CMC and CHEMBL) were identified and used for diversity and similarity assessments. The workflow was used to analyze nine common screening libraries available from six vendors. The results of this analysis are reported for each library providing an assessment of its quality. Furthermore, a consensus approach was developed to combine the results of these analyses into a single score for selecting the optimal library under different scenarios. We have devised and tested a new workflow for the rational selection of screening libraries under different scenarios. The current workflow was implemented using the Pipeline Pilot software yet due to the usage of generic
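
    The rule-based ADME/T step can be pictured with the commonly cited Lipinski (MW ≤ 500, logP ≤ 5, H-bond donors ≤ 5, acceptors ≤ 10) and Veber (rotatable bonds ≤ 10, TPSA ≤ 140 Å²) cutoffs. The sketch below assumes the open-source RDKit toolkit rather than Pipeline Pilot, and the SMILES strings are arbitrary examples, not compounds from the analyzed libraries.

```python
# Rule-based Lipinski/Veber filtering of a candidate library, assuming RDKit.
from rdkit import Chem
from rdkit.Chem import Descriptors, Lipinski, rdMolDescriptors

def passes_lipinski_veber(smiles):
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        return False
    lipinski_ok = (Descriptors.MolWt(mol) <= 500
                   and Descriptors.MolLogP(mol) <= 5
                   and Lipinski.NumHDonors(mol) <= 5
                   and Lipinski.NumHAcceptors(mol) <= 10)
    veber_ok = (rdMolDescriptors.CalcNumRotatableBonds(mol) <= 10
                and rdMolDescriptors.CalcTPSA(mol) <= 140)
    return lipinski_ok and veber_ok

library = {
    "aspirin": "CC(=O)Oc1ccccc1C(=O)O",
    "stearic acid": "CCCCCCCCCCCCCCCCCC(=O)O",
}
print({name: passes_lipinski_veber(smi) for name, smi in library.items()})
```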

  6. Scrambled eggs: A highly sensitive molecular diagnostic workflow for Fasciola species specific detection from faecal samples.

    Science.gov (United States)

    Calvani, Nichola Eliza Davies; Windsor, Peter Andrew; Bush, Russell David; Šlapeta, Jan

    2017-09-01

    Fasciolosis, due to Fasciola hepatica and Fasciola gigantica, is a re-emerging zoonotic parasitic disease of worldwide importance. Human and animal infections are commonly diagnosed by the traditional sedimentation and faecal egg-counting technique. However, this technique is time-consuming and prone to sensitivity errors when a large number of samples must be processed or if the operator lacks sufficient experience. Additionally, diagnosis can only be made once the 12-week pre-patent period has passed. Recently, a commercially available coprological antigen ELISA has enabled detection of F. hepatica prior to the completion of the pre-patent period, providing earlier diagnosis and increased throughput, although species differentiation is not possible in areas of parasite sympatry. Real-time PCR offers the combined benefits of highly sensitive species differentiation for medium to large sample sizes. However, no molecular diagnostic workflow currently exists for the identification of Fasciola spp. in faecal samples. A new molecular diagnostic workflow for the highly-sensitive detection and quantification of Fasciola spp. in faecal samples was developed. The technique involves sedimenting and pelleting the samples prior to DNA isolation in order to concentrate the eggs, followed by disruption by bead-beating in a benchtop homogeniser to ensure access to DNA. Although both the new molecular workflow and the traditional sedimentation technique were sensitive and specific, the new molecular workflow enabled faster sample throughput in medium to large epidemiological studies, and provided the additional benefit of speciation. Further, good correlation (R2 = 0.74-0.76) was observed between the real-time PCR values and the faecal egg count (FEC) using the new molecular workflow for all herds and sampling periods. Finally, no effect of storage in 70% ethanol was detected on sedimentation and DNA isolation outcomes; enabling transport of samples from endemic to non
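
    The reported agreement between real-time PCR values and faecal egg counts can be expressed as a simple coefficient of determination. The sketch below, using invented paired values for a single hypothetical herd, shows the calculation; it does not reproduce the study data.

```python
# Correlate real-time PCR quantities with faecal egg counts (FEC) for one herd.
import numpy as np

qpcr = np.array([0.5, 1.2, 2.8, 3.5, 5.1, 7.9])   # hypothetical qPCR-derived quantities
fec = np.array([1.0, 2.0, 3.0, 4.0, 6.0, 9.0])    # hypothetical eggs per gram

r = np.corrcoef(qpcr, fec)[0, 1]
print(f"Pearson r = {r:.2f}, R^2 = {r ** 2:.2f}")
```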

  7. Create, run, share, publish, and reference your LC-MS, FIA-MS, GC-MS, and NMR data analysis workflows with the Workflow4Metabolomics 3.0 Galaxy online infrastructure for metabolomics.

    Science.gov (United States)

    Guitton, Yann; Tremblay-Franco, Marie; Le Corguillé, Gildas; Martin, Jean-François; Pétéra, Mélanie; Roger-Mele, Pierrick; Delabrière, Alexis; Goulitquer, Sophie; Monsoor, Misharl; Duperier, Christophe; Canlet, Cécile; Servien, Rémi; Tardivel, Patrick; Caron, Christophe; Giacomoni, Franck; Thévenot, Etienne A

    2017-12-01

    Metabolomics is a key approach in modern functional genomics and systems biology. Due to the complexity of metabolomics data, the variety of experimental designs, and the multiplicity of bioinformatics tools, providing experimenters with a simple and efficient resource to conduct comprehensive and rigorous analysis of their data is of utmost importance. In 2014, we launched the Workflow4Metabolomics (W4M; http://workflow4metabolomics.org) online infrastructure for metabolomics built on the Galaxy environment, which offers user-friendly features to build and run data analysis workflows including preprocessing, statistical analysis, and annotation steps. Here we present the new W4M 3.0 release, which contains twice as many tools as the first version, and provides two features which are, to our knowledge, unique among online resources. First, data from the four major metabolomics technologies (i.e., LC-MS, FIA-MS, GC-MS, and NMR) can be analyzed on a single platform. By using three studies in human physiology, alga evolution, and animal toxicology, we demonstrate how the 40 available tools can be easily combined to address biological issues. Second, the full analysis (including the workflow, the parameter values, the input data and output results) can be referenced with a permanent digital object identifier (DOI). Publication of data analyses is of major importance for robust and reproducible science. Furthermore, the publicly shared workflows are of high-value for e-learning and training. The Workflow4Metabolomics 3.0 e-infrastructure thus not only offers a unique online environment for analysis of data from the main metabolomics technologies, but it is also the first reference repository for metabolomics workflows. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. High throughput and accurate serum proteome profiling by integrated sample preparation technology and single-run data independent mass spectrometry analysis.

    Science.gov (United States)

    Lin, Lin; Zheng, Jiaxin; Yu, Quan; Chen, Wendong; Xing, Jinchun; Chen, Chenxi; Tian, Ruijun

    2018-03-01

    Mass spectrometry (MS)-based serum proteome analysis is extremely challenging due to its high complexity and dynamic range of protein abundances. Developing a high-throughput and accurate serum proteomic profiling approach capable of analyzing large cohorts is urgently needed for biomarker discovery. Herein, we report a streamlined workflow for fast and accurate proteomic profiling from 1 μL of blood serum. The workflow combined an integrated technique for highly sensitive and reproducible sample preparation and a new data-independent acquisition (DIA)-based MS method. Compared with the standard data-dependent acquisition (DDA) approach, the optimized DIA method doubled the number of detected peptides and proteins with better reproducibility. Without protein immunodepletion and prefractionation, the single-run DIA analysis enables quantitative profiling of over 300 proteins with 50 min gradient time. The quantified proteins span more than five orders of magnitude of abundance range and contain over 50 FDA-approved disease markers. The workflow allowed us to analyze 20 serum samples per day, with about 358 protein groups per sample being identified. A proof-of-concept study on renal cell carcinoma (RCC) serum samples confirmed the feasibility of the workflow for large scale serum proteomic profiling and disease-related biomarker discovery. Blood serum or plasma is the predominant specimen for clinical proteomic studies while the analysis is extremely challenging for its high complexity. Many efforts have been made in the past for serum proteomics for maximizing protein identifications, whereas few have been concerned with throughput and reproducibility. Here, we establish a rapid, robust and highly reproducible DIA-based workflow for streamlined serum proteomic profiling from 1 μL serum. The workflow does not require protein depletion and pre-fractionation, while still being able to detect disease-relevant proteins accurately. The workflow is promising in clinical application
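
    Reproducibility of such single-run DIA profiling is typically summarised as the per-protein coefficient of variation (CV) across replicate injections. A minimal sketch of that calculation on a hypothetical quantification matrix (not data from the study):

```python
# Per-protein coefficient of variation across replicate DIA runs (hypothetical data).
import numpy as np

# rows = proteins, columns = replicate injections (arbitrary intensity units)
quant = np.array([
    [1.00e6, 1.05e6, 0.97e6],
    [3.20e4, 2.90e4, 3.50e4],
    [8.10e5, 8.30e5, 7.90e5],
])

cv_percent = quant.std(axis=1, ddof=1) / quant.mean(axis=1) * 100.0
for i, cv in enumerate(cv_percent, start=1):
    print(f"protein {i}: CV = {cv:.1f}%")
print(f"median CV = {np.median(cv_percent):.1f}%")
```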

  9. Distributed execution of aggregated multi domain workflows using an agent framework

    NARCIS (Netherlands)

    Zhao, Z.; Belloum, A.; de Laat, C.; Adriaans, P.; Hertzberger, B.; Zhang, L.J.; Watson, T.J.; Yang, J.; Hung, P.C.K.

    2007-01-01

    In e-Science, meaningful experiment processes and workflow engines emerge as important scientific resources. A complex experiment often involves services and processes developed in different scientific domains. Aggregating different workflows into one meta workflow avoids unnecessary rewriting of

  10. Phenotypic H-Antigen Typing by Mass Spectrometry Combined with Genetic Typing of H Antigens, O Antigens, and Toxins by Whole-Genome Sequencing Enhances Identification of Escherichia coli Isolates.

    Science.gov (United States)

    Cheng, Keding; Chui, Huixia; Domish, Larissa; Sloan, Angela; Hernandez, Drexler; McCorrister, Stuart; Robinson, Alyssia; Walker, Matthew; Peterson, Lorea A M; Majcher, Miles; Ratnam, Sam; Haldane, David J M; Bekal, Sadjia; Wylie, John; Chui, Linda; Tyler, Shaun; Xu, Bianli; Reimer, Aleisha; Nadon, Celine; Knox, J David; Wang, Gehua

    2016-08-01

    Mass spectrometry-based phenotypic H-antigen typing (MS-H) combined with whole-genome-sequencing-based genetic identification of H antigens, O antigens, and toxins (WGS-HOT) was used to type 60 clinical Escherichia coli isolates, 43 of which were previously identified as nonmotile, H type undetermined, or O rough by serotyping or having shown discordant MS-H and serotyping results. Whole-genome sequencing confirmed that MS-H was able to provide more accurate data regarding H antigen expression than serotyping. Further, enhanced and more confident O antigen identification resulted from gene-cluster-based typing in combination with conventional typing based on the gene pair comprising wzx and wzy and that comprising wzm and wzt. The O antigen was identified in 94.6% of the isolates when the two genetic O typing approaches (gene pair and gene cluster) were used in conjunction, in comparison to 78.6% when the gene pair database was used alone. In addition, 98.2% of the isolates showed the existence of genes for various toxins and/or virulence factors, among which verotoxins (Shiga toxin 1 and/or Shiga toxin 2) were 100% concordant with conventional PCR-based testing results. With more applications of mass spectrometry and whole-genome sequencing in clinical microbiology laboratories, this combined phenotypic and genetic typing platform (MS-H plus WGS-HOT) should be ideal for pathogenic E. coli typing. Copyright © 2016 Cheng et al.

  11. Enabling Efficient Climate Science Workflows in High Performance Computing Environments

    Science.gov (United States)

    Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.

    2015-12-01

    A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks presents a myriad of challenges when running in a high-performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require a lot of forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well-tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project we highlight a stack of tools our team utilizes and has developed to ensure that large-scale simulation and analysis work are commonplace and provide operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth Systems Grid Federation (ESGF), all while executing everything in between in a scalable environment in a task-parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases they have been applied to.
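
    One way to picture the task-parallel (MPI) pattern mentioned above is a simple round-robin distribution of per-file analysis jobs across ranks. The sketch below assumes the mpi4py package; the file names and the analyse() body are placeholders, not CASCADE code.

```python
# Task-parallel analysis of many data files across MPI ranks, assuming mpi4py.
from mpi4py import MPI

def analyse(path):
    # placeholder for an expensive per-file analysis step
    return f"processed {path}"

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

files = [f"simulation_output_{i:03d}.nc" for i in range(100)]   # hypothetical inputs
my_files = files[rank::size]                                    # round-robin assignment

local_results = [analyse(path) for path in my_files]
all_results = comm.gather(local_results, root=0)

if rank == 0:
    flat = [item for chunk in all_results for item in chunk]
    print(f"{len(flat)} files processed across {size} ranks")
```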

  12. Identification of Three Kinds of Citri Reticulatae Pericarpium Based on Deoxyribonucleic Acid Barcoding and High-performance Liquid Chromatography-diode Array Detection-electrospray Ionization/Mass Spectrometry/Mass Spectrometry Combined with Chemometric Analysis

    Science.gov (United States)

    Yu, Xiaoxue; Zhang, Yafeng; Wang, Dongmei; Jiang, Lin; Xu, Xinjun

    2018-01-01

    Background: Citri Reticulatae Pericarpium is the dried mature pericarp of Citrus reticulata Blanco, which can be divided into “Chenpi” and “Guangchenpi.” “Guangchenpi” is the genuine Chinese medicinal material in Xinhui, Guangdong province; owing to its superior quality and scarcity, it is the most expensive of the three. Hesperidin is used as the marker to identify Citri Reticulatae Pericarpium described in the Chinese Pharmacopoeia 2010. However, both “Chenpi” and “Guangchenpi” contain hesperidin so that it is impossible to differentiate them by measuring hesperidin. Objective: Our study aims to develop an efficient and accurate method to separate and identify “Guangchenpi” from other Citri Reticulatae Pericarpium. Materials and Methods: The genomic deoxyribonucleic acid (DNA) of all the materials was extracted and then the internal transcribed spacer 2 was amplified, sequenced, aligned, and analyzed. The secondary structures were created using the database and website established by Jörg Schultz et al. High-performance liquid chromatography-diode array detection-electrospray ionization/mass spectrometry (HPLC-DAD-ESI-MS/MS) coupled with chemometric analysis was applied to compare the differences in chemical profiles of the three kinds of Citri Reticulatae Pericarpium. Results: A total of 22 samples were classified into three groups. The results of DNA barcoding were in accordance with principal component analysis and hierarchical cluster analysis. Eight compounds were deduced from HPLC-DAD-ESI-MS/MS. Conclusions: This method is a reliable and effective tool to differentiate the three Citri Reticulatae Pericarpium. SUMMARY The internal transcribed spacer 2 regions and the secondary structure among three kinds of Citri Reticulatae Pericarpium varied considerably. All the 22 samples were analyzed by high-performance liquid chromatography (HPLC) to obtain the chemical profiles. Principal component analysis and hierarchical cluster analysis
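
    The chemometric step (principal component analysis followed by hierarchical cluster analysis) can be sketched on a samples-by-peak-areas matrix. The example below uses randomly generated data with three artificial groups and assumes scikit-learn and SciPy; it does not use the study's chromatographic profiles.

```python
# PCA scores and hierarchical clustering (Ward linkage) on hypothetical peak areas.
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# 9 samples (3 per group) x 8 chromatographic peak areas, with group offsets
peak_areas = rng.normal(size=(9, 8)) + np.repeat([[0.0], [2.0], [4.0]], 3, axis=0)

scores = PCA(n_components=2).fit_transform(peak_areas)
clusters = fcluster(linkage(peak_areas, method="ward"), t=3, criterion="maxclust")

for i, (pc, cluster_id) in enumerate(zip(scores, clusters), start=1):
    print(f"sample {i}: PC1 = {pc[0]:+.2f}, PC2 = {pc[1]:+.2f}, cluster {cluster_id}")
```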

  13. Exploring Dental Providers' Workflow in an Electronic Dental Record Environment.

    Science.gov (United States)

    Schwei, Kelsey M; Cooper, Ryan; Mahnke, Andrea N; Ye, Zhan; Acharya, Amit

    2016-01-01

    A workflow is defined as a predefined set of work steps and partial ordering of these steps in any environment to achieve the expected outcome. Few studies have investigated the workflow of providers in a dental office. It is important to understand the interaction of dental providers with the existing technologies at point of care to assess breakdown in the workflow which could contribute to better technology designs. The study objective was to assess electronic dental record (EDR) workflows using time and motion methodology in order to identify breakdowns and opportunities for process improvement. A time and motion methodology was used to study the human-computer interaction and workflow of dental providers with an EDR in four dental centers at a large healthcare organization. A data collection tool was developed to capture the workflow of dental providers and staff while they interacted with an EDR during initial, planned, and emergency patient visits, and at the front desk. Qualitative and quantitative analysis was conducted on the observational data. Breakdowns in workflow were identified while posting charges, viewing radiographs, e-prescribing, and interacting with patient scheduler. EDR interaction time was significantly different between dentists and dental assistants (6:20 min vs. 10:57 min, p = 0.013) and between dentists and dental hygienists (6:20 min vs. 9:36 min, p = 0.003). On average, a dentist spent far less time than dental assistants and dental hygienists in data recording within the EDR.

  14. Workflow Lexicons in Healthcare: Validation of the SWIM Lexicon.

    Science.gov (United States)

    Meenan, Chris; Erickson, Bradley; Knight, Nancy; Fossett, Jewel; Olsen, Elizabeth; Mohod, Prerna; Chen, Joseph; Langer, Steve G

    2017-06-01

    For clinical departments seeking to successfully navigate the challenges of modern health reform, obtaining access to operational and clinical data to establish and sustain goals for improving quality is essential. More broadly, health delivery organizations are also seeking to understand performance across multiple facilities and often across multiple electronic medical record (EMR) systems. Interpreting operational data across multiple vendor systems can be challenging, as various manufacturers may describe different departmental workflow steps in different ways and sometimes even within a single vendor's installed customer base. In 2012, The Society for Imaging Informatics in Medicine (SIIM) recognized the need for better quality and performance data standards and formed SIIM's Workflow Initiative for Medicine (SWIM), an initiative designed to consistently describe workflow steps in radiology departments as well as defining operational quality metrics. The SWIM lexicon was published as a working model to describe operational workflow steps and quality measures. We measured the prevalence of the SWIM lexicon workflow steps in both academic and community radiology environments using real-world patient observations and correlated that information with automatically captured workflow steps from our clinical information systems. Our goal was to measure frequency of occurrence of workflow steps identified by the SWIM lexicon in a real-world clinical setting, as well as to correlate how accurately departmental information systems captured patient flow through our health facility.

  15. Examining daily activity routines of older adults using workflow.

    Science.gov (United States)

    Chung, Jane; Ozkaynak, Mustafa; Demiris, George

    2017-07-01

    We evaluated the value of workflow analysis supported by a novel visualization technique to better understand the daily routines of older adults and highlight their patterns of daily activities and normal variability in physical functions. We used a self-reported activity diary to obtain data from six community-dwelling older adults for 14 consecutive days. The workflow of each daily routine was analyzed using the EventFlow tool, which aggregates workflow information to highlight patterns and variabilities. A total of 1453 events were included in the data analysis. To demonstrate the patterns and variability of each individual's daily activities, participant activity workflows were visualized and compared. The workflow analysis revealed great variability in activity types, regularity, frequency, duration, and timing of performing certain activities across individuals. Also, when the workflow approach was applied to the spatial information of activities, it yielded meaningful data on individuals' mobility across different levels of life space, from home to community. Results suggest that using workflows to characterize the daily activities of older adults will be helpful for clinicians and researchers in understanding their daily routines and preparing education and prevention strategies tailored to each individual's activity level. This tool also has the potential to be integrated into consumer informatics technologies, such as patient portals or personal health records, so that consumers may be encouraged to become actively involved in monitoring and managing their health. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    Science.gov (United States)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models that have emerged in Earth science. Existing data-oriented workflow tools, however, are not well suited to supporting collaborative design of such workflows: among other gaps, they do not support real-time co-design, do not track how a workflow evolves over time as multiple Earth scientists contribute changing designs, and do not capture and retrieve collaboration knowledge about workflow design (the discussions that lead to a design). To address these challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool that supports an elastic number of groups of Earth scientists collaboratively designing and composing data analytics workflows through the Internet. Instead of reinventing the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.

  17. The application of an emerging technique for protein-protein interaction interface mapping: the combination of photo-initiated cross-linking protein nanoprobes with mass spectrometry

    Czech Academy of Sciences Publication Activity Database

    Ptáčková, Renata; Ječmen, Tomáš; Novák, Petr; Šulc, Miroslav; Hudeček, J.; Stiborová, M.

    2014-01-01

    Roč. 15, č. 6 (2014), s. 9224-9241 E-ISSN 1422-0067 R&D Projects: GA ČR(CZ) GAP207/12/0627 Grant - others:Universita Karlova(CZ) 903413; Magistrát hlavního města Prahy(CZ) CZ.2.16/3.1.00/24023; UNCE(BE) 204025/2012 Institutional support: RVO:61388971 Keywords : nanoprobes * mass spectrometry * protein-protein interactions Subject RIV: CE - Biochemistry Impact factor: 2.862, year: 2014

  18. Deploying and sharing U-Compare workflows as web services.

    Science.gov (United States)

    Kontonatsios, Georgios; Korkontzelos, Ioannis; Kolluru, Balakrishna; Thompson, Paul; Ananiadou, Sophia

    2013-02-18

    U-Compare is a text mining platform that allows the construction, evaluation and comparison of text mining workflows. U-Compare contains a large library of components that are tuned to the biomedical domain. Users can rapidly develop biomedical text mining workflows by mixing and matching U-Compare's components. Workflows developed using U-Compare can be exported and sent to other users who, in turn, can import and re-use them. However, the resulting workflows are standalone applications, i.e., software tools that run and are accessible only via a local machine, and that can only be run with the U-Compare platform. We address the above issues by extending U-Compare to convert standalone workflows into web services automatically, via a two-click process. The resulting web services can be registered on a central server and made publicly available. Alternatively, users can make web services available on their own servers, after installing the web application framework, which is part of the extension to U-Compare. We have performed a user-oriented evaluation of the proposed extension, by asking users who have tested the enhanced functionality of U-Compare to complete questionnaires that assess its functionality, reliability, usability, efficiency and maintainability. The results obtained reveal that the new functionality is well received by users. The web services produced by U-Compare are built on top of open standards, i.e., REST and SOAP protocols, and therefore, they are decoupled from the underlying platform. Exported workflows can be integrated with any application that supports these open standards. We demonstrate how the newly extended U-Compare enhances the cross-platform interoperability of workflows, by seamlessly importing a number of text mining workflow web services exported from U-Compare into Taverna, i.e., a generic scientific workflow construction platform.

  19. JMS: An Open Source Workflow Management System and Web-Based Cluster Front-End for High Performance Computing.

    Directory of Open Access Journals (Sweden)

    David K Brown

    Full Text Available Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS.
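
    JMS itself mediates between a web interface and the cluster's resource manager; the fragment below is only a generic sketch of that submit-and-poll pattern, expressed with Slurm's sbatch/squeue command-line tools via subprocess. It is not JMS code, and the script path and job name are hypothetical.

    ```python
    # Generic sketch of the web-to-cluster pattern a system like JMS automates:
    # submit a batch job to the resource manager and poll until it finishes.
    # Assumes a Slurm installation (sbatch/squeue); paths and names are hypothetical.
    import subprocess
    import time

    def submit_job(script_path: str) -> str:
        """Submit a batch script and return the scheduler job id."""
        out = subprocess.run(["sbatch", "--parsable", script_path],
                             check=True, capture_output=True, text=True)
        return out.stdout.strip().split(";")[0]

    def wait_for_job(job_id: str, poll_seconds: int = 30) -> None:
        """Block until the job no longer appears in the queue."""
        while True:
            out = subprocess.run(["squeue", "-h", "-j", job_id],
                                 capture_output=True, text=True)
            if not out.stdout.strip():
                return  # job has left the queue (finished or cancelled)
            time.sleep(poll_seconds)

    job_id = submit_job("pipeline_stage1.sh")
    wait_for_job(job_id)
    print(f"job {job_id} completed; results can now be returned to the web interface")
    ```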

  20. JMS: An Open Source Workflow Management System and Web-Based Cluster Front-End for High Performance Computing.

    Science.gov (United States)

    Brown, David K; Penkler, David L; Musyoka, Thommas M; Bishop, Özlem Tastan

    2015-01-01

    Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS.

  1. JMS: An Open Source Workflow Management System and Web-Based Cluster Front-End for High Performance Computing

    Science.gov (United States)

    Brown, David K.; Penkler, David L.; Musyoka, Thommas M.; Bishop, Özlem Tastan

    2015-01-01

    Complex computational pipelines are becoming a staple of modern scientific research. Often these pipelines are resource intensive and require days of computing time. In such cases, it makes sense to run them over high performance computing (HPC) clusters where they can take advantage of the aggregated resources of many powerful computers. In addition to this, researchers often want to integrate their workflows into their own web servers. In these cases, software is needed to manage the submission of jobs from the web interface to the cluster and then return the results once the job has finished executing. We have developed the Job Management System (JMS), a workflow management system and web interface for high performance computing (HPC). JMS provides users with a user-friendly web interface for creating complex workflows with multiple stages. It integrates this workflow functionality with the resource manager, a tool that is used to control and manage batch jobs on HPC clusters. As such, JMS combines workflow management functionality with cluster administration functionality. In addition, JMS provides developer tools including a code editor and the ability to version tools and scripts. JMS can be used by researchers from any field to build and run complex computational pipelines and provides functionality to include these pipelines in external interfaces. JMS is currently being used to house a number of bioinformatics pipelines at the Research Unit in Bioinformatics (RUBi) at Rhodes University. JMS is an open-source project and is freely available at https://github.com/RUBi-ZA/JMS. PMID:26280450

  2. Distributed Workflow Service Composition Based on CTR Technology

    Science.gov (United States)

    Feng, Zhilin; Ye, Yanming

    Recently, WS-BPEL has gradually become the basis of a standard for web service description and composition. However, WS-BPEL cannot efficiently describe distributed workflow services because it lacks the necessary expressive power and formal semantics. This paper presents a novel method for modeling distributed workflow service composition with Concurrent TRansaction logic (CTR). The syntactic structures of WS-BPEL and CTR are analyzed, and new rules for mapping WS-BPEL into CTR are given. A case study shows that the proposed method is appropriate for modeling workflow business services in distributed environments.

  3. CMS Alignment and Calibration workflows: lesson learned and future plans

    CERN Document Server

    AUTHOR|(CDS)2069172

    2014-01-01

    We review the online and offline workflows designed to align and calibrate the CMS detector. Starting from the experience gained during the first LHC run, we discuss the expected developments for Run II. In particular, we describe the envisioned stages, from alignment using cosmic-ray data, to detector alignment and calibration using the first proton-proton collision data ( O(100 pb-1) ), to a larger dataset ( O(1 fb-1) ) needed to reach the target precision. The automation of the workflow and its integration into the online and offline activity (dedicated triggers and datasets, data skims, workflows to compute the calibration and alignment constants) are discussed.

  4. What is needed for effective open access workflows?

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Institutions and funders are pushing forward open access with a steady stream of new guidelines and policies. Since institutional repositories are important maintainers of green open access, they should support easy and fast workflows for researchers and libraries to release publications. Based on the requirements specified by researchers, libraries and publishers, possible supporting software extensions are discussed. What does a typical workflow look like? What has to be considered by the researchers and by the editors in the library before releasing a green open access publication? Where and how can software support and improve existing workflows?

  5. Enhanced reproducibility of SADI web service workflows with Galaxy and Docker.

    Science.gov (United States)

    Aranguren, Mikel Egaña; Wilkinson, Mark D

    2015-01-01

    Semantic Web technologies have been widely applied in the life sciences, for example by data providers such as OpenLifeData and through web services frameworks such as SADI. The recently reported OpenLifeData2SADI project offers access to the vast OpenLifeData data store through SADI services. This article describes how to merge data retrieved from OpenLifeData2SADI with other SADI services using the Galaxy bioinformatics analysis platform, thus making this semantic data more amenable to complex analyses. This is demonstrated using a working example, which is made distributable and reproducible through a Docker image that includes SADI tools, along with the data and workflows that constitute the demonstration. The combination of Galaxy and Docker offers a solution for faithfully reproducing and sharing complex data retrieval and analysis workflows based on the SADI Semantic web service design patterns.

  6. EDMS based workflow for Printing Industry

    Directory of Open Access Journals (Sweden)

    Prathap Nayak

    2013-04-01

    Full Text Available Information is an indispensable factor of any enterprise. It can be a record or a document generated for every transaction made, kept either on paper or in electronic format for future reference. The printing industry is one in which managing information of various formats, with the latest workflows and technologies, can be a nightmare and a challenge for any operator or user, since every process, from the smallest piece of information to the printed product, depends on the others. Hence the information has to be harmonized artfully in order to avoid production downtime or employees pointing fingers at each other. This paper analyses how the implementation of an Electronic Document Management System (EDMS) could give the printing industry immediate access to stored documents within and across departments, irrespective of geographical boundaries. The paper opens with a brief history, a survey of contemporary EDMS systems, and some illustrated examples from a study that chose the library as a pilot area for evaluating EDMS. The paper ends with a proposal that maps several document-management-based activities for the implementation of EDMS in a printing company.

  7. Workflow management in large distributed systems

    International Nuclear Information System (INIS)

    Legrand, I; Newman, H; Voicu, R; Dobre, C; Grigoras, C

    2011-01-01

    The MonALISA (Monitoring Agents using a Large Integrated Services Architecture) framework provides a distributed service system capable of controlling and optimizing large-scale, data-intensive applications. An essential part of managing large-scale, distributed data-processing facilities is a monitoring system for computing facilities, storage, networks, and the very large number of applications running on these systems in near realtime. All this monitoring information gathered for all the subsystems is essential for developing the required higher-level services—the components that provide decision support and some degree of automated decisions—and for maintaining and optimizing workflow in large-scale distributed systems. These management and global optimization functions are performed by higher-level agent-based services. We present several applications of MonALISA's higher-level services including optimized dynamic routing, control, data-transfer scheduling, distributed job scheduling, dynamic allocation of storage resource to running jobs and automated management of remote services among a large set of grid facilities.

  8. Resilient workflows for computational mechanics platforms

    International Nuclear Information System (INIS)

    Nguyen, Toan; Trifan, Laurentiu; Desideri, Jean-Antoine

    2010-01-01

    Workflow management systems have recently been the focus of much interest and of extensive research and deployment for scientific applications worldwide. Their ability to abstract applications by wrapping application codes has also underscored the usefulness of such systems for multidiscipline applications. When complex applications need to provide seamless interfaces that hide the technicalities of the computing infrastructures, their high-level modeling, monitoring and execution functionalities help give production teams seamless and effective facilities. Software integration infrastructures based on programming environments such as Python, Matlab and Scilab have also provided evidence of the usefulness of such approaches for the tight coupling of multidiscipline application codes. In addition, high-performance computing based on multi-core, multi-cluster infrastructures opens new opportunities for more accurate, more extensive and more robust multi-discipline simulations in the decades to come. This supports the goal of full flight-dynamics simulation for 3D aircraft models within the next decade, opening the way to virtual flight tests and certification of aircraft in the future.

  9. Resilient workflows for computational mechanics platforms

    Science.gov (United States)

    Nguyên, Toàn; Trifan, Laurentiu; Désidéri, Jean-Antoine

    2010-06-01

    Workflow management systems have recently been the focus of much interest and of extensive research and deployment for scientific applications worldwide [26, 27]. Their ability to abstract applications by wrapping application codes has also underscored the usefulness of such systems for multidiscipline applications [23, 24]. When complex applications need to provide seamless interfaces that hide the technicalities of the computing infrastructures, their high-level modeling, monitoring and execution functionalities help give production teams seamless and effective facilities [25, 31, 33]. Software integration infrastructures based on programming environments such as Python, Matlab and Scilab have also provided evidence of the usefulness of such approaches for the tight coupling of multidiscipline application codes [22, 24]. In addition, high-performance computing based on multi-core, multi-cluster infrastructures opens new opportunities for more accurate, more extensive and more robust multi-discipline simulations in the decades to come [28]. This supports the goal of full flight-dynamics simulation for 3D aircraft models within the next decade, opening the way to virtual flight tests and certification of aircraft in the future [23, 24, 29].

  10. Workflow management in large distributed systems

    Science.gov (United States)

    Legrand, I.; Newman, H.; Voicu, R.; Dobre, C.; Grigoras, C.

    2011-12-01

    The MonALISA (Monitoring Agents using a Large Integrated Services Architecture) framework provides a distributed service system capable of controlling and optimizing large-scale, data-intensive applications. An essential part of managing large-scale, distributed data-processing facilities is a monitoring system for computing facilities, storage, networks, and the very large number of applications running on these systems in near realtime. All this monitoring information gathered for all the subsystems is essential for developing the required higher-level services—the components that provide decision support and some degree of automated decisions—and for maintaining and optimizing workflow in large-scale distributed systems. These management and global optimization functions are performed by higher-level agent-based services. We present several applications of MonALISA's higher-level services including optimized dynamic routing, control, data-transfer scheduling, distributed job scheduling, dynamic allocation of storage resource to running jobs and automated management of remote services among a large set of grid facilities.

  11. Application of gamma spectrometry technique in combination with weighing for material balance taking in the production of highly enriched U-Al fuel

    International Nuclear Information System (INIS)

    Serin, P.A.

    1975-07-01

    The purpose of this project is to obtain material balance data for a batch of highly enriched U-Al alloys (used in the NRX and NRU reactors) during fuel production, using gamma spectrometry (mainly the 186 keV photopeak) and weighing, and to determine operational data for the Agency's single-channel stabilized spectrometer (SAM-1) when measuring products typical of highly enriched U-Al fuel production (U-Al billets, fuel elements, scrap). The data collected indicate that gamma spectrometry using the single-channel stabilized spectrometer is a valid non-destructive method for quantitatively determining the U-235 content of U-Al alloy in the form of cast billets or extruded fuel elements, provided that adequate standards are available. An accuracy of better than ±1% relative can be obtained using a simple jig to provide reproducible counting geometry. Count rates should be kept well below the saturation level of the detector and counter, preferably by placing a lead collimator in front of the detector. This non-destructive method is not easily applicable to scrap because of the inability to maintain constant geometry and to prepare standards closely similar in size and shape to the samples.
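
    The quantification described is a comparative measurement: the net 186 keV count rate of an item is compared with that of a standard of known U-235 content measured in the same geometry. The sketch below shows that ratio calculation under simple assumptions (identical geometry and attenuation, background-subtracted count rates); the numbers are illustrative, not data from the report.

    ```python
    # Relative (comparative) quantification of U-235 from the 186 keV photopeak.
    # Assumes sample and standard are counted in identical geometry with similar
    # attenuation, so masses scale with net count rates. Numbers are illustrative.

    def net_rate(gross_counts: float, background_counts: float, live_time_s: float) -> float:
        """Background-corrected count rate in the 186 keV photopeak (counts/s)."""
        return (gross_counts - background_counts) / live_time_s

    standard_u235_g = 5.00                        # known U-235 content of the standard
    r_standard = net_rate(152_300, 2_100, 600.0)  # standard measurement
    r_sample = net_rate(148_900, 2_050, 600.0)    # unknown billet measurement

    sample_u235_g = standard_u235_g * r_sample / r_standard
    print(f"estimated U-235 content: {sample_u235_g:.2f} g")
    ```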

  12. Screening antiallergic components from Carthamus tinctorius using rat basophilic leukemia 2H3 cell membrane chromatography combined with high-performance liquid chromatography and tandem mass spectrometry.

    Science.gov (United States)

    Han, Shengli; Huang, Jing; Cui, Ronghua; Zhang, Tao

    2015-02-01

    Carthamus tinctorius, used in traditional Chinese medicine, has many pharmacological effects, such as anticoagulant effects, antioxidant effects, antiaging effects, regulation of gene expression, and antitumor effects. However, there is no report on the antiallergic effects of the components in C. tinctorius. In the present study, we investigated the antiallergic components of C. tinctorius and their mechanism of action. A rat basophilic leukemia 2H3 cell membrane chromatography method coupled online with high-performance liquid chromatography and tandem mass spectrometry was developed to screen antiallergic components from C. tinctorius. The screening results showed that hydroxysafflor yellow A, from C. tinctorius, was the target component retained on the rat basophilic leukemia 2H3 cell membrane chromatography column. We measured the amount of β-hexosaminidase and histamine released in mast cells and the key markers of degranulation. The release assays showed that hydroxysafflor yellow A could attenuate the immunoglobulin E-induced release of allergic cytokines at concentrations from 1.0 to 50.0 μM without affecting cell viability. In conclusion, the established rat basophilic leukemia 2H3 cell membrane chromatography method coupled with online high-performance liquid chromatography and tandem mass spectrometry successfully screened and identified hydroxysafflor yellow A from C. tinctorius as a potential antiallergic component. Pharmacological analysis elucidated that hydroxysafflor yellow A is an effective natural component for inhibiting immunoglobulin E-antigen-mediated degranulation. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Classification of the medicinal plants of the genus Atractylodes using high-performance liquid chromatography with diode array and tandem mass spectrometry detection combined with multivariate statistical analysis.

    Science.gov (United States)

    Cho, Hyun-Deok; Kim, Unyong; Suh, Joon Hyuk; Eom, Han Young; Kim, Junghyun; Lee, Seul Gi; Choi, Yong Seok; Han, Sang Beom

    2016-04-01

    Analytical methods using high-performance liquid chromatography with diode array and tandem mass spectrometry detection were developed for the discrimination of the rhizomes of four Atractylodes medicinal plants: A. japonica, A. macrocephala, A. chinensis, and A. lancea. A quantitative study was performed, selecting five bioactive components, including atractylenolide I, II, III, eudesma-4(14),7(11)-dien-8-one and atractylodin, on twenty-six Atractylodes samples of various origins. Sample extraction was optimized to sonication with 80% methanol for 40 min at room temperature. High-performance liquid chromatography with diode array detection was established using a C18 column with a water/acetonitrile gradient system at a flow rate of 1.0 mL/min, and the detection wavelength was set at 236 nm. Liquid chromatography with tandem mass spectrometry was applied to certify the reliability of the quantitative results. The developed methods were validated by ensuring specificity, linearity, limit of quantification, accuracy, precision, recovery, robustness, and stability. Results showed that cangzhu contained higher amounts of atractylenolide I and atractylodin than baizhu, and atractylodin in particular showed the greatest variation between baizhu and cangzhu. Multivariate statistical analyses, such as principal component analysis and hierarchical cluster analysis, were also employed for further classification of the Atractylodes plants. The established method was suitable for quality control of the Atractylodes plants. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
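
    Classification of the samples rests on multivariate analysis of the measured component concentrations. A minimal sketch of that step with scikit-learn and SciPy is given below; the concentration matrix is a made-up placeholder, not data from the study.

    ```python
    # PCA and hierarchical clustering on a samples x components concentration matrix,
    # the kind of multivariate step used to separate baizhu from cangzhu samples.
    # The values below are placeholders, not measured concentrations.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler
    from scipy.cluster.hierarchy import linkage, fcluster

    # rows: samples; columns: atractylenolide I, II, III, eudesmadienone, atractylodin
    X = np.array([
        [0.12, 0.30, 0.85, 0.40, 0.05],   # baizhu-like profile
        [0.10, 0.28, 0.90, 0.38, 0.04],
        [0.95, 0.35, 0.40, 0.55, 1.20],   # cangzhu-like profile
        [1.05, 0.33, 0.42, 0.60, 1.35],
    ])

    X_scaled = StandardScaler().fit_transform(X)
    scores = PCA(n_components=2).fit_transform(X_scaled)
    clusters = fcluster(linkage(X_scaled, method="ward"), t=2, criterion="maxclust")

    for score, cluster in zip(scores, clusters):
        print(f"PC1={score[0]:+.2f}  PC2={score[1]:+.2f}  cluster={cluster}")
    ```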

  14. Lead discovery for mammalian elongation of long chain fatty acids family 6 using a combination of high-throughput fluorescent-based assay and RapidFire mass spectrometry assay

    International Nuclear Information System (INIS)

    Takamiya, Mari; Sakurai, Masaaki; Teranishi, Fumie; Ikeda, Tomoko; Kamiyama, Tsutomu; Asai, Akira

    2016-01-01

    A high-throughput RapidFire mass spectrometry assay is described for elongation of very long-chain fatty acids family 6 (Elovl6). Elovl6 is a microsomal enzyme that regulates the elongation of C12-16 saturated and monounsaturated fatty acids. Elovl6 may be a new therapeutic target for fat metabolism disorders such as obesity, type 2 diabetes, and nonalcoholic steatohepatitis. To identify new Elovl6 inhibitors, we developed a high-throughput fluorescence screening assay in 1536-well format. However, a number of false positives caused by fluorescence interference were identified. To pick out the genuinely active compounds among the primary hits from the fluorescence assay, we developed a RapidFire mass spectrometry assay and a conventional radioisotope assay. These assays have the advantage of detecting the main products directly, without using fluorescently labeled substrates. As a result, 276 compounds (30%) of the primary hits (921 compounds) from the fluorescence ultra-high-throughput screen were identified as active in both assays. It is concluded that both methods are very effective for eliminating false positives. Compared with the radioisotope method, which uses an expensive 14C-labeled substrate, the RapidFire mass spectrometry method using unlabeled substrates is a high-accuracy, high-throughput method. In addition, some of the hit compounds selected from the screening inhibited cellular fatty acid elongation in HEK293 cells transiently expressing Elovl6. This result suggests that these compounds may be promising lead candidates for therapeutic drugs. Ultra-high-throughput fluorescence screening followed by a RapidFire mass spectrometry assay was a suitable strategy for lead discovery against Elovl6. - Highlights: • A novel assay for elongation of very-long-chain fatty acids 6 (Elovl6) is proposed. • RapidFire mass spectrometry (RF-MS) assay is useful to select real screening hits. • RF-MS assay is proved to be beneficial because of
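
    The triage step described here, keeping only primary fluorescence hits that are also active in the orthogonal RapidFire MS and radioisotope assays, is essentially a set intersection over compound identifiers. A trivial sketch follows; the identifiers are hypothetical.

    ```python
    # Keep only primary fluorescence-screen hits confirmed in both orthogonal assays.
    # Compound identifiers are hypothetical placeholders.
    primary_hits = {"CPD-0001", "CPD-0002", "CPD-0003", "CPD-0004"}
    rapidfire_ms_actives = {"CPD-0002", "CPD-0003", "CPD-0009"}
    radioisotope_actives = {"CPD-0002", "CPD-0003", "CPD-0005"}

    confirmed = primary_hits & rapidfire_ms_actives & radioisotope_actives
    false_positives = primary_hits - confirmed

    print(f"confirmed actives: {sorted(confirmed)}")
    print(f"likely fluorescence artifacts: {len(false_positives)} of {len(primary_hits)}")
    ```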

  15. Mass spectrometry

    DEFF Research Database (Denmark)

    Nyvang Hartmeyer, Gitte; Jensen, Anne Kvistholm; Böcher, Sidsel

    2010-01-01

    Matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF MS) is currently being introduced for the rapid and accurate identification of bacteria. We describe 2 MALDI-TOF MS identification cases - 1 directly on spinal fluid and 1 on grown bacteria. Rapidly obtained...

  16. DNA Qualification Workflow for Next Generation Sequencing of Histopathological Samples

    Science.gov (United States)

    Simbolo, Michele; Gottardi, Marisa; Corbo, Vincenzo; Fassan, Matteo; Mafficini, Andrea; Malpeli, Giorgio; Lawlor, Rita T.; Scarpa, Aldo

    2013-01-01

    Histopathological samples are a treasure-trove of DNA for clinical research. However, the quality of DNA can vary depending on the source or extraction method applied. Thus a standardized and cost-effective workflow for the qualification of DNA preparations is essential to guarantee interlaboratory reproducible results. The qualification process consists of the quantification of double-stranded DNA (dsDNA) and the assessment of its suitability for downstream applications, such as high-throughput next-generation sequencing. We tested the two most frequently used instruments to define their role in this process: NanoDrop, based on UV spectroscopy, and Qubit 2.0, which uses fluorochromes that specifically bind dsDNA. Quantitative PCR (qPCR) was used as the reference technique, as it simultaneously assesses DNA concentration and suitability for PCR amplification. We used 17 genomic DNAs from 6 fresh-frozen (FF) tissues, 6 formalin-fixed paraffin-embedded (FFPE) tissues, 3 cell lines, and 2 commercial preparations. Intra- and inter-operator variability was negligible, and intra-methodology variability was minimal, while consistent inter-methodology divergences were observed. In fact, NanoDrop measured higher DNA concentrations than Qubit, and its consistency with dsDNA quantification by qPCR was limited to high-molecular-weight DNA from FF samples and cell lines, where total DNA and dsDNA quantities virtually coincide. In partially degraded DNA from FFPE samples, only Qubit proved highly reproducible and consistent with qPCR measurements. Multiplex PCR amplifying 191 regions of 46 cancer-related genes was designated the downstream application, using 40 ng dsDNA from FFPE samples as calculated by Qubit. All but one sample produced amplicon libraries suitable for next-generation sequencing. The NanoDrop UV spectrum revealed contamination in the unsuccessful sample. In conclusion, as qPCR has high costs and is labor intensive, an alternative effective standard workflow for
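
    Two small calculations recur in such a qualification workflow: comparing the NanoDrop (total nucleic acid) and Qubit (dsDNA-specific) readings as a rough degradation indicator, and converting the Qubit concentration into the volume needed for the fixed 40 ng dsDNA input. The sketch below uses illustrative readings, not study data, and the ratio cutoff is an assumed rough threshold, not one stated in the article.

    ```python
    # Qualification arithmetic: NanoDrop/Qubit ratio as a rough dsDNA-quality flag,
    # and the sample volume needed for a fixed 40 ng dsDNA library input.
    # Readings are illustrative; the ratio cutoff of 2 is an assumed rough threshold.
    nanodrop_ng_per_ul = 38.0   # total nucleic acid (UV absorbance)
    qubit_ng_per_ul = 12.5      # double-stranded DNA (fluorometric)
    target_input_ng = 40.0

    ratio = nanodrop_ng_per_ul / qubit_ng_per_ul
    volume_needed_ul = target_input_ng / qubit_ng_per_ul

    print(f"NanoDrop/Qubit ratio: {ratio:.1f} "
          f"({'likely degraded or ssDNA-rich' if ratio > 2 else 'mostly intact dsDNA'})")
    print(f"volume for {target_input_ng:.0f} ng dsDNA input: {volume_needed_ul:.1f} uL")
    ```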

  17. DNA qualification workflow for next generation sequencing of histopathological samples.

    Directory of Open Access Journals (Sweden)

    Michele Simbolo

    workflow for qualification of DNA preparations should include the sequential combination of NanoDrop and Qubit to assess the purity and quantity of dsDNA, respectively.

  18. A standard-enabled workflow for synthetic biology

    KAUST Repository

    Myers, Chris J.

    2017-06-15

    A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select parts to create systems, and modeling and simulation tools to evaluate alternative design choices. Data standards enable the ready exchange of information within such a workflow, allowing repositories and tools to be connected from a diversity of sources. The present paper describes one such workflow that utilizes, among others, the Synthetic Biology Open Language (SBOL) to describe genetic designs, the Systems Biology Markup Language to model these designs, and SBOL Visual to visualize these designs. We describe how a standard-enabled workflow can be used to produce and exchange several types of design information, with multiple repositories and software tools exchanging information using a variety of data standards. Recently, the ACS Synthetic Biology journal has recommended the use of SBOL in its publications.

  19. A standard-enabled workflow for synthetic biology

    KAUST Repository

    Myers, Chris J.; Beal, Jacob; Gorochowski, Thomas E.; Kuwahara, Hiroyuki; Madsen, Curtis; McLaughlin, James Alastair; Mısırlı, Gö ksel; Nguyen, Tramy; Oberortner, Ernst; Samineni, Meher; Wipat, Anil; Zhang, Michael; Zundel, Zach

    2017-01-01

    A synthetic biology workflow is composed of data repositories that provide information about genetic parts, sequence-level design tools to compose these parts into circuits, visualization tools to depict these designs, genetic design tools to select

  20. Text mining meets workflow: linking U-Compare with Taverna

    Science.gov (United States)

    Kano, Yoshinobu; Dobson, Paul; Nakanishi, Mio; Tsujii, Jun'ichi; Ananiadou, Sophia

    2010-01-01

    Summary: Text mining from the biomedical literature is of increasing importance, yet it is not easy for the bioinformatics community to create and run text mining workflows due to the lack of accessibility and interoperability of the text mining resources. The U-Compare system provides a wide range of bio text mining resources in a highly interoperable workflow environment where workflows can very easily be created, executed, evaluated and visualized without coding. We have linked U-Compare to Taverna, a generic workflow system, to expose text mining functionality to the bioinformatics community. Availability: http://u-compare.org/taverna.html, http://u-compare.org Contact: kano@is.s.u-tokyo.ac.jp Supplementary information: Supplementary data are available at Bioinformatics online. PMID:20709690

  1. Job life cycle management libraries for CMS workflow management projects

    International Nuclear Information System (INIS)

    Lingen, Frank van; Wilkinson, Rick; Evans, Dave; Foulkes, Stephen; Afaq, Anzar; Vaandering, Eric; Ryu, Seangchan

    2010-01-01

    Scientific analysis and simulation require the processing and generation of millions of data samples. These tasks often comprise multiple smaller tasks divided over multiple (computing) sites. This paper discusses the Compact Muon Solenoid (CMS) workflow infrastructure, and specifically the Python-based workflow library which is used for so-called task lifecycle management. The CMS workflow infrastructure consists of three layers: high-level specification of the various tasks based on input/output data sets, lifecycle management of task instances derived from the high-level specification, and execution management. The workflow library is the result of a convergence of three CMS subprojects that respectively deal with scientific analysis, simulation, and real-time data aggregation from the experiment. This will reduce duplication and hence development and maintenance costs.

  2. A Community-Driven Workflow Recommendation and Reuse Infrastructure

    Data.gov (United States)

    National Aeronautics and Space Administration — Promote and encourage process and workflow reuse  within NASA Earth eXchange (NEX) by developing a proactive recommendation technology based on collective NEX user...

  3. modeling workflow management in a distributed computing system

    African Journals Online (AJOL)

    Dr Obe

    communication system, which allows for computerized support. ... Keywords: Distributed computing system; Petri nets; Workflow management. ... A distributed operating system usually ... the questionnaire is returned with invalid data.

  4. Scientific workflows as productivity tools for drug discovery.

    Science.gov (United States)

    Shon, John; Ohkawa, Hitomi; Hammer, Juergen

    2008-05-01

    Large pharmaceutical companies annually invest tens to hundreds of millions of US dollars in research informatics to support their early drug discovery processes. Traditionally, most of these investments are designed to increase the efficiency of drug discovery. The introduction of do-it-yourself scientific workflow platforms has enabled research informatics organizations to shift their efforts toward scientific innovation, ultimately resulting in a possible increase in return on their investments. Unlike the handling of most scientific data and application integration approaches, researchers apply scientific workflows to in silico experimentation and exploration, leading to scientific discoveries that lie beyond automation and integration. This review highlights some key requirements for scientific workflow environments in the pharmaceutical industry that are necessary for increasing research productivity. Examples of the application of scientific workflows in research and a summary of recent platform advances are also provided.

  5. FOSS geospatial libraries in scientific workflow environments: experiences and directions

    CSIR Research Space (South Africa)

    McFerren, G

    2011-07-01

    Full Text Available of experiments. In context of three sets of research (wildfire research, flood modelling and the linking of disease outbreaks to multi-scale environmental conditions), we describe our efforts to provide geospatial capability for scientific workflow software...

  6. Workflows for microarray data processing in the Kepler environment

    Directory of Open Access Journals (Sweden)

    Stropp Thomas

    2012-05-01

    Full Text Available Abstract Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to

  7. Workflows for microarray data processing in the Kepler environment.

    Science.gov (United States)

    Stropp, Thomas; McPhillips, Timothy; Ludäscher, Bertram; Bieda, Mark

    2012-05-17

    Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or R
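
    The workflows described wrap local programs and data files as components; the sketch below shows the kind of local-resource step such a component might encapsulate, here filtering a GFF annotation file by feature type. The file name and feature type are hypothetical, and this is not code from the published pipelines.

    ```python
    # A local-resource step of the kind wrapped as a workflow component:
    # filter a GFF annotation file, keeping only records of a given feature type.
    # The file name and feature type are hypothetical.
    import csv

    def filter_gff(in_path: str, out_path: str, feature_type: str) -> int:
        """Copy GFF lines whose 3rd column matches feature_type; return count kept."""
        kept = 0
        with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
            reader = csv.reader(src, delimiter="\t")
            writer = csv.writer(dst, delimiter="\t")
            for row in reader:
                if not row or row[0].startswith("#"):
                    continue  # skip comment and blank lines
                if len(row) >= 3 and row[2] == feature_type:
                    writer.writerow(row)
                    kept += 1
        return kept

    n = filter_gff("chip_chip_peaks.gff", "gene_features.gff", "gene")
    print(f"kept {n} features")
    ```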

  8. Workflows for microarray data processing in the Kepler environment

    Science.gov (United States)

    2012-01-01

    Background Microarray data analysis has been the subject of extensive and ongoing pipeline development due to its complexity, the availability of several options at each analysis step, and the development of new analysis demands, including integration with new data sources. Bioinformatics pipelines are usually custom built for different applications, making them typically difficult to modify, extend and repurpose. Scientific workflow systems are intended to address these issues by providing general-purpose frameworks in which to develop and execute such pipelines. The Kepler workflow environment is a well-established system under continual development that is employed in several areas of scientific research. Kepler provides a flexible graphical interface, featuring clear display of parameter values, for design and modification of workflows. It has capabilities for developing novel computational components in the R, Python, and Java programming languages, all of which are widely used for bioinformatics algorithm development, along with capabilities for invoking external applications and using web services. Results We developed a series of fully functional bioinformatics pipelines addressing common tasks in microarray processing in the Kepler workflow environment. These pipelines consist of a set of tools for GFF file processing of NimbleGen chromatin immunoprecipitation on microarray (ChIP-chip) datasets and more comprehensive workflows for Affymetrix gene expression microarray bioinformatics and basic primer design for PCR experiments, which are often used to validate microarray results. Although functional in themselves, these workflows can be easily customized, extended, or repurposed to match the needs of specific projects and are designed to be a toolkit and starting point for specific applications. These workflows illustrate a workflow programming paradigm focusing on local resources (programs and data) and therefore are close to traditional shell scripting or

  9. Combining combing and secondary ion mass spectrometry to study DNA on chips using 13C and 15N labeling [version 1; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Armelle Cabin-Flaman

    2016-06-01

    Full Text Available Dynamic secondary ion mass spectrometry (D-SIMS) imaging of combed DNA – the combing, imaging by SIMS or CIS method – has been developed previously using a standard NanoSIMS 50 to reveal, on the 50 nm scale, individual DNA fibers labeled with different, non-radioactive isotopes in vivo and to quantify these isotopes. This makes CIS especially suitable for determining the times, places and rates of DNA synthesis, as well as for detecting fine-scale re-arrangements of DNA and of molecules associated with combed DNA fibers. Here, we show how CIS may be extended to 13C-labeling via the detection and quantification of the 13C14N- recombinant ion and the use of the 13C:12C ratio; we discuss how CIS might permit three successive labels; and we suggest ideas that might be explored using CIS.
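
    Quantification in CIS rests on the 13C:12C ratio derived from the 13C14N- and 12C14N- secondary-ion counts, with the labeled fraction read as the excess over natural 13C abundance (about 1.1%). The sketch below shows that arithmetic as a simple two-pool mixing model; the counts and the assumed enrichment of fully labeled DNA are illustrative, not values from the study.

    ```python
    # Estimate the fraction of 13C-labeled DNA along a combed fiber from the
    # 12C14N- and 13C14N- secondary-ion counts, via a two-pool mixing model.
    # Counts and the label enrichment are illustrative assumptions.
    counts_12c14n = 90_000
    counts_13c14n = 10_000

    f_measured = counts_13c14n / (counts_12c14n + counts_13c14n)  # measured 13C fraction
    f_natural = 0.011                                             # natural 13C abundance
    f_label = 0.50      # assumed 13C fraction in fully labeled DNA (depends on labeling)

    labeled_fraction = (f_measured - f_natural) / (f_label - f_natural)
    print(f"measured 13C fraction: {f_measured:.3f}")
    print(f"estimated labeled DNA fraction: {labeled_fraction:.2%}")
    ```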

  10. Kronos: a workflow assembler for genome analytics and informatics

    Science.gov (United States)

    Taghiyar, M. Jafar; Rosner, Jamie; Grewal, Diljot; Grande, Bruno M.; Aniba, Radhouane; Grewal, Jasleen; Boutros, Paul C.; Morin, Ryan D.

    2017-01-01

    Abstract Background: The field of next-generation sequencing informatics has matured to a point where algorithmic advances in sequence alignment and individual feature detection methods have stabilized. Practical and robust implementation of complex analytical workflows (where such tools are structured into “best practices” for automated analysis of next-generation sequencing datasets) still requires significant programming investment and expertise. Results: We present Kronos, a software platform for facilitating the development and execution of modular, auditable, and distributable bioinformatics workflows. Kronos obviates the need for explicit coding of workflows by compiling a text configuration file into executable Python applications. Making analysis modules would still require programming. The framework of each workflow includes a run manager to execute the encoded workflows locally (or on a cluster or cloud), parallelize tasks, and log all runtime events. The resulting workflows are highly modular and configurable by construction, facilitating flexible and extensible meta-applications that can be modified easily through configuration file editing. The workflows are fully encoded for ease of distribution and can be instantiated on external systems, a step toward reproducible research and comparative analyses. We introduce a framework for building Kronos components that function as shareable, modular nodes in Kronos workflows. Conclusions: The Kronos platform provides a standard framework for developers to implement custom tools, reuse existing tools, and contribute to the community at large. Kronos is shipped with both Docker and Amazon Web Services Machine Images. It is free, open source, and available through the Python Package Index and at https://github.com/jtaghiyar/kronos. PMID:28655203
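
    Kronos compiles a configuration file into an executable Python workflow; the snippet below sketches only the general idea behind that kind of assembler (a task list with dependencies turned into an execution order by topological sort). It is not Kronos's actual configuration format or code, and the task and script names are hypothetical.

    ```python
    # Generic illustration of compiling a task configuration into an execution order,
    # the core idea behind config-driven workflow assemblers. Not Kronos's format.
    from graphlib import TopologicalSorter  # Python 3.9+

    config = {
        # task name -> {"run": command or script, "after": [dependencies]}
        "align":     {"run": "bwa_mem.sh",       "after": []},
        "dedup":     {"run": "mark_dups.sh",     "after": ["align"]},
        "call_snvs": {"run": "call_variants.sh", "after": ["dedup"]},
        "report":    {"run": "make_report.sh",   "after": ["call_snvs"]},
    }

    graph = {name: set(spec["after"]) for name, spec in config.items()}
    for task in TopologicalSorter(graph).static_order():
        print(f"would run: {config[task]['run']}  (task '{task}')")
    ```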

  11. Translating Unstructured Workflow Processes to Readable BPEL: Theory and Implementation

    DEFF Research Database (Denmark)

    van der Aalst, Willibrordus Martinus Pancratius; Lassen, Kristian Bisgaard

    2008-01-01

    and not easy to use by end-users. Therefore, we provide a mapping from Workflow Nets (WF-nets) to BPEL. This mapping builds on the rich theory of Petri nets and can also be used to map other languages (e.g., UML, EPC, BPMN, etc.) onto BPEL. In addition to this we have implemented the algorithm in a tool called...... WorkflowNet2BPEL4WS....

  12. Worklist handling in workflow-enabled radiological application systems

    Science.gov (United States)

    Wendler, Thomas; Meetz, Kirsten; Schmidt, Joachim; von Berg, Jens

    2000-05-01

    For the next generation of integrated information systems for health care applications, more emphasis has to be put on systems which, by design, support the reduction of cost, the increase in efficiency and the improvement of the quality of services. A substantial contribution to this will be the modeling, optimization, automation and enactment of processes in health care institutions. One of the perceived key success factors for the system integration of processes will be the application of workflow management, with workflow management systems as key technology components. In this paper we address workflow management in radiology. We focus on an important aspect of workflow management, the generation and handling of worklists, which automatically provide workflow participants with work items that reflect tasks to be performed. The display of worklists and the functions associated with work items are the visible part for the end-users of an information system using a workflow management approach. Appropriate worklist design and implementation will influence the user-friendliness of a system and will largely determine work efficiency. Technically, in current imaging department information system environments (modality-PACS-RIS installations), a data-driven approach has been taken: worklists -- if present at all -- are generated from filtered views on application databases. In a future workflow-based approach, worklists will be generated by autonomous workflow services based on explicit process models and organizational models. This process-oriented approach will provide us with an integral view of entire health care processes or sub-processes. The paper describes the basic mechanisms of this approach and summarizes its benefits.
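
    In the data-driven approach the paper contrasts with workflow-based worklists, a worklist is just a filtered view over application data. The sketch below illustrates that pattern over an in-memory list of study records; the field names and status values are hypothetical, not DICOM/HL7 fields or any vendor schema.

    ```python
    # Data-driven worklist: a filtered view over application records.
    # Field names and status values are hypothetical, not a vendor or DICOM schema.
    studies = [
        {"accession": "A-1001", "modality": "CT", "status": "acquired", "reader": None},
        {"accession": "A-1002", "modality": "MR", "status": "reported", "reader": "dr_b"},
        {"accession": "A-1003", "modality": "CT", "status": "acquired", "reader": None},
    ]

    def reading_worklist(records, modality):
        """Work items: acquired but not yet reported studies of the given modality."""
        return [r for r in records
                if r["modality"] == modality and r["status"] == "acquired"]

    for item in reading_worklist(studies, "CT"):
        print(f"to read: {item['accession']}")
    ```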

  13. Development of isotope dilution-liquid chromatography/mass spectrometry combined with standard addition techniques for the accurate determination of tocopherols in infant formula

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joonhee; Jang, Eun-Sil; Kim, Byungjoo, E-mail: byungjoo@kriss.re.kr

    2013-07-17

    Graphical abstract: -- Highlights: •ID-LC/MS method showed biased results for tocopherols analysis in infant formula. •H/D exchange of deuterated tocopherols in sample preparation was the source of bias. •Standard addition (SA)-ID-LC/MS was developed as an alternative to ID-LC/MS. •Details of calculation and uncertainty evaluation of the SA-IDMS were described. •SA-ID-LC/MS showed a higher-order metrological quality as a reference method. -- Abstract: During the development of isotope dilution-liquid chromatography/mass spectrometry (ID-LC/MS) for tocopherol analysis in infant formula, biased measurement results were observed when deuterium-labeled tocopherols were used as internal standards. It turned out that the biases came from intermolecular H/D exchange and intramolecular H/D scrambling of internal standards in sample preparation processes. Degrees of H/D exchange and scrambling showed considerable dependence on sample matrix. Standard addition-isotope dilution mass spectrometry (SA-IDMS) based on LC/MS was developed in this study to overcome the shortcomings of using deuterium-labeled internal standards while the inherent advantage of isotope dilution techniques is utilized for the accurate recovery correction in sample preparation processes. Details of experimental scheme, calculation equation, and uncertainty evaluation scheme are described in this article. The proposed SA-IDMS method was applied to several infant formula samples to test its validity. The method was proven to have a higher-order metrological quality with providing very accurate and precise measurement results.
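
    The abstract refers to a calculation equation without reproducing it. As a point of orientation only, a generic single-point standard-addition relation is sketched below; it is not the paper's SA-IDMS working equation, which additionally folds the isotope-ratio measurement into the recovery correction.

    ```latex
    % Generic single-point standard addition: signals S_1 (sample alone) and S_2
    % (sample plus a known spike), assuming equal final volumes and linear response.
    c_x \;=\; \frac{c_{\mathrm{std}}\, V_{\mathrm{std}}}{V_x}\cdot\frac{S_1}{S_2 - S_1}
    ```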

  14. Non-chromatographic speciation analysis of mercury by flow injection on-line preconcentration in combination with chemical vapor generation atomic fluorescence spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Wu Hong [School of Chemistry and Chemical Engineering, State Key Laboratory of Coordination Chemistry and Key Laboratory of MOE for Life Science, Nanjing University, Nanjing 210093 (China); Department of Chemistry, Xuzhou Normal University, Xuzhou 221116 (China); Jin Yan [School of Chemistry and Chemical Engineering, State Key Laboratory of Coordination Chemistry and Key Laboratory of MOE for Life Science, Nanjing University, Nanjing 210093 (China); Han Weiying [School of Chemistry and Chemical Engineering, State Key Laboratory of Coordination Chemistry and Key Laboratory of MOE for Life Science, Nanjing University, Nanjing 210093 (China); Miao, Qiang [School of Chemistry and Chemical Engineering, State Key Laboratory of Coordination Chemistry and Key Laboratory of MOE for Life Science, Nanjing University, Nanjing 210093 (China); Bi Shuping [School of Chemistry and Chemical Engineering, State Key Laboratory of Coordination Chemistry and Key Laboratory of MOE for Life Science, Nanjing University, Nanjing 210093 (China)]. E-mail: bisp@nju.edu.cn

    2006-07-15

    A novel non-chromatographic approach for direct speciation of mercury, based on the selective retention of inorganic mercury and methylmercury on the inner wall of a knotted reactor using ammonium diethyl dithiophosphate and dithizone as complexing agents, respectively, was developed for flow injection on-line sorption preconcentration coupled with chemical vapor generation non-dispersive atomic fluorescence spectrometry. With the sample pH kept at 2.0, the preconcentration of inorganic mercury on the inner walls of the knotted reactor was carried out based on the exclusive retention of the Hg-DDP complex in the presence of methylmercury, by on-line merging of the sample solution with ammonium diethyl dithiophosphate solution; selective preconcentration of methylmercury was achieved with dithizone instead of ammonium diethyl dithiophosphate. A 15% (v/v) HCl solution was introduced to elute the retained mercury species and merge with KBH4 solution for atomic fluorescence spectrometry detection. Under the optimal experimental conditions, the sample throughputs for inorganic mercury and methylmercury were 30 and 20 h⁻¹, with enhancement factors of 13 and 24. The detection limits were found to be 3.6 ng l⁻¹ for Hg²⁺ and 2.0 ng l⁻¹ for CH₃Hg⁺. The precisions (RSD) for 11 replicate measurements of 0.2 μg l⁻¹ Hg²⁺ and CH₃Hg⁺ were 2.2% and 2.8%, respectively. The developed method was validated by the analysis of certified reference materials (simulated natural water, rice flour and pork) and by recovery measurements on spiked samples, and was applied to the determination of inorganic mercury and methylmercury in biological and environmental water samples.

  15. Qualitative metabolome analysis of human cerebrospinal fluid by 13C-/12C-isotope dansylation labeling combined with liquid chromatography Fourier transform ion cyclotron resonance mass spectrometry.

    Science.gov (United States)

    Guo, Kevin; Bamforth, Fiona; Li, Liang

    2011-02-01

    Metabolome analysis of human cerebrospinal fluid (CSF) is challenging because of low abundance of metabolites present in a small volume of sample. We describe and apply a sensitive isotope labeling LC-MS technique for qualitative analysis of the CSF metabolome. After a CSF sample is divided into two aliquots, they are labeled by (13)C-dansyl and (12)C-dansyl chloride, respectively. The differentially labeled aliquots are then mixed and subjected to LC-MS using Fourier-transform ion cyclotron resonance mass spectrometry (FTICR MS). Dansylation offers significant improvement in the performance of chromatography separation and detection sensitivity. Moreover, peaks detected in the mass spectra can be readily analyzed for ion pair recognition and database search based on accurate mass and/or retention time information. It is shown that about 14,000 features can be detected in a 25-min LC-FTICR MS run of a dansyl-labeled CSF sample, from which about 500 metabolites can be profiled. Results from four CSF samples are compared to gauge the detectability of metabolites by this method. About 261 metabolites are commonly detected in replicate runs of four samples. In total, 1132 unique metabolite ion pairs are detected and 347 pairs (31%) matched with at least one metabolite in the Human Metabolome Database. We also report a dansylation library of 220 standard compounds and, using this library, about 85 metabolites can be positively identified. Among them, 21 metabolites have never been reported to be associated with CSF. These results illustrate that the dansylation LC-FTICR MS method can be used to analyze the CSF metabolome in a more comprehensive manner. © American Society for Mass Spectrometry, 2011
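
    Ion-pair recognition in this scheme amounts to finding co-eluting peak pairs separated by the fixed mass difference between the 13C- and 12C-dansyl tags. The sketch below implements that matching generically; the mass-difference value, the tolerances, and the peak list are stated as assumptions, since the exact number of 13C atoms in the labeling reagent is not given here.

    ```python
    # Find 12C/13C-dansyl ion pairs: co-eluting peaks whose m/z values differ by a
    # fixed label mass difference. delta_mz, tolerances and peaks are assumptions.
    peaks = [  # (m/z, retention time in min, intensity) - illustrative values
        (350.1421, 12.31, 8.0e5),
        (352.1488, 12.32, 7.6e5),
        (410.2010, 15.10, 2.2e5),
    ]

    delta_mz = 2.00671   # e.g. two 13C atoms: 2 x 1.003355 Da (assumed label composition)
    mz_tol = 0.005       # m/z tolerance in Da
    rt_tol = 0.10        # retention-time tolerance in minutes

    pairs = []
    for mz1, rt1, i1 in peaks:
        for mz2, rt2, i2 in peaks:
            if abs((mz2 - mz1) - delta_mz) <= mz_tol and abs(rt2 - rt1) <= rt_tol:
                pairs.append(((mz1, rt1), (mz2, rt2), i1 / i2))

    for light, heavy, ratio in pairs:
        print(f"pair: {light[0]:.4f} / {heavy[0]:.4f} at RT {light[1]:.2f} min, "
              f"intensity ratio {ratio:.2f}")
    ```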

  16. Non-chromatographic speciation analysis of mercury by flow injection on-line preconcentration in combination with chemical vapor generation atomic fluorescence spectrometry

    International Nuclear Information System (INIS)

    Wu Hong; Jin Yan; Han Weiying; Miao, Qiang; Bi Shuping

    2006-01-01

    A novel non-chromatographic approach for direct speciation of mercury, based on the selective retention of inorganic mercury and methylmercury on the inner wall of a knotted reactor using ammonium diethyl dithiophosphate and dithizone as complexing agents, respectively, was developed for flow injection on-line sorption preconcentration coupled with chemical vapor generation non-dispersive atomic fluorescence spectrometry. With the sample pH kept at 2.0, preconcentration of inorganic mercury on the inner walls of the knotted reactor was based on the exclusive retention of the Hg-DDP complex in the presence of methylmercury, achieved by on-line merging of the sample solution with ammonium diethyl dithiophosphate solution; selective preconcentration of methylmercury was achieved with dithizone instead of ammonium diethyl dithiophosphate. A 15% (v/v) HCl solution was introduced to elute the retained mercury species and merge with the KBH4 solution for atomic fluorescence spectrometry detection. Under the optimal experimental conditions, the sample throughputs for inorganic mercury and methylmercury were 30 and 20 h-1, with enhancement factors of 13 and 24. The detection limits were 3.6 ng l-1 for Hg2+ and 2.0 ng l-1 for CH3Hg+. The precisions (RSD) for 11 replicate measurements of 0.2 μg l-1 Hg2+ and CH3Hg+ were 2.2% and 2.8%, respectively. The developed method was validated by the analysis of certified reference materials (simulated natural water, rice flour and pork) and by recovery measurements on spiked samples, and was applied to the determination of inorganic mercury and methylmercury in biological and environmental water samples.

  17. eMZed: an open source framework in Python for rapid and interactive development of LC/MS data analysis workflows

    OpenAIRE

    Kiefer, P; Schmitt, U; Vorholt, J A

    2013-01-01

    Summary: The Python-based, open-source eMZed framework was developed for mass spectrometry (MS) users to create tailored workflows for liquid chromatography (LC)/MS data analysis. The goal was to establish a unique framework with comprehensive basic functionalities that are easy to apply and allow for the extension and modification of the framework in a straightforward manner. eMZed supports the iterative development and prototyping of individual evaluation strategies by providing a computing...

  18. Assessment of current mass spectrometric workflows for the quantification of low abundant proteins and phosphorylation sites

    Directory of Open Access Journals (Sweden)

    Manuel Bauer

    2015-12-01

    Full Text Available The data described here provide a systematic performance evaluation of popular data-dependent (DDA) and data-independent (DIA) mass spectrometric (MS) workflows currently used in quantitative proteomics. We assessed the limits of identification, quantification and detection for each method by analyzing a dilution series of 20 unmodified and 10 phosphorylated synthetic heavy labeled reference peptides, respectively, covering six orders of magnitude in peptide concentration with and without a complex human cell digest background. We found that all methods performed very similarly in the absence of background proteins; however, when analyzing whole cell lysates, targeted methods were at least 5–10 times more sensitive than directed or DDA methods. In particular, higher stage fragmentation (MS3) of the neutral loss peak using a linear ion trap increased the dynamic quantification range of some phosphopeptides up to 100-fold. We illustrate the power of this targeted MS3 approach for phosphopeptide monitoring by successfully quantifying 9 phosphorylation sites of the kinetochore and spindle assembly checkpoint component Mad1 over different cell cycle states from non-enriched pull-down samples. The data are associated with the research article ‘Evaluation of data-dependent and data-independent mass spectrometric workflows for sensitive quantification of proteins and phosphorylation sites’ (Bauer et al., 2014) [1]. The mass spectrometry and the analysis dataset have been deposited to the ProteomeXchange Consortium (http://proteomecentral.proteomexchange.org) via the PRIDE partner repository with the dataset identifier PXD000964.
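
    One way to make the "limit of quantification from a dilution series" notion concrete is the small sketch below (Python/NumPy). It is only a simplified stand-in for the evaluation performed in the record above; the 20% CV criterion and the data layout are assumptions.

        import numpy as np

        def estimate_loq(dilution_series, max_cv=0.20):
            """Hedged sketch: report the lowest concentration from which this level and
            all higher levels show a replicate CV below max_cv.
            dilution_series: {concentration: [replicate peak areas]}."""
            levels = sorted(dilution_series)
            passes = []
            for conc in levels:
                reps = np.asarray(dilution_series[conc], dtype=float)
                cv = reps.std(ddof=1) / reps.mean() if reps.mean() > 0 else np.inf
                passes.append(cv <= max_cv)
            for i, ok in enumerate(passes):
                if ok and all(passes[i:]):
                    return levels[i]
            return None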

  19. Inverse IMRT workflow process at Austin health

    International Nuclear Information System (INIS)

    Rykers, K.; Fernando, W.; Grace, M.; Liu, G.; Rolfo, A.; Viotto, A.; Mantle, C.; Lawlor, M.; Au-Yeung, D.; Quong, G.; Feigen, M.; Lim-Joon, D.; Wada, M.

    2004-01-01

    Full text: The work presented here will review the strategies adopted at Austin Health to bring IMRT into clinical use. IMRT is delivered using step-and-shoot mode on an Elekta Precise machine with 40 pairs of 1 cm wide MLC leaves. Planning is done using CMS Focus/XiO. A collaborative approach for ROs, Physicists and RTs from concept to implementation was adopted. An overview will be given of the workflow for the clinic, the equipment used, tolerance levels and the lessons learned: 1. Strategic planning for IMRT. 2. Training: a. MSKCC (New York); b. ESTRO (Amsterdam); c. Elekta (US and UK). 3. Linac testing and data acquisition: a. Equipment and software review and selection; b. Linac reliability/geometric and mechanical checks; c. Draft patient QA procedure; d. EPI image matching checks and procedures. 4. Planning system checks: a. Export of dose matrix (options); b. Dose calculation choices. 5. IMRT research initiatives: a. IMRT planning studies, stabilisation, on-line imaging. 6. Equipment procurement and testing: a. Physics and linac equipment, hardware, software/licences, stabilisation. 7. Establishing a DICOM environment: a. Prescription sending, image transfer for EPI checks; b. QA files. 8. Physics QA (pre-treatment): a. Clinical plan review and DVH checks; b. Geometry, dosimetry and DICOM checks; c. 2D distance to agreement, mm difference reports, gamma function index. 9. Documentation: a. Protocol development, including ICRU 50/62 reporting and prescribing; b. QA for Physics; c. QA for RTs; d. Generation of a report for the RO/patient history. Copyright (2004) Australasian College of Physical Scientists and Engineers in Medicine

  20. CO2 Storage Feasibility: A Workflow for Site Characterisation

    Directory of Open Access Journals (Sweden)

    Nepveu Manuel

    2015-04-01

    Full Text Available In this paper, we present an overview of the SiteChar workflow model for site characterisation and assessment for CO2 storage. Site characterisation and assessment is required when permits are requested from the legal authorities in the process of starting a CO2 storage process at a given site. The goal is to assess whether a proposed CO2 storage site can indeed be used for permanent storage while meeting the safety requirements demanded by the European Commission (EC) Storage Directive (9, Storage Directive 2009/31/EC). Many issues have to be scrutinised, and the workflow presented here is put forward to help efficiently organise this complex task. Three issues are highlighted: communication within the working team and with the authorities; interdependencies in the workflow and feedback loops; and the risk-based character of the workflow. A general overview (helicopter view) of the workflow is given; the issues involved in communication and the risk assessment process are described in more detail. The workflow as described has been tested within the SiteChar project on five potential storage sites throughout Europe. This resulted in a list of key aspects of site characterisation which can help prepare and focus new site characterisation studies.

  1. LabelFlow Framework for Annotating Workflow Provenance

    Directory of Open Access Journals (Sweden)

    Pinar Alper

    2018-02-01

    Full Text Available Scientists routinely analyse and share data for others to use. Successful data (re)use relies on having metadata describing the context of analysis of the data. In many disciplines the creation of contextual metadata is referred to as reporting. One method of implementing analyses is with workflows. A stand-out feature of workflows is their ability to record provenance from executions. Provenance is useful when analyses are executed with changing parameters (changing contexts) and results need to be traced to the respective parameters. In this paper we investigate whether provenance can be exploited to support reporting. Specifically, we outline a case study based on a real-world workflow and a set of reporting queries. We observe that provenance, as collected from workflow executions, is of limited use for reporting, as it supports the queries only partially. We identify that this is due to the generic nature of provenance and its lack of domain-specific contextual metadata. We observe that the required information is available in implicit form, embedded in data. We describe LabelFlow, a framework comprised of four Labelling Operators for decorating provenance with domain-specific Labels. LabelFlow can be instantiated for a domain by plugging it with domain-specific metadata extractors. We provide a tool that takes as input a workflow, and produces as output a Labelling Pipeline for that workflow, comprised of Labelling Operators. We revisit the case study and show how Labels provide a more complete implementation of the reporting queries.

  2. A Kepler Workflow Tool for Reproducible AMBER GPU Molecular Dynamics.

    Science.gov (United States)

    Purawat, Shweta; Ieong, Pek U; Malmstrom, Robert D; Chan, Garrett J; Yeung, Alan K; Walker, Ross C; Altintas, Ilkay; Amaro, Rommie E

    2017-06-20

    With the drive toward high throughput molecular dynamics (MD) simulations involving ever-greater numbers of simulation replicates run for longer, biologically relevant timescales (microseconds), the need for improved computational methods that facilitate fully automated MD workflows gains more importance. Here we report the development of an automated workflow tool to perform AMBER GPU MD simulations. Our workflow tool capitalizes on the capabilities of the Kepler platform to deliver a flexible, intuitive, and user-friendly environment and the AMBER GPU code for a robust and high-performance simulation engine. Additionally, the workflow tool reduces user input time by automating repetitive processes and facilitates access to GPU clusters, whose high-performance processing power makes simulations of large numerical scale possible. The presented workflow tool facilitates the management and deployment of large sets of MD simulations on heterogeneous computing resources. The workflow tool also performs systematic analysis on the simulation outputs and enhances simulation reproducibility, execution scalability, and MD method development including benchmarking and validation. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
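
    The kind of repetitive simulation launching that the workflow tool automates can be pictured with the generic sketch below. It is emphatically not the Kepler tool itself; the input file names and directory layout are placeholders, and the command line simply follows the common pmemd.cuda invocation form.

        import subprocess
        from pathlib import Path

        def run_replicates(n_reps, mdin="md.in", prmtop="system.prmtop", inpcrd="system.inpcrd"):
            """Hedged sketch: launch n_reps AMBER GPU MD replicates one after another.
            File names are illustrative assumptions, not part of the published tool."""
            for rep in range(1, n_reps + 1):
                outdir = Path(f"rep{rep:02d}")
                outdir.mkdir(exist_ok=True)
                cmd = ["pmemd.cuda", "-O",
                       "-i", mdin, "-p", prmtop, "-c", inpcrd,
                       "-o", str(outdir / "md.out"),
                       "-r", str(outdir / "md.rst"),
                       "-x", str(outdir / "md.nc")]
                subprocess.run(cmd, check=True)   # a workflow engine would also record provenance here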

  3. Dual element ((15)N/(14)N, (13)C/(12)C) isotope analysis of glyphosate and AMPA by derivatization-gas chromatography isotope ratio mass spectrometry (GC/IRMS) combined with LC/IRMS.

    Science.gov (United States)

    Mogusu, Emmanuel O; Wolbert, J Benjamin; Kujawinski, Dorothea M; Jochmann, Maik A; Elsner, Martin

    2015-07-01

    To assess sources and degradation of the herbicide glyphosate [N-(phosphonomethyl) glycine] and its metabolite AMPA (aminomethylphosphonic acid), concentration measurements are often inconclusive and even (13)C/(12)C analysis alone may give limited information. To advance isotope ratio analysis of an additional element, we present compound-specific (15)N/(14)N analysis of glyphosate and AMPA by a two-step derivatization in combination with gas chromatography/isotope ratio mass spectrometry (GC/IRMS). The N-H group was derivatized with isopropyl chloroformate (iso-PCF), and remaining acidic groups were subsequently methylated with trimethylsilyldiazomethane (TMSD). Iso-PCF treatment at pH >10 indicated decomposition of the derivative. At pH 10, and with an excess of iso-PCF by 10-24, greatest yields and accurate (15)N/(14)N ratios were obtained (deviation from elemental analyzer-IRMS: -0.2 ± 0.9‰ for glyphosate; -0.4 ± 0.7‰ for AMPA). Limits for accurate δ(15)N analysis of glyphosate and AMPA were 150 and 250 ng injected, respectively. A combination of δ(15)N and δ(13)C analysis by liquid chromatography/isotope ratio mass spectrometry (LC/IRMS) (1) enabled an improved distinction of commercial glyphosate products and (2) showed that glyphosate isotope values during degradation by MnO2 clearly fell outside the commercial product range. This highlights the potential of combined carbon and nitrogen isotope analysis to trace sources and degradation of glyphosate.
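
    For readers unfamiliar with the notation, isotope ratios in records like this one are conventionally reported as delta values in per mil relative to a reference standard. A minimal illustration (Python), using an illustrative reference ratio:

        # delta = (R_sample / R_standard - 1) * 1000, reported in per mil.
        def delta_per_mil(r_sample, r_standard):
            return (r_sample / r_standard - 1.0) * 1000.0

        # Example: a 15N/14N ratio 0.5% above the atmospheric-N2 reference
        # (commonly cited as ~0.0036765) corresponds to roughly +5 per mil.
        R_AIR = 0.0036765
        print(delta_per_mil(R_AIR * 1.005, R_AIR))   # ~5.0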

  4. Data partitioning enables the use of standard SOAP Web Services in genome-scale workflows.

    Science.gov (United States)

    Sztromwasser, Pawel; Puntervoll, Pål; Petersen, Kjell

    2011-07-26

    Biological databases and computational biology tools are provided by research groups around the world, and made accessible on the Web. Combining these resources is a common practice in bioinformatics, but integration of heterogeneous and often distributed tools and datasets can be challenging. To date, this challenge has been commonly addressed in a pragmatic way, by tedious and error-prone scripting. Recently, however, a more reliable technique has been identified and proposed as the platform that would tie together bioinformatics resources, namely Web Services. In the last decade Web Services have spread widely in bioinformatics and earned the title of recommended technology. However, in the era of high-throughput experimentation, a major concern regarding Web Services is their ability to handle large-scale data traffic. We propose a stream-like communication pattern for standard SOAP Web Services that enables efficient flow of large data traffic between a workflow orchestrator and Web Services. We evaluated the data-partitioning strategy by comparing it with typical communication patterns on an example pipeline for genomic sequence annotation. The results show that data partitioning lowers the resource demands of services and increases their throughput, which in consequence allows in-silico experiments to be executed at genome scale, using standard SOAP Web Services and workflows. As a proof-of-principle we annotated an RNA-seq dataset using a plain BPEL workflow engine.
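
    The data-partitioning pattern argued for in this record reduces, in essence, to chunking a large input and concatenating the per-chunk results. The sketch below (Python) illustrates only that pattern; call_service is a placeholder for an actual SOAP client call (for instance one built with a library such as zeep), not part of the authors' implementation.

        def partition(records, chunk_size):
            """Yield successive chunks so that each service call stays small."""
            for i in range(0, len(records), chunk_size):
                yield records[i:i + chunk_size]

        def annotate_in_chunks(sequences, call_service, chunk_size=100):
            """Hedged sketch: submit chunks to a Web Service and concatenate the results."""
            results = []
            for chunk in partition(sequences, chunk_size):
                results.extend(call_service(chunk))   # placeholder SOAP invocation
            return results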

  5. Data partitioning enables the use of standard SOAP Web Services in genome-scale workflows

    Directory of Open Access Journals (Sweden)

    Sztromwasser Paweł

    2011-06-01

    Full Text Available Biological databases and computational biology tools are provided by research groups around the world, and made accessible on the Web. Combining these resources is a common practice in bioinformatics, but integration of heterogeneous and often distributed tools and datasets can be challenging. To date, this challenge has been commonly addressed in a pragmatic way, by tedious and error-prone scripting. Recently, however, a more reliable technique has been identified and proposed as the platform that would tie together bioinformatics resources, namely Web Services. In the last decade Web Services have spread widely in bioinformatics and earned the title of recommended technology. However, in the era of high-throughput experimentation, a major concern regarding Web Services is their ability to handle large-scale data traffic. We propose a stream-like communication pattern for standard SOAP Web Services that enables efficient flow of large data traffic between a workflow orchestrator and Web Services. We evaluated the data-partitioning strategy by comparing it with typical communication patterns on an example pipeline for genomic sequence annotation. The results show that data partitioning lowers the resource demands of services and increases their throughput, which in consequence allows in-silico experiments to be executed at genome scale, using standard SOAP Web Services and workflows. As a proof-of-principle we annotated an RNA-seq dataset using a plain BPEL workflow engine.

  6. Mixed Methods Approach for Measuring the Impact of Video Telehealth on Outpatient Clinic Triage Nurse Workflow

    Science.gov (United States)

    Cady, Rhonda G.; Finkelstein, Stanley M.

    2015-01-01

    Nurse-delivered telephone triage is a common component of outpatient clinic settings. Adding new communication technology to clinic triage has the potential to not only transform the triage process, but also alter triage workflow. Evaluating the impact of new technology on an existing workflow is paramount to maximizing efficiency of the delivery system. This study investigated triage nurse workflow before and after the implementation of video telehealth using a sequential mixed methods protocol that combined ethnography and time-motion study to provide a robust analysis of the implementation environment. Outpatient clinic triage using video telehealth required significantly more time than telephone triage, indicating a reduction in nurse efficiency. Despite the increased time needed to conduct video telehealth, nurses consistently rated it useful in providing triage. Interpretive analysis of the qualitative and quantitative data suggests the increased depth and breadth of data available during video triage alters the assessment triage nurses provide physicians. This in turn could impact the time physicians spend formulating a diagnosis and treatment plan. While the immediate impact of video telehealth is a reduction in triage nurse efficiency, what is unknown is the impact of video telehealth on physician and overall clinic efficiency. Future studies should address this area. PMID:24080753

  7. Supporting the Construction of Workflows for Biodiversity Problem-Solving Accessing Secure, Distributed Resources

    Directory of Open Access Journals (Sweden)

    J.S. Pahwa

    2006-01-01

    Full Text Available In the Biodiversity World (BDW) project we have created a flexible and extensible Web Services-based Grid environment for biodiversity researchers to solve problems in biodiversity and analyse biodiversity patterns. In this environment, heterogeneous and globally distributed biodiversity-related resources such as data sets and analytical tools are made available to be accessed and assembled by users into workflows to perform complex scientific experiments. One such experiment is bioclimatic modelling of the geographical distribution of individual species using climate variables in order to explain past and future climate-related changes in species distribution. Data sources and analytical tools required for such analysis of species distribution are widely dispersed, available on heterogeneous platforms, present data in different formats and lack inherent interoperability. The present BDW system brings all these disparate units together so that the user can combine tools with little thought as to their original availability, data formats and interoperability. The new prototype BDW system architecture not only brings together heterogeneous resources but also enables utilisation of computational resources and provides secure access to BDW resources via a federated security model. We describe features of the new BDW system and its security model which enable user authentication from a workflow application as part of workflow execution.

  8. Enabling Structured Exploration of Workflow Performance Variability in Extreme-Scale Environments

    Energy Technology Data Exchange (ETDEWEB)

    Kleese van Dam, Kerstin; Stephan, Eric G.; Raju, Bibi; Altintas, Ilkay; Elsethagen, Todd O.; Krishnamoorthy, Sriram

    2015-11-15

    Workflows are taking an increasingly important role in orchestrating complex scientific processes in extreme scale and highly heterogeneous environments. However, to date we cannot reliably predict, understand, and optimize workflow performance. Sources of performance variability, and in particular the interdependencies of workflow design, execution environment and system architecture, are not well understood. While there is a rich portfolio of tools for performance analysis, modeling and prediction for single applications in homogeneous computing environments, these are not applicable to workflows, due to the number and heterogeneity of the involved workflow and system components and their strong interdependencies. In this paper, we investigate workflow performance goals and identify factors that could have a relevant impact. Based on our analysis, we propose a new workflow performance provenance ontology, the Open Provenance Model-based WorkFlow Performance Provenance, or OPM-WFPP, that will enable the empirical study of workflow performance characteristics and variability including complex source attribution.

  9. Hollow fiber liquid phase microextraction combined with graphite furnace atomic absorption spectrometry for the determination of methylmercury in human hair and sludge samples

    Energy Technology Data Exchange (ETDEWEB)

    Jiang Hongmei [Department of Chemistry, Wuhan University, Wuhan 430072 (China); Hu Bin [Department of Chemistry, Wuhan University, Wuhan 430072 (China)], E-mail: binhu@whu.edu.cn; Chen Beibei; Zu Wanqing [Department of Chemistry, Wuhan University, Wuhan 430072 (China)

    2008-07-15

    Two methods, based on hollow fiber liquid-liquid-liquid (three phase) microextraction (HF-LLLME) and hollow fiber liquid phase (two phase) microextraction (HF-LPME), have been developed and critically compared for the determination of methylmercury content in human hair and sludge by graphite furnace atomic absorption spectrometry (GFAAS). In HF-LPME, methylmercury was extracted into the organic phase (toluene) prior to its determination by GFAAS, while inorganic mercury remained as a free species in the sample solution. In HF-LLLME, methylmercury was first extracted into the organic phase (toluene) and then into the acceptor phase (4% thiourea in 1 mol L{sup -1} HCl) prior to its determination by GFAAS, while inorganic mercury remained in the sample solution. The total mercury was determined by inductively coupled plasma-mass spectrometry (ICP-MS), and the levels of inorganic mercury in both HF-LLLME and HF-LPME were obtained by subtracting methylmercury from total mercury. The factors affecting the microextraction of methylmercury, including organic solvent, extraction time, stirring rate and ionic strength, were investigated and the optimal extraction conditions were established for both HF-LLLME and HF-LPME. With a consumption of 3.0 mL of the sample solution, the enrichment factors were 204 and 55 for HF-LLLME and HF-LPME, respectively. The limits of detection (LODs) for methylmercury were 0.1 {mu}g L{sup -1} and 0.4 {mu}g L{sup -1} (as Hg) with precisions (RSDs (%), c = 5 {mu}g L{sup -1} (as Hg), n = 5) of 13% and 11% for HF-LLLME-GFAAS and HF-LPME-GFAAS, respectively. For ICP-MS determination of total mercury, a limit of detection of 39 ng L{sup -1} was obtained. Finally, HF-LLLME-GFAAS was applied to the determination of methylmercury content in human hair and sludge, and the recoveries for the spiked samples were in the range of 99-113%. In order to validate the method, HF-LLLME-GFAAS was also applied to the analysis of a certified reference

  10. Identification of intact high molecular weight glutenin subunits from the wheat proteome using combined liquid chromatography-electrospray ionization mass spectrometry.

    Science.gov (United States)

    Lagrain, Bert; Brunnbauer, Markus; Rombouts, Ine; Koehler, Peter

    2013-01-01

    The present paper describes a method for the identification of intact high molecular weight glutenin subunits (HMW-GS), the quality determining proteins from the wheat storage proteome. The method includes isolation of HMW-GS from wheat flour, further separation of HMW-GS by reversed-phase high-performance liquid chromatography (RP-HPLC), and their subsequent molecular identification with electrospray ionization mass spectrometry using a quadrupole-time-of-flight mass analyzer. For HMW-GS isolation, wheat proteins were reduced and extracted from flour with 50% 1-propanol containing 1% dithiothreitol. HMW-GS were then selectively precipitated from the protein mixture by adjusting the 1-propanol concentration to 60%. The composition of the precipitated proteins was first evaluated by sodium dodecyl sulfate-polyacrylamide gel electrophoresis with Coomassie staining and RP-HPLC with ultraviolet detection. Besides HMW-GS (≥65%), the isolated proteins mainly contained ω5-gliadins. Secondly, the isolated protein fraction was analyzed by liquid chromatography-mass spectrometry. Optimal chromatographic separation of HMW-GS from the other proteins in the isolated fraction was obtained when the mobile phase contained 0.1% trifluoroacetic acid as ion-pairing agent. Individual HMW-GS were then identified by determining their molecular masses from the high-resolution mass spectra and comparing these with theoretical masses calculated from amino acid sequences. Using formic acid instead of trifluoroacetic acid in the mobile phase increased protein peak intensities in the base peak mass chromatogram. This allowed the detection of even traces of other wheat proteins than HMW-GS in the isolated fraction, but the chromatographic separation was inferior with a major overlap between the elution ranges of HMW-GS and ω-gliadins. Overall, the described method allows a rapid assessment of wheat quality through the direct determination of the HMW-GS composition and offers a basis for
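
    The final identification step, matching measured intact masses against theoretical HMW-GS masses, can be sketched generically as below (Python). The subunit names and masses are placeholders, not values from the record.

        # Hedged sketch: assign deconvoluted intact masses to theoretical HMW-GS masses.
        THEORETICAL_MASSES = {
            "HMW-GS (placeholder A)": 87000.0,   # average mass in Da, illustrative only
            "HMW-GS (placeholder B)": 88500.0,   # average mass in Da, illustrative only
        }

        def match_masses(measured_masses, tol_da=5.0):
            """Return (measured mass, subunit name, mass error) for hits within tol_da."""
            hits = []
            for m in measured_masses:
                for name, theo in THEORETICAL_MASSES.items():
                    if abs(m - theo) <= tol_da:
                        hits.append((m, name, m - theo))
            return hits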

  11. Identification of intact high molecular weight glutenin subunits from the wheat proteome using combined liquid chromatography-electrospray ionization mass spectrometry.

    Directory of Open Access Journals (Sweden)

    Bert Lagrain

    Full Text Available The present paper describes a method for the identification of intact high molecular weight glutenin subunits (HMW-GS), the quality determining proteins from the wheat storage proteome. The method includes isolation of HMW-GS from wheat flour, further separation of HMW-GS by reversed-phase high-performance liquid chromatography (RP-HPLC), and their subsequent molecular identification with electrospray ionization mass spectrometry using a quadrupole-time-of-flight mass analyzer. For HMW-GS isolation, wheat proteins were reduced and extracted from flour with 50% 1-propanol containing 1% dithiothreitol. HMW-GS were then selectively precipitated from the protein mixture by adjusting the 1-propanol concentration to 60%. The composition of the precipitated proteins was first evaluated by sodium dodecyl sulfate-polyacrylamide gel electrophoresis with Coomassie staining and RP-HPLC with ultraviolet detection. Besides HMW-GS (≥65%), the isolated proteins mainly contained ω5-gliadins. Secondly, the isolated protein fraction was analyzed by liquid chromatography-mass spectrometry. Optimal chromatographic separation of HMW-GS from the other proteins in the isolated fraction was obtained when the mobile phase contained 0.1% trifluoroacetic acid as ion-pairing agent. Individual HMW-GS were then identified by determining their molecular masses from the high-resolution mass spectra and comparing these with theoretical masses calculated from amino acid sequences. Using formic acid instead of trifluoroacetic acid in the mobile phase increased protein peak intensities in the base peak mass chromatogram. This allowed the detection of even traces of other wheat proteins than HMW-GS in the isolated fraction, but the chromatographic separation was inferior with a major overlap between the elution ranges of HMW-GS and ω-gliadins. Overall, the described method allows a rapid assessment of wheat quality through the direct determination of the HMW-GS composition and

  12. Non-targeted volatile profiles for the classification of the botanical origin of Chinese honey by solid-phase microextraction and gas chromatography-mass spectrometry combined with chemometrics.

    Science.gov (United States)

    Chen, Hui; Jin, Linghe; Fan, Chunlin; Wang, Wenwen

    2017-11-01

    A potential method for the discrimination and prediction of honey samples of various botanical origins was developed based on the non-targeted volatile profiles obtained by solid-phase microextraction with gas chromatography and mass spectrometry combined with chemometrics. The blind analysis of non-targeted volatile profiles was carried out using solid-phase microextraction with gas chromatography and mass spectrometry for 87 authentic honey samples from four botanical origins (acacia, linden, vitex, and rape). The number of variables was reduced from 2734 to 70 by using a series of filters. Based on the optimized 70 variables, 79.12% of the variance was explained by the first four principal components. Partial least squares discriminant analysis, naïve Bayes analysis, and back-propagation artificial neural network were used to develop the classification and prediction models. The 100% accuracy revealed a perfect classification of the botanical origins. In addition, the reliability and practicability of the models were validated by an independent set of additional 20 authentic honey samples. All 20 samples were accurately classified. The confidence measures indicated that the performance of the naïve Bayes model was better than the other two models. Finally, the characteristic volatile compounds of linden honey were tentatively identified. The proposed method is reliable and accurate for the classification of honey of various botanical origins. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
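
    The chemometric step (principal component analysis followed by supervised classification) could look roughly like the sketch below, written with scikit-learn. The feature matrix X (samples x volatile variables) and labels y are assumed inputs; the sketch is not the authors' code and uses only one of the three classifiers mentioned in the record.

        from sklearn.decomposition import PCA
        from sklearn.naive_bayes import GaussianNB
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        def classify_honey(X, y):
            """Hedged sketch: PCA overview plus cross-validated naive Bayes classification."""
            pca = PCA(n_components=4).fit(StandardScaler().fit_transform(X))
            print("variance explained by PC1-PC4:", pca.explained_variance_ratio_.sum())
            model = make_pipeline(StandardScaler(), GaussianNB())
            accuracy = cross_val_score(model, X, y, cv=5).mean()
            return model.fit(X, y), accuracy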

  13. Correct primary structure assessment and extensive glyco-profiling of cetuximab by a combination of intact, middle-up, middle-down and bottom-up ESI and MALDI mass spectrometry techniques.

    Science.gov (United States)

    Ayoub, Daniel; Jabs, Wolfgang; Resemann, Anja; Evers, Waltraud; Evans, Catherine; Main, Laura; Baessmann, Carsten; Wagner-Rousset, Elsa; Suckau, Detlev; Beck, Alain

    2013-01-01

    The European Medicines Agency received recently the first marketing authorization application for a biosimilar monoclonal antibody (mAb) and adopted the final guidelines on biosimilar mAbs and Fc-fusion proteins. The agency requires high similarity between biosimilar and reference products for approval. Specifically, the amino acid sequences must be identical. The glycosylation pattern of the antibody is also often considered to be a very important quality attribute due to its strong effect on quality, safety, immunogenicity, pharmacokinetics and potency. Here, we describe a case study of cetuximab, which has been marketed since 2004. Biosimilar versions of the product are now in the pipelines of numerous therapeutic antibody biosimilar developers. We applied a combination of intact, middle-down, middle-up and bottom-up electrospray ionization and matrix assisted laser desorption ionization mass spectrometry techniques to characterize the amino acid sequence and major post-translational modifications of the marketed cetuximab product, with special emphasis on glycosylation. Our results revealed a sequence error in the reported sequence of the light chain in databases and in publications, thus highlighting the potency of mass spectrometry to establish correct antibody sequences. We were also able to achieve a comprehensive identification of cetuximab's glycoforms and glycosylation profile assessment on both Fab and Fc domains. Taken together, the reported approaches and data form a solid framework for the comparability of antibodies and their biosimilar candidates that could be further applied to routine structural assessments of these and other antibody-based products.

  14. Visual Workflows for Oil and Gas Exploration

    KAUST Repository

    Hollt, Thomas

    2013-04-14

    The most important resources to fulfill today’s energy demands are fossil fuels, such as oil and natural gas. When exploiting hydrocarbon reservoirs, a detailed and credible model of the subsurface structures is crucial for planning the path of the borehole and minimizing economic and ecological risks. Before that, the placement as well as the operations of oil rigs need to be planned carefully, as off-shore oil exploration is vulnerable to hazards caused by strong currents. The oil and gas industry therefore relies on accurate ocean forecasting systems for planning their operations. This thesis presents visual workflows for creating subsurface models as well as planning the placement and operations of off-shore structures. Creating a credible subsurface model poses two major challenges: first, the structures in highly ambiguous seismic data are interpreted in the time domain; second, a velocity model has to be built from this interpretation to match the model to depth measurements from wells. If it is not possible to obtain a match at all positions, the interpretation has to be updated, going back to the first step. This results in a lengthy back and forth between the different steps or, in many cases, in an unphysical velocity model. We present a novel, integrated approach to interactively creating subsurface models from reflection seismics, by combining the interpretation of the seismic data, using an interactive horizon extraction technique based on piecewise global optimization, with velocity modeling. Computing and visualizing, on the fly, the effects of changes to the interpretation and velocity model on the depth-converted model enables an integrated feedback loop that creates a completely new connection between the seismic data in the time domain and well data in the depth domain. For planning the operations of off-shore structures we present a novel integrated visualization system that enables interactive visual analysis of ensemble simulations used in ocean

  15. Automated Processing Workflow for Ambient Seismic Recordings

    Science.gov (United States)

    Girard, A. J.; Shragge, J.

    2017-12-01

    Structural imaging using body-wave energy present in ambient seismic data remains a challenging task, largely because these wave modes are commonly much weaker than surface wave energy. In a number of situations body-wave energy has been extracted successfully; however, (nearly) all successful body-wave extraction and imaging approaches have focused on cross-correlation processing. While this is useful for interferometric purposes, it can also lead to the inclusion of unwanted noise events that dominate the resulting stack, leaving body-wave energy overpowered by the coherent noise. Conversely, wave-equation imaging can be applied directly on non-correlated ambient data that has been preprocessed to mitigate unwanted energy (i.e., surface waves, burst-like and electromechanical noise) to enhance body-wave arrivals. Following this approach, though, requires a significant preprocessing effort on often Terabytes of ambient seismic data, which is expensive and requires automation to be a feasible approach. In this work we outline an automated processing workflow designed to optimize body wave energy from an ambient seismic data set acquired on a large-N array at a mine site near Lalor Lake, Manitoba, Canada. We show that processing ambient seismic data in the recording domain, rather than the cross-correlation domain, allows us to mitigate energy that is inappropriate for body-wave imaging. We first develop a method for window selection that automatically identifies and removes data contaminated by coherent high-energy bursts. We then apply time- and frequency-domain debursting techniques to mitigate the effects of remaining strong amplitude and/or monochromatic energy without severely degrading the overall waveforms. After each processing step we implement a QC check to investigate improvements in the convergence rates - and the emergence of reflection events - in the cross-correlation plus stack waveforms over hour-long windows. Overall, the QC analyses suggest that
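
    The burst-rejection idea in this record, flagging recording windows dominated by high-amplitude transients, can be pictured with the short NumPy sketch below. The window length and threshold factor are illustrative assumptions, not parameters from the study.

        import numpy as np

        def flag_burst_windows(trace, fs, win_s=60.0, k=5.0):
            """Hedged sketch: mark fixed-length windows whose RMS amplitude exceeds
            k times the median window RMS. trace: 1-D sample array, fs: rate in Hz."""
            trace = np.asarray(trace, dtype=float)
            n = int(win_s * fs)
            n_win = len(trace) // n
            rms = np.array([np.sqrt(np.mean(trace[i * n:(i + 1) * n] ** 2))
                            for i in range(n_win)])
            return rms > k * np.median(rms)   # True = window to discard before stacking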

  16. Combination of flame atomic absorption spectrometry with ligandless-dispersive liquid- liquid microextraction for preconcentration and determination of trace amount of lead in water samples

    Directory of Open Access Journals (Sweden)

    Y.M. Baghelani

    2013-05-01

    Full Text Available A new ligandless-dispersive liquid–liquid microextraction method has been developed for the separation and flame atomic absorption spectrometric determination of trace amounts of lead(II) ion. In the proposed approach, 1,2-dichlorobenzene and ethanol were used as extraction and dispersive solvents. Factors influencing the extraction efficiency of lead, including the extraction and dispersive solvent type and volume, pH of the sample solution, concentration of chloride and extraction time, were studied. Under the optimal conditions, the calibration curve was linear in the range of 7.0–6000 ng mL−1 of lead with R2 = 0.9992 (n = 10), and the detection limit based on three times the standard deviation of the blank (3Sb) was 0.5 ng mL−1 in the original solution. The relative standard deviation for eight replicate determinations of 1.0 mg mL-1 lead was ±1.6%. The high efficiency of dispersive liquid-liquid microextraction for the determination of trace amounts of lead in complex matrices was demonstrated. The proposed method has been applied for the determination of trace amounts of lead in water samples and satisfactory results were obtained. The accuracy was checked by analyzing a certified reference material from the National Institute of Standards and Technology, Trace Elements in Water (NIST CRM 1643e).
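
    The detection-limit criterion quoted in this record (three times the standard deviation of the blank) is simple enough to show directly. The numbers in the example below are illustrative only, not values from the study.

        import numpy as np

        def detection_limit_3sb(blank_signals, slope):
            """Hedged sketch of the 3Sb criterion: LOD = 3 * SD(blank) / calibration slope."""
            return 3.0 * np.std(blank_signals, ddof=1) / slope

        # Illustrative blank absorbances and a slope in absorbance per ng mL-1 (assumed).
        print(detection_limit_3sb([0.0010, 0.0012, 0.0009, 0.0011], slope=0.0007))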

  17. Simultaneous determination of kasugamycin and validamycin-A residues in cereals by consecutive solid-phase extraction combined with liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Zhang, Hong; Wang, Chenchen; Li, Huidong; Nie, Yan; Fang, Liping; Chen, Zilei

    2018-03-01

    Two polar aminoglycosides, kasugamycin and validamycin-A, were determined in cereals (brown rice, wheat and corn) by high-performance liquid chromatography-tandem mass spectrometry. The analytes were extracted from samples using methanol and water (70:30, v/v) at pH 5.5, purified using both a hydrophilic-hydrophobic-balanced cartridge and a strong cation-exchange cartridge, and then analysed using multiple reaction monitoring in positive electrospray ionisation mode with a special ReproSil 100 C18 high-performance liquid chromatography column. This newly proposed method yielded good sensitivity and excellent chromatographic performance. The limits of quantification for kasugamycin and validamycin-A were 4.1 µg/kg and 1.0 µg/kg, respectively. The recoveries for both compounds at three fortification levels (4, 100 and 500 µg/kg for kasugamycin; 1, 10 and 100 µg/kg for validamycin-A) ranged from 75% to 110%, and the relative standard deviations were below 15%.

  18. Anaerobic digestion of solid slaughterhouse waste: study of biological stabilization by Fourier Transform infrared spectroscopy and thermogravimetry combined with mass spectrometry.

    Science.gov (United States)

    Cuetos, María José; Gómez, Xiomar; Otero, Marta; Morán, Antonio

    2010-07-01

    In this paper, Fourier transform infrared spectroscopy (FTIR) and thermogravimetric analysis coupled with mass spectrometry (TG-MS) were employed to study the transformation of organic matter during anaerobic digestion of slaughterhouse waste and to establish the stability of the digestates obtained compared with the fresh wastes. The digestate samples studied were obtained from successful and failed digestion systems treating slaughterhouse waste and the organic fraction of municipal solid waste. The FTIR spectra and TG profiles of well-stabilized products (from successful digestion systems) showed an increase in the degree of aromaticity and a reduction of volatile content and aliphatic structures as stabilization proceeded. In contrast, the FTIR spectra of non-stable reactors showed a high degree of aliphaticity and a high fat content. When comparing differential thermogravimetry (DTG) profiles of the feed and of digestate samples from all successful anaerobic systems, a reduction in the intensity of the low-temperature peak (approximately 300 degrees C) was observed, while the weight loss at high temperature (450-550 degrees C) varied between systems. Compared to the original waste, the intensity of the high-temperature weight-loss peak decreased in the reactors with higher hydraulic retention time (HRT), whereas it increased and the peak shifted to higher temperatures in the digesters with lower HRT.

  19. Screening of HIV-1 Protease Using a Combination of an Ultra-High-Throughput Fluorescent-Based Assay and RapidFire Mass Spectrometry.

    Science.gov (United States)

    Meng, Juncai; Lai, Ming-Tain; Munshi, Vandna; Grobler, Jay; McCauley, John; Zuck, Paul; Johnson, Eric N; Uebele, Victor N; Hermes, Jeffrey D; Adam, Gregory C

    2015-06-01

    HIV-1 protease (PR) represents one of the primary targets for developing antiviral agents for the treatment of HIV-infected patients. To identify novel PR inhibitors, a label-free, high-throughput mass spectrometry (HTMS) assay was developed using the RapidFire platform and applied as an orthogonal assay to confirm hits identified in a fluorescence resonance energy transfer (FRET)-based primary screen of >1 million compounds. For substrate selection, a panel of peptide substrates derived from natural processing sites for PR was evaluated on the RapidFire platform. As a result, KVSLNFPIL, a new substrate measured to have a ~20- and 60-fold improvement in kcat/Km over the frequently used sequences SQNYPIVQ and SQNYPIV, respectively, was identified for the HTMS screen. About 17% of hits from the FRET-based primary screen were confirmed in the HTMS confirmatory assay, including all 304 known PR inhibitors in the set, demonstrating that the HTMS assay is effective at triaging false-positives while capturing true hits. Hence, with a sampling rate of ~7 s per well, the RapidFire HTMS assay enables the high-throughput evaluation of peptide substrates and functions as an efficient tool for hits triage in the discovery of novel PR inhibitors. © 2015 Society for Laboratory Automation and Screening.

  20. The influence of reactive side products on the electrooxidation of methanol--a combined in situ infrared spectroscopy and online mass spectrometry study.

    Science.gov (United States)

    Reichert, R; Schnaidt, J; Jusys, Z; Behm, R J

    2014-07-21

    Aiming at a better understanding of the impact of reaction intermediates and reactive side products on electrocatalytic reactions under conditions characteristic for technical applications, i.e., at high reactant conversions, we have investigated the electrooxidation of methanol on a Pt film electrode in mixtures containing defined concentrations of the reaction intermediates formaldehyde or formic acid. Employing simultaneous in situ infrared spectroscopy and online mass spectrometry in parallel to voltammetric measurements, we examined the effects of the latter molecules on the adlayer build-up and composition and on the formation of volatile reaction products CO2 and methylformate, as well as on the overall reaction rate. To assess the individual contributions of each component, we used isotope labeling techniques, where one of the two C1 components in the mixtures of methanol with either formaldehyde or formic acid was (13)C-labeled. The data reveal pronounced effects of the additional components formaldehyde and formic acid on the reaction, although their concentration was much lower (10%) than that of the main reactant methanol. Most important, the overall Faradaic current responses and the amounts of CO2 formed upon oxidation of the mixtures are always lower than the sums of the contributions from the individual components, indicative of a non-additive behavior of both Faradaic current and CO2 formation in the mixtures. Mechanistic reasons and consequences for reactions in a technical reactor, with high reactant conversion, are discussed.

  1. Determination of ppq-levels of alkylmethoxypyrazines in wine by stirbar sorptive extraction combined with multidimensional gas chromatography-mass spectrometry.

    Science.gov (United States)

    Wen, Yan; Ontañon, Ignacio; Ferreira, Vicente; Lopez, Ricardo

    2018-07-30

    Alkylmethoxypyrazines are powerful odorants in many food products. A new method for analysing 3-isopropyl-2-methoxypyrazine, 3-s-butyl-2-methoxypyrazine and 3-isobutyl-2-methoxypyrazine has been developed and applied to wine. The analytes were extracted from 5 mL of wine using stirbar sorptive extraction followed by thermal desorption and multidimensional gas chromatography-mass spectrometry analysis in a single oven. The extraction conditions were optimized in order to obtain a high recovery of the 3-alkyl-2-methoxypyrazines (MP). The detection limits of the method in all cases were under 0.08 ng/L, well below the olfactory thresholds of these compounds in wine. The reproducibility of the method was adequate (below 10%), the linearity satisfactory and the recoveries in all cases close to 100%. The method has been applied to the analysis of 111 Spanish and French wine samples. The levels found suggest that MP have a low direct impact on the aroma properties of wines from the regions around the Pyrenean massif. Copyright © 2018 The Authors. Published by Elsevier Ltd. All rights reserved.

  2. Determination of acetamiprid, imidacloprid, and spirotetramat and their relevant metabolites in pistachio using modified QuEChERS combined with liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Faraji, Mohammad; Noorbakhsh, Roya; Shafieyan, Hooshang; Ramezani, Mohammadkazem

    2018-02-01

    A QuEChERS-based methodology was developed for the first time for the simultaneous identification and quantification of acetamiprid, imidacloprid, and spirotetramat and their relevant metabolites in pistachio by liquid chromatography-tandem mass spectrometry. First, sample extraction was done with MeCN:citrate buffer:NaHCO3 followed by phase separation with the addition of MgSO4:NaCl. The supernatant was then cleaned with a primary-secondary amine (PSA) sorbent, GCB, and MgSO4. The proposed method provides linearity in the range of 5-200 µg L-1, and the linear regression coefficients were higher than 0.99. The LOD and LOQ were 2 and 5 µg kg-1 for the studied insecticides, respectively, with the exception of imidacloprid-olefin (5 and 10 µg kg-1). Acceptable recoveries (91-110%) were obtained for all the analytes, with good intra- and inter-day precision (0.4% ≤ RSD ≤ 11.0%). The method was then used for pistachio samples collected from a field trial to estimate maximum residue limits (MRLs) in the next step. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Toward a high-throughput method for determining vicine and convicine levels in faba bean seeds using flow injection analysis combined with tandem mass spectrometry.

    Science.gov (United States)

    Purves, Randy W; Khazaei, Hamid; Vandenberg, Albert

    2018-08-01

    Although faba bean provides environmental and health benefits, vicine and convicine (v-c) limit its use as a source of vegetable protein. Crop improvement efforts to minimize v-c concentration require low-cost, rapid screening methods to distinguish between high and low v-c genotypes to accelerate development of new cultivars and to detect out-crossing events. To assist crop breeders, we developed a unique and rapid screening method that uses a 60 s instrumental analysis step to accurately distinguish between high and low v-c genotypes. The method involves flow injection analysis (FIA) coupled with tandem mass spectrometry (i.e., selective reaction monitoring, SRM). Using seeds with known v-c levels as calibrants, measured v-c levels were comparable with liquid chromatography (LC)-SRM results and the method was used to screen 370 faba bean genotypes. Widespread use of FIA-SRM will accelerate breeding of low v-c faba bean, thereby alleviating concerns about anti-nutritional effects of v-c in this crop. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Determination of Muscone in Rats Plasma following Oral Administration of Artificial Musk: Using of Combined Headspace Gas Chromatography-Mass Spectrometry

    Directory of Open Access Journals (Sweden)

    Qibiao Wu

    2014-01-01

    Full Text Available The aim was to develop an analytical method for the determination of plasma concentrations of muscone in rats following oral administration of artificial musk, in order to investigate the pharmacokinetic profile of artificial musk. Plasma samples were pretreated with acetonitrile to precipitate proteins. Headspace injection coupled with gas chromatography-mass spectrometry was used for quantitative analysis of muscone concentrations. A strong linear relationship was obtained for plasma muscone concentrations ranging from 75.6 to 7560 ng·mL−1 (R2 = 0.9998), with the minimum detectable concentration being 25 ng·mL−1. The within-day and inter-day precision for determination of three different concentrations of muscone was favorable (RSD < 25%). The average absolute recovery ranged from 83.7 to 88.6%, with an average relative recovery of 100.5 to 109.8%. The method described was characterized by stability and reliability, and in the present study showed significant specificity and high sensitivity. This method would be applicable to the analysis of plasma concentrations of muscone in preclinical contexts where artificial musk is used.

  5. Fast and easy extraction combined with high resolution-mass spectrometry for residue analysis of two anticonvulsants and their transformation products in marine mussels.

    Science.gov (United States)

    Martínez Bueno, M J; Boillot, C; Fenet, H; Chiron, S; Casellas, C; Gómez, E

    2013-08-30

    Environmental field studies have shown that carbamazepine (Cbz) is one of the most frequently detected human pharmaceuticals in different aquatic compartments. However, little data is available on the detection of this substance and its transformation products in aquatic organisms. This study was thus mainly carried out to optimize and validate a simple and sensitive analytical methodology for the detection, characterization and quantification of Cbz and oxcarbazepine (Ox), two anticonvulsants, and six of their main transformation products in marine mussels (Mytilus galloprovincialis). A modified QuEChERS extraction method followed by analysis with liquid chromatography coupled to high resolution mass spectrometry (HRMS) was used. The analyses were performed using two-stage fragmentation to reveal the different fragmentation pathways that are highly useful for the identification of isomeric compounds, a common problem when several transformation products are analyzed. The developed analytical method allowed determination of the target analytes at the lower ng/g concentration levels. The mean recovery ranged from 67 to 110%. The relative standard deviation was under 11% in the intra-day and 18% in the inter-day analyses, respectively. Finally, the method was applied to marine mussel samples collected from Mediterranean Sea cultures in southeastern France. Residues of the psychiatric drug Cbz were occasionally found at levels up to 3.5 ng/g dw. Lastly, in this study, other non-target compounds, such as caffeine, metoprolol, cotinine and ketoprofen, were identified in the real samples analyzed. Copyright © 2013 Elsevier B.V. All rights reserved.

  6. Rapid extraction combined with LC-tandem mass spectrometry (CREM-LC/MS/MS) for the determination of ciguatoxins in ciguateric fish flesh.

    Science.gov (United States)

    Lewis, Richard J; Yang, Aijun; Jones, Alun

    2009-07-01

    Ciguatera is a significant food borne disease caused by potent polyether toxins known as ciguatoxins, which accumulate in the flesh of ciguateric fish at risk levels above 0.1 ppb. The management of ciguatera has been hindered by the lack of analytical methods to detect and quantify clinically relevant levels of ciguatoxin in easily prepared crude extracts of fish. Here we report a ciguatoxin rapid extraction method (CREM) that allows the rapid preparation of fish flesh extracts for the detection and quantification of ciguatoxin by gradient reversed-phase liquid chromatography-tandem mass spectrometry (LC/MS/MS). CREM-LC/MS/MS delivers a linear response to P-CTX-1 spiked into fish prior to extraction. A similar response was obtained for P-CTX-1 spiked after extraction, indicating >95% extraction efficiency was achieved overall and 85% at the limit of quantification (0.1 ppb). Using this approach, levels ≥0.1 ppb P-CTX-1 could be detected and quantified from an extract of 2 g of fish flesh, making it suitable as a confirmatory assay for suspect ciguateric carnivorous fish in the Pacific Ocean. The approach is designed to simplify the extraction and analysis of multiple samples per day.

  7. Sensitive quantification of coixol, a potent insulin secretagogue, in Scoparia dulcis extract using high-performance liquid chromatography combined with tandem mass spectrometry and UV detection.

    Science.gov (United States)

    Ali, Arslan; Haq, Faraz Ul; Ul Arfeen, Qamar; Sharma, Khaga Raj; Adhikari, Achyut; Musharraf, Syed Ghulam

    2017-10-01

    Diabetes is a major global health problem that requires new studies for its prevention and control. Scoparia dulcis, a herbal product, is widely used for the treatment of diabetes. Recent studies demonstrate coixol as a potent and nontoxic insulin secretagogue from S. dulcis. This study focuses on developing two methods for the quantification of coixol in S. dulcis methanol-based extracts. Quantification of coixol was performed using high-performance liquid chromatography-tandem mass spectrometry (method 1) and high-performance liquid chromatography-ultraviolet detection (method 2) with limits of detection of 0.26 and 11.6 pg/μL, respectively, and limits of quantification of 0.78 and 35.5 pg/μL, respectively. S. dulcis is rich in coixol content with values of 255.5 ± 2.1 mg/kg (method 1) and 220.4 ± 2.9 mg/kg (method 2). Excellent linearity with determination coefficients >0.999 was achieved for calibration curves from 10 to 7500 ng/mL (method 1) and from 175 to 7500 ng/mL (method 2). Good accuracy (bias < -8.6%) and precision (RSD < 8.5%) were obtained for both methods. Thus, they can be employed to analyze coixol in plant extracts and herbal formulations. Copyright © 2017 John Wiley & Sons, Ltd.

  8. Dual-target screening of bioactive components from traditional Chinese medicines by hollow fiber-based ligand fishing combined with liquid chromatography-mass spectrometry.

    Science.gov (United States)

    Chen, Liang; Wang, Xin; Liu, Youping; Di, Xin

    2017-09-05

    A novel strategy was developed for dual-target screening of bioactive components from traditional Chinese medicines (TCMs). This strategy was based on the use of low-cost microporous hollow fibers filled with target enzymes as baits to "fish out" the ligands in TCM extracts, followed by identification of the ligands dissociated from the target-ligand complexes by liquid chromatography-mass spectrometry. Ganjiang Huangqin Huanglian Renshen Decoction (GHHRD), a classical TCM prescription for diabetes treatment, was chosen as a model sample to evaluate the feasibility of the proposed strategy. Three bioactive components were successfully screened out from GHHRD. Coptisine was identified as the ligand of α-glucosidase and baicalin as the ligand of angiotensin-converting enzyme (ACE). Berberine was found to be a dual inhibitor of α-glucosidase and ACE. The results were further verified by enzyme inhibitory assay and molecular docking simulation. The study suggested that our developed strategy would be a powerful tool for screening bioactive components from multi-component and multi-target TCMs. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Simultaneous Determination of Perfluorinated Compounds in Edible Oil by Gel-Permeation Chromatography Combined with Dispersive Solid-Phase Extraction and Liquid Chromatography-Tandem Mass Spectrometry.

    Science.gov (United States)

    Yang, Lili; Jin, Fen; Zhang, Peng; Zhang, Yanxin; Wang, Jian; Shao, Hua; Jin, Maojun; Wang, Shanshan; Zheng, Lufei; Wang, Jing

    2015-09-30

    A simple analytical method was developed for the simultaneous analysis of 18 perfluorinated compounds (PFCs) in edible oil. The target compounds were extracted with acetonitrile, purified by gel permeation chromatography (GPC) and dispersive solid-phase extraction (DSPE) using graphitized carbon black (GCB) and octadecyl (C18) sorbents, and analyzed by liquid chromatography-electrospray ionization tandem mass spectrometry (LC-ESI-MS/MS) in negative ion mode. Recovery studies were performed at three fortification levels. The average recoveries of all target PFCs ranged from 60 to 129%, with acceptable relative standard deviations (RSD, 1-20%, n = 3). The method detection limits (MDLs) ranged from 0.004 to 0.4 μg/kg, a significant improvement over the existing liquid-liquid extraction and cleanup method. The method was successfully applied to the analysis of all target PFCs in edible oil samples collected from markets in Beijing, China, and the results revealed that C6-C10 perfluorocarboxylic acids (PFCAs) and C7 perfluorosulfonic acid (PFSA) were the major PFCs detected in the oil samples.

  10. Determination of Fusarium toxins in functional vegetable milks applying salting-out-assisted liquid-liquid extraction combined with ultra-high-performance liquid chromatography tandem mass spectrometry.

    Science.gov (United States)

    Hamed, Ahmed M; Arroyo-Manzanares, Natalia; García-Campaña, Ana M; Gámiz-Gracia, Laura

    2017-11-01

    Vegetable milks are considered functional foods due to their physiological benefits. Although the consumption of these products has increased significantly, they have received little attention in legislation with regard to contaminants. However, they may contain mycotoxins resulting from the use of contaminated raw materials. In this work, ultra-high-performance liquid chromatography tandem mass spectrometry has been proposed for the determination of the most relevant Fusarium toxins (fumonisins B1 and B2, HT-2 and T-2 toxins, zearalenone, deoxynivalenol and fusarenon-X) in different functional beverages based on cereals, legumes and seeds. Sample treatment consisted of a simple salting-out-assisted liquid-liquid extraction with no further clean-up. The method provided limits of quantification between 3.2 and 57.7 µg L⁻¹, recoveries above 80% and precision with RSD lower than 12%. The method was also applied to study the occurrence of these mycotoxins in market samples of vegetable functional beverages, and deoxynivalenol was found in three oat-based commercial drinks.

  11. Combination of solvent extractants for dispersive liquid-liquid microextraction of fungicides from water and fruit samples by liquid chromatography with tandem mass spectrometry.

    Science.gov (United States)

    Pastor-Belda, Marta; Garrido, Isabel; Campillo, Natalia; Viñas, Pilar; Hellín, Pilar; Flores, Pilar; Fenoll, José

    2017-10-15

    A multiresidue method was developed to determine twenty-five fungicides belonging to three different chemical families, oxazoles, strobilurins and triazoles, in water and fruit samples, using dispersive liquid-liquid microextraction (DLLME) and liquid chromatography/tandem mass spectrometry (LC-MS²). Solid-liquid extraction with acetonitrile was used for the analysis of fruits, the extract being used as the dispersant solvent in DLLME. Since some of the analytes showed high affinity for chloroform and the others were more efficiently extracted with undecanol, a mixture of both solvents was used as the extractant in DLLME. After evaporation of CHCl₃, the enriched phase was analyzed. Enrichment factors in the ranges 23-119 and 12-60 were obtained for waters and fruits, respectively. The approach was most sensitive for metominostrobin, with limits of quantification of 1 ng L⁻¹ and 5 ng kg⁻¹ in waters and fruits, respectively, while a similar sensitivity was attained for tebuconazole in fruits. Recoveries of the fungicides varied between 86 and 116%. Copyright © 2017 Elsevier Ltd. All rights reserved.
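
    The enrichment factor reported for a microextraction such as this one is commonly taken as the ratio of the analyte concentration in the enriched (extractant) phase to its initial concentration in the sample, or equivalently as the ratio of calibration slopes with and without preconcentration. A minimal sketch with invented concentrations follows.

        # Illustrative DLLME enrichment-factor calculation (values invented;
        # the paper reports factors of 23-119 for waters and 12-60 for fruits).
        def enrichment_factor(conc_in_extract, conc_in_sample):
            """EF = analyte concentration in the enriched phase divided by its
            initial concentration in the sample."""
            return conc_in_extract / conc_in_sample

        print(enrichment_factor(conc_in_extract=5.8, conc_in_sample=0.05))  # -> 116.0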

  12. Quantitation of deuterated and non-deuterated phenylalanine and tyrosine in human plasma using the selective ion monitoring method with combined gas chromatography-mass spectrometry

    International Nuclear Information System (INIS)

    Zagalak, M.-J.; Curtius, H.-Ch.; Leimbacher, W.; Redweik, U.

    1977-01-01

    A specific method is described for the quantitative analysis of deuterated and non-deuterated phenylalanine and tyrosine in human plasma by gas chromatography-mass spectrometry using selective ion monitoring. Of the several derivatives investigated, the N- or N,O-trifluoroacetyl methyl esters were found to be the most suitable for our purposes. DL-Phenylalanine-4-d1 and L-tyrosine-d7 were used as internal standards. The sensitivity of this method permits the measurement of amounts as small as ca. 2.5 ng/mL in plasma for both phenylalanine and tyrosine. The coefficients of variation were found to be ca. 1.6% (n=12) for phenylalanine and 3.0% (n=12) for tyrosine. Using this method, an in vivo determination of phenylalanine-4-monooxygenase activity in humans is possible by loading the subjects with deuterated L-phenylalanine-d5 (accepted as a substrate by phenylalanine-4-monooxygenase, E.C. 1.14.16.1) and subsequently measuring the deuterated L-tyrosine-d4 formed and the residual L-phenylalanine-d5.
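
    In selective ion monitoring with stable-isotope internal standards, the analyte is usually quantified from the ratio of its ion abundance to that of the co-eluting deuterated standard added at a known amount. The sketch below shows that arithmetic with invented ion abundances and an assumed response factor of 1; it illustrates the principle only and is not the paper's calibration.

        # Minimal internal-standard (isotope dilution) quantitation sketch.
        # Ion abundances and the response factor are invented for illustration.
        def quantify(area_analyte, area_internal_std, amount_internal_std,
                     response_factor=1.0):
            """Analyte amount = (analyte/IS abundance ratio) * IS amount / response factor."""
            return (area_analyte / area_internal_std) * amount_internal_std / response_factor

        # Phenylalanine measured against DL-phenylalanine-4-d1 added to plasma
        phe = quantify(area_analyte=8.2e5, area_internal_std=7.9e5,
                       amount_internal_std=50.0)   # ng/mL of internal standard added
        print(f"phenylalanine ~ {phe:.1f} ng/mL")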

  13. Determination of Pesticides by Gas Chromatography Combined with Mass Spectrometry Using Femtosecond Lasers Emitting at 267, 400, and 800 nm as the Ionization Source.

    Science.gov (United States)

    Yang, Xixiang; Imasaka, Tomoko; Imasaka, Totaro

    2018-04-03

    A standard sample mixture containing 51 pesticides was separated by gas chromatography (GC), and the constituents were identified by mass spectrometry (MS) using femtosecond lasers emitting at 267, 400, and 800 nm as the ionization source. A two-dimensional display of the GC/MS data was successfully used for the determination of these compounds. A molecular ion was observed for 38 of the compounds at 267 nm and for 30 of the compounds at 800 nm, in contrast to 27 of 50 compounds when electron ionization was used. These results suggest that the ultraviolet laser is superior to the near-infrared laser for molecular weight determinations and for a more reliable analysis of these compounds. In order to study the conditions for optimal ionization, the experimental data were examined using the spectral properties (i.e., the excitation and ionization energies and absorption spectra for the neutral and ionized species) obtained by quantum chemical calculations. A few molecules remained unexplained by the currently reported rules, requiring additional rules for developing a full understanding of the femtosecond ionization process. The pesticides in a homogenized matrix obtained from kabosu (Citrus sphaerocarpa) were measured using the lasers emitting at 267 and 800 nm. The pesticides were clearly separated and measured on the two-dimensional display, especially in the data measured at 267 nm, suggesting that this technique has potential for use in the practical trace analysis of pesticides in the environment.

  14. Speciation of Mn(II), Mn(VII) and total manganese in water and food samples by coprecipitation-atomic absorption spectrometry combination

    International Nuclear Information System (INIS)

    Citak, Demirhan; Tuzen, Mustafa; Soylak, Mustafa

    2010-01-01

    A speciation procedure based on the coprecipitation of manganese(II) with zirconium(IV) hydroxide has been developed for the investigation of the levels of manganese species. The determination of manganese levels was performed by flame atomic absorption spectrometry (FAAS). Total manganese was determined after the reduction of Mn(VII) to Mn(II) by ascorbic acid. The analytical parameters, including pH, amount of zirconium(IV), sample volume, etc., were investigated for the quantitative recovery of manganese(II). The effects of matrix ions were also examined. The recoveries for manganese(II) were in the range of 95-98%. The preconcentration factor was calculated as 50. The detection limit for the analyte ions based on 3σ (n = 21) was 0.75 μg L⁻¹ for Mn(II). The relative standard deviation was found to be lower than 7%. The validation of the presented procedure was performed by the analysis of certified reference materials with different matrices, NIST SRM 1515 (Apple Leaves) and NIST SRM 1568a (Rice Flour). The procedure was successfully applied to natural waters and food samples.
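
    The speciation itself is obtained by difference: Mn(II) is determined directly after coprecipitation, total manganese is determined after reducing Mn(VII) to Mn(II) with ascorbic acid, and Mn(VII) is the difference between the two. A tiny sketch of that bookkeeping, with invented concentrations:

        # Speciation by difference (concentrations invented, in ug/L).
        mn_ii = 3.4        # Mn(II) determined directly after coprecipitation
        mn_total = 5.1     # total Mn after reduction of Mn(VII) to Mn(II)
        mn_vii = mn_total - mn_ii
        print(f"Mn(VII) = {mn_vii:.1f} ug/L")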

  15. [Determination of trace lead and cadmium in transgenic rice by crosslinked carboxymethyl konjac glucomannan microcolumn preconcentration combined with graphite furnace atomic absorption spectrometry].

    Science.gov (United States)

    Liu, Hua-qing; Li, Sheng-qing; Qu, Yang; Chen, Hao

    2012-02-01

    A novel method was developed for the determination of trace lead and cadmium in transgenic brown rice based on separation and preconcentration with a microcolumn packed with crosslinked carboxymethyl konjac glucomannan (CCMKGM) prior to determination by graphite furnace atomic absorption spectrometry. Variables affecting the separation and preconcentration of lead and cadmium, such as the acidity of the aqueous solution, sample flow rate and volume, and eluent concentration and volume, were optimized. Under the optimized conditions, the detection limits of the method for trace lead and cadmium in transgenic brown rice were 0.11 and 0.002 μg L⁻¹, respectively. The results obtained for lead and cadmium in the certified reference materials (GBW10010, GBS1-1) were in good agreement with the certified values. The recoveries were in the range of 90%-103% and 93%-105% for Pb and Cd, respectively, in transgenic brown rice and wild-type brown rice samples. This study could provide technical support for the determination of trace Pb and Cd in transgenic rice.

  16. Improving adherence to the Epic Beacon ambulatory workflow.

    Science.gov (United States)

    Chackunkal, Ellen; Dhanapal Vogel, Vishnuprabha; Grycki, Meredith; Kostoff, Diana

    2017-06-01

    Computerized physician order entry has been shown to significantly improve chemotherapy safety by reducing the number of prescribing errors. Epic's Beacon Oncology Information System for computerized physician order entry and electronic medication administration was implemented in Henry Ford Health System's ambulatory oncology infusion centers on 9 November 2013. Since that time, compliance with the infusion workflow had not been assessed. The objective of this study was to optimize the current workflow and improve compliance with this workflow in the ambulatory oncology setting. This was a retrospective, quasi-experimental study which analyzed the composite workflow compliance rate of patient encounters from 9 to 23 November 2014. Based on this analysis, an intervention was identified and implemented in February 2015 to improve workflow compliance. The primary endpoint was to compare the composite compliance rate with the Beacon workflow before and after a pharmacy-initiated intervention. The intervention, education of infusion center staff, was initiated by ambulatory-based oncology pharmacists and implemented by a multidisciplinary team of pharmacists and nurses. The composite compliance rate was then reassessed for patient encounters from 2 to 13 March 2015 in order to analyze the effect of the intervention on compliance. The initial analysis in November 2014 revealed a composite compliance rate of 38%, and data analysis after the intervention revealed a statistically significant increase in the composite compliance rate to 83% (p < 0.001). This study supports that a pharmacist-initiated educational intervention can improve compliance with an ambulatory oncology infusion workflow.
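
    The 38% to 83% change would typically be tested as a difference of two proportions. The sketch below runs a two-proportion z-test in plain Python; the abstract does not report the number of encounters in each period, so the counts used here are placeholders, not the study's data.

        # Two-proportion z-test sketch for the pre/post compliance rates.
        # Encounter counts are placeholders (the abstract gives only 38% vs 83%).
        import math

        def two_proportion_z(x1, n1, x2, n2):
            p1, p2 = x1 / n1, x2 / n2
            p = (x1 + x2) / (n1 + n2)                       # pooled proportion
            se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
            z = (p2 - p1) / se
            p_value = math.erfc(abs(z) / math.sqrt(2))      # two-sided p-value
            return z, p_value

        z, p = two_proportion_z(x1=38, n1=100, x2=83, n2=100)   # placeholder counts
        print(f"z = {z:.2f}, p = {p:.2g}")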

  17. Decaf: Decoupled Dataflows for In Situ High-Performance Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Dreher, M.; Peterka, T.

    2017-07-31

    Decaf is a dataflow system for the parallel communication of coupled tasks in an HPC workflow. The dataflow can perform arbitrary data transformations, ranging from simply forwarding data to complex data redistribution. Decaf does this by allowing the user to allocate resources and execute custom code in the dataflow. All communication through the dataflow is efficient parallel message passing over MPI. The runtime for calling tasks is entirely message-driven; Decaf executes a task when all messages for the task have been received. Such a message-driven runtime allows cyclic task dependencies in the workflow graph, for example, to enact computational steering based on the result of downstream tasks. Decaf includes a simple Python API for describing the workflow graph. This allows Decaf to stand alone as a complete workflow system, but Decaf can also be used as the dataflow layer by one or more other workflow systems to form a heterogeneous task-based computing environment. In one experiment, we couple a molecular dynamics code with a visualization tool using the FlowVR and Damaris workflow systems and Decaf for the dataflow. In another experiment, we test the coupling of a cosmology code with Voronoi tessellation and density estimation codes using MPI for the simulation, the DIY programming model for the two analysis codes, and Decaf for the dataflow. Such workflows consisting of heterogeneous software infrastructures exist because components are developed separately with different programming models and runtimes, and this is the first time that such heterogeneous coupling of diverse components has been demonstrated in situ on HPC systems.
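
    To make the idea of a graph description with a cyclic steering edge concrete, the sketch below encodes a three-task workflow as plain Python data and lists each task's inbound links. This is not the actual Decaf Python API; it only illustrates the kind of graph a message-driven dataflow layer consumes.

        # Generic sketch of a coupled-task workflow graph with a cyclic steering
        # edge. NOT the real Decaf API; task names and fields are illustrative.
        workflow = {
            "tasks": {
                "simulation":    {"procs": 512, "exec": "md_sim"},
                "visualization": {"procs": 32,  "exec": "viz_tool"},
                "steering":      {"procs": 1,   "exec": "steer"},
            },
            "dataflows": [
                # producer -> consumer, with an optional transformation in between
                {"from": "simulation",    "to": "visualization", "transform": "redistribute"},
                {"from": "visualization", "to": "steering",      "transform": "forward"},
                # cyclic edge: steering results flow back to the simulation
                {"from": "steering",      "to": "simulation",    "transform": "forward"},
            ],
        }

        # A message-driven runtime would fire a task once all its inbound
        # messages have arrived; here we simply list each task's inputs.
        for name in workflow["tasks"]:
            inputs = [d["from"] for d in workflow["dataflows"] if d["to"] == name]
            print(f"{name}: waits on {inputs}")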

  18. Web-video-mining-supported workflow modeling for laparoscopic surgeries.

    Science.gov (United States)

    Liu, Rui; Zhang, Xiaoli; Zhang, Hao

    2016-11-01

    As quality assurance is of strong concern in advanced surgeries, intelligent surgical systems are expected to have knowledge such as the surgical workflow model (SWM) to support their intuitive cooperation with surgeons. Generating a robust and reliable SWM requires a large amount of training data. However, training data collected by physically recording surgical operations is often limited, and data collection is time-consuming and labor-intensive, severely limiting the knowledge scalability of surgical systems. The objective of this research is to solve the knowledge scalability problem in surgical workflow modeling in a low-cost and labor-efficient way. A novel web-video-mining-supported surgical workflow modeling (webSWM) method is developed. A novel video quality analysis method based on topic analysis and sentiment analysis techniques is developed to select high-quality videos from abundant and noisy web videos. A statistical learning method is then used to build the workflow model based on the selected videos. To test the effectiveness of the webSWM method, 250 web videos were mined to generate a surgical workflow for robotic cholecystectomy. The generated workflow was evaluated against 4 web-retrieved videos and 4 operating-room-recorded videos. The evaluation results (video selection consistency n-index ≥0.60; surgical workflow matching degree ≥0.84) proved the effectiveness of the webSWM method in generating robust and reliable SWM knowledge by mining web videos. With the webSWM method, abundant web videos were selected and a reliable SWM was modeled in a short time with low labor cost. Satisfactory performance in mining web videos and learning surgery-related knowledge shows that the webSWM method is promising for scaling knowledge for intelligent surgical systems. Copyright © 2016 Elsevier B.V. All rights reserved.
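
    The abstract does not name the statistical learning method used to build the SWM, so the sketch below shows only one simple possibility: estimating a first-order Markov transition matrix over surgical task labels from mined task sequences. The task names and sequences are invented for illustration.

        # Illustrative workflow model: first-order Markov transitions estimated
        # from task sequences. The paper does not specify its learning method;
        # this is only one simple option, and the sequences are invented.
        from collections import Counter, defaultdict

        sequences = [
            ["port placement", "dissection", "clipping", "gallbladder removal", "closure"],
            ["port placement", "dissection", "clipping", "dissection", "gallbladder removal", "closure"],
        ]

        counts = defaultdict(Counter)
        for seq in sequences:
            for current, nxt in zip(seq, seq[1:]):
                counts[current][nxt] += 1

        transition = {
            task: {nxt: c / sum(followers.values()) for nxt, c in followers.items()}
            for task, followers in counts.items()
        }
        print(transition["dissection"])   # e.g. {'clipping': 0.67, 'gallbladder removal': 0.33} (approx.)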

  19. The myth of standardized workflow in primary care.

    Science.gov (United States)

    Holman, G Talley; Beasley, John W; Karsh, Ben-Tzion; Stone, Jamie A; Smith, Paul D; Wetterneck, Tosha B

    2016-01-01

    Primary care efficiency and quality are essential for the nation's health. The demands on primary care physicians (PCPs) are increasing as healthcare becomes more complex. A more complete understanding of PCP workflow variation is needed to guide future healthcare redesigns. This analysis evaluates workflow variation in terms of the sequence of tasks performed during patient visits. Two patient visits from 10 PCPs from 10 different United States Midwestern primary care clinics were analyzed to determine physician workflow. Tasks and the progressive sequence of those tasks were observed, documented, and coded by task category using a PCP task list. Variations in the sequence and prevalence of tasks at each stage of the primary care visit were assessed considering the physician, the patient, the visit's progression, and the presence of an electronic health record (EHR) at the clinic. PCP workflow during patient visits varies significantly, even for an individual physician, with no single or even common workflow pattern being present. The prevalence of specific tasks shifts significantly as primary care visits progress to their conclusion but, notably, PCPs collect patient information throughout the visit. PCP workflows were unpredictable during face-to-face patient visits. Workflow emerges as the result of a "dance" between physician and patient as their separate agendas are addressed, a side effect of patient-centered practice. Future healthcare redesigns should support a wide variety of task sequences to deliver high-quality primary care. The development of tools such as electronic health records must be based on the realities of primary care visits if they are to successfully support a PCP's mental and physical work, resulting in effective, safe, and efficient primary care. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  20. Exploring Dental Providers’ Workflow in an Electronic Dental Record Environment

    Science.gov (United States)

    Schwei, Kelsey M; Cooper, Ryan; Mahnke, Andrea N.; Ye, Zhan

    2016-01-01

    Background: A workflow is defined as a predefined set of work steps and the partial ordering of these steps in any environment to achieve the expected outcome. Few studies have investigated the workflow of providers in a dental office. It is important to understand the interaction of dental providers with existing technologies at the point of care to assess breakdowns in the workflow, which could contribute to better technology designs. Objective: The study objective was to assess electronic dental record (EDR) workflows using time and motion methodology in order to identify breakdowns and opportunities for process improvement. Methods: A time and motion methodology was used to study the human-computer interaction and workflow of dental providers with an EDR in four dental centers at a large healthcare organization. A data collection tool was developed to capture the workflow of dental providers and staff while they interacted with an EDR during initial, planned, and emergency patient visits, and at the front desk. Qualitative and quantitative analyses were conducted on the observational data. Results: Breakdowns in workflow were identified while posting charges, viewing radiographs, e-prescribing, and interacting with the patient scheduler. EDR interaction time was significantly different between dentists and dental assistants (6:20 min vs. 10:57 min, p = 0.013) and between dentists and dental hygienists (6:20 min vs. 9:36 min, p = 0.003). Conclusions: On average, a dentist spent far less time than dental assistants and dental hygienists in data recording within the EDR. PMID:27437058

  1. Imaging mass spectrometry statistical analysis.

    Science.gov (United States)

    Jones, Emrys A; Deininger, Sören-Oliver; Hogendoorn, Pancras C W; Deelder, André M; McDonnell, Liam A

    2012-08-30

    Imaging mass spectrometry is increasingly used to identify new candidate biomarkers. This clinical application of imaging mass spectrometry is highly multidisciplinary: expertise in mass spectrometry is necessary to acquire high-quality data, histology is required to accurately label the origin of each pixel's mass spectrum, disease biology is necessary to understand the potential meaning of the imaging mass spectrometry results, and statistics is needed to assess the confidence of any findings. Imaging mass spectrometry data analysis is further complicated by the unique nature of the data (within the mass spectrometry field); several of the assumptions implicit in the analysis of LC-MS/profiling datasets are not applicable to imaging. The very large size of imaging datasets and the wide range of reported data analysis routines, combined with a lack of adequate training and accessible reviews, have exacerbated this problem. In this paper we provide an accessible review of the nature of imaging data and the different strategies by which the data may be analyzed. Particular attention is paid to the assumptions of the data analysis routines to ensure that the reader is apprised of their correct usage in imaging mass spectrometry research. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. A rapid and practical strategy for the determination of platinum, palladium, ruthenium, rhodium, iridium and gold in large amounts of ultrabasic rock by inductively coupled plasma optical emission spectrometry combined with ultrasound extraction

    Science.gov (United States)

    Zhang, Gai; Tian, Min

    2015-04-01

    A method is proposed, for the first time, for the determination of platinum, palladium, ruthenium, rhodium, iridium and gold in platinum-group ores by nickel sulfide fire assay and inductively coupled plasma optical emission spectrometry (ICP-OES) combined with ultrasound extraction. The limits of quantification were 0.013-0.023 μg/g. The samples were fused to separate the platinum-group elements from the matrix. The nickel sulfide button was then dissolved with hydrochloric acid, and the insoluble platinum-group sulfide residue was dissolved with aqua regia in an ultrasound bath and finally determined by ICP-OES. The proposed method has been applied to the determination of platinum-group elements and gold in large amounts of ultrabasic rocks from the Great Dyke of Zimbabwe.

  3. Combined discrete nebulization and microextraction process for molybdenum determination by flame atomic absorption spectrometry (FAAS); Avaliacao da combinacao da nebulizacao discreta e processos de microextracao aplicados a determinacao de molibdenio por espectrometria de absorcao atomica com chama (FAAS)

    Energy Technology Data Exchange (ETDEWEB)

    Oviedo, Jenny A.; Jesus, Amanda M.D. de; Fialho, Lucimar L.; Pereira-Filho, Edenir R., E-mail: erpf@ufscar.br [Universidade Federal de Sao Carlos (UFSCar), SP (Brazil). Departamento de Quimica

    2014-04-15

    Simple and sensitive procedures for the extraction/preconcentration of molybdenum based on vortex-assisted solidified floating organic drop microextraction (VA-SFODME) and cloud point extraction combined with flame atomic absorption spectrometry (FAAS) and discrete nebulization were developed. The influence of discrete nebulization on the sensitivity of the molybdenum preconcentration processes was studied. An injection volume of 200 μL resulted in a lower relative standard deviation with both preconcentration procedures. Enrichment factors of 31 and 67 and limits of detection of 25 and 5 μg L⁻¹ were obtained for cloud point extraction and VA-SFODME, respectively. The developed procedures were applied to the determination of Mo in mineral water and multivitamin samples. (author)

  4. Analysis of the Constituents in “Zhu She Yong Xue Shuan Tong” by Ultra High Performance Liquid Chromatography with Quadrupole Time-of-Flight Mass Spectrometry Combined with Preparative High Performance Liquid Chromatography

    Directory of Open Access Journals (Sweden)

    Lin-Lin Wang

    2015-11-01

    “Zhu She Yong Xue Shuan Tong” lyophilized powder (ZSYXST) consists of a series of saponins extracted from Panax notoginseng and has been widely used in China for the treatment of strokes. In this study, an ultra-high performance liquid chromatography with quadrupole time-of-flight mass spectrometry (UHPLC-Q-TOF/MS) method combined with preparative high performance liquid chromatography (PHPLC) was developed to rapidly identify both major and minor saponins in ZSYXST. Some high-content components were removed by PHPLC in order to increase the sensitivity for the trace saponins. Then, specific characteristic fragment ions in both positive and negative mode were utilized to determine the types of aglycone and saccharide, as well as the saccharide chain linkages. As a result, 94 saponins, including 20 pairs of isomers and ten new compounds, which represent more than 98% of the components in ZSYXST, were identified or tentatively identified in commercial ZSYXST samples.

  5. Novel capsule phase microextraction in combination with liquid chromatography-tandem mass spectrometry for determining personal care products in environmental water.

    Science.gov (United States)

    Lakade, Sameer S; Borrull, Francesc; Furton, Kenneth G; Kabir, Abuzar; Marcé, Rosa Maria; Fontanals, Núria

    2018-05-01

    A novel sample preparation technique named capsule phase microextraction (CPME) is presented here. The technique utilizes a miniaturized microextraction capsule (MEC) as the extraction medium. The MEC consists of two conjoined porous tubular polypropylene membranes, one of which encapsulates the sorbent created through sol-gel technology, while the other encapsulates a magnetic metal rod. As such, the MEC integrates both the extraction and stirring mechanisms into a single device. The aim of this article is to demonstrate the application potential of CPME as a sample preparation technique for the extraction of a group of personal care products (PCPs) from water matrices. Among the different sol-gel sorbent materials evaluated (UCON®, poly(caprolactone-dimethylsiloxane-caprolactone) (PCAP-DMS-CAP) and Carbowax 20M (CW-20M)), the CW-20M MEC demonstrated the best extraction performance for the selected PCPs. The extraction conditions for the sol-gel CW-20M MEC were optimized, including sample pH, stirring speed, addition of salt, extraction time, sample volume, liquid desorption solvent, and desorption time. Under the optimal conditions, the sol-gel CW-20M MEC provided recoveries ranging between 47 and 90% for all analytes, except for ethylparaben, which showed a recovery of 26%. A method based on CPME with sol-gel CW-20M followed by liquid chromatography-tandem mass spectrometry was developed and validated for the extraction of PCPs from river water and effluent wastewater samples. When analyzing different environmental samples, some analytes, such as 2,4-dihydroxybenzophenone, 2,2-dihydroxy-4-4 methoxybenzophenone and 3-benzophenone, were found at low ng L⁻¹ levels.

  6. Target analysis of primary aromatic amines combined with a comprehensive screening of migrating substances in kitchen utensils by liquid chromatography-high resolution mass spectrometry.

    Science.gov (United States)

    Sanchis, Yovana; Coscollà, Clara; Roca, Marta; Yusà, Vicent

    2015-06-01

    An analytical strategy including both the quantitative target analysis of 8 regulated primary aromatic amines (PAAs) and a comprehensive post-run target screening of 77 migrating substances was developed for nylon utensils, using liquid chromatography-Orbitrap high resolution mass spectrometry (LC-HRMS) operating in full scan mode. The accurate mass data were acquired with a resolving power of 50,000 FWHM (scan speed, 2 Hz) and by alternating two acquisition events, ESI+ with and without fragmentation. The target method was validated after statistical optimization of the main ionization and fragmentation parameters. The quantitative method showed performance appropriate for use in official monitoring, with recoveries ranging from 78% to 112%, precision in terms of relative standard deviation (RSD) of less than 15%, and limits of quantification between 2 and 2.5 µg kg⁻¹. For post-run target screening, a customized theoretical database was built for food contact material migrants, including bisphenols, phthalates, and other amines. For identification purposes, accurate mass (<5 ppm error) and diagnostic ions, including fragments, were used. The strategy was applied to 10 real samples collected from different retailers in the Valencian Region (Spain) during 2014. Six of the eight target PAAs were detected in at least one sample in the target analysis. The most frequently detected compounds were 4,4'-methylenedianiline and aniline, with concentrations ranging from 2.4 to 19,715 µg kg⁻¹ and from 2.5 to 283 µg kg⁻¹, respectively. Two phthalates were identified and confirmed in the post-run target screening analysis. Copyright © 2015 Elsevier B.V. All rights reserved.
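
    Post-run target screening of this kind amounts to matching each measured accurate mass against a theoretical database within a narrow ppm window. The sketch below illustrates that matching step; the [M+H]+ values are approximate monoisotopic masses included only for illustration and are not entries from the authors' database.

        # Sketch of post-run target screening by accurate mass: each measured m/z
        # is matched against theoretical [M+H]+ values within a 5 ppm window.
        # Database masses are approximate monoisotopic values, for illustration only.
        DATABASE = {
            "aniline":                 94.0651,   # [M+H]+ of C6H7N (approx.)
            "4,4'-methylenedianiline": 199.1230,  # [M+H]+ of C13H14N2 (approx.)
        }

        def ppm_error(measured, theoretical):
            return 1e6 * (measured - theoretical) / theoretical

        def screen(measured_mzs, tolerance_ppm=5.0):
            hits = []
            for mz in measured_mzs:
                for name, theo in DATABASE.items():
                    err = ppm_error(mz, theo)
                    if abs(err) <= tolerance_ppm:
                        hits.append((mz, name, round(err, 2)))
            return hits

        print(screen([94.0653, 199.1225, 150.0919]))   # last m/z has no database match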

  7. Qualitative Metabolome Analysis of Human Cerebrospinal Fluid by 13C-/12C-Isotope Dansylation Labeling Combined with Liquid Chromatography Fourier Transform Ion Cyclotron Resonance Mass Spectrometry

    Science.gov (United States)

    Guo, Kevin; Bamforth, Fiona; Li, Liang

    2011-02-01

    Metabolome analysis of human cerebrospinal fluid (CSF) is challenging because of the low abundance of metabolites present in a small volume of sample. We describe and apply a sensitive isotope labeling LC-MS technique for qualitative analysis of the CSF metabolome. After a CSF sample is divided into two aliquots, they are labeled with 13C-dansyl and 12C-dansyl chloride, respectively. The differentially labeled aliquots are then mixed and subjected to LC-MS using Fourier-transform ion cyclotron resonance mass spectrometry (FTICR MS). Dansylation offers significant improvements in chromatographic separation and detection sensitivity. Moreover, peaks detected in the mass spectra can be readily analyzed for ion pair recognition and database searching based on accurate mass and/or retention time information. It is shown that about 14,000 features can be detected in a 25-min LC-FTICR MS run of a dansyl-labeled CSF sample, from which about 500 metabolites can be profiled. Results from four CSF samples are compared to gauge the detectability of metabolites by this method. About 261 metabolites are commonly detected in replicate runs of the four samples. In total, 1132 unique metabolite ion pairs are detected, and 347 pairs (31%) matched with at least one metabolite in the Human Metabolome Database. We also report a dansylation library of 220 standard compounds and, using this library, about 85 metabolites can be positively identified. Among them, 21 metabolites have never been reported to be associated with CSF. These results illustrate that the dansylation LC-FTICR MS method can be used to analyze the CSF metabolome in a more comprehensive manner.
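
    Ion pair recognition in a differential 13C-/12C-labeling experiment rests on finding peaks separated by the fixed mass difference between the heavy and light tags. The sketch below scans an invented peak list for such pairs; the shift of about 2.0067 Da assumes a label carrying two 13C atoms, and the actual value depends on the labeling reagent used.

        # Sketch of 12C-/13C-dansyl ion-pair recognition in a peak list.
        # Assumes a two-13C label, i.e. a shift of ~2.00671 Da per tag (the real
        # shift depends on the reagent); the peak list is invented.
        LABEL_SHIFT = 2.00671      # Da per dansyl tag (13C2 vs 12C2), assumed
        TOL = 0.005                # Da matching tolerance

        peaks = [234.0913, 236.0980, 310.1182, 312.1249, 415.2001]   # singly charged m/z

        pairs = []
        for light in peaks:
            for heavy in peaks:
                if abs(heavy - light - LABEL_SHIFT) <= TOL:
                    pairs.append((light, heavy))

        print(pairs)   # each (light, heavy) pair is a candidate singly labeled metabolite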

  8. A novel method of liquid chromatography–tandem mass spectrometry combined with chemical derivatization for the determination of ribonucleosides in urine

    Energy Technology Data Exchange (ETDEWEB)

    Li, Shangfu [State Key Laboratory of Environmental and Biological Analysis, Department of Chemistry, Hong Kong Baptist University, Hong Kong (China); Jin, Yibao [Shenzhen Institute for Drug Control, Shenzhen 518055 (China); State Key Laboratory Breeding Base-Shenzhen Key Laboratory of Chemical Biology, Graduate School at Shenzhen, Tsinghua University, Shenzhen 518055 (China); Tang, Zhi; Lin, Shuhai [State Key Laboratory of Environmental and Biological Analysis, Department of Chemistry, Hong Kong Baptist University, Hong Kong (China); Liu, Hongxia [State Key Laboratory Breeding Base-Shenzhen Key Laboratory of Chemical Biology, Graduate School at Shenzhen, Tsinghua University, Shenzhen 518055 (China); Key Laboratory of Metabolomics at Shenzhen, Shenzhen 518055 (China); Jiang, Yuyang [State Key Laboratory Breeding Base-Shenzhen Key Laboratory of Chemical Biology, Graduate School at S