WorldWideScience

Sample records for hallam automated pipeline

  1. Oil pipeline valve automation for spill reduction

    Energy Technology Data Exchange (ETDEWEB)

    Mohitpour, Mo; Trefanenko, Bill [Enbridge Technology Inc, Calgary (Canada); Tolmasquim, Sueli Tiomno; Kossatz, Helmut [TRANSPETRO - PETROBRAS Transporte S.A., Rio de Janeiro, RJ (Brazil)

    2003-07-01

    Liquid pipeline codes generally stipulate placement of block valves along liquid transmission pipelines, such as on each side of major river crossings where environmental hazards could cause, or are foreseen to potentially cause, serious consequences. Codes, however, do not stipulate any requirement for block valve spacing for low vapour pressure petroleum transportation, nor for remote pipeline valve operation to reduce spills. A review of pipeline codes for valve requirements and spill limitation in high consequence areas is thus presented, along with a criterion for an acceptable spill volume that could be caused by a pipeline leak/full rupture. A technique for deciding economically and technically effective pipeline block valve automation for remote operation, to reduce the volume of oil spilled and control hazards, is also provided. In this review, industry practice is highlighted, and the application of the criterion for maximum permissible oil spill and the technique for deciding valve automation, as applied to the ORSUB pipeline, is presented. ORSUB is one of the three initially selected pipelines that have been studied. These pipelines represent about 14% of the total length of petroleum transmission lines operated by PETROBRAS Transporte S.A. (TRANSPETRO) in Brazil. Based on the implementation of valve motorization on these three pipelines, motorization of block valves for remote operation on the remaining pipelines is intended, depending on the success of these implementations, historical records of failure and appropriate ranking. (author)
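
    The spill-volume criterion discussed above can be illustrated with a back-of-the-envelope bound: the worst-case release between two block valves is roughly the line fill between the valves plus the volume pumped before detection and shutdown. The Python sketch below makes simplifying assumptions (a level segment that drains completely, a fixed detection-plus-shutdown time); the function and figures are illustrative, not the authors' criterion.

```python
import math

def spill_volume_m3(diameter_m: float, valve_spacing_m: float,
                    flow_m3_per_h: float, shutdown_h: float) -> float:
    """Worst-case spill between two block valves: the full line fill between
    the valves (complete drain-down assumed) plus throughput pumped before
    the valves close."""
    line_fill = math.pi * (diameter_m / 2) ** 2 * valve_spacing_m
    pumped = flow_m3_per_h * shutdown_h
    return line_fill + pumped

# Example: 0.508 m (20 in) line, 10 km valve spacing, 1000 m3/h, 15 min response
print(round(spill_volume_m3(0.508, 10_000, 1000.0, 0.25), 1))  # ~2276.8 m3
```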

  2. ARTIP: Automated Radio Telescope Image Processing Pipeline

    Science.gov (United States)

    Sharma, Ravi; Gyanchandani, Dolly; Kulkarni, Sarang; Gupta, Neeraj; Pathak, Vineet; Pande, Arti; Joshi, Unmesh

    2018-02-01

    The Automated Radio Telescope Image Processing Pipeline (ARTIP) automates the entire process of flagging, calibrating, and imaging for radio-interferometric data. ARTIP starts with raw data, i.e., a measurement set, and goes through multiple stages, such as flux calibration, bandpass calibration, phase calibration, and imaging, to generate continuum and spectral line images. Each stage can also be run independently. The pipeline provides continuous feedback to the user through various messages, charts and logs. It is written using standard Python libraries and the CASA package. The pipeline can deal with datasets with multiple spectral windows and also multiple target sources which may have arbitrary combinations of flux/bandpass/phase calibrators.
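
    Since ARTIP is built on CASA, its stage order can be pictured with standard CASA tasks. The sketch below is a minimal hand-rolled analogue of the flag/calibrate/image sequence described above, not ARTIP's actual code; the measurement set, field names, and parameter values are placeholders.

```python
# A minimal CASA-style calibration chain; names and values are placeholders.
from casatasks import flagdata, setjy, bandpass, gaincal, applycal, tclean

vis = 'observation.ms'
flagdata(vis=vis, mode='tfcrop')                       # automated RFI flagging
setjy(vis=vis, field='flux_cal')                       # set the flux scale
bandpass(vis=vis, caltable='cal.B', field='flux_cal', refant='ea01')
gaincal(vis=vis, caltable='cal.G', field='phase_cal',
        solint='int', gaintable=['cal.B'])             # phase/gain solutions
applycal(vis=vis, field='target', gaintable=['cal.B', 'cal.G'])
tclean(vis=vis, imagename='target_cont', field='target',
       imsize=2048, cell='1arcsec', specmode='mfs')    # continuum image
```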

  3. Automated Single Cell Data Decontamination Pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Tennessen, Kristin [Lawrence Berkeley National Lab. (LBNL), Walnut Creek, CA (United States). Dept. of Energy Joint Genome Inst.; Pati, Amrita [Lawrence Berkeley National Lab. (LBNL), Walnut Creek, CA (United States). Dept. of Energy Joint Genome Inst.

    2014-03-21

    Recent technological advancements in single-cell genomics have encouraged the classification and functional assessment of microorganisms from a wide span of the biosphere's phylogeny [1,2]. Environmental processes of interest to the DOE, such as bioremediation and carbon cycling, can be elucidated through the genomic lens of these unculturable microbes. However, contamination can occur at various stages of the single-cell sequencing process. Contaminated data can lead to wasted time and effort on meaningless analyses, inaccurate or erroneous conclusions, and pollution of public databases. A fully automated decontamination tool is necessary to prevent these instances and increase the throughput of the single-cell sequencing process.
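
    One simple screen consistent with the goal described above is to flag assembled contigs whose GC content lies far from the assembly median, since foreign DNA often differs in base composition. This is an illustrative heuristic only; the abstract does not describe the tool's actual rules or thresholds.

```python
from statistics import median

def gc_fraction(seq: str) -> float:
    seq = seq.upper()
    return (seq.count('G') + seq.count('C')) / max(len(seq), 1)

def flag_gc_outliers(contigs: dict[str, str], max_dev: float = 0.10) -> set[str]:
    """Flag contigs whose GC content deviates from the assembly median by
    more than max_dev -- one crude signal of potential contamination."""
    m = median(gc_fraction(s) for s in contigs.values())
    return {name for name, s in contigs.items()
            if abs(gc_fraction(s) - m) > max_dev}
```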

  4. LEMON: an (almost) completely automated differential-photometry pipeline

    Science.gov (United States)

    Terrón, V.; Fernández, M.

    2011-11-01

    We present LEMON, a CCD differential-photometry pipeline, written in Python, developed at the Institute of Astrophysics of Andalusia (CSIC) and originally designed for use at the 1.23 m CAHA telescope for automated variable star detection and analysis. The aim of our tool is to make it possible to completely reduce thousands of images of time series in a matter of hours, with minimal user interaction if any at all, automatically detecting variable stars and presenting the results to the astronomer.
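
    The core of differential photometry is to subtract an ensemble of constant comparison stars from the target's instrumental magnitudes, so that atmospheric and instrumental variations shared by all stars cancel. A minimal numpy sketch (array shapes are assumptions, not LEMON's internal representation):

```python
import numpy as np

def differential_light_curve(inst_mags: np.ndarray, comparison_idx: list[int],
                             target_idx: int) -> np.ndarray:
    """inst_mags: (n_epochs, n_stars) instrumental magnitudes. Subtracting
    the mean magnitude of a comparison-star ensemble at each epoch removes
    variations common to the whole field, leaving intrinsic variability."""
    reference = inst_mags[:, comparison_idx].mean(axis=1)
    return inst_mags[:, target_idx] - reference
```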

  5. Automation of the second iron ore slurry pipeline from Samarco

    Energy Technology Data Exchange (ETDEWEB)

    Aguilar, Juliana M.; Fonseca, Mario L.; Drumond, Pablo P.; Barbosa, Sylvio [IHM Engenharia, Belo Horizonte, MG (Brazil)

    2009-07-01

    The second iron ore slurry pipeline from Samarco was built to serve the Third Pellet Plant Project, which includes a new concentration plant at Germano-MG and a third pellet plant at Ubu-ES. It is 396 km long and links the two plants, transporting the iron ore slurry prepared at the Germano unit. This work aims to present the iron ore slurry pipeline with emphasis on the automation architecture for the supervision and control system, interconnected along the pipeline's extension by fiber optics. The control system is composed of ControlLogix PLCs at the pumping and valve stations and MicroLogix PLCs at the pressure and cathodic protection monitoring points, totaling 19 PLCs. The supervisory system was developed using the Wonderware IAS 3.0 suite, including the supervisory software InTouch 9.5 and the integrated ArchestrA IDE, and is composed of two redundant data servers and nine operation stations. The control and supervision system is interconnected through an Ethernet network using fiber optics and multiplexer modules (GE JungleMux) for voice, data and video. Among the expected results, sequence automation, greater process data availability (real-time and historical), and easier operation and failure detection can be highlighted. (author)

  6. An Automated Field Phenotyping Pipeline for Application in Grapevine Research

    Science.gov (United States)

    Kicherer, Anna; Herzog, Katja; Pflanz, Michael; Wieland, Markus; Rüger, Philipp; Kecke, Steffen; Kuhlmann, Heiner; Töpfer, Reinhard

    2015-01-01

    Due to its perennial nature and size, the acquisition of phenotypic data in grapevine research is almost exclusively restricted to the field and done by visual estimation. This kind of evaluation procedure is limited by time, cost and the subjectivity of records. As a consequence, objectivity, automation and more precision in phenotypic data evaluation are needed to increase the number of samples, manage grapevine repositories, enable genetic research of new phenotypic traits and, therefore, increase the efficiency of plant research. In the present study, an automated field phenotyping pipeline was set up and applied in a plot of genetic resources. The application of the PHENObot allows image acquisition from at least 250 individual grapevines per hour directly in the field without user interaction. Data management is handled by a database (IMAGEdata). The automatic image analysis tool BIVcolor (Berries in Vineyards-color) permitted the collection of precise phenotypic data for two important fruit traits, berry size and color, within a large set of plants. The application of the PHENObot represents an automated tool for high-throughput sampling of image data in the field. The automated analysis of these images facilitates the generation of objective and precise phenotypic data on a larger scale. PMID:25730485
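
    The abstract does not detail BIVcolor's algorithm, but a typical berry size-and-color measurement can be sketched with OpenCV as follows; the HSV threshold bounds and minimum area are placeholders, not the tool's calibrated values.

```python
import cv2
import numpy as np

def measure_berries(image_bgr: np.ndarray) -> list[dict]:
    """Segment berry-coloured regions with an HSV threshold, then report
    size and mean colour per detected region."""
    hsv = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array([100, 40, 20]), np.array([140, 255, 255]))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    berries = []
    for c in contours:
        area = cv2.contourArea(c)
        if area < 30:                      # ignore noise specks
            continue
        x, y, w, h = cv2.boundingRect(c)
        roi = image_bgr[y:y + h, x:x + w]
        berries.append({'area_px': area, 'mean_bgr': cv2.mean(roi)[:3]})
    return berries
```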

  7. Reduce operational cost and extend the life of pipeline infrastructure by automating remote cathodic protection systems

    Energy Technology Data Exchange (ETDEWEB)

    Rosado, Elroy [Freewave Technologies, Inc., Boulder, CO (United States). Latin America

    2009-07-01

    Energy and pipeline companies wrestle to control operating costs, which are largely affected by new government regulations, aging buried metal assets, rising steel prices, expanding pipeline operations, new interference points, HCA encroachment, restrictive land use policies, heightened network security, and an aging, soon-to-retire workforce. With operating costs on the rise, seemingly out of control, many CP and operations professionals look to past best practices in cost containment through automation. Many companies achieve solid business results through deployment of telemetry and SCADA automation of remote assets and now hope to expand this success to further optimize operations by automating remote cathodic protection systems. This presentation will provide examples of how new remote cathodic protection systems are helping energy and pipeline companies address the growing issue of aging pipeline infrastructure and reduce their costs while optimizing their operations. (author)

  8. AGAPE (Automated Genome Analysis PipelinE) for pan-genome analysis of Saccharomyces cerevisiae.

    Directory of Open Access Journals (Sweden)

    Giltae Song

    The characterization and public release of genome sequences from thousands of organisms is expanding the scope for genetic variation studies. However, understanding the phenotypic consequences of genetic variation remains a challenge in eukaryotes due to the complexity of the genotype-phenotype map. One approach to this is the intensive study of model systems for which diverse sources of information can be accumulated and integrated. Saccharomyces cerevisiae is an extensively studied model organism, with well-known protein functions and thoroughly curated phenotype data. To develop and expand the available resources linking genomic variation with function in yeast, we aim to model the pan-genome of S. cerevisiae. To initiate the yeast pan-genome, we newly sequenced or re-sequenced the genomes of 25 strains that are commonly used in the yeast research community using advanced sequencing technology at high quality. We also developed a pipeline for automated pan-genome analysis, which integrates the steps of assembly, annotation, and variation calling. To assign strain-specific functional annotations, we identified genes that were not present in the reference genome. We classified these according to their presence or absence across strains and characterized each group of genes with known functional and phenotypic features. The functional roles of novel genes not found in the reference genome and associated with strains or groups of strains appear to be consistent with anticipated adaptations in specific lineages. As more S. cerevisiae strain genomes are released, our analysis can be used to collate genome data and relate it to lineage-specific patterns of genome evolution. Our new tool set will enhance our understanding of genomic and functional evolution in S. cerevisiae, and will be available to the yeast genetics and molecular biology community.
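
    The presence/absence classification step described above reduces to simple set logic once a gene-to-strains map is available. A minimal sketch (the category names are illustrative):

```python
def classify_pan_genome(presence: dict[str, set[str]]) -> dict[str, str]:
    """presence maps gene -> set of strains carrying it. Genes found in
    every strain are 'core', genes in exactly one strain are
    'strain-specific', and the remainder are 'accessory'."""
    all_strains = set().union(*presence.values())
    classes = {}
    for gene, strains in presence.items():
        if strains == all_strains:
            classes[gene] = 'core'
        elif len(strains) == 1:
            classes[gene] = 'strain-specific'
        else:
            classes[gene] = 'accessory'
    return classes
```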

  9. SeqMule: automated pipeline for analysis of human exome/genome sequencing data

    OpenAIRE

    Yunfei Guo; Xiaolei Ding; Yufeng Shen; Lyon, Gholson J; Kai Wang

    2015-01-01

    Next-generation sequencing (NGS) technology has greatly helped us identify disease-contributory variants for Mendelian diseases. However, users are often faced with issues such as software compatibility, complicated configuration, and no access to high-performance computing facility. Discrepancies exist among aligners and variant callers. We developed a computational pipeline, SeqMule, to perform automated variant calling from NGS data on human genomes and exomes. SeqMule integrates computati...

  10. An Advanced Pre-Processing Pipeline to Improve Automated Photogrammetric Reconstructions of Architectural Scenes

    Directory of Open Access Journals (Sweden)

    Marco Gaiani

    2016-02-01

    Automated image-based 3D reconstruction methods are increasingly flooding our 3D modeling applications. Fully automated solutions give the impression that from a sample of randomly acquired images we can derive quite impressive visual 3D models. Although the level of automation is reaching very high standards, image quality is a fundamental prerequisite for producing successful and photo-realistic 3D products, in particular when dealing with large datasets of images. This article presents an efficient pipeline based on color enhancement, image denoising, color-to-gray conversion and image content enrichment. The pipeline stems from an analysis of various state-of-the-art algorithms and aims to adjust the most promising methods, giving solutions to typical failure causes. The assessment evaluation proves how effective image pre-processing, which considers the entire image dataset, can improve the automated orientation procedure and dense 3D point cloud reconstruction, even in the case of poor texture scenarios.
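
    A minimal sketch of such a pre-processing chain using OpenCV: non-local-means denoising, a contrast-preserving color-to-gray conversion, and a simple contrast enhancement. The parameter values are illustrative, not the paper's tuned settings.

```python
import cv2

def preprocess_for_photogrammetry(path: str):
    """Denoise, then convert to gray with contrast-preserving
    decolorization, mirroring the denoising/color-to-gray stages above."""
    img = cv2.imread(path)
    img = cv2.fastNlMeansDenoisingColored(img, None, 5, 5, 7, 21)  # denoise
    gray, _boost = cv2.decolor(img)    # contrast-preserving color-to-gray
    return cv2.equalizeHist(gray)      # basic content enhancement
```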

  11. TRAPLINE: a standardized and automated pipeline for RNA sequencing data analysis, evaluation and annotation.

    Science.gov (United States)

    Wolfien, Markus; Rimmbach, Christian; Schmitz, Ulf; Jung, Julia Jeannine; Krebs, Stefan; Steinhoff, Gustav; David, Robert; Wolkenhauer, Olaf

    2016-01-06

    Technical advances in Next Generation Sequencing (NGS) provide a means to acquire deeper insights into cellular functions. The lack of standardized and automated methodologies poses a challenge for the analysis and interpretation of RNA sequencing data. We critically compare and evaluate state-of-the-art bioinformatics approaches and present a workflow that integrates the best performing data analysis, data evaluation and annotation methods in a Transparent, Reproducible and Automated PipeLINE (TRAPLINE) for RNA sequencing data processing (suitable for Illumina, SOLiD and Solexa). Comparative transcriptomics analyses with TRAPLINE result in a set of differentially expressed genes, their corresponding protein-protein interactions, splice variants, promoter activity, predicted miRNA-target interactions and files for single nucleotide polymorphism (SNP) calling. The obtained results are combined into a single file for downstream analysis such as network construction. We demonstrate the value of the proposed pipeline by characterizing the transcriptome of our recently described stem cell derived antibiotic selected cardiac bodies ('aCaBs'). TRAPLINE supports NGS-based research by providing a workflow that requires no bioinformatics skills, decreases the processing time of the analysis and works in the cloud. The pipeline is implemented in the biomedical research platform Galaxy and is freely accessible via www.sbi.uni-rostock.de/RNAseqTRAPLINE or the specific Galaxy manual page (https://usegalaxy.org/u/mwolfien/p/trapline---manual).

  12. Development of an automated imaging pipeline for the analysis of the zebrafish larval kidney.

    Directory of Open Access Journals (Sweden)

    Jens H Westhoff

    The analysis of kidney malformation caused by environmental influences during nephrogenesis or by hereditary nephropathies requires animal models allowing the in vivo observation of developmental processes. The zebrafish has emerged as a useful model system for the analysis of vertebrate organ development and function, and it is suitable for the identification of organotoxic or disease-modulating compounds on a larger scale. However, to fully exploit its potential in high content screening applications, dedicated protocols are required allowing the consistent visualization of inner organs such as the embryonic kidney. To this end, we developed a high content screening compatible pipeline for the automated imaging of standardized views of the developing pronephros in zebrafish larvae. Using a custom designed tool, cavities were generated in agarose coated microtiter plates allowing for accurate positioning and orientation of zebrafish larvae. This enabled the subsequent automated acquisition of stable and consistent dorsal views of pronephric kidneys. The established pipeline was applied in a pilot screen for the analysis of the impact of potentially nephrotoxic drugs on zebrafish pronephros development in the Tg(wt1b:EGFP) transgenic line in which the developing pronephros is highlighted by GFP expression. The consistent image data that was acquired allowed for quantification of gross morphological pronephric phenotypes, revealing concentration dependent effects of several compounds on nephrogenesis. In addition, applicability of the imaging pipeline was further confirmed in a morpholino based model for cilia-associated human genetic disorders associated with different intraflagellar transport genes. The developed tools and pipeline can be used to study various aspects in zebrafish kidney research, and can be readily adapted for the analysis of other organ systems.

  13. Development of an automated imaging pipeline for the analysis of the zebrafish larval kidney.

    Science.gov (United States)

    Westhoff, Jens H; Giselbrecht, Stefan; Schmidts, Miriam; Schindler, Sebastian; Beales, Philip L; Tönshoff, Burkhard; Liebel, Urban; Gehrig, Jochen

    2013-01-01

    The analysis of kidney malformation caused by environmental influences during nephrogenesis or by hereditary nephropathies requires animal models allowing the in vivo observation of developmental processes. The zebrafish has emerged as a useful model system for the analysis of vertebrate organ development and function, and it is suitable for the identification of organotoxic or disease-modulating compounds on a larger scale. However, to fully exploit its potential in high content screening applications, dedicated protocols are required allowing the consistent visualization of inner organs such as the embryonic kidney. To this end, we developed a high content screening compatible pipeline for the automated imaging of standardized views of the developing pronephros in zebrafish larvae. Using a custom designed tool, cavities were generated in agarose coated microtiter plates allowing for accurate positioning and orientation of zebrafish larvae. This enabled the subsequent automated acquisition of stable and consistent dorsal views of pronephric kidneys. The established pipeline was applied in a pilot screen for the analysis of the impact of potentially nephrotoxic drugs on zebrafish pronephros development in the Tg(wt1b:EGFP) transgenic line in which the developing pronephros is highlighted by GFP expression. The consistent image data that was acquired allowed for quantification of gross morphological pronephric phenotypes, revealing concentration dependent effects of several compounds on nephrogenesis. In addition, applicability of the imaging pipeline was further confirmed in a morpholino based model for cilia-associated human genetic disorders associated with different intraflagellar transport genes. The developed tools and pipeline can be used to study various aspects in zebrafish kidney research, and can be readily adapted for the analysis of other organ systems.
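
    Given the standardized dorsal views the pipeline produces, a gross morphological readout can be quantified in a few lines of scikit-image; the sketch below (Otsu threshold, area of the largest GFP-positive region) is a simplified stand-in for the screen's actual measurements.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def pronephros_gfp_area(gfp_channel: np.ndarray) -> int:
    """Area (in pixels) of the largest connected GFP-positive region in a
    dorsal-view fluorescence image -- one simple phenotype readout."""
    mask = gfp_channel > threshold_otsu(gfp_channel)
    regions = regionprops(label(mask))
    return max((r.area for r in regions), default=0)
```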

  14. Middle Phanerozoic mass extinctions and a tribute to the work of Professor Tony Hallam

    NARCIS (Netherlands)

    Wignall, Paul B.; van de Schootbrugge, Bas

    Tony Hallam's contributions to mass extinction studies span more than 50 years and this thematic issue provides an opportunity to pay tribute to the many pioneering contributions he has made to this field. Early work (1961) on the Jurassic in Europe revealed a link, during the Toarcian Stage,

  15. Data Validation Package, June 2016 Groundwater Sampling at the Hallam, Nebraska, Decommissioned Reactor Site, August 2016

    Energy Technology Data Exchange (ETDEWEB)

    Surovchak, Scott [USDOE Office of Legacy Management, Washington, DC (United States); Miller, Michele [Navarro Research and Engineering, Oak Ridge, TN (United States)

    2016-08-01

    The 2008 Long-Term Surveillance Plan [LTSP] for the Decommissioned Hallam Nuclear Power Facility, Hallam, Nebraska (http://www.lm.doe.gov/Hallam/Documents.aspx) requires groundwater monitoring once every 2 years. Seventeen monitoring wells at the Hallam site were sampled during this event as specified in the plan. Planned monitoring locations are shown in Attachment 1, Sampling and Analysis Work Order. Water levels were measured at all sampled wells and at two additional wells (6A and 6B) prior to the start of sampling. Additionally, water levels of each sampled well were measured at the beginning of sampling. See Attachment 2, Trip Report, for additional details. Sampling and analysis were conducted as specified in Sampling and Analysis Plan for U.S. Department of Energy Office of Legacy Management Sites (LMS/PRO/S04351, continually updated, http://energy.gov/lm/downloads/sampling-and-analysis-plan-us-department-energy-office-legacy-management-sites). Gross alpha and gross beta are the only parameters that were detected at statistically significant concentrations. Time/concentration graphs of the gross alpha and gross beta data are included in Attachment 3, Data Presentation. The gross alpha and gross beta activity concentrations observed are consistent with values previously observed and are attributed to naturally occurring radionuclides (e.g., uranium and uranium decay chain products) in the groundwater.

  16. Automated pipeline for rapid production and screening of HIV-specific monoclonal antibodies using Pichia pastoris.

    Science.gov (United States)

    Shah, Kartik A; Clark, John J; Goods, Brittany A; Politano, Timothy J; Mozdzierz, Nicholas J; Zimnisky, Ross M; Leeson, Rachel L; Love, J Christopher; Love, Kerry R

    2015-12-01

    Monoclonal antibodies (mAbs) that bind and neutralize human pathogens have great therapeutic potential. Advances in automated screening and liquid handling have resulted in the ability to discover antigen-specific antibodies either directly from human blood or from various combinatorial libraries (phage, bacteria, or yeast). There remain, however, bottlenecks in the cloning, expression and evaluation of such lead antibodies identified in primary screens that hinder high-throughput screening. As such, "hit-to-lead identification" remains both expensive and time-consuming. By combining the advantages of overlap extension PCR (OE-PCR) and a genetically stable yet easily manipulatable microbial expression host Pichia pastoris, we have developed an automated pipeline for the rapid production and screening of full-length antigen-specific mAbs. Here, we demonstrate the speed, feasibility and cost-effectiveness of our approach by generating several broadly neutralizing antibodies against human immunodeficiency virus (HIV). © 2015 Wiley Periodicals, Inc.

  17. An automated pipeline for cortical surface generation and registration of the cerebral cortex

    Science.gov (United States)

    Li, Wen; Ibanez, Luis; Gelas, Arnaud; Yeo, B. T. Thomas; Niethammer, Marc; Andreasen, Nancy C.; Magnotta, Vincent A.

    2011-03-01

    The human cerebral cortex is one of the most complicated structures in the body. It has a highly convoluted structure with much of the cortical sheet buried in sulci. Based on cytoarchitectural and functional imaging studies, it is possible to segment the cerebral cortex into several subregions. While it is only possible to differentiate the true anatomical subregions based on cytoarchitecture, the surface morphometry aligns closely with the underlying cytoarchitecture and provides features that allow the surface of the cortex to be parcellated based on the sulcal and gyral patterns that are readily visible on the MR images. We have developed a fully automated pipeline for the generation and registration of cortical surfaces in the spherical domain. The pipeline initiates with the BRAINS AutoWorkup pipeline. Subsequently, topology correction and surface generation are performed to generate a genus-zero surface, which is mapped to a sphere. Several surface features are then calculated to drive the registration between the atlas surface and other datasets. A spherical diffeomorphic demons algorithm is used to co-register an atlas surface onto a subject surface. A lobar atlas of the cerebral cortex was created from a manual parcellation of the cortex. The atlas surface was then co-registered to five additional subjects using the spherical diffeomorphic demons algorithm. The labels from the atlas surface were warped onto the subject surfaces and compared to the manual raters. The average Dice overlap index was 0.89 across all regions.
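
    The Dice overlap index reported above is defined, for one label present in two parcellations A and B, as 2|A ∩ B| / (|A| + |B|). A minimal implementation:

```python
import numpy as np

def dice_overlap(labels_a: np.ndarray, labels_b: np.ndarray, region: int) -> float:
    """Dice similarity coefficient for one parcellation label; the paper
    reports an average of 0.89 across all regions."""
    a, b = labels_a == region, labels_b == region
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0
```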

  18. Rnnotator: an automated de novo transcriptome assembly pipeline from stranded RNA-Seq reads

    Energy Technology Data Exchange (ETDEWEB)

    Martin, Jeffrey; Bruno, Vincent M.; Fang, Zhide; Meng, Xiandong; Blow, Matthew; Zhang, Tao; Sherlock, Gavin; Snyder, Michael; Wang, Zhong

    2010-11-19

    Background: Comprehensive annotation and quantification of transcriptomes are outstanding problems in functional genomics. While high throughput mRNA sequencing (RNA-Seq) has emerged as a powerful tool for addressing these problems, its success is dependent upon the availability and quality of reference genome sequences, thus limiting the organisms to which it can be applied. Results: Here, we describe Rnnotator, an automated software pipeline that generates transcript models by de novo assembly of RNA-Seq data without the need for a reference genome. We have applied the Rnnotator assembly pipeline to two yeast transcriptomes and compared the results to the reference gene catalogs of these organisms. The contigs produced by Rnnotator are highly accurate (95%) and reconstruct full-length genes for the majority of the existing gene models (54.3%). Furthermore, our analyses revealed many novel transcribed regions that are absent from well annotated genomes, suggesting Rnnotator serves as a complementary approach to analysis based on a reference genome for comprehensive transcriptomics. Conclusions: These results demonstrate that the Rnnotator pipeline is able to reconstruct full-length transcripts in the absence of a complete reference genome.

  19. SBpipe: a collection of pipelines for automating repetitive simulation and analysis tasks.

    Science.gov (United States)

    Dalle Pezze, Piero; Le Novère, Nicolas

    2017-04-10

    The rapid growth of the number of mathematical models in Systems Biology fostered the development of many tools to simulate and analyse them. The reliability and precision of these tasks often depend on multiple repetitions and they can be optimised if executed as pipelines. In addition, new formal analyses can be performed on these repeat sequences, revealing important insights about the accuracy of model predictions. Here we introduce SBpipe, an open source software tool for automating repetitive tasks in model building and simulation. Using basic YAML configuration files, SBpipe builds a sequence of repeated model simulations or parameter estimations, performs analyses from this generated sequence, and finally generates a LaTeX/PDF report. The parameter estimation pipeline offers analyses of parameter profile likelihood and parameter correlation using samples from the computed estimates. Specific pipelines for scanning of one or two model parameters at the same time are also provided. Pipelines can run on multicore computers, Sun Grid Engine (SGE), or Load Sharing Facility (LSF) clusters, speeding up the processes of model building and simulation. SBpipe can execute models implemented in COPASI, Python or coded in any other programming language using Python as a wrapper module. Future support for other software simulators can be dynamically added without affecting the current implementation. SBpipe allows users to automatically repeat the tasks of model simulation and parameter estimation, and extract robustness information from these repeat sequences in a solid and consistent manner, facilitating model development and analysis. The source code and documentation of this project are freely available at https://pdp10.github.io/sbpipe/.
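
    SBpipe is driven by YAML configuration files. The snippet below only mimics that style to show how such a file is parsed with PyYAML; the keys are illustrative and not guaranteed to match SBpipe's actual schema.

```python
import yaml

# Hypothetical configuration in the spirit of SBpipe's YAML files.
config_text = """
simulator: Copasi
model: insulin_receptor.cps
runs: 100
cluster: local
report_format: pdf
"""

config = yaml.safe_load(config_text)
print(config['simulator'], config['runs'])  # -> Copasi 100
```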

  20. TriAnnot: A Versatile and High Performance Pipeline for the Automated Annotation of Plant Genomes

    Science.gov (United States)

    Leroy, Philippe; Guilhot, Nicolas; Sakai, Hiroaki; Bernard, Aurélien; Choulet, Frédéric; Theil, Sébastien; Reboux, Sébastien; Amano, Naoki; Flutre, Timothée; Pelegrin, Céline; Ohyanagi, Hajime; Seidel, Michael; Giacomoni, Franck; Reichstadt, Mathieu; Alaux, Michael; Gicquello, Emmanuelle; Legeai, Fabrice; Cerutti, Lorenzo; Numa, Hisataka; Tanaka, Tsuyoshi; Mayer, Klaus; Itoh, Takeshi; Quesneville, Hadi; Feuillet, Catherine

    2012-01-01

    In support of the international effort to obtain a reference sequence of the bread wheat genome and to provide plant communities dealing with large and complex genomes with a versatile, easy-to-use online automated tool for annotation, we have developed the TriAnnot pipeline. Its modular architecture allows for the annotation and masking of transposable elements, the structural and functional annotation of protein-coding genes with an evidence-based quality indexing, and the identification of conserved non-coding sequences and molecular markers. The TriAnnot pipeline is parallelized on a 712 CPU computing cluster that can run a 1-Gb sequence annotation in less than 5 days. It is accessible through a web interface for small scale analyses or through a server for large scale annotations. The performance of TriAnnot was evaluated in terms of sensitivity, specificity, and general fitness using curated reference sequence sets from rice and wheat. In less than 8 h, TriAnnot was able to predict more than 83% of the 3,748 CDS from rice chromosome 1 with a fitness of 67.4%. On a set of 12 reference Mb-sized contigs from wheat chromosome 3B, TriAnnot predicted and annotated 93.3% of the genes, among which 54% were perfectly identified in accordance with the reference annotation. It also allowed the curation of 12 genes based on new biological evidence, increasing the percentage of perfect gene prediction to 63%. TriAnnot systematically showed a higher fitness than other annotation pipelines that are not tailored for wheat. As it is easily adaptable to the annotation of other plant genomes, TriAnnot should become a useful resource for the annotation of large and complex genomes in the future. PMID:22645565

  1. A Fully Automated Pipeline for Classification Tasks with an Application to Remote Sensing

    Science.gov (United States)

    Suzuki, K.; Claesen, M.; Takeda, H.; De Moor, B.

    2016-06-01

    Deep learning is nowadays intensely in the spotlight owing to its victories at major competitions, which has pushed 'shallow' machine learning methods, the relatively simple and handy algorithms commonly used by industrial engineers, into the background in spite of their advantages, such as the small amount of time and data they require for training. Taking a practical point of view, we used shallow learning algorithms to construct a learning pipeline that lets operators apply machine learning without special knowledge, an expensive computing environment, or a large amount of labelled data. The proposed pipeline automates the whole classification process, namely feature selection, feature weighting, and the selection of the most suitable classifier with optimized hyperparameters. The configuration uses particle swarm optimization, a well-known metaheuristic algorithm that is generally fast and precise, which enables us not only to optimize (hyper)parameters but also to determine the features and classifier appropriate to the problem, choices that have conventionally been made a priori from domain knowledge and either left untouched or handled with naive algorithms such as grid search. Through experiments with the MNIST and CIFAR-10 datasets, common computer vision datasets for character recognition and object recognition respectively, our automated learning approach provides high performance considering its simple setting (i.e., no dataset-specific specialization), small amount of training data, and practical learning time. Moreover, compared to deep learning, the performance stays robust almost without modification even on a remote sensing object recognition problem, which in turn indicates that our approach is likely to contribute to general classification problems.
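
    For concreteness, a bare-bones particle swarm optimizer over box-bounded (hyper)parameters is sketched below; the inertia and acceleration constants are common textbook defaults rather than the authors' settings, and the toy objective stands in for cross-validated classifier error.

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_minimize(objective, bounds, n_particles=20, iters=50):
    """Minimal particle swarm optimization over box-bounded parameters."""
    lo, hi = np.array(bounds, dtype=float).T
    x = rng.uniform(lo, hi, (n_particles, len(bounds)))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], f[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Toy stand-in for a classifier-error objective over two hyperparameters.
best, err = pso_minimize(lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2,
                         bounds=[(-5, 5), (-5, 5)])
```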

  2. SV-AUTOPILOT: optimized, automated construction of structural variation discovery and benchmarking pipelines.

    Science.gov (United States)

    Leung, Wai Yi; Marschall, Tobias; Paudel, Yogesh; Falquet, Laurent; Mei, Hailiang; Schönhuth, Alexander; Maoz Moss, Tiffanie Yael

    2015-03-25

    Many tools exist to predict structural variants (SVs), utilizing a variety of algorithms. However, they have largely been developed and tested on human germline or somatic (e.g. cancer) variation. It seems appropriate to exploit this wealth of technology available for humans also for other species. Objectives of this work included: a) Creating an automated, standardized pipeline for SV prediction. b) Identifying the best tool(s) for SV prediction through benchmarking. c) Providing a statistically sound method for merging SV calls. The SV-AUTOPILOT meta-tool platform is an automated pipeline for standardization of SV prediction and SV tool development in paired-end next-generation sequencing (NGS) analysis. SV-AUTOPILOT comes in the form of a virtual machine, which includes all datasets, tools and algorithms presented here. The virtual machine easily allows one to add, replace and update genomes, SV callers and post-processing routines and therefore provides an easy, out-of-the-box environment for complex SV discovery tasks. SV-AUTOPILOT was used to make a direct comparison between 7 popular SV tools on the Arabidopsis thaliana genome using the Landsberg (Ler) ecotype as a standardized dataset. Recall and precision measurements suggest that Pindel and Clever were the most adaptable to this dataset across all size ranges while Delly performed well for SVs larger than 250 nucleotides. A novel, statistically-sound merging process, which can control the false discovery rate, reduced the false positive rate on the Arabidopsis benchmark dataset used here by >60%. SV-AUTOPILOT provides a meta-tool platform for future SV tool development and the benchmarking of tools on other genomes using a standardized pipeline. It optimizes detection of SVs in non-human genomes using statistically robust merging. The benchmarking in this study has demonstrated the power of 7 different SV tools for analyzing different size classes and types of structural variants. The optional merge

  3. SeqMule: automated pipeline for analysis of human exome/genome sequencing data.

    Science.gov (United States)

    Guo, Yunfei; Ding, Xiaolei; Shen, Yufeng; Lyon, Gholson J; Wang, Kai

    2015-09-18

    Next-generation sequencing (NGS) technology has greatly helped us identify disease-contributory variants for Mendelian diseases. However, users are often faced with issues such as software compatibility, complicated configuration, and no access to high-performance computing facilities. Discrepancies exist among aligners and variant callers. We developed a computational pipeline, SeqMule, to perform automated variant calling from NGS data on human genomes and exomes. SeqMule integrates computational-cluster-free parallelization capability built on top of the variant callers, and facilitates normalization/intersection of variant calls to generate a consensus set with high confidence. SeqMule integrates 5 alignment tools and 5 variant calling algorithms and accepts various combinations all by one-line command, therefore allowing highly flexible yet fully automated variant calling. In a modern machine (2 Intel Xeon X5650 CPUs, 48 GB memory), when fast turn-around is needed, SeqMule generates annotated VCF files in a day from a 30X whole-genome sequencing data set; when more accurate calling is needed, SeqMule generates a consensus call set that improves over single callers, as measured by both Mendelian error rate and consistency. SeqMule supports Sun Grid Engine for parallel processing, offers a turn-key solution for deployment on Amazon Web Services, and allows quality checks, Mendelian error checks, consistency evaluation, and HTML-based reports. SeqMule is available at http://seqmule.openbioinformatics.org.
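
    SeqMule's consensus step can be pictured as voting over normalized variant keys. A minimal sketch (the pipeline's real normalization and intersection logic is more involved):

```python
from collections import Counter

Variant = tuple  # (chrom, pos, ref, alt)

def consensus_calls(callsets: list[set[Variant]],
                    min_support: int = 3) -> set[Variant]:
    """Keep variants reported by at least min_support callers -- the
    intersection-style strategy used to raise calling confidence."""
    counts = Counter(v for calls in callsets for v in calls)
    return {v for v, n in counts.items() if n >= min_support}
```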

  4. MG-Digger: an automated pipeline to search for giant virus-related sequences in metagenomes

    Directory of Open Access Journals (Sweden)

    Jonathan eVerneau

    2016-03-01

    The number of metagenomic studies conducted each year is growing dramatically. Storage and analysis of such big data is difficult and time-consuming. Interestingly, analysis shows that environmental and human metagenomes include a significant amount of non-annotated sequences, representing a 'dark matter'. We established a bioinformatics pipeline that automatically detects metagenome reads matching query sequences from a given set, and applied this tool to the detection of sequences matching large and giant DNA viral members of the proposed order Megavirales or virophages. A total of 1,045 environmental and human metagenomes (≈1 terabase pairs) were collected, processed and stored on our bioinformatics server. In addition, nucleotide and protein sequences from 93 Megavirales representatives, including 19 giant viruses of amoeba, and five virophages, were collected. The pipeline was generated by scripts written in the Python language and entitled MG-Digger. Metagenomes previously found to contain megavirus-like sequences were tested as controls. MG-Digger was able to annotate hundreds of metagenome sequences as best matching those of giant viruses. These sequences were most often found to be similar to phycodnavirus or mimivirus sequences, but included reads related to recently available pandoraviruses, Pithovirus sibericum, and faustoviruses. Compared to other tools, MG-Digger combined stand-alone use on Linux or Windows operating systems through a user-friendly interface, implementation of ready-to-use customized metagenome databases and query sequence databases, adjustable parameters for BLAST searches, and creation of output files containing selected reads with best match identification. Compared to Metavir 2, a reference tool in viral metagenome analysis, MG-Digger detected 8% more true positive Megavirales-related reads in a control metagenome. The present work shows that massive, automated and recurrent analyses of metagenomes are
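
    Match selection of this kind typically reduces to parsing BLAST tabular output and keeping each read's best-scoring hit under an e-value cutoff. A sketch assuming standard -outfmt 6 columns (MG-Digger's actual filtering logic may differ):

```python
def best_hits(blast_tab_path: str, max_evalue: float = 1e-5) -> dict[str, tuple]:
    """Per query read, keep the (subject, bitscore) of the highest-scoring
    hit passing the e-value cutoff. Assumes BLAST -outfmt 6, where columns
    11 and 12 are e-value and bit score."""
    best: dict[str, tuple] = {}
    with open(blast_tab_path) as fh:
        for line in fh:
            f = line.rstrip('\n').split('\t')
            read, subject = f[0], f[1]
            evalue, bitscore = float(f[10]), float(f[11])
            if evalue <= max_evalue and (read not in best
                                         or bitscore > best[read][1]):
                best[read] = (subject, bitscore)
    return best
```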

  5. ImagePlane: an automated image analysis pipeline for high-throughput screens using the planarian Schmidtea mediterranea.

    Science.gov (United States)

    Flygare, Steven; Campbell, Michael; Ross, Robert Mars; Moore, Barry; Yandell, Mark

    2013-08-01

    ImagePlane is a modular pipeline for automated, high-throughput image analysis and information extraction. Designed to support planarian research, ImagePlane offers a self-parameterizing adaptive thresholding algorithm; an algorithm that can automatically segment animals into anterior-posterior/left-right quadrants for automated identification of region-specific differences in gene and protein expression; and a novel algorithm for quantification of morphology of animals, independent of their orientations and sizes. ImagePlane also provides methods for automatic report generation, and its outputs can be easily imported into third-party tools such as R and Excel. Here we demonstrate the pipeline's utility for identification of genes involved in stem cell proliferation in the planarian Schmidtea mediterranea. Although designed to support planarian studies, ImagePlane will prove useful for cell-based studies as well.

  6. An Automated, Adaptive Framework for Optimizing Preprocessing Pipelines in Task-Based Functional MRI.

    Directory of Open Access Journals (Sweden)

    Nathan W Churchill

    BOLD fMRI is sensitive to blood-oxygenation changes correlated with brain function; however, it is limited by relatively weak signal and significant noise confounds. Many preprocessing algorithms have been developed to control noise and improve signal detection in fMRI. Although the chosen set of preprocessing and analysis steps (the "pipeline") significantly affects signal detection, pipelines are rarely quantitatively validated in the neuroimaging literature, due to complex preprocessing interactions. This paper outlines and validates an adaptive resampling framework for evaluating and optimizing preprocessing choices by optimizing data-driven metrics of task prediction and spatial reproducibility. Compared to standard "fixed" preprocessing pipelines, this optimization approach significantly improves independent validation measures of within-subject test-retest reliability, between-subject activation overlap, and behavioural prediction accuracy. We demonstrate that preprocessing choices function as implicit model regularizers, and that improvements due to pipeline optimization generalize across a range of simple to complex experimental tasks and analysis models. Results are shown for brief scanning sessions (<3 minutes each), demonstrating that with pipeline optimization, it is possible to obtain reliable results and brain-behaviour correlations in relatively small datasets.
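
    One natural way to combine the two data-driven metrics named above is to score each candidate pipeline by its distance from the ideal point (prediction, reproducibility) = (1, 1) and keep the closest; the exact combination used in the framework may differ. A sketch with hypothetical candidate scores:

```python
import math

def pipeline_score(prediction: float, reproducibility: float) -> float:
    """Distance from the ideal (P, R) = (1, 1); smaller is better."""
    return math.hypot(1.0 - prediction, 1.0 - reproducibility)

# Hypothetical (P, R) pairs for two candidate preprocessing pipelines.
candidates = {'fixed': (0.71, 0.62), 'optimized': (0.78, 0.74)}
best = min(candidates, key=lambda k: pipeline_score(*candidates[k]))
print(best)  # -> 'optimized'
```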

  7. The Harvard Automated Processing Pipeline for Electroencephalography (HAPPE): Standardized Processing Software for Developmental and High-Artifact Data

    Directory of Open Access Journals (Sweden)

    Laurel J. Gabard-Durnam

    2018-02-01

    Electroencephalography (EEG) recordings collected with developmental populations present particular challenges from a data processing perspective. These EEGs have a high degree of artifact contamination and often short recording lengths. As both sample sizes and EEG channel densities increase, traditional processing approaches like manual data rejection are becoming unsustainable. Moreover, such subjective approaches preclude standardized metrics of data quality, despite the heightened importance of such measures for EEGs with high rates of initial artifact contamination. There is presently a paucity of automated resources for processing these EEG data and no consistent reporting of data quality measures. To address these challenges, we propose the Harvard Automated Processing Pipeline for EEG (HAPPE) as a standardized, automated pipeline compatible with EEG recordings of variable lengths and artifact contamination levels, including high-artifact and short EEG recordings from young children or those with neurodevelopmental disorders. HAPPE processes event-related and resting-state EEG data from raw files through a series of filtering, artifact rejection, and re-referencing steps to processed EEG suitable for time-frequency-domain analyses. HAPPE also includes a post-processing report of data quality metrics to facilitate the evaluation and reporting of data quality in a standardized manner. Here, we describe each processing step in HAPPE, perform an example analysis with EEG files we have made freely available, and show that HAPPE outperforms seven alternative, widely-used processing approaches. HAPPE removes more artifact than all alternative approaches while simultaneously preserving greater or equivalent amounts of EEG signal in almost all instances. We also provide distributions of HAPPE's data quality metrics in an 867 file dataset as a reference distribution and in support of HAPPE's performance across EEG data with variable artifact contamination.

  8. CFSAN SNP Pipeline: an automated method for constructing SNP matrices from next-generation sequence data

    Directory of Open Access Journals (Sweden)

    Steve Davis

    2015-08-01

    The analysis of next-generation sequence (NGS) data is often a fragmented, step-wise process. For example, multiple pieces of software are typically needed to map NGS reads, extract variant sites, and construct a DNA sequence matrix containing only single nucleotide polymorphisms (i.e., a SNP matrix) for a set of individuals. The management and chaining of these software pieces and their outputs can often be a cumbersome and difficult task. Here, we present CFSAN SNP Pipeline, which combines into a single package the mapping of NGS reads to a reference genome with Bowtie2, processing of those mapping (BAM) files using SAMtools, identification of variant sites using VarScan, and production of a SNP matrix using custom Python scripts. We also introduce a Python package (CFSAN SNP Mutator) that, when given a reference genome, will generate variants of known position against which we validate our pipeline. We created 1,000 simulated Salmonella enterica subsp. enterica serovar Agona genomes at 100× and 20× coverage, each containing 500 SNPs, 20 single-base insertions and 20 single-base deletions. For the 100× dataset, the CFSAN SNP Pipeline recovered 98.9% of the introduced SNPs and had a false positive rate of 1.04 × 10−6; for the 20× dataset, 98.8% of SNPs were recovered and the false positive rate was 8.34 × 10−7. Based on these results, CFSAN SNP Pipeline is a robust and accurate tool that is among the first to combine into a single executable the myriad steps required to produce a SNP matrix from NGS data. Such a tool is useful to those working in an applied setting (e.g., food safety traceback investigations) as well as for those interested in evolutionary questions.
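
    The final matrix-building step reduces to collecting every variant site seen in any isolate and filling in the reference base where an isolate has no call. A minimal sketch over assumed in-memory inputs (the pipeline derives these from the BAM/VarScan outputs):

```python
def snp_matrix(calls: dict[str, dict[tuple, str]],
               reference: dict[tuple, str]) -> str:
    """calls: isolate -> {(chrom, pos): alt_base}; reference must cover
    every site. Rows are isolates, columns are the union of variant sites."""
    sites = sorted({site for c in calls.values() for site in c})
    header = '#sites: ' + ' '.join(f'{chrom}:{pos}' for chrom, pos in sites)
    rows = [name + '\t' + ''.join(c.get(s, reference[s]) for s in sites)
            for name, c in sorted(calls.items())]
    return '\n'.join([header] + rows)
```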

  9. Transcriptator: An Automated Computational Pipeline to Annotate Assembled Reads and Identify Non Coding RNA.

    Directory of Open Access Journals (Sweden)

    Kumar Parijat Tripathi

    RNA-seq is a new tool to measure RNA transcript counts, using high-throughput sequencing at an extraordinary accuracy. It provides quantitative means to explore the transcriptome of an organism of interest. However, interpreting this extremely large data into biological knowledge is a problem, and biologist-friendly tools are lacking. In our lab, we developed Transcriptator, a web application based on a computational Python pipeline with a user-friendly Java interface. This pipeline uses the web services available for BLAST (Basic Local Alignment Search Tool), QuickGO and DAVID (Database for Annotation, Visualization and Integrated Discovery) tools. It offers a report on statistical analysis of functional and Gene Ontology (GO) annotation enrichment. It helps users to identify enriched biological themes, particularly GO terms, pathways, domains, gene/protein features and protein-protein interaction related information. It clusters the transcripts based on functional annotations and generates a tabular report of functional and gene ontology annotations for each transcript submitted to the web server. The implementation of QuickGO web services in our pipeline enables users to carry out GO-Slim analysis, whereas the integration of PORTRAIT (Prediction of transcriptomic non-coding RNA by ab initio methods) helps to identify non-coding RNAs and their regulatory role in the transcriptome. In summary, Transcriptator is a useful software for both NGS and array data. It helps users to characterize de-novo assembled reads obtained from NGS experiments on non-referenced organisms, while it also performs functional enrichment analysis of differentially expressed transcripts/genes for both RNA-seq and micro-array experiments. It generates easy-to-read tables and interactive charts for better understanding of the data. The pipeline is modular in nature, and provides an opportunity to add new plugins in the future. Web application is

  10. An Automated, Adaptive Framework for Optimizing Preprocessing Pipelines in Task-Based Functional MRI

    Science.gov (United States)

    Churchill, Nathan W.; Spring, Robyn; Afshin-Pour, Babak; Dong, Fan; Strother, Stephen C.

    2015-01-01

    BOLD fMRI is sensitive to blood-oxygenation changes correlated with brain function; however, it is limited by relatively weak signal and significant noise confounds. Many preprocessing algorithms have been developed to control noise and improve signal detection in fMRI. Although the chosen set of preprocessing and analysis steps (the “pipeline”) significantly affects signal detection, pipelines are rarely quantitatively validated in the neuroimaging literature, due to complex preprocessing interactions. This paper outlines and validates an adaptive resampling framework for evaluating and optimizing preprocessing choices by optimizing data-driven metrics of task prediction and spatial reproducibility. Compared to standard “fixed” preprocessing pipelines, this optimization approach significantly improves independent validation measures of within-subject test-retest reliability, between-subject activation overlap, and behavioural prediction accuracy. We demonstrate that preprocessing choices function as implicit model regularizers, and that improvements due to pipeline optimization generalize across a range of simple to complex experimental tasks and analysis models. Results are shown for brief scanning sessions (<3 minutes each), demonstrating that with pipeline optimization, it is possible to obtain reliable results and brain-behaviour correlations in relatively small datasets. PMID:26161667

  11. miRPursuit: a pipeline for automated analyses of small RNAs in model and nonmodel plants.

    Science.gov (United States)

    Chaves, Inês; Costa, Bruno Vasques; Rodrigues, Andreia S; Bohn, Andreas; Miguel, Célia M

    2017-08-01

    miRPursuit is a pipeline developed for running end-to-end analyses of high-throughput small RNA (sRNA) sequence data in model and nonmodel plants, from raw data to identified and annotated conserved and novel sequences. It consists of a series of UNIX shell scripts, which connect open-source sRNA analysis software. The involved parameters can be combined with convenient workflow management by users without advanced computational skills. miRPursuit presents several advantages when compared to other tools, including the possibility of processing several sRNA libraries in parallel, thus easily allowing a comparison of the differences in sRNA read accumulation among sRNA libraries. We validate miRPursuit by using datasets from a model plant and discuss its performance with the analysis of sRNAs from non-model species. © 2017 Federation of European Biochemical Societies.

  12. A machine learning pipeline for automated registration and classification of 3D lidar data

    Science.gov (United States)

    Rajagopal, Abhejit; Chellappan, Karthik; Chandrasekaran, Shivkumar; Brown, Andrew P.

    2017-05-01

    Despite the large availability of geospatial data, registration and exploitation of these datasets remains a persistent challenge in geoinformatics. Popular signal processing and machine learning algorithms, such as non-linear SVMs and neural networks, rely on well-formatted input models as well as reliable output labels, which are not always immediately available. In this paper we outline a pipeline for gathering, registering, and classifying initially unlabeled wide-area geospatial data. As an illustrative example, we demonstrate the training and testing of a convolutional neural network to recognize 3D models in the OGRIP 2007 LiDAR dataset using fuzzy labels derived from OpenStreetMap as well as other datasets available on OpenTopography.org. When auxiliary label information is required, various text and natural language processing filters are used to extract and cluster keywords useful for identifying potential target classes. A subset of these keywords is subsequently used to form multi-class labels, with no assumption of independence. Finally, we employ class-dependent geometry extraction routines to identify candidates from both training and testing datasets. Our regression networks are able to identify the presence of 6 structural classes, including roads, walls, and buildings, in volumes as big as 8000 m³ in as little as 1.2 seconds on a commodity 4-core Intel CPU. The presented framework is neither dataset nor sensor-modality limited due to the registration process, and is capable of multi-sensor data fusion.
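
    Feeding lidar into a convolutional network requires converting the irregular point cloud into a dense grid. A minimal occupancy-voxelization sketch in numpy, with the extent and resolution chosen arbitrarily rather than taken from the paper:

```python
import numpy as np

def voxelize(points: np.ndarray, origin: np.ndarray,
             extent_m: float = 20.0, resolution: int = 32) -> np.ndarray:
    """Convert an (N, 3) point cloud into a binary occupancy grid of shape
    (resolution,) * 3 that a 3D convolutional network can consume."""
    rel = (points - origin) / extent_m            # normalize into [0, 1)
    inside = np.all((rel >= 0) & (rel < 1), axis=1)
    idx = (rel[inside] * resolution).astype(int)
    grid = np.zeros((resolution,) * 3, dtype=np.float32)
    grid[idx[:, 0], idx[:, 1], idx[:, 2]] = 1.0
    return grid
```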

  13. Advancements in automated tissue segmentation pipeline for contrast-enhanced CT scans of adult and pediatric patients

    Science.gov (United States)

    Somasundaram, Elanchezhian; Kaufman, Robert; Brady, Samuel

    2017-03-01

    The development of a random forests machine learning technique is presented for fully-automated neck, chest, abdomen, and pelvis tissue segmentation of CT images using the Trainable WEKA (Waikato Environment for Knowledge Analysis) Segmentation (TWS) plugin of FIJI (ImageJ, NIH). The use of a single classifier model to segment six tissue classes (lung, fat, muscle, solid organ, blood/contrast agent, bone) in the CT images is studied. An automated, unbiased scheme to sample pixels from the training images and generate a balanced training dataset over the seven classes is also developed. Two independent training datasets are generated from a pool of 4 adult (>55 kg) and 3 pediatric (<55 kg) patients. Classifier training investigated 28 image filters comprising a total of 272 features. Highly correlated and insignificant features are eliminated using Correlated Feature Subset (CFS) selection with Best First Search (BFS) algorithms in WEKA. The 2 training models (from the 2 training datasets) had 74 and 71 input training features, respectively. The study also investigated the effect of varying the number of trees (25, 50, 100, and 200) in the random forest algorithm. The performance of the 2 classifier models is evaluated on inter-patient intra-slice, intra-patient inter-slice and inter-patient inter-slice test datasets. The Dice similarity coefficients (DSC) and confusion matrices are used to understand the performance of the classifiers across the tissue segments. The effect of the number of features in the training input on the performance of the classifiers for tissue classes with less than optimal DSC values is also studied. The average DSC values for the two training models on the inter-patient intra-slice test data are: 0.98, 0.89, 0.87, 0.79, 0.68, and 0.84 for lung, fat, muscle, solid organ, blood/contrast agent, and bone, respectively. The study demonstrated robust segmentation accuracy for the lung, muscle and fat tissue classes. For solid-organ, blood
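
    An analogous pixel-wise random forest workflow can be sketched with scikit-learn in a few lines; the per-pixel filter features are assumed to be computed upstream, and the settings below are library defaults rather than the paper's tuned configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def train_tissue_classifier(X: np.ndarray, y: np.ndarray) -> RandomForestClassifier:
    """X: (n_pixels, n_features) filter responses; y: tissue class labels
    (e.g. lung, fat, muscle, solid organ, blood/contrast agent, bone)."""
    clf = RandomForestClassifier(n_estimators=100, n_jobs=-1, random_state=0)
    return clf.fit(X, y)

def segment_slice(clf: RandomForestClassifier, feats: np.ndarray) -> np.ndarray:
    """feats: (H, W, n_features) per-pixel features for one CT slice;
    returns an (H, W) map of predicted tissue classes."""
    h, w, n = feats.shape
    return clf.predict(feats.reshape(-1, n)).reshape(h, w)
```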

  14. An image analysis pipeline for automated classification of imaging light conditions and for quantification of wheat canopy cover time series in field phenotyping.

    Science.gov (United States)

    Yu, Kang; Kirchgessner, Norbert; Grieder, Christoph; Walter, Achim; Hund, Andreas

    2017-01-01

    Robust segmentation of canopy cover (CC) from large amounts of images taken under different illumination/light conditions in the field is essential for high throughput field phenotyping (HTFP). We attempted to address this challenge by evaluating different vegetation indices and segmentation methods for analyzing images taken at varying illuminations throughout the early growth phase of wheat in the field. 40,000 images taken on 350 wheat genotypes in two consecutive years were assessed for this purpose. We proposed an image analysis pipeline that allowed for image segmentation using automated thresholding and machine learning based classification methods and for global quality control of the resulting CC time series. This pipeline enabled accurate classification of imaging light conditions into two illumination scenarios, i.e. high light-contrast (HLC) and low light-contrast (LLC), in a series of continuously collected images by employing a support vector machine (SVM) model. Accordingly, the scenario-specific pixel-based classification models employing decision tree and SVM algorithms were able to outperform the automated thresholding methods, as well as improved the segmentation accuracy compared to general models that did not discriminate illumination differences. The three-band vegetation difference index (NDI3) was enhanced for segmentation by incorporating the HSV-V and the CIE Lab-a color components, i.e. the product images NDI3*V and NDI3*a. Field illumination scenarios can be successfully identified by the proposed image analysis pipeline, and the illumination-specific image segmentation can improve the quantification of CC development. The integrated image analysis pipeline proposed in this study provides great potential for automatically delivering robust data in HTFP.

  15. Comparison of attitudes and beliefs regarding the causes of low back pain between UK students and International students studying at Sheffield Hallam University

    Directory of Open Access Journals (Sweden)

    Yousef Shanib

    2017-02-01

    Full Text Available Background & Aim Low Back Pain (LBP) is a widespread problem, yet few past studies have focused on the attitudes and beliefs of UK and international students regarding its causes. This study compares attitudes and beliefs regarding the causes of low back pain between UK students and international students studying at Sheffield Hallam University. Methods The study involved 12 participants (6 UK and 6 international students) studying at Sheffield Hallam University. Data were collected through face-to-face, semi-structured, recorded interviews, which were later transcribed verbatim. Thematic analysis was then carried out on the transcriptions. Results Four main themes were identified from the interview data: personal health and medical related; work related; everyday life and culture related; and government policy and law related. Each main theme comprised smaller sub-themes. Conclusion The attitudes and beliefs of UK and international students at Sheffield Hallam University relate to four main themes: personal health and medical; work; everyday life and culture; and government policy and law. The study identified differences in attitudes and beliefs between UK and international students. As students are the next generation of employees, the study could help increase knowledge of the causes of LBP among students in the UK and internationally, thereby helping to prevent low back pain incidences in the future.

  16. Toward fully automated high performance computing drug discovery: a massively parallel virtual screening pipeline for docking and molecular mechanics/generalized Born surface area rescoring to improve enrichment.

    Science.gov (United States)

    Zhang, Xiaohua; Wong, Sergio E; Lightstone, Felice C

    2014-01-27

    In this work we announce and evaluate a high-throughput virtual screening pipeline for in silico screening of virtual compound databases using high performance computing (HPC). Notable features of this pipeline are an automated receptor preparation scheme with unsupervised binding site identification. The pipeline includes receptor/target preparation, ligand preparation, VinaLC docking calculation, and molecular mechanics/generalized Born surface area (MM/GBSA) rescoring using the GB model by Onufriev and co-workers [J. Chem. Theory Comput. 2007, 3, 156-169]. Furthermore, we leverage HPC resources to perform an unprecedented, comprehensive evaluation of MM/GBSA rescoring when applied to the DUD-E data set (Directory of Useful Decoys: Enhanced), in which we selected 38 protein targets and a total of ∼0.7 million actives and decoys. The computer wall time for virtual screening has been reduced drastically on HPC machines, which increases the feasibility of extremely large ligand database screening with more accurate methods. HPC resources allowed us to rescore 20 poses per compound and evaluate the optimal number of poses to rescore. We find that keeping 5-10 poses is a good compromise between accuracy and computational expense. Overall the results demonstrate that MM/GBSA rescoring has higher average receiver operating characteristic (ROC) area under curve (AUC) values and consistently better early recovery of actives than Vina docking alone. The enrichment performance is, however, target-dependent: MM/GBSA rescoring significantly outperforms Vina docking for the folate enzymes, kinases, and several other enzymes. The more accurate energy function and solvation terms of the MM/GBSA method allow MM/GBSA to achieve better enrichment, but the rescoring is still limited by the docking method to generate the poses with the correct binding modes.
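
    Enrichment comparisons of this kind reduce to scoring known actives against decoys. A minimal sketch of the ROC AUC computation, assuming scikit-learn and illustrative score arrays (the numbers below are made up; docking energies are "lower is better", so they are negated before scoring):

        import numpy as np
        from sklearn.metrics import roc_auc_score

        labels = np.array([1, 1, 0, 0, 0, 1, 0, 0])   # 1 = active, 0 = decoy
        vina = np.array([-9.1, -7.2, -6.8, -5.9, -7.0, -8.5, -6.1, -5.5])
        mmgbsa = np.array([-45.0, -38.0, -22.0, -18.0, -30.0, -41.0, -20.0, -15.0])

        # Negate energies so that higher score = more likely active.
        print("Vina AUC:   ", roc_auc_score(labels, -vina))
        print("MM/GBSA AUC:", roc_auc_score(labels, -mmgbsa))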

  17. An Automated Pipeline for Engineering Many-Enzyme Pathways: Computational Sequence Design, Pathway Expression-Flux Mapping, and Scalable Pathway Optimization.

    Science.gov (United States)

    Halper, Sean M; Cetnar, Daniel P; Salis, Howard M

    2018-01-01

    Engineering many-enzyme metabolic pathways suffers from the design curse of dimensionality. There are an astronomical number of synonymous DNA sequence choices, though relatively few will express an evolutionarily robust, maximally productive pathway without metabolic bottlenecks. To solve this challenge, we have developed an integrated, automated computational-experimental pipeline that identifies a pathway's optimal DNA sequence without high-throughput screening or many cycles of design-build-test. The first step applies our Operon Calculator algorithm to design a host-specific, evolutionarily robust bacterial operon sequence with maximally tunable enzyme expression levels. The second step applies our RBS Library Calculator algorithm to systematically vary enzyme expression levels with the smallest-sized library. After characterizing a small number of constructed pathway variants, measurements are supplied to our Pathway Map Calculator algorithm, which then parameterizes a kinetic metabolic model that ultimately predicts the pathway's optimal enzyme expression levels and DNA sequences. Altogether, our algorithms provide the ability to efficiently map the pathway's sequence-expression-activity space and predict DNA sequences with desired metabolic fluxes. Here, we provide a step-by-step guide to applying the Pathway Optimization Pipeline to a desired multi-enzyme pathway in a bacterial host.
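
    The expression-flux mapping step can be illustrated with a toy kinetic model in which the pathway flux shows a saturating response to each enzyme's expression level. This is a hedged, generic sketch, not the actual Pathway Map Calculator model; all parameter names and values are illustrative assumptions:

        import numpy as np
        from scipy.optimize import minimize

        k = np.array([2.0, 0.5, 1.2])   # illustrative per-enzyme rate constants

        def pathway_flux(E, k=k):
            # Toy series-of-steps model: overall flux limited by the sum of
            # per-step resistances, J = 1 / sum_i 1/(k_i * E_i).
            return 1.0 / np.sum(1.0 / (k * E))

        budget = 300.0  # total expression "budget" (arbitrary units)

        # Maximize flux subject to sum(E) = budget, E > 0.
        res = minimize(lambda E: -pathway_flux(E),
                       x0=np.full(3, budget / 3),
                       constraints={"type": "eq", "fun": lambda E: E.sum() - budget},
                       bounds=[(1e-6, None)] * 3)
        print("optimal expression levels:", res.x, "flux:", pathway_flux(res.x))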

  18. Automated pipeline to analyze non-contact infrared images of the paraventricular nucleus specific leptin receptor knock-out mouse model

    Science.gov (United States)

    Diaz Martinez, Myriam; Ghamari-Langroudi, Masoud; Gifford, Aliya; Cone, Roger; Welch, E. B.

    2015-03-01

    Evidence of leptin resistance is indicated by elevated leptin levels together with other hallmarks of obesity, such as a defect in energy homeostasis [1]. As obesity is an increasing epidemic in the US, the investigation of mechanisms by which leptin resistance has a pathophysiological impact on energy is an intensive field of research [2]. However, the manner in which leptin resistance contributes to the dysregulation of energy, specifically thermoregulation [3], is not known. The aim of this study was to investigate whether the leptin receptor expressed in paraventricular nucleus (PVN) neurons plays a role in thermoregulation at different temperatures. Non-contact infrared (NCIR) thermometry was employed to measure the surface body temperature (SBT) of non-anesthetized mice with a specific deletion of the leptin receptor in the PVN after exposure to room (25 °C) and cold (4 °C) temperature. Dorsal-side infrared images of wild type (LepRwtwt/sim1-Cre), heterozygous (LepRfloxwt/sim1-Cre) and knock-out (LepRfloxflox/sim1-Cre) mice were collected. Images were input to an automated post-processing pipeline developed in MATLAB to calculate average and maximum SBTs. Linear regression was used to evaluate the relationship of sex, cold exposure and leptin genotype with the SBT measurements. Findings indicate that average SBT has a negative relationship to the LepRfloxflox/sim1-Cre genotype, the female sex and cold exposure, whereas maximum SBT is affected by the LepRfloxflox/sim1-Cre genotype and the female sex. In conclusion, these data suggest that leptin within the PVN may have a neuroendocrine role in thermoregulation, and that NCIR thermometry combined with an automated image-processing pipeline is a promising approach to determine SBT in non-anesthetized mice.
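
    The post-processing described above reduces to masking the animal in each infrared frame, extracting summary temperatures, and regressing them on covariates. A minimal Python sketch (the study's pipeline was in MATLAB); the threshold value and variable names are illustrative assumptions:

        import numpy as np

        def surface_body_temps(ir_frame, floor_temp=28.0):
            # ir_frame: 2-D array of calibrated temperatures (deg C).
            # Assume the mouse is warmer than the background; mask by threshold.
            body = ir_frame[ir_frame > floor_temp]
            return body.mean(), body.max()

        def fit_linear(X, y):
            # Simple linear model: SBT ~ intercept + genotype + sex + cold_exposure,
            # with X holding one column per coded covariate.
            X1 = np.column_stack([np.ones(len(y)), X])    # add intercept column
            beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
            return beta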

  19. Pipeline engineering

    CERN Document Server

    Liu, Henry

    2003-01-01

    PART I: PIPE FLOWS. INTRODUCTION: Definition and Scope; Brief History of Pipelines; Existing Major Pipelines; Importance of Pipelines; Freight (Solids) Transport by Pipelines; Types of Pipelines; Components of Pipelines; Advantages of Pipelines; References. SINGLE-PHASE INCOMPRESSIBLE NEWTONIAN FLUID: Introduction; Flow Regimes; Local Mean Velocity and Its Distribution (Velocity Profile); Flow Equations for One-Dimensional Analysis; Hydraulic and Energy Grade Lines; Cavitation in Pipeline Systems; Pipe in Series and Parallel; Interconnected Reservoirs; Pipe Network; Unsteady Flow in Pipe. SINGLE-PHASE COMPRESSIBLE FLOW IN PIPE: Flow Ana

  20. metaBIT, an integrative and automated metagenomic pipeline for analysing microbial profiles from high-throughput sequencing shotgun data

    DEFF Research Database (Denmark)

    Louvel, Guillaume; Der Sarkissian, Clio; Hanghøj, Kristian Ebbesen

    2016-01-01

    Micro-organisms account for most of the Earth's biodiversity and yet remain largely unknown. The complexity and diversity of microbial communities present in clinical and environmental samples can now be robustly investigated in record time and at low cost thanks to recent advances in high-throughput DNA sequencing (HTS). Here, we develop metaBIT, an open-source computational pipeline automating routine microbial profiling of shotgun HTS data. Customizable by the user at different stringency levels, it performs robust taxonomy-based assignment and relative abundance calculation of microbial taxa, as well as cross-sample statistical analyses of microbial diversity distributions. We demonstrate the versatility of metaBIT within a range of published HTS data sets sampled from the environment (soil and seawater) and the human body (skin and gut), but also from archaeological specimens. We present
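
    Relative abundance profiles and the diversity summaries mentioned above are straightforward to compute once reads have been assigned to taxa. A hedged sketch, assuming a dict of per-taxon read counts; names and numbers are illustrative, not metaBIT's API:

        import math

        counts = {"Bacteroides": 5200, "Prevotella": 1300, "Escherichia": 250}

        total = sum(counts.values())
        rel_abund = {taxon: n / total for taxon, n in counts.items()}

        # Shannon diversity index H' = -sum(p_i * ln p_i), one common
        # summary of a microbial diversity distribution.
        shannon = -sum(p * math.log(p) for p in rel_abund.values() if p > 0)
        print(rel_abund, shannon)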

  1. Whole genome sequencing of group A Streptococcus: development and evaluation of an automated pipeline for emm gene typing

    Directory of Open Access Journals (Sweden)

    Georgia Kapatai

    2017-04-01

    Full Text Available Streptococcus pyogenes group A Streptococcus (GAS) is the most common cause of bacterial throat infections, and can cause mild to severe skin and soft tissue infections, including impetigo, erysipelas and necrotizing fasciitis, as well as systemic and fatal infections including septicaemia and meningitis. The estimated annual incidence of invasive group A streptococcal infection (iGAS) in industrialised countries is approximately three per 100,000 per year. Typing is currently used in England and Wales to monitor bacterial strains of S. pyogenes causing invasive infections and those isolated from patients and healthcare/care workers in cluster and outbreak situations. Sequence analysis of the emm gene is the currently accepted gold standard methodology for GAS typing. A comprehensive database of emm types observed from superficial and invasive GAS strains from England and Wales informs outbreak control teams during investigations. Each year the Bacterial Reference Department, Public Health England (PHE) receives approximately 3,000 GAS isolates from England and Wales. In April 2014 the Bacterial Reference Department, PHE began genomic sequencing of referred S. pyogenes isolates and those pertaining to selected elderly/nursing care or maternity clusters from 2010, to inform future reference services and outbreak analysis (n = 3,047). In line with the modernizing strategy of PHE, we developed a novel bioinformatics pipeline that can predict emm types using whole genome sequence (WGS) data. The efficiency of this method was measured by comparing the emm type assigned by this method against the result from the current gold standard methodology; concordance to emm subtype level was observed in 93.8% (2,852/3,040) of our cases, whereas in 2.4% (n = 72) of our cases concordance was observed to emm type level. The remaining 3.8% (n = 117) of our cases corresponded to novel types/subtypes, contamination, laboratory sample transcription errors or problems arising
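
    Concordance checks of this kind compare the WGS-predicted type with the gold-standard result at two resolutions. A small sketch, assuming emm results are written as strings of the form "type.subtype" (e.g. "emm1.0"); the data values are illustrative:

        pairs = [("emm1.0", "emm1.0"), ("emm89.0", "emm89.2"), ("emm12.0", "emm12.0")]

        # Exact match = concordant to subtype level; same prefix before the
        # dot but a different suffix = concordant to type level only.
        subtype_match = sum(p == g for p, g in pairs)
        type_match = sum(p.split(".")[0] == g.split(".")[0] and p != g for p, g in pairs)

        n = len(pairs)
        print(f"concordant to subtype level: {subtype_match / n:.1%}")
        print(f"concordant to type level only: {type_match / n:.1%}")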

  2. Slurry pipeline technology: an overview

    Energy Technology Data Exchange (ETDEWEB)

    Chapman, Jay P. [Pipeline Systems Incorporated (PSI), Belo Horizonte, MG (Brazil); Lima, Rafael; Pinto, Daniel; Vidal, Alisson [Ausenco do Brasil Engenharia Ltda., Nova Lima, MG (Brazil). PSI Div.

    2009-12-19

    Slurry pipelines represent an economical and environmentally friendly transportation means for many solid materials. This paper provides an overview of the technology, its evolution and current Brazilian activity. Mineral resources are increasingly moving farther away from ports, processing plants and end use points, and slurry pipelines are an important mode of solids transport. Application guidelines are discussed. State-of-the-art technical solutions such as pipeline system simulation, pipe materials, pumps, valves, automation, telecommunications, and construction techniques that have made the technology successful are presented. A discussion of where long-distance slurry pipelines fit in a picture that also includes pipelining of thickened and paste materials is included. (author)

  3. Fish the ChIPs: a pipeline for automated genomic annotation of ChIP-Seq data

    Directory of Open Access Journals (Sweden)

    Minucci Saverio

    2011-10-01

    Full Text Available Abstract Background High-throughput sequencing is generating massive amounts of data at a pace that largely exceeds the throughput of data analysis routines. Here we introduce Fish the ChIPs (FC), a computational pipeline aimed at a broad public of users and designed to perform complete ChIP-Seq data analysis of an unlimited number of samples, thus increasing throughput and reproducibility while saving time. Results Starting from short read sequences, FC performs the following steps: (1) quality controls, (2) alignment to a reference genome, (3) peak calling, (4) genomic annotation, (5) generation of raw signal tracks for visualization on the UCSC and IGV genome browsers. FC exploits some of the fastest and most effective tools available today. Installation on a Mac platform requires very basic computational skills, while configuration and usage are supported by a user-friendly graphical user interface. Alternatively, FC can be compiled from the source code on any Unix machine and then run with the possibility of customizing each single parameter through a simple configuration text file that can be generated using a dedicated user-friendly web-form. Considering the execution time, FC can be run on a desktop machine, even though the use of a computer cluster is recommended for analyses of large batches of data. FC is perfectly suited to work with data coming from Illumina Solexa Genome Analyzers or ABI SOLiD, and its usage can potentially be extended to any sequencing platform. Conclusions Compared to existing tools, FC has two main advantages that make it suitable for a broad range of users. First of all, it can be installed and run by wet biologists on a Mac machine. Besides, it can handle an unlimited number of samples, being convenient for large analyses. In this context, computational biologists can increase the reproducibility of their ChIP-Seq data analyses while saving time for downstream analyses. Reviewers This article was reviewed by Gavin Huttley, George
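
    A pipeline of this shape is essentially a fixed sequence of external tools driven by a configuration file. A generic, hedged sketch of such a driver (not FC's actual code; the tool commands, file names and config keys are illustrative):

        import subprocess
        import configparser

        cfg = configparser.ConfigParser()
        cfg.read("pipeline.cfg")                  # illustrative config file
        reads = cfg["input"]["reads"]
        genome = cfg["reference"]["genome"]

        # Ordered steps of a minimal ChIP-Seq run; commands are illustrative.
        steps = [
            ("quality_control", ["fastqc", reads]),
            ("alignment",       ["bwa", "mem", "-o", "aligned.sam", genome, reads]),
            ("peak_calling",    ["macs2", "callpeak", "-t", "aligned.sam", "-n", "sample"]),
        ]

        for name, cmd in steps:
            print(f"[pipeline] running step: {name}")
            subprocess.run(cmd, check=True)       # abort the pipeline on any failure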

  4. P56-M High-Throughput Genotyping of International HapMap Project Populations with Applied Biosystems TaqMan Drug Metabolism Genotyping Assays: An Automated Laboratory and Analysis Pipeline

    Science.gov (United States)

    Haque, K. A.; Wronka, L. M.; Dagnall, C. L.; Stefan, C. M.; Beerman, M. B.; Hicks, B. D.; Welch, R. A.

    2007-01-01

    Although high-density whole-genome SNP scans are available for association studies, the tagging SNP approach used to design many of these panels from International HapMap Project data may miss a substantial number of coding functional variations of drug metabolism enzymes (DME). In fact, more than 40 DME genes are not covered by the HapMap Project, probably due to the difficulties in assay design for these highly homologous gene families. Additionally, many of these technologies do not provide detection in a high number of known DME genes, leading to further gaps in whole-genome scans. Of the polymorphic putative functional DME variants not typed in HapMap, a large proportion is untagged by any combination of HapMap SNPs. Therefore, to correlate phenotypes to putative functional DME variations in pharmacogenomic studies, direct genotyping of these functional SNPs will be necessary. Applied Biosystems has developed a panel of N = 2394 TaqMan Drug Metabolism Genotyping Assays to interrogate putative functional variations in N = 220 DME genes. At the National Cancer Institute’s Core Genotyping Facility (CGF), an automated, high-throughput pipeline has been created to genotype these assays on the International HapMap Project population. DNA sample preparation and handling, assay set-up, genotype analysis, and data publishing at the SNP500 Cancer Database (http://snp500cancer.nci.nih.gov) have all been automated. Using a series of custom-designed methods on five Beckman Coulter Biomek FXs, a Laboratory Information Management System, and analysis software, >650,000 genotypes have been obtained and analyzed by a single person in about 8 weeks. Using this pipeline, a completion rate of >99% and no Mendelian inheritance errors were observed. Furthermore, the CGF has implemented quality-controlled, automated pipelines for sample receiving, quantification, numerous DNA handling procedures, genotyping, and analysis for all samples and studies processed.
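
    The Mendelian-consistency QC mentioned above can be expressed as a simple trio check: one child allele must be attributable to each parent. A minimal sketch, assuming genotypes are two-character allele strings; this is an illustration, not the CGF's actual software:

        def mendelian_consistent(child, mother, father):
            # Genotypes as two-allele strings, e.g. "AG"; order-insensitive.
            c1, c2 = child
            return ((c1 in mother and c2 in father) or
                    (c2 in mother and c1 in father))

        assert mendelian_consistent("AG", "AA", "GG")       # A from mother, G from father
        assert not mendelian_consistent("GG", "AA", "AC")   # no G available from either parent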

  5. Leadership Pipeline

    DEFF Research Database (Denmark)

    Elmholdt, Claus Westergård

    2012-01-01

    The article analyses the foundations of the Leadership Pipeline model with a view to assessing the substance behind the model and the prospects for generalising the model to a Danish organisational context.

  6. Leadership Pipeline

    DEFF Research Database (Denmark)

    Elmholdt, Claus Westergård

    2013-01-01

    The article examines the empirical basis of the Leadership Pipeline. First, the Leadership Pipeline model of leadership passages and crossroads in upward transitions between organisational leadership levels is described (Freedman, 1998; Charan, Drotter and Noel, 2001). Next, the focus turns to the … the relationship between continuity and discontinuity in leadership competencies across organisational levels is presented and discussed. Finally, the limitations of a competency-based approach to the Leadership Pipeline are discussed, and it is proposed that successful leadership depends just as much on …

  7. Pipeline Power

    OpenAIRE

    Hubert, Franz; Cobanli, Onur

    2012-01-01

    We use cooperative game theory to analyze the impact of three controversial pipeline projects on the power structure in the Eurasian trade of natural gas. Two of them, Nord Stream and South Stream, allow Russian gas to bypass transit countries, Ukraine and Belarus. Nord Stream’s strategic value turns out to be huge, justifying the high investment cost for Germany and Russia. The additional leverage obtained through South Stream, in contrast, appears small. The third project, Nabucco, aims at ...

  8. Underground pipeline corrosion

    CERN Document Server

    Orazem, Mark

    2014-01-01

    Underground pipelines transporting liquid petroleum products and natural gas are critical components of civil infrastructure, making corrosion prevention an essential part of asset-protection strategy. Underground Pipeline Corrosion provides a basic understanding of the problems associated with corrosion detection and mitigation, and of the state of the art in corrosion prevention. The topics covered in part one include: basic principles for corrosion in underground pipelines, AC-induced corrosion of underground pipelines, significance of corrosion in onshore oil and gas pipelines, n

  9. The e-MERLIN Data Reduction Pipeline

    OpenAIRE

    Megan Kirsty Argo

    2015-01-01

    Written in Python and utilising ParselTongue to interface with the Astronomical Image Processing System (AIPS), the e-MERLIN data reduction pipeline is intended to automate the procedures required in processing and calibrating radio astronomy data from the e-MERLIN correlator. Driven by a plain text file of input parameters, the pipeline is modular and can be run in stages by the user, depending on requirements. The software includes options to load raw data, average in time and/or frequency,...

  10. BPC Trenchless Pipeline System

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, Martin

    2011-03-15

    The BPC trenchless system features pipeline installation through directional drilling, utilizing horizontal directional boring along with state-of-the-art mapping and product coating integrity surveys after pipe placement. The trenchless system developed by BPC reduces environmental disturbances, with savings in reclaiming pipeline rights-of-way, such as re-seeding and contouring. It is felt that, for smaller pipelines of less than 1,000 metres, a reduction in environmental footprint is now technically feasible using this technology.

  11. Real-time implementation of cochlear implant speech processing pipeline on smartphones.

    Science.gov (United States)

    Parris, Shane; Torlak, Murat; Kehtarnavaz, Nasser

    2014-01-01

    This paper presents the real-time implementation of an adaptive speech processing pipeline for cochlear implants on the smartphone platform. The pipeline is capable of real-time classification of background noise environment and automated tuning of a noise suppression component based upon the detected background noise environment. This pipeline was previously implemented on the FDA-approved PDA platform for cochlear implant studies. The paper discusses the steps taken to achieve the real-time implementation of the pipeline on the smartphone platform. In addition, it includes the real-time timing as well as the noise suppression results when the entire pipeline was run on the smartphone platform.

  12. Bioinformatic pipelines in Python with Leaf

    Science.gov (United States)

    2013-01-01

    Background An incremental, loosely planned development approach is often used in bioinformatic studies when dealing with custom data analysis in a rapidly changing environment. Unfortunately, the lack of rigorous software structuring can undermine the maintainability, communicability and replicability of the process. To ameliorate this problem we propose the Leaf system, the aim of which is to seamlessly introduce pipeline formality on top of a dynamic development process with minimum overhead for the programmer, thus providing a simple layer of software structuring. Results Leaf includes a formal language for the definition of pipelines with code that can be transparently inserted into the user’s Python code. Its syntax is designed to visually highlight dependencies in the pipeline structure it defines. While encouraging the developer to think in terms of bioinformatic pipelines, Leaf supports a number of automated features including data and session persistence, consistency checks between steps of the analysis, processing optimization and publication of the analytic protocol in the form of a hypertext. Conclusions Leaf offers a powerful balance between plan-driven and change-driven development environments in the design, management and communication of bioinformatic pipelines. Its unique features make it a valuable alternative to other related tools. PMID:23786315
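
    The idea of layering pipeline structure over ordinary Python can be illustrated with a tiny dependency-tracked pipeline. This is a generic sketch of the concept, not Leaf's actual language or API; all names are illustrative:

        class Pipeline:
            def __init__(self):
                self.steps, self.cache = {}, {}

            def step(self, *deps):
                # Decorator registering a function as a pipeline node with dependencies.
                def register(fn):
                    self.steps[fn.__name__] = (fn, deps)
                    return fn
                return register

            def run(self, name):
                # Run a node after its dependencies; cache results for persistence.
                if name not in self.cache:
                    fn, deps = self.steps[name]
                    self.cache[name] = fn(*(self.run(d) for d in deps))
                return self.cache[name]

        p = Pipeline()

        @p.step()
        def load():
            return [3, 1, 2]

        @p.step("load")
        def sort_data(data):
            return sorted(data)

        print(p.run("sort_data"))   # runs load, then sort_data; outputs are cached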

  13. Pipeline system operability review

    Energy Technology Data Exchange (ETDEWEB)

    Eriksson, Kjell [Det Norske Veritas (Norway); Davies, Ray [CC Technologies, Dublin, OH (United States)

    2005-07-01

    Pipeline operators are continuously working to improve the safety of their systems and operations. In the US, both liquid and gas pipeline operators have worked with the regulators over many years to develop more systematic approaches to pipeline integrity management. To successfully manage pipeline integrity, vast amounts of data from different sources need to be collected, overlaid and analyzed in order to assess the current condition and predict future degradation. The efforts undertaken by the operators have had a significant impact on pipeline safety; nevertheless, during recent years we have seen a number of major high-profile accidents. One can therefore ask how effective the pipeline integrity management systems and processes are. This paper presents one methodology, the 'Pipeline System Operability Review', that can evaluate and rate the effectiveness of both the management systems and procedures, as well as the technical condition of the hardware. The results from the review can be used to compare the performance of different pipelines within one operating company, as well as to benchmark against international best practices. (author)

  14. Slurry pipeline hydrostatic testing

    Energy Technology Data Exchange (ETDEWEB)

    Betinol, Roy G.; Navarro Rojas, Luis Alejandro [BRASS Chile S.A., Santiago (Chile)

    2009-07-01

    The transportation of concentrates and tailings through long distance pipelines has been proven in recent years to be the most economic, environmentally friendly and secure means of transporting mine products. This success has led to an increase in the demand for long distance pipelines throughout the mining industry. In 2007 alone, over 500 km of pipeline was installed in South America, and over 800 km are in the planning stages. As more pipelines are being installed, the need to ensure their operating integrity is ever increasing. Hydrostatic testing of long distance pipelines is one of the most economical and expeditious ways of proving the operational integrity of the pipe. The intent of this paper is to show the sound reasoning behind construction hydro testing and the economic benefit it presents. It will show how hydro test pressures are determined based on ASME B31.11 criteria. (author)
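
    Hydrostatic test pressures of this kind trace back to pipe hoop stress through Barlow's formula, P = 2St/D. A hedged sketch only; the design factor, the 1.25 test multiplier and all pipe parameters below are illustrative assumptions and must be confirmed against the governing code (here ASME B31.11):

        def barlow_pressure(smys_mpa, wall_mm, od_mm, design_factor=0.72):
            # Barlow's formula: internal pressure producing hoop stress S in a
            # thin-walled pipe of wall thickness t and outside diameter D,
            # derated by an assumed design factor.
            return 2.0 * smys_mpa * wall_mm / od_mm * design_factor

        design_p = barlow_pressure(smys_mpa=415.0, wall_mm=9.5, od_mm=610.0)  # X60-ish pipe
        test_p = 1.25 * design_p   # assumed hydrotest factor on design pressure
        print(f"design {design_p:.1f} MPa, hydrotest {test_p:.1f} MPa")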

  15. Slurry pipeline design approach

    Energy Technology Data Exchange (ETDEWEB)

    Betinol, Roy; Navarro R, Luis [Brass Chile S.A., Santiago (Chile)

    2009-12-19

    Compared to other engineering technologies, the design of a commercial long distance slurry pipeline is a relatively new engineering concept which gained recognition in the mid-1960s. Slurry pipelines were first introduced to reduce the cost of transporting coal to power generating units. Since then this technology has caught on worldwide for transporting other minerals such as limestone, copper, zinc and iron. In South America, the use of pipelines is commonly practiced in the transport of copper (Chile, Peru and Argentina), iron (Chile and Brazil), zinc (Peru) and bauxite (Brazil). As more mining operations expand and new mine facilities are opened, the long distance slurry pipeline will continue to present a commercially viable design option. The intent of this paper is to present the design process and discuss new techniques and approaches used today to ensure a better, safer and more economical slurry pipeline. (author)

  16. Transforming microbial genotyping: a robotic pipeline for genotyping bacterial strains.

    Directory of Open Access Journals (Sweden)

    Brian O'Farrell

    Full Text Available Microbial genotyping increasingly deals with large numbers of samples, and data are commonly evaluated by unstructured approaches, such as spreadsheets. The efficiency, reliability and throughput of genotyping would benefit from the automation of manual manipulations within the context of sophisticated data storage. We developed a medium-throughput genotyping pipeline for MultiLocus Sequence Typing (MLST) of bacterial pathogens. This pipeline was implemented through a combination of four automated liquid handling systems, a Laboratory Information Management System (LIMS) consisting of a variety of dedicated commercial operating systems and programs, including a Sample Management System, plus numerous Python scripts. All tubes and microwell racks were bar-coded and their locations and status were recorded in the LIMS. We also created a hierarchical set of items that could be used to represent bacterial species, their products and experiments. The LIMS allowed reliable, semi-automated, traceable bacterial genotyping from initial single colony isolation and sub-cultivation through DNA extraction and normalization to PCRs, sequencing and MLST sequence trace evaluation. We also describe robotic sequencing to facilitate cherrypicking of sequence dropouts. This pipeline is user-friendly, with a throughput of 96 strains within 10 working days. More than 200,000 items were processed by two to three people. Our sophisticated automated pipeline can be implemented by a small microbiology group without extensive external support, and provides a general framework for semi-automated bacterial genotyping of large numbers of samples at low cost.

  17. Transforming microbial genotyping: a robotic pipeline for genotyping bacterial strains.

    Science.gov (United States)

    O'Farrell, Brian; Haase, Jana K; Velayudhan, Vimalkumar; Murphy, Ronan A; Achtman, Mark

    2012-01-01

    Microbial genotyping increasingly deals with large numbers of samples, and data are commonly evaluated by unstructured approaches, such as spreadsheets. The efficiency, reliability and throughput of genotyping would benefit from the automation of manual manipulations within the context of sophisticated data storage. We developed a medium-throughput genotyping pipeline for MultiLocus Sequence Typing (MLST) of bacterial pathogens. This pipeline was implemented through a combination of four automated liquid handling systems, a Laboratory Information Management System (LIMS) consisting of a variety of dedicated commercial operating systems and programs, including a Sample Management System, plus numerous Python scripts. All tubes and microwell racks were bar-coded and their locations and status were recorded in the LIMS. We also created a hierarchical set of items that could be used to represent bacterial species, their products and experiments. The LIMS allowed reliable, semi-automated, traceable bacterial genotyping from initial single colony isolation and sub-cultivation through DNA extraction and normalization to PCRs, sequencing and MLST sequence trace evaluation. We also describe robotic sequencing to facilitate cherrypicking of sequence dropouts. This pipeline is user-friendly, with a throughput of 96 strains within 10 working days. More than 200,000 items were processed by two to three people. Our sophisticated automated pipeline can be implemented by a small microbiology group without extensive external support, and provides a general framework for semi-automated bacterial genotyping of large numbers of samples at low cost.

  18. Central oxygen pipeline failure

    African Journals Online (AJOL)

    the newly replaced main oxygen valve was slowly opened to allow pipeline pressure to build. ... The gas then passes through high-pressure regulators, which regulate the ...

  19. Chechnya: the pipeline front

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1999-11-01

    This article examines the impact of the Russian campaign against Chechnya on projects for oil and gas pipelines from the new Caspian republics, which are seeking financial support. Topics discussed include the pipeline transport of oil from Azerbaijan through Chechnya to the Black Sea, the use of oil money to finance the war, the push for non-Russian export routes, the financing of pipelines, the impact of the war on the supply of Russian and Turkmenistan gas to Turkey, the proposed construction of the Trans-Caspian pipeline, the weakening of trust between Russia and its neighbours, and the potential for trans-Caucasus republics to look to western backers due to the instability of the North Caucasus. (UK)

  20. The e-MERLIN Data Reduction Pipeline

    Directory of Open Access Journals (Sweden)

    Megan Kirsty Argo

    2015-01-01

    Full Text Available Written in Python and utilising ParselTongue to interface with the Astronomical Image Processing System (AIPS), the e-MERLIN data reduction pipeline is intended to automate the procedures required in processing and calibrating radio astronomy data from the e-MERLIN correlator. Driven by a plain text file of input parameters, the pipeline is modular and can be run in stages by the user, depending on requirements. The software includes options to load raw data, average in time and/or frequency, flag known sources of interference, flag more comprehensively with SERPent, carry out some or all of the calibration procedures (including self-calibration), and image in either normal or wide-field mode. It also optionally produces a number of useful diagnostic plots at various stages so that the quality of the data can be assessed. The software is available for download from the e-MERLIN website or via GitHub.

  1. Automatic pipeline operation using Petri Nets

    Energy Technology Data Exchange (ETDEWEB)

    Moreira, Guilherme O. [PETROBRAS TRANSPORTE S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    A pipeline operation requires several actions, attention and time from the control room operator in each of its operating phases. This article proposes using automation as something more than a remote control, drastically reducing the number of repetitive and routine actions needed from the operator to start and stop the system, granting more time for system supervision and decision making during critical conditions, and avoiding errors caused by the need to execute several actions in a short period of time. To achieve these objectives, the pipeline operation is modeled as a Petri Net consisting of states, events and actions. A methodology for converting this Petri Net into Ladder controller code is also proposed. (author)
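
    The core of such a model is a marked Petri net: a transition fires when all of its input places hold tokens, moving the system between operating states. A minimal sketch in Python; the places and transitions are illustrative, not the article's actual model or its Ladder translation:

        marking = {"idle": 1, "pump_ready": 1, "running": 0}

        # Each transition: (input places consumed, output places produced).
        transitions = {
            "start_pump": (["idle", "pump_ready"], ["running"]),
            "stop_pump":  (["running"], ["idle", "pump_ready"]),
        }

        def enabled(name):
            return all(marking[p] > 0 for p in transitions[name][0])

        def fire(name):
            assert enabled(name), f"{name} not enabled"
            inputs, outputs = transitions[name]
            for p in inputs:
                marking[p] -= 1
            for p in outputs:
                marking[p] += 1

        fire("start_pump")   # one operator command replaces several manual steps
        print(marking)       # {'idle': 0, 'pump_ready': 0, 'running': 1}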

  2. 75 FR 13342 - Pipeline Safety: Workshop on Distribution Pipeline Construction

    Science.gov (United States)

    2010-03-19

    ... practices. (3) Construction Issues, Joining, Horizontal Directional Drill/Boring, Excavation Damage. (4)... Construction AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Notice of... Safety Representatives (NAPSR) on new distribution pipeline construction. The workshop will allow...

  3. A common registration-to-publication automated pipeline for nomenclatural acts for higher plants (International Plant Names Index, IPNI), fungi (Index Fungorum, MycoBank) and animals (ZooBank)

    NARCIS (Netherlands)

    Robert, Vincent

    2016-01-01

    A collaborative effort among four lead indexes of taxon names and nomenclatural acts (International Plant Name Index (IPNI), Index Fungorum, MycoBank and ZooBank) and the journals PhytoKeys, MycoKeys and ZooKeys to create an automated, pre-publication registration workflow, based on a

  4. Long term pipeline monitoring in geomechanically sensitive environments

    Energy Technology Data Exchange (ETDEWEB)

    Weir-Jones, I.; Sun, M. [Weir-Jones Engineering Consultants Ltd. (Canada)

    2011-07-01

    In the oil industry, monitoring pipeline structural integrity is necessary for both regulatory and environmental purposes. Weir-Jones Engineering Consultants developed an automated structural integrity monitoring (SIM) system in which data on strain, displacement and temperature are continuously acquired and automatically transmitted to the monitoring personnel. The aim of this paper is to present this technology and its implementation on one of Inter Pipeline Fund's lines. The automated SIM equipment was installed on a new 42'' line at the crossing of the Clearwater River close to Fort McMurray. Results showed that this technology is a good way to monitor pipelines in remote locations, environmentally sensitive areas, river and embankment crossings, and locations where external forces can put the pipeline at risk; used elsewhere, however, it would not be cost effective. This paper described the developed automated SIM system and the specific locations where its deployment is cost effective.

  5. Transient cavitation in pipelines

    NARCIS (Netherlands)

    Kranenburg, C.

    1974-01-01

    The aim of the present study is to set up a one-dimensional mathematical model, which describes the transient flow in pipelines, taking into account the influence of cavitation and free gas. The flow will be conceived of as a three-phase flow of the liquid, its vapour and non-condensible gas. The

  6. The Photometry Pipeline of the Watcher Robotic Telescope

    Directory of Open Access Journals (Sweden)

    A. Ferrero

    2010-01-01

    Full Text Available The Watcher robotic telescope was developed primarily to perform rapid optical follow-up observations of Gamma-Ray Bursts (GRBs). Secondary scientific goals include blazar monitoring and variable star studies. An automated photometry pipeline to rapidly analyse data from Watcher has been implemented. Details of the procedures to obtain the image zero-point, source instrumental measurement, and limiting magnitude are presented. Sources of uncertainty are assessed and the performance of the pipeline is tested by comparison with a number of catalogue sources.
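
    The zero-point step described above amounts to comparing instrumental magnitudes of matched field stars with their catalogue values. A compact sketch, assuming NumPy arrays of matched stars; the numbers are illustrative:

        import numpy as np

        flux = np.array([15300.0, 8200.0, 31000.0])   # instrumental counts of matched stars
        cat_mag = np.array([14.2, 14.9, 13.4])        # catalogue magnitudes of the same stars

        inst_mag = -2.5 * np.log10(flux)
        zp = np.median(cat_mag - inst_mag)            # image zero-point

        target_flux = 4700.0
        target_mag = zp - 2.5 * np.log10(target_flux) # calibrated magnitude of the source
        print(f"ZP = {zp:.2f}, target = {target_mag:.2f} mag")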

  7. Stress analysis of vibrating pipelines

    Science.gov (United States)

    Zachwieja, Janusz

    2017-03-01

    Pipelines are subject to various constraint forces that vary in time. The resulting vibrations, if not monitored for amplitude and frequency, may result in both fatigue damage in the pipeline profile at points of high stress concentration and damage to the pipeline supports. If the constraint forces are known, the system response may be determined with high accuracy using analytical or numerical methods. In most cases, however, it is difficult to determine the constraint parameters, since industrial pipeline vibrations occur due to the dynamic effects of the medium in the pipeline. In that case, vibration analysis is a suitable alternative method for determining the stress-strain state in the pipeline profile. Monitoring pipeline vibration levels involves a comparison between the measured vibration parameters and the permissible values as depicted in the graphs for a specific pipeline type. Unfortunately, in most cases, such studies relate to the petrochemical industry and thus to large-diameter, long and straight pipelines. For a pipeline section supported on both ends, the response in any profile along the entire section length can be determined by measuring the vibration parameters at two different profiles between the pipeline supports. For a straight pipeline section, the bending moments, variable in time, at the ends of the analysed section are a source of the pipe excitation. If a straight pipe section supported on both ends is excited by the bending moments in the support profile, the starting point for the stress analysis are the strains, determined from the Euler-Bernoulli equation. In practice, it is easier to determine the displacement using experimental methods, since the factors causing vibrations are unknown. Industrial system pipelines, unlike transfer pipelines, are straight sections at some points only, which makes it more difficult to formulate the equation of motion. In those cases, numerical methods can be used to determine stresses using the
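
    For reference, the Euler-Bernoulli relations underlying this kind of analysis can be written as follows (standard beam theory, not notation taken from the paper):

        EI \frac{\partial^4 w}{\partial x^4} + \rho A \frac{\partial^2 w}{\partial t^2} = q(x,t),
        \qquad
        \sigma(x,z,t) = -E \, z \, \frac{\partial^2 w}{\partial x^2}

    where w is the transverse deflection, EI the bending stiffness, ρA the mass per unit length, q the distributed load, and σ the bending stress at distance z from the neutral axis; measuring w (or the strains) at two profiles fixes the bending field between the supports.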

  8. Historical analysis of US pipeline accidents triggered by natural hazards

    Science.gov (United States)

    Girgin, Serkan; Krausmann, Elisabeth

    2015-04-01

    Natural hazards, such as earthquakes, floods, landslides, or lightning, can initiate accidents in oil and gas pipelines with potentially major consequences for the population or the environment due to toxic releases, fires and explosions. Accidents of this type are also referred to as Natech events. Many major accidents highlight the risk associated with natural-hazard impact on pipelines transporting dangerous substances. For instance, in the USA in 1994, flooding of the San Jacinto River caused the rupture of 8 pipelines and the undermining of 29 others by the floodwaters. About 5.5 million litres of petroleum and related products were spilled into the river and ignited. As a result, 547 people were injured and significant environmental damage occurred. Post-incident analysis is a valuable tool for better understanding the causes, dynamics and impacts of pipeline Natech accidents in support of future accident prevention and mitigation. Therefore, data on onshore hazardous-liquid pipeline accidents collected by the US Pipeline and Hazardous Materials Safety Administration (PHMSA) were analysed. For this purpose, a database-driven incident data analysis system was developed to aid the rapid review and categorization of PHMSA incident reports. Using an automated data-mining process followed by a peer review of the incident records, and supported by natural hazard databases and external information sources, the pipeline Natechs were identified. As a by-product of the data-collection process, the database now includes over 800,000 incidents from all causes in industrial and transportation activities, which are automatically classified in the same way as the PHMSA records. This presentation describes the data collection and reviewing steps conducted during the study, provides information on the developed database and data analysis tools, and reports the findings of a statistical analysis of the identified hazardous liquid pipeline incidents in terms of accident dynamics and

  9. Instrumented Pipeline Initiative

    Energy Technology Data Exchange (ETDEWEB)

    Thomas Piro; Michael Ream

    2010-07-31

    This report summarizes technical progress achieved during the cooperative agreement between Concurrent Technologies Corporation (CTC) and the U.S. Department of Energy to address the need for a low-cost monitoring and inspection sensor system, as identified in the Department of Energy (DOE) Natural Gas Infrastructure Research & Development (R&D) Delivery Reliability Program Roadmap. The Instrumented Pipeline Initiative (IPI) achieved this objective by researching technologies for the monitoring of pipeline delivery integrity through a ubiquitous network of sensors and controllers to detect and diagnose incipient defects, leaks, and failures. This report is organized by tasks as detailed in the Statement of Project Objectives (SOPO). The sections all state the objective and approach before detailing results of work.

  10. Pipeline corridors through wetlands

    Energy Technology Data Exchange (ETDEWEB)

    Zimmerman, R.E.; Wilkey, P.L. [Argonne National Lab., IL (United States); Isaacson, H.R. [Gas Research Institute (United States)

    1992-12-01

    This paper presents preliminary findings from six vegetational surveys of gas pipeline rights-of-way (ROW) through wetlands and quantifies the impacts of a 20-year-old pipeline ROW through a boreal forest wetland. Six sites of various ages were surveyed in ecosystems ranging from coastal marsh to forested wetland. At all sites except one, both the number and the percentage of wetland species on the ROW approximated or exceeded those in the adjacent natural area. The boreal forest study showed that (1) adjacent natural wetland areas were not altered in type; (2) water sheet flow restriction had been reversed by nature; (3) no nonnative plant species invaded the natural area; (4) three-quarters of the ROW area was a wetland; and (5) the ROW increased diversity.

  11. Pipeline corridors through wetlands

    Energy Technology Data Exchange (ETDEWEB)

    Zimmerman, R.E.; Wilkey, P.L. (Argonne National Lab., IL (United States)); Isaacson, H.R. (Gas Research Institute (United States))

    1992-01-01

    This paper presents preliminary findings from six vegetational surveys of gas pipeline rights-of-way (ROW) through wetlands and quantifies the impacts of a 20-year-old pipeline ROW through a boreal forest wetland. Six sites of various ages were surveyed in ecosystems ranging from coastal marsh to forested wetland. At all sites except one, both the number and the percentage of wetland species on the ROW approximated or exceeded those in the adjacent natural area. The boreal forest study showed that (1) adjacent natural wetland areas were not altered in type; (2) water sheet flow restriction had been reversed by nature; (3) no nonnative plant species invaded the natural area; (4) three-quarters of the ROW area was a wetland; and (5) the ROW increased diversity.

  12. Pipelined Asynchronous Cache Design

    OpenAIRE

    Nyströem, Mika

    1997-01-01

    This thesis describes the development of pipelined asynchronous cache memories. The work is done in the context of the performance characteristics of memories and transistor logic of a late-1990s high-performance asynchronous microprocessor. We describe the general framework of asynchronous memory systems, caching, and those system characteristics that make caching of growing importance and keep it an interesting research topic. Finally, we present the main contribution of this work, whi...

  13. Introducing Library Pipeline

    Directory of Open Access Journals (Sweden)

    Brett Bonfield

    2014-11-01

    Full Text Available In Brief: We’re creating a nonprofit, Library Pipeline, that will operate independently from In the Library with the Lead Pipe, but will have similar and complementary aims: increasing and diversifying professional development; improving strategies and collaboration; fostering more innovation and start-ups; and encouraging LIS-related publishing and publications. In the Library with the Lead Pipe is […

  14. Protein Identification Pipeline for the Homology Driven Proteomics

    Science.gov (United States)

    Junqueira, Magno; Spirin, Victor; Balbuena, Tiago Santana; Thomas, Henrik; Adzhubei, Ivan; Sunyaev, Shamil; Shevchenko, Andrej

    2008-01-01

    Homology-driven proteomics is a major tool for characterizing the proteomes of organisms with unsequenced genomes. This paper addresses practical aspects of automated homology-driven protein identification by LC-MS/MS on a hybrid LTQ Orbitrap mass spectrometer. All essential software elements supporting the presented pipeline are either hosted on a publicly accessible web server or available for free download. PMID:18639657

  15. INTERNAL REPAIR OF PIPELINES

    Energy Technology Data Exchange (ETDEWEB)

    Bill Bruce; Nancy Porter; George Ritter; Matt Boring; Mark Lozev; Ian Harris; Bill Mohr; Dennis Harwig; Robin Gordon; Chris Neary; Mike Sullivan

    2005-07-20

    The two broad categories of fiber-reinforced composite liner repair and deposited weld metal repair technologies were reviewed and evaluated for potential application for internal repair of gas transmission pipelines. Both are used to some extent for other applications and could be further developed for internal, local, structural repair of gas transmission pipelines. Principal conclusions from a survey of natural gas transmission industry pipeline operators can be summarized in terms of the following performance requirements for internal repair: (1) Use of internal repair is most attractive for river crossings, under other bodies of water, in difficult soil conditions, under highways, under congested intersections, and under railway crossings. (2) Internal pipe repair offers a strong potential advantage over the high cost of horizontal direct drilling when a new bore must be created to solve a leak or other problem. (3) Typical travel distances can be divided into three distinct groups: up to 305 m (1,000 ft.); between 305 m and 610 m (1,000 ft. and 2,000 ft.); and beyond 914 m (3,000 ft.). All three groups require pig-based systems. A despooled umbilical system would suffice for the first two groups, which represent 81% of survey respondents. The third group would require an onboard self-contained power unit for propulsion and welding/liner repair energy needs. (4) The most common size range for 80% to 90% of operators surveyed is 508 mm (20 in.) to 762 mm (30 in.), with 95% using 558.8 mm (22 in.) pipe. Evaluation trials were conducted on pipe sections with simulated corrosion damage repaired with glass fiber-reinforced composite liners, carbon fiber-reinforced composite liners, and weld deposition. Additional un-repaired pipe sections were evaluated in the virgin condition and with simulated damage. Hydrostatic failure pressures for pipe sections repaired with glass fiber-reinforced composite liner were only marginally greater than those of pipe sections without

  16. Hydrodynamics of ocean pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Yoshihara, S.; Toyoda, S.; Venkataramana, K.; Aiko, Y. (Kagoshima University, Kagoshima (Japan). Faculty of Engineering)

    1993-09-30

    This paper describes the current forces acting on cylindrical models in a steady flow, corresponding to the cases of rigid, large-diameter pipelines in real seas. The models were placed in a circulating water channel normal to the direction of flow. The strains in the models were recorded using strain gauges, from which the fluid forces in the horizontal and vertical directions were obtained. The drag coefficient, lift coefficient, and Strouhal number were calculated and plotted against the Reynolds number. The drag force was found to increase with flow velocity. In addition, the variation of the lift force was shown to be more complex, being affected by eddies and other forms of turbulence around the models. For the model consisting of two pipes held together, the fluid forces were found to be greater on the upstream side. It was also shown that the fluid forces were affected by the orientation of the pipelines. Furthermore, it was clarified that the values of the hydrodynamic coefficients and Strouhal number were similar to the results for vertical cylinders in uniform flows. 5 refs., 16 figs.
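
    The non-dimensional groups used in such measurements are standard; the definitions below are textbook forms, not notation taken from the paper:

        C_D = \frac{F_D}{\tfrac{1}{2}\,\rho U^2 D L}, \qquad
        C_L = \frac{F_L}{\tfrac{1}{2}\,\rho U^2 D L}, \qquad
        St = \frac{f_v D}{U}, \qquad
        Re = \frac{U D}{\nu}

    where F_D and F_L are the measured drag and lift forces, ρ the water density, U the flow velocity, D the cylinder diameter, L the immersed length, f_v the vortex-shedding frequency, and ν the kinematic viscosity.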

  17. Equipment orders for pipelines and storage

    Energy Technology Data Exchange (ETDEWEB)

    1977-09-30

    Jiskoot Autocontrol Ltd.'s Dutch subsidiary, Jiskoot Automation N.V., has won two orders from West German firms for the construction of automatic crude oil sampling systems. For Nord-West Kavernenges. m.b.H., Jiskoot will supply a system to sample oil stored in a disused salt mine, i.e., the type 310 sampler, which employs Buxton/BASEEFA certified electrical components throughout; for Mannesmann-Export A.G., it will supply a Type 330 PTB sampler, to be installed on a pipeline in Iraq. A second major contract has been won in Czechoslovakia by Serck Controls of Warwickshire, U.K. It totals 340,000 English pounds and covers the supply and installation of a telemetry and remote control system for supervision of the 30 km Litvinov-to-Kralupy gas pipeline. The previous system was commissioned in 1975. More than 260 Rotork Syncropak valve actuators, worth about 250,000 English pounds, have been ordered from the Bath-based firm for use on a natural gas collection project in Iran.

  18. Shipping Information Pipeline

    DEFF Research Database (Denmark)

    Jensen, Thomas

    This thesis applies theoretical perspectives from the Information Systems (IS) research field to propose how Information Technology (IT) can improve containerized shipping. This question is addressed by developing a set of design principles for an information infrastructure for sharing shipping information named the Shipping Information Pipeline (SIP). Review of the literature revealed that IS research prescribed a set of meta-design principles, including digitalization and digital collaboration by implementation of Inter-Organizational Systems based on Electronic Data Interchange (EDI) messages, while contemporary research proposes Information Infrastructures (II) as a new IT artifact to be researched. Correspondingly, this thesis applies the concept of and design theory for II to improve containerized shipping. Activity Theory has guided the analysis of containerized shipping, following …

  19. INTERNAL REPAIR OF PIPELINES

    Energy Technology Data Exchange (ETDEWEB)

    Robin Gordon; Bill Bruce; Ian Harris; Dennis Harwig; Nancy Porter; Mike Sullivan; Chris Neary

    2004-04-12

    The two broad categories of deposited weld metal repair and fiber-reinforced composite liner repair technologies were reviewed for potential application for internal repair of gas transmission pipelines. Both are used to some extent for other applications and could be further developed for internal, local, structural repair of gas transmission pipelines. Preliminary test programs were developed for both deposited weld metal repair and fiber-reinforced composite liner repair. Evaluation trials have been conducted using a modified fiber-reinforced composite liner provided by RolaTube and pipe sections without liners. All pipe section specimens failed in areas of simulated damage. Pipe sections containing fiber-reinforced composite liners failed at pressures marginally greater than the pipe sections without liners. The next step is to evaluate a liner material with a modulus of elasticity approximately 95% of the modulus of elasticity for steel. Preliminary welding parameters were developed for deposited weld metal repair in preparation for the receipt of Pacific Gas & Electric's internal pipeline welding repair system (designed specifically for 559 mm (22 in.) diameter pipe) and the receipt of 559 mm (22 in.) pipe sections from Panhandle Eastern. The next steps are to transfer welding parameters to the PG&E system and to pressure test repaired pipe sections to failure. A survey of pipeline operators was conducted to better understand the needs and performance requirements of the natural gas transmission industry regarding internal repair. Completed surveys contained the following principal conclusions: (1) Use of internal weld repair is most attractive for river crossings, under other bodies of water, in difficult soil conditions, under highways, under congested intersections, and under railway crossings. (2) Internal pipe repair offers a strong potential advantage over the high cost of horizontal direct drilling (HDD) when a new bore must be created

  20. INTERNAL REPAIR OF PIPELINES

    Energy Technology Data Exchange (ETDEWEB)

    Robin Gordon; Bill Bruce; Ian Harris; Dennis Harwig; George Ritter; Bill Mohr; Matt Boring; Nancy Porter; Mike Sullivan; Chris Neary

    2004-08-17

    The two broad categories of fiber-reinforced composite liner repair and deposited weld metal repair technologies were reviewed and evaluated for potential application for internal repair of gas transmission pipelines. Both are used to some extent for other applications and could be further developed for internal, local, structural repair of gas transmission pipelines. Principal conclusions from a survey of natural gas transmission industry pipeline operators can be summarized in terms of the following performance requirements for internal repair: (1) Use of internal repair is most attractive for river crossings, under other bodies of water, in difficult soil conditions, under highways, under congested intersections, and under railway crossings. (2) Internal pipe repair offers a strong potential advantage over the high cost of horizontal direct drilling when a new bore must be created to solve a leak or other problem. (3) Typical travel distances can be divided into three distinct groups: up to 305 m (1,000 ft.); between 305 m and 610 m (1,000 ft. and 2,000 ft.); and beyond 914 m (3,000 ft.). All three groups require pig-based systems. A despooled umbilical system would suffice for the first two groups, which represent 81% of survey respondents. The third group would require an onboard self-contained power unit for propulsion and welding/liner repair energy needs. (4) The most common size range for 80% to 90% of operators surveyed is 508 mm (20 in.) to 762 mm (30 in.), with 95% using 558.8 mm (22 in.) pipe. Evaluation trials were conducted on pipe sections with simulated corrosion damage repaired with glass fiber-reinforced composite liners, carbon fiber-reinforced composite liners, and weld deposition. Additional un-repaired pipe sections were evaluated in the virgin condition and with simulated damage. Hydrostatic failure pressures for pipe sections repaired with glass fiber-reinforced composite liner were only marginally greater than those of pipe sections without liners

  1. INTERNAL REPAIR OF PIPELINES

    Energy Technology Data Exchange (ETDEWEB)

    Robin Gordon; Bill Bruce; Ian Harris; Dennis Harwig; George Ritter; Bill Mohr; Matt Boring; Nancy Porter; Mike Sullivan; Chris Neary

    2004-12-31

    The two broad categories of fiber-reinforced composite liner repair and deposited weld metal repair technologies were reviewed and evaluated for potential application for internal repair of gas transmission pipelines. Both are used to some extent for other applications and could be further developed for internal, local, structural repair of gas transmission pipelines. Principal conclusions from a survey of natural gas transmission industry pipeline operators can be summarized in terms of the following performance requirements for internal repair: (1) Use of internal repair is most attractive for river crossings, under other bodies of water, in difficult soil conditions, under highways, under congested intersections, and under railway crossings. (2) Internal pipe repair offers a strong potential advantage over the high cost of horizontal directional drilling when a new bore must be created to solve a leak or other problem. (3) Typical travel distances can be divided into three distinct groups: up to 305 m (1,000 ft.); between 305 m and 610 m (1,000 ft. and 2,000 ft.); and beyond 914 m (3,000 ft.). All three groups require pig-based systems. A despooled umbilical system would suffice for the first two groups, which represent 81% of survey respondents. The third group would require an onboard self-contained power unit for propulsion and welding/liner repair energy needs. (4) The most common size range for 80% to 90% of operators surveyed is 508 mm (20 in.) to 762 mm (30 in.), with 95% using 558.8 mm (22 in.) pipe. Evaluation trials were conducted on pipe sections with simulated corrosion damage repaired with glass fiber-reinforced composite liners, carbon fiber-reinforced composite liners, and weld deposition. Additional un-repaired pipe sections were evaluated in the virgin condition and with simulated damage. Hydrostatic failure pressures for pipe sections repaired with glass fiber-reinforced composite liner were only marginally greater than that of pipe sections without liners
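
    As a rough sanity check on hydrostatic failure figures like those reported above, pipe burst pressure is commonly estimated with Barlow's formula, P = 2St/D. The short sketch below applies it to the 558.8 mm (22 in.) pipe size cited in the survey; the wall thickness and strength values are illustrative assumptions, not data from this report.

        # Sanity-check sketch: estimate burst pressure with Barlow's
        # formula, P = 2*S*t/D. Input values are illustrative assumptions.
        def barlow_burst_pressure(uts_mpa, wall_mm, od_mm):
            """Approximate burst pressure (MPa) from ultimate strength."""
            return 2.0 * uts_mpa * wall_mm / od_mm

        OD_MM = 558.8     # 22 in. pipe, the most common size surveyed
        WALL_MM = 7.9     # assumed wall thickness
        UTS_MPA = 414.0   # assumed X42-grade ultimate strength (~60 ksi)

        intact = barlow_burst_pressure(UTS_MPA, WALL_MM, OD_MM)
        # Simulated corrosion damage removing half the wall thickness:
        damaged = barlow_burst_pressure(UTS_MPA, 0.5 * WALL_MM, OD_MM)
        print(f"intact pipe   : ~{intact:.1f} MPa")
        print(f"50% wall loss : ~{damaged:.1f} MPa")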

  2. Northern pipelines : challenges and needs

    Energy Technology Data Exchange (ETDEWEB)

    Dean, D.; Brownie, D. [ProLog Canada Inc., Calgary, AB (Canada); Fafara, R. [TransCanada PipeLines Ltd., Calgary, AB (Canada)

    2007-07-01

    Working Group 10 presented experiences acquired from the operation of pipeline systems in a northern environment. There are currently three pipelines operating north of 60, notably the Shiha gas pipeline near Fort Liard, the Ikhil gas pipeline in Inuvik and the Norman Wells oil pipeline. Each has its unique commissioning, operating and maintenance challenges, as well as specific training and logistical support requirements for the use of in-line inspection tools and other forms of integrity assessment. The effectiveness of cathodic protection systems in a permafrost northern environment was also discussed. It was noted that the delay of the Mackenzie Gas Pipeline Project by two to three years due to joint regulatory review may lead to resource constraints for the project as well as competition for already scarce human resources. The issue of a potential timing conflict with the Alaskan Pipeline Project was also addressed, as well as land use issues for routing of supply roads. Integrity monitoring and assessment issues were outlined with reference to pipe-soil interaction monitoring in discontinuous permafrost; south-facing denuded slope stability; baselining projects; and reclamation issues. It was noted that automatic welding and inspection will increase productivity, while reducing the need for manual labour. In response to anticipated training needs, companies are planning to involve and train Aboriginal labour and will provide camp living conditions that will attract labour. tabs., figs.

  3. Subsea pipeline operational risk management

    Energy Technology Data Exchange (ETDEWEB)

    Bell, R.L.; Lanan, G.A.

    1996-12-31

    Resources used for inspection, maintenance, and repair of a subsea pipeline must be allocated efficiently in order to operate it in the most cost-effective manner. Operational risk management aids in resource allocation through the use of risk assessments and cost/benefit analyses. It identifies those areas where attention must be focused in order to reduce risk. When they are identified, a company's resources (i.e., personnel, equipment, money, and time) can then be used for inspection, maintenance, and/or repair of the pipeline. The results are cost-effective risk reduction and pipeline operation with minimum expenditure.
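
    The risk-assessment logic described above can be reduced to a small illustration: score each pipeline segment by likelihood and consequence, then rank segments so that inspection, maintenance and repair resources go to the highest scores first. The sketch below is a generic example of that ranking step, not the authors' method; every name and score is invented.

        # Minimal risk-ranking sketch: risk = likelihood x consequence.
        # Segments with the highest scores get inspection/repair priority.
        # All names and scores below are invented.
        segments = {
            "riser base":     (4, 5),   # (likelihood 1-5, consequence 1-5)
            "midline span":   (2, 3),
            "shore crossing": (3, 4),
        }

        ranked = sorted(segments.items(),
                        key=lambda kv: kv[1][0] * kv[1][1], reverse=True)
        for name, (lik, con) in ranked:
            print(f"{name:15s} risk score = {lik * con}")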

  4. Natural gas pipeline technology overview.

    Energy Technology Data Exchange (ETDEWEB)

    Folga, S. M.; Decision and Information Sciences

    2007-11-01

    The United States relies on natural gas for one-quarter of its energy needs. In 2001 alone, the nation consumed 21.5 trillion cubic feet of natural gas. A large portion of natural gas pipeline capacity within the United States is directed from major production areas in Texas and Louisiana, Wyoming, and other states to markets in the western, eastern, and midwestern regions of the country. In the past 10 years, increasing levels of gas from Canada have also been brought into these markets (EIA 2007). The United States has several major natural gas production basins and an extensive natural gas pipeline network, with almost 95% of U.S. natural gas imports coming from Canada. At present, the gas pipeline infrastructure is more developed between Canada and the United States than between Mexico and the United States. Gas flows from Canada to the United States through several major pipelines feeding U.S. markets in the Midwest, Northeast, Pacific Northwest, and California. Some key examples are the Alliance Pipeline, the Northern Border Pipeline, the Maritimes & Northeast Pipeline, the TransCanada Pipeline System, and Westcoast Energy pipelines. Major connections join Texas and northeastern Mexico, with additional connections to Arizona and between California and Baja California, Mexico (INGAA 2007). Of the natural gas consumed in the United States, 85% is produced domestically. Figure 1.1-1 shows the complex North American natural gas network. The pipeline transmission system--the 'interstate highway' for natural gas--consists of 180,000 miles of high-strength steel pipe, normally between 30 and 36 inches in diameter. The primary function of the transmission pipeline company is to move huge amounts of natural gas thousands of miles from producing regions to local natural gas utility delivery points. These delivery points, called 'city gate stations', are usually owned by distribution companies, although some are owned by

  5. Effort problem of chemical pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Okrajni, J.; Ciesla, M.; Mutwil, K. [Silesian Technical University, Katowice (Poland)

    1998-12-31

    The paper addresses assessment of the technical state of chemical pipelines working under mechanical and thermal loading. The effort (stress state) of the pipelines after a long operating period is analysed. The material, geometrical and loading conditions of the crack initiation and crack growth process in the chosen object are discussed, and the areas of maximal effort are determined. Changes in the material structure after the long operating period are described. Mechanisms of crack initiation and crack growth in the pipeline elements are analysed, and the mutual relations between chemical and mechanical influences are shown. (orig.) 16 refs.

  6. Pipeline integrity handbook risk management and evaluation

    CERN Document Server

    Singh, Ramesh

    2013-01-01

    Based on over 40 years of experience in the field, Ramesh Singh goes beyond corrosion control, providing techniques for addressing present and future integrity issues. Pipeline Integrity Handbook provides pipeline engineers with the tools to evaluate and inspect pipelines, safeguard the life cycle of their pipeline asset and ensure that they are optimizing delivery and capability. Presented in easy-to-use, step-by-step order, Pipeline Integrity Handbook is a quick reference for day-to-day use in identifying key pipeline degradation mechanisms and threats to pipeline integrity. The book begins

  7. Logistics aspects of petroleum pipeline operations

    Directory of Open Access Journals (Sweden)

    W. J. Pienaar

    2010-11-01

    The paper identifies, assesses and describes the logistics aspects of the commercial operation of petroleum pipelines. The nature of petroleum-product supply chains, in which pipelines play a role, is outlined and the types of petroleum pipeline systems are described. An outline is presented of the nature of the logistics activities of petroleum pipeline operations. The reasons for the cost efficiency of petroleum pipeline operations are given. The relative modal service effectiveness of petroleum pipeline transport, based on the most pertinent service performance measures, is offered. The segments in the petroleum-products supply chain where pipelines can play an efficient and effective role are identified.

  8. PIPELINES AS COMMUNICATION NETWORK LINKS

    Energy Technology Data Exchange (ETDEWEB)

    Kelvin T. Erickson; Ann Miller; E. Keith Stanek; C.H. Wu; Shari Dunn-Norman

    2005-03-14

    This report presents the results of an investigation into two methods of using the natural gas pipeline as a communication medium. The work addressed the need to develop secure system monitoring and control techniques between the field and control centers and to robotic devices in the pipeline. In the first method, the pipeline was treated as a microwave waveguide. In the second method, the pipe was treated as a leaky feeder or a multi-ground neutral and the signal was directly injected onto the metal pipe. These methods were tested on existing pipeline loops at UMR and Battelle. The results indicate the feasibility of both methods. In addition, a few suitable communication link protocols for this network were analyzed.
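
    To see why a steel pipe can carry microwave signals as a waveguide, it helps to compute the cutoff frequency of the dominant TE11 mode of a circular waveguide, below which no wave propagates. The sketch below uses an assumed 558.8 mm (22 in.) bore and ignores the slight dielectric effect of the pressurised gas; it illustrates the physics, not the project's actual test setup.

        import math

        # Cutoff frequency of the dominant TE11 mode in a circular
        # waveguide: f_c = x'_11 * c / (pi * d), x'_11 = 1.8412
        # (first zero of the Bessel derivative J1').
        C = 299_792_458.0    # speed of light in vacuum, m/s
        XP11 = 1.8412

        def te11_cutoff_hz(diameter_m):
            return XP11 * C / (math.pi * diameter_m)

        d = 0.5588           # assumed 22 in. pipe bore, m
        print(f"TE11 cutoff for a {d*1000:.0f} mm pipe: "
              f"{te11_cutoff_hz(d)/1e6:.0f} MHz")
        # -> roughly 314 MHz; signals well above this frequency can
        #    propagate down the pipe interior.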

  9. Pipeline for Contraceptive Development

    Science.gov (United States)

    Blithe, Diana L.

    2016-01-01

    The high rates of unplanned pregnancy reflect unmet need for effective contraceptive methods for women, especially for individuals with health risks such as obesity, diabetes, hypertension, and other conditions that may contraindicate use of an estrogen-containing product. Improvements in safety, user convenience, acceptability and availability of products remain important goals of the contraceptive development program. Another important goal is to minimize the impact of the products on the environment. Development of new methods for male contraception has the potential to address many of these issues with regard to safety for women who have contraindications to effective contraceptive methods but want to protect against pregnancy. It will also address a huge unmet need for men who want to control their fertility. Products under development for men would not introduce eco-toxic hormones in the waste water. Investment in contraceptive research to identify new products for women has been limited in the pharmaceutical industry relative to investment in drug development for other indications. Pharmaceutical R&D for male contraception was active in the 1990s but was abandoned over a decade ago. The Eunice Kennedy Shriver National Institute of Child Health and Human Development (NICHD) has supported a contraceptive development program since 1969. Through a variety of programs including research grants and contracts, NICHD has developed a pipeline of new targets/products for male and female contraception. A number of lead candidates are under evaluation in the NICHD Contraceptive Clinical Trials Network (CCTN) (1–3). PMID:27523300

  10. Aozan: an automated post-sequencing data-processing pipeline.

    Science.gov (United States)

    Perrin, Sandrine; Firmo, Cyril; Lemoine, Sophie; Le Crom, Stéphane; Jourdren, Laurent

    2017-07-15

    Data management and quality control of output from Illumina sequencers is a disk space- and time-consuming task. Thus, we developed Aozan to automatically handle data transfer, demultiplexing, conversion and quality control once a run has finished. This software greatly improves run data management and the monitoring of run statistics via automatic emails and HTML web reports. Aozan is implemented in Java and Python, supported on Linux systems, and distributed under the GPLv3 License at: http://www.outils.genomique.biologie.ens.fr/aozan/ . Aozan source code is available on GitHub: https://github.com/GenomicParisCentre/aozan . aozan@biologie.ens.fr.
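
    The end-of-run automation Aozan provides can be sketched in outline: poll a run-folder root for the RTAComplete.txt marker that Illumina instruments write when sequencing finishes, then launch demultiplexing. This is a simplified, hypothetical illustration rather than Aozan's actual implementation; the paths and the bcl2fastq options are assumptions to be adapted locally.

        import subprocess
        import time
        from pathlib import Path

        RUNS = Path("/data/sequencer")    # assumed run-folder root
        DONE = set()

        def demultiplex(run_dir: Path):
            # Assumed bcl2fastq invocation; adjust for the local setup.
            subprocess.run(
                ["bcl2fastq",
                 "--runfolder-dir", str(run_dir),
                 "--output-dir", str(run_dir / "fastq")],
                check=True,
            )

        while True:
            for run_dir in RUNS.iterdir():
                # Illumina RTA writes RTAComplete.txt at end of run.
                if (run_dir.is_dir()
                        and (run_dir / "RTAComplete.txt").exists()
                        and run_dir not in DONE):
                    demultiplex(run_dir)
                    DONE.add(run_dir)
            time.sleep(300)               # poll every five minutes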

  11. 77 FR 19799 - Pipeline Safety: Pipeline Damage Prevention Programs

    Science.gov (United States)

    2012-04-02

    ... included excavation damage prevention. The Integrity Management for Gas Distribution, Report of Phase I...) The American Gas Association (AGA) The American Petroleum Institute (API) The American Public Gas... Locating Services, API, AOPL, INGAA, and several pipeline operators commented that PHMSA should develop the...

  12. Thermal calculations of underground oil pipelines

    Directory of Open Access Journals (Sweden)

    Moiseev Boris

    2017-01-01

    Operation of oil pipelines in frozen soil causes heat exchange between the pipeline and the soil and forms a melt zone, which leads to deformation of pipelines. The construction and operating conditions of oil pipelines are strongly related to their temperature regime, so it is necessary to know the laws governing the formation of thawing halos around oil pipelines. Elucidating these laws and determining the optimal conditions for pipeline installation during construction in the permafrost areas of the north of the Tyumen region is therefore a very urgent task. The authors developed an algorithm and a computer program for constructing the temperature field of the frozen soil. Several problems were solved on the basis of the obtained dependences, and the corresponding graphs were constructed. The research and calculations made on underground oil pipeline construction allowed the authors to give recommendations aimed at increasing the reliability of oil pipelines.
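
    The temperature-field computation described above can be illustrated with a heavily simplified stand-in: explicit finite differences for radial heat conduction from a warm pipe into frozen soil, ignoring the latent heat of thawing and seasonal variation. It sketches only the general approach, not the authors' algorithm, and every material property below is an assumed value.

        import numpy as np

        # 1-D radial conduction from a warm buried pipe into frozen
        # soil, explicit finite differences; latent heat is ignored.
        ALPHA = 5e-7            # assumed soil thermal diffusivity, m^2/s
        R_PIPE, R_FAR = 0.3, 10.0
        N, DT = 100, 3600.0     # grid points, time step (s); stable
                                # since ALPHA*DT/dr**2 < 0.5 here

        r = np.linspace(R_PIPE, R_FAR, N)
        dr = r[1] - r[0]
        T = np.full(N, -5.0)    # initial frozen-ground temperature, C
        T[0] = 20.0             # assumed oil temperature at pipe wall

        for _ in range(24 * 365):                 # march one year, hourly
            lap = np.gradient(r * np.gradient(T, dr), dr) / r
            T[1:-1] += ALPHA * DT * lap[1:-1]
            T[0], T[-1] = 20.0, -5.0              # fixed boundaries

        thaw_radius = r[np.argmax(T < 0.0)]       # first node still frozen
        print(f"approx. thaw halo radius after 1 year: {thaw_radius:.2f} m")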

  13. Method and system for pipeline communication

    Science.gov (United States)

    Richardson,; John, G [Idaho Falls, ID

    2008-01-29

    A pipeline communication system and method includes a pipeline having a surface extending along at least a portion of the length of the pipeline. A conductive bus is formed on and extends along a portion of the surface of the pipeline. The conductive bus includes a first conductive trace and a second conductive trace, the first and second conductive traces being adapted to conformally couple with the pipeline at the surface extending along at least a portion of its length. A transmitter for sending information along the conductive bus and a receiver for receiving the information from the conductive bus are both coupled to the conductive bus.

  14. Library Automation

    OpenAIRE

    Dhakne, B. N.; Giri, V. V; Waghmode, S. S.

    2010-01-01

    New technologies provide libraries with several new materials, media and modes of storing and communicating information. Library automation reduces the drudgery of repeated manual effort in library routines, and supports collection management, storage, administration, processing, preservation, communication, etc.

  15. California Natural Gas Pipelines: A Brief Guide

    Energy Technology Data Exchange (ETDEWEB)

    Neuscamman, Stephanie [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Price, Don [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Pezzola, Genny [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Glascoe, Lee [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-01-22

    The purpose of this document is to familiarize the reader with the general configuration and operation of the natural gas pipelines in California and to discuss potential LLNL contributions that would support the Partnership for the 21st Century collaboration. First, pipeline infrastructure will be reviewed. Then, recent pipeline events will be examined. Selected current pipeline industry research will be summarized. Finally, industry acronyms are listed for reference.

  16. Risks associated with South African energy pipelines

    OpenAIRE

    2012-01-01

    M.Comm. The demand for products which are distributed through pipelines has increased worldwide over the last decade. These increases in demand have irrevocably impacted top management's perceptions of the risks associated with energy pipeline supply chains. Even in South Africa, the increase in demand for products such as diesel, jet fuel and petrol, which are supplied through energy pipeline supply chains, has increased the risks associated with those supply chains. This s...

  17. Pipelines cathodic protection design methodologies for impressed ...

    African Journals Online (AJOL)

    ... X42 pipeline with a total external surface area of 226,224 m2. The computation further showed the current requirement was attainable with the connection of 3,620 anodes to set up a natural potential between sacrificial anode and pipeline. Keywords: Cathodic protection, corrosion, impressed current, pipeline, sacrificial anodes ...
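
    The arithmetic behind a design like the one summarised above can be sketched simply: the required protection current is a design current density times the surface area (scaled by a coating-breakdown factor), and the anode count follows from the current each anode can deliver. The current density, breakdown factor and per-anode output below are assumed illustrative values, not figures from the cited study; only the surface area is taken from it.

        # Cathodic protection sizing sketch: I_req = i_c * A * f_b,
        # anodes needed = ceil(I_req / I_anode). Values are assumptions.
        import math

        AREA_M2 = 226_224.0   # external surface area from the study
        I_C = 0.02            # assumed protection current density, A/m^2
        F_B = 0.10            # assumed coating breakdown factor (10% bare)
        I_ANODE = 0.15        # assumed output per anode, A

        i_req = I_C * AREA_M2 * F_B
        n_anodes = math.ceil(i_req / I_ANODE)

        print(f"required current : {i_req:.0f} A")
        print(f"anodes needed    : {n_anodes}")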

  18. Customer service drives pipelines' reorganization

    Energy Technology Data Exchange (ETDEWEB)

    Share, J.

    1997-06-01

    The concept behind formation of Enron Transportation and Storage tells plenty about this new gas industry. When executives at the Enron Gas Pipeline Group considered plans last year to streamline operations by merging the support functions of Transwestern Pipeline and their other wholly owned pipeline company, Northern Natural Gas, seamless customer service was foremost on their agenda. Instead of worrying about whether employees would favor one pipeline over the other, perhaps to the detriment of customers, they simply created a new organization that everyone would swear the same allegiance to. The 17,000-mile, 4.1 Bcf/d Northern system serves the upper Midwest market and two major expansion projects were completed there last year. Transwestern is a 2,700-mile system with an eastward capacity of 1 Bcf/d and a westward capacity of 1.5 Bcf/d that traditionally served California markets. It also ties into Texas intrastate markets and, thanks to expansion of the San Juan lateral, to southern Rocky Mountain supplies. Although Enron Corp. continues to position itself as a full-service energy company, the Gas Pipeline Group continues to fuel much of corporate's net income, which was $584 million last year. With ET and S comprising a significant portion of GPG's income, it was vital that the merger of Northern's 950 employees with Transwestern's 250 indeed be a seamless one. It was not easy either psychologically or geographically with main offices in Omaha, NE and Houston as well as operations centers in Minneapolis, MN; Amarillo, TX; W. Des Moines, IA; and Albuquerque, NM. But the results have been gratifying, according to William R. Cordes, President of ET and S and Nancy L. Gardner, Executive Vice President of Strategic Initiatives.

  19. XPIWIT--an XML pipeline wrapper for the Insight Toolkit.

    Science.gov (United States)

    Bartschat, Andreas; Hübner, Eduard; Reischl, Markus; Mikut, Ralf; Stegmaier, Johannes

    2016-01-15

    The Insight Toolkit offers plenty of features for multidimensional image analysis. Current implementations, however, often suffer either from a lack of flexibility due to hard-coded C++ pipelines for a certain task or from slow execution times, e.g. caused by inefficient implementations or multiple read/write operations for separate filter execution. We present an XML-based wrapper application for the Insight Toolkit that combines the performance of a pure C++ implementation with an easy-to-use graphical setup of dynamic image analysis pipelines. Created XML pipelines can be interpreted and executed by XPIWIT in console mode either locally or on large clusters. We successfully applied the software tool for the automated analysis of terabyte-scale, time-resolved 3D image data of zebrafish embryos. XPIWIT is implemented in C++ using the Insight Toolkit and the Qt SDK. It has been successfully compiled and tested under Windows and Unix-based systems. Software and documentation are distributed under Apache 2.0 license and are publicly available for download at https://bitbucket.org/jstegmaier/xpiwit/downloads/. johannes.stegmaier@kit.edu Supplementary data are available at Bioinformatics online.

  20. METRIC EVALUATION PIPELINE FOR 3D MODELING OF URBAN SCENES

    Directory of Open Access Journals (Sweden)

    M. Bosch

    2017-05-01

    Publicly available benchmark data and metric evaluation approaches have been instrumental in enabling research to advance state of the art methods for remote sensing applications in urban 3D modeling. Most publicly available benchmark datasets have consisted of high resolution airborne imagery and lidar suitable for 3D modeling on a relatively modest scale. To enable research in larger scale 3D mapping, we have recently released a public benchmark dataset with multi-view commercial satellite imagery and metrics to compare 3D point clouds with lidar ground truth. We now define a more complete metric evaluation pipeline developed as publicly available open source software to assess semantically labeled 3D models of complex urban scenes derived from multi-view commercial satellite imagery. Evaluation metrics in our pipeline include horizontal and vertical accuracy and completeness, volumetric completeness and correctness, perceptual quality, and model simplicity. Sources of ground truth include airborne lidar and overhead imagery, and we demonstrate a semi-automated process for producing accurate ground truth shape files to characterize building footprints. We validate our current metric evaluation pipeline using 3D models produced using open source multi-view stereo methods. Data and software is made publicly available to enable further research and planned benchmarking activities.

  1. Metric Evaluation Pipeline for 3d Modeling of Urban Scenes

    Science.gov (United States)

    Bosch, M.; Leichtman, A.; Chilcott, D.; Goldberg, H.; Brown, M.

    2017-05-01

    Publicly available benchmark data and metric evaluation approaches have been instrumental in enabling research to advance state of the art methods for remote sensing applications in urban 3D modeling. Most publicly available benchmark datasets have consisted of high resolution airborne imagery and lidar suitable for 3D modeling on a relatively modest scale. To enable research in larger scale 3D mapping, we have recently released a public benchmark dataset with multi-view commercial satellite imagery and metrics to compare 3D point clouds with lidar ground truth. We now define a more complete metric evaluation pipeline developed as publicly available open source software to assess semantically labeled 3D models of complex urban scenes derived from multi-view commercial satellite imagery. Evaluation metrics in our pipeline include horizontal and vertical accuracy and completeness, volumetric completeness and correctness, perceptual quality, and model simplicity. Sources of ground truth include airborne lidar and overhead imagery, and we demonstrate a semi-automated process for producing accurate ground truth shape files to characterize building footprints. We validate our current metric evaluation pipeline using 3D models produced using open source multi-view stereo methods. Data and software is made publicly available to enable further research and planned benchmarking activities.
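
    The completeness and correctness metrics mentioned above are commonly defined with a distance threshold between the evaluated model and the ground truth: completeness is the fraction of ground-truth points with a model point nearby, and correctness the fraction of model points near the ground truth. The sketch below shows this generic definition and is not the released evaluation software itself; the 1 m threshold is an assumption.

        import numpy as np
        from scipy.spatial import cKDTree

        def completeness_correctness(model_xyz, truth_xyz, tol=1.0):
            """Fraction of truth points near the model (completeness)
            and of model points near the truth (correctness),
            within tol metres."""
            d_truth_to_model, _ = cKDTree(model_xyz).query(truth_xyz)
            d_model_to_truth, _ = cKDTree(truth_xyz).query(model_xyz)
            return (np.mean(d_truth_to_model <= tol),
                    np.mean(d_model_to_truth <= tol))

        # Toy usage: random points stand in for lidar and a 3D model.
        rng = np.random.default_rng(0)
        truth = rng.uniform(0, 100, size=(1000, 3))
        model = truth + rng.normal(scale=0.5, size=truth.shape)
        comp, corr = completeness_correctness(model, truth)
        print(f"completeness={comp:.2f}, correctness={corr:.2f}")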

  2. Neuroimaging Study Designs, Computational Analyses and Data Provenance Using the LONI Pipeline

    Science.gov (United States)

    Dinov, Ivo; Lozev, Kamen; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Pierce, Jonathan; Zamanyan, Alen; Chakrapani, Shruthi; Van Horn, John; Parker, D. Stott; Magsipoc, Rico; Leung, Kelvin; Gutman, Boris; Woods, Roger; Toga, Arthur

    2010-01-01

    Modern computational neuroscience employs diverse software tools and multidisciplinary expertise to analyze heterogeneous brain data. The classical problems of gathering meaningful data, fitting specific models, and discovering appropriate analysis and visualization tools give way to a new class of computational challenges—management of large and incongruous data, integration and interoperability of computational resources, and data provenance. We designed, implemented and validated a new paradigm for addressing these challenges in the neuroimaging field. Our solution is based on the LONI Pipeline environment [3], [4], a graphical workflow environment for constructing and executing complex data processing protocols. We developed study-design, database and visual language programming functionalities within the LONI Pipeline that enable the construction of complete, elaborate and robust graphical workflows for analyzing neuroimaging and other data. These workflows facilitate open sharing and communication of data and metadata, concrete processing protocols, result validation, and study replication among different investigators and research groups. The LONI Pipeline features include distributed grid-enabled infrastructure, virtualized execution environment, efficient integration, data provenance, validation and distribution of new computational tools, automated data format conversion, and an intuitive graphical user interface. We demonstrate the new LONI Pipeline features using large scale neuroimaging studies based on data from the International Consortium for Brain Mapping [5] and the Alzheimer's Disease Neuroimaging Initiative [6]. User guides, forums, instructions and downloads of the LONI Pipeline environment are available at http://pipeline.loni.ucla.edu. PMID:20927408

  3. Slaying Hydra: A Python-Based Reduction Pipeline for the Hydra Multi-Object Spectrograph

    Science.gov (United States)

    Seifert, Richard; Mann, Andrew

    2018-01-01

    We present a Python-based data reduction pipeline for the Hydra Multi-Object Spectrograph on the WIYN 3.5 m telescope, an instrument which enables simultaneous spectroscopy of up to 93 targets. The reduction steps carried out include flat-fielding, dynamic fiber tracing, wavelength calibration, optimal fiber extraction, and sky subtraction. The pipeline also supports the use of sky lines to correct for zero-point offsets between fibers. To account for the moving parts on the instrument and telescope, fiber positions and wavelength solutions are derived in real-time for each dataset. The end result is a one-dimensional spectrum for each target fiber. Quick and fully automated, the pipeline enables on-the-fly reduction while observing, and has been known to outperform the IRAF pipeline by more accurately reproducing known RVs. While Hydra has many configurations in both high- and low-resolution, the pipeline was developed and tested with only one high-resolution mode. In the future we plan to expand the pipeline to work in most commonly used modes.
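
    Of the steps listed above, wavelength calibration is the most compact to illustrate: fit a low-order polynomial mapping the pixel centroids of identified arc-lamp lines to their laboratory wavelengths, redone for every fiber and dataset because the instrument's moving parts shift the solution. The sketch below shows that generic step with numpy; the line list and centroids are invented, and this is not the pipeline's own code.

        import numpy as np

        # Invented arc-line centroids (pixels) and lab wavelengths (A).
        pix = np.array([120.3, 410.8, 886.2, 1403.9, 1761.5])
        lam = np.array([5852.5, 6096.2, 6506.5, 6929.5, 7245.2])

        # Fit a cubic pixel -> wavelength solution for one fiber.
        solution = np.poly1d(np.polyfit(pix, lam, deg=3))

        rms = np.sqrt(np.mean((solution(pix) - lam) ** 2))
        print(f"RMS of wavelength fit: {rms:.3f} A")

        # Apply to a full 2048-pixel extracted spectrum:
        wavelengths = solution(np.arange(2048))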

  4. Neuroimaging study designs, computational analyses and data provenance using the LONI pipeline.

    Directory of Open Access Journals (Sweden)

    Ivo Dinov

    2010-09-01

    Modern computational neuroscience employs diverse software tools and multidisciplinary expertise to analyze heterogeneous brain data. The classical problems of gathering meaningful data, fitting specific models, and discovering appropriate analysis and visualization tools give way to a new class of computational challenges--management of large and incongruous data, integration and interoperability of computational resources, and data provenance. We designed, implemented and validated a new paradigm for addressing these challenges in the neuroimaging field. Our solution is based on the LONI Pipeline environment [3], [4], a graphical workflow environment for constructing and executing complex data processing protocols. We developed study-design, database and visual language programming functionalities within the LONI Pipeline that enable the construction of complete, elaborate and robust graphical workflows for analyzing neuroimaging and other data. These workflows facilitate open sharing and communication of data and metadata, concrete processing protocols, result validation, and study replication among different investigators and research groups. The LONI Pipeline features include distributed grid-enabled infrastructure, virtualized execution environment, efficient integration, data provenance, validation and distribution of new computational tools, automated data format conversion, and an intuitive graphical user interface. We demonstrate the new LONI Pipeline features using large scale neuroimaging studies based on data from the International Consortium for Brain Mapping [5] and the Alzheimer's Disease Neuroimaging Initiative [6]. User guides, forums, instructions and downloads of the LONI Pipeline environment are available at http://pipeline.loni.ucla.edu.

  5. Probabilistic Matrix Factorization for Automated Machine Learning

    OpenAIRE

    Fusi, Nicolo; Elibol, Huseyn Melih

    2017-01-01

    In order to achieve state-of-the-art performance, modern machine learning techniques require careful data pre-processing and hyperparameter tuning. Moreover, given the ever increasing number of machine learning models being developed, model selection is becoming increasingly important. Automating the selection and tuning of machine learning pipelines consisting of data pre-processing methods and machine learning models, has long been one of the goals of the machine learning community. In this...
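
    The underlying idea, treating the pipelines-by-datasets performance table as a partially observed matrix and factorising it into low-dimensional latent vectors so that unobserved combinations can be predicted, can be sketched with plain stochastic gradient descent. This is a generic matrix-factorisation illustration, not the probabilistic model of the paper; all data below are synthetic.

        import numpy as np

        rng = np.random.default_rng(0)
        n_pipelines, n_datasets, k = 50, 30, 5

        # Latent factors for pipelines (U) and datasets (V).
        U = 0.1 * rng.standard_normal((n_pipelines, k))
        V = 0.1 * rng.standard_normal((n_datasets, k))

        # Invented sparse observations: (pipeline, dataset, accuracy).
        obs = [(rng.integers(n_pipelines), rng.integers(n_datasets),
                rng.uniform(0.5, 1.0)) for _ in range(400)]

        lr, reg = 0.05, 0.01
        for epoch in range(200):
            for i, j, y in obs:
                err = y - U[i] @ V[j]
                U[i] += lr * (err * V[j] - reg * U[i])
                V[j] += lr * (err * U[i] - reg * V[j])

        # Predict which pipeline should do best on dataset 0:
        best = int(np.argmax(U @ V[0]))
        print(f"pipeline {best} predicted best on dataset 0")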

  6. Upstream pipelines : inspection, corrosion and integrity management

    Energy Technology Data Exchange (ETDEWEB)

    Paez, J.; Stephenson, M. [Talisman Energy Inc., Calgary, AB (Canada)] (comps.)

    2009-07-01

    Accurate inspection techniques are needed to ensure the integrity of pipelines. This working group discussed methods of reducing pipeline failures for a variety of pipes. A summary of recent pipeline performance statistics was presented, as well as details of third-party damage and fiberglass pipe failures. A batch inhibitor joint industry project was described. The session demonstrated that integrity programs need to be developed at the field level as well as at the upper management level. Fiberglass pipeline failures are a significant problem for pipeline operators. Corrosion monitoring, pigging and specific budgets are needed in order to ensure the successful management of pipeline integrity. New software developed to predict pipeline corrosion rates was discussed, and methods of determining mole fractions and flow regimes were presented. The sessions included updates from regulators and standards agencies as well as discussions of best practices, regulations, codes and standards related to pipeline integrity. The working group was divided into 4 sessions: (1) updates since 2007 with input from the Canadian Association of Petroleum Producers (CAPP) and the Upstream Pipeline Integrity Management Association (UPIMA); (2) integrity of non-metallic pipelines; (3) upstream pipeline integrity issues; and (4) hot topics. tabs., figs.

  7. Russia: the pipeline diplomacy; Russie: la diplomatie du pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Bourdillon, Y

    2005-01-15

    The world's leading producer of oil and gas, Russia wishes to use its mastery of energy distribution to recover its great-power status. The oil and gas pipeline network is the foundation on which Russia builds its influence in Europe. The Russian oil and gas companies are also carrying out a long-term strategy of international expansion, in particular through investments in neighboring countries for the building of new infrastructures or the purchase of oil refineries. (J.S.)

  8. Marketing automation

    National Research Council Canada - National Science Library

    Raluca Dania Todor

    2016-01-01

    The automation of the marketing process seems nowadays to be the only way to cope with the major changes brought about by the fast evolution of technology and the continuous increase in supply and demand...

  9. A Midas Plugin to Enable Construction of Reproducible Web-based Image Processing Pipelines

    Directory of Open Access Journals (Sweden)

    Michael Grauer

    2013-12-01

    Image processing is an important quantitative technique for neuroscience researchers, but difficult for those who lack experience in the field. In this paper we present a web-based platform that allows an expert to create a brain image processing pipeline, enabling execution of that pipeline even by those biomedical researchers with limited image processing knowledge. These tools are implemented as a plugin for Midas, an open-source toolkit for creating web based scientific data storage and processing platforms. Using this plugin, an image processing expert can construct a pipeline, create a web-based UI, manage jobs, and visualize intermediate results. Pipelines are executed on a grid computing platform using BatchMake and HTCondor. This represents a new capability for biomedical researchers and offers an innovative platform for scientific collaboration. Current tools work well, but can be inaccessible for those lacking image processing expertise. Using this plugin, researchers in collaboration with image processing experts can create workflows with reasonable default settings and streamlined user interfaces, and data can be processed easily from a lab environment without the need for a powerful desktop computer. This platform allows simplified troubleshooting, centralized maintenance, and easy data sharing with collaborators. These capabilities enable reproducible science by sharing datasets and processing pipelines between collaborators. In this paper, we present a description of this innovative Midas plugin, along with results obtained from building and executing several ITK based image processing workflows for diffusion weighted MRI (DW MRI) of rodent brain images, as well as recommendations for building automated image processing pipelines. Although the particular image processing pipelines developed were focused on rodent brain MRI, the presented plugin can be used to support any executable or script-based pipeline.

  10. A midas plugin to enable construction of reproducible web-based image processing pipelines.

    Science.gov (United States)

    Grauer, Michael; Reynolds, Patrick; Hoogstoel, Marion; Budin, Francois; Styner, Martin A; Oguz, Ipek

    2013-01-01

    Image processing is an important quantitative technique for neuroscience researchers, but difficult for those who lack experience in the field. In this paper we present a web-based platform that allows an expert to create a brain image processing pipeline, enabling execution of that pipeline even by those biomedical researchers with limited image processing knowledge. These tools are implemented as a plugin for Midas, an open-source toolkit for creating web based scientific data storage and processing platforms. Using this plugin, an image processing expert can construct a pipeline, create a web-based User Interface, manage jobs, and visualize intermediate results. Pipelines are executed on a grid computing platform using BatchMake and HTCondor. This represents a new capability for biomedical researchers and offers an innovative platform for scientific collaboration. Current tools work well, but can be inaccessible for those lacking image processing expertise. Using this plugin, researchers in collaboration with image processing experts can create workflows with reasonable default settings and streamlined user interfaces, and data can be processed easily from a lab environment without the need for a powerful desktop computer. This platform allows simplified troubleshooting, centralized maintenance, and easy data sharing with collaborators. These capabilities enable reproducible science by sharing datasets and processing pipelines between collaborators. In this paper, we present a description of this innovative Midas plugin, along with results obtained from building and executing several ITK based image processing workflows for diffusion weighted MRI (DW MRI) of rodent brain images, as well as recommendations for building automated image processing pipelines. Although the particular image processing pipelines developed were focused on rodent brain MRI, the presented plugin can be used to support any executable or script-based pipeline.

  11. Analytic prognostic for petrochemical pipelines

    CERN Document Server

    Jaoude, Abdo Abou; El-Tawil, Khaled; Noura, Hassan; Ouladsine, Mustapha

    2012-01-01

    Pipeline tubes are vital mechanical systems widely used in the petrochemical industry to transport natural gas or liquids. These cylindrical tubes are subject to corrosion, due to high pH concentrations of the transported liquids, and to fatigue cracking, due to the alternation of pressure and depression of the gas over time, which initiates micro-cracks in the tube body that can propagate abruptly and lead to failure. The development of a prognostic process for such systems largely increases their performance and availability, and decreases the global cost of their missions. Therefore, this paper deals with a new prognostic approach to improve the performance of these pipelines. Only the first mode of cracking, that is, the opening mode, is considered.
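
    Fatigue crack growth of the kind this prognostic builds on is classically described by the Paris law, da/dN = C (delta K)^m, with the mode-I stress intensity range delta K = delta sigma * sqrt(pi*a) * Y. The sketch below integrates that law numerically to estimate cycles to a critical crack depth; all constants are illustrative assumptions, not the paper's values.

        import math

        # Paris-law sketch: da/dN = C * dK**M, dK = dS * sqrt(pi*a) * Y.
        # All values are illustrative assumptions for a steel tube.
        C, M = 3e-12, 3.0          # Paris constants (m/cycle, MPa*sqrt(m))
        DSIGMA = 60.0              # hoop stress range per pressure cycle, MPa
        Y = 1.12                   # geometry factor, shallow surface crack
        A0, A_CRIT = 1e-3, 8e-3    # initial and critical crack depths, m
        STEP = 1000                # integrate in blocks of 1000 cycles

        a, cycles = A0, 0
        while a < A_CRIT:
            dk = DSIGMA * math.sqrt(math.pi * a) * Y
            a += C * dk ** M * STEP
            cycles += STEP

        print(f"cycles from {A0*1e3:.0f} mm to {A_CRIT*1e3:.0f} mm "
              f"depth: {cycles:,}")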

  12. Exchanger system for anhydrite pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Penet, A.

    1980-12-01

    This paper deals with the problem of roadway behaviour affecting airways, bottom roads and top roads in advancing faces in the HBNPC area. The support work in these zones used to be undertaken manually and involved a lot of man hours. These operations are now mechanized in the form of pneumatic stowing with anhydrite. The latter is delivered by pipeline using compressed air. The 'exchanger' is a sort of branching system in the pipeline enabling the flow to be switched from the top end of the face to the bottom end and vice versa (diagram). The associated operations can be undertaken with ease. The effect the system has on costs will soon be known. (In French)

  13. On Automating and Standardising Corpus Callosum Analysis in Brain MRI

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Skoglund, Karl

    2005-01-01

    Corpus callosum analysis is influenced by many factors. Previous efforts at controlling these have been incomplete and scattered. This paper sketches a complete pipeline for automated corpus callosum analysis from magnetic resonance images, with focus on measurement standardisation. The prese...

  14. NEW EUROPEAN PIPELINE PROJECT EASTRING

    OpenAIRE

    Karch, Lukáš; Varga, Augustín

    2018-01-01

    This paper focuses on the idea of a new European gas pipeline project called Eastring, promoted by the company Eustream. Eustream is the Slovak gas transmission system operator and has been one of the key players in European gas transmission from Russia to Europe over the last decades. The previous Russian-Ukrainian crises resulted in a review of gas flow directions from Russia to Europe in order to enhance the security of gas supplies to Europe. Russia plans to stop gas transmission to Europe via Ukraine...

  15. Pipeline purging principles and practice

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, J.E.; Svedeman, S.J.; Kuhl, C.A. [Southwest Research Inst., San Antonio, TX (United States); Gregor, J.G. [Gas Research Inst., Chicago, IL (United States); Lambeth, A.K. [Texas Eastern Transmission Corp., Houston, TX (United States)

    1996-12-31

    Gas purging, a process of displacing one gas by another gas, occurs on a routine basis in the natural gas industry when pipelines are purged into and out of service. In a project sponsored by the Gas Research Institute and in cooperation with the American Gas Association (A.G.A.) the purging practices as outlined in the A.G.A.'s Purging Principles and Practices manual are being reviewed because many of today's pipeline purging operations occur under conditions not addressed directly in the manual. The program focus is on the purging procedures outlined in Chapter 8 of the manual entitled Gas Transmission and Distribution pipes. The technical objective of the project is to develop an understanding of the scientific principles upon which safe, practical purging practices can be based. Direct displacement and inert gas slug purging operations are explained in terms of dispersion and mixing parameters and their relationship to the gas velocity. Field data is compared to the results of an analytical mixing model. Computer software for planning safe and cost effective pipeline purges has been developed. Finally, recommendations for revising Chapter 8 of the A.G.A. manual are presented.

  16. Shore approach of Camarupim pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Bernardi, Tiaraju P.; Oliveira Neto, Vasco A. de; Siqueira, Jakson [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2009-07-01

    Camarupim Field is located in the northern portion of the Espirito Santo Basin and was discovered with the drilling of well 1-ESS-164 in 2006. It is a gas field whose production start-up was scheduled for mid-2009. The production unit will be an FPSO (Floating Production, Storage and Offloading) vessel, and the gas will flow through a pipeline with diameter ranging from 12 to 24 inches and approximately 60 km long, from the FPSO Cidade de Sao Mateus to the UTGC (Unit for Treatment of Gas Cacimbas-Linhares-ES). The FPSO will have a processing capacity of 10 MMm3/day of gas. Because the pipeline comes ashore in an environmental protection area where sea turtles spawn, the connection between the offshore stretch and the land pipeline runs through a shore approach built with the known and proven technique of horizontal directional drilling, about 950 m in length. This paper presents the assumptions adopted, the technique employed, the challenges faced by the team and the lessons learned in building the directional hole. (author)

  17. DEVELOPMENT OF SOFTWARE SYSTEM FOR MONITORING OF STRESS CORROSION CRACKING OF THE PIPELINE UNDER TENSION

    Directory of Open Access Journals (Sweden)

    Z. K. Abaev

    2016-01-01

    The trend in software and hardware development providing automated monitoring and control of the basic and auxiliary technological processes of gas transportation via main gas pipeline systems is outlined. The article discusses the stages of creating software for a system monitoring stress corrosion cracking (SCC). The development of new, adequate regression models for determining the SCC risk level is shown, and an algorithm for ranking main gas pipeline (MG) sections by their propensity to SCC is presented. To estimate the lifetime of a main gas pipeline, a variable rank of SCC danger (RSCC), based on fuzzy logic methods, is proposed. The fuzzy model was implemented with the graphical tools of MATLAB using the Fuzzy Logic Toolbox expansion pack. The working algorithm of the developed program and its screen forms are presented.
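
    The fuzzy-logic ranking idea can be illustrated compactly: map a crisp SCC risk score onto overlapping triangular membership functions for 'low', 'medium' and 'high' ranks and report the dominant one. The sketch below mirrors only the spirit of the MATLAB Fuzzy Logic Toolbox model described; the membership breakpoints are invented.

        def tri(x, a, b, c):
            """Triangular membership function peaking at b over [a, c]."""
            if x <= a or x >= c:
                return 0.0
            return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

        # Invented membership breakpoints for an SCC risk score in [0, 1].
        RANKS = {
            "low":    (-0.01, 0.0, 0.4),
            "medium": (0.2, 0.5, 0.8),
            "high":   (0.6, 1.0, 1.01),
        }

        def rscc(score):
            grades = {name: tri(score, *abc) for name, abc in RANKS.items()}
            return max(grades, key=grades.get), grades

        rank, grades = rscc(0.67)
        print(rank, {k: round(v, 2) for k, v in grades.items()})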

  18. Extending the Fermi-LAT data processing pipeline to the grid

    Energy Technology Data Exchange (ETDEWEB)

    Zimmer, S. [Stockholm Univ., Stockholm (Sweden); The Oskar Klein Centre for Cosmoparticle Physics, Stockholm (Sweden); Arrabito, L. [Univ. Montpellier 2, Montpellier (France); Glanzman, T. [SLAC National Accelerator Lab., Menlo Park, CA (United States); Johnson, T. [SLAC National Accelerator Lab., Menlo Park, CA (United States); Lavalley, C. [Univ. Montpellier 2, Montpellier (France); Tsaregorodtsev, A. [Centre de Physique des Particules de Marseille, Marseille (France)

    2015-05-12

    The Data Handling Pipeline ("Pipeline") has been developed for the Fermi Gamma-Ray Space Telescope (Fermi) Large Area Telescope (LAT) which launched in June 2008. Since then it has been in use to completely automate the production of data quality monitoring quantities, reconstruction and routine analysis of all data received from the satellite and to deliver science products to the collaboration and the Fermi Science Support Center. Aside from the reconstruction of raw data from the satellite (Level 1), data reprocessing and various event-level analyses are also reasonably heavy loads on the pipeline and computing resources. These other loads, unlike Level 1, can run continuously for weeks or months at a time. Additionally, it receives heavy use in performing production Monte Carlo tasks.

  19. A subtraction pipeline for automatic detection of new appearing multiple sclerosis lesions in longitudinal studies

    Energy Technology Data Exchange (ETDEWEB)

    Ganiler, Onur; Oliver, Arnau; Diez, Yago; Freixenet, Jordi; Llado, Xavier [University of Girona, VICOROB Computer Vision and Robotics Group, Girona (Spain); Vilanova, Joan C. [Girona Magnetic Resonance Center, Girona (Spain); Beltran, Brigitte [Dr. Josep Trueta University Hospital, Institut d' Investigacio Biomedica de Girona, Girona (Spain); Ramio-Torrenta, Lluis [Dr. Josep Trueta University Hospital, Institut d' Investigacio Biomedica de Girona, Multiple Sclerosis and Neuroimmunology Unit, Girona (Spain); Rovira, Alex [Vall d' Hebron University Hospital, Magnetic Resonance Unit, Department of Radiology, Barcelona (Spain)

    2014-05-15

    Time-series analysis of magnetic resonance images (MRI) is of great value for multiple sclerosis (MS) diagnosis and follow-up. In this paper, we present an unsupervised subtraction approach which incorporates multisequence information to deal with the detection of new MS lesions in longitudinal studies. The proposed pipeline for detecting new lesions consists of the following steps: skull stripping, bias field correction, histogram matching, registration, white matter masking, image subtraction, automated thresholding, and postprocessing. We also combine the results of PD-w and T2-w images to reduce false positive detections. Experimental tests are performed on 20 MS patients with two temporal studies separated by 12 (12M) or 48 (48M) months. The pipeline achieves very good performance, obtaining an overall sensitivity of 0.83 and 0.77 with a false discovery rate (FDR) of 0.14 and 0.18 for the 12M and 48M datasets, respectively. The most difficult situation for the pipeline is the detection of very small lesions, where the obtained sensitivity is lower and the FDR higher. Our fully automated approach is robust and accurate, allowing detection of newly appearing MS lesions. We believe that the pipeline can be applied to large collections of images and also be easily adapted to monitor other brain pathologies. (orig.)
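
    Several of the steps listed above have off-the-shelf counterparts in SimpleITK, which makes the core of a subtraction approach easy to sketch. The snippet below is a conceptual illustration only: it assumes the two time points are already skull-stripped and co-registered, uses Otsu thresholding as a stand-in for the paper's automated thresholding step, and is not the authors' implementation. File names are placeholders.

        import SimpleITK as sitk

        base = sitk.ReadImage("tp1_t2_brain.nii.gz", sitk.sitkFloat32)
        follow = sitk.ReadImage("tp2_t2_brain.nii.gz", sitk.sitkFloat32)

        # Bias field correction (N4) with rough foreground masks.
        base = sitk.N4BiasFieldCorrection(
            base, sitk.OtsuThreshold(base, 0, 1, 200))
        follow = sitk.N4BiasFieldCorrection(
            follow, sitk.OtsuThreshold(follow, 0, 1, 200))

        # Histogram matching normalises intensities across time points.
        follow = sitk.HistogramMatching(follow, base)

        # New lesions appear bright in follow-up minus baseline.
        diff = sitk.Subtract(follow, base)

        # Automated thresholding of the difference image (Otsu stand-in).
        candidates = sitk.OtsuThreshold(diff, 0, 1)
        sitk.WriteImage(candidates, "new_lesion_candidates.nii.gz")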

  20. Automated Wormscan

    Science.gov (United States)

    Puckering, Timothy; Thompson, Jake; Sathyamurthy, Sushruth; Sukumar, Sinduja; Shapira, Tirosh; Ebert, Paul

    2017-01-01

    There has been a recent surge of interest in computer-aided rapid data acquisition to increase the potential throughput and reduce the labour costs of large scale Caenorhabditis elegans studies. We present Automated WormScan, a low-cost, high-throughput automated system using commercial photo scanners, which is extremely easy to implement and use, capable of scoring tens of thousands of organisms per hour with minimal operator input, and is scalable. The method does not rely on software training for image recognition, but uses the generation of difference images from sequential scans to identify moving objects. This approach results in robust identification of worms with little computational demand. We demonstrate the utility of the system by conducting toxicity, growth and fecundity assays, which demonstrate the consistency of our automated system, the quality of the data relative to manual scoring methods and congruity with previously published results. PMID:28413617
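
    The difference-image idea at the heart of the method is simple to sketch: subtract two sequential scans, threshold the absolute difference, and count the connected blobs that remain as moving worms. The snippet below is a generic illustration with assumed file names and threshold, not the published tool.

        import numpy as np
        from PIL import Image
        from scipy import ndimage

        # Two sequential flatbed scans of the same plate (assumed paths).
        scan1 = np.asarray(Image.open("scan_t0.png").convert("L"), float)
        scan2 = np.asarray(Image.open("scan_t1.png").convert("L"), float)

        # Pixels that changed between scans correspond to moving objects.
        diff = np.abs(scan2 - scan1)
        moving = diff > 25                  # assumed intensity threshold

        # Remove speckle, then count connected components as live worms.
        moving = ndimage.binary_opening(moving, iterations=2)
        labels, n_worms = ndimage.label(moving)
        print(f"moving objects detected: {n_worms}")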

  1. MetAMOS: a modular and open source metagenomic assembly and analysis pipeline

    OpenAIRE

    Treangen, Todd J.; Koren, Sergey; Sommer, Daniel D; Liu, Bo; Astrovskaya, Irina; Ondov, Brian; Darling, Aaron E.; Phillippy, Adam M; Pop, Mihai

    2013-01-01

    We describe MetAMOS, an open source and modular metagenomic assembly and analysis pipeline. MetAMOS represents an important step towards fully automated metagenomic analysis, starting with next-generation sequencing reads and producing genomic scaffolds, open-reading frames and taxonomic or functional annotations. MetAMOS can aid in reducing assembly errors, commonly encountered when assembling metagenomic samples, and improves taxonomic assignment accuracy while also reducing com...

  2. Approaches to automatic parameter fitting in a microscopy image segmentation pipeline: An exploratory parameter space analysis

    OpenAIRE

    Christian Held; Tim Nattkemper; Ralf Palmisano; Thomas Wittenberg

    2013-01-01

    Introduction: Research and diagnosis in medicine and biology often require the assessment of large amounts of microscopy image data. Although digital pathology and new bioimaging technologies are finding their way into clinical practice and pharmaceutical research, some general methodological issues in automated image analysis remain open. Methods: In this study, we address the problem of fitting the parameters in a microscopy image segmentation pipeline. We propose to fit the p...

  3. Land use planning for pipelines : a guideline for local authorities, developers and pipeline operators

    Energy Technology Data Exchange (ETDEWEB)

    Kalra, S. (ed.) [Canadian Standards Association, Mississauga, ON (Canada)

    2004-08-01

    This document is intended for local authorities who deal with issues of land regulation and public safety near pipelines. Input was provided by representatives of the pipeline industry, government agencies, municipalities, planners, developers and academics. The document emphasizes the importance of continual communication by key stakeholders to reduce the potential for conflicts and costs, and to ensure that available access is maintained along the pipeline's right-of-way so that pipelines can be constructed, maintained or operated in an efficient manner. This document also outlines the significance of recognizing land uses adjacent to the pipeline right-of-way. It provides guidelines in the following subject areas: (1) roles and responsibilities for key stakeholders, (2) the pipeline industry, (3) products transported in pipelines, (4) land use planning issues regarding pipelines, and (5) sources of additional information. refs., figs.

  4. Transmission pipeline calculations and simulations manual

    CERN Document Server

    Menon, E Shashi

    2014-01-01

    Transmission Pipeline Calculations and Simulations Manual is a valuable time- and money-saving tool to quickly pinpoint the essential formulae, equations, and calculations needed for transmission pipeline routing and construction decisions. The manual's three-part treatment starts with gas and petroleum data tables, followed by self-contained chapters concerning applications. Case studies at the end of each chapter provide practical experience for problem solving. Topics in this book include pressure and temperature profile of natural gas pipelines, how to size pipelines for specified f

  5. A quick guide to pipeline engineering

    CERN Document Server

    Alkazraji, D

    2008-01-01

    Pipeline engineering requires an understanding of a wide range of topics. Operators must take into account numerous pipeline codes and standards, calculation approaches, and reference materials in order to make accurate and informed decisions.A Quick Guide to Pipeline Engineering provides concise, easy-to-use, and accessible information on onshore and offshore pipeline engineering. Topics covered include: design; construction; testing; operation and maintenance; and decommissioning.Basic principles are discussed and clear guidance on regulations is provided, in a way that will

  6. Tubular lining material for pipelines having bends

    Energy Technology Data Exchange (ETDEWEB)

    Moringa, A.; Sakaguchi, Y.; Hyodo, M.; Yagi, I.

    1987-03-24

    A tubular lining material for pipelines having bends or curved portions comprises a tubular textile jacket, made of warps and wefts woven in tubular form, overlaid with a coating of a flexible synthetic resin. It is applied to the inner surface of a pipeline having bends or curved portions by inserting the tubular lining material, with a binder applied to its inner surface, into the pipeline and allowing it to advance within the pipeline, with or without the aid of a leading rope-like elongated element, while turning the tubular lining material inside out under fluid pressure. In this manner the tubular lining material is applied onto the inner surface of the pipeline with the binder interposed between the pipeline and the lining material. The lining material is characterized in that a part or all of the warps are comprised of an elastic yarn around which, over its full length, a synthetic fiber yarn or yarns have been left- and/or right-handedly coiled. This tubular lining material is particularly suitable for lining a pipeline having an inner diameter of 25-200 mm and a plurality of bends, such as gas service pipelines or house pipelines, without the occurrence of wrinkles in the lining material at a bend.

  7. Gemini Planet Imager observational calibrations XI: pipeline improvements and enhanced calibrations after two years on sky

    Science.gov (United States)

    Perrin, Marshall D.; Ingraham, Patrick; Follette, Katherine B.; Maire, Jérôme; Wang, Jason J.; Savransky, Dmitry; Arriaga, Pauline; Bailey, Vanessa P.; Bruzzone, Sebastian; Chilcote, Jeffrey K.; De Rosa, Robert J.; Draper, Zachary H.; Fitzgerald, Michael P.; Greenbaum, Alexandra Z.; Hung, Li-Wei; Konopacky, Quinn; Macintosh, Bruce; Marchis, Franck; Marois, Christian; Millar-Blanchaer, Maxwell A.; Nielsen, Eric; Rajan, Abhijith; Rameau, Julien; Rantakyro, Fredrik T.; Ruffio, Jean-Baptiste; Ward-Duong, Kimberly; Wolff, Schuyler G.; Zalesky, Joseph

    2016-08-01

    The Gemini Planet Imager has been successfully obtaining images and spectra of exoplanets, brown dwarfs, and debris and protoplanetary circumstellar disks using its integral field spectrograph and polarimeter. GPI observations are transformed from raw data into high-quality astrometrically and photometrically calibrated datacubes using the GPI Data Reduction Pipeline, an open-source software framework continuously developed by our team and available to the community. It uses a flexible system of reduction recipes composed of individual primitive steps, allowing substantial customization of processing depending upon science goals. This paper provides a broad overview of the GPI pipeline, summarizes key lessons learned, and describes improved calibration methods and new capabilities available in the latest version. Enhanced automation better supports observations at the telescope with streamlined and rapid data processing, for instance through real-time assessments of contrast performance and more automated calibration file processing. We have also incorporated the GPI Data Reduction Pipeline as one component in a larger automated data system to support the GPI Exoplanet Survey campaign, while retaining its flexibility and stand-alone capabilities to support the broader GPI observer community. Several accompanying papers describe in more detail specific aspects of the calibration of GPI data in both spectral and polarimetric modes.

  8. 77 FR 45417 - Pipeline Safety: Inspection and Protection of Pipeline Facilities After Railway Accidents

    Science.gov (United States)

    2012-07-31

    ..., depth and location of the pipelines so that the movement of heavy equipment and debris on the right-of... the presence, depth and location of the pipelines so that the movement of heavy equipment on the right... cover permitted by the current Federal and industry pipeline construction standards, it likely would...

  9. The PREP Pipeline: Standardized preprocessing for large-scale EEG analysis

    Directory of Open Access Journals (Sweden)

    Nima Bigdely-Shamlo

    2015-06-01

    The technology to collect brain imaging and physiological measures has become portable and ubiquitous, opening the possibility of large-scale analysis of real-world human imaging. By its nature, such data is large and complex, making automated processing essential. This paper shows how lack of attention to the very early stages of an EEG preprocessing pipeline can reduce the signal-to-noise ratio and introduce unwanted artifacts into the data, particularly for computations done in single precision. We demonstrate that ordinary average referencing improves the signal-to-noise ratio, but that noisy channels can contaminate the results. We also show that identification of noisy channels depends on the reference and examine the complex interaction of filtering, noisy channel identification, and referencing. We introduce a multi-stage robust referencing scheme to deal with the noisy channel-reference interaction. We propose a standardized early-stage EEG processing pipeline (PREP) and discuss the application of the pipeline to more than 600 EEG datasets. The pipeline includes an automatically generated report for each dataset processed. Users can download the PREP pipeline as a freely available MATLAB library from http://eegstudy.org/prepcode/.

  10. The PREP pipeline: standardized preprocessing for large-scale EEG analysis

    Science.gov (United States)

    Bigdely-Shamlo, Nima; Mullen, Tim; Kothe, Christian; Su, Kyung-Min; Robbins, Kay A.

    2015-01-01

    The technology to collect brain imaging and physiological measures has become portable and ubiquitous, opening the possibility of large-scale analysis of real-world human imaging. By its nature, such data is large and complex, making automated processing essential. This paper shows how lack of attention to the very early stages of an EEG preprocessing pipeline can reduce the signal-to-noise ratio and introduce unwanted artifacts into the data, particularly for computations done in single precision. We demonstrate that ordinary average referencing improves the signal-to-noise ratio, but that noisy channels can contaminate the results. We also show that identification of noisy channels depends on the reference and examine the complex interaction of filtering, noisy channel identification, and referencing. We introduce a multi-stage robust referencing scheme to deal with the noisy channel-reference interaction. We propose a standardized early-stage EEG processing pipeline (PREP) and discuss the application of the pipeline to more than 600 EEG datasets. The pipeline includes an automatically generated report for each dataset processed. Users can download the PREP pipeline as a freely available MATLAB library from http://eegstudy.org/prepcode. PMID:26150785
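
    The noisy channel-reference interaction described above can be illustrated with a small sketch: estimate the average reference only from channels that currently look clean, re-reference, re-detect, and iterate until the flagged set stabilizes. This is a simplified stand-in for PREP's robust referencing, not its actual algorithm (PREP combines several complementary noisy-channel criteria); the MAD-based detector below is an assumption made for brevity.

        import numpy as np

        def robust_average_reference(eeg, z_thresh=5.0, max_iter=10):
            """Iteratively estimate an average reference from channels not
            currently flagged as noisy. eeg: (n_channels, n_samples)."""
            good = np.ones(eeg.shape[0], dtype=bool)
            reference = np.zeros(eeg.shape[1])
            for _ in range(max_iter):
                reference = eeg[good].mean(axis=0)      # reference from good channels only
                referenced = eeg - reference
                spread = np.median(np.abs(referenced - np.median(referenced, axis=1,
                                                                 keepdims=True)), axis=1)
                mad = np.median(np.abs(spread - np.median(spread)))
                z = (spread - np.median(spread)) / (1.4826 * mad + 1e-12)
                new_good = np.abs(z) < z_thresh         # robust z-score on channel spread
                if np.array_equal(new_good, good):
                    break                               # converged: flagged set is stable
                good = new_good
            return eeg - reference, ~good               # referenced data, noisy-channel mask

        rng = np.random.default_rng(0)
        data = rng.normal(size=(32, 1000))
        data[5] *= 20                                   # one artificially noisy channel
        clean, noisy = robust_average_reference(data)
        print("noisy channels:", np.where(noisy)[0])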

  11. Addressing the workforce pipeline challenge

    Energy Technology Data Exchange (ETDEWEB)

    Leonard Bond; Kevin Kostelnik; Richard Holman

    2006-11-01

    A secure and affordable energy supply is essential for achieving U.S. national security, continuing U.S. prosperity, and laying the foundations for future economic growth. To meet this goal, the next-generation energy workforce in the U.S., in particular the people needed to support instrumentation, controls, and advanced operations and maintenance, is a critical element. The workforce is aging, and a new workforce pipeline to support both the current generation of plants and new builds has yet to be established. The paper reviews the challenges and some actions being taken to address this need.

  12. Library Automation.

    Science.gov (United States)

    Husby, Ole

    1990-01-01

    The challenges and potential benefits of automating university libraries are reviewed, with special attention given to cooperative systems. Aspects discussed include database size, the role of the university computer center, storage modes, multi-institutional systems, resource sharing, cooperative system management, networking, and intelligent…

  13. Oil and Natural Gas Pipelines, North America, 2010, Platts

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Oil and Natural Gas Pipeline geospatial data layer contains gathering, interstate, and intrastate natural gas pipelines, crude and product oil pipelines, and...

  14. Anaesthetic machine pipeline inlet pressure gauges do not always measure pipeline pressure.

    Science.gov (United States)

    Craig, D B; Longmuir, J

    1980-09-01

    Some anaesthetic gas machines have pipeline inlet pressure gauges which indicate the higher of either pipeline pressure or machine circuit pressure (the pressure distal to the pressure reducing valve and proximal to the flowmeter control valve). Failure by the operator to appreciate this feature may, in specific circumstances, delay recognition of a pipeline malfunction or disconnection. The Canadian Standards Association Z168.3-M1980 Anaesthetic Gas Machine Standard requires pipeline inlet gauges which measure only pipeline (hose) pressure. Existing machines should be modified to accommodate this requirement.

  15. Oil pipeline energy consumption and efficiency

    Energy Technology Data Exchange (ETDEWEB)

    Hooker, J.N.

    1981-01-01

    This report describes an investigation of energy consumption and efficiency of oil pipelines in the US in 1978. It is based on a simulation of the actual movement of oil on a very detailed representation of the pipeline network, and it uses engineering equations to calculate the energy that pipeline pumps must have exerted on the oil to move it in this manner. The efficiencies of pumps and drivers are estimated so as to arrive at the amount of energy consumed at pumping stations. The throughput in each pipeline segment is estimated by distributing each pipeline company's reported oil movements over its segments in proportions predicted by regression equations that show typical throughput and throughput capacity as functions of pipe diameter. The form of the equations is justified by a generalized cost-engineering study of pipelining, and their parameters are estimated using new techniques developed for the purpose. A simplified model of flow scheduling is chosen on the basis of actual energy use data obtained from a few companies. The study yields energy consumption and intensiveness estimates for crude oil trunk lines, crude oil gathering lines and oil products lines, for the nation as well as by state and by pipe diameter. It characterizes the efficiency of typical pipelines of various diameters operating at capacity. Ancillary results include estimates of oil movements by state and by diameter and approximate pipeline capacity utilization nationwide.
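
    The report's network-wide engineering equations are not reproduced here, but the core computation it rests on, pump energy from flow and head, is standard hydraulics (P = ρgQH divided by pump and driver efficiencies). A minimal sketch with illustrative efficiency figures and an invented example flow:

        def pumping_power_kw(flow_m3_s, head_m, density_kg_m3=870.0,
                             pump_eff=0.80, driver_eff=0.92):
            """Input power needed to move oil against a given head.
            Standard hydraulics; the study's detailed network model and
            efficiency estimates are not reproduced, only the form."""
            g = 9.81
            hydraulic_w = density_kg_m3 * g * flow_m3_s * head_m
            return hydraulic_w / (pump_eff * driver_eff) / 1000.0

        # Example (illustrative numbers): 0.5 m^3/s of crude against 200 m of friction head
        print(f"{pumping_power_kw(0.5, 200.0):.0f} kW")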

  16. Testing the School-to-Prison Pipeline

    Science.gov (United States)

    Owens, Emily G.

    2017-01-01

    The School-to-Prison Pipeline is a social phenomenon where students become formally involved with the criminal justice system as a result of school policies that use law enforcement, rather than discipline, to address behavioral problems. A potentially important part of the School-to-Prison Pipeline is the use of sworn School Resource Officers…

  17. pipelines cathodic protection design methodologies for impressed ...

    African Journals Online (AJOL)


    ...total external surface area of 226,224 m². The computation further showed that the current requirement was attainable by connecting 3,620 anodes to set up a natural potential between the sacrificial anodes and the pipeline. Key words: Cathodic protection, corrosion, impressed current, pipeline, sacrificial anodes.

  18. Offshore Pipeline Locations in the Gulf of Mexico, Geographic NAD27, MMS (2007) [pipelines_vectors_mms_2007

    Data.gov (United States)

    Louisiana Geographic Information Center — Offshore Minerals Management Pipeline Locations for the Gulf of Mexico (GOM). Contains the lines of the pipeline in the GOM. All pipelines existing in the databases...

  19. Offshore Pipeline Locations in the Gulf of Mexico, Geographic NAD27, MMS (2007) [pipelines_points_mms_2007

    Data.gov (United States)

    Louisiana Geographic Information Center — Offshore Minerals Management Pipeline Locations for the Gulf of Mexico (GOM). Contains the points of the pipeline in the GOM. All pipelines existing in the databases...

  20. Efficiency improvements in pipeline transportation systems

    Energy Technology Data Exchange (ETDEWEB)

    Banks, W. F.; Horton, J. F.

    1977-09-09

    This report identifies potential energy-conservative pipeline innovations that are most energy- and cost-effective and formulates recommendations for the R, D, and D programs needed to exploit those opportunities. From a candidate field of over twenty classes of efficiency improvements, eight systems are recommended for pursuit. Most of these possess two highly important attributes: large potential energy savings and broad applicability outside the pipeline industry. The R, D, and D program for each improvement and the recommended immediate next step are described. The eight technologies recommended for R, D, and D are gas-fired combined cycle compressor station; internally cooled internal combustion engine; methanol-coal slurry pipeline; methanol-coal slurry-fired and coal-fired engines; indirect-fired coal-burning combined-cycle pump station; fuel-cell pump station; drag-reducing additives in liquid pipelines; and internal coatings in pipelines.

  1. The Hyper Suprime-Cam software pipeline

    Science.gov (United States)

    Bosch, James; Armstrong, Robert; Bickerton, Steven; Furusawa, Hisanori; Ikeda, Hiroyuki; Koike, Michitaro; Lupton, Robert; Mineo, Sogo; Price, Paul; Takata, Tadafumi; Tanaka, Masayuki; Yasuda, Naoki; AlSayyad, Yusra; Becker, Andrew C.; Coulton, William; Coupon, Jean; Garmilla, Jose; Huang, Song; Krughoff, K. Simon; Lang, Dustin; Leauthaud, Alexie; Lim, Kian-Tat; Lust, Nate B.; MacArthur, Lauren A.; Mandelbaum, Rachel; Miyatake, Hironao; Miyazaki, Satoshi; Murata, Ryoma; More, Surhud; Okura, Yuki; Owen, Russell; Swinbank, John D.; Strauss, Michael A.; Yamada, Yoshihiko; Yamanoi, Hitomi

    2018-01-01

    In this paper, we describe the optical imaging data processing pipeline developed for the Subaru Telescope's Hyper Suprime-Cam (HSC) instrument. The HSC Pipeline builds on the prototype pipeline being developed by the Large Synoptic Survey Telescope's Data Management system, adding customizations for HSC, large-scale processing capabilities, and novel algorithms that have since been reincorporated into the LSST codebase. While designed primarily to reduce HSC Subaru Strategic Program (SSP) data, it is also the recommended pipeline for reducing general-observer HSC data. The HSC pipeline includes high-level processing steps that generate coadded images and science-ready catalogs as well as low-level detrending and image characterizations.

  2. Failure modes for pipelines in landslide areas

    Energy Technology Data Exchange (ETDEWEB)

    Bruschi, R.; Spinazze, M.; Tomassini, D. [Snamprogetti SpA, Fano (Italy); Cuscuna, S.; Venzi, S. [SNAM SpA, San Donato Milanese (Italy)

    1995-12-31

    In recent years a number of incidents involving pipelines affected by slow soil movements have been reported in the relevant literature. Related issues such as soil-pipe interaction have been studied both theoretically and through experimental surveys, along with the environmental conditions which pose a hazard to pipeline integrity. Suitable design criteria for these circumstances have been discussed by several authors, in particular in relation to a limit state approach and hence strain-based criteria. The scope of this paper is to describe the failure mechanisms which may affect a pipeline subject to slow soil movements, both in the longitudinal and transverse direction. Particular attention is paid to the environmental, geometric and structural parameters which steer the process towards one or another failure mechanism. Criteria for deciding upon the remedial measures required to guarantee the structural integrity of the pipeline, both in the short and in the long term, are discussed.

  3. optimization for trenchless reconstruction of pipelines

    Directory of Open Access Journals (Sweden)

    Zhmakov Gennadiy Nikolaevich

    2015-01-01

    Today the technologies of trenchless reconstruction of pipelines are becoming more and more widely used in Russia and abroad. One of the most promising methods is shock-free destruction of the old pipeline being replaced, using hydraulic installations whose working mechanism is a cutting unit with knife disks and a conic expander. A working mechanism that permits trenchless reconstruction of pipelines of different diameters has been optimized and patented, and a developmental prototype has been manufactured. The dependence of the pipeline cutting force on knife bluntness was determined: the cutting force for old steel pipelines increases in proportion to the degree of blunting. Two stands for endurance testing of the knives under laboratory conditions are proposed and patented.

  4. Pipeline integrity: ILI baseline data for QRA

    Energy Technology Data Exchange (ETDEWEB)

    Porter, Todd R. [Tuboscope Pipeline Services, Houston, TX (United States)]. E-mail: tporter@varco.com; Silva, Jose Augusto Pereira da [Pipeway Engenharia, Rio de Janeiro, RJ (Brazil)]. E-mail: guto@pipeway.com; Marr, James [MARR and Associates, Calgary, AB (Canada)]. E-mail: jmarr@marr-associates.com

    2003-07-01

    The initial phase of a pipeline integrity management program (IMP) is conducting a baseline assessment of the pipeline system and segments as part of Quantitative Risk Assessment (QRA). This gives the operator's integrity team the opportunity to identify critical areas and deficiencies in the protection, maintenance, and mitigation strategies. As a part of data gathering and integration of a wide variety of sources, in-line inspection (ILI) data is a key element. In order to move forward in the integrity program development and execution, the baseline geometry of the pipeline must be determined with accuracy and confidence. From this, all subsequent analysis and conclusions will be derived. Tuboscope Pipeline Services (TPS), in conjunction with Pipeway Engenharia of Brazil, operate ILI inertial navigation system (INS) and Caliper geometry tools, to address this integrity requirement. This INS and Caliper ILI tool data provides pipeline trajectory at centimeter-level resolution and sub-metre 3D position accuracy along with internal geometry - ovality, dents, misalignment, and wrinkle/buckle characterization. Global strain can be derived from precise INS curvature measurements and departure from the initial pipeline state. Accurate pipeline elevation profile data is essential in the identification of sag/overbend sections for fluid dynamic and hydrostatic calculations. This data, along with pipeline construction, operations, direct assessment and maintenance data is integrated in LinaViewPRO(TM), a pipeline data management system for decision support functions, and subsequent QRA operations. This technology provides the baseline for an informed, accurate and confident integrity management program. This paper/presentation will detail these aspects of an effective IMP, and experience will be presented, showing the benefits for liquid and gas pipeline systems. (author)
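
    The derivation of global strain from INS curvature mentioned above follows the standard beam relation ε = κ·D/2 (outer-fiber bending strain); comparing curvature between survey runs isolates strain accumulated in service. A minimal sketch with illustrative numbers:

        def bending_strain(curvature_1_per_m, outside_diameter_m):
            """Outer-fiber bending strain from pipeline curvature (standard
            beam relation, strain = curvature * D/2)."""
            return curvature_1_per_m * outside_diameter_m / 2.0

        # Illustrative values: curvature from two successive INS runs on a 20-inch line
        kappa_asbuilt, kappa_now = 0.0005, 0.0020        # 1/m
        delta = bending_strain(kappa_now - kappa_asbuilt, 0.508)
        print(f"{delta * 100:.3f} % strain change")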

  5. PANDA: a pipeline toolbox for analyzing brain diffusion images

    Directory of Open Access Journals (Sweden)

    Zaixu Cui

    2013-02-01

    Diffusion magnetic resonance imaging (dMRI) is widely used in both scientific research and clinical practice in in-vivo studies of the human brain. While a number of post-processing packages have been developed, fully automated processing of dMRI datasets remains challenging. Here, we developed a MATLAB toolbox named Pipeline for Analyzing braiN Diffusion imAges (PANDA) for fully automated processing of brain diffusion images. The processing modules of a few established packages, including FMRIB Software Library (FSL), Pipeline System for Octave and Matlab (PSOM), Diffusion Toolkit and MRIcron, were employed in PANDA. Using any number of raw dMRI datasets from different subjects, in either DICOM or NIfTI format, PANDA can automatically perform a series of steps to process DICOM/NIfTI to diffusion metrics (e.g., FA and MD) that are ready for statistical analysis at the voxel-level, the atlas-level and the Tract-Based Spatial Statistics (TBSS) level and can finish the construction of anatomical brain networks for all subjects. In particular, PANDA can process different subjects in parallel, using multiple cores either in a single computer or in a distributed computing environment, thus greatly reducing the time cost when dealing with a large number of datasets. In addition, PANDA has a friendly graphical user interface (GUI), allowing the user to be interactive and to adjust the input/output settings, as well as the processing parameters. As an open-source package, PANDA is freely available at http://www.nitrc.org/projects/panda/. This novel toolbox is expected to substantially simplify the image processing of dMRI datasets and facilitate human structural connectome studies.

  6. PANDA: a pipeline toolbox for analyzing brain diffusion images.

    Science.gov (United States)

    Cui, Zaixu; Zhong, Suyu; Xu, Pengfei; He, Yong; Gong, Gaolang

    2013-01-01

    Diffusion magnetic resonance imaging (dMRI) is widely used in both scientific research and clinical practice in in-vivo studies of the human brain. While a number of post-processing packages have been developed, fully automated processing of dMRI datasets remains challenging. Here, we developed a MATLAB toolbox named "Pipeline for Analyzing braiN Diffusion imAges" (PANDA) for fully automated processing of brain diffusion images. The processing modules of a few established packages, including FMRIB Software Library (FSL), Pipeline System for Octave and Matlab (PSOM), Diffusion Toolkit and MRIcron, were employed in PANDA. Using any number of raw dMRI datasets from different subjects, in either DICOM or NIfTI format, PANDA can automatically perform a series of steps to process DICOM/NIfTI to diffusion metrics [e.g., fractional anisotropy (FA) and mean diffusivity (MD)] that are ready for statistical analysis at the voxel-level, the atlas-level and the Tract-Based Spatial Statistics (TBSS)-level and can finish the construction of anatomical brain networks for all subjects. In particular, PANDA can process different subjects in parallel, using multiple cores either in a single computer or in a distributed computing environment, thus greatly reducing the time cost when dealing with a large number of datasets. In addition, PANDA has a friendly graphical user interface (GUI), allowing the user to be interactive and to adjust the input/output settings, as well as the processing parameters. As an open-source package, PANDA is freely available at http://www.nitrc.org/projects/panda/. This novel toolbox is expected to substantially simplify the image processing of dMRI datasets and facilitate human structural connectome studies.
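
    PANDA's per-subject parallelism is the classic embarrassingly parallel pattern: each subject's dataset is independent, so subjects map cleanly onto worker processes. A minimal Python sketch of the pattern only; the per-subject function is a placeholder, since PANDA delegates the real steps to FSL, Diffusion Toolkit, and the other packages named above:

        from multiprocessing import Pool

        def process_subject(subject_id: str) -> str:
            # Placeholder for the per-subject chain (DICOM/NIfTI -> FA/MD maps).
            return f"{subject_id}: metrics ready"

        subjects = [f"sub-{i:03d}" for i in range(1, 9)]
        if __name__ == "__main__":
            with Pool(processes=4) as pool:        # multiple cores on one machine
                for line in pool.map(process_subject, subjects):
                    print(line)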

  7. Pipeline Decommissioning Trial AWE Berkshire UK - 13619

    Energy Technology Data Exchange (ETDEWEB)

    Agnew, Kieran [AWE, Aldermaston, Reading, RG7 4PR (United Kingdom)

    2013-07-01

    This paper details the implementation of a 'Decommissioning Trial' to assess the feasibility of decommissioning the redundant pipeline operated by AWE in Berkshire, UK. The paper also presents the tool box of decommissioning techniques that were developed during the trial. Constructed in the 1950's and operated until 2005, AWE used a pipeline for the authorised discharge of treated effluent. Now redundant, the pipeline is under a care and surveillance regime awaiting decommissioning. The pipeline is some 18.5 km in length and extends from the AWE site to the River Thames. Along its route the pipeline passes along and under several major roads, railway lines and rivers as well as travelling through woodland, agricultural land and residential areas. AWE is considering a number of options for decommissioning the pipeline. One option is to remove it. In order to assist option evaluation and assess the feasibility of removing the pipeline, a decommissioning trial was undertaken and sections of the pipeline were removed within the AWE site. The objectives of the decommissioning trial were to: - Demonstrate to stakeholders that the pipeline can be removed safely, securely and cleanly - Develop a 'tool box' of methods that could be deployed to remove the pipeline - Replicate the conditions and environments encountered along the route of the pipeline. The onsite trial was also designed to replicate the physical conditions and constraints encountered along the remainder of the route, i.e. working along a narrow corridor, working in close proximity to roads, and working in proximity to above-ground and underground services (e.g. gas, water, electricity). By undertaking the decommissioning trial AWE have successfully demonstrated that the pipeline can be decommissioned in a safe, secure and clean manner and have developed a tool box of decommissioning techniques. The tool box includes...

  8. The Very Large Array Data Processing Pipeline

    Science.gov (United States)

    Kent, Brian R.; Masters, Joseph S.; Chandler, Claire J.; Davis, Lindsey E.; Kern, Jeffrey S.; Ott, Juergen; Schinzel, Frank K.; Medlin, Drew; Muders, Dirk; Williams, Stewart; Geers, Vincent C.; Momjian, Emmanuel; Butler, Bryan J.; Nakazato, Takeshi; Sugimoto, Kanako

    2018-01-01

    We present the VLA Pipeline, software that is part of the larger pipeline processing framework used for the Karl G. Jansky Very Large Array (VLA), and Atacama Large Millimeter/sub-millimeter Array (ALMA) for both interferometric and single dish observations. Through a collection of base code jointly used by the VLA and ALMA, the pipeline builds a hierarchy of classes to execute individual atomic pipeline tasks within the Common Astronomy Software Applications (CASA) package. Each pipeline task contains heuristics designed by the team to actively decide the best processing path and execution parameters for calibration and imaging. The pipeline code is developed and written in Python and uses a "context" structure for tracking the heuristic decisions and processing results. The pipeline "weblog" acts as the user interface in verifying the quality assurance of each calibration and imaging stage. The majority of VLA scheduling blocks above 1 GHz are now processed with the standard continuum recipe of the pipeline and offer a calibrated measurement set as a basic data product to observatory users. In addition, the pipeline is used for processing data from the VLA Sky Survey (VLASS), a seven-year community-driven endeavor started in September 2017 to survey the entire sky down to a declination of -40 degrees at S-band (2-4 GHz). This 5500-hour next-generation large radio survey will explore the time and spectral domains, relying on pipeline processing to generate calibrated measurement sets, polarimetry, and imaging data products that are available to the astronomical community with no proprietary period. Here we present an overview of the pipeline design philosophy, heuristics, and calibration and imaging results produced by the pipeline. Future development will include the testing of spectral line recipes, low signal-to-noise heuristics, and serving as a testing platform for science-ready data products. The pipeline is developed as part of the CASA software package by an
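
    The "context" structure described above, an object threaded through every task that accumulates heuristic decisions and results, is a simple pattern to sketch in Python (the language the pipeline itself uses). The class and task names below are invented for illustration and are not the CASA pipeline API:

        class Context:
            """Accumulates heuristic decisions and results as stages run
            (the real pipeline context is far richer; this shows the pattern)."""
            def __init__(self):
                self.decisions, self.results = [], []

        class Task:
            name = "generic"
            def choose_parameters(self, ctx):          # heuristic decision point
                return {}
            def run(self, ctx):
                params = self.choose_parameters(ctx)
                ctx.decisions.append((self.name, params))
                ctx.results.append((self.name, "ok"))  # stand-in for real outputs

        class Calibrate(Task):
            name = "calibrate"
            def choose_parameters(self, ctx):
                return {"refant": "ea05"}              # e.g. pick a reference antenna

        ctx = Context()
        for task in [Calibrate(), Task()]:
            task.run(ctx)
        print(ctx.decisions)                           # feeds a weblog-style report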

  9. Current pipelines for neglected diseases.

    Directory of Open Access Journals (Sweden)

    Paolo di Procolo

    2014-09-01

    This paper scrutinises pipelines for Neglected Diseases (NDs), through freely accessible and at-least-weekly updated trials databases. It updates to 2012 data provided by recent publications, and integrates these analyses with information on location of trials coordinators and patients recruitment status. Additionally, it provides (i) disease-specific information to better understand the rationale of investments in NDs, (ii) yearly data, to understand the investment trends. The search identified 650 clinical studies. Leishmaniasis, Arbovirus infection, and Dengue are the top three diseases by number of clinical studies. Disease diffusion risk seems to be the most important driver of the clinical trials target choice, whereas the role played by disease prevalence and unmet need is controversial. The number of trials is stable between 2005 and 2010, with an increase in the last two years. Patient recruitment was completed for most studies (57.6%), and Phases II and III account for 35% and 28% of trials, respectively. The primary purpose of clinical investigations is prevention (49.3%), especially for infectious diseases with mosquitoes and sand flies as the vector, and treatment (43.2%), which is the primary target for parasitic diseases. Research centres and public organisations are the most important clinical studies sponsors (58.9%), followed by the pharmaceutical industry (24.1%), and foundations and non-governmental organisations (9.3%). Many coordinator centres are located in less affluent countries (43.7%), whereas OECD countries and BRICS account for 34.7% and 17.5% of trials, respectively. Information was partially missing for some parameters. Notwithstanding, and despite its descriptive nature, this research has enhanced the evidence of the literature on pipelines for NDs. Future contributions may further investigate whether trials metrics are consistent with the characteristics of the interested countries and the explicative variables of trials location

  10. Current pipelines for neglected diseases.

    Science.gov (United States)

    di Procolo, Paolo; Jommi, Claudio

    2014-09-01

    This paper scrutinises pipelines for Neglected Diseases (NDs), through freely accessible and at-least-weekly updated trials databases. It updates to 2012 data provided by recent publications, and integrates these analyses with information on location of trials coordinators and patients recruitment status. Additionally, it provides (i) disease-specific information to better understand the rationale of investments in NDs, (ii) yearly data, to understand the investment trends. The search identified 650 clinical studies. Leishmaniasis, Arbovirus infection, and Dengue are the top three diseases by number of clinical studies. Disease diffusion risk seems to be the most important driver of the clinical trials target choice, whereas the role played by disease prevalence and unmet need is controversial. The number of trials is stable between 2005 and 2010, with an increase in the last two years. Patient recruitment was completed for most studies (57.6%), and Phases II and III account for 35% and 28% of trials, respectively. The primary purpose of clinical investigations is prevention (49.3%), especially for infectious diseases with mosquitoes and sand flies as the vector, and treatment (43.2%), which is the primary target for parasitic diseases. Research centres and public organisations are the most important clinical studies sponsors (58.9%), followed by the pharmaceutical industry (24.1%), foundations and non-governmental organisations (9.3%). Many coordinator centres are located in less affluent countries (43.7%), whereas OECD countries and BRICS account for 34.7% and 17.5% of trials, respectively. Information was partially missing for some parameters. Notwithstanding, and despite its descriptive nature, this research has enhanced the evidence of the literature on pipelines for NDs. Future contributions may further investigate whether trials metrics are consistent with the characteristics of the interested countries and the explicative variables of trials location, target

  11. ASAP: an environment for automated preprocessing of sequencing data

    Directory of Open Access Journals (Sweden)

    Torstenson Eric S

    2013-01-01

    Background: Next-generation sequencing (NGS) has yielded an unprecedented amount of data for genetics research. It is a daunting task to process the data from raw sequence reads to variant calls, and manually processing this data can significantly delay downstream analysis and increase the possibility for human error. The research community has produced tools to properly prepare sequence data for analysis and established guidelines on how to apply those tools to achieve the best results; however, existing pipeline programs to automate the process through its entirety are either inaccessible to investigators, or web-based and require a certain amount of administrative expertise to set up. Findings: Advanced Sequence Automated Pipeline (ASAP) was developed to provide a framework for automating the translation of sequencing data into annotated variant calls with the goal of minimizing user involvement without the need for dedicated hardware or administrative rights. ASAP works both on computer clusters and on standalone machines with minimal human involvement and maintains high data integrity, while allowing complete control over the configuration of its component programs. It offers an easy-to-use interface for submitting and tracking jobs as well as resuming failed jobs. It also provides tools for quality checking and for dividing jobs into pieces for maximum throughput. Conclusions: ASAP provides an environment for building an automated pipeline for NGS data preprocessing. This environment is flexible for use and future development. It is freely available at http://biostat.mc.vanderbilt.edu/ASAP.

  12. ASAP: an environment for automated preprocessing of sequencing data.

    Science.gov (United States)

    Torstenson, Eric S; Li, Bingshan; Li, Chun

    2013-01-04

    Next-generation sequencing (NGS) has yielded an unprecedented amount of data for genetics research. It is a daunting task to process the data from raw sequence reads to variant calls and manually processing this data can significantly delay downstream analysis and increase the possibility for human error. The research community has produced tools to properly prepare sequence data for analysis and established guidelines on how to apply those tools to achieve the best results, however, existing pipeline programs to automate the process through its entirety are either inaccessible to investigators, or web-based and require a certain amount of administrative expertise to set up. Advanced Sequence Automated Pipeline (ASAP) was developed to provide a framework for automating the translation of sequencing data into annotated variant calls with the goal of minimizing user involvement without the need for dedicated hardware or administrative rights. ASAP works both on computer clusters and on standalone machines with minimal human involvement and maintains high data integrity, while allowing complete control over the configuration of its component programs. It offers an easy-to-use interface for submitting and tracking jobs as well as resuming failed jobs. It also provides tools for quality checking and for dividing jobs into pieces for maximum throughput. ASAP provides an environment for building an automated pipeline for NGS data preprocessing. This environment is flexible for use and future development. It is freely available at http://biostat.mc.vanderbilt.edu/ASAP.
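
    ASAP's job tracking and resume-on-failure behavior can be sketched with a checkpoint file of completed chunk IDs: a rerun skips finished work and stops at the first failure. This is an illustration of the idea only, not ASAP's implementation; the state file name and the commands are placeholders.

        import json, pathlib, subprocess

        def run_chunks(commands, state_file="asap_state.json"):
            """Run (chunk_id, command) pairs, checkpointing completed chunk
            IDs so a rerun resumes after the last success."""
            path = pathlib.Path(state_file)
            done = set(json.loads(path.read_text())) if path.exists() else set()
            for chunk_id, cmd in commands:
                if chunk_id in done:
                    continue                            # finished in a prior run
                if subprocess.run(cmd, shell=True).returncode == 0:
                    done.add(chunk_id)
                    path.write_text(json.dumps(sorted(done)))
                else:
                    print(f"chunk {chunk_id} failed; rerun to resume here")
                    break

        run_chunks([("chr1", "echo align chr1"), ("chr2", "echo align chr2")])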

  13. Optimal hub location in pipeline networks

    Energy Technology Data Exchange (ETDEWEB)

    Dott, D.R.; Wirasinghe, S.C.; Chakma, A. [Univ. of Calgary, Alberta (Canada)

    1996-12-31

    This paper discusses optimization strategies and techniques for the location of natural gas marketing hubs in the North American gas pipeline network. A hub is a facility at which inbound and outbound network links meet and freight is redirected toward its destination. Common examples of hubs used in the gas pipeline industry include gas plants, interconnects and market centers. Characteristics of the gas pipeline industry which are relevant to the optimization of transportation costs using hubs are presented. Allocation techniques for solving location-allocation problems are discussed. An outline of the research in progress by the authors in the field of optimal gas hub location concludes the paper.

  14. Molgenis-impute: imputation pipeline in a box.

    Science.gov (United States)

    Kanterakis, Alexandros; Deelen, Patrick; van Dijk, Freerk; Byelas, Heorhiy; Dijkstra, Martijn; Swertz, Morris A

    2015-08-19

    Genotype imputation is an important procedure in current genomic analysis such as genome-wide association studies, meta-analyses and fine mapping. Although high quality tools are available that perform the steps of this process, considerable effort and expertise is required to set up and run a best practice imputation pipeline, particularly for larger genotype datasets, where imputation has to scale out in parallel on computer clusters. Here we present MOLGENIS-impute, an 'imputation in a box' solution that seamlessly and transparently automates the set up and running of all the steps of the imputation process. These steps include genome build liftover (liftovering), genotype phasing with SHAPEIT2, quality control, sample and chromosomal chunking/merging, and imputation with IMPUTE2. MOLGENIS-impute builds on MOLGENIS-compute, a simple pipeline management platform for submission and monitoring of bioinformatics tasks in High Performance Computing (HPC) environments like local/cloud servers, clusters and grids. All the required tools, data and scripts are downloaded and installed in a single step. Researchers with diverse backgrounds and expertise have tested MOLGENIS-impute on different locations and imputed over 30,000 samples so far using the 1,000 Genomes Project and new Genome of the Netherlands data as the imputation reference. The tests have been performed on PBS/SGE clusters, cloud VMs and in a grid HPC environment. MOLGENIS-impute gives priority to the ease of setting up, configuring and running an imputation. It has minimal dependencies and wraps the pipeline in a simple command line interface, without sacrificing flexibility to adapt or limiting the options of underlying imputation tools. It does not require knowledge of a workflow system or programming, and is targeted at researchers who just want to apply best practices in imputation via simple commands. It is built on the MOLGENIS compute workflow framework to enable customization with additional
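
    The chromosomal chunking step mentioned above is straightforward to sketch: split each chromosome into fixed-size windows and dispatch one imputation job per window. The window size and the IMPUTE2 invocation below are illustrative assumptions, not MOLGENIS-impute's actual defaults.

        def make_chunks(chrom_length_bp, chunk_bp=5_000_000):
            """Split a chromosome into fixed-size windows so each window can
            be imputed as an independent cluster job."""
            starts = range(1, chrom_length_bp + 1, chunk_bp)
            return [(s, min(s + chunk_bp - 1, chrom_length_bp)) for s in starts]

        # Chromosome 21 (GRCh37 length); print the first few window arguments
        for start, end in make_chunks(48_129_895)[:3]:
            print(f"impute2 ... -int {start} {end}")   # one job per window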

  15. 49 CFR 192.513 - Test requirements for plastic pipelines.

    Science.gov (United States)

    2010-10-01

    ... 49 Transportation 3 2010-10-01 2010-10-01 false Test requirements for plastic pipelines. 192.513 Section 192.513 Transportation Other Regulations Relating to Transportation (Continued) PIPELINE AND... Test requirements for plastic pipelines. (a) Each segment of a plastic pipeline must be tested in...

  16. 49 CFR 195.57 - Filing offshore pipeline condition reports.

    Science.gov (United States)

    2010-10-01

    ... TRANSPORTATION OF HAZARDOUS LIQUIDS BY PIPELINE Annual, Accident, and Safety-Related Condition Reporting § 195.57... inspected. (5) Length and date of installation of each exposed pipeline segment, and location; including, if... tract. (6) Length and date of installation of each pipeline segment, if different from a pipeline...

  17. Approaches to automatic parameter fitting in a microscopy image segmentation pipeline: An exploratory parameter space analysis

    Science.gov (United States)

    Held, Christian; Nattkemper, Tim; Palmisano, Ralf; Wittenberg, Thomas

    2013-01-01

    Introduction: Research and diagnosis in medicine and biology often require the assessment of a large amount of microscopy image data. Although digital pathology and new bioimaging technologies are finding their way into clinical practice and pharmaceutical research, some general methodological issues in automated image analysis are still open. Methods: In this study, we address the problem of fitting the parameters in a microscopy image segmentation pipeline. We propose to fit the parameters of the pipeline's modules with optimization algorithms, such as genetic algorithms or coordinate descent, and show how visual exploration of the parameter space can help to identify sub-optimal parameter settings that need to be avoided. Results: This is of significant help in the design of our automatic parameter fitting framework, which enables us to tune the pipeline for large sets of micrographs. Conclusion: The underlying parameter spaces pose a challenge for manual as well as automated parameter optimization, as the parameter spaces can show several local performance maxima. Hence, optimization strategies that are not able to jump out of local performance maxima, like the hill climbing algorithm, often result in a local maximum. PMID:23766941
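
    A minimal sketch of the coordinate-descent strategy the study compares: optimize one pipeline parameter at a time over a discrete grid, sweeping until no coordinate improves. The toy score below is invented and has a single optimum; on the multimodal segmentation landscapes the authors describe, such greedy search can stall in a local maximum, which is exactly why they pair optimization with visual exploration of the parameter space.

        def coordinate_descent(score, grid, start):
            """Greedy coordinate descent over a discrete parameter grid."""
            current = dict(start)
            improved = True
            while improved:
                improved = False
                for name, values in grid.items():      # one coordinate at a time
                    best = max(values, key=lambda v: score({**current, name: v}))
                    if best != current[name]:
                        current[name] = best
                        improved = True
            return current

        # Toy segmentation "score" with its optimum at threshold=0.5, radius=3
        score = lambda p: -(p["threshold"] - 0.5) ** 2 - (p["radius"] - 3) ** 2
        grid = {"threshold": [0.3, 0.4, 0.5, 0.6], "radius": [1, 2, 3, 4]}
        print(coordinate_descent(score, grid, {"threshold": 0.3, "radius": 1}))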

  18. Development of a full-length human protein production pipeline.

    Science.gov (United States)

    Saul, Justin; Petritis, Brianne; Sau, Sujay; Rauf, Femina; Gaskin, Michael; Ober-Reynolds, Benjamin; Mineyev, Irina; Magee, Mitch; Chaput, John; Qiu, Ji; LaBaer, Joshua

    2014-08-01

    There are many proteomic applications that require large collections of purified protein, but parallel production of large numbers of different proteins remains a very challenging task. To help meet the needs of the scientific community, we have developed a human protein production pipeline. Using high-throughput (HT) methods, we transferred the genes of 31 full-length proteins into three expression vectors, and expressed the collection as N-terminal HaloTag fusion proteins in Escherichia coli and two commercial cell-free (CF) systems, wheat germ extract (WGE) and HeLa cell extract (HCE). Expression was assessed by labeling the fusion proteins specifically and covalently with a fluorescent HaloTag ligand and detecting its fluorescence on a LabChip® GX microfluidic capillary gel electrophoresis instrument. This automated, HT assay provided both qualitative and quantitative assessment of recombinant protein. E. coli was only capable of expressing 20% of the test collection in the supernatant fraction with ≥20 μg yields, whereas CF systems had ≥83% success rates. We purified expressed proteins using an automated HaloTag purification method. We purified 20, 33, and 42% of the test collection from E. coli, WGE, and HCE, respectively, with yields ≥1 μg and ≥90% purity. Based on these observations, we have developed a triage strategy for producing full-length human proteins in these three expression systems. © 2014 The Protein Society.

  19. Regular pipeline maintenance of gas pipeline using technical operational diagnostics methods

    Energy Technology Data Exchange (ETDEWEB)

    Volentic, J. [Gas Transportation Department, Slovensky plynarensky priemysel, Slovak Gas Industry, Bratislava (Slovakia)

    1997-12-31

    Slovensky plynarensky priemysel (SPP) operated 17,487 km of gas pipelines in 1995. The length of the long-line pipelines reached 5,191 km; the distribution network was 12,296 km. The international transit system of long-line gas pipelines comprised 1,939 km of pipelines of various dimensions. The transport and distribution system described represents a multibillion investment stored in the ground, exposed to environmental influences and to pipeline operational stresses. In spite of all the technical and maintenance arrangements performed on operating gas pipelines, gradual ageing takes place anyway, expressed in degradation processes both in the steel tube and in the anti-corrosion coating. Within a certain time horizon, consistent and regular application of the methods and means of in-service technical diagnostics and rehabilitation of existing pipeline systems makes it possible to save substantial investment funds, postponing the need for funds for a complete or partial reconstruction or the new construction of a specific gas section. The purpose of this presentation is to report on the implementation of the programme of in-service technical diagnostics of gas pipelines within the framework of the regular maintenance of SPP s.p. Bratislava high pressure gas pipelines. (orig.) 6 refs.

  20. Pipelines in Louisiana, Geographic NAD83, USGS (1999) [pipelines_la_usgs_1999

    Data.gov (United States)

    Louisiana Geographic Information Center — This dataset contains vector line map information of various pipelines throughout the State of Louisiana. The vector data contain selected base categories of...

  1. Fuzzy logic for pipelines risk assessment

    Directory of Open Access Journals (Sweden)

    Ali Alidoosti

    2012-08-01

    Pipeline systems are identified as the safest way of transporting oil and natural gas. One of the most important aspects in developing pipeline systems is determining the potential risks that implementers may encounter. Risk analysis can therefore identify critical risk items so that limited resources and time can be allocated to them. Risk Analysis and Management for Critical Asset Protection (RAMCAP) is one of the best methodologies for assessing security risks. However, the most challenging problem in this method is uncertainty. Therefore, fuzzy set theory is used to model the uncertainty, and Fuzzy RAMCAP is introduced for risk analysis and management of pipeline systems. Finally, a notional example from pipeline systems is provided to demonstrate an application of the proposed methodology.
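
    A toy sketch of the idea: fuzzify the crisp RAMCAP risk relation (risk = threat x vulnerability x consequence) into linguistic levels with triangular membership functions. The membership breakpoints below are invented for illustration; the paper's actual membership functions and rule base are not reproduced.

        def tri(x, a, b, c):
            """Triangular membership with degenerate (shoulder) ends allowed."""
            if x < a or x > c:
                return 0.0
            if x == b:
                return 1.0
            return (x - a) / (b - a) if x < b else (c - x) / (c - b)

        def fuzzy_risk(threat, vulnerability, consequence):
            """Fuzzify the crisp RAMCAP product (inputs on [0, 1]) into
            illustrative linguistic risk levels."""
            crisp = threat * vulnerability * consequence
            levels = {"low": (0.0, 0.0, 0.4),
                      "medium": (0.2, 0.5, 0.8),
                      "high": (0.6, 1.0, 1.0)}
            return {name: round(tri(crisp, *abc), 2) for name, abc in levels.items()}

        print(fuzzy_risk(threat=0.8, vulnerability=0.6, consequence=0.9))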

  2. GLAST (FERMI) Data-Processing Pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Flath, Daniel L.; Johnson, Tony S.; Turri, Massimiliano; Heidenreich, Karen A.; /SLAC

    2011-08-12

    The Data Processing Pipeline ('Pipeline') has been developed for the Gamma-Ray Large Area Space Telescope (GLAST), which launched June 11, 2008. It generically processes graphs of dependent tasks, maintaining a full record of its state, history and data products. The Pipeline is used to automatically process the data down-linked from the satellite and to deliver science products to the GLAST collaboration and the Science Support Center, and it has been in continuous use since launch with great success. The Pipeline handles up to 2000 concurrent jobs and, in reconstructing science data, produces approximately 750 GB of data products per day, using about half a CPU-year of processing time each day.
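
    Processing "graphs of dependent tasks" reduces to executing a DAG in topological order while recording what ran. A minimal sketch (the task graph is invented for illustration; the real Pipeline also persists full state, history, and data products):

        from graphlib import TopologicalSorter   # Python 3.9+

        # Toy task graph: each task maps to the set of tasks it depends on.
        graph = {
            "reconstruct": {"downlink"},
            "make_products": {"reconstruct"},
            "quality_check": {"reconstruct"},
            "deliver": {"make_products", "quality_check"},
        }

        history = []                             # stand-in for persisted history
        for task in TopologicalSorter(graph).static_order():
            history.append(task)                 # "run" the task, record it
        print(" -> ".join(history))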

  3. Citizenship program in near communities of pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Mascarenhas, Carina R.; Vilas Boas, Ianne P. [TELSAN Engenharia, Belo Horizonte, MG (Brazil); Bourscheid, Pitagoras [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil)

    2009-12-19

    During the construction of a pipeline, IENE - the Engineering Unit of PETROBRAS responsible for the construction and erection of pipelines and related plants in northeastern Brazil - crossed more than 7 states and 250 counties and implemented a social responsibility program, in particular a citizenship program. This action was the result of studies of the communities located near the pipelines' Direct Influence Area (AID - 438 yards to the right and left of the pipeline) and of evidence that those communities were poor and that residents lacked personal documents and a citizen's standing in society. This paper intends to share the experience of IENE with its citizenship program, which worked along three main lines: community mobilization; citizenship qualification; and a citizenship board. The last of these makes it possible for people to obtain their personal documents and exercise citizenship to the full. (author)

  4. Epoxy Pipelining Composition and Method of Manufacture.

    Science.gov (United States)

    1994-12-14

    ...corrosion and erosion. The pipelining composition forms a barrier which prevents the leaching of, for example, metals from pipes. This invention, more particularly, relates to an epoxy resin/curing agent corrosion-resistant network pipelining composition suitable for the in...

  5. A review of bioinformatic pipeline frameworks

    Science.gov (United States)

    2017-01-01

    Abstract High-throughput bioinformatic analyses increasingly rely on pipeline frameworks to process sequence and metadata. Modern implementations of these frameworks differ on three key dimensions: using an implicit or explicit syntax, using a configuration, convention or class-based design paradigm and offering a command line or workbench interface. Here I survey and compare the design philosophies of several current pipeline frameworks. I provide practical recommendations based on analysis requirements and the user base. PMID:27013646

  6. Automated Budget System -

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  7. GRAVITY PIPELINE TRANSPORT FOR HARDENING FILLING MIXTURES

    Directory of Open Access Journals (Sweden)

    Leonid KROUPNIK

    2015-12-01

    In underground mining of solid minerals, development systems with stowing of hardening mixtures are becoming increasingly common. In this case the natural ore array, after it is excavated, is replaced by an artificial array of solidified filling mixture consisting of binder, aggregates and water. Such a mixture is prepared on the surface at special stowing complexes and transported underground through special stowing pipelines. However, it is transported to horizons several kilometers away, which requires a sustainable mode of motion of the mixture in the pipeline. A hardening stowing mixture changes its rheological characteristics over time, which complicates the calculation of the parameters of pipeline transportation. The article suggests a method of determining the initial parameters of such mixtures: the state coefficient, the indicator of transportability, and the coefficient of hydrodynamic resistance to motion of the mixture. These indicators characterize the mixture in terms of the possibility of transporting it through pipes. On the basis of these indicators, a methodology is proposed for calculating the parameters of pipeline transport of hardening filling mixtures in gravity-flow mode, where the mixture moves along the horizontal section under the pressure of the mixture column in the vertical section of the backfill pipeline. This technique guarantees stable pipeline transport operation.

  8. The Ruptured Pipeline: Analysis of the Mining Engineering Faculty Pipeline

    Science.gov (United States)

    Poulton, M.

    2011-12-01

    The booming commodities markets of the past seven years have created an enormous demand for economic geologists, mining engineers, and extractive metallurgists. The mining sector has largely been recession-proof due to demand drivers coming from developing rather than developed nations. The strong demand for new hires as well as mid-career hires has exposed the weakness of the U.S. university supply pipeline for these career fields. A survey of mining and metallurgical engineering faculty and graduate students was conducted in 2010 at the request of the Society for Mining, Metallurgy, and Exploration. The goals of the surveys were to determine the demographics of the U.S. faculty in mining and metallurgical engineering, the expected faculty turnover by 2020, and the potential supply of graduate students as the future professorate. All Mining Engineering and Metallurgical Engineering degrees in the U.S. are accredited by the Accreditation Board for Engineering and Technology (ABET) and the specific courses required are set by the sponsoring professional society, the Society for Mining, Metallurgy, and Exploration. There are 13 universities in the U.S. that offer a degree in Mining Engineering accredited as Mining Engineering and 1 university that grants a Mining Engineering degree accredited under general engineering program requirements. Faculty numbers are approximately 87 tenure-track positions with a total undergraduate enrollment of slightly over 1,000 in the 2008-2009 academic year. There are approximately 262 graduate students in mining engineering in the U.S., including 87 Ph.D. students. Mining Engineering department heads have identified 14 positions open in 2010, 18 positions expected to be open in the next 5 years, and an additional 21 positions open by 2020. The current survey predicts a 56% turnover in mining faculty ranks over the next 10 years, with retirement of 100% of current senior faculty over that period. 63% of graduate students say they are interested in

  9. Automation 2017

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2017-01-01

    This book consists of papers presented at Automation 2017, an international conference held in Warsaw from March 15 to 17, 2017. It discusses research findings associated with the concepts behind INDUSTRY 4.0, with a focus on offering a better understanding of and promoting participation in the Fourth Industrial Revolution. Each chapter presents a detailed analysis of a specific technical problem, in most cases followed by a numerical analysis, simulation and description of the results of implementing the solution in a real-world context. The theoretical results, practical solutions and guidelines presented are valuable for both researchers working in the area of engineering sciences and practitioners looking for solutions to industrial problems.

  10. Marketing automation

    Directory of Open Access Journals (Sweden)

    TODOR Raluca Dania

    2017-01-01

    The automation of the marketing process seems nowadays to be the only solution to face the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to technical progress, marketing fragmentation, and demand for customized products and services on one side, and the need to achieve constructive dialogue with customers, immediate and flexible response, and the necessity to measure investments and results on the other side, the classical marketing approach has changed and continues to evolve substantially.

  11. Launching genomics into the cloud: deployment of Mercury, a next generation sequence analysis pipeline.

    Science.gov (United States)

    Reid, Jeffrey G; Carroll, Andrew; Veeraraghavan, Narayanan; Dahdouli, Mahmoud; Sundquist, Andreas; English, Adam; Bainbridge, Matthew; White, Simon; Salerno, William; Buhay, Christian; Yu, Fuli; Muzny, Donna; Daly, Richard; Duyk, Geoff; Gibbs, Richard A; Boerwinkle, Eric

    2014-01-29

    Massively parallel DNA sequencing generates staggering amounts of data. Decreasing cost, increasing throughput, and improved annotation have expanded the diversity of genomics applications in research and clinical practice. This expanding scale creates analytical challenges: accommodating peak compute demand, coordinating secure access for multiple analysts, and sharing validated tools and results. To address these challenges, we have developed the Mercury analysis pipeline and deployed it in local hardware and the Amazon Web Services cloud via the DNAnexus platform. Mercury is an automated, flexible, and extensible analysis workflow that provides accurate and reproducible genomic results at scales ranging from individuals to large cohorts. By taking advantage of cloud computing and with Mercury implemented on the DNAnexus platform, we have demonstrated a powerful combination of a robust and fully validated software pipeline and a scalable computational resource that, to date, we have applied to more than 10,000 whole genome and whole exome samples.

  12. Overview of interstate hydrogen pipeline systems.

    Energy Technology Data Exchange (ETDEWEB)

    Gillette, J .L.; Kolpa, R. L

    2008-02-01

    The use of hydrogen in the energy sector of the United States is projected to increase significantly in the future. Current uses are predominantly in the petroleum refining sector, with hydrogen also being used in the manufacture of chemicals and other specialized products. Growth in hydrogen consumption is likely to appear in the refining sector, where greater quantities of hydrogen will be required as the quality of the raw crude decreases, and in the mining and processing of tar sands and other energy resources that are not currently used at a significant level. Furthermore, the use of hydrogen as a transportation fuel has been proposed both by automobile manufacturers and the federal government. Assuming that the use of hydrogen will significantly increase in the future, there would be a corresponding need to transport this material. A variety of production technologies are available for making hydrogen, and there are equally varied raw materials. Potential raw materials include natural gas, coal, nuclear fuel, and renewables such as solar, wind, or wave energy. As these raw materials are not uniformly distributed throughout the United States, it would be necessary to transport either the raw materials or the hydrogen long distances to the appropriate markets. While hydrogen may be transported in a number of possible forms, pipelines currently appear to be the most economical means of moving it in large quantities over great distances. One means of controlling hydrogen pipeline costs is to use common rights-of-way (ROWs) whenever feasible. For that reason, information on hydrogen pipelines is the focus of this document. Many of the features of hydrogen pipelines are similar to those of natural gas pipelines. Furthermore, as hydrogen pipeline networks expand, many of the same construction and operating features of natural gas networks would be replicated. As a result, the description of hydrogen pipelines will be very similar to that of natural gas pipelines

  13. Quantifying brain tissue volume in multiple sclerosis with automated lesion segmentation and filling

    OpenAIRE

    Valverde, Sergi; Oliver, Arnau; Roura, Eloy; Pareto, Deborah; Vilanova, Joan C.; Ramió-Torrentà, LLuís; Sastre-Garriga, Jaume; Montalban, Xavier; Rovira, Àlex; Lladó, Xavier

    2015-01-01

    Lesion filling has been successfully applied to reduce the effect of hypo-intense T1-w Multiple Sclerosis (MS) lesions on automatic brain tissue segmentation. However, a study of fully automated pipelines incorporating lesion segmentation and lesion filling on tissue volume analysis has not yet been performed. Here, we analyzed the percentage of error introduced by automating the lesion segmentation and filling processes in the tissue segmentation of 70 clinically isolated syndrome patient images. Fir...

  14. EzMap: a simple pipeline for reproducible analysis of the human virome.

    Science.gov (United States)

    Czeczko, Patrick; Greenway, Steven C; de Koning, A P Jason

    2017-08-15

    In solid-organ transplant recipients, a delicate balance between immunosuppression and immunocompetence must be achieved, which can be difficult to monitor in real-time. Shotgun sequencing of cell-free DNA (cfDNA) has been recently proposed as a new way to indirectly assess immune function in transplant recipients through analysis of the status of the human virome. To facilitate exploration of the utility of the human virome as an indicator of immune status, and to enable rapid, straightforward analyses by clinicians, we developed a fully automated computational pipeline, EzMap, for performing metagenomic analysis of the human virome. EzMap combines a number of tools to clean, filter, and subtract WGS reads by mapping to a reference human assembly. The relative abundance of each virus present is estimated using a maximum likelihood approach that accounts for genome size, and results are presented with interactive visualizations and taxonomy-based summaries that enable rapid insights. The pipeline is automated to run on both workstations and computing clusters for all steps. EzMap automates an otherwise tedious and time-consuming protocol and aims to facilitate rapid and reproducible insights from cfDNA. EzMap is freely available at https://github.com/dekoning-lab/ezmap. jason.dekoning@ucalgary.ca. Supplementary data are available at Bioinformatics online.
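
    The genome-size correction is the key idea: a large genome attracts more reads per viral copy, so raw counts must be length-normalized before abundances can be compared. Below is a simplified length-normalization sketch, a stand-in for (not a reproduction of) EzMap's maximum-likelihood estimator; the read counts are invented, while the genome sizes are approximate.

        def relative_abundance(read_counts, genome_sizes):
            """Length-normalized relative abundance: divide each virus's read
            count by its genome size, then renormalize to sum to 1."""
            weighted = {v: read_counts[v] / genome_sizes[v] for v in read_counts}
            total = sum(weighted.values())
            return {v: w / total for v, w in weighted.items()}

        counts = {"anellovirus": 900, "herpesvirus_5": 1200}    # toy read counts
        sizes = {"anellovirus": 3_800, "herpesvirus_5": 235_000}  # approx. genome bp
        print(relative_abundance(counts, sizes))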

  15. Transformation pipelines for PROJ.4

    Science.gov (United States)

    Knudsen, Thomas; Evers, Kristian

    2017-04-01

    The example transformation consists of four steps: convert the geographic coordinates to 3D cartesian geocentric coordinates; apply a Helmert transformation from ED50 to ETRS89; convert back from cartesian to geographic coordinates; and finally project the geographic coordinates to UTM zone 32 planar coordinates. The homology between these steps and a Unix shell style pipeline is evident. With this as its main architectural inspiration, the primary feature of our implementation is a pipeline driver, which takes a series of elementary operations as its user-supplied arguments and strings them together in order to implement the full transformation needed. Also, we have added a number of elementary geodetic operations, including Helmert transformations, general high order polynomial shifts (2D Horner's scheme) and the abridged Molodensky transformation. In anticipation of upcoming support for full time-varying transformations, we also introduce a 4D spatiotemporal data type, and a programming interface (API) for handling this. With these improvements in place, we assert that PROJ.4 is now well on its way from being a mostly-map-projection library to becoming an almost-generic-geodetic-transformation library.
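
    The four steps above map directly onto PROJ's pipeline syntax. The sketch below is a hedged illustration driven from Python via pyproj (an assumption; the paper concerns the C library itself), and the Helmert parameters are illustrative stand-ins, not an authoritative ED50-to-ETRS89 fit:

```python
# Hypothetical driver for the four-step chain above, via pyproj.
from math import radians
from pyproj import Transformer

pipeline = (
    "+proj=pipeline "
    "+step +proj=cart +ellps=intl "                  # geographic -> geocentric (ED50)
    "+step +proj=helmert +x=-81.07 +y=-89.36 "       # illustrative Helmert
    "+z=-115.75 +rx=-0.485 +ry=-0.024 +rz=-0.413 "   # parameters, not
    "+s=-0.541 +convention=coordinate_frame "        # authoritative values
    "+step +inv +proj=cart +ellps=GRS80 "            # geocentric -> geographic
    "+step +proj=utm +zone=32 +ellps=GRS80"          # project to UTM zone 32
)

t = Transformer.from_pipeline(pipeline)
# PROJ's native angular unit is radians; radians=True is honoured for
# pipeline transformers in pyproj 3.
easting, northing = t.transform(radians(12.0), radians(55.0), radians=True)
print(easting, northing)
```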

  16. Planning for purging and loading of a newly constructed gas pipeline system using a pipeline simulator

    Energy Technology Data Exchange (ETDEWEB)

    Mohitpour, M.; Kazakoff, J.; Jenkins, A.; Montemurro, D. [TransCanada Corp., Calgary, AB (Canada)

    2000-07-01

    A brief review of purging and loading of a gas pipeline was presented with a summary of current industry practices. When a pipeline is put into service, purging involves the displacement of air or nitrogen by high pressure natural gas injected into one end of the pipeline section. When a pipeline goes out of service, purging involves the displacement of natural gas by air or other neutral gases. Current practices give no consideration to minimizing the emission of methane gas into the atmosphere. This paper described a simplified purging calculation method and a simulation technique using commercially available software for planning purging and loading operations of gas pipeline systems. The hydraulic-based simulation technique made it possible to minimize the gas-to-air interface and the emission of methane gas. The simulation also helped to predict the timing of purging and loading of the pipeline. An example was presented of the newly constructed Mayakan Pipeline in Mexico to demonstrate how the process was developed. Simulation results compared favourably with field data collected during the actual purging and loading of the pipeline. 11 refs., 5 tabs., 8 figs.
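
    The paper's simulator captures the full hydraulics; as a first-order sanity check of the purge timing it predicts, a plug-flow displacement estimate is sometimes useful. The sketch below is such a back-of-envelope check with invented numbers, not the paper's method:

```python
# Back-of-envelope plug-flow displacement check (illustrative numbers only);
# the paper's hydraulic simulation additionally resolves the mixing interface.
import math

D = 0.6096      # pipe inside diameter, m (24 in)
L = 25_000.0    # section length, m
Q = 3.0         # purge-gas volumetric flow at line conditions, m3/s

volume = math.pi * (D / 2.0) ** 2 * L   # line volume to displace, m3
t_purge = volume / Q                    # one full displacement, s
print(f"~{t_purge / 60:.0f} min to displace the section once (no mixing)")
```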

  17. 76 FR 18750 - Humble Gas Pipeline Company; Cobra Pipeline Ltd.; Notice of Baseline Filings

    Science.gov (United States)

    2011-04-05

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Humble Gas Pipeline Company; Cobra Pipeline Ltd.; Notice of Baseline Filings Take notice that on March 28, 2011, the applicants listed above submitted a revised baseline filing of...

  18. Integrated surface management for pipeline construction: The Mid-America Pipeline Company Four Corners Project

    Science.gov (United States)

    Maria L. Sonett

    1999-01-01

    Integrated surface management techniques for pipeline construction through arid and semi-arid rangeland ecosystems are presented in a case history of a 412-mile pipeline construction project in New Mexico. Planning, implementation and monitoring for restoration of surface hydrology, soil stabilization, soil cover, and plant species succession are discussed. Planning...

  19. The Dangers of Pipeline Thinking: How the School-to-Prison Pipeline Metaphor Squeezes out Complexity

    Science.gov (United States)

    McGrew, Ken

    2016-01-01

    In this essay Ken McGrew critically examines the "school-to-prison pipeline" metaphor and associated literature. The origins and influence of the metaphor are compared with the origins and influence of the competing "prison industrial complex" concept. Specific weaknesses in the "pipeline literature" are examined.…

  20. 76 FR 44985 - Pipeline Safety: Potential for Damage to Pipeline Facilities Caused by Flooding

    Science.gov (United States)

    2011-07-27

    ... River in the past few months. While the cause of the accident is still under investigation, ExxonMobil.... Coordinate with other pipeline operators in the flood area and establish emergency response centers to act as... others involved in post-flood restoration activities of the presence of pipelines and the risks posed by...

  1. 78 FR 41991 - Pipeline Safety: Potential for Damage to Pipeline Facilities Caused by Flooding

    Science.gov (United States)

    2013-07-12

    ... forces sufficient to cause a failure. These forces are increased by the accumulation of debris against... other pipeline operators in the flood area and establish emergency response centers to act as a liaison... involved in post-flood restoration activities of the presence of pipelines and the risks posed by reduced...

  2. 75 FR 4134 - Pipeline Safety: Leak Detection on Hazardous Liquid Pipelines

    Science.gov (United States)

    2010-01-26

    ... safety study on pipeline Supervisory Control and Data Acquisition (SCADA) systems (NTSB/SS-05/02). The... indications of a leak on the SCADA interface was the impetus for this study. The NTSB examined 13 hazardous... large pipeline breaks. The line balance processes incorporating SCADA or other technology are geared to...

  3. 77 FR 32631 - Lion Oil Trading & Transportation, Inc., Magnolia Pipeline Company, and El Dorado Pipeline...

    Science.gov (United States)

    2012-06-01

    ... Energy Regulatory Commission Lion Oil Trading & Transportation, Inc., Magnolia Pipeline Company, and El... 385.202 (2011), Lion Oil Trading & Transportation, Inc., Magnolia Pipeline Company, and El Dorado... to Lion subject to certain conditions at specified price spreads. In support of the request for...

  4. Gap-free segmentation of vascular networks with automatic image processing pipeline.

    Science.gov (United States)

    Hsu, Chih-Yang; Ghaffari, Mahsa; Alaraj, Ali; Flannery, Michael; Zhou, Xiaohong Joe; Linninger, Andreas

    2017-03-01

    Current image processing techniques capture large vessels reliably but often fail to preserve connectivity in bifurcations and small vessels. Imaging artifacts and noise can create gaps and discontinuities of intensity that hinder segmentation of vascular trees. However, topological analysis of vascular trees requires proper connectivity without gaps, loops or dangling segments. Proper tree connectivity is also important for high quality rendering of surface meshes for scientific visualization or 3D printing. We present a fully automated vessel enhancement pipeline with automated parameter settings for vessel enhancement of tree-like structures from customary imaging sources, including 3D rotational angiography, magnetic resonance angiography, magnetic resonance venography, and computed tomography angiography. The output of the filter pipeline is a vessel-enhanced image which is ideal for generating anatomically consistent network representations of the cerebral angioarchitecture for further topological or statistical analysis. The filter pipeline combined with computational modeling can potentially improve computer-aided diagnosis of cerebrovascular diseases by delivering biometrics and anatomy of the vasculature. It may serve as the first step in fully automatic epidemiological analysis of large clinical datasets. The automatic analysis would enable rigorous statistical comparison of biometrics in subject-specific vascular trees. The robust and accurate image segmentation using a validated filter pipeline would also eliminate the operator dependency that has been observed in manual segmentation. Moreover, manual segmentation is time-prohibitive, given that vascular trees have thousands of segments and bifurcations, so interactive segmentation consumes excessive human resources. Subject-specific trees are a first step toward patient-specific hemodynamic simulations for assessing treatment outcomes. Copyright © 2017 Elsevier Ltd. All rights reserved.
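
    The authors' parameter-free pipeline is more elaborate, but its core ingredient, multiscale Hessian-based vesselness filtering, can be sketched with an off-the-shelf Frangi filter (an illustrative substitute, using scikit-image on a stand-in volume):

```python
# Illustrative substitute for the paper's pipeline: multiscale Frangi
# vesselness on a stand-in volume, followed by a naive global threshold.
import numpy as np
from skimage.filters import frangi

vol = np.random.rand(64, 64, 64).astype(np.float32)  # stand-in for MRA/CTA data

# Enhance bright tubular structures over a range of vessel radii (in voxels).
vesselness = frangi(vol, sigmas=np.arange(1, 6), black_ridges=False)

# The paper's pipeline adds automated parameter settings and gap-free
# connectivity repair on top of a step like this.
mask = vesselness > vesselness.mean() + 2 * vesselness.std()
print(mask.sum(), "candidate vessel voxels")
```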

  5. User-independent diffusion tensor imaging analysis pipelines in a rat model presenting ventriculomegalia: A comparison study.

    Science.gov (United States)

    Akakpo, Luis; Pierre, Wyston C; Jin, Chen; Londono, Irène; Pouliot, Philippe; Lodygensky, Gregory A

    2017-11-01

    Automated analysis of diffusion tensor imaging (DTI) data is an appealing way to process large datasets in an unbiased manner. However, automation can sometimes be linked to a lack of interpretability. Two whole-brain, automated and voxelwise methods exist: voxel-based analysis (VBA) and tract-based spatial statistics (TBSS). In VBA, the amount of smoothing has been shown to influence the results. TBSS is free of this step, but a projection procedure is introduced to correct for residual misalignments. This projection assigns the local highest fractional anisotropy (FA) value to the mean FA skeleton, which represents white matter tract centers. For both methods, the normalization procedure has a major impact. These issues are well documented in humans but, to our knowledge, not in rodents. In this study, we assessed the quality of three different registration algorithms (ANTs SyN, DTI-TK and FNIRT) using study-specific templates and their impact on automated analysis methods (VBA and TBSS) in a rat pup model of diffuse white matter injury presenting large unilateral deformations. VBA and TBSS results were stable and anatomically coherent across the three pipelines. For VBA, in regions around the large deformations, interpretability was limited because of the increased partial volume effect. With TBSS, two of the three pipelines found a significant decrease in axial diffusivity (AD) at the known injury site. These results demonstrate that automated voxelwise analyses can be used in an animal model with large deformations. Copyright © 2017 John Wiley & Sons, Ltd.
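
    As a hedged illustration of the VBA side of this comparison (not the authors' pipelines, whose registration and statistics are far more careful), a voxelwise group comparison of smoothed, normalized FA maps reduces to the following sketch on synthetic data:

```python
# Synthetic VBA-style sketch: smooth normalized FA maps, then run a voxelwise
# two-sample t-test. Group sizes, smoothing and threshold are illustrative.
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)
fa_control = rng.normal(0.45, 0.05, size=(8, 32, 32, 32))  # subjects x voxels
fa_injured = rng.normal(0.42, 0.05, size=(8, 32, 32, 32))

def smooth(group, sigma=2.0):
    # The abstract notes that the amount of smoothing influences VBA results.
    return np.stack([gaussian_filter(s, sigma) for s in group])

tvals, pvals = ttest_ind(smooth(fa_control), smooth(fa_injured), axis=0)
print((pvals < 0.001).sum(), "suprathreshold voxels (uncorrected)")
```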

  6. A novel approach to pipeline tensioner modeling

    Energy Technology Data Exchange (ETDEWEB)

    O' Grady, Robert; Ilie, Daniel; Lane, Michael [MCS Software Division, Galway (Ireland)

    2009-07-01

    As subsea pipeline developments continue to move into deep and ultra-deep water locations, there is an increasing need for accurate prediction of expected pipeline fatigue life. A significant factor that must be considered as part of this process is the fatigue damage sustained by the pipeline during installation. The magnitude of this installation-related damage is governed by a number of different agents, one of which is the dynamic behavior of the tensioner systems during pipe-laying operations. There are a variety of traditional finite element methods for representing dynamic tensioner behavior. These existing methods, while basic in nature, have been proven to provide adequate forecasts of the dynamic variation in typical installation parameters such as top tension and sagbend/overbend strain. However, due to the simplicity of these current approaches, some of them tend to over-estimate the frequency of tensioner pay out/in under dynamic loading. This excessive level of pay out/in motion results in the prediction of additional stress cycles at certain roller beds, which in turn leads to the prediction of unrealistic fatigue damage to the pipeline. This unwarranted fatigue damage then equates to an over-conservative value for the accumulated damage experienced by a pipeline weld during installation, and so leads to a reduction in the estimated fatigue life for the pipeline. This paper describes a novel approach to tensioner modeling which allows for greater control over the velocity of dynamic tensioner pay out/in and so provides a more accurate estimation of the fatigue damage experienced by the pipeline during installation. The paper reports on a case study, outlined below, in which a comparison is made between results from this new tensioner model and from a more conventional approach. The comparison considers typical installation parameters as well as an in-depth look at the predicted fatigue damage for the two methods.

  7. Pipeline risk assessment and control: Nigerian National Petroleum Pipeline network experience

    Energy Technology Data Exchange (ETDEWEB)

    Adubi, F.A.; Egho, P.I. [Pipelines and Products Marketing Company Ltd., Nigerian National Petroleum Corporation, Lagos (Nigeria)

    1992-12-31

    Third party encroachment and corrosion were identified as major causes of pipeline failure in Nigeria. The multi-faceted approach developed by the Nigerian National Petroleum Pipeline Corporation for effective assessment and control of risks to pipelines was described. In essence, information provided from each activity is used to complement information from other activities. This approach led to a better understanding of pipeline status and reduced risk of failures. Aerial surveillance was intensified in order to detect illegal activities, halt them, and remedy any damage. Efforts were also made to intensify corrosion monitoring and pipeline integrity surveys to avoid premature failures. Cathodic protection equipment was found to be only partially effective due to vandalism and uncontrolled bush burning. 10 figs., 1 ref.

  8. Pipeline four-dimension management is the trend of pipeline integrity management in the future

    Energy Technology Data Exchange (ETDEWEB)

    Shaohua, Dong; Feifan; Zhongchen, Han [China National Petroleum Corporation (CNPC), Beijing (China)

    2009-07-01

    Pipeline integrity management is essential for today's operators to run their pipelines safely and cost-effectively. The latest developments in pipeline integrity management around the world involve changes in regulation and industry standards as well as innovation in technology. Where pipeline integrity management (PIM) is heading in the future is the question this paper answers, and to that end the concept of P4DM is established here for the first time. The paper analyzes pipeline HSE management, PIM and asset integrity management (AIM), identifies the management problems involved, and puts forward the theory of Pipeline 4-Dimension Management (P4DM). Starting from the hierarchy of P4DM, the management elements, fields, space and time are analyzed. The main idea is that P4DM integrates geographic location and time in order to control and manage the pipeline system over the whole process, anywhere and anytime. It covers pipeline integrity, pipeline operation and emergency response, integrated through an IT system, so that ideas, solutions, technology, organization and managers jointly and intelligently control the management process. The paper covers the definition of pipeline 4D management, the research and development of P4DM, the theory of P4DM, the relationship between P4DM and PIM, the technological basis of P4DM, how to perform P4DM, and conclusions. P4DM provides a development direction for PIM in the future, as well as new ideas for PetroChina in the fields of technology and management. (author)

  9. JGI Plant Genomics Gene Annotation Pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Shu, Shengqiang; Rokhsar, Dan; Goodstein, David; Hayes, David; Mitros, Therese

    2014-07-14

    Plant genomes vary in size and are highly complex, with a high amount of repeats, genome duplication and tandem duplication. Genes encode a wealth of information useful in studying organisms, and it is critical to have high-quality, stable gene annotation. Thanks to advances in sequencing technology, the genomes of many plant species have been sequenced, and transcriptomes are also sequenced. To use these vast amounts of sequence data for gene annotation or re-annotation in a timely fashion, an automatic pipeline is needed. The JGI plant genomics gene annotation pipeline, called integrated gene call (IGC), is our effort toward this aim, with the aid of an RNA-seq transcriptome assembly pipeline. It utilizes several gene predictors based on homolog peptides and transcript ORFs. See Methods for detail. Here we present the genome annotation of JGI flagship green plants produced by this pipeline, plus Arabidopsis and rice, except for Chlamydomonas, which was annotated by a third party. The genome annotations of these species and others are used in our gene family build pipeline and are accessible via the JGI Phytozome portal, whose URL and front page snapshot are shown below.

  10. Bauxite slurry pipeline: start up operation

    Energy Technology Data Exchange (ETDEWEB)

    Othon, Otilio; Babosa, Eder; Edvan, Francisco; Brittes, Geraldo; Melo, Gerson; Janir, Joao; Favacho, Orlando; Leao, Marcos; Farias, Obadias [Vale, Rio de Janeiro, RJ (Brazil); Goncalves, Nilton [Anglo Ferrous Brazil S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    The Miltonia mine is located in Paragominas-PA, in the north of Brazil. The bauxite slurry pipeline starts at the Miltonia mine and finishes at the draining installation of the Alunorte refinery at the port of Barcarena-PA, located approximately 244 km from the mine. The pipeline crosses seven municipalities and passes beneath the stream beds of four large rivers. The system was designed as an underground 24-inch OD steel pipe to carry 9.9 million dry metric tonnes per annum of bauxite slurry at 50.5% solids concentration, using only one pumping station. The system is composed of four storage tanks and six piston diaphragm pumps, supplying a flow of 1680 m3/h. There is a cathodic protection system along the pipeline extension to prevent external corrosion and five pressure monitoring stations to control hydraulic conditions; there is also a fiber optic cable interconnection between the pump station and the terminal station. Pipeline Systems Incorporated (PSI) was the designer and followed the commissioning program of the start-up operations. This paper describes the beginning of pipeline operations, technical aspects of the project, the operational experience acquired in these two years, the problems faced, and future planning. (author)

  11. Diagnostics and reliability of pipeline systems

    CERN Document Server

    Timashev, Sviatoslav

    2016-01-01

    The book contains solutions to fundamental problems which arise from the logic of development of specific branches of science related to pipeline safety, but which are mainly subordinate to the needs of pipeline transportation. The book addresses important but not yet solved aspects of reliability and safety assurance of pipeline systems, which are vital not only for the oil and gas industry and, in general, the fuel and energy industries, but also for virtually all contemporary industries and technologies. The volume will be useful to specialists and experts in the field of diagnostics/inspection, monitoring, reliability and safety of critical infrastructures. First and foremost, it will be useful to decision makers: operators of different types of pipelines, pipeline diagnostics/inspection vendors, designers of in-line inspection (ILI) tools, and industrial and ecological safety specialists, as well as to researchers and graduate students.

  12. Bad Actors Criticality Assessment for Pipeline system

    Science.gov (United States)

    Nasir, Meseret; Chong, Kit wee; Osman, Sabtuni; Siaw Khur, Wee

    2015-04-01

    Failure of a pipeline system could bring huge economic loss. In order to mitigate such catastrophic loss, it is required to evaluate and rank the impact of each bad actor of the pipeline system. In this study, bad actors are the root causes or any potential factors leading to system downtime. Fault Tree Analysis (FTA) is used to analyze the probability of occurrence of each bad actor. Birnbaum's importance and criticality measure (BICM) is also employed to rank the impact of each bad actor on pipeline system failure. The results demonstrate that internal corrosion, external corrosion and construction damage are critical and contribute most to pipeline system failure, with 48.0%, 12.4% and 6.0% respectively. Thus, a minor improvement in internal corrosion, external corrosion and construction damage would bring significant changes in pipeline system performance and reliability. These results could also be useful for developing an efficient maintenance strategy by identifying the critical bad actors.
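
    Birnbaum's importance measure itself is simple to compute once basic-event probabilities are available from the FTA: it is the sensitivity of the system unreliability to a component's failure probability. The sketch below assumes, purely for illustration, that the top event is a series (OR-gate) combination of independent bad actors with invented probabilities; the paper's fault tree is more detailed:

```python
# Hedged sketch: Birnbaum importance for independent bad actors under an
# illustrative series (OR-gate) top event; probabilities are invented.
def system_unreliability(p):
    # Series assumption: the system fails if any bad actor occurs.
    survive = 1.0
    for pi in p.values():
        survive *= (1.0 - pi)
    return 1.0 - survive

def birnbaum(p, actor):
    # I_B(i) = Q_sys(q_i = 1) - Q_sys(q_i = 0)
    return (system_unreliability({**p, actor: 1.0})
            - system_unreliability({**p, actor: 0.0}))

probs = {"internal_corrosion": 0.05, "external_corrosion": 0.02,
         "construction_damage": 0.01}
for actor in probs:
    print(actor, round(birnbaum(probs, actor), 4))
```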

  13. Underwater pipeline impact localization using piezoceramic transducers

    Science.gov (United States)

    Zhu, Junxiao; Ho, Siu Chun Michael; Patil, Devendra; Wang, Ning; Hirsch, Rachel; Song, Gangbing

    2017-10-01

    Reports indicate that impact events account for 47% of offshore pipeline failures, which calls for impact detection and localization for subsea pipelines. In this paper, an innovative method for rapid localization of impacts on underwater pipelines is described, utilizing a novel determination technique for both the arrival time and group velocity (ATGV) of ultrasonic guided waves with lead zirconate titanate (PZT) transducers. PZT transducers mounted on the outer surface of a model pipeline were utilized to measure ultrasonic guided waves generated by impact events. Based on the signals from the PZT sensors, the ATGV technique integrates wavelet decomposition, the Hilbert transform and statistical analysis to pinpoint the arrival time of the designated ultrasonic guided waves with a specific group velocity. Experimental results have verified the effectiveness and localization accuracy for eight impact points along a model underwater pipeline. All estimation errors were small and comparable to the wavelength of the designated ultrasonic guided waves. Furthermore, the method is robust against the low frequency structural vibration introduced by other external forces.
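
    Of the ATGV steps, the Hilbert-envelope arrival-time pick and the resulting localization are easy to sketch; the wavelet decomposition and statistical mode selection that precede them in the paper are omitted here, and all parameters are illustrative:

```python
# Sketch of the arrival-time pick and localization only (not the full ATGV
# technique). Arrival is taken where the Hilbert envelope first exceeds a
# fraction of its peak; position follows from the two-sensor time difference.
import numpy as np
from scipy.signal import hilbert

fs = 1.0e6      # sampling rate, Hz
cg = 3000.0     # group velocity of the designated guided mode, m/s

def arrival_time(signal, thresh=0.2):
    env = np.abs(hilbert(signal))              # instantaneous amplitude
    return np.argmax(env > thresh * env.max()) / fs

def offset_from_midpoint(sig_a, sig_b):
    # Positive values lie toward sensor A (the wave reaches A first).
    dt = arrival_time(sig_b) - arrival_time(sig_a)
    return 0.5 * cg * dt

t = np.arange(0, 2e-3, 1 / fs)
burst = lambda t0: np.exp(-((t - t0) / 5e-5) ** 2) * np.sin(2 * np.pi * 5e4 * t)
print(f"impact offset: {offset_from_midpoint(burst(4e-4), burst(6e-4)):.2f} m")
```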

  14. Hazard identification studies applied to oil pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Savio, Augusto; Alpert, Melina L. [TECNA S.A., Buenos Aires (Argentina)], e-mail: asavio@tecna.com, e-mail: malpert@tecna.com

    2008-07-01

    In order to assess the risks inherent to an oil pipeline, it is imperative to analyze what happens 'outside the process'. HAZID (HAZard IDentification) studies are mainly carried out for this purpose. A HAZID is a formal study which identifies the hazards and risks associated with an operation or facility and enables assessment of their acceptability. It is a brainstorming exercise guided by a typical 'checklist', divided into four sections (external hazards, facility hazards, health hazards, and issues pertaining to project execution), which are further subdivided into hazard categories. For each category, there are 'guide-words' and 'prompts'. Even if an oil pipeline risk assessment can be performed by means of the above referred checklist, carrying out the actual process can become lengthy and tedious due to the lack of specificity. This work aims at presenting the checklist best suited to hazard identification for oil pipeline risk assessment, although it could be used for gas pipeline risk assessment too. Prepared ad hoc, this list is based on the spill causes established by CONCAWE (CONservation of Clean Air and Water in Europe). Performing oil pipeline risk assessment by means of a specially formulated checklist enables the study team to easily identify risks, shortens execution time and provides both accuracy and specificity. (author)

  15. Automated identification of RNA 3D modules with discriminative power in RNA structural alignments

    DEFF Research Database (Denmark)

    Theis, Corinna; Höner zu Siederdissen, Christian; Hofacker, Ivo L.

    2013-01-01

    interest in matching structural modules known from one molecule to other molecules for which the 3D structure is not known yet. We have created a pipeline, metaRNAmodules, which completely automates extracting putative modules from the FR3D database and mapping of such modules to Rfam alignments to obtain...

  16. The Dark Energy Survey Image Processing Pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Morganson, E.; et al.

    2018-01-09

    The Dark Energy Survey (DES) is a five-year optical imaging campaign with the goal of understanding the origin of cosmic acceleration. DES performs a 5000 square degree survey of the southern sky in five optical bands (g,r,i,z,Y) to a depth of ~24th magnitude. Contemporaneously, DES performs a deep, time-domain survey in four optical bands (g,r,i,z) over 27 square degrees. DES exposures are processed nightly with an evolving data reduction pipeline and evaluated for image quality to determine if they need to be retaken. Difference imaging and transient source detection are also performed in the time domain component nightly. On a bi-annual basis, DES exposures are reprocessed with a refined pipeline and coadded to maximize imaging depth. Here we describe the DES image processing pipeline in support of DES science, as a reference for users of archival DES data, and as a guide for future astronomical surveys.

  17. Digitally assisted pipeline ADCs theory and implementation

    CERN Document Server

    Murmann, Boris

    2007-01-01

    List of Figures. List of Tables. Acknowledgements. Preface.
    1: Introduction. 1. Motivation. 2. Overview. 3. Chapter Organization.
    2: Performance Trends. 1. Introduction. 2. Digital Performance Trends. 3. ADC Performance Trends.
    3: Scaling Analysis. 1. Introduction. 2. Basic Device Scaling from a Digital Perspective. 3. Technology Metrics for Analog Circuits. 4. Scaling Impact on Matching-Limited Circuits. 5. Scaling Impact on Noise-Limited Circuits.
    4: Improving Analog Circuit Efficiency. 1. Introduction. 2. Analog Circuit Challenges. 3. The Cost of Feedback. 4. Two-Stage Feedback Amplifier vs. Open-Loop Gain Stage. 5. Discussion.
    5: Open-Loop Pipelined ADCs. 1. A Brief Review of Pipelined ADCs. 2. Conventional Stage Implementation. 3. Open-Loop Pipeline Stages. 4. Alternative Transconductor Implementations.
    6: Digital Nonlinearity Correction. 1. Overview. 2. Error Model and Digital Correction. 3. Alternative Error Models.
    7: Statistics-Based Parameter Estimation. 1. Introduction. 2. Modulation Approach. 3. R...

  18. V-GAP: Viral genome assembly pipeline

    KAUST Repository

    Nakamura, Yoji

    2015-10-22

    Next-generation sequencing technologies have allowed the rapid determination of the complete genomes of many organisms. Although it is still difficult to reconstruct shotgun sequences from large-genome organisms into perfect contigs, each representing a full chromosome, those from small genomes have been assembled successfully into a very small number of contigs. In this study, we show that shotgun reads from phage genomes can be reconstructed into a single contig by controlling the number of read sequences used in de novo assembly. We have developed a pipeline to assemble small viral genomes with good reliability, using a resampling method on shotgun data. This pipeline, named V-GAP (Viral Genome Assembly Pipeline), will contribute to the rapid genome typing of viruses, which are highly divergent, and thus will meet the increasing need for viral genome comparisons in metagenomic studies.
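
    The resampling step the abstract describes can be sketched in a few lines. This is an assumption-laden illustration, not V-GAP's code: it subsamples a FASTQ file to a target read count so that a downstream assembler sees a controlled amount of input:

```python
# Illustrative stand-in for the resampling step (not V-GAP's code): subsample
# a FASTQ file to a target read count before de novo assembly.
import random

def subsample_fastq(path_in, path_out, n_reads, seed=42):
    random.seed(seed)
    with open(path_in) as f:
        # FASTQ records are groups of four lines.
        records = ["".join(chunk) for chunk in zip(f, f, f, f)]
    picked = random.sample(records, min(n_reads, len(records)))
    with open(path_out, "w") as out:
        out.writelines(picked)

# e.g. subsample_fastq("phage_reads.fastq", "phage_50k.fastq", 50_000)
```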

  19. Hubble Legacy Archive (HLA) Pipeline Progression

    Science.gov (United States)

    Anderson, Rachel E.; Casertano, S.; Lindsay, K.

    2013-01-01

    The HLA maintains a strong commitment to continuing improvement of our Hubble Space Telescope data processing pipelines with the goal of generating better science-ready data products. The HLA image processing pipeline is transitioning from the use of MultiDrizzle to AstroDrizzle for image registration and combination. It is expected that this change will allow for the creation of higher quality science products with improved astrometric solutions. Headerlets, a newly developed tool for AstroDrizzle, will be utilized and made available to simplify access to multiple astrometric solutions for a given data set. The capabilities of AstroDrizzle will allow for functionally simplified data processing, standardizing and streamlining the data reduction process and making it easier for users to reproduce our results. We are beginning with the HLA WFC3 data processing pipeline, and then plan to extend its application to other HST instrument data.

  20. 5. symposium of pipeline technology. Proceedings; 5. Symposium Pipelinetechnik. Vortraege

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-07-01

    Innovations in the technical rules for pipelines. Data and speech transmission along pipelines via optical fibre. Dynamic leak detection and localization. Future integrity as part of total pipeline integrity. Assessment of pipeline flaws using the European SINTAP procedure. Automatic recording of cyclic pressure loads. Rehabilitation of a crude oil pipeline. Online inspection of non-piggable pipelines. New technologies for the inspection of longitudinal welds and longitudinally oriented defects in pipelines. Design of intelligent pigs with regard to the assessment of defects in pipelines. (orig./GL)

  1. Self lubrication of bitumen froth in pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Joseph, D.D. [Univ. of Minnesota, Minneapolis, MN (United States)

    1997-12-31

    In this paper I will review the main properties of water lubricated pipelines and explain some new features which have emerged from studies of self-lubrication of Syncrude's bitumen froth. When heavy oils are lubricated with water, the water and oil are continuously injected into a pipeline and the water is stable when in a lubricating sheath around the oil core. In the case of bitumen froth obtained from the Alberta tar sands, the water is dispersed in the bitumen and is liberated at the wall under shear; water injection is not necessary because the froth is self-lubricating.

  2. QUANTITATIVE RISK MAPPING OF URBAN GAS PIPELINE NETWORKS USING GIS

    National Research Council Canada - National Science Library

    P. Azari; M. Karimi

    2017-01-01

    Natural gas is considered an important source of energy in the world. With the increasing growth of urbanization, urban gas pipelines, which transmit natural gas from transmission pipelines to consumers, will become a dense network...

  3. 75 FR 38799 - ETC Tiger Pipeline, LLC; Notice of Application

    Science.gov (United States)

    2010-07-06

    ...] ETC Tiger Pipeline, LLC; Notice of Application June 25, 2010. Take notice that on June 15, 2010, ETC Tiger Pipeline, LLC (ETC Tiger), 711 Louisiana Street, Suite 900, Houston, Texas 77002, filed an...

  4. Distributed acoustic sensing for pipeline monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Hill, David; McEwen-King, Magnus [OptaSense, QinetiQ Ltd., London (United Kingdom)

    2009-07-01

    Optical fibre is deployed widely across the oil and gas industry. As well as being deployed regularly to provide high bandwidth telecommunications and infrastructure for SCADA, it is increasingly being used to sense pressure, temperature and strain along buried pipelines, on subsea pipelines and downhole. In this paper we present results from the latest sensing capability, which uses standard optical fibre to detect acoustic signals along the entire length of a pipeline. In Distributed Acoustic Sensing (DAS) an optical fibre is used for both sensing and telemetry. We present results from the OptaSense(TM) system, which has been used to detect third party intervention (TPI) along buried pipelines. In a typical deployment the system is connected to an existing standard single-mode fibre, up to 50 km in length, and is used to independently listen to the acoustic/seismic activity at every 10 metre interval. We will show that through advanced array processing of the independent, simultaneously sampled channels it is possible to detect and locate activity within the vicinity of the pipeline, and through sophisticated acoustic signal processing to obtain the acoustic signature and classify the type of activity. By combining spare fibre capacity in existing buried fibre-optic cables, processing and display techniques commonly found in sonar, and the state of the art in fibre-optic distributed acoustic sensing, we will describe the new monitoring capabilities that are available to the pipeline operator. Without the expense of retrofitting sensors to the pipeline, this technology can provide a high performance, rapidly deployable and cost effective method of providing gapless and persistent monitoring of a pipeline. We will show how this approach can be used to detect, classify and locate activity such as third party interference (including activity indicative of illegal hot tapping), real time tracking of pigs, and leak detection. We will also show how an
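
    As a toy illustration of the channelized detection idea (not OptaSense's processing), one can treat the fibre as thousands of 10 m channels and flag, and hence locate, activity by excess short-term energy per channel. Everything below is synthetic:

```python
# Toy per-channel energy detector on synthetic data: 50 km of fibre sampled
# as 10 m channels; excess RMS energy flags and locates activity.
import numpy as np

rng = np.random.default_rng(0)
n_channels, n_samples = 5000, 2048
data = rng.normal(size=(n_channels, n_samples))           # ambient noise
data[1234] += 5 * np.sin(np.linspace(0, 60, n_samples))   # synthetic digging

rms = np.sqrt((data ** 2).mean(axis=1))       # short-term energy per channel
threshold = rms.mean() + 6 * rms.std()
for ch in np.flatnonzero(rms > threshold):
    print(f"activity near {ch * 10} m along the fibre")
```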

  5. Manufacturing and automation

    Directory of Open Access Journals (Sweden)

    Ernesto Córdoba Nieto

    2006-09-01

    Full Text Available The article presents concepts and definitions from different sources concerning automation. The work approaches automation through the author's experience in manufacturing production, considering why and how automation projects are embarked upon. Technological reflection regarding the progressive advances or stages of automation in the production area is stressed. Coriat and Freyssenet's thoughts on and approaches to the problem of automation and its current state are examined, especially those concerning how the level of automation can be reconciled with the flexibility and productivity demanded by competitive, worldwide manufacturing.

  6. 76 FR 68828 - Pipeline Safety: Emergency Responder Forum

    Science.gov (United States)

    2011-11-07

    ... Administration [Docket ID PHMSA-2011-0295] Pipeline Safety: Emergency Responder Forum AGENCY: Pipeline and Hazardous Materials Safety Administration (PHMSA), DOT. ACTION: Notice of Forum. SUMMARY: PHMSA is co-sponsoring a one-day Emergency Responder Forum with the National Association of Pipeline Safety...

  7. legal and institutional framework for promoting oil pipeline security ...

    African Journals Online (AJOL)

    Hacking into pipelines to steal crude oil, to be refined later and sold abroad, is illegal bunkering. Pipeline vandalism is the intentional destruction of pipelines, platforms, loading barges and other facilities for selfish .... of funnels, drilling tools and plastic hoses to syphon the products. Also, only a few cases of vandalism occurred.

  8. Legal and instututional framework for promoting oil pipeline security ...

    African Journals Online (AJOL)

    ... of the security agencies saddled with the overall responsibility for managing as well as safeguarding the pipelines to ensure their productivity; and overhauling the entire security apparatus put in place to protect oil pipelines in Nigeria by having a sustainable and strategic approach to dealing with oil pipeline insecurity.

  9. 18 CFR 2.57 - Temporary certificates-pipeline companies.

    Science.gov (United States)

    2010-04-01

    ...-pipeline companies. 2.57 Section 2.57 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY... Policy and Interpretations Under the Natural Gas Act § 2.57 Temporary certificates—pipeline companies... the proposed construction is of major proportions. Pipeline companies are accordingly urged to conduct...

  10. 75 FR 73160 - Pipeline Safety: Information Collection Activities

    Science.gov (United States)

    2010-11-29

    ... Natural Gas Facilities. OMB Control Number: 2137-0578. Type of Request: Renewal of a currently approved... Safety-Related Conditions on Gas, Hazardous Liquid, and Carbon Dioxide Pipelines and Liquefied Natural... Dioxide Pipelines and Liquefied Natural Gas Facilities.'' The Pipeline Safety Laws (49 U.S.C. 60132...

  11. 75 FR 66046 - Capacity Transfers on Intrastate Natural Gas Pipelines

    Science.gov (United States)

    2010-10-27

    ... Energy Regulatory Commission 18 CFR Part 284 Capacity Transfers on Intrastate Natural Gas Pipelines... capacity on intrastate natural gas pipelines providing interstate transportation and storage services under section 311 of the Natural Gas Policy Act of 1978 and Hinshaw pipelines providing such services pursuant...

  12. Testing the GONG ring-diagram pipeline with HMI Dopplergrams

    Energy Technology Data Exchange (ETDEWEB)

    Jain, Kiran; Tripathy, S C; Hernandez, I Gonzalez; Kholikov, S; Hill, F; Komm, R [GONG, National Solar Observatory, Tucson, AZ 85719-4933 (United States); Bogart, R [HEPL, Stanford University, Stanford, CA 94305-4085 (United States); Haber, D, E-mail: kjain@noao.edu [JILA, University of Colorado, Boulder, CO 80309-0440 (United States)

    2011-01-01

    The GONG ring-diagram pipeline was developed to analyze GONG+ Dopplergrams in order to extract information about solar subsurface flows and has been extensively tested for this purpose. Here we present preliminary results obtained by analyzing the HMI Dopplergrams with the GONG pipeline and compare them with those obtained from the HMI ring-diagram pipeline.

  13. 77 FR 46155 - Pipeline Safety: Information Collection Activities

    Science.gov (United States)

    2012-08-02

    ... information collections relate to the pipeline integrity management requirements for gas transmission pipeline... comments received will be posted without change to http://www.regulations.gov , including any personal... activity. PHMSA requests comments on the following information collections: 1. Title: Pipeline Integrity...

  14. Development of a Free-Swimming Acoustic Tool for Liquid Pipeline Leak Detection Including Evaluation for Natural Gas Pipeline Applications

    Science.gov (United States)

    2010-08-01

    Significant financial and environmental consequences often result from line leakage of oil product pipelines. Product can escape into the surrounding soil, and even a small leak can eventually lead to rupture of the pipeline. From a health perspective, water...

  15. Total pipeline integrity management system implemented for KOC pipelines - a case study

    Energy Technology Data Exchange (ETDEWEB)

    Isaac, M. Robb [NDT Middle East FZE (Kuwait)], email: Robb.Isaac@ndt-global.com; Al-Sulaiman, Saleh; Sharma, Sandeep [Kuwait Oil Company (Kuwait)], email: ssulaima@kockw.com, email: sasharma@kockw.com; Martin, Monty R. [NDT Systems and services Inc. (Canada)], email: Monty.Martin@ndt-global.com

    2010-07-01

    Kuwait Oil Company (KOC) is a subsidiary of Kuwait Petroleum Corporation; together they own and operate the whole oil and gas pipeline network in Kuwait. The KOC transit system consists of hundreds of pipelines, thousands of wellhead flow lines, and offshore lines. Because much of the relevant data was missing, in 2005 KOC implemented a total pipeline integrity management system (TPIMS) to conduct an integrity assessment of its facilities. This study presents the results of the TPIMS implementation at KOC. The project digitalized and centralized information relevant to the integrity of the pipeline network while reducing the effort required to mitigate hazards and threats to the facilities. Results showed that the implementation of the TPIMS at KOC made it possible to manage information in a single environment. The study highlighted the benefits of implementing the TPIMS for efficient planning and utilization of data.

  16. Modeling and monitoring of pipelines and networks advanced tools for automatic monitoring and supervision of pipelines

    CERN Document Server

    Torres, Lizeth

    2017-01-01

    This book focuses on the analysis and design of advanced techniques for on-line automatic computational monitoring of pipelines and pipe networks. It discusses how to improve the systems’ security considering mathematical models of the flow, historical flow rate and pressure data, with the main goal of reducing the number of sensors installed along a pipeline. The techniques presented in the book have been implemented in digital systems to enhance the abilities of the pipeline network’s operators in recognizing anomalies. A real leak scenario in a Mexican water pipeline is used to illustrate the benefits of these techniques in locating the position of a leak. Intended for an interdisciplinary audience, the book addresses researchers and professionals in the areas of mechanical, civil and control engineering. It covers topics on fluid mechanics, instrumentation, automatic control, signal processing, computing, construction and diagnostic technologies.
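
    One classical illustration of the model-plus-measurements approach the book describes is negative-pressure-wave leak location, where the leak position follows from the wave arrival-time difference at the two ends of a monitored section. A minimal sketch, not taken from the book:

```python
# Classical negative-pressure-wave leak location (an illustration, not the
# book's algorithms): a leak launches an expansion wave in both directions;
# its arrival-time difference at the section ends gives the position.
def leak_position(L, a, dt):
    """
    L  : sensor-to-sensor section length, m
    a  : pressure wave speed in the fluid, m/s (roughly 1000-1200 for water)
    dt : t_upstream - t_downstream arrival-time difference, s
    Returns the leak distance from the upstream sensor, m.
    """
    return (L + a * dt) / 2.0

# Wave reaches the upstream sensor 0.9 s before the downstream one:
print(f"leak at ~{leak_position(10_000.0, 1100.0, -0.9):.0f} m from upstream")
```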

  17. Marine pipeline study for underwater crossing of Georgia and Malaspina straits: twin ten-inch pipeline

    National Research Council Canada - National Science Library

    1981-01-01

    A feasibility study was conducted for the installation of twin 273.1 mm diameter submarine pipelines across the Straits of Georgia and Malaspina between Powell River and Little River areas in British Columbia...

  18. Lay Pipeline Abandonment Head during Some

    African Journals Online (AJOL)

    2016-12-01

    Computations and analysis do not adequately cover the effect of the number of cyclic wave loadings on the girth welds over long exposure periods, especially as a certain degree of weld surface and buried imperfections is often allowed during pipeline fabrication. In normal practice, stoppage of offshore pipe-lay.

  19. Non-destructive Testing of Pipelines

    CERN Document Server

    Annila, L

    2001-01-01

    This paper presents the various currently available non-destructive testing (NDT) methods for pipelines and compares them from the technical and economic points of view. An evaluation of their suitability for CERN activities, based on the opinions and experience of various specialists at CERN (LHC, ST, TIS), is also introduced.

  20. System Reliability Assessment of Offshore Pipelines

    NARCIS (Netherlands)

    Mustaffa, Z.

    2011-01-01

    The title of this thesis, System Reliability Assessment of Offshore Pipelines, portrays the application of probabilistic methods in assessing the reliability of these structures. The main intention of this thesis is to identify, apply and judge the suitability of the probabilistic methods in

  1. Pipeline : Hebrew data from ETCBC to Github

    NARCIS (Netherlands)

    Roorda, Dirk

    2017-01-01

    This pipeline delivers, among other good things, a file bhsa_xx.mql.bz2 which contains all ETCBC data and research additions to it. The form is MQL, compressed, and the size is less than 30 MB. Wherever you have Emdros installed, you can query this data. If you take this file from the continuous

  2. Safe purging of natural gas pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Perkins, T.K. (Arco Oil and Gas Co. (US)); Euchner, J.A. (Nynex Corp. (US))

    1988-11-01

    When a newly constructed natural gas pipeline is put into service, it can be safely purged of air by injection of a slug of inert gas, such as N2. The method of sizing the required slug is based on a model of dispersion in turbulent flow in conjunction with flammability limits.
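
    The slug-sizing logic the abstract summarizes can be illustrated with a one-dimensional advection-dispersion estimate: the N2 slug must remain longer than the mixing zones growing at the air/N2 and N2/gas interfaces, so that air and natural gas never meet within flammability limits. A hedged sketch with invented parameter values (the paper's model is the authority here):

```python
# 1-D advection-dispersion estimate of the required N2 slug (illustrative
# values). Each gas-gas interface smears into an erf-shaped mixing zone
# whose width grows with the square root of the transit time.
import math
from scipy.special import erfcinv

K = 5.0        # effective axial dispersion coefficient, m2/s
v = 5.0        # purge velocity, m/s
L = 20_000.0   # section length, m

t = L / v                              # interface residence time, s
u = erfcinv(2 * 0.01)                  # ~1.645: the 1%/99% concentration levels
zone = 2 * u * math.sqrt(4 * K * t)    # width of one mixing zone, m

slug = 2 * zone                        # keep a pure-N2 core between both zones
print(f"mixing zone ~{zone:.0f} m; suggested N2 slug >= {slug:.0f} m")
```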

  3. An integrative bioinformatics pipeline for the genomewide ...

    Indian Academy of Sciences (India)

    2013-12-06

    [Fang W., Zhou N., Li D., Chen Z., Jiang P. and Zhang D. 2013 An integrative bioinformatics pipeline for the genomewide identification of novel porcine microRNA genes. J. Genet. 92, 587–593]. Introduction. MicroRNA (miRNA) is a pivotal type of noncoding RNA gene in posttranscriptional gene regulation ...

  4. Internal Corrosion Detection in Liquids Pipelines

    Science.gov (United States)

    2012-01-01

    PHMSA project DTRS56-05-T-0005, "Development of ICDA for Liquid Petroleum Pipelines", led to the development of a Direct Assessment (DA) protocol to prioritize locations of possible internal corrosion. The underlying basis of LP-ICDA is simple; corrosion ...

  5. pipelines cathodic protection design methodologies for impressed ...

    African Journals Online (AJOL)

    The current and voltage requirements were achievable by installing six transformer rectifiers with a minimum direct current output of 500 amperes and a direct voltage output of 200 volts. The sacrificial anode design calculation, however, indicated that a direct current output of 2,262,240 mA (about 2,262 A) was needed to marginally polarize the X42 pipeline ...
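
    The sizing arithmetic behind figures like these is conventional: current demand scales with pipe surface area, the design current density for bare steel, and a coating breakdown factor. A hedged sketch with illustrative values, not the paper's design basis:

```python
# Conventional impressed-current demand arithmetic (illustrative values only).
import math

D = 0.324       # pipe outside diameter, m
L = 100_000.0   # protected length, m
i_bare = 0.02   # design current density on bare steel in soil, A/m2
f_cb = 0.05     # coating breakdown factor (5% of surface effectively bare)

area = math.pi * D * L              # external surface area, m2
current = area * i_bare * f_cb      # total impressed current demand, A
print(f"current demand ~{current:.0f} A")
```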

  6. Pipeline or Personal Preference: Women in Engineering

    Science.gov (United States)

    Schreuders, P. D.; Mannon, S. E.; Rutherford, B.

    2009-01-01

    Although the number of women in the engineering field has increased since the 1960s, those increases have largely stagnated over the last few years. This paper re-examines the pipeline for bringing women into engineering and, based on survey data, examines the attitudes, motivations, and interests of 969 male and female engineering students…

  7. The Effect of Landslide on Gas Pipeline

    Directory of Open Access Journals (Sweden)

    Valkovič Vojtech

    2016-11-01

    Full Text Available The present paper deals with the calculation of stresses on a pipeline system embedded in a flexible substrate and loaded by a landslide, taking into account the probabilistic nature of the influences acting on the pipe, such as its wall thickness, among others.

  8. Commissioning of a new helium pipeline

    Science.gov (United States)

    2000-01-01

    At the commissioning of a new high-pressure helium pipeline at Kennedy Space Center, participants cut the lines to helium-filled balloons. From left, they are Center Director Roy Bridges; Michael Butchko, president, SGS; Pierre Dufour, president and CEO, Air Liquide America Corporation; David Herst, director, Delta IV Launch Sites; Pamela Gillespie, executive administrator, office of Congressman Dave Weldon; and Col. Samuel Dick, representative of the 45th Space Wing. The nine-mile-long buried pipeline will service launch needs at the new Delta IV Complex 37 at Cape Canaveral Air Force Station. It will also serve as a backup helium resource for Shuttle launches. Nearly one launch's worth of helium will be available in the pipeline to support a Shuttle pad in an emergency. The line originates at the Helium Facility on KSC and terminates in a meter station at the perimeter of the Delta IV launch pad. Others at the ceremony were Jerry Jorgensen, pipeline project manager, Space Gateway Support (SGS), and Ramon Lugo, acting executive director, JPMO.

  9. The First Steps into a "Leaky Pipeline"

    DEFF Research Database (Denmark)

    Emerek, Ruth; Larsen, Britt Østergaard

    2011-01-01

    Research shows that the higher the level of academic position at universities, the lower the percentage of women among employees; this also applies at Danish universities. This may be due to a historical backlog or merely to a 'leaky pipeline', as earlier studies have revealed that an increasing propor...

  10. 27 CFR 19.587 - Pipelines.

    Science.gov (United States)

    2010-04-01

    ... THE TREASURY LIQUORS DISTILLED SPIRITS PLANTS Containers and Marks Containers § 19.587 Pipelines... used for (a) the conveyance on bonded premises of spirits, denatured spirits, articles, and wines, and (b) the conveyance to and from bonded premises of spirits, denatured spirits, articles, and wines...

  11. Women Engineering Faculty: Expanding the Pipeline

    Science.gov (United States)

    Greni, Nadene Deiterman

    2006-01-01

    The purpose for this case study was to explore the features of undergraduate engineering departmental and college support that influenced the persistence of women students. Women engineering faculty members were among the participants at three Land Grant universities in the Midwest. The data revealed the theme, Expanding the Pipeline, and…

  12. Runtime Modifications of Spark Data Processing Pipelines

    NARCIS (Netherlands)

    Lazovik, E.; Medema, M.; Albers, T.; Langius, E.A.F.; Lazovik, A.

    2017-01-01

    Distributed data processing systems are the standard means for large-scale data analysis in the Big Data field. These systems are based on processing pipelines where the processing is done via a composition of multiple elements or steps. In current distributed data processing systems, the code and

  13. Configuration Management Automation (CMA) -

    Data.gov (United States)

    Department of Transportation — Configuration Management Automation (CMA) will provide an automated, integrated enterprise solution to support CM of FAA NAS and Non-NAS assets and investments. CMA...

  14. Autonomy and Automation

    Science.gov (United States)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  15. Emergency gas pipeline transportation with computer documentation

    Energy Technology Data Exchange (ETDEWEB)

    None

    1978-08-01

    Methods developed by the staff of the Federal Energy Regulatory Commission in cooperation with the natural gas industry to expedite the emergency transfer of natural gas are described. The majority of the United States' natural gas fields are concentrated in the south central region, comprised of Louisiana, Oklahoma, and Texas together with adjacent areas offshore in the Gulf of Mexico. This is the major source area for gas consumed in the northern, northeastern, southeastern, and far western population/industrial centers. The geographic pattern of gas flow through interstate pipelines emanates in gas producing areas and terminates in gas consuming areas. There are many other areas in the United States which produce gas, but the amounts are small compared with Texas, Louisiana, and offshore Louisiana production. The various interconnections associated with a given pipeline for both receipts and deliveries are defined. The maximum volume capability in MMCFD and the volume being delivered in MMCFD are to be considered as estimated volumes. These volumes do not represent absolute volumes that are available but rather volumes for general planning purposes to define the magnitude of each interconnection. If an actual transportation route is desired, a routing may be derived from this report which then must be checked for actual volumes at a particular point in time. It is always possible that at the time of interest, there is no available capacity or deliveries. The data and information are arranged by pipeline company name followed by which companies supply gas to the named pipeline and to which companies the named pipeline delivers gas. Each receipt or delivery location is defined by the county and state.

  16. Diverless pipeline repair system for deep water

    Energy Technology Data Exchange (ETDEWEB)

    Spinelli, Carlo M. [Eni Gas and Power, Milan (Italy); Fabbri, Sergio; Bachetta, Giuseppe [Saipem/SES, Venice (Italy)

    2009-07-01

    SiRCoS (Sistema Riparazione Condotte Sottomarine) is a diverless pipeline repair system composed of a suite of tools to perform reliable subsea pipeline repair interventions in deep and ultra-deep water; it builds on the long-standing experience of Eni and Saipem in designing, laying and operating deep water pipelines. The key element of SiRCoS is a Connection System comprising two end connectors and a repair spool piece to replace a damaged pipeline section. A Repair Clamp with elastomeric seals is also available for local pipe damage. The Connection System is based on a pipe cold-forging process, consisting of swaging the pipe inside connectors with a suitable profile using high-pressure seawater. Three swaging operations have to be performed to replace the damaged pipe length. This technology has been developed through extensive theoretical work and laboratory testing, ending in a Type Approval by DNV over pipe sizes ranging from 20 to 48 inches OD. A complete SiRCoS system has been realised for the Green Stream pipeline, thoroughly tested in the workshop as well as in shallow water, and is now ready in the event of an emergency situation. The key functional requirements for the system are diverless repair intervention and full piggability after repair. Eni owns this technology, which is now available to other operators under a Repair Club arrangement providing stand-by repair services carried out by Saipem Energy Services. The paper gives a description of the main features of the Repair System as well as an insight into the technological developments on pipe cold-forging reliability and long-term duration evaluation. (author)

  17. Advances in riser and pipeline technologies

    Energy Technology Data Exchange (ETDEWEB)

    Kan, Wan C.; Mortazavi, Mehrdad; Weir, Michael S. [ExxonMobil Development Company, Dallas, TX (United States)

    2009-12-19

    As oil and gas production continues to move into new frontier areas, novel applications of existing riser and pipeline technologies need to be developed to meet the often more stringent requirements encountered in these environments. The challenges include ultra-deep water, harsh environments, aggressive fluid conditions, and local content objectives, among others. They will require industry to constantly extend, expand, and enhance the broad range of solution options. Also, existing industry design criteria may need to be revised, or new criteria may need to be developed, to satisfy these needs. ExxonMobil (EM) employs, and works with others in industry to promote, robust design and operating practices. This approach requires in-depth understanding, sound engineering principles, advanced analysis, uncertainty management, and supportive qualification test data. It enables confident selection, extrapolation, and innovation of technologies to address new riser system and pipeline challenges. Focus on fundamentals is imperative to ensure integrity of the selected systems during the fabrication, installation, and operation phases. Recent and past project experience in the deepwater Gulf of Mexico and West Africa provides many successful examples of this approach. This paper reviews several examples of key riser system and pipeline technology enhancements recently achieved by EM to provide confidence in addressing technical and project application challenges. Riser system technology enhancements addressed in this paper include steel catenary riser (SCR) application on a turret-moored FPSO with severe motions, a pipe-in-pipe (PIP) hybrid production riser to effectively manage gas lift and flow assurance requirements, irregular wave analysis methodology for flexible risers and umbilicals to reduce conservatism, and qualification of riser and pipeline VIV prediction and mitigation methods. Pipeline technology enhancements detailed in this paper include lateral buckling prediction

  18. Energy cost reduction in oil pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Limeira, Fabio Machado; Correa, Joao Luiz Lavoura; Costa, Luciano Macedo Josino da; Silva, Jose Luiz da; Henriques, Fausto Metzger Pessanha [Petrobras Transporte S.A. (TRANSPETRO), Rio de Janeiro, RJ (Brazil)

    2012-07-01

    One of the key questions of modern society concerns the rational use of the planet's natural resources and energy. During peak hours, residential demand reaches its maximum and there is not enough energy to satisfy all users, forcing many companies to reduce their workload, which affects major industries. Using energy more wisely has therefore become a strategic issue for any company, both because of the limited supply and because of the excessive cost it represents. With the objective of saving energy and reducing costs for oil pipelines, it was identified that increased energy consumption is primarily related to pumping stations and to the way many facilities are operated, that is, differently from what was originally designed. In order to optimize the process, this article examines the possible gains from alternatives involving changes in the pump scheme configuration and the non-use of pump stations at peak hours. Initially, an oil pipeline with potential for energy cost reduction was chosen, followed by an analysis of its operating history to confirm that there was sufficient room to change the operation mode. After confirming the pipeline choice, the system is briefly described and the literature is reviewed, explaining how the energy cost is calculated and the main characteristics of pumping systems in series and in parallel. Next, technically feasible alternatives are studied for operating the pipeline and for negotiating the energy demand contract. Finally, costs are calculated to identify the most economical alternative, both for a scenario with no increase in the actual transported volume of the pipeline and for a scenario that considers an increase of about 20%. The conclusion of this study indicates that the chosen pipeline can achieve a reduction on energy costs of up to 25% without the need for investments in new

  19. ASPCAP: THE APOGEE STELLAR PARAMETER AND CHEMICAL ABUNDANCES PIPELINE

    Energy Technology Data Exchange (ETDEWEB)

    García Pérez, Ana E.; Majewski, Steven R.; Shane, Neville; Sobeck, Jennifer; Troup, Nicholas [Department of Astronomy, University of Virginia, Charlottesville, VA 22904-4325 (United States); Prieto, Carlos Allende; Carrera, Ricardo; García-Hernández, D. A.; Zamora, Olga [Instituto de Astrofísica de Canarias, E-38205 La Laguna, Tenerife (Spain); Holtzman, Jon A. [New Mexico State University, Las Cruces, NM 88003 (United States); Shetrone, Matthew [University of Texas at Austin, McDonald Observatory, Fort Davis, TX 79734 (United States); Mészáros, Szabolcs [ELTE Gothard Astrophysical Observatory, H-9704 Szombathely, Szent Imre Herceg St. 112 (Hungary); Bizyaev, Dmitry [Apache Point Observatory, P.O. Box 59, Sunspot, NM 88349-0059 (United States); Cunha, Katia [Observatório Nacional, São Cristóvão, Rio de Janeiro (Brazil); Johnson, Jennifer A.; Weinberg, David H. [Department of Astronomy, The Ohio State University, Columbus, OH 43210 (United States); Nidever, David L. [Department of Astronomy, University of Michigan, Ann Arbor, MI 48109 (United States); Schiavon, Ricardo P. [Astrophysics Research Institute, Liverpool John Moores University, Egerton Wharf, Birkenhead, Wirral CH41 1LD (United Kingdom); Smith, Verne V. [National Optical Astronomy Observatories, Tucson, AZ 85719 (United States); Bovy, Jo, E-mail: agp@iac.es [Institute for Advanced Study, Einstein Drive, Princeton, NJ 08540 (United States); and others

    2016-06-01

    The Apache Point Observatory Galactic Evolution Experiment (APOGEE) has built the largest moderately high-resolution (R ≈ 22,500) spectroscopic map of stars across the Milky Way, including dust-obscured areas. The APOGEE Stellar Parameter and Chemical Abundances Pipeline (ASPCAP) is the software developed for the automated analysis of these spectra. ASPCAP determines atmospheric parameters and chemical abundances by comparing observed spectra to libraries of theoretical spectra, using χ² minimization in a multidimensional parameter space. The package consists of a FORTRAN90 code that does the actual minimization and a wrapper IDL code for book-keeping and data handling. This paper explains the ASPCAP components and functionality in detail, and presents results from a number of tests designed to check its performance. ASPCAP provides stellar effective temperatures, surface gravities, and metallicities precise to 2%, 0.1 dex, and 0.05 dex, respectively, for most APOGEE stars, which are predominantly giants. It also provides abundances for up to 15 chemical elements with various levels of precision, typically under 0.1 dex. The final data release (DR12) of the Sloan Digital Sky Survey III contains an APOGEE database of more than 150,000 stars. ASPCAP development continues in the SDSS-IV APOGEE-2 survey.
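
    As an illustration of the χ² search at the heart of such pipelines, the following minimal Python sketch performs a brute-force grid minimization over a library of synthetic spectra. It is a simplification (the names and the dictionary-based library are hypothetical): ASPCAP itself interpolates within the library and minimizes in a continuous parameter space rather than over a discrete grid.

        import numpy as np

        def chi2(observed, model, errors):
            # Standard chi-squared statistic between observed and model spectra
            # sampled on the same wavelength grid.
            return np.sum(((observed - model) / errors) ** 2)

        def best_fit(observed, errors, library):
            # 'library' maps parameter tuples (Teff, logg, [M/H], ...) to
            # synthetic spectra; returns the grid point minimizing chi-squared.
            return min(library, key=lambda p: chi2(observed, library[p], errors))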

  20. 77 FR 16052 - Information Collection Activities: Pipelines and Pipeline Rights-of-Way; Submitted for Office of...

    Science.gov (United States)

    2012-03-19

    ... through the submerged lands of the OCS for pipelines "* * * for the transportation of oil, natural gas... ensure that the pipeline, as constructed, will provide for safe transportation of oil and gas and other... pipeline would not conflict with any State requirements or unduly interfere with other OCS activities. BSEE...

  1. Workflow automation architecture standard

    Energy Technology Data Exchange (ETDEWEB)

    Moshofsky, R.P.; Rohen, W.T. [Boeing Computer Services Co., Richland, WA (United States)

    1994-11-14

    This document presents an architectural standard for the application of workflow automation technology. The standard includes a functional architecture, a process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and the results of a proof-of-concept prototype.

  2. HTPheno: an image analysis pipeline for high-throughput plant phenotyping.

    Science.gov (United States)

    Hartmann, Anja; Czauderna, Tobias; Hoffmann, Roberto; Stein, Nils; Schreiber, Falk

    2011-05-12

    In the last few years high-throughput analysis methods have become state-of-the-art in the life sciences. One of the latest developments is automated greenhouse systems for high-throughput plant phenotyping. Such systems allow the non-destructive screening of plants over a period of time by means of image acquisition techniques. During such screening different images of each plant are recorded and must be analysed by applying sophisticated image analysis algorithms. This paper presents an image analysis pipeline (HTPheno) for high-throughput plant phenotyping. HTPheno is implemented as a plugin for ImageJ, an open source image processing software. It can analyse colour images of plants taken in two different views (top view and side view) during a screening. Within the analysis, different phenotypical parameters for each plant, such as height, width and projected shoot area, are calculated for the duration of the screening. HTPheno is applied to analyse two barley cultivars. HTPheno, an open source image analysis pipeline, supplies a flexible and adaptable ImageJ plugin which can be used for automated image analysis in high-throughput plant phenotyping and thereby to derive new biological insights, such as the determination of fitness.

  3. HTPheno: An image analysis pipeline for high-throughput plant phenotyping

    Directory of Open Access Journals (Sweden)

    Stein Nils

    2011-05-01

    Background: In the last few years high-throughput analysis methods have become state-of-the-art in the life sciences. One of the latest developments is automated greenhouse systems for high-throughput plant phenotyping. Such systems allow the non-destructive screening of plants over a period of time by means of image acquisition techniques. During such screening different images of each plant are recorded and must be analysed by applying sophisticated image analysis algorithms. Results: This paper presents an image analysis pipeline (HTPheno) for high-throughput plant phenotyping. HTPheno is implemented as a plugin for ImageJ, an open source image processing software. It provides the possibility to analyse colour images of plants which are taken in two different views (top view and side view) during a screening. Within the analysis different phenotypical parameters for each plant such as height, width and projected shoot area of the plants are calculated for the duration of the screening. HTPheno is applied to analyse two barley cultivars. Conclusions: HTPheno, an open source image analysis pipeline, supplies a flexible and adaptable ImageJ plugin which can be used for automated image analysis in high-throughput plant phenotyping and therefore to derive new biological insights, such as determination of fitness.
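
    The core image measurements described above can be sketched with a crude greenness threshold. The following Python fragment (numpy only; the threshold rule is a hypothetical stand-in for HTPheno's actual colour segmentation) derives projected shoot area, height and width in pixels from a single plant image held as an RGB array.

        import numpy as np

        def plant_metrics(rgb):
            # rgb: H x W x 3 array of a side- or top-view plant image.
            r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
            mask = (g > r) & (g > b)              # naive "greenness" mask
            rows, cols = np.where(mask)
            if rows.size == 0:
                return {"area": 0, "height": 0, "width": 0}
            return {
                "area": int(mask.sum()),          # projected shoot area (pixels)
                "height": int(rows.max() - rows.min() + 1),
                "width": int(cols.max() - cols.min() + 1),
            }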

  4. Approaches to automatic parameter fitting in a microscopy image segmentation pipeline: An exploratory parameter space analysis

    Directory of Open Access Journals (Sweden)

    Christian Held

    2013-01-01

    Introduction: Research and diagnosis in medicine and biology often require the assessment of large amounts of microscopy image data. Although digital pathology and new bioimaging technologies are finding their way into clinical practice and pharmaceutical research, some general methodological issues in automated image analysis are still open. Methods: In this study, we address the problem of fitting the parameters in a microscopy image segmentation pipeline. We propose to fit the parameters of the pipeline's modules with optimization algorithms, such as genetic algorithms or coordinate descent, and show how visual exploration of the parameter space can help to identify sub-optimal parameter settings that need to be avoided. Results: This is of significant help in the design of our automatic parameter fitting framework, which enables us to tune the pipeline for large sets of micrographs. Conclusion: The underlying parameter spaces pose a challenge for manual as well as automated parameter optimization, as they can exhibit several local performance maxima. Hence, optimization strategies that are not able to escape local performance maxima, like the hill climbing algorithm, often terminate in a local maximum.
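
    A minimal sketch of the coordinate-descent strategy discussed above, in Python: each pipeline parameter is optimized in turn over a discrete grid against a user-supplied scoring function (e.g., segmentation accuracy on annotated micrographs). The function names are illustrative; as the abstract notes, such greedy schemes can stall in local maxima, so in practice one would restart from several initial parameter vectors.

        def coordinate_descent(score, params, grids, sweeps=5):
            # score: callable mapping a parameter vector to a quality value.
            # grids: one grid of candidate values per parameter.
            params = list(params)
            for _ in range(sweeps):
                for i, grid in enumerate(grids):
                    # Hold all other parameters fixed; pick the best value here.
                    params[i] = max(
                        grid, key=lambda v: score(params[:i] + [v] + params[i + 1:])
                    )
            return params, score(params)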

  5. The pipeline system for Octave and Matlab (PSOM): a lightweight scripting framework and execution engine for scientific workflows.

    Science.gov (United States)

    Bellec, Pierre; Lavoie-Courchesne, Sébastien; Dickinson, Phil; Lerch, Jason P; Zijdenbos, Alex P; Evans, Alan C

    2012-01-01

    The analysis of neuroimaging databases typically involves a large number of inter-connected steps called a pipeline. The pipeline system for Octave and Matlab (PSOM) is a flexible framework for the implementation of pipelines in the form of Octave or Matlab scripts. PSOM does not introduce new language constructs to specify the steps and structure of the workflow. All steps of analysis are instead described by a regular Matlab data structure, documenting their associated command and options, as well as their input, output, and cleaned-up files. The PSOM execution engine provides a number of automated services: (1) it executes jobs in parallel on a local computing facility as long as the dependencies between jobs allow for it and sufficient resources are available; (2) it generates a comprehensive record of the pipeline stages and the history of execution, which is detailed enough to fully reproduce the analysis; (3) if an analysis is started multiple times, it executes only the parts of the pipeline that need to be reprocessed. PSOM is distributed under an open-source MIT license and can be used without restriction for academic or commercial projects. The package has no external dependencies besides Matlab or Octave, is straightforward to install and supports a variety of operating systems (Linux, Windows, Mac). We ran several benchmark experiments on a public database including 200 subjects, using a pipeline for the preprocessing of functional magnetic resonance images (fMRI). The benchmark results showed that PSOM is a powerful solution for the analysis of large databases using local or distributed computing resources.
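
    The dependency-driven execution that PSOM provides can be illustrated in a few lines of Python (PSOM itself is Matlab/Octave code; this sketch only mirrors two of its services, dependency ordering and skipping up-to-date jobs, and the job-dictionary layout is hypothetical):

        def run_pipeline(jobs):
            # jobs: name -> {"deps": [names], "command": callable, "done": bool}
            pending = set(jobs)
            while pending:
                ready = [n for n in pending
                         if all(d not in pending for d in jobs[n]["deps"])]
                if not ready:
                    raise RuntimeError("cyclic dependency: %s" % sorted(pending))
                for name in ready:        # PSOM would dispatch these in parallel
                    if not jobs[name]["done"]:
                        jobs[name]["command"]()
                    pending.remove(name)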

  6. An integrated system for pipeline condition monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Strong, Andrew P.; Lees, Gareth; Hartog, Arthur; Twohig, Richard; Kader, Kamal; Hilton, Graeme; Mullens, Stephen; Khlybov, Artem [Schlumberger, Southampton (United Kingdom); Sanderson, Norman [BP Exploration, Sunbury (United Kingdom)

    2009-07-01

    In this paper we present the unique and innovative 'Integriti' pipeline and flow line integrity monitoring system developed by Schlumberger in collaboration with BP. The system uses optical fiber distributed sensors to provide simultaneous distributed measurements of temperature, strain and vibration for the detection, monitoring, and location of events including: Third Party Interference (TPI), including multiple simultaneous disturbances; geo-hazards and landslides; gas and oil leaks; permafrost protection. The Integriti technology also provides a unique means for tracking the progress of cleaning and instrumented pigs using existing optical telecom and data communications cables buried close to pipelines. The Integriti solution provides a unique and proactive approach to pipeline integrity management. It analyses a combination of measurands to provide the pipeline operator with an event recognition and location capability, in effect providing a hazard warning system and offering the operator the potential to take early action to prevent loss. Through the use of remote, optically powered amplification, an unprecedented detection range of 100 km is possible without the need for any electronics, and therefore remote power, in the field. A system can thus monitor 200 km of pipeline when configured to monitor 100 km upstream and downstream from a single location. As well as detecting conditions and events leading to leaks, this fully integrated system provides a means of detecting and locating small leaks in gas pipelines below the threshold of present online leak detection systems based on monitoring flow parameters. Other significant benefits include: potential reductions in construction costs; enhancement of the operator's existing integrity management program; and potential reductions in surveillance costs and HSE risks. In addition to onshore pipeline systems this combination of functionality and range is available for practicable

  7. INNOVATIVE ELECTROMAGNETIC SENSORS FOR PIPELINE CRAWLERS

    Energy Technology Data Exchange (ETDEWEB)

    J. Bruce Nestleroth; Richard J. Davis

    2005-05-23

    Internal inspection of pipelines is an important tool for ensuring safe and reliable delivery of fossil energy products. Current inspection systems that are propelled through the pipeline by the product flow cannot be used to inspect all pipelines because of the various physical barriers they encounter. Recent development efforts include a new generation of powered inspection platforms that crawl slowly inside a pipeline and are able to maneuver past the physical barriers that can limit inspection. At Battelle, innovative electromagnetic sensors are being designed and tested for these new pipeline crawlers. The various sensor types can be used to assess a wide range of pipeline anomalies including corrosion, mechanical damage, and cracks. The Applied Energy Systems Group at Battelle is in the second year of work on a projected three-year development effort. In the first year, two innovative electromagnetic inspection technologies were designed and tested. Both were based on moving high-strength permanent magnets to generate inspection energy. One system involved translating permanent magnets towards the pipe. A pulse of electric current would be induced in the pipe to oppose the magnetization according to Lenz's Law. The decay of this pulse would indicate the presence of defects in the pipe wall. This inspection method is similar to pulsed eddy current inspection methods, with the fundamental difference being the manner in which the current is generated. Details of this development effort were reported in the first semiannual report on this project. The second inspection methodology is based on rotating permanent magnets. The rotating exciter unit produces strong eddy currents in the pipe wall. At distances of a pipe diameter or more from the rotating exciter, the currents flow circumferentially. These circumferential currents are deflected by pipeline defects such as corrosion and axially aligned cracks. Simple sensors are used to detect the change in current

  8. Bayesian automated cortical segmentation for neonatal MRI

    Science.gov (United States)

    Chou, Zane; Paquette, Natacha; Ganesh, Bhavana; Wang, Yalin; Ceschin, Rafael; Nelson, Marvin D.; Macyszyn, Luke; Gaonkar, Bilwaj; Panigrahy, Ashok; Lepore, Natasha

    2017-11-01

    Several attempts have been made in the past few years to develop and implement an automated segmentation of neonatal brain structural MRI. However, accurate automated MRI segmentation remains challenging in this population because of the low signal-to-noise ratio, large partial volume effects and inter-individual anatomical variability of the neonatal brain. In this paper, we propose a learning method for segmenting the whole brain cortical grey matter on neonatal T2-weighted images. We trained our algorithm using a neonatal dataset composed of 3 full-term and 4 preterm infants scanned at term equivalent age. Our segmentation pipeline combines the FAST algorithm from the FSL library software and a Bayesian segmentation approach to create a threshold matrix that minimizes the error of mislabeling brain tissue types. Our method shows promising results with our pilot training set. In both preterm and full-term neonates, automated Bayesian segmentation generates a smoother and more consistent parcellation compared to FAST, while successfully removing the subcortical structures and cleaning the edges of the cortical grey matter. This method shows promising refinement of the FAST segmentation, considerably reducing the manual input and editing required from the user while further improving the reliability and processing time of neonatal MR image analysis. Further improvements will include a larger dataset of training images acquired from different manufacturers.

  9. CERES: A Set of Automated Routines for Echelle Spectra

    Science.gov (United States)

    Brahm, Rafael; Jordán, Andrés; Espinoza, Néstor

    2017-03-01

    We present the Collection of Elemental Routines for Echelle Spectra (CERES). These routines were developed for the construction of automated pipelines for the reduction, extraction, and analysis of spectra acquired with different instruments, allowing homogeneous and standardized results to be obtained. This modular code includes tools for handling the different steps of the processing: CCD image reduction; identification and tracing of the echelle orders; optimal and rectangular extraction; computation of the wavelength solution; estimation of radial velocities; and rough and fast estimation of the atmospheric parameters. To date, CERES has been used to develop automated pipelines for 13 different spectrographs, namely CORALIE, FEROS, HARPS, ESPaDOnS, FIES, PUCHEROS, FIDEOS, CAFE, DuPont/Echelle, Magellan/Mike, Keck/HIRES, Magellan/PFS, and APO/ARCES, but the routines can easily be used to deal with data coming from other spectrographs. We show the high precision in radial velocity that CERES achieves for some of these instruments, and we briefly summarize some results that have already been obtained using the CERES pipelines.
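
    The radial-velocity step, for example, reduces to maximizing a cross-correlation between the extracted spectrum and a template over a grid of trial Doppler shifts. The Python sketch below shows the idea under simplifying assumptions (uniform weighting, no barycentric correction); a production pipeline such as CERES refines the velocity by fitting the correlation peak rather than taking the grid maximum.

        import numpy as np

        C_KMS = 299792.458  # speed of light, km/s

        def radial_velocity(wave, flux, t_wave, t_flux, v_grid):
            # Cross-correlate the spectrum against a Doppler-shifted template
            # and return the trial velocity that maximizes the correlation.
            ccf = [np.sum(flux * np.interp(wave, t_wave * (1 + v / C_KMS), t_flux))
                   for v in v_grid]
            return v_grid[int(np.argmax(ccf))]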

  10. WASS: An open-source pipeline for 3D stereo reconstruction of ocean waves

    Science.gov (United States)

    Bergamasco, Filippo; Torsello, Andrea; Sclavo, Mauro; Barbariol, Francesco; Benetazzo, Alvise

    2017-10-01

    Stereo 3D reconstruction of ocean waves is gaining more and more popularity in the oceanographic community and industry. Indeed, recent advances of both computer vision algorithms and computer processing power now allow the study of the spatio-temporal wave field with unprecedented accuracy, especially at small scales. Although simple in theory, many details are difficult for a practitioner to master, so the implementation of a sea-wave 3D reconstruction pipeline is generally considered a complex task. For instance, camera calibration, reliable stereo feature matching and mean sea-plane estimation are all factors for which a well-designed implementation can make the difference in obtaining valuable results. For this reason, we believe that the open availability of a well-tested software package that automates the reconstruction process from stereo images to a 3D point cloud would be a valuable addition for future research in this area. We present WASS (http://www.dais.unive.it/wass), an open-source stereo processing pipeline for sea-wave 3D reconstruction. Our tool completely automates all the steps required to estimate dense point clouds from stereo images. Namely, it computes the extrinsic parameters of the stereo rig so that no delicate calibration has to be performed on the field. It implements a fast 3D dense stereo reconstruction procedure based on the consolidated OpenCV library and, lastly, it includes a set of filtering techniques both on the disparity map and the produced point cloud to remove the vast majority of erroneous points that can naturally arise while analyzing the optically complex nature of the water surface. In this paper, we describe the architecture of WASS and the internal algorithms involved. The pipeline workflow is shown step-by-step and demonstrated on real datasets acquired at sea.
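
    For the dense stereo step, a minimal OpenCV-based sketch is shown below. It assumes the frames are already rectified and that the 4x4 reprojection matrix Q from calibration is available (file names and parameter values are placeholders; WASS adds extensive filtering on top of this):

        import cv2
        import numpy as np

        left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
        right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

        # Semi-global block matching over a plausible disparity range.
        stereo = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128,
                                       blockSize=5)
        disp = stereo.compute(left, right).astype(np.float32) / 16.0  # fixed-point

        Q = np.load("Q.npy")              # hypothetical stored reprojection matrix
        points = cv2.reprojectImageTo3D(disp, Q)
        cloud = points[disp > 0]          # crude validity filter on the point cloud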

  11. Automation in Clinical Microbiology

    Science.gov (United States)

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  12. A software pipeline for processing and identification of fungal ITS sequences

    Directory of Open Access Journals (Sweden)

    Kristiansson Erik

    2009-01-01

    Background: Fungi from environmental samples are typically identified to species level through DNA sequencing of the nuclear ribosomal internal transcribed spacer (ITS) region for use in BLAST-based similarity searches in the International Nucleotide Sequence Databases. These searches are time-consuming and regularly require a significant amount of manual intervention and complementary analyses. Here we present software – in the form of an identification pipeline for large sets of fungal ITS sequences – developed to automate the BLAST process and several additional analysis steps. The performance of the pipeline was evaluated on a dataset of 350 ITS sequences from fungi growing as epiphytes on building material. Results: The pipeline was written in Perl and uses a local installation of NCBI-BLAST for the similarity searches of the query sequences. The variable subregion ITS2 of the ITS region is extracted from the sequences and used for additional searches of higher sensitivity. Multiple alignments of each query sequence and its closest matches are computed, and query sequences sharing at least 50% of their best matches are clustered to facilitate the evaluation of hypothetically conspecific groups. The pipeline proved to speed up the processing, as well as enhance the resolution, of the evaluation dataset considerably, and the fungi were found to belong chiefly to the Ascomycota, with Penicillium and Aspergillus as the two most common genera. The ITS2 subregion was found to indicate a different taxonomic affiliation than the complete ITS region for 10% of the query sequences, though this figure is likely to vary with the taxonomic scope of the query sequences. Conclusion: The present software readily assigns large sets of fungal query sequences to their respective best matches in the international sequence databases and places them in a larger biological context. The output is highly structured to be easy to process, although it still needs
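
    The clustering step described above can be approximated in a few lines of Python. The sketch assumes BLAST tabular output (-outfmt 6) sorted by score, keeps the best matches per query, and groups queries by single-linkage whenever they share at least half of their best matches; the details of the actual Perl pipeline differ.

        from collections import defaultdict

        def best_hits(blast_tab, n_best=10):
            # query -> list of its n_best subject ids, in file order.
            hits = defaultdict(list)
            for line in open(blast_tab):
                query, subject = line.split("\t")[:2]
                if subject not in hits[query] and len(hits[query]) < n_best:
                    hits[query].append(subject)
            return hits

        def cluster(hits, overlap=0.5):
            # Single-linkage grouping of queries sharing >= overlap of best hits.
            clusters = []
            for q in hits:
                for c in clusters:
                    if any(len(set(hits[q]) & set(hits[o]))
                           >= overlap * min(len(hits[q]), len(hits[o])) for o in c):
                        c.append(q)
                        break
                else:
                    clusters.append([q])
            return clusters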

  13. Sensor Network Architectures for Monitoring Underwater Pipelines

    Science.gov (United States)

    Mohamed, Nader; Jawhar, Imad; Al-Jaroodi, Jameela; Zhang, Liren

    2011-01-01

    This paper develops and compares different sensor network architecture designs that can be used for monitoring underwater pipeline infrastructures. These architectures are underwater wired sensor networks, underwater acoustic wireless sensor networks, RF (Radio Frequency) wireless sensor networks, integrated wired/acoustic wireless sensor networks, and integrated wired/RF wireless sensor networks. The paper also discusses the reliability challenges and enhancement approaches for these network architectures. The reliability evaluation, characteristics, advantages, and disadvantages among these architectures are discussed and compared. Three reliability factors are used for the discussion and comparison: the network connectivity, the continuity of power supply for the network, and the physical network security. In addition, the paper also develops and evaluates a hierarchical sensor network framework for underwater pipeline monitoring. PMID:22346669

  14. Installation Capacity Assessment of Damaged Deepwater Pipelines

    Directory of Open Access Journals (Sweden)

    Ramasamy R.

    2014-07-01

    The worldwide exploration and development of subsea and deepwater reservoirs has posed new and old engineering challenges to the offshore pipeline industry, requiring large D/t pipelines to be installed at water depths of up to 2700 m. Deepwater collapse and buckle propagation events are almost unavoidable, as the pipe wall thickness cannot always be determined from the codes and standards due to their limit state criteria. These codes also do not consider fabrication imperfections or damage sustained during transportation and handling. The objective of this paper is to present the Finite Element Analysis (FEA) of dented pipes with D/t ratios greater than 45, which is outside the applicability of current design codes, and to investigate the effects of various damage sizes on installation capacity in terms of collapse and buckle propagation.

  15. Development of an automatic pipeline scanning system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae H.; Lee, Jae C.; Moon, Soon S.; Eom, Heung S.; Choi, Yu R

    1999-11-01

    Pressure pipe inspection in nuclear power plants is one of the mandatory regulation items. Compared to manual ultrasonic inspection, automatic inspection has the benefits of more accurate and reliable inspection results and a reduction in radiation exposure. The final objective of this project is to develop an automatic inspection system for pressure pipe welds in nuclear power plants. We developed a pipeline scanning robot with four magnetic wheels and a 2-axis manipulator for controlling ultrasonic transducers, and developed the robot control computer which guides the robot precisely along the inspection path. We expect our system to contribute to reduced inspection time, enhanced performance, and effective management of inspection results. The system developed by this project can be put to practical use for inspection work after field tests. (author)

  16. Wave Pipelining Using Self Reset Logic

    Directory of Open Access Journals (Sweden)

    Miguel E. Litvin

    2008-01-01

    This study presents a novel design approach combining wave pipelining and self reset logic, which provides an elegant solution for high-speed data throughput with significant savings in power and area as compared with other dynamic CMOS logic implementations. To overcome some limitations of prior SRL designs, we employ a new SRL family, namely, dual-rail self reset logic with input disable (DRSRL-ID). These gates exhibit fairly constant timing parameters, especially the width of the output pulse, for varying fan-out and logic depth, helping to accommodate process, supply voltage, and temperature (PVT) variations. These properties simplify the implementation of wave-pipelined circuits. General timing analysis is provided and compared with previous implementations. Results of the circuit implementation are presented together with conclusions and future work.

  17. Phage Genome Annotation Using the RAST Pipeline.

    Science.gov (United States)

    McNair, Katelyn; Aziz, Ramy Karam; Pusch, Gordon D; Overbeek, Ross; Dutilh, Bas E; Edwards, Robert

    2018-01-01

    Phages are complex biomolecular machines that have to survive in a bacterial world. Phage genomes show many adaptations to their lifestyle, such as shorter genes, reduced capacity for redundant DNA sequences, and the inclusion of tRNAs in their genomes. In addition, phages are not free-living; they require a host for replication and survival. These unique adaptations provide challenges for the bioinformatics analysis of phage genomes. In particular, ORF calling, genome annotation, noncoding RNA (ncRNA) identification, and the identification of transposons and insertions are all complicated in phage genome analysis. We provide a road map through the phage genome annotation pipeline, and discuss the challenges and solutions for phage genome annotation as we have implemented them in the rapid annotation using subsystems (RAST) pipeline.

  18. Acoustic energy transmission in cast iron pipelines

    Science.gov (United States)

    Kiziroglou, Michail E.; Boyle, David E.; Wright, Steven W.; Yeatman, Eric M.

    2015-12-01

    In this paper we propose acoustic power transfer as a method for the remote powering of pipeline sensor nodes. A theoretical framework for acoustic power propagation in the ceramic transducers and the metal structures is drawn, based on the Mason equivalent circuit. The effect of mounting on the electrical response of piezoelectric transducers is studied experimentally. Using two identical transducer structures, power transmission of 0.33 mW at 1 V received voltage amplitude is demonstrated through a 1 m long, 118 mm diameter cast iron pipe with 8 mm wall thickness. A near-linear relationship between input and output voltage is observed. These results show that it is possible to deliver significant power to sensor nodes through acoustic waves in solid structures. The proposed method may enable the implementation of acoustic-powered wireless sensor nodes for structural and operational monitoring of pipeline infrastructure.

  19. Homology Groups of a Pipeline Petri Net

    Directory of Open Access Journals (Sweden)

    A. A. Husainov

    2013-01-01

    A Petri net is said to be elementary if every place can contain no more than one token. This paper studies the topological properties of the elementary Petri net of a pipeline consisting of n functional devices. If the work of the functional devices is considered continuous, one arrives at a topological space of "intermediate" states, and the paper calculates the homology groups of this topological space. By induction on n, using the addition sequence for homology groups of semicubical sets, it is proved that in dimensions 0 and 1 the integer homology groups of these nets are equal to the group of integers, and in the remaining dimensions they are zero. Directed homology groups are also studied, and a connection of these groups with deadlocks and newsletters is found. This helps to prove that all directed homology groups of the pipeline elementary Petri net are zero.

  20. Security Support in Continuous Deployment Pipeline

    DEFF Research Database (Denmark)

    Ullah, Faheem; Raft, Adam Johannes; Shahin, Mojtaba

    2017-01-01

    Continuous Deployment (CD) has emerged as a new practice in the software industry to continuously and automatically deploy software changes into production. Continuous Deployment Pipeline (CDP) supports CD practice by transferring the changes from the repository to production. Since most of the C...... penetration tools. Our findings indicate that the applied tactics improve the security of the major components (i.e., repository, continuous integration server, main server) of a CDP by controlling access to the components and establishing secure connections....

  1. Optimizing the TESS Planet Finding Pipeline

    Science.gov (United States)

    Chitamitara, Aerbwong; Smith, Jeffrey C.; Tenenbaum, Peter; TESS Science Processing Operations Center

    2017-10-01

    The Transiting Exoplanet Survey Satellite (TESS) is a new NASA all-sky planet finding survey that will observe stars within 200 light years that are 10-100 times brighter than those observed by the highly successful Kepler mission. TESS is expected to detect ~1000 planets smaller than Neptune and dozens of Earth-size planets. As in the Kepler mission, the Science Processing Operations Center (SPOC) processing pipeline at NASA Ames Research Center is tasked with calibrating the raw pixel data, generating systematic-error-corrected light curves, and then detecting and validating transit signals. The Transiting Planet Search (TPS) component of the pipeline must be modified and tuned for the new data characteristics of TESS. For example, because each sector is viewed for as little as 28 days, the pipeline will identify transiting planets based on a minimum of two transit signals rather than three, as in the Kepler mission. This may result in a significantly higher false positive rate. The study presented here measures the detection efficiency of the TESS pipeline using simulated data. Transiting planets identified by TPS are compared to transiting planets from the simulated transit model using the measured epochs, periods, transit durations and the expected detection statistic of injected transit signals (expected MES). From these comparisons, the recovery and false positive rates of TPS are measured. Measurements of recovery in TPS are then used to adjust TPS configuration parameters to maximize the planet recovery rate and minimize false detections. The improvements in recovery rate between initial TPS conditions and after various adjustments are presented and discussed.
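
    The recovery bookkeeping described here amounts to matching each injected signal to a detection with a consistent ephemeris. A hedged Python sketch (the tolerances and record layout are illustrative, not the SPOC matching criteria):

        def recovery_rate(injected, detected, period_tol=1e-3, epoch_tol=0.05):
            # injected/detected: lists of {"period": days, "epoch": days}.
            recovered = 0
            for inj in injected:
                if any(abs(d["period"] - inj["period"]) / inj["period"] < period_tol
                       and abs(d["epoch"] - inj["epoch"]) < epoch_tol
                       for d in detected):
                    recovered += 1
            # Approximate false-alarm count: assumes one detection per injection.
            false_alarms = len(detected) - recovered
            return recovered / len(injected), false_alarms / max(len(detected), 1)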

  2. Natural gas pipeline leaks across Washington, DC.

    Science.gov (United States)

    Jackson, Robert B; Down, Adrian; Phillips, Nathan G; Ackley, Robert C; Cook, Charles W; Plata, Desiree L; Zhao, Kaiguang

    2014-01-01

    Pipeline safety in the United States has increased in recent decades, but incidents involving natural gas pipelines still cause an average of 17 fatalities and $133 M in property damage annually. Natural gas leaks are also the largest anthropogenic source of the greenhouse gas methane (CH4) in the U.S. To reduce pipeline leakage and increase consumer safety, we deployed a Picarro G2301 Cavity Ring-Down Spectrometer in a car, mapping 5893 natural gas leaks (2.5 to 88.6 ppm CH4) across 1500 road miles of Washington, DC. The δ(13)C-isotopic signatures of the methane (-38.2‰ ± 3.9‰ s.d.) and ethane (-36.5 ± 1.1 s.d.) and the CH4:C2H6 ratios (25.5 ± 8.9 s.d.) closely matched the pipeline gas (-39.0‰ and -36.2‰ for methane and ethane; 19.0 for CH4/C2H6). Emissions from four street leaks ranged from 9200 to 38,200 L CH4 day(-1) each, comparable to natural gas used by 1.7 to 7.0 homes, respectively. At 19 tested locations, 12 potentially explosive (Grade 1) methane concentrations of 50,000 to 500,000 ppm were detected in manholes. Financial incentives and targeted programs among companies, public utility commissions, and scientists to reduce leaks and replace old cast-iron pipes will improve consumer safety and air quality, save money, and lower greenhouse gas emissions.

  3. Improving bioinformatic pipelines for exome variant calling

    OpenAIRE

    Ji, Hanlee P.

    2012-01-01

    Exome sequencing analysis is a cost-effective approach for identifying variants in coding regions. However, recognizing the relevant single nucleotide variants and small insertions and deletions remains a challenge for many researchers, and diagnostic laboratories typically do not have access to the bioinformatic analysis pipelines necessary for clinical application. The Atlas2 suite, recently released by the Baylor Genome Center, is designed to be widely accessible, runs on desktop computers but is ...

  4. Energy consumption in the pipeline industry

    Energy Technology Data Exchange (ETDEWEB)

    Banks, W. F.

    1977-12-31

    Estimates are developed of the energy consumption and energy intensity (EI) of five categories of U.S. pipeline industries: natural gas, crude oil, petroleum products, coal slurry, and water. For comparability with other transportation modes, it is desirable to calculate EI in Btu/Ton-Mile, and this is done, although the necessary unit conversions introduce additional uncertainties. Since water and sewer lines operate by lift and gravity, a comparable EI is not definable.

  5. Quantifying brain tissue volume in multiple sclerosis with automated lesion segmentation and filling

    Directory of Open Access Journals (Sweden)

    Sergi Valverde

    2015-01-01

    Lesion filling has been successfully applied to reduce the effect of hypo-intense T1-w Multiple Sclerosis (MS) lesions on automatic brain tissue segmentation. However, a study of fully automated pipelines incorporating lesion segmentation and lesion filling on tissue volume analysis has not yet been performed. Here, we analyzed the % of error introduced by automating the lesion segmentation and filling processes in the tissue segmentation of 70 clinically isolated syndrome patient images. First of all, images were processed using the LST and SLS toolkits with different pipeline combinations that differed in either automated or manual lesion segmentation, and lesion filling or masking out lesions. Then, images processed following each of the pipelines were segmented into gray matter (GM) and white matter (WM) using SPM8, and compared with the same images where expert lesion annotations were filled before segmentation. Our results showed that fully automated lesion segmentation and filling pipelines reduced significantly the % of error in GM and WM volume on images of MS patients, and performed similarly to the images where expert lesion annotations were masked before segmentation. In all the pipelines, the amount of misclassified lesion voxels was the main cause in the observed error in GM and WM volume. However, the % of error was significantly lower when automatically estimated lesions were filled and not masked before segmentation. These results are relevant and suggest that LST and SLS toolboxes allow the performance of accurate brain tissue volume measurements without any kind of manual intervention, which can be convenient not only in terms of time and economic costs, but also to avoid the inherent intra/inter variability between manual annotations.

  6. Quantifying brain tissue volume in multiple sclerosis with automated lesion segmentation and filling.

    Science.gov (United States)

    Valverde, Sergi; Oliver, Arnau; Roura, Eloy; Pareto, Deborah; Vilanova, Joan C; Ramió-Torrentà, Lluís; Sastre-Garriga, Jaume; Montalban, Xavier; Rovira, Àlex; Lladó, Xavier

    2015-01-01

    Lesion filling has been successfully applied to reduce the effect of hypo-intense T1-w Multiple Sclerosis (MS) lesions on automatic brain tissue segmentation. However, a study of fully automated pipelines incorporating lesion segmentation and lesion filling on tissue volume analysis has not yet been performed. Here, we analyzed the % of error introduced by automating the lesion segmentation and filling processes in the tissue segmentation of 70 clinically isolated syndrome patient images. First of all, images were processed using the LST and SLS toolkits with different pipeline combinations that differed in either automated or manual lesion segmentation, and lesion filling or masking out lesions. Then, images processed following each of the pipelines were segmented into gray matter (GM) and white matter (WM) using SPM8, and compared with the same images where expert lesion annotations were filled before segmentation. Our results showed that fully automated lesion segmentation and filling pipelines reduced significantly the % of error in GM and WM volume on images of MS patients, and performed similarly to the images where expert lesion annotations were masked before segmentation. In all the pipelines, the amount of misclassified lesion voxels was the main cause in the observed error in GM and WM volume. However, the % of error was significantly lower when automatically estimated lesions were filled and not masked before segmentation. These results are relevant and suggest that LST and SLS toolboxes allow the performance of accurate brain tissue volume measurements without any kind of manual intervention, which can be convenient not only in terms of time and economic costs, but also to avoid the inherent intra/inter variability between manual annotations.
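
    The lesion-filling operation itself is simple to state: voxels inside the lesion mask are replaced with intensities drawn from the normal-appearing white matter distribution, so that hypo-intense lesions no longer bias the GM/WM segmentation. A minimal numpy sketch follows (the arrays would typically come from NIfTI images; the actual LST/SLS filling algorithms are more sophisticated, e.g. using local intensity models):

        import numpy as np

        def fill_lesions(t1, lesion_mask, wm_mask, rng=np.random):
            # t1: 3-D intensity volume; masks: boolean volumes of equal shape.
            wm = t1[wm_mask & ~lesion_mask]           # normal-appearing WM voxels
            filled = t1.copy()
            filled[lesion_mask] = rng.normal(wm.mean(), wm.std(),
                                             size=int(lesion_mask.sum()))
            return filled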

  7. Estimation of efficiency of hydrotransport pipelines polyurethane coating application in comparison with steel pipelines

    Science.gov (United States)

    Aleksandrov, V. I.; Vasilyeva, M. A.; Pomeranets, I. B.

    2017-10-01

    The paper presents analytical calculations of the specific pressure loss in hydraulic transport of the Kachkanarsky GOK iron ore processing tailings slurry. The calculations are based on the results of experimental studies of the dependence of specific pressure loss upon the hydraulic roughness of the internal surface of pipelines lined with a polyurethane coating. The experiments proved that the hydraulic roughness of the polyurethane coating is a factor of four smaller than that of steel pipelines, resulting in a decrease of the hydraulic resistance coefficients entering the formula for specific pressure loss - the Darcy-Weisbach formula. Relative and equivalent roughness coefficients are calculated for pipelines with and without the polyurethane coating. Comparative calculations show that applying a polyurethane coating to hydrotransport pipelines is conducive to a decrease in specific energy consumption in hydraulic transport of the Kachkanarsky GOK iron ore processing tailings slurry by a factor of 1.5. The experiments were performed on a laboratory hydraulic test rig to estimate the character and rate of change of physical roughness in pipe samples with the polyurethane coating. The experiments showed that during the following 484 hours of operation, roughness changed only inappreciably in all pipe samples. By processing the experimental data with the methods of mathematical statistics, an empirical formula was obtained for calculating the operating roughness of the polyurethane coating surface, depending on the duration of pipeline operation with iron ore processing tailings slurry.
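
    For orientation, the comparison can be reproduced in outline with the Darcy-Weisbach relation i = (λ/d)·ρv²/2 together with any standard friction-factor correlation. The Python sketch below uses the explicit Swamee-Jain approximation and made-up slurry properties; the paper itself derives an empirical roughness formula from its own rig data.

        import math

        def friction_factor(re, rel_rough):
            # Swamee-Jain explicit approximation to the Colebrook equation.
            return 0.25 / math.log10(rel_rough / 3.7 + 5.74 / re ** 0.9) ** 2

        def pressure_gradient(v, d, rho, mu, eps):
            # Darcy-Weisbach specific pressure loss, Pa per metre of pipe.
            re = rho * v * d / mu                     # Reynolds number
            return friction_factor(re, eps / d) / d * rho * v ** 2 / 2.0

        # Illustrative only: a fourfold-smoother wall lowers the loss.
        steel = pressure_gradient(v=3.0, d=0.5, rho=1300.0, mu=0.004, eps=5e-4)
        lined = pressure_gradient(v=3.0, d=0.5, rho=1300.0, mu=0.004, eps=1.25e-4)
        print(steel / lined)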

  8. NABUCCO pipeline route selection through Turkey

    Energy Technology Data Exchange (ETDEWEB)

    Yildirim, Volkan [Karadeniz Technical Univ., Trabzon (Turkey). Dept. of Geomatics Engineering; Yomralioglu, Tahsin [Istanbul Technical Univ. (Turkey). Dept. of Geomatics Engineering

    2011-03-15

    Pipelines are one of the most effective methods of transporting energy sources like petroleum and gas. In pipeline projects, decreasing cost, reducing environmental impact and shortening construction time all depend on determining the right route at the beginning of the project. Route determination is usually carried out manually with the help of traditional methods. However, this approach is not effective in many situations because it does not evaluate the factors that affect the route as a whole. In fact, technical, economic, environmental and sociological issues should be examined together in the route determination process. Evaluating the factors affecting the route requires the analysis of many spatial datasets within the same system, and Geographical Information Systems (GIS) have been shown to be an effective means of analyzing such data-intensive problems. In this study, a route proposal for the NABUCCO Natural Gas Transmission Pipeline project is made using the Analytic Hierarchy Process (AHP), a multi-criteria decision-making technique, in combination with GIS. The effectiveness of this method is demonstrated by comparing a 557 km section of the determined optimal route with the currently planned route. (orig.)
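
    The AHP step weights the route criteria from a pairwise comparison matrix via its principal eigenvector, with a consistency ratio to flag incoherent judgments. A small numpy sketch with a hypothetical three-criteria matrix (the study's actual criteria and judgments are not reproduced here):

        import numpy as np

        RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}   # Saaty random indices

        def ahp_weights(pairwise):
            vals, vecs = np.linalg.eig(pairwise)
            k = int(np.argmax(vals.real))
            w = np.abs(vecs[:, k].real)
            w /= w.sum()                                  # normalized criterion weights
            n = pairwise.shape[0]
            cr = (vals[k].real - n) / ((n - 1) * RI[n])   # consistency ratio
            return w, cr

        m = np.array([[1.0, 3.0, 5.0],        # e.g. cost vs environment vs geology
                      [1 / 3.0, 1.0, 2.0],
                      [1 / 5.0, 1 / 2.0, 1.0]])
        weights, cr = ahp_weights(m)          # cr < 0.1 indicates consistent judgments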

  9. Fibre optics improving deepwater rov pipeline inspection

    Energy Technology Data Exchange (ETDEWEB)

    McGregor, D.

    1983-09-01

    Pipeline inspection is a complex task requiring a variety of sensors. The trend in recent times has been to fit all of the above sensors simultaneously to a vehicle in order to maximise data collection from the pipeline in a single pass. This data is then processed in real time as the ROV travels along the pipeline, so that a chart representing all the available data can be made available shortly after completion of a dive. The current generation of ROVs uses umbilicals containing various combinations of power conductors, co-axial cables and twisted pairs to carry the sensor data. These umbilicals, however, have inherent disadvantages which become apparent as sensor data increase in quantity and complexity. One disadvantage is the incompatibility between the high-quality data that must be transmitted to the surface and the large amounts of electrical energy demanded by the vehicle; another is the incompatibility between sensor signals in terms of frequency and power. To eliminate these problems, and to provide for future developments in ROV technology, the new generation of ROVs utilises fibre-optic conductors, the advantages being that they are immune to electro-magnetic interference, offer wider bandwidths with lower power losses (typically 5 dB or less per km) than conventional copper conductors, and are easier to handle as umbilicals lengthen due to the demand for vehicles to reach greater depths. Typically, these new umbilicals will be 1.5 km in length.

  10. KENYA’S OIL PIPELINE AND TERRORISM

    Directory of Open Access Journals (Sweden)

    E.O.S.ODHIAMBO

    2014-04-01

    The threat of Al-Shabaab and Al-Qaeda terrorist attacks on critical infrastructure (the oil pipeline) in Kenya has brought attention to the strategic issue of energy sector security, highlighting the potential vulnerabilities of this sector. Critical Infrastructure Protection (CIP) should be a key component of national security, especially after the Kenya Defence Forces' (KDF) incursion into Somalia. The merger of the Al-Shabaab and Al-Qaeda terrorist groups and the accelerating grenade attacks against Kenya in retaliation have become the centre of the debate on terrorism and the internal security of Kenya. Energy resources are strategic assets from the security, political and economic points of view, and Kenya as an oil transit country is considered of primary strategic importance at the international level. International terrorism has always looked with interest at oil resources in order to meet its political and economic targets. We argue that Kenya's oil pipelines are vulnerable to Al-Shabaab and Al-Qaeda terrorist attack. In summary, the article looks at the concept of terrorism within the framework of critical infrastructure protection, the dangers of attacks on oil pipelines, the Kenyan government's preparedness, and recommendations.

  11. Instruction issue logic in pipelined supercomputers

    Energy Technology Data Exchange (ETDEWEB)

    Weiss, S.; Smith, J.E.

    1984-11-01

    Basic principles and design tradeoffs for control of pipelined processors are first discussed. We concentrate on register-register architectures like the CRAY-1, where pipeline control logic is localized to one or two pipeline stages and is referred to as "instruction issue logic." Design tradeoffs are explored by giving designs for a variety of instruction issue methods that represent a range of complexity and sophistication. These vary from the CRAY-1 issue logic to a version of Tomasulo's algorithm, first used in the IBM 360/91 floating point unit. Also studied are Thornton's "scoreboard" algorithm used on the CDC 6600 and an algorithm we have devised. To provide a standard for comparison, all the issue methods are used to implement the CRAY-1 scalar architecture. Then, using a simulation model and the Lawrence Livermore Loops compiled with the CRAY Fortran compiler, performance results for the various issue methods are given and discussed.
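
    The hazard checks that such issue logic performs can be conveyed by a toy in-order model in Python: an instruction issues only if none of its source registers (read-after-write) or its destination register (write-after-write) is still owned by an in-flight instruction. This is a didactic sketch, far simpler than any of the schemes compared in the paper.

        def can_issue(instr, busy_regs):
            if any(src in busy_regs for src in instr["sources"]):
                return False                  # RAW hazard: operand not yet ready
            if instr["dest"] in busy_regs:
                return False                  # WAW hazard: result slot still in use
            return True

        def issue_in_order(program):
            busy, issued = set(), []
            for instr in program:
                if not can_issue(instr, busy):
                    break                     # in-order issue stalls the stream
                busy.add(instr["dest"])       # held until writeback (not modeled)
                issued.append(instr)
            return issued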

  12. Automated DNA Sequencing System

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, G.A.; Ekkebus, C.P.; Hauser, L.J.; Kress, R.L.; Mural, R.J.

    1999-04-25

    Oak Ridge National Laboratory (ORNL) is developing a core DNA sequencing facility to support biological research endeavors at ORNL and to conduct basic sequencing automation research. This facility is novel because its development is based on existing standard biology laboratory equipment; thus, the development process is of interest to the many small laboratories trying to use automation to control costs and increase throughput. Before automation, biology laboratory personnel purified DNA, completed cycle sequencing, and prepared 96-well sample plates with commercially available hardware designed specifically for each step in the process. Following purification and thermal cycling, an automated sequencing machine was used for the sequencing. A technician handled all movement of the 96-well sample plates between machines. To automate the process, ORNL is adding a CRS Robotics A-465 arm, an ABI 377 sequencing machine, an automated centrifuge, an automated refrigerator, and possibly an automated SpeedVac. The entire system will be integrated with one central controller that will direct each machine and the robot. The goal of this system is to completely automate the sequencing procedure from bacterial cell samples through ready-to-be-sequenced DNA and ultimately to completed sequence. The system will be flexible and will accommodate different chemistries from existing automated sequencing lines. The system will be expanded in the future to include colony picking and/or actual sequencing. This discrete-event DNA sequencing system will demonstrate that smaller sequencing labs can automate cost-effectively as the laboratory grows.

  13. Streamlining workflow and automation to accelerate laboratory scale protein production.

    Science.gov (United States)

    Konczal, Jennifer; Gray, Christopher H

    2017-05-01

    Protein production facilities are often required to produce diverse arrays of proteins for demanding methodologies, including crystallography, NMR, ITC and other reagent-intensive techniques. It is common for these teams to find themselves the bottleneck in the pipeline of ambitious projects. This pressure to deliver has resulted in the evolution of many novel methods to increase capacity and throughput at all stages in the pipeline for the generation of recombinant proteins. This review aims to describe current and emerging options to accelerate the success of protein production in Escherichia coli. We emphasize technologies that have been evaluated and implemented in our laboratory, including innovative molecular biology and expression vectors, small-scale expression screening strategies and the automation of parallel and multidimensional chromatography. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  14. Automated Object Classification with ClassX

    Science.gov (United States)

    Suchkov, A. A.; Hanisch, R. J.; White, R. L.; Postman, M.; Donahue, M. E.; McGlynn, T. A.; Angelini, L.; Corcoran, M. F.; Drake, S. A.; Pence, W. D.; White, N.; Winter, E. L.; Genova, F.; Ochsenbein, F.; Fernique, P.; Derriere, S.

    ClassX is a project aimed at creating an automated system to classify X-ray sources and is envisaged as a prototype of the Virtual Observatory. As a system, ClassX creates a pipeline by integrating a network of classifiers with an engine that searches and retrieves multi-wavelength counterparts for a given target from the worldwide data storage media. At the start of the project we identified a number of issues that needed to be addressed to make the implementation of such a system possible. The most fundamental are: (a) classification methods and algorithms, (b) selection and definition of classes (object types), and (c) identification of source counterparts across multi-wavelength data. Their relevance to the project objectives will be seen in the results below as we discuss ClassX classifiers.

  15. Early generation pipeline girth welding practices and their implications for integrity management of North American pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Amend, Bill [DNV Columbus, Inc., Yorba Linda, CA (United States)

    2010-07-01

    In response to the interest in ensuring the continued safe operation of vintage pipelines and the integrity management challenges related to those pipelines, in 2009 PRCI sponsored a project called "Vintage Girth Weld Defect Assessment - Comprehensive Study". Its objectives focused on girth welds made with the shielded metal arc welding (SMAW) process, particularly with regard to: review of approaches for evaluating the integrity of these welds; description of the typical characteristics and properties of vintage SMAW welds; and determination of gaps in available information and technology that hinder effective integrity assessment and management of vintage girth welds. A very extensive literature review was performed as part of this project. Key findings include the following. The failure rate of early generation girth welds is low, especially when considering the rate of catastrophic failures. Pipeline girth welds are unlikely to fail unless subjected to axial strains that far exceed the strains related to internal pressure alone.

  16. The automated data processing architecture for the GPI Exoplanet Survey

    Science.gov (United States)

    Wang, Jason J.; Perrin, Marshall D.; Savransky, Dmitry; Arriaga, Pauline; Chilcote, Jeffrey K.; De Rosa, Robert J.; Millar-Blanchaer, Maxwell A.; Marois, Christian; Rameau, Julien; Wolff, Schuyler G.; Shapiro, Jacob; Ruffio, Jean-Baptiste; Graham, James R.; Macintosh, Bruce

    2017-09-01

    The Gemini Planet Imager Exoplanet Survey (GPIES) is a multi-year direct imaging survey of 600 stars to discover and characterize young Jovian exoplanets and their environments. We have developed an automated data architecture to process and index all data related to the survey uniformly. An automated and flexible data processing framework, which we term the GPIES Data Cruncher, combines multiple data reduction pipelines together to intelligently process all spectroscopic, polarimetric, and calibration data taken with GPIES. With no human intervention, fully reduced and calibrated data products are available less than an hour after the data are taken to expedite follow-up on potential objects of interest. The Data Cruncher can run on a supercomputer to reprocess all GPIES data in a single day as improvements are made to our data reduction pipelines. A backend MySQL database indexes all files, which are synced to the cloud, and a front-end web server allows for easy browsing of all files associated with GPIES. To help observers, quicklook displays show reduced data as they are processed in real-time, and chatbots on Slack post observing information as well as reduced data products. Together, the GPIES automated data processing architecture reduces our workload, provides real-time data reduction, optimizes our observing strategy, and maintains a homogeneously reduced dataset to study planet occurrence and instrument performance.

  17. Virtual Pipeline System Testbed to Optimize the U.S. Natural Gas Transmission Pipeline System

    Energy Technology Data Exchange (ETDEWEB)

    Kirby S. Chapman; Prakash Krishniswami; Virg Wallentine; Mohammed Abbaspour; Revathi Ranganathan; Ravi Addanki; Jeet Sengupta; Liubo Chen

    2005-06-01

    The goal of this project is to develop a Virtual Pipeline System Testbed (VPST) for natural gas transmission. This study uses a fully implicit finite difference method to analyze transient, nonisothermal compressible gas flow through a gas pipeline system. The inertia term of the momentum equation is included in the analysis. The testbed simulates compressor stations, the pipe that connects these compressor stations, the supply sources, and the end-user demand markets. The compressor station is described by identifying the make, model, and number of engines, gas turbines, and compressors. System operators and engineers can analyze the impact of system changes on the dynamic deliverability of gas and on the environment.
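
    For reference, transient nonisothermal analyses of this kind rest on the one-dimensional conservation equations for compressible pipe flow. A standard textbook form is sketched below; the notation is mine, and the report's exact formulation (including its energy equation) may differ:

    ```latex
    \begin{align}
      \frac{\partial \rho}{\partial t}
        + \frac{\partial (\rho u)}{\partial x} &= 0
        && \text{(continuity)} \\
      \underbrace{\frac{\partial (\rho u)}{\partial t}}_{\text{inertia term}}
        + \frac{\partial (\rho u^{2})}{\partial x}
        + \frac{\partial p}{\partial x}
        + \frac{f \rho u \lvert u \rvert}{2D}
        + \rho g \sin\theta &= 0
        && \text{(momentum)}
    \end{align}
    ```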

  18. Environmental impact assessment in the pipeline industry. Experiences with the UK north western ethylene pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Ryde, A.

    1997-12-31

    The north western ethylene pipeline is the final link between Shell's oil and gas fields in the North Sea and its petrochemical complexes in Cheshire. The natural gas from which ethylene is obtained comes from the Brent and central fields in the North Sea. Environmental impacts are discussed in this paper, covering the following topics: regulatory and legal aspects; environmental assessment during planning and design; environmental control during construction; environmental management during operation; and environmental controls at sensitive sites on the north western ethylene pipeline, with some examples. 11 refs., 2 figs.

  19. Automated stopcock actuator

    OpenAIRE

    Vandehey, N. T.; O\\'Neil, J. P.

    2015-01-01

    Introduction: We have developed a low-cost stopcock valve actuator for radiochemistry automation built using a stepper motor and an Arduino, an open-source single-board microcontroller. The controller hardware can be programmed to run by serial communication or via two 5–24 V digital lines for simple integration into any automation control system. This valve actuator allows for automated use of a single, disposable stopcock, providing a number of advantages over stopcock manifold systems ...

  20. Working group 3: upstream pipelines: inspection, corrosion and integrity management

    Energy Technology Data Exchange (ETDEWEB)

    Paez, Jorge; Stephenson, Mark [Talisman Energy, (Canada)

    2011-07-01

    The third topic investigated the latest challenges to upstream pipeline operation and the areas for improvement in upstream integrity in the pipeline industry. The first session of talks reported on the pipeline incident analysis conducted by the CAPP on several companies from 2006 to 2010 in order to identify best management practices and to drive improvement in pipeline integrity management. Reviews of primary failure statistics and failure frequency were conducted with respect to the various pipe materials. A summary of changes to the CSA standard related to non-metallic pipes was also presented to complete this background overview of the upstream industry. The second session provided more information about these non-metallic pipes, focusing on construction and quality issues with large diameter HDPE pipelines. The third session discussed ERW pipelines in relation to the upstream industry. An integrity management panel discussion was carried out to close this third working group.

  1. Leadership Pipeline på rejse i den offentlige sektor

    DEFF Research Database (Denmark)

    Nielsen, Jeppe; Dahl, Kristian Aagaard

    2015-01-01

    dissemination of the Leadership Pipeline model. The research literature on Leadership Pipeline is, however, modest. With theoretical ammunition from the "travel of ideas" perspective and a multi-level case study in the Danish public sector, the article sheds light on the dissemination and implementation of Leadership Pipeline. By...... applying shifting theoretical concepts and zooming in (organization level) and zooming out (field level) respectively, the article shows how a series of mutually connected theorization and translation activities secured legitimacy for Leadership Pipeline and accelerated its dissemination, while at the same time different...... practice variants of Leadership Pipeline crystallized in public organizations, each with its own characteristics and problem definitions. On the basis of this empirical material, we argue that the implementation of Leadership Pipeline played out in overlapping theorization and translation activities that reinforced...

  2. Crack detection in pipelines using multiple electromechanical impedance sensors

    Science.gov (United States)

    Zuo, Chunyuan; Feng, Xin; Zhang, Yu; Lu, Lu; Zhou, Jing

    2017-10-01

    An extensive network of pipeline systems is used to transport and distribute national energy resources that heavily influence a nation’s economy. Therefore, the structural integrity of these pipeline systems must be monitored and maintained. However, structural damage detection remains a challenge in pipeline engineering. To this end, this study developed a modified electromechanical impedance (EMI) technique for crack detection that involves fusing information from multiple sensors. We derived a new damage-sensitive feature factor based on a pipeline EMI model that considers the influence of the bonding layer between the EMI sensors and pipeline. We experimentally validated the effectiveness of the proposed method. Finally, we used a damage index—root mean square deviation—to examine the degree and position of crack damage in a pipeline.
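
    The root-mean-square deviation index mentioned above is conventionally computed from the real part of the impedance signature. A minimal sketch follows; the paper's modified, bonding-layer-corrected feature factor is more elaborate than this standard form.

    ```python
    # Root-mean-square-deviation (RMSD) damage index in its standard EMI form,
    # computed over the real part (resistance) of the impedance signature.
    import numpy as np

    def rmsd_index(z_healthy: np.ndarray, z_damaged: np.ndarray) -> float:
        re_h = np.real(z_healthy)   # baseline signature
        re_d = np.real(z_damaged)   # signature after possible damage
        return 100.0 * np.sqrt(np.sum((re_d - re_h) ** 2) / np.sum(re_h ** 2))

    # A larger RMSD at a given sensor suggests damage closer to that sensor,
    # which is what enables the multi-sensor localization described above.
    ```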

  3. Influence of remanent magnetization on pitting corrosion in pipeline steel

    Energy Technology Data Exchange (ETDEWEB)

    Espina-Hernandez, J. H. [ESIME Zacatenco, SEPI Electronica Instituto Politecnico Nacional Mexico, D. F. (Mexico); Caleyo, F.; Hallen, J. M. [DIM-ESIQIE, Instituto Politecnico Nacional Mexico D. F. (Mexico); Lopez-Montenegro, A.; Perez-Baruch, E. [Pemex Exploracion y Produccion, Region Sur Villahermosa, Tabasco (Mexico)

    2010-07-01

    Statistical studies performed in Mexico indicate that leakage due to external pitting corrosion is the most likely cause of failure of buried pipelines. When pipelines are inspected with the routinely used magnetic flux leakage (MFL) technology, the magnetization level of every part of the pipeline changes as the MFL tool travels through it. Remanent magnetization stays in the pipeline wall after inspection, at levels that may differ from one point to the next. This paper studies the influence of the magnetic field on pitting corrosion. Experiments were carried out on grade 52 steel under a level of remanent magnetization and other laboratory conditions that imitated the conditions of a pipeline after an MFL inspection. Non-magnetized control samples and magnetized samples were subjected to pitting by immersion in a solution containing chloride and sulfide ions for seven days, and then inspected with optical microscopy. Results show that the magnetic field in the pipeline wall significantly increases pitting corrosion.

  4. Managing changes of location classes of gas pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Cunha, Sergio B.; Sousa, Antonio Geraldo de [PETROBRAS Transporte S.A. (TRANSPETRO), Rio de Janeiro, RJ (Brazil)

    2009-12-19

    Most of the gas pipeline design codes utilize a class location system, where the design safety factor and the hydrostatic test factor are determined according to the population density in the vicinity of the pipeline route. Consequently, if an operator is requested or desires to maintain an existing gas pipeline in compliance with its design code, it will reduce the operational pressure or replace pipe sections to increase the wall thickness whenever a change in location class takes place. This article introduces an alternative methodology to deal with changes in location classes of gas pipelines. Initially, selected codes that utilize location class systems are reviewed. Afterwards, a model for the area affected by an ignition following a natural gas pipeline leak is described. Finally, a methodology to determine the MAOP and third-party damage mitigation measures for gas transport pipelines that underwent changes in location class is presented. (author)
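
    One widely used ingredient of such ignition-affected-area models is the potential impact radius of ASME B31.8S. It is shown below only as a familiar example; the authors' model may differ, and the parameter values are illustrative.

    ```python
    # ASME B31.8S potential impact radius for a natural gas pipeline:
    # r = 0.69 * d * sqrt(p), with d in inches, p in psig, r in feet.
    # Example values are illustrative, not taken from the paper.
    import math

    def potential_impact_radius(p_psig: float, d_in: float) -> float:
        return 0.69 * d_in * math.sqrt(p_psig)

    # e.g. a 20-inch line at 900 psig: about 414 ft
    print(round(potential_impact_radius(900.0, 20.0)))
    ```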

  5. Laboratory Automation and Middleware.

    Science.gov (United States)

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although not able to implement total laboratory automation, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Complacency and Automation Bias in the Use of Imperfect Automation.

    Science.gov (United States)

    Wickens, Christopher D; Clegg, Benjamin A; Vieane, Alex Z; Sebok, Angelia L

    2015-08-01

    We examine the effects of two different kinds of decision-aiding automation errors on human-automation interaction (HAI), occurring at the first failure following repeated exposure to correctly functioning automation. The two errors are incorrect advice, triggering the automation bias, and missing advice, reflecting complacency. Contrasts between analogous automation errors in alerting systems, rather than decision aiding, have revealed that alerting false alarms are more problematic to HAI than alerting misses are. Prior research in decision aiding, although contrasting the two aiding errors (incorrect vs. missing), has confounded error expectancy. Participants performed an environmental process control simulation with and without decision aiding. For those with the aid, automation dependence was created through several trials of perfect aiding performance, and an unexpected automation error was then imposed in which automation was either gone (one group) or wrong (a second group). A control group received no automation support. The correct aid supported faster and more accurate diagnosis and lower workload. The aid failure degraded all three variables, but "automation wrong" had a much greater effect on accuracy, reflecting the automation bias, than did "automation gone," reflecting the impact of complacency. Some complacency was manifested for automation gone, by a longer latency and more modest reduction in accuracy. Automation wrong, creating the automation bias, appears to be a more problematic form of automation error than automation gone, reflecting complacency. Decision-aiding automation should indicate its lower degree of confidence in uncertain environments to avoid the automation bias. © 2015, Human Factors and Ergonomics Society.

  7. Streak detection and analysis pipeline for optical images

    Science.gov (United States)

    Virtanen, J.; Granvik, M.; Torppa, J.; Muinonen, K.; Poikonen, J.; Lehti, J.; Säntti, T.; Komulainen, T.; Flohrer, T.

    2014-07-01

    We describe a novel data processing and analysis pipeline for optical observations of moving objects, either of natural (asteroids, meteors) or artificial origin (satellites, space debris). The monitoring of the space object populations requires reliable acquisition of observational data to support the development and validation of population models, and to build and maintain catalogues of orbital elements. The orbital catalogues are, in turn, needed for the assessment of close approaches (for asteroids, with the Earth; for satellites, with each other) and for the support of contingency situations or launches. For both types of populations, there is also increasing interest in detecting fainter objects corresponding to the small end of the size distribution. We focus on the low signal-to-noise (SNR) detection of objects with high angular velocities, resulting in long and faint object trails, or streaks, in the optical images. The currently available, mature image processing algorithms for detection and astrometric reduction of optical data cover objects that cross the sensor field-of-view comparatively slowly, and, particularly for satellites, within a rather narrow, predefined range of angular velocities. By applying specific tracking techniques, the objects appear point-like or as short trails in the exposures. However, the general survey scenario is always a 'track-before-detect' problem, resulting in streaks of arbitrary lengths. Although some considerations for low-SNR processing of streak-like features are available in the current image processing and computer vision literature, algorithms are not readily available yet. In the ESA-funded StreakDet (Streak detection and astrometric reduction) project, we develop and evaluate an automated processing pipeline applicable to single images (as compared to consecutive frames of the same field) obtained with any observing scenario, including space-based surveys and both low- and high-altitude populations. The algorithmic
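
    As a generic baseline for streak extraction (not the StreakDet algorithm itself), edge detection followed by a probabilistic Hough transform is a common starting point. The thresholds below are illustrative assumptions.

    ```python
    # Generic streak-extraction baseline: edge detection plus a probabilistic
    # Hough transform. Assumes a 2-D grayscale exposure; thresholds are
    # illustrative, and this is not the StreakDet algorithm.
    import numpy as np
    from skimage.feature import canny
    from skimage.transform import probabilistic_hough_line

    def find_streaks(image: np.ndarray):
        edges = canny(image, sigma=2.0)      # suppress noise, keep linear trails
        # A long minimum line_length favours genuine streaks over stars and noise.
        return probabilistic_hough_line(edges, threshold=10,
                                        line_length=50, line_gap=5)

    # Each returned item is ((x0, y0), (x1, y1)): candidate streak end points,
    # which a real pipeline would pass on to astrometric reduction.
    ```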

  8. Quantitative risk analysis in two pipelines operated by TRANSPETRO

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, Claudio B. [PETROBRAS Transporte S/A (TRANSPETRO), Rio de Janeiro, RJ (Brazil); Pinho, Edson [Universidade Federal Rural do Rio de Janeiro (UFRRJ), Seropedica, RJ (Brazil); Bittencourt, Euclides [Centro Universitario FIB, Salvador , BA (Brazil)

    2009-07-01

    Transportation risk analysis techniques were used to study two pipelines operated by TRANSPETRO. Pipeline A is for the simultaneous transportation of diesel, gasoline and LPG and comprises three parts, all of them crossing rural areas. Pipeline B is for oil transportation and one of its ends is located in an area of high population density. Both pipelines had their risk studied using the PHAST RISK(R) software, and the individual risk measures, the only measures considered for licensing purposes in this type of study, presented levels far below the maximum tolerable levels considered. (author)

  9. Computer models of pipeline systems based on electro hydraulic analogy

    Science.gov (United States)

    Kolesnikov, S. V.; Kudinov, V. A.; Trubitsyn, K. V.; Tkachev, V. K.; Stefanyuk, E. V.

    2017-10-01

    This paper describes the results of the development of mathematical and computer models of complex multi-loop branched pipeline networks for various purposes (water, oil and gas pipelines, heating networks, etc.) based on the electro-hydraulic analogy: the spread of current in conductors and of fluids in pipelines is described by the same equations. Kirchhoff's laws, as used in the calculation of electrical networks, are applied in the calculations for pipeline systems. To make the computer model's resistance to the transfer of the medium approximate that of the real network as closely as possible, a method of automatic model identification is applied.
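
    The analogy can be made concrete with a small worked example. With a linearized resistance R per pipe (pressure drop = R x flow, the analogue of Ohm's law), Kirchhoff's current law at each node yields a linear system for the nodal pressures. The network, resistances and demands below are illustrative only.

    ```python
    # Nodal analysis of a tiny pipeline network via the electro-hydraulic
    # analogy. Node 0 holds a fixed reference pressure; nodes 1 and 2 are
    # unknowns. All values are illustrative.
    import numpy as np

    pipes = [(0, 1, 2.0), (1, 2, 1.0), (0, 2, 4.0)]   # (from, to, resistance R)
    demand = {1: 0.0, 2: 3.0}                          # draw-off at unknown nodes
    p_ref = 100.0                                      # pressure at node 0

    n = 2
    A = np.zeros((n, n))
    b = np.zeros(n)
    for i, j, R in pipes:
        g = 1.0 / R                  # conductance, the analogue of 1/R in circuits
        for k in (i, j):
            if k == 0:
                continue             # reference node is not an unknown
            A[k - 1, k - 1] += g
            other = j if k == i else i
            if other == 0:
                b[k - 1] += g * p_ref
            else:
                A[k - 1, other - 1] -= g
    for node, q in demand.items():
        b[node - 1] -= q             # Kirchhoff's current law: inflow = draw-off

    pressures = np.linalg.solve(A, b)
    print(pressures)                 # nodal pressures at nodes 1 and 2
    ```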

  10. Pipeline bottoming cycle study. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1980-06-01

    The technical and economic feasibility of applying bottoming cycles to the prime movers that drive the compressors of natural gas pipelines was studied. These bottoming cycles convert some of the waste heat from the exhaust gas of the prime movers into shaft power and conserve gas. Three typical compressor station sites were selected, each on a different pipeline. Although the prime movers were different, they were similar enough in exhaust gas flow rate and temperature that a single bottoming cycle system could be designed, with some modifications, for all three sites. Preliminary design included selection of the bottoming cycle working fluid, optimization of the cycle, and design of the components, such as turbine, vapor generator and condensers. Installation drawings were made and hardware and installation costs were estimated. The results of the economic assessment of retrofitting bottoming cycle systems on the three selected sites indicated that profitability was strongly dependent upon the site-specific installation costs, how the energy was used and the yearly utilization of the apparatus. The study indicated that bottoming cycles are a competitive investment alternative for certain applications in the pipeline industry. Bottoming cycles are technically feasible. It was concluded that proper design and operating practices would reduce the environmental and safety hazards to acceptable levels. The amount of gas that could be saved through the year 2000 by the adoption of bottoming cycles was estimated at 0.296 trillion ft³ for a low supply projection and 0.734 trillion ft³ for a high supply projection. The potential market for bottoming cycle equipment for the two supply projections varied from 170 to 500 units of varying size. Finally, a demonstration program plan was developed.

  11. Materials Solutions for Hydrogen Delivery in Pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Ningileri, Shridas T.; Boggess, Todd A; Stalheim, Douglas

    2013-01-02

    The main objective of the study is as follows: Identify steel compositions/microstructures suitable for construction of new pipeline infrastructure and evaluate the potential use of the existing steel pipeline infrastructure in high pressure gaseous hydrogen applications. The microstructures of four pipeline steels were characterized and tensile testing was conducted in gaseous hydrogen and helium at pressures of 5.5 MPa (800 psi), 11 MPa (1600 psi) and 20.7 MPa (3000 psi). Based on reduction of area, the two of the four steels that performed the best across the pressure range were selected for evaluation of fracture and fatigue performance in gaseous hydrogen at 5.5 MPa (800 psi) and 20.7 MPa (3000 psi). The basic format for this phase of the study is as follows: Microstructural characterization of volume fraction of phases in each alloy; Tensile testing of all four alloys in He and H2 at 5.5 MPa (800 psi), 11 MPa (1600 psi), and 20.7 MPa (3000 psi), with RA performance used to choose the two best performers for further mechanical property evaluation; Fracture testing (ASTM E1820) of the two best tensile test performers in H2 at 5.5 MPa (800 psi) and 20.7 MPa (3000 psi); Fatigue testing (ASTM E647) of the two best tensile test performers in H2 at 5.5 MPa (800 psi) and 20.7 MPa (3000 psi) with frequency = 1.0 Hz and R-ratio = 0.5 and 0.1.

  12. Automated evaluation of service oriented architecture systems: a case study

    Science.gov (United States)

    Fouad, Hesham; Gilliam, Antonio; Guleyupoglu, Suleyman; Russell, Stephen M.

    2017-05-01

    The Service Oriented Architecture (SOA) model is fast gaining dominance in how software applications are built. It allows organizations to capitalize on existing services and share data amongst distributed applications. The automatic evaluation of SOA systems poses a challenging problem due to three factors: technological complexity, organizational incompatibility, and integration into existing development pipelines. In this paper we describe our experience in developing and deploying an automated evaluation capability for the Marine Corps' Tactical Service Oriented Architecture (TSOA). We outline the technological, policy, and operational challenges we face and how we are addressing them.

  13. 75 FR 63774 - Pipeline Safety: Safety of On-Shore Hazardous Liquid Pipelines

    Science.gov (United States)

    2010-10-18

    ...-operated on the outer continental shelf (OCS); breakout tanks that receive and store hazardous liquid, but... the pipeline safety regulations have been inspected with an in-line inspection tool (i.e., a "smart...?") E.6 Should PHMSA adopt standards for conducting in-line inspections using "smart pigs," the...

  14. 77 FR 36606 - Pipeline Safety: Government/Industry Pipeline Research and Development Forum, Public Meeting

    Science.gov (United States)

    2012-06-19

    ... pipeline safety and with protecting the environment. The forum allows public, government and industry... sessions to provide introductory, panel, and summary presentations, and concurrent working group sessions...--Advancing Technology into the Market (General Session) Working Group Overview (General Session) Working...

  15. Forties field pipeline major technical project

    Energy Technology Data Exchange (ETDEWEB)

    1973-05-01

    Laying of the submarine North Sea line to connect the westerly platform in the Forties field to a landfall at Cruden Bay is a critical area in British Petroleum Co. Ltd.'s development program. The length of the route is approx. 106 miles. The 32-in. diameter, 3/4-in. wall, X65-grade pipeline will be laid for most of its length in water depths between 300 and 400 ft. The problems center on control of buoyancy and tension in the line while laying, to avoid overstressing and possible buckling. Both contracted lay barges were in position during April and work has begun.

  16. Wax deposition in crude oil pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Assuncao, Pablo Morelato; Rodrigues, Lorennzo Marrochi Nolding [Universidade Federal do Espirito Santo, Sao Mateus, ES (Brazil). Centro Universitario Norte do Espirito Santo. Engenharia de Petroleo; Romero, Mao Ilich [University of Wyoming, Laramie, WY (United States). Enhanced Oil Recovery Institute], e-mail: mromerov@uwyo.edu

    2010-07-01

    Crude oil is a complex mixture of hydrocarbons consisting of aromatics, paraffins, naphthenics, resins, asphaltenes, etc. When the temperature of crude oil is reduced, the heavy components, like paraffin, precipitate and deposit on the pipe's internal wall in the form of a wax-oil gel. The gel deposit consists of wax crystals that trap some amount of oil. As the temperature gets cooler, more wax precipitates, the thickness of the wax gel increases, and the crude gradually solidifies until eventually the oil stops moving inside the offshore pipeline; the crude may then be impossible to re-mobilize during restart. Wax deposition reduces the effective diameter, resulting in several problems: higher pressure drop, which means additional pumping energy costs; poor oil quality; the use of chemicals such as precipitation inhibitors or flow improvers; equipment failure; risk of leakage; and clogging of the ducts and process equipment. Wax deposition problems can become so severe that the whole pipeline is completely blocked, and it would cost millions of dollars to remediate an offshore pipeline blocked by wax. Wax solubility decreases drastically with decreasing temperature, so at the low temperatures encountered in deep water production, wax precipitates easily. The highest temperature below which the paraffins begin to precipitate as wax crystals is defined as the wax appearance temperature (WAT). The deposition process is a complex free-surface problem involving thermodynamics, fluid dynamics, and mass and heat transfer. In this work, a numerical analysis of wax deposition by molecular diffusion and shear dispersion mechanisms in a crude oil pipeline is presented. The diffusion flux of wax toward the wall is estimated by Fick's law of diffusion, and similarly for shear dispersion; the wax concentration gradient at the solid-liquid interface is obtained from the volume fraction conservation equation; and since the wax deposition
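
    The molecular-diffusion mechanism mentioned above is conventionally written via Fick's law. A standard form from the wax-deposition literature is sketched below; the notation is illustrative and not taken from the paper.

    ```latex
    % Molecular-diffusion contribution to wax deposition via Fick's law.
    % m_w: deposited wax mass, \rho_w: wax density, D_w: diffusivity of wax
    % in oil, A: deposition area, C: dissolved wax concentration,
    % T: temperature, r: radial coordinate (gradient evaluated at the wall).
    \begin{equation}
      \frac{dm_w}{dt}
        = \rho_w D_w A
          \left. \frac{\partial C}{\partial T}
                 \frac{\partial T}{\partial r} \right|_{\mathrm{wall}}
    \end{equation}
    ```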

  17. Crude value management through pipeline systems

    Energy Technology Data Exchange (ETDEWEB)

    Segato, R. [Suncor Energy Marketing Inc., Calgary, AB (Canada)

    2009-07-01

    This presentation reviewed Suncor's integrated oil flow operations with particular focus on the best practices in crude oil quality management from source rocks to refineries. Suncor produces synthetic crude at its operations in Fort McMurray, Alberta. The crude reaches destinations across North America. The quality of injected and delivered crude varies because of pipeline and terminal logistics, which implies changes in valuation. Refinery planners, engineers and crude traders are faced with the challenge of maximizing profitability while minimizing risk. Refiners face a continuously changing landscape in terms of crude classifications, new commodity developments, batch interferences, shared tank bottoms and sampling limitations. tabs., figs.

  18. Automation in Warehouse Development

    NARCIS (Netherlands)

    Hamberg, R.; Verriet, J.

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and

  19. Library Automation in Pakistan.

    Science.gov (United States)

    Haider, Syed Jalaluddin

    1998-01-01

    Examines the state of library automation in Pakistan. Discusses early developments; financial support by the Netherlands Library Development Project (Pakistan); lack of automated systems in college/university and public libraries; usage by specialist libraries; efforts by private-sector libraries and the National Library in Pakistan; commonly used…

  20. Library Automation Style Guide.

    Science.gov (United States)

    Gaylord Bros., Liverpool, NY.

    This library automation style guide lists specific terms and names often used in the library automation industry. The terms and/or acronyms are listed alphabetically and each is followed by a brief definition. The guide refers to the "Chicago Manual of Style" for general rules, and a notes section is included for the convenience of individual…

  1. Identity Management Processes Automation

    Directory of Open Access Journals (Sweden)

    A. Y. Lavrukhin

    2010-03-01

    Full Text Available Implementation of identity management systems consists of two main parts: consulting and automation. The consulting part includes development of a role model and a description of the identity management processes. The automation part is based on the results of the consulting part. This article describes the most important aspects of IdM implementation.

  2. Automate functional testing

    Directory of Open Access Journals (Sweden)

    Ramesh Kalindri

    2014-06-01

    Full Text Available Currently, software engineers are increasingly turning to the option of automating functional tests, but they are not always successful in this endeavor. Reasons range from poor planning to cost overruns in the process. Some principles that can guide teams in automating these tests are described in this article.

  3. Advances in inspection automation

    Science.gov (United States)

    Weber, Walter H.; Mair, H. Douglas; Jansen, Dion; Lombardi, Luciano

    2013-01-01

    This new session at QNDE reflects the growing interest in inspection automation. Our paper describes a newly developed platform that makes complex NDE automation possible without the need for software programmers. Inspection tasks that are tedious, error-prone or impossible for humans to perform can now be automated using a form of drag-and-drop visual scripting. Our work attempts to rectify the problem that NDE is not keeping pace with the rest of factory automation. Outside of NDE, robots routinely and autonomously machine parts, assemble components, weld structures and report progress to corporate databases. By contrast, components arriving in the NDT department typically require manual part handling, calibration and analysis. The automation examples in this paper cover the development of robotic thickness gauging and the use of adaptive contour following on the NRU reactor inspection at Chalk River.

  4. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  5. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    Full Text Available There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes, and archiving of results are other major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with shorter turnaround times for an ever-increasing workload. This article discusses the various issues involved in the process.

  6. Risk Analysis using Corrosion Rate Parameter on Gas Transmission Pipeline

    Science.gov (United States)

    Sasikirono, B.; Kim, S. J.; Haryadi, G. D.; Huda, A.

    2017-05-01

    In the oil and gas industry, the pipeline is a major component in the transmission and distribution of oil and gas. The distribution process sometimes takes the pipeline across various types of environmental conditions, so a pipeline should operate safely and not harm the surrounding environment. Corrosion is still a major cause of failure in equipment components of a production facility. In pipeline systems, corrosion can cause failures in the wall and damage to the pipeline, so the pipeline system requires care and periodic inspection. Every production facility in an industry carries a level of risk of damage, determined by the likelihood of damage and its consequences. The purpose of this research is to analyze the risk level of a 20-inch natural gas transmission pipeline using semi-quantitative risk-based inspection per API 581, which relates the likelihood of failure of an equipment component to the consequences of its failure; the result is then used to plan the next inspections. Nine pipeline components were examined, comprising straight inlet pipes, connection tees, and straight outlet pipes. The risk assessment of the nine components is presented in a risk matrix, and the components were found to be at medium risk levels. The failure mechanism considered in this research is thinning. From the calculated corrosion rates, the remaining life of each pipeline component was obtained; the results vary from component to component. The next step is planning the inspection of the pipeline components by external NDT methods.
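
    The remaining-life step follows from the thinning mechanism in the standard way: the time for the measured wall thickness to corrode down to the minimum required thickness. A minimal sketch, with illustrative inputs that are not taken from the paper:

    ```python
    # Remaining life under uniform thinning, as used in API 581/570-style
    # assessments. Inputs are illustrative, not the paper's values.
    def remaining_life_years(t_actual_mm: float, t_required_mm: float,
                             corrosion_rate_mm_per_yr: float) -> float:
        return (t_actual_mm - t_required_mm) / corrosion_rate_mm_per_yr

    # e.g. 9.5 mm measured, 6.4 mm required, 0.12 mm/yr corrosion rate:
    print(remaining_life_years(9.5, 6.4, 0.12))   # ~25.8 years
    ```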

  7. 78 FR 56268 - Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension

    Science.gov (United States)

    2013-09-12

    ... Pipeline and Hazardous Materials Safety Administration Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension AGENCY: Pipeline and Hazardous Materials Safety Administration, DOT... Register a notice announcing a public workshop on ``Integrity Verification Process'' which took place on...

  8. 77 FR 28331 - Standards for Business Practices for Interstate Natural Gas Pipelines

    Science.gov (United States)

    2012-05-14

    ... Natural Gas Pipelines AGENCY: Federal Energy Regulatory Commission, DOE. ACTION: Request for additional... North American Energy Standards Board (NAESB) applicable to natural gas pipelines. The Commission... American Energy Standards Board (NAESB) applicable to natural gas pipelines. The Commission, however, did...

  9. Bio-corrosion of water pipeline by sulphate-reducing bacteria in a ...

    African Journals Online (AJOL)

    esiri

    2013-11-13

    Nov 13, 2013 ... This study investigates the presence of SRB in water, in a water pipeline and in ... Key words: Sulphate-reducing bacteria, corrosion, water pipeline, biocide. INTRODUCTION ...... tubercles in distribution pipelines. J. Am. Water ...

  10. Transportation of heavy crude by pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, G.; Lachmann, J.

    1977-11-01

    The purpose of this work is to present a general summary of the operations which should be performed, from both the technical and the economic points of view, to arrive at an optimum design for a pipeline and its utilization in the transportation of heavy crude oil. The characteristics of the Boscan field crude, one of the most difficult in the world to handle, are presented. Many of the insulating coatings that have been used for covering piping or equipment that has to operate in a hot environment are named. Other parts of the study present some of the experience gained during the transport of heavy crude in cold water. The properties of the crude are listed, as well as those of some of the interior coatings, including polymers. Equations are given with which to calculate heat loss from heated lines and to determine pressure drop. Various insulation thicknesses are calculated, with selection of the most economic value. In some pipelines of considerable length which transport heated heavy crude, it is necessary to provide reheating equipment at one or more locations prior to destination.

  11. Pipeline FFT Architectures Optimized for FPGAs

    Directory of Open Access Journals (Sweden)

    Bin Zhou

    2009-01-01

    Full Text Available This paper presents optimized implementations of two different pipeline FFT processors on Xilinx Spartan-3 and Virtex-4 FPGAs. Different optimization techniques and rounding schemes were explored. The implementation results achieved better performance with lower resource usage than prior art. The 16-bit 1024-point FFT with the R22SDF architecture had a maximum clock frequency of 95.2 MHz and used 2802 slices on the Spartan-3, a throughput per area ratio of 0.034 Msamples/s/slice. The R4SDC architecture ran at 123.8 MHz and used 4409 slices on the Spartan-3, a throughput per area ratio of 0.028 Msamples/s/slice. On Virtex-4, the 16-bit 1024-point R22SDF architecture ran at 235.6 MHz and used 2256 slices, giving a 0.104 Msamples/s/slice ratio; the 16-bit 1024-point R4SDC architecture ran at 219.2 MHz and used 3064 slices, giving a 0.072 Msamples/s/slice ratio. The R22SDF was more efficient than the R4SDC in terms of throughput per area due to a simpler controller and an easier balanced rounding scheme. This paper also shows that balanced stage rounding is an appropriate rounding scheme for pipeline FFT processors.
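
    A quick arithmetic check of the quoted ratios: these pipeline architectures accept one sample per clock cycle, so throughput in Msamples/s equals the clock frequency in MHz, and each ratio is simply clock frequency divided by slice count.

    ```python
    # Reproduce the throughput-per-area figures quoted above.
    for name, f_mhz, slices in [
        ("Spartan-3 R22SDF", 95.2, 2802),
        ("Spartan-3 R4SDC", 123.8, 4409),
        ("Virtex-4 R22SDF", 235.6, 2256),
        ("Virtex-4 R4SDC", 219.2, 3064),
    ]:
        print(f"{name}: {f_mhz / slices:.3f} Msamples/s/slice")
    # Prints 0.034, 0.028, 0.104 and 0.072, matching the abstract.
    ```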

  12. A study of processes for welding pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Weston, J. (ed.)

    1991-07-01

    A review was made of existing and potential processes for welding pipelines: fusion welding (arc, electron beam, laser, thermit) and forge welding (friction, flash, magnetically impelled arc butt, upset butt, explosive, shielded active gas, gas pressure). Consideration of J-lay operations gave indications that were reflections of the status of the processes in terms of normal land and offshore S-lay operation: forge welding processes, although promising, require considerable development; fusion welding processes offer several possibilities (mechanized GMA welding likely to be used in 1991-2); laser welding requires development in all pipeline areas; a production machine for electron beam welding will involve high costs. Nondestructive testing techniques are also reviewed. Demand for faster quality assessment is being addressed by speeding radiographic film processing and through the development of real-time radiography and automatic ultrasonic testing. Conclusions on the most likely future process developments are: SMAW with cellulosic electrodes is best for tie-ins and short pipe runs; SMAW continues to be important for small-diameter lines, although mechanized GMA could be used, along with mechanical joining, MIAB, radial friction, and flash butt welding; mechanized GMA welding is likely to predominate for large diameter lines and probably will be used for the first J-lay line (other techniques could be used too); and welding of piping for station facilities involves both shop welding of sub-assemblies and on-site welding of pipe and sub-assemblies to each other (site welding uses both SMAW and GMAW). Figs, tabs.

  13. Designing integrated computational biology pipelines visually.

    Science.gov (United States)

    Jamil, Hasan M

    2013-01-01

    The long-term cost of developing and maintaining a computational pipeline that depends upon data integration and sophisticated workflow logic is too high to even contemplate "what if" or ad hoc type queries. In this paper, we introduce a novel application building interface for computational biology research, called VizBuilder, by leveraging a recent query language called BioFlow for life sciences databases. Using VizBuilder, it is now possible to develop ad hoc, complex computational biology applications at throwaway cost. The underlying query language supports data integration and workflow construction almost transparently and fully automatically, using a best-effort approach. Users express their application by drawing it with VizBuilder icons and connecting them in a meaningful way. Completed applications are compiled and translated as BioFlow queries for execution by the data management system LifeDB, for which VizBuilder serves as a front end. We discuss VizBuilder features and functionalities in the context of a real life application after we briefly introduce BioFlow. The architecture and design principles of VizBuilder are also discussed. Finally, we outline future extensions of VizBuilder. To our knowledge, VizBuilder is a unique system that allows visually designing computational biology pipelines involving distributed and heterogeneous resources in an ad hoc manner.

  14. Visualization Multi-Pipeline for Communicating Biology.

    Science.gov (United States)

    Mindek, Peter; Kouril, David; Sorger, Johannes; Toloudis, Daniel; Lyons, Blair; Johnson, Graham; Groller, M Eduard; Viola, Ivan

    2018-01-01

    We propose a system to facilitate biology communication by developing a pipeline to support the instructional visualization of heterogeneous biological data on heterogeneous user-devices. Discoveries and concepts in biology are typically summarized with illustrations assembled manually from the interpretation and application of heterogenous data. The creation of such illustrations is time consuming, which makes it incompatible with frequent updates to the measured data as new discoveries are made. Illustrations are typically non-interactive, and when an illustration is updated, it still has to reach the user. Our system is designed to overcome these three obstacles. It supports the integration of heterogeneous datasets, reflecting the knowledge that is gained from different data sources in biology. After pre-processing the datasets, the system transforms them into visual representations as inspired by scientific illustrations. As opposed to traditional scientific illustration these representations are generated in real-time - they are interactive. The code generating the visualizations can be embedded in various software environments. To demonstrate this, we implemented both a desktop application and a remote-rendering server in which the pipeline is embedded. The remote-rendering server supports multi-threaded rendering and it is able to handle multiple users simultaneously. This scalability to different hardware environments, including multi-GPU setups, makes our system useful for efficient public dissemination of biological discoveries.

  15. Using Tracer Technology to Characterize Contaminated Pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Maresca, Joseph, W., Jr., Ph.D.; Bratton, Wesley, L., Ph.D., P.E.; Dickerson, Wilhelmina; Hales, Rochelle

    2005-12-30

    The Pipeline Characterization Using Tracers (PCUT) technique uses conservative and partitioning, reactive or other interactive tracers to remotely determine the amount of contaminant within a run of piping or ductwork. The PCUT system was motivated by a method that has been successfully used to characterize subsurface soil contaminants, and is similar in operation to a gas chromatography column. By injecting a "slug" of both conservative and partitioning tracers at one end (or section) of the piping and measuring the time history of the concentration of the tracers at the other end (or another section) of the pipe, the presence, location, and amount of contaminant within the pipe or duct can be determined. The tracers are transported along the pipe or duct by a gas flow field, typically air or nitrogen, with a velocity slow enough that the partitioning tracer has time to interact with the contaminant before the tracer slug completely passes over the contaminated region. PCUT not only identifies the presence of contamination, it can also locate the contamination along the pipeline and quantify the amount of residual. PCUT can be used in support of deactivation and decommissioning (D&D) of piping and ducts that may have been contaminated with hazardous chemicals such as chlorinated solvents, petroleum products, radioactive materials, or heavy metals such as mercury.
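
    The quantification step rests on the standard partitioning-tracer relations: the retardation factor R is the ratio of partitioning to conservative tracer travel times, and the residual saturation follows as S = (R - 1)/(R - 1 + K). A minimal sketch under those standard assumptions; PCUT's own inversion may be more elaborate.

    ```python
    # Residual contaminant saturation from tracer travel times, using the
    # standard partitioning-tracer relations R = t_p / t_c and
    # S = (R - 1) / (R - 1 + K). Values below are illustrative.
    def residual_saturation(t_partitioning: float, t_conservative: float,
                            k_partition: float) -> float:
        R = t_partitioning / t_conservative   # retardation factor
        return (R - 1.0) / (R - 1.0 + k_partition)

    # e.g. partitioning tracer delayed 1.5x with partition coefficient K = 50:
    print(residual_saturation(150.0, 100.0, 50.0))   # ~0.0099, i.e. ~1%
    ```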

  16. Chimenea and other tools: Automated imaging of multi-epoch radio-synthesis data with CASA

    Science.gov (United States)

    Staley, T. D.; Anderson, G. E.

    2015-11-01

    In preparing the way for the Square Kilometre Array and its pathfinders, there is a pressing need to begin probing the transient sky in a fully robotic fashion using the current generation of radio telescopes. Effective exploitation of such surveys requires a largely automated data-reduction process. This paper introduces an end-to-end automated reduction pipeline, AMIsurvey, used for calibrating and imaging data from the Arcminute Microkelvin Imager Large Array. AMIsurvey makes use of several component libraries which have been packaged separately for open-source release. The most scientifically significant of these is chimenea, which implements a telescope-agnostic algorithm for automated imaging of pre-calibrated multi-epoch radio-synthesis data, of the sort typically acquired for transient surveys or follow-up. The algorithm aims to improve upon standard imaging pipelines by utilizing iterative RMS-estimation and automated source-detection to avoid so called 'Clean-bias', and makes use of CASA subroutines for the underlying image-synthesis operations. At a lower level, AMIsurvey relies upon two libraries, drive-ami and drive-casa, built to allow use of mature radio-astronomy software packages from within Python scripts. While targeted at automated imaging, the drive-casa interface can also be used to automate interaction with any of the CASA subroutines from a generic Python process. Additionally, these packages may be of wider technical interest beyond radio-astronomy, since they demonstrate use of the Python library pexpect to emulate terminal interaction with an external process. This approach allows for rapid development of a Python interface to any legacy or externally-maintained pipeline which accepts command-line input, without requiring alterations to the original code.
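
    The pexpect pattern mentioned above generalizes to any command-line tool. A generic illustration follows, driving the Python REPL rather than CASA (drive-casa applies the same pattern to the CASA prompt); pexpect assumes a Unix-like system.

    ```python
    # Emulate terminal interaction with an external process using pexpect,
    # the approach the paragraph above describes. Shown with the Python REPL
    # as a stand-in for a legacy command-line pipeline.
    import pexpect

    child = pexpect.spawn("python3", encoding="utf-8")
    child.expect(">>> ")           # wait for the interpreter prompt
    child.sendline("6 * 7")        # send a command as if typed by a user
    child.expect(">>> ")           # wait for the command to finish
    print(child.before.strip())    # echoed input plus output, e.g. "6 * 7 ... 42"
    child.sendline("exit()")
    child.close()
    ```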

  17. Towards Automated Design, Analysis and Optimization of Declarative Curation Workflows

    Directory of Open Access Journals (Sweden)

    Tianhong Song

    2014-10-01

    Full Text Available Data curation is increasingly important. Our previous work on a Kepler curation package has demonstrated advantages that come from automating data curation pipelines by using workflow systems. However, manually designed curation workflows can be error-prone and inefficient due to a lack of user understanding of the workflow system, misuse of actors, or human error. Correcting problematic workflows is often very time-consuming. A more proactive workflow system can help users avoid such pitfalls. For example, static analysis before execution can be used to detect the potential problems in a workflow and help the user to improve workflow design. In this paper, we propose a declarative workflow approach that supports semi-automated workflow design, analysis and optimization. We show how the workflow design engine helps users to construct data curation workflows, how the workflow analysis engine detects different design problems of workflows and how workflows can be optimized by exploiting parallelism.

  18. Automated X-ray image analysis for cargo security: Critical review and future promise.

    Science.gov (United States)

    Rogers, Thomas W; Jaccard, Nicolas; Morton, Edward J; Griffin, Lewis D

    2017-01-01

    We review the relatively immature field of automated image analysis for X-ray cargo imagery. There is increasing demand for automated analysis methods that can assist in the inspection and selection of containers, due to the ever-growing volumes of traded cargo and the increasing concerns that customs- and security-related threats are being smuggled across borders by organised crime and terrorist networks. We split the field into the classical pipeline of image preprocessing and image understanding. Preprocessing includes: image manipulation; quality improvement; Threat Image Projection (TIP); and material discrimination and segmentation. Image understanding includes: Automated Threat Detection (ATD); and Automated Contents Verification (ACV). We identify several gaps in the literature that need to be addressed and propose ideas for future research. Where the current literature is sparse we borrow from the single-view, multi-view, and CT X-ray baggage domains, which have some characteristics in common with X-ray cargo.

  19. 75 FR 4550 - Black Marlin Pipeline Company; Notice of Filing

    Science.gov (United States)

    2010-01-28

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Black Marlin Pipeline Company; Notice of Filing January 21, 2010. Take notice that on November 4, 2009, Black Marlin Pipeline Company submitted a request for a waiver of the...

  20. 76 FR 4892 - Black Marlin Pipeline Company; Notice of Filing

    Science.gov (United States)

    2011-01-27

    ... From the Federal Register Online via the Government Publishing Office ] DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Black Marlin Pipeline Company; Notice of Filing January 20, 2011. Take notice that on January 19, 2010, Black Marlin Pipeline Company submitted a request for a waiver of...

  1. Anti-Thixotropic Analysis of Pipeline Metal Losses in Welded ...

    African Journals Online (AJOL)

    This paper examines the causes of metal loss induced by cutting wear within the internal walls of pipelines, which could lead to unexpected pipeline failure and the attendant oil spillage in Nigeria. To determine the rate of wear, the flow properties were determined. Flow was found to be turbulent containing ...

  2. PIPELINE CORROSION CONTROL IN OIL AND GAS INDUSTRY: A ...

    African Journals Online (AJOL)

    ... severe mutilation of the pipeline coatings, substrates due to vandalization and coating failures. The data from cathodic protection control method from Nigeria National Petroleum Corporation (NNPC)/ Pipeline and Product Marketing Company (PPMC) for system 2A line was analyzed and it was deduced that about 10.3km ...

  3. Leadership Pipeline – en neo-weberiansk revitalisering af bureaukratiet?

    DEFF Research Database (Denmark)

    Dahl, Kristian Aagaard; Nielsen, Jeppe

    2015-01-01

    Although Leadership Pipeline has gained ground in many private and public organizations, the academic literature on the leadership model has so far been surprisingly modest. In this article we seek to home in on an answer to whether Leadership Pipeline constitutes an (old-fashioned) bureaucratic perspecti...

  4. PIPELINE CORROSION CONTROL IN OIL AND GAS INDUSTRY: A ...

    African Journals Online (AJOL)

    user

    (NNPC)/ Pipeline and Product Marketing Company (PPMC) for system 2A line was analyzed and it was deduced that about 10.3km of the pipeline was well protected and possibly fit for use and about 62.7km is experiencing under protection which means corrosion is predicted to take place in that segment in a short time ...

  5. 75 FR 45696 - Pipeline Safety: Personal Electronic Device Related Distractions

    Science.gov (United States)

    2010-08-03

    ... Pipeline and Hazardous Materials Safety Administration Pipeline Safety: Personal Electronic Device Related... personal electronic devices (PEDs) by individuals performing operations and maintenance activities on a... Electronic Devices, 75 FR 9754, May 18, 2010; Limiting the Use of Wireless Communication Devices, 75 FR 16391...

  6. 75 FR 35516 - Pipeline Safety: Request for Special Permit

    Science.gov (United States)

    2010-06-22

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF TRANSPORTATION... the Federal pipeline safety laws, PHMSA is publishing this notice of special permit request we have... seeking relief from compliance with certain plastic pipe design requirements in the Federal pipeline...

  7. 76 FR 51963 - Cobra Pipeline Ltd.; Notice of Baseline Filings

    Science.gov (United States)

    2011-08-19

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Cobra Pipeline Ltd.; Notice of Baseline Filings Take notice that on August 12, 2011, Cobra Pipeline Ltd. submitted a revised baseline filing of their Statement of Operating...

  8. 78 FR 13662 - Cobra Pipeline Ltd.; Notice of Petition

    Science.gov (United States)

    2013-02-28

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Cobra Pipeline Ltd.; Notice of Petition Take notice that on February 4, 2013, Cobra Pipeline Ltd. (Cobra) filed in Docket No. PR13-32-000 to correct a Tariff Record filed on November...

  9. Investigation of Corrosion of Buried Oil Pipeline by the Electrical ...

    African Journals Online (AJOL)

    MICHAEL

    ABSTRACT: The delineation of possible areas of corrosion along an underground oil pipeline in Ubeji,. Delta State, Nigeria was investigated using the horizontal electrical resistivity profiling technique and the. Spontaneous Potential geophysical method. The resistivity and self potential values of the soil along the pipeline ...

  10. Analytical modeling of pipeline failure in multiphase flow due to ...

    African Journals Online (AJOL)

    Pipelines could be said to be the safest and most economical means of transporting hydrocarbon fluids. Pipelines carrying oil and gas may suffer from internal corrosion when water is present. The corrosivity varies with several factors, such as temperature, total pressure, CO2 and H2S content in the gas, pH of the ...

  11. The "Learning Disabilities to Juvenile Detention" Pipeline: A Case Study

    Science.gov (United States)

    Mallett, Christopher A.

    2014-01-01

    Adolescents becoming formally involved with a juvenile court because of school-related behavior and discipline problems is a phenomenon known as the school-to-prison pipeline. Adolescents with learning disabilities are disproportionately represented within this pipeline. A study was conducted to review the outcomes for a population of youthful…

  12. Centrifuge modeling of buried continuous pipelines subjected to normal faulting

    Science.gov (United States)

    Moradi, Majid; Rojhani, Mahdi; Galandarzadeh, Abbas; Takada, Shiro

    2013-03-01

    Seismic ground faulting is the greatest hazard for continuous buried pipelines. Over the years, researchers have attempted to understand pipeline behavior mostly via numerical modeling such as the finite element method. The lack of well-documented field case histories of pipeline failure from seismic ground faulting, together with the cost and complicated facilities needed for full-scale experimental simulation, make a centrifuge-based method the best way to verify numerical approaches to the behavior of pipelines subjected to faulting. This paper presents results from three centrifuge tests designed to investigate the behavior of continuous buried steel pipelines subjected to normal faulting. The experimental setup and procedure are described, and the recorded axial and bending strains induced in a pipeline are presented and compared to those obtained via analytical methods. The influence of factors such as faulting offset, burial depth and pipe diameter on the axial and bending strains of pipes, and on ground soil failure and pipeline deformation patterns, is also investigated. Finally, the tensile rupture of a pipeline due to normal faulting is investigated.

  13. Quantitative Risk Mapping of Urban Gas Pipeline Networks Using GIS

    Science.gov (United States)

    Azari, P.; Karimi, M.

    2017-09-01

    Natural gas is considered an important source of energy in the world. With the growth of urbanization, the urban gas pipelines that carry natural gas from transmission pipelines to consumers are becoming a dense network, and this increasing density raises the probability of serious accidents in urban areas. Such accidents have a catastrophic effect on people and their property. Within the next few years, risk mapping will become an important component in urban planning and the management of large cities, helping to decrease the probability of accidents and to control them. It is therefore important to assess risk values and determine their locations on an urban map using an appropriate method. In past risk analyses of urban natural gas pipeline networks, pipelines have always been considered one by one, and their density in the urban area has not been considered. The aim of this study is to determine the effect of several pipelines on the risk value at a specific grid point. This paper outlines a quantitative risk assessment method for analysing the risk of urban natural gas pipeline networks. It consists of two main parts: failure rate calculation, where the EGIG historical data are used, and fatal length calculation, which involves calculating the gas release and the fatality rate of the consequences. We consider jet fire, fireball and explosion when investigating the consequences of gas pipeline failure. The outcome of this method is an individual risk, shown as a risk map.
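
    The two-part calculation described above can be sketched as an accumulation over every nearby pipeline segment and failure scenario: individual risk at a grid point is the sum of (failure rate) x (fatal length) x (scenario probability). The rates, lengths and probabilities below are placeholders, not EGIG values.

    ```python
    # Sketch of per-grid-point individual risk accumulated over pipeline
    # segments and scenarios. All numbers are placeholders.
    def individual_risk(segments):
        """segments: iterable of (failure_rate_per_km_yr, fatal_length_km,
        scenario_probability) tuples affecting one grid point."""
        return sum(rate * length * p for rate, length, p in segments)

    point_segments = [
        (3.0e-4, 0.15, 0.20),   # jet fire on segment A (placeholder)
        (3.0e-4, 0.30, 0.05),   # fireball on segment A (placeholder)
        (1.2e-4, 0.10, 0.20),   # jet fire on segment B (placeholder)
    ]
    print(f"{individual_risk(point_segments):.2e} fatalities/yr")
    ```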

  14. QUANTITATIVE RISK MAPPING OF URBAN GAS PIPELINE NETWORKS USING GIS

    Directory of Open Access Journals (Sweden)

    P. Azari

    2017-09-01

Full Text Available Natural gas is considered an important source of energy in the world. With the increasing growth of urbanization, the urban gas pipelines that transmit natural gas from transmission pipelines to consumers are becoming a dense network. This increase in pipeline density raises the probability of serious accidents in urban areas. Such accidents have a catastrophic effect on people and their property. Within the next few years, risk mapping will become an important component in the urban planning and management of large cities, in order to decrease the probability of accidents and to control them. It is therefore important to assess risk values and locate them on the urban map using an appropriate method. In the history of risk analysis of urban natural gas pipeline networks, pipelines have always been considered one by one, and their density in the urban area has not been taken into account. The aim of this study is to determine the effect of several pipelines on the risk value at a specific grid point. This paper outlines a quantitative risk assessment method for analysing the risk of urban natural gas pipeline networks. It consists of two main parts: failure rate calculation, which uses the EGIG historical data, and fatal length calculation, which involves calculating the gas release and the fatality rate of the consequences. We consider jet fire, fireball and explosion when investigating the consequences of gas pipeline failure. The outcome of this method is an individual risk, shown as a risk map.

  15. 78 FR 23972 - Pipeline Safety: Information Collection Activities

    Science.gov (United States)

    2013-04-23

    ... on an information collection under Office of Management and Budget (OMB) Control No. 2137-0047...: 1-202-493-2251. Mail: Docket Management Facility; U.S. Department of Transportation (DOT), 1200 New... API 1130 ``Computational Pipeline Monitoring for Liquid Pipelines'' (API 1130). API 1130 section 4.2...

  16. Pipeline river crossing studies; Estudos de travessias de dutos

    Energy Technology Data Exchange (ETDEWEB)

    Leal, Marcos de Castro; Stasiak, Luciano; Silva, Alessandra de Barros e [ESTEIO Engenharia e Aerolevantamentos S.A., Curitiba, PR (Brazil)

    2005-07-01

The objective of this work is to guide the execution of studies of pipeline crossings under watercourses, channels, flooded areas and reservoirs. The study seeks minimal risk to the pipelines, whether anthropic, geotechnical, hydraulic or environmental, in both the implantation and operation phases. It also seeks to determine the best construction method for the crossing in question. (author)

  17. Assessment of Millennium Pipeline Project Lake Erie Crossing

    Science.gov (United States)

    2000-08-01

linear fit to the Ramberg–Osgood formulation, ε = (σ/E)[1 + α(σ/σ_y)^(n−1)], where ε is the strain, σ is the applied stress, E is the elastic modulus, α...pipeline are all considered. The finite-element analyses were conducted using ABAQUS/Standard. The soil/pipeline interaction model (Fig. 6.1) was
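As a quick illustration of the stress–strain relation quoted above, here is a minimal Python sketch of the Ramberg–Osgood strain; the elastic modulus, yield stress, α and n values are generic placeholders for a pipeline steel, not the constants fitted in the report.

```python
# Ramberg-Osgood strain from the abstract's formulation:
#     eps = (sigma / E) * (1 + alpha * (sigma / sigma_y) ** (n - 1))
# Material constants below are illustrative values for a pipeline steel,
# not those used in the Lake Erie crossing assessment.

def ramberg_osgood_strain(sigma, E=207e9, sigma_y=415e6, alpha=1.0, n=15.0):
    """Total strain for an applied stress `sigma` (all stresses in Pa)."""
    return (sigma / E) * (1.0 + alpha * (sigma / sigma_y) ** (n - 1.0))

print(ramberg_osgood_strain(300e6))   # well below yield: essentially elastic
print(ramberg_osgood_strain(450e6))   # above yield: the plastic term dominates
```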

  18. Common Data Analysis Pipeline | Office of Cancer Clinical Proteomics Research

    Science.gov (United States)

CPTAC supports analyses of the mass spectrometry raw data (mapping of spectra to peptide sequences and protein identification) for the public using a Common Data Analysis Pipeline (CDAP). The data types available on the public portal are described below. A general overview of this pipeline can be downloaded here.

  19. Pipelined CPU Design with FPGA in Teaching Computer Architecture

    Science.gov (United States)

    Lee, Jong Hyuk; Lee, Seung Eun; Yu, Heon Chang; Suh, Taeweon

    2012-01-01

    This paper presents a pipelined CPU design project with a field programmable gate array (FPGA) system in a computer architecture course. The class project is a five-stage pipelined 32-bit MIPS design with experiments on the Altera DE2 board. For proper scheduling, milestones were set every one or two weeks to help students complete the project on…
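Since the abstract describes the classic five-stage MIPS pipeline (IF, ID, EX, MEM, WB), a minimal timing-diagram sketch may help readers unfamiliar with the concept. The Python below ignores hazards and stalls, and the instructions are placeholders; it is an illustration of the idea, not part of the course materials.

```python
# A minimal timing sketch of a five-stage MIPS pipeline: each instruction
# enters one cycle after its predecessor and moves through the stages in
# order. Hazards, forwarding and stalls are deliberately ignored.

STAGES = ["IF", "ID", "EX", "MEM", "WB"]

def pipeline_diagram(instructions):
    """Print which stage each instruction occupies in each clock cycle."""
    total_cycles = len(instructions) + len(STAGES) - 1
    for i, instr in enumerate(instructions):
        row = ["  . "] * total_cycles          # '.' = instruction not in flight
        for s, stage in enumerate(STAGES):
            row[i + s] = f"{stage:>4}"
        print(f"{instr:<18}" + "".join(row))

pipeline_diagram(["lw $t0,0($a0)", "add $t1,$t0,$t2", "sw $t1,4($a0)"])
```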

  20. 77 FR 74276 - Pipeline Safety: Information Collection Activities

    Science.gov (United States)

    2012-12-13

    ..., titled ``Integrity Management in High Consequence Areas for Operators of Hazardous Liquid Pipelines....regulations.gov , including any personal information provided. You should know that anyone is able to search..., titled: ``Integrity Management in High Consequence Areas for Operators of Hazardous Liquid Pipelines...

  1. 77 FR 27279 - Pipeline Safety: Information Collection Activities

    Science.gov (United States)

    2012-05-09

    ... (OMB) for renewal. The information collections relate to the pipeline integrity management requirements... without change to http://www.regulations.gov , including any personal information provided. You should... on the following information collections: 1. Title: Pipeline Integrity Management in High Consequence...

  2. Surface wave propagation effects on buried segmented pipelines

    Directory of Open Access Journals (Sweden)

    Peixin Shi

    2015-08-01

Full Text Available This paper deals with surface wave propagation (WP) effects on buried segmented pipelines. Both a simplified analytical model and a finite element (FE) model are developed for estimating the axial joint pullout movement of jointed concrete cylinder pipelines (JCCPs), whose joints have a brittle tensile failure mode under surface WP effects. The models account for the effects of peak ground velocity (PGV), WP velocity, the predominant period of seismic excitation, shear transfer between soil and pipelines, axial stiffness of pipelines, joint characteristics, and the cracking strain of concrete mortar. FE simulation of the JCCP interaction with surface waves recorded during the 1985 Michoacan earthquake results in joint pullout movement that is consistent with field observations. The models are extended to estimate the joint axial pullout movement of cast iron (CI) pipelines, whose joints have a ductile tensile failure mode. A simplified analytical equation and an FE model are developed for estimating the joint pullout movement of CI pipelines, which is mainly affected by the variability of the joint tensile capacity and accumulates at local weak joints in the pipeline.

  3. 77 FR 26822 - Pipeline Safety: Verification of Records

    Science.gov (United States)

    2012-05-07

    ... Pipeline and Hazardous Materials Safety Administration Pipeline Safety: Verification of Records AGENCY..., verifiable, and complete. If such a document and records search, review, and verification cannot be satisfactorily completed, the operator cannot rely on this method for calculating MAOP or MOP and must instead...

  4. 76 FR 4103 - ETC Tiger Pipeline, LLC; Notice of Filing

    Science.gov (United States)

    2011-01-24

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF ENERGY Federal Energy Regulatory Commission ETC Tiger Pipeline, LLC; Notice of Filing January 13, 2011. Take notice that on December 30, 2010, ETC Tiger Pipeline, LLC submitted a request for a waiver of the reporting...

  5. Evaluating the movement of active faults on buried pipelines | Parish ...

    African Journals Online (AJOL)

During an earthquake, a buried pipeline may experience extreme loading as a result of relatively large displacement of the ground along the pipe. Large ground movements can occur through faulting, liquefaction, lateral spreading, landslides, and slope failures. Since the pipelines are widely spread, and in ...

  6. Examination of faults active motion on buried pipelines | Parish ...

    African Journals Online (AJOL)

During an earthquake, a buried pipeline may experience severe loading as a result of relatively large ground displacement along the pipe. Large ground movements may occur through faulting, liquefaction, lateral spreading, landslides and slope failures. Since the pipelines are widely spread, and in some areas ...

  7. 335 A Modular, Multimodality Integrative Pipeline for Neurosurgery Simulation and Visualization.

    Science.gov (United States)

    Costa, Anthony Beardsworth; Bederson, Joshua B

    2016-08-01

The practice of pre- and intraoperative interactive visualization and modeling continues to grow as its value to clinical practice is augmented by new technologies, such as virtual and augmented reality or 3D printing. Current tools that extract the necessary structural information from medical imaging modalities and allow virtual or other interrogation of the data are either difficult to use in a practical clinical setting or sufficiently simple as to limit the knowledge available to the operator. Nonetheless, the broader medical visualization and simulation communities have invented tools that enable automated segmentation and interrogation of structures critical to the success of surgery, such as cranial nerves, vasculature, and cortical and subcortical parcellations. We leverage these tools as inputs to a novel pipeline for neurosurgery simulation. Our pipeline is compatible with atlas-based subcortical volumetric segmentation (eg, FreeSurfer, ANTS), or any structural input in mesh- or voxel-based formats, together with volumetric data. The visualizer, based on VTK7's OpenGL3x rendering backend, is efficient enough to display an arbitrary number of input structures or volumes at interactive refresh rates. Structures can be manipulated by adjusting parameters for each structure independently (eg, color, opacity). Standard atlas-based and ITK/VTK-based tools are included in the pipeline directly. Also included is a novel volumetric shift-based segmentation tool, allowing an operating scientist to rapidly and easily include information detailing aberrant pathologies, with minimal semantic information. We demonstrate these tools for a variety of cases, including tumor, vascular, hemorrhagic stroke, and spine cases. Performance is sufficient for the pipeline to run and be used on a laptop computer, with capabilities for preoperative planning through 3D printing of the generated structures. We find that repurposing the power of existing segmentation tools within a novel modular, multimodal

  8. Systematic review automation technologies

    Science.gov (United States)

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors for the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage the optimized workflow will lead to system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  9. Efficiency improvements in pipeline transportation systems. Technical report, Task 3

    Energy Technology Data Exchange (ETDEWEB)

    Banks, W. F.; Horton, J. H.

    1977-01-01

    This report identifies those potential energy-conservative pipeline innovations that are most energy- and cost-effective, and formulates recommendations for the R, D, and D programs needed to exploit those opportunities. From a candidate field of over twenty classes of efficiency improvements, eight systems are recommended for pursuit. Most of these possess two highly important attributes: large potential energy savings and broad applicability outside the pipeline industry. The R, D, and D program for each improvement and the recommended immediate next step are described. The eight programs recommended for pursuit are: gas-fired combined-cycle compressor station; internally cooled internal combustion engine; methanol-coal slurry pipeline; methanol-coal slurry-fired and coal-fired engines; indirect-fired coal-burning combined-cycle pump station; fuel-cycle pump station; internal coatings in pipelines; and drag-reducing additives in liquid pipelines.

  10. Energy study of pipeline transportation systems. Executive summary. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Banks, W. F.

    1977-12-31

    The basic objectives of the overall study were to (1) characterize the pipeline industry and understand its energy consumption in each of the five major pipeline-industry segments: gas, oil, slurry, fresh water, and waste water; (2) identify opportunities for energy conservation in the pipeline industry, and to recommend the necessary R, D, and D programs to exploit those opportunities; (3) characterize and understand the influence of the Federal government on introduction of energy conservative innovations into the pipeline industry; and (4) assess the future potential of the pipeline industry for growth and for contribution to the national goal of energy conservation. This project final report is an executive summary presenting the results from the seven task reports.

  11. CFD analysis of onshore oil pipelines in permafrost

    Science.gov (United States)

    Nardecchia, Fabio; Gugliermetti, Luca; Gugliermetti, Franco

    2017-07-01

Underground pipelines are built all over the world, and knowledge of their thermal interaction with the soil is crucial for their design. This paper uses 2D steady-state CFD simulations to study the "thermal influenced zone" produced by a buried pipeline and the parameters that can influence its extent, with the aim of improving the design of new pipelines in permafrost. In order to represent a real case, the study refers to the Eastern Siberia–Pacific Ocean Oil Pipeline at the three stations of Mo'he, Jiagedaqi and Qiqi'har. Different burial depths and diameters of the pipe are analyzed; the simulation results show that the effect of the oil pipeline diameter on the thermal field increases with the distance from the starting station.
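To make the idea of a "thermal influenced zone" concrete, here is a minimal 2D steady-state conduction sketch in Python/NumPy: Laplace's equation is relaxed on a soil cross-section with the pipe wall held at the oil temperature. The grid size, temperatures and geometry are illustrative assumptions, not the CFD setup or station data from the paper.

```python
import numpy as np

# A minimal 2D steady-state conduction sketch of the thermal influenced
# zone around a buried hot-oil pipe. Homogeneous soil, fixed-temperature
# boundaries, Gauss-Seidel relaxation of Laplace's equation.

nx, ny = 60, 40                           # grid points (x: horizontal, y: depth)
T_surface, T_oil, T_soil = -5.0, 40.0, 0.0   # degrees C (illustrative)
T = np.full((ny, nx), T_soil)
T[0, :] = T_surface                       # ground surface boundary
pipe = (slice(18, 22), slice(28, 32))     # buried pipe cross-section (cells)

for _ in range(5000):
    T[pipe] = T_oil                       # pipe wall held at oil temperature
    T[1:-1, 1:-1] = 0.25 * (T[2:, 1:-1] + T[:-2, 1:-1] +
                            T[1:-1, 2:] + T[1:-1, :-2])
T[pipe] = T_oil

# Extent of the influenced zone, e.g. cells warmed by more than 1 K:
print(int((T > T_soil + 1.0).sum()), "cells inside the thermal influenced zone")
```

Enlarging the pipe slice or moving it deeper shows qualitatively how diameter and burial depth change the zone's extent, which is the sensitivity the paper studies quantitatively.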

  12. Automated Inspection of Power Line Corridors to Measure Vegetation Undercut Using Uav-Based Images

    Science.gov (United States)

    Maurer, M.; Hofer, M.; Fraundorfer, F.; Bischof, H.

    2017-08-01

    Power line corridor inspection is a time consuming task that is performed mostly manually. As the development of UAVs made huge progress in recent years, and photogrammetric computer vision systems became well established, it is time to further automate inspection tasks. In this paper we present an automated processing pipeline to inspect vegetation undercuts of power line corridors. For this, the area of inspection is reconstructed, geo-referenced, semantically segmented and inter class distance measurements are calculated. The presented pipeline performs an automated selection of the proper 3D reconstruction method for on the one hand wiry (power line), and on the other hand solid objects (surrounding). The automated selection is realized by performing pixel-wise semantic segmentation of the input images using a Fully Convolutional Neural Network. Due to the geo-referenced semantic 3D reconstructions a documentation of areas where maintenance work has to be performed is inherently included in the distance measurements and can be extracted easily. We evaluate the influence of the semantic segmentation according to the 3D reconstruction and show that the automated semantic separation in wiry and dense objects of the 3D reconstruction routine improves the quality of the vegetation undercut inspection. We show the generalization of the semantic segmentation to datasets acquired using different acquisition routines and to varied seasons in time.
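The inter-class distance measurement at the end of the pipeline reduces, in essence, to nearest-neighbor queries between the reconstructed classes. The following Python sketch uses SciPy's cKDTree on synthetic point sets standing in for the wiry (power line) and solid (vegetation) reconstructions; the geometry and clearance threshold are assumptions for illustration only.

```python
import numpy as np
from scipy.spatial import cKDTree

# A minimal sketch of the inter-class distance step: given 3D points
# labelled by the semantic segmentation, measure how close vegetation
# comes to the conductors. Synthetic stand-in data below.

rng = np.random.default_rng(0)
power_line = np.c_[np.linspace(0, 100, 200),       # a roughly straight wire
                   np.zeros(200),
                   np.full(200, 15.0)]              # 15 m above ground
vegetation = rng.uniform([0, -10, 0], [100, 10, 12], size=(5000, 3))

tree = cKDTree(power_line)
dist, _ = tree.query(vegetation)   # distance of each vegetation point to the wire
clearance = 5.0                    # required undercut clearance in metres (assumed)
violations = vegetation[dist < clearance]
print(f"minimum clearance: {dist.min():.2f} m, "
      f"points in violation: {len(violations)}")
```

Because the reconstructions are geo-referenced, each violating point directly yields a maintenance location, which is the documentation aspect the abstract highlights.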

  13. AUTOMATED INSPECTION OF POWER LINE CORRIDORS TO MEASURE VEGETATION UNDERCUT USING UAV-BASED IMAGES

    Directory of Open Access Journals (Sweden)

    M. Maurer

    2017-08-01

Full Text Available Power line corridor inspection is a time-consuming task that is performed mostly manually. As the development of UAVs has made huge progress in recent years and photogrammetric computer vision systems have become well established, it is time to further automate inspection tasks. In this paper we present an automated processing pipeline to inspect the vegetation undercut of power line corridors. For this, the area of inspection is reconstructed, geo-referenced and semantically segmented, and inter-class distance measurements are calculated. The presented pipeline automatically selects the proper 3D reconstruction method for wiry objects (the power lines) on the one hand and solid objects (the surroundings) on the other. The automated selection is realized by performing pixel-wise semantic segmentation of the input images using a Fully Convolutional Neural Network. Because the semantic 3D reconstructions are geo-referenced, documentation of the areas where maintenance work has to be performed is inherently included in the distance measurements and can be extracted easily. We evaluate the influence of the semantic segmentation on the 3D reconstruction and show that the automated semantic separation of the 3D reconstruction into wiry and dense objects improves the quality of the vegetation undercut inspection. We show that the semantic segmentation generalizes to datasets acquired with different acquisition routines and in different seasons.

  14. ViennaNGS: A toolbox for building efficient next- generation sequencing analysis pipelines.

    Science.gov (United States)

    Wolfinger, Michael T; Fallmann, Jörg; Eggenhofer, Florian; Amman, Fabian

    2015-01-01

Recent achievements in next-generation sequencing (NGS) technologies have led to a high demand for reusable software components to easily compile customized analysis workflows for big genomics data. We present ViennaNGS, an integrated collection of Perl modules focused on building efficient pipelines for NGS data processing. It comes with functionality for extracting and converting features from common NGS file formats, computation and evaluation of read mapping statistics, and normalization of RNA abundance. Moreover, ViennaNGS provides software components for the identification and characterization of splice junctions from RNA-seq data, parsing and condensing sequence motif data, automated construction of Assembly and Track Hubs for the UCSC genome browser, and wrapper routines for a set of commonly used NGS command line tools.

  15. PGP: parallel prokaryotic proteogenomics pipeline for MPI clusters, high-throughput batch clusters and multicore workstations.

    Science.gov (United States)

    Tovchigrechko, Andrey; Venepally, Pratap; Payne, Samuel H

    2014-05-15

We present the first public release of our proteogenomic annotation pipeline. We have previously used our original, unreleased implementation to improve the annotation of 46 diverse prokaryotic genomes by discovering novel genes and post-translational modifications and by correcting erroneous annotations through analysis of proteomic mass-spectrometry data. This public version has been redesigned to run in a wide range of parallel Linux computing environments and is provided with automated configuration, build and testing facilities for easy deployment and portability. Source code is freely available from https://bitbucket.org/andreyto/proteogenomics under the GPL license. It is implemented in Python and C++ and bundles the Makeflow engine to execute the workflows. atovtchi@jcvi.org.

  16. On-Site School Library Automation: Automation Anywhere with Laptops.

    Science.gov (United States)

    Gunn, Holly; Oxner, June

    2000-01-01

    Four years after the Halifax Regional School Board was formed through amalgamation, over 75% of its school libraries were automated. On-site automation with laptops was a quicker, more efficient way of automating than sending a shelf list to the Technical Services Department. The Eastern Shore School Library Automation Project was a successful…

  17. AGA: Interactive pipeline for reproducible gene expression and DNA methylation data analyses [version 2; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Michael Considine

    2015-10-01

Full Text Available Automated Genomics Analysis (AGA) is an interactive program to analyze high-throughput genomic data sets on a variety of platforms. An easy-to-use, point-and-click, guided pipeline is implemented to combine, define, and compare datasets, and to customize their outputs. In contrast to other automated programs, AGA enables flexible selection of sample groups for comparison from complex sample annotations. Batch correction techniques are also included, further enabling the combination of datasets from diverse studies in this comparison. AGA also allows users to save plots, tables, data, and log files containing key portions of the R script run, for reproducible analyses. The link between the interface and R supports collaborative research, enabling advanced R users to extend preliminary analyses generated by bioinformatics novices.

  18. FIGENIX: Intelligent automation of genomic annotation: expertise integration in a new software platform

    Directory of Open Access Journals (Sweden)

    Pontarotti Pierre

    2005-08-01

Full Text Available Abstract Background Two of the main objectives of the genomic and post-genomic era are to structurally and functionally annotate genomes, which consists of detecting the position and structure of genes and inferring their function (as well as other features of genomes). Structural and functional annotation both require the complex chaining of numerous different software tools, algorithms and methods under the supervision of a biologist. The automation of these pipelines is necessary to manage the huge amounts of data released by sequencing projects. Several pipelines already automate some of this complex chaining but still require an important contribution from biologists for supervising and controlling the results at various steps. Results Here we propose an innovative automated platform, FIGENIX, which includes an expert system capable of substituting for human expertise at several key steps. FIGENIX currently automates complex pipelines of structural and functional annotation under the supervision of the expert system (which allows, for example, key decisions to be made, intermediate results to be checked or the dataset to be refined). The quality of the results produced by FIGENIX is comparable to that obtained by expert biologists, with a drastic gain in time and the avoidance of errors due to human manipulation of data. Conclusion The core engine and expert system of the FIGENIX platform currently handle complex annotation processes of broad interest for the genomic community. They could easily be adapted to new or more specialized pipelines, such as the annotation of miRNAs, the classification of complex multigenic families, the annotation of regulatory elements and other genomic features of interest.

  19. Data as a Service: A Seismic Web Service Pipeline

    Science.gov (United States)

    Martinez, E.

    2016-12-01

Publishing data as a service pipeline provides an improved, dynamic approach over static data archives. A service pipeline is a collection of micro web services that each perform a specific task and expose the results of that task. Structured request/response formats allow micro web services to be chained together into a service pipeline that provides more complex results. The U.S. Geological Survey adopted service pipelines to publish seismic hazard and design data supporting both specific and generalized audiences. The seismic web service pipeline starts at source data and exposes probabilistic and deterministic hazard curves, response spectra, risk-targeted ground motions, and seismic design provision metadata. This pipeline supports public and private organizations and individual engineers and researchers. Publishing data as a service pipeline provides a variety of benefits. Exposing the component services enables advanced users to inspect or use the data at each processing step. Exposing a composite service gives new users quick access to published data with a very low barrier to entry. Advanced users may re-use micro web services by chaining them in new ways or injecting new micro services into the pipeline. This allows users to test hypotheses and compare their results to published results. Exposing data at each step in the pipeline enables users to review and validate the data and process more quickly and accurately. Making the source code open source, per USGS policy, further enables this transparency. Each micro service may be scaled independently of any other micro service. This ensures data remains available and timely in a cost-effective manner regardless of load. Additionally, if a new or more efficient approach to processing the data is discovered, it may replace the old approach at any time, keeping the pipeline running while not affecting other micro services.
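The chaining of micro web services can be illustrated with a short sketch. In the Python below, the service root, endpoint paths and JSON fields are hypothetical placeholders, not the actual USGS web service API; the point is only that one service's structured response feeds the next service's request.

```python
import requests

# A minimal sketch of chaining micro web services into a pipeline: the
# output of a (hypothetical) hazard-curve service feeds a (hypothetical)
# design-value service. Endpoints and JSON fields are placeholders.

BASE = "https://example.org/ws"   # placeholder service root, not a real URL

def fetch_json(path, **params):
    """One pipeline stage: call a micro service and return its JSON result."""
    resp = requests.get(f"{BASE}/{path}", params=params, timeout=30)
    resp.raise_for_status()
    return resp.json()

# Stage 1: one micro service exposes hazard curves for a site...
curves = fetch_json("hazard-curves", latitude=34.05, longitude=-118.25)

# Stage 2: ...and its structured response is piped into the next service.
design = fetch_json("design-values", curves=curves["id"])

print(design)
```

Because each stage is an independent HTTP service, any stage can be inspected, replaced or scaled on its own, which is exactly the benefit the abstract emphasizes.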

  20. A risk assessment model for pipelines exposed to geohazards

    Energy Technology Data Exchange (ETDEWEB)

    Esford, F; Porter, M.; Savigny, K.W. [BGC Engineering Inc., Vancouver, BC (Canada); Muhlbauer, W.K. [WKM Consultancy, Swindon (United Kingdom); Dunlop, C. [Transredes (Bolivia)

    2004-07-01

The challenges facing Transporte de Hidrocarburos Sociedad Anonima (Transredes) in maintaining and protecting its aging pipeline infrastructure were discussed. Transredes currently operates 2,744 km of liquids pipelines and 2,920 km of gas pipelines in Bolivia, across terrain that is subject to earthquakes, floods and landslides. The 4 to 36 inch diameter pipelines are 40 to 50 years old. A quantitative risk assessment procedure was developed to rank the threats to the pipelines and identify the locations facing the highest level of risk. The purpose was to prioritize capital and maintenance activities based on risk management principles. The pilot study customized the risk assessment procedures to address the OSSA-1 pipeline's elevated exposure to geohazards. It was shown that the probability of geohazard-related pipeline failure is very site specific and can vary over several orders of magnitude. Twelve sites received high annual probability-of-failure estimates due to geohazards; these were among the highest ratings along the pipeline for any hazard type. Although other hazard types also showed high probability-of-failure scores, many were related to information gaps or uncertainties about the condition of the pipeline. The following 3 initiatives have been identified for the initial OSSA-1 risk mitigation plan: geohazard mitigation; pipe strength uncertainty resolution; and general uncertainty reduction. Transredes is using the information from this risk assessment to set capital and maintenance budgets and to develop programs to improve the operational safety of the OSSA-1 pipeline. 14 refs., 1 tab., 7 figs.

  1. Automated PCB Inspection System

    Directory of Open Access Journals (Sweden)

    Syed Usama BUKHARI

    2017-05-01

Full Text Available Development of an automated PCB inspection system meeting the needs of industry is a challenging task. In this paper a case study is presented to exhibit a proposed system for the migration from a manual PCB inspection system to an automated PCB inspection system, with minimal intervention in the existing production flow, for a leading automotive manufacturing company. A detailed design of the system, based on computer vision and followed by testing and analysis, was proposed in order to aid the manufacturer in the process of automation.
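A common starting point for computer-vision PCB inspection is comparing a captured board image against a known-good reference. The sketch below is a minimal illustration of that idea using OpenCV; the file names, threshold and minimum defect area are assumptions for illustration, not details from the case study (a production system would add image alignment, lighting normalization, and so on).

```python
import cv2
import numpy as np

# A minimal computer-vision inspection sketch: difference a captured board
# image against a "golden" reference and flag regions that differ.
# File names and thresholds are illustrative placeholders.

reference = cv2.imread("golden_board.png", cv2.IMREAD_GRAYSCALE)
captured = cv2.imread("inspected_board.png", cv2.IMREAD_GRAYSCALE)

diff = cv2.absdiff(reference, captured)                  # pixel-wise difference
_, mask = cv2.threshold(diff, 40, 255, cv2.THRESH_BINARY)
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN,            # suppress speckle noise
                        np.ones((3, 3), np.uint8))

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
defects = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 25]
print(f"{len(defects)} candidate defect region(s): {defects}")
```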

  2. Chef infrastructure automation cookbook

    CERN Document Server

    Marschall, Matthias

    2013-01-01

    Chef Infrastructure Automation Cookbook contains practical recipes on everything you will need to automate your infrastructure using Chef. The book is packed with illustrated code examples to automate your server and cloud infrastructure.The book first shows you the simplest way to achieve a certain task. Then it explains every step in detail, so that you can build your knowledge about how things work. Eventually, the book shows you additional things to consider for each approach. That way, you can learn step-by-step and build profound knowledge on how to go about your configuration management

  3. Automation synthesis modules review.

    Science.gov (United States)

    Boschi, S; Lodi, F; Malizia, C; Cicoria, G; Marengo, M

    2013-06-01

The introduction of (68)Ga-labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived (68)Ge/(68)Ga generator has been at the basis of the development of (68)Ga radiopharmacy. The huge increase in clinical demand, the impact of regulatory issues and the careful radioprotection of operators have pushed for extensive automation of the production process. The development of automated systems for (68)Ga radiochemistry, different engineering and software strategies, and post-processing of the eluate are discussed, along with the impact of regulations on automation. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. 76 FR 54531 - Pipeline Safety: Potential for Damage to Pipeline Facilities Caused by the Passage of Hurricanes

    Science.gov (United States)

    2011-09-01

    ... Facilities Caused by the Passage of Hurricanes AGENCY: Pipeline and Hazardous Materials Safety Administration... to pipeline facilities caused by the passage of Hurricanes. ADDRESSES: This document can be viewed on...-related issues that can result from the passage of hurricanes. That includes the potential for damage to...

  5. Problematizing the STEM Pipeline Metaphor: Is the STEM Pipeline Metaphor Serving Our Students and the STEM Workforce?

    Science.gov (United States)

    Cannady, Matthew A.; Greenwald, Eric; Harris, Kimberly N.

    2014-01-01

    Researchers and policy makers often use the metaphor of an ever-narrowing pipeline to describe the trajectory to a science, technology, engineering or mathematics (STEM) degree or career. This study interrogates the appropriateness of the STEM pipeline as the dominant frame for understanding and making policies related to STEM career trajectories.…

  6. A general pipeline for the development of anchor markers for comparative genomics in plants

    Directory of Open Access Journals (Sweden)

    Stougaard Jens

    2006-08-01

Full Text Available Abstract Background Complete or near-complete genomic sequence information is presently only available for a few plant species representing a large phylogenetic diversity among plants. In order to effectively transfer this information to species lacking sequence information, comparative genomic tools need to be developed. Molecular markers permitting cross-species mapping along co-linear genomic regions are central to comparative genomics. These "anchor" markers, defining unique loci in genetic linkage maps of multiple species, are gene-based and possess a number of features that make them relatively sparse. To identify potential anchor marker sequences more efficiently, we have established an automated bioinformatic pipeline that combines multi-species Expressed Sequence Tag (EST) and genome sequence data. Results Taking advantage of sequence data from related species, the pipeline identifies evolutionarily conserved sequences that are likely to define unique orthologous loci in most species of the same phylogenetic clade. The key features are the identification of evolutionarily conserved sequences followed by automated design of intron-flanking Polymerase Chain Reaction (PCR) primer pairs. Polymorphisms can subsequently be identified by size- or sequence variation of PCR products, amplified from mapping parents or populations. We illustrate our procedure in legumes and grasses and exemplify its application in legumes, where model plant studies and the genome- and EST-sequence data available have a potential impact on the breeding of crop species and on our understanding of the evolution of this large and diverse family. Conclusion We provide a database of 459 candidate anchor loci which have the potential to serve as map anchors in more than 18,000 legume species, a number of which are of agricultural importance. For grasses, the database contains 1335 candidate anchor loci. Based on this database, we have evaluated 76 candidate anchor loci
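The pipeline's first key step, identifying evolutionarily conserved sequences across species, can be illustrated with a toy example. The Python sketch below scans a small multiple alignment for highly conserved windows; the alignment, window size and identity threshold are illustrative assumptions, and the real pipeline additionally designs intron-flanking PCR primers around such regions.

```python
# A minimal sketch of conserved-region detection: score each alignment
# column for identity across species, then report windows whose columns
# are (almost) all conserved. Toy alignment, not real EST data.

alignment = [
    "ATGGCTA-CGTTGCAAGTC",
    "ATGGCAA-CGTTGCTAGTC",
    "ATGGCTATCGTTGCAAGTC",
]

def conserved_windows(seqs, window=6, min_identity=0.9):
    """Yield (start, end) of windows whose columns are mostly identical."""
    length = min(len(s) for s in seqs)
    col_ok = [len({s[i] for s in seqs}) == 1 and seqs[0][i] != "-"
              for i in range(length)]
    for start in range(length - window + 1):
        if sum(col_ok[start:start + window]) / window >= min_identity:
            yield start, start + window

print(list(conserved_windows(alignment)))   # e.g. [(8, 14)] for the toy data
```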

  7. Multi-atlas segmentation of subcortical brain structures via the AutoSeg software pipeline

    Science.gov (United States)

    Wang, Jiahui; Vachet, Clement; Rumple, Ashley; Gouttard, Sylvain; Ouziel, Clémentine; Perrot, Emilie; Du, Guangwei; Huang, Xuemei; Gerig, Guido; Styner, Martin

    2014-01-01

Automated segmentation and labeling of individual brain anatomical regions in MRI is challenging, due to individual structural variability. Although atlas-based segmentation has shown its potential for both tissue and structure segmentation, due to the inherent natural variability as well as disease-related changes in MR appearance, a single atlas image is often inappropriate to represent the full population of datasets processed in a given neuroimaging study. As an alternative to single-atlas segmentation, the use of multiple atlases alongside label fusion techniques has been introduced, using a set of individual "atlases" that encompasses the expected variability in the studied population. In our study, we proposed a multi-atlas segmentation scheme with a novel graph-based atlas selection technique. We first paired and co-registered all atlases and the subject MR scans. A directed graph with edge weights based on intensity and shape similarity between all MR scans is then computed. The set of neighboring templates is selected via clustering of the graph. Finally, weighted majority voting is employed to create the final segmentation over the selected atlases. This multi-atlas segmentation scheme is used to extend a single-atlas-based segmentation toolkit entitled AutoSeg, which is an open-source, extensible C++ based software pipeline employing BatchMake for its pipeline scripting, developed at the Neuro Image Research and Analysis Laboratories of the University of North Carolina at Chapel Hill. AutoSeg performs N4 intensity inhomogeneity correction, rigid registration to a common template space, automated brain tissue classification based skull-stripping, and the multi-atlas segmentation. The multi-atlas-based AutoSeg has been evaluated on subcortical structure segmentation with a testing dataset of 20 adult brain MRI scans and 15 atlas MRI scans. The AutoSeg achieved mean Dice coefficients of 81.73% for the subcortical structures
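The final label-fusion step, weighted majority voting over the selected atlases, is easy to sketch. In the Python below, the label arrays and similarity weights are toy values; the real AutoSeg pipeline derives its weights from the graph-based atlas selection described above.

```python
import numpy as np

# A minimal sketch of weighted majority voting for multi-atlas label
# fusion: each atlas proposes a label per voxel, and votes are weighted
# by that atlas's similarity to the subject. Toy values throughout.

def weighted_majority_vote(atlas_labels, weights, n_labels):
    """atlas_labels: (n_atlases, n_voxels) int array of proposed labels.
    weights: per-atlas similarity weights. Returns fused labels per voxel."""
    votes = np.zeros((n_labels, atlas_labels.shape[1]))
    for labels, w in zip(atlas_labels, weights):
        votes[labels, np.arange(labels.size)] += w   # accumulate weighted votes
    return votes.argmax(axis=0)

atlas_labels = np.array([[0, 1, 2, 2],    # atlas 1's labels for 4 voxels
                         [0, 1, 1, 2],    # atlas 2
                         [0, 2, 2, 2]])   # atlas 3
weights = [0.5, 0.3, 0.2]                 # e.g. image-similarity scores
print(weighted_majority_vote(atlas_labels, weights, n_labels=3))  # -> [0 1 2 2]
```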

  8. Vegetation assessment in a pipeline influence area: the case study of PETROBRAS ammonia pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Basbaum, Marcos A.; Porciano, Patricia P.; Bonafini, Fabio L. [SEEBLA - Servicos de Engenharia Emilio Baumgart Ltda., Rio de Janeiro, RJ (Brazil)], e-mail: mbasbaum.seebla@petrobras.com.br, e-mail: patriciapp.seebla@petrobras.com.br, e-mail: bonafini.seebla@petrobras.com.br; Guimaraes, Ricardo Z.P.; Torggler, Bianca F.; Fernandes, Renato; Vieira, Elisa D.R. [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil)], e-mail: rzaluar@petrobras.com.br, e-mail: torggler@petrobras.com.br, e-mail: renatofer@petrobras.com.br, e-mail: elisav@petrobras.com.br

    2009-12-19

This ammonia pipeline is about 30 km long and links the Fertilizer Plant (FAFEN-BA) to the Urea Marine Terminal (TMU) at the Port of Aratu in Candeias (Bahia State, Brazil). In this study, we characterize the remnants of vegetation and quantify the Permanent Preservation Areas. Furthermore, we propose areas and techniques for their recovery and/or management. The methodology was based on the Rapid Ecological Assessment, which combines selection of areas through remote sensing image analysis with rapid field campaigns at the selected points. This methodology, successfully applied in PETROBRAS refineries, is applied for the first time in a pipeline influence area. During these campaigns, the main aspects of vegetation, such as phytophysiognomy and ecological succession stages, were registered in field data sheets prepared for this purpose. The most representative remnants of vegetation that could be quantified were Atlantic Forest fragments, as well as those in the Permanent Preservation Areas. (author)

  9. Mortise terrorism on the main pipelines

    Science.gov (United States)

    Komarov, V. A.; Nigrey, N. N.; Bronnikov, D. A.; Nigrey, A. A.

    2018-01-01

The aim of this work is to analyze the effectiveness of the methods, proposed in the article, for the physical protection of main pipelines from "mortise terrorism". A mathematical model has been developed that makes it possible to predict the dynamics of "mortise terrorism" in the short term. An analysis of the effectiveness of the proposed physical protection methods for preventing unauthorized impacts on the objects under investigation is given. A variant of a video analytics system has been developed that can detect intruders and recognize the types of work they perform at a distance of 150 meters against complex natural backgrounds and in precipitation. The probability of detection is 0.959.

  10. ANALYSIS ON TECHNOLOGICAL PROCESSES CLEANING OIL PIPELINES

    Directory of Open Access Journals (Sweden)

    Mariana PǍTRAŞCU

    2015-05-01

Full Text Available This paper presents research concerning the technological processes of cleaning oil pipelines. Several technologies and materials are known for cleaning sludge deposits, iron and manganese oxides, dross, stone, etc. from the inner walls of drinking water or industrial pipes. In the oil industry, methods for removing waste materials from pipes and from liquid and gas transport networks have long been known to be tedious and expensive operations. The main methods and associated problems can be summarized as follows: (1) blowing with compressed air; (2) manual or mechanical brushing, or wet or dry sanding; (3) washing with a high-pressure water jet, solvent or chemical solution to remove stone and hard deposits; (4) combined methods using cleaning machines with water jets, cutters, chains, rotary cutter heads, etc.

  11. Key Design Properties for Shipping Information Pipeline

    DEFF Research Database (Denmark)

    Jensen, Thomas; Tan, Yao-Hua

    2015-01-01

    . The contribution of the paper is to expand previous research with complementary key design properties. The paper starts with a review of existing literature on previous proposed solutions for increased collaboration in the supply chain for international trade, Inter-Organization Systems and Information......This paper reports on the use of key design properties for development of a new approach towards a solution for sharing shipping information in the supply chain for international trade. Information exchange in international supply chain is extremely inefficient, rather uncoordinated, based largely...... on paper, e-mail, phone and text message, and far too costly. This paper explores the design properties for a shared information infrastructure to exchange information between all parties in the supply chain, commercial parties as well as authorities, which is called a Shipping Information Pipeline...

  12. Fixed automated spray technology.

    Science.gov (United States)

    2011-04-19

This research project evaluated the construction and performance of Boschung's Fixed Automated Spray Technology (FAST) system. The FAST system automatically sprays de-icing material on the bridge when icing conditions are about to occur. The FA...

  13. Automated Vehicles Symposium 2014

    CERN Document Server

    Beiker, Sven; Road Vehicle Automation 2

    2015-01-01

    This paper collection is the second volume of the LNMOB series on Road Vehicle Automation. The book contains a comprehensive review of current technical, socio-economic, and legal perspectives written by experts coming from public authorities, companies and universities in the U.S., Europe and Japan. It originates from the Automated Vehicle Symposium 2014, which was jointly organized by the Association for Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Burlingame, CA, in July 2014. The contributions discuss the challenges arising from the integration of highly automated and self-driving vehicles into the transportation system, with a focus on human factors and different deployment scenarios. This book is an indispensable source of information for academic researchers, industrial engineers, and policy makers interested in the topic of road vehicle automation.

  14. Hydrometeorological Automated Data System

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Office of Hydrologic Development of the National Weather Service operates HADS, the Hydrometeorological Automated Data System. This data set contains the last 48...

  15. SUGOI: automated ontology interchangeability

    CSIR Research Space (South Africa)

    Khan, ZC

    2015-04-01

Full Text Available: Automated Ontology Interchangeability. Zubeida Casmod Khan and C. Maria Keet. Abstract: A foundational ontology can solve interoperability issues among the domain ontologies aligned to it. However, several foundational ontologies have been developed, hence...

  16. I-94 Automation FAQs

    Data.gov (United States)

    Department of Homeland Security — In order to increase efficiency, reduce operating costs and streamline the admissions process, U.S. Customs and Border Protection has automated Form I-94 at air and...

  17. Automated Vehicles Symposium 2015

    CERN Document Server

    Beiker, Sven

    2016-01-01

    This edited book comprises papers about the impacts, benefits and challenges of connected and automated cars. It is the third volume of the LNMOB series dealing with Road Vehicle Automation. The book comprises contributions from researchers, industry practitioners and policy makers, covering perspectives from the U.S., Europe and Japan. It is based on the Automated Vehicles Symposium 2015 which was jointly organized by the Association of Unmanned Vehicle Systems International (AUVSI) and the Transportation Research Board (TRB) in Ann Arbor, Michigan, in July 2015. The topical spectrum includes, but is not limited to, public sector activities, human factors, ethical and business aspects, energy and technological perspectives, vehicle systems and transportation infrastructure. This book is an indispensable source of information for academic researchers, industrial engineers and policy makers interested in the topic of road vehicle automation.

  18. PODPS: HST Pipeline to the Universe

    Science.gov (United States)

    Schultz, A. B.; Parsons, S. B.; Swade, D. A.; Dempsey, R. C.; Giovane, E. A.; Kochte, M. C.; Rosenthal, E.; Scott, J. F.; Slowinski, S. E.

    1994-05-01

    The Post Observation Data Processing System (PODPS) Branch, Space Telescope Science Institute, Baltimore, MD is responsible for processing and quality evaluation of all science observations of the Hubble Space Telescope (HST). PODPS staff maintain the Routine Science Data Processing (RSDP) pipeline and are responsible for submitting to the HST archive the science observations, engineering, astrometry, orbit definition files, and other data files generated by OSS (Observation Support System) and SPSS (Science Planning and Scheduling System). Since the launch of HST in April 1990, over 50,000 readouts have been processed, 97% of these within two days of execution. Each HST science observation (WFPC2, FOC, FOS, GHRS) is received at the Data Capture Facility (DCF), GSFC. DCF performs telemetry bit-error correction and transmits the data to PODPS. RSDP pipeline software sorts the data by observation, inserts fill packets as needed, and examines the data structure for errors. These data are converted into a Generic Edited Information Set (GEIS) and calibrated. The GEIS FITS-like header is stored separate from the data array. About 1-2% of observations require some repair usually with standard procedures to correct erroneous bits or keywords. Science observations are represented in a variety of output products: film, gray scale laser printer image, or laser printer plots. The observers can use STSDAS and updated reference files to recalibrate their data. A PODPS OA (Operations Astronomer) performs the quality evaluation of each observation and files a PODPS Data Quality Report (PDQ file). The PDQ file contains a keyword to indicate the overall quality (OK, UNDEREXP, UNKNOWN, ...), a general quality comment for the HST archive, and comments from OSS. DSOB (Data Systems Operations Branch) staff write observations to FITS tape(s) and mail the tape(s) plus OSS and PODPS comments to the PI.

  19. Disassembly automation automated systems with cognitive abilities

    CERN Document Server

    Vongbunyong, Supachai

    2015-01-01

    This book presents a number of aspects to be considered in the development of disassembly automation, including the mechanical system, vision system and intelligent planner. The implementation of cognitive robotics increases the flexibility and degree of autonomy of the disassembly system. Disassembly, as a step in the treatment of end-of-life products, can allow the recovery of embodied value left within disposed products, as well as the appropriate separation of potentially-hazardous components. In the end-of-life treatment industry, disassembly has largely been limited to manual labor, which is expensive in developed countries. Automation is one possible solution for economic feasibility. The target audience primarily comprises researchers and experts in the field, but the book may also be beneficial for graduate students.

  20. CLOTU: An online pipeline for processing and clustering of 454 amplicon reads into OTUs followed by taxonomic annotation

    Directory of Open Access Journals (Sweden)

    Shalchian-Tabrizi Kamran

    2011-05-01

Full Text Available Abstract Background The implementation of high throughput sequencing for exploring biodiversity poses high demands on bioinformatics applications for automated data processing. Here we introduce CLOTU, an online and open access pipeline for processing 454 amplicon reads. CLOTU has been constructed to be highly user-friendly and flexible, since different types of analyses are needed for different datasets. Results In CLOTU, the user can filter out low quality sequences, trim tags, primers, adaptors, perform clustering of sequence reads, and run BLAST against NCBInr or a customized database in a high performance computing environment. The resulting data may be browsed in a user-friendly manner and easily forwarded to downstream analyses. Although CLOTU is specifically designed for analyzing 454 amplicon reads, other types of DNA sequence data can also be processed. A fungal ITS sequence dataset generated by 454 sequencing of environmental samples is used to demonstrate the utility of CLOTU. Conclusions CLOTU is a flexible and easy to use bioinformatics pipeline that includes different options for filtering, trimming, clustering and taxonomic annotation of high throughput sequence reads. Some of these options are not included in comparable pipelines. CLOTU is implemented in a Linux computer cluster and is freely accessible to academic users through the Bioportal web-based bioinformatics service (http://www.bioportal.uio.no).
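As an illustration of the kind of read preparation CLOTU performs before clustering, here is a minimal pure-Python sketch of quality filtering and 5' primer trimming for a FASTQ file; the primer sequence, quality cut-off and file name are illustrative assumptions, and CLOTU's actual filtering options are considerably richer.

```python
# A minimal sketch of amplicon read preparation: drop reads with low mean
# quality and trim a 5' primer before clustering. Plain FASTQ parsing;
# primer, threshold and file name are illustrative placeholders.

PRIMER = "GTGCCAGC"          # hypothetical amplicon primer
MIN_MEAN_QUAL = 25           # mean Phred score cut-off (assumed)

def filtered_reads(path):
    with open(path) as fh:
        while True:
            header = fh.readline().strip()
            if not header:
                break
            seq = fh.readline().strip()
            fh.readline()                          # '+' separator line
            qual = fh.readline().strip()
            scores = [ord(c) - 33 for c in qual]   # Phred+33 encoding
            if sum(scores) / len(scores) < MIN_MEAN_QUAL:
                continue                           # discard low-quality read
            if seq.startswith(PRIMER):             # trim primer if present
                seq = seq[len(PRIMER):]
            yield header, seq

for name, seq in filtered_reads("reads.fastq"):
    print(name, seq[:40])
```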

  1. Instant Sikuli test automation

    CERN Document Server

    Lau, Ben

    2013-01-01

Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. A concise guide written in an easy-to-follow style using the Starter guide approach. This book is aimed at automation and testing professionals who want to use Sikuli to automate GUIs. Some Python programming experience is assumed.

  2. Automated security management

    CERN Document Server

    Al-Shaer, Ehab; Xie, Geoffrey

    2013-01-01

    In this contributed volume, leading international researchers explore configuration modeling and checking, vulnerability and risk assessment, configuration analysis, and diagnostics and discovery. The authors equip readers to understand automated security management systems and techniques that increase overall network assurability and usability. These constantly changing networks defend against cyber attacks by integrating hundreds of security devices such as firewalls, IPSec gateways, IDS/IPS, authentication servers, authorization/RBAC servers, and crypto systems. Automated Security Managemen

  3. Automated Lattice Perturbation Theory

    Energy Technology Data Exchange (ETDEWEB)

    Monahan, Christopher

    2014-11-01

    I review recent developments in automated lattice perturbation theory. Starting with an overview of lattice perturbation theory, I focus on the three automation packages currently "on the market": HiPPy/HPsrc, Pastor and PhySyCAl. I highlight some recent applications of these methods, particularly in B physics. In the final section I briefly discuss the related, but distinct, approach of numerical stochastic perturbation theory.

  4. Automated training for algorithms that learn from genomic data.

    Science.gov (United States)

    Cilingir, Gokcen; Broschat, Shira L

    2015-01-01

    Supervised machine learning algorithms are used by life scientists for a variety of objectives. Expert-curated public gene and protein databases are major resources for gathering data to train these algorithms. While these data resources are continuously updated, generally, these updates are not incorporated into published machine learning algorithms which thereby can become outdated soon after their introduction. In this paper, we propose a new model of operation for supervised machine learning algorithms that learn from genomic data. By defining these algorithms in a pipeline in which the training data gathering procedure and the learning process are automated, one can create a system that generates a classifier or predictor using information available from public resources. The proposed model is explained using three case studies on SignalP, MemLoci, and ApicoAP in which existing machine learning models are utilized in pipelines. Given that the vast majority of the procedures described for gathering training data can easily be automated, it is possible to transform valuable machine learning algorithms into self-evolving learners that benefit from the ever-changing data available for gene products and to develop new machine learning algorithms that are similarly capable.
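The proposed model of operation, in which training-data gathering and learning are both automated, can be illustrated with a short sketch. The Python below uses scikit-learn as a stand-in learner; the fetch_training_data stub, features and labels are hypothetical placeholders for the database-querying step the paper describes, not code from the paper.

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# A minimal sketch of a self-evolving learner: re-gather training data
# from a curated resource, then re-train. The data-gathering stub and
# toy features/labels below are hypothetical placeholders.

def fetch_training_data():
    """Placeholder for the automated gathering step, e.g. downloading the
    latest labelled entries from a public gene/protein database."""
    X = [[0.1, 2.3], [0.4, 1.1], [0.9, 0.2], [0.8, 0.1]]  # toy features
    y = [0, 0, 1, 1]                                       # toy labels
    return X, y

def retrain():
    X, y = fetch_training_data()            # step 1: automated data gathering
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    print("CV accuracy:", cross_val_score(model, X, y, cv=2).mean())
    return model.fit(X, y)                  # step 2: automated learning

classifier = retrain()   # re-run on a schedule so the model tracks the database
```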

  5. Metrology automation reliability

    Science.gov (United States)

    Chain, Elizabeth E.

    1996-09-01

At Motorola's MOS-12 facility automated measurements on 200-mm diameter wafers proceed in a hands-off 'load-and-go' mode requiring only wafer loading, measurement recipe loading, and a 'run' command for processing. Upon completion of all sample measurements, the data is uploaded to the factory's data collection software system via a SECS II interface, eliminating the requirement of manual data entry. The scope of in-line measurement automation has been extended to the entire metrology scheme from job file generation to measurement and data collection. Data analysis and comparison to part specification limits is also carried out automatically. Successful integration of automated metrology into the factory measurement system requires that automated functions, such as autofocus and pattern recognition algorithms, display a high degree of reliability. In the 24-hour factory, reliability data can be collected automatically on every part measured. This reliability data is then uploaded to the factory data collection software system at the same time as the measurement data. Analysis of the metrology reliability data permits improvements to be made as needed, and provides an accurate accounting of automation reliability. This reliability data has so far been collected for the CD-SEM (critical dimension scanning electron microscope) metrology tool, and examples are presented. This analysis method can be applied to such automated in-line measurements as CD, overlay, particle and film thickness measurements.

  6. Fully automated pipeline for detection of sex linked genes using RNA-Seq data

    Czech Academy of Sciences Publication Activity Database

    Michalovová, Monika; Kubát, Zdeněk; Hobza, Roman; Vyskot, Boris; Kejnovský, Eduard

    2015-01-01

Vol. 16, No. 78 (2015) ISSN 1471-2105 R&D Projects: GA ČR(CZ) GBP501/12/G090; GA MŠk(CZ) LM2010005 Institutional support: RVO:68081707 Keywords: SILENE-LATIFOLIA * RUMEX-ACETOSA * Y-CHROMOSOME Subject RIV: BO - Biophysics; EF - Botanics (UEB-Q) Impact factor: 2.435, year: 2015

  7. The DevOps 2.0 toolkit automating the continuous deployment pipeline with containerized microservices

    CERN Document Server

    Farcic, Viktor

    2016-01-01

This book is about different techniques that help us architect software in a better and more efficient way, with microservices packed as immutable containers, tested and deployed continuously to servers that are automatically provisioned with configuration management tools. It's about fast, reliable and continuous deployments with zero downtime and the ability to roll back. It's about scaling to any number of servers, the design of self-healing systems capable of recovering from both hardware and software failures, and about centralized logging and monitoring of the cluster. In other words, this book covers the whole microservices development and deployment lifecycle using some of the latest and greatest practices and tools. We'll use Docker, Kubernetes, Ansible, Ubuntu, Docker Swarm and Docker Compose, Consul, etcd, Registrator, confd, and so on. We'll go through many practices and even more tools. Finally, while there will be a lot of theory, this is a hands-on book. You won't be able to complete it by reading it ...

  8. Development and Evaluation of an Automated Annotation Pipeline and cDNA Annotation System

    OpenAIRE

    Kasukawa, Takeya; Furuno, Masaaki; Nikaido, Itoshi; Bono, Hidemasa; Hume, David A.; Bult, Carol; Hill, David P; Baldarelli, Richard; Gough, Julian; Kanapin, Alexander; Matsuda, Hideo; Schriml, Lynn M.; Hayashizaki, Yoshihide; Okazaki, Yasushi; Quackenbush, John

    2003-01-01

    Manual curation has long been held to be the “gold standard” for functional annotation of DNA sequence. Our experience with the annotation of more than 20,000 full-length cDNA sequences revealed problems with this approach, including inaccurate and inconsistent assignment of gene names, as well as many good assignments that were difficult to reproduce using only computational methods. For the FANTOM2 annotation of more than 60,000 cDNA clones, we developed a number of methods and tools ...

  9. NEAT: a framework for building fully automated NGS pipelines and analyses.

    Science.gov (United States)

    Schorderet, Patrick

    2016-02-01

The analysis of next generation sequencing (NGS) data has become a standard task for many laboratories in the life sciences. Though several tools exist to support users in the manipulation of such datasets on various levels, few are built on the basis of vertical integration. Here, we present the NExt generation Analysis Toolbox (NEAT), which allows non-expert users, including wet-lab scientists, to comprehensively build, run and analyze NGS data through double-clickable executables without the need for any programming experience. In comparison to many publicly available tools, including Galaxy, NEAT provides three main advantages: (1) through the development of double-clickable executables, NEAT is efficient; (2) NEAT is run on the institution's cluster; (3) NEAT allows users to visualize and summarize NGS data rapidly and efficiently using various built-in exploratory data analysis tools, including metagenomic and differentially expressed gene analysis. To simplify control of the workflow, NEAT projects are built around a unique and centralized file containing sample names, replicates, conditions, antibodies, alignment, filtering and peak-calling parameters, as well as cluster-specific paths and settings. Moreover, the small files produced by NEAT allow users to easily manipulate, consolidate and share datasets from different users and institutions. NEAT provides biologists and bioinformaticians with a robust, efficient and comprehensive tool for the analysis of massive NGS datasets. Frameworks such as NEAT not only allow novice users to overcome the increasing number of technical hurdles due to the complexity of manipulating large datasets, but provide more advanced users with tools that ensure high reproducibility standards in the NGS era. NEAT is publicly available at https://github.com/pschorderet/NEAT.

  10. SV-AUTOPILOT: optimized, automated construction of structural variation discovery and benchmarking pipelines

    NARCIS (Netherlands)

    W.Y. Leung; T. Marschall (Tobias); Y. Paudel; L. Falquet; H. Mei (Hailiang); A. Schönhuth (Alexander); T.Y. Maoz

    2015-01-01

    Background: Many tools exist to predict structural variants (SVs), utilizing a variety of algorithms. However, they have largely been developed and tested on human germline or somatic (e.g. cancer) variation. It seems appropriate to exploit this wealth of technology available for humans

  11. VoICE: A semi-automated pipeline for standardizing vocal analysis across models

    National Research Council Canada - National Science Library

    Burkett, Zachary D; Day, Nancy F; Peñagarikano, Olga; Geschwind, Daniel H; White, Stephanie A

    2015-01-01

    .... Here, we present VoICE (Vocal Inventory Clustering Engine), an approach to grouping vocal elements by creating a high dimensionality dataset through scoring spectral similarity between all vocalizations within a recording session...

  12. VirusDetect: An automated pipeline for efficient virus discovery using deep sequencing of small RNAs

    Science.gov (United States)

    Accurate detection of viruses in plants and animals is critical for agriculture production and human health. Deep sequencing and assembly of virus-derived siRNAs has proven to be a highly efficient approach for virus discovery. However, to date no computational tools specifically designed for both k...

  13. Condition Monitoring Of Operating Pipelines With Operational Modal Analysis Application

    Directory of Open Access Journals (Sweden)

    Mironov Aleksey

    2015-12-01

    In the petroleum, natural gas and petrochemical industries, great attention is paid to the safety, reliability and maintainability of equipment. There are a number of technologies to monitor, control, and maintain gas, oil, water, and sewer pipelines. The paper focuses on the application of operational modal analysis (OMA) for condition monitoring of operating pipelines. Special focus is on the suitability of OMA for identifying the dynamic features of a pipeline (frequencies and mode shapes) in operation. The research was conducted using two operating laboratory models imitating a part of an operating pipeline. The results of finite-element modeling, identification of pipe natural modes and their modification under the influence of a virtual failure are discussed. The work considers the results of experimental research on the dynamic behavior of the operating pipe models using one of the OMA techniques, comparing the dynamic properties with the modeled data. The study results demonstrate the sensitivity of modal shape parameters to modification of the operating pipeline's technical state. Two strategies of pipeline repair, with continuous condition-based monitoring using the proposed technology and without such monitoring, were discussed. Markov chain reliability models for each strategy were analyzed, and the reliability improvement factor of the proposed monitoring technology compared with the traditional one was evaluated. It is concluded that the condition of an operating pipeline can be monitored by measuring dynamic deformations of the operating pipe and applying OMA techniques for the extraction of dynamic properties.
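
    As a rough illustration of the Markov-chain comparison mentioned above, the sketch below computes steady-state unavailability for a three-state (healthy/degraded/failed) pipeline model under the two repair strategies. All transition probabilities are invented for illustration; they are not taken from the paper.

    ```python
    import numpy as np

    def steady_state(P):
        """Stationary distribution of a discrete-time Markov chain."""
        n = P.shape[0]
        # Solve pi P = pi together with the normalization sum(pi) = 1.
        A = np.vstack([P.T - np.eye(n), np.ones(n)])
        b = np.zeros(n + 1)
        b[-1] = 1.0
        pi, *_ = np.linalg.lstsq(A, b, rcond=None)
        return pi

    # States: 0 = healthy, 1 = degraded, 2 = failed. Probabilities per
    # inspection interval are illustrative assumptions only.
    P_monitored = np.array([[0.990, 0.010, 0.000],
                            [0.800, 0.195, 0.005],   # degradation caught early
                            [0.900, 0.000, 0.100]])
    P_unmonitored = np.array([[0.990, 0.010, 0.000],
                              [0.100, 0.850, 0.050], # damage grows undetected
                              [0.500, 0.000, 0.500]])

    unavail_m = steady_state(P_monitored)[2]
    unavail_u = steady_state(P_unmonitored)[2]
    print(f"unavailability, monitored:   {unavail_m:.5f}")
    print(f"unavailability, unmonitored: {unavail_u:.5f}")
    print(f"improvement factor: {unavail_u / unavail_m:.1f}x")
    ```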

  14. Achieving Efficiency in Gas Pipeline Connection: Evidence from Ghana

    Directory of Open Access Journals (Sweden)

    Anthony Kudjo Gborgenu

    2016-06-01

    The demand for natural gas as an energy source is on the increase. Natural gas transportation requires a continuous pipeline network from the source of gas across long distances to the various destinations. The main objective is to extend gas pipelines from Takoradi to all the regional capital towns in Ghana to meet the growing demands of its citizenry, providing economy and efficiency with regard to cost and environmental sustainability by developing a straightforward method of locating pipeline facilities and designing pipeline networks. The problem is formulated as a network of distances and the solution is presented based on Prim's algorithm for minimum connections. Data on distances were obtained from the Ghana Highways Authority. The total distance covered by the pipeline network if the existing road networks were used from Takoradi to all the regional capital towns in Ghana is 5,094 km. After Prim's algorithm was applied, the total distance covered decreased to 1,590 km, a reduction of about 68.8%, with corresponding savings in cost and in the environmental damage caused by pipeline construction (soil, forest, rivers, wetlands, noise from compressor stations during pipeline discharge and risk of pipeline leakage).
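
    The record above reduces a road-distance network to a minimum spanning tree with Prim's algorithm. A minimal sketch of that idea follows; the graph and distances are illustrative placeholders, not the Ghana Highways Authority data used in the study.

    ```python
    import heapq

    def prim_mst(graph, start):
        """Return total length and edges of a minimum spanning tree.

        graph: dict mapping node -> list of (neighbour, distance_km) pairs.
        """
        visited = {start}
        frontier = [(d, start, nbr) for nbr, d in graph[start]]
        heapq.heapify(frontier)
        tree, total = [], 0
        while frontier and len(visited) < len(graph):
            d, u, v = heapq.heappop(frontier)
            if v in visited:
                continue
            visited.add(v)
            tree.append((u, v, d))
            total += d
            for nbr, w in graph[v]:
                if nbr not in visited:
                    heapq.heappush(frontier, (w, v, nbr))
        return total, tree

    # Illustrative distances only -- not the study's actual data.
    network = {
        "Takoradi":  [("Accra", 218), ("Kumasi", 242)],
        "Accra":     [("Takoradi", 218), ("Kumasi", 250), ("Koforidua", 85)],
        "Kumasi":    [("Takoradi", 242), ("Accra", 250), ("Koforidua", 180)],
        "Koforidua": [("Accra", 85), ("Kumasi", 180)],
    }
    length, edges = prim_mst(network, "Takoradi")
    print(length, edges)   # 483 km vs. 795 km if every pair were connected
    ```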

  15. The goldfields gas pipeline: opening a new frontier

    Energy Technology Data Exchange (ETDEWEB)

    Ride, B.M. [Goldfields Gas Transmission Joint Venture, West Perth, WA (Australia)

    1996-12-31

    Gas exploration and development in the northwest Pilbara in Western Australia has increased due to the commitment of the Goldfields Gas Transmission (GGT) Joint Venture to a 1,380 km gas pipeline linking the north west Pilbara to the east Pilbara iron ore region and the northern and central Goldfields. Major new mining prospects in these highly prospective minerals provinces also offer potential for increased gas demand and GGT pipeline throughput. This paper describes the GGT pipeline and the access arrangements for its use. It has provided a new focus for gas exploration and development in the Carnarvon Basin, North West Shelf, Western Australia, with the East Spar and Harriet Joint Ventures increasing efforts to find reserves to satisfy this new market. The commercial arrangements for the GGT pipeline services are the first in Australia to be offered under the open access arrangements espoused by the Australian Government and Western Australian Government, and have set a benchmark for other pipelines in Australia. The innovative distance-related pipeline tariff arrangements offer prospective gas shippers a simple method for evaluating use of the GGT pipeline and securing gas transmission services. (author). 1 tab., 3 figs.

  16. Structural reliability analysis applied to pipeline risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gardiner, M. [GL Industrial Services, Loughborough (United Kingdom); Mendes, Renato F.; Donato, Guilherme V.P. [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil)

    2009-07-01

    Quantitative Risk Assessment (QRA) of pipelines requires two main components to be provided. These are models of the consequences that follow from some loss-of-containment incident, and models for the likelihood of such incidents occurring. This paper describes how PETROBRAS has used Structural Reliability Analysis for the second of these, to provide pipeline- and location-specific predictions of failure frequency for a number of pipeline assets. This paper presents an approach to estimating failure rates for liquid and gas pipelines, using Structural Reliability Analysis (SRA) to analyze the credible basic mechanisms of failure such as corrosion and mechanical damage. SRA is a probabilistic limit state method: for a given failure mechanism it quantifies the uncertainty in the parameters of mathematical models of the load-resistance state of a structure and then evaluates the probability of load exceeding resistance. SRA can be used to benefit the pipeline risk management process by optimizing in-line inspection schedules, and as part of the design process for new construction in pipeline rights of way that already contain multiple lines. A case study is presented to show how the SRA approach has recently been used on PETROBRAS pipelines and the benefits obtained from it. (author)
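
    For readers unfamiliar with the limit-state formulation, the sketch below estimates the probability of load exceeding resistance by simple Monte Carlo sampling. The distributions and parameter values are invented for illustration and are not PETROBRAS's models.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000

    # Illustrative distributions only: hoop stress ("load") and remaining
    # strength of a corroded section ("resistance"), both in MPa.
    load = rng.normal(loc=180.0, scale=20.0, size=n)
    resistance = rng.lognormal(mean=np.log(300.0), sigma=0.10, size=n)

    # Limit state g = R - L; failure when g < 0.
    pof = np.mean(resistance - load < 0.0)
    print(f"estimated probability of failure per section: {pof:.2e}")
    ```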

  17. Assessment of Common and Emerging Bioinformatics Pipelines for Targeted Metagenomics.

    Science.gov (United States)

    Siegwald, Léa; Touzet, Hélène; Lemoine, Yves; Hot, David; Audebert, Christophe; Caboche, Ségolène

    2017-01-01

    Targeted metagenomics, also known as metagenetics, is a high-throughput sequencing application focusing on a nucleotide target in a microbiome to describe its taxonomic content. A wide range of bioinformatics pipelines are available to analyze sequencing outputs, and the choice of an appropriate tool is crucial and not trivial. No standard evaluation method exists for estimating the accuracy of a pipeline for targeted metagenomics analyses. This article proposes an evaluation protocol containing real and simulated targeted metagenomics datasets, and adequate metrics allowing us to study the impact of different variables on the biological interpretation of results. This protocol was used to compare six different bioinformatics pipelines in the basic user context: three common ones (mothur, QIIME and BMP) based on a clustering-first approach and three emerging ones (Kraken, CLARK and One Codex) using an assignment-first approach. This study surprisingly reveals that the effect of sequencing errors has a bigger impact on the results than the choice of amplified region. Moreover, increasing sequencing throughput increases richness overestimation, even more so for microbiota of high complexity. Finally, the choice of the reference database has a bigger impact on richness estimation for clustering-first pipelines, and on correct taxa identification for assignment-first pipelines. Using emerging assignment-first pipelines is a valid approach for targeted metagenomics analyses, with a quality of results comparable to popular clustering-first pipelines, even with an error-prone sequencing technology like Ion Torrent. However, those pipelines are highly sensitive to the quality of databases and their annotations, which makes clustering-first pipelines still the only reliable approach for studying microbiomes that are not well described.

  18. WARP (workflow for automated and rapid production): a framework for end-to-end automated digital print workflows

    Science.gov (United States)

    Joshi, Parag

    2006-02-01

    The publishing industry is experiencing a major paradigm shift with the advent of digital publishing technologies. A large number of components in the publishing and print production workflow are transformed in this shift. However, the process as a whole requires a great deal of human intervention for decision making and for resolving exceptions during job execution. Furthermore, a majority of the best-of-breed applications for publishing and print production are intrinsically designed and developed to be driven by humans. Thus, the human-intensive nature of the current prepress process accounts for a very significant amount of the overhead costs in fulfillment of jobs on press. It is a challenge to automate the functionality of applications built with the model of human-driven execution. Another challenge is to orchestrate various components in the publishing and print production pipeline such that they work in a seamless manner to enable the system to perform automatic detection of potential failures and take corrective actions in a proactive manner. Thus, there is a great need for a coherent and unifying workflow architecture that streamlines the process and automates it as a whole in order to create an end-to-end digital automated print production workflow that does not involve any human intervention. This paper describes an architecture and building blocks that lay the foundation for a plurality of automated print production workflows.

  19. Automated pulmonary lobar ventilation measurements using volume-matched thoracic CT and MRI

    Science.gov (United States)

    Guo, F.; Svenningsen, S.; Bluemke, E.; Rajchl, M.; Yuan, J.; Fenster, A.; Parraga, G.

    2015-03-01

    Objectives: To develop and evaluate an automated registration and segmentation pipeline for regional lobar pulmonary structure-function measurements, using volume-matched thoracic CT and MRI in order to guide therapy. Methods: Ten subjects underwent pulmonary function tests and volume-matched 1H and 3He MRI and thoracic CT during a single 2-hr visit. CT was registered to 1H MRI using an affine method that incorporated block-matching and this was followed by a deformable step using free-form deformation. The resultant deformation field was used to deform the associated CT lobe mask that was generated using commercial software. 3He-1H image registration used the same two-step registration method and 3He ventilation was segmented using hierarchical k-means clustering. Whole lung and lobar 3He ventilation and ventilation defect percent (VDP) were generated by mapping ventilation defects to CT-defined whole lung and lobe volumes. Target CT-3He registration accuracy was evaluated using region-, surface distance- and volume-based metrics. Automated whole lung and lobar VDP was compared with semi-automated and manual results using paired t-tests. Results: The proposed pipeline yielded regional spatial agreement of 88.0 ± 0.9% and surface distance error of 3.9 ± 0.5 mm. Automated and manual whole lung and lobar ventilation and VDP were not significantly different and they were significantly correlated (r = 0.77, p < 0.0001). Conclusion: The proposed automated pipeline can be used to generate regional pulmonary structural-functional maps with high accuracy and robustness, providing an important tool for image-guided pulmonary interventions.
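
    As a toy illustration of the ventilation-segmentation step, the sketch below applies a two-level ("hierarchical") k-means clustering to synthetic voxel intensities and derives a ventilation defect percent. The cluster counts, thresholding logic and data are assumptions for illustration, not the paper's actual settings.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    def hierarchical_kmeans(intensities, k1=2, k2=4, seed=0):
        """Two-level k-means: coarse defect/ventilated split, then refine.

        A simplified stand-in for the hierarchical k-means step described
        above; parameters and structure are illustrative.
        """
        x = intensities.reshape(-1, 1)
        coarse = KMeans(n_clusters=k1, n_init=10, random_state=seed).fit(x)
        low = np.argmin(coarse.cluster_centers_.ravel())  # defect-like cluster
        labels = np.where(coarse.labels_ == low, 0, 1)
        ventilated = x[labels == 1]
        fine = KMeans(n_clusters=k2, n_init=10, random_state=seed).fit(ventilated)
        return labels, fine.labels_

    # Synthetic voxel intensities: defect voxels near 0, signal near 1.
    rng = np.random.default_rng(0)
    voxels = np.concatenate([rng.normal(0.05, 0.02, 2000),
                             rng.normal(0.80, 0.15, 8000)])
    labels, _ = hierarchical_kmeans(voxels)
    vdp = 100.0 * np.mean(labels == 0)   # ventilation defect percent
    print(f"VDP = {vdp:.1f}%")
    ```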

  20. Pipeline Policy of Kazakhstan in the Caspian Region

    Directory of Open Access Journals (Sweden)

    Lidiya Andreyevna Parkhomchik

    2014-01-01

    The article considers promising development trends for oil and gas infrastructure in the country and identifies problems in the use of existing pipelines. The author notes that the keys to the subsequent growth of transport and logistics facilities in Kazakhstan are both the implementation of a multi-vector policy of pipeline route diversification and the targeted development of cooperation with foreign partners in the oil and gas industry. The article also notes that the geopolitical features of the Caspian region have a direct impact on the formation of the country's strategic course in the field of pipeline transport.

  1. Threads Pipelining on the CellBE Systems

    Directory of Open Access Journals (Sweden)

    TANASE, C. A.

    2013-08-01

    This article aims to describe a model to accelerate the execution of a parallel algorithm implemented on a Cell B.E. processor. The algorithm implements a technique for finding a moving target in a maze with dynamic architecture, using a technique of pipelining the data transfers between the PPU and SPU threads. We have shown that by using the pipelining technique, we can achieve an improvement in computing time of around 40%. It can also be seen that the pipelining technique with one SPU is about as good as the parallel technique with four SPUs.
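
    The speed-up comes from overlapping data transfer with computation. The sketch below reproduces that double-buffering idea generically with Python threads and a bounded queue; it is not Cell B.E. code, and the latencies are simulated, but the ~40-50% class of saving reported above emerges from the same overlap.

    ```python
    import queue
    import threading
    import time

    def producer(chunks, q):
        """Stage 1: fetch/transfer data chunks (stands in for PPU->SPU DMA)."""
        for chunk in chunks:
            time.sleep(0.01)        # simulated transfer latency
            q.put(chunk)
        q.put(None)                 # sentinel: no more work

    def consumer(q, results):
        """Stage 2: process chunks while the next transfer is in flight."""
        while (chunk := q.get()) is not None:
            time.sleep(0.01)        # simulated computation
            results.append(chunk * 2)

    chunks = list(range(100))
    q, results = queue.Queue(maxsize=2), []    # small queue = double buffering
    t0 = time.perf_counter()
    threads = [threading.Thread(target=producer, args=(chunks, q)),
               threading.Thread(target=consumer, args=(q, results))]
    for t in threads: t.start()
    for t in threads: t.join()
    # Sequential transfer-then-compute would take ~2 s for 100 chunks;
    # overlapping the two stages approaches ~1 s.
    print(f"pipelined: {time.perf_counter() - t0:.2f}s, {len(results)} chunks")
    ```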

  2. Corrosivity Sensor for Exposed Pipelines Based on Wireless Energy Transfer.

    Science.gov (United States)

    Lawand, Lydia; Shiryayev, Oleg; Al Handawi, Khalil; Vahdati, Nader; Rostron, Paul

    2017-05-30

    External corrosion was identified as one of the main causes of pipeline failures worldwide. A solution that addresses the issue of detecting and quantifying the corrosivity of the environment, for application to existing exposed pipelines, has been developed. It consists of a sensing array made of an assembly of thin strips of pipeline steel and a circuit that provides a visual sensor reading to the operator. The proposed sensor is passive and does not require a constant power supply. The circuit design was validated through simulations and lab experiments. An accelerated corrosion experiment was conducted to confirm the feasibility of the proposed corrosivity sensor design.

  3. Routing, construction and maintenance of pipelines in landslide terrain

    Energy Technology Data Exchange (ETDEWEB)

    Stepanek, M. [Geo-Enginering (M.S.T.) Ltd., Calgary, AB (Canada)

    1999-07-01

    Factors influencing the location and routing of pipelines, and the effects of unstable terrain features on construction and maintenance are discussed. Examples from northwestern Alberta and northeastern British Columbia are cited to illustrate the various techniques, such as overhead crossings of valleys and directional drilling to install pipelines below streams, adopted by pipeline construction companies to deal with the challenges presented by sensitive and unstable terrain. The case histories cited also serve to emphasize the importance of terrain analysis and the proper identification of potential problem areas. 5 refs., 7 figs.

  4. Slurry pipelines: economic and political issues. A review

    Energy Technology Data Exchange (ETDEWEB)

    Banks, W. F.

    1977-11-30

    In the controversy surrounding the proposal to grant Federal eminent domain to coal-slurry pipelines, the fundamental issue is whether, on balance, such a grant is in the national interest. The principal subissues (peripheral issues) of economics, water supply and disposal, energy consumption and conservation, employment, safety, and environmental impact are analyzed. It is found that, as compared with unit trains, which are the only immediate alternative for movement of large quantities of Western coal, the pipelines are not against the national interest, except in the case of employment. It is concluded that, on balance, the pipelines are in the national interest and should be granted the power of Federal eminent domain.

  5. Pipeline external corrosion direct assessment methodology: lessons learned - part 1

    Energy Technology Data Exchange (ETDEWEB)

    Kowalski, Angel R. [DNV Columbus, Inc., OH (United States)

    2009-07-01

    DNV Columbus (formerly CC Technologies) played a key role in the development of Direct Assessment (DA) methodologies, providing leadership in the NACE technical committees charged with the development of DA standards. Since the first publication of NACE Standard RP-0502-2002, External Corrosion Direct Assessment (ECDA) has been successfully applied to a great number of pipelines to evaluate the impact of external corrosion on pipeline integrity. This paper summarizes the results of applying ECDA to a selected number of underground pipelines and presents interesting facts about the methodology. (author)

  6. Bpipe: a tool for running and managing bioinformatics pipelines.

    Science.gov (United States)

    Sadedin, Simon P; Pope, Bernard; Oshlack, Alicia

    2012-06-01

    Bpipe is a simple, dedicated programming language for defining and executing bioinformatics pipelines. It specializes in enabling users to turn existing pipelines based on shell scripts or command line tools into highly flexible, adaptable and maintainable workflows with a minimum of effort. Bpipe ensures that pipelines execute in a controlled and repeatable fashion and keeps audit trails and logs to ensure that experimental results are reproducible. Requiring only Java as a dependency, Bpipe is fully self-contained and cross-platform, making it very easy to adopt and deploy into existing environments. Bpipe is freely available from http://bpipe.org under a BSD License.

  7. Automated protein subfamily identification and classification.

    Directory of Open Access Journals (Sweden)

    Duncan P Brown

    2007-08-01

    Function prediction by homology is widely used to provide preliminary functional annotations for genes for which experimental evidence of function is unavailable or limited. This approach has been shown to be prone to systematic error, including percolation of annotation errors through sequence databases. Phylogenomic analysis avoids these errors in function prediction but has been difficult to automate for high-throughput application. To address this limitation, we present a computationally efficient pipeline for phylogenomic classification of proteins. This pipeline uses the SCI-PHY (Subfamily Classification in Phylogenomics) algorithm for automatic subfamily identification, followed by subfamily hidden Markov model (HMM) construction. A simple and computationally efficient scoring scheme using family and subfamily HMMs enables classification of novel sequences to protein families and subfamilies. Sequences representing entirely novel subfamilies are differentiated from those that can be classified to subfamilies in the input training set using logistic regression. Subfamily HMM parameters are estimated using an information-sharing protocol, enabling subfamilies containing even a single sequence to benefit from conservation patterns defining the family as a whole or in related subfamilies. SCI-PHY subfamilies correspond closely to functional subtypes defined by experts and to conserved clades found by phylogenetic analysis. Extensive comparisons of subfamily and family HMM performances show that subfamily HMMs dramatically improve the separation between homologous and non-homologous proteins in sequence database searches. Subfamily HMMs also provide extremely high specificity of classification and can be used to predict entirely novel subtypes. The SCI-PHY Web server at http://phylogenomics.berkeley.edu/SCI-PHY/ allows users to upload a multiple sequence alignment for subfamily identification and subfamily HMM construction. Biologists wishing to

  8. Proof of pipeline strength based on measurements of inspection pigs; Festigkeitsnachweis von Pipelines aufgrund der Messergebnisse von Pruefmolchen

    Energy Technology Data Exchange (ETDEWEB)

    De la Camp, H.J.; Feser, G.; Hofmann, A.; Wolf, B.; Schmidt, H. [TUeV Sueddeutschland Bau und Betrieb GmbH, Muenchen (Germany); Herforth, H.E.; Juengling, K.H.; Schmidt, W. [TUeV Anlagentechnik GmbH, Berlin-Schoeneberg (Germany). Unternehmensgruppe TUeV Rheinland/Berlin-Brandenburg

    2002-01-01

    The report is aimed at collecting and documenting the state of the art and the extensive know-how of experts and pipeline operators with regard to judging the structural integrity of pipelines. In order to assess the actual mechanical strength of pipelines based on measurement results obtained by inspection pigs, guidance is given for future procedures, which eventually could serve as the basis for an industry standard. A literature study of the commercially available types of inspection pigs describes and synoptically lists their respective pros and cons. In essence, besides check lists of operating data for the pipeline and the pig runs, the report mainly comprises the evaluation of defects and the respective calculation procedures. Recommendations are included regarding maintenance planning, verification of defects and repetition of pig runs. (orig.)

  9. 77 FR 48112 - Pipeline Safety: Administrative Procedures; Updates and Technical Corrections

    Science.gov (United States)

    2012-08-13

    ... process for pipeline enforcement matters to conform to current law, amends other administrative procedures... and Enforcement Process Maximum administrative civil penalties. Section 2 of the Pipeline Safety Act...-AE29 Pipeline Safety: Administrative Procedures; Updates and Technical Corrections AGENCY: Pipeline and...

  10. Outfall Pipeline Lines, Tutuila AS, 2009, US EPA Region 9

    Data.gov (United States)

    U.S. Environmental Protection Agency — The outfalls extend from coastal points, originating from canneries and sewage treatment plants. Tafuna Outfall: This polyethylene pipeline, installed in 1996, is...

  11. Wave-induced fatigue of multi-span pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Tao Xu [Marine Engineers and Consultants, Freemont, CA (United States); Lauridsen, B. [Danish Maritime Institute, Copenhagen (Denmark); Yong Bai [J P Kenny A/S, Forus (Norway)

    1999-02-01

    Free spanning of multi-span pipelines is an important subject in the design of pipelines on uneven seabed. The seabed intervention costs are largely influenced by pipeline spanning design, which includes assessment of trawl pullover response, vortex-induced vibrations and wave-induced fatigue. The objective of this paper is to develop a rational design methodology for the determination of free span lengths based on in-line fatigue assessment of the multi-span pipeline. Following a summary of the procedure, a detailed mathematical model for the free span movement and its analytical closed-form solution are developed. The fatigue damage models are detailed in both time domain and frequency domain approaches. A numerical example is presented to illustrate the technical models. (author)
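
    A common way to accumulate wave-induced fatigue damage over stress-range bins is the Palmgren-Miner rule combined with an S-N curve. The sketch below shows that calculation with invented S-N parameters and an invented stress-range histogram, not the paper's model.

    ```python
    import numpy as np

    # Illustrative S-N curve: log10(N) = log10(a) - m * log10(S).
    log_a, m = 12.0, 3.0

    def cycles_to_failure(stress_range_mpa):
        """Allowable cycles N for a given stress range S (MPa)."""
        return 10.0 ** (log_a - m * np.log10(stress_range_mpa))

    # Assumed annual stress-range histogram for one free span.
    stress_ranges = np.array([10.0, 20.0, 40.0])   # MPa, per sea-state bin
    annual_cycles = np.array([2e6, 5e5, 1e4])      # cycles/year in each bin

    # Palmgren-Miner linear damage accumulation: D = sum(n_i / N_i).
    damage_per_year = np.sum(annual_cycles / cycles_to_failure(stress_ranges))
    print(f"annual fatigue damage: {damage_per_year:.3e}")
    print(f"fatigue life: {1.0 / damage_per_year:.0f} years")
    ```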

  12. 75 FR 76077 - Pipeline Safety: Information Collection Activities

    Science.gov (United States)

    2010-12-07

    ... Federal offices in Washington, DC, we recommend that persons consider an alternative method (Internet, fax... pipelines in waters less than 15 feet (4.6 meters) deep as measured from mean low water that are at risk of...

  13. Developing a leadership pipeline: the Cleveland Clinic experience

    National Research Council Canada - National Science Library

    Hess, Caryl A; Barss, Christina; Stoller, James K

    2014-01-01

    .... Because competencies to lead differ from clinical or research skills, there is a compelling need to develop leaders and create a talent pipeline, perhaps especially in physician-led organizations like Cleveland Clinic...

  14. Reference diameter in calculations of creep strain for steam pipelines

    Directory of Open Access Journals (Sweden)

    Łopata Stanisław

    2017-01-01

    Recommended methods for the operational safety assessment of high-pressure steam pipelines include periodic testing that enables determination of the creep strain and the creep rate. The accuracy with which the two quantities are determined depends, among others, on the control of the testing conditions and on the assumed reference diameter. The analysis conducted herein concerns the impact of the reference diameter on the results characterizing the creep phenomenon in pipeline elements (straight sections and bends). In this respect, the initial ovality of the pipeline cross-section is an important parameter. The calculations are made using the authors' own results obtained from many years of steam pipeline creep testing, including tests performed after the expiry of the computational service life.
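
    A minimal sketch of the diameter-based creep calculation discussed above: strain is taken relative to an assumed reference diameter, and averaging two perpendicular diameter measurements mitigates the bias that initial cross-section ovality introduces when a single diameter is used. All numbers are illustrative, not the paper's data.

    ```python
    import numpy as np

    def creep_strain(d1, d2, d_ref):
        """Circumferential creep strain from two perpendicular outer diameters."""
        return (0.5 * (d1 + d2) - d_ref) / d_ref

    # Illustrative measurement history for one pipe section (mm, hours).
    d_ref = 273.0                                  # assumed reference diameter
    hours = np.array([0.0, 50_000.0, 100_000.0])
    d1 = np.array([273.0, 273.9, 274.8])           # diameter, direction 1
    d2 = np.array([273.0, 273.5, 274.2])           # diameter, perpendicular

    strain = creep_strain(d1, d2, d_ref)
    rate = np.gradient(strain, hours)              # creep rate, 1/h
    for t, e, r in zip(hours, strain, rate):
        print(f"t={t:9.0f} h  strain={e:.4%}  rate={r:.2e} 1/h")
    ```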

  15. The Kepler Science Data Processing Pipeline Source Code Road Map

    Science.gov (United States)

    Wohler, Bill; Jenkins, Jon M.; Twicken, Joseph D.; Bryson, Stephen T.; Clarke, Bruce Donald; Middour, Christopher K.; Quintana, Elisa Victoria; Sanderfer, Jesse Thomas; Uddin, Akm Kamal; Sabale, Anima

    2016-01-01

    We give an overview of the operational concepts and architecture of the Kepler Science Processing Pipeline. Designed, developed, operated, and maintained by the Kepler Science Operations Center (SOC) at NASA Ames Research Center, the Science Processing Pipeline is a central element of the Kepler Ground Data System. The SOC consists of an office at Ames Research Center, software development and operations departments, and a data center which hosts the computers required to perform data analysis. The SOC's charter is to analyze stellar photometric data from the Kepler spacecraft and report results to the Kepler Science Office for further analysis. We describe how this is accomplished via the Kepler Science Processing Pipeline, including the software algorithms. We present the high-performance, parallel computing software modules of the pipeline that perform transit photometry, pixel-level calibration, systematic error correction, attitude determination, stellar target management, and instrument characterization.

  16. Energy geopolitics and Iran-Pakistan-India gas pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Verma, Shiv Kumar [Political Geography Division, Center for International Politics, Organization and Disarmament, School of International Studies, Jawaharlal Nehru University, New Delhi 110067 (India)]. E-mail: vermajnu@gmail.com

    2007-06-15

    With the growing energy demands in India and its neighboring countries, the Iran-Pakistan-India (IPI) gas pipeline assumes special significance. Energy-deficient countries such as India, China, and Pakistan are vying to acquire gas fields in different parts of the world. This has led to two conspicuous developments: first, they are competing against each other, and secondly, a situation is emerging where they might have to confront the US and the western countries in the near future in their attempt to control energy bases. The proposed IPI pipeline is an attempt to acquire such a base. However, Pakistan is playing its own game to maximize its leverage. Pakistan, which refuses to establish even normal trading ties with India, craves to earn hundreds of millions of dollars in transit fees and other annual royalties from a gas pipeline running from Iran's South Pars fields to Barmer in western India. Pakistan promises to subsidize its gas imports from Iran and thus also become a major forex earner. It is willing to give pipeline-related 'international guarantees' notwithstanding its record of covert actions in breach of international law (such as the export of terrorism) and its reluctance to reciprocally provide India what World Trade Organization (WTO) rules obligate it to: Most Favored Nation (MFN) status. India is looking at the possibility of using some set of norms for securing gas supply through pipelines, as the European Union has already initiated a discussion on the issue. The key point relevant to India's plan to build a pipeline to source gas from Iran relates to national treatment for pipelines. Under the principle of national treatment, which also figures in relation to foreign direct investment (FDI), the country through which a pipeline transits should provide the same level of security to the transiting pipeline as it would provide to its domestic pipelines. This paper will endeavor to analyze, first, the significance of this

  17. Optimal Energy Consumption Analysis of Natural Gas Pipeline

    OpenAIRE

    Liu, Enbin; Li, Changjun; Yang, Yi

    2014-01-01

    There are many compressor stations along long-distance natural gas pipelines. Natural gas can be transported using different boot programs and import pressures, combined with temperature control parameters. Moreover, different transport methods have correspondingly different energy consumptions. At present, the operating parameters of many pipelines are determined empirically by dispatchers, resulting in high energy consumption. This practice does not abide by energy reduction policies. There...

  18. Deliverability on the interstate natural gas pipeline system

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-05-01

    Deliverability on the Interstate Natural Gas Pipeline System examines the capability of the national pipeline grid to transport natural gas to various US markets. The report quantifies the capacity levels and utilization rates of major interstate pipeline companies in 1996 and the changes since 1990, as well as changes in markets and end-use consumption patterns. It also discusses the effects of proposed capacity expansions on capacity levels. The report consists of five chapters, several appendices, and a glossary. Chapter 1 discusses some of the operational and regulatory features of the US interstate pipeline system and how they affect overall system design, system utilization, and capacity expansions. Chapter 2 looks at how the exploration, development, and production of natural gas within North America is linked to the national pipeline grid. Chapter 3 examines the capability of the interstate natural gas pipeline network to link production areas to market areas, on the basis of capacity and usage levels along 10 corridors. The chapter also examines capacity expansions that have occurred since 1990 along each corridor and the potential impact of proposed new capacity. Chapter 4 discusses the last step in the transportation chain, that is, deliverability to the ultimate end user. Flow patterns into and out of each market region are discussed, as well as the movement of natural gas between States in each region. Chapter 5 examines how shippers reserve interstate pipeline capacity in the current transportation marketplace and how pipeline companies are handling the secondary market for short-term unused capacity. Four appendices provide supporting data and additional detail on the methodology used to estimate capacity. 32 figs., 15 tabs.

  19. Studies of the Beetle 1.2 Pipeline Homogeneity

    CERN Document Server

    Agari, M; Blouw, J; Schmelling, M; Hofmann, W; Schwingenheuer, B; Pugatch, V; Volyanskyy, D; Jiménez-Otero, S; Tran, M T; Voss, H; Bernhard, R P; Köstner, S; Lehner, F; Lois, C; Needham, M; Steinkamp, O; Straumann, U; Vollhardt, A

    2003-01-01

    The pipeline homogeneity in general and the behaviour of the edge channels of the Beetle 1.2 readout chip [1] were studied with data taken during the Silicon Tracker test beam period in May 2003. A contribution of roughly 10% from pipeline inhomogeneities to the strip noise was observed. All channels including the first and the last one were found to be fully functional.

  20. 76 FR 70217 - Pipeline Safety: Information Collection Activities

    Science.gov (United States)

    2011-11-10

    ...-4566, by email at [email protected]dot.gov , or by mail at U.S. Department of Transportation, Pipeline and... email at [email protected]dot.gov , or by mail at U.S. Department of Transportation, PHMSA, 1200 New Jersey... elements with the degree to which pipelines are subject to part 195. A2. In Step 2, API-AOPL requested...

  1. Piko: A Design Framework for Programmable Graphics Pipelines

    OpenAIRE

    Patney, Anjul; Tzeng, Stanley; Seitz Jr., Kerry A.; Owens, John D.

    2014-01-01

    We present Piko, a framework for designing, optimizing, and retargeting implementations of graphics pipelines on multiple architectures. Piko programmers express a graphics pipeline by organizing the computation within each stage into spatial bins and specifying a scheduling preference for these bins. Our compiler, Pikoc, compiles this input into an optimized implementation targeted to a massively-parallel GPU or a multicore CPU. Piko manages work granularity in a programmable and flexible ma...

  2. Tunnels: different construction methods and its use for pipelines installation

    Energy Technology Data Exchange (ETDEWEB)

    Mattos, Tales; Soares, Ana Cecilia; Assis, Slow de; Bolsonaro, Ralfo; Sanandres, Simon [Petroleo do Brasil S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2009-07-01

    In a country of continental dimensions like Brazil, the pipeline mode faces the challenge of opening rights-of-way (ROWs) in the most diverse kinds of soils and geomorphologies. To safely fulfill pipeline construction demand, ROW opening uses all available earthworks and route definition techniques and, where necessary, trenchless ("no-dig") techniques like horizontal directional drilling, micro tunneling and also full-size tunnels designed for pipeline installation in high, steep terrain to avoid geotechnical risks. PETROBRAS has already used the tunnel technique to cross high terrain of great construction difficulty, and mainly to make pipeline maintenance and operation easier. For the GASBOL project, in the Aparados da Serra region, and in GASYRG, in Bolivia, two tunnels were opened, approximately 700 meters and 2,000 meters long respectively. The GASBOL project had the particularity of being a gallery with only one excavation face, finishing under the hill; from this point a vertical shaft was drilled up to the top to install the pipeline section, while in the GASYRG project the tunnel had two excavation faces. Currently, two projects with tunnels are under development: the Caraguatatuba-Taubate gas pipeline (GASTAU), with a 5 km tunnel following the same concepts as the GASBOL tunnel (a gallery to be opened with the use of a TBM (tunnel boring machine) and a shaft to the surface), and the Cabiunas-Reduc III gas pipeline (GASDUC III), under construction with a 3.7 km tunnel with two faces, like the GASYRG tunnel. This paper presents the main tunnel excavation methods, conventional and mechanized, presenting the most relevant characteristics of both and, in particular, the use of tunnels for pipeline installation. (author)

  3. Automated HAZOP revisited

    DEFF Research Database (Denmark)

    Taylor, J. R.

    2017-01-01

    Hazard and operability analysis (HAZOP) has developed from a tentative approach to hazard identification for process plants in the early 1970s to an almost universally accepted approach today, and a central technique of safety engineering. Techniques for automated HAZOP analysis were developed in the 1970s, but still have not displaced expensive manual approaches. Reasons for this were investigated and conclusions are drawn. The author's actual experience in applying automated HAZOP techniques over a period of more than 30 years is revisited, including results from several full-scale validation studies and many industrial applications. Automated techniques, when combined with manual approaches, were found to provide significant improvements in HAZOP quality and a limited but valuable improvement in efficiency.

  4. SINFONI Pipeline: Data reduction pipeline for the Very Large Telescope SINFONI spectrograph

    Science.gov (United States)

    ESO

    2017-08-01

    The SINFONI pipeline reduces data from the Very Large Telescope's SINFONI (Spectrograph for INtegral Field Observations in the Near Infrared) instrument. It can evaluate the detector linearity and generate a corresponding non-linear pixel map, create a master dark and a hot-pixel map, and create a master flat and a map of pixels whose intensities are greater than a given threshold. It can also compute the optical distortions and slitlet distances, perform wavelength calibration, PSF, telluric standard and other science data reduction, and can coadd bad pixel maps, collapse a cube to an image over a given wavelength range, and perform cube arithmetic, among other useful tasks.

  5. Environmental analysis for pipeline gas demonstration plants

    Energy Technology Data Exchange (ETDEWEB)

    Stinton, L.H.

    1978-09-01

    The Department of Energy (DOE) has implemented programs for encouraging the development and commercialization of coal-related technologies, which include coal gasification demonstration-scale activities. In support of commercialization activities the Environmental Analysis for Pipeline Gas Demonstration Plants has been prepared as a reference document to be used in evaluating potential environmental and socioeconomic effects from construction and operation of site- and process-specific projects. Effluents and associated impacts are identified for six coal gasification processes at three contrasting settings. In general, impacts from construction of a high-Btu gas demonstration plant are similar to those caused by the construction of any chemical plant of similar size. The operation of a high-Btu gas demonstration plant, however, has several unique aspects that differentiate it from other chemical plants. Offsite development (surface mining) and disposal of large quantities of waste solids constitute important sources of potential impact. In addition, air emissions require monitoring for trace metals, polycyclic aromatic hydrocarbons, phenols, and other emissions. Potential biological impacts from long-term exposure to these emissions are unknown, and additional research and data analysis may be necessary to determine such effects. Possible effects of pollutants on vegetation and human populations are discussed. The occurrence of chemical contaminants in liquid effluents and the bioaccumulation of these contaminants in aquatic organisms may lead to adverse ecological impact. Socioeconomic impacts are similar to those from a chemical plant of equivalent size and are summarized and contrasted for the three surrogate sites.

  6. Leadership Succession: Future-proofing Pipelines.

    Science.gov (United States)

    Taylor, Saul; Youngs, Howard

    2018-01-01

    The challenges in deaf education illustrate the requirement and importance of leadership in this specialized field. The significant and impending talent depletion unfolding as baby-boomers retire positions leadership succession planning as a strategic issue. This mixed methods study is the first of its kind in New Zealand. The aim is to understand leadership demographics and assumptions to determine the need for strategic succession planning to identify and address leaky pipelines. The findings, from 82% of the deaf education workforce through a questionnaire and interviews with seven senior leaders, reveal that senior leaders do not appear aware of four key areas that dissuade potential leadership aspirants and shrink the pool: prioritizing family; safeguarding health; concerns about bureaucracy, paperwork, and workload; and a reluctance to move away from teaching. Aspirant identification appears informal, as there is no formal succession plan in place, which suggests a leadership crisis is imminent in New Zealand deaf education provision. Recommendations are provided that may help address this situation in New Zealand and other first-world nations, so that sufficient leaders are in place to deal with the challenges facing deaf education today and in the future.

  7. The VLITE Post-Processing Pipeline

    Science.gov (United States)

    Richards, Emily E.; Clarke, Tracy; Peters, Wendy; Polisensky, Emil; Kassim, Namir E.

    2018-01-01

    A post-processing pipeline to adaptively extract and catalog point sources is being developed to enhance the scientific value and accessibility of data products generated by the VLA Low-band Ionosphere and Transient Experiment (VLITE; http://vlite.nrao.edu/) on the Karl G. Jansky Very Large Array (VLA). In contrast to other radio sky surveys, the commensal observing mode of VLITE results in varying depths, sensitivities, and spatial resolutions across the sky based on the configuration of the VLA, location on the sky, and time on source specified by the primary observer for their independent science objectives. Therefore, previously developed tools and methods for generating source catalogs and survey statistics are not always appropriate for VLITE's diverse and growing set of data. A raw catalog of point sources extracted from every VLITE image will be created from source fit parameters stored in a queryable database. Point sources will be measured using the Python Blob Detector and Source Finder software (PyBDSF; Mohan & Rafferty 2015). Sources in the raw catalog will be associated with previous VLITE detections in a resolution- and sensitivity-dependent manner, and cross-matched to other radio sky surveys to aid in the detection of transient and variable sources. Final data products will include separate, tiered point source catalogs grouped by sensitivity limit and spatial resolution.
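
    Since the catalog is built from PyBDSF fit parameters, a minimal sketch of the extraction step is shown below, assuming a FITS image from a single VLITE pointing. The file names and threshold values are illustrative assumptions, not the pipeline's actual settings.

    ```python
    # Sketch only: extract a source list with PyBDSF (pip install bdsf).
    import bdsf

    img = bdsf.process_image(
        "vlite_pointing.fits",   # hypothetical input image
        thresh_isl=3.0,          # island threshold (sigma)
        thresh_pix=5.0,          # peak threshold (sigma)
    )
    # Write a source-list ("srl") catalog that a database loader could ingest.
    img.write_catalog(outfile="vlite_pointing_srl.fits",
                      format="fits", catalog_type="srl", clobber=True)
    ```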

  8. New and Pipeline Drugs for Gout.

    Science.gov (United States)

    Keenan, Robert T; Schlesinger, Naomi

    2016-06-01

    Gout is the most common inflammatory arthropathy in the western world. Affecting millions and accounting for lost wages, increased health care costs, and significant disability, it remains a burden for those afflicted, their families, and the health care system. Despite the availability of a number of effective therapies, gout is often inadequately treated, and its impact on the patient's overall health and well-being is underestimated by physicians and patients alike. For many decades, controlling acute flares was the priority in the management of gout. More recently, however, a deeper understanding of gout pathophysiology has resulted in a new appreciation that gout impacts the patient with consequences well beyond the episodes of acute inflammatory arthritis. Reflecting the chronic nature of the disease, gout treatment needs to be chronic as well, and aimed at reducing the underlying cause of gout (hyperuricemia) as well as the symptom of acute attacks. Therapy therefore requires both urate-lowering and anti-inflammatory strategies. Unfortunately, the most commonly used urate-lowering and anti-inflammatory treatments may be problematic in some gout patients, who often have multiple comorbidities that establish relative contraindications. Novel urate-lowering therapies, and new medications to treat and prevent acute gouty flares, can not only improve care of the individual; they can also lead to a better discourse for the edification of those who manage and are managed for this underestimated disease. In this paper, we discuss new and pipeline drugs for acute gout, prophylactic anti-inflammatory therapies as well as urate-lowering therapies.

  9. An Overview of New Progresses in Understanding Pipeline Corrosion

    Energy Technology Data Exchange (ETDEWEB)

    Tan, M. YJ; Varela, F.; Huo, Y.; Gupta, R.; Abreu, D.; Mahdavi, F.; Hinton, B.; Forsyth, M. [Deakin University, Victoria (Australia)

    2016-12-15

    An approach to achieving the ambitious goal of cost-effectively extending the safe operating life of energy pipelines to 100 years is the application of health monitoring and life prediction tools that are able to provide both long-term remnant pipeline life prediction and in-situ pipeline condition monitoring. A critical step is the enhancement of the technological capabilities required for understanding and quantifying the effects of key factors influencing buried steel pipeline corrosion and environmentally assisted materials degradation, and the development of condition monitoring technologies able to provide in-situ monitoring and site-specific warning of pipeline damage. This paper provides an overview of our current research aimed at developing new sensors and electrochemical cells for monitoring, categorising and quantifying the level and nature of external pipeline and coating damage under the combined effects of various inter-related variables and processes such as localised corrosion, coating cracking and disbondment, cathodic shielding, and transit loss of cathodic protection.

  10. Pipelined multiprocessor system-on-chip for multimedia

    CERN Document Server

    Javaid, Haris

    2014-01-01

    This book describes analytical models and estimation methods to enhance performance estimation of pipelined multiprocessor systems-on-chip (MPSoCs). A framework is introduced for both design-time and run-time optimizations. For design space exploration, several algorithms are presented to minimize the area footprint of a pipelined MPSoC under a latency or a throughput constraint. A novel adaptive pipelined MPSoC architecture is described, where idle processors are transitioned into low-power states at run-time to reduce energy consumption. Multi-mode pipelined MPSoCs are introduced, where multiple pipelined MPSoCs optimized separately are merged into a single pipelined MPSoC, enabling further reduction of the area footprint by sharing the processors and communication buffers. Readers will benefit from the authors' combined use of analytical models, estimation methods and exploration algorithms and will be enabled to explore billions of design points in a few minutes.

  11. Nondestructive inspection of the condition of oil pipeline cleaning units

    Energy Technology Data Exchange (ETDEWEB)

    Berdonosov, V.A.; Boiko, D.A.; Lapshin, B.M.; Chakhlov, V.L.

    1989-02-01

    One of the reasons for shutdowns of main oil pipelines is stoppage of the cleaning unit, caused by damage to it, while it is cleaning paraffin deposits from the inner surface. The authors propose a method of locating and determining the condition of the cleaning unit that does not require dismantling of the pipeline: the initial search for the cleaning unit is done with acoustic instruments (the increased acoustic noise at the point of its stoppage is recorded), with subsequent inspection by a radiographic method. An experimental model of an instrument was developed making it possible to determine the location of a stopped cleaning unit in an oil pipeline from the acoustic noise. The instrument consists of two blocks, the remote sensor and the indicator block, which are connected to each other with a cable up to 10 m long. The design makes it possible to place the sensor at any accessible point of a linear part of the pipeline (in a pit, on a valve, etc.) while the indicator block remains on the surface of the ground. The results obtained make it possible to adopt optimum solutions for eliminating malfunctions and to prevent emergency situations without dismantling the pipeline. With the equipment developed it is possible to inspect oil and gas pipelines whose throughput has been reduced for different reasons.

  12. Pipeline coating inspection in Mexico applying surface electromagnetic technology

    Energy Technology Data Exchange (ETDEWEB)

    Delgado, O.; Mousatov, A.; Nakamura, E.; Villarreal, J.M. [Instituto Mexicano del Petroleo (IMP), Mexico City (Mexico); Shevnin, V. [Moscow State University (Russian Federation); Cano, B. [Petroleos Mexicanos (PEMEX), Mexico City (Mexico)

    2009-07-01

    The main problems in the pipeline systems in Mexico include: extremely aggressive soil characterized by high clay content and low resistivity; interconnection between several pipes, including electrical contacts of active pipelines with out-of-service pipes; and short distances between pipes in comparison with their depths, which reduce the resolution of coating inspection. The results presented in this work show the efficiency of the Surface Electromagnetic Pipeline Inspection (SEMPI) technology in determining the technical condition of pipelines in the situations mentioned above. The SEMPI technology includes two stages: regional and detailed measurements. The regional stage consists of magnetic field measurements along the pipeline using large distances (10-100 m) between observation points to delimit zones with damaged coating. For quantitative assessment of the leakage and coating resistances along the pipeline, additional voltage and soil resistivity measurements are performed. The second stage includes detailed measurements of the electric field on the pipe intervals with anomalous technical condition identified in the regional stage. Based on the distribution of the coating electric resistance and the subsoil resistivity values, zones with different grades of coating quality and soil aggressiveness are delimited. (author)

  13. Performance of the SDSS-III MARVELS New Data Pipeline

    Science.gov (United States)

    Li, Rui; Ge, J.; Thomas, N. B.; Shi, J.; Petersen, E.; Ouyang, Y.; Wang, J.; Ma, B.; Sithajan, S.

    2013-01-01

    As one of the four surveys in the SDSS-III program, MARVELS (Multi-object APO Radial Velocity Exoplanet Large-area Survey) monitored over 3,300 stars during 2008-2012, each observed about 27 times over a 2-year window. MARVELS has successfully produced over 20 brown dwarf candidates and several hundred binaries. However, the early data pipeline has large long-term systematic errors and cannot reliably produce giant planet candidates. Our new MARVELS pipeline team, with the assistance of the UF Department of Mathematics, has made great progress in dealing with the long-term systematic errors over the past 9 months. We redesigned the entire pre-processing procedure to handle various types of systematic effects caused by the instrument (such as trace, slant and distortion) and observation condition changes (such as illumination profile). We explored several advanced methods to precisely extract the RV signal from the processed spectra. We also developed a new simulation program to model all of these effects and used it to test the performance of our new pipeline. Our goal is to deliver a new pipeline that meets the survey baseline performance (10-35 m/s for the survey stars) by the end of 2012. We will report the fundamental performance of the pipeline and lessons learned from the pipeline development.

  14. Automating the CMS DAQ

    Energy Technology Data Exchange (ETDEWEB)

    Bauer, G.; et al.

    2014-01-01

    We present the automation mechanisms that have been added to the Data Acquisition and Run Control systems of the Compact Muon Solenoid (CMS) experiment during Run 1 of the LHC, ranging from the automation of routine tasks to automatic error recovery and context-sensitive guidance to the operator. These mechanisms helped CMS to maintain a data taking efficiency above 90% and to even improve it to 95% towards the end of Run 1, despite an increase in the occurrence of single-event upsets in sub-detector electronics at high LHC luminosity.

  15. Automating the CMS DAQ

    CERN Document Server

    Bauer, Gerry; Behrens, Ulf; Branson, James; Chaze, Olivier; Cittolin, Sergio; Coarasa Perez, Jose Antonio; Darlea, Georgiana Lavinia; Deldicque, Christian; Dobson, Marc; Dupont, Aymeric; Erhan, Samim; Gigi, Dominique; Glege, Frank; Gomez Ceballos, Guillelmo; Gomez-Reino Garrido, Robert; Hartl, Christian; Hegeman, Jeroen Guido; Holzner, Andre Georg; Masetti, Lorenzo; Meijers, Franciscus; Meschi, Emilio; Mommsen, Remigius; Morovic, Srecko; Nunez Barranco Fernandez, Carlos; O'Dell, Vivian; Orsini, Luciano; Ozga, Wojciech Andrzej; Paus, Christoph Maria Ernst; Petrucci, Andrea; Pieri, Marco; Racz, Attila; Raginel, Olivier; Sakulin, Hannes; Sani, Matteo; Schwick, Christoph; Spataru, Andrei Cristian; Stieger, Benjamin Bastian; Sumorok, Konstanty; Veverka, Jan; Wakefield, Christopher Colin; Zejdl, Petr

    2014-01-01

    We present the automation mechanisms that have been added to the Data Acquisition and Run Control systems of the Compact Muon Solenoid (CMS) experiment during Run 1 of the LHC, ranging from the automation of routine tasks to automatic error recovery and context-sensitive guidance to the operator. These mechanisms helped CMS to maintain a data taking efficiency above 90% and to even improve it to 95% towards the end of Run 1, despite an increase in the occurrence of single-event upsets in sub-detector electronics at high LHC luminosity.

  16. Terrorism impact on the security of international pipelines; L'impact du terrorisme sur la securite des pipelines internationaux

    Energy Technology Data Exchange (ETDEWEB)

    Simonet, L. [Ministere de la Defense, Dir. des Affaires Juridiques, 75 - Paris (France)

    2006-03-15

    International pipelines, sometimes several thousand kilometers long, are today more and more the target of terror attacks. The sabotage of oil pipelines has been a recurrent problem in the history of the Middle East, but the risk has increased since the September 11, 2001 events. From Africa to China, the Caucasus and Central Asia, no pipeline can escape this threat. Faced with this challenge, which has strong consequences for consuming countries and investors, the transit countries cannot find reliable solutions. Regional initiatives have been proposed to ensure pipeline protection, but they remain insufficient to reassure the international community. For this reason, the consuming countries are tempted to ensure this protection themselves, through NATO interventions or by interference-like unilateral actions. (J.S.)

  17. Idaho: Library Automation and Connectivity.

    Science.gov (United States)

    Bolles, Charles

    1996-01-01

    Provides an overview of the development of cooperative library automation and connectivity in Idaho, including telecommunications capacity, library networks, the Internet, and the role of the state library. Information on six shared automation systems in Idaho is included. (LRW)

  18. On the batch cut and products quality procedures in multi-product pipelines; Corte de interfaces e integridade de produtos em polidutos e terminais

    Energy Technology Data Exchange (ETDEWEB)

    Tepedino, A. [TRANSPETRO - PETROBRAS Transportes, Rio de Janeiro, RJ (Brazil); Baptista, R.M. [PETROBRAS, Rio de Janeiro, RJ (Brazil); Rachid, F.B.Freitas; Araujo, J.H.Carneiro de [Universidade Federal Fluminense, Niteroi, RJ (Brazil)

    2005-07-01

    The increasing complexity in specifying the quality of petroleum products has a great impact on the way they are transported in pipelines so as to avoid degradation and contamination. This work presents a global, preliminary approach to the transport of petroleum products in batches in pipelines and in multi-product terminal lines, with a view toward its automation. Emphasis is placed on the batch formation and batch cut procedures, as well as on the installation and instrumentation required to perform this task. The main factors responsible for increasing the mixing volume in batch transfers and for threatening the integrity of the products are highlighted. Several effective measures are pointed out to minimize the generation of mixing volume. Among other benefits, they can reduce the cost of transportation and the amount of degraded product, simplify operational procedures, and minimize the stocks needed to cope with the logistics. (author)

  19. Using Geographic Information System - GIS - for pipeline management: case of Urucu-Coari LPG pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Furquim, Maysa P.O. [ESTEIO Engenharia e Aerolevantamentos S.A, Curitiba, PR (Brazil)

    2009-07-01

    This technical paper describes the stages carried out in implementing a GIS - Geographic Information System - for following up pipeline construction work, with the Urucu-Coari GLPDUTO (LPG Pipeline) as its focus. The main challenges in compiling the data generated at the work site are presented, as well as the importance of defining which data are relevant, so that the construction company and PETROBRAS can follow the work's evolution. GIS development has been under way since January 2007 and should be finished by the first semester of 2009. The following stages of GIS definition for work management are presented: brief history of the project - project conception, purpose, implemented structure and expectations; survey data in loco - raw data obtained directly during execution of the work and generated in the design and installation stages; treated data - data derived from the raw data, already prepared for the GIS environment; routines developed - specific tools created to consolidate the data to be manipulated in the GIS in an optimized and functional way; results - the GIS in its final conception, developed and populated with the routines and data regarding the project. (author)

  20. West Virginia: Library Automation.

    Science.gov (United States)

    Brown, Thomas M., Ed.

    1996-01-01

    Reviews library automation efforts in West Virginia. Topics include information equity and rural areas, library systems and resource sharing, statewide connectivity, a higher education telecomputing network that has expanded to include elementary and secondary school libraries, multitype library cooperation and staff training, and academic library…

  1. Library Automation in Australia.

    Science.gov (United States)

    Blank, Karen L.

    1984-01-01

    Discussion of Australia's move toward library automation highlights development of a national bibliographic network, local and regional cooperation, integrated library systems, telecommunications, and online systems, as well as microcomputer usage, ergonomics, copyright issues, and national information policy. Information technology plans of the…

  2. Myths of Library Automation.

    Science.gov (United States)

    Hegarty, Kevin

    1985-01-01

    This analysis of nine myths of library automation highlights cost effectiveness, circulation control and delinquency rates, budget allocation, staff needs, technical services productivity, the online catalog, need for consultants, the MARC format, and turnkey systems. Views of the reality regarding each myth are offered. (EJS)

  3. Oregon: Library Automation Developments.

    Science.gov (United States)

    Brandis, Rushton

    1996-01-01

    Discusses Oregon library automation projects, including Internet connectivity and a statewide multitype library network; a bibliographic information system with college and university libraries, including a union catalog; a Portland Area Library System that connects multitype libraries; and library staff training for the Internet. (LRW)

  4. Myths in test automation

    Directory of Open Access Journals (Sweden)

    Jazmine Francis

    2015-01-01

    Full Text Available Myths in the automation of software testing are a recurring topic of discussion in the software validation industry. The first thought of a knowledgeable reader would probably be: why this old topic again? What is new to discuss? Yet everyone agrees that test automation today is not what it was ten or fifteen years ago, because it has evolved in scope and magnitude. What began as simple linear scripts for web applications today has a complex architecture and hybrid frameworks that support testing applications developed with various platforms and technologies. Automation has undoubtedly advanced, but so have the myths associated with it. The change in perspective and in people's knowledge of automation has altered the terrain. This article reflects the author's views and experience on how the original myths have been transformed into new versions and how these are derived, and offers his thoughts on the new generation of myths.

  5. Automated differential leukocyte counts.

    Science.gov (United States)

    Morse, E E; Nashed, A; Spilove, L

    1989-01-01

    Automated differential counts have the advantage of precision, efficiency, safety, and economy. They could potentially serve effectively in 90 percent of patients with normal counts or in 75 percent of patients with anemia only (64 percent of the total in this study). Even patients with increased white blood cell counts and major population shifts (toward granulocytes or lymphocytes) could be followed with automated differential counts. Such a tactic would decrease turnaround time for results, be less expensive, and reduce exposure of technologists to direct contact with patients' blood. However, presently available instruments fail to detect patients' blood samples with small numbers of abnormal cells, e.g., blasts in early relapse of acute leukemia, atypical lymphocytes in viral diseases such as infectious mononucleosis, eosinophils in allergic or parasitic disease, and band forms in early infectious diseases. Clinical judgment should be used in selectively ordering manual differential counts for these patients. While automated differential counts can be very useful in screening general medical and surgical patients in the ambulatory setting, in referral centers where hematologic abnormalities are more prevalent, the manual differential count and further examination of a smear is particularly necessary at least on initial presentation. Selective manual differential counts may improve efficiency, economy, and safety while not compromising patient care. Further studies of the correlation of clinical disease with automated differential counts are necessary.
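
    The selective-ordering tactic described above is essentially a reflex rule: accept the automated differential when the counts look normal, and order a manual count when the instrument flags possible abnormal cells. A minimal sketch in Python, with illustrative (non-clinical) reference ranges and flag names:

        def needs_manual_differential(wbc, instrument_flags):
            """Return True if a manual differential count should be ordered.

            wbc              : white blood cell count, 10^9 cells/L
            instrument_flags : analyzer flags, e.g. "blasts?", "atypical lymphs?"
            """
            wbc_normal = 4.0 <= wbc <= 11.0
            return bool(instrument_flags) or not wbc_normal

        print(needs_manual_differential(7.2, []))           # False: automated count suffices
        print(needs_manual_differential(7.2, ["blasts?"]))  # True: review a smear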

  6. Microcomputers in Library Automation.

    Science.gov (United States)

    Simpson, George A.

    As librarians cope with reduced budgets, decreased staff, and increased demands for services, microcomputers will take a significant role in library automation by providing low-cost systems, solving specific library problems, and performing in distributed systems. This report presents an introduction to the technology of this low-cost, miniature…

  7. Automated Motivic Analysis

    DEFF Research Database (Denmark)

    Lartillot, Olivier

    2016-01-01

    Motivic analysis provides very detailed understanding of musical compositions, but is also particularly difficult to formalize and systematize. A computational automation of the discovery of motivic patterns cannot be reduced to a mere extraction of all possible sequences of descriptions. The s...

  8. Microcontroller for automation application

    Science.gov (United States)

    Cooper, H. W.

    1975-01-01

    A description is given of a microcontroller currently being developed for automation applications. It is basically an 8-bit microcomputer with a 40K-byte random access memory/read only memory, and can control a maximum of 12 devices through standard 15-line interface ports.

  9. Blastocyst microinjection automation.

    Science.gov (United States)

    Mattos, Leonardo S; Grant, Edward; Thresher, Randy; Kluckman, Kimberly

    2009-09-01

    Blastocyst microinjections are routinely involved in the process of creating genetically modified mice for biomedical research, but their efficiency is highly dependent on the skills of the operators. As a consequence, much time and resources are required for training microinjection personnel. This situation has been aggravated by the rapid growth of genetic research, which has increased the demand for mutant animals. Therefore, increased productivity and efficiency in this area are highly desired. Here, we pursue these goals through the automation of a previously developed teleoperated blastocyst microinjection system. This included the design of a new system setup to facilitate automation, the definition of rules for automatic microinjections, the implementation of video processing algorithms to extract feedback information from microscope images, and the creation of control algorithms for process automation. Experimentation conducted with this new system and operator assistance during the cells delivery phase demonstrated a 75% microinjection success rate. In addition, implantation of the successfully injected blastocysts resulted in a 53% birth rate and a 20% yield of chimeras. These results proved that the developed system was capable of automatic blastocyst penetration and retraction, demonstrating the success of major steps toward full process automation.

  10. Protokoller til Home Automation

    DEFF Research Database (Denmark)

    Kjær, Kristian Ellebæk

    2008-01-01

    computer that can switch between predefined settings. Sometimes the computer can be controlled remotely over the internet, so that the status of the home can be viewed from a computer or perhaps even from a mobile phone. While the applications mentioned are classic home automation uses, additional functionality has appeared...

  11. Automated conflict resolution issues

    Science.gov (United States)

    Wike, Jeffrey S.

    1991-01-01

    A discussion is presented of how conflicts for Space Network resources should be resolved in the ATDRSS era. The following topics are presented: a description of how resource conflicts are currently resolved; a description of issues associated with automated conflict resolution; present conflict resolution strategies; and topics for further discussion.

  12. Automated Web Applications Testing

    Directory of Open Access Journals (Sweden)

    Alexandru Dan CĂPRIŢĂ

    2009-01-01

    Full Text Available Unit tests are a vital part of several software development practices and processes such as Test-First Programming, Extreme Programming and Test-Driven Development. This article briefly presents software quality and testing concepts as well as an introduction to an automated unit testing framework for PHP web-based applications.

  13. Automated Accounting. Instructor Guide.

    Science.gov (United States)

    Moses, Duane R.

    This curriculum guide was developed to assist business instructors using Dac Easy Accounting College Edition Version 2.0 software in their accounting programs. The module consists of four units containing assignment sheets and job sheets designed to enable students to master competencies identified in the area of automated accounting. The first…

  14. Federation in genomics pipelines: techniques and challenges.

    Science.gov (United States)

    Chaterji, Somali; Koo, Jinkyu; Li, Ninghui; Meyer, Folker; Grama, Ananth; Bagchi, Saurabh

    2017-08-29

    Federation is a popular concept in building distributed cyberinfrastructures, whereby computational resources are provided by multiple organizations through a unified portal, decreasing the complexity of moving data back and forth among multiple organizations. Federation has been used in bioinformatics only to a limited extent, namely, federation of datastores, e.g. SBGrid Consortium for structural biology and Gene Expression Omnibus (GEO) for functional genomics. Here, we posit that it is important to federate both computational resources (CPU, GPU, FPGA, etc.) and datastores to support popular bioinformatics portals, with fast-increasing data volumes and increasing processing requirements. A prime example, and one that we discuss here, is in genomics and metagenomics. It is critical that the processing of the data be done without having to transport the data across large network distances. We exemplify our design and development through our experience with metagenomics-RAST (MG-RAST), the most popular metagenomics analysis pipeline. Currently, it is hosted completely at Argonne National Laboratory. However, through a recently started collaborative National Institutes of Health project, we are taking steps toward federating this infrastructure. Being a widely used resource, we have to move toward federation without disrupting 50 K annual users. In this article, we describe the computational tools that will be useful for federating a bioinformatics infrastructure and the open research challenges that we see in federating such infrastructures. It is hoped that our manuscript can serve to spur greater federation of bioinformatics infrastructures by showing the steps involved, and thus, allow them to scale to support larger user bases. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  15. Automated tissue segmentation of MR brain images in the presence of white matter lesions.

    Science.gov (United States)

    Valverde, Sergi; Oliver, Arnau; Roura, Eloy; González-Villà, Sandra; Pareto, Deborah; Vilanova, Joan C; Ramió-Torrentà, Lluís; Rovira, Àlex; Lladó, Xavier

    2017-01-01

    Over the last few years, the increasing interest in brain tissue volume measurements on clinical settings has led to the development of a wide number of automated tissue segmentation methods. However, white matter lesions are known to reduce the performance of automated tissue segmentation methods, which requires manual annotation of the lesions and refilling them before segmentation, which is tedious and time-consuming. Here, we propose a new, fully automated T1-w/FLAIR tissue segmentation approach designed to deal with images in the presence of WM lesions. This approach integrates a robust partial volume tissue segmentation with WM outlier rejection and filling, combining intensity and probabilistic and morphological prior maps. We evaluate the performance of this method on the MRBrainS13 tissue segmentation challenge database, which contains images with vascular WM lesions, and also on a set of Multiple Sclerosis (MS) patient images. On both databases, we validate the performance of our method with other state-of-the-art techniques. On the MRBrainS13 data, the presented approach was at the time of submission the best ranked unsupervised intensity model method of the challenge (7th position) and clearly outperformed the other unsupervised pipelines such as FAST and SPM12. On MS data, the differences in tissue segmentation between the images segmented with our method and the same images where manual expert annotations were used to refill lesions on T1-w images before segmentation were lower or similar to the best state-of-the-art pipeline incorporating automated lesion segmentation and filling. Our results show that the proposed pipeline achieved very competitive results on both vascular and MS lesions. A public version of this approach is available to download for the neuro-imaging community. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. A color image processing pipeline for digital microscope

    Science.gov (United States)

    Liu, Yan; Liu, Peng; Zhuang, Zhefeng; Chen, Enguo; Yu, Feihong

    2012-10-01

    Digital microscopes have found wide application in fields such as biology and medicine. A digital microscope differs from a traditional optical microscope in that there is no need to observe the sample directly through an eyepiece, because the optical image is projected directly onto a CCD/CMOS camera. However, because of the imaging differences between the human eye and a sensor, a color image processing pipeline is needed for the digital microscope's electronic eyepiece to obtain a fine image. The color image pipeline for a digital microscope, comprising the procedures that convert the RAW image data captured by the sensor into a real color image, is crucial to the quality of the microscopic image. The color pipeline for a digital microscope differs from that of digital still cameras and video cameras because of the specific requirements of microscopic images, which should have high dynamic range, keep the same color as the observed objects, and support a variety of image post-processing. In this paper, a new color image processing pipeline is proposed to satisfy the requirements of digital microscope images. The algorithm for each step in the color image processing pipeline is designed and optimized with the purpose of obtaining high quality images and accommodating diverse user preferences. With the proposed pipeline implemented on the digital microscope platform, the output color images meet the various analysis requirements of images in the medicine and biology fields very well. The major steps of the proposed color imaging pipeline include: black level adjustment, defect pixel removal, noise reduction, linearization, white balance, RGB color correction, tone scale correction and gamma correction.
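
    As a rough illustration of three of the named stages -- black level adjustment, white balance and gamma correction -- the following Python sketch applies them to a demosaiced RGB frame. The constants (black level, gains, gamma, 10-bit white point) are invented for the example, not taken from the paper.

        import numpy as np

        def simple_color_pipeline(raw_rgb, black_level=64, wb_gains=(1.9, 1.0, 1.6),
                                  gamma=2.2, white_point=1023):
            img = raw_rgb.astype(np.float64)
            img = np.clip(img - black_level, 0, None)      # black level adjustment
            img /= (white_point - black_level)             # normalize to [0, 1]
            img *= np.asarray(wb_gains)                    # per-channel white balance
            img = np.clip(img, 0.0, 1.0) ** (1.0 / gamma)  # gamma correction
            return (img * 255).astype(np.uint8)            # 8-bit display image

        frame = np.random.randint(0, 1024, (480, 640, 3))  # stand-in 10-bit sensor data
        print(simple_color_pipeline(frame).shape)          # -> (480, 640, 3)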

  17. Beyond Order 636: Making the most of existing pipeline capacity

    Energy Technology Data Exchange (ETDEWEB)

    Katz, M.

    1995-06-01

    They're not moving natural gas through the nation's vast system of gas pipelines like they used to anymore. The process is becoming more efficient and less costly, but observers say there's still plenty of room for improvement. A variety of new tools and techniques for optimizing pipeline capacity have taken hold under Federal Energy Regulatory Commission (FERC) Order 636, which deregulated and unbundled pipeline transportation and services in 1993. These new techniques allow local gas distribution companies (LDCs) and big end-users to use a variety of mechanisms to guarantee they will have enough capacity on peak-demand days at an economically attractive price. The LDCs and gas users do have to find their own supplies of gas, rather than purchasing it from the nearest pipeline as they did in the past. But market centers and hubs, along with services like backhauls, storage agreements and on-line contracting through electronic bulletin boards (EBBs), have popped up like dandelions since Order 636, enabling LDCs to buy gas from even distant sources and move it with ease from one pipeline to another to their service territories. Under Order 636, LDCs and users who have leased firm transportation capacity on a long-term basis are being allowed to release this capacity during periods when they don't need it. A user can release, or resell, its excess pipeline capacity directly to another user, or it can ask the pipeline to find a buyer. If the transaction is for greater than a month, the package must be posted on an EBB and resold to the highest bidder.

  18. Update on the SDSS-III MARVELS data pipeline development

    Science.gov (United States)

    Li, Rui; Ge, J.; Thomas, N. B.; Petersen, E.; Wang, J.; Ma, B.; Sithajan, S.; Shi, J.; Ouyang, Y.; Chen, Y.

    2014-01-01

    MARVELS (Multi-object APO Radial Velocity Exoplanet Large-area Survey), as one of the four surveys in the SDSS-III program, has monitored over 3,300 stars during 2008-2012, with each being visited an average of 26 times over a 2-year window. Although the early data pipeline was able to detect over 20 brown dwarf candidates and several hundred binaries, no giant planet candidates were reliably identified due to its large systematic errors. Learning from past data pipeline lessons, we re-designed the entire pipeline to handle various types of systematic effects caused by the instrument (such as trace, slant, distortion, drifts and dispersion) and observation condition changes (such as illumination profile and continuum). We also introduced several advanced methods to precisely extract the RV signals. To date, we have achieved a long-term RMS RV measurement error of 14 m/s for HIP-14810 (one of our reference stars) after removal of the known planet signal based on previous HIRES RV measurements. This new 1-D data pipeline has been used to robustly identify four giant planet candidates within the small fraction of the survey data that has been processed (Thomas et al., this meeting). The team is currently working hard to optimize the pipeline, especially the 2-D interference-fringe RV extraction, where early results show a 1.5-times improvement over the 1-D data pipeline. We are quickly approaching the survey baseline performance requirement of 10-35 m/s RMS for 8-12 solar type stars. With this fine-tuned pipeline and the soon-to-be-processed plates of data, we expect to discover many more giant planet candidates and make a large statistical impact on exoplanet studies.

  19. Control-volume-based finite element modelling of liquefaction around a pipeline

    Directory of Open Access Journals (Sweden)

    Leila Kalani Sarokolayi

    2016-07-01

    Full Text Available Pipelines are generally used to transmit energy sources from production sites to target locations. Buried pipelines in saturated deposits can be damaged during seismic loading by liquefaction resulting from the accumulation of excess pore-water pressure. In this paper, the liquefaction potential in the vicinity of an offshore buried pipeline is evaluated using the control-volume-based finite element method. The objective of the present work is to study the effect of different governing parameters, such as the specific weight of the pipeline, Poisson's ratio, soil permeability, deformation modulus, pipeline diameter and pipeline burial depth, on soil liquefaction around a marine pipeline. The results of the analysis indicate that the liquefaction potential is reduced as the deformation modulus, Poisson's ratio and soil permeability coefficient increase. The liquefaction potential around a gas pipeline is much greater than around an oil pipeline.

  20. Toward Automated Interpretation of LC-MS Data for Quality Assurance of a Screening Collection.

    Science.gov (United States)

    Addison, Daniel H

    2016-12-01

    The AstraZeneca Compound Management group uses high-performance liquid chromatography-mass spectrometry for structure elucidation and purity determination of the AstraZeneca compound collection. These activities are conducted in a high-throughput environment where the rate-limiting step is the review and interpretation of analytical results, which is time-consuming and experience dependent. Despite the development of a semiautomated review system, manual interpretation of results remains a bottleneck. Data-mining techniques were applied to archived data to further automate the review process. Various classification models were evaluated using WEKA and Pipeline Pilot (Pipeline Pilot version 8.5.0.200, BIOVIA, San Diego, CA). Results were assessed using criteria including precision, recall, and receiver operating characteristic area. Each model was evaluated as a cost-insensitive classifier and again using MetaCost to apply cost sensitivity. Pruning and variable importance were also investigated. A 10-tree random forest generated with Pipeline Pilot reduced the number of analyses requiring manual review to 36.4% using a threshold of 90% confidence in predictions. This represents a 45% reduction in manual reviews compared with the previous system, delivering an annual savings of $45,000 or an increase in capacity from 25,000 analyses per month up to 45,000 with the same resource levels. © 2015 Society for Laboratory Automation and Screening.
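
    A sketch of the triage idea -- train a classifier, auto-accept only predictions made with at least 90% confidence, and route the rest to manual review -- using scikit-learn and synthetic data rather than Pipeline Pilot and the archived LC-MS results:

        from sklearn.datasets import make_classification
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import train_test_split

        X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        # 10-tree random forest, mirroring the model size quoted above
        clf = RandomForestClassifier(n_estimators=10, random_state=0).fit(X_tr, y_tr)

        proba = clf.predict_proba(X_te)
        confident = proba.max(axis=1) >= 0.90   # auto-accepted analyses
        print(f"sent to manual review: {100 * (1 - confident.mean()):.1f}%")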

  1. An Automated Images-to-Graphs Framework for High Resolution Connectomics

    Directory of Open Access Journals (Sweden)

    William R Gray Roncal

    2015-08-01

    Full Text Available Reconstructing a map of neuronal connectivity is a critical challenge in contemporary neuroscience. Recent advances in high-throughput serial section electron microscopy (EM) have produced massive 3D image volumes of nanoscale brain tissue for the first time. The resolution of EM allows for individual neurons and their synaptic connections to be directly observed. Recovering neuronal networks by manually tracing each neuronal process at this scale is unmanageable, and therefore researchers are developing automated image processing modules. Thus far, state-of-the-art algorithms focus only on the solution to a particular task (e.g., neuron segmentation or synapse identification). In this manuscript we present the first fully automated images-to-graphs pipeline (i.e., a pipeline that begins with an imaged volume of neural tissue and produces a brain graph without any human interaction). To evaluate overall performance and select the best parameters and methods, we also develop a metric to assess the quality of the output graphs. We evaluate a set of algorithms and parameters, searching possible operating points to identify the best available brain graph for our assessment metric. Finally, we deploy a reference end-to-end version of the pipeline on a large, publicly available data set. This provides a baseline result and framework for community analysis and future algorithm development and testing. All code and data derivatives have been made publicly available toward eventually unlocking new biofidelic computational primitives and understanding of neuropathologies.

  2. REALTIME MONITORING OF PIPELINES FOR THIRD-PARTY CONTACT

    Energy Technology Data Exchange (ETDEWEB)

    Gary L. Burkhardt

    2005-12-31

    Third-party contact with pipelines (typically caused by contact with a digging or drilling device) can result in mechanical damage to the pipe, in addition to coating damage that can initiate corrosion. Because this type of damage often goes unreported and can lead to eventual catastrophic failure of the pipe, a reliable, cost-effective method is needed for monitoring the pipeline and reporting third-party contact events. The impressed alternating cycle current (IACC) pipeline monitoring method developed by Southwest Research Institute (SwRI) consists of impressing electrical signals on the pipe by generating a time-varying voltage between the pipe and the soil. The signal voltage between the pipe and ground is monitored continuously at receiving stations located some distance away. Third-party contact to the pipe that breaks through the coating (thus resulting in a signal path to ground) changes the signal received at the receiving stations. The IACC method was shown to be a viable method that can be used to continuously monitor pipelines for third-party contact. Electrical connections to the pipeline can be made through existing cathodic protection (CP) test points without the need to dig up the pipe. The instrumentation is relatively simple, consisting of (1) a transmitting station with a frequency-stable oscillator and amplifier and (2) a receiving station with a filter, lock-in amplifier, frequency-stable oscillator, and remote reporting device (e.g. cell phone system). Maximum distances between the transmitting and receiving stations are approximately 1.61 km (1 mile), although the length of pipeline monitored can be twice this using a single transmitter and one receiver on each side (since the signal travels in both directions). Certain conditions such as poor pipeline coatings or strong induced 60-Hz signals on the pipeline can degrade IACC performance, so localized testing should be performed to determine the suitability for an IACC installation at a given
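
    The receiving-station logic amounts to watching the impressed signal for a sustained change. A minimal sketch (illustrative only, not SwRI's implementation) that raises an alarm when the lock-in amplifier output drops well below its baseline, as it would if a contact created a new signal path to ground:

        def contact_alarm(amplitudes, baseline, drop_fraction=0.3):
            """Yield indices of samples more than 30% below baseline."""
            threshold = baseline * (1.0 - drop_fraction)
            for i, amplitude in enumerate(amplitudes):
                if amplitude < threshold:
                    yield i

        readings = [1.02, 0.99, 1.01, 0.55, 0.54]  # volts at the lock-in output
        print(list(contact_alarm(readings, baseline=1.0)))  # -> [3, 4]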

  3. Corporate social responsibility along pipelines: communities and corporations working together

    Energy Technology Data Exchange (ETDEWEB)

    Carvalho, Edison D.R.; Lopes, Luciano E.; Danciguer, Lucilene; Macarini, Samuel; Souza, Maira de [Grupo de Aplicacao Interdisciplinar a Aprendizagem (GAIA), Campinas, SP (Brazil)

    2009-07-01

    In this paper we present GAIA's findings in three corporate social responsibility projects along pipelines owned by three Brazilian companies in the gas, oil and mining sectors. The main goal of the projects was to improve the relationship with communities in the companies' direct influence areas. Clearly, the relationship with communities along pipelines is essential to prevent and reduce industrial hazards. Damage to pipelines caused by agriculture, buildings, intentional perforations and heavy vehicle traffic may cause fatal accidents and environmental and material losses. Such accidents have negative consequences for the economy, the company's image and the relationship with communities and environmental agencies. From the communities' perspective, pipelines deteriorate their quality of life because of the risk of industrial hazards near their houses. The lack of proper information about the pipelines markedly increases feelings of insecurity and discourse against the companies among community leaders. The methodology developed by GAIA addresses both companies' and communities' interests and encompasses nine stages. 1. Socio-environmental appraisal or inventory, mapping the main risks, communities' needs and their leaders. 2. Communication plan, defining strategies, languages and communication vehicles for each stakeholder group. 3. Inter-institutional meetings to include other institutions in the program. 4. Launching seminar in partnership with local authorities, publicizing the companies' actions in the cities crossed by pipelines. 5. Multiplier agents formation, enabling teachers, local leaders and government representatives to disseminate correct information about the pipelines, such as their functioning, hazard prevention, maintenance actions, and restrictions on activities over the pipelines. 6. Formation on project management, enabling teachers, local leaders and government representatives to elaborate, fund-raise and manage socio-environmental projects aimed at

  4. Challenges in developing a comprehensive, automated and flexible oil accounting system

    Energy Technology Data Exchange (ETDEWEB)

    Nordell, L.F.; Ruda, H. [Enbridge Pipelines Inc., Edmonton, AB (Canada)

    2004-07-01

    Enbridge Pipelines Inc. operates a long and complex system of pipelines which transport hydrocarbon liquid commodities, including crude oils, refined products and natural gas liquids across provincial and national boundaries. Due to growing internal demands for the addition of pipelines, the company developed a comprehensive oil accounting (OA) system in 1998. The OA accommodated changing business requirements by incorporating more complex tariff agreements and the demand for customized system reporting for customers. The new OA was also compatible with the company's technology direction, which focused on aligning information technology (IT) with business drivers and upgrading the flexibility of the entire OA system. This paper summarized the business improvement and redevelopment study; the OA system redevelopment project; challenges of integrating new system components with the legacy system during system development; specific process improvement results that were targeted and achieved by the new system development team; and, the project development challenges. The project team adopted a software development approach that was responsive to changes in the requirements and in the project direction introduced by key stakeholders during the life of the project. The project management approaches and logic of the SCRUM and extreme programming (XP) methods were combined to incorporate the principles of agile development to ensure a quality product. The result was an automated crude oil balancing and revenue accounting system that interfaced with other pipeline management software systems. 6 refs., 5 figs.

  5. FUZZY INFERENCE BASED LEAK ESTIMATION IN WATER PIPELINES SYSTEM

    Directory of Open Access Journals (Sweden)

    N. Lavanya

    2015-01-01

    Full Text Available Pipeline networks are the most widely used mode for transporting fluids and gases around the world. Leakage in these pipelines causes harmful effects when the flowing fluid or gas is hazardous, hence leak detection becomes essential to avoid or minimize such undesirable effects. This paper presents leak detection by spectral analysis methods in a laboratory pipeline system. Transients in the pressure signal are created by opening and closing the exit valve. These pressure variations are captured and the power spectrum is obtained using the Fast Fourier Transform (FFT) method and the Filter Diagonalization Method (FDM). Leaks at various positions are simulated and located using these methods and the results are compared. To determine the quantity of the leak, a 2 × 1 (two-input, one-output) fuzzy inference system is created using the upstream and downstream pressures as inputs and the leak size as the output. Thus complete leak detection, localization and quantification are achieved using only the pressure variations in the pipeline.
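
    The FFT step can be sketched in a few lines of Python: record the pressure transient after a valve operation, remove the mean, and inspect the power spectrum (leak-induced reflections shift the resonant peaks with leak position). The signal below is synthetic; the sampling rate and decay are invented for the example, and the peak-picking and fuzzy sizing stages are omitted.

        import numpy as np

        fs = 1000.0                                # sampling rate, Hz (assumed)
        t = np.arange(0, 5.0, 1.0 / fs)
        # decaying oscillation + noise standing in for a measured transient
        p = np.exp(-t) * np.sin(2 * np.pi * 12.5 * t) + 0.01 * np.random.randn(t.size)

        spectrum = np.abs(np.fft.rfft(p - p.mean())) ** 2
        freqs = np.fft.rfftfreq(p.size, 1.0 / fs)
        print(f"dominant frequency: {freqs[spectrum.argmax()]:.2f} Hz")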

  6. Location class change impact on onshore gas pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Cardoso, Cassia de Oliveira; Oliveira, Luiz Fernando Seixas de [DNV Energy Solutions, Oslo (Norway); Leal, Cesar Antonio [DNV Energy Solutions, Porto Alegre, RS (Brazil); Faertes, Denise [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil). Gas and Energy

    2009-07-01

    During a pipeline life cycle, some significant changes in the population may happen along its route. Such changes are indirectly evaluated by the increase in the amount of buildings constructed along the route, which determines the so called Location Class. Such changes, after licensing, provoke differences between what is required by the standards and what is actually done. This work has two goals. One is to study the requirements of international standards and legislations as well as some solutions used in the United States, Canada, United Kingdom and Netherlands. This goal intends to provide some technical bases for a comparative analysis on how the location class changes, during the life cycle of a pipeline, are treated in each country. Another goal is to present a risk-based methodology for the guideline development which can be used in decision-making concerning what to do in case of any location class change. Particularly, it has given special attention to the requirements which are imposed for the pipeline operational license continuation. This work is of supreme importance for the Brazilian pipeline segment, since the existing Brazilian design standard, ABNT NBR12712 for transmission and distribution pipeline design, does not deal with that issue. Moreover, a summary of the main solutions found in those countries together with a guideline, customized for the Brazilian reality, is presented. (author)

  7. bicycle: a bioinformatics pipeline to analyze bisulfite sequencing data.

    Science.gov (United States)

    Graña, Osvaldo; López-Fernández, Hugo; Fdez-Riverola, Florentino; González Pisano, David; Glez-Peña, Daniel

    2017-12-01

    High-throughput sequencing of bisulfite-converted DNA is a technique used to measure DNA methylation levels. Although a considerable number of computational pipelines have been developed to analyze such data, none of them tackles all the peculiarities of the analysis together, revealing limitations that can force the user to manually perform additional steps needed for a complete processing of the data. This article presents bicycle, an integrated, flexible analysis pipeline for bisulfite sequencing data. Bicycle analyzes whole genome bisulfite sequencing data, targeted bisulfite sequencing data, and hydroxymethylation data. To show how bicycle overtakes other available pipelines, we compared them on a defined number of features that are summarized in a table. We also tested bicycle with both simulated and real datasets, to show its level of performance, and compared it to different state-of-the-art methylation analysis pipelines. Bicycle is publicly available under GNU LGPL v3.0 license at http://www.sing-group.org/bicycle. Users can also download a customized Ubuntu LiveCD including bicycle and other bisulfite sequencing data pipelines compared here. In addition, a docker image with bicycle and its dependencies, which allows a straightforward use of bicycle in any platform (e.g. Linux, OS X or Windows), is also available. ograna@cnio.es, dgpena@uvigo.es. Supplementary data are available at Bioinformatics online.

  8. Optimal Energy Consumption Analysis of Natural Gas Pipeline

    Science.gov (United States)

    Liu, Enbin; Li, Changjun; Yang, Yi

    2014-01-01

    There are many compressor stations along long-distance natural gas pipelines. Natural gas can be transported using different boot programs and import pressures, combined with temperature control parameters. Moreover, different transport methods have correspondingly different energy consumptions. At present, the operating parameters of many pipelines are determined empirically by dispatchers, resulting in high energy consumption. This practice does not abide by energy reduction policies. Therefore, based on a full understanding of the actual needs of pipeline companies, we introduce production unit consumption indicators to establish an objective function for achieving the goal of lowering energy consumption. By using a dynamic programming method for solving the model and preparing calculation software, we can ensure that the solution process is quick and efficient. Using established optimization methods, we analyzed the energy savings for the XQ gas pipeline. By optimizing the boot program, the import station pressure, and the temperature parameters, we achieved the optimal energy consumption. By comparison with the measured energy consumption, the pipeline now has the potential to reduce energy consumption by 11 to 16 percent. PMID:24955410
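
    A toy version of the dynamic program, assuming a made-up cost model: at each station choose a discharge pressure from a discrete set so that total compression energy is minimized while the gas still arrives above a minimum pressure. The pressure grid, segment loss and logarithmic cost are illustrative, not the paper's model.

        import math

        PRESSURES = [5.0, 5.5, 6.0, 6.5, 7.0]  # candidate discharge pressures, MPa
        DROP = 1.2                              # pressure loss per segment, MPa
        N_STATIONS = 4

        def energy(p_in, p_out):
            # compression work grows with the pressure ratio
            return math.inf if p_out <= p_in else math.log(p_out / p_in)

        def optimize(p_start=5.0, p_min_end=4.5):
            best = {p_start: 0.0}               # suction pressure -> minimum energy
            for _ in range(N_STATIONS):
                nxt = {}
                for p_in, cost in best.items():
                    for p_out in PRESSURES:
                        c = cost + energy(p_in, p_out)
                        arrive = p_out - DROP   # suction pressure at next station
                        if c < nxt.get(arrive, math.inf):
                            nxt[arrive] = c
                best = nxt
            return min(c for p, c in best.items() if p >= p_min_end)

        print(f"minimum total energy (arbitrary units): {optimize():.3f}")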

  9. Collapse arresters for deep water pipelines: identification of crossover mechanisms

    Energy Technology Data Exchange (ETDEWEB)

    Toscano, Rita G.; Mantovano, Luciano; Assanelli, Andrea; Amenta, Pablo; Johnson, Daniel; Charreau, Roberto; Dvorkin, Eduardo [Tenaris Center for industrial Research (CINI), Siderca, Campana (Argentina)

    2005-07-01

    Deep water pipelines, normally subjected to external pressure and bending, fail by structural collapse when the external loading exceeds the pipe's collapse limit surface. For steel pipes, the influence of manufacturing imperfections on this limit surface has been thoroughly studied by CINI using finite element models validated via full-scale laboratory tests. After a steel pipeline collapses, the collapse either remains confined to the initiation section or propagates along the pipeline, the second alternative being the more detrimental one for pipeline integrity. Therefore, it is necessary to build periodic reinforcements into the pipeline to act as arresters for the propagating collapse. Using finite element models, we study the crossover of collapse arresters by the propagating collapse. The occurrence of different crossover mechanisms is determined by the geometry of the pipes and of the arresters. Laboratory tests were carried out at CINI in order to obtain experimental results that could be used to validate the numerical models. In this paper, we compare the numerical and experimental results for external pressure lo (author)

  10. Offshore pipelines in cold regions : environmental loadings and geotechnical considerations

    Energy Technology Data Exchange (ETDEWEB)

    Paulin, M.J.; Caines, J. [IMV Projects Atlantic, St. John' s, NL (Canada); Kenny, S.P. [Memorial Univ. of Newfoundland, St. John' s, NL (Canada). Dept. of Engineering and Applied Science; Palmer, A.C. [National Univ. of Singapore, Queenstown, (Singapore). Dept. of Civil Engineering; Been, K. [Golder Associates Inc., Houston, TX (United States)

    2008-09-15

    As a result of the industry's continued search for oil and gas in frontier offshore locations, several developments have occurred in areas characterized by seasonal ice cover including the United States Beaufort, North Caspian, and Sakhalin Island. Pipeline transportation systems have been utilized as a cost-effective, safe and reliable mode of hydrocarbon transport to shore in these projects. However, ice gouging is a primary design issue that affects engineering considerations with respect to strain based design, target burial depth requirements, cost and safety. This paper discussed environmental loadings and geotechnical considerations with respect to offshore pipelines in cold regions. The paper discussed data collection needs such as ice gouge data specific to the pipeline route, and information on sediment transport to determine burial depths for a given ice gouge environment. Ice gouging was also defined in terms of subgouge soil deformations and gouge depth from gouge survey observations. Last, the paper addressed modeling coupled ice keel/seabed/pipeline interaction, trenching and backfilling, and implications for pipeline design and construction. It was concluded that evaluation of the system demand and system capacity influence engineering design considerations that may impact the construction and operational phases. Important issues include material selection; linepipe qualification; welding procedures; limit state criteria for strain based design; leak detection; condition monitoring systems and in-line inspection tools. 34 refs., 10 figs.

  11. The Application of the Semi-quantitative Risk Assessment Method to Urban Natural Gas Pipelines

    Directory of Open Access Journals (Sweden)

    YongQiang BAI

    2013-07-01

    Full Text Available This paper provides a method of semi-quantitative risk assessment for urban gas pipelines, obtained by modifying the Kent analysis method. The factors influencing fault frequency and consequence for urban gas pipelines are analyzed, and grading rules are studied. Grading rules for fault frequency and consequence for urban natural gas pipelines are provided. Using a semi-quantitative risk matrix, the risk grade of urban gas pipelines is obtained and a primary risk ranking of the pipelines can be accomplished, so as to identify high-risk pipeline units.
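
    The matrix step reduces to a lookup from (frequency grade, consequence grade) to a risk grade. A minimal Python sketch with invented grade boundaries (the paper's own grading rules differ):

        def risk_grade(freq, cons):
            """Map frequency and consequence grades (1-5 each) to a risk grade."""
            score = freq * cons                 # semi-quantitative risk score, 1-25
            if score >= 15:
                return "High"
            if score >= 6:
                return "Medium"
            return "Low"

        units = {"unit A": (4, 5), "unit B": (2, 2), "unit C": (3, 3)}
        # primary risk sort, highest first
        for name in sorted(units, key=lambda u: units[u][0] * units[u][1], reverse=True):
            f, c = units[name]
            print(name, risk_grade(f, c))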

  12. Constructing Flexible, Configurable, ETL Pipelines for the Analysis of "Big Data" with Apache OODT

    Science.gov (United States)

    Hart, A. F.; Mattmann, C. A.; Ramirez, P.; Verma, R.; Zimdars, P. A.; Park, S.; Estrada, A.; Sumarlidason, A.; Gil, Y.; Ratnakar, V.; Krum, D.; Phan, T.; Meena, A.

    2013-12-01

    A plethora of open source technologies for manipulating, transforming, querying, and visualizing 'big data' have blossomed and matured in the last few years, driven in large part by recognition of the tremendous value that can be derived by leveraging data mining and visualization techniques on large data sets. One facet of many of these tools is that input data must often be prepared into a particular format (e.g.: JSON, CSV), or loaded into a particular storage technology (e.g.: HDFS) before analysis can take place. This process, commonly known as Extract-Transform-Load, or ETL, often involves multiple well-defined steps that must be executed in a particular order, and the approach taken for a particular data set is generally sensitive to the quantity and quality of the input data, as well as the structure and complexity of the desired output. When working with very large, heterogeneous, unstructured or semi-structured data sets, automating the ETL process and monitoring its progress becomes increasingly important. Apache Object Oriented Data Technology (OODT) provides a suite of complementary data management components called the Process Control System (PCS) that can be connected together to form flexible ETL pipelines as well as browser-based user interfaces for monitoring and control of ongoing operations. The lightweight, metadata driven middleware layer can be wrapped around custom ETL workflow steps, which themselves can be implemented in any language. Once configured, it facilitates communication between workflow steps and supports execution of ETL pipelines across a distributed cluster of compute resources. As participants in a DARPA-funded effort to develop open source tools for large-scale data analysis, we utilized Apache OODT to rapidly construct custom ETL pipelines for a variety of very large data sets to prepare them for analysis and visualization applications. We feel that OODT, which is free and open source software available through the Apache
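
    The wrapper-around-custom-steps design can be illustrated with a generic, metadata-driven step chain -- a sketch of the pattern only, not Apache OODT's actual API. Each step receives and returns a metadata dictionary, so pipelines can be reconfigured per data set and progress can be observed between steps:

        from typing import Callable, Dict, List

        Step = Callable[[Dict], Dict]

        def extract(meta: Dict) -> Dict:
            meta["records"] = [" 42 ", "17", " 8"]   # stand-in for a real source
            return meta

        def transform(meta: Dict) -> Dict:
            meta["records"] = [int(r.strip()) for r in meta["records"]]
            return meta

        def load(meta: Dict) -> Dict:
            meta["loaded"] = len(meta["records"])    # stand-in for a real sink
            return meta

        def run_pipeline(steps: List[Step]) -> Dict:
            meta: Dict = {}
            for step in steps:
                meta = step(meta)                    # progress visible via metadata
                print(f"completed {step.__name__}: {meta}")
            return meta

        run_pipeline([extract, transform, load])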

  13. Benchmark datasets for phylogenomic pipeline validation, applications for foodborne pathogen surveillance.

    Science.gov (United States)

    Timme, Ruth E; Rand, Hugh; Shumway, Martin; Trees, Eija K; Simmons, Mustafa; Agarwala, Richa; Davis, Steven; Tillman, Glenn E; Defibaugh-Chavez, Stephanie; Carleton, Heather A; Klimke, William A; Katz, Lee S

    2017-01-01

    As next generation sequence technology has advanced, there have been parallel advances in genome-scale analysis programs for determining evolutionary relationships as proxies for epidemiological relationship in public health. Most new programs skip traditional steps of ortholog determination and multi-gene alignment, instead identifying variants across a set of genomes, then summarizing results in a matrix of single-nucleotide polymorphisms or alleles for standard phylogenetic analysis. However, public health authorities need to document the performance of these methods with appropriate and comprehensive datasets so they can be validated for specific purposes, e.g., outbreak surveillance. Here we propose a set of benchmark datasets to be used for comparison and validation of phylogenomic pipelines. We identified four well-documented foodborne pathogen events in which the epidemiology was concordant with routine phylogenomic analyses (reference-based SNP and wgMLST approaches). These are ideal benchmark datasets, as the trees, WGS data, and epidemiological data for each are all in agreement. We have placed these sequence data, sample metadata, and "known" phylogenetic trees in publicly-accessible databases and developed a standard descriptive spreadsheet format describing each dataset. To facilitate easy downloading of these benchmarks, we developed an automated script that uses the standard descriptive spreadsheet format. Our "outbreak" benchmark datasets represent the four major foodborne bacterial pathogens ( Listeria monocytogenes , Salmonella enterica , Escherichia coli , and Campylobacter jejuni ) and one simulated dataset where the "known tree" can be accurately called the "true tree". The downloading script and associated table files are available on GitHub: https://github.com/WGS-standards-and-analysis/datasets. These five benchmark datasets will help standardize comparison of current and future phylogenomic pipelines, and facilitate important cross

  14. Benchmark datasets for phylogenomic pipeline validation, applications for foodborne pathogen surveillance

    Directory of Open Access Journals (Sweden)

    Ruth E. Timme

    2017-10-01

    Full Text Available Background As next generation sequence technology has advanced, there have been parallel advances in genome-scale analysis programs for determining evolutionary relationships as proxies for epidemiological relationship in public health. Most new programs skip traditional steps of ortholog determination and multi-gene alignment, instead identifying variants across a set of genomes, then summarizing results in a matrix of single-nucleotide polymorphisms or alleles for standard phylogenetic analysis. However, public health authorities need to document the performance of these methods with appropriate and comprehensive datasets so they can be validated for specific purposes, e.g., outbreak surveillance. Here we propose a set of benchmark datasets to be used for comparison and validation of phylogenomic pipelines. Methods We identified four well-documented foodborne pathogen events in which the epidemiology was concordant with routine phylogenomic analyses (reference-based SNP and wgMLST approaches. These are ideal benchmark datasets, as the trees, WGS data, and epidemiological data for each are all in agreement. We have placed these sequence data, sample metadata, and “known” phylogenetic trees in publicly-accessible databases and developed a standard descriptive spreadsheet format describing each dataset. To facilitate easy downloading of these benchmarks, we developed an automated script that uses the standard descriptive spreadsheet format. Results Our “outbreak” benchmark datasets represent the four major foodborne bacterial pathogens (Listeria monocytogenes, Salmonella enterica, Escherichia coli, and Campylobacter jejuni and one simulated dataset where the “known tree” can be accurately called the “true tree”. The downloading script and associated table files are available on GitHub: https://github.com/WGS-standards-and-analysis/datasets. Discussion These five benchmark datasets will help standardize comparison of current and

  15. ATLAS Distributed Computing Automation

    CERN Document Server

    Schovancova, J; The ATLAS collaboration; Borrego, C; Campana, S; Di Girolamo, A; Elmsheuser, J; Hejbal, J; Kouba, T; Legger, F; Magradze, E; Medrano Llamas, R; Negri, G; Rinaldi, L; Sciacca, G; Serfon, C; Van Der Ster, D C

    2012-01-01

    The ATLAS Experiment benefits from computing resources distributed worldwide at more than 100 WLCG sites. The ATLAS Grid sites provide over 100k CPU job slots, over 100 PB of storage space on disk or tape. Monitoring of status of such a complex infrastructure is essential. The ATLAS Grid infrastructure is monitored 24/7 by two teams of shifters distributed world-wide, by the ATLAS Distributed Computing experts, and by site administrators. In this paper we summarize automation efforts performed within the ATLAS Distributed Computing team in order to reduce manpower costs and improve the reliability of the system. Different aspects of the automation process are described: from the ATLAS Grid site topology provided by the ATLAS Grid Information System, via automatic site testing by the HammerCloud, to automatic exclusion from production or analysis activities.

  16. AUTOMATED API TESTING APPROACH

    OpenAIRE

    SUNIL L. BANGARE; SEEMA BORSE; PALLAVI S. BANGARE; SHITAL NANDEDKAR

    2012-01-01

    Software testing is an investigation conducted to provide stakeholders with information about the quality of the product or service under test. With the help of software testing we can verify or validate the software product. Normally testing is done after the development of the software, but it can also be performed during the development process. This paper gives a brief introduction to an automated API testing tool. This kind of testing tool will reduce much of the headache ...
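
    For concreteness, an automated API test of the kind such tools exercise can be as small as the following pytest/requests sketch; the endpoint URL and expected fields are placeholders, not taken from the paper.

        import requests

        BASE_URL = "https://api.example.com"   # hypothetical service under test

        def test_get_user_returns_ok_and_expected_fields():
            resp = requests.get(f"{BASE_URL}/users/1", timeout=5)
            assert resp.status_code == 200
            body = resp.json()
            assert body.get("id") == 1         # field contract of the endpoint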

  17. Scalable Automated Model Search

    Science.gov (United States)

    2014-05-20

    related to GHOSTFACE is Auto-Weka [38]. As the name suggests, Auto-Weka aims to automate the use of Weka [10] by applying recent derivative-free...algorithm is one of the many optimization algorithms we use as part of GHOSTFACE. However, in contrast to GHOSTFACE, Auto-Weka focuses on single node...performance and does not optimize the parallel execution of algorithms. Moreover, Auto-Weka treats algorithms as black boxes to be executed and

  18. Automated Cooperative Trajectories

    Science.gov (United States)

    Hanson, Curt; Pahle, Joseph; Brown, Nelson

    2015-01-01

    This presentation is an overview of the Automated Cooperative Trajectories project. An introduction to the phenomena of wake vortices is given, along with a summary of past research into the possibility of extracting energy from the wake by flying close parallel trajectories. Challenges and barriers to adoption of civilian automatic wake surfing technology are identified. A hardware-in-the-loop simulation is described that will support future research. Finally, a roadmap for future research and technology transition is proposed.

  19. Automated Digital Dental Articulation

    OpenAIRE

    Xia, James J.; Chang, Yu-Bing; Gateno, Jaime; Xiong, Zixiang; Zhou, Xiaobo

    2010-01-01

    Articulating digital dental models is often inaccurate and very time-consuming. This paper presents an automated approach to efficiently articulate digital dental models to maximum intercuspation (MI). There are two steps in our method. The first step is to position the models to an initial position based on dental curves and a point matching algorithm. The second step is to finally position the models to the MI position based on our novel approach of using iterative surface-based minimum dis...

  20. Development Of A Centrifugal Hydrogen Pipeline Gas Compressor

    Energy Technology Data Exchange (ETDEWEB)

    Di Bella, Francis A. [Concepts NREC, White River Junction, VT (United States)

    2015-04-16

    Concepts NREC (CN) has completed a Department of Energy (DOE) sponsored project to analyze, design, and fabricate a pipeline-capacity hydrogen compressor. The pipeline compressor is a critical component in the DOE strategy to provide sufficient quantities of hydrogen to support the expected shift in transportation fuels from liquid and natural gas to hydrogen. The hydrogen would be generated by renewable energy (solar, wind, and perhaps even tidal or ocean), and would be electrolyzed from water. The hydrogen would then be transported to the population centers in the U.S., where fuel-cell vehicles are expected to become popular and necessary to relieve dependency on fossil fuels. The specifications for the required pipeline hydrogen compressor indicate the need for a small package that is efficient, less costly, and more reliable than what is available in the form of a multi-cylinder, reciprocating (positive displacement) compressor for compressing hydrogen in the gas industry.

  1. Coal log pipeline research at the University of Missouri

    Energy Technology Data Exchange (ETDEWEB)

    Liu, H.

    1992-03-01

    Project tasks: (1) Perform the necessary testing and development to demonstrate that the amount of binder in coal logs can be reduced to 8% or lower, producing logs with adequate strength to eliminate breakage during pipeline transportation under conditions experienced in long-distance pipeline systems; prior to conducting any testing and demonstration, the grantee shall perform an information search and fully determine all previous attempts to extrude or briquette coal, upon which the testing and demonstration shall be based. (2) Perform the necessary development to demonstrate a small model of the most promising injection system for coal logs, and test the logs produced in Task 1. (3) Conduct an economic analysis of the coal log pipeline based upon the work to date, and refine and complete the economic model. (4) Prepare a final report for DOE.

  2. Directional drill keys completion of South China Sea pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Callnon, D. [Cherrington Corp., Sacramento, CA (United States); Weeks, K. [KRW Associates, Sacramento, CA (United States)

    1996-04-08

    Directional drilling laid dual 12-in. natural gas pipelines beneath a critical sea wall on Lantau Island, Hong Kong New Territories, to complete a 30-mile gas-pipeline crossing of the South China Sea. The project was part of Towngas Lantau construction for Hong Kong's new Chek Lap Kok International Airport on the island. To avoid disturbing a newly installed sea wall at Ta Pang Po beach, NKK subcontracted parallel beach approaches to Cherrington Corp., Sacramento. Between July 11 and Aug. 2, 1995, Cherrington Corp. drilled and forward-reamed two 20-in., 1,294-ft holes to pull back the twin pipelines. The project was completed during typhoon weather, high seas, strong currents, and logistical problems associated with operating in a remote uninhabited area. This paper reviews the design of the beach approach entries; staging and site preparations; drilling equipment used; and overall project operations.

  3. Pipeline monitoring with interferometry in non-arid regions

    Energy Technology Data Exchange (ETDEWEB)

    McCardle, Adrian; Rabus, Bernhard; Ghuman, Parwant [MacDonald Dettwiler, Richmond, BC (Canada); Freymueller, Jeff T. [University of Alaska, Fairbanks (United States)

    2005-07-01

    Interferometry has become a proven technique for accurately measuring ground movements caused by subsidence, landslides, earthquakes and volcanoes. Using spaceborne sensors such as the ERS, ENVISAT, and RADARSAT satellites, ground deformation can be monitored at the millimeter level. Traditionally, interferometry has been limited to arid areas; however, new technology now allows successful monitoring in vegetated regions and areas of changing land cover. Analysis of ground movement of the Trans-Alaskan pipeline demonstrates how these techniques can offer pipeline engineers a new tool for observing potential dangers to pipeline integrity. Results from Interferometric Point Target Analysis were compared with GPS measurements, and speckle-tracking interferometry was demonstrated on a major earthquake. (author)

  4. Simulation of pipeline in the area of the underwater crossing

    Science.gov (United States)

    Burkov, P.; Chernyavskiy, D.; Burkova, S.; Konan, E. C.

    2014-08-01

    The article studies the stress-strain behavior of a section of the Alexandrovskoye-Anzhero-Sudzhensk main oil pipeline using the Ansys software system. This method of examining and assessing the technical condition of pipeline transport facilities studies the objects and the processes that affect their condition, including research based on computer simulation. Such an approach supports the development of theory, calculation methods, and design practice for pipeline transport facilities and for machine units and parts, regardless of industry or purpose, with a view to improving existing constructions and creating new structures and machines of high performance, durability, reliability, and maintainability at low material consumption and cost, making them competitive on the world market.

  5. Precommissioning services for RG2 gas pipeline, Roncador Field, Brazil

    Energy Technology Data Exchange (ETDEWEB)

    Azevedo Filho, Ayres de [PETROBRAS, Rio de Janeiro, RJ (Brazil); Gualdron, Guillermo [Halliburton Servicos Ltda., RJ (Brazil)

    2003-07-01

    Gas pipelines composed of rigid sections connected to the production units with flexible risers are not uncommon in the Campos Basin, offshore Brazil. The precommissioning activities in these pipelines face the challenge of high pressures during dewatering and the risk of high volumes of remaining water in the flexible sections after pig swabbing. The combined operation of dewatering and conditioning using a pig train of swabbing pigs and glycol propelled by nitrogen has proven to be an efficient method in these cases. This paper describes the theory supporting this methodology and the results of a practical case recently completed: the precommissioning of the RG2 gas export pipeline after connection to the FPSO Brazil production unit that replaced the P36 Platform. (author)

  6. An MILP formulation for the scheduling of multiproduct pipeline systems

    Directory of Open Access Journals (Sweden)

    R. Rejowski Jr.

    2002-12-01

    Full Text Available Pipelines provide an economic mode of fluid transportation for petroleum systems, especially when large amounts of these products have to be pumped over large distances. The system discussed in this paper is composed of a petroleum refinery, a multiproduct pipeline connected to several depots and the corresponding consumer markets that receive large amounts of gasoline, diesel, LPG and aviation fuel. An MILP optimization model that is based on a convex-hull formulation is proposed for the scheduling system. The model must satisfy all the operational constraints, such as mass balances, distribution constraints and product demands. Results generated include the inventory levels at all locations, the distribution of products between the depots and the best ordering of products in the pipeline.
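
    The record does not reproduce the convex-hull formulation itself, but a toy model conveys the flavor of such a scheduling MILP. The sketch below (Python with the PuLP library; the single-depot data, variable names, and objective are illustrative assumptions, not the authors' model) enforces single-product occupancy of the line per time slot and demand satisfaction at the depot.

        import pulp

        products, slots = ["diesel", "gasoline"], range(6)
        demand = {"diesel": 30.0, "gasoline": 50.0}   # total demand at the depot
        batch = 20.0                                  # volume pumped per slot

        m = pulp.LpProblem("pipeline_scheduling", pulp.LpMinimize)
        # x[p][t] = 1 if product p occupies the pipeline during slot t
        x = pulp.LpVariable.dicts("pump", (products, slots), cat="Binary")

        # single pipeline: at most one product per slot
        for t in slots:
            m += pulp.lpSum(x[p][t] for p in products) <= 1

        # mass balance / demand satisfaction at the depot
        for p in products:
            m += batch * pulp.lpSum(x[p][t] for t in slots) >= demand[p]

        # minimize the number of pumping slots used
        m += pulp.lpSum(x[p][t] for p in products for t in slots)
        m.solve(pulp.PULP_CBC_CMD(msg=False))
        print({p: [t for t in slots if x[p][t].value() > 0.5] for p in products})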

  7. Detection of underground pipeline based on Golay waveform design

    Science.gov (United States)

    Dai, Jingjing; Xu, Dazhuan

    2017-08-01

    The detection of underground pipelines is an important problem in the development of cities, but research on it is not yet mature. In this paper, based on the principle of waveform design in wireless communication, we design an acoustic signal detection system to locate underground pipelines. Following the principle of acoustic localization, we chose the DSP-F28335 development board as the master controller, with DA and AD modules. The DA module emits a complementary Golay sequence as the probe signal. The AD module acquires data synchronously, so that the echo signals containing the position information of the target are recovered through signal processing. The test results show that the method in this paper can not only estimate the sound velocity in the soil but also locate underground pipelines accurately.
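
    What makes complementary Golay pairs attractive for this kind of probing is that the autocorrelations of the two codes sum to a single sharp spike, so correlating the echo against both codes and adding the results gives sidelobe-free pulse compression. A minimal numpy sketch of that property follows; the code length, target delay, and noiseless echoes are illustrative assumptions, not the paper's DSP implementation.

        import numpy as np

        def golay_pair(n_doublings):
            """Complementary pair of length 2**n_doublings, built recursively."""
            a, b = np.array([1.0]), np.array([1.0])
            for _ in range(n_doublings):
                a, b = np.concatenate([a, b]), np.concatenate([a, -b])
            return a, b

        a, b = golay_pair(6)                                   # length-64 codes
        N, delay = a.size, 25                                  # target echo at sample 25
        echo_a = np.zeros(200); echo_a[delay:delay + N] = a
        echo_b = np.zeros(200); echo_b[delay:delay + N] = b
        comp = (np.correlate(echo_a, a, "full") +
                np.correlate(echo_b, b, "full"))               # sidelobes cancel
        print(np.argmax(comp) - (N - 1), comp.max())           # -> 25 128.0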

  8. University of California, Irvine-Pathology Extraction Pipeline: the pathology extraction pipeline for information extraction from pathology reports.

    Science.gov (United States)

    Ashish, Naveen; Dahm, Lisa; Boicey, Charles

    2014-12-01

    We describe Pathology Extraction Pipeline (PEP)--a new Open Health Natural Language Processing pipeline that we have developed for information extraction from pathology reports, with the goal of populating the extracted data into a research data warehouse. Specifically, we have built upon Medical Knowledge Analysis Tool pipeline (MedKATp), which is an extraction framework focused on pathology reports. Our particular contributions include additional customization and development on MedKATp to extract data elements and relationships from cancer pathology reports in richer detail than at present, an abstraction layer that provides significantly easier configuration of MedKATp for extraction tasks, and a machine-learning-based approach that makes the extraction more resilient to deviations from the common reporting format in a pathology reports corpus. We present experimental results demonstrating the effectiveness of our pipeline for information extraction in a real-world task, demonstrating performance improvement due to our approach for increasing extractor resilience to format deviation, and finally demonstrating the scalability of the pipeline across pathology reports for different cancer types. © The Author(s) 2014.

  9. Automation in biological crystallization.

    Science.gov (United States)

    Stewart, Patrick Shaw; Mueller-Dieckmann, Jochen

    2014-06-01

    Crystallization remains the bottleneck in the crystallographic process leading from a gene to a three-dimensional model of the encoded protein or RNA. Automation of the individual steps of a crystallization experiment, from the preparation of crystallization cocktails for initial or optimization screens to the imaging of the experiments, has been the response to address this issue. Today, large high-throughput crystallization facilities, many of them open to the general user community, are capable of setting up thousands of crystallization trials per day. It is thus possible to test multiple constructs of each target for their ability to form crystals on a production-line basis. This has improved success rates and made crystallization much more convenient. High-throughput crystallization, however, cannot relieve users of the task of producing samples of high quality. Moreover, the time gained from eliminating manual preparations must now be invested in the careful evaluation of the increased number of experiments. The latter requires a sophisticated data and laboratory information-management system. A review of the current state of automation at the individual steps of crystallization with specific attention to the automation of optimization is given.

  10. Building Biochips: A Protein Production Pipeline

    Energy Technology Data Exchange (ETDEWEB)

    de Carvalho-Kavanagh, M; Albala, J S

    2004-02-09

    Protein arrays are emerging as a practical format in which to study proteins in high-throughput using many of the same techniques as that of the DNA microarray. The key advantage to array-based methods for protein study is the potential for parallel analysis of thousands of samples in an automated, high-throughput fashion. Building protein arrays capable of this analysis capacity requires a robust expression and purification system capable of generating hundreds to thousands of purified recombinant proteins. We have developed a method to utilize LLNL-I.M.A.G.E. cDNAs to generate recombinant protein libraries using a baculovirus-insect cell expression system. We have used this strategy to produce proteins for analysis of protein/DNA and protein/protein interactions using protein microarrays in order to understand the complex interactions of proteins involved in homologous recombination and DNA repair. Using protein array techniques, a novel interaction between the DNA repair protein, Rad51B, and histones has been identified.

  11. 77 FR 72905 - Pipeline Safety: Random Drug Testing Rate; Contractor MIS Reporting; and Obtaining DAMIS Sign-In...

    Science.gov (United States)

    2012-12-06

    ... From the Federal Register Online via the Government Publishing Office. DEPARTMENT OF TRANSPORTATION, Pipeline and Hazardous Materials Safety Administration. Pipeline Safety: Random Drug Testing Rate; Contractor MIS Reporting; and Obtaining DAMIS Sign-In Information. AGENCY: Pipeline and Hazardous Materials...

  12. WASS: an open-source stereo processing pipeline for sea waves 3D reconstruction

    Science.gov (United States)

    Bergamasco, Filippo; Benetazzo, Alvise; Torsello, Andrea; Barbariol, Francesco; Carniel, Sandro; Sclavo, Mauro

    2017-04-01

    Stereo 3D reconstruction of ocean waves is gaining more and more popularity in the oceanographic community. In fact, recent advances in both computer vision algorithms and CPU processing power now allow the study of spatio-temporal wave fields with unprecedented accuracy, especially at small scales. Even if simple in theory, many details are difficult to master in practice, so that implementing a 3D reconstruction pipeline is generally considered a complex task. For instance, camera calibration, reliable stereo feature matching, and mean sea-plane estimation are all factors for which a well-designed implementation can make the difference in obtaining valuable results. For this reason, we believe that the open availability of a well-tested software package that automates the steps from stereo images to a 3D point cloud would be a valuable addition for future research in this area. We present WASS, a completely open-source stereo processing pipeline for sea-wave 3D reconstruction, available at http://www.dais.unive.it/wass/. Our tool completely automates the recovery of dense point clouds from stereo images by providing three main functionalities. First, WASS can automatically recover the extrinsic parameters of the stereo rig (up to scale), so that no delicate calibration has to be performed in the field. Second, WASS implements a fast dense stereo reconstruction procedure, so that an accurate 3D point cloud can be computed from each stereo pair. We rely on the well-consolidated OpenCV library for both image stereo rectification and disparity map recovery. Lastly, a set of 2D and 3D filtering techniques, applied both to the disparity map and to the produced point cloud, removes the vast majority of erroneous points that naturally arise when analyzing the optically complex nature of the water surface (examples are sun glare, large white-capped areas, fog and water aerosol, etc.). Developed to be as fast as possible, WASS
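
    As a rough illustration of the OpenCV building blocks the record mentions (rectified pair -> disparity map -> 3D points), here is a minimal sketch on a synthetic rectified pair; the SGBM settings and the perspective matrix Q are placeholder assumptions, not WASS defaults.

        import cv2
        import numpy as np

        # synthetic rectified pair: the right view is the left shifted by 8 px
        left = np.random.randint(0, 255, (240, 320), dtype=np.uint8)
        right = np.roll(left, -8, axis=1)

        matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64,
                                        blockSize=9)
        disp = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point to px

        # reproject to 3D with an assumed rectification matrix Q
        Q = np.float32([[1, 0, 0, -160], [0, 1, 0, -120],
                        [0, 0, 0, 400], [0, 0, 1.0 / 0.1, 0]])
        points = cv2.reprojectImageTo3D(disp, Q)
        print(disp[120, 160], points.shape)   # disparity near 8 in the valid region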

  13. Residual stresses evaluation in a gas-pipeline crossing

    Energy Technology Data Exchange (ETDEWEB)

    Fonseca, Maria Cindra [Universidade Federal Fluminense (UFF), Niteroi, RJ (Brazil); Almeida, Manoel Messias [COMPAGAS, Curitiba, PR (Brazil); Rebello, Joao Marcos Alcoforado [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil); Souza Filho, Byron Goncalves de [PETROBRAS, Rio de Janeiro, RJ (Brazil)

    2009-07-01

    X-ray diffraction is a well-established and effective technique for determining residual and applied stresses in fine-grained crystalline materials. It characterizes and quantifies the magnitude and direction of the surface stresses at the measured point of the material. The objective of this work is the evaluation of the surface stresses in a 10-in diameter natural gas distribution pipeline of the COMPAGAS company, manufactured from API 5L Gr B steel, at a crossing with a natural gas transportation pipeline in Araucaria-PR. This kind of evaluation is important to establish whether one of the pipelines has to be repositioned or not. The measurements were made in two transversal sections of the pipe, one upstream (170 mm from the external wall of the pipeline) and another downstream (840 mm from the external wall of the pipeline). In each transversal section, measurements were carried out at 3 points: the 9-, 12- and 3-o'clock positions. At each measured point of the pipe surface, the longitudinal and transversal stresses were measured. The magnitude of the surface residual stresses in the pipe varied from +180 MPa to -210 MPa. The residual stress state on the surface is characterized by tensile stresses in the 12-o'clock region and by compressive stresses in the 3- and 9-o'clock regions. The surface residual stresses in the gas pipeline were measured by the X-ray diffraction method, double-exposure technique, using a portable apparatus with Cr-K-alpha radiation. (author)
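
    The record does not spell out the double-exposure formula. For orientation, the standard two-tilt relation from which such portable X-ray stress measurements are commonly reduced has the form below, where d_psi is the lattice spacing measured at tilt psi, d_0 the unstressed spacing, E Young's modulus, and nu Poisson's ratio; this is the textbook form, not necessarily the authors' exact expression:

        \sigma_\phi = \frac{E}{1+\nu}\,\frac{1}{\sin^2\psi_2 - \sin^2\psi_1}\,\frac{d_{\psi_2} - d_{\psi_1}}{d_0}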

  14. Purging and load operations of Bolivia-Brazil gas pipeline

    Energy Technology Data Exchange (ETDEWEB)

    Frisoli, C.; Senna, F.J.E. [Petrobras Transporte, Rio de Janeiro (Brazil); Carvalho de Faria, J.A. [TNG Pipeline, Rio de Janeiro (Brazil)

    2004-07-01

    This paper provided details of a new purge-and-load technique that was successfully used at the Bolivia-Brazil gas pipeline. The method used high-speed natural gas injection to expel air from the pipeline. As there are no pigs separating the products, an explosive mixture can form. When the gas front reaches the end of a segment of pipe and all air has been purged, the gas pipeline section is considered to be gasified. A mathematical model was used to determine the minimum purge speed of the gas as well as the amount of time needed for the purge. The required time was calculated by the model through the use of flow equations for friction loss and flow rate. The minimum purge speed was a direct function of the density difference between the gases and of the gas pipeline diameter. A nitrogen seal was interposed between the gas and the air to avoid the formation of explosive mixtures inside the gas pipeline. Gas used for gasification and pressurization of the northern Brazilian branch was stored on the Bolivian side of the pipeline. A volume of 2000 m³ of nitrogen was injected at the sections where the purge was being carried out. All the gas used for gasification and pressurization of the southern branch was provided by the northern branch. Natural gas injection was performed with pressure reduction in 2 stages. Pressure control of the natural gas injection used an assembly of existing valves in an intermediate station, interconnecting the pig receiver of the section upstream and the pig launcher at the next station. Pressure recorders, gauges, and gas analyzers were used to monitor and control all gasification procedures. It was concluded that the operation was a success. The nitrogen trapping procedure helped the project avoid all the inconveniences typically encountered during pigging operations. 1 ref., 2 tabs., 4 figs.
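
    The abstract states only that the minimum purge speed grows with the gas-air density difference and the line diameter. A common way to write such a criterion is a densimetric Froude-number condition of the form below; this is an assumed illustrative form, not the authors' model:

        v_{\min} = C\,\sqrt{g\,D\,\frac{\rho_{\mathrm{air}} - \rho_{\mathrm{gas}}}{\rho_{\mathrm{gas}}}}

    where D is the pipe internal diameter, g the gravitational acceleration, and C an empirically fitted constant.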

  15. Automated Analysis of Clinical Flow Cytometry Data: A Chronic Lymphocytic Leukemia Illustration.

    Science.gov (United States)

    Scheuermann, Richard H; Bui, Jack; Wang, Huan-You; Qian, Yu

    2017-12-01

    Flow cytometry is used in cell-based diagnostic evaluation for blood-borne malignancies including leukemia and lymphoma. The current practice for cytometry data analysis relies on manual gating to identify cell subsets in complex mixtures, which is subjective, labor-intensive, and poorly reproducible. This article reviews recent efforts to develop, validate, and disseminate automated computational methods and pipelines for cytometry data analysis that could help overcome the limitations of manual analysis and provide for efficient and data-driven diagnostic applications. It demonstrates the performance of an optimized computational pipeline in a pilot study of chronic lymphocytic leukemia data from the authors' clinical diagnostic laboratory. Copyright © 2017 Elsevier Inc. All rights reserved.
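
    The review's pipeline itself is not reproduced in the record, but the core move it describes, replacing a hand-drawn 2D gate with a model-based cluster assignment, can be sketched in a few lines. The marker names, synthetic events, and mixture-model choice below are illustrative assumptions, not the validated diagnostic pipeline.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(0)
        # synthetic two-marker events: a bright CLL-like cluster plus background
        cll = rng.normal([4.0, 4.2], 0.3, size=(500, 2))
        bg = rng.normal([1.0, 1.5], 0.5, size=(1500, 2))
        events = np.vstack([cll, bg])          # columns ~ CD5, CD19 intensities

        gm = GaussianMixture(n_components=2, random_state=0).fit(events)
        labels = gm.predict(events)
        bright = np.argmax(gm.means_.sum(axis=1))   # the brighter component
        print(f"CLL-like fraction: {(labels == bright).mean():.1%}")   # ~25%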

  16. In the Pipeline; Georgia's Oil and Gas Transit Revenues

    OpenAIRE

    Jonathan C Dunn; Andreas Billmeier; Bert van Selm

    2004-01-01

    Starting in 2005, nontax revenue in Georgia is expected to rise significantly, in the form of transit fees for oil transported through the Baku-Tbilisi-Ceyhan Oil Pipeline. Transit fees for gas transported through the South Caucasus Pipeline are expected to start in 2007. This paper discusses (1) how much additional revenue can be expected, (2) prospects for monetizing gas that could be received as in-kind transit fees, in the light of pervasive nonpayment in the domestic gas sector, (3) the ...

  17. Pipeline risk management manual ideas, techniques, and resources

    CERN Document Server

    Muhlbauer, W Kent

    2004-01-01

    Here's the ideal tool if you're looking for a flexible, straightforward analysis system for your everyday design and operations decisions. This new third edition includes sections on stations, geographical information systems, "absolute" versus "relative" risks, and the latest regulatory developments. From design to day-to-day operations and maintenance, this unique volume covers every facet of pipeline risk management, arguably the most important, definitely the most hotly debated, aspect of pipelining today. Now expanded and updated, this widely accepted standard reference guide

  18. VPipe: Virtual Pipelining for Scheduling of DAG Stream Query Plans

    Science.gov (United States)

    Wang, Song; Gupta, Chetan; Mehta, Abhay

    There are data streams all around us that can be harnessed for tremendous business and personal advantage. For an enterprise-level stream processing system such as CHAOS [1] (Continuous, Heterogeneous Analytic Over Streams), handling of complex query plans with resource constraints is challenging. While several scheduling strategies exist for stream processing, efficient scheduling of complex DAG query plans is still largely unsolved. In this paper, we propose a novel execution scheme for scheduling complex directed acyclic graph (DAG) query plans with meta-data enriched stream tuples. Our solution, called Virtual Pipelined Chain (or VPipe Chain for short), effectively extends the "Chain" pipelining scheduling approach to complex DAG query plans.
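
    The record gives only the idea: extend "Chain"-style pipelining from linear plans to DAGs. One naive way to cut a DAG plan into pipelinable linear chains (runs of operators with single fan-in and fan-out) is sketched below; the toy plan, the greedy rule, and all names are assumptions for illustration, not the VPipe algorithm.

        from graphlib import TopologicalSorter

        # toy DAG query plan, node -> set of predecessors
        dag = {"filter1": {"src"}, "filter2": {"src"},
               "join": {"filter1", "filter2"}, "sink": {"join"}}

        order = list(TopologicalSorter(dag).static_order())
        succ = {n: [] for n in order}
        for node, preds in dag.items():
            for p in preds:
                succ[p].append(node)

        chains, placed = [], set()
        for node in order:
            if node in placed:
                continue
            chain = [node]
            placed.add(node)
            # grow the chain while the tail has one successor with one predecessor
            while len(succ[chain[-1]]) == 1:
                nxt = succ[chain[-1]][0]
                if nxt in placed or len(dag[nxt]) != 1:
                    break
                chain.append(nxt)
                placed.add(nxt)
            chains.append(chain)
        print(chains)   # e.g. [['src'], ['filter1'], ['filter2'], ['join', 'sink']]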

  19. An algorithm to automate yeast segmentation and tracking.

    Directory of Open Access Journals (Sweden)

    Andreas Doncic

    Full Text Available Our understanding of dynamic cellular processes has been greatly enhanced by rapid advances in quantitative fluorescence microscopy. Imaging single cells has emphasized the prevalence of phenomena that can be difficult to infer from population measurements, such as all-or-none cellular decisions, cell-to-cell variability, and oscillations. Examination of these phenomena requires segmenting and tracking individual cells over long periods of time. However, accurate segmentation and tracking of cells is difficult and is often the rate-limiting step in an experimental pipeline. Here, we present an algorithm that accomplishes fully automated segmentation and tracking of budding yeast cells within growing colonies. The algorithm incorporates prior information of yeast-specific traits, such as immobility and growth rate, to segment an image using a set of threshold values rather than one specific optimized threshold. Results from the entire set of thresholds are then used to perform a robust final segmentation.

  20. An algorithm to automate yeast segmentation and tracking.

    Science.gov (United States)

    Doncic, Andreas; Eser, Umut; Atay, Oguzhan; Skotheim, Jan M

    2013-01-01

    Our understanding of dynamic cellular processes has been greatly enhanced by rapid advances in quantitative fluorescence microscopy. Imaging single cells has emphasized the prevalence of phenomena that can be difficult to infer from population measurements, such as all-or-none cellular decisions, cell-to-cell variability, and oscillations. Examination of these phenomena requires segmenting and tracking individual cells over long periods of time. However, accurate segmentation and tracking of cells is difficult and is often the rate-limiting step in an experimental pipeline. Here, we present an algorithm that accomplishes fully automated segmentation and tracking of budding yeast cells within growing colonies. The algorithm incorporates prior information of yeast-specific traits, such as immobility and growth rate, to segment an image using a set of threshold values rather than one specific optimized threshold. Results from the entire set of thresholds are then used to perform a robust final segmentation.
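
    The distinguishing step the abstract describes, segmenting with a set of thresholds and combining the results instead of committing to one optimized cutoff, is easy to sketch. The per-pixel voting rule, the synthetic blob image, and all parameter values below are illustrative assumptions; the published algorithm additionally uses yeast-specific priors such as immobility and growth rate.

        import numpy as np
        from scipy import ndimage

        def multi_threshold_segment(img, thresholds):
            votes = np.zeros(img.shape, dtype=int)
            for t in thresholds:
                votes += img > t                       # each threshold casts a vote
            mask = votes >= len(thresholds) // 2 + 1   # majority across thresholds
            labels, n = ndimage.label(mask)            # connected components ~ cells
            return labels, n

        # synthetic frame: two bright cell-like blobs on a noisy background
        rng = np.random.default_rng(1)
        img = rng.normal(0.1, 0.05, (128, 128))
        yy, xx = np.mgrid[:128, :128]
        img += np.exp(-((yy - 40) ** 2 + (xx - 40) ** 2) / 80.0)
        img += np.exp(-((yy - 90) ** 2 + (xx - 85) ** 2) / 80.0)
        labels, n = multi_threshold_segment(img, np.linspace(0.2, 0.8, 7))
        print(n)   # -> 2 segmented "cells"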